For this week’s project I composited live-action footage with an animated character: a chicken I had previously created in Autodesk Maya. To build the chicken I started with a full-scale character design drawing, which I imported into Maya to use as reference while I modeled it with polygons. After modeling the character from basic shapes like spheres stitched together, I rigged it (built a skeleton with joints) and shaded it.
To composite the video I first filmed myself running around, imagining the chicken chasing me. To stabilize the footage I learned that YouTube has a built-in stabilizer that runs for free when you upload and edit through the site. After stabilizing, I downloaded the video and exported the .mov file as a sequence of TIFF images (using Premiere) to import into Maya. I then created a new camera object pointed at an image plane that played the sequential TIFF images, so I could overlay my character on top. Finally I animated my chicken, lit the scene, rendered, and added the sound back into the new composited video.
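The per-frame compositing that happens when the rendered character is laid over the footage boils down to the standard alpha "over" operator. Here is a minimal pure-Python sketch of that operation on single pixels; this is an illustration of the idea, not what Maya’s renderer actually runs:

```python
def alpha_over(fg, bg):
    """Composite one RGBA foreground pixel over one RGB background pixel.

    fg: (r, g, b, a) with components in 0..255; bg: (r, g, b).
    Returns the blended RGB pixel (the standard "over" operator).
    """
    r, g, b, a = fg
    alpha = a / 255.0
    return tuple(round(alpha * f + (1.0 - alpha) * bgc)
                 for f, bgc in zip((r, g, b), bg))

# A fully opaque foreground pixel replaces the background entirely...
print(alpha_over((255, 0, 0, 255), (10, 20, 30)))  # (255, 0, 0)
# ...while a partially transparent one blends the two.
print(alpha_over((255, 0, 0, 128), (10, 20, 30)))
```

Applying this per pixel, per frame, to the rendered character (with alpha) over each TIFF of the stabilized footage is effectively what the image-plane setup achieves.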
One bump I ran into was animating the chicken to appear as if it came out of the computer. This proved more difficult than I imagined, so I used a particle-effect explosion instead. I originally wanted the puff of smoke to be white, but I found that making it appear white required shining a light on it, which would also hit the chicken and make it far too bright. So I made the smoke black to avoid blowing out the chicken.
Some adjustments I would like to make in the future include adding textures to my chicken and better lighting for more accurate shadows. I would also like to turn the smoke white, and try to animate the chicken as if it were coming out of the computer screen.
Inspiration for this work came from reading the fantastical books of J.R.R. Tolkien. In several of these books, including The Hobbit and The Lord of the Rings: The Fellowship of the Ring, moonlight reveals hidden doors that are invisible under ordinary sunlight. I therefore wanted to find out how moonlight is physically distinct from sunlight. The literature on this topic describes moonlight as consisting largely of sunlight reflected off the Moon’s surface, plus minor contributions such as starlight. It has a much lower intensity than sunlight and looks bluish to humans; the blue hue is attributed to the Purkinje effect. To gain insight into how to reproduce moonlight artificially, I took a twofold approach. The first part involved writing a program that applies image processing to an input photo of an outdoor scene in natural sunlight to produce an output image of the same scene in moonlight. In the second part, I built a rudimentary LED flashlight with a moonlight mode using a function generator.
Photographers can manipulate their camera settings to make images taken in natural sunlight look like moonlit scenes. Using many of these same techniques, I wrote a MATLAB program to accomplish this goal. The code for this program can be found at the following link (open with Notepad to see it easily):
First, the images were white balanced; this adjusted the warmth and coolness of the colors in the image. Next, the saturation was decreased to half its original value, since colors appear far less vivid under moonlight. Finally, the contrast was increased to widen the difference between low and high pixel intensities. Using the default white-balance values, calculated by averaging the RGB channels, the resulting images were too bright; to remedy this, the white balancing was decreased. Two pairs of images in sunlight (original) and moonlight (computed by the program) are given below. The first pair shows an outdoor scene with a far depth of field, while the second has a near depth of field.
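The pipeline above can be sketched per pixel in a few lines. This is a hedged pure-Python illustration of the same three steps (cool white balance, halved saturation, contrast stretch); the specific constants and luminance weights here are my own illustrative choices, not values from the actual MATLAB code:

```python
def moonlightify(pixel, wb=0.6, sat=0.5, contrast=1.3):
    """Approximate the sunlight-to-moonlight pipeline on one RGB pixel.

    wb < 1 darkens and cools (a stand-in for the reduced white balance),
    sat = 0.5 halves saturation by blending toward the pixel's luminance,
    contrast > 1 widens the spread around mid-gray.
    Constants are illustrative, not the tuned values from the MATLAB code.
    """
    r, g, b = (c / 255.0 for c in pixel)
    # Cool white balance: scale red/green down more than blue.
    r, g, b = r * wb, g * wb, b * min(1.0, wb * 1.2)
    # Desaturate toward luminance (Rec. 601 weights).
    lum = 0.299 * r + 0.587 * g + 0.114 * b
    r, g, b = (lum + sat * (c - lum) for c in (r, g, b))
    # Contrast stretch around mid-gray, then clamp to [0, 1].
    out = []
    for c in (r, g, b):
        c = 0.5 + contrast * (c - 0.5)
        out.append(round(max(0.0, min(1.0, c)) * 255))
    return tuple(out)

# A mid-gray pixel comes out darker and slightly blue-shifted.
print(moonlightify((128, 128, 128)))
```

Mapping this over every pixel of a sunlit photo gives the dimmer, bluer, higher-contrast look described above.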
Many LED flashlights have a moonlight setting that is noticeably dimmer than the normal bright mode. I made my own version of this type of light source using a function generator (10 V peak-to-peak range) and a strip of 12 V LEDs. I set the frequency to 1 kHz and selected the square-wave option. To start, I applied a high peak-to-peak voltage of approximately 9.4 V to simulate sunlight. Next, I gradually decreased the voltage on the function generator, dimming the LEDs. Comparing a photograph of moonlight with the intensity of the LED strip, I determined that a peak-to-peak voltage of 7.2 V yielded a luster akin to moonlight. Photographs of the LEDs with an applied voltage of 9.4 V peak-to-peak (top) and 7.2 V peak-to-peak (bottom) are shown below.
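As a rough sense of scale for the dimming, if we assume LED output scales approximately linearly with drive voltage in this range (a simplification; real LED strips are quite nonlinear near their forward-voltage threshold), the two settings compare as:

```python
def relative_intensity(v_moon, v_sun):
    """Rough relative brightness under a linear voltage-to-output
    assumption (an illustrative simplification, not a measurement)."""
    return v_moon / v_sun

# The 7.2 V "moonlight" setting relative to the 9.4 V "sunlight" setting.
print(f"{relative_intensity(7.2, 9.4):.0%}")  # ~77%
```

The actual perceived dimming is larger than this linear estimate suggests, which is consistent with how much dimmer the bottom photograph looks.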
The illusion I created relied on apparent motion using video. Originally, I applied the technique known as stop motion to a decorative object. Two pieces of tape were put in a cross pattern on a desk to mark the location where the object would be set. With the object positioned on the desk, I took a picture of the scene. Next, I very slightly turned the object clockwise, making sure that it retained its place on the tape, and took another picture. This process of rotating the object incrementally and snapping a picture was repeated until one complete revolution was made. In total, 71 images were taken. A program called 3DRT took these input images and created a GIF file that cycled through them at a rapid rate; the rate was high enough that the succession of images is perceived as continuous motion. An online converter changed the GIF file into an MP4 file, which is shown below.
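The per-shot rotation implied by those numbers is easy to compute. A small sketch, assuming the 71 shots are evenly spaced so that a hypothetical 72nd shot would coincide with the first:

```python
# 71 evenly spaced shots covering one full revolution means each step
# rotates the object by 360/71 degrees (an assumption about even spacing).
NUM_FRAMES = 71
step_deg = 360 / NUM_FRAMES
angles = [i * step_deg for i in range(NUM_FRAMES)]

print(f"{step_deg:.2f} degrees per frame")          # ~5.07
print(f"last frame at {angles[-1]:.2f} degrees")    # just short of 360
```

So each "very slight" turn between pictures was on the order of five degrees.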
To improve this trick, I sought a means to cast doubt on the idea that stop motion was used. My inclination was to insert a second object into the scene and have it appear to move under gravity. After all, had I resorted to stop motion alone to capture these two movements, I would have needed to take a burst of shots for each position of the rotating object on the desk, and then match each appropriate rotating/falling pair to assemble the video frames; this would be an exceedingly laborious process. Instead I introduced the second object, a dark orange ball, digitally with the aid of a program called Blender. Using Blender, I imported my MP4 file of the rotating object, created an orange sphere, put the MP4 file and sphere on a timeline, and inserted a series of LocRotScale keyframes while changing the timeline marker and sphere position between subsequent keyframes. I used an Alpha Under blend for the MP4 file and an Alpha Over blend for the sphere. To provide a convincing visual appearance of the two objects merged together in the scene, I had the table act as an interaction platform. That is, the ball fell from a height and appeared to make contact with the table, after which it bounced back to a lesser height before falling once more. A video of the resulting blended scene is shown below.
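The bounce motion that was hand-keyed in Blender can be sketched as simple physics: free fall under gravity with a damped rebound at the table. A minimal Python illustration that generates per-frame heights (all constants here are made up for illustration, not taken from the actual Blender file):

```python
def bounce_keyframes(frames, y0=5.0, fps=24.0, g=9.8, restitution=0.6):
    """Per-frame heights for a ball dropped from y0 that loses energy
    on each bounce. restitution < 1 makes each rebound lower than the
    previous fall, matching the motion described in the post.
    """
    y, v, dt = y0, 0.0, 1.0 / fps
    heights = []
    for _ in range(frames):
        heights.append(max(y, 0.0))
        v -= g * dt          # gravity accelerates the ball downward
        y += v * dt
        if y <= 0.0:         # contact with the table: reflect and damp
            y = 0.0
            v = -v * restitution
    return heights

keys = bounce_keyframes(48)  # two seconds of motion at 24 fps
```

Feeding heights like these into LocRotScale keyframes (one per timeline frame) would reproduce the fall-bounce-fall pattern.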
I’ve been very interested in food in the past few weeks, and also in the human perception of food and taste.
I think food that is done and served well could definitely feel like magic.
So, for this week I wanted to try and do a gustatory (taste) illusion.
Research on our perception of taste has already been done, and it is extremely interesting!
One famous experiment is the fake tongue illusion.
Other very interesting work has been done by the British chef Heston Blumenthal (1, 2).
And by the artist Miriam Simun.
I’ve decided to combine the facts I learned from the above readings with a show.
For this magic I used 3 pairs of wooden chopsticks, 3 small plates, 1 strawberry, and a knife.
When showing the magic, I asked a volunteer from the audience to cut the strawberry into 3 equal pieces and put one piece on each plate.
Then, the audience member was asked to eat the first (leftmost) piece of strawberry and describe its taste.
I then made a “magic movement” above the second piece. Now the second piece should taste different from the first, even though it came from the same strawberry! The third piece got a different taste as well.
The magic performed in class did not work as well as I expected (it worked better when I tested it on my office mates), but the third strawberry piece did taste a little different (according to my volunteer).
The trick was adding different scents to the chopsticks. When a scented chopstick is held near the nose, the smell affects the way the taste is perceived.
For the “Trick++” assignment, I wanted to use the computer as the magician.
Our group recently purchased a couple of Tobii EyeX eye trackers, and this was a great opportunity for me to learn how to use one.
The magic worked as follows:
The volunteering audience member was asked to sit in front of the computer screen, which showed an animation of a shuffling deck of cards.
When the volunteer is ready, he/she clicks the “Magic” button; the computer shuffles the cards and reveals the top 5. Then a short tune starts to play, and the user must choose one card and concentrate on it really hard – “transmit your choice to the computer”.
After 10 seconds the music ends. The deck of cards is shuffled again and then all the cards in the deck are spread across the screen, all face down except the chosen card, which faces up.
The eye tracker at the bottom of the screen tracks the user’s eyes while the cards are revealed and the music plays. The chosen card is the one the user looked at for the longest time.
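The card-selection logic reduces to accumulating gaze dwell time per card region and taking the maximum. A minimal Python sketch of the idea; the card names, screen layout, and sampling model here are invented for illustration (the real page consumed the Tobii EyeX gaze stream in the browser):

```python
def pick_card(gaze_samples, card_regions):
    """Pick the card the user stared at longest.

    gaze_samples: list of (x, y) gaze points, sampled at a fixed rate,
    so the sample count inside a region is proportional to dwell time.
    card_regions: {card_name: (x0, y0, x1, y1)} screen rectangles.
    """
    dwell = {card: 0 for card in card_regions}
    for x, y in gaze_samples:
        for card, (x0, y0, x1, y1) in card_regions.items():
            if x0 <= x < x1 and y0 <= y < y1:
                dwell[card] += 1
                break
    return max(dwell, key=dwell.get)

# Hypothetical layout: two cards side by side.
regions = {"ace": (0, 0, 100, 150), "king": (100, 0, 200, 150)}
samples = [(50, 70)] * 30 + [(150, 70)] * 80  # mostly on the king
print(pick_card(samples, regions))  # king
```

Ten seconds of music gives the tracker plenty of samples, so the dwell-time maximum is a reliable signal even with noisy gaze data.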
The code for the webpage running the magic is here.
Since we were focusing on magic tricks for senses other than sight, I decided to perform one that would work for the blind.
Effect: This trick makes it appear, through both touch and sight, as if two ropes are traveling through a hand.
Method: You need two ropes or long pieces of string. Originally this trick is meant to be done “through the neck,” but since I used shoelaces, through a hand was more appropriate. First have the user examine the strings, noting that they are in fact normal strings. Then subtly fold each string in half and tell the user to place their wrist on top of the end loops of each shoelace. It will feel as if they are just placing their wrist over two ropes laid out flat. Then take an end of each shoelace from either side of the wrist and tie one loose knot (as demonstrated in the video). Use patter to ask questions about what the user is feeling and to ensure that there is indeed a loose knot on one end. Then very quickly pull both ends of the shoelaces. The loops underneath the hand will give, and it will feel as if the shoelaces are slipping through the hand. In the end the user can feel the rope on top of their hand.
Below is a picture of what it looks like, and below that are some links to a video of the trick being performed on a blind participant 😀 Enjoy!
For my cyber-magic project I created a Kinect-based magic trick that included hand tracking. This project was done in Scratch, and to connect the Xbox Kinect I used an add-on called Kinect2Scratch4Mac.
Effect: I make it seem as if digital blue sparkles shoot out from behind my hands on a screen I’m standing in front of. Then I magically appear to grab the sparkles off the screen: when I ball my hands into fists, the sparkles stop. Finally, I throw the sparkles into the real, physical world.
Method: I used a Kinect to track my left and right hands. When my hands were above a certain point on the y-axis (around shoulder height), I activated sparkles that followed my hands on the screen background. The secret to stopping the sparkles was a counter: I was timing the whole performance in my head. As soon as the counter expired (based on how long my left hand had been below shoulder level), I knew that the next time I lifted my hands the sparkles would be gone. That is when I closed my hands into fists. The second trick was to hold glitter between my fingers for the entire performance without letting the audience know. Once my hands were fists and I opened them again, the audience saw the glitter but thought it was appearing out of thin air.
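The decision logic behind the effect fits in a few lines. A sketch of how I understand the Scratch program, written in Python for readability; the threshold and frame limit are illustrative values, not the ones from the actual project:

```python
def sparkles_visible(hand_y, shoulder_y, frames_below, limit=90):
    """Decide whether to emit sparkles on a given frame.

    Sparkles follow a hand raised above shoulder height, but once the
    left hand has spent `limit` frames below the shoulder, the effect
    is permanently disabled -- that is the secret "grab" moment.
    (limit=90 and the coordinates are made-up illustrative values.)
    """
    if frames_below >= limit:
        return False             # counter expired: sparkles are "grabbed"
    return hand_y > shoulder_y   # y increases upward in this sketch

print(sparkles_visible(1.6, 1.4, 10))  # True: hand raised, counter running
print(sparkles_visible(1.6, 1.4, 90))  # False: counter expired
```

Because the counter runs silently, the audience only sees that the sparkles stop exactly when the fists close, which sells the grab.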
The code is posted here on scratch.com so you can all see how it works 🙂