Description:
Presented at Microsoft TechForum 2012: In the first of two demonstrations given at TechForum by the Applied Sciences Team, the team shows a project exploring what's possible with a Samsung transparent OLED screen, Kinect sensors, and some software magic.
What you get is the ability to manipulate objects that lie behind the transparent screen, as well as objects on the screen itself, using something called "view-dependent, depth-corrected gaze".
After spending any time with Stevie Bathiche and his team, you get used to such terminology. Much of the team's work explores using Kinect to track your eyes and deliver back an image that follows your movements, giving it depth and perspective.
The effect in this demo is a digital desktop with real depth: you can virtually place objects in the space behind the screen, and you can reach behind the glass to manipulate objects on the screen itself.
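The team hasn't published the code behind the demo, but the standard way to achieve this kind of view-dependent rendering is an off-axis (asymmetric-frustum) projection driven by the tracked head position: each frame, the renderer builds a frustum from the viewer's eye through the physical corners of the screen, so virtual objects stay registered with the real world as the viewer moves. Below is a minimal sketch of that idea in Python with NumPy, following the well-known generalized perspective projection formulation (Kooima). The function name, screen-corner coordinates, and the eye position standing in for Kinect tracker output are all illustrative assumptions, not the Applied Sciences Team's actual implementation.

```python
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near=0.1, far=100.0):
    """Asymmetric-frustum projection for a head-tracked display.

    eye      -- tracked eye position in world space (e.g. from Kinect)
    pa/pb/pc -- physical screen corners: lower-left, lower-right, upper-left
    Returns a 4x4 OpenGL-style projection*view matrix that keeps virtual
    objects registered with the physical screen as the viewer moves.
    """
    # Orthonormal screen basis: right, up, and normal (toward the viewer).
    vr = (pb - pa) / np.linalg.norm(pb - pa)
    vu = (pc - pa) / np.linalg.norm(pc - pa)
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - eye, pb - eye, pc - eye

    # Perpendicular distance from the eye to the screen plane.
    d = -(va @ vn)

    # Frustum extents projected onto the near plane -- asymmetric
    # whenever the eye is off the screen's center axis.
    left   = (vr @ va) * near / d
    right  = (vr @ vb) * near / d
    bottom = (vu @ va) * near / d
    top    = (vu @ vc) * near / d

    # Standard glFrustum-style projection matrix.
    P = np.array([
        [2*near/(right-left), 0, (right+left)/(right-left), 0],
        [0, 2*near/(top-bottom), (top+bottom)/(top-bottom), 0],
        [0, 0, -(far+near)/(far-near), -2*far*near/(far-near)],
        [0, 0, -1, 0],
    ])

    # Rotate the world into the screen's basis, then move the eye
    # to the origin.
    M = np.eye(4)
    M[:3, :3] = np.stack([vr, vu, vn])
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ M @ T

# Example: a 40 cm x 30 cm screen centered at the origin, with the
# viewer 60 cm away and slightly to the right -- the frustum skews
# so objects behind the glass hold their apparent position.
pa = np.array([-0.2, -0.15, 0.0])   # lower-left corner (metres)
pb = np.array([ 0.2, -0.15, 0.0])   # lower-right corner
pc = np.array([-0.2,  0.15, 0.0])   # upper-left corner
eye = np.array([0.1, 0.0, 0.6])     # would come from the head tracker
print(off_axis_projection(eye, pa, pb, pc))
```

Recomputing this matrix from a fresh tracked eye position every frame is what creates the illusion of depth: the rendered scene shifts in perspective exactly as a real scene behind a window would.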
As with many demos of this kind, the video does a far better job of showing what is going on than words can. It's another example of how the boundaries between the physical and digital worlds are blurring.
We hope you liked our article. Feel free to comment below and let us know your thoughts. Discover other interesting presentations and research studies from Microsoft Research, and don't forget to share this article with your friends.
*by andreascy*