
Monthly Archives: October 2011

Found this on TechCrunch:

Soon you, too, will be able to talk to the hand. A new interface created jointly by Microsoft and the Carnegie Mellon Human Computer Interaction Institute allows for interfaces to be displayed on any surface, including notebooks, body parts, and tables. The UI is completely multitouch and the “shoulder-worn” system will locate the surface you’re working on in 3D space, ensuring the UI is always accessible. It uses a picoprojector and a 3D scanner similar to the Kinect.

Click here for more of the story!


X-Ray Heart by mmattes

Ever wondered how it might be if you could see through objects?

Well, three Stanford University students – Jon Rodriguez, Ben Jensen, and James Kahramaner – are determined to create X-Ray Vision in Virtual Environments.  By calculating a focal point based on the user’s pupil accommodation, their product Voxel Vision will let users focus inside solid, opaque virtual objects or beyond them, effectively seeing through walls in a natural manner.  The ambitious trio list visual surgical assistance, ambush deterrence, and cellular composition as possible use cases, emphasizing that the applications may extend to enhancing computer desktop interactions.

So far, their project has successfully attracted backing from Kickstarter followers.  With a little over $5,000 in initial funding, Rodriguez, Jensen, and Kahramaner aim to finish the project by September 2012.  Along the way, they plan to recruit some talented specialists.  If this is up your alley, definitely contact them about the potential for a future collaboration.

Voxel Vision’s team welcomes any comments, questions, or critiques of their work.  For more information on how to get in touch, check out their Kickstarter page.

Hi, guys!

Those of you who know me also know I’m looking into a future career in the immersive technologies industry.  So far, I’ve been posting personal musings on academic institutions and other little VR-related goodies, and now I’m going to start a series called “Cool Companies.”  Basically, Cool Companies will consist of small blurbs and videos/images about companies in the immersive technologies space that you guys should check out if you’re interested in finding a career in this field.

First up in the series is…


The era of one human, one mouse, one screen, one machine is giving way to what’s next: multiple participants, working in proximity and remotely, using a groundbreaking spatial interface to control applications and data spread across every display. This is what Oblong builds. It’s why we’re here.

That’s some pretty fancy jargon, but Oblong not only talks the talk, it also walks the walk with their g-speak spatial operating environment (SOE) platform:

g-speak overview 1828121108 from john underkoffler on Vimeo.

Look like Minority Report?  Well, it should, since g-speak made its debut in that movie.  In addition to g-speak, Oblong offers collaborative, shared-environment technologies such as Mezzanine, their digital whiteboard-like product.

Sound interesting?  Find out more at their company website.

At the last Consumer Electronics Show (CES 2011), Sony showcased a number of 3D viewing technologies, including TVs and a Head Mounted Display (HMD) prototype.  I don’t know about you guys, but my first HMD looked like this:

Virtual Boy

Yes, I was one of those lucky kids who got to play with a Nintendo Virtual Boy.  And you know what?  I had fun playing games like Mario Clash and Nester’s Funky Bowling in magnificent red 3D visuals.  I’m not sure if the VB contributed to my myopia, but I like to believe it only gave me a good time and maybe a slight neck ache from the weird angle I had to hold to play.

In any case, Virtual Boy wasn’t the only HMD I got to put on my noggin.  As one of the senior research programmers for Stanford’s Virtual Human Interaction Lab, I had the opportunity to develop 3D simulations that were seen through an nVisor SX60:

nVisor SX60

Creating a real-time immersive experience requires technical chops, finesse, and consideration for your users and subjects; running these simulations with precision point tracking is even more taxing on your hardware, since each frame must be updated at least 60 times per second to avoid motion sickness.  Tack on the several thick cables required to transmit all that data from your rendering machines to each eye’s screen, and you’ve got quite a few intricacies to manage.
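To give a feel for how tight that 60-updates-per-second constraint is, here is a minimal sketch of the per-frame time budget.  The stage names and millisecond costs below are illustrative assumptions of mine, not measured figures from any particular lab pipeline:

```python
# Rough sketch of a per-frame time budget for a head-tracked renderer.
# Stage costs are hypothetical, illustrative numbers.

def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available for one frame at the given refresh rate."""
    return 1000.0 / refresh_hz

def fits_budget(stage_times_ms: list[float], refresh_hz: float = 60.0) -> bool:
    """True if the summed per-frame stage costs fit within one refresh interval."""
    return sum(stage_times_ms) <= frame_budget_ms(refresh_hz)

budget = frame_budget_ms(60.0)   # ~16.67 ms per frame at 60 Hz
# Hypothetical costs: poll tracker, update scene, render left eye, render right eye
stages = [1.0, 3.0, 6.0, 6.0]    # 16.0 ms total, which just fits
print(f"budget: {budget:.2f} ms, fits: {fits_budget(stages)}")
```

The point of the arithmetic: at 60 Hz everything – tracking, simulation, and two per-eye renders – has to complete in under roughly 16.7 ms, every single frame.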

How does this all tie in with today’s subject, Sony’s new HMZ-T1?

Sony HMZ-T1 Personal 3D Viewer

Well, Sony bills this new device as a “personal 3D viewer,” which basically means watching your shows and movies in 3D and in private.  This application is fine for the shy media watcher, but gamers and like-minded individuals are hoping Sony will fit future iterations of the HMZ-T1 with precision tracking, ultimately transforming it into a machine of magic – one that can create compelling immersive virtual environments that enhance gaming experiences.

There are some cautions to note about the current HMZ-T1, though.  Anyone who has used similar devices will recall the potential for motion sickness from ill-adjusted use.  Furthermore, Sony warns that children under 15 should not use the Personal 3D Viewer – possibly due to unforeseen effects on development?  Fortunately, a can of ginger ale can help alleviate motion sickness symptoms.

The HMZ-T1 is scheduled to debut commercially November 11 in Japan at a price tag of about $780 USD.  Sony promises Americans can get their hands on the device for $800 USD sometime in November this year.  That sounds like a hefty sum to pay for a personal TV, but if the HMZ-T1 can be used as a precision-tracking HMD, the price may very well be worth the rainy-day savings compared to higher-end competitors’ offerings at $25k–$37k.

For detailed specs about Sony’s HMZ-T1 Personal 3D Viewer, as well as the ability to preorder your own, please visit Sony’s website here.

Kajimoto Laboratory

As one of our most underappreciated yet useful senses, touch has been tricky to simulate realistically in immersive environments and VR simulations.  Most attempts to increase tactile presence (how real a situation feels to you) in simulations involve general vibrations across the skin, such as the N64’s Rumble Pak; force-feedback haptic devices, like SensAble’s PHANTOM Omni or Novint’s Falcon; and shear forces (the sensation on your skin as an object’s surface runs along it), like some of the research conducted in the Kajimoto Laboratory.

Kaji-Lab has done extensive research in haptic and multimodal interfaces since 2006.  What I find most intriguing about this lab is their drive to study and virtually replicate a variety of interactions, including the sensations of tele-present kissing…

and simulated “bassari” (feeling of being cut by a sword)…

I haven’t heard any other news about the kissing machine, but the bassari research seems to have undergone a newer iteration as demonstrated in this article.

While some may be skeptical about these particular ideas, it is this kind of work that paves the way for improving our immersive technologies and experiences.  For more information about the Kajimoto Laboratory and their work, please check out their website.

A virtual reality and converging technologies conference located in France (March 28 – April 1, 2012).  If you’re interested in VR or related fields, this is definitely the event to attend to get a feel for what’s cutting edge in industry and research.

As for logistics, the conference begins March 28th for professionals and opens to the general public on March 31st and April 1st.  Major VR R&D companies like Mechdyne Corporation, Disney Research, and NVIDIA will be attending this year.  More information can be found on the conference website, but be warned that the site is predominantly in French with limited English translations scattered about!