

Deakin’s Universal Motion Simulator (UMS) enables users to engage in a natural flight experience. Here, users feel g-forces as they soar through virtual and real space. The idea of immersive flight simulations is nothing new, but the advances in technology that provide a stronger feeling of presence cannot be overlooked.


Image copyright F-SIM Space Shuttle.

Admittedly, when I first saw this, three things went through my mind. The first was a simulation I participated in as a child – I think it was in a NASA-affiliated museum of some sort. In this simulator, I had to land a space shuttle on a virtual runway. I’ll never forget that experience because I managed to land the spacecraft successfully, but I didn’t know you had to press the “release parachute” button to slow down. So, my shuttle eventually ran into the grass at the end of the runway. Whoops.


The second thing that came to mind was a PS2 game series called Zone of the Enders. The way the UMS moved about reminded me of how Jehuty in ZOE darts around the screen. Sure, I could have thought of other robot games like Armored Core or Gundam, but in my experience those series always felt a bit slow and clunky compared to ZOE – which is saying something, because AC’s battles can be quite fast at times.

And finally, the third thought I had was about combining ZOE-like combat with the immersion afforded by the UMS. Creating entertainment applications such as immersive gaming simulators is something I aspire to do one day. While the UMS already supports real-world flight simulation, I think the bigger market will be in the mass consumer sector once the technology becomes more affordable for the general public.


For all you Augmented Reality (AR) developers out there, Total Immersion has released a free version of their D’Fusion Studio.  D’Fusion Studio enables developers to design, create, and deploy non-commercial AR applications.  Using this software, you can create games like SkinVaders:

Sadly, it seems Total Immersion supports only Windows-based systems at this moment.  So, unless you have Windows or Boot Camp your Mac, you’ll be a bit out of luck.  While I’m disappointed I can’t test D’Fusion on my laptop, I’m still excited to try this beauty out on my desktop at home!

Don’t mind me, but I’m thinking it’d be really cool if we had some Trauma Center and da Vinci Trainer combo. You know, maybe with a bit less “Healing Touch” and a bit more Haptic Technology!

I came across this gem some days ago while browsing on ImTech’s front page. I must say, I’m quite impressed with the effort The Gadget Show put toward constructing such an immersive experience for their simulator’s users!

Some of you might think the show’s hosts were getting too excited about “how immersed” users seemed based on their reactions in the simulator. From a competitive gamer’s perspective, I could imagine this “hesitancy to move” being detrimental to your overall ranking, unless you were being rated on survival rather than the standard FPS kill/death ratio. However, from a researcher’s or experience designer’s perspective, the level of immersion – known as Presence (the measure of how “real” a simulation appears) – is quite important for influencing human behavior.

What does that really mean, though? It means the way you act in a game or simulation may change radically depending on how present you feel in the environment. For example, a typical FPS gamer may just run onto the battlefield and shoot at enemy targets while being shot at himself. There is no consequence for dying beyond a lower score or potentially getting kicked off your team if you don’t contribute much.

Now, contrast that low-risk style of gameplay with the presence felt in a fully immersive simulator complete with physical pain mapping: you get players who are much more meticulous with their actions, as demonstrated in the video by SAS operator Andy McNab. Part of that change may come from the desire to avoid the paintball barrage that penalizes suffering a gunshot wound in the game, but this consequence is coupled with the ability to walk, look, and aim through natural physical movement instead of button presses and joystick flicks. Together, these features provide several layers of immersion, which delivers an overwhelming sense of a “second” reality. Despite knowing they are playing a game, users will undoubtedly treat it more as if it were a real-life situation because of the immense presence they feel.

This is why presence is such a huge accomplishment to achieve in virtual environments and simulations. Without this level of immersion, people may treat the simulation as a low-risk environment and thus not engage with it the way they would if they felt a stronger connection to it. Think about using this kind of technology for training doctors – which is already happening, by the way. I can’t wait till we have even more immersive tech pervading our society on a commercial scale for others to appreciate!

Found this on TechCrunch:


Soon you, too, will be able to talk to the hand. A new interface created jointly by Microsoft and the Carnegie Mellon Human Computer Interaction Institute allows for interfaces to be displayed on any surface, including notebooks, body parts, and tables. The UI is completely multitouch and the “shoulder-worn” system will locate the surface you’re working on in 3D space, ensuring the UI is always accessible. It uses a picoprojector and a 3D scanner similar to the Kinect.
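The article doesn’t go into implementation details, but the basic idea of finding touches with a depth camera can be sketched in a few lines. Below is a hypothetical Python/NumPy illustration of my own (the depth_mm, estimate_surface_depth, and touch_mask names are mine, not Microsoft’s or CMU’s): estimate the depth of the projection surface, then flag pixels where something presses within about a centimetre of it.

```python
import numpy as np

# Hypothetical sketch, not Microsoft/CMU's actual pipeline: find "touch" pixels
# in a single frame from a Kinect-style depth camera.
# `depth_mm` is an HxW array of depth readings in millimetres (0 = no data).

TOUCH_THRESHOLD_MM = 10  # a fingertip within ~1 cm of the surface counts as a touch

def estimate_surface_depth(depth_mm: np.ndarray) -> float:
    """Approximate the projection surface as the most common valid depth."""
    valid = depth_mm[depth_mm > 0]
    hist, edges = np.histogram(valid, bins=256)
    peak = int(np.argmax(hist))
    return 0.5 * (edges[peak] + edges[peak + 1])  # centre of the modal bin

def touch_mask(depth_mm: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels where something is pressed against the surface."""
    surface = estimate_surface_depth(depth_mm)
    height_above = surface - depth_mm  # positive = closer to the camera than the surface
    # A real system would also segment finger-shaped blobs and track them over
    # time; this just returns the raw per-pixel contact mask.
    return (depth_mm > 0) & (height_above > 0) & (height_above <= TOUCH_THRESHOLD_MM)
```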

Click here for more of the story!

X-Ray Heart by mmattes

Ever wondered what it would be like if you could see through objects?

Well, three students at Stanford University – Jon Rodriguez, Ben Jensen, and James Kahramaner – are determined to create X-Ray Vision in Virtual Environments.  By calculating a focal point based on the user’s pupil accommodation, their product Voxel Vision will let users focus inside solid, opaque virtual objects or beyond them, effectively letting them see through walls in a natural manner.  The ambitious trio lists visual surgical assistance, ambush deterrence, and cellular composition as possible use cases, emphasizing that the applications may extend to enhancing computer desktop interactions.
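Their KickStarter page doesn’t spell out the rendering math, so here is only my guess at the core idea as a hypothetical Python sketch (the surface_opacity function and its parameters are made up for illustration, not Voxel Vision’s code): once you have an estimate of the viewer’s focal distance, any surface sitting in front of the focal plane can simply be faded out.

```python
# Hypothetical sketch of the "see through walls" idea, not Voxel Vision's code.
# Given the viewer's estimated focal distance, fade out any surface that sits
# in front of the focal plane so the viewer appears to look through it.

def surface_opacity(surface_distance_m: float,
                    focal_distance_m: float,
                    fade_band_m: float = 0.5) -> float:
    """Return an alpha in [0, 1] for a surface at `surface_distance_m` from the eye.

    Surfaces well in front of the focal plane become transparent; surfaces at
    or beyond the focal plane stay opaque. `fade_band_m` softens the transition.
    """
    gap = focal_distance_m - surface_distance_m  # > 0 means focusing beyond the surface
    if gap <= 0:
        return 1.0  # focusing on or in front of this surface: fully opaque
    return 1.0 - min(gap / fade_band_m, 1.0)  # linear fade over the band

# Example: focusing 3 m away through a wall 1 m away -> the wall goes invisible.
print(surface_opacity(surface_distance_m=1.0, focal_distance_m=3.0))  # 0.0
```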

So far, their project has successfully received backing from their KickStarter followers.  With a little over $5,000 in initial funding, Rodriguez, Jensen, and Kahramaner are aiming to finish the project by September 2012.  Along the way, they plan to recruit some talented specialists.  If this is up your alley, definitely contact them about the potential for a future collaboration.

Voxel Vision’s team welcomes any comments, questions, or critiques of their work.  For more information on how to get in touch, check out their KickStarter page.

Hi, guys!

Those of you who know me also know I’m looking into future careers in the immersive technologies industry.  So far, I’ve been posting personal musings on academic institutions and other little VR-related goodies, and now I’m going to start a series called “Cool Companies.”  Basically, Cool Companies will consist of small blurbs, videos, and images about companies working in immersive technologies that you guys should check out if you’re interested in finding a career in this field.

Starting off our first company in the series is…

Oblong

The era of one human, one mouse, one screen, one machine is giving way to what’s next: multiple participants, working in proximity and remotely, using a groundbreaking spatial interface to control applications and data spread across every display. This is what Oblong builds. It’s why we’re here.

That’s some pretty fancy jargon, but Oblong not only talks the talk, it also walks the walk with their g-speak spatial operating environment (SOE) platform:

g-speak overview 1828121108 from john underkoffler on Vimeo.

Look like Minority Report?  Well, it should, since g-speak made its debut in that movie.  In addition to g-speak, Oblong offers collaborative, shared-environment technologies such as Mezzanine, their digital whiteboard-like product.

Sound interesting?  Find out more at their company website.

At the last Consumer Electronics Show (CES 2011), Sony showcased a number of 3D viewing technologies, including TVs and a Head Mounted Display (HMD) prototype.  I don’t know about you guys, but my first HMD looked like this:

Virtual Boy

Yes, I was one of those lucky kids who got to play with a Nintendo Virtual Boy.  And you know what?  I had fun playing games like Mario Clash and Nester’s Funky Bowling in magnificent red 3D visuals.  I’m not sure whether the VB contributed to my myopia, but I like to believe it only provided me a good time – and maybe a slight neck ache from the weird angle I had to be at to play.

In any case, Virtual Boy wasn’t the only HMD I got to put on my noggin.  As one of the senior research programmers for Stanford’s Virtual Human Interaction Lab, I had the opportunity to develop 3D simulations that were seen through an nVisor SX60:

nVisor SX60

Creating a real-time immersive experience requires technical chops, finesse, and consideration for your users and subjects; running these simulations with precision-point tracking can be even more taxing on your hardware, as each frame must be updated at least 60 times per second to avoid motion sickness.  Tack on the several thick cables required to transmit all that data from your rendering machines to each eye’s screen, and you’ve got quite a few intricacies to manage.
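To give a feel for that 60-frames-per-second budget, here is a bare-bones, hypothetical Python loop of my own (nothing like the actual VHIL pipeline) that times each frame and counts the ones that blow the roughly 16.7 ms budget.

```python
import time

# Minimal frame-budget watchdog sketch. `render_frame` stands in for whatever
# updates tracking and draws both eyes; it is a placeholder, not real lab code.

TARGET_HZ = 60
FRAME_BUDGET_S = 1.0 / TARGET_HZ  # ~16.7 ms per frame

def run(render_frame, num_frames: int = 600) -> None:
    missed = 0
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()  # update tracking, render left and right eye views
        elapsed = time.perf_counter() - start
        if elapsed > FRAME_BUDGET_S:
            missed += 1  # frames like this are what risk inducing motion sickness
        # Sleep off any leftover budget so we don't outrun the display.
        time.sleep(max(0.0, FRAME_BUDGET_S - elapsed))
    print(f"{missed}/{num_frames} frames missed the {TARGET_HZ} Hz budget")

# Example with a do-nothing renderer:
run(lambda: None, num_frames=60)
```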

How does this all tie in with today’s subject, Sony’s new HMZ-T1?

Sony HMZ-T1 Personal 3D Viewer

Well, Sony highlights this new device as a “personal 3D viewer,” which basically means watching your shows and movies in 3D and in private.  This application is fine for your shy media watcher, but gamers and like-minded individuals are hoping Sony will fit future iterations of the HMZ-T1 with precision tracking, ultimately transforming the HMZ-T1 into a machine of magic – one that can create compelling immersive virtual environments that enhance gaming experiences.

There are some cautions to note about the current HMZ-T1, though.  Anyone who has used similar devices will recall the potential to develop motion sickness from ill-adjusted use.  Furthermore, Sony warns that children under 15 should not use the Personal 3D Viewer – possibly due to unforeseen effects on development?  Fortunately, a can of ginger ale can alleviate motion-sickness symptoms.

The HMZ-T1 is scheduled to debut commercially on November 11 in Japan at a price tag of about $780 USD.  Sony promises Americans can get their hands on the device for $800 USD sometime in November this year.  That sounds like a hefty sum to pay for a personal TV, but if the HMZ-T1 can be used as a precision-tracking HMD, the price may very well be worth the rainy-day savings when compared to competitors’ higher-end offerings at $25k–$37k.

For detailed specs about Sony’s HMZ-T1 Personal 3D Viewer, as well as the ability to preorder your own, please visit Sony’s website here.

Kajimoto Laboratory

As one of our most underappreciated yet useful senses, touch has been a tricky experience to simulate realistically in immersive environments and VR simulations.  Most attempts to increase tactile presence (how real a situation feels to you) in simulations involve general vibrations across the skin, such as the N64’s commercial Rumble Pak; force-feedback haptic devices like SensAble’s PHANTOM Omni or Novint’s Falcon; and shear forces (the sensation on your skin as an object’s surface runs along it), like some of the research conducted in the Kajimoto Laboratory.
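Just to make the vibration end of that spectrum concrete, here is a toy Python example of my own – simple Rumble Pak-style feedback, nothing to do with Kaji-Lab’s shear-force work – that maps how deeply a virtual fingertip presses into an object onto a motor amplitude.

```python
# Toy, hypothetical example: scale vibration with how deeply a virtual fingertip
# penetrates an object's surface. Rumble Pak-style feedback, not Kaji-Lab's work.

def rumble_amplitude(penetration_mm: float, max_penetration_mm: float = 5.0) -> float:
    """Map contact depth to a vibration amplitude in [0, 1]."""
    if penetration_mm <= 0:
        return 0.0  # no contact, no vibration
    return min(penetration_mm / max_penetration_mm, 1.0)  # clamp at full strength

# Light touch vs. firm press:
print(rumble_amplitude(0.5))   # 0.1
print(rumble_amplitude(10.0))  # 1.0
```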

Kaji-Lab has done extensive research in haptic and multimodal interfaces since 2006.  What I find most intriguing about this lab is their drive to study and virtually replicate a variety of interactions, including the sensations of tele-present kissing…

and simulated “bassari” (feeling of being cut by a sword)…

I haven’t heard any other news about the kissing machine, but the bassari research seems to have undergone a newer iteration as demonstrated in this article.

While some may be skeptical about these particular ideas, it is this kind of work that paves the way for improving our immersive technologies and experiences.  For more information about the Kajimoto Laboratory and their work, please check out their website here:

http://kaji-lab.jp/en/index.php?FrontPage

A Virtual Reality and converging technologies conference held in France (March 28 – April 1, 2012).  If you’re interested in VR or related fields, this is definitely the event to attend to get a feel for what’s cutting edge in industry and research.

As for logistics, the conference begins March 28th for professionals and opens up to the general public on March 31st and April 1st.  Major VR R&D companies like Mechdyne Corporation, Disney Research, and NVIDIA will be attending this year.  More information can be found at laval-virtual.org, but be warned that the site is predominantly in French, with limited English translations scattered about!