Tuesday, July 26, 2011

mesh modeller that uses Kinect's depth perception and homemade data gloves

 

A simple mesh modeller that uses Kinect's depth perception and homemade data gloves for more real-world-oriented user interaction in virtual 3D space.
Realized only with open-source software.

Monday, July 25, 2011

SixthSense


 

SixthSense: How It Works: A webcam captures video, including specific hand signals that the laptop reads as commands. A mini-projector then displays the relevant content — e-mail, stock charts, photos — on the nearest surface. (Bland Designs)
Imagine a wearable device that lets you physically interact with interfaces that appear in front of you on any surface, where and when you want them. You can watch a video on your newspaper's front page, navigate through a map on your dining table, and flick through photos on any wall. The "Sixth Sense" system from Pattie Maes' Fluid Interfaces Group at the MIT Media Lab does all this through a prototype built from $300 worth of off-the-shelf components. You can even take a photograph by simply holding your hand in the air and making a framing gesture.

 


SixthSense - Wikipedia, the free encyclopedia

SixthSense is a wearable gestural interface device developed by Pranav Mistry, a PhD student in the Fluid Interfaces Group at the MIT Media Lab. It is similar to Telepointer, a neck-worn projector/camera system developed by Media Lab student Steve Mann[1] (which Mann originally referred to as "Synthetic Synesthesia of the Sixth Sense").[2]

The SixthSense prototype comprises a pocket projector, a mirror and a camera contained in a pendant-like, wearable device. Both the projector and the camera are connected to a mobile computing device in the user’s pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision-based techniques.[3] The software program processes the video stream data captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tips of the user’s fingers. The movements and arrangements of these fiducials are interpreted into gestures that act as interaction instructions for the projected application interfaces. SixthSense supports multi-touch and multi-user interaction.
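The colored-marker tracking described above boils down to two steps per frame: threshold the image for a marker colour, then take the centroid of the matching pixels as the fingertip position. Here is a minimal sketch of that idea in plain NumPy — the function name, colour tolerance, and synthetic frame are illustrative, not taken from the actual SixthSense software.

```python
import numpy as np

def track_marker(frame_rgb, target, tol=30):
    """Return the (row, col) centroid of pixels whose colour is within
    `tol` of `target`, or None if no such 'marker' pixels are visible.
    (Illustrative sketch, not the real SixthSense tracker.)"""
    diff = np.abs(frame_rgb.astype(int) - np.array(target)).sum(axis=2)
    mask = diff < tol
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic 100x100 frame with a red "marker" blob centred at (40, 60)
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[38:43, 58:63] = (255, 0, 0)
print(track_marker(frame, (255, 0, 0)))  # centroid near (40.0, 60.0)
```

A real implementation would work in HSV space and smooth the track over time, but the centroid-of-a-colour-mask idea is the same.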

Friday, July 15, 2011

Voyagers [openFrameworks]

Voyagers - permanent installation for @NMMGreenwhich by @Lightsurgeons @obviousjim @tgfrerer and others #openFrameworks | CreativeApplications.Net

 


Created by The Light Surgeons for the National Maritime Museum in London, the installation “Voyagers” engages with England’s long-standing relationship to the sea, featuring thematic images and film from the museum’s collection animated atop a continually flowing ocean of typography across an abstract wave-shaped structure. Together with a number of other projects, the installation opens to the public tomorrow. We got a chance to take a sneak peek earlier today and get some insight into the making, together with what we enjoy most – the debug info and some fantastic behind-the-scenes images.

 

James George from the New York studio Flightphase collaborated with The Light Surgeons to create a custom application to animate the content in real time. Created using openFrameworks, the applications use a number of different tools to communicate the narratives. The ocean effect of type sweeping across the installation surface is a 3D wave simulation created using a vector field. The complete simulation is stitched and mapped across seven projectors covering the 20-metre triangulated surface. The image sets were designed by The Light Surgeons to relate to each of the six themes of the museum; openFrameworks parses the layouts and generates animations that cascade down the wave. At the far end of the gallery is a Puffersphere, an internal spherical projector. During each cascade of images the Puffersphere collects thematic keywords that relate to the images and prints them onto the surface of the globe. Likewise, the type waves trigger projected content on the sphere as they “hit” its surface. The audio, created by Jude Greenaway, is mixed dynamically by interfacing openFrameworks with SuperCollider over OSC.
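The vector-field technique behind the type "ocean" can be sketched very simply: treat each glyph as a particle and, each frame, move it by the velocity sampled from the field at its position. The toy below (not the installation's openFrameworks code; field shape, step size, and names are all illustrative) shows one Euler step of that advection.

```python
def sample(field, x, y):
    """Nearest-cell lookup of the velocity field (a 2D grid of (vx, vy))."""
    h, w = len(field), len(field[0])
    i = min(max(int(round(y)), 0), h - 1)
    j = min(max(int(round(x)), 0), w - 1)
    return field[i][j]

def advect(positions, field, dt=1.0):
    """One Euler step: move each glyph by the local field velocity."""
    out = []
    for x, y in positions:
        vx, vy = sample(field, x, y)
        out.append((x + vx * dt, y + vy * dt))
    return out

# Uniform rightward current with a slight downward drift
field = [[(1.0, 0.25) for _ in range(10)] for _ in range(4)]
glyphs = [(0.0, 0.0), (2.0, 1.0)]
print(advect(glyphs, field))  # [(1.0, 0.25), (3.0, 1.25)]
```

The installation's wave effect presumably varies the field over time to produce the rolling motion, then splits the resulting canvas across the seven projector outputs.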

James used Dan Shiffman‘s Most Pixels Ever library for synchronizing the applications. He has also released a number of changes to the library that can be found here (github). The team also built a way to synchronize parameters over the network using MPE (github), and through developing content for the Puffersphere they created a lightweight library for animating the surface of the sphere, which can also be found here.

Full credits:

Design/Direction: The Light Surgeons
Bespoke Software Design: Flightphase
Sound Design: Jude Greenaway
Additional Programming: Timothy Gfrerer
SuperCollider Programming: Michael McCrea
Exhibition Design: Real Studios

National Maritime Museum

 

Sunday, July 10, 2011

tactile pixels to create an electrical field you can feel.

Tactile Pixels Will Make It Easy To Read Braille On Touchscreens

It’s a familiar complaint. Even as companies like Apple try to tie the computer interface back to the natural world, touchscreens are still woefully flat. But recent developments, including Senseg’s new E-sense technology, could bring real-world touch into everyday computing. 

Obviously, this is nothing new. Research into programmable friction has already yielded impressive results, especially in the realm of stickiness. But this concept accomplishes the same goal by using “tixels,” or tactile pixels, to create an electrical field you can feel. Your skin responds by feeling whatever the interface wants it to feel: buttons, perhaps, or even the fur of a virtual pet.

The project sounds really promising, and Senseg already has Toshiba backing it. Imagine video chat aided by tactile pixels: being able to gently touch the face of a newborn baby, or the hand of a distant lover. The possibilities are as endless as ever. [The Next Web via Geekosystem]