“exTouch is a novel embodied, spatially aware interface system for manipulating actuated objects, mediated by augmented reality. The exTouch system extends the user's touchscreen interactions into the real world by enabling spatial control over the actuated object. When users touch a device shown in live video on the screen, they can change its position and orientation through multi-touch gestures or by physically moving the screen in relation to the controlled object.”
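The core interaction — a touch gesture on the live video becoming a motion command for the physical object — can be sketched roughly as follows. This is a hypothetical illustration, not exTouch's actual API: the function name, the `pixels_per_cm` scale factor, and the command format are all assumptions.

```python
import math

def gesture_to_object_command(drag_dx, drag_dy, rotate_rad, pixels_per_cm):
    """Map a screen-space multi-touch gesture to a motion command for an
    actuated object (hypothetical sketch, not exTouch's real interface).

    drag_dx, drag_dy: one-finger drag in screen pixels
    rotate_rad: two-finger rotation angle in radians
    pixels_per_cm: apparent scale of the object in the live video
    """
    # Convert the screen-space drag into object-space centimetres,
    # using the object's apparent size in the video as the scale.
    move_x_cm = drag_dx / pixels_per_cm
    move_y_cm = drag_dy / pixels_per_cm
    # Pass the two-finger rotation through as a turn, in degrees.
    turn_deg = math.degrees(rotate_rad)
    return {"move_cm": (move_x_cm, move_y_cm), "turn_deg": turn_deg}
```

A 50-pixel drag on an object that appears 10 pixels per centimetre thus becomes a 5 cm translation; a quarter-turn pinch becomes a 90° rotation command.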
The website of a research project (alas, German only) that revolves around operating a computer using only the input of your brain. The idea builds on the brain-computer interface (BCI), which provides direct communication between brain and computer with no muscular activity, thus enabling people with severe motor impairments caused by conditions such as ALS or MS to work with computers. The project is being developed by Adi Hoesle and Andrea Kübler.
Watch a video (again German only) here >>>
I have been working on a tutorial blog for novice 3D world users, geared towards the attendees of a virtual conference session to be held as part of ISEA2011:
The tutorials are custom designed for a standalone metaverse called NGrid, which is hyperlinked to the OpenSim system; however, I am hoping that most of the material will be useful to all novice metaverse users, regardless of where their virtual locations might be.
More images of NGrid can be seen here:
And the page for the ISEA2011 event is here:
The Architectural Association School of Architecture, one of the world’s most respected and ambitious pedagogical laboratories for architectural design and spatial research, offers a 10-day international visiting programme in Paris, running twice per year. Upcoming: FALL 2011.
To demonstrate the principle of the Mercedes-Benz PRE-SAFE® precrash system we simply made walls look like they were transparent. For the first time ever people could really see through the walls.
by Andreas Wolter, Jens Weber.
2009 marks the 90th anniversary of the establishment of the Bauhaus, making it an appropriate year for an exhibition dedicated to examining the social networks of the Bauhaus movement. In preparation for this project, biographical details of all members of the Bauhaus will be systematically structured and entered into an online database. The impressive volume of information resulting from this effort will then be presented within an illuminated 4 × 4 meter cube at the Bauhaus University in Weimar.
The exhibition thus becomes an immersive yet highly structured digital archive rich with historical detail. Complex interrelationships are made more accessible through an innovative graphical interface. All visualizations of the network are drawn directly from the research database and presented in an intuitive, computer-generated form. At an interactive digital tabletop, visitors can also examine individual parts of the greater network in more detail.
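One way such a network can be derived from structured biographical records is by linking any two members who share an attribute — for example, a workshop. This is a minimal sketch under assumed data: the member names, the workshop assignments, and the relation chosen are illustrative, not the project's actual database schema.

```python
from itertools import combinations

# Hypothetical biographical records: member name -> Bauhaus workshops attended.
records = {
    "Anni Albers": {"weaving"},
    "Gunta Stölzl": {"weaving"},
    "Marcel Breuer": {"carpentry"},
    "Josef Albers": {"carpentry", "glass"},
}

def shared_workshop_edges(records):
    """Derive network edges: two members are linked whenever their
    workshop sets overlap — one simple relation a biographical
    database could expose to a graph visualization."""
    edges = []
    for a, b in combinations(sorted(records), 2):
        common = records[a] & records[b]
        if common:
            # Keep the shared workshops as the edge label.
            edges.append((a, b, sorted(common)))
    return edges
```

In practice the exhibition's interface would draw on many such relations (teachers, years, locations), but each reduces to the same pattern: query the database, emit labeled edges, render the graph.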
The data, collected by GPS, a fisheye-lens video camera, and microphones mounted on a car, are reconstructed as an installation space; the direction of travel shown in the video is synchronized with the orientation of the actual space.
By revealing the social networks present within the urban environment, Invisible Cities describes a new kind of city—a city of the mind. It displays geocoded activity from online services such as Twitter and Flickr, both in real-time and in aggregate. Real-time activity is represented as individual nodes that appear whenever a message or image is posted. Aggregate activity is reflected in the underlying terrain: over time, the landscape warps as data is accrued, creating hills and valleys representing areas with high and low densities of data.
We augment humans with wearable, artificially intelligent bionic devices called exoskeletons. In 2008, Berkeley Bionics introduced HULC™, an untethered exoskeleton which allows people to carry up to 200 lbs. for hours. On Oct. 7, 2010, we unveiled eLEGS, an exoskeleton for wheelchair users who are committed to living life to its fullest. It powers you up to get you standing and walking.
This concept from Audi is close to something I have been thinking about for a couple of months: how to introduce urban-scale Augmented Reality into the automobile industry. Current examples of head-up displays are good for night vision and as navigational/telematics aids. However, a car has four to six potential AR display surfaces. Think about that!
You can read the whole article from The Hindu.