“exTouch is a novel embodied, spatially aware interface system for manipulating actuated objects, mediated by augmented reality. The “exTouch” system extends the user's touchscreen interactions into the real world by enabling spatial control over the actuated object. When users touch a device shown in live video on the screen, they can change its position and orientation through multi-touch gestures or by physically moving the screen in relation to the controlled object.”
The website of a research project (alas, German only) that revolves around operating computers with brain input alone. The idea builds on the BCI (brain-computer interface), which provides direct communication between brain and computer with no muscular activity, thus enabling people with severe motor disabilities, such as those caused by ALS or MS, to work with computers. The project is being developed by Adi Hoesle and Andrea Kübler.
Watch a video (again German only) here >>>
I have been working on a tutorial blog for novice 3D-world users, geared towards the attendees of a virtual conference session to be held as part of ISEA2011:
The tutorials are custom-designed for a standalone metaverse called NGrid, which is hyperlinked to the OpenSim system; however, I hope that most of the material will be useful to all novice metaverse users, regardless of where their virtual locations might be.
More images of NGrid can be seen here:
And the page for the ISEA2011 event is here:
The Architectural Association School of Architecture, one of the world’s most respected and ambitious pedagogical laboratories for architectural design and spatial research, offers a 10-day international visiting programme in Paris, running twice per year. Upcoming: FALL 2011.
To demonstrate the principle of the Mercedes-Benz PRE-SAFE® precrash system, we simply made walls look as though they were transparent. For the first time ever, people could really see through the walls.
April Fools’ Day, 2011. Want to preserve the link…
by Andreas Wolter, Jens Weber.
Marking the 90th anniversary of the establishment of the Bauhaus, 2009 is an appropriate year for an exhibition dedicated to examining the social networks of the Bauhaus movement. In preparation for this project, biographical details of all of the members of the Bauhaus will be systematically structured and entered into an online database. The impressive volume of information resulting from this effort will then be presented within an illuminated 4 × 4 meter cube at the Bauhaus University in Weimar.
The exhibition thus becomes an immersive yet highly structured digital archive rich with historical detail. Complex interrelationships are made more accessible through an innovative graphical interface. All visualizations of the network are drawn directly from the research database and presented in an intuitive computer-generated form. At an interactive digital tabletop, visitors can also examine individual parts of the greater network in more detail.
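The post does not describe the research database itself, but the kind of network it feeds is easy to picture. As a rough sketch, assuming hypothetical biographical relationship records (the names are real Bauhaus figures; the records and labels are invented for illustration):

```python
from collections import defaultdict

# Hypothetical relationship records; the actual schema of the
# Bauhaus research database is not described in the post.
relationships = [
    ("Walter Gropius", "Paul Klee", "colleague"),
    ("Paul Klee", "Wassily Kandinsky", "colleague"),
    ("Walter Gropius", "Wassily Kandinsky", "appointed"),
]

# Build an undirected adjacency list keyed by person,
# the minimal structure a network visualization would query.
network = defaultdict(set)
for person_a, person_b, _label in relationships:
    network[person_a].add(person_b)
    network[person_b].add(person_a)

# An interactive tabletop view could now pull a local neighbourhood,
# e.g. everyone directly linked to Klee:
print(sorted(network["Paul Klee"]))  # → ['Walter Gropius', 'Wassily Kandinsky']
```

A real system would of course store richer biographical detail per node and per edge, but the neighbourhood query above is the basic operation behind "examining individual parts of the greater network".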
The data, collected by a GPS unit, a fisheye-lens video camera, and microphones mounted on a car, are reconstructed as an installation. The direction in the video and in the real installation space are kept synchronized, so the installation remains consistent with the vehicle's actual orientation.
This video shows all displayable characters in the Unicode range 0–65536 (49,571 characters), one character per frame, in 30 minutes.
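Those figures imply an unusual frame rate; a quick back-of-the-envelope check, using only the numbers quoted in the post:

```python
# Figures quoted in the post: 49,571 characters, one per frame, 30 minutes.
characters = 49571          # displayable characters in U+0000..U+FFFF per the post
duration_s = 30 * 60        # 30 minutes in seconds
fps = characters / duration_s
print(f"{fps:.2f} frames per second")  # → 27.54 frames per second
```

So the video runs at roughly 27.5 fps, close to a standard video frame rate, which is presumably why those particular numbers line up.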
For more info (on the sound, for instance), visit here.