To demonstrate the principle of the Mercedes-Benz PRE-SAFE® pre-crash system, we simply made walls look like they were transparent. For the first time ever, people could really see through the walls.
April Fools’ Day, 2011. Want to preserve the link…
by Andreas Wolter and Jens Weber.
As the year marking the 90th anniversary of the establishment of the Bauhaus, 2009 is an appropriate year for an exhibition dedicated to the examination of the social networks of the Bauhaus movement. In preparation for this project, biographical details of all of the members of the Bauhaus will be systematically structured and entered into an online database. The impressive volume of information resulting from this effort will then be presented within an illuminated 4 × 4 meter cube at the Bauhaus University in Weimar.
The exhibition then becomes an immersive yet highly structured digital archive rich with historical detail. Complex interrelationships will be made more accessible through an innovative graphical interface. All visualizations of the network are drawn directly from the research database and presented in an intuitive, computer-generated form. At an interactive digital tabletop, visitors can also examine individual parts of the larger network in more detail.
The data, collected by GPS, a fisheye-lens video camera mounted on the vehicle, and microphones attached to the car, are reconstructed as an installation space; the direction of travel in the video is synchronized with the actual orientation of the real space.
This video shows all displayable characters in the Unicode range 0–65535 (49,571 characters), one character per frame, in 30 minutes.
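As a rough sketch of how such a character set might be enumerated, the snippet below walks the Basic Multilingual Plane and keeps only code points with a visible glyph. The exact count (and hence the implied frame rate) depends on the Unicode version shipped with your Python build, so it will not necessarily match the 49,571 characters cited above.

```python
import unicodedata

def displayable_bmp_chars():
    """Enumerate BMP code points (0-65535) that have an assigned, visible
    glyph, skipping unassigned (Cn), control (Cc), surrogate (Cs) and
    private-use (Co) categories."""
    skip = {"Cn", "Cc", "Cs", "Co"}
    return [chr(cp) for cp in range(0x10000)
            if unicodedata.category(chr(cp)) not in skip]

chars = displayable_bmp_chars()
# At one character per frame over 30 minutes, the implied frame rate:
fps = len(chars) / (30 * 60)
print(len(chars), round(fps, 1))
```

At roughly 49,000–50,000 displayable characters, this works out to a little under 30 frames per second.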
For more info (on the sound, for instance), visit here.
By revealing the social networks present within the urban environment, Invisible Cities describes a new kind of city—a city of the mind. It displays geocoded activity from online services such as Twitter and Flickr, both in real-time and in aggregate. Real-time activity is represented as individual nodes that appear whenever a message or image is posted. Aggregate activity is reflected in the underlying terrain: over time, the landscape warps as data is accrued, creating hills and valleys representing areas with high and low densities of data.
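The hills-and-valleys terrain described above can be approximated by a simple count-per-cell density. Below is a minimal sketch, assuming geocoded posts arrive as (lat, lon) pairs and that each grid cell's running count serves as its terrain height; the `cell` resolution and the sample coordinates are illustrative, not taken from the project.

```python
import math
from collections import Counter

def density_terrain(points, cell=1.0):
    """Bin geocoded (lat, lon) points into a grid; each cell's count
    becomes a terrain height, so dense areas form hills and sparse
    ones valleys. `cell` is the grid resolution in degrees."""
    heights = Counter()
    for lat, lon in points:
        heights[(math.floor(lat / cell), math.floor(lon / cell))] += 1
    return heights

# Hypothetical geocoded posts: three clustered near Venice, one in London.
posts = [(45.43, 12.33), (45.44, 12.34), (45.43, 12.33), (51.51, -0.12)]
terrain = density_terrain(posts)
print(terrain)  # the Venice cell accrues height 3, the London cell 1
```

In a real renderer these counts would be accumulated over time and smoothed before being used to displace the landscape mesh.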
We augment humans with wearable, artificially intelligent bionic devices called exoskeletons. In 2008, Berkeley Bionics introduced HULC™, an untethered exoskeleton which allows people to carry up to 200 lbs. for hours. On Oct. 7, 2010, we unveiled eLEGS, an exoskeleton for wheelchair users who are committed to living life to its fullest. It powers you up to get you standing and walking.
This concept from Audi is close to something I have been thinking about for a couple of months: how to introduce urban-scale augmented reality into the automobile industry. Current examples of head-up displays are good as night-vision and navigation/telematics aids. However, a car has four to six potential AR displays available. Think about that!
You can read the whole article in The Hindu.
Finding solutions to physical, chronic pain through VR technologies:
The Twingly screensaver visualizes global blog activity in real time, giving you a 24/7 stream of all (viewer discretion advised) blog activity, straight to your screen. To use the screensaver you need a PC with Windows and a graphics card supporting OpenGL.
At the Venice Biennale, the mæve installation connects the entries of the EveryVille student competition and puts them into the larger context of MACE content and metadata. By placing physical project cards on an interactive surface, the visitors can explore an organic network of projects, people and media. mæve is designed and developed by the Interface Design team of the University of Applied Sciences Potsdam.