The prototype of Engineering's 3D table transforms digital information, captured by stereoscopic devices, into a physical manifestation that users can interact with directly.
Designed and built by Engineering's Research Laboratory, the 3D table architecture integrates three elements: an environment for tracking objects and movements using a Kinect sensor mounted on a pedestal; a web3D-based simulation environment in which the tracked scene can be reproduced; and the table itself, an array of actuators (servomotors and RGB LEDs) driven by control logic based on Arduino boards.
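The core of such a pipeline is mapping a depth frame from the tracking sensor onto the actuator grid. The sketch below is a minimal illustration of that idea, not the prototype's actual code: the grid size, angle range, and colour scheme are all assumptions, and a real system would stream the resulting angles to the Arduino boards over a serial link.

```python
import numpy as np

GRID = 8                       # hypothetical actuator grid (8x8 pins)
SERVO_MIN, SERVO_MAX = 0, 180  # assumed servo angle range in degrees

def depth_to_actuators(depth, grid=GRID):
    """Downsample a 2D depth frame (millimetres, Kinect-style) into a
    grid x grid matrix of servo angles plus depth-coded LED colours."""
    h, w = depth.shape
    # Crop to a multiple of the grid size, then block-average each cell.
    cells = depth[:h - h % grid, :w - w % grid].reshape(
        grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    # Normalise to 0..1, with nearer surfaces producing higher pins.
    lo, hi = cells.min(), cells.max()
    norm = np.zeros_like(cells) if hi == lo else (hi - cells) / (hi - lo)
    angles = SERVO_MIN + norm * (SERVO_MAX - SERVO_MIN)
    # Illustrative colour coding: blue for low pins, red for high ones.
    colors = np.stack(
        [norm * 255, np.zeros_like(norm), (1 - norm) * 255],
        axis=-1).astype(np.uint8)
    return angles, colors
```

In use, each frame captured by the sensor would be passed through this function, and the resulting angle matrix written to the servomotors while the colour matrix drives the RGB LEDs.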
The 3D table is designed both as a simple output device, on which scenes or objects can be reproduced, and as part of a smarter architecture integrating other objects that connect to each other and exchange information, making a simple idea tangible and realizable.
If a model or prototype were reproduced on the table, it would be possible, for example via Google Glass and augmented reality, to read or even process the metadata attached to it, select the parts of the model that are of interest, or change them on the table itself, and to share the information gathered during this sensory experience with other users or with objects on the same network.
This data could also be sent to a 3D printer that then builds the object shown on the table. And all of this could happen not just for a single user, but within a collaborative environment where several users work together to produce items.
Practical applications of this development include remote interaction with and movement of objects, medical-surgical simulations, tactile surfaces for the visually impaired, physical reproduction of digital modeling environments, and collaborative modeling environments.