Expo Installation

3rd I Design Expo

Telepresence Technologies

A great deal of innovative technology was developed for the 3rdI projects. Learn more about the technical aspects here.

Various forms of advanced technology have been created for the exhibitions, ranging from unique user interfaces and Internet-controlled robots to intelligent flooring with computer chips embedded in its surface. Developed by Graham Smith in cooperation with:

  • Internet Controlled Robots: Faculty Robotics and Interaction, University of Applied Sciences of Rotterdam, student Mohammad Yosuf Haydary; Engineering Faculty Chair of Dynamics and Control, University Duisburg-Essen, Prof. Dr.-Ing. Dirk Söffker, Dipl.-Ing. Dennis Gamrad.
  • Online Steering Interface: Faculty of Industrial Design Engineering, Technical University Delft, student Annelies Wisse, Master Design for Interaction.
  • Intelligent Flooring & Design of the Robots: Faculty Intelligent Product Design, HKU Utrecht School Of The Arts, students Jorrit van der Zee, Nicolo Giacomello.
  • Telepresence Furniture & The Web Table: Technical University Eindhoven, Sam Nemeth.

Multi Camera Robot (MCR)

Two types of telepresence video robots are used in the exhibitions. The first is the MCR, which is controlled locally and carries numerous panoramic cameras mounted on its chassis, giving the user a wide-angle view of the miniature maquette landscape it explores. The robot uses servo motors, an Arduino motor control board, Radio Control (RC) technology, batteries, sensors and numerous 2.4 GHz video transmitters to send high-definition video imagery to the users. The MCR is unique in that it has a camera mounted directly above its center point that gives the user a view of the floor/ground surrounding the robot. This “God's Eye View” provides the driver with an unparalleled overview of the environment and allows a more natural interaction for people exploring the exhibition.
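The combination of RC receiver and servo-driven wheels described above implies a mixing step from the pilot's stick inputs to left/right wheel commands. The sketch below is illustrative only, assuming standard RC pulse widths (1000–2000 µs, 1500 µs neutral) and an arcade-style mix; the actual MCR firmware is not published.

```python
def mix_rc_channels(throttle_us, steering_us):
    """Arcade-style mix: convert RC throttle/steering pulses (1000-2000 us,
    1500 us = neutral) into left/right continuous-rotation servo pulses.
    Illustrative sketch only; the real MCR firmware may differ."""
    throttle = (throttle_us - 1500) / 500.0   # normalize to -1.0 .. 1.0
    steering = (steering_us - 1500) / 500.0
    # Steering is added to one wheel and subtracted from the other.
    left = max(-1.0, min(1.0, throttle + steering))
    right = max(-1.0, min(1.0, throttle - steering))
    # Convert back to pulse widths for the servo outputs.
    return int(1500 + left * 500), int(1500 + right * 500)
```

Full throttle with the steering stick centered drives both wheels forward equally; a pure steering input spins the wheels in opposite directions, turning the robot in place.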

Internet Controlled Robot (ICR)

The second type of robot used in the exhibition is the ICR, which is linked to the Internet and can be controlled over Skype videoconferencing from any location in the world. These robots use mini computers with Intel Core 2 Duo chips, wide-angle auto-focus cameras, Arduino motor control boards, high-performance lithium-ion batteries, RFID readers, 12 VDC gear motors and IR sensors. Custom software allows the robots to be controlled over the Internet via Skype by any user who downloads the control software from the project website. The RFID reader is linked to the Intelligent Floor, giving users a GPS-like navigation tool that lets them easily navigate the exhibition and find specific projects.
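At its core, remote control of this kind reduces to translating commands received from the remote user into wheel speeds for the Arduino motor board. The command names and speed values below are hypothetical, a minimal sketch of the idea rather than the actual 3rdI control protocol, which is not published.

```python
# Hypothetical command set; the real 3rdI protocol is not published.
# Values are (left, right) motor speeds in a -255..255 PWM range.
COMMANDS = {
    "forward": (200, 200),
    "back":    (-200, -200),
    "left":    (-150, 150),   # spin in place, counterclockwise
    "right":   (150, -150),   # spin in place, clockwise
    "stop":    (0, 0),
}

def parse_command(text):
    """Map a remote user's text command to wheel speeds.

    Unknown or garbled input defaults to a full stop, a safe choice
    for a robot driving among exhibition models."""
    return COMMANDS.get(text.strip().lower(), (0, 0))
```

In practice the resulting speed pair would be forwarded to the Arduino board (for example over a serial link), but that wiring is specific to the exhibit's hardware.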

Intelligent Flooring

The floor that the robots maneuver over has hundreds of Radio Frequency Identification (RFID) tags embedded in its surface, providing the robots with their own personal “Google Maps”-style overview of their position within the exhibition. Tags are also positioned around each model in the exhibit, allowing users to call up detailed descriptions of specific projects and the artists/designers who created them. The RFID tags create a unique environment that reacts to the position of the robot guests, adding information, navigation and intelligence to the static structures.
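Conceptually, the floor works as a lookup table: each tag ID the robot's reader scans maps to a known position, and tags placed near models additionally map to project information. The tag IDs, grid layout and project names below are invented for illustration; the exhibit's real tag map is specific to its floor plan.

```python
# Illustrative tag map; real tag IDs and positions depend on the exhibit.
TAG_MAP = {
    "04A1B2": {"pos": (3, 7), "project": "Web Table"},
    "04A1B3": {"pos": (3, 8), "project": None},  # plain floor tile
}

def locate(tag_id):
    """Return (grid position, nearby project or None) for a scanned tag.

    An unknown tag returns None, in which case the robot would simply
    keep its last known position until the next tag is read."""
    entry = TAG_MAP.get(tag_id)
    if entry is None:
        return None
    return entry["pos"], entry["project"]
```

Because the robot only updates its position when it passes over a tag, tag density on the floor directly sets the resolution of this indoor "GPS".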

Body Controlled Robot Interface (SPI)

To control the robots, users step onto a device called the Step Pad Interface (SPI), which uses body motion as a control input. The SPI lets the audience steer two of the robots with their body movements in a natural way that mimics the feeling of walking. The first SPI is linked to the MCR robot, which is controlled locally at the exhibit, and incorporates an overhead video projector that surrounds the user with video imagery of the ground around them. The second SPI is located at the virtual exhibition site and linked to one of the ICR robots. These intuitive interfaces let the audience interact with the exhibit as if they were strolling through a real environment, offering a next-generation experience for people exploring a model environment.
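One plausible way to turn body motion on a pad into robot motion is to read the user's weight distribution from load sensors under the pad and map it to a forward speed and turn rate. The sensor layout and mapping below are assumptions for illustration, not the SPI's actual implementation.

```python
def step_pad_to_drive(front, back, left, right):
    """Convert four load-cell readings (each normalized 0..1) from a
    step pad into (forward_speed, turn_rate), each in -1..1.

    Leaning forward walks the robot forward; shifting weight sideways
    turns it. Illustrative sketch only, not the actual SPI design."""
    forward = front - back   # net front/back weight shift
    turn = right - left      # net left/right weight shift
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(forward), clamp(turn)
```

Standing balanced yields (0.0, 0.0), so the robot stays put until the user deliberately shifts their weight, which is what makes the interface feel like walking.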

Telepresence Furniture (TF)

People at both exhibition sites will be able to interact and talk with each other using Telepresence Furniture (TF), which links the two audiences via videoconferencing devices that project full-size images of users and their tabletops into both linked environments. The TF creates a shared sense of presence for audiences in both countries, breaking down barriers and giving people direct eye contact, three-dimensional imagery and shared table space in an easy-to-use piece of furniture that invites people from both cultures to sit down and share a meal as if they were a meter apart instead of 10,000 kilometers (www.webchair.com).

