A room of innovative projects that work only in connection with our bodies: expanding and enhancing our senses, utilising our bio-signals and reacting to our bodies' transmissions.
Touch, gesture, vibration, muscle motion and all our senses are now part of our digital world; our biofeedback data extends all around us, enabling us to send and receive signals and to expand our bodies into new realms.
Come and talk to the innovators of these human-machine interaction systems, learn about the future of body-led tools and experience the blending of our physical and data bodies for work, care and fun.
Curated by body>data>space for FutureFest 2018
Quietude by Santa Chiara Fab Lab and WEAR Sustain
This beautiful set of jewellery has been developed to enhance the experience of deaf women in a sound-oriented world. The accessories detect sounds and translate them into shape changes, light patterns and vibrations. Wearing the accessories, deaf women can perceive sounds through their bodies. Quietude aims to balance a functional approach to disability with an ethical and aesthetic exploration of the technologies that support it.
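The sound-to-feedback translation the pieces perform might be sketched as below. This is an illustrative mapping only, not Quietude's actual firmware; the channel names, threshold and response curves are assumptions.

```python
def sound_to_feedback(amplitude, threshold=0.1):
    """Hypothetical mapping from a detected sound level (0.0-1.0) to the
    three feedback channels the Quietude pieces use: shape change, light
    pattern and vibration. All parameter choices here are illustrative."""
    if amplitude < threshold:
        # Quiet ambient sound: the jewellery stays still and dark.
        return {"shape": 0.0, "light": 0.0, "vibration": 0.0}
    level = min(amplitude, 1.0)
    return {
        "shape": round(level, 2),          # how far a kinetic element unfolds
        "light": round(level ** 0.5, 2),   # light responds even to quiet sounds
        "vibration": round(level ** 2, 2), # vibration reserved for loud sounds
    }

print(sound_to_feedback(0.04))  # below threshold: no feedback
print(sound_to_feedback(0.64))  # mid-level sound: all three channels active
```

Giving each channel a different response curve lets one sound event feel distinct across the three modalities, which is one plausible way to make sounds legible through the body.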
SUPERGESTURES by Ling Tan and CityVerve
This participatory project was co-created with young people across Manchester. Through body gestures performed using wearable technology, they express the relationship between the smart city agenda and its impact on people's everyday lives. The project questions how much tangible effect smart city technology can have on future generations. This installation displays the process behind the making of SUPERGESTURES: watch the documentary film and listen to the audio stories co-created by the young participants through the interactive digital platform.
BioMusic by Atau Tanaka and Goldsmiths, University of London
This prototype wearable is a digital musical instrument driven by EMG biosignals from our bodies. It allows you to make music through body gestures, and is being explored for applications in musical education and in the health sector for kinesiotherapy and rehabilitation. The system is an integrated hardware/software package that measures bioelectrical signals through an on-body harness and uses machine learning algorithms for gesture recognition and digital sound synthesis. The demonstration involves Myo Mapper and prototype audio software for biosignals.
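The first stage of any such EMG pipeline is turning a raw, oscillating muscle signal into a smooth activation envelope that can drive a synthesis parameter. A minimal sketch of that stage, with the window size and pitch mapping chosen purely for illustration (this is not the BioMusic implementation):

```python
import math

def rms_envelope(samples, window=8):
    """Rectify and smooth a raw EMG trace with a windowed RMS:
    the standard first step before gesture recognition."""
    env = []
    for i in range(0, len(samples) - window + 1, window):
        win = samples[i:i + window]
        env.append(math.sqrt(sum(x * x for x in win) / window))
    return env

def envelope_to_pitch(level, base_hz=220.0, span_hz=440.0):
    """Map muscle-activation level (0.0-1.0) to a synthesis
    parameter - here, simply pitch in Hz."""
    return base_hz + min(level, 1.0) * span_hz

# A relaxed muscle (low amplitude) vs. a contracted one (high amplitude):
relaxed = [0.05 * (-1) ** n for n in range(32)]
tense = [0.8 * (-1) ** n for n in range(32)]
print(envelope_to_pitch(rms_envelope(relaxed)[0]))  # near the base pitch
print(envelope_to_pitch(rms_envelope(tense)[0]))    # much higher pitch
```

In a full system like the one demonstrated here, the envelope would feed a trained classifier rather than a direct mapping, so that distinct gestures, not just effort levels, select musical behaviour.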
Mother by Inmi Lee and Kyle McDonald
This artwork examines the synaesthetic connections between language and shapes. The participants’ individual interpretation of particular sounds was recorded along with their gestural reactions. This in turn generated 3D sculptures based on these gestures. The motion was captured using Microsoft’s Xbox Kinect, interpreted with openFrameworks and printed into 3D objects using rapid prototyping.
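The step from a captured gesture to a printable object can be sketched as follows. This is a toy illustration of the general technique (extruding a motion path into a triangle mesh), not the artists' openFrameworks code; the function name and depth parameter are assumptions.

```python
def extrude_ribbon(path, depth=0.1):
    """Turn a captured gesture path - a list of (x, y, z) points from a
    motion-capture device - into a triangle mesh by extruding it along z.
    Triangle meshes like this are what 3D printers ultimately consume."""
    tris = []
    for a, b in zip(path, path[1:]):
        a2 = (a[0], a[1], a[2] + depth)  # a, lifted by the extrusion depth
        b2 = (b[0], b[1], b[2] + depth)  # b, lifted by the extrusion depth
        tris.append((a, b, a2))   # lower triangle of this segment's quad
        tris.append((b, b2, a2))  # upper triangle of this segment's quad
    return tris

# A three-point hand trace becomes a ribbon of four triangles:
gesture = [(0.0, 0.0, 0.0), (0.5, 0.2, 0.0), (1.0, 0.0, 0.0)]
mesh = extrude_ribbon(gesture)
print(len(mesh))  # two triangles per path segment -> 4
```

Each consecutive pair of gesture points yields one quad (two triangles), so the sculpture's complexity grows directly with the length of the recorded motion.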
Pepper and MiRo by Cyberselves Roadshow, Sheffield Robotics
Get hands-on experience of some of the latest social robotics research. Come and say hello to MiRo, the biomimetic robot, which is being used in social care, in education and as a research platform by the Human Brain Project. Transcend the limits of your body by teleoperating a humanoid robot, and experience the world from the robot's perspective through a virtual reality headset, remote controls and haptic interfaces.