Exploration of Neural-Gestural Interfaces for the Control of Robots - by Rebecca Oet, Melissa Kazazic
This project uses gestures and neural feedback to enable natural communication between a user and a robot. Gestural data is captured by the Myo armband; neural data (the user's concentration level) is captured by the Emotiv headset. Together, these two data streams direct the robot's movement.
- Background on neural interfaces, gestural interfaces, and robots
- Overview of technology used in the project: Emotiv headset, Myo Band, Raspberry Pi
- Python scripts
- Applications and future directions
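The fusion step described above might look something like the following sketch, in which a Myo gesture selects a movement command but is only forwarded to the robot when the Emotiv concentration reading clears a threshold. All names, the gesture-to-command mapping, and the threshold value are illustrative assumptions, not the project's actual code.

```python
from typing import Optional

# Hypothetical mapping from Myo gesture labels to robot commands.
GESTURE_COMMANDS = {
    "fist": "stop",
    "wave_in": "turn_left",
    "wave_out": "turn_right",
    "fingers_spread": "forward",
}

# Assumed Emotiv focus metric on a 0.0-1.0 scale; 0.6 is an arbitrary cutoff.
CONCENTRATION_THRESHOLD = 0.6


def fuse(gesture: str, concentration: float) -> Optional[str]:
    """Return a robot command, or None if the user is not focused
    enough or the gesture is unrecognized."""
    if concentration < CONCENTRATION_THRESHOLD:
        return None  # ignore gestures made without sufficient focus
    return GESTURE_COMMANDS.get(gesture)
```

For example, `fuse("fist", 0.8)` yields `"stop"`, while `fuse("wave_in", 0.3)` yields `None` because the concentration reading falls below the threshold.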
Module of the Month: pybleno - by Gary Johnson
Talk 2: Open
Want to present a talk? Submit it here: https://www.papercall.io/clepy