In particular, it uses the EPOC C++ SDK and three Node.js modules: bindings, node-gyp and nan.
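As a rough idea of how those three pieces fit together: node-gyp compiles the C++ code (which uses nan to bridge the EPOC SDK to V8) into a .node binary, and bindings locates and loads that binary from JavaScript. Here's a minimal sketch; the target name 'epoc' is hypothetical, not Charlie's actual file name:

```js
// index.js: a minimal sketch of loading a compiled native add-on.
// node-gyp has already built the C++ side (written with nan) into
// a .node binary; the bindings module finds and requires it.
// The target name 'epoc' is an assumption for illustration.
const epoc = require('bindings')('epoc');

console.log(Object.keys(epoc)); // whatever the add-on chose to expose
```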
Once she had that Node.js framework working, she built a small prototype using Three.js that lets her move a 3D cube forward in the browser just by thinking about pushing it. That’s the demo that caught my eye!
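To picture how a demo like that could be wired up, here's a hedged sketch; the headset emitter, the 'mentalCommand' event name and the payload shape are all assumptions standing in for whatever interface the add-on actually exposes:

```js
import { EventEmitter } from 'events';
import * as THREE from 'three';

// Stand-in for the add-on's interface (an assumption): in the real
// demo, the native add-on would be the thing emitting these events.
const headset = new EventEmitter();

// Standard Three.js setup: scene, camera, renderer and a cube.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1000);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

// Assumed payload shape: { command: 'push', power: 0.42 }.
let pushPower = 0;
headset.on('mentalCommand', ({ command, power }) => {
  pushPower = command === 'push' ? power : 0;
});

function animate() {
  requestAnimationFrame(animate);
  // Nudge the cube away from the camera while 'push' is detected.
  cube.position.z -= pushPower * 0.05;
  renderer.render(scene, camera);
}
animate();
```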
It turns out, it’s got a whole lot of possibilities! The add-on she created “exposes data such as the gyroscope data, facial expressions such as smiling, blinking, looking left, right, etc… as well as mental commands”.
In terms of mental commands, Charlie says the following are available at the moment: push, pull, lift, drop, left, right, rotate left, rotate right, rotate clockwise, rotate counter-clockwise, rotate forward, rotate reverse and disappear.
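As a rough idea of what consuming all of that could look like, here's a hedged sketch; the event names and payload shapes are assumptions for illustration, not the add-on's documented interface:

```js
const epoc = require('bindings')('epoc'); // hypothetical module name

// Assumed event names and payload shapes, for illustration only.
epoc.on('gyro', ({ x, y }) => {
  console.log(`head movement: x=${x}, y=${y}`);
});

epoc.on('expression', ({ type }) => {
  // e.g. 'smile', 'blink', 'lookLeft', 'lookRight'
  console.log(`facial expression: ${type}`);
});

epoc.on('mentalCommand', ({ command, power }) => {
  // command would be one of the trained thoughts listed above,
  // e.g. 'push', 'pull', 'rotateLeft', 'disappear'.
  console.log(`thought '${command}' detected (power ${power})`);
});
```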
The process of training the thoughts that interact with the headset doesn’t sound too difficult! Charlie went through the process from the EMOTIV side of things:
“To be able to track mental commands, you need to train them first in their software called the ‘EMOTIV Xavier Control Panel’. You have a dropdown of some ‘thoughts’ you can train and you have to focus on each thought for 8 seconds to record your brain activity in a user file you can use afterwards. So, once you’ve trained thoughts and saved them in your user profile, you can load that file in your program. While wearing the sensor, it will track your brain activity and check if your current ‘thought’ matches anything saved in your file.”
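In code, putting that trained profile to use might look roughly like this; loadProfile, the file path and the event interface are hypothetical names standing in for whatever the add-on actually wraps from the SDK:

```js
const epoc = require('bindings')('epoc'); // hypothetical module name

// Load the user profile saved from the EMOTIV Xavier Control Panel.
// The loadProfile function and the file path are assumptions.
epoc.loadProfile('./profiles/charlie.profile');

// With the profile loaded, the SDK compares live brain activity
// against the trained thoughts and reports any match it finds.
epoc.on('mentalCommand', ({ command, power }) => {
  console.log(`matched trained thought: ${command} (power ${power})`);
});
```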
“When I started learning programming about 4 years ago, I realised the skills I was learning in web development could also be applied to programming for hardware, so I started tinkering with electronics and different devices like the Leap Motion, the Myo armband, etc.” — Charlie Gerard
Charlie says she’d love to try using machine learning to identify thoughts beyond the ones already available, which could be pretty darn amazing.