When JavaScript tinkerer extraordinaire Charlie Gerard has a demo she controls with her mind, you just know we need to cover it here on Dev Diner. I had a bunch of questions for her to see how she managed this magic!

Charlie Gerard and the EMOTIV EPOC

Charlie and her JavaScript-powered EMOTIV EPOC cube demo!

Mind-controlled JavaScript definitely raises a lot of questions, the biggest being: how? It all works using the EMOTIV EPOC headset, but since the headset wasn't originally JavaScript-compatible, Charlie built her own (now open-source!) solution.

“The project I’ve been working on is to be able to get data from the EMOTIV EPOC in JavaScript. To be able to do that, I used the C++ SDK and built an open-source Node.js add-on so any JavaScript developer who owns this sensor can build an application for it.” — Charlie Gerard

In particular, it uses the EPOC C++ SDK and three Node modules: bindings, node-gyp and nan.
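To give a feel for how those pieces fit together, here's a minimal sketch of the JavaScript side of a native add-on like this one. The bindings call is the real API of that module, but the binding name ('epoc') and the exported functions are placeholders of mine, not Epoc.js's actual interface:

```javascript
// Minimal sketch of consuming a node-gyp/nan-compiled add-on from JS.
// 'epoc' is a placeholder binding name, not the real Epoc.js module.
const bindings = require('bindings');

// bindings() locates the compiled .node binary that node-gyp produced
// (e.g. build/Release/epoc.node) without hard-coding its path.
const epocNative = bindings('epoc');

// The C++ side (wrapped with nan) would expose functions roughly like:
epocNative.connect();             // hypothetical: open the headset
const sample = epocNative.read(); // hypothetical: pull one data frame
console.log(sample);
```

In a setup like this, nan smooths over V8 API differences between Node versions on the C++ side, node-gyp compiles the add-on, and bindings finds the resulting binary at require time.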

Once she got that Node.js add-on working, she built a small prototype using Three.js to move a 3D cube forward in the browser just by thinking about pushing it. That's the demo that caught my eye!
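For a rough idea of what that prototype could look like, here's a hedged Three.js sketch: a cube that nudges forward whenever headset code reports a "push". The wiring to the real add-on is stubbed out here with a timer, so treat it as illustration rather than Charlie's actual demo code:

```javascript
import * as THREE from 'three';

// Standard Three.js setup: scene, camera, renderer.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1000
);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// The cube we'll "push" with our mind.
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

// Placeholder: in the real demo this would be fed by the Epoc.js
// add-on; here a timer simulates a detected "push" every 2 seconds.
let pushStrength = 0;
setInterval(() => { pushStrength = 1; }, 2000);

function animate() {
  requestAnimationFrame(animate);
  if (pushStrength > 0.01) {
    cube.position.z -= 0.05 * pushStrength; // nudge the cube away from us
    pushStrength *= 0.9;                    // let the "thought" fade out
  }
  renderer.render(scene, camera);
}
animate();
```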

What can it track?

It turns out, it’s got a whole lot of possibilities! The add-on she created “exposes data such as the gyroscope data, facial expressions such as smiling, blinking, looking left, right, etc… as well as mental commands”.

In terms of mental commands, Charlie says the following are available at the moment: push, pull, lift, drop, left, right, rotate left, rotate right, rotate clockwise, rotate counter-clockwise, rotate forward, rotate reverse and disappear.
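To give a feel for how an app might consume those commands, here's a small dispatcher keyed on the command names above. The event shape ({ command, power }) is my assumption for illustration, not the documented Epoc.js payload:

```javascript
// Hypothetical handler map for mental-command events; the event shape
// ({ command, power }) is assumed, not taken from the Epoc.js docs.
const handlers = {
  push: (power) => console.log(`push (strength ${power})`),
  pull: (power) => console.log(`pull (strength ${power})`),
  lift: (power) => console.log(`lift (strength ${power})`),
  drop: (power) => console.log(`drop (strength ${power})`),
  // ...one entry per trained command (left, right, rotate left, etc.)
};

function onMentalCommand(event) {
  const handler = handlers[event.command];
  if (handler) handler(event.power);
}

// Example: onMentalCommand({ command: 'push', power: 0.8 });
```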

Training up the brain-sensing headset

Training the headset to recognise particular thoughts doesn't sound too difficult! Charlie walked me through the process from the EMOTIV side of things:

“To be able to track mental commands, you need to train them first in their software called the ‘EMOTIV Xavier Control Panel’. You have a dropdown of some ‘thoughts’ you can train and you have to focus on each thought for 8 seconds to record your brain activity in a user file you can use afterwards. So, once you’ve trained thoughts and saved them in your user profile, you can load that file in your program. While wearing the sensor, it will track your brain activity and check if your current ‘thought’ matches anything saved in your file.”
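On the JavaScript side, using that saved profile could look something like the sketch below. I've seen a connectToLiveData-style call in Epoc.js, but treat the function name, profile path and event fields here as assumptions and check the repo for the real API:

```javascript
// Sketch of loading a trained EMOTIV user profile and reacting to
// matched thoughts. Names are assumptions; see the Epoc.js README.
const epoc = require('epocjs')();

// The profile file holds the brain activity recorded during the
// 8-second training sessions in the EMOTIV Xavier Control Panel.
epoc.connectToLiveData('./profiles/charlie.emu', (event) => {
  if (event.command && event.command !== 'neutral') {
    console.log(`Matched trained thought: ${event.command}`);
  }
});
```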

Getting to this point

Charlie's work up until now has all led quite nicely to this demo: she has long had a focus on personal projects involving motion control. Even better, just like my own work, she uses JavaScript to do it!

“When I started learning programming about 4 years ago, I realised the skills I was learning in web development could also be applied to programming for hardware, so I started tinkering with electronics and different devices like the Leap Motion, the Myo armband, etc.” — Charlie Gerard

One of her projects involved trying to make a Sphero ball move using those different devices and JavaScript, so the natural next step for Charlie was brain sensors.

Charlie was actually part of a team at Sydney NodeBots Day when I ran the event in 2015! Even back then, her team used the NeuroSky brain sensor and JavaScript to trigger colours on screen based on the wearer's focus. Super cool!

What’s next?

Charlie says she'd love to try using machine learning to identify thoughts beyond the ones already available, which could be pretty darn amazing.

If you’ve got any ideas on the machine learning front, or just want to keep up with her JavaScript adventures, follow her on Twitter at @devdevcharlie! You can also check out her Epoc.js framework on GitHub! If you’re based in Europe, she’ll also be speaking at JSConf EU 2018 about this very project!

Know other emerging tech enthusiasts who might want to read this too? Please like and share this post with them!
