Google is adding new functionality to Glass gradually, but developers are not entirely happy with the pace. So, what's the best way to improve the device? One answer: build a third-party app that adds a whole new capability.
This is exactly what the makers of the MindRDR (Mind Reader) app did. The app lets you control Google Glass with your mind. Yes, you heard that right. Your MIND!
The app works with a third-party NeuroSky EEG sensor. This external device reads the user's brainwaves, which the app translates into commands. At this early stage, however, the functionality is very limited.
For example, currently you can only take a picture and share it to Facebook using your mind. That might not be what you were expecting when we said mind control, but it definitely opens up new possibilities with Glass, especially for users with disabilities.
The app basically measures your concentration and relaxation levels to trigger different functions. The EEG sensor on your head streams this information to the app, which converts it into actions.
For example, to take a picture you have to relax. Once you are relaxed and the picture has been taken, you send it to Facebook by concentrating. A horizontal line on the screen indicates your concentration level: as you focus, the line moves up, and when it reaches the top the picture is posted to Facebook.
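The relax-to-capture, concentrate-to-share flow described above can be sketched roughly like this. This is a minimal Python illustration, not MindRDR's actual code: it assumes the sensor reports a single attention reading on NeuroSky's 0–100 eSense scale, and the function names and threshold values are hypothetical.

```python
# Hypothetical sketch of MindRDR-style control logic.
# Assumes a single attention reading in the range 0-100
# (NeuroSky's eSense scale); thresholds and action names
# are illustrative placeholders, not the real MindRDR API.

RELAX_THRESHOLD = 30        # at or below this, the user counts as relaxed
CONCENTRATE_THRESHOLD = 70  # at or above this, the user counts as focused

def next_action(attention, last_action=None):
    """Decide what Glass should do from one attention reading (0-100)."""
    if last_action is None and attention <= RELAX_THRESHOLD:
        return "take_picture"       # relaxed: capture a photo
    if last_action == "take_picture" and attention >= CONCENTRATE_THRESHOLD:
        return "share_to_facebook"  # focused: share the photo just taken
    return None                     # otherwise keep showing the meter
```

A real implementation would smooth the readings over time rather than react to a single sample, since raw EEG-derived values fluctuate considerably.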
Mind control works with your Glass only when it is connected to the EEG sensor. At present, the NeuroSky EEG component is available for $80. That, along with the MindRDR app, which you can sideload from GitHub, should give you a glimpse of what the future holds.
The number of functions you can perform with your mind really depends on the kind of EEG sensor connected to your device: the more capable the sensor, the more signals it can detect, opening up new functions.
According to MindRDR's makers, a more advanced sensor such as the NeuroSky ThinkGear module would add new capabilities. Such devices can pick up additional brainwave spectra and distinguish signals like attention, meditation, and more, so developers can map extra functions to each of these signals.
And given that the company has open-sourced the code, we can expect developers to refine and improve the software and upgrade the overall experience on Glass.