Here is a video demonstration of a thought-control device created by Emotiv Systems of Australia. Though the video is fairly long, I think it is well worth watching: this kind of technology is going to surround us quite soon.
What is so remarkable is that the device will retail for only $299 USD and is completely non-invasive. The number of applications for this is mind-boggling. I would love to see it in action, and some of my first questions were:
Can I record data while typing, and could that data be mapped well enough that I no longer need a keyboard?
As per the example in the video, if I map rotations on different axes, can the software learn when I am trying to combine actions? Could I then dynamically rotate around several axes at once?
Will this be available in an open way? Will we be able to use the SDK on Linux, for example?
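To make the keyboard question concrete, here is a minimal sketch of what a "thought-to-keystroke" layer might look like. Everything here is hypothetical: the event names, the (event, confidence) stream, and the mapping are placeholders I invented, not anything from the actual Emotiv SDK, whose real API I have not yet seen.

```python
# Hypothetical sketch: translating classified "mental events" into key presses.
# The event names and stream format below are invented placeholders, NOT the
# real Emotiv SDK API.

# Map imagined mental actions to the keys they should trigger.
EVENT_TO_KEY = {
    "push": "w",          # imagined "push" -> move forward
    "pull": "s",          # imagined "pull" -> move backward
    "rotate_left": "a",
    "rotate_right": "d",
}

def events_to_keys(events, mapping=EVENT_TO_KEY, threshold=0.7):
    """Turn (event_name, confidence) pairs into key presses,
    discarding low-confidence classifications to avoid spurious input."""
    keys = []
    for name, confidence in events:
        if confidence >= threshold and name in mapping:
            keys.append(mapping[name])
    return keys

# Example: a short stream of classified events with confidences.
stream = [("push", 0.9), ("rotate_left", 0.4), ("pull", 0.8)]
print(events_to_keys(stream))  # -> ['w', 's']
```

The interesting part would be the confidence threshold: set it too low and stray thoughts type garbage; too high and the device feels unresponsive. Presumably the real SDK exposes something along these lines for tuning.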
Also, OCZ has a device called the Neural Impulse Actuator, which will be available for $159 USD. I haven't seen a demonstration yet, but I imagine its capabilities are somewhat similar.
I am excited about this: there is enormous opportunity for business and innovation. It will be remarkable when this kind of technology starts appearing in our everyday lives, and the potential is hard to imagine.
Next steps: Emotiv has an SDK available which I intend to poke around in. Is there anyone in Vancouver already exploring this technology?