MIT Researchers Create ‘AlterEgo’ Headset That Interprets User’s Thoughts

Many of us, at some point, have wished for a gadget that could perform tasks without our lifting a finger to type or even speaking a word. That wish might just come true: researchers from MIT have built a computer interface that can pick up the words you silently say to yourself.
They have developed a headset named AlterEgo that can interpret words spoken silently in the mind, without the user actually vocalizing them. This goes a step beyond virtual assistants such as Siri and Alexa, which require spoken voice commands to trigger a response.
AlterEgo is a wearable headset that wraps around the user’s ear and jaw, and the computing system integrated into the device processes the data picked up by its sensors.
When we speak silently to ourselves, subtle neuromuscular signals are generated in the jaw and face. These signals are fed to AlterEgo’s machine-learning system, which learns to correlate particular signal patterns with particular words.
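To give a feel for what “correlating signals with words” means, here is a purely illustrative sketch. AlterEgo’s actual model has not been described in this article, so the feature vectors, the word vocabulary, and the nearest-centroid classifier below are all assumptions made for demonstration, not MIT’s method:

```python
# Illustrative sketch only: AlterEgo's real pipeline is not public.
# We pretend each silently spoken word yields a small feature vector
# extracted from jaw/face sensor readings, and classify a new signal
# by finding the closest per-word average (nearest centroid).

from collections import defaultdict
import math

def train(samples):
    """samples: list of (feature_vector, word). Returns word -> centroid."""
    sums, counts = {}, defaultdict(int)
    for vec, word in samples:
        if word not in sums:
            sums[word] = [0.0] * len(vec)
        sums[word] = [s + v for s, v in zip(sums[word], vec)]
        counts[word] += 1
    return {w: [s / counts[w] for s in sums[w]] for w in sums}

def classify(centroids, vec):
    """Return the word whose centroid is closest to vec (Euclidean)."""
    return min(centroids, key=lambda w: math.dist(centroids[w], vec))

# Toy training data: fabricated 3-dimensional "signal" vectors.
data = [([0.9, 0.1, 0.2], "yes"), ([0.8, 0.2, 0.1], "yes"),
        ([0.1, 0.9, 0.8], "no"),  ([0.2, 0.8, 0.9], "no")]
model = train(data)
print(classify(model, [0.85, 0.15, 0.15]))  # prints "yes"
```

In practice a system like this would use far richer features and a neural network rather than centroids, but the core idea is the same: pair recorded signals with known words during training, then map new signals to the most similar learned pattern.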
The idea was to develop a computing platform that “melds human and machine” in a certain way and acts as “an internal extension of our own cognition,” said Arnav Kapur, the lead researcher on this project.

Submitted by: Arnfried Walbrecht
