Robots are being trained to read human thoughts

[Image: sketch of a brain on book pages]

If you use a voice assistant such as Alexa, Siri, or Cortana in your home or office, you’re interacting in real time with an intelligent machine. In coming years this will be so common that no one will really think about it.

But a team of academic researchers recently demonstrated that live human interaction with machines can go far beyond touching (typing, mouse, etc.) or talking. Scientists at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) created a program that allows humans to control robots with brainwaves and hand gestures.

That’s right, we puny and weak humans can control robots with our minds. Who’s having an uprising now?

CSAIL’s Adam Conner-Simons writes, “By monitoring brain activity, the system can detect in real-time if a person notices an error as a robot does a task. Using an interface that measures muscle activity, the person can then make hand gestures to scroll through and select the correct option for the robot to execute.”
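The closed loop Conner-Simons describes can be sketched in code. This is a minimal illustration only: the thresholds, signal values, gesture names, and option list below are assumptions for the sake of the example, not CSAIL's actual implementation.

```python
# Sketch of the feedback loop: an EEG window is checked for a likely
# error-related potential (ErrP); if one is detected, muscle (EMG) gestures
# scroll through candidate targets and select the correction.
# All values here are illustrative assumptions.

ERRP_THRESHOLD = 0.7  # assumed normalized EEG amplitude signaling an ErrP


def detect_error(eeg_window):
    """Return True if the EEG window looks like an error-related potential."""
    peak = max(abs(sample) for sample in eeg_window)
    return peak > ERRP_THRESHOLD


def select_correction(gestures, options):
    """Scroll options with 'left'/'right' gestures; a 'clench' selects one."""
    index = 0
    for gesture in gestures:
        if gesture == "left":
            index = (index - 1) % len(options)
        elif gesture == "right":
            index = (index + 1) % len(options)
        elif gesture == "clench":
            return options[index]
    return None  # no selection made


# Example: the robot aims at the wrong target; the person's EEG flags the
# error, and hand gestures redirect the robot.
targets = ["target_A", "target_B", "target_C"]
eeg = [0.1, 0.2, 0.9, 0.3]  # the 0.9 spike suggests a noticed error
if detect_error(eeg):
    choice = select_correction(["right", "right", "clench"], targets)
    print(choice)  # -> target_C
```

In the real system, the error signal is decoded from noisy brain activity by a classifier rather than a simple threshold, but the loop structure — detect, scroll, select, execute — is the same.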

Unfortunately, as the video below shows, you have to wear a dorky EEG cap for the robot to detect your powerful, all-controlling brain waves. A small price to pay for the feeling of omnipotence.

Clearly the robot-human team in the video isn’t collaborating on a quantum physics project. It’s a fairly simple physical task (aiming a power drill), and it requires the human to monitor the robot’s movements. This could restrict the usefulness of the technology in the workplace — who has time to watch a robot all day, or even in concentrated doses? — at least until it’s further advanced.

Still, the fact that the robot improved its accuracy after getting feedback from the human through brain waves and gestures shows that humans and robots can partner effectively.

MIT team members say “they could imagine the system one day being useful for the elderly, or workers with language disorders or limited mobility,” Conner-Simons writes.

The latest project from CSAIL builds on the team’s work from last year, which restricted the robots to binary choices based on EEG feedback from humans. The next projects will likely show even more progress toward a robot-human mind meld.

Comments

  1. The movie “Surrogates” is an example.

  2. Robots are built using AI and machine learning. They can be programmed to read human thoughts, which may be beneficial in many cases.
