Various sci-fi and futuristic TV shows and movies cover people controlling technology with their minds. This might be via brain waves being captured by a funny looking hat, or by putting a small implant (chip) near/into the person's brain.
Whilst this sounds like pure fiction, senior Intel researchers predicted in 2009 that we would soon be controlling tech with our brains. Companies like NeuroSky, Neuralink (by Elon Musk, the guy behind PayPal and Tesla) and even Facebook agree: they are actively working on this technology right now.
So what gives? Will a future Echo Dot only be controllable by a brain implant? Or is this futuristic technology – dubbed “Brain Computer Interface” (BCI) – doomed to fail?
The short answer is that consumer-level ‘brain reading' technology (via a headset) is not quite ready to reliably control your smart home for basic on/off commands, but with NeuroSky already having ‘beginner friendly' headsets and apps on the market, we are close to this point.
The longer answer? Well, read on as it's a very interesting area which will do a whole lot more good than being able to control your Echo!
Beginner-friendly explanation of smart home brain control
You might know that brains work by sending and receiving loads of electrical signals – enough electricity to power a low-wattage light bulb, in fact.
This electrical activity can be monitored by sticking electrodes on the scalp. This process – which is similar to having electrodes stuck on your chest to monitor your heart – is called electroencephalography or EEG (I'll stick to EEG from now on!).
This technology is currently being developed with severely disabled and paralyzed people in mind – for example, by giving them a way of controlling devices (such as wheelchairs and ordinary household items) with their brain waves. A cut-down example of this can be seen in this proof-of-concept video, showing how EEG controls a miniaturized wheelchair:
Common EEG devices include the Emotiv EPOC, which retails for $799 (and was used in a 2018 research paper), and the more consumer-friendly NeuroSky MindWave Mobile 2 EEG sensor which starts at $99.
The general idea of controlling smart homes with your brain is straightforward:
- Place various sensors on the scalp: the more sensors, the better.
- Read the brainwaves: the sensors read brainwaves safely, just like using a thermal camera to read heat signatures: the heat is there anyway, so that data can be captured with relative ease.
- Have a central device which interprets the captured brainwaves: this could be a locally-hosted device, or the data could be sent to ‘the cloud' and interpreted there.
- Carry out an action based on the interpreted brainwave: for example, pause your TV or turn on a light. A more complex action (“Play Miley Cyrus' latest hit song”) would be much harder than turning a specific device on/off.
In other words, the process is: read the brainwaves → interpret them → act on the result.
This whole process is known as Brain-Computer Interface (BCI).
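To make those four steps concrete, here's a minimal sketch (in Python) of how a DIY version of the read → interpret → act loop might hang together. The headset reading, the 'classifier' and the smart plug URLs are all placeholders I've invented for illustration – real products expose their own SDKs and APIs – but the shape of the pipeline is the same.

```python
import time
import urllib.request

# Placeholder smart plug endpoints - invented for illustration only.
SMART_PLUG_ON_URL = "http://192.168.1.50/api/relay?state=on"
SMART_PLUG_OFF_URL = "http://192.168.1.50/api/relay?state=off"


def read_brainwave_sample():
    """Step 2: read the brainwaves from the scalp sensors.

    In reality this comes from the headset vendor's SDK or data stream;
    here it is stubbed out with a dummy 0-100 'attention' score.
    """
    return {"attention": 55}


def interpret(sample):
    """Step 3: interpret the captured brainwaves.

    A real system runs a trained classifier (locally or in the cloud);
    this sketch just applies a simple threshold.
    """
    return "turn_on" if sample["attention"] >= 70 else "turn_off"


def act(command):
    """Step 4: carry out an action based on the interpreted brainwave."""
    url = SMART_PLUG_ON_URL if command == "turn_on" else SMART_PLUG_OFF_URL
    urllib.request.urlopen(url, timeout=2)


if __name__ == "__main__":
    last_command = None
    while True:
        command = interpret(read_brainwave_sample())
        if command != last_command:   # only call the plug when the state changes
            act(command)
            last_command = command
        time.sleep(1)
```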
Controlling your smart home with your brain activity
NeuroSky's headset captures brain activity, before passing this to its ‘cloud service' which parses (i.e. understands) the brain activity for you, before relaying this back.
You can then use their developer resources to write custom hooks, which could include calling your smart home device's APIs to turn lights and TVs on/off (or whatever else you want to do).
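Here's a rough sketch of what such a hook might look like in Python. I'm assuming the headset's parsed readings are available locally as a JSON stream (NeuroSky's ThinkGear Connector works roughly like this on your PC) and that your smart bulb has a simple HTTP toggle endpoint – the port, packet format and bulb URL below are illustrative assumptions rather than documented values.

```python
import json
import socket
import time
import urllib.request

# Assumptions (not from NeuroSky's docs): the ThinkGear Connector is running
# locally and streaming parsed headset data as JSON, and your smart bulb has
# a simple HTTP toggle endpoint. Swap in whatever your own kit actually uses.
THINKGEAR_HOST, THINKGEAR_PORT = "127.0.0.1", 13854
BULB_TOGGLE_URL = "http://192.168.1.60/api/light/toggle"

ATTENTION_THRESHOLD = 80   # the headset reports 'attention' as a 0-100 score
COOLDOWN_SECONDS = 5       # don't toggle the light more than once per 5 seconds


def main():
    sock = socket.create_connection((THINKGEAR_HOST, THINKGEAR_PORT))
    # Ask the connector for parsed JSON rather than raw EEG samples.
    sock.sendall(b'{"enableRawOutput": false, "format": "Json"}\n')

    buffer, last_toggle = "", 0.0
    while True:
        buffer += sock.recv(4096).decode("utf-8", errors="ignore")
        # Packets are assumed to arrive one JSON object per line.
        *lines, buffer = buffer.replace("\r", "\n").split("\n")
        for line in lines:
            try:
                packet = json.loads(line)
            except ValueError:
                continue  # skip blank or partial packets
            attention = packet.get("eSense", {}).get("attention", 0)
            if attention >= ATTENTION_THRESHOLD and time.time() - last_toggle > COOLDOWN_SECONDS:
                # 'Concentrate hard' -> toggle the light.
                urllib.request.urlopen(BULB_TOGGLE_URL, timeout=2)
                last_toggle = time.time()


if __name__ == "__main__":
    main()
```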
This all sounds great – in theory.
The reality is that NeuroSky's brain pattern monitoring would need to be paired with its blink detection and a fair amount of custom programming, which means that it's currently a little too complicated to effectively control your smart home with your brain right now.
Brain implants vs external sensors
As mentioned at the start, Intel research scientist Dean Pomerleau raised some eyebrows back in 2009 by predicting that we would be controlling computers (and hence other computerized devices) with our brains.
He predicted that ‘brain implants' would be the best way to achieve this, based on the idea that sensors inside the skull would be better able to read and interpret brainwaves than external sensors.
This makes sense, too: the more barriers there are between a signal's source and its receiver, the weaker the signal gets. Take Wi-Fi: the more walls in the way (especially insulated walls), the weaker the signal gets. The same is true of the electrical activity from brainwaves: by passing through the skull and skin/tissue, the electrical signal gets weaker. And the weaker the signal, the more chance there is that it is misunderstood (or cannot be understood at all).
Elon Musk's Neuralink company is also looking at brain implants: inserting tiny threads (which are thinner than a human hair) into the brain with robots, which then transmit brainwave data to an external module. They call this a ‘neural lace', and eventually it will be installed via laser cuts (instead of drilling holes into the skull) – which would be far less invasive than traditional implant surgery.
Since internal brain implants are more accurate, why isn't everyone using them? For example, why are NeuroSky (probably the most well-known EEG/BCI company) focusing all their R&D effort on external headsets instead?
Well you might be able to guess: most people don't want to have their brains drilled into or lasered to have implants installed. *Shudders*. I certainly don't! I would, however, happily put on a simple headset or sensor and start controlling my smart home with my thoughts.
The key thing here, however, is that Neuralink and similar companies are putting a lot of effort towards helping paralyzed and similarly immobile patients: hence the use of brain implants, as extra accuracy is crucial. Whereas NeuroSky are more aimed at entry-level consumer purposes: hence the use of external sensors, because who cares if the system doesn't understand your command to turn a light off once in a while?
It's a bit like hearing: for most cases of poor hearing and hearing loss, an external hearing aid is used. But for more severe cases, cochlear or auditory brainstem implants are used.
Current ‘brain control' companies and entrepreneurs
I've already mentioned a few ‘brain control' companies (i.e. EEG-based BCI companies) whose work will ultimately move the field on, and eventually lead to smart home control via your brain. These general EEG/BCI companies include:
NeuroSky
NeuroSky are one of the biggest companies in the field, and their flagship product – the MindWave Mobile 2 Headset – retails for $99 and is aimed at consumers. Indeed, the website shows children gaming with the headset along with it being used in research, education and more.
Their headset only contains a single electrode, although the device also tracks eye blinks and more.
Once the headset reads your brainwaves, it sends them to ‘the NeuroSky cloud' to interpret the meaning. This is then sent back to the mobile app you install, where you can tell it what to do with certain thoughts/actions (including blinking).
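To give a rough idea of that last 'tell it what to do' step, the mapping from interpreted headset events to smart home actions can be as simple as a lookup table. The event names and action functions below are hypothetical stand-ins (they are not NeuroSky's actual API), but they show the shape of the logic.

```python
# Hypothetical mapping from interpreted headset events to smart home actions.
# The event names and action functions are stand-ins, not NeuroSky's real API.

def lights_on():
    print("Calling the light's API: on")    # swap in a real API call here


def lights_off():
    print("Calling the light's API: off")   # swap in a real API call here


def pause_tv():
    print("Calling the TV's API: pause")    # swap in a real API call here


ACTIONS = {
    "double_blink": lights_on,     # two quick blinks  -> lights on
    "long_blink": lights_off,      # one long blink    -> lights off
    "high_attention": pause_tv,    # concentrate hard  -> pause the TV
}


def handle_event(event_name):
    """Run whatever action is mapped to an interpreted headset event."""
    action = ACTIONS.get(event_name)
    if action:
        action()


handle_event("double_blink")  # prints: Calling the light's API: on
```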
Elon Musk's Neuralink
Last year Elon Musk (co-founder of PayPal and founder of Tesla and SpaceX, in case you forgot!) did a presentation streamed live on YouTube, regarding his Neuralink company:
Neuralink is aimed more at the medical markets, with Musk hoping to massively improve the lives of the immobile: people with physical disabilities who are unable to live an independent life.
Their approach, which I touched on earlier, involves inserting electrical threads near the brain. This builds the ‘neural lace' mentioned above, which should be able to read brainwave activity fairly accurately, before transmitting this data externally to be actioned as required.
This could allow immobile people to control computers, play computer games and manage household electrics (including helper systems) with their mind. Giving back some level of independence to these people would be a great development, and I wish Neuralink well in their project.
BrainGate
BrainGate (who purchased Cyberkinetics a decade ago) are similar to Neuralink in that they aim to improve the lives of paralyzed and disabled people, with a sensor implanted in the brain which transmits the data to an external decoder unit.
Facebook's ‘Typing-by-brain'
Facebook Labs (where their experimental projects are done) are also working on brain-computer interaction with their ‘Typing by brain' project.
Exact details are still scarce (and progress is slow – the 2019 news was the first since a 2017 announcement from Facebook), but the idea revolves around using fMRI to detect words that a person has already decided to communicate.
In other words, the technology would be like a post-filter: you think of a word, and then Facebook would detect this and potentially do something with it (post a random status update?!).
fMRI machines are not portable, however, meaning that right now Facebook seem to be more at the research stage. They are collecting data and refining their systems, which could eventually lead to a consumer-ready product. That still seems a little way off – but their acquisition of CTRL-Labs (see below) suggests they have other projects on the go, as well.
They won first place in 2019's BCI Awards, and they are actively undertaking clinical trials to aid their research and development efforts.
CTRL-Labs
NYC startup CTRL-Labs started in 2015 and are known for their non-invasive "neural interface technology": basically meaning an externally worn device. What's different here, though, is that their CTRL-Kit is worn on the wrist instead of on the head.
But wait, doesn't this mean that it doesn't track brain waves? Well yes, despite some of the incorrect news headlines, CTRL-Labs' device detects electrical signals from muscles in your wrist (when you move your hands and fingers).
This then allows you to control computers just like you use a mouse.
So whilst physical movement is required, this is still sort of brain-control because your brain moves your hands and fingers.
Facebook purchased CTRL-Labs in 2019.
Paradromics
Paradromics are another 2015 American start-up specializing in BCI: aiming to help people overcome both physical and mental health barriers using technology.
As you can probably guess, this therefore involves an implantable chip which records electrical activity within the brain. Their technology has some form of on-chip (i.e. on-brain) processing, before the brainwave data is passed to an external device.
Kernel
Finally, it's worth mentioning Kernel, founded in 2016 in LA with a $100 million personal investment by its founder Bryan Johnson.
They still seem to very much be in the R&D phase and there's no immediate sign of a consumer-ready product, but they are another Neuralink competitor: a company aiming to use implants to capture brain activity and act accordingly.
A look at smart home BCI (Brain Computer Interface) research
As part of writing this article, I read through three relevant research papers:
- Controlling of smart home system based on brain-computer interface (2018) by Q Gao et al.
- Brainwave-Controlled System for Smart Home Applications (2018) by M Nafea et al.
- Online Home Appliance Control Using EEG-Based Brain–Computer Interfaces (2019) by M Kim et al.
Even though they are academic papers, they're actually more approachable than you might think, and it's interesting to see the current state of EEG/BCI without any marketing messages confusing things.
The first research paper used a more expensive external EEG headset (the Emotiv EPOC, which has 16 electrodes). They used eight testers who each performed multiple experiments across 20 trials, and in general they were able to control a range of smart devices (including lights, blinds and web cameras) with an 80-90% ‘command' accuracy. This is pretty good considering that Alexa seems to mishear what I ask her fairly frequently!
It's worth noting that the ‘control' of the smart devices was to either turn the device on/off (or open/close the blinds) – more on this later.
The second research paper was interesting because it used NeuroSky's consumer-level MindWave technology (along with a custom control board to act as a bridge between NeuroSky's brain activity responses and the smart devices). Only a single test subject was used, and the NeuroSky headset has only a single electrode (along with blink detection) – this limits the usefulness of the results a bit, to be honest.
Nonetheless, four appliances were hooked up so they could be turned on/off and the test did successfully show a fairly reliable level of control of these four devices.
The final research paper was a large study which involved 60 test subjects who were split into different groups, with EEG used to control TV channels, digital door-locks and a light. Each test involved using an EEG cap: a full-head device with 31 electrodes on it. Therefore this was the largest study out of the three papers I read.
In general results were good: TV channels were controlled with 83% accuracy, door locks with 79% accuracy and lights with 80% accuracy. However there was a lot of variance: sometimes as little as 65% of TV channel ‘commands' were understood correctly. The paper concluded that controlling a TV requires more cognitive load (thought) than a door lock or light, and that it's easier to be distracted with a TV than the alternatives.
This therefore raises an interesting point, and it's one I hinted at earlier: all three research papers were successful in showing that smart devices can be turned on and off with fairly good accuracy.
But what you may have noticed is that none of these studies tried doing something like having someone think “Play the latest hit by Miley Cyrus” and action that thought. Why's this, you ask? Simple: because that technology doesn't exist yet!
Current EEG technology can just about understand on/off brain activity, but it certainly can't understand subjective thoughts/commands and parse these into machine-actionable commands.
Facebook are working on understanding words you have imagined (via expensive fMRI machines), but understanding random thoughts – and delivering this in a consumer-friendly product – is a little way off yet.