Tech vs. Tech: Brain-computer interfaces vs. eye-tracking


5 min read

For a long time, my idea of immersion was imagining all the possible combinations of Tetris.


As a ‘90s kid, my experience with gaming and virtual worlds was primarily pixelated. The sense of tangibility, such as it was, came through a controller or mouse.

When the first commercial virtual reality headsets were released in the 2010s, they didn’t feel unfamiliar or uncomfortable to me (even if women are reportedly more prone to VR sickness). I was used to sticking my face to a screen, and the presence of remotes or controllers made me feel grounded.


But is it truly immersion if our virtual “actions” are made by proxy, pushing invisible buttons our fingers aren’t actually pushing?


The next stage of virtual reality is controller-free. Two technologies are competing to bring that reality into the mainstream. This Tech vs. Tech pits eye-tracking technology against brain-computer interfaces.



THE TECHNOLOGIES

As its name suggests, eye-tracking technology measures eye movements to identify the direction of a person’s gaze. It is actually an old technology: The first eye-trackers date back to the early 1900s. They were intrusive devices that took the form of contact lenses with a pointer attached to them.


Nowadays, eye-tracking is done by projecting infrared light into a subject’s eye, then using a camera to analyse the light reflected by the pupil and cornea to calculate gaze direction. The eye then becomes a “pointer”: Users can interact with computers by focusing on specific areas for a predetermined number of milliseconds to perform tasks like clicking or scrolling.
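That dwell-based selection can be sketched in a few lines. Here is a minimal, hypothetical example; the 800-millisecond threshold and 40-pixel radius are illustrative assumptions, not values from any real eye-tracker SDK:

```python
import math

DWELL_MS = 800   # assumed dwell time that counts as a "click"
RADIUS_PX = 40   # gaze must stay within this radius of the target

def detect_dwell_click(samples, target, dwell_ms=DWELL_MS, radius=RADIUS_PX):
    """Return True once the gaze stays on `target` long enough to click.

    `samples` is a list of (timestamp_ms, x, y) gaze points from the tracker;
    `target` is the (x, y) centre of the on-screen element.
    """
    start = None
    for t, x, y in samples:
        on_target = math.hypot(x - target[0], y - target[1]) <= radius
        if on_target:
            if start is None:
                start = t                  # gaze entered the target region
            elif t - start >= dwell_ms:
                return True                # held long enough: register a click
        else:
            start = None                   # gaze left the region: reset timer

    return False

# Gaze held near a button at (500, 300) for ~1 second, sampled at ~60 Hz
gaze = [(i * 16, 505, 298) for i in range(60)]
print(detect_dwell_click(gaze, (500, 300)))  # True
```

The reset on leaving the target is what makes dwell selection usable: a glance that merely passes over a button never accumulates enough time to trigger it.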


Unlike eye-tracking, brain-computer interfaces (BCIs) are relatively new: The first devices were developed in the 1970s. Like eye-tracking, their name reveals their function: Connecting the human brain to a computer. They are labeled invasive when surgically implanted in the brain, or non-invasive when the sensors sit on the scalp, as with an electroencephalography (EEG) cap. Either way, the principle remains the same: A multitude of sensors track brain activity and translate it into commands, enabling the user to control a computer hands-free.
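To illustrate the "brain activity into commands" step, here is a toy sketch on synthetic signals: it measures how much of an EEG epoch’s spectral power sits in the alpha band (8–12 Hz, the rhythm that strengthens when a person relaxes or closes their eyes) and maps the result to one of two commands. The signals, the 250 Hz sampling rate, and the 0.5 threshold are all illustrative assumptions; real BCIs use trained classifiers over many electrode channels.

```python
import numpy as np

FS = 250  # assumed EEG sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Total spectral power of `signal` in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    return power[(freqs >= lo) & (freqs <= hi)].sum()

def decode_command(epoch, fs=FS):
    """Map one EEG epoch to a command: a dominant alpha rhythm means 'stop'."""
    alpha = band_power(epoch, fs, 8, 12)
    total = band_power(epoch, fs, 1, 40)
    return "stop" if alpha / total > 0.5 else "go"

# Two synthetic two-second epochs: a pure 10 Hz "alpha" rhythm vs. broadband noise
t = np.arange(0, 2, 1 / FS)
relaxed = np.sin(2 * np.pi * 10 * t)
busy = np.random.default_rng(0).standard_normal(len(t)) * 0.1

print(decode_command(relaxed))  # stop
print(decode_command(busy))     # go
```

The ratio (rather than raw power) is what matters: white noise spreads its power evenly across the spectrum, so only a signal genuinely concentrated in the alpha band crosses the threshold.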


CURRENT USES

First developed as a way to study reading and learning patterns, eye-tracking is today mainly used for two purposes: Market and scientific research. Marketers use eye-tracking to measure consumer engagement and behaviour on a given website: for example, to gauge whether an advert or product draws the desired attention, or whether a website’s design is user-friendly.


In academic research, eye-tracking is used in neuroscience and in cognitive and social psychology, for instance to help diagnose neurodevelopmental conditions or follow their progression.


Until recently, BCIs had barely left the laboratory environment, whether at research institutions or startups. Before Elon Musk brought the technology to public attention with the promise of “being one with a computer,” it was mostly used in the medical field for rehabilitation, mental health, and the control of assistive robots. But commercial uses are starting to emerge, and even the automotive industry is interested.




AND THE WINNER IS...

First eye-tracking, but ultimately BCIs.

The benefits of eye-tracking in human-machine interaction are undeniable: It offers seamless immersion, comfort (no need for a controller), and security (retinal biometric identification). On the other hand, its accuracy and precision vary considerably across users and environments.


The HTC Vive Pro Eye, one of the first VR headsets to integrate eye-tracking technology, was released in 2019. Since then, the immersion industry hasn’t fully embraced it. 


Of course, the gaming industry was already experimenting with eye-tracking to provide more immersive experiences. But BCIs seem to attract more interest for future commercial use. For instance, Valve, the video game developer and creator of Steam, recently revealed its experiments with BCI, which it considers the next step in immersive gaming. (Maybe this is thanks to, or in spite of, that awkward video of a monkey playing video games with its brain, courtesy of Musk’s Neuralink brain chip.)


Researchers have compared the two technologies to see which is more efficient for interacting with computers. Eye-tracking is faster to set up and calibrate but more tiring to work with; BCI yields better results but takes longer to set up and is considered more stressful by subjects.


Both technologies have a bright future that extends beyond immersion: They lay the groundwork for greater accessibility and inclusion. For example, current experiments with both include gauging their efficacy in helping people with motor impairments and disabilities interact with virtual interfaces.


For a long time, people with disabilities have found safe haven in virtual worlds, praising them for the sense of community they bring and the social interactions they offer. In 2017, the population of Second Life was 50 percent people with disabilities. Eye-tracking and BCI can further enable people with disabilities to participate more fully in the virtual economy. The capacity to access design tools to create user-generated content, or even to mint or invest in NFTs, also represents better prospects for financial independence.


Overall, the gaming industry will likely adopt BCI faster, given its greater need for immersion. Both will be used in the virtual economy, but not necessarily on the same timeframe: Eye-tracking will likely come first, because it is already out of the research labs; in 10 to 15 years, BCI may replace it as it becomes more socially palatable, and because it is ultimately more powerful.


Tech vs. Tech is a regular L’Atelier Insights feature that pits two up-and-coming technologies or trends against each other, using a single lynchpin… like the future of virtual collaboration.

Illustrations by Debarpan Das.


