Facebook's upcoming wrist controllers: it's basically the Myo, for those who remember.


19 Mar

When augmented reality fully matures, layering digital overlays onto the physical world through transparent glasses, it will entwine itself deeper into the fabric of your life than any previous technology. AR devices will keep finding new and better ways to serve you.

Facebook has already made strides in VR with the Oculus Quest 2, and now it's going after augmented reality. In an online "road to AR glasses" briefing for global media, Facebook Reality Labs (FRL) showcased some of the next-generation AR technology it's developing. It's also asking the public to take part in the conversation on privacy and ethics before these devices arrive in the next few years.



Hand-neural interfaces
Currently, our interactions with digital devices are coarse, constraining what we can do with them. Revolutionary changes in human-machine interfaces (HMIs) don't occur every day; to date, we've seen three such lurches forward in computing: graphical user interfaces (GUIs), desktops, and touchscreens.

Besides, you won't want to hold a controller all day just to use your smart glasses, and camera-based hand tracking like that on the Quest headsets will always be somewhat cumbersome. For AR to take off as the next enormous leap, interaction will have to operate at the speed of thought, completely smoothly and efficiently.

To get there, the FRL team has been developing a next-level wrist-mounted controller that has remained unseen until now. It reads the neural impulses traveling to your hand, and those impulses don't even have to be strong enough to move a muscle for the wristband to pick them up and obey them.



Electromyographic activity
Why the wrist? "It's a simple location to place these devices," Reardon explains. "It's where all of your human abilities are stored, and it's how you perform adaptive tasks in the actual world. We dedicate more neurons in the brain to controlling your wrist and hand than any other part of your body... You have a vast reservoir of untapped motor skills that you hardly ever use when interacting with machines today."

According to Reardon, today's human-machine interfaces involve a lot of unnecessary steps that can be eliminated. An action starts with a thought in the brain's motor cortex, travels down the arm's motor neurons as an electrical impulse, and then becomes a muscle contraction that moves a hand or a finger to operate a mouse or keyboard. The FRL team can intercept that impulse at the wrist, cutting out the intermediary.

"We've made a big step forward here," Reardon says. "That is why we believe this technology has a bright future. We can actually resolve these signals down to individual neurons and the single zeros and ones they send. As a result, we have a tremendous opportunity to give you control in ways you've never had before."


The neuro-motor interface, which is currently just a lab prototype that looks like a big, chunky watch on a wristband, can read neural activity down to millimeter-level deflections in a single finger's motion. "I wish I could show you," Reardon says, "because you could try everything." The system requires each user to teach the device a set of commands before it can develop its own intelligent mapping of how your specific nerves accomplish a task.
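FRL hasn't published how its decoder actually works, but the "teach the device a set of commands" step described above resembles a standard calibrate-then-classify loop. Purely as an illustration (all names and the nearest-centroid approach are my assumptions, not FRL's method), a toy version might look like this:

```python
import math

class GestureCalibrator:
    """Toy per-user EMG gesture mapper: learns one centroid per gesture
    from calibration samples, then classifies new readings by nearest
    centroid. Real neural-signal decoding is far more sophisticated."""

    def __init__(self):
        self.centroids = {}  # gesture name -> mean feature vector

    def calibrate(self, gesture, samples):
        # samples: feature vectors recorded while the user repeats the
        # gesture during the teaching phase
        n, dims = len(samples), len(samples[0])
        self.centroids[gesture] = [
            sum(s[i] for s in samples) / n for i in range(dims)
        ]

    def classify(self, reading):
        # Pick whichever learned gesture centroid is closest
        return min(self.centroids,
                   key=lambda g: math.dist(reading, self.centroids[g]))
```

The per-user calibration step is what lets the system adapt to how your specific nerves and muscles fire, rather than assuming everyone's signals look alike.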

Demo of EMG Interaction
The FRL team has already used the neuro-motor interface as a game controller, describing it as a "six-dimensional joystick built into your body." It already operates at a latency of 50 milliseconds, which is fast enough for gaming, and in testing even a man born without a fully developed left hand got up to speed in a matter of minutes and used the controller as easily as anyone else.

According to the team, interactions would begin with simple finger pinches and thumb movements for clicks or yes/no responses. This would let you deal with contextually generated AR prompts without being pulled out of a conversation. For example, if a friend mentions meeting you at a specific place and time, the system might pop up a discreet prompt asking if you want to add it to your calendar, and a simple finger motion could answer yes or no and dismiss the prompt without you ever taking your hands out of your pockets.

Things would progress from there, with the ability to manipulate 3D AR objects with your hands and any kind of subtle gesture control you can imagine. The FRL team has also been working on keyboardless typing technology, which lets you tap away on any flat surface while its predictive algorithms refine themselves as the system learns more about you. "This will be faster than any mechanical typing interface, and it will always be available because you are the keyboard," the FRL team claims.
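The predictive part of keyboardless typing is doing heavy lifting here: each tap on a blank surface yields an ambiguous signal, and the decoder has to resolve it against what you're likely to be typing. As a hedged sketch of that idea (the tiny vocabulary and the function are mine; a real system would score letter sequences with a full language model):

```python
from itertools import product

# Tiny stand-in vocabulary; a real decoder would use a language model
VOCAB = {"hello", "help", "hold", "world", "word"}

def decode_word(candidates):
    """Each keystroke yields a *set* of plausible letters, because
    sensing taps on a bare surface is noisy. Return the vocabulary
    word consistent with every position, or None if nothing fits."""
    for letters in product(*candidates):
        word = "".join(letters)
        if word in VOCAB:
            return word
    return None
```

So even if the system can't tell whether your second tap was an "e" or an "o", context disambiguates it, which is how such an interface could plausibly beat a mechanical keyboard on speed.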

EMG typing demo
This unlocks the possibility of a full multi-screen AR computing setup you can use anywhere, with nothing more than your AR glasses and wrist controllers. That, my friends, would be even more Minority Report than Minority Report itself, and I say this fully aware of the consequences I'll face from the New Atlas editorial team, which has a long-standing policy of dispensing wedgies to any writer who mentions Minority Report after a spate of overuse in the late 2000s.


Advanced haptics
The latest prototype, which would be built into a wrist-mounted neural sensor like the one described above, combines six "tactile actuators," presumably similar to the force-feedback units in a game console controller, with a "novel wrist squeeze mechanism," according to the FRL team.

The TASBI (Tactile And Squeeze Bracelet Interface) may not sound as exciting as a nerve-hijacking controller, but physical feedback will be an important part of AR systems that integrate into your life while demanding the least amount of attention.

"With the vibration and squeeze capabilities of TASBI, we've tried tons of virtual interactions in both VR and AR," said FRL Research Science Manager Nicholas Colonnese. "Simple things like pushing, turning, and pulling buttons, feeling textures, and moving virtual objects in space fall into this category. We've also tried more unusual activities, such as climbing a ladder and shooting a bow and arrow. We wanted to see whether the vibration and squeeze feedback could make it feel like you were interacting with these virtual objects naturally. The answer turns out to be yes, thanks to sensory substitution, in which your body integrates visual, audio, and haptic information. It can produce an interesting, natural, and easy-to-understand experience."
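To make the vibration-plus-squeeze idea concrete: a bracelet driver would presumably map each virtual interaction to amplitudes for the six tactile actuators plus a squeeze level. This is purely my illustrative guess at such a mapping (the event names, profile values, and function are all hypothetical, not FRL's design):

```python
# Hypothetical mapping from virtual interactions to TASBI-style output:
# six vibrotactile amplitudes (0.0-1.0) plus one wrist-squeeze level.
FEEDBACK_PROFILES = {
    "button_press": {"vibration": [0.8] * 6, "squeeze": 0.0},
    "grab_object":  {"vibration": [0.2] * 6, "squeeze": 0.6},
    "draw_bow":     {"vibration": [0.1] * 6, "squeeze": 0.9},
}

def feedback_for(event, intensity=1.0):
    """Scale a feedback profile by interaction intensity (e.g. how far
    the bow is drawn) and return the actuator commands to send."""
    p = FEEDBACK_PROFILES[event]
    return {"vibration": [a * intensity for a in p["vibration"]],
            "squeeze": p["squeeze"] * intensity}
```

The sensory-substitution point in the quote is that even this coarse vibration/squeeze channel, fused with what you see and hear, can stand in convincingly for sensations the bracelet can't actually reproduce.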


Adaptive interfaces, and the ethical and privacy implications of AI
Before an augmented reality system can become truly indispensable, it must first learn and understand your behavior, perhaps even better than you do. It will have to constantly assess your surroundings and activities in order to figure out where you are and what you're doing. It'll have to get to know you, compiling a profile of your habits that it can use to predict your actions and desires. That means a lot of data crunching, and AI and deep learning technology will be critical for processing video feeds and working out what's going on.

Once it has a good idea of what you might want, it should step in and offer to help in a manner that's useful rather than annoying. FRL refers to this as the Adaptive Interface.

Tanya Jonker, FRL Research Science Manager, explains: "The underlying AI has some understanding of what you might want to do in the future. Perhaps you head out for a jog, and based on your previous behavior, the system predicts you'll want to listen to your running playlist. It then presents the suggestion on the screen: 'Do you want to play your running playlist?' That's the adaptive interface at work. Then, with a micro gesture, you can confirm or reject that suggestion. Because the interface surfaces something relevant based on your personal history and choices, and lets you act on it with minimal input gestures, the intelligent click gives you the ability to take these highly contextual actions in a very low-friction manner."

What might that look like in practice? If you look at a smart lamp, you may be able to turn it on or off, or change its color or brightness. When you take out a package of food, the cooking instructions may appear in a small window, complete with pop-ups offering to start a timer just as you reach a critical step. When you walk into a familiar café, it may ask if you want to place your usual order. When you put on your running shoes, it may ask if you want to start your workout playlist. These are things that could be accomplished in the short to medium term; after that, the sky's the limit.
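The "intelligent click" pattern described above boils down to: observe which action you take in a given context, then surface the most likely one as a suggestion you can confirm or dismiss with a micro-gesture. A minimal sketch of that loop (the class and its simple frequency-counting strategy are my assumptions; FRL's system would use far richer models):

```python
from collections import Counter, defaultdict

class AdaptiveInterface:
    """Toy version of a context-aware suggester: count which action a
    user takes in each context, then offer the most frequent one."""

    def __init__(self):
        self.history = defaultdict(Counter)  # context -> action counts

    def observe(self, context, action):
        # Called each time the user does something in a known context
        self.history[context][action] += 1

    def suggest(self, context):
        # Surface the most common past action, or nothing if the
        # context is unfamiliar (better silent than annoying)
        if not self.history[context]:
            return None
        return self.history[context].most_common(1)[0][0]
```

Returning `None` for unfamiliar contexts reflects the design goal in the quote: suggestions should appear only when the system has real evidence you'll want them.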

Menus like this will only appear when you're likely to want them; FRL wants to make sure the system only shows you things it thinks you'll want.
Facebook Reality Labs
All of this would be nice and convenient, but clearly, privacy and ethics will be a major concern, especially with a company like Facebook involved. Few people in history have ever had their lives so thoroughly scrutinized, cataloged, and analyzed by a third party. The opportunities for targeted advertising will be limitless, as will the opportunities for bad actors to exploit this gold mine of personal information.

However, this technology is on its way. According to the FRL team, it's still a few years off, but the technology and experiences shown here are already proven. They work, they'll be outstanding, and now it's just a matter of figuring out how to turn them into a mass-market-ready product. So why is FRL telling us about it now? Well, this could be the most significant advancement in human-machine interaction since the touchscreen, and Facebook doesn't want to be seen making such decisions behind closed doors.

Sean Keller, FRL Director of Research, said, "I want to address why we're sharing this research. Today, we'd like to start a conversation with the public about how to responsibly develop these technologies. We won't be able to anticipate or resolve all the ethical issues this technology will raise on our own. What we can do is recognize when the technology has progressed beyond what people believe is possible, and ensure that the information is freely shared. We want to be open about our work so that people can express their concerns about this technology."

FRL CTO Mike Schroepfer emphasizes that this is not a Neuralink. "You can't read thoughts, senses, or anything like that, because we're nowhere near the brain." Other devices, such as the Facebook Portal, which sits in people's homes and uses AI tracking to move a camera and keep you in frame during a video call, have had to deal with some of the same issues, he says. "That system was built from the ground up to run locally on the device, so we could provide the feature while protecting people's privacy," he explains. "That's one way we're attempting to use on-device AI to keep data local, which is yet another form of data protection. There are a lot of things like that that this team is looking into."

How can you take part in this discussion? One way is through us. Leave your questions, concerns, and ideas in the comments below, and we'll put them to the FRL team in an interview as soon as possible.

Whether you believe we already live in a post-privacy world where people freely trade information about themselves for convenience, or you believe technology like this is a step too far into the digital unknown, it's coming, and it'll change the world. Check out the video below.


This article was written by Amit Caesar, a specialist in virtual reality, augmented reality, and artificial intelligence (2021).
Send me an email: caesaramit@gmail.com


You may also be interested in 

  1. Pinball FX2 VR review: the best pinball game
  2. Oculus Quest 2 full review.
  3. Amazing products for your virtual reality glasses from Amazon



What are your thoughts on Facebook's wrist controllers? Please let us know in the comments section below!
