Tom Emrich's AR News
For over a decade now, I have started my year off by taking some time to jot down the trends I will be watching in augmented reality (AR). The following are the 23 AR trends I will be keeping an eye on for 2023. It is important to note that these are not predictions but rather major areas of focus that are already in play that I believe will dominate the AR space this year. ☕️ Grab a cup of coffee and get comfy, this is a long read! This post is best enjoyed on LinkedIn so if you just received this in your inbox, hit the "Open in LinkedIn" button above for a more optimal reading experience.
Artificial Intelligence (AI) is set to knock the metaverse down the Hype Cycle curve by taking its place at the peak this year. Metaverse is defined differently by different people, but for me, the metaverse is not a single game, a virtual world, or an NFT collection, but rather an "aha moment". It is a realization that the next wave of computing consists of a new stack made up of emerging technologies (including blockchain, AI, IoT, AR and VR) that will all work together to create a fundamental shift in our relationship with technology. This shift can be summed up in one word: presence. This great virtual awakening was brought on by the mass digital transformation fueled by the global pandemic, which forced organizations and individuals to accelerate their use of virtual technologies to survive and, in turn, gave them a solid foundation to see what's next. It also generated new expectations for our technology to deliver more human connection and better replicate the physical world experience. In my 2022 AR Trends post, I started my report expecting the metaverse to remain at the peak of the Gartner Hype Cycle last year, which it did. I also correctly posited that we would soon see that the metaverse is "more mirage than miracle": the closer we think we are getting to it, the more we realize that there is still a way to go before we hit our final destination. That is not to say that the individual technologies which make up the metaverse, such as AR, are at the same point in the cycle.
Rather, the grander vision of the metaverse is going to take a lot to accomplish, and so it will move into the Trough of Disillusionment (or break out into its own Hype Cycle completely, just like the Internet of Things (IoT) in 2016) and continue to evolve behind the scenes before we wake up one day and realize that we have arrived. Taking its crown will be one of these individual technologies, AI, or more specifically generative AI, which, at the end of 2022, was already beginning to dominate headlines and attention.
2023 will be a big year for XR (AR and VR). Like AI, AR and VR will benefit from the industry shifting its focus from the grander vision of the metaverse to the individual underlying technologies that will enable it. This year, the garden that XR has been cultivating for years will finally begin to bloom as AR and VR are unified through mixed reality devices, which will create extremely fertile ground for new apps and solutions. Headsets capable of video passthrough AR will not only give developers a new device to create content for, but these devices, I believe, will finally help the industry see that AR and VR are two sides of the same coin. This may help us move away from the “AR versus VR” mentality that many still have in and outside of the industry. I expect this unification will energize the developer and investment community as they size the XR opportunity as a whole rather than focus on just one fragment.
This will be a huge year for mixed reality (MR) headsets: virtual reality head-mounted displays that double as augmented reality headsets with color video passthrough as a core feature. If you have been watching the AR industry for a while, you’ll know that everyone has been waiting for headworn AR to hit a breakthrough point with consumers. While most of the focus has been on optical see-through AR glasses that look like regular eyewear and can be worn all day anywhere you go, video passthrough AR has readied itself to transform every VR head-mounted display into an AR device, and this form factor will take the lead in consumer adoption. 2022 ended with a number of significant milestones in mixed reality headsets. Meta launched its Meta Quest Pro, Lynx shipped its R-1, and Lenovo launched its ThinkReality VRX for the enterprise. When I walked the CES floor this year, if a manufacturer hadn’t already debuted a mixed reality device, such as those from TCL, Somnium Space and HTC, it indicated a mixed reality module was coming, including Pimax, whose high-end, high-resolution device, the Pimax Crystal, is set to take on Varjo in offering AR and VR. Rumors of Samsung, Google and Apple joining this race continue to flood the tech news, with Bloomberg and other analysts suggesting Apple may debut its mixed reality headset as early as this year. I call this milestone headworn AR’s “PC moment”: these devices are still relatively bulky and expensive, are fixed to a room or indoor location, and, among the few who can afford one, we will most likely see only one per household. That being said, they are ready for adoption and are only going to get better in a short amount of time. What is great about mixed reality devices is that they benefit from the adoption journey virtual reality has been on in the consumer space. The VR market has slowly gained adoption since 2016, when the first set of consumer devices hit the market.
Statista estimated that 74 million people were using VR hardware in 2022, with that number getting closer to 100 million this year. Consumers know VR, they know family and friends who have a VR device, and so adding AR as a feature to VR is just going to make something they are familiar with even more powerful. In this way, you could see VR HMDs as the Trojan horse that gets headworn AR to the masses.
While headworn AR is getting its PC moment with mixed reality head-mounted displays that don’t leave your home, we will see new connected eyewear options we can use more regularly out and about. But these devices are more the return of Google Glass and less Magic Leap for the masses. Despite this, they will play an important role in educating and acclimating us to wearing tech on our faces so that we will be ready for when headworn AR gets its “mobile moment” with all-day, everyday AR glasses. As a former Google Glass Explorer, I can tell you that one of the biggest things I learned was that the face is a very sensitive place on a person. Asking a person to wear tech on this area of their body is a massive undertaking we shouldn’t minimize. People are OK with wearing tech on their wrist, or constantly holding it in their hand, but asking them to wear it on their face out in the world is hard. It gets even more complicated because we are not only asking people to wear technology on their face but to wear glasses. Many folks don’t wear glasses, and many of those who need them have moved on to contacts to avoid wearing frames. We already have connected eyewear available today: Ray-Ban Stories, Nreal Air, Nreal Light and Rokid Air are just a few that come to mind. This year on the CES floor we saw TCL debut its RayNeo X2 binocular heads-up display and Vuzix’s Ultralite glasses, their most fashionable frames yet. While all of these glasses may look similar, they do very different things. Some let you listen to music and record media, others act as a screen extension so you can play games, while others provide you with notifications and directions; very few provide actual AR experiences. I suspect that we will continue to see a myriad of different connected eyewear options become available to consumers before they eventually consolidate into the holy grail of AR smart glasses.
2023 in particular will have a greater emphasis on the return of consumer heads-up displays, both monocular and binocular, providing more of a “smartwatch moment” for connected eyewear.
I say this every year because it is true: the smartphone continues to become an even more powerful augmented reality machine with advancements in chips, cameras, displays and connectivity. Every year, flagship phones from Google, Apple, Samsung, Xiaomi and others get faster, sharper and longer lasting. New chips, higher resolution displays, more complex camera systems, innovative battery solutions and support for the latest connectivity all bode well for mobile augmented reality. While the higher-end phones get more powerful, each cycle the cost-conscious models inherit many of the new features that debuted the year before. These more affordable options play an important role in bringing AR to the masses. In addition, we continue to see traction on new, innovative display options such as flip and foldable phones and the emergence of rollable and slidable phones. IDC predicts that the foldable phone market will reach 27.6 million units, and Samsung reported strong success with its foldable phones, especially in the enterprise, so if you think these are just a fad, they are not. At CES this year, Samsung teased its Samsung Flex Hybrid, a rollable device that can fold and slide, going from 10.5” to 12.4”. TCL and LG are among the manufacturers who have announced plans to make rollable phones that use motors to extend a regular smartphone into a tablet without hinges. More screen size gives a larger field of view for augmented reality, which provides an even more immersive experience for the user. With the smartphone adopted by over 5 billion people worldwide, the phone is still the place to reach the most users with augmented reality. Expect mobile AR to continue to deliver the biggest consumer opportunity for brands, marketers, platforms and games in 2023.
In my 2020 AR trends post, I wrote: “the optics and photonics community are literally trying to break the laws of physics to give us the components we need to create fashionable all-day AR glasses”. This statement remains true today. 2023 began with a number of major waveguide players showcasing the leaps and bounds they have made in components that will one day enable all-day, every day AR glasses. Meta Materials and Ant Reality both showcased electrochromic dimming at CES 2023. Electrochromic dimming uses electrically controlled liquid crystals to regulate how much of the physical world’s light gets through. Advancements in dimming are key to enabling optical see-through AR devices to function properly outdoors or in areas with a lot of light; they can also enable a toggle between AR and VR content. Today’s waveguides are also brighter, lighter, and offer a wider field of view. Lumus debuted its second-generation Z-Lens, which can deliver a 3,000-nit display at 2K-by-2K resolution in an optical engine that’s 50% smaller. Ant Reality’s “Crossfire” boasts a 120-degree field of view, with its possible first commercial use in a future Nreal device. And Vuzix is hoping to fast-track the creation of AR glasses with its OEM smart glasses platform, Vuzix Ultralite. Ultralite is designed to expedite the manufacturing of a smartphone accessory that weighs 38 grams, is super power-efficient with up to 48 hours of run time on a single charge, and packs an impressive waveguide to display information from your phone hands-free to your eye. Dispelix and VividQ also teamed up to debut a major breakthrough in 3D waveguide technology: together they have developed a waveguide combiner that can accurately display simultaneous variable-depth 3D content within a user’s environment. While we can expect many of these advancements to begin to power next generation connected eyewear, especially smartphone accessories, these big steps forward continue to shine a light on just how hard highly functional optical see-through AR glasses are to create. While the optics industry continues to make good progress, we are still a ways out from having everything we need to create all-day, every day AR smart glasses that can try to replace our smartphones.
AR and VR are touted as immersive technologies, and yet today’s experiences mainly focus on digitizing our sense of sight, leaving most of the way we experience physical reality out of the metaverse. This is changing as head-mounted displays prioritize features that bring our ears, our hands and more of our eyes into the mix. Eye tracking takes our sense of sight to the next level in AR and VR experiences. It is critical for foveated rendering, a technique that provides a high-quality visual experience at a greatly reduced computational workload by decreasing image quality in our peripheral vision while in a headset. Eye tracking also ensures the displays in our headset are adjusted properly by enabling auto IPD: measuring the user’s IPD (interpupillary distance, or the distance between your eyes) each time the headset is put on and then adjusting the lens spacing with a motorized drive. Eye tracking can also be used as a new interaction method in AR and VR experiences, enabling control of content simply with your eye gaze. Hand tracking brings a new way to interact with AR and VR content. Bringing our hands into AR and VR offers a much more natural way to engage with 3D content compared to controllers. This becomes even more important in mixed reality, where your hands are visible within the experience. Hand tracking can also be used to augment this part of our body to enable try-on experiences. In my 2022 “Reality Check” post, I highlighted the activity we saw in the spatial audio space, both in and out of AR/VR, last year. Indeed, spatial audio is steadily becoming the default listening experience for music and other entertainment and gaming content, which is being created using the latest audio formats and listened to on spatial audio-enabled earbuds and advanced speaker systems in vehicles. But we have a ways to go with AR/VR headsets.
In fact, the most optimal experience today for AR and VR is to pair your headset with spatial audio-enabled headphones. I expect this combination will be standard for this generation of headsets. I also expect that pairing spatial audio-enabled headphones with a mobile AR experience will grow in popularity, especially as more spatial audio tools and APIs become available to developers.
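To make the foveated rendering concept above concrete, here is a minimal, illustrative sketch of how a renderer might use an eye tracker's gaze point to decide per-pixel quality. This is not any headset vendor's actual pipeline; the function name, foveal radius and falloff constants are all my own assumptions:

```python
import math

def foveation_scale(pixel, gaze, fovea_radius=0.1, falloff=2.0):
    """Toy per-pixel resolution multiplier driven by eye-tracked gaze.

    `pixel` and `gaze` are (x, y) positions in normalized screen
    coordinates [0, 1]. Pixels within the foveal radius of the gaze
    point render at full quality (1.0); beyond it, quality falls off
    smoothly toward a floor, so peripheral pixels cost less to shade.
    """
    eccentricity = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if eccentricity <= fovea_radius:
        return 1.0
    scale = 1.0 / (1.0 + falloff * (eccentricity - fovea_radius) / fovea_radius)
    return max(scale, 0.25)  # never drop below 25% quality
```

With the gaze at screen center, a pixel at the center renders at full quality while one near a corner is clamped to the floor; real headsets apply the same idea per tile or via variable-rate shading rather than per pixel.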
5G and Wi-Fi 6 (and, on the horizon, Wi-Fi 7) will continue to be a major focus in enabling next-level augmented reality and virtual reality experiences, especially as more mixed reality devices hit the market that require more powerful connectivity for lower latency, higher fidelity content. But two enabling technologies I expect to hear more about this year are ultra wideband and edge computing, both of which enable devices to react to their environment immediately, without the delay of transferring data elsewhere. Ultra wideband is already in use across many Apple products via the U1 chip, which is found in the latest iPhone models and, of course, Apple’s AirTags. This chip gives devices spatial awareness by enabling a U1-equipped device to precisely locate and communicate with other U1-equipped devices around it. In 2022, Apple expanded its Nearby Interaction framework in iOS 16, letting third-party accessories talk to U1 chips in the background. It is expected that the U1 chip will play an important role in Apple’s rumored mixed reality headset. But Apple is not the only one using ultra wideband: Samsung and Google are also making use of this technology in their devices. I expect we will continue to see ultra wideband play a critical role in spatial computing as more devices adopt the technology and developers get greater access to APIs that make use of it. Edge computing brings the power and performance of data centers closer to devices by processing data on or near them rather than sending it all off for remote processing. This can dramatically increase the power, consistency and quality of AR and VR experiences while preserving the device’s battery life. This is especially key for spatial computing applications, such as digital twin solutions in the enterprise, which require a realistic and accurate representation of a user’s surroundings. A recent report from Vertiv suggests that edge computing will grow from 21% of total compute to 27% in 2026.
This acceleration will be helped by more accessible edge solutions from cloud service providers including options with a key focus on security, latency and sustainability. I expect we will see much more activity in edge computing in 2023 especially as AR devices gain adoption in the enterprise and location-based mobile AR content rises in demand.
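To illustrate why ultra wideband is so good at spatial awareness: UWB radios timestamp short pulses and turn time of flight into distance. The sketch below shows single-sided two-way ranging in its simplest form; real UWB stacks also correct for clock drift and antenna delay, and the function name here is my own invention, not a real API:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def uwb_distance(t_round, t_reply):
    """Estimate the distance between two UWB devices.

    The initiator measures the round-trip time `t_round` (seconds) for
    a ping to reach the responder and come back; the responder reports
    its internal processing delay `t_reply`. Half of the difference is
    the one-way time of flight, which converts directly to distance.
    """
    time_of_flight = (t_round - t_reply) / 2.0
    return SPEED_OF_LIGHT * time_of_flight
```

At these speeds, a single nanosecond of timing error is roughly 30 cm of distance error, which is why UWB's fine-grained pulse timestamps matter so much for centimeter-level positioning.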
Mobile AR is growing up and entering its third generation. In 2009, I remember using Layar and Yelp Monocle, when AR on a phone was really just an overlay (it showed me that I could easily get to my nearest Starbucks if I literally walked through a wall; if you know, you know). 2009 was also the year I scanned a large AR marker on the cover of Esquire magazine that Robert Downey Jr. was sitting on. AR was in its infancy, but even back then it showed promise to bridge the gap between the digital and physical worlds. I consider this era Mobile AR 1.0. Today, of course, mobile AR is much more sophisticated thanks to advancements in computer vision that enable markerless world tracking, image tracking, face tracking and more. Mobile AR 2.0, or most of the AR we know today, leverages smarter phones than the previous generation along with technologies that root AR more firmly in our reality. While this has allowed for more immersive experiences on smartphones, the next phase of mobile AR, Mobile AR 3.0, gets much more contextual. The shift in Mobile AR 3.0 is a significant one. We are moving from AR that can be used in “any place”, on “any face” and on “any thing” to AR that requires “this place”, “this face” and “this thing”. This will be driven by technologies like VPS (Visual Positioning System), semantic understanding and AI. Mobile AR 3.0 will begin to make good on AR’s promise of blending the digital with the physical, as it makes more use of the physical world as part of the experience. In fact, experiences that use this next generation of mobile AR technology will make the physical world an even more essential part of the experience, so much so that AR will feel scarce and precious: we may only be able to experience it at certain times of the day, in certain locations around the world, on certain objects, and have it change depending on the user.
In turn, this will make AR feel more real, more personal and much more valuable. This year, keep an eye on platforms as they roll out new tools and features that enable developers to create content representing this third generation of mobile AR.
The web has become a powerful place for augmented reality. Mobile WebAR has grown in use and adoption thanks to platforms like 8th Wall and others. Last year, Niantic placed a huge spotlight on the web through its acquisition of 8th Wall, which has now powered more than 2,000 commercial experiences to date. With XR in vogue, mobile WebAR on good footing, and browsers on mixed reality headsets capable of VR and AR content, I believe the web and WebXR are well positioned to be a focus in 2023. This year I will be keeping an eye out for an increased focus on the browser from major players in AR. I will also be watching the browser space for greater adoption of standards and support for AR, as well as efforts to make AR content easier to access, navigate between and discover. I also expect more developers will see the web as a massive opportunity for AR content, which in turn will grow a brand new WebXR developer ecosystem, a community which is already underway. While web-based AR is not new, expect this space to heat up this year as the browser takes center stage for AR content creation.
With 2023 being a big year for AI, one of the key trends in this area will be the use of AI in 3D/AR/VR development. In particular, generative AI will play a key role in accelerating AR and VR content creation, as it will not only be used to spark new ideas but also to accelerate the creation of assets. One of the major roadblocks in creating AR and VR content is not a lack of AR and VR development tools but rather a lack of 3D assets to create content with. 3D modeling and 3D animation are still highly specialized skills. While you can, and do, use 2D assets in AR and VR, spatial experiences require 3D content to be truly immersive. While it's still early days, we are already seeing how generative AI can be used to create 3D content. OpenAI, NVIDIA and Luma AI have demonstrated the path to using AI to produce 3D models from text prompts. Last year, Luma AI released a tool that can generate 3D-printable models from a text prompt. OpenAI debuted Point-E, a system for generating 3D point clouds from complex prompts, which in turn can be converted into mesh models with existing software tools. We may soon be able to generate 3D assets for games and AR/VR content just as easily as we are creating 2D images today using Midjourney, DALL-E and Stable Diffusion. Democratizing the creation of 3D assets will unblock many in the development of AR and VR content. But the use of generative AI to assist in the development of AR content goes beyond assets. Midjourney can be used today to generate storyboards and character concepts, ChatGPT can be used to generate narratives and copy, and DALL-E can be used to generate textures for 3D models, to name just a few examples. Generating AI-enabled avatars is another way this trend will materialize, as we are seeing with the likes of Inworld. Within this sphere, there is also growing use of neural radiance fields, or NeRFs, a technology that is revolutionizing the representation of 3D spaces.
NeRF is a type of AI technology used to generate 3D scenes from a limited number of input images; it essentially turns 2D photos into 3D scenes. Last year, NVIDIA debuted a new NeRF technique which the company claims is the fastest to date, needing only seconds to train and generate a 3D scene. We also saw Polycam and Luma AI launch NeRF capture as a feature of their popular scanning apps. Keep an eye on the AI space, which is rapidly expanding and evolving, as it plays a key role in augmented reality.
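For readers curious about what a NeRF actually computes: a trained network predicts a density and a color at points sampled along each camera ray, and the final pixel is an alpha-composite of those samples. The toy grayscale function below shows only that compositing step, with the neural network replaced by hand-supplied sample values; all names here are illustrative, not from any NeRF library:

```python
import math

def composite_ray(densities, colors, delta=0.1):
    """Toy version of NeRF's volume rendering along one camera ray.

    densities: non-negative density (sigma) at each sample point.
    colors: grayscale color at each sample point (real NeRFs use RGB).
    delta: spacing between consecutive samples along the ray.
    Returns the accumulated color seen by the ray.
    """
    transmittance = 1.0  # chance the ray has not yet been absorbed
    pixel = 0.0
    for sigma, color in zip(densities, colors):
        alpha = 1.0 - math.exp(-sigma * delta)  # opacity of this sample
        pixel += transmittance * alpha * color
        transmittance *= 1.0 - alpha
    return pixel
```

Empty space (zero density) contributes nothing, while a dense sample near the camera occludes everything behind it; training adjusts the predicted densities and colors until rendered rays match the input photos, which is how a NeRF recovers coherent 3D geometry from nothing but 2D images and camera poses.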
As more mixed reality headsets become available and adopted worldwide, developers will find themselves with a new opportunity for content development. But in order to maximize their reach, developers will need to consider developing for a variety of platforms, a myriad of devices (smartphones, tablets, PCs, gaming consoles and MR devices), and even a mix of realities (AR/VR). As such, I expect that developers will have a growing need for development tools that enable them to build once and deploy everywhere. As I expect mixed reality devices to be no more than one per household, developers may also be looking for tools that let their content be engaged with both on a headset, for the primary user, and on smartphones and tablets, for the people around them. I originally included this trend in my 2022 outlook, but it was arguably too early. What has changed in 2023 is the growing number of mixed reality head-mounted displays that are already or soon to be available.
In 2020, one of the trends I wrote about was that “filters will continue to influence the fashion and beauty industry while slowly becoming the future of it all at the same time”. This year I am bringing this trend back, but moving it forward a huge step, as fashion brands are no longer just influenced by AR but are using AR as a way to tap into both the physical and virtual economy. In 2022, we saw a number of fashion brands use AR and 3D to drive sales of physical goods as well as to gain brand adoption and deliver new value in virtual worlds. I expect we will see even more fashion brands use AR in a variety of ways this year, including unlocking effects for physical apparel, digital replicas of physical clothing for your avatars, and digital-only looks you can only “wear” after capturing them on camera for your social feeds. This will all be facilitated by avatar platforms that make it easy to offer and sell virtual goods; AR platform features, such as image targets and body tracking, that can be used to bring virtual clothing into the physical world; and NFC tags and QR codes embedded into physical apparel to trigger experiences that can transform one shirt into an infinite number of designs.
Just as AR is disrupting the fashion industry, so too is it disrupting retail. I expect AR to continue to transform the retail industry in 2023, both by making the ecommerce experience feel more physical and by making the brick-and-mortar experience more digital. In 2022, we saw a number of major retailers roll out virtual try-on as part of their ecommerce offering, including Amazon, Walmart, H&M and Society6, to name a few. I expect virtual try-on and virtual try-out to steadily become a staple of ecommerce, especially as success stories of AR’s ability to collapse the purchasing funnel are shared within the industry. Many retailers have already shared proof points of how AR is driving awareness, increasing conversion, expanding cart size and reducing returns. And, as worldwide retail ecommerce sales are projected to total over $6 trillion this year and make up 22% of total retail sales, retailers will be even more incentivized to optimize their ecommerce sites. With location-based AR in play and image target technology widely available, retailers now have powerful AR tools to transform their everyday locations into experiential destinations. While I expect most of the AR activity in retail to be around ecommerce, I will also be keeping an eye on how retailers use AR to drive foot traffic into malls and stores. AR has the potential to save the brick-and-mortar retail industry by giving shoppers new reasons to visit locations. It will also equip retailers with a powerful digital tool that can provide in-store data they’ve never seen before to help drive business decisions.
Marketers have always played a significant role in new technology adoption as they experiment with innovative technologies to cut through the noise and engage people in brand new ways. AR has been no different. Marketing will continue to be a major driver of AR content creation and mainstream engagement in 2023. But with a couple of years and many experiences under their belts, many brands will have a growing focus on measurement and ROI to ensure that AR is delivering business value. I expect that in 2023 brand marketers will be looking for a return on AR (or ROAR), especially given that many metaverse activations in 2021 resulted mostly in awareness and PR, that the economic climate continues to encourage businesses to watch their spend, and that mobile AR campaigns have already demonstrated success along the marketing funnel. This is an important time for mobile AR in the marketing industry, as proving its value will be the determining factor in moving AR from an innovation experiment to a staple of the marketing mix. Key successes in the next few years will solidify AR as part of marketing strategy, giving it a line item in the marketing budget and making it a key pillar of every campaign.
Like the metaverse, I expect the NFT hype to chill this year. As the hype dust settles, NFTs will get more practical and, dare I say, boring, but that is a good thing. NFTs are set to become the new loyalty program for many brands, as the community and rewards aspects of NFTs lend themselves well to increasing brand loyalty. As 2023 sees more brands roll out NFTs, following the lead of Nike’s .SWOOSH and Starbucks’ Odyssey, I expect that brands will look to AR to add more value to their digital collectibles. We began to see the intersection between AR and NFTs in 2022 with Jadu, which is building Web3’s definitive AR game platform; Pixelynx’s NFT scavenger hunt game Elynxir, powered by Lightship; and AR players, such as 8th Wall, Geenee, Snap, Meta and Perfect Corp, updating their platforms to make it easier to bring NFTs into the physical world. As more brands enter the NFT space, I expect they will be looking for new ways to use their inventory. This will include using AR to find and unlock NFTs, as well as bringing NFTs to life in the physical world so that people can engage with them, including taking photos and videos that can be shared with their networks.
In my past trend reports, I highlighted the growing use of broadcast AR, or real-time CGI on television. In fact, I posited that “Gen Alpha will become the last generation to watch live events on TV without augmentation as broadcast goes all in on AR”. While I believe we are on track with this trend, and we will continue to see more live event shows use AR to make them extraordinary, this year I am also keeping an eye on the growing use of AR as a companion to TV shows. We caught a glimpse of how connected television (CTV) intends to use AR in 2022: Disney, NBCUniversal and others launched companion augmented reality experiences which brought shows off the screen and into the homes of viewers for them to engage with. With the advent of mixed reality and the steady adoption of mobile AR, I expect more networks to launch their new shows and movies with AR experiences that allow people to go deeper into the story. In turn, this may increase screen time and encourage repeat viewing of a show, in addition to generating awareness. Keep an eye out for more on-screen QR codes that trigger augmented reality experiences aimed at elevating your CTV experience, and possibly TV content designed to be viewed through mixed reality headsets that augment the show you are watching.
AR as a medium is on a journey very similar to that of photos and videos. At one time, creating photos and videos required highly specialized skills, but over time the tools to create this content became more accessible and easier for the everyday person to use. While AR may not be as far along as photos and videos, AR filters and effects have democratized the use of special effects to enhance content for social media. AR effects and filters are not new, but what I suspect will be novel in 2023 is the growing use of AR in longer-form video. AR filters have mostly been used in photos and short video clips for stories. In longer-form video, AR can be used to add real-time special effects, elevating everyday video creation to a level we have previously only seen come out of studios. AR’s ability to remix reality will not only mean that the same effect can be used by creators to produce content in an infinite number of ways, but also that the same exact scene captured on video can be seen in a variety of ways through different effects.
2022 was a big year for avatars. The race to have you create and invest in a digital self saw a lot of activity from tech giants such as Apple, Meta, Snapchat, TikTok, Microsoft, Zoom, Roblox and many more last year. Your online identity has always been important, but embodying it with an avatar and using it to represent you in virtual spaces and experiences will continue to be a serious trend this year. And with even more AR and VR content expected to be created, 2023 will surely present even more opportunities to use your digital double. The success of avatar systems relies on the avatar design and the ecosystem you can use it in. As such, I will be keeping an eye on updates to avatar systems that offer even more personalization options, make avatar creation easier, and provide APIs that enable the use of avatars beyond the walled gardens they were created in. The latter will unlock even more opportunities for our digital doubles, including purchasing clothing, accessories and experiences for our digital selves. With many avatar systems already in play, I will also be keeping tabs on the adoption of these systems by developers and consumers, as I suspect that we will start to see some front-runners emerge (if we haven’t already).
Digital twins, or synchronized virtual representations of physical products, systems or processes, are already in use by a number of organizations across various industries. Digital twins are enabling next-level monitoring and maintenance, accelerating design and development cycles, and elevating business strategy through simulation. In 2023, we will see an acceleration of digital twin adoption in the enterprise, driven by the proliferation of digital twin and AI technologies and by case studies from early adopters highlighting the effective use of digital twins in their organizations.

When digital twins are combined with AR and VR, the result is an immersive experience that brings a physicality to the simulation, making engagement with virtual objects, people and processes feel much more natural. This is what McKinsey calls "the enterprise metaverse—a digital and often immersive environment that replicates and connects every aspect of an organization to optimize experiences and decision making."

I expect we will see a growing focus on the "enterprise metaverse," especially as digital twins become standard practice in the enterprise, mixed reality headsets grow in adoption, and the commercial metaverse hype dies down, causing many to shift their attention to the metaverse at work.
The rise of remote work and the availability of enterprise-ready headworn wearables are transforming the way we work, which in turn is creating demand for a new set of applications to use on the job. These new solutions are changing the face of productivity: upskilling workers with hands-free information to efficiently perform tasks, increasing the feeling of presence in a meeting or brainstorming session, and transforming the space in front of us into an infinite screen or surface for 3D content creation.

With new mixed reality headsets becoming available at prices best suited for the enterprise, I expect that this year we will see a greater emphasis on collaboration and productivity software as a "killer app" for mixed reality. These solutions will come from both startups and well-established players. I also expect they will explore ways to switch between VR and AR in a manner that optimizes our productivity, leaning into VR's ability to shut out the world in front of us while opting to make use of our space with AR when it makes sense to do so.
We have seen steady discourse on the need for privacy, security and safety in this next wave of computing over the past few years. With the metaverse on people's minds, new headsets making their way to consumers, and more AR and VR content being created by developers, keeping people safe will become an even greater focus. This will be especially key as devices begin to gather more information with sensors that enable eye tracking and hand tracking, and as AR begins to use more spatial data to root itself in the scene.

As adoption of these new technologies accelerates, conversation will need to turn into action, as we will require new frameworks, regulations and social contracts for society to safely make use of XR. I expect this year we will see more activity from working groups, policy and standards discussions, and new software solutions focused on moderation and cyber threats, all aimed at keeping people safe in digital realities.
Artists and activists are not new to AR and VR. We have already seen many compelling examples that use these perceptual computing technologies to change the way we see humanity and society as a whole. I expect that the availability of mixed reality headsets will inspire new ways for Creators to tell stories that tap into XR's empathy engine. This could include using mixed reality's ability to toggle between AR and VR to immerse people in a simulation before showing them its impact on our physical world, or bringing photos, portraits or screens to life to expand the storytelling of a physical work of art beyond the canvas. The use of AR and VR will help many causes cut through the noise in order to be heard. It will also offer people an opportunity to have new experiences that could shift their perspective, inspiring them to take action and make change.
Are there trends you are keeping an eye on this year that I don't have reflected in my post? Add them to the comments! Want to look back at Augmented Reality in 2022? Read my "Reality Check: Looking Back at Augmented Reality in 2022" report on LinkedIn. Subscribe to my LinkedIn newsletter and follow me to get regular posts on augmented reality, spatial computing and the metaverse. And thank you for reading!

This article was written as an independent piece. The ideas and opinions expressed in this post are mine alone and do not represent any organization, past, present or future, which I may be affiliated with.

All images were created with Midjourney.