Looking glass
Lens two: Evolving the human-machine experience
We are changing the way we interact with the digital world and what we expect from it. Traditional devices are extending their reach through gesture and voice interaction, and real-world scenarios are being tested with digital twin simulations that guide consumers and model outcomes. As we move, with some inevitability, toward the metaverse, the physical and digital worlds will converge further, opening new possibilities for businesses.
Through the Looking Glass
Interfaces continue to evolve across gesture, voice and touch — engaging all the senses. Devices that work with us in our everyday lives are commonplace and reflect a richer pairing of software and hardware. Devices themselves are becoming more ergonomic, designed to slot into everyday interactions with minimal disruption. We now see more intelligent devices, with local and cloud-based AI supporting day-to-day decision-making.
Autonomous driving is not the only example of evolving interactions, but it provides a powerful illustration of this lens in action. We’ve moved very quickly from real-time, traffic-based mapping services to self-driving cars that constantly simulate the possible future actions of the vehicles around them in order to realize lower-risk outcomes. In an autonomous vehicle, instead of focusing on making small adjustments to steering, you can concentrate on the bigger picture. Perhaps the car notifies you of a major traffic incident: it can route around it, but you can also investigate and pursue completely alternative goals, such as stopping for a meal now instead of after arriving at your destination.
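The "simulate possible futures, pick the lowest-risk one" loop described above can be sketched in a few lines. This is a toy illustration only — the candidate actions, the risk model and the planning horizon are all invented for the example; real motion planners are vastly more sophisticated.

```python
# Toy sketch of simulating candidate futures and choosing the lowest-risk one.
# The actions, risk function and horizon are illustrative assumptions.

def simulate(speed, action, steps=5):
    """Roll a trivial vehicle model forward and return accumulated risk."""
    risk = 0.0
    for _ in range(steps):
        speed = max(0.0, speed + action)   # action: speed change per step (m/s)
        risk += max(0.0, speed - 30.0)     # toy rule: risk grows above 30 m/s
    return risk

def choose_action(speed, actions=(-2.0, 0.0, 2.0)):
    """Pick the candidate action whose simulated future carries the least risk."""
    return min(actions, key=lambda a: simulate(speed, a))

print(choose_action(32.0))  # -2.0: braking minimizes simulated risk
```

The same structure — enumerate actions, roll each forward through a model, score the outcomes — underlies far richer planners; only the model and the risk function change.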
Facebook’s plans for the metaverse have unleashed a torrent of hype, and various big players are already jockeying for position. Yet the concept of a fully digital world is far from new: the first million-dollar virtual property was sold in Second Life 15 years ago. What has changed is that higher-spec devices — such as phones and wearables — are in most people’s possession, making it possible to access new digital worlds in some form almost instantly. Even more specialized devices are sure to follow. Enterprises and investors are realizing this is the new frontier, and that the way we work and live our lives will again change via the medium of technology.
Signals include:
- A surge of investment in extended reality (XR) consumer solutions in readiness for the metaverse and related services
- Increasing numbers of players entering the metaverse market alongside Facebook and Microsoft. Tencent recently announced its intention to build a metaverse platform, while Nike is positioning itself to become a virtual apparel provider
- Rising investment from hardware vendors providing pre-rendered and streamed XR experiences as they move the heavy lifting to the cloud
- While we’re still waiting for a consumer-grade augmented reality (AR) device from a player such as Apple, such an announcement could come at any time and would prompt the market to move quickly
The opportunity
People expect more from their interactions. It’s not just about function over form anymore: we want devices to look good, feel good, understand our emotions and be more aware of our needs. Providing service of this kind is table stakes; doing it well is the goal.
New on the horizon this year is the metaverse. Consumers have repeatedly demonstrated enthusiasm for new platforms built on fundamental trends, and the metaverse hopes to be one such example. We will see new devices, as well as extensions to existing devices such as phones, designed for this emerging environment. It is also likely that different companies will provide competing environments. As Second Life has already demonstrated, it’s not just advertising revenue that will monetize these platforms; there is a myriad of possible products and services, and the model has already been proven at a smaller scale.
The evolution of interactions can also contribute to the bottom line. The global metaverse market is expected to grow at over 40% per year, reaching US$800 billion by 2028, according to the latest analysis by Emergen Research.
What we’ve seen
Trends to watch: Top three
Adopt
Natural language processing. The ability to interpret human language — converting speech into text, and text into meaning — continues to improve, with impressive capabilities just a cloud API call away. The most obvious use case for this technology is customer service, where 85% of requests are customer-initiated and an immediate response is preferred. But NLP can also be used to understand sentiment, create summaries of longer texts, interpret legal documents such as contracts, and much more. This makes NLP widely applicable, well beyond the customer service department.
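In practice these capabilities are consumed through a cloud NLP API or a trained model; to show the shape of the sentiment task mentioned above, here is a deliberately minimal keyword-based scorer. The lexicon and scoring rule are hypothetical simplifications for illustration, not a production approach.

```python
# Minimal illustration of sentiment classification, one of the NLP tasks
# described above. The word lists below are invented for the example;
# real systems use trained models or cloud NLP services.

POSITIVE = {"great", "helpful", "fast", "love", "excellent"}
NEGATIVE = {"slow", "broken", "terrible", "refund", "hate"}

def sentiment(text: str) -> str:
    """Classify a customer message as positive, negative or neutral."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("The support team was great and fast!"))
print(sentiment("My order arrived broken, I want a refund"))
```

Swapping the toy lexicon for a cloud sentiment endpoint or a pretrained model changes the accuracy, not the integration pattern: text in, label out.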
Analyze
Augmented reality combines the physical world with a purely digital space. A limited form of AR is now ubiquitous, delivered via Apple and Android phones, which are capable of overlaying virtual objects onto a camera view of the world. More advanced AR is delivered via a dedicated headset such as Microsoft’s HoloLens or Google Glass.
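Anchoring a virtual object to a camera view ultimately reduces to projecting 3D points into 2D screen coordinates. A minimal sketch of that pinhole-camera projection follows; the focal length and image size are illustrative values, not taken from any specific device.

```python
# Project a 3D point in camera space onto the 2D image plane — the core
# geometric step behind overlaying a virtual object on a camera view.
# Focal length and image dimensions are illustrative assumptions.

def project(point, focal_px=800.0, width=1920, height=1080):
    """Pinhole projection of a camera-space (x, y, z) point to pixel coords."""
    x, y, z = point
    if z <= 0:
        return None  # behind the camera: nothing to draw
    u = width / 2 + focal_px * x / z
    v = height / 2 + focal_px * y / z
    return (u, v)

# A virtual marker 2 m in front of the camera, 0.5 m to the right:
print(project((0.5, 0.0, 2.0)))  # (1160.0, 540.0)
```

AR frameworks wrap this projection in device tracking and lighting estimation, but every overlay a phone draws passes through a transform of this kind.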
Anticipate
Metaverse. Seen by some as the future of the internet and by others as the next stage in human evolution, the metaverse could offer compelling, integrated virtual environments to visitors. These virtual worlds have been around for a while, but with improvements in headset resolution and power, as well as the ability to create content in real time in the cloud and stream it to headsets, the experience promises to be more advanced. We are seeing large companies, retailers and governments all considering how to construct or participate in the metaverse.
Trends to watch: The complete matrix
- Smart systems and ecosystems
- Natural language processing
- Enterprise XR
- Intelligent assistants, agents and bots
- Increasing role of decentralized workforces
- Biometrics
- Computer vision
- Touchless interactions
- Facial recognition
- Augmented reality
- Connected homes
- Gaze tracking
- Synthetic media in a corporate context
- Digital twin
- Gesture recognition
- Addictive tech
Advice for adopters
Many of these emerging technologies require specialized knowledge that is not common in traditional enterprise software development. For example, users interact with a VR environment in completely different ways than they interact with web-based applications, which requires application creators to think about user experience in entirely new ways. Organizations wishing to leverage these new experiences need to start building capabilities now.
Emerging interfaces will without question present a wealth of opportunities in the B2C world, but they will also create many B2B possibilities that businesses should explore. Training, conferencing, gaming and virtual worlds are the classic examples, but there are even more inventive ways in which devices and AI can partner with a human agent to produce better outcomes in a professional context, such as intelligent, self-piloted drones in agriculture or rescue operations.
The rollout of solutions will be hampered by the availability of these capabilities and, to some extent, by the race to establish the dominant technology stacks. Relatively simple AR offerings are already translating into available products and will become more commoditized over time.
Bear in mind that these technologies change the user experience and design process. In XR, for example, designing well across multiple dimensions is a challenge, and emotional interactions also need to be considered. People represent themselves differently in virtual worlds, which can have moral and ethical implications.
A certain degree of vendor lock-in is inevitable, whether in devices, digital worlds or the data that they generate. Accept this but also be prepared for change. Embracing one platform may be the best solution for your organization now but not necessarily over the longer term, depending on how the ecosystem and your needs develop.
By 2023, businesses will…
… begin to understand that the expanding frontiers of interaction don’t just pave the way for richer customer experience but can actually drive business and process improvements, by pairing technology-based speed, scale and precision with human capabilities and ingenuity.