Written by Director of Emerging Technology Dan Phillips
Reality is subjective. Not everyone or everything experiences the world in the same way. Sometimes the differences are subtle, sometimes markedly extreme. Whether it’s how you react to an election result, hear a tone in a song, taste a delicious dish, or see a rainbow, observable reality and consistency of perception are often not as objective as we think.
Emerging technologies such as augmented and mixed reality will, over time, further expand and blur this line of perception. With AR on mobile devices and head-mounted displays, we’re well within the beginnings of what it means to live an augmented life. Humans are doing a lot of fun things right now, like bringing gaming into our physical world and turning our faces into playthings with endless filters, enhancements, and props. We’re also starting to find utility for AR in enterprise, education, and customer experience, aided by the emergence of hardware designed for specific business applications.
But AR is not just about changing the future of vision. AR can be the technological prism through which we see the world, but for humans it will also become the common device for the combined knowledge of the species. We will expand our tech parameters beyond display technology to deeper integration with machine learning, artificial intelligences, and instantly searchable databases. We will tap into the power of 5G connectivity and beyond to create new merged physical environments. We will be able to intuitively read the reactions of people we encounter based on the dilation of each other’s pupils and the pulses under our skin. Opinions and choices will be made through instantly accessible shared data. Want to make a key purchase, for example? Analyze the salesperson’s biometric response to your questions, and scan satellite imagery to see how much bargaining power you have based on how long the product has remained on the shelf.
Magic Leap, Microsoft’s HoloLens, and much-anticipated but never-confirmed moves into the wearable space by Apple give us mainstream hardware for AR. We also have next-generation AR-enabled spectacles and contact lenses on the near horizon, or perhaps we will just jump straight to implants and nerve-driven control systems. If that sounds ridiculous and far-fetched to you, consider how the inventors of spectacles could not have anticipated our use of laser-corrected vision or the human-computer interfaces used in experimental therapy today. If we think the oblong devices we carry in our pockets are the end of screen interface technology, then we have learned nothing about the power and pace of technology to change and be adopted. Technologists have free rein to debate the ethics of data-driven modification where politicians and bioethicists do not. The question is not if these technologies will change our experience of reality, but how quickly.
Many animals already sense things we can’t, on spectrums not available to humans. Think of that when you put on an AR headset and find yourself motioning to the invisible. Your own visual experience can be completely unseen by the people around you, whilst remaining entirely real to you. What you see, and your understanding of it, will soon be different from the person next to you, and we will no longer have a common experience of our shared environment. When AR arrives in its fuller and more integrated state, the challenge for our technologically tiered society will be how we stay in sync with one another.