Augmented reality, virtual reality, mixed reality… What is the reality?

It’s been an interesting decade, and I remember clearly that it was sometime in 2007 when we started working on a head-mounted glasses project for one of our customers. The customer had an interesting idea: mirror whatever you were watching on your laptop screen into the glasses, controlled and operated by voice commands. On your head you had a hands-free computer driven by voice, and your hands were free to do other tasks. If you look back, nothing much has changed from that basic concept in 2007, but in reality, everything has changed.

Today we are at an inflection point for AR, VR, MR and whatever other ‘R’ you want to call it. What exactly has changed in the last 10 years? When we started working on the glasses in those good old days, the entire idea was to realize hardware that could fit on your head, function like a computer, operate by voice command, and run Windows. Yes, the very first glasses that we realized ran Windows Embedded! Android was not yet in vogue. It was a great achievement for us to show something like this at the 2009 and 2010 Consumer Electronics Show (CES), where it attracted huge crowds when shown for the first time. It was a standalone piece of hardware showing some basic content like photos and videos.

One thing that definitely helped the world take notice of this technology was Google Glass. Google Glass 1.0 may not have been a critical or commercial success for Google, but it did a great service to the smart-glasses world by showcasing the technology to the common man in a way he could understand. That was one of Google Glass’s biggest contributions.

Coming back to today, what has changed? I think for any technology or product to succeed, many things have to come together, including:

  1. A good hardware platform
  2. Software
  3. Value for money
  4. And most important, real-world use cases: the technology should make a difference in users’ lives. It could be for business or in the day-to-day life of the common man, addressing some critical issue. At the end of the day, for a product or service to succeed, it should make a significant difference to quality of life and to the problem it is trying to solve.

So what do we have today that wasn’t there ten years back? Let’s take a look.

  1. The hardware has matured. Various ARM-based platforms deliver PC-grade performance.
  2. The software has been hardened and field tested. Practically everything available on a PC is now available on these glasses too.
  3. And there’s more: the apps! Today there is a community out there writing apps for the AR, VR, and MR world.

But that’s not all. We now have real-world use cases where these devices can make a difference in people’s lives.

Advanced research in AI, machine learning, and analytics has greatly contributed to the way these products are evolving, and the seamless connectivity of these glasses to the cloud is making a huge difference in the way these devices and applications can be used in the real world to solve real problems.

So what are the real-world problems these glasses are going to solve? The possibilities are countless. I can keep listing areas such as factories, technical support, advanced engineering support, supply chain management, aviation, and medicine, and the glasses could even address an issue as great as blindness by effectively giving an eye to a visually challenged person. Imagine what an improvement it would be if visually challenged people could lead normal lives without support from anybody. This is not a fantasy, and we will be seeing large-scale deployment of these glasses to help eradicate blindness as we know it today.

So how is this really going to solve the day-to-day issues in the areas I have mentioned above? The scope is enormous, but I want to look at a couple of areas where it can make a real difference.

In today’s age of e-commerce, the supply chain is probably one of the biggest challenges. For the first time, Internet shopping exceeded in-store shopping during this Thanksgiving season in the US. E-commerce in general, and Amazon in particular, have changed the way we shop. The inflection point has already occurred as far as online shopping is concerned. So what’s the next frontier to capture? Second-day delivery is passé. I want to touch and feel the product on the same day that I buy it. Is same-day delivery possible?

Have you ever imagined the logistical nightmare that moving these huge inventories causes, and the amount of manual labor required to accomplish it? Giants like DHL and FedEx have already started pilot projects to assess the use of AR across the supply chain, ranging from warehouse planning to transportation optimization and last-mile delivery.

Imagine huge warehouses measuring hundreds of thousands of square feet; you can only guess at the time saved if you can direct a worker straight to the package instead of having them search all over the place. It has been demonstrated that vision picking using smart glasses can improve efficiency by as much as 25 percent over regular hand picking. In regular hand picking, workers search for and pick packages in the warehouse using a paper list and a handheld scanner. In vision picking, they wear smart glasses that give them visual instructions via AR on what to pick and where to place the goods.

Now look at the next stage: RFID tags are prevalent and cost no more than a few cents. Having an RFID tag on each package, combined with an RFID reader and visual instructions on the glasses, provides a winning solution.
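To make the idea concrete, here is a minimal sketch in Python of how a vision-picking loop might tie an RFID scan to an AR prompt on the glasses. The pick list, tag IDs, and function names are invented for illustration; they are not any vendor’s actual API.

```python
from dataclasses import dataclass

@dataclass
class PickTask:
    tag_id: str    # RFID tag expected on the package
    aisle: str     # where the worker is directed to find it
    bin_slot: str  # where the worker should place it

# Hypothetical pick list downloaded from a warehouse management system.
PICK_LIST = {
    "E200-3412": PickTask("E200-3412", "Aisle 14", "Cart slot B"),
    "E200-9981": PickTask("E200-9981", "Aisle 02", "Cart slot A"),
}

def show_on_glasses(message: str) -> None:
    """Stand-in for the AR display call; here we simply print."""
    print(f"[GLASSES] {message}")

def handle_rfid_scan(tag_id: str) -> None:
    """Called whenever the wearable RFID reader sees a tag."""
    task = PICK_LIST.get(tag_id)
    if task is None:
        show_on_glasses(f"Tag {tag_id}: not on your pick list, skip it.")
        return
    show_on_glasses(f"Pick confirmed in {task.aisle}: place it in {task.bin_slot}.")
    del PICK_LIST[tag_id]  # mark the task as done

if __name__ == "__main__":
    # Simulated scans as the worker walks the aisle.
    for scanned in ["E200-9981", "E200-0000", "E200-3412"]:
        handle_rfid_scan(scanned)
```

The point of the sketch is the hands-free confirmation loop: the tag read replaces the paper list and handheld scanner, and the glasses close the loop by telling the worker, in their field of view, what to do next.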

Advanced remote technical support is another area with immense potential, where the return on investment can be realized in a matter of days, not months or years. Specialized industries such as oil and gas and aviation face an aging workforce; the aviation industry, for instance, is estimated to be losing a large number of its experts to retirement. These experts cannot travel to remote locations to troubleshoot and fix problems. This is where the glasses come in handy. If there is an equipment breakdown or a technical glitch, a specialist at a remote location can assess the situation through the AR devices worn by workers at the site and help them resolve the issue using AR features such as voice instructions and images. This equips regular employees in the field to undertake assembly and repair tasks that would otherwise require specific training effort and time.
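As a rough illustration of how such a session might be wired, the Python sketch below shows the kind of annotation payload a remote expert’s console could push to a field worker’s glasses. The message fields and names are assumptions made up for this example; real products define their own protocols.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ExpertAnnotation:
    """One instruction pushed from the expert console to the glasses."""
    session_id: str            # identifies the support session
    step: int                  # position in the repair procedure
    text: str                  # instruction spoken and overlaid on screen
    highlight_xy: tuple        # normalized screen position to circle
    reference_image: str       # id of a diagram to overlay (hypothetical)

def encode_for_glasses(note: ExpertAnnotation) -> bytes:
    """Serialize the annotation as JSON for the device-side app."""
    return json.dumps(asdict(note)).encode("utf-8")

if __name__ == "__main__":
    note = ExpertAnnotation(
        session_id="pump-37-outage",
        step=2,
        text="Loosen the left retaining bolt a quarter turn.",
        highlight_xy=(0.42, 0.61),
        reference_image="manuals/pump37/fig12",
    )
    print(encode_for_glasses(note))
```

The value is in the pairing: the expert sees the worker’s live view, and the worker sees the expert’s instructions anchored to the equipment in front of them, with both hands free.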

These are just a few examples. The possibilities offered by augmented, virtual, and mixed reality are countless, and we are just seeing the tip of the iceberg!


*Published on embedded.com