True Full Lifecycle Management
As we’ve written about in the past (see PTC’s Journey From CAD/CAM Pioneer to Enabler of the ‘as-a-Service Economy’), PTC has evolved from introducing the first CAD system with parametric modeling to offering a full suite of solutions that leverage that core CAD design data throughout the full lifecycle of products. This includes mechanical, electrical, and software design tools; PLM, ALM, and systems engineering capabilities; a full suite of service solutions (parts planning and optimization, service knowledge management, network management, field service, warranty management, and more); and tools to publish technical information for use throughout the lifecycle, all ultimately enabling new service-based business models for manufacturers. Recently PTC has made heavy investments in the Internet of Things, with acquisitions of ThingWorx (IoT development platform), Axeda (IoT device cloud), ColdLight (IoT machine learning), and now Vuforia.
Where Augmented Reality Fits In
So what does PTC intend to do with augmented reality technology? For many years, PTC has been able to take CAD data and create step-by-step visual 3D instructions: a 3D graphical ‘show me’ video rendering that is far superior to written text instructions for physical tasks like repairing a machine. You can view a short example of visual instructions here. More recently, PTC has been touting the concept of a digital twin, using augmented reality to provide dashboards overlaid on a machine and ‘x-ray vision’ to see the current conditions inside the machine (based on sensor readings), the internal structure and parts within the machine (based on the 3D CAD model and ‘as-serviced’ BOM), and which parts will need service or replacement soon (based on predictive analytics).

That is powerful stuff, and it pre-dated (but presaged) the Vuforia acquisition. It was easy to imagine how Vuforia could be used to superimpose those visual instructions and indicators onto a physical machine via a tablet, phone, or optical head-mounted display (such as ODG R7, Google Glass, and others). In fact, those were the kinds of applications that PTC demonstrated during ThingEvent.
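To make the digital twin overlay idea more concrete, here is a minimal sketch, purely illustrative and not based on PTC’s actual data model, of how the three kinds of data mentioned above (live sensor readings, the CAD model and as-serviced BOM, and predictive analytics) might be combined into one payload that an AR client renders over a specific machine. All type names and the endpoint are hypothetical.

```typescript
// Illustrative only: a hypothetical data model, not PTC's actual schema.
// It shows how sensor data, CAD/BOM structure, and predictions could be
// merged into a single "digital twin" payload for an AR overlay.

interface SensorReading {
  sensorId: string;
  label: string;        // e.g. "Coolant Temperature"
  value: number;
  unit: string;
  timestamp: string;    // ISO 8601
}

interface BomPart {
  partNumber: string;
  description: string;
  cadModelUri: string;  // reference to the 3D geometry used for the "x-ray" view
  children?: BomPart[];
}

interface ServicePrediction {
  partNumber: string;
  failureProbability: number;  // 0..1, from a predictive-analytics service
  recommendedAction: string;   // e.g. "Replace within 30 days"
}

interface DigitalTwinOverlay {
  assetSerialNumber: string;
  dashboard: SensorReading[];       // rendered as gauges floating near the machine
  structure: BomPart;               // as-serviced BOM tree for the x-ray view
  predictions: ServicePrediction[]; // highlights parts needing attention soon
}

// A client would fetch this payload for one specific, identified machine
// and render each section as a separate AR layer. Hypothetical endpoint.
async function loadOverlay(serial: string): Promise<DigitalTwinOverlay> {
  const res = await fetch(`https://example-iot-platform/twin/${serial}/overlay`);
  return res.json() as Promise<DigitalTwinOverlay>;
}
```

The point is simply that the overlay is assembled from several back-end systems for one identified machine, which is exactly what the demos below illustrate.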
KTM Motorcycle Demo — Diagnosis and Repair by a Non-Expert Technician
The first demo at ThingEvent showed diagnosis and repair of a motorcycle by a less experienced technician. It involved an actual KTM motorcycle in a mock repair shop and was in part intended to show how augmented reality can help with the shortage of experienced mechanics. One of the presenters, ‘Simon H.’, role-played a new technician. He demonstrated the technician’s app (jointly developed by KTM and PTC using ThingWorx) on an iPad, with a dashboard listing the motorcycles to be repaired.

He tapped the first one on the list, which brought up a screen with detailed information about the bike. Since the bike is smart and connected,1 the app let him run diagnostics on it automatically, which reported the problem: a bad Lambda Probe.


Next he went over and pointed the iPad’s camera at a VuMark™ symbol (more on this below) on the side of the motorcycle, which identifies this specific bike. Once the software on the iPad recognized the mark, it superimposed a green hexagon around it, making clear that it had identified this item.
From there, the AR magic happened. The app not only showed the technician where the Lambda Probe is, but also provided step-by-step visual instructions superimposed on the image of the bike in the camera. It showed each step needed to replace the probe, such as the first step shown below: removing the screw holding the panel in front of the probe.

Note: these are screenshots from the actual live demo at ThingEvent. They are examples of what the technician sees on their iPad in real time, with the augmented reality superimposed on the objects in the iPad’s camera view. This is one of Vuforia’s major capabilities: keeping the superimposed 3D images properly aligned and in sync with the real-world 3D scene from the camera as the user moves around. This particular demo is right up PTC’s alley, bringing together CAD, service, and IoT information within an AR interface.
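As a rough illustration of that alignment capability, the sketch below shows, in generic TypeScript that is not Vuforia’s actual API, the essential loop: each camera frame, the tracker reports the pose of the identified target, and the app re-applies that pose to the overlay geometry so the instructions stay pinned to the physical object as the user moves.

```typescript
// Simplified illustration of AR overlay alignment; all names are hypothetical,
// not Vuforia's real API. The key idea: every frame, take the tracked pose
// of the identified target and apply it to the virtual content.

type Matrix4 = number[]; // 4x4 column-major transform, length 16

interface TrackedTarget {
  id: string;                    // e.g. the VuMark's decoded instance ID
  visible: boolean;
  poseCameraFromTarget: Matrix4; // where the target sits relative to the camera
}

interface OverlayNode {
  setTransform(pose: Matrix4): void; // positions the 3D instruction geometry
  setVisible(visible: boolean): void;
}

// Called once per camera frame by the rendering loop.
function updateOverlay(target: TrackedTarget, overlay: OverlayNode): void {
  if (!target.visible) {
    // Lost tracking: hide the instructions rather than let them drift.
    overlay.setVisible(false);
    return;
  }
  overlay.setVisible(true);
  // Re-anchoring every frame is what keeps the virtual step-by-step
  // instructions locked onto the physical machine as the user moves.
  overlay.setTransform(target.poseCameraFromTarget);
}
```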
Sysmex — Saving a Trip by the Field Service Technician
The next demo was by Sysmex, a manufacturer of laboratory testing reagents and equipment, such as blood analyzers. They developed an iPad application called MySysmex that provides a dashboard showing all the analyzers in the lab. It allows the end user, a lab technician, to perform maintenance that might otherwise require a service technician to come onsite. In this demo, the lab technician saw that one of the machines was clogged and the automated cleaning cycle had failed, so he needed to do a manual cleaning. He scanned the VuMark and then saw step-by-step visual instructions overlaid on the machine: first the power button flashed to show he needed to power off the machine, then the app showed an animation of each step, removing the cover, removing a plate underneath, and the actual motions to make with the cleaning wand to properly clean it. The image below shows the actual machine in the camera view, with the visual instructions for removing the cover superimposed.

Schneider Electric Micro Data Center — Intuitive AR-based UX
The final demo was by Schneider Electric, showing one of their micro data centers, which could be remotely configured and monitored, pulling together information from multiple systems (PTC, ServiceMax, Salesforce.com) to show warranty and service history, and more. Someone2 physically at the system scanned the VuMark, and the app immediately superimposed a virtual dashboard with battery level, temperature, power used, and a status indicator showing that some service was required.

In this case, the battery needed to be replaced, and again visual instructions appeared showing the exact sequence of steps: an animation of each step as the various covers and screws were removed, showing the motion of each part. It was very intuitive and language-independent.
Addressing the Broader Universe of Things and AR Applications
These demos showed the kinds of service applications that are in PTC’s wheelhouse, leveraging CAD, PLM, and IoT data, the ThingWorx platform, and Vuforia AR. PTC could have been satisfied just creating these apps with their customers. But they stepped back to look more generally at where the merging of IoT, enterprise data, and AR is going. If you stop to think about how many different types of things there are in the world, each with its own set of apps,3 there are potentially many millions of apps required to interface with all those different things. PTC realized there must be a better way than the standard app store model to find and launch the right app for a specific thing that you encounter.
The bold idea PTC introduced at ThingEvent is encapsulated in what they are calling ‘Project ThingX’ (short for Thing Experience), which includes the concepts of the ThingBrowser and ThingServer. ThingBrowser is a standard browser (it supports HTML) extended to support augmented reality via a new markup language dubbed “Thing Markup Language” (TML). TML is used to identify the thing, bring together all the data, and then marry and synchronize the appropriate augmented reality UI to be superimposed on the image from the camera — or superimposed on the reality the user is seeing directly via a heads-up display (HUD).
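TML has not been publicly specified, so the following is only a guess at the kind of flow PTC described, sketched in TypeScript with entirely hypothetical names and endpoints: the browser decodes a mark into a thing identity, asks a ThingServer-style service which experience applies to that thing (and to the user’s role), and then loads and renders the AR UI it gets back.

```typescript
// Hypothetical sketch of a ThingBrowser-style resolution flow.
// None of these types or endpoints are part of PTC's actual ThingX design;
// they only illustrate the concept of "scan a thing, get its experience."

interface ThingIdentity {
  thingType: string;      // e.g. a product model identifier
  instanceId: string;     // e.g. a serial number decoded from a VuMark
}

interface ThingExperience {
  title: string;
  markupUrl: string;      // URL of the AR experience definition (a TML-like document)
  dataSources: string[];  // IoT, PLM, and service systems the experience pulls from
}

// Step 1: decode whatever mark the camera saw into a thing identity.
// (Provided elsewhere; declared here only to keep the sketch self-contained.)
declare function decodeMark(imageFrame: ImageData): ThingIdentity | null;

// Step 2: ask a server which experience applies to this thing and this user.
async function resolveExperience(
  id: ThingIdentity,
  userRole: string,
): Promise<ThingExperience> {
  const res = await fetch(
    `https://example-thingserver/resolve?type=${encodeURIComponent(id.thingType)}` +
      `&instance=${encodeURIComponent(id.instanceId)}&role=${encodeURIComponent(userRole)}`,
  );
  return res.json() as Promise<ThingExperience>;
}

// Step 3: the browser fetches the experience markup and renders its AR UI,
// which is the part a markup language like TML would standardize.
```

The appeal of this model is that no one has to find and install a separate app per thing; identification plus a resolution service does the matchmaking.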
Identifying Things
A key piece of this vision is that things need to be automatically identified. For many applications this means identifying not just what type of thing you are seeing, but which specific instance of that type of thing it is (e.g. conceptually, or in some cases literally, the serial number of the thing). PTC already understood that things need a way to identify themselves, in order to get the specific CAD, manufacturing, service, and sensor/IoT data related to that specific thing. In earlier (pre-Vuforia) demos, they scanned a barcode to identify a specific thing.
During ThingEvent, PTC highlighted the use of VuMark, an approach that Vuforia has pioneered. It has some aesthetic advantages over a barcode, because the identifying information can be flexibly incorporated into an icon or graphic mark. But VuMarks are certainly not the only way a ThingBrowser might identify things. A truly open ThingBrowser would have an open and extensible way to add any conceivable identification methodology. Examples might include traditional 1D or 2D barcodes, RFID (enabling non-line-of-sight identification), facial recognition, and possibly optical object recognition,4 to name just a few.
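One way to picture that kind of extensibility, as a hypothetical sketch rather than anything PTC has announced, is a small plug-in interface that lets any identification technology (VuMark, barcode, RFID, facial or object recognition) register with the browser and return the same kind of thing identity.

```typescript
// Hypothetical plug-in interface for identification methods; illustrative only.

interface ThingIdentity {
  thingType: string;
  instanceId?: string; // some methods (e.g. object recognition) may only yield the type
}

interface IdentificationMethod {
  name: string;
  // Returns an identity if this method recognizes something, otherwise null.
  identify(sensorInput: unknown): Promise<ThingIdentity | null>;
}

class IdentifierRegistry {
  private methods: IdentificationMethod[] = [];

  register(method: IdentificationMethod): void {
    this.methods.push(method);
  }

  // Try each registered method until one produces an identity.
  async identify(sensorInput: unknown): Promise<ThingIdentity | null> {
    for (const method of this.methods) {
      const identity = await method.identify(sensorInput);
      if (identity) return identity;
    }
    return null;
  }
}

// Usage idea: a browser could ship with a VuMark reader and let others add
// barcode, RFID, or recognition-based identifiers later, e.g.:
// registry.register(vuMarkReader); registry.register(rfidReader);
```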
ThingBrowser’s Immense Potential Beyond PTC’s Core Domain
This is a big, bold idea: ThingBrowser, ThingServer, and TML forming an open, extensible, general-purpose AR platform that lets things self-identify and automatically bring up the appropriate AR application. There is a virtually limitless set of applications one can imagine using the ThingBrowser paradigm (see sidebar). Of course, PTC did not invent augmented reality. But they have proposed a scalable approach to creating a universe of new AR applications, enabling a whole new way for people to interact with the world, especially when combined with advances in optical head-mounted displays.
Impact of ThingX for PTC — Questions Remain
Time will tell how this will all play out. Certainly the framework and direction PTC laid out at ThingEvent represent another significant step in their journey as a design-centric company extending the use of design data out into the physical world. What is less certain is how far PTC wants to take this into the much larger and more diverse set of potential future uses of a ThingBrowser outside of PTC’s core focus, as suggested above.
Many questions remain for me, although PTC may have good answers to some of them. Is PTC’s introduction of the ThingBrowser meant just to get things going, or do they want to be in the browser business for the long haul?6 What is the commercial value for PTC of owning a browser business? A direct revenue contribution is unlikely, since the expectation is that browsers are free; however, it could open up many possibilities for them. Will they offer an open API to add virtually any type of identification methodology to ThingBrowser? Will TML be submitted to the W3C to become a truly open standard managed by an independent body? Where will PTC draw the boundaries on the kinds and sources of information they incorporate into the ThingBrowser?
A Valuable and Important Initiative for PTC
Regardless of the answers to those questions, ThingX is a major and valuable initiative for PTC. It enables them to take their vision of app-building scalability to the next level with AR-oriented apps. It will help their customers in many ways, such as guiding the end user through tasks that previously required bringing an expensive, skilled field technician on site. Or speeding up diagnostic and repair tasks while reducing errors. Or significantly reducing dependence on a shrinking pool of highly skilled technicians. Or providing highly extensible, upgradeable, and customizable dashboards and controls on their products without requiring physical displays or controls, which add expense and parts that can fail. This also makes products extremely configurable: the same base model can be turned into many different models with different capabilities just through software, and the UX can be customized as easily as that of a software application. In short, whether or not PTC taps into the much broader universe of possibilities that ThingBrowser could enable, there is plenty of room for them to use the ThingX initiative and new AR capabilities to do great things and realize value within their own domain and areas of focus.
________________________________________________
1 I talked to the KTM folks after the demo about how the bike is connected. For this demo, they used a direct Bluetooth connection, but stressed it was only a prototype. They still have work to do before these capabilities are commercialized and were not sure yet which communications technology would be used to connect the bike to the network.
2 In this case, it was Mike Campbell, PTC’s EVP, Digital Twin, role-playing a technician.
3 A single type of thing would often have several different apps associated with it, depending on the role and context of the person interacting with it. For example, a washing machine might have an assembly-instructions augmented reality application for the worker building the machine on the manufacturing line, a marketing/information-oriented AR application for the consumer in the store who is trying to make a purchasing decision, an installation AR app for the crew delivering and installing the washing machine, a use/monitor app for the owner using the washing machine in the home, a diagnose/service/repair app for the service technician who is coming in to repair it, and so forth.
4 Facial recognition and optical object recognition require sophisticated video analytics. We’ve seen significant advances in those areas recently and expect to see more. Object recognition algorithms might tell you the type (i.e. make and model), but not identify which specific instance (e.g. serial number) of that type of object it is.
5 The same app might be used voluntarily by citizens who wish to help, sending an alert to the police when the citizen drives by a stolen vehicle and warning the citizen not to approach that vehicle.
6 I suspect they will decide questions like this over time, as they see how things evolve.