The Apollo 13 mission launched in April 1970 with high hopes of achieving the third lunar landing. Two days into the mission, disaster struck: an oxygen tank exploded, threatening not only the mission but also the lives of the three astronauts aboard. On the ground in Houston and elsewhere, NASA engineers scrambled for a solution.

Brilliance often strikes in the heat of the moment, and this proved true when engineers creatively constructed a ground-based replica of the damaged component, using only parts they knew were available to the crew. With this replica, they ran a series of tests to identify the best possible approach before sharing the final solution with the crew in space. This method of using a replica, or “twin,” to solve a difficult problem has since evolved into a purely software-based concept known as the “Digital Twin.”

Technology has, of course, come a long way since the days of Apollo 13, and we can now build accurate digital models that represent a physical object, complex system, building, service or even a person. These models, or simulations, are created from volumes of data collected from real-world assets, reflecting how they operate under various conditions and stages across the overall lifecycle. The end result is a “living” digital simulation that continuously improves as it ingests more data from its physical counterpart. This continuous improvement, also called tuning, requires a deep understanding of both the physics and the data structures of the underlying asset (or system) being replicated. Getting a digital twin to actually mirror the physical asset is a tricky business, often demanding world-class domain experience and a bit of patience.
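To make the idea of tuning concrete, here is a minimal sketch of what it can look like in practice. The model, parameter names and telemetry values are all hypothetical: a toy one-parameter thermal model is calibrated by grid search until its output best matches sensor readings from the physical asset.

```python
# Minimal sketch of digital twin "tuning": fit a model parameter so the
# simulation tracks observed telemetry. Model and data are hypothetical.

def simulate_temperature(ambient, load, heat_coeff):
    """Toy physics model: steady-state temperature rise under load."""
    return ambient + heat_coeff * load

def tune(observations, ambient=20.0):
    """Grid-search the heat coefficient that best matches sensor data."""
    best_coeff, best_err = None, float("inf")
    for coeff in [c / 100 for c in range(1, 201)]:  # try 0.01 .. 2.00
        err = sum(
            (simulate_temperature(ambient, load, coeff) - measured) ** 2
            for load, measured in observations
        )
        if err < best_err:
            best_coeff, best_err = coeff, err
    return best_coeff

# Telemetry pairs of (load_kw, measured_temp_c) from the physical asset.
telemetry = [(10, 25.1), (20, 30.2), (40, 39.8), (80, 60.1)]
coeff = tune(telemetry)
```

In a real twin the model would encode far richer physics and the fit would use proper optimization rather than a grid search, but the loop is the same: simulate, compare against telemetry, adjust.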

Once the digital twin is created and properly tuned, it can be used for a variety of purposes, including prototyping a new product, connecting disparate systems, or testing fixes and upgrades. Because digital twins simulate the physical asset, they can be used to reduce downtime by helping operators gain a better understanding of which components are at risk of failure and when best to perform maintenance, thus extending an asset’s life by minimizing degradation. The key value digital twins provide lies not only in seeing what’s happening with the asset, but in testing “what if” scenarios to better understand how it performs in different conditions and states.
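A “what if” sweep can be sketched in a few lines. The scenarios, operating limit and model below are invented purely for illustration:

```python
# Hypothetical "what if" sweep: run a tuned twin model across operating
# conditions and flag those projected to exceed a temperature limit.

def simulate_temperature(ambient, load, heat_coeff=0.5):
    """Toy tuned model: steady-state temperature rise under load."""
    return ambient + heat_coeff * load

TEMP_LIMIT_C = 85.0  # illustrative operating limit

scenarios = [
    {"name": "nominal",   "ambient": 20, "load": 60},
    {"name": "heat wave", "ambient": 45, "load": 60},
    {"name": "overload",  "ambient": 20, "load": 140},
]

results = {}
for s in scenarios:
    temp = simulate_temperature(s["ambient"], s["load"])
    results[s["name"]] = temp
    status = "OK" if temp <= TEMP_LIMIT_C else "AT RISK"
    print(f'{s["name"]}: {temp:.1f} C -> {status}')
```

Running conditions the physical machine has never seen is exactly what a twin makes cheap: no hardware is put at risk by the overload case.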

Although digital twins have only become a popular and accepted technology in the last several years, companies are staking high hopes on their predictive capability. Gartner has named digital twins a top 10 Strategic Technology Trend. In addition, IDC predicts that by 2020, 30 percent of Global 2000 companies will be using data from digital twins of connected assets to improve product innovation success rates and organizational productivity, achieving gains of up to 25 percent.

Digital twins are brought to life using a number of underlying technologies, all of which have reached a point of maturity in the past three years that makes them cost-effective and scalable. The first of these is ready access to data from three primary sources: geometry data (digital drawings and schematics of assets), simulation data (computational models) and telemetry data (sensor data). Newer industrial machinery, for example, is now designed with robust, high-fidelity instrumentation that gathers millions of data points in a 24-hour cycle, ranging from discrete measurements such as temperature to highly sampled continuous data such as vibration.
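As a rough illustration of how those three sources might sit together in a single record, here is one possible shape; the schema and field names are purely hypothetical:

```python
# Illustrative-only schema tying together the three data sources a twin
# draws on: geometry, simulation and telemetry.
from dataclasses import dataclass, field

@dataclass
class DigitalTwinRecord:
    asset_id: str
    geometry: dict    # drawing/schematic metadata, e.g. CAD file references
    simulation: dict  # computational model parameters and versions
    telemetry: list = field(default_factory=list)  # (timestamp, sensor, value)

record = DigitalTwinRecord(
    asset_id="turbine-007",
    geometry={"cad_file": "turbine_007.step", "revision": "C"},
    simulation={"model": "thermal-v2", "heat_coeff": 0.5},
)
record.telemetry.append(("2019-01-01T00:00:00Z", "vibration_mm_s", 2.3))
```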

The ability to compute and store enormous amounts of data in the cloud at low cost is also driving adoption of digital twins. Standing up on-site servers and storage to run complex simulations such as digital twins once required infrastructure investments that made them primarily an R&D activity, even for most Fortune 1000 companies. Now, with the cloud, data can be safely ingested, aggregated, managed and orchestrated (computed) at a scale that was not economically feasible five years ago.

Machine learning and advanced analytics further inform the latest generation of digital twins. Smart algorithms can be applied to your data to drive downstream systems that offer real-world benefits and practical insights, such as inventory management for complex systems. The power of these algorithms lies in the well-organized data behind them, often sourced from disparate systems, many of which were previously isolated “data islands.” With better data access, machine learning has more raw material to drive impacts that businesses can feel.
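As a small example of the kind of algorithm that becomes possible once telemetry is well organized, here is a simple statistical screen that flags outlier sensor readings. The data and threshold are made up, and production systems would use far more sophisticated models:

```python
# Basic z-score screen over sensor readings: flag values that sit far
# from the mean relative to the spread of the data.
from statistics import mean, stdev

def anomalies(readings, z_threshold):
    """Return indices of readings whose z-score exceeds the threshold."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings)
            if abs(r - mu) / sigma > z_threshold]

# Hypothetical vibration readings (mm/s); the last one is an outlier.
vibration = [1.0, 1.1, 0.9, 1.05, 0.95, 1.02, 0.98, 1.03, 0.97, 8.0]
flagged = anomalies(vibration, z_threshold=2.5)
```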

What are some real-world use cases for digital twins? Big machines such as aircraft engines, power generation turbines, grid-scale batteries and locomotives are all commonly maintained and perfected with the help of digital twins. Engineers can test their designs in a variety of conditions and environments before deciding on the best configurations to use when optimizing the physical machine. 

Once sufficiently tuned, digital twins can even offer the possibility of predicting when a machine or part will fail. Predictive twins are difficult to build, however, and generally should not be attempted during the first phase of a digital twin effort. When properly tuned, predictive twins can offer a disruptive competitive advantage to industrial enterprises by shaving significant costs from their operations through improved efficiency, more easily managed maintenance windows and reduced downtime.
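A highly simplified version of the idea behind a predictive twin: fit a trend to a degradation signal and extrapolate to a failure threshold. The signal, units and threshold below are invented, and real predictive models are far richer than a straight line:

```python
# Toy remaining-useful-life estimate: least-squares linear fit to a
# degradation signal, extrapolated to a failure threshold.

def predict_failure_cycle(cycles, wear, threshold):
    """Return the cycle at which wear is projected to reach the
    threshold, or None if the signal is not trending upward."""
    n = len(cycles)
    mean_x = sum(cycles) / n
    mean_y = sum(wear) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(cycles, wear))
             / sum((x - mean_x) ** 2 for x in cycles))
    if slope <= 0:
        return None
    intercept = mean_y - slope * mean_x
    return (threshold - intercept) / slope

# Hypothetical wear measurements (e.g. blade erosion in mm) per cycle.
cycles = [0, 100, 200, 300, 400]
wear   = [0.02, 0.11, 0.21, 0.30, 0.41]
eta = predict_failure_cycle(cycles, wear, threshold=1.0)
```

The value of a number like `eta` is operational: maintenance can be scheduled before the projected failure cycle rather than after a breakdown.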

What solutions does Peaxy offer?

Peaxy built one of the first combined physics and economic digital twins in 2016 for a Fortune 10 company. That effort delivered a digital twin of a complex system, including gas turbines, steam turbines, steam generation and power generation equipment all faithfully modeled and validated. While that first effort spanned over 200 days, we can now create twins in as little as 65 days, at a fraction of the cost.   

Peaxy offers predictive asset management for industrial equipment that turns operational data into insights that optimize productivity. Our Peaxy Lifecycle Intelligence (PLI) product is a modular, scalable, cloud-based asset management solution aligned with the needs of the value-driven enterprise. By rapidly turning operating data into financial insights, PLI lets operators minimize O&M costs and optimize performance, improving the lifetime value of industrial assets. PLI serves a wide range of use cases involving precision-engineered equipment, grid-scale battery installations, aviation components, gas turbines, steam turbines, wind turbines, compressors and propulsion systems.