Manuel Terranova, CEO of Peaxy, recently spoke on a panel for the American Society of Naval Engineers (ASNE) in Washington, D.C. The panel covered digital twins, with the goal of giving the audience actionable ideas on how to deploy a twin, including the sensors, analytics, and underlying infrastructure needed to support integration.

Here are some highlights from the discussion:

  • Digital twins run the gamut from high fidelity to low fidelity. As compound simulations, they are created from volumes of data collected from real-world assets, reflecting how those assets operate under various conditions and at various stages of the overall lifecycle. This includes ambient data that affects the asset, such as temperature and relative humidity, and operational data such as torque, vibration and round-trip efficiency. The end result is a “living” digital simulation that continuously improves as it ingests more data from its physical counterpart; assembling the full data set can take years.
  • Data access is key to successful implementations. Twins demand a robust “data substrate” that allows for access and aggregation (see the sketch after this list). This includes unfettered access to the “unstructured crown jewels”: geometry, telemetry and simulation data sets. For a high-fidelity twin, it is critical that aggregation spans organizational boundaries and covers the full lifecycle of the asset, from initial design drawings to baseline test results, fault codes and full maintenance records.
  • Standardization of digital twins requires owning not only the machine but also the data. I believe asset owners in general need to be more assertive about owning their maintenance data and about actively breaking down barriers to sharing that data with any third parties involved. Software providers, for example, can be guilty of obfuscating simulation files by withholding the encoding schema, simply because they can.
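
To make the idea of a data substrate more concrete, here is a minimal, hypothetical sketch in Python of how one asset's lifecycle data might be aggregated in a single structure. All class names, fields and units are illustrative assumptions for this post, not an actual product schema:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class AmbientReading:
    """Ambient conditions that affect the asset (hypothetical fields)."""
    timestamp: datetime
    temperature_c: float
    relative_humidity_pct: float


@dataclass
class OperationalReading:
    """Operational telemetry from the asset (hypothetical fields)."""
    timestamp: datetime
    torque_nm: float
    vibration_mm_s: float
    round_trip_efficiency_pct: float


@dataclass
class AssetDataSubstrate:
    """Aggregates lifecycle data for one physical asset across organizational boundaries."""
    asset_id: str
    design_drawings: List[str] = field(default_factory=list)       # URIs to geometry files
    baseline_test_results: List[str] = field(default_factory=list)  # URIs to test reports
    fault_codes: List[str] = field(default_factory=list)
    maintenance_records: List[str] = field(default_factory=list)
    ambient: List[AmbientReading] = field(default_factory=list)
    operational: List[OperationalReading] = field(default_factory=list)

    def ingest_ambient(self, reading: AmbientReading) -> None:
        # The twin "improves" as it keeps ingesting data from its physical counterpart.
        self.ambient.append(reading)

    def ingest_operational(self, reading: OperationalReading) -> None:
        self.operational.append(reading)


# Example usage: a twin of a hypothetical pump accumulating telemetry over time.
pump = AssetDataSubstrate(asset_id="pump-001", design_drawings=["s3://drawings/pump-001.step"])
pump.ingest_ambient(AmbientReading(datetime.now(), temperature_c=28.5, relative_humidity_pct=61.0))
pump.ingest_operational(OperationalReading(datetime.now(), torque_nm=140.0,
                                           vibration_mm_s=2.3, round_trip_efficiency_pct=87.5))
```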