Digital-twin technology is attractive to businesses trying to get the most out of their physical assets and increasingly to organizations attempting a systematic study of complex systems, such as smart cities and oil-and-gas supply chains.
The announcement last month of the Digital Twin Consortium is an attempt to make digital-twin technology more powerful and usable than ever before by addressing one of the key problems slowing its development: interoperability. The consortium is an open-standards organization under the auspices of the Object Management Group, backed by Microsoft, Dell, Ansys and Lendlease, among many others.
Digital twins are, put simply, virtual copies of real-world pieces of equipment. The idea is to let the designers, manufacturers and operators of that equipment turn real-world data into accurate predictions and simulations of what might happen in various use cases. Creating a digital twin involves physicists, mathematicians and data scientists modeling how real-world forces affect the equipment being simulated. The systems being twinned can be as simple as a mileage calculator for a car or as complicated as a model of an entire city's traffic patterns.
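To make the "mileage calculator" end of that spectrum concrete, here is a minimal, hypothetical sketch of the pattern: a virtual model that ingests real-world telemetry and turns it into a prediction about the physical asset. Every class name, parameter and number below is illustrative, not part of any consortium standard.

```python
# A toy digital twin of one car's fuel state. Real sensor data updates the
# virtual copy; the copy then simulates how far the real car can still go.
# All names and values are illustrative assumptions.

class CarMileageTwin:
    """Virtual copy of a car's fuel state, fed by trip telemetry."""

    def __init__(self, tank_capacity_l: float):
        self.tank_capacity_l = tank_capacity_l
        self.fuel_l = tank_capacity_l   # assume a full tank at creation
        self.km_per_l = 12.0            # rolling fuel-efficiency estimate

    def ingest(self, distance_km: float, fuel_used_l: float) -> None:
        """Update the twin from a real-world trip reading."""
        self.fuel_l -= fuel_used_l
        # Blend the new observation into the efficiency estimate
        # (simple exponential smoothing).
        observed = distance_km / fuel_used_l
        self.km_per_l = 0.8 * self.km_per_l + 0.2 * observed

    def predicted_range_km(self) -> float:
        """Simulate: how far can the real car go on its remaining fuel?"""
        return self.fuel_l * self.km_per_l


twin = CarMileageTwin(tank_capacity_l=50.0)
twin.ingest(distance_km=120.0, fuel_used_l=10.0)  # telemetry from the road
print(round(twin.predicted_range_km(), 1))        # → 480.0
```

A city-scale traffic twin follows the same loop at vastly larger scale: sensor data flows in, the model's state updates, and operators query the model instead of the physical system.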
At its highest level, digital twinning is a management tool, according to Gartner research analyst and vice president Al Velosa. It abstracts a layer of complexity out of the basic processes of monitoring and managing systems.
“I don’t care how you get the data about that thing or that process, I just want that data so I can make better decisions,” he said.