Defining Digital Twin
Perhaps the most ubiquitous term in the industry today is “digital twin.” Nearly every vendor in the industry is offering their version of a digital twin. The annual growth rate for the digital twin market is approximately 40%, and the market is expected to reach $160 billion by 2030.
Some attributes of the offerings being promoted include:
- software simulation of industrial processes
- connectivity of smart sensors to systems that can analyze the data
- engineering design tools that show the impact of design choices
- digital representations of data for real-time visualization
- process models that adapt based on real-time feedback
Given these disparate attributes, are we talking about the same thing?
We might be. “Digital twin” does not refer to a specific application; instead, it is an inclusive term that encompasses many of them.
The TAPPI Industry 4.0 Lexicon (TIP 1103-04) defines it as “Digital data that represents the physical process.”
This is an extensive scope that can include:
- a dataset of process data
- a process graphic in a SCADA/DCS HMI
- a digital simulation of a process
- process engineering tools using first-principles or data-driven models
This is also why the Lexicon states that a digital twin is “A comprehensive term, which applies to any digital representation of a physical thing.” Arriving at this definition, however, required consulting several authorities.
Industry 4.0 Talk with Lisa Seacat DeLuca
In developing the Industry 4.0 Lexicon, I had the privilege of speaking with Lisa Seacat DeLuca, who led IBM’s digital twin development and deployment. She defined a digital twin as “A digital representation of a physical thing.” IBM considers the term to be a broad envelope that encompasses several applications, including designations such as:
- Simulation Twin
- Autonomous Twin
- Compliance Twin
- Asset Twin
- Operational Twin
- Engineering / Design Twin
- Maintenance Twin
Another perspective on the digital twin is the formal definition given by ISO: “A digital representation of an observable manufacturing element with a means to enable convergence between the element and its digital representation at an appropriate rate of synchronization” (ISO, 2020). Let’s break this definition into parts to clarify the possible intent.
The first part of this definition seems to pertain to sensors on a manufacturing element, such as a paper machine, connected to devices that convert the signals to digital representations of the temperature, flow, motor status, or other associated process measurements. The second part of this definition talks about converging the manufacturing element (a paper machine) with its digital representation. This can be done by giving context to the digital data. Assigning a tag name and description is the first step; these tags can then be organized into a historian asset tree, arranged on a process graphic, and built into SCADA/DCS logic.
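As an illustration, the layering of context described above can be sketched with a simple nested-dictionary asset tree. The tag names, descriptions, and hierarchy here are invented for the example, not taken from any real historian:

```python
# A minimal sketch of adding context to raw process data, using
# hypothetical tag names and a nested-dict "asset tree".

# First layer of context: each raw tag gets a name and description.
tags = {
    "PM1_HDR_TEMP": {"description": "PM1 headbox temperature", "units": "degC"},
    "PM1_STOCK_FLOW": {"description": "PM1 stock flow to headbox", "units": "L/min"},
    "PM1_DRIVE_STATUS": {"description": "PM1 main drive motor status", "units": "on/off"},
}

# Second layer of context: the tags are organized by where they live
# on the machine, mirroring a historian-style asset tree.
asset_tree = {
    "Mill": {
        "PaperMachine1": {
            "Headbox": ["PM1_HDR_TEMP", "PM1_STOCK_FLOW"],
            "DriveSection": ["PM1_DRIVE_STATUS"],
        }
    }
}

def tags_for(node):
    """Recursively collect the tag names under an asset-tree node."""
    if isinstance(node, list):       # a leaf holds a list of tag names
        return list(node)
    collected = []
    for child in node.values():      # a branch holds sub-assets
        collected.extend(tags_for(child))
    return collected

print(tags_for(asset_tree["Mill"]["PaperMachine1"]["Headbox"]))
```

The point of the structure is that a question like “which measurements belong to the headbox?” becomes a lookup rather than a search through flat tag lists.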
Lastly, the third part of this definition mentions synchronization. Essentially, the system described is a real-time system, with the digital representation mirroring the manufacturing element as closely as possible. Putting it all together, this definition describes the SCADA/DCS systems we have had for many years. While this understanding of a digital twin may be new in the consumer space, it is nothing new in the industrial realm we work in daily.
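The synchronization idea can be sketched in a few lines. In this sketch, `read_sensors()` is a hypothetical stand-in for a real SCADA/DCS or historian interface, and the fixed polling interval stands in for the “appropriate rate of synchronization”:

```python
import time

def read_sensors():
    """Hypothetical placeholder for reading live field instrumentation."""
    return {"headbox_temp_degC": 48.7, "stock_flow_lpm": 1210.0}

class DigitalRepresentation:
    """A digital state that converges with the physical element by polling."""
    def __init__(self):
        self.state = {}        # mirrored process values
        self.last_sync = None  # timestamp of the most recent update

    def sync(self):
        """One synchronization cycle: pull the latest values into the twin."""
        self.state.update(read_sensors())
        self.last_sync = time.time()

def run(twin, cycles, interval_s):
    """Synchronize repeatedly at a fixed rate."""
    for _ in range(cycles):
        twin.sync()
        time.sleep(interval_s)

twin = DigitalRepresentation()
run(twin, cycles=3, interval_s=0.01)
print(twin.state)
```

In a real deployment the polling interval would be chosen against the process dynamics; a slowly drifting temperature needs far less frequent synchronization than a fast drive loop.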
The IoT and Digital Twins
Confusion arises, then, when the consumer space is mixed with the industrial environment. In the consumer space, there has been a revolutionary idea of making “dumb” things smart so they can be digitally represented and connected in the cloud. This revolution is called the Internet of Things (IoT). In this space, the ISO definition above explains what is happening with smart cars, refrigerators, and even entire houses. Within our industrial environment, we have a parallel revolution called the Industrial Internet of Things (IIoT). A key difference from the consumer space is that our revolution began with smart sensors already in place. Further, while our environment has been digital for a long time, our revolution is about connectivity: we have now begun using the internet and cloud resources in industrial applications.
Therefore, digital twins are not new to us. Whether we are talking about digital process control systems or process simulation, those digital twins existed long before the Industry 4.0 era.
What’s new with Digital Twins for the future?
So, is there anything new here for us?
Russell Rhinehart has penned several articles I have found helpful in describing the nuance. In November 2021, in a three-part series in CONTROL magazine, Russ described the application of dynamic process models and how digital twins differ. The difference boils down to adaptation. The process models can be any kind that makes sense: first-principles, polynomial, neural network, or other. Training a model can require a lot of data, and much of the work is pre-processing that data so you feed the model valuable information. As we all know, garbage in equals garbage out; we do not want to feed a bunch of noise into a model and train it to learn the noise. That is why, in part 3 of his series, Rhinehart emphasizes the challenge of using online data to adapt a model. If no one is carefully analyzing and pre-processing the data before it is fed into model training, a process model can degrade and become dangerously wrong.
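A minimal sketch of that pre-processing step follows, with invented data. It screens gross outliers using a median/MAD rule (one common robust choice, not Rhinehart's specific method) before fitting a simple straight-line model, so a single sensor spike does not corrupt the adaptation:

```python
import statistics

# Invented process samples (x, y); the 40.0 reading is a sensor spike.
raw = [(0, 1.0), (1, 2.1), (2, 2.9), (3, 40.0),
       (4, 5.1), (5, 6.0), (6, 6.9), (7, 8.1)]

def screen_outliers(points, k=3.0):
    """Drop points far from the median, using the MAD as a robust spread."""
    ys = [y for _, y in points]
    med = statistics.median(ys)
    mad = statistics.median(abs(y - med) for y in ys)
    sigma = 1.4826 * mad  # MAD-to-sigma scaling for roughly normal data
    return [(x, y) for x, y in points if abs(y - med) <= k * sigma]

def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

clean = screen_outliers(raw)
a, b = fit_line(clean)
print(f"kept {len(clean)} of {len(raw)} points; slope = {a:.2f}")
```

A plain standard-deviation screen would fail here, because the spike inflates the standard deviation enough to hide itself; the MAD is barely affected by it, which is exactly why robust statistics are favored for online data screening.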
The takeaway is that when someone suggests an investment in a digital twin, you should find someone you trust to guide you through discerning the definition being used and addressing the challenges of that particular implementation. As stated previously, investments in the digital twin market are huge and growing rapidly. To ensure your investment delivers the desired results, Pulmac can be your partner for a successful deployment.