Dr. William Bain, CEO and founder of ScaleOut Software, explains what the future of digital twin technology requires for broad market adoption.
Traditional stream-processing and complex event-processing systems do an excellent job of extracting patterns from incoming telemetry, but they make it challenging and inefficient to track dynamic information about individual data sources. As a result, it can be difficult to fully analyse what incoming telemetry is saying about the state of a live system and take timely, effective action.
That is where the power of real-time digital twins comes in. While digital twin models have been used for decades in product life cycle management to help design new devices, they have only recently been applied to stateful stream-processing. Advances in scalable, in-memory computing, pioneered by my company, ScaleOut Software, have helped make this possible.
We released the ScaleOut Digital Twin Streaming Service™ two years ago to offer a simple, intuitive technique for tracking important, dynamically evolving state information about individual IoT data sources and using that information to implement real-time analytics on incoming telemetry. Per-device state information enables deep introspection for each device within milliseconds and more effective feedback than was previously possible.
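To make the idea concrete, here is a minimal sketch of per-device state tracking in Java. The class and method names are illustrative assumptions, not the streaming service's actual API: each data source gets its own twin object that retains state between messages, so analytics can react to a device's history rather than to isolated readings.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical twin: one instance per data source, holding dynamically
// evolving state alongside the analytics logic that processes telemetry.
class EngineTwin {
    int overheatCount = 0;       // per-device state retained between messages
    double lastTemperature = 0.0;

    // Invoked for each telemetry message from this twin's device.
    String processMessage(double temperature) {
        lastTemperature = temperature;
        if (temperature > 95.0) {
            overheatCount++;
            if (overheatCount >= 3) {
                return "ALERT";  // feedback after repeated anomalies
            }
        } else {
            overheatCount = 0;   // a healthy reading resets the streak
        }
        return "OK";
    }
}

public class TwinDemo {
    // The platform routes each message to its device's twin;
    // a plain map stands in for that routing here.
    static Map<String, EngineTwin> twins = new HashMap<>();

    static String dispatch(String deviceId, double temperature) {
        return twins.computeIfAbsent(deviceId, id -> new EngineTwin())
                    .processMessage(temperature);
    }

    public static void main(String[] args) {
        dispatch("engine-7", 96.1);
        dispatch("engine-7", 97.4);
        System.out.println(dispatch("engine-7", 98.2)); // third consecutive overheat
    }
}
```

Because the streak counter lives in the twin rather than in a shared database, deciding whether three consecutive overheats have occurred requires no query against historical telemetry.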
In the past two years, we have applied the digital twin model to numerous use cases in widely different applications. These include telematics, logistics, physical security alerting, contact tracing, device tracking, industrial control monitoring, cloned license plate detection, and airline system tracking – among others. Building applications for these use cases has demonstrated the power of the digital twin model in creating useful streaming analytics that can scale with ease to track large numbers of data sources. Since its creation, we have continued to enhance this unique platform with new capabilities.
For example, we created a rules engine for implementing the logic within a digital twin so that new models can be created without the need for specialised programming expertise. We then added machine learning to our digital twin models using Microsoft’s ML.NET library. This enables digital twins to look for patterns in telemetry that are difficult for humans to see or define with code. More recently, we integrated our digital twin model with Microsoft’s Azure Digital Twins platform to accelerate real-time processing using our in-memory computing technology while also providing new visualisation and persistence capabilities.
Our digital twin model has also evolved along with developer needs. Recently, I sat down with my team to consider what the future of this technology requires for broad market adoption. We came up with the following:
- Integrate Timers to Improve Digital Twin Alerting: What if digital twins incorporated timers to detect missing or delayed messages from IoT devices? These timers could trigger code execution that signals alerts when needed. This would be essential to identify failed or unreliable devices in live applications, such as smoke detectors and security sensors.
- Enhance Portability with .NET 6 and C#: Linux is the best-known and most widely used open-source operating system, and its popularity has grown steadily for years. Using .NET 6 to build C# digital twin models, developers could simultaneously target both Windows and Linux systems, thereby maximising portability in C# and maintaining parity with Java.
- Allow Aggregate Initialisation of Digital Twin Applications: Our digital twin platform was initially designed to automatically create digital twins when it received telemetry from a new data source (typically, a device). However, our experience has shown us that many applications need to pre-create the full population of digital twin instances to detect missing devices or for other reasons, such as initialising a complex digital-twin hierarchy. What if we were able to create and initialise digital twins using file-based data rather than waiting for incoming telemetry?
- Add Simulation Capabilities: Our experience building applications has shown that it is essential to use simulations for testing and demonstrating the capabilities of digital twins in streaming analytics. In a variety of industries, simulations help us measure how well digital twins provide real-time answers before they are deployed in live systems. The use of simulations would help validate designs and save both time and money on implementations.
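The timer idea in the first item above can be sketched in a few lines of Java. This is a hypothetical illustration, not the platform's API: each twin records when it last heard from its device, and a periodic timer, firing independently of incoming telemetry, flags devices that have gone silent longer than a threshold.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of timer-based alerting for silent devices,
// such as a failed smoke detector that simply stops reporting.
class HeartbeatTwin {
    long lastSeenMillis;

    HeartbeatTwin(long now) { lastSeenMillis = now; }

    // Called whenever this twin's device sends telemetry.
    void onMessage(long now) { lastSeenMillis = now; }

    // Called from a platform timer rather than from incoming telemetry.
    boolean isMissing(long now, long timeoutMillis) {
        return now - lastSeenMillis > timeoutMillis;
    }
}

public class TimerDemo {
    public static void main(String[] args) {
        Map<String, HeartbeatTwin> twins = new HashMap<>();
        twins.put("detector-1", new HeartbeatTwin(0));
        twins.put("detector-2", new HeartbeatTwin(0));
        twins.get("detector-1").onMessage(60_000);  // detector-1 reports at t=60s

        long now = 120_000, timeout = 90_000;       // timer fires at t=120s
        twins.forEach((id, twin) -> {
            if (twin.isMissing(now, timeout)) {
                System.out.println("ALERT: no telemetry from " + id);
            }
        });
        // prints: ALERT: no telemetry from detector-2
    }
}
```

The key point is that the alert originates from the twin's own timer, not from a message, which is precisely what message-driven stream-processing pipelines struggle to express.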
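The last two items above fit together naturally, and a small Java sketch shows how. Assuming a hypothetical device list loaded from a file and a simulated telemetry feed (neither reflects the platform's actual API), pre-creating the full twin population makes devices that never report trivially detectable, and the simulated feed lets the analytics be validated before live deployment.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SimulationDemo {
    // Pre-create a twin entry for every device in the list, apply a
    // simulated telemetry feed, and return the devices that never reported.
    static List<String> findMissing(List<String> deviceList, String[] feed) {
        Map<String, Boolean> reported = new LinkedHashMap<>();
        deviceList.forEach(id -> reported.put(id, false));  // aggregate initialisation
        for (String id : feed) {
            reported.put(id, true);  // twin already exists; just update its state
        }
        List<String> missing = new ArrayList<>();
        reported.forEach((id, seen) -> { if (!seen) missing.add(id); });
        return missing;
    }

    public static void main(String[] args) {
        // Stand-in for file-based data listing every expected device.
        List<String> devices = List.of("truck-1", "truck-2", "truck-3");

        // Simulated feed for testing: truck-3 stays silent.
        String[] simulatedFeed = { "truck-1", "truck-2" };

        System.out.println("Missing: " + findMissing(devices, simulatedFeed));
        // prints: Missing: [truck-3]
    }
}
```

Had the twins been created lazily on first message, truck-3 would have no twin at all, and its silence would be invisible to the analytics.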
I’m happy to say that our team has achieved many of these goals in our recently released Version 2 of the ScaleOut Digital Twin Streaming Service™. This release provides exciting new features for building digital twin models that meet the needs of real-world applications, and we are planning to add support for simulation in an upcoming release. These features have all been driven by requirements that surfaced during application development. This approach matches our design philosophy of starting with a simple, coherent model and then carefully enhancing it as we learn from real-world experience.
After more than two years of building real-world applications with digital twins, we have confirmed the leverage they provide in streaming analytics. Because digital twins bring together telemetry, state information, and application logic for each physical device, they enable deep introspection, track evolving behaviour, and drive effective feedback. By making use of scalable, in-memory computing, the streaming platform accomplishes all this with an unusually simple and efficient programming model that allows applications to focus on implementing analytics code while deferring the challenges of data visualisation and throughput scaling to the platform. This technology can change the way we monitor and interact with large populations of data sources. It has been exciting to watch digital twins address challenges in a diverse set of applications. How will they evolve in the next two years?