After years of talk and theory-crafting, the hydrogen industry is about to take its first real steps into existence. While there is plenty of enthusiasm in an industry looking to grow by approximately 5,000% before 2030, the complex engineering involved requires careful planning to get right. Luckily for engineers designing the first utility-scale commercial electrolysis plants, a new generation of planning tools has arrived. Electrolyser manufacturer ITM Power plans to build a commercial green hydrogen electrolysis facility with 1GW capacity in Humberside, UK. To create this “Gigastack” plant, larger than any in operation today, the company decided to first create the plant in software.
This type of digital twin allows companies to build and refine planned facilities in software before breaking ground. Designs can be changed more easily, and feedback gathered quickly from the people who will work in the plant, optimising it before construction begins. ITM contracted offshore services supplier Worley to design the plant within facility modelling software Optiplant, made by Aspen Technology. The program then fed data into Nvidia’s Omniverse software, which produced a visualised 3D model for teams to work on collaboratively. Nvidia, best known for its computer hardware manufacturing, launched its Omniverse platform in 2021. We spoke to Marc Spieler, global energy director at Nvidia, about the limits of the technology and the practicality of visualising a new industry.
Creating the digital twin of an unknown industry
Despite the promised benefits, full-scale digital twins remain relatively uncommon. In mature industries, existing knowledge generally provides enough insight to construct new plants with high efficiency. In its infancy, the hydrogen sector cannot draw from a pool of knowledge. Plant designers must create new machinery, and planning software must incorporate these designs.
“We can put stuff into Omniverse in a few different ways,” says Spieler. “One is to create new [assets], one is to take existing models from platforms that can be connected into Omniverse, and the third is… Let’s say the infrastructure is 30 years old, and nobody knows where the original CAD models were. We can go and take LIDAR scans and other types of imagery and use that technology to recreate the model in 3D.
“There’s an ecosystem of partners that do LIDAR scans and others. Depending on the format, you could then go and scan an offshore rig or a manufacturing facility and get a real-time representation using LIDAR. We create it, push it into Omniverse, and start overlaying other factors on top of that. It’s an exciting time.
“Once that virtual world is created, then we can add more and more AI on top of that world, and try things out.”
The practical ends of the Omniverse
Omniverse is not itself a digital design tool. Its capabilities depend on the design programs that feed it data, with Omniverse interpreting those inputs simultaneously and visualising them to allow feedback from staff. As such, use of Omniverse relies on existing expertise in other programs, though the platform itself also requires some training. In the case of the Gigastack project, Worley used Optiplant.
Spieler says: “We’re not asking people to go and learn a new piece of software. What the Omniverse does is it takes the pixels coming out of Optiplant and basically recreates it into the Omniverse, and it does the same with other software stacks. This is what’s so great about this; you don’t have to have everybody using the exact same software stack in order to collaborate and see what other people are working on.
“If you’re a customer and you just want to see how your project is coming along, as opposed to sitting down with 10 different [pieces of] software and opening them all up to look at these things, you can view them all together already as people continue to work.
“Instead of monitoring for maintenance, we can create simulated data”
Omniverse can also change the simulated conditions of the site, testing how the plant functions in a variety of circumstances. This will allow new factories to build resilience against the effects of climate change, as extreme highs and lows in temperature occur more regularly. Since the Covid-19 pandemic, site managers have stepped up their monitoring of biosecurity, which site planning software now seeks to build in.
With control of environmental variables, Omniverse can allow site managers to simulate emergencies without lost production time. Optimising emergency procedures will still require drills with staff, but digital twins can simulate equipment until its virtual failure.
Spieler says: “Weather is going to be a huge one, right? You know, as we start to see major swings in weather patterns, how can we anticipate the effects of those weather patterns, especially on renewables? Sun, cloud cover, wind speed, hurricanes… Eventually, for offshore safety and resiliency, when do you shut stuff down? When do you not?
“Things like vegetation management, too, are going to affect grid infrastructure. A huge area of focus for me right now is how we create a more resilient and reliable grid. Wildfires and transmission and distribution lines can be affected by these things.
“If we can create a digital world, an Omniverse, or a metaverse of these things, and then basically apply the data we know, [such as] the speed at which trees grow, the rainfall and all these things, and start to anticipate the rate at which trees are going to grow, we can then simulate it, and basically come up with stronger vegetation management plans.
“We can create synthetic data, and then train it so that, as opposed to monitoring production for certain factors, managers have already seen what a failure would look like in a simulated world. Then we can apply that to the real world.”
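The workflow Spieler describes — generate labelled failure data in simulation, then tune a detector before any real failure occurs — can be sketched in a few lines. Everything below is illustrative: the sensor values, the drift model and the threshold are invented for the example, and none of it reflects Omniverse or any Nvidia API.

```python
import random

def synthetic_readings(n, failing=False, seed=None):
    """Generate n synthetic temperature readings for one piece of equipment.

    Normal operation hovers around 60 degrees with small noise; a simulated
    fault adds a steadily growing drift. (Hypothetical values for
    illustration only.)
    """
    rng = random.Random(seed)
    readings = []
    for t in range(n):
        value = 60.0 + rng.gauss(0, 0.5)
        if failing:
            value += 0.05 * t  # drift upwards as the simulated fault develops
        readings.append(value)
    return readings

def looks_like_failure(readings, threshold=2.0):
    """A toy detector: flag a failure if the mean of the last quarter of
    readings sits well above the mean of the first quarter."""
    q = len(readings) // 4
    early = sum(readings[:q]) / q
    late = sum(readings[-q:]) / q
    return (late - early) > threshold

# Tune and sanity-check the detector entirely on simulated data,
# before any real equipment has failed.
healthy = synthetic_readings(200, failing=False, seed=1)
faulty = synthetic_readings(200, failing=True, seed=1)
print(looks_like_failure(healthy), looks_like_failure(faulty))  # → False True
```

The point of the sketch mirrors the quote: the detector has "already seen" a failure in simulation, so it can be validated before being pointed at live production data.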
How detailed can a digital twin simulation go?
Spieler continues: “Say we have a wooden pole that’s holding power lines. We’ve seen this before, and it typically will fail in [for example] three to six months. So those are the types of things that there’s a lot of data out there. Typically, getting through all that data, then putting it in a visual manner and basically simulating the physics on top of it just hasn’t been possible.
“We’re entering a world where that’s all going to be possible moving forward.
“We’re working with a client right now who has gone out and collected data, and now they’re going through and labelling it. Well, that’s a very lengthy process.
“But using our technology, you can actually give characteristics to these power lines including the different flaws in wood, and then eventually create synthetic data that shows slight variations, to the point where now you can start to create a world in which these flaws actually occur. You may be able to train datasets on them.”
This level of mathematical simulation requires several technologies that promise business leaders the world but, at this moment, only deliver a buzz. Accurately computing the physical reactions of real objects requires colossal volumes of information on the formation of those objects: big data. Calculating the interactions of objects, particularly involving fluids, remains an active area of research among mathematicians.
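To see why this is computationally heavy, consider one of the simplest possible physics models: one-dimensional heat diffusion solved with an explicit finite-difference scheme. This toy sketch (unrelated to any Omniverse solver) already requires one update per grid point per timestep; real 3D fluid simulations scale far more steeply with resolution, which is why plant-scale physics has traditionally demanded supercomputers.

```python
def diffuse_1d(u, alpha=0.1, steps=100):
    """One-dimensional heat diffusion via an explicit finite-difference
    scheme: each step, u[i] += alpha * (u[i-1] - 2*u[i] + u[i+1]).
    Endpoints are held fixed as boundary conditions. Cost is
    len(u) * steps updates even in this toy case."""
    u = list(u)
    for _ in range(steps):
        nxt = u[:]
        for i in range(1, len(u) - 1):
            nxt[i] = u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
        u = nxt
    return u

# A hot spot in the middle of a cold rod gradually spreads out:
# the peak temperature drops as heat diffuses towards the ends.
rod = [0.0] * 21
rod[10] = 100.0
print(max(diffuse_1d(rod)))
```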
Oddly enough, makers of computer-generated imagery also take a keen interest in the maths behind correctly modelling different materials. Animated film studio Pixar supposedly takes at least 30 minutes to render a single frame of one of its films, after years of process optimisation. This process uses supercomputers or cloud computing services to handle the weight of calculations required. But doing these calculations will become easier with the dawn of another digital disruptor: quantum computing.
The images created by Omniverse are not Pixar quality. However, they provide a flexible shared viewpoint, and may offer much more in future.
“In theory, you could simulate the future of your plant”
However, Spieler believes that the sky is the limit for modelling and predicting system difficulties. He says: “It’s as comprehensive as you choose to make it. We’ve got AI databases that can look at tonnes of vegetation, and within seconds tell you what kind of tree it sees. Those aren’t our databases, those are public. The point of Omniverse is to pull from those databases and model those more comprehensively altogether.
“You know how you can go to Google Earth, and go back and look at the past, but you can’t go look at the future? In theory, you could simulate what the future is going to look like.
“Today, it’s done more as data science. But eventually, you will take that data science, and you will push it into a digital twin where, when you see those trees, you can actually start to grow those trees at their realistic rate.
“Because you know, an aspen, or an oak, or a maple tree is going to grow at this rate in this location with this much moisture and this much rainfall. You can look at the statistics and anticipate what it will be. And you could then in theory, using things like our generative adversarial network technology [a type of algorithm designed to constantly improve], draw that tree growing and growing, as it would in the future.”
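The projection Spieler outlines can be approximated with something as simple as a logistic growth curve, a standard form in forestry modelling. The parameters below — maximum height, growth rate, midpoint year and a 12m clearance line — are hypothetical placeholders for illustration, not real species data and not Nvidia’s method.

```python
import math

def tree_height(t_years, max_height, growth_rate, t_mid):
    """Logistic growth curve: height approaches max_height, growing
    fastest around year t_mid. All parameters are illustrative."""
    return max_height / (1 + math.exp(-growth_rate * (t_years - t_mid)))

# Project an assumed oak-like curve forward and flag the years in which
# it would breach a hypothetical 12m clearance under overhead lines.
for year in range(0, 60, 10):
    h = tree_height(year, max_height=20.0, growth_rate=0.15, t_mid=25)
    flag = " <- exceeds 12m clearance" if h > 12.0 else ""
    print(f"year {year:2d}: {h:5.1f} m{flag}")
```

In a vegetation management plan, a curve like this would be fitted per species and location from the public datasets Spieler mentions, rather than fixed by hand.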
The Omniverse technology forms part of the system behind Nvidia’s “Earth 2” model, announced in November last year. This aims to create a digital twin of the entire planet, helping to model weather patterns and climate change scenarios, but also encourage advancement in realistic simulation software.
Spieler continues: “In the work that we’re doing with Earth 2, it is going to be huge for helping to predict and simulate future activities. Once again, it’s all going to come down to how much data you can process in the time that you have to look at it.
“If you need to know about the state of your operations tomorrow from data you gathered today, you’re going to need to either limit the data you use or get a large system. But individual companies and people will make those decisions for themselves.”