In collaboration with the HPE Data Science Institute, Numerical Algorithms Group (NAG) delivered a presentation entitled, ‘Optimizing Seismic in the Cloud for Performance and Cost.’
Branden Moore, HPC and benchmarking manager at NAG, presented strategies for optimizing seismic workloads and explained how each choice affects performance, time to solution, and cost, as well as how these strategies are shaping technology and research.
“Companies are moving their workload to the cloud,” Moore said.
Cloud computing appears to be the next major shift in computation, and many organizations are gravitating toward it. Moore introduced the audience to Reverse Time Migration (RTM), a seismic imaging code used to reconstruct an image of the Earth's subsurface from recorded seismic pressure data.
He explained the large volumes of data RTM requires, both input and temporary, and noted that an RTM workload can be moved to the cloud when needed. The largest sources of cost in RTM are the time-stepping of the wave equations and the imaging condition; the computation is expensive, and adding higher-fidelity physics increases both the computation and the price.
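To give a sense of why time-stepping dominates RTM cost, here is a minimal sketch (not NAG's code) of one such kernel: a 1D acoustic wave equation advanced with second-order finite differences. Grid size, spacing, time step, and the velocity model are illustrative assumptions; a production RTM runs a far larger 3D version of this loop many thousands of times per shot.

```python
import numpy as np

def time_step(p_prev, p_curr, vel, dt, dx):
    """Advance the pressure field one step:
    p_next = 2*p - p_prev + (v*dt/dx)^2 * laplacian(p)."""
    lap = np.zeros_like(p_curr)
    # Second-order centered difference for the interior points
    lap[1:-1] = p_curr[2:] - 2.0 * p_curr[1:-1] + p_curr[:-2]
    return 2.0 * p_curr - p_prev + (vel * dt / dx) ** 2 * lap

n = 201
dx, dt = 5.0, 0.0005            # 5 m grid, 0.5 ms step (v*dt/dx = 0.3, stable)
vel = np.full(n, 3000.0)        # assumed constant 3000 m/s velocity model
p_prev = np.zeros(n)
p_curr = np.zeros(n)
p_curr[n // 2] = 1.0            # impulsive source at the grid center

for _ in range(100):            # forward-propagate 100 time steps
    p_prev, p_curr = p_curr, time_step(p_prev, p_curr, vel, dt, dx)
```

Every grid point is touched at every step, which is why cache behavior and memory bandwidth, not just raw FLOPS, drive the run time of codes like this.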
There are multiple ways to configure the time-stepping computation; optimizing cache utilization and offloading to GPUs are two common tactics. Targeting cost-to-solution is also important: the cost per hour of a resource often matters less than the total cost to finish the job. It is up to the user to decide whether the price is worth the capability of the resources and which compute options best fit their workload.
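The cost-to-solution point can be made with simple arithmetic: a pricier instance can still be the cheaper choice if it finishes the job faster. The instance names, hourly prices, and runtimes below are invented for illustration only.

```python
# Hypothetical instance options: hourly price and time to finish the workload
options = {
    "cpu_node": {"price_per_hr": 1.00, "hours_to_finish": 20.0},
    "gpu_node": {"price_per_hr": 3.00, "hours_to_finish": 4.0},
}

# Cost-to-solution = hourly price * total hours, not the hourly price alone
cost = {name: o["price_per_hr"] * o["hours_to_finish"]
        for name, o in options.items()}
best = min(cost, key=cost.get)
# The GPU node costs 3x more per hour, yet its total cost ($12) beats
# the CPU node ($20) because it completes the workload 5x faster.
```

This is the trade Moore describes: evaluate compute options by what the whole workload costs, not by the sticker price per hour.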
Moore described the range of options the cloud opens up and reinforced the benefits the strategy can bring to a researcher's work.
“The cloud brings to us a ton of different pieces...let your application guide you,” Moore said.