World’s Fastest Supercomputer Conducting ‘Virtual Manhattan Project’ to Test Nukes

The cryptocurrency industry has long feared the day when computers will be able to crack the cryptography securing networks like Bitcoin and Ethereum. That day may be closer than many think, but even at today's record supercomputer speeds, experts say only quantum computers could have that potential.

Last week, researchers from the Lawrence Livermore National Laboratory (LLNL) announced that their latest supercomputer, El Capitan, can perform 2.79 quintillion calculations per second at peak, making it the world's fastest computer.

To grasp the magnitude: that's 2.79 followed by 18 zeroes.

“To put this in some perspective, it would take more than a million of the latest iPhones working on one calculation at the same time to equal what El Capitan can do in one second,” Jeremy Thomas of the Lawrence Livermore National Laboratory told Decrypt. “That’s a stack of phones over 5 miles high.”

The results were announced Monday during the annual SC Conference, an international conference on high-performance computing held in Atlanta, Georgia. El Capitan was ranked number one on the TOP500 project's biannual list of the world's 500 most powerful supercomputers.

Based in Livermore, California, the Lawrence Livermore National Laboratory worked with Hewlett Packard Enterprise, AMD, and the Department of Energy to develop El Capitan.

As the name suggests, a supercomputer is designed for complex tasks such as simulations, developing artificial intelligence, and research, operating far faster than the average computer. For example, a machine like El Capitan performing 2.79 quintillion operations per second is as much as 5.4 million times faster than a typical home computer.

Using the phone analogy, Thomas estimated that it would take more than 8 billion people working simultaneously for eight years to accomplish what El Capitan can do in one second.

El Capitan’s power has raised questions about its potential impact on the blockchain industry, where security relies heavily on strong cryptography. However, blockchain encryption experts argue that such fears are unfounded.

“They would need to brute-force every possible private key,” Arcium CEO and co-founder Yannik Schrade explained to Decrypt. “For instance, if a private key is 256 bits long, an attacker trying to forge your transactions would need to test every possible 256-bit key.”

Launched in 2022, Switzerland-based Arcium focuses on building a decentralized supercomputer network that enables developers to run encrypted computations securely and efficiently on blockchains.

According to Schrade, even running nearly 2,800 quadrillion operations per second, a supercomputer like El Capitan would need at least 10 billion years to brute-force a 256-bit private key. That's because of the computational asymmetry inherent in the cryptographic schemes involved, like the elliptic curve cryptography used by blockchains such as Bitcoin, Ethereum, and Solana.
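The arithmetic behind that claim is easy to check. A rough back-of-envelope sketch, generously assuming that testing a single key costs only one machine operation:

```python
# Back-of-envelope estimate: brute-forcing a 256-bit key
# at El Capitan's peak speed (assumes 1 operation = 1 key tested,
# which is wildly generous to the attacker).
KEY_SPACE = 2 ** 256            # number of possible 256-bit private keys
OPS_PER_SECOND = 2.79e18        # El Capitan's peak operations per second
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

seconds = KEY_SPACE / OPS_PER_SECOND
years = seconds / SECONDS_PER_YEAR

print(f"{years:.2e} years")     # astronomically large, dwarfing the
                                # universe's ~1.4e10-year age
```

Even under these attacker-friendly assumptions, the result comes out to an astronomically large number of years, which is the asymmetry Schrade is pointing to.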

Elliptic curve cryptography (ECC) is popular among blockchain developers for its efficiency and strong security. The larger threat to these cryptographic systems, Schrade explained, comes from quantum computing, which he said has the potential to defeat the computational asymmetry that makes current cryptography secure.

“A supercomputer still processes data in binary, using bits, and its power primarily comes from parallelization—handling many tasks simultaneously,” Schrade said. “Quantum computers, on the other hand, use quantum bits, or qubits, which leverage quantum principles like superposition and entanglement. It’s a different dimension of computing.”
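Schrade's "different dimension" point can be made concrete with rough operation counts. The best known classical attack on a 256-bit elliptic-curve key (Pollard's rho) needs on the order of 2^128 group operations, while Shor's quantum algorithm needs only a number of operations polynomial in the key length. A sketch, where the cubic cost model for Shor is an illustrative assumption rather than a precise resource estimate:

```python
import math

KEY_BITS = 256

# Best known classical attack on ECC (Pollard's rho):
# roughly sqrt(2^n) = 2^(n/2) elliptic-curve group operations.
classical_ops = 2 ** (KEY_BITS // 2)

# Shor's algorithm runs in time polynomial in the key length;
# n^3 is a common textbook cost model (illustrative only).
shor_ops_estimate = KEY_BITS ** 3

print(f"classical (Pollard's rho): {classical_ops:.3e} ops")
print(f"quantum (Shor, ~n^3 model): {shor_ops_estimate:.3e} ops")
print(f"gap: ~10^{int(math.log10(classical_ops / shor_ops_estimate))}")
```

The point is not the exact constants, which depend heavily on hardware and error correction, but the shape of the curves: exponential in the key size classically, polynomial on a sufficiently large quantum computer.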

Other cybersecurity experts and academics shared Schrade’s sentiment around the potential threat of quantum computing.

“The modern encryption systems that underpin blockchains cannot be broken on any current classical system, including supercomputers. However, the threat from emerging quantum computers is far more serious,” Duncan Jones, head of cybersecurity at the UK-based quantum computing developer Quantinuum, told Decrypt. “In as little as a decade, we may see sufficiently powerful quantum computers that can break the algorithms used by blockchains today. Blockchain owners need to deploy quantum security technology ASAP to prepare for this risk.”

However, while supercomputers may not pose a threat to blockchains and practical quantum computers are still years away, experts caution against putting too much faith in the technology's ability to secure itself.

“The blockchain industry often places too much trust in its perceived security, overlooking potential vulnerabilities,” San Jose State University Professor of Engineering Ahmed Banafa told Decrypt. “Many proponents view blockchain’s immutability as synonymous with invulnerability, but this perspective ignores risks posed by advancing technologies like quantum computing and practical issues like software flaws.”

Banafa highlighted that while the blockchain industry recognizes quantum computing’s potential risks, few platforms have adopted quantum-resistant measures like quantum key distribution, leaving them vulnerable to future threats.

“While decentralization is a strength, it also complicates the process of rolling out critical updates to enhance security,” Banafa said. “Without a proactive approach to addressing these challenges, the blockchain industry risks undermining the trust it has built over the years.”

Despite the concerns of the blockchain and cybersecurity industries, the LLNL said El Capitan won't be used to crack blockchain encryption. Instead, it will run nuclear blast simulations and support AI-powered research, including material discovery, design optimization, advanced manufacturing, and intelligent AI assistants trained on classified data.

Although the days of setting off bombs at Los Alamos, New Mexico—like those depicted in the Best Picture Oscar-winning film “Oppenheimer”—are long gone, Thomas acknowledged that virtual tests have limitations, and researchers are not solely relying on simulations for certifying the nuclear stockpile.

“Simulations will never be exact, but we use more and more computing power on ensembles of thousands of simulations to be able to reduce any uncertainty we have in our models,” Thomas said. “We have large efforts around validating and verifying that our codes are accurate. We also have a large amount of data from our underground tests, prior to the end of nuclear testing in 1992, that give us confidence in our codes.”

Edited by Andrew Hayward
