Trustless. Quantum Computing, Blockchain and the End of a Promise
Quantum Computing · Cryptography · Bitcoin · Power
In thirty-seven days — from March 2 to April 8, 2026 — four separate events changed the coordinates of the problem. Google Quantum AI redefined the technical timeline for breaking Bitcoin and Ethereum’s cryptography with quantum hardware. The post-quantum standards ratified by NIST carry the imprint of IBM, which also sells the services to implement them. Blockchain wallets with public keys exposed for sixteen years are the most vulnerable targets. The New York Times identified the possible creator of Bitcoin as the CEO of a company that monetizes that ecosystem. This article reads these facts as parts of the same power structure.
Bitcoin was designed to make trust in intermediaries unnecessary: banks, states, institutions. The guarantee was mathematical — the cryptography protecting every wallet makes it impossible to trace someone’s private key from their public key, unless you have hardware resources that no single actor can concentrate. On March 30, 2026, Google Quantum AI published the calculations that shift that threshold: within minutes, using machines within reach of a state or a large corporation — exactly the actors Bitcoin was born to circumvent.
March 30 is a moment of discontinuity for reasons that go beyond the technical number. In the same week Google publishes the quantum computing whitepaper, NIST has already ratified post-quantum standards written by IBM, intelligence agencies have been operating for years with storage infrastructures designed for data that cannot be decrypted today, and 1.1 million original Bitcoin lie in blockchain addresses with public keys visible to anyone for sixteen years — the format most vulnerable to a quantum attack.
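Why the address format matters can be shown in a few lines. The sketch below is illustrative, not real Bitcoin serialization: it contrasts the earliest output type, which puts the public key on-chain in the clear, with the later hashed type. Real Bitcoin uses HASH160 (RIPEMD-160 of SHA-256); plain SHA-256 stands in here so the sketch runs anywhere, and the key bytes are a placeholder.

```python
import hashlib

# P2PK  (pay-to-public-key, the 2009-era format): the locking script
#        contains the public key itself, exposed from the moment the
#        coins are received.
# P2PKH (pay-to-public-key-hash): the script contains only a one-way
#        digest; the key appears on-chain only when the coins are spent.

pubkey = bytes.fromhex("04" + "11" * 64)  # placeholder 65-byte uncompressed key

def hash_pubkey(pk: bytes) -> bytes:
    # Stand-in for Bitcoin's HASH160 = RIPEMD160(SHA256(pk)).
    return hashlib.sha256(pk).digest()

p2pk_script = pubkey                # public key exposed immediately
p2pkh_script = hash_pubkey(pubkey)  # only a digest exposed until spend

# A quantum attack via Shor's algorithm needs the public key itself;
# against a P2PKH address the attacker sees only a hash until the
# owner spends. Dormant P2PK outputs have no such shield.
print("P2PK exposes key:", p2pk_script == pubkey)    # True
print("P2PKH exposes key:", p2pkh_script == pubkey)  # False
```

This is the asymmetry the article returns to: Satoshi-era coins sit in the exposed format, while coins in hashed formats stay shielded until their owners move them.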

Quantum Computing: the Document and Its Double
On March 2, 2026, four weeks before the Google whitepaper, the Advanced Quantum Technologies Institute distributed a press release via PR Newswire titled “Cybersecurity Apocalypse in 2026.” It announced the JVG algorithm, capable of breaking RSA-2048 — the system that encrypts the majority of internet communications — in eleven hours with fewer than 5,000 qubits. The paper was on Preprints.org, which does not conduct peer review. The lead author has a background in fluid dynamics. The experimental data underpinning the entire projection involve the factorization of five small numbers — 15, 21, 143, 1,363, 67,297. Physicist Scott Aaronson, one of the world’s leading experts in quantum computing, published a detailed critique within forty-eight hours. SecurityWeek had already turned the press release into an article distributed by trade newsletters worldwide.
The relevance of JVG lies in the sequence, not in the technical content: it arrives four weeks early, occupies the field of attention, and sets the emotional register — urgency, apocalypse, imminent crisis — with which non-specialist audiences will read subsequent news. When on March 30 Google publishes a 57-page whitepaper co-signed by the Ethereum Foundation and Stanford, demonstrating the result mathematically without revealing the attack method and in coordination with the US government, that document is read inside a climate someone else has prepared. The technical result is real: the same cryptography protecting Bitcoin can be broken within minutes. On the same day, a second group of researchers — using different technology — reaches the same conclusion: ten days for the same objective. Two groups, two approaches, the same week. Google has not published the instructions for replicating it.
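To put the JVG paper's experimental scale in context: the five demonstration numbers it reports factor instantly by naive classical trial division, which is part of what Aaronson's critique turns on. A minimal sketch:

```python
def trial_factor(n: int) -> tuple[int, int]:
    """Return a nontrivial factor pair of n by naive trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return 1, n  # n is prime

# The five numbers reported as experimental evidence in the JVG paper.
for n in (15, 21, 143, 1363, 67297):
    p, q = trial_factor(n)
    print(f"{n} = {p} x {q}")
# 15 = 3 x 5, 21 = 3 x 7, 143 = 11 x 13, 1363 = 29 x 47, 67297 = 173 x 389

# RSA-2048, by contrast, is a 617-digit modulus; trial division and
# every known classical algorithm are hopeless at that scale. That is
# the gap the paper's eleven-hour projection has to bridge.
```

A laptop finishes all five in well under a millisecond, which is why the projection from these data points to RSA-2048 drew immediate criticism.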
The researchers building the hardware write in the paper: it is conceivable that the first cryptographically relevant quantum computers will be detected on the blockchain before being announced — through anomalous wallet movements, private keys derived from already-exposed public keys, without anyone issuing a press release.
Quantum Computing, Decentralization and Bitcoin’s Physical Constraint
In November 2008, six weeks after the collapse of Lehman Brothers, Satoshi Nakamoto published the Bitcoin whitepaper on a cryptographers’ mailing list. The premise was structural: cryptography guarantees where institutions have failed, because violating it would require computational resources that no single actor can concentrate. Decentralization is not proclaimed — it is built as a distributed physical constraint, which held as long as the hardware needed to break it remained beyond the reach of any single actor. The Google whitepaper changes that condition.
A cryptographically relevant quantum computer operates at 15 millikelvin — colder than interstellar space. IBM had to design a proprietary refrigerator for its own systems because none available on the market was large enough — described internally as a missile silo. A system capable of breaking Bitcoin’s cryptography would require multiple such devices networked together, hundreds of millions of dollars in infrastructure, and materials sourced from very few suppliers worldwide. Only those with billions to invest and access to specific industrial supply chains can build it: the same nation-states, the same technology corporations, the same federally funded laboratories that the 2008 whitepaper cited as the problem to circumvent. A technical proposal to update Bitcoin’s cryptographic signatures is in testing but not yet active. Any modification to the protocol requires consensus from the entire network — a deliberately slow process, designed to prevent any single actor from imposing it. Meanwhile, approximately 6.9 million BTC sit in addresses with public keys already visible on the blockchain.
Bitcoin’s decentralization was founded on a physical constraint: breaking the cryptography cost more than any actor could afford. That constraint depended on the cost of hardware. The cost of quantum hardware is falling, and the concentration required to build it favors exactly the largest actors.

Who Writes the Rules of Post-Quantum Cryptography
On March 25, 2026, five days before the whitepaper, Google announced it was moving up its complete migration to post-quantum cryptography to 2029 — six years ahead of the deadline set by the US government, four years ahead of NIST. That migration will use three standards already ratified by NIST as mandatory for US public administrations. All three derive from algorithms developed by IBM Research. IBM is simultaneously the leading provider of consulting services for the migration toward those same standards. The post-quantum cryptography market is worth $1.68 billion in 2025, projected to reach $30 billion by 2034. The consulting and implementation segment accounts for 63.7% of total revenues.
The NIST process lasted eight years with international public review — more transparent than any of its predecessors. The concentration of results around IBM algorithms describes a structure of incentives that public discourse has not yet formalized: the same actor that won the open technical-regulatory process now sells to the market that process created. The documented precedent is 2013: Snowden’s revelations showed that the NSA had inserted a hidden mechanism into a NIST encryption standard — a random number generator known as Dual_EC_DRBG — that allowed those who knew its parameters to derive future output from past output. No one had checked those parameters for years because they were embedded in a standard. The PQC process has been open; the question is whether openness of process and concentration of results are compatible in the long term.
Quantum Computing and Satoshi’s Wallet on the Blockchain
On April 8, 2026, the New York Times published an investigation by John Carreyrou — the journalist who dismantled Theranos — identifying Adam Back as the most credible candidate to be Satoshi Nakamoto. Back is a British cryptographer, inventor of Hashcash — the mechanism that forces computers to consume real energy to validate transactions, cited in the Bitcoin whitepaper as a direct precedent — and current CEO of Blockstream, the company that develops sidechain technology for Bitcoin and monetizes its ecosystem. Stylometric analysis — the study of authorial fingerprints in texts, from rhythm to punctuation to lexical choices — cross-referenced thousands of posts across three cryptographic mailing lists from the 1990s and 2000s using three independent methodologies, which converged on recurring markers: double space between sentences, British spelling, “proof-of-work” written with a hyphen. All three methodologies pointed to Back. Back denies it. No cryptographic proof exists, and by design none can: Satoshi stopped communicating in 2010 without signing anything definitive.
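The markers described above lend themselves to a toy feature extractor. The sketch below is purely illustrative: the function name, the small British-spelling lexicon, and the regexes are our own, not the methodology the investigation actually used, and real stylometry works over thousands of documents and far richer feature sets.

```python
import re

# Small illustrative lexicon of British spellings; a real analysis
# would use a full dictionary and frequency statistics.
BRITISH = {"colour", "favour", "realise", "organise", "analyse", "behaviour"}

def stylometric_markers(text: str) -> dict[str, bool]:
    """Check a text for the three markers named in the article."""
    words = set(re.findall(r"[a-z-]+", text.lower()))
    return {
        # double space after a sentence-ending punctuation mark
        "double_space": bool(re.search(r"[.!?]  \S", text)),
        # any British spelling from the illustrative lexicon
        "british_spelling": bool(words & BRITISH),
        # "proof-of-work" hyphenated rather than spaced
        "hyphenated_pow": "proof-of-work" in text.lower(),
    }

sample = ("The proof-of-work chain is the solution.  "
          "We must analyse the incentive structure.")
print(stylometric_markers(sample))
# {'double_space': True, 'british_spelling': True, 'hyphenated_pow': True}
```

The point of the exercise is that each marker is individually weak; the investigation's claim rests on three independent analyses agreeing, not on any single tell.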
Satoshi’s wallet — approximately 1.1 million BTC identified through the “Patoshi pattern,” a statistical signature in the mining behavior of the earliest blocks, unmoved since 2009 — consists of addresses in an older format in which the public key is visible to anyone consulting the blockchain. Sixteen years of exposure. At current value, over $90 billion. The Google whitepaper indicates that attacks on already-visible, stationary keys are more accessible than attacks on ongoing transactions, because there is no time limit on executing them. The paper introduces the “digital salvage” framework — an analogy with maritime law on shipwrecks — to name the problem the Bitcoin community will have to face: assets that no living holder can move to addresses protected by the new cryptographic standards, but that sufficient hardware could open. The protocol has no mechanism to handle this. The community would have to choose between two equally problematic options: freeze those wallets by collective decision — violating the principle of immutability of cryptographic property — or do nothing and risk someone with quantum hardware emptying them, destabilizing the network. If Adam Back were Satoshi, the person who designed the trustless system would today have a direct financial interest in the ecosystem that system generated, and his wallets would be the target most exposed to the breakdown of the cryptography on which that system is founded.
Quantum Computing and Intelligence: the Collection That Does Not Expire
The “Harvest Now, Decrypt Later” logic is not a prediction: it is a documented practice. The NSA through the MUSCULAR program physically intercepted fiber optic cables between Google and Yahoo data centers, collecting encrypted data in bulk. Britain’s GCHQ through Tempora buffered traffic on transatlantic cables. China’s Ministry of State Security breached the OPM databases in 2015, extracting the classified records of 22 million US federal employees — not to use them immediately, but to archive them in anticipation of hardware sufficient to decrypt them. The NSA’s Utah Data Center, completed in 2014, was designed to store yottabytes — one trillion terabytes. These structures exist, operate, and continue to collect. Cryptographically relevant quantum computing hardware is not yet available. But the collection has already happened, and the data does not expire.
The relationship between these facts is temporal, not speculative: those who built those infrastructures knew in 2014 that quantum hardware was a real, not distant, technological horizon. Bitcoin was founded on the premise that the computational cost of breaking cryptography is prohibitive — that breaking costs enormously more than building. When that ratio shifts, the guarantees that depend on it shift with it: the context in which the system was designed has changed, and the system was not built to adapt to that change.
Massive storage infrastructures were built between 2013 and 2014, when cryptographically relevant quantum hardware was already an identifiable technological horizon. The data collected then has no expiration date. The hardware to decrypt it is what the March 2026 papers describe as buildable.
Sources
- Google Quantum AI — Whitepaper on breaking ECC cryptography, March 30, 2026
- Caltech / Oratomic — Independent paper on neutral atom architecture, March 2026
- NIST — Post-Quantum Cryptography Standardization: ML-DSA, ML-KEM, SLH-DSA
- Bitcoin BIP-360 — Proposal for post-quantum signature upgrade
- New York Times / John Carreyrou — Investigation identifying Adam Back as Satoshi Nakamoto, April 8, 2026
- Scott Aaronson — Critique of the JVG paper / Advanced Quantum Technologies Institute
Post scriptum
StarkNet is the only Layer 2 — a system built on top of Ethereum to increase its speed and capacity — that the Google whitepaper cites as resistant to quantum computing. It is, because its proofs rest on hash functions rather than the elliptic-curve mathematics now at risk — a choice originally made for computational efficiency, not preventive security. No other Layer 2 of comparable scale is in the same position. The rest of the ecosystem depends on a technical proposal that requires consensus from the entire network — the same slow governance mechanism that guarantees censorship resistance — to implement an urgent modification. The system is protected from arbitrary changes because no one can impose them; for the same reason, it cannot be updated quickly when the context changes.
The technical window is not closed. It is narrowing asymmetrically: faster for those holding older-format addresses with keys visible for years, slower for those operating on more recent formats. That asymmetry already produces a concrete risk hierarchy today — between old and new addresses, between those who have already migrated and those who cannot because they no longer control the keys. The effects of the vulnerability precede the actual attack.
