Safeguarding cryptocurrency by disclosing quantum vulnerabilities responsibly

83 points
2 days ago
by madars

Comments


spr-alex

Beware the Ides of March: this is one of two cryptographic doom papers released this week. This Google paper with Babbush, Gidney, and Boneh is authoritative. And we also have another with Preskill and Hsin-Yuan Huang (widely cited for classical shadows, among other quantum work) among the authors: https://arxiv.org/pdf/2603.28627

"Here, by leveraging advances in high-rate quantum error-correcting codes, efficient logical instruction sets, and circuit design, we show that Shor’s algorithm can be executed at cryptographically relevant scales with as few as 10,000 reconfigurable atomic qubits. "

That's physical, not logical qubits.

2 days ago

newpavlov

Have they factored 21 yet? [0] IMO most of us can ignore such pieces until a practical factorization of arbitrary 32-bit integers is demonstrated on a QC. And even after this "easy" milestone is achieved, I think it will be at least a decade until QCs become a practical cryptographic threat. And that's generously assuming that Moore-like scaling is possible for QCs.

[0]: https://algassert.com/post/2500

2 days ago

spr-alex

My read of this post is that there's nuance here, and that by the time we see 32-bit integers being factored, the roadmap to 256-bit integers can be counted in months on ten fingers rather than being a decade out. The underlying scaling needed to go to 32 bits requires only linear progress to get to 256.

a day ago

newpavlov

>The underlying scaling needed to go to 32 bit requires only linear progress to get to 256

Nope. Firstly, for RSA you need to scale from 32 to 4096 bits. Secondly, Shor requires on the order of N^2 * log(N) quantum gates, where N is the number of bits in the integer, so the scaling is superquadratic. And it's very much an open question whether QEC protocols will continue to work with the same efficiency at the required scales.
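For a rough sense of what that superquadratic growth means in practice, here is a back-of-the-envelope sketch (asymptotic N^2 * log N only; constant factors, circuit depth, and QEC overhead are all ignored):

```python
import math

def shor_gate_estimate(n_bits: int) -> float:
    """Asymptotic gate-count scaling for Shor's algorithm, ~ N^2 * log2(N).
    Constant factors are ignored; only relative growth is meaningful."""
    return n_bits ** 2 * math.log2(n_bits)

base = shor_gate_estimate(32)
for n in (32, 256, 4096):
    print(f"{n:>5}-bit modulus: ~{shor_gate_estimate(n) / base:,.0f}x the gates of 32-bit")
# 256-bit is ~102x a 32-bit instance; 4096-bit (RSA-scale) is ~39,000x.
```

So even on this crude model, going from a 32-bit instance to RSA-scale moduli is a four-orders-of-magnitude jump in gate count, before any error-correction overhead.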

a day ago

spr-alex

We are talking about different things. My claim is that there is only linear engineering progress needed to get from 32 bits to 256 bits being factored.

If we want to talk RSA, the engineering journey from factoring 21 to 35 is big, because it requires creating logical qubits with error rates that we are only now seeing companies report. But the engineering journey from 32 bits that are tolerant enough to run a factoring algorithm to doing the same with 4096 appears linear in engineering cost; that is what I am claiming.

For RSA specifically, the resource estimates have come down. I am not yet up to date on this round of papers, but the 2024 result https://eprint.iacr.org/2024/222 had it down to n/2 + o(n) logical qubits.

a day ago

newpavlov

>it requires creating logical qubits with error rates that we are only now seeing companies report

And yet 21 has not been factored on real hardware.

>My claim is that there is only linear engineering progress needed to get from 32 bits to 256 bits being factored.

IMO it's a very bold claim until linear progress is demonstrated between 8, 16, and 32 bits. Not in theoretical papers. On real hardware. With honest experiments using arbitrary integers.

It's easy to claim "QC will repeat Moore's law!" especially when your salary depends on it, but the practical evidence is quite lacking at the moment.

a day ago

spr-alex

So, once again, since I think I am not explaining it well: it might take a long time to go from factoring 21 to 35, and a long time from 35 to anything bigger, but from that point on the engineering has scaled up to the point that further progress is very sudden. So if the canary in the coal mine is a 32-bit integer being factored, then the runway for deploying fixes is terminally short for defenders.

a day ago

pseudohadamard

  And yet 21 was not factored on a real hardware.

Yes it was, they used a VIC-20. Also an abacus. Not to mention a barking dog. https://eprint.iacr.org/2025/1237
8 hours ago

FrasiertheLion

It's unfortunate that we're past the point where all quantum computing progress is public. Between this and the unbearable secrecy of AI labs, balkanization of knowledge is in full force.

2 days ago

PowerElectronix

I think the incentive to share progress is still orders of magnitude higher than the incentive to keep it private.

2 days ago

Jarwain

I agree, but I do feel like there's a bit of a lag. The shape of this lag has changed over time, and maybe we're in an era where the lag is growing

a day ago

ta988

You are assuming they have things to hide about QC...

2 days ago

DoctorOetker

> [...] including transitioning blockchains to post-quantum cryptography (PQC), which is resistant to quantum attacks.

PQC is not defined as "being resistant to quantum attacks", nor does it necessarily have this property: PQC is just cryptography for which no quantum attack is known yet (even when, for example, no one has tried to design a quantum computation to break it). One cannot demonstrate that a specific PQC algorithm is resistant to quantum attacks; it is merely presumed to be until proven otherwise.

2 days ago

tromp

I think that "having no known quantum attack" is a reasonable interpretation of "quantum resistant". If there were no possible "quantum attack" (under appropriate complexity assumptions, such as EC-DLP not being in P), then we could call it "quantum proof" instead of quantum resistant.

2 days ago

DoctorOetker

I understand what you mean, but I think such a concept or definition would be highly misleading: "having no known quantum attack" means every novel encryption method would be automatically "quantum resistant" for having had 0 adversarial attempts to find quantum or even classical weaknesses!

There should be some measure of competence-adjusted man-hours of cryptographers and mathematicians trying to swing their favorite hammers at the problem, in order to estimate this "quantum resilience".

19 hours ago

defrost

In minutes, on a single computer, for example, is the lowest bar.

* https://mathematical-research-institute.sydney.edu.au/quantu...

* https://magma.maths.usyd.edu.au/magma/

Props to John Cannon, George Havas, Charles Leedham-Green, et al.

18 hours ago

nadis

> "Quantum computers promise to solve otherwise impossible problems, including examples in chemistry, drug discovery, and energy. However, large-scale cryptographically relevant quantum computers (CRQCs) will also be able to break current, widely used public-key cryptography that protects things like people’s confidential information. Governments and others, including Google, have been preparing for this security challenge for many years. With continued scientific and technological progress, CRQCs are getting closer to reality, requiring a transition to PQC, which is why we recently introduced our 2029 migration timeline."

Is this as wild news as I think it is? I'm surprised I haven't yet seen more reactions to the 2029 migration timeline plan (proposed?).

a day ago

blitzar

If I find a cryptocurrency vulnerability, I am reallocating (the blockchain never lies) as much of it as I can and cashing it out.

It's the only responsible thing to do.

2 days ago

kevmo314

If someone else finds a cryptocurrency vulnerability, they too will reallocate as much of your allocation as they can and cash it out.

2 days ago

blitzar

A fool and their money are easily parted.

2 days ago

bdangubic

often, a mind capable of doing something like this is not the kind that gives a lot of sh*t about things like "money" so I would put a chance of your statement being true at ... 12.78% :)

a day ago

84adam

I have compiled some notes about these recent announcements here, as it pertains to ECC and Bitcoin.

Note: There are specific address types that are safer to use for long-term storage than others, such as Native SegWit addresses starting with `bc1q`. The newest Taproot type, starting with `bc1p`, is insecure against a quantum attacker because it directly encodes a "tweaked" public key into the address.

https://bc1984.com/quantum-feasibility

a day ago

xnx

A few blog posts from Google's Quantum team recently make it seem they are confidently on the path to cracking traditional cryptography. Real Setec Astronomy stuff.

21 hours ago

vessenes

I haven't seriously looked at Bitcoin's PQ plan for a couple of years, so I might be (I am almost certainly) out of date, but my recollection is that there's a "pre working attack" phase required, in which everyone basically signs a new PQ secure address, and a cutoff date.

This would leave holders who did not sign in two categories:

1) If you never sent a tx with an address, then you did not reveal your public key, and have some safety, e.g. you could do the PQ signature, wait, and be fine.

2) If you did, then you revealed your public key, and didn't bother to make the cutoff, and well, too bad.

There was a bunch of frankly dumb analysis about how long this would take the chain to process and how expensive it would be, assuming that miners would all continue to enforce 10-minute blocks and transaction fees for these signature txs. I would be very surprised if the mining industry shot itself in the foot like that. The actual time to process 200M or so new signatures just isn't that long. Hey, we could do it on Solana if we needed to. That said, I imagine the papers this week, plus Google moving up its timeline, mean that there will be a concerted effort in Bitcoin land to get a real process down and tested in the next couple of years. Pretty cool.

Finally, I've read very little analysis about whether or not miners would choose to continue the energy-dependent nature of mining, or try to move on. I think this is a pretty interesting economic question; I'm looking forward to finding out the answer. I expect mining will have a longer lead time than the signature problem: we're a long way from being able to run Grover against SHA-256, as far as I know. And even then you still have 128 bits of security to deal with ONCE you get an equivalent amount of Grover-capable quantum compute out to the current ASIC ecosystem.
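For a sense of scale on that Grover point, a back-of-the-envelope sketch (this assumes an idealized quadratic speedup and ignores error-correction overhead entirely):

```python
# Classical preimage search on SHA-256: ~2^256 hash evaluations.
# Ideal Grover search: ~sqrt(2^256) = 2^128 oracle calls.
classical_bits = 256
grover_bits = classical_bits // 2   # quadratic speedup halves the exponent

# Even at a wildly optimistic 10^9 Grover iterations per second:
seconds = 2 ** grover_bits / 1e9
years = seconds / (60 * 60 * 24 * 365.25)
print(f"Grover still needs 2^{grover_bits} oracle calls, ~{years:.1e} years at 1 GHz")
# ~1.1e+22 years: the remaining 128 bits are far out of reach on their own.
```

The point being that Grover only halves the security exponent, so SHA-256 mining keeps a 128-bit cushion even against an ideal quantum searcher.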

2 days ago

cinquemb

Pretty much all of quantum control right now is based on the idea that qubits are fragile things that have to be corrected, but that's because poor assumptions for quantum control are used (SO(3) precession). And even when they are treated as open quantum systems (as everything naturally is, even at 10 mK and 10^-11 Torr), things like Lindblad master equations are used, which rest on the Born-Markov assumption that the environment is a memoryless bath. When one stops using these poor assumptions and treats the system as a dynamical object with natural states of stability that don't need to be actively corrected, these crypto-breaking alarms are going to seem very tame.

This also has implications for a lot of PQC and QKD work that's based on static model assumptions...

a day ago

dandanua

Why do they care about cryptocurrencies but not about the entire world's infrastructure based on RSA and elliptic-curve algorithms, such as HTTPS and many other electronic-signature solutions? Is this a case of cryptocurrency market manipulation?

And why do they think that the US government would care about securing cryptocurrencies? Aren't they designed to circumvent government regulation?

2 days ago

FrasiertheLion

Yes, they absolutely care and have been doing serious work to migrate PKI to PQC.

This was the first of several articles coming out of Google: https://blog.google/innovation-and-ai/technology/safety-secu...

And the timeline for web migration is 2027 Q1: https://security.googleblog.com/2026/02/cultivating-robust-a...

And this was Sophie Schmieg’s talk at a cryptography conference this month (they lead PQC migration efforts at Google), tracking migration efforts and urging folks to prioritize signature migrations in light of accelerated quantum timelines: https://westerbaan.name/~bas/rwpqc2026/sophie.pdf

a day ago

vessenes

> Is this market manipulation?

No

> why do they think that the US government would care about securing cryptocurrencies?

Our largest institutions manage tens of billions of dollars in cryptocurrency and the US government has designated currencies appropriate for the strategic crypto reserve

> Why do they [not care] about the entire world's infrastructures that are based on RSA and elliptic curve algorithms, such as HTTPS

I'm sure they do. But if you had a working quantum computer that could a) get Satoshi's keys or b) read some emails, most people would choose door a first. So it's both a smoke test and a high-value target with an easy-to-assess dollar value.

2 days ago

dandanua

I'm also sure that someone at Google does care about those. It is just strange to see a blog post targeting cryptocurrencies when this is clearly a specific case of a much larger problem.

2 days ago

dbdr

For one thing, stablecoin issuers hold more than $100B of US treasury bills, on the same level as some major countries. For better or worse, the old and new systems are interconnected now.

https://www.brookings.edu/articles/the-rise-of-stablecoins-a...

2 days ago

seanhunter

$100B sounds like a lot of money to any sane human being, but for the T-Bill market it's really a drop in the ocean. The current T-Bill market cap[1] is $29 trillion, give or take a little, so $100B is about 35bps of the total. It would nudge the market a little bit, but not that much.
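As a quick check of that basis-point figure (using the same round numbers quoted above):

```python
stablecoin_tbills = 100e9    # ~$100B of T-bills held by stablecoin issuers
tbill_market_cap = 29e12     # ~$29T total T-bill market cap
bps = stablecoin_tbills / tbill_market_cap * 1e4
print(f"~{bps:.1f} bps of the T-bill market")
# ~34.5 bps: a rounding error for the market as a whole.
```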

[1] Here's my source and they should of course know https://fred.stlouisfed.org/series/MVMTD027MNFRBDAL

2 days ago

cinquemb

Citibank has a good report: https://www.citigroup.com/rcs/citigpa/storage/public/Citi_In...

Tradfi has way more at risk... and so does the hardware/software that can't be upgraded that the financial system uses every day...

a day ago

throawayonthe

https://security.googleblog.com/2026/02/cultivating-robust-a...

I think Google is just a disgustingly large company lol; it's hard to talk about them "caring" about one thing but not another.

2 days ago