TFTC: A Bitcoin Podcast

#700: Preparing Bitcoin for the Quantum Era with Jonas Nick & Mikhail Kudinov

74 min
Dec 31, 2025
Summary

Jonas Nick and Mikhail Kudinov discuss their research paper on hash-based signature schemes as a potential solution to protect Bitcoin from quantum computing threats. The episode explores the technical tradeoffs of implementing post-quantum cryptography in Bitcoin, including signature size, verification time, and infrastructure changes needed for wallet systems.

Insights
  • Hash-based signatures offer the most conservative security assumptions for Bitcoin since they rely only on SHA-256, which Bitcoin already depends on for mining and consensus
  • Transitioning to post-quantum signatures would break existing Bitcoin infrastructure like HD wallets, multi-signatures, and XPUB derivation systems that users rely on today
  • The quantum threat timeline remains uncertain, but preparing now through research and soft fork proposals allows Bitcoin to migrate gradually without rushing into a potentially flawed solution
  • Signature size reduction from 7,800 bytes to 3,400-4,000 bytes through parameter optimization is critical for maintaining Bitcoin's transaction throughput and block propagation
  • A phased upgrade using Taproot script paths could allow wallets to add post-quantum spending conditions today while deferring the protocol change until quantum threats materialize
Trends
  • Post-quantum cryptography research moving from theoretical to practical implementation phase with focus on Bitcoin-specific optimization
  • Growing recognition that quantum threat requires long-term ecosystem coordination rather than emergency response
  • Shift toward exploring multiple signature scheme options (hash-based, lattice-based) rather than committing to single standard
  • Increasing developer participation in post-quantum Bitcoin research beyond core cryptography teams
  • Emphasis on maintaining Bitcoin's simplicity and security properties while adding quantum resistance capabilities
  • Hardware wallet and wallet software compatibility becoming critical constraint in post-quantum signature scheme selection
  • State management in signature schemes emerging as major practical challenge for distributed Bitcoin infrastructure
Topics
  • Hash-Based Signature Schemes for Bitcoin
  • Post-Quantum Cryptography Implementation
  • SPHINCS+ Optimization and Parameter Selection
  • Quantum Computing Threat Timeline and Risk Assessment
  • Bitcoin Protocol Upgrade Mechanisms and Soft Forks
  • Hierarchical Deterministic Wallet Compatibility
  • Multi-Signature and Threshold Signature Schemes
  • Taproot Script Path Spending Conditions
  • Signature Size and Block Space Tradeoffs
  • NIST Post-Quantum Standardization Process
  • Lattice-Based Signature Schemes
  • Key State Management in Cryptographic Schemes
  • Bitcoin Consensus and Chain Split Risks
  • Hardware Wallet Integration Challenges
  • Address Reuse and Privacy Implications
Companies
Blockstream
Jonas Nick and Mikhail Kudinov are researchers at Blockstream conducting post-quantum cryptography research for Bitcoin
NIST
National Institute of Standards and Technology standardized SPHINCS+ and other post-quantum signature schemes being e...
Apple
Mentioned as potential adopter of post-quantum signature standards for secure element integration in iPhones
Google
Google's Willow quantum chip advancement with 105 qubits discussed as recent progress in quantum computing capabilities
People
Jonas Nick
Blockstream cryptographer co-authoring research paper on hash-based signature schemes for Bitcoin quantum resistance
Mikhail Kudinov
Blockstream researcher co-authoring paper on hash-based signatures and post-quantum cryptography optimization for Bit...
Satoshi Nakamoto
Bitcoin creator who originally selected SECP256K1 elliptic curve for Bitcoin's signature scheme
Tim Ruffing
Blockstream colleague who proved Taproot script spends remain secure against quantum computers despite key spend vuln...
Matt Corallo
Bitcoin developer who popularized the hash-based signature opcode upgrade path proposal on the mailing list
Ethan Heilman
Bitcoin developer who proposed hourglass mechanism to throttle quantum-exposed signature transitions per block
Conduition
Bitcoin developer who published optimization research on hash-based signatures with significant performance improvements
Quotes
"In a world where central bankers are tripping over themselves to devalue their currency, Bitcoin wins."
Host (Opening segment)
"Hash-based means we can use any hash. The natural choice for the hash function would just be SHA-256, because if SHA-256 is broken, then arguably we have even bigger problems than just transaction authorization not working."
Jonas Nick (Mid-episode)
"In Bitcoin, specifically in Bitcoin, we don't need that many signatures. We create an address. Usually we only want to use it once."
Mikhail Kudinov (Mid-episode)
"There is no easy answer to this. I wouldn't be able to reproduce all the pros and cons there."
Jonas Nick (Discussing coin migration tradeoffs)
"We want to have a system based on merit and there's no secret knowledge that needs to be obtained to be part of any sort of group."
Jonas Nick (Closing remarks)
Full Transcript
You've had a dynamic where money has become freer than free. When you talk about a Fed just gone nuts, all the central banks going nuts. So it's all acting like a safe haven. I believe that in a world where central bankers are tripping over themselves to devalue their currency, Bitcoin wins. In the world of fiat currencies, Bitcoin is the victor. I mean, that's part of the bull case for Bitcoin. If you're not paying attention, you probably should be. Jonas, Mikhail, welcome to the show. Thank you for joining me. Thank you for having us. Been a big week. You guys wrote a paper, and there's a lot of discussion on it, on X and other platforms. The paper is "Hash-Based Signature Schemes for Bitcoin." In this paper you're trying to tackle the question: okay, if quantum computers do manifest and people need to protect their Bitcoin, what is the optimal solution to that problem, to make sure that Bitcoiners are storing their Bitcoin in addresses that can't be attacked by quantum computers, among other things? But first, for someone who doesn't have a cryptography background: can you just explain what hash-based signatures are, why they might be important for Bitcoin's future, and why hash-based signatures in the first place? I think I can start, and then Mikhail can fill in the details that I'm missing. So where do we use signatures in Bitcoin? We use them to authorize transactions. And the current signatures that we use are based on the security of an elliptic curve, called SECP256K1 in Bitcoin. It was chosen by Satoshi. There are alternative signature schemes that depend on different assumptions, and one of those alternatives is hash-based signature schemes. All of these alternatives have different trade-offs. We started looking at hash-based signature schemes because the assumptions that they use are relatively conservative compared to other signature schemes.
Hash-based just means that the security of the signature scheme is based on the security of a hash function. Why is that attractive for Bitcoin specifically? Well, we already rely on the security of SHA-256. For example, this is how we refer to previous blocks in the blockchain; this is how transactions get committed to a block in the Merkle tree. And from that perspective, we started looking at these hash-based signatures first. Essentially, it's also important to say that most of the other approaches one can consider, for example the popular lattice-based schemes that require some other assumptions, still rely on hash functions. So this is essentially as minimal as you can get in terms of security assumptions: the scheme requires just a hash function to be secure. This is a very conservative approach, and arguably the most secure of the possibilities; it has the least amount of attack vectors in that regard. So, Jonas, you referenced it: Bitcoin already relies on SHA-256 for mining and transaction IDs. So this is just a doubling down on what's already worked to date and what we know to be pretty secure. Right, right. So hash-based means we can use any hash. There are different hash functions we could choose, with different trade-offs with respect to performance, or even proving in zero-knowledge SNARKs. But the natural choice for the hash function would just be SHA-256, because if SHA-256 is broken, then arguably we have even bigger problems than just transaction authorization not working. The blockchain doesn't really work anymore. We don't know what the truth is. We don't have consensus on the blockchain anymore. So, yeah, from that perspective, it makes sense to look at these hash-based signature schemes. But they have a reputation for being quite large in terms of size.
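The "security rests only on a hash function" idea can be made concrete with the oldest hash-based construction, a Lamport one-time signature. The sketch below is a toy over SHA-256 for illustration only; it is not SPHINCS+ or the scheme from the paper, but the same principle underlies them.

```python
# Toy Lamport one-time signature over SHA-256 (illustration only; this is
# not SPHINCS+ or the paper's scheme, but the principle they build on).
import hashlib
import os

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # Secret key: 256 pairs of random 32-byte values, one pair per digest bit.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    # Public key: the hash of every secret value.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(digest):
    # Big-endian bit expansion of a 32-byte digest.
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one preimage per bit of H(msg); the key must be used only once.
    return [sk[i][bit] for i, bit in enumerate(bits(H(msg)))]

def verify(pk, msg, sig):
    # Check each revealed value hashes to the matching half of the public key.
    return all(H(s) == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, bits(H(msg)))))

sk, pk = keygen()
sig = sign(sk, b"send 1 BTC to Alice")
assert verify(pk, b"send 1 BTC to Alice", sig)
assert not verify(pk, b"send 1 BTC to Mallory", sig)
```

Forging here requires finding SHA-256 preimages; no elliptic-curve assumption appears anywhere. The toy also makes the size problem visible: the signature is 256 values of 32 bytes each, roughly 8 KB, and it is safe for exactly one message.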
So that is a consideration that one needs to look into when considering hash-based signatures. And before we get to that, because I think that's exactly what your paper focused on: okay, if we want to be quantum resistant and we're going to use hash-based signatures, how do we weigh the trade-offs and make sure that Bitcoin is still usable and scalable at the end of the day? But before we get to that: we've all heard the term post-quantum thrown around. What does that actually mean? What is the quantum we're supposed to be afraid of and supposed to be preparing for? Yeah, so from my perspective, when we as cryptographers do security proofs, for example for Schnorr signatures, or MuSig, or FROST, or signature aggregation, DahLIAS, et cetera, we have these little theorems. They are a paragraph or two, contain some numbers, and they very precisely state the assumptions under which these signatures are secure. In the case of Schnorr signatures, the theorem tells you directly when you read it or write it down that it relies on the security of the curve you're choosing, SECP256K1 in the case of Bitcoin. And we know that this assumption, that SECP256K1 is not broken, is actually wrong. It is broken. The challenge is just to build a machine that exploits quantum mechanics well enough to break it. But this is different from other types of cryptography; we can also do cryptography where we don't have assumptions like that. So just writing this statement down personally makes me a little bit uneasy. A quantum computer would be able to break our curve and therefore would be able to break Schnorr signatures. Now, the challenge of post-quantum cryptography is to find cryptographic schemes that are secure even in the presence of quantum computers. And that requires some guesswork, or some research, as to what problems quantum computers are actually good at.
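The hardness gap between problems quantum computers are good at and problems they are not can be felt numerically. Classically, finding a preimage of an n-bit hash output takes about 2^n attempts on average, and Grover's algorithm only improves this quadratically, to roughly 2^(n/2) queries. A toy search on a 16-bit truncation of SHA-256 shows the classical scaling (the bit size is illustrative, nowhere near a real security level):

```python
# Brute-force a preimage of a truncated SHA-256 output. Classically this costs
# about 2**N_BITS hash calls on average; Grover's algorithm would need roughly
# 2**(N_BITS / 2) quantum queries, a quadratic rather than exponential speedup.
# 16 bits is a toy size so the loop finishes quickly; real outputs are 256 bits.
import hashlib
import itertools

N_BITS = 16
MASK = (1 << N_BITS) - 1

def truncated_hash(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") & MASK

target = truncated_hash(b"some committed value")
for tries in itertools.count(1):
    if truncated_hash(b"guess-" + str(tries).encode()) == target:
        break
print(f"{N_BITS}-bit preimage found after {tries:,} tries (expected ~{2**N_BITS:,})")
```

Doubling N_BITS roughly squares the classical work, and a Grover-equipped attacker would still need about the square root of it; at SHA-256's full 256-bit output both are far out of reach, which is the "certain complexity" bound the discussion below relies on.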
We know that they are very good at breaking our elliptic curve, and it's very unlikely that they are good at breaking hashes, let's say. They are better than a classical computer, but they won't be able to break them the way they can break elliptic curves. And what I can add here is that we are even sure that quantum computers cannot do certain tasks well enough to break the security. For example, if we have a random database and we ask a quantum computer to search for a specific input in that database, and we don't know where it is, we know that this is a hard problem for quantum computers: you cannot do better than a certain complexity. And a hash function is essentially this random database, because we take an input, we hash it, and we get an arguably random-looking output. Our assumption here is that we will not find an algorithm that can exploit the actual description of the hash function. One can view this as very similar to how classical analysis of hash functions works, because classically we also rely on not being able to find an algorithm that exploits the actual description of the hash function, the actual algorithm that does the hashing. And so far, SHA-256 has not come under effective attacks there. It has proved its security, and there has been no significant improvement in terms of exploiting its structure. For any quantum advances there, there would have to be a structure in the hash function that one can exploit, and so far, we couldn't find anything, right? Okay. And so let's dive back into the paper, run with the assumption that quantum computers can't exploit that hash function at some point in the future, and talk about the trade-offs in the paper. It seems like you guys settled on SPHINCS+ as a standardized solution because it already exists; it's already standardized by NIST. Why do we need this research? What's the gap that you guys are trying to fill by applying this to Bitcoin specifically?
Yeah, so NIST has looked at which post-quantum signature schemes to standardize, and they picked multiple. One of them was SPHINCS+. So SPHINCS+ exists, and we could use it in Bitcoin. There exist high-quality implementations with formal verification and proofs, et cetera. So we could just use it. But the research question that we had when we started the paper was this: the usual goal of signatures outside of Bitcoin or blockchains is to sign software and to sign certificates for the web, and this is a very different application from what we're doing in Bitcoin with signatures. So what we were asking was: can we adapt this standardized scheme, or other schemes that exist in the literature that have not been standardized, and see in what way we could change them to better fit this Bitcoin application? And again, one of the trade-offs that we're trying to solve for here is the size of the signatures. In the paper you mentioned going from about 7,800 bytes down to around 3,400 to 4,000 bytes with optimizations. For the listeners out there, help them understand: why does the size of these signatures matter? What's the trade-off that you're making? Yeah, I think Mikhail can speak best to that. Yeah, so first of all, the size determines how many signatures can fit in the block. The bigger the signatures, the fewer transactions you can fit, and that determines... So in the current block, that is right,
we have these maximum 4-megabyte blocks, and if your signatures are suddenly huge, then you can of course fit fewer transactions in the block, unless you increase the block size as well, which is another topic with its own ups and downs. But if we consider for now that the block stays the same, the size matters for bandwidth. That's one thing. Another thing is that SPHINCS+ was designed to support a lot of signing operations. The requirement from NIST was for people to be able to sign 2^64 different messages. Just to add to that: that's a number with 20 digits. So it's a huge number. It's practically infinite. That was exactly the idea: okay, we set this bar so that it will never be reached. You don't have to worry about it. It's 2^64. It is huge for all practical applications. You just don't need to think about it. There's no problem. And this requirement for hash-based signatures actually determines a lot in terms of the size. The first observation we can make is that if we set this bar lower, so that we actually have a limit of a smaller number of signing operations, we can significantly decrease the signature size. Moreover, along with the signature size, the verification time can also be decreased. And verification determines how fast we can validate a block, how fast we can then propagate it, and so on. So this is one of the core observations here. If we speak of other improvements: with SPHINCS+ there was a process of standardization, and ideas were coming in and certain improvements were made, but they had their own timeline, so at a certain point they had to decide what to accept and what to standardize. But the research didn't stop there, and different modifications and improvements were suggested afterwards. We can use those as well in the Bitcoin environment. When NIST sets these standards, do they ever change?
Can you have updates to the specific standards on SPHINCS+? Or if you're going to use the other ideas that were suggested but not included in the standard, are you then outside the NIST standard? And is that frowned upon, or just considered experimental? So if we deviate from the standard, then yes, we're using a different scheme. Theoretically, modifications can be made to the standard, and there can be an update, a different standard, or just a new version of it. There are also different kinds of modifications we can make. One can just use different parameters, and in that case it will be easy to reuse some other implementation. But some modifications that we discussed in the paper also affect the structure of the signature scheme, so the implementation will have to be adjusted as well. So there are various options that give us a certain boost in performance or size, or other trade-offs, and they come at the cost of varying further from the standard or sticking closer to it. And so we already talked about one trade-off, which is the size of the signatures. I think one of the other ones that you discussed in the paper is that, to reduce signature sizes in terms of bytes, you could sort of limit the number of signatures that can be associated with any public key. Why is that? I think this is a strange technical thing that one needs to get one's head around, right? Right now, in our world of signatures, we don't care how many signatures we make. It's just a technicality of these hash-based signatures that when you design such a scheme, there is a limit on the signatures you can make per public key. And to be clear, we were not the first to come up with reducing the number of supported signatures. This is a pretty obvious idea when you look at how those schemes are structured.
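The per-key cap exists because each signature consumes one-time key material. A toy experiment in the Lamport style shows what reuse leaks: every signature reveals one of the two secret values at each digest position, so repeated signing under one key steadily hands the secret key to an observer. (This is a sketch of the principle, not of the paper's few-time scheme.)

```python
# Count how much of a Lamport-style one-time secret key leaks as the same
# key is (mis)used to sign several messages. Illustrative sketch only.
import hashlib
import os

H = lambda b: hashlib.sha256(b).digest()

def bits(digest):
    # Big-endian bit expansion of a 32-byte digest.
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

# One-time secret key: two 32-byte secrets per digest position.
sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]

leaked = [set() for _ in range(256)]   # which of the two secrets leaked, per position
history = []
for msg in [b"tx one", b"tx two", b"tx three", b"tx four"]:
    for i, bit in enumerate(bits(H(msg))):
        leaked[i].add(bit)             # signing msg reveals sk[i][bit]
    fully = sum(len(s) == 2 for s in leaked)
    history.append(fully)
    print(f"after signing {msg!r}: {fully}/256 positions fully leaked")
# A forger can sign any message whose digest only touches leaked values; after
# a handful of reuses most positions are fully known and forgery becomes easy.
```

One signature leaks nothing useful (zero positions are fully known), but each additional one roughly halves the remaining unknown positions, which is the gradual security degradation described below.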
But one of the insights you might have when trying to apply this to Bitcoin is that in Bitcoin, specifically in Bitcoin, we don't need that many signatures, right? We create an address. Usually we only want to use it once. We don't want address reuse. And then we sign, produce a transaction. That's it. Then you might have to do RBF to bump fees, and then you have to sign again. How often does that happen? Maybe it happens more often if there's a more competitive block space market. It doesn't happen often right now, and it certainly doesn't happen 2^64 times. There are other things where you need to sign more often, for example in layer twos, so Lightning payment channels. That's sort of a different discussion, perhaps, but you still won't sign that many times. There's still going to be an upper bound that is below 2^64. And using those ideas, we can get the signature size down. As I said, this is quite a technical thing, very peculiar to these hash-based signatures: a lower number of supported signatures means you can get a smaller signature. And what happens if you sign more often? I think that's important to explain. If you have a hash-based signature scheme that supports, let's say, a thousand signatures, and you do more, then the security sort of degrades. It becomes easier and easier for an attacker to actually forge a signature, which in Bitcoin would mean they could make a malicious transaction and steal your coins. So if you have that limit of supported signatures, you should not exceed it. You must not exceed it. So, Freaks, this rip of TFTC was brought to you by our good friends at BitKey. BitKey makes Bitcoin easy to use and hard to lose. It is a hardware wallet that natively embeds into a two-of-three multisig.
You have one key on the hardware wallet, one key on your mobile device, and Block stores a key in the cloud for you. This is an incredible hardware device for your friends and family, or maybe yourself, who have Bitcoin on exchanges and have for a long time, but haven't taken the step to self-custody because they're worried about the complications of setting up a private/public key pair, securing that seed phrase, setting up a PIN, setting up a passphrase. Again, BitKey makes it easy to use, hard to lose. It's the easiest zero-to-one step, your first step to self-custody. If you have friends and family on the exchanges who haven't moved it off, tell them to pick up a BitKey. Go to bitkey.world. Use the code TFTC20 at checkout for 20% off your order. That's bitkey.world, code TFTC20. What's up, Freaks? Have you noticed that governments have become more despotic? They want to surveil more. They want to take more of your data. They want to follow you around the Internet as much as possible so they can control your speech, control what you do. It's imperative in times like this to make sure that you're running a VPN as you're surfing the web, as we used to say back in the 90s. And it's more imperative that you use the right VPN, a VPN that cannot log because of the way that it's designed. And that's why we have partnered with Obscura. That is our official VPN here at TFTC, built by a Bitcoiner, Carl Dong, for Bitcoiners focused on privacy. You can pay in Bitcoin over Lightning. So not only are you private while you're perusing the web with Obscura, but when you actually set up an account, you can acquire that account privately by paying in Bitcoin over the Lightning network. Do not be complacent when it comes to protecting your privacy on the internet. Go to obscura.net, set up an Obscura account, use the code TFTC for 25% off. When I say account, you just get a token. It's a string of characters. It's not connected to your identity at all.
Token sign-up, pay with Bitcoin, completely private. Turn on Obscura, surf the web privately. Obscura.net, use the code TFTC for 25% off. Yeah, it's funny. I say this a lot, especially when I talk to developers and cryptographers like yourselves: so many people take for granted that the system just works without understanding the deep cryptographic primitives underneath. But it is always infinitely fascinating diving into these details, because I think it's important, even if you're not technical, to have a rough understanding of what's happening under the hood. Obviously we talked about SPHINCS+ and some of the optimizations there. We also talked about WOTS+C, FORS+C, and FORS+FP with potential optimizations. Without getting too far into the weeds, what are the ideas behind these optimizations? I think I can cover some of that. The core idea here is that we can make the signer work a little bit more on finding certain inputs: we don't only include the message, but also some extra seed, or an extra random number, or a counter, and that allows us to search for better values that we then sign. This requires some extra work from the signer, but because of these extra properties of what we are signing, it can reduce the signature size, and it also helps the verifier. So by doing extra work on the signer's side, we reduce the size and we reduce the verification time. Sometimes this extra work also pays for itself: yes, we search for this nice value to sign, but because this value is nice, it also eases the further work for the signer. So we had these different parameter choices in the paper, and for most of them we make the signing time a little longer than in the original SPHINCS+, because we want more effort from the signer but really small signatures. On the other hand, we can still balance this out while keeping the signature size small. And so this would affect wallet software, right?
Because I'm thinking of... I played around with some DLC apps, Atomic Finance, and when you're doing a rollover transaction with DLCs, which takes a lot of signatures, it sometimes takes a minute or two to actually sign and then broadcast the transaction. So this wouldn't necessarily affect the nodes, right, and bandwidth propagation and things like that? It's just literally on the setup, to sign and then broadcast, correct? Yes, I think so. I think this is a very interesting point that you're making right now that I hadn't really considered before. This kind of trade-off, increasing the signing time while reducing signature size and verification time, makes a lot of sense in Bitcoin, because at least on the Bitcoin blockchain we have a natural limit on the number of transactions that are supported per second, right? We can do at most, whatever, 10 transactions per second, and a few more signatures than that, but it's not thousands of signatures per second that you need to make. On the other hand, we also have hardware wallets that are of course lower-powered and might take longer. And I think what you're pointing out is an interesting thing: you might have a situation where you are pre-signing a lot of transactions, which is what happens naturally in DLCs right now, and this would be affected by a change to hash-based signatures, because producing these signatures, depending on which parameters you pick (which is, I think, the biggest part of our work, trying to figure out what reasonable parameters are), might take much longer than it does currently. Yeah, and that's the trade-off you have to weigh. Are you willing to wait longer to broadcast the transaction? Right. More generally, it's perhaps important to say that in this post-quantum world, we always have to deal with downsides.
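The counter-grinding idea from a moment ago (extra signer work in exchange for smaller signatures, in the spirit of the WOTS+C line of work) can be sketched in miniature: the signer searches a counter until the Winternitz-style chunk checksum of the digest hits a fixed target, so the checksum never needs to be encoded in the signature. Parameters here are toys chosen so the search finishes fast; they are not the paper's.

```python
# Signer-side grinding: search a counter so the digest to be signed has a
# fixed property (its 4-bit chunk checksum equals an exact target), letting
# the signature parts that would encode that property be dropped.
import hashlib
import itertools

W = 16                          # chunk alphabet size (4-bit chunks)
CHUNKS = 64                     # 256-bit digest -> 64 chunks
TARGET = CHUNKS * (W - 1) // 2  # pin the checksum at its expected value, 480

def chunks_of(digest):
    # Split a 32-byte digest into 64 values in range 0..15.
    out = []
    for byte in digest:
        out += [byte >> 4, byte & 0x0F]
    return out

def grind(msg):
    # Try counters until the chunk checksum of H(msg || ctr) equals TARGET.
    for ctr in itertools.count():
        d = hashlib.sha256(msg + ctr.to_bytes(8, "big")).digest()
        if sum(chunks_of(d)) == TARGET:
            return ctr, d

ctr, d = grind(b"example transaction")
print(f"counter {ctr} pins the checksum at {TARGET}")
# The verifier recomputes H(msg || ctr) and checks the checksum itself, so no
# checksum chains need to be carried in the signature.
```

The grinding loop is exactly the extra signing time being traded away here: the signer burns hash calls up front so the signature carries less data and verifies faster, one of the unavoidable post-quantum downsides under discussion.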
This is also something that I feel like people on Twitter sometimes misunderstand, because they ask: why haven't you done anything right now already? Perhaps we would have changed to more conservative assumptions for signatures if it were free, but that's not the case. Whenever we want to add post-quantum security to Bitcoin, we get very significant downsides, be it new assumptions, signature sizes, verification time, signing time, statefulness. There's a big risk, I think, in just trying to switch to something which we later figure out wasn't the right choice because it picks the wrong trade-offs. So what we're doing mainly in this paper is exploring the trade-off space, particularly with respect to hash-based signatures, to inform the community as to what reasonable parameters to pick are. Yeah, that makes a lot of sense. You mentioned statefulness, and you guys touch on it in the paper: stateful versus stateless signature schemes. Why is being stateless so important to Bitcoin specifically? So yes, stateful versus stateless is a good topic as well. What does stateless versus stateful mean? Again, right now we don't have to think about it; ECDSA is essentially stateless. That means that when you sign, you don't have to touch your secret key in terms of changing and updating it. It stays the same, and in more or less all implementations you can separate it, you can back it up as it is; there is no problem. Everything stays the same. A stateful hash-based scheme means every signing operation must update the secret key. And if there is a misuse of this secret key, or the update didn't go through, or you lost the key and you restored a backup that has an older state of the key, that will compromise the security of the scheme. But it comes, on the other side, with the benefits of better performance and better sizes.
With a stateful scheme, if you're happy with these secret key manipulations, you can have better performance. But it's important to mention that this can be very tricky. For example, if you have two separate devices and you want to run them with the same key pair, and they don't communicate with each other, maybe you can split the state between them, but that will limit the number of signatures each can provide. There is a lot involved there in managing your key pair. Yeah, so state management, I think, is another very technical thing, but it is important when trying to pick these schemes. I think statefulness is pretty fragile, and it's one of those things that won't work for every signer. Another example: you run Bitcoin Core on your node. Let's say you produce a backup; you back up your wallet.dat. You make a few transactions, perhaps, and then your machine crashes, your machine breaks, and you need to restore the backup that you've created, including the wallet.dat. If you do that, and the state were stored in the wallet.dat (which of course no one would do), then essentially you would allow an attacker to produce forgeries and steal your Bitcoin. So that is why it's very fragile, and it won't work for everyone. So, for example, it probably won't easily work for share care. Okay, I guess another thing that this affects is HD wallets, hierarchical deterministic wallets, and the backup system that we rely on. Again, they don't work the same way with hash-based signatures either. So what do we do with HD? Yeah, I think this is one of the other downsides of hash-based signatures: they don't have the nice mathematical structure that we currently enjoy on our elliptic curve, which means that we cannot use any of the tricks that we've developed over the past years in Bitcoin. And that includes HD wallets.
It includes multi-signatures, threshold signatures, silent payments, aggregate signatures, and Taproot commitments. So that was one of the other research questions we had when we started this entire project: is there anything we can do with hash-based signatures here? In the past it was intuitive that it wouldn't work, but I thought maybe we could do something here or there and improve on at least the trivial approaches. I think the answer is mostly that we can't really do anything fancy there. We describe some methods in the paper that would reduce the size of multi-signatures by a little bit, but they make the signing protocol very complicated. There are other techniques like that you could use, especially for multi- and threshold signatures. Maybe there are some scenarios where you could use them, but right now it doesn't seem to make a lot of sense to base the entire standardization effort, and the picking of specific schemes, on making them work a little bit better with these multi-signature schemes, because they don't really work that well. Yeah, they don't give much there. And for HD wallets: HD is multiple things. One is private derivation, and one is public derivation. What you can still have with hash-based signatures, and it works very straightforwardly, is a seed from which you can generate many addresses. Perfectly fine, works well. What you can't have is this kind of xpub setup where some other person or some other entity derives new public keys from a given xpub. That does not work. And this xpub setup is particularly important today in hardware wallet plus software wallet setups. You have your hardware wallet; you export your xpub from the hardware wallet to the software wallet. The software wallet is then able to derive new addresses and scan the chain in that way. And that wouldn't be possible anymore with hash-based signatures.
What you need to do instead, the only thing we know of, is that your xpub is not a short public key but rather a long list of public keys that you give to the software wallet. And once this list is exhausted, the software wallet needs to talk to the hardware wallet and ask for more public keys. So it's not impossible, but it doesn't work as well with the infrastructure that we've created with descriptors in the past. And there is also this worry that, because it's a little bit harder to make this work, wallet developers might decide to just reuse addresses instead of building this system. That would bork a lot of things there. Yeah. So if you're a listener, here's one of the examples Jonas just walked through: say you have a Coldcard, and you want to set up the private/public key pair offline using the wallet, and then you want to put it in a safe or something, but you still want to be able to receive Bitcoin to it. So you download Sparrow, you export the xpub to Sparrow, and from there you can get these addresses. This would be made harder; you'd have to do it a different way with these hash-based schemes. And similarly with multisig: you share xpubs with your quorum partners to make the multisig wallet, and that would be made impossible with hash-based signatures. Right, right. This is another kind of scenario that wouldn't work as well anymore. That seems like a pretty big change. Maybe something we should talk through. Of course. Well, it might be. I mean, there are other... as we said, it's not only hash-based signatures. There are other assumptions we can base schemes on. They might support these kinds of scenarios better, but they have other downsides.
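The batch-of-keys workflow just described (the hardware wallet exports a finite list, and the watch-only wallet must ask for more once it runs out) can be sketched as below. All names are hypothetical, and the "public keys" are hash stand-ins, since real hash-based key generation is far heavier; the point is only the protocol shape that replaces an xpub.

```python
# Sketch of an xpub replacement for hash-based keys: the watch-only wallet
# holds a finite batch of pre-exported public keys and must call back to the
# signing device when the batch is exhausted. Helper names are hypothetical.
import hashlib
import os

def derive_pubkeys(seed, start, count):
    # Hardware-side: deterministic "public keys" from a seed (private
    # derivation still works; public derivation from an xpub does not).
    return [hashlib.sha256(seed + i.to_bytes(4, "big")).hexdigest()
            for i in range(start, start + count)]

class WatchOnlyWallet:
    def __init__(self, request_more):
        self.request_more = request_more  # callback to the hardware wallet
        self.keys, self.next = [], 0

    def fresh_address(self):
        if self.next >= len(self.keys):
            # Batch exhausted: unlike an xpub, we cannot derive more ourselves.
            self.keys += self.request_more(len(self.keys), 10)
        addr = self.keys[self.next]
        self.next += 1
        return addr

seed = os.urandom(32)
wallet = WatchOnlyWallet(lambda start, n: derive_pubkeys(seed, start, n))
addrs = [wallet.fresh_address() for _ in range(25)]  # triggers three batches
print(len(set(addrs)), "unique addresses")           # no address reuse
```

The awkward part is exactly the callback: every time the batch runs dry, the signing device has to come back online, which is the friction that could tempt wallet developers toward address reuse.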
Well, that was going to be my next question. Obviously, this research paper was focused on hash-based signature schemes. Do you guys plan on doing more research on different signature schemes moving forward? Yes, so our current plan is also to look at lattice-based schemes. As Jonas mentioned, lattice-based schemes introduce extra assumptions; they are based on lattices, mathematical structures with certain hard problems. But they can potentially offer upsides in terms of hardware wallets, public key derivation, multi-signatures, and things like that. So for now, I think it's a little bit early to talk in detail about that, but this is the follow-up we're working on. Yeah. And I also think it's important to note that this entire post-quantum story for Bitcoin is a spectrum: from what do we do in an emergency if it happens now, let's say. It might not even be a quantum computer; it could be that our curve is broken classically. That's a scenario some people bring up. And then, at the other end: how can Bitcoin really work in a post-quantum future?
And those are different questions, I think, because when we want Bitcoin to work in a post-quantum future, we need to make sure that people can still make transactions, and not just a few transactions that can make it into the chain because the blocks are so small compared to the signature sizes. So those are different questions, and there's a spectrum in between where we look at: okay, maybe this won't be the perfect solution, but at least it doesn't rely on these crazy assumptions. And I think this hash-based signature scheme is more on the emergency end, something we could get consensus on relatively easily, whereas the lattice-based stuff is a little bit farther away on that spectrum. Yeah, there would be a lot of reshuffling if, and potentially when, we have to do this transition, so being prepared for it matters. It's also possible to add multiple signature schemes to Bitcoin. And even just looking at hash-based signatures, this might be a viable path forward, because there are just so many different types of signers. There are pleb hardware wallets, and pleb Lightning nodes, and there are also big exchanges, and they have different requirements. Even just for hash-based signature schemes, this might mean it makes sense to add schemes with different parameters. This has downsides as well, of course. It increases the complexity of the Bitcoin protocol, which we want to keep relatively small, because complexity might result in chain splits or other vulnerabilities and bugs if something actually goes wrong. And another downside is that it reduces privacy. One of the big advantages of introducing Taproot, right, was that every transaction should look like every other transaction, at least in the best case, to improve privacy.
And if we introduce many different signature schemes now, this might be a problem, because now you can look at the blockchain and follow certain types of transactions, and maybe certain wallets have identifiable fingerprints, and so on. So these are the downsides if you add multiple signature schemes, but it's certainly possible. So you get privacy degradation via a process of elimination, with different people using different signature schemes. Yes. Yeah. That makes sense. Well, I guess this all comes back to: how real is this threat? How imminent is it, or how far away is it? How much time do we have to prepare? Obviously, Jonas, you mentioned there is a potential for classical means; you don't even need a quantum computer to break the curve. But I think a lot of the discussion is on quantum specifically, and obviously, early this year you had Google's Willow chip. It made a lot of headlines. I think the advancement there was 105 qubits, performing computations that would take supercomputers 10 septillion years. Is that advancement enough to make Bitcoiners wary? And another thing I'm trying to wrap my head around, reading people's reactions to the quantum discussion this week, is how much signal is in these advancements, whether it's Willow or other recent quantum advancements. I've read people explaining that some of these advancements have hard-coded variables or assumptions in them, which is sort of cheating in a way. And even if Willow got to 105 qubits legitimately, and I think there's a very good case to be made that it did, Bitcoin's cryptography would require around 13 million qubits working in parallel to break the curve. How viable is that? How realistic is that on a timeline of 5, 10, 15 years? So, I'm not a quantum engineer, and I think there is no consensus, at least among the experts, on when this will happen. My angle on this is, as I told you, we write these theorems.
They rely on the elliptic curve, and we know the elliptic curve would be broken by a quantum computer. This makes me uneasy. I want Bitcoin to be secure, and that includes my own little stash. I want it to be secure against very, very powerful adversaries, and I don't want it to only remain secure if some random guy on Twitter turns out to be right. So I think the Bitcoin community should certainly look at this problem, and this is why we are working on it. Right now I wouldn't lose sleep over it, for sure. But on the other hand, 10 years in Bitcoin is not such a long time. If we need to prepare for an upgrade like that, and you said it yourself, the HD wallet stuff, for example, would completely break, then we would need an ecosystem-wide update. Maybe that doesn't take 10 years, but for people migrating their coins, the more time they have, the better. Yeah. I mean, it's very hard to just say this will be 5 years, 10 years, or whatever, because the advancement comes from every possible direction. As you said, for example, counting the number of qubits: there is a very interesting point where it's a little bit different from how classical computers work, in that we need several physical qubits to encode one logical qubit. A logical qubit is one that actually performs as we expect it to, because there is a lot of noise in quantum computations, and one needs to eliminate that noise to do proper computations. One can increase the number of physical qubits, and one can improve the noise-cancellation algorithms. It's also very important in terms of the quantum algorithms that are actually solving the problem. For example, there was a recent advancement that reduced, I think by 20 times, the number of qubits the algorithms require. So the advancements come from different directions. I would agree with Jonas: I wouldn't lose sleep today, but we have to work towards solutions to be ready.
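Rough arithmetic helps frame the gap being described. The 1,000-to-1 physical-to-logical overhead below is an illustrative assumption only (actual overhead depends on hardware error rates), and the 105 and 13 million figures are simply the ones cited in this conversation:

```python
# Figures cited in this conversation (not authoritative estimates):
WILLOW_PHYSICAL_QUBITS = 105
QUBITS_TO_BREAK_CURVE = 13_000_000

# Assumed error-correction overhead: ~1,000 noisy physical qubits
# per reliable logical qubit (illustrative; depends on error rates).
PHYSICAL_PER_LOGICAL = 1_000

logical_today = WILLOW_PHYSICAL_QUBITS // PHYSICAL_PER_LOGICAL
gap = QUBITS_TO_BREAK_CURVE / WILLOW_PHYSICAL_QUBITS

print(f"Logical qubits Willow could support at this overhead: {logical_today}")
print(f"Raw qubit gap to the cited attack estimate: ~{gap:,.0f}x")
```

Under these assumed numbers, today's chips do not yet sustain even one fully error-corrected logical qubit, which is the "this doesn't scale yet" intuition — while the point above stands that both the hardware and the algorithms keep improving from several directions at once.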
If we just wait, wait, wait, then we never get there. We have to work and make progress now, see what possible solutions we have, and prepare for the downsides and changes that these solutions bring. How would you two describe your... what's the word I'm looking for? Are you happy with the amount of focus on this? Obviously, BIP 360 has been talked about for the last... I mean, before it was even BIP 360. People have been talking about Hunter and his work; people have been looking at that and discussing it for over a year now, it seems. At this point, obviously, you two have done a ton of research. We had the quantum meeting at Presidio Bitcoin in July. Is there enough discussion, in your opinion, on this topic right now? Are enough people focused on it and taking it seriously? Of course it would always be better if there were more quality discussions, but I think the type of discussion we have right now is pretty good, and it has also increased quite a bit, especially over the last year. Like on Twitter, there was not only noise; I think there was also a bunch of signal, and it was good that even the wider Bitcoin community talked about this and talked it through. And on the Bitcoin mailing list there's always a bunch of discussion. Of course, there's always this kind of Bitcoin thing that might happen, where there is some discussion and then there's no follow-up and nothing ever happens, right? That's a pretty standard thing in Bitcoin. But I'm also not too worried about that, to be honest, because I think if a quantum computer were really imminent, the Bitcoin community would find consensus on some sort of upgrade. What is, of course, going to be really, really painful, and there's no way to sugarcoat it, is what happens with the coins that don't migrate. I think this will be extremely painful, but that's not something that cryptography can solve,
unfortunately. What's your take there? Don't do anything with them; let the treasure hunts begin. That's my stance, I think, because you always have to think of the man in the coma, right? If you were to change things and make his coins unspendable, and the guy wakes up and says, hey, I have five Bitcoin here that I can't move anymore because you did something, and quantum computers haven't stolen them yet, you're sort of screwing that guy over. I think there are reasonable arguments for both positions. That's what makes it painful. And I don't know, I guess my opinion right now is that if I look at it extremely selfishly, and ask what code I would run on my node, then I think it would be the one that freezes the coins, because at least in the mid to long term that would probably result in a better outcome for the Bitcoin network. But you could find counterarguments to that, I'm sure. Yeah. I've been following the discussion on the mailing list. I think there are still a lot of assumptions baked into this, but I thought Ethan Heilman's post on this earlier this year was interesting: how you could essentially throttle the transition and only allow a certain number of transactions from quantum-exposed signatures per block. Is this the hourglass idea? I think so. I'm pretty sure. Let me see. It goes all the way up. I think it is. He also said it was a way to address the JPEG spam as well. But this idea that... and that's the other thing. Maybe we should just dive into the tradeoffs around what you do with the coins that haven't transitioned. What are the solutions being thrown out there right now, and the trade-offs that exist? I mean, I personally haven't looked too much into this. I know there is a discussion on the mailing list, and there probably should be more discussion. But as I said, there's no easy answer here, and I wouldn't be able to reproduce all the pros and cons. All right.
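The hourglass-style throttling mentioned above can be sketched as a block-validity rule capping how many quantum-exposed spends a block admits. Everything here is illustrative: the quota, the transaction shape, and the greedy selection are made up for the sketch, not taken from any proposal text:

```python
MAX_EXPOSED_SPENDS_PER_BLOCK = 20  # illustrative quota, not from any BIP

def filter_block(candidate_txs: list[dict]) -> list[dict]:
    """Toy version of the throttling idea: admit at most a fixed number
    of transactions spending quantum-exposed outputs per block; the
    rest must wait for later blocks, stretching the migration out."""
    exposed_used = 0
    admitted = []
    for tx in candidate_txs:
        if tx["spends_exposed"]:
            if exposed_used >= MAX_EXPOSED_SPENDS_PER_BLOCK:
                continue  # over quota: deferred to a future block
            exposed_used += 1
        admitted.append(tx)
    return admitted

# Example: 30 exposed and 30 ordinary transactions compete for a block.
txs = [{"id": i, "spends_exposed": i % 2 == 0} for i in range(60)]
admitted = filter_block(txs)
```

The effect is a forced trickle: exposed coins can still move, but never all at once, which is the throttling property described above.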
Well then, taking it in a different direction: obviously you guys are doing research, and there's a lot of discussion about this. In terms of a first step beyond research and having the discussions, what would a transition to a post-quantum signature scheme look like? How do you prep the market? How do you begin moving? How do you set up the node software to make this stuff viable? As for the next steps we envision after this paper: one contribution of the paper is a Python script where you can explore the parameter space. You can say, okay, I want to support this many signatures and the tree to look like this, and it spits out the signature size, verification time, and so on. We also have tables in the paper showing some of these values for the parameters we picked. So, explore the parameter space, and then the next step is implementations. Benchmarking. Yes, implementations and actual benchmarks. We just estimate performance through some crude proxy, but we haven't really measured actual performance, and at least in my experience, implementations usually inform specifications quite a bit, because you find things you wouldn't notice going only in the specification-to-implementation direction. And it's nice that there are actually people actively participating in that regard. On the mailing list, there were several posts from people trying to implement hash-based schemes, either SPHINCS+ or SPHINCS+ with a lower number of signatures.
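The flavor of that parameter-space exploration can be shown with a deliberately simplified size model. This is not the paper's script: it counts only a bare WOTS-style chain layer plus a single Merkle authentication path, with 32-byte (SHA-256-sized) hashes assumed throughout, while the real schemes include further components:

```python
import math

def wots_chains(n: int, w: int) -> int:
    """Number of Winternitz hash chains (message digits + checksum digits)
    for an n-byte hash and Winternitz parameter w."""
    log_w = int(math.log2(w))
    len1 = math.ceil(8 * n / log_w)                           # message digits
    len2 = math.floor(math.log2(len1 * (w - 1)) / log_w) + 1  # checksum digits
    return len1 + len2

def est_signature_size(n: int = 32, w: int = 16, tree_height: int = 20) -> int:
    """Very rough signature size in bytes: one n-byte value per WOTS chain
    plus one n-byte Merkle auth-path node per tree level. Real schemes
    (e.g. SPHINCS+ variants) add more parts on top of this."""
    return wots_chains(n, w) * n + tree_height * n

# Larger w shrinks signatures but costs more hashing per chain:
for w in (4, 16, 256):
    print(f"w={w:3d}  ~{est_signature_size(w=w):,} bytes")

# Crude throughput bound: signatures per 4M-weight block, counting only
# the signature bytes (1 weight unit each for witness data under segwit):
print(4_000_000 // est_signature_size(w=16), "sigs/block upper bound")
```

Even this toy model shows the core trade-off the script is for: the Winternitz parameter trades signature size against signing and verification hashing cost, which is why picking parameters is an ecosystem decision rather than a purely technical one.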
Now they're also looking at possible modifications that we also discuss in this paper, and I think this will significantly help with the decision we will have to make. Yeah, that's very good to point out. Specifically, I want to give a shout-out to Conduition: he wrote this awesome blog post where he optimized hash-based signatures like crazy, and it's certainly worth a read. But for the bigger discussion, how do we actually migrate, now that this research exists, or even before that, just with the standardized NIST schemes? One thing that could happen, which I think is a relatively reasonable upgrade path that unfortunately does not have a catchy name... I think Matt Corallo was the first to popularize it on the mailing list, but others have discussed it before. Let's say we just introduce a hash-based signature opcode, like the CHECKSIGVERIFY we have for our elliptic-curve-based signatures; we just introduce one for hash-based signatures. It doesn't matter exactly what it looks like, something like that. And what wallets could do today already is generate a post-quantum public key, a hash-based public key, and then add this public key, along with this new opcode, to a Taproot Merkle tree, which already exists. So this would be a soft fork: we add this opcode, and there's no cost for a wallet, except implementation complexity, to add this additional spending path to the Taproot tree, if they already use Taproot. And so they do that.
And of course, Taproot itself, the full Taproot, is not secure against a break of the elliptic curve, because we have these key spends as well, which just require an elliptic-curve Schnorr signature for the public key that is directly in the output. That's the key path. But there's also the Taproot tree. Now, what our colleague Tim Ruffing at Blockstream proved is that the Taproot script spends, the Merkle trees inside the Taproot, are still secure even if a cryptographically relevant quantum computer arrives. So what you can do is, at a certain point, disable the key spends from Taproot, and then, hopefully, many wallets will have added these additional post-quantum public keys in the tree, so they will be able to spend those coins with a hash-based signature. And the advantage of this is that you can use Taproot as you do right now. You have these very efficient key spends, you have the Taproot path, you have MuSig, FROST, whatever. Only once the Bitcoin community hypothetically does this upgrade to disable key spends do you need to use these new script paths in the Merkle tree, in the Taproot tree. That would be one variant. Actually, a very important benefit of the Bitcoin system in this regard: other systems we look at, like, I don't know, TLS or whatever, a lot of the time want encryption, and people who are scared of quantum computers today are also worried about this attack called store-now-decrypt-later. So they are already migrating in terms of encryption; for example, post-quantum key encapsulation mechanisms, which help you encrypt your messages and your communications, are already getting deployed. The signatures, on the other hand, can wait a little longer, because for most use cases you care whether your signatures can be broken now; you don't care if someone sees, 10 years later, a public key that you don't use anymore.
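The commitment structure behind that upgrade path can be sketched with BIP341-style tagged hashes. Two caveats: the opcode name below is made up for illustration (no such opcode exists), and the sketch stops at computing the tweak, where real Taproot would do elliptic-curve point arithmetic on secp256k1:

```python
import hashlib

def tagged_hash(tag: str, data: bytes) -> bytes:
    """BIP341 tagged hash: SHA256(SHA256(tag) || SHA256(tag) || data)."""
    t = hashlib.sha256(tag.encode()).digest()
    return hashlib.sha256(t + t + data).digest()

# Hypothetical post-quantum leaf: a script checking a hash-based key
# with a made-up opcode name; simplified, omitting the script's
# compact-size length prefix that real TapLeaf hashing includes.
pq_pubkey = hashlib.sha256(b"hash-based public key").digest()
pq_leaf_script = b"OP_CHECKSIG_HBS" + pq_pubkey  # hypothetical opcode
LEAF_VERSION = 0xC0

leaf_hash = tagged_hash("TapLeaf", bytes([LEAF_VERSION]) + pq_leaf_script)
merkle_root = leaf_hash  # single-leaf tree keeps the sketch short

internal_key = bytes(32)  # placeholder x-only key; real wallets use secp256k1
tweak = tagged_hash("TapTweak", internal_key + merkle_root)
# Real Taproot: output_key = internal_key + tweak * G on secp256k1.
# The point: today's output already commits to the PQ spending path,
# so the leaf can be revealed and used after a future soft fork.
```

This is why the path is attractive: wallets can start committing to a post-quantum leaf now at no on-chain cost, and the consensus change (the new opcode, and eventually disabling key spends) can come later.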
And so with these script paths that you have described, we can migrate closer to when we see the actual threat, while the solution is already there. You just use the solution when it's actually needed. And this may be a dumb question, but Jonas, you mentioned MuSig and FROST. Is there a way to just leverage those on top of your private-public key pair to harden that private key even more against a quantum attack? Does that make sense? You mean... You just obscure the spending conditions, make it harder. Without introducing hash-based signatures or any post-quantum assumption? Yes. No. The problem is that whenever the adversary sees a public key, they can run their quantum computer and compute the private key corresponding to it. So MuSig or FROST or any obfuscation attempt like this will fail. Okay, that's good to know. Fascinating stuff. So what are the dangers if we move too fast on this? I mean, you mentioned them earlier, because that's what I worry about. The most disheartening thing about the conversation last week, and I think it's good that the conversation is happening, was the idea that the community of developers working on Bitcoin haven't been thinking about this. Obviously, you two spend a lot of time on your research, Blockstream does a ton of research, and there are plenty of others outside your organization working on this stuff. What I worry about most: I agree we should want Bitcoin to be as secure as possible, and if the threat of quantum computing, or even classical elliptic-curve-breaking computers, is going to manifest, we should prepare for it. But what could be just as dangerous is moving too fast, right? So we should try to have a level-headed discussion about what to do. And obviously you can't use the fear of moving too fast forever, because then you're complacent. There's this sort of tradeoff between urgency and complacency. How do you view it?
I think, first of all, there is a lot of value in the Bitcoin protocol being as simple as possible, because we want Bitcoin to be secure, and that requires a relatively simple protocol. Whenever we add something to Bitcoin, there is a risk. For adding a hash-based signature scheme to Bitcoin, I think that risk is manageable, but it will still require resources to maintain it, to make sure that all the versions are compatible with each other and there are no consensus-relevant issues, which is a problem specific to Bitcoin; it doesn't exist in non-blockchain, non-consensus systems. So there's a risk in that. And then, if you add something and no one uses it, that's just a lot of wasted effort. But looking at the discussion on the mailing list, and from people in person and even on Twitter, I don't really think it's realistic that we will have a post-quantum upgrade very soon in Bitcoin. That's at least my impression. And as you said, we want to explore our options in full. As we said before, the next topic is to look at lattice-based schemes: how they affect things, what the assumptions are, what the benefits are. Doing a proper comparison of the options gives us, I guess, the best outcome here. So rushing it like this, with fear in our eyes, is also not the best way to go. No. And again, who knows. That's the thing about this quantum risk: it's hard to know, it's a big unknown. It's almost like climate change. It's just around the corner, but that's been said for some time. And obviously there are some advancements, but again, as a layman, it's not easy to understand where the signal in these advancements is. You can talk to some quantum physicists, and they say, to your point, this doesn't scale. Literally, you need more physical qubits to create logical qubits, and if you need thousands of logical qubits to attack Bitcoin, that's an enormous number of physical qubits.
This is the way I understand it; I'm speaking out of my ass. Yeah, I mean, if I'm allowed to tell a personal anecdote: I studied cognitive science for my bachelor's, which is right at the intersection of AI and human intelligence, and for my master's I studied computer science with a focus on artificial intelligence. If I had told anyone back then that I believed we would have ChatGPT by 2022, they would have ridiculed me forever. At that time, a lot of people thought something like GPT-3 would never be possible; it's just so much intelligence coming from a machine. And look at where we are now. I actually thought, when I studied this, that this AI stuff was going nowhere, that there was not going to be a breakthrough; I focused all my time on Bitcoin back when I was a student. So I am just very skeptical of people... first of all, I want to say, of course, this analogy fails in many ways; quantum computing is very different from AI. But I am skeptical of people confidently claiming there's no clear path towards a cryptographically relevant quantum computer. Maybe there isn't one right now, but I think there could be one at any time. Yeah, and you have to prepare. If you don't prepare, and then suddenly you find out you were wrong? Yep. Yeah. That goes back to measuring and balancing complacency versus urgency. Because again, to my point, this is a discussion that's been taken seriously for many years now, and then there's a rush to fix it right now. It's like, whoa, hey, there are good conversations going on, there's good research being done. Let's make sure we get this right, because it's critically important. And on that note, I want to thank you, gentlemen, not only for joining me but for doing the research on this subject and coming to explain it to everybody. It's been incredibly illuminating for me.
And again, like I said earlier, a lot of people take for granted the technical details of the Bitcoin protocol. It just works for them at the end of the day, due to the work and research that individuals like yourselves and other developers have done. And it's pretty insane sci-fi tech once you look under the hood. Trying to upgrade that tech is not an easy feat, especially when you have a distributed protocol and you need coordination between all the individual actors within the system. Yes, but one thing I want to add is that what we're doing is also not rocket science, in the sense that it's impossible to understand or anything like that. There's this impression I see sometimes in Bitcoin that there is a closed group of, perhaps, Bitcoin Core people, high priests or whatever. If that were the case, it would be very bad, and I think everyone in the Bitcoin community agrees it would be very bad. We want to have a system based on merit, where there's no secret knowledge that needs to be obtained to be part of any sort of group. At least, this is the system we want to have, and we want the best people working on Bitcoin. And this was also our intention in writing this paper: we tried to give a good introduction to this hash-based solution, slowly building up from the very basics, from Lamport signatures, which we will probably not use, but which build the intuition. So go take a look at the paper; we introduce the material slowly, moving on step by step, showing what the solutions are, what the trade-offs are, and what can be done. And yeah, let's have a discussion. If I can read your research, have a conversation with you, and know enough to understand what's happening in the background, like signature verification and its effect on the protocol, anybody can do it, to say the least. Gentlemen, any parting notes, anything you want to leave the audience with before we wrap up here?
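The Lamport construction the paper builds up from is simple enough to sketch in full. A toy one-time signature over SHA-256: the secret key is 256 pairs of random values, the public key is their hashes, and signing reveals one value per message-digest bit. Reusing a key leaks secrets, which is exactly the limitation the paper's later constructions address:

```python
import hashlib
import secrets

def keygen():
    # 256 pairs of 32-byte secrets, one pair per digest bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def _bits(msg: bytes):
    d = hashlib.sha256(msg).digest()
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one secret from each pair, selected by the message bit.
    return [sk[i][b] for i, b in enumerate(_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    # Each revealed secret must hash to the committed public value.
    return all(hashlib.sha256(sig[i]).digest() == pk[i][b]
               for i, b in enumerate(_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"pay to address X")
assert verify(pk, b"pay to address X", sig)
assert not verify(pk, b"pay to address Y", sig)
```

Note that security rests only on the hash function: exactly the "conservative assumption" argument made throughout this conversation, since Bitcoin already depends on SHA-256.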
I think I would encourage further discussion, more technical discussion: reading into what we've done, doing your own work, making suggestions. And let's have it at a good pace, not rushing, but also not putting it away and saying this will never happen. I agree. I mean, there's still a lot of feedback that would be interesting for us, in particular from wallet developers. That is sort of an open question: maybe there are wallet developers, for example, who can deal with the state issue, and we describe a scheme that is very efficient if you can hold state securely. So that would be interesting feedback. Then there is the benchmarking question, especially with respect to signing on low-powered devices. I think those were the main open questions, and the big question is: what is the most interesting parameter set to choose from the set that we provide? And then there's also the question, I think some people would argue, of whether just using the standard versus our optimized scheme would be preferable, because then it would be supported by the secure element on your iPhone or whatever. So that is, I think, also one of the big open questions with respect to hash-based signatures specifically. Could you propose a new standard? I mean, we could propose one, but whether it would be accepted is the bigger question. We make our own standards, right? We have the BIPs process, and maybe we can get Apple to adopt our standards. That would, of course, be the most preferable path. Yeah. Well, gentlemen, thank you. This was, again, incredibly illuminating. Thank you for your work, and hopefully we can do this again, maybe when you finish your research on lattice-based signatures. Or maybe we won't have to wait that long. Thank you, and I hope you guys enjoy the end of the year. Thank you for joining me so close to Christmas and the holidays. This was awesome. Yeah. Thank you very much. It was fun.
Alright, peace and love, freaks. Take care.