"The compensation paid to validators derives from users’ transaction fees, which must therefore be sufficiently high to attract enough honest validators to keep the ledger running. So ledgers throttle transaction throughput by design, creating congestion and so increasing users’ willingness to pay fees to avoid it."
Hmm. It's certainly true that validators need to be compensated enough to keep the ledger running. Their payment does come from transaction fees, but it also comes in the form of newly created (mined) coins.
In a proof-of-work system (i.e. permissionless/trustless blockchains) sufficient incentive means roughly that the compensation (block reward) exceeds the cost of the electricity needed to mine the block.
It's not clear to me that getting the validator incentives right implies that a ledger would *have to* "throttle transaction throughput" by design.
From the BIS article:
"Achieving sufficiently high rewards requires the maximum number of transactions per block to be limited."
I'm not sure why this would necessarily have to be the case. As a validator, if you could include more transactions in your block, you'd get yourself more transaction fees. Would allowing unlimited transactions somehow cause validators to run away? I don't see why it would. Maybe I'm missing something?
Also from the BIS article, but unrelated:
"Smart contracts were made possible by the development of Ethereum, the first major blockchain that allowed for programmability."
This is not true. Bitcoin has smart contracts. A difference is that Ethereum allows you to write loops whereas Bitcoin script doesn't.
3. Bitcoin smart contracts. You say "a difference," but absence of loops, or Turing incompleteness more generally, is a huge difference not a trivial one. And at a high level "not programmable" seems like a not-inaccurate description.
1. Good point regarding mining vs. fees. Mining works value via dilution or seigniorage. So not exactly connected to congestion.
2. The big point is whether one could still pay validators if there were no congestion, as the BIS piece claims and which I accepted. The BIS graph is pretty convincing empirically, but I suppose a concrete refutation would take a bit more work.
Just like dealers wouldn't provide liquidity if liquidity were perfect/free, validators won't validate if throughput is infinite/free. That makes sense to me. But I don't see how this implies that congestion has to worsen as the usage scales up.
It might be true that blockchains have transaction limits, but I'm not convinced that this is a reasonable story as to why.
And it might be true that congestion is part of what leads to fragmentation, but people trying to make money by creating a new speculative instrument and hyping it up as a "better technology" might be more of the story here.
Maybe it's not as cut-and-dry as I had initially imagined. But:
If total network throughput it as a certain level, and you want to scale it up, by definition new validation resources will be needed, resources that are not already enticed by current validation rewards. So fees per unit of throughput have to rise, and I think it may be right that that means congestion has to rise.
If being constrained by "validation resources" were truly the explanation, then why would a new blockchain somehow have access to cheaper validation resources than the old blockchain? Do you have an intuition for why the two blockchains wouldn't compete for the same validation resources?
What *is* a validation resource, anyway? As far as I understand it, validation on the blockchain is so cheap as to be nearly free compared to the other costs involved. The resources (energy) are not going into validating transactions. They're going into establishing a distributed consensus over which block of *already validated* transactions should be added to the end of the chain.
My understanding is that it's all really just a lottery system. You buy lottery tickets with computing power (energy). The more energy you dump into the system, the more lottery tickets you get. You win if your block gets chosen. That's proof-of-work. And proof-of-work has nothing to do with validating transactions.
I skimmed the paper enough to know that I'm not interested in reading it further.
This line from the conclusion, in particular, stands out to me as a red flag:
"Bitcoin presents a computer science breakthrough"
"The compensation paid to validators derives from users’ transaction fees, which must therefore be sufficiently high to attract enough honest validators to keep the ledger running. So ledgers throttle transaction throughput by design, creating congestion and so increasing users’ willingness to pay fees to avoid it."
Hmm. It's certainly true that validators need to be compensated enough to keep the ledger running. Their payment does come from transaction fees, but it also comes in the form of newly created (mined) coins.
In a proof-of-work system (i.e. a permissionless/trustless blockchain), a sufficient incentive roughly means that the compensation (block reward) exceeds the cost of the electricity needed to mine the block.
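To make that incentive condition concrete, here's a back-of-the-envelope check. Every number in it (hash rate, ASIC efficiency, coin price, electricity price) is an assumption I'm plugging in for illustration, not data from the article:

```python
# Back-of-the-envelope check of the incentive condition above.
# Every number is an assumption plugged in for illustration, not data
# from the BIS article.

block_subsidy = 3.125       # newly minted coins per block (assumed)
fees_per_block = 0.15       # total transaction fees per block, in coins (assumed)
coin_price = 60_000         # assumed USD price per coin

network_hashrate = 600e18   # hashes per second across the network (assumed)
block_interval = 600        # seconds per block
joules_per_hash = 2e-11     # roughly 20 J/TH, a guess at ASIC efficiency
usd_per_kwh = 0.05          # assumed electricity price

expected_reward = (block_subsidy + fees_per_block) * coin_price
energy_kwh = network_hashrate * block_interval * joules_per_hash / 3.6e6
electricity_cost = energy_kwh * usd_per_kwh

print(f"expected reward per block  ≈ ${expected_reward:,.0f}")
print(f"electricity cost per block ≈ ${electricity_cost:,.0f}")
print("incentive holds" if expected_reward > electricity_cost else "incentive fails")
```

Note that the reward side is subsidy *plus* fees, which is the point above: fees don't have to carry the whole incentive on their own.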
It's not clear to me that getting the validator incentives right implies that a ledger would *have to* "throttle transaction throughput" by design.
From the BIS article:
"Achieving sufficiently high rewards requires the maximum number of transactions per block to be limited."
I'm not sure why this would necessarily have to be the case. As a validator, if you could include more transactions in your block, you'd get yourself more transaction fees. Would allowing unlimited transactions somehow cause validators to run away? I don't see why it would. Maybe I'm missing something?
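To pin down what I mean, here's a toy block-building sketch with a made-up mempool and made-up fee bids. Holding those bids fixed, a bigger cap never lowers a single miner's fee take, so the BIS argument has to be about how the bids themselves change once block space stops being scarce:

```python
# Toy block-building sketch: one miner greedily picks pending transactions
# by fee rate. The mempool and the fee bids are entirely made up.
import random

random.seed(0)
# hypothetical pending transactions: (size in bytes, fee in sats)
mempool = [(random.randint(200, 1_000), random.randint(500, 20_000)) for _ in range(5_000)]

def block_fee_revenue(mempool, cap_bytes):
    """Greedy fill: highest fee-per-byte first, until the cap is hit."""
    used, revenue = 0, 0
    for size, fee in sorted(mempool, key=lambda tx: tx[1] / tx[0], reverse=True):
        if used + size > cap_bytes:
            continue  # skip transactions that no longer fit
        used += size
        revenue += fee
    return revenue

# Holding the bids fixed, raising the cap never lowers the miner's fee take.
for cap in (1_000_000, 4_000_000, 100_000_000):   # 1 MB, 4 MB, effectively unlimited
    print(f"cap {cap:>11,} bytes -> fees {block_fee_revenue(mempool, cap):>11,} sats")
```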
Also from the BIS article, but unrelated:
"Smart contracts were made possible by the development of Ethereum, the first major blockchain that allowed for programmability."
This is not true. Bitcoin has smart contracts. A difference is that Ethereum allows you to write loops whereas Bitcoin script doesn't.
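To show what I mean by Bitcoin having smart contracts: here's a toy, loop-free stack program that mimics a Bitcoin-Script-style hash lock. The opcode names imitate Bitcoin Script, but this is a sketch of the idea in Python, not the real validation code:

```python
# A toy, loop-free "contract": a hash lock written as a straight-line stack
# program. Opcode names imitate Bitcoin Script, but this is a sketch of the
# idea, not the real interpreter.
import hashlib

def run(script, stack):
    """Evaluate a loop-free stack program: one pass, no jumps, no recursion."""
    for op in script:
        if op == "OP_SHA256":
            stack.append(hashlib.sha256(stack.pop()).digest())
        elif op == "OP_EQUAL":
            stack.append(stack.pop() == stack.pop())
        else:                       # anything else is data, pushed as-is
            stack.append(op)
    return bool(stack and stack[-1])

secret = b"correct horse battery staple"
locking_script = ["OP_SHA256", hashlib.sha256(secret).digest(), "OP_EQUAL"]

# Spending succeeds only if the right preimage is supplied.
print(run(locking_script, stack=[secret]))          # True
print(run(locking_script, stack=[b"wrong guess"]))  # False
```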
3. Bitcoin smart contracts. You say "a difference," but the absence of loops, or Turing incompleteness more generally, is a huge difference, not a trivial one. And at a high level, "not programmable" seems like a not-inaccurate description.
1. Good point regarding mining vs. fees. Mining rewards extract value via dilution, i.e. seigniorage, so they're not exactly connected to congestion.
2. The big point is whether one could still pay validators if there were no congestion; the BIS piece claims you couldn't, and I accepted that. The BIS graph is pretty convincing empirically, but I suppose a concrete refutation would take a bit more work.
Just like dealers wouldn't provide liquidity if liquidity were perfect/free, validators won't validate if throughput is infinite/free. That makes sense to me. But I don't see how this implies that congestion has to worsen as usage scales up.
It might be true that blockchains have transaction limits, but I'm not convinced this is a reasonable story for why they do.
And it might be true that congestion is part of what leads to fragmentation, but people trying to make money by creating a new speculative instrument and hyping it up as a "better technology" might be more of the story here.
Maybe it's not as cut-and-dry as I had initially imagined. But:
If total network throughput is at a certain level and you want to scale it up, then by definition new validation resources will be needed, resources that are not already enticed by the current validation rewards. So fees per unit of throughput have to rise, and I think it may be right that this means congestion has to rise.
This article https://academic.oup.com/restud/article/88/6/3011/6169547 seems to do the work of this piece of the story. It's an economics paper, but it's not terrible for that.
But I take the suggestion that this line of thinking needs some work!
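To put rough numbers on the fee part of that argument anyway, here's a sketch with invented figures: potential validators have different per-transaction costs, and capacity only comes online when the fee covers that cost, so clearing a higher throughput means paying the marginal, more expensive validator:

```python
# Sketch of the rising-fee argument with invented figures: validators have
# heterogeneous per-transaction costs, and capacity only comes online when
# the fee per transaction covers that cost.
import numpy as np

rng = np.random.default_rng(1)
costs = rng.uniform(0.01, 1.00, size=1_000)     # assumed $/tx for each potential validator
capacity = rng.uniform(50, 150, size=1_000)     # assumed tx/s each could handle

def fee_needed(target_throughput):
    """Smallest fee/tx at which enough capacity finds participation worthwhile."""
    order = np.argsort(costs)
    cum_capacity = np.cumsum(capacity[order])
    idx = np.searchsorted(cum_capacity, target_throughput)
    return costs[order][idx] if idx < len(costs) else float("inf")

for target in (10_000, 50_000, 90_000):         # target throughput in tx/s
    print(f"throughput {target:>6,} tx/s -> marginal fee ≈ ${fee_needed(target):.2f}/tx")
```

This only shows the fees-per-unit-of-throughput part; the step from rising fees to rising congestion is the piece that still needs the argument.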
If being constrained by "validation resources" were truly the explanation, then why would a new blockchain somehow have access to cheaper validation resources than the old blockchain? Do you have an intuition for why the two blockchains wouldn't compete for the same validation resources?
What *is* a validation resource, anyway? As far as I understand it, validation on the blockchain is so cheap as to be nearly free compared to the other costs involved. The resources (energy) are not going into validating transactions. They're going into establishing a distributed consensus over which block of *already validated* transactions should be added to the end of the chain.
My understanding is that it's all really just a lottery system. You buy lottery tickets with computing power (energy). The more energy you dump into the system, the more lottery tickets you get. You win if your block gets chosen. That's proof-of-work. And proof-of-work has nothing to do with validating transactions.
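Here's the lottery picture I have in mind, as a toy simulation with arbitrary hash-power shares. Each block is a weighted draw, and a miner's expected share of blocks is just its share of total hash power; none of the "work" involves re-checking the transactions:

```python
# Toy lottery view of proof-of-work: each unit of hash power is a ticket,
# and a miner's chance of winning a block is its share of total tickets.
# Hash-power shares here are arbitrary.
import random
from collections import Counter

random.seed(42)
hashpower = {"A": 50, "B": 30, "C": 20}
blocks = 100_000

# Each block is one weighted draw: more hash power, more tickets, more wins.
winners = Counter(random.choices(list(hashpower), weights=hashpower.values(), k=blocks))

for miner, power in hashpower.items():
    share = power / sum(hashpower.values())
    print(f"{miner}: {share:.0%} of hash power -> won {winners[miner] / blocks:.1%} of blocks")
```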
I skimmed the paper enough to know that I'm not interested in reading it further.
This line from the conclusion, in particular, stands out to me as a red flag:
"Bitcoin presents a computer science breakthrough"
Good luck.