= 144. So does that mean you have to download 144gb of data a day?
Yes, by my understanding, if all blocks are 100% full. But we are a looooong way from that.
Doesn’t that totally destroy the ability for anyone to run a node?
When? And where?
Well, even spv nodes need to download recent blocks. They solve the storage requirement, but you can never get around the data requirement. So if in the future it takes 144 GB of data per day to run a node, that prices a ton of people out of the running, assuming data costs stay the same. Even myself: we only get 1.2 TB of download per month, and this alone would be 3 TB+.
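For reference, the arithmetic behind those numbers, assuming one block every 10 minutes (the 1 GB block size is a hypothetical, not a prediction):

```python
# Back-of-the-envelope bandwidth math for a node when every block is full.
BLOCKS_PER_DAY = 24 * 60 // 10  # one block per 10 minutes = 144 blocks/day

def daily_gb(block_size_mb: float) -> float:
    """GB a node must download per day if all blocks are 100% full."""
    return BLOCKS_PER_DAY * block_size_mb / 1000

for size_mb in (32, 1000):  # today's 32 MB blocks vs hypothetical 1 GB blocks
    per_day = daily_gb(size_mb)
    print(f"{size_mb} MB blocks: {per_day:.1f} GB/day, "
          f"{per_day * 30 / 1000:.2f} TB per 30-day month")
```

At 1 GB blocks that is 144 GB/day, about 4.3 TB per month, so it does blow past a 1.2 TB monthly cap; 32 MB blocks come to under 0.14 TB per month.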
The fallacy in your thinking is that you are applying today's limits to tomorrow's problems (where "tomorrow" is really years from now). Yes, with today's limits it would be a challenge to run 1 GB blocks, but 32 MB not so much. Check out https://wisevoter.com/country-rankings/internet-speed-by-country/ for average internet speeds by country in 2023. And of course, in each country people can get access to above-average service if they want to. So 32 MB blocks are doable today.
also this is wrong - i think you're confusing spv with pruning. pruned nodes do need to download all the txs anyway, spv does not.
Wait so they don’t even download full blocks?
spv clients don't download full blocks, that's the whole point of spv
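To make concrete what an SPV client does instead: it keeps only the 80-byte block headers and checks that a transaction is in a block by hashing a merkle branch (supplied by a peer) up to the header's merkle root. A minimal sketch, with helper names of my own choosing:

```python
import hashlib

def dhash(b: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def verify_merkle_branch(txid: bytes, branch: list, index: int,
                         merkle_root: bytes) -> bool:
    """Walk a merkle branch from a txid up to the root.
    `index` is the transaction's position in the block; its low bit at
    each level says whether the sibling hash goes on the left or right."""
    h = txid
    for sibling in branch:
        if index & 1:                 # our node is the right child
            h = dhash(sibling + h)
        else:                         # our node is the left child
            h = dhash(h + sibling)
        index >>= 1
    return h == merkle_root

# Toy two-transaction block: root = dhash(a + b)
a, b = dhash(b"tx-a"), dhash(b"tx-b")
root = dhash(a + b)
print(verify_merkle_branch(a, [b], 0, root))  # True
```

So the client downloads headers plus a handful of 32-byte hashes per transaction it cares about, not full blocks.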
So then they do no validating for the network?
If users are creating more than 144 GB of traffic each day, then if you want to run a node you will need to download all that data. And in that case the network will be broken, because users will experience slow, unreliable service, and the network will need a maximum block size several times larger. Or, put another way: if the maximum block size should be 1 GB, then each node will have to download much less, perhaps 50 GB of data, because most blocks should be far from full.

Where I live in the rural US, I recently acquired a new computer for $800. I ran some network speed tests, which showed that I could download and upload more than 100 MB per second, so downloading 144 GB would take less than half an hour. Indeed, my new computer downloaded, synced and indexed the entire BCH network from scratch in under three hours using the latest BCHN software on Ubuntu 23.04.

What about uploading? Except for mining nodes that generate new blocks and new nodes just getting started, each node on average needs to upload as much data as it downloads. However, there are leeching nodes that don't upload, perhaps victims of obsolete network suppliers. Because I have lots of unused bandwidth, I have been uploading an average of 100 GB per day to help these people. Bandwidth is not a problem, even in rural areas. Nor is availability of hardware. The limit is availability of efficient multi-threaded node software.
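The timing claim is easy to check; a quick sketch using the 100 MB/s figure measured above:

```python
# How long a full day's worth of 1 GB blocks (144 GB) takes to download
# at a given link speed. Figures are from the message above, not benchmarks.
def download_minutes(total_gb: float, speed_mb_per_s: float) -> float:
    return total_gb * 1000 / speed_mb_per_s / 60

print(download_minutes(144, 100))  # 24.0 minutes, well under half an hour
```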
What’s your per month data plan? Like how much data can you use?
yes, but you don't have to store it: you validate it, update the utxo state, then throw it away
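A rough sketch of that pruned-node flow, with entirely hypothetical data structures (not any real node's API): the block is fully validated and applied to the UTXO set, and only the UTXO set is kept.

```python
# Pruned-node block processing (sketch): validate, update utxo state,
# throw the block away. Coinbase and script checks omitted for brevity.
def process_block(block: dict, utxo_set: dict) -> None:
    for tx in block["txs"]:
        # Validate: every input must spend an existing unspent output.
        for outpoint in tx["inputs"]:
            if outpoint not in utxo_set:
                raise ValueError(f"spends unknown output {outpoint}")
        # Update utxo state: remove spent outputs, add new ones.
        for outpoint in tx["inputs"]:
            del utxo_set[outpoint]
        for i, value in enumerate(tx["outputs"]):
            utxo_set[(tx["txid"], i)] = value
    # Nothing else is retained; the raw block can now be discarded.

utxo = {("genesis", 0): 50}
block = {"txs": [{"txid": "t1", "inputs": [("genesis", 0)], "outputs": [30, 20]}]}
process_block(block, utxo)
print(utxo)  # only the updated utxo set survives
```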
But if everybody throws it away, how do you sync a new node?
utxo commitments will make it so that if you've been offline for a long time, you'll only need the last 10k blocks or so + a utxo snapshot, and you're up and running again
utxo commitments + utxo snapshot
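A toy illustration of the idea, with hypothetical function names; real proposals use incremental commitments (merkle trees or ECMH multisets) rather than a flat hash so miners can update them cheaply, but the trust model is the same: the snapshot's hash is committed in the chain, so the snapshot itself can come from any untrusted peer.

```python
import hashlib, json

def utxo_commitment(utxo_set: dict) -> str:
    """Hypothetical flat-hash commitment over a canonically
    serialized utxo set."""
    canonical = json.dumps(sorted(utxo_set.items()), separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

def bootstrap_from_snapshot(snapshot: dict, committed_hash: str) -> dict:
    """A new node downloads a snapshot from any peer and checks it
    against the commitment mined into the chain; if it matches, the
    node only needs to validate recent blocks from there."""
    if utxo_commitment(snapshot) != committed_hash:
        raise ValueError("snapshot does not match committed hash")
    return snapshot

snap = {("t1", 0): 30, ("t1", 1): 20}
h = utxo_commitment(snap)
bootstrap_from_snapshot(dict(snap), h)  # accepted: hashes match
```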