> Another of Wright’s experts would testify on whether David Kleiman “had the requisite skills and experience to have written or significantly have contributed to the original Bitcoin software application released in 2009.”
Ira Kleiman's team should just expose Wright the same way. There's no way Wright could have coded bitcoin as he lacked the expertise, knowledge, and coding skill to do so.
edit: WHO THE FUCK DOWNVOTED ME. Fuck you, Craig Wright lover.
sidhujag4 - 5 years account age. 250-500 comment karma. 1 month ago
It’s not about BSV or BCH or even big blockers, because Bitcoin is what it is and cannot change through a consensus fork. If you read the LN paper, it clearly spells out the bandwidth requirements and resource consumption needed to get 7 billion people doing 2 transactions per year on-chain. What he is saying is that LN alone won’t be the solution because on-chain won’t support that; some more innovation is needed, hinting at possibly interoperability or some new math magic to compress information further.
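A quick back-of-envelope check of the numbers in that scenario (the ~250-byte average transaction size is my assumption; real averages vary):

```python
# 7 billion people making 2 on-chain transactions per year,
# one block every ~10 minutes: what throughput does that imply?
SECONDS_PER_YEAR = 365 * 24 * 3600
people = 7_000_000_000
tx_per_person_per_year = 2
avg_tx_bytes = 250  # rough assumption, not a measured figure

tx_per_second = people * tx_per_person_per_year / SECONDS_PER_YEAR
block_bytes = tx_per_second * 600 * avg_tx_bytes  # bytes per 10-minute block

print(f"{tx_per_second:.0f} tx/s")           # ~444 tx/s
print(f"{block_bytes / 1e6:.0f} MB blocks")  # ~67 MB blocks
```

Orders of magnitude above today's on-chain capacity, which is exactly the point being made.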
The message is bang on, 100%. His ideals match the original intention of Bitcoin. However, how we get there will be up to the imaginations of the engineers delivering decentralized solutions at scale, not the pony show of staking/validator/shard-based consensus mechanisms that cover up the dirt.
It's hilarious to me that so many people here are calling for block size increases when we literally had practically empty blocks every day for months prior to the halving. Of course right now it's a little congested; once difficulty decreases, that's going to clear up again and things will be fine.
The inelastic throughput is a problem, whether it matters right now or not. Either BTC is meant as a store of value and is relegated to large transactions and accounting between centralized parties, where low throughput is fine, or BTC is meant to be used as daily peer-to-peer money, in which case it sure as hell should be able to handle more transactions.
Imagine how pissed people would be if their Visa stopped working on Black Friday, or if they charged higher fees due to traffic.
Bitcoin has never "stopped working"; the fee just fluctuates. Believe it or not, when you pay with your Visa, that costs a fee as well; you're just paying it in other ways (or it gets deferred to the merchant).
As for practicality, Bitcoin will be both: a store of value and a secure way to pay large amounts on-chain, and a p2p money mostly used off-chain with ways to settle on-chain as well.
Bitcoin is still in its infancy, it is 11 years old. The internet was invented in 1983, so you can compare it with the internet in 1994.
Well, Bitcoin hasn't crashed or anything like that, but "stopped working" depends on your definition. If a "working" Bitcoin network is one that keeps the mempool empty and fees lower than, say, $0.50, then it has stopped working. If "working" only means that blocks get solved eventually, that's a low bar.
Off-chain is a whole other debate. There are really interesting things that can be done there, but fundamentally they do throw out the trustless peer-to-peer goals of Bitcoin. Whether that matters is up for debate, but going off-chain is a huge change.
Off-chain solutions like the Lightning Network are massively complex solutions to a very simple problem. If one of my backend servers is running out of resources in production, I throw another server at it to double capacity. I should then go back and optimize things, lowering memory usage and compute time, but you add capacity short-term so you have time to solve the problem properly. Off-chain solutions are more akin to trying to redesign the internet or custom-design a new chip architecture because my server is overloaded.
> but fundamentally they do throw out the trust less peer-to-peer goals of bitcoin
I think that's entirely untrue.
One can create off chain things that themselves don't have trustless peer to peer behaviour, like a digicash server... and get other advantages in exchange. But non-p2p and trusty things are just one option out of *many*.
And fortunately, all of the many options can exist at once and users can use whichever best meet their needs.
There are definitely cool things that can be done, and don't get me wrong I don't think all off-chain solutions are garbage. I am a bit concerned with the idea that the only way to make bitcoin scale is off-chain though.
It can add really novel features and platforms, but the main protocol really still should be able to stand on its own without it.
Your definition of "stopped working" depends entirely on your subjective definition of what a working Bitcoin means and what to expect of it. Which, in your case, is clearly a bogus expectation.
> If a "working" bitcoin network keeps the mempool empty
That's the opposite of a working Bitcoin. Mempools *must* be non-empty.
> fees lower than say $0.50
Who promised you that? Not Satoshi, for sure. You believed some random people who didn't themselves understand Bitcoin, and now you think the universe owes you cheap transactions and Bitcoin has to bend over backwards just to give them to you?
> but going off chain is a huge change
Bullshit, Satoshi already talked about payment channels and even implemented the necessary bits for them. Unfortunately there were bugs in those bits, which caused them to be disabled or ignored until proper fixes/replacements were invented and implemented.
> complex solutions to a very simple problem
Bullshit again. Nothing about a decentralized, incentive-balanced blockchain is simple.
> If one of my backend servers
A decentralized P2P system is not a backend server.
> but you add capacity short term
Not if that is too expensive or risks stopping the whole system.
> Off-chain solutions are more akin to trying to redesign the internet or custom-design a new chip architecture because my server is overloaded.
No, off-chain solutions are, in your bad analogy, more like actually smart things: caching, QoS and load balancing.
Man, it's always impressive how quickly people on rbtc and rbitcoin jump from conversation to aggressive, defensive argument. Let's try this one at a time.
> That's the opposite of a working bitcoin. Mempools *must* be non-empty.
Fair enough, though a bit nitpicky. The mempool is the short-term buffer that holds transactions until a block is available. The mempool staying under 1MB in size isn't a big deal, but once it starts filling up and falling behind, it's a scaling problem. Easy solutions are increasing the block size; longer-term solutions may be minimizing the memory size of each transaction.
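That buffer-until-a-block-is-available behaviour can be sketched as a toy model (my own simplification, not Bitcoin Core's actual block assembly, which evaluates feerate packages): transactions wait in a pool, and a block template greedily takes the highest-feerate ones until the size limit is hit.

```python
import heapq

def build_block(mempool, max_block_bytes):
    """Greedy toy block template: take the highest feerate (sat/byte) first.
    `mempool` is a list of (feerate, size_bytes, txid) tuples."""
    heap = [(-feerate, size, txid) for feerate, size, txid in mempool]
    heapq.heapify(heap)  # min-heap on negated feerate = max-heap on feerate
    block, used = [], 0
    while heap:
        _neg_rate, size, txid = heapq.heappop(heap)
        if used + size <= max_block_bytes:
            block.append(txid)
            used += size
    return block, used

# Hypothetical transactions: (feerate, size, txid)
mempool = [(50, 300, "a"), (5, 400, "b"), (20, 250, "c"), (1, 600, "d")]
block, used = build_block(mempool, max_block_bytes=1000)
print(block, used)  # ['a', 'c', 'b'] 950 -- "d" doesn't fit and waits
```

Anything that doesn't fit simply waits in the pool for a later block, which is the "falling behind" scenario described above.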
> Who promised you that? Not Satoshi, for sure. You believed some random people who didn't themselves understand Bitcoin, and now you think the universe owes you cheap transactions and Bitcoin has to bend over backwards just to give them to you?
Unnecessarily personal, but yes, the idea that transaction fees stay low is a given in the goals of Bitcoin. Who would reasonably propose a peer-to-peer money that costs, say, $8 per transaction?
> Bullshit, Satoshi already talked about payment channels and even implemented the necessary bits for them. Unfortunately there were bugs in those bits, which caused them to be disabled or ignored until proper fixes/replacements were invented and implemented.
This is news to me; I was not aware of any plans for off-chain channels prior to the Lightning Network timeframe. Regardless, this is not a black-or-white debate, and I see merit in LN. That said, if it's worth it, then there's no need to arbitrarily keep the block size hamstrung. If LN delivers on zero-fee transactions and almost-instant confirmation, it will be adopted.
> Bullshit again. Nothing about a decentralized, incentive-balanced blockchain is simple.
What? I'm not saying the Bitcoin protocol is simple. I'm saying the short-term fix of increasing the block size is a much easier fix to the problem than building LN, which has been in development for years and has plenty of problems to solve before it's ready for broad use.
> A decentralized P2P system is not a backend server
Sure, but it's the same problem. The transaction throughput is bottlenecked by the protocol's block size. There is no technical reason for the hard cap or why it can't be increased, even if only to buy time for more robust solutions.
> Not if that is too expensive or risks stopping the whole system.
What are the risks with raising block size? I'm only aware of issues related to it being a hard fork, is there something else though?
> No, off-chain solutions are, in your bad analogy, more like actually smart things: caching, QoS and load balancing.
Sorry, but this is completely wrong. You cache data that hasn't changed so you don't need to spend more resources reading from the single source of truth (usually your database). Off-chain strategies purposely change data outside the single source of truth (the blockchain); that isn't a cache. If you want a data analogy, it's more akin to running periodic backups: your database is left with stale data on purpose to avoid scaling issues, and you occasionally take the time to write fresh data back to the DB. The risk is that any data not yet saved could be lost or corrupted, and if it's needed elsewhere, it may not be available.
> aggressive, defensive
I don't see where I am aggressive. Defensive is just a requirement, as rbtc and co are deliberately infecting newcomers with disinformation and completely wrong explanations of what Bitcoin is and how it works. That nonsense then ends up being repeated over and over again here.
What is that arbitrary cutoff point that you picked there? There is nothing related to 1MB in current bitcoin.
> but once it starts filling up and falling behind it's a scaling problem.
Nope. The mempool needs several times the block size of backlog during normal operation, so that even when 3 or 4 blocks are found in quick succession, there are always enough fees to fill those blocks. If blocks are not always full, security suffers.
> Easy solutions are increasing block size
Maybe you've missed the last 8 years of discussions, but there's absolutely nothing easy about that. It's also a hard fork, which by definition is hard (even though that's not what the "hard" in the name stands for).
> fees stay low is a given in the goals of bitcoin ... $8
Nope. Check the white paper. Absolutely not a goal, and certainly impossible to put a price tag on how "cheap". The goal is a working decentralized system; everything else comes second to that, at best.
> I'm saying the short term fix of increasing block size is a much easier fix to the problem than building LN.
Which is untrue, other than the block size increase that already happened several years ago.
Plenty of shitcoins have already proven this too, BTW: all big-block forks are broken in many ways, including security, as well as being simply dead. Not only is it obvious when you think theoretically about it for a minute, it's also observable in practice.
> This is news to me
I think they were called payment channels, or maybe that's just the name I've heard Peter Todd use for it later on; not sure anymore. Either way: transactions have a version number in them that Satoshi intended to allow newer-version transactions to "overwrite" older ones. Only the last version would settle on the blockchain. From very far away, that's exactly the idea of LN, but Satoshi's implementation was susceptible to DoS attacks and was generally not secure (a cheating miner could mine a non-latest version of a transaction).
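The disabled Satoshi-era mechanism, as described above, can be sketched roughly like this (a toy model based on that description, not the actual historical code):

```python
def replace_by_sequence(pool, tx):
    """Toy model of Satoshi-era transaction replacement: keep only the
    version of an unconfirmed transaction (spending the same coins) that
    has the highest sequence number.
    NOTE: this relied on miners honestly preferring the latest version,
    which is exactly why it was insecure and got disabled."""
    key = tx["spends"]  # identifies which coin(s) the tx spends
    current = pool.get(key)
    if current is None or tx["sequence"] > current["sequence"]:
        pool[key] = tx
    return pool

pool = {}
# Version 0: pay everything to alice.
replace_by_sequence(pool, {"spends": "utxo:1", "sequence": 0,
                           "pay": {"alice": 10}})
# Version 1 "overwrites" it off-chain with an updated split.
replace_by_sequence(pool, {"spends": "utxo:1", "sequence": 1,
                           "pay": {"alice": 7, "bob": 3}})
print(pool["utxo:1"]["pay"])  # {'alice': 7, 'bob': 3} -- latest version wins
```

A dishonest miner could simply mine the sequence-0 version anyway, which is the security hole mentioned above.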
> That said, if it's worth it then there's no need to arbitrarily keep the block size hamstrung.
Nobody is doing that.
> If LN delivers on zero-fee transactions
It won't, and nothing will be zero-fee. Again: that is not a promise that needs to be delivered. You were lied to if you believe that promise, and you should stop listening entirely to whoever promised you that. Same for the idea that a "block size increase is easy". You're just wasting your time with that line of thought, which is blocking you from discovering many more exciting rabbit holes that Bitcoin has to offer.
> and almost instant confirmation
It already does.
> The transaction throughput is bottlenecked by the protocol's block size.
There is no point in putting a bigger hard disk in your backend server if the bottleneck is somewhere else. Similarly, the block size cannot be increased without first taking out other bottlenecks.
> There is no technical reason for the hard cap or why it can't be increased, even if only to buy time for more robust solutions.
Yes, there is. And time was already bought by a block size increase a few years ago. This nonsense is old, debunked bullshit; you're not saying anything new.
> What are the risks with raising block size? I'm only aware of issues related to it being a hard fork, is there something else though?
A good question is always better than throwing out false assumptions as truth.
Increased block size means increased cost of running a full node, which means fewer people can or will do that, which means centralization. Bitcoin without decentralization is just a crappy version of Paypal.
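The storage component of that cost is easy to put rough numbers on (back-of-envelope only; bandwidth and initial-sync time grow the same way):

```python
# Yearly chain growth at a given block size, assuming one block
# every ~10 minutes. Figures are illustrative, not measured.
BLOCKS_PER_YEAR = 6 * 24 * 365  # = 52,560

for block_mb in (2, 8, 32, 128):
    growth_tb = block_mb * BLOCKS_PER_YEAR / 1e6
    print(f"{block_mb:>4} MB blocks -> ~{growth_tb:.2f} TB of chain growth per year")
# 2 MB -> ~0.11 TB/yr ... 128 MB -> ~6.73 TB/yr
```

Storage is only one axis; every full node must also download, verify and index all of it, so the cost of staying a first-class participant scales with block size.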
> You cache data that hasn't changed so you don't need to spend higher resources to read from the single source of truth (usually your database).
There are many different caches in a PC, not just a web cache. LN can be viewed as literally a form of (write) caching: some blob of data changes many times, but only gets written to the permanent record once, at the end.
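To make that write-caching view concrete, here is a toy sketch (my own illustration, not an actual LN implementation): balances change many times off-ledger, and only the net result is written to the permanent record once.

```python
class WriteBackChannel:
    """Toy analogy: balances update off-ledger many times,
    but only the final state is written to the 'ledger' once."""

    def __init__(self, ledger, alice, bob):
        self.ledger = ledger                  # the 'single source of truth'
        self.balances = {"alice": alice, "bob": bob}
        self.updates = 0

    def pay(self, frm, to, amount):
        assert self.balances[frm] >= amount, "insufficient channel balance"
        self.balances[frm] -= amount
        self.balances[to] += amount
        self.updates += 1                     # cheap, off-ledger

    def settle(self):
        self.ledger.update(self.balances)     # one expensive write

ledger = {}
ch = WriteBackChannel(ledger, alice=1000, bob=0)
for _ in range(600):
    ch.pay("alice", "bob", 1)  # 600 cheap off-ledger updates
ch.settle()                    # a single on-ledger write
print(ledger, ch.updates)      # {'alice': 400, 'bob': 600} 600
```

The caveat raised earlier still applies, of course: anything not yet settled exists only off-ledger.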
Besides, you're attacking an analogy now, one which I already said was bad.
> Off-chain strategies are purposely changing data outside the single source of truth (the blockchain), it isn't a cache.
That's exactly what a disk write cache does as well.
> write fresh data back to the DB
Except the blockchain is an immutable database, you only get one chance. So best to wait as long as possible.
Well we definitely disagree on a few things but I do appreciate all the time you put into having a conversation.
You're right, I misspoke with the 1MB limit. I thought the increase to 2MB was completely cancelled after the SegWit drama; apparently it's more or less a hard cap at 4MB, but blocks generally fill at around 2MB.
I still don't agree with the idea that blocks always need to be full, or that the mempool should have multiple blocks' worth of transactions pending, but that may not matter much. Personally, I think the block size should be allowed to fluctuate similarly to how difficulty fluctuates; it hurts the system to have transactions that get stuck for days or fees that get bid way up.
I'm also not sure how a hard fork is technically hard. Contentious, sure, but it's really just a software patch that must be rolled out to a majority of miners. It has caused plenty of drama over the years, but there's no way to avoid that in a decentralized system.
With regard to full node sizes, I'm not sure how block size guards against that. A block doesn't need to take up the max space on disk; the amount of space required will depend on the number and size of transactions stored. You could set a max block size of 1TB, but if there's only 2MB of transactions, that's all it needs on disk. (Don't misconstrue that as a recommendation to raise the block size to some ridiculous level; I have seen people say that, and it's a terrible idea.)
All that said, I do see real merit in LN and similar ideas. It's a really clever idea and can have great uses, but I have a lot of concerns for Bitcoin if the only way to mass adoption is by going off-chain. Inevitably, off-chain solutions will be centralized, in my opinion, even if only because a company builds a better mousetrap that breaks into mass adoption.
> Well we definitely disagree on a few things but I do appreciate all the time you put into having a conversation.
Awesome. You're welcome.
> I still don't agree with the idea that blocks always need to be full, or that the mempool should have multiple blocks' worth of transactions pending, but that may not matter much.
The point is: fees need to be paying for mining. Ignore the current subsidy: bitcoin needs to run for at least another 100 years, otherwise you should consider it already failed. Thus there needs to be a healthy supply/demand market for block space. Supply is basically fixed (although irregular) so demand fully determines the price. Whenever the mempool is empty that means that people can bid the absolute minimum fee resulting in low security: a miner can earn more money trying to re-org the last block that had some fees rather than mining a near empty block with no fees. Or they simply turn off their mining machines until enough fees aggregate in the mempool. All that is already pretty bad, but it also means that a real 51% attack would become significantly cheaper.
So there must be enough backlog in the mempool to fill several blocks with enough fee paying transactions.
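The incentive argument above can be put in toy numbers (values are illustrative only, not real data):

```python
# The thought experiment ignores the block subsidy, as suggested above:
# in the long run, fees alone must pay for mining.
subsidy = 0.0
last_block_fees = 0.5   # BTC of fees collected in the most recent block
mempool_fees = 0.01     # almost nothing left in an empty-ish mempool

honest_reward = subsidy + mempool_fees
# Re-orging means re-mining the last block (stealing its fees) and then
# extending past it -- two blocks of work in this simplified picture.
reorg_reward_per_block = (subsidy * 2 + last_block_fees + mempool_fees) / 2

print(honest_reward, reorg_reward_per_block)
if reorg_reward_per_block > honest_reward:
    print("with an empty mempool, re-orging pays better than honest mining")
```

That is the "low security" failure mode described above: without a fee backlog, extending the chain honestly is the less profitable strategy.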
> I'm also not sure how a hard fork is technically hard.
The basic change may be very simple, but getting consensus is hard, and then coordinating and signaling the switchover is extremely hard and basically guaranteed to result in at least some split, where some stubborn people, as well as lazy people who didn't update their software, stay on the old chain. I would be surprised if that didn't take multiple years to roll out. And then there's a known wishlist of bugs and annoyances that can only be fixed through a hard fork, so it would be a shame not to also fix all of those in the same hard fork.
> it hurts the system to have transactions that get stuck for days
Not really. It's just the way people need to understand the system works: manage their priorities and expectations correctly, pay for whatever guarantee they want (10 minutes, one hour, 6 hours, a day, a week, a month...), and, while waiting, keep updating the transaction to do more batching (more efficient size, thus lower average fee per UTXO) and use RBF and CPFP to upgrade to a faster class.
That will have to happen no matter what the block size ends up being (for reasonable block sizes). Extreme block sizes just break Bitcoin through the other problems anyway, so those are irrelevant to even consider: too large means no fees and prohibitive cost to run a full node.
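The fee bumping mentioned above (RBF and CPFP) is, at bottom, feerate arithmetic. A minimal sketch of the CPFP case, where a child transaction's fee subsidizes its stuck parent (numbers are made up for illustration):

```python
def package_feerate(parent_fee, parent_vsize, child_fee, child_vsize):
    """Miners can evaluate parent+child as a package: the child's high fee
    lifts the effective feerate (sat/vB) of the stuck parent."""
    return (parent_fee + child_fee) / (parent_vsize + child_vsize)

# A 200 vB parent paying only 200 sat (1 sat/vB) is stuck.
# Spending its output with a 150 vB child paying 6,800 sat
# bumps the whole package to 20 sat/vB.
rate = package_feerate(parent_fee=200, parent_vsize=200,
                       child_fee=6800, child_vsize=150)
print(rate)  # 20.0
```

RBF works from the other direction: the sender rebroadcasts a replacement transaction that pays a higher fee outright instead of attaching a child.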
> the amount of space required will be dependent on the number/size of transactions stored
That's a very old argument and easily debunked: infinite size means a .000...001 fee, which means everyone can back up their porn collection onto the blockchain and there would still be no security. Blocks not only must always be full for Bitcoin to work, they *will* always be full, because there's infinite demand for high-grade distributed backup capacity. The only question is to find the sweet spot where the aggregate fee is high enough while still allowing full nodes to stay decentralized.
IOW: the fee in Bitcoin is also an anti-spam mechanism: too cheap and it will just fill with spam.
> Don't misconstrue that as a recommendation to raise the block size to some ridiculous level
Looks like your level of understanding is already far past the level where you would recommend that. :) But don't be mistaken: there are many people who have said that, and a lot who still do. Gavin Andresen was pushing for 8GB if I'm not mistaken. He went as far as claiming that he did scientific research by testing it on his laptop and it worked, so it was fine. We're living to this day with idiots and scammers who still cite that bullshit (same as with his testimony that Craig Wright was Satoshi). This is why it's important to nip new and upcoming bullshit in the bud as soon as possible. Every day disinformation lives, it spreads wider, and then it takes 10x as many days to get rid of again.
> but I have a lot of concerns for bitcoin
It's definitely good to have concerns. A lot about bitcoin is still not fixed or even unknown. It's still beta and will be for another decade or more. It can still fail completely. It can still go to 0.
> only way for mass adoption is by going off chain.
That depends not only on how Bitcoin develops and what the capacity ends up being (including efficiency improvements and block size increases) but also on what your definition of mass adoption is. On-chain is guaranteed never to be used for daily cups of coffee. And why would it be? Why would the whole world verify 7 billion cup-of-coffee transactions every day? Bitcoin will have to find some equilibrium between cups of coffee and major international settlements on the order of $1B transactions between central banks and multinational corporations. I hope it will be somewhere on the lower end of that scale, but there are just too many variables, and that's too far in the future to worry too much about right now.
Either way, other use cases can choose from a wide range of off-chain solutions, from LN to Liquid to statechains to discreet log contracts to others, and even to more trusted ones. All of which wouldn't even be possible without Bitcoin existing. And all of which are in many respects much better than current banks, credit cards and money transmitters (though those too will probably find their place in this picture and still exist in some form, themselves in turn settling on Bitcoin or one of the off-chain solutions).
> Inevitably off-chain solutions will be centralized in my opinion
Many are already not centralized (LN). Either way: there are many shades of gray that can be a fine trade off depending on the use case. Many more shades of gray that didn't and can't even exist without bitcoin. So it's always a win, even if not completely 100%.
That issue is actually much harder to fix than bitcoin's congestion. Again, the water will never ever run out, it will just cost you a little bit more during a drought, if you can't wait 'til 9pm. To stay within your rather labored analogy.
There is a problem. It's evident now and it could be more evident in the future. They have the ability to fix it.
"Yea, my car is usually fine. The engine got a little hot and now it's on fire but that'll stop after it finishes burning."
Jk with that last analogy ;)
The internet was invented in 1983, so think of Bitcoin right now as the internet in 1994. The same main protocols are still used to navigate the world wide web, but solutions have been implemented to improve scalability and ease of use. The same will happen to Bitcoin, to the point where the average user might not even consciously notice they're using it, or whether they're using it off- or on-chain.
There are always obstacles; the smartest solutions are just usually not the simplest.