Odalv
Legendary
Offline
Activity: 1414
Merit: 1000
|
 |
October 04, 2017, 09:39:34 PM |
|
You don't understand what you're talking about, do you? The bolded part above and the "small" difference remark are indicative that you're clueless.
Do not bother.
|
|
|
|
|
jbreher
Legendary
Offline
Activity: 3122
Merit: 1767
lose: unfind ... loose: untight
|
 |
October 04, 2017, 09:42:49 PM |
|
I think the declining BTC value of BCH indicates the market is pricing in the increasing odds of 2MB Bitcoin blocks becoming reality.
Most likely. There are two separate beliefs that could lead to this: 1) the belief that 2MB is good enough, thereby reducing the advantage of 8MB; and 2) the belief that the sum of the value of S1X and the value of S2X just after the split (t = t0+) will be greater than the value of the single pre-fork coin at t = t0- (much like the BCH split previously resulted in a larger aggregate value).
I think the declining BTC value of BCH (I suppose that's the altcoin you meant) indicates that a certain gang thinks their money is best burned supporting the next attack, so BCH purchases are being phased out.
I doubt it, but I must admit it is plausible.
|
|
|
|
|
Odalv
Legendary
Offline
Activity: 1414
Merit: 1000
|
 |
October 04, 2017, 09:43:53 PM |
|
Mhhh, so I think the best way to split your B2X coins would be to create a transaction very early after the split to an address which you control. It will probably be included in a >1 MB block on the B2X chain, thus making it incompatible with the real chain. Is my thinking correct? Edit: Nvm, I think I am not correct. The transactions after that could still be replayed on both chains.
With a 0 BTC fee. You will then have to wait for it to confirm on B2X but not on BTC (after that you make a new transaction and add a fee on the BTC side).
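A toy model of that fee-differential splitting idea. All names and mempool policies here are made up for illustration; this is not wallet code and makes no claim about how real BTC or B2X miners actually treat zero-fee transactions.

```python
# Toy model of the fee-differential splitting idea discussed above.
# Policies and names are hypothetical, purely for illustration.

from dataclasses import dataclass, field

@dataclass(frozen=True)
class Tx:
    txid: str
    spends: str      # id of the output (UTXO) this tx consumes
    creates: str     # id of the output it creates
    fee: int         # satoshis

@dataclass
class Chain:
    name: str
    min_fee: int                          # toy mempool/mining policy
    utxos: set = field(default_factory=set)
    confirmed: list = field(default_factory=list)

    def try_confirm(self, tx: Tx) -> bool:
        # A tx confirms only if its input still exists on this chain
        # and it meets the chain's (toy) minimum-fee policy.
        if tx.spends in self.utxos and tx.fee >= self.min_fee:
            self.utxos.remove(tx.spends)
            self.utxos.add(tx.creates)
            self.confirmed.append(tx.txid)
            return True
        return False

# Both chains start from the same pre-fork UTXO set (same keys, same coins).
btc = Chain("BTC", min_fee=1000, utxos={"pre-fork-coin"})
b2x = Chain("B2X", min_fee=0,    utxos={"pre-fork-coin"})

# Step 1: broadcast a zero-fee self-send. Under this toy policy it
# confirms on B2X only, while BTC miners ignore it.
split_tx = Tx("split", spends="pre-fork-coin", creates="b2x-only-coin", fee=0)
print(btc.try_confirm(split_tx), b2x.try_confirm(split_tx))   # False True

# Step 2: spend the same pre-fork coin again, this time with a fee, so it
# confirms on BTC. It conflicts with split_tx, so it cannot be replayed on
# B2X (the input is already gone there) -- the coins are now split.
btc_tx = Tx("btc-spend", spends="pre-fork-coin", creates="btc-only-coin", fee=2000)
print(btc.try_confirm(btc_tx), b2x.try_confirm(btc_tx))       # True False
```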
|
|
|
|
|
becoin
Legendary
Offline
Activity: 3431
Merit: 1233
|
 |
October 04, 2017, 09:52:01 PM |
|
What a boogie woogie without replay protection..  The lack of replay protection would be a major problem for the B2X network, not for BTC!
|
|
|
|
|
elite3000
Legendary
Offline
Activity: 1073
Merit: 1000
|
 |
October 04, 2017, 10:01:22 PM |
|
I think the declining BTC value of BCC indicates the market is pricing in the increasing odds of 2MB Bitcoin blocks becoming reality.
I think the declining BTC value of BCH (I suppose that's the altcoin you meant) indicates that a certain gang thinks their money is best burned supporting the next attack, so BCH purchases are being phased out.
Yep. Pretty much sums it up. RIP BCH. Hello B2X. Next free money drop, please!
I'm not sure it is "Next free money drop, please!". Your BTC transaction will be replayed on BTC2X.
Why should I care what is happening on the B2X network or on any other Bitcoin [put-some-extra-word-here] network that is trying to mimic the Bitcoin network? I simply take what they give me and dump it for bitcoins. Case solved.
They share the same keys, the same addresses. If you dump on B2X then they dump your BTC (it is the same network; the difference is small -> block size). There is no replay protection: 1) if you dump BTC then your transaction will appear on BTC2X and you lose your BTC2X; 2) if you dump BTC2X then there is a chance that your dump will be replayed on BTC.
Then is it even worth the risk of claiming the forkcoins? Weren't the chains supposed to be separated after the fork day? This is a bit confusing.
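A minimal, purely conceptual sketch of why that risk exists: without replay protection both networks run identical validity checks, so the very same signed bytes are acceptable on either chain. The sign/verify helpers below are stand-ins, not real Bitcoin serialization or signing.

```python
# Conceptual illustration only: same keys, same transaction format,
# same validation rules -> the same signed tx is valid on both chains.

import hashlib

def sign(tx_bytes: bytes, key: bytes) -> bytes:
    # Stand-in for an ECDSA signature: a deterministic function of (tx, key).
    return hashlib.sha256(key + tx_bytes).digest()

def verify(tx_bytes: bytes, sig: bytes, key: bytes) -> bool:
    return sig == hashlib.sha256(key + tx_bytes).digest()

key = b"same-private-key-on-both-chains"
raw_tx = b"spend pre-fork UTXO -> exchange deposit address"
sig = sign(raw_tx, key)

# BTC and B2X nodes run identical checks on identical data, so anyone who
# sees the broadcast can relay it to the other network as well.
print("valid on BTC:", verify(raw_tx, sig, key))
print("valid on B2X:", verify(raw_tx, sig, key))
```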
|
|
|
|
|
Meuh6879
Legendary
Offline
Activity: 1512
Merit: 1013
|
 |
October 04, 2017, 10:03:01 PM |
|
Like I said:
- Bitcoin XT (tried to enforce 8 MB)
- Bitcoin Classic (tried to enforce 8 MB & XTHIN)
- Bitcoin BU (tried to enforce unlimited blocks & XTHIN)
- Bitcoin NYA (tries to enforce 8 MB of weight ... the SegWit 2 MB block size)
We don't need replay protection ... because they have already been trying for a very long time. But without replay protection, no airdrop ...
|
|
|
|
|
jbreher
Legendary
Offline
Activity: 3122
Merit: 1767
lose: unfind ... loose: untight
|
 |
October 04, 2017, 10:03:25 PM Last edit: October 04, 2017, 10:18:52 PM by jbreher |
|
empirical evidence pretty much shows that the 2x part of segwit2x is not really justified
Bullshit. The vision is the entire world employing Bitcoin. How are we going to get there? Nobody can spend Bitcoin if they don't already have Bitcoin. Have you thought about the limit of adoption even assuming the desire was there? How long will it take to onboard the world? Currently it would take over 30 years to send each person on earth a single Bitcoin transaction. Think about that. Lightning does nothing to alleviate that. Segwit does nothing to alleviate that. Schnorr sigs do nothing to alleviate that. The only metric that matters in this is transaction throughput - pure and simple. To a first order approximation (i.e. with already minimal-size transactions), this is directly and linearly proportional to block size. True, 2X is insufficient. As is 8X. But they are steps in the right direction. The only direction that can possibly get us to our goal. Larger blocks are a requirement for Bitcoin to be meaningful to humanity as a whole.
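A back-of-the-envelope check of that "over 30 years" figure, under my own assumptions: roughly 7 transactions per second with 1 MB blocks (the upper bound quoted later in this thread) and about 7.5 billion people.

```python
# Rough sanity check of the "over 30 years" claim above (assumed numbers).

tps = 7                                       # ~upper bound with 1 MB blocks
txs_per_year = tps * 60 * 60 * 24 * 365       # ~220 million tx/year
population = 7.5e9

years_for_one_tx_each = population / txs_per_year
print(round(years_for_one_tx_each, 1))        # ~34 years at these assumptions
```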
|
|
|
|
|
Odalv
Legendary
Offline
Activity: 1414
Merit: 1000
|
 |
October 04, 2017, 10:04:37 PM |
|
Then is it even worth the risk of claiming the forkcoins? Weren't the chains supposed to be separated after the fork day? This is a bit confusing.
Yes it is. "Forkcoins" can wipe out your BTC.
|
|
|
|
|
jbreher
Legendary
Offline
Activity: 3122
Merit: 1767
lose: unfind ... loose: untight
|
 |
October 04, 2017, 10:13:20 PM |
|
What a boogie woogie without replay protection..  The lack of replay protection would be a major problem for the B2X network, not for [Bitcoin Segwit Core - ed] Yet it is core acolytes that are screaming bloody murder over such lack. hmm....
|
|
|
|
|
AlcoHoDL
Legendary
Online
Activity: 2996
Merit: 6451
Addicted to HoDLing!
|
 |
October 04, 2017, 10:25:40 PM |
|
True, 2X is insufficient. As is 8X. But they are steps in the right direction. The only direction that will get us to our goal. Larger blocks are a requirement for Bitcoin to be meaningful to humanity as a whole.
Larger blocks are a step in the wrong direction. The reasons are many, and some are so blatantly obvious that it is pointless to waste time and energy explaining them in a forum post. Instead, I will simply quote a paragraph from the Lightning Network paper [1]: "The payment network Visa achieved 47,000 peak transactions per second (tps) on its network during the 2013 holidays, and currently averages hundreds of millions per day. Currently, Bitcoin supports less than 7 transactions per second with a 1 megabyte block limit. If we use an average of 300 bytes per bitcoin transaction and assumed unlimited block sizes, an equivalent capacity to peak Visa transaction volume of 47,000/tps would be nearly 8 gigabytes per Bitcoin block, every ten minutes on average. Continuously, that would be over 400 terabytes of data per year." That's gigabytes, not megabytes... That should put things in perspective. [1] Joseph Poon and Thaddeus Dryja, "The Bitcoin Lightning Network: Scalable Off-Chain Instant Payments", January 14, 2016.
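For reference, the quoted figures can be reproduced with a quick calculation from the numbers in the excerpt (47,000 tx/s, 300 bytes per transaction, one block every ten minutes):

```python
# Reproducing the Lightning paper's block-size figures from the quoted inputs.

tps = 47_000                 # peak Visa rate quoted above
tx_bytes = 300               # assumed average transaction size
block_interval_s = 600       # ten minutes per block

block_size_bytes = tps * tx_bytes * block_interval_s
print(block_size_bytes / 1e9)                 # ~8.46 GB per block

per_year_bytes = block_size_bytes * 6 * 24 * 365
print(per_year_bytes / 1e12)                  # ~445 TB per year
```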
|
|
|
|
|
bitserve
Legendary
Offline
Activity: 2072
Merit: 1772
Self made HODLER ✓
|
 |
October 04, 2017, 10:32:04 PM |
|
True, 2X is insufficient. As is 8X. But they are steps in the right direction. The only direction that will get us to our goal. Larger blocks are a requirement for Bitcoin to be meaningful to humanity as a whole.
Larger blocks are a step in the wrong direction. The reasons are many, and some are so blatantly obvious that it is pointless to waste time and energy explaining them in a forum post. Instead, I will simply quote a paragraph from the Lightning Network paper [1]: "The payment network Visa achieved 47,000 peak transactions per second (tps) on its network during the 2013 holidays, and currently averages hundreds of millions per day. Currently, Bitcoin supports less than 7 transactions per second with a 1 megabyte block limit. If we use an average of 300 bytes per bitcoin transaction and assumed unlimited block sizes, an equivalent capacity to peak Visa transaction volume of 47,000/tps would be nearly 8 gigabytes per Bitcoin block, every ten minutes on average. Continuously, that would be over 400 terabytes of data per year." That's gigabytes, not megabytes... That should put things in perspective. [1] Joseph Poon and Thaddeus Dryja, "The Bitcoin Lightning Network: Scalable Off-Chain Instant Payments", January 14, 2016.
The step in the RIGHT direction was taken a few months ago with SegWit and the enabling of LN. We will most probably need a blocksize increase in the near future (I hope so, because that would mean increased adoption of Bitcoin), but it will be in support of LN, with the blockchain as the backbone of the payment network and LN as a multiplier of its transaction capacity.
|
|
|
|
|
jbreher
Legendary
Offline
Activity: 3122
Merit: 1767
lose: unfind ... loose: untight
|
 |
October 04, 2017, 10:36:59 PM |
|
True, 2X is insufficient. As is 8X. But they are steps in the right direction. The only direction that will get us to our goal. Larger blocks are a requirement for Bitcoin to be meaningful to humanity as a whole.
"The payment network Visa achieved 47,000 peak transactions per second (tps) on its network during the 2013 holidays, and currently averages hundreds of millions per day. Currently, Bitcoin supports less than 7 transactions per second with a 1 megabyte block limit. If we use an average of 300 bytes per bitcoin transaction and assumed unlimited block sizes, an equivalent capacity to peak Visa transaction volume of 47,000/tps would be nearly 8 gigabytes per Bitcoin block, every ten minutes on average." Over the long term, not a problem. How big was your disk a decade ago? Note also that this is peak. We can tolerate transaction backlog, as long as it clears without too much time passing. Average is much lower. Note that the published paper makes a fundamental logic error. Using a peak figure as an input to an average calculation is either gross stupidity or simple fearmongering. Now lets actually discuss how we get Bitcoin into the hands of each and every person. What's your solution for that? Hmm?
|
|
|
|
|
jbreher
Legendary
Offline
Activity: 3122
Merit: 1767
lose: unfind ... loose: untight
|
 |
October 04, 2017, 10:40:54 PM |
|
The step in [a] direction was done a few months ago with Segwit and enabling LN.
Neither of which do anything to further the goal of onboarding the world. Whether or not we have Segwit or Lightning, 1MB blocks will require 30 years to get a single transaction to each person on earth. The only way to improve this is with larger blocks.
|
|
|
|
|
bitserve
Legendary
Offline
Activity: 2072
Merit: 1772
Self made HODLER ✓
|
 |
October 04, 2017, 10:46:20 PM |
|
The step in [a] direction was done a few months ago with Segwit and enabling LN.
Neither of which do anything to further the goal of onboarding the world. Whether or not we have Segwit or Lightning, 1MB blocks will require 30 years to get a single transaction to each person on earth. The only way to improve this is with larger blocks.
We will need bigger blocks. I wouldn't even mind if we increased to 2x already.... if Core would support it (which it seems they don't YET). In fact I wish they did... but they don't. So we will have to wait. That said... LN will play a bigger role in transaction capacity than any blocksize increase. That's why I have never understood why some people insist on bigger blocks but were opposed to SegWit and LN. It doesn't make sense to me. We will need both to really SCALE (not just linearly) Bitcoin's transaction capacity by orders of magnitude (with LN playing the bigger role in that increase of capacity).
|
|
|
|
|
BlindMayorBitcorn
Legendary
Offline
Activity: 1260
Merit: 1116
|
 |
October 04, 2017, 10:48:48 PM |
|
The step in [a] direction was done a few months ago with Segwit and enabling LN.
Neither of which do anything to further the goal of onboarding the world. Whether or not we have Segwit or Lightning, 1MB blocks will require 30 years to get a single transaction to each person on earth. The only way to improve this is with larger blocks. This is one blockchain, not the bloody Noah's ark. The limits of on-chain scalability are evident even to the likes of me... wtf
|
|
|
|
|
jbreher
Legendary
Offline
Activity: 3122
Merit: 1767
lose: unfind ... loose: untight
|
 |
October 04, 2017, 10:53:19 PM |
|
The step in [a] direction was done a few months ago with Segwit and enabling LN.
Neither of which do anything to further the goal of onboarding the world. Whether or not we have Segwit or Lightning, 1MB blocks will require 30 years to get a single transaction to each person on earth. The only way to improve this is with larger blocks. We will need bigger blocks. I wouldn't even mind if we increased to 2x already.... if core would support it (which it seems they don't YET). That said... LN will play a bigger role in transaction capacity than any blocksize increase. Have you read what I wrote above? A person who does not yet have Bitcoin cannot use LN. He must first get Bitcoin via an on-chain transaction. So the absolute theoretical minimum number of on-chain transactions required per person is two - one to get Bitcoin, and the other to spend every Bitcoin he will ever own into a single LN channel. In this minimal limit, it would require -- at 1MB -- 60 years for every person on earth to be able to participate. Accordingly, no. While LN might become an important adjunct, the most important dimension to work on is simple block size.
|
|
|
|
|
d_eddie
Legendary
Offline
Activity: 3122
Merit: 5229
|
 |
October 04, 2017, 11:00:37 PM |
|
The step in [a] direction was done a few months ago with Segwit and enabling LN.
Neither of which do anything to further the goal of onboarding the world. Whether or not we have Segwit or Lightning, 1MB blocks will require 30 years to get a single transaction to each person on earth. The only way to improve this is with larger blocks. On the contrary, LN and similar Layer2 technology can do much. Given a sufficiently large number of channels, kept open for a sufficiently long time, the transaction rate (summed over all open channels) can grow almost without limit. In reality, there will be practical limits on channel number and channel lifespan, so a block size increase will likely be needed at some point in the future. How big an increase, and when? I think we should find out by looking at real LN transaction flow numbers.
|
|
|
|
|
jbreher
Legendary
Offline
Activity: 3122
Merit: 1767
lose: unfind ... loose: untight
|
 |
October 04, 2017, 11:01:38 PM |
|
Neither of which do anything to further the goal of onboarding the world. Whether or not we have Segwit or Lightning, 1MB blocks will require 30 years to get a single transaction to each person on earth. The only way to improve this is with larger blocks.
This is one blockchain, not the bloody Noah's ark. The limits of on-chain scalability are evident even to the likes of me...
So you're happy with a limit of getting 2% of the population onto LN per year, with no bandwidth left for any other transactions (like, maybe, to rescind a cheating LN transaction by your channel counterparty trying to steal your funds)? Fine. In the boxes next to your name, I'll check the 1MB4EVAH column.
|
|
|
|
|
becoin
Legendary
Offline
Activity: 3431
Merit: 1233
|
 |
October 04, 2017, 11:02:12 PM |
|
What a boogie woogie without replay protection..  The lack of replay protection would be a major problem for the B2X network, not for [Bitcoin Segwit Core - ed] Yet it is core acolytes that are screaming bloody murder over such lack. hmm....
In general, big blocktards are technically and economically ignorant and don't get it. Without replay protection, the B2X fork is a blatant attempt at a 51% attack against Bitcoin. Simple as that! People who burn hundreds of millions to organize this attack don't care about improving Bitcoin. They want to crash Bitcoin.
|
|
|
|
|
jbreher
Legendary
Offline
Activity: 3122
Merit: 1767
lose: unfind ... loose: untight
|
 |
October 04, 2017, 11:02:54 PM |
|
The step in [a] direction was done a few months ago with Segwit and enabling LN.
Neither of which do anything to further the goal of onboarding the world. Whether or not we have Segwit or Lightning, 1MB blocks will require 30 years to get a single transaction to each person on earth. The only way to improve this is with larger blocks. On the contrary, LN and similar Layer2 technology can do much. Given a sufficiently large number of channels, kept open for a sufficiently long time, the transaction rate (summed over all open channels) can grow almost without limit. In reality, there will be practical limits on channel number and channel lifespan, so a block size increase will likely be needed at some point in the future. How big an increase, and when? I think we should find out by looking at real LN transaction flow numbers. Did you: a) not read what I wrote; or b) end up being incapable of understanding it?
|
|
|
|
|
|