Discussion:
[Bitcoin-development] Block Size Increase
Matt Corallo
2015-05-06 22:12:14 UTC
Permalink
Recently there has been a flurry of posts by Gavin at
http://gavinandresen.svbtle.com/ which advocate strongly for increasing
the maximum block size. However, there hasn't been any discussion on this
mailing list in several years as far as I can tell.

Block size is a question to which there is no answer, but which
certainly has a LOT of technical tradeoffs to consider. I know a lot of
people here have varying levels of strong or very strong opinions about
this, and the fact that it is not being discussed in a technical
community publicly anywhere is rather disappointing.

So, at the risk of starting a flamewar, I'll provide a little bait to
get some responses and hope the discussion opens up into an honest
comparison of the tradeoffs here. Certainly a consensus in this kind of
technical community should be a basic requirement for any serious
commitment to blocksize increase.

Personally, I'm rather strongly against any commitment to a block size
increase in the near future. Long-term incentive compatibility requires
that there be some fee pressure, and that blocks be relatively
consistently full or very nearly full. What we see today are
transactions enjoying next-block confirmations with nearly zero pressure
to include any fee at all (though many do because it makes wallet code
simpler).

This allows the well-funded Bitcoin ecosystem to continue building
systems which rely on transactions moving quickly into blocks while
pretending these systems scale. Thus, instead of working on technologies
which bring Bitcoin's trustlessness to systems which scale beyond a
blockchain's necessarily slow and (compared to updating numbers in a
database) expensive settlement, the ecosystem as a whole continues to
focus on building centralized platforms and advocate for changes to
Bitcoin which allow them to maintain the status quo[1].

Matt

[1] https://twitter.com/coinbase/status/595741967759335426
Tier Nolan
2015-05-06 22:44:53 UTC
Permalink
Post by Matt Corallo
Personally, I'm rather strongly against any commitment to a block size
increase in the near future.
Miners can already soft-fork to reduce the maximum block size. If 51% of
miners agree to a 250kB block size, then that is the maximum block size.

The question being discussed is what is the maximum block size merchants
and users will accept. This puts a reasonable limit on the maximum size
miners can increase the block size to.

In effect, the block size is set by the minimum of the miners' and the
merchants'/users' preferred sizes: min(miner, merchants/users).
Post by Matt Corallo
This allows the well-funded Bitcoin ecosystem to continue building
systems which rely on transactions moving quickly into blocks while
pretending these systems scale. Thus, instead of working on technologies
which bring Bitcoin's trustlessness to systems which scale beyond a
blockchain's necessarily slow and (compared to updating numbers in a
database) expensive settlement, the ecosystem as a whole continues to
focus on building centralized platforms and advocate for changes to
Bitcoin which allow them to maintain the status quo[1].
Would you accept a rule that the maximum size is 20MB (doubling every 2
years), but that miners have an efficient method for choosing a lower size?

If miners could specify the maximum block size in their block headers, then
they could coordinate to adjust the block size. If 75% vote to lower the
size, then it is lowered, and vice versa for raising.

Every 2016 blocks, the votes are counted. If the 504th lowest of the 2016
blocks is higher than the previous size, then the size is set to that
size. Similarly, if the 504th highest is lower than the previous size, it
becomes the new size.
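
For concreteness, a minimal sketch of that counting rule (illustrative only,
not consensus code; the per-block vote field, byte units and example numbers
are assumptions):

VOTES_PER_PERIOD = 2016
QUARTILE = 504  # 25% of 2016

def next_max_block_size(current_size, votes):
    """Apply the raise/lower rule to one 2016-block period.

    votes: the max-size vote carried by each of the 2016 block headers.
    Raising requires at least 75% of blocks to vote above the current size;
    lowering requires at least 75% to vote below it.
    """
    assert len(votes) == VOTES_PER_PERIOD
    ordered = sorted(votes)
    raise_candidate = ordered[QUARTILE - 1]   # 504th lowest vote
    lower_candidate = ordered[-QUARTILE]      # 504th highest vote
    if raise_candidate > current_size:
        return raise_candidate
    if lower_candidate < current_size:
        return lower_candidate
    return current_size

# Example: 80% of blocks vote for 2 MB, 20% stay at 1 MB -> size rises to 2 MB.
votes = [2_000_000] * 1613 + [1_000_000] * 403
print(next_max_block_size(1_000_000, votes))  # 2000000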

There could be 2 default trajectories. The reference client might always
vote to double the size every 4 years.

Handling large blocks (>32MB) requires a change to the p2p protocol
message size limits, or a way to split blocks over multiple messages.

It would be nice to add new features to any hard-fork.

I favour adding an auxiliary header. The Merkle root in the header could
be replaced with hash(merkle_root | hash(aux_header)). This is a fairly
simple change, but helps with things like commitments. One of the fields
in the auxiliary header could be an extra nonce field. This would mean
fast regeneration of the merkle root for ASIC miners. This is a pretty
simple change.
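
For what it's worth, a rough sketch of that commitment (assuming Bitcoin's
usual double-SHA256 and "|" meaning byte concatenation; illustrative only):

import hashlib

def dsha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def header_commitment(merkle_root: bytes, aux_header: bytes) -> bytes:
    """Value that would replace the merkle root field in the 80-byte header."""
    return dsha256(merkle_root + dsha256(aux_header))

# Rolling an extra nonce inside aux_header only requires recomputing these two
# hashes, not rebuilding the transaction merkle tree.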
Matt Corallo
2015-05-06 23:12:17 UTC
Permalink
Replies inline.
Post by Matt Corallo
Personally, I'm rather strongly against any commitment to a block size
increase in the near future.
-snip-
Post by Matt Corallo
The question being discussed is what is the maximum block size merchants
and users will accept. This puts a reasonable limit on the maximum size
miners can increase the block size to.
In effect, the block size is set by the minimum of the miners' and the
merchants'/users' preferred sizes: min(miner, merchants/users).
Indeed, "the bitcoin community of users and miners can decide to do
whatever they want", but this is universal - "they" could decide whatever
they want if "they" want to hardfork. That said, "we" should be having a
rigorous technical discussion about whether it is sane to recommend a
given course of action by releasing software which makes it happen.
Post by Matt Corallo
This allows the well-funded Bitcoin ecosystem to continue building
systems which rely on transactions moving quickly into blocks while
pretending these systems scale. Thus, instead of working on technologies
which bring Bitcoin's trustlessness to systems which scale beyond a
blockchain's necessarily slow and (compared to updating numbers in a
database) expensive settlement, the ecosystem as a whole continues to
focus on building centralized platforms and advocate for changes to
Bitcoin which allow them to maintain the status quo[1].
Would you accept a rule that the maximum size is 20MB (doubling every 2
years), but that miners have an efficient method for choosing a lower size?
If miners could specify the maximum block size in their block headers,
then they could coordinate to adjust the block size. If 75% vote to
lower the size, then it is lowered, and vice versa for raising.
Every 2016 blocks, the votes are counted. If the 504th lowest of the
2016 blocks is higher than the previous size, then the size is set to
that size. Similarly, if the 504th highest is lower than the previous
size, it becomes the new size.
There could be 2 default trajectories. The reference client might
always vote to double the size every 4 years.
To handle large blocks (>32MB) requires a change to the p2p protocol
message size limits, or a way to split blocks over multiple messages.
It would be nice to add new features to any hard-fork.
I favour adding an auxiliary header. The Merkle root in the header
could be replaced with hash(merkle_root | hash(aux_header)). This is a
fairly simple change, but helps with things like commitments. One of
the fields in the auxiliary header could be an extra nonce field. This
would mean fast regeneration of the merkle root for ASIC miners. This
is a pretty simple change.
The point of the hard block size limit is exactly because giving miners
free rule to do anything they like with their blocks would allow them to
do any number of crazy attacks. The incentives for miners to pick block
sizes are nowhere near compatible with what allows the network to
continue to run in a decentralized manner.
Tier Nolan
2015-05-06 23:33:56 UTC
Permalink
Post by Matt Corallo
The point of the hard block size limit is exactly because giving miners
free rule to do anything they like with their blocks would allow them to
do any number of crazy attacks. The incentives for miners to pick block
sizes are nowhere near compatible with what allows the network to
continue to run in a decentralized manner.
Miners can always reduce the block size (if they coordinate). Increasing
the maximum block size doesn't necessarily cause an increase. A majority
of miners can soft-fork to set the limit lower than the hard limit.

Setting the hard-fork limit higher means that a soft fork can be used to
adjust the limit in the future.

The reference client would accept blocks above the soft limit for wallet
purposes, but not build on them. Blocks above the hard limit would be
rejected completely.
Matt Corallo
2015-05-06 23:41:37 UTC
Permalink
Post by Matt Corallo
The point of the hard block size limit is exactly because giving miners
free rule to do anything they like with their blocks would allow them to
do any number of crazy attacks. The incentives for miners to pick block
sizes are nowhere near compatible with what allows the network to
continue to run in a decentralized manner.
Miners can always reduce the block size (if they coordinate).
Increasing the maximum block size doesn't necessarily cause an
increase. A majority of miners can soft-fork to set the limit lower
than the hard limit.
Sure, of course.
Post by Matt Corallo
Setting the hard-fork limit higher means that a soft fork can be used to
adjust the limit in the future.
The reference client would accept blocks above the soft limit for wallet
purposes, but not build on them. Blocks above the hard limit would be
rejected completely.
Yes, but this does NOT make an actual policy. Note that the vast
majority of miners already apply their own patches to Bitcoin Core, so
applying one more is not all that hard. When blocks start to become
limited (i.e., there is any fee left on the table by transactions not
included in a block), there is an incentive for miners to change that
behavior pretty quickly. Not just that, the vast majority of the hashpower
is behind very large miners, who have little to no decentralization
pressure. This results in very incompatible incentives, mainly that the
incentive would be for the large miners to interconnect in a private
network and generate only maximum-size blocks, creating a strong
centralization pressure in the network.
Peter Todd
2015-05-07 02:16:44 UTC
Permalink
Post by Matt Corallo
Yes, but this does NOT make an actual policy. Note that the vast
majority of miners already apply their own patches to Bitcoin Core, so
applying one more is not all that hard. When blocks start to become
limited (i.e., there is any fee left on the table by transactions not
included in a block), there is an incentive for miners to change that
behavior pretty quickly. Not just that, the vast majority of the hashpower
is behind very large miners, who have little to no decentralization
pressure. This results in very incompatible incentives, mainly that the
incentive would be for the large miners to interconnect in a private
network and generate only maximum-size blocks, creating a strong
centralization pressure in the network.
I'll also point out that miners with the goal of finding more blocks
than their competition - a viable long-term strategy to increase market
share and/or a short-term strategy to get more transaction fees -
actually have a perverse incentive(1) to ensure their blocks do *not*
get to more than ~30% of the hashing power. The main thing holding them
back from doing that is that the inflation subsidy is still quite high -
better to get the reward now than try to push your competition out of
business.

It's plausible that with a limited blocksize there won't be an
opportunity to delay propagation by broadcasting larger blocks - if
blocks propagate in a matter of seconds in the worst case, there's no
opportunity for gaming the system. But it does strongly show that we
must build systems where that worst case propagation time in all
circumstances is very short relative to the block interval.
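
A back-of-the-envelope illustration of that last point (Poisson block arrivals
and a 600-second interval assumed; the delays are hypothetical):

import math

def orphan_risk(propagation_delay_s, block_interval_s=600.0):
    """Probability someone else finds a competing block while yours propagates."""
    return 1.0 - math.exp(-propagation_delay_s / block_interval_s)

for d in (2, 10, 60, 120):
    print(f"{d:>4} s delay -> ~{orphan_risk(d):.1%} chance of a competing block")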

1) http://www.mail-archive.com/bitcoin-***@lists.sourceforge.net/msg03200.html
--
'peter'[:-1]@petertodd.org
000000000000000004dc867e4541315090329f45ed4dd30e2fd7423a38a72c0e
slush
2015-05-06 22:30:12 UTC
Permalink
I don't have a strong opinion on the block size topic.

But if there'll be a fork, PLEASE, include SIGHASH_WITHINPUTVALUE (
https://bitcointalk.org/index.php?topic=181734.0) or its alternative. All
developers of lightweight (blockchain-less) clients will adore you!

slush
Post by Matt Corallo
Recently there has been a flurry of posts by Gavin at
http://gavinandresen.svbtle.com/ which advocate strongly for increasing
the maximum block size. However, there hasn't been any discussion on this
mailing list in several years as far as I can tell.
Block size is a question to which there is no answer, but which
certainly has a LOT of technical tradeoffs to consider. I know a lot of
people here have varying levels of strong or very strong opinions about
this, and the fact that it is not being discussed in a technical
community publicly anywhere is rather disappointing.
So, at the risk of starting a flamewar, I'll provide a little bait to
get some responses and hope the discussion opens up into an honest
comparison of the tradeoffs here. Certainly a consensus in this kind of
technical community should be a basic requirement for any serious
commitment to blocksize increase.
Personally, I'm rather strongly against any commitment to a block size
increase in the near future. Long-term incentive compatibility requires
that there be some fee pressure, and that blocks be relatively
consistently full or very nearly full. What we see today are
transactions enjoying next-block confirmations with nearly zero pressure
to include any fee at all (though many do because it makes wallet code
simpler).
This allows the well-funded Bitcoin ecosystem to continue building
systems which rely on transactions moving quickly into blocks while
pretending these systems scale. Thus, instead of working on technologies
which bring Bitcoin's trustlessness to systems which scale beyond a
blockchain's necessarily slow and (compared to updating numbers in a
database) expensive settlement, the ecosystem as a whole continues to
focus on building centralized platforms and advocate for changes to
Bitcoin which allow them to maintain the status quo[1].
Matt
[1] https://twitter.com/coinbase/status/595741967759335426
Eric Lombrozo
2015-05-06 23:06:00 UTC
Permalink
I don’t really have a strong opinion on block size either…but if we’re going to do a hard fork, let’s use this as an opportunity to create a good process for hard forks (which we’ll inevitably need to do again in the future). The change in block size is a very simple change that still allows us to explore all the complexities involved with deployment of hard forks. Let’s not just do a one-off ad-hoc thing.

- Eric Lombrozo
But if there'll be a fork, PLEASE, include SIGHASH_WITHINPUTVALUE (https://bitcointalk.org/index.php?topic=181734.0) or its alternative. All developers of lightweight (blockchain-less) clients will adore you!
slush
Recently there has been a flurry of posts by Gavin at
http://gavinandresen.svbtle.com/ which advocate strongly for increasing
the maximum block size. However, there hasn't been any discussion on this
mailing list in several years as far as I can tell.
Block size is a question to which there is no answer, but which
certainly has a LOT of technical tradeoffs to consider. I know a lot of
people here have varying levels of strong or very strong opinions about
this, and the fact that it is not being discussed in a technical
community publicly anywhere is rather disappointing.
So, at the risk of starting a flamewar, I'll provide a little bait to
get some responses and hope the discussion opens up into an honest
comparison of the tradeoffs here. Certainly a consensus in this kind of
technical community should be a basic requirement for any serious
commitment to blocksize increase.
Personally, I'm rather strongly against any commitment to a block size
increase in the near future. Long-term incentive compatibility requires
that there be some fee pressure, and that blocks be relatively
consistently full or very nearly full. What we see today are
transactions enjoying next-block confirmations with nearly zero pressure
to include any fee at all (though many do because it makes wallet code
simpler).
This allows the well-funded Bitcoin ecosystem to continue building
systems which rely on transactions moving quickly into blocks while
pretending these systems scale. Thus, instead of working on technologies
which bring Bitcoin's trustlessness to systems which scale beyond a
blockchain's necessarily slow and (compared to updating numbers in a
database) expensive settlement, the ecosystem as a whole continues to
focus on building centralized platforms and advocate for changes to
Bitcoin which allow them to maintain the status quo[1].
Matt
[1] https://twitter.com/coinbase/status/595741967759335426
Matt Corallo
2015-05-06 23:13:22 UTC
Permalink
For now, let's leave the discussion to JUST the block size increase. If
it helps - everyone should assume that their pet feature is included in
a hard fork or, if you prefer, that no other features are included in a
hard fork.
I'm not so much opposed to a block size increase as I am opposed to a hard fork. My problem with a hard fork is that everyone and their brother wants to seize the opportunity of a hard fork to insert their own pet feature, and such a mad rush of lightly considered, obscure feature additions would be extremely risky for Bitcoin. If it could be guaranteed that raising the block size limit would be the only incompatible change introduced in the hard fork, then I would support it, but I strongly fear that the hard fork itself will become an excuse to change other aspects of the system in ways that will have unintended and possibly disastrous consequences.
Tom Harding
2015-05-07 00:00:46 UTC
Permalink
Post by Matt Corallo
Long-term incentive compatibility requires
that there be some fee pressure, and that blocks be relatively
consistently full or very nearly full.
I think it's way too early to even consider a future era when the fiat
value of the block reward is no longer the biggest-by-far mining incentive.

Creating fee pressure means driving some people to choose something
else, not bitcoin. "Too many people using bitcoin" is nowhere on the
list of problems today. It's reckless to tinker with adoption in hopes
of spurring innovation on speculation, while a "can kick" is available.

Adoption is currently at minuscule, test-flight, relatively
insignificant levels when compared to global commerce. As Gavin
discussed in the article, under "Block size and miner fees… again," the
best way to maximize miner incentives is to focus on doing things that
are likely to increase adoption, which, in our fiat-dominated world,
lead to a justifiably increased exchange rate.

Any innovation attractive enough to relieve the block size pressure will
do so just as well without artificial stimulus.

Thanks for kicking off the discussion.
Bryan Bishop
2015-05-07 00:07:41 UTC
Permalink
Post by Matt Corallo
the maximum block size. However, there hasn't been any discussion on this
mailing list in several years as far as I can tell.
Well, there has been significant public discussion in #bitcoin-wizards
on irc.freenode.net which is available in public logs, specifically
about why increasing the max block size is kicking the can down the
road while possibly compromising blockchain security. There were many
excellent objections that were raised that, sadly, I see are not
referenced at all in the recent media blitz. Frankly, I can't help but
feel that if contributions like those from #bitcoin-wizards have been
ignored in lieu of technical analysis, and in the absence of discussion
on this mailing list, then perhaps there are other subtle and extremely
important technical details that are completely absent from this -- and
other -- proposals. I have some rather general
thoughts to offer.

Secured decentralization is the most important and most interesting
property of bitcoin. Everything else is rather trivial and could be
achieved millions of times more efficiently with conventional
technology. Our technical work should be informed by the technical
nature of the system we have constructed.

I suspect that as bitcoin continues to grow in all dimensions and
metrics, we will see an unending wave of those who are excited by
the idea of Something Different in the face of archaic, crumbling
software and procedures in the rest of the financial world. Money has
found its way into every aspect of human life. There's no doubt in my
mind that bitcoin will always see the most extreme campaigns and the
most extreme misunderstandings. Like moths to a flame or water in the
desert, almost everyone is excited by ANY status quo change
whatsoever. This is something that we have to be vigilant about,
because their excitement is motivation to do excellent work, not
simply any work. For some who are excited about general status quo
changes that bitcoin represents, they may not mind if bitcoin
decentralization disappears and is replaced with just a handful of
centralized nodes. Whereas for development purposes we must hold
ourselves to extremely high standards before proposing changes,
especially to the public, that have the potential to be technically or
economically unsafe. We have examples from NASA about how to engineer
extremely fault tolerant systems, and we have examples from Linux
about how to have high standards in open-source projects. Safety is
absolutely critical, even in the face of seemingly irrational
exuberance of others who want to scale to trillions of daily coffee
transactions individually stored forever in the blockchain.

When designing bitcoin or even some other system, an important design
target is what the system should be capable of. How many transactions
should the system perform? What is the correct number of transactions
for a healthy, modern civilization to perform every day? And how fast
should that (not) grow? Should we allow for 2 billion trillion coffee
transactions every day, or what about 100 trillion transactions per
second? I suspect that these sorts of questions are entirely
unanswerable and boring. So in the absence of technical targets to
reach during the design phase, I suspect that Jeff Garzik was right
when he pointed out a few months ago that bitcoin is good at
settlement and clearing. There are many potential technical solutions
for aggregating millions (trillions?) of transactions into tiny
bundles. As a small proof-of-concept, imagine two parties sending
transactions back and forth 100 million times. Instead of recording
every transaction, you could record the start state and the end state,
and end up with two transactions or less. That's a 100-million-fold reduction,
without modifying max block size and without potentially compromising
secured decentralization.
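
As a toy illustration of that netting idea (hypothetical names and amounts;
no signatures, timeouts or dispute handling shown):

def settle(opening_balances, payments):
    """opening_balances: {'alice': sats, 'bob': sats}; payments: (payer, payee, sats)."""
    balances = dict(opening_balances)
    for payer, payee, amount in payments:
        assert balances[payer] >= amount, "payment exceeds available balance"
        balances[payer] -= amount
        balances[payee] += amount
    return balances  # only the opening tx and this closing state need the chain

opening = {'alice': 1_000_000, 'bob': 1_000_000}
payments = [('alice', 'bob', 10), ('bob', 'alice', 9)] * 5_000  # 10,000 transfers
print(settle(opening, payments))  # {'alice': 995000, 'bob': 1005000}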

The MIT group should listen up and get to work figuring out how to
measure decentralization and its security :-). Maybe we should be
collectively pestering Andrew Miller to do this, too. No pressure,
dude. Getting this measurement right would be really beneficial
because we would have a more academic and technical understanding to
work with. I would also characterize this as high priority next to the
"formally verified correctness proofs for Script and
libbitcoinconsensus".

Also, I think that getting this out in the open on this mailing list
is an excellent step forward.

- Bryan
http://heybryan.org/
1 512 203 0507
Gregory Maxwell
2015-05-07 00:37:54 UTC
Permalink
On Wed, May 6, 2015 at 10:12 PM, Matt Corallo <bitcoin-***@bluematt.me> wrote:
Recently there has been a flurry of posts by Gavin at
http://gavinandresen.svbtle.com/ which advocate strongly for increasing
the maximum block size. However, there hasn't been any discussion on this
mailing list in several years as far as I can tell.

Thanks Matt; I was actually really confused by this sudden push with
not a word here or on Github--so much so that I responded on Reddit to
people pointing to commits in Gavin's personal repository saying they
were reading too much into it.

So please forgive me for the more than typical disorganization in this
message; I've been caught a bit flatfooted on this and I'm trying to
catch up. I'm juggling a fair amount of sudden pressure in my mailbox,
and trying to navigate complex discussions in about eight different
forums concurrently.

There have been about a kazillion pages of discussion elsewhere
(e.g. public IRC and Bitcointalk; private discussions in the past),
not all of which is well known, and I can't hope to summarize even a
tiny fraction of it in a single message-- but that's no reason to not
start on it.
Post by Matt Corallo
Block size is a question to which there is no answer, but which
certainly has a LOT of technical tradeoffs to consider.

There are several orthogonal angles from which block size is a concern
(both increases and non-increases). Most of them have subtle implications
and each is worth its own research paper or six, so it can be difficult
to only touch them slightly without creating a gish gallop that is hard
to respond to.

We're talking about tuning one of the fundamental scarcities of the
Bitcoin Economy and cryptosystem--leaving the comfort of "rule by
math" and venturing into the space of political decisions; elsewhere
you'd expect to see really in-depth neutral analysis of the risks and
tradeoffs, technically and economically. And make no mistake: there
are real tradeoffs here, though we don't know their exact contours.

Fundamentally this question exposes ideological differences between people
interested in Bitcoin. Is Bitcoin more of a digital gold or is it more
of a competitor to Square? Is Bitcoin something that should improve
personal and commercial autonomy from central banks? From commercial
banks? Or from just the existing status-quo commercial banks? What are
people's fundamental rights with Bitcoin? Do participants have a
right to mine? How much control should third parties have over their
transactions? How much security must be provided? Is there a deadline
for world domination or bust? Is Bitcoin only for the developed world?
Must it be totally limited by the most impoverished parts of the world?

Bitcoin exists at the intersection of many somewhat overlapping belief
systems--and people of many views can find that Bitcoin meets their
needs even when they don't completely agree politically. When Bitcoin
is changed fundamentally, via a hard fork, to have different properties,
the change can create winners or losers (if nothing else, then in terms
of the kind of ideology supported by it).

There are a non-trivial number of people who hold extremes on any of
these general belief patterns; even among the core developers there is
not a consensus on Bitcoin's optimal role in society and the commercial
marketplace.

To make it clear how broad the views go, even without getting into
monetary policy... some people even argue that Bitcoin should act
as a censor-resistant storage system for outlawed content -- I think
this view is unsound, not achievable with the technology, and largely
incompatible with Bitcoin's use as a money (because it potentially
creates an externalized legal/harassment liability for node operators);
but these are my personal value judgments; the view is earnestly held
by more than a few; and that's a group that certainly wants the largest
possible blocksizes (though even then that won't be enough).

The subject is complicated even more purely on the technical side
by the fact that Bitcoin has a layered security model which is not
completely defined or understood: Bitcoin is secure if a majority of
hashrate is "honest" (where "honesty" is a technical term which means
"follows the right rules" without fail, even at a loss), but why might
it be honest? That sends us into complex economic and social arguments,
and the security thresholds start becoming worse when we assume some
miners are economically rational instead of "honest".
Post by Matt Corallo
increase in the near future. Long-term incentive compatibility requires
that there be some fee pressure, and that blocks be relatively
consistently full or very nearly full. What we see today are

To elaborate, in my view there is at least a two-fold concern on this
particular ("Long term Mining incentives") front:

One is that the long-held argument is that security of the Bitcoin system
in the long term depends on fee income funding autonomous, anonymous,
decentralized miners profitably applying enough hash-power to make
reorganizations infeasible.

For fees to achieve this purpose, there seemingly must be an effective
scarcity of capacity. The fact that verifying and transmitting
transactions has a cost isn't enough, because all the funds go to pay
that cost and none to the POW "artificial" cost; e.g., if verification
costs 1 then the market price for fees should converge to 1, and POW
cost will converge towards zero because they adapt to whatever is
being applied. Moreover, the transmission and verification costs can
be perfectly amortized by using large centralized pools (and efficient
differential block transmission like the "O(1)" idea) as you can verify
one time instead of N times, so to the extent that verification/bandwidth
is a non-negligible cost to miners at all, it's a strong pressure to
centralize. You can understand this intuitively: think for example of
carbon credit cap-and-trade: the trade part doesn't work without an
actual cap; if everyone was born with a 1000 petaton carbon balance,
the market price for credits would be zero and the program couldn't hope
to change behavior. In the case of mining, we're trying to optimize the
social good of POW security. (But the analogy applies in other ways too:
increases to the chain size are largely an externality; miners enjoy the
benefits, everyone else bears the costs--either in reduced security or
higher node operating costs.)
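
To make that concrete, a toy model (numbers purely illustrative) of why only a
binding capacity limit leaves any fee revenue over to fund POW:

VERIFY_COST_PER_TX = 1.0  # hypothetical marginal cost of including one tx

def miner_budget(txs_demanded, capacity, fee_bid):
    """Competitive fee falls to cost when capacity is not binding, and stays
    at the users' bid when it is; whatever exceeds cost can fund POW."""
    txs = min(txs_demanded, capacity)
    fee = VERIFY_COST_PER_TX if txs_demanded <= capacity else fee_bid
    pow_budget = txs * fee - txs * VERIFY_COST_PER_TX
    return fee, pow_budget

print(miner_budget(txs_demanded=5_000, capacity=10_000, fee_bid=3.0))   # (1.0, 0.0)
print(miner_budget(txs_demanded=15_000, capacity=10_000, fee_bid=3.0))  # (3.0, 20000.0)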

This area has been subject to a small amount of academic research
(e.g. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2400519). But
there is still much that is unclear.

The second is that when subsidy has fallen well below fees, the incentive
to move the blockchain forward goes away. An optimal rational miner
would be best off forking off the current best block in order to capture
its fees, rather than moving the blockchain forward, until they hit
the maximum. That's where the "backlog" comment comes from, since when
there is a sufficient backlog it's better to go forward. I'm not aware
of specific research into this subquestion; it's somewhat fuzzy because
of uncertainty about the security model. If we try to say that Bitcoin
should work even in the face of most miners being profit-maximizing
instead of altruistically-honest, we must assume the chain will not
more forward so long as a block isn't full. In reality there is more
altruism than zero; there are public pressures; there is laziness, etc.

One potential argument is that maybe miners would be _regulated_ to
behave correctly. But this would require undermining the openness of the
system--where anyone can mine anonymously--in order to enforce behavior,
and that same enforcement mechanism would leave a political lever to
impose additional rules that violate the extra properties of the system.

So far the mining ecosystem has become incredibly centralized over time.
I believe I am the only remaining committer who mines, and only a few
of the regular contributors to Bitcoin Core do. Many participants
have never mined or only did back in 2010/2011... we've basically
ignored the mining ecosystem, and this has had devastating effects,
causing a latent undermining of the security model: hacking a dozen or
so computers--operated under totally unknown and probably not strong
security policies--could compromise the network at least at the tip...
Rightfully we should be regarding this as an emergency, and probably
should have been since 2011. This doesn't bode well for our ability
to respond if a larger blocksize goes poorly. In kicking the can with
the trivial change to just bump the size, are we making an implicit
decision to go down a path that has a conclusion we don't want?

(There are also shorter term mining incentives concerns; which Peter
Todd has written more about, that I'll omit for now)
Post by Matt Corallo
pretending these systems scale. Thus, instead of working on technologies
which bring Bitcoin's trustlessness to systems which scale beyond a
I made a few relevant points back in 2011
(https://en.bitcoin.it/w/index.php?title=Scalability&action=historysubmit&diff=14273&oldid=14112)
after Dan Kaminsky argued that Bitcoin's decentralization was a pretext:
that it was patently centralized since scaling directly in the network
would undermine decentralization, that the Bitcoin network necessarily
makes particular tradeoffs which prevent it from concurrently being all
things to all people. But tools like the Lightning network proposal could
well allow us to hit a greater spectrum of demands at once--including
secure zero-confirmation (something that larger blocksizes reduce if
anything), which is important for many applications. With the right
technology I believe we can have our cake and eat it too, but there needs
to be a reason to build it; the security and decentralization level of
Bitcoin imposes a _hard_ upper limit on anything that can be based on it.

Another key point here is that the small bumps in blocksize which
wouldn't clearly knock the system into a largely centralized mode--small
constants--are small enough that they don't quantitatively change the
operation of the system; they don't open up new applications that aren't
possible today. Deathandtaxes on the forum argued that Bitcoin needs
a several hundred megabyte blocksize to directly meet the worldwide
transaction needs _without retail_... Why without retail? Retail needs
near instant soft security, which cannot be achieved directly with a
global decentralized blockchain.

I don't think 1MB is magic; it always exists relative to widely-deployed
technology, sociology, and economics. But these factors aren't a simple
function; the procedure I'd prefer would be something like this: if there
is a standing backlog, we-the-community of users look to indicators to
gauge if the network is losing decentralization and then double the
hard limit with proper controls to allow smooth adjustment without
fees going to zero (see the past proposals for automatic block size
controls that let miners increase up to a hard maximum over the median
if they mine at quadratically harder difficulty), and we don't increase
if it appears it would be at a substantial increase in centralization
risk. Hardfork changes should only be made if they're almost completely
uncontroversial--where virtually everyone can look at the available data
and say "yea, that isn't undermining my property rights or future use
of Bitcoin; it's no big deal". Unfortunately, every indicator I can
think of except fee totals has been going in the wrong direction almost
monotonically along with the blockchain size increase since 2012 when
we started hitting full blocks and responded by increasing the default
soft target. This is frustrating; from a clean slate analysis of network
health I think my conclusion would be to _decrease_ the limit below the
current 300k/txn/day level.
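
For reference, one possible (heavily simplified, parameters hypothetical)
reading of the parenthetical above -- exceeding the recent median costs
quadratically more difficulty, up to a hard maximum:

def required_difficulty(base_difficulty, block_size, median_size, hard_max):
    """Difficulty a miner must meet to publish a block of block_size bytes."""
    if block_size > hard_max:
        raise ValueError("block exceeds the hard maximum")
    if block_size <= median_size:
        return base_difficulty
    excess = (block_size - median_size) / median_size  # fractional overshoot
    return base_difficulty * (1.0 + excess ** 2)       # quadratic penalty

# A block 50% over the recent median costs 1.25x the work under these numbers.
print(required_difficulty(1.0, 1_500_000, 1_000_000, 8_000_000))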

This is obviously not acceptable, so instead many people--myself
included--have been working feverishly hard behind the scenes on Bitcoin
Core to increase the scalability. This work isn't small-potatoes
boring software engineering stuff; I mean even my personal contributions
include things like inventing a wholly new generic algebraic optimization
applicable to all EC signature schemes that increases performance by 4%,
and that is before getting into the R&D stuff that hasn't really borne
fruit yet, like fraud proofs. Today Bitcoin Core is easily >100 times
faster to synchronize and relay than when I first got involved on the
same hardware, but these improvements have been swallowed by the growth.
The ironic thing is that our frantic efforts to keep ahead and not
lose decentralization have both not been enough (by the best measures,
full node usage is the lowest it's been since 2011 even though the user
base is huge now) and yet also so much that people could seriously talk
about increasing the block size to something gigantic like 20MB. This
sounds less reasonable when you realize that even at 1MB we'd likely
have a smoking hole in the ground if not for existing enormous efforts
to make scaling not come at a loss of decentralization.


I'm curious as to what discussions people have seen; e.g., are people
even here aware of these concerns? Are you aware of things like the
hashcash mediated dynamic blocksize limiting? About proposals like
lightning network (instant transactions and massive scale, in exchange
for some short term DOS risk if a counterparty opts out)? Do people
(other than Mike Hearn; I guess) think a future where everyone depends
on a small number of "Google scale" node operations for the system is
actually okay? (I think not, and if so we're never going to agree--but
it can be helpful to understand when a disagreement is ideological).
Peter Todd
2015-05-07 01:49:52 UTC
Permalink
Post by Matt Corallo
Personally, I'm rather strongly against any commitment to a block size
increase in the near future. Long-term incentive compatibility requires
that there be some fee pressure, and that blocks be relatively
consistently full or very nearly full. What we see today are
transactions enjoying next-block confirmations with nearly zero pressure
to include any fee at all (though many do because it makes wallet code
simpler).
Agreed.

I'm not sure if you've seen this, but a good paper on this topic was
published recently: "The Economics of Bitcoin Transaction Fees"

Abstract
--------

We study the economics of Bitcoin transaction fees in a simple static
partial equilibrium model with the specificity that the system security
is directly linked to the total computational power of miners. We show
that any situation with a fixed fee is equivalent to another situation
with a limited block size. In both cases, we give the optimal value of
the transaction fee or of the block size. We also show that making the
block size a non-binding constraint and, at the same time, letting the
fee be fixed as the outcome of a decentralized competitive market cannot
guarantee the very existence of Bitcoin in the long-term.

-http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2400519

In short, without either a fixed blocksize or fixed fee per transaction
Bitcoin will not survive, as there is no viable way to pay for PoW
security. The latter option - fixed fee per transaction - is non-trivial
to implement in a way that's actually meaningful - it's easy to give
miners "kickbacks" - leaving us with a fixed blocksize.
Post by Matt Corallo
This allows the well-funded Bitcoin ecosystem to continue building
systems which rely on transactions moving quickly into blocks while
pretending these systems scale. Thus, instead of working on technologies
I think this lack of understanding of the limitations of blockchain tech
is very dangerous, never mind, downright misleading. I keep running into
startups at conferences with completely unrealistic ideas about how
large they'll be able to grow their on-blockchain businesses. For
example, a few weeks ago at the Stanford blockchain conference I spoke
to a company planning on using multisig escrow contracts to settle
financial instruments, and expected to be doing about as many
transactions/day on the blockchain for their business within a year or
so as all other Bitcoin users currently do combined. These guys quite
frankly had no understanding of the issues, and had apparently based
their plans on the highly optimistic Bitcoin wiki page on
scalability.(1) (I'd fix this now, but the wiki seems to not be allowing
logins)

We'd do a lot of startups a lot of good to give them accurate, and
honest, advice about the scalability of the system. The wiki definitely
isn't that. Neither is the bitcoin.org developer documentation(2), which
doesn't mention scalability at all.
Post by Matt Corallo
which bring Bitcoin's trustlessness to systems which scale beyond a
blockchain's necessarily slow and (compared to updating numbers in a
database) expensive settlement, the ecosystem as a whole continues to
focus on building centralized platforms and advocate for changes to
Bitcoin which allow them to maintain the status quo[1].
Even a relatively small increase to 20MB will greatly reduce the number
of people who can participate fully in Bitcoin, creating an environment
where the next increase requires the consent of an even smaller portion
of the Bitcoin ecosystem. Where does that stop? What's the proposed
mechanism that'll create an incentive and social consensus to not just
'kick the can down the road'(3) and further centralize but actually
scale up Bitcoin the hard way? The only proposal that I've seen that
attempts to do this is John Dillon's proof-of-stake blocksize vote(4),
and that is far from getting consensus.

1) https://en.bitcoin.it/wiki/Scalability
2) https://bitcoin.org/en/developer-guide
3) http://gavinandresen.ninja/it-must-be-done-but-is-not-a-panacea
4) http://www.mail-archive.com/bitcoin-***@lists.sourceforge.net/msg02323.html
--
'peter'[:-1]@petertodd.org
000000000000000004dc867e4541315090329f45ed4dd30e2fd7423a38a72c0e
Justus Ranvier
2015-05-07 03:03:47 UTC
Permalink
Post by Peter Todd
I'm not sure if you've seen this, but a good paper on this topic
was published recently: "The Economics of Bitcoin Transaction
Fees"
...for some very strange definitions of "good".

That paper may present valid game theory, yet game theory has a
well-known limitation when it comes to predicting real world behavior
in that the predictions are only as good as the accuracy of the simplified
model on which they are based.

At the very least, we should wait to draw any conclusions from that
paper until it has been sanity checked by a praxeological review.
Thomas Zander
2015-05-08 11:02:56 UTC
Permalink
Post by Peter Todd
I'm not sure if you've seen this, but a good paper on this topic was
published recently: "The Economics of Bitcoin Transaction Fees"
The obvious flaw in this paper is that it talks about a block size in today's
(trivial) data-flow economy and compares it with the zero-reward situation
decades from now.

It's comparing two things that will never exist at the same time (unless
Bitcoin fails).
--
Thomas Zander
Pieter Wuille
2015-05-07 03:47:16 UTC
Permalink
Post by Matt Corallo
Recently there has been a flurry of posts by Gavin at
http://gavinandresen.svbtle.com/ which advocate strongly for increasing
the maximum block size. However, there hasn't been any discussion on this
mailing list in several years as far as I can tell.
Thanks for bringing this up. I'll try to keep my arguments brief, to avoid
a long wall of text. I may be re-iterating some things that have been said
before, though.

I am - in general - in favor of increasing the block size: as technology
grows, there is no reason why the systems built on it can't scale
proportionally. I have so far not commented much about this, in the hope of
avoiding a public debate, but the way things seem to be going now
worries me greatly.

* Controversial hard forks. I hope the mailing list here today already
proves it is a controversial issue. Independent of personal opinions pro or
against, I don't think we can do a hard fork that is controversial in
nature. Either the result is effectively a fork, and pre-existing coins can
be spent once on both sides (effectively failing Bitcoin's primary
purpose), or the result is one side forced to upgrade to something they
dislike - effectively giving developers a power they should never have.
Quoting someone: "I did not sign up to be part of a central banker's
committee".

* The reason for increasing is "need". If "we need more space in blocks" is
the reason to do an upgrade, it won't stop after 20 MB. There is nothing
fundamental possible with 20 MB blocks that isn't with 1 MB blocks.
Changetip does not put their microtransactions on the chain, not with 1 MB,
and I doubt they would with 20 MB blocks. The reason for increase should be
"because we choose to accept the trade-offs".

* Misrepresentation of the trade-offs. You can argue all you want that none
of the effects of larger blocks are particularly damaging, so everything is
fine. They will damage something (see below for details), and we should
analyze these effects, and be honest about them, and present them as a
trade-off made we choose to make to scale the system better. If you just
ask people if they want more transactions, of course you'll hear yes. If
you ask people if they want to pay less taxes, I'm sure the vast majority
will agree as well.

* Miner centralization. There is currently, as far as I know, no technology
that can relay and validate 20 MB blocks across the planet, in a manner
fast enough to avoid very significant costs to mining. There is work in
progress on this (including Gavin's IBLT-based relay, or Greg's block
network coding), but I don't think we should be basing the future of the
economics of the system on undemonstrated ideas. Without those (or even
with), the result may be that miners self-limit the size of their blocks to
propagate faster, but if this happens, larger, better-connected, and more
centrally-located groups of miners gain a competitive advantage by being
able to produce larger blocks. I would like to point out that there is
nothing evil about this - a simple feedback to determine an optimal block
size for an individual miner will result in larger blocks for better-connected
hash power (a toy illustration of this feedback follows this list). If we do
not want miners to have this ability, "we"
(as in: those using full nodes) should demand limitations that prevent it.
One such limitation is a block size limit (whatever it is).

* Ability to use a full node. I very much dislike the trend of people
saying "we need to encourage people to run full nodes, in order to make the
network more decentralized". Running 1000 nodes which are otherwise unused
only gives some better ability for full nodes to download the block chain,
or for SPV nodes to learn about transactions (or be Sybil-attacked...).
However, *using* a full node for validating your business (or personal!)
transactions empowers you to use a financial system that requires less
trust in *anyone* (not even in a decentralized group of peers) than
anything else. Moreover, using a full node is what gives you power over the
system's rules, as anyone who wants to change them now needs to convince you
to upgrade. And yes, 20 MB blocks will change people's ability to use full
nodes, even if the costs are small.

* Skewed incentives for improvements. I think I can personally say that I'm
responsible for most of the past years' performance improvements in Bitcoin
Core. And there is a lot of room for improvement left there - things like
silly waiting loops, single-threaded network processing, huge memory sinks,
lock contention, ... which in my opinion don't nearly get the attention
they deserve. This is in addition to more pervasive changes like optimizing
the block transfer protocol, support for orthogonal systems with a
different security/scalability trade-off like Lightning, making full
validation optional, ... Call me cynical, but without actual pressure to
work on these, I doubt much will change. Increasing the size of blocks now
will simply make it cheap enough to continue business as usual for a while
- while forcing a massive cost increase (and not just a monetary one) on
the entire ecosystem.

* Fees and long-term incentives. I put this last, not because I don't think
it is serious, but because I don't understand nearly enough about it.
I'll let others comment.
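
Returning to the miner-centralization point above: a toy model (all numbers
and the propagation model are illustrative assumptions) of how the per-miner
optimal block size grows with connectivity:

import math

def expected_reward(size_mb, fee_per_mb, subsidy, prop_s_per_mb, base_delay_s,
                    block_interval_s=600.0):
    """Block value discounted by the orphan risk its own propagation creates."""
    delay = base_delay_s + size_mb * prop_s_per_mb
    return (subsidy + size_mb * fee_per_mb) * math.exp(-delay / block_interval_s)

def best_size(max_mb, **kwargs):
    sizes = [i / 10 for i in range(int(max_mb * 10) + 1)]
    return max(sizes, key=lambda s: expected_reward(s, **kwargs))

# Well-connected miner (4 s/MB) vs poorly connected miner (60 s/MB):
print(best_size(20, fee_per_mb=1.0, subsidy=5.0, prop_s_per_mb=4.0, base_delay_s=1.0))   # 20.0
print(best_size(20, fee_per_mb=1.0, subsidy=5.0, prop_s_per_mb=60.0, base_delay_s=1.0))  # 5.0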

I don't think 1 MB is optimal. Block size is a compromise between
scalability of transactions and verifiability of the system. A system with
10 transactions per day that is verifiable by a pocket calculator is not
useful, as it would only serve a few large bank's settlements. A system
which can deal with every coffee bought on the planet, but requires a
Google-scale data center to verify is also not useful, as it would be
trivially out-competed by a VISA-like design. The usefulness lies in a
balance, and there is no optimal choice for everyone. We can choose where
that balance lies, but we must accept that this is done as a trade-off, and
that that trade-off will have costs such as hardware costs, decreasing
anonymity, less independence, smaller target audience for people able to
fully validate, ...

Choose wisely.

Thanks for reading this,
--
Pieter
Mike Hearn
2015-05-07 09:25:04 UTC
Permalink
Hey Matt,

OK, let's get started ....

Post by Matt Corallo
However, there hasn't been any discussion on this
mailing list in several years as far as I can tell.
Probably because this list is not a good place for making progress or
reaching decisions. Those are triggered by pull requests (sometimes).

If you're wondering "why now", that's probably my fault. A few days ago
Wladimir posted a release timeline. I observed to Wladimir and Gavin in
private that this timeline meant a change to the block size was unlikely to
get into 0.11, leaving only 0.12, which would give everyone only a few
months to upgrade in order to fork the chain by the end of the winter
growth season. That seemed tight.

Wladimir did not reply to this email, unfortunately. Perhaps he would like
the issue to go away. It won't - if Bitcoin continues on its current growth
trends it *will* run out of capacity, almost certainly by some time next
year.

What we need to see right now is leadership and a plan, that fits in the
available time window.
Post by Matt Corallo
Certainly a consensus in this kind of technical community should be a
basic requirement for any serious commitment to blocksize increase.
I'm afraid I have come to disagree. I no longer believe this community can
reach consensus on anything protocol related. Some of these arguments have
dragged on for years. Consensus isn't even well defined - consensus of who?
Anyone who shows up? And what happens when, inevitably, no consensus is
reached? Stasis forever?
Post by Matt Corallo
Long-term incentive compatibility requires that there be some fee
pressure, and that blocks be relatively consistently full or very nearly
full.
I disagree. When the money supply eventually dwindles I doubt it will be
fee pressure that funds mining, but as that's a long time in the future,
it's very hard to predict what might happen.
Post by Matt Corallo
What we see today are
transactions enjoying next-block confirmations with nearly zero pressure
to include any fee at all (though many do because it makes wallet code
simpler).
Many do because free transactions are broken - the relay limiter means
whether a free transaction actually makes it across the network or not is
basically pot luck and there's no way for a wallet to know, short of either
trying it or actually receiving every single transaction and repeating the
calculations. If free transactions weren't broken for all non-full nodes
they'd probably be used a lot more.
Post by Matt Corallo
This allows the well-funded Bitcoin ecosystem to continue building
systems which rely on transactions moving quickly into blocks while
pretending these systems scale.
I have two huge problems with this line of thinking.

Firstly, no, the "Bitcoin ecosystem" is not well funded. Blockstream might
be, but significant numbers of users are running programs developed by tiny
startups, or volunteers who don't have millions in venture capital to play
with.

Arm-twisting "the ecosystem" into developing complicated Rube Goldberg
machines in double quick time, just to keep the Bitcoin show on the road,
is in fact the opposite of decentralisation - it will effectively exclude
anyone who isn't able to raise large amounts of corporate funding from
writing code that uses the Bitcoin network. Decentralisation benefits from
simplicity, and bigger blocks are (in Gavin's words) "the simplest thing
that will work".

My second problem is the claim that everyone is playing pretend about
Bitcoin, except you guys. I would put it another way - I would say those
people are building products and getting users, by making reasonable
engineering tradeoffs and using systems that work. Yes, one day those
systems might have to change. That's the nature of scaling. It's the nature
of progress. But not today. Probably not tomorrow either.

What I would like to see from Blockstream is a counter-proposal. So far you
have made lots of vague comments that we all agree with - yes,
decentralisation is good, yes some block size limit must exist, if only
because computers are finite machines.

What I don't see from you yet is a *specific and credible plan* that fits
within the next 12 months and which allows Bitcoin to keep growing. Not
some vague handwave like "let's all use the Lightning network" (which does
not exist), or "let's do more research" (Gavin has done plenty of
research), or "but what about the risks" (Bitcoin is full of risks). A
plan, with dates attached, and a strong chance of actually being deployed
in time.
Peter Todd
2015-05-07 10:12:50 UTC
Permalink
Post by Mike Hearn
Post by Matt Corallo
Certainly a consensus in this kind of technical community should be a
basic requirement for any serious commitment to blocksize increase.
I'm afraid I have come to disagree. I no longer believe this community can
reach consensus on anything protocol related. Some of these arguments have
dragged on for years. Consensus isn't even well defined - consensus of who?
Anyone who shows up? And what happens when, inevitably, no consensus is
reached? Stasis forever?
Care to be specific?

We've made lots of protocol related changes, as well as non-consensus
policy changes, often in quite short timeframes, and with little drama.
For instance BIP66 adoption is progressing smoothly, and itself was very
quickly developed as part of a broader response to a serious OpenSSL
flaw. My own BIP65 is getting wide consensus with little drama and good
peer review, and that's happening even without as much attention paid to
it from myself as I should have been giving it. The BIP62 malleability
softfork is going more slowly, but that's because peer review is finding
issues and fixing them - something to be expected in an environment
where we simply must be cautious.

As for the v0.11 release, it will have pruning, perhaps the biggest
change to the way Bitcoin Core works that we've ever made. Equally it's
notable how many people collaborated on the implementation of pruning,
again with little drama.

Sure, some stuff has been hard to get consensus on. But those things
carry high risks, and involve code and practices known to be dangerous.
In most cases we've found the lack of consensus was spot on, and
controversial changes turned out later to have severe security
vulnerabilities. I read that as a sign that the peer review and
consensus building process works just fine.
--
'peter'[:-1]@petertodd.org
00000000000000000af0c4ba9d91c00d48c4493899d7235fd819ac76f16d148d
Btc Drak
2015-05-07 10:42:19 UTC
Permalink
Post by Mike Hearn
What I don't see from you yet is a *specific and credible plan* that fits
within the next 12 months and which allows Bitcoin to keep growing. Not
some vague handwave like "let's all use the Lightning network" (which does
not exist), or "let's do more research" (Gavin has done plenty of
research), or "but what about the risks" (Bitcoin is full of risks). A
plan, with dates attached, and a strong chance of actually being deployed
in time.
Would you please explain where this 12 months timeframe comes from?
Jorge Timón
2015-05-07 10:52:26 UTC
Permalink
I observed to Wladimir and Gavin in private that this timeline meant a change to the block size was unlikely to get into 0.11, leaving only 0.12, which would give everyone only a few months to upgrade in order to fork the chain by the end of the winter growth season. That seemed tight.
Can you please elaborate on what terrible things will happen if we
don't increase the block size by winter this year?
I assume that you are expecting full blocks by then, have you used any
statistical technique to come up with that date or is it just your
guess?
Because I love wild guesses and mine is that full 1 MB blocks will not
happen until June 2017.
What we need to see right now is leadership and a plan, that fits in the
available time window.
Post by Matt Corallo
Certainly a consensus in this kind of technical community should be a
basic requirement for any serious commitment to blocksize increase.
I'm afraid I have come to disagree. I no longer believe this community can
reach consensus on anything protocol related. Some of these arguments have
dragged on for years. Consensus isn't even well defined - consensus of who?
Anyone who shows up? And what happens when, inevitably, no consensus is
reached? Stasis forever?
We've successfully reached consensus for several softfork proposals already.
I agree with others that hardforks need to be uncontroversial and there
should be consensus about them.
If you have other ideas for the criteria for hardfork deployment, I'm all ears.
I just hope that by "What we need to see right now is leadership" you
don't mean something like "when Gavin and Mike agree it's enough to
deploy a hardfork" when you go from vague to concrete.
Post by Matt Corallo
Long-term incentive compatibility requires that there be some fee
pressure, and that blocks be relatively consistently full or very nearly
full.
I disagree. When the block subsidy eventually dwindles I doubt it will be fee
pressure that funds mining, but as that's a long time in the future, it's
very hard to predict what might happen.
Oh, so your answer to "Bitcoin will eventually need to live on fees,
and we would like to know more about how that will look" is
"Bitcoin is broken long term, but that's far away in the
future, so let's just worry about the present".
I agree that it's hard to predict that future, but having some
competition for block space would actually help us get more data on a
similar situation to be able to predict that future better.
What you want to avoid at all cost (the block size actually being
used), I see as the best opportunity we have to look into the future.
Post by Matt Corallo
What we see today are
transactions enjoying next-block confirmations with nearly zero pressure
to include any fee at all (though many do because it makes wallet code
simpler).
Many do because free transactions are broken - the relay limiter means
whether a free transaction actually makes it across the network or not is
basically pot luck and there's no way for a wallet to know, short of either
trying it or actually receiving every single transaction and repeating the
calculations. If free transactions weren't broken for all non-full nodes
they'd probably be used a lot more.
Free transactions are a gift from miners that run an altruistic policy.
That's great but we shouldn't rely on them for the future. They will
likely disappear at some point and that's ok.
In any case, he's not complaining about the lack of free transactions,
more like the opposite.
He is saying that it's very easy to get free transactions into the next
block, and blocks aren't full, so there's no incentive to include fees
to compete for the space.
We can talk a lot about "a fee market" and build a theoretically
perfect fee estimator but we won't actually have a fee market until
there's some competition for space.
Nobody will pay for space that's abundant, just like people don't pay
for the air they breathe.
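
To make that concrete, here is a toy sketch in Python (my own illustration,
not a description of any real miner or wallet code): miners fill blocks with
the highest-feerate transactions first, so a cut-off price for block space
only appears once demand exceeds the limit.

# Toy illustration only: a fee market appears once demand exceeds block space.
def fill_block(mempool, max_block_bytes):
    # mempool: list of (size_bytes, fee_satoshis); sort by feerate, highest first
    by_feerate = sorted(mempool, key=lambda tx: tx[1] / tx[0], reverse=True)
    block, used = [], 0
    for size, fee in by_feerate:
        if used + size <= max_block_bytes:
            block.append((size, fee))
            used += size
    return block

# While everything fits, even zero-fee transactions get in and there is no
# competition; once it doesn't, the lowest feerate that still makes it into
# the block is effectively the market price for space.
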
What I don't see from you yet is a specific and credible plan that fits
within the next 12 months and which allows Bitcoin to keep growing. Not some
vague handwave like "let's all use the Lightning network" (which does not
exist), or "let's do more research" (Gavin has done plenty of research), or
"but what about the risks" (Bitcoin is full of risks). A plan, with dates
attached, and a strong chance of actually being deployed in time.
Ok, this is my plan: we wait 12 months, hope that your estimations are
correct (in case that my guess was better than yours, we keep waiting
until June 2017) and start having full blocks and people having to
wait 2 blocks for their transactions to be confirmed sometimes.
That would be the beginning of a true "fee market", something that
Gavin used to say was his #1 priority not so long ago (which seems
to contradict his current efforts to prevent that from happening).
Having a true fee market seems clearly an advantage.
What are the supposedly disastrous negative parts of this plan that make
an alternative plan (i.e. increasing the block size) so necessary and
obvious?
I think the advocates of the size increase are failing to explain the
disadvantages of maintaining the current size. It feels like the
explanations are missing because it should be somehow obvious how the
sky will burn if we don't increase the block size soon.
But, well, it is not obvious to me, so please elaborate on why having
a fee market (instead of just a price estimator for a market that
doesn't even really exist) would be a disaster.
Andrew
2015-05-07 11:15:57 UTC
Permalink
I'm mainly just an observer on this. I mostly agree with Pieter. Also, I
think the main reason why people like Gavin and Mike Hearn are trying to
rush this through is because they have some kind of "apps" that depend on
zero conf instant transactions, so this would of course require more
traffic on the blockchain. I think people like Gavin or Mike should state
clearly what kind of (rigorous) system for instant transactions is
satisfactory for use in their applications. Be it lightning or something
similar, what is good enough? And no, zero conf is not a really secure system.
Then once we know what is good enough for them (and everyone else), we can
implement it as a soft fork into the protocol, and it's a win-win situation
for both sides (we can also benefit from all the new users people like Mike
are trying to bring in).
--
PGP: B6AC 822C 451D 6304 6A28 49E9 7DB7 011C D53B 5647
Mike Hearn
2015-05-07 11:29:44 UTC
Permalink
Post by Jorge Timón
Can you please elaborate on what terrible things will happen if we
don't increase the block size by winter this year?
I was referring to winter next year. 0.12 isn't scheduled until the end of
the year, according to Wladimir. I explained where this figure comes from
in this article:

https://medium.com/@octskyward/bitcoin-s-seasonal-affective-disorder-35733bab760d

It's a fairly simple estimate based on previous growth patterns.
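
For what it's worth, the shape of that estimate is nothing fancy; here is a
toy Python version with made-up starting numbers (the article itself works
from the observed growth patterns):

# Toy version of the estimate (illustrative numbers only, not the article's
# data): project the average block size forward at an assumed monthly growth
# rate and count how long until it hits the 1 MB limit.
def months_until_full(avg_block_kb=400, monthly_growth=0.05, limit_kb=1000):
    months = 0
    while avg_block_kb < limit_kb:
        avg_block_kb *= 1 + monthly_growth
        months += 1
    return months

print(months_until_full())  # ~19 months under these made-up assumptions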

Post by Jorge Timón
Because I love wild guesses and mine is that full 1 MB blocks will not
happen until June 2017.
OK, it could be. But do you think this debate will play out significantly
differently if you are right, I am wrong, and we have this discussion next
summer instead? Because in several years of watching these debates, I
haven't seen much change in them.
Post by Jorge Timón
We've successfully reached consensus for several softfork proposals already.
Are you sure about that?

What if Gavin popped up right now and said he disagreed with every current
proposal, he disagreed with side chains too, and there would be no
consensus on any of them until the block size limit was raised.

Would you say, oh, OK, guess that's it then. There's no consensus so might
as well scrap all those proposals, as they'll never happen anyway. Bye bye
side chains whitepaper.
Post by Jorge Timón
I just hope that by "What we need to see right now is leadership" you
don't mean something like "when Gavin and Mike agree it's enough to
deploy a hardfork" when you go from vague to concrete.
No. What I meant is that someone (theoretically Wladimir) needs to make a
clear decision. If that decision is "Bitcoin Core will wait and watch the
fireworks when blocks get full", that would be showing leadership .....
albeit I believe in the wrong direction. It would, however, let people know
what's what and let them start to make longer term plans.

This dillydallying around is an issue - people just make vague points that
can't really be disagreed with (more nodes would be nice, smaller pools
would also be nice etc), and nothing gets done.
Post by Jorge Timón
"no bitcoin long term it's broken long term but that's far away in the
future so let's just worry about the present".
I never said Bitcoin is broken in the long term. Far from it - I laid out
my ideas for what will happen when the block subsidy dwindles years ago.

But yes, it's hard for me to care overly much about what happens 30 years
from now, for the same reason you probably care more about what happens
tomorrow than what happens after you are dead. The further into the future
you try and plan, the less likely your plans are to survive unscathed.
Post by Jorge Timón
What you want to avoid at all cost (the block size actually being
used), I see as the best opportunity we have to look into the future.
I think I see one of the causes of disagreement now.

I will write more on the topic of what will happen if we hit the block size
limit soon, maybe this evening. I have some other tasks to do first.

Regardless, I don't believe we will get any useful data out of such an
event. I've seen distributed systems run out of capacity before. What will
happen instead is technological failure followed by rapid user abandonment
that pushes traffic back below the pressure threshold .... and those users
will most likely not come back any time soon.
Post by Jorge Timón
Ok, this is my plan: we wait 12 months, hope that your estimations are
correct (in case that my guess was better than yours, we keep waiting
until June 2017) and start having full blocks and people having to
wait 2 blocks for their transactions to be confirmed sometimes.
I disagree that'd be the outcome, but good, this is progress. Now we need
to hear something like that from Wladimir, or whoever has the final say
around here.

With respect to the fee market: I think it's fairer to say Gavin wants a
market to exist, and he also wants supply to be plentiful. 20mb limit
doesn't actually mean every block will be 20mb the day after, no more than
they're all 1mb today. Miners may discover that if they go beyond 5mb they
have too many orphans and then propagation speed will have to be optimised
to break through the next bottleneck. Scaling is always about finding the
next bottleneck and removing it, ideally, before you hit it.
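
To illustrate why propagation becomes the next bottleneck, a back-of-envelope
sketch in Python (the relay rate is an assumption, not a measurement):

import math

# Back-of-envelope only: bigger blocks take longer to propagate, and the chance
# that a competing block is found during that delay (orphaning yours) is roughly
# 1 - exp(-delay / 600) with ten-minute average block intervals.
def orphan_probability(block_bytes, relay_bytes_per_second):
    delay = block_bytes / relay_bytes_per_second
    return 1 - math.exp(-delay / 600.0)

# Assuming an effective network-wide relay rate of ~1 MB/s (an assumption):
for mb in (1, 5, 20):
    print(mb, "MB ->", round(100 * orphan_probability(mb * 1000000, 1000000), 2), "%")
# ~0.17%, ~0.83%, ~3.28%: the incentive described above for miners to keep
# their own blocks below whatever the network can relay quickly.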
Jorge Timón
2015-05-07 12:26:10 UTC
Permalink
Post by Mike Hearn
I was referring to winter next year. 0.12 isn't scheduled until the end of
the year, according to Wladimir. I explained where this figure comes from in
this article. It's a fairly simple estimate based on previous growth patterns.
Ok, thanks.
Post by Mike Hearn
Post by Jorge Timón
We've successfully reached consensus for several softfork proposals already.
Are you sure about that?
Yes, Peter Todd gave more details.
Post by Mike Hearn
What if Gavin popped up right now and said he disagreed with every current
proposal, he disagreed with side chains too, and there would be no consensus
on any of them until the block size limit was raised.
Would you say, oh, OK, guess that's it then. There's no consensus so might
as well scrap all those proposals, as they'll never happen anyway. Bye bye
side chains whitepaper.
Well, yes, it is true that "universally uncontroversial" (which is
what I think the requirement should be for hard forks) is a vague
qualifier that's not formally defined anywhere.
I guess we should only consider rational arguments. You cannot just
nack something without further explanation.
If his explanation was "I will change my mind after we increase block
size", I guess the community should say "then we will just ignore your
nack because it makes no sense".
In the same way, when people use fallacies (purposely or not) we must
expose that and say "this fallacy doesn't count as an argument".
But yeah, it would probably be good to define better what constitutes
a "sensible objection" or something. That doesn't seem simple though.
Post by Mike Hearn
Post by Jorge Timón
I just hope that by "What we need to see right now is leadership" you
don't mean something like "when Gavin and Mike agree it's enough to
deploy a hardfork" when you go from vague to concrete.
No. What I meant is that someone (theoretically Wladimir) needs to make a
clear decision. If that decision is "Bitcoin Core will wait and watch the
fireworks when blocks get full", that would be showing leadership .....
albeit I believe in the wrong direction. It would, however, let people know
what's what and let them start to make longer term plans.
This dillydallying around is an issue - people just make vague points that
can't really be disagreed with (more nodes would be nice, smaller pools
would also be nice etc), and nothing gets done.
Well, there are two different things here.
One thing is the Bitcoin core project where you could argue that the 5
committers decide (I don't know why Wladimir would have any more
authority than the others).
But what the Bitcoin network itself does is very different because,
unlike the Bitcoin core software project, the Bitcoin network is
decentralized.
If the people with commit access go nuts and decide something that's
clearly stupid or evil, people can just fork the project because it is
free software.
You cannot be forced to use specific features of free software, you
can always remove them and recompile, that's the whole point.
So, no, there's no authority to decide on hardforks and that's why I
think that only clearly uncontroversial things can get through as
hardforks.
Post by Mike Hearn
Post by Jorge Timón
What you want to avoid at all cost (the block size actually being
used), I see as the best opportunity we have to look into the future.
I think I see one of the causes of disagreement now.
I will write more on the topic of what will happen if we hit the block size
limit soon, maybe this evening. I have some other tasks to do first.
Regardless, I don't believe we will get any useful data out of such an
event. I've seen distributed systems run out of capacity before. What will
happen instead is technological failure followed by rapid user abandonment
that pushes traffic back below the pressure threshold .... and those users
will most likely not come back any time soon.
Ok, so in simple terms, you expect people to have to pay enormous fees
and/or wait thousands of blocks for their transactions to get included
in the chain.
Is that correct?
Post by Mike Hearn
Post by Jorge Timón
Ok, this is my plan: we wait 12 months, hope that your estimations are
correct (in case that my guess was better than yours, we keep waiting
until June 2017) and start having full blocks and people having to
wait 2 blocks for their transactions to be confirmed sometimes.
I disagree that'd be the outcome, but good, this is progress. Now we need to
hear something like that from Wladimir, or whoever has the final say around
here.
As said above there's no authority to decide on what Bitcoin the p2p
network does. Again, that's the whole point.
But, yes, I agree that both sides understanding each other better is progress.
Post by Mike Hearn
With respect to the fee market: I think it's fairer to say Gavin wants a
market to exist, and he also wants supply to be plentiful. 20mb limit
doesn't actually mean every block will be 20mb the day after, no more than
they're all 1mb today. Miners may discover that if they go beyond 5mb they
have too many orphans and then propagation speed will have to be optimised
to break through the next bottleneck. Scaling is always about finding the
next bottleneck and removing it, ideally, before you hit it.
I'm sure he wants a fee market to eventually exist as well.
But it seems that some people would like to see that happening before
the subsidies are low (not necessarily null), while other people are
fine waiting for that but don't want to ever be close to the scale
limits anytime soon.
I would also like to know for how long we need to prioritize short
term adoption in this way. As others have said, if the answer is
"forever, adoption is always the most important thing" then we will
end up with an improved version of Visa.
But yeah, this is progress, I'll wait for your more detailed
description of the tragedies that will follow hitting the block
limits, assuming for now that it will happen in 12 months.
My previous answer to the nervous "we will hit the block limits in 12
months if we don't do anything" was "not sure about 12 months, but
whatever, great, I'm waiting for that to observe how fees get
affected".
But it should have been a question "what's wrong with hitting the
block limits in 12 months?"
Mike Hearn
2015-05-07 14:05:41 UTC
Permalink
Post by Jorge Timón
If his explanation was "I will change my mind after we increase block
size", I guess the community should say "then we will just ignore your
nack because it makes no sense".
Oh good! We can just kick anyone out of the consensus process if we think
they make no sense.

I guess that means me and Gavin can remove everyone else from the developer
consensus, because we think trying to stop Bitcoin growing makes no sense.

Do you see the problem with this whole notion? It cannot possibly work.
Whenever you try and make the idea of developer consensus work, what you
end up with is "I believe in consensus as long as it goes my way". Which is
worthless.
Post by Jorge Timón
One thing is the Bitcoin core project where you could argue that the 5
committers decide (I don't know why Wladimir would have any more
authority than the others).
Because he is formally the maintainer.

Maybe you dislike that idea. It's so .... centralised. So let's say Gavin
commits his patch, because his authority is equal to all other committers.
Someone else rolls it back. Gavin sets up a cron job to keep committing the
patch. Game over.

You cannot have committers fighting over what goes in and what doesn't.
That's madness. There must be a single decision maker for any given
codebase.
Post by Jorge Timón
Ok, so in simple terms, you expect people to have to pay enormous fees
and/or wait thousands of blocks for their transactions to get included
in the chain. Is that correct?
No. I'll write an article like the others, it's better than email for more
complicated discourse.

Post by Jorge Timón
As others have said, if the answer is "forever, adoption is always the most
important thing" then we will end up with an improved version of Visa.
This appears to be another one of those fundamental areas of disagreement.
I believe there is no chance of Bitcoin ending up like Visa, even if it is
wildly successful. I did the calculations years ago that show that won't
happen:

https://en.bitcoin.it/wiki/Scalability

Decentralisation is a spectrum and Bitcoin will move around on that
spectrum over time. But claiming we have to pick between 1mb blocks and
"Bitcoin = VISA" is silly.



Peter: your hypocrisy really is bottomless, isn't it? You constantly
claim to be a Righteous Defender of Privacy, but don't even hesitate before
publishing hacked private emails when it suits you.

Satoshi's hacker had no illusions about your horrible personality, which is
why he forwarded that email to you specifically. He knew you'd use it. You
should reflect on that fact. It says nothing good about you at all.
Bryan Bishop
2015-05-07 14:18:17 UTC
Permalink
Post by Mike Hearn
Maybe you dislike that idea. It's so .... centralised. So let's say Gavin
commits his patch, because his authority is equal to all other committers.
Someone else rolls it back. Gavin sets up a cron job to keep committing the
patch. Game over.
You cannot have committers fighting over what goes in and what doesn't.
That's madness. There must be a single decision maker for any given
codebase.
Hmm, git repositories don't quite work like that. Instead, you should
imagine everyone having a local copy of the git repository. Each
developer synchronizes their git repository with other developers.
They merge changes from specific remote branches that they have
received. Each developer has their own branch and each developer is
the "single decision maker" for the artifact that they compile.

- Bryan
http://heybryan.org/
1 512 203 0507
Peter Todd
2015-05-07 14:22:24 UTC
Permalink
Post by Mike Hearn
Peter: your hypocrisy really is bottomless, isn't it? You constantly
claim to be a Righteous Defender of Privacy, but don't even hesitate before
publishing hacked private emails when it suits you.
Satoshi's hacker had no illusions about your horrible personality, which is
why he forwarded that email to you specifically. He knew you'd use it. You
should reflect on that fact. It says nothing good about you at all.
As you know I was forwarded that email first, and because I *do* respect
your privacy I consulted with you via private IRC chat first, and as
you wished I didn't publish it. The hacker presumably gave up waiting
for me to do so and published it themselves seven months ago; to make
that clear I linked the source(1) of the email in my message. Those
emails simply are no longer private.

Frankly personal attacks like this - "your hypocrisy really is
bottomless, isn't it?", "Satoshi's hacker had no illusions about your
horrible personality" - simply don't belong on this mailing list and I
think we would all appreciate an apology.

1) https://www.reddit.com/r/Bitcoin/comments/2g9c0j/satoshi_email_leak/
--
'peter'[:-1]@petertodd.org
000000000000000012a3e40d5ee5c7fc2fb8367b720a9d499468ceb25366c1f3
Peter Todd
2015-05-07 14:40:12 UTC
Permalink
Post by Mike Hearn
Post by Jorge Timón
One thing is the Bitcoin core project where you could argue that the 5
committers decide (I don't know why Wladimir would have any more
authority than the others).
Because he is formally the maintainer.
I quite liked Wladimir's description of what someone with the ability
to merge pull requests into Bitcoin Core is:

@orionwl github.com/bitcoin/bitcoin repository admin, or maybe just "janitor"

-https://twitter.com/orionwl/status/563688293737697281

In any case, we can't force people to run Bitcoin Core - an unpopular
patch that fails to reach consensus is a strong sign that it may not get
user acceptance either - so we might as well accept that centralized
authority over the development process isn't going to fly and deal with
the sometimes messy consequences.

Like I said, you're welcome to fork the project and try to get user
acceptance for the fork.
--
'peter'[:-1]@petertodd.org
000000000000000013e67b343b1f6d75cc87dfb54430bdb3bcf66d8d4b3ef6b8
Gavin Andresen
2015-05-07 14:52:54 UTC
Permalink
For reference: the blog post that (re)-started this debate, and which links
to individual issues, is here:
http://gavinandresen.ninja/time-to-roll-out-bigger-blocks

In it, I asked people to email me objections I might have missed. I would
still appreciate it if people do that; it is impossible to keep up with
this mailing list, /r/bitcoin posts and comments, and #bitcoin-wizards and
also have time to respond thoughtfully to the objections raised.

I would very much like to find some concrete course of action that we can
come to consensus on. Some compromise so we can tell entrepreneurs "THIS is
how much transaction volume the main Bitcoin blockchain will be able to
support over the next eleven years."
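
For illustration, the arithmetic behind such a number is simple; the average
transaction size below is an assumption, not a measured figure:

# Illustrative arithmetic only: transactions per second implied by a block
# size limit, assuming ~600 seconds between blocks and ~250 bytes per
# transaction (the 250-byte average is an assumption).
AVG_TX_BYTES = 250
BLOCK_INTERVAL_SECONDS = 600.0

def tx_per_second(max_block_bytes):
    return (max_block_bytes / AVG_TX_BYTES) / BLOCK_INTERVAL_SECONDS

print(tx_per_second(1 * 1000 * 1000))   # ~6.7 tx/s at 1 MB
print(tx_per_second(20 * 1000 * 1000))  # ~133 tx/s at 20 MB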

I've been pretty clear on what I think is a reasonable compromise (a
one-time increase scheduled for early next year), and I have tried to
explain why I think it is the right set of tradeoffs.

There ARE tradeoffs here, and the hard question is what process do we use
to decide those tradeoffs? How do we come to consensus? Is it worth my
time to spend hours responding thoughtfully to every new objection raised
here, or will the same thing happen that happened last year and the year
before-- everybody eventually gets tired of arguing
angels-dancing-on-the-head-of-a-pin, and we're left with the status quo?

I AM considering contributing some version of the bigger blocksize-limit
hard-fork patch to the Bitcoin-Xt fork (probably "target a hobbyist with a
fast Internet connection, and assume Nielsen's law to increase over time"),
and then encouraging merchants and exchanges and web wallets and
individuals who think it strikes a reasonable balance to run it.

And then, assuming it became a super-majority of nodes on the network,
encourage miners to roll out a soft-fork to start producing bigger blocks
and eventually trigger the hard fork.

Because ultimately consensus comes down to what software people choose to
run.
--
--
Gavin Andresen
Peter Todd
2015-05-07 14:56:58 UTC
Permalink
Post by Gavin Andresen
I AM considering contributing some version of the bigger blocksize-limit
hard-fork patch to the Bitcoin-Xt fork (probably "target a hobbyist with a
fast Internet connection, and assume Nielsen's law to increase over time"),
and then encouraging merchants and exchanges and web wallets and
individuals who think it strikes a reasonable balance to run it.
And then, assuming it became a super-majority of nodes on the network,
encourage miners to roll out a soft-fork to start producing bigger blocks
and eventually trigger the hard fork.
Would you please explain what you mean by "a soft-fork to start
producing bigger blocks"
--
'peter'[:-1]@petertodd.org
00000000000000000d49f263bbbb80f264abc7cc930fc9cbc7ba80ac068d9648
Alex Morcos
2015-05-07 15:04:25 UTC
Permalink
That strikes me as a dangerous path forward.

I don't actually think there is anything wrong with this: "everybody
eventually gets tired of arguing angels-dancing-on-the-head-of-a-pin, and
we're left with the status quo"

What gives Bitcoin value aren't its technical merits but the fact that
people believe in it. The biggest risk here isn't that 20MB blocks will
be bad or that 1MB blocks will be bad, but that by forcing a hard fork that
isn't nearly universally agreed upon, we will be damaging that belief. If
I strongly believed some hard fork would be better for Bitcoin, say
permanent inflation of 1% a year to fund mining, and I managed to convince
80% of users, miners, businesses and developers to go along with me, I
would still vote against doing it. Because that's not nearly universal
agreement, and it changes what people chose to believe in without their
consent. Forks should be hard, very hard. And both sides should recognize
that belief in the value of Bitcoin might be a fragile thing. I'd argue
that if we didn't force through a 20MB fork now, and we ran into major
network difficulties a year from now and had no other technical solutions,
that maybe we would get nearly universal agreement, and the businesses and
users that were driven away by the unusable system would be a short term
loss in value considerably smaller than the impairment we risk by forcing a
change.
Jeff Garzik
2015-05-07 15:09:18 UTC
Permalink
100% agree, RE hard forks should be hard.

However, it is the paradox of growth, morale and adoption that bitcoin
might never reach the point where it is saturated and expensive enough
that larger blocks are demanded by 95%+... simply because people and
companies choose not to adopt bitcoin in the first place due to an unmoving,
[perceived | real] scalability roadblock.
--
Jeff Garzik
Bitcoin core developer and open source evangelist
BitPay, Inc. https://bitpay.com/
Mike Hearn
2015-05-07 15:12:38 UTC
Permalink
Post by Alex Morcos
What gives Bitcoin value aren't its technical merits but the fact that
people believe in it.
Much of the belief in Bitcoin is that it has a bright future. Certainly the
huge price spikes we've seen were not triggered by equally large spikes in
usage - it's speculation on that future.

I quite agree that if people stop believing in Bitcoin, that will be bad. A
fast way to bring that about will be to deliberately cripple the technology
in order to force people onto something quite different (which probably
won't be payment channel networks).
Post by Alex Morcos
I'd argue that if we didn't force through a 20MB fork now, and we ran into
major network difficulties a year from now and had no other technical
solutions, that maybe we would get nearly universal agreement
I doubt it. The disagreement seems more philosophical than technical. If
Bitcoin fell off a cliff then that'd just be taken as more evidence that
block chains don't work and we should all use some network of payment hubs,
or whatever the fashion of the day is. Or anyone who doesn't want to pay
high fees is unimportant. See all the other justifications Gavin is working
his way through on his blog.

That's why I conclude the opposite - if there is no fork, then people's
confidence in Bitcoin will be seriously damaged. If it's impossible to do
something as trivial as removing a temporary hack Satoshi put in place,
then what about bigger challenges? If the community is really willing to
drive itself off a cliff due to political deadlock, then why bother
building things that use Bitcoin at all?
Jeff Garzik
2015-05-07 15:17:03 UTC
Permalink
Post by Mike Hearn
That's why I conclude the opposite - if there is no fork, then people's
confidence in Bitcoin will be seriously damaged.
Yes, that is a possibility.
Post by Mike Hearn
If it's impossible to do something as trivial as removing a temporary hack
Satoshi put in place, then what about bigger challenges?
This is absolutely not a trivial change.

It is a trivial *code* change. It is not a trivial change to the economics
of a $3.2B system.
--
Jeff Garzik
Bitcoin core developer and open source evangelist
BitPay, Inc. https://bitpay.com/
Mike Hearn
2015-05-07 15:29:10 UTC
Permalink
Post by Jeff Garzik
It is a trivial *code* change. It is not a trivial change to the
economics of a $3.2B system.
Hmm - again I'd argue the opposite.

Up until now Bitcoin has been unconstrained by the hard block size limit.

If we raise it, Bitcoin will continue to be unconstrained by it. That's the
default "continue as we are" position.

If it's not raised, then ....... well, then we're in new territory
entirely. Businesses built on the assumption that Bitcoin could become
popular will suddenly have their basic assumptions invalidated. Users will
leave. The technical code change would be zero, but the economic change
would be significant.
Jeff Garzik
2015-05-07 15:35:47 UTC
Permalink
Yes - but you must recognize that is precisely 50% of the picture.

Others have made different assumptions - taking the [1MB-constrained]
market *as it exists today*, rather than in some projected future.

Raising the block size limit then becomes a *human decision* to favor some
users over others, a *human decision* to prevent an active and competitive
free fee market developing at 1MB, a *human decision* to keep transaction
fees low to incentivize bitcoin adoption, a *human decision* to value
adoption over decentralization.

These statements are not value judgements - not saying you are wrong -
these are observations of some rather huge, relevant blind spots in this
debate.
Post by Jeff Garzik
It is a trivial *code* change. It is not a trivial change to the
economics of a $3.2B system.
Hmm - again I'd argue the opposite.
Up until now Bitcoin has been unconstrained by the hard block size limit.
If we raise it, Bitcoin will continue to be unconstrained by it. That's
the default "continue as we are" position.
If it's not raised, then ....... well, then we're in new territory
entirely. Businesses built on the assumption that Bitcoin could become
popular will suddenly have their basic assumptions invalidated. Users will
leave. The technical code change would be zero, but the economic change
would be significant.
--
Jeff Garzik
Bitcoin core developer and open source evangelist
BitPay, Inc. https://bitpay.com/
Justus Ranvier
2015-05-07 16:18:32 UTC
Permalink
Post by Jeff Garzik
Raising the block size limit then becomes a *human decision* to
favor some users over others, a *human decision* to prevent an
active and competitive free fee market developing at 1MB, a *human
decision* to keep transaction fees low to incentivize bitcoin
adoption, a *human decision* to value adoption over
decentralization.
At the moment none of the following assertions have been proven true,
yet are constantly cited as if they have been:

* A competitive fee market will develop when the transaction rate
becomes constrained by the block size limit
* More users of Bitcoin means less decentralization

Furthermore, the term "decentralization" is frequently used without
being precisely defined in a way that would allow for such proofs to
be debated.

If there's going to be a debate on those points, then the people
presenting points on both sides should take the time to show their
work and explain the methodology they used to reach their conclusions.
Jorge Timón
2015-05-07 16:21:50 UTC
Permalink
Post by Gavin Andresen
I would very much like to find some concrete course of action that we can
come to consensus on. Some compromise so we can tell entrepreneurs "THIS is
how much transaction volume the main Bitcoin blockchain will be able to
support over the next eleven years."
Mhmm, I hadn't thought about this. This makes sense and actually
explains the urgency on taking a decision better than anything else
I've heard.
Post by Mike Hearn
If it's not raised, then ....... well, then we're in new territory entirely.
Businesses built on the assumption that Bitcoin could become popular will
suddenly have their basic assumptions invalidated. Users will leave. The
technical code change would be zero, but the economic change would be
significant.
This, on the other hand, is a non sequitur [1], another type of fallacy.
Well, several of them, actually:

- If it's not raised, then bitcoin cannot become popular
- If it's not raised, then users will leave
- Businesses built on the assumption that Bitcoin could become popular
were also assuming that it was going to be raised.

These statements may even be true, but they're not logical conclusions
even if they seem obvious to you.
I don't think those claims are strictly true, especially because they
involve predictions about what people will do.
But if they're true they require some proof or at least some explanation.

[1] http://en.wikipedia.org/wiki/Non_sequitur_(logic)#Affirming_the_consequent
Peter Todd
2015-05-07 17:29:56 UTC
Permalink
Post by Jorge Timón
Post by Gavin Andresen
I would very much like to find some concrete course of action that we can
come to consensus on. Some compromise so we can tell entrepreneurs "THIS is
how much transaction volume the main Bitcoin blockchain will be able to
support over the next eleven years."
Mhmm, I hadn't thought about this. This makes sense and actually
explains the urgency on taking a decision better than anything else
I've heard.
I've spent a lot of time talking to companies about this, and the
problem is telling them that isn't actually very useful; knowing the
supply side of the equation isn't all that useful if you don't know the
demand side. Problem is we don't really have a good handle on what
Bitcoin will be used for in the future, or even for that matter, what
it's actually being used for right now.

As we saw with Satoshidice before and quite possibly will see with smart
contracts (escrows, futures, etc) it's easy for a relatively small
number of use cases to drive a significant amount of transaction volume.
Yet, as Wladimir and others point out, the fundamental underlying
architecture of the blockchain has inherently poor O(n^2) scaling, so
there's always some level of demand where it breaks, and/or incentivizes
actors in the space to push up against "safety stops" like soft
blocksize limits and get them removed.
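
To spell the O(n^2) point out with a toy model (my framing, with illustrative
numbers): if every one of n full nodes validates the transactions of all n
users, the total work across the system grows quadratically.

# Toy model of the O(n^2) claim (illustrative framing, not a measurement):
# total validation work if every full node checks every user's transactions.
def total_validation_work(users, tx_per_user, full_node_fraction):
    transactions = users * tx_per_user
    validating_nodes = users * full_node_fraction
    return transactions * validating_nodes

for n in (1000, 10000, 100000):
    print(n, total_validation_work(n, 10, 1.0))
# 10x the users -> ~100x the total work, unless the fraction of users running
# full nodes drops - which is exactly the decentralisation trade-off at issue.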

Note how the previous response to bumping up against soft policy
limits was highly public calls(1) at the first hint of trouble: "Mike
Hearn: Soft block size limit reached, action required by YOU"

1) https://bitcointalk.org/index.php?topic=149668.0
--
'peter'[:-1]@petertodd.org
000000000000000002761482983864328320badf24d137101fab9a5861a59d30
Mike Hearn
2015-05-07 19:37:28 UTC
Permalink
Post by Jorge Timón
These statements may even be true, but they're not logical conclusions
even if they seem obvious to you.
I don't think those claims are strictly true, especially because they
involve predictions about what people will do.
But if they're true they require some proof or at least some explanation.
Thank you for your patience, Jorge.

I have written up an explanation of what I think will happen if we run out
of capacity:

https://medium.com/@octskyward/crash-landing-f5cc19908e32

Now I'm going to go eat some dinner :)
Jérémie Dubois-Lacoste
2015-05-07 19:44:13 UTC
Permalink
Any proposal to switch to a new hardcoded value, so we have time to
*really* figure out later what's next and all its implications, is a road
to a gigantic issue later when we want to switch to that "next".

Sure, we would have more time to think, research all the
implications, simulate, discuss, etc. But the ability to then agree
on a change and roll it out successfully will be much smaller,
because the economy built on top of Bitcoin will be much larger
and the technical specification of Bitcoin will be closer to a complete
freeze.

What I'm trying to say is that we should look at long-term, lasting
solutions even if that takes more effort and time right now and puts the
network into some "troubles" for a while, because they're short-term
"troubles". (How you define "troubles" depends on which side you stand
on at the moment...).

I personally believe in adaptive block size mechanisms, because:

(i) common sense tells me hardcoding is never a solution for a system
whose usage is in many aspects unpredictable
(ii) we can't rely on human consensus to adapt it (seeing the mess
it already is this time).

It would have the advantage of placing this block size issue entirely as
part of the algorithmic contract you agree to when you use Bitcoin,
similar to the difficulty adaptation or the block reward.
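
One of many possible shapes such a rule could take, as a toy Python sketch
only (the parameters are invented for illustration, not a proposal):

# Toy sketch of an adaptive limit (invented parameters, not a concrete
# proposal): every 2016 blocks, derive the next limit from the median size of
# recent blocks, clamped so it can at most double or halve per period -
# loosely analogous to the way the difficulty retarget is clamped.
ADJUSTMENT_INTERVAL = 2016

def next_block_size_limit(recent_block_sizes, current_limit, headroom=2):
    window = sorted(recent_block_sizes[-ADJUSTMENT_INTERVAL:])
    median = window[len(window) // 2]
    proposed = median * headroom
    return max(current_limit // 2, min(current_limit * 2, proposed))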


Jérémie
Jérémie Dubois-Lacoste
2015-05-07 20:20:03 UTC
Permalink
Post by Mike Hearn
I have written up an explanation of what I think will happen if we run out
Looks like a solid description of what would happen.

I fail to see how this description wouldn't also be applicable to a
20MB network at some time in the future, say ~3 years from now, if
Bitcoin keeps taking off.
If you agree that it will be harder in the future to change the block
limit again, and we switch to a hardcoded 20MB, then aren't we just
going from an immediate relief to a future larger blockage?
Matthew Mitchell
2015-05-07 15:58:13 UTC
Permalink
In my personal opinion, this does make some sense to me, assuming I
understood Gavin.

I suppose it could be done with a new flag (like the P2SH flag) which
signals miner support for larger blocks. The new rules would apply once
a large majority of miners support them, determined by counting the number
of flagged blocks over a certain number of recent blocks on the network in a
deterministic fashion.

This way miners can continue to produce blocks which are supported by
both old and new clients. When it appears most people have migrated to
the new client, miners can start flagging support for the new rules, and
when a large majority of miners agree, the new rules would kick in for
all miners/clients running the new software. Miners could therefore glue
together the network during the migration phase until enough people have
updated to avoid severe fork scenarios. The only problem is ensuring
that miners will continue to support both networks for long enough to
enable successful migration.
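
For instance, the deterministic counting could look something like this toy
Python sketch (the window, threshold and version number are purely
illustrative, not actual proposed values):

# Purely illustrative sketch of the counting rule described above; the window,
# threshold and version number are invented for the example.
WINDOW = 1000            # recent blocks to inspect
THRESHOLD = 950          # flagged blocks required before the new rules apply
BIG_BLOCK_VERSION = 4    # hypothetical block version signalling support

def new_rules_active(recent_block_versions):
    window = recent_block_versions[-WINDOW:]
    supporting = sum(1 for v in window if v >= BIG_BLOCK_VERSION)
    return supporting >= THRESHOLD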

And if too many people disagree to make a clean hard fork (too many
people stubbornly stick to the old rules), then it could be that the
hard fork is aborted and everyone goes back to the old rules, or quite
simply that the miners never give support for the new rules despite the
mechanism being included in the new client. In those cases it would be
as if nothing changed.

This way the hard fork would be determined by user participation as
judged by the miners.

If it is done, I can't think of a fairer way.

Matthew Mitchell
Post by Gavin Andresen
For reference: the blog post that (re)-started this debate, and which
http://gavinandresen.ninja/time-to-roll-out-bigger-blocks
In it, I asked people to email me objections I might have missed. I
would still appreciate it if people do that; it is impossible to keep up
with this mailing list, /r/bitcoin posts and comments, and
#bitcoin-wizards and also have time to respond thoughtfully to the
objections raised.
I would very much like to find some concrete course of action that we
can come to consensus on. Some compromise so we can tell entrepreneurs
"THIS is how much transaction volume the main Bitcoin blockchain will be
able to support over the next eleven years."
I've been pretty clear on what I think is a reasonable compromise (a
one-time increase scheduled for early next year), and I have tried to
explain why I think it is the right set of tradeoffs.
There ARE tradeoffs here, and the hard question is what process do we
use to decide those tradeoffs? How do we come to consensus? Is it worth
my time to spend hours responding thoughtfully to every new objection
raised here, or will the same thing happen that happened last year and
the year before-- everybody eventually gets tired of arguing
angels-dancing-on-the-head-of-a-pin, and we're left with the status quo?
I AM considering contributing some version of the bigger blocksize-limit
hard-fork patch to the Bitcoin-XT fork (probably "target a hobbyist
with a fast Internet connection, and assume Nielsen's law continues to
apply over time"), and then encouraging merchants and exchanges and web wallets
and individuals who think it strikes a reasonable balance to run it.
And then, assuming it became a super-majority of nodes on the network,
encourage miners to roll out a soft-fork to start producing bigger
blocks and eventually trigger the hard fork.
Because ultimately consensus comes down to what software people choose
to run.
--
--
Gavin Andresen
Matthew Mitchell
2015-05-07 16:47:41 UTC
Permalink
One thing to add is that perhaps in a future version of Bitcoin Core,
there could be an option for users to continue using the old consensus
rules, or an option to support the new rules (an option when they update
and an ability to change in the settings). Both types of user can
benefit from the software updates and choose with a single piece of
software what they support. Information about whether or not a user
supports the changes could be included in the version message.
Possibly this information could be incorporated into transactions as well.

If they wish to support the new rules, then their client would support
larger blocks when there is majority miner consensus; otherwise their
client will only ever support the old rules.

This way the decision is not being forced upon the user in any way.

Just an idea.
Post by Matthew Mitchell
In my personal opinion, this does make some sense to me, assuming I
understood Gavin.
I suppose it could be done with a new flag (like the P2SH flag) which
displays miner support for larger blocks. The new rules would apply when
a large majority of miners support the new rules by counting the number
of flagged blocks over a certain number of blocks on the network in a
deterministic fashion.
This way miners can continue to produce blocks which are supported by
both old and new clients. When it appears most people have migrated to
the new client, miners can start flagging support for the new rules, and
when a large majority of miners agree, the new rules would kick in for
all miners/clients running the new software. Miners could therefore glue
together the network during the migration phase until enough people have
updated to avoid severe fork scenarios. The only problem is ensuring
that miners will continue to support both networks for long enough to
enable successful migration.
And if too many people disagree to make a clean hard fork (too many
people stubbornly stick to the old rules), then it could be that the
hard fork is aborted and everyone goes back to the old rules, or quite
simply that the miners never give support for the new rules despite the
mechanism being included in the new client. In those cases it would be
as if nothing changed.
This way the hard fork would be determined by user participation as
judged by the miners.
If it is done, I can't think of a fairer way.
Matthew Mitchell
Post by Gavin Andresen
For reference: the blog post that (re)-started this debate, and which
http://gavinandresen.ninja/time-to-roll-out-bigger-blocks
In it, I asked people to email me objections I might have missed. I
would still appreciate it if people do that; it is impossible to keep up
with this mailing list, /r/bitcoin posts and comments, and
#bitcoin-wizards and also have time to respond thoughtfully to the
objections raised.
I would very much like to find some concrete course of action that we
can come to consensus on. Some compromise so we can tell entrepreneurs
"THIS is how much transaction volume the main Bitcoin blockchain will be
able to support over the next eleven years."
I've been pretty clear on what I think is a reasonable compromise (a
one-time increase scheduled for early next year), and I have tried to
explain why I think it is the right set of tradeoffs.
There ARE tradeoffs here, and the hard question is what process do we
use to decide those tradeoffs? How do we come to consensus? Is it worth
my time to spend hours responding thoughtfully to every new objection
raised here, or will the same thing happen that happened last year and
the year before-- everybody eventually gets tired of arguing
angels-dancing-on-the-head-of-a-pin, and we're left with the status quo?
I AM considering contributing some version of the bigger blocksize-limit
hard-fork patch to the Bitcoin-XT fork (probably "target a hobbyist
with a fast Internet connection, and assume Nielsen's law continues to
apply over time"), and then encouraging merchants and exchanges and web wallets
and individuals who think it strikes a reasonable balance to run it.
And then, assuming it became a super-majority of nodes on the network,
encourage miners to roll out a soft-fork to start producing bigger
blocks and eventually trigger the hard fork.
Because ultimately consensus comes down to what software people choose
to run.
--
--
Gavin Andresen
Matt Corallo
2015-05-07 17:26:10 UTC
Permalink
Post by Gavin Andresen
For reference: the blog post that (re)-started this debate, and which
http://gavinandresen.ninja/time-to-roll-out-bigger-blocks
In it, I asked people to email me objections I might have missed. I
would still appreciate it if people do that; it is impossible to keep up
with this mailing list, /r/bitcoin posts and comments, and
#bitcoin-wizards and also have time to respond thoughtfully to the
objections raised.
People have been sharing the same objections as on this list for months,
I'm not sure what is new here.
Post by Gavin Andresen
I would very much like to find some concrete course of action that we
can come to consensus on. Some compromise so we can tell entrepreneurs
"THIS is how much transaction volume the main Bitcoin blockchain will be
able to support over the next eleven years."
I think this is a huge issue. You've been wandering around telling
people that the blocksize will increase soon for months, when there is
very clearly no consensus that it should in the short-term future. The
only answer to this that anyone with a clue should give is "it will
very, very likely be able to support at least 1MB blocks roughly every
10 minutes on average for the next eleven years, and it seems likely
that a block size increase of some form will happen at some point in the
next eleven years", anything else is dishonest.
Gavin Andresen
2015-05-07 17:40:59 UTC
Permalink
Post by Matt Corallo
I think this is a huge issue. You've been wandering around telling
people that the blocksize will increase soon for months
I think the strongest thing I've ever said is:

"There is consensus that the max block size much change sooner or later.
There is not yet consensus on exactly how or when. I will be pushing to
change it this year."

This is what "I will be pushing to change it this year" looks like.
--
--
Gavin Andresen
Mike Hearn
2015-05-07 17:43:30 UTC
Permalink
The only answer to this that anyone with a clue should give is "it
will very, very likely be able to support at least 1MB blocks roughly
every 10 minutes on average for the next eleven years, and it seems
likely that a block size increase of some form will happen at some point in
the next eleven years", anything else is dishonest.
Matt, you know better than that. Gavin neither lacks clue nor is he
dishonest.

He has been working on the assumption that other developers are reasonable,
and some kind of compromise solution can be found that everyone can live
with. Hence trying to find a middle ground, hence considering and writing
articles in response to every single objection raised. Hence asking for
suggestions on what to change about the plan, to make it more acceptable.
What more do you want, exactly?

And I'll ask again. Do you have a *specific, credible alternative*? Because
so far I'm not seeing one.
Btc Drak
2015-05-07 18:03:55 UTC
Permalink
Post by Mike Hearn
And I'll ask again. Do you have a *specific, credible alternative*?
Because so far I'm not seeing one.
I think you are rubbing against your own presupposition that people must
find an alternative right now. Quite a lot here do not believe there is
any urgency, nor that there is an imminent problem that has to be solved
before the sky falls in.
Mike Hearn
2015-05-07 18:06:09 UTC
Permalink
Post by Btc Drak
I think you are rubbing against your own presupposition that people must
find an alternative right now. Quite a lot here do not believe there is
any urgency, nor that there is an imminent problem that has to be solved
before the sky falls in.
I have explained why I believe there is some urgency, where by "some
urgency" I mean: it takes months to implement, merge, test,
release and for people to upgrade.

But if it makes you happy, imagine that this discussion happens all over
again next year and I ask the same question.
Ross Nicoll
2015-05-07 18:21:47 UTC
Permalink
Can I just add my own support for this - as has been stated elsewhere in
this discussion, hard forks are difficult, and risky. The earlier we
have a decision, and the earlier the change goes into the code, the
easier that is.

Even if the decision was the actual block size change is fine to leave
until 2020, I'd like to see the code committed ASAP so that every new
install, and every upgrade from there on gets the new version.

My personal opinion only is that 7 transactions a second is insanely
limited even if the main chain does nothing but act as a backbone
between other chains and transaction networks. I don't think that's
overly controversial. I think 2016 is too early for a 20mb block size,
though. I'm inclined to suggest a schedule of expansion, say to 2mb in
2016, 4mb in 2018, 8mb in 2020 and 20mb in 2022 where it stops. The
intent would be to provide enough size pressure to motivate scaling
work, while not limiting Bitcoin overly.

Further, I think this highlights that we need more work on fees. Right
now fee handling and transaction selection are fairly naive, but I'd like
to see the absolute block size limit as a hard upper bound, with miners
imposing soft limits based on balancing the cost of storage, the number of
outputs vs inputs (and therefore the impact on the UTXO set), and the risk
of orphaned blocks, to determine which transactions are actually worth
including in each block. If anyone has numbers on block size vs orphan
rate, that would be really useful, BTW.

Ross
Post by Btc Drak
I think you are rubbing against your own presupposition that
people must find an alternative right now. Quite a lot here do
not believe there is any urgency, nor that there is an imminent
problem that has to be solved before the sky falls in.
I have explained why I believe there is some urgency, where by "some
urgency" I mean: it takes months to implement, merge, test,
release and for people to upgrade.
But if it makes you happy, imagine that this discussion happens all
over again next year and I ask the same question.
Gavin Costin
2015-05-07 18:40:50 UTC
Permalink
Can anyone opposed to this proposal articulate in plain English the worst
case scenario(s) if it goes ahead?

Some people in the conversation appear to be uncomfortable, perturbed,
defensive etc. about the proposal... But I am not seeing specifics on why it
is not a feasible plan.

From: Mike Hearn <***@plan99.net>
Date: Friday, 8 May, 2015 2:06 am
To: Btc Drak <***@gmail.com>
Cc: Bitcoin Dev <bitcoin-***@lists.sourceforge.net>
Subject: Re: [Bitcoin-development] Block Size Increase
I think you are rubbing against your own presupposition that people must find
an alternative right now. Quite a lot here do not believe there is any
urgency, nor that there is an imminent problem that has to be solved before
the sky falls in.
I have explained why I believe there is some urgency, where by "some urgency"
I mean: it takes months to implement, merge, test, release and for
people to upgrade.

But if it makes you happy, imagine that this discussion happens all over
again next year and I ask the same question.

Btc Drak
2015-05-07 18:46:14 UTC
Permalink
Post by Gavin Costin
Can anyone opposed to this proposal articulate in plain English the worst
case scenario(s) if it goes ahead?
Some people in the conversation appear to be uncomfortable, perturbed,
defensive etc. about the proposal... But I am not seeing specifics on why it
is not a feasible plan.
See this response:
http://www.mail-archive.com/bitcoin-***@lists.sourceforge.net/msg07462.html
Bernard Rihn
2015-05-07 19:31:53 UTC
Permalink
It seems to me like some (maybe most) of the pressure is actually external
from companies that might release something that dramatically increases
"adoption" & transaction rates (and that the data on historic rate of
adoption & slumps is somewhat disconnected from their interests in a quick
roll-out)?

It seems like the question actually becomes what is our maximum acceptable
cost (hardware capex & bandwidth & power opex) associated with running a
full node without hardware acceleration and with hardware acceleration
(something which presumably "doesn't exist" yet)? Are we making the
assumption that hardware acceleration for confirmation will become broadly
available and that the primary limiter will become anonymous bandwidth?

Excuse my ignorance, but I imagine somebody must have already looked at
confirmation times vs. block size for various existing hardware platforms
(at least 3 or 4, say: a MinnowBoard, an old laptop, and a modern
desktop?). Is there an easy way to set up bitcoind or some other script to
test this? (happy to help)

Re Moore's law: yeah, some say stuff like 5nm may never happen. We're
already using EUV with plasma emitters, immersed reflective optics, and
double-patterning... and in storage land switching to helium. Things may
slow A LOT over the next couple decades and I'd guess that a quadratic
increase (both in storage & compute) probably isn't a safe assumption.
Post by Matt Corallo
Post by Gavin Costin
Can anyone opposed to this proposal articulate in plain English the worst
case scenario(s) if it goes ahead?
Some people in the conversation appear to be uncomfortable, perturbed,
defensive etc. about the proposal... But I am not seeing specifics on why it
is not a feasible plan.
Alan Reiner
2015-05-07 19:31:46 UTC
Permalink
This *is* urgent and needs to be handled right now, and I believe Gavin
has the best approach to this. I have heard Gavin's talks on increasing
the block size, and the two most persuasive points to me were:

(1) Blocks are essentially nearing "full" now. And by "full" he means
that the reliability of the network (from the average user perspective)
is about to be impacted in a very negative way (I believe it was due to
the inconsistent time between blocks). I think Gavin said that his
simulations showed 400 kB - 600 kB worth of transactions per 10 min
(approx 3-4 tps) is where things start to behave poorly for certain
classes of transactions. In other words, we're very close to the
effective limit in terms of maintaining the current "standard of
living", and with a year needed to raise the block size limit this
actually is urgent.
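(For scale: assuming an average transaction of roughly 250 bytes, which is
only an assumption, 500 kB per 10-minute block is 500,000 / 250 / 600, or
about 3.3 tps, consistent with the 3-4 tps figure.)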

(2) Leveraging fee pressure at 1MB to solve the problem is actually
really a bad idea. It's really bad while Bitcoin is still growing, and
relying on fee pressure at 1 MB severely impacts attractiveness and
adoption potential of Bitcoin (due to high fees and unreliability). But
more importantly, it ignores the fact that 7 tps is pathetic for a
global transaction system. It is a couple of orders of magnitude too low
for any meaningful commercial activity to occur. If we continue with a
cap of 7 tps forever, Bitcoin *will* fail. Or at best, it will fail to
be useful for the vast majority of the world (which probably leads to
failure). We shouldn't be talking about fee pressure until we hit 700
tps, which is probably still too low.

You can argue that side chains and payment channels could alleviate
this. But how far off are they? We're going to hit effective 1MB
limits long before we can leverage those in a meaningful way. Even if
everyone used them, getting a billion people onto the system just can't
happen even at 1 transaction per year per person to get into a payment
channel or move money between side chains.

We get asked all the time by corporate clients about scalability. A
limit of 7 tps makes them uncomfortable that they are going to invest
all this time into a system that has no chance of handling the economic
activity that they expect it to handle. We always assure them that 7 tps
is not the final answer.

Satoshi didn't believe 1 MB blocks were the correct answer. I
personally think this is critical to Bitcoin's long term future. And
I'm not sure what else Gavin could've done to push this along in a
meaningful way.

-Alan
Post by Btc Drak
I think you are rubbing against your own presupposition that
people must find an alternative right now. Quite a lot here do
not believe there is any urgency, nor that there is an imminent
problem that has to be solved before the sky falls in.
I have explained why I believe there is some urgency, where by "some
urgency" I mean: it takes months to implement, merge, test,
release and for people to upgrade.
But if it makes you happy, imagine that this discussion happens all
over again next year and I ask the same question.
Jeff Garzik
2015-05-07 19:54:13 UTC
Permalink
Post by Alan Reiner
(1) Blocks are essentially nearing "full" now. And by "full" he means
that the reliability of the network (from the average user perspective) is
about to be impacted in a very negative way
Er, to be economically precise, "full" just means fees are no longer zero.
Bitcoin behaves as it always has. It will no longer be basically free to dump
spam into the blockchain, as it is today.

In the short term, blocks are bursty, with some on 1 minute intervals, some
with 60 minute intervals. This does not change with larger blocks.
Post by Alan Reiner
(2) Leveraging fee pressure at 1MB to solve the problem is actually really
a bad idea. It's really bad while Bitcoin is still growing, and relying on
fee pressure at 1 MB severely impacts attractiveness and adoption potential
of Bitcoin (due to high fees and unreliability). But more importantly, it
ignores the fact that 7 tps is pathetic for a global transaction
system. It is a couple orders of magnitude too low for any meaningful
commercial activity to occur. If we continue with a cap of 7 tps forever,
Bitcoin *will* fail. Or at best, it will fail to be useful for the vast
majority of the world (which probably leads to failure). We shouldn't be
talking about fee pressure until we hit 700 tps, which is probably still
too low.
[...]

1) Agree that 7 tps is too low

2) Where do you want to go? Should bitcoin scale up to handle all the
world's coffees?

This is hugely unrealistic. 700 tps is 100MB blocks, 14.4 GB/day -- just
for a single feed. If you include relaying to multiple nodes, plus serving
500 million SPV clients en grosse, who has the capacity to run such a
node? By the time we get to fee pressure, in your scenario, our network
node count is tiny and highly centralized.
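(Working those numbers through under the common assumption of roughly 250
bytes per transaction: 700 tps x 600 s is 420,000 transactions per block,
or about 105 MB; 100 MB every 10 minutes is 600 MB per hour, i.e. about
14.4 GB per day for a single feed. The per-transaction size is an
assumption, not a measured figure.)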

3) In RE "fee pressure" -- Do you see the moral hazard to a software-run
system? It is an intentional, human decision to flood the market with
supply, thereby altering the economics, forcing fees to remain low in the
hopes of achieving adoption. I'm pro-bitcoin and obviously want to see
bitcoin adoption - but I don't want to sacrifice every decentralized
principle and become a central banker in order to get there.
--
Jeff Garzik
Bitcoin core developer and open source evangelist
BitPay, Inc. https://bitpay.com/
Justus Ranvier
2015-05-07 19:59:18 UTC
Permalink
Post by Jeff Garzik
By the time we get to fee pressure, in your scenario, our network
node count is tiny and highly centralized.
Again, this assertion requires proof.

Simply saying things is not the same as them being true.
Tom Harding
2015-05-08 01:40:32 UTC
Permalink
Post by Jeff Garzik
In the short term, blocks are bursty, with some on 1 minute intervals,
some with 60 minute intervals. This does not change with larger blocks.
I'm pretty sure Alan meant that blocks are already filling up after long
inter-block intervals.
Post by Jeff Garzik
2) Where do you want to go? Should bitcoin scale up to handle all the
world's coffees?
Alan was very clear. Right now, he wants to go exactly where Gavin's
concrete proposal suggests.
Jeff Garzik
2015-05-08 02:09:42 UTC
Permalink
Post by Tom Harding
Post by Jeff Garzik
2) Where do you want to go? Should bitcoin scale up to handle all the
world's coffees?
Alan was very clear. Right now, he wants to go exactly where Gavin's
concrete proposal suggests.
G proposed 20MB blocks, AFAIK - 140 tps
A proposed 100MB blocks - 700 tps
For ref,
Paypal is around 115 tps
VISA is around 2000 tps (perhaps 4000 tps peak)

I ask again: where do we want to go? This is the existential question
behind block size.

Are we trying to build a system that can handle Paypal volumes? VISA
volumes?

It's not a snarky or sarcastic question: Are we building a system to
handle all the world's coffees? Is bitcoin's main chain and network -
Layer 1 - going to receive direct connections from 500m mobile phones,
broadcasting transactions?

We must answer these questions to inform the change being discussed today,
in order to decide what makes the most sense as a new limit. Any
responsible project of this magnitude must have a better story than "zomg
1MB, therefore I picked 20MB out of a hat". It must be able to answer /why/
the new limit was picked.

As G notes, changing the block size is simply kicking the can down the
road: http://gavinandresen.ninja/it-must-be-done-but-is-not-a-panacea
Necessarily one must ask, today, what happens when we get to the end of
that newly paved road.
--
Jeff Garzik
Bitcoin core developer and open source evangelist
BitPay, Inc. https://bitpay.com/
Tom Harding
2015-05-08 05:13:08 UTC
Permalink
Post by Jeff Garzik
G proposed 20MB blocks, AFAIK - 140 tps
A proposed 100MB blocks - 700 tps
For ref,
Paypal is around 115 tps
VISA is around 2000 tps (perhaps 4000 tps peak)
I ask again: where do we want to go? This is the existential
question behind block size.
Are we trying to build a system that can handle Paypal volumes? VISA
volumes?
It's not a snarky or sarcastic question: Are we building a system to
handle all the world's coffees? Is bitcoin's main chain and network -
Layer 1 - going to receive direct connections from 500m mobile phones,
broadcasting transactions?
We must answer these questions to inform the change being discussed
today, in order to decide what makes the most sense as a new limit.
Any responsible project of this magnitude must have a better story
than "zomg 1MB, therefore I picked 20MB out of a hat" Must be able to
answer /why/ the new limit was picked.
As G notes, changing the block size is simply kicking the can down the
http://gavinandresen.ninja/it-must-be-done-but-is-not-a-panacea
Necessarily one must ask, today, what happens when we get to the end
of that newly paved road.
Accepting that outcomes are less knowable further into the future is not
the same as failing to consider the future at all. A responsible
project can't have a movie-plot roadmap. It needs to give weight to
multiple possible future outcomes.
http://en.wikipedia.org/wiki/Decision_tree

One way or another, the challenge is to decide what to do next. Beyond
that, it's future decisions all the way down.

Alan argues that 7 tps is a couple orders of magnitude too low for any
meaningful commercial activity to occur, and too low to be the final
solution, even with higher layers. I agree. I also agree with you,
that we don't really know how to accomplish 700tps right now.

What we do know is that if we want to bump the limit in the short term, we
ought to start now; until there's a better alternative root to the
decision tree, it just might be time to get moving.
Mike Hearn
2015-05-08 09:43:42 UTC
Permalink
Post by Tom Harding
Alan argues that 7 tps is a couple orders of magnitude too low
By the way, just to clear this up - the real limit at the moment is more
like 3 tps, not 7.

The 7 transactions/second figure comes from calculations I did years ago,
in 2011. I did them a few months before the "sendmany" command was
released, so back then almost all transactions were small. After sendmany
and as people developed custom wallets, etc, the average transaction size
went up.
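(Roughly: 1,000,000 bytes per 600 seconds is about 1,667 bytes/s. At an
assumed ~250 bytes per transaction that is ~6.7 tps, the source of the old
~7 figure; at an assumed ~500-byte average it drops to ~3.3 tps. Both
averages are illustrative assumptions.)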
Alan Reiner
2015-05-08 14:59:34 UTC
Permalink
This isn't about "everyone's coffee". This is about an absolute minimum
amount of participation by people who wish to use the network. If our
goal is really for bitcoin to really be a global, open transaction
network that makes money fluid, then 7tps is already a failure. If even
5% of the world (350M people) was using the network for 1 tx per month
(perhaps to open payment channels, or shift money between side chains),
we'll be above 100 tps. And that doesn't include all the
non-individuals (organizations) that want to use it.
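(As a rough check: 350 million transactions per month spread over roughly
2,592,000 seconds is about 135 tps, consistent with the "above 100 tps"
figure.)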

The goals of "a global transaction network" and "everyone must be able
to run a full node with their $200 dell laptop" are not compatible. We
need to accept that a global transaction system cannot be
fully/constantly audited by everyone and their mother. The important
feature of the network is that it is open and anyone *can* get the
history and verify it. But not everyone is required to. Trying to
promote a system where the history can be forever handled by a low-end
PC is already falling out of reach, even with our minuscule 7 tps.
Clinging to that goal needlessly limits the network's ability to scale
into a useful global payments system.
Post by Alan Reiner
(2) Leveraging fee pressure at 1MB to solve the problem is
actually really a bad idea. It's really bad while Bitcoin is
still growing, and relying on fee pressure at 1 MB severely
impacts attractiveness and adoption potential of Bitcoin (due to
high fees and unreliability). But more importantly, it ignores
the fact that 7 tps is pathetic for a global transaction
system. It is a couple orders of magnitude too low for any
meaningful commercial activity to occur. If we continue with a
cap of 7 tps forever, Bitcoin *will* fail. Or at best, it will
fail to be useful for the vast majority of the world (which
probably leads to failure). We shouldn't be talking about fee
pressure until we hit 700 tps, which is probably still too low.
[...]
1) Agree that 7 tps is too low
2) Where do you want to go? Should bitcoin scale up to handle all the
world's coffees?
This is hugely unrealistic. 700 tps is 100MB blocks, 14.4 GB/day --
just for a single feed. If you include relaying to multiple nodes,
plus serving 500 million SPV clients en grosse, who has the capacity
to run such a node? By the time we get to fee pressure, in your
scenario, our network node count is tiny and highly centralized.
3) In RE "fee pressure" -- Do you see the moral hazard to a
software-run system? It is an intentional, human decision to flood
the market with supply, thereby altering the economics, forcing fees
to remain low in the hopes of achieving adoption. I'm pro-bitcoin and
obviously want to see bitcoin adoption - but I don't want to sacrifice
every decentralized principle and become a central banker in order to
get there.
Joel Joonatan Kaartinen
2015-05-08 01:51:35 UTC
Permalink
Having observed the customer support nightmare it tends to cause for a
small exchange service when 100% full blocks happen, I've been thinking
that the limit really should be dynamic and respond to demand and the
amount of fees offered. It just doesn't feel right when it takes ages to
burn through the backlog when 100% full is hit for a while. So, while
pondering this, I got an idea that I think has a chance of working that I
can't remember seeing suggested anywhere.

How about basing the maximum valid size for a block on the total bitcoin
days destroyed in that block? That should still stop transaction spam but
naturally expand the block size when there's a backlog of real
transactions. It'd also provide for an indirect mechanism for increasing
the maximum block size based on fees if there are a lot of fees but few
bitcoin days destroyed. In such a situation there'd be an incentive to pay
someone to spend an older txout to expand the maximum. I realize this is a
rather half-baked idea, but it seems worth considering.
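Purely as an illustration of the shape such a rule could take (every name
and constant below is made up, and the bitcoin-days-destroyed figure would
have to be summed over the block's inputs as value times age):

    #include <cstdint>

    // Sketch only: the more bitcoin days a block's transactions destroy,
    // the larger the block may be, up to a hard ceiling.
    static const uint64_t BASE_MAX_SIZE         = 1000000;      // 1 MB floor
    static const uint64_t HARD_CAP              = 20 * 1000000; // absolute ceiling
    static const uint64_t BYTES_PER_BITCOIN_DAY = 100;          // arbitrary scale factor

    uint64_t MaxBlockSizeFromBDD(uint64_t bitcoinDaysDestroyed)
    {
        uint64_t limit = BASE_MAX_SIZE + bitcoinDaysDestroyed * BYTES_PER_BITCOIN_DAY;
        return limit > HARD_CAP ? HARD_CAP : limit;
    }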

- Joel
Post by Alan Reiner
This *is* urgent and needs to be handled right now, and I believe Gavin
has the best approach to this. I have heard Gavin's talks on increasing
(1) Blocks are essentially nearing "full" now. And by "full" he means
that the reliability of the network (from the average user perspective) is
about to be impacted in a very negative way (I believe it was due to the
inconsistent time between blocks). I think Gavin said that his simulations
showed 400 kB - 600 kB worth of transactions per 10 min (approx 3-4 tps) is
where things start to behave poorly for certain classes of transactions.
In other words, we're very close to the effective limit in terms of
maintaining the current "standard of living", and with a year needed to
raise the block size limit this actually is urgent.
(2) Leveraging fee pressure at 1MB to solve the problem is actually really
a bad idea. It's really bad while Bitcoin is still growing, and relying on
fee pressure at 1 MB severely impacts attractiveness and adoption potential
of Bitcoin (due to high fees and unreliability). But more importantly, it
ignores the fact that 7 tps is pathetic for a global transaction
system. It is a couple orders of magnitude too low for any meaningful
commercial activity to occur. If we continue with a cap of 7 tps forever,
Bitcoin *will* fail. Or at best, it will fail to be useful for the vast
majority of the world (which probably leads to failure). We shouldn't be
talking about fee pressure until we hit 700 tps, which is probably still
too low.
You can argue that side chains and payment channels could alleviate this.
But how far off are they? We're going to hit effective 1MB limits long
before we can leverage those in a meaningful way. Even if everyone used
them, getting a billion people onto the system just can't happen even at 1
transaction per year per person to get into a payment channel or move money
between side chains.
We get asked all the time by corporate clients about scalability. A limit
of 7 tps makes them uncomfortable that they are going to invest all this
time into a system that has no chance of handling the economic activity
that they expect it to handle. We always assure them that 7 tps is not the
final answer.
Satoshi didn't believe 1 MB blocks were the correct answer. I personally
think this is critical to Bitcoin's long term future. And I'm not sure
what else Gavin could've done to push this along in a meaningful way.
-Alan
I think you are rubbing against your own presupposition that people
must find an alternative right now. Quite a lot here do not believe there
is any urgency, nor that there is an imminent problem that has to be solved
before the sky falls in.
I have explained why I believe there is some urgency, where by "some
urgency" I mean: it takes months to implement, merge, test,
release and for people to upgrade.
But if it makes you happy, imagine that this discussion happens all over
again next year and I ask the same question.
Peter Todd
2015-05-08 03:41:21 UTC
Permalink
Post by Alan Reiner
We get asked all the time by corporate clients about scalability. A
limit of 7 tps makes them uncomfortable that they are going to invest
all this time into a system that has no chance of handling the economic
activity that they expect it to handle. We always assure them that 7 tps
is not the final answer.
Your corporate clients, *why* do they want to use Bitcoin and what for
exactly?
--
'peter'[:-1]@petertodd.org
0000000000000000054c9d9ae1099ef8bc0bc9b76fef5e03f7edaff66fd817d8
Chris Wardell
2015-05-07 18:38:10 UTC
Permalink
Instead of raising the block size to another static number like 20MB, can
we raise it dynamically?

Make the max block size something like:
pow(2, nHeight/100000) * 1MB; //double every ~2 years
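Read literally (taking 1 MB as 1,000,000 bytes, which is an assumption
about the intent), that could be written as:

    #include <cstdint>

    // Sketch: the limit doubles every 100,000 blocks (~2 years at ~144 blocks/day).
    uint64_t MaxBlockSize(int nHeight)
    {
        return 1000000ULL << (nHeight / 100000);
    }

Note that with no offset the exponent is taken from the absolute height, so
at today's height of roughly 355,000 it would already yield 8 MB; presumably
the division would be anchored to an activation height instead.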
Alex Mizrahi
2015-05-07 18:55:32 UTC
Permalink
Just to add to the noise, did you consider linear growth?

Unlike exponential growth, it approximates diminishing returns (i.e. tech
advances become slower with time). And unlike a single step, it will give
people time to adapt to new realities.

E.g. 2 MB in 2016, 3 MB in 2017 and so on.
So in 20 years we'll get to 20 MB, which "ought to be enough for anybody".
But if miners find 20 MB blocks too overwhelming, they can limit it
through a soft fork, based on actual data.
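For concreteness, that schedule is roughly max_size(year) = (year - 2014) MB,
i.e. about 1 MB added every 52,560 blocks, reaching 20 MB around 2034.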
Ross Nicoll
2015-05-07 18:59:41 UTC
Permalink
I'm presuming that schedule is just an example, as you'd end up with
insanely large block sizes in a few years.

Absolutely, yes, an increase schedule is an option if people agree on
it, and I think the better option, as the current limit is too low; but
jumping straight to a value big enough for "indefinitely" is a huge jump.

I gave some thought to scaling block size based on transaction fees, but
I suspect it would end up with miners sending huge fees to themselves in
transactions that aren't relayed (so they only take effect if they make
it into a block that miner mines) in order to make the network allow bigger blocks.

Ross
Post by Chris Wardell
Instead of raising the block size to another static number like 20MB,
can we raise it dynamically?
pow(2, nHeight/100000) * 1MB; //double every ~2 years
Matt Corallo
2015-05-07 19:03:52 UTC
Permalink
Replies inline.
The only answer to this that anyone with a clue should give is "it
will very, very likely be able to support at least 1MB blocks
roughly every 10 minutes on average for the next eleven years, and
it seems likely that a block size increase of some form will happen
at some point in the next eleven years", anything else is dishonest.
Matt, you know better than that. Gavin neither lacks clue nor is he
dishonest.
No, I don't think Gavin is being deliberately dishonest, and I'm rather
confident he phrased everything in a way that is technically true (i.e.
the quote in his response). However, others have definitely not taken
away the correct interpretation of what he said, and this is a serious
problem. Setting expectations correctly is important, as this is a very
contentious issue and one that does not appear to be reaching consensus
quickly in the technical community.
More generally, consider the situation we're in now. Gavin is going off
pitching this idea to the general public (which, I agree, is an
important step in pulling off a hardfork) while people who actually
study the issues are left wondering why they're being ignored (ie why is
there no consensus-building happening on this list?).
He has been working on the assumption that other developers are
reasonable, and some kind of compromise solution can be found that
everyone can live with. Hence trying to find a middle ground, hence
considering and writing articles in response to every single objection
raised. Hence asking for suggestions on what to change about the plan,
to make it more acceptable. What more do you want, exactly?
The appropriate method of doing any fork, that we seem to have been
following for a long time, is to get consensus here and on IRC and on
github and *then* go pitch to the general public (either directly or by
releasing software) that they should upgrade. I admit that hardforks are
a bit different in that the level of software upgrade required means
additional lead time, but I'm not sure that means starting the
public-pitching phase before there is any kind of consensus forming
(actually, I'd point out that to me there seems to be rather clear
consensus outside of you and Gavin that we should delay increasing block
size).
As far as I can tell, there has been no discussion of block sizes on
this list since 2013, and while I know Gavin has had many private
conversations with people in this community about the block size, very
little if any of it has happened in public.
If, instead, there had been an intro on the list as "I think we should
do the blocksize increase soon, what do people think?", the response
could likely have focused much more around creating a specific list of
things we should do before we (the technical community) think we are
prepared for a blocksize increase.
And I'll ask again. Do you have a *specific, credible alternative*?
Because so far I'm not seeing one.
A specific credible alternative to what? Committing to blocksize
increases tomorrow? Yes, doing more research into this and developing
software around supporting larger block sizes so people feel comfortable
doing it in six months. I acknowledge that Gavin has been putting a lot
of effort into this front, but, judging by this thread, I am far from
the only one who thinks much more needs to be done.
Jeff Garzik
2015-05-07 19:13:23 UTC
Permalink
Post by Matt Corallo
More generally, consider the situation we're in now. Gavin is going off
pitching this idea to the general public (which, I agree, is an
important step in pulling off a hardfork) while people who actually
study the issues are left wondering why they're being ignored (ie why is
there no consensus-building happening on this list?).
This sub-thread threatens to veer off into he-said-she-said.
Post by Matt Corallo
If, instead, there had been an intro on the list as "I think we should
do the blocksize increase soon, what do people think?", the response
could likely have focused much more around creating a specific list of
things we should do before we (the technical community) think we are
prepared for a blocksize increase.
Agreed, but that is water under the bridge at this point. You - rightly -
opened the topic here and now we're discussing it.

Mike and Gavin are due the benefit of the doubt because making a change to a
leaderless automaton powered by leaderless open source software is breaking
new ground. I don't focus so much on how we got to this point, but rather,
where we go from here.
--
Jeff Garzik
Bitcoin core developer and open source evangelist
BitPay, Inc. https://bitpay.com/
Mike Hearn
2015-05-07 19:34:02 UTC
Permalink
Post by Matt Corallo
The appropriate method of doing any fork, that we seem to have been
following for a long time, is to get consensus here and on IRC and on
github and *then* go pitch to the general public
So your concern is just about the ordering and process of things, and not
about the change itself?

I have witnessed many arguments in IRC about block sizes over the years.
There was another one just a few weeks ago. Pieter left the channel for his
own sanity. IRC is not a good medium for arriving at decisions on things -
many people can't afford to sit on IRC all day and conversations can be
hard to follow. Additionally, they tend to go circular.

That said, I don't know if you can draw a line between the "ins" and "outs"
like that. The general public is watching, commenting and deciding no
matter what. Might as well deal with that and debate in a format more
accessible to all.
Post by Matt Corallo
If, instead, there had been an intro on the list as "I think we should
do the blocksize increase soon, what do people think?"
There have been many such discussions over time. On bitcointalk. On reddit.
On IRC. At developer conferences. Gavin already knew what many of the
objections would be, which is why he started answering them.

But alright. Let's say he should have started a thread. Thanks for starting
it for him.

Now, can we get this specific list of things we should do before we're
prepared?
Post by Matt Corallo
A specific credible alternative to what? Committing to blocksize
increases tomorrow? Yes, doing more research into this and developing
software around supporting larger block sizes so people feel comfortable
doing it in six months.
Do you have a specific research suggestion? Gavin has run simulations
across the internet with modified full nodes that use 20mb blocks, using
real data from the block chain. They seem to suggest it works OK.

What software do you have in mind?
Matt Corallo
2015-05-07 21:29:01 UTC
Permalink
Post by Matt Corallo
The appropriate method of doing any fork, that we seem to have been
following for a long time, is to get consensus here and on IRC and on
github and *then* go pitch to the general public
So your concern is just about the ordering and process of things, and
not about the change itself?
No, I'm very concerned about both.
Post by Matt Corallo
I have witnessed many arguments in IRC about block sizes over the years.
There was another one just a few weeks ago. Pieter left the channel for
his own sanity. IRC is not a good medium for arriving at decisions on
things - many people can't afford to sit on IRC all day and
conversations can be hard to follow. Additionally, they tend to go circular.
I agree, that's why this mailing list was created in the first place
(well, also because bitcointalk is too full of spam, but close enough :))
Post by Matt Corallo
That said, I don't know if you can draw a line between the "ins" and
"outs" like that. The general public is watching, commenting and
deciding no matter what. Might as well deal with that and debate in a
format more accessible to all.
It's true, just as it's true that the general public can opt to run any
version of software they want. That said, the greater software
development community has to update /all/ the software across the entire
ecosystem, and thus provide what amounts to a strong recommendation of
which course to take. Additionally, though there are issues (e.g. if there
were a push to remove the total coin limit) which are purely political,
and thus which should be up to the greater public to decide, the
blocksize increase is not that. It is intricately tied to Bitcoin's
delicate incentive structure, which many of the development community
are far more familiar with than the general Bitcoin public. If there
were a listserv comprised primarily of people on
#bitcoin-wizards, I might have suggested a discussion there first, but
there isn't (as far as I know?).
Post by Matt Corallo
If, instead, there had been an intro on the list as "I think we should
do the blocksize increase soon, what do people think?"
There have been many such discussions over time. On bitcointalk. On
reddit. On IRC. At developer conferences. Gavin already knew what many
of the objections would be, which is why he started answering them.
But alright. Let's say he should have started a thread. Thanks for
starting it for him.
Now, can we get this specific list of things we should do before we're
prepared?
Yes....I'm gonna split the topic since this is already far off course
for that :).
Post by Matt Corallo
A specific credible alternative to what? Committing to blocksize
increases tomorrow? Yes, doing more research into this and developing
software around supporting larger block sizes so people feel comfortable
doing it in six months.
Do you have a specific research suggestion? Gavin has run simulations
across the internet with modified full nodes that use 20mb blocks, using
real data from the block chain. They seem to suggest it works OK.
What software do you have in mind?
Let me answer that in a new thread :).
21E14
2015-05-07 23:05:29 UTC
Permalink
I am more fazed by PR 5288 and PR 5925 not getting merged in, than by this
thread. So, casting my ballot in favor of the block size increase. Clearly,
we're still rehearsing proper discourse, and that ain't gonna get fixed
here and now.
Post by Matt Corallo
Post by Matt Corallo
The appropriate method of doing any fork, that we seem to have been
following for a long time, is to get consensus here and on IRC and on
github and *then* go pitch to the general public
So your concern is just about the ordering and process of things, and
not about the change itself?
No, I'm very concerned about both.
Post by Matt Corallo
I have witnessed many arguments in IRC about block sizes over the years.
There was another one just a few weeks ago. Pieter left the channel for
his own sanity. IRC is not a good medium for arriving at decisions on
things - many people can't afford to sit on IRC all day and
conversations can be hard to follow. Additionally, they tend to go
circular.
I agree, that's why this mailing list was created in the first place
(well, also because bitcointalk is too full of spam, but close enough :))
Post by Matt Corallo
That said, I don't know if you can draw a line between the "ins" and
"outs" like that. The general public is watching, commenting and
deciding no matter what. Might as well deal with that and debate in a
format more accessible to all.
It's true, just as it's true that the general public can opt to run any
version of software they want. That said, the greater software
development community has to update /all/ the software across the entire
ecosystem, and thus provide what amounts to a strong recommendation of
which course to take. Additionally, though there are issues (e.g. if there
were a push to remove the total coin limit) which are purely political,
and thus which should be up to the greater public to decide, the
blocksize increase is not that. It is intricately tied to Bitcoin's
delicate incentive structure, which many of the development community
are far more familiar with than the general Bitcoin public. If there
were a listserv comprised primarily of people on
#bitcoin-wizards, I might have suggested a discussion there first, but
there isn't (as far as I know?).
Post by Matt Corallo
If, instead, there had been an intro on the list as "I think we
should
Post by Matt Corallo
do the blocksize increase soon, what do people think?"
There have been many such discussions over time. On bitcointalk. On
reddit. On IRC. At developer conferences. Gavin already knew what many
of the objections would be, which is why he started answering them.
But alright. Let's say he should have started a thread. Thanks for
starting it for him.
Now, can we get this specific list of things we should do before we're
prepared?
Yes....I'm gonna split the topic since this is already far off course
for that :).
Post by Matt Corallo
A specific credible alternative to what? Committing to blocksize
increases tomorrow? Yes, doing more research into this and developing
software around supporting larger block sizes so people feel
comfortable
Post by Matt Corallo
doing it in six months.
Do you have a specific research suggestion? Gavin has run simulations
across the internet with modified full nodes that use 20mb blocks, using
real data from the block chain. They seem to suggest it works OK.
What software do you have in mind?
Let me answer that in a new thread :).
Jorge Timón
2015-05-07 15:33:54 UTC
Permalink
Post by Mike Hearn
Post by Jorge Timón
If his explanation was "I will change my mind after we increase block
size", I guess the community should say "then we will just ignore your
nack because it makes no sense".
Oh good! We can just kick anyone out of the consensus process if we think
they make no sense.
I guess that means me and Gavin can remove everyone else from the developer
consensus, because we think trying to stop Bitcoin growing makes no sense.
Do you see the problem with this whole notion? It cannot possibly work.
Whenever you try and make the idea of developer consensus work, what you end
up with is "I believe in consensus as long as it goes my way". Which is
worthless.
That is not what I said: you restated my position and then demonstrated
that the restatement was absurd. That's called a straw man argument, and
it's a well-known fallacy; it is precisely the kind of argument that can
safely be ignored.
It is an argument against my admittedly vague definition of
"non-controversial change".
More importantly, I never said anything about "removing anyone"; I was
always talking about arguments, not people.
One person could use fallacious arguments to attack or defend a given
proposal and perfectly valid ones in another; a person can even
mix valid and invalid arguments in the same mail.
Post by Mike Hearn
Post by Jorge Timón
One thing is the Bitcoin core project where you could argue that the 5
committers decide (I don't know why Wladimir would have any more
authority than the others).
Because he is formally the maintainer.
Yes, the maintainer of the Bitcoin Core free software project (I
cannot stress this enough: it can be forked by anyone), not the
president of Bitcoin the p2p network.
Post by Mike Hearn
Maybe you dislike that idea. It's so .... centralised. So let's say Gavin
commits his patch, because his authority is equal to all other committers.
Someone else rolls it back. Gavin sets up a cron job to keep committing the
patch. Game over.
You cannot have committers fighting over what goes in and what doesn't.
That's madness. There must be a single decision maker for any given
codebase.
I'm sure that if they became that stupid, developers would move to a
fork of the project in no time.
Post by Mike Hearn
Post by Jorge Timón
Ok, so in simple terms, you expect people to have to pay enormous fees
and/or wait thousands of blocks for their transactions to get included
in the chain. Is that correct?
No. I'll write an article like the others, it's better than email for more
complicated discourse.
Ok, thanks in advance.
Post by Mike Hearn
Post by Jorge Timón
As others have said, if the answer is "forever, adoption is always the
most important thing" then we will end up with an improved version of Visa.
This appears to be another one of those fundamental areas of disagreement. I
believe there is no chance of Bitcoin ending up like Visa, even if it is
wildly successful. I did the calculations years ago that show that won't
https://en.bitcoin.it/wiki/Scalability
Decentralisation is a spectrum and Bitcoin will move around on that spectrum
over time. But claiming we have to pick between 1mb blocks and "Bitcoin =
VISA" is silly.
Again, I didn't say any of that. My point is that a network that
becomes too "centralized" (like visa, that is centralized vs p2p, not
vs distributed) doesn't offer any security or decentralization
advantage over current networks (and of course I meant that could
happen with larger blocks, not 1 MB blocks).
I'm sure that's not what the proponents of the size increase want, and
I'm not defending 1 MB as a sacred limit or anything, but my question
is "where is the limit for them?"
Even a limitless block size would technically work, because miners
would cap block sizes themselves to keep their orphan rate down. So "no
hardcoded consensus limit on transaction volume/block size" could be a
valid answer to the question "what is the right consensus limit on block
size?", for which there's no real right answer because there is a
tradeoff between transaction volume and centralization.
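
To make that orphan-rate reasoning a bit more concrete, here is a rough
back-of-envelope sketch (the delay-per-kilobyte figure is a pure
placeholder, not a measurement):

# Illustrative only: the fee a rational miner needs per extra kilobyte
# if orphan risk were the only constraint. All numbers are assumptions.

BLOCK_INTERVAL_S = 600.0   # average time between blocks
BLOCK_REWARD_BTC = 25.0    # current subsidy
DELAY_PER_KB_S = 0.01      # assumed extra propagation delay per kB (placeholder)

def min_fee_per_kb(delay_per_kb_s=DELAY_PER_KB_S,
                   reward=BLOCK_REWARD_BTC,
                   interval=BLOCK_INTERVAL_S):
    # A competing block can appear at any moment (Poisson process), so an
    # extra delay of t seconds raises the orphan probability by roughly
    # t / interval for small t; the expected loss is that times the reward.
    return (delay_per_kb_s / interval) * reward

print("break-even fee per kB: %.8f BTC" % min_fee_per_kb())

With these made-up numbers the break-even fee is roughly 0.0004 BTC per
kilobyte; the point is only that such a floor exists and scales with the
reward, not the specific value.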

Should we maintain 1 MB forever? Probably not.
Is 20 MB a bad size? I honestly don't know.
Is this urgent? I don't think so.
Should we rush things when we don't have clear answers to many related
questions? I don't think so.

You think that it is too soon to start restricting transaction volume
in any way. You will answer why in your post.
When is the right time and what is the right limitation then?

I want to have fee competition as soon as possible, at least
temporarily. But you say that it can wait for later.
Ok, when do you think we should make that happen then?
When 20 MB are full, will that be the right time to let the fee market
develop then or will it be urgent to increase the block size again?
Should we directly remove the limit then and let miners handle it as
they want?
If so, why not now?
Maybe we can increase to 2 MB, then wait for fee competition, then
wait for 2 more subsidy halvings and then increase to 11 or 20 MB?
There are so many possibilities that I don't understand how it can be
surprising that "20 MB, as soon as possible" is not the obvious answer
to everyone...
Mike Hearn
2015-05-07 16:11:11 UTC
Permalink
Post by Jorge Timón
It is an argument against my admittedly vague definition of
"non-controversial change".
If it's an argument against something you said, it's not a straw man, right
;)

Consensus has to be defined as agreement between a group of people. Who are
those people? If you don't know, it's impossible to decide when there is
consensus or not.

Right now there is this nice warm fuzzy notion that decisions in Bitcoin
Core are made by consensus. "Controversial" changes are avoided. I am
trying to show you that this is just marketing. Nobody can define what
these terms even mean. It would be more accurate to say decisions are
vetoed by whoever shows up and complains enough, regardless of technical
merit. After all, my own getutxo change was merged after a lot of technical
debate (and trolling) ..... then unmerged a day later because "it's a
shitstorm".

So if Gavin showed up and complained a lot about side chains or whatever,
what you're saying is, oh that's different. We'd ignore him. But when
someone else complains about a change they don't like, that's OK.

Heck, I could easily come up with a dozen reasons to object to almost any
change, if I felt like it. Would I then be considered not a part of the
consensus because that'd be convenient?
Post by Jorge Timón
I'm sure that's not what the proponents of the size increase want, and
I'm not defending 1 MB as a sacred limit or anything, but my question
is "where is the limit for them?"
20mb is an arbitrary number, just like 1mb. It's good enough to keep the
Bitcoin ecosystem operating as it presently does: gentle growth in usage
with the technology that exists and is implemented. Gavin has discussed in
his blog why he chose 20mb, I think. It's the result of some estimates
based on average network/hardware capabilities.

Perhaps one day 20mb will not be enough. Perhaps then the limit will be
raised again, if there is sufficient demand.

You are correct that "no limit at all" is a possible answer. More
precisely, in that case miners would choose. Gavin's original proposal was
20mb+X where X is decided by some incrementing formula over time, chosen to
approximate expected improvements in hardware and software. That was cool
too. The 20mb figure and the formula were an attempt to address the
concerns of people who are worried about the block size increase: a
meet-in-the-middle compromise.
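
Purely to illustrate the shape of such a formula (the base size, growth
rate and start year below are my guesses for the sake of the example,
not the actual parameters of any proposal):

# Illustrative growth schedule: start at a base limit and grow it by a
# fixed percentage per year. All three parameters are assumptions.

BASE_MB = 20.0          # assumed starting limit
GROWTH_PER_YEAR = 0.40  # assumed rate, roughly a doubling every two years
START_YEAR = 2016       # assumed activation year

def max_block_size_mb(year):
    years_elapsed = max(0, year - START_YEAR)
    return BASE_MB * (1.0 + GROWTH_PER_YEAR) ** years_elapsed

for y in (2016, 2018, 2020, 2026):
    print(y, round(max_block_size_mb(y), 1), "MB")

Under those assumed parameters the limit would be about 39 MB in 2018 and
77 MB in 2020; change the parameters and the curve changes, which is
exactly the kind of knob a compromise could turn.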

Unfortunately it's hard to know what other kinds of meet-in-the-middle
compromise could be made here. I'm sure Gavin would consider them if he
knew. But the concerns provided are too vague to address. There are no
numbers in them, for example:

- We need more research -> how much more?
- I'm not against changing the size, just not now -> then when?
- I'm not wedded to 1mb, but not sure 20mb is right -> then what?
- Full node count is going down -> then what size do you think would fix
that? 100kb?
- It will make mining more centralised -> how do you measure that and
how much centralisation would you accept?

and so on.
Jorge Timón
2015-05-07 16:47:53 UTC
Permalink
Post by Mike Hearn
Post by Jorge Timón
It is an argument against my admittedly vague definition of
"non-controversial change".
If it's an argument against something you said, it's not a straw man, right
;)
Yes, but it was an argument against something I didn't say ;)
Post by Mike Hearn
Consensus has to be defined as agreement between a group of people. Who are
those people? If you don't know, it's impossible to decide when there is
consensus or not.
Right now there is this nice warm fuzzy notion that decisions in Bitcoin
Core are made by consensus. "Controversial" changes are avoided. I am trying
to show you that this is just marketing. Nobody can define what these terms
even mean. It would be more accurate to say decisions are vetoed by whoever
shows up and complains enough, regardless of technical merit.
Yes, that's why I drafted a definition for "uncontroversial change"
rather than "change accepted by consensus".
It will still be vague and hard to define, but consensus seems much harder.
And, yes, you're right, it is more like giving power to anyone with
valid arguments to veto hardfork changes.
But as you say, that could end up making hardforks actually impossible,
so we should limit what constitutes a valid argument.
I later listed some examples of invalid arguments: logical fallacies,
unrelated arguments, outright lies.
Certainly I don't think the technical merits of the person should count
here, or that we should bar a particular person from vetoing.
We should filter the arguments, not require an identity layer to
blacklist individuals.
We should even accept arguments from anonymous people on the internet
(you know, it wouldn't be the first time).
Post by Mike Hearn
Unfortunately it's hard to know what other kinds of meet-in-the-middle
compromise could be made here. I'm sure Gavin would consider them if he
knew. But the concerns provided are too vague to address. There are no
We need more research -> how much more?
Any research at all about fee market dynamics with a limited block size;
so far there hasn't been any.
If we're increasing the consensus max size maybe we could at least
maintain the 1MB limit as a standard policy limit, so that we can
study it a little bit (like we could have done instead of removing the
initial policy limit).
Post by Mike Hearn
I'm not against changing the size, just not now -> then when?
I don't know yet, but I understand now that having a clearer roadmap
is what's actually urgent, not the change itself.
Post by Mike Hearn
I'm not wedded to 1mb, but not sure 20mb is right -> then what?
What about 2 MB consensus limit and 1 MB policy limit for now? I know
that's arbitrary too.
Post by Mike Hearn
Full node count is going down -> then what size do you think would fix that?
100kb?
As others have explained, the number of full nodes is not the
important part, but how easy it is to run one.
I think a modest laptop with the average internet connection of say,
India or Brazil, should be able to run a full node.
I haven't made those numbers myself but I'm sure that's possible with
1 MB blocks today, and probably with 2 MB blocks too.
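
As a very rough illustration of the kind of arithmetic I have in mind
(the relay-overhead multiplier is a pure guess; real traffic depends on
peer count and whether the node serves blocks to others):

# Rough sustained bandwidth needed just to keep up with the chain.
# The overhead multiplier is an assumption, not a measurement.

def sustained_kbit_per_s(block_mb, relay_overhead=2.0, interval_s=600.0):
    bits = block_mb * 1e6 * 8 * relay_overhead
    return bits / interval_s / 1e3

for size_mb in (1, 2, 20):
    print("%2d MB blocks: ~%.0f kbit/s sustained" % (size_mb, sustained_kbit_per_s(size_mb)))

That works out to tens of kbit/s at 1-2 MB blocks and roughly half a
megabit per second at 20 MB, before initial sync is even considered.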
Post by Mike Hearn
It will make mining more centralised -> how do you measure that and how much
centralisation would you accept?
This is an excellent question for both sides.
Unfortunately I don't know the answer to this. Do you?
Gavin Andresen
2015-05-07 16:59:13 UTC
Permalink
Fee dynamics seems to come up over and over again in these discussions,
with lots of talk and theorizing.

I hope some data on what is happening with fees right now might help, so I
wrote another blog post (with graphs, which can't be done in a mailing list
post):
http://gavinandresen.ninja/the-myth-of-not-full-blocks

We don’t need 100% full one megabyte blocks to start to learn about what is
likely to happen as transaction volume rises and/or the one megabyte block
size limit is raised.
--
Gavin Andresen
Peter Todd
2015-05-07 17:42:20 UTC
Permalink
Post by Gavin Andresen
Fee dynamics seems to come up over and over again in these discussions,
with lots of talk and theorizing.
I hope some data on what is happening with fees right now might help, so I
wrote another blog post (with graphs, which can't be done in a mailing list post):
http://gavinandresen.ninja/the-myth-of-not-full-blocks
We don’t need 100% full one megabyte blocks to start to learn about what is
likely to happen as transaction volume rises and/or the one megabyte block
size limit is raised.
Sounds like you're saying we are bumping up against a 1MB limit. However,
other than the occasional user who has sent a transaction with an
extremely low/no fee, what evidence do we have that this is or is not
actually impacting meaningful usage from the user's point of view?

Do we have evidence as to how users are coping? e.g. do they send time
sensitive transactions with higher fees? Are people consciously moving
low value transactions off the blockchain? Equally, what about the story
with companies? You of course are an advisor to Coinbase, and could give
us some insight into the type of planning payment processors/wallets are
doing. For instance, does Coinbase have any plans to work with other
wallet providers/payment processors to aggregate fund transfers between
wallet providers - an obvious payment channel application?
--
'peter'[:-1]@petertodd.org
00000000000000000232164c96eaa6bf7cbc3dc61ea055840715b5a81ee8f6be
Jorge Timón
2015-05-07 18:05:22 UTC
Permalink
Fee dynamics seems to come up over and over again in these discussions, with
lots of talk and theorizing.
I hope some data on what is happening with fees right now might help, so I
wrote another blog post (with graphs, which can't be done in a mailing list post):
http://gavinandresen.ninja/the-myth-of-not-full-blocks
We don’t need 100% full one megabyte blocks to start to learn about what is
likely to happen as transaction volume rises and/or the one megabyte block
size limit is raised.
Ok, the fact that a fee already increases the probability of getting
included faster is a good thing; the graphs with the
probability of getting included in the next block were less important
to me.
Although scarce space (beyond what miners choose to limit by
themselves) would increase the fee competition, I didn't know that
there is actually some competition happening already.
So I guess this diminishes the argument for maintaining the limits
longer to observe the results of more scarce space.
Still, I think maintaining a lower policy limit is a good idea, even
if we decide not to use it to observe that soon.
For example, say we chose the 20 MB consensus limit; we could maintain
the policy limit at 1 MB or move it to 2 MB, and slowly move it up
later as needed without requiring everyone to upgrade.
Of course, not all miners have to follow the standard policy, but at
least it's something.
So please take this as a suggestion to improve your proposal. You can
argue it like this "if we want to maintain the limits after the
hardfork or increase them slowly, for observing fee dynamics with more
scarce space or for any other reason, those limits can be partially
enforced by the standard policy". I mean, I think that could be a
reasonable compromise for that concrete line of arguments.
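
To spell the distinction out, a minimal sketch (the names and numbers
are mine for illustration, not Bitcoin Core's actual code or values):

# Consensus limit vs. standard-policy limit, illustrative only.

CONSENSUS_MAX_BLOCK_BYTES = 20 * 1000 * 1000  # hypothetical post-hardfork limit
POLICY_MAX_BLOCK_BYTES = 1 * 1000 * 1000      # local default, changeable per node

def is_block_valid(block_bytes):
    # Every node enforces this; changing it requires a hard fork.
    return block_bytes <= CONSENSUS_MAX_BLOCK_BYTES

def will_this_node_produce(block_bytes):
    # Only local policy for block creation; raising it later is just a
    # configuration change, and miners are free to ignore it entirely.
    return block_bytes <= POLICY_MAX_BLOCK_BYTES

The point is that the consensus check must be the same for everyone,
while the policy check can be raised gradually without asking everyone
to upgrade again.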
Btc Drak
2015-05-07 19:57:20 UTC
Permalink
Post by Mike Hearn
Right now there is this nice warm fuzzy notion that decisions in Bitcoin
Core are made by consensus. "Controversial" changes are avoided. I am
trying to show you that this is just marketing.
Consensus is arrived at when the people who are most active at the time
(active in contributing to discussions, code review, giving opinions etc.)
agree to ACK. There is a regular staple of active contributors. Bitcoin
development is clearly a meritocracy. The more people participate and
contribute, the more weight their opinions hold.
Post by Mike Hearn
Nobody can define what these terms even mean. It would be more accurate to
say decisions are vetoed by whoever shows up and complains enough,
regardless of technical merit. After all, my own getutxo change was merged
after a lot of technical debate (and trolling) ..... then unmerged a day
later because "it's a shitstorm".
I am not sure that is fair; your PR was reverted because someone found a
huge exploit in it, serious enough to invalidate all the arguments used
to get it merged in the first place.
Post by Mike Hearn
So if Gavin showed up and complained a lot about side chains or whatever,
what you're saying is, oh that's different. We'd ignore him. But when
someone else complains about a change they don't like, that's OK.
Heck, I could easily come up with a dozen reasons to object to almost any
change, if I felt like it. Would I then be considered not a part of the
consensus because that'd be convenient?
I don't think it's as simple as that. Objections for the sake of
objections, or unsound technical objections, are going to be seen for what
they are. This is a project with some of the brightest people in the
world in this field. Sure, people can be disruptive, but reputations
stand the test of time.

The consensus system might not be perfect, but it almost feels like you
want to declare a state of emergency and suspend all the normal review
process for this proposed hard fork.
Btc Drak
2015-05-07 15:39:32 UTC
Permalink
Post by Mike Hearn
Maybe you dislike that idea. It's so .... centralised. So let's say Gavin
commits his patch, because his authority is equal to all other committers.
Someone else rolls it back. Gavin sets up a cron job to keep committing the
patch. Game over.
You cannot have committers fighting over what goes in and what doesn't.
That's madness. There must be a single decision maker for any given
codebase.
You are conflating consensus with commit access. People with commit access
are maintainers who are *able to merge* pull requests. However, the rules
for bitcoin development are that only patches with consensus get merged. If
any of the maintainers just pushed a change without going through the whole
code review and consensus process there would be uproar, plain and simple.

Please don't conflate commit access with permission to merge because it's
just not the case. No-one can sidestep the requirement to get consensus,
not even the 5 maintainers.
Peter Todd
2015-05-07 13:02:40 UTC
Permalink
Post by Mike Hearn
What if Gavin popped up right now and said he disagreed with every current
proposal, he disagreed with side chains too, and there would be no
consensus on any of them until the block size limit was raised.
Would you say, oh, OK, guess that's it then. There's no consensus so might
as well scrap all those proposals, as they'll never happen anyway. Bye bye
side chains whitepaper.
If Gavin had good points to make, he'd probably eventually change
everyone's mind.

But if he fails to do that at some point he'll just get ignored and for
all practical purposes won't be considered part of the consensus. Not
unlike how if someone suggested we power the blockchain via perpetual
motion machines they'd be ignored. Bitcoin is after all a decentralized
system so all power over the development process is held only by social
consent and respect.

At that point I'd suggest Gavin fork the project and go to the next
level of consensus gathering, the community at large; I'm noticing this
is exactly what you and Gavin are doing.

Speaking of, are you and Gavin still thinking about forking Bitcoin
Core? If so I wish you the best of luck.

Sent: Wednesday, July 23, 2014 at 2:42 PM
From: "Mike Hearn" <***@plan99.net>
To: "Satoshi Nakamoto" <***@gmx.com>
Subject: Thinking about a fork
I don't expect a reply, just getting some thoughts off my chest. Writing them down helps.

Forking Bitcoin-Qt/Core has been coming up more and more often lately in conversation (up from zero not that long ago). Gavin even suggested me and him fork it ... I pointed out that maintainers don't normally fork their own software :)

The problem is that the current community of developers has largely lost interest in supporting SPV wallets. Indeed any protocol change that might mean any risk at all, for anyone, is now being bogged down in endless circular arguments that never end. The Bitcoin developers have effectively become the new financial regulators: restricting options within their jurisdiction with "someone might do something risky" being used as the justification.

If alternative low-risk protocols were always easily available this would be no problem, but often they require enormous coding and deployment effort or just don't exist at all. Yet, wallets must move forward. If we carry on as now there simply won't be any usable decentralised wallets left and Bitcoin will have become an energy-wasting backbone for a bunch of banks and payment processors. That's so far from your original vision, it's sad.

I know you won't return and that's wise, but sometimes I wish you'd left a clearer design manifesto before handing the reins over to Gavin, who is increasingly burned out due to all the arguments (as am I).

Source: https://www.reddit.com/r/Bitcoin/comments/2g9c0j/satoshi_email_leak/
--
'peter'[:-1]@petertodd.org
0000000000000000066f25b3196b51d30df5c1678fc206fdf55b65dd6e593b05
Matt Corallo
2015-05-07 19:14:48 UTC
Permalink
Post by Jorge Timón
Can you please elaborate on what terrible things will happen if we
don't increase the block size by winter this year?
I was referring to winter next year. 0.12 isn't scheduled until the end
of the year, according to Wladimir. I explained where this figure comes from.
On a related note, I'd like to agree strongly with Peter Todd that we
should get away from doing forks-only-in-releases. We can add code to do
a fork and then enable it in 0.11.1 or 0.11.11 if Gavin prefers more 11s.
Dave Hudson
2015-05-07 11:55:49 UTC
Permalink
Post by Jorge Timón
I observed to Wladimir and Gavin in private that this timeline meant a change to the block size was unlikely to get into 0.11, leaving only 0.12, which would give everyone only a few months to upgrade in order to fork the chain by the end of the winter growth season. That seemed tight.
Can you please elaborate on what terrible things will happen if we
don't increase the block size by winter this year?
I assume that you are expecting full blocks by then, have you used any
statistical technique to come up with that date or is it just your
guess?
Because I love wild guesses and mine is that full 1 MB blocks will not
happen until June 2017.
I've been looking at this problem for quite a while (Gavin cited some of my work a few days ago) so thought I'd chime in with a few thoughts (some of which I've not published). I believe the major problem here is that this isn't just an engineering decision; the reaction of the miners will actually determine the success or failure of any course of action. In fact any decision forced upon them may backfire if they collectively take exception to it. It's worth bearing in mind that most of the hash rate is now under the control of relatively large companies, many of whom have investors who are expecting to see returns; it probably isn't sufficient to just expect them to "do the right thing".

We're seeing plenty of full 1M byte blocks already and have been for months. Typically whenever we have one of the large inter-block gaps then these are often followed by one (and sometimes several) completely full blocks (full by the definition of whatever the miner wanted to use as a size limit).

The problem with this particular discussion is that there are quite a few "knowns" but an equally large number of "unknowns". Let's look at them:

Known: There has been a steady trend towards the mean block size getting larger. See https://blockchain.info/charts/avg-block-size?timespan=all&showDataPoints=false&daysAverageString=7&show_header=true&scale=0&address=

Known: Now, the trend was definitely increasing quite quickly last year, but for the last few months it has been slowing down; however, we did see pretty much a 2x increase in mean block sizes in 2014.

Known: For most of 2015 we've actually been seeing that rate slow quite dramatically, but the total numbers of transactions are still rising, so mean transaction sizes have been reducing, and that tallies with seeing more transactions per block: https://blockchain.info/charts/n-transactions-per-block?timespan=all&showDataPoints=false&daysAverageString=7&show_header=true&scale=0&address=

Unknown: Why are we seeing more, smaller transactions? Are we simply seeing more efficient use of blockchain resources, or have some large consumers of block space gone away? How much more block space compression might be possible in, say, the next 12 months?

Known: If we reach the point where all blocks are 1M bytes then there's a major problem in terms of transaction confirmation. I published an analysis of the impact of different mean block sizes against confirmation times: http://hashingit.com/analysis/34-bitcoin-traffic-bulletin. The current 35% to 45% mean block size doesn't have a huge impact on transaction confirmations (assuming equal fees for all) but once we're up at 80% then things start to get unpleasant. Instead of 50% of first confirmations taking about 7 minutes they instead take nearer to 19 minutes.
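
For anyone who wants to reproduce the shape of that result, here is a
crude Monte Carlo sketch of the kind of queueing model involved (equal
fees, first-in-first-out, Poisson block intervals; the capacity figure is
a scaled-down toy value, so the exact outputs will differ from the
numbers in the published analysis):

# Crude simulation of median first-confirmation delay vs. utilisation,
# assuming equal fees (FIFO), Poisson block gaps and steady tx arrival.
import random

def simulate_median_wait(utilisation, capacity=200, n_blocks=5000, seed=1):
    rng = random.Random(seed)
    tx_per_minute = utilisation * capacity / 10.0
    spacing = 1.0 / tx_per_minute        # deterministic arrivals, for simplicity
    next_arrival = 0.0
    queue = []                           # FIFO of arrival times
    waits = []
    t = 0.0
    for _ in range(n_blocks):
        t += rng.expovariate(1.0 / 10.0) # exponential 10-minute block gaps
        while next_arrival <= t:
            queue.append(next_arrival)
            next_arrival += spacing
        confirmed, queue = queue[:capacity], queue[capacity:]
        waits.extend(t - a for a in confirmed)
    waits.sort()
    return waits[len(waits) // 2]

for u in (0.4, 0.8, 0.95):
    print("utilisation %.0f%%: median wait ~%.1f min" % (u * 100, simulate_median_wait(u)))

At low utilisation the median wait is close to the ~7 minutes you would
expect from the block interval alone; as utilisation approaches 100% the
backlog, and therefore the wait, grows quickly.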

Known: There are currently a reasonably large number of zero-fee transactions getting relayed and mined. If things start to slow down then there will be a huge incentive to delay them (or drop them altogether).

Unknown: If block space starts to get more scarce then how will this affect the use of the blockchain? Do the zero-fee TXs move to some batched transfer solution via third party? Do people start to get smarter about how TXs are encoded? Do some TXs go away completely (there are a lot of long-chain transactions that may simply be "noise" creating an artificially inflated view of transaction volumes)?

Known: There's a major problem looming for miners at the next block reward halving. Many are already in a bad place, and without meaningful fees, then sans a 2x increase in the USD:BTC ratio many will simply have to leave the network, increasing centralisation risks. There seems to be a fairly pervasive assumption that the 300-ish MW of power that they currently use is going to pay for itself (ignoring capital and other operating costs).

Unknown: If the block size is increased and yet more negligible fee transactions are dumped onto the network then that might well motivate some large fraction of miners to start to clamp block sizes or reject transactions below a certain fee threshold; they can easily create their own artificial scarcity if enough of them feel it is in their interest (it's not the most tricky setting to change). One can well imagine VC investors in mining groups asking why they're essentially subsidising all of the other VC-funded Bitcoin startups.

Known: the orphan rate is still pretty high even with everyone's fast connections. If we assume that 20M byte blocks become possible then that's likely to increase.

Unknown: What are the security implications for larger blocks (this one (at least) can be simulated though)? For example, could large blocks with huge numbers of trivial transactions be used to put other validators at a disadvantage in a variant of a selfish mining attack? I've seen objections that such bad actors could be blacklisted in the future but it's not clear to me how. A private mining pool can trivially be made to appear like 100 pools of 1% of the size without significantly affecting the economics of running that private mine.


Cheers,
Dave
Jorge Timón
2015-05-07 13:40:23 UTC
Permalink
Post by Dave Hudson
Known: There has been a steady trend towards the mean block size getting
larger. See
https://blockchain.info/charts/avg-block-size?timespan=all&showDataPoints=false&daysAverageString=7&show_header=true&scale=0&address=
Looking at this graph, and in retrospect, we shouldn't have removed
the standard policy limit without observing the supposedly disastrous
effects of hitting the limit first.
Removing the standard limit would have been trivial (bdb issues aside)
at any point after seeing the effects.
Post by Dave Hudson
Known: If we reach the point where all blocks are 1M bytes then there's a
major problem in terms of transaction confirmation. I published an analysis
http://hashingit.com/analysis/34-bitcoin-traffic-bulletin. The current 35%
to 45% mean block size doesn't have a huge impact on transaction
confirmations (assuming equal fees for all) but once we're up at 80% then
things start to get unpleasant. Instead of 50% of first confirmations taking
about 7 minutes they instead take nearer to 19 minutes.
Well, this is only for first confirmations of free transactions.
A higher fee should increase your probability, but if you're sending
free transactions you may not care about them taking longer to be
included.
Post by Dave Hudson
Known: There are currently a reasonably large number of zero-fee
transactions getting relayed and mined. If things start to slow down then
there will be a huge incentive to delay them (or drop them altogether).
Well, maybe "instant and free" it's not a honest form of bitcoin
marketing and it just has to disappear.
Maybe we just need to start being more honest about pow being good for
processing micro-transactions: it is not.
Hopefully lightning will be good for that.
Free and fast in-chain transactions is something temporary that we
know will eventually disappear.
If people think it would be a adoption disaster that it happens soon,
then they could also detail an alternative plan to roll that out
instead of letting it happen.
But if the plan is to delay it forever...then I'm absolutely against.
Post by Dave Hudson
Known: There's a major problem looming for miners at the next block reward
halving. Many are already in a bad place and without meaningful fees then
sans a 2x increase in the USD:BTC ratio then many will simply have to leave
the network, increasing centralisation risks. There seems to be a fairly
pervasive assumption that the 300-ish MW of power that they currently use is
going to pay for itself (ignoring capital and other operating costs).
I take this as an argument for increasing fee competition and thus,
against increasing the block size.
Post by Dave Hudson
Known: the orphan rate is still pretty-high even with everyone's fast
connections. If we assume that 20M byte blocks become possible then that's
likely to increase.
Unknown: What are the security implications for larger blocks (this one (at
least) can be simulated though)? For example, could large blocks with huge
numbers of trivial transactions be used to put other validators at a
disadvantage in a variant of a selfish mining attack? I've seen objections
that such bad actors could be blacklisted in the future but it's not clear
to me how. A private mining pool can trivially be made to appear like 100
pools of 1% of the size without significantly affecting the economics of
running that private mine.
No blacklisting, please, that's centralized.
In any case, a related known: bigger blocks give competitive advantage
to bigger miners.
Tom Harding
2015-05-08 04:46:48 UTC
Permalink
Post by Jorge Timón
Post by Dave Hudson
Known: There's a major problem looming for miners at the next block reward
halving. Many are already in a bad place and without meaningful fees then
sans a 2x increase in the USD:BTC ratio then many will simply have to leave
the network, increasing centralisation risks. There seems to be a fairly
pervasive assumption that the 300-ish MW of power that they currently use is
going to pay for itself (ignoring capital and other operating costs).
I take this as an argument for increasing fee competition and thus,
against increasing the block size.
That doesn't follow. Supposing average fees per transaction decrease
with block size, total fees / block reach an optimum somewhere. While
the optimum might be at infinity, it's certainly not at zero, and it's
not at all obvious that the optimum is at a block size lower than 1MB.
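
A toy illustration of that point (the exponential fee curve and every
number in it are assumptions, chosen only to show that s * f(s) can peak
at a finite size):

# If the average fee per transaction falls as block space grows, total
# fees per block s * f(s) can peak at a finite block size. Illustrative.
import math

def avg_fee_per_tx(size_mb, fee_at_zero=1.0, scale_mb=4.0):
    return fee_at_zero * math.exp(-size_mb / scale_mb)

def total_fees_per_block(size_mb, tx_per_mb=2000):
    return size_mb * tx_per_mb * avg_fee_per_tx(size_mb)

sizes = [0.25 * i for i in range(1, 81)]   # 0.25 MB .. 20 MB
best = max(sizes, key=total_fees_per_block)
print("fee revenue peaks near %.2f MB under these assumptions" % best)

Under this particular (arbitrary) curve the peak happens to be at 4 MB; a
different curve moves it, which is exactly the point: where the optimum
sits is an empirical question.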
Jeff Garzik
2015-05-07 14:04:21 UTC
Permalink
I have a lot more written down, a WIP; here are the highlights.

- The 1MB limit is an ancient anti-spam limit, and needs to go.

- The 1MB limit is economically entrenched at this point, and cannot be
removed at a whim.

- This is a major change to the economics of a $3.2B system. This change
picks winners and losers. There is attendant moral hazard.

- The core dev team is not and should not be an FOMC.

- The bar for "major economic change to a $3.2B system" should necessarily
be high. In the more boring world of investments, this would be accompanied
by Due Diligence including but not limited to projections for success,
failure scenarios, upside risks and downside risks. Projections and
fact-based simulations.

- There are significant disruption risks on the pro (change it) and con
(keep 1MB) sides of the debate.

- People are privately lobbying Gavin for this. That is the wrong way to
go. I have pushed for a more public debate, and public endorsements (or
condemnations) from major miners, merchants, payment processors,
stakeholders, ... It is unfair to criticize Gavin for doing this.
Peter Todd
2015-05-07 14:32:34 UTC
Permalink
Post by Jeff Garzik
I have a lot more written down, a WIP; here are the highlights.
- The 1MB limit is an ancient anti-spam limit, and needs to go.
- The 1MB limit is economically entrenched at this point, and cannot be
removed at a whim.
- This is a major change to the economics of a $3.2B system. This change
picks winners and losers. There is attendant moral hazard.
- The core dev team is not and should not be an FOMC.
- The bar for "major economic change to a $3.2B system" should necessarily
be high. In the more boring world of investments, this would be accompanied
by Due Diligence including but not limited to projections for success,
failure scenarios, upside risks and downside risks. Projections and
fact-based simulations.
- There are significant disruption risks on the pro (change it) and con
(keep 1MB) sides of the debate.
- People are privately lobbying Gavin for this. That is the wrong way to
go. I have pushed for a more public debate, and public endorsements (or
condemnations) from major miners, merchants, payment processors,
stakeholders, ... It is unfair to criticize Gavin for doing this.
The hard part here will be including the players who aren't individually
"major", but are collectively important; who is the community?

How do you give the small merchants a voice in this discussion? The
small time traders? The small time miners? The people in repressive
countries who are trying to transact on their own terms?

Legality? Should people involved in 3rd world remittances be
included? Even if what they're doing is technically illegal? What about
dark markets? If DPR voiced his opinion, should we ignore it?

Personally, I'm dubious about trying to make ecosystem-wide decisions
like this without cryptographic consensus; fuzzy human social consensus
is easy to manipulate.
--
'peter'[:-1]@petertodd.org
000000000000000013e67b343b1f6d75cc87dfb54430bdb3bcf66d8d4b3ef6b8
Justus Ranvier
2015-05-07 14:38:22 UTC
Permalink
Post by Jeff Garzik
- This is a major change to the economics of a $3.2B system. This
change picks winners and losers. There is attendant moral hazard.
This is exactly true.

There are a number of projects which aren't Bitcoin that benefit from
filling in the gap left by Bitcoin's restricted transaction rate
capability.

If Bitcoin fills that gap, Bitcoin wins and those other projects lose.

Should decisions about Bitcoin development take into account the
desires of competing projects?
Peter Todd
2015-05-07 14:49:18 UTC
Permalink
Post by Justus Ranvier
Post by Jeff Garzik
- This is a major change to the economics of a $3.2B system. This
change picks winners and losers. There is attendant moral hazard.
This is exactly true.
There are a number of projects which aren't Bitcoin that benefit from
filling in the gap left by Bitcoin's restricted transaction rate
capability.
If Bitcoin fills that gap, Bitcoin wins and those other projects lose.
Should decisions about Bitcoin development take into account the
desires of competing projects?
Well, basically you're asking if we shouldn't assume the people in this
discussion have honest intentions. If you want to go down that path,
keep in mind where it leads.

I think we'll find a basic assumption of civility to be more
productive, until proven otherwise. (e.g. NSA ties)
--
'peter'[:-1]@petertodd.org
00000000000000000d49f263bbbb80f264abc7cc930fc9cbc7ba80ac068d9648
Justus Ranvier
2015-05-07 15:13:34 UTC
Permalink
Post by Peter Todd
I think we'll find a basic assumption of civility to be more
productive, until proven otherwise. (e.g. NSA ties)
I'm not sure why you'd construe my post as having anything to do with
accusations like NSA ties.

By "non-Bitcoin" projects I mean any altcoin or off-chain processing
solution.
Peter Todd
2015-05-07 15:25:03 UTC
Permalink
Post by Justus Ranvier
Post by Peter Todd
I think we'll find a basic assumption of civility to be more
productive, until proven otherwise. (e.g. NSA ties)
I'm not sure why you'd construe my post as having anything to do with
accusations like NSA ties.
I'm not.

I'm saying dealing with someone with proven NSA ties is one of the few
times when I think the assumption of honest intent should be ignored in
this forum.

Altcoins and non-Bitcoin-blockchain tx systems? Assuming anything other
than honest intent isn't productive in this forum.
--
'peter'[:-1]@petertodd.org
00000000000000000622ff7c71c105480baf123fe74df549b5a42596fd8bfbcb
Jeff Garzik
2015-05-07 15:04:58 UTC
Permalink
Post by Justus Ranvier
Post by Jeff Garzik
- This is a major change to the economics of a $3.2B system. This
change picks winners and losers. There is attendant moral hazard.
This is exactly true.
There are a number of projects which aren't Bitcoin that benefit from
filling in the gap left by Bitcoin's restricted transaction rate
capability.
If Bitcoin fills that gap, Bitcoin wins and those other projects lose.
Should decisions about Bitcoin development take into account the
desires of competing projects?
heh - I tend to think people here want bitcoin to succeed. My statement
refers to picking winners and losers from within the existing bitcoin
community & stakeholders.

The existential question of the block size increase is larger - will
failing to increase the 1MB limit permanently stunt bitcoin's growth?
Justus Ranvier
2015-05-07 15:16:50 UTC
Permalink
Post by Jeff Garzik
heh - I tend to think people here want bitcoin to succeed. My
statement refers to picking winners and losers from within the
existing bitcoin community & stakeholders.
"Success" is not a sufficiently precise term in this context.

There is a large contingent of people for whom the definition of
Bitcoin "success" means serving as a stable backend which can meet the
needs of their non-Bitcoin platform - and nothing more.

To be extremely specific: should Bitcoin development intentionally
limit the network's capabilities to leave room for other projects, or
should Bitcoin attempt to be the best system possible and let the
other projects try to keep up as best they can?
Jeff Garzik
2015-05-07 15:27:39 UTC
Permalink
Post by Justus Ranvier
To be extremely specific: should Bitcoin development intentionally
limit the network's capabilities to leave room for other projects, or
should Bitcoin attempt to be the best system possible and let the
other projects try to keep up as best they can?
Avoid such narrow, binary thinking.

Referencing the problem described in
http://gavinandresen.ninja/why-increasing-the-max-block-size-is-urgent
(not the solution - block size change - just the problem, tx/block Poisson
mismatch)

This problem - block creation is bursty - is fundamental to bitcoin.
Raising block size does not fix this problem (as [1] notes), but merely
kicks the can down the road a bit, by hiding it from users a bit longer.
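
The burstiness is easy to quantify, since block discovery is
(approximately) a Poisson process and inter-block gaps are exponential
with a 10-minute mean regardless of the size limit:

# How often block gaps exceed a given length, assuming Poisson mining.
import math

def prob_gap_exceeds(minutes, mean=10.0):
    return math.exp(-minutes / mean)

for m in (10, 30, 60):
    print("P(gap > %2d min) = %.2f%%" % (m, 100 * prob_gap_exceeds(m)))

Roughly 37% of gaps exceed the 10-minute average, about 5% exceed half an
hour and about 0.25% exceed an hour; whatever the size limit, demand
piles up during the long gaps and gets flushed in bursts afterwards.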

Bitcoin is a settlement system, at the most fundamental engineering level.
It will never be an instant payment system for all the world's coffees (or
all the world's stock trades). It is left to "Layer 2" projects to
engineer around bitcoin's gaps, to produce an instant, secure, trustless,
egalitarian payment system using the bitcoin token. [1] also notes this.

It is therefore not a binary decision of leaving room for other projects,
or not. Layer-2 projects are critical to the success of bitcoin, and
complement bitcoin.

[1] http://gavinandresen.ninja/it-must-be-done-but-is-not-a-panacea

Holistic thinking implies you build a full-stack system with bitcoin.
Justus Ranvier
2015-05-07 15:33:45 UTC
Permalink
On Thu, May 7, 2015 at 11:16 AM, Justus Ranvier
Post by Justus Ranvier
To be extremely specific: should Bitcoin development
intentionally limit the network's capabilities to leave room for
other projects, or should Bitcoin attempt to be the best system
possible and let the other projects try to keep up as best they
can?
Avoid such narrow, binary thinking.
Altcoins and non-Bitcoin-blockchain tx systems? Assuming anything
other than honest intent isn't productive in this forum.
In summary, I asked a question that neither you nor Peter Todd want to
answer, and that you want to actively discourage people from even asking at all.
Jeff Garzik
2015-05-07 15:47:00 UTC
Permalink
Post by Justus Ranvier
In summary, I asked a question neither you, nor Peter Todd, want to
answer and want to actively discourage people from even asking at all.
Incorrect; your question included built-in assumptions with which I
disagree.

Bitcoin needs to be the best it can be (Layer 1), but all solutions cannot
and should not be implemented at Layer 1.

We need to scale up both bitcoin (L1) and solutions built on top of bitcoin
(L2).
Justus Ranvier
2015-05-07 15:50:38 UTC
Permalink
Post by Jeff Garzik
Bitcoin needs to be the best it can be (Layer 1), but all solutions
cannot and should not be implemented at Layer 1.
I can provisionally agree with that statement as long as "all
solutions cannot and should not be implemented at Layer 1" is taken to
be a hypothesis to be tested in the context of each proposed solution
rather than a law of nature.
Wladimir J. van der Laan
2015-05-07 11:20:43 UTC
Permalink
Post by Matt Corallo
Personally, I'm rather strongly against any commitment to a block size
increase in the near future. Long-term incentive compatibility requires
that there be some fee pressure, and that blocks be relatively
consistently full or very nearly full. What we see today are
transactions enjoying next-block confirmations with nearly zero pressure
to include any fee at all (though many do because it makes wallet code
simpler).
I'm weakly against a block size increase in the near future. Some arguments follow. For sake of brevity, this ignores the inherent practical and political issues in scheduling a hardfork.

Against:

1. The everyone-verifies-everything approach inherently doesn't scale well. Yes, it is possible to increase the capacity, within limits, without completely destroying the system, but if scaling turns out to be a success, even a 20-fold increase is just a drop in the bucket (this will not make a decentralized Changetip possible). The gain will always be linear, at a total cost that scales in the number of (full node) users times the block size. The whole idea of everyone verifying the whole world's bus tickets and coffee purchases seems ludicrous to me. For true scaling, as well as decentralized microtransactions, the community needs to work on non-centralized 'level 2' protocols for transactions, for example the Lightning network.

I prefer not to rely on faith that 'Moore's law' - which isn't a physical law but a trend - will save us. And it doesn't so much apply to communication bandwidth as its techniques are more diverse. E.g. for Bitsat, 20MB blocks will push the limit.

2a. Pushing up bandwidth and CPU usage will, inevitably, push people at the lower end away from running their own full nodes. Mind you, the sheer number of full nodes is not the issue, but Bitcoin's security model is based on being able to verify the shared consensus on one's own. A lot of recent development effort went into making the node software less heavy. Yes, one could switch to SPV, but that is a serious privacy compromise. In the worst case people will feel forced to move to webwallets.

That was about sustained bandwidth - syncing a new node from scratch through the internet will become unwieldy sooner - this can be worked around with UTXO snapshots, but doing this in a way that doesn't completely pull the rug out from under the security model needs research (I'm aware that this could be construed the other way: that such a solution will be needed eventually, and fast block chain growth just accelerates it).

2b. The bandwidth bound for just downloading blocks is ~4GB per month now; it will be ~52GB per month. (A back-of-envelope version of this arithmetic appears after this list.) Behind Tor and other anonymity networks, nodes will reveal themselves by the sheer volume of data transferred even to only download the block chain. This may already be the case, but will become worse. It may even become harmful to Tor itself.

3a. The costs are effectively externalized to users. I, hence, don't like calling the costs "trivial". I don't like making this decision for others, I'm not convinced that they are trivial for everyone. Besides, this will increase supply of block chain space, and thus push down transaction fees, but at cost to *all users*, even users that hardly do any transactions. Hence it favors those that generate lots of transactions (is that a perverse incentive? not sure, but it needs to be weighted in any decision).

3b. A mounting fee pressure, resulting in a true fee market where transactions compete to get into blocks, results in urgency to develop decentralized off-chain solutions. I'm afraid increasing the block size will kick this can down the road and let people (and the large Bitcoin companies) relax, until it's again time for a block size increase, and then they'll rally Gavin again, never resulting in a smart, sustainable solution but eternal awkward discussions like this.

4. We haven't solved the problem of arbitrary data storage, and increasing the block size would compound that problem. More block space means more storage available for the same price - and up to 20x faster growth of the UTXO set, which is permanent (more externalization). More opportunity for pedonazis to insert double-plus ungood data, exposing users to possible legal ramifications.
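
(A back-of-envelope for point 2b, referenced above; the fill factor is
illustrative, and the worst case simply assumes every block is full.)

# Monthly download volume just for blocks, ignoring tx relay, headers and
# serving blocks to peers. Illustrative arithmetic only.

BLOCKS_PER_DAY = 24 * 6    # one block per ~10 minutes
DAYS_PER_MONTH = 30

def gb_per_month(block_mb, fill=1.0):
    return block_mb * fill * BLOCKS_PER_DAY * DAYS_PER_MONTH / 1000.0

print("1 MB, full blocks : ~%.1f GB/month" % gb_per_month(1))
print("20 MB, full blocks: ~%.1f GB/month" % gb_per_month(20))
print("20 MB, 60%% full   : ~%.1f GB/month" % gb_per_month(20, fill=0.6))

Full 1 MB blocks come to about 4.3 GB per month and full 20 MB blocks to
about 86 GB; at roughly 60% average fill, 20 MB blocks come to about 52 GB.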

For:

1. First, the obvious: It gives some breathing room in a year (or whenever the hard fork is planned). If necessary, it will allow more transactions to be on-chain for a while longer while other solutions are being implemented.

2. *Allowing* 20MB blocks does not mean miners will immediately start making them. Many of them aren't even filling up to the 1MB limit right now, probably due to latency/stale block issues. This makes objection 2 milder, which is about *worst case* bandwidth, as well as 4, as the end result may not be 20MB blocks filled with arbitrary "junk".

3. Investment in off-chain solutions is guided by not only fee pressure, but also other reasons such as speed of confirmation, which is unreliable on-chain. This eases objection 3b.

Whew, this grew longer than I expected. To conclude, I understand the advantages of scaling, and I do not doubt a block size increase will *work*. Although there may be unforeseen issues, I'm confident they'll be resolved. However, it may well make Bitcoin less useful for what sets it apart from other systems in the first place: the possibility for people to run their own "bank" without special investment in connectivity and computing hardware.
Also the politics aspect of this (at some point it becomes a question of who decides for whom? who is excluded? all those human decisions...) I don't like in the least. Possibly unavoidable at some point, but that's something *I*'d like to kick down the road.

Wladimir
Eric Lombrozo
2015-05-07 11:30:49 UTC
Permalink
Post by Wladimir J. van der Laan
For sake of brevity, this ignores the inherent practical and political issues in scheduling a hardfork.
IMHO, these issues are the elephant in the room and the talk of block size increases is just a distraction.

- Eric Lombrozo
Jeff Garzik
2015-05-07 15:56:36 UTC
Permalink
Dear list,

Apparently my emails are being marked as spam, despite being sent from
GMail's web interface. I've pinged our sysadmin. Thanks for letting
me know.
--
Jeff Garzik
Bitcoin core developer and open source evangelist
BitPay, Inc. https://bitpay.com/
Mike Hearn
2015-05-07 16:13:47 UTC
Permalink
Post by Jeff Garzik
Dear list,
Apparently my emails are being marked as spam, despite being sent from
GMail's web interface. I've pinged our sysadmin.
It's a problem with the mailing list software, not your setup. BitPay could
disable the phishing protections but that seems like a poor solution. The
only real fix is to send from a non @bitpay.com email address. Gmail or
Hotmail will work, I think. Yahoo won't: they enforce the same strict
policies as bitpay does.
John Bodeen
2015-05-07 16:54:35 UTC
Permalink
If the worry is that raising the block size will increase centralization,
could one not imagine an application which rewarded decentralized
storage of block data? It could even be built alongside or on top of the
existing bitcoin protocol.

See the Permacoin paper by Andrew Miller:
http://cs.umd.edu/~amiller/permacoin.pdf

Regards