Discussion:
[bitcoin-dev] Hardfork to fix difficulty drop algorithm
Luke Dashjr via bitcoin-dev
2016-03-02 14:56:14 UTC
Permalink
We are coming up on the subsidy halving this July, and there have been some
concerns raised that a non-trivial number of miners could potentially drop off
the network. This would result in a significantly longer block interval, which
also means a higher per-block transaction volume, which could cause the block
size limit to legitimately be hit much sooner than expected. Furthermore, due
to difficulty adjustment being measured exclusively in blocks, the time until
it adjusts to compensate would be prolonged.

For example, if 50% of miners dropped off the network, blocks would be every
20 minutes on average and contain double the transactions they presently do.
Even double would be approximately 850-900k, which potentially bumps up
against the hard limit when empty blocks are taken into consideration. This
situation would continue for a full month if no changes are made. If more
miners drop off the network, most of this becomes linearly worse, but due to
hitting the block size limit, the backlog would grow indefinitely until the
adjustment occurs.
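
A rough back-of-the-envelope check of these figures, as a Python sketch that assumes only the nominal 10-minute spacing and the fixed 2016-block retarget window:

TARGET_SPACING_MIN = 10
RETARGET_BLOCKS = 2016

def after_hashrate_drop(fraction_remaining):
    interval = TARGET_SPACING_MIN / fraction_remaining          # minutes per block
    days_to_retarget = RETARGET_BLOCKS * interval / (60 * 24)   # days at the new rate
    return interval, days_to_retarget

print(after_hashrate_drop(0.5))   # (20.0, 28.0): 20-minute blocks, ~4 weeks to retarget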

To alleviate this risk, it seems reasonable to propose a hardfork to the
difficulty adjustment algorithm so it can adapt quicker to such a significant
drop in mining rate. BtcDrak tells me he has well-tested code for this in his
altcoin, which has seen some roller-coaster hashrates, so it may even be
possible to have such a proposal ready in time to be deployed alongside SegWit
to take effect in time for the upcoming subsidy halving. If this slips, I
think it may be reasonable to push for at least code-readiness before July,
and possibly roll it into any other hardfork proposed before or around that
time.

I am unaware of any reason this would be controversial, so if anyone has a
problem with such a change, please speak up sooner rather than later. Other
ideas or concerns are of course welcome as well.

Thanks,

Luke
Pavel Janík via bitcoin-dev
2016-03-02 15:05:08 UTC
Permalink
Post by Luke Dashjr via bitcoin-dev
the network. This would result in a significantly longer block interval, which
also means a higher per-block transaction volume, which could cause the block
size limit to legitimately be hit much sooner than expected.
If this happens at all (the exchange rate of the coin can accommodate such an expectation), the local fee market will develop, fees will rise and complement mined coins, thus bringing more miners back to the game (together with the expected higher exchange rate).
--
Pavel Janík
Luke Dashjr via bitcoin-dev
2016-03-02 15:14:35 UTC
Permalink
Post by Pavel Janík via bitcoin-dev
Post by Luke Dashjr via bitcoin-dev
the network. This would result in a significantly longer block interval,
which also means a higher per-block transaction volume, which could
cause the block size limit to legitimately be hit much sooner than
expected.
If this happens at all (the exchange rate of the coin can accommodate such
expectation),
The exchange rate is not significantly influenced by these things.
Historically, it seems fairly obvious that the difficulty has followed value,
not value following difficulty.
Post by Pavel Janík via bitcoin-dev
the local fee market will develop, fees will rise and complement mined
coins, thus bringing more miners back to the game (together with expected
higher exchange rate).
Depends on the hashrate drop, and tolerance for higher fees, both of which are
largely unknown at this time. At least having code prepared for the negative
scenarios in case of an emergency seems reasonable, even if we don't end up
needing to deploy it.

Luke
Jérémie Dubois-Lacoste via bitcoin-dev
2016-03-02 15:24:31 UTC
Permalink
Post by Luke Dashjr via bitcoin-dev
BtcDrak tells me he has well-tested code for this in his altcoin
Could you be more explicit: which altcoin is that?
Post by Luke Dashjr via bitcoin-dev
I am unaware of any reason this would be controversial
Probably not until you get to the details of any proposal. What is
your exact proposal here? Algorithm? Parameters?
As you likely know, too short a time window would be dangerous for
other reasons. Getting to agreement on what is reasonable or not
is not necessarily trivial.

Jeremie


Tier Nolan via bitcoin-dev
2016-03-02 15:54:15 UTC
Permalink
If a hard-fork is being considered, the easiest approach is to just step the
difficulty down by a factor of 2 when the adjustment happens.

This means that miners still get paid the same minting fee per hash as
before. There isn't that much risk. If the hashing power stays constant,
then there will be 5 minute blocks for a while until everything readjusts.

Nearly the same can be accomplished by a soft fork.

Proposal:

If 900 of the last 1000 blocks are block version X or above, then the
smooth change rule applies.

The adjustment is as follows

big_number get_new_target(int height, big_number old_target) {
    if (height < 405000)
        return old_target;                 // before the ramp: unchanged
    else if (height < 420000)
        // ramp: multiplier falls from 15000/15000 = 1.0 at height 405000
        // down to 15000/29999 ~= 0.5 just before height 420000
        return (old_target * 15000) / (height - 390000);
    else
        return old_target;                 // from the halving onward: unchanged
}

What this does is ramp the difficulty up slowly from block 405,000 to block
420,000. It ends up with a target that is 50% of the value stored in the
target bits. These blocks are valid under the old rules since they have more
POW than nBits requires (up to twice as much by the end of the ramp).
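
Transcribed directly into Python (an illustrative check only, not part of the proposal), the multiplier applied to old_target looks like this at a few heights:

def ramp_multiplier(height):
    if height < 405000 or height >= 420000:
        return 1.0
    return 15000.0 / (height - 390000)

for h in (404999, 405000, 412500, 419999, 420000):
    print(h, round(ramp_multiplier(h), 4))
# 404999 1.0, 405000 1.0, 412500 0.6667, 419999 0.5, 420000 1.0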

At block 420,000, the difficulty drops by a factor of 2 at the same time as
the reward drops by a factor of 2. This means that miners still get paid the
same BTC per hash. It would mean 5-minute blocks until the next adjustment,
though.

If 90% of the network is mining the artificially hard blocks, then a 10%
fork still loses: the 90% has an effective hash rate of 45% (of the original
total) versus the fork's 10%.

It is unlikely that miners would accept the fork, since they lose minting
fees. It effectively brings the subsidy reduction forward in time.
Luke Dashjr via bitcoin-dev
2016-03-02 15:42:28 UTC
Permalink
Post by Luke Dashjr via bitcoin-dev
so it may even be possible to have such a proposal ready in time to be
deployed alongside SegWit to take effect in time for the upcoming subsidy
halving.
Lapse of thinking/clarity here. This probably isn't a practical timeframe for
deployment, unless/until there's an emergency situation. So if the code were
bundled with SegWit, it would need some way to avoid its early activation
outside of such an emergency (which could possibly be detected in code, in
this case).

Luke
Paul Sztorc via bitcoin-dev
2016-03-02 16:27:52 UTC
Permalink
It is **essential** that emergency code be prepared. This code must be
able to lower the difficulty by a large factor.

---

This halving-difficulty-drop problem can, with some bad luck, get quite
disastrous, very quickly.

( I did a micro-study of this problem here, for those who are unaware:
http://www.truthcoin.info/blog/mining-heart-attack )

For example, it is theoretically possible that 100% of miners (not 50%
or 10%) will shut off their hardware. This is because it is revenue
which ~halves, not profit. If miners are all equal, difficulty causes
their profit margin to narrow over time (for example, if BTC revenues
are $100, and amortized fixed costs are $10, then difficulty adjustments
will cause total energy costs to rise to ~ $89, such that total
pre-halving profit is $1 for everyone...post-halving, profit is -$49 for
everyone).
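
Restating that margin arithmetic explicitly (a small Python sketch using the same illustrative dollar figures):

revenue_pre = 100.0    # pre-halving BTC revenue, in dollars
fixed_costs = 10.0     # amortized fixed costs
energy = 89.0          # competition drives energy spend up toward revenue
print(revenue_pre - fixed_costs - energy)        # 1.0: pre-halving profit
print(revenue_pre / 2 - fixed_costs - energy)    # -49.0: post-halving profit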

So, if miners are homogeneous the result is disastrous. Fortunately,
miners are probably still somewhat heterogeneous. However, we don't know
how their power contracts (or their hardware turnover) are
scheduled...many miners might (?) have already planned, in private, to
close down (or substantially reduce) operations after the halving.

As the coinbase rewards are currently orders of magnitude larger than
tx-fees, fees are unlikely to be able to compensate for this. Users may
decide to simply hold off on transacting until fees decrease.

Worse, if the price crashes (possibly as a result of uncertainty
surrounding this episode), it will begin to affect miner-revenue.

As a result, miners may decide to temporarily halt mining until the
difficulty falls naturally.

But such a temporary halt is also (potentially) disastrous. Recall the
simple fact that difficulty adjustments are measured in blocks, not time
(it appears that we have exactly 1015 blocks between the halving block
and the next difficulty adjustment block). If excessive difficulty
chokes the system, the next difficulty adjustment may *never* arrive naturally.

In this worst-case (but somewhat plausible) scenario, we will be
*forced* to lower the difficulty via hard fork, and we will be forced to
do so very very QUICKLY, as word will be spreading that the Bitcoin
system has broken!

If a specific hard fork is not coded and tested for this, in advance,
the delay might be accompanied by endless [contentious] conversations
about what else should be included in this hard fork.

Worse, since all users will need to upgrade, there will be uncertainty
over contentious versions, malicious agents may try to tamper with
versions (to steal Bitcoins), etc. We should consider pushing a version
out for users to upgrade, in advance of the halving, as soon as possible.



What a disaster! I certainly hope it does not happen, but if it does we
should have already agreed on what to do.


One choice is "which number do we set the difficulty to?". Half may be
too much, or too little. However, allow me to suggest that, if this
disastrous scenario occurs, we shouldn't take any chances, and reduce
difficulty by a huge proportion...80% or so. The difficulty will then
quickly begin to increase again...we can warn users of the increased
orphan risk, and that they should wait for many confirmations (which
should be happening faster).

So, "Allow the alert key to reduce the difficulty by 80%, exactly once
on one of the 1015 blocks between halving and difficulty adjustment."

And we should consider smoothing the rewards (as described in my post; it
can be done via soft fork) to prevent this from happening again. In the
microeconomics literature, 'kinks' in incentive systems are almost
universally agreed to be very undesirable.

Paul
Tier Nolan via bitcoin-dev
2016-03-02 18:07:41 UTC
Permalink
On Wed, Mar 2, 2016 at 4:27 PM, Paul Sztorc via bitcoin-dev <
Post by Paul Sztorc via bitcoin-dev
For example, it is theoretically possible that 100% of miners (not 50%
or 10%) will shut off their hardware. This is because it is revenue
which ~halves, not profit.
It also depends on how much of the cost is sunk and how much is marginal.

If hashing costs are 50% capital and 50% marginal, then the entire network
will be able to absorb a 50% drop in subsidy.

50% capital costs means that the cost of the loan to buy the hardware
represents half the cost.

Assume that for every $100 of income, you have to pay $49 for the loan and
$49 for electricity, giving a 2% profit. If the subsidy halves, then you only
get $50 of income, so you lose $48.

But if the bank repossesses the operation, they might as well keep things
running for the $1 in marginal profit (or sell on the hardware to someone
who will keep using it).
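
The same numbers, stated as the total versus marginal calculation (a small sketch; the 50/50 split of costs is Tier's illustrative assumption):

income_post = 50.0              # post-halving income
loan, electricity = 49.0, 49.0
print(income_post - loan - electricity)   # -48: loss on paper, including the sunk loan
print(income_post - electricity)          # +1: marginal profit, so keep the hardware running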

Since this drop in revenue is well known in advance, businesses will spend
less on capital. That means that there should be less mining hardware than
otherwise.

A 6 month investment with 3 months on the high subsidy and 3 months on low
subsidy would not be made if it only generated a small profit for the first
3 and then massive losses for the 2nd period of 3 months. For it to be
made, there needs to be large profit during the first period to compensate
for the losses in the 2nd period.
Eric Voskuil via bitcoin-dev
2016-03-02 19:01:36 UTC
Permalink
Post by Tier Nolan via bitcoin-dev
A 6 month investment with 3 months on the high subsidy and 3 months on low subsidy would not be made

Yes, this is the essential point. All capital investments are made based on expectations of future returns. To the extent that futures are perfectly knowable, they can be perfectly factored in. This is why inflation in Bitcoin is not a tax, it’s a cost. These step functions are made continuous by their predictability; removing that predictability will make them -- unpredictable.



Changing these futures punishes those who have planned properly and favors those who have not. Sort of like a Bitcoin bail-in: are some miners too big to fail? It also creates the expectation that it may happen again. This infects the money with the sort of uncertainty that Bitcoin is designed to prevent.



e



Eric Voskuil via bitcoin-dev
2016-03-02 20:44:07 UTC
Permalink
Post by Tier Nolan via bitcoin-dev
A 6 month investment with 3 months on the high subsidy and 3 months on
low subsidy would not be made…
Yes, this is the essential point. All capital investments are made based
on expectations of future returns. To the extent that futures are
perfectly knowable, they can be perfectly factored in. This is why
inflation in Bitcoin is not a tax, it’s a cost. These step functions are
made continuous by their predictability, removing that predictability
will make them -- unpredictable.
The Ministry of Truth is taking job applications in the doublespeak
department...
Not sure how you interpret a tautology as doublespeak.
Changing these futures punishes those who have planned properly and
favors those who have not. Sort of like a Bitcoin bail-in; are some
miners too big to fail? It also creates the expectation that it may
happen again. This infects the money with the sort of uncertainty that
Bitcoin is designed to prevent.
Coinbase-smoothing can be done via soft fork (soft forks typically only
move "one way" toward stability).
I'm addressing the hard fork proposal (see subject line).
Moreover, the effect *costs* miners,
it does not benefit them. Finally, it can be done so that the economic
impact on miners is minimized.
Changes to consensus rules change the value of coins, which are property
of their owners. Nobody owes a miner a promise of consistent revenue for
future work. Cost or benefit to miners is relevant only to the extent
that those who hold money believe it will affect their value and
therefore consider it in their decision to consent.
You'll just have to weigh the risks -- some vague, tiny effect on
expectations today, vs the need for a small group of experts to
emergency hard fork once every four years.
How is the small group of experts today different from the small group
of experts tomorrow?
I'm sure those experts are completely reliable, and won't get threatened
or assassinated!
This is precisely the issue. The precedent of hard-forking to "fix" the
money is a precedent for establishing authority over the money.
A 6 month investment with 3 months on the high subsidy and 3 months on
low subsidy would not be made if it only generated a small profit for
the first 3 and then massive losses for the 2nd period of 3 months. For
it to be made, there needs to be large profit during the first period to
compensate for the losses in the 2nd period.
The word "loss" is of utmost importance here...if they are operational
losses, it should be obvious to everyone that the best "compensation for
losses in the 2nd period" is to just shut them off (thus reducing losses
to zero).
But of course the losses would not be entirely operational, since
hardware (at a minimum) does not depreciate to zero because of a
halving. The ability to plan does not change this fact. There are
certainly similar considerations for labor, bandwidth, space and even
electrical/cooling costs (contracts). To the extent that these costs are
sunk (as Tier said) *any* earnings are better than none.
So you must be arguing that miners have made an investment 3 months
prior, knowing that it would pay for itself despite the reward halving.
Of course, how could they not?
That's nice, but it ignores the fact that, if that investment is made
by everyone, by all miners, the *difficulty* will have increased 2 weeks
afterward...such that operating profits are tending *immediately* toward
zero, and will be zero by the time the first set of 3 months is over.
... which also ignores fees.

e
Peter Todd via bitcoin-dev
2016-03-02 23:02:13 UTC
Permalink
Post by Eric Voskuil via bitcoin-dev
A 6 month investment with 3 months on the high subsidy and 3 months on low subsidy would not be made

Yes, this is the essential point. All capital investments are made based on expectations of future returns. To the extent that futures are perfectly knowable, they can be perfectly factored in. This is why inflation in Bitcoin is not a tax, it’s a cost. These step functions are made continuous by their predictability, removing that predictability will make them -- unpredictable.
You know, I do agree with you.

But see, this is one of the reasons why we keep reminding people that
strictly speaking a hardfork *is* an altcoin, and the altcoin can change
any rule currently in Bitcoin.

It'd be perfectly reasonable to create an altcoin with a 22-million-coin
limit and an inflation schedule that had smooth, rather than abrupt,
drops. It'd also be reasonable to make that altcoin start with the same
UTXO set as Bitcoin as a means of initial coin distribution.

If miners choose to start mining that altcoin en masse at the halving,
all the more power to them. It's our choice whether or not we buy those
coins. We may choose not to, but if 95% of the hashing power decides to
go mine something different we have to accept that under our current
chosen rules confirmations might take a long time.


Of course, personally I agree with Gregory Maxwell: this is all fairly
unlikely to happen, so the discussion is academic. But we'll see.
--
https://petertodd.org 'peter'[:-1]@petertodd.org
000000000000000004d430e1daab776bc1c194589b0326924220faa00efc50cf
Patrick Shirkey via bitcoin-dev
2016-03-03 10:14:56 UTC
Permalink
Post by Peter Todd via bitcoin-dev
Post by Tier Nolan via bitcoin-dev
A 6 month investment with 3 months on the high subsidy and 3 months on
low subsidy would not be made…
Yes, this is the essential point. All capital investments are made based
on expectations of future returns. To the extent that futures are
perfectly knowable, they can be perfectly factored in. This is why
inflation in Bitcoin is not a tax, it’s a cost. These step functions
are made continuous by their predictability, removing that
predictability will make them -- unpredictable.
You know, I do agree with you.
But see, this is one of the reasons why we keep reminding people that
strictly speaking a hardfork *is* an altcoin, and the altcoin can change
any rule currently in Bitcoin.
It'd be perfectly reasonable to create an altcoin with a 22-million-coin
limit and an inflation schedule that had smooth, rather than abrupt,
drops. It'd also be reasonable to make that altcoin start with the same
UTXO set as Bitcoin as a means of initial coin distribution.
If miners choose to start mining that altcoin en-mass on the halving,
all the more power to them. It's our choice whether or not we buy those
coins. We may choose not to, but if 95% of the hashing power decides to
go mine something different we have to accept that under our current
chosen rules confirmations might take a long time.
Of course, personally I agree with Gregory Maxwell: this is all fairly
unlikely to happen, so the discussion is academic. But we'll see.
Bitcoin is a success.

The success has forced various hardfork discussions.

Hard forking is contentious. If a softfork cannot be achieved, the
alternative to a hardfork is creating a new bitcoin, e.g. Bitcoin 2.0.

Similar to silver, gold, palladium, etc...

Bitcoin's success partly stems from its brand awareness. Any new
officially supported bitcoin will also benefit from this brand awareness.

If the market values the new, improved bitcoin, people will put their money
into it. This doesn't require any consensus.

Let the market decide which option has the most value. If everyone
switches to the new bitcoin then the old bitcoin miners will follow.





--
Patrick Shirkey
Boost Hardware Ltd
Dave Scotese via bitcoin-dev
2016-03-03 05:11:16 UTC
Permalink
It makes sense to me that there might be objective conditions under which
we would want to use a number smaller than 2016. A good example would be a
mean time between blocks of more than 20 minutes over the last 144 blocks
(one to two days). If such a condition were ever met, and the software
then cut the retarget interval to 1008 (triggering an immediate retarget if
the counter is over 1008), the only problem I see is how to measure the
mean time between blocks.

In fact, has anyone examined the potential problems of reducing the
retarget period, even to one? Not really.
<http://bitcoin.stackexchange.com/questions/9305/why-not-retarget-on-every-block>
That question includes a suggestion of retargeting on every block, but
using the same 2016-block window for the calculation, so difficulty changes
would be very smooth, and still as unpredictable as how long it will take to
find the next block.
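
One hedged reading of that every-block, 2016-block-window idea, as an illustrative Python sketch (not a concrete proposal from the linked question):

TARGET_SPACING = 600   # seconds
WINDOW = 2016

def next_difficulty(difficulty, trailing_intervals):
    """Rescale difficulty every block by the ratio of expected to actual
    elapsed time over the trailing WINDOW inter-block intervals (seconds)."""
    actual = sum(trailing_intervals[-WINDOW:])
    expected = WINDOW * TARGET_SPACING
    return difficulty * expected / actual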

--
I like to provide some work at no charge to prove my value. Do you need a
techie?
I own Litmocracy <http://www.litmocracy.com> and Meme Racing
<http://www.memeracing.net> (in alpha).
I'm the webmaster for The Voluntaryist <http://www.voluntaryist.com> which
now accepts Bitcoin.
I also code for The Dollar Vigilante <http://dollarvigilante.com/>.
"He ought to find it more profitable to play by the rules" - Satoshi
Nakamoto
Eric Voskuil via bitcoin-dev
2016-03-03 20:54:24 UTC
Permalink
Post by Peter Todd via bitcoin-dev
Post by Eric Voskuil via bitcoin-dev
A 6 month investment with 3 months on the high subsidy and 3 months on low subsidy would not be made

Yes, this is the essential point. All capital investments are made based on expectations of future returns. To the extent that futures are perfectly knowable, they can be perfectly factored in. This is why inflation in Bitcoin is not a tax, it’s a cost. These step functions are made continuous by their predictability, removing that predictability will make them -- unpredictable.
You know, I do agree with you.
But see, this is one of the reasons why we keep reminding people that
strictly speaking a hardfork *is* an altcoin, and the altcoin can change
any rule currently in Bitcoin.
It'd be perfectly reasonable to create an altcoin with a 22-million-coin
limit and an inflation schedule that had smooth, rather than abrupt,
drops. It'd also be reasonable to make that altcoin start with the same
UTXO set as Bitcoin as a means of initial coin distribution.
If miners choose to start mining that altcoin en-mass on the halving,
all the more power to them. It's our choice whether or not we buy those
coins. We may choose not to, but if 95% of the hashing power decides to
go mine something different we have to accept that under our current
chosen rules confirmations might take a long time.
Of course, personally I agree with Gregory Maxwell: this is all fairly
unlikely to happen, so the discussion is academic. But we'll see.
I agree, this is a perfectly rational interpretation. I also agree that
this particular instance is academic. But I see more to this than
accepting what is possible.

In the case of Federal Reserve Notes the gold obligation was abrogated.
This was (at least) a contract default, implemented by force of arms.
This contentious hard fork was clearly an attack.

But in a system with no authority and in which nobody has formed a
contractual obligation with anyone else, what would constitute an attack
on the money? There is no difference between state attacks on (or
collusion with) miners and miners acting on self interest.

One answer is that nothing is an attack, it's up to the market to
decide. But to the extent that there can be an attack on the money, the
attempt to move the value of the coin to an altcoin (hard fork) is it.
Though the choice of the term "attack" isn't essential.

The importance of recognizing an attack is that it affords one the
opportunity to defend against it. People holding "dollars" in 1933 were
ill equipped to defend against a system level attack (monetary policy),
in part because many did not recognize it as such, and in part because
there was insufficient preparation by those who did.

I see us building the tools and awareness necessary for defense. As you
say, nobody has to buy into an altcoin forked from their coin. This much
is simple to achieve. The more difficult problem is preserving the
utility of the original coin. Clearly the purpose of a hard fork (as
opposed to a new coin) is to transfer this value.

We've all seen arguments for contentious hard fork deployment that
explicitly depend on the fear of monetary loss to drag people to
acceptance. While this may be the nature of the technology, it's
important that we develop effective defense against it.

Ultimately the only defense is individual validation. The collusion of
banks (web wallets) with miners in attacking consensus is obvious. But
even without active collusion, the surrender of validation leaves people
just as defenseless as *being* unarmed while retaining a right to
*become* armed.

Even if every person mines at the same level, the system amounts to
little more than majority rule if validation is not decentralized. There
are people perfectly willing to exploit this weakness.

e
Tier Nolan via bitcoin-dev
2016-03-04 10:27:48 UTC
Permalink
An alternative soft fork would be to require that miners pay some of the
coinbase to a CLTV locked output (that is otherwise unlocked). This allows
the release of the funds to be delayed.
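
One way to read "otherwise unlocked" is an anyone-can-spend output that only matures at a given height. A hypothetical sketch of such a scriptPubKey follows (the opcode values are Bitcoin's standard ones; the height and function names are illustrative, not Tier's code):

OP_CLTV, OP_DROP, OP_TRUE = 0xb1, 0x75, 0x51

def script_number(n):
    """Minimal little-endian script-number encoding for a small positive n."""
    out = bytearray()
    while n:
        out.append(n & 0xFF)
        n >>= 8
    if out and out[-1] & 0x80:
        out.append(0x00)
    return bytes(out)

def delayed_anyone_can_spend(lock_height):
    """<lock_height> OP_CHECKLOCKTIMEVERIFY OP_DROP OP_TRUE: spendable by
    anyone, but only in a transaction whose nLockTime has reached lock_height."""
    num = script_number(lock_height)
    return bytes([len(num)]) + num + bytes([OP_CLTV, OP_DROP, OP_TRUE])

print(delayed_anyone_can_spend(430000).hex())   # 03b08f06b17551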
Dave Hudson via bitcoin-dev
2016-03-02 15:48:21 UTC
Permalink
I think the biggest question here is how the difficulty retargeting would be changed. Without seeing the algorithm proposal it's difficult to assess the impact it would have, but my intuition is that this is likely to be problematic.

Probabilistically the network sees surprisingly frequent swings of +/-20% in terms of the block finding rate on any given day, while the statistical noise over a 2016 block period can be more than +/-5%. Any change would still have to require a fairly significant period of time before there would be a reasonable level of confidence that the hash rate really had fallen as opposed to just seeing statistical noise (http://hashingit.com/analysis/29-lies-damned-lies-and-bitcoin-difficulties and http://hashingit.com/analysis/28-reach-for-the-ear-defenders).

How long would be required to deem that the hash rate had dramatically fallen? Would such a change be a one-time event or would it be ever-present?

If we were to say that a 50% one-day drop in hash rate (which could, of course, be a 30% real drop and 20% variance) triggered a retarget to 50% lower difficulty, then that would have to be matched with a similarly rapid retarget if the hash rate were to increase by a similar amount. Failing to do this both ways would introduce an economic incentive for large miners to suppress the difficulty and gain dramatically larger numbers of block rewards. The current fixed block count per difficulty change prevents this because the daily losses while suppressing hashing outweigh the potential gains when it's re-added.


Cheers,
Dave
Bob McElrath via bitcoin-dev
2016-03-08 22:05:07 UTC
Permalink
Post by Dave Hudson via bitcoin-dev
I think the biggest question here would be how would the difficulty
retargeting be changed? Without seeing the algorithm proposal it's difficult
to assess the impact that it would have, but my intuition is that this is
likely to be problematic.
I have no comment on whether this will be *needed*, but there's a simple
algorithm that I haven't seen any coin adopt and that I think should be
adopted: the critically damped harmonic oscillator:

http://mathworld.wolfram.com/CriticallyDampedSimpleHarmonicMotion.html

In dynamical systems one does a derivative expansion. Here we want to find the
first and second derivatives (in time) of the hashrate. These can be determined
by a method of finite differences, or fancier algorithms which use a quadratic
or quartic polynomial approximation. Two derivatives are generally all that is
needed, and the resulting dynamical system is a damped harmonic oscillator.

A damped harmonic oscillator is basically how your car's shock absorbers work.
The relevant differential equation has two parameters: the oscillation frequency
and damping factor. The maximum oscillation frequency is the block rate. Any
oscillation faster than the block rate cannot be measured by block times. The
damping rate is an exponential decay and for critical damping is twice the
oscillation frequency.

So, this is a zero-parameter, optimally damped solution for a varying hashrate.
This is inherently a numerical approximation to a differential equation, so
questions of how to approximate the hashrate enter, but that's all. Weak
block proposals will be able to get better approximations to the hashrate.
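
A minimal sketch of what such an update rule could look like, as one possible per-block discretization (an assumption on my part, not Bob's code; TARGET_SPACING, omega and the crude hashrate estimate are illustrative):

import math

TARGET_SPACING = 600.0   # seconds

def estimate_hashrate(difficulty, intervals):
    # Expected interval = difficulty * 2**32 / hashrate (Bitcoin's convention).
    mean_dt = max(sum(intervals) / len(intervals), 1.0)
    return difficulty * 2**32 / mean_dt

def retarget_critically_damped(difficulty, velocity, intervals, omega=0.05):
    """One per-block update of (difficulty, velocity) in log space.
    Critically damped dynamics: x'' = -omega**2 * (x - x_star) - 2*omega*x',
    with time measured in blocks (dt = 1)."""
    hashrate = estimate_hashrate(difficulty, intervals)
    x = math.log(difficulty)
    x_star = math.log(hashrate * TARGET_SPACING / 2**32)  # difficulty giving 10-min blocks
    accel = -omega**2 * (x - x_star) - 2 * omega * velocity
    velocity += accel        # dt = 1 block
    x += velocity
    return math.exp(x), velocity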

If solving this problem is deemed desirable, I can put some time into this, or
direct others as to how to go about it.

--
Cheers, Bob McElrath

"For every complex problem, there is a solution that is simple, neat, and wrong."
-- H. L. Mencken
Dave Hudson via bitcoin-dev
2016-03-09 18:30:19 UTC
Permalink
A damping-based design would seem like the obvious choice (I can think of a few variations on a theme here, but most are found in the realms of control theory somewhere). The problem, though, is working out a timeframe over which to run the derivative calculations.

The problem is the measurement of the hashrate, which is pretty inaccurate at best because even 2016 events aren't really enough (with a completely constant hash rate running indefinitely we'd see difficulty swings of up to +/- 5% even with the current algorithm). In order to meaningfully react to a major loss of hashing we'd still need to be considering a window of probably 2 weeks.

My other concern is that if we allow quick retargets to lower difficulties then that seems likely to expose the chain to being gamed. I'd need to think about this some more, but a few scenarios I was thinking about earlier this week appeared to risk making some types of selfish mining strategies quite a lot more profitable.

With all this said though I'll be very surprised if there's a huge drop in the hash rate come July. The hash rate has jumped up by almost 70% in the last 6 to 7 months and that implies some pretty serious investments by miners who are quite aware of the halving. My guess is that quite a lot of the baseline 30% has also been replaced in the same cycle. These same miners were mining with a coin price around $250 last year, so in terms of profitability I'm pretty sure that a price around $400 won't be a huge concern.

I'm sure that there will be some very public "I'm done with mining" announcements from a few smaller miners come July, but I suspect the bulk of the network will have a relatively small blip and continue on its way.


Cheers,
Dave
Bob McElrath via bitcoin-dev
2016-03-09 20:21:36 UTC
Permalink
Post by Dave Hudson via bitcoin-dev
A damping-based design would seem like the obvious choice (I can think of a
few variations on a theme here, but most are found in the realms of control
theory somewhere). The problem, though, is working working out a timeframe
over which to run the derivative calculations.
From a measurement theory perspective this is straightforward. Each block is a
measurement, and error propagation can be performed to derive an error on the
derivatives.

The statistical theory of Bitcoin's block timing is known as a Poisson Point
Process: https://en.wikipedia.org/wiki/Poisson_point_process or temporal point
process. If you google those plus "estimation" you'll find a metric shit-ton of
literature on how to handle this.
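
For instance, a minimal sketch (an assumption of mine, not drawn from that literature) of the maximum-likelihood hashrate estimate and its statistical error from the last n blocks:

import math

def hashrate_estimate(difficulty, intervals):
    """MLE of hashrate from observed inter-block times (seconds), treating
    block arrivals as a Poisson process; relative error is 1/sqrt(n)."""
    n = len(intervals)
    block_rate = n / sum(intervals)                # blocks per second
    hashrate = block_rate * difficulty * 2**32     # hashes per second
    return hashrate, hashrate / math.sqrt(n)

h, err = hashrate_estimate(1.6e11, [600.0] * 144)  # one day of 10-minute blocks
print(f"{h:.3e} H/s +/- {err:.3e}")                # roughly an 8% error from 144 blocks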
Post by Dave Hudson via bitcoin-dev
The problem is the measurement of the hashrate, which is pretty inaccurate at
best because even 2016 events isn't really enough (with a completely constant
hash rate running indefinitely we'd see difficulty swings of up to +/- 5% even
with the current algorithm). In order to meaningfully react to a major loss
of hashing we'd still need to be considering a window of probably 2 weeks.
You don't want to assume it's constant in order to get a better measurement.
The assumption is clearly false. But, errors can be calculated, and retargeting
can take errors into account, because no matter what we'll always be dealing
with a finite sample.

Personally I don't think difficulty target variations are such a big deal, as
long as the algorithm targets an average block time of 10 min over any long
time interval. Bitcoin's current algorithm fails here: with increasing hashrate
(as we have), it issues coins faster than its assumed schedule.

--
Cheers, Bob McElrath

"For every complex problem, there is a solution that is simple, neat, and wrong."
-- H. L. Mencken
Dave Hudson via bitcoin-dev
2016-03-09 23:24:15 UTC
Permalink
Post by Bob McElrath via bitcoin-dev
Post by Dave Hudson via bitcoin-dev
A damping-based design would seem like the obvious choice (I can think of a
few variations on a theme here, but most are found in the realms of control
theory somewhere). The problem, though, is working working out a timeframe
over which to run the derivative calculations.
From a measurement theory perspective this is straightforward. Each block is a
measurement, and error propagation can be performed to derive an error on the
derivatives.
Sure, but I think there are 2 problems:

1) My guess is that errors over anything but a long period are probably too large to be very useful.

2) We don't have a strong notion of time that is part of the consensus. Sure, blocks have timestamps but they're very loosely controlled (can't be more than 2 hours ahead of what any validating node thinks the time might be). Difficulty can't be calculated based on anything that's not part of the consensus data.
Post by Bob McElrath via bitcoin-dev
The statistical theory of Bitcoin's block timing is known as a Poisson Point
Process: https://en.wikipedia.org/wiki/Poisson_point_process or temporal point
process. If you google those plus "estimation" you'll find a metric shit-ton of
literature on how to handle this.
Strictly it's a non-homogeneous Poisson Process, but I'm pretty familiar with the concept (Google threw one of my own blog posts back at me: http://hashingit.com/analysis/27-hash-rate-headaches, but I actually prefer this one: http://hashingit.com/analysis/30-finding-2016-blocks because most people seem to find it easier to visualize).
Post by Bob McElrath via bitcoin-dev
Post by Dave Hudson via bitcoin-dev
The problem is the measurement of the hashrate, which is pretty inaccurate at
best because even 2016 events isn't really enough (with a completely constant
hash rate running indefinitely we'd see difficulty swings of up to +/- 5% even
with the current algorithm). In order to meaningfully react to a major loss
of hashing we'd still need to be considering a window of probably 2 weeks.
You don't want to assume it's constant in order to get a better measurement.
The assumption is clearly false. But, errors can be calculated, and retargeting
can take errors into account, because no matter what we'll always be dealing
with a finite sample.
Agreed, it's a thought experiment I ran in May 2014 (http://hashingit.com/analysis/28-reach-for-the-ear-defenders). I found that many people's intuition is that there would be little or no difficulty change in such a scenario, but that intuition isn't reliable. Given a static hash rate the NHPP behaviour introduces a surprisingly large amount of noise (often much larger than any signal over a period of even weeks). Any measurement on the order of even a few days has so much noise that it's practically unusable. I just realized that unlike some of my other sims this one didn't make it to github; I'll fix that later this week.
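
A small simulation in the same spirit (a sketch assuming a constant hashrate exactly matched to the difficulty; this is not Dave's original sim) shows how much the 2016-block retarget factor wanders on its own:

import random

TARGET_SPACING = 600.0
WINDOW = 2016

def simulated_retarget_factors(periods, seed=1):
    """Per-period ratio of actual to expected elapsed time, drawing
    exponential inter-block intervals at a perfectly constant hashrate."""
    rng = random.Random(seed)
    factors = []
    for _ in range(periods):
        elapsed = sum(rng.expovariate(1.0 / TARGET_SPACING) for _ in range(WINDOW))
        factors.append(elapsed / (WINDOW * TARGET_SPACING))
    return factors

fs = simulated_retarget_factors(26)              # roughly a year of retargets
print(f"min {min(fs):.3f}  max {max(fs):.3f}")   # typically a few percent either way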


Cheers,
Dave

Paul Sztorc via bitcoin-dev
2016-03-09 20:26:47 UTC
Permalink
Post by Dave Hudson via bitcoin-dev
The hash rate has jumped up by almost 70% in the last 6 to 7 months and that implies some pretty serious investments by miners who are quite aware of the halving.
There are a few ways in which that information would be irrelevant:
[1.] It is possible that miners expect to break even before the halving.
[2.] It is also possible that miners earnestly believe that there will
be no problem -- however:
... [2a.] This belief may be mistaken.
... [2b.] Miners may be counting on Core Devs to fix any problems that
come up with anything, this one included.

Also, [3.] many miners believe that the price will increase around the
time of the halving, either for market-microstructure reasons or
marketing reasons. I, personally, think that the price is as likely to
go down as up.
Post by Dave Hudson via bitcoin-dev
These same miners were mining with a coin price around $250 last year so in terms of profitability I'm pretty sure that one around $400 won't be a huge concern.
For some miners, currently it costs $X in electricity per coin mined,
and $400 / 2 is less than X. I do not know how representative this
information is.

Paul
Bryan Bishop via bitcoin-dev
2016-03-02 16:17:31 UTC
Permalink
Post by Luke Dashjr via bitcoin-dev
We are coming up on the subsidy halving this July, and there have been some
Luke,

One reason "hard-fork to fix difficulty drop algorithm" could be
controversial is that the proposal involves a hard-fork (perhaps
necessarily so, at my first and second glance). There are a number of
concerns with hard-forks including security, deployment, participation,
readiness measurement, backwards incompatibility, etc. In fact, some
Bitcoin Core developers believe that hard-forks are not a good idea and
should not be used.

# Hard-forks

An interesting (unspoken?) idea I’ve heard from a few people has been “we
should try to avoid all hard-forks because they are backwards
incompatible”; another thought has been "there should only be one more
hard-fork if any" and/or "there should only be one hard-fork every 30
years". I also recognize feedback from others who have mentioned that it is
"probably unrealistic to expect that the consensus layer can be solidified
this early in Bitcoin's history". At the same time there are concerns about
“slippery slopes”....

Also, if you are going to participate in a hard-fork then I think you
should draw up some proposals for ensuring minimal monetary loss on the old
(non-hard-forked) chain; especially since your proposed timeline is so
short, it seems reasonable to expect even more safety-related due diligence
to minimize money loss (such as using a new address prefix on the
hard-forked upgrade). Anyway, it should be clear that hard-forks are an
unsettled issue and are controversial in ways that I believe you are already
aware of.

# Have miners gradually reduce their hashrate instead of using a step
function cliff

adam3us recently proposed that miners who are thinking of turning off
equipment should consider gradually ramping down their hashrate, as a show
of goodwill (and substantial loss to themselves, similar to how they would
incur losses from no longer mining after the halving). This is not
something the consensus algorithm can enforce at the moment, and this
suggestion does not help under adversarial conditions. Since this
suggestion does not require a hard-fork, perhaps some effort should be made
to query miners and figure out if they need assistance with implementing
this (if they happen to be interested).

# Contingency planning

Having said all of the negative things above about hard-forks, I will add
that I do actually like the idea of having backup plans available and
tested and gitian-built many weeks ahead of expected network event dates.
Unfortunately this might encourage partial consensus layer hard-forks in
times of extreme uncertainty such as "emergencies".... creating an even
further emergency.

# "Indefinite backlog growth"

You write "the backlog would grow indefinitely until the adjustment
occurs". This seems to be expected behavior regardless of difficulty
adjustment (in fact, a backlog could continue to grow even once difficulty
adjusts downward), and the consensus protocol does not commit to
information regarding that backlog anyway...

# Difficulty adjustment taking time is expected

This is an expected part of the protocol, it's been mentioned since
forever, it's well known and accounted for. Instead, we should be providing
advice to users about which alternative payment systems they should be
using if they expect instantaneous transaction confirmations. This has been
a long-standing issue, and rolling out a hard-fork is not going to fix
mistaken assumptions from users. They will still think that confirmations
were meant to be instantaneous regardless of how many hard-forks you choose
to deploy.

- Bryan
http://heybryan.org/
1 512 203 0507
David A. Harding via bitcoin-dev
2016-03-02 17:14:28 UTC
Permalink
Post by Luke Dashjr via bitcoin-dev
To alleviate this risk, it seems reasonable to propose a hardfork to the
difficulty adjustment algorithm so it can adapt quicker to such a significant
drop in mining rate.
Having a well-reviewed hard fork patch for rapid difficulty adjustment
would seem to be a useful reserve for all sorts of possible problems.
That said, couldn't this specific potential situation be dealt with by a
relatively simple soft fork?

Let's say that, starting soon, miners require that valid block header
hashes be X% below the target value indicated by nBits. The X% changes
with each block, starting at 0% and increasing to 50% just before block
420,000 (the halving). This means the before the halving, every two
hashes are being treated as one hash, on average.

For blocks 420,000 and higher the code is disabled, immediately doubling
the effective hash rate at the same time the subsidy is halved,
potentially roughly canceling each other out to make a pre-halving hash
equal in economic value to a post-halving hash.
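
One possible reading of that rule, as a hypothetical validity check (START, the linear ramp to 50% at the halving, and the function names are illustrative, not David's code; his worked example below uses a steeper 0.01%-per-block increase from block 410,000 instead):

START = 410000
HALVING = 420000

def extra_work_fraction(height):
    """X as a fraction: 0 before START, ramping linearly to 0.5 just before
    the halving, and switched off from the halving onward."""
    if height < START or height >= HALVING:
        return 0.0
    return 0.5 * (height - START) / (HALVING - START)

def softfork_pow_ok(block_hash_value, nbits_target, height):
    """True if the hash meets both the normal nBits target and the tightened one."""
    tightened = int(nbits_target * (1 - extra_work_fraction(height)))
    return block_hash_value <= tightened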

Of course, some (perhaps many) miners will not be profitable at the
post-halving subsidy level, so the steady increase in X% will force them
off the network at some point before the halving, hopefully in small
numbers rather than all at once like the halving would be expected to do.

For example, if the soft fork begins enforcement at block 410,000, then
X% can be increased 0.01% per block. Alice is a miner whose costs are
24BTC per block and she never claims tx fees for some reason, so her
profits now are always 25BTC per block. During the first difficulty
period after the soft fork is deployed, the cost to produce a hash will
increase like this,

   0:  0%     500:  5%    1000: 10%    1500: 15%    2000: 20%
 100:  1%     600:  6%    1100: 11%    1600: 16%
 200:  2%     700:  7%    1200: 12%    1700: 17%
 300:  3%     800:  8%    1300: 13%    1800: 18%
 400:  4%     900:  9%    1400: 14%    1900: 19%

Somewhere around 417 blocks into the period, Alice will need to drop out because her
costs are now above 25BTC per block. With the loss of her hash rate,
the average interblock time will increase and the capacity will decrease
(all other things being equal). However, Bob whose costs are 20BTC per
block can keep mining through the period.

At the retarget, the difficulty will go down (the target goes up) to
account for the loss of Alice's hashes. It may even go down enough
that Alice can mine profitably for a few more blocks early in the new
period, but the increasing X% factor will make her uneconomical again,
and this time it might even make Bob uneconomical too near the end of
the period. However, Charlie whose costs are 12BTC per block will
never be uneconomical as he can continue mining profitably even after
the halving. Alice and Bob mining less will increase the percentage of
blocks Charlie produces before the retarget, steadily shifting the
dynamics of the mining network to the state expected after the halving
and hopefully minimizing the magnitude of any shocks.

This does raise the question of whether this soft fork would be ethical, as
ethical, as Alice and Bob may have invested money and time on the
assumption that their marginal hardware would be usable up until the
halving and with this soft fork they would become uneconomical earlier
than block 420,000. A counterargument here is such an investment was
always speculative given the vagaries of exchange rate fluctuation, so
it could be permissible to change the economics slightly in order to
help ensure all other Bitcoin users experience minimal disruption during
the halving.

Unless I'm missing something (likely) I think this proposal has the
advantage of fast rollout (if the mechanism of an adjusted target is as
simple as I think it could be) in a non-emergency manner without a hard
fork that would require all full nodes to upgrade (plus maybe some SPV
software that checks nBits, which they probably all should be doing
given it's in the block headers that they download anyway).

-Dave

P.S. I see Tier Nolan proposed something similar while I was writing
this. I think this proposal differs enough in its analysis to warrant a
possible duplicate posting.
Gregory Maxwell via bitcoin-dev
2016-03-02 17:53:46 UTC
Permalink
On Wed, Mar 2, 2016 at 5:14 PM, David A. Harding via bitcoin-dev
Post by David A. Harding via bitcoin-dev
Post by Luke Dashjr via bitcoin-dev
To alleviate this risk, it seems reasonable to propose a hardfork to the
difficulty adjustment algorithm so it can adapt quicker to such a significant
drop in mining rate.
Having a well-reviewed hard fork patch for rapid difficulty adjustment
would seem to be a useful reserve for all sorts of possible problems.
That said, couldn't this specific potential situation be dealt with by a
relatively simple soft fork?
[...]


What you are proposing makes sense only if it were believed that a very
large difficulty drop would be very likely.

This appears to be almost certainly untrue -- consider how long it has
been since the hashrate was 50% of what it is now, or 25% of what it is
now -- this is strong evidence that a supermajority of the hashrate is
equipment with state-of-the-art power efficiency. (I've also heard as much
more directly -- but I think this evidence is more compelling because it
can't be tainted by boasting.) If a pre-programmed ramp and drop is set
then it has the risk of massively under-setting difficulty, which is also
strongly undesirable (e.g. advanced inflation and exacerbating existing
unintentional selfish mining)... and that is before suggesting that miners
voluntarily take a loss of inflation now.

So while I think this concern is generally implausible, I think it's
prudent to have a difficulty step patch (e.g. a one-time step where a
particular block is required to lower bits by a set amount) ready
to go in the unlikely case the network is stalled. Of course, if the
alternative is being "stuck" after a large hashrate drop, the deployment
would be both safe and relatively uncontroversial. I think the
unfavorability of that approach is well matched to the implausibility
of the situation, and it is likely the right course of action compared to
risky interventions that would likely cause harm. The cost of
developing and testing such a patch is low, and justified purely on
the basis of increasing confidence that an issue would be handled (a
fact _I_ am perfectly confident in; but apparently some are not).
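
For concreteness, a hypothetical shape of such a step patch (the height, factor and names are made up for illustration; this is not actual Bitcoin Core code):

STEP_HEIGHT = 421344    # illustrative: the first retarget boundary after the halving
STEP_FACTOR = 4         # illustrative: cut difficulty to a quarter

def required_target(height, normal_target, max_target):
    """Target a block at `height` must meet: from STEP_HEIGHT until the next
    2016-block retarget, the target is widened (difficulty lowered) by a
    fixed factor; everywhere else the normal rule applies."""
    next_retarget = (STEP_HEIGHT // 2016 + 1) * 2016
    if STEP_HEIGHT <= height < next_retarget:
        return min(normal_target * STEP_FACTOR, max_target)
    return normal_target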

With respect to what Luke was suggesting: without specifics it's hard to
comment, but most altcoin "tolerate difficulty drop" changes have made
them much more vulnerable to partitioning attacks and other issues
(e.g. strategic behavior by miners to increase inflation), and have
actually been exploited in practice several times (solidcoin's being
the oldest I'm aware of). Many survived a fairly long time before
being shown to be pretty broken, simply because they were deployed in
cases where no one cared to attack. I'm currently doubtful that
particular path would be fruitful.
David A. Harding via bitcoin-dev
2016-03-02 19:34:33 UTC
Permalink
Post by Gregory Maxwell via bitcoin-dev
What you are proposing makes sense only if it was believed that a very
large difficulty drop would be very likely.
This appears to be almost certainly untrue-- consider how long ago
the hashrate was 50% of what it is now, or 25% of what it is now--
this is strong evidence that a supermajority of the hashrate is
equipment with state-of-the-art power efficiency.
To save readers the duplicated effort of looking up this statistic, here
are the various recent difficulties:

$ for i in $( seq 0 2016 60000 ) ; do echo -n $i blocks ago:' ' ; bitcoin-cli getblock $( bitcoin-cli getblockhash $(( 400857 - i )) ) | jshon -e difficulty ; done | column -t
0 blocks ago: 163491654908.95929
2016 blocks ago: 144116447847.34869
4032 blocks ago: 120033340651.237
6048 blocks ago: 113354299801.4711
8064 blocks ago: 103880340815.4559
10080 blocks ago: 93448670796.323807
12096 blocks ago: 79102380900.225983
14112 blocks ago: 72722780642.54718
16128 blocks ago: 65848255179.702606
18144 blocks ago: 62253982449.760818
20160 blocks ago: 60883825480.098282
22176 blocks ago: 60813224039.440353
24192 blocks ago: 59335351233.86657
26208 blocks ago: 56957648455.01001
28224 blocks ago: 54256630327.889961
30240 blocks ago: 52699842409.347008
32256 blocks ago: 52278304845.591682
34272 blocks ago: 51076366303.481934
36288 blocks ago: 49402014931.227463
38304 blocks ago: 49692386354.893837
40320 blocks ago: 47589591153.625008
42336 blocks ago: 48807487244.681381
44352 blocks ago: 47643398017.803436
46368 blocks ago: 47610564513.47126
48384 blocks ago: 49446390688.24144
50400 blocks ago: 46717549644.706421
52416 blocks ago: 47427554950.6483
54432 blocks ago: 46684376316.860291
56448 blocks ago: 44455415962.343803
58464 blocks ago: 41272873894.697021

<50% of current hash rate was last seen roughly six retarget periods (12
weeks) ago and <25% of current hash rate was last seen roughly 29 periods
(58 weeks) ago.
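
For anyone who wants to re-check that reading without re-running
bitcoin-cli, a quick sketch over the (rounded) values printed above:

difficulties = [   # newest first, copied (rounded) from the output above
    163491654908.96, 144116447847.35, 120033340651.24, 113354299801.47,
    103880340815.46, 93448670796.32, 79102380900.23, 72722780642.55,
    65848255179.70, 62253982449.76, 60883825480.10, 60813224039.44,
    59335351233.87, 56957648455.01, 54256630327.89, 52699842409.35,
    52278304845.59, 51076366303.48, 49402014931.23, 49692386354.89,
    47589591153.63, 48807487244.68, 47643398017.80, 47610564513.47,
    49446390688.24, 46717549644.71, 47427554950.65, 46684376316.86,
    44455415962.34, 41272873894.70,
]

current = difficulties[0]
below_half = next(i for i, d in enumerate(difficulties) if d < 0.5 * current)
print(f"first retarget below 50% of current: {below_half} periods "
      f"(~{2 * below_half} weeks) ago")
oldest_ratio = difficulties[-1] / current
print(f"oldest listed value ({len(difficulties) - 1} periods ago) is still "
      f"{oldest_ratio:.1%} of current, so the <25% point lies just past this window")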

I think that's reasonably strong evidence for your thesis, given that
the increases in hash rate from the introduction of new, more efficient
equipment are likely partly offset by the removal of lower-efficiency
equipment from the hash rate, so the one-year tail of ~25% probably
means that less than 25% of currently operating equipment is a year old
or older.

However, it is my understanding that most mining equipment can be run at
different hash rates. Is there any evidence that high-efficiency miners
today are using high clock speeds to produce more hashes per ASIC than
they will after the halving? Is there any way to guess at how many fewer
hashes they might produce?
Post by Gregory Maxwell via bitcoin-dev
If a pre-programmed ramp and drop is set then it has the risk of
massively under-setting difficulty; which is also strongly undesirable
(e.g. advanced inflation and exacerbating existing unintentional
selfish mining)
Maybe I'm not thinking this through thoroughly, but I don't think it's
possible to significantly advance inflation unless the effective hash
rate increases by more than 300% at the halving. With the proposal
being replied to, if all mining equipment operating before the
halving continued operating after it, the effective hash rate relative
to difficulty would be 200% of normal. That doubling in effective hash
rate would've been offset in advance through a reduction in the
effective hash rate in the weeks before the halving.
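
To put rough numbers on what an under-set difficulty would mean until
the next retarget (the multipliers below are illustrative, not
predictions):

TARGET_INTERVAL_MIN = 10        # designed block interval, minutes
RETARGET_BLOCKS = 2016          # blocks per difficulty period
SUBSIDY_AFTER_HALVING = 12.5    # BTC per block after block 420,000

# speedup = actual hash rate / hash rate the difficulty setting assumes
for speedup in (1.0, 2.0, 4.0):
    interval = TARGET_INTERVAL_MIN / speedup
    days_per_period = RETARGET_BLOCKS * interval / (60 * 24)
    btc_per_day = (60 * 24 / interval) * SUBSIDY_AFTER_HALVING
    print(f"{speedup:.0f}x hash rate vs. difficulty: {interval:4.1f} min blocks, "
          f"retarget in {days_per_period:4.1f} days, ~{btc_per_day:,.0f} BTC/day issued")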

Exacerbated unintentional selfish mining is a much more significant
concern IMO, even if it's only for a short retarget period or two. This
is especially the case given the high levels of centralization and
validationless mining on the network today, which we would not want
to reward by making those miners the only ones effectively capable of
creating blocks until difficulty adjusted. I had not thought of this
aspect; thank you for bringing it up.
Post by Gregory Maxwell via bitcoin-dev
and that is before suggesting that miners voluntarily take a loss of
inflation now.
Yes, I very much don't like that aspect, which is why I made sure to
mention it.
Post by Gregory Maxwell via bitcoin-dev
So while I think this concern is generally implausible; I think it's
prudent to have a difficulty step patch (e.g. a one time single point
where a particular block is required to lower bits a set amount) ready
to go in the unlikely case the network is stalled.
I think having that code ready in general is a good idea, and a one-time
change in nBits sounds like a good and simple way to go about it.

Thank you for your insightful reply,

-Dave
Paul Sztorc via bitcoin-dev
2016-03-03 01:06:27 UTC
Permalink
Post by Gregory Maxwell via bitcoin-dev
What you are proposing makes sense only if it was believed that a very
large difficulty drop would be very likely.
This appears to be almost certainly untrue-- consider how long ago
the hashrate was 50% of what it is now, or 25% of what it is now--
this is strong evidence that a supermajority of the hashrate is
equipment with state-of-the-art power efficiency.
I don't understand the relevance of this.

In my view, we would prefer miners to invest in hardware just a mere
2016 blocks away from the halving. Instead, they've made those
investments too soon. Assuming that miners are already located in
low-power-cost areas, the difficulty will be quickly rising to
compensate for "state of the art power efficiency".

So the effect will have canceled out by July.

If anything, the more efficient miners become today, the bigger our
potential problem in July, because chip-manufacturers may have used up
all of the easy efficiency-increasing moves, such that investments do
not take place in June.

Paul
Paul Sztorc via bitcoin-dev
2016-03-09 17:58:18 UTC
Permalink
My recent conversations with miners revealed:

* Many have made "extra-large" hardware investments recently.
* Some wonder if we have just reached (or are quickly reaching) a
plateau of hardware efficiency. This would mean that
hardware investments might not be made in the critical period
immediately preceding the halving.

However, some good news:

* For Chinese miners, power is often purchased in fixed quantities, for
long durations (of around 12 months, and these contracts -fortunately-
do overlap the July halving). Because power is difficult to store, this
implies that miners will *need* to mine, at all times, even at a loss.
So miners may continue to mine after the halving, no matter what.

On the other hand, miners can default on these contracts by simply
declaring bankruptcy, at which point their equipment would be entirely
unusable, by anyone, for a very long time.

So the problem is less likely, but potentially more catastrophic.

Paul
Peter Todd via bitcoin-dev
2016-03-02 18:20:28 UTC
Permalink
Post by Luke Dashjr via bitcoin-dev
To alleviate this risk, it seems reasonable to propose a hardfork to the
difficulty adjustment algorithm so it can adapt quicker to such a significant
drop in mining rate. BtcDrak tells me he has well-tested code for this in his
altcoin, which has seen some roller-coaster hashrates, so it may even be
possible to have such a proposal ready in time to be deployed alongside SegWit
to take effect in time for the upcoming subsidy halving. If this slips, I
think it may be reasonable to push for at least code-readiness before July,
and possibly roll it into any other hardfork proposed before or around that
time.
I am unaware of any reason this would be controversial, so if anyone has a
problem with such a change, please speak up sooner rather than later. Other
ideas or concerns are of course welcome as well.
Changing the difficulty adjustment algorithm significantly changes the
security of the whole system, as it lets attackers create fake chains
with a lot less hashing power.
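
As a toy illustration (a deliberately naive adjustment rule of my own,
not any specific proposal): if difficulty readjusts every block and
trusts timestamps, an attacker who isolates a victim can stamp each
forged block far apart and ratchet difficulty down, so the total work
behind a long fake chain collapses:

TARGET_SECONDS = 600   # interval the adjustment rule aims for
CLAMP = 4              # per-adjustment bound, like the existing rule's factor-of-4 limit

def forged_chain_work(blocks, claimed_interval, retarget_every):
    """Expected work to mine `blocks` when the attacker stamps every block
    `claimed_interval` seconds apart and difficulty adjusts every
    `retarget_every` blocks (starting difficulty normalized to 1.0)."""
    difficulty, work = 1.0, 0.0
    for height in range(1, blocks + 1):
        work += difficulty
        if height % retarget_every == 0:
            factor = min(max(TARGET_SECONDS / claimed_interval, 1 / CLAMP), CLAMP)
            difficulty *= factor
    return work

current_rule = forged_chain_work(2016, 4 * 600, retarget_every=2016)
fast_rule = forged_chain_work(2016, 4 * 600, retarget_every=1)
print(f"2016 forged blocks, 2016-block rule: {current_rule:,.0f} units of work")
print(f"2016 forged blocks, per-block rule:  {fast_rule:,.2f} units of work")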

Given that, as tx fees rise, this problem will hopefully be a one-time
issue, a simple fixed difficulty adjustment probably makes sense. No
need to bring in new algorithms here with controversial new security
tradeoffs.
--
https://petertodd.org 'peter'[:-1]@petertodd.org
0000000000000000045a03e0e551c4e674f301e0a8eeb217a31ad13580446626
Corey Haddad via bitcoin-dev
2016-03-03 18:27:35 UTC
Permalink
Since the root cause of what you are trying to address is the reward
halving, I'd suggest considering an adjustment to the halving schedule.
Instead of there being a large supply shock every four years, perhaps the
reward could drop every 52,500 blocks (yearly), or even at each difficulty
adjustment, in such a way that the inflation curve is smoothed out. The
exponential decay rate would be preserved, so the overall economic
philosophy would be preserved.

I'm guessing hesitance to this approach would lie in a reluctance to tinker
with Bitcoin's 'economic contract', and slippery slope concerns about what
might be the next change (21M?). However, I think it could actually increase
confidence in the system if the community is able to demonstrate a good
process for making such decisions, and show that we can separate the
meaningful underlying principles, such as the coin limit and overall
inflation rate, from what is more akin to an implementation detail, as I
consider the large-step reward reduction to be.

I'm not too worried about the impact of the halving as is, but adjusting
this economic parameter would be a safer and simpler way to address the
concerns than to tinker with the difficulty targeting mechanism, which is
at the heart of Bitcoin's security.

Henning Kopp via bitcoin-dev
2016-03-04 08:41:02 UTC
Permalink
Hi,
Post by Corey Haddad via bitcoin-dev
However, I think it could actually increase
confidence in the system if the community is able to demonstrate a good
process for making such decisions, and show that we can separate the
meaningful underlying principles, such as the coin limit and overall
inflation rate, from what is more akin to an implementation detail, as I
consider the large-step reward reduction to be.
I do not think that a line can be drawn here. As far as I understood,
you think that the coin limit is a meaningful underlying principle
which should not be touched, whereas the halving of mining rewards is
an implementation detail. The two are very closely tied together and
changes to both of them would result in a hardfork, if I am not
mistaken.
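
To put a number on how tightly they are coupled, here is a rough sketch
(my own illustration): the stepped schedule sums to 21 million coins,
while a naively smoothed per-block exponential with the same
210,000-block half-life and the same 50 BTC starting reward does not:

HALVING_INTERVAL = 210000
INITIAL_SUBSIDY = 50.0

# Current stepped schedule: 50 BTC per block, halved every 210,000 blocks.
stepped_total = sum(HALVING_INTERVAL * INITIAL_SUBSIDY / 2 ** i for i in range(33))

# Naive "smooth" alternative: decay a little every block, with the same
# 210,000-block half-life and the same 50 BTC starting reward.
per_block_ratio = 0.5 ** (1 / HALVING_INTERVAL)
smooth_total = INITIAL_SUBSIDY / (1 - per_block_ratio)   # geometric series sum

print(f"stepped halvings total: {stepped_total:,.0f} BTC")   # ~21,000,000
print(f"naive smooth total:     {smooth_total:,.0f} BTC")    # ~15.1 million

Keeping the 21M limit while smoothing therefore forces a change
somewhere else in the schedule, e.g. in the starting reward or the shape
of the decay.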

Regarding the effects of the mining reward halving, there is a nice
paper by Courtois:
http://arxiv.org/abs/1405.0534

All the best
Henning
--
Henning Kopp
Institute of Distributed Systems
Ulm University, Germany

Office: O27 - 3402
Phone: +49 731 50-24138
Web: http://www.uni-ulm.de/in/vs/~kopp
Paul Sztorc via bitcoin-dev
2016-03-09 20:43:08 UTC
Permalink
Post by Henning Kopp via bitcoin-dev
Hi,
Post by Corey Haddad via bitcoin-dev
However, I think it could actually increase
confidence in the system if the community is able to demonstrate a good
process for making such decisions, and show that we can separate the
meaningful underlying principles, such as the coin limit and overall
inflation rate, from what is more akin to an implementation detail, as I
consider the large-step reward reduction to be.
I do not think that a line can be drawn here. As far as I understood,
you think that the coin limit is a meaningful underlying principle
which should not be touched, whereas the halving of mining rewards is
an implementation detail. The two are very closely tied together and
changes to both of them would result in a hardfork, if I am not
mistaken.
I believe that you are mistaken.

The two are almost completely unrelated, and (as Dr. Back has been
pointing out for a very long time now) the halving of mining rewards can
be modified with a soft fork.

http://www.truthcoin.info/blog/mining-heart-attack/#smooth-the-disinflation-out