diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..79b58dc
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,12 @@
+*.pyc
+*.pyo
+*~
+
+data
+_trial_temp*
+Makefile.local
+.cache
+.pkg
+.coverage
+build
+web-static
diff --git a/Makefile.local b/Makefile.local
deleted file mode 100644
index e69de29..0000000
diff --git a/README.md b/README.md
index eae692e..16c7404 100644
--- a/README.md
+++ b/README.md
@@ -1,62 +1,48 @@
Requirements:
-------------------------
Generic:
-* Vertcoin >=0.8.5
-* Python >=2.6
-* Twisted >=10.0.0
-* python-argparse (for Python =2.6)
-Linux:
-* sudo apt-get install python-zope.interface python-twisted python-twisted-web
-* sudo apt-get install python-argparse # if on Python 2.6
+* Dash >=0.11.2.17
+* Python >=2.7
+* Twisted >=13.0.0
+* Zope.interface >=3.8.0
-Windows:
-* Install Python 2.7: http://www.python.org/getit/
-* Install Twisted: http://twistedmatrix.com/trac/wiki/Downloads
-* Install Zope.Interface: http://pypi.python.org/pypi/zope.interface/3.8.0
-* Install python win32 api: http://sourceforge.net/projects/pywin32/files/pywin32/Build%20218/
-* Install python win32 api wmi wrapper: https://pypi.python.org/pypi/WMI/#downloads
-* Unzip the files into C:\Python27\Lib\site-packages
+Linux:
+ sudo apt-get install python-zope.interface python-twisted python-twisted-web python-dev
+ sudo apt-get install gcc g++
-Install module:
+Install Python modules:
-------------------------
+dash_hash:
-* apt-get install libboost1.48-all-dev python-dev
-
-
-* git clone https://github.com/chaeplin/SUBSIDY_FUNC.git
-* git clone https://github.com/evan82/xcoin-hash.git
-
-* cd SUBSIDY_FUNC/darkcoin-subsidy-python
-* python setup.py install
-
-* cd xcoin-hash
-* python setup.py install
+ git clone https://github.com/vertoe/darkcoin_hash.git
+ cd darkcoin_hash
+ python setup.py install
+dash_subsidy:
+ git clone https://github.com/vertoe/darkcoin_subsidy.git
+ cd darkcoin_subsidy
+ python setup.py install
Running P2Pool:
-------------------------
-To use P2Pool, you must be running your own local bitcoind. For standard
+To use P2Pool, you must be running your own local dashd. For standard
configurations, using P2Pool should be as simple as:
python run_p2pool.py
-Then run your miner program, connecting to 127.0.0.1 on port 9332 with any
+Then run your miner program, connecting to 127.0.0.1 on port 7903 with any
username and password.
If you are behind a NAT, you should enable TCP port forwarding on your
-router. Forward port 9333 to the host running P2Pool.
+router. Forward port 8999 to the host running P2Pool.
Run with --help to see additional options:
    python run_p2pool.py --help
-Donations towards further development:
--------------------------
- 1HNeqi3pJRNvXybNX4FKzZgYJsdTSqJTbk
-
Official wiki:
-------------------------
https://en.bitcoin.it/wiki/P2Pool
@@ -64,50 +50,8 @@ https://en.bitcoin.it/wiki/P2Pool
Alternate web front end:
-------------------------
* https://github.com/hardcpp/P2PoolExtendedFrontEnd
-
-Notes for Vertcoin:
-=========================
-Requirements:
--------------------------
-In order to run P2Pool with the Vertcoin network, you would need to build and install the
-vtc_scrypt module that includes the scrypt proof of work code that Vertcoin uses for hashes.
-
-Linux:
-
- cd py_modules/vertcoin_scrypt
- sudo python setup.py install
-
-Windows (mingw):
-* Install MinGW: http://www.mingw.org/wiki/Getting_Started
-* Install Python 2.7: http://www.python.org/getit/
-
-In bash type this:
-
- cd py_modules\vertcoin_scrypt
- C:\Python27\python.exe setup.py build --compile=mingw32 install
-
-Windows (microsoft visual c++)
-* Open visual studio console
-
-In bash type this:
-
- SET VS90COMNTOOLS=%VS110COMNTOOLS% # For visual c++ 2012
- SET VS90COMNTOOLS=%VS100COMNTOOLS% # For visual c++ 2010
- cd py_modules\vertcoin_scrypt
- C:\Python27\python.exe setup.py build --compile=mingw32 install
-
-If you run into an error with unrecognized command line option '-mno-cygwin', see this:
-http://stackoverflow.com/questions/6034390/compiling-with-cython-and-mingw-produces-gcc-error-unrecognized-command-line-o
-
-Running P2Pool:
--------------------------
-Run P2Pool with the "--net vertcoin" option.
-Run your miner program, connecting to 127.0.0.1 on port 9171.
-
-Notes for Cachecoin:
-=========================
-This is currently under heavy development and still experimental. For the
-current latest stable implementation, please use https://github.com/Sykh/p2pool-cache
+* https://github.com/johndoe75/p2pool-node-status
+* https://github.com/justino/p2pool-ui-punchy
Sponsors:
-------------------------
@@ -116,4 +60,4 @@ Thanks to:
* The Bitcoin Foundation for its generous support of P2Pool
* The Litecoin Project for its generous donations to P2Pool
* The Vertcoin Community for its great contribution to P2Pool
-
+* chaeplin, dstorm and mr.slaveg from the Dash Community
diff --git a/nattraverso/__init__.pyc b/nattraverso/__init__.pyc
deleted file mode 100644
index 9b0e987..0000000
Binary files a/nattraverso/__init__.pyc and /dev/null differ
diff --git a/nattraverso/ipdiscover.pyc b/nattraverso/ipdiscover.pyc
deleted file mode 100644
index 42b12df..0000000
Binary files a/nattraverso/ipdiscover.pyc and /dev/null differ
diff --git a/nattraverso/portmapper.pyc b/nattraverso/portmapper.pyc
deleted file mode 100644
index 41f2c1d..0000000
Binary files a/nattraverso/portmapper.pyc and /dev/null differ
diff --git a/nattraverso/utils.pyc b/nattraverso/utils.pyc
deleted file mode 100644
index 87684c1..0000000
Binary files a/nattraverso/utils.pyc and /dev/null differ
diff --git a/p2pool/bitcoin/helper.py b/p2pool/bitcoin/helper.py
deleted file mode 100644
index f9c459f..0000000
--- a/p2pool/bitcoin/helper.py
+++ /dev/null
@@ -1,87 +0,0 @@
-import sys
-import time
-
-from twisted.internet import defer
-
-import p2pool
-from p2pool.bitcoin import data as bitcoin_data
-from p2pool.util import deferral, jsonrpc
-
-@deferral.retry('Error while checking Bitcoin connection:', 1)
-@defer.inlineCallbacks
-def check(bitcoind, net):
- if not (yield net.PARENT.RPC_CHECK(bitcoind)):
- print >>sys.stderr, " Check failed! Make sure that you're connected to the right bitcoind with --bitcoind-rpc-port!"
- raise deferral.RetrySilentlyException()
- if not net.VERSION_CHECK((yield bitcoind.rpc_getinfo())['version']):
- print >>sys.stderr, ' Bitcoin version too old! Upgrade to 0.6.4 or newer!'
- raise deferral.RetrySilentlyException()
-
-@deferral.retry('Error getting work from bitcoind:', 3)
-@defer.inlineCallbacks
-def getwork(bitcoind, use_getblocktemplate=False):
- def go():
- if use_getblocktemplate:
- return bitcoind.rpc_getblocktemplate(dict(mode='template'))
- else:
- return bitcoind.rpc_getmemorypool()
- try:
- start = time.time()
- work = yield go()
- end = time.time()
- except jsonrpc.Error_for_code(-32601): # Method not found
- use_getblocktemplate = not use_getblocktemplate
- try:
- start = time.time()
- work = yield go()
- end = time.time()
- except jsonrpc.Error_for_code(-32601): # Method not found
- print >>sys.stderr, 'Error: Bitcoin version too old! Upgrade to v0.5 or newer!'
- raise deferral.RetrySilentlyException()
- packed_transactions = [(x['data'] if isinstance(x, dict) else x).decode('hex') for x in work['transactions']]
- if 'height' not in work:
- work['height'] = (yield bitcoind.rpc_getblock(work['previousblockhash']))['height'] + 1
- elif p2pool.DEBUG:
- assert work['height'] == (yield bitcoind.rpc_getblock(work['previousblockhash']))['height'] + 1
- defer.returnValue(dict(
- version=work['version'],
- previous_block=int(work['previousblockhash'], 16),
- transactions=map(bitcoin_data.tx_type.unpack, packed_transactions),
- transaction_hashes=map(bitcoin_data.hash256, packed_transactions),
- transaction_fees=[x.get('fee', None) if isinstance(x, dict) else None for x in work['transactions']],
- subsidy=work['coinbasevalue'],
- time=work['time'] if 'time' in work else work['curtime'],
- bits=bitcoin_data.FloatingIntegerType().unpack(work['bits'].decode('hex')[::-1]) if isinstance(work['bits'], (str, unicode)) else bitcoin_data.FloatingInteger(work['bits']),
- coinbaseflags=work['coinbaseflags'].decode('hex') if 'coinbaseflags' in work else ''.join(x.decode('hex') for x in work['coinbaseaux'].itervalues()) if 'coinbaseaux' in work else '',
- height=work['height'],
- last_update=time.time(),
- use_getblocktemplate=use_getblocktemplate,
- latency=end - start,
- ))
-
-@deferral.retry('Error submitting primary block: (will retry)', 10, 10)
-def submit_block_p2p(block, factory, net):
- if factory.conn.value is None:
- print >>sys.stderr, 'No bitcoind connection when block submittal attempted! %s%064x' % (net.PARENT.BLOCK_EXPLORER_URL_PREFIX, bitcoin_data.hash256(bitcoin_data.block_header_type.pack(block['header'])))
- raise deferral.RetrySilentlyException()
- factory.conn.value.send_block(block=block)
-
-@deferral.retry('Error submitting block: (will retry)', 10, 10)
-@defer.inlineCallbacks
-def submit_block_rpc(block, ignore_failure, bitcoind, bitcoind_work, net):
- if bitcoind_work.value['use_getblocktemplate']:
- try:
- result = yield bitcoind.rpc_submitblock(bitcoin_data.block_type.pack(block).encode('hex'))
- except jsonrpc.Error_for_code(-32601): # Method not found, for older litecoin versions
- result = yield bitcoind.rpc_getblocktemplate(dict(mode='submit', data=bitcoin_data.block_type.pack(block).encode('hex')))
- success = result is None
- else:
- result = yield bitcoind.rpc_getmemorypool(bitcoin_data.block_type.pack(block).encode('hex'))
- success = result
- success_expected = net.PARENT.POW_FUNC(bitcoin_data.block_header_type.pack(block['header'])) <= block['header']['bits'].target
- if (not success and success_expected and not ignore_failure) or (success and not success_expected):
- print >>sys.stderr, 'Block submittal result: %s (%r) Expected: %s' % (success, result, success_expected)
-
-def submit_block(block, ignore_failure, factory, bitcoind, bitcoind_work, net):
- submit_block_p2p(block, factory, net)
- submit_block_rpc(block, ignore_failure, bitcoind, bitcoind_work, net)
diff --git a/p2pool/bitcoin/networks.py b/p2pool/bitcoin/networks.py
deleted file mode 100644
index e415b3b..0000000
--- a/p2pool/bitcoin/networks.py
+++ /dev/null
@@ -1,71 +0,0 @@
-import os
-import platform
-
-from twisted.internet import defer
-
-from . import data
-from p2pool.util import math, pack, jsonrpc
-from operator import *
-
-
-@defer.inlineCallbacks
-def check_genesis_block(bitcoind, genesis_block_hash):
- try:
- yield bitcoind.rpc_getblock(genesis_block_hash)
- except jsonrpc.Error_for_code(-5):
- defer.returnValue(False)
- else:
- defer.returnValue(True)
-
-nets = dict(
-
- darkcoin=math.Object(
- P2P_PREFIX='fbc0b6db'.decode('hex'),
- P2P_PORT=9999,
- ADDRESS_VERSION=76,
- RPC_PORT=9998,
- RPC_CHECK=defer.inlineCallbacks(lambda bitcoind: defer.returnValue(
- 'darkcoinaddress' in (yield bitcoind.rpc_help()) and
- not (yield bitcoind.rpc_getinfo())['testnet']
- )),
- SUBSIDY_FUNC=lambda nBits, height: __import__('darkcoin_subsidy').GetBlockBaseValue(nBits, height),
- BLOCKHASH_FUNC=lambda data: pack.IntType(256).unpack(__import__('xcoin_hash').getPoWHash(data)),
- POW_FUNC=lambda data: pack.IntType(256).unpack(__import__('xcoin_hash').getPoWHash(data)),
- BLOCK_PERIOD=150, # s
- SYMBOL='DRK',
- CONF_FILE_FUNC=lambda: os.path.join(os.path.join(os.environ['APPDATA'], 'Darkcoin') if platform.system() == 'Windows' else os.path.expanduser('~/Library/Application Support/Darkcoin/') if platform.system() == 'Darwin' else os.path.expanduser('~/.darkcoin'), 'darkcoin.conf'),
- BLOCK_EXPLORER_URL_PREFIX='http://explorer.darkcoin.io/block/',
- ADDRESS_EXPLORER_URL_PREFIX='http://explorer.darkcoin.io/address/',
- TX_EXPLORER_URL_PREFIX='http://explorer.darkcoin.io/tx/',
- SANE_TARGET_RANGE=(2**256//2**32//1000 - 1, 2**256//2**20 - 1),
- DUMB_SCRYPT_DIFF=1,
- DUST_THRESHOLD=0.001e8,
- ),
-
- darkcoin_testnet=math.Object(
- P2P_PREFIX='fcc1b7dc'.decode('hex'),
- P2P_PORT=19999,
- ADDRESS_VERSION=111,
- RPC_PORT=19998,
- RPC_CHECK=defer.inlineCallbacks(lambda bitcoind: defer.returnValue(
- 'darkcoinaddress' in (yield bitcoind.rpc_help()) and
- (yield bitcoind.rpc_getinfo())['testnet']
- )),
- SUBSIDY_FUNC=lambda nBits, height: __import__('darkcoin_subsidy').GetBlockBaseValue(nBits, height),
- BLOCKHASH_FUNC=lambda data: pack.IntType(256).unpack(__import__('dark_hash').getPoWHash(data)),
- POW_FUNC=lambda data: pack.IntType(256).unpack(__import__('dark_hash').getPoWHash(data)),
- BLOCK_PERIOD=150, # s
- SYMBOL='tDRK',
- CONF_FILE_FUNC=lambda: os.path.join(os.path.join(os.environ['APPDATA'], 'Darkcoin') if platform.system() == 'Windows' else os.path.expanduser('~/Library/Application Support/Darkcoin/') if platform.system() == 'Darwin' else os.path.expanduser('~/.darkcoin'), 'darkcoin.conf'),
- BLOCK_EXPLORER_URL_PREFIX='',
- ADDRESS_EXPLORER_URL_PREFIX='',
- TX_EXPLORER_URL_PREFIX='',
- SANE_TARGET_RANGE=(2**256//2**32//1000 - 1, 2**256//2**20 - 1),
- DUMB_SCRYPT_DIFF=1,
- DUST_THRESHOLD=0.001e8,
- ),
-
-
-)
-for net_name, net in nets.iteritems():
- net.NAME = net_name
diff --git a/p2pool/bitcoin/__init__.py b/p2pool/dash/__init__.py
similarity index 100%
rename from p2pool/bitcoin/__init__.py
rename to p2pool/dash/__init__.py
diff --git a/p2pool/bitcoin/data.py b/p2pool/dash/data.py
similarity index 96%
rename from p2pool/bitcoin/data.py
rename to p2pool/dash/data.py
index 88e1730..eb644e4 100644
--- a/p2pool/bitcoin/data.py
+++ b/p2pool/dash/data.py
@@ -108,6 +108,12 @@ def write(self, file, item):
('lock_time', pack.IntType(32)),
])
+vote_type = pack.ComposedType([
+ ('block_height', pack.IntType(64)),
+ ('pubkey', pack.VarStrType()),
+ ('votes', pack.IntType(32)),
+])
+
merkle_link_type = pack.ComposedType([
('branch', pack.ListType(pack.IntType(256))),
('index', pack.IntType(32)),
@@ -131,6 +137,12 @@ def write(self, file, item):
block_type = pack.ComposedType([
('header', block_header_type),
('txs', pack.ListType(tx_type)),
+ ('votes', pack.ListType(vote_type)),
+])
+
+block_type_old = pack.ComposedType([
+ ('header', block_header_type),
+ ('txs', pack.ListType(tx_type)),
])
# merged mining
@@ -258,7 +270,7 @@ def pubkey_to_address(pubkey, net):
def address_to_pubkey_hash(address, net):
x = human_address_type.unpack(base58_decode(address))
- if x['version'] != net.ADDRESS_VERSION:
+ if x['version'] != net.ADDRESS_VERSION and x['version'] != net.SCRIPT_ADDRESS_VERSION:
raise ValueError('address not for this net!')
return x['pubkey_hash']
diff --git a/p2pool/bitcoin/getwork.py b/p2pool/dash/getwork.py
similarity index 93%
rename from p2pool/bitcoin/getwork.py
rename to p2pool/dash/getwork.py
index 721b8ba..1c5c4aa 100644
--- a/p2pool/bitcoin/getwork.py
+++ b/p2pool/dash/getwork.py
@@ -4,7 +4,7 @@
from __future__ import division
-from . import data as bitcoin_data
+from . import data as dash_data
from . import sha256
from p2pool.util import pack
@@ -35,7 +35,7 @@ def getwork(self, **extra):
if 'data' in extra or 'hash1' in extra or 'target' in extra or 'midstate' in extra:
raise ValueError()
- block_data = bitcoin_data.block_header_type.pack(dict(
+ block_data = dash_data.block_header_type.pack(dict(
version=self.version,
previous_block=self.previous_block,
merkle_root=self.merkle_root,
@@ -75,4 +75,4 @@ def update(self, **kwargs):
return self.__class__(**d)
def decode_data(data):
- return bitcoin_data.block_header_type.unpack(_swap4(data.decode('hex'))[:80])
+ return dash_data.block_header_type.unpack(_swap4(data.decode('hex'))[:80])
diff --git a/p2pool/bitcoin/height_tracker.py b/p2pool/dash/height_tracker.py
similarity index 91%
rename from p2pool/bitcoin/height_tracker.py
rename to p2pool/dash/height_tracker.py
index e5119fb..19d79e4 100644
--- a/p2pool/bitcoin/height_tracker.py
+++ b/p2pool/dash/height_tracker.py
@@ -2,7 +2,7 @@
from twisted.python import log
import p2pool
-from p2pool.bitcoin import data as bitcoin_data
+from p2pool.dash import data as dash_data
from p2pool.util import deferral, forest, jsonrpc, variable
class HeaderWrapper(object):
@@ -10,7 +10,7 @@ class HeaderWrapper(object):
@classmethod
def from_header(cls, header):
- return cls(bitcoin_data.hash256(bitcoin_data.block_header_type.pack(header)), header['previous_block'])
+ return cls(dash_data.hash256(dash_data.block_header_type.pack(header)), header['previous_block'])
def __init__(self, hash, previous_hash):
self.hash, self.previous_hash = hash, previous_hash
@@ -89,13 +89,13 @@ def get_height_rel_highest(self, block_hash):
return height - best_height
@defer.inlineCallbacks
-def get_height_rel_highest_func(bitcoind, factory, best_block_func, net):
- if '\ngetblock ' in (yield deferral.retry()(bitcoind.rpc_help)()):
+def get_height_rel_highest_func(dashd, factory, best_block_func, net):
+ if '\ngetblock ' in (yield deferral.retry()(dashd.rpc_help)()):
@deferral.DeferredCacher
@defer.inlineCallbacks
def height_cacher(block_hash):
try:
- x = yield bitcoind.rpc_getblock('%x' % (block_hash,))
+ x = yield dashd.rpc_getblock('%x' % (block_hash,))
except jsonrpc.Error_for_code(-5): # Block not found
if not p2pool.DEBUG:
raise deferral.RetrySilentlyException()
diff --git a/p2pool/dash/helper.py b/p2pool/dash/helper.py
new file mode 100644
index 0000000..079a809
--- /dev/null
+++ b/p2pool/dash/helper.py
@@ -0,0 +1,119 @@
+import sys
+import time
+
+from twisted.internet import defer
+
+import p2pool
+from p2pool.dash import data as dash_data
+from p2pool.util import deferral, jsonrpc
+
+@deferral.retry('Error while checking Dash connection:', 1)
+@defer.inlineCallbacks
+def check(dashd, net):
+ if not (yield net.PARENT.RPC_CHECK(dashd)):
+ print >>sys.stderr, " Check failed! Make sure that you're connected to the right dashd with --dashd-rpc-port!"
+ raise deferral.RetrySilentlyException()
+ if not net.VERSION_CHECK((yield dashd.rpc_getinfo())['version']):
+ print >>sys.stderr, ' Dash version too old! Upgrade to 0.11.0.11 or newer!'
+ raise deferral.RetrySilentlyException()
+
+@deferral.retry('Error getting work from dashd:', 3)
+@defer.inlineCallbacks
+def getwork(dashd, net, use_getblocktemplate=False):
+ def go():
+ if use_getblocktemplate:
+ return dashd.rpc_getblocktemplate(dict(mode='template'))
+ else:
+ return dashd.rpc_getmemorypool()
+ try:
+ start = time.time()
+ work = yield go()
+ end = time.time()
+ except jsonrpc.Error_for_code(-32601): # Method not found
+ use_getblocktemplate = not use_getblocktemplate
+ try:
+ start = time.time()
+ work = yield go()
+ end = time.time()
+ except jsonrpc.Error_for_code(-32601): # Method not found
+ print >>sys.stderr, 'Error: Dash version too old! Upgrade to v0.11.0.11 or newer!'
+ raise deferral.RetrySilentlyException()
+ packed_transactions = [(x['data'] if isinstance(x, dict) else x).decode('hex') for x in work['transactions']]
+ packed_votes = [(x['data'] if isinstance(x, dict) else x).decode('hex') for x in work['votes']]
+ if 'height' not in work:
+ work['height'] = (yield dashd.rpc_getblock(work['previousblockhash']))['height'] + 1
+ elif p2pool.DEBUG:
+ assert work['height'] == (yield dashd.rpc_getblock(work['previousblockhash']))['height'] + 1
+ defer.returnValue(dict(
+ version=work['version'],
+ previous_block=int(work['previousblockhash'], 16),
+ transactions=map(dash_data.tx_type.unpack, packed_transactions),
+ transaction_hashes=map(dash_data.hash256, packed_transactions),
+ transaction_fees=[x.get('fee', None) if isinstance(x, dict) else None for x in work['transactions']],
+ subsidy=work['coinbasevalue'],
+ time=work['time'] if 'time' in work else work['curtime'],
+ bits=dash_data.FloatingIntegerType().unpack(work['bits'].decode('hex')[::-1]) if isinstance(work['bits'], (str, unicode)) else dash_data.FloatingInteger(work['bits']),
+ coinbaseflags=work['coinbaseflags'].decode('hex') if 'coinbaseflags' in work else ''.join(x.decode('hex') for x in work['coinbaseaux'].itervalues()) if 'coinbaseaux' in work else '',
+ height=work['height'],
+ last_update=time.time(),
+ use_getblocktemplate=use_getblocktemplate,
+ latency=end - start,
+ votes=map(dash_data.vote_type.unpack, packed_votes),
+ payee=dash_data.address_to_pubkey_hash(work['payee'], net.PARENT) if (work['payee'] != '') else None,
+ masternode_payments=work['masternode_payments'],
+ payee_amount=work['payee_amount'] if (work['payee_amount'] != '') else work['coinbasevalue'] / 5,
+ ))
+
+@deferral.retry('Error submitting primary block: (will retry)', 10, 10)
+def submit_block_p2p(block, factory, net):
+ if factory.conn.value is None:
+ print >>sys.stderr, 'No dashd connection when block submittal attempted! %s%064x' % (net.PARENT.BLOCK_EXPLORER_URL_PREFIX, dash_data.hash256(dash_data.block_header_type.pack(block['header'])))
+ raise deferral.RetrySilentlyException()
+ factory.conn.value.send_block(block=block)
+
+@deferral.retry('Error submitting block: (will retry)', 10, 10)
+@defer.inlineCallbacks
+def submit_block_rpc(block, ignore_failure, dashd, dashd_work, net):
+ if dashd_work.value['use_getblocktemplate']:
+ try:
+ result = yield dashd.rpc_submitblock(dash_data.block_type.pack(block).encode('hex'))
+ except jsonrpc.Error_for_code(-32601): # Method not found, for older litecoin versions
+ result = yield dashd.rpc_getblocktemplate(dict(mode='submit', data=dash_data.block_type.pack(block).encode('hex')))
+ success = result is None
+ else:
+ result = yield dashd.rpc_getmemorypool(dash_data.block_type.pack(block).encode('hex'))
+ success = result
+ success_expected = net.PARENT.POW_FUNC(dash_data.block_header_type.pack(block['header'])) <= block['header']['bits'].target
+ if (not success and success_expected and not ignore_failure) or (success and not success_expected):
+ print >>sys.stderr, 'Block submittal result: %s (%r) Expected: %s' % (success, result, success_expected)
+
+@deferral.retry('Error submitting primary block: (will retry)', 10, 10)
+def submit_block_p2p_old(block, factory, net):
+ if factory.conn.value is None:
+ print >>sys.stderr, 'No dashd connection when block submittal attempted! %s%064x' % (net.PARENT.BLOCK_EXPLORER_URL_PREFIX, dash_data.hash256(dash_data.block_header_type.pack(block['header'])))
+ raise deferral.RetrySilentlyException()
+ factory.conn.value.send_block_old(block=block)
+
+@deferral.retry('Error submitting block: (will retry)', 10, 10)
+@defer.inlineCallbacks
+def submit_block_rpc_old(block, ignore_failure, dashd, dashd_work, net):
+ if dashd_work.value['use_getblocktemplate']:
+ try:
+ result = yield dashd.rpc_submitblock(dash_data.block_type_old.pack(block).encode('hex'))
+ except jsonrpc.Error_for_code(-32601): # Method not found, for older litecoin versions
+ result = yield dashd.rpc_getblocktemplate(dict(mode='submit', data=dash_data.block_type_old.pack(block).encode('hex')))
+ success = result is None
+ else:
+ result = yield dashd.rpc_getmemorypool(dash_data.block_type_old.pack(block).encode('hex'))
+ success = result
+ success_expected = net.PARENT.POW_FUNC(dash_data.block_header_type.pack(block['header'])) <= block['header']['bits'].target
+ if (not success and success_expected and not ignore_failure) or (success and not success_expected):
+ print >>sys.stderr, 'Block submittal result: %s (%r) Expected: %s' % (success, result, success_expected)
+
+def submit_block(block, ignore_failure, factory, dashd, dashd_work, net):
+ if dashd_work.value['masternode_payments']:
+ submit_block_p2p(block, factory, net)
+ submit_block_rpc(block, ignore_failure, dashd, dashd_work, net)
+ else:
+ submit_block_p2p_old(block, factory, net)
+ submit_block_rpc_old(block, ignore_failure, dashd, dashd_work, net)
diff --git a/p2pool/dash/networks.py b/p2pool/dash/networks.py
new file mode 100644
index 0000000..6875256
--- /dev/null
+++ b/p2pool/dash/networks.py
@@ -0,0 +1,67 @@
+import os
+import platform
+
+from twisted.internet import defer
+
+from . import data
+from p2pool.util import math, pack, jsonrpc
+
+@defer.inlineCallbacks
+def check_genesis_block(dashd, genesis_block_hash):
+ try:
+ yield dashd.rpc_getblock(genesis_block_hash)
+ except jsonrpc.Error_for_code(-5):
+ defer.returnValue(False)
+ else:
+ defer.returnValue(True)
+
+nets = dict(
+ dash=math.Object(
+ P2P_PREFIX='bf0c6bbd'.decode('hex'),
+ P2P_PORT=9999,
+ ADDRESS_VERSION=76,
+ SCRIPT_ADDRESS_VERSION=16,
+ RPC_PORT=9998,
+ RPC_CHECK=defer.inlineCallbacks(lambda dashd: defer.returnValue(
+ 'dashaddress' in (yield dashd.rpc_help()) and
+ not (yield dashd.rpc_getinfo())['testnet']
+ )),
+ SUBSIDY_FUNC=lambda nBits, height: __import__('darkcoin_subsidy').GetBlockBaseValue(nBits, height),
+ BLOCKHASH_FUNC=lambda data: pack.IntType(256).unpack(__import__('darkcoin_hash').getPoWHash(data)),
+ POW_FUNC=lambda data: pack.IntType(256).unpack(__import__('darkcoin_hash').getPoWHash(data)),
+ BLOCK_PERIOD=150, # s
+ SYMBOL='DASH',
+ CONF_FILE_FUNC=lambda: os.path.join(os.path.join(os.environ['APPDATA'], 'Dash') if platform.system() == 'Windows' else os.path.expanduser('~/Library/Application Support/Dash/') if platform.system() == 'Darwin' else os.path.expanduser('~/.dash'), 'dash.conf'),
+ BLOCK_EXPLORER_URL_PREFIX='http://explorer.dashninja.pl/block/',
+ ADDRESS_EXPLORER_URL_PREFIX='http://explorer.dashninja.pl/address/',
+ TX_EXPLORER_URL_PREFIX='http://explorer.dashninja.pl/tx/',
+ SANE_TARGET_RANGE=(2**256//2**32//1000 - 1, 2**256//2**20 - 1),
+ DUMB_SCRYPT_DIFF=1,
+ DUST_THRESHOLD=0.001e8,
+ ),
+ dash_testnet=math.Object(
+ P2P_PREFIX='cee2caff'.decode('hex'),
+ P2P_PORT=19999,
+ ADDRESS_VERSION=139,
+ SCRIPT_ADDRESS_VERSION=19,
+ RPC_PORT=19998,
+ RPC_CHECK=defer.inlineCallbacks(lambda dashd: defer.returnValue(
+ 'dashaddress' in (yield dashd.rpc_help()) and
+ (yield dashd.rpc_getinfo())['testnet']
+ )),
+ SUBSIDY_FUNC=lambda nBits, height: __import__('dash_subsidy').GetBlockBaseValue_testnet(nBits, height),
+ BLOCKHASH_FUNC=lambda data: pack.IntType(256).unpack(__import__('dash_hash').getPoWHash(data)),
+ POW_FUNC=lambda data: pack.IntType(256).unpack(__import__('dash_hash').getPoWHash(data)),
+ BLOCK_PERIOD=150, # s
+ SYMBOL='tDASH',
+ CONF_FILE_FUNC=lambda: os.path.join(os.path.join(os.environ['APPDATA'], 'Dash') if platform.system() == 'Windows' else os.path.expanduser('~/Library/Application Support/Dash/') if platform.system() == 'Darwin' else os.path.expanduser('~/.dash'), 'dash.conf'),
+ BLOCK_EXPLORER_URL_PREFIX='http://test.explorer.dashninja.pl/block/',
+ ADDRESS_EXPLORER_URL_PREFIX='http://test.explorer.dashninja.pl/address/',
+ TX_EXPLORER_URL_PREFIX='http://test.explorer.dashninja.pl/tx/',
+ SANE_TARGET_RANGE=(2**256//2**32//1000 - 1, 2**256//2**20 - 1),
+ DUMB_SCRYPT_DIFF=1,
+ DUST_THRESHOLD=0.001e8,
+ ),
+)
+for net_name, net in nets.iteritems():
+ net.NAME = net_name
diff --git a/p2pool/bitcoin/p2p.py b/p2pool/dash/p2p.py
similarity index 81%
rename from p2pool/bitcoin/p2p.py
rename to p2pool/dash/p2p.py
index 11f6a25..d55ebdf 100644
--- a/p2pool/bitcoin/p2p.py
+++ b/p2pool/dash/p2p.py
@@ -1,5 +1,5 @@
'''
-Implementation of Bitcoin's p2p protocol
+Implementation of Dash's p2p protocol
'''
import random
@@ -9,7 +9,7 @@
from twisted.internet import protocol
import p2pool
-from . import data as bitcoin_data
+from . import data as dash_data
from p2pool.util import deferral, p2protocol, pack, variable
class Protocol(p2protocol.Protocol):
@@ -19,7 +19,7 @@ def __init__(self, net):
def connectionMade(self):
self.send_version(
- version=70002,
+ version=70075,
services=1,
time=int(time.time()),
addr_to=dict(
@@ -41,8 +41,8 @@ def connectionMade(self):
('version', pack.IntType(32)),
('services', pack.IntType(64)),
('time', pack.IntType(64)),
- ('addr_to', bitcoin_data.address_type),
- ('addr_from', bitcoin_data.address_type),
+ ('addr_to', dash_data.address_type),
+ ('addr_from', dash_data.address_type),
('nonce', pack.IntType(64)),
('sub_version_num', pack.VarStrType()),
('start_height', pack.IntType(32)),
@@ -65,7 +65,7 @@ def handle_verack(self):
message_inv = pack.ComposedType([
('invs', pack.ListType(pack.ComposedType([
- ('type', pack.EnumType(pack.IntType(32), {1: 'tx', 2: 'block'})),
+ ('type', pack.EnumType(pack.IntType(32), {1: 'tx', 2: 'block', 3: 'filtered_block', 4: 'txlock_request', 5: 'txlock_vote', 6: 'spork', 7: 'masternode_winner', 8: 'masternode_scanning_error'})),
('hash', pack.IntType(256)),
]))),
])
@@ -76,8 +76,8 @@ def handle_inv(self, invs):
elif inv['type'] == 'block':
self.factory.new_block.happened(inv['hash'])
else:
- print 'Unknown inv type', inv
-
+ print 'Unneeded inv type', inv
+
message_getdata = pack.ComposedType([
('requests', pack.ListType(pack.ComposedType([
('type', pack.EnumType(pack.IntType(32), {1: 'tx', 2: 'block'})),
@@ -99,7 +99,7 @@ def handle_inv(self, invs):
message_addr = pack.ComposedType([
('addrs', pack.ListType(pack.ComposedType([
('timestamp', pack.IntType(32)),
- ('address', bitcoin_data.address_type),
+ ('address', dash_data.address_type),
]))),
])
def handle_addr(self, addrs):
@@ -107,30 +107,34 @@ def handle_addr(self, addrs):
pass
message_tx = pack.ComposedType([
- ('tx', bitcoin_data.tx_type),
+ ('tx', dash_data.tx_type),
])
def handle_tx(self, tx):
self.factory.new_tx.happened(tx)
message_block = pack.ComposedType([
- ('block', bitcoin_data.block_type),
+ ('block', dash_data.block_type),
])
def handle_block(self, block):
- #block_hash = bitcoin_data.hash256(bitcoin_data.block_header_type.pack(block['header']))
- block_hash = self.net.BLOCKHASH_FUNC(bitcoin_data.block_header_type.pack(block['header']))
+ block_hash = self.net.BLOCKHASH_FUNC(dash_data.block_header_type.pack(block['header']))
+ self.get_block.got_response(block_hash, block)
+ self.get_block_header.got_response(block_hash, block['header'])
+
+ message_block_old = pack.ComposedType([
+ ('block', dash_data.block_type_old),
+ ])
+ def handle_block_old(self, block):
+ block_hash = self.net.BLOCKHASH_FUNC(dash_data.block_header_type.pack(block['header']))
self.get_block.got_response(block_hash, block)
self.get_block_header.got_response(block_hash, block['header'])
message_headers = pack.ComposedType([
- ('headers', pack.ListType(bitcoin_data.block_type)),
+ ('headers', pack.ListType(dash_data.block_type_old)),
])
def handle_headers(self, headers):
for header in headers:
header = header['header']
- header_hash = self.net.BLOCKHASH_FUNC(bitcoin_data.block_header_type.pack(header))
- self.get_block_header.got_response(header_hash, header)
-
- #self.get_block_header.got_response(bitcoin_data.hash256(bitcoin_data.block_header_type.pack(header)), header)
+ self.get_block_header.got_response(self.net.BLOCKHASH_FUNC(dash_data.block_header_type.pack(header)), header)
self.factory.new_headers.happened([header['header'] for header in headers])
message_ping = pack.ComposedType([
@@ -158,7 +162,7 @@ def connectionLost(self, reason):
if hasattr(self, 'pinger'):
self.pinger.stop()
if p2pool.DEBUG:
- print >>sys.stderr, 'Bitcoin connection lost. Reason:', reason.getErrorMessage()
+ print >>sys.stderr, 'Dash connection lost. Reason:', reason.getErrorMessage()
class ClientFactory(protocol.ReconnectingClientFactory):
protocol = Protocol
diff --git a/p2pool/bitcoin/script.py b/p2pool/dash/script.py
similarity index 100%
rename from p2pool/bitcoin/script.py
rename to p2pool/dash/script.py
diff --git a/p2pool/bitcoin/sha256.py b/p2pool/dash/sha256.py
similarity index 100%
rename from p2pool/bitcoin/sha256.py
rename to p2pool/dash/sha256.py
diff --git a/p2pool/bitcoin/stratum.py b/p2pool/dash/stratum.py
similarity index 91%
rename from p2pool/bitcoin/stratum.py
rename to p2pool/dash/stratum.py
index ca2aa94..abd9ff3 100644
--- a/p2pool/bitcoin/stratum.py
+++ b/p2pool/dash/stratum.py
@@ -4,7 +4,7 @@
from twisted.internet import protocol, reactor
from twisted.python import log
-from p2pool.bitcoin import data as bitcoin_data, getwork
+from p2pool.dash import data as dash_data, getwork
from p2pool.util import expiring_dict, jsonrpc, pack
@@ -32,6 +32,7 @@ def rpc_authorize(self, username, password):
self.username = username
reactor.callLater(0, self._send_work)
+ return True
def _send_work(self):
try:
@@ -41,7 +42,7 @@ def _send_work(self):
self.transport.loseConnection()
return
jobid = str(random.randrange(2**128))
- self.other.svc_mining.rpc_set_difficulty(bitcoin_data.target_to_difficulty(x['share_target'])*self.wb.net.DUMB_SCRYPT_DIFF).addErrback(lambda err: None)
+ self.other.svc_mining.rpc_set_difficulty(dash_data.target_to_difficulty(x['share_target'])*self.wb.net.DUMB_SCRYPT_DIFF).addErrback(lambda err: None)
self.other.svc_mining.rpc_notify(
jobid, # jobid
getwork._swap4(pack.IntType(256).pack(x['previous_block'])).encode('hex'), # prevhash
@@ -66,7 +67,7 @@ def rpc_submit(self, worker_name, job_id, extranonce2, ntime, nonce):
header = dict(
version=x['version'],
previous_block=x['previous_block'],
- merkle_root=bitcoin_data.check_merkle_link(bitcoin_data.hash256(new_packed_gentx), x['merkle_link']),
+ merkle_root=dash_data.check_merkle_link(dash_data.hash256(new_packed_gentx), x['merkle_link']),
timestamp=pack.IntType(32).unpack(getwork._swap4(ntime.decode('hex'))),
bits=x['bits'],
nonce=pack.IntType(32).unpack(getwork._swap4(nonce.decode('hex'))),
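Editor's note: the `rpc_set_difficulty` call above derives a stratum difficulty from the share target via `dash_data.target_to_difficulty`. A minimal sketch of the two conversions this rename touches, modeled on p2pool's data module (names and the integer-division form are illustrative; the real implementations live in `p2pool/dash/data.py`):

```python
# "Difficulty 1" target: the reference target that defines difficulty == 1.
DIFF1_TARGET = 0xffff0000 * 2**(256 - 64)

def target_to_difficulty(target):
    # Difficulty is inversely proportional to the target: a smaller
    # (harder) target yields a larger difficulty.
    return (DIFF1_TARGET + 1) // (target + 1)

def target_to_average_attempts(target):
    # Expected number of hash attempts before one lands at or below target.
    return 2**256 // (target + 1)
```

`DUMB_SCRYPT_DIFF` in the hunk above is a per-network scaling constant multiplied onto this difficulty for miners that expect scrypt-style units.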
diff --git a/p2pool/bitcoin/worker_interface.py b/p2pool/dash/worker_interface.py
similarity index 96%
rename from p2pool/bitcoin/worker_interface.py
rename to p2pool/dash/worker_interface.py
index 7ae1951..ddefb01 100644
--- a/p2pool/bitcoin/worker_interface.py
+++ b/p2pool/dash/worker_interface.py
@@ -8,7 +8,7 @@
from twisted.internet import defer
import p2pool
-from p2pool.bitcoin import data as bitcoin_data, getwork
+from p2pool.dash import data as dash_data, getwork
from p2pool.util import expiring_dict, jsonrpc, pack, variable
class _Provider(object):
@@ -85,7 +85,7 @@ def _getwork(self, request, data, long_poll):
res = getwork.BlockAttempt(
version=x['version'],
previous_block=x['previous_block'],
- merkle_root=bitcoin_data.check_merkle_link(bitcoin_data.hash256(x['coinb1'] + '\0'*self.worker_bridge.COINBASE_NONCE_LENGTH + x['coinb2']), x['merkle_link']),
+ merkle_root=dash_data.check_merkle_link(dash_data.hash256(x['coinb1'] + '\0'*self.worker_bridge.COINBASE_NONCE_LENGTH + x['coinb2']), x['merkle_link']),
timestamp=x['timestamp'],
bits=x['bits'],
share_target=x['share_target'],
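Editor's note: both hunks above recompute the block's merkle root from the coinbase hash with `check_merkle_link`. A self-contained sketch of that walk, assuming the coinbase is always leaf 0 and hashes are packed as 32-byte little-endian values (the real version uses `dash_data`'s packed 256-bit types):

```python
import hashlib

def hash256(data):
    # Bitcoin-style double SHA-256, interpreted as a little-endian integer.
    return int.from_bytes(hashlib.sha256(hashlib.sha256(data).digest()).digest(), 'little')

def check_merkle_link(tip_hash, link):
    # Walk from the coinbase (index 0, so always the left child at every
    # level) up the branch of sibling hashes to the merkle root.
    if link['index'] != 0:
        raise ValueError('only index-0 merkle links are supported')
    h = tip_hash
    for sibling in link['branch']:
        h = hash256(h.to_bytes(32, 'little') + sibling.to_bytes(32, 'little'))
    return h
```

An empty branch (a block containing only the coinbase) leaves the tip hash unchanged, which is why `merkle_link=dict(branch=[], index=0)` appears as the trivial link elsewhere in this patch.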
diff --git a/p2pool/data.py b/p2pool/data.py
index 0673caf..acd761f 100644
--- a/p2pool/data.py
+++ b/p2pool/data.py
@@ -9,7 +9,7 @@
from twisted.python import log
import p2pool
-from p2pool.bitcoin import data as bitcoin_data, script, sha256
+from p2pool.dash import data as dash_data, script, sha256
from p2pool.util import math, forest, pack
# hashlink
@@ -49,7 +49,7 @@ def load_share(share, net, peer_addr):
else:
raise ValueError('unknown share type: %r' % (share['type'],))
-DONATION_SCRIPT = '4104ffd03de44a6e11b9917f3a29f9443283d9871c9d743ef30d5eddcd37094b64d1b3d8090496b53256786bf5c82932ec23c3b74d9f05a6f95a8b5529352656664bac'.decode('hex')
+DONATION_SCRIPT = '41042d71f6448f92c35ede838e3922313b162cbf20d357e7d067115aac8d1a27f66a89b46dee086775c8b083ee5f06fe1c08d1d0ae0668d029aed17e1f8eaea544d4ac'.decode('hex')
class Share(object):
VERSION = 13
@@ -60,7 +60,7 @@ class Share(object):
('version', pack.VarIntType()),
('previous_block', pack.PossiblyNoneType(0, pack.IntType(256))),
('timestamp', pack.IntType(32)),
- ('bits', bitcoin_data.FloatingIntegerType()),
+ ('bits', dash_data.FloatingIntegerType()),
('nonce', pack.IntType(32)),
])
@@ -74,12 +74,14 @@ class Share(object):
('donation', pack.IntType(16)),
('stale_info', pack.EnumType(pack.IntType(8), dict((k, {0: None, 253: 'orphan', 254: 'doa'}.get(k, 'unk%i' % (k,))) for k in xrange(256)))),
('desired_version', pack.VarIntType()),
+ ('payee', pack.PossiblyNoneType(0, pack.IntType(160))),
+ ('payee_amount', pack.IntType(64)),
])),
('new_transaction_hashes', pack.ListType(pack.IntType(256))),
('transaction_hash_refs', pack.ListType(pack.VarIntType(), 2)), # pairs of share_count, tx_count
('far_share_hash', pack.PossiblyNoneType(0, pack.IntType(256))),
- ('max_bits', bitcoin_data.FloatingIntegerType()),
- ('bits', bitcoin_data.FloatingIntegerType()),
+ ('max_bits', dash_data.FloatingIntegerType()),
+ ('bits', dash_data.FloatingIntegerType()),
('timestamp', pack.IntType(32)),
('absheight', pack.IntType(32)),
('abswork', pack.IntType(128)),
@@ -120,8 +122,8 @@ def generate_transaction(cls, tracker, share_data, block_target, desired_timesta
pre_target = 2**256//(net.SHARE_PERIOD*attempts_per_second) - 1 if attempts_per_second else 2**256-1
pre_target2 = math.clip(pre_target, (previous_share.max_target*9//10, previous_share.max_target*11//10))
pre_target3 = math.clip(pre_target2, (net.MIN_TARGET, net.MAX_TARGET))
- max_bits = bitcoin_data.FloatingInteger.from_target_upper_bound(pre_target3)
- bits = bitcoin_data.FloatingInteger.from_target_upper_bound(math.clip(desired_target, (pre_target3//30, pre_target3)))
+ max_bits = dash_data.FloatingInteger.from_target_upper_bound(pre_target3)
+ bits = dash_data.FloatingInteger.from_target_upper_bound(math.clip(desired_target, (pre_target3//30, pre_target3)))
new_transaction_hashes = []
new_transaction_size = 0
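Editor's note: `FloatingInteger` in the hunk above wraps the compact "bits" encoding used in block headers. Decoding it is a one-liner worth keeping in mind when reading the target-clipping logic (a sketch; the real class also validates the sign bit and caches the decoded target):

```python
def compact_to_target(bits):
    # The compact 'bits' field packs a target as
    # mantissa * 256**(exponent - 3); the top byte is the exponent.
    exponent = bits >> 24
    mantissa = bits & 0x007fffff  # bit 0x800000 is a sign flag, invalid for targets
    if exponent <= 3:
        return mantissa >> (8 * (3 - exponent))
    return mantissa << (8 * (exponent - 3))
```

`FloatingInteger.from_target_upper_bound` goes the other way: it finds the largest compact-encodable value not exceeding the given target, which is why `max_bits` and `bits` above are stored in compact form rather than as raw 256-bit targets.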
@@ -139,7 +141,7 @@ def generate_transaction(cls, tracker, share_data, block_target, desired_timesta
this = tx_hash_to_this[tx_hash]
else:
if known_txs is not None:
- this_size = bitcoin_data.tx_type.packed_size(known_txs[tx_hash])
+ this_size = dash_data.tx_type.packed_size(known_txs[tx_hash])
if new_transaction_size + this_size > 50000: # only allow 50 kB of new txns/share
break
new_transaction_size += this_size
@@ -159,19 +161,31 @@ def generate_transaction(cls, tracker, share_data, block_target, desired_timesta
weights, total_weight, donation_weight = tracker.get_cumulative_weights(previous_share.share_data['previous_share_hash'] if previous_share is not None else None,
max(0, min(height, net.REAL_CHAIN_LENGTH) - 1),
- 65535*net.SPREAD*bitcoin_data.target_to_average_attempts(block_target),
+ 65535*net.SPREAD*dash_data.target_to_average_attempts(block_target),
)
assert total_weight == sum(weights.itervalues()) + donation_weight, (total_weight, sum(weights.itervalues()) + donation_weight)
- amounts = dict((script, share_data['subsidy']*(199*weight)//(200*total_weight)) for script, weight in weights.iteritems()) # 99.5% goes according to weights prior to this share
- this_script = bitcoin_data.pubkey_hash_to_script2(share_data['pubkey_hash'])
- amounts[this_script] = amounts.get(this_script, 0) + share_data['subsidy']//200 # 0.5% goes to block finder
- amounts[DONATION_SCRIPT] = amounts.get(DONATION_SCRIPT, 0) + share_data['subsidy'] - sum(amounts.itervalues()) # all that's left over is the donation weight and some extra satoshis due to rounding
+ worker_payout = share_data['subsidy']
- if sum(amounts.itervalues()) != share_data['subsidy'] or any(x < 0 for x in amounts.itervalues()):
+ masternode_tx = []
+ if share_data['payee'] is not None:
+ masternode_payout = share_data['payee_amount']
+ worker_payout -= masternode_payout
+ payee_script = dash_data.pubkey_hash_to_script2(share_data['payee'])
+ masternode_tx = [dict(value=masternode_payout, script=payee_script)]
+
+ amounts = dict((script, worker_payout*(199*weight)//(200*total_weight)) for script, weight in weights.iteritems()) # 99.5% goes according to weights prior to this share
+ this_script = dash_data.pubkey_hash_to_script2(share_data['pubkey_hash'])
+ amounts[this_script] = amounts.get(this_script, 0) + worker_payout//200 # 0.5% goes to block finder
+ amounts[DONATION_SCRIPT] = amounts.get(DONATION_SCRIPT, 0) + worker_payout - sum(amounts.itervalues()) # all that's left over is the donation weight and some extra satoshis due to rounding
+
+ if sum(amounts.itervalues()) != worker_payout or any(x < 0 for x in amounts.itervalues()):
raise ValueError()
- dests = sorted(amounts.iterkeys(), key=lambda script: (script == DONATION_SCRIPT, amounts[script], script))[-4000:] # block length limit, unlikely to ever be hit
+ worker_scripts = sorted([k for k in amounts.iterkeys() if k != DONATION_SCRIPT])
+ worker_tx=[dict(value=amounts[script], script=script) for script in worker_scripts if amounts[script]]
+
+ donation_tx = [dict(value=amounts[DONATION_SCRIPT], script=DONATION_SCRIPT)]
share_info = dict(
share_data=share_data,
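Editor's note: the payout hunk above now splits the coinbase three ways: the masternode payee is paid first, then 99.5% of the remainder is divided by accumulated share weight, 0.5% goes to the share finder, and rounding dust plus the donation weight falls to the donation script. A standalone sketch of that arithmetic (all names illustrative; values are in the coin's base units):

```python
def split_payouts(subsidy, weights, total_weight, finder, payee=None, payee_amount=0):
    # Deduct the masternode payment before splitting among workers.
    worker_payout = subsidy
    masternode = {}
    if payee is not None:
        masternode[payee] = payee_amount
        worker_payout -= payee_amount
    # 99.5% of the worker payout is split proportionally to weight.
    amounts = {s: worker_payout * (199 * w) // (200 * total_weight)
               for s, w in weights.items()}
    # 0.5% goes to the finder of this share.
    amounts[finder] = amounts.get(finder, 0) + worker_payout // 200
    # Whatever integer division left over goes to the donation output.
    donation = worker_payout - sum(amounts.values())
    return masternode, amounts, donation
```

The three return values correspond to `masternode_tx`, `worker_tx`, and `donation_tx` in the hunk, and by construction they always sum back to the full subsidy.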
@@ -185,7 +199,7 @@ def generate_transaction(cls, tracker, share_data, block_target, desired_timesta
new_transaction_hashes=new_transaction_hashes,
transaction_hash_refs=transaction_hash_refs,
absheight=((previous_share.absheight if previous_share is not None else 0) + 1) % 2**32,
- abswork=((previous_share.abswork if previous_share is not None else 0) + bitcoin_data.target_to_average_attempts(bits.target)) % 2**128,
+ abswork=((previous_share.abswork if previous_share is not None else 0) + dash_data.target_to_average_attempts(bits.target)) % 2**128,
)
gentx = dict(
@@ -195,7 +209,7 @@ def generate_transaction(cls, tracker, share_data, block_target, desired_timesta
sequence=None,
script=share_data['coinbase'],
)],
- tx_outs=[dict(value=amounts[script], script=script) for script in dests if amounts[script] or script == DONATION_SCRIPT] + [dict(
+ tx_outs=worker_tx + masternode_tx + donation_tx + [dict(
value=0,
script='\x6a\x28' + cls.get_ref_hash(net, share_info, ref_merkle_link) + pack.IntType(64).pack(last_txout_nonce),
)],
@@ -209,8 +223,8 @@ def get_share(header, last_txout_nonce=last_txout_nonce):
share_info=share_info,
ref_merkle_link=dict(branch=[], index=0),
last_txout_nonce=last_txout_nonce,
- hash_link=prefix_to_hash_link(bitcoin_data.tx_type.pack(gentx)[:-32-8-4], cls.gentx_before_refhash),
- merkle_link=bitcoin_data.calculate_merkle_link([None] + other_transaction_hashes, 0),
+ hash_link=prefix_to_hash_link(dash_data.tx_type.pack(gentx)[:-32-8-4], cls.gentx_before_refhash),
+ merkle_link=dash_data.calculate_merkle_link([None] + other_transaction_hashes, 0),
))
assert share.header == header # checks merkle_root
return share
@@ -219,7 +233,7 @@ def get_share(header, last_txout_nonce=last_txout_nonce):
@classmethod
def get_ref_hash(cls, net, share_info, ref_merkle_link):
- return pack.IntType(256).pack(bitcoin_data.check_merkle_link(bitcoin_data.hash256(cls.ref_type.pack(dict(
+ return pack.IntType(256).pack(dash_data.check_merkle_link(dash_data.hash256(cls.ref_type.pack(dict(
identifier=net.IDENTIFIER,
share_info=share_info,
))), ref_merkle_link))
@@ -249,7 +263,7 @@ def __init__(self, net, peer_addr, contents):
self.target = self.share_info['bits'].target
self.timestamp = self.share_info['timestamp']
self.previous_hash = self.share_data['previous_share_hash']
- self.new_script = bitcoin_data.pubkey_hash_to_script2(self.share_data['pubkey_hash'])
+ self.new_script = dash_data.pubkey_hash_to_script2(self.share_data['pubkey_hash'])
self.desired_version = self.share_data['desired_version']
self.absheight = self.share_info['absheight']
self.abswork = self.share_info['abswork']
@@ -266,11 +280,10 @@ def __init__(self, net, peer_addr, contents):
self.get_ref_hash(net, self.share_info, contents['ref_merkle_link']) + pack.IntType(64).pack(self.contents['last_txout_nonce']) + pack.IntType(32).pack(0),
self.gentx_before_refhash,
)
- merkle_root = bitcoin_data.check_merkle_link(self.gentx_hash, self.merkle_link)
+ merkle_root = dash_data.check_merkle_link(self.gentx_hash, self.merkle_link)
self.header = dict(self.min_header, merkle_root=merkle_root)
- self.pow_hash = net.PARENT.POW_FUNC(bitcoin_data.block_header_type.pack(self.header))
- #self.hash = self.header_hash = bitcoin_data.hash256(bitcoin_data.block_header_type.pack(self.header))
- self.hash = self.header_hash = net.PARENT.BLOCKHASH_FUNC(bitcoin_data.block_header_type.pack(self.header))
+ self.pow_hash = net.PARENT.POW_FUNC(dash_data.block_header_type.pack(self.header))
+ self.hash = self.header_hash = net.PARENT.BLOCKHASH_FUNC(dash_data.block_header_type.pack(self.header))
if self.target > net.MAX_TARGET:
from p2pool import p2p
@@ -319,10 +332,10 @@ def check(self, tracker):
assert other_tx_hashes2 == other_tx_hashes
if share_info != self.share_info:
raise ValueError('share_info invalid')
- if bitcoin_data.hash256(bitcoin_data.tx_type.pack(gentx)) != self.gentx_hash:
+ if dash_data.hash256(dash_data.tx_type.pack(gentx)) != self.gentx_hash:
raise ValueError('''gentx doesn't match hash_link''')
- if bitcoin_data.calculate_merkle_link([None] + other_tx_hashes, 0) != self.merkle_link:
+ if dash_data.calculate_merkle_link([None] + other_tx_hashes, 0) != self.merkle_link:
raise ValueError('merkle_link and other_tx_hashes do not match')
return gentx # only used by as_block
@@ -356,17 +369,23 @@ def should_punish_reason(self, previous_block, bits, tracker, known_txs):
if other_txs is None:
pass
else:
- all_txs_size = sum(bitcoin_data.tx_type.packed_size(tx) for tx in other_txs)
+ all_txs_size = sum(dash_data.tx_type.packed_size(tx) for tx in other_txs)
if all_txs_size > 1000000:
return True, 'txs over block size limit'
- new_txs_size = sum(bitcoin_data.tx_type.packed_size(known_txs[tx_hash]) for tx_hash in self.share_info['new_transaction_hashes'])
+ new_txs_size = sum(dash_data.tx_type.packed_size(known_txs[tx_hash]) for tx_hash in self.share_info['new_transaction_hashes'])
if new_txs_size > 50000:
return True, 'new txs over limit'
return False, None
- def as_block(self, tracker, known_txs):
+ def as_block(self, tracker, known_txs, votes):
+ other_txs = self._get_other_txs(tracker, known_txs)
+ if other_txs is None:
+ return None # not all txs present
+ return dict(header=self.header, txs=[self.check(tracker)] + other_txs, votes=votes)
+
+ def as_block_old(self, tracker, known_txs):
other_txs = self._get_other_txs(tracker, known_txs)
if other_txs is None:
return None # not all txs present
@@ -377,9 +396,9 @@ class WeightsSkipList(forest.TrackerSkipList):
# share_count, weights, total_weight
def get_delta(self, element):
- from p2pool.bitcoin import data as bitcoin_data
+ from p2pool.dash import data as dash_data
share = self.tracker.items[element]
- att = bitcoin_data.target_to_average_attempts(share.target)
+ att = dash_data.target_to_average_attempts(share.target)
return 1, {share.new_script: att*(65535-share.share_data['donation'])}, att*65535, att*share.share_data['donation']
def combine_deltas(self, (share_count1, weights1, total_weight1, total_donation_weight1), (share_count2, weights2, total_weight2, total_donation_weight2)):
@@ -413,12 +432,12 @@ def finalize(self, (share_count, weights_list, total_weight, total_donation_weig
class OkayTracker(forest.Tracker):
def __init__(self, net):
forest.Tracker.__init__(self, delta_type=forest.get_attributedelta_type(dict(forest.AttributeDelta.attrs,
- work=lambda share: bitcoin_data.target_to_average_attempts(share.target),
- min_work=lambda share: bitcoin_data.target_to_average_attempts(share.max_target),
+ work=lambda share: dash_data.target_to_average_attempts(share.target),
+ min_work=lambda share: dash_data.target_to_average_attempts(share.max_target),
)))
self.net = net
self.verified = forest.SubsetTracker(delta_type=forest.get_attributedelta_type(dict(forest.AttributeDelta.attrs,
- work=lambda share: bitcoin_data.target_to_average_attempts(share.target),
+ work=lambda share: dash_data.target_to_average_attempts(share.target),
)), subset_of=self)
self.get_cumulative_weights = WeightsSkipList(self)
@@ -530,9 +549,9 @@ def think(self, block_rel_height_func, previous_block, bits, known_txs):
target_cutoff = 2**256-1
if p2pool.DEBUG:
- print 'Desire %i shares. Cutoff: %s old diff>%.2f' % (len(desired), math.format_dt(time.time() - timestamp_cutoff), bitcoin_data.target_to_difficulty(target_cutoff))
+ print 'Desire %i shares. Cutoff: %s old diff>%.2f' % (len(desired), math.format_dt(time.time() - timestamp_cutoff), dash_data.target_to_difficulty(target_cutoff))
for peer_addr, hash, ts, targ in desired:
- print ' ', None if peer_addr is None else '%s:%i' % peer_addr, format_hash(hash), math.format_dt(time.time() - ts), bitcoin_data.target_to_difficulty(targ), ts >= timestamp_cutoff, targ <= target_cutoff
+ print ' ', None if peer_addr is None else '%s:%i' % peer_addr, format_hash(hash), math.format_dt(time.time() - ts), dash_data.target_to_difficulty(targ), ts >= timestamp_cutoff, targ <= target_cutoff
return best, [(peer_addr, hash) for peer_addr, hash, ts, targ in desired if ts >= timestamp_cutoff], decorated_heads, bad_peer_addresses
@@ -569,10 +588,10 @@ def get_average_stale_prop(tracker, share_hash, lookbehind):
def get_stale_counts(tracker, share_hash, lookbehind, rates=False):
res = {}
for share in tracker.get_chain(share_hash, lookbehind - 1):
- res['good'] = res.get('good', 0) + bitcoin_data.target_to_average_attempts(share.target)
+ res['good'] = res.get('good', 0) + dash_data.target_to_average_attempts(share.target)
s = share.share_data['stale_info']
if s is not None:
- res[s] = res.get(s, 0) + bitcoin_data.target_to_average_attempts(share.target)
+ res[s] = res.get(s, 0) + dash_data.target_to_average_attempts(share.target)
if rates:
dt = tracker.items[share_hash].timestamp - tracker.items[tracker.get_nth_parent_hash(share_hash, lookbehind - 1)].timestamp
res = dict((k, v/dt) for k, v in res.iteritems())
@@ -590,7 +609,7 @@ def get_user_stale_props(tracker, share_hash, lookbehind):
return dict((pubkey_hash, stale/total) for pubkey_hash, (stale, total) in res.iteritems())
def get_expected_payouts(tracker, best_share_hash, block_target, subsidy, net):
- weights, total_weight, donation_weight = tracker.get_cumulative_weights(best_share_hash, min(tracker.get_height(best_share_hash), net.REAL_CHAIN_LENGTH), 65535*net.SPREAD*bitcoin_data.target_to_average_attempts(block_target))
+ weights, total_weight, donation_weight = tracker.get_cumulative_weights(best_share_hash, min(tracker.get_height(best_share_hash), net.REAL_CHAIN_LENGTH), 65535*net.SPREAD*dash_data.target_to_average_attempts(block_target))
res = dict((script, subsidy*weight//total_weight) for script, weight in weights.iteritems())
res[DONATION_SCRIPT] = res.get(DONATION_SCRIPT, 0) + subsidy - sum(res.itervalues())
return res
@@ -598,10 +617,10 @@ def get_expected_payouts(tracker, best_share_hash, block_target, subsidy, net):
def get_desired_version_counts(tracker, best_share_hash, dist):
res = {}
for share in tracker.get_chain(best_share_hash, dist):
- res[share.desired_version] = res.get(share.desired_version, 0) + bitcoin_data.target_to_average_attempts(share.target)
+ res[share.desired_version] = res.get(share.desired_version, 0) + dash_data.target_to_average_attempts(share.target)
return res
-def get_warnings(tracker, best_share, net, bitcoind_getinfo, bitcoind_work_value):
+def get_warnings(tracker, best_share, net, dashd_getinfo, dashd_work_value):
res = []
desired_version_counts = get_desired_version_counts(tracker, best_share,
@@ -612,16 +631,16 @@ def get_warnings(tracker, best_share, net, bitcoind_getinfo, bitcoind_work_value
'An upgrade is likely necessary. Check http://p2pool.forre.st/ for more information.' % (
majority_desired_version, 100*desired_version_counts[majority_desired_version]/sum(desired_version_counts.itervalues())))
- if bitcoind_getinfo['errors'] != '':
- if 'This is a pre-release test build' not in bitcoind_getinfo['errors']:
- res.append('(from bitcoind) %s' % (bitcoind_getinfo['errors'],))
+ if dashd_getinfo['errors'] != '':
+ if 'This is a pre-release test build' not in dashd_getinfo['errors']:
+ res.append('(from dashd) %s' % (dashd_getinfo['errors'],))
- version_warning = getattr(net, 'VERSION_WARNING', lambda v: None)(bitcoind_getinfo['version'])
+ version_warning = getattr(net, 'VERSION_WARNING', lambda v: None)(dashd_getinfo['version'])
if version_warning is not None:
res.append(version_warning)
- if time.time() > bitcoind_work_value['last_update'] + 60:
- res.append('''LOST CONTACT WITH BITCOIND for %s! Check that it isn't frozen or dead!''' % (math.format_dt(time.time() - bitcoind_work_value['last_update']),))
+ if time.time() > dashd_work_value['last_update'] + 60:
+ res.append('''LOST CONTACT WITH DASHD for %s! Check that it isn't frozen or dead!''' % (math.format_dt(time.time() - dashd_work_value['last_update']),))
return res
diff --git a/p2pool/main.py b/p2pool/main.py
index 609ff19..69328f1 100644
--- a/p2pool/main.py
+++ b/p2pool/main.py
@@ -19,8 +19,8 @@
from twisted.python import log
from nattraverso import portmapper, ipdiscover
-import bitcoin.p2p as bitcoin_p2p, bitcoin.data as bitcoin_data
-from bitcoin import stratum, worker_interface, helper
+import dash.p2p as dash_p2p, dash.data as dash_data
+from dash import stratum, worker_interface, helper
from util import fixargparse, jsonrpc, variable, deferral, math, logging, switchprotocol
from . import networks, web, work
import p2pool, p2pool.data as p2pool_data, p2pool.node as p2pool_node
@@ -33,12 +33,12 @@ def main(args, net, datadir_path, merged_urls, worker_endpoint):
@defer.inlineCallbacks
def connect_p2p():
- # connect to bitcoind over bitcoin-p2p
- print '''Testing bitcoind P2P connection to '%s:%s'...''' % (args.bitcoind_address, args.bitcoind_p2p_port)
- factory = bitcoin_p2p.ClientFactory(net.PARENT)
- reactor.connectTCP(args.bitcoind_address, args.bitcoind_p2p_port, factory)
+ # connect to dashd over dash-p2p
+ print '''Testing dashd P2P connection to '%s:%s'...''' % (args.dashd_address, args.dashd_p2p_port)
+ factory = dash_p2p.ClientFactory(net.PARENT)
+ reactor.connectTCP(args.dashd_address, args.dashd_p2p_port, factory)
def long():
- print ''' ...taking a while. Common reasons for this include all of bitcoind's connection slots being used...'''
+ print ''' ...taking a while. Common reasons for this include all of dashd's connection slots being used...'''
long_dc = reactor.callLater(5, long)
yield factory.getProtocol() # waits until handshake is successful
if not long_dc.called: long_dc.cancel()
@@ -46,20 +46,20 @@ def long():
print
defer.returnValue(factory)
- if args.testnet: # establish p2p connection first if testnet so bitcoind can work without connections
+ if args.testnet: # establish p2p connection first if testnet so dashd can work without connections
factory = yield connect_p2p()
- # connect to bitcoind over JSON-RPC and do initial getmemorypool
- url = '%s://%s:%i/' % ('https' if args.bitcoind_rpc_ssl else 'http', args.bitcoind_address, args.bitcoind_rpc_port)
- print '''Testing bitcoind RPC connection to '%s' with username '%s'...''' % (url, args.bitcoind_rpc_username)
- bitcoind = jsonrpc.HTTPProxy(url, dict(Authorization='Basic ' + base64.b64encode(args.bitcoind_rpc_username + ':' + args.bitcoind_rpc_password)), timeout=30)
- yield helper.check(bitcoind, net)
- temp_work = yield helper.getwork(bitcoind)
+ # connect to dashd over JSON-RPC and do initial getmemorypool
+ url = '%s://%s:%i/' % ('https' if args.dashd_rpc_ssl else 'http', args.dashd_address, args.dashd_rpc_port)
+ print '''Testing dashd RPC connection to '%s' with username '%s'...''' % (url, args.dashd_rpc_username)
+ dashd = jsonrpc.HTTPProxy(url, dict(Authorization='Basic ' + base64.b64encode(args.dashd_rpc_username + ':' + args.dashd_rpc_password)), timeout=30)
+ yield helper.check(dashd, net)
+ temp_work = yield helper.getwork(dashd, net)
- bitcoind_getinfo_var = variable.Variable(None)
+ dashd_getinfo_var = variable.Variable(None)
@defer.inlineCallbacks
def poll_warnings():
- bitcoind_getinfo_var.set((yield deferral.retry('Error while calling getinfo:')(bitcoind.rpc_getinfo)()))
+ dashd_getinfo_var.set((yield deferral.retry('Error while calling getinfo:')(dashd.rpc_getinfo)()))
yield poll_warnings()
deferral.RobustLoopingCall(poll_warnings).start(20*60)
@@ -83,22 +83,22 @@ def poll_warnings():
address = None
if address is not None:
- res = yield deferral.retry('Error validating cached address:', 5)(lambda: bitcoind.rpc_validateaddress(address))()
+ res = yield deferral.retry('Error validating cached address:', 5)(lambda: dashd.rpc_validateaddress(address))()
if not res['isvalid'] or not res['ismine']:
- print ' Cached address is either invalid or not controlled by local bitcoind!'
+ print ' Cached address is either invalid or not controlled by local dashd!'
address = None
if address is None:
- print ' Getting payout address from bitcoind...'
- address = yield deferral.retry('Error getting payout address from bitcoind:', 5)(lambda: bitcoind.rpc_getaccountaddress('p2pool'))()
+ print ' Getting payout address from dashd...'
+ address = yield deferral.retry('Error getting payout address from dashd:', 5)(lambda: dashd.rpc_getaccountaddress('p2pool'))()
with open(address_path, 'wb') as f:
f.write(address)
- my_pubkey_hash = bitcoin_data.address_to_pubkey_hash(address, net.PARENT)
+ my_pubkey_hash = dash_data.address_to_pubkey_hash(address, net.PARENT)
else:
my_pubkey_hash = args.pubkey_hash
- print ' ...success! Payout address:', bitcoin_data.pubkey_hash_to_address(my_pubkey_hash, net.PARENT)
+ print ' ...success! Payout address:', dash_data.pubkey_hash_to_address(my_pubkey_hash, net.PARENT)
print
print "Loading shares..."
@@ -116,7 +116,7 @@ def share_cb(share):
print 'Initializing work...'
- node = p2pool_node.Node(factory, bitcoind, shares.values(), known_verified, net)
+ node = p2pool_node.Node(factory, dashd, shares.values(), known_verified, net)
yield node.start()
for share_hash in shares:
@@ -211,7 +211,7 @@ def upnp_thread():
print 'Listening for workers on %r port %i...' % (worker_endpoint[0], worker_endpoint[1])
wb = work.WorkerBridge(node, my_pubkey_hash, args.donation_percentage, merged_urls, args.worker_fee)
- web_root = web.get_web_root(wb, datadir_path, bitcoind_getinfo_var)
+ web_root = web.get_web_root(wb, datadir_path, dashd_getinfo_var)
caching_wb = worker_interface.CachingWorkerBridge(wb)
worker_interface.WorkerInterface(caching_wb).attach_to(web_root, get_handler=lambda request: request.redirect('static/'))
web_serverfactory = server.Site(web_root)
@@ -267,7 +267,7 @@ def new_share(share):
return
if share.pow_hash <= share.header['bits'].target and abs(share.timestamp - time.time()) < 10*60:
yield deferral.sleep(random.expovariate(1/60))
- message = '\x02%s BLOCK FOUND by %s! %s%064x' % (net.NAME.upper(), bitcoin_data.script2_to_address(share.new_script, net.PARENT), net.PARENT.BLOCK_EXPLORER_URL_PREFIX, share.header_hash)
+ message = '\x02%s BLOCK FOUND by %s! %s%064x' % (net.NAME.upper(), dash_data.script2_to_address(share.new_script, net.PARENT), net.PARENT.BLOCK_EXPLORER_URL_PREFIX, share.header_hash)
if all('%x' % (share.header_hash,) not in old_message for old_message in self.recent_messages):
self.say(self.channel, message)
self._remember_message(message)
@@ -289,7 +289,6 @@ def connectionLost(self, reason):
print 'IRC connection lost:', reason.getErrorMessage()
class IRCClientFactory(protocol.ReconnectingClientFactory):
protocol = IRCClient
-
reactor.connectTCP("irc.freenode.net", 6667, IRCClientFactory(), bindAddress=(worker_endpoint[0], 0))
@defer.inlineCallbacks
@@ -310,7 +309,7 @@ def status_thread():
datums, dt = wb.local_rate_monitor.get_datums_in_last()
my_att_s = sum(datum['work']/dt for datum in datums)
- my_shares_per_s = sum(datum['work']/dt/bitcoin_data.target_to_average_attempts(datum['share_target']) for datum in datums)
+ my_shares_per_s = sum(datum['work']/dt/dash_data.target_to_average_attempts(datum['share_target']) for datum in datums)
this_str += '\n Local: %sH/s in last %s Local dead on arrival: %s Expected time to share: %s' % (
math.format(int(my_att_s)),
math.format_dt(dt),
@@ -327,15 +326,15 @@ def status_thread():
shares, stale_orphan_shares, stale_doa_shares,
math.format_binomial_conf(stale_orphan_shares + stale_doa_shares, shares, 0.95),
math.format_binomial_conf(stale_orphan_shares + stale_doa_shares, shares, 0.95, lambda x: (1 - x)/(1 - stale_prop)),
- node.get_current_txouts().get(bitcoin_data.pubkey_hash_to_script2(my_pubkey_hash), 0)*1e-8, net.PARENT.SYMBOL,
+ node.get_current_txouts().get(dash_data.pubkey_hash_to_script2(my_pubkey_hash), 0)*1e-8, net.PARENT.SYMBOL,
)
this_str += '\n Pool: %sH/s Stale rate: %.1f%% Expected time to block: %s' % (
math.format(int(real_att_s)),
100*stale_prop,
- math.format_dt(2**256 / node.bitcoind_work.value['bits'].target / real_att_s),
+ math.format_dt(2**256 / node.dashd_work.value['bits'].target / real_att_s),
)
- for warning in p2pool_data.get_warnings(node.tracker, node.best_share_var.value, net, bitcoind_getinfo_var.value, node.bitcoind_work.value):
+ for warning in p2pool_data.get_warnings(node.tracker, node.best_share_var.value, net, dashd_getinfo_var.value, node.dashd_work.value):
print >>sys.stderr, '#'*40
print >>sys.stderr, '>>> Warning: ' + warning
print >>sys.stderr, '#'*40
@@ -365,8 +364,8 @@ def run():
parser = fixargparse.FixedArgumentParser(description='p2pool (version %s)' % (p2pool.__version__,), fromfile_prefix_chars='@')
parser.add_argument('--version', action='version', version=p2pool.__version__)
parser.add_argument('--net',
- help='use specified network (default: bitcoin)',
- action='store', choices=sorted(realnets), default='bitcoin', dest='net_name')
+ help='use specified network (default: dash)',
+ action='store', choices=sorted(realnets), default='dash', dest='net_name')
parser.add_argument('--testnet',
help='''use the network's testnet''',
action='store_const', const=True, default=False, dest='testnet')
@@ -374,7 +373,7 @@ def run():
help='enable debugging mode',
action='store_const', const=True, default=False, dest='debug')
parser.add_argument('-a', '--address',
-        help='generate payouts to this address (default: <address requested from bitcoind>)',
+        help='generate payouts to this address (default: <address requested from dashd>)',
type=str, action='store', default=None, dest='address')
parser.add_argument('--datadir',
help='store data in this directory (default: /data)',
@@ -423,26 +422,26 @@ def run():
help='listen on PORT on interface with ADDR for RPC connections from miners (default: all interfaces, %s)' % ', '.join('%s:%i' % (name, net.WORKER_PORT) for name, net in sorted(realnets.items())),
type=str, action='store', default=None, dest='worker_endpoint')
worker_group.add_argument('-f', '--fee', metavar='FEE_PERCENTAGE',
- help='''charge workers mining to their own bitcoin address (by setting their miner's username to a bitcoin address) this percentage fee to mine on your p2pool instance. Amount displayed at http://127.0.0.1:WORKER_PORT/fee (default: 0)''',
+ help='''charge workers mining to their own dash address (by setting their miner's username to a dash address) this percentage fee to mine on your p2pool instance. Amount displayed at http://127.0.0.1:WORKER_PORT/fee (default: 0)''',
type=float, action='store', default=0, dest='worker_fee')
- bitcoind_group = parser.add_argument_group('bitcoind interface')
- bitcoind_group.add_argument('--bitcoind-address', metavar='BITCOIND_ADDRESS',
+ dashd_group = parser.add_argument_group('dashd interface')
+ dashd_group.add_argument('--dashd-address', metavar='DASHD_ADDRESS',
help='connect to this address (default: 127.0.0.1)',
- type=str, action='store', default='127.0.0.1', dest='bitcoind_address')
- bitcoind_group.add_argument('--bitcoind-rpc-port', metavar='BITCOIND_RPC_PORT',
- help='''connect to JSON-RPC interface at this port (default: %s )''' % ', '.join('%s:%i' % (name, net.PARENT.RPC_PORT) for name, net in sorted(realnets.items())),
- type=int, action='store', default=None, dest='bitcoind_rpc_port')
- bitcoind_group.add_argument('--bitcoind-rpc-ssl',
+ type=str, action='store', default='127.0.0.1', dest='dashd_address')
+ dashd_group.add_argument('--dashd-rpc-port', metavar='DASHD_RPC_PORT',
+ help='''connect to JSON-RPC interface at this port (default: %s )''' % ', '.join('%s:%i' % (name, net.PARENT.RPC_PORT) for name, net in sorted(realnets.items())),
+ type=int, action='store', default=None, dest='dashd_rpc_port')
+ dashd_group.add_argument('--dashd-rpc-ssl',
help='connect to JSON-RPC interface using SSL',
- action='store_true', default=False, dest='bitcoind_rpc_ssl')
- bitcoind_group.add_argument('--bitcoind-p2p-port', metavar='BITCOIND_P2P_PORT',
- help='''connect to P2P interface at this port (default: %s )''' % ', '.join('%s:%i' % (name, net.PARENT.P2P_PORT) for name, net in sorted(realnets.items())),
- type=int, action='store', default=None, dest='bitcoind_p2p_port')
+ action='store_true', default=False, dest='dashd_rpc_ssl')
+ dashd_group.add_argument('--dashd-p2p-port', metavar='DASHD_P2P_PORT',
+ help='''connect to P2P interface at this port (default: %s)''' % ', '.join('%s:%i' % (name, net.PARENT.P2P_PORT) for name, net in sorted(realnets.items())),
+ type=int, action='store', default=None, dest='dashd_p2p_port')
- bitcoind_group.add_argument(metavar='BITCOIND_RPCUSERPASS',
- help='bitcoind RPC interface username, then password, space-separated (only one being provided will cause the username to default to being empty, and none will cause P2Pool to read them from bitcoin.conf)',
- type=str, action='store', default=[], nargs='*', dest='bitcoind_rpc_userpass')
+ dashd_group.add_argument(metavar='DASHD_RPCUSERPASS',
+ help='dashd RPC interface username, then password, space-separated (if only one is provided, the username defaults to empty; if neither is, P2Pool reads them from dash.conf)',
+ type=str, action='store', default=[], nargs='*', dest='dashd_rpc_userpass')
args = parser.parse_args()
@@ -459,20 +458,20 @@ def run():
if not os.path.exists(datadir_path):
os.makedirs(datadir_path)
- if len(args.bitcoind_rpc_userpass) > 2:
+ if len(args.dashd_rpc_userpass) > 2:
parser.error('a maximum of two arguments are allowed')
- args.bitcoind_rpc_username, args.bitcoind_rpc_password = ([None, None] + args.bitcoind_rpc_userpass)[-2:]
+ args.dashd_rpc_username, args.dashd_rpc_password = ([None, None] + args.dashd_rpc_userpass)[-2:]
- if args.bitcoind_rpc_password is None:
+ if args.dashd_rpc_password is None:
conf_path = net.PARENT.CONF_FILE_FUNC()
if not os.path.exists(conf_path):
- parser.error('''Bitcoin configuration file not found. Manually enter your RPC password.\r\n'''
+ parser.error('''Dash configuration file not found. Manually enter your RPC password.\r\n'''
'''If you actually haven't created a configuration file, you should create one at %s with the text:\r\n'''
'''\r\n'''
'''server=1\r\n'''
'''rpcpassword=%x\r\n'''
'''\r\n'''
- '''Keep that password secret! After creating the file, restart Bitcoin.''' % (conf_path, random.randrange(2**128)))
+ '''Keep that password secret! After creating the file, restart Dash.''' % (conf_path, random.randrange(2**128)))
conf = open(conf_path, 'rb').read()
contents = {}
for line in conf.splitlines(True):
@@ -483,24 +482,24 @@ def run():
k, v = line.split('=', 1)
contents[k.strip()] = v.strip()
for conf_name, var_name, var_type in [
- ('rpcuser', 'bitcoind_rpc_username', str),
- ('rpcpassword', 'bitcoind_rpc_password', str),
- ('rpcport', 'bitcoind_rpc_port', int),
- ('port', 'bitcoind_p2p_port', int),
+ ('rpcuser', 'dashd_rpc_username', str),
+ ('rpcpassword', 'dashd_rpc_password', str),
+ ('rpcport', 'dashd_rpc_port', int),
+ ('port', 'dashd_p2p_port', int),
]:
if getattr(args, var_name) is None and conf_name in contents:
setattr(args, var_name, var_type(contents[conf_name]))
- if args.bitcoind_rpc_password is None:
- parser.error('''Bitcoin configuration file didn't contain an rpcpassword= line! Add one!''')
+ if args.dashd_rpc_password is None:
+ parser.error('''Dash configuration file didn't contain an rpcpassword= line! Add one!''')
- if args.bitcoind_rpc_username is None:
- args.bitcoind_rpc_username = ''
+ if args.dashd_rpc_username is None:
+ args.dashd_rpc_username = ''
- if args.bitcoind_rpc_port is None:
- args.bitcoind_rpc_port = net.PARENT.RPC_PORT
+ if args.dashd_rpc_port is None:
+ args.dashd_rpc_port = net.PARENT.RPC_PORT
- if args.bitcoind_p2p_port is None:
- args.bitcoind_p2p_port = net.PARENT.P2P_PORT
+ if args.dashd_p2p_port is None:
+ args.dashd_p2p_port = net.PARENT.P2P_PORT
if args.p2pool_port is None:
args.p2pool_port = net.P2P_PORT
@@ -518,7 +517,7 @@ def run():
if args.address is not None:
try:
- args.pubkey_hash = bitcoin_data.address_to_pubkey_hash(args.address, net.PARENT)
+ args.pubkey_hash = dash_data.address_to_pubkey_hash(args.address, net.PARENT)
except Exception, e:
parser.error('error parsing address: ' + repr(e))
else:
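The DASHD_RPCUSERPASS handling above packs zero, one, or two positional arguments into (username, password) with a left-padding list idiom. A minimal standalone sketch of that idiom (function name and values are illustrative, not part of the patch):

```python
def split_userpass(userpass):
    # Pad on the left with None so the last two slots are always
    # (username, password), regardless of how many args were given.
    if len(userpass) > 2:
        raise ValueError('a maximum of two arguments is allowed')
    return tuple(([None, None] + userpass)[-2:])

assert split_userpass([]) == (None, None)              # read both from dash.conf
assert split_userpass(['secret']) == (None, 'secret')  # username later defaults to ''
assert split_userpass(['user', 'secret']) == ('user', 'secret')
```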
diff --git a/p2pool/networks.py b/p2pool/networks.py
index 7d9b0db..bfc4c3c 100644
--- a/p2pool/networks.py
+++ b/p2pool/networks.py
@@ -1,4 +1,4 @@
-from p2pool.bitcoin import networks
+from p2pool.dash import networks
from p2pool.util import math
# CHAIN_LENGTH = number of shares back client keeps
@@ -8,50 +8,42 @@
# changes can be done by changing one, then the other
nets = dict(
-
- darkcoin=math.Object(
- PARENT=networks.nets['darkcoin'],
- SHARE_PERIOD=15, # seconds
- NEW_SHARE_PERIOD=15, # seconds
- CHAIN_LENGTH=24*60*60//10, # shares
- REAL_CHAIN_LENGTH=24*60*60//10, # shares
- TARGET_LOOKBEHIND=200, # shares //with that the pools share diff is adjusting faster, important if huge hashing power comes to the pool
- SPREAD=30, # blocks
- NEW_SPREAD=30, # blocks
- IDENTIFIER='496247d46a00c115'.decode('hex'),
- PREFIX='5685a273806675db'.decode('hex'),
- P2P_PORT=7902,
- MIN_TARGET=4,
+ dash=math.Object(
+ PARENT=networks.nets['dash'],
+ SHARE_PERIOD=20, # seconds
+ CHAIN_LENGTH=24*60*60//20, # shares
+ REAL_CHAIN_LENGTH=24*60*60//20, # shares
+ TARGET_LOOKBEHIND=100, # shares // the pool's share difficulty adjusts faster with this; important if huge hashing power joins the pool
+ SPREAD=10, # blocks
+ IDENTIFIER='7242ef345e1bed6b'.decode('hex'),
+ PREFIX='3b3e1286f446b891'.decode('hex'),
+ P2P_PORT=8999,
+ MIN_TARGET=0,
MAX_TARGET=2**256//2**20 - 1,
- PERSIST=False,
+ PERSIST=True,
WORKER_PORT=7903,
- BOOTSTRAP_ADDRS='p2phash.com asia02.poolhash.org asia01.poolhash.org 157.56.161.11 54.186.8.140 62.141.39.175 mightypool.net 85.131.127.26 213.229.88.102 cryptohasher.net darkcoin.fr'.split(' '),
- ANNOUNCE_CHANNEL='#p2pool-drk',
- VERSION_CHECK=lambda v: True,
+ BOOTSTRAP_ADDRS='eu.p2pool.pl p2pool.dashninja.pl dash.p2pools.us darkcoin.fr p2pool.crunchpool.com happymining.de'.split(' '),
+ ANNOUNCE_CHANNEL='#p2pool-dash',
+ VERSION_CHECK=lambda v: v >= 110217,
),
-
-
- darkcoin_testnet=math.Object(
- PARENT=networks.nets['darkcoin_testnet'],
- SHARE_PERIOD=15, # seconds
- NEW_SHARE_PERIOD=15, # seconds
- CHAIN_LENGTH=24*60*60//10, # shares
- REAL_CHAIN_LENGTH=24*60*60//10, # shares
- TARGET_LOOKBEHIND=200, # shares //with that the pools share diff is adjusting faster, important if huge hashing power comes to the pool
- SPREAD=30, # blocks
- NEW_SPREAD=30, # blocks
- IDENTIFIER='17cf94c1ae12e98f'.decode('hex'),
- PREFIX='5559f46dfee6881f'.decode('hex'),
- P2P_PORT=17902,
+ dash_testnet=math.Object(
+ PARENT=networks.nets['dash_testnet'],
+ SHARE_PERIOD=20, # seconds
+ CHAIN_LENGTH=24*60*60//20, # shares
+ REAL_CHAIN_LENGTH=24*60*60//20, # shares
+ TARGET_LOOKBEHIND=100, # shares // the pool's share difficulty adjusts faster with this; important if huge hashing power joins the pool
+ SPREAD=10, # blocks
+ IDENTIFIER='b6deb1e543fe2427'.decode('hex'),
+ PREFIX='198b644f6821e3b3'.decode('hex'),
+ P2P_PORT=18999,
MIN_TARGET=0,
MAX_TARGET=2**256//2**20 - 1,
PERSIST=False,
- WORKER_PORT=13990,
- BOOTSTRAP_ADDRS=''.split(' '),
- ANNOUNCE_CHANNEL='#p2pool-drk',
+ WORKER_PORT=17903,
+ BOOTSTRAP_ADDRS='p2pool.dashninja.pl test.p2pool.masternode.io'.split(' '),
+ ANNOUNCE_CHANNEL='',
VERSION_CHECK=lambda v: True,
),
-
)
for net_name, net in nets.iteritems():
net.NAME = net_name
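The new `VERSION_CHECK=lambda v: v >= 110217` gates peers on the daemon's integer version number. Assuming the usual Bitcoin-style encoding (major*1000000 + minor*10000 + revision*100 + build), 110217 matches the "Dash >=0.11.2.17" requirement stated in the README; `encode_version` below is a hypothetical helper for illustration:

```python
def encode_version(major, minor, revision, build):
    # Hypothetical helper: Bitcoin-style integer client version,
    # e.g. 0.11.2.17 -> 110217, the value VERSION_CHECK compares against.
    return major * 1000000 + minor * 10000 + revision * 100 + build

VERSION_CHECK = lambda v: v >= 110217

assert encode_version(0, 11, 2, 17) == 110217
assert VERSION_CHECK(encode_version(0, 11, 2, 17))
assert not VERSION_CHECK(encode_version(0, 11, 2, 16))
```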
diff --git a/p2pool/node.py b/p2pool/node.py
index 5eaea32..6950251 100644
--- a/p2pool/node.py
+++ b/p2pool/node.py
@@ -6,7 +6,7 @@
from twisted.python import log
from p2pool import data as p2pool_data, p2p
-from p2pool.bitcoin import data as bitcoin_data, helper, height_tracker
+from p2pool.dash import data as dash_data, helper, height_tracker
from p2pool.util import deferral, variable
@@ -28,7 +28,7 @@ def handle_shares(self, shares, peer):
all_new_txs = {}
for share, new_txs in shares:
if new_txs is not None:
- all_new_txs.update((bitcoin_data.hash256(bitcoin_data.tx_type.pack(new_tx)), new_tx) for new_tx in new_txs)
+ all_new_txs.update((dash_data.hash256(dash_data.tx_type.pack(new_tx)), new_tx) for new_tx in new_txs)
if share.hash in self.node.tracker.items:
#print 'Got duplicate share, ignoring. Hash: %s' % (p2pool_data.format_hash(share.hash),)
@@ -80,7 +80,7 @@ def handle_get_shares(self, hashes, parents, stops, peer):
return shares
def handle_bestblock(self, header, peer):
- if self.node.net.PARENT.POW_FUNC(bitcoin_data.block_header_type.pack(header)) > header['bits'].target:
+ if self.node.net.PARENT.POW_FUNC(dash_data.block_header_type.pack(header)) > header['bits'].target:
raise p2p.PeerMisbehavingError('received block header fails PoW test')
self.node.handle_header(header)
@@ -150,16 +150,16 @@ def _(share):
def spread():
if (self.node.get_height_rel_highest(share.header['previous_block']) > -5 or
- self.node.bitcoind_work.value['previous_block'] in [share.header['previous_block'], share.header_hash]):
+ self.node.dashd_work.value['previous_block'] in [share.header['previous_block'], share.header_hash]):
self.broadcast_share(share.hash)
spread()
reactor.callLater(5, spread) # so get_height_rel_highest can update
class Node(object):
- def __init__(self, factory, bitcoind, shares, known_verified_share_hashes, net):
+ def __init__(self, factory, dashd, shares, known_verified_share_hashes, net):
self.factory = factory
- self.bitcoind = bitcoind
+ self.dashd = dashd
self.net = net
self.tracker = p2pool_data.OkayTracker(self.net)
@@ -178,15 +178,15 @@ def start(self):
stop_signal = variable.Event()
self.stop = stop_signal.happened
- # BITCOIND WORK
+ # DASHD WORK
- self.bitcoind_work = variable.Variable((yield helper.getwork(self.bitcoind)))
+ self.dashd_work = variable.Variable((yield helper.getwork(self.dashd, self.net)))
@defer.inlineCallbacks
def work_poller():
while stop_signal.times == 0:
flag = self.factory.new_block.get_deferred()
try:
- self.bitcoind_work.set((yield helper.getwork(self.bitcoind, self.bitcoind_work.value['use_getblocktemplate'])))
+ self.dashd_work.set((yield helper.getwork(self.dashd, self.net, self.dashd_work.value['use_getblocktemplate'])))
except:
log.err()
yield defer.DeferredList([flag, deferral.sleep(15)], fireOnOneCallback=True)
@@ -197,19 +197,17 @@ def work_poller():
self.best_block_header = variable.Variable(None)
def handle_header(new_header):
# check that header matches current target
- if not (self.net.PARENT.POW_FUNC(bitcoin_data.block_header_type.pack(new_header)) <= self.bitcoind_work.value['bits'].target):
+ if not (self.net.PARENT.POW_FUNC(dash_data.block_header_type.pack(new_header)) <= self.dashd_work.value['bits'].target):
return
- bitcoind_best_block = self.bitcoind_work.value['previous_block']
+ dashd_best_block = self.dashd_work.value['previous_block']
if (self.best_block_header.value is None
or (
- new_header['previous_block'] == bitcoind_best_block and
- #bitcoin_data.hash256(bitcoin_data.block_header_type.pack(self.best_block_header.value)) == bitcoind_best_block
- self.net.PARENT.BLOCKHASH_FUNC(bitcoin_data.block_header_type.pack(self.best_block_header.value)) == bitcoind_best_block
+ new_header['previous_block'] == dashd_best_block and
+ self.net.PARENT.BLOCKHASH_FUNC(dash_data.block_header_type.pack(self.best_block_header.value)) == dashd_best_block
) # new is child of current and previous is current
or (
- #bitcoin_data.hash256(bitcoin_data.block_header_type.pack(new_header)) == bitcoind_best_block and
- self.net.PARENT.BLOCKHASH_FUNC(bitcoin_data.block_header_type.pack(new_header)) == bitcoind_best_block and
- self.best_block_header.value['previous_block'] != bitcoind_best_block
+ self.net.PARENT.BLOCKHASH_FUNC(dash_data.block_header_type.pack(new_header)) == dashd_best_block and
+ self.best_block_header.value['previous_block'] != dashd_best_block
)): # new is current and previous is not a child of current
self.best_block_header.set(new_header)
self.handle_header = handle_header
@@ -217,40 +215,40 @@ def handle_header(new_header):
def poll_header():
if self.factory.conn.value is None:
return
- handle_header((yield self.factory.conn.value.get_block_header(self.bitcoind_work.value['previous_block'])))
- self.bitcoind_work.changed.watch(lambda _: poll_header())
+ handle_header((yield self.factory.conn.value.get_block_header(self.dashd_work.value['previous_block'])))
+ self.dashd_work.changed.watch(lambda _: poll_header())
yield deferral.retry('Error while requesting best block header:')(poll_header)()
# BEST SHARE
self.known_txs_var = variable.Variable({}) # hash -> tx
self.mining_txs_var = variable.Variable({}) # hash -> tx
- self.get_height_rel_highest = yield height_tracker.get_height_rel_highest_func(self.bitcoind, self.factory, lambda: self.bitcoind_work.value['previous_block'], self.net)
+ self.get_height_rel_highest = yield height_tracker.get_height_rel_highest_func(self.dashd, self.factory, lambda: self.dashd_work.value['previous_block'], self.net)
self.best_share_var = variable.Variable(None)
self.desired_var = variable.Variable(None)
- self.bitcoind_work.changed.watch(lambda _: self.set_best_share())
+ self.dashd_work.changed.watch(lambda _: self.set_best_share())
self.set_best_share()
# setup p2p logic and join p2pool network
# update mining_txs according to getwork results
- @self.bitcoind_work.changed.run_and_watch
+ @self.dashd_work.changed.run_and_watch
def _(_=None):
new_mining_txs = {}
new_known_txs = dict(self.known_txs_var.value)
- for tx_hash, tx in zip(self.bitcoind_work.value['transaction_hashes'], self.bitcoind_work.value['transactions']):
+ for tx_hash, tx in zip(self.dashd_work.value['transaction_hashes'], self.dashd_work.value['transactions']):
new_mining_txs[tx_hash] = tx
new_known_txs[tx_hash] = tx
self.mining_txs_var.set(new_mining_txs)
self.known_txs_var.set(new_known_txs)
- # add p2p transactions from bitcoind to known_txs
+ # add p2p transactions from dashd to known_txs
@self.factory.new_tx.watch
def _(tx):
new_known_txs = dict(self.known_txs_var.value)
- new_known_txs[bitcoin_data.hash256(bitcoin_data.tx_type.pack(tx))] = tx
+ new_known_txs[dash_data.hash256(dash_data.tx_type.pack(tx))] = tx
self.known_txs_var.set(new_known_txs)
- # forward transactions seen to bitcoind
+ # forward transactions seen to dashd
@self.known_txs_var.transitioned.watch
@defer.inlineCallbacks
def _(before, after):
@@ -265,13 +263,17 @@ def _(share):
if not (share.pow_hash <= share.header['bits'].target):
return
- block = share.as_block(self.tracker, self.known_txs_var.value)
+ if self.dashd_work.value['masternode_payments']:
+ block = share.as_block(self.tracker, self.known_txs_var.value, self.dashd_work.value['votes'])
+ else:
+ block = share.as_block_old(self.tracker, self.known_txs_var.value)
+
if block is None:
- print >>sys.stderr, 'GOT INCOMPLETE BLOCK FROM PEER! %s bitcoin: %s%064x' % (p2pool_data.format_hash(share.hash), self.net.PARENT.BLOCK_EXPLORER_URL_PREFIX, share.header_hash)
+ print >>sys.stderr, 'GOT INCOMPLETE BLOCK FROM PEER! %s dash: %s%064x' % (p2pool_data.format_hash(share.hash), self.net.PARENT.BLOCK_EXPLORER_URL_PREFIX, share.header_hash)
return
- helper.submit_block(block, True, self.factory, self.bitcoind, self.bitcoind_work, self.net)
+ helper.submit_block(block, True, self.factory, self.dashd, self.dashd_work, self.net)
print
- print 'GOT BLOCK FROM PEER! Passing to bitcoind! %s bitcoin: %s%064x' % (p2pool_data.format_hash(share.hash), self.net.PARENT.BLOCK_EXPLORER_URL_PREFIX, share.header_hash)
+ print 'GOT BLOCK FROM PEER! Passing to dashd! %s dash: %s%064x' % (p2pool_data.format_hash(share.hash), self.net.PARENT.BLOCK_EXPLORER_URL_PREFIX, share.header_hash)
print
def forget_old_txs():
@@ -294,7 +296,7 @@ def forget_old_txs():
stop_signal.watch(t.stop)
def set_best_share(self):
- best, desired, decorated_heads, bad_peer_addresses = self.tracker.think(self.get_height_rel_highest, self.bitcoind_work.value['previous_block'], self.bitcoind_work.value['bits'], self.known_txs_var.value)
+ best, desired, decorated_heads, bad_peer_addresses = self.tracker.think(self.get_height_rel_highest, self.dashd_work.value['previous_block'], self.dashd_work.value['bits'], self.known_txs_var.value)
self.best_share_var.set(best)
self.desired_var.set(desired)
@@ -307,10 +309,10 @@ def set_best_share(self):
break
def get_current_txouts(self):
- return p2pool_data.get_expected_payouts(self.tracker, self.best_share_var.value, self.bitcoind_work.value['bits'].target, self.bitcoind_work.value['subsidy'], self.net)
+ return p2pool_data.get_expected_payouts(self.tracker, self.best_share_var.value, self.dashd_work.value['bits'].target, self.dashd_work.value['subsidy'], self.net)
def clean_tracker(self):
- best, desired, decorated_heads, bad_peer_addresses = self.tracker.think(self.get_height_rel_highest, self.bitcoind_work.value['previous_block'], self.bitcoind_work.value['bits'], self.known_txs_var.value)
+ best, desired, decorated_heads, bad_peer_addresses = self.tracker.think(self.get_height_rel_highest, self.dashd_work.value['previous_block'], self.dashd_work.value['bits'], self.known_txs_var.value)
# eat away at heads
if decorated_heads:
diff --git a/p2pool/p2p.py b/p2pool/p2p.py
index 94db442..7bba659 100644
--- a/p2pool/p2p.py
+++ b/p2pool/p2p.py
@@ -10,7 +10,7 @@
import p2pool
from p2pool import data as p2pool_data
-from p2pool.bitcoin import data as bitcoin_data
+from p2pool.dash import data as dash_data
from p2pool.util import deferral, p2protocol, pack, variable
class PeerMisbehavingError(Exception):
@@ -107,8 +107,8 @@ def _timeout(self):
message_version = pack.ComposedType([
('version', pack.IntType(32)),
('services', pack.IntType(64)),
- ('addr_to', bitcoin_data.address_type),
- ('addr_from', bitcoin_data.address_type),
+ ('addr_to', dash_data.address_type),
+ ('addr_from', dash_data.address_type),
('nonce', pack.IntType(64)),
('sub_version', pack.VarStrType()),
('mode', pack.IntType(32)), # always 1 for legacy compatibility
@@ -180,16 +180,16 @@ def update_remote_view_of_my_mining_txs(before, after):
added = set(after) - set(before)
removed = set(before) - set(after)
if added:
- self.remote_remembered_txs_size += sum(100 + bitcoin_data.tx_type.packed_size(after[x]) for x in added)
+ self.remote_remembered_txs_size += sum(100 + dash_data.tx_type.packed_size(after[x]) for x in added)
assert self.remote_remembered_txs_size <= self.max_remembered_txs_size
fragment(self.send_remember_tx, tx_hashes=[x for x in added if x in self.remote_tx_hashes], txs=[after[x] for x in added if x not in self.remote_tx_hashes])
if removed:
self.send_forget_tx(tx_hashes=list(removed))
- self.remote_remembered_txs_size -= sum(100 + bitcoin_data.tx_type.packed_size(before[x]) for x in removed)
+ self.remote_remembered_txs_size -= sum(100 + dash_data.tx_type.packed_size(before[x]) for x in removed)
watch_id2 = self.node.mining_txs_var.transitioned.watch(update_remote_view_of_my_mining_txs)
self.connection_lost_event.watch(lambda: self.node.mining_txs_var.transitioned.unwatch(watch_id2))
- self.remote_remembered_txs_size += sum(100 + bitcoin_data.tx_type.packed_size(x) for x in self.node.mining_txs_var.value.values())
+ self.remote_remembered_txs_size += sum(100 + dash_data.tx_type.packed_size(x) for x in self.node.mining_txs_var.value.values())
assert self.remote_remembered_txs_size <= self.max_remembered_txs_size
fragment(self.send_remember_tx, tx_hashes=[], txs=self.node.mining_txs_var.value.values())
@@ -223,7 +223,7 @@ def handle_addrme(self, port):
message_addrs = pack.ComposedType([
('addrs', pack.ListType(pack.ComposedType([
('timestamp', pack.IntType(64)),
- ('address', bitcoin_data.address_type),
+ ('address', dash_data.address_type),
]))),
])
def handle_addrs(self, addrs):
@@ -298,7 +298,7 @@ def sendShares(self, shares, tracker, known_txs, include_txs_with=[]):
hashes_to_send = [x for x in tx_hashes if x not in self.node.mining_txs_var.value and x in known_txs]
- new_remote_remembered_txs_size = self.remote_remembered_txs_size + sum(100 + bitcoin_data.tx_type.packed_size(known_txs[x]) for x in hashes_to_send)
+ new_remote_remembered_txs_size = self.remote_remembered_txs_size + sum(100 + dash_data.tx_type.packed_size(known_txs[x]) for x in hashes_to_send)
if new_remote_remembered_txs_size > self.max_remembered_txs_size:
raise ValueError('shares have too many txs')
self.remote_remembered_txs_size = new_remote_remembered_txs_size
@@ -309,7 +309,7 @@ def sendShares(self, shares, tracker, known_txs, include_txs_with=[]):
self.send_forget_tx(tx_hashes=hashes_to_send)
- self.remote_remembered_txs_size -= sum(100 + bitcoin_data.tx_type.packed_size(known_txs[x]) for x in hashes_to_send)
+ self.remote_remembered_txs_size -= sum(100 + dash_data.tx_type.packed_size(known_txs[x]) for x in hashes_to_send)
message_sharereq = pack.ComposedType([
@@ -340,7 +340,7 @@ def handle_sharereply(self, id, result, shares):
message_bestblock = pack.ComposedType([
- ('header', bitcoin_data.block_header_type),
+ ('header', dash_data.block_header_type),
])
def handle_bestblock(self, header):
self.node.handle_bestblock(header, self)
@@ -364,7 +364,7 @@ def handle_losing_tx(self, tx_hashes):
message_remember_tx = pack.ComposedType([
('tx_hashes', pack.ListType(pack.IntType(256))),
- ('txs', pack.ListType(bitcoin_data.tx_type)),
+ ('txs', pack.ListType(dash_data.tx_type)),
])
def handle_remember_tx(self, tx_hashes, txs):
for tx_hash in tx_hashes:
@@ -387,11 +387,11 @@ def handle_remember_tx(self, tx_hashes, txs):
return
self.remembered_txs[tx_hash] = tx
- self.remembered_txs_size += 100 + bitcoin_data.tx_type.packed_size(tx)
+ self.remembered_txs_size += 100 + dash_data.tx_type.packed_size(tx)
new_known_txs = dict(self.node.known_txs_var.value)
warned = False
for tx in txs:
- tx_hash = bitcoin_data.hash256(bitcoin_data.tx_type.pack(tx))
+ tx_hash = dash_data.hash256(dash_data.tx_type.pack(tx))
if tx_hash in self.remembered_txs:
print >>sys.stderr, 'Peer referenced transaction twice, disconnecting'
self.disconnect()
@@ -402,7 +402,7 @@ def handle_remember_tx(self, tx_hashes, txs):
warned = True
self.remembered_txs[tx_hash] = tx
- self.remembered_txs_size += 100 + bitcoin_data.tx_type.packed_size(tx)
+ self.remembered_txs_size += 100 + dash_data.tx_type.packed_size(tx)
new_known_txs[tx_hash] = tx
self.node.known_txs_var.set(new_known_txs)
if self.remembered_txs_size >= self.max_remembered_txs_size:
@@ -412,7 +412,7 @@ def handle_remember_tx(self, tx_hashes, txs):
])
def handle_forget_tx(self, tx_hashes):
for tx_hash in tx_hashes:
- self.remembered_txs_size -= 100 + bitcoin_data.tx_type.packed_size(self.remembered_txs[tx_hash])
+ self.remembered_txs_size -= 100 + dash_data.tx_type.packed_size(self.remembered_txs[tx_hash])
assert self.remembered_txs_size >= 0
del self.remembered_txs[tx_hash]
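The remember_tx/forget_tx handlers above keep a per-peer byte budget: each remembered transaction is charged its packed size plus a flat 100-byte overhead, the running total must stay below max_remembered_txs_size, and forgetting must never drive it negative. A minimal sketch of that accounting (TxMemory and its packed_size parameter are illustrative stand-ins, not p2pool APIs; p2pool gets the size from dash_data.tx_type.packed_size):

```python
class TxMemory(object):
    OVERHEAD = 100  # flat per-tx bookkeeping cost, as in the handlers above

    def __init__(self, max_size):
        self.txs = {}
        self.size = 0
        self.max_size = max_size

    def remember(self, tx_hash, tx, packed_size):
        self.txs[tx_hash] = tx
        self.size += self.OVERHEAD + packed_size
        if self.size >= self.max_size:
            # p2pool disconnects the peer at this point
            raise ValueError('too much transaction data remembered')

    def forget(self, tx_hash, packed_size):
        self.size -= self.OVERHEAD + packed_size
        assert self.size >= 0
        del self.txs[tx_hash]
```

A peer that references a transaction twice or overruns the budget is disconnected rather than tolerated, which bounds per-peer memory use.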
diff --git a/p2pool/test/bitcoin/__init__.py b/p2pool/test/dash/__init__.py
similarity index 100%
rename from p2pool/test/bitcoin/__init__.py
rename to p2pool/test/dash/__init__.py
diff --git a/p2pool/test/bitcoin/test_data.py b/p2pool/test/dash/test_data.py
similarity index 96%
rename from p2pool/test/bitcoin/test_data.py
rename to p2pool/test/dash/test_data.py
index 74d5dee..e286385 100644
--- a/p2pool/test/bitcoin/test_data.py
+++ b/p2pool/test/dash/test_data.py
@@ -1,6 +1,6 @@
import unittest
-from p2pool.bitcoin import data, networks
+from p2pool.dash import data, networks
from p2pool.util import pack
@@ -41,7 +41,7 @@ def test_tx_hash(self):
))) == 0xb53802b2333e828d6532059f46ecf6b313a42d79f97925e457fbbfda45367e5c
def test_address_to_pubkey_hash(self):
- assert data.address_to_pubkey_hash('1KUCp7YP5FP8ViRxhfszSUJCTAajK6viGy', networks.nets['bitcoin']) == pack.IntType(160).unpack('ca975b00a8c203b8692f5a18d92dc5c2d2ebc57b'.decode('hex'))
+ assert data.address_to_pubkey_hash('1KUCp7YP5FP8ViRxhfszSUJCTAajK6viGy', networks.nets['dash']) == pack.IntType(160).unpack('ca975b00a8c203b8692f5a18d92dc5c2d2ebc57b'.decode('hex'))
def test_merkle_hash(self):
assert data.merkle_hash([
diff --git a/p2pool/test/bitcoin/test_getwork.py b/p2pool/test/dash/test_getwork.py
similarity index 89%
rename from p2pool/test/bitcoin/test_getwork.py
rename to p2pool/test/dash/test_getwork.py
index 58776f9..6787899 100644
--- a/p2pool/test/bitcoin/test_getwork.py
+++ b/p2pool/test/dash/test_getwork.py
@@ -1,6 +1,6 @@
import unittest
-from p2pool.bitcoin import getwork, data as bitcoin_data
+from p2pool.dash import getwork, data as dash_data
class Test(unittest.TestCase):
def test_all(self):
@@ -41,7 +41,7 @@ def test_all(self):
0x148135e10208db85abb62754341a392eab1f186aab077a831cf7,
0x534ea08be1ab529f484369344b6d5423ef5a0767db9b3ebb4e182bbb67962520,
1305759879,
- bitcoin_data.FloatingInteger.from_target_upper_bound(0x44b9f20000000000000000000000000000000000000000000000),
+ dash_data.FloatingInteger.from_target_upper_bound(0x44b9f20000000000000000000000000000000000000000000000),
0x44b9f20000000000000000000000000000000000000000000000,
),
getwork.BlockAttempt(
@@ -49,7 +49,7 @@ def test_all(self):
0x148135e10208db85abb62754341a392eab1f186aab077a831cf7,
0x534ea08be1ab529f484369344b6d5423ef5a0767db9b3ebb4e182bbb67962520,
1305759879,
- bitcoin_data.FloatingInteger.from_target_upper_bound(0x44b9f20000000000000000000000000000000000000000000000),
+ dash_data.FloatingInteger.from_target_upper_bound(0x44b9f20000000000000000000000000000000000000000000000),
432*2**230,
),
getwork.BlockAttempt(
@@ -57,7 +57,7 @@ def test_all(self):
0x148135e10208db85abb62754341a392eab1f186aab077a831cf7,
0x534ea08be1ab529f484369344b6d5423ef5a0767db9b3ebb4e182bbb67962520,
1305759879,
- bitcoin_data.FloatingInteger.from_target_upper_bound(0x44b9f20000000000000000000000000000000000000000000000),
+ dash_data.FloatingInteger.from_target_upper_bound(0x44b9f20000000000000000000000000000000000000000000000),
7*2**240,
)
]
diff --git a/p2pool/test/bitcoin/test_p2p.py b/p2pool/test/dash/test_p2p.py
similarity index 87%
rename from p2pool/test/bitcoin/test_p2p.py
rename to p2pool/test/dash/test_p2p.py
index a564dad..dbfd34b 100644
--- a/p2pool/test/bitcoin/test_p2p.py
+++ b/p2pool/test/dash/test_p2p.py
@@ -1,14 +1,14 @@
from twisted.internet import defer, reactor
from twisted.trial import unittest
-from p2pool.bitcoin import data, networks, p2p
+from p2pool.dash import data, networks, p2p
from p2pool.util import deferral
class Test(unittest.TestCase):
@defer.inlineCallbacks
def test_get_block(self):
- factory = p2p.ClientFactory(networks.nets['bitcoin'])
+ factory = p2p.ClientFactory(networks.nets['dash'])
c = reactor.connectTCP('127.0.0.1', 8333, factory)
try:
h = 0x000000000000046acff93b0e76cd10490551bf871ce9ac9fad62e67a07ff1d1e
diff --git a/p2pool/test/bitcoin/test_script.py b/p2pool/test/dash/test_script.py
similarity index 93%
rename from p2pool/test/bitcoin/test_script.py
rename to p2pool/test/dash/test_script.py
index 15be06c..03d2ad0 100644
--- a/p2pool/test/bitcoin/test_script.py
+++ b/p2pool/test/dash/test_script.py
@@ -1,6 +1,6 @@
import unittest
-from p2pool.bitcoin import script
+from p2pool.dash import script
class Test(unittest.TestCase):
def test_all(self):
diff --git a/p2pool/test/bitcoin/test_sha256.py b/p2pool/test/dash/test_sha256.py
similarity index 97%
rename from p2pool/test/bitcoin/test_sha256.py
rename to p2pool/test/dash/test_sha256.py
index 57d3b08..ebad4d2 100644
--- a/p2pool/test/bitcoin/test_sha256.py
+++ b/p2pool/test/dash/test_sha256.py
@@ -4,7 +4,7 @@
import hashlib
import random
-from p2pool.bitcoin import sha256
+from p2pool.dash import sha256
class Test(unittest.TestCase):
def test_all(self):
diff --git a/p2pool/test/test_data.py b/p2pool/test/test_data.py
index c9f3527..3ab5770 100644
--- a/p2pool/test/test_data.py
+++ b/p2pool/test/test_data.py
@@ -2,7 +2,7 @@
import unittest
from p2pool import data
-from p2pool.bitcoin import data as bitcoin_data
+from p2pool.dash import data as dash_data
from p2pool.test.util import test_forest
from p2pool.util import forest
@@ -14,14 +14,14 @@ def test_hashlink1(self):
for i in xrange(100):
d = random_bytes(random.randrange(2048))
x = data.prefix_to_hash_link(d)
- assert data.check_hash_link(x, '') == bitcoin_data.hash256(d)
+ assert data.check_hash_link(x, '') == dash_data.hash256(d)
def test_hashlink2(self):
for i in xrange(100):
d = random_bytes(random.randrange(2048))
d2 = random_bytes(random.randrange(2048))
x = data.prefix_to_hash_link(d)
- assert data.check_hash_link(x, d2) == bitcoin_data.hash256(d + d2)
+ assert data.check_hash_link(x, d2) == dash_data.hash256(d + d2)
def test_hashlink3(self):
for i in xrange(100):
@@ -29,7 +29,7 @@ def test_hashlink3(self):
d2 = random_bytes(random.randrange(200))
d3 = random_bytes(random.randrange(2048))
x = data.prefix_to_hash_link(d + d2, d2)
- assert data.check_hash_link(x, d3, d2) == bitcoin_data.hash256(d + d2 + d3)
+ assert data.check_hash_link(x, d3, d2) == dash_data.hash256(d + d2 + d3)
def test_skiplist(self):
t = forest.Tracker()
diff --git a/p2pool/test/test_node.py b/p2pool/test/test_node.py
index f09dd42..392108b 100644
--- a/p2pool/test/test_node.py
+++ b/p2pool/test/test_node.py
@@ -10,10 +10,10 @@
from twisted.web import client, resource, server
from p2pool import data, node, work
-from p2pool.bitcoin import data as bitcoin_data, networks, worker_interface
+from p2pool.dash import data as dash_data, networks, worker_interface
from p2pool.util import deferral, jsonrpc, math, variable
-class bitcoind(object): # can be used as p2p factory, p2p protocol, or rpc jsonrpc proxy
+class dashd(object): # can be used as p2p factory, p2p protocol, or rpc jsonrpc proxy
def __init__(self):
self.blocks = [0x000000000000016c169477c25421250ec5d32cf9c6d38538b5de970a2355fd89]
self.headers = {0x16c169477c25421250ec5d32cf9c6d38538b5de970a2355fd89: {
@@ -22,7 +22,7 @@ def __init__(self):
'merkle_root': 2282849479936278423916707524932131168473430114569971665822757638339486597658L,
'version': 1,
'previous_block': 1048610514577342396345362905164852351970507722694242579238530L,
- 'bits': bitcoin_data.FloatingInteger(bits=0x1a0513c5, target=0x513c50000000000000000000000000000000000000000000000L),
+ 'bits': dash_data.FloatingInteger(bits=0x1a0513c5, target=0x513c50000000000000000000000000000000000000000000000L),
}}
self.conn = variable.Variable(self)
@@ -64,14 +64,14 @@ def rpc_getblocktemplate(self, param):
pass
elif param['mode'] == 'submit':
result = param['data']
- block = bitcoin_data.block_type.unpack(result.decode('hex'))
+ block = dash_data.block_type.unpack(result.decode('hex'))
if sum(tx_out['value'] for tx_out in block['txs'][0]['tx_outs']) != sum(tx['tx_outs'][0]['value'] for tx in block['txs'][1:]) + 5000000000:
print 'invalid fee'
if block['header']['previous_block'] != self.blocks[-1]:
return False
- if bitcoin_data.hash256(result.decode('hex')) > block['header']['bits'].target:
+ if dash_data.hash256(result.decode('hex')) > block['header']['bits'].target:
return False
- header_hash = bitcoin_data.hash256(bitcoin_data.block_header_type.pack(block['header']))
+ header_hash = dash_data.hash256(dash_data.block_header_type.pack(block['header']))
self.blocks.append(header_hash)
self.headers[header_hash] = block['header']
reactor.callLater(0, self.new_block.happened)
@@ -83,7 +83,7 @@ def rpc_getblocktemplate(self, param):
for i in xrange(100):
fee = i
txs.append(dict(
- data=bitcoin_data.tx_type.pack(dict(version=1, tx_ins=[], tx_outs=[dict(value=fee, script='hello!'*100)], lock_time=0)).encode('hex'),
+ data=dash_data.tx_type.pack(dict(version=1, tx_ins=[], tx_outs=[dict(value=fee, script='hello!'*100)], lock_time=0)).encode('hex'),
fee=fee,
))
return {
@@ -146,10 +146,10 @@ def rpc_getauxblock(self, request, result1=None, result2=None):
class MiniNode(object):
@classmethod
@defer.inlineCallbacks
- def start(cls, net, factory, bitcoind, peer_ports, merged_urls):
+ def start(cls, net, factory, dashd, peer_ports, merged_urls):
self = cls()
- self.n = node.Node(factory, bitcoind, [], [], net)
+ self.n = node.Node(factory, dashd, [], [], net)
yield self.n.start()
self.n.p2p_node = node.P2PNode(self.n, port=0, max_incoming_conns=1000000, addr_store={}, connect_addrs=[('127.0.0.1', peer_port) for peer_port in peer_ports])
@@ -173,7 +173,7 @@ def stop(self):
class Test(unittest.TestCase):
@defer.inlineCallbacks
def test_node(self):
- bitd = bitcoind()
+ bitd = dashd()
mm_root = resource.Resource()
mm_root.putChild('', jsonrpc.HTTPServer(mm_provider))
@@ -221,7 +221,7 @@ def test_nodes(self):
N = 3
SHARES = 600
- bitd = bitcoind()
+ bitd = dashd()
nodes = []
for i in xrange(N):
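The mock `getblocktemplate` 'submit' branch above checks that the coinbase claims exactly the fixed test subsidy plus the fees of the other transactions. A minimal sketch of that invariant, assuming the mock's hard-coded 5000000000 base units (this is the test fixture's value, not Dash's real retargeting-dependent subsidy):

```python
def coinbase_value_ok(coinbase_outputs, fees, subsidy=5000000000):
    # Same invariant the mock's 'submit' branch enforces: the generation
    # tx may claim exactly the fixed test subsidy plus every fee collected
    # from the other transactions in the block.
    return sum(coinbase_outputs) == sum(fees) + subsidy
```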
diff --git a/p2pool/test/test_p2p.py b/p2pool/test/test_p2p.py
index 4cf3901..11e729b 100644
--- a/p2pool/test/test_p2p.py
+++ b/p2pool/test/test_p2p.py
@@ -4,7 +4,7 @@
from twisted.trial import unittest
from p2pool import networks, p2p
-from p2pool.bitcoin import data as bitcoin_data
+from p2pool.dash import data as dash_data
from p2pool.util import deferral
@@ -13,7 +13,7 @@ class Test(unittest.TestCase):
def test_sharereq(self):
class MyNode(p2p.Node):
def __init__(self, df):
- p2p.Node.__init__(self, lambda: None, 29333, networks.nets['bitcoin'], {}, set([('127.0.0.1', 9333)]), 0, 0, 0, 0)
+ p2p.Node.__init__(self, lambda: None, 29333, networks.nets['dash'], {}, set([('127.0.0.1', 9333)]), 0, 0, 0, 0)
self.df = df
@@ -36,7 +36,7 @@ def handle_share_hashes(self, hashes, peer):
def test_tx_limit(self):
class MyNode(p2p.Node):
def __init__(self, df):
- p2p.Node.__init__(self, lambda: None, 29333, networks.nets['bitcoin'], {}, set([('127.0.0.1', 9333)]), 0, 0, 0, 0)
+ p2p.Node.__init__(self, lambda: None, 29333, networks.nets['dash'], {}, set([('127.0.0.1', 9333)]), 0, 0, 0, 0)
self.df = df
self.sent_time = 0
@@ -58,7 +58,7 @@ def got_conn(self, conn):
)],
lock_time=i,
)
- new_mining_txs[bitcoin_data.hash256(bitcoin_data.tx_type.pack(huge_tx))] = huge_tx
+ new_mining_txs[dash_data.hash256(dash_data.tx_type.pack(huge_tx))] = huge_tx
self.mining_txs_var.set(new_mining_txs)
self.sent_time = reactor.seconds()
diff --git a/p2pool/util/jsonrpc.py b/p2pool/util/jsonrpc.py
index d810ada..180a182 100644
--- a/p2pool/util/jsonrpc.py
+++ b/p2pool/util/jsonrpc.py
@@ -67,6 +67,16 @@ def _handle(data, provider, preargs=(), response_handler=None):
id_ = req.get('id', None)
method = req.get('method', None)
+
+ # return an error message that sgminer understands
+ if method == 'mining.extranonce.subscribe':
+ defer.returnValue(json.dumps(dict(
+ id=id_,
+ result=None,
+ error=[-3, "Method 'subscribe' not found for service 'mining.extranonce'", None],
+ )))
+ return
+
if not isinstance(method, basestring):
raise Error_for_code(-32600)(u'Invalid Request')
params = req.get('params', [])
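The hunk above intercepts `mining.extranonce.subscribe` before method validation so sgminer receives a well-formed "method not found" answer instead of a `-32600 Invalid Request`. A standalone sketch of the response envelope being built (plain `json` here in place of the deferred return path):

```python
import json

def extranonce_subscribe_error(id_):
    # Build the same JSON-RPC error envelope the patch returns: id echoed
    # back, null result, and a [-3, message, data] error triple that
    # sgminer treats as a non-fatal "unsupported extension".
    return json.dumps(dict(
        id=id_,
        result=None,
        error=[-3, "Method 'subscribe' not found for service 'mining.extranonce'", None],
    ))
```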
diff --git a/p2pool/util/math.py b/p2pool/util/math.py
index 71b5dec..8b2d9d8 100644
--- a/p2pool/util/math.py
+++ b/p2pool/util/math.py
@@ -72,13 +72,15 @@ def add_dicts(*dicts):
mult_dict = lambda c, x: dict((k, c*v) for k, v in x.iteritems())
-def format(x):
+def format(x, add_space=False):
prefixes = 'kMGTPEZY'
count = 0
while x >= 100000 and count < len(prefixes) - 2:
x = x//1000
count += 1
s = '' if count == 0 else prefixes[count - 1]
+ if add_space and s:
+ s = ' ' + s
return '%i' % (x,) + s
def format_dt(dt):
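The `add_space` flag above only changes presentation: it inserts a space before the metric prefix for strings like "1234 kH/s". A self-contained re-implementation for illustration (renamed `format_si` to avoid shadowing the built-in):

```python
def format_si(x, add_space=False):
    # Same algorithm as util.math.format: divide by 1000 while the value
    # has six or more digits, tracking the metric prefix; add_space puts
    # a separator between the number and the prefix.
    prefixes = 'kMGTPEZY'
    count = 0
    while x >= 100000 and count < len(prefixes) - 2:
        x = x // 1000
        count += 1
    s = '' if count == 0 else prefixes[count - 1]
    if add_space and s:
        s = ' ' + s
    return '%i' % (x,) + s
```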
diff --git a/p2pool/util/p2protocol.py b/p2pool/util/p2protocol.py
index 6e8634e..f2a06e7 100644
--- a/p2pool/util/p2protocol.py
+++ b/p2pool/util/p2protocol.py
@@ -1,5 +1,5 @@
'''
-Generic message-based protocol used by Bitcoin and P2Pool for P2P communication
+Generic message-based protocol used by Dash and P2Pool for P2P communication
'''
import hashlib
diff --git a/p2pool/web.py b/p2pool/web.py
index b47ae6e..8fab9d2 100644
--- a/p2pool/web.py
+++ b/p2pool/web.py
@@ -12,7 +12,7 @@
from twisted.web import resource, static
import p2pool
-from bitcoin import data as bitcoin_data
+from dash import data as dash_data
from . import data as p2pool_data, p2p
from util import deferral, deferred_resource, graph, math, memory, pack, variable
@@ -45,7 +45,7 @@ def _atomic_write(filename, data):
os.remove(filename)
os.rename(filename + '.new', filename)
-def get_web_root(wb, datadir_path, bitcoind_getinfo_var, stop_event=variable.Event()):
+def get_web_root(wb, datadir_path, dashd_getinfo_var, stop_event=variable.Event()):
node = wb.node
start_time = time.time()
@@ -56,7 +56,7 @@ def get_users():
weights, total_weight, donation_weight = node.tracker.get_cumulative_weights(node.best_share_var.value, min(height, 720), 65535*2**256)
res = {}
for script in sorted(weights, key=lambda s: weights[s]):
- res[bitcoin_data.script2_to_address(script, node.net.PARENT)] = weights[script]/total_weight
+ res[dash_data.script2_to_address(script, node.net.PARENT)] = weights[script]/total_weight
return res
def get_current_scaled_txouts(scale, trunc=0):
@@ -86,9 +86,9 @@ def get_patron_sendmany(total=None, trunc='0.01'):
total = int(float(total)*1e8)
trunc = int(float(trunc)*1e8)
return json.dumps(dict(
- (bitcoin_data.script2_to_address(script, node.net.PARENT), value/1e8)
+ (dash_data.script2_to_address(script, node.net.PARENT), value/1e8)
for script, value in get_current_scaled_txouts(total, trunc).iteritems()
- if bitcoin_data.script2_to_address(script, node.net.PARENT) is not None
+ if dash_data.script2_to_address(script, node.net.PARENT) is not None
))
def get_global_stats():
@@ -99,13 +99,12 @@ def get_global_stats():
nonstale_hash_rate = p2pool_data.get_pool_attempts_per_second(node.tracker, node.best_share_var.value, lookbehind)
stale_prop = p2pool_data.get_average_stale_prop(node.tracker, node.best_share_var.value, lookbehind)
- diff = bitcoin_data.target_to_difficulty(wb.current_work.value['bits'].target)
-
+ diff = dash_data.target_to_difficulty(wb.current_work.value['bits'].target)
return dict(
pool_nonstale_hash_rate=nonstale_hash_rate,
pool_hash_rate=nonstale_hash_rate/(1 - stale_prop),
pool_stale_prop=stale_prop,
- min_difficulty=bitcoin_data.target_to_difficulty(node.tracker.items[node.best_share_var.value].max_target),
+ min_difficulty=dash_data.target_to_difficulty(node.tracker.items[node.best_share_var.value].max_target),
network_block_difficulty=diff,
network_hashrate=(diff * 2**32 // node.net.PARENT.BLOCK_PERIOD),
)
@@ -125,7 +124,7 @@ def get_local_stats():
my_stale_prop = my_stale_count/my_share_count if my_share_count != 0 else None
- my_work = sum(bitcoin_data.target_to_average_attempts(share.target)
+ my_work = sum(dash_data.target_to_average_attempts(share.target)
for share in node.tracker.get_chain(node.best_share_var.value, lookbehind - 1)
if share.hash in wb.my_share_hashes)
actual_time = (node.tracker.items[node.best_share_var.value].timestamp -
@@ -137,8 +136,8 @@ def get_local_stats():
miner_last_difficulties = {}
for addr in wb.last_work_shares.value:
- miner_last_difficulties[addr] = bitcoin_data.target_to_difficulty(wb.last_work_shares.value[addr].target)
-
+ miner_last_difficulties[addr] = dash_data.target_to_difficulty(wb.last_work_shares.value[addr].target)
+
return dict(
my_hash_rates_in_last_hour=dict(
note="DEPRECATED",
@@ -173,10 +172,10 @@ def get_local_stats():
dead=stale_doa_shares,
),
uptime=time.time() - start_time,
- attempts_to_share=bitcoin_data.target_to_average_attempts(node.tracker.items[node.best_share_var.value].max_target),
- attempts_to_block=bitcoin_data.target_to_average_attempts(node.bitcoind_work.value['bits'].target),
- block_value=node.bitcoind_work.value['subsidy']*1e-8,
- warnings=p2pool_data.get_warnings(node.tracker, node.best_share_var.value, node.net, bitcoind_getinfo_var.value, node.bitcoind_work.value),
+ attempts_to_share=dash_data.target_to_average_attempts(node.tracker.items[node.best_share_var.value].max_target),
+ attempts_to_block=dash_data.target_to_average_attempts(node.dashd_work.value['bits'].target),
+ block_value=node.dashd_work.value['subsidy']*1e-8,
+ warnings=p2pool_data.get_warnings(node.tracker, node.best_share_var.value, node.net, dashd_getinfo_var.value, node.dashd_work.value),
donation_proportion=wb.donation_percentage/100,
version=p2pool.__version__,
protocol_version=p2p.Protocol.VERSION,
@@ -201,12 +200,12 @@ def render_GET(self, request):
def decent_height():
return min(node.tracker.get_height(node.best_share_var.value), 720)
web_root.putChild('rate', WebInterface(lambda: p2pool_data.get_pool_attempts_per_second(node.tracker, node.best_share_var.value, decent_height())/(1-p2pool_data.get_average_stale_prop(node.tracker, node.best_share_var.value, decent_height()))))
- web_root.putChild('difficulty', WebInterface(lambda: bitcoin_data.target_to_difficulty(node.tracker.items[node.best_share_var.value].max_target)))
+ web_root.putChild('difficulty', WebInterface(lambda: dash_data.target_to_difficulty(node.tracker.items[node.best_share_var.value].max_target)))
web_root.putChild('users', WebInterface(get_users))
- web_root.putChild('user_stales', WebInterface(lambda: dict((bitcoin_data.pubkey_hash_to_address(ph, node.net.PARENT), prop) for ph, prop in
+ web_root.putChild('user_stales', WebInterface(lambda: dict((dash_data.pubkey_hash_to_address(ph, node.net.PARENT), prop) for ph, prop in
p2pool_data.get_user_stale_props(node.tracker, node.best_share_var.value, node.tracker.get_height(node.best_share_var.value)).iteritems())))
web_root.putChild('fee', WebInterface(lambda: wb.worker_fee))
- web_root.putChild('current_payouts', WebInterface(lambda: dict((bitcoin_data.script2_to_address(script, node.net.PARENT), value/1e8) for script, value in node.get_current_txouts().iteritems())))
+ web_root.putChild('current_payouts', WebInterface(lambda: dict((dash_data.script2_to_address(script, node.net.PARENT), value/1e8) for script, value in node.get_current_txouts().iteritems())))
web_root.putChild('patron_sendmany', WebInterface(get_patron_sendmany, 'text/plain'))
web_root.putChild('global_stats', WebInterface(get_global_stats))
web_root.putChild('local_stats', WebInterface(get_local_stats))
@@ -223,11 +222,24 @@ def decent_height():
])
))))
web_root.putChild('peer_versions', WebInterface(lambda: dict(('%s:%i' % peer.addr, peer.other_sub_version) for peer in node.p2p_node.peers.itervalues())))
- web_root.putChild('payout_addr', WebInterface(lambda: bitcoin_data.pubkey_hash_to_address(wb.my_pubkey_hash, node.net.PARENT)))
+ web_root.putChild('payout_addr', WebInterface(lambda: dash_data.pubkey_hash_to_address(wb.my_pubkey_hash, node.net.PARENT)))
+ def height_from_coinbase(coinbase):
+ opcode = ord(coinbase[0]) if len(coinbase) > 0 else 0
+ if opcode >= 1 and opcode <= 75:
+ return pack.IntType(opcode*8).unpack(coinbase[1:opcode+1])
+ if opcode == 76:
+ return pack.IntType(8).unpack(coinbase[1:2])
+ if opcode == 77:
+ return pack.IntType(16).unpack(coinbase[1:3])
+ if opcode == 78:
+ return pack.IntType(32).unpack(coinbase[1:5])
+ if opcode >= 79 and opcode <= 96:
+ return opcode - 80
+ return None
web_root.putChild('recent_blocks', WebInterface(lambda: [dict(
ts=s.timestamp,
hash='%064x' % s.header_hash,
- number=pack.IntType(24).unpack(s.share_data['coinbase'][1:4]) if len(s.share_data['coinbase']) >= 4 else None,
+ number=height_from_coinbase(s.share_data['coinbase']),
share='%064x' % s.hash,
) for s in node.tracker.get_chain(node.best_share_var.value, min(node.tracker.get_height(node.best_share_var.value), 24*60*60//node.net.SHARE_PERIOD)) if s.pow_hash <= s.header['bits'].target]))
web_root.putChild('uptime', WebInterface(lambda: time.time() - start_time))
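The new `height_from_coinbase` helper decodes the BIP34 height push at the start of the coinbase scriptSig, replacing the old hard-coded 3-byte read. A standalone sketch of the common cases, using Python 3 `bytes` and `int.from_bytes` in place of the patch's `ord()` and `pack.IntType` (which unpacks little-endian fixed-width integers in p2pool); the PUSHDATA opcodes 76-78 handled by the patch are omitted here since real heights never need them:

```python
def coinbase_height(coinbase):
    # BIP34: the first push of the coinbase scriptSig is the block height.
    # Opcodes 1-75 push that many bytes, holding the height little-endian;
    # OP_1NEGATE/OP_1..OP_16 (0x4f, 0x51-0x60) encode small values directly.
    if not coinbase:
        return None
    opcode = coinbase[0]
    if 1 <= opcode <= 75:
        return int.from_bytes(coinbase[1:opcode + 1], 'little')
    if 79 <= opcode <= 96:
        return opcode - 80
    return None
```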
@@ -264,14 +276,14 @@ def update_stat_log():
shares=shares,
stale_shares=stale_orphan_shares + stale_doa_shares,
stale_shares_breakdown=dict(orphan=stale_orphan_shares, doa=stale_doa_shares),
- current_payout=node.get_current_txouts().get(bitcoin_data.pubkey_hash_to_script2(wb.my_pubkey_hash), 0)*1e-8,
+ current_payout=node.get_current_txouts().get(dash_data.pubkey_hash_to_script2(wb.my_pubkey_hash), 0)*1e-8,
peers=dict(
incoming=sum(1 for peer in node.p2p_node.peers.itervalues() if peer.incoming),
outgoing=sum(1 for peer in node.p2p_node.peers.itervalues() if not peer.incoming),
),
- attempts_to_share=bitcoin_data.target_to_average_attempts(node.tracker.items[node.best_share_var.value].max_target),
- attempts_to_block=bitcoin_data.target_to_average_attempts(node.bitcoind_work.value['bits'].target),
- block_value=node.bitcoind_work.value['subsidy']*1e-8,
+ attempts_to_share=dash_data.target_to_average_attempts(node.tracker.items[node.best_share_var.value].max_target),
+ attempts_to_block=dash_data.target_to_average_attempts(node.dashd_work.value['bits'].target),
+ block_value=node.dashd_work.value['subsidy']*1e-8,
))
with open(os.path.join(datadir_path, 'stats'), 'wb') as f:
@@ -299,7 +311,7 @@ def get_share(share_hash_str):
timestamp=share.timestamp,
target=share.target,
max_target=share.max_target,
- payout_address=bitcoin_data.script2_to_address(share.new_script, node.net.PARENT),
+ payout_address=dash_data.script2_to_address(share.new_script, node.net.PARENT),
donation=share.share_data['donation']/65535,
stale_info=share.share_data['stale_info'],
nonce=share.share_data['nonce'],
@@ -426,9 +438,9 @@ def add_point():
hd.datastreams['pool_rates'].add_datum(t, pool_rates)
current_txouts = node.get_current_txouts()
- hd.datastreams['current_payout'].add_datum(t, current_txouts.get(bitcoin_data.pubkey_hash_to_script2(wb.my_pubkey_hash), 0)*1e-8)
+ hd.datastreams['current_payout'].add_datum(t, current_txouts.get(dash_data.pubkey_hash_to_script2(wb.my_pubkey_hash), 0)*1e-8)
miner_hash_rates, miner_dead_hash_rates = wb.get_local_rates()
- current_txouts_by_address = dict((bitcoin_data.script2_to_address(script, node.net.PARENT), amount) for script, amount in current_txouts.iteritems())
+ current_txouts_by_address = dict((dash_data.script2_to_address(script, node.net.PARENT), amount) for script, amount in current_txouts.iteritems())
hd.datastreams['current_payouts'].add_datum(t, dict((user, current_txouts_by_address[user]*1e-8) for user in miner_hash_rates if user in current_txouts_by_address))
hd.datastreams['peers'].add_datum(t, dict(
@@ -447,7 +459,7 @@ def add_point():
x = deferral.RobustLoopingCall(add_point)
x.start(5)
stop_event.watch(x.stop)
- @node.bitcoind_work.changed.watch
+ @node.dashd_work.changed.watch
def _(new_work):
hd.datastreams['getwork_latency'].add_datum(time.time(), new_work['latency'])
new_root.putChild('graph_data', WebInterface(lambda source, view: hd.datastreams[source].dataviews[view].get_data(time.time())))
diff --git a/p2pool/work.py b/p2pool/work.py
index 1296ad2..3b7effb 100644
--- a/p2pool/work.py
+++ b/p2pool/work.py
@@ -9,45 +9,45 @@
from twisted.internet import defer
from twisted.python import log
-import bitcoin.getwork as bitcoin_getwork, bitcoin.data as bitcoin_data
-from bitcoin import helper, script, worker_interface
+import dash.getwork as dash_getwork, dash.data as dash_data
+from dash import helper, script, worker_interface
from util import forest, jsonrpc, variable, deferral, math, pack
import p2pool, p2pool.data as p2pool_data
class WorkerBridge(worker_interface.WorkerBridge):
COINBASE_NONCE_LENGTH = 8
-
+
def __init__(self, node, my_pubkey_hash, donation_percentage, merged_urls, worker_fee):
worker_interface.WorkerBridge.__init__(self)
self.recent_shares_ts_work = []
-
+
self.node = node
self.my_pubkey_hash = my_pubkey_hash
self.donation_percentage = donation_percentage
self.worker_fee = worker_fee
-
+
self.net = self.node.net.PARENT
self.running = True
self.pseudoshare_received = variable.Event()
self.share_received = variable.Event()
self.local_rate_monitor = math.RateMonitor(10*60)
self.local_addr_rate_monitor = math.RateMonitor(10*60)
-
+
self.removed_unstales_var = variable.Variable((0, 0, 0))
self.removed_doa_unstales_var = variable.Variable(0)
-
+
self.last_work_shares = variable.Variable( {} )
-
+
self.my_share_hashes = set()
self.my_doa_share_hashes = set()
-
+
self.tracker_view = forest.TrackerView(self.node.tracker, forest.get_attributedelta_type(dict(forest.AttributeDelta.attrs,
my_count=lambda share: 1 if share.hash in self.my_share_hashes else 0,
my_doa_count=lambda share: 1 if share.hash in self.my_doa_share_hashes else 0,
my_orphan_announce_count=lambda share: 1 if share.hash in self.my_share_hashes and share.share_data['stale_info'] == 'orphan' else 0,
my_dead_announce_count=lambda share: 1 if share.hash in self.my_share_hashes and share.share_data['stale_info'] == 'doa' else 0,
)))
-
+
@self.node.tracker.verified.removed.watch
def _(share):
if share.hash in self.my_share_hashes and self.node.tracker.is_child_of(share.hash, self.node.best_share_var.value):
@@ -59,11 +59,11 @@ def _(share):
))
if share.hash in self.my_doa_share_hashes and self.node.tracker.is_child_of(share.hash, self.node.best_share_var.value):
self.removed_doa_unstales_var.set(self.removed_doa_unstales_var.value + 1)
-
+
# MERGED WORK
-
+
self.merged_work = variable.Variable({})
-
+
@defer.inlineCallbacks
def set_merged_work(merged_url, merged_userpass):
merged_proxy = jsonrpc.HTTPProxy(merged_url, dict(Authorization='Basic ' + base64.b64encode(merged_userpass)))
@@ -77,42 +77,41 @@ def set_merged_work(merged_url, merged_userpass):
yield deferral.sleep(1)
for merged_url, merged_userpass in merged_urls:
set_merged_work(merged_url, merged_userpass)
-
+
@self.merged_work.changed.watch
def _(new_merged_work):
print 'Got new merged mining work!'
-
+
# COMBINE WORK
-
+
self.current_work = variable.Variable(None)
def compute_work():
- t = self.node.bitcoind_work.value
+ t = self.node.dashd_work.value
bb = self.node.best_block_header.value
- if bb is not None and bb['previous_block'] == t['previous_block'] and self.node.net.PARENT.POW_FUNC(bitcoin_data.block_header_type.pack(bb)) <= t['bits'].target:
+ if bb is not None and bb['previous_block'] == t['previous_block'] and self.node.net.PARENT.POW_FUNC(dash_data.block_header_type.pack(bb)) <= t['bits'].target:
print 'Skipping from block %x to block %x!' % (bb['previous_block'],
- #bitcoin_data.hash256(bitcoin_data.block_header_type.pack(bb)))
- self.node.net.PARENT.BLOCKHASH_FUNC(bitcoin_data.block_header_type.pack(bb)))
+ self.node.net.PARENT.BLOCKHASH_FUNC(dash_data.block_header_type.pack(bb)))
t = dict(
version=bb['version'],
- #previous_block=bitcoin_data.hash256(bitcoin_data.block_header_type.pack(bb)),
- previous_block=self.node.net.PARENT.BLOCKHASH_FUNC(bitcoin_data.block_header_type.pack(bb)),
+ previous_block=self.node.net.PARENT.BLOCKHASH_FUNC(dash_data.block_header_type.pack(bb)),
bits=bb['bits'], # not always true
coinbaseflags='',
height=t['height'] + 1,
time=bb['timestamp'] + 600, # better way?
transactions=[],
transaction_fees=[],
- merkle_link=bitcoin_data.calculate_merkle_link([None], 0),
- #subsidy=self.node.net.PARENT.SUBSIDY_FUNC(self.node.bitcoind_work.value['height']),
- subsidy=self.node.net.PARENT.SUBSIDY_FUNC(self.node.bitcoind_work.value['bits'].bits, self.node.bitcoind_work.value['height']),
- last_update=self.node.bitcoind_work.value['last_update'],
+ merkle_link=dash_data.calculate_merkle_link([None], 0),
+ subsidy=self.node.net.PARENT.SUBSIDY_FUNC(self.node.dashd_work.value['bits'].bits, self.node.dashd_work.value['height']),
+ last_update=self.node.dashd_work.value['last_update'],
+ payee=self.node.dashd_work.value['payee'],
+ payee_amount=self.node.dashd_work.value['payee_amount'],
)
-
+
self.current_work.set(t)
- self.node.bitcoind_work.changed.watch(lambda _: compute_work())
+ self.node.dashd_work.changed.watch(lambda _: compute_work())
self.node.best_block_header.changed.watch(lambda _: compute_work())
compute_work()
-
+
self.new_work_event = variable.Event()
@self.current_work.transitioned.watch
def _(before, after):
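The two fields `compute_work` now carries over from `dashd_work` let share generation reserve the masternode's cut of the reward. A hypothetical sketch of the split; the field names follow the patch, but the address and amounts are made-up placeholders:

```python
# Hypothetical work fields; names follow the patch, values are invented.
dashd_work = dict(
    subsidy=5000000000,                  # full block reward in base units
    payee='XmNfake_masternode_address',  # masternode to pay, or None
    payee_amount=1000000000,             # slice of the reward owed to it
)

def pool_portion(work):
    # Illustrative split: whatever the masternode payee does not take is
    # what the share chain distributes among the pool's miners.
    return work['subsidy'] - (work['payee_amount'] if work['payee'] else 0)
```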
@@ -121,10 +120,10 @@ def _(before, after):
self.new_work_event.happened()
self.merged_work.changed.watch(lambda _: self.new_work_event.happened())
self.node.best_share_var.changed.watch(lambda _: self.new_work_event.happened())
-
+
def stop(self):
self.running = False
-
+
def get_stale_counts(self):
'''Returns (orphans, doas), total, (orphans_recorded_in_chain, doas_recorded_in_chain)'''
my_shares = len(self.my_share_hashes)
@@ -134,60 +133,59 @@ def get_stale_counts(self):
my_doa_shares_in_chain = delta.my_doa_count + self.removed_doa_unstales_var.value
orphans_recorded_in_chain = delta.my_orphan_announce_count + self.removed_unstales_var.value[1]
doas_recorded_in_chain = delta.my_dead_announce_count + self.removed_unstales_var.value[2]
-
+
my_shares_not_in_chain = my_shares - my_shares_in_chain
my_doa_shares_not_in_chain = my_doa_shares - my_doa_shares_in_chain
-
+
return (my_shares_not_in_chain - my_doa_shares_not_in_chain, my_doa_shares_not_in_chain), my_shares, (orphans_recorded_in_chain, doas_recorded_in_chain)
-
+
def get_user_details(self, username):
contents = re.split('([+/])', username)
assert len(contents) % 2 == 1
-
+
user, contents2 = contents[0], contents[1:]
-
+
desired_pseudoshare_target = None
desired_share_target = None
for symbol, parameter in zip(contents2[::2], contents2[1::2]):
if symbol == '+':
try:
- desired_pseudoshare_target = bitcoin_data.difficulty_to_target(float(parameter))
+ desired_pseudoshare_target = dash_data.difficulty_to_target(float(parameter))
except:
if p2pool.DEBUG:
log.err()
elif symbol == '/':
try:
- desired_share_target = bitcoin_data.difficulty_to_target(float(parameter))
+ desired_share_target = dash_data.difficulty_to_target(float(parameter))
except:
if p2pool.DEBUG:
log.err()
-
+
if random.uniform(0, 100) < self.worker_fee:
pubkey_hash = self.my_pubkey_hash
else:
try:
- pubkey_hash = bitcoin_data.address_to_pubkey_hash(user, self.node.net.PARENT)
+ pubkey_hash = dash_data.address_to_pubkey_hash(user, self.node.net.PARENT)
except: # XXX blah
pubkey_hash = self.my_pubkey_hash
-
+
return user, pubkey_hash, desired_share_target, desired_pseudoshare_target
-
+
def preprocess_request(self, user):
if (self.node.p2p_node is None or len(self.node.p2p_node.peers) == 0) and self.node.net.PERSIST:
raise jsonrpc.Error_for_code(-12345)(u'p2pool is not connected to any peers')
if time.time() > self.current_work.value['last_update'] + 60:
- raise jsonrpc.Error_for_code(-12345)(u'lost contact with bitcoind')
+ raise jsonrpc.Error_for_code(-12345)(u'lost contact with dashd')
user, pubkey_hash, desired_share_target, desired_pseudoshare_target = self.get_user_details(user)
return pubkey_hash, desired_share_target, desired_pseudoshare_target
-
+
def _estimate_local_hash_rate(self):
if len(self.recent_shares_ts_work) == 50:
hash_rate = sum(work for ts, work in self.recent_shares_ts_work[1:])//(self.recent_shares_ts_work[-1][0] - self.recent_shares_ts_work[0][0])
-
if hash_rate > 0:
return hash_rate
return None
-
+
def get_local_rates(self):
miner_hash_rates = {}
miner_dead_hash_rates = {}
@@ -197,23 +195,23 @@ def get_local_rates(self):
if datum['dead']:
miner_dead_hash_rates[datum['user']] = miner_dead_hash_rates.get(datum['user'], 0) + datum['work']/dt
return miner_hash_rates, miner_dead_hash_rates
-
+
def get_local_addr_rates(self):
addr_hash_rates = {}
datums, dt = self.local_addr_rate_monitor.get_datums_in_last()
for datum in datums:
addr_hash_rates[datum['pubkey_hash']] = addr_hash_rates.get(datum['pubkey_hash'], 0) + datum['work']/dt
return addr_hash_rates
-
+
def get_work(self, pubkey_hash, desired_share_target, desired_pseudoshare_target):
if self.node.best_share_var.value is None and self.node.net.PERSIST:
raise jsonrpc.Error_for_code(-12345)(u'p2pool is downloading shares')
-
+
if self.merged_work.value:
- tree, size = bitcoin_data.make_auxpow_tree(self.merged_work.value)
+ tree, size = dash_data.make_auxpow_tree(self.merged_work.value)
mm_hashes = [self.merged_work.value.get(tree.get(i), dict(hash=0))['hash'] for i in xrange(size)]
- mm_data = '\xfa\xbemm' + bitcoin_data.aux_pow_coinbase_type.pack(dict(
- merkle_root=bitcoin_data.merkle_hash(mm_hashes),
+ mm_data = '\xfa\xbemm' + dash_data.aux_pow_coinbase_type.pack(dict(
+ merkle_root=dash_data.merkle_hash(mm_hashes),
size=size,
nonce=0,
))
@@ -221,50 +219,51 @@ def get_work(self, pubkey_hash, desired_share_target, desired_pseudoshare_target
else:
mm_data = ''
mm_later = []
-
- tx_hashes = [bitcoin_data.hash256(bitcoin_data.tx_type.pack(tx)) for tx in self.current_work.value['transactions']]
+
+ tx_hashes = [dash_data.hash256(dash_data.tx_type.pack(tx)) for tx in self.current_work.value['transactions']]
tx_map = dict(zip(tx_hashes, self.current_work.value['transactions']))
-
+
previous_share = self.node.tracker.items[self.node.best_share_var.value] if self.node.best_share_var.value is not None else None
if previous_share is None:
share_type = p2pool_data.Share
else:
previous_share_type = type(previous_share)
-
+
if previous_share_type.SUCCESSOR is None or self.node.tracker.get_height(previous_share.hash) < self.node.net.CHAIN_LENGTH:
share_type = previous_share_type
else:
successor_type = previous_share_type.SUCCESSOR
-
+
counts = p2pool_data.get_desired_version_counts(self.node.tracker,
self.node.tracker.get_nth_parent_hash(previous_share.hash, self.node.net.CHAIN_LENGTH*9//10), self.node.net.CHAIN_LENGTH//10)
upgraded = counts.get(successor_type.VERSION, 0)/sum(counts.itervalues())
if upgraded > .65:
print 'Switchover imminent. Upgraded: %.3f%% Threshold: %.3f%%' % (upgraded*100, 95)
- print
+ print
# Share -> NewShare only valid if 95% of hashes in [net.CHAIN_LENGTH*9//10, net.CHAIN_LENGTH] for new version
if counts.get(successor_type.VERSION, 0) > sum(counts.itervalues())*95//100:
share_type = successor_type
else:
share_type = previous_share_type
-
+
if desired_share_target is None:
desired_share_target = 2**256-1
- local_addr_rates = self.get_local_addr_rates()
- local_hash_rate = local_addr_rates.get(pubkey_hash, 0)
- if local_hash_rate > 0.0:
+ local_hash_rate = self._estimate_local_hash_rate()
+ if local_hash_rate is not None:
desired_share_target = min(desired_share_target,
- bitcoin_data.average_attempts_to_target(local_hash_rate * self.node.net.SHARE_PERIOD / 0.0167)) # limit to 1.67% of pool shares by modulating share difficulty
+ dash_data.average_attempts_to_target(local_hash_rate * self.node.net.SHARE_PERIOD / 0.0167)) # limit to 1.67% of pool shares by modulating share difficulty
+
+ local_addr_rates = self.get_local_addr_rates()
lookbehind = 3600//self.node.net.SHARE_PERIOD
- block_subsidy = self.node.bitcoind_work.value['subsidy']
+ block_subsidy = self.node.dashd_work.value['subsidy']
if previous_share is not None and self.node.tracker.get_height(previous_share.hash) > lookbehind:
expected_payout_per_block = local_addr_rates.get(pubkey_hash, 0)/p2pool_data.get_pool_attempts_per_second(self.node.tracker, self.node.best_share_var.value, lookbehind) \
* block_subsidy*(1-self.donation_percentage/100) # XXX doesn't use global stale rate to compute pool hash
if expected_payout_per_block < self.node.net.PARENT.DUST_THRESHOLD:
desired_share_target = min(desired_share_target,
- bitcoin_data.average_attempts_to_target((bitcoin_data.target_to_average_attempts(self.node.bitcoind_work.value['bits'].target)*self.node.net.SPREAD)*self.node.net.PARENT.DUST_THRESHOLD/block_subsidy)
+ dash_data.average_attempts_to_target((dash_data.target_to_average_attempts(self.node.dashd_work.value['bits'].target)*self.node.net.SPREAD)*self.node.net.PARENT.DUST_THRESHOLD/block_subsidy)
)
-
+
if True:
share_info, gentx, other_transaction_hashes, get_share = share_type.generate_transaction(
tracker=self.node.tracker,
@@ -284,6 +283,8 @@ def get_work(self, pubkey_hash, desired_share_target, desired_pseudoshare_target
None
)(*self.get_stale_counts()),
desired_version=(share_type.SUCCESSOR if share_type.SUCCESSOR is not None else share_type).VOTING_VERSION,
+ payee=self.current_work.value['payee'],
+ payee_amount=self.current_work.value['payee_amount'],
),
block_target=self.current_work.value['bits'].target,
desired_timestamp=int(time.time() + 0.5),
@@ -292,46 +293,43 @@ def get_work(self, pubkey_hash, desired_share_target, desired_pseudoshare_target
desired_other_transaction_hashes_and_fees=zip(tx_hashes, self.current_work.value['transaction_fees']),
net=self.node.net,
known_txs=tx_map,
- #base_subsidy=self.node.net.PARENT.SUBSIDY_FUNC(self.current_work.value['height']),
base_subsidy=self.node.net.PARENT.SUBSIDY_FUNC(self.current_work.value['bits'].bits, self.current_work.value['height']),
)
-
- packed_gentx = bitcoin_data.tx_type.pack(gentx)
+
+ packed_gentx = dash_data.tx_type.pack(gentx)
other_transactions = [tx_map[tx_hash] for tx_hash in other_transaction_hashes]
-
+
mm_later = [(dict(aux_work, target=aux_work['target'] if aux_work['target'] != 'p2pool' else share_info['bits'].target), index, hashes) for aux_work, index, hashes in mm_later]
-
+
if desired_pseudoshare_target is None:
target = 2**256-1
local_hash_rate = self._estimate_local_hash_rate()
if local_hash_rate is not None:
target = min(target,
- bitcoin_data.average_attempts_to_target(local_hash_rate * 1)) # limit to 1 share response every second by modulating pseudoshare difficulty
+ dash_data.average_attempts_to_target(local_hash_rate * 1)) # limit to 1 share response every second by modulating pseudoshare difficulty
else:
target = desired_pseudoshare_target
target = max(target, share_info['bits'].target)
for aux_work, index, hashes in mm_later:
target = max(target, aux_work['target'])
target = math.clip(target, self.node.net.PARENT.SANE_TARGET_RANGE)
-
+
getwork_time = time.time()
lp_count = self.new_work_event.times
- merkle_link = bitcoin_data.calculate_merkle_link([None] + other_transaction_hashes, 0)
-
- print 'New work for worker %s! Difficulty: %.06f Share difficulty: %.06f (speed %.06f) Total block value: %.6f %s including %i transactions' % (
- bitcoin_data.pubkey_hash_to_address(pubkey_hash, self.node.net.PARENT),
- bitcoin_data.target_to_difficulty(target),
- bitcoin_data.target_to_difficulty(share_info['bits'].target),
- self.get_local_addr_rates().get(pubkey_hash, 0),
+ merkle_link = dash_data.calculate_merkle_link([None] + other_transaction_hashes, 0)
+
+ print 'New work for worker! Difficulty: %.06f Share difficulty: %.06f Total block value: %.6f %s including %i transactions' % (
+ dash_data.target_to_difficulty(target),
+ dash_data.target_to_difficulty(share_info['bits'].target),
self.current_work.value['subsidy']*1e-8, self.node.net.PARENT.SYMBOL,
len(self.current_work.value['transactions']),
)
#need this for stats
- self.last_work_shares.value[bitcoin_data.pubkey_hash_to_address(pubkey_hash, self.node.net.PARENT)]=share_info['bits']
-
+ self.last_work_shares.value[dash_data.pubkey_hash_to_address(pubkey_hash, self.node.net.PARENT)]=share_info['bits']
+
ba = dict(
- version=min(self.current_work.value['version'], 2),
+ version=min(self.current_work.value['version'], 3),
previous_block=self.current_work.value['previous_block'],
merkle_link=merkle_link,
coinb1=packed_gentx[:-self.COINBASE_NONCE_LENGTH-4],
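The `coinb1` field above hands the miner everything before the 8-byte coinbase nonce, and `got_response` later splices the mined nonce back in front of the final 4 bytes (the tx lock_time). A minimal sketch of that splice:

```python
COINBASE_NONCE_LENGTH = 8  # same constant as WorkerBridge

def splice_coinbase_nonce(packed_gentx, coinbase_nonce):
    # Rebuild the packed generation tx the way got_response() does:
    # coinb1 (everything before the nonce) + miner nonce + 4 tail bytes.
    assert len(coinbase_nonce) == COINBASE_NONCE_LENGTH
    return (packed_gentx[:-COINBASE_NONCE_LENGTH - 4]
            + coinbase_nonce
            + packed_gentx[-4:])
```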
@@ -340,46 +338,48 @@ def get_work(self, pubkey_hash, desired_share_target, desired_pseudoshare_target
bits=self.current_work.value['bits'],
share_target=target,
)
-
+
received_header_hashes = set()
-
+
def got_response(header, user, coinbase_nonce):
assert len(coinbase_nonce) == self.COINBASE_NONCE_LENGTH
new_packed_gentx = packed_gentx[:-self.COINBASE_NONCE_LENGTH-4] + coinbase_nonce + packed_gentx[-4:] if coinbase_nonce != '\0'*self.COINBASE_NONCE_LENGTH else packed_gentx
- new_gentx = bitcoin_data.tx_type.unpack(new_packed_gentx) if coinbase_nonce != '\0'*self.COINBASE_NONCE_LENGTH else gentx
-
- #header_hash = bitcoin_data.hash256(bitcoin_data.block_header_type.pack(header))
- header_hash = self.node.net.PARENT.BLOCKHASH_FUNC(bitcoin_data.block_header_type.pack(header))
- pow_hash = self.node.net.PARENT.POW_FUNC(bitcoin_data.block_header_type.pack(header))
+ new_gentx = dash_data.tx_type.unpack(new_packed_gentx) if coinbase_nonce != '\0'*self.COINBASE_NONCE_LENGTH else gentx
+
+ header_hash = self.node.net.PARENT.BLOCKHASH_FUNC(dash_data.block_header_type.pack(header))
+ pow_hash = self.node.net.PARENT.POW_FUNC(dash_data.block_header_type.pack(header))
try:
if pow_hash <= header['bits'].target or p2pool.DEBUG:
- helper.submit_block(dict(header=header, txs=[new_gentx] + other_transactions), False, self.node.factory, self.node.bitcoind, self.node.bitcoind_work, self.node.net)
+ if self.node.dashd_work.value['masternode_payments']:
+ helper.submit_block(dict(header=header, txs=[new_gentx] + other_transactions, votes=self.node.dashd_work.value['votes']), False, self.node.factory, self.node.dashd, self.node.dashd_work, self.node.net)
+ else:
+ helper.submit_block(dict(header=header, txs=[new_gentx] + other_transactions), False, self.node.factory, self.node.dashd, self.node.dashd_work, self.node.net)
if pow_hash <= header['bits'].target:
print
- print 'GOT BLOCK FROM MINER! Passing to bitcoind! %s%064x' % (self.node.net.PARENT.BLOCK_EXPLORER_URL_PREFIX, header_hash)
+ print 'GOT BLOCK FROM MINER! Passing to dashd! %s%064x' % (self.node.net.PARENT.BLOCK_EXPLORER_URL_PREFIX, header_hash)
print
except:
log.err(None, 'Error while processing potential block:')
-
+
user, _, _, _ = self.get_user_details(user)
assert header['previous_block'] == ba['previous_block']
- assert header['merkle_root'] == bitcoin_data.check_merkle_link(bitcoin_data.hash256(new_packed_gentx), merkle_link)
+ assert header['merkle_root'] == dash_data.check_merkle_link(dash_data.hash256(new_packed_gentx), merkle_link)
assert header['bits'] == ba['bits']
-
+
on_time = self.new_work_event.times == lp_count
-
+
for aux_work, index, hashes in mm_later:
try:
if pow_hash <= aux_work['target'] or p2pool.DEBUG:
df = deferral.retry('Error submitting merged block: (will retry)', 10, 10)(aux_work['merged_proxy'].rpc_getauxblock)(
pack.IntType(256, 'big').pack(aux_work['hash']).encode('hex'),
- bitcoin_data.aux_pow_type.pack(dict(
+ dash_data.aux_pow_type.pack(dict(
merkle_tx=dict(
tx=new_gentx,
block_hash=header_hash,
merkle_link=merkle_link,
),
- merkle_link=bitcoin_data.calculate_merkle_link(hashes, index),
+ merkle_link=dash_data.calculate_merkle_link(hashes, index),
parent_block_header=header,
)).encode('hex'),
)
@@ -394,11 +394,11 @@ def _(err):
log.err(err, 'Error submitting merged block:')
except:
log.err(None, 'Error while processing merged mining POW:')
-
+
if pow_hash <= share_info['bits'].target and header_hash not in received_header_hashes:
last_txout_nonce = pack.IntType(8*self.COINBASE_NONCE_LENGTH).unpack(coinbase_nonce)
share = get_share(header, last_txout_nonce)
-
+
print 'GOT SHARE! %s %s prev %s age %.2fs%s' % (
user,
p2pool_data.format_hash(share.hash),
@@ -409,18 +409,18 @@ def _(err):
self.my_share_hashes.add(share.hash)
if not on_time:
self.my_doa_share_hashes.add(share.hash)
-
+
self.node.tracker.add(share)
self.node.set_best_share()
-
+
try:
if (pow_hash <= header['bits'].target or p2pool.DEBUG) and self.node.p2p_node is not None:
self.node.p2p_node.broadcast_share(share.hash)
except:
log.err(None, 'Error forwarding block solution:')
-
- self.share_received.happened(bitcoin_data.target_to_average_attempts(share.target), not on_time, share.hash)
-
+
+ self.share_received.happened(dash_data.target_to_average_attempts(share.target), not on_time, share.hash)
+
if pow_hash > target:
print 'Worker %s submitted share with hash > target:' % (user,)
print ' Hash: %56x' % (pow_hash,)
@@ -429,14 +429,14 @@ def _(err):
print >>sys.stderr, 'Worker %s submitted share more than once!' % (user,)
else:
received_header_hashes.add(header_hash)
-
- self.pseudoshare_received.happened(bitcoin_data.target_to_average_attempts(target), not on_time, user)
- self.recent_shares_ts_work.append((time.time(), bitcoin_data.target_to_average_attempts(target)))
+
+ self.pseudoshare_received.happened(dash_data.target_to_average_attempts(target), not on_time, user)
+ self.recent_shares_ts_work.append((time.time(), dash_data.target_to_average_attempts(target)))
while len(self.recent_shares_ts_work) > 50:
self.recent_shares_ts_work.pop(0)
- self.local_rate_monitor.add_datum(dict(work=bitcoin_data.target_to_average_attempts(target), dead=not on_time, user=user, share_target=share_info['bits'].target))
- self.local_addr_rate_monitor.add_datum(dict(work=bitcoin_data.target_to_average_attempts(target), pubkey_hash=pubkey_hash))
-
+ self.local_rate_monitor.add_datum(dict(work=dash_data.target_to_average_attempts(target), dead=not on_time, user=user, share_target=share_info['bits'].target))
+ self.local_addr_rate_monitor.add_datum(dict(work=dash_data.target_to_average_attempts(target), pubkey_hash=pubkey_hash))
+
return on_time
-
+
return ba, got_response
diff --git a/py_module/darkcoin-subsidy-python.txt b/py_module/darkcoin-subsidy-python.txt
deleted file mode 100644
index 37eb626..0000000
--- a/py_module/darkcoin-subsidy-python.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-SUBSIDY_FUNC is at
-git clone https://github.com/chaeplin/SUBSIDY_FUNC.git
diff --git a/py_module/dash_hash.txt b/py_module/dash_hash.txt
new file mode 100644
index 0000000..745f797
--- /dev/null
+++ b/py_module/dash_hash.txt
@@ -0,0 +1,2 @@
+dash_hash is available at:
+git clone https://github.com/vertoe/dash_hash.git
diff --git a/py_module/dash_subsidy.txt b/py_module/dash_subsidy.txt
new file mode 100644
index 0000000..6c7412a
--- /dev/null
+++ b/py_module/dash_subsidy.txt
@@ -0,0 +1,2 @@
+dash_subsidy is available at:
+git clone https://github.com/vertoe/dash_subsidy.git
diff --git a/py_module/xcoin-hash.txt b/py_module/xcoin-hash.txt
deleted file mode 100644
index 9021ce4..0000000
--- a/py_module/xcoin-hash.txt
+++ /dev/null
@@ -1,2 +0,0 @@
-xcoin-hash is at
-git clone https://github.com/evan82/xcoin-hash.git
diff --git a/run_darkpool.sh b/run_darkpool.sh
index af0d97f..c9deec5 100755
--- a/run_darkpool.sh
+++ b/run_darkpool.sh
@@ -1,11 +1,11 @@
#!/bin/sh
-SERVICE='python ./run_p2pool.py --net darkcoin'
+SERVICE='python ./run_p2pool.py'
if ps ax | grep -v grep | grep "$SERVICE" > /dev/null
then
echo "$SERVICE is already running!"
else
- screen -d -m -S P2P_DRK_DIFF python ./run_p2pool.py --net darkcoin --give-author 0 --disable-upnp -f 1
+ screen -dmS p2pool-dash python ./run_p2pool.py --give-author 0 --disable-upnp -f 1
wait
fi
diff --git a/setup.py b/setup.py
index d1ed265..fad01c0 100644
--- a/setup.py
+++ b/setup.py
@@ -24,7 +24,7 @@
sys.argv[1:] = ['py2exe']
setup(name='p2pool',
version=version,
- description='Peer-to-peer Bitcoin mining pool',
+ description='Peer-to-peer Dash mining pool',
author='Forrest Voight',
author_email='forrest@forre.st',
url='http://p2pool.forre.st/',
diff --git a/web-static/classic/graphs.html b/web-static/classic/graphs.html
index fda5149..ac0570d 100644
--- a/web-static/classic/graphs.html
+++ b/web-static/classic/graphs.html
@@ -56,7 +56,7 @@ Desired version rates
Traffic rate
- Bitcoind GetBlockTemplate Latency
+ Dashd GetBlockTemplate Latency
Memory Usage
diff --git a/web-static/d3.v2.min.js b/web-static/d3.v2.min.js
new file mode 100644
index 0000000..521c420
--- /dev/null
+++ b/web-static/d3.v2.min.js
@@ -0,0 +1,4 @@
+(function(){function e(a,b){try{for(var c in b)Object.defineProperty(a.prototype,c,{value:b[c],enumerable:!1})}catch(d){a.prototype=b}}function g(a){var b=-1,c=a.length,d=[];while(++b=0?a.substring(b):(b=a.length,""),d=[];while(b>0)d.push(a.substring(b-=3,b+3));return d.reverse().join(",")+c}function I(a,b){return{scale:Math.pow(10,(8-b)*3),symbol:a}}function O(a){return function(b){return b<=0?0:b>=1?1:a(b)}}function P(a){return function(b){return 1-a(1-b)}}function Q(a){return function(b){return.5*(b<.5?a(2*b):2-a(2-2*b))}}function R(a){return a}function S(a){return function(b){return Math.pow(b,a)}}function T(a){return 1-Math.cos(a*Math.PI/2)}function U(a){return Math.pow(2,10*(a-1))}function V(a){return 1-Math.sqrt(1-a*a)}function W(a,b){var c;return arguments.length<2&&(b=.45),arguments.length<1?(a=1,c=b/4):c=b/(2*Math.PI)*Math.asin(1/a),function(d){return 1+a*Math.pow(2,10*-d)*Math.sin((d-c)*2*Math.PI/b)}}function X(a){return a||(a=1.70158),function(b){return b*b*((a+1)*b-a)}}function Y(a){return a<1/2.75?7.5625*a*a:a<2/2.75?7.5625*(a-=1.5/2.75)*a+.75:a<2.5/2.75?7.5625*(a-=2.25/2.75)*a+.9375:7.5625*(a-=2.625/2.75)*a+.984375}function Z(){d3.event.stopPropagation(),d3.event.preventDefault()}function $(){var a=d3.event,b;while(b=a.sourceEvent)a=b;return a}function _(a){var b=new A,c=0,d=arguments.length;while(++c360?a-=360:a<0&&(a+=360),a<60?d+(e-d)*a/60:a<180?e:a<240?d+(e-d)*(240-a)/60:d}function g(a){return Math.round(f(a)*255)}var d,e;return a%=360,a<0&&(a+=360),b=b<0?0:b>1?1:b,c=c<0?0:c>1?1:c,e=c<=.5?c*(1+b):c+b-c*b,d=2*c-e,be(g(a+120),g(a),g(a-120))}function bo(a){return j(a,bu),a}function bv(a){return function(){return bp(a,this)}}function bw(a){return function(){return bq(a,this)}}function by(a,b){function f(){if(b=this.classList)return b.add(a);var b=this.className,d=b.baseVal!=null,e=d?b.baseVal:b;c.lastIndex=0,c.test(e)||(e=w(e+" "+a),d?b.baseVal=e:this.className=e)}function g(){if(b=this.classList)return b.remove(a);var 
b=this.className,d=b.baseVal!=null,e=d?b.baseVal:b;e=w(e.replace(c," ")),d?b.baseVal=e:this.className=e}function h(){(b.apply(this,arguments)?f:g).call(this)}var c=new RegExp("(^|\\s+)"+d3.requote(a)+"(\\s+|$)","g");if(arguments.length<2){var d=this.node();if(e=d.classList)return e.contains(a);var e=d.className;return c.lastIndex=0,c.test(e.baseVal!=null?e.baseVal:e)}return this.each(typeof b=="function"?h:b?f:g)}function bz(a){return{__data__:a}}function bA(a){return function(){return bt(this,a)}}function bB(a){return arguments.length||(a=d3.ascending),function(b,c){return a(b&&b.__data__,c&&c.__data__)}}function bD(a){return j(a,bE),a}function bF(a,b,c){j(a,bJ);var d=new k,e=d3.dispatch("start","end"),f=bR;return a.id=b,a.time=c,a.tween=function(b,c){return arguments.length<2?d.get(b):(c==null?d.remove(b):d.set(b,c),a)},a.ease=function(b){return arguments.length?(f=typeof b=="function"?b:d3.ease.apply(d3,arguments),a):f},a.each=function(b,c){return arguments.length<2?bS.call(a,b):(e.on(b,c),a)},d3.timer(function(g){return a.each(function(h,i,j){function p(a){return o.active>b?r():(o.active=b,d.forEach(function(a,b){(b=b.call(l,h,i))&&k.push(b)}),e.start.call(l,h,i),q(a)||d3.timer(q,0,c),1)}function q(a){if(o.active!==b)return r();var c=(a-m)/n,d=f(c),g=k.length;while(g>0)k[--g].call(l,d);if(c>=1)return r(),bL=b,e.end.call(l,h,i),bL=0,1}function r(){return--o.count||delete l.__transition__,1}var k=[],l=this,m=a[j][i].delay,n=a[j][i].duration,o=l.__transition__||(l.__transition__={active:0,count:0});++o.count,m<=g?p(g):d3.timer(p,m,c)}),1},0,c),a}function bH(a,b,c){return c!=""&&bG}function bI(a,b){function d(a,d,e){var f=b.call(this,a,d);return f==null?e!=""&&bG:e!=f&&c(e,f)}function e(a,d,e){return e!=b&&c(e,b)}var c=bb(a);return typeof b=="function"?d:b==null?bH:(b+="",e)}function bS(a){var b=bL,c=bR,d=bP,e=bQ;bL=this.id,bR=this.ease();for(var f=0,g=this.length;f=c.delay&&(c.flush=c.callback(a)),c=c.next;var 
d=bX()-b;d>24?(isFinite(d)&&(clearTimeout(bV),bV=setTimeout(bW,d)),bU=0):(bU=1,bY(bW))}function bX(){var a=null,b=bT,c=Infinity;while(b)b.flush?b=a?a.next=b.next:bT=b.next:(c=Math.min(c,b.then+b.delay),b=(a=b).next);return c}function bZ(a){var b=[a.a,a.b],c=[a.c,a.d],d=b_(b),e=b$(b,c),f=b_(ca(c,b,-e))||0;b[0]*c[1]2?cq:cp,i=d?bd:bc;return e=g(a,b,i,c),f=g(b,a,i,d3.interpolate),h}function h(a){return e(a)}var e,f;return h.invert=function(a){return f(a)},h.domain=function(b){return arguments.length?(a=b.map(Number),g()):a},h.range=function(a){return arguments.length?(b=a,g()):b},h.rangeRound=function(a){return h.range(a).interpolate(d3.interpolateRound)},h.clamp=function(a){return arguments.length?(d=a,g()):d},h.interpolate=function(a){return arguments.length?(c=a,g()):c},h.ticks=function(b){return cn(a,b)},h.tickFormat=function(b){return co(a,b)},h.nice=function(){return ch(a,cl),g()},h.copy=function(){return cj(a,b,c,d)},g()}function ck(a,b){return d3.rebind(a,b,"range","rangeRound","interpolate","clamp")}function cl(a){return a=Math.pow(10,Math.round(Math.log(a)/Math.LN10)-1),{floor:function(b){return Math.floor(b/a)*a},ceil:function(b){return Math.ceil(b/a)*a}}}function cm(a,b){var c=cf(a),d=c[1]-c[0],e=Math.pow(10,Math.floor(Math.log(d/b)/Math.LN10)),f=b/d*e;return f<=.15?e*=10:f<=.35?e*=5:f<=.75&&(e*=2),c[0]=Math.ceil(c[0]/e)*e,c[1]=Math.floor(c[1]/e)*e+e*.5,c[2]=e,c}function cn(a,b){return d3.range.apply(d3,cm(a,b))}function co(a,b){return d3.format(",."+Math.max(0,-Math.floor(Math.log(cm(a,b)[2])/Math.LN10+.01))+"f")}function cp(a,b,c,d){var e=c(a[0],a[1]),f=d(b[0],b[1]);return function(a){return f(e(a))}}function cq(a,b,c,d){var e=[],f=[],g=0,h=Math.min(a.length,b.length)-1;a[h]0;j--)e.push(c(f)*j)}else{for(;fi;g--);e=e.slice(f,g)}return e},d.tickFormat=function(a,e){arguments.length<2&&(e=cs);if(arguments.length<1)return e;var f=a/d.ticks().length,g=b===cu?(h=-1e-12,Math.floor):(h=1e-12,Math.ceil),h;return function(a){return 
a/c(g(b(a)+h))0?0:-a)/Math.LN10}function cv(a,b){function e(b){return a(c(b))}var c=cw(b),d=cw(1/b);return e.invert=function(b){return d(a.invert(b))},e.domain=function(b){return arguments.length?(a.domain(b.map(c)),e):a.domain().map(d)},e.ticks=function(a){return cn(e.domain(),a)},e.tickFormat=function(a){return co(e.domain(),a)},e.nice=function(){return e.domain(ch(e.domain(),cl))},e.exponent=function(a){if(!arguments.length)return b;var f=e.domain();return c=cw(b=a),d=cw(1/b),e.domain(f)},e.copy=function(){return cv(a.copy(),b)},ck(e,a)}function cw(a){return function(b){return b<0?-Math.pow(-b,a):Math.pow(b,a)}}function cx(a,b){function f(b){return d[((c.get(b)||c.set(b,a.push(b)))-1)%d.length]}function g(b,c){return d3.range(a.length).map(function(a){return b+c*a})}var c,d,e;return f.domain=function(d){if(!arguments.length)return a;a=[],c=new k;var e=-1,g=d.length,h;while(++e1){h=b[1],f=a[i],i++,d+="C"+(e[0]+g[0])+","+(e[1]+g[1])+","+(f[0]-h[0])+","+(f[1]-h[1])+","+f[0]+","+f[1];for(var j=2;j9&&(f=c*3/Math.sqrt(f),g[h]=f*d,g[h+1]=f*e));h=-1;while(++h<=i)f=(a[Math.min(i,h+1)][0]-a[Math.max(0,h-1)][0])/(6*(1+g[h]*g[h])),b.push([f||0,g[h]*f||0]);return b}function di(a){return a.length<3?cQ(a):a[0]+cW(a,dh(a))}function dj(a){var b,c=-1,d=a.length,e,f;while(++c1){var d=cf(a.domain()),e,f=-1,g=b.length,h=(b[1]-b[0])/++c,i,j;while(++f0;)(j=+b[f]-i*h)>=d[0]&&e.push(j);for(--f,i=0;++id&&(c=b,d=e);return c}function ea(a){return a.reduce(eb,0)}function eb(a,b){return a+b[1]}function ec(a,b){return ed(a,Math.ceil(Math.log(b.length)/Math.LN2+1))}function ed(a,b){var c=-1,d=+a[0],e=(a[1]-d)/b,f=[];while(++c<=b)f[c]=e*c+d;return f}function ee(a){return[d3.min(a),d3.max(a)]}function ef(a,b){return d3.rebind(a,b,"sort","children","value"),a.links=ej,a.nodes=function(b){return ek=!0,(a.nodes=a)(b)},a}function eg(a){return a.children}function eh(a){return a.value}function ei(a,b){return b.value-a.value}function ej(a){return 
d3.merge(a.map(function(a){return(a.children||[]).map(function(b){return{source:a,target:b}})}))}function el(a,b){return a.value-b.value}function em(a,b){var c=a._pack_next;a._pack_next=b,b._pack_prev=a,b._pack_next=c,c._pack_prev=b}function en(a,b){a._pack_next=b,b._pack_prev=a}function eo(a,b){var c=b.x-a.x,d=b.y-a.y,e=a.r+b.r;return e*e-c*c-d*d>.001}function ep(a){function l(a){b=Math.min(a.x-a.r,b),c=Math.max(a.x+a.r,c),d=Math.min(a.y-a.r,d),e=Math.max(a.y+a.r,e)}var b=Infinity,c=-Infinity,d=Infinity,e=-Infinity,f=a.length,g,h,i,j,k;a.forEach(eq),g=a[0],g.x=-g.r,g.y=0,l(g);if(f>1){h=a[1],h.x=h.r,h.y=0,l(h);if(f>2){i=a[2],eu(g,h,i),l(i),em(g,i),g._pack_prev=i,em(i,h),h=g._pack_next;for(var m=3;m0&&(a=d)}return a}function eD(a,b){return a.x-b.x}function eE(a,b){return b.x-a.x}function eF(a,b){return a.depth-b.depth}function eG(a,b){function c(a,d){var e=a.children;if(e&&(i=e.length)){var f,g=null,h=-1,i;while(++h=0)f=d[e]._tree,f.prelim+=b,f.mod+=b,b+=f.shift+(c+=f.change)}function eI(a,b,c){a=a._tree,b=b._tree;var d=c/(b.number-a.number);a.change+=d,b.change-=d,b.shift+=c,b.prelim+=c,b.mod+=c}function eJ(a,b,c){return a._tree.ancestor.parent==b.parent?a._tree.ancestor:c}function eK(a){return{x:a.x,y:a.y,dx:a.dx,dy:a.dy}}function eL(a,b){var c=a.x+b[3],d=a.y+b[0],e=a.dx-b[1]-b[3],f=a.dy-b[0]-b[2];return e<0&&(c+=e/2,e=0),f<0&&(d+=f/2,f=0),{x:c,y:d,dx:e,dy:f}}function eM(a){return a.map(eN).join(",")}function eN(a){return/[",\n]/.test(a)?'"'+a.replace(/\"/g,'""')+'"':a}function eP(a,b){return function(c){return c&&a.hasOwnProperty(c.type)?a[c.type](c):b}}function eQ(a){return"m0,"+a+"a"+a+","+a+" 0 1,1 0,"+ -2*a+"a"+a+","+a+" 0 1,1 0,"+2*a+"z"}function eR(a,b){eS.hasOwnProperty(a.type)&&eS[a.type](a,b)}function eT(a,b){eR(a.geometry,b)}function eU(a,b){for(var c=a.features,d=0,e=c.length;d0}function fg(a,b,c){return(c[0]-b[0])*(a[1]-b[1])<(c[1]-b[1])*(a[0]-b[0])}function fh(a,b,c,d){var 
e=a[0],f=b[0],g=c[0],h=d[0],i=a[1],j=b[1],k=c[1],l=d[1],m=e-g,n=f-e,o=h-g,p=i-k,q=j-i,r=l-k,s=(o*p-r*m)/(r*n-o*q);return[e+s*n,i+s*q]}function fj(a,b){var c={list:a.map(function(a,b){return{index:b,x:a[0],y:a[1]}}).sort(function(a,b){return a.yb.y?1:a.xb.x?1:0}),bottomSite:null},d={list:[],leftEnd:null,rightEnd:null,init:function(){d.leftEnd=d.createHalfEdge(null,"l"),d.rightEnd=d.createHalfEdge(null,"l"),d.leftEnd.r=d.rightEnd,d.rightEnd.l=d.leftEnd,d.list.unshift(d.leftEnd,d.rightEnd)},createHalfEdge:function(a,b){return{edge:a,side:b,vertex:null,l:null,r:null}},insert:function(a,b){b.l=a,b.r=a.r,a.r.l=b,a.r=b},leftBound:function(a){var b=d.leftEnd;do b=b.r;while(b!=d.rightEnd&&e.rightOf(b,a));return b=b.l,b},del:function(a){a.l.r=a.r,a.r.l=a.l,a.edge=null},right:function(a){return a.r},left:function(a){return a.l},leftRegion:function(a){return a.edge==null?c.bottomSite:a.edge.region[a.side]},rightRegion:function(a){return a.edge==null?c.bottomSite:a.edge.region[fi[a.side]]}},e={bisect:function(a,b){var c={region:{l:a,r:b},ep:{l:null,r:null}},d=b.x-a.x,e=b.y-a.y,f=d>0?d:-d,g=e>0?e:-e;return c.c=a.x*d+a.y*e+(d*d+e*e)*.5,f>g?(c.a=1,c.b=e/d,c.c/=d):(c.b=1,c.a=d/e,c.c/=e),c},intersect:function(a,b){var c=a.edge,d=b.edge;if(!c||!d||c.region.r==d.region.r)return null;var e=c.a*d.b-c.b*d.a;if(Math.abs(e)<1e-10)return null;var f=(c.c*d.b-d.c*c.b)/e,g=(d.c*c.a-c.c*d.a)/e,h=c.region.r,i=d.region.r,j,k;h.y=k.region.r.x;return l&&j.side==="l"||!l&&j.side==="r"?null:{x:f,y:g}},rightOf:function(a,b){var c=a.edge,d=c.region.r,e=b.x>d.x;if(e&&a.side==="l")return 1;if(!e&&a.side==="r")return 0;if(c.a===1){var f=b.y-d.y,g=b.x-d.x,h=0,i=0;!e&&c.b<0||e&&c.b>=0?i=h=f>=c.b*g:(i=b.x+b.y*c.b>c.c,c.b<0&&(i=!i),i||(h=1));if(!h){var j=d.x-c.region.l.x;i=c.b*(g*g-f*f)m*m+n*n}return a.side==="l"?i:!i},endPoint:function(a,c,d){a.ep[c]=d;if(!a.ep[fi[c]])return;b(a)},distance:function(a,b){var c=a.x-b.x,d=a.y-b.y;return 
Math.sqrt(c*c+d*d)}},f={list:[],insert:function(a,b,c){a.vertex=b,a.ystar=b.y+c;for(var d=0,e=f.list,g=e.length;dh.ystar||a.ystar==h.ystar&&b.x>h.vertex.x)continue;break}e.splice(d,0,a)},del:function(a){for(var b=0,c=f.list,d=c.length;bo.y&&(p=n,n=o,o=p,t="r"),s=e.bisect(n,o),m=d.createHalfEdge(s,t),d.insert(k,m),e.endPoint(s,fi[t],r),q=e.intersect(k,m),q&&(f.del(k),f.insert(k,q,e.distance(q,n))),q=e.intersect(m,l),q&&f.insert(m,q,e.distance(q,n));else break}for(i=d.right(d.leftEnd);i!=d.rightEnd;i=d.right(i))b(i.edge)}function fk(){return{leaf:!0,nodes:[],point:null}}function fl(a,b,c,d,e,f){if(!a(b,c,d,e,f)){var g=(c+e)*.5,h=(d+f)*.5,i=b.nodes;i[0]&&fl(a,i[0],c,d,g,h),i[1]&&fl(a,i[1],g,d,e,h),i[2]&&fl(a,i[2],c,h,g,f),i[3]&&fl(a,i[3],g,h,e,f)}}function fm(a){return{x:a[0],y:a[1]}}function fo(){this._=new Date(arguments.length>1?Date.UTC.apply(this,arguments):arguments[0])}function fq(a,b,c,d){var e,f,g=0,h=b.length,i=c.length;while(g=i)return-1;e=b.charCodeAt(g++);if(e==37){f=fw[b.charAt(g++)];if(!f||(d=f(a,c,d))<0)return-1}else if(e!=c.charCodeAt(d++))return-1}return d}function fx(a,b,c){return fz.test(b.substring(c,c+=3))?c:-1}function fy(a,b,c){fA.lastIndex=0;var d=fA.exec(b.substring(c,c+10));return d?c+=d[0].length:-1}function fC(a,b,c){var d=fD.get(b.substring(c,c+=3).toLowerCase());return d==null?-1:(a.m=d,c)}function fE(a,b,c){fF.lastIndex=0;var d=fF.exec(b.substring(c,c+12));return d?(a.m=fG.get(d[0].toLowerCase()),c+=d[0].length):-1}function fI(a,b,c){return fq(a,fv.c.toString(),b,c)}function fJ(a,b,c){return fq(a,fv.x.toString(),b,c)}function fK(a,b,c){return fq(a,fv.X.toString(),b,c)}function fL(a,b,c){fU.lastIndex=0;var d=fU.exec(b.substring(c,c+4));return d?(a.y=+d[0],c+=d[0].length):-1}function fM(a,b,c){fU.lastIndex=0;var d=fU.exec(b.substring(c,c+2));return d?(a.y=fN()+ +d[0],c+=d[0].length):-1}function fN(){return~~((new Date).getFullYear()/1e3)*1e3}function fO(a,b,c){fU.lastIndex=0;var d=fU.exec(b.substring(c,c+2));return 
d?(a.m=d[0]-1,c+=d[0].length):-1}function fP(a,b,c){fU.lastIndex=0;var d=fU.exec(b.substring(c,c+2));return d?(a.d=+d[0],c+=d[0].length):-1}function fQ(a,b,c){fU.lastIndex=0;var d=fU.exec(b.substring(c,c+2));return d?(a.H=+d[0],c+=d[0].length):-1}function fR(a,b,c){fU.lastIndex=0;var d=fU.exec(b.substring(c,c+2));return d?(a.M=+d[0],c+=d[0].length):-1}function fS(a,b,c){fU.lastIndex=0;var d=fU.exec(b.substring(c,c+2));return d?(a.S=+d[0],c+=d[0].length):-1}function fT(a,b,c){fU.lastIndex=0;var d=fU.exec(b.substring(c,c+3));return d?(a.L=+d[0],c+=d[0].length):-1}function fV(a,b,c){var d=fW.get(b.substring(c,c+=2).toLowerCase());return d==null?-1:(a.p=d,c)}function fX(a){var b=a.getTimezoneOffset(),c=b>0?"-":"+",d=~~(Math.abs(b)/60),e=Math.abs(b)%60;return c+fr(d)+fr(e)}function fZ(a){return a.toISOString()}function f$(a,b,c){function d(b){var c=a(b),d=f(c,1);return b-c1)while(gb?1:a>=b?0:NaN},d3.descending=function(a,b){return ba?1:b>=a?0:NaN},d3.mean=function(a,b){var c=a.length,d,e=0,f=-1,g=0;if(arguments.length===1)while(++f1&&(a=a.map(b)),a=a.filter(s),a.length?d3.quantile(a.sort(d3.ascending),.5):undefined},d3.min=function(a,b){var c=-1,d=a.length,e,f;if(arguments.length===1){while(++cf&&(e=f)}else{while(++cf&&(e=f)}return e},d3.max=function(a,b){var c=-1,d=a.length,e,f;if(arguments.length===1){while(++ce&&(e=f)}else{while(++ce&&(e=f)}return e},d3.extent=function(a,b){var c=-1,d=a.length,e,f,g;if(arguments.length===1){while(++cf&&(e=f),gf&&(e=f),g1);return a+b*c*Math.sqrt(-2*Math.log(e)/e)}}},d3.sum=function(a,b){var c=0,d=a.length,e,f=-1;if(arguments.length===1)while(++f>1;a.call(b,b[f],f)>1;c0&&(e=f);return e},d3.last=function(a,b){var c=0,d=a.length,e=a[0],f;arguments.length===1&&(b=d3.ascending);while(++c=b.length)return e?e.call(a,c):d?c.sort(d):c;var h=-1,i=c.length,j=b[g++],l,m,n=new k,o,p={};while(++h=b.length)return a;var e=[],f=c[d++],h;for(h in a)e.push({key:h,values:g(a[h],d)});return f&&e.sort(function(a,b){return f(a.key,b.key)}),e}var 
a={},b=[],c=[],d,e;return a.map=function(a){return f(a,0)},a.entries=function(a){return g(f(a,0),0)},a.key=function(c){return b.push(c),a},a.sortKeys=function(d){return c[b.length-1]=d,a},a.sortValues=function(b){return d=b,a},a.rollup=function(b){return e=b,a},a},d3.keys=function(a){var b=[];for(var c in a)b.push(c);return b},d3.values=function(a){var b=[];for(var c in a)b.push(a[c]);return b},d3.entries=function(a){var b=[];for(var c in a)b.push({key:c,value:a[c]});return b},d3.permute=function(a,b){var c=[],d=-1,e=b.length;while(++db)d.push(g/e);else while((g=a+c*++f)=200&&a<300||a===304?d:null)}},d.send(null)},d3.text=function(a,b,c){function d(a){c(a&&a.responseText)}arguments.length<3&&(c=b,b=null),d3.xhr(a,b,d)},d3.json=function(a,b){d3.text(a,"application/json",function(a){b(a?JSON.parse(a):null)})},d3.html=function(a,b){d3.text(a,"text/html",function(a){if(a!=null){var c=document.createRange();c.selectNode(document.body),a=c.createContextualFragment(a)}b(a)})},d3.xml=function(a,b,c){function d(a){c(a&&a.responseXML)}arguments.length<3&&(c=b,b=null),d3.xhr(a,b,d)};var z={svg:"http://www.w3.org/2000/svg",xhtml:"http://www.w3.org/1999/xhtml",xlink:"http://www.w3.org/1999/xlink",xml:"http://www.w3.org/XML/1998/namespace",xmlns:"http://www.w3.org/2000/xmlns/"};d3.ns={prefix:z,qualify:function(a){var b=a.indexOf(":"),c=a;return b>=0&&(c=a.substring(0,b),a=a.substring(b+1)),z.hasOwnProperty(c)?{space:z[c],local:a}:a}},d3.dispatch=function(){var a=new A,b=-1,c=arguments.length;while(++b0&&(d=a.substring(c+1),a=a.substring(0,c)),arguments.length<2?this[a].on(d):this[a].on(d,b)},d3.format=function(a){var b=C.exec(a),c=b[1]||" ",d=b[3]||"",e=b[5],f=+b[6],g=b[7],h=b[8],i=b[9],j=1,k="",l=!1;h&&(h=+h.substring(1)),e&&(c="0",g&&(f-=Math.floor((f-1)/4)));switch(i){case"n":g=!0,i="g";break;case"%":j=100,k="%",i="f";break;case"p":j=100,k="%",i="r";break;case"d":l=!0,h=0;break;case"s":j=-1,i="r"}return i=="r"&&!h&&(i="g"),i=D.get(i)||F,function(a){if(l&&a%1)return"";var 
b=a<0&&(a=-a)?"−":d;if(j<0){var m=d3.formatPrefix(a,h);a*=m.scale,k=m.symbol}else a*=j;a=i(a,h);if(e){var n=a.length+b.length;n=^]))?([+\- ])?(#)?(0)?([0-9]+)?(,)?(\.[0-9]+)?([a-zA-Z%])?/,D=d3.map({g:function(a,b){return a.toPrecision(b)},e:function(a,b){return a.toExponential(b)},f:function(a,b){return a.toFixed(b)},r:function(a,b){return d3.round(a,b=E(a,b)).toFixed(Math.max(0,Math.min(20,b)))}}),H=["y","z","a","f","p","n","μ","m","","k","M","G","T","P","E","Z","Y"].map(I);d3.formatPrefix=function(a,b){var c=0;return a&&(a<0&&(a*=-1),b&&(a=d3.round(a,E(a,b))),c=1+Math.floor(1e-12+Math.log(a)/Math.LN10),c=Math.max(-24,Math.min(24,Math.floor((c<=0?c+1:c-1)/3)*3))),H[8+c/3]};var J=S(2),K=S(3),L=function(){return R},M=d3.map({linear:L,poly:S,quad:function(){return J},cubic:function(){return K},sin:function(){return T},exp:function(){return U},circle:function(){return V},elastic:W,back:X,bounce:function(){return Y}}),N=d3.map({"in":R,out:P,"in-out":Q,"out-in":function(a){return Q(P(a))}});d3.ease=function(a){var b=a.indexOf("-"),c=b>=0?a.substring(0,b):a,d=b>=0?a.substring(b+1):"in";return c=M.get(c)||L,d=N.get(d)||R,O(d(c.apply(null,Array.prototype.slice.call(arguments,1))))},d3.event=null,d3.interpolate=function(a,b){var c=d3.interpolators.length,d;while(--c>=0&&!(d=d3.interpolators[c](a,b)));return d},d3.interpolateNumber=function(a,b){return b-=a,function(c){return a+b*c}},d3.interpolateRound=function(a,b){return b-=a,function(c){return Math.round(a+b*c)}},d3.interpolateString=function(a,b){var c,d,e,f=0,g=0,h=[],i=[],j,k;ba.lastIndex=0;for(d=0;c=ba.exec(b);++d)c.index&&h.push(b.substring(f,g=c.index)),i.push({i:h.length,x:c[0]}),h.push(null),f=ba.lastIndex;f1){while(++e=0;)if(f=c[d])e&&e!==f.nextSibling&&e.parentNode.insertBefore(f,e),e=f;return this},bu.sort=function(a){a=bB.apply(this,arguments);for(var b=-1,c=this.length;++b0&&(a=a.substring(0,e)),arguments.length<2?(e=this.node()[d])&&e._:this.each(function(e,f){function i(a){var 
c=d3.event;d3.event=a;try{b.call(g,g.__data__,f)}finally{d3.event=c}}var g=this,h=g[d];h&&(g.removeEventListener(a,h,h.$),delete g[d]),b&&(g.addEventListener(a,g[d]=i,i.$=c),i._=b)})},bu.each=function(a){for(var b=-1,c=this.length;++b=cG?e?"M0,"+f+"A"+f+","+f+" 0 1,1 0,"+ -f+"A"+f+","+f+" 0 1,1 0,"+f+"M0,"+e+"A"+e+","+e+" 0 1,0 0,"+ -e+"A"+e+","+e+" 0 1,0 0,"+e+"Z":"M0,"+f+"A"+f+","+f+" 0 1,1 0,"+ -f+"A"+f+","+f+" 0 1,1 0,"+f+"Z":e?"M"+f*k+","+f*l+"A"+f+","+f+" 0 "+j+",1 "+f*m+","+f*n+"L"+e*m+","+e*n+"A"+e+","+e+" 0 "+j+",0 "+e*k+","+e*l+"Z":"M"+f*k+","+f*l+"A"+f+","+f+" 0 "+j+",1 "+f*m+","+f*n+"L0,0"+"Z"}var a=cH,b=cI,c=cJ,d=cK;return e.innerRadius=function(b){return arguments.length?(a=q(b),e):a},e.outerRadius=function(a){return arguments.length?(b=q(a),e):b},e.startAngle=function(a){return arguments.length?(c=q(a),e):c},e.endAngle=function(a){return arguments.length?(d=q(a),e):d},e.centroid=function(){var e=(a.apply(this,arguments)+b.apply(this,arguments))/2,f=(c.apply(this,arguments)+d.apply(this,arguments))/2+cF;return[Math.cos(f)*e,Math.sin(f)*e]},e};var cF=-Math.PI/2,cG=2*Math.PI-1e-6;d3.svg.line=function(){return cL(n)};var cO="linear",cP=d3.map({linear:cQ,"step-before":cR,"step-after":cS,basis:cY,"basis-open":cZ,"basis-closed":c$,bundle:c_,cardinal:cV,"cardinal-open":cT,"cardinal-closed":cU,monotone:di}),db=[0,2/3,1/3,0],dc=[0,1/3,2/3,0],dd=[0,1/6,2/3,1/6];d3.svg.line.radial=function(){var a=cL(dj);return a.radius=a.x,delete a.x,a.angle=a.y,delete a.y,a},cR.reverse=cS,cS.reverse=cR,d3.svg.area=function(){return dk(Object)},d3.svg.area.radial=function(){var a=dk(dj);return a.radius=a.x,delete a.x,a.innerRadius=a.x0,delete a.x0,a.outerRadius=a.x1,delete a.x1,a.angle=a.y,delete a.y,a.startAngle=a.y0,delete a.y0,a.endAngle=a.y1,delete a.y1,a},d3.svg.chord=function(){function f(c,d){var e=g(this,a,c,d),f=g(this,b,c,d);return"M"+e.p0+i(e.r,e.p1,e.a1-e.a0)+(h(e,f)?j(e.r,e.p1,e.r,e.p0):j(e.r,e.p1,f.r,f.p0)+i(f.r,f.p1,f.a1-f.a0)+j(f.r,f.p1,e.r,e.p0))+"Z"}function 
g(a,b,f,g){var h=b.call(a,f,g),i=c.call(a,h,g),j=d.call(a,h,g)+cF,k=e.call(a,h,g)+cF;return{r:i,a0:j,a1:k,p0:[i*Math.cos(j),i*Math.sin(j)],p1:[i*Math.cos(k),i*Math.sin(k)]}}function h(a,b){return a.a0==b.a0&&a.a1==b.a1}function i(a,b,c){return"A"+a+","+a+" 0 "+ +(c>Math.PI)+",1 "+b}function j(a,b,c,d){return"Q 0,0 "+d}var a=dl,b=dm,c=dn,d=cJ,e=cK;return f.radius=function(a){return arguments.length?(c=q(a),f):c},f.source=function(b){return arguments.length?(a=q(b),f):a},f.target=function(a){return arguments.length?(b=q(a),f):b},f.startAngle=function(a){return arguments.length?(d=q(a),f):d},f.endAngle=function(a){return arguments.length?(e=q(a),f):e},f},d3.svg.diagonal=function(){function d(d,e){var f=a.call(this,d,e),g=b.call(this,d,e),h=(f.y+g.y)/2,i=[f,{x:f.x,y:h},{x:g.x,y:h},g];return i=i.map(c),"M"+i[0]+"C"+i[1]+" "+i[2]+" "+i[3]}var a=dl,b=dm,c=dr;return d.source=function(b){return arguments.length?(a=q(b),d):a},d.target=function(a){return arguments.length?(b=q(a),d):b},d.projection=function(a){return arguments.length?(c=a,d):c},d},d3.svg.diagonal.radial=function(){var a=d3.svg.diagonal(),b=dr,c=a.projection;return a.projection=function(a){return arguments.length?c(ds(b=a)):b},a},d3.svg.mouse=d3.mouse,d3.svg.touches=d3.touches,d3.svg.symbol=function(){function c(c,d){return(dw.get(a.call(this,c,d))||dv)(b.call(this,c,d))}var a=du,b=dt;return c
+.type=function(b){return arguments.length?(a=q(b),c):a},c.size=function(a){return arguments.length?(b=q(a),c):b},c};var dw=d3.map({circle:dv,cross:function(a){var b=Math.sqrt(a/5)/2;return"M"+ -3*b+","+ -b+"H"+ -b+"V"+ -3*b+"H"+b+"V"+ -b+"H"+3*b+"V"+b+"H"+b+"V"+3*b+"H"+ -b+"V"+b+"H"+ -3*b+"Z"},diamond:function(a){var b=Math.sqrt(a/(2*dy)),c=b*dy;return"M0,"+ -b+"L"+c+",0"+" 0,"+b+" "+ -c+",0"+"Z"},square:function(a){var b=Math.sqrt(a)/2;return"M"+ -b+","+ -b+"L"+b+","+ -b+" "+b+","+b+" "+ -b+","+b+"Z"},"triangle-down":function(a){var b=Math.sqrt(a/dx),c=b*dx/2;return"M0,"+c+"L"+b+","+ -c+" "+ -b+","+ -c+"Z"},"triangle-up":function(a){var b=Math.sqrt(a/dx),c=b*dx/2;return"M0,"+ -c+"L"+b+","+c+" "+ -b+","+c+"Z"}});d3.svg.symbolTypes=dw.keys();var dx=Math.sqrt(3),dy=Math.tan(30*Math.PI/180);d3.svg.axis=function(){function k(k){k.each(function(){var k=d3.select(this),l=h==null?a.ticks?a.ticks.apply(a,g):a.domain():h,m=i==null?a.tickFormat?a.tickFormat.apply(a,g):String:i,n=dB(a,l,j),o=k.selectAll(".minor").data(n,String),p=o.enter().insert("line","g").attr("class","tick 
minor").style("opacity",1e-6),q=d3.transition(o.exit()).style("opacity",1e-6).remove(),r=d3.transition(o).style("opacity",1),s=k.selectAll("g").data(l,String),t=s.enter().insert("g","path").style("opacity",1e-6),u=d3.transition(s.exit()).style("opacity",1e-6).remove(),v=d3.transition(s).style("opacity",1),w,x=cg(a),y=k.selectAll(".domain").data([0]),z=y.enter().append("path").attr("class","domain"),A=d3.transition(y),B=a.copy(),C=this.__chart__||B;this.__chart__=B,t.append("line").attr("class","tick"),t.append("text"),v.select("text").text(m);switch(b){case"bottom":w=dz,p.attr("y2",d),r.attr("x2",0).attr("y2",d),t.select("line").attr("y2",c),t.select("text").attr("y",Math.max(c,0)+f),v.select("line").attr("x2",0).attr("y2",c),v.select("text").attr("x",0).attr("y",Math.max(c,0)+f).attr("dy",".71em").attr("text-anchor","middle"),A.attr("d","M"+x[0]+","+e+"V0H"+x[1]+"V"+e);break;case"top":w=dz,p.attr("y2",-d),r.attr("x2",0).attr("y2",-d),t.select("line").attr("y2",-c),t.select("text").attr("y",-(Math.max(c,0)+f)),v.select("line").attr("x2",0).attr("y2",-c),v.select("text").attr("x",0).attr("y",-(Math.max(c,0)+f)).attr("dy","0em").attr("text-anchor","middle"),A.attr("d","M"+x[0]+","+ -e+"V0H"+x[1]+"V"+ -e);break;case"left":w=dA,p.attr("x2",-d),r.attr("x2",-d).attr("y2",0),t.select("line").attr("x2",-c),t.select("text").attr("x",-(Math.max(c,0)+f)),v.select("line").attr("x2",-c).attr("y2",0),v.select("text").attr("x",-(Math.max(c,0)+f)).attr("y",0).attr("dy",".32em").attr("text-anchor","end"),A.attr("d","M"+ -e+","+x[0]+"H0V"+x[1]+"H"+ -e);break;case"right":w=dA,p.attr("x2",d),r.attr("x2",d).attr("y2",0),t.select("line").attr("x2",c),t.select("text").attr("x",Math.max(c,0)+f),v.select("line").attr("x2",c).attr("y2",0),v.select("text").attr("x",Math.max(c,0)+f).attr("y",0).attr("dy",".32em").attr("text-anchor","start"),A.attr("d","M"+e+","+x[0]+"H0V"+x[1]+"H"+e)}if(a.ticks)t.call(w,C),v.call(w,B),u.call(w,B),p.call(w,C),r.call(w,B),q.call(w,B);else{var 
D=B.rangeBand()/2,E=function(a){return B(a)+D};t.call(w,E),v.call(w,E)}})}var a=d3.scale.linear(),b="bottom",c=6,d=6,e=6,f=3,g=[10],h=null,i,j=0;return k.scale=function(b){return arguments.length?(a=b,k):a},k.orient=function(a){return arguments.length?(b=a,k):b},k.ticks=function(){return arguments.length?(g=arguments,k):g},k.tickValues=function(a){return arguments.length?(h=a,k):h},k.tickFormat=function(a){return arguments.length?(i=a,k):i},k.tickSize=function(a,b,f){if(!arguments.length)return c;var g=arguments.length-1;return c=+a,d=g>1?+b:c,e=g>0?+arguments[g]:c,k},k.tickPadding=function(a){return arguments.length?(f=+a,k):f},k.tickSubdivide=function(a){return arguments.length?(j=+a,k):j},k},d3.svg.brush=function(){function g(a){a.each(function(){var a=d3.select(this),e=a.selectAll(".background").data([0]),f=a.selectAll(".extent").data([0]),l=a.selectAll(".resize").data(d,String),m;a.style("pointer-events","all").on("mousedown.brush",k).on("touchstart.brush",k),e.enter().append("rect").attr("class","background").style("visibility","hidden").style("cursor","crosshair"),f.enter().append("rect").attr("class","extent").style("cursor","move"),l.enter().append("g").attr("class",function(a){return"resize "+a}).style("cursor",function(a){return dC[a]}).append("rect").attr("x",function(a){return/[ew]$/.test(a)?-3:null}).attr("y",function(a){return/^[ns]/.test(a)?-3:null}).attr("width",6).attr("height",6).style("visibility","hidden"),l.style("display",g.empty()?"none":null),l.exit().remove(),b&&(m=cg(b),e.attr("x",m[0]).attr("width",m[1]-m[0]),i(a)),c&&(m=cg(c),e.attr("y",m[0]).attr("height",m[1]-m[0]),j(a)),h(a)})}function h(a){a.selectAll(".resize").attr("transform",function(a){return"translate("+e[+/e$/.test(a)][0]+","+e[+/^s/.test(a)][1]+")"})}function i(a){a.select(".extent").attr("x",e[0][0]),a.selectAll(".extent,.n>rect,.s>rect").attr("width",e[1][0]-e[0][0])}function 
j(a){a.select(".extent").attr("y",e[0][1]),a.selectAll(".extent,.e>rect,.w>rect").attr("height",e[1][1]-e[0][1])}function k(){function x(){var a=d3.event.changedTouches;return a?d3.touches(d,a)[0]:d3.mouse(d)}function y(){d3.event.keyCode==32&&(q||(r=null,s[0]-=e[1][0],s[1]-=e[1][1],q=2),Z())}function z(){d3.event.keyCode==32&&q==2&&(s[0]+=e[1][0],s[1]+=e[1][1],q=0,Z())}function A(){var a=x(),d=!1;t&&(a[0]+=t[0],a[1]+=t[1]),q||(d3.event.altKey?(r||(r=[(e[0][0]+e[1][0])/2,(e[0][1]+e[1][1])/2]),s[0]=e[+(a[0]0?e=c:e=0:c>0&&(b.start({type:"start",alpha:e=c}),d3.timer(a.tick)),a):e},a.start=function(){function q(a,c){var d=t(b),e=-1,f=d.length,g;while(++ee&&(e=h),d.push(h)}for(g=0;g0){f=-1;while(++f=i[0]&&o<=i[1]&&(k=g[d3.bisect(j,o,1,m)-1],k.y+=n,k.push(e[f]))}return g}var a=!0,b=Number,c=ee,d=ec;return e.value=function(a){return arguments.length?(b=a,e):b},e.range=function(a){return arguments.length?(c=q(a),e):c},e.bins=function(a){return arguments.length?(d=typeof a=="number"?function(b){return ed(b,a)}:q(a),e):d},e.frequency=function(b){return arguments.length?(a=!!b,e):a},e},d3.layout.hierarchy=function(){function e(f,h,i){var j=b.call(g,f,h),k=ek?f:{data:f};k.depth=h,i.push(k);if(j&&(m=j.length)){var l=-1,m,n=k.children=[],o=0,p=h+1;while(++l0&&(eI(eJ(g,a,d),a,m),i+=m,j+=m),k+=g._tree.mod,i+=e._tree.mod,l+=h._tree.mod,j+=f._tree.mod;g&&!eB(f)&&(f._tree.thread=g,f._tree.mod+=k-j),e&&!eA(h)&&(h._tree.thread=e,h._tree.mod+=i-l,d=a)}return d}var f=a.call(this,d,e),g=f[0];eG(g,function(a,b){a._tree={ancestor:a,prelim:0,mod:0,change:0,shift:0,number:b?b._tree.number+1:0}}),h(g),i(g,-g._tree.prelim);var k=eC(g,eE),l=eC(g,eD),m=eC(g,eF),n=k.x-b(k,l)/2,o=l.x+b(l,k)/2,p=m.depth||1;return eG(g,function(a){a.x=(a.x-n)/(o-n)*c[0],a.y=a.depth/p*c[1],delete a._tree}),f}var a=d3.layout.hierarchy().sort(null).value(null),b=ez,c=[1,1];return d.separation=function(a){return arguments.length?(b=a,d):b},d.size=function(a){return 
arguments.length?(c=a,d):c},ef(d,a)},d3.layout.treemap=function(){function i(a,b){var c=-1,d=a.length,e,f;while(++c0)d.push(g=f[o-1]),d.area+=g.area,(k=l(d,n))<=h?(f.pop(),h=k):(d.area-=d.pop().area,m(d,n,c,!1),n=Math.min(c.dx,c.dy),d.length=d.area=0,h=Infinity);d.length&&(m(d,n,c,!0),d.length=d.area=0),b.forEach(j)}}function k(a){var b=a.children;if(b&&b.length){var c=e(a),d=b.slice(),f,g=[];i(d,c.dx*c.dy/a.value),g.area=0;while(f=d.pop())g.push(f),g.area+=f.area,f.z!=null&&(m(g,f.z?c.dx:c.dy,c,!d.length),g.length=g.area=0);b.forEach(k)}}function l(a,b){var c=a.area,d,e=0,f=Infinity,g=-1,i=a.length;while(++ge&&(e=d)}return c*=c,b*=b,c?Math.max(b*e*h/c,c/(b*f*h)):Infinity}function m(a,c,d,e){var f=-1,g=a.length,h=d.x,i=d.y,j=c?b(a.area/c):0,k;if(c==d.dx){if(e||j>d.dy)j=d.dy;while(++fd.dx)j=d.dx;while(++f=a.length)return d;if(i)return i=!1,c;var b=f.lastIndex;if(a.charCodeAt(b)===34){var e=b;while(e++50?b:f<-140?c:g<21?d:a)(e)}var a=d3.geo.albers(),b=d3.geo.albers().origin([-160,60]).parallels([55,65]),c=d3.geo.albers().origin([-160,20]).parallels([8,18]),d=d3.geo.albers().origin([-60,10]).parallels([8,18]);return e.scale=function(f){return arguments.length?(a.scale(f),b.scale(f*.6),c.scale(f),d.scale(f*1.5),e.translate(a.translate())):a.scale()},e.translate=function(f){if(!arguments.length)return a.translate();var g=a.scale()/1e3,h=f[0],i=f[1];return a.translate(f),b.translate([h-400*g,i+170*g]),c.translate([h-190*g,i+200*g]),d.translate([h+580*g,i+430*g]),e},e.scale(a.scale())},d3.geo.bonne=function(){function g(g){var h=g[0]*eO-c,i=g[1]*eO-d;if(e){var j=f+e-i,k=h*Math.cos(i)/j;h=j*Math.sin(k),i=j*Math.cos(k)-f}else h*=Math.cos(i),i*=-1;return[a*h+b[0],a*i+b[1]]}var a=200,b=[480,250],c,d,e,f;return g.invert=function(d){var g=(d[0]-b[0])/a,h=(d[1]-b[1])/a;if(e){var i=f+h,j=Math.sqrt(g*g+i*i);h=f+e-j,g=c+j*Math.atan2(g,i)/Math.cos(h)}else h*=-1,g/=Math.cos(h);return[g/eO,h/eO]},g.parallel=function(a){return 
arguments.length?(f=1/Math.tan(e=a*eO),g):e/eO},g.origin=function(a){return arguments.length?(c=a[0]*eO,d=a[1]*eO,g):[c/eO,d/eO]},g.scale=function(b){return arguments.length?(a=+b,g):a},g.translate=function(a){return arguments.length?(b=[+a[0],+a[1]],g):b},g.origin([0,0]).parallel(45)},d3.geo.equirectangular=function(){function c(c){var d=c[0]/360,e=-c[1]/360;return[a*d+b[0],a*e+b[1]]}var a=500,b=[480,250];return c.invert=function(c){var d=(c[0]-b[0])/a,e=(c[1]-b[1])/a;return[360*d,-360*e]},c.scale=function(b){return arguments.length?(a=+b,c):a},c.translate=function(a){return arguments.length?(b=[+a[0],+a[1]],c):b},c},d3.geo.mercator=function(){function c(c){var d=c[0]/360,e=-(Math.log(Math.tan(Math.PI/4+c[1]*eO/2))/eO)/360;return[a*d+b[0],a*Math.max(-0.5,Math.min(.5,e))+b[1]]}var a=500,b=[480,250];return c.invert=function(c){var d=(c[0]-b[0])/a,e=(c[1]-b[1])/a;return[360*d,2*Math.atan(Math.exp(-360*e*eO))/eO-90]},c.scale=function(b){return arguments.length?(a=+b,c):a},c.translate=function(a){return arguments.length?(b=[+a[0],+a[1]],c):b},c},d3.geo.path=function(){function d(c,d){return typeof a=="function"&&(b=eQ(a.apply(this,arguments))),f(c)||null}function e(a){return c(a).join(",")}function h(a){var b=k(a[0]),c=0,d=a.length;while(++c0){b.push("M");while(++h0){b.push("M");while(++kd&&(d=a),fe&&(e=f)}),[[b,c],[d,e]]};var eS={Feature:eT,FeatureCollection:eU,GeometryCollection:eV,LineString:eW,MultiLineString:eX,MultiPoint:eW,MultiPolygon:eY,Point:eZ,Polygon:e$};d3.geo.circle=function(){function e(){}function f(a){return d.distance(a)=k*k+l*l?d[f].index=-1:(d[m].index=-1,o=d[f].angle,m=f,n=g)):(o=d[f].angle,m=f,n=g);e.push(h);for(f=0,g=0;f<2;++g)d[g].index!==-1&&(e.push(d[g].index),f++);p=e.length;for(;g=0?(c=a.ep.r,d=a.ep.l):(c=a.ep.l,d=a.ep.r),a.a===1?(g=c?c.y:-1e6,e=a.c-a.b*g,h=d?d.y:1e6,f=a.c-a.b*h):(e=c?c.x:-1e6,g=a.c-a.a*e,f=d?d.x:1e6,h=a.c-a.a*f);var i=[e,g],j=[f,h];b[a.region.l.index].push(i,j),b[a.region.r.index].push(i,j)}),b.map(function(b,c){var 
d=a[c][0],e=a[c][1];return b.forEach(function(a){a.angle=Math.atan2(a[0]-d,a[1]-e)}),b.sort(function(a,b){return a.angle-b.angle}).filter(function(a,c){return!c||a.angle-b[c-1].angle>1e-10})})};var fi={l:"r",r:"l"};d3.geom.delaunay=function(a){var b=a.map(function(){return[]}),c=[];return fj(a,function(c){b[c.region.l.index].push(a[c.region.r.index])}),b.forEach(function(b,d){var e=a[d],f=e[0],g=e[1];b.forEach(function(a){a.angle=Math.atan2(a[0]-f,a[1]-g)}),b.sort(function(a,b){return a.angle-b.angle});for(var h=0,i=b.length-1;h=g,j=b.y>=h,l=(j<<1)+i;a.leaf=!1,a=a.nodes[l]||(a.nodes[l]=fk()),i?c=g:e=g,j?d=h:f=h,k(a,b,c,d,e,f)}var f,g=-1,h=a.length;h&&isNaN(a[0].x)&&(a=a.map(fm));if(arguments.length<5)if(arguments.length===3)e=d=c,c=b;else{b=c=Infinity,d=e=-Infinity;while(++gd&&(d=f.x),f.y>e&&(e=f.y);var i=d-b,j=e-c;i>j?e=c+i:d=b+j}var m=fk();return m.add=function(a){k(m,a,b,c,d,e)},m.visit=function(a){fl(a,m,b,c,d,e)},a.forEach(m.add),m},d3.time={};var fn=Date;fo.prototype={getDate:function(){return this._.getUTCDate()},getDay:function(){return this._.getUTCDay()},getFullYear:function(){return this._.getUTCFullYear()},getHours:function(){return this._.getUTCHours()},getMilliseconds:function(){return this._.getUTCMilliseconds()},getMinutes:function(){return this._.getUTCMinutes()},getMonth:function(){return this._.getUTCMonth()},getSeconds:function(){return this._.getUTCSeconds()},getTime:function(){return this._.getTime()},getTimezoneOffset:function(){return 0},valueOf:function(){return 
this._.valueOf()},setDate:function(){fp.setUTCDate.apply(this._,arguments)},setDay:function(){fp.setUTCDay.apply(this._,arguments)},setFullYear:function(){fp.setUTCFullYear.apply(this._,arguments)},setHours:function(){fp.setUTCHours.apply(this._,arguments)},setMilliseconds:function(){fp.setUTCMilliseconds.apply(this._,arguments)},setMinutes:function(){fp.setUTCMinutes.apply(this._,arguments)},setMonth:function(){fp.setUTCMonth.apply(this._,arguments)},setSeconds:function(){fp.setUTCSeconds.apply(this._,arguments)},setTime:function(){fp.setTime.apply(this._,arguments)}};var fp=Date.prototype;d3.time.format=function(a){function c(c){var d=[],e=-1,f=0,g,h;while(++e=12?"PM":"AM"},S:function(a){return fr(a.getSeconds())},U:function(a){return fr(d3.time.sundayOfYear(a))},w:function(a){return a.getDay()},W:function(a){return fr(d3.time.mondayOfYear(a))},x:d3.time.format("%m/%d/%y"),X:d3.time.format("%H:%M:%S"),y:function(a){return fr(a.getFullYear()%100)},Y:function(a){return ft(a.getFullYear()%1e4)},Z:fX,"%":function(a){return"%"}},fw={a:fx,A:fy,b:fC,B:fE,c:fI,d:fP,e:fP,H:fQ,I:fQ,L:fT,m:fO,M:fR,p:fV,S:fS,x:fJ,X:fK,y:fM,Y:fL},fz=/^(?:sun|mon|tue|wed|thu|fri|sat)/i,fA=/^(?:Sunday|Monday|Tuesday|Wednesday|Thursday|Friday|Saturday)/i,fB=["Sunday","Monday","Tuesday","Wednesday","Thursday","Friday","Saturday"],fD=d3.map({jan:0,feb:1,mar:2,apr:3,may:4,jun:5,jul:6,aug:7,sep:8,oct:9,nov:10,dec:11}),fF=/^(?:January|February|March|April|May|June|July|August|September|October|November|December)/ig,fG=d3.map({january:0,february:1,march:2,april:3,may:4,june:5,july:6,august:7,september:8,october:9,november:10,december:11}),fH=["January","February","March","April","May","June","July","August","September","October","November","December"],fU=/\s*\d+/,fW=d3.map({am:0,pm:1});d3.time.format.utc=function(a){function c(a){try{fn=fo;var c=new fn;return c._=a,b(c)}finally{fn=Date}}var b=d3.time.format(a);return c.parse=function(a){try{fn=fo;var c=b.parse(a);return 
c&&c._}finally{fn=Date}},c.toString=b.toString,c};var fY=d3.time.format.utc("%Y-%m-%dT%H:%M:%S.%LZ");d3.time.format.iso=Date.prototype.toISOString?fZ:fY,fZ.parse=function(a){return new Date(a)},fZ.toString=fY.toString,d3.time.second=f$(function(a){return new fn(Math.floor(a/1e3)*1e3)},function(a,b){a.setTime(a.getTime()+Math.floor(b)*1e3)},function(a){return a.getSeconds()}),d3.time.seconds=d3.time.second.range,d3.time.seconds.utc=d3.time.second.utc.range,d3.time.minute=f$(function(a){return new fn(Math.floor(a/6e4)*6e4)},function(a,b){a.setTime(a.getTime()+Math.floor(b)*6e4)},function(a){return a.getMinutes()}),d3.time.minutes=d3.time.minute.range,d3.time.minutes.utc=d3.time.minute.utc.range,d3.time.hour=f$(function(a){var b=a.getTimezoneOffset()/60;return new fn((Math.floor(a/36e5-b)+b)*36e5)},function(a,b){a.setTime(a.getTime()+Math.floor(b)*36e5)},function(a){return a.getHours()}),d3.time.hours=d3.time.hour.range,d3.time.hours.utc=d3.time.hour.utc.range,d3.time.day=f$(function(a){return new fn(a.getFullYear(),a.getMonth(),a.getDate())},function(a,b){a.setDate(a.getDate()+b)},function(a){return a.getDate()-1}),d3.time.days=d3.time.day.range,d3.time.days.utc=d3.time.day.utc.range,d3.time.dayOfYear=function(a){var b=d3.time.year(a);return Math.floor((a-b)/864e5-(a.getTimezoneOffset()-b.getTimezoneOffset())/1440)},fB.forEach(function(a,b){a=a.toLowerCase(),b=7-b;var c=d3.time[a]=f$(function(a){return(a=d3.time.day(a)).setDate(a.getDate()-(a.getDay()+b)%7),a},function(a,b){a.setDate(a.getDate()+Math.floor(b)*7)},function(a){var c=d3.time.year(a).getDay();return Math.floor((d3.time.dayOfYear(a)+(c+b)%7)/7)-(c!==b)});d3.time[a+"s"]=c.range,d3.time[a+"s"].utc=c.utc.range,d3.time[a+"OfYear"]=function(a){var c=d3.time.year(a).getDay();return Math.floor((d3.time.dayOfYear(a)+(c+b)%7)/7)}}),d3.time.week=d3.time.sunday,d3.time.weeks=d3.time.sunday.range,d3.time.weeks.utc=d3.time.sunday.utc.range,d3.time.weekOfYear=d3.time.sundayOfYear,d3.time.month=f$(function(a){return new 
fn(a.getFullYear(),a.getMonth(),1)},function(a,b){a.setMonth(a.getMonth()+b)},function(a){return a.getMonth()}),d3.time.months=d3.time.month.range,d3.time.months.utc=d3.time.month.utc.range,d3.time.year=f$(function(a){return new fn(a.getFullYear(),0,1)},function(a,b){a.setFullYear(a.getFullYear()+b)},function(a){return a.getFullYear()}),d3.time.years=d3.time.year.range,d3.time.years.utc=d3.time.year.utc.range;var gg=[1e3,5e3,15e3,3e4,6e4,3e5,9e5,18e5,36e5,108e5,216e5,432e5,864e5,1728e5,6048e5,2592e6,7776e6,31536e6],gh=[[d3.time.second,1],[d3.time.second,5],[d3.time.second,15],[d3.time.second,30],[d3.time.minute,1],[d3.time.minute,5],[d3.time.minute,15],[d3.time.minute,30],[d3.time.hour,1],[d3.time.hour,3],[d3.time.hour,6],[d3.time.hour,12],[d3.time.day,1],[d3.time.day,2],[d3.time.week,1],[d3.time.month,1],[d3.time.month,3],[d3.time.year,1]],gi=[[d3.time.format("%Y"),function(a){return!0}],[d3.time.format("%B"),function(a){return a.getMonth()}],[d3.time.format("%b %d"),function(a){return a.getDate()!=1}],[d3.time.format("%a %d"),function(a){return a.getDay()&&a.getDate()!=1}],[d3.time.format("%I %p"),function(a){return a.getHours()}],[d3.time.format("%I:%M"),function(a){return a.getMinutes()}],[d3.time.format(":%S"),function(a){return a.getSeconds()}],[d3.time.format(".%L"),function(a){return a.getMilliseconds()}]],gj=d3.scale.linear(),gk=gd(gi);gh.year=function(a,b){return gj.domain(a.map(gf)).ticks(b).map(ge)},d3.time.scale=function(){return ga(d3.scale.linear(),gh,gk)};var gl=gh.map(function(a){return[a[0].utc,a[1]]}),gm=[[d3.time.format.utc("%Y"),function(a){return!0}],[d3.time.format.utc("%B"),function(a){return a.getUTCMonth()}],[d3.time.format.utc("%b %d"),function(a){return a.getUTCDate()!=1}],[d3.time.format.utc("%a %d"),function(a){return a.getUTCDay()&&a.getUTCDate()!=1}],[d3.time.format.utc("%I %p"),function(a){return a.getUTCHours()}],[d3.time.format.utc("%I:%M"),function(a){return a.getUTCMinutes()}],[d3.time.format.utc(":%S"),function(a){return 
a.getUTCSeconds()}],[d3.time.format.utc(".%L"),function(a){return a.getUTCMilliseconds()}]],gn=gd(gm);gl.year=function(a,b){return gj.domain(a.map(gp)).ticks(b).map(go)},d3.time.scale.utc=function(){return ga(d3.scale.linear(),gl,gn)}})();
\ No newline at end of file
diff --git a/web-static/graphs.html b/web-static/graphs.html
new file mode 100644
index 0000000..bd8ee0f
--- /dev/null
+++ b/web-static/graphs.html
@@ -0,0 +1,508 @@
+
+
+
+
+ P2Pool Graphs
+
+
+
+
+
+
+
+
+
+ Periods: Current:
+
+
+
+ Local rate
+
+
+ Local rate reflected in shares
+
+
+ Current payout to default address
+
+
+ Pool rate
+
+
+ Peers
+
+
+ Miners
+
+
+ Desired version rates
+
+
+ Traffic rate
+
+
+ Dashd GetBlockTemplate Latency
+
+
+ Memory Usage
+
+
+
+
+
diff --git a/web-static/index.html b/web-static/index.html
index cf2e50a..eb16a8f 100644
--- a/web-static/index.html
+++ b/web-static/index.html
@@ -1,511 +1,150 @@
-
-
-
-
-
-
-
-
-
-
-
+
+
+
+
+
+ P2Pool
+
+
+
+
+
+ P2Pool
+ Graphs
+ Version:
+ Pool rate: ( DOA+orphan) Share difficulty:
+ Node uptime: Peers: out, in
+ Local rate: ( DOA) Expected time to share:
+ Shares: total ( orphaned, dead) Efficiency:
+ Payout if a block were found NOW: to . Expected after mining for 24 hours: per block.
+ Current block value: Expected time to block:
+
- p2pool
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
Status
-
-
- | Local rate: |
- |
- Shares: |
- |
-
-
- | Global pool rate: |
- |
- Share difficulty (pool minimum): |
- |
-
-
- | network hashrate (estimate): |
- |
- Network block difficulty: |
- |
-
-
- | Current block value: |
- |
- Expected time to share (this node): |
- |
-
-
-
- | Node peers: |
-
-
-
- /
-
-
- |
- Expected time to block (pool): |
- |
-
-
- | Node fee: |
- |
- Node uptime: |
- |
-
-
- | Node p2pool version: |
- |
- Protocol version: |
- |
-
-
-
-
-
Active miners on this node
-
-
- | Address |
- Hashrate |
- DOA Hashrate (DOA %) |
- Share difficulty |
- Time to share |
- Predicted payout |
-
-
-
-
-
-
-
-
Recent blocks
-
-
- | When |
- Date/Time |
- Number |
- Hash |
-
-
-
-
-
-
-
-
-
+ Payouts if a block were found NOW:
+
+
diff --git a/web-static/share.html b/web-static/share.html
new file mode 100644
index 0000000..514f57a
--- /dev/null
+++ b/web-static/share.html
@@ -0,0 +1,96 @@
+
+
+
+
+ P2Pool Share
+
+
+
+
+ Loading...
+
+
+