blog:2022:0628_avalanche_network_discovery

NVP Framework clean up and consolidation

Lately I've been adding a lot of features to my NVP framework and to my private NervHome sub-project. And this framework has been evolving pretty fast, so some behaviors/code paths are already obsolete. Time for a cleanup session I think 😉!

  • One operation that I end up doing often is to pull my changes from the NVP project with git, and then also pull the changes from NervHome on my main server.
  • ⇒ I think I could simplify that with a “pullall” command in my git component, which I should turn into a dynamic component in the process, in fact.
  • Moving nvp/components/git.py to nvp/core/git_manager.py: OK
  • Adding the “pullall” command:
            if cmd == 'pullall':
                # Pull all the known repositories:
                # We start with the NVP framework itself:
                logger.info("Pulling NVP framework...")
                self.git_pull(self.ctx.get_root_dir())
    
                # Next we iterate on all the projects:
                for proj in self.ctx.get_projects():
                    ppath = proj.get_root_dir()
                    if ppath is not None:
                        logger.info("Pulling %s...", proj.get_name())
                        self.git_pull(ppath)
                return True
  • OK: this works just fine:
    $ nvp git pullall
    2022/06/19 08:19:44 [nvp.core.git_manager] INFO: Pulling NVP framework...
    Already up to date.
    2022/06/19 08:19:46 [nvp.core.git_manager] INFO: Pulling nervland...
    Already up to date.
    2022/06/19 08:19:46 [nvp.core.git_manager] INFO: Pulling packages...
    Already up to date.
    2022/06/19 08:19:47 [nvp.core.git_manager] INFO: Pulling resp_t7...
    Already up to date.
    2022/06/19 08:19:47 [nvp.core.git_manager] INFO: Pulling simcore...
    From gitlab.gmv-insyen.com:core/simcore
    5345f36eb..c6f972e09  master     -> origin/master
    Already up to date.
    2022/06/19 08:19:52 [nvp.core.git_manager] INFO: Pulling nervhome...
    Already up to date.
  • Handle the GitlabManager component: OK
  • Also added the support to resolve the root dir in the NVPContext:
        def resolve_root_dir(self, project=None):
            """Resolve a root directory either from project or CWD"""
            if project is None:
                project = self.get_current_project(True)
    
            if project:
                return project.get_root_dir()
    
            # We might still be in the NVP project itself:
            cwd = self.to_absolute_path(self.get_cwd())
    
            root_dir = self.to_absolute_path(self.get_root_dir())
            if cwd.startswith(root_dir):
                return root_dir
    
            return None
  • Handle the ToolsManager component: OK
  • Naturally, while processing the components above, another (totally unrelated) urgent error showed up, in the coingecko highres update process I have in place.
  • Took me some time to figure this out, but in the end the error seems to be due to “reef-finance” not being a valid coin ID anymore.
  • ⇒ I should report this kind of error clearly I think, so I updated the coingecko.send_get_request() method with this content:
            while True:
                # logger.info("Sending request to %s (%s) (delay: %f)", url, params or "None", self.delay)
                resp = self.make_get_request(
                    url,
                    params=params,
                    headers=headers,
                    timeout=self.connection_timeout,
                    max_retries=self.max_retries,
                    retry_delay=self.delay * 3,
                    status_codes=[200, 429, 404],
                )
    
                if resp is not None and resp.status_code == 429:
                    now = datetime.now()
                    now_str = now.strftime("%m/%d/%Y, %H:%M:%S")
                    msg = f"[{now_str}] Coingecko: detected rate limit response, cooling down for {self.cool_down_delay} seconds."
                    logger.warning(msg)
                    utl.send_rocketchat_message(":warning: " + msg, channel="problem-reports")
                    time.sleep(self.cool_down_delay)
                    continue
    
                if resp is not None and resp.status_code == 404:
                    now = datetime.now()
                    now_str = now.strftime("%m/%d/%Y, %H:%M:%S")
                    msg = f"[{now_str}] Coingecko: got 404 status error while processing {url}: {resp.text}"
                    logger.error(msg)
                    utl.send_rocketchat_message(":warning: " + msg, channel="problem-reports")
                    resp = None
    
                break
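  • As a side note, the control flow above (cool down and retry on 429, give up with None on 404) can be isolated into a small standalone sketch; `fetch_with_handling` and `FakeResp` below are made-up names for illustration, not part of the actual component:

```python
import time

def fetch_with_handling(get_fn, cool_down_delay=0.0, max_attempts=5):
    """Sketch of the control flow above: retry after a cool-down
    on 429, give up (returning None) on 404."""
    for _ in range(max_attempts):
        resp = get_fn()
        if resp is not None and resp.status_code == 429:
            # Rate limited: wait and retry the same request.
            time.sleep(cool_down_delay)
            continue
        if resp is not None and resp.status_code == 404:
            # Resource is gone (e.g. an obsolete coin ID): report and stop.
            return None
        return resp
    return None

class FakeResp:
    def __init__(self, status_code):
        self.status_code = status_code

# Simulate a rate-limited call followed by a success:
seq = iter([FakeResp(429), FakeResp(200)])
resp = fetch_with_handling(lambda: next(seq))
print(resp.status_code)  # 200

# A 404 yields None, like the obsolete 'reef-finance' coin ID did:
print(fetch_with_handling(lambda: FakeResp(404)))  # None
```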
  • And now checking the state of our database:
    from nvp.nvp_context import NVPContext
    
    if NVPContext.instance is None:
        NVPContext()
        
    ctx = NVPContext.get()
    
    from nvh.crypto.coingecko import Coingecko
    
    cg = Coingecko(ctx)
    cdb = cg.get_coin_db()
    
    cdb.find_coins("reef")
    # output: ['reef-finance', 'threefold-token', 'reefer-token']
  • ⇒ The point is, I'm not currently updating the list of coins dynamically (as I was before); that's why I still have “reef-finance” in the list above and nothing new on reef.
  • ⇒ As a quick solution for the moment I'm simply removing the “reef-finance” entry from my monitored coins list:
    cdb.delete_monitored_coin("reef-finance")
  • But I definitely need to update the list of coins regularly, which I will cover next.
  • Here is the handler script I created to update the coins:
    """Handler to update the coingecko coins data"""
    
    import logging
    from datetime import date
    
    import nvp.core.utils as utl
    
    from nvh.crypto.coingecko import Coingecko
    
    logger = logging.getLogger("update_coins_data")
    
    
    def handle(self: Coingecko):
        """Handler function entry point"""
    
        clist = self.get_coins_list()
    
        cdb = self.get_coin_db()
    
        # Iterate on each coin:
        ncoins = len(clist)
    
        def set_infos(obj, cname):
            """Set some infos on a given coin object."""
    
            infos = self.get_coin_info(cname, market_data=True)
            if infos is None:
                return
    
            obj["platform_id"] = infos["asset_platform_id"]
            obj["contract_address"] = infos.get("contract_address")
            links = infos["links"]
            obj["explorers"] = [l for l in links["blockchain_site"] if l != ""]
            obj["homepage"] = links["homepage"][0]
            obj["telegram"] = links["telegram_channel_identifier"]
            obj["twitter"] = links["twitter_screen_name"]
            obj["announcements"] = [l for l in links["announcement_url"] if l != ""]
            obj["total_supply"] = infos["market_data"]["total_supply"] if "market_data" in infos else None
    
        new_coins = []
    
        # Retrieve all the registered coin Ids:
        all_ids = self.getAllCoinIDs(True)
    
        for i in range(ncoins):
            cdat = clist[i]
            cid = cdat["id"]
    
            if cid == "":
                logger.info("Ignoring invalid coin ID for %s", cdat)
                continue
    
            # Check if we have that coin already in the database:
            if cid not in all_ids:
                res = self.get_market_data(cid, "max")
    
                if res is None:
                    logger.warning("Cannot retrieve any data for coin: %s.", cid)
                    nprices = 0
                else:
                    nprices = len(res["prices"])
    
                t0 = None
                tstr = None
    
                if nprices == 0:
                    logger.info("No price data available for coin %s", cid)
                    # Use the current time as timestamp:
                    # ts = nvGetCurrentTimestamp()
                else:
                    t0 = int(res["prices"][0][0] / 1000)
                    # logDEBUG("Timestamp: %d, num points: %d" % (ts,np))
    
                    # lastTs = int(res['prices'][-1][0]/1000)
    
                    d0 = date.fromtimestamp(t0)
                    tstr = d0.isoformat()
                    logger.info("New coin %d/%d: '%s' start date: %s", i + 1, ncoins, cid, tstr)
    
                desc = {"coin_id": cid, "start_timestamp": t0, "symbol": cdat["symbol"], "name": cdat["name"]}
                set_infos(desc, cid)
    
                # Add this new coin to the list:
                new_coins.append(desc)
    
                # Also add the coin id to the all_ids list, to ensure we don't add it twice:
                all_ids.append(cid)
    
            if len(new_coins) > 20:
                logger.info("Inserting partial coins list..")
                cdb.insert_coins(new_coins)
                new_coins = []
    
        # If we have new coins then we should submit them to the DB:
        if len(new_coins) > 0:
            cdb.insert_coins(new_coins)
    
        received_ids = [clist[i]["id"] for i in range(ncoins)]
    
        # We should also check here if some coins were removed:
        mon_coins = cdb.get_all_monitored_coins()
    
        for cid in all_ids:
            if cid not in received_ids:
                logger.info("Removing obsolete coin %s", cid)
                cdb.remove_coin(cid)
    
                # Also remove the coin from the monitored coins list if applicable:
                if cid in mon_coins:
                    msg = f"Removing obsolete coin {cid} from list of monitored coins."
                    logger.warning(msg)
                    utl.send_rocketchat_message(":warning: Coingecko: " + msg)
    
        logger.info("Done updating coins data.")
    
  • Note: before executing the script above, I learnt how to duplicate a postgresql table (⇒ That could prove very handy in some cases ✌):
    # duplicate the coins table:
    sql = "create table coins_orig as (select * from coins);"
    cdb.execute(sql, commit=True)
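  • The same “create table as select” trick works in most SQL engines; here is a self-contained sqlite3 sketch of it (the table and columns are made up for the demo):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE coins (coin_id TEXT, symbol TEXT)")
cur.executemany("INSERT INTO coins VALUES (?, ?)",
                [("bitcoin", "btc"), ("ethereum", "eth")])

# Duplicate the table before a risky update, as done above in postgresql:
cur.execute("CREATE TABLE coins_orig AS SELECT * FROM coins")
con.commit()

# The copy is independent of the original table:
cur.execute("DELETE FROM coins WHERE coin_id = 'bitcoin'")
print(cur.execute("SELECT COUNT(*) FROM coins_orig").fetchone()[0])  # 2
```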
  • Now trying to update the coins:
    $ nvp coingecko update-coins
  • ⇒ And this works! Now just adding a dedicated script entry for this:
        "coingecko_update_coins": {
          "custom_python_env": "defi_env",
          "lock_file": "${PROJECT_ROOT_DIR}/data/locks/coingecko_update.lock",
          "cmd": "${PYTHON} nvh/crypto/coingecko.py update-coins",
          "cwd": "${PROJECT_ROOT_DIR}",
          "python_path": ["${PROJECT_ROOT_DIR}", "${NVP_ROOT_DIR}"]
        },
  • And running it as cron daily task: OK
  • Next interesting thing I just noticed is that my bsc_chain daily backup file instantly went from about 100MB to 530MB since I'm collecting the blocks/transactions data 😨!
  • This was somewhat expected of course, but now I'm feeling this might get too big, too quickly, so maybe I should do something about it.
  • My first idea here is to check for contract creation transactions (ie. “to address” == “0x”): these usually inject a lot of input data into the blockchain, and I really wouldn't know how to parse that data at all, so maybe I should not store the raw input data in the database in that case.
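  • A minimal sketch of that idea (the transaction dicts and the `strip_creation_input` helper are hypothetical, the real pipeline stores rows in postgresql):

```python
def strip_creation_input(txs):
    """Drop the raw input data of contract-creation transactions
    (those with no 'to' address) before storing them."""
    for tx in txs:
        if tx.get("to") in (None, "", "0x"):
            tx["input"] = None  # don't store unparseable creation bytecode
    return txs

# Hypothetical transactions: a regular token transfer and a creation tx:
txs = [
    {"hash": "0xaa", "to": "0x1234", "input": "0xa9059cbb00"},
    {"hash": "0xbb", "to": None, "input": "0x6080604052" * 1000},
]
strip_creation_input(txs)
print(txs[1]["input"])  # None
```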
  • Hmmm… okay so, actually I only have about 90MB of data in the registers for the created contracts transactions, as discovered below:
    # Select all the registers from contract creation transactions:
    sql = "SELECT sig,registers FROM transactions WHERE to_id is null;"
    rows = db.execute(sql).fetchall()
    nrows = len(rows)
    print(f"Got {nrows} rows")
    
    # output: Got 8987 rows
    
    # Compute total length of that data:
    bcount = 0
    for row in rows:
        bcount += 0 if row[1] is None else len(row[1])
    
    print(f"Total number of bytes: {bcount}")
    
    # output: Total number of bytes: 89376916
  • While I have more than 1.9 GB of data in the other 10827212 transactions already in the database after only 1 day 😅, that's not too good…
  • Here is the script I used to classify which method signatures take the most space due to their input data:
    import nvp.core.utils as utl
    
    bcount = 0
    
    size_stats={}
    
    for row in rows:
        sig = row[0]
        if sig is None:
            sig = 0
    
        if sig not in size_stats:
            size_stats[sig] = {
                'sig': hex(utl.to_uint32(sig)),
                'min_size':100000,
                'max_size':0,
                'mean_size': 0,
                'count': 0
            }
        
        desc = size_stats[sig]
        size = 0 if row[1] is None else len(row[1])
        desc['min_size'] = min(desc['min_size'], size)
        desc['max_size'] = max(desc['max_size'], size)
        desc['mean_size'] += size
        desc['count'] += 1
        
        bcount += size
    
    stats = list(size_stats.values())
    for desc in stats:
        desc["share"] = desc["mean_size"]
        desc["mean_size"] /= desc["count"]
    
    stats.sort(key=lambda item: item['share'], reverse=True)
    
    print(f"Total number of bytes: {bcount}")
    
    stats[:20]
  • The outputs we get from our 10M+ transactions with the script above are as follows:
    Total number of bytes: 1912614627
    [{'sig': '0x38ed1739',
      'min_size': 0,
      'max_size': 416,
      'mean_size': 267.53302217321635,
      'count': 572267,
      'share': 153100320},
     {'sig': '0xa9059cbb',
      'min_size': 63,
      'max_size': 1084,
      'mean_size': 64.36142689845857,
      'count': 1892272,
      'share': 121789326},
     {'sig': '0xc9807539',
      'min_size': 896,
      'max_size': 1408,
      'mean_size': 1240.363025981665,
      'count': 69703,
      'share': 86457024},
     {'sig': '0x7c025200',
      'min_size': 708,
      'max_size': 25924,
      'mean_size': 2497.147587215375,
      'count': 28722,
      'share': 71723073},
     {'sig': '0x7ff36ab5',
      'min_size': 0,
      'max_size': 320,
      'mean_size': 229.34020209492746,
      'count': 297385,
      'share': 68202336},
     {'sig': '0x5c11d795',
      'min_size': 256,
      'max_size': 352,
      'mean_size': 268.7948732170819,
      'count': 234377,
      'share': 62999336},
     {'sig': '0x5f575529',
      'min_size': 640,
      'max_size': 17888,
      'mean_size': 1332.4805035845427,
      'count': 45752,
      'share': 60963648},
     {'sig': '0x791ac947',
      'min_size': 64,
      'max_size': 320,
      'mean_size': 259.65219550223696,
      'count': 201389,
      'share': 52291096},
     {'sig': '0x18cbafe5',
      'min_size': 256,
      'max_size': 352,
      'mean_size': 262.9286742079253,
      'count': 185557,
      'share': 48788256},
     {'sig': '0x95ea7b3',
      'min_size': 44,
      'max_size': 299,
      'mean_size': 65.38600831292992,
      'count': 668597,
      'share': 43716889},
     {'sig': '0x1de9c881',
      'min_size': 224,
      'max_size': 288,
      'mean_size': 244.97880883337052,
      'count': 134490,
      'share': 32947200},
     {'sig': '0x8803dbee',
      'min_size': 224,
      'max_size': 352,
      'mean_size': 264.1658646653026,
      'count': 123649,
      'share': 32663845},
     {'sig': '0xd8169879',
      'min_size': 3872,
      'max_size': 4512,
      'mean_size': 3964.3599106791016,
      'count': 7613,
      'share': 30180672},
     {'sig': '0x84013b6a',
      'min_size': 2400,
      'max_size': 3776,
      'mean_size': 2423.9184163167365,
      'count': 11669,
      'share': 28284704},
     {'sig': '0xfb3bdb41',
      'min_size': 96,
      'max_size': 288,
      'mean_size': 228.33169934640523,
      'count': 117504,
      'share': 26829888},
     {'sig': '0x6326d217',
      'min_size': 256,
      'max_size': 352,
      'mean_size': 270.54684163608846,
      'count': 96474,
      'share': 26100736},
     {'sig': '0xb6f9de95',
      'min_size': 0,
      'max_size': 288,
      'mean_size': 223.66593745077793,
      'count': 114278,
      'share': 25560096},
     {'sig': '0xb7835e4c',
      'min_size': 288,
      'max_size': 288,
      'mean_size': 288.0,
      'count': 86774,
      'share': 24990912},
     {'sig': '0xc5860465',
      'min_size': 352,
      'max_size': 352,
      'mean_size': 352.0,
      'count': 44108,
      'share': 15526016},
     {'sig': '0x5866cc8e',
      'min_size': 640,
      'max_size': 1088,
      'mean_size': 767.4700615846401,
      'count': 19323,
      'share': 14829824}]
  • Note: Checking https://www.4byte.directory/ we see that 0x38ed1739 corresponds to “swapExactTokensForTokens(uint256,uint256,address[],address,uint256)”
  • ⇒ But anyway, I cannot let my blockchain database grow that fast without some kind of control, so let's fix this now.
  • So, the “solution” I have for the moment is to split the transactions into 2 databases: first part in my standard bsc_chain database for instance. And then the transactions “heavy data” (ie. input data/value/gas price) in a separated database called transactions_db.
  • The idea then is to avoid backing up the full transactions_db database, and instead export the new rows in there regularly with a semi-manual system… At least, this will limit the size of the bsc_chain database itself hopefully!
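  • The intended split can be sketched with sqlite3 (the schema and column names here are illustrative, not my actual bsc_chain schema): a light `transactions` table keeps the searchable columns, while the heavy payload lives in a `transactions_data` table keyed by the same hash:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Light table: cheap to back up daily.
cur.execute("CREATE TABLE transactions (hash TEXT PRIMARY KEY, sig INTEGER, to_id INTEGER)")
# Heavy table: input data / value / gas price, exported separately.
cur.execute("CREATE TABLE transactions_data (hash TEXT PRIMARY KEY, input BLOB, value TEXT, gas_price TEXT)")

def store_tx(tx):
    """Split one transaction dict across the two tables."""
    cur.execute("INSERT INTO transactions VALUES (?, ?, ?)",
                (tx["hash"], tx["sig"], tx["to_id"]))
    cur.execute("INSERT INTO transactions_data VALUES (?, ?, ?, ?)",
                (tx["hash"], tx["input"], tx["value"], tx["gas_price"]))

store_tx({"hash": "0xabc", "sig": 0x38ed1739, "to_id": 42,
          "input": b"\x38\xed\x17\x39" + b"\x00" * 256,
          "value": "0", "gas_price": "5000000000"})
con.commit()

# The daily backup only needs the light table:
print(cur.execute("SELECT COUNT(*) FROM transactions").fetchone()[0])  # 1
```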
  • After the CELO blockchain last time, I would now like to start investigating the arbitrage possibilities on Avalanche too, so let's prepare the required elements to start collecting blocks for that network.
  • Built a config for the “avax” blockchain: OK
  • And then started collecting blocks with a dedicated new script:
        "collect_avax_blocks": {
          "lock_file": "${PROJECT_ROOT_DIR}/data/locks/collect_avax_blocks.lock",
          "custom_python_env": "bsc_env",
          "cmd": "${PYTHON} ${PROJECT_ROOT_DIR}/nvh/crypto/blockchain/blockchain_manager.py collect blocks -c avax",
          "python_path": ["${PROJECT_ROOT_DIR}", "${NVP_ROOT_DIR}"]
        },
  • Also collecting gas prices on AVALANCHE now so I can add a dedicated page in my CryptoView app ⇒ ahh, I need to wait a little until I have some gas prices for that blockchain before I can display the data 😄
  • One thing I noticed then is that I seem to have quite a few routers on that chain:
    routers = []
    for addid in id_counts.keys():
        routers.append(chain.get_address(addid))
    
    routers
  • Output:
    ['0xC7f372c62238f6a5b79136A9e5D16A2FD7A3f0F5',
     '0x60aE616a2155Ee3d9A68541Ba4544862310933d4',
     '0xE54Ca86531e17Ef3616d22Ca28b0D458b6C89106',
     '0x1b02dA8Cb0d097eB8D57A175b88c7D8b47997506',
     '0x5F1FdcA239362c5b8A8Ada26a256ac5626CC33E0',
     '0xF7b1e993d1b4F7348D64Aa55A294E4B74512F7f2',
     '0xA52aBE4676dbfd04Df42eF7755F01A3c41f28D27',
     '0xb9a9BB6CC39387548BAA7185fbFf51d47eef8771',
     '0x7aE9D51DC773Ea7518509786d2017bb964606663',
     '0x262DcFB36766C88E6A7a2953c16F8defc40c378A',
     '0x9E4AAbd2B3E60Ee1322E94307d0776F2c8e6CFbb',
     '0x763D8D37669bB18064b410e17E3bB46BCF34F487',
     '0x8685aBD6959745E41D64194F42e21bbD25Fd57B3',
     '0x78c18E6BE20df11f1f41b9635F3A18B8AD82dDD1',
     '0xd8aA70F7990Dab4a383a0d8A57dF7a372916575D',
     '0xc2544A32872A91F4A553b404C6950e89De901fdb',
     '0x14f800843F7d70C911d4ec728f2F839c264A9895',
     '0x3f767183E3DFf6726950965EBDE537C5a7C29595',
     '0xb5b2444eDF79b00d40f463f79158D1187a0D0c25',
     '0x0c45FB63001b56a21e29c7dcc1727bfDA273a368',
     '0x0eb4DF266dA7c7BC1629b52a540cc87C8D95Bf24',
     '0x85995d5f8ee9645cA855e92de16FA62D26398060']
  • But before pushing this further I think I should write a handler to retrieve those routers/factories.
  • One nice thing I've just been thinking about while writing that new handler and calling it during the implementation phase in jupyter, is that I could even provide some hot reloading support here: each time I call a “handler” I could compute a hash of the corresponding file, and if it doesn't match the previous hash, then I could reload the handler code from the file, overriding the previous version (yeah 😎✌! ⇒ Sounds really cool, so let's try it!).
  • Yooohhooo! That's indeed really cool: I don't have to re-start my jupyter kernel each time I change something in my handler now. Here is the updated code providing support for that in NVPContext.get_handler():
        def get_handler(self, hname):
            """Get a handler by name"""
    
            # Adding support for hot reloading of handlers here:
            # Given a module name, we should try to find the corresponding file:
            filepath = self.resolve_module_file(hname)
    
            # Compute the hash of that file:
            fhash = self.compute_file_hash(filepath)
            prev_hash = self.handler_hashes.get(filepath, None)
            if prev_hash is not None and prev_hash != fhash:
                logger.debug("Detected change in %s, reloading handler %s", filepath, hname)
    
                # Remove the already loaded function:
                self.check(hname in self.handlers, "Expected handler to be loaded already: %s", hname)
                del self.handlers[hname]
    
            # Store the new file hash anyway:
            self.handler_hashes[filepath] = fhash
    
            if hname in self.handlers:
                return self.handlers[hname]
    
            # otherwise we have to search for that handler:
            comp_module = import_module(hname)
            handler = comp_module.handle
            del sys.modules[hname]
    
            self.handlers[hname] = handler
            return handler
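  • The same hash-and-reload idea can be demonstrated standalone with only the stdlib (the `my_handler` module below is a throwaway temp file, not an actual NVP handler):

```python
import hashlib
import importlib
import sys
import tempfile
from pathlib import Path

sys.dont_write_bytecode = True  # avoid stale .pyc files between quick rewrites

tmp = Path(tempfile.mkdtemp())
sys.path.insert(0, str(tmp))
mod_file = tmp / "my_handler.py"

handlers = {}
hashes = {}

def file_hash(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def get_handler(name, path):
    """Return the handler, reloading it whenever its source file changed."""
    fhash = file_hash(path)
    if hashes.get(path) not in (None, fhash):
        handlers.pop(name, None)  # source changed: drop the cached function
    hashes[path] = fhash
    if name not in handlers:
        importlib.invalidate_caches()
        mod = importlib.import_module(name)
        handlers[name] = mod.handle
        del sys.modules[name]  # force a clean re-import next time
    return handlers[name]

mod_file.write_text("def handle():\n    return 1\n")
print(get_handler("my_handler", mod_file)())  # 1

# Edit the "handler" and call it again: no kernel restart needed.
mod_file.write_text("def handle():\n    return 2\n")
print(get_handler("my_handler", mod_file)())  # 2
```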
  • Now let's continue with the retrieval of the factories from the router addresses: OK
  • Next, trying to collect the pairs on JoeSwap:
    $ nvp bchain update-pairs joeswap
  • All good so continuing with PangolinSwap:
    $ nvp bchain update-pairs pangolinswap
  • And then with SushiSwap on avalanche:
    $ nvp bchain update-pairs avaxsushiswap
  • Next I should also ensure that I'm updating all the pairs automatically once per day: OK
  • Finally, I also added “SicleSwap” (even if I have no idea which website this is… but it was the fourth router on my ordered list):
    $ nvp bchain update-pairs sicleswap
  • Now that I have 4 DEXes registered on Avalanche, it's time to start searching for arb setups there!
  • But to do so, at least in dry-run, I need the GetReservesV2 contract deployed. Yet, this should be done with the evm_arb account, so I should send some avax to that account first (done in jupyter):
    chain.transfer(0.2, to_name="evm_arb")
  • Next the contract deployment:
    $ nvp bchain deploy -c avax -a evm_arb GetReservesV2
  • But now I realize that I would also need the additional contract addresses to be able to build the ArbitrageManager, hmmm… 🤔 Or let's see if I can bypass that.
  • ⇒ In fact I could probably do without the FlashArb contract at start, but I will definitely need the PairChecker contract, so deploying that one too:
    $ nvp bchain deploy -c avax -a evm_arb PairCheckerV6
  • Now trying to monitor the arbs on avalanche (in dry-run mode):
    $ nvp arbman monitor-arbs -c avax -n
  • Arrff… That's not quite working out of the box yet 😆:
    $ nvp arbman monitor-arbs -c avax -n
    2022/06/26 20:26:03 [nvh.crypto.blockchain.evm_blockchain] INFO: Keeping 19022/19962 quotable pairs.
    2022/06/26 20:26:03 [nvh.crypto.blockchain.evm_blockchain] INFO: Found 689 arb compatible pairs
    2022/06/26 20:26:03 [__main__] INFO: wrapped native token address: 0x4cc3Fc374627E9244dc49a258ae717098c2165d0
    2022/06/26 20:26:03 [__main__] INFO: Collected 689 arb pairs
    2022/06/26 20:26:03 [__main__] INFO: Min profit value: 0.000100
    2022/06/26 20:26:03 [__main__] INFO: Quote token WAVAX value: 256326 AVAX
    2022/06/26 20:26:03 [__main__] WARNING: Not enough WAVAX funds to send to pair checker contract.
    2022/06/26 20:26:03 [__main__] INFO: PairChecker balance: 0 WAVAX
    2022/06/26 20:26:04 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between MIM -> ? -> AVAX
    2022/06/26 20:26:04 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['MIM', 'WAVAX', 'AVAX']
    2022/06/26 20:26:04 [__main__] INFO: Quote token MIM value: 107802 AVAX
    2022/06/26 20:26:04 [__main__] WARNING: Not enough MIM funds to send to pair checker contract.
    2022/06/26 20:26:04 [__main__] INFO: PairChecker balance: 0 MIM
    2022/06/26 20:26:04 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between USDC.e -> ? -> AVAX
    2022/06/26 20:26:04 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['USDC.e', 'WAVAX', 'AVAX']
    2022/06/26 20:26:05 [__main__] INFO: Quote token USDC.e value: 108200 AVAX
    2022/06/26 20:26:05 [__main__] WARNING: Not enough USDC.e funds to send to pair checker contract.
    2022/06/26 20:26:05 [__main__] INFO: PairChecker balance: 0 USDC.e
    2022/06/26 20:26:05 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between USDT.e -> ? -> AVAX
    2022/06/26 20:26:05 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['USDT.e', 'WAVAX', 'AVAX']
    2022/06/26 20:26:05 [__main__] INFO: Quote token USDT.e value: 108249 AVAX
    2022/06/26 20:26:05 [__main__] WARNING: Not enough USDT.e funds to send to pair checker contract.
    2022/06/26 20:26:06 [__main__] INFO: PairChecker balance: 0 USDT.e
    2022/06/26 20:26:06 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between USDC -> ? -> AVAX
    2022/06/26 20:26:06 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['USDC', 'WAVAX', 'AVAX']
    2022/06/26 20:26:06 [__main__] INFO: Quote token USDC value: 108309 AVAX
    2022/06/26 20:26:06 [__main__] WARNING: Not enough USDC funds to send to pair checker contract.
    2022/06/26 20:26:06 [__main__] INFO: PairChecker balance: 0 USDC
    2022/06/26 20:26:06 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between DAI.e -> ? -> AVAX
    2022/06/26 20:26:06 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['DAI.e', 'WAVAX', 'AVAX']
    2022/06/26 20:26:06 [__main__] INFO: Quote token DAI.e value: 108251 AVAX
    2022/06/26 20:26:07 [__main__] WARNING: Not enough DAI.e funds to send to pair checker contract.
    2022/06/26 20:26:07 [__main__] INFO: PairChecker balance: 0 DAI.e
    2022/06/26 20:26:07 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between PNG -> ? -> AVAX
    2022/06/26 20:26:07 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['PNG', 'WAVAX', 'AVAX']
    2022/06/26 20:26:07 [__main__] INFO: Quote token PNG value: 8881.42 AVAX
    2022/06/26 20:26:07 [__main__] WARNING: Not enough PNG funds to send to pair checker contract.
    2022/06/26 20:26:07 [__main__] INFO: PairChecker balance: 0 PNG
    2022/06/26 20:26:07 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between JOE -> ? -> AVAX
    2022/06/26 20:26:07 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['JOE', 'WAVAX', 'AVAX']
    2022/06/26 20:26:08 [__main__] INFO: Quote token JOE value: 43963.4 AVAX
    2022/06/26 20:26:08 [__main__] WARNING: Not enough JOE funds to send to pair checker contract.
    2022/06/26 20:26:08 [__main__] INFO: PairChecker balance: 0 JOE
    2022/06/26 20:26:08 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between WETH.e -> ? -> AVAX
    2022/06/26 20:26:08 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['WETH.e', 'WAVAX', 'AVAX']
    2022/06/26 20:26:08 [__main__] INFO: Quote token WETH.e value: 275450 AVAX
    2022/06/26 20:26:08 [__main__] WARNING: Not enough WETH.e funds to send to pair checker contract.
    2022/06/26 20:26:08 [__main__] INFO: PairChecker balance: 0 WETH.e
    2022/06/26 20:26:09 [nvh.crypto.blockchain.uniswap_base] INFO: Searching for swap path between USDT -> ? -> AVAX
    2022/06/26 20:26:09 [nvh.crypto.blockchain.uniswap_base] INFO: Using token swap path: ['USDT', 'WAVAX', 'AVAX']
    2022/06/26 20:26:09 [__main__] INFO: Quote token USDT value: 107662 AVAX
    2022/06/26 20:26:09 [__main__] WARNING: Not enough USDT funds to send to pair checker contract.
    2022/06/26 20:26:09 [__main__] INFO: PairChecker balance: 0 USDT
    2022/06/26 20:26:10 [__main__] WARNING: Unhandled method signature '0x7a42416a' in 0xeb8c29814de99e7999e5493a101018dc1658fae01c021414c4c8a1ea9c9f1300
    2022/06/26 20:26:11 [__main__] INFO: 8 pair reserves collected in 0.1369 secs
    2022/06/26 20:26:11 [__main__] INFO: Pair reserves block offset: 1
    2022/06/26 20:26:11 [__main__] WARNING: Unhandled method signature '0x7a42416a' in 0x510965bca0e7eeac515a84c155a8651f0504dae311cf4711dad576152394ac07
    2022/06/26 20:26:11 [__main__] INFO: 4 pair reserves collected in 0.1195 secs
    2022/06/26 20:26:12 [__main__] WARNING: Unhandled method signature '0xf91b3f72' in 0x610f19450047f05df550d65c82590e73be11f0d81e2589b165b77750407ef34d
    2022/06/26 20:26:18 [__main__] WARNING: Unhandled method signature '0xf91b3f72' in 0x801bdb853b67509d61e3d2f6f375ae8c90e1d96516039ea145903216b2421d21
    2022/06/26 20:26:19 [__main__] WARNING: Unhandled method signature '0x2c407024' in 0xdc85d79dc709aef036637b57e922ba391c9da85ebcd8d6fd1a8edda7c44507be
    2022/06/26 20:26:19 [__main__] WARNING: Unhandled method signature '0x676528d1' in 0xd56b4db987f3265a385673e0064bbfcd192b3d6db0b6815e91ed9ba59c6659cb
    2022/06/26 20:26:19 [__main__] INFO: 2 pair reserves collected in 0.1214 secs
    2022/06/26 20:26:31 [__main__] INFO: 2 pair reserves collected in 0.1331 secs
    2022/06/26 20:26:34 [__main__] INFO: 3 pair reserves collected in 0.1184 secs
  • ⇒ Which means I need to perform some debugging here 😉
  • First, let's check what is the “WAVAX” token above:
    $ nvp bchain find-token -c avax WAVAX
    2022/06/26 20:38:22 [__main__] INFO: Found token: { 'address': '0xB31f66AA3C1e785363F0875A1B74E27b85FD66c7',
    'classification': 0,
    'decimals': 18,
    'id': 1,
    'min_slippage': None,
    'name': 'Wrapped AVAX',
    'swap_fees': None,
    'symbol': 'WAVAX'}
  • Hmmm, then why do we get a different address for the wrapped native token above? OK ⇒ it was simply the native symbol mapping that was missing.
  • But then I get a lot of warnings about invalid signatures, simply because the routers on this chain use function names such as swapExactAVAXForTokens, so I need to handle that.
  • To solve the issue reported above with different uniswap function names on different blockchains, I'm thinking I could build an automated system that generates the expected method signature from a function declaration. So let's see if I can build that.
  • ⇒ To generate a keccak value in python (the Crypto.Hash.keccak module below comes from pycryptodome), one can use:
     from Crypto.Hash import keccak
     k = keccak.new(digest_bits=256)
     k.update(b'age')
     print(k.hexdigest())
  • So here are the methods I came up with in a new UniswapSigmap class to handle the generation of “signature banks” from function declarations:
        def get_keccak256(self, text):
            """Generate the keccak-256 from a given string"""
            kec = keccak.new(digest_bits=256)
            kec.update(text.encode())
            return kec.hexdigest()
    
        def gen_signatures(self, fnames, params):
            """Generate a list of signatures given a list of function names
            and the parameters that should be appended to the name"""
            sigs = []
    
            for fname in fnames:
                # generate the signature:
                full_name = fname + params
                sig = "0x" + self.get_keccak256(full_name)[:8]
                sigs.append(sig)
                logger.info("Registering signature '%s' for '%s'", sig, full_name)
    
            return sigs
    
        def setup_signature_banks(self, syms):
            """Prepare the signature banks given a list of acceptable
            native token names in the function names."""
    
            # Swap native for tokens:
            # ex: swapExactETHForTokens(uint256 amountOutMin, address[] path, address to, uint256 deadline)
            params = "(uint256,address[],address,uint256)"
            fnames = [f"swapExact{name}ForTokens" for name in syms]
            fnames += [f"swapExact{name}ForTokensSupportingFeeOnTransferTokens" for name in syms]
            fnames += [f"swap{name}ForExactTokens" for name in syms]
    
            self.sig_banks["swap_native_for_tokens"] = self.gen_signatures(fnames, params)
    
            # Swap tokens:
            # ex: swapExactTokensForETH(uint256 amountIn, uint256 amountOutMin, address[] path, address to, uint256 deadline)
            params = "(uint256,uint256,address[],address,uint256)"
            fnames = [f"swapExactTokensFor{name}" for name in syms]
            fnames += [f"swapTokensForExact{name}" for name in syms]
            fnames += [f"swapExactTokensFor{name}SupportingFeeOnTransferTokens" for name in syms]
            fnames += [
                "swapTokensForExactTokens",
                "swapExactTokensForTokens",
                "swapExactTokensForTokensSupportingFeeOnTransferTokens",
            ]
    
            self.sig_banks["swap_tokens"] = self.gen_signatures(fnames, params)
    
            # Tokens liquidity:
            # ex: addLiquidity(address tokenA, address tokenB, uint256 amountADesired, uint256 amountBDesired,
            # uint256 amountAMin, uint256 amountBMin, address to, uint256 deadline)
            params = "(address,address,uint256,uint256,uint256,uint256,address,uint256)"
            fnames = ["addLiquidity"]
    
            self.sig_banks["token_liquidity"] = self.gen_signatures(fnames, params)
    
            # Remove liquidity:
            # ex: removeLiquidityWithPermit(address tokenA, address tokenB, uint256 liquidity, uint256 amountAMin,
            # uint256 amountBMin, address to, uint256 deadline, bool approveMax, uint8 v, bytes32 r, bytes32 s)
            params = "(address,address,uint256,uint256,uint256,address,uint256,bool,uint8,bytes32,bytes32)"
            fnames = ["removeLiquidityWithPermit"]
    
            self.sig_banks["token_liquidity"] += self.gen_signatures(fnames, params)
    
            # removeLiquidity(address tokenA, address tokenB, uint256 liquidity, uint256 amountAMin,
            # uint256 amountBMin, address to, uint256 deadline)
            params = "(address,address,uint256,uint256,uint256,address,uint256)"
            fnames = ["removeLiquidity"]
    
            self.sig_banks["token_liquidity"] += self.gen_signatures(fnames, params)
    
            # Native liquidity:
            # ex: addLiquidityETH(address token, uint256 amountTokenDesired, uint256 amountTokenMin,
            # uint256 amountETHMin, address to, uint256 deadline)
            params = "(address,uint256,uint256,uint256,address,uint256)"
            fnames = [f"addLiquidity{name}" for name in syms]
            fnames += [f"removeLiquidity{name}" for name in syms]
            fnames += [f"removeLiquidity{name}SupportingFeeOnTransferTokens" for name in syms]
    
            self.sig_banks["native_liquidity"] = self.gen_signatures(fnames, params)
    
            # removeLiquidityETHWithPermit(address token, uint256 liquidity, uint256 amountTokenMin,
            # uint256 amountETHMin, address to, uint256 deadline, bool approveMax, uint8 v, bytes32 r, bytes32 s)
            params = "(address,uint256,uint256,uint256,address,uint256,bool,uint8,bytes32,bytes32)"
            fnames = [f"removeLiquidity{name}WithPermit" for name in syms]
            fnames += [f"removeLiquidity{name}WithPermitSupportingFeeOnTransferTokens" for name in syms]
    
            self.sig_banks["native_liquidity"] += self.gen_signatures(fnames, params)
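  • I'm not showing the lookup side here, but once the banks are built, classifying an incoming transaction should just be a matter of matching its 4-byte selector against them. A minimal sketch of that (hypothetical helper, not the actual manager code):

```python
from typing import Dict, List, Optional


def classify_tx_input(input_data: str, sig_banks: Dict[str, List[str]]) -> Optional[str]:
    """Return the name of the signature bank containing the transaction's
    4-byte selector, or None if the call is not one we track.

    input_data is the raw hex calldata, e.g. "0x38ed1739<abi-encoded args>".
    """
    selector = input_data[:10].lower()  # "0x" plus 8 hex chars
    for bank_name, sigs in sig_banks.items():
        if selector in sigs:
            return bank_name
    return None
```

So a transaction whose calldata starts with one of the generated signatures gets tagged with its bank ("swap_tokens", "native_liquidity", etc.), and everything else is ignored.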
  • ⇒ using this with our regular ArbitrageManager on BSC seems to work as usual, so I guess I didn't break anything 😁.
  • Now time to test on Avalanche 🤞:
    $ nvp arbman monitor-arbs -c avax -n
  • Arrff… seems better, but we get an error now because we have no funds in the PairChecker contract:
    Traceback (most recent call last):
      File "D:\Projects\NervHome\nvh\crypto\blockchain\arbitrage_manager.py", line 921, in <module>
        comp.run()
      File "D:\Projects\NervProj\nvp\nvp_component.py", line 93, in run
        res = self.process_command(cmd)
      File "D:\Projects\NervHome\nvh\crypto\blockchain\arbitrage_manager.py", line 876, in process_command
        self.monitor_arbitrages(chain)
      File "D:\Projects\NervHome\nvh\crypto\blockchain\arbitrage_manager.py", line 132, in monitor_arbitrages
        result = self.handle_arbitrage_setups()
      File "D:\Projects\NervHome\nvh\crypto\blockchain\arbitrage_manager.py", line 297, in handle_arbitrage_setups
        self.process_arb_setups(plist, pair_reserves)
      File "D:\Projects\NervHome\nvh\crypto\blockchain\arbitrage_manager.py", line 406, in process_arb_setups
        if (t0.is_class_unknown() or t1.is_class_unknown()) and not self.check_pair(best_p0, qtoken):
      File "D:\Projects\NervProj\nvp\nvp_object.py", line 73, in check
        raise NVPCheckError(fmt % args)
    nvp.nvp_object.NVPCheckError: Initial balance of USDC too low, cannot perform pair check for WAVAX/USDC.
  • ⇒ So I definitely have to send some tokens there, before pushing it further.
  • First thing I need for that is support to wrap the native token:
            if cmd == "wrap":
                chain_name = self.get_param("chain")
                chain: EVMBlockchain = self.get_component(f"{chain_name}_chain")
    
                account = self.get_param("account")
                if account is not None:
                    chain.set_account(account)
    
                # We can only wrap the native token:
                token = chain.get_wrapped_native_token()
    
                value = self.get_param("value")
                self.check(value is not None, "No value specified.")
    
                amount = token.to_amount(value)
    
                if amount == 0:
                    logger.info("Nothing to wrap.")
                    return True
    
                # Get the initial balance:
                bal0 = chain.get_native_balance()
                self.check(bal0 > value, "Native balance too low.")
    
                logger.info("Initial wrapped native balance: %f %s", token.get_balance(), token.symbol())
                logger.info("Wrapping %d units of %s", amount, chain.get_native_symbol())
                token.deposit(amount)
                logger.info("Final wrapped native balance: %f %s", token.get_balance(), token.symbol())
                return True
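  • For reference, wrapped native tokens (WAVAX, WETH, WBNB…) all follow the WETH9 pattern: deposit() locks the native coin and mints an equal ERC-20 balance, and withdraw() burns that balance and releases the coin, always 1:1 with no fee. A toy accounting model of what the wrap command drives (hypothetical class, not the real token wrapper):

```python
class WrappedNativeToy:
    """Toy accounting model of a WETH9-style wrapped native token.

    deposit() locks `amount` of the native coin and credits an equal
    ERC-20 balance; withdraw() does the reverse. The names here are
    hypothetical, this is just to illustrate the 1:1 invariant.
    """

    def __init__(self, native_balance: int):
        self.native = native_balance   # caller's native coin, in wei
        self.wrapped = 0               # caller's wrapped token balance, in wei

    def deposit(self, amount: int) -> None:
        if amount > self.native:
            raise ValueError("Native balance too low.")
        self.native -= amount
        self.wrapped += amount         # 1:1 mint, no fee

    def withdraw(self, amount: int) -> None:
        if amount > self.wrapped:
            raise ValueError("Wrapped balance too low.")
        self.wrapped -= amount
        self.native += amount          # 1:1 burn
```

The invariant native + wrapped stays constant through any sequence of deposits and withdrawals (ignoring the gas paid in native coin, of course).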
  • Now I can call:
    $ nvp bchain wrap -c avax -a evm_arb -v 0.001
  • Then I started to add support to automatically swap some tokens as needed for the PairChecker setup:
        def update_quote_token_values(self):
            """Update the value of the quote tokens in the native currency"""
            self.quote_token_values = {}
    
            dex = self.chain.get_default_exchange()
    
            # Check that we have some of the wrapped native token available:
            # and if not we send some:
            ntoken = self.chain.get_wrapped_native_token()
            if ntoken.get_balance(as_value=False) == 0:
                logger.info("Wrapping some %s for the PairChecker setup...", self.native_symbol)
                ntoken.deposit(200000 * 20)
    
            for addr in self.quote_tokens:
                val = dex.get_quote(1.0, addr, self.native_symbol)
                self.quote_token_values[addr] = val
    
                token = self.chain.get_token(addr)
                logger.info("Quote token %s value: %.6g %s", token.symbol(), val, self.native_symbol)
    
                # Get the current balance of qtoken in the pair checker:
                bal = token.get_balance(as_value=False, account=self.pair_checker_address)
    
                if bal == 0:
                    if token.get_balance(as_value=False) < 400000:
                        nval = max(int(203000 * val) + 1, 1)
    
                        logger.info("Swapping %.6g units of %s for %s", nval, ntoken.symbol(), token.symbol())
                        dex.swap_tokens(nval, [ntoken.symbol(), token.symbol()], is_amount=True)
    
                    nbal = token.get_balance(as_value=False)
                    while nbal < 200000:
                        logger.info("Waiting for funds to arrive... (current balance: %d %s)", nbal, token.symbol())
                        time.sleep(1.0)
                        nbal = token.get_balance(as_value=False)
    
                    logger.info("Sending %s funds to pair checker contract...", token.symbol())
                    token.transfer(self.pair_checker_address, 200000)
    
                bal = token.get_balance(as_value=False, account=self.pair_checker_address)
                logger.info("PairChecker balance: %d %s", bal, token.symbol())
  • Unfortunately, it seems the swapping operation is failing:
    2022/06/28 12:28:05 [__main__] INFO: Quote token MIM value: 0.0501456 WAVAX
    2022/06/28 12:28:05 [__main__] INFO: Swapping 10180 units of WAVAX for MIM
    2022/06/28 12:28:06 [nvh.crypto.blockchain.uniswap_base] INFO: Reserve for WAVAX: 44787055416129637105101
    2022/06/28 12:28:06 [nvh.crypto.blockchain.uniswap_base] INFO: Reserve for MIM: 890459127292300704299446
    2022/06/28 12:28:06 [nvh.crypto.blockchain.uniswap_base] INFO: Amount_out_min: 201590
    2022/06/28 12:28:06 [nvh.crypto.blockchain.uniswap_base] INFO: Amount_out_min value: 2.016e-13 MIM
    2022/06/28 12:28:06 [nvh.crypto.blockchain.evm_blockchain] INFO: Error while trying to perform operation: execution reverted: TransferHelper: TRANSFER_FROM_FAILED
    2022/06/28 12:28:06 [__main__] INFO: Waiting for funds to arrive... (current balance: 0 MIM)
    2022/06/28 12:28:07 [__main__] INFO: Waiting for funds to arrive... (current balance: 0 MIM)
    2022/06/28 12:28:08 [__main__] INFO: Waiting for funds to arrive... (current balance: 0 MIM)
    2022/06/28 12:28:09 [__main__] INFO: Waiting for funds to arrive... (current balance: 0 MIM)
  • ⇒ Actually, this is somewhat expected, because I'm pretty sure I need to approve some contract first before it may swap my WAVAX tokens on my behalf 😅. The question is, which contract should that be exactly 🤔?
  • ⇒ OK, that should simply be the router, since it is the one that will try to transfer our funds to the first pair on the path at the start of the swap process:
        function swapExactTokensForTokensSupportingFeeOnTransferTokens(
            uint256 amountIn,
            uint256 amountOutMin,
            address[] calldata path,
            address to,
            uint256 deadline
        ) external virtual override ensure(deadline) {
            TransferHelper.safeTransferFrom(
                path[0],
                msg.sender,
                JoeLibrary.pairFor(factory, path[0], path[1]),
                amountIn
            );
            uint256 balanceBefore = IERC20Joe(path[path.length - 1]).balanceOf(to);
            _swapSupportingFeeOnTransferTokens(path, to);
            require(
                IERC20Joe(path[path.length - 1]).balanceOf(to).sub(balanceBefore) >=
                    amountOutMin,
                "JoeRouter: INSUFFICIENT_OUTPUT_AMOUNT"
            );
        }
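  • That safeTransferFrom call is precisely where the TRANSFER_FROM_FAILED revert comes from: the router calls transferFrom(msg.sender, pair, amountIn) on the input token, and ERC-20 transferFrom fails unless msg.sender has first approved the router for at least that amount. A toy model of the bookkeeping (hypothetical, just to illustrate the failure mode):

```python
class Erc20AllowanceToy:
    """Minimal ERC-20 balance/allowance bookkeeping, just to show why a
    router swap reverts without a prior approve (toy model, not web3)."""

    def __init__(self):
        self.balances = {}
        self.allowances = {}  # (owner, spender) -> remaining allowance

    def approve(self, owner, spender, amount):
        """owner lets spender pull up to `amount` of its tokens."""
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, caller, src, dst, amount):
        """What TransferHelper.safeTransferFrom ends up exercising:
        the caller (the router) pulls src's tokens into dst (the pair)."""
        allowed = self.allowances.get((src, caller), 0)
        if allowed < amount or self.balances.get(src, 0) < amount:
            raise RuntimeError("TransferHelper: TRANSFER_FROM_FAILED")
        self.allowances[(src, caller)] = allowed - amount
        self.balances[src] -= amount
        self.balances[dst] = self.balances.get(dst, 0) + amount
```

So the fix is a one-time approve of the router on the WAVAX token before the first swap, which is exactly what the next snippet does.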
  • So let's approve the router on the WAVAX token first… OK! This helped:
            dex = self.chain.get_default_exchange()
            addr = dex.get_router_address()
    
            if ntoken.get_allowance(addr) == 0:
                logger.info("Increasing allowance for router (%s) on %s", addr, ntoken.symbol())
                ntoken.approuve(addr, 10000000)
  • But I still have another problem when it comes to USDC.e:
    2022/06/28 13:35:04 [__main__] INFO: Quote token USDC.e value: 0.0498541 WAVAX
    2022/06/28 13:35:04 [__main__] INFO: Swapping 10121 units of WAVAX for USDC.e
    2022/06/28 13:35:04 [nvh.crypto.blockchain.uniswap_base] INFO: Reserve for WAVAX: 184375863573191424131829
    2022/06/28 13:35:04 [nvh.crypto.blockchain.uniswap_base] INFO: Reserve for USDC.e: 3687209961192
    2022/06/28 13:35:04 [nvh.crypto.blockchain.uniswap_base] INFO: Amount_out_min: 0
    2022/06/28 13:35:04 [nvh.crypto.blockchain.uniswap_base] INFO: Amount_out_min value: 0 USDC.e
    2022/06/28 13:35:04 [nvh.crypto.blockchain.evm_blockchain] INFO: Error while trying to perform operation: execution reverted: Joe: INSUFFICIENT_OUTPUT_AMOUNT
    2022/06/28 13:35:04 [__main__] INFO: Waiting for funds to arrive... (current balance: 0 USDC.e)
    2022/06/28 13:35:05 [__main__] INFO: Waiting for funds to arrive... (current balance: 0 USDC.e)
    2022/06/28 13:35:06 [__main__] INFO: Waiting for funds to arrive... (current balance: 0 USDC.e)
    2022/06/28 13:35:07 [__main__] INFO: Waiting for funds to arrive... (current balance: 0 USDC.e)
  • ⇒ The “Amount_out_min” is incorrect of course. Could that be because the number of decimals is not the same for both tokens?
  • Arrfff, actually USDC.e only has 6 decimals, so the amount of WAVAX I'm requesting to swap here is going to be far too small. ⇒ Let's just do the swap manually, or simply ignore that quote token for now: the pair checker would by default need 100000 units of USDC.e to run the test, which is a “high value” already (i.e. $0.1 lol). ⇒ We'll get back to this if the arbitrage system works (but it won't, so no worries).
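  • For the record, the zero quote can be reproduced offline with the standard Uniswap-V2 getAmountOut formula (0.3% fee taken on the input side) applied to the reserves from the log above: since WAVAX has 18 decimals and USDC.e only 6, an input of 10121 wei of WAVAX floors to zero output units. A quick check, mirroring the integer math of UniswapV2Library:

```python
def get_amount_out(amount_in: int, reserve_in: int, reserve_out: int) -> int:
    """Uniswap-V2 style quote: output amount for a given input amount,
    with the 0.3% fee applied on the input side (pure integer math,
    like the on-chain library)."""
    amount_in_with_fee = amount_in * 997
    numerator = amount_in_with_fee * reserve_out
    denominator = reserve_in * 1000 + amount_in_with_fee
    return numerator // denominator


# Reserves from the log above (WAVAX: 18 decimals, USDC.e: 6 decimals):
R_WAVAX = 184375863573191424131829
R_USDCE = 3687209961192

# 10121 wei of WAVAX is ~1e-14 AVAX, which floors to 0 units of USDC.e:
print(get_amount_out(10121, R_WAVAX, R_USDCE))   # -> 0

# A sane input like 0.01 WAVAX (1e16 wei) yields a non-zero amount:
print(get_amount_out(10**16, R_WAVAX, R_USDCE))
```

So the swap helper really needs to scale its input amount by the decimals of the output token, not just by the quote value.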
  • Hmmm, interestingly, I get failing classification for tokens that are really supposed to be valid here:
    2022/06/28 14:13:31 [__main__] ERROR: Pair checker test failed for USDT.e: execution reverted: [error] fn=swapToken0FL
    2022/06/28 14:13:31 [__main__] ERROR: Pair address: 0xe28984e1EE8D431346D32BeC9Ec800Efb643eef4, exchange: PangolinSwap, router: 0xE54Ca86531e17Ef3616d22Ca28b0D458b6C89106, ifp: 9970
    2022/06/28 14:13:31 [__main__] INFO: PairChecker result for USDT.e (against WAVAX) is: FAILED
  • ⇒ I should really have a closer look at that at some point, but first, I think I could try to deploy the flasharb contract itself ✌:
    $ nvp bchain deploy -c avax -a evm_arb FlashArbV5
  • Now trying to run the arb monitoring again, but not in dry-run mode:
    $ nvp arbman monitor-arbs -c avax
  • And crap… 😭 I just get execution reverted messages when estimating the gas prices each time I find a valid setup:
    2022/06/28 14:47:45 [__main__] INFO: Block 16630211: Found arb setup of 0.001141 WAVAX with pairs on EGG#2/WAVAX
    2022/06/28 14:47:46 [__main__] ERROR: Error while estimating operation gas: execution reverted
    2022/06/28 14:48:00 [__main__] INFO: 9 pair reserves collected in 0.1179 secs
    2022/06/28 14:48:00 [__main__] INFO: Extracted infos: {'r0': 1784880975329859006333, 'r1': 323415904337601654857214, 'r0dt': 0, 'r1dt': 0, 'swap_exp': 18065387, 'swap_rcv': 18065387, 'transfer_exp': 18065387, 'transfer_rcv': 18065387, 'amount_back': 99400}
    2022/06/28 14:48:00 [__main__] INFO: PairChecker result for HeC (against WAVAX) is: PASSED
    2022/06/28 14:48:00 [__main__] INFO: Saving classification for token HeC
    2022/06/28 14:48:00 [__main__] INFO: PairCheck done in 0.2883 secs
    2022/06/28 14:48:00 [__main__] INFO: Block 16630218: Found arb setup of 0.0003253 WAVAX with pairs on WAVAX/HeC
    2022/06/28 14:48:00 [__main__] ERROR: Error while estimating operation gas: execution reverted
  • My initial guess on that error above is that I might have some kind of incompatibility in the assembly code? Maybe I should consider using a previous version of the compiler? I don't know…
  • Soooooooo… from there, where do I go? 😋 Arrfff, I guess I should stop it here for this “session”, I need some fresh ideas to work on anyway 🤣…