From 8d5f0cee6990e609d83ad725f42aede9e87ee485 Mon Sep 17 00:00:00 2001
From: Bryan Stitt
Date: Fri, 6 May 2022 18:53:16 -0700
Subject: [PATCH] Update README.md

---
 README.md | 37 ++++++++++++++++++-------------------
 1 file changed, 18 insertions(+), 19 deletions(-)

diff --git a/README.md b/README.md
index 0ee0a164..fbc8f18f 100644
--- a/README.md
+++ b/README.md
@@ -1,31 +1,30 @@
 # web3-proxy
 
 quick and dirty proxy for ethereum rpcs (or similar)
 
-Signed transactions are sent to the configured private RPC (eden, flashbots, etc.). All other requests are sent to the configured primary RPC (alchemy, moralis, rivet, your own node, or one of many other providers).
+Signed transactions are sent to the configured private RPC (eden, flashbots, etc.). All other requests are sent to an RPC server on the latest block (alchemy, moralis, rivet, your own node, or one of many other providers).
 
 ```
-cargo run -- --help
+cargo run --release -- --help
 ```
 ```
-    Finished dev [unoptimized + debuginfo] target(s) in 0.04s
-     Running `target/debug/eth-proxy --help`
-Usage: eth-proxy --eth-primary-rpc <eth-primary-rpc> --eth-private-rpc <eth-private-rpc> [--listen-port <listen-port>]
+    Finished release [optimized] target(s) in 0.13s
+     Running `target/release/web3-proxy --help`
+Usage: web3-proxy [--listen-port <listen-port>] [--rpc-config-path <rpc-config-path>]
 
-Proxy Web3 Requests
+Reach new heights.
 
 Options:
-  --eth-primary-rpc the primary Ethereum RPC server
-  --eth-private-rpc the private Ethereum RPC server
-  --listen-port     the port to listen on
+  --listen-port     what port the proxy should listen on
+  --rpc-config-path path to a config file listing the rpc servers
   --help            display usage information
 ```
 ```
-cargo run -r -- --eth-primary-rpc "https://your.favorite.provider"
+cargo run --release
 ```
 ```
-curl -X POST -H "Content-Type: application/json" --data '{"jsonrpc":"2.0","method":"web3_clientVersion","params":[],"id":67}' 127.0.0.1:8845/eth
+curl -X POST -H "Content-Type: application/json" --data '{"jsonrpc":"2.0","method":"web3_clientVersion","params":[],"id":67}' 127.0.0.1:8544/eth
 ```
 
 ## Flame Graphs
 
@@ -41,8 +40,8 @@ curl -X POST -H "Content-Type: application/json" --data '{"jsonrpc":"2.0","metho
 
 Test the proxy:
 
-    wrk -s ./data/wrk/getBlockNumber.lua -t12 -c400 -d30s --latency http://127.0.0.1:8445
-    wrk -s ./data/wrk/getLatestBlockByNumber.lua -t12 -c400 -d30s --latency http://127.0.0.1:8445
+    wrk -s ./data/wrk/getBlockNumber.lua -t12 -c400 -d30s --latency http://127.0.0.1:8544
+    wrk -s ./data/wrk/getLatestBlockByNumber.lua -t12 -c400 -d30s --latency http://127.0.0.1:8544
 
 Test geth:
 
@@ -58,14 +57,14 @@ Test erigon:
 ## Todo
 
 - [x] simple proxy
-- [ ] better locking. when lots of requests come in, we seem to be in the way of block updates
+- [x] better locking. when lots of requests come in, we seem to be in the way of block updates
 - [ ] proper logging
-- [ ] load balance between multiple RPC servers
-- [ ] support more than just ETH
-- [ ] option to disable private rpc and send everything to primary
-- [ ] health check nodes by block height
+- [x] load balance between multiple RPC servers
+- [x] support more than just ETH
+- [x] option to disable private rpc and send everything to primary
+- [x] health check nodes by block height
 - [ ] measure latency to nodes
-- [ ] Dockerfile
+- [x] Dockerfile
 - [ ] testing getLatestBlockByNumber is not great because the latest block changes and so one run is likely to be different than another
 - [ ] if a request gets a socket timeout, try on another server
   - maybe always try at least two servers in parallel? and then return the first? or only if the first one doesn't respond very quickly?
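Note: the patch replaces the explicit `--eth-primary-rpc` / `--eth-private-rpc` flags with a single `--rpc-config-path` file, but it does not show what that file contains. As a purely hypothetical sketch (every key name below is an illustrative assumption, not taken from this patch or the repository), a toml config consistent with the examples above might look like:

```toml
# HYPOTHETICAL config for --rpc-config-path; the real schema is not shown in this patch.
listen_port = 8544  # matches the curl and wrk examples, which target 127.0.0.1:8544

[rpcs]
# requests for the latest block are load balanced across these servers
public = [
    "https://your.favorite.provider",
    "http://127.0.0.1:8545",
]
# signed transactions are forwarded to the private rpc (eden, flashbots, etc.)
private = [
    "https://your.private.relay",
]
```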