Compare commits

...

66 Commits

Author SHA1 Message Date
de05a8b64b Update repository link (name changed, because classic-relayer repo merged with nova-relayer) 2023-10-07 07:30:40 -07:00
91ccc7c2ad Add Nova relayer server to docker-compose bundle 2023-10-07 07:02:07 -07:00
c338f27819 Add Nova relayer installation to script and instructions about Nova to README 2023-10-07 05:59:54 -07:00
a14f535d34 Update version badges in REAMDE 2023-09-07 11:15:09 -07:00
21c822b9f8 Add instructions for relayer how to change parameters if relayer already deployed and ran 2023-09-04 03:21:04 -07:00
ef32badf36 Download updated main repo to use up-to-date docker-compose configuration 2023-08-31 10:14:29 -07:00
021ad0e83b Use v4 relayer for mainnet and v5 for sidechains for backwards compatibility 2023-08-28 01:52:33 -07:00
f3041b59b9 Use v5 docker images in deployment 2023-07-30 01:55:16 -07:00
2246d6acd5 Fix README: change script link 2023-07-23 07:30:38 -07:00
7eb5fb0a43 Bump version to 5.1.0 (add smart gas estimation) 2023-07-13 22:14:34 -07:00
d709b49204 Change docker images tags & build images in source folders 2023-07-13 11:34:54 -07:00
3b6289bf2c Prepare main repository for v4 multiple network relayer deployment 2023-07-12 10:08:11 -07:00
6adeb27b83 Change .env name for mainnet 2023-07-11 19:17:20 -07:00
58dae5b030 Edit docker-compose config: change docker images to prebuilt local, change .env files and profiles naming for sidechains 2023-06-10 19:16:11 +03:00
3b13d3f508 Create .env for ETH mainnet 2023-06-10 19:12:24 +03:00
6b1a41585b Update tornado cash site link on relayer site index page 2023-05-06 14:00:32 +03:00
e90d1c4086 Create script to autoinstall dependecies & configure nginx & build docker containers for debian-based distros 2023-05-06 13:57:38 +03:00
61f464441b Delete old docker-compose.yml 2023-05-06 13:55:47 +03:00
gozzy
ed5d99cf44 nginx template ddos mitigation 2023-03-26 21:50:17 +00:00
gozzy
7d10fe2ab9 decrease minimumBalance & disable treeWatcher by default 2023-03-26 11:21:24 +00:00
gozzy
7d3cb5be49 update default rpc 2023-03-20 01:09:46 +00:00
gozzy
ff05e30bd2 update reverse-proxy app port 2023-03-19 21:18:15 +00:00
gozzy
1e8ddffdf1 update README.md 2023-03-19 21:13:12 +00:00
gozzy
3dc9314e29 local image compilation 2023-03-19 19:43:40 +00:00
gozzy
5f3da2578a yaml multi-network config 2023-03-18 23:01:06 +00:00
gozzy
cd6bd25d2c reinstate #1 2023-03-14 16:33:51 +00:00
smart_ex
2247730603 bump tx-manager 2022-05-25 19:21:03 +10:00
smart_ex
c915c97b85 priceWatcher oracle fix 2022-05-25 19:21:03 +10:00
smart_ex
582af773e6 tornado proxy address 2022-05-25 19:21:03 +10:00
smart_ex
9488090892 fix logging error messages 2022-05-25 19:21:03 +10:00
smart_ex
632dce129d fix queue collision 2022-05-25 19:21:03 +10:00
smart_ex
76cda01ee1 validate only selected network contract addresses 2022-05-25 19:21:03 +10:00
smart_ex
7f657c1d7d add test for isKnownContract validation 2022-05-25 19:21:03 +10:00
smart_ex
e386a1d23c fix hash map key 2022-05-25 19:21:03 +10:00
smart_ex
16d8e0fc28 fix contract address validation 2022-05-25 19:21:03 +10:00
smart_ex
a7fc9c4b24 block iframe 2022-05-25 19:21:03 +10:00
smart_ex
ee9e27ecad move errorsLog to health 2022-05-25 19:21:03 +10:00
smart_ex
95c6dc23c6 increment errors score 2022-05-25 19:21:03 +10:00
smart_ex
cfcf1c8677 show errors on status page 2022-05-25 19:21:03 +10:00
smart_ex
8868040882 write errors with positive score 2022-05-25 19:21:03 +10:00
smart_ex
50054e0516 bit of refactor, add RelayerError class 2022-05-25 19:21:03 +10:00
smart_ex
76209e11c0 remove proposal 10 check 2022-05-25 19:21:03 +10:00
Danil Kovtonyuk
3c5eaa2c4b
Update README.md 2022-04-13 21:11:16 +10:00
_den
fd36dd5c5e added link to instructions for running rpc nodes 2022-03-10 21:11:06 +10:00
Alexey Pertsev
350d1f1d11
Update Readme 2022-03-01 11:32:30 +01:00
Danil Kovtonyuk
49a90872a2
fix: gas limit 2022-02-22 04:33:26 +10:00
Sergei SMART
2f79125dd1 init check, add Governance contract 2022-02-16 11:19:40 +01:00
_den
1f900843de
added discord notification (#86)
* added discord notification

* updated notification message

* updated discord notification

* updated discord notification message

Co-authored-by: _den <_den@outlook.com>
2022-02-16 10:37:24 +01:00
dependabot[bot]
cc5ec13d97 Bump simple-get from 2.8.1 to 2.8.2
Bumps [simple-get](https://github.com/feross/simple-get) from 2.8.1 to 2.8.2.
- [Release notes](https://github.com/feross/simple-get/releases)
- [Commits](https://github.com/feross/simple-get/compare/v2.8.1...v2.8.2)

---
updated-dependencies:
- dependency-name: simple-get
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-16 10:34:03 +01:00
dependabot[bot]
9e8a6c79ac Bump node-fetch from 2.6.1 to 2.6.7
Bumps [node-fetch](https://github.com/node-fetch/node-fetch) from 2.6.1 to 2.6.7.
- [Release notes](https://github.com/node-fetch/node-fetch/releases)
- [Commits](https://github.com/node-fetch/node-fetch/compare/v2.6.1...v2.6.7)

---
updated-dependencies:
- dependency-name: node-fetch
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-16 10:33:45 +01:00
dependabot[bot]
38b1169eae Bump pathval from 1.1.0 to 1.1.1
Bumps [pathval](https://github.com/chaijs/pathval) from 1.1.0 to 1.1.1.
- [Release notes](https://github.com/chaijs/pathval/releases)
- [Changelog](https://github.com/chaijs/pathval/blob/master/CHANGELOG.md)
- [Commits](https://github.com/chaijs/pathval/compare/v1.1.0...v1.1.1)

---
updated-dependencies:
- dependency-name: pathval
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-16 10:32:56 +01:00
dependabot[bot]
d95d495853 Bump tar from 4.4.13 to 4.4.19
Bumps [tar](https://github.com/npm/node-tar) from 4.4.13 to 4.4.19.
- [Release notes](https://github.com/npm/node-tar/releases)
- [Changelog](https://github.com/npm/node-tar/blob/main/CHANGELOG.md)
- [Commits](https://github.com/npm/node-tar/compare/v4.4.13...v4.4.19)

---
updated-dependencies:
- dependency-name: tar
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-16 10:32:45 +01:00
dependabot[bot]
54e1c03f9e Bump glob-parent from 5.1.1 to 5.1.2
Bumps [glob-parent](https://github.com/gulpjs/glob-parent) from 5.1.1 to 5.1.2.
- [Release notes](https://github.com/gulpjs/glob-parent/releases)
- [Changelog](https://github.com/gulpjs/glob-parent/blob/main/CHANGELOG.md)
- [Commits](https://github.com/gulpjs/glob-parent/compare/v5.1.1...v5.1.2)

---
updated-dependencies:
- dependency-name: glob-parent
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-16 10:32:33 +01:00
dependabot[bot]
429723f370 Bump normalize-url from 4.5.0 to 4.5.1
Bumps [normalize-url](https://github.com/sindresorhus/normalize-url) from 4.5.0 to 4.5.1.
- [Release notes](https://github.com/sindresorhus/normalize-url/releases)
- [Commits](https://github.com/sindresorhus/normalize-url/commits)

---
updated-dependencies:
- dependency-name: normalize-url
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-16 10:32:18 +01:00
dependabot[bot]
b457820cf5 Bump lodash from 4.17.20 to 4.17.21
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.20 to 4.17.21.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.20...4.17.21)

Signed-off-by: dependabot[bot] <support@github.com>
2022-02-16 10:31:27 +01:00
Danil Kovtonyuk
31d697701e
fix: lint 2022-01-19 22:11:00 +10:00
_den
b103033103 added zabbix guide to README.md 2022-01-19 22:03:39 +10:00
_den
1bb2d5f044 updated installation instructions 2022-01-19 22:03:39 +10:00
_den
44f1e1ec7a added templates for monitoring with zabbix 2022-01-19 22:03:39 +10:00
Danil Kovtonyuk
043356f5fe fix: initializing mining tree 2021-10-09 03:15:56 +10:00
Danil Kovtonyuk
8f5b673f3b feat: add env baseFeeReserve variable 2021-09-08 23:15:25 +10:00
Danil Kovtonyuk
82e5f4ee70 feat: add EIP-1559 support 2021-09-08 23:15:25 +10:00
nikdementev
cc0a252040
fix: bump cdai gas limit 2021-07-30 20:39:03 +03:00
nikdementev
a46afed752
styles: prettier 2021-06-19 23:20:19 +03:00
nikdementev
0cd9e515ae
fix: price method 2021-06-19 23:10:31 +03:00
nikdementev
fa77997896 fix: prices oracle 2021-06-19 23:05:38 +03:00
40 changed files with 723 additions and 656995 deletions

@ -1,3 +0,0 @@
node_modules
.env
.git

@ -1,9 +0,0 @@
root = true
[*]
indent_style = space
indent_size = 2
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true

@ -1,23 +0,0 @@
NET_ID=1
HTTP_RPC_URL=https://mainnet.infura.io
WS_RPC_URL=wss://mainnet.infura.io/ws/v3/
# ORACLE_RPC_URL should always point to the mainnet
ORACLE_RPC_URL=https://mainnet.infura.io
REDIS_URL=redis://127.0.0.1:6379
# DNS settings
VIRTUAL_HOST=example.duckdns.org
LETSENCRYPT_HOST=example.duckdns.org
APP_PORT=8000
# without 0x prefix
PRIVATE_KEY=
# 0.05 means 0.05%
REGULAR_TORNADO_WITHDRAW_FEE=0.05
MINING_SERVICE_FEE=0.05
REWARD_ACCOUNT=
CONFIRMATIONS=4
# in GWEI
MAX_GAS_PRICE=1000
AGGREGATOR=0x8cb1436F64a3c33aD17bb42F94e255c4c0E871b2

@ -1,45 +0,0 @@
{
"env": {
"node": true,
"browser": true,
"es6": true,
"mocha": true
},
"extends": "eslint:recommended",
"globals": {
"Atomics": "readonly",
"SharedArrayBuffer": "readonly"
},
"parserOptions": {
"ecmaVersion": 2018
},
"rules": {
"indent": [
"error",
2,
{
"SwitchCase": 1
}
],
"linebreak-style": ["error", "unix"],
"quotes": [
"error",
"single",
{
"avoidEscape": true
}
],
"semi": ["error", "never"],
"object-curly-spacing": ["error", "always"],
"require-await": "error",
"comma-dangle": ["error", "only-multiline"],
"space-before-function-paren": [
"error",
{
"anonymous": "always",
"named": "never",
"asyncArrow": "always"
}
]
}
}

@ -1,86 +0,0 @@
name: build
on:
push:
branches: ['*']
tags: ['v[0-9]+.[0-9]+.[0-9]+']
pull_request:
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- uses: actions/setup-node@v1
with:
node-version: 12
- run: yarn install
- run: yarn test
- run: yarn lint
- name: Telegram Failure Notification
uses: appleboy/telegram-action@0.0.7
if: failure()
with:
message: ❗ Build failed for [${{ github.repository }}](https://github.com/${{ github.repository }}/actions) because of ${{ github.actor }}
format: markdown
to: ${{ secrets.TELEGRAM_CHAT_ID }}
token: ${{ secrets.TELEGRAM_BOT_TOKEN }}
publish:
runs-on: ubuntu-latest
needs: build
if: startsWith(github.ref, 'refs/tags')
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set vars
id: vars
run: |
echo "::set-output name=version::$(echo ${GITHUB_REF#refs/tags/v})"
echo "::set-output name=repo_name::$(echo ${GITHUB_REPOSITORY#*/})"
- name: Check package.json version vs tag
run: |
[ ${{ steps.vars.outputs.version }} = $(grep '"version":' package.json | grep -o "[0-9.]*") ] || (echo "Git tag doesn't match version in package.json" && false)
- name: Build and push Docker image
uses: docker/build-push-action@v1.1.0
with:
dockerfile: Dockerfile
repository: tornadocash/relayer
tag_with_ref: true
tags: mining,candidate
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Telegram Message Notify
uses: appleboy/telegram-action@0.0.7
with:
to: ${{ secrets.TELEGRAM_CHAT_ID }}
token: ${{ secrets.TELEGRAM_BOT_TOKEN }}
message: 🚀 Published a [${{ steps.vars.outputs.repo_name }}](https://github.com/${{ github.repository }}) version ${{ steps.vars.outputs.version }} to docker hub
debug: true
format: markdown
- name: Telegram Relayer Channel Notification
uses: appleboy/telegram-action@0.0.7
with:
to: ${{ secrets.TELEGRAM_RELAYER_CHAT_ID }}
token: ${{ secrets.TELEGRAM_BOT_TOKEN }}
message: |
🚀 Published a new version of the relayer node service to docker hub: `tornadocash/relayer:v${{ steps.vars.outputs.version }}` and `tornadocash/relayer:mining`.
Please update your nodes ❗️
debug: true
format: markdown
- name: Telegram Failure Notification
uses: appleboy/telegram-action@0.0.7
if: failure()
with:
message: ❗ Failed to publish [${{ steps.vars.outputs.repo_name }}](https://github.com/${{ github.repository }}/actions) because of ${{ github.actor }}
format: markdown
to: ${{ secrets.TELEGRAM_CHAT_ID }}
token: ${{ secrets.TELEGRAM_BOT_TOKEN }}

10
.gitignore vendored

@ -1,8 +1,4 @@
.vscode
node_modules/
node_modules
.env
.env.mainnet
.env.kovan
kovan.*
dump.rdb
.idea
.env*

@ -1 +0,0 @@
keys/TreeUpdate.json

@ -1,7 +0,0 @@
{
"semi": false,
"arrowParens": "avoid",
"singleQuote": true,
"printWidth": 110,
"trailingComma": "all"
}

@ -1,9 +0,0 @@
FROM node:12
WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn && yarn cache clean --force
COPY . .
EXPOSE 8000
ENTRYPOINT ["yarn"]

112
README.md

@ -1,84 +1,60 @@
# Relayer for Tornado Cash [![Build Status](https://github.com/tornadocash/relayer/workflows/build/badge.svg)](https://github.com/tornadocash/relayer/actions) [![Docker Image Version (latest semver)](https://img.shields.io/docker/v/tornadocash/relayer?logo=docker&logoColor=%23FFFFFF&sort=semver)](https://hub.docker.com/repository/docker/tornadocash/relayer)
# Relayer for Tornado Cash [![Build Status](https://github.com/tornadocash/relayer/workflows/build/badge.svg)](https://github.com/tornadocash/relayer/actions)![Sidechains version](https://img.shields.io/badge/version-5.2.1-blue?logo=docker)![Mainnet version](https://img.shields.io/badge/version-4.1.5-blue?logo=docker)
## Getting listed on app.tornado.cash
**\*Tornado Cash was sanctioned by the US Treasury on 08/08/2022, which makes it illegal for US citizens to interact with Tornado Cash and all of its associated deployed smart contracts. Please understand the laws where you live and take all necessary steps to protect and anonymize yourself.**
If you would like to be listed in the tornado.cash UI relayer dropdown, please do the following:
**\*It is recommended to run your relayer on a VPS (Virtual Private Server) instance. Ensure SSH is configured securely; you can find information about SSH key generation and management [here](https://www.ssh.com/academy/ssh/keygen).**
1. Set up a tornado.cash relayer node (see below for a docker-compose.yml example)
2. Set up ENS subdomains (`goerli-v2.xxx.eth`, `mainnet-v2.xxx.eth`) with a TEXT record and URL key that points to your domain or IP address.
3. Test your relayer setup on the Goerli testnet at https://app.tornado.cash by choosing the custom relayer option on the withdraw tab. Enter your ENS name and initiate a withdrawal.
4. Open a new GitHub issue at https://github.com/tornadocash/tornado-relayer/issues and specify the following:
## Deploy with script and docker-compose
- your Goerli ENS URL
- your mainnet ENS URL
- your Telegram handle
- withdrawal tx on Goerli
- withdrawal tx on mainnet
_The following instructions are for Ubuntu 22.10; other operating systems may vary._
Please choose your testnet relayer's fee wisely.
#### Installation:
Disclaimer: Please consult with legal and tax advisors regarding the compliance of running a relayer service in your jurisdiction. The authors of this project bear no responsibility.
USE AT YOUR OWN RISK.
## Deploy with docker-compose
docker-compose.yml contains a stack that will automatically provision SSL certificates for your domain name and will add an HTTP-to-HTTPS redirect on port 80.
1. Download [docker-compose.yml](/docker-compose.yml) and [.env.example](/.env.example)
```
wget https://raw.githubusercontent.com/tornadocash/tornado-relayer/master/docker-compose.yml
wget https://raw.githubusercontent.com/tornadocash/tornado-relayer/master/.env.example -O .env
```
2. Setup environment variables
- set `NET_ID` (1 for mainnet, 5 for Goerli)
- set `HTTP_RPC_URL` rpc url for your ethereum node
- set `WS_RPC_URL` websocket url
- set `ORACLE_RPC_URL` - RPC URL for a mainnet node used for fetching prices (must always point to mainnet)
- set `PRIVATE_KEY` for your relayer address (without 0x prefix)
- set `VIRTUAL_HOST` and `LETSENCRYPT_HOST` to your domain and add DNS record pointing to your relayer ip address
- set `REGULAR_TORNADO_WITHDRAW_FEE` - fee in % that is used for tornado pool withdrawals
- set `MINING_SERVICE_FEE` - fee in % that is used for mining AP withdrawals
- set `REWARD_ACCOUNT` - eth address that is used to collect fees
- update `AGGREGATOR` if needed - Contract address of aggregator instance.
- update `CONFIRMATIONS` if needed - how many block confirmations to wait before processing an event. It is not recommended to set it below 3
- update `MAX_GAS_PRICE` if needed - maximum gas price in gwei for the relayer's transactions
If you want to use more than one ETH address for relaying transactions, please add as many `workers` as you want. For example, you can uncomment the `worker2` section in the docker-compose.yml file, but please use a different `PRIVATE_KEY` for each worker.
3. Run `docker-compose up -d`
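Once the stack is up, it helps to confirm the containers are healthy before pointing the UI at your domain. A minimal verification sketch, assuming the service names from the docker-compose.yml in this repository (`server`, `worker1`, `redis`), the default `APP_PORT` of 8000, and that the relayer exposes the `/status` endpoint referenced in the deployment section below:
```bash
# list the running services and their state
docker-compose ps

# follow the server and worker logs (Ctrl+C to stop)
docker-compose logs -f server worker1

# query the relayer locally; /status should respond with the relayer's settings
curl -s http://127.0.0.1:8000/status
```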
## Run locally
1. `npm i`
2. `cp .env.example .env`
3. Modify `.env` as needed
4. `npm run start`
5. Go to `http://127.0.0.1:8000`
6. In order to execute a withdraw request, you can run the following command
Just run in terminal:
```bash
curl -X POST -H 'content-type:application/json' --data '<input data>' http://127.0.0.1:8000/relay
curl -s https://git.tornado.ws/tornadocash/tornado-relayer/raw/branch/main/install.sh | bash
```
Relayer should return a transaction hash
#### Configuring environments:
In that case you will need to add HTTPS termination yourself, because browsers with default settings will prevent the HTTPS
tornado.cash UI from submitting your request over an HTTP connection
1. Go to the `tornado-relayer` folder in the server home directory
2. Check environment files:
## Architecture
By default, each network is preconfigured with an environment file named `.env.<NETWORK>`:
1. TreeWatcher module keeps track of Account Tree changes and automatically caches the actual state in Redis and emits `treeUpdate` event to redis pub/sub channel
2. Server module is Express.js instance that accepts http requests
3. Controller contains handlers for the Server endpoints. It validates input data and adds a Job to Queue.
4. Queue module is used by Controller to put and get Job from queue (bull wrapper)
5. Status module contains the handler to get a Job's status. It's used by the UI to poll for updates
6. Validate contains validation logic for all endpoints
7. Worker is the main module that gets a Job from queue and processes it
- `.env.eth` for Ethereum Mainnet
- `.env.bsc` for Binance Smart Chain
- `.env.arb` for Arbitrum
- `.env.op` for Optimism
- `.env.gnosis` for Gnosis (xdai)
- `.env.polygon` for Polygon (matic)
- `.env.avax` for Avalanche C-Chain
Disclaimer:
3. Configure (fill) the environment files for those networks on which the relayer will be deployed (a filled example is sketched after this list):
- Set `PRIVATE_KEY` to your relayer's private key (without the 0x prefix) in each environment file
- _It is recommended not to reuse the same private key across networks as a security measure_
- Set `VIRTUAL_HOST` and `LETSENCRYPT_HOST` to a unique subdomain for every network in each environment file
- e.g. `mainnet.example.com` for Ethereum, `binance.example.com` for Binance, etc.
- add a wildcard A DNS record pointing to your instance's IP address to configure the subdomains
- Set `RELAYER_FEE` to what you would like to charge as your fee (remember 0.3% is deducted from your staked relayer balance)
- Set `RPC_URL` to a non-censoring RPC (You can [run your own](https://github.com/feshchenkod/rpc-nodes), or use a [free option](https://chainnodes.org/))
- Set `ORACLE_RPC_URL` to an Ethereum native RPC endpoint
4. (Optional) If you want to run a relayer for [Nova](https://nova.tornado.ws), fill the `.env.nova` file following the instructions in the [Nova branch](https://git.tornado.ws/tornadocash/tornado-relayer/src/branch/nova), because its configuration is very specific
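A hypothetical filled sidechain environment file might look like the sketch below. All values are placeholders, only the variables discussed above are shown, and anything else from the shipped `.env.example` keeps its default:
```bash
# .env.bsc — illustrative placeholder values only
PRIVATE_KEY=abc123...                      # relayer private key without the 0x prefix
VIRTUAL_HOST=binance.example.com           # unique subdomain for this network
LETSENCRYPT_HOST=binance.example.com
RELAYER_FEE=0.5                            # fee in percent
RPC_URL=https://bsc-rpc.example.org        # non-censoring RPC for this network
ORACLE_RPC_URL=https://eth-rpc.example.org # Ethereum mainnet RPC for the price oracle
```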
#### Deployment:
1. Build and deploy the Docker services for the configured networks, specified via `--profile <NETWORK_SYMBOL>`, for example (if you run the relayer only for Ethereum Mainnet, Binance Smart Chain and Arbitrum):
- `docker-compose --profile eth --profile bsc --profile arb up -d`
2. Visit your domain addresses and check each `/status` endpoint to ensure there are no errors in the `status` fields (see the example below)
3. Optional: if you want to run the Nova relayer, just add `--profile nova` to the docker-compose command
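A quick way to eyeball every deployed network's status page from a shell — the subdomains below are the hypothetical examples used earlier and should be replaced with your own:
```bash
# print the /status response for each configured subdomain
for host in mainnet.example.com binance.example.com arbitrum.example.com; do
  echo "== $host =="
  curl -s "https://$host/status"
  echo
done
```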
If you want to change some relayer parameters, for example the RPC URL or fee percent, stop the relayer software with `docker-compose down --remove-orphans`, change what you need in the corresponding `.env.{name}` file, and rerun the relayer as described above.
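The parameter-change cycle, sketched with the same profiles as the deployment example above (adjust the profiles and file names to your setup):
```bash
# stop all relayer containers
docker-compose down --remove-orphans

# edit the environment file for the network you want to change,
# e.g. the fee or RPC URL for Ethereum mainnet
nano .env.eth

# start the relayer again with the same profiles as before
docker-compose --profile eth --profile bsc --profile arb up -d
```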
#### Disclaimer:
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

@ -1,319 +0,0 @@
[
{
"inputs": [
{
"internalType": "bytes32[]",
"name": "domains",
"type": "bytes32[]"
}
],
"name": "bulkResolve",
"outputs": [
{
"internalType": "address[]",
"name": "result",
"type": "address[]"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract Governance",
"name": "governance",
"type": "address"
}
],
"name": "getAllProposals",
"outputs": [
{
"components": [
{
"internalType": "address",
"name": "proposer",
"type": "address"
},
{
"internalType": "address",
"name": "target",
"type": "address"
},
{
"internalType": "uint256",
"name": "startTime",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "endTime",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "forVotes",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "againstVotes",
"type": "uint256"
},
{
"internalType": "bool",
"name": "executed",
"type": "bool"
},
{
"internalType": "bool",
"name": "extended",
"type": "bool"
},
{
"internalType": "enum Governance.ProposalState",
"name": "state",
"type": "uint8"
}
],
"internalType": "struct GovernanceAggregator.Proposal[]",
"name": "proposals",
"type": "tuple[]"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract Governance",
"name": "governance",
"type": "address"
},
{
"internalType": "address[]",
"name": "accs",
"type": "address[]"
}
],
"name": "getGovernanceBalances",
"outputs": [
{
"internalType": "uint256[]",
"name": "amounts",
"type": "uint256[]"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address[]",
"name": "fromTokens",
"type": "address[]"
},
{
"internalType": "uint256[]",
"name": "oneUnitAmounts",
"type": "uint256[]"
}
],
"name": "getPricesInETH",
"outputs": [
{
"internalType": "uint256[]",
"name": "prices",
"type": "uint256[]"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract Governance",
"name": "governance",
"type": "address"
},
{
"internalType": "address",
"name": "account",
"type": "address"
}
],
"name": "getUserData",
"outputs": [
{
"internalType": "uint256",
"name": "balance",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "latestProposalId",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "latestProposalIdState",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "timelock",
"type": "uint256"
},
{
"internalType": "address",
"name": "delegatee",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract Miner",
"name": "miner",
"type": "address"
},
{
"internalType": "address[]",
"name": "instances",
"type": "address[]"
}
],
"name": "minerRates",
"outputs": [
{
"internalType": "uint256[]",
"name": "_rates",
"type": "uint256[]"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "bytes32",
"name": "node",
"type": "bytes32"
}
],
"name": "resolve",
"outputs": [
{
"internalType": "address",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract RewardSwap",
"name": "swap",
"type": "address"
}
],
"name": "swapState",
"outputs": [
{
"internalType": "uint256",
"name": "balance",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "poolWeight",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract Miner",
"name": "miner",
"type": "address"
},
{
"internalType": "address[]",
"name": "instances",
"type": "address[]"
},
{
"internalType": "contract RewardSwap",
"name": "swap",
"type": "address"
}
],
"name": "miningData",
"outputs": [
{
"internalType": "uint256[]",
"name": "_rates",
"type": "uint256[]"
},
{
"internalType": "uint256",
"name": "balance",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "poolWeight",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address[]",
"name": "fromTokens",
"type": "address[]"
},
{
"internalType": "uint256[]",
"name": "oneUnitAmounts",
"type": "uint256[]"
},
{
"internalType": "contract RewardSwap",
"name": "swap",
"type": "address"
}
],
"name": "marketData",
"outputs": [
{
"internalType": "uint256[]",
"name": "prices",
"type": "uint256[]"
},
{
"internalType": "uint256",
"name": "balance",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
}
]

@ -1,12 +0,0 @@
[
{
"inputs": [
{ "internalType": "contract IERC20", "name": "srcToken", "type": "address" },
{ "internalType": "contract IERC20", "name": "dstToken", "type": "address" }
],
"name": "getRate",
"outputs": [{ "internalType": "uint256", "name": "weightedRate", "type": "uint256" }],
"stateMutability": "view",
"type": "function"
}
]

File diff suppressed because it is too large

@ -1,252 +0,0 @@
[
{
"inputs": [
{
"internalType": "bytes32",
"name": "_torn",
"type": "bytes32"
},
{
"internalType": "bytes32",
"name": "_miner",
"type": "bytes32"
},
{
"internalType": "uint256",
"name": "_miningCap",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "_initialLiquidity",
"type": "uint256"
}
],
"stateMutability": "nonpayable",
"type": "constructor"
},
{
"anonymous": false,
"inputs": [
{
"indexed": false,
"internalType": "uint256",
"name": "newWeight",
"type": "uint256"
}
],
"name": "PoolWeightUpdated",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "recipient",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "pTORN",
"type": "uint256"
},
{
"indexed": false,
"internalType": "uint256",
"name": "TORN",
"type": "uint256"
}
],
"name": "Swap",
"type": "event"
},
{
"inputs": [],
"name": "DURATION",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "initialLiquidity",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "liquidity",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "miner",
"outputs": [
{
"internalType": "address",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "poolWeight",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "bytes32",
"name": "node",
"type": "bytes32"
}
],
"name": "resolve",
"outputs": [
{
"internalType": "address",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "startTimestamp",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "tokensSold",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "torn",
"outputs": [
{
"internalType": "contract IERC20",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "recipient",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "swap",
"outputs": [],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "getExpectedReturn",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "tornVirtualBalance",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "uint256",
"name": "newWeight",
"type": "uint256"
}
],
"name": "setPoolWeight",
"outputs": [],
"stateMutability": "nonpayable",
"type": "function"
}
]

@ -1,498 +0,0 @@
[
{
"constant": false,
"inputs": [
{
"internalType": "address",
"name": "_newOperator",
"type": "address"
}
],
"name": "changeOperator",
"outputs": [],
"payable": false,
"stateMutability": "nonpayable",
"type": "function"
},
{
"constant": true,
"inputs": [
{
"internalType": "bytes32",
"name": "",
"type": "bytes32"
}
],
"name": "nullifierHashes",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": false,
"inputs": [
{
"internalType": "bytes",
"name": "_proof",
"type": "bytes"
},
{
"internalType": "bytes32",
"name": "_root",
"type": "bytes32"
},
{
"internalType": "bytes32",
"name": "_nullifierHash",
"type": "bytes32"
},
{
"internalType": "address payable",
"name": "_recipient",
"type": "address"
},
{
"internalType": "address payable",
"name": "_relayer",
"type": "address"
},
{
"internalType": "uint256",
"name": "_fee",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "_refund",
"type": "uint256"
}
],
"name": "withdraw",
"outputs": [],
"payable": true,
"stateMutability": "payable",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "verifier",
"outputs": [
{
"internalType": "contract IVerifier",
"name": "",
"type": "address"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [
{
"internalType": "bytes32",
"name": "_left",
"type": "bytes32"
},
{
"internalType": "bytes32",
"name": "_right",
"type": "bytes32"
}
],
"name": "hashLeftRight",
"outputs": [
{
"internalType": "bytes32",
"name": "",
"type": "bytes32"
}
],
"payable": false,
"stateMutability": "pure",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "FIELD_SIZE",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "levels",
"outputs": [
{
"internalType": "uint32",
"name": "",
"type": "uint32"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "operator",
"outputs": [
{
"internalType": "address",
"name": "",
"type": "address"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [
{
"internalType": "bytes32",
"name": "_root",
"type": "bytes32"
}
],
"name": "isKnownRoot",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [
{
"internalType": "bytes32",
"name": "",
"type": "bytes32"
}
],
"name": "commitments",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "denomination",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "currentRootIndex",
"outputs": [
{
"internalType": "uint32",
"name": "",
"type": "uint32"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": false,
"inputs": [
{
"internalType": "address",
"name": "_newVerifier",
"type": "address"
}
],
"name": "updateVerifier",
"outputs": [],
"payable": false,
"stateMutability": "nonpayable",
"type": "function"
},
{
"constant": false,
"inputs": [
{
"internalType": "bytes32",
"name": "_commitment",
"type": "bytes32"
}
],
"name": "deposit",
"outputs": [],
"payable": true,
"stateMutability": "payable",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "getLastRoot",
"outputs": [
{
"internalType": "bytes32",
"name": "",
"type": "bytes32"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"name": "roots",
"outputs": [
{
"internalType": "bytes32",
"name": "",
"type": "bytes32"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "ROOT_HISTORY_SIZE",
"outputs": [
{
"internalType": "uint32",
"name": "",
"type": "uint32"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [
{
"internalType": "bytes32",
"name": "_nullifierHash",
"type": "bytes32"
}
],
"name": "isSpent",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"name": "zeros",
"outputs": [
{
"internalType": "bytes32",
"name": "",
"type": "bytes32"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "ZERO_VALUE",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"name": "filledSubtrees",
"outputs": [
{
"internalType": "bytes32",
"name": "",
"type": "bytes32"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"constant": true,
"inputs": [],
"name": "nextIndex",
"outputs": [
{
"internalType": "uint32",
"name": "",
"type": "uint32"
}
],
"payable": false,
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract IVerifier",
"name": "_verifier",
"type": "address"
},
{
"internalType": "uint256",
"name": "_denomination",
"type": "uint256"
},
{
"internalType": "uint32",
"name": "_merkleTreeHeight",
"type": "uint32"
},
{
"internalType": "address",
"name": "_operator",
"type": "address"
}
],
"payable": false,
"stateMutability": "nonpayable",
"type": "constructor"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "bytes32",
"name": "commitment",
"type": "bytes32"
},
{
"indexed": false,
"internalType": "uint32",
"name": "leafIndex",
"type": "uint32"
},
{
"indexed": false,
"internalType": "uint256",
"name": "timestamp",
"type": "uint256"
}
],
"name": "Deposit",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": false,
"internalType": "address",
"name": "to",
"type": "address"
},
{
"indexed": false,
"internalType": "bytes32",
"name": "nullifierHash",
"type": "bytes32"
},
{
"indexed": true,
"internalType": "address",
"name": "relayer",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "fee",
"type": "uint256"
}
],
"name": "Withdrawal",
"type": "event"
}
]

@ -1,171 +0,0 @@
[
{
"inputs": [
{
"internalType": "bytes32",
"name": "_tornadoTrees",
"type": "bytes32"
},
{
"internalType": "bytes32",
"name": "_governance",
"type": "bytes32"
},
{
"internalType": "contract ITornado[]",
"name": "_instances",
"type": "address[]"
}
],
"stateMutability": "nonpayable",
"type": "constructor"
},
{
"inputs": [],
"name": "governance",
"outputs": [
{
"internalType": "address",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract ITornado",
"name": "",
"type": "address"
}
],
"name": "instances",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "bytes32",
"name": "node",
"type": "bytes32"
}
],
"name": "resolve",
"outputs": [
{
"internalType": "address",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [],
"name": "tornadoTrees",
"outputs": [
{
"internalType": "contract ITornadoTrees",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract ITornado",
"name": "tornado",
"type": "address"
},
{
"internalType": "bytes32",
"name": "commitment",
"type": "bytes32"
}
],
"name": "deposit",
"outputs": [],
"stateMutability": "payable",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract ITornado",
"name": "instance",
"type": "address"
},
{
"internalType": "bool",
"name": "update",
"type": "bool"
}
],
"name": "updateInstances",
"outputs": [],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract ITornado",
"name": "tornado",
"type": "address"
},
{
"internalType": "bytes",
"name": "proof",
"type": "bytes"
},
{
"internalType": "bytes32",
"name": "root",
"type": "bytes32"
},
{
"internalType": "bytes32",
"name": "nullifierHash",
"type": "bytes32"
},
{
"internalType": "address payable",
"name": "recipient",
"type": "address"
},
{
"internalType": "address payable",
"name": "relayer",
"type": "address"
},
{
"internalType": "uint256",
"name": "fee",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "refund",
"type": "uint256"
}
],
"name": "withdraw",
"outputs": [],
"stateMutability": "payable",
"type": "function"
}
]

1
app.js

@ -1 +0,0 @@
module.exports = require('./src/index')

@ -1,62 +0,0 @@
version: '2'
# ssh-agent && ssh-add -K ~/.ssh/id_rsa
# DOCKER_BUILDKIT=1 docker build --ssh default -t tornadocash/relayer .
services:
server:
image: tornadocash/relayer
restart: always
command: server
env_file: .env
ports:
- 8000:8000
environment:
REDIS_URL: redis://redis/0
nginx_proxy_read_timeout: 600
depends_on: [redis]
treeWatcher:
image: tornadocash/relayer
restart: always
command: treeWatcher
env_file: .env
environment:
REDIS_URL: redis://redis/0
depends_on: [redis]
priceWatcher:
image: tornadocash/relayer
restart: always
command: priceWatcher
env_file: .env
environment:
REDIS_URL: redis://redis/0
depends_on: [redis]
worker1:
image: tornadocash/relayer
restart: always
command: worker
env_file: .env
environment:
REDIS_URL: redis://redis/0
depends_on: [redis]
# worker2:
# image: tornadocash/relayer
# restart: always
# command: worker
# env_file: .env
# environment:
# PRIVATE_KEY: qwe
# REDIS_URL: redis://redis/0
redis:
image: redis
restart: always
command: [redis-server, --appendonly, 'yes']
volumes:
- redis:/data
volumes:
redis:

@ -1,154 +1,471 @@
version: '2'
version: "2"
services:
server:
image: tornadocash/relayer:mining
restart: always
command: server
env_file: .env
environment:
REDIS_URL: redis://redis/0
nginx_proxy_read_timeout: 600
depends_on: [redis]
redis:
image: redis
restart: always
command: [redis-server, --appendonly, "yes"]
volumes:
- redis:/data
treeWatcher:
image: tornadocash/relayer:mining
restart: always
command: treeWatcher
env_file: .env
environment:
REDIS_URL: redis://redis/0
depends_on: [redis]
nginx:
image: nginx:alpine
container_name: nginx
restart: always
ports:
- 80:80
- 443:443
volumes:
- conf:/etc/nginx/conf.d
- vhost:/etc/nginx/vhost.d
- html:/usr/share/nginx/html
- certs:/etc/nginx/certs
logging:
driver: none
priceWatcher:
image: tornadocash/relayer:mining
restart: always
command: priceWatcher
env_file: .env
environment:
REDIS_URL: redis://redis/0
depends_on: [redis]
dockergen:
image: poma/docker-gen
container_name: dockergen
restart: always
command: -notify-sighup nginx -watch /etc/docker-gen/templates/nginx.tmpl /etc/nginx/conf.d/default.conf
volumes_from:
- nginx
volumes:
- /var/run/docker.sock:/var/run/docker.sock:ro
healthWatcher:
image: tornadocash/relayer:mining
restart: always
command: healthWatcher
env_file: .env
environment:
REDIS_URL: redis://redis/0
depends_on: [redis]
letsencrypt:
image: jrcs/letsencrypt-nginx-proxy-companion
container_name: letsencrypt
restart: always
environment:
NGINX_DOCKER_GEN_CONTAINER: dockergen
volumes_from:
- nginx
- dockergen
worker1:
image: tornadocash/relayer:mining
restart: always
command: worker
env_file: .env
environment:
REDIS_URL: redis://redis/0
depends_on: [redis]
# ---------------------- ETH Mainnet ----------------------- #
# worker2:
# image: tornadocash/relayer:mining
# restart: always
# command: worker
# env_file: .env
# environment:
# PRIVATE_KEY: qwe
# REDIS_URL: redis://redis/0
eth-server:
build: .
image: tornadocash/relayer:mainnet-v4
profiles: ["eth"]
restart: always
command: server
env_file: .env.eth
environment:
NET_ID: 1
REDIS_URL: redis://redis/0
nginx_proxy_read_timeout: 600
depends_on: [redis]
# # this container will proxy *.onion domain to the server container
# # if you want to run *only* as .onion service, you don't need `nginx`, `letsencrypt`, `dockergen` containers
# tor:
# image: strm/tor
# restart: always
# depends_on: [server]
# environment:
# LISTEN_PORT: 80
# REDIRECT: server:8000
# # Generate a new key with
# # docker run --rm --entrypoint shallot strm/tor-hiddenservice-nginx ^foo
# PRIVATE_KEY: |
# -----BEGIN RSA PRIVATE KEY-----
# ...
# -----END RSA PRIVATE KEY-----
eth-treeWatcher:
image: tornadocash/relayer:mainnet-v4
profiles: ["eth"]
restart: always
command: treeWatcher
env_file: .env.eth
environment:
NET_ID: 1
REDIS_URL: redis://redis/0
depends_on: [redis, eth-server]
# # auto update docker containers when new image is pushed to docker hub (be careful with that)
# watchtower:
# image: v2tec/watchtower
# restart: always
# volumes:
# - /var/run/docker.sock:/var/run/docker.sock
eth-priceWatcher:
image: tornadocash/relayer:mainnet-v4
profiles: ["eth"]
restart: always
command: priceWatcher
env_file: .env.eth
environment:
NET_ID: 1
REDIS_URL: redis://redis/0
depends_on: [redis, eth-server]
# # this container will send Telegram notifications when other containers are stopped/restarted
# # it's best to run this container on some other instance, otherwise it can't notify if the whole instance goes down
# notifier:
# image: poma/docker-telegram-notifier
# restart: always
# volumes:
# - /var/run/docker.sock:/var/run/docker.sock:ro
# environment:
# # How to create bot: https://core.telegram.org/bots#3-how-do-i-create-a-bot
# # How to get chat id: https://stackoverflow.com/questions/32423837/telegram-bot-how-to-get-a-group-chat-id/32572159#32572159
# TELEGRAM_NOTIFIER_BOT_TOKEN: ...
# TELEGRAM_NOTIFIER_CHAT_ID: ...
eth-healthWatcher:
image: tornadocash/relayer:mainnet-v4
profiles: ["eth"]
restart: always
command: healthWatcher
env_file: .env.eth
environment:
NET_ID: 1
REDIS_URL: redis://redis/0
depends_on: [redis, eth-server]
# # this container will send Telegram notifications if specified address doesn't have enough funds
# monitor_mainnet:
# image: peppersec/monitor_eth
# restart: always
# environment:
# TELEGRAM_NOTIFIER_BOT_TOKEN: ...
# TELEGRAM_NOTIFIER_CHAT_ID: ...
# ADDRESS: '0x0000000000000000000000000000000000000000'
# THRESHOLD: 0.5 # ETH
# RPC_URL: https://mainnet.infura.io
# BLOCK_EXPLORER: etherscan.io
eth-worker1:
image: tornadocash/relayer:mainnet-v4
profiles: ["eth"]
restart: always
command: worker
env_file: .env.eth
environment:
NET_ID: 1
REDIS_URL: redis://redis/0
depends_on: [redis, eth-server]
redis:
image: redis
restart: always
command: [redis-server, --appendonly, 'yes']
volumes:
- redis:/data
# # This is additional worker for ethereum mainnet
# # So you can process transactions from multiple addresses, but before it you need to set up those addresses as workers
# eth-worker2:
# image: tornadocash/relayer:mainnet-v4
# profiles: [ 'eth' ]
# restart: always
# command: worker
# env_file: .env2.eth
# environment:
# REDIS_URL: redis://redis/0
nginx:
image: nginx:alpine
container_name: nginx
restart: always
ports:
- 80:80
- 443:443
volumes:
- conf:/etc/nginx/conf.d
- vhost:/etc/nginx/vhost.d
- html:/usr/share/nginx/html
- certs:/etc/nginx/certs
logging:
driver: none
# # this container will proxy *.onion domain to the server container
# # if you want to run *only* as .onion service, you don't need `nginx`, `letsencrypt`, `dockergen` containers
# tor:
# image: strm/tor
# restart: always
# depends_on: [server]
# environment:
# LISTEN_PORT: 80
# REDIRECT: server:8000
# # Generate a new key with
# # docker run --rm --entrypoint shallot strm/tor-hiddenservice-nginx ^foo
# PRIVATE_KEY: |
# -----BEGIN RSA PRIVATE KEY-----
# ...
# -----END RSA PRIVATE KEY-----
dockergen:
image: poma/docker-gen
container_name: dockergen
restart: always
command: -notify-sighup nginx -watch /etc/docker-gen/templates/nginx.tmpl /etc/nginx/conf.d/default.conf
volumes_from:
- nginx
volumes:
- /var/run/docker.sock:/var/run/docker.sock:ro
# # auto update docker containers when new image is pushed to docker hub (be careful with that)
# watchtower:
# image: v2tec/watchtower
# restart: always
# volumes:
# - /var/run/docker.sock:/var/run/docker.sock
letsencrypt:
image: jrcs/letsencrypt-nginx-proxy-companion
container_name: letsencrypt
restart: always
environment:
NGINX_DOCKER_GEN_CONTAINER: dockergen
volumes_from:
- nginx
- dockergen
# # this container will send Telegram notifications when other containers are stopped/restarted
# # it's best to run this container on some other instance, otherwise it can't notify if the whole instance goes down
# notifier:
# image: poma/docker-telegram-notifier
# restart: always
# volumes:
# - /var/run/docker.sock:/var/run/docker.sock:ro
# environment:
# # How to create bot: https://core.telegram.org/bots#3-how-do-i-create-a-bot
# # How to get chat id: https://stackoverflow.com/questions/32423837/telegram-bot-how-to-get-a-group-chat-id/32572159#32572159
# TELEGRAM_NOTIFIER_BOT_TOKEN: ...
# TELEGRAM_NOTIFIER_CHAT_ID: ...
# # this container will send Telegram notifications if specified address doesn't have enough funds
# monitor_mainnet:
# image: peppersec/monitor_eth
# restart: always
# environment:
# TELEGRAM_NOTIFIER_BOT_TOKEN: ...
# TELEGRAM_NOTIFIER_CHAT_ID: ...
# ADDRESS: '0x0000000000000000000000000000000000000000'
# THRESHOLD: 0.5 # ETH
# RPC_URL: https://mainnet.infura.io
# BLOCK_EXPLORER: etherscan.io
# -------------------------------------------------- #
# ---------------------- BSC (Binance Smart Chain) ----------------------- #
bsc-server:
image: tornadocash/relayer:sidechain-v5
profiles: ["bsc"]
restart: always
command: server
env_file: .env.bsc
environment:
NET_ID: 56
REDIS_URL: redis://redis/1
nginx_proxy_read_timeout: 600
depends_on: [redis]
bsc-healthWatcher:
image: tornadocash/relayer:sidechain-v5
profiles: ["bsc"]
restart: always
command: healthWatcher
env_file: .env.bsc
environment:
NET_ID: 56
REDIS_URL: redis://redis/1
depends_on: [redis, bsc-server]
bsc-worker1:
image: tornadocash/relayer:sidechain-v5
profiles: ["bsc"]
restart: always
command: worker
env_file: .env.bsc
environment:
NET_ID: 56
REDIS_URL: redis://redis/1
depends_on: [redis, bsc-server]
# -------------------------------------------------- #
# ---------------------- Polygon (MATIC) --------------------- #
polygon-server:
image: tornadocash/relayer:sidechain-v5
profiles: ["polygon"]
restart: always
command: server
env_file: .env.polygon
environment:
NET_ID: 137
REDIS_URL: redis://redis/2
nginx_proxy_read_timeout: 600
depends_on: [redis]
polygon-healthWatcher:
image: tornadocash/relayer:sidechain-v5
profiles: ["polygon"]
restart: always
command: healthWatcher
env_file: .env.polygon
environment:
NET_ID: 137
REDIS_URL: redis://redis/2
depends_on: [redis, polygon-server]
polygon-worker1:
image: tornadocash/relayer:sidechain-v5
profiles: ["polygon"]
restart: always
command: worker
env_file: .env.polygon
environment:
NET_ID: 137
REDIS_URL: redis://redis/2
depends_on: [redis, polygon-server]
# -------------------------------------------------- #
# ---------------------- Gnosis (XDAI) ---------------------- #
gnosis-server:
image: tornadocash/relayer:sidechain-v5
profiles: ["gnosis"]
restart: always
command: server
env_file: .env.gnosis
environment:
NET_ID: 100
REDIS_URL: redis://redis/3
nginx_proxy_read_timeout: 600
depends_on: [redis]
gnosis-healthWatcher:
image: tornadocash/relayer:sidechain-v5
profiles: ["gnosis"]
restart: always
command: healthWatcher
env_file: .env.gnosis
environment:
NET_ID: 100
REDIS_URL: redis://redis/3
depends_on: [redis, gnosis-server]
gnosis-worker1:
image: tornadocash/relayer:sidechain-v5
profiles: ["gnosis"]
restart: always
command: worker
env_file: .env.gnosis
environment:
NET_ID: 100
REDIS_URL: redis://redis/3
depends_on: [redis, gnosis-server]
# -------------------------------------------------- #
# ---------------------- AVAX ---------------------- #
avax-server:
image: tornadocash/relayer:sidechain-v5
profiles: ["avax"]
restart: always
command: server
env_file: .env.avax
environment:
NET_ID: 43114
REDIS_URL: redis://redis/4
nginx_proxy_read_timeout: 600
depends_on: [redis]
avax-healthWatcher:
image: tornadocash/relayer:sidechain-v5
profiles: ["avax"]
restart: always
command: healthWatcher
env_file: .env.avax
environment:
NET_ID: 43114
REDIS_URL: redis://redis/4
depends_on: [redis, avax-server]
avax-worker1:
image: tornadocash/relayer:sidechain-v5
profiles: ["avax"]
restart: always
command: worker
env_file: .env.avax
environment:
NET_ID: 43114
REDIS_URL: redis://redis/4
depends_on: [redis, avax-server]
# -------------------------------------------------- #
# ---------------------- OP ------------------------ #
op-server:
image: tornadocash/relayer:sidechain-v5
profiles: ["op"]
restart: always
command: server
env_file: .env.op
environment:
NET_ID: 10
REDIS_URL: redis://redis/5
nginx_proxy_read_timeout: 600
depends_on: [redis]
op-healthWatcher:
image: tornadocash/relayer:sidechain-v5
profiles: ["op"]
restart: always
command: healthWatcher
env_file: .env.op
environment:
NET_ID: 10
REDIS_URL: redis://redis/5
depends_on: [redis, op-server]
op-worker1:
image: tornadocash/relayer:sidechain-v5
profiles: ["op"]
restart: always
command: worker
env_file: .env.op
environment:
NET_ID: 10
REDIS_URL: redis://redis/5
depends_on: [redis, op-server]
# -------------------------------------------------- #
# ---------------------- Arbitrum ----------------------- #
arb-server:
image: tornadocash/relayer:sidechain-v5
profiles: ["arb"]
restart: always
command: server
env_file: .env.arb
environment:
NET_ID: 42161
REDIS_URL: redis://redis/6
nginx_proxy_read_timeout: 600
depends_on: [redis]
arb-healthWatcher:
image: tornadocash/relayer:sidechain-v5
profiles: ["arb"]
restart: always
command: healthWatcher
env_file: .env.arb
environment:
NET_ID: 42161
REDIS_URL: redis://redis/6
depends_on: [redis, arb-server]
arb-worker1:
image: tornadocash/relayer:sidechain-v5
profiles: ["arb"]
restart: always
command: worker
env_file: .env.arb
environment:
NET_ID: 42161
REDIS_URL: redis://redis/6
depends_on: [redis, arb-server]
# -------------------------------------------------- #
# ---------------------- Goerli (Ethereum Testnet) ---------------------- #
goerli-server:
image: tornadocash/relayer:mainnet-v4
profiles: ["geth"]
restart: always
command: server
env_file: .env.goerli
environment:
NET_ID: 5
REDIS_URL: redis://redis/7
nginx_proxy_read_timeout: 600
depends_on: [redis]
goerli-treeWatcher:
image: tornadocash/relayer:mainnet-v4
profiles: ["goerli"]
restart: always
command: treeWatcher
env_file: .env.goerli
environment:
NET_ID: 5
REDIS_URL: redis://redis/7
depends_on: [redis, goerli-server]
goerli-priceWatcher:
image: tornadocash/relayer:mainnet-v4
profiles: ["goerli"]
restart: always
command: priceWatcher
env_file: .env.goerli
environment:
NET_ID: 5
REDIS_URL: redis://redis/7
depends_on: [redis, goerli-server]
goerli-healthWatcher:
image: tornadocash/relayer:mainnet-v4
profiles: ["goerli"]
restart: always
command: healthWatcher
env_file: .env.goerli
environment:
NET_ID: 5
REDIS_URL: redis://redis/7
depends_on: [redis, goerli-server]
goerli-worker1:
image: tornadocash/relayer:mainnet-v4
profiles: ["goerli"]
restart: always
command: worker
env_file: .env.goerli
environment:
NET_ID: 5
REDIS_URL: redis://redis/7
depends_on: [redis, goerli-server]
# -------------------------------------------------- #
# ---------------------- Tornado Nova (Gnosis Chain) ----------------------- #
server:
image: tornadocash/relayer:nova
profiles: ["nova"]
restart: always
command: start:prod
env_file: .env.nova
environment:
REDIS_URL: redis://redis/8
nginx_proxy_read_timeout: 600
depends_on: [redis]
volumes:
conf:
vhost:
html:
certs:
redis:
conf:
vhost:
html:
certs:
redis:

120
install.sh Normal file

@ -0,0 +1,120 @@
#!/bin/bash
# Script must be run as root
if [ "$EUID" -ne 0 ];
then echo "Please run as root";
exit 1;
fi;
relayer_soft_git_repo="https://git.tornado.ws/tornadocash/tornado-relayer";
user_home_dir=$(eval echo ~$USER);
relayer_folder="$user_home_dir/tornado-relayer";
relayer_mainnet_soft_source_folder="$relayer_folder/mainnet-soft-source";
relayer_sidechains_soft_source_folder="$relayer_folder/sidechains-soft-source";
nova_relayer_soft_source_folder="$relayer_folder/nova-soft-source";
script_log_file="/tmp/tornado-relayer-installation.log"
if [ -f $script_log_file ]; then rm $script_log_file; fi;
function echo_log_err(){
echo $1 1>&2;
echo -e "$1\n" &>> $script_log_file;
}
function echo_log_err_and_exit(){
echo_log_err "$1";
exit 1;
}
function is_package_installed(){
if [ $(dpkg-query -W -f='${Status}' $1 2>/dev/null | grep -c "ok installed") -eq 0 ]; then return 1; else return 0; fi;
}
function install_requred_packages(){
apt update &>> $script_log_file;
requred_packages=("curl" "git-all" "ufw" "nginx");
local package;
for package in ${requred_packages[@]}; do
if ! is_package_installed $package; then
# Kill the apache process, because Debian configures the nginx package right during installation
if [ $package = "nginx" ]; then systemctl stop apache2; fi;
apt install --yes --force-yes -o DPkg::Options::="--force-confold" $package &>> $script_log_file;
if ! is_package_installed $package; then
echo_log_err_and_exit "Error: cannot install \"$package\" package";
fi;
fi;
done;
echo -e "\nAll required packages installed successfully";
}
function install_node(){
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.3/install.sh | bash;
. ~/.nvm/nvm.sh;
. ~/.profile;
. ~/.bashrc;
nvm install 14.21.3;
}
function install_repositories(){
git clone $relayer_soft_git_repo -b main $relayer_folder
git clone $relayer_soft_git_repo -b mainnet-v4 $relayer_mainnet_soft_source_folder;
git clone $relayer_soft_git_repo -b sidechain-v5 $relayer_sidechains_soft_source_folder;
git clone $relayer_soft_git_repo -b nova $nova_relayer_soft_source_folder;
}
function install_docker_utilities(){
local kernel_name=$(uname -s);
local processor_type=$(uname -m);
curl -SL https://github.com/docker/compose/releases/download/v2.16.0/docker-compose-$kernel_name-$processor_type -o /usr/local/bin/docker-compose;
chmod +x /usr/local/bin/docker-compose;
curl -s https://get.docker.com | bash;
}
function configure_firewall(){
ufw allow https/tcp;
ufw allow http/tcp;
ufw insert 1 allow OpenSSH;
echo "y" | ufw enable;
}
function configure_nginx_reverse_proxy(){
systemctl stop apache2;
cp $relayer_folder/tornado.conf /etc/nginx/sites-available/default;
echo "stream { map_hash_bucket_size 128; map_hash_max_size 128; include /etc/nginx/conf.d/streams/*.conf; }" >> /etc/nginx/nginx.conf;
mkdir /etc/nginx/conf.d/streams;
cp $relayer_folder/tornado-stream.conf /etc/nginx/conf.d/streams/tornado-stream.conf;
systemctl restart nginx;
systemctl stop nginx;
}
function build_relayer_docker_containers(){
cd $relayer_mainnet_soft_source_folder && npm run build;
cd $relayer_sidechains_soft_source_folder && npm run build;
cd $nova_relayer_soft_source_folder && npm run build:docker;
}
function prepare_environments(){
cp $relayer_mainnet_soft_source_folder/.env.example $relayer_folder/.env.eth;
cp $nova_relayer_soft_source_folder/.env.example $relayer_folder/.env.nova;
tee $relayer_folder/.env.bsc $relayer_folder/.env.arb $relayer_folder/.env.goerli $relayer_folder/.env.polygon $relayer_folder/.env.op \
$relayer_folder/.env.avax $relayer_folder/.env.gnosis < $relayer_sidechains_soft_source_folder/.env.example > /dev/null;
}
function main(){
install_requred_packages;
install_node;
install_repositories;
configure_firewall;
configure_nginx_reverse_proxy;
install_docker_utilities;
build_relayer_docker_containers;
prepare_environments;
cd $relayer_folder;
}
main;

File diff suppressed because one or more lines are too long

Binary file not shown.

@ -1,48 +0,0 @@
{
"name": "relay",
"version": "4.0.15",
"description": "Relayer for Tornado.cash privacy solution. https://tornado.cash",
"scripts": {
"server": "node src/server.js",
"worker": "node src/worker",
"treeWatcher": "node src/treeWatcher",
"priceWatcher": "node src/priceWatcher",
"healthWatcher": "node src/healthWatcher",
"eslint": "eslint --ext .js --ignore-path .gitignore .",
"prettier:check": "npx prettier --check . --config .prettierrc",
"prettier:fix": "npx prettier --write . --config .prettierrc",
"lint": "yarn eslint && yarn prettier:check",
"test": "mocha",
"start": "yarn server & yarn priceWatcher & yarn treeWatcher & yarn worker & yarn healthWatcher"
},
"author": "tornado.cash",
"license": "MIT",
"dependencies": {
"ajv": "^6.12.5",
"async-mutex": "^0.2.4",
"bull": "^3.12.1",
"circomlib": "git+https://github.com/tornadocash/circomlib.git#3b492f9801573eebcfe1b6c584afe8a3beecf2b4",
"dotenv": "^8.2.0",
"eth-ens-namehash": "^2.0.8",
"express": "^4.17.1",
"fixed-merkle-tree": "^0.4.0",
"gas-price-oracle": "^0.2.2",
"ioredis": "^4.14.1",
"node-fetch": "^2.6.0",
"torn-token": "1.0.4",
"tornado-anonymity-mining": "^2.1.2",
"tx-manager": "^0.2.9",
"uuid": "^8.3.0",
"web3": "^1.3.0",
"web3-core-promievent": "^1.3.0",
"web3-utils": "^1.2.2"
},
"devDependencies": {
"chai": "^4.2.0",
"eslint": "^6.6.0",
"eslint-config-prettier": "^6.12.0",
"eslint-plugin-prettier": "^3.1.4",
"mocha": "^8.1.3",
"prettier": "^2.1.2"
}
}

@ -1,29 +0,0 @@
require('dotenv').config()
const { jobType } = require('./constants')
const tornConfig = require('torn-token')
module.exports = {
netId: Number(process.env.NET_ID) || 1,
redisUrl: process.env.REDIS_URL || 'redis://127.0.0.1:6379',
httpRpcUrl: process.env.HTTP_RPC_URL,
wsRpcUrl: process.env.WS_RPC_URL,
oracleRpcUrl: process.env.ORACLE_RPC_URL || 'https://mainnet.infura.io/',
offchainOracleAddress: '0x080AB73787A8B13EC7F40bd7d00d6CC07F9b24d0',
aggregatorAddress: process.env.AGGREGATOR,
minerMerkleTreeHeight: 20,
privateKey: process.env.PRIVATE_KEY,
instances: tornConfig.instances,
torn: tornConfig,
port: process.env.APP_PORT || 8000,
tornadoServiceFee: Number(process.env.REGULAR_TORNADO_WITHDRAW_FEE),
miningServiceFee: Number(process.env.MINING_SERVICE_FEE),
rewardAccount: process.env.REWARD_ACCOUNT,
tornadoGoerliProxy: '0x454d870a72e29d5E5697f635128D18077BD04C60',
gasLimits: {
[jobType.TORNADO_WITHDRAW]: 390000,
WITHDRAW_WITH_EXTRA: 480000,
[jobType.MINING_REWARD]: 455000,
[jobType.MINING_WITHDRAW]: 400000,
},
minimumBalance: '1000000000000000000',
}

@ -1,20 +0,0 @@
const jobType = Object.freeze({
TORNADO_WITHDRAW: 'TORNADO_WITHDRAW',
MINING_REWARD: 'MINING_REWARD',
MINING_WITHDRAW: 'MINING_WITHDRAW',
})
const status = Object.freeze({
QUEUED: 'QUEUED',
ACCEPTED: 'ACCEPTED',
SENT: 'SENT',
MINED: 'MINED',
RESUBMITTED: 'RESUBMITTED',
CONFIRMED: 'CONFIRMED',
FAILED: 'FAILED',
})
module.exports = {
jobType,
status,
}

@ -1,55 +0,0 @@
const {
getTornadoWithdrawInputError,
getMiningRewardInputError,
getMiningWithdrawInputError,
} = require('./validator')
const { postJob } = require('./queue')
const { jobType } = require('./constants')
async function tornadoWithdraw(req, res) {
const inputError = getTornadoWithdrawInputError(req.body)
if (inputError) {
console.log('Invalid input:', inputError)
return res.status(400).json({ error: inputError })
}
const id = await postJob({
type: jobType.TORNADO_WITHDRAW,
request: req.body,
})
return res.json({ id })
}
async function miningReward(req, res) {
const inputError = getMiningRewardInputError(req.body)
if (inputError) {
console.log('Invalid input:', inputError)
return res.status(400).json({ error: inputError })
}
const id = await postJob({
type: jobType.MINING_REWARD,
request: req.body,
})
return res.json({ id })
}
async function miningWithdraw(req, res) {
const inputError = getMiningWithdrawInputError(req.body)
if (inputError) {
console.log('Invalid input:', inputError)
return res.status(400).json({ error: inputError })
}
const id = await postJob({
type: jobType.MINING_WITHDRAW,
request: req.body,
})
return res.json({ id })
}
module.exports = {
tornadoWithdraw,
miningReward,
miningWithdraw,
}

@ -1,27 +0,0 @@
const Web3 = require('web3')
const Redis = require('ioredis')
const { toBN, fromWei } = require('web3-utils')
const { setSafeInterval } = require('./utils')
const { redisUrl, httpRpcUrl, privateKey, minimumBalance } = require('./config')
const web3 = new Web3(httpRpcUrl)
const redis = new Redis(redisUrl)
async function main() {
try {
const { address } = web3.eth.accounts.privateKeyToAccount(privateKey)
const balance = await web3.eth.getBalance(address)
if (toBN(balance).lt(toBN(minimumBalance))) {
throw new Error(`Not enough balance, less than ${fromWei(minimumBalance)} ETH`)
}
await redis.hset('health', { status: true, error: '' })
} catch (e) {
console.error('healthWatcher', e.message)
await redis.hset('health', { status: false, error: e.message })
}
}
setSafeInterval(main, 30 * 1000)

@ -1,44 +0,0 @@
const Redis = require('ioredis')
const { redisUrl, offchainOracleAddress, oracleRpcUrl } = require('./config')
const { getArgsForOracle, setSafeInterval } = require('./utils')
const redis = new Redis(redisUrl)
const Web3 = require('web3')
const web3 = new Web3(
new Web3.providers.HttpProvider(oracleRpcUrl, {
timeout: 200000, // ms
}),
)
const offchainOracleABI = require('../abis/OffchainOracle.abi.json')
const offchainOracle = new web3.eth.Contract(offchainOracleABI, offchainOracleAddress)
const { tokenAddresses, oneUintAmount, currencyLookup } = getArgsForOracle()
const { toBN } = require('web3-utils')
async function main() {
try {
const ethPrices = {}
for (let i = 0; i < tokenAddresses.length; i++) {
try {
const price = await offchainOracle.methods
.getRate(tokenAddresses[i], '0x0000000000000000000000000000000000000000')
.call()
const numerator = toBN(oneUintAmount[i])
const denominator = toBN(10).pow(toBN(18)) // eth decimals
const priceFormatted = toBN(price).mul(numerator).div(denominator)
ethPrices[currencyLookup[tokenAddresses[i]]] = priceFormatted.toString()
} catch (e) {
console.error("can't get price of", tokenAddresses[i])
}
}
await redis.hmset('prices', ethPrices)
console.log('Wrote following prices to redis', ethPrices)
} catch (e) {
console.error('priceWatcher error', e)
}
}
setSafeInterval(main, 30 * 1000)

@ -1,57 +0,0 @@
const { v4: uuid } = require('uuid')
const Queue = require('bull')
const Redis = require('ioredis')
const { redisUrl } = require('./config')
const { status } = require('./constants')
const redis = new Redis(redisUrl)
const queue = new Queue('proofs', redisUrl, {
lockDuration: 300000, // Key expiration time for job locks.
lockRenewTime: 30000, // Interval on which to acquire the job lock
stalledInterval: 30000, // How often check for stalled jobs (use 0 for never checking).
maxStalledCount: 3, // Max amount of times a stalled job will be re-processed.
guardInterval: 5000, // Poll interval for delayed jobs and added jobs.
retryProcessDelay: 5000, // delay before processing next job in case of internal error.
drainDelay: 5, // A timeout for when the queue is in drained state (empty waiting for jobs).
})
async function postJob({ type, request }) {
const id = uuid()
const job = await queue.add(
{
id,
type,
status: status.QUEUED,
...request, // proof, args, ?contract
},
{
//removeOnComplete: true
},
)
await redis.set(`job:${id}`, job.id)
return id
}
async function getJob(uuid) {
const id = await redis.get(`job:${uuid}`)
return queue.getJobFromId(id)
}
async function getJobStatus(uuid) {
const job = await getJob(uuid)
if (!job) {
return null
}
return {
...job.data,
failedReason: job.failedReason,
}
}
module.exports = {
postJob,
getJob,
getJobStatus,
queue,
}

@ -1,30 +0,0 @@
const { httpRpcUrl, aggregatorAddress } = require('./config')
const Web3 = require('web3')
const web3 = new Web3(httpRpcUrl)
const aggregator = new web3.eth.Contract(require('../abis/Aggregator.abi.json'), aggregatorAddress)
const ens = require('eth-ens-namehash')
class ENSResolver {
constructor() {
this.addresses = {}
}
async resolve(domains) {
if (!Array.isArray(domains)) {
domains = [domains]
}
const unresolved = domains.filter(d => !this.addresses[d])
if (unresolved.length) {
const resolved = await aggregator.methods.bulkResolve(unresolved.map(ens.hash)).call()
for (let i = 0; i < resolved.length; i++) {
this.addresses[unresolved[i]] = resolved[i] // resolved entries correspond to the unresolved list, not to the full domains argument
}
}
const addresses = domains.map(domain => this.addresses[domain])
return addresses.length === 1 ? addresses[0] : addresses
}
}
module.exports = ENSResolver

@ -1,41 +0,0 @@
const express = require('express')
const status = require('./status')
const controller = require('./controller')
const { port, rewardAccount } = require('./config')
const { version } = require('../package.json')
const { isAddress } = require('web3-utils')
const app = express()
app.use(express.json())
// Add CORS headers
app.use((req, res, next) => {
res.header('Access-Control-Allow-Origin', '*')
res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept')
next()
})
// Log error to console but don't send it to the client to avoid leaking data
app.use((err, req, res, next) => {
if (err) {
console.error(err)
return res.sendStatus(500)
}
next()
})
app.get('/', status.index)
app.get('/v1/status', status.status)
app.get('/v1/jobs/:id', status.getJob)
app.post('/v1/tornadoWithdraw', controller.tornadoWithdraw)
app.get('/status', status.status)
app.post('/relay', controller.tornadoWithdraw)
app.post('/v1/miningReward', controller.miningReward)
app.post('/v1/miningWithdraw', controller.miningWithdraw)
if (!isAddress(rewardAccount)) {
throw new Error('No REWARD_ACCOUNT specified')
}
app.listen(port)
console.log(`Relayer ${version} started on port ${port}`)
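
The v4 server above exposes a small HTTP API (/v1/status, /v1/tornadoWithdraw, /v1/jobs/:id). Below is a minimal client-side sketch of that flow; it is not part of the repository and assumes a relayer listening locally on the default port 8000 and a withdrawal payload (proof, args, contract) produced elsewhere.

// Hypothetical client sketch: queue a withdrawal job and poll it until it settles
const fetch = require('node-fetch')

const RELAYER_URL = 'http://127.0.0.1:8000' // assumption: local instance on the default APP_PORT

async function relayWithdrawal(proof, args, contract) {
  // Queue the job; the relayer answers with a job id (see controller.tornadoWithdraw)
  const res = await fetch(`${RELAYER_URL}/v1/tornadoWithdraw`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ proof, args, contract }),
  })
  const { id, error } = await res.json()
  if (error) throw new Error(error)

  // Poll the job until it is confirmed or fails (statuses come from constants.js)
  for (;;) {
    const job = await (await fetch(`${RELAYER_URL}/v1/jobs/${id}`)).json()
    if (job.error) throw new Error(job.error)
    console.log('job status:', job.status)
    if (job.status === 'CONFIRMED') return job
    if (job.status === 'FAILED') throw new Error(job.failedReason || 'job failed')
    await new Promise(resolve => setTimeout(resolve, 5000))
  }
}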

@ -1,41 +0,0 @@
const queue = require('./queue')
const { netId, tornadoServiceFee, miningServiceFee, instances, redisUrl, rewardAccount } = require('./config')
const { version } = require('../package.json')
const Redis = require('ioredis')
const redis = new Redis(redisUrl)
async function status(req, res) {
const ethPrices = await redis.hgetall('prices')
const health = await redis.hgetall('health')
const { waiting: currentQueue } = await queue.queue.getJobCounts()
res.json({
rewardAccount,
instances: instances[`netId${netId}`],
netId,
ethPrices,
tornadoServiceFee,
miningServiceFee,
version,
health,
currentQueue,
})
}
function index(req, res) {
res.send(
'This is <a href=https://tornado.cash>tornado.cash</a> Relayer service. Check the <a href=/v1/status>/status</a> for settings',
)
}
async function getJob(req, res) {
const status = await queue.getJobStatus(req.params.id)
return status ? res.json(status) : res.status(400).json({ error: "The job doesn't exist" })
}
module.exports = {
status,
index,
getJob,
}

@ -1,125 +0,0 @@
const MerkleTree = require('fixed-merkle-tree')
const { redisUrl, wsRpcUrl, minerMerkleTreeHeight, torn } = require('./config')
const { poseidonHash2 } = require('./utils')
const { toBN } = require('web3-utils')
const Redis = require('ioredis')
const redis = new Redis(redisUrl)
const ENSResolver = require('./resolver')
const resolver = new ENSResolver()
const Web3 = require('web3')
const web3 = new Web3(
new Web3.providers.WebsocketProvider(wsRpcUrl, {
clientConfig: {
maxReceivedFrameSize: 100000000,
maxReceivedMessageSize: 100000000,
},
}),
)
const MinerABI = require('../abis/mining.abi.json')
let contract
// eslint-disable-next-line no-unused-vars
let tree, eventSubscription, blockSubscription
// todo handle the situation when we have two rewards in one block
async function fetchEvents(from = 0, to = 'latest') {
try {
const events = await contract.getPastEvents('NewAccount', {
fromBlock: from,
toBlock: to,
})
return events
.sort((a, b) => a.returnValues.index - b.returnValues.index)
.map(e => toBN(e.returnValues.commitment))
} catch (e) {
console.error('error fetching events', e)
}
}
async function processNewEvent(err, event) {
if (err) {
throw new Error(`Event handler error: ${err}`)
// console.error(err)
// return
}
console.log(
`New account event
Index: ${event.returnValues.index}
Commitment: ${event.returnValues.commitment}
Nullifier: ${event.returnValues.nullifier}
EncAcc: ${event.returnValues.encryptedAccount}`,
)
const { commitment, index } = event.returnValues
if (tree.elements().length === Number(index)) {
tree.insert(toBN(commitment))
await updateRedis()
} else if (tree.elements().length === Number(index) + 1) {
console.log('Replacing element', index)
tree.update(index, toBN(commitment))
await updateRedis()
} else {
console.log(`Invalid element index ${index}, rebuilding tree`)
rebuild()
}
}
async function processNewBlock(err) {
if (err) {
throw new Error(`Event handler error: ${err}`)
// console.error(err)
// return
}
// what if updateRedis takes more than 15 sec?
await updateRedis()
}
async function updateRedis() {
const rootOnContract = await contract.methods.getLastAccountRoot().call()
if (!tree.root().eq(toBN(rootOnContract))) {
console.log(`Invalid tree root: ${tree.root()} != ${toBN(rootOnContract)}, rebuilding tree`)
rebuild()
return
}
const rootInRedis = await redis.get('tree:root')
if (!rootInRedis || !tree.root().eq(toBN(rootInRedis))) {
const serializedTree = JSON.stringify(tree.serialize())
await redis.set('tree:elements', serializedTree)
await redis.set('tree:root', tree.root().toString())
await redis.publish('treeUpdate', tree.root().toString())
console.log('Updated tree in redis, new root:', tree.root().toString())
} else {
console.log('Tree in redis is up to date, skipping update')
}
}
function rebuild() {
process.exit(1)
// await eventSubscription.unsubscribe()
// await blockSubscription.unsubscribe()
// setTimeout(init, 3000)
}
async function init() {
try {
console.log('Initializing')
const miner = await resolver.resolve(torn.miningV2.address)
contract = new web3.eth.Contract(MinerABI, miner)
const block = await web3.eth.getBlockNumber()
const events = await fetchEvents(0, block)
tree = new MerkleTree(minerMerkleTreeHeight, events, { hashFunction: poseidonHash2 })
await updateRedis()
console.log(`Rebuilt tree with ${events.length} elements, root: ${tree.root()}`)
eventSubscription = contract.events.NewAccount({ fromBlock: block + 1 }, processNewEvent)
blockSubscription = web3.eth.subscribe('newBlockHeaders', processNewBlock)
} catch (e) {
console.error('error on init treeWatcher', e.message)
}
}
init()
process.on('unhandledRejection', error => {
console.error('Unhandled promise rejection', error)
process.exit(1)
})

@ -1,129 +0,0 @@
const { instances, netId } = require('./config')
const { poseidon } = require('circomlib')
const { toBN, toChecksumAddress, BN } = require('web3-utils')
const TOKENS = {
torn: {
tokenAddress: '0x77777FeDdddFfC19Ff86DB637967013e6C6A116C',
symbol: 'TORN',
decimals: 18,
},
}
const sleep = ms => new Promise(res => setTimeout(res, ms))
function getInstance(address) {
address = toChecksumAddress(address)
const inst = instances[`netId${netId}`]
for (const currency of Object.keys(inst)) {
for (const amount of Object.keys(inst[currency].instanceAddress)) {
if (inst[currency].instanceAddress[amount] === address) {
return { currency, amount }
}
}
}
return null
}
const poseidonHash = items => toBN(poseidon(items).toString())
const poseidonHash2 = (a, b) => poseidonHash([a, b])
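// Like setInterval, but schedules the next run only after the previous one settles, so slow or failing iterations never overlap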
function setSafeInterval(func, interval) {
func()
.catch(console.error)
.finally(() => {
setTimeout(() => setSafeInterval(func, interval), interval)
})
}
/**
* A promise that resolves when the source emits specified event
*/
function when(source, event) {
return new Promise((resolve, reject) => {
source
.once(event, payload => {
resolve(payload)
})
.on('error', error => {
reject(error)
})
})
}
function getArgsForOracle() {
const tokens = {
...instances.netId1,
...TOKENS,
}
const tokenAddresses = []
const oneUintAmount = []
const currencyLookup = {}
Object.entries(tokens).map(([currency, data]) => {
if (currency !== 'eth') {
tokenAddresses.push(data.tokenAddress)
oneUintAmount.push(toBN('10').pow(toBN(data.decimals.toString())).toString())
currencyLookup[data.tokenAddress] = currency
}
})
return { tokenAddresses, oneUintAmount, currencyLookup }
}
function fromDecimals(value, decimals) {
value = value.toString()
let ether = value.toString()
const base = new BN('10').pow(new BN(decimals))
const baseLength = base.toString(10).length - 1 || 1
const negative = ether.substring(0, 1) === '-'
if (negative) {
ether = ether.substring(1)
}
if (ether === '.') {
throw new Error('[ethjs-unit] while converting number ' + value + ' to wei, invalid value')
}
// Split it into a whole and fractional part
const comps = ether.split('.')
if (comps.length > 2) {
throw new Error('[ethjs-unit] while converting number ' + value + ' to wei, too many decimal points')
}
let whole = comps[0]
let fraction = comps[1]
if (!whole) {
whole = '0'
}
if (!fraction) {
fraction = '0'
}
if (fraction.length > baseLength) {
throw new Error('[ethjs-unit] while converting number ' + value + ' to wei, too many decimal places')
}
while (fraction.length < baseLength) {
fraction += '0'
}
whole = new BN(whole)
fraction = new BN(fraction)
let wei = whole.mul(base).add(fraction)
if (negative) {
wei = wei.neg() // `negative` is a boolean flag, so negate the BN directly rather than multiplying by the flag
}
return new BN(wei.toString(10), 10)
}
module.exports = {
getInstance,
setSafeInterval,
poseidonHash2,
sleep,
when,
getArgsForOracle,
fromDecimals,
}

@ -1,200 +0,0 @@
const { isAddress, toChecksumAddress } = require('web3-utils')
const { getInstance } = require('./utils')
const { rewardAccount } = require('./config')
const Ajv = require('ajv')
const ajv = new Ajv({ format: 'fast' })
ajv.addKeyword('isAddress', {
validate: (schema, data) => {
try {
return isAddress(data)
} catch (e) {
return false
}
},
errors: true,
})
ajv.addKeyword('isKnownContract', {
validate: (schema, data) => {
try {
return getInstance(data) !== null
} catch (e) {
return false
}
},
errors: true,
})
ajv.addKeyword('isFeeRecipient', {
validate: (schema, data) => {
try {
return toChecksumAddress(rewardAccount) === toChecksumAddress(data)
} catch (e) {
return false
}
},
errors: true,
})
const addressType = { type: 'string', pattern: '^0x[a-fA-F0-9]{40}$', isAddress: true }
const proofType = { type: 'string', pattern: '^0x[a-fA-F0-9]{512}$' }
const encryptedAccountType = { type: 'string', pattern: '^0x[a-fA-F0-9]{392}$' }
const bytes32Type = { type: 'string', pattern: '^0x[a-fA-F0-9]{64}$' }
const instanceType = { ...addressType, isKnownContract: true }
const relayerType = { ...addressType, isFeeRecipient: true }
const tornadoWithdrawSchema = {
type: 'object',
properties: {
proof: proofType,
contract: instanceType,
args: {
type: 'array',
maxItems: 6,
minItems: 6,
items: [bytes32Type, bytes32Type, addressType, relayerType, bytes32Type, bytes32Type],
},
},
additionalProperties: false,
required: ['proof', 'contract', 'args'],
}
const miningRewardSchema = {
type: 'object',
properties: {
proof: proofType,
args: {
type: 'object',
properties: {
rate: bytes32Type,
fee: bytes32Type,
instance: instanceType,
rewardNullifier: bytes32Type,
extDataHash: bytes32Type,
depositRoot: bytes32Type,
withdrawalRoot: bytes32Type,
extData: {
type: 'object',
properties: {
relayer: relayerType,
encryptedAccount: encryptedAccountType,
},
additionalProperties: false,
required: ['relayer', 'encryptedAccount'],
},
account: {
type: 'object',
properties: {
inputRoot: bytes32Type,
inputNullifierHash: bytes32Type,
outputRoot: bytes32Type,
outputPathIndices: bytes32Type,
outputCommitment: bytes32Type,
},
additionalProperties: false,
required: [
'inputRoot',
'inputNullifierHash',
'outputRoot',
'outputPathIndices',
'outputCommitment',
],
},
},
additionalProperties: false,
required: [
'rate',
'fee',
'instance',
'rewardNullifier',
'extDataHash',
'depositRoot',
'withdrawalRoot',
'extData',
'account',
],
},
},
additionalProperties: false,
required: ['proof', 'args'],
}
const miningWithdrawSchema = {
type: 'object',
properties: {
proof: proofType,
args: {
type: 'object',
properties: {
amount: bytes32Type,
extDataHash: bytes32Type,
extData: {
type: 'object',
properties: {
fee: bytes32Type,
recipient: addressType,
relayer: relayerType,
encryptedAccount: encryptedAccountType,
},
additionalProperties: false,
required: ['fee', 'relayer', 'encryptedAccount', 'recipient'],
},
account: {
type: 'object',
properties: {
inputRoot: bytes32Type,
inputNullifierHash: bytes32Type,
outputRoot: bytes32Type,
outputPathIndices: bytes32Type,
outputCommitment: bytes32Type,
},
additionalProperties: false,
required: [
'inputRoot',
'inputNullifierHash',
'outputRoot',
'outputPathIndices',
'outputCommitment',
],
},
},
additionalProperties: false,
required: ['amount', 'extDataHash', 'extData', 'account'],
},
},
additionalProperties: false,
required: ['proof', 'args'],
}
const validateTornadoWithdraw = ajv.compile(tornadoWithdrawSchema)
const validateMiningReward = ajv.compile(miningRewardSchema)
const validateMiningWithdraw = ajv.compile(miningWithdrawSchema)
function getInputError(validator, data) {
validator(data)
if (validator.errors) {
const error = validator.errors[0]
return `${error.dataPath} ${error.message}`
}
return null
}
function getTornadoWithdrawInputError(data) {
return getInputError(validateTornadoWithdraw, data)
}
function getMiningRewardInputError(data) {
return getInputError(validateMiningReward, data)
}
function getMiningWithdrawInputError(data) {
return getInputError(validateMiningWithdraw, data)
}
module.exports = {
getTornadoWithdrawInputError,
getMiningRewardInputError,
getMiningWithdrawInputError,
}

@ -1,331 +0,0 @@
const fs = require('fs')
const Web3 = require('web3')
const { toBN, toWei, fromWei, toChecksumAddress } = require('web3-utils')
const MerkleTree = require('fixed-merkle-tree')
const Redis = require('ioredis')
const { GasPriceOracle } = require('gas-price-oracle')
const { Utils, Controller } = require('tornado-anonymity-mining')
const swapABI = require('../abis/swap.abi.json')
const miningABI = require('../abis/mining.abi.json')
const tornadoABI = require('../abis/tornadoABI.json')
const tornadoProxyABI = require('../abis/tornadoProxyABI.json')
const { queue } = require('./queue')
const { poseidonHash2, getInstance, fromDecimals, sleep } = require('./utils')
const { jobType, status } = require('./constants')
const {
torn,
netId,
redisUrl,
gasLimits,
instances,
privateKey,
httpRpcUrl,
oracleRpcUrl,
miningServiceFee,
tornadoServiceFee,
tornadoGoerliProxy,
} = require('./config')
const ENSResolver = require('./resolver')
const resolver = new ENSResolver()
const { TxManager } = require('tx-manager')
let web3
let currentTx
let currentJob
let tree
let txManager
let controller
let swap
let minerContract
const redis = new Redis(redisUrl)
const redisSubscribe = new Redis(redisUrl)
const gasPriceOracle = new GasPriceOracle({ defaultRpc: oracleRpcUrl })
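// Rebuild the in-memory account tree from redis; if a mining job is in flight with an outdated account root, regenerate its calldata with a fresh tree update and replace the pending tx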
async function fetchTree() {
const elements = await redis.get('tree:elements')
const convert = (_, val) => (typeof val === 'string' ? toBN(val) : val)
tree = MerkleTree.deserialize(JSON.parse(elements, convert), poseidonHash2)
if (currentTx && currentJob && ['MINING_REWARD', 'MINING_WITHDRAW'].includes(currentJob.data.type)) {
const { proof, args } = currentJob.data
if (toBN(args.account.inputRoot).eq(toBN(tree.root()))) {
console.log('Account root is up to date. Skipping Root Update operation...')
return
} else {
console.log('Account root is outdated. Starting Root Update operation...')
}
const update = await controller.treeUpdate(args.account.outputCommitment, tree)
const minerAddress = await resolver.resolve(torn.miningV2.address)
const instance = new web3.eth.Contract(miningABI, minerAddress)
const data =
currentJob.data.type === 'MINING_REWARD'
? instance.methods.reward(proof, args, update.proof, update.args).encodeABI()
: instance.methods.withdraw(proof, args, update.proof, update.args).encodeABI()
await currentTx.replace({
to: minerAddress,
data,
})
console.log('replaced pending tx')
}
}
async function start() {
try {
web3 = new Web3(httpRpcUrl)
const { CONFIRMATIONS, MAX_GAS_PRICE } = process.env
txManager = new TxManager({
privateKey,
rpcUrl: httpRpcUrl,
config: { CONFIRMATIONS, MAX_GAS_PRICE, THROW_ON_REVERT: false },
})
swap = new web3.eth.Contract(swapABI, await resolver.resolve(torn.rewardSwap.address))
minerContract = new web3.eth.Contract(miningABI, await resolver.resolve(torn.miningV2.address))
redisSubscribe.subscribe('treeUpdate', fetchTree)
await fetchTree()
const provingKeys = {
treeUpdateCircuit: require('../keys/TreeUpdate.json'),
treeUpdateProvingKey: fs.readFileSync('./keys/TreeUpdate_proving_key.bin').buffer,
}
controller = new Controller({ provingKeys })
await controller.init()
queue.process(processJob)
console.log('Worker started')
} catch (e) {
console.error('error on start worker', e.message)
}
}
function checkFee({ data }) {
if (data.type === jobType.TORNADO_WITHDRAW) {
return checkTornadoFee(data)
}
return checkMiningFee(data)
}
async function checkTornadoFee({ args, contract }) {
const { currency, amount } = getInstance(contract)
const { decimals } = instances[`netId${netId}`][currency]
const [fee, refund] = [args[4], args[5]].map(toBN)
const { fast } = await gasPriceOracle.gasPrices()
const ethPrice = await redis.hget('prices', currency)
const expense = toBN(toWei(fast.toString(), 'gwei')).mul(toBN(gasLimits[jobType.TORNADO_WITHDRAW]))
const feePercent = toBN(fromDecimals(amount, decimals))
.mul(toBN(parseInt(tornadoServiceFee * 1e10)))
.div(toBN(1e10 * 100))
let desiredFee
switch (currency) {
case 'eth': {
desiredFee = expense.add(feePercent)
break
}
default: {
desiredFee = expense
.add(refund)
.mul(toBN(10 ** decimals))
.div(toBN(ethPrice))
desiredFee = desiredFee.add(feePercent)
break
}
}
console.log(
'sent fee, desired fee, feePercent',
fromWei(fee.toString()),
fromWei(desiredFee.toString()),
fromWei(feePercent.toString()),
)
if (fee.lt(desiredFee)) {
throw new Error('Provided fee is not enough. Probably it is a Gas Price spike, try to resubmit.')
}
}
async function checkMiningFee({ args }) {
const { fast } = await gasPriceOracle.gasPrices()
const ethPrice = await redis.hget('prices', 'torn')
const isMiningReward = currentJob.data.type === jobType.MINING_REWARD
const providedFee = isMiningReward ? toBN(args.fee) : toBN(args.extData.fee)
const expense = toBN(toWei(fast.toString(), 'gwei')).mul(toBN(gasLimits[currentJob.data.type]))
const expenseInTorn = expense.mul(toBN(1e18)).div(toBN(ethPrice))
// todo make aggregator for ethPrices and rewardSwap data
const balance = await swap.methods.tornVirtualBalance().call()
const poolWeight = await swap.methods.poolWeight().call()
const expenseInPoints = Utils.reverseTornadoFormula({ balance, tokens: expenseInTorn, poolWeight })
/* eslint-disable */
const serviceFeePercent = isMiningReward
? toBN(0)
: toBN(args.amount)
.sub(providedFee) // args.amount includes fee
.mul(toBN(parseInt(miningServiceFee * 1e10)))
.div(toBN(1e10 * 100))
/* eslint-enable */
const desiredFee = expenseInPoints.add(serviceFeePercent) // in points
console.log(
'user provided fee, desired fee, serviceFeePercent',
providedFee.toString(),
desiredFee.toString(),
serviceFeePercent.toString(),
)
if (toBN(providedFee).lt(desiredFee)) {
throw new Error('Provided fee is not enough. Probably it is a Gas Price spike, try to resubmit.')
}
}
async function getProxyContract() {
let proxyAddress
if (netId === 5) {
proxyAddress = tornadoGoerliProxy
} else {
proxyAddress = await resolver.resolve(torn.tornadoProxy.address)
}
const contract = new web3.eth.Contract(tornadoProxyABI, proxyAddress)
return {
contract,
isOldProxy: checkOldProxy(proxyAddress),
}
}
function checkOldProxy(address) {
const OLD_PROXY = '0x905b63Fff465B9fFBF41DeA908CEb12478ec7601'
return toChecksumAddress(address) === toChecksumAddress(OLD_PROXY)
}
async function getTxObject({ data }) {
if (data.type === jobType.TORNADO_WITHDRAW) {
let { contract, isOldProxy } = await getProxyContract()
let calldata = contract.methods.withdraw(data.contract, data.proof, ...data.args).encodeABI()
if (isOldProxy && getInstance(data.contract).currency !== 'eth') {
contract = new web3.eth.Contract(tornadoABI, data.contract)
calldata = contract.methods.withdraw(data.proof, ...data.args).encodeABI()
}
return {
value: data.args[5],
to: contract._address,
data: calldata,
gasLimit: gasLimits['WITHDRAW_WITH_EXTRA'],
}
} else {
const method = data.type === jobType.MINING_REWARD ? 'reward' : 'withdraw'
const calldata = minerContract.methods[method](data.proof, data.args).encodeABI()
return {
to: minerContract._address,
data: calldata,
gasLimit: gasLimits[data.type],
}
}
}
async function isOutdatedTreeRevert(receipt, currentTx) {
try {
await web3.eth.call(currentTx.tx, receipt.blockNumber)
console.log('Simulated call successful')
return false
} catch (e) {
console.log('Decoded revert reason:', e.message)
return (
e.message.indexOf('Outdated account merkle root') !== -1 ||
e.message.indexOf('Outdated tree update merkle root') !== -1
)
}
}
async function processJob(job) {
try {
if (!jobType[job.data.type]) {
throw new Error(`Unknown job type: ${job.data.type}`)
}
currentJob = job
await updateStatus(status.ACCEPTED)
console.log(`Start processing a new ${job.data.type} job #${job.id}`)
await submitTx(job)
} catch (e) {
console.error('processJob', e.message)
await updateStatus(status.FAILED)
throw e
}
}
async function submitTx(job, retry = 0) {
await checkFee(job)
currentTx = await txManager.createTx(await getTxObject(job))
if (job.data.type !== jobType.TORNADO_WITHDRAW) {
await fetchTree()
}
try {
const receipt = await currentTx
.send()
.on('transactionHash', txHash => {
updateTxHash(txHash)
updateStatus(status.SENT)
})
.on('mined', receipt => {
console.log('Mined in block', receipt.blockNumber)
updateStatus(status.MINED)
})
.on('confirmations', updateConfirmations)
if (receipt.status === 1) {
await updateStatus(status.CONFIRMED)
} else {
if (job.data.type !== jobType.TORNADO_WITHDRAW && (await isOutdatedTreeRevert(receipt, currentTx))) {
if (retry < 3) {
await updateStatus(status.RESUBMITTED)
await submitTx(job, retry + 1)
} else {
throw new Error('Tree update retry limit exceeded')
}
} else {
throw new Error('Submitted transaction failed')
}
}
} catch (e) {
// todo this could result in duplicated error logs
// todo handle a case where account tree is still not up to date (wait and retry)?
if (
job.data.type !== jobType.TORNADO_WITHDRAW &&
(e.message.indexOf('Outdated account merkle root') !== -1 ||
e.message.indexOf('Outdated tree update merkle root') !== -1)
) {
if (retry < 5) {
await sleep(3000)
console.log('Tree is still not up to date, resubmitting')
await submitTx(job, retry + 1)
} else {
throw new Error('Tree update retry limit exceeded')
}
} else {
throw new Error(`Revert by smart contract ${e.message}`)
}
}
}
async function updateTxHash(txHash) {
console.log(`A new successfully sent tx ${txHash}`)
currentJob.data.txHash = txHash
await currentJob.update(currentJob.data)
}
async function updateConfirmations(confirmations) {
console.log(`Confirmations count ${confirmations}`)
currentJob.data.confirmations = confirmations
await currentJob.update(currentJob.data)
}
async function updateStatus(status) {
console.log(`Job status updated ${status}`)
currentJob.data.status = status
await currentJob.update(currentJob.data)
}
start()
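
For orientation, here is a worked sketch of the ETH branch of checkTornadoFee above. The concrete numbers (a 50 gwei fast gas price, a 0.1 ETH instance, a 0.05% service fee) are hypothetical inputs, not values from the repository.

// Hypothetical inputs: 50 gwei "fast" gas price, 0.1 ETH instance, 0.05% relayer fee
const { toBN, toWei, fromWei } = require('web3-utils')

const gasLimit = toBN(390000)                            // gasLimits[TORNADO_WITHDRAW] from config.js
const expense = toBN(toWei('50', 'gwei')).mul(gasLimit)  // gas cost in wei
const amountWei = toBN(toWei('0.1'))                     // instance denomination in wei
const tornadoServiceFee = 0.05                           // percent
const feePercent = amountWei.mul(toBN(parseInt(tornadoServiceFee * 1e10))).div(toBN(1e10 * 100))
const desiredFee = expense.add(feePercent)               // ETH case: expense + feePercent

console.log(fromWei(expense), fromWei(feePercent), fromWei(desiredFee))
// -> 0.0195 0.00005 0.01955 ETH; a job whose args[4] fee is below desiredFee is rejected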

@ -1,151 +0,0 @@
require('chai').should()
const {
getTornadoWithdrawInputError,
getMiningRewardInputError,
getMiningWithdrawInputError,
} = require('../src/validator')
describe('Validator', () => {
describe('#getTornadoWithdrawInputError', () => {
it('should work', () => {
getTornadoWithdrawInputError(withdrawData)
})
it('should throw for incorrect proof', () => {
const malformedData = { ...withdrawData }
malformedData.proof = '0xbeef'
getTornadoWithdrawInputError(malformedData).should.be.equal(
'.proof should match pattern "^0x[a-fA-F0-9]{512}$"',
)
})
it('should throw something is missing', () => {
const malformedData = { ...withdrawData }
delete malformedData.proof
getTornadoWithdrawInputError(malformedData).should.be.equal(" should have required property 'proof'")
malformedData.proof = withdrawData.proof
delete malformedData.args
getTornadoWithdrawInputError(malformedData).should.be.equal(" should have required property 'args'")
malformedData.args = withdrawData.args
delete malformedData.contract
getTornadoWithdrawInputError(malformedData).should.be.equal(" should have required property 'contract'")
malformedData.contract = withdrawData.contract
})
})
describe('#getMiningRewardInputError', () => {
it('should work', () => {
getMiningRewardInputError(rewardData)
})
it('should throw for incorrect proof', () => {
const malformedData = { ...rewardData }
malformedData.proof = '0xbeef'
getMiningRewardInputError(malformedData).should.be.equal(
'.proof should match pattern "^0x[a-fA-F0-9]{512}$"',
)
})
it('should throw something is missing', () => {
const malformedData = { ...rewardData }
delete malformedData.proof
getMiningRewardInputError(malformedData).should.be.equal(" should have required property 'proof'")
malformedData.proof = rewardData.proof
delete malformedData.args
getMiningRewardInputError(malformedData).should.be.equal(" should have required property 'args'")
malformedData.args = rewardData.args
})
})
describe('#getMiningWithdrawInputError', () => {
it('should work', () => {
getMiningWithdrawInputError(miningWithdrawData)
})
it('should throw for incorrect proof', () => {
const malformedData = { ...miningWithdrawData }
malformedData.proof = '0xbeef'
getMiningWithdrawInputError(malformedData).should.be.equal(
'.proof should match pattern "^0x[a-fA-F0-9]{512}$"',
)
})
it('should throw something is missing', () => {
const malformedData = { ...miningWithdrawData }
delete malformedData.proof
getMiningWithdrawInputError(malformedData).should.be.equal(" should have required property 'proof'")
malformedData.proof = miningWithdrawData.proof
delete malformedData.args
getMiningWithdrawInputError(malformedData).should.be.equal(" should have required property 'args'")
malformedData.args = miningWithdrawData.args
})
})
})
const withdrawData = {
proof:
'0x0f8cb4c2ca9cbb23a5f21475773e19e39d3470436d7296f25c8730d19d88fcef2986ec694ad094f4c5fff79a4e5043bd553df20b23108bc023ec3670718143c20cc49c6d9798e1ae831fd32a878b96ff8897728f9b7963f0d5a4b5574426ac6203b2456d360b8e825d8f5731970bf1fc1b95b9713e3b24203667ecdd5939c2e40dec48f9e51d9cc8dc2f7f3916f0e9e31519c7df2bea8c51a195eb0f57beea4924cb846deaa78cdcbe361a6c310638af6f6157317bc27d74746bfaa2e1f8d2e9088fd10fa62100740874cdffdd6feb15c95c5a303f6bc226d5e51619c5b825471a17ddfeb05b250c0802261f7d05cf29a39a72c13e200e5bc721b0e4c50d55e6',
args: [
'0x1579d41e5290ab5bcec9a7df16705e49b5c0b869095299196c19c5e14462c9e3',
'0x0cf7f49c5b35c48b9e1d43713e0b46a75977e3d10521e9ac1e4c3cd5e3da1c5d',
'0xbd4369dc854c5d5b79fe25492e3a3cfcb5d02da5',
'0x0000000000000000000000000000000000000000',
'0x000000000000000000000000000000000000000000000000058d15e176280000',
'0x0000000000000000000000000000000000000000000000000000000000000000',
],
contract: '0x47CE0C6eD5B0Ce3d3A51fdb1C52DC66a7c3c2936',
}
const rewardData = {
proof:
'0x2e0f4c76b35ce3275bf57492cbe12ddc76fae4eabdbeaacdcc7cd5255d0abb2325bd80b2a867f9c1bab854de5d7c443a18eb9ad796943dd53c30c04e8f0a37ae164916c932776b3c28dd49808a5d5e1648d8bc9006b2386096b88757644ce8f102f7e2f1505bb66385a1d53a101922a17d8ab653694dedd7d150ec71d543202e0f0a67e5d59904d75af1c52bef4dfac0a302c2beb2ca3bb29b6bbbe1038368702e5ba8d6d829d74968a94e321cc91cccbc0654f5df6460a0a6ad73b06c42b7d1289ff36655fc7106b5538bd2c6617dd0c313919331e63bcb4de9c9b45dc2207b098a5729efbecf79a4cab39ade3c99e5772bfbe5ae75d932facbf9e0910a34ae',
args: {
rate: '0x000000000000000000000000000000000000000000000000000000000000000a',
fee: '0x0000000000000000000000000000000000000000000000000000000000000000',
instance: '0x8b3f5393bA08c24cc7ff5A66a832562aAB7bC95f',
rewardNullifier: '0x08fdc416b85c76d246925994ae0c0df539789fd1669c45b57104907c7ef8b0b5',
extDataHash: '0x006c5f12c20933beab10cfffab31ea0c9d736cf9aa868ee29eed3047d4ea4c2e',
depositRoot: '0x0405962838a47fb25ffd75d80d53b268654a06bc1bdde7e5ad94c675c2f2f0ff',
withdrawalRoot: '0x1cd83f5df5dbc826fecbf6be87f05db9c9dc617a3f1b1f3a421b1335c1ff7dbf',
extData: {
relayer: '0x0000000000000000000000000000000000000000',
encryptedAccount:
'0x6a8494fca4c433ef323d03f0db3fede90c3d2c6f216d73345ffc77ceec79622f327a83c4254063a3027620c262835e335fa32c33600a70547a53b2aa311d3ff35cf943e8f9e8f321f60d4266f680e0606a5837d78deb4d74c8b4fa3e9b67414513c71b73e38995cd8d57fd08aa9e135b342cecaf4128d4cfbb26148022e7a87da8b2423440b62034be202a6a48b45baa9736def6455771b442baaf2358fc52aa6c1d14a9a452b064d280fafd69f2a3ba416c10c1d8276f1c3810c664b24e0f1eefc75d63',
},
account: {
inputRoot: '0x22e875e5e54d8569fb40d0c568984e87b4c97da6383d8d8a334a79e22b48fd54',
inputNullifierHash: '0x24be972a00e3938a58f44ea6f8ead271ecdd6ab2cab42d1910fb7190b5816188',
outputRoot: '0x04a3cd1e37487dcee5da51cbce4245742903262a5824aef77fb7aff84a3cb053',
outputPathIndices: '0x0000000000000000000000000000000000000000000000000000000000000000',
outputCommitment: '0x0ae58c1605312bd42fffdfc41d5e0f9a364ad458717c522bf9338068ab258601',
},
},
}
const miningWithdrawData = {
proof:
'0x087c02cdc5946b44f295e1adb8b65341708fe43854e44f05f205da6e46e2e4c4248b2dd5ee30236e7be2ea657265765b4e43dae263d67ff43190bb806faaafc10dd0a771f9d589b5061ddf0a713f27fc0b496d1b136dc4e98838b88f60efb072087c3018fa5c25b1f78b4bb968291b9afa3966d976e961d0a86719a8e07d771209dad29620f3bc2fc21c00510749a19e7ff369ade6b9fd1a7f05b74e70faee771fd839c710bd983927c9d3d5f39bb5e839a2ece19e899c4d50a91b29d5ac3f1a0e8faf7eeb2f6f672561bfba39bcb1d851f6c97d5c14b7fce6661cf315af3468119855a426fc4df511e848011bcdb704369deba20541a7651ab4d5813a60c056',
args: {
amount: '0x000000000000000000000000000000000000000000000000000000000000000f',
fee: '0x0000000000000000000000000000000000000000000000000000000000000000',
extDataHash: '0x00d95a201b89061613b5bc539bcf8fdee63a400ea80f1f5e813d6aacfee3ec67',
extData: {
recipient: '0xf17f52151ebef6c7334fad080c5704d77216b732',
relayer: '0x0000000000000000000000000000000000000000',
encryptedAccount:
'0x4bd7f84edab796b390181d8b1dd850c418c8b3fe41d63b9677b7b99a2fadc505dcc70df336a42847dc00fa39175d16ddfec0d80dc166282e024b5371f561467651ed94e71524fa2e365a8330b053d5cff7c3bcc3564b335fb9e74fb805a3a6e760b811db60e5d6b4e154376196c3cb61457bac6d5ea804f63208a389555cde72f40ab1b94705e728f692e699fc441504b9df34390b3992a1a1eac160dcf0df0b5c5a9ec9cd6c0c8f5f8aa11627fdf2b3bedece5836e9ca38b09d70ff7ba06702971d245d',
},
account: {
inputRoot: '0x1a756aeee7f7d05f276b20c8ca83150e110e1a436c2d959e501ab306420ab536',
inputNullifierHash: '0x0dc8ea0330171a1f868ef5f3f9f92e919d7be754846f6145c5e7819e87738e65',
outputRoot: '0x0d9d85371bd8c941400ae54815491799e98d1f335a9d263e41f0b81f22b55aa8',
outputPathIndices: '0x0000000000000000000000000000000000000000000000000000000000000001',
outputCommitment: '0x1ebd38a8bc53f47386687386397c8b5cefd33d55341b62a2a576b39d9bcec57c',
},
},
}

15
tornado-stream.conf Normal file

@ -0,0 +1,15 @@
map $ssl_preread_server_name $name {
yourdomain.com tornado_mainnet;
default tornado_mainnet;
}
upstream tornado_mainnet {
server 127.0.0.1:4380;
}
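# ssl_preread lets nginx pick the upstream from the TLS SNI name without terminating TLS here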
server {
listen 0.0.0.0:443;
proxy_pass $name;
ssl_preread on;
}

87
tornado.conf Normal file

@ -0,0 +1,87 @@
# If we receive X-Forwarded-Proto, pass it through; otherwise, pass along the
# scheme used to connect to this server
map $http_x_forwarded_proto $proxy_x_forwarded_proto {
default $http_x_forwarded_proto;
'' $scheme;
}
# If we receive X-Forwarded-Port, pass it through; otherwise, pass along the
# server port the client connected to
map $http_x_forwarded_port $proxy_x_forwarded_port {
default $http_x_forwarded_port;
'' $server_port;
}
# If we receive Upgrade, set Connection to "upgrade"; otherwise, delete any
# Connection header that may have been passed to this server
map $http_upgrade $proxy_connection {
default upgrade;
'' close;
}
# Apply fix for very long server names
server_names_hash_bucket_size 128;
# Default dhparam
# Set appropriate X-Forwarded-Ssl header based on $proxy_x_forwarded_proto
map $proxy_x_forwarded_proto $proxy_x_forwarded_ssl {
default off;
https on;
}
gzip_types text/plain text/css application/javascript application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript;
log_format vhost '$host $remote_addr - $remote_user [$time_local] '
'"$request" $status $body_bytes_sent '
'"$http_referer" "$http_user_agent" '
'"$upstream_addr"';
# HTTP 1.1 support
proxy_http_version 1.1;
proxy_buffering off;
proxy_set_header Host $http_host;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $proxy_connection;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $proxy_x_forwarded_proto;
proxy_set_header X-Forwarded-Ssl $proxy_x_forwarded_ssl;
proxy_set_header X-Forwarded-Port $proxy_x_forwarded_port;
proxy_set_header X-Original-URI $request_uri;
# Mitigate httpoxy attack (see README for details)
proxy_set_header Proxy "";
# Request rate limiting: 2 MB shared zone, 5 requests per second per client IP
limit_req_zone $binary_remote_addr zone=one:2m rate=5r/s;
# Per-client-IP connection limiting zone (2 MB); the actual cap is set with limit_conn below
limit_conn_zone $binary_remote_addr zone=two:2m;
server {
server_name _; # This is just an invalid value which will never trigger on a real hostname.
server_tokens off;
listen 80;
access_log /var/log/nginx/access.log vhost;
return 503;
}
server {
server_name yourdomain.com;
# Connection timeouts
client_body_timeout 10s;
client_header_timeout 10s;
listen 80;
access_log /var/log/nginx/access.log vhost;
# Do not redirect the Let's Encrypt ACME challenge to HTTPS
location ^~ /.well-known/acme-challenge/ {
limit_req zone=one;
limit_conn two 1;
proxy_pass http://127.0.0.1:8000;
break;
}
location / {
limit_req zone=one;
limit_conn two 1;
return 301 https://$host$request_uri;
}
}

5746
yarn.lock

File diff suppressed because it is too large