Initial commit (d95635c4f8)
.dockerignore (Normal file, 4 lines)
@@ -0,0 +1,4 @@
node_modules
data
.git
.env
.eslintrc.js (Normal file, 44 lines)
@@ -0,0 +1,44 @@
module.exports = {
    env: {
        es2021: true,
        node: true,
    },
    extends: [
        'prettier',
        'eslint:recommended',
        'plugin:@typescript-eslint/recommended',
        'plugin:import/recommended',
        'plugin:import/typescript',
        'plugin:prettier/recommended',
    ],
    overrides: [
        {
            env: {
                node: true,
            },
            files: ['.eslintrc.{js,cjs}'],
            parserOptions: {
                sourceType: 'script',
            },
        },
    ],
    parser: '@typescript-eslint/parser',
    parserOptions: {
        ecmaVersion: 'latest',
        sourceType: 'module',
    },
    plugins: ['@typescript-eslint', 'prettier'],
    rules: {
        'prettier/prettier': [
            'error',
            {
                tabWidth: 4,
                printWidth: 120,
                singleQuote: true,
            },
        ],
        'import/order': ['error'],
        '@typescript-eslint/no-unused-vars': ['warn'],
        '@typescript-eslint/no-unused-expressions': ['off'],
    },
};
.gitattributes (vendored, Normal file, 2 lines)
@@ -0,0 +1,2 @@
lib/**/* linguist-vendored
static/**/* linguist-vendored
.gitignore (vendored, Normal file, 3 lines)
@@ -0,0 +1,3 @@
node_modules
data
.env
.npmrc (Normal file, 1 line)
@@ -0,0 +1 @@
@tornado:registry=https://git.tornado.ws/api/packages/tornado-packages/npm/
Dockerfile (Normal file, 12 lines)
@@ -0,0 +1,12 @@
FROM node:20-alpine

WORKDIR /app

COPY package.json .
COPY yarn.lock .

RUN yarn

COPY . .

ENTRYPOINT ["yarn", "start"]
README.md (Normal file, 45 lines)
@@ -0,0 +1,45 @@
<div class="hero" align="center">

<img src="./logo2.png">

# Товарищ Relayer

Tovarish Relayer is a new Tornado Cash relayer that ships with a synced historic events API, helping users create withdrawal SNARK proofs in a decentralized, censorship-resistant manner.

It runs on multiple chains (a single Tovarish Relayer instance covers every chain where Tornado Cash is deployed), and it requires only a single URL endpoint to expose and advertise your relayer.

</div>

## Disclaimer

Currently, this relayer isn't compatible with the deployed UI on [tornadocash.eth](https://tornadocash-eth.ipns.dweb.link) or any other classic UI deployments. Its API and REST requests are compatible with the [classic relayer software](https://git.tornado.ws/tornadocash/tornado-relayer); however, since this relayer uses the more optimized gas price calculation from the [@tornado/core](https://git.tornado.ws/tornado-packages/tornado-core) package, it may require higher fees for network gas than the previous version did. Thus, we have blocked interaction with any previous classic UI until the UI is upgraded to support the new gas calculation optimized for EIP-1559 chains.

## Technical Requirements

+ Latest LTS version of Node.js (20.x recommended)

+ RPC node (supply one per chain with env values such as `1_RPC` or `56_RPC`; see ./src/config.ts for all available ENV variables, and the example below)

+ Nginx

Note that unlike the classic version of the relayer, this relayer doesn't require a Redis DB or a Docker installation; a single relayer instance running on top of Node.js is sufficient (and it utilizes a multi-threaded environment with Worker and Cluster support).
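As a reference, here is a minimal `.env` sketch. The variable names come from lib/config.js in this commit; the values are placeholders you must replace:

```
# 0x-prefixed, 32-byte hex private key of the reward account (required)
PRIVATE_KEY=0x0000000000000000000000000000000000000000000000000000000000000001

# optional overrides (see lib/config.js for defaults)
ENABLED_NETWORKS=1,56
1_RPC=https://your-mainnet-node.example
56_RPC=https://your-bsc-node.example
PORT=3000
SERVICE_FEE=0.5
```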
## How to run the relayer?

Note that this relayer is still at an early stage, and many components of the source code may change in the foreseeable future.

To work with the relayer properly, you must have:

1. Your own RPC node, or a paid plan with sufficient rate limits (one that allows 30 requests per second with historic events data).

2. On-chain registered relayer addresses.

You must register the relayer on the on-chain Relayer Registry contract. Follow the guidelines about registering the relayer on-chain: https://docs.tornado.ws/general/guides/relayer.html

After you run the relayer locally and have registered the relayer on-chain, you must also register the main URL on the `tovarish-relayer` ENS subdomain, just as you registered one for each chain.

Here is an example of registering a tovarish relayer on the ENS domain: https://app.ens.domains/tovarish-relayer.tornadowithdraw.eth?tab=records
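Once the environment is configured, the relayer can be started either directly with Yarn or through Docker Compose (both entry points come from the Dockerfile and docker-compose.yml in this commit); a minimal sketch:

```bash
# install dependencies and run directly on the host
yarn
yarn start

# or build and run the container (published on 127.0.0.1:3000)
docker compose up -d --build
```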
## Upcoming Updates

This documentation is likely to be updated in the near future.
docker-compose.yml (Normal file, 13 lines)
@@ -0,0 +1,13 @@
services:
    relayer:
        container_name: relayer
        image: relayer
        build:
            context: .
        restart: always
        env_file:
            - ./.env
        ports:
            - '127.0.0.1:3000:3000'
        volumes:
            - './data:/app/data'
lib/config.d.ts (vendored, Normal file, 33 lines)
@@ -0,0 +1,33 @@
import 'dotenv/config';
import { NetIdType, SubdomainMap } from '@tornado/core';
export declare const version: string;
export interface RelayerConfig {
    /**
     * Router config
     */
    host: string;
    port: number;
    workers: number;
    reverseProxy: boolean;
    logLevel?: string;
    /**
     * Worker config
     */
    rewardAccount: string;
    serviceFee: number;
    clearInterval: number;
    /**
     * Sync config
     */
    enabledNetworks: NetIdType[];
    rpcUrls: SubdomainMap;
    txRpcUrls: SubdomainMap;
    merkleWorkerPath: string;
    cacheDir: string;
    userEventsDir: string;
    userTreeDir: string;
    syncInterval: number;
}
export declare function getPrivateKey(): string;
export declare function getRewardAccount(): string;
export declare function getRelayerConfig(): RelayerConfig;
lib/config.js (vendored, Normal file, 75 lines)
@@ -0,0 +1,75 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.version = void 0;
exports.getPrivateKey = getPrivateKey;
exports.getRewardAccount = getRewardAccount;
exports.getRelayerConfig = getRelayerConfig;
const path_1 = __importDefault(require("path"));
const process_1 = __importDefault(require("process"));
const os_1 = __importDefault(require("os"));
require("dotenv/config");
const ethers_1 = require("ethers");
const core_1 = require("@tornado/core");
const package_json_1 = __importDefault(require("../package.json"));
exports.version = `${package_json_1.default.name} ${package_json_1.default.version}`;
function getPrivateKey() {
    const privateKey = process_1.default.env.PRIVATE_KEY;
    if (!privateKey || !(0, ethers_1.isHexString)(privateKey, 32)) {
        throw new Error('Invalid private key, make sure it contains 0x prefix!');
    }
    return privateKey;
}
function getRewardAccount() {
    return (0, ethers_1.computeAddress)(getPrivateKey());
}
function getRelayerConfig() {
    const enabledNetworks = process_1.default.env.ENABLED_NETWORKS
        ? process_1.default.env.ENABLED_NETWORKS.replaceAll(' ', '')
            .split(',')
            .map((n) => Number(n))
            .filter((n) => core_1.enabledChains.includes(n))
        : core_1.enabledChains;
    const rpcUrls = enabledNetworks.reduce((acc, netId) => {
        // If we have a custom RPC url (such as 1_RPC from ENV)
        if (process_1.default.env[`${netId}_RPC`]) {
            acc[netId] = process_1.default.env[`${netId}_RPC`] || '';
        }
        else {
            acc[netId] = Object.values((0, core_1.getConfig)(netId).rpcUrls)[0]?.url;
        }
        return acc;
    }, {});
    const txRpcUrls = enabledNetworks.reduce((acc, netId) => {
        // If we have a custom transaction RPC url (such as 1_TX_RPC from ENV)
        if (process_1.default.env[`${netId}_TX_RPC`]) {
            acc[netId] = process_1.default.env[`${netId}_TX_RPC`] || '';
        }
        else {
            acc[netId] = rpcUrls[netId];
        }
        return acc;
    }, {});
    const STATIC_DIR = process_1.default.env.CACHE_DIR || path_1.default.join(__dirname, '../static');
    const USER_DIR = process_1.default.env.USER_DIR || './data';
    return {
        host: process_1.default.env.HOST || '0.0.0.0',
        port: Number(process_1.default.env.PORT || 3000),
        workers: Number(process_1.default.env.WORKERS || os_1.default.cpus().length),
        reverseProxy: process_1.default.env.REVERSE_PROXY === 'true',
        logLevel: process_1.default.env.LOG_LEVEL || undefined,
        rewardAccount: getRewardAccount(),
        serviceFee: Number(process_1.default.env.SERVICE_FEE || 0.5),
        clearInterval: Number(process_1.default.env.CLEAR_INTERVAL || 86400),
        enabledNetworks,
        rpcUrls,
        txRpcUrls,
        merkleWorkerPath: path_1.default.join(STATIC_DIR, './merkleTreeWorker.js'),
        cacheDir: path_1.default.join(STATIC_DIR, './events'),
        userEventsDir: path_1.default.join(USER_DIR, './events'),
        userTreeDir: path_1.default.join(USER_DIR, './trees'),
        syncInterval: Number(process_1.default.env.SYNC_INTERVAL || 120),
    };
}
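A usage sketch for the config module above (the require path and env values are illustrative assumptions, not part of the commit):

// illustrative usage; the require path is an assumption
const { getRelayerConfig, getRewardAccount } = require('./lib/config');

// PRIVATE_KEY must be a 0x-prefixed 32-byte hex string,
// otherwise getPrivateKey() (called by getRewardAccount) throws
process.env.PRIVATE_KEY = '0x' + '11'.repeat(32);
process.env.ENABLED_NETWORKS = '1, 56'; // spaces are stripped before parsing
process.env['1_RPC'] = 'https://your-mainnet-node.example'; // overrides the default RPC from @tornado/core

const config = getRelayerConfig();
console.log(config.rewardAccount === getRewardAccount()); // true
console.log(config.rpcUrls[1]); // the custom 1_RPC value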
lib/index.d.ts (vendored, Normal file, 2 lines)
@@ -0,0 +1,2 @@
export * from './services';
export * from './config';
lib/index.js (vendored, Normal file, 18 lines)
@@ -0,0 +1,18 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __exportStar = (this && this.__exportStar) || function(m, exports) {
    for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
Object.defineProperty(exports, "__esModule", { value: true });
__exportStar(require("./services"), exports);
__exportStar(require("./config"), exports);
lib/services/check.d.ts (vendored, Normal file, 5 lines)
@@ -0,0 +1,5 @@
import type { Logger } from 'winston';
import { RelayerConfig } from '../config';
export declare const CHECK_BALANCE: bigint;
export declare const DISABLE_LOW_BALANCE = true;
export declare function checkProviders(relayerConfig: RelayerConfig, logger: Logger): Promise<void>;
lib/services/check.js (vendored, Normal file, 42 lines)
@@ -0,0 +1,42 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.DISABLE_LOW_BALANCE = exports.CHECK_BALANCE = void 0;
exports.checkProviders = checkProviders;
const process_1 = __importDefault(require("process"));
const ethers_1 = require("ethers");
const core_1 = require("@tornado/core");
// Can use 0 to keep using a network on low balance
exports.CHECK_BALANCE = (0, ethers_1.parseEther)(process_1.default.env.CHECK_BALANCE || '0.001');
exports.DISABLE_LOW_BALANCE = true;
async function checkProviders(relayerConfig, logger) {
    const { enabledNetworks, rpcUrls, rewardAccount } = relayerConfig;
    const disabledNetworks = (await Promise.all(enabledNetworks.map(async (netId) => {
        try {
            const config = (0, core_1.getConfig)(netId);
            const rpcUrl = rpcUrls[netId];
            const provider = await (0, core_1.getProvider)(rpcUrl, {
                netId,
            });
            const balance = await provider.getBalance(rewardAccount);
            const symbol = config.nativeCurrency.toUpperCase();
            if (balance < exports.CHECK_BALANCE) {
                logger.error(`Network ${netId} has a lower balance than ${(0, ethers_1.formatEther)(exports.CHECK_BALANCE)} ${symbol} and is thus disabled (Balance: ${(0, ethers_1.formatEther)(balance)} ${symbol})`);
                if (exports.DISABLE_LOW_BALANCE) {
                    return netId;
                }
            }
            else {
                logger.info(`Network ${netId} connected with ${rpcUrl} (Balance: ${(0, ethers_1.formatEther)(balance)} ${symbol})`);
            }
        }
        catch (err) {
            logger.error(`Failed to connect with ${netId} provider, make sure you have configured the correct RPC url`);
            throw err;
        }
    }))).filter((n) => n);
    relayerConfig.enabledNetworks = relayerConfig.enabledNetworks.filter((n) => !disabledNetworks.includes(n));
    logger.info(`Enabled Networks: ${relayerConfig.enabledNetworks.join(', ')}`);
}
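A startup sketch wiring the modules above together (require paths are assumptions; getLogger comes from lib/services/logger.js later in this commit):

// illustrative startup check; require paths are assumptions
const { getRelayerConfig } = require('./lib/config');
const { checkProviders } = require('./lib/services/check');
const { getLogger } = require('./lib/services/logger');

(async () => {
    const relayerConfig = getRelayerConfig();
    // drops every enabled network whose reward-account balance is below CHECK_BALANCE
    await checkProviders(relayerConfig, getLogger('[startup]'));
    console.log(relayerConfig.enabledNetworks); // only the networks that passed the check
})();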
lib/services/data.d.ts (vendored, Normal file, 33 lines)
@@ -0,0 +1,33 @@
import { AsyncZippable, Unzipped } from 'fflate';
import { BaseEvents, CachedEvents, MinimalEvents } from '@tornado/core';
export declare function existsAsync(fileOrDir: string): Promise<boolean>;
export declare function zipAsync(file: AsyncZippable): Promise<Uint8Array>;
export declare function unzipAsync(data: Uint8Array): Promise<Unzipped>;
export declare function saveUserFile({ fileName, userDirectory, dataString, lastBlock, }: {
    fileName: string;
    userDirectory: string;
    dataString: string;
    lastBlock?: number;
}): Promise<void>;
export declare function saveLastBlock({ fileName, userDirectory, lastBlock, }: {
    fileName: string;
    userDirectory: string;
    lastBlock: number;
}): Promise<void>;
export declare function loadLastBlock({ name, directory }: {
    name: string;
    directory: string;
}): Promise<number | undefined>;
export declare function loadSavedEvents<T extends MinimalEvents>({ name, userDirectory, }: {
    name: string;
    userDirectory: string;
}): Promise<BaseEvents<T>>;
export declare function download({ name, cacheDirectory }: {
    name: string;
    cacheDirectory: string;
}): Promise<string>;
export declare function loadCachedEvents<T extends MinimalEvents>({ name, cacheDirectory, deployedBlock, }: {
    name: string;
    cacheDirectory: string;
    deployedBlock: number;
}): Promise<CachedEvents<T>>;
lib/services/data.js (vendored, Normal file, 151 lines)
@@ -0,0 +1,151 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.existsAsync = existsAsync;
exports.zipAsync = zipAsync;
exports.unzipAsync = unzipAsync;
exports.saveUserFile = saveUserFile;
exports.saveLastBlock = saveLastBlock;
exports.loadLastBlock = loadLastBlock;
exports.loadSavedEvents = loadSavedEvents;
exports.download = download;
exports.loadCachedEvents = loadCachedEvents;
const path_1 = __importDefault(require("path"));
const promises_1 = require("fs/promises");
const fflate_1 = require("fflate");
async function existsAsync(fileOrDir) {
    try {
        await (0, promises_1.stat)(fileOrDir);
        return true;
    }
    catch {
        return false;
    }
}
function zipAsync(file) {
    return new Promise((res, rej) => {
        (0, fflate_1.zip)(file, { mtime: new Date('1/1/1980') }, (err, data) => {
            if (err) {
                rej(err);
                return;
            }
            res(data);
        });
    });
}
function unzipAsync(data) {
    return new Promise((res, rej) => {
        (0, fflate_1.unzip)(data, {}, (err, data) => {
            if (err) {
                rej(err);
                return;
            }
            res(data);
        });
    });
}
async function saveUserFile({ fileName, userDirectory, dataString, lastBlock, }) {
    fileName = fileName.toLowerCase();
    const filePath = path_1.default.join(userDirectory, fileName);
    const payload = await zipAsync({
        [fileName]: new TextEncoder().encode(dataString),
    });
    if (!(await existsAsync(userDirectory))) {
        await (0, promises_1.mkdir)(userDirectory, { recursive: true });
    }
    await (0, promises_1.writeFile)(filePath + '.zip', payload);
    await (0, promises_1.writeFile)(filePath, dataString);
    if (lastBlock) {
        await saveLastBlock({
            fileName: fileName.replace('.json', ''),
            userDirectory,
            lastBlock,
        });
    }
}
async function saveLastBlock({ fileName, userDirectory, lastBlock, }) {
    const filePath = path_1.default.join(userDirectory, fileName);
    if (lastBlock) {
        await (0, promises_1.writeFile)(filePath + '.lastblock.txt', String(lastBlock));
    }
}
async function loadLastBlock({ name, directory }) {
    const filePath = path_1.default.join(directory, `${name}.lastblock.txt`);
    if (!(await existsAsync(filePath))) {
        return;
    }
    try {
        const lastBlock = Number(await (0, promises_1.readFile)(filePath, { encoding: 'utf8' }));
        if (lastBlock) {
            return lastBlock;
        }
        // eslint-disable-next-line no-empty
    }
    catch { }
}
async function loadSavedEvents({ name, userDirectory, }) {
    const filePath = path_1.default.join(userDirectory, `${name}.json`.toLowerCase());
    if (!(await existsAsync(filePath))) {
        return {
            events: [],
            lastBlock: 0,
        };
    }
    try {
        const events = JSON.parse(await (0, promises_1.readFile)(filePath, { encoding: 'utf8' }));
        const loadedBlock = await loadLastBlock({
            name,
            directory: userDirectory,
        });
        return {
            events,
            lastBlock: loadedBlock || events[events.length - 1]?.blockNumber || 0,
        };
    }
    catch (err) {
        console.log('Method loadSavedEvents has error');
        console.log(err);
        return {
            events: [],
            lastBlock: 0,
        };
    }
}
async function download({ name, cacheDirectory }) {
    const fileName = `${name}.json`.toLowerCase();
    const zipName = `${fileName}.zip`;
    const zipPath = path_1.default.join(cacheDirectory, zipName);
    const data = await (0, promises_1.readFile)(zipPath);
    const { [fileName]: content } = await unzipAsync(data);
    return new TextDecoder().decode(content);
}
async function loadCachedEvents({ name, cacheDirectory, deployedBlock, }) {
    try {
        const module = await download({ cacheDirectory, name });
        if (module) {
            const events = JSON.parse(module);
            const lastBlock = events && events.length ? events[events.length - 1].blockNumber : deployedBlock;
            return {
                events,
                lastBlock,
                fromCache: true,
            };
        }
        return {
            events: [],
            lastBlock: deployedBlock,
            fromCache: true,
        };
    }
    catch (err) {
        console.log('Method loadCachedEvents has error');
        console.log(err);
        return {
            events: [],
            lastBlock: deployedBlock,
            fromCache: true,
        };
    }
}
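A round-trip sketch for the zip helpers above (the fixed mtime in zipAsync keeps archives byte-for-byte reproducible; the require path and file name are illustrative):

// illustrative round trip; the require path is an assumption
const { zipAsync, unzipAsync } = require('./lib/services/data');

(async () => {
    const payload = await zipAsync({
        'deposits_1_eth_1.json': new TextEncoder().encode('[]'),
    });
    const { 'deposits_1_eth_1.json': content } = await unzipAsync(payload);
    console.log(new TextDecoder().decode(content)); // "[]"
})();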
lib/services/error.d.ts (vendored, Normal file, 11 lines)
@@ -0,0 +1,11 @@
import { NetIdType } from '@tornado/core';
export interface ErrorTypes {
    type: string;
    netId: number;
    timestamp: number;
}
export interface ErrorMessages extends ErrorTypes {
    message?: string;
    stack?: string;
}
export declare function newError(type: string, netId: NetIdType, err: any): ErrorMessages;
lib/services/error.js (vendored, Normal file, 14 lines)
@@ -0,0 +1,14 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.newError = newError;
function newError(type, netId,
// eslint-disable-next-line @typescript-eslint/no-explicit-any
err) {
    return {
        type,
        netId,
        timestamp: Math.floor(Date.now() / 1000),
        message: err.message,
        stack: err.stack,
    };
}
lib/services/events.d.ts (vendored, Normal file, 144 lines)
@@ -0,0 +1,144 @@
import { BaseTornadoService, BaseEncryptedNotesService, BaseGovernanceService, BaseRegistryService, BaseTornadoServiceConstructor, BaseEncryptedNotesServiceConstructor, BaseGovernanceServiceConstructor, BaseRegistryServiceConstructor, BaseEchoServiceConstructor, BaseEchoService, CachedRelayers, BatchEventsService, BaseEvents, DepositsEvents, WithdrawalsEvents, EncryptedNotesEvents, AllGovernanceEvents, EchoEvents, BatchEventServiceConstructor, BatchEventOnProgress, NetIdType, AllRelayerRegistryEvents, BaseRevenueService, BaseRevenueServiceConstructor, StakeBurnedEvents } from '@tornado/core';
import type { Logger } from 'winston';
import { TreeCache } from './treeCache';
export interface NodeEventsConstructor extends BatchEventServiceConstructor {
    netId: NetIdType;
    logger: Logger;
    getInstanceName: () => string;
}
export declare class NodeEventsService extends BatchEventsService {
    netId: NetIdType;
    logger: Logger;
    getInstanceName: () => string;
    constructor(serviceConstructor: NodeEventsConstructor);
}
export interface NodeTornadoServiceConstructor extends BaseTornadoServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    nativeCurrency: string;
    logger: Logger;
    treeCache?: TreeCache;
}
export declare class NodeTornadoService extends BaseTornadoService {
    cacheDirectory: string;
    userDirectory: string;
    nativeCurrency: string;
    logger: Logger;
    treeCache?: TreeCache;
    constructor(serviceConstructor: NodeTornadoServiceConstructor);
    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]): void;
    getEventsFromDB(): Promise<BaseEvents<DepositsEvents | WithdrawalsEvents>>;
    getEventsFromCache(): Promise<import("@tornado/core").CachedEvents<DepositsEvents | WithdrawalsEvents>>;
    validateEvents<S>({ events, lastBlock, hasNewEvents, }: BaseEvents<DepositsEvents | WithdrawalsEvents> & {
        hasNewEvents?: boolean;
    }): Promise<S>;
    saveEvents({ events, lastBlock }: BaseEvents<DepositsEvents | WithdrawalsEvents>): Promise<void>;
    updateEvents<S>(): Promise<{
        events: (DepositsEvents | WithdrawalsEvents)[];
        lastBlock: number;
        validateResult: Awaited<S>;
    }>;
}
export interface NodeEchoServiceConstructor extends BaseEchoServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}
export declare class NodeEchoService extends BaseEchoService {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
    constructor(serviceConstructor: NodeEchoServiceConstructor);
    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]): void;
    getEventsFromDB(): Promise<BaseEvents<EchoEvents>>;
    getEventsFromCache(): Promise<import("@tornado/core").CachedEvents<EchoEvents>>;
    saveEvents({ events, lastBlock }: BaseEvents<EchoEvents>): Promise<void>;
    updateEvents<S>(): Promise<{
        events: EchoEvents[];
        lastBlock: number;
        validateResult: Awaited<S>;
    }>;
}
export interface NodeEncryptedNotesServiceConstructor extends BaseEncryptedNotesServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}
export declare class NodeEncryptedNotesService extends BaseEncryptedNotesService {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
    constructor(serviceConstructor: NodeEncryptedNotesServiceConstructor);
    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]): void;
    getEventsFromDB(): Promise<BaseEvents<EncryptedNotesEvents>>;
    getEventsFromCache(): Promise<import("@tornado/core").CachedEvents<EncryptedNotesEvents>>;
    saveEvents({ events, lastBlock }: BaseEvents<EncryptedNotesEvents>): Promise<void>;
    updateEvents<S>(): Promise<{
        events: EncryptedNotesEvents[];
        lastBlock: number;
        validateResult: Awaited<S>;
    }>;
}
export interface NodeGovernanceServiceConstructor extends BaseGovernanceServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}
export declare class NodeGovernanceService extends BaseGovernanceService {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
    constructor(serviceConstructor: NodeGovernanceServiceConstructor);
    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]): void;
    getEventsFromDB(): Promise<BaseEvents<AllGovernanceEvents>>;
    getEventsFromCache(): Promise<import("@tornado/core").CachedEvents<AllGovernanceEvents>>;
    saveEvents({ events, lastBlock }: BaseEvents<AllGovernanceEvents>): Promise<void>;
    updateEvents<S>(): Promise<{
        events: AllGovernanceEvents[];
        lastBlock: number;
        validateResult: Awaited<S>;
    }>;
}
export interface NodeRegistryServiceConstructor extends BaseRegistryServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}
export declare class NodeRegistryService extends BaseRegistryService {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
    constructor(serviceConstructor: NodeRegistryServiceConstructor);
    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]): void;
    getEventsFromDB(): Promise<BaseEvents<AllRelayerRegistryEvents>>;
    getEventsFromCache(): Promise<import("@tornado/core").CachedEvents<AllRelayerRegistryEvents>>;
    saveEvents({ events, lastBlock }: BaseEvents<AllRelayerRegistryEvents>): Promise<void>;
    updateEvents<S>(): Promise<{
        events: AllRelayerRegistryEvents[];
        lastBlock: number;
        validateResult: Awaited<S>;
    }>;
    getRelayersFromDB(): Promise<CachedRelayers>;
    getRelayersFromCache(): Promise<CachedRelayers>;
    saveRelayers({ lastBlock, timestamp, relayers }: CachedRelayers): Promise<void>;
}
export interface NodeRevenueServiceConstructor extends BaseRevenueServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}
export declare class NodeRevenueService extends BaseRevenueService {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
    constructor(serviceConstructor: NodeRevenueServiceConstructor);
    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]): void;
    getEventsFromDB(): Promise<BaseEvents<StakeBurnedEvents>>;
    getEventsFromCache(): Promise<import("@tornado/core").CachedEvents<StakeBurnedEvents>>;
    saveEvents({ events, lastBlock }: BaseEvents<StakeBurnedEvents>): Promise<void>;
    updateEvents<S>(): Promise<{
        events: StakeBurnedEvents[];
        lastBlock: number;
        validateResult: Awaited<S>;
    }>;
}
lib/services/events.js (vendored, Normal file, 468 lines)
@@ -0,0 +1,468 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.NodeRevenueService = exports.NodeRegistryService = exports.NodeGovernanceService = exports.NodeEncryptedNotesService = exports.NodeEchoService = exports.NodeTornadoService = exports.NodeEventsService = void 0;
const path_1 = __importDefault(require("path"));
const promises_1 = require("fs/promises");
const core_1 = require("@tornado/core");
const data_1 = require("./data");
class NodeEventsService extends core_1.BatchEventsService {
    netId;
    logger;
    getInstanceName;
    constructor(serviceConstructor) {
        super(serviceConstructor);
        this.netId = serviceConstructor.netId;
        this.logger = serviceConstructor.logger;
        this.getInstanceName = serviceConstructor.getInstanceName;
    }
}
exports.NodeEventsService = NodeEventsService;
class NodeTornadoService extends core_1.BaseTornadoService {
    cacheDirectory;
    userDirectory;
    nativeCurrency;
    logger;
    treeCache;
    constructor(serviceConstructor) {
        super(serviceConstructor);
        const { netId, provider, Tornado, type, amount, currency, cacheDirectory, userDirectory, nativeCurrency, logger, treeCache, } = serviceConstructor;
        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.nativeCurrency = nativeCurrency;
        this.logger = logger;
        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: Tornado,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: () => `${type.toLowerCase()}s_${netId}_${currency}_${amount}`,
        });
        this.treeCache = treeCache;
    }
    updateEventProgress({ fromBlock, toBlock, count }) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }
    async getEventsFromDB() {
        return await (0, data_1.loadSavedEvents)({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }
    async getEventsFromCache() {
        return await (0, data_1.loadCachedEvents)({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }
    async validateEvents({ events, lastBlock, hasNewEvents, }) {
        const tree = await super.validateEvents({
            events,
            lastBlock,
            hasNewEvents,
        });
        if (tree && this.currency === this.nativeCurrency && this.treeCache) {
            const merkleTree = tree;
            await this.treeCache.createTree(events, merkleTree);
            console.log(`${this.getInstanceName()}: Updated tree cache with root ${(0, core_1.toFixedHex)(BigInt(merkleTree.root))}\n`);
        }
        return tree;
    }
    async saveEvents({ events, lastBlock }) {
        await (0, data_1.saveUserFile)({
            fileName: this.getInstanceName() + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }
    async updateEvents() {
        const { events, lastBlock, validateResult } = await super.updateEvents();
        await (0, data_1.saveLastBlock)({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });
        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}
exports.NodeTornadoService = NodeTornadoService;
class NodeEchoService extends core_1.BaseEchoService {
    cacheDirectory;
    userDirectory;
    logger;
    constructor(serviceConstructor) {
        super(serviceConstructor);
        const { netId, provider, Echoer, cacheDirectory, userDirectory, logger } = serviceConstructor;
        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;
        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: Echoer,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }
    updateEventProgress({ fromBlock, toBlock, count }) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }
    async getEventsFromDB() {
        return await (0, data_1.loadSavedEvents)({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }
    async getEventsFromCache() {
        return await (0, data_1.loadCachedEvents)({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }
    async saveEvents({ events, lastBlock }) {
        const instanceName = this.getInstanceName();
        await (0, data_1.saveUserFile)({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }
    async updateEvents() {
        const { events, lastBlock, validateResult } = await super.updateEvents();
        await (0, data_1.saveLastBlock)({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });
        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}
exports.NodeEchoService = NodeEchoService;
class NodeEncryptedNotesService extends core_1.BaseEncryptedNotesService {
    cacheDirectory;
    userDirectory;
    logger;
    constructor(serviceConstructor) {
        super(serviceConstructor);
        const { netId, provider, Router, cacheDirectory, userDirectory, logger } = serviceConstructor;
        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;
        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: Router,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }
    updateEventProgress({ fromBlock, toBlock, count }) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }
    async getEventsFromDB() {
        return await (0, data_1.loadSavedEvents)({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }
    async getEventsFromCache() {
        return await (0, data_1.loadCachedEvents)({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }
    async saveEvents({ events, lastBlock }) {
        const instanceName = this.getInstanceName();
        await (0, data_1.saveUserFile)({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }
    async updateEvents() {
        const { events, lastBlock, validateResult } = await super.updateEvents();
        await (0, data_1.saveLastBlock)({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });
        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}
exports.NodeEncryptedNotesService = NodeEncryptedNotesService;
class NodeGovernanceService extends core_1.BaseGovernanceService {
    cacheDirectory;
    userDirectory;
    logger;
    constructor(serviceConstructor) {
        super(serviceConstructor);
        const { netId, provider, Governance, cacheDirectory, userDirectory, logger } = serviceConstructor;
        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;
        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: Governance,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }
    updateEventProgress({ fromBlock, toBlock, count }) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }
    async getEventsFromDB() {
        return await (0, data_1.loadSavedEvents)({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }
    async getEventsFromCache() {
        return await (0, data_1.loadCachedEvents)({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }
    async saveEvents({ events, lastBlock }) {
        const instanceName = this.getInstanceName();
        await (0, data_1.saveUserFile)({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }
    async updateEvents() {
        const { events, lastBlock, validateResult } = await super.updateEvents();
        await (0, data_1.saveLastBlock)({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });
        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}
exports.NodeGovernanceService = NodeGovernanceService;
class NodeRegistryService extends core_1.BaseRegistryService {
    cacheDirectory;
    userDirectory;
    logger;
    constructor(serviceConstructor) {
        super(serviceConstructor);
        const { netId, provider, RelayerRegistry, cacheDirectory, userDirectory, logger } = serviceConstructor;
        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;
        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: RelayerRegistry,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }
    updateEventProgress({ fromBlock, toBlock, count }) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }
    async getEventsFromDB() {
        return await (0, data_1.loadSavedEvents)({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }
    async getEventsFromCache() {
        return await (0, data_1.loadCachedEvents)({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }
    async saveEvents({ events, lastBlock }) {
        const instanceName = this.getInstanceName();
        await (0, data_1.saveUserFile)({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }
    async updateEvents() {
        const { events, lastBlock, validateResult } = await super.updateEvents();
        await (0, data_1.saveLastBlock)({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });
        return {
            events,
            lastBlock,
            validateResult,
        };
    }
    async getRelayersFromDB() {
        const filePath = path_1.default.join(this.userDirectory || '', 'relayers.json');
        if (!this.userDirectory || !(await (0, data_1.existsAsync)(filePath))) {
            return {
                lastBlock: 0,
                timestamp: 0,
                relayers: [],
            };
        }
        try {
            const { lastBlock, timestamp, relayers } = JSON.parse(await (0, promises_1.readFile)(filePath, { encoding: 'utf8' }));
            return {
                lastBlock,
                timestamp,
                relayers,
            };
        }
        catch (err) {
            console.log('Method getRelayersFromDB has error');
            console.log(err);
            return {
                lastBlock: 0,
                timestamp: 0,
                relayers: [],
            };
        }
    }
    async getRelayersFromCache() {
        const filePath = path_1.default.join(this.cacheDirectory || '', 'relayers.json');
        if (!this.cacheDirectory || !(await (0, data_1.existsAsync)(filePath))) {
            return {
                lastBlock: 0,
                timestamp: 0,
                relayers: [],
                fromCache: true,
            };
        }
        try {
            const { lastBlock, timestamp, relayers } = JSON.parse(await (0, promises_1.readFile)(filePath, { encoding: 'utf8' }));
            return {
                lastBlock,
                timestamp,
                relayers,
                fromCache: true,
            };
        }
        catch (err) {
            console.log('Method getRelayersFromCache has error');
            console.log(err);
            return {
                lastBlock: 0,
                timestamp: 0,
                relayers: [],
                fromCache: true,
            };
        }
    }
    async saveRelayers({ lastBlock, timestamp, relayers }) {
        await (0, data_1.saveUserFile)({
            fileName: 'relayers.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify({ lastBlock, timestamp, relayers }, null, 2) + '\n',
        });
    }
}
exports.NodeRegistryService = NodeRegistryService;
class NodeRevenueService extends core_1.BaseRevenueService {
    cacheDirectory;
    userDirectory;
    logger;
    constructor(serviceConstructor) {
        super(serviceConstructor);
        const { netId, provider, RelayerRegistry, cacheDirectory, userDirectory, logger } = serviceConstructor;
        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;
        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: RelayerRegistry,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }
    updateEventProgress({ fromBlock, toBlock, count }) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }
    async getEventsFromDB() {
        return await (0, data_1.loadSavedEvents)({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }
    async getEventsFromCache() {
        return await (0, data_1.loadCachedEvents)({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }
    async saveEvents({ events, lastBlock }) {
        const instanceName = this.getInstanceName();
        await (0, data_1.saveUserFile)({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }
    async updateEvents() {
        const { events, lastBlock, validateResult } = await super.updateEvents();
        await (0, data_1.saveLastBlock)({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });
        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}
exports.NodeRevenueService = NodeRevenueService;
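A construction sketch for one of the services above. The field names follow the constructor's destructuring; how the provider, the Tornado instance contract, and deployedBlock are obtained (via @tornado/core config helpers) is assumed and not shown here:

// illustrative; provider, Tornado and deployedBlock must come from @tornado/core wiring
const depositsService = new NodeTornadoService({
    netId: 1,
    provider,                                   // ethers provider for mainnet
    Tornado,                                    // the Tornado instance contract
    type: 'Deposit',
    amount: '1',
    currency: 'eth',
    deployedBlock,                              // block at which the instance was deployed
    cacheDirectory: relayerConfig.cacheDir,     // bundled snapshot of events
    userDirectory: relayerConfig.userEventsDir, // locally synced events
    nativeCurrency: 'eth',
    logger,
});

// fetches new events since the last synced block and persists them
// (plus a .lastblock.txt marker) under userDirectory
const { events, lastBlock } = await depositsService.updateEvents();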
lib/services/index.d.ts (vendored, Normal file, 12 lines)
@@ -0,0 +1,12 @@
export * from './check';
export * from './data';
export * from './error';
export * from './events';
export * from './logger';
export * from './router';
export * from './routerMsg';
export * from './schema';
export * from './sync';
export * from './treeCache';
export * from './utils';
export * from './worker';
lib/services/index.js (vendored, Normal file, 28 lines)
@@ -0,0 +1,28 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __exportStar = (this && this.__exportStar) || function(m, exports) {
    for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
Object.defineProperty(exports, "__esModule", { value: true });
__exportStar(require("./check"), exports);
__exportStar(require("./data"), exports);
__exportStar(require("./error"), exports);
__exportStar(require("./events"), exports);
__exportStar(require("./logger"), exports);
__exportStar(require("./router"), exports);
__exportStar(require("./routerMsg"), exports);
__exportStar(require("./schema"), exports);
__exportStar(require("./sync"), exports);
__exportStar(require("./treeCache"), exports);
__exportStar(require("./utils"), exports);
__exportStar(require("./worker"), exports);
lib/services/logger.d.ts (vendored, Normal file, 2 lines)
@@ -0,0 +1,2 @@
import winston from 'winston';
export declare function getLogger(label?: string, minLevel?: string): winston.Logger;
lib/services/logger.js (vendored, Normal file, 26 lines)
@@ -0,0 +1,26 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.getLogger = getLogger;
const winston_1 = __importDefault(require("winston"));
const safe_1 = __importDefault(require("@colors/colors/safe"));
function getLogger(label, minLevel) {
    return winston_1.default.createLogger({
        format: winston_1.default.format.combine(winston_1.default.format.label({ label }), winston_1.default.format.timestamp({
            format: 'YYYY-MM-DD HH:mm:ss',
        }),
        // Include timestamp on level
        winston_1.default.format((info) => {
            info.level = `[${info.level}]`;
            while (info.level.length < 8) {
                info.level += ' ';
            }
            info.level = `${info.timestamp} ${info.level}`.toUpperCase();
            return info;
        })(), winston_1.default.format.colorize(), winston_1.default.format.printf((info) => `${info.level} ${info.label ? `${info.label} ` : ''}${safe_1.default.grey(info.message)}`)),
        // Define level filter from config
        transports: [new winston_1.default.transports.Console({ level: minLevel || 'debug' })],
    });
}
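A quick usage sketch for the logger factory above (the require path is an assumption):

const { getLogger } = require('./lib/services/logger');

const logger = getLogger('[sync]', 'info');
logger.info('Relayer started');  // printed: level passes the 'info' threshold
logger.debug('Verbose details'); // filtered out by minLevel 'info'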
lib/services/router.d.ts (vendored, Normal file, 38 lines)
@@ -0,0 +1,38 @@
import type { Logger } from 'winston';
import { FastifyInstance, FastifyReply, FastifyRequest } from 'fastify';
import { NetIdType, DepositsEvents, WithdrawalsEvents, EchoEvents, EncryptedNotesEvents, AllGovernanceEvents, TovarishStatus, AllRelayerRegistryEvents, StakeBurnedEvents } from '@tornado/core';
import { RelayerConfig } from '../config';
import { SentMsg } from './routerMsg';
import { SyncManagerStatus } from './sync';
export declare function getHealthStatus(netId: NetIdType, syncManagerStatus: SyncManagerStatus): string;
export declare function getGasPrices(netId: NetIdType, syncManagerStatus: SyncManagerStatus): {
    fast: number;
    additionalProperties: number | undefined;
};
export declare function formatStatus({ url, netId, relayerConfig, syncManagerStatus, pendingWorks, }: {
    url: string;
    netId: NetIdType;
    relayerConfig: RelayerConfig;
    syncManagerStatus: SyncManagerStatus;
    pendingWorks: number;
}): TovarishStatus;
export declare function handleIndex(enabledNetworks: NetIdType[]): string;
export declare function handleStatus(url: string, router: Router, netId: NetIdType | NetIdType[], reply: FastifyReply): Promise<void>;
/**
 * Since we check gasLimit and fees, the timeout at any proxy should be extended beyond 60s
 */
export declare function handleTornadoWithdraw(router: Router, netId: NetIdType, req: FastifyRequest, reply: FastifyReply): Promise<void>;
export declare function handleGetJob(router: Router, req: FastifyRequest, reply: FastifyReply): Promise<void>;
export type AllTovarishEvents = DepositsEvents | WithdrawalsEvents | EchoEvents | EncryptedNotesEvents | AllGovernanceEvents | AllRelayerRegistryEvents | StakeBurnedEvents;
export declare function handleEvents(router: Router, netId: NetIdType, req: FastifyRequest, reply: FastifyReply): Promise<void>;
export declare function handleTrees(router: Router, req: FastifyRequest, reply: FastifyReply): Promise<void>;
export declare function listenRouter(router: Router): void;
export declare class Router {
    relayerConfig: RelayerConfig;
    logger: Logger;
    forkId: number;
    app: FastifyInstance;
    admin: FastifyInstance;
    messages: SentMsg[];
    constructor(relayerConfig: RelayerConfig, forkId?: number);
}
311
lib/services/router.js
vendored
Normal file
311
lib/services/router.js
vendored
Normal file
@ -0,0 +1,311 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.Router = void 0;
exports.getHealthStatus = getHealthStatus;
exports.getGasPrices = getGasPrices;
exports.formatStatus = formatStatus;
exports.handleIndex = handleIndex;
exports.handleStatus = handleStatus;
exports.handleTornadoWithdraw = handleTornadoWithdraw;
exports.handleGetJob = handleGetJob;
exports.handleEvents = handleEvents;
exports.handleTrees = handleTrees;
exports.listenRouter = listenRouter;
const path_1 = __importDefault(require("path"));
const fs_1 = require("fs");
const fastify_1 = require("fastify");
const cors_1 = require("@fastify/cors");
const core_1 = require("@tornado/core");
const ethers_1 = require("ethers");
const config_1 = require("../config");
const logger_1 = require("./logger");
const routerMsg_1 = require("./routerMsg");
const data_1 = require("./data");
const schema_1 = require("./schema");
function getHealthStatus(netId, syncManagerStatus) {
    const { events, tokenPrice, gasPrice } = syncManagerStatus.syncStatus[netId];
    return String(Boolean(events && tokenPrice && gasPrice));
}
function getGasPrices(netId, syncManagerStatus) {
    const { gasPrice, l1Fee } = syncManagerStatus.cachedGasPrices[netId];
    return {
        fast: Number(gasPrice),
        additionalProperties: l1Fee ? Number(l1Fee) : undefined,
    };
}
function formatStatus({ url, netId, relayerConfig, syncManagerStatus, pendingWorks, }) {
    const config = (0, core_1.getConfig)(netId);
    return {
        url,
        rewardAccount: relayerConfig.rewardAccount,
        instances: (0, core_1.getActiveTokenInstances)(config),
        events: syncManagerStatus.cachedEvents[netId],
        gasPrices: getGasPrices(netId, syncManagerStatus),
        netId,
        ethPrices: syncManagerStatus.cachedPrices[netId],
        tornadoServiceFee: relayerConfig.serviceFee,
        latestBlock: syncManagerStatus.latestBlocks[netId],
        latestBalance: syncManagerStatus.latestBalances[netId],
        version: config_1.version,
        health: {
            status: getHealthStatus(netId, syncManagerStatus),
            error: '',
            errorsLog: [...syncManagerStatus.errors.filter((e) => e.netId === netId)],
        },
        syncStatus: syncManagerStatus.syncStatus[netId],
        onSyncEvents: syncManagerStatus.onSyncEvents,
        currentQueue: pendingWorks,
    };
}
function handleIndex(enabledNetworks) {
    return ('This is <a href=https://tornado.ws>Tornado Cash</a> Relayer service. Check the ' +
        enabledNetworks.map((netId) => `<a href=/${netId}/v1/status>/${netId}/v1/status</a> `).join(', ') +
        'for settings');
}
async function handleStatus(url, router, netId, reply) {
    const { relayerConfig } = router;
    const { syncManagerStatus, pendingWorks } = await (0, routerMsg_1.sendMessage)(router, { type: 'status' });
    if (Array.isArray(netId)) {
        reply.send(netId.map((n) => formatStatus({
            url,
            netId: n,
            relayerConfig,
            syncManagerStatus,
            pendingWorks,
        })));
        return;
    }
    reply.send(formatStatus({
        url,
        netId,
        relayerConfig,
        syncManagerStatus,
        pendingWorks,
    }));
}
/**
 * Since we check gasLimit and fees, any proxy in front should extend its timeout beyond 60s
 */
async function handleTornadoWithdraw(router, netId, req, reply) {
    const { contract, proof, args } = req.body;
    const { id, error } = await (0, routerMsg_1.sendMessage)(router, {
        type: 'tornadoWithdraw',
        netId,
        contract,
        proof,
        args,
    });
    if (error) {
        reply.code(502).send({ error });
        return;
    }
    reply.send({ id });
}
async function handleGetJob(router, req, reply) {
    const { id } = req.params;
    const job = await (0, routerMsg_1.sendMessage)(router, { type: 'job', id });
    if (job.error) {
        reply.code(502).send(job);
        return;
    }
    reply.send(job);
}
async function handleEvents(router, netId, req, reply) {
    const { relayerConfig: { userEventsDir: userDirectory }, } = router;
    const { type, currency, amount, fromBlock, recent } = req.body;
    const name = [core_1.DEPOSIT, core_1.WITHDRAWAL].includes(type) ? `${type}s_${netId}_${currency}_${amount}` : `${type}_${netId}`;
    // Can return 0 events but we just return error codes here
    if (!(await (0, data_1.existsAsync)(path_1.default.join(userDirectory, `${name}.json`)))) {
        reply.code(404).send(`Events ${name} not found!`);
        return;
    }
    const { syncManagerStatus } = await (0, routerMsg_1.sendMessage)(router, { type: 'status' });
    const lastSyncBlock = Number([core_1.DEPOSIT, core_1.WITHDRAWAL].includes(type)
        ? syncManagerStatus.cachedEvents[netId]?.instances?.[String(currency)]?.[String(amount)]?.[`${type}s`]?.lastBlock
        : syncManagerStatus.cachedEvents[netId]?.[String(type)]?.lastBlock);
    const { events } = await (0, data_1.loadSavedEvents)({
        name,
        userDirectory,
    });
    if (recent) {
        reply.send({
            events: events.slice(-10).sort((a, b) => {
                if (a.blockNumber === b.blockNumber) {
                    return b.logIndex - a.logIndex;
                }
                return b.blockNumber - a.blockNumber;
            }),
            lastSyncBlock,
        });
        return;
    }
    reply.send({
        events: events.filter((e) => e.blockNumber >= (fromBlock || 0)).slice(0, core_1.MAX_TOVARISH_EVENTS),
        lastSyncBlock,
    });
}
async function handleTrees(router, req, reply) {
    const treeRegex = /deposits_(?<netId>\d+)_(?<currency>\w+)_(?<amount>[\d.]+)_(?<part>\w+).json.zip/g;
    const { netId, currency, amount, part } = treeRegex.exec(req.params.treeName)?.groups || {};
    const treeName = `deposits_${netId}_${currency}_${amount}_${part}.json.zip`;
    const treePath = path_1.default.join(router.relayerConfig.userTreeDir, treeName);
    if (!(await (0, data_1.existsAsync)(treePath))) {
        reply.status(404).send(`Tree ${treeName} not found!`);
        return;
    }
    reply.send((0, fs_1.createReadStream)(treePath));
}
function listenRouter(router) {
    const { relayerConfig, logger, app, admin, forkId } = router;
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    app.register(cors_1.fastifyCors, () => (req, callback) => {
        callback(null, {
            origin: req.headers.origin || '*',
            credentials: true,
            methods: ['GET, POST, OPTIONS'],
            headers: [
                'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type',
            ],
            maxAge: 1728000,
        });
    });
    app.get('/', (_, reply) => {
        reply.type('text/html').send(handleIndex(relayerConfig.enabledNetworks));
    });
    app.get('/relayer', (_, reply) => {
        reply.type('text/html').send(handleIndex(relayerConfig.enabledNetworks));
    });
    app.get('/status', (req, reply) => {
        handleStatus(`${req.protocol}://${req.hostname}`, router, relayerConfig.enabledNetworks, reply);
    });
    app.get('/enabledNetworks', (_, reply) => {
        reply.send(relayerConfig.enabledNetworks);
    });
    if (forkId === 0) {
        logger.info('Router listening on /, /status, /enabledNetworks');
    }
    for (const netId of relayerConfig.enabledNetworks) {
        app.get(`/${netId}`, (_, reply) => {
            reply.type('text/html').send(handleIndex([netId]));
        });
        app.get(`/${netId}/status`, (req, reply) => {
            handleStatus(`${req.protocol}://${req.hostname}/${netId}`, router, netId, reply);
        });
        const withdrawSchema = (0, schema_1.getWithdrawSchema)(netId);
        app.post(`/${netId}/relay`, { schema: withdrawSchema }, (req, reply) => {
            handleTornadoWithdraw(router, netId, req, reply);
        });
        app.get(`/${netId}/v1/status`, (req, reply) => {
            handleStatus(`${req.protocol}://${req.hostname}/${netId}`, router, netId, reply);
        });
        app.post(`/${netId}/v1/tornadoWithdraw`, { schema: withdrawSchema }, (req, reply) => {
            handleTornadoWithdraw(router, netId, req, reply);
        });
        app.get(`/${netId}/v1/jobs/:id`, { schema: schema_1.idParamsSchema }, (req, reply) => {
            handleGetJob(router, req, reply);
        });
        const eventSchema = (0, schema_1.getEventsSchema)(netId);
        app.post(`/${netId}/events`, { schema: eventSchema }, (req, reply) => {
            handleEvents(router, netId, req, reply);
        });
        app.get(`/${netId}/trees/:treeName`, { schema: schema_1.treeNameSchema }, (req, reply) => {
            handleTrees(router, req, reply);
        });
        if (forkId === 0) {
            logger.info(`Router listening on /${netId}, /${netId}/status, /${netId}/relay, /${netId}/v1/status, /${netId}/v1/tornadoWithdraw, /${netId}/v1/jobs/:id, /${netId}/events, /${netId}/trees/:treeName`);
        }
    }
    const { port, host } = relayerConfig;
    app.listen({ port, host }, (err, address) => {
        if (err) {
            logger.error('Router Error');
            console.log(err);
            throw err;
        }
        else {
            logger.debug(`Router listening on ${address}`);
        }
    });
    admin.get('/errors', (_, reply) => {
        (async () => {
            const { errors } = await (0, routerMsg_1.sendMessage)(router, { type: 'errors' });
            reply.header('Content-Type', 'application/json').send(JSON.stringify(errors, null, 2));
        })();
    });
    admin.listen({ port: port + 100, host }, (err, address) => {
        if (err) {
            logger.error('Admin Router Error');
            console.log(err);
            throw err;
        }
        else {
            if (forkId === 0) {
                logger.debug(`Admin Router listening on ${address}`);
            }
        }
    });
    (0, routerMsg_1.resolveMessages)(router);
}
class Router {
    relayerConfig;
    logger;
    forkId;
    app;
    // For viewing error logs
    admin;
    messages;
    constructor(relayerConfig, forkId = 0) {
        this.relayerConfig = relayerConfig;
        this.logger = (0, logger_1.getLogger)(`[Router ${forkId}]`, relayerConfig.logLevel);
        this.forkId = forkId;
        const app = (0, fastify_1.fastify)({
            ajv: {
                customOptions: {
                    keywords: [
                        {
                            keyword: 'isAddress',
                            // eslint-disable-next-line @typescript-eslint/no-explicit-any
                            validate: (schema, data) => {
                                try {
                                    return (0, ethers_1.isAddress)(data);
                                }
                                catch {
                                    return false;
                                }
                            },
                            errors: true,
                        },
                        {
                            keyword: 'BN',
                            // eslint-disable-next-line @typescript-eslint/no-explicit-any
                            validate: (schema, data) => {
                                try {
                                    BigInt(data);
                                    return true;
                                }
                                catch {
                                    return false;
                                }
                            },
                            errors: true,
                        },
                        (0, schema_1.getTreeNameKeyword)(),
                        ...(0, schema_1.getAllWithdrawKeyword)(relayerConfig.rewardAccount),
                        ...(0, schema_1.getAllEventsKeyword)(),
                    ],
                },
            },
            trustProxy: relayerConfig.reverseProxy ? 1 : false,
            ignoreTrailingSlash: true,
        });
        const admin = (0, fastify_1.fastify)();
        this.app = app;
        this.admin = admin;
        this.messages = [];
        listenRouter(this);
    }
}
exports.Router = Router;
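A hedged sketch of querying the events route registered above. The body mirrors eventsSchema from schema.js further down; the base URL and instance parameters are illustrative assumptions.

// POST /:netId/events with { type, currency?, amount?, fromBlock, recent? }
async function fetchDeposits(baseUrl: string, netId: number, fromBlock: number) {
    const resp = await fetch(`${baseUrl}/${netId}/events`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // 'deposit' / 'withdrawal' types resolve to deposits_<netId>_<currency>_<amount>.json on disk
        body: JSON.stringify({ type: 'deposit', currency: 'eth', amount: '0.1', fromBlock }),
    });
    // Responses are capped at MAX_TOVARISH_EVENTS entries per call, so page forward by fromBlock
    return (await resp.json()) as { events: unknown[]; lastSyncBlock: number };
}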
9
lib/services/routerMsg.d.ts
vendored
Normal file
@ -0,0 +1,9 @@
import { Router } from './router';
export interface SentMsg {
    msgId: string;
    resolve: (msg: any) => void;
    reject: (err: any) => void;
    resolved: boolean;
}
export declare function sendMessage<T>(router: Router, msg: any): Promise<T>;
export declare function resolveMessages(router: Router): void;
45
lib/services/routerMsg.js
vendored
Normal file
@ -0,0 +1,45 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.sendMessage = sendMessage;
exports.resolveMessages = resolveMessages;
/* eslint-disable @typescript-eslint/no-explicit-any */
/**
 * Send and receive messages from worker to main thread
 */
const process_1 = __importDefault(require("process"));
const crypto_1 = require("crypto");
const core_1 = require("@tornado/core");
function sendMessage(router, msg) {
    const msgId = (0, core_1.bytesToHex)(crypto_1.webcrypto.getRandomValues(new Uint8Array(8)));
    return new Promise((resolve, reject) => {
        if (!process_1.default.send) {
            reject(new Error('Not worker'));
            return;
        }
        const msgJson = JSON.parse(JSON.stringify(msg));
        msgJson.msgId = msgId;
        process_1.default.send(msgJson);
        router.messages.push({
            msgId,
            resolve,
            reject,
            resolved: false,
        });
    });
}
function resolveMessages(router) {
    process_1.default.on('message', (msg) => {
        const message = router.messages.find((w) => w.msgId === msg.msgId);
        if (!message) {
            return;
        }
        const msgJson = JSON.parse(JSON.stringify(msg));
        delete msgJson.msgId;
        message.resolve(msgJson);
        message.resolved = true;
        router.messages = router.messages.filter((w) => !w.resolved);
    });
}
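The main-process counterpart to this IPC pairing is not part of this file; a minimal sketch of what it could look like, assuming the router runs in a forked child process so that process.send and the 'message' event are available:

import cluster from 'cluster';

// Hypothetical primary-process side: reply with the same msgId so that
// resolveMessages() in the fork can match and settle the pending promise.
if (cluster.isPrimary) {
    const worker = cluster.fork();
    worker.on('message', (msg: { msgId: string; type: string }) => {
        if (msg.type === 'status') {
            // Payload fields are illustrative; the real worker fills in live data
            worker.send({ msgId: msg.msgId, syncManagerStatus: {}, pendingWorks: 0 });
        }
    });
}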
132
lib/services/schema.d.ts
vendored
Normal file
@ -0,0 +1,132 @@
import { NetIdType, TornadoWithdrawParams, TovarishEventsQuery } from '@tornado/core';
export declare const idParamsSchema: {
    readonly params: {
        readonly type: "object";
        readonly properties: {
            readonly id: {
                readonly type: "string";
                readonly format: "uuid";
            };
        };
        readonly required: readonly ["id"];
        readonly additionalProperties: false;
    };
};
export declare const withdrawBodySchema: {
    readonly body: {
        readonly type: "object";
        readonly properties: {
            readonly proof: {
                readonly type: "string";
                readonly pattern: "^0x[a-fA-F0-9]{512}$";
            };
            readonly contract: {
                readonly type: "string";
                readonly pattern: "^0x[a-fA-F0-9]{40}$";
                readonly isAddress: true;
            };
            readonly args: {
                readonly type: "array";
                readonly maxItems: 6;
                readonly minItems: 6;
                readonly items: readonly [{
                    readonly type: "string";
                    readonly pattern: "^0x[a-fA-F0-9]{64}$";
                }, {
                    readonly type: "string";
                    readonly pattern: "^0x[a-fA-F0-9]{64}$";
                }, {
                    readonly type: "string";
                    readonly pattern: "^0x[a-fA-F0-9]{40}$";
                    readonly isAddress: true;
                }, {
                    readonly type: "string";
                    readonly pattern: "^0x[a-fA-F0-9]{40}$";
                    readonly isAddress: true;
                }, {
                    readonly BN: true;
                    readonly type: "string";
                    readonly pattern: "^0x[a-fA-F0-9]{64}$";
                }, {
                    readonly BN: true;
                    readonly type: "string";
                    readonly pattern: "^0x[a-fA-F0-9]{64}$";
                }];
            };
        };
        readonly additionalProperties: false;
        readonly required: readonly ["proof", "contract", "args"];
    };
};
export declare const eventsSchema: {
    readonly body: {
        readonly type: "object";
        readonly properties: {
            readonly type: {
                readonly type: "string";
                readonly minLength: 1;
                readonly maxLength: 30;
            };
            readonly currency: {
                readonly type: "string";
                readonly minLength: 1;
                readonly maxLength: 30;
            };
            readonly amount: {
                readonly type: "string";
                readonly minLength: 1;
                readonly maxLength: 30;
            };
            readonly fromBlock: {
                readonly type: "number";
            };
            readonly recent: {
                readonly type: "boolean";
            };
        };
        readonly additionalProperties: false;
        readonly required: readonly ["type", "fromBlock"];
    };
};
export declare const treeNameSchema: {
    readonly params: {
        readonly type: "object";
        readonly properties: {
            readonly treeName: {
                readonly type: "string";
                readonly minLength: 1;
                readonly maxLength: 60;
                readonly TreeName: true;
            };
        };
        readonly additionalProperties: false;
        readonly required: readonly ["treeName"];
    };
};
export declare function getWithdrawSchema(netId: NetIdType): typeof withdrawBodySchema & { [key in number | typeof Symbol.iterator | "length" | "toString" | "concat" | "slice" | "indexOf" | "lastIndexOf" | "includes" | "at" | "charAt" | "charCodeAt" | "localeCompare" | "match" | "replace" | "search" | "split" | "substring" | "toLowerCase" | "toLocaleLowerCase" | "toUpperCase" | "toLocaleUpperCase" | "trim" | "substr" | "codePointAt" | "endsWith" | "normalize" | "repeat" | "startsWith" | "anchor" | "big" | "blink" | "bold" | "fixed" | "fontcolor" | "fontsize" | "italics" | "link" | "small" | "strike" | "sub" | "sup" | "padStart" | "padEnd" | "trimEnd" | "trimStart" | "trimLeft" | "trimRight" | "matchAll" | "replaceAll" | "isWellFormed" | "toWellFormed" | "valueOf"]: boolean; };
export declare function getEventsSchema(netId: NetIdType): typeof eventsSchema & { [key in number | typeof Symbol.iterator | "length" | "toString" | "concat" | "slice" | "indexOf" | "lastIndexOf" | "includes" | "at" | "charAt" | "charCodeAt" | "localeCompare" | "match" | "replace" | "search" | "split" | "substring" | "toLowerCase" | "toLocaleLowerCase" | "toUpperCase" | "toLocaleUpperCase" | "trim" | "substr" | "codePointAt" | "endsWith" | "normalize" | "repeat" | "startsWith" | "anchor" | "big" | "blink" | "bold" | "fixed" | "fontcolor" | "fontsize" | "italics" | "link" | "small" | "strike" | "sub" | "sup" | "padStart" | "padEnd" | "trimEnd" | "trimStart" | "trimLeft" | "trimRight" | "matchAll" | "replaceAll" | "isWellFormed" | "toWellFormed" | "valueOf"]: boolean; };
export declare function getWithdrawKeyword(netId: NetIdType, rewardAccount: string): {
    keyword: string;
    validate: (schema: string, data: TornadoWithdrawParams) => boolean;
    errors: boolean;
};
export declare function getEventsKeyword(netId: NetIdType): {
    keyword: string;
    validate: (schema: string, data: TovarishEventsQuery) => boolean;
    errors: boolean;
};
export declare function getTreeNameKeyword(): {
    keyword: string;
    validate: (schema: string, data: string) => boolean;
    errors: boolean;
};
export declare function getAllWithdrawKeyword(rewardAccount: string): {
    keyword: string;
    validate: (schema: string, data: TornadoWithdrawParams) => boolean;
    errors: boolean;
}[];
export declare function getAllEventsKeyword(): {
    keyword: string;
    validate: (schema: string, data: TovarishEventsQuery) => boolean;
    errors: boolean;
}[];
200
lib/services/schema.js
vendored
Normal file
@ -0,0 +1,200 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.treeNameSchema = exports.eventsSchema = exports.withdrawBodySchema = exports.idParamsSchema = void 0;
exports.getWithdrawSchema = getWithdrawSchema;
exports.getEventsSchema = getEventsSchema;
exports.getWithdrawKeyword = getWithdrawKeyword;
exports.getEventsKeyword = getEventsKeyword;
exports.getTreeNameKeyword = getTreeNameKeyword;
exports.getAllWithdrawKeyword = getAllWithdrawKeyword;
exports.getAllEventsKeyword = getAllEventsKeyword;
const ethers_1 = require("ethers");
const core_1 = require("@tornado/core");
exports.idParamsSchema = {
    params: {
        type: 'object',
        properties: {
            id: { type: 'string', format: 'uuid' },
        },
        required: ['id'],
        additionalProperties: false,
    },
};
exports.withdrawBodySchema = {
    body: {
        type: 'object',
        properties: {
            proof: core_1.proofSchemaType,
            contract: core_1.addressSchemaType,
            args: {
                type: 'array',
                maxItems: 6,
                minItems: 6,
                items: [
                    core_1.bytes32SchemaType,
                    core_1.bytes32SchemaType,
                    core_1.addressSchemaType,
                    core_1.addressSchemaType,
                    core_1.bytes32BNSchemaType,
                    core_1.bytes32BNSchemaType,
                ],
            },
        },
        additionalProperties: false,
        required: ['proof', 'contract', 'args'],
    },
};
const stringParamsType = {
    type: 'string',
    minLength: 1,
    maxLength: 30,
};
exports.eventsSchema = {
    body: {
        type: 'object',
        properties: {
            type: stringParamsType,
            currency: stringParamsType,
            amount: stringParamsType,
            fromBlock: { type: 'number' },
            recent: { type: 'boolean' },
        },
        additionalProperties: false,
        required: ['type', 'fromBlock'],
    },
};
exports.treeNameSchema = {
    params: {
        type: 'object',
        properties: {
            treeName: {
                type: 'string',
                minLength: 1,
                maxLength: 60,
                TreeName: true,
            },
        },
        additionalProperties: false,
        required: ['treeName'],
    },
};
function getWithdrawSchema(netId) {
    const keyword = `withdraw${netId}`;
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    const schema = JSON.parse(JSON.stringify(exports.withdrawBodySchema));
    schema.body[keyword] = true;
    return schema;
}
function getEventsSchema(netId) {
    const keyword = `events${netId}`;
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    const schema = JSON.parse(JSON.stringify(exports.eventsSchema));
    schema.body[keyword] = true;
    return schema;
}
function getWithdrawKeyword(netId, rewardAccount) {
    const keyword = `withdraw${netId}`;
    const config = (0, core_1.getConfig)(netId);
    return {
        keyword,
        validate: (schema, data) => {
            try {
                const { contract, args } = data;
                const instance = (0, core_1.getInstanceByAddress)(config, contract);
                // Unknown instance contract is unsupported
                if (!instance) {
                    return false;
                }
                // Fee recipient should be a reward account
                if (args[3] !== rewardAccount) {
                    return false;
                }
                const { amount, currency } = instance;
                const { nativeCurrency, tokens: { [currency]: { decimals }, }, } = config;
                const denomination = (0, ethers_1.parseUnits)(amount, decimals);
                const fee = BigInt(args[4]);
                // Fees can't exceed denomination
                if (!fee || fee >= denomination) {
                    return false;
                }
                // ETHTornado instances can't have refunds
                if (currency === nativeCurrency && BigInt(args[5])) {
                    return false;
                }
                return true;
            }
            catch {
                return false;
            }
        },
        errors: true,
    };
}
function getEventsKeyword(netId) {
    const keyword = `events${netId}`;
    const config = (0, core_1.getConfig)(netId);
    const { governanceContract, registryContract } = config;
    return {
        keyword,
        validate: (schema, data) => {
            try {
                const { type, currency, amount } = data;
                if ([core_1.DEPOSIT, core_1.WITHDRAWAL].includes(type)) {
                    const instanceAddress = config.tokens[String(currency)]?.instanceAddress?.[String(amount)];
                    if (!instanceAddress) {
                        return false;
                    }
                    return true;
                }
                if (type === 'governance') {
                    if (!governanceContract) {
                        return false;
                    }
                    return true;
                }
                // todo: remove this after some time, kept for legacy client connections
                if (['registered', 'registry', 'revenue'].includes(type)) {
                    if (!registryContract) {
                        return false;
                    }
                    return true;
                }
                return ['echo', 'encrypted_notes'].includes(type);
            }
            catch {
                return false;
            }
        },
        errors: true,
    };
}
function getTreeNameKeyword() {
    return {
        keyword: 'TreeName',
        validate: (schema, data) => {
            try {
                const treeRegex = /deposits_(?<netId>\d+)_(?<currency>\w+)_(?<amount>[\d.]+)_(?<part>\w+).json.zip/g;
                const { netId, currency, amount, part } = treeRegex.exec(data)?.groups || {};
                const config = (0, core_1.getConfig)(Number(netId));
                if (!currency || !amount || !part || currency !== config.nativeCurrency) {
                    return false;
                }
                const instanceAddress = config.tokens[String(currency)]?.instanceAddress?.[String(amount)];
                if (!instanceAddress) {
                    return false;
                }
                return true;
            }
            catch {
                return false;
            }
        },
        errors: true,
    };
}
function getAllWithdrawKeyword(rewardAccount) {
    return core_1.enabledChains.map((netId) => getWithdrawKeyword(netId, rewardAccount));
}
function getAllEventsKeyword() {
    return core_1.enabledChains.map((netId) => getEventsKeyword(netId));
}
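A hedged sketch of wiring these keyword factories into a standalone Ajv v8 instance. Inside the relayer they are passed to fastify via ajv.customOptions.keywords (see the Router constructor above), so this standalone setup, the reward account value, and the errors override are illustrative only:

import Ajv from 'ajv';
import { isAddress } from 'ethers';
import { getWithdrawKeyword, getWithdrawSchema } from './schema';

// Hypothetical reward account; the relayer takes it from RelayerConfig
const rewardAccount = '0x0000000000000000000000000000000000000001';

const ajv = new Ajv({ allErrors: true });
// The withdraw body schema also uses the isAddress/BN keywords, so register those too
ajv.addKeyword({ keyword: 'isAddress', validate: (_s: unknown, data: string) => isAddress(data), errors: false });
ajv.addKeyword({ keyword: 'BN', validate: (_s: unknown, data: string) => { try { BigInt(data); return true; } catch { return false; } }, errors: false });
ajv.addKeyword({ ...getWithdrawKeyword(1, rewardAccount), errors: false });

// getWithdrawSchema(1) marks the body with a `withdraw1` keyword, so proof shape
// and the fee/refund invariants are checked in a single compiled validator.
const validate = ajv.compile(getWithdrawSchema(1).body);
console.log(validate({ proof: '0x00', contract: rewardAccount, args: [] })); // false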
92
lib/services/sync.d.ts
vendored
Normal file
@ -0,0 +1,92 @@
import type { Provider } from 'ethers';
import type { Logger } from 'winston';
import { NetIdType, TokenPriceOracle, TornadoFeeOracle, TovarishEventsStatus, TovarishSyncStatus } from '@tornado/core';
import { RelayerConfig } from '../config';
import { NodeEchoService, NodeEncryptedNotesService, NodeGovernanceService, NodeRegistryService, NodeRevenueService, NodeTornadoService } from './events';
import { ErrorTypes, ErrorMessages } from './error';
export interface AmountsServices {
    depositsService: NodeTornadoService;
    withdrawalsService: NodeTornadoService;
}
export interface CurrencyServices {
    [index: string]: AmountsServices;
}
export interface TornadoServices {
    [index: string]: CurrencyServices;
}
export interface Services {
    provider: Provider;
    tokenPriceOracle: TokenPriceOracle;
    tornadoFeeOracle: TornadoFeeOracle;
    governanceService?: NodeGovernanceService;
    registryService?: NodeRegistryService;
    revenueService?: NodeRevenueService;
    echoService: NodeEchoService;
    encryptedNotesService: NodeEncryptedNotesService;
    tornadoServices: TornadoServices;
}
export interface CachedServices {
    [index: NetIdType]: Services;
}
export interface CachedEventsStatus {
    [index: NetIdType]: TovarishEventsStatus;
}
export interface TokenPrices {
    [index: string]: bigint;
}
export interface TokenPricesString {
    [index: string]: string;
}
export interface CachedPrices {
    [index: NetIdType]: TokenPrices;
}
export interface CachedPricesString {
    [index: NetIdType]: TokenPricesString;
}
export interface GasPrices {
    gasPrice: string;
    l1Fee?: string;
}
export interface CachedGasPrices {
    [index: NetIdType]: GasPrices;
}
export interface LatestBlocks {
    [index: NetIdType]: number;
}
export interface LatestBalances {
    [index: NetIdType]: string;
}
export interface CachedSyncStatus {
    [index: NetIdType]: TovarishSyncStatus;
}
export declare function syncGasPrice(syncManager: SyncManager, netId: NetIdType): Promise<void>;
export declare function syncPrices(syncManager: SyncManager, netId: NetIdType): Promise<void>;
export declare function syncNetworkEvents(syncManager: SyncManager, netId: NetIdType): Promise<void>;
export interface SyncManagerStatus {
    cachedEvents: CachedEventsStatus;
    cachedPrices: CachedPricesString;
    cachedGasPrices: CachedGasPrices;
    syncStatus: CachedSyncStatus;
    latestBlocks: LatestBlocks;
    latestBalances: LatestBalances;
    errors: ErrorTypes[];
    onSyncEvents: boolean;
}
export declare class SyncManager {
    relayerConfig: RelayerConfig;
    logger: Logger;
    cachedServices: CachedServices;
    cachedEvents: CachedEventsStatus;
    cachedPrices: CachedPrices;
    cachedGasPrices: CachedGasPrices;
    syncStatus: CachedSyncStatus;
    latestBlocks: LatestBlocks;
    latestBalances: LatestBalances;
    errors: ErrorMessages[];
    onSyncEvents: boolean;
    constructor(relayerConfig: RelayerConfig);
    getStatus(): SyncManagerStatus;
    getPrice(netId: NetIdType, token: string): bigint;
    getGasPrice(netId: NetIdType): GasPrices;
    syncEvents(): Promise<void>;
}
365
lib/services/sync.js
vendored
Normal file
@ -0,0 +1,365 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.SyncManager = void 0;
exports.syncGasPrice = syncGasPrice;
exports.syncPrices = syncPrices;
exports.syncNetworkEvents = syncNetworkEvents;
const contracts_1 = require("@tornado/contracts");
const core_1 = require("@tornado/core");
const logger_1 = require("./logger");
const treeCache_1 = require("./treeCache");
const events_1 = require("./events");
const error_1 = require("./error");
function setupServices(syncManager) {
    const { relayerConfig, logger, syncStatus } = syncManager;
    const { cacheDir: cacheDirectory, userEventsDir: userDirectory, userTreeDir, merkleWorkerPath, enabledNetworks, } = relayerConfig;
    const cachedServices = {};
    for (const netId of enabledNetworks) {
        const config = (0, core_1.getConfig)(netId);
        const rpcUrl = relayerConfig.rpcUrls[netId];
        const provider = (0, core_1.getProviderWithNetId)(netId, rpcUrl, config);
        const { tokens, nativeCurrency, routerContract, echoContract, registryContract, aggregatorContract, reverseRecordsContract, governanceContract, multicallContract, offchainOracleContract, ovmGasPriceOracleContract, deployedBlock, constants: { GOVERNANCE_BLOCK, REGISTRY_BLOCK, NOTE_ACCOUNT_BLOCK, ENCRYPTED_NOTES_BLOCK }, } = config;
        if (!syncStatus[netId]) {
            syncStatus[netId] = {
                events: false,
                tokenPrice: false,
                gasPrice: false,
            };
        }
        const services = (cachedServices[netId] = {});
        services.provider = provider;
        services.tokenPriceOracle = new core_1.TokenPriceOracle(provider, core_1.Multicall__factory.connect(multicallContract, provider), offchainOracleContract ? core_1.OffchainOracle__factory.connect(offchainOracleContract, provider) : undefined);
        services.tornadoFeeOracle = new core_1.TornadoFeeOracle(provider, ovmGasPriceOracleContract
            ? core_1.OvmGasPriceOracle__factory.connect(ovmGasPriceOracleContract, provider)
            : undefined);
        if (governanceContract && aggregatorContract && reverseRecordsContract) {
            services.governanceService = new events_1.NodeGovernanceService({
                netId,
                provider,
                Governance: contracts_1.Governance__factory.connect(governanceContract, provider),
                Aggregator: contracts_1.Aggregator__factory.connect(aggregatorContract, provider),
                ReverseRecords: core_1.ReverseRecords__factory.connect(reverseRecordsContract, provider),
                deployedBlock: GOVERNANCE_BLOCK,
                cacheDirectory,
                userDirectory,
                logger,
            });
        }
        if (registryContract && aggregatorContract) {
            services.registryService = new events_1.NodeRegistryService({
                netId,
                provider,
                RelayerRegistry: contracts_1.RelayerRegistry__factory.connect(registryContract, provider),
                Aggregator: contracts_1.Aggregator__factory.connect(aggregatorContract, provider),
                relayerEnsSubdomains: (0, core_1.getRelayerEnsSubdomains)(),
                deployedBlock: REGISTRY_BLOCK,
                cacheDirectory,
                userDirectory,
                logger,
            });
            services.revenueService = new events_1.NodeRevenueService({
                netId,
                provider,
                RelayerRegistry: contracts_1.RelayerRegistry__factory.connect(registryContract, provider),
                deployedBlock: REGISTRY_BLOCK,
                cacheDirectory,
                userDirectory,
                logger,
            });
        }
        services.echoService = new events_1.NodeEchoService({
            netId,
            provider,
            Echoer: contracts_1.Echoer__factory.connect(echoContract, provider),
            deployedBlock: NOTE_ACCOUNT_BLOCK,
            cacheDirectory,
            userDirectory,
            logger,
        });
        services.encryptedNotesService = new events_1.NodeEncryptedNotesService({
            netId,
            provider,
            Router: contracts_1.TornadoRouter__factory.connect(routerContract, provider),
            deployedBlock: ENCRYPTED_NOTES_BLOCK,
            cacheDirectory,
            userDirectory,
            logger,
        });
        services.tornadoServices = {};
        for (const currency of (0, core_1.getActiveTokens)(config)) {
            const currencyConfig = tokens[currency];
            const currencyService = (services.tornadoServices[currency] = {});
            for (const [amount, instanceAddress] of Object.entries(currencyConfig.instanceAddress)) {
                const Tornado = contracts_1.Tornado__factory.connect(instanceAddress, provider);
                const amountService = (currencyService[amount] = {});
                const TornadoServiceConstructor = {
                    netId,
                    provider,
                    Tornado,
                    amount,
                    currency,
                    deployedBlock,
                    cacheDirectory,
                    userDirectory,
                    nativeCurrency,
                    logger,
                };
                amountService.depositsService = new events_1.NodeTornadoService({
                    ...TornadoServiceConstructor,
                    merkleTreeService: new core_1.MerkleTreeService({
                        netId,
                        amount,
                        currency,
                        Tornado,
                        merkleWorkerPath,
                    }),
                    treeCache: new treeCache_1.TreeCache({
                        netId,
                        amount,
                        currency,
                        userDirectory: userTreeDir,
                    }),
                    optionalTree: true,
                    type: 'Deposit',
                });
                amountService.withdrawalsService = new events_1.NodeTornadoService({
                    ...TornadoServiceConstructor,
                    type: 'Withdrawal',
                });
            }
        }
    }
    syncManager.cachedServices = cachedServices;
}
async function syncGasPrice(syncManager, netId) {
    const { cachedServices, logger, errors, cachedGasPrices, latestBlocks, latestBalances, syncStatus, relayerConfig: { rewardAccount }, } = syncManager;
    try {
        const services = cachedServices[netId];
        const { provider, tornadoFeeOracle } = services;
        const [blockNumber, balance, gasPrice, l1Fee] = await Promise.all([
            provider.getBlockNumber(),
            provider.getBalance(rewardAccount),
            tornadoFeeOracle.gasPrice(),
            tornadoFeeOracle.fetchL1OptimismFee(),
        ]);
        cachedGasPrices[netId] = {
            gasPrice: gasPrice.toString(),
            l1Fee: l1Fee ? l1Fee.toString() : undefined,
        };
        latestBlocks[netId] = blockNumber;
        latestBalances[netId] = balance.toString();
        syncStatus[netId].gasPrice = true;
    }
    catch (err) {
        logger.error(`${netId}: Failed to sync gas prices`);
        console.log(err);
        syncStatus[netId].gasPrice = false;
        errors.push((0, error_1.newError)('SyncManager (gas)', netId, err));
    }
}
async function syncPrices(syncManager, netId) {
    const { cachedServices, logger, errors, cachedPrices, syncStatus } = syncManager;
    try {
        const config = (0, core_1.getConfig)(netId);
        const { nativeCurrency, tornContract } = config;
        const services = cachedServices[netId];
        const { tokenPriceOracle } = services;
        // Classic UI ajv validator requires all token prices to be present
        const allTokens = Object.keys(config.tokens);
        if (tornContract && !allTokens.includes('torn')) {
            allTokens.push('torn');
        }
        const tokens = allTokens
            .map((currency) => {
                if (currency === nativeCurrency) {
                    return;
                }
                if (currency === 'torn') {
                    return {
                        currency,
                        tokenAddress: tornContract,
                        decimals: 18,
                    };
                }
                const { tokenAddress, decimals } = config.tokens[currency];
                return {
                    currency,
                    tokenAddress,
                    decimals,
                };
            })
            .filter((t) => t);
        if (!tokens.length) {
            syncStatus[netId].tokenPrice = true;
            return;
        }
        cachedPrices[netId] = (await tokenPriceOracle.fetchPrices(tokens)).reduce((acc, price, index) => {
            acc[tokens[index].currency] = price;
            return acc;
        }, {});
        syncStatus[netId].tokenPrice = true;
        logger.info(`${netId}: Synced ${tokens.length} token prices`);
    }
    catch (err) {
        logger.error(`${netId}: Failed to sync prices`);
        console.log(err);
        syncStatus[netId].tokenPrice = false;
        errors.push((0, error_1.newError)('SyncManager (price)', netId, err));
    }
}
async function syncNetworkEvents(syncManager, netId) {
    const { cachedEvents, cachedServices, logger, errors, syncStatus } = syncManager;
    try {
        const services = cachedServices[netId];
        const { provider, governanceService, registryService, revenueService, echoService, encryptedNotesService, tornadoServices, } = services;
        logger.info(`${netId}: Syncing events from block ${await provider.getBlockNumber()}`);
        const eventsStatus = {
            governance: governanceService ? {} : undefined,
            registered: registryService ? {} : undefined,
            registry: registryService ? {} : undefined,
            revenue: revenueService ? {} : undefined,
            echo: {},
            encrypted_notes: {},
            instances: {},
        };
        if (governanceService) {
            const { events, lastBlock } = await governanceService.updateEvents();
            eventsStatus.governance = {
                events: events.length,
                lastBlock,
            };
            logger.info(`${netId}: Updated governance events (total: ${events.length}, block: ${lastBlock})`);
        }
        if (registryService) {
            {
                const { events, lastBlock } = await registryService.updateEvents();
                eventsStatus.registry = {
                    events: events.length,
                    lastBlock,
                };
                logger.info(`${netId}: Updated registry events (total: ${events.length}, block: ${lastBlock})`);
            }
            {
                const { lastBlock, timestamp, relayers } = await registryService.updateRelayers();
                eventsStatus.registered = {
                    lastBlock,
                    timestamp,
                    relayers: relayers.length,
                };
                logger.info(`${netId}: Updated registry relayers (total: ${relayers.length}, block: ${lastBlock}, timestamp: ${timestamp})`);
            }
        }
        if (revenueService) {
            const { events, lastBlock } = await revenueService.updateEvents();
            eventsStatus.revenue = {
                events: events.length,
                lastBlock,
            };
            logger.info(`${netId}: Updated revenue events (total: ${events.length}, block: ${lastBlock})`);
        }
        const echoEvents = await echoService.updateEvents();
        eventsStatus.echo = {
            events: echoEvents.events.length,
            lastBlock: echoEvents.lastBlock,
        };
        logger.info(`${netId}: Updated echo events (total: ${echoEvents.events.length}, block: ${echoEvents.lastBlock})`);
        const encryptedNotesEvents = await encryptedNotesService.updateEvents();
        eventsStatus.encrypted_notes = {
            events: encryptedNotesEvents.events.length,
            lastBlock: encryptedNotesEvents.lastBlock,
        };
        logger.info(`${netId}: Updated encrypted notes events (total: ${encryptedNotesEvents.events.length}, block: ${encryptedNotesEvents.lastBlock})`);
        const currencies = Object.keys(tornadoServices);
        for (const currency of currencies) {
            const currencyStatus = (eventsStatus.instances[currency] = {});
            const amounts = Object.keys(tornadoServices[currency]);
            for (const amount of amounts) {
                const instanceStatus = (currencyStatus[amount] = {
                    deposits: {},
                    withdrawals: {},
                });
                const { depositsService, withdrawalsService } = tornadoServices[currency][amount];
                const depositEvents = await depositsService.updateEvents();
                instanceStatus.deposits = {
                    events: depositEvents.events.length,
                    lastBlock: depositEvents.lastBlock,
                };
                logger.info(`${netId}: Updated ${currency} ${amount} Tornado deposit events (total: ${depositEvents.events.length}, block: ${depositEvents.lastBlock})`);
                const withdrawalEvents = await withdrawalsService.updateEvents();
                instanceStatus.withdrawals = {
                    events: withdrawalEvents.events.length,
                    lastBlock: withdrawalEvents.lastBlock,
                };
                logger.info(`${netId}: Updated ${currency} ${amount} Tornado withdrawal events (total: ${withdrawalEvents.events.length}, block: ${withdrawalEvents.lastBlock})`);
            }
        }
        cachedEvents[netId] = eventsStatus;
        syncStatus[netId].events = true;
        logger.info(`${netId}: Synced all events`);
        await Promise.all([syncPrices(syncManager, netId), syncGasPrice(syncManager, netId)]);
    }
    catch (err) {
        logger.error(`${netId}: Failed to sync events`);
        console.log(err);
        syncStatus[netId].events = false;
        errors.push((0, error_1.newError)('SyncManager (events)', netId, err));
    }
}
class SyncManager {
    relayerConfig;
    logger;
    cachedServices;
    cachedEvents;
    cachedPrices;
    cachedGasPrices;
    syncStatus;
    latestBlocks;
    latestBalances;
    errors;
    onSyncEvents;
    constructor(relayerConfig) {
        this.relayerConfig = relayerConfig;
        this.logger = (0, logger_1.getLogger)('[SyncManager]', relayerConfig.logLevel);
        this.cachedServices = {};
        this.cachedEvents = {};
        this.cachedPrices = {};
        this.cachedGasPrices = {};
        this.syncStatus = {};
        this.latestBlocks = {};
        this.latestBalances = {};
        this.errors = [];
        this.onSyncEvents = false;
        setupServices(this);
    }
    getStatus() {
        return {
            cachedEvents: this.cachedEvents,
            cachedPrices: JSON.parse(JSON.stringify(this.cachedPrices)),
            cachedGasPrices: JSON.parse(JSON.stringify(this.cachedGasPrices)),
            syncStatus: this.syncStatus,
            latestBlocks: this.latestBlocks,
            latestBalances: this.latestBalances,
            errors: this.errors.map(({ type, netId, timestamp }) => ({
                type,
                netId,
                timestamp,
            })),
            onSyncEvents: this.onSyncEvents,
        };
    }
    getPrice(netId, token) {
        return this.cachedPrices[netId]?.[token] || BigInt(0);
    }
    getGasPrice(netId) {
        return this.cachedGasPrices[netId];
    }
    async syncEvents() {
        if (this.onSyncEvents) {
            return;
        }
        this.onSyncEvents = true;
        await Promise.all(this.relayerConfig.enabledNetworks.map((netId) => syncNetworkEvents(this, Number(netId))));
        this.onSyncEvents = false;
    }
}
exports.SyncManager = SyncManager;
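A minimal driver sketch, assuming a RelayerConfig object is already available; the actual scheduling and fork wiring live elsewhere in this repo, so the interval below is an illustrative assumption:

import { SyncManager } from './sync';
import type { RelayerConfig } from '../config';

// Hypothetical bootstrap: construct once, then resync on a fixed interval.
function startSync(relayerConfig: RelayerConfig) {
    const syncManager = new SyncManager(relayerConfig);
    // syncEvents() is guarded by the onSyncEvents flag, so overlapping
    // timer ticks simply no-op while a sync is still running.
    setInterval(() => syncManager.syncEvents(), 60_000);
    return syncManager;
}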
35
lib/services/treeCache.d.ts
vendored
Normal file
@ -0,0 +1,35 @@
/**
 * Create tree cache file from node.js
 *
 * Only works for node.js, modified from https://github.com/tornadocash/tornado-classic-ui/blob/master/scripts/updateTree.js
 */
import { MerkleTree } from '@tornado/fixed-merkle-tree';
import { DepositsEvents } from '@tornado/core';
import type { NetIdType } from '@tornado/core';
export interface TreeCacheConstructor {
    netId: NetIdType;
    amount: string;
    currency: string;
    userDirectory: string;
    PARTS_COUNT?: number;
    LEAVES?: number;
    zeroElement?: string;
}
export interface treeMetadata {
    blockNumber: number;
    logIndex: number;
    transactionHash: string;
    timestamp: number;
    from: string;
    leafIndex: number;
}
export declare class TreeCache {
    netId: NetIdType;
    amount: string;
    currency: string;
    userDirectory: string;
    PARTS_COUNT: number;
    constructor({ netId, amount, currency, userDirectory, PARTS_COUNT }: TreeCacheConstructor);
    getInstanceName(): string;
    createTree(events: DepositsEvents[], tree: MerkleTree): Promise<void>;
}
65
lib/services/treeCache.js
vendored
Normal file
@ -0,0 +1,65 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.TreeCache = void 0;
const bloomfilter_js_1 = __importDefault(require("bloomfilter.js"));
const data_1 = require("./data");
class TreeCache {
    netId;
    amount;
    currency;
    userDirectory;
    PARTS_COUNT;
    constructor({ netId, amount, currency, userDirectory, PARTS_COUNT = 4 }) {
        this.netId = netId;
        this.amount = amount;
        this.currency = currency;
        this.userDirectory = userDirectory;
        this.PARTS_COUNT = PARTS_COUNT;
    }
    getInstanceName() {
        return `deposits_${this.netId}_${this.currency}_${this.amount}`;
    }
    async createTree(events, tree) {
        const bloom = new bloomfilter_js_1.default(events.length);
        console.log(`Creating cached tree for ${this.getInstanceName()}\n`);
        // events indexed by commitment
        const eventsData = events.reduce((acc, { leafIndex, commitment, ...rest }, i) => {
            if (leafIndex !== i) {
                throw new Error(`leafIndex (${leafIndex}) !== i (${i})`);
            }
            acc[commitment] = { ...rest, leafIndex };
            return acc;
        }, {});
        const slices = tree.getTreeSlices(this.PARTS_COUNT);
        await Promise.all(slices.map(async (slice, index) => {
            const metadata = slice.elements.reduce((acc, curr) => {
                if (index < this.PARTS_COUNT - 1) {
                    bloom.add(curr);
                }
                acc.push(eventsData[curr]);
                return acc;
            }, []);
            const dataString = JSON.stringify({
                ...slice,
                metadata,
            }, null, 2) + '\n';
            const fileName = `${this.getInstanceName()}_slice${index + 1}.json`;
            await (0, data_1.saveUserFile)({
                fileName,
                userDirectory: this.userDirectory,
                dataString,
            });
        }));
        const dataString = bloom.serialize() + '\n';
        const fileName = `${this.getInstanceName()}_bloom.json`;
        await (0, data_1.saveUserFile)({
            fileName,
            userDirectory: this.userDirectory,
            dataString,
        });
    }
}
exports.TreeCache = TreeCache;
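A hedged sketch of the consumer side. Only slices 1..PARTS_COUNT-1 are added to the bloom filter above, so a bloom miss means a commitment sits in the last, still-growing slice; the fetch URL and the BloomFilter.deserialize call (assumed to be the counterpart of the serialize() used above in bloomfilter.js) are assumptions:

import BloomFilter from 'bloomfilter.js';

// Hypothetical consumer: decide which cached slice holds a commitment
// without downloading every slice file.
async function sliceHint(instanceName: string, commitment: string, partsCount = 4) {
    const raw = await (await fetch(`/${instanceName}_bloom.json`)).text();
    const bloom = BloomFilter.deserialize(raw.trim()); // assumed bloomfilter.js API
    return bloom.test(commitment) ? 'one of the first slices' : `slice ${partsCount}`;
}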
2
lib/services/utils.d.ts
vendored
Normal file
@ -0,0 +1,2 @@
export declare const chunk: <T>(arr: T[], size: number) => T[][];
export declare function sleep(ms: number): Promise<unknown>;
13
lib/services/utils.js
vendored
Normal file
@ -0,0 +1,13 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.chunk = void 0;
exports.sleep = sleep;
// eslint-disable-next-line @typescript-eslint/no-explicit-any
BigInt.prototype.toJSON = function () {
    return this.toString();
};
const chunk = (arr, size) => [...Array(Math.ceil(arr.length / size))].map((_, i) => arr.slice(size * i, size + size * i));
exports.chunk = chunk;
function sleep(ms) {
    return new Promise((resolve) => setTimeout(resolve, ms));
}
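Quick usage sketch of the helpers above (values are illustrative):

import { chunk, sleep } from './utils';

// chunk splits an array into size-bound batches: [[1, 2], [3, 4], [5]]
console.log(chunk([1, 2, 3, 4, 5], 2));

// The BigInt.prototype.toJSON patch makes JSON.stringify work on bigint fields
console.log(JSON.stringify({ fee: 123n })); // {"fee":"123"}

await sleep(1000); // top-level await assumes an ESM context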
80
lib/services/worker.d.ts
vendored
Normal file
@ -0,0 +1,80 @@
|
|||||||
|
import type { Logger } from 'winston';
import { Provider } from 'ethers';
import { TornadoRouter } from '@tornado/contracts';
import { NetIdType, TornadoWithdrawParams, RelayerTornadoJobs, RelayerTornadoWithdraw, TornadoFeeOracle, snarkArgs, Config, TornadoWallet } from '@tornado/core';
import { RelayerConfig } from '../config';
import { SyncManager } from './sync';
import { ErrorMessages } from './error';
export declare enum RelayerStatus {
    QUEUED = "QUEUED",
    ACCEPTED = "ACCEPTED",
    SENT = "SENT",
    MINED = "MINED",
    RESUBMITTED = "RESUBMITTED",
    CONFIRMED = "CONFIRMED",
    FAILED = "FAILED"
}
export declare const DEFAULT_GAS_LIMIT = 600000;
export interface RelayerServices {
    provider: Provider;
    signer: TornadoWallet;
    tornadoFeeOracle: TornadoFeeOracle;
    Router: TornadoRouter;
}
export interface CachedRelayerServices {
    [key: NetIdType]: RelayerServices;
}
export declare function getFeeParams(config: Config, serviceFee: number, syncManager: SyncManager, { netId, contract, args }: RelayerTornadoQueue): {
    amount: string;
    symbol: string;
    gasPrice: bigint;
    gasLimit: bigint;
    l1Fee: string | undefined;
    denomination: bigint;
    ethRefund: bigint;
    tokenPriceInWei: bigint;
    tokenDecimals: number;
    relayerFeePercent: number;
    isEth: boolean;
    premiumPercent: number;
};
export declare function checkWithdrawalFees(relayerWorker: RelayerWorker, work: RelayerTornadoQueue): Promise<{
    gasPrice: bigint;
    gasLimit: bigint;
    status: boolean;
    error?: string;
}>;
export declare function processWithdrawals(relayerWorker: RelayerWorker): Promise<void>;
export interface CreateWorkParams extends TornadoWithdrawParams {
    netId: NetIdType;
}
export interface RelayerTornadoQueue extends Omit<RelayerTornadoJobs, 'contract' | 'proof' | 'args'> {
    netId: NetIdType;
    contract: string;
    proof: string;
    args: snarkArgs;
    timestamp: number;
}
export interface RelayerQueueGas {
    id: string;
    gasPrice: bigint;
    gasLimit: bigint;
    timestamp: number;
}
export declare class RelayerWorker {
    relayerConfig: RelayerConfig;
    logger: Logger;
    syncManager: SyncManager;
    cachedRelayerServices: CachedRelayerServices;
    queue: RelayerTornadoQueue[];
    queueGas: RelayerQueueGas[];
    queueTimer: null | NodeJS.Timeout;
    errors: ErrorMessages[];
    constructor(relayerConfig: RelayerConfig, syncManager: SyncManager);
    doWork(): Promise<void>;
    createWork({ netId, contract, proof, args, }: CreateWorkParams): Promise<RelayerTornadoWithdraw | RelayerTornadoQueue>;
    getWork({ id }: {
        id: string;
    }): RelayerTornadoWithdraw | RelayerTornadoQueue;
    pendingWorks(): number;
}
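These declarations capture the job lifecycle: `createWork` fee-checks and enqueues a withdrawal, a timer-driven `doWork` loop drains the queue, and `getWork` exposes a job by id for polling. A minimal consumer sketch (assuming an already constructed config and sync manager, with the withdrawal fields supplied by a client request; this is not the actual HTTP wiring):

```ts
import { RelayerWorker, RelayerStatus, CreateWorkParams } from './services/worker';
import { RelayerConfig } from './config';
import { SyncManager } from './services/sync';

declare const relayerConfig: RelayerConfig;
declare const syncManager: SyncManager;
declare const request: CreateWorkParams; // netId, contract, proof, args from a client

async function relay() {
    const worker = new RelayerWorker(relayerConfig, syncManager);

    // Fee-checks the withdrawal; on success the job is queued as ACCEPTED.
    const job = await worker.createWork(request);

    if ('id' in job) {
        // Poll the job until the background loop moves it to CONFIRMED or FAILED.
        const current = worker.getWork({ id: job.id });
        console.log(current);
    } else {
        console.error(job.error);
    }
}
```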
285
lib/services/worker.js
vendored
Normal file
@ -0,0 +1,285 @@
"use strict";
|
||||||
|
Object.defineProperty(exports, "__esModule", { value: true });
|
||||||
|
exports.RelayerWorker = exports.DEFAULT_GAS_LIMIT = exports.RelayerStatus = void 0;
|
||||||
|
exports.getFeeParams = getFeeParams;
|
||||||
|
exports.checkWithdrawalFees = checkWithdrawalFees;
|
||||||
|
exports.processWithdrawals = processWithdrawals;
|
||||||
|
const crypto_1 = require("crypto");
|
||||||
|
const ethers_1 = require("ethers");
|
||||||
|
const contracts_1 = require("@tornado/contracts");
|
||||||
|
const core_1 = require("@tornado/core");
|
||||||
|
const config_1 = require("../config");
|
||||||
|
const logger_1 = require("./logger");
|
||||||
|
const error_1 = require("./error");
|
||||||
|
var RelayerStatus;
|
||||||
|
(function (RelayerStatus) {
|
||||||
|
RelayerStatus["QUEUED"] = "QUEUED";
|
||||||
|
RelayerStatus["ACCEPTED"] = "ACCEPTED";
|
||||||
|
RelayerStatus["SENT"] = "SENT";
|
||||||
|
RelayerStatus["MINED"] = "MINED";
|
||||||
|
RelayerStatus["RESUBMITTED"] = "RESUBMITTED";
|
||||||
|
RelayerStatus["CONFIRMED"] = "CONFIRMED";
|
||||||
|
RelayerStatus["FAILED"] = "FAILED";
|
||||||
|
})(RelayerStatus || (exports.RelayerStatus = RelayerStatus = {}));
|
||||||
|
exports.DEFAULT_GAS_LIMIT = 600_000;
|
||||||
|
function setupServices(relayerWorker) {
|
||||||
|
const { relayerConfig: { enabledNetworks, txRpcUrls }, } = relayerWorker;
|
||||||
|
for (const netId of enabledNetworks) {
|
||||||
|
const config = (0, core_1.getConfig)(netId);
|
||||||
|
const rpcUrl = txRpcUrls[netId];
|
||||||
|
const provider = (0, core_1.getProviderWithNetId)(netId, rpcUrl, config);
|
||||||
|
const signer = new core_1.TornadoWallet((0, config_1.getPrivateKey)(), provider);
|
||||||
|
const Router = contracts_1.TornadoRouter__factory.connect(config.routerContract, signer);
|
||||||
|
const tornadoFeeOracle = new core_1.TornadoFeeOracle(provider);
|
||||||
|
relayerWorker.cachedRelayerServices[netId] = {
|
||||||
|
provider,
|
||||||
|
signer,
|
||||||
|
Router,
|
||||||
|
tornadoFeeOracle,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
|
function getFeeParams(config, serviceFee, syncManager, { netId, contract, args }) {
|
||||||
|
const { amount, currency } = (0, core_1.getInstanceByAddress)(config, contract);
|
||||||
|
const { nativeCurrency, tokens: { [currency]: { symbol: currencySymbol, decimals, gasLimit: instanceGasLimit }, }, } = config;
|
||||||
|
const symbol = currencySymbol.toLowerCase();
|
||||||
|
const { gasPrice, l1Fee } = syncManager.getGasPrice(netId);
|
||||||
|
const gasLimit = BigInt(instanceGasLimit || exports.DEFAULT_GAS_LIMIT);
|
||||||
|
const denomination = (0, ethers_1.parseUnits)(amount, decimals);
|
||||||
|
const ethRefund = BigInt(args[5]);
|
||||||
|
const tokenPriceInWei = syncManager.getPrice(netId, symbol);
|
||||||
|
const isEth = nativeCurrency === currency;
|
||||||
|
return {
|
||||||
|
amount,
|
||||||
|
symbol,
|
||||||
|
gasPrice: BigInt(gasPrice),
|
||||||
|
gasLimit,
|
||||||
|
l1Fee,
|
||||||
|
denomination,
|
||||||
|
ethRefund,
|
||||||
|
tokenPriceInWei,
|
||||||
|
tokenDecimals: decimals,
|
||||||
|
relayerFeePercent: serviceFee,
|
||||||
|
isEth,
|
||||||
|
premiumPercent: 5,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
async function checkWithdrawalFees(relayerWorker, work) {
|
||||||
|
try {
|
||||||
|
const { id, netId, contract, proof, args } = work;
|
||||||
|
const { relayerConfig: { rewardAccount, serviceFee }, cachedRelayerServices: { [netId]: { tornadoFeeOracle, Router }, }, syncManager, } = relayerWorker;
|
||||||
|
const config = (0, core_1.getConfig)(netId);
|
||||||
|
const feeParams = getFeeParams(config, serviceFee, syncManager, work);
|
||||||
|
const { amount, symbol, tokenDecimals, denomination, ethRefund } = feeParams;
|
||||||
|
let fee = tornadoFeeOracle.calculateRelayerFee(feeParams);
|
||||||
|
const gasLimit = await Router.withdraw.estimateGas(contract, proof, ...args, {
|
||||||
|
from: rewardAccount,
|
||||||
|
value: ethRefund,
|
||||||
|
});
|
||||||
|
// Recalculate fee based on correct gas limit
|
||||||
|
fee = tornadoFeeOracle.calculateRelayerFee({
|
||||||
|
...feeParams,
|
||||||
|
gasLimit,
|
||||||
|
});
|
||||||
|
if (fee > denomination) {
|
||||||
|
return {
|
||||||
|
gasPrice: feeParams.gasPrice,
|
||||||
|
gasLimit,
|
||||||
|
status: false,
|
||||||
|
error: `Fee above deposit amount, requires ${(0, ethers_1.formatUnits)(fee, tokenDecimals)} ${symbol} while denomination is ${amount} ${symbol}`,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
if (fee > BigInt(args[4])) {
|
||||||
|
return {
|
||||||
|
gasPrice: feeParams.gasPrice,
|
||||||
|
gasLimit,
|
||||||
|
status: false,
|
||||||
|
error: `Insufficient fee, requires ${(0, ethers_1.formatUnits)(fee, tokenDecimals)} ${symbol} while user only wants to pay ${(0, ethers_1.formatUnits)(BigInt(args[4]), tokenDecimals)} ${symbol}`,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
relayerWorker.logger.info(`New job: ${id} ${netId} ${amount} ${symbol} (Fee: ${(0, ethers_1.formatUnits)(BigInt(args[4]), tokenDecimals)} ${symbol}, Refund: ${(0, ethers_1.formatUnits)(BigInt(args[5]), tokenDecimals)})`);
|
||||||
|
return {
|
||||||
|
gasPrice: feeParams.gasPrice,
|
||||||
|
gasLimit,
|
||||||
|
status: true,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
catch {
|
||||||
|
return {
|
||||||
|
gasPrice: BigInt(0),
|
||||||
|
gasLimit: BigInt(0),
|
||||||
|
status: false,
|
||||||
|
error: 'Withdrawal transaction expected to be reverted',
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
|
async function processWithdrawals(relayerWorker) {
|
||||||
|
const { logger, cachedRelayerServices, errors } = relayerWorker;
|
||||||
|
for (const work of relayerWorker.queue) {
|
||||||
|
try {
|
||||||
|
if (work.status !== RelayerStatus.ACCEPTED) {
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
const { id, netId, contract, proof, args } = work;
|
||||||
|
// cancel duplicated jobs
|
||||||
|
const otherWork = relayerWorker.queue.find((q) => q.id !== id &&
|
||||||
|
// find if other previous work is already sent (not pending or failed - to allow spending first and failed one)
|
||||||
|
q.status !== RelayerStatus.ACCEPTED &&
|
||||||
|
q.status !== RelayerStatus.FAILED &&
|
||||||
|
q.contract === contract &&
|
||||||
|
q.args[1] === args[1]);
|
||||||
|
if (otherWork) {
|
||||||
|
const errMsg = `Found the same pending job ${otherWork.id}, wait until the previous one completes`;
|
||||||
|
throw new Error(errMsg);
|
||||||
|
}
|
||||||
|
const { gasLimit, gasPrice } = relayerWorker.queueGas.find((w) => w.id === id);
|
||||||
|
const config = (0, core_1.getConfig)(netId);
|
||||||
|
const { amount, currency } = (0, core_1.getInstanceByAddress)(config, contract);
|
||||||
|
const { decimals } = config.tokens[currency];
|
||||||
|
const { Router, signer } = cachedRelayerServices[netId];
|
||||||
|
/**
|
||||||
|
* Check fees to ensure that it didn't spike or revert (or has insane gas spendings)
|
||||||
|
*/
|
||||||
|
const txObj = await signer.populateTransaction(await Router.withdraw.populateTransaction(contract, proof, ...args, {
|
||||||
|
value: BigInt(args[5]),
|
||||||
|
}));
|
||||||
|
const txGasPrice = txObj.maxFeePerGas
|
||||||
|
? txObj.maxFeePerGas + BigInt(txObj.maxPriorityFeePerGas || 0)
|
||||||
|
: txObj.gasPrice;
|
||||||
|
// Prevent tx on gas limit spike
|
||||||
|
if (txObj.gasLimit > (gasLimit * BigInt(15)) / BigInt(10)) {
|
||||||
|
const errMsg = `Job ${id} exceeds pre estimated gas limit, wants ${gasLimit * BigInt(2)} have ${txObj.gasLimit}`;
|
||||||
|
throw new Error(errMsg);
|
||||||
|
}
|
||||||
|
// Prevent tx on gas price spike
|
||||||
|
if (txGasPrice > gasPrice * BigInt(2)) {
|
||||||
|
const errMsg = `Job ${id} exceeds pre estimated gas price, wants ${gasPrice * BigInt(2)} have ${txGasPrice}`;
|
||||||
|
throw new Error(errMsg);
|
||||||
|
}
|
||||||
|
const tx = await signer.sendTransaction(txObj);
|
||||||
|
work.txHash = tx.hash;
|
||||||
|
work.confirmations = 0;
|
||||||
|
work.status = RelayerStatus.SENT;
|
||||||
|
logger.info(`Sent Job ${work.id} ${netId} ${amount} ${currency} tx (Fee: ${(0, ethers_1.formatUnits)(BigInt(args[4]), decimals)} ${currency}, Refund: ${(0, ethers_1.formatUnits)(BigInt(args[5]), decimals)} ${currency} ${tx.hash})`);
|
||||||
|
// Wait for 2 seconds so that the remote node could increment nonces
|
||||||
|
await (0, core_1.sleep)(2000);
|
||||||
|
// Head straight to confirmed status as the remote node oftenly doesn't report receipt correctly
|
||||||
|
work.confirmations = 1;
|
||||||
|
work.status = RelayerStatus.MINED;
|
||||||
|
work.confirmations = 3;
|
||||||
|
work.status = RelayerStatus.CONFIRMED;
|
||||||
|
// eslint-disable-next-line @typescript-eslint/no-explicit-any
|
||||||
|
}
|
||||||
|
catch (error) {
|
||||||
|
logger.error(`Failed to send job ${work.id}`);
|
||||||
|
console.log(error);
|
||||||
|
errors.push((0, error_1.newError)('Worker (processWithdrawals)', work.netId, error));
|
||||||
|
work.status = RelayerStatus.FAILED;
|
||||||
|
if (error.message?.includes('exceeds pre estimated')) {
|
||||||
|
work.failedReason = error.message;
|
||||||
|
}
|
||||||
|
else if (error.message?.includes('Found the same pending job')) {
|
||||||
|
work.failedReason = error.message;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
work.failedReason = 'Relayer failed to send transaction';
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
relayerWorker.queue = relayerWorker.queue.filter((w) => w.timestamp + relayerWorker.relayerConfig.clearInterval >= Math.floor(Date.now() / 1000));
|
||||||
|
relayerWorker.queueGas = relayerWorker.queueGas.filter((w) => w.timestamp + relayerWorker.relayerConfig.clearInterval >= Math.floor(Date.now() / 1000));
|
||||||
|
}
|
||||||
|
class RelayerWorker {
|
||||||
|
relayerConfig;
|
||||||
|
logger;
|
||||||
|
syncManager;
|
||||||
|
cachedRelayerServices;
|
||||||
|
queue;
|
||||||
|
queueGas;
|
||||||
|
queueTimer;
|
||||||
|
errors;
|
||||||
|
constructor(relayerConfig, syncManager) {
|
||||||
|
this.relayerConfig = relayerConfig;
|
||||||
|
this.syncManager = syncManager;
|
||||||
|
this.logger = (0, logger_1.getLogger)('[RelayerWorker]', relayerConfig.logLevel);
|
||||||
|
this.cachedRelayerServices = {};
|
||||||
|
this.queue = [];
|
||||||
|
this.queueGas = [];
|
||||||
|
this.queueTimer = null;
|
||||||
|
this.errors = [];
|
||||||
|
setupServices(this);
|
||||||
|
}
|
||||||
|
async doWork() {
|
||||||
|
await processWithdrawals(this);
|
||||||
|
const pendingWorks = this.queue.filter((q) => q.status === RelayerStatus.QUEUED || q.status === RelayerStatus.ACCEPTED).length;
|
||||||
|
if (pendingWorks) {
|
||||||
|
if (pendingWorks < 5) {
|
||||||
|
this.doWork();
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
this.queue.forEach((q) => {
|
||||||
|
q.status = RelayerStatus.FAILED;
|
||||||
|
q.error = 'Relayer has too many jobs, try it again later';
|
||||||
|
q.failedReason = 'Relayer has too many jobs, try it again later';
|
||||||
|
});
|
||||||
|
this.logger.error(`Relayer has cleared the workload ( ${pendingWorks} ) due to overhaul`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
this.queueTimer = null;
|
||||||
|
}
|
||||||
|
async createWork({ netId, contract, proof, args, }) {
|
||||||
|
const work = {
|
||||||
|
netId,
|
||||||
|
id: crypto_1.webcrypto.randomUUID(),
|
||||||
|
type: 'TORNADO_WITHDRAW',
|
||||||
|
status: RelayerStatus.QUEUED,
|
||||||
|
contract,
|
||||||
|
proof,
|
||||||
|
args,
|
||||||
|
timestamp: Math.floor(Date.now() / 1000),
|
||||||
|
};
|
||||||
|
if (this.queue.find((q) => q.status !== RelayerStatus.FAILED && q.contract === contract && q.args[1] === args[1])) {
|
||||||
|
work.status = RelayerStatus.FAILED;
|
||||||
|
return {
|
||||||
|
error: 'Found the same pending job, wait until the previous one completes',
|
||||||
|
};
|
||||||
|
}
|
||||||
|
const { gasPrice, gasLimit, status, error } = await checkWithdrawalFees(this, work);
|
||||||
|
const workGas = {
|
||||||
|
id: work.id,
|
||||||
|
gasPrice,
|
||||||
|
gasLimit,
|
||||||
|
timestamp: work.timestamp,
|
||||||
|
};
|
||||||
|
if (!status) {
|
||||||
|
work.status = RelayerStatus.FAILED;
|
||||||
|
return {
|
||||||
|
error,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
work.status = RelayerStatus.ACCEPTED;
|
||||||
|
this.queue.push(work);
|
||||||
|
this.queueGas.push(workGas);
|
||||||
|
if (!this.queueTimer) {
|
||||||
|
this.queueTimer = setTimeout(() => this.doWork(), 500);
|
||||||
|
}
|
||||||
|
return work;
|
||||||
|
}
|
||||||
|
getWork({ id }) {
|
||||||
|
const work = this.queue.find((w) => w.id === id);
|
||||||
|
if (!work) {
|
||||||
|
return {
|
||||||
|
error: `Work ${id} not found`,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
const copiedWork = JSON.parse(JSON.stringify(work));
|
||||||
|
delete copiedWork.netId;
|
||||||
|
return copiedWork;
|
||||||
|
}
|
||||||
|
pendingWorks() {
|
||||||
|
return this.queue.filter((q) => q.status === RelayerStatus.QUEUED || q.status === RelayerStatus.ACCEPTED)
|
||||||
|
.length;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
exports.RelayerWorker = RelayerWorker;
|
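The two guards above bound how far the live transaction may drift from the pre-estimate recorded at `createWork` time: at most 1.5x on gas limit and 2x on effective gas price. A numeric sketch with hypothetical values:

```ts
// Hypothetical pre-estimates recorded at createWork time.
const estGasLimit = 600_000n;
const estGasPrice = 20_000_000_000n; // 20 gwei

// Values observed when the transaction is actually populated.
const txGasLimit = 850_000n;
const txGasPrice = 45_000_000_000n; // 45 gwei

// Gas limit guard: reject if the live limit exceeds 1.5x the estimate.
console.log(txGasLimit > (estGasLimit * 15n) / 10n); // true -> job fails

// Gas price guard: reject if the live price exceeds 2x the estimate.
console.log(txGasPrice > estGasPrice * 2n); // true -> job fails
```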
1
lib/start.d.ts
vendored
Normal file
@ -0,0 +1 @@
export {};
98
lib/start.js
vendored
Normal file
@ -0,0 +1,98 @@
"use strict";
|
||||||
|
var __importDefault = (this && this.__importDefault) || function (mod) {
|
||||||
|
return (mod && mod.__esModule) ? mod : { "default": mod };
|
||||||
|
};
|
||||||
|
Object.defineProperty(exports, "__esModule", { value: true });
|
||||||
|
const process_1 = __importDefault(require("process"));
|
||||||
|
const cluster_1 = __importDefault(require("cluster"));
|
||||||
|
const config_1 = require("./config");
|
||||||
|
const services_1 = require("./services");
|
||||||
|
if (cluster_1.default.isWorker) {
|
||||||
|
new services_1.Router(JSON.parse(process_1.default.env.relayerConfig), Number(process_1.default.env.forkId));
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
start();
|
||||||
|
}
|
||||||
|
async function forkRouter({ relayerConfig, logger, syncManager, relayerWorker, forkId, }) {
|
||||||
|
const worker = cluster_1.default.fork({
|
||||||
|
relayerConfig: JSON.stringify(relayerConfig),
|
||||||
|
forkId,
|
||||||
|
});
|
||||||
|
worker
|
||||||
|
.on('exit', (code) => {
|
||||||
|
logger.error(`Router Worker ${forkId} died with code ${code}, respawning...`);
|
||||||
|
setTimeout(() => {
|
||||||
|
forkRouter({
|
||||||
|
relayerConfig,
|
||||||
|
logger,
|
||||||
|
syncManager,
|
||||||
|
relayerWorker,
|
||||||
|
forkId,
|
||||||
|
});
|
||||||
|
}, 5000);
|
||||||
|
})
|
||||||
|
.on('message', async (msg) => {
|
||||||
|
const { msgId, type } = msg;
|
||||||
|
if (type === 'status') {
|
||||||
|
worker.send({
|
||||||
|
msgId,
|
||||||
|
syncManagerStatus: syncManager.getStatus(),
|
||||||
|
pendingWorks: relayerWorker.pendingWorks(),
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
if (type === 'job') {
|
||||||
|
const work = relayerWorker.getWork({
|
||||||
|
id: msg.id,
|
||||||
|
});
|
||||||
|
worker.send({
|
||||||
|
msgId,
|
||||||
|
...work,
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
if (type === 'tornadoWithdraw') {
|
||||||
|
const newWork = await relayerWorker.createWork({
|
||||||
|
netId: msg.netId,
|
||||||
|
contract: msg.contract,
|
||||||
|
proof: msg.proof,
|
||||||
|
args: msg.args,
|
||||||
|
});
|
||||||
|
worker.send({
|
||||||
|
msgId,
|
||||||
|
...newWork,
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
if (type === 'errors') {
|
||||||
|
worker.send({
|
||||||
|
msgId,
|
||||||
|
errors: [...syncManager.errors, ...relayerWorker.errors],
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
});
|
||||||
|
}
|
||||||
|
async function start() {
|
||||||
|
const relayerConfig = (0, config_1.getRelayerConfig)();
|
||||||
|
const logger = (0, services_1.getLogger)('[Main]', relayerConfig.logLevel);
|
||||||
|
console.log('Relayer config', relayerConfig);
|
||||||
|
await (0, services_1.checkProviders)(relayerConfig, logger);
|
||||||
|
const syncManager = new services_1.SyncManager(relayerConfig);
|
||||||
|
await syncManager.syncEvents();
|
||||||
|
const relayerWorker = new services_1.RelayerWorker(relayerConfig, syncManager);
|
||||||
|
setInterval(() => syncManager.syncEvents(), relayerConfig.syncInterval * 1000);
|
||||||
|
// Spawn website
|
||||||
|
let i = 0;
|
||||||
|
while (i < relayerConfig.workers) {
|
||||||
|
forkRouter({
|
||||||
|
relayerConfig,
|
||||||
|
logger,
|
||||||
|
syncManager,
|
||||||
|
relayerWorker,
|
||||||
|
forkId: i,
|
||||||
|
});
|
||||||
|
i++;
|
||||||
|
}
|
||||||
|
logger.info(`Spawned ${i} Router Workers`);
|
||||||
|
}
|
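The primary process above answers four message types (`status`, `job`, `tornadoWithdraw`, `errors`), matching replies to requests by a `msgId`. A sketch of the worker side of that round trip (hypothetical helper; the real implementation lives in `services/routerMsg`):

```ts
import process from 'process';
import { webcrypto } from 'crypto';

// Hypothetical worker-side helper: send a typed request to the primary
// process and resolve with the reply that carries the same msgId.
function askPrimary<T>(payload: object): Promise<T> {
    const msgId = webcrypto.randomUUID();
    return new Promise((resolve) => {
        const onMessage = (msg: { msgId?: string }) => {
            if (msg?.msgId === msgId) {
                process.off('message', onMessage);
                resolve(msg as T);
            }
        };
        process.on('message', onMessage);
        process.send?.({ msgId, ...payload });
    });
}

// Usage inside a Router worker: ask the primary for sync status and queue depth.
async function demo() {
    const status = await askPrimary<{ pendingWorks: number }>({ type: 'status' });
    console.log(status.pendingWorks);
}
```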
BIN
logo.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 6.6 KiB
BIN
logo2.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 2.5 KiB
41
package.json
Normal file
@ -0,0 +1,41 @@
{
    "name": "tovarish-relayer",
    "version": "1.0.0",
    "main": "lib/index.js",
    "types": "lib/index.d.ts",
    "scripts": {
        "typechain": "typechain --target ethers-v6 --out-dir src/typechain src/abi/*.json",
        "lint": "eslint src/**/*.ts --ext .ts --ignore-pattern src/typechain",
        "build": "tsc --declaration",
        "dev": "ts-node ./src/start.ts",
        "start": "node ./lib/start.js",
        "test": "echo \"Error: no test specified\" && exit 1"
    },
    "license": "MIT",
    "dependencies": {
        "@fastify/cors": "^10.0.1",
        "@tornado/core": "git+https://git.tornado.ws/tornadocontrib/tornado-core.git#94a62e6193c99457a8dfae0d8684bee299cb1097",
        "bloomfilter.js": "^1.0.2",
        "dotenv": "^16.4.5",
        "fastify": "^5.0.0",
        "winston": "^3.14.2"
    },
    "devDependencies": {
        "@typechain/ethers-v6": "^0.5.1",
        "@types/node": "^22.7.5",
        "@typescript-eslint/eslint-plugin": "^8.9.0",
        "@typescript-eslint/parser": "^8.9.0",
        "eslint": "8.57.0",
        "eslint-config-prettier": "^9.1.0",
        "eslint-import-resolver-typescript": "^3.6.3",
        "eslint-plugin-import": "^2.31.0",
        "eslint-plugin-prettier": "^5.2.1",
        "prettier": "^3.3.3",
        "ts-node": "^10.9.2",
        "typechain": "^8.3.2",
        "typescript": "^5.6.3"
    },
    "resolutions": {
        "strip-ansi": "6.0.1"
    }
}
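With these scripts, the typical flow is `yarn` to install dependencies, `yarn build` to compile `src/` into `lib/` with declarations, and `yarn start` to run the compiled entry point; `yarn dev` runs the TypeScript sources directly through ts-node during development.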
103
src/config.ts
Normal file
@ -0,0 +1,103 @@
import path from 'path';
import process from 'process';
import os from 'os';
import 'dotenv/config';
import { computeAddress, isHexString } from 'ethers';
import { enabledChains, getConfig, NetIdType, SubdomainMap } from '@tornado/core';
import pkgJson from '../package.json';

export const version = `${pkgJson.name} ${pkgJson.version}`;

export interface RelayerConfig {
    /**
     * Router config
     */
    host: string;
    port: number;
    workers: number;
    reverseProxy: boolean;
    logLevel?: string;
    /**
     * Worker config
     */
    rewardAccount: string;
    serviceFee: number;
    // Clear work after this period
    clearInterval: number;
    /**
     * Sync config
     */
    enabledNetworks: NetIdType[];
    rpcUrls: SubdomainMap;
    txRpcUrls: SubdomainMap;
    merkleWorkerPath: string;
    cacheDir: string;
    userEventsDir: string;
    userTreeDir: string;
    syncInterval: number;
}

export function getPrivateKey(): string {
    const privateKey = process.env.PRIVATE_KEY;

    if (!privateKey || !isHexString(privateKey, 32)) {
        throw new Error('Invalid private key, make sure it contains the 0x prefix!');
    }

    return privateKey;
}

export function getRewardAccount(): string {
    return computeAddress(getPrivateKey());
}

export function getRelayerConfig(): RelayerConfig {
    const enabledNetworks = process.env.ENABLED_NETWORKS
        ? process.env.ENABLED_NETWORKS.replaceAll(' ', '')
              .split(',')
              .map((n) => Number(n))
              .filter((n) => enabledChains.includes(n))
        : enabledChains;

    const rpcUrls = enabledNetworks.reduce((acc, netId) => {
        // If we have a custom RPC url (such as 1_RPC from ENV)
        if (process.env[`${netId}_RPC`]) {
            acc[netId] = process.env[`${netId}_RPC`] || '';
        } else {
            acc[netId] = Object.values(getConfig(netId).rpcUrls)[0]?.url;
        }
        return acc;
    }, {} as SubdomainMap);

    const txRpcUrls = enabledNetworks.reduce((acc, netId) => {
        // If we have a custom transaction RPC url (such as 1_TX_RPC from ENV)
        if (process.env[`${netId}_TX_RPC`]) {
            acc[netId] = process.env[`${netId}_TX_RPC`] || '';
        } else {
            acc[netId] = rpcUrls[netId];
        }
        return acc;
    }, {} as SubdomainMap);

    const STATIC_DIR = process.env.CACHE_DIR || path.join(__dirname, '../static');
    const USER_DIR = process.env.USER_DIR || './data';

    return {
        host: process.env.HOST || '0.0.0.0',
        port: Number(process.env.PORT || 3000),
        workers: Number(process.env.WORKERS || os.cpus().length),
        reverseProxy: process.env.REVERSE_PROXY === 'true',
        logLevel: process.env.LOG_LEVEL || undefined,
        rewardAccount: getRewardAccount(),
        serviceFee: Number(process.env.SERVICE_FEE || 0.5),
        clearInterval: Number(process.env.CLEAR_INTERVAL || 86400),
        enabledNetworks,
        rpcUrls,
        txRpcUrls,
        merkleWorkerPath: path.join(STATIC_DIR, './merkleTreeWorker.js'),
        cacheDir: path.join(STATIC_DIR, './events'),
        userEventsDir: path.join(USER_DIR, './events'),
        userTreeDir: path.join(USER_DIR, './trees'),
        syncInterval: Number(process.env.SYNC_INTERVAL || 120),
    };
}
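Everything above is driven by environment variables loaded through dotenv. A sample `.env` sketch for the variables this function reads (all values are hypothetical placeholders; `CHECK_BALANCE` is consumed separately in `src/services/check.ts`):

```ini
# Relayer signing key (0x-prefixed, 32 bytes); the reward account is derived from it
PRIVATE_KEY=0x0000000000000000000000000000000000000000000000000000000000000001

# Comma-separated chain ids; defaults to every chain supported by @tornado/core
ENABLED_NETWORKS=1,56

# Per-network overrides: <netId>_RPC for reads, <netId>_TX_RPC for sending transactions
1_RPC=https://example-rpc.invalid
1_TX_RPC=https://example-rpc.invalid

HOST=0.0.0.0
PORT=3000
WORKERS=4
REVERSE_PROXY=false
LOG_LEVEL=debug
SERVICE_FEE=0.5
CLEAR_INTERVAL=86400
SYNC_INTERVAL=120
CACHE_DIR=./static
USER_DIR=./data

# Minimum native balance before a network is disabled (set 0 to keep it enabled)
CHECK_BALANCE=0.001
```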
2
src/index.ts
Normal file
@ -0,0 +1,2 @@
export * from './services';
export * from './config';
58
src/services/check.ts
Normal file
@ -0,0 +1,58 @@
import process from 'process';
import type { Logger } from 'winston';
import { formatEther, parseEther } from 'ethers';
import { getConfig, getProvider } from '@tornado/core';

import { RelayerConfig } from '../config';

// Can be set to 0 to keep a network enabled on low balance
export const CHECK_BALANCE = parseEther(process.env.CHECK_BALANCE || '0.001');

export const DISABLE_LOW_BALANCE = true;

export async function checkProviders(relayerConfig: RelayerConfig, logger: Logger) {
    const { enabledNetworks, rpcUrls, rewardAccount } = relayerConfig;

    const disabledNetworks = (
        await Promise.all(
            enabledNetworks.map(async (netId) => {
                try {
                    const config = getConfig(netId);

                    const rpcUrl = rpcUrls[netId];

                    const provider = await getProvider(rpcUrl, {
                        netId,
                    });

                    const balance = await provider.getBalance(rewardAccount);

                    const symbol = config.nativeCurrency.toUpperCase();

                    if (balance < CHECK_BALANCE) {
                        logger.error(
                            `Network ${netId} has a lower balance than ${formatEther(CHECK_BALANCE)} ${symbol} and is thus disabled (Balance: ${formatEther(balance)} ${symbol})`,
                        );

                        if (DISABLE_LOW_BALANCE) {
                            return netId;
                        }
                    } else {
                        logger.info(
                            `Network ${netId} connected with ${rpcUrl} (Balance: ${formatEther(balance)} ${symbol})`,
                        );
                    }
                } catch (err) {
                    logger.error(
                        `Failed to connect with the ${netId} provider, make sure you have configured a correct RPC url`,
                    );
                    throw err;
                }
            }),
        )
    ).filter((n) => n);

    relayerConfig.enabledNetworks = relayerConfig.enabledNetworks.filter((n) => !disabledNetworks.includes(n));

    logger.info(`Enabled Networks: ${relayerConfig.enabledNetworks.join(', ')}`);
}
195
src/services/data.ts
Normal file
@ -0,0 +1,195 @@
import path from 'path';
import { stat, mkdir, readFile, writeFile } from 'fs/promises';
import { zip, unzip, AsyncZippable, Unzipped } from 'fflate';
import { BaseEvents, CachedEvents, MinimalEvents } from '@tornado/core';

export async function existsAsync(fileOrDir: string): Promise<boolean> {
    try {
        await stat(fileOrDir);

        return true;
    } catch {
        return false;
    }
}

export function zipAsync(file: AsyncZippable): Promise<Uint8Array> {
    return new Promise((res, rej) => {
        zip(file, { mtime: new Date('1/1/1980') }, (err, data) => {
            if (err) {
                rej(err);
                return;
            }
            res(data);
        });
    });
}

export function unzipAsync(data: Uint8Array): Promise<Unzipped> {
    return new Promise((res, rej) => {
        unzip(data, {}, (err, unzipped) => {
            if (err) {
                rej(err);
                return;
            }
            res(unzipped);
        });
    });
}

export async function saveUserFile({
    fileName,
    userDirectory,
    dataString,
    lastBlock,
}: {
    fileName: string;
    userDirectory: string;
    dataString: string;
    lastBlock?: number;
}) {
    fileName = fileName.toLowerCase();

    const filePath = path.join(userDirectory, fileName);

    const payload = await zipAsync({
        [fileName]: new TextEncoder().encode(dataString),
    });

    if (!(await existsAsync(userDirectory))) {
        await mkdir(userDirectory, { recursive: true });
    }

    await writeFile(filePath + '.zip', payload);
    await writeFile(filePath, dataString);

    if (lastBlock) {
        await saveLastBlock({
            fileName: fileName.replace('.json', ''),
            userDirectory,
            lastBlock,
        });
    }
}

export async function saveLastBlock({
    fileName,
    userDirectory,
    lastBlock,
}: {
    fileName: string;
    userDirectory: string;
    lastBlock: number;
}) {
    const filePath = path.join(userDirectory, fileName);

    if (lastBlock) {
        await writeFile(filePath + '.lastblock.txt', String(lastBlock));
    }
}

export async function loadLastBlock({ name, directory }: { name: string; directory: string }) {
    const filePath = path.join(directory, `${name}.lastblock.txt`);

    if (!(await existsAsync(filePath))) {
        return;
    }

    try {
        const lastBlock = Number(await readFile(filePath, { encoding: 'utf8' }));

        if (lastBlock) {
            return lastBlock;
        }
        // eslint-disable-next-line no-empty
    } catch {}
}

export async function loadSavedEvents<T extends MinimalEvents>({
    name,
    userDirectory,
}: {
    name: string;
    userDirectory: string;
}): Promise<BaseEvents<T>> {
    const filePath = path.join(userDirectory, `${name}.json`.toLowerCase());

    if (!(await existsAsync(filePath))) {
        return {
            events: [] as T[],
            lastBlock: 0,
        };
    }

    try {
        const events = JSON.parse(await readFile(filePath, { encoding: 'utf8' })) as T[];

        const loadedBlock = await loadLastBlock({
            name,
            directory: userDirectory,
        });

        return {
            events,
            lastBlock: loadedBlock || events[events.length - 1]?.blockNumber || 0,
        };
    } catch (err) {
        console.log('Method loadSavedEvents has error');
        console.log(err);
        return {
            events: [],
            lastBlock: 0,
        };
    }
}

export async function download({ name, cacheDirectory }: { name: string; cacheDirectory: string }) {
    const fileName = `${name}.json`.toLowerCase();
    const zipName = `${fileName}.zip`;
    const zipPath = path.join(cacheDirectory, zipName);

    const data = await readFile(zipPath);
    const { [fileName]: content } = await unzipAsync(data);

    return new TextDecoder().decode(content);
}

export async function loadCachedEvents<T extends MinimalEvents>({
    name,
    cacheDirectory,
    deployedBlock,
}: {
    name: string;
    cacheDirectory: string;
    deployedBlock: number;
}): Promise<CachedEvents<T>> {
    try {
        const module = await download({ cacheDirectory, name });

        if (module) {
            const events = JSON.parse(module);

            const lastBlock = events && events.length ? events[events.length - 1].blockNumber : deployedBlock;

            return {
                events,
                lastBlock,
                fromCache: true,
            };
        }

        return {
            events: [],
            lastBlock: deployedBlock,
            fromCache: true,
        };
    } catch (err) {
        console.log('Method loadCachedEvents has error');
        console.log(err);
        return {
            events: [],
            lastBlock: deployedBlock,
            fromCache: true,
        };
    }
}
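A round-trip sketch of the persistence helpers above (the directory and file names are illustrative):

```ts
import { MinimalEvents } from '@tornado/core';
import { saveUserFile, loadSavedEvents } from './services/data';

async function demo() {
    // Persist events as plain JSON plus a .zip
    // (the fixed mtime keeps the archive byte-identical across runs).
    await saveUserFile({
        fileName: 'deposits_1_eth_1.json',
        userDirectory: './data/events',
        dataString: JSON.stringify([{ blockNumber: 100 }], null, 2) + '\n',
        lastBlock: 100,
    });

    // Read them back; lastBlock prefers the .lastblock.txt sidecar when present.
    const { events, lastBlock } = await loadSavedEvents<MinimalEvents>({
        name: 'deposits_1_eth_1',
        userDirectory: './data/events',
    });
    console.log(events.length, lastBlock);
}
```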
27
src/services/error.ts
Normal file
@ -0,0 +1,27 @@
import { NetIdType } from '@tornado/core';

export interface ErrorTypes {
    type: string;
    netId: number;
    timestamp: number;
}

export interface ErrorMessages extends ErrorTypes {
    message?: string;
    stack?: string;
}

export function newError(
    type: string,
    netId: NetIdType,
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    err: any,
): ErrorMessages {
    return {
        type,
        netId,
        timestamp: Math.floor(Date.now() / 1000),
        message: err.message,
        stack: err.stack,
    };
}
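`newError` normalizes any thrown value into the serializable shape the status endpoints report; for example:

```ts
import { newError } from './services/error';

try {
    throw new Error('rpc timeout');
} catch (err) {
    // Entries like this are collected into relayerWorker.errors / syncManager.errors.
    const entry = newError('Worker (processWithdrawals)', 1, err);
    console.log(entry.type, entry.netId, entry.message);
}
```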
651
src/services/events.ts
Normal file
@ -0,0 +1,651 @@
import path from 'path';
import { readFile } from 'fs/promises';
import {
    BaseTornadoService,
    BaseEncryptedNotesService,
    BaseGovernanceService,
    BaseRegistryService,
    BaseTornadoServiceConstructor,
    BaseEncryptedNotesServiceConstructor,
    BaseGovernanceServiceConstructor,
    BaseRegistryServiceConstructor,
    BaseEchoServiceConstructor,
    BaseEchoService,
    CachedRelayers,
    BatchEventsService,
    toFixedHex,
    BaseEvents,
    DepositsEvents,
    WithdrawalsEvents,
    EncryptedNotesEvents,
    AllGovernanceEvents,
    EchoEvents,
    BatchEventServiceConstructor,
    BatchEventOnProgress,
    NetIdType,
    AllRelayerRegistryEvents,
    BaseRevenueService,
    BaseRevenueServiceConstructor,
    StakeBurnedEvents,
} from '@tornado/core';
import type { MerkleTree } from '@tornado/fixed-merkle-tree';
import type { Logger } from 'winston';
import { saveUserFile, loadSavedEvents, loadCachedEvents, existsAsync, saveLastBlock } from './data';
import { TreeCache } from './treeCache';

export interface NodeEventsConstructor extends BatchEventServiceConstructor {
    netId: NetIdType;
    logger: Logger;
    getInstanceName: () => string;
}

export class NodeEventsService extends BatchEventsService {
    netId: NetIdType;
    logger: Logger;
    getInstanceName: () => string;

    constructor(serviceConstructor: NodeEventsConstructor) {
        super(serviceConstructor);

        this.netId = serviceConstructor.netId;
        this.logger = serviceConstructor.logger;
        this.getInstanceName = serviceConstructor.getInstanceName;
    }
}

export interface NodeTornadoServiceConstructor extends BaseTornadoServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    nativeCurrency: string;
    logger: Logger;
    treeCache?: TreeCache;
}

export class NodeTornadoService extends BaseTornadoService {
    cacheDirectory: string;
    userDirectory: string;

    nativeCurrency: string;
    logger: Logger;

    treeCache?: TreeCache;

    constructor(serviceConstructor: NodeTornadoServiceConstructor) {
        super(serviceConstructor);

        const {
            netId,
            provider,
            Tornado,
            type,
            amount,
            currency,
            cacheDirectory,
            userDirectory,
            nativeCurrency,
            logger,
            treeCache,
        } = serviceConstructor;

        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.nativeCurrency = nativeCurrency;

        this.logger = logger;

        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: Tornado,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: () => `${type.toLowerCase()}s_${netId}_${currency}_${amount}`,
        });

        this.treeCache = treeCache;
    }

    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }

    async getEventsFromDB() {
        return await loadSavedEvents<DepositsEvents | WithdrawalsEvents>({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }

    async getEventsFromCache() {
        return await loadCachedEvents<DepositsEvents | WithdrawalsEvents>({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }

    async validateEvents<S>({
        events,
        lastBlock,
        hasNewEvents,
    }: BaseEvents<DepositsEvents | WithdrawalsEvents> & {
        hasNewEvents?: boolean;
    }): Promise<S> {
        const tree = await super.validateEvents<S>({
            events,
            lastBlock,
            hasNewEvents,
        });

        if (tree && this.currency === this.nativeCurrency && this.treeCache) {
            const merkleTree = tree as unknown as MerkleTree;

            await this.treeCache.createTree(events as DepositsEvents[], merkleTree);

            console.log(
                `${this.getInstanceName()}: Updated tree cache with root ${toFixedHex(BigInt(merkleTree.root))}\n`,
            );
        }

        return tree;
    }

    async saveEvents({ events, lastBlock }: BaseEvents<DepositsEvents | WithdrawalsEvents>) {
        await saveUserFile({
            fileName: this.getInstanceName() + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }

    async updateEvents<S>() {
        const { events, lastBlock, validateResult } = await super.updateEvents<S>();

        await saveLastBlock({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });

        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}

export interface NodeEchoServiceConstructor extends BaseEchoServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}

export class NodeEchoService extends BaseEchoService {
    cacheDirectory: string;
    userDirectory: string;

    logger: Logger;

    constructor(serviceConstructor: NodeEchoServiceConstructor) {
        super(serviceConstructor);

        const { netId, provider, Echoer, cacheDirectory, userDirectory, logger } = serviceConstructor;

        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;

        this.logger = logger;

        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: Echoer,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }

    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }

    async getEventsFromDB() {
        return await loadSavedEvents<EchoEvents>({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }

    async getEventsFromCache() {
        return await loadCachedEvents<EchoEvents>({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }

    async saveEvents({ events, lastBlock }: BaseEvents<EchoEvents>) {
        const instanceName = this.getInstanceName();

        await saveUserFile({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }

    async updateEvents<S>() {
        const { events, lastBlock, validateResult } = await super.updateEvents<S>();

        await saveLastBlock({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });

        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}

export interface NodeEncryptedNotesServiceConstructor extends BaseEncryptedNotesServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}

export class NodeEncryptedNotesService extends BaseEncryptedNotesService {
    cacheDirectory: string;
    userDirectory: string;

    logger: Logger;

    constructor(serviceConstructor: NodeEncryptedNotesServiceConstructor) {
        super(serviceConstructor);

        const { netId, provider, Router, cacheDirectory, userDirectory, logger } = serviceConstructor;

        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;

        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: Router,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }

    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }

    async getEventsFromDB() {
        return await loadSavedEvents<EncryptedNotesEvents>({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }

    async getEventsFromCache() {
        return await loadCachedEvents<EncryptedNotesEvents>({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }

    async saveEvents({ events, lastBlock }: BaseEvents<EncryptedNotesEvents>) {
        const instanceName = this.getInstanceName();

        await saveUserFile({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }

    async updateEvents<S>() {
        const { events, lastBlock, validateResult } = await super.updateEvents<S>();

        await saveLastBlock({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });

        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}

export interface NodeGovernanceServiceConstructor extends BaseGovernanceServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}

export class NodeGovernanceService extends BaseGovernanceService {
    cacheDirectory: string;
    userDirectory: string;

    logger: Logger;

    constructor(serviceConstructor: NodeGovernanceServiceConstructor) {
        super(serviceConstructor);

        const { netId, provider, Governance, cacheDirectory, userDirectory, logger } = serviceConstructor;

        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;

        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: Governance,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }

    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }

    async getEventsFromDB() {
        return await loadSavedEvents<AllGovernanceEvents>({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }

    async getEventsFromCache() {
        return await loadCachedEvents<AllGovernanceEvents>({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }

    async saveEvents({ events, lastBlock }: BaseEvents<AllGovernanceEvents>) {
        const instanceName = this.getInstanceName();

        await saveUserFile({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }

    async updateEvents<S>() {
        const { events, lastBlock, validateResult } = await super.updateEvents<S>();

        await saveLastBlock({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });

        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}

export interface NodeRegistryServiceConstructor extends BaseRegistryServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}

export class NodeRegistryService extends BaseRegistryService {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;

    constructor(serviceConstructor: NodeRegistryServiceConstructor) {
        super(serviceConstructor);

        const { netId, provider, RelayerRegistry, cacheDirectory, userDirectory, logger } = serviceConstructor;

        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;

        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: RelayerRegistry,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }

    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }

    async getEventsFromDB() {
        return await loadSavedEvents<AllRelayerRegistryEvents>({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }

    async getEventsFromCache() {
        return await loadCachedEvents<AllRelayerRegistryEvents>({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }

    async saveEvents({ events, lastBlock }: BaseEvents<AllRelayerRegistryEvents>) {
        const instanceName = this.getInstanceName();

        await saveUserFile({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }

    async updateEvents<S>() {
        const { events, lastBlock, validateResult } = await super.updateEvents<S>();

        await saveLastBlock({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });

        return {
            events,
            lastBlock,
            validateResult,
        };
    }

    async getRelayersFromDB(): Promise<CachedRelayers> {
        const filePath = path.join(this.userDirectory || '', 'relayers.json');

        if (!this.userDirectory || !(await existsAsync(filePath))) {
            return {
                lastBlock: 0,
                timestamp: 0,
                relayers: [],
            };
        }

        try {
            const { lastBlock, timestamp, relayers } = JSON.parse(await readFile(filePath, { encoding: 'utf8' }));

            return {
                lastBlock,
                timestamp,
                relayers,
            };
        } catch (err) {
            console.log('Method getRelayersFromDB has error');
            console.log(err);

            return {
                lastBlock: 0,
                timestamp: 0,
                relayers: [],
            };
        }
    }

    async getRelayersFromCache(): Promise<CachedRelayers> {
        const filePath = path.join(this.cacheDirectory || '', 'relayers.json');

        if (!this.cacheDirectory || !(await existsAsync(filePath))) {
            return {
                lastBlock: 0,
                timestamp: 0,
                relayers: [],
                fromCache: true,
            };
        }

        try {
            const { lastBlock, timestamp, relayers } = JSON.parse(await readFile(filePath, { encoding: 'utf8' }));

            return {
                lastBlock,
                timestamp,
                relayers,
                fromCache: true,
            };
        } catch (err) {
            console.log('Method getRelayersFromCache has error');
            console.log(err);

            return {
                lastBlock: 0,
                timestamp: 0,
                relayers: [],
                fromCache: true,
            };
        }
    }

    async saveRelayers({ lastBlock, timestamp, relayers }: CachedRelayers) {
        await saveUserFile({
            fileName: 'relayers.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify({ lastBlock, timestamp, relayers }, null, 2) + '\n',
        });
    }
}

export interface NodeRevenueServiceConstructor extends BaseRevenueServiceConstructor {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;
}

export class NodeRevenueService extends BaseRevenueService {
    cacheDirectory: string;
    userDirectory: string;
    logger: Logger;

    constructor(serviceConstructor: NodeRevenueServiceConstructor) {
        super(serviceConstructor);

        const { netId, provider, RelayerRegistry, cacheDirectory, userDirectory, logger } = serviceConstructor;

        this.cacheDirectory = cacheDirectory;
        this.userDirectory = userDirectory;
        this.logger = logger;

        this.batchEventsService = new NodeEventsService({
            netId,
            provider,
            contract: RelayerRegistry,
            onProgress: this.updateEventProgress,
            logger,
            getInstanceName: this.getInstanceName,
        });
    }

    updateEventProgress({ fromBlock, toBlock, count }: Parameters<BatchEventOnProgress>[0]) {
        if (toBlock) {
            this.logger.debug(`${this.getInstanceName()}: Fetched ${count} events from ${fromBlock} to ${toBlock}`);
        }
    }

    async getEventsFromDB() {
        return await loadSavedEvents<StakeBurnedEvents>({
            name: this.getInstanceName(),
            userDirectory: this.userDirectory,
        });
    }

    async getEventsFromCache() {
        return await loadCachedEvents<StakeBurnedEvents>({
            name: this.getInstanceName(),
            cacheDirectory: this.cacheDirectory,
            deployedBlock: this.deployedBlock,
        });
    }

    async saveEvents({ events, lastBlock }: BaseEvents<StakeBurnedEvents>) {
        const instanceName = this.getInstanceName();

        await saveUserFile({
            fileName: instanceName + '.json',
            userDirectory: this.userDirectory,
            dataString: JSON.stringify(events, null, 2) + '\n',
            lastBlock,
        });
    }

    async updateEvents<S>() {
        const { events, lastBlock, validateResult } = await super.updateEvents<S>();

        await saveLastBlock({
            fileName: this.getInstanceName(),
            userDirectory: this.userDirectory,
            lastBlock,
        });

        return {
            events,
            lastBlock,
            validateResult,
        };
    }
}
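A wiring sketch for one of the services above (hypothetical: `rpcUrl` and the connected `tornadoContract` are assumed, and any extra fields required by `BaseTornadoServiceConstructor` from @tornado/core are omitted here):

```ts
import { DEPOSIT, getConfig, getProviderWithNetId } from '@tornado/core';
import { getLogger } from './logger';
import { NodeTornadoService } from './events';

declare const rpcUrl: string; // from relayerConfig.rpcUrls[netId]
// eslint-disable-next-line @typescript-eslint/no-explicit-any
declare const tornadoContract: any; // a connected Tornado instance contract

async function syncOneInstance() {
    const netId = 1;
    const config = getConfig(netId);

    const service = new NodeTornadoService({
        netId,
        provider: getProviderWithNetId(netId, rpcUrl, config),
        Tornado: tornadoContract,
        type: DEPOSIT,
        amount: '1',
        currency: 'eth',
        cacheDirectory: './static/events',
        userDirectory: './data/events',
        nativeCurrency: config.nativeCurrency,
        logger: getLogger('[Events]'),
    });

    // Fetch new events and persist JSON + zip + .lastblock.txt sidecars.
    await service.updateEvents();
}
```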
12
src/services/index.ts
Normal file
@ -0,0 +1,12 @@
export * from './check';
export * from './data';
export * from './error';
export * from './events';
export * from './logger';
export * from './router';
export * from './routerMsg';
export * from './schema';
export * from './sync';
export * from './treeCache';
export * from './utils';
export * from './worker';
28
src/services/logger.ts
Normal file
@ -0,0 +1,28 @@
import winston from 'winston';
import colors from '@colors/colors/safe';

export function getLogger(label?: string, minLevel?: string) {
    return winston.createLogger({
        format: winston.format.combine(
            winston.format.label({ label }),
            winston.format.timestamp({
                format: 'YYYY-MM-DD HH:mm:ss',
            }),
            // Include timestamp on level
            winston.format((info) => {
                info.level = `[${info.level}]`;
                while (info.level.length < 8) {
                    info.level += ' ';
                }
                info.level = `${info.timestamp} ${info.level}`.toUpperCase();
                return info;
            })(),
            winston.format.colorize(),
            winston.format.printf(
                (info) => `${info.level} ${info.label ? `${info.label} ` : ''}${colors.grey(info.message)}`,
            ),
        ),
        // Define level filter from config
        transports: [new winston.transports.Console({ level: minLevel || 'debug' })],
    });
}
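For reference, the logger above prefixes every line with a timestamped, padded level plus an optional component label:

```ts
import { getLogger } from './logger';

const logger = getLogger('[SyncManager]', 'info');

logger.info('Synced events');   // printed: level >= 'info'
logger.debug('Verbose detail'); // suppressed by the 'info' transport level
```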
439
src/services/router.ts
Normal file
@ -0,0 +1,439 @@
import path from 'path';
import { createReadStream } from 'fs';
import type { Logger } from 'winston';
import { fastify, FastifyInstance, FastifyReply, FastifyRequest } from 'fastify';
import { fastifyCors } from '@fastify/cors';

import {
    NetIdType,
    getConfig,
    DEPOSIT,
    WITHDRAWAL,
    DepositsEvents,
    WithdrawalsEvents,
    EchoEvents,
    EncryptedNotesEvents,
    AllGovernanceEvents,
    TornadoWithdrawParams,
    RelayerTornadoWithdraw,
    getActiveTokenInstances,
    TovarishEventsStatus,
    MAX_TOVARISH_EVENTS,
    TovarishStatus,
    TovarishEventsQuery,
    BaseTovarishEvents,
    AllRelayerRegistryEvents,
    StakeBurnedEvents,
} from '@tornado/core';

import { isAddress, BigNumberish } from 'ethers';

import { RelayerConfig, version } from '../config';
import { getLogger } from './logger';
import { resolveMessages, sendMessage, SentMsg } from './routerMsg';
import { SyncManagerStatus } from './sync';
import { existsAsync, loadSavedEvents } from './data';
import { RelayerTornadoQueue } from './worker';
import {
    getTreeNameKeyword,
    getAllEventsKeyword,
    getAllWithdrawKeyword,
    getEventsSchema,
    getWithdrawSchema,
    idParamsSchema,
    treeNameSchema,
} from './schema';
import { ErrorMessages } from './error';

export function getHealthStatus(netId: NetIdType, syncManagerStatus: SyncManagerStatus) {
    const { events, tokenPrice, gasPrice } = syncManagerStatus.syncStatus[netId];

    return String(Boolean(events && tokenPrice && gasPrice));
}

export function getGasPrices(netId: NetIdType, syncManagerStatus: SyncManagerStatus) {
    const { gasPrice, l1Fee } = syncManagerStatus.cachedGasPrices[netId];

    return {
        fast: Number(gasPrice),
        additionalProperties: l1Fee ? Number(l1Fee) : undefined,
    };
}

export function formatStatus({
    url,
    netId,
    relayerConfig,
    syncManagerStatus,
    pendingWorks,
}: {
    url: string;
    netId: NetIdType;
    relayerConfig: RelayerConfig;
    syncManagerStatus: SyncManagerStatus;
    pendingWorks: number;
}): TovarishStatus {
    const config = getConfig(netId);

    return {
        url,
        rewardAccount: relayerConfig.rewardAccount,
        instances: getActiveTokenInstances(config),
        events: syncManagerStatus.cachedEvents[netId],
        gasPrices: getGasPrices(netId, syncManagerStatus),
        netId,
        ethPrices: syncManagerStatus.cachedPrices[netId],
        tornadoServiceFee: relayerConfig.serviceFee,
        latestBlock: syncManagerStatus.latestBlocks[netId],
        latestBalance: syncManagerStatus.latestBalances[netId],
        version,
        health: {
            status: getHealthStatus(netId, syncManagerStatus),
            error: '',
            errorsLog: [...syncManagerStatus.errors.filter((e) => e.netId === netId)],
        },
        syncStatus: syncManagerStatus.syncStatus[netId],
        onSyncEvents: syncManagerStatus.onSyncEvents,
        currentQueue: pendingWorks,
    };
}

export function handleIndex(enabledNetworks: NetIdType[]) {
    return (
        'This is a <a href=https://tornado.ws>Tornado Cash</a> Relayer service. Check ' +
        enabledNetworks.map((netId) => `<a href=/${netId}/v1/status>/${netId}/v1/status</a> `).join(', ') +
        'for settings'
    );
}

export async function handleStatus(url: string, router: Router, netId: NetIdType | NetIdType[], reply: FastifyReply) {
|
||||||
|
const { relayerConfig } = router;
|
||||||
|
|
||||||
|
const { syncManagerStatus, pendingWorks } = await sendMessage<{
|
||||||
|
syncManagerStatus: SyncManagerStatus;
|
||||||
|
pendingWorks: number;
|
||||||
|
}>(router, { type: 'status' });
|
||||||
|
|
||||||
|
if (Array.isArray(netId)) {
|
||||||
|
reply.send(
|
||||||
|
netId.map((n) =>
|
||||||
|
formatStatus({
|
||||||
|
url,
|
||||||
|
netId: n,
|
||||||
|
relayerConfig,
|
||||||
|
syncManagerStatus,
|
||||||
|
pendingWorks,
|
||||||
|
}),
|
||||||
|
),
|
||||||
|
);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
reply.send(
|
||||||
|
formatStatus({
|
||||||
|
url,
|
||||||
|
netId,
|
||||||
|
relayerConfig,
|
||||||
|
syncManagerStatus,
|
||||||
|
pendingWorks,
|
||||||
|
}),
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Since we check gasLimit and fees, should extend timeout at any proxy more than 60s
|
||||||
|
*/
|
||||||
|
export async function handleTornadoWithdraw(
|
||||||
|
router: Router,
|
||||||
|
netId: NetIdType,
|
||||||
|
req: FastifyRequest,
|
||||||
|
reply: FastifyReply,
|
||||||
|
) {
|
||||||
|
const { contract, proof, args } = req.body as unknown as TornadoWithdrawParams;
|
||||||
|
|
||||||
|
const { id, error } = await sendMessage<RelayerTornadoWithdraw>(router, {
|
||||||
|
type: 'tornadoWithdraw',
|
||||||
|
netId,
|
||||||
|
contract,
|
||||||
|
proof,
|
||||||
|
args,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (error) {
|
||||||
|
reply.code(502).send({ error });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
reply.send({ id });
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function handleGetJob(router: Router, req: FastifyRequest, reply: FastifyReply) {
|
||||||
|
const { id } = req.params as unknown as { id: string };
|
||||||
|
|
||||||
|
const job = await sendMessage<{ error: string } | RelayerTornadoQueue>(router, { type: 'job', id });
|
||||||
|
|
||||||
|
if (job.error) {
|
||||||
|
reply.code(502).send(job);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
reply.send(job as RelayerTornadoQueue);
|
||||||
|
}
|
||||||
|
|
||||||
|
export type AllTovarishEvents =
|
||||||
|
| DepositsEvents
|
||||||
|
| WithdrawalsEvents
|
||||||
|
| EchoEvents
|
||||||
|
| EncryptedNotesEvents
|
||||||
|
| AllGovernanceEvents
|
||||||
|
| AllRelayerRegistryEvents
|
||||||
|
| StakeBurnedEvents;
|
||||||
|
|
||||||
|
export async function handleEvents(router: Router, netId: NetIdType, req: FastifyRequest, reply: FastifyReply) {
|
||||||
|
const {
|
||||||
|
relayerConfig: { userEventsDir: userDirectory },
|
||||||
|
} = router;
|
||||||
|
const { type, currency, amount, fromBlock, recent } = req.body as unknown as TovarishEventsQuery;
|
||||||
|
|
||||||
|
const name = [DEPOSIT, WITHDRAWAL].includes(type) ? `${type}s_${netId}_${currency}_${amount}` : `${type}_${netId}`;
|
||||||
|
|
||||||
|
// Can return 0 events but we just return error codes here
|
||||||
|
if (!(await existsAsync(path.join(userDirectory, `${name}.json`)))) {
|
||||||
|
reply.code(404).send(`Events ${name} not found!`);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
const { syncManagerStatus } = await sendMessage<{
|
||||||
|
syncManagerStatus: SyncManagerStatus;
|
||||||
|
}>(router, { type: 'status' });
|
||||||
|
|
||||||
|
const lastSyncBlock = Number(
|
||||||
|
[DEPOSIT, WITHDRAWAL].includes(type)
|
||||||
|
? syncManagerStatus.cachedEvents[netId]?.instances?.[String(currency)]?.[String(amount)]?.[
|
||||||
|
`${type}s` as 'deposits' | 'withdrawals'
|
||||||
|
]?.lastBlock
|
||||||
|
: syncManagerStatus.cachedEvents[netId]?.[String(type) as keyof TovarishEventsStatus]?.lastBlock,
|
||||||
|
);
|
||||||
|
|
||||||
|
const { events } = await loadSavedEvents<AllTovarishEvents>({
|
||||||
|
name,
|
||||||
|
userDirectory,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (recent) {
|
||||||
|
reply.send({
|
||||||
|
events: events.slice(-10).sort((a, b) => {
|
||||||
|
if (a.blockNumber === b.blockNumber) {
|
||||||
|
return b.logIndex - a.logIndex;
|
||||||
|
}
|
||||||
|
return b.blockNumber - a.blockNumber;
|
||||||
|
}),
|
||||||
|
lastSyncBlock,
|
||||||
|
} as BaseTovarishEvents<AllTovarishEvents>);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
reply.send({
|
||||||
|
events: events.filter((e) => e.blockNumber >= (fromBlock || 0)).slice(0, MAX_TOVARISH_EVENTS),
|
||||||
|
lastSyncBlock,
|
||||||
|
} as BaseTovarishEvents<AllTovarishEvents>);
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function handleTrees(router: Router, req: FastifyRequest, reply: FastifyReply) {
|
||||||
|
const treeRegex = /deposits_(?<netId>\d+)_(?<currency>\w+)_(?<amount>[\d.]+)_(?<part>\w+).json.zip/g;
|
||||||
|
const { netId, currency, amount, part } =
|
||||||
|
treeRegex.exec((req.params as unknown as { treeName: string }).treeName)?.groups || {};
|
||||||
|
|
||||||
|
const treeName = `deposits_${netId}_${currency}_${amount}_${part}.json.zip`;
|
||||||
|
const treePath = path.join(router.relayerConfig.userTreeDir, treeName);
|
||||||
|
|
||||||
|
if (!(await existsAsync(treePath))) {
|
||||||
|
reply.status(404).send(`Tree ${treeName} not found!`);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
reply.send(createReadStream(treePath));
|
||||||
|
}
|
||||||
|
|
||||||
|
export function listenRouter(router: Router) {
|
||||||
|
const { relayerConfig, logger, app, admin, forkId } = router;
|
||||||
|
|
||||||
|
// eslint-disable-next-line @typescript-eslint/no-explicit-any
|
||||||
|
app.register(fastifyCors, () => (req: FastifyRequest, callback: any) => {
|
||||||
|
callback(null, {
|
||||||
|
origin: req.headers.origin || '*',
|
||||||
|
credentials: true,
|
||||||
|
methods: ['GET, POST, OPTIONS'],
|
||||||
|
headers: [
|
||||||
|
'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type',
|
||||||
|
],
|
||||||
|
maxAge: 1728000,
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/', (_, reply) => {
|
||||||
|
reply.type('text/html').send(handleIndex(relayerConfig.enabledNetworks));
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/relayer', (_, reply) => {
|
||||||
|
reply.type('text/html').send(handleIndex(relayerConfig.enabledNetworks));
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/status', (req, reply) => {
|
||||||
|
handleStatus(`${req.protocol}://${req.hostname}`, router, relayerConfig.enabledNetworks, reply);
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get('/enabledNetworks', (_, reply) => {
|
||||||
|
reply.send(relayerConfig.enabledNetworks);
|
||||||
|
});
|
||||||
|
|
||||||
|
if (forkId === 0) {
|
||||||
|
logger.info('Router listening on /, /status, /enabledNetworks');
|
||||||
|
}
|
||||||
|
|
||||||
|
for (const netId of relayerConfig.enabledNetworks) {
|
||||||
|
app.get(`/${netId}`, (_, reply) => {
|
||||||
|
reply.type('text/html').send(handleIndex([netId]));
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get(`/${netId}/status`, (req, reply) => {
|
||||||
|
handleStatus(`${req.protocol}://${req.hostname}/${netId}`, router, netId, reply);
|
||||||
|
});
|
||||||
|
|
||||||
|
const withdrawSchema = getWithdrawSchema(netId);
|
||||||
|
|
||||||
|
app.post(`/${netId}/relay`, { schema: withdrawSchema }, (req, reply) => {
|
||||||
|
handleTornadoWithdraw(router, netId, req, reply);
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get(`/${netId}/v1/status`, (req, reply) => {
|
||||||
|
handleStatus(`${req.protocol}://${req.hostname}/${netId}`, router, netId, reply);
|
||||||
|
});
|
||||||
|
|
||||||
|
app.post(`/${netId}/v1/tornadoWithdraw`, { schema: withdrawSchema }, (req, reply) => {
|
||||||
|
handleTornadoWithdraw(router, netId, req, reply);
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get(`/${netId}/v1/jobs/:id`, { schema: idParamsSchema }, (req, reply) => {
|
||||||
|
handleGetJob(router, req, reply);
|
||||||
|
});
|
||||||
|
|
||||||
|
const eventSchema = getEventsSchema(netId);
|
||||||
|
|
||||||
|
app.post(`/${netId}/events`, { schema: eventSchema }, (req, reply) => {
|
||||||
|
handleEvents(router, netId, req, reply);
|
||||||
|
});
|
||||||
|
|
||||||
|
app.get(`/${netId}/trees/:treeName`, { schema: treeNameSchema }, (req, reply) => {
|
||||||
|
handleTrees(router, req, reply);
|
||||||
|
});
|
||||||
|
|
||||||
|
if (forkId === 0) {
|
||||||
|
logger.info(
|
||||||
|
`Router listening on /${netId}, /${netId}/status, /${netId}/relay, /${netId}/v1/status, /${netId}/v1/tornadoWithdraw, /${netId}/v1/jobs/:id, /${netId}/events, /${netId}/trees/:treeName`,
|
||||||
|
);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const { port, host } = relayerConfig;
|
||||||
|
|
||||||
|
app.listen({ port, host }, (err, address) => {
|
||||||
|
if (err) {
|
||||||
|
logger.error('Router Error');
|
||||||
|
console.log(err);
|
||||||
|
throw err;
|
||||||
|
} else {
|
||||||
|
logger.debug(`Router listening on ${address}`);
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
admin.get('/errors', (_, reply) => {
|
||||||
|
(async () => {
|
||||||
|
const { errors } = await sendMessage<{
|
||||||
|
errors: ErrorMessages[];
|
||||||
|
}>(router, { type: 'errors' });
|
||||||
|
|
||||||
|
reply.header('Content-Type', 'application/json').send(JSON.stringify(errors, null, 2));
|
||||||
|
})();
|
||||||
|
});
|
||||||
|
|
||||||
|
admin.listen({ port: port + 100, host }, (err, address) => {
|
||||||
|
if (err) {
|
||||||
|
logger.error('Admin Router Error');
|
||||||
|
console.log(err);
|
||||||
|
throw err;
|
||||||
|
} else {
|
||||||
|
if (forkId === 0) {
|
||||||
|
logger.debug(`Admin Router listening on ${address}`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
resolveMessages(router);
|
||||||
|
}
|
||||||
|
|
||||||
|
export class Router {
|
||||||
|
relayerConfig: RelayerConfig;
|
||||||
|
logger: Logger;
|
||||||
|
forkId: number;
|
||||||
|
|
||||||
|
app: FastifyInstance;
|
||||||
|
|
||||||
|
// For viewing error logs
|
||||||
|
admin: FastifyInstance;
|
||||||
|
|
||||||
|
messages: SentMsg[];
|
||||||
|
|
||||||
|
constructor(relayerConfig: RelayerConfig, forkId: number = 0) {
|
||||||
|
this.relayerConfig = relayerConfig;
|
||||||
|
this.logger = getLogger(`[Router ${forkId}]`, relayerConfig.logLevel);
|
||||||
|
this.forkId = forkId;
|
||||||
|
|
||||||
|
const app = fastify({
|
||||||
|
ajv: {
|
||||||
|
customOptions: {
|
||||||
|
keywords: [
|
||||||
|
{
|
||||||
|
keyword: 'isAddress',
|
||||||
|
// eslint-disable-next-line @typescript-eslint/no-explicit-any
|
||||||
|
validate: (schema: any, data: string) => {
|
||||||
|
try {
|
||||||
|
return isAddress(data);
|
||||||
|
} catch {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
},
|
||||||
|
errors: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
keyword: 'BN',
|
||||||
|
// eslint-disable-next-line @typescript-eslint/no-explicit-any
|
||||||
|
validate: (schema: any, data: BigNumberish) => {
|
||||||
|
try {
|
||||||
|
BigInt(data);
|
||||||
|
return true;
|
||||||
|
} catch {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
},
|
||||||
|
errors: true,
|
||||||
|
},
|
||||||
|
getTreeNameKeyword(),
|
||||||
|
...getAllWithdrawKeyword(relayerConfig.rewardAccount),
|
||||||
|
...getAllEventsKeyword(),
|
||||||
|
],
|
||||||
|
},
|
||||||
|
},
|
||||||
|
trustProxy: relayerConfig.reverseProxy ? 1 : false,
|
||||||
|
ignoreTrailingSlash: true,
|
||||||
|
});
|
||||||
|
|
||||||
|
const admin = fastify();
|
||||||
|
|
||||||
|
this.app = app;
|
||||||
|
this.admin = admin;
|
||||||
|
this.messages = [];
|
||||||
|
|
||||||
|
listenRouter(this);
|
||||||
|
}
|
||||||
|
}
|
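The routes registered above mirror the classic relayer REST API plus the new Tovarish event endpoints. A client-side sketch of exercising them (host, port, netId, and the event query values are assumptions for illustration, not values from this commit):

// Assumes a relayer reachable locally with netId 1 enabled.
const base = 'http://127.0.0.1:8080';

// Status for a single chain; the same handler also serves the legacy /1/status route.
const status = await fetch(`${base}/1/v1/status`).then((r) => r.json());
console.log(status.rewardAccount, status.currentQueue);

// Incremental event query, validated by eventsSchema plus the per-network ajv keyword.
const { events, lastSyncBlock } = await fetch(`${base}/1/events`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ type: 'deposit', currency: 'eth', amount: '0.1', fromBlock: 0 }),
}).then((r) => r.json());
console.log(lastSyncBlock, events.length);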
56
src/services/routerMsg.ts
Normal file
@ -0,0 +1,56 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
/**
 * Send and receive messages from worker to main thread
 */
import process from 'process';
import { webcrypto as crypto } from 'crypto';
import { bytesToHex } from '@tornado/core';

import { Router } from './router';

export interface SentMsg {
    msgId: string;
    resolve: (msg: any) => void;
    reject: (err: any) => void;
    resolved: boolean;
}

export function sendMessage<T>(router: Router, msg: any): Promise<T> {
    const msgId = bytesToHex(crypto.getRandomValues(new Uint8Array(8)));

    return new Promise((resolve, reject) => {
        if (!process.send) {
            reject(new Error('Not worker'));
            return;
        }

        const msgJson = JSON.parse(JSON.stringify(msg)) as any;
        msgJson.msgId = msgId;
        process.send(msgJson);

        router.messages.push({
            msgId,
            resolve,
            reject,
            resolved: false,
        });
    });
}

export function resolveMessages(router: Router) {
    process.on('message', (msg: any) => {
        const message = router.messages.find((w) => w.msgId === msg.msgId);

        if (!message) {
            return;
        }

        const msgJson = JSON.parse(JSON.stringify(msg)) as any;
        delete msgJson.msgId;
        message.resolve(msgJson);

        message.resolved = true;

        router.messages = router.messages.filter((w) => !w.resolved);
    });
}
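sendMessage and resolveMessages implement request/response correlation over Node's IPC channel: each outgoing message carries a random msgId, and the reply that echoes the same msgId resolves the matching pending promise. The main-process counterpart is not part of this file; a hypothetical sketch of the shape it assumes:

import cluster from 'cluster';

// Hypothetical primary-process handler: reply with the same msgId so that
// resolveMessages() in the fork can match the response to its pending promise.
if (cluster.isPrimary) {
    const worker = cluster.fork();

    worker.on('message', (msg: { msgId: string; type: string }) => {
        if (msg.type === 'status') {
            worker.send({ msgId: msg.msgId, syncManagerStatus: {}, pendingWorks: 0 });
        }
    });
}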
254
src/services/schema.ts
Normal file
@ -0,0 +1,254 @@
import { parseUnits } from 'ethers';

import {
    NetIdType,
    TornadoWithdrawParams,
    getConfig,
    getInstanceByAddress,
    enabledChains,
    TovarishEventsQuery,
    WITHDRAWAL,
    DEPOSIT,
    addressSchemaType,
    proofSchemaType,
    bytes32SchemaType,
    bytes32BNSchemaType,
} from '@tornado/core';

export const idParamsSchema = {
    params: {
        type: 'object',
        properties: {
            id: { type: 'string', format: 'uuid' },
        },
        required: ['id'],
        additionalProperties: false,
    },
} as const;

export const withdrawBodySchema = {
    body: {
        type: 'object',
        properties: {
            proof: proofSchemaType,
            contract: addressSchemaType,
            args: {
                type: 'array',
                maxItems: 6,
                minItems: 6,
                items: [
                    bytes32SchemaType,
                    bytes32SchemaType,
                    addressSchemaType,
                    addressSchemaType,
                    bytes32BNSchemaType,
                    bytes32BNSchemaType,
                ],
            },
        },
        additionalProperties: false,
        required: ['proof', 'contract', 'args'],
    },
} as const;

const stringParamsType = {
    type: 'string',
    minLength: 1,
    maxLength: 30,
} as const;

export const eventsSchema = {
    body: {
        type: 'object',
        properties: {
            type: stringParamsType,
            currency: stringParamsType,
            amount: stringParamsType,
            fromBlock: { type: 'number' },
            recent: { type: 'boolean' },
        },
        additionalProperties: false,
        required: ['type', 'fromBlock'],
    },
} as const;

export const treeNameSchema = {
    params: {
        type: 'object',
        properties: {
            treeName: {
                type: 'string',
                minLength: 1,
                maxLength: 60,
                TreeName: true,
            },
        },
        additionalProperties: false,
        required: ['treeName'],
    },
} as const;

export function getWithdrawSchema(netId: NetIdType) {
    const keyword = `withdraw${netId}`;

    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    const schema = JSON.parse(JSON.stringify(withdrawBodySchema)) as any;

    schema.body[keyword] = true;

    return schema as typeof withdrawBodySchema & {
        [key in typeof keyword]: boolean;
    };
}

export function getEventsSchema(netId: NetIdType) {
    const keyword = `events${netId}`;

    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    const schema = JSON.parse(JSON.stringify(eventsSchema)) as any;

    schema.body[keyword] = true;

    return schema as typeof eventsSchema & {
        [key in typeof keyword]: boolean;
    };
}

export function getWithdrawKeyword(netId: NetIdType, rewardAccount: string) {
    const keyword = `withdraw${netId}`;

    const config = getConfig(netId);

    return {
        keyword,
        validate: (schema: string, data: TornadoWithdrawParams) => {
            try {
                const { contract, args } = data;

                const instance = getInstanceByAddress(config, contract);

                // Unknown instance contract is unsupported
                if (!instance) {
                    return false;
                }

                // Fee recipient should be the reward account
                if (args[3] !== rewardAccount) {
                    return false;
                }

                const { amount, currency } = instance;

                const {
                    nativeCurrency,
                    tokens: {
                        [currency]: { decimals },
                    },
                } = config;

                const denomination = parseUnits(amount, decimals);

                const fee = BigInt(args[4]);

                // Fees can't exceed the denomination
                if (!fee || fee >= denomination) {
                    return false;
                }

                // ETHTornado instances can't have refunds
                if (currency === nativeCurrency && BigInt(args[5])) {
                    return false;
                }

                return true;
            } catch {
                return false;
            }
        },
        errors: true,
    };
}

export function getEventsKeyword(netId: NetIdType) {
    const keyword = `events${netId}`;

    const config = getConfig(netId);

    const { governanceContract, registryContract } = config;

    return {
        keyword,
        validate: (schema: string, data: TovarishEventsQuery) => {
            try {
                const { type, currency, amount } = data;

                if ([DEPOSIT, WITHDRAWAL].includes(type)) {
                    const instanceAddress = config.tokens[String(currency)]?.instanceAddress?.[String(amount)];

                    if (!instanceAddress) {
                        return false;
                    }

                    return true;
                }

                if (type === 'governance') {
                    if (!governanceContract) {
                        return false;
                    }
                    return true;
                }

                // TODO: remove this after some time; it remains for legacy client connections
                if (['registered', 'registry', 'revenue'].includes(type)) {
                    if (!registryContract) {
                        return false;
                    }
                    return true;
                }

                return ['echo', 'encrypted_notes'].includes(type);
            } catch {
                return false;
            }
        },
        errors: true,
    };
}

export function getTreeNameKeyword() {
    return {
        keyword: 'TreeName',
        validate: (schema: string, data: string) => {
            try {
                const treeRegex = /deposits_(?<netId>\d+)_(?<currency>\w+)_(?<amount>[\d.]+)_(?<part>\w+).json.zip/g;
                const { netId, currency, amount, part } = treeRegex.exec(data)?.groups || {};

                const config = getConfig(Number(netId));

                if (!currency || !amount || !part || currency !== config.nativeCurrency) {
                    return false;
                }

                const instanceAddress = config.tokens[String(currency)]?.instanceAddress?.[String(amount)];

                if (!instanceAddress) {
                    return false;
                }

                return true;
            } catch {
                return false;
            }
        },
        errors: true,
    };
}

export function getAllWithdrawKeyword(rewardAccount: string) {
    return enabledChains.map((netId) => getWithdrawKeyword(netId, rewardAccount));
}

export function getAllEventsKeyword() {
    return enabledChains.map((netId) => getEventsKeyword(netId));
}
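The two schema factories above work by cloning a static body schema and switching on a per-network custom ajv keyword (withdraw1, events1, and so on); the keyword validators then enforce semantic rules that plain JSON Schema cannot express, such as fee bounds and the reward-account check. A small sketch (netId 1 is an arbitrary example):

import { getWithdrawSchema, getEventsSchema } from './schema';

// The generated schema is the static body plus { withdraw1: true } / { events1: true },
// which makes fastify's ajv instance run the matching keyword validator on every request.
const withdrawSchema = getWithdrawSchema(1);
const eventsSchema1 = getEventsSchema(1);

console.log('withdraw1' in withdrawSchema.body); // true
console.log('events1' in eventsSchema1.body); // true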
634
src/services/sync.ts
Normal file
@ -0,0 +1,634 @@
import type { Provider } from 'ethers';
import type { Logger } from 'winston';

import {
    Governance__factory,
    RelayerRegistry__factory,
    Aggregator__factory,
    Echoer__factory,
    TornadoRouter__factory,
    Tornado__factory,
} from '@tornado/contracts';

import {
    getConfig,
    getProviderWithNetId,
    MerkleTreeService,
    NetIdType,
    getRelayerEnsSubdomains,
    TokenPriceOracle,
    Multicall__factory,
    OffchainOracle__factory,
    TornadoFeeOracle,
    OvmGasPriceOracle__factory,
    getActiveTokens,
    TovarishEventsStatus,
    InstanceEventsStatus,
    TovarishSyncStatus,
    EventsStatus,
    ReverseRecords__factory,
} from '@tornado/core';

import { RelayerConfig } from '../config';
import { getLogger } from './logger';
import { TreeCache } from './treeCache';
import {
    NodeEchoService,
    NodeEncryptedNotesService,
    NodeGovernanceService,
    NodeRegistryService,
    NodeRevenueService,
    NodeTornadoService,
} from './events';
import { ErrorTypes, ErrorMessages, newError } from './error';

export interface AmountsServices {
    depositsService: NodeTornadoService;
    withdrawalsService: NodeTornadoService;
}

export interface CurrencyServices {
    [index: string]: AmountsServices;
}

export interface TornadoServices {
    [index: string]: CurrencyServices;
}

export interface Services {
    provider: Provider;
    tokenPriceOracle: TokenPriceOracle;
    tornadoFeeOracle: TornadoFeeOracle;
    governanceService?: NodeGovernanceService;
    registryService?: NodeRegistryService;
    revenueService?: NodeRevenueService;
    echoService: NodeEchoService;
    encryptedNotesService: NodeEncryptedNotesService;
    tornadoServices: TornadoServices;
}

export interface CachedServices {
    [index: NetIdType]: Services;
}

export interface CachedEventsStatus {
    [index: NetIdType]: TovarishEventsStatus;
}

// Token prices in ETH wei
export interface TokenPrices {
    [index: string]: bigint;
}

export interface TokenPricesString {
    [index: string]: string;
}

export interface CachedPrices {
    [index: NetIdType]: TokenPrices;
}

export interface CachedPricesString {
    [index: NetIdType]: TokenPricesString;
}

export interface GasPrices {
    gasPrice: string;
    l1Fee?: string;
}

export interface CachedGasPrices {
    [index: NetIdType]: GasPrices;
}

export interface LatestBlocks {
    [index: NetIdType]: number;
}

export interface LatestBalances {
    [index: NetIdType]: string;
}

export interface CachedSyncStatus {
    [index: NetIdType]: TovarishSyncStatus;
}

function setupServices(syncManager: SyncManager) {
    const { relayerConfig, logger, syncStatus } = syncManager;
    const {
        cacheDir: cacheDirectory,
        userEventsDir: userDirectory,
        userTreeDir,
        merkleWorkerPath,
        enabledNetworks,
    } = relayerConfig;

    const cachedServices = {} as CachedServices;

    for (const netId of enabledNetworks) {
        const config = getConfig(netId);
        const rpcUrl = relayerConfig.rpcUrls[netId];
        const provider = getProviderWithNetId(netId, rpcUrl, config);

        const {
            tokens,
            nativeCurrency,
            routerContract,
            echoContract,
            registryContract,
            aggregatorContract,
            reverseRecordsContract,
            governanceContract,
            multicallContract,
            offchainOracleContract,
            ovmGasPriceOracleContract,
            deployedBlock,
            constants: { GOVERNANCE_BLOCK, REGISTRY_BLOCK, NOTE_ACCOUNT_BLOCK, ENCRYPTED_NOTES_BLOCK },
        } = config;

        if (!syncStatus[netId]) {
            syncStatus[netId] = {
                events: false,
                tokenPrice: false,
                gasPrice: false,
            };
        }

        const services = (cachedServices[netId] = {} as Services);

        services.provider = provider;

        services.tokenPriceOracle = new TokenPriceOracle(
            provider,
            Multicall__factory.connect(multicallContract, provider),
            offchainOracleContract ? OffchainOracle__factory.connect(offchainOracleContract, provider) : undefined,
        );

        services.tornadoFeeOracle = new TornadoFeeOracle(
            provider,
            ovmGasPriceOracleContract
                ? OvmGasPriceOracle__factory.connect(ovmGasPriceOracleContract, provider)
                : undefined,
        );

        if (governanceContract && aggregatorContract && reverseRecordsContract) {
            services.governanceService = new NodeGovernanceService({
                netId,
                provider,
                Governance: Governance__factory.connect(governanceContract, provider),
                Aggregator: Aggregator__factory.connect(aggregatorContract, provider),
                ReverseRecords: ReverseRecords__factory.connect(reverseRecordsContract, provider),
                deployedBlock: GOVERNANCE_BLOCK,
                cacheDirectory,
                userDirectory,
                logger,
            });
        }

        if (registryContract && aggregatorContract) {
            services.registryService = new NodeRegistryService({
                netId,
                provider,
                RelayerRegistry: RelayerRegistry__factory.connect(registryContract, provider),
                Aggregator: Aggregator__factory.connect(aggregatorContract, provider),
                relayerEnsSubdomains: getRelayerEnsSubdomains(),
                deployedBlock: REGISTRY_BLOCK,
                cacheDirectory,
                userDirectory,
                logger,
            });

            services.revenueService = new NodeRevenueService({
                netId,
                provider,
                RelayerRegistry: RelayerRegistry__factory.connect(registryContract, provider),
                deployedBlock: REGISTRY_BLOCK,
                cacheDirectory,
                userDirectory,
                logger,
            });
        }

        services.echoService = new NodeEchoService({
            netId,
            provider,
            Echoer: Echoer__factory.connect(echoContract, provider),
            deployedBlock: NOTE_ACCOUNT_BLOCK,
            cacheDirectory,
            userDirectory,
            logger,
        });

        services.encryptedNotesService = new NodeEncryptedNotesService({
            netId,
            provider,
            Router: TornadoRouter__factory.connect(routerContract, provider),
            deployedBlock: ENCRYPTED_NOTES_BLOCK,
            cacheDirectory,
            userDirectory,
            logger,
        });

        services.tornadoServices = {} as TornadoServices;

        for (const currency of getActiveTokens(config)) {
            const currencyConfig = tokens[currency];

            const currencyService = (services.tornadoServices[currency] = {} as CurrencyServices);

            for (const [amount, instanceAddress] of Object.entries(currencyConfig.instanceAddress)) {
                const Tornado = Tornado__factory.connect(instanceAddress, provider);

                const amountService = (currencyService[amount] = {} as AmountsServices);

                const TornadoServiceConstructor = {
                    netId,
                    provider,
                    Tornado,
                    amount,
                    currency,
                    deployedBlock,
                    cacheDirectory,
                    userDirectory,
                    nativeCurrency,
                    logger,
                };

                amountService.depositsService = new NodeTornadoService({
                    ...TornadoServiceConstructor,
                    merkleTreeService: new MerkleTreeService({
                        netId,
                        amount,
                        currency,
                        Tornado,
                        merkleWorkerPath,
                    }),
                    treeCache: new TreeCache({
                        netId,
                        amount,
                        currency,
                        userDirectory: userTreeDir,
                    }),
                    optionalTree: true,
                    type: 'Deposit',
                });

                amountService.withdrawalsService = new NodeTornadoService({
                    ...TornadoServiceConstructor,
                    type: 'Withdrawal',
                });
            }
        }
    }

    syncManager.cachedServices = cachedServices;
}

export async function syncGasPrice(syncManager: SyncManager, netId: NetIdType) {
    const {
        cachedServices,
        logger,
        errors,
        cachedGasPrices,
        latestBlocks,
        latestBalances,
        syncStatus,
        relayerConfig: { rewardAccount },
    } = syncManager;

    try {
        const services = cachedServices[netId];

        const { provider, tornadoFeeOracle } = services;

        const [blockNumber, balance, gasPrice, l1Fee] = await Promise.all([
            provider.getBlockNumber(),
            provider.getBalance(rewardAccount),
            tornadoFeeOracle.gasPrice(),
            tornadoFeeOracle.fetchL1OptimismFee(),
        ]);

        cachedGasPrices[netId] = {
            gasPrice: gasPrice.toString(),
            l1Fee: l1Fee ? l1Fee.toString() : undefined,
        };

        latestBlocks[netId] = blockNumber;
        latestBalances[netId] = balance.toString();

        syncStatus[netId].gasPrice = true;
    } catch (err) {
        logger.error(`${netId}: Failed to sync gas prices`);
        console.log(err);
        syncStatus[netId].gasPrice = false;
        errors.push(newError('SyncManager (gas)', netId, err));
    }
}

export async function syncPrices(syncManager: SyncManager, netId: NetIdType) {
    const { cachedServices, logger, errors, cachedPrices, syncStatus } = syncManager;

    try {
        const config = getConfig(netId);

        const { nativeCurrency, tornContract } = config;

        const services = cachedServices[netId];

        const { tokenPriceOracle } = services;

        // The classic UI ajv validator requires all token prices to be present
        const allTokens = Object.keys(config.tokens);

        if (tornContract && !allTokens.includes('torn')) {
            allTokens.push('torn');
        }

        const tokens = allTokens
            .map((currency) => {
                if (currency === nativeCurrency) {
                    return;
                }

                if (currency === 'torn') {
                    return {
                        currency,
                        tokenAddress: tornContract,
                        decimals: 18,
                    };
                }

                const { tokenAddress, decimals } = config.tokens[currency];

                return {
                    currency,
                    tokenAddress,
                    decimals,
                };
            })
            .filter((t) => t) as {
            currency: string;
            tokenAddress: string;
            decimals: number;
        }[];

        if (!tokens.length) {
            syncStatus[netId].tokenPrice = true;
            return;
        }

        cachedPrices[netId] = (await tokenPriceOracle.fetchPrices(tokens)).reduce((acc, price, index) => {
            acc[tokens[index].currency] = price;
            return acc;
        }, {} as TokenPrices);

        syncStatus[netId].tokenPrice = true;

        logger.info(`${netId}: Synced ${tokens.length} token prices`);
    } catch (err) {
        logger.error(`${netId}: Failed to sync prices`);
        console.log(err);
        syncStatus[netId].tokenPrice = false;
        errors.push(newError('SyncManager (price)', netId, err));
    }
}

export async function syncNetworkEvents(syncManager: SyncManager, netId: NetIdType) {
    const { cachedEvents, cachedServices, logger, errors, syncStatus } = syncManager;

    try {
        const services = cachedServices[netId];

        const {
            provider,
            governanceService,
            registryService,
            revenueService,
            echoService,
            encryptedNotesService,
            tornadoServices,
        } = services;

        logger.info(`${netId}: Syncing events from block ${await provider.getBlockNumber()}`);

        const eventsStatus = {
            governance: governanceService ? {} : undefined,
            registered: registryService ? {} : undefined,
            registry: registryService ? {} : undefined,
            revenue: revenueService ? {} : undefined,
            echo: {},
            encrypted_notes: {},
            instances: {},
        } as TovarishEventsStatus;

        if (governanceService) {
            const { events, lastBlock } = await governanceService.updateEvents();

            eventsStatus.governance = {
                events: events.length,
                lastBlock,
            };

            logger.info(`${netId}: Updated governance events (total: ${events.length}, block: ${lastBlock})`);
        }

        if (registryService) {
            {
                const { events, lastBlock } = await registryService.updateEvents();

                eventsStatus.registry = {
                    events: events.length,
                    lastBlock,
                };

                logger.info(`${netId}: Updated registry events (total: ${events.length}, block: ${lastBlock})`);
            }

            {
                const { lastBlock, timestamp, relayers } = await registryService.updateRelayers();

                eventsStatus.registered = {
                    lastBlock,
                    timestamp,
                    relayers: relayers.length,
                };

                logger.info(
                    `${netId}: Updated registry relayers (total: ${relayers.length}, block: ${lastBlock}, timestamp: ${timestamp})`,
                );
            }
        }

        if (revenueService) {
            const { events, lastBlock } = await revenueService.updateEvents();

            eventsStatus.revenue = {
                events: events.length,
                lastBlock,
            };

            logger.info(`${netId}: Updated revenue events (total: ${events.length}, block: ${lastBlock})`);
        }

        const echoEvents = await echoService.updateEvents();

        eventsStatus.echo = {
            events: echoEvents.events.length,
            lastBlock: echoEvents.lastBlock,
        };

        logger.info(
            `${netId}: Updated echo events (total: ${echoEvents.events.length}, block: ${echoEvents.lastBlock})`,
        );

        const encryptedNotesEvents = await encryptedNotesService.updateEvents();

        eventsStatus.encrypted_notes = {
            events: encryptedNotesEvents.events.length,
            lastBlock: encryptedNotesEvents.lastBlock,
        };

        logger.info(
            `${netId}: Updated encrypted notes events (total: ${encryptedNotesEvents.events.length}, block: ${encryptedNotesEvents.lastBlock})`,
        );

        const currencies = Object.keys(tornadoServices);

        for (const currency of currencies) {
            const currencyStatus = (eventsStatus.instances[currency] = {} as InstanceEventsStatus);

            const amounts = Object.keys(tornadoServices[currency]);

            for (const amount of amounts) {
                const instanceStatus = (currencyStatus[amount] = {
                    deposits: {} as EventsStatus,
                    withdrawals: {} as EventsStatus,
                });

                const { depositsService, withdrawalsService } = tornadoServices[currency][amount];

                const depositEvents = await depositsService.updateEvents();

                instanceStatus.deposits = {
                    events: depositEvents.events.length,
                    lastBlock: depositEvents.lastBlock,
                };

                logger.info(
                    `${netId}: Updated ${currency} ${amount} Tornado deposit events (total: ${depositEvents.events.length}, block: ${depositEvents.lastBlock})`,
                );

                const withdrawalEvents = await withdrawalsService.updateEvents();

                instanceStatus.withdrawals = {
                    events: withdrawalEvents.events.length,
                    lastBlock: withdrawalEvents.lastBlock,
                };

                logger.info(
                    `${netId}: Updated ${currency} ${amount} Tornado withdrawal events (total: ${withdrawalEvents.events.length}, block: ${withdrawalEvents.lastBlock})`,
                );
            }
        }

        cachedEvents[netId] = eventsStatus;

        syncStatus[netId].events = true;

        logger.info(`${netId}: Synced all events`);

        await Promise.all([syncPrices(syncManager, netId), syncGasPrice(syncManager, netId)]);
    } catch (err) {
        logger.error(`${netId}: Failed to sync events`);
        console.log(err);
        syncStatus[netId].events = false;
        errors.push(newError('SyncManager (events)', netId, err));
    }
}

export interface SyncManagerStatus {
    cachedEvents: CachedEventsStatus;
    cachedPrices: CachedPricesString;
    cachedGasPrices: CachedGasPrices;

    syncStatus: CachedSyncStatus;
    latestBlocks: LatestBlocks;
    latestBalances: LatestBalances;
    errors: ErrorTypes[];

    onSyncEvents: boolean;
}

export class SyncManager {
    relayerConfig: RelayerConfig;
    logger: Logger;
    cachedServices: CachedServices;

    cachedEvents: CachedEventsStatus;
    cachedPrices: CachedPrices;
    cachedGasPrices: CachedGasPrices;

    syncStatus: CachedSyncStatus;
    latestBlocks: LatestBlocks;
    latestBalances: LatestBalances;
    errors: ErrorMessages[];

    onSyncEvents: boolean;

    constructor(relayerConfig: RelayerConfig) {
        this.relayerConfig = relayerConfig;
        this.logger = getLogger('[SyncManager]', relayerConfig.logLevel);
        this.cachedServices = {} as CachedServices;

        this.cachedEvents = {} as CachedEventsStatus;
        this.cachedPrices = {} as CachedPrices;
        this.cachedGasPrices = {} as CachedGasPrices;

        this.syncStatus = {} as CachedSyncStatus;
        this.latestBlocks = {} as LatestBlocks;
        this.latestBalances = {} as LatestBalances;
        this.errors = [];

        this.onSyncEvents = false;

        setupServices(this);
    }

    getStatus(): SyncManagerStatus {
        return {
            cachedEvents: this.cachedEvents,
            cachedPrices: JSON.parse(JSON.stringify(this.cachedPrices)),
            cachedGasPrices: JSON.parse(JSON.stringify(this.cachedGasPrices)),

            syncStatus: this.syncStatus,
            latestBlocks: this.latestBlocks,
            latestBalances: this.latestBalances,
            errors: this.errors.map(({ type, netId, timestamp }) => ({
                type,
                netId,
                timestamp,
            })),

            onSyncEvents: this.onSyncEvents,
        };
    }

    getPrice(netId: NetIdType, token: string) {
        return this.cachedPrices[netId]?.[token] || BigInt(0);
    }

    getGasPrice(netId: NetIdType) {
        return this.cachedGasPrices[netId];
    }

    async syncEvents() {
        if (this.onSyncEvents) {
            return;
        }
        this.onSyncEvents = true;

        await Promise.all(this.relayerConfig.enabledNetworks.map((netId) => syncNetworkEvents(this, Number(netId))));

        this.onSyncEvents = false;
    }
}
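A driving-loop sketch for the SyncManager (the interval and the way relayerConfig is obtained are assumptions; the real scheduling lives elsewhere in this repository):

import { RelayerConfig } from '../config';
import { SyncManager } from './sync';
import { sleep } from './utils';

// Hypothetical polling loop.
async function runSync(relayerConfig: RelayerConfig) {
    const syncManager = new SyncManager(relayerConfig);

    // syncEvents() guards itself with the onSyncEvents flag, so a call that
    // overlaps a still-running sync is a no-op.
    for (;;) {
        await syncManager.syncEvents();
        await sleep(60_000);
    }
}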
113
src/services/treeCache.ts
Normal file
@ -0,0 +1,113 @@
/**
 * Create tree cache file from node.js
 *
 * Only works for node.js, modified from https://github.com/tornadocash/tornado-classic-ui/blob/master/scripts/updateTree.js
 */
import { MerkleTree } from '@tornado/fixed-merkle-tree';
import BloomFilter from 'bloomfilter.js';
import { DepositsEvents } from '@tornado/core';
import type { NetIdType } from '@tornado/core';
import { saveUserFile } from './data';

export interface TreeCacheConstructor {
    netId: NetIdType;
    amount: string;
    currency: string;
    userDirectory: string;
    PARTS_COUNT?: number;
    LEAVES?: number;
    zeroElement?: string;
}

export interface treeMetadata {
    blockNumber: number;
    logIndex: number;
    transactionHash: string;
    timestamp: number;
    from: string;
    leafIndex: number;
}

export class TreeCache {
    netId: NetIdType;
    amount: string;
    currency: string;
    userDirectory: string;

    PARTS_COUNT: number;

    constructor({ netId, amount, currency, userDirectory, PARTS_COUNT = 4 }: TreeCacheConstructor) {
        this.netId = netId;
        this.amount = amount;
        this.currency = currency;
        this.userDirectory = userDirectory;

        this.PARTS_COUNT = PARTS_COUNT;
    }

    getInstanceName(): string {
        return `deposits_${this.netId}_${this.currency}_${this.amount}`;
    }

    async createTree(events: DepositsEvents[], tree: MerkleTree) {
        const bloom = new BloomFilter(events.length);

        console.log(`Creating cached tree for ${this.getInstanceName()}\n`);

        // events indexed by commitment
        const eventsData = events.reduce(
            (acc, { leafIndex, commitment, ...rest }, i) => {
                if (leafIndex !== i) {
                    throw new Error(`leafIndex (${leafIndex}) !== i (${i})`);
                }

                acc[commitment] = { ...rest, leafIndex };

                return acc;
            },
            {} as { [key in string]: treeMetadata },
        );

        const slices = tree.getTreeSlices(this.PARTS_COUNT);

        await Promise.all(
            slices.map(async (slice, index) => {
                const metadata = slice.elements.reduce((acc, curr) => {
                    if (index < this.PARTS_COUNT - 1) {
                        bloom.add(curr);
                    }
                    acc.push(eventsData[curr]);
                    return acc;
                }, [] as treeMetadata[]);

                const dataString =
                    JSON.stringify(
                        {
                            ...slice,
                            metadata,
                        },
                        null,
                        2,
                    ) + '\n';

                const fileName = `${this.getInstanceName()}_slice${index + 1}.json`;

                await saveUserFile({
                    fileName,
                    userDirectory: this.userDirectory,
                    dataString,
                });
            }),
        );

        const dataString = bloom.serialize() + '\n';

        const fileName = `${this.getInstanceName()}_bloom.json`;

        await saveUserFile({
            fileName,
            userDirectory: this.userDirectory,
            dataString,
        });
    }
}
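The cache written above consists of PARTS_COUNT tree slices plus a bloom filter over the elements of every slice except the last, so a client can test its commitment against the small bloom file before deciding which earlier slice to download. An invocation sketch (the tree depth, element encoding, and directory are assumptions for illustration):

import { MerkleTree } from '@tornado/fixed-merkle-tree';
import { DepositsEvents } from '@tornado/core';
import { TreeCache } from './treeCache';

async function buildCache(events: DepositsEvents[]) {
    // Assumes events are sorted by leafIndex and commitments serve as tree elements.
    const tree = new MerkleTree(20, events.map(({ commitment }) => BigInt(commitment).toString()));

    const treeCache = new TreeCache({
        netId: 1,
        amount: '0.1',
        currency: 'eth',
        userDirectory: './data/trees',
    });

    // Writes deposits_1_eth_0.1_slice{1..4}.json and deposits_1_eth_0.1_bloom.json.
    await treeCache.createTree(events, tree);
}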
11
src/services/utils.ts
Normal file
@ -0,0 +1,11 @@
// eslint-disable-next-line @typescript-eslint/no-explicit-any
(BigInt.prototype as any).toJSON = function () {
    return this.toString();
};

export const chunk = <T>(arr: T[], size: number): T[][] =>
    [...Array(Math.ceil(arr.length / size))].map((_, i) => arr.slice(size * i, size + size * i));

export function sleep(ms: number) {
    return new Promise((resolve) => setTimeout(resolve, ms));
}
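Quick examples of what these helpers do (illustrative only):

import { chunk, sleep } from './utils';

// chunk splits an array into fixed-size batches; the last batch may be shorter.
chunk([1, 2, 3, 4, 5], 2); // [[1, 2], [3, 4], [5]]

// sleep is a promisified setTimeout, useful in polling loops.
await sleep(1000);

// The BigInt patch above lets JSON.stringify handle bigint values:
JSON.stringify({ balance: 10n ** 18n }); // '{"balance":"1000000000000000000"}'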
458
src/services/worker.ts
Normal file
@ -0,0 +1,458 @@
|
|||||||
|
import { webcrypto as crypto } from 'crypto';
|
||||||
|
import type { Logger } from 'winston';
|
||||||
|
import { formatUnits, parseUnits, Provider } from 'ethers';
|
||||||
|
|
||||||
|
import { TornadoRouter, TornadoRouter__factory } from '@tornado/contracts';
|
||||||
|
|
||||||
|
import {
|
||||||
|
getConfig,
|
||||||
|
getProviderWithNetId,
|
||||||
|
NetIdType,
|
||||||
|
TornadoWithdrawParams,
|
||||||
|
RelayerTornadoJobs,
|
||||||
|
RelayerTornadoWithdraw,
|
||||||
|
TornadoFeeOracle,
|
||||||
|
snarkArgs,
|
||||||
|
Config,
|
||||||
|
getInstanceByAddress,
|
||||||
|
TornadoWallet,
|
||||||
|
sleep,
|
||||||
|
} from '@tornado/core';
|
||||||
|
|
||||||
|
import { getPrivateKey, RelayerConfig } from '../config';
|
||||||
|
import { getLogger } from './logger';
|
||||||
|
import { SyncManager } from './sync';
|
||||||
|
import { ErrorMessages, newError } from './error';
|
||||||
|
|
||||||
|
export enum RelayerStatus {
|
||||||
|
QUEUED = 'QUEUED',
|
||||||
|
ACCEPTED = 'ACCEPTED',
|
||||||
|
SENT = 'SENT',
|
||||||
|
MINED = 'MINED',
|
||||||
|
RESUBMITTED = 'RESUBMITTED',
|
||||||
|
CONFIRMED = 'CONFIRMED',
|
||||||
|
FAILED = 'FAILED',
|
||||||
|
}
|
||||||
|
|
||||||
|
export const DEFAULT_GAS_LIMIT = 600_000;
|
||||||
|
|
||||||
|
export interface RelayerServices {
|
||||||
|
provider: Provider;
|
||||||
|
signer: TornadoWallet;
|
||||||
|
tornadoFeeOracle: TornadoFeeOracle;
|
||||||
|
Router: TornadoRouter;
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface CachedRelayerServices {
|
||||||
|
[key: NetIdType]: RelayerServices;
|
||||||
|
}
|
||||||
|
|
||||||
|
function setupServices(relayerWorker: RelayerWorker) {
|
||||||
|
const {
|
||||||
|
relayerConfig: { enabledNetworks, txRpcUrls },
|
||||||
|
} = relayerWorker;
|
||||||
|
|
||||||
|
for (const netId of enabledNetworks) {
|
||||||
|
const config = getConfig(netId);
|
||||||
|
const rpcUrl = txRpcUrls[netId];
|
||||||
|
const provider = getProviderWithNetId(netId, rpcUrl, config);
|
||||||
|
const signer = new TornadoWallet(getPrivateKey(), provider);
|
||||||
|
|
||||||
|
const Router = TornadoRouter__factory.connect(config.routerContract, signer);
|
||||||
|
|
||||||
|
const tornadoFeeOracle = new TornadoFeeOracle(provider);
|
||||||
|
|
||||||
|
relayerWorker.cachedRelayerServices[netId] = {
|
||||||
|
provider,
|
||||||
|
signer,
|
||||||
|
Router,
|
||||||
|
tornadoFeeOracle,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export function getFeeParams(
|
||||||
|
config: Config,
|
||||||
|
serviceFee: number,
|
||||||
|
syncManager: SyncManager,
|
||||||
|
{ netId, contract, args }: RelayerTornadoQueue,
|
||||||
|
) {
|
||||||
|
const { amount, currency } = getInstanceByAddress(config, contract) as {
|
||||||
|
amount: string;
|
||||||
|
currency: string;
|
||||||
|
};
|
||||||
|
|
||||||
|
const {
|
||||||
|
nativeCurrency,
|
||||||
|
tokens: {
|
||||||
|
[currency]: { symbol: currencySymbol, decimals, gasLimit: instanceGasLimit },
|
||||||
|
},
|
||||||
|
} = config;
|
||||||
|
|
||||||
|
const symbol = currencySymbol.toLowerCase();
|
||||||
|
|
||||||
|
const { gasPrice, l1Fee } = syncManager.getGasPrice(netId);
|
||||||
|
|
||||||
|
const gasLimit = BigInt(instanceGasLimit || DEFAULT_GAS_LIMIT);
|
||||||
|
|
||||||
|
const denomination = parseUnits(amount, decimals);
|
||||||
|
|
||||||
|
const ethRefund = BigInt(args[5]);
|
||||||
|
|
||||||
|
const tokenPriceInWei = syncManager.getPrice(netId, symbol);
|
||||||
|
|
||||||
|
const isEth = nativeCurrency === currency;
|
||||||
|
|
||||||
|
return {
|
||||||
|
amount,
|
||||||
|
symbol,
|
||||||
|
gasPrice: BigInt(gasPrice),
|
||||||
|
gasLimit,
|
||||||
|
l1Fee,
|
||||||
|
denomination,
|
||||||
|
ethRefund,
|
||||||
|
tokenPriceInWei,
|
||||||
|
tokenDecimals: decimals,
|
||||||
|
relayerFeePercent: serviceFee,
|
||||||
|
isEth,
|
||||||
|
premiumPercent: 5,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function checkWithdrawalFees(
|
||||||
|
relayerWorker: RelayerWorker,
|
||||||
|
work: RelayerTornadoQueue,
|
||||||
|
): Promise<{
|
||||||
|
gasPrice: bigint;
|
||||||
|
gasLimit: bigint;
|
||||||
|
status: boolean;
|
||||||
|
error?: string;
|
||||||
|
}> {
|
||||||
|
try {
|
||||||
|
const { id, netId, contract, proof, args } = work;
|
||||||
|
|
||||||
|
const {
|
||||||
|
relayerConfig: { rewardAccount, serviceFee },
|
||||||
|
cachedRelayerServices: {
|
||||||
|
[netId]: { tornadoFeeOracle, Router },
|
||||||
|
},
|
||||||
|
syncManager,
|
||||||
|
} = relayerWorker;
|
||||||
|
|
||||||
|
const config = getConfig(netId);
|
||||||
|
|
||||||
|
const feeParams = getFeeParams(config, serviceFee, syncManager, work);
|
||||||
|
|
||||||
|
const { amount, symbol, tokenDecimals, denomination, ethRefund } = feeParams;
|
||||||
|
|
||||||
|
let fee = tornadoFeeOracle.calculateRelayerFee(feeParams);
|
||||||
|
|
||||||
|
const gasLimit = await Router.withdraw.estimateGas(contract, proof, ...args, {
|
||||||
|
from: rewardAccount,
|
||||||
|
value: ethRefund,
|
||||||
|
});
|
||||||
|
|
||||||
|
// Recalculate fee based on correct gas limit
|
||||||
|
fee = tornadoFeeOracle.calculateRelayerFee({
|
||||||
|
...feeParams,
|
||||||
|
gasLimit,
|
||||||
|
});
|
||||||
|
|
||||||
|
        if (fee > denomination) {
            return {
                gasPrice: feeParams.gasPrice,
                gasLimit,
                status: false,
                error: `Fee above deposit amount, requires ${formatUnits(fee, tokenDecimals)} ${symbol} while denomination is ${amount} ${symbol}`,
            };
        }

        if (fee > BigInt(args[4])) {
            return {
                gasPrice: feeParams.gasPrice,
                gasLimit,
                status: false,
                error: `Insufficient fee, requires ${formatUnits(fee, tokenDecimals)} ${symbol} while user only wants to pay ${formatUnits(BigInt(args[4]), tokenDecimals)} ${symbol}`,
            };
        }

        relayerWorker.logger.info(
            `New job: ${id} ${netId} ${amount} ${symbol} (Fee: ${formatUnits(BigInt(args[4]), tokenDecimals)} ${symbol}, Refund: ${formatUnits(BigInt(args[5]), tokenDecimals)})`,
        );

        return {
            gasPrice: feeParams.gasPrice,
            gasLimit,
            status: true,
        };
    } catch {
        return {
            gasPrice: BigInt(0),
            gasLimit: BigInt(0),
            status: false,
            error: 'Withdrawal transaction is expected to revert',
        };
    }
}

export async function processWithdrawals(relayerWorker: RelayerWorker) {
    const { logger, cachedRelayerServices, errors } = relayerWorker;

    for (const work of relayerWorker.queue) {
        try {
            if (work.status !== RelayerStatus.ACCEPTED) {
                continue;
            }

            const { id, netId, contract, proof, args } = work;

            // Cancel duplicated jobs: check whether another job for the same instance and
            // nullifier hash has already been sent (jobs that are still pending or have failed
            // don't count, so the first-sent job and retries of failed ones are allowed)
            const otherWork = relayerWorker.queue.find(
                (q) =>
                    q.id !== id &&
                    q.status !== RelayerStatus.ACCEPTED &&
                    q.status !== RelayerStatus.FAILED &&
                    q.contract === contract &&
                    q.args[1] === args[1],
            );

            if (otherWork) {
                const errMsg = `Found the same pending job ${otherWork.id}, wait until the previous one completes`;
                throw new Error(errMsg);
            }

            const { gasLimit, gasPrice } = relayerWorker.queueGas.find((w) => w.id === id) as {
                gasLimit: bigint;
                gasPrice: bigint;
            };

            const config = getConfig(netId);

            const { amount, currency } = getInstanceByAddress(config, contract) as {
                amount: string;
                currency: string;
            };
            const { decimals } = config.tokens[currency];
            const { Router, signer } = cachedRelayerServices[netId];

            /**
             * Re-check fees to ensure they didn't spike since the job was accepted
             * and that the transaction won't revert (or incur excessive gas spending)
             */
            const txObj = await signer.populateTransaction(
                await Router.withdraw.populateTransaction(contract, proof, ...args, {
                    value: BigInt(args[5]),
                }),
            );

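            // Effective gas price: EIP-1559 (max fee + priority fee) or legacy gasPrice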
            const txGasPrice = txObj.maxFeePerGas
                ? (txObj.maxFeePerGas as bigint) + BigInt(txObj.maxPriorityFeePerGas || 0)
                : (txObj.gasPrice as bigint);

            // Prevent tx on gas limit spike (more than 1.5x the pre-estimated limit)
            if ((txObj.gasLimit as bigint) > (gasLimit * BigInt(15)) / BigInt(10)) {
                const errMsg = `Job ${id} exceeds pre estimated gas limit, wants ${(gasLimit * BigInt(15)) / BigInt(10)} have ${txObj.gasLimit}`;
                throw new Error(errMsg);
            }

            // Prevent tx on gas price spike (more than 2x the pre-estimated price)
            if (txGasPrice > gasPrice * BigInt(2)) {
                const errMsg = `Job ${id} exceeds pre estimated gas price, wants ${gasPrice * BigInt(2)} have ${txGasPrice}`;
                throw new Error(errMsg);
            }

            const tx = await signer.sendTransaction(txObj);

            work.txHash = tx.hash;
            work.confirmations = 0;
            work.status = RelayerStatus.SENT;

            logger.info(
                `Sent Job ${work.id} ${netId} ${amount} ${currency} tx (Fee: ${formatUnits(BigInt(args[4]), decimals)} ${currency}, Refund: ${formatUnits(BigInt(args[5]), decimals)} ${currency} ${tx.hash})`,
            );

            // Wait for 2 seconds so that the remote node can increment nonces
            await sleep(2000);

            // Head straight to confirmed status, as the remote node often doesn't report the receipt correctly
            work.confirmations = 1;
            work.status = RelayerStatus.MINED;

            work.confirmations = 3;
            work.status = RelayerStatus.CONFIRMED;

            // eslint-disable-next-line @typescript-eslint/no-explicit-any
        } catch (error: any) {
            logger.error(`Failed to send job ${work.id}`);
            console.log(error);
            errors.push(newError('Worker (processWithdrawals)', work.netId, error));

            work.status = RelayerStatus.FAILED;

            if (error.message?.includes('exceeds pre estimated')) {
                work.failedReason = error.message;
            } else if (error.message?.includes('Found the same pending job')) {
                work.failedReason = error.message;
            } else {
                work.failedReason = 'Relayer failed to send transaction';
            }
        }
    }

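    // Drop jobs older than clearInterval seconds from both queues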
    relayerWorker.queue = relayerWorker.queue.filter(
        (w) => w.timestamp + relayerWorker.relayerConfig.clearInterval >= Math.floor(Date.now() / 1000),
    );

    relayerWorker.queueGas = relayerWorker.queueGas.filter(
        (w) => w.timestamp + relayerWorker.relayerConfig.clearInterval >= Math.floor(Date.now() / 1000),
    );
}

export interface CreateWorkParams extends TornadoWithdrawParams {
    netId: NetIdType;
}

export interface RelayerTornadoQueue extends Omit<RelayerTornadoJobs, 'contract' | 'proof' | 'args'> {
    netId: NetIdType;
    contract: string;
    proof: string;
    args: snarkArgs;
    timestamp: number;
}

export interface RelayerQueueGas {
    id: string;
    gasPrice: bigint;
    gasLimit: bigint;
    timestamp: number;
}

export class RelayerWorker {
    relayerConfig: RelayerConfig;
    logger: Logger;
    syncManager: SyncManager;

    cachedRelayerServices: CachedRelayerServices;

    queue: RelayerTornadoQueue[];
    queueGas: RelayerQueueGas[];

    queueTimer: null | NodeJS.Timeout;

    errors: ErrorMessages[];

    constructor(relayerConfig: RelayerConfig, syncManager: SyncManager) {
        this.relayerConfig = relayerConfig;
        this.syncManager = syncManager;
        this.logger = getLogger('[RelayerWorker]', relayerConfig.logLevel);
        this.cachedRelayerServices = {} as CachedRelayerServices;
        this.queue = [];
        this.queueGas = [];
        this.queueTimer = null;
        this.errors = [];

        setupServices(this);
    }

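    // Drain the queue; keep looping while a small backlog remains, or fail everything once it grows too large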
    async doWork() {
        await processWithdrawals(this);

        const pendingWorks = this.queue.filter(
            (q) => q.status === RelayerStatus.QUEUED || q.status === RelayerStatus.ACCEPTED,
        ).length;

        if (pendingWorks) {
            if (pendingWorks < 5) {
                this.doWork();
                return;
            } else {
                this.queue.forEach((q) => {
                    q.status = RelayerStatus.FAILED;
                    q.error = 'Relayer has too many jobs, try it again later';
                    q.failedReason = 'Relayer has too many jobs, try it again later';
                });

                this.logger.error(`Relayer has cleared the workload ( ${pendingWorks} ) due to overload`);
            }
        }

        this.queueTimer = null;
    }

    async createWork({
        netId,
        contract,
        proof,
        args,
    }: CreateWorkParams): Promise<RelayerTornadoWithdraw | RelayerTornadoQueue> {
        const work: RelayerTornadoQueue = {
            netId,
            id: crypto.randomUUID(),
            type: 'TORNADO_WITHDRAW',
            status: RelayerStatus.QUEUED,
            contract,
            proof,
            args,
            timestamp: Math.floor(Date.now() / 1000),
        };

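        // Reject the job if a non-failed job for the same instance and nullifier hash (args[1]) already exists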
        if (
            this.queue.find(
                (q) => q.status !== RelayerStatus.FAILED && q.contract === contract && q.args[1] === args[1],
            )
        ) {
            work.status = RelayerStatus.FAILED;

            return {
                error: 'Found the same pending job, wait until the previous one completes',
            };
        }

        const { gasPrice, gasLimit, status, error } = await checkWithdrawalFees(this, work);

        const workGas = {
            id: work.id,
            gasPrice,
            gasLimit,
            timestamp: work.timestamp,
        };

        if (!status) {
            work.status = RelayerStatus.FAILED;

            return {
                error,
            };
        }

        work.status = RelayerStatus.ACCEPTED;

        this.queue.push(work);
        this.queueGas.push(workGas);

        if (!this.queueTimer) {
            this.queueTimer = setTimeout(() => this.doWork(), 500);
        }

        return work;
    }

    getWork({ id }: { id: string }): RelayerTornadoWithdraw | RelayerTornadoQueue {
        const work = this.queue.find((w) => w.id === id);

        if (!work) {
            return {
                error: `Work ${id} not found`,
            };
        }

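        // Return a deep copy without the internal netId field so callers can't mutate queue state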
        const copiedWork = JSON.parse(JSON.stringify(work));
        delete copiedWork.netId;
        return copiedWork as RelayerTornadoQueue;
    }

    pendingWorks() {
        return this.queue.filter((q) => q.status === RelayerStatus.QUEUED || q.status === RelayerStatus.ACCEPTED)
            .length;
    }
}
125
src/start.ts
Normal file
125
src/start.ts
Normal file
@ -0,0 +1,125 @@
import process from 'process';
import cluster from 'cluster';
import type { Logger } from 'winston';

import { getRelayerConfig, RelayerConfig } from './config';
import { getLogger, SyncManager, Router, RelayerWorker, checkProviders } from './services';

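// Worker processes serve the HTTP Router; the primary process syncs events and schedules withdrawals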
if (cluster.isWorker) {
    new Router(JSON.parse(process.env.relayerConfig as string) as RelayerConfig, Number(process.env.forkId));
} else {
    start();
}

async function forkRouter({
    relayerConfig,
    logger,
    syncManager,
    relayerWorker,
    forkId,
}: {
    relayerConfig: RelayerConfig;
    logger: Logger;
    syncManager: SyncManager;
    relayerWorker: RelayerWorker;
    forkId: number;
}) {
    const worker = cluster.fork({
        relayerConfig: JSON.stringify(relayerConfig),
        forkId,
    });

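    // Respawn the worker if it dies, and answer its IPC requests (status, job lookups, new withdrawals, errors)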
    worker
        .on('exit', (code) => {
            logger.error(`Router Worker ${forkId} died with code ${code}, respawning...`);

            setTimeout(() => {
                forkRouter({
                    relayerConfig,
                    logger,
                    syncManager,
                    relayerWorker,
                    forkId,
                });
            }, 5000);
        })
        .on('message', async (msg) => {
            const { msgId, type } = msg;

            if (type === 'status') {
                worker.send({
                    msgId,
                    syncManagerStatus: syncManager.getStatus(),
                    pendingWorks: relayerWorker.pendingWorks(),
                });
                return;
            }

            if (type === 'job') {
                const work = relayerWorker.getWork({
                    id: msg.id,
                });

                worker.send({
                    msgId,
                    ...work,
                });
                return;
            }

            if (type === 'tornadoWithdraw') {
                const newWork = await relayerWorker.createWork({
                    netId: msg.netId,
                    contract: msg.contract,
                    proof: msg.proof,
                    args: msg.args,
                });

                worker.send({
                    msgId,
                    ...newWork,
                });
                return;
            }

            if (type === 'errors') {
                worker.send({
                    msgId,
                    errors: [...syncManager.errors, ...relayerWorker.errors],
                });
                return;
            }
        });
}

async function start() {
    const relayerConfig = getRelayerConfig();
    const logger = getLogger('[Main]', relayerConfig.logLevel);

    console.log('Relayer config', relayerConfig);

    await checkProviders(relayerConfig, logger);

    const syncManager = new SyncManager(relayerConfig);

    await syncManager.syncEvents();

    const relayerWorker = new RelayerWorker(relayerConfig, syncManager);

    setInterval(() => syncManager.syncEvents(), relayerConfig.syncInterval * 1000);

    // Spawn the Router workers that serve the website / API
    let i = 0;
    while (i < relayerConfig.workers) {
        forkRouter({
            relayerConfig,
            logger,
            syncManager,
            relayerWorker,
            forkId: i,
        });
        i++;
    }

    logger.info(`Spawned ${i} Router Workers`);
}
25
src/types/bloomfilter.js.d.ts
vendored
Normal file
25
src/types/bloomfilter.js.d.ts
vendored
Normal file
@ -0,0 +1,25 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
declare module 'bloomfilter.js' {
    export default class BloomFilter {
        m: number;
        k: number;
        size: number;
        bitview: any;

        constructor(n: number, false_positive_tolerance?: number);

        calculateHash(x: number, m: number, i: number): number;

        test(data: any): boolean;

        add(data: any): void;

        bytelength(): number;

        view(): Uint8Array;

        serialize(): string;

        deserialize(serialized: string): BloomFilter;
    }
}
BIN
static/events/deposits_100_xdai_100.json.zip
vendored
Normal file
BIN
static/events/deposits_100_xdai_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_100_xdai_1000.json.zip
vendored
Normal file
BIN
static/events/deposits_100_xdai_1000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_100_xdai_10000.json.zip
vendored
Normal file
BIN
static/events/deposits_100_xdai_10000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_100_xdai_100000.json.zip
vendored
Normal file
BIN
static/events/deposits_100_xdai_100000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_10_eth_0.1.json.zip
vendored
Normal file
BIN
static/events/deposits_10_eth_0.1.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_10_eth_1.json.zip
vendored
Normal file
BIN
static/events/deposits_10_eth_1.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_10_eth_10.json.zip
vendored
Normal file
BIN
static/events/deposits_10_eth_10.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_10_eth_100.json.zip
vendored
Normal file
BIN
static/events/deposits_10_eth_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_11155111_dai_100.json.zip
vendored
Normal file
BIN
static/events/deposits_11155111_dai_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_11155111_dai_1000.json.zip
vendored
Normal file
BIN
static/events/deposits_11155111_dai_1000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_11155111_dai_10000.json.zip
vendored
Normal file
BIN
static/events/deposits_11155111_dai_10000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_11155111_dai_100000.json.zip
vendored
Normal file
BIN
static/events/deposits_11155111_dai_100000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_11155111_eth_0.1.json.zip
vendored
Normal file
BIN
static/events/deposits_11155111_eth_0.1.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_11155111_eth_1.json.zip
vendored
Normal file
BIN
static/events/deposits_11155111_eth_1.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_11155111_eth_10.json.zip
vendored
Normal file
BIN
static/events/deposits_11155111_eth_10.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_11155111_eth_100.json.zip
vendored
Normal file
BIN
static/events/deposits_11155111_eth_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_137_matic_100.json.zip
vendored
Normal file
BIN
static/events/deposits_137_matic_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_137_matic_1000.json.zip
vendored
Normal file
BIN
static/events/deposits_137_matic_1000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_137_matic_10000.json.zip
vendored
Normal file
BIN
static/events/deposits_137_matic_10000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_137_matic_100000.json.zip
vendored
Normal file
BIN
static/events/deposits_137_matic_100000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_cdai_5000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_cdai_5000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_cdai_50000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_cdai_50000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_cdai_500000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_cdai_500000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_cdai_5000000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_cdai_5000000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_dai_100.json.zip
vendored
Normal file
BIN
static/events/deposits_1_dai_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_dai_1000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_dai_1000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_dai_10000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_dai_10000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_dai_100000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_dai_100000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_eth_0.1.json.zip
vendored
Normal file
BIN
static/events/deposits_1_eth_0.1.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_eth_1.json.zip
vendored
Normal file
BIN
static/events/deposits_1_eth_1.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_eth_10.json.zip
vendored
Normal file
BIN
static/events/deposits_1_eth_10.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_eth_100.json.zip
vendored
Normal file
BIN
static/events/deposits_1_eth_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_usdc_100.json.zip
vendored
Normal file
BIN
static/events/deposits_1_usdc_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_usdc_1000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_usdc_1000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_usdt_100.json.zip
vendored
Normal file
BIN
static/events/deposits_1_usdt_100.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_usdt_1000.json.zip
vendored
Normal file
BIN
static/events/deposits_1_usdt_1000.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_wbtc_0.1.json.zip
vendored
Normal file
BIN
static/events/deposits_1_wbtc_0.1.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_wbtc_1.json.zip
vendored
Normal file
BIN
static/events/deposits_1_wbtc_1.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_1_wbtc_10.json.zip
vendored
Normal file
BIN
static/events/deposits_1_wbtc_10.json.zip
vendored
Normal file
Binary file not shown.
BIN
static/events/deposits_42161_eth_0.1.json.zip
vendored
Normal file
BIN
static/events/deposits_42161_eth_0.1.json.zip
vendored
Normal file
Binary file not shown.
Some files were not shown because too many files have changed in this diff