Fixes in README and challenge word fix

This commit is contained in:
Jordi Baylina 2020-07-14 11:55:12 +02:00
parent ded45aac6f
commit 12ec77e9c3
No known key found for this signature in database
GPG Key ID: 7480C80C1BE43112
20 changed files with 521 additions and 523 deletions

@ -49,7 +49,7 @@ snarkjs --help
```
You can also us the `--help` option with specific commands:
You can also use the `--help` option with specific commands:
```sh
snarkjs groth16 prove --help
@ -65,12 +65,10 @@ snarkjs g16p --help
### Debugging tip
If you feel a command is taking longer than it should, re-run it with a `-v` or `--verbose` option to see more details about how it's progressing and where it's getting blocked.
## Guide
> If this is your first time using circom and snarkjs, we recommend going through [this tutorial](https://blog.iden3.io/first-zk-proof.html) first.
### 0. Create and move into a new directory
```sh
mkdir snarkjs_example
@ -89,8 +87,6 @@ The first parameter after `new` refers to the type of curve you wish to use. At
The second parameter, in this case `12`, is the base-2 logarithm of the maximum number of constraints the ceremony can accept: in this case, the number of constraints is `2 ^ 12 = 4096`. The maximum value supported here is `28`, which means you can use `snarkjs` to securely generate zk-snark parameters for circuits with up to `2 ^ 28` (≈268 million) constraints.
> Note that the creator of the ceremony is also the first contributor.
### 2. Contribute to the ceremony
```sh
snarkjs powersoftau contribute pot12_0000.ptau pot12_0001.ptau --name="First contribution" -v
@ -113,12 +109,14 @@ By allowing you to write the random text as part of the command, the `-e` parame
### 4. Provide a third contribution using third party software
```sh
snarkjs powersoftau export challange pot12_0002.ptau challange_0003
snarkjs powersoftau challange contribute bn128 challange_0003 response_0003 -e="some random text"
snarkjs powersoftau export challenge pot12_0002.ptau challenge_0003
snarkjs powersoftau challenge contribute bn128 challenge_0003 response_0003 -e="some random text"
snarkjs powersoftau import response pot12_0002.ptau response_0003 pot12_0003.ptau -n="Third contribution name"
```
The commands above use [this software](https://github.com/kobigurk/phase2-bn254) to help generate a challenge, response, and a new `ptau` file.
The challenge and response files are compatible with [this software](https://github.com/kobigurk/phase2-bn254).
This makes it possible to use both pieces of software in a single ceremony.
### 5. Verify the protocol so far
```sh
@ -158,7 +156,7 @@ snarkjs powersoftau prepare phase2 pot12_beacon.ptau pot12_final.ptau -v
We're now ready to prepare phase 2 of the setup (the circuit-specific phase).
Under the hood, the `prepare phase2` command calculates the evaluation of the Lagrange polynomials at tau for `alpha*tau` and `beta*tau`. It takes the beacon `ptau` file we generated in the previous step, and outputs a final `ptau` file which will be used to generate the circuit proving and verification keys.
Under the hood, the `prepare phase2` command calculates the encrypted evaluation of the Lagrange polynomials at tau for `tau`, `alpha*tau` and `beta*tau`. It takes the beacon `ptau` file we generated in the previous step, and outputs a final `ptau` file which will be used to generate the circuit proving and verification keys.
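> In other words (a sketch of the contents, not the exact file layout): for each Lagrange basis polynomial `L_i` over the evaluation domain, the final `ptau` file stores the group elements `L_i(tau)*G`, `alpha*L_i(tau)*G` and `beta*L_i(tau)*G` for the group generators `G`, so the circuit-specific phase can work with these evaluations without anyone ever knowing `tau`, `alpha` or `beta` themselves.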
### 8. Verify the final `ptau`
```sh
@ -169,7 +167,7 @@ The `verify` command verifies a powers of tau file.
Before we go ahead and create the circuit, we perform a final check and verify the final protocol transcript.
You can see that this time there is no longer a warning informing you that you have not run `prepare phase2`.
### 9. Create the circuit
```sh
@ -259,7 +257,7 @@ snarkjs zkey new circuit.r1cs pot12_final.ptau circuit_0000.zkey
The `zkey new` command creates an initial `zkey` file with zero contributions.
The `zkey` is a zero-knowledge key that includes both the pairing and verification keys as well as phase2 contributions.
The `zkey` is a zero-knowledge key that includes both the proving and verification keys as well as phase2 contributions.
Importantly, one can verify whether a `zkey` belongs to a specific circuit or not.
@ -287,8 +285,8 @@ We provide a second contribution.
### 17. Provide a third contribution using third party software
```sh
snarkjs zkey export bellman circuit_0002.zkey challange_phase2_0003
snarkjs zkey bellman contribute bn128 challange_phase2_0003 response_phase2_0003 -e="some random text"
snarkjs zkey export bellman circuit_0002.zkey challenge_phase2_0003
snarkjs zkey bellman contribute bn128 challenge_phase2_0003 response_phase2_0003 -e="some random text"
snarkjs zkey import bellman circuit_0002.zkey response_phase2_0003 circuit_0003.zkey -n="Third contribution name"
```
@ -301,7 +299,9 @@ snarkjs zkey verify circuit.r1cs pot12_final.ptau circuit_0003.zkey
The `zkey verify` command verifies a `zkey` file. It also prints the hashes of all the intermediate results to the console.
We verify the `zkey` file we created in the previous step, which means we check all the contributions to the second phase of the multi-party computation (MPC) up to that point.
This command also checks that the `zkey` file matches the circuit.
If everything checks out, you should see the following:
@ -316,7 +316,7 @@ snarkjs zkey beacon circuit_0003.zkey circuit_final.zkey 0102030405060708090a0b0
The `zkey beacon` command creates a `zkey` file with a contribution applied in the form of a random beacon.
We us it to apply a random beacon to the latest `zkey` after all the final contribution has been made (this is necessary in order to generate a final `zkey` file and finalise phase 2 of the trusted setup).
We use it to apply a random beacon to the latest `zkey` after the final contribution has been made (this is necessary in order to generate a final `zkey` file and finalise phase 2 of the trusted setup).
### 20. Verify the final `zkey`
```sh
@ -388,9 +388,7 @@ Finally, we export the verifier as a Solidity smart-contract so that we can publ
snarkjs zkey export soliditycalldata public.json proof.json
```
We use `soliditycalldata` to simulate a verification call, and cut and paste the result directly in the verifyProof field in the deployed smart contract.
This call will return true if both the proof and public data are valid.
We use `soliditycalldata` to simulate a verification call, and cut and paste the result directly in the verifyProof field in the deployed smart contract in the Remix environment.
And voilà! That's all there is to it :)

@ -739,7 +739,7 @@ async function r1csExportJson(r1csFileName, logger) {
var name = "snarkjs";
var type = "module";
var version = "0.3.3";
var version = "0.3.4";
var description = "zkSNARKs implementation in JavaScript";
var main = "./build/main.cjs";
var module$1 = "./main.js";
@ -1064,12 +1064,12 @@ function hashToG2(curve, hash) {
return g2_sp;
}
function getG2sp(curve, persinalization, challange, g1s, g1sx) {
function getG2sp(curve, persinalization, challenge, g1s, g1sx) {
const h = Blake2b(64);
const b1 = new Uint8Array([persinalization]);
h.update(b1);
h.update(challange);
h.update(challenge);
const b3 = curve.G1.toUncompressed(g1s);
h.update( b3);
const b4 = curve.G1.toUncompressed(g1sx);
@ -1079,15 +1079,15 @@ function getG2sp(curve, persinalization, challange, g1s, g1sx) {
return hashToG2(curve, hash);
}
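// The one-byte personalization tag domain-separates the three sub-keys:
// 0 = tau, 1 = alpha, 2 = beta (see the calculatePubKey calls in createPTauKey below).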
function calculatePubKey(k, curve, personalization, challangeHash, rng ) {
function calculatePubKey(k, curve, personalization, challengeHash, rng ) {
k.g1_s = curve.G1.toAffine(curve.G1.fromRng(rng));
k.g1_sx = curve.G1.toAffine(curve.G1.timesFr(k.g1_s, k.prvKey));
k.g2_sp = curve.G2.toAffine(getG2sp(curve, personalization, challangeHash, k.g1_s, k.g1_sx));
k.g2_sp = curve.G2.toAffine(getG2sp(curve, personalization, challengeHash, k.g1_s, k.g1_sx));
k.g2_spx = curve.G2.toAffine(curve.G2.timesFr(k.g2_sp, k.prvKey));
return k;
}
function createPTauKey(curve, challangeHash, rng) {
function createPTauKey(curve, challengeHash, rng) {
const key = {
tau: {},
alpha: {},
@ -1096,9 +1096,9 @@ function createPTauKey(curve, challangeHash, rng) {
key.tau.prvKey = curve.Fr.fromRng(rng);
key.alpha.prvKey = curve.Fr.fromRng(rng);
key.beta.prvKey = curve.Fr.fromRng(rng);
calculatePubKey(key.tau, curve, 0, challangeHash, rng);
calculatePubKey(key.alpha, curve, 1, challangeHash, rng);
calculatePubKey(key.beta, curve, 2, challangeHash, rng);
calculatePubKey(key.tau, curve, 0, challengeHash, rng);
calculatePubKey(key.alpha, curve, 1, challengeHash, rng);
calculatePubKey(key.beta, curve, 2, challengeHash, rng);
return key;
}
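// For orientation, the helpers above are combined like this in the contribute()
// flow further down in this file (condensed from that flow, not additional API):
//
//   const lastChallengeHash = contributions.length > 0
//       ? contributions[contributions.length - 1].nextChallenge
//       : calculateFirstChallengeHash(curve, power, logger);
//   const rng = await getRandomRng(entropy);   // entropy supplied by the contributor
//   const key = createPTauKey(curve, lastChallengeHash, rng);
//   // key.tau / key.alpha / key.beta each hold {prvKey, g1_s, g1_sx, g2_sp, g2_spx}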
@ -1439,7 +1439,7 @@ async function readContribution(fd, curve) {
c.betaG2 = await readG2();
c.key = await readPtauPubKey(fd, curve, true);
c.partialHash = await fd.read(216);
c.nextChallange = await fd.read(64);
c.nextChallenge = await fd.read(64);
c.type = await fd.readULE32();
const buffV = new Uint8Array(curve.G1.F.n8*2*6+curve.G2.F.n8*2*3);
@ -1522,7 +1522,7 @@ async function writeContribution(fd, curve, contribution) {
await writeG2(contribution.betaG2);
await writePtauPubKey(fd, curve, contribution.key, true);
await fd.write(contribution.partialHash);
await fd.write(contribution.nextChallange);
await fd.write(contribution.nextChallenge);
await fd.writeULE32(contribution.type || 0);
const params = [];
@ -1579,8 +1579,8 @@ async function writeContributions(fd, curve, contributions) {
fd.pos = oldPos;
}
function calculateFirstChallangeHash(curve, power, logger) {
if (logger) logger.debug("Calculating First Challange Hash");
function calculateFirstChallengeHash(curve, power, logger) {
if (logger) logger.debug("Calculating First Challenge Hash");
const hasher = new Blake2b(64);
@ -1626,11 +1626,11 @@ function calculateFirstChallangeHash(curve, power, logger) {
}
function keyFromBeacon(curve, challangeHash, beaconHash, numIterationsExp) {
function keyFromBeacon(curve, challengeHash, beaconHash, numIterationsExp) {
const rng = rngFromBeaconParams(beaconHash, numIterationsExp);
const key = createPTauKey(curve, challangeHash, rng);
const key = createPTauKey(curve, challengeHash, rng);
return key;
}
@ -1812,7 +1812,7 @@ contributions(7)
beta_g1sx
beta_g1spx
partialHash (216 bytes) See https://github.com/mafintosh/blake2b-wasm/blob/23bee06945806309977af802bc374727542617c7/blake2b.wat#L9
hashNewChallange
hashNewChallenge
]
*/
@ -1881,40 +1881,40 @@ async function newAccumulator(curve, power, fileName, logger) {
await fd.close();
const firstChallangeHash = calculateFirstChallangeHash(curve, power, logger);
const firstChallengeHash = calculateFirstChallengeHash(curve, power, logger);
if (logger) logger.debug(formatHash(Blake2b(64).digest(), "Blank Contribution Hash:"));
if (logger) logger.info(formatHash(firstChallangeHash, "First Contribution Hash:"));
if (logger) logger.info(formatHash(firstChallengeHash, "First Contribution Hash:"));
return firstChallangeHash;
return firstChallengeHash;
}
// Format of the output
async function exportChallange(pTauFilename, challangeFilename, logger) {
async function exportChallenge(pTauFilename, challengeFilename, logger) {
await Blake2b.ready();
const {fd: fdFrom, sections} = await readBinFile$1(pTauFilename, "ptau", 1);
const {curve, power} = await readPTauHeader(fdFrom, sections);
const contributions = await readContributions(fdFrom, curve, sections);
let lastResponseHash, curChallangeHash;
let lastResponseHash, curChallengeHash;
if (contributions.length == 0) {
lastResponseHash = Blake2b(64).digest();
curChallangeHash = calculateFirstChallangeHash(curve, power);
curChallengeHash = calculateFirstChallengeHash(curve, power);
} else {
lastResponseHash = contributions[contributions.length-1].responseHash;
curChallangeHash = contributions[contributions.length-1].nextChallange;
curChallengeHash = contributions[contributions.length-1].nextChallenge;
}
if (logger) logger.info(formatHash(lastResponseHash, "Last Response Hash: "));
if (logger) logger.info(formatHash(curChallangeHash, "New Challange Hash: "));
if (logger) logger.info(formatHash(curChallengeHash, "New Challenge Hash: "));
const fdTo = await createOverride(challangeFilename);
const fdTo = await createOverride(challengeFilename);
const toHash = Blake2b(64);
await fdTo.write(lastResponseHash);
@ -1929,16 +1929,16 @@ async function exportChallange(pTauFilename, challangeFilename, logger) {
await fdFrom.close();
await fdTo.close();
const calcCurChallangeHash = toHash.digest();
const calcCurChallengeHash = toHash.digest();
if (!hashIsEqual (curChallangeHash, calcCurChallangeHash)) {
if (logger) logger.info(formatHash(calcCurChallangeHash, "Calc Curret Challange Hash: "));
if (!hashIsEqual (curChallengeHash, calcCurChallengeHash)) {
if (logger) logger.info(formatHash(calcCurChallengeHash, "Calc Current Challenge Hash: "));
if (logger) logger.error("PTau file is corrupted. Calculated new challange hash does not match with the eclared one");
throw new Error("PTau file is corrupted. Calculated new challange hash does not match with the eclared one");
if (logger) logger.error("PTau file is corrupted. Calculated new challenge hash does not match with the eclared one");
throw new Error("PTau file is corrupted. Calculated new challenge hash does not match with the eclared one");
}
return curChallangeHash;
return curChallengeHash;
async function exportSection(sectionId, groupName, nPoints, sectionName) {
const G = curve[groupName];
@ -1989,12 +1989,12 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
sG1*6 + sG2*3)
throw new Error("Size of the contribution is invalid");
let lastChallangeHash;
let lastChallengeHash;
if (contributions.length>0) {
lastChallangeHash = contributions[contributions.length-1].nextChallange;
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallangeHash = calculateFirstChallangeHash(curve, power, logger);
lastChallengeHash = calculateFirstChallengeHash(curve, power, logger);
}
const fdNew = await createBinFile(newPTauFilename, "ptau", 1, 7);
@ -2002,7 +2002,7 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
const contributionPreviousHash = await fdResponse.read(64);
if(!hashIsEqual(contributionPreviousHash,lastChallangeHash))
if(!hashIsEqual(contributionPreviousHash,lastChallengeHash))
throw new Error("Wrong contribution. this contribution is not based on the previus hash");
const hasherResponse = new Blake2b(64);
@ -2033,8 +2033,8 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
if (logger) logger.info(formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallangeHasher = new Blake2b(64);
nextChallangeHasher.update(hashResponse);
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1", logger);
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2", logger);
@ -2042,9 +2042,9 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1", logger);
await hashSection(fdNew, "G2", 6, 1 , "betaG2", logger);
currentContribution.nextChallange = nextChallangeHasher.digest();
currentContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(formatHash(currentContribution.nextChallange, "Next Challange Hash: "));
if (logger) logger.info(formatHash(currentContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(currentContribution);
@ -2054,7 +2054,7 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
await fdNew.close();
await fdOld.close();
return currentContribution.nextChallange;
return currentContribution.nextChallenge;
async function processSection(fdFrom, fdTo, groupName, sectionId, nPoints, singularPointIndexes, sectionName) {
@ -2111,7 +2111,7 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
const buffU = await G.batchLEMtoU(buffLEM);
nextChallangeHasher.update(buffU);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;
@ -2124,97 +2124,97 @@ const sameRatio$1 = sameRatio;
async function verifyContribution(curve, cur, prev, logger) {
let sr;
if (cur.type == 1) { // Verify the beacon.
const beaconKey = keyFromBeacon(curve, prev.nextChallange, cur.beaconHash, cur.numIterationsExp);
const beaconKey = keyFromBeacon(curve, prev.nextChallenge, cur.beaconHash, cur.numIterationsExp);
if (!curve.G1.eq(cur.key.tau.g1_s, beaconKey.tau.g1_s)) {
if (logger) logger.error(`BEACON key (tauG1_s) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (tauG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.tau.g1_sx, beaconKey.tau.g1_sx)) {
if (logger) logger.error(`BEACON key (tauG1_sx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (tauG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.tau.g2_spx, beaconKey.tau.g2_spx)) {
if (logger) logger.error(`BEACON key (tauG2_spx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (tauG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.alpha.g1_s, beaconKey.alpha.g1_s)) {
if (logger) logger.error(`BEACON key (alphaG1_s) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (alphaG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.alpha.g1_sx, beaconKey.alpha.g1_sx)) {
if (logger) logger.error(`BEACON key (alphaG1_sx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (alphaG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.alpha.g2_spx, beaconKey.alpha.g2_spx)) {
if (logger) logger.error(`BEACON key (alphaG2_spx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (alphaG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.beta.g1_s, beaconKey.beta.g1_s)) {
if (logger) logger.error(`BEACON key (betaG1_s) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (betaG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.beta.g1_sx, beaconKey.beta.g1_sx)) {
if (logger) logger.error(`BEACON key (betaG1_sx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (betaG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.beta.g2_spx, beaconKey.beta.g2_spx)) {
if (logger) logger.error(`BEACON key (betaG2_spx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (betaG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
}
cur.key.tau.g2_sp = curve.G2.toAffine(getG2sp(curve, 0, prev.nextChallange, cur.key.tau.g1_s, cur.key.tau.g1_sx));
cur.key.alpha.g2_sp = curve.G2.toAffine(getG2sp(curve, 1, prev.nextChallange, cur.key.alpha.g1_s, cur.key.alpha.g1_sx));
cur.key.beta.g2_sp = curve.G2.toAffine(getG2sp(curve, 2, prev.nextChallange, cur.key.beta.g1_s, cur.key.beta.g1_sx));
cur.key.tau.g2_sp = curve.G2.toAffine(getG2sp(curve, 0, prev.nextChallenge, cur.key.tau.g1_s, cur.key.tau.g1_sx));
cur.key.alpha.g2_sp = curve.G2.toAffine(getG2sp(curve, 1, prev.nextChallenge, cur.key.alpha.g1_s, cur.key.alpha.g1_sx));
cur.key.beta.g2_sp = curve.G2.toAffine(getG2sp(curve, 2, prev.nextChallenge, cur.key.beta.g1_s, cur.key.beta.g1_sx));
sr = await sameRatio$1(curve, cur.key.tau.g1_s, cur.key.tau.g1_sx, cur.key.tau.g2_sp, cur.key.tau.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (tau) in challange #"+cur.id);
if (logger) logger.error("INVALID key (tau) in challenge #"+cur.id);
return false;
}
sr = await sameRatio$1(curve, cur.key.alpha.g1_s, cur.key.alpha.g1_sx, cur.key.alpha.g2_sp, cur.key.alpha.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (alpha) in challange #"+cur.id);
if (logger) logger.error("INVALID key (alpha) in challenge #"+cur.id);
return false;
}
sr = await sameRatio$1(curve, cur.key.beta.g1_s, cur.key.beta.g1_sx, cur.key.beta.g2_sp, cur.key.beta.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (beta) in challange #"+cur.id);
if (logger) logger.error("INVALID key (beta) in challenge #"+cur.id);
return false;
}
sr = await sameRatio$1(curve, prev.tauG1, cur.tauG1, cur.key.tau.g2_sp, cur.key.tau.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID tau*G1. challange #"+cur.id+" It does not follow the previous contribution");
if (logger) logger.error("INVALID tau*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio$1(curve, cur.key.tau.g1_s, cur.key.tau.g1_sx, prev.tauG2, cur.tauG2);
if (sr !== true) {
if (logger) logger.error("INVALID tau*G2. challange #"+cur.id+" It does not follow the previous contribution");
if (logger) logger.error("INVALID tau*G2. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio$1(curve, prev.alphaG1, cur.alphaG1, cur.key.alpha.g2_sp, cur.key.alpha.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID alpha*G1. challange #"+cur.id+" It does not follow the previous contribution");
if (logger) logger.error("INVALID alpha*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio$1(curve, prev.betaG1, cur.betaG1, cur.key.beta.g2_sp, cur.key.beta.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID beta*G1. challange #"+cur.id+" It does not follow the previous contribution");
if (logger) logger.error("INVALID beta*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio$1(curve, cur.key.beta.g1_s, cur.key.beta.g1_sx, prev.betaG2, cur.betaG2);
if (sr !== true) {
if (logger) logger.error("INVALID beta*G2. challange #"+cur.id+"It does not follow the previous contribution");
if (logger) logger.error("INVALID beta*G2. challenge #"+cur.id+"It does not follow the previous contribution");
return false;
}
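// Note on sameRatio (a sketch of the underlying check, not a new definition):
// sameRatio(curve, a, b, C, D) holds iff e(a, D) == e(b, C), i.e. iff b = k*a
// and D = k*C for the same secret scalar k. Each check above therefore ties the
// new points to the previous contribution without revealing the contributor's key.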
@ -2240,7 +2240,7 @@ async function verify(tauFilename, logger) {
alphaG1: curve.G1.g,
betaG1: curve.G1.g,
betaG2: curve.G2.g,
nextChallange: calculateFirstChallangeHash(curve, ceremonyPower, logger),
nextChallenge: calculateFirstChallengeHash(curve, ceremonyPower, logger),
responseHash: Blake2b(64).digest()
};
@ -2264,7 +2264,7 @@ async function verify(tauFilename, logger) {
const nextContributionHasher = Blake2b(64);
nextContributionHasher.update(curContr.responseHash);
// Verify powers and compute nextChallangeHash
// Verify powers and compute nextChallengeHash
// await test();
@ -2340,13 +2340,13 @@ async function verify(tauFilename, logger) {
const nextContributionHash = nextContributionHasher.digest();
// Check the nextChallangeHash
if (!hashIsEqual(nextContributionHash,curContr.nextChallange)) {
if (logger) logger.error("Hash of the values does not match the next challange of the last contributor in the contributions section");
// Check the nextChallengeHash
if (!hashIsEqual(nextContributionHash,curContr.nextChallenge)) {
if (logger) logger.error("Hash of the values does not match the next challenge of the last contributor in the contributions section");
return false;
}
if (logger) logger.info(formatHash(nextContributionHash, "Next challange hash: "));
if (logger) logger.info(formatHash(nextContributionHash, "Next challenge hash: "));
// Verify Previous contributions
@ -2386,7 +2386,7 @@ async function verify(tauFilename, logger) {
logger.info("-----------------------------------------------------");
logger.info(`Contribution #${curContr.id}: ${curContr.name ||""}`);
logger.info(formatHash(curContr.nextChallange, "Next Challange: "));
logger.info(formatHash(curContr.nextChallenge, "Next Challenge: "));
const buffV = new Uint8Array(curve.G1.F.n8*2*6+curve.G2.F.n8*2*3);
toPtauPubKeyRpr(buffV, 0, curve, curContr.key, false);
@ -2398,7 +2398,7 @@ async function verify(tauFilename, logger) {
logger.info(formatHash(responseHash, "Response Hash:"));
logger.info(formatHash(prevContr.nextChallange, "Response Hash:"));
logger.info(formatHash(prevContr.nextChallenge, "Response Hash:"));
if (curContr.type == 1) {
logger.info(`Beacon generator: ${byteArray2hex(curContr.beaconHash)}`);
@ -2555,7 +2555,7 @@ async function verify(tauFilename, logger) {
This function creates a new section in the fdTo file with id idSection.
It multiplies the points in fdFrom by first, first*inc, first*inc^2, ...
nPoints times.
It also updates the newChallangeHasher with the new points
It also updates the newChallengeHasher with the new points
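In symbols (sketch): pointOut[i] = (first * inc^i) * pointIn[i], for i = 0 .. nPoints-1.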
*/
async function applyKeyToSection(fdOld, sections, fdNew, idSection, curve, groupName, first, inc, sectionName, logger) {
@ -2584,7 +2584,7 @@ async function applyKeyToSection(fdOld, sections, fdNew, idSection, curve, group
async function applyKeyToChallangeSection(fdOld, fdNew, responseHasher, curve, groupName, nPoints, first, inc, formatOut, sectionName, logger) {
async function applyKeyToChallengeSection(fdOld, fdNew, responseHasher, curve, groupName, nPoints, first, inc, formatOut, sectionName, logger) {
const G = curve[groupName];
const sG = G.F.n8*2;
const chunkSize = Math.floor((1<<20) / sG); // 128Mb chunks
@ -2610,10 +2610,10 @@ async function applyKeyToChallangeSection(fdOld, fdNew, responseHasher, curve, g
// Format of the output
async function challangeContribute(curve, challangeFilename, responesFileName, entropy, logger) {
async function challengeContribute(curve, challengeFilename, responesFileName, entropy, logger) {
await Blake2b.ready();
const fdFrom = await readExisting$1(challangeFilename);
const fdFrom = await readExisting$1(challengeFilename);
const sG1 = curve.F1.n64*8*2;
@ -2634,21 +2634,21 @@ async function challangeContribute(curve, challangeFilename, responesFileName, e
const fdTo = await createOverride(responesFileName);
// Calculate the hash
if (logger) logger.debug("Hashing challange");
const challangeHasher = Blake2b(64);
if (logger) logger.debug("Hashing challenge");
const challengeHasher = Blake2b(64);
for (let i=0; i<fdFrom.totalSize; i+= fdFrom.pageSize) {
const s = Math.min(fdFrom.totalSize - i, fdFrom.pageSize);
const buff = await fdFrom.read(s);
challangeHasher.update(buff);
challengeHasher.update(buff);
}
const claimedHash = await fdFrom.read(64, 0);
if (logger) logger.info(formatHash(claimedHash, "Claimed Previous Response Hash: "));
const challangeHash = challangeHasher.digest();
if (logger) logger.info(formatHash(challangeHash, "Current Challange Hash: "));
const challengeHash = challengeHasher.digest();
if (logger) logger.info(formatHash(challengeHash, "Current Challenge Hash: "));
const key = createPTauKey(curve, challangeHash, rng);
const key = createPTauKey(curve, challengeHash, rng);
if (logger) {
["tau", "alpha", "beta"].forEach( (k) => {
@ -2662,14 +2662,14 @@ async function challangeContribute(curve, challangeFilename, responesFileName, e
const responseHasher = Blake2b(64);
await fdTo.write(challangeHash);
responseHasher.update(challangeHash);
await fdTo.write(challengeHash);
responseHasher.update(challengeHash);
await applyKeyToChallangeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power)*2-1, curve.Fr.one , key.tau.prvKey, "COMPRESSED", "tauG1" , logger );
await applyKeyToChallangeSection(fdFrom, fdTo, responseHasher, curve, "G2", (1<<power) , curve.Fr.one , key.tau.prvKey, "COMPRESSED", "tauG2" , logger );
await applyKeyToChallangeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power) , key.alpha.prvKey, key.tau.prvKey, "COMPRESSED", "alphaTauG1", logger );
await applyKeyToChallangeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power) , key.beta.prvKey , key.tau.prvKey, "COMPRESSED", "betaTauG1" , logger );
await applyKeyToChallangeSection(fdFrom, fdTo, responseHasher, curve, "G2", 1 , key.beta.prvKey , key.tau.prvKey, "COMPRESSED", "betaTauG2" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power)*2-1, curve.Fr.one , key.tau.prvKey, "COMPRESSED", "tauG1" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G2", (1<<power) , curve.Fr.one , key.tau.prvKey, "COMPRESSED", "tauG2" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power) , key.alpha.prvKey, key.tau.prvKey, "COMPRESSED", "alphaTauG1", logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power) , key.beta.prvKey , key.tau.prvKey, "COMPRESSED", "betaTauG1" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G2", 1 , key.beta.prvKey , key.tau.prvKey, "COMPRESSED", "betaTauG2" , logger );
// Write and hash key
const buffKey = new Uint8Array(curve.F1.n8*2*6+curve.F2.n8*2*3);
@ -2722,18 +2722,18 @@ async function beacon(oldPtauFilename, newPTauFilename, name, beaconHashStr,num
beaconHash: beaconHash
};
let lastChallangeHash;
let lastChallengeHash;
if (contributions.length>0) {
lastChallangeHash = contributions[contributions.length-1].nextChallange;
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallangeHash = calculateFirstChallangeHash(curve, power, logger);
lastChallengeHash = calculateFirstChallengeHash(curve, power, logger);
}
curContribution.key = keyFromBeacon(curve, lastChallangeHash, beaconHash, numIterationsExp);
curContribution.key = keyFromBeacon(curve, lastChallengeHash, beaconHash, numIterationsExp);
const responseHasher = new Blake2b(64);
responseHasher.update(lastChallangeHash);
responseHasher.update(lastChallengeHash);
const fdNew = await createBinFile(newPTauFilename, "ptau", 1, 7);
await writePTauHeader(fdNew, curve, power);
@ -2763,8 +2763,8 @@ async function beacon(oldPtauFilename, newPTauFilename, name, beaconHashStr,num
if (logger) logger.info(formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallangeHasher = new Blake2b(64);
nextChallangeHasher.update(hashResponse);
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1", logger);
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2", logger);
@ -2772,9 +2772,9 @@ async function beacon(oldPtauFilename, newPTauFilename, name, beaconHashStr,num
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1", logger);
await hashSection(fdNew, "G2", 6, 1 , "betaG2", logger);
curContribution.nextChallange = nextChallangeHasher.digest();
curContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(formatHash(curContribution.nextChallange, "Next Challange Hash: "));
if (logger) logger.info(formatHash(curContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(curContribution);
@ -2844,7 +2844,7 @@ async function beacon(oldPtauFilename, newPTauFilename, name, beaconHashStr,num
const buffU = await G.batchLEMtoU(buffLEM);
nextChallangeHasher.update(buffU);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;
@ -2871,24 +2871,24 @@ async function contribute(oldPtauFilename, newPTauFilename, name, entropy, logge
type: 0, // Normal contribution (type 1 is a beacon)
};
let lastChallangeHash;
let lastChallengeHash;
const rng = await getRandomRng(entropy);
if (contributions.length>0) {
lastChallangeHash = contributions[contributions.length-1].nextChallange;
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallangeHash = calculateFirstChallangeHash(curve, power, logger);
lastChallengeHash = calculateFirstChallengeHash(curve, power, logger);
}
// Generate a random key
curContribution.key = createPTauKey(curve, lastChallangeHash, rng);
curContribution.key = createPTauKey(curve, lastChallengeHash, rng);
const responseHasher = new Blake2b(64);
responseHasher.update(lastChallangeHash);
responseHasher.update(lastChallengeHash);
const fdNew = await createBinFile(newPTauFilename, "ptau", 1, 7);
await writePTauHeader(fdNew, curve, power);
@ -2918,8 +2918,8 @@ async function contribute(oldPtauFilename, newPTauFilename, name, entropy, logge
if (logger) logger.info(formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallangeHasher = new Blake2b(64);
nextChallangeHasher.update(hashResponse);
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1");
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2");
@ -2927,9 +2927,9 @@ async function contribute(oldPtauFilename, newPTauFilename, name, entropy, logge
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1");
await hashSection(fdNew, "G2", 6, 1 , "betaG2");
curContribution.nextChallange = nextChallangeHasher.digest();
curContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(formatHash(curContribution.nextChallange, "Next Challange Hash: "));
if (logger) logger.info(formatHash(curContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(curContribution);
@ -2999,7 +2999,7 @@ async function contribute(oldPtauFilename, newPTauFilename, name, entropy, logge
const buffU = await G.batchLEMtoU(buffLEM);
nextChallangeHasher.update(buffU);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;
@ -4900,7 +4900,7 @@ async function zkeyExportJson(zkeyFileName, verbose) {
// Format of the output
async function bellmanContribute(curve, challangeFilename, responesFileName, entropy, logger) {
async function bellmanContribute(curve, challengeFilename, responesFileName, entropy, logger) {
await Blake2b.ready();
const rng = await getRandomRng(entropy);
@ -4911,7 +4911,7 @@ async function bellmanContribute(curve, challangeFilename, responesFileName, ent
const sG1 = curve.G1.F.n8*2;
const sG2 = curve.G2.F.n8*2;
const fdFrom = await readExisting$1(challangeFilename);
const fdFrom = await readExisting$1(challengeFilename);
const fdTo = await createOverride(responesFileName);
@ -4934,12 +4934,12 @@ async function bellmanContribute(curve, challangeFilename, responesFileName, ent
// H
const nH = await fdFrom.readUBE32();
await fdTo.writeUBE32(nH);
await applyKeyToChallangeSection(fdFrom, fdTo, null, curve, "G1", nH, invDelta, curve.Fr.e(1), "UNCOMPRESSED", "H", logger);
await applyKeyToChallengeSection(fdFrom, fdTo, null, curve, "G1", nH, invDelta, curve.Fr.e(1), "UNCOMPRESSED", "H", logger);
// L
const nL = await fdFrom.readUBE32();
await fdTo.writeUBE32(nL);
await applyKeyToChallangeSection(fdFrom, fdTo, null, curve, "G1", nL, invDelta, curve.Fr.e(1), "UNCOMPRESSED", "L", logger);
await applyKeyToChallengeSection(fdFrom, fdTo, null, curve, "G1", nL, invDelta, curve.Fr.e(1), "UNCOMPRESSED", "L", logger);
// A
const nA = await fdFrom.readUBE32();
@ -5647,18 +5647,18 @@ const commands = [
action: powersOfTawContribute
},
{
cmd: "powersoftau export challange <powersoftau_0000.ptau> [challange]",
description: "Creates a challange",
cmd: "powersoftau export challenge <powersoftau_0000.ptau> [challenge]",
description: "Creates a challenge",
alias: ["ptec"],
options: "-verbose|v",
action: powersOfTawExportChallange
action: powersOfTawExportChallenge
},
{
cmd: "powersoftau challange contribute <curve> <challange> [response]",
description: "Contribute to a challange",
cmd: "powersoftau challenge contribute <curve> <challenge> [response]",
description: "Contribute to a challenge",
alias: ["ptcc"],
options: "-verbose|v -entropy|e",
action: powersOfTawChallangeContribute
action: powersOfTawChallengeContribute
},
{
cmd: "powersoftau import response <powersoftau_old.ptau> <response> <<powersoftau_new.ptau>",
@ -6202,41 +6202,41 @@ async function powersOfTawNew(params, options) {
return await newAccumulator(curve, power, ptauName, logger);
}
async function powersOfTawExportChallange(params, options) {
async function powersOfTawExportChallenge(params, options) {
let ptauName;
let challangeName;
let challengeName;
ptauName = params[0];
if (params.length < 2) {
challangeName = "challange";
challengeName = "challenge";
} else {
challangeName = params[1];
challengeName = params[1];
}
if (options.verbose) Logger.setLogLevel("DEBUG");
return await exportChallange(ptauName, challangeName, logger);
return await exportChallenge(ptauName, challengeName, logger);
}
// powersoftau challange contribute <curve> <challange> [response]
async function powersOfTawChallangeContribute(params, options) {
let challangeName;
// powersoftau challenge contribute <curve> <challenge> [response]
async function powersOfTawChallengeContribute(params, options) {
let challengeName;
let responseName;
const curve = await getCurveFromName(params[0]);
challangeName = params[1];
challengeName = params[1];
if (params.length < 3) {
responseName = changeExt(challangeName, "response");
responseName = changeExt(challengeName, "response");
} else {
responseName = params[2];
}
if (options.verbose) Logger.setLogLevel("DEBUG");
return await challangeContribute(curve, challangeName, responseName, options.entropy, logger);
return await challengeContribute(curve, challengeName, responseName, options.entropy, logger);
}
@ -6472,22 +6472,22 @@ async function zkeyBeacon(params, options) {
}
// zkey challange contribute <curve> <challange> [response]",
// zkey challenge contribute <curve> <challenge> [response]",
async function zkeyBellmanContribute(params, options) {
let challangeName;
let challengeName;
let responseName;
const curve = await getCurveFromName(params[0]);
challangeName = params[1];
challengeName = params[1];
if (params.length < 3) {
responseName = changeExt(challangeName, "response");
responseName = changeExt(challengeName, "response");
} else {
responseName = params[2];
}
if (options.verbose) Logger.setLogLevel("DEBUG");
return bellmanContribute(curve, challangeName, responseName, options.entropy, logger);
return bellmanContribute(curve, challengeName, responseName, options.entropy, logger);
}

@ -1588,12 +1588,12 @@ function hashToG2(curve, hash) {
return g2_sp;
}
function getG2sp(curve, persinalization, challange, g1s, g1sx) {
function getG2sp(curve, persinalization, challenge, g1s, g1sx) {
const h = Blake2b(64);
const b1 = new Uint8Array([persinalization]);
h.update(b1);
h.update(challange);
h.update(challenge);
const b3 = curve.G1.toUncompressed(g1s);
h.update( b3);
const b4 = curve.G1.toUncompressed(g1sx);
@ -1603,15 +1603,15 @@ function getG2sp(curve, persinalization, challange, g1s, g1sx) {
return hashToG2(curve, hash);
}
function calculatePubKey(k, curve, personalization, challangeHash, rng ) {
function calculatePubKey(k, curve, personalization, challengeHash, rng ) {
k.g1_s = curve.G1.toAffine(curve.G1.fromRng(rng));
k.g1_sx = curve.G1.toAffine(curve.G1.timesFr(k.g1_s, k.prvKey));
k.g2_sp = curve.G2.toAffine(getG2sp(curve, personalization, challangeHash, k.g1_s, k.g1_sx));
k.g2_sp = curve.G2.toAffine(getG2sp(curve, personalization, challengeHash, k.g1_s, k.g1_sx));
k.g2_spx = curve.G2.toAffine(curve.G2.timesFr(k.g2_sp, k.prvKey));
return k;
}
function createPTauKey(curve, challangeHash, rng) {
function createPTauKey(curve, challengeHash, rng) {
const key = {
tau: {},
alpha: {},
@ -1620,9 +1620,9 @@ function createPTauKey(curve, challangeHash, rng) {
key.tau.prvKey = curve.Fr.fromRng(rng);
key.alpha.prvKey = curve.Fr.fromRng(rng);
key.beta.prvKey = curve.Fr.fromRng(rng);
calculatePubKey(key.tau, curve, 0, challangeHash, rng);
calculatePubKey(key.alpha, curve, 1, challangeHash, rng);
calculatePubKey(key.beta, curve, 2, challangeHash, rng);
calculatePubKey(key.tau, curve, 0, challengeHash, rng);
calculatePubKey(key.alpha, curve, 1, challengeHash, rng);
calculatePubKey(key.beta, curve, 2, challengeHash, rng);
return key;
}
@ -1773,7 +1773,7 @@ async function readContribution$1(fd, curve) {
c.betaG2 = await readG2();
c.key = await readPtauPubKey(fd, curve, true);
c.partialHash = await fd.read(216);
c.nextChallange = await fd.read(64);
c.nextChallenge = await fd.read(64);
c.type = await fd.readULE32();
const buffV = new Uint8Array(curve.G1.F.n8*2*6+curve.G2.F.n8*2*3);
@ -1856,7 +1856,7 @@ async function writeContribution$1(fd, curve, contribution) {
await writeG2(contribution.betaG2);
await writePtauPubKey(fd, curve, contribution.key, true);
await fd.write(contribution.partialHash);
await fd.write(contribution.nextChallange);
await fd.write(contribution.nextChallenge);
await fd.writeULE32(contribution.type || 0);
const params = [];
@ -1913,8 +1913,8 @@ async function writeContributions(fd, curve, contributions) {
fd.pos = oldPos;
}
function calculateFirstChallangeHash(curve, power, logger) {
if (logger) logger.debug("Calculating First Challange Hash");
function calculateFirstChallengeHash(curve, power, logger) {
if (logger) logger.debug("Calculating First Challenge Hash");
const hasher = new Blake2b(64);
@ -1960,11 +1960,11 @@ function calculateFirstChallangeHash(curve, power, logger) {
}
function keyFromBeacon(curve, challangeHash, beaconHash, numIterationsExp) {
function keyFromBeacon(curve, challengeHash, beaconHash, numIterationsExp) {
const rng = rngFromBeaconParams(beaconHash, numIterationsExp);
const key = createPTauKey(curve, challangeHash, rng);
const key = createPTauKey(curve, challengeHash, rng);
return key;
}
@ -2013,7 +2013,7 @@ contributions(7)
beta_g1sx
beta_g1spx
partialHash (216 bytes) See https://github.com/mafintosh/blake2b-wasm/blob/23bee06945806309977af802bc374727542617c7/blake2b.wat#L9
hashNewChallange
hashNewChallenge
]
*/
@ -2082,40 +2082,40 @@ async function newAccumulator(curve, power, fileName, logger) {
await fd.close();
const firstChallangeHash = calculateFirstChallangeHash(curve, power, logger);
const firstChallengeHash = calculateFirstChallengeHash(curve, power, logger);
if (logger) logger.debug(formatHash(Blake2b(64).digest(), "Blank Contribution Hash:"));
if (logger) logger.info(formatHash(firstChallangeHash, "First Contribution Hash:"));
if (logger) logger.info(formatHash(firstChallengeHash, "First Contribution Hash:"));
return firstChallangeHash;
return firstChallengeHash;
}
// Format of the output
async function exportChallange(pTauFilename, challangeFilename, logger) {
async function exportChallenge(pTauFilename, challengeFilename, logger) {
await Blake2b.ready();
const {fd: fdFrom, sections} = await readBinFile(pTauFilename, "ptau", 1);
const {curve, power} = await readPTauHeader(fdFrom, sections);
const contributions = await readContributions(fdFrom, curve, sections);
let lastResponseHash, curChallangeHash;
let lastResponseHash, curChallengeHash;
if (contributions.length == 0) {
lastResponseHash = Blake2b(64).digest();
curChallangeHash = calculateFirstChallangeHash(curve, power);
curChallengeHash = calculateFirstChallengeHash(curve, power);
} else {
lastResponseHash = contributions[contributions.length-1].responseHash;
curChallangeHash = contributions[contributions.length-1].nextChallange;
curChallengeHash = contributions[contributions.length-1].nextChallenge;
}
if (logger) logger.info(formatHash(lastResponseHash, "Last Response Hash: "));
if (logger) logger.info(formatHash(curChallangeHash, "New Challange Hash: "));
if (logger) logger.info(formatHash(curChallengeHash, "New Challenge Hash: "));
const fdTo = await createOverride(challangeFilename);
const fdTo = await createOverride(challengeFilename);
const toHash = Blake2b(64);
await fdTo.write(lastResponseHash);
@ -2130,16 +2130,16 @@ async function exportChallange(pTauFilename, challangeFilename, logger) {
await fdFrom.close();
await fdTo.close();
const calcCurChallangeHash = toHash.digest();
const calcCurChallengeHash = toHash.digest();
if (!hashIsEqual (curChallangeHash, calcCurChallangeHash)) {
if (logger) logger.info(formatHash(calcCurChallangeHash, "Calc Curret Challange Hash: "));
if (!hashIsEqual (curChallengeHash, calcCurChallengeHash)) {
if (logger) logger.info(formatHash(calcCurChallengeHash, "Calc Current Challenge Hash: "));
if (logger) logger.error("PTau file is corrupted. Calculated new challange hash does not match with the eclared one");
throw new Error("PTau file is corrupted. Calculated new challange hash does not match with the eclared one");
if (logger) logger.error("PTau file is corrupted. Calculated new challenge hash does not match with the eclared one");
throw new Error("PTau file is corrupted. Calculated new challenge hash does not match with the eclared one");
}
return curChallangeHash;
return curChallengeHash;
async function exportSection(sectionId, groupName, nPoints, sectionName) {
const G = curve[groupName];
@ -2190,12 +2190,12 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
sG1*6 + sG2*3)
throw new Error("Size of the contribution is invalid");
let lastChallangeHash;
let lastChallengeHash;
if (contributions.length>0) {
lastChallangeHash = contributions[contributions.length-1].nextChallange;
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallangeHash = calculateFirstChallangeHash(curve, power, logger);
lastChallengeHash = calculateFirstChallengeHash(curve, power, logger);
}
const fdNew = await createBinFile(newPTauFilename, "ptau", 1, 7);
@ -2203,7 +2203,7 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
const contributionPreviousHash = await fdResponse.read(64);
if(!hashIsEqual(contributionPreviousHash,lastChallangeHash))
if(!hashIsEqual(contributionPreviousHash,lastChallengeHash))
throw new Error("Wrong contribution. this contribution is not based on the previus hash");
const hasherResponse = new Blake2b(64);
@ -2234,8 +2234,8 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
if (logger) logger.info(formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallangeHasher = new Blake2b(64);
nextChallangeHasher.update(hashResponse);
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1", logger);
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2", logger);
@ -2243,9 +2243,9 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1", logger);
await hashSection(fdNew, "G2", 6, 1 , "betaG2", logger);
currentContribution.nextChallange = nextChallangeHasher.digest();
currentContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(formatHash(currentContribution.nextChallange, "Next Challange Hash: "));
if (logger) logger.info(formatHash(currentContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(currentContribution);
@ -2255,7 +2255,7 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
await fdNew.close();
await fdOld.close();
return currentContribution.nextChallange;
return currentContribution.nextChallenge;
async function processSection(fdFrom, fdTo, groupName, sectionId, nPoints, singularPointIndexes, sectionName) {
@ -2312,7 +2312,7 @@ async function importResponse(oldPtauFilename, contributionFilename, newPTauFile
const buffU = await G.batchLEMtoU(buffLEM);
nextChallangeHasher.update(buffU);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;
@ -2325,97 +2325,97 @@ const sameRatio$1 = sameRatio;
async function verifyContribution(curve, cur, prev, logger) {
let sr;
if (cur.type == 1) { // Verify the beacon.
const beaconKey = keyFromBeacon(curve, prev.nextChallange, cur.beaconHash, cur.numIterationsExp);
const beaconKey = keyFromBeacon(curve, prev.nextChallenge, cur.beaconHash, cur.numIterationsExp);
if (!curve.G1.eq(cur.key.tau.g1_s, beaconKey.tau.g1_s)) {
if (logger) logger.error(`BEACON key (tauG1_s) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (tauG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.tau.g1_sx, beaconKey.tau.g1_sx)) {
if (logger) logger.error(`BEACON key (tauG1_sx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (tauG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.tau.g2_spx, beaconKey.tau.g2_spx)) {
if (logger) logger.error(`BEACON key (tauG2_spx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (tauG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.alpha.g1_s, beaconKey.alpha.g1_s)) {
if (logger) logger.error(`BEACON key (alphaG1_s) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (alphaG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.alpha.g1_sx, beaconKey.alpha.g1_sx)) {
if (logger) logger.error(`BEACON key (alphaG1_sx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (alphaG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.alpha.g2_spx, beaconKey.alpha.g2_spx)) {
if (logger) logger.error(`BEACON key (alphaG2_spx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (alphaG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.beta.g1_s, beaconKey.beta.g1_s)) {
if (logger) logger.error(`BEACON key (betaG1_s) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (betaG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.beta.g1_sx, beaconKey.beta.g1_sx)) {
if (logger) logger.error(`BEACON key (betaG1_sx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (betaG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.beta.g2_spx, beaconKey.beta.g2_spx)) {
if (logger) logger.error(`BEACON key (betaG2_spx) is not generated correctly in challange #${cur.id} ${cur.name || ""}` );
if (logger) logger.error(`BEACON key (betaG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
}
cur.key.tau.g2_sp = curve.G2.toAffine(getG2sp(curve, 0, prev.nextChallange, cur.key.tau.g1_s, cur.key.tau.g1_sx));
cur.key.alpha.g2_sp = curve.G2.toAffine(getG2sp(curve, 1, prev.nextChallange, cur.key.alpha.g1_s, cur.key.alpha.g1_sx));
cur.key.beta.g2_sp = curve.G2.toAffine(getG2sp(curve, 2, prev.nextChallange, cur.key.beta.g1_s, cur.key.beta.g1_sx));
cur.key.tau.g2_sp = curve.G2.toAffine(getG2sp(curve, 0, prev.nextChallenge, cur.key.tau.g1_s, cur.key.tau.g1_sx));
cur.key.alpha.g2_sp = curve.G2.toAffine(getG2sp(curve, 1, prev.nextChallenge, cur.key.alpha.g1_s, cur.key.alpha.g1_sx));
cur.key.beta.g2_sp = curve.G2.toAffine(getG2sp(curve, 2, prev.nextChallenge, cur.key.beta.g1_s, cur.key.beta.g1_sx));
sr = await sameRatio$1(curve, cur.key.tau.g1_s, cur.key.tau.g1_sx, cur.key.tau.g2_sp, cur.key.tau.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (tau) in challange #"+cur.id);
if (logger) logger.error("INVALID key (tau) in challenge #"+cur.id);
return false;
}
sr = await sameRatio$1(curve, cur.key.alpha.g1_s, cur.key.alpha.g1_sx, cur.key.alpha.g2_sp, cur.key.alpha.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (alpha) in challange #"+cur.id);
if (logger) logger.error("INVALID key (alpha) in challenge #"+cur.id);
return false;
}
sr = await sameRatio$1(curve, cur.key.beta.g1_s, cur.key.beta.g1_sx, cur.key.beta.g2_sp, cur.key.beta.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (beta) in challange #"+cur.id);
if (logger) logger.error("INVALID key (beta) in challenge #"+cur.id);
return false;
}
sr = await sameRatio$1(curve, prev.tauG1, cur.tauG1, cur.key.tau.g2_sp, cur.key.tau.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID tau*G1. challange #"+cur.id+" It does not follow the previous contribution");
if (logger) logger.error("INVALID tau*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio$1(curve, cur.key.tau.g1_s, cur.key.tau.g1_sx, prev.tauG2, cur.tauG2);
if (sr !== true) {
if (logger) logger.error("INVALID tau*G2. challange #"+cur.id+" It does not follow the previous contribution");
if (logger) logger.error("INVALID tau*G2. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio$1(curve, prev.alphaG1, cur.alphaG1, cur.key.alpha.g2_sp, cur.key.alpha.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID alpha*G1. challange #"+cur.id+" It does not follow the previous contribution");
if (logger) logger.error("INVALID alpha*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio$1(curve, prev.betaG1, cur.betaG1, cur.key.beta.g2_sp, cur.key.beta.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID beta*G1. challange #"+cur.id+" It does not follow the previous contribution");
if (logger) logger.error("INVALID beta*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio$1(curve, cur.key.beta.g1_s, cur.key.beta.g1_sx, prev.betaG2, cur.betaG2);
if (sr !== true) {
if (logger) logger.error("INVALID beta*G2. challange #"+cur.id+"It does not follow the previous contribution");
if (logger) logger.error("INVALID beta*G2. challenge #"+cur.id+"It does not follow the previous contribution");
return false;
}
@ -2441,7 +2441,7 @@ async function verify(tauFilename, logger) {
alphaG1: curve.G1.g,
betaG1: curve.G1.g,
betaG2: curve.G2.g,
nextChallange: calculateFirstChallangeHash(curve, ceremonyPower, logger),
nextChallenge: calculateFirstChallengeHash(curve, ceremonyPower, logger),
responseHash: Blake2b(64).digest()
};
@ -2465,7 +2465,7 @@ async function verify(tauFilename, logger) {
const nextContributionHasher = Blake2b(64);
nextContributionHasher.update(curContr.responseHash);
// Verify powers and compute nextChallangeHash
// Verify powers and compute nextChallengeHash
// await test();
@ -2541,13 +2541,13 @@ async function verify(tauFilename, logger) {
const nextContributionHash = nextContributionHasher.digest();
// Check the nextChallangeHash
if (!hashIsEqual(nextContributionHash,curContr.nextChallange)) {
if (logger) logger.error("Hash of the values does not match the next challange of the last contributor in the contributions section");
// Check the nextChallengeHash
if (!hashIsEqual(nextContributionHash,curContr.nextChallenge)) {
if (logger) logger.error("Hash of the values does not match the next challenge of the last contributor in the contributions section");
return false;
}
if (logger) logger.info(formatHash(nextContributionHash, "Next challange hash: "));
if (logger) logger.info(formatHash(nextContributionHash, "Next challenge hash: "));
// Verify Previous contributions
@ -2587,7 +2587,7 @@ async function verify(tauFilename, logger) {
logger.info("-----------------------------------------------------");
logger.info(`Contribution #${curContr.id}: ${curContr.name ||""}`);
logger.info(formatHash(curContr.nextChallange, "Next Challange: "));
logger.info(formatHash(curContr.nextChallenge, "Next Challenge: "));
const buffV = new Uint8Array(curve.G1.F.n8*2*6+curve.G2.F.n8*2*3);
toPtauPubKeyRpr(buffV, 0, curve, curContr.key, false);
@ -2599,7 +2599,7 @@ async function verify(tauFilename, logger) {
logger.info(formatHash(responseHash, "Response Hash:"));
logger.info(formatHash(prevContr.nextChallange, "Response Hash:"));
logger.info(formatHash(prevContr.nextChallenge, "Response Hash:"));
if (curContr.type == 1) {
logger.info(`Beacon generator: ${byteArray2hex(curContr.beaconHash)}`);
@ -2756,7 +2756,7 @@ async function verify(tauFilename, logger) {
This function creates a new section in the fdTo file with id idSection.
It multiplies the points in fdFrom by first, first*inc, first*inc^2, ...
nPoints times.
It also updates the newChallengeHasher with the new points
*/
async function applyKeyToSection(fdOld, sections, fdNew, idSection, curve, groupName, first, inc, sectionName, logger) {
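/*
  A minimal sketch of the per-point update described in the comment above,
  assuming only the ffjavascript-style Fr/G APIs this file already uses;
  `points` stands for the chunk buffer being rewritten:

      let t = first;                            // current multiplier
      for (let i = 0; i < nPoints; i++) {
          points[i] = G.timesFr(points[i], t);  // P_i' = t * P_i
          t = curve.Fr.mul(t, inc);             // t <- t * inc
      }
*/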
@ -2785,7 +2785,7 @@ async function applyKeyToSection(fdOld, sections, fdNew, idSection, curve, group
async function applyKeyToChallengeSection(fdOld, fdNew, responseHasher, curve, groupName, nPoints, first, inc, formatOut, sectionName, logger) {
const G = curve[groupName];
const sG = G.F.n8*2;
const chunkSize = Math.floor((1<<20) / sG); // ~1MiB chunks
@ -2811,10 +2811,10 @@ async function applyKeyToChallangeSection(fdOld, fdNew, responseHasher, curve, g
// Format of the output
async function challengeContribute(curve, challengeFilename, responesFileName, entropy, logger) {
await Blake2b.ready();
const fdFrom = await readExisting$1(challengeFilename);
const sG1 = curve.F1.n64*8*2;
@ -2835,21 +2835,21 @@ async function challangeContribute(curve, challangeFilename, responesFileName, e
const fdTo = await createOverride(responesFileName);
// Calculate the hash
if (logger) logger.debug("Hashing challenge");
const challengeHasher = Blake2b(64);
for (let i=0; i<fdFrom.totalSize; i+= fdFrom.pageSize) {
const s = Math.min(fdFrom.totalSize - i, fdFrom.pageSize);
const buff = await fdFrom.read(s);
challengeHasher.update(buff);
}
const claimedHash = await fdFrom.read(64, 0);
if (logger) logger.info(formatHash(claimedHash, "Claimed Previous Response Hash: "));
const challengeHash = challengeHasher.digest();
if (logger) logger.info(formatHash(challengeHash, "Current Challenge Hash: "));
const key = createPTauKey(curve, challengeHash, rng);
if (logger) {
["tau", "alpha", "beta"].forEach( (k) => {
@ -2863,14 +2863,14 @@ async function challangeContribute(curve, challangeFilename, responesFileName, e
const responseHasher = Blake2b(64);
await fdTo.write(challengeHash);
responseHasher.update(challengeHash);
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power)*2-1, curve.Fr.one , key.tau.prvKey, "COMPRESSED", "tauG1" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G2", (1<<power) , curve.Fr.one , key.tau.prvKey, "COMPRESSED", "tauG2" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power) , key.alpha.prvKey, key.tau.prvKey, "COMPRESSED", "alphaTauG1", logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power) , key.beta.prvKey , key.tau.prvKey, "COMPRESSED", "betaTauG1" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G2", 1 , key.beta.prvKey , key.tau.prvKey, "COMPRESSED", "betaTauG2" , logger );
// Write and hash key
const buffKey = new Uint8Array(curve.F1.n8*2*6+curve.F2.n8*2*3);
@ -2923,18 +2923,18 @@ async function beacon(oldPtauFilename, newPTauFilename, name, beaconHashStr,num
beaconHash: beaconHash
};
let lastChallengeHash;
if (contributions.length>0) {
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallengeHash = calculateFirstChallengeHash(curve, power, logger);
}
curContribution.key = keyFromBeacon(curve, lastChallengeHash, beaconHash, numIterationsExp);
const responseHasher = new Blake2b(64);
responseHasher.update(lastChallengeHash);
const fdNew = await createBinFile(newPTauFilename, "ptau", 1, 7);
await writePTauHeader(fdNew, curve, power);
@ -2964,8 +2964,8 @@ async function beacon(oldPtauFilename, newPTauFilename, name, beaconHashStr,num
if (logger) logger.info(formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1", logger);
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2", logger);
@ -2973,9 +2973,9 @@ async function beacon(oldPtauFilename, newPTauFilename, name, beaconHashStr,num
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1", logger);
await hashSection(fdNew, "G2", 6, 1 , "betaG2", logger);
curContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(formatHash(curContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(curContribution);
@ -3045,7 +3045,7 @@ async function beacon(oldPtauFilename, newPTauFilename, name, beaconHashStr,num
const buffU = await G.batchLEMtoU(buffLEM);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;
@ -3072,24 +3072,24 @@ async function contribute(oldPtauFilename, newPTauFilename, name, entropy, logge
type: 0, // Normal contribution (type 1 is a beacon)
};
let lastChallengeHash;
const rng = await getRandomRng(entropy);
if (contributions.length>0) {
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallengeHash = calculateFirstChallengeHash(curve, power, logger);
}
// Generate a random key
curContribution.key = createPTauKey(curve, lastChallengeHash, rng);
const responseHasher = new Blake2b(64);
responseHasher.update(lastChallengeHash);
const fdNew = await createBinFile(newPTauFilename, "ptau", 1, 7);
await writePTauHeader(fdNew, curve, power);
@ -3119,8 +3119,8 @@ async function contribute(oldPtauFilename, newPTauFilename, name, entropy, logge
if (logger) logger.info(formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1");
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2");
@ -3128,9 +3128,9 @@ async function contribute(oldPtauFilename, newPTauFilename, name, entropy, logge
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1");
await hashSection(fdNew, "G2", 6, 1 , "betaG2");
curContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(formatHash(curContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(curContribution);
@ -3200,7 +3200,7 @@ async function contribute(oldPtauFilename, newPTauFilename, name, entropy, logge
const buffU = await G.batchLEMtoU(buffLEM);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;
@ -3420,10 +3420,10 @@ async function exportJson(pTauFilename, verbose) {
var powersoftau = /*#__PURE__*/Object.freeze({
__proto__: null,
newAccumulator: newAccumulator,
exportChallenge: exportChallenge,
importResponse: importResponse,
verify: verify,
challengeContribute: challengeContribute,
beacon: beacon,
contribute: contribute,
preparePhase2: preparePhase2,
@ -5076,7 +5076,7 @@ async function zkeyExportJson(zkeyFileName, verbose) {
// Format of the output
async function bellmanContribute(curve, challengeFilename, responesFileName, entropy, logger) {
await Blake2b.ready();
const rng = await getRandomRng(entropy);
@ -5087,7 +5087,7 @@ async function bellmanContribute(curve, challangeFilename, responesFileName, ent
const sG1 = curve.G1.F.n8*2;
const sG2 = curve.G2.F.n8*2;
const fdFrom = await readExisting$1(challengeFilename);
const fdTo = await createOverride(responesFileName);
@ -5110,12 +5110,12 @@ async function bellmanContribute(curve, challangeFilename, responesFileName, ent
// H
const nH = await fdFrom.readUBE32();
await fdTo.writeUBE32(nH);
await applyKeyToChallengeSection(fdFrom, fdTo, null, curve, "G1", nH, invDelta, curve.Fr.e(1), "UNCOMPRESSED", "H", logger);
// L
const nL = await fdFrom.readUBE32();
await fdTo.writeUBE32(nL);
await applyKeyToChallengeSection(fdFrom, fdTo, null, curve, "G1", nL, invDelta, curve.Fr.e(1), "UNCOMPRESSED", "L", logger);
// A
const nA = await fdFrom.readUBE32();

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

cli.js

@ -59,18 +59,18 @@ const commands = [
action: powersOfTawContribute
},
{
cmd: "powersoftau export challenge <powersoftau_0000.ptau> [challenge]",
description: "Creates a challenge",
alias: ["ptec"],
options: "-verbose|v",
action: powersOfTawExportChallenge
},
{
cmd: "powersoftau challenge contribute <curve> <challenge> [response]",
description: "Contribute to a challenge",
alias: ["ptcc"],
options: "-verbose|v -entropy|e",
action: powersOfTawChallengeContribute
},
{
cmd: "powersoftau import response <powersoftau_old.ptau> <response> <powersoftau_new.ptau>",
@ -614,41 +614,41 @@ async function powersOfTawNew(params, options) {
return await powersOfTaw.newAccumulator(curve, power, ptauName, logger);
}
async function powersOfTawExportChallenge(params, options) {
let ptauName;
let challengeName;
ptauName = params[0];
if (params.length < 2) {
challengeName = "challenge";
} else {
challengeName = params[1];
}
if (options.verbose) Logger.setLogLevel("DEBUG");
return await powersOfTaw.exportChallenge(ptauName, challengeName, logger);
}
// powersoftau challenge contribute <curve> <challenge> [response]
async function powersOfTawChallengeContribute(params, options) {
let challengeName;
let responseName;
const curve = await curves.getCurveFromName(params[0]);
challengeName = params[1];
if (params.length < 3) {
responseName = changeExt(challengeName, "response");
} else {
responseName = params[2];
}
if (options.verbose) Logger.setLogLevel("DEBUG");
return await powersOfTaw.challengeContribute(curve, challengeName, responseName, options.entropy, logger);
}
@ -884,23 +884,23 @@ async function zkeyBeacon(params, options) {
}
// zkey challenge contribute <curve> <challenge> [response]
async function zkeyBellmanContribute(params, options) {
let challengeName;
let responseName;
const curve = await curves.getCurveFromName(params[0]);
challengeName = params[1];
if (params.length < 3) {
responseName = changeExt(challengeName, "response");
} else {
responseName = params[2];
}
if (options.verbose) Logger.setLogLevel("DEBUG");
return zkey.bellmanContribute(curve, challengeName, responseName, options.entropy, logger);
}

@ -17,12 +17,12 @@ export function hashToG2(curve, hash) {
return g2_sp;
}
export function getG2sp(curve, persinalization, challenge, g1s, g1sx) {
const h = blake2b(64);
const b1 = new Uint8Array([persinalization]);
h.update(b1);
h.update(challenge);
const b3 = curve.G1.toUncompressed(g1s);
h.update( b3);
const b4 = curve.G1.toUncompressed(g1sx);
@ -32,15 +32,15 @@ export function getG2sp(curve, persinalization, challange, g1s, g1sx) {
return hashToG2(curve, hash);
}
function calculatePubKey(k, curve, personalization, challengeHash, rng ) {
k.g1_s = curve.G1.toAffine(curve.G1.fromRng(rng));
k.g1_sx = curve.G1.toAffine(curve.G1.timesFr(k.g1_s, k.prvKey));
k.g2_sp = curve.G2.toAffine(getG2sp(curve, personalization, challengeHash, k.g1_s, k.g1_sx));
k.g2_spx = curve.G2.toAffine(curve.G2.timesFr(k.g2_sp, k.prvKey));
return k;
}
export function createPTauKey(curve, challengeHash, rng) {
const key = {
tau: {},
alpha: {},
@ -49,9 +49,9 @@ export function createPTauKey(curve, challangeHash, rng) {
key.tau.prvKey = curve.Fr.fromRng(rng);
key.alpha.prvKey = curve.Fr.fromRng(rng);
key.beta.prvKey = curve.Fr.fromRng(rng);
calculatePubKey(key.tau, curve, 0, challengeHash, rng);
calculatePubKey(key.alpha, curve, 1, challengeHash, rng);
calculatePubKey(key.beta, curve, 2, challengeHash, rng);
return key;
}
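/*
  A sketch of how these keys are meant to be checked, using only the names
  defined above: each sub-key is a Schnorr-style proof of knowledge of
  prvKey, verifiable from public data alone:

      const g2_sp = getG2sp(curve, personalization, challengeHash, k.g1_s, k.g1_sx);
      // accept iff sameRatio(curve, k.g1_s, k.g1_sx, g2_sp, k.g2_spx) === true,
      // i.e. iff e(g1_s, g2_spx) == e(g1_sx, g2_sp)

  which is the check verifyContribution applies to tau, alpha and beta.
*/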

@ -5,7 +5,7 @@ import * as binFileUtils from "./binfileutils.js";
This function creates a new section in the fdTo file with id idSection.
It multiplies the points in fdFrom by first, first*inc, first*inc^2, ...
nPoints times.
It also updates the newChallengeHasher with the new points
*/
export async function applyKeyToSection(fdOld, sections, fdNew, idSection, curve, groupName, first, inc, sectionName, logger) {
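/*
  A hypothetical call site (the concrete arguments are illustrative
  assumptions, not taken from this commit): applying a fresh tau key to the
  tauG1 section (section id 2) of a ptau file could look like

      await applyKeyToSection(fdOld, sections, fdNew, 2, curve, "G1",
                              curve.Fr.one, key.tau.prvKey, "tauG1", logger);

  so point k ends up multiplied by tau^k, following the first/inc
  progression described above.
*/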
@ -34,7 +34,7 @@ export async function applyKeyToSection(fdOld, sections, fdNew, idSection, curve
export async function applyKeyToChallengeSection(fdOld, fdNew, responseHasher, curve, groupName, nPoints, first, inc, formatOut, sectionName, logger) {
const G = curve[groupName];
const sG = G.F.n8*2;
const chunkSize = Math.floor((1<<20) / sG); // ~1MiB chunks

@ -1,9 +1,9 @@
export {default as newAccumulator} from "./powersoftau_new.js";
export {default as exportChallenge} from "./powersoftau_export_challenge.js";
export {default as importResponse} from "./powersoftau_import.js";
export {default as verify} from "./powersoftau_verify.js";
export {default as challengeContribute} from "./powersoftau_challenge_contribute.js";
export {default as beacon} from "./powersoftau_beacon.js";
export {default as contribute} from "./powersoftau_contribute.js";
export {default as preparePhase2} from "./powersoftau_preparephase2.js";

@ -44,18 +44,18 @@ export default async function beacon(oldPtauFilename, newPTauFilename, name, be
beaconHash: beaconHash
};
let lastChallengeHash;
if (contributions.length>0) {
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallengeHash = utils.calculateFirstChallengeHash(curve, power, logger);
}
curContribution.key = utils.keyFromBeacon(curve, lastChallengeHash, beaconHash, numIterationsExp);
const responseHasher = new Blake2b(64);
responseHasher.update(lastChallengeHash);
const fdNew = await binFileUtils.createBinFile(newPTauFilename, "ptau", 1, 7);
await utils.writePTauHeader(fdNew, curve, power);
@ -85,8 +85,8 @@ export default async function beacon(oldPtauFilename, newPTauFilename, name, be
if (logger) logger.info(misc.formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1", logger);
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2", logger);
@ -94,9 +94,9 @@ export default async function beacon(oldPtauFilename, newPTauFilename, name, be
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1", logger);
await hashSection(fdNew, "G2", 6, 1 , "betaG2", logger);
curContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(misc.formatHash(curContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(curContribution);
@ -166,7 +166,7 @@ export default async function beacon(oldPtauFilename, newPTauFilename, name, be
const buffU = await G.batchLEMtoU(buffLEM);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;

@ -20,13 +20,13 @@ import * as fastFile from "fastfile";
import Blake2b from "blake2b-wasm";
import * as utils from "./powersoftau_utils.js";
import * as misc from "./misc.js";
import { applyKeyToChallengeSection } from "./mpc_applykey.js";
import * as keyPair from "./keypair.js";
export default async function challengeContribute(curve, challengeFilename, responesFileName, entropy, logger) {
await Blake2b.ready();
const fdFrom = await fastFile.readExisting(challengeFilename);
const sG1 = curve.F1.n64*8*2;
@ -47,21 +47,21 @@ export default async function challangeContribute(curve, challangeFilename, resp
const fdTo = await fastFile.createOverride(responesFileName);
// Calculate the hash
if (logger) logger.debug("Hashing challenge");
const challengeHasher = Blake2b(64);
for (let i=0; i<fdFrom.totalSize; i+= fdFrom.pageSize) {
const s = Math.min(fdFrom.totalSize - i, fdFrom.pageSize);
const buff = await fdFrom.read(s);
challengeHasher.update(buff);
}
const claimedHash = await fdFrom.read(64, 0);
if (logger) logger.info(misc.formatHash(claimedHash, "Claimed Previous Response Hash: "));
const challengeHash = challengeHasher.digest();
if (logger) logger.info(misc.formatHash(challengeHash, "Current Challenge Hash: "));
const key = keyPair.createPTauKey(curve, challengeHash, rng);
if (logger) {
["tau", "alpha", "beta"].forEach( (k) => {
@ -75,14 +75,14 @@ export default async function challangeContribute(curve, challangeFilename, resp
const responseHasher = Blake2b(64);
await fdTo.write(challengeHash);
responseHasher.update(challengeHash);
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power)*2-1, curve.Fr.one , key.tau.prvKey, "COMPRESSED", "tauG1" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G2", (1<<power) , curve.Fr.one , key.tau.prvKey, "COMPRESSED", "tauG2" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power) , key.alpha.prvKey, key.tau.prvKey, "COMPRESSED", "alphaTauG1", logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G1", (1<<power) , key.beta.prvKey , key.tau.prvKey, "COMPRESSED", "betaTauG1" , logger );
await applyKeyToChallengeSection(fdFrom, fdTo, responseHasher, curve, "G2", 1 , key.beta.prvKey , key.tau.prvKey, "COMPRESSED", "betaTauG2" , logger );
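// Note: the point counts mirror the ptau layout. The tauG1 section carries
// (1<<power)*2-1 points (tau^0 .. tau^(2n-2), enough to commit to the product
// of two degree-<n polynomials), while tauG2, alphaTauG1 and betaTauG1 need
// only (1<<power) points each, and betaTauG2 is a single point.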
// Write and hash key
const buffKey = new Uint8Array(curve.F1.n8*2*6+curve.F2.n8*2*3);

@ -29,24 +29,24 @@ export default async function contribute(oldPtauFilename, newPTauFilename, name,
type: 0, // Normal contribution (type 1 is a beacon)
};
let lastChallengeHash;
const rng = await misc.getRandomRng(entropy);
if (contributions.length>0) {
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallengeHash = utils.calculateFirstChallengeHash(curve, power, logger);
}
// Generate a random key
curContribution.key = keyPair.createPTauKey(curve, lastChallengeHash, rng);
const responseHasher = new Blake2b(64);
responseHasher.update(lastChallengeHash);
const fdNew = await binFileUtils.createBinFile(newPTauFilename, "ptau", 1, 7);
await utils.writePTauHeader(fdNew, curve, power);
@ -76,8 +76,8 @@ export default async function contribute(oldPtauFilename, newPTauFilename, name,
if (logger) logger.info(misc.formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1");
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2");
@ -85,9 +85,9 @@ export default async function contribute(oldPtauFilename, newPTauFilename, name,
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1");
await hashSection(fdNew, "G2", 6, 1 , "betaG2");
curContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(misc.formatHash(curContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(curContribution);
@ -157,7 +157,7 @@ export default async function contribute(oldPtauFilename, newPTauFilename, name,
const buffU = await G.batchLEMtoU(buffLEM);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;

@ -12,28 +12,28 @@ import * as utils from "./powersoftau_utils.js";
import * as binFileUtils from "./binfileutils.js";
import * as misc from "./misc.js";
export default async function exportChallenge(pTauFilename, challengeFilename, logger) {
await Blake2b.ready();
const {fd: fdFrom, sections} = await binFileUtils.readBinFile(pTauFilename, "ptau", 1);
const {curve, power} = await utils.readPTauHeader(fdFrom, sections);
const contributions = await utils.readContributions(fdFrom, curve, sections);
let lastResponseHash, curChallengeHash;
if (contributions.length == 0) {
lastResponseHash = Blake2b(64).digest();
curChallengeHash = utils.calculateFirstChallengeHash(curve, power);
} else {
lastResponseHash = contributions[contributions.length-1].responseHash;
curChallengeHash = contributions[contributions.length-1].nextChallenge;
}
if (logger) logger.info(misc.formatHash(lastResponseHash, "Last Response Hash: "));
if (logger) logger.info(misc.formatHash(curChallengeHash, "New Challenge Hash: "));
const fdTo = await fastFile.createOverride(challengeFilename);
const toHash = Blake2b(64);
await fdTo.write(lastResponseHash);
@ -48,16 +48,16 @@ export default async function exportChallange(pTauFilename, challangeFilename, l
await fdFrom.close();
await fdTo.close();
const calcCurChallengeHash = toHash.digest();
if (!misc.hashIsEqual (curChallengeHash, calcCurChallengeHash)) {
if (logger) logger.info(misc.formatHash(calcCurChallengeHash, "Calc Current Challenge Hash: "));
if (logger) logger.error("PTau file is corrupted. Calculated new challenge hash does not match with the declared one");
throw new Error("PTau file is corrupted. Calculated new challenge hash does not match with the declared one");
}
return curChallengeHash;
async function exportSection(sectionId, groupName, nPoints, sectionName) {
const G = curve[groupName];

@ -32,12 +32,12 @@ export default async function importResponse(oldPtauFilename, contributionFilena
sG1*6 + sG2*3)
throw new Error("Size of the contribution is invalid");
let lastChallengeHash;
if (contributions.length>0) {
lastChallengeHash = contributions[contributions.length-1].nextChallenge;
} else {
lastChallengeHash = utils.calculateFirstChallengeHash(curve, power, logger);
}
const fdNew = await binFileUtils.createBinFile(newPTauFilename, "ptau", 1, 7);
@ -45,7 +45,7 @@ export default async function importResponse(oldPtauFilename, contributionFilena
const contributionPreviousHash = await fdResponse.read(64);
if(!misc.hashIsEqual(contributionPreviousHash,lastChallengeHash))
throw new Error("Wrong contribution. This contribution is not based on the previous hash");
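// Note: this check is the chain link of the ceremony. A response file begins
// with the 64-byte hash of the challenge it answers (written first by
// challengeContribute), so a response computed against the wrong state is
// rejected here before any points are read.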
const hasherResponse = new Blake2b(64);
@ -76,8 +76,8 @@ export default async function importResponse(oldPtauFilename, contributionFilena
if (logger) logger.info(misc.formatHash(hashResponse, "Contribution Response Hash imported: "));
const nextChallengeHasher = new Blake2b(64);
nextChallengeHasher.update(hashResponse);
await hashSection(fdNew, "G1", 2, (1 << power) * 2 -1, "tauG1", logger);
await hashSection(fdNew, "G2", 3, (1 << power) , "tauG2", logger);
@ -85,9 +85,9 @@ export default async function importResponse(oldPtauFilename, contributionFilena
await hashSection(fdNew, "G1", 5, (1 << power) , "betaTauG1", logger);
await hashSection(fdNew, "G2", 6, 1 , "betaG2", logger);
currentContribution.nextChallenge = nextChallengeHasher.digest();
if (logger) logger.info(misc.formatHash(currentContribution.nextChallenge, "Next Challenge Hash: "));
contributions.push(currentContribution);
@ -97,7 +97,7 @@ export default async function importResponse(oldPtauFilename, contributionFilena
await fdNew.close();
await fdOld.close();
return currentContribution.nextChallenge;
async function processSection(fdFrom, fdTo, groupName, sectionId, nPoints, singularPointIndexes, sectionName) {
@ -154,7 +154,7 @@ export default async function importResponse(oldPtauFilename, contributionFilena
const buffU = await G.batchLEMtoU(buffLEM);
nextChallengeHasher.update(buffU);
}
fdTo.pos = oldPos;

@ -42,7 +42,7 @@ contributions(7)
beta_g1sx
beta_g1spx
partialHash (216 bytes) See https://github.com/mafintosh/blake2b-wasm/blob/23bee06945806309977af802bc374727542617c7/blake2b.wat#L9
hashNewChallenge
]
*/
@ -116,12 +116,12 @@ export default async function newAccumulator(curve, power, fileName, logger) {
await fd.close();
const firstChallengeHash = ptauUtils.calculateFirstChallengeHash(curve, power, logger);
if (logger) logger.debug(misc.formatHash(Blake2b(64).digest(), "Blank Contribution Hash:"));
if (logger) logger.info(misc.formatHash(firstChallengeHash, "First Contribution Hash:"));
return firstChallengeHash;
}

@ -151,7 +151,7 @@ async function readContribution(fd, curve) {
c.betaG2 = await readG2();
c.key = await readPtauPubKey(fd, curve, true);
c.partialHash = await fd.read(216);
c.nextChallenge = await fd.read(64);
c.type = await fd.readULE32();
const buffV = new Uint8Array(curve.G1.F.n8*2*6+curve.G2.F.n8*2*3);
@ -234,7 +234,7 @@ async function writeContribution(fd, curve, contribution) {
await writeG2(contribution.betaG2);
await writePtauPubKey(fd, curve, contribution.key, true);
await fd.write(contribution.partialHash);
await fd.write(contribution.nextChallenge);
await fd.writeULE32(contribution.type || 0);
const params = [];
@ -291,8 +291,8 @@ export async function writeContributions(fd, curve, contributions) {
fd.pos = oldPos;
}
export function calculateFirstChallengeHash(curve, power, logger) {
if (logger) logger.debug("Calculating First Challenge Hash");
const hasher = new Blake2b(64);
@ -338,11 +338,11 @@ export function calculateFirstChallangeHash(curve, power, logger) {
}
export function keyFromBeacon(curve, challengeHash, beaconHash, numIterationsExp) {
const rng = misc.rngFromBeaconParams(beaconHash, numIterationsExp);
const key = keyPair.createPTauKey(curve, challengeHash, rng);
return key;
}
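/*
  Because the beacon key is a deterministic function of public data, anyone
  can re-derive it and compare it with the recorded contribution, which is
  exactly what verifyContribution does. A minimal sketch, assuming the same
  inputs that were published with the contribution:

      const expected = keyFromBeacon(curve, prevChallengeHash, beaconHash, numIterationsExp);
      // e.g. curve.G1.eq(cur.key.tau.g1_s, expected.tau.g1_s) must hold
*/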

@ -10,97 +10,97 @@ const sameRatio = misc.sameRatio;
async function verifyContribution(curve, cur, prev, logger) {
let sr;
if (cur.type == 1) { // Verify the beacon.
const beaconKey = utils.keyFromBeacon(curve, prev.nextChallenge, cur.beaconHash, cur.numIterationsExp);
if (!curve.G1.eq(cur.key.tau.g1_s, beaconKey.tau.g1_s)) {
if (logger) logger.error(`BEACON key (tauG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.tau.g1_sx, beaconKey.tau.g1_sx)) {
if (logger) logger.error(`BEACON key (tauG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.tau.g2_spx, beaconKey.tau.g2_spx)) {
if (logger) logger.error(`BEACON key (tauG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.alpha.g1_s, beaconKey.alpha.g1_s)) {
if (logger) logger.error(`BEACON key (alphaG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.alpha.g1_sx, beaconKey.alpha.g1_sx)) {
if (logger) logger.error(`BEACON key (alphaG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.alpha.g2_spx, beaconKey.alpha.g2_spx)) {
if (logger) logger.error(`BEACON key (alphaG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.beta.g1_s, beaconKey.beta.g1_s)) {
if (logger) logger.error(`BEACON key (betaG1_s) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G1.eq(cur.key.beta.g1_sx, beaconKey.beta.g1_sx)) {
if (logger) logger.error(`BEACON key (betaG1_sx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
if (!curve.G2.eq(cur.key.beta.g2_spx, beaconKey.beta.g2_spx)) {
if (logger) logger.error(`BEACON key (betaG2_spx) is not generated correctly in challenge #${cur.id} ${cur.name || ""}` );
return false;
}
}
cur.key.tau.g2_sp = curve.G2.toAffine(keyPair.getG2sp(curve, 0, prev.nextChallenge, cur.key.tau.g1_s, cur.key.tau.g1_sx));
cur.key.alpha.g2_sp = curve.G2.toAffine(keyPair.getG2sp(curve, 1, prev.nextChallenge, cur.key.alpha.g1_s, cur.key.alpha.g1_sx));
cur.key.beta.g2_sp = curve.G2.toAffine(keyPair.getG2sp(curve, 2, prev.nextChallenge, cur.key.beta.g1_s, cur.key.beta.g1_sx));
sr = await sameRatio(curve, cur.key.tau.g1_s, cur.key.tau.g1_sx, cur.key.tau.g2_sp, cur.key.tau.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (tau) in challenge #"+cur.id);
return false;
}
sr = await sameRatio(curve, cur.key.alpha.g1_s, cur.key.alpha.g1_sx, cur.key.alpha.g2_sp, cur.key.alpha.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (alpha) in challenge #"+cur.id);
return false;
}
sr = await sameRatio(curve, cur.key.beta.g1_s, cur.key.beta.g1_sx, cur.key.beta.g2_sp, cur.key.beta.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID key (beta) in challenge #"+cur.id);
return false;
}
sr = await sameRatio(curve, prev.tauG1, cur.tauG1, cur.key.tau.g2_sp, cur.key.tau.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID tau*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio(curve, cur.key.tau.g1_s, cur.key.tau.g1_sx, prev.tauG2, cur.tauG2);
if (sr !== true) {
if (logger) logger.error("INVALID tau*G2. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio(curve, prev.alphaG1, cur.alphaG1, cur.key.alpha.g2_sp, cur.key.alpha.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID alpha*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio(curve, prev.betaG1, cur.betaG1, cur.key.beta.g2_sp, cur.key.beta.g2_spx);
if (sr !== true) {
if (logger) logger.error("INVALID beta*G1. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
sr = await sameRatio(curve, cur.key.beta.g1_s, cur.key.beta.g1_sx, prev.betaG2, cur.betaG2);
if (sr !== true) {
if (logger) logger.error("INVALID beta*G2. challenge #"+cur.id+" It does not follow the previous contribution");
return false;
}
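// Note: every check above is a single pairing equation.
// sameRatio(curve, A, B, C, D) holds when B = x*A and D = x*C for the same
// unknown x, i.e. when e(A, D) == e(B, C). Chaining it against the previous
// contribution forces cur.tauG1 = tau_i * prev.tauG1 (and likewise for alpha
// and beta) without revealing any exponent.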
@ -126,7 +126,7 @@ export default async function verify(tauFilename, logger) {
alphaG1: curve.G1.g,
betaG1: curve.G1.g,
betaG2: curve.G2.g,
nextChallenge: utils.calculateFirstChallengeHash(curve, ceremonyPower, logger),
responseHash: Blake2b(64).digest()
};
@ -150,7 +150,7 @@ export default async function verify(tauFilename, logger) {
const nextContributionHasher = Blake2b(64);
nextContributionHasher.update(curContr.responseHash);
// Verify powers and compute nextChallengeHash
// await test();
@ -226,13 +226,13 @@ export default async function verify(tauFilename, logger) {
const nextContributionHash = nextContributionHasher.digest();
// Check the nextChallengeHash
if (!misc.hashIsEqual(nextContributionHash,curContr.nextChallenge)) {
if (logger) logger.error("Hash of the values does not match the next challenge of the last contributor in the contributions section");
return false;
}
if (logger) logger.info(misc.formatHash(nextContributionHash, "Next challenge hash: "));
// Verify Previous contributions
@ -272,7 +272,7 @@ export default async function verify(tauFilename, logger) {
logger.info("-----------------------------------------------------");
logger.info(`Contribution #${curContr.id}: ${curContr.name ||""}`);
logger.info(misc.formatHash(curContr.nextChallenge, "Next Challenge: "));
const buffV = new Uint8Array(curve.G1.F.n8*2*6+curve.G2.F.n8*2*3);
utils.toPtauPubKeyRpr(buffV, 0, curve, curContr.key, false);
@ -284,7 +284,7 @@ export default async function verify(tauFilename, logger) {
logger.info(misc.formatHash(responseHash, "Response Hash:"));
logger.info(misc.formatHash(prevContr.nextChallenge, "Prev Challenge Hash:"));
if (curContr.type == 1) {
logger.info(`Beacon generator: ${misc.byteArray2hex(curContr.beaconHash)}`);

@ -20,11 +20,11 @@ import * as fastFile from "fastfile";
import Blake2b from "blake2b-wasm";
import * as utils from "./zkey_utils.js";
import * as misc from "./misc.js";
import { applyKeyToChallengeSection } from "./mpc_applykey.js";
import { hashPubKey } from "./zkey_utils.js";
import { hashToG2 as hashToG2 } from "./keypair.js";
export default async function bellmanContribute(curve, challengeFilename, responesFileName, entropy, logger) {
await Blake2b.ready();
const rng = await misc.getRandomRng(entropy);
@ -35,7 +35,7 @@ export default async function bellmanContribute(curve, challangeFilename, respon
const sG1 = curve.G1.F.n8*2;
const sG2 = curve.G2.F.n8*2;
const fdFrom = await fastFile.readExisting(challengeFilename);
const fdTo = await fastFile.createOverride(responesFileName);
@ -58,12 +58,12 @@ export default async function bellmanContribute(curve, challangeFilename, respon
// H
const nH = await fdFrom.readUBE32();
await fdTo.writeUBE32(nH);
await applyKeyToChallengeSection(fdFrom, fdTo, null, curve, "G1", nH, invDelta, curve.Fr.e(1), "UNCOMPRESSED", "H", logger);
// L
const nL = await fdFrom.readUBE32();
await fdTo.writeUBE32(nL);
await applyKeyToChallengeSection(fdFrom, fdTo, null, curve, "G1", nL, invDelta, curve.Fr.e(1), "UNCOMPRESSED", "L", logger);
// A
const nA = await fdFrom.readUBE32();

@ -12,7 +12,7 @@ describe("Full process", function () {
const ptau_2 = {type: "mem"};
const ptau_beacon = {type: "mem"};
const ptau_final = {type: "mem"};
const ptau_challenge2 = {type: "mem"};
const ptau_response2 = {type: "mem"};
const zkey_0 = {type: "mem"};
const zkey_1 = {type: "mem"};
@ -40,12 +40,12 @@ describe("Full process", function () {
await snarkjs.powersOfTau.contribute(ptau_0, ptau_1, "C1", "Entropy1");
});
it ("powersoftau export challenge", async () => {
await snarkjs.powersOfTau.exportChallenge(ptau_1, ptau_challenge2);
});
it ("powersoftau challenge contribute", async () => {
await snarkjs.powersOfTau.challengeContribute(curve, ptau_challenge2, ptau_response2, "Entropy2");
});
it ("powersoftau import response", async () => {

@ -16,7 +16,7 @@ describe("keypair", () => {
});
it("It should calculate the right g2_s for the test vectors", async () => {
const challenge = hex2ByteArray(
"bc0bde7980381fa642b2097591dd83f1"+
"ed15b003e15c35520af32c95eb519149"+
"2a6f3175215635cfc10e6098e2c612d0"+
@ -33,7 +33,7 @@ describe("keypair", () => {
Scalar.e("0x0c17fd067df52c480a1db3c6890821f975932d89d0d53c6c60777cc56f1dd712"),
Scalar.e("1")
]);
const tau_g2_sp = getG2sp(curve, 0, challenge, tau_g1_s, tau_g1_sx);
const tau_g2_spx = curve.G2.fromObject([
[
@ -64,7 +64,7 @@ describe("keypair", () => {
Scalar.e("0x0b3a94f2b61178f2974e039cfd671e7405ec43eb2c09dc8f43a34f450917a62f"),
Scalar.e("1")
]);
const alpha_g2_sp = getG2sp(curve, 1, challenge, alpha_g1_s, alpha_g1_sx);
const alpha_g2_spx = curve.G2.fromObject([
[
@ -96,7 +96,7 @@ describe("keypair", () => {
Scalar.e("0x12074f06ef232a472cb36c328e760c4acfb4bedad4ca3ee09971578a0fe185ab"),
Scalar.e("1")
]);
const beta_g2_sp = getG2sp(curve, 2, challenge, beta_g1_s, beta_g1_sx);
const beta_g2_spx = curve.G2.fromObject([
[