diff --git a/.gitignore b/.gitignore
index b5437ff..3edd018 100644
--- a/.gitignore
+++ b/.gitignore
@@ -2,6 +2,7 @@
 *.ph1
 *.ph2
 *.r1cs
+*.sol
 evals
 srs.lag
 pk
diff --git a/CEREMONY.md b/CEREMONY.md
deleted file mode 100644
index 39f3103..0000000
--- a/CEREMONY.md
+++ /dev/null
@@ -1,267 +0,0 @@
-# Worldcoin trusted setup ceremony
-
-[World ID](https://worldcoin.org/blog/announcements/introducing-world-id-and-sdk) is a privacy-preserving proof of personhood protocol that leverages the Semaphore protocol in order to prove inclusion of members in a merkle tree where each identity commitment was generated by an orb and its corresponding private key is stored on our users' phones. In the Semaphore protocol implementation there are individual sequential insertions. Individual insertions for big merkle trees are very expensive and therefore economically unfeasible at the World ID userbase scale ([currently nearing 2M](https://worldcoin.org/)). In order to resolve that issue we developed custom circuits written in gnark in order to do batch insertions into the Semaphore merkle tree. This gnark circuit leverages the groth16 proof system on the bn254 curve and requires a custom trusted setup ceremony to be made in order to achieve verifier soundness.
-
-### Specification
-
-For the phase 1 contribution (a.k.a. powers of tau ceremony) we are using the [Perpetual Powers of Tau ceremony](https://github.com/privacy-scaling-explorations/perpetualpowersoftau) (up to the 54th contribution) through the s3 hosted bucket in the [snarkjs repo README](https://github.com/iden3/snarkjs/blob/master/README.md#7-prepare-phase-2). We built a [deserializer](https://github.com/worldcoin/ptau-deserializer) from the `.ptau` format into the `.ph1` format used by gnark and initialized a phase 2 using a fork ([semaphore-mtb-setup](https://github.com/worldcoin/semaphore-mtb-setup)) of a ceremony coordinator wrapper on top of gnark built by the [zkbnb team](https://github.com/bnb-chain/zkbnb-setup).
-
-#### System used
-
-[AWS m5.16xlarge](https://aws.amazon.com/ec2/instance-types/m5/) instance
-
-- 256 GiB RAM
-- 64 cores
-- 500 GiB Volume
-
-### Pre-Contribution (Already done)
-
-The chain of commands that was performed before the first contribution:
-
-
-```bash
-git clone https://github.com/worldcoin/semaphore-mtb-setup
-git clone https://github.com/worldcoin/semaphore-mtb
-```
-
-Download the trusted setup ceremony coordinator tool and the powers of tau files.
-
-```bash
-cd semaphore-mtb-setup && go build -v
-
-# Download Powers of Tau files for each respective circuit
-wget https://hermez.s3-eu-west-1.amazonaws.com/powersOfTau28_hez_final_20.ptau
-mv powersOfTau28_hez_final_20.ptau 20.ptau
-
-wget https://hermez.s3-eu-west-1.amazonaws.com/powersOfTau28_hez_final_23.ptau
-mv powersOfTau28_hez_final_23.ptau 23.ptau
-
-wget https://hermez.s3-eu-west-1.amazonaws.com/powersOfTau28_hez_final_26.ptau
-mv powersOfTau28_hez_final_26.ptau 26.ptau
-
-# Convert .ptau format into .ph1
-./semaphore-mtb-setup p1i 20.ptau 20.ph1
-./semaphore-mtb-setup p1i 23.ptau 23.ph1
-./semaphore-mtb-setup p1i 26.ptau 26.ph1
-
-# go up a folder
-cd ../
-```
-
-Generate r1cs representation of the necessary sizes of the Semaphore Merkle Tree Batcher (SMTB):
-
-- Tree depth: 30, Batch size: 10
-  - constraints: 725439
-  - powers of Tau needed: 20
-- Tree depth: 30, Batch size: 100
-  - constraints: 6305289
-  - powers of Tau needed: 23
-- Tree depth: 30, Batch size: 1000
-  - constraints: 60375789
-  - powers of Tau needed: 26
-
-```bash
-cd semaphore-mtb && go build -v
-
-# requires quite a bit of compute
-./gnark-mbu r1cs --batch-size 10 --tree-depth 30 --output b10t30.r1cs
-./gnark-mbu r1cs --batch-size 100 --tree-depth 30 --output b100t30.r1cs
-./gnark-mbu r1cs --batch-size 1000 --tree-depth 30 --output b1000t30.r1cs
-
-# move the r1cs files into the coordinator folder
-mv b10t30.r1cs b100t30.r1cs b1000t30.r1cs ../semaphore-mtb-setup/
-
-# go up a folder
-cd ../
-```
-
-Initialize the phase 2 of the setup (the slowest process):
-
-```bash
-# make folders for each phase 2
-mkdir b10 b100 b1000
-
-cd semaphore-mtb-setup
-
-# initialize each respective phase2
-./semaphore-mtb-setup p2n 20.ph1 b10t30.r1cs b10t30c0.ph2
-
-mv b10t30c0.ph2 srs.lag evals ../b10/
-
-./semaphore-mtb-setup p2n 23.ph1 b100t30.r1cs b100t30c0.ph2
-
-mv b100t30c0.ph2 srs.lag evals ../b100/
-
-./semaphore-mtb-setup p2n 26.ph1 b1000t30.r1cs b1000t30c0.ph2
-
-mv b1000t30c0.ph2 srs.lag evals ../b1000/
-```
-
-### The Contribution Process (plz help)
-
-
-#### Requirements
-- Go (using 1.20.5)
-- Git
-- \> 16 GiB RAM
-- Good connectivity (to upload and download files fast to s3)
-- The more cores the better (shorter contribution time)
-- \> 10 GiB storage
-
-#### Steps
-
-Download the corresponding files for contributions (presigned AWS S3 bucket urls)
-
-Contribution time (on aws m5.16xlarge 256GiB RAM <4GB used - the more cores the better):
-b10: 5-10sec (50MB file)
-b100: < 5 min (419MB file)
-b1000: 20-30 min (3.5GB file)
-
-You will receive pre-signed URLs from [dcbuilder.eth](https://twitter.com/DCbuild3r) for a GET request of a .ph2 file off of AWS S3 of the form:
-
-> https://.s3.amazonaws.com/?AWSAccessKeyId=&Signature=&Expires=
-
-Submit a GET request using:
-
-```
-curl --output b10t30cXX.ph2
-curl --output b100t30cXX.ph2
-curl --output b1000t30cXX.ph2
-```
-
-where XX is the current contribution number.
-
-Download and build the [`semaphore-mtb-setup`](https://github.com/worldcoin/semaphore-mtb-setup) coordinator tool to perform the contribution:
-
-```
-git clone https://github.com/worldcoin/semaphore-mtb-setup
-cd semaphore-mtb-setup
-go build -v
-```
-
-Perform the contribution for each individual .ph2 file and increase the XX counter by one. Each command will output a contribution hash, please copy each of these down into a file of the format `_CONTRIBUTION.txt` and prepend each value with the corresponding batch size of the .ph2 file you contributed to (b10, b100 or b1000).
-Please also share via a message what NAME or PSEUDONYM you selected since it is required to generate a pre-signed S3 upload URL.
-
-```
-./semaphore-mtb-setup p2c b10t30cXX.ph2 b10t30c(XX + 1).ph2
-```
-```
-./semaphore-mtb-setup p2c b100t30cXX.ph2 b100t30c(XX + 1).ph2
-```
-```
-./semaphore-mtb-setup p2c b1000t30cXX.ph2 b1000t30c(XX + 1).ph2
-```
-
-You will also receive pre-signed URLs to upload your contribution to the S3 bucket. After your contributions are done and you have the output files, upload them using the following commands:
-
-```
-curl -v -T b10t30c(XX + 1).ph2
-curl -v -T b100t30c(XX + 1).ph2
-curl -v -T b1000t30c(XX + 1).ph2
-curl -v -T _CONTRIBUTION.txt
-```
-
-> NOTE: if your file is above 5GiB (shouldn't ever get to that) the request will fail. If that happens, please reach out.
-
-Congratulations! You have successfully contributed to our phase 2 trusted setup ceremony!
-
-# List of contributors
-
-### Batch size 10 SMTB circuit
-
-1. [**dcbuilder.eth**](https://twitter.com/DCbuild3r)
-- contribution hash: `b0b44102bf1201e83bffb1cb0c492cfb93421656c4d2d113840bb904a48936c5`
-- generated file: `b10t30c01.ph2`
-2. [**reldev**](http://twitter.com/reldev)
-- contribution hash: `ed4570a3668448102b8f0fa2c47cdd592281a7dd30e568f112ec784b5f1d96e2`
-- generated file: `b10t30c02.ph2`
-3. [**remco**](https://twitter.com/recmo)
-- contribution hash: `1a0925e14d4c3035d8ee5d288e376e723299b712181b4e34b59a337ebe08761a`
-- generated file: `b10t30c03.ph2`
-4. [**worldbridger**](https://twitter.com/shumochu)
-- contribution hash: `c3092354da6c45b43a81e410364972d635ee563d46c24db77e1dc171f440a9ca`
-- generated file: `b10t30c04.ph2`
-5. [**kobigurk**](https://twitter.com/kobigurk)
-- contribution hash: `d4e581570bbe53ffa9e08cc93816f48486cfd756df50894de74ac1845ae07feb`
-- generated file: `b10t30c05.ph2`
-6. [**kustosz**](https://twitter.com/mmkostrzewa)
-- contribution hash: `7f13b7d526ec05775c9646908da8d99a35f48777fa341f8c25adf099135bb162`
-- generated file: `b10t30c06.ph2`
-7. [**m1guelpf**](https://twitter.com/m1guelpf)
-- contribution hash: `e4ce787995a6c97b8be12f418ae3c026e26f84ad3d433e3fe6f33d3449d08c91`
-- generated file: `b10t30c07.ph2`
-8. [**atris**](https://twitter.com/atris_eth)
-- contribution hash: `807552014597a0c51e4e0f2e9b989be7633d8a050906a894cab226bc2a85333d`
-- generated file: `b10t30c08.ph2`
-9. [**zellic**](https://twitter.com/zellic_io)
-- contribution hash: `e0e690b8acf0dfd8bed0c65c55c1507fd261ac5eb86ef7f111d04b9a35b81587`
-- generated file: `b10t30c09.ph2`
-10. [**eddylazzarin**](https://twitter.com/eddylazzarin)
-- contribution hash: `43c7cb5dd53447967cfe531ac2120948efd90fa5c80eb3b80fb340df452d7e18`
-- generated file: `b10t30c10.ph2`
-
-### Batch size 100 SMTB circuit
-
-1. [**dcbuilder.eth**](https://twitter.com/DCbuild3r)
-- contribution hash: `66e1845efd543078218a92fd575478bd2f71d64b16a3bf427f5032dcc479a808`
-- generated file: `b100t30c01.ph2`
-2. [**reldev**](http://twitter.com/reldev)
-- contribution hash: `b136004f834e6a4db35a5def1050d2aed0b4d41df57cbeb4577eb5c5997f9995`
-- generated file: `b10t30c02.ph2`
-3. [**remco**](https://twitter.com/recmo)
-- contribution hash: `be313375479fa2ce7097129bb187f16df3211dba4e395acdd84e946fb4ae1e4b`
-- generated file: `b100t30c03.ph2`
-4. [**worldbridger**](https://twitter.com/shumochu)
-- contribution hash: `30fac7bc6ff5f2bc24991dd5e6537e203714459c1e6d31c241d512490e966dde`
-- generated file: `b100t30c04.ph2`
-5.
[**kobigurk**](https://twitter.com/kobigurk) -- contribution hash: `e53e7eae98d270b44e25ac8b413e1dcf45a13062c51cb45e6faad089a25d8511` -- generated file: `b100t30c05.ph2` -6. [**kustosz**](https://twitter.com/mmkostrzewa) -- contribution hash: `c202ab3e6a3e29041b1e3781cb1ffdbae7683541d32a91d8d92454b2ce0508c2` -- generated file: `b100t30c06.ph2` -7. [**m1guelpf**](https://twitter.com/m1guelpf) -- contribution hash: `10b5376bfbe4e6235a8b7f9b000699063ee8c5c9626e98dc70ed594f5c2e0326` -- generated file: `b100t30c07.ph2` -8. [**atris**](https://twitter.com/atris_eth) -- contribution hash: `e2b20b3379496da808a502df477b47846e7c6375a40e84e81730cf5d15142e44` -- generated file: `b100t30c08.ph2` -9. [**zellic**](https://twitter.com/zellic_io) -- contribution hash: `6ac522c33145afcbcd05291834f13b0b83833578ca1a13e4ef91a33a187f23e2` -- generated file: `b100t30c09.ph2` -10. [**eddylazzarin**](https://twitter.com/eddylazzarin) -- contribution hash: `3a4112517f6b9082c6d8faf1091164bcd4f818aeb7e53cb8ec2f5aa1eabdd05d` -- generated file: `b100t30c10.ph2` - -### Batch size 1000 SMTB circuit - -1. [**dcbuilder.eth**](https://twitter.com/DCbuild3r) -- contribution hash: `6a1d0b08bf79fe564cd701e9af70bb5f3936f668869f529e83d8ecb1a3d474b3` -- generated file: `b1000t30c01.ph2` -2. [**reldev**](http://twitter.com/reldev) -- contribution hash: `d263eafd25ba809748390850966fd689379311033de6d4fc2de69acebb247d2b` -- generated file: `b10t30c02.ph2` -3. [**remco**](https://twitter.com/recmo) -- contribution hash: `1b5bb5f54d1126398b723ae4f42f86d9659550d57af262e67b9f8a6ddc7d926e` -- generated file: `b1000t30c03.ph2` -4. [**worldbridger**](https://twitter.com/shumochu) -- contribution hash: `9155ac2cc06510ffb63ac9a8c01284bf533183542054a94d735d358cc42cdca4` -- generated file: `b1000t30c04.ph2` -5. [**kobigurk**](https://twitter.com/kobigurk) -- contribution hash: `463c17a0c8ed79c56f4579d1eef02beac8edc91549fc9abae51c08e76b8bb4e9` -- generated file: `b1000t30c05.ph2` -6. [**kustosz**](https://twitter.com/mmkostrzewa) -- contribution hash: `dc8992e6d8bf5f9ed4b0cf9e98f582c0594d5775f315446c7ed1e94d3020e787` -- generated file: `b10t30c06.ph2` -7. [**m1guelpf**](https://twitter.com/m1guelpf) -- contribution hash: `1c512817cc55404489dc0184d7d049e6906f8282c542c17d9817a2bdab8ab524` -- generated file: `b1000t30c07.ph2` -8. [**atris**](https://twitter.com/atris_eth) -- contribution hash: `606a17d757c56171416c81dcd79171369e6ff651020d0dba0d7c98fa31ce78ca` -- generated file: `b1000t30c08.ph2` -9. [**zellic**](https://twitter.com/zellic_io) -- contribution hash: `dca70be8175332c8f3afe78018eecb5c205fef05fc0f177577dfed3bdb562304` -- generated file: `b1000t30c09.ph2` -10. 
[**eddylazzarin**](https://twitter.com/eddylazzarin) -- contribution hash: `34e92e61360b5e36d44736a8b2b6687c7557985e0acccd770b63de1b710c4506` -- generated file: `b1000t30c10.ph2` - diff --git a/actions.go b/actions.go index ba2f99e..178dc1e 100644 --- a/actions.go +++ b/actions.go @@ -2,166 +2,214 @@ package main import ( "errors" - "strconv" + "os" + groth16 "github.com/consensys/gnark/backend/groth16/bn254" + "github.com/consensys/gnark/backend/groth16/bn254/mpcsetup" + "github.com/consensys/gnark/backend/solidity" + cs "github.com/consensys/gnark/constraint/bn254" "github.com/urfave/cli/v2" deserializer "github.com/worldcoin/ptau-deserializer/deserialize" - "github.com/worldcoin/semaphore-mtb-setup/keys" - "github.com/worldcoin/semaphore-mtb-setup/phase1" - "github.com/worldcoin/semaphore-mtb-setup/phase2" ) -func p1t(cCtx *cli.Context) error { +func p1i(cCtx *cli.Context) error { // sanity check - if cCtx.Args().Len() != 4 { + if cCtx.Args().Len() != 2 { return errors.New("please provide the correct arguments") } - inputPath := cCtx.Args().Get(0) - outputPath := cCtx.Args().Get(1) - inPowStr := cCtx.Args().Get(2) - inPower, err := strconv.Atoi(inPowStr) + + ptauFilePath := cCtx.Args().Get(0) + outputFilePath := cCtx.Args().Get(1) + + ptau, err := deserializer.ReadPtau(ptauFilePath) if err != nil { return err } - outPowStr := cCtx.Args().Get(3) - outPower, err := strconv.Atoi(outPowStr) + + phase1, err := deserializer.ConvertPtauToPhase1(ptau) if err != nil { return err } - if inPower < outPower { - return errors.New("cannot transform to a higher power") - } - err = phase1.Transform(inputPath, outputPath, byte(inPower), byte(outPower)) - return err -} -func p1n(cCtx *cli.Context) error { - // sanity check - if cCtx.Args().Len() != 2 { - return errors.New("please provide the correct arguments") - } - powerStr := cCtx.Args().Get(0) - power, err := strconv.Atoi(powerStr) + outputFile, err := os.Create(outputFilePath) if err != nil { return err } - if power > 26 { - return errors.New("can't support powers larger than 26") + + _, err = phase1.WriteTo(outputFile) + if err != nil { + return err } - outputPath := cCtx.Args().Get(1) - err = phase1.Initialize(byte(power), outputPath) - return err + + return nil } -func p1c(cCtx *cli.Context) error { - // sanity check - if cCtx.Args().Len() != 2 { +func p2n(cCtx *cli.Context) error { + if cCtx.Args().Len() != 4 { return errors.New("please provide the correct arguments") } - inputPath := cCtx.Args().Get(0) - outputPath := cCtx.Args().Get(1) - err := phase1.Contribute(inputPath, outputPath) - return err -} -func p1i(cCtx *cli.Context) error { - ptauFilePath := cCtx.Args().Get(0) - outputFilePath := cCtx.Args().Get(1) + phase1Path := cCtx.Args().Get(0) + r1csPath := cCtx.Args().Get(1) + phase2Path := cCtx.Args().Get(2) + evalsPath := cCtx.Args().Get(3) - ptau, err := deserializer.ReadPtau(ptauFilePath) + phase1File, err := os.Open(phase1Path) + if err != nil { + return err + } + phase1 := &mpcsetup.Phase1{} + phase1.ReadFrom(phase1File) + r1csFile, err := os.Open(r1csPath) if err != nil { return err } + r1cs := cs.R1CS{} + r1cs.ReadFrom(r1csFile) - phase1, err := deserializer.ConvertPtauToPhase1(ptau) + phase2, evals := mpcsetup.InitPhase2(&r1cs, phase1) + phase2File, err := os.Create(phase2Path) if err != nil { return err } + phase2.WriteTo(phase2File) - // Write phase1 to file - err = deserializer.WritePhase1(phase1, uint8(ptau.Header.Power), outputFilePath) - + evalsFile, err := os.Create(evalsPath) if err != nil { return err } + 
evals.WriteTo(evalsFile) return nil } -func p1v(cCtx *cli.Context) error { - // sanity check - if cCtx.Args().Len() != 1 { - return errors.New("please provide the correct arguments") +func p2c(cCtx *cli.Context) error { + inputPh2Path := cCtx.Args().Get(0) + outputPh2Path := cCtx.Args().Get(1) + + inputFile, err := os.Open(inputPh2Path) + if err != nil { + return err } - inputPath := cCtx.Args().Get(0) - err := phase1.Verify(inputPath, "") - return err + phase2 := &mpcsetup.Phase2{} + phase2.ReadFrom(inputFile) + + phase2.Contribute() + + outputFile, err := os.Create(outputPh2Path) + if err != nil { + return err + } + phase2.WriteTo(outputFile) + + return nil } -func p1vt(cCtx *cli.Context) error { - // sanity check +func p2v(cCtx *cli.Context) error { if cCtx.Args().Len() != 2 { return errors.New("please provide the correct arguments") } inputPath := cCtx.Args().Get(0) - transformedPath := cCtx.Args().Get(1) - err := phase1.Verify(inputPath, transformedPath) - return err + originPath := cCtx.Args().Get(1) + + inputFile, err := os.Open(inputPath) + if err != nil { + return err + } + input := &mpcsetup.Phase2{} + input.ReadFrom(inputFile) + + originFile, err := os.Open(originPath) + if err != nil { + return err + } + origin := &mpcsetup.Phase2{} + origin.ReadFrom(originFile) + + mpcsetup.VerifyPhase2(origin, input) + + return nil } -func p2n(cCtx *cli.Context) error { +func keys(cCtx *cli.Context) error { // sanity check - if cCtx.Args().Len() != 3 { + if cCtx.Args().Len() != 4 { return errors.New("please provide the correct arguments") } phase1Path := cCtx.Args().Get(0) - r1csPath := cCtx.Args().Get(1) - phase2Path := cCtx.Args().Get(2) - err := phase2.Initialize(phase1Path, r1csPath, phase2Path) - return err -} + phase1 := &mpcsetup.Phase1{} + phase1File, err := os.Open(phase1Path) + if err != nil { + return err + } + phase1.ReadFrom(phase1File) -func p2c(cCtx *cli.Context) error { - // sanity check - if cCtx.Args().Len() != 2 { - return errors.New("please provide the correct arguments") + phase2Path := cCtx.Args().Get(1) + phase2 := &mpcsetup.Phase2{} + phase2File, err := os.Open(phase2Path) + if err != nil { + return err } - inputPath := cCtx.Args().Get(0) - outputPath := cCtx.Args().Get(1) - err := phase2.Contribute(inputPath, outputPath) - return err -} + phase2.ReadFrom(phase2File) -func p2v(cCtx *cli.Context) error { - // sanity check - if cCtx.Args().Len() != 2 { - return errors.New("please provide the correct arguments") + evalsPath := cCtx.Args().Get(2) + evals := &mpcsetup.Phase2Evaluations{} + evalsFile, err := os.Open(evalsPath) + if err != nil { + return err } - inputPath := cCtx.Args().Get(0) - originPath := cCtx.Args().Get(1) - err := phase2.Verify(inputPath, originPath) - return err -} + evals.ReadFrom(evalsFile) -func extract(cCtx *cli.Context) error { - // sanity check - if cCtx.Args().Len() != 1 { - return errors.New("please provide the correct arguments") + r1csPath := cCtx.Args().Get(3) + r1cs := &cs.R1CS{} + r1csFile, err := os.Open(r1csPath) + if err != nil { + return err } - inputPath := cCtx.Args().Get(0) - err := keys.ExtractKeys(inputPath) - return err + r1cs.ReadFrom(r1csFile) + + // get number of constraints + nbConstraints := r1cs.GetNbConstraints() + + pk, vk := mpcsetup.ExtractKeys(phase1, phase2, evals, nbConstraints) + + pkFile, err := os.Create("pk") + if err != nil { + return err + } + pk.WriteTo(pkFile) + + vkFile, err := os.Create("vk") + if err != nil { + return err + } + vk.WriteTo(vkFile) + + return nil } -func exportSol(cCtx *cli.Context) error { 
+func sol(cCtx *cli.Context) error { // sanity check if cCtx.Args().Len() != 1 { return errors.New("please provide the correct arguments") } - session := cCtx.Args().Get(0) - err := keys.ExportSol(session) + + vkPath := cCtx.Args().Get(0) + vk := &groth16.VerifyingKey{} + vkFile, err := os.Open(vkPath) + if err != nil { + return err + } + vk.ReadFrom(vkFile) + + solFile, err := os.Create("Groth16Verifier.sol") + if err != nil { + return err + } + + err = vk.ExportSolidity(solFile, solidity.WithPragmaVersion("0.8.20")) return err } diff --git a/common/keypair.go b/common/keypair.go deleted file mode 100644 index a3aa375..0000000 --- a/common/keypair.go +++ /dev/null @@ -1,48 +0,0 @@ -package common - -import ( - "math/big" - - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark-crypto/ecc/bn254/fr" -) - -type PublicKey struct { - S bn254.G1Affine - SX bn254.G1Affine - SPX bn254.G2Affine -} - -func GenPublicKey(x fr.Element, challenge []byte, dst byte) PublicKey { - var pk PublicKey - _, _, g1, _ := bn254.Generators() - - var s fr.Element - var sBi big.Int - s.SetRandom() - s.BigInt(&sBi) - pk.S.ScalarMultiplication(&g1, &sBi) - - // compute x*sG1 - var xBi big.Int - x.BigInt(&xBi) - pk.SX.ScalarMultiplication(&pk.S, &xBi) - - // generate R based on sG1, sxG1, challenge, and domain separation tag (tau, alpha or beta) - SP := GenSP(pk.S, pk.SX, challenge, dst) - - // compute x*spG2 - pk.SPX.ScalarMultiplication(&SP, &xBi) - return pk -} - -// Generate SP in G₂ as Hash(gˢ, gˢˣ, challenge, dst) -func GenSP(sG1, sxG1 bn254.G1Affine, challenge []byte, dst byte) bn254.G2Affine { - buffer := append(sG1.Marshal()[:], sxG1.Marshal()...) - buffer = append(buffer, challenge...) - spG2, err := bn254.HashToG2(buffer, []byte{dst}) - if err != nil { - panic(err) - } - return spG2 -} diff --git a/common/parallelize.go b/common/parallelize.go deleted file mode 100644 index 0ace558..0000000 --- a/common/parallelize.go +++ /dev/null @@ -1,44 +0,0 @@ -package common - -import ( - "runtime" - "sync" -) - -// Parallelize process in parallel the work function -func Parallelize(nbIterations int, work func(int, int), maxCpus ...int) { - - nbTasks := runtime.NumCPU() - if len(maxCpus) == 1 { - nbTasks = maxCpus[0] - } - nbIterationsPerCpus := nbIterations / nbTasks - - // more CPUs than tasks: a CPU will work on exactly one iteration - if nbIterationsPerCpus < 1 { - nbIterationsPerCpus = 1 - nbTasks = nbIterations - } - - var wg sync.WaitGroup - - extraTasks := nbIterations - (nbTasks * nbIterationsPerCpus) - extraTasksOffset := 0 - - for i := 0; i < nbTasks; i++ { - wg.Add(1) - _start := i*nbIterationsPerCpus + extraTasksOffset - _end := _start + nbIterationsPerCpus - if extraTasks > 0 { - _end++ - extraTasks-- - extraTasksOffset++ - } - go func() { - work(_start, _end) - wg.Done() - }() - } - - wg.Wait() -} diff --git a/common/utils.go b/common/utils.go deleted file mode 100644 index e88726a..0000000 --- a/common/utils.go +++ /dev/null @@ -1,44 +0,0 @@ -package common - -import ( - "math/bits" - - "github.com/consensys/gnark-crypto/ecc/bn254" -) - -func BitReverseG1(a []bn254.G1Affine) { - n := uint64(len(a)) - nn := uint64(64 - bits.TrailingZeros64(n)) - - for i := uint64(0); i < n; i++ { - irev := bits.Reverse64(i) >> nn - if irev > i { - a[i], a[irev] = a[irev], a[i] - } - } -} - -func BitReverseG2(a []bn254.G2Affine) { - n := uint64(len(a)) - nn := uint64(64 - bits.TrailingZeros64(n)) - - for i := uint64(0); i < n; i++ { - irev := bits.Reverse64(i) >> nn - if irev > i { - a[i], a[irev] = 
a[irev], a[i] - } - } -} - -// Check e(a₁, a₂) = e(b₁, b₂) -func SameRatio(a1, b1 bn254.G1Affine, a2, b2 bn254.G2Affine) bool { - var na2 bn254.G2Affine - na2.Neg(&a2) - res, err := bn254.PairingCheck( - []bn254.G1Affine{a1, b1}, - []bn254.G2Affine{na2, b2}) - if err != nil { - panic(err) - } - return res -} \ No newline at end of file diff --git a/go.mod b/go.mod index 3f8c454..c61172e 100644 --- a/go.mod +++ b/go.mod @@ -1,28 +1,33 @@ -module github.com/worldcoin/semaphore-mtb-setup +module github.com/mattstam/semaphore-mtb-setup -go 1.19 +go 1.23 + +toolchain go1.23.1 require ( - github.com/consensys/gnark v0.8.0 - github.com/consensys/gnark-crypto v0.9.1 + github.com/consensys/gnark v0.11.0 + github.com/consensys/gnark-crypto v0.14.0 github.com/urfave/cli/v2 v2.25.7 - github.com/worldcoin/ptau-deserializer v0.1.3 + github.com/worldcoin/ptau-deserializer v0.2.0 ) require ( + github.com/bits-and-blooms/bitset v1.14.2 // indirect github.com/blang/semver/v4 v4.0.0 // indirect github.com/consensys/bavard v0.1.13 // indirect github.com/cpuguy83/go-md2man/v2 v2.0.2 // indirect - github.com/fxamacker/cbor/v2 v2.4.0 // indirect - github.com/google/pprof v0.0.0-20230602150820-91b7bce49751 // indirect + github.com/fxamacker/cbor/v2 v2.7.0 // indirect + github.com/google/pprof v0.0.0-20240727154555-813a5fbdbec8 // indirect github.com/mattn/go-colorable v0.1.13 // indirect - github.com/mattn/go-isatty v0.0.19 // indirect + github.com/mattn/go-isatty v0.0.20 // indirect github.com/mmcloughlin/addchain v0.4.0 // indirect - github.com/rs/zerolog v1.29.1 // indirect + github.com/ronanh/intcomp v1.1.0 // indirect + github.com/rs/zerolog v1.33.0 // indirect github.com/russross/blackfriday/v2 v2.1.0 // indirect github.com/x448/float16 v0.8.4 // indirect github.com/xrash/smetrics v0.0.0-20201216005158-039620a65673 // indirect - golang.org/x/crypto v0.6.0 // indirect - golang.org/x/sys v0.9.0 // indirect + golang.org/x/crypto v0.26.0 // indirect + golang.org/x/sync v0.8.0 // indirect + golang.org/x/sys v0.24.0 // indirect rsc.io/tmplfunc v0.0.3 // indirect ) diff --git a/go.sum b/go.sum index 36cc371..9eff950 100644 --- a/go.sum +++ b/go.sum @@ -1,57 +1,77 @@ +github.com/bits-and-blooms/bitset v1.14.2 h1:YXVoyPndbdvcEVcseEovVfp0qjJp7S+i5+xgp/Nfbdc= +github.com/bits-and-blooms/bitset v1.14.2/go.mod h1:7hO7Gc7Pp1vODcmWvKMRA9BNmbv6a/7QIWpPxHddWR8= github.com/blang/semver/v4 v4.0.0 h1:1PFHFE6yCCTv8C1TeyNNarDzntLi7wMI5i/pzqYIsAM= github.com/blang/semver/v4 v4.0.0/go.mod h1:IbckMUScFkM3pff0VJDNKRiT6TG/YpiHIM2yvyW5YoQ= github.com/consensys/bavard v0.1.13 h1:oLhMLOFGTLdlda/kma4VOJazblc7IM5y5QPd2A/YjhQ= github.com/consensys/bavard v0.1.13/go.mod h1:9ItSMtA/dXMAiL7BG6bqW2m3NdSEObYWoH223nGHukI= -github.com/consensys/gnark v0.8.0 h1:0bQ2MyDG4oNjMQpNyL8HjrrUSSL3yYJg0Elzo6LzmcU= -github.com/consensys/gnark v0.8.0/go.mod h1:aKmA7dIiLbTm0OV37xTq0z+Bpe4xER8EhRLi6necrm8= -github.com/consensys/gnark-crypto v0.9.1 h1:mru55qKdWl3E035hAoh1jj9d7hVnYY5pfb6tmovSmII= -github.com/consensys/gnark-crypto v0.9.1/go.mod h1:a2DQL4+5ywF6safEeZFEPGRiiGbjzGFRUN2sg06VuU4= +github.com/consensys/gnark v0.11.0 h1:YlndnlbRAoIEA+aIIHzNIW4P0dCIOM9/jCVzsXf356c= +github.com/consensys/gnark v0.11.0/go.mod h1:2LbheIOxsBI1a9Ck1XxUoy6PRnH28mSI9qrvtN2HwDY= +github.com/consensys/gnark-crypto v0.13.1-0.20240802214859-ff4c0ddbe1ef h1:4DaS1IYXk0vKcCdguGjkHVyN43YqmKUmpYDxb90VBnU= +github.com/consensys/gnark-crypto v0.13.1-0.20240802214859-ff4c0ddbe1ef/go.mod h1:wKqwsieaKPThcFkHe0d0zMsbHEUWFmZcG7KBCse210o= +github.com/consensys/gnark-crypto v0.14.0 
h1:DDBdl4HaBtdQsq/wfMwJvZNE80sHidrK3Nfrefatm0E= +github.com/consensys/gnark-crypto v0.14.0/go.mod h1:CU4UijNPsHawiVGNxe9co07FkzCeWHHrb1li/n1XoU0= github.com/coreos/go-systemd/v22 v22.5.0/go.mod h1:Y58oyj3AT4RCenI/lSvhwexgC+NSVTIJ3seZv2GcEnc= github.com/cpuguy83/go-md2man/v2 v2.0.2 h1:p1EgwI/C7NhT0JmVkwCD2ZBK8j4aeHQX2pMHHBfMQ6w= github.com/cpuguy83/go-md2man/v2 v2.0.2/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46tRHOmNcaadrF8o= github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c= -github.com/fxamacker/cbor/v2 v2.4.0 h1:ri0ArlOR+5XunOP8CRUowT0pSJOwhW098ZCUyskZD88= -github.com/fxamacker/cbor/v2 v2.4.0/go.mod h1:TA1xS00nchWmaBnEIxPSE5oHLuJBAVvqrtAnWBwBCVo= +github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38= +github.com/ewoolsey/gnark v0.10.1 h1:F2dZVCcQ2IjkX9k3LS7fOE0V/e1lRb9J/QShH8Q+W4c= +github.com/ewoolsey/gnark v0.10.1/go.mod h1:/LeTXLgwmnGfrcyTT2K+GzBuj0+iiljmijrAQKslSFU= +github.com/fxamacker/cbor/v2 v2.7.0 h1:iM5WgngdRBanHcxugY4JySA0nk1wZorNOpTgCMedv5E= +github.com/fxamacker/cbor/v2 v2.7.0/go.mod h1:pxXPTn3joSm21Gbwsv0w9OSA2y1HFR9qXEeXQVeNoDQ= github.com/godbus/dbus/v5 v5.0.4/go.mod h1:xhWf0FNVPg57R7Z0UbKHbJfkEywrmjJnf7w5xrFpKfA= -github.com/google/go-cmp v0.5.9 h1:O2Tfq5qg4qc4AmwVlvv0oLiVAGB7enBSJ2x2DqQFi38= -github.com/google/pprof v0.0.0-20230602150820-91b7bce49751 h1:hR7/MlvK23p6+lIw9SN1TigNLn9ZnF3W4SYRKq2gAHs= -github.com/google/pprof v0.0.0-20230602150820-91b7bce49751/go.mod h1:Jh3hGz2jkYak8qXPD19ryItVnUgpgeqzdkY/D0EaeuA= +github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI= +github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY= +github.com/google/pprof v0.0.0-20240727154555-813a5fbdbec8 h1:FKHo8hFI3A+7w0aUQuYXQ+6EN5stWmeY/AZqtM8xk9k= +github.com/google/pprof v0.0.0-20240727154555-813a5fbdbec8/go.mod h1:K1liHPHnj73Fdn/EKuT8nrFqBihUSKXoLYU0BuatOYo= github.com/google/subcommands v1.2.0/go.mod h1:ZjhPrFU+Olkh9WazFPsl27BQ4UPiG37m3yTrtFlrHVk= -github.com/leanovate/gopter v0.2.9 h1:fQjYxZaynp97ozCzfOyOuAGOU4aU/z37zf/tOujFk7c= -github.com/mattn/go-colorable v0.1.12/go.mod h1:u5H1YNBxpqRaxsYJYSkiCWKzEfiAb1Gb520KVy5xxl4= +github.com/ingonyama-zk/icicle v1.1.0 h1:a2MUIaF+1i4JY2Lnb961ZMvaC8GFs9GqZgSnd9e95C8= +github.com/ingonyama-zk/icicle v1.1.0/go.mod h1:kAK8/EoN7fUEmakzgZIYdWy1a2rBnpCaZLqSHwZWxEk= +github.com/ingonyama-zk/iciclegnark v0.1.0 h1:88MkEghzjQBMjrYRJFxZ9oR9CTIpB8NG2zLeCJSvXKQ= +github.com/ingonyama-zk/iciclegnark v0.1.0/go.mod h1:wz6+IpyHKs6UhMMoQpNqz1VY+ddfKqC/gRwR/64W6WU= +github.com/leanovate/gopter v0.2.11 h1:vRjThO1EKPb/1NsDXuDrzldR28RLkBflWYcU9CvzWu4= +github.com/leanovate/gopter v0.2.11/go.mod h1:aK3tzZP/C+p1m3SPRE4SYZFGP7jjkuSI4f7Xvpt0S9c= github.com/mattn/go-colorable v0.1.13 h1:fFA4WZxdEF4tXPZVKMLwD8oUnCTTo08duU7wxecdEvA= github.com/mattn/go-colorable v0.1.13/go.mod h1:7S9/ev0klgBDR4GtXTXX8a3vIGJpMovkB8vQcUbaXHg= -github.com/mattn/go-isatty v0.0.14/go.mod h1:7GGIvUiUoEMVVmxf/4nioHXj79iQHKdU27kJ6hsGG94= github.com/mattn/go-isatty v0.0.16/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM= -github.com/mattn/go-isatty v0.0.19 h1:JITubQf0MOLdlGRuRq+jtsDlekdYPia9ZFsB8h/APPA= github.com/mattn/go-isatty v0.0.19/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y= +github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY= +github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y= github.com/mmcloughlin/addchain v0.4.0 h1:SobOdjm2xLj1KkXN5/n0xTIWyZA2+s99UCY1iPfkHRY= github.com/mmcloughlin/addchain 
v0.4.0/go.mod h1:A86O+tHqZLMNO4w6ZZ4FlVQEadcoqkyU72HC5wJ4RlU= github.com/mmcloughlin/profile v0.1.1/go.mod h1:IhHD7q1ooxgwTgjxQYkACGA77oFTDdFVejUS1/tS/qU= github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0= github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM= -github.com/rs/xid v1.4.0/go.mod h1:trrq9SKmegXys3aeAKXMUTdJsYXVwGY3RLcfgqegfbg= -github.com/rs/zerolog v1.29.1 h1:cO+d60CHkknCbvzEWxP0S9K6KqyTjrCNUy1LdQLCGPc= -github.com/rs/zerolog v1.29.1/go.mod h1:Le6ESbR7hc+DP6Lt1THiV8CQSdkkNrd3R0XbEgp3ZBU= +github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4= +github.com/ronanh/intcomp v1.1.0 h1:i54kxmpmSoOZFcWPMWryuakN0vLxLswASsGa07zkvLU= +github.com/ronanh/intcomp v1.1.0/go.mod h1:7FOLy3P3Zj3er/kVrU/pl+Ql7JFZj7bwliMGketo0IU= +github.com/rs/xid v1.5.0/go.mod h1:trrq9SKmegXys3aeAKXMUTdJsYXVwGY3RLcfgqegfbg= +github.com/rs/zerolog v1.33.0 h1:1cU2KZkvPxNyfgEmhHAz/1A9Bz+llsdYzklWFzgp0r8= +github.com/rs/zerolog v1.33.0/go.mod h1:/7mN4D5sKwJLZQ2b/znpjC3/GQWY/xaDXUM0kKWRHss= github.com/russross/blackfriday/v2 v2.1.0 h1:JIOH55/0cWyOuilr9/qlrm0BSXldqnqwMsf35Ld67mk= github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM= -github.com/stretchr/testify v1.8.4 h1:CcVxjf3Q8PM0mHUKJCdn+eZZtm5yQwehR5yeSVQQcUk= +github.com/stretchr/testify v1.9.0 h1:HtqpIVDClZ4nwg75+f6Lvsy/wHu+3BoSGCbBAcpTsTg= +github.com/stretchr/testify v1.9.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY= github.com/urfave/cli/v2 v2.25.7 h1:VAzn5oq403l5pHjc4OhD54+XGO9cdKVL/7lDjF+iKUs= github.com/urfave/cli/v2 v2.25.7/go.mod h1:8qnjx1vcq5s2/wpsqoZFndg2CE5tNFyrTvS6SinrnYQ= -github.com/worldcoin/ptau-deserializer v0.1.3 h1:VU9k9EaEZ6tSpfqf11md9eoyglOEmuNaEOA0btcM5yI= -github.com/worldcoin/ptau-deserializer v0.1.3/go.mod h1:uYwfSrLvfNi2CPFNwXFHz/VbGKEWbuOYXfojQof6pZs= +github.com/worldcoin/ptau-deserializer v0.2.0 h1:d99Lou4g3eK4tnwFIAg5MYocyMYSm3vv3vvppbiXCuk= +github.com/worldcoin/ptau-deserializer v0.2.0/go.mod h1:hMdMAGwRmjKf/FqJjzVP4MyJwAVZF/yoEUeIzzxdK6U= github.com/x448/float16 v0.8.4 h1:qLwI1I70+NjRFUR3zs1JPUCgaCXSh3SW62uAKT1mSBM= github.com/x448/float16 v0.8.4/go.mod h1:14CWIYCyZA/cWjXOioeEpHeN/83MdbZDRQHoFcYsOfg= github.com/xrash/smetrics v0.0.0-20201216005158-039620a65673 h1:bAn7/zixMGCfxrRTfdpNzjtPYqr8smhKouy9mxVdGPU= github.com/xrash/smetrics v0.0.0-20201216005158-039620a65673/go.mod h1:N3UwUGtsrSj3ccvlPHLoLsHnpR27oXr4ZE984MbSER8= -golang.org/x/crypto v0.6.0 h1:qfktjS5LUO+fFKeJXZ+ikTRijMmljikvG68fpMMruSc= -golang.org/x/crypto v0.6.0/go.mod h1:OFC/31mSvZgRz0V1QTNCzfAI1aIRzbiufJtkMIlEp58= -golang.org/x/sys v0.0.0-20210630005230-0f9fa26af87c/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= -golang.org/x/sys v0.0.0-20210927094055-39ccf1dd6fa6/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= +golang.org/x/crypto v0.26.0 h1:RrRspgV4mU+YwB4FYnuBoKsUapNIL5cohGAmSH3azsw= +golang.org/x/crypto v0.26.0/go.mod h1:GY7jblb9wI+FOo5y8/S2oY4zWP07AkOJ4+jxCqdqn54= +golang.org/x/exp v0.0.0-20240823005443-9b4947da3948 h1:kx6Ds3MlpiUHKj7syVnbp57++8WpuKPcR5yjLBjvLEA= +golang.org/x/exp v0.0.0-20240823005443-9b4947da3948/go.mod h1:akd2r19cwCdwSwWeIdzYQGa/EZZyqcOdwWiwj5L5eKQ= +golang.org/x/sync v0.8.0 h1:3NFvSEYkUoMifnESzZl15y791HH1qU2xm6eCJU5ZPXQ= +golang.org/x/sync v0.8.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk= golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= golang.org/x/sys v0.6.0/go.mod 
h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= -golang.org/x/sys v0.9.0 h1:KS/R3tvhPqvJvwcKfnBHJwwthS11LRhmM5D59eEXa0s= -golang.org/x/sys v0.9.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= +golang.org/x/sys v0.12.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= +golang.org/x/sys v0.24.0 h1:Twjiwq9dn6R1fQcyiK+wQyHWfaz/BJB+YIpzU/Cv3Xg= +golang.org/x/sys v0.24.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA= gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA= +gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM= rsc.io/tmplfunc v0.0.3 h1:53XFQh69AfOa8Tw0Jm7t+GV7KZhOi6jzsCzTtKbMvzU= rsc.io/tmplfunc v0.0.3/go.mod h1:AG3sTPzElb1Io3Yg4voV9AGZJuleGAwaVRxL9M49PhA= diff --git a/keys/keys.go b/keys/keys.go deleted file mode 100644 index 0dad445..0000000 --- a/keys/keys.go +++ /dev/null @@ -1,385 +0,0 @@ -package keys - -import ( - "bufio" - "fmt" - "io" - "os" - - "github.com/consensys/gnark-crypto/ecc" - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark-crypto/ecc/bn254/fr/fft" - "github.com/consensys/gnark-crypto/ecc/bn254/fr/pedersen" - "github.com/consensys/gnark/backend/groth16" - "github.com/consensys/gnark/constraint" - "github.com/worldcoin/semaphore-mtb-setup/phase2" -) - -type VerifyingKey struct { - G1 struct { - Alpha bn254.G1Affine - Beta, Delta bn254.G1Affine // unused, here for compatibility purposes - K []bn254.G1Affine // The indexes correspond to the public wires - } - - G2 struct { - Beta, Delta, Gamma bn254.G2Affine - } - - CommitmentKey pedersen.Key - CommitmentInfo constraint.Commitment // since the verifier doesn't input a constraint system, this needs to be provided here -} - -func (vk *VerifyingKey) writeTo(w io.Writer) (int64, error) { - enc := bn254.NewEncoder(w, bn254.RawEncoding()) - - // [α]1,[β]1,[β]2,[γ]2,[δ]1,[δ]2 - if err := enc.Encode(&vk.G1.Alpha); err != nil { - return enc.BytesWritten(), err - } - if err := enc.Encode(&vk.G1.Beta); err != nil { - return enc.BytesWritten(), err - } - if err := enc.Encode(&vk.G2.Beta); err != nil { - return enc.BytesWritten(), err - } - if err := enc.Encode(&vk.G2.Gamma); err != nil { - return enc.BytesWritten(), err - } - if err := enc.Encode(&vk.G1.Delta); err != nil { - return enc.BytesWritten(), err - } - if err := enc.Encode(&vk.G2.Delta); err != nil { - return enc.BytesWritten(), err - } - - // uint32(len(Kvk)),[Kvk]1 - if err := enc.Encode(vk.G1.K); err != nil { - return enc.BytesWritten(), err - } - - return enc.BytesWritten(), nil -} - -func extractPK(phase2Path string) error { - // Phase 2 file - phase2File, err := os.Open(phase2Path) - if err != nil { - return err - } - defer phase2File.Close() - - // Evaluations - evalsFile, err := os.Open("evals") - if err != nil { - return err - } - defer evalsFile.Close() - - // Use buffered IO to write parameters efficiently - ph2Reader := bufio.NewReader(phase2File) - evalsReader := bufio.NewReader(evalsFile) - - var header phase2.Header - if err := header.Read(ph2Reader); err != nil { - return err - } - - decPh2 := bn254.NewDecoder(ph2Reader) - decEvals := bn254.NewDecoder(evalsReader) - - pkFile, err := os.Create("pk") - if err != nil { - return err - } - defer pkFile.Close() - pkWriter := bufio.NewWriter(pkFile) - defer pkWriter.Flush() - encPk := bn254.NewEncoder((pkWriter)) - - var alphaG1, betaG1, deltaG1 bn254.G1Affine - var betaG2, deltaG2 bn254.G2Affine - - // 0. Write domain - domain := fft.NewDomain(uint64(header.Domain)) - domain.WriteTo(pkWriter) - - // 1. 
Read/Write [α]₁ - if err := decEvals.Decode(&alphaG1); err != nil { - return err - } - if err := encPk.Encode(&alphaG1); err != nil { - return err - } - - // 2. Read/Write [β]₁ - if err := decEvals.Decode(&betaG1); err != nil { - return err - } - if err := encPk.Encode(&betaG1); err != nil { - return err - } - - // 3. Read/Write [δ]₁ - if err := decPh2.Decode(&deltaG1); err != nil { - return err - } - if err := encPk.Encode(&deltaG1); err != nil { - return err - } - - // Read [β]₂ - if err := decEvals.Decode(&betaG2); err != nil { - return err - } - // Read [δ]₂ - if err := decPh2.Decode(&deltaG2); err != nil { - return err - } - - // 4. Read, Filter, Write A - var buffG1 []bn254.G1Affine - if err := decEvals.Decode(&buffG1); err != nil { - return err - } - buffG1, infinityA, nbInfinityA := filterInfinityG1(buffG1) - if err := encPk.Encode(buffG1); err != nil { - return err - } - - // 5. Read, Filter, Write B - if err := decEvals.Decode(&buffG1); err != nil { - return err - } - buffG1, infinityB, nbInfinityB := filterInfinityG1(buffG1) - if err := encPk.Encode(buffG1); err != nil { - return err - } - - // 6. Read/Write Z - buffG1 = make([]bn254.G1Affine, header.Domain) - for i := 0; i < header.Domain; i++ { - if err := decPh2.Decode(&buffG1[i]); err != nil { - return err - } - } - if err := encPk.Encode(buffG1); err != nil { - return err - } - - // 7. Read/Write PKK - buffG1 = make([]bn254.G1Affine, header.Witness) - for i := 0; i < header.Witness; i++ { - if err := decPh2.Decode(&buffG1[i]); err != nil { - return err - } - } - if err := encPk.Encode(buffG1); err != nil { - return err - } - - // 8. Write [β]₂ - if err := encPk.Encode(&betaG2); err != nil { - return err - } - - // 9. Write [δ]₂ - if err := encPk.Encode(&deltaG2); err != nil { - return err - } - - // 10. Read, Filter, Write B₂ - var buffG2 []bn254.G2Affine - if err := decEvals.Decode(&buffG2); err != nil { - return err - } - buffG2, _, _ = filterInfinityG2(buffG2) - if err := encPk.Encode(buffG2); err != nil { - return err - } - buffG2 = nil - - // 11. Write nbWires - nbWires := uint64(header.Wires) - if err := encPk.Encode(&nbWires); err != nil { - return err - } - - // 12. Write nbInfinityA - if err := encPk.Encode(&nbInfinityA); err != nil { - return err - } - - // 13. Write nbInfinityB - if err := encPk.Encode(&nbInfinityB); err != nil { - return err - } - - // 14. Write infinityA - if err := encPk.Encode(&infinityA); err != nil { - return err - } - - // 15. Write infinityB - if err := encPk.Encode(&infinityB); err != nil { - return err - } - - return nil -} - -func extractVK(phase2Path string) error { - vk := VerifyingKey{} - // Phase 2 file - phase2File, err := os.Open(phase2Path) - if err != nil { - return err - } - defer phase2File.Close() - - // Evaluations - evalsFile, err := os.Open("evals") - if err != nil { - return err - } - defer evalsFile.Close() - - // Use buffered IO to write parameters efficiently - ph2Reader := bufio.NewReader(phase2File) - evalsReader := bufio.NewReader(evalsFile) - - var header phase2.Header - if err := header.Read(ph2Reader); err != nil { - return err - } - - decPh2 := bn254.NewDecoder(ph2Reader) - decEvals := bn254.NewDecoder(evalsReader) - - vkFile, err := os.Create("vk") - if err != nil { - return err - } - defer vkFile.Close() - vkWriter := bufio.NewWriter(vkFile) - defer vkWriter.Flush() - - // 1. Read [α]₁ - if err := decEvals.Decode(&vk.G1.Alpha); err != nil { - return err - } - - // 2. Read [β]₁ - if err := decEvals.Decode(&vk.G1.Beta); err != nil { - return err - } - - // 3. 
Read [β]₂ - if err := decEvals.Decode(&vk.G2.Beta); err != nil { - return err - } - - // 4. Set [γ]₂ - _, _, _, gammaG2 := bn254.Generators() - vk.G2.Gamma.Set(&gammaG2) - - // 5. Read [δ]₁ - if err := decPh2.Decode(&vk.G1.Delta); err != nil { - return err - } - - // 6. Read [δ]₂ - if err := decPh2.Decode(&vk.G2.Delta); err != nil { - return err - } - - // 7. Read VKK - pos := int64(128*(header.Wires+1) + 12) - if _, err := evalsFile.Seek(pos, io.SeekStart); err != nil { - return err - } - evalsReader.Reset(evalsFile) - if err := decEvals.Decode(&vk.G1.K); err != nil { - return err - } - - // 8. Setup commitment key - var ckk []bn254.G1Affine - if err := decEvals.Decode(&ckk); err != nil { - return err - } - vk.CommitmentKey, err = pedersen.Setup(ckk) - if err != nil { - return err - } - if _, err := vk.writeTo(vkWriter); err != nil { - return err - } - return nil -} - -func ExtractKeys(phase2Path string) error { - fmt.Println("Extracting proving key") - if err := extractPK(phase2Path); err != nil { - return err - } - fmt.Println("Extracting verifying key") - if err := extractVK(phase2Path); err != nil { - return err - } - fmt.Println("Keys have been extracted successfully") - return nil -} - -func ExportSol(session string) error { - filename := session + ".sol" - fmt.Printf("Exporting %s\n", filename) - f, _ := os.Open(session + ".vk.save") - verifyingKey := groth16.NewVerifyingKey(ecc.BN254) - _, err := verifyingKey.ReadFrom(f) - if err != nil { - panic(fmt.Errorf("read file error")) - } - err = f.Close() - f, err = os.Create(filename) - if err != nil { - panic(err) - } - err = verifyingKey.ExportSolidity(f) - if err != nil { - panic(err) - } - fmt.Printf("%s has been extracted successfully\n", filename) - return nil -} - -func filterInfinityG1(buff []bn254.G1Affine) ([]bn254.G1Affine, []bool, uint64) { - infinityAt := make([]bool, len(buff)) - filtered := make([]bn254.G1Affine, len(buff)) - j := 0 - for i, e := range buff { - if e.IsInfinity() { - infinityAt[i] = true - continue - } - filtered[j] = buff[i] - j++ - } - return filtered[:j], infinityAt, uint64(len(buff) - j) -} - -func filterInfinityG2(buff []bn254.G2Affine) ([]bn254.G2Affine, []bool, uint64) { - infinityAt := make([]bool, len(buff)) - filtered := make([]bn254.G2Affine, len(buff)) - j := 0 - for i, e := range buff { - if e.IsInfinity() { - infinityAt[i] = true - continue - } - filtered[j] = buff[i] - j++ - } - return filtered[:j], infinityAt, uint64(len(buff) - j) - -} diff --git a/lagrange/g1.go b/lagrange/g1.go deleted file mode 100644 index 73d4664..0000000 --- a/lagrange/g1.go +++ /dev/null @@ -1,159 +0,0 @@ -package lagrange - -import ( - "math/big" - "math/bits" - "runtime" - - "github.com/consensys/gnark-crypto/ecc" - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark-crypto/ecc/bn254/fr" - "github.com/consensys/gnark-crypto/ecc/bn254/fr/fft" - "github.com/worldcoin/semaphore-mtb-setup/common" -) - -type Empty struct { -} - -func butterflyG1(a *bn254.G1Jac, b *bn254.G1Jac) { - t := *a - a.AddAssign(b) - t.SubAssign(b) - *b = t -} - -// KerDIF8 is a kernel that process an FFT of size 8 -func kerDIF8G1(a []bn254.G1Jac, twiddles [][]fr.Element, stage int) { - butterflyG1(&a[0], &a[4]) - butterflyG1(&a[1], &a[5]) - butterflyG1(&a[2], &a[6]) - butterflyG1(&a[3], &a[7]) - - var twiddle big.Int - twiddles[stage+0][1].BigInt(&twiddle) - a[5].ScalarMultiplication(&a[5], &twiddle) - twiddles[stage+0][2].BigInt(&twiddle) - a[6].ScalarMultiplication(&a[6], &twiddle) - 
twiddles[stage+0][3].BigInt(&twiddle) - a[7].ScalarMultiplication(&a[7], &twiddle) - butterflyG1(&a[0], &a[2]) - butterflyG1(&a[1], &a[3]) - butterflyG1(&a[4], &a[6]) - butterflyG1(&a[5], &a[7]) - twiddles[stage+1][1].BigInt(&twiddle) - a[3].ScalarMultiplication(&a[3], &twiddle) - twiddles[stage+1][1].BigInt(&twiddle) - a[7].ScalarMultiplication(&a[7], &twiddle) - butterflyG1(&a[0], &a[1]) - butterflyG1(&a[2], &a[3]) - butterflyG1(&a[4], &a[5]) - butterflyG1(&a[6], &a[7]) -} - -// parallelize threshold for a single butterfly op, if the fft stage is not parallelized already -const butterflyThreshold = 16 - -func difFFTG1(a []bn254.G1Jac, twiddles [][]fr.Element, stage, maxSplits int, chDone chan struct{}) { - if chDone != nil { - defer close(chDone) - } - - n := len(a) - if n == 1 { - return - } else if n == 8 { - kerDIF8G1(a, twiddles, stage) - return - } - m := n >> 1 - - if (m > butterflyThreshold) && (stage < maxSplits) { - // 1 << stage == estimated used CPUs - numCPU := runtime.NumCPU() / (1 << (stage)) - common.Parallelize(m, func(start, end int) { - var twiddle big.Int - for i := start; i < end; i++ { - butterflyG1(&a[i], &a[i+m]) - twiddles[stage][i].BigInt(&twiddle) - a[i+m].ScalarMultiplication(&a[i+m], &twiddle) - } - }, numCPU) - } else { - // i == 0 - butterflyG1(&a[0], &a[m]) - var twiddle big.Int - for i := 1; i < m; i++ { - butterflyG1(&a[i], &a[i+m]) - twiddles[stage][i].BigInt(&twiddle) - a[i+m].ScalarMultiplication(&a[i+m], &twiddle) - } - } - - if m == 1 { - return - } - - nextStage := stage + 1 - if stage < maxSplits { - chDone := make(chan struct{}, 1) - go difFFTG1(a[m:n], twiddles, nextStage, maxSplits, chDone) - difFFTG1(a[0:m], twiddles, nextStage, maxSplits, nil) - <-chDone - } else { - difFFTG1(a[0:m], twiddles, nextStage, maxSplits, nil) - difFFTG1(a[m:n], twiddles, nextStage, maxSplits, nil) - } -} - -func bitReversePointsG1(a []bn254.G1Jac) { - n := uint64(len(a)) - nn := uint64(64 - bits.TrailingZeros64(n)) - - numCPU := uint64(runtime.NumCPU()) - chDone := make(chan Empty, numCPU) - - for id := 0; id < int(numCPU); id++ { - start := n / numCPU * uint64(id) - end := n / numCPU * uint64(id+1) - if id == int(numCPU-1) { - end = n - } - go func(start uint64, end uint64) { - for j := start; j < end; j++ { - irev := bits.Reverse64(j) >> nn - if irev > j { - a[j], a[irev] = a[irev], a[j] - } - } - chDone <- Empty{} - }(start, end) - } - for i := 0; i < int(numCPU); i++ { - <-chDone - } -} - -func ConvertG1(buff []bn254.G1Affine, domain *fft.Domain) { - numCPU := uint64(runtime.NumCPU()) - maxSplits := bits.TrailingZeros64(ecc.NextPowerOfTwo(numCPU)) - jac := make([]bn254.G1Jac, len(buff)) - for i := 0; i < len(buff); i++ { - jac[i].FromAffine(&buff[i]) - } - - difFFTG1(jac, domain.TwiddlesInv, 0, maxSplits, nil) - bitReversePointsG1(jac) - var invBigint big.Int - domain.CardinalityInv.BigInt(&invBigint) - common.Parallelize(len(jac), func(start, end int) { - for i := start; i < end; i++ { - jac[i].ScalarMultiplication(&jac[i], &invBigint) - } - }) - - common.Parallelize(len(buff), func(start, end int) { - for i := start; i < end; i++ { - buff[i].FromJacobian(&jac[i]) - } - }) -} diff --git a/lagrange/g2.go b/lagrange/g2.go deleted file mode 100644 index 46e7e8c..0000000 --- a/lagrange/g2.go +++ /dev/null @@ -1,153 +0,0 @@ -package lagrange - -import ( - "math/big" - "math/bits" - "runtime" - - "github.com/consensys/gnark-crypto/ecc" - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark-crypto/ecc/bn254/fr" - 
"github.com/consensys/gnark-crypto/ecc/bn254/fr/fft" - "github.com/worldcoin/semaphore-mtb-setup/common" -) - -func butterflyG2(a *bn254.G2Jac, b *bn254.G2Jac) { - t := *a - a.AddAssign(b) - t.SubAssign(b) - *b = t -} - -// KerDIF8 is a kernel that process an FFT of size 8 -func kerDIF8G2(a []bn254.G2Jac, twiddles [][]fr.Element, stage int) { - butterflyG2(&a[0], &a[4]) - butterflyG2(&a[1], &a[5]) - butterflyG2(&a[2], &a[6]) - butterflyG2(&a[3], &a[7]) - - var twiddle big.Int - twiddles[stage+0][1].BigInt(&twiddle) - a[5].ScalarMultiplication(&a[5], &twiddle) - twiddles[stage+0][2].BigInt(&twiddle) - a[6].ScalarMultiplication(&a[6], &twiddle) - twiddles[stage+0][3].BigInt(&twiddle) - a[7].ScalarMultiplication(&a[7], &twiddle) - butterflyG2(&a[0], &a[2]) - butterflyG2(&a[1], &a[3]) - butterflyG2(&a[4], &a[6]) - butterflyG2(&a[5], &a[7]) - twiddles[stage+1][1].BigInt(&twiddle) - a[3].ScalarMultiplication(&a[3], &twiddle) - twiddles[stage+1][1].BigInt(&twiddle) - a[7].ScalarMultiplication(&a[7], &twiddle) - butterflyG2(&a[0], &a[1]) - butterflyG2(&a[2], &a[3]) - butterflyG2(&a[4], &a[5]) - butterflyG2(&a[6], &a[7]) -} - -func difFFTG2(a []bn254.G2Jac, twiddles [][]fr.Element, stage, maxSplits int, chDone chan struct{}) { - if chDone != nil { - defer close(chDone) - } - - n := len(a) - if n == 1 { - return - } else if n == 8 { - kerDIF8G2(a, twiddles, stage) - return - } - m := n >> 1 - - if (m > butterflyThreshold) && (stage < maxSplits) { - // 1 << stage == estimated used CPUs - numCPU := runtime.NumCPU() / (1 << (stage)) - common.Parallelize(m, func(start, end int) { - var twiddle big.Int - for i := start; i < end; i++ { - butterflyG2(&a[i], &a[i+m]) - twiddles[stage][i].BigInt(&twiddle) - a[i+m].ScalarMultiplication(&a[i+m], &twiddle) - } - }, numCPU) - } else { - // i == 0 - butterflyG2(&a[0], &a[m]) - var twiddle big.Int - for i := 1; i < m; i++ { - butterflyG2(&a[i], &a[i+m]) - twiddles[stage][i].BigInt(&twiddle) - a[i+m].ScalarMultiplication(&a[i+m], &twiddle) - } - } - - if m == 1 { - return - } - - nextStage := stage + 1 - if stage < maxSplits { - chDone := make(chan struct{}, 1) - go difFFTG2(a[m:n], twiddles, nextStage, maxSplits, chDone) - difFFTG2(a[0:m], twiddles, nextStage, maxSplits, nil) - <-chDone - } else { - difFFTG2(a[0:m], twiddles, nextStage, maxSplits, nil) - difFFTG2(a[m:n], twiddles, nextStage, maxSplits, nil) - } -} - -func bitReversePointsG2(a []bn254.G2Jac) { - n := uint64(len(a)) - nn := uint64(64 - bits.TrailingZeros64(n)) - - numCPU := uint64(runtime.NumCPU()) - chDone := make(chan Empty, numCPU) - - for id := 0; id < int(numCPU); id++ { - start := n / numCPU * uint64(id) - end := n / numCPU * uint64(id+1) - if id == int(numCPU-1) { - end = n - } - go func(start uint64, end uint64) { - for j := start; j < end; j++ { - irev := bits.Reverse64(j) >> nn - if irev > j { - a[j], a[irev] = a[irev], a[j] - } - } - chDone <- Empty{} - }(start, end) - } - for i := 0; i < int(numCPU); i++ { - <-chDone - } -} - -func ConvertG2(buff []bn254.G2Affine, domain *fft.Domain) { - numCPU := uint64(runtime.NumCPU()) - maxSplits := bits.TrailingZeros64(ecc.NextPowerOfTwo(numCPU)) - jac := make([]bn254.G2Jac, len(buff)) - for i := 0; i < len(buff); i++ { - jac[i].FromAffine(&buff[i]) - } - - difFFTG2(jac, domain.TwiddlesInv, 0, maxSplits, nil) - bitReversePointsG2(jac) - var invBigint big.Int - domain.CardinalityInv.BigInt(&invBigint) - common.Parallelize(len(jac), func(start, end int) { - for i := start; i < end; i++ { - jac[i].ScalarMultiplication(&jac[i], &invBigint) - } - }) - - 
common.Parallelize(len(buff), func(start, end int) { - for i := start; i < end; i++ { - buff[i].FromJacobian(&jac[i]) - } - }) -} diff --git a/main.go b/main.go index 447b429..24bbfc3 100644 --- a/main.go +++ b/main.go @@ -17,21 +17,21 @@ func main() { /* ----------------------------- Phase 1 Import ----------------------------- */ { Name: "p1i", - Usage: "p1i ", + Usage: "p1i ", Description: "Deserialize snarkjs .ptau file into gnark's phase1 format and write to `OUTPUT`.ph1", Action: p1i, }, /* --------------------------- Phase 2 Initialize --------------------------- */ { Name: "p2n", - Usage: "p2n ", + Usage: "p2n ", Description: "initialize phase 2 for the given circuit", Action: p2n, }, /* --------------------------- Phase 2 Contribute --------------------------- */ { Name: "p2c", - Usage: "p2c ", + Usage: "p2c ", Description: "contribute phase 2 randomness for Groth16", Action: p2c, }, @@ -45,15 +45,15 @@ func main() { /* ----------------------------- Keys Extraction ---------------------------- */ { Name: "key", - Usage: "key ", + Usage: "key ", Description: "extract proving and verifying keys", - Action: extract, + Action: keys, }, { Name: "sol", - Usage: "sol ", + Usage: "sol ", Description: "export verifier smart contract from verifying key", - Action: exportSol, + Action: sol, }, // Unused since we use the powers of tau ceremony from PPoT diff --git a/phase1/contribution.go b/phase1/contribution.go deleted file mode 100644 index 8c499e4..0000000 --- a/phase1/contribution.go +++ /dev/null @@ -1,191 +0,0 @@ -package phase1 - -import ( - "crypto/sha256" - "io" - "math" - "os" - - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/worldcoin/semaphore-mtb-setup/common" -) - -const ContributionSize = 640 - -type Contribution struct { - G1 struct { - Tau, Alpha, Beta bn254.G1Affine - } - G2 struct { - Tau, Beta bn254.G2Affine - } - PublicKeys struct { - Tau, Alpha, Beta common.PublicKey - } - Hash []byte -} - -func (c *Contribution) writeTo(writer io.Writer) (int64, error) { - toEncode := []interface{}{ - &c.G1.Tau, - &c.G1.Alpha, - &c.G1.Beta, - &c.G2.Tau, - &c.G2.Beta, - &c.PublicKeys.Tau.S, - &c.PublicKeys.Tau.SX, - &c.PublicKeys.Tau.SPX, - &c.PublicKeys.Alpha.S, - &c.PublicKeys.Alpha.SX, - &c.PublicKeys.Alpha.SPX, - &c.PublicKeys.Beta.S, - &c.PublicKeys.Beta.SX, - &c.PublicKeys.Beta.SPX, - } - - enc := bn254.NewEncoder(writer) - for _, v := range toEncode { - if err := enc.Encode(v); err != nil { - return enc.BytesWritten(), err - } - } - nBytes, err := writer.Write(c.Hash) - return int64(nBytes), err -} - -func (c *Contribution) ReadFrom(reader io.Reader) (int64, error) { - toDecode := []interface{}{ - &c.G1.Tau, - &c.G1.Alpha, - &c.G1.Beta, - &c.G2.Tau, - &c.G2.Beta, - &c.PublicKeys.Tau.S, - &c.PublicKeys.Tau.SX, - &c.PublicKeys.Tau.SPX, - &c.PublicKeys.Alpha.S, - &c.PublicKeys.Alpha.SX, - &c.PublicKeys.Alpha.SPX, - &c.PublicKeys.Beta.S, - &c.PublicKeys.Beta.SX, - &c.PublicKeys.Beta.SPX, - } - - dec := bn254.NewDecoder(reader) - for _, v := range toDecode { - if err := dec.Decode(v); err != nil { - return dec.BytesRead(), err - } - } - c.Hash = make([]byte, 32) - nBytes, err := reader.Read(c.Hash) - return int64(nBytes), err -} - -func computeHash(c *Contribution) []byte { - sha := sha256.New() - toEncode := []interface{}{ - &c.G1.Tau, - &c.G1.Alpha, - &c.G1.Beta, - &c.G2.Tau, - &c.G2.Beta, - &c.PublicKeys.Tau.S, - &c.PublicKeys.Tau.SX, - &c.PublicKeys.Tau.SPX, - &c.PublicKeys.Alpha.S, - &c.PublicKeys.Alpha.SX, - &c.PublicKeys.Alpha.SPX, - &c.PublicKeys.Beta.S, - 
&c.PublicKeys.Beta.SX, - &c.PublicKeys.Beta.SPX, - } - - enc := bn254.NewEncoder(sha) - for _, v := range toEncode { - enc.Encode(v) - } - - return sha.Sum(nil) -} - -func defaultContribution(transformedPath string) (Contribution, error) { - var c Contribution - c.Hash = nil - - // Initialize with generators - if transformedPath == "" { - _, _, g1, g2 := bn254.Generators() - c.G1.Tau.Set(&g1) - c.G1.Alpha.Set(&g1) - c.G1.Beta.Set(&g1) - c.G2.Tau.Set(&g2) - c.G2.Beta.Set(&g2) - } else { - // Read parameters from transformed file - const G1CompressedSize = 32 - const G2CompressedSize = 64 - inputFile, err := os.Open(transformedPath) - if err != nil { - return c, err - } - defer inputFile.Close() - dec := bn254.NewDecoder(inputFile) - - // Read header - var header Header - if err := header.ReadFrom(inputFile); err != nil { - return c, err - } - - N := int(math.Pow(2, float64(header.Power))) - - var posTauG1 int64 = 3 + G1CompressedSize - var posAlphaG1 int64 = posTauG1 + int64(2*N-2)*G1CompressedSize - var posBetaG1 int64 = posAlphaG1 + int64(N)*G1CompressedSize - var posTauG2 int64 = posBetaG1 + int64(N)*G1CompressedSize + G2CompressedSize - var posBetaG2 int64 = posTauG2 + int64(N-1)*G2CompressedSize - - // Read TauG1 - if _, err := inputFile.Seek(posTauG1, io.SeekStart); err != nil { - return c, err - } - if err := dec.Decode(&c.G1.Tau); err != nil { - return c, err - } - - // Read AlphaG1 - if _, err := inputFile.Seek(posAlphaG1, io.SeekStart); err != nil { - return c, err - } - if err := dec.Decode(&c.G1.Alpha); err != nil { - return c, err - } - - // Read BetaG1 - if _, err := inputFile.Seek(posBetaG1, io.SeekStart); err != nil { - return c, err - } - if err := dec.Decode(&c.G1.Beta); err != nil { - return c, err - } - - // Read TauG2 - if _, err := inputFile.Seek(posTauG2, io.SeekStart); err != nil { - return c, err - } - if err := dec.Decode(&c.G2.Tau); err != nil { - return c, err - } - - // Read BetaG2 - if _, err := inputFile.Seek(posBetaG2, io.SeekStart); err != nil { - return c, err - } - if err := dec.Decode(&c.G2.Beta); err != nil { - return c, err - } - } - - return c, nil -} diff --git a/phase1/header.go b/phase1/header.go deleted file mode 100644 index 88eb1b3..0000000 --- a/phase1/header.go +++ /dev/null @@ -1,44 +0,0 @@ -package phase1 - -import ( - "encoding/binary" - "io" -) - -type Header struct { - Power byte - Contributions uint16 -} - -func (p *Header) ReadFrom(reader io.Reader) error { - buffPower := make([]byte, 1) - // Read NConstraints - if _, err := reader.Read(buffPower); err != nil { - return err - } - p.Power = buffPower[0] - - // Read NContribution - buffContributions := make([]byte, 2) - if _, err := reader.Read(buffContributions); err != nil { - return err - } - p.Contributions = binary.BigEndian.Uint16(buffContributions) - return nil -} - -func (p *Header) writeTo(writer io.Writer) error { - // Write Power - if _, err := writer.Write([]byte{p.Power}); err != nil { - return err - } - - // Write Contribution - buffContributions := make([]byte, 2) - binary.BigEndian.PutUint16(buffContributions, p.Contributions) - if _, err := writer.Write(buffContributions); err != nil { - return err - } - - return nil -} diff --git a/phase1/phase1.go b/phase1/phase1.go deleted file mode 100644 index 30e5a16..0000000 --- a/phase1/phase1.go +++ /dev/null @@ -1,375 +0,0 @@ -package phase1 - -import ( - "bufio" - "encoding/hex" - "errors" - "fmt" - "math" - "math/big" - "os" - - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark-crypto/ecc/bn254/fr" - 
"github.com/worldcoin/semaphore-mtb-setup/common" -) - -func Transform(inputPath, outputPath string, inPower, outPower byte) error { - // Input file is in uncompressed representation - const G1Size = 64 - const G2Size = 128 - - // Formatted as - // Hash, 2^(2n)-1[TauG1], 2^n[TauG2], 2^n[AlphaG1], 2^n[BetaG1], BetaG2 - inputFile, err := os.Open(inputPath) - if err != nil { - return err - } - defer inputFile.Close() - - // Output file is in compressed representation - outputFile, err := os.Create(outputPath) - if err != nil { - return err - } - defer outputFile.Close() - - // Write header - header := Header{Power: outPower, Contributions: 0} - if err := header.writeTo(outputFile); err != nil { - return err - } - - inN := int(math.Pow(2, float64(inPower))) - outN := int(math.Pow(2, float64(outPower))) - - var posTauG1 int64 = 64 - var posTauG2 int64 = posTauG1 + int64(2*inN-1)*G1Size - var posAlphaG1 int64 = posTauG2 + int64(inN)*G2Size - var posBetaG1 int64 = posAlphaG1 + int64(inN)*G1Size - var posBetaG2 int64 = posBetaG1 + int64(inN)*G1Size - - // Transform TauG1 - fmt.Println("Transforming TauG1") - if err := transformG1(inputFile, outputFile, posTauG1, 2*outN-1); err != nil { - return err - } - - // Transform AlphaG1 - fmt.Println("Transforming AlphaG1") - if err := transformG1(inputFile, outputFile, posAlphaG1, outN); err != nil { - return err - } - - // Transform BetaG1 - fmt.Println("Transforming BetaG1") - if err := transformG1(inputFile, outputFile, posBetaG1, outN); err != nil { - return err - } - - // Transform TauG2 - fmt.Println("Transforming TauG2") - if err := transformG2(inputFile, outputFile, posTauG2, outN); err != nil { - return err - } - - // Transform BetaG2 - fmt.Println("Transforming BetaG2") - if err := transformG2(inputFile, outputFile, posBetaG2, 1); err != nil { - return err - } - - return nil -} - -func Initialize(power byte, outputPath string) error { - _, _, g1, g2 := bn254.Generators() - // output outputFile - outputFile, err := os.Create(outputPath) - if err != nil { - return err - } - defer outputFile.Close() - var header Header - - header.Power = power - N := int(math.Pow(2, float64(power))) - fmt.Printf("Power %d supports up to %d constraints\n", power, N) - - // Write the header - header.writeTo(outputFile) - - // Use buffered IO to write parameters efficiently - buffSize := int(math.Pow(2, 20)) - writer := bufio.NewWriterSize(outputFile, buffSize) - defer writer.Flush() - - // BN254 encoder using compressed representation of points to save storage space - enc := bn254.NewEncoder(writer) - - // In the initialization, τ = α = β = 1, so we are writing the generators directly - // Write [τ⁰]₁, [τ¹]₁, [τ²]₁, …, [τ²ᴺ⁻²]₁ - fmt.Println("1. Writing TauG1") - for i := 0; i < 2*N-1; i++ { - if err := enc.Encode(&g1); err != nil { - return err - } - } - - // Write α[τ⁰]₁, α[τ¹]₁, α[τ²]₁, …, α[τᴺ⁻¹]₁ - fmt.Println("2. Writing AlphaTauG1") - for i := 0; i < N; i++ { - if err := enc.Encode(&g1); err != nil { - return err - } - } - - // Write β[τ⁰]₁, β[τ¹]₁, β[τ²]₁, …, β[τᴺ⁻¹]₁ - fmt.Println("3. Writing BetaTauG1") - for i := 0; i < N; i++ { - if err := enc.Encode(&g1); err != nil { - return err - } - } - - // Write {[τ⁰]₂, [τ¹]₂, [τ²]₂, …, [τᴺ⁻¹]₂} - fmt.Println("4. Writing TauG2") - for i := 0; i < N; i++ { - if err := enc.Encode(&g2); err != nil { - return err - } - } - - // Write [β]₂ - fmt.Println("5. 
Writing BetaG2") - enc.Encode(&g2) - - fmt.Println("Initialization has been completed successfully") - return nil -} - -func Contribute(inputPath, outputPath string) error { - // Input file - inputFile, err := os.Open(inputPath) - if err != nil { - return err - } - defer inputFile.Close() - - // Output file - outputFile, err := os.Create(outputPath) - if err != nil { - return err - } - defer outputFile.Close() - - // Read/Write header with extra contribution - var header Header - if err := header.ReadFrom(inputFile); err != nil { - return err - } - fmt.Printf("Power := %d and #Contributions := %d\n", header.Power, header.Contributions) - N := int(math.Pow(2, float64(header.Power))) - header.Contributions++ - if err := header.writeTo(outputFile); err != nil { - return err - } - - // Use buffered IO to write parameters efficiently - reader := bufio.NewReader(inputFile) - writer := bufio.NewWriter(outputFile) - defer writer.Flush() - - dec := bn254.NewDecoder(reader) - enc := bn254.NewEncoder(writer) - - // Sample toxic parameters - fmt.Println("Sampling toxic parameters Tau, Alpha, and Beta") - var tau, alpha, beta, one fr.Element - tau.SetRandom() - alpha.SetRandom() - beta.SetRandom() - one.SetOne() - - var contribution Contribution - var firstG1 *bn254.G1Affine - var firstG2 *bn254.G2Affine - - // Process Tau section - fmt.Println("Processing TauG1") - if firstG1, err = scaleG1(dec, enc, 2*N-1, &tau, nil); err != nil { - return err - } - contribution.G1.Tau.Set(firstG1) - - // Process AlphaTauG1 section - fmt.Println("Processing AlphaTauG1") - if firstG1, err = scaleG1(dec, enc, N, &tau, &alpha); err != nil { - return err - } - contribution.G1.Alpha.Set(firstG1) - - // Process BetaTauG1 section - fmt.Println("Processing BetaTauG1") - if firstG1, err = scaleG1(dec, enc, N, &tau, &beta); err != nil { - return err - } - contribution.G1.Beta.Set(firstG1) - - // Process TauG2 section - fmt.Println("Processing TauG2") - if firstG2, err = scaleG2(dec, enc, N, &tau); err != nil { - return err - } - contribution.G2.Tau.Set(firstG2) - - // Process BetaG2 section - fmt.Println("Processing BetaG2") - var betaG2 bn254.G2Affine - var betaBi big.Int - if err := dec.Decode(&betaG2); err != nil { - return err - } - beta.BigInt(&betaBi) - betaG2.ScalarMultiplication(&betaG2, &betaBi) - if err := enc.Encode(&betaG2); err != nil { - return err - } - contribution.G2.Beta.Set(&betaG2) - - // Copy old contributions - nExistingContributions := int(header.Contributions - 1) - var c Contribution - for i := 0; i < nExistingContributions; i++ { - if _, err := c.ReadFrom(reader); err != nil { - return err - } - if _, err := c.writeTo(writer); err != nil { - return err - } - } - - // Get hash of previous contribution - var prevHash []byte - if nExistingContributions == 0 { - prevHash = nil - } else { - prevHash = c.Hash - } - - // Generate public keys - contribution.PublicKeys.Tau = common.GenPublicKey(tau, prevHash, 1) - contribution.PublicKeys.Alpha = common.GenPublicKey(alpha, prevHash, 2) - contribution.PublicKeys.Beta = common.GenPublicKey(beta, prevHash, 3) - contribution.Hash = computeHash(&contribution) - - // Write the contribution - contribution.writeTo(writer) - - fmt.Println("Contirbution has been successful!") - fmt.Println("Contribution Hash := ", hex.EncodeToString(contribution.Hash)) - - return nil -} - -func Verify(inputPath, transformedPath string) error { - // Input file - inputFile, err := os.Open(inputPath) - if err != nil { - return err - } - defer inputFile.Close() - - // Read header - var 
header Header
-	if err := header.ReadFrom(inputFile); err != nil {
-		return err
-	}
-	fmt.Printf("Power := %d and #Contributions := %d\n", header.Power, header.Contributions)
-	N := int(math.Pow(2, float64(header.Power)))
-
-	// Use buffered IO to write parameters efficiently
-	buffSize := int(math.Pow(2, 20))
-	reader := bufio.NewReaderSize(inputFile, buffSize)
-	dec := bn254.NewDecoder(reader)
-
-	fmt.Println("Processing TauG1")
-	tau1L1, tau1L2, err := linearCombinationG1(dec, 2*N-1)
-	if err != nil {
-		return err
-	}
-
-	fmt.Println("Processing AlphaTauG1")
-	alphaTau1L1, alphaTau1L2, err := linearCombinationG1(dec, N)
-	if err != nil {
-		return err
-	}
-
-	fmt.Println("Processing BetaTauG1")
-	betaTau1L1, betaTau1L2, err := linearCombinationG1(dec, N)
-	if err != nil {
-		return err
-	}
-
-	fmt.Println("Processing TauG2")
-	tau2L1, tau2L2, err := linearCombinationG2(dec, N)
-	if err != nil {
-		return err
-	}
-
-	fmt.Println("Processing BetaG2")
-	var betaG2 bn254.G2Affine
-	if err = dec.Decode(&betaG2); err != nil {
-		return err
-	}
-
-	// Verify contributions
-	var current Contribution
-	prev, err := defaultContribution(transformedPath)
-	if err != nil {
-		return err
-	}
-	for i := 0; i < int(header.Contributions); i++ {
-		current.ReadFrom(reader)
-		fmt.Printf("Verifying contribution %d with Hash := %s\n", i+1, hex.EncodeToString(current.Hash))
-		if err := verifyContribution(current, prev); err != nil {
-			return err
-		}
-		prev = current
-	}
-
-	// Verify consistency of parameters update
-	_, _, g1, g2 := bn254.Generators()
-	// Read and verify TauG1
-	fmt.Println("Verifying powers of TauG1")
-	if !common.SameRatio(tau1L1, tau1L2, current.G2.Tau, g2) {
-		return errors.New("failed pairing check")
-	}
-
-	// Read and verify AlphaTauG1
-	fmt.Println("Verifying powers of AlphaTauG1")
-	if !common.SameRatio(alphaTau1L1, alphaTau1L2, current.G2.Tau, g2) {
-		return errors.New("failed pairing check")
-	}
-
-	// Read and verify BetaTauG1
-	fmt.Println("Verifying powers of BetaTauG1")
-	if !common.SameRatio(betaTau1L1, betaTau1L2, current.G2.Tau, g2) {
-		return errors.New("failed pairing check")
-	}
-
-	// Read and verify TauG2
-	fmt.Println("Verifying powers of TauG2")
-	if !common.SameRatio(g1, current.G1.Tau, tau2L1, tau2L2) {
-		return errors.New("failed pairing check")
-	}
-
-	// Verify BetaG2
-	fmt.Println("Verifying powers of BetaG2")
-	if !betaG2.Equal(&current.G2.Beta) {
-		return errors.New("failed verifying update of Beta")
-	}
-
-	fmt.Println("Contributions verification has been successful")
-	return nil
-}
diff --git a/phase1/utils.go b/phase1/utils.go
deleted file mode 100644
index b5873b5..0000000
--- a/phase1/utils.go
+++ /dev/null
@@ -1,315 +0,0 @@
-package phase1
-
-import (
-	"bufio"
-	"bytes"
-	"errors"
-	"io"
-	"math"
-	"math/big"
-	"os"
-
-	"github.com/consensys/gnark-crypto/ecc"
-	"github.com/consensys/gnark-crypto/ecc/bn254"
-	"github.com/consensys/gnark-crypto/ecc/bn254/fr"
-	"github.com/worldcoin/semaphore-mtb-setup/common"
-)
-
-const batchSize = 1048576 // 2^20
-
-// Returns powers of b starting from a as [a, ba, ..., abⁿ⁻¹ ]
-func powers(a, b *fr.Element, n int) []fr.Element {
-	result := make([]fr.Element, n)
-	result[0].Set(a)
-	for i := 1; i < n; i++ {
-		result[i].Mul(&result[i-1], b)
-	}
-	return result
-}
-
-// Multiply each element by b
-func batchMul(a []fr.Element, b *fr.Element) {
-	common.Parallelize(len(a), func(start, end int) {
-		for i := start; i < end; i++ {
-			a[i].Mul(&a[i], b)
-		}
-	})
-}
-
-func scaleG1(dec *bn254.Decoder, enc *bn254.Encoder, N int, tau,
multiplicand *fr.Element) (*bn254.G1Affine, error) { - // Allocate batch with smallest of (N, batchSize) - var initialSize = int(math.Min(float64(N), float64(batchSize))) - buff := make([]bn254.G1Affine, initialSize) - var firstPoint bn254.G1Affine - var startPower fr.Element - var scalars []fr.Element - startPower.SetOne() - - remaining := N - for remaining > 0 { - // Read batch - readCount := int(math.Min(float64(remaining), float64(batchSize))) - for i := 0; i < readCount; i++ { - if err := dec.Decode(&buff[i]); err != nil { - return nil, err - } - } - - // Compute powers for the current batch - scalars = powers(&startPower, tau, readCount) - - // Update startPower for next batch - startPower.Mul(&scalars[readCount-1], tau) - - // If there is α or β, then mul it with powers of τ - if multiplicand != nil { - batchMul(scalars, multiplicand) - } - - // Process the batch - common.Parallelize(readCount, func(start, end int) { - for i := start; i < end; i++ { - var tmpBi big.Int - scalars[i].BigInt(&tmpBi) - buff[i].ScalarMultiplication(&buff[i], &tmpBi) - } - }) - - // Write the batch - for i := 0; i < readCount; i++ { - if err := enc.Encode(&buff[i]); err != nil { - return nil, err - } - } - - // Should be initialized in first batch only - if firstPoint.X.IsZero() { - if multiplicand == nil { - // Set firstPoint to the second point = [τ] - firstPoint.Set(&buff[1]) - } else { - // Set firstPoint to the first point = [α] or [β] - firstPoint.Set(&buff[0]) - } - } - - // Update remaining - remaining -= readCount - } - return &firstPoint, nil -} - -func scaleG2(dec *bn254.Decoder, enc *bn254.Encoder, N int, tau *fr.Element) (*bn254.G2Affine, error) { - // Allocate batch with smallest of (N, batchSize) - var initialSize = int(math.Min(float64(N), float64(batchSize))) - buff := make([]bn254.G2Affine, initialSize) - var firstPoint bn254.G2Affine - var startPower fr.Element - var scalars []fr.Element - startPower.SetOne() - - remaining := N - for remaining > 0 { - // Read batch - readCount := int(math.Min(float64(remaining), float64(batchSize))) - for i := 0; i < readCount; i++ { - if err := dec.Decode(&buff[i]); err != nil { - return nil, err - } - } - - // Compute powers for the current batch - scalars = powers(&startPower, tau, readCount) - - // Update startPower for next batch - startPower.Mul(&scalars[readCount-1], tau) - - // Process the batch - common.Parallelize(readCount, func(start, end int) { - for i := start; i < end; i++ { - var tmpBi big.Int - scalars[i].BigInt(&tmpBi) - buff[i].ScalarMultiplication(&buff[i], &tmpBi) - } - }) - - // Write the batch - for i := 0; i < readCount; i++ { - if err := enc.Encode(&buff[i]); err != nil { - return nil, err - } - } - - // Should be initialized in first batch only - if firstPoint.X.IsZero() { - - firstPoint.Set(&buff[1]) - - } - - // Update remaining - remaining -= readCount - } - return &firstPoint, nil -} - -func randomize(r []fr.Element) { - common.Parallelize(len(r), func(start, end int) { - for i := start; i < end; i++ { - r[i].SetRandom() - } - }) -} - -func linearCombinationG1(dec *bn254.Decoder, N int) (bn254.G1Affine, bn254.G1Affine, error) { - // Allocate batch with smallest of (N, batchSize) - var initialSize = int(math.Min(float64(N), float64(batchSize))) - buff := make([]bn254.G1Affine, initialSize) - r := make([]fr.Element, initialSize) - var L1, L2, tmpL1, tmpL2 bn254.G1Affine - - remaining := N - for remaining > 0 { - // Read batch - readCount := int(math.Min(float64(remaining), float64(batchSize))) - for i := 0; i < readCount; i++ 
{
-			if err := dec.Decode(&buff[i]); err != nil {
-				return L1, L2, err
-			}
-		}
-
-		// Generate randomness
-		randomize(r)
-
-		// Process the batch
-		tmpL1.MultiExp(buff[:readCount-1], r, ecc.MultiExpConfig{})
-		tmpL2.MultiExp(buff[1:readCount], r, ecc.MultiExpConfig{})
-		L1.Add(&L1, &tmpL1)
-		L2.Add(&L2, &tmpL2)
-
-		// Update remaining
-		remaining -= readCount
-	}
-	return L1, L2, nil
-}
-
-func linearCombinationG2(dec *bn254.Decoder, N int) (bn254.G2Affine, bn254.G2Affine, error) {
-	// Allocate batch with smallest of (N, batchSize)
-	var initialSize = int(math.Min(float64(N), float64(batchSize)))
-	buff := make([]bn254.G2Affine, initialSize)
-	r := make([]fr.Element, initialSize)
-	var L1, L2, tmpL1, tmpL2 bn254.G2Affine
-
-	remaining := N
-	for remaining > 0 {
-		// Read batch
-		readCount := int(math.Min(float64(remaining), float64(batchSize)))
-		for i := 0; i < readCount; i++ {
-			if err := dec.Decode(&buff[i]); err != nil {
-				return L1, L2, err
-			}
-		}
-
-		// Generate randomness
-		randomize(r)
-
-		// Process the batch
-		tmpL1.MultiExp(buff[:readCount-1], r, ecc.MultiExpConfig{})
-		tmpL2.MultiExp(buff[1:readCount], r, ecc.MultiExpConfig{})
-		L1.Add(&L1, &tmpL1)
-		L2.Add(&L2, &tmpL2)
-
-		// Update remaining
-		remaining -= readCount
-	}
-	return L1, L2, nil
-}
-
-func verifyContribution(current, prev Contribution) error {
-	// Compute SP for τ, α, β
-	tauSP := common.GenSP(current.PublicKeys.Tau.S, current.PublicKeys.Tau.SX, prev.Hash[:], 1)
-	alphaSP := common.GenSP(current.PublicKeys.Alpha.S, current.PublicKeys.Alpha.SX, prev.Hash[:], 2)
-	betaSP := common.GenSP(current.PublicKeys.Beta.S, current.PublicKeys.Beta.SX, prev.Hash[:], 3)
-
-	// Check for knowledge of toxic parameters
-	if !common.SameRatio(current.PublicKeys.Tau.S, current.PublicKeys.Tau.SX, current.PublicKeys.Tau.SPX, tauSP) {
-		return errors.New("couldn't verify knowledge of Tau")
-	}
-	if !common.SameRatio(current.PublicKeys.Alpha.S, current.PublicKeys.Alpha.SX, current.PublicKeys.Alpha.SPX, alphaSP) {
-		return errors.New("couldn't verify knowledge of Alpha")
-	}
-	if !common.SameRatio(current.PublicKeys.Beta.S, current.PublicKeys.Beta.SX, current.PublicKeys.Beta.SPX, betaSP) {
-		return errors.New("couldn't verify knowledge of Beta")
-	}
-
-	// Check for valid updates using previous parameters
-	if !common.SameRatio(current.G1.Tau, prev.G1.Tau, tauSP, current.PublicKeys.Tau.SPX) {
-		return errors.New("couldn't verify that TauG1 is based on previous contribution")
-	}
-	if !common.SameRatio(current.G1.Alpha, prev.G1.Alpha, alphaSP, current.PublicKeys.Alpha.SPX) {
-		return errors.New("couldn't verify that AlphaTauG1 is based on previous contribution")
-	}
-	if !common.SameRatio(current.G1.Beta, prev.G1.Beta, betaSP, current.PublicKeys.Beta.SPX) {
-		return errors.New("couldn't verify that BetaTauG1 is based on previous contribution")
-	}
-	if !common.SameRatio(current.PublicKeys.Tau.S, current.PublicKeys.Tau.SX, current.G2.Tau, prev.G2.Tau) {
-		return errors.New("couldn't verify that TauG2 is based on previous contribution")
-	}
-	if !common.SameRatio(current.PublicKeys.Beta.S, current.PublicKeys.Beta.SX, current.G2.Beta, prev.G2.Beta) {
-		return errors.New("couldn't verify that BetaG2 is based on previous contribution")
-	}
-
-	// Check hash of the contribution
-	h := computeHash(&current)
-	if !bytes.Equal(current.Hash, h) {
-		return errors.New("couldn't verify hash of contribution")
-	}
-
-	return nil
-}
-
-func transformG1(inputFile, outputFile *os.File, position int64, size int) error {
-	var g1 bn254.G1Affine
-	if _, err :=
inputFile.Seek(position, io.SeekStart); err != nil { - return err - } - reader := bufio.NewReader(inputFile) - writer := bufio.NewWriter(outputFile) - defer writer.Flush() - - dec := bn254.NewDecoder(reader) - enc := bn254.NewEncoder(writer) - - for i := 0; i < size; i++ { - if err := dec.Decode(&g1); err != nil { - return err - } - if err := enc.Encode(&g1); err != nil { - return err - } - } - return nil -} - -func transformG2(inputFile, outputFile *os.File, position int64, size int) error { - var g2 bn254.G2Affine - if _, err := inputFile.Seek(position, io.SeekStart); err != nil { - return err - } - reader := bufio.NewReader(inputFile) - writer := bufio.NewWriter(outputFile) - defer writer.Flush() - - dec := bn254.NewDecoder(reader) - enc := bn254.NewEncoder(writer) - - for i := 0; i < size; i++ { - if err := dec.Decode(&g2); err != nil { - return err - } - if err := enc.Encode(&g2); err != nil { - return err - } - } - return nil -} diff --git a/phase2/contribution.go b/phase2/contribution.go deleted file mode 100644 index fdbd8ae..0000000 --- a/phase2/contribution.go +++ /dev/null @@ -1,69 +0,0 @@ -package phase2 - -import ( - "crypto/sha256" - "io" - - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/worldcoin/semaphore-mtb-setup/common" -) - -type Contribution struct { - Delta bn254.G1Affine - PublicKey common.PublicKey - Hash []byte -} - -func (c *Contribution) writeTo(writer io.Writer) (int64, error) { - toEncode := []interface{}{ - &c.Delta, - &c.PublicKey.S, - &c.PublicKey.SX, - &c.PublicKey.SPX, - } - - enc := bn254.NewEncoder(writer) - for _, v := range toEncode { - if err := enc.Encode(v); err != nil { - return enc.BytesWritten(), err - } - } - nBytes, err := writer.Write(c.Hash) - return int64(nBytes), err -} - -func (c *Contribution) readFrom(reader io.Reader) (int64, error) { - toDecode := []interface{}{ - &c.Delta, - &c.PublicKey.S, - &c.PublicKey.SX, - &c.PublicKey.SPX, - } - - dec := bn254.NewDecoder(reader) - for _, v := range toDecode { - if err := dec.Decode(v); err != nil { - return dec.BytesRead(), err - } - } - c.Hash = make([]byte, 32) - nBytes, err := io.ReadFull(reader, c.Hash) - return int64(nBytes), err -} - -func computeHash(c *Contribution) []byte { - sha := sha256.New() - toEncode := []interface{}{ - &c.Delta, - &c.PublicKey.S, - &c.PublicKey.SX, - &c.PublicKey.SPX, - } - - enc := bn254.NewEncoder(sha) - for _, v := range toEncode { - enc.Encode(v) - } - - return sha.Sum(nil) -} diff --git a/phase2/header.go b/phase2/header.go deleted file mode 100644 index 9485872..0000000 --- a/phase2/header.go +++ /dev/null @@ -1,44 +0,0 @@ -package phase2 - -import ( - "encoding/gob" - "io" -) - -type Header struct { - Wires int - Witness int - Public int - PrivateCommitted int - Constraints int - Domain int - Contributions int -} - -func (h *Header) Read(reader io.Reader) error { - dec := gob.NewDecoder(reader) - if err := dec.Decode(h); err != nil { - return err - } - return nil -} - -func (h *Header) write(writer io.Writer) error { - enc := gob.NewEncoder(writer) - if err := enc.Encode(*h); err != nil { - return err - } - return nil -} - -func (h *Header) Equal(h2 *Header) bool { - if h.Wires == h2.Wires && - h.Witness == h2.Witness && - h.Public == h2.Public && - h.PrivateCommitted == h2.PrivateCommitted && - h.Constraints == h2.Constraints && - h.Domain == h2.Domain { - return true - } - return false -} diff --git a/phase2/lagrange.go b/phase2/lagrange.go deleted file mode 100644 index a4981ff..0000000 --- a/phase2/lagrange.go +++ /dev/null @@ -1,66 +0,0 
@@ -package phase2 - -import ( - "bufio" - "io" - "os" - - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark-crypto/ecc/bn254/fr/fft" - "github.com/worldcoin/semaphore-mtb-setup/lagrange" -) - -func lagrangeG1(phase1File, lagFile *os.File, position int64, domain *fft.Domain) error { - if _, err := phase1File.Seek(position, io.SeekStart); err != nil { - return err - } - - reader := bufio.NewReader(phase1File) - writer := bufio.NewWriter(lagFile) - defer writer.Flush() - dec := bn254.NewDecoder(reader) - enc := bn254.NewEncoder(writer) - - size := int(domain.Cardinality) - buff := make([]bn254.G1Affine, size) - for i := 0; i < len(buff); i++ { - if err := dec.Decode(&buff[i]); err != nil { - return err - } - } - - lagrange.ConvertG1(buff, domain) - - if err := enc.Encode(buff); err != nil { - return err - } - return nil -} - -func lagrangeG2(phase1File, lagFile *os.File, position int64, domain *fft.Domain) error { - // Seek to position - if _, err := phase1File.Seek(position, io.SeekStart); err != nil { - return err - } - - reader := bufio.NewReader(phase1File) - writer := bufio.NewWriter(lagFile) - defer writer.Flush() - dec := bn254.NewDecoder(reader) - enc := bn254.NewEncoder(writer) - - size := int(domain.Cardinality) - buff := make([]bn254.G2Affine, size) - for i := 0; i < len(buff); i++ { - if err := dec.Decode(&buff[i]); err != nil { - return err - } - } - - lagrange.ConvertG2(buff, domain) - - if err := enc.Encode(buff); err != nil { - return err - } - return nil -} diff --git a/phase2/phase2.go b/phase2/phase2.go deleted file mode 100644 index 8fbfc61..0000000 --- a/phase2/phase2.go +++ /dev/null @@ -1,259 +0,0 @@ -package phase2 - -import ( - "bufio" - "encoding/hex" - "fmt" - "math/big" - "os" - - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark-crypto/ecc/bn254/fr" - "github.com/worldcoin/semaphore-mtb-setup/common" -) - -func Initialize(phase1Path, r1csPath, phase2Path string) error { - phase1File, err := os.Open(phase1Path) - if err != nil { - return err - } - defer phase1File.Close() - - phase2File, err := os.Create(phase2Path) - if err != nil { - return err - } - defer phase2File.Close() - - // 1. Process Headers - header1, header2, err := processHeader(r1csPath, phase1File, phase2File) - if err != nil { - return err - } - - // 2. Convert phase 1 SRS to Lagrange basis - if err := processLagrange(header1, header2, phase1File, phase2File); err != nil { - return err - } - - // 3. 
Process evaluation - if err := processEvaluations(header1, header2, r1csPath, phase1File); err != nil { - return err - } - - // Evaluate Delta and Z - if err := processDeltaAndZ(header1, header2, phase1File, phase2File); err != nil { - return err - } - - // Process parameters - if err := processPVCKK(header1, header2, r1csPath, phase2File); err != nil { - return err - } - - fmt.Println("Phase 2 has been initialized successfully") - return nil -} - -func Contribute(inputPath, outputPath string) error { - // Input file - inputFile, err := os.Open(inputPath) - if err != nil { - return err - } - defer inputFile.Close() - reader := bufio.NewReader(inputFile) - dec := bn254.NewDecoder(reader) - - // Output file - outputFile, err := os.Create(outputPath) - if err != nil { - return err - } - defer outputFile.Close() - writer := bufio.NewWriter(outputFile) - defer writer.Flush() - enc := bn254.NewEncoder(writer) - - // Read/Write header with extra contribution - var header Header - if err := header.Read(reader); err != nil { - return err - } - fmt.Printf("Current #Contributions := %d\n", header.Contributions) - header.Contributions++ - if err := header.write(writer); err != nil { - return err - } - - // Sample toxic parameters - fmt.Println("Sampling toxic parameters Delta") - // Sample toxic δ - var delta, deltaInv fr.Element - var deltaBI, deltaInvBI big.Int - delta.SetRandom() - deltaInv.Inverse(&delta) - - delta.BigInt(&deltaBI) - deltaInv.BigInt(&deltaInvBI) - - // Process δ₁ - fmt.Println("Processing DeltaG1 and DeltaG2") - var delta1 bn254.G1Affine - if err := dec.Decode(&delta1); err != nil { - return err - } - delta1.ScalarMultiplication(&delta1, &deltaBI) - if err := enc.Encode(&delta1); err != nil { - return err - } - - // Process δ₂ - var delta2 bn254.G2Affine - if err := dec.Decode(&delta2); err != nil { - return err - } - delta2.ScalarMultiplication(&delta2, &deltaBI) - if err := enc.Encode(&delta2); err != nil { - return err - } - - // Process Z using δ⁻¹ - if err = scale(dec, enc, header.Domain, &deltaInvBI); err != nil { - return err - } - - // Process PKK using δ⁻¹ - if err = scale(dec, enc, header.Witness, &deltaInvBI); err != nil { - return err - } - - // Copy old contributions - nExistingContributions := header.Contributions - 1 - var c Contribution - for i := 0; i < nExistingContributions; i++ { - if _, err := c.readFrom(reader); err != nil { - return err - } - if _, err := c.writeTo(writer); err != nil { - return err - } - } - - // Get hash of previous contribution - var prevHash []byte - if nExistingContributions == 0 { - prevHash = nil - } else { - prevHash = c.Hash - } - - var contribution Contribution - contribution.Delta.Set(&delta1) - contribution.PublicKey = common.GenPublicKey(delta, prevHash, 1) - contribution.Hash = computeHash(&contribution) - - // Write the contribution - contribution.writeTo(writer) - - fmt.Println("Contirbution has been successful!") - fmt.Println("Contribution Hash := ", hex.EncodeToString(contribution.Hash)) - - return nil -} - -func Verify(inputPath, originPath string) error { - // Input file - inputFile, err := os.Open(inputPath) - if err != nil { - return err - } - defer inputFile.Close() - - // Origin file from Phase2.Initialize - originFile, err := os.Open(originPath) - if err != nil { - return err - } - defer originFile.Close() - - inputReader := bufio.NewReader(inputFile) - inputDec := bn254.NewDecoder(inputReader) - originReader := bufio.NewReader(originFile) - originDec := bn254.NewDecoder(originReader) - - // Read curHeader - var 
curHeader, orgHeader Header - if err := curHeader.Read(inputReader); err != nil { - return err - } - - if err := orgHeader.Read(originReader); err != nil { - return err - } - if curHeader.Contributions == 0 { - return fmt.Errorf("there are no contributions to verify") - } - if !curHeader.Equal(&orgHeader) { - return fmt.Errorf("there is a mismatch between origin and curren headers for phase 2") - } - - // Read [δ]₁ and [δ]₂ - var d1, g1 bn254.G1Affine - var d2, g2 bn254.G2Affine - if err := originDec.Decode(&g1); err != nil { - return err - } - if err := originDec.Decode(&g2); err != nil { - return err - } - if err := inputDec.Decode(&d1); err != nil { - return err - } - if err := inputDec.Decode(&d2); err != nil { - return err - } - - // Check δ₁ and δ₂ are consistent - if !common.SameRatio(g1, d1, d2, g2) { - return fmt.Errorf("deltaG1 and deltaG2 aren't consistent") - } - - // Check Z is updated correctly from origin to the latest state - fmt.Println("Verifying update of Z") - if err := verifyParameter(&d2, &g2, inputDec, originDec, curHeader.Domain, "Z"); err != nil { - return err - } - - // Check PKK is updated correctly from origin to the latest state - fmt.Println("Verifying update of PKK") - if err := verifyParameter(&d2, &g2, inputDec, originDec, curHeader.Witness, "PKK"); err != nil { - return err - } - - // Verify contributions - fmt.Printf("#Contributions := %d\n", curHeader.Contributions) - var prevDelta = g1 - var prevHash []byte = nil - var c Contribution - for i := 0; i < curHeader.Contributions; i++ { - if _, err := c.readFrom(inputReader); err != nil { - return err - } - fmt.Printf("Verifying contribution %d with Hash := %s\n", i+1, hex.EncodeToString(c.Hash)) - if err := verifyContribution(&c, prevDelta, prevHash); err != nil { - return err - } - prevDelta = c.Delta - prevHash = c.Hash - } - - // Verify last contribution has the same delta in parameters - fmt.Println("Verifying Delta of last contribution") - if !c.Delta.Equal(&d1) { - return fmt.Errorf("delta of last contribution delta isn't the same as in parameters") - } - - fmt.Println("Contributions verification has been successful") - return nil -} diff --git a/phase2/utils.go b/phase2/utils.go deleted file mode 100644 index 92e2124..0000000 --- a/phase2/utils.go +++ /dev/null @@ -1,596 +0,0 @@ -package phase2 - -import ( - "bufio" - "bytes" - "encoding/gob" - "errors" - "fmt" - "io" - "math" - "math/big" - "os" - - "github.com/consensys/gnark-crypto/ecc" - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark-crypto/ecc/bn254/fr" - "github.com/consensys/gnark-crypto/ecc/bn254/fr/fft" - "github.com/consensys/gnark/constraint" - cs_bn254 "github.com/consensys/gnark/constraint/bn254" - "github.com/worldcoin/semaphore-mtb-setup/common" - "github.com/worldcoin/semaphore-mtb-setup/phase1" -) - -func nextPowerofTwo(number int) int { - res := 2 - for i := 1; i < 28; i++ { // max power is 28 - if res >= number { - return res - } else { - res *= 2 - } - } - // Shouldn't happen - panic("the power is beyond 28") -} - -func processHeader(r1csPath string, phase1File, phase2File *os.File) (*phase1.Header, *Header, error) { - fmt.Println("Processing the headers ...") - - var header2 Header - var header1 phase1.Header - - // Read the #Constraints - r1csFile, err := os.Open(r1csPath) - if err != nil { - return nil, nil, err - } - defer r1csFile.Close() - var r1cs cs_bn254.R1CS - if _, err := r1cs.ReadFrom(r1csFile); err != nil { - return nil, nil, err - } - header2.Constraints = r1cs.GetNbConstraints() - 
header2.Domain = nextPowerofTwo(header2.Constraints) - - // Check if phase 1 power can support the current #Constraints - if err := header1.ReadFrom(phase1File); err != nil { - return nil, nil, err - } - N := int(math.Pow(2, float64(header1.Power))) - if N < header2.Constraints { - return nil, nil, fmt.Errorf("phase 1 parameters can support up to %d, but the circuit #Constraints are %d", N, header2.Constraints) - } - // Initialize Domain, #Wires, #Witness, #Public, #PrivateCommitted - header2.Wires = r1cs.NbInternalVariables + r1cs.GetNbPublicVariables() + r1cs.GetNbSecretVariables() - header2.PrivateCommitted = r1cs.CommitmentInfo.NbPrivateCommitted - header2.Public = r1cs.GetNbPublicVariables() - header2.Witness = r1cs.GetNbSecretVariables() + r1cs.NbInternalVariables - header2.PrivateCommitted - - if r1cs.CommitmentInfo.Is() { // the commitment itself is defined by a hint so the prover considers it private - header2.Public++ // but the verifier will need to inject the value itself so on the groth16 - header2.Witness-- // level it must be considered public - } - - // Write header of phase 2 - if err := header2.write(phase2File); err != nil { - return nil, nil, err - } - fmt.Printf("Circuit Info: #Constraints:=%d\n#Wires:=%d\n#Public:=%d\n#Witness:=%d\n#PrivateCommitted:=%d\n", - header2.Constraints, header2.Wires, header2.Public, header2.Witness, header2.PrivateCommitted) - return &header1, &header2, nil -} - -func processLagrange(header1 *phase1.Header, header2 *Header, phase1File, phase2File *os.File) error { - fmt.Println("Converting to Lagrange basis ...") - domain := fft.NewDomain(uint64(header2.Domain)) - N := int(math.Pow(2, float64(header1.Power))) - - lagFile, err := os.Create("srs.lag") - if err != nil { - return err - } - defer lagFile.Close() - - // TauG1 - fmt.Println("Converting TauG1") - pos := int64(3) - if err := lagrangeG1(phase1File, lagFile, pos, domain); err != nil { - return err - } - // AlphaTauG1 - fmt.Println("Converting AlphaTauG1") - pos += 32 * (2*int64(N) - 1) - if err := lagrangeG1(phase1File, lagFile, pos, domain); err != nil { - return err - } - - // BetaTauG1 - fmt.Println("Converting BetaTauG1") - pos += 32 * int64(N) - if err := lagrangeG1(phase1File, lagFile, pos, domain); err != nil { - return err - } - - // TauG2 - fmt.Println("Converting TauG2") - pos += 32 * int64(N) - if err := lagrangeG2(phase1File, lagFile, pos, domain); err != nil { - return err - } - - return nil -} - -func processEvaluations(header1 *phase1.Header, header2 *Header, r1csPath string, phase1File *os.File) error { - fmt.Println("Processing evaluation of [A]₁, [B]₁, [B]₂") - - lagFile, err := os.Open("srs.lag") - if err != nil { - return err - } - defer lagFile.Close() - - evalFile, err := os.Create("evals") - if err != nil { - return err - } - defer evalFile.Close() - - // Read [α]₁ , [β]₁ , [β]₂ from phase1 (Check Phase 1 file format for reference) - alpha, beta1, beta2, err := readPhase1(phase1File, header1.Power) - if err != nil { - return err - } - - // Write [α]₁ , [β]₁ , [β]₂ - enc := bn254.NewEncoder(evalFile) - if err := enc.Encode(alpha); err != nil { - return err - } - if err := enc.Encode(beta1); err != nil { - return err - } - if err := enc.Encode(beta2); err != nil { - return err - } - - var tauG1 []bn254.G1Affine - - // Read R1CS File - r1csFile, err := os.Open(r1csPath) - if err != nil { - return err - } - defer r1csFile.Close() - var r1cs cs_bn254.R1CS - if _, err := r1cs.ReadFrom(r1csFile); err != nil { - return err - } - - // Deserialize Lagrange SRS TauG1 - dec 
:= bn254.NewDecoder(lagFile) - if err := dec.Decode(&tauG1); err != nil { - return err - } - - // Accumlate {[A]₁} - buff := make([]bn254.G1Affine, header2.Wires) - for i, c := range r1cs.Constraints { - for _, t := range c.L { - accumulateG1(&r1cs, &buff[t.WireID()], t, &tauG1[i]) - } - } - // Serialize {[A]₁} - if err := enc.Encode(buff); err != nil { - return err - } - - // Reset buff - buff = make([]bn254.G1Affine, header2.Wires) - // Accumlate {[B]₁} - for i, c := range r1cs.Constraints { - for _, t := range c.R { - accumulateG1(&r1cs, &buff[t.WireID()], t, &tauG1[i]) - } - } - // Serialize {[B]₁} - if err := enc.Encode(buff); err != nil { - return err - } - - var tauG2 []bn254.G2Affine - buff2 := make([]bn254.G2Affine, header2.Wires) - - // Seek to Lagrange SRS TauG2 by skipping AlphaTau and BetaTau - pos := 2*32*int64(header2.Domain) + 2*4 - if _, err := lagFile.Seek(pos, io.SeekCurrent); err != nil { - return err - } - - // Deserialize Lagrange SRS TauG2 - if err := dec.Decode(&tauG2); err != nil { - return err - } - // Accumlate {[B]₂} - for i, c := range r1cs.Constraints { - for _, t := range c.R { - accumulateG2(&r1cs, &buff2[t.WireID()], t, &tauG2[i]) - } - } - // Serialize {[B]₂} - if err := enc.Encode(buff2); err != nil { - return err - } - - return nil -} - -func processDeltaAndZ(header1 *phase1.Header, header2 *Header, phase1File, phase2File *os.File) error { - fmt.Println("Processing Delta and Z") - writer := bufio.NewWriter(phase2File) - defer writer.Flush() - enc := bn254.NewEncoder(writer) - - // Write [δ]₁ and [δ]₂ - _, _, g1, g2 := bn254.Generators() - if err := enc.Encode(&g1); err != nil { - return err - } - if err := enc.Encode(&g2); err != nil { - return err - } - - // Seek to TauG1 - var pos int64 = 3 - if _, err := phase1File.Seek(pos, io.SeekStart); err != nil { - return err - } - reader := bufio.NewReader(phase1File) - dec := bn254.NewDecoder(reader) - - n := header2.Domain - tauG1 := make([]bn254.G1Affine, 2*n-1) - for i := 0; i < len(tauG1); i++ { - if err := dec.Decode(&tauG1[i]); err != nil { - return err - } - } - - // Calculate Z - Z := make([]bn254.G1Affine, n) - for i := 0; i < n-1; i++ { - Z[i].Sub(&tauG1[i+n], &tauG1[i]) - } - common.BitReverseG1(Z) - - // Write Z - for i := 0; i < len(Z); i++ { - if err := enc.Encode(&Z[i]); err != nil { - return err - } - } - return nil -} - -func processPVCKK(header1 *phase1.Header, header2 *Header, r1csPath string, phase2File *os.File) error { - fmt.Println("Processing PKK, VKK, and CKK") - lagFile, err := os.Open("srs.lag") - if err != nil { - return err - } - defer lagFile.Close() - - // Read R1CS File - r1csFile, err := os.Open(r1csPath) - if err != nil { - return err - } - defer r1csFile.Close() - var r1cs cs_bn254.R1CS - if _, err := r1cs.ReadFrom(r1csFile); err != nil { - return err - } - - var buffSRS []bn254.G1Affine - reader := bufio.NewReader(lagFile) - writer := bufio.NewWriter(phase2File) - defer writer.Flush() - dec := bn254.NewDecoder(reader) - enc := bn254.NewEncoder(writer) - - // L = O(TauG1) + R(AlphaTauG1) + L(BetaTauG1) - L := make([]bn254.G1Affine, header2.Wires) - - // Deserialize Lagrange SRS TauG1 - if err := dec.Decode(&buffSRS); err != nil { - return err - } - - for i, c := range r1cs.Constraints { - // Output(Tau) - for _, t := range c.O { - accumulateG1(&r1cs, &L[t.WireID()], t, &buffSRS[i]) - } - } - - // Deserialize Lagrange SRS AlphaTauG1 - if err := dec.Decode(&buffSRS); err != nil { - return err - } - for i, c := range r1cs.Constraints { - // Right(AlphaTauG1) - for _, t := range 
c.R { - accumulateG1(&r1cs, &L[t.WireID()], t, &buffSRS[i]) - } - } - - // Deserialize Lagrange SRS BetaTauG1 - if err := dec.Decode(&buffSRS); err != nil { - return err - } - for i, c := range r1cs.Constraints { - // Left(BetaTauG1) - for _, t := range c.L { - accumulateG1(&r1cs, &L[t.WireID()], t, &buffSRS[i]) - } - } - - pkk, vkk, ckk := filterL(L, header2, &r1cs.CommitmentInfo) - // Write PKK - for i := 0; i < len(pkk); i++ { - if err := enc.Encode(&pkk[i]); err != nil { - return err - } - } - - // VKK - evalFile, err := os.OpenFile("evals", os.O_APPEND|os.O_WRONLY, 0644) - if err != nil { - return err - } - defer evalFile.Close() - evalWriter := bufio.NewWriter(evalFile) - defer evalWriter.Flush() - evalEnc := bn254.NewEncoder(evalWriter) - if err := evalEnc.Encode(vkk); err != nil { - return err - } - - // Write CKK - if err := evalEnc.Encode(ckk); err != nil { - return err - } - - // Write CommitmentInfo - cmtEnc := gob.NewEncoder(evalWriter) - if err := cmtEnc.Encode(r1cs.CommitmentInfo); err != nil { - return err - } - - return nil -} - -func accumulateG1(r1cs *cs_bn254.R1CS, res *bn254.G1Affine, t constraint.Term, value *bn254.G1Affine) { - cID := t.CoeffID() - switch cID { - case constraint.CoeffIdZero: - return - case constraint.CoeffIdOne: - res.Add(res, value) - case constraint.CoeffIdMinusOne: - res.Sub(res, value) - case constraint.CoeffIdTwo: - res.Add(res, value).Add(res, value) - default: - var tmp bn254.G1Affine - var vBi big.Int - r1cs.Coefficients[cID].BigInt(&vBi) - tmp.ScalarMultiplication(value, &vBi) - res.Add(res, &tmp) - } -} - -func accumulateG2(r1cs *cs_bn254.R1CS, res *bn254.G2Affine, t constraint.Term, value *bn254.G2Affine) { - cID := t.CoeffID() - switch cID { - case constraint.CoeffIdZero: - return - case constraint.CoeffIdOne: - res.Add(res, value) - case constraint.CoeffIdMinusOne: - res.Sub(res, value) - case constraint.CoeffIdTwo: - res.Add(res, value).Add(res, value) - default: - var tmp bn254.G2Affine - var vBi big.Int - r1cs.Coefficients[cID].BigInt(&vBi) - tmp.ScalarMultiplication(value, &vBi) - res.Add(res, &tmp) - } -} - -func scale(dec *bn254.Decoder, enc *bn254.Encoder, N int, delta *big.Int) error { - // Allocate batch with smallest of (N, batchSize) - const batchSize = 1048576 // 2^20 - var initialSize = int(math.Min(float64(N), float64(batchSize))) - buff := make([]bn254.G1Affine, initialSize) - - remaining := N - for remaining > 0 { - // Read batch - readCount := int(math.Min(float64(remaining), float64(batchSize))) - for i := 0; i < readCount; i++ { - if err := dec.Decode(&buff[i]); err != nil { - return err - } - } - - // Process the batch - common.Parallelize(readCount, func(start, end int) { - for i := start; i < end; i++ { - buff[i].ScalarMultiplication(&buff[i], delta) - } - }) - - // Write batch - for i := 0; i < readCount; i++ { - if err := enc.Encode(&buff[i]); err != nil { - return err - } - } - - // Update remaining - remaining -= readCount - } - - return nil -} - -func verifyContribution(c *Contribution, prevDelta bn254.G1Affine, prevHash []byte) error { - // Compute SP for δ - deltaSP := common.GenSP(c.PublicKey.S, c.PublicKey.SX, prevHash, 1) - - // Check for knowledge of δ - if !common.SameRatio(c.PublicKey.S, c.PublicKey.SX, c.PublicKey.SPX, deltaSP) { - return errors.New("couldn't verify knowledge of Delta") - } - - // Check for valid update δ using previous parameters - if !common.SameRatio(c.Delta, prevDelta, deltaSP, c.PublicKey.SPX) { - return errors.New("couldn't verify that [δ]₁ is based on previous contribution") - 
} - // Verify contribution hash - b := computeHash(c) - if !bytes.Equal(c.Hash, b) { - return fmt.Errorf("contribution hash is invalid") - } - - return nil -} - -func verifyParameter(delta, g *bn254.G2Affine, inputDecoder, originDecoder *bn254.Decoder, size int, field string) error { - // aggregate points - if in, or, err := aggregate(inputDecoder, originDecoder, size); err != nil { - return nil - } else { - if !common.SameRatio(*in, *or, *delta, *g) { - return fmt.Errorf("inconsistent update to %s", field) - } - } - return nil -} - -func aggregate(inputDecoder, originDecoder *bn254.Decoder, size int) (*bn254.G1Affine, *bn254.G1Affine, error) { - var inG, orG, tmp bn254.G1Affine - // Allocate batch with smallest of (N, batchSize) - const batchSize = 1048576 // 2^20 - var initialSize = int(math.Min(float64(size), float64(batchSize))) - buff := make([]bn254.G1Affine, initialSize) - r := make([]fr.Element, size) - - remaining := size - for remaining > 0 { - - // generate randomness - common.Parallelize(len(r), func(start, end int) { - for i := start; i < end; i++ { - r[i].SetRandom() - } - }) - - // Read from input - readCount := int(math.Min(float64(remaining), float64(batchSize))) - for i := 0; i < readCount; i++ { - if err := inputDecoder.Decode(&buff[i]); err != nil { - return nil, nil, err - } - } - - // Aggregate input - if _, err := tmp.MultiExp(buff[:readCount], r[:readCount], ecc.MultiExpConfig{}); err != nil { - return nil, nil, err - } - inG.Add(&inG, &tmp) - - // Read from origin - for i := 0; i < readCount; i++ { - if err := originDecoder.Decode(&buff[i]); err != nil { - return nil, nil, err - } - } - - // Aggregate origin - if _, err := tmp.MultiExp(buff[:readCount], r[:readCount], ecc.MultiExpConfig{}); err != nil { - return nil, nil, err - } - orG.Add(&orG, &tmp) - - // Update remaining - remaining -= readCount - } - - return &inG, &orG, nil -} - -func filterL(L []bn254.G1Affine, header2 *Header, cmtInfo *constraint.Commitment) ([]bn254.G1Affine, []bn254.G1Affine, []bn254.G1Affine) { - pkk := make([]bn254.G1Affine, header2.Witness) - vkk := make([]bn254.G1Affine, header2.Public) - ckk := make([]bn254.G1Affine, header2.PrivateCommitted) - vI, cI := 0, 0 - for i := range L { - isCommittedPrivate := cI < cmtInfo.NbPrivateCommitted && i == cmtInfo.PrivateCommitted()[i] - isCommitment := cmtInfo.Is() && i == cmtInfo.CommitmentIndex - isPublic := i < header2.Public - if isCommittedPrivate { - ckk[cI].Set(&L[i]) - cI++ - } else if isCommitment || isPublic { - vkk[vI].Set(&L[i]) - vI++ - } else { - pkk[i-cI-vI].Set(&L[i]) - } - } - - return pkk, vkk, ckk -} - -func readPhase1(phase1File *os.File, power byte) (*bn254.G1Affine, *bn254.G1Affine, *bn254.G2Affine, error) { - var alpha, beta1 bn254.G1Affine - var beta2 bn254.G2Affine - N := int64(math.Pow(2, float64(power))) - const HeaderSize = 3 - posAlpha := HeaderSize + 32*(2*N-1) - posBeta1 := posAlpha + 32*N - posBeta2 := posBeta1 + 96*N - - dec := bn254.NewDecoder(phase1File) - // Read AlphaG1 - if _, err := phase1File.Seek(posAlpha, io.SeekStart); err != nil { - return nil, nil, nil, err - } - if err := dec.Decode(&alpha); err != nil { - return nil, nil, nil, err - } - - // Read BetaG1 - if _, err := phase1File.Seek(posBeta1, io.SeekStart); err != nil { - return nil, nil, nil, err - } - if err := dec.Decode(&beta1); err != nil { - return nil, nil, nil, err - } - - // Read BetaG2 - if _, err := phase1File.Seek(posBeta2, io.SeekStart); err != nil { - return nil, nil, nil, err - } - if err := dec.Decode(&beta2); err != nil { - return 
nil, nil, nil, err - } - - return &alpha, &beta1, &beta2, nil - -} diff --git a/test/example_test.go b/test/example_test.go index 53e0b59..f6915eb 100644 --- a/test/example_test.go +++ b/test/example_test.go @@ -1,143 +1,145 @@ package test -import ( - "os" - "testing" - - "github.com/consensys/gnark/backend/groth16" - "github.com/consensys/gnark/std/hash/mimc" - - "github.com/consensys/gnark-crypto/ecc" - "github.com/consensys/gnark-crypto/ecc/bn254" - "github.com/consensys/gnark/frontend" - "github.com/consensys/gnark/frontend/cs/r1cs" - "github.com/worldcoin/semaphore-mtb-setup/keys" - "github.com/worldcoin/semaphore-mtb-setup/phase1" - "github.com/worldcoin/semaphore-mtb-setup/phase2" -) - -// Circuit defines a pre-image knowledge proof -// mimc(secret preImage) = public hash - -type Circuit struct { - // struct tag on a variable is optional - // default uses variable name and secret visibility. - PreImage frontend.Variable - Hash frontend.Variable `gnark:",public"` -} - -// Define declares the circuit's constraints -// Hash = mimc(PreImage) -func (circuit *Circuit) Define(api frontend.API) error { - // hash function - mimc, _ := mimc.NewMiMC(api) - - // specify constraints - // mimc(preImage) == hash - mimc.Write(circuit.PreImage) - api.AssertIsEqual(circuit.Hash, mimc.Sum()) - - return nil -} -func TestSetup(t *testing.T) { - - // Compile the circuit - var myCircuit Circuit - ccs, err := frontend.Compile(bn254.ID.ScalarField(), r1cs.NewBuilder, &myCircuit) - if err != nil { - t.Error(err) - } - writer, err := os.Create("circuit.r1cs") - if err != nil { - t.Error(err) - } - defer writer.Close() - ccs.WriteTo(writer) - - var power byte = 9 - - // Initialize to Phase 1 - if err := phase1.Initialize(power, "0.ph1"); err != nil { - t.Error(err) - } - - // Contribute to Phase 1 - if err := phase1.Contribute("0.ph1", "1.ph1"); err != nil { - t.Error(err) - } - if err := phase1.Contribute("1.ph1", "2.ph1"); err != nil { - t.Error(err) - } - if err := phase1.Contribute("2.ph1", "3.ph1"); err != nil { - t.Error(err) - } - if err := phase1.Contribute("3.ph1", "4.ph1"); err != nil { - t.Error(err) - } - - // Verify Phase 1 contributions - if err := phase1.Verify("4.ph1", ""); err != nil { - t.Error(err) - } - - // Phase 2 initialization - if err := phase2.Initialize("4.ph1", "circuit.r1cs", "0.ph2"); err != nil { - t.Error(err) - } - - // Contribute to Phase 2 - if err := phase2.Contribute("0.ph2", "1.ph2"); err != nil { - t.Error(err) - } - - if err := phase2.Contribute("1.ph2", "2.ph2"); err != nil { - t.Error(err) - } - - if err := phase2.Contribute("2.ph2", "3.ph2"); err != nil { - t.Error(err) - } - - // Verify Phase 2 contributions - if err := phase2.Verify("3.ph2", "0.ph2"); err != nil { - t.Error(err) - } - - if err := keys.ExtractKeys("3.ph2"); err != nil { - t.Error(err) - } -} - -func TestProveAndVerify(t *testing.T) { - // Compile the circuit - var myCircuit Circuit - ccs, _ := frontend.Compile(bn254.ID.ScalarField(), r1cs.NewBuilder, &myCircuit) - - // Read PK and VK - pkk := groth16.NewProvingKey(ecc.BN254) - pkFile, _ := os.Open("pk") - defer pkFile.Close() - vkFile, _ := os.Open("vk") - defer vkFile.Close() - pkk.ReadFrom(pkFile) - vkk := groth16.NewVerifyingKey(ecc.BN254) - vkk.ReadFrom(vkFile) - - assignment := &Circuit{ - PreImage: "16130099170765464552823636852555369511329944820189892919423002775646948828469", - Hash: "12886436712380113721405259596386800092738845035233065858332878701083870690753", - } - witness, _ := frontend.NewWitness(assignment, bn254.ID.ScalarField()) - 
prf, err := groth16.Prove(ccs, pkk, witness) - if err != nil { - panic(err) - } - pubWitness, err := witness.Public() - if err != nil { - panic(err) - } - err = groth16.Verify(prf, vkk, pubWitness) - if err != nil { - panic(err) - } -} +// import ( +// "os" +// "testing" + +// "github.com/consensys/gnark/backend/groth16" +// "github.com/consensys/gnark/std/hash/mimc" + +// "github.com/consensys/gnark-crypto/ecc" +// "github.com/consensys/gnark-crypto/ecc/bn254" +// "github.com/consensys/gnark/frontend" +// "github.com/consensys/gnark/frontend/cs/r1cs" +// "github.com/worldcoin/semaphore-mtb-setup/phase1" +// "github.com/worldcoin/semaphore-mtb-setup/phase2" +// ) + +// // Circuit defines a pre-image knowledge proof +// // mimc(secret preImage) = public hash + +// type Circuit struct { +// // struct tag on a variable is optional +// // default uses variable name and secret visibility. +// PreImage frontend.Variable +// Hash frontend.Variable `gnark:",public"` +// } + +// // Define declares the circuit's constraints +// // Hash = mimc(PreImage) +// func (circuit *Circuit) Define(api frontend.API) error { +// // hash function +// mimc, _ := mimc.NewMiMC(api) + +// // specify constraints +// // mimc(preImage) == hash +// mimc.Write(circuit.PreImage) +// api.AssertIsEqual(circuit.Hash, mimc.Sum()) + +// return nil +// } +// func TestSetup(t *testing.T) { + +// // Compile the circuit +// var myCircuit Circuit +// ccs, err := frontend.Compile(bn254.ID.ScalarField(), r1cs.NewBuilder, &myCircuit) +// if err != nil { +// t.Error(err) +// } +// writer, err := os.Create("circuit.r1cs") +// if err != nil { +// t.Error(err) +// } +// defer writer.Close() +// ccs.WriteTo(writer) + +// var power byte = 9 + +// // Initialize to Phase 1 +// if err := phase1.Initialize(power, "0.ph1"); err != nil { +// t.Error(err) +// } + +// // Contribute to Phase 1 +// if err := phase1.Contribute("0.ph1", "1.ph1"); err != nil { +// t.Error(err) +// } +// if err := phase1.Contribute("1.ph1", "2.ph1"); err != nil { +// t.Error(err) +// } +// if err := phase1.Contribute("2.ph1", "3.ph1"); err != nil { +// t.Error(err) +// } +// if err := phase1.Contribute("3.ph1", "4.ph1"); err != nil { +// t.Error(err) +// } + +// // Verify Phase 1 contributions +// if err := phase1.Verify("4.ph1", ""); err != nil { +// t.Error(err) +// } + +// // Phase 2 initialization +// if err := phase2.Initialize("4.ph1", "circuit.r1cs", "0.ph2"); err != nil { +// t.Error(err) +// } + +// // Contribute to Phase 2 +// if err := phase2.Contribute("0.ph2", "1.ph2"); err != nil { +// t.Error(err) +// } + +// if err := phase2.Contribute("1.ph2", "2.ph2"); err != nil { +// t.Error(err) +// } + +// if err := phase2.Contribute("2.ph2", "3.ph2"); err != nil { +// t.Error(err) +// } + +// // Verify Phase 2 contributions +// if err := phase2.Verify("3.ph2", "0.ph2"); err != nil { +// t.Error(err) +// } + +// if err := keys.ExtractKeys("3.ph2"); err != nil { +// t.Error(err) +// } +// } + +// func TestProveAndVerify(t *testing.T) { +// // Compile the circuit +// var myCircuit Circuit +// ccs, err := frontend.Compile(bn254.ID.ScalarField(), r1cs.NewBuilder, &myCircuit) +// if err != nil { +// t.Error(err) +// } + +// // Read PK and VK +// pkk := groth16.NewProvingKey(ecc.BN254) +// pkFile, _ := os.Open("pk") +// defer pkFile.Close() +// vkFile, _ := os.Open("vk") +// defer vkFile.Close() +// pkk.ReadFrom(pkFile) +// vkk := groth16.NewVerifyingKey(ecc.BN254) +// vkk.ReadFrom(vkFile) + +// assignment := &Circuit{ +// PreImage: 
"16130099170765464552823636852555369511329944820189892919423002775646948828469", +// Hash: "12886436712380113721405259596386800092738845035233065858332878701083870690753", +// } +// witness, _ := frontend.NewWitness(assignment, bn254.ID.ScalarField()) +// prf, err := groth16.Prove(ccs, pkk, witness) +// if err != nil { +// panic(err) +// } +// pubWitness, err := witness.Public() +// if err != nil { +// panic(err) +// } +// err = groth16.Verify(prf, vkk, pubWitness) +// if err != nil { +// panic(err) +// } +// }