
Modification: Large Dataset Multisig Process - v3.0 Single multisig entity to many approved clients #509

Closed
galen-mcandrew opened this issue Apr 16, 2022 · 6 comments

galen-mcandrew commented Apr 16, 2022

Issue Description

Since September 2021, we have created ~100 new ‘notary’ entities as multisigs, with all existing notaries listed as signers and a threshold of 2. This has added ~300PiB of DataCap to the top of the funnel. As of March, this has resulted in ~20PiB of verified deals. In other words, in six months of running version 2.0 of the LDN structure, we have driven a 2,000% increase in verified deals on the network.

However, with this notary election cycle closing, we need to decide how Large Dataset Notary multisigs will be structured going forward.

Impact

As we close our third round of notary elections, we will approximately double the number of notary addresses (from 23 to ~45, still finalizing counts). Adding these new notaries to the existing LDN multisig entities would require a massive technical lift: 2 messages per new notary address per existing LDN (~22 new notaries × ~100 LDNs × 2 ≈ 4,000 messages), all of which would need to come from existing signers/notaries rather than root key holders or governance team members (because the latter are not signers on the existing LDN multisigs).

There are two other potential alternatives to adding new notaries to existing multisigs: 1) only add new notaries to newly approved large dataset clients, or 2) deprecate all existing LDN multisigs by removing their remaining DataCap and creating ~100 new multisigs with all new notary addresses.

  1. The first option could create bottlenecks and confusion, with different sets of notaries available to sign large dataset allocations depending on when the client was approved.
  2. The second option brings technical debt of its own, requiring 2 messages per new LDN created and 2 messages per existing LDN deprecated. It also breaks tracking and dashboard continuity, as allocations to a client address would now fork across different notary multisigs. And it does not scale as we continue to grow the program or modify notary addresses.

In short, ratifying these new notaries (and doubling the number of notaries available to participate in the Large Dataset process) while keeping the same overall process (one LDN multisig with all signers -> one approved client) would incur massive technical debt, slowdowns, and confusion.
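The message-count arithmetic above can be sketched as follows. The figures (~100 existing LDN multisigs, ~22 new notary addresses) come from this proposal; the 2-messages-per-signer-change constant reflects the propose/approve pair a threshold-2 multisig requires:

```python
# Rough message-count comparison for the options discussed above.
# Each signer addition (or removal) on a threshold-2 multisig needs a
# propose message plus one approval, i.e. 2 on-chain messages.
MSGS_PER_SIGNER_CHANGE = 2

existing_ldns = 100   # ~100 LDN multisigs created since Sept 2021
new_notaries = 22     # round-3 additions (23 -> ~45 notary addresses)

# Option 0: add every new notary to every existing LDN multisig
retrofit_msgs = existing_ldns * new_notaries * MSGS_PER_SIGNER_CHANGE

# Option 2: deprecate all existing LDNs and recreate them
# (2 messages to deprecate each old one, 2 to create each new one)
recreate_msgs = existing_ldns * 2 + existing_ldns * 2

print(f"retrofit existing multisigs: ~{retrofit_msgs} messages")
print(f"deprecate and recreate:      ~{recreate_msgs} messages")
```

Either way, the work falls on existing notary signers, which is why neither retrofit path scales.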

Proposed Solution

Create a single LDN multisig with all 3rd-round notary addresses as signers, along with members of the governance team.

Start this LDN with two weeks' worth of DataCap (based on the current allocation rate to clients, ~25PiB). Utilize a bot to request top-ups from RKH, targeting a top-up every other week. Both the amount and the top-up rate could scale based on usage and RKH responsiveness.
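The top-up bot's sizing decision could be as simple as the sketch below. The function name and thresholds are assumptions for illustration, not an existing Fil+ tool; the ~25PiB-per-two-weeks rate is the figure from this proposal:

```python
# Illustrative top-up sizing for the proposed RKH top-up bot.
PIB = 2**50  # bytes in a pebibyte

WEEKLY_ALLOCATION_RATE = 12.5 * PIB  # ~25 PiB per two weeks (proposal figure)
TARGET_WEEKS_OF_RUNWAY = 2

def top_up_request(current_balance: float) -> float:
    """Return the DataCap (in bytes) to request from RKH, or 0 if none needed.

    Requests enough to restore the target runway once the multisig's
    remaining balance drops below one week's worth of allocations.
    """
    target = WEEKLY_ALLOCATION_RATE * TARGET_WEEKS_OF_RUNWAY
    if current_balance < WEEKLY_ALLOCATION_RATE:
        return target - current_balance
    return 0.0
```

A bot running this check on a schedule (against the multisig's verified-registry balance) would naturally settle into the every-other-week cadence described above, while scaling requests up or down with actual usage.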

By including the governance team in management of the multisig, we greatly decrease the lift of managing addresses (adding or removing notaries). Short term, we will clearly state that the governance team will not sign allocation requests. Long term, we could enforce this programmatically through things like the FVM or even a “multisig of multisigs” structure.

We would lose the (inflated) top-of-funnel metric that tracks on-chain how much DataCap has been allocated to ‘notaries’. For example, at this time 338PiB of DataCap has been awarded on chain to large dataset notaries. This metric would revert to DataCap allocated for direct notary behavior, and we would need a new way to track large dataset funnel requests: for example, the cumulative volume of DataCap requested by clients in approved applications (this is not a chain value; it would need to be summed from GitHub issues).

This change also alleviates the current artificial ceiling of 5PiB per client application. That ceiling was put in place to help bound the changes from LDN v2.0 and to standardize the total each new client could request, minimizing runaway threats. By restructuring to a single LDN address that serves many clients, we could allow each applicant to request the total expected volume for their dataset, including replicas. Overall, this would increase the volume of verified data requested.
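Summing requested DataCap from GitHub issues could work roughly as below. The "DataCap requested:" field label and units are assumptions about the application issue template, not a confirmed format:

```python
import re

# Sketch of summing cumulative DataCap requested across approved LDN
# application issues, since this value does not exist on chain.
UNIT_BYTES = {"TiB": 2**40, "PiB": 2**50}
FIELD = re.compile(r"DataCap requested:\s*([\d.]+)\s*(TiB|PiB)", re.I)

def requested_bytes(issue_body: str) -> int:
    """Extract the requested DataCap (in bytes) from one issue body."""
    m = FIELD.search(issue_body)
    if not m:
        return 0
    amount, unit = m.groups()
    # normalise unit casing before the lookup (regex match is case-insensitive)
    unit = {"tib": "TiB", "pib": "PiB"}[unit.lower()]
    return int(float(amount) * UNIT_BYTES[unit])

def total_requested(issue_bodies: list[str]) -> int:
    """Sum requested DataCap across all approved application issues."""
    return sum(requested_bytes(b) for b in issue_bodies)
```

In practice the issue bodies would be fetched via the GitHub API for the application repo and filtered to approved applications before summing.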

Summary: transition from “1 notary multisig to 1 client” into 1:many

Multisig Addresses at this time:

Admin:

  • Galen: f1a5b5cn5xgsmui2kivzj2eryzrqmhmew4tytj3by
  • Deep: f14l6ugtjaqp72yfrum6mwpbmw6v6rncbqp64526y
  • Kevin: f15hsz52t266gczi2fbyx7rrxbsngek6cqmcvpthi

Notaries:

  • Deon Erda - 4Everland - f1wp5enznnbgzyoaiafzlox3h3bkwcl347iwqivga
  • Irene Young - 12ships Foundation - f1d4gmpqz3execjj2wvrxuuhvbms5mzh7t7yqrviq
  • Simon - 1475 - f1ofq4mngy7ggcp755pfquq2gphjjnlydolf6awtq
  • Claudia Richoux - Banyan - f1oc6qvenzp7wsriu7edyebb325gnaovktmujl7jq
  • BingHe Web3.0 Lab - f14gme3f52prtyzk6pblogrdd6b6ivp4swc6qmesi
  • Suki - Bitrise Capital - f1dfkdmjhuvvol6okun57mix447wdpolwcd323ktq
  • Sounghwan Park - BlockChain World - f1qdko4jg25vo35qmyvcrw4ak4fmuu3f5rif2kc7i
  • BlockMaker - f1o3twrcpwjtpcd4q36lpq4qmy2qfbgtyy5h6tsty
  • Eric - ByteBase - f1yh6q3nmsg7i2sys7f7dexcuajgoweudcqj2chfi
  • Cabrina Huang - f1a2lia2cwwekeubwo4nppt4v4vebxs2frozarz3q
  • Coffee Cloud - f14qmuid2b6ne4342m5dk56f4rcr7y5sz4sg5fiwy
  • Eden - DeFil - f1dnb3uz7sylxk6emti3ififcvu3nlufnnsjui6ea
  • Alex Kim - Define Platform - f1hhippi64yiyhpjdtbidfyzma6irc2nuav7mrwmi
  • Gary Gao - FBG Capital - f1zffqhxwq2rrg7rtot6lmkl6hb2xyrrseawprzsq
  • Fenbushi Capital - f1yqydpmqb5en262jpottko2kd65msajax7fi4rmq
  • Filecoin Foundation - f1k6wwevxvp466ybil7y2scqlhtnrz5atjkkyvm4a
  • embedsky - Firefly (YuanHe Tech) - f1fg6jkxsr3twfnyhdlatmq36xca6sshptscds7xa
  • Tim Guo (柏礼) - Force Web3 Community - f1mxtcba3l5qnzkh2wjmnynnrrht35bya6ec53pnq
  • Anne - Genesis - f1mdk7s2vntzm6hu35yuo6vjubtrpfnb2awhgvrri
  • Holon Innovations - f1ystxl2ootvpirpa7ebgwl7vlhwkbx2r4zjxwe5i
  • Masaaki Nawatani - IPFS Japan Consortium - f1ciu27vn3r7htmanzvwrm6ksx6y774gqsz3cot3i
  • Nic Wong - IPFS Metaverse Community - f1j4n74chme7whbz3yls4a7ixqewb6dijypqg2a3a
  • Steven Li - IPFSForce - f1w2vyp4w6df44gbh4vxqle4w65zfrfnwhrl3hojy
  • Neo Ge - IPFSMain - f13k5zr6ovc2gjmg3lvd43ladbydhovpylcvbflpa
  • Fei Yan - Kernelogic - f1yjhnsoga2ccnepb7t3p3ov5fzom3syhsuinxexa
  • Matrix Storage - f1tbxqwjxfyv7swsdin4einirlsfquv3vnmlapley
  • Jazz Hsiao - MetaWave - f1ktlkcxnmzxcdaoqfsunrg3vocfbmgv4n3mrn74a
  • New Web HK Group Holdings Limited - f1e77zuityhvvw6u2t6tb5qlnsegy2s67qs4lbbbq
  • ORIGIN Storage - f1q6bpjlqia6iemqbrdaxr2uehrhpvoju3qh4lpga
  • James Hoang - PiKNiK - f1ypuqpi4xn5q7zi5at3rmdltosozifhqmrt66vhq
  • Pluskit - f1tgnlhtcmhwipfm7thsftxhn5k52velyjlazpvka
  • Wijnand Schouten - Speedium - f1krmypm4uoxxf3g7okrwtrahlmpcph3y7rbqqgfa
  • Barry - STCloud - f1jvvltduw35u6inn5tr4nfualyd42bh3vjtylgci
  • Xinan Xu - Tech Greedy - f1k3ysofkrrmqcot6fkx4wnezpczlltpirmrpsgui
  • Nicklas Reiersen - TechHedge (Reiers) - f1oz43ckvmtxmmsfzqm6bpnemqlavz4ifyl524chq
  • Bailey-li - Tianji Studio - f1pszcrsciyixyuxxukkvtazcokexbn54amf7gvoq
  • Tinfra LLC - f1jqk7xok5kautet2knhwlg74jvcfbrqlj47kbp2i
  • Julien NOEL - Twinquasar - f1wxhnytjmklj2czezaqcfl7eb4nkgmaxysnegwii
  • Jackie Mo - Union Labs (formerly IPFSUnion) - f17xdri3wunqgld7dm23e4f3eqsntjakwc47xjo6i
  • Waterdrop Lab - f122qmy25wdtt5mxd77kndiq7z5x2n3iwiuz2wdsa
  • West Labs Venture - f1b5wse72uiusm4n2waqx4vuvsnvadz4ltcfruksa
  • Joss Hua - Venus Team - IPFSForce - f1tfg54zzscugttejv336vivknmsnzzmyudp3t7wi

Technical Requirements

Some tooling changes for the verifier front-end would need to be spec'ed and implemented.

Some tooling changes across various community dashboards would be required to more accurately track signatures for allocations and subsequent client DataCap usage, decoupling the "large dataset notary multisig" as a key metric.

Risks

Changes the amount of DataCap visible on chain as available to notaries through large dataset notary multisigs at a given point in time. Requires technical investment to track notary signers and to build new subscription top-up tools.

Related Issues

#227
#217

@galen-mcandrew galen-mcandrew added the Proposal For Fil+ change proposals label Apr 16, 2022
@galen-mcandrew galen-mcandrew self-assigned this Apr 16, 2022
@AlexxNica

How much time do we have to make this decision?

@galen-mcandrew

@AlexxNica We can still move forward with ratifying the new notaries, which we are doing. We will still need to verify each notary's claimed address, and some notaries are still waiting to receive their hardware Ledger device (which our current tooling requires).

We can also still process new Large Dataset applications, which we are doing. For now, though, new LDNs will use the previous set of ratified notary addresses (from round 2).

So, short answer, I would say we have a few weeks to make a final decision.

@MasaakiNawatani

This is a major improvement to the LDN process. I support this change 100%. Thanks!


MegTei commented Apr 19, 2022

Support this proposal and the automation. Please just call out any changes to the notary, client, or SP process.

@galen-mcandrew

The governance team will need to manually construct the message to add a new signer: f1tfg54zzscugttejv336vivknmsnzzmyudp3t7wi

Joss Hua


dkkapur commented Sep 21, 2022

LDN v3 has been up and running for a while now. A few notaries from round 3 may still need to be added; let's address those separately and close this issue out for now.
