Modification: Large Dataset Multisig Process - v3.0 Single multisig entity to many approved clients #509
Comments
How much time do we have to make this decision?
@AlexxNica We can still move forward with ratifying the new notaries, which we are doing. We will still need to verify each notary's claimed address, and some notaries are still waiting to receive their hardware ledger (which we need for our current tooling). We can also still process new Large Dataset applications, which we are doing. Currently, though, new LDNs will follow the previous set of ratified notary addresses (from round 2). So, short answer, I would say we have a few weeks to make a final decision.
This is a major improvement to the LDN process. I support this change 100%. Thanks!
Support this proposal and automation. Please just call out any changes to the notary, client, or SP process.
Governance team will need to manually construct a message to add the new signer: f1tfg54zzscugttejv336vivknmsnzzmyudp3t7wi
LDN v3 has been up and running for a while now. There are a few notaries from round 3 that may still need to be added. Let's address those separately and close this issue out for now.
Issue Description
Since September 2021, we have created ~100 new ‘notary’ entities as multisigs, with all existing notaries listed as signers and a threshold of 2. This has added ~300PiB of DataCap to the top of the funnel. As of March, this has resulted in ~20PiB of verified deals. This means that in 6 months of running version 2.0 of the LDN structure, we have seen a 2,000% increase in verified deals on the network.
However, with this notary election cycle closing, we need to make a decision about how Large Dataset Notary multisigs will be structured.
Impact
As we close our third round of notary elections, we will approximately double the number of notary addresses (from 23 to ~45, still finalizing counts). Adding these new notaries to the existing LDN multisig entities would require a massive technical lift: 2 messages per new notary address per existing LDN = ~4,000 messages, all of which would need to come from existing signers/notaries rather than from root key holders or governance team members (since the latter are not signers on the existing LDN multisigs). A back-of-the-envelope check of that figure is sketched below.
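As a rough sanity check on that estimate (a sketch only; the exact notary and LDN counts are still being finalized):

```python
# Back-of-the-envelope count of signer-addition messages if every new
# round-3 notary were added to every existing LDN multisig.
# Counts are approximations taken from this issue, not final figures.
new_notaries = 45 - 23            # ~22 newly ratified notary addresses
existing_ldn_multisigs = 100      # ~100 LDN multisigs created since September 2021
msgs_per_addition = 2             # one propose + one approve per signer added

total_messages = new_notaries * existing_ldn_multisigs * msgs_per_addition
print(total_messages)             # ~4,400, i.e. the "~4,000 messages" above
```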
There are two other potential alternatives to adding new notaries to existing multisigs: 1) only add new notaries as signers for newly approved large dataset clients, or 2) deprecate all existing LDN multisigs by removing their remaining DataCap and creating ~100 new multisigs with all new notary addresses.
Ratifying these new notaries (and doubling the number of notaries available to participate in the Large Dataset process) while keeping the same overall process (one LDN multisig with all signers -> one approved client) would therefore incur massive technical debt, slowdowns, and confusion.
Proposed Solution
Create a single LDN multisig with all 3rd-round notary addresses as signers, along with members of the governance team.
Start this LDN with 2 weeks' worth of DataCap (based on the current allocation rate to clients, this would be ~25PiB). Utilize a bot to request top-ups from RKH, targeting a top-up every other week. This amount and top-up rate could scale based on usage and RKH responsiveness; a minimal sketch of such a bot's core check is shown below.
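A minimal sketch of the top-up check, assuming a hypothetical chain query helper (`get_remaining_datacap`) and a hypothetical helper that files the request for RKH signers (`open_topup_request`); the thresholds mirror the numbers above and are illustrative only:

```python
# Sketch of a top-up bot check; both helpers are hypothetical stubs standing in
# for a chain state query and a request filed for the root key holders.
PIB = 2 ** 50  # bytes in a pebibyte

WEEKLY_ALLOCATION_RATE = 12.5 * PIB     # assumed from ~25PiB allocated per two weeks
TOPUP_AMOUNT = 25 * PIB                 # two weeks of runway per top-up
LOW_WATERMARK = WEEKLY_ALLOCATION_RATE  # ask early to absorb RKH response time

def get_remaining_datacap(ldn_address: str) -> float:
    """Hypothetical: query the LDN multisig's remaining DataCap on chain."""
    raise NotImplementedError

def open_topup_request(ldn_address: str, amount: float) -> None:
    """Hypothetical: file a DataCap top-up request for RKH signers to act on."""
    raise NotImplementedError

def maybe_request_topup(ldn_address: str) -> None:
    remaining = get_remaining_datacap(ldn_address)
    if remaining < LOW_WATERMARK:
        open_topup_request(ldn_address, amount=TOPUP_AMOUNT)
```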
By including the governance team in the management of the multisig, we greatly decrease the lift required to manage addresses (adding or removing notaries). Short term, we will clearly state that the governance team will not sign allocation requests. Long term, we could implement programmatic blocks on this through things like the FVM or even a “multisig of multisigs” structure.
We potentially lose the inflated top-of-funnel metric: tracking on chain the amount of DataCap that has been allocated to ‘notaries’. For example, at this time there is 338PiB of DataCap awarded on chain to large dataset notaries. This metric would revert to DataCap allocated for direct notary behavior, and we would need a new way to track large dataset funnel requests. For example, we could track the cumulative volume of DataCap requested by clients in approved applications (this is not an on-chain value; it would need to be summed from GitHub issues, as sketched below).

This change also alleviates the current artificial ceiling of 5PiB per client application. That ceiling was put in place to help bound the changes from LDN v2.0 and to standardize the total each new client could request, minimizing runaway threats. By restructuring to a single LDN address that serves many clients, we could allow each applicant to request the total expected volume for their dataset, including replicas. Overall, this would lead to an increase in requested verified data volume.
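One possible way to compute that funnel metric, sketched here with Python and the GitHub REST API; the repository name, label, and the wording of the requested-amount field are assumptions and would need to match the actual application template:

```python
import re
import requests

# Assumed repo, label, and field wording; adjust to the real application template.
REPO = "filecoin-project/filecoin-plus-large-datasets"
LABEL = "state:Granted"
FIELD = re.compile(r"Total amount of DataCap being requested.*?([\d.]+)\s*(PiB|TiB)",
                   re.IGNORECASE | re.DOTALL)
TO_PIB = {"pib": 1.0, "tib": 1.0 / 1024}

def total_requested_pib() -> float:
    """Sum the DataCap requested across labeled application issues, in PiB."""
    total, page = 0.0, 1
    while True:
        resp = requests.get(
            f"https://api.github.com/repos/{REPO}/issues",
            params={"labels": LABEL, "state": "all", "per_page": 100, "page": page},
        )
        resp.raise_for_status()
        issues = resp.json()
        if not issues:
            return total
        for issue in issues:
            match = FIELD.search(issue.get("body") or "")
            if match:
                total += float(match.group(1)) * TO_PIB[match.group(2).lower()]
        page += 1

print(f"{total_requested_pib():.1f} PiB requested across approved applications")
```

An authenticated token would likely be needed in practice to avoid GitHub API rate limits.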
Summary: transition from “1 notary multisig to 1 client” to a 1:many model.
Multisig Addresses at this time:
Admin:
Notaries:
Technical Requirements
Some tooling changes for the verifier front-end would need to be spec'ed and implemented.
Some tooling changes across various community dashboards would be required to more accurately track signatures for allocations and subsequent client DataCap usage, decoupling the "large dataset notary multisig" as a key metric.
Risks
This changes the amount of DataCap available on chain to notaries through the large dataset notary multisigs at any given point in time. It also requires technical investment to track notary signers and to build new subscription/top-up tooling.
Related Issues
#227
#217