
feat: implement light client verification methods #1116

Open

wants to merge 34 commits into base: master

Conversation

austinabell (Contributor)

Pre-flight checklist

  • I have read the Contributing Guidelines on pull requests.
  • Commit messages follow the conventional commits spec
  • If this is a code change: I have written unit tests.
  • If this changes code in a published package: I have run pnpm changeset to create a changeset JSON document appropriate for this change.
    • Going to wait for input on where this should live in the monorepo
  • If this is a new API or substantial change: the PR has an accompanying issue (closes #0000) and the maintainers have approved my working plan.

Motivation

Part of the implementation for https://near.social/#/devgovgigs.near/widget/gigs-board.pages.Post?id=148

This enables NAJ to verify light client blocks and execution proofs. The API is designed to be low-level and unopinionated. There can definitely be higher-level APIs built on this to automatically sync and verify transactions or just be more ergonomic, but anything I could think of came with tradeoffs. This might give some context as to why the API surfaces (specifically validateExecutionProof) are the way they are -- for flexibility and extensibility.
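For a rough feel of the intended usage, here is a sketch; the function name follows this PR's discussion, but the exact signature, module path, and RPC helper are illustrative assumptions, not the final API.

```ts
// Illustrative sketch only: names, signatures, and the RPC helper are assumptions.
import { validateLightClientBlock } from '@near-js/light-client';

// The caller keeps the last trusted light client header and the block producer
// set for its epoch, fetches the next candidate head over RPC, and verifies it
// before adopting it as the new trusted head.
async function advanceHead(
    provider: any,
    trustedHead: any,
    currentBlockProducers: any[],
): Promise<any> {
    const candidate = await provider.nextLightClientBlock({
        last_block_hash: trustedHead.hash, // assumed request shape
    });
    // Assumed to throw if the signature/stake checks fail.
    validateLightClientBlock(trustedHead, currentBlockProducers, candidate);
    return candidate;
}
```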

Open questions/comments:

  • It's very unclear where this should live in the monorepo. I just put it at the root level of the NAJ package for now, and hopefully a maintainer can indicate where they want something like this. I tried to follow patterns within the repo. Waiting to do the changeset until this is decided.
    • The block verification test had to be duplicated into NAJ to test this verification. If the API is kept as is, should the test just exist in NAJ, or is the duplication fine?
  • The test framework will be implemented here: https://github.com/austinabell/near-light-client-tests. Some scripts in that repo already test the logic of these functions more in depth, and test vectors will be added in the future.
  • Block verification fails for certain blocks below roughly the 55 million height range, due to a change in the protocol at that time. It's unclear whether backwards-compatibility support is possible or intended, and what that would look like, as it's not documented. Context: https://near.zulipchat.com/#narrow/stream/295558-pagoda.2Fcore/topic/Light.20client.20block.20queries.20failing/near/350605623
  • borsh-js is painful to work with, so I kept all types private and did not expose anything in the public API. I was not 100% sure what to do for the borsh schema, as it couldn't be combined with the one from the transactions package, so the PublicKey schema type is duplicated in the changes. There are also a bunch of types added that might seem unnecessary, but that's only because the borsh-js API is very limiting.
  • Some types are cast as any until "fix: add missing fields to execution outcome and test" #1113 comes in.
  • validateLightClientBlock taking the last known block lite view and the block producers as separate params is intentional. These might be kept separately by callers, and the RPC marks the returned block producer set as optional (although the current nearcore RPC never returns null).
  • validateExecutionProof taking the merkleRoot as bytes is intentional; see the sketch after this list. One might not be using the JS light client to pull the header->merkleRoot, or might have this root encoded in a different way. Bytes are the most unopinionated option here, instead of having to know that it is a base58-encoded string from the RPC response. This also keeps the API more stable if the data is deserialized properly in the future.
  • computeBlockHash could be a utility moved outside of this module, but it might be confusing if it exists in another namespace. Possibly re-export it from utilities/provider?
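Sketch referenced from the validateExecutionProof bullet above; bs58 is one possible decoder, and the exact signature is an illustrative assumption.

```ts
// Illustrative sketch only: bs58 and the validateExecutionProof signature are
// assumptions based on the bullet above, not the final API.
import bs58 from 'bs58';
import { validateExecutionProof } from '@near-js/light-client';

function verifyOutcomeProof(proof: any, blockMerkleRootBase58: string): void {
    // The RPC serves block_merkle_root as a base58-encoded string; the API takes
    // raw bytes so callers can source and decode the root however they like.
    const merkleRoot: Uint8Array = bs58.decode(blockMerkleRootBase58);
    validateExecutionProof(proof, merkleRoot); // assumed to throw if the proof is invalid
}
```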

Test Plan

Added tests here, building out a test framework for this now.

Related issues/PRs

#1113 is the only open one relating to this. All others have been pulled in.

@changeset-bot (bot) commented Apr 25, 2023

🦋 Changeset detected

Latest commit: 6aed942

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 15 packages
Name Type
@near-js/light-client Patch
near-api-js Patch
@near-js/types Patch
@near-js/accounts Patch
@near-js/cookbook Patch
@near-js/crypto Patch
@near-js/keystores Patch
@near-js/providers Patch
@near-js/transactions Patch
@near-js/utils Patch
@near-js/wallet-account Patch
@near-js/biometric-ed25519 Patch
@near-js/keystores-browser Patch
@near-js/keystores-node Patch
@near-js/signers Patch


@andy-haynes (Collaborator)

> Very unclear where this should live in the monorepo.
>
> The API is designed to be low-level and unopinionated. There can definitely be higher-level APIs built on this

Sounds like a new package makes sense here, what do you think?

> couldn't be combined with the one from the transactions package, so the PublicKey schema type is duplicated in the changes

Why not add the new types to the existing serialization map? I'll concede it's not a great fit in transactions but the goal is to eventually get serialization into a lower-level package.

> Test framework will be implemented here https://github.com/austinabell/near-light-client-tests

Can the tests not live in this repo?

> computeBlockHash could be a utility that is moved outside of this module, but might be confusing if it exists in another namespace. Possibly re-export it from utilities/provider?

Seems like it could just live in utils unless you see no compelling reason for it to be used outside the context of light clients. I would strongly prefer not to duplicate new functionality across multiple packages.

@andy-haynes (Collaborator) left a comment


Just a few comments inline around structure

Inline review comments (outdated, resolved) on packages/near-api-js/src/light-client.ts and packages/near-api-js/test/providers.test.js.
@austinabell (Contributor Author) commented Apr 26, 2023

> Sounds like a new package makes sense here, what do you think?

I don't have an opinion here, but I can move it into its own package! I just didn't want to add any package bloat if it was against the goals of the refactoring that's going on :)

> Why not add the new types to the existing serialization map? I'll concede it's not a great fit in transactions but the goal is to eventually get serialization into a lower-level package.

Because there is only one overlapping type, it wasn't obvious that sharing the larger map would be a win. That kind of refactor sounds like something best split from this PR. Do you mind if it's handled separately and I open an issue for it?

> Can the tests not live in this repo?

They could, but the scripts are meant to be more non-deterministic, sampling the chain to give confidence in the impl, and the test vectors and framework I'm just starting might add bloat to the repo. After I'm done with it, we can revisit and see if we want to move it? It also shouldn't take too long, so this could wait for that if it's a blocker. Edit: the other main reason is that this framework is intended to be used across different language impls, not tied to this specific one.

> Seems like it could just live in utils unless you see no compelling reason for it to be used outside the context of light clients. I would strongly prefer not to duplicate new functionality across multiple packages.

I don't think there is one, because the param is just the response type for the light client header and data. I don't think it's even feasible to manually construct that. I could be missing a use case here, though.

@austinabell (Contributor Author)

I made some of the changes. I'll make the following outstanding changes tomorrow (listing mainly for my own memory):

  • Docs on verification steps
  • Creating new types to replace positional arguments
  • Move the test changes to packages/accounts/test/providers.test.js (for this one, the test case would have to be moved to the new package now, though; correct me if wrong)

@austinabell (Contributor Author) commented Apr 26, 2023

So the light client verification test was moved to the new light-client package, but the execution verification can't happen in the accounts package because the light client utilities aren't exposed from it. Do you have an opinion on what I should do here? The test itself uses components from the accounts package and the light-client utils to verify that the data returned from RPC is a valid response.

Alternatively, I could decouple this and just write a new test that checks a few pre-defined execution proofs, so it depends on what types of tests are preferred here. The current test is conflicting with #1120, so if the intention is to remove all NAJ tests forever, then this might be the only way to go.

@andy-haynes (Collaborator)

> So the light client verification test was moved to the new light-client package, but the execution verification can't happen in the accounts package because the light client utilities aren't exposed from it. Do you have an opinion on what I should do here?

The way I've been handling this is to add @near-js/light-client as a dev dependency to @near-js/accounts, making the verification method accessible in accounts tests without light-client becoming a build dependency.
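For example, the relevant fragment of packages/accounts/package.json would look roughly like this (the workspace version specifier is illustrative, assuming the usual pnpm workspace setup, not the actual entry):

```json
{
  "devDependencies": {
    "@near-js/light-client": "workspace:*"
  }
}
```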

@andy-haynes (Collaborator)

> I don't have an opinion here, but I can move it into its own package! I just didn't want to add any package bloat if it was against the goals of the refactoring that's going on :)

Initially I was thinking it made sense in providers, but if it's intended as a low-level API to be built upon, I'm not convinced it makes sense to couple these packages (or maybe providers would reference this as a new package 🤔). On the other hand, it may be bloat in practice if these are intended only as methods for working with RPC responses. I imagine there are others who would want to weigh in before we commit to it, though; I'm not entirely sure who will own this process in the future.

> Because there is only one overlapping type, it wasn't obvious that sharing the larger map would be a win. That kind of refactor sounds like something best split from this PR. Do you mind if it's handled separately and I open an issue for it?

I see, I missed that @near-js/transactions wasn't already a dependency here. I'm OK with the overlap in the interim, this is good motivation to get serialization decoupled from @near-js/transactions.

> After I'm done with it, we can revisit and see if we want to move it?

Definitely! I was more curious whether you had anything in particular that couldn't be accommodated here.

Inline review comments (outdated, resolved) on packages/light-client/src/index.ts, packages/light-client/CHANGELOG.md, and packages/light-client/package.json.
@austinabell (Contributor Author)

> Definitely! I was more curious whether you had anything in particular that couldn't be accommodated here.

Nope, I was assuming you wouldn't want to include a submodule in this repo with all of the test vectors (it will probably grow to a decent size). The runner logic is pretty light (https://github.com/austinabell/near-light-client-tests/blob/main/scripts/runBlockTestVectors.ts) and could be moved here pretty easily, and just be executed outside of the current test command. I just wanted to make sure my changes were as minimally invasive as possible.

austinabell requested a review from andy-haynes on April 27, 2023
@austinabell (Contributor Author)

Update on this: I added a runner in rainbow-bridge to validate the logic against the shared test suite here Near-One/rainbow-bridge#897. Hope that gives confidence in the implementation.

I can move the runner logic from https://github.com/austinabell/near-light-client-tests/blob/main/scripts/runExecutionTestVectors.ts, https://github.com/austinabell/near-light-client-tests/blob/main/scripts/runBlockTestVectors.ts into a test in this repo, similar to that PR, if preferred for more test coverage on this logic. If so, let me know how you'd like that set up :)

robin-near previously approved these changes on Jun 1, 2023
}

const blockProducers: ValidatorStakeView[] = currentBlockProducers;
if (newBlock.approvals_after_next.length < blockProducers.length) {


The Python implementation checks for strict equality. The spec uses a zip of the two arrays. Should we also assert that the lengths are equal?

austinabell (Contributor Author)

That implementation is wrong. The spec permits the lengths to differ, and in practice there are blocks where they do, because in the case of validator rotations both the old and new block producer sets have to be accounted for in the approvals.

I can find and give a block where this is the case if needed, but I don't have an example off-hand.


Gotcha, I trust your judgment here. Thanks!

austinabell (Contributor Author)

References, for posterity, to remove the trust aspect for others 😄: https://github.com/near/NEPs/blob/master/specs/ChainSpec/LightClient.md#signature-verification. Also, the corresponding line in the rainbow bridge implementation, which is used in production: https://github.com/aurora-is-near/rainbow-bridge/blob/33ca808b45cb5e9cf2e27f741b0f6e42d97c276b/contracts/eth/nearbridge/contracts/NearBridge.sol#L212 (the Python client is just used for tests).

Also, anyone can switch this to strict equality and see that it fails on recent blocks.
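For reference, a sketch paraphrasing the spec's signature-verification step (illustrative TypeScript, not this PR's actual code): the approvals are zipped against the block producers, and more than 2/3 of the total stake must have approved.

```ts
// Illustrative paraphrase of the NEP light client spec's stake check.
// approvals may be longer than blockProducers; iterating over blockProducers
// zips the two lists, and only the "fewer approvals than producers" case is
// rejected outright.
function enoughApprovedStake(
    approvals: (Uint8Array | null)[],
    blockProducers: { stake: bigint }[],
): boolean {
    if (approvals.length < blockProducers.length) {
        return false;
    }
    let totalStake = 0n;
    let approvedStake = 0n;
    blockProducers.forEach((bp, i) => {
        totalStake += bp.stake;
        if (approvals[i] !== null) {
            approvedStake += bp.stake; // signature verification of approvals[i] elided
        }
    });
    // Spec threshold: approved stake must exceed 2/3 of the total stake.
    const threshold = (totalStake * 2n) / 3n; // integer division, as in the spec pseudocode
    return approvedStake > threshold;
}
```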

const borshBps: BorshValidatorStakeView[] = bps.map((bp) => {
if (bp.validator_stake_struct_version) {
const version = parseInt(
bp.validator_stake_struct_version.slice(1)


How about the stricter (and safer) check: bp.validator_stake_struct_version === "V1"?

],
},
],
]);


nit: consistency on newline at end of file

@@ -0,0 +1,44 @@
module.exports = function getConfig(env) {


Is there a specific reason why we're writing tests in js and not ts?

austinabell (Contributor Author)

No idea; I was following the format of the existing tests when I wrote this.

throw new Error('Enum can only take single value');
}
Object.keys(properties).map((key: string) => {
(this as any)[key] = properties[key];


nit: assert that key != "enum"?
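For illustration, a sketch of what that guard could look like in this kind of Enum base class (not the actual code in this PR):

```ts
// Illustrative sketch of the suggested guard, not the PR's actual code.
export class Enum {
    enum: string;

    constructor(properties: Record<string, unknown>) {
        if (Object.keys(properties).length !== 1) {
            throw new Error('Enum can only take single value');
        }
        const [key] = Object.keys(properties);
        if (key === 'enum') {
            // Suggested guard: "enum" is the discriminant field itself, so a
            // variant with that name would silently clobber it.
            throw new Error('"enum" is a reserved variant name');
        }
        this.enum = key;
        (this as any)[key] = properties[key];
    }
}
```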


export class BorshEmpty extends Assignable {}

export class BorshPartialExecutionStatus extends Enum {


This is not a comment for this PR, but a general one for the borsh-js design. I think the design is rather inflexible as it requires that serializables are classes, and that makes it very unnatural that we have to use Enum and Assignable.

I'd imagine something like this is more reasonable:

export type BorshPartialExecutionStatus = (
  {unknown: {}} | {failure: {}} | {successValue: Uint8Array} | {successReceiptId: Uint8Array});

const SCHEMA_BorshEmpty: Schema<{}> = {
    kind: 'struct',
    fields: [],
};

const SCHEMA_BorshPartialExecutionStatus: Schema<BorshPartialExecutionStatus> = {
    kind: 'enum',
    field: 'enum',
    values: [
        ['unknown', SCHEMA_BorshEmpty],
        ['failure', SCHEMA_BorshEmpty],
        ['successValue', ['u8']],
        ['successReceiptId', [32]],
    ],
};

// to deserialize:
const executionStatus = deserialize(SCHEMA_BorshPartialExecutionStatus, buffer);
if ('successValue' in executionStatus) { useBuffer(executionStatus.successValue); }
// to serialize (no need to wrap):
const buffer = serialize(SCHEMA_BorshPartialExecutionStatus, {failure: {}});

But that requires changing borsh-js.

austinabell (Contributor Author)

Yeah, I agree. That's a large initiative and unrelated to my changes, though, so I won't be addressing it here.

@vikinatora (Collaborator)

Hey @austinabell, is this PR ready to be merged? As far as I understand, most comments have been addressed and it has gone through the eyes of several people. If it's good, I can resolve the merge conflicts and get it merged.

@austinabell (Contributor Author)

> Hey @austinabell, is this PR ready to be merged? As far as I understand, most comments have been addressed and it has gone through the eyes of several people. If it's good, I can resolve the merge conflicts and get it merged.

Hi @vikinatora, it's out of my control when and how this PR gets merged, but feel free to resolve merge conflicts if you'd like!

@vikinatora (Collaborator)

@frol, I believe we can proceed with merging in this case. What do you think?

@frol (Collaborator) commented Jan 27, 2024

@vikinatora Let's get it merged and iterate from there 🙏

@vikinatora (Collaborator)

@austinabell, can I have write permission to your branch? I believe it's unnecessary to fork your fork and create a new PR just to resolve the conflicts; that would make it harder to track the conversations and context of this PR.

@austinabell (Contributor Author)

> @austinabell, can I have write permission to your branch? I believe it's unnecessary to fork your fork and create a new PR just to resolve the conflicts; that would make it harder to track the conversations and context of this PR.

Gave access! Let me know if you need anything else!

@vikinatora (Collaborator)

The CI node is down and tests aren't passing. We just need to wait for it to be back up, and then we can merge, I believe.
