feat: implement light client verification methods #1116

Open

wants to merge 34 commits into base: master

Changes from 1 commit
Commits (34)
a537493
feat: implement light client verification methods
austinabell Apr 24, 2023
39b4575
chore: docs
austinabell Apr 25, 2023
879a7ab
refactor: move light-client APIs to own crate, migrate sha impl
austinabell Apr 26, 2023
17f0cf3
chore: changeset
austinabell Apr 26, 2023
f47b4fa
chore: add docs to validation steps
austinabell Apr 26, 2023
f8612ff
refactor: switch function params to be an object rather than positional
austinabell Apr 26, 2023
b280d8e
chore: revert naj test changes
austinabell Apr 26, 2023
4e81fae
test: adds back light client block verification test
austinabell Apr 26, 2023
09f957f
chore: lint
austinabell Apr 26, 2023
8ad77ae
test: add back execution proof verification in NAJ test
austinabell Apr 26, 2023
153dfa7
test: add execution proof test vectors
austinabell Apr 26, 2023
59b4f14
chore: address comments before refactoring
austinabell Apr 27, 2023
68601e5
refactor: move Enum class to types
austinabell Apr 27, 2023
73644e2
refactor: move light client logic into separate files
austinabell Apr 27, 2023
ab94d18
refactor: move borsh utils to own file
austinabell Apr 27, 2023
e15fad7
Merge branch 'master' into light_client
austinabell Apr 27, 2023
206df23
chore: remove todo from updated types
austinabell Apr 27, 2023
de393fb
chore: update changeset
austinabell Apr 27, 2023
daca5d2
test: move execution verification providers check into accounts
austinabell Apr 27, 2023
f509b3f
Merge branch 'master' into light_client
austinabell Apr 29, 2023
0dde04f
Merge branch 'master' into light_client
austinabell May 4, 2023
29c9ea6
fix: bug with bp signature validation
austinabell May 5, 2023
fe45242
refactor: move block hash generation to after trivial validation
austinabell May 5, 2023
20bff2e
chore: lint fix
austinabell May 5, 2023
0559e23
fix: bn comparison bug
austinabell May 23, 2023
d867179
fix: execution test parameter format from changes
austinabell May 24, 2023
8731572
Merge branch 'master' into light_client
austinabell May 24, 2023
f766671
fix: changes based on review comments
austinabell Jun 2, 2023
c5d1378
Merge branch 'master' into light_client
austinabell Jun 2, 2023
a8e19d8
chore: empty commit to re-trigger flaky CI
austinabell Jun 2, 2023
e7c26ce
Merge branch 'master' into light_client
vikinatora Feb 26, 2024
d97a801
fix: pnpm-lock.yaml
vikinatora Feb 26, 2024
35c12da
chore: lock package version
vikinatora Feb 26, 2024
6aed942
Merge remote-tracking branch 'upstream/master' into light_client
vikinatora Mar 12, 2024
feat: implement light client verification methods
austinabell committed Apr 24, 2023

commit a537493cf94da9298e66129adf036014dfd65639
2 changes: 2 additions & 0 deletions packages/near-api-js/src/common-index.ts
@@ -3,6 +3,7 @@ import * as providers from './providers';
import * as utils from './utils';
import * as transactions from './transaction';
import * as validators from './validators';
import * as lightClient from './light-client';

import { Account } from './account';
import * as multisig from './account_multisig';
@@ -24,6 +25,7 @@ export {
utils,
transactions,
validators,
lightClient,

multisig,
Account,
511 changes: 511 additions & 0 deletions packages/near-api-js/src/light-client.ts
@@ -0,0 +1,511 @@
import bs58 from "bs58";
import crypto from "crypto";
import {
BlockHeaderInnerLiteView,
ExecutionOutcomeWithIdView,
ExecutionStatus,
ExecutionStatusBasic,
LightClientBlockLiteView,
LightClientProof,
MerklePath,
NextLightClientBlockResponse,
ValidatorStakeView,
} from "./providers/provider";
import { Assignable, Enum } from "./utils/enums";
import BN from "bn.js";
import { serialize } from "./utils/serialize";
import { PublicKey } from "./utils";

const ED_PREFIX = "ed25519:";

class BorshBlockHeaderInnerLite extends Assignable {
height: BN;
epoch_id: Uint8Array;
next_epoch_id: Uint8Array;
prev_state_root: Uint8Array;
outcome_root: Uint8Array;
timestamp: BN;
next_bp_hash: Uint8Array;
block_merkle_root: Uint8Array;
}

class BorshApprovalInner extends Enum {
endorsement?: Uint8Array;
skip?: BN;
}

class BorshValidatorStakeViewV1 extends Assignable {
account_id: string;
public_key: PublicKey;
stake: BN;
}

class BorshValidatorStakeView extends Enum {
v1?: BorshValidatorStakeViewV1;
}

class BorshValidatorStakeViewWrapper extends Assignable {
bps: BorshValidatorStakeView[];
}

class BorshEmpty extends Assignable {}

class BorshPartialExecutionStatus extends Enum {
unknown?: BorshEmpty;
failure?: BorshEmpty;
successValue?: Uint8Array;
successReceiptId?: Uint8Array;
}

class BorshPartialExecutionOutcome extends Assignable {
receiptIds: Uint8Array[];
gasBurnt: BN;
tokensBurnt: BN;
executorId: string;
status: BorshPartialExecutionStatus;
}

class BorshCryptoHash extends Assignable {
hash: Uint8Array;
}

class BorshCryptoHashes extends Assignable {
hashes: Uint8Array[];
}

type Class<T = any> = new (...args: any[]) => T;
const SCHEMA = new Map<Class, any>([
[
BorshBlockHeaderInnerLite,
{
kind: "struct",
fields: [
["height", "u64"],
["epoch_id", [32]],
["next_epoch_id", [32]],
["prev_state_root", [32]],
["outcome_root", [32]],
["timestamp", "u64"],
["next_bp_hash", [32]],
["block_merkle_root", [32]],
],
},
],
[
BorshApprovalInner,
{
kind: "enum",
field: "enum",
values: [
["endorsement", [32]],
["skip", "u64"],
],
},
],
[
BorshValidatorStakeViewV1,
{
kind: "struct",
fields: [
["account_id", "string"],
["public_key", PublicKey],
["stake", "u128"],
],
},
],
[
BorshValidatorStakeView,
{
kind: "enum",
field: "enum",
values: [["v1", BorshValidatorStakeViewV1]],
},
],
[
BorshValidatorStakeViewWrapper,
{
kind: "struct",
fields: [["bps", [BorshValidatorStakeView]]],
},
],
[
BorshEmpty,
{
kind: "struct",
fields: [],
},
],
[
BorshCryptoHash,
{
kind: "struct",
fields: [["hash", [32]]],
},
],
[
BorshCryptoHashes,
{
kind: "struct",
fields: [["hashes", [[32]]]],
},
],
[
BorshPartialExecutionStatus,
{
kind: "enum",
field: "enum",
values: [
["unknown", BorshEmpty],
["failure", BorshEmpty],
["successValue", ["u8"]],
["successReceiptId", [32]],
],
},
],
[
BorshPartialExecutionOutcome,
{
kind: "struct",
fields: [
["receiptIds", [[32]]],
["gasBurnt", "u64"],
["tokensBurnt", "u128"],
["executorId", "string"],
["status", BorshPartialExecutionStatus],
],
},
],
// Note: Copied from transactions schema
[
PublicKey,
{
kind: "struct",
fields: [
["keyType", "u8"],
["data", [32]],
],
},
],
]);

function hashBlockProducers(bps: ValidatorStakeView[]): Buffer {
const borshBps: BorshValidatorStakeView[] = bps.map((bp) => {
if (bp.validator_stake_struct_version) {
const version = parseInt(
bp.validator_stake_struct_version.slice(1)
);
if (version !== 1) {
throw new Error(
"Only version 1 of the validator stake struct is supported"
);
}
}
return new BorshValidatorStakeView({
v1: new BorshValidatorStakeViewV1({
account_id: bp.account_id,
public_key: PublicKey.fromString(bp.public_key),
stake: bp.stake,
}),
});
});
const serializedBps = serialize(
SCHEMA,
// NOTE: just wrapping because borsh-js requires this type to be in the schema for some reason
new BorshValidatorStakeViewWrapper({ bps: borshBps })
);
return crypto.createHash("sha256").update(serializedBps).digest();
}

function combineHash(h1: Uint8Array, h2: Uint8Array): Buffer {
const hash = crypto.createHash("sha256");
hash.update(h1);
hash.update(h2);
return hash.digest();
}

export function computeBlockHash(block: LightClientBlockLiteView): Buffer {
const header = block.inner_lite;
const borshHeader = new BorshBlockHeaderInnerLite({
height: new BN(header.height),
epoch_id: bs58.decode(header.epoch_id),
next_epoch_id: bs58.decode(header.next_epoch_id),
prev_state_root: bs58.decode(header.prev_state_root),
outcome_root: bs58.decode(header.outcome_root),
timestamp: new BN(header.timestamp_nanosec),
next_bp_hash: bs58.decode(header.next_bp_hash),
block_merkle_root: bs58.decode(header.block_merkle_root),
});
const msg = serialize(SCHEMA, borshHeader);
const innerRestHash = bs58.decode(block.inner_rest_hash);
const prevHash = bs58.decode(block.prev_block_hash);
const innerLiteHash = crypto.createHash("sha256").update(msg).digest();
const innerHash = combineHash(innerLiteHash, innerRestHash);
const finalHash = combineHash(innerHash, prevHash);

return finalHash;
}

export function validateLightClientBlock(
lastKnownBlock: LightClientBlockLiteView,
currentBlockProducers: ValidatorStakeView[],
newBlock: NextLightClientBlockResponse
) {
// Numbers for each step reference the spec:
// https://github.com/near/NEPs/blob/c7d72138117ed0ab86629a27d1f84e9cce80848f/specs/ChainSpec/LightClient.md
const newBlockHash = computeBlockHash(lastKnownBlock);
const nextBlockHashDecoded = combineHash(
bs58.decode(newBlock.next_block_inner_hash),
newBlockHash
);

// (1)
if (newBlock.inner_lite.height <= lastKnownBlock.inner_lite.height) {
throw new Error(
"New block must be at least the height of the last known block"
);
}

// (2)
if (
newBlock.inner_lite.epoch_id !== lastKnownBlock.inner_lite.epoch_id &&
newBlock.inner_lite.epoch_id !== lastKnownBlock.inner_lite.next_epoch_id
) {
throw new Error(
"New block must either be in the same epoch or the next epoch from the last known block"
);
}

const blockProducers: ValidatorStakeView[] = currentBlockProducers;
if (newBlock.approvals_after_next.length < blockProducers.length) {
throw new Error(
"Number of approvals for next epoch must be at least the number of current block producers"
);
}

// (4) and (5)
const totalStake = new BN(0);
const approvedStake = new BN(0);

for (let i = 0; i < blockProducers.length; i++) {
const approval = newBlock.approvals_after_next[i];
const stake = blockProducers[i].stake;

totalStake.iadd(new BN(stake));

if (approval === null) {
continue;
}

approvedStake.iadd(new BN(stake));

const publicKey = PublicKey.fromString(blockProducers[i].public_key);
const signature = bs58.decode(approval.slice(ED_PREFIX.length));

const approvalEndorsement = serialize(
SCHEMA,
new BorshApprovalInner({ endorsement: nextBlockHashDecoded })
);

const approvalHeight: BN = new BN(newBlock.inner_lite.height + 2);
const approvalHeightLe = approvalHeight.toArrayLike(Buffer, "le", 8);
const approvalMessage = new Uint8Array([
...approvalEndorsement,
...approvalHeightLe,
]);

if (!publicKey.verify(approvalMessage, signature)) {
throw new Error(
`Invalid approval signature from block producer ${blockProducers[i].account_id}`
);
}
}

// (5)
const threshold = totalStake.mul(new BN(2)).div(new BN(3));
if (approvedStake.lte(threshold)) {
throw new Error("Approved stake does not exceed the 2/3 threshold");
}

// (6)
if (
newBlock.inner_lite.epoch_id === lastKnownBlock.inner_lite.next_epoch_id
) {
// (3)
if (!newBlock.next_bps) {
throw new Error(
"New block must include next block producers if a new epoch starts"
);
}

const bpsHash = hashBlockProducers(newBlock.next_bps);

if (!bpsHash.equals(bs58.decode(newBlock.inner_lite.next_bp_hash))) {
throw new Error("Next block producers hash doesn't match");
}
}
}

function blockHeaderInnerLiteHash(data: BlockHeaderInnerLiteView): Buffer {
const hash = crypto.createHash("sha256");
hash.update(new BN(data.height).toArrayLike(Buffer, "le", 8));
hash.update(bs58.decode(data.epoch_id));
hash.update(bs58.decode(data.next_epoch_id));
hash.update(bs58.decode(data.prev_state_root));
hash.update(bs58.decode(data.outcome_root));
hash.update(
new BN(data.timestamp_nanosec || data.timestamp).toArrayLike(
Buffer,
"le",
8
)
);
hash.update(bs58.decode(data.next_bp_hash));
hash.update(bs58.decode(data.block_merkle_root));
return hash.digest();
}

function computeRoot(node: Buffer, proof: MerklePath): Buffer {
proof.forEach((step) => {
if (step.direction == "Left") {
node = combineHash(bs58.decode(step.hash), node);
} else {
node = combineHash(node, bs58.decode(step.hash));
}
});
return node;
}

function computeMerkleRoot(proof: LightClientProof): Buffer {
const innerLiteHash = blockHeaderInnerLiteHash(
proof.block_header_lite.inner_lite
);

const headerHash = combineHash(
combineHash(
innerLiteHash,
bs58.decode(proof.block_header_lite.inner_rest_hash)
),
bs58.decode(proof.block_header_lite.prev_block_hash)
);

return computeRoot(headerHash, proof.block_proof);
}

function computeOutcomeRoot(
outcomeWithId: ExecutionOutcomeWithIdView,
outcomeRootProof: MerklePath
) {
// Generate outcome proof hash through borsh encoding
const receiptIds = outcomeWithId.outcome.receipt_ids.map((id) =>
bs58.decode(id)
);

const borshStatus = (
status: ExecutionStatus | ExecutionStatusBasic
): BorshPartialExecutionStatus => {
if (status === ExecutionStatusBasic.Pending) {
throw new Error("Pending status is not supported");
} else if (status === ExecutionStatusBasic.Unknown) {
return new BorshPartialExecutionStatus({
unknown: new BorshEmpty({}),
});
} else if (
status === ExecutionStatusBasic.Failure ||
"Failure" in status
) {
return new BorshPartialExecutionStatus({
failure: new BorshEmpty({}),
});
} else if (
status.SuccessValue !== undefined &&
status.SuccessValue !== null
) {
return new BorshPartialExecutionStatus({
successValue: Buffer.from(status.SuccessValue, "base64"),
});
} else if (
status.SuccessReceiptId !== undefined &&
status.SuccessReceiptId !== null
) {
return new BorshPartialExecutionStatus({
successReceiptId: bs58.decode(status.SuccessReceiptId),
});
} else {
throw new Error(`Unexpected execution status ${JSON.stringify(status)}`);
}
};
const partialExecOutcome: BorshPartialExecutionOutcome =
new BorshPartialExecutionOutcome({
receiptIds: receiptIds,
gasBurnt: new BN(outcomeWithId.outcome.gas_burnt),
// TODO update with types once https://github.com/near/near-api-js/pull/1113 comes in
tokensBurnt: new BN((outcomeWithId.outcome as any).tokens_burnt),
executorId: (outcomeWithId.outcome as any).executor_id,
status: borshStatus(outcomeWithId.outcome.status),
});
const serializedPartialOutcome = serialize(SCHEMA, partialExecOutcome);
const partialOutcomeHash = crypto
.createHash("sha256")
.update(serializedPartialOutcome)
.digest();

const logsHashes: Uint8Array[] = outcomeWithId.outcome.logs.map((log) => {
return crypto.createHash("sha256").update(log).digest();
});
const outcomeHashes: Uint8Array[] = [
bs58.decode(outcomeWithId.id),
partialOutcomeHash,
...logsHashes,
];

const outcomeSerialized = serialize(
SCHEMA,
new BorshCryptoHashes({ hashes: outcomeHashes })
);
const outcomeHash = crypto
.createHash("sha256")
.update(outcomeSerialized)
.digest();

// Generate shard outcome root
// computeRoot(sha256(borsh(outcome)), outcome.proof)
const outcomeShardRoot = computeRoot(outcomeHash, outcomeWithId.proof);

// Generate block outcome root
// computeRoot(sha256(borsh(shardOutcomeRoot)), outcomeRootProof)
const shardRootBorsh = serialize(
SCHEMA,
new BorshCryptoHash({ hash: outcomeShardRoot })
);
const shardRootHash = crypto
.createHash("sha256")
.update(shardRootBorsh)
.digest();

return computeRoot(shardRootHash, outcomeRootProof);
}

export function validateExecutionProof(
proof: LightClientProof,
merkleRoot: Uint8Array
) {
// Execution outcome root verification
const blockOutcomeRoot = computeOutcomeRoot(
proof.outcome_proof,
proof.outcome_root_proof
);
const proofRoot = proof.block_header_lite.inner_lite.outcome_root;
if (!blockOutcomeRoot.equals(bs58.decode(proofRoot))) {
throw new Error(
`Block outcome root (${bs58.encode(
blockOutcomeRoot
)}) doesn't match proof (${proofRoot})`
);
}

// Block merkle root verification
const blockMerkleRoot = computeMerkleRoot(proof);
if (!blockMerkleRoot.equals(merkleRoot)) {
throw new Error(
`Block merkle root (${bs58.encode(
blockMerkleRoot
)}) doesn't match proof (${bs58.encode(merkleRoot)})`
);
}
}
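
For reviewers wanting to try the new exports end to end, here is a minimal usage sketch (not part of this diff) for tracking a light client head. It assumes the `lightClient` namespace is exported from `near-api-js` as added in `common-index.ts` above, and that `JsonRpcProvider` accepts a connection-info object; the function and variable names (`advanceHead`, `rpcUrl`, `head`, `next`) are illustrative only.

const { providers, lightClient } = require('near-api-js');
const bs58 = require('bs58');

// Advance a trusted light client head by one step, verifying it before accepting it.
async function advanceHead(rpcUrl) {
    const provider = new providers.JsonRpcProvider({ url: rpcUrl });

    // Bootstrap from a block hash the client already trusts (here: the latest finalized block).
    const trusted = await provider.block({ finality: 'final' });
    const head = await provider.nextLightClientBlock({ last_block_hash: trusted.header.hash });

    // Request the next light client block relative to the current head's hash.
    const next = await provider.nextLightClientBlock({
        last_block_hash: bs58.encode(lightClient.computeBlockHash(head)),
    });

    // Throws if the height/epoch checks, the 2/3 approved-stake threshold, or the
    // next block producers hash from the spec steps above do not hold.
    // Note: `head.next_bps` is only present when `head` was fetched from the previous epoch.
    lightClient.validateLightClientBlock(head, head.next_bps, next);
    return next;
}
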
57 changes: 47 additions & 10 deletions packages/near-api-js/test/providers.test.js
@@ -1,7 +1,8 @@
const nearApi = require('../src/index');
const testUtils = require('./test-utils');
const testUtils = require('./test-utils');
const BN = require('bn.js');
const base58 = require('bs58');
const { lightClient } = require('../src/index');

jest.setTimeout(30000);

@@ -61,7 +62,7 @@ test('json rpc fetch validators info', withProvider(async (provider) => {
expect(validators.current_validators.length).toBeGreaterThanOrEqual(1);
}));

test('txStatus with string hash and buffer hash', withProvider(async(provider) => {
test('txStatus with string hash and buffer hash', withProvider(async (provider) => {
const near = await testUtils.setUpTestConnection();
const sender = await testUtils.createAccount(near);
const receiver = await testUtils.createAccount(near);
@@ -73,7 +74,7 @@ test('txStatus with string hash and buffer hash', withProvider(async(provider) =
expect(responseWithUint8Array).toMatchObject(outcome);
}));

test('txStatusReciept with string hash and buffer hash', withProvider(async(provider) => {
test('txStatusReciept with string hash and buffer hash', withProvider(async (provider) => {
const near = await testUtils.setUpTestConnection();
const sender = await testUtils.createAccount(near);
const receiver = await testUtils.createAccount(near);
@@ -86,7 +87,7 @@ test('txStatusReciept with string hash and buffer hash', withProvider(async(prov
expect(responseWithUint8Array).toMatchObject(reciepts);
}));

test('json rpc query with block_id', withProvider(async(provider) => {
test('json rpc query with block_id', withProvider(async (provider) => {
const stat = await provider.status();
let block_id = stat.sync_info.latest_block_height - 1;

@@ -121,7 +122,7 @@ test('json rpc query view_state', withProvider(async (provider) => {

await contract.setValue({ args: { value: 'hello' } });

return testUtils.waitFor(async() => {
return testUtils.waitFor(async () => {
const response = await provider.query({
request_type: 'view_state',
finality: 'final',
@@ -162,7 +163,7 @@ test('json rpc query view_code', withProvider(async (provider) => {
const account = await testUtils.createAccount(near);
const contract = await testUtils.deployContract(account, testUtils.generateUniqueString('test'));

return testUtils.waitFor(async() => {
return testUtils.waitFor(async () => {
const response = await provider.query({
request_type: 'view_code',
finality: 'final',
@@ -185,7 +186,7 @@ test('json rpc query call_function', withProvider(async (provider) => {

await contract.setValue({ args: { value: 'hello' } });

return testUtils.waitFor(async() => {
return testUtils.waitFor(async () => {
const response = await provider.query({
request_type: 'call_function',
finality: 'final',
@@ -210,7 +211,7 @@ test('json rpc query call_function', withProvider(async (provider) => {
});
}));

test('final tx result', async() => {
test('final tx result', async () => {
const result = {
status: { SuccessValue: 'e30=' },
transaction: { id: '11111', outcome: { status: { SuccessReceiptId: '11112' }, logs: [], receipt_ids: ['11112'], gas_burnt: 1 } },
@@ -222,7 +223,7 @@ test('final tx result', async() => {
expect(nearApi.providers.getTransactionLastResult(result)).toEqual({});
});

test('final tx result with null', async() => {
test('final tx result with null', async () => {
const result = {
status: 'Failure',
transaction: { id: '11111', outcome: { status: { SuccessReceiptId: '11112' }, logs: [], receipt_ids: ['11112'], gas_burnt: 1 } },
@@ -234,7 +235,7 @@ test('final tx result with null', async() => {
expect(nearApi.providers.getTransactionLastResult(result)).toEqual(null);
});

test('json rpc light client proof', async() => {
test('json rpc light client proof', async () => {
const near = await testUtils.setUpTestConnection();
const workingAccount = await testUtils.createAccount(near);
const executionOutcome = await workingAccount.sendMoney(workingAccount.accountId, new BN(10000));
@@ -260,6 +261,7 @@ test('json rpc light client proof', async() => {

const block = await provider.block({ blockId: finalizedStatus.sync_info.latest_block_hash });
const lightClientHead = block.header.last_final_block;
const finalBlock = await provider.block({ blockId: lightClientHead });
let lightClientRequest = {
type: 'transaction',
light_client_head: lightClientHead,
@@ -276,6 +278,9 @@ test('json rpc light client proof', async() => {
expect(lightClientProof.outcome_root_proof).toEqual([]);
expect(lightClientProof.block_proof.length).toBeGreaterThan(0);

// Validate the proof against the finalized block
lightClient.validateExecutionProof(lightClientProof, base58.decode(finalBlock.header.block_merkle_root));

// pass nonexistent hash for light client head will fail
lightClientRequest = {
type: 'transaction',
@@ -296,6 +301,38 @@ test('json rpc light client proof', async() => {
await expect(provider.lightClientProof(lightClientRequest)).rejects.toThrow(/.+ block .+ is ahead of head block .+/);
});

test('json rpc get next light client block with validation', withProvider(async (provider) => {
const stat = await provider.status();

// Get block in at least the last epoch (epoch duration 43,200 blocks on mainnet and testnet)
const height = stat.sync_info.latest_block_height;
const protocolConfig = await provider.experimental_protocolConfig({ finality: 'final' });

// NOTE: This will underflow if the network has not yet produced at least two epochs. If a new
// network config is required, retrieve a block a few heights behind the head instead (1 + a
// buffer for indexing). On a fresh network, wait for enough blocks to be produced and indexed.
const firstBlockHeight = height - protocolConfig.epoch_length * 2;
const firstBlock = await provider.block({ blockId: firstBlockHeight });
const prevBlock = await provider.nextLightClientBlock({ last_block_hash: firstBlock.header.hash });
const nextBlock = await provider.nextLightClientBlock({ last_block_hash: base58.encode(lightClient.computeBlockHash(prevBlock)) });
expect('inner_lite' in nextBlock).toBeTruthy();
// Verify that requesting from previous epoch includes the set of new block producers.
expect('next_bps' in nextBlock).toBeTruthy();

// Greater-than-or-equal check because a block could have been produced during the test.
// A buffer of 10 is applied to the height because the light client head seems to lag the
// latest finalized block by a few seconds, likely due to slow or delayed indexing in the
// node's db. If this fails in the future, the buffer can be increased.
expect(nextBlock.inner_lite.height).toBeGreaterThanOrEqual(height - 10);
expect(nextBlock.inner_lite.height).toBeGreaterThan(prevBlock.inner_lite.height);
expect('prev_block_hash' in nextBlock).toBeTruthy();
expect('next_block_inner_hash' in nextBlock).toBeTruthy();
expect('inner_rest_hash' in nextBlock).toBeTruthy();
expect('approvals_after_next' in nextBlock).toBeTruthy();

lightClient.validateLightClientBlock(prevBlock, prevBlock.next_bps, nextBlock);
}));
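
To complement the test above, here is a hedged sketch (not part of this diff) of how an application might combine the existing `lightClientProof` RPC with the new `validateExecutionProof` once it has a verified head. The function name `verifyTxOutcome` and the parameters `rpcUrl`, `lightClientHead`, `txHash`, and `senderId` are illustrative; the call pattern mirrors the modified `json rpc light client proof` test earlier in this file.

const base58 = require('bs58');
const { providers, lightClient } = require('near-api-js');

// Verify a transaction's execution outcome against a block merkle root
// taken from a block header the light client already trusts.
async function verifyTxOutcome(rpcUrl, lightClientHead, txHash, senderId) {
    const provider = new providers.JsonRpcProvider({ url: rpcUrl });

    const proof = await provider.lightClientProof({
        type: 'transaction',
        light_client_head: lightClientHead,
        transaction_hash: txHash,
        sender_id: senderId,
    });

    // The merkle root must come from a header the light client has already validated.
    const headBlock = await provider.block({ blockId: lightClientHead });
    const merkleRoot = base58.decode(headBlock.header.block_merkle_root);

    // Throws if either the outcome root or the block merkle root does not match the proof.
    lightClient.validateExecutionProof(proof, merkleRoot);
}
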

test('json rpc fetch protocol config', withProvider(async (provider) => {
const status = await provider.status();
const blockHeight = status.sync_info.latest_block_height;