
Port some commits from kl/sync-layer-reorg #697

Merged · 28 commits · Aug 14, 2024
Commits (28):
b5ccc20
added batch root sending to executor fn (#611)
koloz193 Jul 12, 2024
d4124fe
add gas test ci for l1 contracts (#626)
koloz193 Jul 22, 2024
7ee53bc
Merge branch 'sync-layer-stable' of ssh://github.com/matter-labs/era-…
kelemeno Jul 31, 2024
8536918
Increase Test Coverage (#632)
koloz193 Aug 8, 2024
e6cd619
fix: gateway audit all fixes (#684)
kelemeno Aug 9, 2024
19f08a4
Custom Asset Bridging after OZ fixes were applied (#679) (#682)
Raid5594 Aug 9, 2024
0144965
fix: more docs, don't accept new batches when chain was migrated (#686)
mm-zk Aug 13, 2024
4fb4ab2
set of fixes, including extended gateway calldata da
StanislavBreadless Aug 13, 2024
bb50678
fix length
StanislavBreadless Aug 13, 2024
383491f
resolve a todo
StanislavBreadless Aug 13, 2024
2e018f6
additional fixes for stm deployment tracker
StanislavBreadless Aug 13, 2024
e9b3829
mini refactor + ensure that only chainadmin can migrate chain
StanislavBreadless Aug 13, 2024
e6ab078
some more fixes
StanislavBreadless Aug 13, 2024
ef388fb
change the value of relayed sender
StanislavBreadless Aug 13, 2024
393cc46
delete unneeded debug method and preserve a helpful one
StanislavBreadless Aug 13, 2024
c3a16bf
foundry unit tests pass
StanislavBreadless Aug 13, 2024
fb7b189
lint fix
StanislavBreadless Aug 13, 2024
e98126a
fix lint
StanislavBreadless Aug 13, 2024
fd1d20d
fix system contracts build
StanislavBreadless Aug 13, 2024
ee7c367
fix some unneeded updates
StanislavBreadless Aug 13, 2024
828d0f6
fix lint
StanislavBreadless Aug 13, 2024
8109652
fix check hashes
StanislavBreadless Aug 13, 2024
b4b7c03
fix spell
StanislavBreadless Aug 13, 2024
4c4411c
Merge pull request #692 from matter-labs/sb-set-of-fixes-1
StanislavBreadless Aug 13, 2024
2a2e6cf
amend some functions
StanislavBreadless Aug 14, 2024
9f6d439
make scripts work
StanislavBreadless Aug 14, 2024
576a59c
fmt
StanislavBreadless Aug 14, 2024
779c707
fix lint
StanislavBreadless Aug 14, 2024
53 changes: 53 additions & 0 deletions .github/workflows/l1-contracts-ci.yaml
@@ -215,3 +215,56 @@ jobs:
coverage-files: ./l1-contracts/lcov.info
working-directory: l1-contracts
minimum-coverage: 85 # Set coverage threshold.

gas-report:
needs: [build, lint]
runs-on: ubuntu-latest

steps:
- name: Checkout the repository
uses: actions/checkout@v4
with:
submodules: recursive

- name: Use Foundry
uses: foundry-rs/foundry-toolchain@v1

- name: Use Node.js
uses: actions/setup-node@v3
with:
node-version: 18.18.0
cache: yarn

- name: Install dependencies
run: yarn

- name: Restore artifacts cache
uses: actions/cache/restore@v3
with:
fail-on-cache-miss: true
key: artifacts-l1-${{ github.sha }}
path: |
l1-contracts/artifacts
l1-contracts/cache
l1-contracts/typechain

# Add any step generating a gas report to a temporary file named gasreport.ansi. For example:
- name: Run tests
run: yarn l1 test:foundry --gas-report | tee gasreport.ansi # <- this file name should be unique in your repository!

- name: Compare gas reports
uses: Rubilmax/foundry-gas-diff@v3.18
with:
summaryQuantile: 0.0 # display all gas diffs in the summary (by default only the top 20% are shown)
sortCriteria: avg,max # sort diff rows by criteria
sortOrders: desc,asc # and directions
ignore: test-foundry/**/*,l1-contracts/contracts/dev-contracts/**/*,l1-contracts/lib/**/*,l1-contracts/contracts/common/Dependencies.sol
id: gas_diff

- name: Add gas diff to sticky comment
if: github.event_name == 'pull_request' || github.event_name == 'pull_request_target'
uses: marocchino/sticky-pull-request-comment@v2
with:
# delete the comment in case changes no longer impact gas costs
delete: ${{ !steps.gas_diff.outputs.markdown }}
message: ${{ steps.gas_diff.outputs.markdown }}
62 changes: 41 additions & 21 deletions da-contracts/contracts/CalldataDA.sol
@@ -4,14 +4,21 @@ pragma solidity 0.8.24;

// solhint-disable gas-custom-errors, reason-string

import {BLOB_SIZE_BYTES} from "./DAUtils.sol";
/// @dev Total number of bytes in a blob. Blob = 4096 field elements * 31 bytes per field element
/// @dev EIP-4844 defines it as 131_072 but we use 4096 * 31 within our circuits to always fit within a field element
/// @dev Our circuits will prove that a EIP-4844 blob and our internal blob are the same.
uint256 constant BLOB_SIZE_BYTES = 126_976;

uint256 constant BLOBS_SUPPORTED = 6;
/// @dev The state diff hash, hash of pubdata + the number of blobs.
uint256 constant BLOB_DATA_OFFSET = 65;

/// @dev The size of the commitment for a single blob.
uint256 constant BLOB_COMMITMENT_SIZE = 32;
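As a sanity check on the constants above (an illustrative sketch, not part of the contract), the blob size and header offset can be derived directly from the doc comments:

```python
# Hypothetical sketch checking the constants declared in CalldataDA.sol.
FIELD_ELEMENTS_PER_BLOB = 4096
BYTES_PER_FIELD_ELEMENT = 31  # one byte short of 32 so each chunk always fits in a field element
BLOB_SIZE_BYTES = FIELD_ELEMENTS_PER_BLOB * BYTES_PER_FIELD_ELEMENT
# state diff hash (32) + full pubdata hash (32) + 1-byte blob count = 65
BLOB_DATA_OFFSET = 32 + 32 + 1
print(BLOB_SIZE_BYTES, BLOB_DATA_OFFSET)
```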

/// @notice Contract that contains the functionality for processing the calldata DA.
/// @dev The expected L2 DA validator that should be used with it is `RollupL2DAValidator`.
abstract contract CalldataDA {
/// @notice Parses the input that the l2 Da validator has provided to the contract.
/// @notice Parses the input that the L2 DA validator has provided to the contract.
/// @param _l2DAValidatorOutputHash The hash of the output of the L2 DA validator.
/// @param _maxBlobsSupported The maximal number of blobs supported by the chain.
/// @param _operatorDAInput The DA input by the operator provided on L1.
@@ -30,14 +37,14 @@ abstract contract CalldataDA {
bytes calldata l1DaInput
)
{
// The preimage under the hash `l2DAValidatorOutputHash` is expected to be in the following format:
// The preimage under the hash `_l2DAValidatorOutputHash` is expected to be in the following format:
// - First 32 bytes are the hash of the uncompressed state diff.
// - Then, there is a 32-byte hash of the full pubdata.
// - Then, there is the 1-byte number of blobs published.
// - Then, there are linear hashes of the published blobs, 32 bytes each.

// Check that it accommodates enough pubdata for the state diff hash, hash of pubdata + the number of blobs.
require(_operatorDAInput.length >= 32 + 32 + 1, "too small");
require(_operatorDAInput.length >= BLOB_DATA_OFFSET, "too small");

stateDiffHash = bytes32(_operatorDAInput[:32]);
fullPubdataHash = bytes32(_operatorDAInput[32:64]);
@@ -49,43 +56,56 @@
// the `_maxBlobsSupported`
blobsLinearHashes = new bytes32[](_maxBlobsSupported);

require(_operatorDAInput.length >= 65 + 32 * blobsProvided, "invalid blobs hashes");
require(_operatorDAInput.length >= BLOB_DATA_OFFSET + 32 * blobsProvided, "invalid blobs hashes");

uint256 ptr = 65;
cloneCalldata(blobsLinearHashes, _operatorDAInput[BLOB_DATA_OFFSET:], blobsProvided);

for (uint256 i = 0; i < blobsProvided; ++i) {
// Take the 32 bytes of the blob linear hash
blobsLinearHashes[i] = bytes32(_operatorDAInput[ptr:ptr + 32]);
ptr += 32;
}
uint256 ptr = BLOB_DATA_OFFSET + 32 * blobsProvided;

// Now, we need to double check that the provided input was indeed retutned by the L2 DA validator.
// Now, we need to double check that the provided input was indeed returned by the L2 DA validator.
require(keccak256(_operatorDAInput[:ptr]) == _l2DAValidatorOutputHash, "invalid l2 DA output hash");

// The rest of the output were provided specifically by the operator
// The rest of the output was provided specifically by the operator
l1DaInput = _operatorDAInput[ptr:];
}
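The preimage layout described in the comments above can be sketched in Python (an illustrative model of the parsing logic, not the production parser; names are hypothetical):

```python
def parse_operator_da_input(data: bytes, max_blobs_supported: int):
    """Model of _processL2RollupDAValidatorOutputHash's expected preimage layout."""
    BLOB_DATA_OFFSET = 65  # 32 + 32 + 1
    assert len(data) >= BLOB_DATA_OFFSET, "too small"
    state_diff_hash = data[:32]          # hash of the uncompressed state diff
    full_pubdata_hash = data[32:64]      # hash of the full pubdata
    blobs_provided = data[64]            # 1-byte number of published blobs
    assert len(data) >= BLOB_DATA_OFFSET + 32 * blobs_provided, "invalid blobs hashes"
    # Linear hashes are padded out to the max supported, as in the contract.
    blobs_linear_hashes = [b"\x00" * 32] * max_blobs_supported
    ptr = BLOB_DATA_OFFSET
    for i in range(blobs_provided):
        blobs_linear_hashes[i] = data[ptr:ptr + 32]
        ptr += 32
    l1_da_input = data[ptr:]             # the rest is provided by the operator
    return state_diff_hash, full_pubdata_hash, blobs_linear_hashes, blobs_provided, l1_da_input
```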

/// @notice Verify that the calldata DA was correctly provided.
/// todo: better doc comments
/// @param _blobsProvided The number of blobs provided.
/// @param _fullPubdataHash Hash of the pubdata preimage.
/// @param _maxBlobsSupported Maximum number of blobs supported.
/// @param _pubdataInput Full pubdata + an additional 32 bytes containing the blob commitment for the pubdata.
/// @dev We supply the blob commitment as part of the pubdata because even with calldata the prover will check these values.
function _processCalldataDA(
uint256 _blobsProvided,
bytes32 _fullPubdataHash,
uint256 _maxBlobsSupported,
bytes calldata _pubdataInput
) internal pure returns (bytes32[] memory blobCommitments, bytes calldata _pubdata) {
) internal pure virtual returns (bytes32[] memory blobCommitments, bytes calldata _pubdata) {
require(_blobsProvided == 1, "one blob with calldata");
require(_pubdataInput.length >= BLOB_COMMITMENT_SIZE, "pubdata too small");

// We typically do not know whether we'll use calldata or blobs at the time when
// we start proving the batch. That's why the blob commitment for a single blob is still present in the case of calldata.

blobCommitments = new bytes32[](_maxBlobsSupported);

require(_blobsProvided == 1, "one one blob with calldata");

_pubdata = _pubdataInput[:_pubdataInput.length - 32];
_pubdata = _pubdataInput[:_pubdataInput.length - BLOB_COMMITMENT_SIZE];

// FIXME: allow larger lengths for SyncLayer-based chains.
require(_pubdata.length <= BLOB_SIZE_BYTES, "cz");
require(_fullPubdataHash == keccak256(_pubdata), "wp");
blobCommitments[0] = bytes32(_pubdataInput[_pubdataInput.length - 32:_pubdataInput.length]);
blobCommitments[0] = bytes32(_pubdataInput[_pubdataInput.length - BLOB_COMMITMENT_SIZE:_pubdataInput.length]);
}
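The calldata path above splits the trailing blob commitment off the pubdata and re-checks the hash; a minimal model (keccak256 replaced by sha3_256 purely so the sketch runs without third-party dependencies):

```python
import hashlib

BLOB_SIZE_BYTES = 126_976
BLOB_COMMITMENT_SIZE = 32

def process_calldata_da(full_pubdata_hash: bytes, pubdata_input: bytes):
    """Illustrative model of _processCalldataDA: the last 32 bytes are the
    blob commitment, everything before it is the pubdata itself."""
    assert len(pubdata_input) >= BLOB_COMMITMENT_SIZE, "pubdata too small"
    pubdata = pubdata_input[:-BLOB_COMMITMENT_SIZE]
    assert len(pubdata) <= BLOB_SIZE_BYTES, "cz"
    # NOTE: the contract uses keccak256; sha3_256 is a stand-in here.
    assert hashlib.sha3_256(pubdata).digest() == full_pubdata_hash, "wp"
    blob_commitment = pubdata_input[-BLOB_COMMITMENT_SIZE:]
    return pubdata, blob_commitment
```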

/// @notice Method that clones a slice of calldata into a bytes32[] memory array.
/// @param _dst The destination array.
/// @param _input The input calldata.
/// @param _len The length of the slice in 32-byte words to clone.
function cloneCalldata(bytes32[] memory _dst, bytes calldata _input, uint256 _len) internal pure {
assembly {
// The pointer to the allocated memory above. We skip 32 bytes to avoid overwriting the length.
let dstPtr := add(_dst, 0x20)
let inputPtr := _input.offset
calldatacopy(dstPtr, inputPtr, mul(_len, 32))
}
}
}
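The assembly in `cloneCalldata` is a bulk word copy; its effect can be modeled in a few lines of Python (an illustrative sketch, not the contract's memory semantics):

```python
def clone_calldata(dst: list, words: bytes, length: int):
    """Model of cloneCalldata: copy `length` 32-byte words from the input
    into the head of a pre-allocated array, leaving the tail untouched."""
    for i in range(length):
        dst[i] = words[32 * i: 32 * (i + 1)]
```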
19 changes: 10 additions & 9 deletions da-contracts/contracts/IL1DAValidator.sol
@@ -14,21 +14,22 @@ struct L1DAValidatorOutput {
bytes32[] blobsOpeningCommitments;
}

// TODO: require EIP165 support as this will allow changes for future compatibility.
interface IL1DAValidator {
/// @notice The function that checks the data availability for the given batch input.
/// @param chainId The chain id of the chain that is being committed.
/// @param l2DAValidatorOutputHash The hash of that was returned by the l2DAValidator.
/// @param operatorDAInput The DA input by the operator provided on L1.
/// @param maxBlobsSupported The maximal number of blobs supported by the chain.
/// @param _chainId The chain id of the chain that is being committed.
/// @param _batchNumber The batch number for which the data availability is being checked.
/// @param _l2DAValidatorOutputHash The hash of the output returned by the l2DAValidator.
/// @param _operatorDAInput The DA input by the operator provided on L1.
/// @param _maxBlobsSupported The maximal number of blobs supported by the chain.
/// We provide this value for future compatibility.
/// This is needed because the corresponding `blobsLinearHashes`/`blobsOpeningCommitments`
/// in the `L1DAValidatorOutput` struct will have to have this length as it is required
/// to be static by the circuits.
function checkDA(
uint256 chainId,
bytes32 l2DAValidatorOutputHash,
bytes calldata operatorDAInput,
uint256 maxBlobsSupported
uint256 _chainId,
uint256 _batchNumber,
bytes32 _l2DAValidatorOutputHash,
bytes calldata _operatorDAInput,
uint256 _maxBlobsSupported
) external returns (L1DAValidatorOutput memory output);
}
89 changes: 45 additions & 44 deletions da-contracts/contracts/RollupL1DAValidator.sol
@@ -14,7 +14,7 @@ uint256 constant BLOBS_SUPPORTED = 6;

contract RollupL1DAValidator is IL1DAValidator, CalldataDA {
/// @dev The published blob commitments. Note, that the correctness of blob commitment with relation to the linear hash
/// is *not* checked in this contract, but is expected to be checked at the veriifcation stage of the ZK contract.
/// is *not* checked in this contract, but is expected to be checked at the verification stage of the ZK contract.
mapping(bytes32 blobCommitment => bool isPublished) public publishedBlobCommitments;

/// @notice Publishes certain blobs, marking commitments to them as published.
@@ -37,6 +37,49 @@ contract RollupL1DAValidator is IL1DAValidator, CalldataDA {
}
}

/// @inheritdoc IL1DAValidator
function checkDA(
uint256, // _chainId
uint256, // _batchNumber
bytes32 _l2DAValidatorOutputHash,
bytes calldata _operatorDAInput,
uint256 _maxBlobsSupported
) external view returns (L1DAValidatorOutput memory output) {
(
bytes32 stateDiffHash,
bytes32 fullPubdataHash,
bytes32[] memory blobsLinearHashes,
uint256 blobsProvided,
bytes calldata l1DaInput
) = _processL2RollupDAValidatorOutputHash(_l2DAValidatorOutputHash, _maxBlobsSupported, _operatorDAInput);

uint8 pubdataSource = uint8(l1DaInput[0]);
bytes32[] memory blobCommitments;

if (pubdataSource == uint8(PubdataSource.Blob)) {
blobCommitments = _processBlobDA(blobsProvided, _maxBlobsSupported, l1DaInput[1:]);
} else if (pubdataSource == uint8(PubdataSource.Calldata)) {
(blobCommitments, ) = _processCalldataDA(blobsProvided, fullPubdataHash, _maxBlobsSupported, l1DaInput[1:]);
} else {
revert("l1-da-validator/invalid-pubdata-source");
}

// We verify that for each set of blobHash/blobCommitment are either both empty
// or there are values for both.
// This is mostly a sanity check and it is not strictly required.
for (uint256 i = 0; i < _maxBlobsSupported; ++i) {
require(
(blobsLinearHashes[i] == bytes32(0) && blobCommitments[i] == bytes32(0)) ||
(blobsLinearHashes[i] != bytes32(0) && blobCommitments[i] != bytes32(0)),
"bh"
);
}

output.stateDiffHash = stateDiffHash;
output.blobsLinearHashes = blobsLinearHashes;
output.blobsOpeningCommitments = blobCommitments;
}
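The "bh" sanity check in `checkDA` reduces to requiring that each slot's linear hash and commitment agree on being zero; a hypothetical model:

```python
ZERO = b"\x00" * 32

def check_blob_pairs(linear_hashes, commitments):
    """Illustrative model of the 'bh' check: for each slot, the hash and the
    commitment must be either both empty or both set."""
    for h, c in zip(linear_hashes, commitments):
        # (both zero) or (both non-zero) == equality of the zero-ness predicates
        assert (h == ZERO) == (c == ZERO), "bh"
```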

/// @notice Generates the blob commitment to be used in the cryptographic proof by calling the point evaluation precompile.
/// @param _index The index of the blob in this transaction.
/// @param _commitment The packed: opening point (16 bytes) || claimed value (32 bytes) || commitment (48 bytes) || proof (48 bytes)) = 144 bytes
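The packed 144-byte layout from the doc comment above, spelled out (field names are illustrative):

```python
# Layout of the packed commitment passed to the point evaluation precompile.
OPENING_POINT_SIZE = 16
CLAIMED_VALUE_SIZE = 32
COMMITMENT_SIZE = 48
PROOF_SIZE = 48
PUBDATA_COMMITMENT_SIZE = (
    OPENING_POINT_SIZE + CLAIMED_VALUE_SIZE + COMMITMENT_SIZE + PROOF_SIZE
)
print(PUBDATA_COMMITMENT_SIZE)  # 144
```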
@@ -81,7 +124,7 @@ contract RollupL1DAValidator is IL1DAValidator, CalldataDA {

uint256 versionedHashIndex = 0;

// we iterate over the `_operatorDAInput`, while advacning the pointer by `BLOB_DA_INPUT_SIZE` each time
// we iterate over the `_operatorDAInput`, while advancing the pointer by `BLOB_DA_INPUT_SIZE` each time
for (uint256 i = 0; i < _blobsProvided; ++i) {
bytes calldata commitmentData = _operatorDAInput[:PUBDATA_COMMITMENT_SIZE];
bytes32 prepublishedCommitment = bytes32(
@@ -109,48 +152,6 @@ contract RollupL1DAValidator is IL1DAValidator, CalldataDA {
require(versionedHash == bytes32(0), "lh");
}

/// @inheritdoc IL1DAValidator
function checkDA(
uint256, // _chainId
bytes32 _l2DAValidatorOutputHash,
bytes calldata _operatorDAInput,
uint256 _maxBlobsSupported
) external returns (L1DAValidatorOutput memory output) {
(
bytes32 stateDiffHash,
bytes32 fullPubdataHash,
bytes32[] memory blobsLinearHashes,
uint256 blobsProvided,
bytes calldata l1DaInput
) = _processL2RollupDAValidatorOutputHash(_l2DAValidatorOutputHash, _maxBlobsSupported, _operatorDAInput);

uint8 pubdataSource = uint8(l1DaInput[0]);
bytes32[] memory blobCommitments;

if (pubdataSource == uint8(PubdataSource.Blob)) {
blobCommitments = _processBlobDA(blobsProvided, _maxBlobsSupported, l1DaInput[1:]);
} else if (pubdataSource == uint8(PubdataSource.Calldata)) {
(blobCommitments, ) = _processCalldataDA(blobsProvided, fullPubdataHash, _maxBlobsSupported, l1DaInput[1:]);
} else {
revert("l1-da-validator/invalid-pubdata-source");
}

// We verify that for each set of blobHash/blobCommitment are either both empty
// or there are values for both.
// This is mostly a sanity check and it is not strictly required.
for (uint256 i = 0; i < _maxBlobsSupported; ++i) {
require(
(blobsLinearHashes[i] == bytes32(0) && blobCommitments[i] == bytes32(0)) ||
(blobsLinearHashes[i] != bytes32(0) && blobCommitments[i] != bytes32(0)),
"bh"
);
}

output.stateDiffHash = stateDiffHash;
output.blobsLinearHashes = blobsLinearHashes;
output.blobsOpeningCommitments = blobCommitments;
}

/// @notice Calls the point evaluation precompile and verifies the output
/// Verify p(z) = y given commitment that corresponds to the polynomial p(x) and a KZG proof.
/// Also verify that the provided commitment matches the provided versioned_hash.
5 changes: 3 additions & 2 deletions da-contracts/contracts/ValidiumL1DAValidator.sol
@@ -9,13 +9,14 @@ import {IL1DAValidator, L1DAValidatorOutput} from "./IL1DAValidator.sol";
contract ValidiumL1DAValidator is IL1DAValidator {
function checkDA(
uint256, // _chainId
uint256, // _batchNumber
bytes32, // _l2DAValidatorOutputHash
bytes calldata _operatorDAInput,
uint256 // maxBlobsSupported
) external override returns (L1DAValidatorOutput memory output) {
// For Validiums, we expect the operator to just provide the data for us.
// We don't need to do any checks with regard to the l2DAValidatorOutputHash.
require(_operatorDAInput.length == 32);
require(_operatorDAInput.length == 32, "ValL1DA wrong input length");

bytes32 stateDiffHash = abi.decode(_operatorDAInput, (bytes32));

@@ -24,6 +25,6 @@ contract ValidiumL1DAValidator is IL1DAValidator {
}

function supportsInterface(bytes4 interfaceId) external pure returns (bool) {
return interfaceId == type(IL1DAValidator).interfaceId;
return (interfaceId == this.supportsInterface.selector) || (interfaceId == type(IL1DAValidator).interfaceId);
}
}
2 changes: 1 addition & 1 deletion docs/gateway/contracts-review-gateway.md
@@ -26,7 +26,7 @@ Known issues, and features that still need to be implemented:
- Upgrade process, how do we upgrade to CAB bridge, to the new system contracts.
- We had the syncLayer internal name previously for the Gateway. This has not been replaced everywhere yet.
- permissions for some functions are not properly restricted yet, mostly they are missing a modifier.
- Bridgehub setAssetHandlerAddressInitial `address sender` might be an issue.
- Bridgehub setAssetHandlerAddress `address sender` might be an issue.
- MessageRoot should be renamed to MessageRootAggregator

![Untitled](./Hyperchain-scheme.png)
@@ -6,7 +6,7 @@ import {MSG_VALUE_SYSTEM_CONTRACT, MSG_VALUE_SIMULATOR_IS_SYSTEM_BIT} from "@mat
import {Utils} from "@matterlabs/zksync-contracts/l2/system-contracts/libraries/Utils.sol";

// Addresses used for the compiler to be replaced with the
// zkSync-specific opcodes during the compilation.
// ZKsync-specific opcodes during the compilation.
// IMPORTANT: these are just compile-time constants and are used
// only if used in-place by Yul optimizer.
address constant TO_L1_CALL_ADDRESS = address((1 << 16) - 1);
29 changes: 29 additions & 0 deletions l1-contracts/contracts/bridge/BridgeHelper.sol
@@ -0,0 +1,29 @@
// SPDX-License-Identifier: MIT

pragma solidity 0.8.24;

// solhint-disable gas-custom-errors

import {IERC20Metadata} from "@openzeppelin/contracts/token/ERC20/extensions/IERC20Metadata.sol";

/**
* @author Matter Labs
* @custom:security-contact security@matterlabs.dev
* @notice Helper library for working with L2 contracts on L1.
*/
library BridgeHelper {
/// @dev Retrieves (name, symbol, decimals) from the token contract and ABI-encodes them
function getERC20Getters(address _token, address _ethTokenAddress) internal view returns (bytes memory) {
if (_token == _ethTokenAddress) {
bytes memory name = abi.encode("Ether");
bytes memory symbol = abi.encode("ETH");
bytes memory decimals = abi.encode(uint8(18));
return abi.encode(name, symbol, decimals); // when depositing eth to a non-eth based chain it is an ERC20
}

(, bytes memory data1) = _token.staticcall(abi.encodeCall(IERC20Metadata.name, ()));
(, bytes memory data2) = _token.staticcall(abi.encodeCall(IERC20Metadata.symbol, ()));
(, bytes memory data3) = _token.staticcall(abi.encodeCall(IERC20Metadata.decimals, ()));
return abi.encode(data1, data2, data3);
}
}
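A point worth noting in `getERC20Getters`: the success flag of each staticcall is discarded, so a token missing a metadata getter yields empty bytes rather than reverting the deposit. A hypothetical Python model of that behavior (ABI encoding elided):

```python
def get_erc20_getters(token, eth_token_address):
    """Illustrative model of BridgeHelper.getERC20Getters."""
    def safe(call):
        # Mirrors `(, bytes memory data) = _token.staticcall(...)`:
        # failures are swallowed and produce empty bytes.
        try:
            return call()
        except Exception:
            return b""
    if token is eth_token_address:
        return ("Ether", "ETH", 18)  # ETH deposited to a non-ETH-based chain is an ERC20
    return (safe(token.name), safe(token.symbol), safe(token.decimals))
```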