Prefetch JUMPDESTs through RPC #427

Status: Open. This PR wants to merge 96 commits from einar/prefetch_transaction_jumps/pr into develop.

Commits (96)
7b01f5d
Implement JumpDest fetching from RPC.
einar-polygon Jul 15, 2024
4591482
feedback + cleanups
einar-polygon Sep 15, 2024
91c2945
cleanups
einar-polygon Sep 15, 2024
b58c5d6
fix overflow
einar-polygon Sep 15, 2024
037fb57
fmt
einar-polygon Sep 16, 2024
85ee8c2
fix testscripts
einar-polygon Sep 16, 2024
1243768
refactor
einar-polygon Sep 16, 2024
e7244c6
for testing
einar-polygon Sep 16, 2024
16e9c26
extract initcode
einar-polygon Sep 17, 2024
f3871d9
improve test script
einar-polygon Sep 17, 2024
4fd6b8b
fix stack issue
einar-polygon Sep 18, 2024
88eb73d
random fixes
einar-polygon Sep 18, 2024
39cd26c
fix CREATE2
einar-polygon Sep 18, 2024
8a964b8
fmt, clippy
einar-polygon Sep 18, 2024
32e68bf
investigate 15,35
einar-polygon Sep 19, 2024
71b003e
merge
einar-polygon Sep 19, 2024
76df518
Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra…
einar-polygon Sep 19, 2024
184878d
fix scripts
einar-polygon Sep 19, 2024
c000b5a
remove redtape for JUMP/I
einar-polygon Sep 19, 2024
ec81701
misc
einar-polygon Sep 19, 2024
bff471e
fix ci
einar-polygon Sep 20, 2024
ca9620d
minimize diff
einar-polygon Sep 20, 2024
4c97c0f
include whole function in timeout
einar-polygon Sep 20, 2024
8bce013
avoid ensure macro
einar-polygon Sep 20, 2024
b0ebc2c
fix CREATE
einar-polygon Sep 20, 2024
62c7053
small adjustments
einar-polygon Sep 23, 2024
74b86fd
fmt
einar-polygon Sep 23, 2024
c8be888
feedback
einar-polygon Sep 23, 2024
d00439f
feedback
einar-polygon Sep 24, 2024
6876c07
Add JumpdestSrc parameter
einar-polygon Sep 24, 2024
60efef9
Refactor
einar-polygon Sep 24, 2024
b07752d
Add jmp src to native
einar-polygon Sep 24, 2024
66ea811
Feedback
einar-polygon Sep 24, 2024
f230b84
fixup! Feedback
einar-polygon Sep 24, 2024
90722a3
feedback
einar-polygon Sep 25, 2024
a0e0879
fix missing code for CREATE
einar-polygon Sep 25, 2024
6bff4e4
fix
einar-polygon Sep 26, 2024
5e4162d
Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra…
einar-polygon Sep 26, 2024
c26f475
fix arguments
einar-polygon Sep 26, 2024
f367409
feedback
einar-polygon Sep 30, 2024
464dbc0
fix
einar-polygon Sep 30, 2024
2783ccd
debugging 460
einar-polygon Sep 30, 2024
abee812
debugging 460
einar-polygon Oct 1, 2024
8bfccdd
dbg
einar-polygon Oct 1, 2024
f9c2f76
bugfix
einar-polygon Oct 1, 2024
313d78d
dbg
einar-polygon Oct 1, 2024
09079e6
fix
einar-polygon Oct 1, 2024
7c84a63
batching working
einar-polygon Oct 1, 2024
4202ece
cleanups
einar-polygon Oct 1, 2024
e1124d3
feedback docs
einar-polygon Oct 2, 2024
1f8d476
feedback
einar-polygon Oct 2, 2024
7319a15
feedback filtermap
einar-polygon Oct 2, 2024
d4838e0
review
einar-polygon Oct 2, 2024
27b5719
fmt
einar-polygon Oct 3, 2024
eaf3ed7
fix set_jumpdest_analysis_inputs_rpc
einar-polygon Oct 3, 2024
10b6a22
discuss: deser in #427 (#681)
0xaatif Oct 3, 2024
c11d17d
feat: block structlog retrieval (#682)
atanmarko Oct 7, 2024
06b1913
better tracing
einar-polygon Oct 9, 2024
339f6af
bug fix
einar-polygon Oct 9, 2024
61a6b6a
json
einar-polygon Oct 9, 2024
f8f0a85
reinstantiate timeout
einar-polygon Oct 9, 2024
8d609ad
merge
einar-polygon Oct 9, 2024
dbb65ea
ignore None
einar-polygon Oct 9, 2024
d415d22
feedback
einar-polygon Oct 9, 2024
54a7df8
feedback: rustdoc
einar-polygon Oct 9, 2024
44b421c
feedback: add user-specified timeout
einar-polygon Oct 10, 2024
98b9c8e
feedback
einar-polygon Oct 11, 2024
4707d38
fix: addresses
einar-polygon Oct 14, 2024
4843501
todo: fix todo
einar-polygon Oct 14, 2024
8f980d2
testing: improve prove_stdio script
einar-polygon Oct 14, 2024
ee7e5f3
testing: improve test_native script
einar-polygon Oct 14, 2024
36557d1
Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra…
einar-polygon Oct 14, 2024
5451399
fmt
einar-polygon Oct 14, 2024
e9a8702
Round 5
einar-polygon Oct 14, 2024
b2f66ed
testing
einar-polygon Oct 15, 2024
cfb293c
testing: improve reporting, add error cases
einar-polygon Oct 15, 2024
3c497cc
change exit code
einar-polygon Oct 15, 2024
2dc52cb
don't panic!
einar-polygon Oct 16, 2024
dd89251
fix type 5 errors
einar-polygon Oct 18, 2024
6c59c41
Fix: 19548491
einar-polygon Oct 19, 2024
0d7f6b7
add stats
einar-polygon Oct 19, 2024
b8cf325
dbg
einar-polygon Oct 21, 2024
e9ec9f8
remove test scripts
einar-polygon Oct 21, 2024
7263ea3
remove modifications
einar-polygon Oct 21, 2024
ced5d5f
rename a
einar-polygon Oct 21, 2024
7dc2f51
remove todo
einar-polygon Oct 21, 2024
2848ede
add derive_more and add docs
einar-polygon Oct 21, 2024
552d569
clean up
einar-polygon Oct 21, 2024
83a0820
reinstantiate failover simulation
einar-polygon Oct 21, 2024
81f847c
use Hash2code
einar-polygon Oct 21, 2024
2313337
re-add prove_stdio.sh
einar-polygon Oct 21, 2024
206e9a4
mv derive_more
einar-polygon Oct 21, 2024
0b7e997
cleanup
einar-polygon Oct 21, 2024
d67911e
remove tracing
einar-polygon Oct 21, 2024
ee64de9
Merge branch 'develop' into einar/prefetch_transaction_jumps/pr
einar-polygon Oct 21, 2024
6b88ff6
fix derive_more
einar-polygon Oct 21, 2024
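
The PR title and the "feat: block structlog retrieval (#682)" commit describe collecting JUMPDEST offsets ahead of proving by tracing the block over RPC. The sketch below is not code from this branch; it is a minimal, self-contained illustration of that idea. The StructLog shape and the grouping by call depth are assumptions made for the example; the actual witness in this PR is keyed by code hash and call context instead.

use std::collections::{BTreeSet, HashMap};

/// Minimal stand-in for one structlog entry returned by a tracing RPC
/// (hypothetical type, not the crate's).
struct StructLog {
    op: String,
    pc: usize,
    depth: usize,
}

/// Collect the program counters of executed JUMPDESTs, grouped by call depth.
fn jumpdests_by_depth(logs: &[StructLog]) -> HashMap<usize, BTreeSet<usize>> {
    let mut table: HashMap<usize, BTreeSet<usize>> = HashMap::new();
    for log in logs {
        if log.op == "JUMPDEST" {
            table.entry(log.depth).or_default().insert(log.pc);
        }
    }
    table
}

fn main() {
    let logs = vec![
        StructLog { op: "PUSH1".into(), pc: 0, depth: 1 },
        StructLog { op: "JUMP".into(), pc: 2, depth: 1 },
        StructLog { op: "JUMPDEST".into(), pc: 5, depth: 1 },
    ];
    println!("{:?}", jumpdests_by_depth(&logs));
}
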
16 changes: 16 additions & 0 deletions Cargo.lock

(Generated file; diff not rendered.)

8 changes: 6 additions & 2 deletions Cargo.toml
@@ -35,6 +35,8 @@ alloy = { version = '0.3.0', default-features = false, features = [
"transport-http",
"rpc-types-debug",
] }
alloy-primitives = "0.8.0"
alloy-serde = "0.3.0"
anyhow = "1.0.86"
async-stream = "0.3.5"
axum = "0.7.5"
@@ -46,6 +48,7 @@ ciborium-io = "0.2.2"
clap = { version = "4.5.7", features = ["derive", "env"] }
compat = { path = "compat" }
criterion = "0.5.1"
derive_more = { version = "1.0.0", features = ["deref", "deref_mut"] }
dotenvy = "0.15.7"
either = "1.12.0"
enum-as-inner = "0.6.0"
@@ -85,6 +88,7 @@ ruint = "1.12.3"
serde = "1.0.203"
serde_json = "1.0.118"
serde_path_to_error = "0.1.16"
serde_with = "3.8.1"
serde-big-array = "0.5.1"
sha2 = "0.10.8"
static_assertions = "1.1.0"
@@ -93,8 +97,8 @@ thiserror = "1.0.61"
tiny-keccak = "2.0.2"
tokio = { version = "1.38.0", features = ["full"] }
tower = "0.4"
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
tracing = { version = "0.1", features = ["attributes"] }
tracing-subscriber = { version = "0.3", features = ["env-filter", "json"] }
trybuild = "1.0"
u4 = "0.1.0"
uint = "0.9.5"
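
Context for the Cargo.toml change above: derive_more 1.0 is pulled in with the "deref" and "deref_mut" features. A minimal sketch of what those derives buy for a newtype wrapper follows; the JumpdestTable name below is hypothetical and only illustrates the pattern, not the crate's actual definitions.

use std::collections::{BTreeSet, HashMap};

use derive_more::{Deref, DerefMut};

// Hypothetical newtype; shown only to illustrate the derives enabled above.
#[derive(Debug, Default, Deref, DerefMut)]
struct JumpdestTable(HashMap<usize, BTreeSet<usize>>);

fn main() {
    let mut t = JumpdestTable::default();
    // Thanks to Deref/DerefMut, HashMap methods are available directly on the wrapper.
    t.entry(3).or_default().insert(5);
    assert!(t.contains_key(&3));
}
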
1 change: 1 addition & 0 deletions evm_arithmetization/Cargo.toml
@@ -17,6 +17,7 @@ keywords.workspace = true
[dependencies]
anyhow.workspace = true
bytes.workspace = true
derive_more.workspace = true
env_logger.workspace = true
ethereum-types.workspace = true
hashbrown.workspace = true
1 change: 1 addition & 0 deletions evm_arithmetization/benches/fibonacci_25m_gas.rs
@@ -192,6 +192,7 @@ fn prepare_setup() -> anyhow::Result<GenerationInputs<F>> {
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: None,
})
}

70 changes: 63 additions & 7 deletions evm_arithmetization/src/cpu/kernel/interpreter.rs
@@ -5,10 +5,12 @@
//! the future execution and generate nondeterministically the corresponding
//! jumpdest table, before the actual CPU carries on with contract execution.

use core::option::Option::None;
use std::collections::{BTreeSet, HashMap};

use anyhow::anyhow;
use ethereum_types::{BigEndianHash, U256};
use keccak_hash::H256;
use log::Level;
use mpt_trie::partial_trie::PartialTrie;
use plonky2::hash::hash_types::RichField;
@@ -19,8 +21,10 @@ use crate::cpu::columns::CpuColumnsView;
use crate::cpu::kernel::aggregator::KERNEL;
use crate::cpu::kernel::constants::global_metadata::GlobalMetadata;
use crate::generation::debug_inputs;
use crate::generation::jumpdest::{Context, JumpDestTableProcessed, JumpDestTableWitness};
use crate::generation::linked_list::LinkedListsPtrs;
use crate::generation::mpt::{load_linked_lists_and_txn_and_receipt_mpts, TrieRootPtrs};
use crate::generation::prover_input::get_proofs_and_jumpdests;
use crate::generation::rlp::all_rlp_prover_inputs_reversed;
use crate::generation::state::{
all_ger_prover_inputs, all_withdrawals_prover_inputs_reversed, GenerationState,
@@ -56,6 +60,7 @@ pub(crate) struct Interpreter<F: RichField> {
pub(crate) halt_context: Option<usize>,
/// Counts the number of appearances of each opcode. For debugging purposes.
pub(crate) opcode_count: HashMap<Operation, usize>,
/// A table of call contexts and the JUMPDEST offsets that they jumped to.
jumpdest_table: HashMap<usize, BTreeSet<usize>>,
/// `true` if we are currently carrying out a jumpdest analysis.
pub(crate) is_jumpdest_analysis: bool,
@@ -71,9 +76,9 @@ pub(crate) fn simulate_cpu_and_get_user_jumps<F: RichField>(
pub(crate) fn simulate_cpu_and_get_user_jumps<F: RichField>(
final_label: &str,
state: &GenerationState<F>,
) -> Option<HashMap<usize, Vec<usize>>> {
) -> Option<(JumpDestTableProcessed, JumpDestTableWitness)> {
match state.jumpdest_table {
Some(_) => None,
Some(_) => Default::default(),
None => {
let halt_pc = KERNEL.global_labels[final_label];
let initial_context = state.registers.context;
@@ -92,16 +97,16 @@ pub(crate) fn simulate_cpu_and_get_user_jumps<F: RichField>(

let clock = interpreter.get_clock();

interpreter
let (jdtp, jdtw) = interpreter
.generation_state
.set_jumpdest_analysis_inputs(interpreter.jumpdest_table);
.get_jumpdest_analysis_inputs(interpreter.jumpdest_table.clone());

log::debug!(
"Simulated CPU for jumpdest analysis halted after {:?} cycles.",
clock
);

interpreter.generation_state.jumpdest_table
interpreter.generation_state.jumpdest_table = Some(jdtp.clone());
Some((jdtp, jdtw))
}
}
}
@@ -114,7 +119,7 @@ pub(crate) struct ExtraSegmentData {
pub(crate) withdrawal_prover_inputs: Vec<U256>,
pub(crate) ger_prover_inputs: Vec<U256>,
pub(crate) trie_root_ptrs: TrieRootPtrs,
pub(crate) jumpdest_table: Option<HashMap<usize, Vec<usize>>>,
pub(crate) jumpdest_table: Option<JumpDestTableProcessed>,
pub(crate) access_lists_ptrs: LinkedListsPtrs,
pub(crate) state_ptrs: LinkedListsPtrs,
pub(crate) next_txn_index: usize,
@@ -150,6 +155,57 @@ pub(crate) fn set_registers_and_run<F: RichField>(
interpreter.run()
}

/// Computes the JUMPDEST proofs for each context.
///
/// # Arguments
///
/// - `jumpdest_table_rpc`: The raw table received from RPC.
/// - `code_db`: The corresponding database of contract code used in the trace.
///
/// # Output
///
/// Returns a [`JumpDestTableProcessed`].
pub(crate) fn get_jumpdest_analysis_inputs_rpc(
jumpdest_table_rpc: &JumpDestTableWitness,
code_map: &HashMap<H256, Vec<u8>>,
) -> JumpDestTableProcessed {
let ctx_proofs = (*jumpdest_table_rpc)
.iter()
.flat_map(|(code_addr, ctx_jumpdests)| {
let code = if code_map.contains_key(code_addr) {
&code_map[code_addr]
} else {
&vec![]
};
prove_context_jumpdests(code, ctx_jumpdests)
})
.collect();
JumpDestTableProcessed::new(ctx_proofs)
}

/// Orchestrates the proving of all contexts in a specific bytecode.
///
/// # Arguments
///
/// - `code`: The bytecode shared by the contexts in `ctx`.
/// - `ctx`: The [`Context`], mapping each context id to its set of `JUMPDEST` offsets.
///
/// # Outputs
///
/// Returns a [`HashMap`] from `ctx` to a [`Vec`] of proofs. Each proof is a
/// pair.
fn prove_context_jumpdests(code: &[u8], ctx: &Context) -> HashMap<usize, Vec<usize>> {
ctx.0
.iter()
.map(|(&ctx, jumpdests)| {
let proofs = jumpdests.last().map_or(Vec::default(), |&largest_address| {
get_proofs_and_jumpdests(code, largest_address, jumpdests.clone())
});
(ctx, proofs)
})
.collect()
}

impl<F: RichField> Interpreter<F> {
/// Returns an instance of `Interpreter` given `GenerationInputs`, and
/// assuming we are initializing with the `KERNEL` code.
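
A rough, self-contained mirror of the shapes used in get_jumpdest_analysis_inputs_rpc above, for readers who do not want to chase the crate-internal types: the RPC witness maps code hash to call context to a set of JUMPDEST offsets, unknown code hashes fall back to empty bytecode, and the real proof derivation (get_proofs_and_jumpdests) is replaced here by a naive opcode check. The type aliases are assumptions, not the crate's definitions.

use std::collections::{BTreeSet, HashMap};

type CodeHash = [u8; 32]; // stand-in for keccak_hash::H256
type Witness = HashMap<CodeHash, HashMap<usize, BTreeSet<usize>>>; // code hash -> ctx -> offsets
type Processed = HashMap<usize, Vec<usize>>; // ctx -> validated offsets

fn process(witness: &Witness, code_map: &HashMap<CodeHash, Vec<u8>>) -> Processed {
    witness
        .iter()
        .flat_map(|(code_hash, contexts)| {
            // Unknown code hashes degrade to empty bytecode, as in the diff above.
            let code: &[u8] = match code_map.get(code_hash) {
                Some(c) => c,
                None => &[],
            };
            contexts.iter().map(move |(&ctx, jumpdests)| {
                // Stand-in for the real proof derivation: keep only offsets that
                // actually hold a JUMPDEST byte (0x5b). (Naive: ignores 0x5b bytes
                // that sit inside PUSH immediates.)
                let validated: Vec<usize> = jumpdests
                    .iter()
                    .copied()
                    .filter(|&pc| code.get(pc) == Some(&0x5b))
                    .collect();
                (ctx, validated)
            })
        })
        .collect()
}

fn main() {
    let code_hash = [0u8; 32];
    let code = vec![0x60, 0x00, 0x56, 0x5b]; // PUSH1 0x00; JUMP; JUMPDEST
    let witness: Witness = HashMap::from([(code_hash, HashMap::from([(3, BTreeSet::from([3]))]))]);
    let code_map = HashMap::from([(code_hash, code)]);
    assert_eq!(process(&witness, &code_map), HashMap::from([(3, vec![3])]));
}
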
2 changes: 2 additions & 0 deletions evm_arithmetization/src/cpu/kernel/tests/add11.rs
@@ -193,6 +193,7 @@ fn test_add11_yml() {
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: None,
};

let initial_stack = vec![];
@@ -370,6 +371,7 @@ fn test_add11_yml_with_exception() {
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: None,
};

let initial_stack = vec![];
@@ -10,13 +10,15 @@ use plonky2::hash::hash_types::RichField;
use crate::cpu::kernel::aggregator::KERNEL;
use crate::cpu::kernel::interpreter::Interpreter;
use crate::cpu::kernel::opcodes::{get_opcode, get_push_opcode};
use crate::generation::jumpdest::JumpDestTableProcessed;
use crate::memory::segments::Segment;
use crate::witness::memory::MemoryAddress;
use crate::witness::operation::CONTEXT_SCALING_FACTOR;

impl<F: RichField> Interpreter<F> {
pub(crate) fn set_jumpdest_analysis_inputs(&mut self, jumps: HashMap<usize, BTreeSet<usize>>) {
self.generation_state.set_jumpdest_analysis_inputs(jumps);
let (jdtp, _jdtw) = self.generation_state.get_jumpdest_analysis_inputs(jumps);
self.generation_state.jumpdest_table = Some(jdtp);
}

pub(crate) fn get_jumpdest_bit(&self, offset: usize) -> U256 {
@@ -106,7 +108,10 @@ fn test_jumpdest_analysis() -> Result<()> {
interpreter.generation_state.jumpdest_table,
// Context 3 has jumpdest 1, 5, 7. All have proof 0 and hence
// the list [proof_0, jumpdest_0, ... ] is [0, 1, 0, 5, 0, 7, 8, 40]
Some(HashMap::from([(3, vec![0, 1, 0, 5, 0, 7, 8, 40])]))
Some(JumpDestTableProcessed::new(HashMap::from([(
3,
vec![0, 1, 0, 5, 0, 7, 8, 40]
)])))
);

// Run jumpdest analysis with context = 3
@@ -175,7 +180,9 @@ fn test_packed_verification() -> Result<()> {
let mut interpreter: Interpreter<F> =
Interpreter::new(write_table_if_jumpdest, initial_stack.clone(), None);
interpreter.set_code(CONTEXT, code.clone());
interpreter.generation_state.jumpdest_table = Some(HashMap::from([(3, vec![1, 33])]));
interpreter.generation_state.jumpdest_table = Some(JumpDestTableProcessed::new(HashMap::from(
[(3, vec![1, 33])],
)));

interpreter.run()?;

@@ -188,7 +195,9 @@ fn test_packed_verification() -> Result<()> {
let mut interpreter: Interpreter<F> =
Interpreter::new(write_table_if_jumpdest, initial_stack.clone(), None);
interpreter.set_code(CONTEXT, code.clone());
interpreter.generation_state.jumpdest_table = Some(HashMap::from([(3, vec![1, 33])]));
interpreter.generation_state.jumpdest_table = Some(JumpDestTableProcessed::new(
HashMap::from([(3, vec![1, 33])]),
));

assert!(interpreter.run().is_err());

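
The comment in test_jumpdest_analysis above spells out the flattened layout the kernel consumes: per context, a vector interleaving each jumpdest's proof with the jumpdest itself, [proof_0, jumpdest_0, proof_1, jumpdest_1, ...]. A small hedged illustration of that layout follows; the proof values are dummies, since the real ones come from get_proofs_and_jumpdests.

/// Flatten (proof, jumpdest) pairs into the interleaved form the kernel expects.
fn interleave(pairs: &[(usize, usize)]) -> Vec<usize> {
    pairs
        .iter()
        .flat_map(|&(proof, jumpdest)| [proof, jumpdest])
        .collect()
}

fn main() {
    // Jumpdests 1, 5 and 7 with (dummy) proof 0, matching the shape of the
    // test's expected prefix [0, 1, 0, 5, 0, 7, ...].
    assert_eq!(interleave(&[(0, 1), (0, 5), (0, 7)]), vec![0, 1, 0, 5, 0, 7]);
}
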
@@ -101,6 +101,7 @@ fn test_init_exc_stop() {
cur_hash: H256::default(),
},
ger_data: None,
jumpdest_table: None,
};
let initial_stack = vec![];
let initial_offset = KERNEL.global_labels["init"];