
Add publish rsr to smart contract #383

Merged: 70 commits, Oct 29, 2024
Changes from 5 commits
Commits
70 commits
bd3b790
add `rsrContract`
juliangruber Oct 20, 2024
f6a7fea
add `storacha`
juliangruber Oct 20, 2024
9aafd48
persist round `cids` and `details`
juliangruber Oct 20, 2024
fa84db3
add publish rsr (untested)
juliangruber Oct 20, 2024
272cdf2
add git commit
juliangruber Oct 20, 2024
5e5a051
update env var name
juliangruber Oct 21, 2024
a5ee114
refactor `createStorachaClient()`
juliangruber Oct 23, 2024
5cdad06
update schema
juliangruber Oct 24, 2024
687151d
hide `GIT_COMMIT` in scope
juliangruber Oct 24, 2024
58f71ac
`gitCommit` -> `sparkEvaluateVersion`
juliangruber Oct 24, 2024
b1ae3b0
`postEvaluate` -> `prepareAcceptedRetrievalTaskMeasurementsCommitment`
juliangruber Oct 24, 2024
bf0941b
`round.cids` -> `round.measurementCommitments`
juliangruber Oct 24, 2024
f5e6ee0
fix `.total`
juliangruber Oct 24, 2024
384a32c
`publish_rsr_rounds` -> `unpublished_rsr_rounds`
juliangruber Oct 24, 2024
b5ecbc0
Merge branch 'main' into add/publish-rsr
juliangruber Oct 24, 2024
fff2303
update contract
juliangruber Oct 28, 2024
4a6e210
update contract
juliangruber Oct 28, 2024
f9d8b5a
fix rsr calculation
juliangruber Oct 28, 2024
64cbe3a
share logic for building retrieval stats
juliangruber Oct 28, 2024
833381b
rename method to match contract
juliangruber Oct 28, 2024
ac1e25a
Update lib/publish-rsr.js
juliangruber Oct 28, 2024
6d09e29
get date string from db
juliangruber Oct 28, 2024
90c3599
fix deletion logic
juliangruber Oct 28, 2024
3de56fc
format
juliangruber Oct 28, 2024
35ddb53
always publish oldest publishable date
juliangruber Oct 28, 2024
070a7fa
update column to match smart contract
juliangruber Oct 28, 2024
609a698
add contract address
juliangruber Oct 28, 2024
7b541f3
Merge branch 'main' into add/publish-rsr
juliangruber Oct 28, 2024
0f63301
consistent naming
juliangruber Oct 29, 2024
cd8aa54
measurement_commitments -> _batches
juliangruber Oct 29, 2024
ba3f885
update schema (wip)
juliangruber Oct 29, 2024
e5edb03
upload dag-json
juliangruber Oct 29, 2024
5df3d54
unify terminology
juliangruber Oct 29, 2024
abfd927
upload round details to storacha
juliangruber Oct 29, 2024
07a4493
minerId -> providerId
juliangruber Oct 29, 2024
79e0f1e
update schema
juliangruber Oct 29, 2024
25f25f9
refactor
juliangruber Oct 29, 2024
95909cc
consistent naming
juliangruber Oct 29, 2024
66dca30
consistent naming
juliangruber Oct 29, 2024
87a405a
doc
juliangruber Oct 29, 2024
33f984a
add passing tests
juliangruber Oct 29, 2024
88a102a
add passing test
juliangruber Oct 29, 2024
cda11e8
improve error message
juliangruber Oct 29, 2024
53b9520
add passing test
juliangruber Oct 29, 2024
dc5035f
move stuff around
juliangruber Oct 29, 2024
24ca741
fix lint
juliangruber Oct 29, 2024
f71efef
add passing test
juliangruber Oct 29, 2024
9ec9f05
add passing test
juliangruber Oct 29, 2024
9bbe3d6
fix
juliangruber Oct 29, 2024
4ff817e
add passing test
juliangruber Oct 29, 2024
8a122cc
consistent naming
juliangruber Oct 29, 2024
9951310
add passing test
juliangruber Oct 29, 2024
284a59e
improve car/cid tests
juliangruber Oct 29, 2024
2d93051
fix query with test
juliangruber Oct 29, 2024
9dd9099
docs: fix CID
juliangruber Oct 29, 2024
1219068
add test and fixes
juliangruber Oct 29, 2024
86006b2
add passing test
juliangruber Oct 29, 2024
9cba58c
fix lint
juliangruber Oct 29, 2024
41ffd36
add passing test
juliangruber Oct 29, 2024
88a8479
fix test name
juliangruber Oct 29, 2024
0e484c6
refactor
juliangruber Oct 29, 2024
8f5df40
add passing test
juliangruber Oct 29, 2024
6da0230
add test and fix
juliangruber Oct 29, 2024
1ebd405
clean up
juliangruber Oct 29, 2024
96af4cc
add passing tests
juliangruber Oct 29, 2024
2800cfe
add passing test
juliangruber Oct 29, 2024
ea5cdc3
add passing test
juliangruber Oct 29, 2024
9ac3899
consistent naming
juliangruber Oct 29, 2024
480a8fb
refactor
juliangruber Oct 29, 2024
2b3bd2f
fix lint
juliangruber Oct 29, 2024
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -102,7 +102,7 @@ jobs:
steps:
- uses: actions/checkout@v4
- uses: superfly/flyctl-actions/setup-flyctl@master
- run: flyctl deploy --remote-only
- run: flyctl deploy --remote-only --build-arg GIT_COMMIT=$(git rev-parse HEAD)
env:
FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}
- if: failure()
3 changes: 3 additions & 0 deletions Dockerfile
@@ -14,6 +14,9 @@ ENV NODE_ENV=production
# 4096MB available memory - 200MB for anything else
ENV NODE_OPTIONS="--max-old-space-size=3896"

ARG GIT_COMMIT
ENV GIT_COMMIT=$GIT_COMMIT
Comment on lines +17 to +18

Member: Should we document this new required argument for the situations when we want to run docker build locally?

Member Author: I never run docker build locally, what is the use case for that?

Member: I usually build locally when troubleshooting docker build failures on the CI or when making changes to the Dockerfile. It's faster to do this locally than wait for the CI run. It's not a big deal for me if you don't document this.

Member Author: Got it! I'm happy to document it, where's a good place for that?
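For reference, a local build with the new argument could look something like the following sketch. It mirrors the `--build-arg` flag added in the CI workflow; the image tag `spark-evaluate` is illustrative, not taken from this PR.

```shell
# Hypothetical local invocation; assumes Docker and a git checkout are available.
# GIT_COMMIT is consumed by the ARG/ENV pair added in the Dockerfile above.
docker build --build-arg GIT_COMMIT="$(git rev-parse HEAD)" -t spark-evaluate .
```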


# Throw-away build stage to reduce size of final image
FROM base AS build

4 changes: 2 additions & 2 deletions bin/cancel-pending-tx.js
@@ -6,7 +6,7 @@ import { ethers } from 'ethers'
import { CoinType, newDelegatedEthAddress } from '@glif/filecoin-address'
import pRetry from 'p-retry'

import { createMeridianContract } from '../lib/ie-contract.js'
import { createContracts } from '../lib/contracts.js'

const {
WALLET_SEED
@@ -17,7 +17,7 @@ const [, , tx] = process.argv
assert(WALLET_SEED, 'WALLET_SEED required')
assert(tx, 'Transaction hash must be provided as the first argument')

const { provider } = await createMeridianContract()
const { provider } = createContracts()

const signer = ethers.Wallet.fromPhrase(WALLET_SEED, provider)
const walletDelegatedAddress = newDelegatedEthAddress(/** @type {any} */(signer.address), CoinType.MAIN).toString()
9 changes: 5 additions & 4 deletions bin/dry-run.js
@@ -12,7 +12,7 @@ import path from 'node:path'
import { fileURLToPath } from 'node:url'
import pg from 'pg'
import { RoundData } from '../lib/round.js'
import { createMeridianContract } from '../lib/ie-contract.js'
import { createContracts } from '../lib/contracts.js'
import * as SparkImpactEvaluator from '@filecoin-station/spark-impact-evaluator'

/** @typedef {import('../lib/preprocess.js').Measurement} Measurement */
@@ -142,7 +142,8 @@ const { ignoredErrors } = await evaluate({
setScores,
logger: console,
recordTelemetry,
createPgClient
createPgClient,
gitCommit: ''
})

console.log('Duration: %sms', Date.now() - started)
@@ -213,7 +214,7 @@ async function fetchMeasurementsAddedEvents (contractAddress, roundIndex) {
* @param {bigint} roundIndex
*/
async function fetchMeasurementsAddedFromChain (contractAddress, roundIndex) {
const { ieContract, provider } = await createMeridianContract(contractAddress)
const { ieContract, provider } = createContracts(contractAddress)

console.log('Fetching MeasurementsAdded events from the ledger')

@@ -268,6 +269,6 @@ function isEventLog (logOrEventLog) {
* @param {string} contractAddress
*/
async function fetchLastRoundIndex (contractAddress) {
const { ieContract } = await createMeridianContract(contractAddress)
const { ieContract } = createContracts(contractAddress)
return await ieContract.currentRoundIndex()
}
7 changes: 4 additions & 3 deletions bin/fetch-recent-miner-measurements.js
@@ -9,7 +9,7 @@ import { mkdir, readFile, writeFile } from 'node:fs/promises'
import os from 'node:os'
import path from 'node:path'
import pMap from 'p-map'
import { createMeridianContract } from '../lib/ie-contract.js'
import { createContracts } from '../lib/contracts.js'
import { fetchMeasurements, preprocess } from '../lib/preprocess.js'
import { RoundData } from '../lib/round.js'
import * as SparkImpactEvaluator from '@filecoin-station/spark-impact-evaluator'
@@ -140,7 +140,7 @@ console.error('Wrote human-readable summary for %s to %s', minerId, MINER_SUMMAR
* @returns
*/
async function getRecentMeasurementsAddedEvents (contractAddress, blocksToQuery = Number.POSITIVE_INFINITY) {
const { ieContract } = await createMeridianContract(contractAddress)
const { ieContract } = createContracts(contractAddress)

// max look-back period allowed by Glif.io is 2000 blocks (approx 16h40m)
// in practice, requests for the last 2000 blocks are usually rejected,
@@ -214,7 +214,8 @@ async function processRound (roundIndex, measurementCids, resultCounts) {
recordTelemetry,
logger: { log: debug, error: debug },
ieContract,
setScores: async () => {}
setScores: async () => {},
gitCommit: ''
})

for (const m of round.measurements) {
57 changes: 45 additions & 12 deletions bin/spark-evaluate.js
@@ -9,12 +9,20 @@ import { recordTelemetry } from '../lib/telemetry.js'
import { fetchMeasurements } from '../lib/preprocess.js'
import { migrateWithPgConfig } from '../lib/migrate.js'
import pg from 'pg'
import { createMeridianContract } from '../lib/ie-contract.js'
import { createContracts } from '../lib/contracts.js'
import { setScores } from '../lib/submit-scores.js'
import { runPublishRsrLoop } from '../lib/publish-rsr.js'
import * as Client from '@web3-storage/w3up-client'
import { ed25519 } from '@ucanto/principal'
import { CarReader } from '@ipld/car'
import { importDAG } from '@ucanto/core/delegation'

const {
SENTRY_ENVIRONMENT = 'development',
WALLET_SEED
WALLET_SEED,
STORACHA_PRIVATE_KEY,
STORACHA_PROOF,
GIT_COMMIT
} = process.env

Sentry.init({
@@ -25,10 +33,27 @@ Sentry.init({
})

assert(WALLET_SEED, 'WALLET_SEED required')
assert(STORACHA_PRIVATE_KEY, 'STORACHA_PRIVATE_KEY required')
assert(STORACHA_PROOF, 'STORACHA_PROOF required')

await migrateWithPgConfig({ connectionString: DATABASE_URL })

const { ieContract, provider } = await createMeridianContract()
async function parseProof (data) {
const blocks = []
const reader = await CarReader.fromBytes(Buffer.from(data, 'base64'))
for await (const block of reader.blocks()) {
blocks.push(block)
}
return importDAG(blocks)
}

const principal = ed25519.Signer.parse(STORACHA_PRIVATE_KEY)
const storachaClient = await Client.create({ principal })
const proof = await parseProof(STORACHA_PROOF)
const space = await storachaClient.addSpace(proof)
await storachaClient.setCurrentSpace(space.did())

const { ieContract, rsrContract, provider } = createContracts()

const signer = ethers.Wallet.fromPhrase(WALLET_SEED, provider)
const walletDelegatedAddress = newDelegatedEthAddress(/** @type {any} */(signer.address), CoinType.MAIN).toString()
@@ -41,12 +66,20 @@ const createPgClient = async () => {
return pgClient
}

await startEvaluate({
ieContract,
fetchMeasurements,
fetchRoundDetails,
recordTelemetry,
createPgClient,
logger: console,
setScores: (participants, values) => setScores(signer, participants, values)
})
await Promise.all([
startEvaluate({
ieContract,
fetchMeasurements,
fetchRoundDetails,
recordTelemetry,
createPgClient,
logger: console,
setScores: (participants, values) => setScores(signer, participants, values),
gitCommit: GIT_COMMIT
}),
runPublishRsrLoop({
createPgClient,
storachaClient,
rsrContract: rsrContract.connect(signer)
})
])
6 changes: 4 additions & 2 deletions index.js
@@ -19,7 +19,8 @@ export const startEvaluate = async ({
recordTelemetry,
createPgClient,
logger,
setScores
setScores,
gitCommit
}) => {
assert(typeof createPgClient === 'function', 'createPgClient must be a function')

@@ -116,7 +117,8 @@
recordTelemetry,
createPgClient,
logger,
setScores
setScores,
gitCommit
}).catch(err => {
console.error('CANNOT EVALUATE ROUND %s:', evaluatedRoundIndex, err)
Sentry.captureException(err, {
21 changes: 18 additions & 3 deletions lib/ie-contract.js → lib/contracts.js
@@ -1,8 +1,17 @@
import { ethers } from 'ethers'
import { rpcUrls, GLIF_TOKEN } from './config.js'
import * as SparkImpactEvaluator from '@filecoin-station/spark-impact-evaluator'
import fs from 'node:fs/promises'
import { fileURLToPath } from 'node:url'

export const createMeridianContract = async (contractAddress = SparkImpactEvaluator.ADDRESS) => {
const rsrContractAbi = JSON.parse(
await fs.readFile(
fileURLToPath(new URL('./rsrContract.json', import.meta.url)),
'utf8'
)
).abi

export const createContracts = (ieContractAddress = SparkImpactEvaluator.ADDRESS) => {
const provider = new ethers.FallbackProvider(rpcUrls.map(url => {
const fetchRequest = new ethers.FetchRequest(url)
fetchRequest.setHeader('Authorization', `Bearer ${GLIF_TOKEN}`)
@@ -16,10 +25,16 @@ export const createMeridianContract = async (contractAddress = SparkImpactEvaluator.ADDRESS) => {
// provider.on('debug', d => console.log('[ethers:debug %s] %s %o', new Date().toISOString().split('T')[1], d.action, d.payload ?? d.result))

const ieContract = new ethers.Contract(
contractAddress,
ieContractAddress,
SparkImpactEvaluator.ABI,
provider
)

return { ieContract, provider }
const rsrContract = new ethers.Contract(
'TODO',
rsrContractAbi,
provider
)

return { ieContract, rsrContract, provider }
}
14 changes: 13 additions & 1 deletion lib/evaluate.js
@@ -7,6 +7,7 @@ import { buildRetrievalStats, getTaskId, recordCommitteeSizes } from './retrieval-stats.js'
import { getTasksAllowedForStations } from './tasker.js'
import { groupMeasurementsToCommittees } from './committee.js'
import pRetry from 'p-retry'
import * as publishRsr from './publish-rsr.js'

/** @import {Measurement} from './preprocess.js' */

@@ -26,6 +27,7 @@ export const REQUIRED_COMMITTEE_SIZE = 30
* @param {import('./typings.js').RecordTelemetryFn} args.recordTelemetry
* @param {import('./typings.js').CreatePgClient} [args.createPgClient]
* @param {Pick<Console, 'log' | 'error'>} args.logger
* @param {string} args.gitCommit
*/
export const evaluate = async ({
round,
@@ -36,7 +38,8 @@
fetchRoundDetails,
recordTelemetry,
createPgClient,
logger
logger,
gitCommit
}) => {
requiredCommitteeSize ??= REQUIRED_COMMITTEE_SIZE

@@ -47,6 +50,7 @@
// Detect fraud

const sparkRoundDetails = await fetchRoundDetails(await ieContract.getAddress(), roundIndex, recordTelemetry)
round.details = sparkRoundDetails
// Omit the roundDetails object from the format string to get nicer formatting
debug('ROUND DETAILS for round=%s', roundIndex, sparkRoundDetails)

@@ -202,6 +206,14 @@
}
}

try {
await publishRsr.postEvaluate({ createPgClient, round, gitCommit })
} catch (err) {
console.error('Cannot store data for `spark-rsr-contract`.', err)
ignoredErrors.push(err)
Sentry.captureException(err)
}

return { ignoredErrors }
}

1 change: 1 addition & 0 deletions lib/preprocess.js
@@ -116,6 +116,7 @@ export const preprocess = async ({
logger.log('Retrieval Success Rate: %s%s (%s of %s)', Math.round(100 * okCount / total), '%', okCount, total)

round.measurements.push(...validMeasurements)
round.cids.push(cid)

recordTelemetry('preprocess', point => {
point.intField('round_index', roundIndex)
96 changes: 96 additions & 0 deletions lib/publish-rsr.js
@@ -0,0 +1,96 @@
import * as Sentry from '@sentry/node'
import timers from 'node:timers/promises'
import pRetry from 'p-retry'

const ONE_HOUR = 60 * 60 * 1000

const withPgClient = fn => async ({ createPgClient, ...args }) => {
const pgClient = await createPgClient()
try {
return await fn({ pgClient, ...args })
} finally {
await pgClient.end()
}
}
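The `withPgClient` helper above is a small resource-scoping decorator: acquire a client, delegate, always release. A generic standalone sketch of the same shape (the names `withResource`, `withCounter`, and `countedTask` are illustrative, not from this PR):

```javascript
// Acquire a resource, pass it to the wrapped function, and always release it,
// even when the wrapped function throws.
const withResource = (create, destroy) => fn => async (args = {}) => {
  const resource = await create()
  try {
    return await fn({ resource, ...args })
  } finally {
    await destroy(resource)
  }
}

// Usage: callers of `countedTask` never manage the resource lifecycle.
const withCounter = withResource(
  async () => ({ calls: 0 }), // create
  async () => {} // destroy (a no-op here)
)
const countedTask = withCounter(async ({ resource }) => ++resource.calls)
```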

export const postEvaluate = withPgClient(async ({ pgClient, round, gitCommit }) => {
juliangruber marked this conversation as resolved.
Show resolved Hide resolved
await pgClient.query(`
INSERT INTO publish_rsr_rounds
juliangruber marked this conversation as resolved.
Show resolved Hide resolved
(round_index, spark_evaluate_version, measurement_commitments, round_details, providers)
VALUES
($1, $2, $3, $4, $5)
`, [
round.index,
gitCommit,
round.cids,
round.details,
round.measurements.reduce((acc, m) => {
acc[m.minerId] = acc[m.minerId] || { successful: 0, total: 0 }
acc[m.minerId].total++
if (m.retrievalResult === 'OK') {
acc[m.minerId].successful++
}
return acc
}, {})
juliangruber marked this conversation as resolved.
Show resolved Hide resolved
])
})
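The providers argument passed to the `INSERT` above is a per-miner retrieval tally built with a single reduce. A standalone sketch with made-up sample measurements, mirroring the reduce in `postEvaluate`:

```javascript
// Sample measurements (fabricated for illustration).
const measurements = [
  { minerId: 'f01', retrievalResult: 'OK' },
  { minerId: 'f01', retrievalResult: 'TIMEOUT' },
  { minerId: 'f02', retrievalResult: 'OK' }
]

// Tally successful and total retrievals per miner.
const providers = measurements.reduce((acc, m) => {
  acc[m.minerId] = acc[m.minerId] || { successful: 0, total: 0 }
  acc[m.minerId].total++
  if (m.retrievalResult === 'OK') acc[m.minerId].successful++
  return acc
}, {})

console.log(providers)
// { f01: { successful: 1, total: 2 }, f02: { successful: 1, total: 1 } }
```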

const publishRsr = withPgClient(async ({ pgClient, storachaClient, rsrContract }) => {
const { rows } = await pgClient.query(`
SELECT *
FROM publish_rsr_rounds
WHERE evaluated_at <= now()::date AND evaluated_at > now()::date - interval '1 day'
juliangruber marked this conversation as resolved.
Show resolved Hide resolved
ORDER BY round_index
`)

const providers = new Map()
for (const row of rows) {
for (const [minerId, { successful, total }] of Object.entries(row.providers)) {
if (!providers.has(minerId)) {
providers.set(minerId, { successful: 0, total: 0 })
}
providers.get(minerId).successful += successful
providers.get(minerId).total += total
}
}

const directoryCid = await pRetry(() => storachaClient.uploadDirectory([
new File([JSON.stringify({
date: rows[0].evaluated_at.toISOString().split('T')[0],
juliangruber marked this conversation as resolved.
Show resolved Hide resolved
meta: {
rounds: Object.fromEntries(rows.map(row => [
row.round_index,
{
sparkEvaluateVersion: row.spark_evaluate_version,
measurementCommitments: row.measurement_commitments,
juliangruber marked this conversation as resolved.
Show resolved Hide resolved
roundDetails: row.round_details
}
]))
},
providers
})], 'commitment.json')
]))
console.log(`https://${directoryCid}.ipfs.w3s.link/commitment.json`)

const tx = await pRetry(() => rsrContract.addCommitment(directoryCid.toString()))
console.log(tx.hash)
await tx.wait()

await pgClient.query(`
DELETE FROM publish_rsr_rounds
WHERE round_index < $1
juliangruber marked this conversation as resolved.
Show resolved Hide resolved
`, [rows[0].round_index])
})

export const runPublishRsrLoop = async ({ createPgClient, storachaClient, rsrContract }) => {
while (true) {
try {
await publishRsr({ createPgClient, storachaClient, rsrContract })
} catch (err) {
console.error(err)
Sentry.captureException(err)
} finally {
await timers.setTimeout(ONE_HOUR)
}
}
}
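`runPublishRsrLoop` above runs forever on a one-hour interval, reporting errors without crashing. A bounded sketch of the same run / report / sleep shape (the `iterations` parameter is added here only so the sketch terminates; it is not part of this PR):

```javascript
import timers from 'node:timers/promises'

// Run `task` repeatedly: report errors via `onError` instead of crashing,
// then sleep before the next attempt, whether or not the task failed.
async function runLoop (task, onError, { iterations, delayMs }) {
  for (let i = 0; i < iterations; i++) {
    try {
      await task()
    } catch (err) {
      onError(err)
    } finally {
      await timers.setTimeout(delayMs)
    }
  }
}
```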