Add publish rsr to smart contract (#383)
* add `rsrContract`

* add `storacha`

* persist round `cids` and `details`

* add publish rsr (untested)

* add git commit

* update env var name

* refactor `createStorachaClient()`

* update schema

* hide `GIT_COMMIT` in scope

* `gitCommit` -> `sparkEvaluateVersion`

* `postEvaluate` -> `prepareAcceptedRetrievalTaskMeasurementsCommitment`

* `round.cids` -> `round.measurementCommitments`

* fix `.total`

* `publish_rsr_rounds` -> `unpublished_rsr_rounds`

* update contract

* update contract

* fix rsr calculation

* share logic for building retrieval stats

* rename method to match contract

* Update lib/publish-rsr.js

Co-authored-by: Miroslav Bajtoš <oss@bajtos.net>

* get date string from db

* fix deletion logic

* format

* always publish oldest publishable date

* update column to match smart contract

* add contract address

* consistent naming

* measurement_commitments -> _batches

* update schema (wip)

* upload dag-json

* unify terminology

* upload round details to storacha

* minerId -> providerId

* update schema

* refactor

* consistent naming

* consistent naming

* doc

* add passing tests

* add passing test

* improve error message

* add passing test

* move stuff around

* fix lint

* add passing test

* add passing test

* fix

* add passing test

* consistent naming

* add passing test

* improve car/cid tests

* fix query with test

* docs: fix CID

* add test and fixes

* add passing test

* fix lint

* add passing test

* fix test name

* refactor

* add passing test

* add test and fix

* clean up

* add passing tests

* add passing test

* add passing test

* consistent naming

* refactor

* fix lint

---------

Co-authored-by: Miroslav Bajtoš <oss@bajtos.net>
juliangruber and bajtos authored Oct 29, 2024
1 parent c74b4bc commit 100aa85
Showing 25 changed files with 1,857 additions and 105 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -102,7 +102,7 @@ jobs:
steps:
- uses: actions/checkout@v4
- uses: superfly/flyctl-actions/setup-flyctl@master
- run: flyctl deploy --remote-only
- run: flyctl deploy --remote-only --build-arg GIT_COMMIT=$(git rev-parse HEAD)
env:
FLY_API_TOKEN: ${{ secrets.FLY_API_TOKEN }}
- if: failure()
3 changes: 3 additions & 0 deletions Dockerfile
@@ -14,6 +14,9 @@ ENV NODE_ENV=production
# 4096MB available memory - 200MB for anything else
ENV NODE_OPTIONS="--max-old-space-size=3896"

ARG GIT_COMMIT
ENV GIT_COMMIT=$GIT_COMMIT

# Throw-away build stage to reduce size of final image
FROM base AS build

2 changes: 1 addition & 1 deletion README.md
@@ -155,7 +155,7 @@ You can perform a dry-run evaluation of a given Meridian round using the script `
You can optionally specify the smart contract address, round index and list of CIDs of measurements
to load. For example, run the following command to evaluate round `273` of the Meridian version
`0x3113b83ccec38a18df936f31297de490485d7b2e` with measurements from CID
`bafybeie5rekb2jox77ow64wjjd2bjdsp6d3yeivhzzd234hnbpscfjarv4z`:
`bafybeie5rekb2jox77ow64wjjd2bjdsp6d3yeivhzzd234hnbpscfjarv4`:

```shell
node bin/dry-run.js \
4 changes: 2 additions & 2 deletions bin/cancel-pending-tx.js
@@ -6,7 +6,7 @@ import { ethers } from 'ethers'
import { CoinType, newDelegatedEthAddress } from '@glif/filecoin-address'
import pRetry from 'p-retry'

import { createMeridianContract } from '../lib/ie-contract.js'
import { createContracts } from '../lib/contracts.js'

const {
WALLET_SEED
@@ -17,7 +17,7 @@ const [, , tx] = process.argv
assert(WALLET_SEED, 'WALLET_SEED required')
assert(tx, 'Transaction hash must be provided as the first argument')

const { provider } = await createMeridianContract()
const { provider } = createContracts()

const signer = ethers.Wallet.fromPhrase(WALLET_SEED, provider)
const walletDelegatedAddress = newDelegatedEthAddress(/** @type {any} */(signer.address), CoinType.MAIN).toString()
9 changes: 5 additions & 4 deletions bin/dry-run.js
@@ -12,7 +12,7 @@ import path from 'node:path'
import { fileURLToPath } from 'node:url'
import pg from 'pg'
import { RoundData } from '../lib/round.js'
import { createMeridianContract } from '../lib/ie-contract.js'
import { createContracts } from '../lib/contracts.js'
import * as SparkImpactEvaluator from '@filecoin-station/spark-impact-evaluator'

/** @typedef {import('../lib/preprocess.js').Measurement} Measurement */
@@ -142,7 +142,7 @@ const { ignoredErrors } = await evaluate({
setScores,
logger: console,
recordTelemetry,
createPgClient
createPgClient,
prepareProviderRetrievalResultStats: async () => {}
})

console.log('Duration: %sms', Date.now() - started)
@@ -213,7 +214,7 @@ async function fetchMeasurementsAddedEvents (contractAddress, roundIndex) {
* @param {bigint} roundIndex
*/
async function fetchMeasurementsAddedFromChain (contractAddress, roundIndex) {
const { ieContract, provider } = await createMeridianContract(contractAddress)
const { ieContract, provider } = createContracts(contractAddress)

console.log('Fetching MeasurementsAdded events from the ledger')

@@ -268,6 +269,6 @@ function isEventLog (logOrEventLog) {
* @param {string} contractAddress
*/
async function fetchLastRoundIndex (contractAddress) {
const { ieContract } = await createMeridianContract(contractAddress)
const { ieContract } = createContracts(contractAddress)
return await ieContract.currentRoundIndex()
}
7 changes: 4 additions & 3 deletions bin/fetch-recent-miner-measurements.js
@@ -9,7 +9,7 @@ import { mkdir, readFile, writeFile } from 'node:fs/promises'
import os from 'node:os'
import path from 'node:path'
import pMap from 'p-map'
import { createMeridianContract } from '../lib/ie-contract.js'
import { createContracts } from '../lib/contracts.js'
import { fetchMeasurements, preprocess } from '../lib/preprocess.js'
import { RoundData } from '../lib/round.js'
import * as SparkImpactEvaluator from '@filecoin-station/spark-impact-evaluator'
@@ -140,7 +140,7 @@ console.error('Wrote human-readable summary for %s to %s', minerId, MINER_SUMMAR
* @returns
*/
async function getRecentMeasurementsAddedEvents (contractAddress, blocksToQuery = Number.POSITIVE_INFINITY) {
const { ieContract } = await createMeridianContract(contractAddress)
const { ieContract } = createContracts(contractAddress)

// max look-back period allowed by Glif.io is 2000 blocks (approx 16h40m)
// in practice, requests for the last 2000 blocks are usually rejected,
@@ -214,7 +214,8 @@ async function processRound (roundIndex, measurementCids, resultCounts) {
recordTelemetry,
logger: { log: debug, error: debug },
ieContract,
setScores: async () => {}
setScores: async () => {},
prepareProviderRetrievalResultStats: async () => {}
})

for (const m of round.measurements) {
50 changes: 38 additions & 12 deletions bin/spark-evaluate.js
@@ -9,12 +9,17 @@ import { recordTelemetry } from '../lib/telemetry.js'
import { fetchMeasurements } from '../lib/preprocess.js'
import { migrateWithPgConfig } from '../lib/migrate.js'
import pg from 'pg'
import { createMeridianContract } from '../lib/ie-contract.js'
import { createContracts } from '../lib/contracts.js'
import { setScores } from '../lib/submit-scores.js'
import * as providerRetrievalResultStats from '../lib/provider-retrieval-result-stats.js'
import { createStorachaClient } from '../lib/storacha.js'

const {
SENTRY_ENVIRONMENT = 'development',
WALLET_SEED
WALLET_SEED,
STORACHA_SECRET_KEY,
STORACHA_PROOF,
GIT_COMMIT
} = process.env

Sentry.init({
@@ -25,10 +30,16 @@ Sentry.init({
})

assert(WALLET_SEED, 'WALLET_SEED required')
assert(STORACHA_SECRET_KEY, 'STORACHA_SECRET_KEY required')
assert(STORACHA_PROOF, 'STORACHA_PROOF required')

await migrateWithPgConfig({ connectionString: DATABASE_URL })

const { ieContract, provider } = await createMeridianContract()
const storachaClient = await createStorachaClient({
secretKey: STORACHA_SECRET_KEY,
proof: STORACHA_PROOF
})
const { ieContract, ieContractAddress, rsrContract, provider } = createContracts()

const signer = ethers.Wallet.fromPhrase(WALLET_SEED, provider)
const walletDelegatedAddress = newDelegatedEthAddress(/** @type {any} */(signer.address), CoinType.MAIN).toString()
@@ -41,12 +52,27 @@ const createPgClient = async () => {
return pgClient
}

await startEvaluate({
ieContract,
fetchMeasurements,
fetchRoundDetails,
recordTelemetry,
createPgClient,
logger: console,
setScores: (participants, values) => setScores(signer, participants, values)
})
await Promise.all([
startEvaluate({
ieContract,
fetchMeasurements,
fetchRoundDetails,
recordTelemetry,
createPgClient,
logger: console,
setScores: (participants, values) => setScores(signer, participants, values),
prepareProviderRetrievalResultStats: (round, committees) => providerRetrievalResultStats.prepare({
storachaClient,
createPgClient,
round,
committees,
sparkEvaluateVersion: GIT_COMMIT,
ieContractAddress
})
}),
providerRetrievalResultStats.runPublishLoop({
createPgClient,
storachaClient,
rsrContract: rsrContract.connect(signer)
})
])
6 changes: 4 additions & 2 deletions index.js
@@ -19,7 +19,8 @@ export const startEvaluate = async ({
recordTelemetry,
createPgClient,
logger,
setScores
setScores,
prepareProviderRetrievalResultStats
}) => {
assert(typeof createPgClient === 'function', 'createPgClient must be a function')

@@ -116,7 +117,8 @@ recordTelemetry,
recordTelemetry,
createPgClient,
logger,
setScores
setScores,
prepareProviderRetrievalResultStats
}).catch(err => {
console.error('CANNOT EVALUATE ROUND %s:', evaluatedRoundIndex, err)
Sentry.captureException(err, {
37 changes: 37 additions & 0 deletions lib/car.js
@@ -0,0 +1,37 @@
import { CarWriter } from '@ipld/car'
import * as dagJSON from '@ipld/dag-json'
import { sha256 } from 'multiformats/hashes/sha2'
import { CID } from 'multiformats'

/**
* @param {*} json
* @returns {Promise<{ cid: CID, car: Blob }>}
*/
export async function createDagJsonCar (json) {
const bytes = dagJSON.encode(json)
const hash = await sha256.digest(bytes)
const cid = CID.create(1, dagJSON.code, hash)
const car = await createCar({ cid, bytes }, cid)
return { cid, car }
}

/**
* @param {{ cid: CID, bytes: dagJSON.ByteView<any> }} block
* @param {CID} root
* @returns {Promise<Blob>}
*/
export async function createCar (block, root) {
const { writer, out } = CarWriter.create(root)
const [chunks] = await Promise.all([
(async () => {
const chunks = []
for await (const chunk of out) chunks.push(chunk)
return chunks
})(),
(async () => {
await writer.put(block)
await writer.close()
})()
])
return Object.assign(new Blob(chunks), { version: 1, roots: [root] })
}
21 changes: 18 additions & 3 deletions lib/ie-contract.js → lib/contracts.js
@@ -1,8 +1,17 @@
import { ethers } from 'ethers'
import { rpcUrls, GLIF_TOKEN } from './config.js'
import * as SparkImpactEvaluator from '@filecoin-station/spark-impact-evaluator'
import fs from 'node:fs/promises'
import { fileURLToPath } from 'node:url'

export const createMeridianContract = async (contractAddress = SparkImpactEvaluator.ADDRESS) => {
const rsrContractAbi = JSON.parse(
await fs.readFile(
fileURLToPath(new URL('./rsrContract.json', import.meta.url)),
'utf8'
)
).abi

export const createContracts = (ieContractAddress = SparkImpactEvaluator.ADDRESS) => {
const provider = new ethers.FallbackProvider(rpcUrls.map(url => {
const fetchRequest = new ethers.FetchRequest(url)
fetchRequest.setHeader('Authorization', `Bearer ${GLIF_TOKEN}`)
@@ -16,10 +25,16 @@ const createMeridianContract = async (contractAddress = SparkImpactEvalua
// provider.on('debug', d => console.log('[ethers:debug %s] %s %o', new Date().toISOString().split('T')[1], d.action, d.payload ?? d.result))

const ieContract = new ethers.Contract(
contractAddress,
ieContractAddress,
SparkImpactEvaluator.ABI,
provider
)

return { ieContract, provider }
const rsrContract = new ethers.Contract(
'0x620bfc5AdE7eeEE90034B05DC9Bb5b540336ff90',
rsrContractAbi,
provider
)

return { ieContract, ieContractAddress, rsrContract, provider }
}
13 changes: 12 additions & 1 deletion lib/evaluate.js
@@ -26,6 +26,7 @@ export const REQUIRED_COMMITTEE_SIZE = 30
* @param {import('./typings.js').RecordTelemetryFn} args.recordTelemetry
* @param {import('./typings.js').CreatePgClient} [args.createPgClient]
* @param {Pick<Console, 'log' | 'error'>} args.logger
* @param {(round: import('./round.js').RoundData, committees: Iterable<import('./committee.js').Committee>) => Promise<void>} args.prepareProviderRetrievalResultStats
*/
export const evaluate = async ({
round,
@@ -36,7 +37,8 @@ fetchRoundDetails,
fetchRoundDetails,
recordTelemetry,
createPgClient,
logger
logger,
prepareProviderRetrievalResultStats
}) => {
requiredCommitteeSize ??= REQUIRED_COMMITTEE_SIZE

Expand All @@ -47,6 +49,7 @@ export const evaluate = async ({
// Detect fraud

const sparkRoundDetails = await fetchRoundDetails(await ieContract.getAddress(), roundIndex, recordTelemetry)
round.details = sparkRoundDetails
// Omit the roundDetails object from the format string to get nicer formatting
debug('ROUND DETAILS for round=%s', roundIndex, sparkRoundDetails)

@@ -202,6 +205,14 @@
}
}

try {
await prepareProviderRetrievalResultStats(round, committees)
} catch (err) {
console.error('Cannot prepare provider retrieval result stats.', err)
ignoredErrors.push(err)
Sentry.captureException(err)
}

return { ignoredErrors }
}

1 change: 1 addition & 0 deletions lib/preprocess.js
@@ -116,6 +116,7 @@ export const preprocess = async ({
logger.log('Retrieval Success Rate: %s%s (%s of %s)', Math.round(100 * okCount / total), '%', okCount, total)

round.measurements.push(...validMeasurements)
round.measurementBatches.push(cid)

recordTelemetry('preprocess', point => {
point.intField('round_index', roundIndex)