diff --git a/packages/ipfs-unixfs-exporter/README.md b/packages/ipfs-unixfs-exporter/README.md index 7199bcb5..5ad2df74 100644 --- a/packages/ipfs-unixfs-exporter/README.md +++ b/packages/ipfs-unixfs-exporter/README.md @@ -21,13 +21,13 @@ - [Usage](#usage) - [Example](#example) - [API](#api) - - [`exporter(cid, ipld, options)`](#exportercid-ipld-options) + - [`exporter(cid, blockstore, options)`](#exportercid-blockstore-options) - [UnixFSEntry](#unixfsentry) - [Raw entries](#raw-entries) - [CBOR entries](#cbor-entries) - [`entry.content({ offset, length })`](#entrycontent-offset-length-) - - [`walkPath(cid, ipld)`](#walkpathcid-ipld) - - [`recursive(cid, ipld)`](#recursivecid-ipld) + - [`walkPath(cid, blockstore)`](#walkpathcid-blockstore) + - [`recursive(cid, blockstore)`](#recursivecid-blockstore) - [Contribute](#contribute) - [License](#license) @@ -43,21 +43,24 @@ ```js // import a file and export it again -const { importer } = require('ipfs-unixfs-importer') -const { exporter } = require('ipfs-unixfs-exporter') +import { importer } from 'ipfs-unixfs-importer' +import { exporter } from 'ipfs-unixfs-exporter' +import { MemoryBlockstore } from 'blockstore-core/memory' +// Should contain the blocks we are trying to export +const blockstore = new MemoryBlockstore() const files = [] for await (const file of importer([{ path: '/foo/bar.txt', content: new Uint8Array([0, 1, 2, 3]) -}], ipld)) { +}], blockstore)) { files.push(file) } console.info(files[0].cid) // Qmbaz -const entry = await exporter(files[0].cid, ipld) +const entry = await exporter(files[0].cid, blockstore) console.info(entry.cid) // Qmqux console.info(entry.path) // Qmbaz/foo/bar.txt @@ -80,12 +83,12 @@ console.info(bytes) // 0, 1, 2, 3 #### API ```js -const { exporter } = require('ipfs-unixfs-exporter') +import { exporter } from 'ipfs-unixfs-exporter' ``` -### `exporter(cid, ipld, options)` +### `exporter(cid, blockstore, options)` -Uses the given [ipld](https://github.com/ipld/js-ipld) instance to 
fetch an IPFS node by it's CID. +Uses the given [blockstore][] instance to fetch an IPFS node by its CID. Returns a Promise which resolves to a `UnixFSEntry`. @@ -202,32 +205,32 @@ for await (const entry of dir.content({ // `entries` contains the first 5 files/directories in the directory ``` -### `walkPath(cid, ipld)` +### `walkPath(cid, blockstore)` `walkPath` will return an async iterator that yields entries for all segments in a path: ```javascript -const { walkPath } = require('ipfs-unixfs-exporter') +import { walkPath } from 'ipfs-unixfs-exporter' const entries = [] -for await (const entry of walkPath('Qmfoo/foo/bar/baz.txt', ipld)) { +for await (const entry of walkPath('Qmfoo/foo/bar/baz.txt', blockstore)) { entries.push(entry) } // entries contains 4x `entry` objects ``` -### `recursive(cid, ipld)` +### `recursive(cid, blockstore)` `recursive` will return an async iterator that yields all entries beneath a given CID or IPFS path, as well as the containing directory. ```javascript -const { recursive } = require('ipfs-unixfs-exporter') +import { recursive } from 'ipfs-unixfs-exporter' const entries = [] -for await (const child of recursive('Qmfoo/foo/bar', ipld)) { +for await (const child of recursive('Qmfoo/foo/bar', blockstore)) { entries.push(child) } @@ -235,9 +238,8 @@ for await (const child of recursive('Qmfoo/foo/bar', ipld)) { ``` [dag API]: https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/DAG.md -[ipld-resolver instance]: https://github.com/ipld/js-ipld-resolver +[blockstore]: https://github.com/ipfs/js-ipfs-interfaces/tree/master/packages/interface-blockstore#readme [UnixFS]: https://github.com/ipfs/specs/tree/master/unixfs -[pull-stream]: https://www.npmjs.com/package/pull-stream ## Contribute diff --git a/packages/ipfs-unixfs-importer/README.md b/packages/ipfs-unixfs-importer/README.md index 525cd8ab..0697cf25 100644 --- a/packages/ipfs-unixfs-importer/README.md +++ b/packages/ipfs-unixfs-importer/README.md @@ -18,7 +18,7 @@ -
[Usage](#usage) - [Example](#example) - [API](#api) - - [const stream = importer(source, ipld [, options])](#const-stream--importersource-ipld--options) + - [const stream = importer(source, blockstore [, options])](#const-stream--importersource-blockstore--options) - [Overriding internals](#overriding-internals) - [Contribute](#contribute) - [License](#license) @@ -45,7 +45,11 @@ Let's create a little directory to import: And write the importing logic: ```js -const { importer } = require('ipfs-unixfs-importer') +import { importer } from 'ipfs-unixfs-importer' +import { MemoryBlockstore } from 'blockstore-core/memory' + +// Where the blocks will be stored +const blockstore = new MemoryBlockstore() // Import path /tmp/foo/bar const source = [{ @@ -56,9 +60,7 @@ const source = [{ content: fs.createReadStream(file2) }] -// You need to create and pass an ipld-resolve instance -// https://github.com/ipld/js-ipld-resolver -for await (const entry of importer(source, ipld, options)) { +for await (const entry of importer(source, blockstore, options)) { console.info(entry) } ``` @@ -91,10 +93,10 @@ When run, metadata about DAGNodes in the created tree is printed until the root: #### API ```js -const { importer } = require('ipfs-unixfs-importer') +import { importer } from 'ipfs-unixfs-importer' ``` -#### const stream = importer(source, ipld [, options]) +#### const stream = importer(source, blockstore [, options]) The `importer` function returns an async iterator that takes a source async iterator that yields objects of the form: @@ -109,9 +111,9 @@ The `importer` function returns an async iterator that takes a source async iterator `stream` will output file info objects as files get stored in IPFS. When stats on a node are emitted they are guaranteed to have been written. 
-`ipld` is an instance of the [`IPLD Resolver`](https://github.com/ipld/js-ipld-resolver) +`blockstore` is an instance of a [blockstore][]. -The input's file paths and directory structure will be preserved in the [`dag-pb`](https://github.com/ipld/js-ipld-dag-pb) created nodes. +The input's file paths and directory structure will be preserved in the [`dag-pb`](https://github.com/ipld/js-dag-pb) created nodes. `options` is a JavaScript object that might include the following keys: @@ -150,20 +152,20 @@ Several aspects of the importer are overridable by specifying functions as part - It should yield `Buffer` objects constructed from the `source` or throw an `Error` - `chunker` (function): Optional function that supports the signature `async function * (source, options)` where `source` is an async generator and `options` is an options object - It should yield `Buffer` objects. -- `bufferImporter` (function): Optional function that supports the signature `async function * (entry, ipld, options)` - - This function should read `Buffer`s from `source` and persist them using `ipld.put` or similar +- `bufferImporter` (function): Optional function that supports the signature `async function * (entry, blockstore, options)` + - This function should read `Buffer`s from `source` and persist them using `blockstore.put` or similar - `entry` is the `{ path, content }` entry, where `entry.content` is an async generator that yields Buffers - It should yield functions that return a Promise that resolves to an object with the properties `{ cid, unixfs, size }` where `cid` is a [CID], `unixfs` is a [UnixFS] entry and `size` is a `Number` that represents the serialized size of the [IPLD] node that holds the buffer data. 
- Values will be pulled from this generator in parallel - the amount of parallelisation is controlled by the `blockWriteConcurrency` option (default: 10) -- `dagBuilder` (function): Optional function that supports the signature `async function * (source, ipld, options)` +- `dagBuilder` (function): Optional function that supports the signature `async function * (source, blockstore, options)` - This function should read `{ path, content }` entries from `source` and turn them into DAGs - It should yield a `function` that returns a `Promise` that resolves to `{ cid, path, unixfs, node }` where `cid` is a `CID`, `path` is a string, `unixfs` is a UnixFS entry and `node` is a `DAGNode`. - Values will be pulled from this generator in parallel - the amount of parallelisation is controlled by the `fileImportConcurrency` option (default: 50) -- `treeBuilder` (function): Optional function that supports the signature `async function * (source, ipld, options)` +- `treeBuilder` (function): Optional function that supports the signature `async function * (source, blockstore, options)` - This function should read `{ cid, path, unixfs, node }` entries from `source` and place them in a directory structure - It should yield an object with the properties `{ cid, path, unixfs, size }` where `cid` is a `CID`, `path` is a string, `unixfs` is a UnixFS entry and `size` is a `Number`. 
-[ipld-resolver instance]: https://github.com/ipld/js-ipld-resolver +[blockstore]: https://github.com/ipfs/js-ipfs-interfaces/tree/master/packages/interface-blockstore#readme [UnixFS]: https://github.com/ipfs/specs/tree/master/unixfs [IPLD]: https://github.com/ipld/js-ipld [CID]: https://github.com/multiformats/js-cid diff --git a/packages/ipfs-unixfs/README.md b/packages/ipfs-unixfs/README.md index d7ca41e3..2b2b759f 100644 --- a/packages/ipfs-unixfs/README.md +++ b/packages/ipfs-unixfs/README.md @@ -47,7 +47,7 @@ The UnixFS spec can be found inside the [ipfs/specs repository](http://github.co ### Use in Node.js ```JavaScript -var { UnixFS } = require('ipfs-unixfs') +import { UnixFS } from 'ipfs-unixfs' ``` ### Use in a browser with browserify, webpack or any other bundler @@ -55,7 +55,7 @@ var { UnixFS } = require('ipfs-unixfs') The code published to npm that gets loaded on require is in fact an ES5 transpiled version with the right shims added. This means that you can require it and use it with your favourite bundler without having to adjust your asset management process. ```JavaScript -var { UnixFS } = require('ipfs-unixfs') +import { UnixFS } from 'ipfs-unixfs' ``` ### Use in a browser Using a script tag