From 2a5cc5ed8c3ee568817c2f7d61998f9c2832d3f4 Mon Sep 17 00:00:00 2001
From: David Dias
Date: Tue, 19 Jun 2018 21:53:42 +0100
Subject: [PATCH] feat: pin API (#1045)

* revert: default assets are not added when running on a browser

  refactor: change pin.flush logging message

* feat(test): add tests for failure cases of normalizeHashes

  fix: don't need to cast the object.get result with toJSON

  revert: use interface-datastore.Key for datastore pin storage. The
  proper change would be that the datastore level automatically casts
  operations into Keys.

  fix: do not invoke callback within a try/catch

  feat(test): make cli pin tests more robust, by using files that aren't
  added on IPFS initialization. Still needs work on files.rm (direct) and
  ipfs ls (indirect).

  fix: remove commented code, traced test failures to pin-set. Got to go
  for the night, though, so will checkpoint here and address tomorrow.

  feat: parseIpfsPath now throws errors for consistency

  feat: resolveIpfsPaths error message lists the relative path that failed

  feat: use follow.bind instead of mutating the links. Also decided not to
  show relative paths. Less human friendly, but probably cleaner otherwise.

  refactor: resolveIpfsPaths -> resolvePaths

  feat: promisify resolvePaths

  test: change parseIpfsPath failure tests to use try/catch

  docs: edit resolvePath doc

  revert: accidentally deleted commands/pin.js

* feat: jsipfs pin improvements (#1249)

* initial sweep through to understand how pin works. Did make some
  changes, but mostly minor.

* refactor: move the pb schema to its own file

* fix: don't pin files during files.add if opts.pin === false

* feat: add some http qs parsing, http route/resources cleanup, clean up
  core/utils.parseIpfsPath

* feat: expand pin tests. First draft; still needs some further work.

* feat: add logging for entry/exit of pins: add/rm/flush/load. Clean up
  some documentation.

* feat: add --pin to files.add; fix: improper pin option parsing in core.
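The `--pin` option added to `files.add` above interacts with `--only-hash` and custom hash algorithms. A minimal sketch of that decision, mirroring the `pinFile` check this patch adds to `src/core/components/files.js` (the helper name here is mine): a file is pinned only when it is the root entry of the add, pinning was not disabled, and neither `onlyHash` nor a custom `hashAlg` is in play.

```javascript
// Sketch of the pin-on-add decision from this patch's files.add changes.
// Only the root entry of an add (a path with no '/') is pinned; --pin
// defaults to true; --only-hash and custom hash algorithms suppress pinning.
function shouldPinOnAdd (opts, file) {
  const pin = 'pin' in opts ? opts.pin : true
  const isRootEntry = !file.path.includes('/')
  return Boolean(pin && isRootEntry && !opts.onlyHash && !opts.hashAlg)
}
```

This is why the later "onlyHash and pin interact" fix exists: hashing-only adds never write blocks, so there is nothing to pin.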
* feat: use ipfs.files.add to add init-docs instead of directly using the
  unix-fs importer.

* feat(tests): add tests for the cli --pin option. I know this should be
  more of an integration test and should be written in /core. Maybe talk
  with Victor about testing different layers.

* feat: use isIPFS to validate a multihash.

* fix: add some changes missed during rebase, syntax fixes, etc. I think
  my original rebase for this branch 2 weeks ago might have changed
  history for the intervening commits, indirectly causing some of these
  missed changes, or I just rebased onto the wrong old parent.

  fix: some onlyHash and pin tests broke after merging; onlyHash and pin
  interact: shouldn't pin when --only-hash.

  fix: trim output for the 'pin ls when no hash is passed' test

  test: indirect pins supersede direct pins: turns out we had a bug

  feat: add expectTimeout test utility

  feat: promisify some additional pin utils

* test: initial work testing the core/pin.js implementation. I think I'll
  end up moving most tests here.

  test: add tests for pin.ls and pin.rm. Based tests on other pin
  fixtures; need to migrate the isPinned* tests to them as well.
  fix: direct pins are now deleted by a default pin.rm(hash)

  test: prepare for pin.add tests; the 'indirect supersedes direct' test
  exposes a bug in pin.ls

  feat: switch away from multihashes for isPinned* tests

  test: implement pin.add tests

  fix: add fixture files only once

  test: add test for a potential bug, clean isPinned* tests

  refactor: remove a test that's no longer needed

  fix: pin.ls, indirect pins should supersede direct pins

  test: naive pin.load, pin.flush tests

  feat: remove most pin cli tests, as the functionality is tested in the
  pin core tests

  refactor: rename solarSystem

* refactor: move pin http-api tests to http-api/inject

  fix: attempt to find a way to use the http-api/inject test structure for
  pin tests

  test: fix pin.rm http-api tests

  test: fix pin.add http-api tests

  docs: docs and cleanup of http-api pin tests

  refactor: renaming

  fix: lint errors

  fix: resolvePaths tests are failing on CI; it might be long ops, testing
  a timeout bump

  fix: add files explicitly before testing resolvePaths

  fix: remove mocha.only from resolvePaths. Let's hope tests pass; they
  are passing CI now.

  fix: rename test/core/utils.spec.js -> utils.js so it's not run during
  browser tests

* test: first draft of pin-set tests. Need to leave computer, this is a
  checkpoint.

  test: add sanity tests for walkItems and hasChild, clean others. These
  tests are more descriptive than really pushing the impl. I'd love
  others' thoughts on what else should be hit and how. I also need to
  compare go's pinset impl against ours.

  fix: stop daemons

  feat: documentation and multihash buffer handling for dag.get

  fix: lint

* feat: simplify root dagnode generation for the storeItems base case

* feat: rename vars, fix _depth default value, add docs

  fix: pinset.hasChild buffer check

  feat: hardcode expected length for flush/load tests

* feat: parallelize pin.isPinnedWithType

* refactor: refactor pinset.storeItems

* fix: re-add pin interface tests. I must have missed a commit during a
  rebase.
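The "direct pins are now deleted by a default pin.rm(hash)" fix hinges on which pin types `rm` may remove. A minimal sketch of the eligibility check, matching the `switch` this patch adds to `pin.rm` in `src/core/components/pin.js` (the helper name is mine): a recursive pin is removable only with `recursive: true` (the default), a direct pin is always removable, and an indirect pin can never be removed directly.

```javascript
// Sketch of pin.rm's eligibility rule. `reason` is the pin type reported
// by _isPinnedWithType; `recursive` is the rm option (defaults to true).
function canUnpin (reason, recursive) {
  if (reason === 'recursive') return recursive
  if (reason === 'direct') return true
  return false // indirect: pinned under some recursive root, not removable here
}
```

In the patch, failing cases surface as errors such as `${key} is pinned recursively` or `${key} is pinned indirectly under ${reason}`.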
* fix: lint

* feat: docs, rename resolvePaths, pin.getIndirectKeys now uses eachLimit

* chore: rebase a month of changes, resolve minor issues from that

  fix: yargs argument naming for pin cli commands

  fix: convert file multihashes to a b58 string

  fix: another way of checking for CID-ness

  fix: lint

  fix: toB58String handles non-buffers

  fix: key-exchange core tests now shut down the daemon

* chore: update big.js version

* revert: do not pin content added with a non-default hash algorithm

* revert: internalKey recording

* refactor: use lodash.flattenDeep

  refactor: pinset.hasChild -> pinset.hasDescendant

  fix: invoke someCb if we've seen the hash before

  refactor: async patterns in dag._getRecursive

  refactor: pinset.hasDescendant

  refactor: pinset.storeItems async patterns

  refactor: pinset.loadSet and pinset.walkItems async patterns

  docs: add link to go-ipfs' fanout bin implementation

  refactor: async patterns of pin.load/flush

  refactor: lint

  refactor: privatize internal pin key storage

  refactor: change encapsulation of ipfs.pin, fix resulting issues

  fix: lint

  fix: 'files add --pin=false' test was giving a false positive

  refactor: use is-ipfs to check CID-ability of a string

  refactor: remove the last instance of 'once' in the pin code

* refactor: do not expose pinTypes. They're simple enough, documented
  elsewhere, and not used by any exposed functionality.
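The "fanout bin" scheme referenced above (and implemented in this patch's `src/core/components/pin-set.js`, following go-ipfs) distributes pins among 256 bins once a set exceeds `maxItems`, hashing each pin key with the current tree depth as a seed. A sketch of the distribution step, with a trivial stand-in hash (the real code uses fnv1a over a 4-byte little-endian seed concatenated with the base58 key):

```javascript
const defaultFanout = 256 // same constant as the patch's pin-set.js

// Stand-in seeded string hash; NOT the fnv1a hash the patch actually uses.
function seededHash (seed, key) {
  let h = seed >>> 0
  for (const ch of key) h = ((h * 31) + ch.charCodeAt(0)) >>> 0
  return h
}

// Distribute pins among defaultFanout bins, keyed by hash(depth, key),
// mirroring the reduce in storeItems' "> maxItems" branch.
function distribute (depth, pins) {
  return pins.reduce((bins, pin) => {
    const n = seededHash(depth, pin.key) % defaultFanout
    bins[n] = n in bins ? bins[n].concat([pin]) : [pin]
    return bins
  }, {})
}
```

Each bin is then stored recursively as its own pin-set node at `depth + 1`, which is why the seed must vary with depth: otherwise every level would re-split pins into the same bins.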
* fix: do not destructure node callback results
---
 package.json                          |   7 +-
 src/cli/commands/files/add.js         |   8 +-
 src/cli/commands/pin.js               |  15 +
 src/cli/commands/pin/add.js           |  29 ++
 src/cli/commands/pin/ls.js            |  43 +++
 src/cli/commands/pin/rm.js            |  28 ++
 src/core/boot.js                      |   1 +
 src/core/components/dag.js            |  25 ++
 src/core/components/files.js          |  23 +-
 src/core/components/index.js          |   1 +
 src/core/components/init-assets.js    |  23 +-
 src/core/components/init.js           |   4 +-
 src/core/components/pin-set.js        | 228 +++++++++++++++
 src/core/components/pin.js            | 393 ++++++++++++++++++++++++++
 src/core/components/pin.proto.js      |  19 ++
 src/core/index.js                     |   1 +
 src/core/utils.js                     | 108 +++++++
 src/http/api/resources/files.js       |   4 +-
 src/http/api/resources/index.js       |   1 +
 src/http/api/resources/pin.js         | 100 +++++++
 src/http/api/routes/index.js          |   1 +
 src/http/api/routes/pin.js            |  40 +++
 test/cli/commands.js                  |   3 +-
 test/cli/files.js                     |  32 ++-
 test/cli/pin.js                       |  78 +++++
 test/core/interface/pin.js            |  35 +++
 test/core/key-exchange.js             |   2 +-
 test/core/pin-set.js                  | 192 +++++++++++++
 test/core/pin.js                      | 328 +++++++++++++++++++++
 test/core/utils.js                    | 160 +++++++++++
 test/fixtures/planets/mercury/wiki.md |  12 +
 test/fixtures/planets/solar-system.md |  10 +
 test/http-api/inject/pin.js           | 174 ++++++++++++
 test/utils/expect-timeout.js          |  16 ++
 34 files changed, 2117 insertions(+), 27 deletions(-)
 create mode 100644 src/cli/commands/pin.js
 create mode 100644 src/cli/commands/pin/add.js
 create mode 100644 src/cli/commands/pin/ls.js
 create mode 100644 src/cli/commands/pin/rm.js
 create mode 100644 src/core/components/pin-set.js
 create mode 100644 src/core/components/pin.js
 create mode 100644 src/core/components/pin.proto.js
 create mode 100644 src/http/api/resources/pin.js
 create mode 100644 src/http/api/routes/pin.js
 create mode 100644 test/cli/pin.js
 create mode 100644 test/core/interface/pin.js
 create mode 100644 test/core/pin-set.js
 create mode 100644 test/core/pin.js
 create mode 100644 test/core/utils.js
 create mode 100644
test/fixtures/planets/mercury/wiki.md create mode 100644 test/fixtures/planets/solar-system.md create mode 100644 test/http-api/inject/pin.js create mode 100644 test/utils/expect-timeout.js diff --git a/package.json b/package.json index dba09a090d..737988ff32 100644 --- a/package.json +++ b/package.json @@ -88,7 +88,7 @@ }, "dependencies": { "async": "^2.6.0", - "big.js": "^5.0.3", + "big.js": "^5.1.2", "binary-querystring": "~0.1.2", "bl": "^1.2.2", "boom": "^7.2.0", @@ -98,6 +98,7 @@ "debug": "^3.1.0", "file-type": "^7.7.1", "filesize": "^3.6.1", + "fnv1a": "^1.0.1", "fsm-event": "^2.1.0", "get-folder-size": "^1.0.1", "glob": "^7.1.2", @@ -105,6 +106,7 @@ "hapi-set-header": "^1.0.2", "hoek": "^5.0.3", "human-to-milliseconds": "^1.0.0", + "interface-datastore": "^0.4.1", "ipfs-api": "^22.0.0", "ipfs-bitswap": "~0.20.0", "ipfs-block": "~0.7.1", @@ -137,6 +139,7 @@ "libp2p-websocket-star": "~0.8.0", "libp2p-websockets": "~0.12.0", "lodash.flatmap": "^4.5.0", + "lodash.flattendeep": "^4.4.0", "lodash.get": "^4.4.2", "lodash.set": "^4.3.2", "lodash.sortby": "^4.7.0", @@ -218,7 +221,7 @@ "Jade Meskill ", "Johannes Wikner ", "Jon Schlinkert ", - "Jonathan ", + "Jonathan Krone ", "João Antunes ", "João Santos ", "Kevin Wang ", diff --git a/src/cli/commands/files/add.js b/src/cli/commands/files/add.js index 1abedb5122..d4a9ab764a 100644 --- a/src/cli/commands/files/add.js +++ b/src/cli/commands/files/add.js @@ -173,6 +173,11 @@ module.exports = { type: 'boolean', default: false, describe: 'Write no output' + }, + pin: { + type: 'boolean', + default: true, + describe: 'Pin this object when adding' } }, @@ -188,7 +193,8 @@ module.exports = { rawLeaves: argv.rawLeaves, onlyHash: argv.onlyHash, hashAlg: argv.hash, - wrapWithDirectory: argv.wrapWithDirectory + wrapWithDirectory: argv.wrapWithDirectory, + pin: argv.pin } // Temporary restriction on raw-leaves: diff --git a/src/cli/commands/pin.js b/src/cli/commands/pin.js new file mode 100644 index 0000000000..d7a68fb023 --- 
/dev/null +++ b/src/cli/commands/pin.js @@ -0,0 +1,15 @@ +'use strict' + +module.exports = { + command: 'pin', + + description: 'Pin and unpin objects to local storage.', + + builder (yargs) { + return yargs + .commandDir('pin') + }, + + handler (argv) { + } +} diff --git a/src/cli/commands/pin/add.js b/src/cli/commands/pin/add.js new file mode 100644 index 0000000000..bfb0a0b9a3 --- /dev/null +++ b/src/cli/commands/pin/add.js @@ -0,0 +1,29 @@ +'use strict' + +const print = require('../../utils').print + +module.exports = { + command: 'add ', + + describe: 'Pins object to local storage.', + + builder: { + recursive: { + type: 'boolean', + alias: 'r', + default: true, + describe: 'Recursively pin the object linked to by the specified object(s).' + } + }, + + handler (argv) { + const recursive = argv.recursive + const type = recursive ? 'recursive' : 'direct' + argv.ipfs.pin.add(argv.ipfsPath, { recursive: recursive }, (err, results) => { + if (err) { throw err } + results.forEach((res) => { + print(`pinned ${res.hash} ${type}ly`) + }) + }) + } +} diff --git a/src/cli/commands/pin/ls.js b/src/cli/commands/pin/ls.js new file mode 100644 index 0000000000..a59958942d --- /dev/null +++ b/src/cli/commands/pin/ls.js @@ -0,0 +1,43 @@ +'use strict' + +const print = require('../../utils').print + +module.exports = { + // bracket syntax with '...' tells yargs to optionally accept a list + command: 'ls [ipfsPath...]', + + describe: 'List objects pinned to local storage.', + + builder: { + type: { + type: 'string', + alias: 't', + default: 'all', + choices: ['direct', 'indirect', 'recursive', 'all'], + describe: 'The type of pinned keys to list.' + }, + quiet: { + type: 'boolean', + alias: 'q', + default: false, + describe: 'Write just hashes of objects.' 
+ } + }, + + handler: (argv) => { + const paths = argv.ipfsPath + const type = argv.type + const quiet = argv.quiet + + argv.ipfs.pin.ls(paths, { type }, (err, results) => { + if (err) { throw err } + results.forEach((res) => { + let line = res.hash + if (!quiet) { + line += ` ${res.type}` + } + print(line) + }) + }) + } +} diff --git a/src/cli/commands/pin/rm.js b/src/cli/commands/pin/rm.js new file mode 100644 index 0000000000..acbadf8a96 --- /dev/null +++ b/src/cli/commands/pin/rm.js @@ -0,0 +1,28 @@ +'use strict' + +const print = require('../../utils').print + +module.exports = { + command: 'rm ', + + describe: 'Removes the pinned object from local storage.', + + builder: { + recursive: { + type: 'boolean', + alias: 'r', + default: true, + describe: 'Recursively unpin the objects linked to by the specified object(s).' + } + }, + + handler: (argv) => { + const recursive = argv.recursive + argv.ipfs.pin.rm(argv.ipfsPath, { recursive: recursive }, (err, results) => { + if (err) { throw err } + results.forEach((res) => { + print(`unpinned ${res.hash}`) + }) + }) + } +} diff --git a/src/core/boot.js b/src/core/boot.js index 113c9919bf..a8cb79d179 100644 --- a/src/core/boot.js +++ b/src/core/boot.js @@ -29,6 +29,7 @@ module.exports = (self) => { series([ (cb) => self._repo.open(cb), + (cb) => self.pin._load(cb), (cb) => self.preStart(cb), (cb) => { self.log('initialized') diff --git a/src/core/components/dag.js b/src/core/components/dag.js index b5d77e0c6c..e43ea241f8 100644 --- a/src/core/components/dag.js +++ b/src/core/components/dag.js @@ -3,6 +3,8 @@ const promisify = require('promisify-es6') const CID = require('cids') const pull = require('pull-stream') +const mapAsync = require('async/map') +const flattenDeep = require('lodash.flattendeep') module.exports = function dag (self) { return { @@ -33,6 +35,12 @@ module.exports = function dag (self) { } else { path = '/' } + } else if (Buffer.isBuffer(cid)) { + try { + cid = new CID(cid) + } catch (err) { + return 
callback(err) + } } self._ipld.get(cid, path, options, callback) @@ -73,6 +81,23 @@ module.exports = function dag (self) { self._ipld.treeStream(cid, path, options), pull.collect(callback) ) + }), + + // TODO - use IPLD selectors once they are implemented + _getRecursive: promisify((multihash, callback) => { + // gets flat array of all DAGNodes in tree given by multihash + + self.dag.get(new CID(multihash), (err, res) => { + if (err) { return callback(err) } + + mapAsync(res.value.links, (link, cb) => { + self.dag._getRecursive(link.multihash, cb) + }, (err, nodes) => { + // console.log('nodes:', nodes) + if (err) return callback(err) + callback(null, flattenDeep([res.value, nodes])) + }) + }) }) } } diff --git a/src/core/components/files.js b/src/core/components/files.js index e223e26d13..f69868bd77 100644 --- a/src/core/components/files.js +++ b/src/core/components/files.js @@ -32,7 +32,9 @@ function prepareFile (self, opts, file, callback) { } waterfall([ - (cb) => opts.onlyHash ? cb(null, file) : self.object.get(file.multihash, opts, cb), + (cb) => opts.onlyHash + ? cb(null, file) + : self.object.get(file.multihash, opts, cb), (node, cb) => { const b58Hash = cid.toBaseEncodedString() @@ -87,6 +89,19 @@ function normalizeContent (opts, content) { }) } +function pinFile (self, opts, file, cb) { + // Pin a file if it is the root dir of a recursive add or the single file + // of a direct add. + const pin = 'pin' in opts ? 
opts.pin : true + const isRootDir = !file.path.includes('/') + const shouldPin = pin && isRootDir && !opts.onlyHash && !opts.hashAlg + if (shouldPin) { + return self.pin.add(file.hash, err => cb(err, file)) + } else { + cb(null, file) + } +} + class AddHelper extends Duplex { constructor (pullStream, push, options) { super(Object.assign({ objectMode: true }, options)) @@ -130,7 +145,8 @@ module.exports = function files (self) { } let total = 0 - let prog = opts.progress || (() => {}) + + const prog = opts.progress || noop const progress = (bytes) => { total += bytes prog(total) @@ -141,7 +157,8 @@ module.exports = function files (self) { pull.map(normalizeContent.bind(null, opts)), pull.flatten(), importer(self._ipld, opts), - pull.asyncMap(prepareFile.bind(null, self, opts)) + pull.asyncMap(prepareFile.bind(null, self, opts)), + pull.asyncMap(pinFile.bind(null, self, opts)) ) } diff --git a/src/core/components/index.js b/src/core/components/index.js index ce95b27a53..7221575007 100644 --- a/src/core/components/index.js +++ b/src/core/components/index.js @@ -18,6 +18,7 @@ exports.swarm = require('./swarm') exports.ping = require('./ping') exports.pingPullStream = require('./ping-pull-stream') exports.pingReadableStream = require('./ping-readable-stream') +exports.pin = require('./pin') exports.files = require('./files') exports.bitswap = require('./bitswap') exports.pubsub = require('./pubsub') diff --git a/src/core/components/init-assets.js b/src/core/components/init-assets.js index 00c37120a8..096a3d20c4 100644 --- a/src/core/components/init-assets.js +++ b/src/core/components/init-assets.js @@ -1,9 +1,7 @@ 'use strict' const path = require('path') -const fs = require('fs') const glob = require('glob') -const importer = require('ipfs-unixfs-engine').importer const pull = require('pull-stream') const file = require('pull-file') const CID = require('cids') @@ -15,23 +13,20 @@ module.exports = function addDefaultAssets (self, log, callback) { pull( 
pull.values([initDocsPath]), - pull.asyncMap((val, cb) => glob(path.join(val, '/**/*'), cb)), + pull.asyncMap((val, cb) => + glob(path.join(val, '/**/*'), { nodir: true }, cb) + ), pull.flatten(), - pull.map((element) => { + pull.map(element => { const addPath = element.substring(index + 1) - - if (fs.statSync(element).isDirectory()) { return } - return { path: addPath, content: file(element) } }), - // Filter out directories, which are undefined from above - pull.filter(Boolean), - importer(self._ipld), - pull.through((el) => { - if (el.path === 'init-docs') { - const cid = new CID(el.multihash) + self.files.addPullStream(), + pull.through(file => { + if (file.path === 'init-docs') { + const cid = new CID(file.hash) log('to get started, enter:\n') - log(`\t jsipfs files cat /ipfs/${cid.toBaseEncodedString()}/readme\n`) + log(`\tjsipfs files cat /ipfs/${cid.toBaseEncodedString()}/readme\n`) } }), pull.collect((err) => { diff --git a/src/core/components/init.js b/src/core/components/init.js index 59555f383a..5b5f77d8e7 100644 --- a/src/core/components/init.js +++ b/src/core/components/init.js @@ -85,16 +85,18 @@ module.exports = function init (self) { return cb(null, true) } - self.log('adding assets') const tasks = [ // add empty unixfs dir object (go-ipfs assumes this exists) (cb) => self.object.new('unixfs-dir', cb) ] if (typeof addDefaultAssets === 'function') { + // addDefaultAssets is undefined on browsers. 
+ // See package.json browser config tasks.push((cb) => addDefaultAssets(self, opts.log, cb)) } + self.log('adding assets') parallel(tasks, (err) => { if (err) { cb(err) diff --git a/src/core/components/pin-set.js b/src/core/components/pin-set.js new file mode 100644 index 0000000000..f18a248604 --- /dev/null +++ b/src/core/components/pin-set.js @@ -0,0 +1,228 @@ +'use strict' + +const multihashes = require('multihashes') +const CID = require('cids') +const protobuf = require('protons') +const fnv1a = require('fnv1a') +const varint = require('varint') +const { DAGNode, DAGLink } = require('ipld-dag-pb') +const async = require('async') + +const pbSchema = require('./pin.proto') + +const emptyKeyHash = 'QmdfTbBqBPQ7VNxZEYEj14VmRuZBkqFbiwReogJgS1zR1n' +const emptyKey = multihashes.fromB58String(emptyKeyHash) +const defaultFanout = 256 +const maxItems = 8192 +const pb = protobuf(pbSchema) + +function toB58String (hash) { + return new CID(hash).toBaseEncodedString() +} + +function readHeader (rootNode) { + // rootNode.data should be a buffer of the format: + // < varint(headerLength) | header | itemData... 
> + const rootData = rootNode.data + const hdrLength = varint.decode(rootData) + const vBytes = varint.decode.bytes + if (vBytes <= 0) { + throw new Error('Invalid Set header length') + } + if (vBytes + hdrLength > rootData.length) { + throw new Error('Impossibly large set header length') + } + const hdrSlice = rootData.slice(vBytes, hdrLength + vBytes) + const header = pb.Set.decode(hdrSlice) + if (header.version !== 1) { + throw new Error(`Unsupported Set version: ${header.version}`) + } + if (header.fanout > rootNode.links.length) { + throw new Error('Impossibly large fanout') + } + return { + header: header, + data: rootData.slice(hdrLength + vBytes) + } +} + +function hash (seed, key) { + const buf = Buffer.alloc(4) + buf.writeUInt32LE(seed, 0) + const data = Buffer.concat([ + buf, Buffer.from(toB58String(key)) + ]) + return fnv1a(data.toString('binary')) +} + +exports = module.exports = function (dag) { + const pinSet = { + // should this be part of `object` API? + hasDescendant: (root, childhash, callback) => { + const seen = {} + if (CID.isCID(childhash) || Buffer.isBuffer(childhash)) { + childhash = toB58String(childhash) + } + + return searchChildren(root, callback) + + function searchChildren (root, cb) { + async.some(root.links, ({ multihash }, someCb) => { + const bs58Link = toB58String(multihash) + if (bs58Link === childhash) { return someCb(null, true) } + if (bs58Link in seen) { return someCb(null, false) } + + seen[bs58Link] = true + + dag.get(multihash, (err, res) => { + if (err) { return someCb(err) } + searchChildren(res.value, someCb) + }) + }, cb) + } + }, + + storeSet: (keys, callback) => { + const pins = keys.map(key => ({ + key: key, + data: null + })) + + pinSet.storeItems(pins, (err, rootNode) => { + if (err) { return callback(err) } + const opts = { cid: new CID(rootNode.multihash) } + dag.put(rootNode, opts, (err, cid) => { + if (err) { return callback(err) } + callback(null, rootNode) + }) + }) + }, + + storeItems: (items, callback) => 
{ + return storePins(items, 0, callback) + + function storePins (pins, depth, storePinsCb) { + const pbHeader = pb.Set.encode({ + version: 1, + fanout: defaultFanout, + seed: depth + }) + const headerBuf = Buffer.concat([ + Buffer.from(varint.encode(pbHeader.length)), pbHeader + ]) + const fanoutLinks = [] + for (let i = 0; i < defaultFanout; i++) { + fanoutLinks.push(new DAGLink('', 1, emptyKey)) + } + + if (pins.length <= maxItems) { + const nodes = pins + .map(item => ({ + link: new DAGLink('', 1, item.key), + data: item.data || Buffer.alloc(0) + })) + // sorting makes any ordering of `pins` produce the same DAGNode + .sort((a, b) => Buffer.compare(a.link.multihash, b.link.multihash)) + + const rootLinks = fanoutLinks.concat(nodes.map(item => item.link)) + const rootData = Buffer.concat( + [headerBuf].concat(nodes.map(item => item.data)) + ) + + DAGNode.create(rootData, rootLinks, (err, rootNode) => { + if (err) { return storePinsCb(err) } + return storePinsCb(null, rootNode) + }) + } else { + // If the array of pins is > maxItems, we: + // - distribute the pins among `defaultFanout` bins + // - create a DAGNode for each bin + // - add each pin as a DAGLink to that bin + // - create a root DAGNode + // - add each bin as a DAGLink + // - send that root DAGNode via callback + // (using go-ipfs' "wasteful but simple" approach for consistency) + // https://github.com/ipfs/go-ipfs/blob/master/pin/set.go#L57 + + const bins = pins.reduce((bins, pin) => { + const n = hash(depth, pin.key) % defaultFanout + bins[n] = n in bins ? 
bins[n].concat([pin]) : [pin] + return bins + }, {}) + + async.eachOf(bins, (bin, idx, eachCb) => { + storePins( + bin, + depth + 1, + (err, child) => storeChild(err, child, idx, eachCb) + ) + }, err => { + if (err) { return storePinsCb(err) } + DAGNode.create(headerBuf, fanoutLinks, (err, rootNode) => { + if (err) { return storePinsCb(err) } + return storePinsCb(null, rootNode) + }) + }) + } + + function storeChild (err, child, binIdx, cb) { + if (err) { return cb(err) } + + dag.put(child, { cid: new CID(child._multihash) }, err => { + if (err) { return cb(err) } + fanoutLinks[binIdx] = new DAGLink('', child.size, child.multihash) + cb(null) + }) + } + } + }, + + loadSet: (rootNode, name, callback) => { + const link = rootNode.links.find(l => l.name === name) + if (!link) { + return callback(new Error('No link found with name ' + name)) + } + + dag.get(link.multihash, (err, res) => { + if (err) { return callback(err) } + const keys = [] + const step = link => keys.push(link.multihash) + pinSet.walkItems(res.value, step, err => { + if (err) { return callback(err) } + return callback(null, keys) + }) + }) + }, + + walkItems: (node, step, callback) => { + let pbh + try { + pbh = readHeader(node) + } catch (err) { + return callback(err) + } + + async.eachOf(node.links, (link, idx, eachCb) => { + if (idx < pbh.header.fanout) { + // the first pbh.header.fanout links are fanout bins + // if a fanout bin is not 'empty', dig into and walk its DAGLinks + const linkHash = link.multihash + + if (!emptyKey.equals(linkHash)) { + // walk the links of this fanout bin + return dag.get(linkHash, (err, res) => { + if (err) { return eachCb(err) } + pinSet.walkItems(res.value, step, eachCb) + }) + } + } else { + // otherwise, the link is a pin + step(link, idx, pbh.data) + } + + eachCb(null) + }, callback) + } + } + return pinSet +} diff --git a/src/core/components/pin.js b/src/core/components/pin.js new file mode 100644 index 0000000000..93f79a8e74 --- /dev/null +++ 
b/src/core/components/pin.js @@ -0,0 +1,393 @@ +/* eslint max-nested-callbacks: ["error", 8] */ +'use strict' + +const promisify = require('promisify-es6') +const { DAGNode, DAGLink } = require('ipld-dag-pb') +const CID = require('cids') +const multihashes = require('multihashes') +const async = require('async') +const { Key } = require('interface-datastore') + +const createPinSet = require('./pin-set') +const { resolvePath } = require('../utils') + +// arbitrary limit to the number of concurrent dag operations +const concurrencyLimit = 300 +const pinDataStoreKey = new Key('/local/pins') + +function toB58String (hash) { + return new CID(hash).toBaseEncodedString() +} + +module.exports = function pin (self) { + const repo = self._repo + const dag = self.dag + const pinset = createPinSet(dag) + const types = { + direct: 'direct', + recursive: 'recursive', + indirect: 'indirect', + all: 'all' + } + + let directPins = new Set() + let recursivePins = new Set() + + const directKeys = () => + Array.from(directPins).map(key => multihashes.fromB58String(key)) + const recursiveKeys = () => + Array.from(recursivePins).map(key => multihashes.fromB58String(key)) + + function getIndirectKeys (callback) { + const indirectKeys = new Set() + async.eachLimit(recursiveKeys(), concurrencyLimit, (multihash, cb) => { + dag._getRecursive(multihash, (err, nodes) => { + if (err) { return cb(err) } + + nodes + .map(({ multihash }) => toB58String(multihash)) + // recursive pins pre-empt indirect pins + .filter(key => !recursivePins.has(key)) + .forEach(key => indirectKeys.add(key)) + + cb() + }) + }, (err) => { + if (err) { return callback(err) } + callback(null, Array.from(indirectKeys)) + }) + } + + // Encode and write pin key sets to the datastore: + // a DAGLink for each of the recursive and direct pinsets + // a DAGNode holding those as DAGLinks, a kind of root pin + function flushPins (callback) { + let dLink, rLink, root + async.series([ + // create a DAGLink to the node with direct 
pins + cb => async.waterfall([ + cb => pinset.storeSet(directKeys(), cb), + (node, cb) => DAGLink.create(types.direct, node.size, node.multihash, cb), + (link, cb) => { dLink = link; cb(null) } + ], cb), + + // create a DAGLink to the node with recursive pins + cb => async.waterfall([ + cb => pinset.storeSet(recursiveKeys(), cb), + (node, cb) => DAGLink.create(types.recursive, node.size, node.multihash, cb), + (link, cb) => { rLink = link; cb(null) } + ], cb), + + // the pin-set nodes link to a special 'empty' node, so make sure it exists + cb => DAGNode.create(Buffer.alloc(0), (err, empty) => { + if (err) { return cb(err) } + dag.put(empty, { cid: new CID(empty.multihash) }, cb) + }), + + // create a root node with DAGLinks to the direct and recursive DAGs + cb => DAGNode.create(Buffer.alloc(0), [dLink, rLink], (err, node) => { + if (err) { return cb(err) } + root = node + dag.put(root, { cid: new CID(root.multihash) }, cb) + }), + + // hack for CLI tests + cb => repo.closed ? repo.datastore.open(cb) : cb(null, null), + + // save root to datastore under a consistent key + cb => repo.datastore.put(pinDataStoreKey, root.multihash, cb) + ], (err, res) => { + if (err) { return callback(err) } + self.log(`Flushed pins with root: ${root}`) + return callback(null, root) + }) + } + + const pin = { + add: promisify((paths, options, callback) => { + if (typeof options === 'function') { + callback = options + options = null + } + const recursive = options ? 
options.recursive : true + + resolvePath(self.object, paths, (err, mhs) => { + if (err) { return callback(err) } + + // verify that each hash can be pinned + async.map(mhs, (multihash, cb) => { + const key = toB58String(multihash) + if (recursive) { + if (recursivePins.has(key)) { + // it's already pinned recursively + return cb(null, key) + } + + // entire graph of nested links should be pinned, + // so make sure we have all the objects + dag._getRecursive(key, (err) => { + if (err) { return cb(err) } + // found all objects, we can add the pin + return cb(null, key) + }) + } else { + if (recursivePins.has(key)) { + // recursive supersedes direct, can't have both + return cb(new Error(`${key} already pinned recursively`)) + } + if (directPins.has(key)) { + // already directly pinned + return cb(null, key) + } + + // make sure we have the object + dag.get(new CID(multihash), (err) => { + if (err) { return cb(err) } + // found the object, we can add the pin + return cb(null, key) + }) + } + }, (err, results) => { + if (err) { return callback(err) } + + // update the pin sets in memory + const pinset = recursive ? 
recursivePins : directPins + results.forEach(key => pinset.add(key)) + + // persist updated pin sets to datastore + flushPins((err, root) => { + if (err) { return callback(err) } + return callback(null, results.map(hash => ({ hash }))) + }) + }) + }) + }), + + rm: promisify((paths, options, callback) => { + let recursive = true + if (typeof options === 'function') { + callback = options + } else if (options && options.recursive === false) { + recursive = false + } + + resolvePath(self.object, paths, (err, mhs) => { + if (err) { return callback(err) } + + // verify that each hash can be unpinned + async.map(mhs, (multihash, cb) => { + pin._isPinnedWithType(multihash, types.all, (err, res) => { + if (err) { return cb(err) } + const { pinned, reason } = res + const key = toB58String(multihash) + if (!pinned) { + return cb(new Error(`${key} is not pinned`)) + } + + switch (reason) { + case (types.recursive): + if (recursive) { + return cb(null, key) + } else { + return cb(new Error(`${key} is pinned recursively`)) + } + case (types.direct): + return cb(null, key) + default: + return cb(new Error( + `${key} is pinned indirectly under ${reason}` + )) + } + }) + }, (err, results) => { + if (err) { return callback(err) } + + // update the pin sets in memory + results.forEach(key => { + if (recursive && recursivePins.has(key)) { + recursivePins.delete(key) + } else { + directPins.delete(key) + } + }) + + // persist updated pin sets to datastore + flushPins((err, root) => { + if (err) { return callback(err) } + self.log(`Removed pins: ${results}`) + return callback(null, results.map(hash => ({ hash }))) + }) + }) + }) + }), + + ls: promisify((paths, options, callback) => { + let type = types.all + if (typeof paths === 'function') { + callback = paths + options = null + paths = null + } + if (typeof options === 'function') { + callback = options + } + if (paths && paths.type) { + options = paths + paths = null + } + if (options && options.type) { + type = 
options.type.toLowerCase() + } + if (!types[type]) { + return callback(new Error( + `Invalid type '${type}', must be one of {direct, indirect, recursive, all}` + )) + } + + if (paths) { + // check the pinned state of specific hashes + resolvePath(self.object, paths, (err, mhs) => { + if (err) { return callback(err) } + + async.mapSeries(mhs, (multihash, cb) => { + pin._isPinnedWithType(multihash, types.all, (err, res) => { + if (err) { return cb(err) } + const { pinned, reason } = res + const key = toB58String(multihash) + if (!pinned) { + return cb(new Error(`Path ${key} is not pinned`)) + } + + switch (reason) { + case types.direct: + case types.recursive: + return cb(null, { + hash: key, + type: reason + }) + default: + return cb(null, { + hash: key, + type: `${types.indirect} through ${reason}` + }) + } + }) + }, callback) + }) + } else { + // show all pinned items of type + let pins = [] + if (type === types.direct || type === types.all) { + pins = pins.concat( + Array.from(directPins).map(hash => ({ + type: types.direct, + hash + })) + ) + } + if (type === types.recursive || type === types.all) { + pins = pins.concat( + Array.from(recursivePins).map(hash => ({ + type: types.recursive, + hash + })) + ) + } + if (type === types.indirect || type === types.all) { + getIndirectKeys((err, indirects) => { + if (err) { return callback(err) } + pins = pins + // if something is pinned both directly and indirectly, + // report the indirect entry + .filter(({ hash }) => + !indirects.includes(hash) || + (indirects.includes(hash) && !directPins.has(hash)) + ) + .concat(indirects.map(hash => ({ + type: types.indirect, + hash + }))) + return callback(null, pins) + }) + } else { + return callback(null, pins) + } + } + }), + + _isPinnedWithType: promisify((multihash, type, callback) => { + const key = toB58String(multihash) + const { recursive, direct, all } = types + // recursive + if ((type === recursive || type === all) && recursivePins.has(key)) { + return callback(null, 
{pinned: true, reason: recursive}) + } + if ((type === recursive)) { + return callback(null, {pinned: false}) + } + // direct + if ((type === direct || type === all) && directPins.has(key)) { + return callback(null, {pinned: true, reason: direct}) + } + if ((type === direct)) { + return callback(null, {pinned: false}) + } + + // indirect (default) + // check each recursive key to see if multihash is under it + // arbitrary limit, enables handling 1000s of pins. + let foundPin + async.someLimit(recursiveKeys(), concurrencyLimit, (key, cb) => { + dag.get(new CID(key), (err, res) => { + if (err) { return cb(err) } + + pinset.hasDescendant(res.value, multihash, (err, has) => { + if (has) { + foundPin = toB58String(res.value.multihash) + } + cb(err, has) + }) + }) + }, (err, found) => { + if (err) { return callback(err) } + return callback(null, { pinned: found, reason: foundPin }) + }) + }), + + _load: promisify(callback => { + async.waterfall([ + // hack for CLI tests + (cb) => repo.closed ? repo.datastore.open(cb) : cb(null, null), + (_, cb) => repo.datastore.has(pinDataStoreKey, cb), + (has, cb) => has ? 
cb() : cb(new Error('No pins to load')), + (cb) => repo.datastore.get(pinDataStoreKey, cb), + (mh, cb) => dag.get(new CID(mh), cb) + ], (err, pinRoot) => { + if (err) { + if (err.message === 'No pins to load') { + self.log('No pins to load') + return callback() + } else { + return callback(err) + } + } + + async.parallel([ + cb => pinset.loadSet(pinRoot.value, types.recursive, cb), + cb => pinset.loadSet(pinRoot.value, types.direct, cb) + ], (err, keys) => { + if (err) { return callback(err) } + const [ rKeys, dKeys ] = keys + + directPins = new Set(dKeys.map(toB58String)) + recursivePins = new Set(rKeys.map(toB58String)) + + self.log('Loaded pins from the datastore') + return callback(null) + }) + }) + }) + } + + return pin +} diff --git a/src/core/components/pin.proto.js b/src/core/components/pin.proto.js new file mode 100644 index 0000000000..8e94fd8f52 --- /dev/null +++ b/src/core/components/pin.proto.js @@ -0,0 +1,19 @@ +'use strict' + +/** + * Protobuf interface + * from go-ipfs/pin/internal/pb/header.proto + */ +module.exports = ` + syntax = "proto2"; + + package ipfs.pin; + + option go_package = "pb"; + + message Set { + optional uint32 version = 1; + optional uint32 fanout = 2; + optional fixed32 seed = 3; + } +` diff --git a/src/core/index.js b/src/core/index.js index 0b19160429..57efa58385 100644 --- a/src/core/index.js +++ b/src/core/index.js @@ -101,6 +101,7 @@ class IPFS extends EventEmitter { this.swarm = components.swarm(this) this.files = components.files(this) this.bitswap = components.bitswap(this) + this.pin = components.pin(this) this.ping = components.ping(this) this.pingPullStream = components.pingPullStream(this) this.pingReadableStream = components.pingReadableStream(this) diff --git a/src/core/utils.js b/src/core/utils.js index a492250383..a0d67e449a 100644 --- a/src/core/utils.js +++ b/src/core/utils.js @@ -1,3 +1,111 @@ 'use strict' +const multihashes = require('multihashes') +const promisify = require('promisify-es6') +const map = 
require('async/map') + const isIpfs = require('is-ipfs') + exports.OFFLINE_ERROR = 'This command must be run in online mode. Try running \'ipfs daemon\' first.' + +/** + * Break an ipfs-path down into its hash and an array of links. + * + * examples: + * b58Hash -> { hash: 'b58Hash', links: [] } + * b58Hash/mercury/venus -> { hash: 'b58Hash', links: ['mercury', 'venus']} + * /ipfs/b58Hash/links/by/name -> { hash: 'b58Hash', links: ['links', 'by', 'name'] } + * + * @param {String} ipfsPath An ipfs-path + * @return {Object} { hash: base58 string, links: [string], ?err: Error } + * @throws {Error} when ipfsPath is not a valid ipfs-path + */ +function parseIpfsPath (ipfsPath) { + const invalidPathErr = new Error('invalid ipfs ref path') + ipfsPath = ipfsPath.replace(/^\/ipfs\//, '') + const matched = ipfsPath.match(/([^/]+(?:\/[^/]+)*)\/?$/) + if (!matched) { + throw invalidPathErr + } + + const [hash, ...links] = matched[1].split('/') + + // check that a CID can be constructed with the hash + if (isIpfs.cid(hash)) { + return { hash, links } + } else { + throw invalidPathErr + } +} + +/** + * Resolve various styles of an ipfs-path to the hash of the target node. + * Follows links in the path. 
+ + * + * Accepts formats: + * - <b58 hash> + * - <b58 hash>/link/to/venus + * - /ipfs/<b58 hash>/link/to/pluto + * - multihash Buffer + * - Arrays of the above + * + * @param {IPFS} objectAPI The IPFS object api + * @param {Described above} ipfsPaths A single or collection of ipfs-paths + * @param {Function} callback res is Array<multihash> + * if no callback is passed, returns a Promise + * @return {Promise|void} + */ +const resolvePath = promisify(function (objectAPI, ipfsPaths, callback) { + if (!Array.isArray(ipfsPaths)) { + ipfsPaths = [ipfsPaths] + } + + map(ipfsPaths, (path, cb) => { + if (typeof path !== 'string') { + try { + multihashes.validate(path) + } catch (err) { + return cb(err) + } + return cb(null, path) + } + + let parsedPath + try { + parsedPath = exports.parseIpfsPath(path) + } catch (err) { + return cb(err) + } + + const rootHash = multihashes.fromB58String(parsedPath.hash) + const rootLinks = parsedPath.links + if (!rootLinks.length) { + return cb(null, rootHash) + } + + objectAPI.get(rootHash, follow.bind(null, rootLinks)) + + // recursively follow named links to the target node + function follow (links, err, obj) { + if (err) { + return cb(err) + } + if (!links.length) { + // done tracing, obj is the target node + return cb(null, obj.multihash) + } + + const linkName = links[0] + const nextObj = obj.links.find(link => link.name === linkName) + if (!nextObj) { + return cb(new Error( + `no link named "${linkName}" under ${obj.toJSON().multihash}` + )) + } + + objectAPI.get(nextObj.multihash, follow.bind(null, links.slice(1))) + } + }, callback) +}) + +exports.parseIpfsPath = parseIpfsPath +exports.resolvePath = resolvePath diff --git a/src/http/api/resources/files.js b/src/http/api/resources/files.js index b62fb67aa1..1cd93e158e 100644 --- a/src/http/api/resources/files.js +++ b/src/http/api/resources/files.js @@ -166,6 +166,7 @@ exports.add = { otherwise: Joi.boolean().valid(false) }), 'only-hash': Joi.boolean(), + pin: Joi.boolean().default(true), 'wrap-with-directory': 
Joi.boolean() }) // TODO: Necessary until validate "recursive", "stream-channels" etc. @@ -227,7 +228,8 @@ exports.add = { progress: request.query.progress ? progressHandler : null, onlyHash: request.query['only-hash'], hashAlg: request.query['hash'], - wrapWithDirectory: request.query['wrap-with-directory'] + wrapWithDirectory: request.query['wrap-with-directory'], + pin: request.query.pin } const aborter = abortable() diff --git a/src/http/api/resources/index.js b/src/http/api/resources/index.js index 37f38f246b..59040a99d8 100644 --- a/src/http/api/resources/index.js +++ b/src/http/api/resources/index.js @@ -7,6 +7,7 @@ exports.ping = require('./ping') exports.bootstrap = require('./bootstrap') exports.repo = require('./repo') exports.object = require('./object') +exports.pin = require('./pin') exports.config = require('./config') exports.block = require('./block') exports.swarm = require('./swarm') diff --git a/src/http/api/resources/pin.js b/src/http/api/resources/pin.js new file mode 100644 index 0000000000..e42dd1f8c4 --- /dev/null +++ b/src/http/api/resources/pin.js @@ -0,0 +1,100 @@ +'use strict' + +const _ = require('lodash') +const debug = require('debug') +const log = debug('jsipfs:http-api:pin') +log.error = debug('jsipfs:http-api:pin:error') + +exports = module.exports + +function parseArgs (request, reply) { + if (!request.query.arg) { + return reply({ + Message: "Argument 'arg' is required", + Code: 0 + }).code(400).takeover() + } + + const recursive = request.query.recursive !== 'false' + + return reply({ + path: request.query.arg, + recursive: recursive + }) +} + +exports.ls = { + parseArgs: (request, reply) => { + const type = request.query.type || 'all' + + return reply({ + path: request.query.arg, + type: type + }) + }, + + handler: (request, reply) => { + const { path, type } = request.pre.args + const ipfs = request.server.app.ipfs + ipfs.pin.ls(path, { type }, (err, result) => { + if (err) { + log.error(err) + return reply({ + Message: 
`Failed to list pins: ${err.message}`, + Code: 0 + }).code(500) + } + + return reply({ + Keys: _.mapValues( + _.keyBy(result, obj => obj.hash), + obj => ({Type: obj.type}) + ) + }) + }) + } +} + +exports.add = { + parseArgs: parseArgs, + + handler: (request, reply) => { + const ipfs = request.server.app.ipfs + const { path, recursive } = request.pre.args + ipfs.pin.add(path, { recursive }, (err, result) => { + if (err) { + log.error(err) + return reply({ + Message: `Failed to add pin: ${err.message}`, + Code: 0 + }).code(500) + } + + return reply({ + Pins: result.map(obj => obj.hash) + }) + }) + } +} + +exports.rm = { + parseArgs: parseArgs, + + handler: (request, reply) => { + const ipfs = request.server.app.ipfs + const { path, recursive } = request.pre.args + ipfs.pin.rm(path, { recursive }, (err, result) => { + if (err) { + log.error(err) + return reply({ + Message: `Failed to remove pin: ${err.message}`, + Code: 0 + }).code(500) + } + + return reply({ + Pins: result.map(obj => obj.hash) + }) + }) + } +} diff --git a/src/http/api/routes/index.js b/src/http/api/routes/index.js index d7c30851f7..bfec26a460 100644 --- a/src/http/api/routes/index.js +++ b/src/http/api/routes/index.js @@ -7,6 +7,7 @@ module.exports = (server) => { require('./bootstrap')(server) require('./block')(server) require('./object')(server) + require('./pin')(server) require('./repo')(server) require('./config')(server) require('./ping')(server) diff --git a/src/http/api/routes/pin.js b/src/http/api/routes/pin.js new file mode 100644 index 0000000000..657bb375ac --- /dev/null +++ b/src/http/api/routes/pin.js @@ -0,0 +1,40 @@ +'use strict' + +const resources = require('./../resources') + +module.exports = (server) => { + const api = server.select('API') + + api.route({ + method: '*', + path: '/api/v0/pin/add', + config: { + pre: [ + { method: resources.pin.add.parseArgs, assign: 'args' } + ], + handler: resources.pin.add.handler + } + }) + + api.route({ + method: '*', + path: 
'/api/v0/pin/rm', + config: { + pre: [ + { method: resources.pin.rm.parseArgs, assign: 'args' } + ], + handler: resources.pin.rm.handler + } + }) + + api.route({ + method: '*', + path: '/api/v0/pin/ls', + config: { + pre: [ + { method: resources.pin.ls.parseArgs, assign: 'args' } + ], + handler: resources.pin.ls.handler + } + }) +} diff --git a/test/cli/commands.js b/test/cli/commands.js index 1e194eb143..7a8502bc4c 100644 --- a/test/cli/commands.js +++ b/test/cli/commands.js @@ -4,8 +4,7 @@ const expect = require('chai').expect const runOnAndOff = require('../utils/on-and-off') -const commandCount = 74 - +const commandCount = 78 describe('commands', () => runOnAndOff((thing) => { let ipfs diff --git a/test/cli/files.js b/test/cli/files.js index 60175ddfab..32fe28fc00 100644 --- a/test/cli/files.js +++ b/test/cli/files.js @@ -5,11 +5,13 @@ const fs = require('fs') const os = require('os') const expect = require('chai').expect const path = require('path') +const hat = require('hat') const compareDir = require('dir-compare').compareSync const rimraf = require('rimraf').sync const CID = require('cids') const mh = require('multihashes') const runOnAndOff = require('../utils/on-and-off') +const clean = require('../utils/clean') // TODO: Test against all algorithms Object.keys(mh.names) // This subset is known to work with both go-ipfs and js-ipfs as of 2017-09-05 @@ -296,7 +298,7 @@ describe('files', () => runOnAndOff((thing) => { it('add --only-hash does not add a file to the datastore', function () { this.timeout(30 * 1000) this.slow(10 * 1000) - const content = String(Math.random() + Date.now()) + const content = String(Math.random()) const filepath = path.join(os.tmpdir(), `${content}.txt`) fs.writeFileSync(filepath, content) @@ -310,10 +312,36 @@ describe('files', () => runOnAndOff((thing) => { ipfs.fail(`object get ${hash}`), new Promise((resolve, reject) => setTimeout(resolve, 4000)) ]) - .then(() => fs.unlinkSync(filepath)) + .then(() => clean(filepath)) }) }) + 
it('add pins by default', function () { + this.timeout(10 * 1000) + const filePath = path.join(os.tmpdir(), hat()) + const content = String(Math.random()) + fs.writeFileSync(filePath, content) + + return ipfs(`files add -Q ${filePath}`) + .then(out => { + const hash = out.trim() + return ipfs(`pin ls ${hash}`) + .then(ls => expect(ls).to.include(hash)) + }) + .then(() => clean(filePath)) + }) + + it('add does not pin with --pin=false', function () { + this.timeout(20 * 1000) + const filePath = path.join(os.tmpdir(), hat()) + const content = String(Math.random()) + fs.writeFileSync(filePath, content) + + return ipfs(`files add -Q --pin=false ${filePath}`) + .then(out => ipfs.fail(`pin ls ${out.trim()}`)) + .then(() => clean(filePath)) + }) + HASH_ALGS.forEach((name) => { it(`add with hash=${name} and raw-leaves=false`, function () { this.timeout(30 * 1000) diff --git a/test/cli/pin.js b/test/cli/pin.js new file mode 100644 index 0000000000..f6c77eb67a --- /dev/null +++ b/test/cli/pin.js @@ -0,0 +1,78 @@ +/* eslint-env mocha */ +/* eslint max-nested-callbacks: ["error", 8] */ +'use strict' + +const expect = require('chai').expect +const runOnAndOff = require('../utils/on-and-off') + +// fixture structure: +// planets/ +// solar-system.md +// mercury/ +// wiki.md +const fixturePath = 'test/fixtures/planets' +const pins = { + root: 'QmTAMavb995EHErSrKo7mB8dYkpaSJxu6ys1a6XJyB2sys', + solarWiki: 'QmTMbkDfvHwq3Aup6Nxqn3KKw9YnoKzcZvuArAfQ9GF3QG', + mercuryDir: 'QmbJCNKXJqVK8CzbjpNFz2YekHwh3CSHpBA86uqYg3sJ8q', + mercuryWiki: 'QmVgSHAdMxFAuMP2JiMAYkB8pCWP1tcB9djqvq8GKAFiHi' +} + +describe('pin', () => runOnAndOff(thing => { + let ipfs + + before(function () { + this.timeout(15 * 1000) + ipfs = thing.ipfs + return ipfs(`files add -r ${fixturePath}`) + }) + + describe('rm', function () { + it('recursively (default)', function () { + this.timeout(10 * 1000) + return ipfs(`pin rm ${pins.root}`) + .then(out => expect(out).to.equal(`unpinned ${pins.root}\n`)) + }) + }) + + 
describe('add', function () { + it('recursively (default)', () => { + return ipfs(`pin add ${pins.root}`) + .then(out => + expect(out).to.eql(`pinned ${pins.root} recursively\n`) + ) + }) + + it('direct', () => { + return ipfs(`pin add ${pins.solarWiki} --recursive false`) + .then(out => + expect(out).to.eql(`pinned ${pins.solarWiki} directly\n`) + ) + }) + }) + + describe('ls', function () { + it('lists all pins when no hash is passed', function () { + return ipfs('pin ls -q').then(out => { + const results = out.split('\n') + expect(results).to.include.members(Object.values(pins)) + }) + }) + + it('handles multiple hashes', function () { + return ipfs(`pin ls ${pins.root} ${pins.solarWiki}`) + .then(out => { + expect(out).to.eql( + `${pins.root} recursive\n${pins.solarWiki} direct\n` + ) + }) + }) + + it('can print quietly', function () { + return ipfs('pin ls -q').then(out => { + const firstLineParts = out.split(/\s/)[0].split(' ') + expect(firstLineParts).to.have.length(1) + }) + }) + }) +})) diff --git a/test/core/interface/pin.js b/test/core/interface/pin.js new file mode 100644 index 0000000000..604a5e2440 --- /dev/null +++ b/test/core/interface/pin.js @@ -0,0 +1,35 @@ +/* eslint-env mocha */ +'use strict' + +const test = require('interface-ipfs-core') +const parallel = require('async/parallel') + +const IPFS = require('../../../src') + +const DaemonFactory = require('ipfsd-ctl') +const df = DaemonFactory.create({ type: 'proc', exec: IPFS }) + +const nodes = [] +const common = { + setup: function (callback) { + callback(null, { + spawnNode: (cb) => { + df.spawn({ + initOptions: { bits: 512 } + }, (err, _ipfsd) => { + if (err) { + return cb(err) + } + + nodes.push(_ipfsd) + cb(null, _ipfsd.api) + }) + } + }) + }, + teardown: function (callback) { + parallel(nodes.map((node) => (cb) => node.stop(cb)), callback) + } +} + +test.pin(common) diff --git a/test/core/key-exchange.js b/test/core/key-exchange.js index 86a2781ad6..5628fc5b71 100644 --- 
a/test/core/key-exchange.js +++ b/test/core/key-exchange.js @@ -28,7 +28,7 @@ describe('key exchange', () => { ipfs.on('ready', () => done()) }) - after((done) => repo.teardown(done)) + after((done) => ipfs.stop(done)) it('exports', (done) => { ipfs.key.export('self', passwordPem, (err, pem) => { diff --git a/test/core/pin-set.js b/test/core/pin-set.js new file mode 100644 index 0000000000..243c81d25c --- /dev/null +++ b/test/core/pin-set.js @@ -0,0 +1,192 @@ +/* eslint max-nested-callbacks: ["error", 8] */ +/* eslint-env mocha */ +'use strict' + +const chai = require('chai') +const dirtyChai = require('dirty-chai') +const expect = chai.expect +chai.use(dirtyChai) + +const parallelLimit = require('async/parallelLimit') +const series = require('async/series') +const { fromB58String } = require('multihashes') +const { DAGNode } = require('ipld-dag-pb') +const CID = require('cids') + +const IPFS = require('../../src/core') +const createPinSet = require('../../src/core/components/pin-set') +const createTempRepo = require('../utils/create-repo-nodejs') + +const defaultFanout = 256 +const maxItems = 8192 + +/** + * Creates `num` DAGNodes, limited to 500 at a time to save memory + * @param {Number} num the number of nodes to create + * @param {Function} callback node-style callback, result is an Array of all + * created nodes + * @return {void} + */ +function createNodes (num, callback) { + const items = [] + for (let i = 0; i < num; i++) { + items.push(cb => + createNode(String(i), (err, node) => cb(err, node._multihash)) + ) + } + + parallelLimit(items, 500, callback) +} + +function createNode (data, links = [], callback) { + if (typeof links === 'function') { + callback = links + links = [] + } + + DAGNode.create(data, links, callback) +} + +describe('pinSet', function () { + let ipfs + let pinSet + let repo + + before(function (done) { + this.timeout(20 * 1000) + repo = createTempRepo() + ipfs = new IPFS({ repo }) + ipfs.on('ready', () => { + pinSet = 
createPinSet(ipfs.dag) + done() + }) + }) + + after(function (done) { + this.timeout(10 * 1000) + ipfs.stop(done) + }) + + describe('storeItems', function () { + it('generates a root node with links and hash', function (done) { + const expectedRootHash = 'QmcLiSTjcjoVC2iuGbk6A2PVcWV3WvjZT4jxfNis1vjyrR' + + createNode('data', (err, node) => { + expect(err).to.not.exist() + const nodeHash = node._multihash + pinSet.storeSet([nodeHash], (err, rootNode) => { + expect(err).to.not.exist() + const node = rootNode.toJSON() + expect(node.multihash).to.eql(expectedRootHash) + expect(node.links).to.have.length(defaultFanout + 1) + + const lastLink = node.links[node.links.length - 1] + const mhash = fromB58String(lastLink.multihash) + expect(mhash).to.eql(nodeHash) + done() + }) + }) + }) + }) + + describe('handles large sets', function () { + it('handles storing items > maxItems', function (done) { + this.timeout(19 * 1000) + const expectedHash = 'QmbvhSy83QWfgLXDpYjDmLWBFfGc8utoqjcXHyj3gYuasT' + const count = maxItems + 1 + createNodes(count, (err, nodes) => { + expect(err).to.not.exist() + pinSet.storeSet(nodes, (err, node) => { + expect(err).to.not.exist() + + node = node.toJSON() + expect(node.size).to.eql(3184696) + expect(node.links).to.have.length(defaultFanout) + expect(node.multihash).to.eql(expectedHash) + + pinSet.loadSet(node, '', (err, loaded) => { + expect(err).to.not.exist() + expect(loaded).to.have.length(30) + const hashes = loaded.map(l => new CID(l).toBaseEncodedString()) + + // just check the first node, assume all are children if successful + pinSet.hasDescendant(node, hashes[0], (err, has) => { + expect(err).to.not.exist() + expect(has).to.eql(true) + done() + }) + }) + }) + }) + }) + + // This test is largely taken from go-ipfs/pin/set_test.go + // It fails after reaching maximum call stack depth but I don't believe it's + // infinite. 
We need to reference go's pinSet impl to make sure + our sharding behaves correctly, or perhaps this test is misguided + it.skip('stress test: stores items > (maxItems * defaultFanout) + 1', function (done) { + this.timeout(180 * 1000) + + // this value triggers the creation of a recursive shard. + // If the recursive sharding is done improperly, this will result in + // an infinite recursion and crash (OOM) + const limit = (defaultFanout * maxItems) + 1 + + createNodes(limit, (err, nodes) => { + expect(err).to.not.exist() + series([ + cb => pinSet.storeSet(nodes.slice(0, -1), (err, res) => { + expect(err).to.not.exist() + cb(null, res) + }), + cb => pinSet.storeSet(nodes, (err, res) => { + expect(err).to.not.exist() + cb(null, res) + }) + ], (err, rootNodes) => { + expect(err).to.not.exist() + expect(rootNodes[0].length - rootNodes[1].length).to.eql(2) + done() + }) + }) + }) + }) + + describe('walkItems', function () { + it(`fails if node doesn't have a pin-set protobuf header`, function (done) { + createNode('datum', (err, node) => { + expect(err).to.not.exist() + + pinSet.walkItems(node, () => {}, (err, res) => { + expect(err).to.exist() + expect(res).to.not.exist() + done() + }) + }) + }) + + it('visits all non-fanout links of a root node', function (done) { + const seen = [] + const walker = (link, idx, data) => seen.push({ link, idx, data }) + + createNodes(defaultFanout, (err, nodes) => { + expect(err).to.not.exist() + + pinSet.storeSet(nodes, (err, node) => { + expect(err).to.not.exist() + + pinSet.walkItems(node, walker, err => { + expect(err).to.not.exist() + expect(seen).to.have.length(defaultFanout) + expect(seen[0].idx).to.eql(defaultFanout) + seen.forEach(item => { + expect(item.data).to.eql(Buffer.alloc(0)) + expect(item.link).to.exist() + }) + done() + }) + }) + }) + }) + }) +}) diff --git a/test/core/pin.js b/test/core/pin.js new file mode 100644 index 0000000000..32618422a0 --- /dev/null +++ b/test/core/pin.js @@ -0,0 +1,328 @@ +/* eslint 
max-nested-callbacks: ["error", 8] */ +/* eslint-env mocha */ +'use strict' + +const chai = require('chai') +const dirtyChai = require('dirty-chai') +const expect = chai.expect +chai.use(dirtyChai) + +const fs = require('fs') + +const IPFS = require('../../src/core') +const createTempRepo = require('../utils/create-repo-nodejs') +const expectTimeout = require('../utils/expect-timeout') + +// fixture structure: +// planets/ +// solar-system.md +// mercury/ +// wiki.md +const pins = { + root: 'QmTAMavb995EHErSrKo7mB8dYkpaSJxu6ys1a6XJyB2sys', + solarWiki: 'QmTMbkDfvHwq3Aup6Nxqn3KKw9YnoKzcZvuArAfQ9GF3QG', + mercuryDir: 'QmbJCNKXJqVK8CzbjpNFz2YekHwh3CSHpBA86uqYg3sJ8q', + mercuryWiki: 'QmVgSHAdMxFAuMP2JiMAYkB8pCWP1tcB9djqvq8GKAFiHi' +} +const pinTypes = { + direct: 'direct', + recursive: 'recursive', + indirect: 'indirect', + all: 'all' +} + +describe('pin', function () { + const fixtures = [ + 'test/fixtures/planets/mercury/wiki.md', + 'test/fixtures/planets/solar-system.md' + ].map(path => ({ + path, + content: fs.readFileSync(path) + })) + + let ipfs + let pin + let repo + + function expectPinned (hash, type, pinned = true) { + if (typeof type === 'boolean') { + pinned = type + type = undefined + } + + return pin._isPinnedWithType(hash, type || pinTypes.all) + .then(result => expect(result.pinned).to.eql(pinned)) + } + + function clearPins () { + return pin.ls() + .then(ls => { + const pinsToRemove = ls + .filter(out => out.type === pinTypes.recursive) + .map(out => pin.rm(out.hash)) + return Promise.all(pinsToRemove) + }) + .then(() => pin.ls()) + .then(ls => { + const pinsToRemove = ls + .filter(out => out.type === pinTypes.direct) + .map(out => pin.rm(out.hash)) + return Promise.all(pinsToRemove) + }) + } + + before(function (done) { + this.timeout(20 * 1000) + repo = createTempRepo() + ipfs = new IPFS({ repo }) + ipfs.on('ready', () => { + pin = ipfs.pin + ipfs.files.add(fixtures, done) + }) + }) + + after(done => ipfs.stop(done)) + + describe('isPinnedWithType', 
function () { + beforeEach(function () { + return clearPins() + .then(() => pin.add(pins.root)) + }) + + it('when node is pinned', function () { + return pin.add(pins.solarWiki) + .then(() => pin._isPinnedWithType(pins.solarWiki, pinTypes.all)) + .then(pinned => expect(pinned.pinned).to.eql(true)) + }) + + it('when node is not in datastore', function () { + const falseHash = `${pins.root.slice(0, -2)}ss` + return pin._isPinnedWithType(falseHash, pinTypes.all) + .then(pinned => { + expect(pinned.pinned).to.eql(false) + expect(pinned.reason).to.eql(undefined) + }) + }) + + it('when node is in datastore but not pinned', function () { + return pin.rm(pins.root) + .then(() => expectPinned(pins.root, false)) + }) + + it('when pinned recursively', function () { + return pin._isPinnedWithType(pins.root, pinTypes.recursive) + .then(result => { + expect(result.pinned).to.eql(true) + expect(result.reason).to.eql(pinTypes.recursive) + }) + }) + + it('when pinned indirectly', function () { + return pin._isPinnedWithType(pins.mercuryWiki, pinTypes.indirect) + .then(result => { + expect(result.pinned).to.eql(true) + expect(result.reason).to.eql(pins.root) + }) + }) + + it('when pinned directly', function () { + return pin.add(pins.mercuryDir, { recursive: false }) + .then(() => { + return pin._isPinnedWithType(pins.mercuryDir, pinTypes.direct) + .then(result => { + expect(result.pinned).to.eql(true) + expect(result.reason).to.eql(pinTypes.direct) + }) + }) + }) + + it('when not pinned', function () { + return clearPins() + .then(() => pin._isPinnedWithType(pins.mercuryDir, pinTypes.direct)) + .then(pin => expect(pin.pinned).to.eql(false)) + }) + }) + + describe('add', function () { + beforeEach(function () { + return clearPins() + }) + + it('recursive', function () { + return pin.add(pins.root) + .then(() => { + const pinChecks = Object.values(pins) + .map(hash => expectPinned(hash)) + + return Promise.all(pinChecks) + }) + }) + + it('direct', function () { + return 
pin.add(pins.root, { recursive: false }) + .then(() => Promise.all([ + expectPinned(pins.root), + expectPinned(pins.solarWiki, false) + ])) + }) + + it('recursive pin parent of direct pin', function () { + return pin.add(pins.solarWiki, { recursive: false }) + .then(() => pin.add(pins.root)) + .then(() => Promise.all([ + // solarWiki is pinned both directly and indirectly o.O + expectPinned(pins.solarWiki, pinTypes.direct), + expectPinned(pins.solarWiki, pinTypes.indirect) + ])) + }) + + it('directly pinning a recursive pin fails', function () { + return pin.add(pins.root) + .then(() => pin.add(pins.root, { recursive: false })) + .catch(err => expect(err).to.match(/already pinned recursively/)) + }) + + it('can\'t pin item not in datastore', function () { + this.timeout(10 * 1000) + const falseHash = `${pins.root.slice(0, -2)}ss` + return expectTimeout(pin.add(falseHash), 4000) + }) + + // TODO block rm breaks subsequent tests + it.skip('needs all children in datastore to pin recursively', function () { + this.timeout(10 * 1000) + return ipfs.block.rm(pins.mercuryWiki) + .then(() => expectTimeout(pin.add(pins.root), 4000)) + }) + }) + + describe('ls', function () { + before(function () { + return clearPins() + .then(() => Promise.all([ + pin.add(pins.root), + pin.add(pins.mercuryDir, { recursive: false }) + ])) + }) + + it('lists pins of a particular hash', function () { + return pin.ls(pins.mercuryDir) + .then(out => expect(out[0].hash).to.eql(pins.mercuryDir)) + }) + + it('indirect pins supersede direct pins', function () { + return pin.ls() + .then(ls => { + const pinType = ls.find(out => out.hash === pins.mercuryDir).type + expect(pinType).to.eql(pinTypes.indirect) + }) + }) + + describe('list pins of type', function () { + it('all', function () { + return pin.ls() + .then(out => + expect(out).to.deep.include.members([ + { type: 'recursive', + hash: 'QmTAMavb995EHErSrKo7mB8dYkpaSJxu6ys1a6XJyB2sys' }, + { type: 'indirect', + hash: 
'QmTMbkDfvHwq3Aup6Nxqn3KKw9YnoKzcZvuArAfQ9GF3QG' }, + { type: 'indirect', + hash: 'QmbJCNKXJqVK8CzbjpNFz2YekHwh3CSHpBA86uqYg3sJ8q' }, + { type: 'indirect', + hash: 'QmVgSHAdMxFAuMP2JiMAYkB8pCWP1tcB9djqvq8GKAFiHi' } + ]) + ) + }) + + it('direct', function () { + return pin.ls({ type: 'direct' }) + .then(out => + expect(out).to.deep.include.members([ + { type: 'direct', + hash: 'QmbJCNKXJqVK8CzbjpNFz2YekHwh3CSHpBA86uqYg3sJ8q' } + ]) + ) + }) + + it('recursive', function () { + return pin.ls({ type: 'recursive' }) + .then(out => + expect(out).to.deep.include.members([ + { type: 'recursive', + hash: 'QmTAMavb995EHErSrKo7mB8dYkpaSJxu6ys1a6XJyB2sys' } + ]) + ) + }) + + it('indirect', function () { + return pin.ls({ type: 'indirect' }) + .then(out => + expect(out).to.deep.include.members([ + { type: 'indirect', + hash: 'QmTMbkDfvHwq3Aup6Nxqn3KKw9YnoKzcZvuArAfQ9GF3QG' }, + { type: 'indirect', + hash: 'QmbJCNKXJqVK8CzbjpNFz2YekHwh3CSHpBA86uqYg3sJ8q' }, + { type: 'indirect', + hash: 'QmVgSHAdMxFAuMP2JiMAYkB8pCWP1tcB9djqvq8GKAFiHi' } + ]) + ) + }) + }) + }) + + describe('rm', function () { + beforeEach(function () { + return clearPins() + .then(() => pin.add(pins.root)) + }) + + it('a recursive pin', function () { + return pin.rm(pins.root) + .then(() => { + return Promise.all([ + expectPinned(pins.root, false), + expectPinned(pins.mercuryWiki, false) + ]) + }) + }) + + it('a direct pin', function () { + return clearPins() + .then(() => pin.add(pins.mercuryDir, { recursive: false })) + .then(() => pin.rm(pins.mercuryDir)) + .then(() => expectPinned(pins.mercuryDir, false)) + }) + + it('fails to remove an indirect pin', function () { + return pin.rm(pins.solarWiki) + .catch(err => expect(err).to.match(/is pinned indirectly under/)) + .then(() => expectPinned(pins.solarWiki)) + }) + + it('fails when an item is not pinned', function () { + return pin.rm(pins.root) + .then(() => pin.rm(pins.root)) + .catch(err => expect(err).to.match(/is not pinned/)) + }) + }) + + 
describe('flush', function () { + beforeEach(function () { + return pin.add(pins.root) + }) + + it('flushes', function () { + return pin.ls() + .then(ls => expect(ls.length).to.eql(4)) + .then(() => { + // indirectly trigger a datastore flush by adding something + return clearPins() + .then(() => pin.add(pins.mercuryWiki)) + }) + .then(() => pin._load()) + .then(() => pin.ls()) + .then(ls => expect(ls.length).to.eql(1)) + }) + }) +}) diff --git a/test/core/utils.js b/test/core/utils.js new file mode 100644 index 0000000000..b5c84b15c1 --- /dev/null +++ b/test/core/utils.js @@ -0,0 +1,160 @@ +/* eslint max-nested-callbacks: ["error", 8] */ +/* eslint-env mocha */ +'use strict' + +const chai = require('chai') +const dirtyChai = require('dirty-chai') +const expect = chai.expect +chai.use(dirtyChai) + +const fs = require('fs') +const fromB58String = require('multihashes').fromB58String + +// This gets replaced by `create-repo-browser.js` in the browser +const createTempRepo = require('../utils/create-repo-nodejs.js') +const IPFS = require('../../src/core') +const utils = require('../../src/core/utils') + +describe('utils', () => { + const rootHash = 'QmTAMavb995EHErSrKo7mB8dYkpaSJxu6ys1a6XJyB2sys' + const rootPath = `/ipfs/${rootHash}` + const rootMultihash = fromB58String(rootHash) + const aboutHash = 'QmbJCNKXJqVK8CzbjpNFz2YekHwh3CSHpBA86uqYg3sJ8q' + const aboutPath = `${rootPath}/mercury` + const aboutMultihash = fromB58String(aboutHash) + + describe('parseIpfsPath', () => { + it('parses path with no links', function () { + expect(utils.parseIpfsPath(rootHash)) + .to.deep.equal({ + hash: rootHash, + links: [] + }) + }) + + it('parses path with links', function () { + expect(utils.parseIpfsPath(`${rootHash}/docs/index`)) + .to.deep.equal({ + hash: rootHash, + links: ['docs', 'index'] + }) + }) + + it('parses path with /ipfs/ prefix', function () { + expect(utils.parseIpfsPath(`/ipfs/${rootHash}/about`)) + .to.deep.equal({ + hash: rootHash, + links: ['about'] + }) + 
}) + + it('parses path with leading and trailing slashes', function () { + expect(utils.parseIpfsPath(`/${rootHash}/`)) + .to.deep.equal({ + hash: rootHash, + links: [] + }) + }) + + it('parses non sha2-256 paths', function () { + // There are many, many hashing algorithms. Just one should be a sufficient + // indicator. Used go-ipfs@0.4.13 `add --hash=keccak-512` to generate + const keccak512 = 'zB7S6ZdcqsTqvNhBpx3SbFTocRpAUHj1w9WQXQGyWBVEsLStNfaaNtsdFUQbRk4tYPZvnpGbtDN5gEH4uVzUwsFyJh9Ei' + expect(utils.parseIpfsPath(keccak512)) + .to.deep.equal({ + hash: keccak512, + links: [] + }) + }) + + it('returns error for malformed path', function () { + const fn = () => utils.parseIpfsPath(`${rootHash}//about`) + expect(fn).to.throw('invalid ipfs ref path') + }) + + it('returns error if root is not a valid sha2-256 multihash', function () { + const fn = () => utils.parseIpfsPath('invalid/ipfs/path') + expect(fn).to.throw('invalid ipfs ref path') + }) + }) + + describe('resolvePath', function () { + this.timeout(80 * 1000) + const fixtures = [ + 'test/fixtures/planets/mercury/wiki.md', + 'test/fixtures/planets/solar-system.md' + ].map(path => ({ + path, + content: fs.readFileSync(path) + })) + + let node + let repo + + before(done => { + repo = createTempRepo() + node = new IPFS({ + repo: repo + }) + node.once('ready', () => node.files.add(fixtures, done)) + }) + + after(done => node.stop(done)) + + it('handles base58 hash format', (done) => { + utils.resolvePath(node.object, rootHash, (err, hashes) => { + expect(err).to.not.exist() + expect(hashes.length).to.equal(1) + expect(hashes[0]).to.deep.equal(rootMultihash) + done() + }) + }) + + it('handles multihash format', (done) => { + utils.resolvePath(node.object, aboutMultihash, (err, hashes) => { + expect(err).to.not.exist() + expect(hashes.length).to.equal(1) + expect(hashes[0]).to.deep.equal(aboutMultihash) + done() + }) + }) + + it('handles ipfs paths format', function (done) { + this.timeout(200 * 1000) + 
utils.resolvePath(node.object, aboutPath, (err, hashes) => { + expect(err).to.not.exist() + expect(hashes.length).to.equal(1) + expect(hashes[0]).to.deep.equal(aboutMultihash) + done() + }) + }) + + it('handles an array', (done) => { + const paths = [rootHash, rootPath, rootMultihash] + utils.resolvePath(node.object, paths, (err, hashes) => { + expect(err).to.not.exist() + expect(hashes.length).to.equal(3) + expect(hashes[0]).to.deep.equal(rootMultihash) + expect(hashes[1]).to.deep.equal(rootMultihash) + expect(hashes[2]).to.deep.equal(rootMultihash) + done() + }) + }) + + it('should error on invalid hashes', function (done) { + utils.resolvePath(node.object, '/ipfs/asdlkjahsdfkjahsdfd', err => { + expect(err).to.exist() + done() + }) + }) + + it(`should error when a link doesn't exist`, function (done) { + utils.resolvePath(node.object, `${aboutPath}/fusion`, err => { + expect(err.message).to.include( + `no link named "fusion" under QmbJCNKXJqVK8CzbjpNFz2YekHwh3CSHpBA86uqYg3sJ8q` + ) + done() + }) + }) + }) +}) diff --git a/test/fixtures/planets/mercury/wiki.md b/test/fixtures/planets/mercury/wiki.md new file mode 100644 index 0000000000..1b4039ba80 --- /dev/null +++ b/test/fixtures/planets/mercury/wiki.md @@ -0,0 +1,12 @@ +# Mercury (planet) +> From Wikipedia, the free encyclopedia + +Mercury is the smallest and innermost planet in the Solar System. Its orbital period around the Sun of 87.97 days is the shortest of all the planets in the Solar System. It is named after the Roman deity Mercury, the messenger of the gods. + +Like Venus, Mercury orbits the Sun within Earth's orbit as an inferior planet, and never exceeds 28° away from the Sun. When viewed from Earth, this proximity to the Sun means the planet can only be seen near the western or eastern horizon during the early evening or early morning. At this time it may appear as a bright star-like object, but is often far more difficult to observe than Venus. 
The planet telescopically displays the complete range of phases, similar to Venus and the Moon, as it moves in its inner orbit relative to Earth, which reoccurs over the so-called synodic period approximately every 116 days. + +Mercury is gravitationally locked with the Sun in a 3:2 spin-orbit resonance, and rotates in a way that is unique in the Solar System. As seen relative to the fixed stars, it rotates on its axis exactly three times for every two revolutions it makes around the Sun. As seen from the Sun, in a frame of reference that rotates with the orbital motion, it appears to rotate only once every two Mercurian years. An observer on Mercury would therefore see only one day every two years. + +Mercury's axis has the smallest tilt of any of the Solar System's planets (about ​1⁄30 degree). Its orbital eccentricity is the largest of all known planets in the Solar System; at perihelion, Mercury's distance from the Sun is only about two-thirds (or 66%) of its distance at aphelion. Mercury's surface appears heavily cratered and is similar in appearance to the Moon's, indicating that it has been geologically inactive for billions of years. Having almost no atmosphere to retain heat, it has surface temperatures that vary diurnally more than on any other planet in the Solar System, ranging from 100 K (−173 °C; −280 °F) at night to 700 K (427 °C; 800 °F) during the day across the equatorial regions. The polar regions are constantly below 180 K (−93 °C; −136 °F). The planet has no known natural satellites. + +Two spacecraft have visited Mercury: Mariner 10 flew by in 1974 and 1975; and MESSENGER, launched in 2004, orbited Mercury over 4,000 times in four years before exhausting its fuel and crashing into the planet's surface on April 30, 2015. 
diff --git a/test/fixtures/planets/solar-system.md b/test/fixtures/planets/solar-system.md new file mode 100644 index 0000000000..f249cd3a53 --- /dev/null +++ b/test/fixtures/planets/solar-system.md @@ -0,0 +1,10 @@ +# Solar System +> From Wikipedia, the free encyclopedia + +The Solar System is the gravitationally bound system comprising the Sun and the objects that orbit it, either directly or indirectly. Of those objects that orbit the Sun directly, the largest eight are the planets, with the remainder being smaller objects, such as dwarf planets and small Solar System bodies. Of the objects that orbit the Sun indirectly, the moons, two are larger than the smallest planet, Mercury. + +The Solar System formed 4.6 billion years ago from the gravitational collapse of a giant interstellar molecular cloud. The vast majority of the system's mass is in the Sun, with the majority of the remaining mass contained in Jupiter. The four smaller inner planets, Mercury, Venus, Earth and Mars, are terrestrial planets, being primarily composed of rock and metal. The four outer planets are giant planets, being substantially more massive than the terrestrials. The two largest, Jupiter and Saturn, are gas giants, being composed mainly of hydrogen and helium; the two outermost planets, Uranus and Neptune, are ice giants, being composed mostly of substances with relatively high melting points compared with hydrogen and helium, called volatiles, such as water, ammonia and methane. All eight planets have almost circular orbits that lie within a nearly flat disc called the ecliptic. + +The Solar System also contains smaller objects. The asteroid belt, which lies between the orbits of Mars and Jupiter, mostly contains objects composed, like the terrestrial planets, of rock and metal. Beyond Neptune's orbit lie the Kuiper belt and scattered disc, which are populations of trans-Neptunian objects composed mostly of ices, and beyond them a newly discovered population of sednoids. 
Within these populations are several dozen to possibly tens of thousands of objects large enough that they have been rounded by their own gravity. Such objects are categorized as dwarf planets. Identified dwarf planets include the asteroid Ceres and the trans-Neptunian objects Pluto and Eris. In addition to these two regions, various other small-body populations, including comets, centaurs and interplanetary dust clouds, freely travel between regions. Six of the planets, at least four of the dwarf planets, and many of the smaller bodies are orbited by natural satellites, usually termed "moons" after the Moon. Each of the outer planets is encircled by planetary rings of dust and other small objects. + +The solar wind, a stream of charged particles flowing outwards from the Sun, creates a bubble-like region in the interstellar medium known as the heliosphere. The heliopause is the point at which pressure from the solar wind is equal to the opposing pressure of the interstellar medium; it extends out to the edge of the scattered disc. The Oort cloud, which is thought to be the source for long-period comets, may also exist at a distance roughly a thousand times further than the heliosphere. The Solar System is located in the Orion Arm, 26,000 light-years from the center of the Milky Way. 
diff --git a/test/http-api/inject/pin.js b/test/http-api/inject/pin.js new file mode 100644 index 0000000000..72d9374e14 --- /dev/null +++ b/test/http-api/inject/pin.js @@ -0,0 +1,174 @@ +/* eslint-env mocha */ +/* eslint max-nested-callbacks: ["error", 8] */ +'use strict' + +const expect = require('chai').expect + +// We use existing pin structure in the go-ipfs-repo fixture +// so that we don't have to stream a bunch of object/put operations +// This is suitable because these tests target the functionality +// of the /pin endpoints and don't delve into the pin core +// +// fixture's pins: +// - root1 +// - c1 +// - c2 +// - c3 +// - c4 +// - c5 +// - c6 +// - root2 + +const pins = { + root1: 'QmVtU7ths96fMgZ8YSZAbKghyieq7AjxNdcqyVzxTt3qVe', + c1: 'QmZTR5bcpQD7cFgTorqxZDYaew1Wqgfbd2ud9QqGPAkK2V', + c2: 'QmYCvbfNbCwFR45HiNP45rwJgvatpiW38D961L5qAhUM5Y', + c3: 'QmY5heUM5qgRubMDD1og9fhCPA6QdkMp3QCwd4s7gJsyE7', + c4: 'QmUzLxaXnM8RYCPEqLDX5foToi5aNZHqfYr285w2BKhkft', + c5: 'QmPZ9gcCEpqKTo6aq61g2nXGUhM4iCL3ewB6LDXZCtioEB', + c6: 'QmTumTjvcYCAvRRwQ8sDRxh8ezmrcr88YFU7iYNroGGTBZ', + root2: 'QmUNLLsPACCz1vLxQVkXqqLX5R1X345qqfHbsf67hvA3Nn' +} + +module.exports = (http) => { + describe('pin', () => { + let api + + before(() => { + api = http.api.server.select('API') + }) + + describe('rm', () => { + it('fails on invalid args', done => { + api.inject({ + method: 'POST', + url: `/api/v0/pin/rm?arg=invalid` + }, res => { + expect(res.statusCode).to.equal(500) + expect(res.result.Message).to.match(/invalid ipfs ref path/) + done() + }) + }) + + it('unpins recursive pins', done => { + api.inject({ + method: 'POST', + url: `/api/v0/pin/rm?arg=${pins.root1}` + }, (res) => { + expect(res.statusCode).to.equal(200) + expect(res.result.Pins).to.deep.eql([pins.root1]) + done() + }) + }) + + it('unpins direct pins', done => { + api.inject({ + method: 'POST', + url: `/api/v0/pin/add?arg=${pins.root1}&recursive=false` + }, res => { + expect(res.statusCode).to.equal(200) + api.inject({ + 
method: 'POST', + url: `/api/v0/pin/rm?arg=${pins.root1}&recursive=false` + }, (res) => { + expect(res.statusCode).to.equal(200) + expect(res.result.Pins).to.deep.eql([pins.root1]) + done() + }) + }) + }) + }) + + describe('add', () => { + it('fails on invalid args', done => { + api.inject({ + method: 'POST', + url: `/api/v0/pin/add?arg=invalid` + }, res => { + expect(res.statusCode).to.equal(500) + expect(res.result.Message).to.match(/invalid ipfs ref path/) + done() + }) + }) + + it('recursively', done => { + api.inject({ + method: 'POST', + url: `/api/v0/pin/add?arg=${pins.root1}` + }, (res) => { + expect(res.statusCode).to.equal(200) + expect(res.result.Pins).to.deep.eql([pins.root1]) + done() + }) + }) + + it('directly', done => { + api.inject({ + method: 'POST', + url: `/api/v0/pin/add?arg=${pins.root1}&recursive=false` + }, (res) => { + // directly pinning a node that is already recursively pinned + // should error, which verifies that the endpoint parses + // the recursive arg correctly. 
+ expect(res.statusCode).to.equal(500) + expect(res.result.Message).to.match(/already pinned recursively/) + done() + }) + }) + }) + + describe('ls', () => { + it('fails on invalid args', done => { + api.inject({ + method: 'GET', + url: `/api/v0/pin/ls?arg=invalid` + }, res => { + expect(res.statusCode).to.equal(500) + expect(res.result.Message).to.match(/invalid ipfs ref path/) + done() + }) + }) + + it('finds all pinned objects', done => { + api.inject({ + method: 'GET', + url: '/api/v0/pin/ls' + }, (res) => { + expect(res.statusCode).to.equal(200) + expect(res.result.Keys).to.have.all.keys(Object.values(pins)) + done() + }) + }) + + it('finds specific pinned objects', done => { + api.inject({ + method: 'GET', + url: `/api/v0/pin/ls?arg=${pins.c1}` + }, (res) => { + expect(res.statusCode).to.equal(200) + expect(res.result.Keys[pins.c1].Type) + .to.equal(`indirect through ${pins.root1}`) + done() + }) + }) + + it('finds pins of type', done => { + api.inject({ + method: 'GET', + url: `/api/v0/pin/ls?type=recursive` + }, (res) => { + expect(res.statusCode).to.equal(200) + expect(res.result.Keys).to.deep.eql({ + QmUNLLsPACCz1vLxQVkXqqLX5R1X345qqfHbsf67hvA3Nn: { + Type: 'recursive' + }, + QmVtU7ths96fMgZ8YSZAbKghyieq7AjxNdcqyVzxTt3qVe: { + Type: 'recursive' + } + }) + done() + }) + }) + }) + }) +} diff --git a/test/utils/expect-timeout.js b/test/utils/expect-timeout.js new file mode 100644 index 0000000000..51c7330755 --- /dev/null +++ b/test/utils/expect-timeout.js @@ -0,0 +1,16 @@ +'use strict' + +/** + * Resolves if `promise` is still pending after `ms` milliseconds; + * rejects if it settles first. + * @param {Promise} promise promise that you expect to hang + * @param {Number} ms millis to wait + * @return {Promise} + */ +module.exports = (promise, ms) => { + return Promise.race([ + promise.then(() => { + throw new Error('Expected Promise to timeout but it was successful.') + }), + new Promise((resolve) => setTimeout(resolve, ms)) + ]) +}
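For reference, the race pattern behind `test/utils/expect-timeout.js` can be exercised standalone in Node. This is a minimal sketch, not part of the patch: the `pending` promise and the 50 ms window are illustrative stand-ins for a hanging `pin.add(falseHash)` call and the real 4000 ms budget used in the tests.

```javascript
'use strict'

// Race a promise you expect to hang against a timer: if the timer wins,
// the expectation holds; if the promise settles first, fail loudly.
const expectTimeout = (promise, ms) => {
  return Promise.race([
    promise.then(() => {
      throw new Error('Expected Promise to timeout but it was successful.')
    }),
    new Promise((resolve) => setTimeout(resolve, ms))
  ])
}

// Stands in for a call like `pin.add(falseHash)` that never settles
// because the block is not in the datastore.
const pending = new Promise(() => {})

expectTimeout(pending, 50)
  .then(() => console.log('still pending after 50ms, as expected'))

// A promise that fulfils immediately makes expectTimeout reject instead.
expectTimeout(Promise.resolve('oops'), 50)
  .catch((err) => console.log(err.message))
```

Because `Promise.race` settles with whichever branch finishes first, a rejecting `promise` also rejects the race; the helper only distinguishes "still pending at `ms`" from "settled early".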