This repository has been archived by the owner on Apr 16, 2021. It is now read-only.

3 blog posts #10

Merged · 8 commits · Dec 10, 2015
37 changes: 18 additions & 19 deletions build.js
@@ -4,14 +4,13 @@ var Metalsmith = require('metalsmith')
var debug = require('metalsmith-debug')
var templates = require('metalsmith-templates')
var collections = require('metalsmith-collections')
var partial = require('metalsmith-partial')
var msstatic = require('metalsmith-static')
var serve = require('metalsmith-serve')
var watch = require('metalsmith-watch')
var markdown = require('metalsmith-markdown')
var headingsidentifier = require("metalsmith-headings-identifier");
var permalinks = require('metalsmith-permalinks')
var feed = require('metalsmith-feed')
var msstatic = require('metalsmith-static')
var headingsidentifier = require('metalsmith-headings-identifier')

var nunjucks = require('nunjucks')
var njmd = require('nunjucks-markdown')
@@ -21,13 +20,13 @@ var marked = require('marked')
marked.setOptions({
gfm: true,
tables: true,
smartLists: true,
smartLists: true
})

njenv = nunjucks.configure()
var njenv = nunjucks.configure()
njmd.register(njenv, marked)

njdate.setDefaultFormat('YYYY-MM-DD, h:mm:ss a');
njdate.setDefaultFormat('YYYY-MM-DD, h:mm:ss a')
njdate.install(njenv)

njenv.addFilter('dump', JSON.stringify)
@@ -36,30 +35,30 @@ Metalsmith(__dirname)
.use(debug())
.metadata({
site: {
title: "IPFS Blog",
url: "http://ipfs.io/blog/",
author: "The IPFS Team",
},
title: 'IPFS Blog',
url: 'http://ipfs.io/blog/',
author: 'The IPFS Team'
}
})
.use(collections({
posts: {},
posts: {}
}))
.use(markdown())
.use(templates({ "directory": ".", "engine": "nunjucks", "inPlace": true }))
.use(templates({ "directory": ".", "engine": "nunjucks" }))
.use(templates({ 'directory': '.', 'engine': 'nunjucks', 'inPlace': true }))
.use(templates({ 'directory': '.', 'engine': 'nunjucks' }))
.use(headingsidentifier())
.use(permalinks())
.use(feed({"collection": "posts"}))
.use(msstatic({"src": "tmpl/static", "dest": "static"}))
.use(feed({'collection': 'posts'}))
.use(msstatic({'src': 'tmpl/static', 'dest': 'static'}))
.use(serve({
"port": 8081,
"verbose": true
'port': 8081,
'verbose': true
}))
.use(watch())
.destination('./build')
.build(function(err){
.build(function (err) {
if (err) {
console.log(err)
throw err;
throw err
}
})
86 changes: 86 additions & 0 deletions src/2-idempotent-encoder-decoder/index.md
@@ -0,0 +1,86 @@
---
date: 2015-09-07
id: 2-idempotent-encoder-decoder
template: tmpl/layouts/post.html
baseurl: ..
breadcrumbs:
- {name: "2-idempotent-encoder-decoder", link: "./" }
tags: codec
title: Idempotent encoders and decoders
author: David Dias
collection: posts
---

Encoding and decoding data is a practice that has been with us since we started storing and transmitting information -- even before the Information Age. Encoding and decoding with a specific algorithm, or as we usually call it, a `codec`, has several benefits: reduced storage space, faster transmission over a selected transport, and the ability to capture data in a specific system-readable form, among others. However, not every codec offers the same features, and it is typically wise to use the codec best suited to the type of data we are dealing with.

A deterministic and idempotent encoding and decoding process should, once the decode function is applied, return exactly the data that was fed to the encode function. In essence, something like:

```JavaScript
function isIdempotent(codec, val) {
return val === codec.decode(codec.encode(val))
}
// should always return true, for all encodable values val
```

Having deterministic and idempotent codecs is not always possible. Take sound and video, for example. Capturing a continuous signal would require infinite storage, as the segment between any two points in a continuum contains an infinite number of other points. So, in order to capture this, we have to encode 'samples' of the input signal rather than the signal as a whole. This loses information and is known as lossy compression. Keep in mind that some sound and video codecs are described as (perceptually) lossless simply because their deviations from the original signal are not significant enough to be relevant.
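The sampling point above can be demonstrated with a toy lossy codec (a sketch purely for illustration, not a real audio codec): quantizing a number to one decimal place discards information, so the round trip is not idempotent for all inputs.

```JavaScript
// A toy lossy 'codec': encode quantizes to one decimal place,
// so information is discarded and the round trip is not idempotent.
var lossyCodec = {
  encode: function (val) { return Math.round(val * 10) },
  decode: function (encoded) { return encoded / 10 }
}

function isIdempotent (codec, val) {
  return val === codec.decode(codec.encode(val))
}

console.log(isIdempotent(lossyCodec, 0.1))   // true -- this sample survives
console.log(isIdempotent(lossyCodec, 0.123)) // false -- precision was lost
```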

The expectation is that, for discrete signals and finite sets of data, codecs should always be deterministic and idempotent. Unfortunately, this is not always true with the serializers and deserializers offered by the languages we have access to today.

I was reminded recently that this expectation can be violated, when I needed to rename some keys in a JavaScript object for a Linked Data expander function. My first quick (and hacky) solution was to encode the object to a String and, in that format, apply a Regular Expression to change all occurrences of a given key.

```JavaScript
function remapKeys (key, newKey, obj) {
  // so hacky: round-trip through JSON and regex-replace the key
  var strObj = JSON.stringify(obj)
  strObj = strObj.replace(new RegExp(key, 'g'), newKey)
  return JSON.parse(strObj)
}
```

This solution worked 'fine' for a while, until I found a case where the data the function returned started to look different, even though I was just remapping the keys. Since the Buffer type is not part of the JSON spec (it was added later by Node.js), there is no standardised way to express it in the JSON format. So, Node.js changes the data if `decoding(encoding(objWithABuffer))` is applied.

```JavaScript
» node
> var obj = { buf: new Buffer('aaaah the data')}
undefined
> obj
{ buf: <Buffer 61 61 61 61 68 20 74 68 65 20 64 61 74 61> }
> JSON.stringify(obj)
'{"buf":{"type":"Buffer","data":[97,97,97,97,104,32,116,104,101,32,100,97,116,97]}}'
> JSON.parse(JSON.stringify(obj))
{ buf:
{ type: 'Buffer',
data: [ 97, 97, 97, 97, 104, 32, 116, 104, 101, 32, 100, 97, 116, 97 ] } }
```

As we can see in the example above, there is a mutation of the data and a violation of our deterministic and idempotent codec expectation.
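One way to repair this particular round trip (a sketch of a workaround, not something the JSON spec itself provides) is to pass a reviver to `JSON.parse` that recognises the `{type: 'Buffer', data: [...]}` shape Node.js emits and turns it back into a Buffer:

```JavaScript
// Restore Buffers during JSON.parse with a reviver -- an illustrative
// workaround for the mutation shown above.
function parseWithBuffers (str) {
  return JSON.parse(str, function (key, val) {
    if (val && val.type === 'Buffer' && Array.isArray(val.data)) {
      return new Buffer(val.data)
    }
    return val
  })
}

var obj = { buf: new Buffer('aaaah the data') }
var restored = parseWithBuffers(JSON.stringify(obj))
console.log(Buffer.isBuffer(restored.buf)) // true
console.log(restored.buf.toString())       // 'aaaah the data'
```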

Fortunately, I was able to solve my problem with a proper solution, one that works around the encoding/decoding problem and is also a more elegant way to remap keys.

```JavaScript
var _ = require('lodash')

function remapKeys (obj, keyMap) {
return _.reduce(obj, remap, {})

function remap (newObj, val, oldKey) {
var newKey
if (keyMap[oldKey]) {
newKey = keyMap[oldKey]
} else {
newKey = oldKey
}

if (val instanceof Object && !Buffer.isBuffer(val)) {
newObj[newKey] = _.reduce(val, remap, {})
} else {
newObj[newKey] = val
}
return newObj
}
}
```

You can find this code available as an npm module [remap-keys](https://www.npmjs.com/package/remap-keys).
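For reference, here is a dependency-free sketch of the same idea (the version above uses lodash; this one substitutes `Object.keys` purely for illustration), together with an example call:

```JavaScript
// remapKeys without lodash -- same recursion, plain Object.keys.
// Buffers are passed through untouched, which is the whole point.
function remapKeys (obj, keyMap) {
  return Object.keys(obj).reduce(function (newObj, oldKey) {
    var newKey = keyMap[oldKey] || oldKey
    var val = obj[oldKey]
    if (val instanceof Object && !Buffer.isBuffer(val)) {
      newObj[newKey] = remapKeys(val, keyMap)
    } else {
      newObj[newKey] = val
    }
    return newObj
  }, {})
}

var input = { oldName: 1, nested: { oldName: 2 }, buf: new Buffer('hi') }
console.log(remapKeys(input, { oldName: 'newName' }))
// { newName: 1, nested: { newName: 2 }, buf: <Buffer 68 69> }
```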

Unfortunately, it is very hard to revert these decisions, since doing so would break current developers' expectations. A more recent example of an encoder/decoder that doesn't meet this expectation is [node-cbor](https://www.npmjs.com/package/cbor). In this case, the decode function returns the decoded objects inside an Array, even if the encoded data was a single object. If you would like to join the discussion on a good way to fix this (if you consider it needs changing), visit https://github.com/hildjj/node-cbor/issues/21.

In summary, I hope this post helped show that we cannot always assume our data won't be mangled once it traverses an encoding -> decoding routine, even though we should strive to build codecs that are deterministic and idempotent for discrete sets of data.
104 changes: 104 additions & 0 deletions src/3-ipscend/index.md
@@ -0,0 +1,104 @@
---
date: 2015-11-27
id: 3-ipscend
template: tmpl/layouts/post.html
baseurl: ..
breadcrumbs:
- {name: "3-ipscend", link: "./" }
tags: codec
title: ipscend - Publish static web content to IPFS
author: David Dias
collection: posts
---

[![](/img/ipscend.png)](https://github.com/diasdavid/ipscend)

[`ipscend`](https://github.com/diasdavid/ipscend) is a new tool that helps developers publish their static web content to IPFS and share it easily, while keeping version history and more. It is heavily inspired by earlier static web content publishing tools, like GitHub Pages and surge.

## Features

Currently, `ipscend` offers a set of features, accessible through a CLI and installable through npm (`npm i -g ipscend`), which enables a simple workflow for working on your Web page/app and publishing it to the IPFS network.

- `ipscend browse` - Opens the last published version of your application in the browser.
- `ipscend init` - Initializes your project. Asks for the folder where the web application will be available and stores an `ipscend.json` object in your current path, holding all the metadata it generates, such as published versions and screenshots taken.
- `ipscend preview` - Serves your application on a local static file server, so that you can try it out before you feel ready to publish it.
- `ipscend publish` - Publishes the current state of your application to IPFS and stores a reference to it.
- `ipscend screenshot` - Opens a screenshot preview of all the published versions of your app. In order to generate the screenshots, you must first run `ipscend screenshot --gen`.
![](http://zippy.gfycat.com/TameDampKob.gif)
- `ipscend versions` - Prints out the published versions of the app with their respective timestamps.

An `ipscend.json` object for a project with some published versions will look like:

```bash
$ cat ipscend.json
{
"versions": [
{
"hash": "QmQhNMwk7fThpwRNUR3bStb1eA7aeFJaApZjDaQCnykowL",
"timestamp": "2015-11-14T14:50:10.998Z",
"snapshot": "QmNMqiKZG7gCsnaQFTqG3AUeVhA1n8byy974Yqn3qRGZcJ"
},
{
"hash": "QmSAmgQPCWjbrpbYHZQ2rVkH7a9vavubG1Jzv5CjDWrUmt",
"timestamp": "2015-11-14T14:51:00.860Z",
"snapshot": "QmcCNrn72FuHWkXtpJuUYfbH87d61qa6PSagUbLiK6VfLJ"
},
{
"hash": "QmVNgdUoBQHiBhSeDe2z8LttJaDZq7JZi17sR1SPnJmjMh",
"timestamp": "2015-11-14T14:51:24.379Z",
"snapshot": "QmP5NuGdozeWaZqEdY1zBpcupB6qQ66AWReqm4L2vJzt73"
}
],
"path": "src/public"
}
```

## Workflow

To get started, all you need to do is initialize ipscend in your web page/app project:

```bash
$ ipscend init
This utility will walk you through creating a ipscend.json file.
Path of your Web Application (project)? (public) src/public
$ cat ipscend.json
{
"versions": [],
"path": "src/public"
}
```

Once you are ready to publish, run the `ipscend publish` command:

```bash
$ ipscend publish
{ hash: 'QmVNgdUoBQHiBhSeDe2z8LttJaDZq7JZi17sR1SPnJmjMh',
timestamp: Fri Nov 27 2015 10:23:37 GMT+0000 (WET) }
published src/public QmVNgdUoBQHiBhSeDe2z8LttJaDZq7JZi17sR1SPnJmjMh
$ cat ipscend.json
{
"versions": [
{
"hash": "QmVNgdUoBQHiBhSeDe2z8LttJaDZq7JZi17sR1SPnJmjMh",
"timestamp": "2015-11-27T16:23:37.971Z"
}
],
"path": "src/public"
}
```

Grab that hash and share it with your friends by sending them a link to ipfs.io with "/ipfs/Hash" appended (e.g. https://ipfs.io/ipfs/QmVNgdUoBQHiBhSeDe2z8LttJaDZq7JZi17sR1SPnJmjMh).

If you want to serve your page from IPFS under your own awesome.domain.com, you can check how at https://github.com/diasdavid/ipscend#use-ipfs-to-host-your-webpage-using-a-standard-domain-includes-cool-dns-trick.

## Awesome (FUTURE)!

`ipscend` is still in its humble beginnings. Some of the ideas and plans for the future include being able to:

- Extract the version from the VCS itself (https://github.com/ipfs/notes/issues/23), so that every commit can be a different working version, allowing you to test every commit in your CI.
- Roll back history, following the 'time machine' analogy.
- Update your DNS provider automatically, avoiding having to use an external tool like [`dnslink-deploy`](https://github.com/ipfs/dnslink-deploy).
- Enable reviewers to write notes.
- Take screenshots in every browser version, so that we can use the timeline to see if there was any regression at a point in time, which might happen in a specific browser.
- moaaaar :D If you have ideas or want to contribute, ipscend is fully MIT licensed, so feel free to open an issue or PR at https://github.com/diasdavid/ipscend.

A big thank you to [Andrés Gutgon](https://github.com/andresgutgon), who made the screenshot preview look [really good](https://github.com/diasdavid/ipscend-screenshot-visualizer/pull/1).
121 changes: 121 additions & 0 deletions src/4-registry-mirror/index.md
@@ -0,0 +1,121 @@
---
date: 2015-12-08
id: 4-registry-mirror
template: tmpl/layouts/post.html
baseurl: ..
breadcrumbs:
- {name: "4-registry-mirror", link: "./" }
tags: modules
title: Stellar Module Management - Install your Node.js modules using IPFS
author: David Dias
collection: posts
---

![](/img/node-interactive-logo.png)

Node.js Interactive, the first Node.js conference organized by the Linux Foundation, happened on Dec 8-9, 2015. There were hundreds of participants and dozens of really amazing talks divided into three tracks: backend, frontend, and IoT.

I was fortunate to attend and present a project we've been developing at [Protocol Labs](https://ipn.io) that builds on top of [IPFS, the InterPlanetary FileSystem](https://ipfs.io).

You can learn about that project in this blog post, check out the [talk slides](http://www.slideshare.net/DavidDias11/nodejs-interactive), or wait for the video recording of the talk; I will update this blog post when it becomes available.

## Enter registry-mirror

![](/img/enter-registry-mirror.png)
[![](https://img.shields.io/badge/made%20by-Protocol%20Labs-blue.svg?style=flat-square)](http://ipn.io) [![](https://img.shields.io/badge/project-IPFS-blue.svg?style=flat-square)](http://ipfs.io/) [![](https://img.shields.io/badge/freenode-%23ipfs-blue.svg?style=flat-square)](http://webchat.freenode.net/?channels=%23ipfs)

`registry-mirror` enables distributed discovery of npm modules by fetching and caching the latest state of npm through IPNS, the InterPlanetary Naming System. With this state, a node in the network is capable of querying the IPFS network for an npm module's cryptographic hash and fetching it from any peer that has it available.

`registry-mirror` is open source, MIT licensed and available at [github.com/diasdavid/registry-mirror](https://github.com/diasdavid/registry-mirror).

## Getting started

To get started, first make sure you are running IPFS 0.4.0. It is not yet released, but you can already use it by compiling from source or downloading a pre-built binary.

#### Compiling from source

You can find a tutorial on how to compile and install IPFS from source at [https://github.com/ipfs/go-ipfs#build-from-source](https://github.com/ipfs/go-ipfs#build-from-source). Just make sure to change to the `dev0.4.0` branch, as 0.4.0 isn't released yet.

Please make sure you have Go 1.5.2 or above installed.

#### Downloading pre-built Binary

Download the pre-built binary for your OS and architecture from [gobuilder](https://gobuilder.me/github.com/ipfs/go-ipfs/cmd/ipfs?branch=v0.4.0-dev).

#### Installing and running registry-mirror

Once you have IPFS 0.4.0 available, install registry-mirror by running the following command (you should have Node.js 4 and npm 2 or above available):

```bash
$ npm i registry-mirror -g
# ...
```

Then start your IPFS daemon:

```bash
$ ipfs daemon
Initializing daemon...
Swarm listening on /ip4/127.0.0.1/tcp/4001
Swarm listening on /ip4/172.19.248.69/tcp/4001
Swarm listening on /ip6/::1/tcp/4001
API server listening on /ip4/127.0.0.1/tcp/5001
Gateway (readonly) server listening on /ip4/127.0.0.1/tcp/8080
Daemon is ready
```

Next, run the registry-mirror daemon with the `--ipfs` option:

```bash
$ registry-mirror daemon --ipfs --port=9595
IPFS mode ON
registry-mirror [info] output dir: /npm-registry/
registry-mirror [info] listening:127.0.0.1:9595
registry-mirror [info] Updated /npm-registry to: /ipfs/QmSjG9fadu4mPdtRsQYtXhwwCBouFEPiYHtVf8f4iH6vwj
```

Now, to install a module using IPFS, you only need to point npm at this local registry when running `npm install`. This can be done through [config](https://docs.npmjs.com/cli/config) or a command-line argument:

```bash
$ npm i bignumber --registry=http://localhost:9595
npm http request GET http://localhost:9595/bignumber
npm http 200 http://localhost:9595/bignumber
npm http fetch GET http://localhost:9595/bignumber/-/bignumber-1.1.0.tgz
npm http fetch 200 http://localhost:9595/bignumber/-/bignumber-1.1.0.tgz
/Users/david/Documents/code/ipfs/ip-npm/node-interactive
└── bignumber@1.1.0
```

## Features

`registry-mirror` itself is quite a simple application, as most of the heavy lifting is done by [IPFS](https://ipfs.io). IPFS's distributed nature affords a set of really nice features as a transport layer that `registry-mirror` leverages to create its service.

#### Find where the module lives without having to hit the backbone

With `registry-mirror`, a registry becomes a curated list of hashes. The modules themselves live in the network; as soon as `registry-mirror` caches this list locally (fetched from the IPFS network), it knows the hashes of the modules a user might need in the future. With this list, a user doesn't need to know a module's whereabouts until it is time to request it from the network.
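The idea can be sketched in a few lines (hypothetical data shape and hash, purely for illustration -- not registry-mirror's actual on-disk format): once the cached registry is a name-to-hash map, resolving a module is a local lookup.

```JavaScript
// Hypothetical cached registry: module name -> content hash.
// (The hash below is made up for illustration.)
var cachedRegistry = {
  bignumber: 'QmHashOfBignumberIllustrative'
}

// Resolving a module touches no central server -- only the local cache;
// the content itself is then fetched from whichever peer has it.
function resolve (name) {
  var hash = cachedRegistry[name]
  if (!hash) throw new Error('module not in cached registry: ' + name)
  return '/ipfs/' + hash
}

console.log(resolve('bignumber')) // '/ipfs/QmHashOfBignumberIllustrative'
```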

This list is fetched and kept up to date through IPNS, and since IPNS records are signed and validated, there is no way for the list to be compromised (except in the case where the publisher's private key is compromised).

#### Work offline/disconnected

Just like git, `registry-mirror` is able to work offline and/or in a disconnected scenario. As long as the module you are looking for exists in the part of the network you are currently in, IPFS will be able to find it through Peer and Content Routing (with a DHT).

#### Enable several registries to coexist

Once a registry is just a curated list of available modules, enabling more than one registry to coexist becomes simpler. This scenario is especially interesting for private networks, such as those within companies and organizations that don't want their modules to be publicly known and available.

#### Run only what you were looking for

Through cryptographic hashing -- the IPFS strategy for finding, delivering, and verifying content -- you can always be sure that what you are running is what you were looking for.

#### Faster

By leveraging local and network caches efficiently, downloading your dependencies can be much faster, as it avoids hitting npm's servers or CDN every time. This can be crucial on high-latency networks or in more remote areas.

## Demo Video

<iframe src="https://player.vimeo.com/video/147968322" width="500" height="281" frameborder="0" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe> <p><a href="https://vimeo.com/147968322">registry-mirror demo</a> from <a href="https://vimeo.com/daviddias">David Dias</a> on <a href="https://vimeo.com">Vimeo</a>.</p>

## A special thanks

A very big thank you goes to [Bryan English](https://github.com/bengl) and everyone who was involved in the [discussion](https://github.com/ipfs/notes/issues/2) and contributed to making this possible.
Binary file added src/img/enter-registry-mirror.png
Binary file added src/img/favicon.ico
Binary file not shown.
Binary file added src/img/ipscend.png
Binary file added src/img/node-interactive-logo.png