json2nd

Filter to convert JSON array data, and plain JSON, to NDJSON (newline-delimited JSON).

Usage

Convert JSON with a top-level array to NDJSON (newline-delimited JSON, aka JSONL):

cat large_array.json | json2nd > object_per_line.json
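For example, with a small illustrative two-object array:

echo '[{"id":1},{"id":2}]' | json2nd

should produce something like:

{"id":1}
{"id":2}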

Like many filter programs, you can switch from reading STDIN to a list of filenames, so it acts a bit like a JSON cat that converts arrays:

json2nd large_array1.json large_array2.json > object_per_line.json

If the array you want to unpack is below the surface:

{
    "stuff": {
        "things": [1, 2, 3]
    }
}

then you can use the -path flag to extract it:

json2nd -path stuff.things file.json
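Run against the example document above, this should emit each array element on its own line:

1
2
3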

For more, see other usage scenarios and JSON considerations.

Installation

Using Go

Assuming your $GOBIN directory is somewhere in your $PATH, it’s as simple as:

go install github.com/draxil/json2nd@latest
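If $GOBIN isn’t set, go install drops binaries into $(go env GOPATH)/bin, so as a sketch you can put that on your PATH:

export PATH="$(go env GOPATH)/bin:$PATH"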

GitHub releases

There are builds for release points on GitHub. Grab the relevant build from the GitHub releases page; right now these just contain a binary and docs.

Plans / what about XYZ?

Why are your error messages so bad?

Since we moved off a proper parser, the first priorities were speed, memory use, and simply working.

Being helpful when it goes wrong is pretty much the next goal.

Windows-style line endings (“\r\n”)?

Maybe. Honestly, it would be going beyond the simplicity of this thing, but I can see how it could be useful to some people. Bug me?

What about jq?

jq is great! And a lot of the time I’m running this to slice a file up so I can run jq more quickly! This is simpler to use as it’s a single-purpose tool, and should be faster than jq. Also, as a lot of the use case for this is people sending me inadvisably large files, we don’t load the whole thing into memory. jq unavoidably does, and it kills my machine on some of my 4G+ (I kid you not) examples.
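For comparison, the rough jq equivalent for a top-level array is:

jq -c '.[]' large_array.json > object_per_line.json

which produces the same one-value-per-line shape, but jq reads the whole file into memory first (its --stream mode avoids that, at a considerable cost in complexity).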

Sometimes you want UNIX philosophy; sometimes you need an atomic chainsaw. Why not have both?

Why aren’t you using a JSON parser?

Originally I did, but they were either slow or insisted on having the entire file in memory, and I want this to cope with very large files. See also JSON considerations.

Author / credits

Joe Higton <draxil@gmail.com>

Licence

Please see the licence file.
