Hargo parses HAR files, converts them to curl commands, and can serve as a load test driver.
NAME:
hargo - work with HTTP Archive (.har) files
USAGE:
hargo <command> [arguments] <.har file>
VERSION:
0.1.2-dev.57 (da53069)
AUTHOR:
Mark A. Richman <mark@markrichman.com>
COMMANDS:
fetch, f Fetch URLs in .har
curl, c Convert .har to curl
run, r Run .har file
validate, v Validate .har file
dump, d Dump .har file
load, l Load test .har file
help, h Shows a list of commands or help for one command
GLOBAL OPTIONS:
--debug Show debug output
--help, -h show help
--version, -v print the version
COPYRIGHT:
(c) 2021 Mark A. Richman
To build hargo from source and validate one of the sample HAR files:
git clone https://github.com/mrichman/hargo.git
cd hargo
make install
hargo validate test/golang.org.har
If you use Google Chrome, you can record a .har file by following the steps below:
- Right-click anywhere on the page you want to record and click Inspect Element to open Chrome's Developer Tools.
- The Developer Tools will open as a panel at the bottom of the page. Click on the Network tab.
- Click the Record button, which is the solid black circle at the bottom of the Network tab, and you'll start recording activity in your browser.
- Refresh the page and start working normally.
- Right-click within the Network tab and click Save as HAR with Content to save a copy of the activity that you recorded.
- Within the file window, save the HAR file.
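The saved file is plain JSON following the HAR schema. As a quick orientation, here is a minimal Go sketch that reads a recorded file and lists the requests it contains; the struct covers only a small subset of the schema (it is not hargo's own type), and the file name is just a placeholder:

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os"
)

// A small subset of the HAR schema -- just enough to list the recorded requests.
type har struct {
	Log struct {
		Entries []struct {
			Request struct {
				Method string `json:"method"`
				URL    string `json:"url"`
			} `json:"request"`
		} `json:"entries"`
	} `json:"log"`
}

func main() {
	f, err := os.Open("recorded.har") // placeholder file name
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	var h har
	if err := json.NewDecoder(f).Decode(&h); err != nil {
		log.Fatal(err)
	}
	for _, e := range h.Log.Entries {
		fmt.Println(e.Request.Method, e.Request.URL)
	}
}
```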
The fetch command downloads all resources referenced in a .har file:
hargo fetch foo.har
This will produce a directory named hargo-fetch-yyyymmddhhmmss containing all assets referenced by the .har file. This is similar to what you'd see when invoking wget on a particular URL.
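Conceptually, fetch walks the entries and writes each response body into that timestamped directory. The sketch below only illustrates the idea (it is not hargo's implementation); the URL list stands in for the entry URLs parsed from the .har file, and the asset file names are made up:

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"os"
	"path/filepath"
	"time"
)

func main() {
	// Stand-ins for the URLs that would be parsed out of the .har entries.
	urls := []string{"https://golang.org/", "https://golang.org/doc/"}

	// Timestamped output directory, mirroring the hargo-fetch-yyyymmddhhmmss naming.
	dir := "hargo-fetch-" + time.Now().Format("20060102150405")
	if err := os.MkdirAll(dir, 0o755); err != nil {
		log.Fatal(err)
	}

	for i, u := range urls {
		resp, err := http.Get(u)
		if err != nil {
			log.Printf("fetch %s: %v", u, err)
			continue
		}
		out, err := os.Create(filepath.Join(dir, fmt.Sprintf("asset-%d", i)))
		if err != nil {
			log.Fatal(err)
		}
		io.Copy(out, resp.Body) // save the response body to disk
		resp.Body.Close()
		out.Close()
	}
}
```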
The curl command will output a curl command line for each entry in the .har file:
hargo curl foo.har
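The exact command lines hargo produces are not reproduced here, but the conversion is essentially a mapping from an entry's method, URL, and headers to a curl invocation. A rough sketch of that mapping, with hard-coded stand-in values:

```go
package main

import (
	"fmt"
	"strings"
)

// curlCommand builds a curl command line from the parts of a HAR entry.
// The values passed in main are stand-ins, not output from hargo itself.
func curlCommand(method, url string, headers map[string]string) string {
	var b strings.Builder
	fmt.Fprintf(&b, "curl -X %s", method)
	for k, v := range headers {
		fmt.Fprintf(&b, " -H %q", k+": "+v)
	}
	fmt.Fprintf(&b, " %q", url)
	return b.String()
}

func main() {
	fmt.Println(curlCommand("GET", "https://golang.org/", map[string]string{
		"Accept": "text/html",
	}))
}
```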
The run command executes each HTTP request in a .har file:
hargo run foo.har
This is similar to fetch, but will not save any output.
The validate command will report any errors in the format of a .har file:
hargo validate foo.har
The HAR file format is defined here: https://w3c.github.io/web-performance/specs/HAR/Overview.html
The dump command prints information about all HTTP requests in a .har file:
hargo dump foo.har
Hargo can act as a load test agent. Given a .har file, hargo can spawn a number of concurrent workers to repeat each HTTP request in order. By default, hargo will spawn 10 workers and run for a duration of 60 seconds.
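The underlying pattern is a fixed pool of goroutines replaying the recorded requests in a loop until a deadline expires. The sketch below illustrates that pattern with the documented defaults (10 workers, 60 seconds); it is not hargo's own code, and the URL list stands in for the parsed .har entries:

```go
package main

import (
	"log"
	"net/http"
	"sync"
	"time"
)

func main() {
	// Stand-ins for the requests parsed from the .har entries.
	urls := []string{"https://golang.org/", "https://golang.org/doc/"}

	const workers = 10                            // documented default worker count
	deadline := time.Now().Add(60 * time.Second) // documented default duration

	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			for time.Now().Before(deadline) {
				// Replay each request in order.
				for _, u := range urls {
					start := time.Now()
					resp, err := http.Get(u)
					if err != nil {
						log.Printf("worker %d: %v", id, err)
						continue
					}
					resp.Body.Close()
					log.Printf("worker %d: %s %s in %v", id, resp.Status, u, time.Since(start))
				}
			}
		}(w)
	}
	wg.Wait()
}
```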
Hargo will also save its results to InfluxDB, if available. Each HTTP response is stored as a point of time-series data, which can be graphed with Chronograf, Grafana, or a similar visualization tool for analysis.
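Hargo's InfluxDB schema (database, measurement, tag, and field names) is not documented here, so the names in the sketch below are assumptions; it only shows what writing a latency point with the InfluxDB 1.x Go client looks like:

```go
package main

import (
	"log"
	"time"

	client "github.com/influxdata/influxdb1-client/v2"
)

func main() {
	// Credentials match the docker-compose example (hargo/hargo);
	// the address is an assumption for a local InfluxDB instance.
	c, err := client.NewHTTPClient(client.HTTPConfig{
		Addr:     "http://localhost:8086",
		Username: "hargo",
		Password: "hargo",
	})
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// Database and measurement names are assumptions, not hargo's documented schema.
	bp, err := client.NewBatchPoints(client.BatchPointsConfig{
		Database:  "hargo",
		Precision: "ms",
	})
	if err != nil {
		log.Fatal(err)
	}

	pt, err := client.NewPoint(
		"http_response",
		map[string]string{"url": "https://golang.org/"},
		map[string]interface{}{"latency_ms": 42.0, "status": 200},
		time.Now(),
	)
	if err != nil {
		log.Fatal(err)
	}
	bp.AddPoint(pt)

	if err := c.Write(bp); err != nil {
		log.Fatal(err)
	}
}
```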
Hargo can also be built and run in Docker:
docker build -t hargo .
docker run --rm -v `pwd`/test:/test hargo hargo run /test/golang.org.har
The example docker-compose file will start three containers:
- hargo
- influxdb
- grafana
The hargo container must first be built; see the docker build command above. When the compose file is run, it starts a hargo load process that writes its results to InfluxDB. That InfluxDB instance can be viewed through the Grafana container, which includes an example dashboard showing the latency of the executed requests. The username/password for all of the containers is hargo/hargo.
cd example/docker-compose
docker-compose up
docker-compose down -v