SAINT-HUBERT: Sniff out the dead URLs of your webpage from the terminal!
Learning Golang, one project at a time.
go build -o sniff
Project roadmap:
- GET request on the page
- Check whether the page contains any links
- Retrieve all href values (http, https)
- Create a slice
- For each href, create a struct with the link as key
- For each key in the data structure:
  - GET request
  - Retrieve the HTTP status code
  - Add the status code as the value of the key
- Remove duplicates from the slice
- Display the result in the CLI
- Save the output as JSON to a file (when asked)
Future features:
- Nice CLI UI (with colors!)
- Flag to display only issues
- Statistics below the headers (percentage of issues, number of links)
- Argument options
- Accept the location of a file to parse
- Parse a website from a root URL (example.com => retrieve all routes based on hrefs)
Also:
- Refactor
- Expand the existing test suite