Add linting to CI for martin book #1030

Merged: 38 commits (Dec 4, 2023)

Commits
45f7fc8  Add markdown lint action (sharkAndshark, Nov 29, 2023)
b0c1ff1  Add custom markdown config file (sharkAndshark, Nov 29, 2023)
6eb61f5  Update globs expressions (sharkAndshark, Nov 29, 2023)
f2f8d15  Ignore the linting of `CHANGELOG.md` (sharkAndshark, Nov 29, 2023)
f9e9125  Update (sharkAndshark, Nov 29, 2023)
a1cd714  Eliminate `MD022/blanks-around-headings` warnings (sharkAndshark, Nov 29, 2023)
da85d68  Eliminate `MD022/blanks-around-headings` warnings (sharkAndshark, Nov 29, 2023)
445cc21  Eliminate `MD022/blanks-around-headings` warnings (sharkAndshark, Nov 29, 2023)
d75a162  Run `markdownlint-cli2 --fix` (sharkAndshark, Nov 29, 2023)
688c9d6  Ignore MD013 rule (sharkAndshark, Nov 30, 2023)
db70bf1  Ignore MD041, MD033 (sharkAndshark, Dec 1, 2023)
316169c  Add relative link check (sharkAndshark, Dec 1, 2023)
aca15c3  Use markdown-link-check github action (sharkAndshark, Dec 1, 2023)
407e642  Set link check path (sharkAndshark, Dec 1, 2023)
58b335a  Ignore localhost (sharkAndshark, Dec 1, 2023)
a728eaf  fix link in run-with-nginx.md (sharkAndshark, Dec 1, 2023)
20a33d8  Add httpheader and timeout (sharkAndshark, Dec 1, 2023)
072c5cb  Update link check config (sharkAndshark, Dec 1, 2023)
e136541  Add headers for opensource.org (sharkAndshark, Dec 1, 2023)
63f7188  Update license link (sharkAndshark, Dec 1, 2023)
87da4c7  Update headers for opensource.org (sharkAndshark, Dec 1, 2023)
88b4548  Rename markdown lint config file (sharkAndshark, Dec 1, 2023)
02b3bf7  Add markdown fmt to jsut fmt (sharkAndshark, Dec 1, 2023)
eeae4b0  Add link check to just clippy (sharkAndshark, Dec 1, 2023)
911aa38  Remove img.shields.io from ignore list (sharkAndshark, Dec 2, 2023)
b0ec469  Add --fix to just fmt (sharkAndshark, Dec 2, 2023)
d0d0da5  Ignore opensource.org (sharkAndshark, Dec 2, 2023)
74d9b63  Ignore MD045 MD001 in markdown lintting (sharkAndshark, Dec 2, 2023)
6c9c102  Merge remote-tracking branch 'remote/main' into doc_style_check (sharkAndshark, Dec 2, 2023)
186155e  Update martin-cp.md (sharkAndshark, Dec 2, 2023)
c0f9987  Remove opensource.org from httpheaders (sharkAndshark, Dec 4, 2023)
aae73ea  Remove NPM packages (sharkAndshark, Dec 4, 2023)
307f60d  Add fmt-md to justfile (sharkAndshark, Dec 4, 2023)
f700492  Move link check to just clippy-md (sharkAndshark, Dec 4, 2023)
cd0847a  Update link checking headers (sharkAndshark, Dec 4, 2023)
fdac18e  optimize link check (nyurik, Dec 4, 2023)
8c03249  minor doc (nyurik, Dec 4, 2023)
3117989  cleanup justfile (nyurik, Dec 4, 2023)

Files changed

18 changes: 18 additions & 0 deletions .github/files/config.markdownlint-cli2.jsonc
@@ -0,0 +1,18 @@
{
"config": {
"default": true,
"relative-links": true,
// Line length Check. See https://github.com/DavidAnson/markdownlint/blob/v0.32.1/doc/md013.md
"MD013": false,
"MD041": false,
"MD033": false,
"MD045": false,
"MD001": false
},
// globs expression. See https://github.com/DavidAnson/markdownlint-cli2#command-line
"globs": [
"README.md",
"!CHANGELOG.md",
"docs/src/*.md"
]
}
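
This config enables the default rules plus `relative-links`, then turns off line length (MD013) and a few other rules that do not fit the book. As a rough sketch of local use (mirroring the `fmt-md` recipe added to the justfile later in this PR), the same file can be passed to the markdownlint-cli2 Docker image; the mounted path assumes the repository root is the current directory:

```shell
# Illustrative local run of the same lint config, with auto-fix
docker run -it --rm -v $PWD:/workdir davidanson/markdownlint-cli2 \
  --config /workdir/.github/files/config.markdownlint-cli2.jsonc --fix
```
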
36 changes: 36 additions & 0 deletions .github/files/markdown.links.config.json
@@ -0,0 +1,36 @@
{
"replacementPatterns": [
{
"__comment__": "See https://github.com/tcort/markdown-link-check/issues/264",
"pattern": "%23",
"replacement": ""
}
],
"timeout": "2s",
"retryOn429": true,
"ignorePatterns": [
{
"pattern": "^http://127.0.0.1"
},
{
"pattern": "^http://localhost"
},
{
"pattern": "^http://opensource.org"
}
],
"httpHeaders": [
{
"urls": [
"https://crates.io",
"https://github.com"
],
"headers": {
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36 Edg/119.0.0.0",
"Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7",
"Accept-Encoding": "gzip, deflate, br",
"Accept-Language": "en"
}
}
]
}
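
The `ignorePatterns` above skip local URLs that only resolve while Martin is running (plus opensource.org), and `httpHeaders` sends browser-like headers to hosts that reject the default checker user agent. As a sketch of a one-file local check (patterned on the `clippy-md` recipe added to the justfile later in this PR), the config can be reused like this:

```shell
# Illustrative single-file link check reusing the same config
docker run -it --rm -v ${PWD}:/workdir --entrypoint sh ghcr.io/tcort/markdown-link-check -c \
  'markdown-link-check --config /workdir/.github/files/markdown.links.config.json /workdir/README.md'
```
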
14 changes: 14 additions & 0 deletions .github/workflows/build-deploy-docs.yml
@@ -15,6 +15,20 @@ jobs:
- uses: actions/checkout@v4
- uses: Swatinem/rust-cache@v2

- name: 'Validate .md files (use "just fmt-md" to fix)'
uses: DavidAnson/markdownlint-cli2-action@v14
with:
config: '.github/files/config.markdownlint-cli2.jsonc'

- name: 'Check Markdown URLs (same as "just clippy-md")'
uses: gaurav-nelson/github-action-markdown-link-check@v1
with:
use-quiet-mode: 'no'
use-verbose-mode: 'yes'
folder-path: 'docs/src'
file-path: './README.md'
config-file: '.github/files/markdown.links.config.json'

- name: Setup mdBook
uses: peaceiris/actions-mdbook@v1
with:
6 changes: 5 additions & 1 deletion README.md
@@ -16,11 +16,11 @@ Additionally, there are [several tools](https://maplibre.org/martin/tools.html)
See [Martin book](https://maplibre.org/martin/) for complete documentation.

## Installation

_See [installation instructions](https://maplibre.org/martin/installation.html) in the Martin book._

**Prerequisites:** If using Martin with PostgreSQL database, you must install PostGIS with at least v3.0+, v3.1+ recommended.


You can download martin from [GitHub releases page](https://github.com/maplibre/martin/releases).

| Platform | AMD-64 | ARM-64 |
@@ -45,6 +45,7 @@ brew install martin
```

## Running Martin Service

_See [running instructions](https://maplibre.org/martin/run.html) in the Martin book._

Martin supports any number of PostgreSQL/PostGIS database connections with [geospatial-enabled](https://postgis.net/docs/using_postgis_dbmanagement.html#geometry_columns) tables and tile-producing SQL functions, as well as [PMTile](https://protomaps.com/blog/pmtiles-v3-whats-new) and [MBTile](https://github.com/mapbox/mbtiles-spec) files as tile sources.
@@ -72,6 +73,7 @@ martin --config config.yaml
```

#### Docker Example

_See [Docker instructions](https://maplibre.org/martin/run-with-docker.html) in the Martin book._

Martin is also available as a [Docker image](https://ghcr.io/maplibre/martin). You could either share a configuration file from the host with the container via the `-v` param, or you can let Martin auto-discover all sources e.g. by passing `DATABASE_URL` or specifying the .mbtiles/.pmtiles files.
@@ -86,6 +88,7 @@ docker run -p 3000:3000 \
```

## API

_See [API documentation](https://maplibre.org/martin/using.html) in the Martin book._

Martin data is available via the HTTP `GET` endpoints:
@@ -104,6 +107,7 @@ Martin data is available via the HTTP `GET` endpoints:
| `/health` | Martin server health check: returns 200 `OK` |

## Documentation

See [Martin book](https://maplibre.org/martin/) for complete documentation.

## License
1 change: 1 addition & 0 deletions docs/src/SUMMARY.md
@@ -1,4 +1,5 @@
[Introduction](introduction.md)

- [Installation](installation.md)
- [Running](run.md)
- [Command Line Interface](run-with-cli.md)
4 changes: 1 addition & 3 deletions docs/src/martin-cp.md
@@ -6,7 +6,7 @@ After copying, `martin-cp` will update the `agg_tiles_hash` metadata value unles

## Usage

This copies tiles from a PostGIS table `my_table` into an MBTiles file `tileset.mbtiles` using [normalized](mbtiles-schema.md) schema, with zoom levels from 0 to 10, and bounds of the whole world.

```shell
martin-cp --output-file tileset.mbtiles \
@@ -17,5 +17,3 @@ martin-cp --output-file tileset.mbtiles \
--source source_name \
postgresql://postgres@localhost:5432/db
```

You
3 changes: 3 additions & 0 deletions docs/src/mbtiles-copy.md
@@ -1,6 +1,7 @@
# Copying, Diffing, and Patching MBTiles

## `mbtiles copy`

Copy command copies an mbtiles file, optionally filtering its content by zoom levels.

```shell
@@ -16,6 +17,7 @@ mbtiles copy normalized.mbtiles dst.mbtiles \
```

## `mbtiles copy --diff-with-file`

Copy command can also be used to compare two mbtiles files and generate a delta (diff) file. The diff file can be applied to the `src_file.mbtiles` elsewhere, to avoid copying/transmitting the entire modified dataset. The delta file will contain all tiles that are different between the two files (modifications, insertions, and deletions as `NULL` values), for both the tile and metadata tables.

There is one exception: `agg_tiles_hash` metadata value will be renamed to `agg_tiles_hash_in_diff`, and a new `agg_tiles_hash` will be generated for the diff file itself. This is done to avoid confusion when applying the diff file to the original file, as the `agg_tiles_hash` value will be different after the diff is applied. The `apply-diff` command will automatically rename the `agg_tiles_hash_in_diff` value back to `agg_tiles_hash` when applying the diff.
@@ -45,6 +47,7 @@ mbtiles apply_diff src_file.mbtiles diff_file.mbtiles
```

#### Applying diff with SQLite

Another way to apply the diff is to use the `sqlite3` command line tool directly. This SQL will delete all tiles from `src_file.mbtiles` that are set to `NULL` in `diff_file.mbtiles`, and then insert or update all new tiles from `diff_file.mbtiles` into `src_file.mbtiles`, where both files are of `flat` type. The name of the diff file is passed as a query parameter to the sqlite3 command line tool, and then used in the SQL statements. Note that this does not update the `agg_tiles_hash` metadata value, so it will be incorrect after the diff is applied.

```shell
4 changes: 4 additions & 0 deletions docs/src/mbtiles-meta.md
@@ -1,6 +1,7 @@
# MBTiles information and metadata

## summary

Use `mbtiles summary` to get a summary of the contents of an MBTiles file. The command will print a table with the number of tiles per zoom level, the size of the smallest and largest tiles, and the average size of tiles at each zoom level. The command will also print the bounding box of the covered area per zoom level.

```shell
@@ -22,20 +23,23 @@ Page count: 12
```

## meta-all

Print all metadata values to stdout, as well as the results of tile detection. The format of the values printed is not stable, and should only be used for visual inspection.

```shell
mbtiles meta-all my_file.mbtiles
```

## meta-get

Retrieve raw metadata value by its name. The value is printed to stdout without any modifications. For example, to get the `description` value from an mbtiles file:

```shell
mbtiles meta-get my_file.mbtiles description
```

## meta-set

Set metadata value by its name, or delete the key if no value is supplied. For example, to set the `description` value to `A vector tile dataset`:

```shell
4 changes: 4 additions & 0 deletions docs/src/mbtiles-schema.md
@@ -1,7 +1,9 @@
# MBTiles Schemas

The `mbtiles` tool builds on top of the original [MBTiles specification](https://github.com/mapbox/mbtiles-spec#readme) by specifying three different kinds of schema for `tiles` data: `flat`, `flat-with-hash`, and `normalized`. The `mbtiles` tool can convert between these schemas, and can also generate a diff between two files of any schemas, as well as merge multiple schema files into one file.

## flat

Flat schema is the closest to the original MBTiles specification. It stores all tiles in a single table. This schema is the most efficient when the tileset contains no duplicate tiles.

```sql, ignore
@@ -16,6 +18,7 @@ CREATE UNIQUE INDEX tile_index on tiles (
```

## flat-with-hash

Similar to the `flat` schema, but also includes a `tile_hash` column that contains a hash value of the `tile_data` column. Use this schema when the tileset has no duplicate tiles, but you still want to be able to validate the content of each tile individually.

```sql, ignore
@@ -35,6 +38,7 @@ CREATE VIEW tiles AS
```

## normalized

Normalized schema is the most efficient when the tileset contains duplicate tiles. It stores all tile blobs in the `images` table, and stores the tile Z,X,Y coordinates in a `map` table. The `map` table contains a `tile_id` column that is a foreign key to the `images` table. The `tile_id` column is a hash of the `tile_data` column, making it possible to both validate each individual tile like in the `flat-with-hash` schema, and also to optimize storage by storing each unique tile only once.

```sql, ignore
3 changes: 3 additions & 0 deletions docs/src/mbtiles-validation.md
@@ -7,12 +7,15 @@ mbtiles validate src_file.mbtiles
```

## SQLite Integrity check

The `validate` command will run `PRAGMA integrity_check` on the file, and will fail if the result is not `ok`. The `--integrity-check` flag can be used to disable this check, or to make it more thorough with `full` value. Default is `quick`.
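
For illustration only (this example is not part of the diff, and the exact flag syntax is assumed from the description above), a full check could be requested as:

```shell
# Assumed invocation: run the thorough SQLite integrity check instead of the default "quick"
mbtiles validate --integrity-check full src_file.mbtiles
```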

## Schema check

The `validate` command will verify that the `tiles` table/view exists, and that it has the expected columns and indexes. It will also verify that the `metadata` table/view exists, and that it has the expected columns and indexes.

## Per-tile validation

If the `.mbtiles` file uses [flat_with_hash](mbtiles-schema.md#flat-with-hash) or [normalized](mbtiles-schema.md#normalized) schema, the `validate` command will verify that the MD5 hash of the `tile_data` column matches the `tile_hash` or `tile_id` columns (depending on the schema).

A typical Normalized schema generated by tools like [tilelive-copy](https://github.com/mapbox/TileLive#bintilelive-copy) use MD5 hash in the `tile_id` column. The Martin's `mbtiles` tool can use this hash to verify the content of each tile. We also define a new [flat-with-hash](mbtiles-schema.md#flat-with-hash) schema that stores the hash and tile data in the same table, allowing per-tile validation without the multiple table layout.
1 change: 1 addition & 0 deletions docs/src/recipes.md
@@ -31,6 +31,7 @@ martin
```

You may also be able to validate SSL certificate with an explicit sslmode, e.g.

```shell
export DATABASE_URL="$(heroku config:get DATABASE_URL -a APP_NAME)?sslmode=verify-ca"
```
1 change: 1 addition & 0 deletions docs/src/run-with-docker.md
@@ -3,6 +3,7 @@
You can use official Docker image [`ghcr.io/maplibre/martin`](https://ghcr.io/maplibre/martin)

### Using Non-Local PostgreSQL

```shell
docker run \
-p 3000:3000 \
2 changes: 1 addition & 1 deletion docs/src/run-with-nginx.md
@@ -91,4 +91,4 @@ http {
}
```

You can find an example NGINX configuration file [here](https://github.com/maplibre/martin/blob/main/nginx.conf).
You can find an example NGINX configuration file [here](https://github.com/maplibre/martin/blob/main/demo/frontend/nginx.conf).
3 changes: 2 additions & 1 deletion docs/src/sources-fonts.md
@@ -4,14 +4,14 @@ Martin can serve glyph ranges from `otf`, `ttf`, and `ttc` fonts as needed by Ma
The glyph range generation is not yet cached, and may require external reverse proxy or CDN for faster operation.

## API

Fonts ranges are available either for a single font, or a combination of multiple fonts. The font names are case-sensitive and should match the font name in the font file as published in the catalog. Make sure to URL-escape font names as they usually contain spaces.

| | Font Request |
|---------|--------------------------------------|
| Pattern | `/font/{name}/{start}-{end}` |
| Example | `/font/Overpass%20Mono%20Bold/0-255` |


### Composite Font Request

When combining multiple fonts, the glyph range will contain glyphs from the first listed font if available, and fallback to the next font if the glyph is not available in the first font, etc. The glyph range will be empty if none of the fonts contain the glyph.
@@ -22,6 +22,7 @@ When combining multiple fonts, the glyph range will contain glyphs from the firs
| Example | `/font/Overpass%20Mono%20Bold,Overpass%20Mono%20Light/0-255` |

### Catalog

Martin will show all available fonts at the `/catalog` endpoint.

```shell
2 changes: 2 additions & 0 deletions docs/src/sources-pg-functions.md
@@ -10,6 +10,7 @@ Function Source is a database function which can be used to query [vector tiles]
| query (optional, any name) | json | Query string parameters |

### Simple Function

For example, if you have a table `table_source` in WGS84 (`4326` SRID), then you can use this function as a Function Source:

```sql, ignore
@@ -35,6 +36,7 @@ $$ LANGUAGE plpgsql IMMUTABLE STRICT PARALLEL SAFE;
```

### Function with Query Parameters

Users may add a `query` parameter to pass additional parameters to the function.

_**TODO**: Modify this example to actually use the query parameters._
2 changes: 2 additions & 0 deletions docs/src/sources-pg-tables.md
@@ -9,11 +9,13 @@ For example, if there is a table `public.table_source`:
the default `TileJSON` might look like this (note that URL will be automatically adjusted to match the request host):

The table:

```sql
CREATE TABLE "public"."table_source" ( "gid" int4 NOT NULL, "geom" "public"."geometry" );
```

The TileJSON:

```json
{
"tilejson": "3.0.0",
4 changes: 4 additions & 0 deletions docs/src/sources-sprites.md
@@ -3,6 +3,7 @@
Given a directory with SVG images, Martin will generate a sprite -- a JSON index and a PNG image, for both low and high resolution displays. The SVG filenames without extension will be used as the sprite image IDs. The images are searched recursively in the given directory, so subdirectory names will be used as prefixes for the image IDs, e.g. `icons/bicycle.svg` will be available as `icons/bicycle` sprite image. The sprite generation is not yet cached, and may require external reverse proxy or CDN for faster operation.

### API

Martin uses [MapLibre sprites API](https://maplibre.org/maplibre-style-spec/sprite/) specification to serve sprites via several endpoints. The sprite image and index are generated on the fly, so if the sprite directory is updated, the changes will be reflected immediately.

##### Sprite PNG
@@ -12,6 +13,7 @@ Martin uses [MapLibre sprites API](https://maplibre.org/maplibre-style-spec/spri
`GET /sprite/<sprite_id>.png` endpoint contains a single PNG sprite image that combines all sources images. Additionally, there is a high DPI version available at `GET /sprite/<sprite_id>@2x.png`.

##### Sprite index

`/sprite/<sprite_id>.json` metadata index describing the position and size of each image inside the sprite. Just like the PNG, there is a high DPI version available at `/sprite/<sprite_id>@2x.json`.

```json
Expand All @@ -26,7 +28,9 @@ Martin uses [MapLibre sprites API](https://maplibre.org/maplibre-style-spec/spri
...
}
```

#### Combining Multiple Sprites

Multiple sprite_id values can be combined into one sprite with the same pattern as for tile joining: `/sprite/<sprite_id1>,<sprite_id2>,...,<sprite_idN>`. No ID renaming is done, so identical sprite names will override one another.
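
For example (hypothetical sprite IDs `icons` and `extra`, assuming Martin's default port 3000), the combined sprite and its index could be fetched as:

```shell
# Hypothetical sprite source IDs; the comma-joined path follows the pattern described above
curl -O http://localhost:3000/sprite/icons,extra.png
curl -O http://localhost:3000/sprite/icons,extra.json
```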

### Configuring from CLI
2 changes: 2 additions & 0 deletions docs/src/tools.md
@@ -3,9 +3,11 @@
Martin project contains additional tooling to help manage the data servable with Martin tile server.

## `martin-cp`

`martin-cp` is a tool for generating tiles in bulk, and save retrieved tiles into a new or an existing MBTiles file. It can be used to generate tiles for a large area or multiple areas. If multiple areas overlap, it will generate tiles only once. `martin-cp` supports the same configuration file and CLI arguments as Martin server, so it can support all sources and even combining sources.

## `mbtiles`

`mbtiles` is a small utility to interact with the `*.mbtiles` files from the command line. It allows users to examine, copy, validate, compare, and apply diffs between them.

Use `mbtiles --help` to see a list of available commands, and `mbtiles <command> --help` to see help for a specific command.
1 change: 0 additions & 1 deletion docs/src/using-with-maplibre.md
@@ -4,7 +4,6 @@

You can add a layer to the map and specify Martin [TileJSON](https://github.com/mapbox/tilejson-spec) endpoint as a vector source URL. You should also specify a `source-layer` property. For [Table Sources](sources-pg-tables.md) it is `{table_name}` by default.


```js
map.addLayer({
id: 'points',
2 changes: 2 additions & 0 deletions docs/src/using.md
@@ -16,9 +16,11 @@ Martin data is available via the HTTP `GET` endpoints:
| `/health` | Martin server health check: returns 200 `OK` |

### Duplicate Source ID

In case there is more than one source that has the same name, e.g. a PG function is available in two schemas/connections, or a table has more than one geometry columns, sources will be assigned unique IDs such as `/points`, `/points.1`, etc.

### Reserved Source IDs

Some source IDs are reserved for internal use. If you try to use them, they will be automatically renamed to a unique ID the same way as duplicate source IDs are handled, e.g. a `catalog` source will become `catalog.1`.

Some of the reserved IDs: `_`, `catalog`, `config`, `font`, `health`, `help`, `index`, `manifest`, `metrics`, `refresh`,
9 changes: 9 additions & 0 deletions justfile
@@ -256,6 +256,10 @@ lint: fmt clippy
fmt:
cargo fmt --all -- --check

# Reformat markdown files using markdownlint-cli2
fmt-md:
docker run -it --rm -v $PWD:/workdir davidanson/markdownlint-cli2 --config /workdir/.github/files/config.markdownlint-cli2.jsonc --fix

# Run Nightly cargo fmt, ordering imports
fmt2:
cargo +nightly fmt -- --config imports_granularity=Module,group_imports=StdExternalCrate
@@ -265,6 +269,11 @@ clippy:
cargo clippy --workspace --all-targets --bins --tests --lib --benches -- -D warnings
RUSTDOCFLAGS="-D warnings" cargo doc --no-deps --workspace

# Validate markdown URLs with markdown-link-check
clippy-md:
docker run -it --rm -v ${PWD}:/workdir --entrypoint sh ghcr.io/tcort/markdown-link-check -c \
'echo -e "/workdir/README.md\n$(find /workdir/docs/src -name "*.md")" | tr "\n" "\0" | xargs -0 -P 5 -n1 -I{} markdown-link-check --config /workdir/.github/files/markdown.links.config.json {}'

# These steps automatically run before git push via a git hook
[private]
git-pre-push: env-info restart lint test
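
With these recipes in place, the CI checks can be reproduced locally (assuming `just` and Docker are available):

```shell
# Run the same markdown lint/auto-fix and link check that CI performs
just fmt-md
just clippy-md
```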