[DOC] Fix help command in documentation (#540)
Signed-off-by: Ahmed Hussein (amahussein) <a@ahussein.me>

Fixes #496
amahussein authored Sep 8, 2023
1 parent 0e8640a commit 61460fd
Showing 5 changed files with 13 additions and 13 deletions.
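The same edit appears in all five platform guides: the bare `--help` flag is replaced with `-- --help`, presumably so the wrapper forwards the help request to the underlying tool only after the standalone `--` separator (the commit itself does not state the reason). A short recap of the before/after forms, taken from the hunks below:

```bash
# Form removed by this commit (help flag without the separator):
spark_rapids_user_tools emr --help

# Form added by this commit (note the standalone -- before --help):
spark_rapids_user_tools emr -- --help

# The same pattern applies to every platform and subcommand documented below, e.g.:
spark_rapids_user_tools dataproc qualification -- --help
spark_rapids_user_tools onprem qualification -- --help
```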
8 changes: 4 additions & 4 deletions user_tools/docs/user-tools-aws-emr.md
@@ -32,7 +32,7 @@ the applications running on AWS EMR.
- from source: `pip install -e .`
- verify the command is installed correctly by running
```bash
-spark_rapids_user_tools emr --help
+spark_rapids_user_tools emr -- --help
```

### 4.Environment variables
@@ -55,7 +55,7 @@ Before running any command, you can set environment variables to specify configu

```
spark_rapids_user_tools emr qualification [options]
-spark_rapids_user_tools emr qualification --help
+spark_rapids_user_tools emr qualification -- --help
```
The local deployment runs on the local development machine. It requires:
@@ -307,7 +307,7 @@ The CLI is triggered by providing the location where the yaml file is stored `--

```
spark_rapids_user_tools emr bootstrap [options]
-spark_rapids_user_tools emr bootstrap --help
+spark_rapids_user_tools emr bootstrap -- --help
```

The command generates an output with a list of properties to be applied to Spark configurations.
@@ -370,7 +370,7 @@ generate an output while displaying warning that the remote changes failed.

```
spark_rapids_user_tools emr diagnostic [options]
-spark_rapids_user_tools emr diagnostic --help
+spark_rapids_user_tools emr diagnostic -- --help
```

Run diagnostic command to collect information from EMR cluster, such as OS version, # of worker
4 changes: 2 additions & 2 deletions user_tools/docs/user-tools-databricks-aws.md
@@ -35,7 +35,7 @@ The tool currently only supports event logs stored on S3 (no DBFS paths). The re
- from source: `pip install -e .`
- verify the command is installed correctly by running
```bash
-spark_rapids_user_tools databricks-aws --help
+spark_rapids_user_tools databricks-aws -- --help
```

### 5.Environment variables
@@ -53,7 +53,7 @@ Before running any command, you can set environment variables to specify configu

```
spark_rapids_user_tools databricks-aws qualification [options]
-spark_rapids_user_tools databricks-aws qualification --help
+spark_rapids_user_tools databricks-aws qualification -- --help
```
The local deployment runs on the local development machine. It requires:
4 changes: 2 additions & 2 deletions user_tools/docs/user-tools-databricks-azure.md
@@ -39,7 +39,7 @@ The tool currently only supports event logs stored on ABFS ([Azure Blob File Sys
- from source: `pip install -e .`
- Verify the command is installed correctly by running
```bash
-spark_rapids_user_tools databricks-azure --help
+spark_rapids_user_tools databricks-azure -- --help
```

### 5.Environment variables
@@ -57,7 +57,7 @@ Before running any command, you can set environment variables to specify configu

```
spark_rapids_user_tools databricks-azure qualification [options]
-spark_rapids_user_tools databricks-azure qualification --help
+spark_rapids_user_tools databricks-azure qualification -- --help
```
The local deployment runs on the local development machine. It requires:
6 changes: 3 additions & 3 deletions user_tools/docs/user-tools-dataproc.md
@@ -35,7 +35,7 @@ the applications running on _Google Cloud Dataproc_.
- from source: `pip install -e .`
- verify the command is installed correctly by running
```bash
-spark_rapids_user_tools dataproc --help
+spark_rapids_user_tools dataproc -- --help
```

### 4.Environment variables
@@ -55,7 +55,7 @@ RAPIDS variables have a naming pattern `RAPIDS_USER_TOOLS_*`:

```
spark_rapids_user_tools dataproc qualification [options]
-spark_rapids_user_tools dataproc qualification --help
+spark_rapids_user_tools dataproc qualification -- --help
```
The local deployment runs on the local development machine. It requires:
@@ -380,7 +380,7 @@ generate an output while displaying warning that the remote changes failed.

```
spark_rapids_user_tools dataproc diagnostic [options]
-spark_rapids_user_tools dataproc diagnostic --help
+spark_rapids_user_tools dataproc diagnostic -- --help
```

Run diagnostic command to collect information from Dataproc cluster, such as OS version, # of worker
4 changes: 2 additions & 2 deletions user_tools/docs/user-tools-onprem.md
@@ -23,7 +23,7 @@ The tool currently only supports event logs stored on local path. The remote out
- from source: `pip install -e .`
- verify the command is installed correctly by running
```bash
-spark_rapids_user_tools onprem --help
+spark_rapids_user_tools onprem -- --help
```

### 3.Environment variables
@@ -39,7 +39,7 @@ Before running any command, you can set environment variables to specify configu

```
spark_rapids_user_tools onprem qualification [options]
-spark_rapids_user_tools onprem qualification --help
+spark_rapids_user_tools onprem qualification -- --help
```
The local deployment runs on the local development machine. It requires:
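A hypothetical follow-up check, not part of this commit (the search pattern and path below are assumptions): a quick grep over the docs can confirm that no page still uses the old, separator-less form.

```bash
# Hypothetical verification (not included in the commit): list any remaining
# occurrences of the old "--help" form that lack the "--" separator.
grep -rnE 'spark_rapids_user_tools [a-z-]+( [a-z]+)? --help$' user_tools/docs/
```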
