revert deletion of api scenario
Konrad Jamrozik committed Apr 15, 2024
1 parent 803809c commit 57ec311
Showing 15 changed files with 2,480 additions and 0 deletions.
36 changes: 36 additions & 0 deletions documentation/api-scenario/how-to/GenerateABasicApiScenario.md
# Generate a basic API Scenario file

In this section, we will show you how to generate a basic API Scenario file from Swagger. This is useful for quickly producing a baseline API Scenario file, which you can then improve (adjusting step order and providing meaningful values) to make it runnable.

## Prerequisites

1. Install [oav](https://www.npmjs.com/package/oav)

```bash
npm i -g oav
```
2. Install [docker](https://docs.docker.com/get-docker/)

## Steps

1. Compile Swagger into dependencies.json with Restler.

```bash
docker run --rm -v $(pwd)/specification:/swagger -w /swagger/.restler_output mcr.microsoft.com/restlerfuzzer/restler dotnet /RESTler/restler/Restler.dll compile --api_spec /swagger/appconfiguration/resource-manager/Microsoft.AppConfiguration/stable/2022-05-01/appconfiguration.json
```
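Assuming RESTler's default output layout, the compile step writes its results under `specification/.restler_output/Compile`; the file consumed in the next step is `dependencies.json`. The other files shown below are illustrative and may vary by RESTler version:

```
specification/.restler_output
└── Compile
    ├── dependencies.json   <----- input for the next step
    ├── dict.json
    ├── grammar.json
    └── grammar.py
```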

2. Generate a basic API Scenario file.

The generated API Scenario file will contain all the operations in the Swagger file, ordered by their dependencies. At each step, the minimum required parameters are filled in.

```bash
oav generate-api-scenario static --specs specification/appconfiguration/resource-manager/Microsoft.AppConfiguration/stable/2022-05-01/appconfiguration.json --dependency specification/.restler_output/Compile/dependencies.json -o specification/appconfiguration/resource-manager/Microsoft.AppConfiguration/stable/2022-05-01/scenarios
```

Alternatively, if Swagger examples are ready, you can add the `--useExample` parameter to generate the API Scenario file based on them:

```bash
oav generate-api-scenario static --specs specification/appconfiguration/resource-manager/Microsoft.AppConfiguration/stable/2022-05-01/appconfiguration.json --dependency specification/.restler_output/Compile/dependencies.json -o specification/appconfiguration/resource-manager/Microsoft.AppConfiguration/stable/2022-05-01/scenarios --useExample
```
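As an illustration of what to expect (the scenario name and step contents below are hypothetical and depend on your Swagger file), a freshly generated scenario might look roughly like:

```yaml
# yaml-language-server: $schema=https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/documentation/api-scenario/references/v1.2/schema.json
scope: ResourceGroup
scenarios:
  - scenario: generated_basic
    description: Generated scenario covering all operations
    steps:
      - step: ConfigurationStores_Create
        operationId: ConfigurationStores_Create
      - step: ConfigurationStores_Get
        operationId: ConfigurationStores_Get
      - step: ConfigurationStores_Delete
        operationId: ConfigurationStores_Delete
```

You would then reorder steps and replace generated placeholder values with meaningful ones before running it.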

Next, you can run the API Scenario file with `oav run`. See [QuickStart](./QuickStart.md) for how.
87 changes: 87 additions & 0 deletions documentation/api-scenario/how-to/MakeTestProxyRecording.md
# How to use API Scenario Test and test-proxy to make traffic recording

In this section, we'll show you how to make a traffic recording with API Scenario tests and test-proxy, assuming you've already learned how to use API Scenario tests. If not, please start with [QuickStart](./QuickStart.md).

Test-proxy is a tool that provides out-of-process record/playback capabilities compatible with any language. It also supports pushing API test recordings to, and restoring them from, an external git repository. A traffic recording serves as evidence that the API test was run, and also helps validate Swagger consistency. See [how to install test-proxy](https://github.com/Azure/azure-sdk-tools/blob/main/tools/test-proxy/Azure.Sdk.Tools.TestProxy/README.md#installation).

## Prepare assets.json file

The assets.json file is a configuration file used by test-proxy to push recordings to, and restore them from, an external git repository. Create an assets.json file under the `scenarios/` folder with the following content:

```json
{
"AssetsRepo": "Azure/azure-sdk-assets",
"AssetsRepoPrefixPath": "",
"TagPrefix": "apitest/<ServiceName>/<package>"
}
```

Take appConfiguration as an example:

```
specification/appconfiguration/data-plane
├── Microsoft.AppConfiguration
│ └── stable
│ └── 1.0
│ ├── appconfiguration.json
│ ├── examples
│ │ ├── CheckKeyValue.json
│ │ ├── PutLock.json
│ │ └── PutLock_IfMatch.json
│ └── scenarios
│ └── assets.json <----- check-in assets.json here
└── readme.md
```
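For the appConfiguration layout above, the placeholders might be filled in like this (the `TagPrefix` value is illustrative, not prescribed):

```json
{
  "AssetsRepo": "Azure/azure-sdk-assets",
  "AssetsRepoPrefixPath": "",
  "TagPrefix": "apitest/appconfiguration/data-plane"
}
```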

## Make recording

1. Start test-proxy in the repository root folder

```bash
test-proxy start
```

By default, it listens on ports 5000/5001 for HTTP/HTTPS respectively.

2. Run API Scenario test

```bash
oav run <your-scenario-file>.yaml -e <your-env>.json --testProxy http://localhost:5000 --testProxyAssets <your-assets-file>.json
```

3. Check the recording file

The recording file can be found in the `<repository-root>/.assets` folder.

## Push recording

1. Delete secrets from recording files

Ensure there are no secrets in the recording files before pushing them to the external git repository.
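test-proxy sanitizes many common secrets automatically, but a quick manual scan is still worthwhile. As a rough sketch (the patterns and the `.assets/` path are assumptions, not an exhaustive check), you could grep the recordings for common credential markers:

```shell
# Scan recording files for common secret markers. A listed file means
# something may need cleaning before pushing. Patterns are illustrative only.
grep -rliE 'bearer |client_secret|sharedaccesskey|password' .assets/ \
  && echo "potential secrets found - clean them before pushing" \
  || echo "no obvious secrets detected"
```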

2. Push recording to the git repository

```bash
test-proxy push -a <your-assets-file>.json
```

3. Commit updated assets.json file

After pushing the recording, test-proxy writes the latest git tag to the assets.json file. Check on the GitHub assets repository that the recording content is as expected. If everything looks good, commit and push the updated assets.json file.
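For example, after a successful push, a hypothetical assets.json might gain a `Tag` entry like the following (the tag value shown is purely illustrative):

```json
{
  "AssetsRepo": "Azure/azure-sdk-assets",
  "AssetsRepoPrefixPath": "",
  "TagPrefix": "apitest/appconfiguration/data-plane",
  "Tag": "apitest/appconfiguration/data-plane_abc123de"
}
```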

## Restore recording

Use the following command to fetch the recording:

```bash
test-proxy restore -a <your-assets-file>.json
```

or reset the local copy if it gets into a bad state:

```bash
test-proxy reset -a <your-assets-file>.json
```

## Playback recording

TBA
150 changes: 150 additions & 0 deletions documentation/api-scenario/how-to/QuickStart.md
<!--
Copyright (c) 2021 Microsoft Corporation
This software is released under the MIT License.
https://opensource.org/licenses/MIT
-->

# Quick start with API Scenario test

## Install

`oav` is a powerful open-source tool for Swagger validation, API Scenario testing, and example generation. GitHub: https://github.com/Azure/oav.

```sh
npm install -g oav
```

### OAV Features

- Easy to use and run. It supports running API Scenarios against ARM Dogfood/Canary/Production environments, as well as a local environment.
- Supports the Postman collection format, for easy debugging.
- Traffic schema validation and Azure API guidelines validation. `oav` implements a powerful validation algorithm that helps developers detect service issues early.
- Generates high-quality Swagger examples from API test traffic.
- Validation result report. After each API Scenario run, the developer gets a validation report containing the issues detected in the API test.
- Integrates everywhere. Easily integrates with azure-pipeline and cloud-test.

Run `oav run -h` to find all available options.

## Create AAD app

To run an API test, first prepare an AAD app, which is used for provisioning Azure resources, and grant it Contributor permission on the subscription.

To create an AAD app, follow this doc: https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal
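If you use the Azure CLI instead of the portal, the app registration and role assignment can be done in one command. This is a sketch: the app name is a placeholder, you must replace `<subscription-id>` with your own, and it requires `az login` with sufficient permissions first:

```shell
# Creates a service principal and grants it Contributor on the subscription.
# The output includes the appId (client_id), password (client_secret), and
# tenant (tenantId) values needed later in env.json.
az ad sp create-for-rbac \
  --name "api-scenario-test" \
  --role Contributor \
  --scopes "/subscriptions/<subscription-id>"
```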

## Authoring steps

We will write API scenario file for AppConfiguration service as an example.

#### 1. Write your first API scenario file

First, create a folder `scenarios` under the API version folder. All API Scenario files under the `scenarios` folder are bound to that API version.

![folder-structure](./folder-structure.png)

Now write your basic API Scenario. For more detail about the API Scenario file format, please refer to
[API Scenario Definition Reference](../references/ApiScenarioDefinition.md).

```yaml
# yaml-language-server: $schema=https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/documentation/api-scenario/references/v1.2/schema.json
scope: ResourceGroup
variables:
configStoreName:
type: string
prefix: configstor

scenarios:
- scenario: quickStart
description: Quick start with AppConfiguration ConfigurationStores
steps:
- step: Operations_CheckNameAvailability
exampleFile: ../examples/CheckNameAvailable.json
- step: ConfigurationStores_Create
exampleFile: ../examples/ConfigurationStoresCreate.json
- step: ConfigurationStores_Get
exampleFile: ../examples/ConfigurationStoresGet.json
```
Or use operation-based steps if Swagger examples are not ready, or if you want to create more scenarios without writing Swagger examples:
```yaml
# yaml-language-server: $schema=https://raw.githubusercontent.com/Azure/azure-rest-api-specs/main/documentation/api-scenario/references/v1.2/schema.json
scope: ResourceGroup
variables:
configStoreName:
type: string
prefix: configstor

scenarios:
- scenario: quickStart
description: Quick start with AppConfiguration ConfigurationStores
steps:
- step: Operations_CheckNameAvailability
operationId: Operations_CheckNameAvailability
parameters:
checkNameAvailabilityParameters:
name: $(configStoreName)
type: Microsoft.AppConfiguration/configurationStores
- step: ConfigurationStores_Create
operationId: ConfigurationStores_Create
parameters:
configStoreCreationParameters:
location: $(location)
sku:
name: Standard
tags:
myTag: myTagValue
- step: ConfigurationStores_Get
operationId: ConfigurationStores_Get
```
#### 2. Create env file

The `env.json` file contains the required API Scenario variables, such as the subscriptionId, tenantId, and the AAD app's client_id and client_secret.

```json
{
"subscriptionId": "<your subscription id>",
"location": "westcentralus",
"tenantId": "<AAD app tenantId>",
"client_id": "<your aad client_id>",
"client_secret": "<your aad client_secret>"
}
```

#### 3. Run API Scenario test

```sh
oav run ~/workspace/azure-rest-api-specs/specification/appconfiguration/resource-manager/Microsoft.AppConfiguration/stable/2022-05-01/scenarios/quickstart.yaml --tag package-2022-05-01 -e env.json --verbose
```

The `--tag` parameter specifies the tag name in the AutoRest configuration (readme.md) file; the Swagger files under that tag will be loaded. By default, oav tries to find the closest readme.md file in the parent directories of the scenario file and uses its "default" tag. You can use `--readme` and `--tag` to specify the readme.md file and tag from which to load Swagger files, or `--specs` to specify Swagger files directly.

#### 4. Debug with Postman

Sometimes the `oav run` command may fail with a non-2xx HTTP status code. In that case, you can debug the API Scenario with Postman.

When you execute `oav run`, it automatically generates a Postman collection and a Postman environment in the `.apitest/<apiScenarioFile>/<runId>/<scenario>` folder. Here is the output folder structure: `collection.json` and `env.json` are the generated Postman collection and environment files, and `202207221820-cyq4mk` is the runId, generated uniquely for each run.

```
.apitest
└── quickstart.yaml
└── 202207221820-cyq4mk
├── quickStart
│ ├── collection.json
│ ├── env.json
│ └── report.json
└── quickStart.json
```
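The exact schema of `report.json` is not documented here, but as a quick triage you can often grep it for HTTP status codes to spot failing steps. The `statusCode` field name and the run path below are assumptions based on the example layout above:

```shell
# List HTTP status codes recorded in the report; 4xx/5xx entries point at
# the steps to debug in Postman. The field name is an assumption.
grep -o '"statusCode": *[0-9]*' \
  .apitest/quickstart.yaml/202207221820-cyq4mk/quickStart/report.json
```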
Postman is a widely used GUI API testing tool. You can import the generated Postman collection and environment into Postman for local debugging.

![import-postman-collection](./import-postman-collection.png)

After importing the collection, you will see the requests listed. Now you can debug the API test with Postman locally.

![postman-collection-list](./postman-collection-list.PNG)

#### 5. Manually update the API Scenario or examples

After debugging with Postman, write the updated values back to the API Scenario file and run `oav run <api-scenario-file> -e <env.json>` again. The run should now succeed.
