How to On Board New RP with Azure PowerShell Generator

This guide is intended for Azure service owners, including both Microsoft internal teams and Microsoft partners, who want to onboard their services to Az 4.0 by leveraging the Azure PowerShell generator.

Note: Currently, we only support onboarding new services. The process for legacy services that already exist in Az 3.0 is not ready yet.

This guide covers the whole process of onboarding a new service to Az PowerShell, including prerequisites, code generation, customization, examples, design review, testing, and code review.

Prerequisites

Service Readiness

  • Service design is finalized, and the integration tests should pass
  • Swagger API review should pass
  • Some best practices for naming the operationId in the swagger
    • The operationId format is ‘<Resource>_<Verb>’
    • Please see the bottom of https://github.com/Azure/autorest.powershell/blob/main/powershell/autorest-configuration.md for the mapping between the <Verb> used in the operationId (the left column) and the <Verb> used in PowerShell. Please try to select verbs from that list.
    • We only provide New-* and Update-* cmdlets for a resource
      • So it is better to name the operationId of your PUT API ‘<Resource>_Create’, which prevents Set-* cmdlets from being generated.
    • If you do not have a fully functional PATCH API for a resource, you may need to implement the Update-* cmdlet through customization
      • Use “GET” and “PUT” to implement the Update-* cmdlet, as shown in the sketch after this list
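
For illustration, such a customization could be a hand-written Update-* function placed in the module's custom folder that composes the generated Get-* and New-* cmdlets (GET followed by PUT). The sketch below is hypothetical; all cmdlet, parameter, and property names are placeholders for your own generated cmdlets.

``` powershell
# Hypothetical sketch: implement Update-* as GET followed by PUT.
# Get-AzMyServiceWidget / New-AzMyServiceWidget stand in for your generated cmdlets.
function Update-AzMyServiceWidget {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [string] $ResourceGroupName,

        [Parameter(Mandatory)]
        [string] $Name,

        [Parameter()]
        [hashtable] $Tag
    )

    # GET the current resource
    $current = Get-AzMyServiceWidget -ResourceGroupName $ResourceGroupName -Name $Name

    # Apply only the properties the caller asked to change
    if ($PSBoundParameters.ContainsKey('Tag')) {
        $current.Tag = $Tag
    }

    # PUT the full object back through the create path
    New-AzMyServiceWidget -ResourceGroupName $ResourceGroupName -Name $Name `
        -Location $current.Location -Tag $current.Tag
}
```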

Tools preparation

Code Generation

Service Configuration

### AutoRest Configuration
> see https://aka.ms/autorest

``` yaml
# Please specify the commit id that includes your features to make sure the generated code is stable.
branch: 314f28163917b9cfc527f7776b5e4a1dea69d295
require:
# readme.azure.noprofile.md is the common configuration file
  - $(this-folder)/../readme.azure.noprofile.md
input-file:
# You need to specify your swagger files here.
  - $(repo)/specification/databricks/resource-manager/Microsoft.Databricks/stable/2018-04-01/databricks.json
# If the swagger has not been put into the repo, you may uncomment the following line and refer to it locally
# - $(this-folder)/relative-path-to-your-swagger

# For new RP, the version is 0.1.0
module-version: 0.1.0
# Normally, title is the service name
title: Databricks
subject-prefix: $(service-name)

# If there are post APIs for some kinds of actions in the RP, you may need to 
# uncomment following line to support viaIdentity for these post APIs
# identity-correction-for-post: true

directive:
  # Following are two common directives which are normally required in all RPs
  # 1. Remove the unexpanded parameter set
  # 2. For New-* cmdlets, ViaIdentity is not required, so CreateViaIdentityExpanded is removed as well
  - where:
      variant: ^Create$|^CreateViaIdentity$|^CreateViaIdentityExpanded$|^Update$|^UpdateViaIdentity$
    remove: true
  # Remove the set-* cmdlet
  - where:
      verb: Set
    remove: true

```

Generate Code

Follow the link https://github.com/Azure/autorest.powershell/tree/main/tools/docker to generate, build, and run your service within a Docker container.
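
For reference, the typical generate-build-run loop (whether inside the container or in a locally configured environment) looks roughly like the sketch below; the module folder path is hypothetical and should match wherever your README.md lives.

``` powershell
# Illustrative flow only; adjust the path to your own module folder.
cd ./src/Databricks     # the folder containing the README.md configured above
autorest                # generate code from the swagger according to the README
./build-module.ps1      # build the module; this also produces the test stubs used later in this guide
./run-module.ps1        # open an isolated PowerShell session with the module loaded
```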

Customization

Customization Through Directives

The guide is located at https://github.com/Azure/autorest.powershell/blob/main/docs/directives.md.

Code Level Customization

The guide is located at https://github.com/Azure/autorest.powershell/blob/main/docs/customization.md.

Directive for Creating Model Cmdlet

Currently, most cmdlets are generated based on swagger operations, but sometimes we need users to input a complex model object. There are usually two ways to handle this: let the user construct a PowerShell hashtable, or offer the user a cmdlet that constructs the object. The first way is supported by the generated code. If you want to provide a cmdlet for the user, we offer a directive to generate it. Here is an example of its usage (please add it to README.md under the directive section; you can find all the models in generated/api/Models/Apixxxx):

``` yaml
  - model-cmdlet:
    - ModelA
    - ModelB
```

Common Modules

In some cases, you may need to call cmdlets from other RP modules to complete a complex operation. We put these common cmdlets in the helpers folder, and they may be required in your RP module. There are three common modules, listed below.

  • AppInsights
  • MSI
  • Storage

You just need to add the following code to the README.md to require them.
``` yaml
require:
  - $(this-folder)/../helpers/Storage/readme.noprofile.md
  - $(this-folder)/../helpers/AppInsights/readme.noprofile.md
  - $(this-folder)/../helpers/ManagedIdentity/readme.noprofile.md
```

Breaking change and preview messages

If you want to add breaking change or preview messages for your cmdlets, please refer to https://github.com/Azure/azure-powershell/blob/main/documentation/development-docs/breakingchange-for-autogen-module.md.

Examples

After you complete coding, you need to add some examples for the help content and the design review. Add the examples manually, based on the generated example stubs under the examples folder. Ideally, you should provide one example per parameter set.

Design Review

When all the above steps are completed, you may create an issue in https://github.com/Azure/azure-powershell-cmdlet-review-pr for design review. Please provide the following information in the issue.

  • Link to the speclet of the RP, if there is one
  • Cmdlet syntax and examples (you may copy them from the docs folder)
  • Link to docs folder in your fork repo

If the RP is complex, a review meeting will be scheduled. Otherwise, the review will be done through comments in the issue or by email.

Test

How to Implement Tests

During the build (‘./build-module.ps1’), we generate test stubs and some test utilities, which are located in the test folder. Since users may need to create some resources before running the tests, we have integrated a simplified version of the Resources module. You will find it under $HOME\.PSSharedModules\Resources after running ‘./test-module.ps1’ for the first time. You can list the available cmdlets with the following commands.

PS C:\Users\xidi\.PSSharedModules\Resources> .\run-module.ps1
PS C:\Users\xidi\.PSSharedModules\Resources [Az.Resources.TestSupport]> Get-Command -Module Az.Resources.TestSupport

In the test folder, there is a file called utils.ps1, which contains two functions you may need to customize (a sketch of both follows this list).

  • setupEnv
    • Create the resources needed by your tests here
    • You can also create some random strings for your tests
    • We strongly recommend creating resources with New-AzDeployment and a template file
  • cleanupEnv
    • Clean up the test resources when the tests are completed; normally you just need to delete the test resource group
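
The following is a minimal, hypothetical sketch of these two functions. The resource group name, location, and property names are illustrative; RandomString and $env are the helpers shown elsewhere in this guide, and your generated utils.ps1 may differ.

``` powershell
# Hypothetical sketch of setupEnv/cleanupEnv in test/utils.ps1.
function setupEnv() {
    # Random suffixes keep repeated test runs from colliding
    $env.Add("workSpaceName1", (RandomString -allChars $false -len 6))
    $env.Add("resourceGroup", ("testgroup" + (RandomString -allChars $false -len 6)))

    # Create the resources the tests depend on; ideally deploy them with
    # New-AzDeployment and a template file.
    New-AzResourceGroup -Name $env.resourceGroup -Location "eastus"
}

function cleanupEnv() {
    # Deleting the test resource group is normally enough
    Remove-AzResourceGroup -Name $env.resourceGroup -Force
}
```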

Regarding the Pester test stubs, one test case is generated per parameter set, and you should implement the test logic accordingly.

Test Modes

After you have completed the test cases, you may run ‘./test-module.ps1’ to run them. Three test modes are supported.

  • Live

    • Switch is -Live

    • Run a live test

      ./test-module.ps1 -Live
    • In this mode, the tests will run live (sending requests to Azure services), and no recording files will be saved.

  • Record

    • Switch is -Record

    • Run a live test and meanwhile record the responses from the server in local json files

      ./test-module.ps1 -Record
    • In this mode, the tests will run live (sending requests to Azure services), and recording files will be saved as <cmdlet-name>.Recording.json.

  • Playback

    • Switch is -Playback

    • Run a mock test; the data comes from the json files generated in record mode

    • Before running the playback test, you need to run Clear-AzContext first to clear the context

      ./test-module.ps1 -Playback
    • In this mode, the tests will not send requests to Azure services, but rely on recording files named <cmdlet-name>.Recording.json.

Run Specific Test Cases

If you only want to run one or more specific test cases, here is an example. Provide the cmdlet names to the TestName parameter and separate them with commas.

./test-module.ps1 -TestName Get-AzDatabricksWorkspace

or

./test-module.ps1 -TestName Get-AzDatabricksWorkspace,New-AzDatabricksWorkspace

Limitations on Supporting Recording Specific Cases

If you only record some specific test cases, you may end up using a different subscription ID than the one used in the earlier recordings. In that case, after recording, you have to replace all old subscription IDs in the recording files with the new one.
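
A hedged sketch of how that replacement could be scripted; the GUIDs and the test folder path below are placeholders.

``` powershell
$oldSub = '00000000-0000-0000-0000-000000000000'   # placeholder: subscription ID in the existing recordings
$newSub = '11111111-1111-1111-1111-111111111111'   # placeholder: subscription ID used for the new recording
# Rewrite every recording file under the test folder with the new subscription ID
Get-ChildItem -Path ./test -Filter '*.Recording.json' -Recurse | ForEach-Object {
    (Get-Content -Path $_.FullName -Raw) -replace $oldSub, $newSub | Set-Content -Path $_.FullName
}
```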

Use Previous Config in Recording

In recording tests, we use env.json to save the test environment variables. Normally, we use the setupEnv function in the utils.ps1 file to prepare the environment. For example, in src/Databricks/test/utils.ps1, we set an environment variable workSpaceName1:

$workSpaceName1 = RandomString -allChars $false -len 6
$env.Add("workSpaceName1", $workSpaceName1)

In this case, the env.json will be updated with the new $workSpaceName1 every time in recording tests. (Since it's a random string.)

If we don't want to update env.json every time but only want to use the previously configured variables, we can use the -UsePreviousConfigForRecord parameter.

Here are the two steps:

  • Always use the function $env.AddWithCache instead of $env.Add to add the environment variables.
  • Pass the -UsePreviousConfigForRecord switch to test-module.ps1 when you want to use the previous config.

Taking $workSpaceName1 in src/Databricks/test/utils.ps1 as an example, the script should be:

$workSpaceName1 = RandomString -allChars $false -len 6
$env.AddWithCache("workSpaceName1", $workSpaceName1, $UsePreviousConfigForRecord)

And run test-module.ps1 like this:

./test-module.ps1 -TestName Get-AzDatabricksWorkspace -Record -UsePreviousConfigForRecord

Avoid Unnecessary Loading

NOTE: You can skip this section if you're using the latest autorest.

An old version of autorest generates tests with a header like this (<cmdlet-name> stands for the actual cmdlet name):

$loadEnvPath = Join-Path $PSScriptRoot 'loadEnv.ps1'
if (-Not (Test-Path -Path $loadEnvPath)) {
    $loadEnvPath = Join-Path $PSScriptRoot '..\loadEnv.ps1'
}
. ($loadEnvPath)
$TestRecordingFile = Join-Path $PSScriptRoot '<cmdlet-name>.Recording.json'
$currentPath = $PSScriptRoot
while(-not $mockingPath) {
    $mockingPath = Get-ChildItem -Path $currentPath -Recurse -Include 'HttpPipelineMocking.ps1' -File
    $currentPath = Split-Path -Path $currentPath -Parent
}
. ($mockingPath | Select-Object -First 1).FullName

This script loads $loadEnvPath and $mockingPath even if you are not running this test case.

To avoid the unnecessary loading, you can update the script to the new version:

if(($null -eq $TestName) -or ($TestName -contains '<cmdlet-name>'))
{
  $loadEnvPath = Join-Path $PSScriptRoot 'loadEnv.ps1'
  if (-Not (Test-Path -Path $loadEnvPath)) {
      $loadEnvPath = Join-Path $PSScriptRoot '..\loadEnv.ps1'
  }
  . ($loadEnvPath)
  $TestRecordingFile = Join-Path $PSScriptRoot '<cmdlet-name>.Recording.json'
  $currentPath = $PSScriptRoot
  while(-not $mockingPath) {
      $mockingPath = Get-ChildItem -Path $currentPath -Recurse -Include 'HttpPipelineMocking.ps1' -File
      $currentPath = Split-Path -Path $currentPath -Parent
  }
  . ($mockingPath | Select-Object -First 1).FullName
}

Skip a Test

If you don't want to run a test case at all but only want to keep it, you can add "-skip" to it. E.g.

Describe '<cmdlet-name>' {
    It '<parameter-set-name>' -skip {
        ...
    }
}

Make a Test LiveOnly

If you don't want to record a test case but only want to run it live, you can add a "LiveOnly" tag to it. E.g.

Describe 'New-AzDatabricksWorkspace' -Tag 'LiveOnly' {
}

Code Review

Source code files need to be checked in.

Update Module

  1. Commit your changes under the module folder on a branch checked out from the generation branch.
  2. Submit a PR to the generation branch for review.
  3. After the PR is merged, our team members will help sync the change to the main branch.

Sync Module From Generate Branch To Main

This is for PowerShell team members.

  1. Use the pipeline azure-powershell - Code Gen to create a branch codegen/{ModuleName} for the PR.
  2. Add a changelog entry on that branch codegen/{ModuleName}.
  3. Submit PR from codegen/{ModuleName} to main.

Another thing needed when the module is first added to main is to update documentation/azure-powershell-modules.md, following the existing format.

Troubleshooting

How to debug generated code

Note: Instead of using Docker, you will need to set up the environment on your Windows machine for debugging. Please see https://github.com/Azure/autorest.powershell/tree/main/docs#requirements for details.

Please follow the steps below to debug if you run into any issues.

  • Open your module folder in Visual Studio Code
  • After you generate and build your module, run .\run-module.ps1 -Code
  • Add the ‘-Break’ switch when executing the cmdlet
  • In Visual Studio Code, set a breakpoint and then press ‘F5’

Parameter type is <ParameterAttribute> in the generated docs

Check whether you are decorating your parameter with the [Parameter] attribute. It should be [Parameter()], or you can simply remove the line.
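
A small hypothetical snippet illustrating the difference; the parameter name is only an example.

``` powershell
param(
    # Problematic: a bare [Parameter] attribute can surface as <ParameterAttribute>
    # in the generated docs.
    # [Parameter]
    # [System.String]
    # $Name

    # Correct: use [Parameter()] (or omit the attribute when it takes no arguments).
    [Parameter()]
    [System.String]
    $Name
)
```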

References