---
page_type: sample
languages:
products:
urlFragment: contoso-creative-writer
name: Creative Writing Assistant - Working with Agents using Prompty (Python Implementation)
description: Using Azure OpenAI agent with Python, integrating Bing Search API and Azure AI Search, to create articles based on user topics and instruction.
---
- Features
- Azure account requirements
- Getting Started
- Deployment
- Testing the sample
- Guidance
- Resources
- Code of Conduct
Contoso Creative Writer is an app that helps you write well-researched, product-specific articles. Enter the required information and then click "Start Work". To watch the steps in the agent workflow, select the debug button in the bottom right corner of the screen. The article will start to appear once the agents complete their tasks.
This sample demonstrates how to create and work with AI agents driven by Azure OpenAI. It includes a FastAPI app that takes a topic and instruction from a user and then calls a research agent that uses the Bing Search API to research the topic, a product agent that uses Azure AI Search to do a semantic similarity search for related products from a vector store, a writer agent to combine the research and product information into a helpful article, and an editor agent to refine the article that's finally presented to the user.
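To make that hand-off concrete, here is a minimal, self-contained sketch of the flow. The function names and return values are illustrative stand-ins, not the actual code under src/api/agents.

```python
# Illustrative sketch of the four-agent hand-off described above.
# These stubs only show the data flow; the real agents under src/api/agents call
# Bing Search, Azure AI Search and Azure OpenAI (via Prompty) to do the work.

def research(topic: str) -> str:
    # research agent: gathers background information (Bing Search API in the sample)
    return f"research notes about {topic}"

def find_products(topic: str) -> list[str]:
    # product agent: semantic similarity search over the product vector store (Azure AI Search)
    return ["related product A", "related product B"]

def write(notes: str, products: list[str], instructions: str) -> str:
    # writer agent: combines research and product information into a draft article
    return f"draft based on {notes} and {products}, following: {instructions}"

def edit(draft: str) -> str:
    # editor agent: refines the draft before it is shown to the user
    return draft + " (refined by the editor)"

def create_article(topic: str, instructions: str) -> str:
    notes = research(topic)
    products = find_products(topic)
    return edit(write(notes, products, instructions))

if __name__ == "__main__":
    print(create_article("camping in Alaska", "focus on the gear a beginner would need"))
```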
This project template provides the following features:
- Azure OpenAI to drive the various agents
- Prompty to create, manage and evaluate the prompts in our code (see the sketch after this list)
- Bing Search API to research the topic provided
- Azure AI Search for performing semantic similarity search
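As a rough illustration of the Prompty piece, the snippet below loads and runs a single .prompty file with the prompty Python package. The file path and input names are placeholders, and the Azure OpenAI connection details are assumed to come from your environment.

```python
# Minimal sketch of executing a .prompty file with the prompty Python package.
# The file path and input names are placeholders; each agent folder under
# src/api/agents defines its own .prompty file and expected inputs.
import prompty
import prompty.azure  # registers the Azure OpenAI invoker (connection settings come from the environment)

flow = prompty.load("agents/writer/writer.prompty")  # parses the prompt template and model configuration
result = prompty.execute(flow, inputs={"topic": "winter camping gear", "research": "..."})
print(result)
```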
IMPORTANT: In order to deploy and run this example, you'll need:
- Azure account. If you're new to Azure, get an Azure account for free and you'll get some free Azure credits to get started. See guide to deploying with the free trial.
- Azure subscription with access enabled for the Azure OpenAI Service. If your access request to Azure OpenAI Service doesn't match the acceptance criteria, you can use OpenAI public API instead.
- Ability to deploy `gpt-35-turbo-0613`, `gpt-4-1106-Preview` and `gpt-4o-2024-05-13`.
  - We recommend using Canada East, as this region has access to all models and services required.
- Azure subscription with access enabled for Bing Search API
- Azure subscription with access enabled for Azure AI Search
You have a few options for setting up this project. The easiest way to get started is GitHub Codespaces, since it will set up all the tools for you, but you can also set it up locally.
- You can run this template virtually by using GitHub Codespaces. The button will open a web-based VS Code instance in your browser:
- Open a terminal window.
- Sign in to your Azure account. You'll need to log in to both the Azure Developer CLI and Azure CLI:
i. First with Azure Developer CLI
azd auth login
ii. Then sign in with Azure CLI
az login --use-device-code
- Provision the resources and deploy the code:
  azd up
  You will be prompted to select some details about your deployed resources, including location. As a reminder, we recommend Canada East as the region for this project. Once the deployment is complete, you should be able to scroll up in your terminal and see the URL that the app has been deployed to. It should look similar to this:
  Ingress Updated. Access your app at https://env-name.codespacesname.eastus2.azurecontainerapps.io/
  Navigate to the link to try out the app straight away!
- Once the above steps are completed, you can test the sample.
A related option is VS Code Dev Containers, which will open the project in your local VS Code using the Dev Containers extension:
- Start Docker Desktop (install it if not already installed).
- Open the project:
- In the VS Code window that opens, once the project files show up (this may take several minutes), open a terminal window.
- Install required packages:
  cd src/api
  pip install -r requirements.txt
Once you've completed these steps, jump to deployment.
Note for Windows users: If you are not using a container to run this sample, our hooks are currently all shell scripts. To provision this sample correctly while we work on updates we recommend using git bash.
- Create a new folder and switch to it in the terminal, then run this command to download the project code:
  azd init -t agent-openai-python-prompty
  Note that this command will initialize a git repository, so you do not need to clone this repository.
- Install required packages:
  cd src/api
  pip install -r requirements.txt
Once you've opened the project in Codespaces, Dev Containers, or locally, you can deploy it to Azure.
- Sign in to your Azure account. You'll need to log in to both the Azure Developer CLI and Azure CLI:
  i. First with Azure Developer CLI
  azd auth login
  ii. Then sign in with Azure CLI
  az login --use-device-code
  If you have any issues with that command, you may also want to try `azd auth login --use-device-code`. This will create a folder under `.azure/` in your project to store the configuration for this deployment. You may have multiple azd environments if desired.
- Provision the resources and deploy the code:
  azd up
  This project uses `gpt-35-turbo-0613`, `gpt-4-1106-Preview` and `gpt-4o-2024-05-13`, which may not be available in all Azure regions. Check for up-to-date region availability and select a region during deployment accordingly. We recommend using Canada East for this project.
  After running azd up, you may be asked the following question during GitHub setup:
  Do you want to configure a GitHub action to automatically deploy this repo to Azure when you push code changes? (Y/n)
  You should respond with `N`, as this is not a necessary step and it takes some time to set up.
This sample repository contains an agents folder that includes subfolders for each agent. Each agent folder contains a prompty file where the agent's prompt is defined and a Python file with the code used to run it. Exploring these files will help you understand what each agent is doing. The agents folder also contains an `orchestrator.py` file that can be used to run the entire flow and create an article. When you ran `azd up`, a catalogue of products was uploaded to the Azure AI Search vector store and an index named `contoso-products` was created.
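If you want to peek at that index yourself, the sketch below runs a plain keyword query against it with the azure-search-documents client. The environment variable names are assumptions for illustration, and the real product agent performs a vector (semantic similarity) search rather than a keyword search.

```python
# Illustrative query against the contoso-products index created by azd up.
# The environment variable names here are assumptions; the product agent in this
# sample actually embeds the query and runs a vector similarity search instead.
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],  # e.g. https://<service>.search.windows.net
    index_name="contoso-products",
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)

for doc in client.search(search_text="lightweight tent", top=3):
    print(doc)
```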
To test the sample:
- Run the example web app locally using a FastAPI server.
  First, navigate to the src/api folder:
  cd ./src/api
  Run the FastAPI web server:
  fastapi dev main.py
  Important Note: If you are running in Codespaces, you will need to change the visibility of the API's 8000 and 5173 ports to `public` in your VS Code terminal's `PORTS` tab.
  If you open the server link in a browser, you will see a "URL not found" error; this is because we haven't created a home URL route in FastAPI. We have instead created a `/get_article` route, which is used to pass context and instructions directly to get_article.py, which runs the agent workflow.
  (Optional) We have created a web interface which we will run next, but you can test that the API is working as expected by opening this in the browser (or from Python, as sketched after these testing steps):
  http://127.0.0.1:8000/get_article?context=Write an article about camping in alaska&instructions=find specifics about what type of gear they would need and explain in detail
- Once the FastAPI server is running, you can run the web app. To do this, open a new terminal window and navigate to the web folder using this command:
cd ./src/web
First install node packages:
npm install
Then run the web app with a local dev web server:
npm run dev
This will launch the app, where you can use example context and instructions to get started. On the 'Creative Team' page you can examine the output of each agent by clicking on it.
Change the instructions and context to create an article of your choice.
- For debugging purposes, you may want to test in Python using the orchestrator logic.
  To run the sample using just the orchestrator logic, use the following commands:
  cd ./src/api
  python -m orchestrator
To activate the Prompty tracing server:
export LOCAL_TRACING=true
Then start the orchestrator:
cd ./src/api
python -m orchestrator
Once you can see that the article has been generated, a `.runs` folder should appear in `./src/api`. Select this folder and click the `.tracy` file in it.
This shows you all the Python functions that were called in order to generate the article. Explore each section and see what helpful information you can find.
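As mentioned in the first testing step, you can also exercise the /get_article endpoint from Python instead of the browser. Here is a small sketch using the requests library, assuming the FastAPI dev server is listening on port 8000:

```python
# Quick smoke test of the /get_article endpoint from Python.
# Assumes the FastAPI dev server started earlier is listening on http://127.0.0.1:8000.
import requests

response = requests.get(
    "http://127.0.0.1:8000/get_article",
    params={
        "context": "Write an article about camping in alaska",
        "instructions": "find specifics about what type of gear they would need and explain in detail",
    },
    timeout=300,  # the agent workflow can take a while to complete
)
print(response.status_code)
print(response.text)
```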
Contoso Creative Writer uses evaluators to assess application response quality. The four metrics the evaluators in this project assess are Coherence, Fluency, Relevance and Groundedness. A custom `evaluate.py` script has been written to run all evaluations for you (a rough sketch of how such metrics can be computed follows the list below).
- To run the script, run the following commands:
cd ./src/api
python -m evaluate.evaluate
- Check: You see scores for Coherence, Fluency, Relevance and Groundedness.
- Check: The scores are between 1 and 5
- To understand what is being evaluated, open the `src/api/evaluate/eval_inputs.jsonl` file.
  - Observe that 3 examples of research, product and assignment context are stored in this file. This data is sent to the orchestrator so that an article is generated for each example.
  - The evaluations are then run for each example, incorporating all of the context, research, products, and final article when grading the response.
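As a rough sketch of how an individual metric can be scored, the snippet below uses the azure-ai-evaluation package. This is an assumption for illustration only; the `evaluate.py` script in this sample may implement the metrics differently, for example with its own Prompty-based evaluators.

```python
# Rough sketch of scoring one response for coherence and groundedness with the
# azure-ai-evaluation package. Illustrative only; the sample's own evaluate.py
# may implement these metrics differently.
import os

from azure.ai.evaluation import CoherenceEvaluator, GroundednessEvaluator

model_config = {
    "azure_endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],  # assumed environment variable names
    "api_key": os.environ["AZURE_OPENAI_API_KEY"],
    "azure_deployment": "gpt-4o",                           # any deployed chat model used for grading
}

query = "Write an article about camping in alaska"
context = "Research notes and product details gathered by the agents..."
response = "The generated article text..."

coherence = CoherenceEvaluator(model_config)
groundedness = GroundednessEvaluator(model_config)

print(coherence(query=query, response=response))                      # e.g. {"coherence": 4.0, ...}
print(groundedness(query=query, context=context, response=response))  # e.g. {"groundedness": 5.0, ...}
```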
This template is set up to run CI/CD when you push changes to your repo. When CI/CD is configured, evaluations will run in GitHub Actions, and your app will then be automatically deployed on every push to main.
To set up CI/CD with GitHub actions on your repository, run the following command:
azd pipeline config
This template uses `gpt-35-turbo-0613`, `gpt-4-1106-Preview` and `gpt-4o-2024-05-13`, which may not be available in all Azure regions. Check for up-to-date region availability and select a region during deployment accordingly.
- We recommend using Canada East.
You can estimate the cost of this project's architecture with Azure's pricing calculator
- Azure subscription with access enabled for Bing Search API
- Azure subscription with access enabled for Azure AI Search
Note
When implementing this template please specify whether the template uses Managed Identity or Key Vault
This template has either Managed Identity or Key Vault built in to eliminate the need for developers to manage these credentials. Applications can use managed identities to obtain Microsoft Entra tokens without having to manage any credentials. Additionally, we have added a GitHub Action tool that scans the infrastructure-as-code files and generates a report containing any detected issues. To ensure best practices in your repo, we recommend that anyone creating solutions based on our templates ensure that the GitHub secret scanning setting is enabled in their repos.
- Prompty Documentation
- Quickstart: Multi-agent applications using Azure OpenAI article: the Microsoft Learn Quickstart article for this sample walks through both deployment and the relevant code for orchestrating multi-agents in chat.
- Develop Python apps that use Azure AI services
This project has adopted the Microsoft Open Source Code of Conduct.
Resources:
- Microsoft Open Source Code of Conduct
- Microsoft Code of Conduct FAQ
- Contact opencode@microsoft.com with questions or concerns
For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
This project follows the responsible AI guidelines and best practices below; please review them before using this project: