forked from microsoft/prompty

Prompty makes it easy to create, manage, debug, and evaluate LLM prompts for your AI applications.


auge2u/prompty

Prompty

Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers. The primary goal is to accelerate the developer inner loop.

This repo contains the Prompty Visual Studio Code extension, which offers an intuitive prompt playground inside VS Code to streamline the prompt engineering process. You can find the Prompty extension in the Visual Studio Code Marketplace.

What is Prompty?

Specification

Prompty standardizes prompts and their execution into a single asset.

Language Spec
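A prompty asset pairs YAML frontmatter (metadata and model configuration) with the prompt template itself. A minimal illustrative example (field names follow common examples from the spec; the deployment name and model details are assumptions you should adjust for your own setup):

```yaml
---
name: ExamplePrompt
description: A basic prompt that answers a question
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-4o   # assumed deployment name
sample:
  question: What is Prompty?
---
system:
You are a helpful assistant.

user:
{{question}}
```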

VS Code Extension Features

Quickly Create

Quickly create a basic prompty by right-clicking in the VS Code explorer and selecting "New Prompty."

Quick Create

Preview

Preview a prompty file much like markdown, with dynamic template rendering as you type, so you can see the exact prompt that will be sent to the model.

Preview
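The dynamic rendering can be pictured as simple placeholder substitution. A minimal sketch (the extension uses a full templating engine, so this is illustrative only; the function name and inputs are hypothetical):

```python
import re

def render(template: str, inputs: dict) -> str:
    # Replace each {{ name }} placeholder with its value from `inputs`;
    # unknown placeholders are left untouched.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(inputs.get(m.group(1), m.group(0))),
        template,
    )

print(render("Hello {{firstName}}, you asked: {{question}}",
             {"firstName": "Seth", "question": "What is Prompty?"}))
# → Hello Seth, you asked: What is Prompty?
```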

Define and Switch Model Configurations

  • Define your model configurations directly in VS Code.

  • Quickly switch between different model configurations.

    Define Configuration

    Switch Model Configuration

  • Use VS Code settings to define model configuration at:

    • User level for use across different prompty files.

    • Workspace level to share with team members via Git.

      ModelConfigurationSettings

  • We strongly encourage using Azure Active Directory authentication for enhanced security. Leave the api_key empty to trigger AAD auth.

  • OpenAI is also supported. You can store the key in VS Code settings or use ${env:xxx} to read the API key from an environment variable.

    • You can put environment variables in a .env file, either in the same folder as the prompty file or in the workspace root folder.
    • Alternatively, you can set the key as a system environment variable. Follow OpenAI's guide on API key safety (e.g., setting it through the Control Panel, zsh, or bash), then restart VS Code to load the new values.
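For example, a workspace-level configuration might look like the fragment below. The settings key and field names here are assumptions based on the extension's configuration UI; check the extension's settings page in VS Code for the authoritative names:

```jsonc
// .vscode/settings.json — illustrative only
{
  "prompty.modelConfigurations": [
    {
      "name": "default",
      "type": "azure_openai",
      "api_version": "2024-02-01",
      "azure_endpoint": "${env:AZURE_OPENAI_ENDPOINT}",
      "azure_deployment": "gpt-4o",
      "api_key": "" // leave empty to trigger AAD auth
    }
  ]
}
```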

Quick Run

Hit F5 or click the Run button at the top. There are two output windows:

  • Prompty Output shows a concise view.

    Prompty Output

  • Prompty Output (Verbose) shows detailed requests sent and received.

    Prompty Output (Verbose)

Orchestrator Integration

Prompty is supported by popular orchestration frameworks such as Prompt flow, LangChain, and Semantic Kernel.

Right-click on a .prompty file to quickly generate integration snippets.

Orchestrator Integration
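These frameworks all consume the same .prompty asset, which is just "---"-delimited frontmatter followed by a template body. A hedged sketch of that split, to show what an integration has to work with (illustrative only, not the official parser):

```python
def split_prompty(text: str) -> tuple[str, str]:
    # Split a .prompty asset into (frontmatter, body). Assumes the
    # asset opens with a '---'-delimited YAML frontmatter block.
    if text.startswith("---"):
        _, frontmatter, body = text.split("---", 2)
        return frontmatter.strip(), body.strip()
    return "", text.strip()

asset = """---
name: ExamplePrompt
---
system:
You are a helpful assistant.
"""
fm, body = split_prompty(asset)
print(fm)    # name: ExamplePrompt
print(body)  # system: / You are a helpful assistant.
```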

Feedback

Submit your feedback about Prompty or the VS Code extension to the microsoft/prompty GitHub repository.

Documentation
