diff --git a/samples/notebooks/python/08-native-function-inline.ipynb b/samples/notebooks/python/08-native-function-inline.ipynb new file mode 100644 index 000000000000..196f1a99dacc --- /dev/null +++ b/samples/notebooks/python/08-native-function-inline.ipynb @@ -0,0 +1,559 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "3c93ac5b", + "metadata": {}, + "source": [ + "# Running Native Functions" + ] + }, + { + "cell_type": "markdown", + "id": "40201641", + "metadata": {}, + "source": [ + "Two of the previous notebooks showed how to [execute semantic functions inline](./03-semantic-function-inline.ipynb) and how to [run prompts from a file](./02-running-prompts-from-file.ipynb).\n", + "\n", + "In this notebook, we'll show how to use native functions from a file. We will also show how to call semantic functions from native functions.\n", + "\n", + "This can be useful in a few scenarios:\n", + "\n", + "* Writing logic around how to run a prompt that changes the prompt's outcome.\n", + "* Using external data sources to gather data to concatenate into your prompt.\n", + "* Validating user input data prior to sending it to the LLM prompt.\n", + "\n", + "Native functions are defined using standard Python code. The structure is simple, but not well documented at this point.\n", + "\n", + "The following examples are intended to help guide new users towards successful native & semantic function use with the SK Python framework." 
+ ] + }, + { + "cell_type": "markdown", + "id": "d90b0c13", + "metadata": {}, + "source": [ + "Prepare a Semantic Kernel instance first, loading the AI service settings defined in the [Setup notebook](00-getting-started.ipynb):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1da651d4", + "metadata": {}, + "outputs": [], + "source": [ + "!python -m pip install semantic-kernel==0.2.7.dev0" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "dd150646", + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "import sys\n", + "import semantic_kernel as sk\n", + "from semantic_kernel.connectors.ai.open_ai import OpenAITextCompletion\n", + "\n", + "kernel = sk.Kernel()\n", + "\n", + "api_key, org_id = sk.openai_settings_from_dot_env()\n", + "kernel.add_text_completion_service(\"dv\", OpenAITextCompletion(\"text-davinci-003\", api_key, org_id))" + ] + }, + { + "cell_type": "markdown", + "id": "186767f8", + "metadata": {}, + "source": [ + "Let's create a **native** function that returns a random number between 3 and a user-provided upper limit. We'll then pass this number to a semantic function to generate that many paragraphs of text." + ] + }, + { + "cell_type": "markdown", + "id": "589733c5", + "metadata": {}, + "source": [ + "First, let's create our native function."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ae29c207", + "metadata": {}, + "outputs": [], + "source": [ + "import random\n", + "from semantic_kernel.skill_definition import sk_function\n", + "\n", + "class GenerateNumberSkill:\n", + " \"\"\"\n", + " Description: Generate a number between 3-x.\n", + " \"\"\"\n", + "\n", + " @sk_function(\n", + " description=\"Generate a random number between 3-x\",\n", + " name=\"GenerateNumberThreeOrHigher\"\n", + " )\n", + " def generate_number_three_or_higher(self, input: str) -> str:\n", + " \"\"\"\n", + " Generate a random number between 3 and the given upper limit\n", + " Example:\n", + " \"8\" => rand(3,8)\n", + " Args:\n", + " input -- The upper limit for the random number generation\n", + " Returns:\n", + " str value\n", + " \"\"\"\n", + " try:\n", + " return str(random.randint(3, int(input)))\n", + " except ValueError as e:\n", + " print(f\"Invalid input {input}\")\n", + " raise e" + ] + }, + { + "cell_type": "markdown", + "id": "f26b90c4", + "metadata": {}, + "source": [ + "Next, let's create a semantic function that accepts a number as `{{$input}}` and generates that number of paragraphs about two Corgis on an adventure. `$input` is a default variable semantic functions can use. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7890943f", + "metadata": {}, + "outputs": [], + "source": [ + "sk_prompt = \"\"\"\n", + "Write a short story about two Corgis on an adventure.\n", + "The story must be:\n", + "- G rated\n", + "- Have a positive message\n", + "- No sexism, racism or other bias/bigotry\n", + "- Be exactly {{$input}} paragraphs long\n", + "\"\"\"\n", + "\n", + "corgi_story = kernel.create_semantic_function(prompt_template=sk_prompt,\n", + " function_name=\"CorgiStory\",\n", + " skill_name=\"CorgiSkill\",\n", + " description=\"Write a short story about two Corgis on an adventure\",\n", + " max_tokens=500,\n", + " temperature=0.5,\n", + " top_p=0.5)\n", + "\n", + "generate_number_skill = kernel.import_skill(GenerateNumberSkill())" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2471c2ab", + "metadata": {}, + "outputs": [], + "source": [ + "# Run the number generator\n", + "generate_number_three_or_higher = generate_number_skill[\"GenerateNumberThreeOrHigher\"]\n", + "number_result = generate_number_three_or_higher(6)\n", + "print(number_result)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f043a299", + "metadata": {}, + "outputs": [], + "source": [ + "story = await corgi_story.invoke_async(input=number_result.result)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "59a60e2a", + "metadata": {}, + "outputs": [], + "source": [ + "print(\"Generating a corgi story exactly {} paragraphs long: \".format(number_result.result))\n", + "print(\"=====================================================\")\n", + "print(story)" + ] + }, + { + "cell_type": "markdown", + "id": "8ef29d16", + "metadata": {}, + "source": [ + "## Context Variables\n", + "\n", + "That works! But let's expand on our example to make it more generic. \n", + "\n", + "For the native function, we'll introduce the lower limit variable. 
This means that a user will input two numbers and the number generator function will pick a number between the first and second input.\n", + "\n", + "We'll make use of the `semantic_kernel.ContextVariables` class to hold these variables." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fa5087bd", + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "import sys\n", + "import semantic_kernel as sk\n", + "from semantic_kernel.connectors.ai.open_ai import OpenAITextCompletion\n", + "\n", + "kernel = sk.Kernel()\n", + "\n", + "api_key, org_id = sk.openai_settings_from_dot_env()\n", + "kernel.add_text_completion_service(\"dv\", OpenAITextCompletion(\"text-davinci-003\", api_key, org_id))" + ] + }, + { + "cell_type": "markdown", + "id": "091f45e4", + "metadata": {}, + "source": [ + "Let's start with the native function. Notice that we're also adding `@sk_function_context_parameter` decorators to describe the variables the function expects, along with any defaults for those inputs. 
Using the `@sk_function_context_parameter` decorator provides the name, description and default values for a function's inputs to the [planner](./05-using-the-planner.ipynb)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4ea462c2", + "metadata": {}, + "outputs": [], + "source": [ + "import random\n", + "from semantic_kernel.skill_definition import sk_function, sk_function_context_parameter\n", + "from semantic_kernel import SKContext\n", + "\n", + "class GenerateNumberSkill:\n", + " \"\"\"\n", + " Description: Generate a number between a min and a max.\n", + " \"\"\"\n", + "\n", + " @sk_function(\n", + " description=\"Generate a random number between min and max\",\n", + " name=\"GenerateNumber\"\n", + " )\n", + " @sk_function_context_parameter(name=\"min\", description=\"Minimum number of paragraphs.\")\n", + " @sk_function_context_parameter(name=\"max\", description=\"Maximum number of paragraphs.\", default_value=10)\n", + " def generate_number(self, context: SKContext) -> str:\n", + " \"\"\"\n", + " Generate a random number between min and max\n", + " Example:\n", + " min=\"4\" max=\"10\" => rand(4,10)\n", + " Args:\n", + " min -- The lower limit for the random number generation\n", + " max -- The upper limit for the random number generation\n", + " Returns:\n", + " str value\n", + " \"\"\"\n", + " try:\n", + " return str(random.randint(int(context[\"min\"]), int(context[\"max\"])))\n", + " except ValueError as e:\n", + " print(f\"Invalid input {context['min']} {context['max']}\")\n", + " raise e" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "48bcdf9e", + "metadata": {}, + "outputs": [], + "source": [ + "generate_number_skill = kernel.import_skill(GenerateNumberSkill())\n", + "generate_number = generate_number_skill[\"GenerateNumber\"]" + ] + }, + { + "cell_type": "markdown", + "id": "6ad068d6", + "metadata": {}, + "source": [ + "Now let's also allow the semantic function to take in additional arguments. 
In this case, we're going to allow our CorgiStory function to be written in a specified language. We'll need to provide a `paragraph_count` and a `language`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8b8286fb", + "metadata": {}, + "outputs": [], + "source": [ + "sk_prompt = \"\"\"\n", + "Write a short story about two Corgis on an adventure.\n", + "The story must be:\n", + "- G rated\n", + "- Have a positive message\n", + "- No sexism, racism or other bias/bigotry\n", + "- Be exactly {{$paragraph_count}} paragraphs long\n", + "- Be written in this language: {{$language}}\n", + "\"\"\"\n", + "\n", + "corgi_story = kernel.create_semantic_function(prompt_template=sk_prompt,\n", + " function_name=\"CorgiStory\",\n", + " skill_name=\"CorgiSkill\",\n", + " description=\"Write a short story about two Corgis on an adventure\",\n", + " max_tokens=500,\n", + " temperature=0.5,\n", + " top_p=0.5)" + ] + }, + { + "cell_type": "markdown", + "id": "fdce1872", + "metadata": {}, + "source": [ + "Now we can call this using the `invoke` function, passing in our `context_variables` via the `variables` parameter. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "97d8d3c9", + "metadata": {}, + "outputs": [], + "source": [ + "context_variables = sk.ContextVariables(variables={\n", + " \"min\": 1,\n", + " \"max\": 5,\n", + " \"language\": \"Spanish\"\n", + "})" + ] + }, + { + "cell_type": "markdown", + "id": "c8778bad", + "metadata": {}, + "source": [ + "Let's add a paragraph count to our context variables." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "28820d9d", + "metadata": {}, + "outputs": [], + "source": [ + "context_variables['paragraph_count'] = generate_number.invoke(variables=context_variables).result" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "dbe07c4d", + "metadata": {}, + "outputs": [], + "source": [ + "# Pass the output to the semantic story function\n", + "story = await corgi_story.invoke_async(variables=context_variables)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6732a30b", + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "print(\"Generating a corgi story exactly {} paragraphs long in {}: \".format(context_variables[\"paragraph_count\"],\n", + " context_variables[\"language\"]))\n", + "print(\"=====================================================\")\n", + "print(story)" + ] + }, + { + "cell_type": "markdown", + "id": "fb786c54", + "metadata": {}, + "source": [ + "## Calling Native Functions within a Semantic Function\n", + "\n", + "One neat thing about the Semantic Kernel is that you can also call native functions from within semantic functions!\n", + "\n", + "We will make our CorgiStory semantic function call a native function, `GenerateNames`, which will return names for our Corgi characters.\n", + "\n", + "We do this using the syntax `{{skill_name.function_name}}`. You can read more about our prompt templating syntax [here](../../../docs/PROMPT_TEMPLATE_LANGUAGE.md). 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d84c7d84", + "metadata": {}, + "outputs": [], + "source": [ + "import random\n", + "from semantic_kernel.skill_definition import sk_function\n", + "\n", + "\n", + "class GenerateNamesSkill:\n", + " \"\"\"\n", + " Description: Generate character names.\n", + " \"\"\"\n", + "\n", + " # The default function name will be the name of the function itself; however, you can override this\n", + " # by setting name= in the @sk_function decorator. In this case, we're using\n", + " # the same name as the function name for simplicity.\n", + " @sk_function(\n", + " description=\"Generate character names\",\n", + " name=\"generate_names\"\n", + " )\n", + " def generate_names(self) -> str:\n", + " \"\"\"\n", + " Generate two names.\n", + " Returns:\n", + " str\n", + " \"\"\"\n", + " names = {\n", + " \"Hoagie\",\n", + " \"Hamilton\",\n", + " \"Bacon\",\n", + " \"Pizza\",\n", + " \"Boots\",\n", + " \"Shorts\",\n", + " \"Tuna\"\n", + " }\n", + " first_name = random.choice(list(names))\n", + " names.remove(first_name)\n", + " second_name = random.choice(list(names))\n", + " return f\"{first_name}, {second_name}\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2ab7d65f", + "metadata": {}, + "outputs": [], + "source": [ + "generate_names_skill = kernel.import_skill(GenerateNamesSkill(), skill_name=\"GenerateNames\")\n", + "generate_names = generate_names_skill[\"generate_names\"]" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "94decd3e", + "metadata": {}, + "outputs": [], + "source": [ + "sk_prompt = \"\"\"\n", + "Write a short story about two Corgis on an adventure.\n", + "The story must be:\n", + "- G rated\n", + "- Have a positive message\n", + "- No sexism, racism or other bias/bigotry\n", + "- Be exactly {{$paragraph_count}} paragraphs long\n", + "- Be written in this language: {{$language}}\n", + "- The two names of the corgis are 
{{GenerateNames.generate_names}}\n", + "\"\"\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "73aca517", + "metadata": {}, + "outputs": [], + "source": [ + "corgi_story = kernel.create_semantic_function(prompt_template=sk_prompt,\n", + " function_name=\"CorgiStory\",\n", + " skill_name=\"CorgiSkill\",\n", + " description=\"Write a short story about two Corgis on an adventure\",\n", + " max_tokens=500,\n", + " temperature=0.5,\n", + " top_p=0.5)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "56e6cf0f", + "metadata": {}, + "outputs": [], + "source": [ + "context_variables = sk.ContextVariables(variables={\n", + " \"min\": 1,\n", + " \"max\": 5,\n", + " \"language\": \"Spanish\"\n", + "})\n", + "context_variables['paragraph_count'] = generate_number.invoke(variables=context_variables).result" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7e980348", + "metadata": {}, + "outputs": [], + "source": [ + "# Pass the output to the semantic story function\n", + "story = await corgi_story.invoke_async(variables=context_variables)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c4ade048", + "metadata": {}, + "outputs": [], + "source": [ + "print(\"Generating a corgi story exactly {} paragraphs long in {}: \".format(context_variables[\"paragraph_count\"],\n", + " context_variables[\"language\"]))\n", + "print(\"=====================================================\")\n", + "print(story)" + ] + }, + { + "cell_type": "markdown", + "id": "42f0c472", + "metadata": {}, + "source": [ + "### Recap\n", + "\n", + "A quick review of what we've learned here:\n", + "\n", + "- We've learned how to create native and semantic functions and register them with the kernel\n", + "- We've seen how we can use context variables to pass custom variables into our prompt\n", + "- We've seen how we can call native functions within semantic function prompts. 
\n" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.9" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +}