Migrate to click #188

Merged · 3 commits · May 30, 2023
41 changes: 29 additions & 12 deletions docs/source/users/index.md
@@ -279,7 +279,7 @@ To clear the local vector database, you can run `/learn -d` and Jupyter AI will

To clear the chat panel, use the `/clear` command. This does not reset the AI model; the model may still remember previous messages that you sent it, and it may use them to inform its responses.

## The `%ai` and `%%ai` magic commands

Jupyter AI can also be used in notebooks via Jupyter AI magics. This section
provides guidance on how to use Jupyter AI magics effectively. The examples in
@@ -320,8 +320,11 @@ model you want to use and specify natural language prompts.

### Choosing a provider and model

The `%%ai` cell magic allows you to invoke a language model of your choice with
a given prompt. The model is identified with a **global model ID**, which is a string with the
syntax `<provider-id>:<local-model-id>`, where `<provider-id>` is the ID of the
provider and `<local-model-id>` is the ID of the model scoped to that provider.
The prompt begins on the second line of the cell.

For example, to send a text prompt to the provider `anthropic` and the model ID
`claude-v1.2`, enter the following code into a cell and run it:
@@ -331,8 +334,7 @@ For example, to send a text prompt to the provider `anthropic` and the model ID
```
%%ai anthropic:claude-v1.2
Write a poem about C++.
```

We currently support the following language model providers:

- `ai21`
- `anthropic`
@@ -342,14 +344,29 @@ providers, as defined in [`langchain.llms`](https://langchain.readthedocs.io/en/latest/reference/modules/llms.html#module-langchain.llms):
- `openai-chat`
- `sagemaker-endpoint`

:::{warning}
As of v0.8.0, only the `%%ai` *cell* magic may be used to invoke a language
model, while the `%ai` *line* magic is reserved for invoking subcommands.
:::
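The distinction in practice: a subcommand occupies a single line, while a prompt requires a full cell (reusing the example model from earlier in this section):

```
%ai list
```

```
%%ai anthropic:claude-v1.2
Write a poem about C++.
```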

### Listing available models

Jupyter AI also includes multiple subcommands, which may be invoked via the
`%ai` *line* magic. Jupyter AI uses subcommands to provide additional utilities
in notebooks while keeping the same concise syntax for invoking a language model.

The `%ai list` subcommand prints a list of available providers and models. Some
providers explicitly define a list of supported models in their API. However,
other providers, like the Hugging Face Hub, lack a well-defined list of available
models. In such cases, it's best to consult the provider's upstream
documentation. The [Hugging Face website](https://huggingface.co/) includes a
list of models, for example.

Optionally, you can specify a provider ID as a positional argument to `%ai list`
to get all models provided by one provider. For example, `%ai list openai` will
display only models provided by the `openai` provider.

### Abbreviated syntax

If your model ID is associated with only one provider, you can omit the `provider-id` and
the colon from the first line. For example, because `ai21` is the only provider of the
111 changes: 45 additions & 66 deletions packages/jupyter-ai-magics/jupyter_ai_magics/magics.py
@@ -5,12 +5,14 @@
import warnings
from typing import Optional

import click
from IPython import get_ipython
from IPython.core.magic import Magics, magics_class, line_cell_magic
from IPython.core.magic_arguments import magic_arguments, argument, parse_argstring
from IPython.display import HTML, JSON, Markdown, Math
from jupyter_ai_magics.utils import decompose_model_id, load_providers

from .providers import BaseProvider
from .parsers import cell_magic_parser, line_magic_parser, CellArgs, HelpArgs, ListArgs


MODEL_ID_ALIASES = {
@@ -96,6 +98,9 @@ def __missing__(self, key):
class EnvironmentError(BaseException):
    pass

class CellMagicError(BaseException):
    pass

@magics_class
class AiMagics(Magics):
def __init__(self, shell):
@@ -109,22 +114,6 @@ def __init__(self, shell):

self.providers = load_providers()

def _ai_help_command_markdown(self):
table = ("| Command | Description |\n"
"| ------- | ----------- |\n")

for command in AI_COMMANDS:
table += "| `" + command + "` | " + AI_COMMANDS[command] + "|\n";

return table

def _ai_help_command_text(self):
output = ""

for command in AI_COMMANDS:
output += command + " - " + AI_COMMANDS[command] + "\n";

return output

def _ai_bulleted_list_models_for_provider(self, provider_id, Provider):
output = ""
@@ -223,30 +212,6 @@ def _ai_list_command_text(self, single_provider=None):

return output

# Run an AI command using the arguments provided as a space-delimited value
def _ai_command(self, command, args_string):
args = args_string.split() # Split by whitespace

# When we can use Python 3.10+, replace this with a 'match' command
if (command == 'help'):
return TextOrMarkdown(self._ai_help_command_text(), self._ai_help_command_markdown())
elif (command == 'list'):
# Optional parameter: model provider ID
provider_id = None
if (len(args) >= 1):
provider_id = args[0]

return TextOrMarkdown(
self._ai_list_command_text(provider_id),
self._ai_list_command_markdown(provider_id)
)
else:
# This should be unreachable, since unhandled commands are treated like model names
return TextOrMarkdown(
f"No handler for command {command}\n",
f"No handler for command `{command}`"
)

def _append_exchange_openai(self, prompt: str, output: str):
"""Appends a conversational exchange between user and an OpenAI Chat
model to a transcript that will be included in future exchanges."""
@@ -269,34 +234,48 @@ def _get_provider(self, provider_id: Optional[str]) -> BaseProvider:

return self.providers[provider_id]

@magic_arguments()
@argument('model_id',
help="""Model to run, specified as a model ID that may be
optionally prefixed with the ID of the model provider, delimited
by a colon.""")
@argument('-f', '--format',
choices=["code", "html", "image", "json", "markdown", "math", "md", "text"],
nargs="?",
default="markdown",
help="""IPython display to use when rendering output. [default="markdown"]""")
@argument('-r', '--reset',
action="store_true",
help="""Clears the conversation transcript used when interacting
with an OpenAI chat model provider. Does nothing with other
providers.""")
@argument('prompt',
nargs='*',
help="""Prompt for code generation. When used as a line magic, it
runs to the end of the line. In cell mode, the entire cell is
considered the code generation prompt.""")
def handle_help(self, _: HelpArgs):
    with click.Context(cell_magic_parser, info_name="%%ai") as ctx:
        click.echo(cell_magic_parser.get_help(ctx))
    click.echo('-' * 78)
    with click.Context(line_magic_parser, info_name="%ai") as ctx:
        click.echo(line_magic_parser.get_help(ctx))

def handle_list(self, args: ListArgs):
    return TextOrMarkdown(
        self._ai_list_command_text(args.provider_id),
        self._ai_list_command_markdown(args.provider_id)
    )
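The handlers above are selected in `ai()` by matching on the `type` field of the parsed-arguments model, a lightweight tagged-union dispatch. A dependency-free sketch of the pattern (plain dataclasses standing in for the pydantic models; handler return values are hypothetical):

```python
from dataclasses import dataclass
from typing import Literal, Optional, Union

# Each parsed-arguments model carries a `type` discriminator, mirroring
# HelpArgs / ListArgs / CellArgs in this PR.
@dataclass
class HelpArgs:
    type: Literal["help"] = "help"

@dataclass
class ListArgs:
    provider_id: Optional[str] = None
    type: Literal["list"] = "list"

Args = Union[HelpArgs, ListArgs]

def dispatch(args: Args) -> str:
    # Route on the discriminator, as `ai()` does with `args.type`.
    if args.type == "help":
        return "showing help"
    if args.type == "list":
        scope = args.provider_id or "all providers"
        return f"listing models for {scope}"
    raise ValueError(f"unhandled args type: {args.type}")
```

Compared with string-matching on raw subcommand names, the discriminator keeps each handler's arguments typed and lets the IDE narrow the union.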

@line_cell_magic
def ai(self, line, cell=None):
    raw_args = line.split(' ')
    if cell:
        args = cell_magic_parser(raw_args, prog_name="%%ai", standalone_mode=False)
    else:
        args = line_magic_parser(raw_args, prog_name="%ai", standalone_mode=False)

    if args == 0:
        # this happens when `--help` is called on the root command, in which
        # case we want to exit early.
        return

    if args.type == "help":
        return self.handle_help(args)
    if args.type == "list":
        return self.handle_list(args)

    # hint to the IDE that this object must be of type `CellArgs`
    args: CellArgs = args

    if not cell:
        raise CellMagicError(
            """[0.8+]: To invoke a language model, you must use the `%%ai`
            cell magic. The `%ai` line magic is only for use with
            subcommands."""
        )

    prompt = cell.strip()

# If the user is attempting to run a command, run the command separately.
if (args.model_id in AI_COMMANDS):
70 changes: 70 additions & 0 deletions packages/jupyter-ai-magics/jupyter_ai_magics/parsers.py
@@ -0,0 +1,70 @@
import click
from pydantic import BaseModel
from typing import Optional, Literal, get_args

FORMAT_CHOICES_TYPE = Literal["code", "html", "image", "json", "markdown", "math", "md", "text"]
FORMAT_CHOICES = list(get_args(FORMAT_CHOICES_TYPE))

class CellArgs(BaseModel):
    type: Literal["root"] = "root"
    model_id: str
    format: FORMAT_CHOICES_TYPE
    reset: bool

class HelpArgs(BaseModel):
    type: Literal["help"] = "help"

class ListArgs(BaseModel):
    type: Literal["list"] = "list"
    provider_id: Optional[str]

class LineMagicGroup(click.Group):
    """Helper class to print the help string for cell magics as well when
    `%ai --help` is called."""
    def get_help(self, ctx):
        with click.Context(cell_magic_parser, info_name="%%ai") as ctx:
            click.echo(cell_magic_parser.get_help(ctx))
        click.echo('-' * 78)
        with click.Context(line_magic_parser, info_name="%ai") as ctx:
            click.echo(super().get_help(ctx))

@click.command()
@click.argument('model_id')
@click.option('-f', '--format',
    type=click.Choice(FORMAT_CHOICES, case_sensitive=False),
    default="markdown",
    help="""IPython display to use when rendering output. [default="markdown"]"""
)
@click.option('-r', '--reset', is_flag=True,
    help="""Clears the conversation transcript used when interacting with an
    OpenAI chat model provider. Does nothing with other providers."""
)
def cell_magic_parser(**kwargs):
    """
    Invokes a language model identified by MODEL_ID, with the prompt being
    contained in all lines after the first. Both local model IDs and global
    model IDs (with the provider ID explicitly prefixed, followed by a colon)
    are accepted.

    To view available language models, please run `%ai list`.
    """
    return CellArgs(**kwargs)

@click.group(cls=LineMagicGroup)
def line_magic_parser():
    """
    Invokes a subcommand.
    """

@line_magic_parser.command(name='help')
def help_subparser():
    """Show this message and exit."""
    return HelpArgs()

@line_magic_parser.command(name='list',
    short_help="List language models. See `%ai list --help` for options."
)
@click.argument('provider_id', required=False)
def list_subparser(**kwargs):
    """List language models, optionally scoped to PROVIDER_ID."""
    return ListArgs(**kwargs)
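The glue between `ai()` and these parsers is click's `standalone_mode=False`: when a command is invoked programmatically with that flag, click returns the callback's return value instead of calling `sys.exit()` after parsing. A small self-contained sketch (a toy command, not the actual parser; argument values are illustrative):

```python
import click

@click.command()
@click.argument('model_id')
@click.option('-f', '--format', 'fmt', default="markdown",
              help="Output format (toy example).")
def demo_parser(model_id, fmt):
    """Toy stand-in for cell_magic_parser: returns the parsed values."""
    return (model_id, fmt)

# With standalone_mode=False, click hands back the callback's result
# instead of exiting the interpreter after parsing.
args = demo_parser(["ai21:j2-jumbo", "-f", "text"], standalone_mode=False)

# `--help` prints the help text and returns exit code 0 without running
# the callback -- the reason ai() checks `if args == 0` above.
helped = demo_parser(["--help"], standalone_mode=False)
```

Here `args` is the tuple `("ai21:j2-jumbo", "text")` and `helped` is `0`, matching the early-return branch in `ai()`.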
3 changes: 2 additions & 1 deletion packages/jupyter-ai-magics/pyproject.toml
@@ -25,7 +25,8 @@ dependencies = [
"pydantic",
"importlib_metadata~=5.2.0",
"langchain==0.0.159",
"typing_extensions==4.5.0"
"typing_extensions==4.5.0",
"click~=8.0",
]

[project.optional-dependencies]