
Template Profile #1381

Merged

Conversation

MikeBirdTech
Collaborator

Describe the changes you have made:

Introduced a template profile for Open Interpreter, including documentation and example settings, to facilitate creating custom profiles.
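
For context, a profile built from this template is just a Python script that configures the shared interpreter object before a session starts. A minimal sketch (the model name and instruction text below are illustrative assumptions, not the template's actual contents):

from interpreter import interpreter

# Choose a model and a couple of behaviors for the use case at hand
interpreter.llm.model = "groq/llama-3.1-70b-versatile"  # example model, see the settings discussed below
interpreter.auto_run = False  # keep confirmation prompts on
interpreter.custom_instructions = "Prefer small, well-commented code blocks."  # illustrative text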

Reference any relevant issues (e.g. "Fixes #000"):

Pre-Submission Checklist (optional but appreciated):

  • I have included relevant documentation updates (stored in /docs)
  • I have read docs/CONTRIBUTING.md
  • I have read docs/ROADMAP.md

OS Tests (optional but appreciated):

  • Tested on Windows
  • Tested on MacOS
  • Tested on Linux

KillianLucas and others added 2 commits August 1, 2024 11:04
New `--groq` profile + error message fix
Introduced a template profile for Open Interpreter, including documentation and example settings, to facilitate creating custom profiles.
Collaborator

@human-bee left a comment


Looks good to me! Well organized and I like pointing to relevant docs for each section. Could possibly use some information about the 'power of profiles':

(pulled from Mintlify cmd+K)
Profiles in Open Interpreter are good for several purposes:

  • Customization for Specific Use Cases: Profiles allow you to customize your instance of Open Interpreter for different use cases. This means you can have multiple configurations tailored to various tasks or projects.

  • Quick Configuration: Profiles make it easy to get going quickly with a specific set of settings. Instead of manually configuring settings each time, you can simply load a pre-configured profile.

  • Sharing Configurations: Profiles can be shared with others by sending them the profile yaml file. This makes it easy to collaborate or share optimized configurations with team members or the community.

  • Local Model Optimization: For local models, profiles can be particularly useful. Local models perform better with extra guidance and direction, and you can improve performance for your use case by creating a new profile (a sketch follows below).
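
As a concrete illustration of the local-model point, a profile for a local setup might only need a handful of lines. The model name and endpoint below are assumptions for the example (a typical Ollama install), not values taken from the template:

# Inside a profile file, after `from interpreter import interpreter`
interpreter.offline = True                            # run locally, no hosted APIs
interpreter.llm.model = "ollama/llama3"               # assumed model served through Ollama
interpreter.llm.api_base = "http://localhost:11434"   # assumed local endpoint
interpreter.llm.supports_functions = False            # many local models lack function calling
interpreter.custom_instructions = "Keep code short and explain each step."  # extra guidance for a smaller model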

# You can set variables
from datetime import date  # import needed for date.today()
today = date.today()

# LLM Settings
Collaborator


I turned all-settings into a list in this format, in case we want to provide them all here and then recommend removing the irrelevant ones. It could definitely be overwhelming though, so I'm not sure it's needed:

LLM Settings - https://docs.openinterpreter.com/settings/all-settings#language-model

interpreter.llm.model = "groq/llama-3.1-70b-versatile" # Specifies which language model to use
interpreter.llm.temperature = 0.7 # Sets the randomness level of the model's output
interpreter.llm.context_window = 110000 # Manually set the context window size in tokens
interpreter.llm.max_tokens = 4096 # Sets the maximum number of tokens for model generation
interpreter.llm.max_output = 1000 # Set the maximum number of characters for code outputs
interpreter.llm.api_base = "https://api.example.com" # Specify custom API base URL
interpreter.llm.api_key = "your_api_key_here" # Set your API key for authentication
interpreter.llm.api_version = '2.0.2' # Optionally set the API version to use
interpreter.llm.supports_functions = False # Inform if the model supports function calling
interpreter.llm.supports_vision = False # Inform if the model supports vision

Interpreter Settings - https://docs.openinterpreter.com/settings/all-settings#interpreter

interpreter.loop = True # Runs Open Interpreter in a loop for task completion
interpreter.verbose = True # Run the interpreter in verbose mode for debugging
interpreter.safe_mode = 'ask' # Enable or disable experimental safety mechanisms
interpreter.auto_run = True # Automatically run without user confirmation
interpreter.max_budget = 0.01 # Sets the maximum budget limit for the session in USD
interpreter.offline = True # Run the model locally
interpreter.system_message = "You are Open Interpreter..." # Modify the system message (not recommended)
interpreter.anonymized_telemetry = False # Opt out of telemetry
interpreter.user_message_template = "{content} Please send me some code..." # Template applied to the User's message
interpreter.always_apply_user_message_template = False # Whether to apply User Message Template to every message
interpreter.code_output_template = "Code output: {content}\nWhat does this output mean?" # Template applied to code output
interpreter.empty_code_output_template = "The code above was executed..." # Message sent when code produces no output
interpreter.code_output_sender = "user" # Determines who sends code output messages

Computer Settings - https://docs.openinterpreter.com/settings/all-settings#computer

interpreter.computer.offline = True # Run the computer in offline mode
interpreter.computer.verbose = True # Used for debugging interpreter.computer
interpreter.computer.emit_images = True # Controls whether the computer should emit images

Import Computer API - https://docs.openinterpreter.com/code-execution/computer-api

interpreter.computer.import_computer_api = True

Toggle OS Mode - https://docs.openinterpreter.com/guides/os-mode

interpreter.os = False

Set Custom Instructions to improve your Interpreter's performance at a given task

# f-string so the `today` variable defined above is interpolated
interpreter.custom_instructions = f"""
Today's date is {today}.
"""

Collaborator Author


I don't think we should include all settings, because then we'd have to maintain documentation for them in multiple places. A clear link to the docs will let people easily see up-to-date info. Having the highest-priority settings should give people enough help to get started.

Any settings that you'd like to see included?

@KillianLucas changed the base branch from main to development August 6, 2024 06:38
@KillianLucas merged commit 25a8dcb into OpenInterpreter:development Aug 6, 2024
0 of 2 checks passed
@KillianLucas
Collaborator

Great work Mike, an excellent platform for a profiles-focused update. Will hopefully inspire a new kind of community of profile builders. Merged!
