diff --git a/.gitignore b/.gitignore
index 5852d0d..60d7064 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,6 +1,6 @@
# 忽略私人配置文件,包含有api_key等信息
-config private.yml
-private.yml
+config private.toml
+private.toml
# Byte-compiled / optimized / DLL files
__pycache__/
@@ -181,4 +181,5 @@ prompt_output/
log.txt
.chroma_db
config.yml
+config.toml
.pre-commit-config.yaml
\ No newline at end of file
diff --git a/README.md b/README.md
index 141a182..260ec1a 100644
--- a/README.md
+++ b/README.md
@@ -1,15 +1,33 @@
-[中文](README_CN.md) | [Background](#-background) | [Features](#-features) | [Getting Started](#-getting-started) | [Future Work](#-future-work) | [Supported Language](#-supported-language) | [Citation](#-citation)
-
-# 🤗 Introduction
-
-RepoAgent is an Open-Source project driven by Large Language Models(LLMs) that aims to provide an intelligent way to document projects.
-It is designed to be a handy tool for developers who need to organize their code and cooperate with teammates.
-
-**Paper:** http://arxiv.org/abs/2402.16667
-
-![RepoAgent](assets/images/RepoAgent.png)
+
RepoAgent: An LLM-Powered Framework for Repository-level Code Documentation Generation.
+
  English README
  •
  简体中文 README
+
## 👾 Background
+
In the realm of computer programming, comprehensive project documentation, including detailed explanations for each Python file, is invaluable. Such documentation serves as the cornerstone for understanding, maintaining, and enhancing the codebase. It provides essential context and rationale for the code, making it easier for current and future developers to comprehend the purpose, functionality, and structure of the software, and it ensures the project remains accessible and modifiable over time, significantly easing the learning curve for new team members.
Traditionally, creating and maintaining software documentation demanded significant human effort and expertise, a challenge for small teams without dedicated personnel. The introduction of Large Language Models (LLMs) like GPT has transformed this, enabling AI to handle much of the documentation process. This shift allows human developers to focus on verification and fine-tuning, greatly reducing the manual burden of documentation.
@@ -31,13 +49,13 @@ Traditionally, creating and maintaining software documentation demanded signific
### Installation Method
-
+```
#### Development Setup Using PDM
@@ -56,7 +74,7 @@ If you're looking to contribute or set up a development environment:
```bash
git clone https://github.com/LOGIC-10/RepoAgent.git
cd RepoAgent
-```
+ ```
- **Setup with PDM**
@@ -76,91 +94,100 @@ If you're looking to contribute or set up a development environment:
### Configuring RepoAgent
-First, configure the OpenAI API parameters in the config.yml file.
-For details on obtaining these, please refer to [OpenAI API](https://beta.openai.com/docs/developer-quickstart/your-api-keys).
-
-In the `config.yml` file, configure other parameters like OpenAI API, the destination repository path, document language, and so on:
-
-```yaml
-api_keys:
- gpt-3.5-turbo-16k:
- - api_key: sk-XXXX
- base_url: https://example.com/v1/
- api_type: azure
- api_version: XXX
- engine: GPT-35-Turbo-16k
- # you can use any kwargs supported by openai.ChatCompletion here
- - api_key: sk-xxxxx
- organization: org-xxxxxx
- model: gpt-3.5-turbo-16k
- ...
-
-default_completion_kwargs:
- model: gpt-4-1106
- temperature: 0.2
- request_timeout: 60
-
-repo_path: /path/to/your/repo
-project_hierarchy: .project_hierarchy # This is a folder, where we store the project hierarchy and metainfo. This can be shared with your team members.
-Markdown_Docs_folder: Markdown_Docs # The folder in the root directory of your target repository to store the documentation.
-ignore_list: ["ignore_file1.py", "ignore_file2.py", "ignore_directory"] # Ignore some py files or folders that you don't want to generate documentation for by giving relative paths in ignore_list.
-whitelist_path: /path/of/whitelist_path_json #if you provide the whitelist json with the same structure in Metainfo, RepoAgent will only process the given part. This is useful in a very big project, like "higgingface Transformers"
-
-language: en # Two-letter language codes (ISO 639-1 codes), e.g. `language: en` for English. Refer to Supported Language for more languages.
-max_thread_count: 10 # We support multiprocessing to speedup the process
-max_document_tokens: 1024 # the maximum number of tokens in a document generated
-log_level: info
+Before configuring RepoAgent's parameters, please ensure your OpenAI API key is set as an environment variable in your shell:
+
+```sh
+export OPENAI_API_KEY=YOUR_API_KEY # on Linux/Mac
+set OPENAI_API_KEY=YOUR_API_KEY # on Windows
+$Env:OPENAI_API_KEY = "YOUR_API_KEY" # on Windows (PowerShell)
+```
+
+Use `repoagent configure` if you need to modify the running parameters.
+
+```sh
+Enter the path to target repository:
+Enter the project hierarchy file name [.project_doc_record]:
+Enter the Markdown documents folder name [markdown_docs]:
+Enter files or directories to ignore, separated by commas []:
+Enter the language (ISO 639 code or language name, e.g., 'en', 'eng', 'English') [Chinese]:
+Enter the maximum number of threads [4]:
+Enter the maximum number of document tokens [1024]:
+Enter the log level (DEBUG, INFO, WARNING, ERROR, CRITICAL) [INFO]:
+Enter the model [gpt-3.5-turbo]:
+Enter the temperature [0.2]:
+Enter the request timeout (seconds) [60.0]:
+Enter the base URL [https://api.openai.com/v1]:
```
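These prompts correspond to the fields of the `config.toml` file. As a sketch, the template shipped in this repository (`config.toml.template`) captures the same defaults:

```toml
[project]
target_repo = ""
hierarchy_name = ".project_doc_record"
markdown_docs_name = "markdown_docs"
ignore_list = []
language = "Chinese"
max_thread_count = 4
max_document_tokens = 1024
log_level = "info"

[chat_completion]
model = "gpt-3.5-turbo"
temperature = 0.2
request_timeout = 60
base_url = "https://api.openai.com/v1"
```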
-### Run RepoAgent
+### Run RepoAgent
Enter the root directory of RepoAgent and try the following command in the terminal:
```sh
-python -m repo_agent #this command will generate doc, or update docs(pre-commit-hook will automatically call this)
+repoagent run  # Generate or update documentation (the pre-commit hook calls this automatically)
+```
+
+The run command supports the following optional flags (if set, they override the configured defaults):
+
+- `-m`, `--model` TEXT: Specifies the model to use for completion. Default: `gpt-3.5-turbo`
+- `-t`, `--temperature` FLOAT: Sets the generation temperature for the model. Lower values make the model more deterministic. Default: `0.2`
+- `-r`, `--request-timeout` INTEGER: Defines the timeout in seconds for the API request. Default: `60`
+- `-b`, `--base-url` TEXT: The base URL for the API calls. Default: `https://api.openai.com/v1`
+- `-tp`, `--target-repo-path` PATH: The file system path to the target repository. Used as the root for documentation generation. Default: `path/to/your/target/repository`
+- `-hp`, `--hierarchy-path` TEXT: The name or path for the project hierarchy file, used to organize documentation structure. Default: `.project_doc_record`
+- `-mdp`, `--markdown-docs-path` TEXT: The folder path where Markdown documentation will be stored or generated. Default: `markdown_docs`
+- `-i`, `--ignore-list` TEXT: A list of files or directories to ignore during documentation generation, separated by commas.
+- `-l`, `--language` TEXT: The ISO 639 code or language name for the documentation. Default: `Chinese`
+- `-ll`, `--log-level` [DEBUG|INFO|WARNING|ERROR|CRITICAL]: Sets the logging level for the application. Default: `INFO`
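
For example, the flags above can be combined in a single invocation (the paths and values here are placeholders, not requirements):

```sh
repoagent run \
  --model gpt-3.5-turbo \
  --temperature 0.2 \
  --target-repo-path /path/to/your/target/repository \
  --markdown-docs-path markdown_docs \
  --language en \
  --log-level INFO
```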
+
+
+You can also try the following features:
-# you can also try the follow feature
-python -m repo_agent clean #this command will remove repoagent-related cache
-python -m repo_agent print #this command will print how repo-agent parse the target repo
-python -m repo_agent diff #this command will check what docs will be updated/generated based on current code change
+```sh
+repoagent clean            # Remove the repoagent-related cache
+repoagent print-hierarchy  # Print how RepoAgent parses the target repo
+repoagent diff             # Check which docs will be updated/generated based on the current code changes
```
If it's your first time generating documentation for the target repository, RepoAgent will automatically create a JSON file maintaining the global structure information and a documentation folder (default `markdown_docs`) in the root directory of the target repository.
-The paths of the global structure information json file and the documentation folder can be configured in `config.yml`.
-
-Once you have initially generated the global documentation for the target repository, or if the project you cloned already contains global documentation information, you can then seamlessly and automatically maintain internal project documentation with your team by configuring the **pre-commit hook** in the target repository!
+Once you have initially generated the global documentation for the target repository, or if the project you cloned already contains global documentation information, you can then seamlessly and automatically maintain internal project documentation with your team by configuring the **pre-commit hook** in the target repository!
-### Configuring the Target Repository
+### Use `pre-commit`
RepoAgent currently supports generating documentation for projects, which requires some configuration in the target repository.
First, ensure that the target repository is a git repository and has been initialized.
-```
+
+```sh
git init
```
Install pre-commit in the target repository to detect changes in the git repository.
-```
+```sh
pip install pre-commit
```
Create a file named `.pre-commit-config.yaml` in the root directory of the target repository. An example is as follows:
-```
+```yml
repos:
- repo: local
hooks:
- id: repo-agent
name: RepoAgent
- entry: python path/to/your/repo_agent/runner.py
+ entry: repoagent
language: system
+ pass_filenames: false # prevent pre-commit from passing filenames to the hook
# You can specify the file types that trigger the hook, but currently only python is supported.
types: [python]
```
+
For details on configuring hooks, refer to [pre-commit](https://pre-commit.com/#plugins).
After configuring the YAML file, run the following command to install the hook.
-```
+
+```sh
pre-commit install
```
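
To try the hook without making a commit, pre-commit can also run hooks on demand (standard pre-commit CLI, executed inside the target repository):

```sh
pre-commit run repo-agent --all-files
```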
+
In this way, each git commit will trigger the RepoAgent hook, automatically detecting changes in the target repository and generating the corresponding documents.
Next, you can make some modifications to the target repository, such as adding a new file to the target repository, or modifying an existing file.
You just need to follow the normal git workflow: `git add`, `git commit -m "your commit message"`, `git push`.
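
The workflow can be sketched end to end in a throwaway repository (the file names below are hypothetical; without the hook installed, the commit simply succeeds with no documentation pass):

```shell
# Illustrative only: create a disposable repo and commit a Python file,
# the same sequence that would trigger the RepoAgent pre-commit hook.
tmpdir=$(mktemp -d)
cd "$tmpdir"
git init -q
printf 'def hello():\n    return "hi"\n' > example.py
git add example.py
git -c user.name=demo -c user.email=demo@example.com commit -q -m "add example.py"
git log --oneline  # shows the single commit just made
```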
@@ -178,25 +205,21 @@ We utilized the default model **gpt-3.5-turbo** to generate documentation for th
**In the end, you can flexibly adjust the output format, template, and other aspects of the document by customizing the prompt. We are excited about your exploration of a more scientific approach to Automated Technical Writing and your contributions to the community.**
### Exploring chat with repo
+
We conceptualize **Chat With Repo** as a unified gateway for these downstream applications, acting as a connector that links RepoAgent to human users and other AI agents. Our future research will focus on adapting the interface to various downstream applications and customizing it to meet their unique characteristics and implementation requirements.
Here we demonstrate a preliminary prototype of one of our downstream tasks: Automatic Q&A for Issues and Code Explanation. You can start the server by running the following command.
-```bash
-python -m repo_agent.chat_with_repo
+
+```sh
+repoagent chat-with-repo
```
## ✅ Future Work
-- [x] Identification and maintenance of parent-child relationship hierarchy structure between objects
-- [x] Implement Black commit
-- [x] **Bi-direct reference** Construct Bi-directional reference topology
-- [x] **chat with repo** Chat with the repository by giving code and document at the same time
-- [x] Automatically generate better visualizations such as Gitbook
+- [x] Support installation and configuration via `pip install repoagent`
- [ ] Generate README.md automatically combining with the global documentation
- [ ] **Multi-programming-language support** Support more programming languages like Java, C or C++, etc.
- [ ] Local model support like Llama, chatGLM, Qwen, GLM4, etc.
-- [ ] Support install and configure via `pip install repoagent`
-- [X] Automatically generate Gitbook for better visualization effects
## 🥰 Featured Cases
@@ -206,45 +229,6 @@ Here are featured cases that have adopted RepoAgent.
- [ChatDev](https://github.com/OpenBMB/ChatDev): Collaborative AI agents for software development.
- [XAgent](https://github.com/OpenBMB/XAgent): An Autonomous LLM Agent for Complex Task Solving.
-## 🇺🇳 Supported Language
-
-Set the target language with the two-letter language codes (ISO 639-1 codes), Click on the 'Languages List' section below to expand the list of supported languages.
-
-
-Languages List
-
-| Flag | Code | Language |
-|------|------|------------|
-| 🇬🇧 | en | English |
-| 🇪🇸 | es | Spanish |
-| 🇫🇷 | fr | French |
-| 🇩🇪 | de | German |
-| 🇨🇳 | zh | Chinese |
-| 🇯🇵 | ja | Japanese |
-| 🇷🇺 | ru | Russian |
-| 🇮🇹 | it | Italian |
-| 🇰🇷 | ko | Korean |
-| 🇳🇱 | nl | Dutch |
-| 🇵🇹 | pt | Portuguese |
-| 🇸🇦 | ar | Arabic |
-| 🇹🇷 | tr | Turkish |
-| 🇸🇪 | sv | Swedish |
-| 🇩🇰 | da | Danish |
-| 🇫🇮 | fi | Finnish |
-| 🇳🇴 | no | Norwegian |
-| 🇵🇱 | pl | Polish |
-| 🇨🇿 | cs | Czech |
-| 🇭🇺 | hu | Hungarian |
-| 🇬🇷 | el | Greek |
-| 🇮🇱 | he | Hebrew |
-| 🇹🇭 | th | Thai |
-| 🇮🇳 | hi | Hindi |
-| 🇧🇩 | bn | Bengali |
-
-
-
-> e.g., `language: en` for English.
-
## 📊 Citation
```bibtex
diff --git a/README_CN.md b/README_CN.md
index 4ade2d5..6343afb 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -1,97 +1,158 @@
-[英文](README.md) | [背景](#-背景) | [特性](#-特性) | [快速开始](#-快速开始) | [未来工作](#-未来工作) | [支持语言](#-支持语言) | [引用我们](#-引用我们)
+RepoAgent:一个用于代码库级别代码文档生成的LLM驱动框架
-# 🤗 介绍
+
-RepoAgent是一个由大型语言模型(LLMs)驱动的开源项目,旨在提供智能化的项目文档编写方式。
-它的设计目标是成为开发人员的便捷工具,用于创建、维护清晰易懂的代码文档并在团队成员之间应用协作。
+
-**论文地址**:http://arxiv.org/abs/2402.16667
+
+ English README
+ •
+ 简体中文说明
+
-![RepoAgent](assets/images/RepoAgent.png)
+## 👾 背景
-# 👾 背景
-在计算机编程领域,全面的项目文档非常重要,包括对每个Python文件的详细解释。这样的文档是理解、维护和增强代码库的基石。它为代码提供了必要的上下文解读,使当前和未来的开发人员更容易理解软件的目的、功能和结构。它不仅有助于当前和未来的开发人员理解项目的目的和结构,还确保项目随着时间的推移保持可访问和可修改,极大地降低了新团队成员的学习曲线。
+在计算机编程领域,全面的项目文档的重要性,包括每个Python文件的详细解释,不言而喻。这样的文档是理解、维护和增强代码库的基石。它提供了代码的必要上下文和理由,使当前和未来的开发者更容易理解软件的目的、功能和结构。它不仅便于当前和未来的开发者理解项目的目的和结构,还确保了项目随时间的推移保持可访问和可修改,大大简化了新团队成员的学习曲线。
-传统上,创建和维护软件文档需要大量的人力和专业知识,这对于没有专门人员的小团队来说是一个挑战。大型语言模型(LLMs)如GPT的引入改变了这一情况,使得AI能够处理大部分文档编写过程。这种转变使得人类开发人员可以专注于验证和微调修改,极大地减轻了文档编写的人工负担。
+传统上,创建和维护软件文档需要大量的人力和专业知识,这对没有专门人员的小团队来说是一个挑战。像GPT这样的大型语言模型(LLMs)的引入改变了这一点,使得AI可以处理大部分文档化过程。这种转变允许人类开发者专注于验证和微调,极大地减少了文档化的手动负担。
-**🏆 我们的目标是创建一个智能的文档助手,自动生成和维护文档,并帮助人类阅读并理解repo项目,最终帮助人类提高效率、节省时间。**
+**🏆 我们的目标是创建一个智能文档助手,帮助人们阅读和理解仓库并生成文档,最终帮助人们提高效率和节省时间。**
-# 🪭 特性
+## ✨ 特性
-- **🤖 自动检测Git仓库中的变更,跟踪文件的添加、删除和修改。**
-- **📝 通过深度递归+AST独立分析代码结构,为各个对象生成文档。**
-- **🔍 精准识别对象间双向调用关系,丰富文档内容的全局视野**
-- **📚 根据变更无缝替换Markdown内容,保持文档的一致性。**
-- **🕙 执行多线程并发操作,提高文档生成的效率。**
-- **👭 为团队协作提供可持续、自动化的文档更新方法。**
-- **😍 美观的文档书(Gitbook)展示**
+- **🤖 自动检测Git仓库中的变化,跟踪文件的增加、删除和修改。**
+- **📝 通过AST独立分析代码结构,为各个对象生成文档。**
+- **🔍 准确识别对象间的双向调用关系,丰富文档内容的全局视角。**
+- **📚 根据变化无缝替换Markdown内容,保持文档一致性。**
+- **🕙 执行多线程并发操作,提高文档生成效率。**
+- **👭 为团队协作提供可持续的自动化文档更新方法。**
+- **😍 以惊人的方式展示代码文档(每个项目都有由Gitbook提供支持的文档书)。**
-# 📦 安装
-首先,确保您的机器安装了python3.9以上的版本
-```
-$ python --version
-python 3.11.4
-```
+## 🚀 开始使用
+
+### 安装方法
+
+#### 使用pip(普通用户首选)
-接着,克隆本项目,创建一个虚拟环境,并在环境内安装依赖
+直接使用pip安装`repoagent`包:
+
+```bash
+pip install repoagent
```
-cd RepoAgent
-conda create -n RepoAgent python=3.11.4
-conda activate RepoAgent
-pip install -r requirements.txt
+
+#### 使用PDM进行开发环境设置
+
+如果您想要贡献或者设置一个开发环境:
+
+- **安装PDM**:如果您还没有安装,请[安装PDM](https://pdm-project.org/latest/#installation)。
+- **使用CodeSpace或克隆仓库**:
+
+ - **使用CodeSpace**
+ 获取RepoAgent环境的最简单方式。点击下面链接使用GitHub Codespace,然后进行下一步。
+
+ [![在GitHub Codespaces中打开](https://github.com/codespaces/badge.svg)](https://codespaces.new/LOGIC-10/RepoAgent?quickstart=1)
+
+ - **克隆仓库**
+
+ ```bash
+ git clone https://github.com/LOGIC-10/RepoAgent.git
+ cd RepoAgent
+ ```
+
+- **使用PDM设置**
+
+ - 初始化Python虚拟环境。确保在`/RepoAgent`目录下运行下面的命令:
+
+ ```bash
+ pdm venv create --name repoagent
+ ```
+
+ - [激活虚拟环境](https://pdm-project.org/latest/usage/venv/#activate-a-virtualenv)
+
+ - 使用PDM安装依赖
+
+ ```bash
+ pdm install
+ ```
+
+### 配置RepoAgent
+
+在配置RepoAgent具体参数之前,请先确保已经在命令行配置 OpenAI API 作为环境变量:
+
+```sh
+export OPENAI_API_KEY=YOUR_API_KEY # on Linux/Mac
+
+set OPENAI_API_KEY=YOUR_API_KEY # on Windows
+$Env:OPENAI_API_KEY = "YOUR_API_KEY" # on Windows (PowerShell)
```
-# 📖 快速开始
-## 配置RepoAgent
-在`config.yml`文件中,配置OpenAI API的相关参数信息、如目标仓库的路径、文档语言等。
-```yaml
-api_keys:
- gpt-3.5-turbo-16k:
- - api_key: sk-XXXX
- base_url: https://example.com/v1/
- api_type: azure
- api_version: XXX
- engine: GPT-35-Turbo-16k
- # you can use any kwargs supported by openai.ChatCompletion here
- - api_key: sk-xxxxx
- organization: org-xxxxxx
- model: gpt-3.5-turbo-16k
- ...
-
-default_completion_kwargs:
- model: gpt-4
- temperature: 0.2
- request_timeout: 60
-
-repo_path: /path/to/your/repo
-project_hierarchy: .project_hierarchy # 全局结构信息文件夹的路径
-Markdown_Docs_folder: Markdown_Docs # 目标存储库根目录中用于存储文档的文件夹
-ignore_list: ["ignore_file1.py", "ignore_file2.py", "ignore_directory"] # 通过在ignore_list中给出相对路径来忽略一些您不想为其生成文档的py文件或文件夹
-
-language: zh # 双字母语言代码(ISO 639-1 代码),例如 `language: en` 表示英语,有关更多语言,请参阅支持的语言
-max_thread_count: 10 # 我们支持多线程执行来加速文档生成过程
-max_document_tokens: 1024 # 每一个对象文档(如类、函数)允许的最大长度
-log_level: info # log信息显示等级
+如果需要修改运行参数,请使用 `repoagent configure`:
+
+```sh
+Enter the path to target repository:
+Enter the project hierarchy file name [.project_doc_record]:
+Enter the Markdown documents folder name [markdown_docs]:
+Enter files or directories to ignore, separated by commas []:
+Enter the language (ISO 639 code or language name, e.g., 'en', 'eng', 'English') [Chinese]:
+Enter the maximum number of threads [4]:
+Enter the maximum number of document tokens [1024]:
+Enter the log level (DEBUG, INFO, WARNING, ERROR, CRITICAL) [INFO]:
+Enter the model [gpt-3.5-turbo]:
+Enter the temperature [0.2]:
+Enter the request timeout (seconds) [60.0]:
+Enter the base URL [https://api.openai.com/v1]:
```
## 运行RepoAgent
-进入RepoAgent根目录,在命令行输入以下命令:
+进入RepoAgent根目录并在终端尝试以下命令:
```sh
-python -m repo_agent # 此命令将生成文档或更新文档(pre-commit钩子将自动调用此命令)
+repoagent run # 这条命令会生成文档或自动更新文档 (pre-commit-hook 会自动调用它)
+```
+
+run 命令支持以下可选标志(如果设置,将覆盖配置默认值):
-# 你也可以尝试以下功能
-python -m repo_agent clean # 此命令将删除与repoagent相关的缓存
-python -m repo_agent print # 此命令将打印repo-agent如何解析目标仓库
-python -m repo_agent diff # 此命令将检查基于当前代码更改将更新/生成哪些文档
+- `-m`, `--model` TEXT:指定用于完成的模型。默认值:`gpt-3.5-turbo`
+- `-t`, `--temperature` FLOAT:设置模型的生成温度。较低的值使模型更确定性。默认值:`0.2`
+- `-r`, `--request-timeout` INTEGER:定义 API 请求的超时时间(秒)。默认值:`60`
+- `-b`, `--base-url` TEXT:API 调用的基础 URL。默认值:`https://api.openai.com/v1`
+- `-tp`, `--target-repo-path` PATH:目标仓库的文件系统路径。用作文档生成的根路径。默认值:`path/to/your/target/repository`
+- `-hp`, `--hierarchy-path` TEXT:项目层级文件的名称或路径,用于组织文档结构。默认值:`.project_doc_record`
+- `-mdp`, `--markdown-docs-path` TEXT:Markdown 文档将被存储或生成的文件夹路径。默认值:`markdown_docs`
+- `-i`, `--ignore-list` TEXT:在文档生成过程中要忽略的文件或目录列表,用逗号分隔。
+- `-l`, `--language` TEXT:文档的 ISO 639 代码或语言名称。默认值:`Chinese`
+- `-ll`, `--log-level` [DEBUG|INFO|WARNING|ERROR|CRITICAL]:设置应用程序的日志级别。默认值:`INFO`
+
+你也可以尝试以下功能
+
+```sh
+repoagent clean # 此命令将删除与repoagent相关的缓存
+repoagent print-hierarchy # 此命令将打印repoagent解析出的目标仓库
+repoagent diff # 此命令将检查基于当前代码更改将更新/生成哪些文档
```
-如果您是第一次对目标仓库生成文档,此时RepoAgent会自动生成一个维护全局结构信息的json文件,并在目标仓库根目录下创建一个名为Markdown_Docs的文件夹,用于存放文档。
+如果您是第一次对目标仓库生成文档,此时RepoAgent会自动生成一个维护全局结构信息的json文件,并在目标仓库根目录下创建一个文件夹用于存放文档。
全局结构信息json文件和文档文件夹的路径都可以在`config.yml`中进行配置。
当您首次完成对目标仓库生成全局文档后,或您clone下来的项目已经包含了全局文档信息后,就可以通过**pre-commit**配置目标仓库**hook**和团队一起无缝自动维护一个项目内部文档了!
-## 配置目标仓库
+### 配置目标仓库
RepoAgent目前支持对项目的文档生成和自动维护,因此需要对目标仓库进行一定的配置。
@@ -110,8 +171,9 @@ repos:
hooks:
- id: repo-agent
name: RepoAgent
- entry: python path/to/your/repo_agent/runner.py
+ entry: repoagent
language: system
+ pass_filenames: false # 阻止pre commit传入文件名作为参数
# 可以指定钩子触发的文件类型,但是目前只支持python
types: [python]
```
@@ -142,21 +204,18 @@ RepoAgent hook会在git commit时自动触发,检测前一步您git add的文
我们将与仓库对话视为所有下游应用的统一入口,作为连接RepoAgent与人类用户和其他AI智能体之间的接口。我们未来的研究将探索适配各种下游应用的接口,并实现这些下游任务的独特性和现实要求。
在这里,我们展示了我们的下游任务之一的初步原型:自动issue问题解答和代码解释。您可以通过在终端运行以下代码启动服务。
-```bash
-python -m repo_agent.chat_with_repo
+
+```sh
+repoagent chat-with-repo
```
# ✅ 未来工作
-- [x] 对象间父子关系层级结构识别及维护
-- [x] 实现 Black commit
-- [x] **Bi-direct reference** 构建双向引用拓扑结构
-- [x] **与仓库对话(chat with repo)** 通过直接提供相关代码文件、代码块和文档信息使用户能直接向Repo提问
+- [x] 支持通过`pip install repoagent`将项目作为包进行安装配置
- [ ] 通过全局文档信息自动生成仓库README.md文件
- [ ] **多编程语言支持** 支持更多编程语言,如Java、C或C++等
- [ ] 本地模型支持如 Llama、chatGLM、Qianwen 等
-- [ ] 支持通过`pip install repoagent`将项目作为包进行安装配置
-- [x] 自动生成Gitbook等更佳的可视化效果
+
# 🥰 精选案例
@@ -166,43 +225,6 @@ python -m repo_agent.chat_with_repo
- [ChatDev](https://github.com/OpenBMB/ChatDev): 用于软件开发的协作式AI智能体。
- [XAgent](https://github.com/OpenBMB/XAgent): 一个用于解决复杂任务的自主大型语言模型智能体。
-# 🇺🇳 支持语言
-在`config.yml`配置文件中使用两个字母的语言代码(ISO 639-1代码)设置生成文档的目标语言,点击下方的'语言列表'部分以展开支持的语言列表。
-
-
-语言列表
-
-| 国旗 | 语言代码 | 语言 |
-|------|------|------------|
-| 🇬🇧 | en | English |
-| 🇪🇸 | es | Spanish |
-| 🇫🇷 | fr | French |
-| 🇩🇪 | de | German |
-| 🇨🇳 | zh | Chinese |
-| 🇯🇵 | ja | Japanese |
-| 🇷🇺 | ru | Russian |
-| 🇮🇹 | it | Italian |
-| 🇰🇷 | ko | Korean |
-| 🇳🇱 | nl | Dutch |
-| 🇵🇹 | pt | Portuguese |
-| 🇸🇦 | ar | Arabic |
-| 🇹🇷 | tr | Turkish |
-| 🇸🇪 | sv | Swedish |
-| 🇩🇰 | da | Danish |
-| 🇫🇮 | fi | Finnish |
-| 🇳🇴 | no | Norwegian |
-| 🇵🇱 | pl | Polish |
-| 🇨🇿 | cs | Czech |
-| 🇭🇺 | hu | Hungarian |
-| 🇬🇷 | el | Greek |
-| 🇮🇱 | he | Hebrew |
-| 🇹🇭 | th | Thai |
-| 🇮🇳 | hi | Hindi |
-| 🇧🇩 | bn | Bengali |
-
-
-
-> 例如,`language: en`代表生成的文档使用英语。
# 📊 引用我们
```bibtex
diff --git a/config.toml.template b/config.toml.template
new file mode 100644
index 0000000..628caaa
--- /dev/null
+++ b/config.toml.template
@@ -0,0 +1,15 @@
+[project]
+target_repo = ""
+hierarchy_name = ".project_doc_record"
+markdown_docs_name = "markdown_docs"
+ignore_list = []
+language = "Chinese"
+max_thread_count = 4
+max_document_tokens = 1024
+log_level = "info"
+
+[chat_completion]
+model = "gpt-3.5-turbo"
+temperature = 0.2
+request_timeout = 60
+base_url = "https://api.openai.com/v1"
diff --git a/config.yml.template b/config.yml.template
deleted file mode 100644
index 7aa81de..0000000
--- a/config.yml.template
+++ /dev/null
@@ -1,51 +0,0 @@
-api_keys:
- gpt-3.5-turbo:
- - api_key: sk-XXXX
- base_url: https://example.com/v1/
- model: gpt-3.5-turbo
- gpt-3.5-turbo-16k:
- - api_key: sk-XXXX
- base_url: https://example.com/v1/
- api_type: azure
- api_version: XXX
- engine: GPT-35-Turbo-16k
- # you can use any kwargs supported by openai.ChatCompletion here
- - api_key: sk-xxxxx
- organization: org-xxxxxx
- model: gpt-3.5-turbo-16k
- gpt-4:
- - api_key: sk-XXXX
- base_url: https://example.com/v1/
- model: gpt-4
- gpt-4-32k:
- - api_key: sk-XXXX
- base_url: https://example.com/v1/
- api_type: XXX
- api_version: XXX
- engine: gpt4-32
- gpt-4-1106:
- - api_key: sk-XXXX
- base_url: https://example.com/v1/
- model: gpt-4-1106
- gpt-4-0125-preview:
- - api_key: sk-XXXX
- base_url: https://example.com/v1/
- model: gpt-4-0125-preview
-
-default_completion_kwargs:
- model: gpt-3.5-turbo
- temperature: 0.2
- request_timeout: 60
-
-
-
-repo_path: /path/to/your/local/repo
-project_hierarchy: .project_doc_record # Please NOTE that this is a folder where you can store your project hierarchy and share it with your team members.
-Markdown_Docs_folder: markdown_docs # Please pay attention to the way the path is written. Do not add a slash cuz the absolute path is written starting with a slash.
-ignore_list: ["ignore_file1.py", "ignore_file2.py", "ignore_directory"] # optional and if needed, relative to repo_path
-whitelist_path: #if whitelist_path is not none, We only generate docs on whitelist
-
-language: zh
-max_thread_count: 5
-max_document_tokens: 1024 # the maximum number of tokens in a document generated
-log_level: info
\ No newline at end of file
diff --git a/pdm.lock b/pdm.lock
index 24e78fc..ba8bf17 100644
--- a/pdm.lock
+++ b/pdm.lock
@@ -5,7 +5,7 @@
groups = ["default", "test", "dev"]
strategy = ["cross_platform", "inherit_metadata"]
lock_version = "4.4.1"
-content_hash = "sha256:2cba7560e488a60d1f3c00ba769bd000fe9ebec8210b9e9e18b9544710ff6635"
+content_hash = "sha256:42d6e2d804df1e83e58d4784fb71c6c613cb2679f5225bfb545063a045d14189"
[[package]]
name = "aiofiles"
@@ -229,30 +229,31 @@ files = [
[[package]]
name = "build"
-version = "1.0.3"
+version = "1.1.1"
requires_python = ">= 3.7"
summary = "A simple, correct Python build frontend"
groups = ["default"]
dependencies = [
"colorama; os_name == \"nt\"",
+ "importlib-metadata>=4.6; python_full_version < \"3.10.2\"",
"packaging>=19.0",
"pyproject-hooks",
"tomli>=1.1.0; python_version < \"3.11\"",
]
files = [
- {file = "build-1.0.3-py3-none-any.whl", hash = "sha256:589bf99a67df7c9cf07ec0ac0e5e2ea5d4b37ac63301c4986d1acb126aa83f8f"},
- {file = "build-1.0.3.tar.gz", hash = "sha256:538aab1b64f9828977f84bc63ae570b060a8ed1be419e7870b8b4fc5e6ea553b"},
+ {file = "build-1.1.1-py3-none-any.whl", hash = "sha256:8ed0851ee76e6e38adce47e4bee3b51c771d86c64cf578d0c2245567ee200e73"},
+ {file = "build-1.1.1.tar.gz", hash = "sha256:8eea65bb45b1aac2e734ba2cc8dad3a6d97d97901a395bd0ed3e7b46953d2a31"},
]
[[package]]
name = "cachetools"
-version = "5.3.2"
+version = "5.3.3"
requires_python = ">=3.7"
summary = "Extensible memoizing collections and decorators"
groups = ["default"]
files = [
- {file = "cachetools-5.3.2-py3-none-any.whl", hash = "sha256:861f35a13a451f94e301ce2bec7cac63e881232ccce7ed67fab9b5df4d3beaa1"},
- {file = "cachetools-5.3.2.tar.gz", hash = "sha256:086ee420196f7b2ab9ca2db2520aca326318b68fe5ba8bc4d49cca91add450f2"},
+ {file = "cachetools-5.3.3-py3-none-any.whl", hash = "sha256:0abad1021d3f8325b2fc1d2e9c8b9c9d57b04c3932657a72465447332c24d945"},
+ {file = "cachetools-5.3.3.tar.gz", hash = "sha256:ba29e2dfa0b8b556606f097407ed1aa62080ee108ab0dc5ec9d6a723a007d105"},
]
[[package]]
@@ -346,7 +347,7 @@ files = [
[[package]]
name = "chromadb"
-version = "0.4.23"
+version = "0.4.24"
requires_python = ">=3.8"
summary = "Chroma."
groups = ["default"]
@@ -381,8 +382,8 @@ dependencies = [
"uvicorn[standard]>=0.18.3",
]
files = [
- {file = "chromadb-0.4.23-py3-none-any.whl", hash = "sha256:3d3c2ffb4ff560721e3daf8c1a3729fd149c551525b6f75543eddb81a4f29e16"},
- {file = "chromadb-0.4.23.tar.gz", hash = "sha256:54d9a770640704c6cedc15317faab9fd45beb9833e7484c00037e7a8801a349f"},
+ {file = "chromadb-0.4.24-py3-none-any.whl", hash = "sha256:3a08e237a4ad28b5d176685bd22429a03717fe09d35022fb230d516108da01da"},
+ {file = "chromadb-0.4.24.tar.gz", hash = "sha256:a5c80b4e4ad9b236ed2d4899a5b9e8002b489293f2881cb2cadab5b199ee1c72"},
]
[[package]]
@@ -545,7 +546,7 @@ files = [
[[package]]
name = "fastapi"
-version = "0.109.2"
+version = "0.110.0"
requires_python = ">=3.8"
summary = "FastAPI framework, high performance, easy to learn, fast to code, ready for production"
groups = ["default"]
@@ -555,8 +556,8 @@ dependencies = [
"typing-extensions>=4.8.0",
]
files = [
- {file = "fastapi-0.109.2-py3-none-any.whl", hash = "sha256:2c9bab24667293b501cad8dd388c05240c850b58ec5876ee3283c47d6e1e3a4d"},
- {file = "fastapi-0.109.2.tar.gz", hash = "sha256:f3817eac96fe4f65a2ebb4baa000f394e55f5fccdaf7f75250804bc58f354f73"},
+ {file = "fastapi-0.110.0-py3-none-any.whl", hash = "sha256:87a1f6fb632a218222c5984be540055346a8f5d8a68e8f6fb647b1dc9934de4b"},
+ {file = "fastapi-0.110.0.tar.gz", hash = "sha256:266775f0dcc95af9d3ef39bad55cff525329a931d5fd51930aadd4f428bf7ff3"},
]
[[package]]
@@ -581,12 +582,12 @@ files = [
[[package]]
name = "flatbuffers"
-version = "23.5.26"
+version = "24.3.7"
summary = "The FlatBuffers serialization format for Python"
groups = ["default"]
files = [
- {file = "flatbuffers-23.5.26-py2.py3-none-any.whl", hash = "sha256:c0ff356da363087b915fde4b8b45bdda73432fc17cddb3c8157472eab1422ad1"},
- {file = "flatbuffers-23.5.26.tar.gz", hash = "sha256:9ea1144cac05ce5d86e2859f431c6cd5e66cd9c78c558317c7955fb8d4c78d89"},
+ {file = "flatbuffers-24.3.7-py2.py3-none-any.whl", hash = "sha256:80c4f5dcad0ee76b7e349671a0d657f2fbba927a0244f88dd3f5ed6a3694e1fc"},
+ {file = "flatbuffers-24.3.7.tar.gz", hash = "sha256:0895c22b9a6019ff2f4de2e5e2f7cd15914043e6e7033a94c0c6369422690f22"},
]
[[package]]
@@ -751,7 +752,7 @@ files = [
[[package]]
name = "gradio"
-version = "4.19.2"
+version = "4.21.0"
requires_python = ">=3.8"
summary = "Python library for easily interacting with trained machine learning models"
groups = ["default"]
@@ -760,7 +761,7 @@ dependencies = [
"altair<6.0,>=4.2.0",
"fastapi",
"ffmpy",
- "gradio-client==0.10.1",
+ "gradio-client==0.12.0",
"httpx>=0.24.1",
"huggingface-hub>=0.19.3",
"importlib-resources<7.0,>=1.3",
@@ -784,13 +785,13 @@ dependencies = [
"uvicorn>=0.14.0",
]
files = [
- {file = "gradio-4.19.2-py3-none-any.whl", hash = "sha256:acab4a35f556dbc3ae637469312738d154bcb73f0b8d5f4f65e4d067ecb1e0b1"},
- {file = "gradio-4.19.2.tar.gz", hash = "sha256:6fe5815bb4dfaeed1fc74223bffd91da70a1b463158af8c5e03d01bb09068a1d"},
+ {file = "gradio-4.21.0-py3-none-any.whl", hash = "sha256:521376440e4d5347f48798a15439497910041db4a522187fd8e3807a909c4a8e"},
+ {file = "gradio-4.21.0.tar.gz", hash = "sha256:81acbfbf87d07d472889e14c696a3da09acc6f1abd1d7e3d7f1a963e7b727363"},
]
[[package]]
name = "gradio-client"
-version = "0.10.1"
+version = "0.12.0"
requires_python = ">=3.8"
summary = "Python library for easily interacting with trained machine learning models"
groups = ["default"]
@@ -803,8 +804,8 @@ dependencies = [
"websockets<12.0,>=10.0",
]
files = [
- {file = "gradio_client-0.10.1-py3-none-any.whl", hash = "sha256:a0413fffdde3360e0f6aaec8b8c23d8a320049a571de2d111d85ebd295002165"},
- {file = "gradio_client-0.10.1.tar.gz", hash = "sha256:879eb56fae5d6b1603bb9375b88d1de0d034f3dac4b3afc8dbc66f36f6e54d5d"},
+ {file = "gradio_client-0.12.0-py3-none-any.whl", hash = "sha256:ead1d3016cd42e9275cf62dd3227ab4472d4093da1a1c6c3c0fe6ca516e9d31f"},
+ {file = "gradio_client-0.12.0.tar.gz", hash = "sha256:7a7e3406829a176153ba1a67f1e1779c2e897d775c7f70f3aeb7e7b2e1ce906b"},
]
[[package]]
@@ -958,7 +959,7 @@ files = [
[[package]]
name = "huggingface-hub"
-version = "0.20.3"
+version = "0.21.4"
requires_python = ">=3.8.0"
summary = "Client library to download and publish models, datasets and other repos on the huggingface.co hub"
groups = ["default"]
@@ -972,8 +973,8 @@ dependencies = [
"typing-extensions>=3.7.4.3",
]
files = [
- {file = "huggingface_hub-0.20.3-py3-none-any.whl", hash = "sha256:d988ae4f00d3e307b0c80c6a05ca6dbb7edba8bba3079f74cda7d9c2e562a7b6"},
- {file = "huggingface_hub-0.20.3.tar.gz", hash = "sha256:94e7f8e074475fbc67d6a71957b678e1b4a74ff1b64a644fd6cbb83da962d05d"},
+ {file = "huggingface_hub-0.21.4-py3-none-any.whl", hash = "sha256:df37c2c37fc6c82163cdd8a67ede261687d80d1e262526d6c0ce73b6b3630a7b"},
+ {file = "huggingface_hub-0.21.4.tar.gz", hash = "sha256:e1f4968c93726565a80edf6dc309763c7b546d0cfe79aa221206034d50155531"},
]
[[package]]
@@ -1017,13 +1018,13 @@ files = [
[[package]]
name = "importlib-resources"
-version = "6.1.1"
+version = "6.1.3"
requires_python = ">=3.8"
summary = "Read resources from Python packages"
groups = ["default"]
files = [
- {file = "importlib_resources-6.1.1-py3-none-any.whl", hash = "sha256:e8bf90d8213b486f428c9c39714b920041cb02c184686a3dee24905aaa8105d6"},
- {file = "importlib_resources-6.1.1.tar.gz", hash = "sha256:3893a00122eafde6894c59914446a512f728a0c1a45f9bb9b63721b6bacf0b4a"},
+ {file = "importlib_resources-6.1.3-py3-none-any.whl", hash = "sha256:4c0269e3580fe2634d364b39b38b961540a7738c02cb984e98add8b4221d793d"},
+ {file = "importlib_resources-6.1.3.tar.gz", hash = "sha256:56fb4525197b78544a3354ea27793952ab93f935bb4bf746b846bb1015020f2b"},
]
[[package]]
@@ -1313,7 +1314,7 @@ files = [
[[package]]
name = "marshmallow"
-version = "3.20.2"
+version = "3.21.1"
requires_python = ">=3.8"
summary = "A lightweight library for converting complex datatypes to and from native Python datatypes."
groups = ["default"]
@@ -1321,8 +1322,8 @@ dependencies = [
"packaging>=17.0",
]
files = [
- {file = "marshmallow-3.20.2-py3-none-any.whl", hash = "sha256:c21d4b98fee747c130e6bc8f45c4b3199ea66bc00c12ee1f639f0aeca034d5e9"},
- {file = "marshmallow-3.20.2.tar.gz", hash = "sha256:4c1daff273513dc5eb24b219a8035559dc573c8f322558ef85f5438ddd1236dd"},
+ {file = "marshmallow-3.21.1-py3-none-any.whl", hash = "sha256:f085493f79efb0644f270a9bf2892843142d80d7174bbbd2f3713f2a589dc633"},
+ {file = "marshmallow-3.21.1.tar.gz", hash = "sha256:4e65e9e0d80fc9e609574b9983cf32579f305c718afb30d7233ab818571768c3"},
]
[[package]]
@@ -1611,7 +1612,7 @@ files = [
[[package]]
name = "onnxruntime"
-version = "1.17.0"
+version = "1.17.1"
summary = "ONNX Runtime is a runtime accelerator for Machine Learning models"
groups = ["default"]
dependencies = [
@@ -1623,26 +1624,26 @@ dependencies = [
"sympy",
]
files = [
- {file = "onnxruntime-1.17.0-cp310-cp310-macosx_11_0_universal2.whl", hash = "sha256:d2b22a25a94109cc983443116da8d9805ced0256eb215c5e6bc6dcbabefeab96"},
- {file = "onnxruntime-1.17.0-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b4c87d83c6f58d1af2675fc99e3dc810f2dbdb844bcefd0c1b7573632661f6fc"},
- {file = "onnxruntime-1.17.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:dba55723bf9b835e358f48c98a814b41692c393eb11f51e02ece0625c756b797"},
- {file = "onnxruntime-1.17.0-cp310-cp310-win32.whl", hash = "sha256:ee48422349cc500273beea7607e33c2237909f58468ae1d6cccfc4aecd158565"},
- {file = "onnxruntime-1.17.0-cp310-cp310-win_amd64.whl", hash = "sha256:f34cc46553359293854e38bdae2ab1be59543aad78a6317e7746d30e311110c3"},
- {file = "onnxruntime-1.17.0-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:16d26badd092c8c257fa57c458bb600d96dc15282c647ccad0ed7b2732e6c03b"},
- {file = "onnxruntime-1.17.0-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6f1273bebcdb47ed932d076c85eb9488bc4768fcea16d5f2747ca692fad4f9d3"},
- {file = "onnxruntime-1.17.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cb60fd3c2c1acd684752eb9680e89ae223e9801a9b0e0dc7b28adabe45a2e380"},
- {file = "onnxruntime-1.17.0-cp311-cp311-win32.whl", hash = "sha256:4b038324586bc905299e435f7c00007e6242389c856b82fe9357fdc3b1ef2bdc"},
- {file = "onnxruntime-1.17.0-cp311-cp311-win_amd64.whl", hash = "sha256:93d39b3fa1ee01f034f098e1c7769a811a21365b4883f05f96c14a2b60c6028b"},
- {file = "onnxruntime-1.17.0-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:90c0890e36f880281c6c698d9bc3de2afbeee2f76512725ec043665c25c67d21"},
- {file = "onnxruntime-1.17.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7466724e809a40e986b1637cba156ad9fc0d1952468bc00f79ef340bc0199552"},
- {file = "onnxruntime-1.17.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d47bee7557a8b99c8681b6882657a515a4199778d6d5e24e924d2aafcef55b0a"},
- {file = "onnxruntime-1.17.0-cp312-cp312-win32.whl", hash = "sha256:bb1bf1ee575c665b8bbc3813ab906e091a645a24ccc210be7932154b8260eca1"},
- {file = "onnxruntime-1.17.0-cp312-cp312-win_amd64.whl", hash = "sha256:ac2f286da3494b29b4186ca193c7d4e6a2c1f770c4184c7192c5da142c3dec28"},
+ {file = "onnxruntime-1.17.1-cp310-cp310-macosx_11_0_universal2.whl", hash = "sha256:d43ac17ac4fa3c9096ad3c0e5255bb41fd134560212dc124e7f52c3159af5d21"},
+ {file = "onnxruntime-1.17.1-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:55b5e92a4c76a23981c998078b9bf6145e4fb0b016321a8274b1607bd3c6bd35"},
+ {file = "onnxruntime-1.17.1-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ebbcd2bc3a066cf54e6f18c75708eb4d309ef42be54606d22e5bdd78afc5b0d7"},
+ {file = "onnxruntime-1.17.1-cp310-cp310-win32.whl", hash = "sha256:5e3716b5eec9092e29a8d17aab55e737480487deabfca7eac3cd3ed952b6ada9"},
+ {file = "onnxruntime-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:fbb98cced6782ae1bb799cc74ddcbbeeae8819f3ad1d942a74d88e72b6511337"},
+ {file = "onnxruntime-1.17.1-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:36fd6f87a1ecad87e9c652e42407a50fb305374f9a31d71293eb231caae18784"},
+ {file = "onnxruntime-1.17.1-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:99a8bddeb538edabc524d468edb60ad4722cff8a49d66f4e280c39eace70500b"},
+ {file = "onnxruntime-1.17.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fd7fddb4311deb5a7d3390cd8e9b3912d4d963efbe4dfe075edbaf18d01c024e"},
+ {file = "onnxruntime-1.17.1-cp311-cp311-win32.whl", hash = "sha256:606a7cbfb6680202b0e4f1890881041ffc3ac6e41760a25763bd9fe146f0b335"},
+ {file = "onnxruntime-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:53e4e06c0a541696ebdf96085fd9390304b7b04b748a19e02cf3b35c869a1e76"},
+ {file = "onnxruntime-1.17.1-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:40f08e378e0f85929712a2b2c9b9a9cc400a90c8a8ca741d1d92c00abec60843"},
+ {file = "onnxruntime-1.17.1-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ac79da6d3e1bb4590f1dad4bb3c2979d7228555f92bb39820889af8b8e6bd472"},
+ {file = "onnxruntime-1.17.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ae9ba47dc099004e3781f2d0814ad710a13c868c739ab086fc697524061695ea"},
+ {file = "onnxruntime-1.17.1-cp312-cp312-win32.whl", hash = "sha256:2dff1a24354220ac30e4a4ce2fb1df38cb1ea59f7dac2c116238d63fe7f4c5ff"},
+ {file = "onnxruntime-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:6226a5201ab8cafb15e12e72ff2a4fc8f50654e8fa5737c6f0bd57c5ff66827e"},
]
[[package]]
name = "openai"
-version = "1.12.0"
+version = "1.13.3"
requires_python = ">=3.7.1"
summary = "The official Python library for the openai API"
groups = ["default"]
@@ -1656,14 +1657,14 @@ dependencies = [
"typing-extensions<5,>=4.7",
]
files = [
- {file = "openai-1.12.0-py3-none-any.whl", hash = "sha256:a54002c814e05222e413664f651b5916714e4700d041d5cf5724d3ae1a3e3481"},
- {file = "openai-1.12.0.tar.gz", hash = "sha256:99c5d257d09ea6533d689d1cc77caa0ac679fa21efef8893d8b0832a86877f1b"},
+ {file = "openai-1.13.3-py3-none-any.whl", hash = "sha256:5769b62abd02f350a8dd1a3a242d8972c947860654466171d60fb0972ae0a41c"},
+ {file = "openai-1.13.3.tar.gz", hash = "sha256:ff6c6b3bc7327e715e4b3592a923a5a1c7519ff5dd764a83d69f633d49e77a7b"},
]
[[package]]
name = "opentelemetry-api"
-version = "1.22.0"
-requires_python = ">=3.7"
+version = "1.23.0"
+requires_python = ">=3.8"
summary = "OpenTelemetry Python API"
groups = ["default"]
dependencies = [
@@ -1671,50 +1672,48 @@ dependencies = [
"importlib-metadata<7.0,>=6.0",
]
files = [
- {file = "opentelemetry_api-1.22.0-py3-none-any.whl", hash = "sha256:43621514301a7e9f5d06dd8013a1b450f30c2e9372b8e30aaeb4562abf2ce034"},
- {file = "opentelemetry_api-1.22.0.tar.gz", hash = "sha256:15ae4ca925ecf9cfdfb7a709250846fbb08072260fca08ade78056c502b86bed"},
+ {file = "opentelemetry_api-1.23.0-py3-none-any.whl", hash = "sha256:cc03ea4025353048aadb9c64919099663664672ea1c6be6ddd8fee8e4cd5e774"},
+ {file = "opentelemetry_api-1.23.0.tar.gz", hash = "sha256:14a766548c8dd2eb4dfc349739eb4c3893712a0daa996e5dbf945f9da665da9d"},
]
[[package]]
name = "opentelemetry-exporter-otlp-proto-common"
-version = "1.22.0"
-requires_python = ">=3.7"
+version = "1.23.0"
+requires_python = ">=3.8"
summary = "OpenTelemetry Protobuf encoding"
groups = ["default"]
dependencies = [
- "backoff<3.0.0,>=1.10.0; python_version >= \"3.7\"",
- "opentelemetry-proto==1.22.0",
+ "opentelemetry-proto==1.23.0",
]
files = [
- {file = "opentelemetry_exporter_otlp_proto_common-1.22.0-py3-none-any.whl", hash = "sha256:3f2538bec5312587f8676c332b3747f54c89fe6364803a807e217af4603201fa"},
- {file = "opentelemetry_exporter_otlp_proto_common-1.22.0.tar.gz", hash = "sha256:71ae2f81bc6d6fe408d06388826edc8933759b2ca3a97d24054507dc7cfce52d"},
+ {file = "opentelemetry_exporter_otlp_proto_common-1.23.0-py3-none-any.whl", hash = "sha256:2a9e7e9d5a8b026b572684b6b24dcdefcaa58613d5ce3d644130b0c373c056c1"},
+ {file = "opentelemetry_exporter_otlp_proto_common-1.23.0.tar.gz", hash = "sha256:35e4ea909e7a0b24235bd0aaf17fba49676527feb1823b46565ff246d5a1ab18"},
]
[[package]]
name = "opentelemetry-exporter-otlp-proto-grpc"
-version = "1.22.0"
-requires_python = ">=3.7"
+version = "1.23.0"
+requires_python = ">=3.8"
summary = "OpenTelemetry Collector Protobuf over gRPC Exporter"
groups = ["default"]
dependencies = [
- "backoff<3.0.0,>=1.10.0; python_version >= \"3.7\"",
"deprecated>=1.2.6",
"googleapis-common-protos~=1.52",
"grpcio<2.0.0,>=1.0.0",
"opentelemetry-api~=1.15",
- "opentelemetry-exporter-otlp-proto-common==1.22.0",
- "opentelemetry-proto==1.22.0",
- "opentelemetry-sdk~=1.22.0",
+ "opentelemetry-exporter-otlp-proto-common==1.23.0",
+ "opentelemetry-proto==1.23.0",
+ "opentelemetry-sdk~=1.23.0",
]
files = [
- {file = "opentelemetry_exporter_otlp_proto_grpc-1.22.0-py3-none-any.whl", hash = "sha256:b5bcadc129272004316a455e9081216d3380c1fc2231a928ea6a70aa90e173fb"},
- {file = "opentelemetry_exporter_otlp_proto_grpc-1.22.0.tar.gz", hash = "sha256:1e0e5aa4bbabc74942f06f268deffd94851d12a8dc30b02527472ef1729fe5b1"},
+ {file = "opentelemetry_exporter_otlp_proto_grpc-1.23.0-py3-none-any.whl", hash = "sha256:40f9e3e7761eb34f2a1001f4543028783ac26e2db27e420d5374f2cca0182dad"},
+ {file = "opentelemetry_exporter_otlp_proto_grpc-1.23.0.tar.gz", hash = "sha256:aa1a012eea5342bfef51fcf3f7f22601dcb0f0984a07ffe6025b2fbb6d91a2a9"},
]
[[package]]
name = "opentelemetry-instrumentation"
-version = "0.43b0"
-requires_python = ">=3.7"
+version = "0.44b0"
+requires_python = ">=3.8"
summary = "Instrumentation Tools & Auto Instrumentation for OpenTelemetry Python"
groups = ["default"]
dependencies = [
@@ -1723,135 +1722,135 @@ dependencies = [
"wrapt<2.0.0,>=1.0.0",
]
files = [
- {file = "opentelemetry_instrumentation-0.43b0-py3-none-any.whl", hash = "sha256:0ff1334d7e359e27640e9d420024efeb73eacae464309c2e14ede7ba6c93967e"},
- {file = "opentelemetry_instrumentation-0.43b0.tar.gz", hash = "sha256:c3755da6c4be8033be0216d0501e11f4832690f4e2eca5a3576fbf113498f0f6"},
+ {file = "opentelemetry_instrumentation-0.44b0-py3-none-any.whl", hash = "sha256:79560f386425176bcc60c59190064597096114c4a8e5154f1cb281bb4e47d2fc"},
+ {file = "opentelemetry_instrumentation-0.44b0.tar.gz", hash = "sha256:8213d02d8c0987b9b26386ae3e091e0477d6331673123df736479322e1a50b48"},
]
[[package]]
name = "opentelemetry-instrumentation-asgi"
-version = "0.43b0"
-requires_python = ">=3.7"
+version = "0.44b0"
+requires_python = ">=3.8"
summary = "ASGI instrumentation for OpenTelemetry"
groups = ["default"]
dependencies = [
"asgiref~=3.0",
"opentelemetry-api~=1.12",
- "opentelemetry-instrumentation==0.43b0",
- "opentelemetry-semantic-conventions==0.43b0",
- "opentelemetry-util-http==0.43b0",
+ "opentelemetry-instrumentation==0.44b0",
+ "opentelemetry-semantic-conventions==0.44b0",
+ "opentelemetry-util-http==0.44b0",
]
files = [
- {file = "opentelemetry_instrumentation_asgi-0.43b0-py3-none-any.whl", hash = "sha256:1f593829fa039e9367820736fb063e92acd15c25b53d7bcb5d319971b8e93fd7"},
- {file = "opentelemetry_instrumentation_asgi-0.43b0.tar.gz", hash = "sha256:3f6f19333dca31ef696672e4e36cb1c2613c71dc7e847c11ff36a37e1130dadc"},
+ {file = "opentelemetry_instrumentation_asgi-0.44b0-py3-none-any.whl", hash = "sha256:0d95c84a8991008c8a8ac35e15d43cc7768a5bb46f95f129e802ad2990d7c366"},
+ {file = "opentelemetry_instrumentation_asgi-0.44b0.tar.gz", hash = "sha256:72d4d28ec7ccd551eac11edc5ae8cac3586c0a228467d6a95fad7b6d4edd597a"},
]
[[package]]
name = "opentelemetry-instrumentation-fastapi"
-version = "0.43b0"
-requires_python = ">=3.7"
+version = "0.44b0"
+requires_python = ">=3.8"
summary = "OpenTelemetry FastAPI Instrumentation"
groups = ["default"]
dependencies = [
"opentelemetry-api~=1.12",
- "opentelemetry-instrumentation-asgi==0.43b0",
- "opentelemetry-instrumentation==0.43b0",
- "opentelemetry-semantic-conventions==0.43b0",
- "opentelemetry-util-http==0.43b0",
+ "opentelemetry-instrumentation-asgi==0.44b0",
+ "opentelemetry-instrumentation==0.44b0",
+ "opentelemetry-semantic-conventions==0.44b0",
+ "opentelemetry-util-http==0.44b0",
]
files = [
- {file = "opentelemetry_instrumentation_fastapi-0.43b0-py3-none-any.whl", hash = "sha256:b79c044df68a52e07b35fa12a424e7cc0dd27ff0a171c5fdcc41dea9de8fc938"},
- {file = "opentelemetry_instrumentation_fastapi-0.43b0.tar.gz", hash = "sha256:2afaaf470622e1a2732182c68f6d2431ffe5e026a7edacd0f83605632b66347f"},
+ {file = "opentelemetry_instrumentation_fastapi-0.44b0-py3-none-any.whl", hash = "sha256:4441482944bea6676816668d56deb94af990e8c6e9582c581047e5d84c91d3c9"},
+ {file = "opentelemetry_instrumentation_fastapi-0.44b0.tar.gz", hash = "sha256:67ed10b93ad9d35238ae0be73cf8acbbb65a4a61fb7444d0aee5b0c492e294db"},
]
[[package]]
name = "opentelemetry-proto"
-version = "1.22.0"
-requires_python = ">=3.7"
+version = "1.23.0"
+requires_python = ">=3.8"
summary = "OpenTelemetry Python Proto"
groups = ["default"]
dependencies = [
"protobuf<5.0,>=3.19",
]
files = [
- {file = "opentelemetry_proto-1.22.0-py3-none-any.whl", hash = "sha256:ce7188d22c75b6d0fe53e7fb58501613d0feade5139538e79dedd9420610fa0c"},
- {file = "opentelemetry_proto-1.22.0.tar.gz", hash = "sha256:9ec29169286029f17ca34ec1f3455802ffb90131642d2f545ece9a63e8f69003"},
+ {file = "opentelemetry_proto-1.23.0-py3-none-any.whl", hash = "sha256:4c017deca052cb287a6003b7c989ed8b47af65baeb5d57ebf93dde0793f78509"},
+ {file = "opentelemetry_proto-1.23.0.tar.gz", hash = "sha256:e6aaf8b7ace8d021942d546161401b83eed90f9f2cc6f13275008cea730e4651"},
]
[[package]]
name = "opentelemetry-sdk"
-version = "1.22.0"
-requires_python = ">=3.7"
+version = "1.23.0"
+requires_python = ">=3.8"
summary = "OpenTelemetry Python SDK"
groups = ["default"]
dependencies = [
- "opentelemetry-api==1.22.0",
- "opentelemetry-semantic-conventions==0.43b0",
+ "opentelemetry-api==1.23.0",
+ "opentelemetry-semantic-conventions==0.44b0",
"typing-extensions>=3.7.4",
]
files = [
- {file = "opentelemetry_sdk-1.22.0-py3-none-any.whl", hash = "sha256:a730555713d7c8931657612a88a141e3a4fe6eb5523d9e2d5a8b1e673d76efa6"},
- {file = "opentelemetry_sdk-1.22.0.tar.gz", hash = "sha256:45267ac1f38a431fc2eb5d6e0c0d83afc0b78de57ac345488aa58c28c17991d0"},
+ {file = "opentelemetry_sdk-1.23.0-py3-none-any.whl", hash = "sha256:a93c96990ac0f07c6d679e2f1015864ff7a4f5587122dd5af968034436efb1fd"},
+ {file = "opentelemetry_sdk-1.23.0.tar.gz", hash = "sha256:9ddf60195837b59e72fd2033d6a47e2b59a0f74f0ec37d89387d89e3da8cab7f"},
]
[[package]]
name = "opentelemetry-semantic-conventions"
-version = "0.43b0"
-requires_python = ">=3.7"
+version = "0.44b0"
+requires_python = ">=3.8"
summary = "OpenTelemetry Semantic Conventions"
groups = ["default"]
files = [
- {file = "opentelemetry_semantic_conventions-0.43b0-py3-none-any.whl", hash = "sha256:291284d7c1bf15fdaddf309b3bd6d3b7ce12a253cec6d27144439819a15d8445"},
- {file = "opentelemetry_semantic_conventions-0.43b0.tar.gz", hash = "sha256:b9576fb890df479626fa624e88dde42d3d60b8b6c8ae1152ad157a8b97358635"},
+ {file = "opentelemetry_semantic_conventions-0.44b0-py3-none-any.whl", hash = "sha256:7c434546c9cbd797ab980cc88bf9ff3f4a5a28f941117cad21694e43d5d92019"},
+ {file = "opentelemetry_semantic_conventions-0.44b0.tar.gz", hash = "sha256:2e997cb28cd4ca81a25a9a43365f593d0c2b76be0685015349a89abdf1aa4ffa"},
]
[[package]]
name = "opentelemetry-util-http"
-version = "0.43b0"
-requires_python = ">=3.7"
+version = "0.44b0"
+requires_python = ">=3.8"
summary = "Web util for OpenTelemetry"
groups = ["default"]
files = [
- {file = "opentelemetry_util_http-0.43b0-py3-none-any.whl", hash = "sha256:f25a820784b030f6cb86b3d76e5676c769b75ed3f55a210bcdae0a5e175ebadb"},
- {file = "opentelemetry_util_http-0.43b0.tar.gz", hash = "sha256:3ff6ab361dbe99fc81200d625603c0fb890c055c6e416a3e6d661ddf47a6c7f7"},
+ {file = "opentelemetry_util_http-0.44b0-py3-none-any.whl", hash = "sha256:ff018ab6a2fa349537ff21adcef99a294248b599be53843c44f367aef6bccea5"},
+ {file = "opentelemetry_util_http-0.44b0.tar.gz", hash = "sha256:75896dffcbbeb5df5429ad4526e22307fc041a27114e0c5bfd90bb219381e68f"},
]
[[package]]
name = "orjson"
-version = "3.9.14"
+version = "3.9.15"
requires_python = ">=3.8"
summary = "Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy"
groups = ["default"]
files = [
- {file = "orjson-3.9.14-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:793f6c9448ab6eb7d4974b4dde3f230345c08ca6c7995330fbceeb43a5c8aa5e"},
- {file = "orjson-3.9.14-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a6bc7928d161840096adc956703494b5c0193ede887346f028216cac0af87500"},
- {file = "orjson-3.9.14-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:58b36f54da759602d8e2f7dad958752d453dfe2c7122767bc7f765e17dc59959"},
- {file = "orjson-3.9.14-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:abcda41ecdc950399c05eff761c3de91485d9a70d8227cb599ad3a66afe93bcc"},
- {file = "orjson-3.9.14-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:df76ecd17b1b3627bddfd689faaf206380a1a38cc9f6c4075bd884eaedcf46c2"},
- {file = "orjson-3.9.14-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d450a8e0656efb5d0fcb062157b918ab02dcca73278975b4ee9ea49e2fcf5bd5"},
- {file = "orjson-3.9.14-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:95c03137b0cf66517c8baa65770507a756d3a89489d8ecf864ea92348e1beabe"},
- {file = "orjson-3.9.14-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:20837e10835c98973673406d6798e10f821e7744520633811a5a3d809762d8cc"},
- {file = "orjson-3.9.14-cp310-none-win32.whl", hash = "sha256:1f7b6f3ef10ae8e3558abb729873d033dbb5843507c66b1c0767e32502ba96bb"},
- {file = "orjson-3.9.14-cp310-none-win_amd64.whl", hash = "sha256:ea890e6dc1711aeec0a33b8520e395c2f3d59ead5b4351a788e06bf95fc7ba81"},
- {file = "orjson-3.9.14-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:c19009ff37f033c70acd04b636380379499dac2cba27ae7dfc24f304deabbc81"},
- {file = "orjson-3.9.14-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19cdea0664aec0b7f385be84986d4defd3334e9c3c799407686ee1c26f7b8251"},
- {file = "orjson-3.9.14-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:135d518f73787ce323b1a5e21fb854fe22258d7a8ae562b81a49d6c7f826f2a3"},
- {file = "orjson-3.9.14-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d2cf1d0557c61c75e18cf7d69fb689b77896e95553e212c0cc64cf2087944b84"},
- {file = "orjson-3.9.14-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b7c11667421df2d8b18b021223505dcc3ee51be518d54e4dc49161ac88ac2b87"},
- {file = "orjson-3.9.14-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2eefc41ba42e75ed88bc396d8fe997beb20477f3e7efa000cd7a47eda452fbb2"},
- {file = "orjson-3.9.14-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:917311d6a64d1c327c0dfda1e41f3966a7fb72b11ca7aa2e7a68fcccc7db35d9"},
- {file = "orjson-3.9.14-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4dc1c132259b38d12c6587d190cd09cd76e3b5273ce71fe1372437b4cbc65f6f"},
- {file = "orjson-3.9.14-cp311-none-win32.whl", hash = "sha256:6f39a10408478f4c05736a74da63727a1ae0e83e3533d07b19443400fe8591ca"},
- {file = "orjson-3.9.14-cp311-none-win_amd64.whl", hash = "sha256:26280a7fcb62d8257f634c16acebc3bec626454f9ab13558bbf7883b9140760e"},
- {file = "orjson-3.9.14-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:08e722a8d06b13b67a51f247a24938d1a94b4b3862e40e0eef3b2e98c99cd04c"},
- {file = "orjson-3.9.14-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a2591faa0c031cf3f57e5bce1461cfbd6160f3f66b5a72609a130924917cb07d"},
- {file = "orjson-3.9.14-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e2450d87dd7b4f277f4c5598faa8b49a0c197b91186c47a2c0b88e15531e4e3e"},
- {file = "orjson-3.9.14-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:90903d2908158a2c9077a06f11e27545de610af690fb178fd3ba6b32492d4d1c"},
- {file = "orjson-3.9.14-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ce6f095eef0026eae76fc212f20f786011ecf482fc7df2f4c272a8ae6dd7b1ef"},
- {file = "orjson-3.9.14-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:751250a31fef2bac05a2da2449aae7142075ea26139271f169af60456d8ad27a"},
- {file = "orjson-3.9.14-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:9a1af21160a38ee8be3f4fcf24ee4b99e6184cadc7f915d599f073f478a94d2c"},
- {file = "orjson-3.9.14-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:449bf090b2aa4e019371d7511a6ea8a5a248139205c27d1834bb4b1e3c44d936"},
- {file = "orjson-3.9.14-cp312-none-win_amd64.whl", hash = "sha256:a603161318ff699784943e71f53899983b7dee571b4dd07c336437c9c5a272b0"},
- {file = "orjson-3.9.14.tar.gz", hash = "sha256:06fb40f8e49088ecaa02f1162581d39e2cf3fd9dbbfe411eb2284147c99bad79"},
+ {file = "orjson-3.9.15-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:d61f7ce4727a9fa7680cd6f3986b0e2c732639f46a5e0156e550e35258aa313a"},
+ {file = "orjson-3.9.15-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4feeb41882e8aa17634b589533baafdceb387e01e117b1ec65534ec724023d04"},
+ {file = "orjson-3.9.15-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fbbeb3c9b2edb5fd044b2a070f127a0ac456ffd079cb82746fc84af01ef021a4"},
+ {file = "orjson-3.9.15-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b66bcc5670e8a6b78f0313bcb74774c8291f6f8aeef10fe70e910b8040f3ab75"},
+ {file = "orjson-3.9.15-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2973474811db7b35c30248d1129c64fd2bdf40d57d84beed2a9a379a6f57d0ab"},
+ {file = "orjson-3.9.15-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fe41b6f72f52d3da4db524c8653e46243c8c92df826ab5ffaece2dba9cccd58"},
+ {file = "orjson-3.9.15-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:4228aace81781cc9d05a3ec3a6d2673a1ad0d8725b4e915f1089803e9efd2b99"},
+ {file = "orjson-3.9.15-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:6f7b65bfaf69493c73423ce9db66cfe9138b2f9ef62897486417a8fcb0a92bfe"},
+ {file = "orjson-3.9.15-cp310-none-win32.whl", hash = "sha256:2d99e3c4c13a7b0fb3792cc04c2829c9db07838fb6973e578b85c1745e7d0ce7"},
+ {file = "orjson-3.9.15-cp310-none-win_amd64.whl", hash = "sha256:b725da33e6e58e4a5d27958568484aa766e825e93aa20c26c91168be58e08cbb"},
+ {file = "orjson-3.9.15-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:c8e8fe01e435005d4421f183038fc70ca85d2c1e490f51fb972db92af6e047c2"},
+ {file = "orjson-3.9.15-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87f1097acb569dde17f246faa268759a71a2cb8c96dd392cd25c668b104cad2f"},
+ {file = "orjson-3.9.15-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff0f9913d82e1d1fadbd976424c316fbc4d9c525c81d047bbdd16bd27dd98cfc"},
+ {file = "orjson-3.9.15-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8055ec598605b0077e29652ccfe9372247474375e0e3f5775c91d9434e12d6b1"},
+ {file = "orjson-3.9.15-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d6768a327ea1ba44c9114dba5fdda4a214bdb70129065cd0807eb5f010bfcbb5"},
+ {file = "orjson-3.9.15-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:12365576039b1a5a47df01aadb353b68223da413e2e7f98c02403061aad34bde"},
+ {file = "orjson-3.9.15-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:71c6b009d431b3839d7c14c3af86788b3cfac41e969e3e1c22f8a6ea13139404"},
+ {file = "orjson-3.9.15-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:e18668f1bd39e69b7fed19fa7cd1cd110a121ec25439328b5c89934e6d30d357"},
+ {file = "orjson-3.9.15-cp311-none-win32.whl", hash = "sha256:62482873e0289cf7313461009bf62ac8b2e54bc6f00c6fabcde785709231a5d7"},
+ {file = "orjson-3.9.15-cp311-none-win_amd64.whl", hash = "sha256:b3d336ed75d17c7b1af233a6561cf421dee41d9204aa3cfcc6c9c65cd5bb69a8"},
+ {file = "orjson-3.9.15-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:82425dd5c7bd3adfe4e94c78e27e2fa02971750c2b7ffba648b0f5d5cc016a73"},
+ {file = "orjson-3.9.15-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2c51378d4a8255b2e7c1e5cc430644f0939539deddfa77f6fac7b56a9784160a"},
+ {file = "orjson-3.9.15-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6ae4e06be04dc00618247c4ae3f7c3e561d5bc19ab6941427f6d3722a0875ef7"},
+ {file = "orjson-3.9.15-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bcef128f970bb63ecf9a65f7beafd9b55e3aaf0efc271a4154050fc15cdb386e"},
+ {file = "orjson-3.9.15-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b72758f3ffc36ca566ba98a8e7f4f373b6c17c646ff8ad9b21ad10c29186f00d"},
+ {file = "orjson-3.9.15-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10c57bc7b946cf2efa67ac55766e41764b66d40cbd9489041e637c1304400494"},
+ {file = "orjson-3.9.15-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:946c3a1ef25338e78107fba746f299f926db408d34553b4754e90a7de1d44068"},
+ {file = "orjson-3.9.15-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2f256d03957075fcb5923410058982aea85455d035607486ccb847f095442bda"},
+ {file = "orjson-3.9.15-cp312-none-win_amd64.whl", hash = "sha256:5bb399e1b49db120653a31463b4a7b27cf2fbfe60469546baf681d1b39f4edf2"},
+ {file = "orjson-3.9.15.tar.gz", hash = "sha256:95cae920959d772f30ab36d3b25f83bb0f3be671e986c72ce22f8fa700dae061"},
]
[[package]]
@@ -1878,7 +1877,7 @@ files = [
[[package]]
name = "pandas"
-version = "2.2.0"
+version = "2.2.1"
requires_python = ">=3.9"
summary = "Powerful data structures for data analysis, time series, and statistics"
groups = ["default"]
@@ -1891,28 +1890,28 @@ dependencies = [
"tzdata>=2022.7",
]
files = [
- {file = "pandas-2.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:8108ee1712bb4fa2c16981fba7e68b3f6ea330277f5ca34fa8d557e986a11670"},
- {file = "pandas-2.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:736da9ad4033aeab51d067fc3bd69a0ba36f5a60f66a527b3d72e2030e63280a"},
- {file = "pandas-2.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38e0b4fc3ddceb56ec8a287313bc22abe17ab0eb184069f08fc6a9352a769b18"},
- {file = "pandas-2.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:20404d2adefe92aed3b38da41d0847a143a09be982a31b85bc7dd565bdba0f4e"},
- {file = "pandas-2.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:7ea3ee3f125032bfcade3a4cf85131ed064b4f8dd23e5ce6fa16473e48ebcaf5"},
- {file = "pandas-2.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f9670b3ac00a387620489dfc1bca66db47a787f4e55911f1293063a78b108df1"},
- {file = "pandas-2.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:5a946f210383c7e6d16312d30b238fd508d80d927014f3b33fb5b15c2f895430"},
- {file = "pandas-2.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a1b438fa26b208005c997e78672f1aa8138f67002e833312e6230f3e57fa87d5"},
- {file = "pandas-2.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8ce2fbc8d9bf303ce54a476116165220a1fedf15985b09656b4b4275300e920b"},
- {file = "pandas-2.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2707514a7bec41a4ab81f2ccce8b382961a29fbe9492eab1305bb075b2b1ff4f"},
- {file = "pandas-2.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:85793cbdc2d5bc32620dc8ffa715423f0c680dacacf55056ba13454a5be5de88"},
- {file = "pandas-2.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:cfd6c2491dc821b10c716ad6776e7ab311f7df5d16038d0b7458bc0b67dc10f3"},
- {file = "pandas-2.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:a146b9dcacc3123aa2b399df1a284de5f46287a4ab4fbfc237eac98a92ebcb71"},
- {file = "pandas-2.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:fbc1b53c0e1fdf16388c33c3cca160f798d38aea2978004dd3f4d3dec56454c9"},
- {file = "pandas-2.2.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:a41d06f308a024981dcaa6c41f2f2be46a6b186b902c94c2674e8cb5c42985bc"},
- {file = "pandas-2.2.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:159205c99d7a5ce89ecfc37cb08ed179de7783737cea403b295b5eda8e9c56d1"},
- {file = "pandas-2.2.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eb1e1f3861ea9132b32f2133788f3b14911b68102d562715d71bd0013bc45440"},
- {file = "pandas-2.2.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:761cb99b42a69005dec2b08854fb1d4888fdf7b05db23a8c5a099e4b886a2106"},
- {file = "pandas-2.2.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:a20628faaf444da122b2a64b1e5360cde100ee6283ae8effa0d8745153809a2e"},
- {file = "pandas-2.2.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f5be5d03ea2073627e7111f61b9f1f0d9625dc3c4d8dda72cc827b0c58a1d042"},
- {file = "pandas-2.2.0-cp312-cp312-win_amd64.whl", hash = "sha256:a626795722d893ed6aacb64d2401d017ddc8a2341b49e0384ab9bf7112bdec30"},
- {file = "pandas-2.2.0.tar.gz", hash = "sha256:30b83f7c3eb217fb4d1b494a57a2fda5444f17834f5df2de6b2ffff68dc3c8e2"},
+ {file = "pandas-2.2.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:8df8612be9cd1c7797c93e1c5df861b2ddda0b48b08f2c3eaa0702cf88fb5f88"},
+ {file = "pandas-2.2.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0f573ab277252ed9aaf38240f3b54cfc90fff8e5cab70411ee1d03f5d51f3944"},
+ {file = "pandas-2.2.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f02a3a6c83df4026e55b63c1f06476c9aa3ed6af3d89b4f04ea656ccdaaaa359"},
+ {file = "pandas-2.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c38ce92cb22a4bea4e3929429aa1067a454dcc9c335799af93ba9be21b6beb51"},
+ {file = "pandas-2.2.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:c2ce852e1cf2509a69e98358e8458775f89599566ac3775e70419b98615f4b06"},
+ {file = "pandas-2.2.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:53680dc9b2519cbf609c62db3ed7c0b499077c7fefda564e330286e619ff0dd9"},
+ {file = "pandas-2.2.1-cp310-cp310-win_amd64.whl", hash = "sha256:94e714a1cca63e4f5939cdce5f29ba8d415d85166be3441165edd427dc9f6bc0"},
+ {file = "pandas-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:f821213d48f4ab353d20ebc24e4faf94ba40d76680642fb7ce2ea31a3ad94f9b"},
+ {file = "pandas-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:c70e00c2d894cb230e5c15e4b1e1e6b2b478e09cf27cc593a11ef955b9ecc81a"},
+ {file = "pandas-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e97fbb5387c69209f134893abc788a6486dbf2f9e511070ca05eed4b930b1b02"},
+ {file = "pandas-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:101d0eb9c5361aa0146f500773395a03839a5e6ecde4d4b6ced88b7e5a1a6403"},
+ {file = "pandas-2.2.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:7d2ed41c319c9fb4fd454fe25372028dfa417aacb9790f68171b2e3f06eae8cd"},
+ {file = "pandas-2.2.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:af5d3c00557d657c8773ef9ee702c61dd13b9d7426794c9dfeb1dc4a0bf0ebc7"},
+ {file = "pandas-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:06cf591dbaefb6da9de8472535b185cba556d0ce2e6ed28e21d919704fef1a9e"},
+ {file = "pandas-2.2.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:88ecb5c01bb9ca927ebc4098136038519aa5d66b44671861ffab754cae75102c"},
+ {file = "pandas-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:04f6ec3baec203c13e3f8b139fb0f9f86cd8c0b94603ae3ae8ce9a422e9f5bee"},
+ {file = "pandas-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a935a90a76c44fe170d01e90a3594beef9e9a6220021acfb26053d01426f7dc2"},
+ {file = "pandas-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c391f594aae2fd9f679d419e9a4d5ba4bce5bb13f6a989195656e7dc4b95c8f0"},
+ {file = "pandas-2.2.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9d1265545f579edf3f8f0cb6f89f234f5e44ba725a34d86535b1a1d38decbccc"},
+ {file = "pandas-2.2.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:11940e9e3056576ac3244baef2fedade891977bcc1cb7e5cc8f8cc7d603edc89"},
+ {file = "pandas-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:4acf681325ee1c7f950d058b05a820441075b0dd9a2adf5c4835b9bc056bf4fb"},
+ {file = "pandas-2.2.1.tar.gz", hash = "sha256:0ab90f87093c13f3e8fa45b48ba9f39181046e8f3317d3aadb2fffbb1b978572"},
]
[[package]]
@@ -1995,7 +1994,7 @@ files = [
[[package]]
name = "posthog"
-version = "3.4.2"
+version = "3.5.0"
summary = "Integrate PostHog into any python application."
groups = ["default"]
dependencies = [
@@ -2006,8 +2005,8 @@ dependencies = [
"six>=1.5",
]
files = [
- {file = "posthog-3.4.2-py2.py3-none-any.whl", hash = "sha256:c7e79b2e585d16e93749874bcbcdad78d857037398ce0d8d6c474a04d0bd3bbe"},
- {file = "posthog-3.4.2.tar.gz", hash = "sha256:f0eafa663fbc4a942b49b6168a62a890635407044bbc7593051dcb9cc1208873"},
+ {file = "posthog-3.5.0-py2.py3-none-any.whl", hash = "sha256:3c672be7ba6f95d555ea207d4486c171d06657eb34b3ce25eb043bfe7b6b5b76"},
+ {file = "posthog-3.5.0.tar.gz", hash = "sha256:8f7e3b2c6e8714d0c0c542a2109b83a7549f63b7113a133ab2763a89245ef2ef"},
]
[[package]]
@@ -2096,23 +2095,23 @@ files = [
[[package]]
name = "pydantic"
-version = "2.6.1"
+version = "2.6.3"
requires_python = ">=3.8"
summary = "Data validation using Python type hints"
groups = ["default"]
dependencies = [
"annotated-types>=0.4.0",
- "pydantic-core==2.16.2",
+ "pydantic-core==2.16.3",
"typing-extensions>=4.6.1",
]
files = [
- {file = "pydantic-2.6.1-py3-none-any.whl", hash = "sha256:0b6a909df3192245cb736509a92ff69e4fef76116feffec68e93a567347bae6f"},
- {file = "pydantic-2.6.1.tar.gz", hash = "sha256:4fd5c182a2488dc63e6d32737ff19937888001e2a6d86e94b3f233104a5d1fa9"},
+ {file = "pydantic-2.6.3-py3-none-any.whl", hash = "sha256:72c6034df47f46ccdf81869fddb81aade68056003900a8724a4f160700016a2a"},
+ {file = "pydantic-2.6.3.tar.gz", hash = "sha256:e07805c4c7f5c6826e33a1d4c9d47950d7eaf34868e2690f8594d2e30241f11f"},
]
[[package]]
name = "pydantic-core"
-version = "2.16.2"
+version = "2.16.3"
requires_python = ">=3.8"
summary = ""
groups = ["default"]
@@ -2120,61 +2119,76 @@ dependencies = [
"typing-extensions!=4.7.0,>=4.6.0",
]
files = [
- {file = "pydantic_core-2.16.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:3fab4e75b8c525a4776e7630b9ee48aea50107fea6ca9f593c98da3f4d11bf7c"},
- {file = "pydantic_core-2.16.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8bde5b48c65b8e807409e6f20baee5d2cd880e0fad00b1a811ebc43e39a00ab2"},
- {file = "pydantic_core-2.16.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2924b89b16420712e9bb8192396026a8fbd6d8726224f918353ac19c4c043d2a"},
- {file = "pydantic_core-2.16.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:16aa02e7a0f539098e215fc193c8926c897175d64c7926d00a36188917717a05"},
- {file = "pydantic_core-2.16.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:936a787f83db1f2115ee829dd615c4f684ee48ac4de5779ab4300994d8af325b"},
- {file = "pydantic_core-2.16.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:459d6be6134ce3b38e0ef76f8a672924460c455d45f1ad8fdade36796df1ddc8"},
- {file = "pydantic_core-2.16.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f9ee4febb249c591d07b2d4dd36ebcad0ccd128962aaa1801508320896575ef"},
- {file = "pydantic_core-2.16.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:40a0bd0bed96dae5712dab2aba7d334a6c67cbcac2ddfca7dbcc4a8176445990"},
- {file = "pydantic_core-2.16.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:870dbfa94de9b8866b37b867a2cb37a60c401d9deb4a9ea392abf11a1f98037b"},
- {file = "pydantic_core-2.16.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:308974fdf98046db28440eb3377abba274808bf66262e042c412eb2adf852731"},
- {file = "pydantic_core-2.16.2-cp310-none-win32.whl", hash = "sha256:a477932664d9611d7a0816cc3c0eb1f8856f8a42435488280dfbf4395e141485"},
- {file = "pydantic_core-2.16.2-cp310-none-win_amd64.whl", hash = "sha256:8f9142a6ed83d90c94a3efd7af8873bf7cefed2d3d44387bf848888482e2d25f"},
- {file = "pydantic_core-2.16.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:406fac1d09edc613020ce9cf3f2ccf1a1b2f57ab00552b4c18e3d5276c67eb11"},
- {file = "pydantic_core-2.16.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ce232a6170dd6532096cadbf6185271e4e8c70fc9217ebe105923ac105da9978"},
- {file = "pydantic_core-2.16.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a90fec23b4b05a09ad988e7a4f4e081711a90eb2a55b9c984d8b74597599180f"},
- {file = "pydantic_core-2.16.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8aafeedb6597a163a9c9727d8a8bd363a93277701b7bfd2749fbefee2396469e"},
- {file = "pydantic_core-2.16.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9957433c3a1b67bdd4c63717eaf174ebb749510d5ea612cd4e83f2d9142f3fc8"},
- {file = "pydantic_core-2.16.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b0d7a9165167269758145756db43a133608a531b1e5bb6a626b9ee24bc38a8f7"},
- {file = "pydantic_core-2.16.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dffaf740fe2e147fedcb6b561353a16243e654f7fe8e701b1b9db148242e1272"},
- {file = "pydantic_core-2.16.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f8ed79883b4328b7f0bd142733d99c8e6b22703e908ec63d930b06be3a0e7113"},
- {file = "pydantic_core-2.16.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:cf903310a34e14651c9de056fcc12ce090560864d5a2bb0174b971685684e1d8"},
- {file = "pydantic_core-2.16.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:46b0d5520dbcafea9a8645a8164658777686c5c524d381d983317d29687cce97"},
- {file = "pydantic_core-2.16.2-cp311-none-win32.whl", hash = "sha256:70651ff6e663428cea902dac297066d5c6e5423fda345a4ca62430575364d62b"},
- {file = "pydantic_core-2.16.2-cp311-none-win_amd64.whl", hash = "sha256:98dc6f4f2095fc7ad277782a7c2c88296badcad92316b5a6e530930b1d475ebc"},
- {file = "pydantic_core-2.16.2-cp311-none-win_arm64.whl", hash = "sha256:ef6113cd31411eaf9b39fc5a8848e71c72656fd418882488598758b2c8c6dfa0"},
- {file = "pydantic_core-2.16.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:88646cae28eb1dd5cd1e09605680c2b043b64d7481cdad7f5003ebef401a3039"},
- {file = "pydantic_core-2.16.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7b883af50eaa6bb3299780651e5be921e88050ccf00e3e583b1e92020333304b"},
- {file = "pydantic_core-2.16.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7bf26c2e2ea59d32807081ad51968133af3025c4ba5753e6a794683d2c91bf6e"},
- {file = "pydantic_core-2.16.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:99af961d72ac731aae2a1b55ccbdae0733d816f8bfb97b41909e143de735f522"},
- {file = "pydantic_core-2.16.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:02906e7306cb8c5901a1feb61f9ab5e5c690dbbeaa04d84c1b9ae2a01ebe9379"},
- {file = "pydantic_core-2.16.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d5362d099c244a2d2f9659fb3c9db7c735f0004765bbe06b99be69fbd87c3f15"},
- {file = "pydantic_core-2.16.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ac426704840877a285d03a445e162eb258924f014e2f074e209d9b4ff7bf380"},
- {file = "pydantic_core-2.16.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b94cbda27267423411c928208e89adddf2ea5dd5f74b9528513f0358bba019cb"},
- {file = "pydantic_core-2.16.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:6db58c22ac6c81aeac33912fb1af0e930bc9774166cdd56eade913d5f2fff35e"},
- {file = "pydantic_core-2.16.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:396fdf88b1b503c9c59c84a08b6833ec0c3b5ad1a83230252a9e17b7dfb4cffc"},
- {file = "pydantic_core-2.16.2-cp312-none-win32.whl", hash = "sha256:7c31669e0c8cc68400ef0c730c3a1e11317ba76b892deeefaf52dcb41d56ed5d"},
- {file = "pydantic_core-2.16.2-cp312-none-win_amd64.whl", hash = "sha256:a3b7352b48fbc8b446b75f3069124e87f599d25afb8baa96a550256c031bb890"},
- {file = "pydantic_core-2.16.2-cp312-none-win_arm64.whl", hash = "sha256:a9e523474998fb33f7c1a4d55f5504c908d57add624599e095c20fa575b8d943"},
- {file = "pydantic_core-2.16.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5f60f920691a620b03082692c378661947d09415743e437a7478c309eb0e4f82"},
- {file = "pydantic_core-2.16.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:47924039e785a04d4a4fa49455e51b4eb3422d6eaacfde9fc9abf8fdef164e8a"},
- {file = "pydantic_core-2.16.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e6294e76b0380bb7a61eb8a39273c40b20beb35e8c87ee101062834ced19c545"},
- {file = "pydantic_core-2.16.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fe56851c3f1d6f5384b3051c536cc81b3a93a73faf931f404fef95217cf1e10d"},
- {file = "pydantic_core-2.16.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9d776d30cde7e541b8180103c3f294ef7c1862fd45d81738d156d00551005784"},
- {file = "pydantic_core-2.16.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:72f7919af5de5ecfaf1eba47bf9a5d8aa089a3340277276e5636d16ee97614d7"},
- {file = "pydantic_core-2.16.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:4bfcbde6e06c56b30668a0c872d75a7ef3025dc3c1823a13cf29a0e9b33f67e8"},
- {file = "pydantic_core-2.16.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:ff7c97eb7a29aba230389a2661edf2e9e06ce616c7e35aa764879b6894a44b25"},
- {file = "pydantic_core-2.16.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:9b5f13857da99325dcabe1cc4e9e6a3d7b2e2c726248ba5dd4be3e8e4a0b6d0e"},
- {file = "pydantic_core-2.16.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:a7e41e3ada4cca5f22b478c08e973c930e5e6c7ba3588fb8e35f2398cdcc1545"},
- {file = "pydantic_core-2.16.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:60eb8ceaa40a41540b9acae6ae7c1f0a67d233c40dc4359c256ad2ad85bdf5e5"},
- {file = "pydantic_core-2.16.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7beec26729d496a12fd23cf8da9944ee338c8b8a17035a560b585c36fe81af20"},
- {file = "pydantic_core-2.16.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:22c5f022799f3cd6741e24f0443ead92ef42be93ffda0d29b2597208c94c3753"},
- {file = "pydantic_core-2.16.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:eca58e319f4fd6df004762419612122b2c7e7d95ffafc37e890252f869f3fb2a"},
- {file = "pydantic_core-2.16.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ed957db4c33bc99895f3a1672eca7e80e8cda8bd1e29a80536b4ec2153fa9804"},
- {file = "pydantic_core-2.16.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:459c0d338cc55d099798618f714b21b7ece17eb1a87879f2da20a3ff4c7628e2"},
- {file = "pydantic_core-2.16.2.tar.gz", hash = "sha256:0ba503850d8b8dcc18391f10de896ae51d37fe5fe43dbfb6a35c5c5cad271a06"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:75b81e678d1c1ede0785c7f46690621e4c6e63ccd9192af1f0bd9d504bbb6bf4"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9c865a7ee6f93783bd5d781af5a4c43dadc37053a5b42f7d18dc019f8c9d2bd1"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:162e498303d2b1c036b957a1278fa0899d02b2842f1ff901b6395104c5554a45"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2f583bd01bbfbff4eaee0868e6fc607efdfcc2b03c1c766b06a707abbc856187"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b926dd38db1519ed3043a4de50214e0d600d404099c3392f098a7f9d75029ff8"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:716b542728d4c742353448765aa7cdaa519a7b82f9564130e2b3f6766018c9ec"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc4ad7f7ee1a13d9cb49d8198cd7d7e3aa93e425f371a68235f784e99741561f"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bd87f48924f360e5d1c5f770d6155ce0e7d83f7b4e10c2f9ec001c73cf475c99"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:0df446663464884297c793874573549229f9eca73b59360878f382a0fc085979"},
+ {file = "pydantic_core-2.16.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4df8a199d9f6afc5ae9a65f8f95ee52cae389a8c6b20163762bde0426275b7db"},
+ {file = "pydantic_core-2.16.3-cp310-none-win32.whl", hash = "sha256:456855f57b413f077dff513a5a28ed838dbbb15082ba00f80750377eed23d132"},
+ {file = "pydantic_core-2.16.3-cp310-none-win_amd64.whl", hash = "sha256:732da3243e1b8d3eab8c6ae23ae6a58548849d2e4a4e03a1924c8ddf71a387cb"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:519ae0312616026bf4cedc0fe459e982734f3ca82ee8c7246c19b650b60a5ee4"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b3992a322a5617ded0a9f23fd06dbc1e4bd7cf39bc4ccf344b10f80af58beacd"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8d62da299c6ecb04df729e4b5c52dc0d53f4f8430b4492b93aa8de1f541c4aac"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2acca2be4bb2f2147ada8cac612f8a98fc09f41c89f87add7256ad27332c2fda"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1b662180108c55dfbf1280d865b2d116633d436cfc0bba82323554873967b340"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e7c6ed0dc9d8e65f24f5824291550139fe6f37fac03788d4580da0d33bc00c97"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a6b1bb0827f56654b4437955555dc3aeeebeddc47c2d7ed575477f082622c49e"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e56f8186d6210ac7ece503193ec84104da7ceb98f68ce18c07282fcc2452e76f"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:936e5db01dd49476fa8f4383c259b8b1303d5dd5fb34c97de194560698cc2c5e"},
+ {file = "pydantic_core-2.16.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:33809aebac276089b78db106ee692bdc9044710e26f24a9a2eaa35a0f9fa70ba"},
+ {file = "pydantic_core-2.16.3-cp311-none-win32.whl", hash = "sha256:ded1c35f15c9dea16ead9bffcde9bb5c7c031bff076355dc58dcb1cb436c4721"},
+ {file = "pydantic_core-2.16.3-cp311-none-win_amd64.whl", hash = "sha256:d89ca19cdd0dd5f31606a9329e309d4fcbb3df860960acec32630297d61820df"},
+ {file = "pydantic_core-2.16.3-cp311-none-win_arm64.whl", hash = "sha256:6162f8d2dc27ba21027f261e4fa26f8bcb3cf9784b7f9499466a311ac284b5b9"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:0f56ae86b60ea987ae8bcd6654a887238fd53d1384f9b222ac457070b7ac4cff"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c9bd22a2a639e26171068f8ebb5400ce2c1bc7d17959f60a3b753ae13c632975"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4204e773b4b408062960e65468d5346bdfe139247ee5f1ca2a378983e11388a2"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f651dd19363c632f4abe3480a7c87a9773be27cfe1341aef06e8759599454120"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aaf09e615a0bf98d406657e0008e4a8701b11481840be7d31755dc9f97c44053"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8e47755d8152c1ab5b55928ab422a76e2e7b22b5ed8e90a7d584268dd49e9c6b"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:500960cb3a0543a724a81ba859da816e8cf01b0e6aaeedf2c3775d12ee49cade"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:cf6204fe865da605285c34cf1172879d0314ff267b1c35ff59de7154f35fdc2e"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:d33dd21f572545649f90c38c227cc8631268ba25c460b5569abebdd0ec5974ca"},
+ {file = "pydantic_core-2.16.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:49d5d58abd4b83fb8ce763be7794d09b2f50f10aa65c0f0c1696c677edeb7cbf"},
+ {file = "pydantic_core-2.16.3-cp312-none-win32.whl", hash = "sha256:f53aace168a2a10582e570b7736cc5bef12cae9cf21775e3eafac597e8551fbe"},
+ {file = "pydantic_core-2.16.3-cp312-none-win_amd64.whl", hash = "sha256:0d32576b1de5a30d9a97f300cc6a3f4694c428d956adbc7e6e2f9cad279e45ed"},
+ {file = "pydantic_core-2.16.3-cp312-none-win_arm64.whl", hash = "sha256:ec08be75bb268473677edb83ba71e7e74b43c008e4a7b1907c6d57e940bf34b6"},
+ {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:36fa178aacbc277bc6b62a2c3da95226520da4f4e9e206fdf076484363895d2c"},
+ {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:dcca5d2bf65c6fb591fff92da03f94cd4f315972f97c21975398bd4bd046854a"},
+ {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2a72fb9963cba4cd5793854fd12f4cfee731e86df140f59ff52a49b3552db241"},
+ {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b60cc1a081f80a2105a59385b92d82278b15d80ebb3adb200542ae165cd7d183"},
+ {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:cbcc558401de90a746d02ef330c528f2e668c83350f045833543cd57ecead1ad"},
+ {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:fee427241c2d9fb7192b658190f9f5fd6dfe41e02f3c1489d2ec1e6a5ab1e04a"},
+ {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f4cb85f693044e0f71f394ff76c98ddc1bc0953e48c061725e540396d5c8a2e1"},
+ {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:b29eeb887aa931c2fcef5aa515d9d176d25006794610c264ddc114c053bf96fe"},
+ {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:a425479ee40ff021f8216c9d07a6a3b54b31c8267c6e17aa88b70d7ebd0e5e5b"},
+ {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:5c5cbc703168d1b7a838668998308018a2718c2130595e8e190220238addc96f"},
+ {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99b6add4c0b39a513d323d3b93bc173dac663c27b99860dd5bf491b240d26137"},
+ {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75f76ee558751746d6a38f89d60b6228fa174e5172d143886af0f85aa306fd89"},
+ {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:00ee1c97b5364b84cb0bd82e9bbf645d5e2871fb8c58059d158412fee2d33d8a"},
+ {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:287073c66748f624be4cef893ef9174e3eb88fe0b8a78dc22e88eca4bc357ca6"},
+ {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ed25e1835c00a332cb10c683cd39da96a719ab1dfc08427d476bce41b92531fc"},
+ {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:86b3d0033580bd6bbe07590152007275bd7af95f98eaa5bd36f3da219dcd93da"},
+ {file = "pydantic_core-2.16.3.tar.gz", hash = "sha256:1cac689f80a3abab2d3c0048b29eea5751114054f032a941a32de4c852c59cad"},
+]
+
+[[package]]
+name = "pydantic-settings"
+version = "2.2.1"
+requires_python = ">=3.8"
+summary = "Settings management using Pydantic"
+groups = ["default"]
+dependencies = [
+ "pydantic>=2.3.0",
+ "python-dotenv>=0.21.0",
+]
+files = [
+ {file = "pydantic_settings-2.2.1-py3-none-any.whl", hash = "sha256:0235391d26db4d2190cb9b31051c4b46882d28a51533f97440867f012d4da091"},
+ {file = "pydantic_settings-2.2.1.tar.gz", hash = "sha256:00b9f6a5e95553590434c0fa01ead0b216c3e10bc54ae02e37f359948643c5ed"},
]
[[package]]
@@ -2200,13 +2214,13 @@ files = [
[[package]]
name = "pyparsing"
-version = "3.1.1"
+version = "3.1.2"
requires_python = ">=3.6.8"
summary = "pyparsing module - Classes and methods to define and execute parsing grammars"
groups = ["default"]
files = [
- {file = "pyparsing-3.1.1-py3-none-any.whl", hash = "sha256:32c7c0b711493c72ff18a981d24f28aaf9c1fb7ed5e9667c9e84e3db623bdbfb"},
- {file = "pyparsing-3.1.1.tar.gz", hash = "sha256:ede28a1a32462f5a9705e07aea48001a08f7cf81a021585011deba701581a0db"},
+ {file = "pyparsing-3.1.2-py3-none-any.whl", hash = "sha256:f9db75911801ed778fe61bb643079ff86601aca99fcae6345aa67292038fb742"},
+ {file = "pyparsing-3.1.2.tar.gz", hash = "sha256:a1bac0ce561155ecc3ed78ca94d3c9378656ad4c94c1270de543f621420f94ad"},
]
[[package]]
@@ -2278,7 +2292,7 @@ files = [
[[package]]
name = "python-dateutil"
-version = "2.8.2"
+version = "2.9.0.post0"
requires_python = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
summary = "Extensions to the standard Python datetime module"
groups = ["default"]
@@ -2286,8 +2300,8 @@ dependencies = [
"six>=1.5",
]
files = [
- {file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
- {file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
+ {file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"},
+ {file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"},
]
[[package]]
@@ -2301,6 +2315,17 @@ files = [
{file = "python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a"},
]
+[[package]]
+name = "python-iso639"
+version = "2024.2.7"
+requires_python = ">=3.8"
+summary = "Look-up utilities for ISO 639 language codes and names"
+groups = ["default"]
+files = [
+ {file = "python-iso639-2024.2.7.tar.gz", hash = "sha256:c323233348c34d57c601e3e6d824088e492896bcb97a61a87f7d93401a305377"},
+ {file = "python_iso639-2024.2.7-py3-none-any.whl", hash = "sha256:7b149623ff74230f4ee3061fb01d18e57a8d07c5fee2aa72907f39b7f6d16cbc"},
+]
+
[[package]]
name = "python-multipart"
version = "0.0.9"
@@ -2460,7 +2485,7 @@ files = [
[[package]]
name = "rich"
-version = "13.7.0"
+version = "13.7.1"
requires_python = ">=3.7.0"
summary = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal"
groups = ["default"]
@@ -2469,8 +2494,8 @@ dependencies = [
"pygments<3.0.0,>=2.13.0",
]
files = [
- {file = "rich-13.7.0-py3-none-any.whl", hash = "sha256:6da14c108c4866ee9520bbffa71f6fe3962e193b7da68720583850cd4548e235"},
- {file = "rich-13.7.0.tar.gz", hash = "sha256:5cb5123b5cf9ee70584244246816e9114227e0b98ad9176eede6ad54bf5403fa"},
+ {file = "rich-13.7.1-py3-none-any.whl", hash = "sha256:4edbae314f59eb482f54e9e30bf00d33350aaa94f4bfcd4e9e3110e64d0d7222"},
+ {file = "rich-13.7.1.tar.gz", hash = "sha256:9be308cb1fe2f1f57d67ce99e95af38a1e2bc71ad9813b0e247cf7ffbcc3a432"},
]
[[package]]
@@ -2571,28 +2596,28 @@ files = [
[[package]]
name = "ruff"
-version = "0.2.2"
+version = "0.3.2"
requires_python = ">=3.7"
summary = "An extremely fast Python linter and code formatter, written in Rust."
groups = ["default", "dev"]
files = [
- {file = "ruff-0.2.2-py3-none-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:0a9efb032855ffb3c21f6405751d5e147b0c6b631e3ca3f6b20f917572b97eb6"},
- {file = "ruff-0.2.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:d450b7fbff85913f866a5384d8912710936e2b96da74541c82c1b458472ddb39"},
- {file = "ruff-0.2.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ecd46e3106850a5c26aee114e562c329f9a1fbe9e4821b008c4404f64ff9ce73"},
- {file = "ruff-0.2.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5e22676a5b875bd72acd3d11d5fa9075d3a5f53b877fe7b4793e4673499318ba"},
- {file = "ruff-0.2.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1695700d1e25a99d28f7a1636d85bafcc5030bba9d0578c0781ba1790dbcf51c"},
- {file = "ruff-0.2.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:b0c232af3d0bd8f521806223723456ffebf8e323bd1e4e82b0befb20ba18388e"},
- {file = "ruff-0.2.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f63d96494eeec2fc70d909393bcd76c69f35334cdbd9e20d089fb3f0640216ca"},
- {file = "ruff-0.2.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a61ea0ff048e06de273b2e45bd72629f470f5da8f71daf09fe481278b175001"},
- {file = "ruff-0.2.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e1439c8f407e4f356470e54cdecdca1bd5439a0673792dbe34a2b0a551a2fe3"},
- {file = "ruff-0.2.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:940de32dc8853eba0f67f7198b3e79bc6ba95c2edbfdfac2144c8235114d6726"},
- {file = "ruff-0.2.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:0c126da55c38dd917621552ab430213bdb3273bb10ddb67bc4b761989210eb6e"},
- {file = "ruff-0.2.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:3b65494f7e4bed2e74110dac1f0d17dc8e1f42faaa784e7c58a98e335ec83d7e"},
- {file = "ruff-0.2.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:1ec49be4fe6ddac0503833f3ed8930528e26d1e60ad35c2446da372d16651ce9"},
- {file = "ruff-0.2.2-py3-none-win32.whl", hash = "sha256:d920499b576f6c68295bc04e7b17b6544d9d05f196bb3aac4358792ef6f34325"},
- {file = "ruff-0.2.2-py3-none-win_amd64.whl", hash = "sha256:cc9a91ae137d687f43a44c900e5d95e9617cb37d4c989e462980ba27039d239d"},
- {file = "ruff-0.2.2-py3-none-win_arm64.whl", hash = "sha256:c9d15fc41e6054bfc7200478720570078f0b41c9ae4f010bcc16bd6f4d1aacdd"},
- {file = "ruff-0.2.2.tar.gz", hash = "sha256:e62ed7f36b3068a30ba39193a14274cd706bc486fad521276458022f7bccb31d"},
+ {file = "ruff-0.3.2-py3-none-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:77f2612752e25f730da7421ca5e3147b213dca4f9a0f7e0b534e9562c5441f01"},
+ {file = "ruff-0.3.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:9966b964b2dd1107797be9ca7195002b874424d1d5472097701ae8f43eadef5d"},
+ {file = "ruff-0.3.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b83d17ff166aa0659d1e1deaf9f2f14cbe387293a906de09bc4860717eb2e2da"},
+ {file = "ruff-0.3.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb875c6cc87b3703aeda85f01c9aebdce3d217aeaca3c2e52e38077383f7268a"},
+ {file = "ruff-0.3.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:be75e468a6a86426430373d81c041b7605137a28f7014a72d2fc749e47f572aa"},
+ {file = "ruff-0.3.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:967978ac2d4506255e2f52afe70dda023fc602b283e97685c8447d036863a302"},
+ {file = "ruff-0.3.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1231eacd4510f73222940727ac927bc5d07667a86b0cbe822024dd00343e77e9"},
+ {file = "ruff-0.3.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2c6d613b19e9a8021be2ee1d0e27710208d1603b56f47203d0abbde906929a9b"},
+ {file = "ruff-0.3.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c8439338a6303585d27b66b4626cbde89bb3e50fa3cae86ce52c1db7449330a7"},
+ {file = "ruff-0.3.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:de8b480d8379620cbb5ea466a9e53bb467d2fb07c7eca54a4aa8576483c35d36"},
+ {file = "ruff-0.3.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:b74c3de9103bd35df2bb05d8b2899bf2dbe4efda6474ea9681280648ec4d237d"},
+ {file = "ruff-0.3.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:f380be9fc15a99765c9cf316b40b9da1f6ad2ab9639e551703e581a5e6da6745"},
+ {file = "ruff-0.3.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:0ac06a3759c3ab9ef86bbeca665d31ad3aa9a4b1c17684aadb7e61c10baa0df4"},
+ {file = "ruff-0.3.2-py3-none-win32.whl", hash = "sha256:9bd640a8f7dd07a0b6901fcebccedadeb1a705a50350fb86b4003b805c81385a"},
+ {file = "ruff-0.3.2-py3-none-win_amd64.whl", hash = "sha256:0c1bdd9920cab5707c26c8b3bf33a064a4ca7842d91a99ec0634fec68f9f4037"},
+ {file = "ruff-0.3.2-py3-none-win_arm64.whl", hash = "sha256:5f65103b1d76e0d600cabd577b04179ff592064eaa451a70a81085930e907d0b"},
+ {file = "ruff-0.3.2.tar.gz", hash = "sha256:fa78ec9418eb1ca3db392811df3376b46471ae93792a81af2d1cbb0e5dcb5142"},
]
[[package]]
@@ -2608,13 +2633,13 @@ files = [
[[package]]
name = "setuptools"
-version = "69.1.0"
+version = "69.1.1"
requires_python = ">=3.8"
summary = "Easily download, build, install, upgrade, and uninstall Python packages"
groups = ["default"]
files = [
- {file = "setuptools-69.1.0-py3-none-any.whl", hash = "sha256:c054629b81b946d63a9c6e732bc8b2513a7c3ea645f11d0139a2191d735c60c6"},
- {file = "setuptools-69.1.0.tar.gz", hash = "sha256:850894c4195f09c4ed30dba56213bf7c3f21d86ed6bdaafb5df5972593bfc401"},
+ {file = "setuptools-69.1.1-py3-none-any.whl", hash = "sha256:02fa291a0471b3a18b2b2481ed902af520c69e8ae0919c13da936542754b4c56"},
+ {file = "setuptools-69.1.1.tar.gz", hash = "sha256:5c0806c7d9af348e6dd3777b4f4dbb42c7ad85b190104837488eab9a7c945cf8"},
]
[[package]]
@@ -2652,18 +2677,18 @@ files = [
[[package]]
name = "sniffio"
-version = "1.3.0"
+version = "1.3.1"
requires_python = ">=3.7"
summary = "Sniff out which async library your code is running under"
groups = ["default"]
files = [
- {file = "sniffio-1.3.0-py3-none-any.whl", hash = "sha256:eecefdce1e5bbfb7ad2eeaabf7c1eeb404d7757c379bd1f7e5cce9d8bf425384"},
- {file = "sniffio-1.3.0.tar.gz", hash = "sha256:e60305c5e5d314f5389259b7f22aaa33d8f7dee49763119234af3755c55b9101"},
+ {file = "sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2"},
+ {file = "sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc"},
]
[[package]]
name = "sqlalchemy"
-version = "2.0.27"
+version = "2.0.28"
requires_python = ">=3.7"
summary = "Database Abstraction Library"
groups = ["default"]
@@ -2672,72 +2697,72 @@ dependencies = [
"typing-extensions>=4.6.0",
]
files = [
- {file = "SQLAlchemy-2.0.27-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d04e579e911562f1055d26dab1868d3e0bb905db3bccf664ee8ad109f035618a"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fa67d821c1fd268a5a87922ef4940442513b4e6c377553506b9db3b83beebbd8"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c7a596d0be71b7baa037f4ac10d5e057d276f65a9a611c46970f012752ebf2d"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:954d9735ee9c3fa74874c830d089a815b7b48df6f6b6e357a74130e478dbd951"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:5cd20f58c29bbf2680039ff9f569fa6d21453fbd2fa84dbdb4092f006424c2e6"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:03f448ffb731b48323bda68bcc93152f751436ad6037f18a42b7e16af9e91c07"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-win32.whl", hash = "sha256:d997c5938a08b5e172c30583ba6b8aad657ed9901fc24caf3a7152eeccb2f1b4"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-win_amd64.whl", hash = "sha256:eb15ef40b833f5b2f19eeae65d65e191f039e71790dd565c2af2a3783f72262f"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6c5bad7c60a392850d2f0fee8f355953abaec878c483dd7c3836e0089f046bf6"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a3012ab65ea42de1be81fff5fb28d6db893ef978950afc8130ba707179b4284a"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dbcd77c4d94b23e0753c5ed8deba8c69f331d4fd83f68bfc9db58bc8983f49cd"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d177b7e82f6dd5e1aebd24d9c3297c70ce09cd1d5d37b43e53f39514379c029c"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:680b9a36029b30cf063698755d277885d4a0eab70a2c7c6e71aab601323cba45"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1306102f6d9e625cebaca3d4c9c8f10588735ef877f0360b5cdb4fdfd3fd7131"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-win32.whl", hash = "sha256:5b78aa9f4f68212248aaf8943d84c0ff0f74efc65a661c2fc68b82d498311fd5"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-win_amd64.whl", hash = "sha256:15e19a84b84528f52a68143439d0c7a3a69befcd4f50b8ef9b7b69d2628ae7c4"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:0de1263aac858f288a80b2071990f02082c51d88335a1db0d589237a3435fe71"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce850db091bf7d2a1f2fdb615220b968aeff3849007b1204bf6e3e50a57b3d32"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8dfc936870507da96aebb43e664ae3a71a7b96278382bcfe84d277b88e379b18"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c4fbe6a766301f2e8a4519f4500fe74ef0a8509a59e07a4085458f26228cd7cc"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:4535c49d961fe9a77392e3a630a626af5baa967172d42732b7a43496c8b28876"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:0fb3bffc0ced37e5aa4ac2416f56d6d858f46d4da70c09bb731a246e70bff4d5"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-win32.whl", hash = "sha256:7f470327d06400a0aa7926b375b8e8c3c31d335e0884f509fe272b3c700a7254"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-win_amd64.whl", hash = "sha256:f9374e270e2553653d710ece397df67db9d19c60d2647bcd35bfc616f1622dcd"},
- {file = "SQLAlchemy-2.0.27-py3-none-any.whl", hash = "sha256:1ab4e0448018d01b142c916cc7119ca573803a4745cfe341b8f95657812700ac"},
- {file = "SQLAlchemy-2.0.27.tar.gz", hash = "sha256:86a6ed69a71fe6b88bf9331594fa390a2adda4a49b5c06f98e47bf0d392534f8"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e0b148ab0438f72ad21cb004ce3bdaafd28465c4276af66df3b9ecd2037bf252"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bbda76961eb8f27e6ad3c84d1dc56d5bc61ba8f02bd20fcf3450bd421c2fcc9c"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:feea693c452d85ea0015ebe3bb9cd15b6f49acc1a31c28b3c50f4db0f8fb1e71"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5da98815f82dce0cb31fd1e873a0cb30934971d15b74e0d78cf21f9e1b05953f"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:4a5adf383c73f2d49ad15ff363a8748319ff84c371eed59ffd0127355d6ea1da"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:56856b871146bfead25fbcaed098269d90b744eea5cb32a952df00d542cdd368"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-win32.whl", hash = "sha256:943aa74a11f5806ab68278284a4ddd282d3fb348a0e96db9b42cb81bf731acdc"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-win_amd64.whl", hash = "sha256:c6c4da4843e0dabde41b8f2e8147438330924114f541949e6318358a56d1875a"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:46a3d4e7a472bfff2d28db838669fc437964e8af8df8ee1e4548e92710929adc"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0d3dd67b5d69794cfe82862c002512683b3db038b99002171f624712fa71aeaa"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c61e2e41656a673b777e2f0cbbe545323dbe0d32312f590b1bc09da1de6c2a02"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0315d9125a38026227f559488fe7f7cee1bd2fbc19f9fd637739dc50bb6380b2"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:af8ce2d31679006e7b747d30a89cd3ac1ec304c3d4c20973f0f4ad58e2d1c4c9"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:81ba314a08c7ab701e621b7ad079c0c933c58cdef88593c59b90b996e8b58fa5"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-win32.whl", hash = "sha256:1ee8bd6d68578e517943f5ebff3afbd93fc65f7ef8f23becab9fa8fb315afb1d"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-win_amd64.whl", hash = "sha256:ad7acbe95bac70e4e687a4dc9ae3f7a2f467aa6597049eeb6d4a662ecd990bb6"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:d3499008ddec83127ab286c6f6ec82a34f39c9817f020f75eca96155f9765097"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9b66fcd38659cab5d29e8de5409cdf91e9986817703e1078b2fdaad731ea66f5"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bea30da1e76cb1acc5b72e204a920a3a7678d9d52f688f087dc08e54e2754c67"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:124202b4e0edea7f08a4db8c81cc7859012f90a0d14ba2bf07c099aff6e96462"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:e23b88c69497a6322b5796c0781400692eca1ae5532821b39ce81a48c395aae9"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4b6303bfd78fb3221847723104d152e5972c22367ff66edf09120fcde5ddc2e2"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-win32.whl", hash = "sha256:a921002be69ac3ab2cf0c3017c4e6a3377f800f1fca7f254c13b5f1a2f10022c"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-win_amd64.whl", hash = "sha256:b4a2cf92995635b64876dc141af0ef089c6eea7e05898d8d8865e71a326c0385"},
+ {file = "SQLAlchemy-2.0.28-py3-none-any.whl", hash = "sha256:78bb7e8da0183a8301352d569900d9d3594c48ac21dc1c2ec6b3121ed8b6c986"},
+ {file = "SQLAlchemy-2.0.28.tar.gz", hash = "sha256:dd53b6c4e6d960600fd6532b79ee28e2da489322fcf6648738134587faf767b6"},
]
[[package]]
name = "sqlalchemy"
-version = "2.0.27"
+version = "2.0.28"
extras = ["asyncio"]
requires_python = ">=3.7"
summary = "Database Abstraction Library"
groups = ["default"]
dependencies = [
- "SQLAlchemy==2.0.27",
+ "SQLAlchemy==2.0.28",
"greenlet!=0.4.17",
]
files = [
- {file = "SQLAlchemy-2.0.27-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d04e579e911562f1055d26dab1868d3e0bb905db3bccf664ee8ad109f035618a"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fa67d821c1fd268a5a87922ef4940442513b4e6c377553506b9db3b83beebbd8"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c7a596d0be71b7baa037f4ac10d5e057d276f65a9a611c46970f012752ebf2d"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:954d9735ee9c3fa74874c830d089a815b7b48df6f6b6e357a74130e478dbd951"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:5cd20f58c29bbf2680039ff9f569fa6d21453fbd2fa84dbdb4092f006424c2e6"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:03f448ffb731b48323bda68bcc93152f751436ad6037f18a42b7e16af9e91c07"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-win32.whl", hash = "sha256:d997c5938a08b5e172c30583ba6b8aad657ed9901fc24caf3a7152eeccb2f1b4"},
- {file = "SQLAlchemy-2.0.27-cp310-cp310-win_amd64.whl", hash = "sha256:eb15ef40b833f5b2f19eeae65d65e191f039e71790dd565c2af2a3783f72262f"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6c5bad7c60a392850d2f0fee8f355953abaec878c483dd7c3836e0089f046bf6"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a3012ab65ea42de1be81fff5fb28d6db893ef978950afc8130ba707179b4284a"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dbcd77c4d94b23e0753c5ed8deba8c69f331d4fd83f68bfc9db58bc8983f49cd"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d177b7e82f6dd5e1aebd24d9c3297c70ce09cd1d5d37b43e53f39514379c029c"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:680b9a36029b30cf063698755d277885d4a0eab70a2c7c6e71aab601323cba45"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1306102f6d9e625cebaca3d4c9c8f10588735ef877f0360b5cdb4fdfd3fd7131"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-win32.whl", hash = "sha256:5b78aa9f4f68212248aaf8943d84c0ff0f74efc65a661c2fc68b82d498311fd5"},
- {file = "SQLAlchemy-2.0.27-cp311-cp311-win_amd64.whl", hash = "sha256:15e19a84b84528f52a68143439d0c7a3a69befcd4f50b8ef9b7b69d2628ae7c4"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:0de1263aac858f288a80b2071990f02082c51d88335a1db0d589237a3435fe71"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce850db091bf7d2a1f2fdb615220b968aeff3849007b1204bf6e3e50a57b3d32"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8dfc936870507da96aebb43e664ae3a71a7b96278382bcfe84d277b88e379b18"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c4fbe6a766301f2e8a4519f4500fe74ef0a8509a59e07a4085458f26228cd7cc"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:4535c49d961fe9a77392e3a630a626af5baa967172d42732b7a43496c8b28876"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:0fb3bffc0ced37e5aa4ac2416f56d6d858f46d4da70c09bb731a246e70bff4d5"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-win32.whl", hash = "sha256:7f470327d06400a0aa7926b375b8e8c3c31d335e0884f509fe272b3c700a7254"},
- {file = "SQLAlchemy-2.0.27-cp312-cp312-win_amd64.whl", hash = "sha256:f9374e270e2553653d710ece397df67db9d19c60d2647bcd35bfc616f1622dcd"},
- {file = "SQLAlchemy-2.0.27-py3-none-any.whl", hash = "sha256:1ab4e0448018d01b142c916cc7119ca573803a4745cfe341b8f95657812700ac"},
- {file = "SQLAlchemy-2.0.27.tar.gz", hash = "sha256:86a6ed69a71fe6b88bf9331594fa390a2adda4a49b5c06f98e47bf0d392534f8"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e0b148ab0438f72ad21cb004ce3bdaafd28465c4276af66df3b9ecd2037bf252"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bbda76961eb8f27e6ad3c84d1dc56d5bc61ba8f02bd20fcf3450bd421c2fcc9c"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:feea693c452d85ea0015ebe3bb9cd15b6f49acc1a31c28b3c50f4db0f8fb1e71"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5da98815f82dce0cb31fd1e873a0cb30934971d15b74e0d78cf21f9e1b05953f"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:4a5adf383c73f2d49ad15ff363a8748319ff84c371eed59ffd0127355d6ea1da"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:56856b871146bfead25fbcaed098269d90b744eea5cb32a952df00d542cdd368"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-win32.whl", hash = "sha256:943aa74a11f5806ab68278284a4ddd282d3fb348a0e96db9b42cb81bf731acdc"},
+ {file = "SQLAlchemy-2.0.28-cp310-cp310-win_amd64.whl", hash = "sha256:c6c4da4843e0dabde41b8f2e8147438330924114f541949e6318358a56d1875a"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:46a3d4e7a472bfff2d28db838669fc437964e8af8df8ee1e4548e92710929adc"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0d3dd67b5d69794cfe82862c002512683b3db038b99002171f624712fa71aeaa"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c61e2e41656a673b777e2f0cbbe545323dbe0d32312f590b1bc09da1de6c2a02"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0315d9125a38026227f559488fe7f7cee1bd2fbc19f9fd637739dc50bb6380b2"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:af8ce2d31679006e7b747d30a89cd3ac1ec304c3d4c20973f0f4ad58e2d1c4c9"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:81ba314a08c7ab701e621b7ad079c0c933c58cdef88593c59b90b996e8b58fa5"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-win32.whl", hash = "sha256:1ee8bd6d68578e517943f5ebff3afbd93fc65f7ef8f23becab9fa8fb315afb1d"},
+ {file = "SQLAlchemy-2.0.28-cp311-cp311-win_amd64.whl", hash = "sha256:ad7acbe95bac70e4e687a4dc9ae3f7a2f467aa6597049eeb6d4a662ecd990bb6"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:d3499008ddec83127ab286c6f6ec82a34f39c9817f020f75eca96155f9765097"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9b66fcd38659cab5d29e8de5409cdf91e9986817703e1078b2fdaad731ea66f5"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bea30da1e76cb1acc5b72e204a920a3a7678d9d52f688f087dc08e54e2754c67"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:124202b4e0edea7f08a4db8c81cc7859012f90a0d14ba2bf07c099aff6e96462"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:e23b88c69497a6322b5796c0781400692eca1ae5532821b39ce81a48c395aae9"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4b6303bfd78fb3221847723104d152e5972c22367ff66edf09120fcde5ddc2e2"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-win32.whl", hash = "sha256:a921002be69ac3ab2cf0c3017c4e6a3377f800f1fca7f254c13b5f1a2f10022c"},
+ {file = "SQLAlchemy-2.0.28-cp312-cp312-win_amd64.whl", hash = "sha256:b4a2cf92995635b64876dc141af0ef089c6eea7e05898d8d8865e71a326c0385"},
+ {file = "SQLAlchemy-2.0.28-py3-none-any.whl", hash = "sha256:78bb7e8da0183a8301352d569900d9d3594c48ac21dc1c2ec6b3121ed8b6c986"},
+ {file = "SQLAlchemy-2.0.28.tar.gz", hash = "sha256:dd53b6c4e6d960600fd6532b79ee28e2da489322fcf6648738134587faf767b6"},
]
[[package]]
@@ -2906,12 +2931,22 @@ version = "2.0.1"
requires_python = ">=3.7"
summary = "A lil' TOML parser"
groups = ["default", "test"]
-marker = "python_version < \"3.11\""
files = [
{file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"},
{file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"},
]
+[[package]]
+name = "tomli-w"
+version = "1.0.0"
+requires_python = ">=3.7"
+summary = "A lil' TOML writer"
+groups = ["default"]
+files = [
+ {file = "tomli_w-1.0.0-py3-none-any.whl", hash = "sha256:9f2a07e8be30a0729e533ec968016807069991ae2fd921a78d42f429ae5f4463"},
+ {file = "tomli_w-1.0.0.tar.gz", hash = "sha256:f463434305e0336248cac9c2dc8076b707d8a12d019dd349f5c1e382dd1ae1b9"},
+]
+
[[package]]
name = "tomlkit"
version = "0.12.0"
@@ -2983,13 +3018,13 @@ files = [
[[package]]
name = "typing-extensions"
-version = "4.9.0"
+version = "4.10.0"
requires_python = ">=3.8"
summary = "Backported and Experimental Type Hints for Python 3.8+"
groups = ["default"]
files = [
- {file = "typing_extensions-4.9.0-py3-none-any.whl", hash = "sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd"},
- {file = "typing_extensions-4.9.0.tar.gz", hash = "sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783"},
+ {file = "typing_extensions-4.10.0-py3-none-any.whl", hash = "sha256:69b1a937c3a517342112fb4c6df7e72fc39a38e7891a5730ed4985b5214b5475"},
+ {file = "typing_extensions-4.10.0.tar.gz", hash = "sha256:b0abd7c89e8fb96f98db18d86106ff1d90ab692004eb746cf6eda2682f91b3cb"},
]
[[package]]
diff --git a/pyproject.toml b/pyproject.toml
index a6bccaa..6927eef 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -3,7 +3,7 @@ authors = [
{name = "Qinyu Luo", email = "qinyuluo123@gmail.com"},
]
maintainers = [
- {name = "Arno Edwards", email = "Arno.Edwards@outlook.com"},
+ {name = "Edwards Arno", email = "Edwards.Arno@outlook.com"},
]
license = {text = "Apache-2.0"}
requires-python = ">=3.10,<4.0"
@@ -15,20 +15,31 @@ dependencies = [
"pyyaml>=6.0.1",
"jedi>=0.19.1",
"GitPython>=3.1.41",
- "llama-index<0.10.0",
"chromadb>=0.4.22",
"prettytable>=3.9.0",
+ "python-iso639>=2024.2.7",
+ "pydantic-settings>=2.2.1",
+ "tomli>=2.0.1",
+ "tomli-w>=1.0.0",
+ "llama-index<0.10.0",
]
name = "repoagent"
-version = "0.0.6"
-description = "An Agent designed to offer an intelligent approach for generating project documentation."
+version = "0.1.2"
+description = "An LLM-Powered Framework for Repository-level Code Documentation Generation."
readme = "README.md"
+classifiers = [
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: 3.12",
+ "Topic :: Scientific/Engineering :: Artificial Intelligence"
+]
[project.urls]
repository = "https://github.com/LOGIC-10/RepoAgent"
[project.scripts]
-repoagent = "repo_agent.main:app"
+repoagent = "repo_agent.main:cli"
[tool.pdm]
[tool.pdm.dev-dependencies]
@@ -43,6 +54,7 @@ test = [
[tool.pdm.build]
includes = [
"repo_agent",
+ "assets/images/*.png"
]
@@ -50,3 +62,6 @@ includes = [
requires = ["pdm-backend"]
build-backend = "pdm.backend"
+
+[tool.pyright]
+reportCallIssue="none"
\ No newline at end of file
diff --git a/repo_agent/__main__.py b/repo_agent/__main__.py
index 94bb13d..05ccb0b 100755
--- a/repo_agent/__main__.py
+++ b/repo_agent/__main__.py
@@ -1,39 +1,4 @@
-#!/Users/yeyn/anaconda3/envs/ai-doc/bin/python
-import os
-import sys
+from .main import cli
-os.chdir(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))) # change the current working directory
-
-from repo_agent.runner import Runner, delete_fake_files
-from repo_agent.log import logger
-from repo_agent.config import CONFIG
-from repo_agent.doc_meta_info import MetaInfo, DocItem
-from repo_agent.utils.meta_info_utils import make_fake_files, delete_fake_files
-
-if len(sys.argv) == 1:
- runner = Runner()
- runner.run()
-    logger.info("Documentation task completed.")
-elif len(sys.argv) == 2:
- if sys.argv[1] == "clean":
- delete_fake_files()
- elif sys.argv[1] == "print":
- runner = Runner()
- runner.meta_info.target_repo_hierarchical_tree.print_recursive()
- elif sys.argv[1] == "diff":
- runner = Runner()
-        if runner.meta_info.in_generation_process: # if not in the generation process, start detecting changes
- print("this command only support pre-check")
- exit()
- file_path_reflections, jump_files = make_fake_files()
- new_meta_info = MetaInfo.init_meta_info(file_path_reflections, jump_files)
- new_meta_info.load_doc_from_older_meta(runner.meta_info)
- delete_fake_files()
-
- ignore_list = CONFIG.get("ignore_list", [])
- DocItem.check_has_task(new_meta_info.target_repo_hierarchical_tree, ignore_list)
- if new_meta_info.target_repo_hierarchical_tree.has_task:
- print("the following docs will be generated/updated:")
- new_meta_info.target_repo_hierarchical_tree.print_recursive(diff_status = True, ignore_list = ignore_list)
- else:
- print("no docs will be generated/updated, check your source-code update")
\ No newline at end of file
+if __name__ == "__main__":
+ cli()
\ No newline at end of file
diff --git a/repo_agent/change_detector.py b/repo_agent/change_detector.py
index 5b931fe..bf0d7d0 100644
--- a/repo_agent/change_detector.py
+++ b/repo_agent/change_detector.py
@@ -1,10 +1,12 @@
-import git
-import re, os
+import os
+import re
import subprocess
+
+import git
from colorama import Fore, Style
-from repo_agent.config import CONFIG
from repo_agent.file_handler import FileHandler
+from repo_agent.settings import setting
class ChangeDetector:
@@ -166,7 +168,7 @@ def get_to_be_staged_files(self):
print(f"{Fore.LIGHTYELLOW_EX}target_repo_path{Style.RESET_ALL}: {self.repo_path}")
print(f"{Fore.LIGHTMAGENTA_EX}already_staged_files{Style.RESET_ALL}:{staged_files}")
- project_hierarchy = CONFIG["project_hierarchy"]
+ project_hierarchy = setting.project.hierarchy_name
        # diffs is the list of all files with unstaged changes. These changes are relative to the working directory, i.e. modifications made since the last commit that have not yet been added to the staging area
        # For example, an existing md file that was updated because of code changes is marked as an unstaged diff
diffs = self.repo.index.diff(None)
@@ -178,7 +180,7 @@ def get_to_be_staged_files(self):
        # Handle the contents of untracked_files
for untracked_file in untracked_files:
            # Join repo_path and untracked_file to get the full absolute path
- if untracked_file.startswith(CONFIG["Markdown_Docs_folder"]):
+ if untracked_file.startswith(setting.project.markdown_docs_name):
to_be_staged_files.append(untracked_file)
continue
print(f"rel_untracked_file:{rel_untracked_file}")
@@ -187,7 +189,7 @@ def get_to_be_staged_files(self):
if rel_untracked_file.endswith(".md"):
                # Strip the CONFIG['Markdown_Docs_folder'] prefix from rel_untracked_file, then check whether it corresponds to one of the staged .py files
rel_untracked_file = os.path.relpath(
- rel_untracked_file, CONFIG["Markdown_Docs_folder"]
+ rel_untracked_file, setting.project.markdown_docs_name
)
corresponding_py_file = os.path.splitext(rel_untracked_file)[0] + ".py"
print(
@@ -198,7 +200,7 @@ def get_to_be_staged_files(self):
to_be_staged_files.append(
os.path.join(
self.repo_path.lstrip("/"),
- CONFIG["Markdown_Docs_folder"],
+ setting.project.markdown_docs_name,
rel_untracked_file,
)
)
@@ -211,7 +213,7 @@ def get_to_be_staged_files(self):
for unstaged_file in unstaged_files:
            # Join repo_path and unstaged_file to get the full absolute path
- if unstaged_file.startswith(CONFIG["Markdown_Docs_folder"]):
+ if unstaged_file.startswith(setting.project.markdown_docs_name) or unstaged_file.startswith(setting.project.hierarchy_name):
# abs_unstaged_file = os.path.join(self.repo_path, unstaged_file)
            # # # Get the path relative to the repository root
# # rel_unstaged_file = os.path.relpath(abs_unstaged_file, self.repo_path)
@@ -227,7 +229,7 @@ def get_to_be_staged_files(self):
if unstaged_file.endswith(".md"):
                # Strip the CONFIG['Markdown_Docs_folder'] prefix from rel_unstaged_file, then check whether it corresponds to one of the staged .py files
rel_unstaged_file = os.path.relpath(
- rel_unstaged_file, CONFIG["Markdown_Docs_folder"]
+ rel_unstaged_file, setting.project.markdown_docs_name
)
corresponding_py_file = os.path.splitext(rel_unstaged_file)[0] + ".py"
print(f"corresponding_py_file:{corresponding_py_file}")
@@ -236,7 +238,7 @@ def get_to_be_staged_files(self):
to_be_staged_files.append(
os.path.join(
self.repo_path.lstrip("/"),
- CONFIG["Markdown_Docs_folder"],
+ setting.project.markdown_docs_name,
rel_unstaged_file,
)
)
diff --git a/repo_agent/chat_engine.py b/repo_agent/chat_engine.py
index ab884d0..380fae6 100644
--- a/repo_agent/chat_engine.py
+++ b/repo_agent/chat_engine.py
@@ -1,20 +1,17 @@
-import os, json
+import inspect
+import os
import sys
-from openai import OpenAI
-from openai import APIConnectionError
-import tiktoken
import time
-import inspect
-from collections import defaultdict
-from colorama import Fore, Style
+from dataclasses import dataclass
+import tiktoken
+from openai import APIConnectionError, OpenAI
+
+from repo_agent.doc_meta_info import DocItem
from repo_agent.log import logger
-from repo_agent.config import language_mapping, max_input_tokens_map
from repo_agent.prompt import SYS_PROMPT, USR_PROMPT
-from repo_agent.doc_meta_info import DocItem
-class ContextLengthExceededError(Exception):
- """Exception raised when the input size exceeds the model's context length limit."""
- pass
+from repo_agent.settings import max_input_tokens_map, setting
+
def get_import_statements():
source_lines = inspect.getsourcelines(sys.modules[__name__])[0]
@@ -25,36 +22,9 @@ def get_import_statements():
]
return import_lines
-
-def build_path_tree(who_reference_me, reference_who, doc_item_path):
- def tree():
- return defaultdict(tree)
-
- path_tree = tree()
-
- for path_list in [who_reference_me, reference_who]:
- for path in path_list:
- parts = path.split(os.sep)
- node = path_tree
- for part in parts:
- node = node[part]
-
- # 处理 doc_item_path
- parts = doc_item_path.split(os.sep)
- parts[-1] = "✳️" + parts[-1] # 在最后一个对象前面加上星号
- node = path_tree
- for part in parts:
- node = node[part]
-
- def tree_to_string(tree, indent=0):
- s = ""
- for key, value in sorted(tree.items()):
- s += " " * indent + key + "\n"
- if isinstance(value, dict):
- s += tree_to_string(value, indent + 1)
- return s
-
- return tree_to_string(path_tree)
+@dataclass
+class ResponseMessage:
+ content: str
class ChatEngine:
@@ -62,8 +32,8 @@ class ChatEngine:
ChatEngine is used to generate the doc of functions or classes.
"""
- def __init__(self, CONFIG):
- self.config = CONFIG
+ def __init__(self, project_manager):
+ self.project_manager = project_manager
def num_tokens_from_string(self, string: str, encoding_name="cl100k_base") -> int:
"""Returns the number of tokens in a text string."""
@@ -71,6 +41,97 @@ def num_tokens_from_string(self, string: str, encoding_name="cl100k_base") -> in
num_tokens = len(encoding.encode(string))
return num_tokens
+ def reduce_input_length(self, shorten_attempt, prompt_data):
+ """
+ Reduces the length of the input prompts by modifying the sys_prompt contents.
+ """
+
+ logger.info(
+ f"Attempt {shorten_attempt + 1} / 2 to reduce the length of the messages."
+ )
+ if shorten_attempt == 0:
+ # First attempt, remove project_structure and project_structure_prefix
+ prompt_data.project_structure = ""
+ prompt_data.project_structure_prefix = ""
+ elif shorten_attempt == 1:
+            # Second attempt, further remove caller and callee (reference) information
+ prompt_data.project_structure = ""
+ prompt_data.project_structure_prefix = ""
+
+ prompt_data.referenced = False
+ prompt_data.referencer_content = ""
+ prompt_data.reference_letter = ""
+ prompt_data.combine_ref_situation = ""
+
+ # Update sys_prompt
+ sys_prompt = SYS_PROMPT.format(**prompt_data)
+
+ return sys_prompt
+
+ def generate_response(self, model, sys_prompt, usr_prompt, max_tokens):
+ client = OpenAI(
+ api_key=setting.chat_completion.openai_api_key.get_secret_value(),
+ base_url=str(setting.chat_completion.base_url),
+ timeout=setting.chat_completion.request_timeout,
+ )
+
+ messages = [
+ {"role": "system", "content": sys_prompt},
+ {"role": "user", "content": usr_prompt},
+ ]
+
+ response = client.chat.completions.create(
+ model=model,
+ messages=messages,
+ temperature=setting.chat_completion.temperature,
+ max_tokens=max_tokens,
+ )
+
+ response_message = response.choices[0].message
+
+ return response_message
+
+ def attempt_generate_response(
+ self, model, sys_prompt, usr_prompt, max_tokens, max_attempts=5
+ ):
+ attempt = 0
+ while attempt < max_attempts:
+ try:
+ response_message = self.generate_response(
+ model, sys_prompt, usr_prompt, max_tokens
+ )
+
+                # If response_message is None, continue to the next iteration
+ if response_message is None:
+ attempt += 1
+ continue
+ return response_message
+
+ except APIConnectionError as e:
+ logger.error(
+ f"Connection error: {e}. Attempt {attempt + 1} of {max_attempts}"
+ )
+ # Retry after 7 seconds
+ time.sleep(7)
+ attempt += 1
+ if attempt == max_attempts:
+ raise
+ else:
+ continue # Try to request again
+
+ except Exception as e:
+ logger.error(
+ f"An unknown error occurred: {e}. \nAttempt {attempt + 1} of {max_attempts}"
+ )
+ # Retry after 10 seconds
+ time.sleep(10)
+ attempt += 1
+ if attempt == max_attempts:
+ response_message = ResponseMessage(
+ "An unknown error occurred while generating this documentation after many tries."
+ )
+ return response_message
+
def generate_doc(self, doc_item: DocItem, file_handler):
code_info = doc_item.content
referenced = len(doc_item.who_reference_me) > 0
@@ -85,10 +146,18 @@ def generate_doc(self, doc_item: DocItem, file_handler):
doc_item_path = os.path.join(file_path, code_name)
        # The tree-structure path is derived from who_reference_me and reference_who in the global info, plus the object's own file_path
- project_structure = build_path_tree(
+        # Use the ProjectManager instance to obtain the project structure
+ project_structure = self.project_manager.build_path_tree(
who_reference_me, reference_who, doc_item_path
)
+ # project_manager = ProjectManager(repo_path=file_handler.repo_path, project_hierarchy=file_handler.project_hierarchy)
+ # project_structure = project_manager.get_project_structure()
+ # file_path = os.path.join(file_handler.repo_path, file_handler.file_path)
+ # code_from_referencer = get_code_from_json(project_manager.project_hierarchy, referencer) #
+ # referenced = True if len(code_from_referencer) > 0 else False
+ # referencer_content = '\n'.join([f'File_Path:{file_path}\n' + '\n'.join([f'Corresponding code as follows:\n{code}\n[End of this part of code]' for code in codes]) + f'\n[End of {file_path}]' for file_path, codes in code_from_referencer.items()])
+
def get_referenced_prompt(doc_item: DocItem) -> str:
if len(doc_item.reference_who) == 0:
return ""
@@ -119,7 +188,7 @@ def get_referencer_prompt(doc_item: DocItem) -> str:
def get_relationship_description(referencer_content, reference_letter):
if referencer_content and reference_letter:
- has_relationship = "And please include the reference relationship with its callers and callees in the project from a functional perspective"
+ return "And please include the reference relationship with its callers and callees in the project from a functional perspective"
elif referencer_content:
return "And please include the relationship with its callers in the project from a functional perspective."
elif reference_letter:
@@ -127,15 +196,7 @@ def get_relationship_description(referencer_content, reference_letter):
else:
return ""
-
- max_tokens = self.config.get("max_document_tokens", 1024) or 1024
-        max_attempts = 5 # set the maximum number of attempts
- language = self.config["language"] # setting document language
- if language not in language_mapping:
- raise KeyError(
- f"Language code {language} is not provided! Supported languages are: {json.dumps(language_mapping)}"
- )
- language = language_mapping[language]
+ max_tokens = setting.project.max_document_tokens
code_type_tell = "Class" if code_type == "ClassDef" else "Function"
parameters_or_attribute = (
@@ -155,145 +216,91 @@ def get_relationship_description(referencer_content, reference_letter):
referencer_content = get_referencer_prompt(doc_item)
reference_letter = get_referenced_prompt(doc_item)
- has_relationship = get_relationship_description(referencer_content, reference_letter)
-
- project_structure_prefix = ", and the related hierarchical structure of this project is as follows (The current object is marked with an *):"
-
- sys_prompt = SYS_PROMPT.format(
- combine_ref_situation=combine_ref_situation,
- file_path=file_path,
- project_structure_prefix=project_structure_prefix,
- project_structure=project_structure,
- code_type_tell=code_type_tell,
- code_name=code_name,
- code_content=code_content,
- have_return_tell=have_return_tell,
- # referenced=referenced,
- has_relationship=has_relationship,
- reference_letter=reference_letter,
- referencer_content=referencer_content,
- parameters_or_attribute=parameters_or_attribute,
- language=language,
+ has_relationship = get_relationship_description(
+ referencer_content, reference_letter
)
- usr_prompt = USR_PROMPT.format(language=language)
-
-        # # Save the prompt to a txt file
- # with open(f'prompt_output/sys_prompt_{code_name}.txt', 'w', encoding='utf-8') as f:
- # f.write(sys_prompt+'\n'+ usr_prompt)
+ project_structure_prefix = ", and the related hierarchical structure of this project is as follows (The current object is marked with an *):"
- # logger.info(f"Using {max_input_tokens_map} for context window judgment.")
-
- model = self.config["default_completion_kwargs"]["model"]
+ # First attempt: build the full prompt
+ prompt_data = {
+ "combine_ref_situation": combine_ref_situation,
+ "file_path": file_path,
+ "project_structure_prefix": project_structure_prefix,
+ "project_structure": project_structure,
+ "code_type_tell": code_type_tell,
+ "code_name": code_name,
+ "code_content": code_content,
+ "have_return_tell": have_return_tell,
+ "has_relationship": has_relationship,
+ "reference_letter": reference_letter,
+ "referencer_content": referencer_content,
+ "parameters_or_attribute": parameters_or_attribute,
+ "language": setting.project.language,
+ }
+
+ sys_prompt = SYS_PROMPT.format(**prompt_data)
+
+ usr_prompt = USR_PROMPT.format(language=setting.project.language)
+
+ model = setting.chat_completion.model
max_input_length = max_input_tokens_map.get(model, 4096) - max_tokens
- total_tokens = (
- self.num_tokens_from_string(sys_prompt) +
- self.num_tokens_from_string(usr_prompt)
- )
+ total_tokens = self.num_tokens_from_string(
+ sys_prompt
+ ) + self.num_tokens_from_string(usr_prompt)
# 如果总tokens超过当前模型的限制,则尝试寻找较大模型或者缩减输入
if total_tokens >= max_input_length:
# 查找一个拥有更大输入限制的模型
- larger_models = {k: v for k, v in max_input_tokens_map.items() if (v-max_tokens) > max_input_length}
- if larger_models:
- # 选择一个拥有更大输入限制的模型
- new_model = max(larger_models, key=larger_models.get)
- print(f"{Fore.LIGHTRED_EX}[Context Length Exceeded]{Style.RESET_ALL} model switching {model} -> {new_model}")
- model = new_model
- else:
- for attempt in range(2):
- logger.info(f"Attempt {attempt + 1} of {max_attempts}: Reducing the length of the messages.")
- if attempt == 0:
- # 第一次尝试,移除 project_structure 和 project_structure_prefix
- project_structure = ""
- project_structure_prefix = ""
- elif attempt == 1:
- # 第二次尝试,移除相关的调用者和被调用者信息
- referenced = False
- referencer_content = ""
- reference_letter = ""
- combine_ref_situation = ""
-
- # 更新 sys_prompt
- sys_prompt = SYS_PROMPT.format(
- reference_letter=reference_letter,
- combine_ref_situation=combine_ref_situation,
- file_path=file_path,
- project_structure_prefix=project_structure_prefix,
- project_structure=project_structure,
- code_type_tell=code_type_tell,
- code_name=code_name,
- code_content=code_content,
- have_return_tell=have_return_tell,
- has_relationship=has_relationship,
- referenced=referenced,
- referencer_content=referencer_content,
- parameters_or_attribute=parameters_or_attribute,
- language=language,
- )
-
- # 重新计算 tokens
- total_tokens = (
- self.num_tokens_from_string(sys_prompt) +
- self.num_tokens_from_string(usr_prompt)
- )
- # 检查是否满足要求
- if total_tokens < max_input_length:
- break
-
- if total_tokens >= max_input_length:
- error_message = (
- f"Context length of {total_tokens} exceeds the maximum limit of {max_input_length} tokens..."
+ larger_models = {
+ k: v
+ for k, v in max_input_tokens_map.items()
+ if (v - max_tokens) > total_tokens
+ }  # models whose context window exceeds the current total input tokens
+ for model_name, model_window in larger_models.items():
+ if model_window - max_tokens > total_tokens:
+ try:
+ # Attempt to make a request with the larger model
+ logger.info(
+ f"Trying model {model_name} for large-context processing."
+ )
+ response_message = self.attempt_generate_response(
+ model_name, sys_prompt, usr_prompt, max_tokens
+ )  # response_message was already validated inside attempt_generate_response
+ return response_message
+ except Exception as e:
+ continue  # Otherwise skip this candidate and try the next model
+ # If no larger model succeeds, fall back to the original model
+ # and try shortening the input instead
+ for shorten_attempt in range(2):
+ shorten_success = False
+ sys_prompt = self.reduce_input_length(shorten_attempt, prompt_data)
+ # Recount tokens after shortening
+ total_tokens = self.num_tokens_from_string(
+ sys_prompt
+ ) + self.num_tokens_from_string(usr_prompt)
+ # Check whether the shortened prompt now fits
+ if total_tokens < max_input_length:
+ shorten_success = True
+ # If it fits, send the request to generate the documentation
+ response_message = self.attempt_generate_response(
+ model, sys_prompt, usr_prompt, max_tokens
)
+ break  # Stop shortening once a response has been generated
- # raise ContextLengthExceededError(error_message)
- return None
-
- attempt = 0
- while attempt < max_attempts:
- try:
- # 获取基本配置
- client = OpenAI(
- api_key=self.config["api_keys"][model][0]["api_key"],
- base_url=self.config["api_keys"][model][0]["base_url"],
- timeout=self.config["default_completion_kwargs"]["request_timeout"],
- )
-
- messages = [
- {"role": "system", "content": sys_prompt},
- {"role": "user", "content": usr_prompt},
- ]
- response = client.chat.completions.create(
- model=model,
- messages=messages,
- temperature=self.config["default_completion_kwargs"]["temperature"],
- max_tokens=max_tokens,
+ if not shorten_success:
+ # This doc_item cannot be documented: the code itself exceeds the model's context limit.
+ # Return a stand-in ResponseMessage whose content carries the failure notice;
+ # callers read the result via response_message.content, so content must be accessible that way.
+ response_message = ResponseMessage(
+ "Tried to generate the document, but the code is too long to process."
)
-
- response_message = response.choices[0].message
-
- # 如果 response_message 是 None,则继续下一次循环
- if response_message is None:
- attempt += 1
- continue
return response_message
- except APIConnectionError as e:
- logger.error(f"Connection error: {e}. Attempt {attempt + 1} of {max_attempts}")
- # Retry after 7 seconds
- time.sleep(7)
- attempt += 1
- if attempt == max_attempts:
- raise
- else:
- continue # Try to request again
- except Exception as e:
- logger.error(
- f"An unknown error occurred: {e}. \nAttempt {attempt + 1} of {max_attempts}"
- )
- # Retry after 10 seconds
- time.sleep(10)
- attempt += 1
- if attempt == max_attempts:
- return None
+ else:  # Total tokens fit within the model's limit; send the request directly
+ response_message = self.attempt_generate_response(
+ model, sys_prompt, usr_prompt, max_tokens
+ )
+
+ return response_message
diff --git a/repo_agent/chat_with_repo/__init__.py b/repo_agent/chat_with_repo/__init__.py
new file mode 100644
index 0000000..12b0f6e
--- /dev/null
+++ b/repo_agent/chat_with_repo/__init__.py
@@ -0,0 +1,3 @@
+# repo_agent/chat_with_repo/__init__.py
+
+from .main import main
diff --git a/repo_agent/chat_with_repo/gradio_interface.py b/repo_agent/chat_with_repo/gradio_interface.py
index 7df59db..634ec21 100644
--- a/repo_agent/chat_with_repo/gradio_interface.py
+++ b/repo_agent/chat_with_repo/gradio_interface.py
@@ -1,5 +1,6 @@
import gradio as gr
import markdown
+
from repo_agent.log import logger
diff --git a/repo_agent/chat_with_repo/json_handler.py b/repo_agent/chat_with_repo/json_handler.py
index cfe97aa..a06114e 100644
--- a/repo_agent/chat_with_repo/json_handler.py
+++ b/repo_agent/chat_with_repo/json_handler.py
@@ -1,5 +1,6 @@
import json
import sys
+
from repo_agent.log import logger
@@ -87,4 +88,4 @@ def search_code_contents_by_name(self, file_path, search_text):
if __name__ == "__main__":
processor = JsonFileProcessor("database.json")
- md_contents = processor.extract_md_contents()
+ md_contents, extracted_contents = processor.extract_data()
diff --git a/repo_agent/chat_with_repo/main.py b/repo_agent/chat_with_repo/main.py
index abdfb7b..2086dec 100644
--- a/repo_agent/chat_with_repo/main.py
+++ b/repo_agent/chat_with_repo/main.py
@@ -1,19 +1,19 @@
-import os
from repo_agent.chat_with_repo.gradio_interface import GradioInterface
from repo_agent.chat_with_repo.rag import RepoAssistant
-from repo_agent.config import CONFIG
+from repo_agent.settings import setting
def main():
- _model = CONFIG["default_completion_kwargs"]["model"]
- api_key = CONFIG["api_keys"][_model][0]["api_key"]
- api_base = CONFIG["api_keys"][_model][0]["base_url"]
- db_path = os.path.join(
- CONFIG["repo_path"], CONFIG["project_hierarchy"], "project_hierarchy.json"
+ api_key = setting.chat_completion.openai_api_key.get_secret_value()
+ api_base = str(setting.chat_completion.base_url)
+ db_path = (
+ setting.project.target_repo
+ / setting.project.hierarchy_name
+ / "project_hierarchy.json"
)
assistant = RepoAssistant(api_key, api_base, db_path)
- md_contents,meta_data = assistant.json_data.extract_data()
+ md_contents, meta_data = assistant.json_data.extract_data()
assistant.chroma_data.create_vector_store(md_contents, meta_data)
GradioInterface(assistant.respond)
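The `db_path` construction above relies on pathlib's `/` operator, which replaces the old `os.path.join` call; a minimal sketch with hypothetical values standing in for `setting.project.target_repo` and `setting.project.hierarchy_name`:

```python
from pathlib import Path

# Hypothetical stand-ins for the pydantic settings values:
target_repo = Path("/home/user/myrepo")
hierarchy_name = ".project_doc_record"

# Successive "/" joins build the same path os.path.join used to produce.
db_path = target_repo / hierarchy_name / "project_hierarchy.json"
```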
diff --git a/repo_agent/chat_with_repo/prompt.py b/repo_agent/chat_with_repo/prompt.py
index f7ba781..56cb576 100644
--- a/repo_agent/chat_with_repo/prompt.py
+++ b/repo_agent/chat_with_repo/prompt.py
@@ -1,6 +1,7 @@
from llama_index.llms import OpenAI
-from repo_agent.log import logger
+
from repo_agent.chat_with_repo.json_handler import JsonFileProcessor
+from repo_agent.log import logger
class TextAnalysisTool:
diff --git a/repo_agent/chat_with_repo/rag.py b/repo_agent/chat_with_repo/rag.py
index 420a1a7..cbd9f1d 100644
--- a/repo_agent/chat_with_repo/rag.py
+++ b/repo_agent/chat_with_repo/rag.py
@@ -1,12 +1,14 @@
-from repo_agent.chat_with_repo.json_handler import JsonFileProcessor
-from repo_agent.chat_with_repo.vectordb import ChromaManager
-from repo_agent.chat_with_repo.prompt import TextAnalysisTool
-from repo_agent.log import logger
+import json
+
from llama_index import PromptTemplate
from llama_index.llms import OpenAI
-import json
from openai import OpenAI as AI
+from repo_agent.chat_with_repo.json_handler import JsonFileProcessor
+from repo_agent.chat_with_repo.prompt import TextAnalysisTool
+from repo_agent.chat_with_repo.vectordb import ChromaManager
+from repo_agent.log import logger
+
class RepoAssistant:
def __init__(self, api_key, api_base, db_path):
diff --git a/repo_agent/chat_with_repo/vectordb.py b/repo_agent/chat_with_repo/vectordb.py
index 5e142db..fbb543f 100644
--- a/repo_agent/chat_with_repo/vectordb.py
+++ b/repo_agent/chat_with_repo/vectordb.py
@@ -1,5 +1,6 @@
import chromadb
from chromadb.utils import embedding_functions
+
from repo_agent.log import logger
diff --git a/repo_agent/config.py b/repo_agent/config.py
deleted file mode 100644
index 614004f..0000000
--- a/repo_agent/config.py
+++ /dev/null
@@ -1,64 +0,0 @@
-import yaml
-import sys
-
-try:
- CONFIG = yaml.load(open("config.yml", "r"), Loader=yaml.FullLoader)
-except FileNotFoundError:
- print(
- "The file does not exist! Maybe you forgot to rename config.yml.template to config.yml and update the essential content"
- )
- sys.exit(1) # Exits the program
-
-language_mapping = {
- "en": "English",
- "es": "Spanish",
- "fr": "French",
- "de": "German",
- "zh": "Chinese",
- "ja": "Japanese",
- "ru": "Russian",
- "it": "Italian",
- "ko": "Korean",
- "nl": "Dutch",
- "pt": "Portuguese",
- "ar": "Arabic",
- "tr": "Turkish",
- "sv": "Swedish",
- "da": "Danish",
- "fi": "Finnish",
- "no": "Norwegian",
- "pl": "Polish",
- "cs": "Czech",
- "hu": "Hungarian",
- "el": "Greek",
- "he": "Hebrew",
- "th": "Thai",
- "hi": "Hindi",
- "bn": "Bengali",
-}
-
-# 获取 yaml 文件中实际使用的模型列表
-used_models = []
-for model_name, _ in CONFIG['api_keys'].items():
- used_models.append(model_name)
-
-# NOTE Each model's token limit has been reduced by 1024 tokens to account for the output space and 1 for boundary conditions.
-max_input_tokens_map = {
- "gpt-3.5-turbo": 4096, # NOTE OPENAI said that The gpt-3.5-turbo model alias will be automatically upgraded from gpt-3.5-turbo-0613 to gpt-3.5-turbo-0125 on February 16th. But in 2/20, then still maintain 4,096 tokens for context window.
- "gpt-3.5-turbo-0613": 4096, # NOTE Will be deprecated on June 13, 2024.
- "gpt-3.5-turbo-16k": 16384, # NOTE Will be deprecated on June 13, 2024.
- "gpt-3.5-turbo-16k-0613": 16384, # NOTE Will be deprecated on June 13, 2024.
- "gpt-3.5-turbo-0125": 16384,
- "gpt-4": 8192,
- "gpt-4-0613": 8192,
- "gpt-4-32k": 32768, # NOTE This model was never rolled out widely in favor of GPT-4 Turbo.
- "gpt-4-1106": 131072,
- "gpt-4-0125-preview": 131072,
- "gpt-4-turbo-preview": 131072,
-}
-
-# 移除在 yaml 文件中未使用的模型
-for model_key in list(max_input_tokens_map.keys()):
- if model_key not in used_models:
- del max_input_tokens_map[model_key]
-
diff --git a/repo_agent/config_manager.py b/repo_agent/config_manager.py
new file mode 100644
index 0000000..b7fcb0b
--- /dev/null
+++ b/repo_agent/config_manager.py
@@ -0,0 +1,97 @@
+# repo_agent/config_manager.py
+import os
+from pathlib import Path
+from typing import Dict
+
+import tomli
+import tomli_w
+
+
+def get_config_path() -> Path:
+ # First check the current working directory
+ working_directory = Path.cwd()
+ local_config_path = working_directory / 'config.toml'
+
+ # If a config.toml is found there, return its path directly
+ if local_config_path.exists():
+ return local_config_path
+
+ # If no config.toml is found in the working directory, fall back to an OS-specific location
+ os_name = os.name
+ if os_name == 'posix':
+ # Unix and macOS: use the home directory
+ home_config_path = Path.home() / '.repoagent'
+ elif os_name == 'nt':
+ # Windows: use the APPDATA directory
+ home_config_path = Path(os.getenv('APPDATA')) / 'repoagent' # type: ignore
+ else:
+ # If OS detection fails, default to a local directory
+ home_config_path = Path.cwd() / 'repoagent'
+
+ # Ensure the config directory exists
+ home_config_path.mkdir(parents=True, exist_ok=True)
+ config_path = home_config_path / 'config.toml'
+
+ # Ensure the config file exists; create an empty one if it does not
+ if not config_path.exists():
+ config_path.touch()
+
+ # Return the full path, including the file name
+ return config_path
+
+
+def read_config(file_path: Path | None = None) -> Dict[str, any]: # type: ignore
+
+ if file_path is None:
+ file_path = get_config_path()
+
+ with open(file_path, "rb") as f:
+ try:
+ toml_dict = tomli.load(f)
+ except tomli.TOMLDecodeError:
+ toml_dict = {}
+
+ return toml_dict
+
+
+
+def write_config(update_config: dict, file_path: Path | None = None) -> None:
+ if file_path is None:
+ file_path = get_config_path()
+
+ # Read the existing config; fall back to an empty dict if it cannot be parsed
+ with open(file_path, "rb") as f:
+ try:
+ existing_config = tomli.load(f)
+ except tomli.TOMLDecodeError:
+ existing_config = {}
+
+ # Merge the new keys and values into the existing config
+ existing_config.update(update_config)
+
+ # Write the updated config back to the file
+ with open(file_path, "wb") as f:
+ tomli_w.dump(existing_config, f)
+
+if __name__ == '__main__':
+ # Sample TOML data to be written to the configuration file
+ sample_toml_data = """\
+ val2 = 2
+ val1 = 1
+
+ [table]
+ val3 = 4
+ """
+ # Convert the TOML string to a dictionary
+ sample_config = tomli.loads(sample_toml_data)
+
+ # Write the TOML configuration to a file
+ write_config(sample_config)
+
+ # Read the configuration back from the file
+ read_back_config = read_config()
+
+ # Print the configuration to verify the contents
+ print(read_back_config)
+
+ print(sample_config)
\ No newline at end of file
diff --git a/repo_agent/doc_meta_info.py b/repo_agent/doc_meta_info.py
index d75edd0..b92921f 100644
--- a/repo_agent/doc_meta_info.py
+++ b/repo_agent/doc_meta_info.py
@@ -1,23 +1,26 @@
"""存储doc对应的信息,同时处理引用的关系
"""
from __future__ import annotations
+
+import json
+import os
import threading
from dataclasses import dataclass, field
-from typing import List, Dict, Any, Callable, Optional
+from enum import Enum, auto, unique
+from pathlib import Path
+from typing import Any, Callable, Dict, List, Optional
+
+import jedi
from colorama import Fore, Style
-from enum import Enum, unique, auto
-import time
from prettytable import PrettyTable
-import os
-import json
-import jedi
from tqdm import tqdm
+from repo_agent.file_handler import FileHandler
from repo_agent.log import logger
+from repo_agent.multi_task_dispatch import Task, TaskManager
+from repo_agent.settings import setting
from repo_agent.utils.meta_info_utils import latest_verison_substring
-from repo_agent.config import CONFIG
-from repo_agent.file_handler import FileHandler
-from repo_agent.multi_task_dispatch import TaskManager, Task
+
@unique
class EdgeType(Enum):
@@ -62,9 +65,9 @@ def print_self(self):
color = Fore.BLUE
return color + self.name + Style.RESET_ALL
- def get_edge_type(
+ def get_edge_type(self,
from_item_type: DocItemType, to_item_type: DocItemType
- ) -> EdgeType:
+ ):
pass
@@ -77,7 +80,7 @@ class DocItemStatus(Enum):
referencer_not_exist = auto() # 曾经引用他的obj被删除了,或者不再引用他了
-def need_to_generate(doc_item: DocItem, ignore_list: List) -> bool:
+def need_to_generate(doc_item: DocItem, ignore_list: List[str] = []) -> bool:
"""只生成item的,文件及更高粒度都跳过。另外如果属于一个blacklist的文件也跳过"""
if doc_item.item_status == DocItemStatus.doc_up_to_date:
return False
@@ -233,14 +236,14 @@ def find(self, recursive_file_path: list) -> Optional[DocItem]:
return now
@staticmethod
- def check_has_task(now_item: DocItem, ignore_list) -> bool:
+ def check_has_task(now_item: DocItem, ignore_list: List[str] = []):
if need_to_generate(now_item, ignore_list=ignore_list):
now_item.has_task = True
for _, child in now_item.children.items():
DocItem.check_has_task(child, ignore_list)
now_item.has_task = child.has_task or now_item.has_task
- def print_recursive(self, indent=0, print_content=False, diff_status = False, ignore_list = []):
+ def print_recursive(self, indent=0, print_content=False, diff_status=False, ignore_list: List[str] = []):
"""递归打印repo对象"""
def print_indent(indent=0):
@@ -249,7 +252,7 @@ def print_indent(indent=0):
return " " * indent + "|-"
print_obj_name = self.obj_name
if self.item_type == DocItemType._repo:
- print_obj_name = CONFIG["repo_path"]
+ print_obj_name = setting.project.target_repo
if diff_status and need_to_generate(self, ignore_list=ignore_list):
print(
print_indent(indent) + f"{self.item_type.print_self()}: {print_obj_name} : {self.item_status.name}",
@@ -316,7 +319,7 @@ class MetaInfo:
@staticmethod
def init_meta_info(file_path_reflections, jump_files) -> MetaInfo:
"""从一个仓库path中初始化metainfo"""
- project_abs_path = CONFIG["repo_path"]
+ project_abs_path = setting.project.target_repo
print(f"{Fore.LIGHTRED_EX}Initializing MetaInfo: {Style.RESET_ALL}from {project_abs_path}")
file_handler = FileHandler(project_abs_path, None)
repo_structure = file_handler.generate_overall_structure(file_path_reflections, jump_files)
@@ -327,7 +330,7 @@ def init_meta_info(file_path_reflections, jump_files) -> MetaInfo:
return metainfo
@staticmethod
- def from_checkpoint_path(checkpoint_dir_path: str) -> MetaInfo:
+ def from_checkpoint_path(checkpoint_dir_path: str | Path) -> MetaInfo:
"""从已有的metainfo dir里面读取metainfo"""
project_hierarchy_json_path = os.path.join(
checkpoint_dir_path, "project_hierarchy.json"
@@ -341,7 +344,7 @@ def from_checkpoint_path(checkpoint_dir_path: str) -> MetaInfo:
os.path.join(checkpoint_dir_path, "meta-info.json"), "r", encoding="utf-8"
) as reader:
meta_data = json.load(reader)
- metainfo.repo_path = CONFIG["repo_path"]
+ metainfo.repo_path = str(setting.project.target_repo) # Convert DirectoryPath to string
metainfo.document_version = meta_data["doc_version"]
metainfo.fake_file_reflection = meta_data["fake_file_reflection"]
metainfo.jump_files = meta_data["jump_files"]
@@ -353,7 +356,7 @@ def from_checkpoint_path(checkpoint_dir_path: str) -> MetaInfo:
)
return metainfo
- def checkpoint(self, target_dir_path: str, flash_reference_relation=False):
+ def checkpoint(self, target_dir_path: str | Path, flash_reference_relation=False):
"""
Save the MetaInfo object to the specified directory.
@@ -782,10 +785,10 @@ def from_project_hierarchy_json(project_hierarchy_json) -> MetaInfo:
for file_name, file_content in tqdm(project_hierarchy_json.items(),desc="parsing parent relationship"):
# 首先parse file archi
- if not os.path.exists(os.path.join(CONFIG["repo_path"], file_name)):
+ if not os.path.exists(os.path.join(setting.project.target_repo, file_name)):
logger.info(f"deleted content: {file_name}")
continue
- elif os.path.getsize(os.path.join(CONFIG["repo_path"], file_name)) == 0:
+ elif os.path.getsize(os.path.join(setting.project.target_repo, file_name)) == 0:
logger.info(f"blank content: {file_name}")
continue
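The signature change to `need_to_generate` above only adds a default for `ignore_list`; its skip rules can be sketched in isolation (simplified — the real function also skips file- and repo-level items, and the names below are stand-ins):

```python
from enum import Enum, auto

class DocStatus(Enum):
    doc_up_to_date = auto()
    doc_to_be_generated = auto()

def should_generate(status: DocStatus, rel_path: str, ignore_list: tuple = ()) -> bool:
    # Items whose documentation is current are skipped outright.
    if status is DocStatus.doc_up_to_date:
        return False
    # So is anything under a blacklisted path prefix.
    return not any(rel_path.startswith(prefix) for prefix in ignore_list)
```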
diff --git a/repo_agent/exceptions.py b/repo_agent/exceptions.py
new file mode 100644
index 0000000..73e58a8
--- /dev/null
+++ b/repo_agent/exceptions.py
@@ -0,0 +1,18 @@
+from openai import APIConnectionError
+
+from repo_agent.log import logger
+
+
+class ErrorHandler:
+ @staticmethod
+ def handle_exception(e):
+ if isinstance(e, APIConnectionError):
+ logger.warning(f"OpenAIResponseError occurred: {e}")
+ elif isinstance(e, OpenAIError):
+ logger.error(f"OpenAIError occurred: {e}")
+ else:
+ logger.error(f"An unexpected error occurred: {e}")
+
+class OpenAIError(Exception):
+ def __init__(self, message):
+ super().__init__(message)
\ No newline at end of file
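The new `ErrorHandler` routes exceptions by type with `isinstance` checks. A self-contained sketch of the same dispatch, using the built-in `ConnectionError` in place of `openai.APIConnectionError`:

```python
class OpenAIError(Exception):
    """Mirrors the custom exception defined in repo_agent/exceptions.py."""

def classify_exception(e: Exception) -> str:
    # Transient / most specific errors first, generic handling last.
    if isinstance(e, ConnectionError):
        return "warning: connection error"
    if isinstance(e, OpenAIError):
        return "error: OpenAI error"
    return "error: unexpected"
```

Note that `handle_exception` in the PR references `OpenAIError` before its class definition appears; this works because Python resolves the name at call time, not at definition time.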
diff --git a/repo_agent/file_handler.py b/repo_agent/file_handler.py
index 65b0c05..95f3764 100644
--- a/repo_agent/file_handler.py
+++ b/repo_agent/file_handler.py
@@ -1,26 +1,26 @@
# FileHandler 类,实现对文件的读写操作,这里的文件包括markdown文件和python文件
# repo_agent/file_handler.py
-import git
-import os, json
import ast
-from tqdm import tqdm
+import json
+import os
+
+import git
from colorama import Fore, Style
-import threading
-from typing import Dict
-from repo_agent.utils.meta_info_utils import latest_verison_substring
-from repo_agent.config import CONFIG
-from repo_agent.log import logger
+from tqdm import tqdm
+
+from repo_agent.settings import setting
from repo_agent.utils.gitignore_checker import GitignoreChecker
+from repo_agent.utils.meta_info_utils import latest_verison_substring
-# 这个类会在遍历变更后的文件的循环中,为每个变更后文件(也就是当前文件)创建一个实例
class FileHandler:
+ """
+ In the loop over changed files, one instance is created for each changed file (the current file).
+ """
def __init__(self, repo_path, file_path):
self.file_path = file_path # 这里的file_path是相对于仓库根目录的路径
self.repo_path = repo_path
- self.project_hierarchy = os.path.join(
- repo_path, CONFIG["project_hierarchy"], "project_hierarchy.json"
- )
+ self.project_hierarchy = setting.project.target_repo / setting.project.hierarchy_name
def read_file(self):
"""
@@ -265,7 +265,7 @@ def generate_overall_structure(self, file_path_reflections, jump_files) -> dict:
# elif not_ignored_files.endswith(latest_version):
# """如果某文件被删除但没有暂存,文件系统有fake_file但没有对应的原始文件"""
# for k,v in file_path_reflections.items():
- # if v == not_ignored_files and not os.path.exists(os.path.join(CONFIG["repo_path"], not_ignored_files)):
+ # if v == not_ignored_files and not os.path.exists(os.path.join(setting.project.target_repo, not_ignored_files)):
# print(f"{Fore.LIGHTYELLOW_EX}[Unstaged DeleteFile] load fake-file-content: {Style.RESET_ALL}{k}")
# normal_file_names = k #原来的名字
# break
@@ -345,46 +345,12 @@ def convert_to_markdown_file(self, file_path=None):
return markdown
- # def convert_all_to_markdown_files_from_json(self):
- # """
- # Converts all files to markdown format based on the JSON data.
-
- # Reads the project hierarchy from a JSON file, checks if the Markdown_docs folder exists,
- # creates it if it doesn't, and then iterates through each file in the JSON data.
- # For each file, it converts the file to markdown format and writes it to the Markdown_docs folder.
-
- # Args:
- # self (object): The file_handler object.
-
- # Returns:
- # None
- # """
- # with open(self.project_hierarchy, 'r', encoding='utf-8') as f:
- # json_data = json.load(f)
-
- # # 检查根目录是否存在Markdown_docs文件夹,如果不存在则创建
- # markdown_docs_path = os.path.join(self.repo_path, CONFIG['Markdown_Docs_folder'])
- # if not os.path.exists(markdown_docs_path):
- # os.mkdir(markdown_docs_path)
-
- # # 遍历json_data["files"]列表中的每个字典
- # for rel_file_path, file_dict in json_data.items():
- # md_path = os.path.join(markdown_docs_path, rel_file_path.replace('.py', '.md'))
- # markdown = self.convert_to_markdown_file(rel_file_path)
-
- # # 检查目录是否存在,如果不存在,就创建它
- # os.makedirs(os.path.dirname(md_path), exist_ok=True)
-
- # # 将markdown文档写入到Markdown_docs文件夹中
- # with open(md_path, 'w', encoding='utf-8') as f:
- # f.write(markdown)
-
if __name__ == "__main__":
# 打开py文件读取源代码
# file_handler = FileHandler('/path/to/repo', '/path/to/file.py')
- file_handler = FileHandler(CONFIG["repo_path"], "XAgent/engines/pipeline_old.py")
+ file_handler = FileHandler(setting.project.target_repo, "XAgent/engines/pipeline_old.py")
# file_handler.generate_markdown_from_json()
file_handler.convert_all_to_markdown_files_from_json()
# code_content = file_handler.read_file()
diff --git a/repo_agent/log.py b/repo_agent/log.py
index 147a6b6..695a1f4 100644
--- a/repo_agent/log.py
+++ b/repo_agent/log.py
@@ -1,8 +1,8 @@
# repo_agent/log.py
-import sys
import logging
+import sys
+
from loguru import logger
-from repo_agent.config import CONFIG
logger = logger.opt(colors=True)
"""
@@ -53,8 +53,8 @@ def emit(self, record):
# Find caller from where the logged message originated
frame, depth = logging.currentframe(), 2
- while frame.f_code.co_filename == logging.__file__:
- frame = frame.f_back
+ while frame.f_code.co_filename == logging.__file__: # type: ignore
+ frame = frame.f_back # type: ignore
depth += 1
# Log to Loguru
@@ -63,79 +63,14 @@ def emit(self, record):
)
-def set_logger_level_from_config():
- log_level = CONFIG.get("log_level", "INFO").upper()
-
- try:
- logger.remove()
- logger.add(sys.stderr, level=log_level)
-
- # Intercept standard logging
- logging.basicConfig(handlers=[InterceptHandler()], level=0, force=True)
-
- logger.success(f"Log level set to {log_level}!")
- except ValueError:
- logger.warning(
- f"Invalid log level '{log_level}' in config. Using default log level."
- )
-
-
-# Automatically set logger level from config when module is imported
-set_logger_level_from_config()
-
-
-# TODO 由 Loguru 接管 Print
-# 重定向标准输出和错误输出, :
-# - `stdout` 被重定向至 logger 的 INFO 等级。
-# - `stderr` 被重定向至 logger 的 ERROR 等级。
-# class StreamToLogger:
-# """
-# Fake file-like stream object that redirects writes to a logger instance.
-
-# Args:
-# logger (loguru.Logger): The logger instance to which the output will be redirected.
-# level (str, optional): Log level for the messages. Defaults to 'INFO'.
-
-# Methods:
-# write(buffer): Redirects the buffer to the logger.
-# flush(): Dummy method to comply with file-like interface.
-# """
-
-# def __init__(self, logger, level="INFO"):
-# self.logger = logger
-# self._level = level
-
-# def write(self, buffer):
-# for line in buffer.rstrip().splitlines():
-# self.logger.opt(depth=1).log(self._level, line.rstrip())
-
-# def flush(self):
-# pass
-
-
-# # Redirect stdout to logger at success level
-# sys.stdout = StreamToLogger(logger)
-
-# Redirect stderr to logger at error level
-# sys.stderr = StreamToLogger(logger, "ERROR")
-
-# import stackprinter
+def set_logger_level_from_config(log_level):
+ logger.remove()
+ logger.add(sys.stderr, level=log_level)
-# def handle_exception(exc_type, exc_value, exc_traceback):
-# """
-# Custom exception handler using stackprinter for formatting.
+ # Intercept standard logging
+ logging.basicConfig(handlers=[InterceptHandler()], level=0, force=True)
-# Args:
-# exc_type: Exception type.
-# exc_value: Exception value.
-# exc_traceback: Exception traceback.
-# """
-# # Use stackprinter to format the exception and traceback
-# formatted_exception = stackprinter.format(exc_value, exc_traceback, style='plaintext')
+ logger.success(f"Log level set to {log_level}!")
-# # Log the formatted exception
-# logger.error("Unhandled exception:\n{}", formatted_exception)
-# # Set the custom exception handler
-# sys.excepthook = handle_exception
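The surviving `set_logger_level_from_config` still installs `InterceptHandler` so that stdlib `logging` records flow into loguru. The interception mechanism can be demonstrated with a stdlib-only stand-in sink:

```python
import logging

class CollectingHandler(logging.Handler):
    # Stands in for InterceptHandler: records emitted through the stdlib
    # logging module are captured and forwarded to another sink
    # (loguru in the real code, a plain list here).
    def __init__(self):
        super().__init__()
        self.messages = []

    def emit(self, record):
        self.messages.append((record.levelname, record.getMessage()))

handler = CollectingHandler()
# level=0 plus force=True matches the PR's basicConfig call: replace any
# existing root handlers and let every record through to the interceptor.
logging.basicConfig(handlers=[handler], level=0, force=True)
logging.getLogger("some_library").info("hello")
```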
diff --git a/repo_agent/main.py b/repo_agent/main.py
new file mode 100644
index 0000000..bea3544
--- /dev/null
+++ b/repo_agent/main.py
@@ -0,0 +1,309 @@
+from importlib import metadata
+
+import click
+from iso639 import Language, LanguageNotFoundError
+from tenacity import (
+ retry,
+ retry_if_exception_type,
+ stop_after_attempt,
+)
+
+from repo_agent.chat_with_repo import main as run_chat_with_repo
+from repo_agent.config_manager import write_config
+from repo_agent.doc_meta_info import DocItem, MetaInfo
+from repo_agent.log import logger, set_logger_level_from_config
+from repo_agent.runner import Runner
+from repo_agent.settings import (
+ ChatCompletionSettings,
+ LogLevel,
+ ProjectSettings,
+ Setting,
+ setting,
+)
+from repo_agent.utils.meta_info_utils import delete_fake_files, make_fake_files
+
+# Try to resolve the installed package version; fall back to a default if unavailable.
+try:
+ version_number = metadata.version("repoagent")
+except metadata.PackageNotFoundError:
+ version_number = "0.0.0"
+
+project_settings_default_instance = ProjectSettings()
+chat_completion_default_instance = ChatCompletionSettings()
+
+
+@retry(
+ retry=retry_if_exception_type(LanguageNotFoundError),
+ stop=stop_after_attempt(3),
+ retry_error_callback=lambda _: click.echo(
+ "Failed to find the language after several attempts."
+ ),
+)
+def language_prompt(default_language):
+ language_code = click.prompt(
+ "Enter the language (ISO 639 code or language name, e.g., 'en', 'eng', 'English')",
+ default=default_language,
+ )
+ try:
+ language_name = Language.match(language_code).name
+ return language_name
+ except LanguageNotFoundError:
+ click.echo(
+ "Invalid language input. Please enter a valid ISO 639 code or language name."
+ )
+ raise
+
+
+@click.group()
+@click.version_option(version_number)
+def cli():
+ """An LLM-Powered Framework for Repository-level Code Documentation Generation."""
+ pass
+
+
+@cli.command()
+def configure():
+ """Configure the agent's parameters."""
+ project_settings_instance = ProjectSettings(
+ target_repo=click.prompt(
+ "Enter the path to target repository",
+ type=click.Path(exists=True, file_okay=False, dir_okay=True),
+ ),
+ hierarchy_name=click.prompt(
+ "Enter the project hierarchy file name",
+ default=project_settings_default_instance.hierarchy_name,
+ ),
+ markdown_docs_name=click.prompt(
+ "Enter the Markdown documents folder name",
+ default=project_settings_default_instance.markdown_docs_name,
+ ),
+ ignore_list=click.prompt(
+ "Enter files or directories to ignore, separated by commas",
+ default=",".join(
+ str(path) for path in project_settings_default_instance.ignore_list
+ ),
+ ).split(","),
+ language=language_prompt(
+ default_language=project_settings_default_instance.language
+ ),
+ max_thread_count=click.prompt(
+ "Enter the maximum number of threads",
+ default=project_settings_default_instance.max_thread_count,
+ type=click.INT,
+ ),
+ max_document_tokens=click.prompt(
+ "Enter the maximum number of document tokens",
+ default=project_settings_default_instance.max_document_tokens,
+ type=click.INT,
+ ),
+ log_level=click.prompt(
+ "Enter the log level",
+ type=click.Choice(
+ [level.value for level in LogLevel], case_sensitive=False
+ ),
+ default=project_settings_default_instance.log_level.value,
+ ),
+ )
+
+ logger.success("Project settings saved successfully.")
+
+ chat_completion_instance = ChatCompletionSettings(
+ model=click.prompt(
+ "Enter the model", default=chat_completion_default_instance.model
+ ),
+ temperature=click.prompt(
+ "Enter the temperature",
+ default=chat_completion_default_instance.temperature,
+ type=float,
+ ),
+ request_timeout=click.prompt(
+ "Enter the request timeout (seconds)",
+ default=chat_completion_default_instance.request_timeout,
+ type=int,
+ ),
+ base_url=click.prompt(
+ "Enter the base URL", default=str(chat_completion_default_instance.base_url)
+ ),
+ )
+ logger.success("Chat completion settings saved successfully.")
+
+ update_setting = Setting(
+ project=project_settings_instance, chat_completion=chat_completion_instance
+ )
+
+ write_config(update_setting.model_dump())
+
+
+@cli.command()
+@click.option(
+ "--model",
+ "-m",
+ default=setting.chat_completion.model,
+ show_default=True,
+ help="Specifies the model to use for completion.",
+ type=str,
+)
+@click.option(
+ "--temperature",
+ "-t",
+ default=setting.chat_completion.temperature,
+ show_default=True,
+ help="Sets the generation temperature for the model. Lower values make the model more deterministic.",
+ type=float,
+)
+@click.option(
+ "--request-timeout",
+ "-r",
+ default=setting.chat_completion.request_timeout,
+ show_default=True,
+ help="Defines the timeout in seconds for the API request.",
+ type=int,
+)
+@click.option(
+ "--base-url",
+ "-b",
+ default=setting.chat_completion.base_url,
+ show_default=True,
+ help="The base URL for the API calls.",
+ type=str,
+)
+@click.option(
+ "--target-repo-path",
+ "-tp",
+ default=setting.project.target_repo,
+ show_default=True,
+ help="The file system path to the target repository. This path is used as the root for documentation generation.",
+ type=click.Path(),
+)
+@click.option(
+ "--hierarchy-path",
+ "-hp",
+ default=setting.project.hierarchy_name,
+ show_default=True,
+ help="The name or path for the project hierarchy file, used to organize documentation structure.",
+ type=str,
+)
+@click.option(
+ "--markdown-docs-path",
+ "-mdp",
+ default=setting.project.markdown_docs_name,
+ show_default=True,
+ help="The folder path where Markdown documentation will be stored or generated.",
+ type=str,
+)
+@click.option(
+ "--ignore-list",
+ "-i",
+ default=setting.project.ignore_list,
+ show_default=True,
+    help="A list of files or directories to ignore during documentation generation. Repeat the option to ignore multiple paths.",
+ multiple=True,
+ type=str,
+)
+@click.option(
+ "--language",
+ "-l",
+ default=setting.project.language,
+ show_default=True,
+    help="The ISO 639 code or language name for the documentation.",
+ type=str,
+)
+@click.option(
+ "--log-level",
+ "-ll",
+ default=setting.project.log_level,
+ show_default=True,
+ help="Sets the logging level (e.g., DEBUG, INFO, WARNING, ERROR, CRITICAL) for the application. Default is INFO.",
+ type=click.Choice([level.value for level in LogLevel], case_sensitive=False),
+)
+def run(
+ model,
+ temperature,
+ request_timeout,
+ base_url,
+ target_repo_path,
+ hierarchy_path,
+ markdown_docs_path,
+ ignore_list,
+ language,
+ log_level,
+):
+ """Run the program with the specified parameters."""
+
+ project_settings = ProjectSettings(
+ target_repo=target_repo_path,
+ hierarchy_name=hierarchy_path,
+ markdown_docs_name=markdown_docs_path,
+ ignore_list=list(ignore_list), # convert tuple from 'multiple' option to list
+ language=language,
+ log_level=log_level,
+ )
+
+ chat_completion_settings = ChatCompletionSettings(
+ model=model,
+ temperature=temperature,
+ request_timeout=request_timeout,
+ base_url=base_url,
+ )
+
+ settings = Setting(
+ project=project_settings, chat_completion=chat_completion_settings
+ )
+ write_config(settings.model_dump())
+    set_logger_level_from_config(log_level=settings.project.log_level)
+
+ runner = Runner()
+ runner.run()
+ logger.success("Documentation task completed.")
+
+
+@cli.command()
+def clean():
+ """Clean the fake files generated by the documentation process."""
+ delete_fake_files()
+ logger.success("Fake files have been cleaned up.")
+
+
+@cli.command()
+def print_hierarchy():
+ """Print the hierarchy of the target repository."""
+ runner = Runner()
+ runner.meta_info.target_repo_hierarchical_tree.print_recursive()
+ logger.success("Hierarchy printed.")
+
+
+@cli.command()
+def diff():
+ """Check for changes and print which documents will be updated or generated."""
+ runner = Runner()
+    if runner.meta_info.in_generation_process:  # diff is only a pre-check; abort if generation is already in progress
+ click.echo("This command only supports pre-check")
+ raise click.Abort()
+
+ file_path_reflections, jump_files = make_fake_files()
+ new_meta_info = MetaInfo.init_meta_info(file_path_reflections, jump_files)
+ new_meta_info.load_doc_from_older_meta(runner.meta_info)
+ delete_fake_files()
+
+ DocItem.check_has_task(
+ new_meta_info.target_repo_hierarchical_tree,
+ ignore_list=setting.project.ignore_list,
+ )
+ if new_meta_info.target_repo_hierarchical_tree.has_task:
+ click.echo("The following docs will be generated/updated:")
+ new_meta_info.target_repo_hierarchical_tree.print_recursive(
+ diff_status=True, ignore_list=setting.project.ignore_list
+ )
+ else:
+        click.echo("No docs will be generated or updated; check your source-code changes.")
+
+
+@cli.command()
+def chat_with_repo():
+ """Automatic Q&A for Issues and Code Explanation."""
+ run_chat_with_repo()
+
+
+if __name__ == "__main__":
+ cli()
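The `--ignore-list` option above uses click's `multiple=True`, so click hands the callback a tuple that `run` converts to a list before building `ProjectSettings`. A minimal sketch of that behavior (assuming click 8.x; the command and paths here are hypothetical):

```python
import click
from click.testing import CliRunner

@click.command()
@click.option("--ignore-list", "-i", multiple=True, type=str)
def run(ignore_list):
    # click collects repeated -i flags into a tuple; convert to a list,
    # as the new `run` command does before building ProjectSettings.
    click.echo(",".join(list(ignore_list)))

result = CliRunner().invoke(run, ["-i", "tests", "-i", "build"])
print(result.output.strip())  # tests,build
```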
diff --git a/repo_agent/multi_task_dispatch.py b/repo_agent/multi_task_dispatch.py
index ecbb28e..315f42b 100644
--- a/repo_agent/multi_task_dispatch.py
+++ b/repo_agent/multi_task_dispatch.py
@@ -1,11 +1,11 @@
from __future__ import annotations
+
+import random
import threading
import time
-import random
-from typing import List, Callable, Dict, Any
-from colorama import Fore, Style
+from typing import Any, Callable, Dict, List
-from repo_agent.log import logger
+from colorama import Fore, Style
class Task:
diff --git a/repo_agent/project_manager.py b/repo_agent/project_manager.py
index 6a07a11..bbc0b07 100644
--- a/repo_agent/project_manager.py
+++ b/repo_agent/project_manager.py
@@ -1,4 +1,5 @@
import os
+
import jedi
@@ -32,33 +33,39 @@ def walk_dir(root, prefix=""):
structure = []
walk_dir(self.repo_path)
return "\n".join(structure)
+
+ def build_path_tree(self, who_reference_me, reference_who, doc_item_path):
+ from collections import defaultdict
+
+ def tree():
+ return defaultdict(tree)
- # def find_all_referencer(self, variable_name, file_path, line_number, column_number):
- # """
- # Find all references of a variable in a given file.
+ path_tree = tree()
- # Args:
- # variable_name (str): The name of the variable to search for.
- # file_path (str): The path of the file to search in.
- # line_number (int): The line number where the variable is located.
- # column_number (int): The column number where the variable is located.
+        # Build the trees for who_reference_me and reference_who
+ for path_list in [who_reference_me, reference_who]:
+ for path in path_list:
+ parts = path.split(os.sep)
+ node = path_tree
+ for part in parts:
+ node = node[part]
- # Returns:
- # list: A list of tuples containing the file path, line number, and column number of each reference.
+        # Handle doc_item_path
+ parts = doc_item_path.split(os.sep)
+        parts[-1] = "✳️" + parts[-1]  # Prefix the last path component with a star marker
+ node = path_tree
+ for part in parts:
+ node = node[part]
- # """
- # script = jedi.Script(path=os.path.join(self.repo_path, file_path))
- # references = script.get_references(line=line_number, column=column_number)
+ def tree_to_string(tree, indent=0):
+ s = ""
+ for key, value in sorted(tree.items()):
+ s += " " * indent + key + "\n"
+ if isinstance(value, dict):
+ s += tree_to_string(value, indent + 1)
+ return s
- # try:
- # # Filter out references with variable_name and return their positions
- # variable_references = [ref for ref in references if ref.name == variable_name]
- # return [(os.path.relpath(ref.module_path, self.repo_path), ref.line, ref.column) for ref in variable_references if not (ref.line == line_number and ref.column == column_number)]
- # except Exception as e:
- # # Print error message and related parameters
- # print(f"Error occurred: {e}")
- # print(f"Parameters: variable_name={variable_name}, file_path={file_path}, line_number={line_number}, column_number={column_number}")
- # return []
+ return tree_to_string(path_tree)
if __name__ == "__main__":
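The new `build_path_tree` relies on a recursive `defaultdict`, so indexing with each path segment auto-creates intermediate nodes. A standalone run of the same technique (POSIX-style path separators assumed; the sample paths are made up):

```python
import os
from collections import defaultdict

def tree():
    # Each missing key materializes another tree node on first access.
    return defaultdict(tree)

def build_path_tree(who_reference_me, reference_who, doc_item_path):
    path_tree = tree()
    for path_list in (who_reference_me, reference_who):
        for path in path_list:
            node = path_tree
            for part in path.split(os.sep):
                node = node[part]
    # Mark the documented item itself with a star, as in project_manager.py.
    parts = doc_item_path.split(os.sep)
    parts[-1] = "✳️" + parts[-1]
    node = path_tree
    for part in parts:
        node = node[part]

    def tree_to_string(t, indent=0):
        s = ""
        for key, value in sorted(t.items()):
            s += " " * indent + key + "\n"
            if isinstance(value, dict):
                s += tree_to_string(value, indent + 1)
        return s

    return tree_to_string(path_tree)

print(build_path_tree(["repo/a.py"], ["repo/b.py"], "repo/c.py"))
```

The referenced files merge into one tree, siblings are sorted, and the documented item sorts last because of the star prefix.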
diff --git a/repo_agent/runner.py b/repo_agent/runner.py
index fec5272..f476f5c 100644
--- a/repo_agent/runner.py
+++ b/repo_agent/runner.py
@@ -1,71 +1,51 @@
-import threading
-import os, json
import json
-import git
-import itertools
-from tqdm import tqdm
-from typing import List
-from functools import partial
+import os
+import shutil
import subprocess
-import shutil
+import threading
+import time
from concurrent.futures import ThreadPoolExecutor
+from functools import partial
+
from colorama import Fore, Style
-import time
+from tqdm import tqdm
-from repo_agent.file_handler import FileHandler
-from repo_agent.utils.meta_info_utils import latest_verison_substring, make_fake_files, delete_fake_files
from repo_agent.change_detector import ChangeDetector
-from repo_agent.project_manager import ProjectManager
from repo_agent.chat_engine import ChatEngine
-from repo_agent.doc_meta_info import MetaInfo, DocItem, DocItemType, DocItemStatus, need_to_generate
+from repo_agent.doc_meta_info import DocItem, DocItemStatus, MetaInfo, need_to_generate
+from repo_agent.file_handler import FileHandler
from repo_agent.log import logger
-from repo_agent.config import CONFIG
from repo_agent.multi_task_dispatch import worker
-
-
-
-
-def load_whitelist():
- if CONFIG["whitelist_path"] != None:
- assert os.path.exists(
- CONFIG["whitelist_path"]
- ), f"whitelist_path must be a json-file,and must exists: {CONFIG['whitelist_path']}"
- with open(CONFIG["whitelist_path"], "r") as reader:
- white_list_json_data = json.load(reader)
-
- return white_list_json_data
- else:
- return None
+from repo_agent.project_manager import ProjectManager
+from repo_agent.settings import setting
+from repo_agent.utils.meta_info_utils import delete_fake_files, make_fake_files
class Runner:
def __init__(self):
+ self.absolute_project_hierarchy_path = setting.project.target_repo / setting.project.hierarchy_name
+
self.project_manager = ProjectManager(
- repo_path=CONFIG["repo_path"], project_hierarchy=CONFIG["project_hierarchy"]
+ repo_path=setting.project.target_repo,
+ project_hierarchy=setting.project.hierarchy_name
)
- self.change_detector = ChangeDetector(repo_path=CONFIG["repo_path"])
- self.chat_engine = ChatEngine(CONFIG=CONFIG)
+ self.change_detector = ChangeDetector(repo_path=setting.project.target_repo)
+ self.chat_engine = ChatEngine(project_manager=self.project_manager)
- if not os.path.exists(
- os.path.join(CONFIG["repo_path"], CONFIG["project_hierarchy"])
- ):
+
+ if not self.absolute_project_hierarchy_path.exists():
file_path_reflections, jump_files = make_fake_files()
self.meta_info = MetaInfo.init_meta_info(file_path_reflections, jump_files)
self.meta_info.checkpoint(
- target_dir_path=os.path.join(
- CONFIG["repo_path"], CONFIG["project_hierarchy"]
- )
+ target_dir_path=self.absolute_project_hierarchy_path
)
else: # 如果存在全局结构信息文件夹.project_hierarchy,就从中加载
self.meta_info = MetaInfo.from_checkpoint_path(
- os.path.join(CONFIG["repo_path"], CONFIG["project_hierarchy"])
+ self.absolute_project_hierarchy_path
)
- self.meta_info.white_list = load_whitelist()
self.meta_info.checkpoint( # 更新白名单后也要重新将全局信息写入到.project_doc_record文件夹中
- target_dir_path=os.path.join(
- CONFIG["repo_path"], CONFIG["project_hierarchy"]
- )
+ target_dir_path=self.absolute_project_hierarchy_path
)
self.runner_lock = threading.Lock()
@@ -90,30 +70,22 @@ def get_all_pys(self, directory):
def generate_doc_for_a_single_item(self, doc_item: DocItem):
"""为一个对象生成文档"""
- try:
- rel_file_path = doc_item.get_full_name()
+ rel_file_path = doc_item.get_full_name()
- ignore_list = CONFIG.get("ignore_list", [])
- if not need_to_generate(doc_item, ignore_list):
- print(f"内容被忽略/文档已生成,跳过:{doc_item.get_full_name()}")
- else:
- print(f" -- 正在生成文档 {Fore.LIGHTYELLOW_EX}{doc_item.item_type.name}: {doc_item.get_full_name()}{Style.RESET_ALL}")
- file_handler = FileHandler(CONFIG["repo_path"], rel_file_path)
- response_message = self.chat_engine.generate_doc(
- doc_item=doc_item,
- file_handler=file_handler,
- )
- doc_item.md_content.append(response_message.content)
- doc_item.item_status = DocItemStatus.doc_up_to_date
- self.meta_info.checkpoint(
- target_dir_path=os.path.join(
- CONFIG["repo_path"], CONFIG["project_hierarchy"]
- )
- )
- except Exception as e:
- logger.info(f" 多次尝试后生成文档失败,跳过:{doc_item.get_full_name()}")
- logger.info("Error:", e)
- doc_item.item_status = DocItemStatus.doc_has_not_been_generated
+ if not need_to_generate(doc_item, setting.project.ignore_list):
+            print(f"Content ignored or document already generated, skipping: {doc_item.get_full_name()}")
+ else:
+ print(f" -- Generating document {Fore.LIGHTYELLOW_EX}{doc_item.item_type.name}: {doc_item.get_full_name()}{Style.RESET_ALL}")
+ file_handler = FileHandler(setting.project.target_repo, rel_file_path)
+ response_message = self.chat_engine.generate_doc(
+ doc_item=doc_item,
+ file_handler=file_handler,
+ )
+ doc_item.md_content.append(response_message.content)
+ doc_item.item_status = DocItemStatus.doc_up_to_date
+ self.meta_info.checkpoint(
+ target_dir_path=self.absolute_project_hierarchy_path
+ )
def first_generate(self):
"""
@@ -123,8 +95,7 @@ def first_generate(self):
**注意**:这个生成first_generate的过程中,目标仓库代码不能修改。也就是说,一个document的生成过程必须绑定代码为一个版本。
"""
logger.info("Starting to generate documentation")
- ignore_list = CONFIG.get("ignore_list", [])
- check_task_available_func = partial(need_to_generate, ignore_list=ignore_list)
+ check_task_available_func = partial(need_to_generate, ignore_list=setting.project.ignore_list)
task_manager = self.meta_info.get_topology(
check_task_available_func
) # 将按照此顺序生成文档
@@ -149,7 +120,7 @@ def first_generate(self):
self.generate_doc_for_a_single_item,
),
)
- for process_id in range(CONFIG["max_thread_count"])
+ for process_id in range(setting.project.max_thread_count)
]
for thread in threads:
thread.start()
@@ -161,12 +132,10 @@ def first_generate(self):
)
self.meta_info.in_generation_process = False
self.meta_info.checkpoint(
- target_dir_path=os.path.join(
- CONFIG["repo_path"], CONFIG["project_hierarchy"]
- )
+ target_dir_path=self.absolute_project_hierarchy_path
)
logger.info(
- f"成功生成了 {before_task_len - len(task_manager.task_dict)} 个文档"
+ f"Successfully generated {before_task_len - len(task_manager.task_dict)} documents."
)
except BaseException as e:
@@ -178,8 +147,8 @@ def markdown_refresh(self):
"""将目前最新的document信息写入到一个markdown格式的文件夹里(不管markdown内容是不是变化了)"""
with self.runner_lock:
# 首先删除doc下所有内容,然后再重新写入
- markdown_folder = os.path.join(CONFIG["repo_path"],CONFIG["Markdown_Docs_folder"])
- if os.path.exists(markdown_folder):
+ markdown_folder = setting.project.target_repo / setting.project.markdown_docs_name
+ if markdown_folder.exists():
shutil.rmtree(markdown_folder)
os.mkdir(markdown_folder)
@@ -201,7 +170,6 @@ def recursive_check(
continue
rel_file_path = file_item.get_full_name()
- # file_handler = FileHandler(CONFIG['repo_path'], rel_file_path)
def to_markdown(item: DocItem, now_level: int) -> str:
markdown_content = ""
markdown_content += (
@@ -223,22 +191,22 @@ def to_markdown(item: DocItem, now_level: int) -> str:
markdown = ""
for _, child in file_item.children.items():
markdown += to_markdown(child, 2)
- assert markdown != None, f"markdown内容为空,文件路径为{rel_file_path}"
+            assert markdown is not None, f"Markdown content is empty, the file path is: {rel_file_path}"
# 写入markdown内容到.md文件
file_path = os.path.join(
- CONFIG["Markdown_Docs_folder"],
+ setting.project.markdown_docs_name,
file_item.get_file_name().replace(".py", ".md"),
)
if file_path.startswith("/"):
# 移除开头的 '/'
file_path = file_path[1:]
- abs_file_path = os.path.join(CONFIG["repo_path"], file_path)
+ abs_file_path = setting.project.target_repo / file_path
os.makedirs(os.path.dirname(abs_file_path), exist_ok=True)
with open(abs_file_path, "w", encoding="utf-8") as file:
file.write(markdown)
logger.info(
- f"markdown document has been refreshed at {CONFIG['Markdown_Docs_folder']}"
+ f"markdown document has been refreshed at {setting.project.markdown_docs_name}"
)
def git_commit(self, commit_message):
@@ -264,9 +232,7 @@ def run(self):
# 根据document version自动检测是否仍在最初生成的process里(是否为第一次生成)
self.first_generate() # 如果是第一次做文档生成任务,就通过first_generate生成所有文档
self.meta_info.checkpoint(
- target_dir_path=os.path.join(
- CONFIG["repo_path"], CONFIG["project_hierarchy"]
- ),
+ target_dir_path=self.absolute_project_hierarchy_path,
flash_reference_relation=True,
) # 这一步将生成后的meta信息(包含引用关系)写入到.project_doc_record文件夹中
return
@@ -290,11 +256,10 @@ def run(self):
new_meta_info.load_doc_from_older_meta(self.meta_info)
self.meta_info = new_meta_info # 更新自身的meta_info信息为new的信息
- self.meta_info.in_generation_process = True # 将in_generation_process设置为True,表示检测到变更后正在生成文档的过程中
+        self.meta_info.in_generation_process = True  # Set to True to indicate that changes were detected and document generation is in progress
# 处理任务队列
- ignore_list = CONFIG.get("ignore_list", [])
- check_task_available_func = partial(need_to_generate, ignore_list=ignore_list)
+ check_task_available_func = partial(need_to_generate, ignore_list=setting.project.ignore_list)
task_manager = self.meta_info.get_task_manager(self.meta_info.target_repo_hierarchical_tree,task_available_func=check_task_available_func)
@@ -310,7 +275,7 @@ def run(self):
target=worker,
args=(task_manager, process_id, self.generate_doc_for_a_single_item),
)
- for process_id in range(CONFIG["max_thread_count"])
+ for process_id in range(setting.project.max_thread_count)
]
for thread in threads:
thread.start()
@@ -321,9 +286,7 @@ def run(self):
self.meta_info.document_version = self.change_detector.repo.head.commit.hexsha
self.meta_info.checkpoint(
- target_dir_path=os.path.join(
- CONFIG["repo_path"], CONFIG["project_hierarchy"]
- ),
+ target_dir_path=self.absolute_project_hierarchy_path,
flash_reference_relation=True,
)
logger.info(f"Doc has been forwarded to the latest version")
@@ -338,7 +301,7 @@ def run(self):
git_add_result = self.change_detector.add_unstaged_files()
if len(git_add_result) > 0:
- logger.info(f"已添加 {[file for file in git_add_result]} 到暂存区")
+ logger.info(f"Added {[file for file in git_add_result]} to the staging area.")
# self.git_commit(f"Update documentation for {file_handler.file_path}") # 提交变更
@@ -376,7 +339,7 @@ def add_new_item(self, file_handler, json_data):
# 将新的项写入json文件
with open(self.project_manager.project_hierarchy, "w", encoding="utf-8") as f:
json.dump(json_data, f, indent=4, ensure_ascii=False)
- logger.info(f"已将新增文件 {file_handler.file_path} 的结构信息写入json文件。")
+ logger.info(f"The structural information of the newly added file {file_handler.file_path} has been written into a JSON file.")
# 将变更部分的json文件内容转换成markdown内容
markdown = file_handler.convert_to_markdown_file(
file_path=file_handler.file_path
@@ -385,7 +348,7 @@ def add_new_item(self, file_handler, json_data):
file_handler.write_file(
os.path.join(
self.project_manager.repo_path,
- CONFIG["Markdown_Docs_folder"],
+ setting.project.markdown_docs_name,
file_handler.file_path.replace(".py", ".md"),
),
markdown,
@@ -443,7 +406,7 @@ def process_file_changes(self, repo_path, file_path, is_new_file):
# 将markdown内容写入.md文件
file_handler.write_file(
os.path.join(
- CONFIG["Markdown_Docs_folder"],
+ setting.project.markdown_docs_name,
file_handler.file_path.replace(".py", ".md"),
),
markdown,
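Both `first_generate` and `run` now freeze the configured ignore list into the task-availability check with `functools.partial`. A minimal sketch of that pattern, with a hypothetical simplified `need_to_generate` (the real one takes a `DocItem`):

```python
from functools import partial

def need_to_generate(item_path, ignore_list):
    # Hypothetical stand-in: skip anything under an ignored prefix.
    return not any(item_path.startswith(prefix) for prefix in ignore_list)

# Bind the configured ignore list once; the task manager then calls a
# one-argument predicate, as in Runner.first_generate / Runner.run.
check_task_available_func = partial(need_to_generate, ignore_list=["tests"])

print(check_task_available_func("repo_agent/runner.py"))   # True
print(check_task_available_func("tests/test_runner.py"))   # False
```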
diff --git a/repo_agent/settings.py b/repo_agent/settings.py
new file mode 100644
index 0000000..ab363b1
--- /dev/null
+++ b/repo_agent/settings.py
@@ -0,0 +1,105 @@
+from enum import StrEnum
+
+from iso639 import Language, LanguageNotFoundError
+from pydantic import (
+ DirectoryPath,
+ Field,
+ HttpUrl,
+ PositiveFloat,
+ PositiveInt,
+ SecretStr,
+ field_serializer,
+ field_validator,
+)
+from pydantic_settings import BaseSettings
+
+from repo_agent.config_manager import read_config, write_config
+
+
+class LogLevel(StrEnum):
+ DEBUG = "DEBUG"
+ INFO = "INFO"
+ WARNING = "WARNING"
+ ERROR = "ERROR"
+ CRITICAL = "CRITICAL"
+
+
+class ProjectSettings(BaseSettings):
+ target_repo: DirectoryPath = "" # type: ignore
+ hierarchy_name: str = ".project_doc_record"
+ markdown_docs_name: str = "markdown_docs"
+ ignore_list: list[str] = []
+ language: str = "Chinese"
+ max_thread_count: PositiveInt = 4
+ max_document_tokens: PositiveInt = 1024
+ log_level: LogLevel = LogLevel.INFO
+
+ @field_serializer("ignore_list")
+ def serialize_ignore_list(self, ignore_list: list[str] = []):
+ if ignore_list == [""]:
+            self.ignore_list = []  # Treat [""] (an empty answer) as an empty ignore list
+ return []
+ return ignore_list
+
+ @field_validator("language")
+ @classmethod
+ def validate_language_code(cls, v: str) -> str:
+ try:
+ language_name = Language.match(v).name
+ return language_name # Returning the resolved language name
+ except LanguageNotFoundError:
+ raise ValueError(
+ "Invalid language input. Please enter a valid ISO 639 code or language name."
+ )
+
+ @field_validator("log_level", mode="before")
+ @classmethod
+ def set_log_level(cls, v: str) -> LogLevel:
+ if isinstance(v, str):
+            v = v.upper()  # Convert the input to uppercase
+        if v in LogLevel._value2member_map_:  # Check whether the normalized value is a valid enum member
+ return LogLevel(v)
+ raise ValueError(f"Invalid log level: {v}")
+
+ @field_serializer("target_repo")
+ def serialize_target_repo(self, target_repo: DirectoryPath):
+ return str(target_repo)
+
+
+class ChatCompletionSettings(BaseSettings):
+ model: str = "gpt-3.5-turbo"
+ temperature: PositiveFloat = 0.2
+ request_timeout: PositiveFloat = 60.0
+ base_url: HttpUrl = "https://api.openai.com/v1" # type: ignore
+ openai_api_key: SecretStr = Field(..., exclude=True)
+
+ @field_serializer("base_url")
+ def serialize_base_url(self, base_url: HttpUrl):
+ return str(base_url)
+
+
+class Setting(BaseSettings):
+ project: ProjectSettings = {} # type: ignore
+ chat_completion: ChatCompletionSettings = {} # type: ignore
+
+
+_config_data = read_config()
+setting = Setting.model_validate(_config_data)
+
+if _config_data == {}:
+ write_config(setting.model_dump())
+
+# NOTE Each model's token limit has been reduced by 1024 tokens to reserve space for the output, plus 1 for boundary conditions.
+max_input_tokens_map = {
+    "gpt-3.5-turbo": 4096,  # NOTE OpenAI said the gpt-3.5-turbo alias would be automatically upgraded from gpt-3.5-turbo-0613 to gpt-3.5-turbo-0125 on February 16th, but as of 2/20 it still has a 4,096-token context window.
+ "gpt-3.5-turbo-0613": 4096, # NOTE Will be deprecated on June 13, 2024.
+ "gpt-3.5-turbo-16k": 16384, # NOTE Will be deprecated on June 13, 2024.
+ "gpt-3.5-turbo-16k-0613": 16384, # NOTE Will be deprecated on June 13, 2024.
+ "gpt-3.5-turbo-0125": 16384,
+ "gpt-4": 8192,
+ "gpt-4-0613": 8192,
+    "gpt-4-32k": 32768,  # NOTE This model was never widely rolled out, in favor of GPT-4 Turbo.
+ "gpt-4-1106": 131072,
+ "gpt-4-0125-preview": 131072,
+ "gpt-4-turbo-preview": 131072,
+}
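The `set_log_level` validator in the new `settings.py` accepts case-insensitive input and normalizes it before the enum lookup. A stdlib-only sketch of the same check (using `str, Enum` instead of `StrEnum`, which requires Python 3.11+):

```python
from enum import Enum

class LogLevel(str, Enum):
    DEBUG = "DEBUG"
    INFO = "INFO"
    WARNING = "WARNING"
    ERROR = "ERROR"
    CRITICAL = "CRITICAL"

def set_log_level(v: str) -> LogLevel:
    # Mirror the pydantic validator: uppercase the input, then test membership.
    v = v.upper()
    if v in LogLevel._value2member_map_:
        return LogLevel(v)
    raise ValueError(f"Invalid log level: {v}")

print(set_log_level("info").value)  # INFO
```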
diff --git a/repo_agent/utils/meta_info_utils.py b/repo_agent/utils/meta_info_utils.py
index 2d3cd52..9d36e95 100644
--- a/repo_agent/utils/meta_info_utils.py
+++ b/repo_agent/utils/meta_info_utils.py
@@ -1,11 +1,11 @@
+import itertools
import os
+
import git
from colorama import Fore, Style
-import itertools
-from typing import List
-from repo_agent.config import CONFIG
from repo_agent.log import logger
+from repo_agent.settings import setting
latest_verison_substring = "_latest_version.py"
@@ -19,7 +19,7 @@ def make_fake_files():
"""
delete_fake_files()
- repo = git.Repo(CONFIG["repo_path"])
+ repo = git.Repo(setting.project.target_repo)
unstaged_changes = repo.index.diff(None) #在git status里,但是有修改没提交
untracked_files = repo.untracked_files #在文件系统里,但没在git里的文件
@@ -44,15 +44,15 @@ def make_fake_files():
if now_file_path.endswith(".py"):
raw_file_content = diff_file.a_blob.data_stream.read().decode("utf-8")
latest_file_path = now_file_path[:-3] + latest_verison_substring
- if os.path.exists(os.path.join(CONFIG["repo_path"],now_file_path)):
- os.rename(os.path.join(CONFIG["repo_path"],now_file_path), os.path.join(CONFIG["repo_path"], latest_file_path))
+            if os.path.exists(os.path.join(setting.project.target_repo, now_file_path)):
+                os.rename(os.path.join(setting.project.target_repo, now_file_path), os.path.join(setting.project.target_repo, latest_file_path))
print(f"{Fore.LIGHTMAGENTA_EX}[Save Latest Version of Code]: {Style.RESET_ALL}{now_file_path} -> {latest_file_path}")
else:
print(f"{Fore.LIGHTMAGENTA_EX}[Create Temp-File for Deleted(But not Staged) Files]: {Style.RESET_ALL}{now_file_path} -> {latest_file_path}")
- with open(os.path.join(CONFIG["repo_path"],latest_file_path), "w") as writer:
+                with open(os.path.join(setting.project.target_repo, latest_file_path), "w") as writer:
pass
- with open(os.path.join(CONFIG["repo_path"],now_file_path), "w") as writer:
+            with open(os.path.join(setting.project.target_repo, now_file_path), "w") as writer:
writer.write(raw_file_content)
file_path_reflections[now_file_path] = latest_file_path #real指向fake
return file_path_reflections, jump_files
@@ -72,10 +72,10 @@ def gci(filepath):
origin_name = fi_d.replace(latest_verison_substring, ".py")
os.remove(origin_name)
if os.path.getsize(fi_d) == 0:
- print(f"{Fore.LIGHTRED_EX}[Deleting Temp File]: {Style.RESET_ALL}{fi_d[len(CONFIG['repo_path']):]}, {origin_name[len(CONFIG['repo_path']):]}")
+            print(f"{Fore.LIGHTRED_EX}[Deleting Temp File]: {Style.RESET_ALL}{fi_d[len(str(setting.project.target_repo)):]}, {origin_name[len(str(setting.project.target_repo)):]}")
os.remove(fi_d)
else:
- print(f"{Fore.LIGHTRED_EX}[Recovering Latest Version]: {Style.RESET_ALL}{origin_name[len(CONFIG['repo_path']):]} <- {fi_d[len(CONFIG['repo_path']):]}")
+            print(f"{Fore.LIGHTRED_EX}[Recovering Latest Version]: {Style.RESET_ALL}{origin_name[len(str(setting.project.target_repo)):]} <- {fi_d[len(str(setting.project.target_repo)):]}")
os.rename(fi_d, origin_name)
- gci(CONFIG["repo_path"])
\ No newline at end of file
+ gci(setting.project.target_repo)
\ No newline at end of file
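`make_fake_files` parks the working-tree version of a modified file under a `_latest_version.py` name and rewrites the real path with the committed blob; `delete_fake_files` later restores it. A tempdir sketch of that rename round-trip (filenames hypothetical; the misspelled `latest_verison_substring` identifier is kept as in the source):

```python
import os
import tempfile

latest_verison_substring = "_latest_version.py"  # identifier spelling kept as in the source

with tempfile.TemporaryDirectory() as repo:
    src = os.path.join(repo, "example.py")
    # The working tree holds the latest (possibly uncommitted) version.
    with open(src, "w") as f:
        f.write("latest working version\n")

    # make_fake_files: park the working-tree version under the fake name,
    # then rewrite the real path with the committed blob's content.
    fake = src[:-3] + latest_verison_substring
    os.rename(src, fake)
    with open(src, "w") as f:
        f.write("committed blob content\n")

    # delete_fake_files: drop the temp copy and recover the latest version.
    os.remove(src)
    os.rename(fake, src)
    with open(src) as f:
        recovered = f.read().strip()

print(recovered)  # latest working version
```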
diff --git a/requirements.txt b/requirements.txt
index fb82023..855a0b1 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -103,12 +103,12 @@ bcrypt==4.1.2 \
--hash=sha256:eb3bd3321517916696233b5e0c67fd7d6281f0ef48e66812db35fc963a422a1c \
--hash=sha256:f70d9c61f9c4ca7d57f3bfe88a5ccf62546ffbadf3681bb1e268d9d2e41c91a7 \
--hash=sha256:fbe188b878313d01b7718390f31528be4010fed1faa798c5a1d0469c9c48c369
-build==1.0.3 \
- --hash=sha256:538aab1b64f9828977f84bc63ae570b060a8ed1be419e7870b8b4fc5e6ea553b \
- --hash=sha256:589bf99a67df7c9cf07ec0ac0e5e2ea5d4b37ac63301c4986d1acb126aa83f8f
-cachetools==5.3.2 \
- --hash=sha256:086ee420196f7b2ab9ca2db2520aca326318b68fe5ba8bc4d49cca91add450f2 \
- --hash=sha256:861f35a13a451f94e301ce2bec7cac63e881232ccce7ed67fab9b5df4d3beaa1
+build==1.1.1 \
+ --hash=sha256:8ed0851ee76e6e38adce47e4bee3b51c771d86c64cf578d0c2245567ee200e73 \
+ --hash=sha256:8eea65bb45b1aac2e734ba2cc8dad3a6d97d97901a395bd0ed3e7b46953d2a31
+cachetools==5.3.3 \
+ --hash=sha256:0abad1021d3f8325b2fc1d2e9c8b9c9d57b04c3932657a72465447332c24d945 \
+ --hash=sha256:ba29e2dfa0b8b556606f097407ed1aa62080ee108ab0dc5ec9d6a723a007d105
certifi==2024.2.2 \
--hash=sha256:0569859f95fc761b18b45ef421b1290a0f65f147e92a1e5eb3e635f9a5e4e66f \
--hash=sha256:dc383c07b76109f368f6106eee2b593b04a011ea4d55f652c6ca24a754d1cdd1
@@ -172,9 +172,9 @@ chroma-hnswlib==0.7.3 \
--hash=sha256:b7dca27b8896b494456db0fd705b689ac6b73af78e186eb6a42fea2de4f71c6f \
--hash=sha256:d71a3f4f232f537b6152947006bd32bc1629a8686df22fd97777b70f416c127a \
--hash=sha256:f96f4d5699e486eb1fb95849fe35ab79ab0901265805be7e60f4eaa83ce263ec
-chromadb==0.4.23 \
- --hash=sha256:3d3c2ffb4ff560721e3daf8c1a3729fd149c551525b6f75543eddb81a4f29e16 \
- --hash=sha256:54d9a770640704c6cedc15317faab9fd45beb9833e7484c00037e7a8801a349f
+chromadb==0.4.24 \
+ --hash=sha256:3a08e237a4ad28b5d176685bd22429a03717fe09d35022fb230d516108da01da \
+ --hash=sha256:a5c80b4e4ad9b236ed2d4899a5b9e8002b489293f2881cb2cadab5b199ee1c72
click==8.1.7 \
--hash=sha256:ae74fb96c20a0277a1d615f1e4d73c8414f5a98db8b799a7931d1582f3390c28 \
--hash=sha256:ca9853ad459e787e2192211578cc907e7594e294c7ccc834310722b41b9ca6de
@@ -237,17 +237,17 @@ distro==1.9.0 \
exceptiongroup==1.2.0; python_version < "3.11" \
--hash=sha256:4bfd3996ac73b41e9b9628b04e079f193850720ea5945fc96a08633c66912f14 \
--hash=sha256:91f5c769735f051a4290d52edd0858999b57e5876e9f85937691bd4c9fa3ed68
-fastapi==0.109.2 \
- --hash=sha256:2c9bab24667293b501cad8dd388c05240c850b58ec5876ee3283c47d6e1e3a4d \
- --hash=sha256:f3817eac96fe4f65a2ebb4baa000f394e55f5fccdaf7f75250804bc58f354f73
+fastapi==0.110.0 \
+ --hash=sha256:266775f0dcc95af9d3ef39bad55cff525329a931d5fd51930aadd4f428bf7ff3 \
+ --hash=sha256:87a1f6fb632a218222c5984be540055346a8f5d8a68e8f6fb647b1dc9934de4b
ffmpy==0.3.2 \
--hash=sha256:475ebfff1044661b8d969349dbcd2db9bf56d3ee78c0627e324769b49a27a78f
filelock==3.13.1 \
--hash=sha256:521f5f56c50f8426f5e03ad3b281b490a87ef15bc6c526f168290f0c7148d44e \
--hash=sha256:57dbda9b35157b05fb3e58ee91448612eb674172fab98ee235ccb0b5bee19a1c
-flatbuffers==23.5.26 \
- --hash=sha256:9ea1144cac05ce5d86e2859f431c6cd5e66cd9c78c558317c7955fb8d4c78d89 \
- --hash=sha256:c0ff356da363087b915fde4b8b45bdda73432fc17cddb3c8157472eab1422ad1
+flatbuffers==24.3.7 \
+ --hash=sha256:0895c22b9a6019ff2f4de2e5e2f7cd15914043e6e7033a94c0c6369422690f22 \
+ --hash=sha256:80c4f5dcad0ee76b7e349671a0d657f2fbba927a0244f88dd3f5ed6a3694e1fc
fonttools==4.49.0 \
--hash=sha256:07bc5ea02bb7bc3aa40a1eb0481ce20e8d9b9642a9536cde0218290dd6085828 \
--hash=sha256:0ba0e00620ca28d4ca11fc700806fd69144b463aa3275e1b36e56c7c09915559 \
@@ -338,12 +338,12 @@ google-auth==2.28.1 \
googleapis-common-protos==1.62.0 \
--hash=sha256:4750113612205514f9f6aa4cb00d523a94f3e8c06c5ad2fee466387dc4875f07 \
--hash=sha256:83f0ece9f94e5672cced82f592d2a5edf527a96ed1794f0bab36d5735c996277
-gradio==4.19.2 \
- --hash=sha256:6fe5815bb4dfaeed1fc74223bffd91da70a1b463158af8c5e03d01bb09068a1d \
- --hash=sha256:acab4a35f556dbc3ae637469312738d154bcb73f0b8d5f4f65e4d067ecb1e0b1
-gradio-client==0.10.1 \
- --hash=sha256:879eb56fae5d6b1603bb9375b88d1de0d034f3dac4b3afc8dbc66f36f6e54d5d \
- --hash=sha256:a0413fffdde3360e0f6aaec8b8c23d8a320049a571de2d111d85ebd295002165
+gradio==4.20.1 \
+ --hash=sha256:01815047593f8d653d609ab2d9b89a7d435fd20438602de19278f8cd6acd1fb9 \
+ --hash=sha256:9efb88a4e1626f4f452467e45f44c37dbd8b57f7839f71326ec169e40de5cdab
+gradio-client==0.11.0 \
+ --hash=sha256:18ae178f83df4a1ff53247457f6ee13c4d0dfc3b7a8bff6d9da677600f31b04e \
+ --hash=sha256:638a43ccc73937790a3c68adb732da03ee7077732a0b6540a9196bacbb1f6539
greenlet==3.0.3 \
--hash=sha256:149e94a2dd82d19838fe4b2259f1b6b9957d5ba1b25640d2380bea9c5df37676 \
--hash=sha256:15d79dd26056573940fcb8c7413d84118086f2ec1a8acdfa854631084393efcc \
@@ -434,9 +434,9 @@ httptools==0.6.1 \
httpx==0.27.0 \
--hash=sha256:71d5465162c13681bff01ad59b2cc68dd838ea1f10e51574bac27103f00c91a5 \
--hash=sha256:a0cb88a46f32dc874e04ee956e4c2764aba2aa228f650b06788ba6bda2962ab5
-huggingface-hub==0.20.3 \
- --hash=sha256:94e7f8e074475fbc67d6a71957b678e1b4a74ff1b64a644fd6cbb83da962d05d \
- --hash=sha256:d988ae4f00d3e307b0c80c6a05ca6dbb7edba8bba3079f74cda7d9c2e562a7b6
+huggingface-hub==0.21.4 \
+ --hash=sha256:df37c2c37fc6c82163cdd8a67ede261687d80d1e262526d6c0ce73b6b3630a7b \
+ --hash=sha256:e1f4968c93726565a80edf6dc309763c7b546d0cfe79aa221206034d50155531
humanfriendly==10.0 \
--hash=sha256:1697e1a8a8f550fd43c2865cd84542fc175a61dcb779b6fee18cf6b6ccba1477 \
--hash=sha256:6b0b831ce8f15f7300721aa49829fc4e83921a9a301cc7f606be6686a2288ddc
@@ -446,9 +446,9 @@ idna==3.6 \
importlib-metadata==6.11.0 \
--hash=sha256:1231cf92d825c9e03cfc4da076a16de6422c863558229ea0b22b675657463443 \
--hash=sha256:f0afba6205ad8f8947c7d338b5342d5db2afbfd82f9cbef7879a9539cc12eb9b
-importlib-resources==6.1.1 \
- --hash=sha256:3893a00122eafde6894c59914446a512f728a0c1a45f9bb9b63721b6bacf0b4a \
- --hash=sha256:e8bf90d8213b486f428c9c39714b920041cb02c184686a3dee24905aaa8105d6
+importlib-resources==6.1.3 \
+ --hash=sha256:4c0269e3580fe2634d364b39b38b961540a7738c02cb984e98add8b4221d793d \
+ --hash=sha256:56fb4525197b78544a3354ea27793952ab93f935bb4bf746b846bb1015020f2b
iniconfig==2.0.0 \
--hash=sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3 \
--hash=sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374
@@ -576,9 +576,9 @@ markupsafe==2.1.5 \
--hash=sha256:ea3d8a3d18833cf4304cd2fc9cbb1efe188ca9b5efef2bdac7adc20594a0e46b \
--hash=sha256:f5dfb42c4604dddc8e4305050aa6deb084540643ed5804d7455b5df8fe16f5e5 \
--hash=sha256:fce659a462a1be54d2ffcacea5e3ba2d74daa74f30f5f143fe0c58636e355fdd
-marshmallow==3.20.2 \
- --hash=sha256:4c1daff273513dc5eb24b219a8035559dc573c8f322558ef85f5438ddd1236dd \
- --hash=sha256:c21d4b98fee747c130e6bc8f45c4b3199ea66bc00c12ee1f639f0aeca034d5e9
+marshmallow==3.21.1 \
+ --hash=sha256:4e65e9e0d80fc9e609574b9983cf32579f305c718afb30d7233ab818571768c3 \
+ --hash=sha256:f085493f79efb0644f270a9bf2892843142d80d7174bbbd2f3713f2a589dc633
matplotlib==3.8.3 \
--hash=sha256:04b36ad07eac9740fc76c2aa16edf94e50b297d6eb4c081e3add863de4bb19a7 \
--hash=sha256:09074f8057917d17ab52c242fdf4916f30e99959c1908958b1fc6032e2d0f6d4 \
@@ -753,115 +753,115 @@ numpy==1.26.4 \
oauthlib==3.2.2 \
--hash=sha256:8139f29aac13e25d502680e9e19963e83f16838d48a0d71c287fe40e7067fbca \
--hash=sha256:9859c40929662bec5d64f34d01c99e093149682a3f38915dc0655d5a633dd918
-onnxruntime==1.17.0 \
- --hash=sha256:16d26badd092c8c257fa57c458bb600d96dc15282c647ccad0ed7b2732e6c03b \
- --hash=sha256:4b038324586bc905299e435f7c00007e6242389c856b82fe9357fdc3b1ef2bdc \
- --hash=sha256:6f1273bebcdb47ed932d076c85eb9488bc4768fcea16d5f2747ca692fad4f9d3 \
- --hash=sha256:7466724e809a40e986b1637cba156ad9fc0d1952468bc00f79ef340bc0199552 \
- --hash=sha256:90c0890e36f880281c6c698d9bc3de2afbeee2f76512725ec043665c25c67d21 \
- --hash=sha256:93d39b3fa1ee01f034f098e1c7769a811a21365b4883f05f96c14a2b60c6028b \
- --hash=sha256:ac2f286da3494b29b4186ca193c7d4e6a2c1f770c4184c7192c5da142c3dec28 \
- --hash=sha256:b4c87d83c6f58d1af2675fc99e3dc810f2dbdb844bcefd0c1b7573632661f6fc \
- --hash=sha256:bb1bf1ee575c665b8bbc3813ab906e091a645a24ccc210be7932154b8260eca1 \
- --hash=sha256:cb60fd3c2c1acd684752eb9680e89ae223e9801a9b0e0dc7b28adabe45a2e380 \
- --hash=sha256:d2b22a25a94109cc983443116da8d9805ced0256eb215c5e6bc6dcbabefeab96 \
- --hash=sha256:d47bee7557a8b99c8681b6882657a515a4199778d6d5e24e924d2aafcef55b0a \
- --hash=sha256:dba55723bf9b835e358f48c98a814b41692c393eb11f51e02ece0625c756b797 \
- --hash=sha256:ee48422349cc500273beea7607e33c2237909f58468ae1d6cccfc4aecd158565 \
- --hash=sha256:f34cc46553359293854e38bdae2ab1be59543aad78a6317e7746d30e311110c3
-openai==1.12.0 \
- --hash=sha256:99c5d257d09ea6533d689d1cc77caa0ac679fa21efef8893d8b0832a86877f1b \
- --hash=sha256:a54002c814e05222e413664f651b5916714e4700d041d5cf5724d3ae1a3e3481
-opentelemetry-api==1.22.0 \
- --hash=sha256:15ae4ca925ecf9cfdfb7a709250846fbb08072260fca08ade78056c502b86bed \
- --hash=sha256:43621514301a7e9f5d06dd8013a1b450f30c2e9372b8e30aaeb4562abf2ce034
-opentelemetry-exporter-otlp-proto-common==1.22.0 \
- --hash=sha256:3f2538bec5312587f8676c332b3747f54c89fe6364803a807e217af4603201fa \
- --hash=sha256:71ae2f81bc6d6fe408d06388826edc8933759b2ca3a97d24054507dc7cfce52d
-opentelemetry-exporter-otlp-proto-grpc==1.22.0 \
- --hash=sha256:1e0e5aa4bbabc74942f06f268deffd94851d12a8dc30b02527472ef1729fe5b1 \
- --hash=sha256:b5bcadc129272004316a455e9081216d3380c1fc2231a928ea6a70aa90e173fb
-opentelemetry-instrumentation==0.43b0 \
- --hash=sha256:0ff1334d7e359e27640e9d420024efeb73eacae464309c2e14ede7ba6c93967e \
- --hash=sha256:c3755da6c4be8033be0216d0501e11f4832690f4e2eca5a3576fbf113498f0f6
-opentelemetry-instrumentation-asgi==0.43b0 \
- --hash=sha256:1f593829fa039e9367820736fb063e92acd15c25b53d7bcb5d319971b8e93fd7 \
- --hash=sha256:3f6f19333dca31ef696672e4e36cb1c2613c71dc7e847c11ff36a37e1130dadc
-opentelemetry-instrumentation-fastapi==0.43b0 \
- --hash=sha256:2afaaf470622e1a2732182c68f6d2431ffe5e026a7edacd0f83605632b66347f \
- --hash=sha256:b79c044df68a52e07b35fa12a424e7cc0dd27ff0a171c5fdcc41dea9de8fc938
-opentelemetry-proto==1.22.0 \
- --hash=sha256:9ec29169286029f17ca34ec1f3455802ffb90131642d2f545ece9a63e8f69003 \
- --hash=sha256:ce7188d22c75b6d0fe53e7fb58501613d0feade5139538e79dedd9420610fa0c
-opentelemetry-sdk==1.22.0 \
- --hash=sha256:45267ac1f38a431fc2eb5d6e0c0d83afc0b78de57ac345488aa58c28c17991d0 \
- --hash=sha256:a730555713d7c8931657612a88a141e3a4fe6eb5523d9e2d5a8b1e673d76efa6
-opentelemetry-semantic-conventions==0.43b0 \
- --hash=sha256:291284d7c1bf15fdaddf309b3bd6d3b7ce12a253cec6d27144439819a15d8445 \
- --hash=sha256:b9576fb890df479626fa624e88dde42d3d60b8b6c8ae1152ad157a8b97358635
-opentelemetry-util-http==0.43b0 \
- --hash=sha256:3ff6ab361dbe99fc81200d625603c0fb890c055c6e416a3e6d661ddf47a6c7f7 \
- --hash=sha256:f25a820784b030f6cb86b3d76e5676c769b75ed3f55a210bcdae0a5e175ebadb
-orjson==3.9.14 \
- --hash=sha256:06fb40f8e49088ecaa02f1162581d39e2cf3fd9dbbfe411eb2284147c99bad79 \
- --hash=sha256:08e722a8d06b13b67a51f247a24938d1a94b4b3862e40e0eef3b2e98c99cd04c \
- --hash=sha256:135d518f73787ce323b1a5e21fb854fe22258d7a8ae562b81a49d6c7f826f2a3 \
- --hash=sha256:19cdea0664aec0b7f385be84986d4defd3334e9c3c799407686ee1c26f7b8251 \
- --hash=sha256:1f7b6f3ef10ae8e3558abb729873d033dbb5843507c66b1c0767e32502ba96bb \
- --hash=sha256:20837e10835c98973673406d6798e10f821e7744520633811a5a3d809762d8cc \
- --hash=sha256:26280a7fcb62d8257f634c16acebc3bec626454f9ab13558bbf7883b9140760e \
- --hash=sha256:2eefc41ba42e75ed88bc396d8fe997beb20477f3e7efa000cd7a47eda452fbb2 \
- --hash=sha256:449bf090b2aa4e019371d7511a6ea8a5a248139205c27d1834bb4b1e3c44d936 \
- --hash=sha256:4dc1c132259b38d12c6587d190cd09cd76e3b5273ce71fe1372437b4cbc65f6f \
- --hash=sha256:58b36f54da759602d8e2f7dad958752d453dfe2c7122767bc7f765e17dc59959 \
- --hash=sha256:6f39a10408478f4c05736a74da63727a1ae0e83e3533d07b19443400fe8591ca \
- --hash=sha256:751250a31fef2bac05a2da2449aae7142075ea26139271f169af60456d8ad27a \
- --hash=sha256:793f6c9448ab6eb7d4974b4dde3f230345c08ca6c7995330fbceeb43a5c8aa5e \
- --hash=sha256:90903d2908158a2c9077a06f11e27545de610af690fb178fd3ba6b32492d4d1c \
- --hash=sha256:917311d6a64d1c327c0dfda1e41f3966a7fb72b11ca7aa2e7a68fcccc7db35d9 \
- --hash=sha256:95c03137b0cf66517c8baa65770507a756d3a89489d8ecf864ea92348e1beabe \
- --hash=sha256:9a1af21160a38ee8be3f4fcf24ee4b99e6184cadc7f915d599f073f478a94d2c \
- --hash=sha256:a2591faa0c031cf3f57e5bce1461cfbd6160f3f66b5a72609a130924917cb07d \
- --hash=sha256:a603161318ff699784943e71f53899983b7dee571b4dd07c336437c9c5a272b0 \
- --hash=sha256:a6bc7928d161840096adc956703494b5c0193ede887346f028216cac0af87500 \
- --hash=sha256:abcda41ecdc950399c05eff761c3de91485d9a70d8227cb599ad3a66afe93bcc \
- --hash=sha256:b7c11667421df2d8b18b021223505dcc3ee51be518d54e4dc49161ac88ac2b87 \
- --hash=sha256:c19009ff37f033c70acd04b636380379499dac2cba27ae7dfc24f304deabbc81 \
- --hash=sha256:ce6f095eef0026eae76fc212f20f786011ecf482fc7df2f4c272a8ae6dd7b1ef \
- --hash=sha256:d2cf1d0557c61c75e18cf7d69fb689b77896e95553e212c0cc64cf2087944b84 \
- --hash=sha256:d450a8e0656efb5d0fcb062157b918ab02dcca73278975b4ee9ea49e2fcf5bd5 \
- --hash=sha256:df76ecd17b1b3627bddfd689faaf206380a1a38cc9f6c4075bd884eaedcf46c2 \
- --hash=sha256:e2450d87dd7b4f277f4c5598faa8b49a0c197b91186c47a2c0b88e15531e4e3e \
- --hash=sha256:ea890e6dc1711aeec0a33b8520e395c2f3d59ead5b4351a788e06bf95fc7ba81
+onnxruntime==1.17.1 \
+ --hash=sha256:2dff1a24354220ac30e4a4ce2fb1df38cb1ea59f7dac2c116238d63fe7f4c5ff \
+ --hash=sha256:36fd6f87a1ecad87e9c652e42407a50fb305374f9a31d71293eb231caae18784 \
+ --hash=sha256:40f08e378e0f85929712a2b2c9b9a9cc400a90c8a8ca741d1d92c00abec60843 \
+ --hash=sha256:53e4e06c0a541696ebdf96085fd9390304b7b04b748a19e02cf3b35c869a1e76 \
+ --hash=sha256:55b5e92a4c76a23981c998078b9bf6145e4fb0b016321a8274b1607bd3c6bd35 \
+ --hash=sha256:5e3716b5eec9092e29a8d17aab55e737480487deabfca7eac3cd3ed952b6ada9 \
+ --hash=sha256:606a7cbfb6680202b0e4f1890881041ffc3ac6e41760a25763bd9fe146f0b335 \
+ --hash=sha256:6226a5201ab8cafb15e12e72ff2a4fc8f50654e8fa5737c6f0bd57c5ff66827e \
+ --hash=sha256:99a8bddeb538edabc524d468edb60ad4722cff8a49d66f4e280c39eace70500b \
+ --hash=sha256:ac79da6d3e1bb4590f1dad4bb3c2979d7228555f92bb39820889af8b8e6bd472 \
+ --hash=sha256:ae9ba47dc099004e3781f2d0814ad710a13c868c739ab086fc697524061695ea \
+ --hash=sha256:d43ac17ac4fa3c9096ad3c0e5255bb41fd134560212dc124e7f52c3159af5d21 \
+ --hash=sha256:ebbcd2bc3a066cf54e6f18c75708eb4d309ef42be54606d22e5bdd78afc5b0d7 \
+ --hash=sha256:fbb98cced6782ae1bb799cc74ddcbbeeae8819f3ad1d942a74d88e72b6511337 \
+ --hash=sha256:fd7fddb4311deb5a7d3390cd8e9b3912d4d963efbe4dfe075edbaf18d01c024e
+openai==1.13.3 \
+ --hash=sha256:5769b62abd02f350a8dd1a3a242d8972c947860654466171d60fb0972ae0a41c \
+ --hash=sha256:ff6c6b3bc7327e715e4b3592a923a5a1c7519ff5dd764a83d69f633d49e77a7b
+opentelemetry-api==1.23.0 \
+ --hash=sha256:14a766548c8dd2eb4dfc349739eb4c3893712a0daa996e5dbf945f9da665da9d \
+ --hash=sha256:cc03ea4025353048aadb9c64919099663664672ea1c6be6ddd8fee8e4cd5e774
+opentelemetry-exporter-otlp-proto-common==1.23.0 \
+ --hash=sha256:2a9e7e9d5a8b026b572684b6b24dcdefcaa58613d5ce3d644130b0c373c056c1 \
+ --hash=sha256:35e4ea909e7a0b24235bd0aaf17fba49676527feb1823b46565ff246d5a1ab18
+opentelemetry-exporter-otlp-proto-grpc==1.23.0 \
+ --hash=sha256:40f9e3e7761eb34f2a1001f4543028783ac26e2db27e420d5374f2cca0182dad \
+ --hash=sha256:aa1a012eea5342bfef51fcf3f7f22601dcb0f0984a07ffe6025b2fbb6d91a2a9
+opentelemetry-instrumentation==0.44b0 \
+ --hash=sha256:79560f386425176bcc60c59190064597096114c4a8e5154f1cb281bb4e47d2fc \
+ --hash=sha256:8213d02d8c0987b9b26386ae3e091e0477d6331673123df736479322e1a50b48
+opentelemetry-instrumentation-asgi==0.44b0 \
+ --hash=sha256:0d95c84a8991008c8a8ac35e15d43cc7768a5bb46f95f129e802ad2990d7c366 \
+ --hash=sha256:72d4d28ec7ccd551eac11edc5ae8cac3586c0a228467d6a95fad7b6d4edd597a
+opentelemetry-instrumentation-fastapi==0.44b0 \
+ --hash=sha256:4441482944bea6676816668d56deb94af990e8c6e9582c581047e5d84c91d3c9 \
+ --hash=sha256:67ed10b93ad9d35238ae0be73cf8acbbb65a4a61fb7444d0aee5b0c492e294db
+opentelemetry-proto==1.23.0 \
+ --hash=sha256:4c017deca052cb287a6003b7c989ed8b47af65baeb5d57ebf93dde0793f78509 \
+ --hash=sha256:e6aaf8b7ace8d021942d546161401b83eed90f9f2cc6f13275008cea730e4651
+opentelemetry-sdk==1.23.0 \
+ --hash=sha256:9ddf60195837b59e72fd2033d6a47e2b59a0f74f0ec37d89387d89e3da8cab7f \
+ --hash=sha256:a93c96990ac0f07c6d679e2f1015864ff7a4f5587122dd5af968034436efb1fd
+opentelemetry-semantic-conventions==0.44b0 \
+ --hash=sha256:2e997cb28cd4ca81a25a9a43365f593d0c2b76be0685015349a89abdf1aa4ffa \
+ --hash=sha256:7c434546c9cbd797ab980cc88bf9ff3f4a5a28f941117cad21694e43d5d92019
+opentelemetry-util-http==0.44b0 \
+ --hash=sha256:75896dffcbbeb5df5429ad4526e22307fc041a27114e0c5bfd90bb219381e68f \
+ --hash=sha256:ff018ab6a2fa349537ff21adcef99a294248b599be53843c44f367aef6bccea5
+orjson==3.9.15 \
+ --hash=sha256:10c57bc7b946cf2efa67ac55766e41764b66d40cbd9489041e637c1304400494 \
+ --hash=sha256:12365576039b1a5a47df01aadb353b68223da413e2e7f98c02403061aad34bde \
+ --hash=sha256:2973474811db7b35c30248d1129c64fd2bdf40d57d84beed2a9a379a6f57d0ab \
+ --hash=sha256:2c51378d4a8255b2e7c1e5cc430644f0939539deddfa77f6fac7b56a9784160a \
+ --hash=sha256:2d99e3c4c13a7b0fb3792cc04c2829c9db07838fb6973e578b85c1745e7d0ce7 \
+ --hash=sha256:2f256d03957075fcb5923410058982aea85455d035607486ccb847f095442bda \
+ --hash=sha256:4228aace81781cc9d05a3ec3a6d2673a1ad0d8725b4e915f1089803e9efd2b99 \
+ --hash=sha256:4feeb41882e8aa17634b589533baafdceb387e01e117b1ec65534ec724023d04 \
+ --hash=sha256:5bb399e1b49db120653a31463b4a7b27cf2fbfe60469546baf681d1b39f4edf2 \
+ --hash=sha256:62482873e0289cf7313461009bf62ac8b2e54bc6f00c6fabcde785709231a5d7 \
+ --hash=sha256:6ae4e06be04dc00618247c4ae3f7c3e561d5bc19ab6941427f6d3722a0875ef7 \
+ --hash=sha256:6f7b65bfaf69493c73423ce9db66cfe9138b2f9ef62897486417a8fcb0a92bfe \
+ --hash=sha256:71c6b009d431b3839d7c14c3af86788b3cfac41e969e3e1c22f8a6ea13139404 \
+ --hash=sha256:8055ec598605b0077e29652ccfe9372247474375e0e3f5775c91d9434e12d6b1 \
+ --hash=sha256:82425dd5c7bd3adfe4e94c78e27e2fa02971750c2b7ffba648b0f5d5cc016a73 \
+ --hash=sha256:87f1097acb569dde17f246faa268759a71a2cb8c96dd392cd25c668b104cad2f \
+ --hash=sha256:946c3a1ef25338e78107fba746f299f926db408d34553b4754e90a7de1d44068 \
+ --hash=sha256:95cae920959d772f30ab36d3b25f83bb0f3be671e986c72ce22f8fa700dae061 \
+ --hash=sha256:9fe41b6f72f52d3da4db524c8653e46243c8c92df826ab5ffaece2dba9cccd58 \
+ --hash=sha256:b3d336ed75d17c7b1af233a6561cf421dee41d9204aa3cfcc6c9c65cd5bb69a8 \
+ --hash=sha256:b66bcc5670e8a6b78f0313bcb74774c8291f6f8aeef10fe70e910b8040f3ab75 \
+ --hash=sha256:b725da33e6e58e4a5d27958568484aa766e825e93aa20c26c91168be58e08cbb \
+ --hash=sha256:b72758f3ffc36ca566ba98a8e7f4f373b6c17c646ff8ad9b21ad10c29186f00d \
+ --hash=sha256:bcef128f970bb63ecf9a65f7beafd9b55e3aaf0efc271a4154050fc15cdb386e \
+ --hash=sha256:c8e8fe01e435005d4421f183038fc70ca85d2c1e490f51fb972db92af6e047c2 \
+ --hash=sha256:d61f7ce4727a9fa7680cd6f3986b0e2c732639f46a5e0156e550e35258aa313a \
+ --hash=sha256:d6768a327ea1ba44c9114dba5fdda4a214bdb70129065cd0807eb5f010bfcbb5 \
+ --hash=sha256:e18668f1bd39e69b7fed19fa7cd1cd110a121ec25439328b5c89934e6d30d357 \
+ --hash=sha256:fbbeb3c9b2edb5fd044b2a070f127a0ac456ffd079cb82746fc84af01ef021a4 \
+ --hash=sha256:ff0f9913d82e1d1fadbd976424c316fbc4d9c525c81d047bbdd16bd27dd98cfc
overrides==7.7.0 \
--hash=sha256:55158fa3d93b98cc75299b1e67078ad9003ca27945c76162c1c0766d6f91820a \
--hash=sha256:c7ed9d062f78b8e4c1a7b70bd8796b35ead4d9f510227ef9c5dc7626c60d7e49
packaging==23.2 \
--hash=sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5 \
--hash=sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7
-pandas==2.2.0 \
- --hash=sha256:159205c99d7a5ce89ecfc37cb08ed179de7783737cea403b295b5eda8e9c56d1 \
- --hash=sha256:20404d2adefe92aed3b38da41d0847a143a09be982a31b85bc7dd565bdba0f4e \
- --hash=sha256:2707514a7bec41a4ab81f2ccce8b382961a29fbe9492eab1305bb075b2b1ff4f \
- --hash=sha256:30b83f7c3eb217fb4d1b494a57a2fda5444f17834f5df2de6b2ffff68dc3c8e2 \
- --hash=sha256:38e0b4fc3ddceb56ec8a287313bc22abe17ab0eb184069f08fc6a9352a769b18 \
- --hash=sha256:5a946f210383c7e6d16312d30b238fd508d80d927014f3b33fb5b15c2f895430 \
- --hash=sha256:736da9ad4033aeab51d067fc3bd69a0ba36f5a60f66a527b3d72e2030e63280a \
- --hash=sha256:761cb99b42a69005dec2b08854fb1d4888fdf7b05db23a8c5a099e4b886a2106 \
- --hash=sha256:7ea3ee3f125032bfcade3a4cf85131ed064b4f8dd23e5ce6fa16473e48ebcaf5 \
- --hash=sha256:8108ee1712bb4fa2c16981fba7e68b3f6ea330277f5ca34fa8d557e986a11670 \
- --hash=sha256:85793cbdc2d5bc32620dc8ffa715423f0c680dacacf55056ba13454a5be5de88 \
- --hash=sha256:8ce2fbc8d9bf303ce54a476116165220a1fedf15985b09656b4b4275300e920b \
- --hash=sha256:a146b9dcacc3123aa2b399df1a284de5f46287a4ab4fbfc237eac98a92ebcb71 \
- --hash=sha256:a1b438fa26b208005c997e78672f1aa8138f67002e833312e6230f3e57fa87d5 \
- --hash=sha256:a20628faaf444da122b2a64b1e5360cde100ee6283ae8effa0d8745153809a2e \
- --hash=sha256:a41d06f308a024981dcaa6c41f2f2be46a6b186b902c94c2674e8cb5c42985bc \
- --hash=sha256:a626795722d893ed6aacb64d2401d017ddc8a2341b49e0384ab9bf7112bdec30 \
- --hash=sha256:cfd6c2491dc821b10c716ad6776e7ab311f7df5d16038d0b7458bc0b67dc10f3 \
- --hash=sha256:eb1e1f3861ea9132b32f2133788f3b14911b68102d562715d71bd0013bc45440 \
- --hash=sha256:f5be5d03ea2073627e7111f61b9f1f0d9625dc3c4d8dda72cc827b0c58a1d042 \
- --hash=sha256:f9670b3ac00a387620489dfc1bca66db47a787f4e55911f1293063a78b108df1 \
- --hash=sha256:fbc1b53c0e1fdf16388c33c3cca160f798d38aea2978004dd3f4d3dec56454c9
+pandas==2.2.1 \
+ --hash=sha256:04f6ec3baec203c13e3f8b139fb0f9f86cd8c0b94603ae3ae8ce9a422e9f5bee \
+ --hash=sha256:06cf591dbaefb6da9de8472535b185cba556d0ce2e6ed28e21d919704fef1a9e \
+ --hash=sha256:0ab90f87093c13f3e8fa45b48ba9f39181046e8f3317d3aadb2fffbb1b978572 \
+ --hash=sha256:0f573ab277252ed9aaf38240f3b54cfc90fff8e5cab70411ee1d03f5d51f3944 \
+ --hash=sha256:101d0eb9c5361aa0146f500773395a03839a5e6ecde4d4b6ced88b7e5a1a6403 \
+ --hash=sha256:11940e9e3056576ac3244baef2fedade891977bcc1cb7e5cc8f8cc7d603edc89 \
+ --hash=sha256:4acf681325ee1c7f950d058b05a820441075b0dd9a2adf5c4835b9bc056bf4fb \
+ --hash=sha256:53680dc9b2519cbf609c62db3ed7c0b499077c7fefda564e330286e619ff0dd9 \
+ --hash=sha256:7d2ed41c319c9fb4fd454fe25372028dfa417aacb9790f68171b2e3f06eae8cd \
+ --hash=sha256:88ecb5c01bb9ca927ebc4098136038519aa5d66b44671861ffab754cae75102c \
+ --hash=sha256:8df8612be9cd1c7797c93e1c5df861b2ddda0b48b08f2c3eaa0702cf88fb5f88 \
+ --hash=sha256:94e714a1cca63e4f5939cdce5f29ba8d415d85166be3441165edd427dc9f6bc0 \
+ --hash=sha256:9d1265545f579edf3f8f0cb6f89f234f5e44ba725a34d86535b1a1d38decbccc \
+ --hash=sha256:a935a90a76c44fe170d01e90a3594beef9e9a6220021acfb26053d01426f7dc2 \
+ --hash=sha256:af5d3c00557d657c8773ef9ee702c61dd13b9d7426794c9dfeb1dc4a0bf0ebc7 \
+ --hash=sha256:c2ce852e1cf2509a69e98358e8458775f89599566ac3775e70419b98615f4b06 \
+ --hash=sha256:c38ce92cb22a4bea4e3929429aa1067a454dcc9c335799af93ba9be21b6beb51 \
+ --hash=sha256:c391f594aae2fd9f679d419e9a4d5ba4bce5bb13f6a989195656e7dc4b95c8f0 \
+ --hash=sha256:c70e00c2d894cb230e5c15e4b1e1e6b2b478e09cf27cc593a11ef955b9ecc81a \
+ --hash=sha256:e97fbb5387c69209f134893abc788a6486dbf2f9e511070ca05eed4b930b1b02 \
+ --hash=sha256:f02a3a6c83df4026e55b63c1f06476c9aa3ed6af3d89b4f04ea656ccdaaaa359 \
+ --hash=sha256:f821213d48f4ab353d20ebc24e4faf94ba40d76680642fb7ce2ea31a3ad94f9b
parso==0.8.3 \
--hash=sha256:8c07be290bb59f03588915921e29e8a50002acaf2cdc5fa0e0114f91709fafa0 \
--hash=sha256:c001d4636cd3aecdaf33cbb40aebb59b094be2a74c556778ef5576c175e19e75
@@ -916,9 +916,9 @@ pillow==10.2.0 \
pluggy==1.4.0 \
--hash=sha256:7db9f7b503d67d1c5b95f59773ebb58a8c1c288129a88665838012cfb07b8981 \
--hash=sha256:8c85c2876142a764e5b7548e7d9a0e0ddb46f5185161049a79b7e974454223be
-posthog==3.4.2 \
- --hash=sha256:c7e79b2e585d16e93749874bcbcdad78d857037398ce0d8d6c474a04d0bd3bbe \
- --hash=sha256:f0eafa663fbc4a942b49b6168a62a890635407044bbc7593051dcb9cc1208873
+posthog==3.5.0 \
+ --hash=sha256:3c672be7ba6f95d555ea207d4486c171d06657eb34b3ce25eb043bfe7b6b5b76 \
+ --hash=sha256:8f7e3b2c6e8714d0c0c542a2109b83a7549f63b7113a133ab2763a89245ef2ef
prettytable==3.10.0 \
--hash=sha256:6536efaf0757fdaa7d22e78b3aac3b69ea1b7200538c2c6995d649365bddab92 \
--hash=sha256:9665594d137fb08a1117518c25551e0ede1687197cf353a4fdc78d27e1073568
@@ -955,74 +955,77 @@ pyasn1==0.5.1 \
pyasn1-modules==0.3.0 \
--hash=sha256:5bd01446b736eb9d31512a30d46c1ac3395d676c6f3cafa4c03eb54b9925631c \
--hash=sha256:d3ccd6ed470d9ffbc716be08bd90efbd44d0734bc9303818f7336070984a162d
-pydantic==2.6.1 \
- --hash=sha256:0b6a909df3192245cb736509a92ff69e4fef76116feffec68e93a567347bae6f \
- --hash=sha256:4fd5c182a2488dc63e6d32737ff19937888001e2a6d86e94b3f233104a5d1fa9
-pydantic-core==2.16.2 \
- --hash=sha256:02906e7306cb8c5901a1feb61f9ab5e5c690dbbeaa04d84c1b9ae2a01ebe9379 \
- --hash=sha256:0ba503850d8b8dcc18391f10de896ae51d37fe5fe43dbfb6a35c5c5cad271a06 \
- --hash=sha256:16aa02e7a0f539098e215fc193c8926c897175d64c7926d00a36188917717a05 \
- --hash=sha256:22c5f022799f3cd6741e24f0443ead92ef42be93ffda0d29b2597208c94c3753 \
- --hash=sha256:2924b89b16420712e9bb8192396026a8fbd6d8726224f918353ac19c4c043d2a \
- --hash=sha256:308974fdf98046db28440eb3377abba274808bf66262e042c412eb2adf852731 \
- --hash=sha256:396fdf88b1b503c9c59c84a08b6833ec0c3b5ad1a83230252a9e17b7dfb4cffc \
- --hash=sha256:3ac426704840877a285d03a445e162eb258924f014e2f074e209d9b4ff7bf380 \
- --hash=sha256:3fab4e75b8c525a4776e7630b9ee48aea50107fea6ca9f593c98da3f4d11bf7c \
- --hash=sha256:406fac1d09edc613020ce9cf3f2ccf1a1b2f57ab00552b4c18e3d5276c67eb11 \
- --hash=sha256:40a0bd0bed96dae5712dab2aba7d334a6c67cbcac2ddfca7dbcc4a8176445990 \
- --hash=sha256:459c0d338cc55d099798618f714b21b7ece17eb1a87879f2da20a3ff4c7628e2 \
- --hash=sha256:459d6be6134ce3b38e0ef76f8a672924460c455d45f1ad8fdade36796df1ddc8 \
- --hash=sha256:46b0d5520dbcafea9a8645a8164658777686c5c524d381d983317d29687cce97 \
- --hash=sha256:47924039e785a04d4a4fa49455e51b4eb3422d6eaacfde9fc9abf8fdef164e8a \
- --hash=sha256:4bfcbde6e06c56b30668a0c872d75a7ef3025dc3c1823a13cf29a0e9b33f67e8 \
- --hash=sha256:4f9ee4febb249c591d07b2d4dd36ebcad0ccd128962aaa1801508320896575ef \
- --hash=sha256:5f60f920691a620b03082692c378661947d09415743e437a7478c309eb0e4f82 \
- --hash=sha256:60eb8ceaa40a41540b9acae6ae7c1f0a67d233c40dc4359c256ad2ad85bdf5e5 \
- --hash=sha256:6db58c22ac6c81aeac33912fb1af0e930bc9774166cdd56eade913d5f2fff35e \
- --hash=sha256:70651ff6e663428cea902dac297066d5c6e5423fda345a4ca62430575364d62b \
- --hash=sha256:72f7919af5de5ecfaf1eba47bf9a5d8aa089a3340277276e5636d16ee97614d7 \
- --hash=sha256:7b883af50eaa6bb3299780651e5be921e88050ccf00e3e583b1e92020333304b \
- --hash=sha256:7beec26729d496a12fd23cf8da9944ee338c8b8a17035a560b585c36fe81af20 \
- --hash=sha256:7bf26c2e2ea59d32807081ad51968133af3025c4ba5753e6a794683d2c91bf6e \
- --hash=sha256:7c31669e0c8cc68400ef0c730c3a1e11317ba76b892deeefaf52dcb41d56ed5d \
- --hash=sha256:870dbfa94de9b8866b37b867a2cb37a60c401d9deb4a9ea392abf11a1f98037b \
- --hash=sha256:88646cae28eb1dd5cd1e09605680c2b043b64d7481cdad7f5003ebef401a3039 \
- --hash=sha256:8aafeedb6597a163a9c9727d8a8bd363a93277701b7bfd2749fbefee2396469e \
- --hash=sha256:8bde5b48c65b8e807409e6f20baee5d2cd880e0fad00b1a811ebc43e39a00ab2 \
- --hash=sha256:8f9142a6ed83d90c94a3efd7af8873bf7cefed2d3d44387bf848888482e2d25f \
- --hash=sha256:936a787f83db1f2115ee829dd615c4f684ee48ac4de5779ab4300994d8af325b \
- --hash=sha256:98dc6f4f2095fc7ad277782a7c2c88296badcad92316b5a6e530930b1d475ebc \
- --hash=sha256:9957433c3a1b67bdd4c63717eaf174ebb749510d5ea612cd4e83f2d9142f3fc8 \
- --hash=sha256:99af961d72ac731aae2a1b55ccbdae0733d816f8bfb97b41909e143de735f522 \
- --hash=sha256:9b5f13857da99325dcabe1cc4e9e6a3d7b2e2c726248ba5dd4be3e8e4a0b6d0e \
- --hash=sha256:9d776d30cde7e541b8180103c3f294ef7c1862fd45d81738d156d00551005784 \
- --hash=sha256:a3b7352b48fbc8b446b75f3069124e87f599d25afb8baa96a550256c031bb890 \
- --hash=sha256:a477932664d9611d7a0816cc3c0eb1f8856f8a42435488280dfbf4395e141485 \
- --hash=sha256:a7e41e3ada4cca5f22b478c08e973c930e5e6c7ba3588fb8e35f2398cdcc1545 \
- --hash=sha256:a90fec23b4b05a09ad988e7a4f4e081711a90eb2a55b9c984d8b74597599180f \
- --hash=sha256:a9e523474998fb33f7c1a4d55f5504c908d57add624599e095c20fa575b8d943 \
- --hash=sha256:b0d7a9165167269758145756db43a133608a531b1e5bb6a626b9ee24bc38a8f7 \
- --hash=sha256:b94cbda27267423411c928208e89adddf2ea5dd5f74b9528513f0358bba019cb \
- --hash=sha256:ce232a6170dd6532096cadbf6185271e4e8c70fc9217ebe105923ac105da9978 \
- --hash=sha256:cf903310a34e14651c9de056fcc12ce090560864d5a2bb0174b971685684e1d8 \
- --hash=sha256:d5362d099c244a2d2f9659fb3c9db7c735f0004765bbe06b99be69fbd87c3f15 \
- --hash=sha256:dffaf740fe2e147fedcb6b561353a16243e654f7fe8e701b1b9db148242e1272 \
- --hash=sha256:e6294e76b0380bb7a61eb8a39273c40b20beb35e8c87ee101062834ced19c545 \
- --hash=sha256:eca58e319f4fd6df004762419612122b2c7e7d95ffafc37e890252f869f3fb2a \
- --hash=sha256:ed957db4c33bc99895f3a1672eca7e80e8cda8bd1e29a80536b4ec2153fa9804 \
- --hash=sha256:ef6113cd31411eaf9b39fc5a8848e71c72656fd418882488598758b2c8c6dfa0 \
- --hash=sha256:f8ed79883b4328b7f0bd142733d99c8e6b22703e908ec63d930b06be3a0e7113 \
- --hash=sha256:fe56851c3f1d6f5384b3051c536cc81b3a93a73faf931f404fef95217cf1e10d \
- --hash=sha256:ff7c97eb7a29aba230389a2661edf2e9e06ce616c7e35aa764879b6894a44b25
+pydantic==2.6.3 \
+ --hash=sha256:72c6034df47f46ccdf81869fddb81aade68056003900a8724a4f160700016a2a \
+ --hash=sha256:e07805c4c7f5c6826e33a1d4c9d47950d7eaf34868e2690f8594d2e30241f11f
+pydantic-core==2.16.3 \
+ --hash=sha256:00ee1c97b5364b84cb0bd82e9bbf645d5e2871fb8c58059d158412fee2d33d8a \
+ --hash=sha256:0d32576b1de5a30d9a97f300cc6a3f4694c428d956adbc7e6e2f9cad279e45ed \
+ --hash=sha256:0df446663464884297c793874573549229f9eca73b59360878f382a0fc085979 \
+ --hash=sha256:0f56ae86b60ea987ae8bcd6654a887238fd53d1384f9b222ac457070b7ac4cff \
+ --hash=sha256:162e498303d2b1c036b957a1278fa0899d02b2842f1ff901b6395104c5554a45 \
+ --hash=sha256:1b662180108c55dfbf1280d865b2d116633d436cfc0bba82323554873967b340 \
+ --hash=sha256:1cac689f80a3abab2d3c0048b29eea5751114054f032a941a32de4c852c59cad \
+ --hash=sha256:287073c66748f624be4cef893ef9174e3eb88fe0b8a78dc22e88eca4bc357ca6 \
+ --hash=sha256:2a72fb9963cba4cd5793854fd12f4cfee731e86df140f59ff52a49b3552db241 \
+ --hash=sha256:2acca2be4bb2f2147ada8cac612f8a98fc09f41c89f87add7256ad27332c2fda \
+ --hash=sha256:2f583bd01bbfbff4eaee0868e6fc607efdfcc2b03c1c766b06a707abbc856187 \
+ --hash=sha256:33809aebac276089b78db106ee692bdc9044710e26f24a9a2eaa35a0f9fa70ba \
+ --hash=sha256:36fa178aacbc277bc6b62a2c3da95226520da4f4e9e206fdf076484363895d2c \
+ --hash=sha256:4204e773b4b408062960e65468d5346bdfe139247ee5f1ca2a378983e11388a2 \
+ --hash=sha256:456855f57b413f077dff513a5a28ed838dbbb15082ba00f80750377eed23d132 \
+ --hash=sha256:49d5d58abd4b83fb8ce763be7794d09b2f50f10aa65c0f0c1696c677edeb7cbf \
+ --hash=sha256:4df8a199d9f6afc5ae9a65f8f95ee52cae389a8c6b20163762bde0426275b7db \
+ --hash=sha256:500960cb3a0543a724a81ba859da816e8cf01b0e6aaeedf2c3775d12ee49cade \
+ --hash=sha256:519ae0312616026bf4cedc0fe459e982734f3ca82ee8c7246c19b650b60a5ee4 \
+ --hash=sha256:5c5cbc703168d1b7a838668998308018a2718c2130595e8e190220238addc96f \
+ --hash=sha256:6162f8d2dc27ba21027f261e4fa26f8bcb3cf9784b7f9499466a311ac284b5b9 \
+ --hash=sha256:716b542728d4c742353448765aa7cdaa519a7b82f9564130e2b3f6766018c9ec \
+ --hash=sha256:732da3243e1b8d3eab8c6ae23ae6a58548849d2e4a4e03a1924c8ddf71a387cb \
+ --hash=sha256:75b81e678d1c1ede0785c7f46690621e4c6e63ccd9192af1f0bd9d504bbb6bf4 \
+ --hash=sha256:75f76ee558751746d6a38f89d60b6228fa174e5172d143886af0f85aa306fd89 \
+ --hash=sha256:86b3d0033580bd6bbe07590152007275bd7af95f98eaa5bd36f3da219dcd93da \
+ --hash=sha256:8d62da299c6ecb04df729e4b5c52dc0d53f4f8430b4492b93aa8de1f541c4aac \
+ --hash=sha256:8e47755d8152c1ab5b55928ab422a76e2e7b22b5ed8e90a7d584268dd49e9c6b \
+ --hash=sha256:936e5db01dd49476fa8f4383c259b8b1303d5dd5fb34c97de194560698cc2c5e \
+ --hash=sha256:99b6add4c0b39a513d323d3b93bc173dac663c27b99860dd5bf491b240d26137 \
+ --hash=sha256:9c865a7ee6f93783bd5d781af5a4c43dadc37053a5b42f7d18dc019f8c9d2bd1 \
+ --hash=sha256:a425479ee40ff021f8216c9d07a6a3b54b31c8267c6e17aa88b70d7ebd0e5e5b \
+ --hash=sha256:a6b1bb0827f56654b4437955555dc3aeeebeddc47c2d7ed575477f082622c49e \
+ --hash=sha256:aaf09e615a0bf98d406657e0008e4a8701b11481840be7d31755dc9f97c44053 \
+ --hash=sha256:b29eeb887aa931c2fcef5aa515d9d176d25006794610c264ddc114c053bf96fe \
+ --hash=sha256:b3992a322a5617ded0a9f23fd06dbc1e4bd7cf39bc4ccf344b10f80af58beacd \
+ --hash=sha256:b60cc1a081f80a2105a59385b92d82278b15d80ebb3adb200542ae165cd7d183 \
+ --hash=sha256:b926dd38db1519ed3043a4de50214e0d600d404099c3392f098a7f9d75029ff8 \
+ --hash=sha256:bd87f48924f360e5d1c5f770d6155ce0e7d83f7b4e10c2f9ec001c73cf475c99 \
+ --hash=sha256:c9bd22a2a639e26171068f8ebb5400ce2c1bc7d17959f60a3b753ae13c632975 \
+ --hash=sha256:cbcc558401de90a746d02ef330c528f2e668c83350f045833543cd57ecead1ad \
+ --hash=sha256:cf6204fe865da605285c34cf1172879d0314ff267b1c35ff59de7154f35fdc2e \
+ --hash=sha256:d33dd21f572545649f90c38c227cc8631268ba25c460b5569abebdd0ec5974ca \
+ --hash=sha256:d89ca19cdd0dd5f31606a9329e309d4fcbb3df860960acec32630297d61820df \
+ --hash=sha256:dcca5d2bf65c6fb591fff92da03f94cd4f315972f97c21975398bd4bd046854a \
+ --hash=sha256:ded1c35f15c9dea16ead9bffcde9bb5c7c031bff076355dc58dcb1cb436c4721 \
+ --hash=sha256:e56f8186d6210ac7ece503193ec84104da7ceb98f68ce18c07282fcc2452e76f \
+ --hash=sha256:e7c6ed0dc9d8e65f24f5824291550139fe6f37fac03788d4580da0d33bc00c97 \
+ --hash=sha256:ec08be75bb268473677edb83ba71e7e74b43c008e4a7b1907c6d57e940bf34b6 \
+ --hash=sha256:ed25e1835c00a332cb10c683cd39da96a719ab1dfc08427d476bce41b92531fc \
+ --hash=sha256:f4cb85f693044e0f71f394ff76c98ddc1bc0953e48c061725e540396d5c8a2e1 \
+ --hash=sha256:f53aace168a2a10582e570b7736cc5bef12cae9cf21775e3eafac597e8551fbe \
+ --hash=sha256:f651dd19363c632f4abe3480a7c87a9773be27cfe1341aef06e8759599454120 \
+ --hash=sha256:fc4ad7f7ee1a13d9cb49d8198cd7d7e3aa93e425f371a68235f784e99741561f \
+ --hash=sha256:fee427241c2d9fb7192b658190f9f5fd6dfe41e02f3c1489d2ec1e6a5ab1e04a
+pydantic-settings==2.2.1 \
+ --hash=sha256:00b9f6a5e95553590434c0fa01ead0b216c3e10bc54ae02e37f359948643c5ed \
+ --hash=sha256:0235391d26db4d2190cb9b31051c4b46882d28a51533f97440867f012d4da091
pydub==0.25.1 \
--hash=sha256:65617e33033874b59d87db603aa1ed450633288aefead953b30bded59cb599a6 \
--hash=sha256:980a33ce9949cab2a569606b65674d748ecbca4f0796887fd6f46173a7b0d30f
pygments==2.17.2 \
--hash=sha256:b27c2826c47d0f3219f29554824c30c5e8945175d888647acd804ddd04af846c \
--hash=sha256:da46cec9fd2de5be3a8a784f434e4c4ab670b4ff54d605c4c2717e9d49c4c367
-pyparsing==3.1.1 \
- --hash=sha256:32c7c0b711493c72ff18a981d24f28aaf9c1fb7ed5e9667c9e84e3db623bdbfb \
- --hash=sha256:ede28a1a32462f5a9705e07aea48001a08f7cf81a021585011deba701581a0db
+pyparsing==3.1.2 \
+ --hash=sha256:a1bac0ce561155ecc3ed78ca94d3c9378656ad4c94c1270de543f621420f94ad \
+ --hash=sha256:f9db75911801ed778fe61bb643079ff86601aca99fcae6345aa67292038fb742
pypika==0.48.9 \
--hash=sha256:838836a61747e7c8380cd1b7ff638694b7a7335345d0f559b04b2cd832ad5378
pyproject-hooks==1.0.0 \
@@ -1037,12 +1040,15 @@ pytest==7.4.4 \
pytest-mock==3.12.0 \
--hash=sha256:0972719a7263072da3a21c7f4773069bcc7486027d7e8e1f81d98a47e701bc4f \
--hash=sha256:31a40f038c22cad32287bb43932054451ff5583ff094bca6f675df2f8bc1a6e9
-python-dateutil==2.8.2 \
- --hash=sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86 \
- --hash=sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9
+python-dateutil==2.9.0.post0 \
+ --hash=sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3 \
+ --hash=sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427
python-dotenv==1.0.1 \
--hash=sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca \
--hash=sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a
+python-iso639==2024.2.7 \
+ --hash=sha256:7b149623ff74230f4ee3061fb01d18e57a8d07c5fee2aa72907f39b7f6d16cbc \
+ --hash=sha256:c323233348c34d57c601e3e6d824088e492896bcb97a61a87f7d93401a305377
python-multipart==0.0.9 \
--hash=sha256:03f54688c663f1b7977105f021043b0793151e4cb1c1a9d4a11fc13d622c4026 \
--hash=sha256:97ca7b8ea7b05f977dc3849c3ba99d51689822fab725c3703af7c866a0c2b215
@@ -1131,9 +1137,9 @@ requests==2.31.0 \
requests-oauthlib==1.3.1 \
--hash=sha256:2577c501a2fb8d05a304c09d090d6e47c306fef15809d102b327cf8364bddab5 \
--hash=sha256:75beac4a47881eeb94d5ea5d6ad31ef88856affe2332b9aafb52c6452ccf0d7a
-rich==13.7.0 \
- --hash=sha256:5cb5123b5cf9ee70584244246816e9114227e0b98ad9176eede6ad54bf5403fa \
- --hash=sha256:6da14c108c4866ee9520bbffa71f6fe3962e193b7da68720583850cd4548e235
+rich==13.7.1 \
+ --hash=sha256:4edbae314f59eb482f54e9e30bf00d33350aaa94f4bfcd4e9e3110e64d0d7222 \
+ --hash=sha256:9be308cb1fe2f1f57d67ce99e95af38a1e2bc71ad9813b0e247cf7ffbcc3a432
rpds-py==0.18.0 \
--hash=sha256:01e36a39af54a30f28b73096dd39b6802eddd04c90dbe161c1b8dbe22353189f \
--hash=sha256:044a3e61a7c2dafacae99d1e722cc2d4c05280790ec5a05031b3876809d89a5c \
@@ -1211,30 +1217,30 @@ rpds-py==0.18.0 \
rsa==4.9 \
--hash=sha256:90260d9058e514786967344d0ef75fa8727eed8a7d2e43ce9f4bcf1b536174f7 \
--hash=sha256:e38464a49c6c85d7f1351b0126661487a7e0a14a50f1675ec50eb34d4f20ef21
-ruff==0.2.2 \
- --hash=sha256:0a9efb032855ffb3c21f6405751d5e147b0c6b631e3ca3f6b20f917572b97eb6 \
- --hash=sha256:0c126da55c38dd917621552ab430213bdb3273bb10ddb67bc4b761989210eb6e \
- --hash=sha256:1695700d1e25a99d28f7a1636d85bafcc5030bba9d0578c0781ba1790dbcf51c \
- --hash=sha256:1ec49be4fe6ddac0503833f3ed8930528e26d1e60ad35c2446da372d16651ce9 \
- --hash=sha256:3b65494f7e4bed2e74110dac1f0d17dc8e1f42faaa784e7c58a98e335ec83d7e \
- --hash=sha256:5e1439c8f407e4f356470e54cdecdca1bd5439a0673792dbe34a2b0a551a2fe3 \
- --hash=sha256:5e22676a5b875bd72acd3d11d5fa9075d3a5f53b877fe7b4793e4673499318ba \
- --hash=sha256:6a61ea0ff048e06de273b2e45bd72629f470f5da8f71daf09fe481278b175001 \
- --hash=sha256:940de32dc8853eba0f67f7198b3e79bc6ba95c2edbfdfac2144c8235114d6726 \
- --hash=sha256:b0c232af3d0bd8f521806223723456ffebf8e323bd1e4e82b0befb20ba18388e \
- --hash=sha256:c9d15fc41e6054bfc7200478720570078f0b41c9ae4f010bcc16bd6f4d1aacdd \
- --hash=sha256:cc9a91ae137d687f43a44c900e5d95e9617cb37d4c989e462980ba27039d239d \
- --hash=sha256:d450b7fbff85913f866a5384d8912710936e2b96da74541c82c1b458472ddb39 \
- --hash=sha256:d920499b576f6c68295bc04e7b17b6544d9d05f196bb3aac4358792ef6f34325 \
- --hash=sha256:e62ed7f36b3068a30ba39193a14274cd706bc486fad521276458022f7bccb31d \
- --hash=sha256:ecd46e3106850a5c26aee114e562c329f9a1fbe9e4821b008c4404f64ff9ce73 \
- --hash=sha256:f63d96494eeec2fc70d909393bcd76c69f35334cdbd9e20d089fb3f0640216ca
+ruff==0.3.1 \
+ --hash=sha256:09c7333b25e983aabcf6e38445252cff0b4745420fc3bda45b8fce791cc7e9ce \
+ --hash=sha256:11b5699c42f7d0b771c633d620f2cb22e727fb226273aba775a91784a9ed856c \
+ --hash=sha256:434c3fc72e6311c85cd143c4c448b0e60e025a9ac1781e63ba222579a8c29200 \
+ --hash=sha256:52b02bb46f1a79b0c1fa93f6495bc7e77e4ef76e6c28995b4974a20ed09c0833 \
+ --hash=sha256:54e5dca3e411772b51194b3102b5f23b36961e8ede463776b289b78180df71a0 \
+ --hash=sha256:5f0c21b6914c3c9a25a59497cbb1e5b6c2d8d9beecc9b8e03ee986e24eee072e \
+ --hash=sha256:6b730f56ccf91225da0f06cfe421e83b8cc27b2a79393db9c3df02ed7e2bbc01 \
+ --hash=sha256:6b82e3937d0d76554cd5796bc3342a7d40de44494d29ff490022d7a52c501744 \
+ --hash=sha256:78a7025e6312cbba496341da5062e7cdd47d95f45c1b903e635cdeb1ba5ec2b9 \
+ --hash=sha256:951efb610c5844e668bbec4f71cf704f8645cf3106e13f283413969527ebfded \
+ --hash=sha256:ae7954c8f692b70e6a206087ae3988acc9295d84c550f8d90b66c62424c16771 \
+ --hash=sha256:c0318a512edc9f4e010bbaab588b5294e78c5cdc9b02c3d8ab2d77c7ae1903e3 \
+ --hash=sha256:c78bfa85637668f47bd82aa2ae17de2b34221ac23fea30926f6409f9e37fc927 \
+ --hash=sha256:d30db97141fc2134299e6e983a6727922c9e03c031ae4883a6d69461de722ae7 \
+ --hash=sha256:d3b60e44240f7e903e6dbae3139a65032ea4c6f2ad99b6265534ff1b83c20afa \
+ --hash=sha256:d6abaad602d6e6daaec444cbf4d9364df0a783e49604c21499f75bb92237d4af \
+ --hash=sha256:d937f9b99ebf346e0606c3faf43c1e297a62ad221d87ef682b5bdebe199e01f6
semantic-version==2.10.0 \
--hash=sha256:bdabb6d336998cbb378d4b9db3a4b56a1e3235701dc05ea2690d9a997ed5041c \
--hash=sha256:de78a3b8e0feda74cabc54aab2da702113e33ac9d9eb9d2389bcf1f58b7d9177
-setuptools==69.1.0 \
- --hash=sha256:850894c4195f09c4ed30dba56213bf7c3f21d86ed6bdaafb5df5972593bfc401 \
- --hash=sha256:c054629b81b946d63a9c6e732bc8b2513a7c3ea645f11d0139a2191d735c60c6
+setuptools==69.1.1 \
+ --hash=sha256:02fa291a0471b3a18b2b2481ed902af520c69e8ae0919c13da936542754b4c56 \
+ --hash=sha256:5c0806c7d9af348e6dd3777b4f4dbb42c7ad85b190104837488eab9a7c945cf8
shellingham==1.5.4 \
--hash=sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686 \
--hash=sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de
@@ -1244,36 +1250,36 @@ six==1.16.0 \
smmap==5.0.1 \
--hash=sha256:dceeb6c0028fdb6734471eb07c0cd2aae706ccaecab45965ee83f11c8d3b1f62 \
--hash=sha256:e6d8668fa5f93e706934a62d7b4db19c8d9eb8cf2adbb75ef1b675aa332b69da
-sniffio==1.3.0 \
- --hash=sha256:e60305c5e5d314f5389259b7f22aaa33d8f7dee49763119234af3755c55b9101 \
- --hash=sha256:eecefdce1e5bbfb7ad2eeaabf7c1eeb404d7757c379bd1f7e5cce9d8bf425384
-sqlalchemy==2.0.27 \
- --hash=sha256:03f448ffb731b48323bda68bcc93152f751436ad6037f18a42b7e16af9e91c07 \
- --hash=sha256:0de1263aac858f288a80b2071990f02082c51d88335a1db0d589237a3435fe71 \
- --hash=sha256:0fb3bffc0ced37e5aa4ac2416f56d6d858f46d4da70c09bb731a246e70bff4d5 \
- --hash=sha256:1306102f6d9e625cebaca3d4c9c8f10588735ef877f0360b5cdb4fdfd3fd7131 \
- --hash=sha256:15e19a84b84528f52a68143439d0c7a3a69befcd4f50b8ef9b7b69d2628ae7c4 \
- --hash=sha256:1ab4e0448018d01b142c916cc7119ca573803a4745cfe341b8f95657812700ac \
- --hash=sha256:4535c49d961fe9a77392e3a630a626af5baa967172d42732b7a43496c8b28876 \
- --hash=sha256:5b78aa9f4f68212248aaf8943d84c0ff0f74efc65a661c2fc68b82d498311fd5 \
- --hash=sha256:5cd20f58c29bbf2680039ff9f569fa6d21453fbd2fa84dbdb4092f006424c2e6 \
- --hash=sha256:680b9a36029b30cf063698755d277885d4a0eab70a2c7c6e71aab601323cba45 \
- --hash=sha256:6c5bad7c60a392850d2f0fee8f355953abaec878c483dd7c3836e0089f046bf6 \
- --hash=sha256:6c7a596d0be71b7baa037f4ac10d5e057d276f65a9a611c46970f012752ebf2d \
- --hash=sha256:7f470327d06400a0aa7926b375b8e8c3c31d335e0884f509fe272b3c700a7254 \
- --hash=sha256:86a6ed69a71fe6b88bf9331594fa390a2adda4a49b5c06f98e47bf0d392534f8 \
- --hash=sha256:8dfc936870507da96aebb43e664ae3a71a7b96278382bcfe84d277b88e379b18 \
- --hash=sha256:954d9735ee9c3fa74874c830d089a815b7b48df6f6b6e357a74130e478dbd951 \
- --hash=sha256:a3012ab65ea42de1be81fff5fb28d6db893ef978950afc8130ba707179b4284a \
- --hash=sha256:c4fbe6a766301f2e8a4519f4500fe74ef0a8509a59e07a4085458f26228cd7cc \
- --hash=sha256:ce850db091bf7d2a1f2fdb615220b968aeff3849007b1204bf6e3e50a57b3d32 \
- --hash=sha256:d04e579e911562f1055d26dab1868d3e0bb905db3bccf664ee8ad109f035618a \
- --hash=sha256:d177b7e82f6dd5e1aebd24d9c3297c70ce09cd1d5d37b43e53f39514379c029c \
- --hash=sha256:d997c5938a08b5e172c30583ba6b8aad657ed9901fc24caf3a7152eeccb2f1b4 \
- --hash=sha256:dbcd77c4d94b23e0753c5ed8deba8c69f331d4fd83f68bfc9db58bc8983f49cd \
- --hash=sha256:eb15ef40b833f5b2f19eeae65d65e191f039e71790dd565c2af2a3783f72262f \
- --hash=sha256:f9374e270e2553653d710ece397df67db9d19c60d2647bcd35bfc616f1622dcd \
- --hash=sha256:fa67d821c1fd268a5a87922ef4940442513b4e6c377553506b9db3b83beebbd8
+sniffio==1.3.1 \
+ --hash=sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2 \
+ --hash=sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc
+sqlalchemy==2.0.28 \
+ --hash=sha256:0315d9125a38026227f559488fe7f7cee1bd2fbc19f9fd637739dc50bb6380b2 \
+ --hash=sha256:0d3dd67b5d69794cfe82862c002512683b3db038b99002171f624712fa71aeaa \
+ --hash=sha256:124202b4e0edea7f08a4db8c81cc7859012f90a0d14ba2bf07c099aff6e96462 \
+ --hash=sha256:1ee8bd6d68578e517943f5ebff3afbd93fc65f7ef8f23becab9fa8fb315afb1d \
+ --hash=sha256:46a3d4e7a472bfff2d28db838669fc437964e8af8df8ee1e4548e92710929adc \
+ --hash=sha256:4a5adf383c73f2d49ad15ff363a8748319ff84c371eed59ffd0127355d6ea1da \
+ --hash=sha256:4b6303bfd78fb3221847723104d152e5972c22367ff66edf09120fcde5ddc2e2 \
+ --hash=sha256:56856b871146bfead25fbcaed098269d90b744eea5cb32a952df00d542cdd368 \
+ --hash=sha256:5da98815f82dce0cb31fd1e873a0cb30934971d15b74e0d78cf21f9e1b05953f \
+ --hash=sha256:78bb7e8da0183a8301352d569900d9d3594c48ac21dc1c2ec6b3121ed8b6c986 \
+ --hash=sha256:81ba314a08c7ab701e621b7ad079c0c933c58cdef88593c59b90b996e8b58fa5 \
+ --hash=sha256:943aa74a11f5806ab68278284a4ddd282d3fb348a0e96db9b42cb81bf731acdc \
+ --hash=sha256:9b66fcd38659cab5d29e8de5409cdf91e9986817703e1078b2fdaad731ea66f5 \
+ --hash=sha256:a921002be69ac3ab2cf0c3017c4e6a3377f800f1fca7f254c13b5f1a2f10022c \
+ --hash=sha256:ad7acbe95bac70e4e687a4dc9ae3f7a2f467aa6597049eeb6d4a662ecd990bb6 \
+ --hash=sha256:af8ce2d31679006e7b747d30a89cd3ac1ec304c3d4c20973f0f4ad58e2d1c4c9 \
+ --hash=sha256:b4a2cf92995635b64876dc141af0ef089c6eea7e05898d8d8865e71a326c0385 \
+ --hash=sha256:bbda76961eb8f27e6ad3c84d1dc56d5bc61ba8f02bd20fcf3450bd421c2fcc9c \
+ --hash=sha256:bea30da1e76cb1acc5b72e204a920a3a7678d9d52f688f087dc08e54e2754c67 \
+ --hash=sha256:c61e2e41656a673b777e2f0cbbe545323dbe0d32312f590b1bc09da1de6c2a02 \
+ --hash=sha256:c6c4da4843e0dabde41b8f2e8147438330924114f541949e6318358a56d1875a \
+ --hash=sha256:d3499008ddec83127ab286c6f6ec82a34f39c9817f020f75eca96155f9765097 \
+ --hash=sha256:dd53b6c4e6d960600fd6532b79ee28e2da489322fcf6648738134587faf767b6 \
+ --hash=sha256:e0b148ab0438f72ad21cb004ce3bdaafd28465c4276af66df3b9ecd2037bf252 \
+ --hash=sha256:e23b88c69497a6322b5796c0781400692eca1ae5532821b39ce81a48c395aae9 \
+ --hash=sha256:feea693c452d85ea0015ebe3bb9cd15b6f49acc1a31c28b3c50f4db0f8fb1e71
starlette==0.36.3 \
--hash=sha256:13d429aa93a61dc40bf503e8c801db1f1bca3dc706b10ef2434a36123568f044 \
--hash=sha256:90a671733cfb35771d8cc605e0b679d23b992f8dcfad48cc60b38cb29aeb7080
@@ -1381,9 +1387,12 @@ tokenizers==0.15.2 \
--hash=sha256:f33dfbdec3784093a9aebb3680d1f91336c56d86cc70ddf88708251da1fe9064 \
--hash=sha256:f86593c18d2e6248e72fb91c77d413a815153b8ea4e31f7cd443bdf28e467670 \
--hash=sha256:fb16ba563d59003028b678d2361a27f7e4ae0ab29c7a80690efa20d829c81fdb
-tomli==2.0.1; python_version < "3.11" \
+tomli==2.0.1 \
--hash=sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc \
--hash=sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f
+tomli-w==1.0.0 \
+ --hash=sha256:9f2a07e8be30a0729e533ec968016807069991ae2fd921a78d42f429ae5f4463 \
+ --hash=sha256:f463434305e0336248cac9c2dc8076b707d8a12d019dd349f5c1e382dd1ae1b9
tomlkit==0.12.0 \
--hash=sha256:01f0477981119c7d8ee0f67ebe0297a7c95b14cf9f4b102b45486deb77018716 \
--hash=sha256:926f1f37a1587c7a4f6c7484dae538f1345d96d793d9adab5d3675957b1d0766
@@ -1396,9 +1405,9 @@ tqdm==4.66.2 \
typer==0.9.0 \
--hash=sha256:50922fd79aea2f4751a8e0408ff10d2662bd0c8bbfa84755a699f3bada2978b2 \
--hash=sha256:5d96d986a21493606a358cae4461bd8cdf83cbf33a5aa950ae629ca3b51467ee
-typing-extensions==4.9.0 \
- --hash=sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783 \
- --hash=sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd
+typing-extensions==4.10.0 \
+ --hash=sha256:69b1a937c3a517342112fb4c6df7e72fc39a38e7891a5730ed4985b5214b5475 \
+ --hash=sha256:b0abd7c89e8fb96f98db18d86106ff1d90ab692004eb746cf6eda2682f91b3cb
typing-inspect==0.9.0 \
--hash=sha256:9ee6fc59062311ef8547596ab6b955e1b8aa46242d854bfc78f4f6b0eff35f9f \
--hash=sha256:b23fc42ff6f6ef6954e4852c1fb512cdd18dbea03134f91f856a95ccc9461f78