
Add more examples
wctsai20002 committed Jul 29, 2024
1 parent 3ae243f commit ee9c352
Showing 1 changed file with 23 additions and 0 deletions.

This example generates prompts in split format and then sends each prompt to the LLM sequentially:

```python
import json
from repository2prompt import Repository2Prompt

# Generate prompts in split format: one prompt/content pair per file
converter = Repository2Prompt("https://github.com/octocat/octocat.github.io", output_format="split")
result = converter.process()
prompts = json.loads(result)

for item in prompts:
    prompt = item['prompt']
    content = item['content']

    # Combine the prompt and the file content into a single message
    full_message = f"{prompt}\n\n{content}"
    print(full_message)

    # Send to the LLM; chat_with_llm is a user-supplied helper, not part of this package
    response = chat_with_llm(full_message)
```
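The `chat_with_llm` helper above is not part of Repository2Prompt; you supply your own. As one possibility, a minimal sketch against an OpenAI-compatible chat completions endpoint, using only the standard library (the endpoint URL, model name, and `OPENAI_API_KEY` environment variable are assumptions), might look like:

```python
import json
import os
import urllib.request

def chat_with_llm(message, model="gpt-4o-mini"):
    """Send a single user message to an OpenAI-compatible chat endpoint
    and return the assistant's reply as a string."""
    request = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": message}],
        }).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Assumes the API key is exported as OPENAI_API_KEY
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        },
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    # The reply text lives in the first choice's message content
    return body["choices"][0]["message"]["content"]
```

Any client that takes a string and returns the model's reply will work in its place.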

## Configuration

Repository2Prompt uses a YAML configuration file. The default configuration is located in `default_config.yaml` within the package. You can override the default settings by creating a `.repository2prompt.yaml` file in your home directory.
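As a hedged illustration of the override mechanism, a `~/.repository2prompt.yaml` might look like the sketch below; the key shown mirrors the `output_format` option used in the examples above, but consult `default_config.yaml` in the package for the actual set of supported keys.

```yaml
# Hypothetical ~/.repository2prompt.yaml; key names are illustrative,
# see default_config.yaml in the package for the real options.
output_format: split
```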
