Hello!
Thank you for this project!
I made an async PHP framework with many features dedicated to llama.cpp. The llama.cpp bindings can also be used standalone.
Here are the docs on how to use the basic bindings: https://resonance.distantmagic.com/docs/features/ai/server/llama-cpp/
There are also some features built on top of the llama.cpp API:
https://resonance.distantmagic.com/docs/features/ai/prompt-subject-responders/
I also wrote some tutorials:
https://resonance.distantmagic.com/tutorials/how-to-create-llm-websocket-chat-with-llama-cpp/
https://resonance.distantmagic.com/tutorials/how-to-serve-llm-completions/
I added a direct link to the bindings in the README. Is there a chance I could also add links to those tutorials and other resources somewhere here?
Best wishes