This is an LLM fine-tuned and tailored to be a helpful assistant for the OCF (kind of like an Operational Staff member with a lot more technical knowledge to fix things on the backend). Welcome to the OCF staff, Waddles!
The project consists of two components with different tech stacks:

- llm-backend: the backend that lives on HPC
- llm-frontend: the application frontend that lives on the desktops

These two components talk to each other using a RESTful API; an example can be found here. One caveat: the input to the model needs to be passed using the format specified here in `get_prompt_template()` (a sketch of what this might look like is shown below).
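Since the real endpoint and payload shape are documented in the component READMEs, here is a minimal sketch of what a frontend-to-backend call might look like. The endpoint path, port, JSON field names, and the placeholder `get_prompt_template()` body below are all assumptions for illustration, not the actual implementation:

```python
# Minimal sketch of a frontend-to-backend request.
# NOTE: the endpoint URL and JSON field names are assumptions for illustration;
# the real values live in the llm-backend and llm-frontend READMEs.
import requests

BACKEND_URL = "http://localhost:8000/generate"  # hypothetical endpoint


def get_prompt_template(user_message: str) -> str:
    """Placeholder for the real get_prompt_template() in llm-backend.

    It wraps the raw user message in whatever instruction format the
    fine-tuned model expects; the template shown here is made up.
    """
    return f"### Instruction:\n{user_message}\n\n### Response:\n"


def ask_waddles(user_message: str) -> str:
    # Wrap the message with the prompt template before sending it to the model.
    payload = {"prompt": get_prompt_template(user_message)}  # field name is an assumption
    resp = requests.post(BACKEND_URL, json=payload, timeout=60)
    resp.raise_for_status()
    return resp.json().get("response", "")  # field name is an assumption


if __name__ == "__main__":
    print(ask_waddles("How do I reset my OCF account password?"))
```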
Visit the individual README.md file for each component to see how to contribute. (Click on the component names above.)
This might become an interest group in some future semester, so if you're seeing this once it is, just know I called it way back when.