LLM Factory

LLM Factory using Dependency Injection provided by [Injector](https://github.com/python-injector/injector)

It enables streamlined instantiation of an assortment of LLMs. Leveraging Injector, configuration is kept separate from the model itself, providing a uniform interface for instantiation. Using Dependency Injection also encourages testability in code further downstream.
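The sketch below illustrates the idea, assuming hypothetical class names (`OpenAIConfig`, `FakeOpenAIModel`, `OpenAIModule`) rather than this package's actual API: an Injector module binds a configuration object and provides a model built from it, so downstream code requests the model without knowing how it was configured.

```python
# Minimal sketch of config/model separation with Injector.
# All class names here are illustrative, not part of llm-factory's real API.
from dataclasses import dataclass

from injector import Injector, Module, provider, singleton


@dataclass
class OpenAIConfig:
    """Configuration only -- carries no knowledge of how the model is built."""
    api_key: str
    model_name: str = "gpt-4"


class FakeOpenAIModel:
    """Stand-in for a real OpenAI client, constructed from config alone."""
    def __init__(self, config: OpenAIConfig) -> None:
        self.config = config


class OpenAIModule(Module):
    """Injector module: binds the config and provides the model built from it."""
    def __init__(self, api_key: str) -> None:
        self._api_key = api_key

    @singleton
    @provider
    def provide_config(self) -> OpenAIConfig:
        return OpenAIConfig(api_key=self._api_key)

    @singleton
    @provider
    def provide_model(self, config: OpenAIConfig) -> FakeOpenAIModel:
        return FakeOpenAIModel(config)


injector = Injector([OpenAIModule(api_key="dummy-key")])
model = injector.get(FakeOpenAIModel)  # uniform instantiation path
```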

LLM Factory has the following goals:

  • Streamlined instantiation
  • Configuration management
  • An opinionated set of supported LLM services (listed below)

Models supported:

  • GPT4All
  • OpenAI
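Because downstream code depends only on an injected interface, swapping backends or substituting a test double is a matter of binding a different implementation. The following hedged sketch (again with illustrative names, not the package's real API) shows how a stub model could be bound for offline tests:

```python
# Illustrative sketch of the testability benefit of Dependency Injection:
# bind a stub LLM in tests, a real GPT4All/OpenAI model in production.
from injector import Injector, Module, provider


class LLM:
    """Common interface that downstream code depends on."""
    def complete(self, prompt: str) -> str:
        raise NotImplementedError


class StubLLM(LLM):
    """Test double: binding this instead of a real model keeps tests offline."""
    def complete(self, prompt: str) -> str:
        return "stubbed response"


class TestModule(Module):
    @provider
    def provide_llm(self) -> LLM:
        return StubLLM()


def summarise(llm: LLM, text: str) -> str:
    # Downstream code only sees the LLM interface, never provider specifics.
    return llm.complete(f"Summarise: {text}")


injector = Injector([TestModule()])
print(summarise(injector.get(LLM), "dependency injection"))
```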
