
Big UPDATE! (9/22/24)

We have started working on a GUI for the Subsystem and are fully remaking it to work with all Docker socket types, regardless of the host OS. If you would like to test it, check out the Subsystem_pixelarch folder!

Midori AI Subsystem: Streamlining AI Workloads


In addition to Carly, we have developed the Midori AI Subsystem, a modular and extensible platform designed to simplify the deployment, configuration, and management of AI workloads. The Subsystem leverages Docker container technology to provide standardized environments for AI systems, ensuring consistent and predictable behavior. Key features include:

  • Simplified Deployment: Easily deploy AI systems using pre-configured containers, reducing complexity and setup time.
  • Standardized Configurations: Ensure consistency and reproducibility across different environments with standardized configurations.
  • Isolation for AI Systems: Isolate AI systems within containers, preventing conflicts and ensuring predictable resource allocation.
  • Growing Library of Backends and Tools: Access a wide range of supported AI backends and tools, expanding your AI capabilities.
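As a rough illustration of the isolation and standardized-configuration points above, the sketch below launches a backend in its own container using the Docker SDK for Python (docker-py). The image name, container name, port, environment variable, and memory limit are illustrative assumptions, not the Subsystem's actual settings.

```python
# Minimal sketch: launching an AI backend in an isolated container with a
# standardized configuration. Image name, ports, environment, and limits
# are illustrative assumptions, not the Subsystem's real configuration.
import docker

client = docker.from_env()  # talks to the local Docker socket

container = client.containers.run(
    "localai/localai:latest",        # assumed backend image
    name="subsystem-localai",        # assumed container name
    detach=True,                     # run in the background
    ports={"8080/tcp": 8080},        # expose the backend API on the host
    environment={"THREADS": "4"},    # assumed standardized config value
    mem_limit="8g",                  # cap memory so backends do not conflict
    restart_policy={"Name": "unless-stopped"},
)
print(f"Started {container.name} ({container.short_id})")
```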

----- Install / Setup -----

----- Installable Backends -----

Chat UIs

Chat with your own locally hosted AI, via:

  • AnythingLLM - For chatting with your docs using LocalAI or other LLM hosts
  • Big-AGI - For chatting with your docs using LocalAI or other LLM hosts

LLM Backends

Seamlessly integrate your AI systems with these LLM Backends:

  • LocalAI - For LLM inference, embeddings, and more
  • Ollama - For LLM inference, embeddings, and more
  • Axolotl - For training your own fine-tuned LLMs
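LocalAI exposes an OpenAI-compatible HTTP API, so once a backend container is running you can query it with any HTTP client. The sketch below is a minimal example that assumes LocalAI is listening on localhost:8080 with a model alias of gpt-4; both are assumptions, so adjust them to your own setup.

```python
# Minimal sketch: querying a locally hosted LocalAI backend through its
# OpenAI-compatible API. Host, port, and model name are assumptions.
import requests

BASE_URL = "http://localhost:8080/v1"   # assumed LocalAI endpoint

response = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "gpt-4",               # assumed model alias configured in LocalAI
        "messages": [
            {"role": "user", "content": "Summarize what the Midori AI Subsystem does."}
        ],
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```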

LLM Hubs

Chat with these locally hosted LLM Hubs, using the LLM backends in the Subsystem:

  • AutoGPT - For setting up / running LLM "Experts"
  • MemGPT - For setting up / running LLMs with OS-like memory

Cluster Based AI

Support the Midori AI node-based cluster system!

  • Midori AI Cluster - Not ready yet

Image AI

Generate images for your AIs using:

  • InvokeAI - For generating images using AI models

----- FAQs about the subsystem -----

  1. What is the purpose of the Midori AI Subsystem?

    • The Midori AI Subsystem is a modular and extensible platform for managing AI workloads, providing simplified deployment, standardized configurations, and isolation for AI systems.
  2. How does the Midori AI Subsystem simplify AI deployment?

    • The Midori AI Subsystem simplifies AI deployment by providing a streamlined and efficient way to deploy AI systems using Docker container technology, reducing complexities and ensuring consistent and predictable behavior.
  3. What are the benefits of using the Midori AI Subsystem?

    • The benefits of using the Midori AI Subsystem include simplified deployment, standardized configurations, isolation for AI systems, and a growing library of supported backends and tools.
  4. What are the limitations of the Midori AI Subsystem?

    • The limitations of the Midori AI Subsystem include its current beta status, potential for bugs, and reliance on Docker container technology.
  5. What are the recommended prerequisites for using the Midori AI Subsystem?

    • The recommended prerequisites for using the Midori AI Subsystem include Docker Desktop on Windows (or Docker installed on other operating systems) and a dedicated folder for the Manager program; a quick way to verify that Docker is reachable is sketched just after this FAQ.
  6. How do I install the Midori AI Subsystem Manager?

  7. Where can I find more information about the Midori AI Subsystem?

    • You can find more information about the Midori AI Subsystem on the Midori AI Subsystem website, which includes documentation, tutorials, and a community Discord.
  8. What is the difference between the Midori AI Subsystem and other AI frameworks?

    • The Midori AI Subsystem differs from other AI frameworks by providing a modular and extensible platform specifically designed for managing AI workloads, offering features such as simplified deployment, standardized configurations, and isolation for AI systems.
  9. How does the Midori AI Subsystem handle security?

    • The Midori AI Subsystem does not handle security directly, but it relies on the security features provided by the underlying Docker container technology and the specific AI backends and tools being used.
  10. What are the plans for future development of the Midori AI Subsystem?

    • The plans for future development of the Midori AI Subsystem include adding new features, improving stability and performance, and expanding the library of supported backends and tools.
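
As referenced in FAQ item 5, the main hard prerequisite is a working Docker installation. Below is a minimal sketch, using the Docker SDK for Python, for checking that the Docker daemon is reachable before launching the Manager; the exact check the Manager itself performs may differ.

```python
# Minimal sketch: confirming the Docker prerequisite before running the
# Subsystem Manager. Only standard Docker SDK calls are used; the error
# message wording is illustrative.
import sys

import docker
from docker.errors import DockerException

try:
    client = docker.from_env()
    client.ping()  # raises if the Docker daemon is unreachable
    print("Docker is reachable, version:", client.version()["Version"])
except DockerException as err:
    sys.exit(f"Docker does not appear to be available: {err}")
```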

Questions from Carly Kay

Get Involved:

Unleashing the Future of AI, Together.
