We have started working on a GUI for the Subsystem and are fully remaking it to work with all Docker socket types, regardless of the host OS. If you would like to test it, check out the Subsystem_pixelarch folder!
In addition to Carly, we have developed the Midori AI Subsystem, a modular and extensible platform designed to simplify the deployment, configuration, and management of AI workloads. The Subsystem leverages Docker container technology to provide standardized environments for AI systems, ensuring consistent and predictable behavior. Key features include:
- Simplified Deployment: Easily deploy AI systems using pre-configured containers, reducing complexity and setup time.
- Standardized Configurations: Ensure consistency and reproducibility across different environments with standardized configurations.
- Isolation for AI Systems: Isolate AI systems within containers, preventing conflicts and ensuring predictable resource allocation.
- Growing Library of Backends and Tools: Access a wide range of supported AI backends and tools, expanding your AI capabilities.
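To illustrate the container-based approach, a backend such as LocalAI could be described in a compose file like the sketch below. This is a hypothetical example, not the Subsystem's actual configuration; the image name, port, and volume are assumptions based on LocalAI's published defaults.

```yaml
# Hypothetical compose sketch, NOT the Subsystem's real config.
services:
  localai:
    image: localai/localai:latest   # assumed image tag
    ports:
      - "8080:8080"                 # LocalAI's default API port
    volumes:
      - localai-models:/models      # keep model files in a named volume
    restart: unless-stopped
volumes:
  localai-models:
```

Because each backend runs in its own container with its own volume, two backends cannot clobber each other's files or ports, which is the isolation property described above.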
Chat with your own locally hosted AI, via:
- AnythingLLM - For chatting with your docs using LocalAI or other LLM hosts
- Big-AGI - For chatting with your docs using LocalAI or other LLM hosts
Seamlessly integrate your AI systems with these LLM Backends:
- LocalAI - For LLM inference, Embedding, and more
- Ollama - For LLM inference, Embedding, and more
- Axolotl - For training your own fine-tuned LLMs
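Both LocalAI and Ollama expose OpenAI-compatible REST APIs, so a single client can talk to either backend. The sketch below shows what such a client might look like using only the standard library; the base URL, port, and model name are assumptions to adjust for your own install, not values defined by the Subsystem.

```python
import json
import urllib.request

def build_chat_request(prompt, model="gpt-3.5-turbo"):
    # OpenAI-style chat payload; LocalAI and Ollama both accept
    # this shape on their /v1/chat/completions endpoints.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt, base_url="http://localhost:8080"):
    # base_url assumes LocalAI's default port 8080; Ollama's
    # default is http://localhost:11434.
    req = urllib.request.Request(
        base_url + "/v1/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Swapping backends then only means changing `base_url` and `model`; the calling code stays the same.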
Chat with these locally hosted LLM Hubs, using the LLM backends in the Subsystem:
- AutoGPT - For setting up and running LLM "Experts"
- MemGPT - For setting up and running LLMs with OS-like memory
Support the Midori AI node-based cluster system!
- Midori AI Cluster - Not ready yet
Generate images for your AIs using:
- InvokeAI - For generating images with AI models
-
What is the purpose of the Midori AI Subsystem?
- The Midori AI Subsystem is a modular and extensible platform for managing AI workloads, providing simplified deployment, standardized configurations, and isolation for AI systems.
-
How does the Midori AI Subsystem simplify AI deployment?
- The Midori AI Subsystem simplifies AI deployment by providing a streamlined and efficient way to deploy AI systems using Docker container technology, reducing complexity and ensuring consistent and predictable behavior.
-
What are the benefits of using the Midori AI Subsystem?
- The benefits of using the Midori AI Subsystem include simplified deployment, standardized configurations, isolation for AI systems, and a growing library of supported backends and tools.
-
What are the limitations of the Midori AI Subsystem?
- The limitations of the Midori AI Subsystem include its current beta status, potential for bugs, and reliance on Docker container technology.
-
What are the recommended prerequisites for using the Midori AI Subsystem?
- The recommended prerequisites for using the Midori AI Subsystem include Docker Desktop on Windows (or Docker installed on other operating systems) and a dedicated folder for the Manager program.
-
How do I install the Midori AI Subsystem Manager?
- You can install the Midori AI Subsystem Manager by downloading the appropriate package for your operating system from the Midori AI Subsystem website and following the installation instructions.
-
Where can I find more information about the Midori AI Subsystem?
- You can find more information about the Midori AI Subsystem on the Midori AI Subsystem website, which includes documentation, tutorials, and a community Discord.
-
What is the difference between the Midori AI Subsystem and other AI frameworks?
- The Midori AI Subsystem differs from other AI frameworks by providing a modular and extensible platform specifically designed for managing AI workloads, offering features such as simplified deployment, standardized configurations, and isolation for AI systems.
-
How does the Midori AI Subsystem handle security?
- The Midori AI Subsystem does not handle security directly, but it relies on the security features provided by the underlying Docker container technology and the specific AI backends and tools being used.
-
What are the plans for future development of the Midori AI Subsystem?
- The plans for future development of the Midori AI Subsystem include adding new features, improving stability and performance, and expanding the library of supported backends and tools.
Questions from Carly Kay
- Join our Discord community: https://discord.gg/xdgCx3VyHU
- Connect with us on Facebook: https://www.facebook.com/TWLunagreen
- Follow us on Twitter: https://twitter.com/lunamidori5
- Explore our Pinterest boards: https://www.pinterest.com/luna_midori5/
- Follow us on Twitch: https://www.twitch.tv/luna_midori5
- Subscribe to our YouTube channel: https://www.youtube.com/channel/UCVQo4TxFJEoE5kccScY-xow
- Support us on PayPal: https://paypal.me/midoricookieclub?country.x=US&locale.x=en_US
Unleashing the Future of AI, Together.