
Ollama4j-UI

This project should be seen as a showcase for creating a Java-based frontend application (using the Vaadin framework on top of an OpenLiberty Jakarta EE 10 server) for model interaction with Ollama via the ollama4j Java API.

It should be considered a set of samples / use cases rather than a fully functional, integrated application covering each and every aspect of AI-driven applications.

However, one main goal of this repository is that anybody using Java as their primary development language can set up an environment for Ollama development on a local server / local machine within a few minutes.
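Model interaction goes through ollama4j's OllamaAPI client. Below is a minimal connectivity sketch, not code from this repository; the package name and exact methods depend on the ollama4j version declared in the project's pom.xml (newer releases moved the class to io.github.ollama4j.OllamaAPI), and the host is assumed to be Ollama's default local endpoint.

// Package name varies by ollama4j version; newer releases use io.github.ollama4j.OllamaAPI.
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class OllamaConnectivitySketch {
    public static void main(String[] args) {
        // Assumed host: Ollama's default local endpoint.
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        // ping() reports whether the local Ollama server is reachable.
        System.out.println("Ollama reachable: " + ollamaAPI.ping());
    }
}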

Architecture

flowchart LR
    subgraph Ollama Deployment
        direction TB
        m[Models]
        OLLAMA[Ollama] -->|manages| m
    end
    subgraph JakartaEE Runtime
        direction LR    
        UI[Ollama4j-UI] --uses--> O4J[Ollama4j]
        O4J-. http .-> OLLAMA
    end
    BROWSER[browser] -. http .-> UI
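The dotted http arrow between Ollama4j and Ollama is a plain REST call. As a rough illustration of that hop on the wire (not this project's code), here is a minimal request against Ollama's /api/generate endpoint using the JDK HTTP client, assuming Ollama listens on its default port 11434 and that a model named llama3 has been pulled:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaHttpSketch {
    public static void main(String[] args) throws Exception {
        // Model name is an assumption; any model available in the local Ollama works.
        String body = "{\"model\": \"llama3\", \"prompt\": \"Why is the sky blue?\", \"stream\": false}";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // Ollama answers with a JSON document whose "response" field holds the generated text.
        System.out.println(response.body());
    }
}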

Build Status

(build-status badge)

Requirements

The following requirements must be met to run this application:

Java

Or

Running the Application

Import the project into the IDE of your choice as a Maven project. Configure microprofile-config.properties to match your local setup.
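How those properties are consumed is project-specific; as a hedged sketch of the usual MicroProfile Config pattern on OpenLiberty, a CDI bean can inject them as shown below (the keys ollama.host and ollama.default.model are illustrative assumptions, not necessarily the keys this project defines):

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import org.eclipse.microprofile.config.inject.ConfigProperty;

@ApplicationScoped
public class OllamaSettings {

    // Key names are illustrative; check microprofile-config.properties for the real ones.
    @Inject
    @ConfigProperty(name = "ollama.host", defaultValue = "http://localhost:11434")
    String host;

    @Inject
    @ConfigProperty(name = "ollama.default.model")
    String defaultModel;

    public String getHost() { return host; }

    public String getDefaultModel() { return defaultModel; }
}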

Run the application using

mvn liberty:dev (or ./mvnw / mvnw if Maven is not installed locally)

Open http://localhost:9080/ollama4j-ui in a browser.

Specify the model to use via the model selector in the top right corner of the application (it should default to the model set in microprofile-config.properties).
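As a rough idea of what such a selector looks like in Vaadin Flow (a sketch under assumptions, not this project's actual view code), a ComboBox can be filled with model names and preset to the configured default:

import com.vaadin.flow.component.combobox.ComboBox;
import com.vaadin.flow.component.notification.Notification;
import com.vaadin.flow.component.orderedlayout.VerticalLayout;
import com.vaadin.flow.router.Route;
import java.util.List;

@Route("model-demo") // hypothetical route, not part of this project
public class ModelSelectorDemoView extends VerticalLayout {

    public ModelSelectorDemoView() {
        ComboBox<String> modelSelector = new ComboBox<>("Model");
        // Model names and the preselected default are illustrative assumptions.
        modelSelector.setItems(List.of("llama3", "mistral", "phi3"));
        modelSelector.setValue("llama3");
        modelSelector.addValueChangeListener(event ->
                Notification.show("Model switched to " + event.getValue()));
        add(modelSelector);
    }
}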

Get Involved

Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping with code - any sort of contribution is much appreciated.

Credits

Shout out to @amithkoujalgi for creating the awesome ollama4j library that inspired the creation of this application.

A further shout-out to everybody involved in and active around the awesome Ollama project.
