
kaiz

Before anything else, start the local Ollama service:

$ ~ cd local
$ ~ docker compose -f compose.ollama.yaml up -d
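The contents of compose.ollama.yaml are not reproduced here; a minimal sketch of what such a file might look like, assuming the official ollama/ollama image and its default API port 11434 (the service and container name local_ollama are taken from the docker exec command below):

```yaml
# Hypothetical sketch -- the actual compose.ollama.yaml may differ.
services:
  local_ollama:
    image: ollama/ollama          # official Ollama image (assumption)
    container_name: local_ollama  # matches the `docker exec` target below
    ports:
      - "11434:11434"             # Ollama's default API port
    volumes:
      - ollama_data:/root/.ollama # persist pulled models across restarts
volumes:
  ollama_data:
```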

Then, access the container and pull the model:

$ ~ docker exec -it local_ollama bash
$ ~ ollama pull llama3.2:1b

Set the environment variables pointing kaiz at the Ollama server and the model to use:

export KAIZ_OLLAMA_HOST=...
export KAIZ_OLLAMA_MODEL=...
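A hedged sketch of how these variables might be read inside the program, assuming the names above and typical Ollama defaults (the helper name readConfig and the fallback values are illustrative, not from the source):

```typescript
// Hypothetical helper -- the real index.ts may read these differently.
interface KaizConfig {
  host: string;  // KAIZ_OLLAMA_HOST, e.g. http://localhost:11434
  model: string; // KAIZ_OLLAMA_MODEL, e.g. llama3.2:1b
}

function readConfig(env: Record<string, string | undefined> = process.env): KaizConfig {
  return {
    // Fallbacks are assumptions: Ollama's default port and the model pulled above.
    host: env.KAIZ_OLLAMA_HOST ?? "http://localhost:11434",
    model: env.KAIZ_OLLAMA_MODEL ?? "llama3.2:1b",
  };
}
```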

To install dependencies:

bun install

To run:

bun run index.ts Hello
bun run index.ts Hello! How to Run Open Source LLMs Locally Using Ollama ?
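The second invocation passes the prompt as multiple unquoted arguments, which suggests index.ts joins everything after the script name into a single prompt and sends it to Ollama's /api/generate endpoint. A sketch under those assumptions (buildPrompt and buildGenerateRequest are illustrative names, not from the source):

```typescript
// Hypothetical sketch of the CLI flow -- the real index.ts may differ.
function buildPrompt(argv: string[]): string {
  // argv[0] is the runtime, argv[1] the script; the rest form the prompt.
  return argv.slice(2).join(" ");
}

function buildGenerateRequest(host: string, model: string, prompt: string) {
  // Shape of Ollama's non-streaming /api/generate request body.
  return {
    url: `${host}/api/generate`,
    body: JSON.stringify({ model, prompt, stream: false }),
  };
}

// Usage (requires a running Ollama server):
// const { url, body } = buildGenerateRequest(host, model, buildPrompt(process.argv));
// const res = await fetch(url, { method: "POST", body });
// console.log((await res.json()).response);
```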

To build:

  • Linux:
$ ~ bun build --compile --minify ./index.ts --target=bun-linux-x64 --outfile=bin/kaiz
  • Windows:
$ ~ bun build --compile --minify ./index.ts --target=bun-windows-x64 --outfile=bin/kaiz
