LlamaPen

A no-install needed GUI for Ollama.

(App preview screenshot)

Features

  • 🌐 Web-based interface usable on both desktop and mobile.
  • ✅ Easy setup & configuration.
  • 🖥️ Renders Markdown, model "think" text, and LaTeX math.
  • 🛠️ Custom tool call support.
  • ⚡ Keyboard shortcuts for quick navigation.
  • 🗃️ Built-in model & download manager.
  • 🔌 Offline & PWA support.
  • 🕊️ 100% Free & Open-Source.

Setting Up

A guide for setup is included on the site. We've tried to make setup as smooth and straightforward as possible, letting you configure once and immediately start chatting any time Ollama is running.

Once set up, you can start chatting. All chats are stored locally in your browser, giving you complete privacy and near-instant chat load times.

Running Locally

If you instead want to contribute/run a development server, check out the contribution guide.

Running locally is made as straightforward as possible. There are two ways of getting a local LlamaPen instance:

Docker (recommended)

This route assumes you have Docker installed on your system.

Pull the image:

docker pull ghcr.io/imdarktom/llamapen:latest

Run the image:

docker run -d -p 8080:80 --name llamapen --restart unless-stopped ghcr.io/imdarktom/llamapen:latest

You can swap the 8080 in the arguments for any port you want LlamaPen to run on.

This creates a container that starts with your computer and makes LlamaPen accessible on localhost at the specified port.
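If you prefer Docker Compose, the `docker run` command above can be expressed as a compose file. This is a sketch based on that command only — the file layout and service name are my own, not from the project:

```yaml
# docker-compose.yml (hypothetical; mirrors the docker run command above)
services:
  llamapen:
    image: ghcr.io/imdarktom/llamapen:latest
    container_name: llamapen
    restart: unless-stopped
    ports:
      - "8080:80"   # host:container — change 8080 to any port you prefer
```

Start it with `docker compose up -d`; `restart: unless-stopped` gives the same start-on-boot behavior as the `--restart` flag above.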

Manually

You may run the app manually, without Docker, by installing and running it through Bun. This is slightly less preferable, as you might encounter issues due to differences in package or tool versions.

Make sure you have Git and Bun installed.

1. Clone

git clone https://github.com/ImDarkTom/LlamaPen.git
cd LlamaPen

2. Install dependencies

bun install

3. Run

To run a local server:

bun run local

LlamaPen Cloud

If you are using the official site (https://llamapen.app), you may choose to enable LlamaPen Cloud. LlamaPen Cloud is an optional service that lets you run the most powerful versions of the latest models using a cloud provider if you are not able to run them locally. While LlamaPen is free and open-source, LlamaPen Cloud offers an optional subscription for increasing rate limits and accessing more expensive models.

For security reasons, LlamaPen Cloud is not open-source; however, we strive to ensure your privacy (as outlined in the Cloud service privacy policy). The only time we have access to your chats is if you explicitly enable LlamaPen Cloud in the settings and send chat requests using one of the provided models. No data is ever sent to LlamaPen Cloud if you do not enable it in the settings.

Donating

Funding to help development is always appreciated, whether that comes through purchasing a subscription on LlamaPen API or donating directly.

Buy Me A Coffee

Licenses & Attribution

LlamaPen is licensed under AGPL-3.0.
