ClassicConnect
"640k ought to be enough for everybody."
 

Easy LLM with Ollama

 
ClassicConnect Forum Index -> Artificial Intelligence
Sashi
Admin


Joined: 03 Sep 2023
Posts: 70

PostPosted: Thu May 08, 2025 11:15 pm    Post subject: Easy LLM with Ollama

Want to install an LLM in a snap? Ollama is pretty solid!

First we need to download and install Ollama.
You can get it here:

Mac - https://ollama.com/download/Ollama-darwin.zip

Windows - https://ollama.com/download/OllamaSetup.exe

Linux
Code:
curl -fsSL https://ollama.com/install.sh | sh


Once Ollama is installed, you'll need a model to run. Head here to see the list of available models:
https://ollama.com/search

For this example I'm going to use llama3.3.
To pull the model, open up your terminal and type:
Code:
ollama pull llama3.3

The model will download and once complete it will be ready to run.
To run the model type:
Code:
ollama run llama3.3

The model will load, and it's then ready for your input!
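Side note (my own addition, not part of the original steps): while a model is running, Ollama also serves a local REST API on http://localhost:11434, so you can script against it with nothing but the Python standard library. The `build_payload` and `generate` helper names below are just mine; the `/api/generate` endpoint and its `model`/`prompt`/`stream` fields are Ollama's.

```python
# Minimal sketch of calling Ollama's local REST API with only the stdlib.
import json
import urllib.request

def build_payload(model, prompt):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt):
    """POST a prompt to the local Ollama server and return the reply text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# To actually run it (requires `ollama run llama3.3` or `ollama serve` running):
#   print(generate("llama3.3", "Why is the sky blue?"))
```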

Other cool stuff for Ollama:
If you want to write scripts that invoke Ollama, visit one of these GitHub repos and choose between Python and JavaScript!

Python - https://github.com/ollama/ollama-python

JavaScript - https://github.com/ollama/ollama-js

Be sure to read the GitHub README for documentation, and check the examples directory for some good usage examples.
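To give a taste of the Python client, here's a minimal sketch (assumes `pip install ollama`, a running Ollama server, and llama3.3 already pulled — and the `make_messages`/`ask` helper names are just mine, while `ollama.chat` with a `messages` list is the library's actual call):

```python
def make_messages(question):
    """Wrap a single user question in the chat-style messages format."""
    return [{"role": "user", "content": question}]

def ask(model, question):
    """Send one chat message to a local Ollama server, return the reply text."""
    import ollama  # imported inside so the sketch is easy to lift into a script
    response = ollama.chat(model=model, messages=make_messages(question))
    return response["message"]["content"]

# Example (requires the server):
#   print(ask("llama3.3", "Summarize Hamlet in one sentence."))
```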

What about a web frontend?

To run Ollama with Open WebUI (similar to OpenAI's web interface), visit https://docs.openwebui.com/getting-started/quick-start

This will require you to be familiar with Docker.


Have fun!
_________________
I do stuff with things for reasons.
Fuck it, test in prod.
LyraNovaHeart
Gorts


Joined: 15 Apr 2025
Age: 27
Posts: 48
Location: Los Angeles, California

PostPosted: Thu May 08, 2025 11:19 pm    Post subject:

While I do appreciate the posts, the reason Ollama isn't recommended is that it uses a non-standard API (i.e. it's not OpenAI-compatible), and Ollama is just a fork of llama.cpp with fewer features than KoboldCpp, yet the two are at a similar level of ease of use.
_________________
I'm one day closer to being who I wanna be~
Sashi
Admin


Joined: 03 Sep 2023
Posts: 70

PostPosted: Thu May 08, 2025 11:24 pm    Post subject:

For those who don't want to be in the OpenAI sphere, I feel this is a decent alternative. Especially for those who just want to play around with AI and not do anything too in-depth.
_________________
I do stuff with things for reasons.
Fuck it, test in prod.
smartDark Style by Smartor
Powered by phpBB 2.0.25 CC Mod © 2001, 2002 phpBB Group
 