ClassicConnect
"640k ought to be enough for everybody."
 
FAQ :: Search :: Memberlist :: Usergroups :: Register
Profile :: Log in to check your private messages :: Log in

Can I run an LLM from my Windows 7 Win-To-Go setup?

 
Post new topic   Reply to topic    ClassicConnect Forum Index -> Artificial Intelligence
View previous topic :: View next topic  
Author Message
PatoFlamejanteTV
New Member


Joined: 08 May 2026
Posts: 1

PostPosted: Fri May 08, 2026 1:36 pm    Post subject: Can I run an LLM from my Windows 7 Win-To-Go setup? Reply with quote

Currently I want to test Ollama on it, but it's installed on an SD card, and its read speed is around ~20 MB/s (tested by transferring a large file from my Linux setup to it).

It has 32 GB of storage, but Windows only used ~7 GB, so there's still plenty of space left.

Is it possible to use Ollama on it? Or would I need some software to put the executable on the PATH before using it?
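For reference, Ollama reads the OLLAMA_MODELS environment variable to decide where it stores its model blobs, so no extra software should be needed. A cmd.exe sketch, where the E: drive letter and folder names are assumptions for illustration:

```bat
:: Persist the model directory on the SD card (E:\ollama\models is an assumed path)
setx OLLAMA_MODELS "E:\ollama\models"

:: Put the Ollama install folder on PATH for the current session only
set "PATH=%PATH%;E:\ollama"

:: New terminals pick up OLLAMA_MODELS; then pull a small model
ollama pull llama3.2:3b
```

`setx` writes the variable for future sessions; `set` only affects the current one.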
Back to top
View user's profile Send private message Send e-mail
claraberry
New Member


Joined: 19 Jan 2026
Age: 19
Posts: 6

PostPosted: Fri May 08, 2026 1:45 pm    Post subject: Re: Can I run an LLM from my Windows 7 Win-To-Go setup? Reply with quote

PatoFlamejanteTV wrote:
Currently I want to test Ollama on it, but it's installed on an SD card, and its read speed is around ~20 MB/s (tested by transferring a large file from my Linux setup to it).

It has 32 GB of storage, but Windows only used ~7 GB, so there's still plenty of space left.

Is it possible to use Ollama on it? Or would I need some software to put the executable on the PATH before using it?


You'll probably get abysmal loading times, but there's nothing inherently stopping you if you have the hardware.
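To put a rough number on "abysmal": at ~20 MB/s, just streaming a model file off the SD card takes minutes. A back-of-envelope sketch, where the ~2 GB figure is an assumed size for a small Q4-quantized model:

```python
# Rough load-time estimate: model file size divided by sequential read speed.
# Ignores filesystem overhead, so real-world times will be somewhat worse.

def load_time_seconds(model_size_gb: float, read_mb_per_s: float) -> float:
    """Seconds to stream a model of model_size_gb at read_mb_per_s."""
    return model_size_gb * 1024 / read_mb_per_s

# ~2 GB Q4 model (assumed) over the OP's measured ~20 MB/s SD card:
print(round(load_time_seconds(2.0, 20)))  # about 102 seconds just to read the file
```

After the first load the OS file cache can hide most of this on subsequent runs, RAM permitting.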
_________________
All operating systems suck equally, it is up to you to decide which you get along with the most
Back to top
View user's profile Send private message
LyraNovaHeart
Gorts


Joined: 15 Apr 2025
Age: 27
Posts: 50
Location: Los Angeles, California

PostPosted: Fri May 08, 2026 3:34 pm    Post subject: Reply with quote

The main limit here would be software support. Barely any LLM software supports Windows 7; the only one I know of would be Kobold CPP in the old CPU mode. Not to mention, you need a fairly good CPU to get any usable T/s out of it. An i7-3612QM, for example, will run Llama 3.2 3B @ Q4 at around 8 T/s. MoEs will certainly help, but I wouldn't expect too much out of this. If you do have something workable, it'll take a long time to load, but once it's in RAM it should be somewhat faster.
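That ~8 T/s figure lines up with the usual back-of-envelope: CPU token generation is memory-bandwidth-bound, since every token reads roughly all the weights once, so tokens/s is capped near RAM bandwidth divided by model size. A sketch with assumed numbers (~20 GB/s for dual-channel DDR3 on an Ivy Bridge laptop, ~2 GB for a 3B Q4 model):

```python
# Memory-bandwidth ceiling for CPU token generation:
# each generated token requires streaming (roughly) all model weights from RAM.

def tokens_per_second(ram_bw_gb_s: float, model_size_gb: float) -> float:
    """Upper-bound tokens/s when generation is bandwidth-bound."""
    return ram_bw_gb_s / model_size_gb

# Assumed: ~20 GB/s dual-channel DDR3, ~2 GB Q4 3B model.
print(tokens_per_second(20.0, 2.0))  # ceiling of ~10 T/s; ~8 T/s observed is plausible
```

Compute overhead and cache effects keep real throughput a bit under this ceiling, which is why the observed number sits below the estimate.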
_________________
I'm one day closer to being who I wanna be~
Back to top
View user's profile Send private message Visit poster's website
Display posts from previous:   
Post new topic   Reply to topic    ClassicConnect Forum Index -> Artificial Intelligence All times are GMT
Page 1 of 1

 
Jump to:  
You cannot post new topics in this forum
You cannot reply to topics in this forum
You cannot edit your posts in this forum
You cannot delete your posts in this forum
You cannot vote in polls in this forum



smartDark Style by Smartor
Powered by phpBB 2.0.25 CC Mod © 2001, 2002 phpBB Group
 
Page generation time: 0.0391s (PHP: 97% - SQL: 3%) - SQL queries: 15 - GZIP enabled - Debug on