this post was submitted on 13 Jul 2023

Free Open-Source Artificial Intelligence


I successfully installed oobabooga (text-generation-webui) on a PC, and using only the CPU I can run Vicuna and WizardLM models. However, I can't run any LLaMA, OPT, or GPT-J models, which appear to be required when trying to train a LoRA. Do you have any suggestions?
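For context on what the LoRA training the OP mentions actually does: instead of updating a full weight matrix W, LoRA learns two small low-rank matrices A and B and merges them as W' = W + (alpha / r) · B·A. The following is a toy pure-Python sketch of that merge step with made-up numbers, not oobabooga's or PEFT's actual implementation:

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of lists."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_merge(W, A, B, alpha, r):
    """Merge a rank-r LoRA update into the frozen base weights W.

    delta = B @ A is a full m x n matrix, but it is parameterised by
    only (m*r + r*n) trainable values instead of m*n.
    """
    delta = matmul(B, A)
    scale = alpha / r
    return [[w + scale * d for w, d in zip(wrow, drow)]
            for wrow, drow in zip(W, delta)]

# Toy example: 2x2 identity base weights, rank r = 1 update.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 2.0]]          # r x n = 1 x 2
B = [[0.5], [0.25]]       # m x r = 2 x 1
print(lora_merge(W, A, B, alpha=2, r=1))  # → [[2.0, 2.0], [0.5, 2.0]]
```

This is why the web UI restricts LoRA training to architectures its training backend knows how to hook the A/B adapters into (LLaMA, OPT, GPT-J, etc.) rather than every model it can run inference on.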

abhibeckert@lemmy.world 2 points 2 years ago (last edited 2 years ago)

While it won't help OP, there are GPUs with plenty of VRAM that lack CUDA support.

Apple sells machines with up to 192 GB of unified memory usable as VRAM, for example... and as a Mac user I'm pretty frustrated by all the dependencies on CUDA. The few tools that don't require it run great on my Mac (Stable Diffusion, for example, runs in about 30 seconds on my laptop with typical settings).
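The CUDA-free path the commenter describes usually means PyTorch's MPS (Metal) backend on Apple Silicon. A minimal sketch of how a tool can fall back gracefully, assuming PyTorch is installed (`pick_device` is a hypothetical helper name, not part of any library):

```python
import torch

def pick_device():
    # Prefer NVIDIA CUDA, then Apple's Metal (MPS) backend, else the CPU.
    if torch.cuda.is_available():
        return torch.device("cuda")
    # torch.backends.mps only exists on builds with MPS support.
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
print(device)  # e.g. "mps" on an Apple Silicon Mac
```

Tools hard-coded to `torch.device("cuda")` are exactly the ones that break on Macs; a check like this is all it takes to run on whichever backend is present.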