Hello internet users. I have tried GPT4All and like it, but it is very slow on my laptop. I was wondering if anyone here knows of any solutions I could run on my server (Debian 12, AMD CPU, Intel A380 GPU) through a web interface. Has anyone found a good way to do this?

  • GoogleyWoog@lemmy.ml · 9 months ago

    I use KoboldCPP; it works perfectly over the internet. I’m not sure how good Intel support is, though, and with 6GB of VRAM the whole thing is of questionable utility: the smallest models I’ve found worth using need around 16GB. With 24GB of VRAM you can start to approach ChatGPT 3.5 quality.
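
    If you do go the KoboldCPP route, querying it from another machine is just an HTTP POST. Here’s a minimal sketch in Python, assuming KoboldCPP’s KoboldAI-compatible endpoint at /api/v1/generate; the host, port, and sampling parameters are placeholders you’d swap for your own setup.

    ```python
    import requests

    # Placeholder address for the KoboldCPP server on your LAN/VPN.
    # The server is typically started with something like:
    #   python koboldcpp.py --model model.gguf --port 5001
    API_URL = "http://192.168.1.50:5001/api/v1/generate"

    payload = {
        "prompt": "Write a haiku about home servers.",
        "max_length": 80,     # tokens to generate
        "temperature": 0.7,   # sampling temperature
    }

    resp = requests.post(API_URL, json=payload, timeout=120)
    resp.raise_for_status()

    # The KoboldAI-style API returns generated text under results[0]["text"].
    print(resp.json()["results"][0]["text"])
    ```

    Same idea works from any language that can make HTTP requests, which is what makes it convenient to put behind a reverse proxy or reach over a VPN.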