If you just want to run LLMs quickly on your computer from the command line, this is about as simple as it gets. Ollama provides an easy CLI for generating text, and there’s also a Raycast extension for more powerful usage.
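For anyone who hasn’t tried it, a typical session looks something like this (model names are examples; check Ollama’s library for what’s currently available):

```shell
# Pull a model from the Ollama library (one-time download)
ollama pull mistral

# Start an interactive chat session in the terminal
ollama run mistral

# Or generate a one-off completion non-interactively
ollama run mistral "Explain what a Makefile is in one sentence."

# List models you have downloaded locally
ollama list
```

Once a model is pulled, everything runs locally, so no API key or internet connection is needed for generation.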

  • popcar2@programming.dev · 1 year ago

    There’s also GPT4All, which has the same concept but comes with a convenient GUI rather than running on the command line. I had some fun with Mistral-7B, but honestly the weaker models are too dumb to be useful.