• Lucy :3@feddit.org · 8 hours ago

    If you have a decent GPU or CPU, you can just set up ollama (the ollama-cuda or ollama-rocm package, depending on whether you have an NVIDIA or AMD card) and run llama3.1 or llama3.1-uncensored.
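
    In case it helps, here's a minimal sketch of querying the local Ollama HTTP API from Python once the package is installed. It assumes `ollama serve` is running on the default port 11434 and that you've already pulled the model with `ollama pull llama3.1`; the prompt is just a placeholder:

    ```python
    # Minimal sketch: send a prompt to a locally running Ollama instance.
    # Assumes `ollama serve` is up on the default port 11434 and the
    # llama3.1 model has been pulled already.
    import json
    import urllib.request

    def ask(prompt: str, model: str = "llama3.1") -> str:
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # get the whole answer as one JSON object
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask("Explain what ROCm is in one sentence."))
    ```

    Or just use `ollama run llama3.1` in a terminal if you only want an interactive chat.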