• pirat@lemmy.ml
    6 days ago

    Neat!

    I’ve been running Garuda on my main rig for a minute. I figured everything would be fine, but some of my music production software has been slow to catch up on updates in the AUR compared to the official .deb releases (and I haven’t tinkered enough to just make that work myself).

    Being able to install .deb packages out of the box seems nice. I was planning on running it on a new Framework 12 laptop (which I dream of getting as a new performance rig for my music), but I may install it on my current performance rig to see how it runs.

    How well does it play with Nvidia? If it’s all good and I eventually switch on my main rig, I’d love to be able to run a local AI with GPU support. I know that for Nvidia I need drivers that support CUDA.

    • marcie (she/her)@lemmy.mlOP
      6 days ago

      The Nvidia open-source drivers are working pretty well; I have no complaints. Local AI stuff can be a little annoying to set up as a beginner, I bet, but if you run it through llama.cpp it’s smooth sailing. I recommend something like StabilityMatrix (AppImage) if you have no clue what’s going on.
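
      If you want a feel for the llama.cpp route, here’s a minimal sketch using the llama-cpp-python bindings; the model path is just a placeholder, and it assumes a CUDA-enabled build plus a GGUF model you’ve already downloaded:

      ```python
      # Minimal llama.cpp example via the llama-cpp-python bindings.
      # Assumes: pip install llama-cpp-python (built with CUDA support)
      # and a GGUF model file on disk (the path below is a placeholder).
      from llama_cpp import Llama

      llm = Llama(
          model_path="models/your-model.gguf",  # placeholder path to your model
          n_gpu_layers=-1,                      # offload all layers to the Nvidia GPU
      )

      out = llm("Q: Name three Linux distros. A:", max_tokens=64)
      print(out["choices"][0]["text"])
      ```

      The same script runs CPU-only if you set n_gpu_layers=0, which is handy for checking that the model loads before you worry about drivers.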

      • pirat@lemmy.ml
        4 days ago

        I’ve done a fair bit of tinkering so I’m sure I can get it to work.