• zygo_histo_morpheus@programming.dev
    10 months ago

    The article mentions AI. 16 GB feels far too little to run an LLM of respectable size, so I wonder what exactly this means. It feels like no one is going to be happy with an LLM squeezed into 16 GB: high RAM usage and still weak AI features.
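
    For a rough sense of scale (my own back-of-the-envelope numbers, not anything from the article): weight memory is roughly parameter count times bytes per parameter, so something like this quick sketch, with made-up model sizes and common quantization levels:

    ```python
    # Rough back-of-the-envelope: weight memory for an LLM at different precisions.
    # Parameter counts and quantization levels are illustrative assumptions, not
    # figures from the article; KV cache, activations, and OS overhead are ignored.

    PARAM_COUNTS = {"7B": 7e9, "13B": 13e9, "70B": 70e9}       # hypothetical sizes
    BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}  # common precisions

    def weight_gib(params: float, bytes_per_param: float) -> float:
        """Approximate weight memory in GiB."""
        return params * bytes_per_param / 2**30

    for name, params in PARAM_COUNTS.items():
        row = ", ".join(f"{prec}: {weight_gib(params, b):.1f} GiB"
                        for prec, b in BYTES_PER_PARAM.items())
        print(f"{name}: {row}")
    ```

    Even a 7B model at fp16 is around 13 GiB of weights before you count the OS, browser, and the model's working memory, so on a shared 16 GB machine you're realistically limited to small, heavily quantized models.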