• mindbleach@sh.itjust.works
    18 days ago

    LLMs are fundamentally not designed to be reliable. That is not how they work. That is not the function they optimize. That is not the problem they solve.
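
    For what it's worth, the standard training objective looks roughly like this (a minimal sketch in PyTorch; the function and variable names here are illustrative, not from any particular model). It rewards predicting whatever token actually came next in the training text, and nothing else: there is no term for truth or reliability.

```python
import torch
import torch.nn.functional as F

def next_token_loss(logits: torch.Tensor, token_ids: torch.Tensor) -> torch.Tensor:
    # logits:    (batch, seq_len, vocab_size) raw model outputs
    # token_ids: (batch, seq_len) the tokenized training text
    pred = logits[:, :-1, :].reshape(-1, logits.size(-1))  # prediction at each position
    target = token_ids[:, 1:].reshape(-1)                  # the token that actually followed
    # Cross-entropy scores next-token likelihood against the training data;
    # nothing in it measures whether the resulting statement is correct.
    return F.cross_entropy(pred, target)

# toy usage with made-up shapes
logits = torch.randn(2, 8, 50000)
token_ids = torch.randint(0, 50000, (2, 8))
loss = next_token_loss(logits, token_ids)
```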

    You’re making a fish climb trees.