• khalid_salad@awful.systems
    16 days ago

    Well, two responses I have seen to the claim that LLMs are not reasoning are:

    1. we are all just stochastic parrots lmao
    2. maybe intelligence is an emergent ability that will show up eventually (disregarding that this is unfalsifiable, and that our definition of “emergent” is categorical nonsense).

    So I think this research is useful as a response to these, although I think “fuck off, promptfondler” is pretty good too.

      • FermiEstimate@lemmy.dbzer0.com
        16 days ago

        No, there’s an actual paper where that term originated that goes into great detail explaining what it means and what it applies to. It answers those questions and addresses the objections people might raise.

        There’s no need for–and, frankly, nothing interesting about–“but, what is truth, really?” vibes-based takes on the term.

      • V0ldek@awful.systems
        15 days ago

        Only in the philosophical sense of all of physics being a giant stochastic system.

        But that’s equally useful as saying that we’re Turing machines? Yes, if you draw a broad category of “all things that compute in our universe” then you can make a reasonable (but disputable!) argument that both me and a Python interpreter are in the same category of things. That doesn’t mean that a Python interpreter is smart/sentient/will solve climate change/whatever Sammy Boi wants to claim this week.

        Or, to use a different analogy, it’s like saying “we’re all just cosmic energy, bro”. Yes we are, pass the joint already and stop trying to raise billions of dollars for your energy woodchipper.