• peanuts4life@beehaw.org
    9 months ago

    I’ve seen this with GPT-4. If I ask it to proofread text that contains errors, it consistently does a great job, but if I prompt it to proofread text that has no errors, it hallucinates errors anyway. It’s funny to see Microsoft running into the same issue.
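
    For context, here is a minimal sketch of the kind of proofreading prompt I mean. The model name, system prompt wording, and client setup are illustrative assumptions, not the exact calls I used:

    ```python
    # Minimal sketch of the proofreading prompt pattern described above.
    # Model name and prompt wording are illustrative, not the exact setup.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def proofread(text: str) -> str:
        """Ask the model to proofread a passage and report any errors."""
        response = client.chat.completions.create(
            model="gpt-4",  # illustrative model name
            messages=[
                {
                    "role": "system",
                    "content": (
                        "You are a proofreader. List any spelling or grammar "
                        "errors in the user's text. If there are none, say "
                        "'No errors found.'"
                    ),
                },
                {"role": "user", "content": text},
            ],
        )
        return response.choices[0].message.content

    # With an already error-free passage, the model often still "finds"
    # errors, which is the hallucination behavior described above.
    print(proofread("The quick brown fox jumps over the lazy dog."))
    ```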