• ShadowRam@fedia.io

    It’s not all hype.

nVidia has put some SERIOUS R&D into AI over the past 10 years.

But using AI in the graphics space… upscaling, downscaling, faking lighting, faking physics… this is all very useful for making video games.

Then there was a leap in the way AI image generation was done with that same hardware, and it opened up a whole new growing field.

It’s just that some people took basic language models that have been around for 30 years and scaled them up with that hardware. And some of the stuff an LLM would output was neat and surprising. But not reliable.

And then suddenly a lot of laymen got their hands on LLMs and thought it was the 2nd coming of Jesus, and started throwing big money at it… it will be a surprise to no one who knows how these AIs work that that big money isn’t going anywhere.

But those first two are no hype. They’re real, viable use cases for AI, and money will be made there.