Hi there, I want to share some thoughts and want to hear your opinions on it.

Recently, AI development has been booming, and game development is no exception. For example, NVIDIA ACE could enable NPCs that run an AI model to converse with players. There is also work on an AI-based alternative to ray tracing, where lighting, shadows and reflections are generated by a model, which would need less performance while achieving similar visual results.

So it seems like raster performance is already at a pretty decent level, and graphics card manufacturers are increasingly putting AI processors on their cards.

In my eyes, the next logical step would be to separate the graphics card's work, rasterisation and ray tracing, from AI. The result could be a new kind of PCIe card, an AI accelerator, featuring a processor optimised for parallel processing and high data throughput.

This would allow developers to run more advanced AI models on the consumer's PC. For compatibility, they could e.g. offer a cloud-based subscription system.

So what are your thoughts on this?

  • maynarkh@feddit.nl
    1 year ago

Good question, but I’d say that the same train of thought went into dedicated physics cards. I’d guess an AI card would need a great value proposition to be worth buying.

> For compatibility, they could e.g. offer a cloud based subscription system.

I’m not sure where you’re going with this, but it feels wrong. I’m not buying a hardware part that cannot function without a constant internet connection or a recurring payment.

    • Port8080@feddit.deOP
      1 year ago

> I’d guess that an AI card should have a great value proposition to be worth buying.

Sure, the card would need to offer great value or at least have an accessible price. It probably also depends on how “heavy” the tasks get. But seeing e.g. OpenAI struggling with request volume, it may be useful to decentralise the processing by running the model locally on the user’s PC.

> I’m not sure where you’re going with this, but it feels wrong. I’m not buying a hardware part that cannot function without a constant internet connection or regular payment.

Maybe this statement was a bit confusing. What I meant was that, in a transition phase, developers could allow the use of a dedicated accelerator card to run everything locally and offline. And for people who don’t have, or don’t want, such a card, they could provide a cloud-based subscription model where the processing is done on remote servers.
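
      To make the fallback idea concrete, it could be sketched roughly like this. All names here (`LocalAccelerator`, `CloudSession`, `npc_reply`) are hypothetical, invented just to illustrate the flow; no such API exists today:

```python
class LocalAccelerator:
    """Stands in for a dedicated AI card exposed to the game (hypothetical)."""
    def __init__(self, available: bool):
        self.available = available

    def infer(self, prompt: str) -> str:
        # In reality this would run the model on the card, offline.
        return "[local] reply to: " + prompt


class CloudSession:
    """Stands in for a subscription-backed remote inference service (hypothetical)."""
    def infer(self, prompt: str) -> str:
        # In reality this would call out to the developer's servers.
        return "[cloud] reply to: " + prompt


def npc_reply(prompt: str, card: LocalAccelerator, cloud: CloudSession) -> str:
    # Prefer the local card so the game works offline;
    # fall back to the cloud subscription only when no card is present.
    if card.available:
        return card.infer(prompt)
    return cloud.infer(prompt)
```

      The point of the sketch is only the branching: owners of the card get local, offline processing, while everyone else transparently uses the remote service.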