• BlueLineBae@midwest.social (+111) · 5 months ago

      I would already like to buy a 4k TV that isn’t smart and have yet to find it. Please don’t add AI into the mix as well :(

      • NotNotMike@programming.dev (+12) · 5 months ago

        I was just thinking the other day how I’d love to “root” my TV like I used to root my phones. Maybe install some free OS instead

      • Artyom@lemm.ee (+7/-1) · 5 months ago

        All TVs are dumb TVs if they have no internet access

      • MrQuallzin@lemmy.world (+6) · 5 months ago

        We got a Sceptre brand TV from Walmart a few years ago that does the trick. 4k, 50 inch, no smart features.

        • Appoxo@lemmy.dbzer0.com (+5) · 5 months ago

          Still slow UI.
          If only signage displays had the fidelity of a regular consumer OLED without the business-usage tax on top.

          • Dandroid@sh.itjust.works (+1) · 5 months ago

            What do you use the UI for? I just turn my TV on and off. No user interface needed. Only a power button on the remote.

            • Appoxo@lemmy.dbzer0.com (+1) · 5 months ago

              Even switching to another input right after boot (because power-on can’t be called a simple power-on anymore), the TV is slow.
              I recently had the pleasure of interacting with a TV from ~2017 or 2018. God, was it slow, especially loading native apps (a Samsung 50"-ish TV).

              I like my Chromecast. At least that was properly specced. Now if only HDMI and CEC would work like I’d like them to :|

      • AVincentInSpace@pawb.social (+2) · 5 months ago

        Signage TVs are good for this. They’re designed to run 24/7 in store windows displaying advertisements or animated menus, so they’re a bit pricey, and don’t expect fancy features like HDR, but they have no smarts whatsoever. What they do have is a slot you can shove your own smart gadget into, with a connector that breaks out power, HDMI, etc. Someone has made a Raspberry Pi Compute Module carrier board for it, so if you’re into, say, Jellyfin, you can make the TV smart completely under your own control with e.g. LibreELEC. Here’s a video from Jeff Geerling going into more detail: https://youtu.be/-epPf7D8oMk

        Alternatively, if you want HDR and high refresh rates, you’re okay with a smallish TV, and you’re really willing to splash out, ASUS ROG makes 48" 4K 10-bit gaming monitors for around $1700 US. HDMI is HDMI; you can plug whatever you want into them.

      • onlinepersona@programming.dev (+1/-5) · 5 months ago

        I don’t have a TV, but doesn’t a smart TV require internet access? Why not just… not give it internet access? Or do they come with their own mobile data plans now meaning you can’t even turn off the internet access?


    • CrystalRainwater@lemmy.blahaj.zone (+1) · 5 months ago

      Right now it’s easier to find projectors without a smart OS. Before long, though, it’s going to be harder to find ones without a smart OS and AI upscaling.

  • rtxn@lemmy.world (+81) · 5 months ago

    The dedicated TPM chip is already being used for side-channel attacks. A new processor running arbitrary code would be a black hat’s wet dream.

    • MajorHavoc@programming.dev (+51) · 5 months ago

      It will be.

      IoT devices are already getting owned at staggering rates. Adding a learning model that currently cannot be secured is absolutely going to happen, and it’s going to cause a whole new batch of breaches.

      • rtxn@lemmy.world (+19) · 5 months ago

        TPM-FAIL from 2019. It affects Intel fTPM and some dedicated TPM chips: link

        The latest (at the moment) UEFI vulnerability, UEFIcanhazbufferoverflow is also related to, but not directly caused by, TPM on Intel systems: link

        • barsquid@lemmy.world (+3) · 5 months ago

          That’s insane. How can they be doing security hardware and leave a timing attack in there?

          Thank you for those links, really interesting stuff.
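          The TPM-FAIL result mentioned above was a timing side channel. As a toy illustration of why variable-time code leaks secrets (the token values here are made up, and TPM-FAIL itself was in elliptic-curve signing, not byte comparison):

```python
# Toy illustration of a timing side channel: a naive comparison
# exits at the first mismatching byte, so its runtime leaks how much
# of the secret a guess got right. hmac.compare_digest avoids that.
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:          # early exit: runtime depends on the data
            return False
    return True

secret = b"s3cret-token"    # made-up value for illustration
assert naive_equal(secret, b"s3cret-token")
assert not naive_equal(secret, b"guess-guess!")
# Constant-time comparison from the standard library:
assert hmac.compare_digest(secret, b"s3cret-token")
print("comparison demo ok")
```

          By measuring many runs of the naive version, an attacker can recover the secret one byte at a time, which is why "security hardware with a timing attack in it" is such a serious flub.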

        • Blue_Morpho@lemmy.world (+1) · 5 months ago

          A processor that isn’t Turing complete isn’t a security problem like the TPM you referenced. A TPM includes a CPU; if a processor is Turing complete, it’s called a CPU.

          Is this one Turing complete? I don’t know. I haven’t seen block diagrams showing that the computational units have their own CPU.

          CPUs also have coprocessors to speed up floating-point operations. That doesn’t necessarily make them a security problem.

  • NounsAndWords@lemmy.world (+68) · 5 months ago

    I would pay for AI-enhanced hardware…but I haven’t yet seen anything that AI is enhancing, just an emerging product being tacked onto everything they can for an added premium.

    • DerisionConsulting@lemmy.ca (+27) · 5 months ago

      In the 2010s, it was cramming a phone app and wifi into things to try to justify the higher price, while also spying on users in new ways. The device might even have a screen for basically no reason.
      In the 2020s, it’s those same useless features, now with a bit of software with a flashy name that removes even more control from the user and lets the manufacturer spy on the user even further.

    • Fermion@feddit.nl (+19) · 5 months ago

      It’s like rgb all over again.

      At least rgb didn’t make a giant stock market bubble…

    • aname@lemmy.one (+9) · 5 months ago

      My Samsung A71 has had devil AI since day one. You know that feature where you can mostly use fingerprint unlock, but then once a day or so it asks for the actual passcode for added security? My A71’s AI has a 100% success rate of picking the most inconvenient time to ask for the passcode instead of letting me do my thing.

    • PriorityMotif@lemmy.world (+3) · 5 months ago

      Already had that Google thingy for years now. The USB/M.2 device for image recognition. Couldn’t remember what it was called. Cost like $30.

      Edit: Google Coral TPU

            • lmaydev@lemmy.world (+2) · 5 months ago

              It’s hardware specifically designed for running AI tasks. Like neural networks.

              An NPU, or Neural Processing Unit, is a dedicated processor or processing unit on a larger SoC designed specifically for accelerating neural network operations and AI tasks. Unlike general-purpose CPUs and GPUs, NPUs are optimized for data-driven parallel computing, making them highly efficient at processing massive amounts of multimedia data like video and images and at running neural-network workloads.
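              As a toy illustration of the workload described above: neural-network inference is dominated by multiply-accumulate operations like the dense layer below, which is exactly what an NPU runs in massive parallel (pure-Python sketch with made-up numbers):

```python
# Toy sketch of the multiply-accumulate workload an NPU accelerates.
# A dense neural-network layer is y = W.x + b: one multiply-add per
# weight. NPUs run huge numbers of these in parallel, often in int8.

def dense_layer(W, x, b):
    """Naive dense layer: dot product of each weight row with x, plus bias."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

W = [[1, 2], [3, 4]]   # toy 2x2 weight matrix
x = [5, 6]             # input vector
b = [0.5, -0.5]        # bias

print(dense_layer(W, x, b))  # [17.5, 38.5]
```

              A real model chains millions of these per frame, which is why dedicated silicon beats doing it on a general-purpose CPU.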

        • lmaydev@lemmy.world (+2) · 5 months ago

          An NPU, or Neural Processing Unit, is a dedicated processor or processing unit on a larger SoC designed specifically for accelerating neural network operations and AI tasks.

          Exactly what we are talking about.

        • lmaydev@lemmy.world (+1) · 5 months ago

          I’m a programmer so when learning a new framework or library I use it as an interactive docs that allows follow up questions.

          I also use it to generate things like regex and SQL queries.

          It’s also really good at refactoring code and other repetitive tasks like that.
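          A minimal sketch of that regex workflow: take whatever pattern the model suggested and pin its behaviour down against known cases before trusting it (the ISO-date pattern here just stands in for assumed model output):

```python
# Sketch: sanity-checking an LLM-suggested regex before trusting it.
# The ISO-date pattern below stands in for whatever the model returned;
# the point is to test it against known good and bad inputs first.
import re

iso_date = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # assumed model output

samples = {
    "2024-07-01": True,    # well-formed ISO date
    "07/01/2024": False,   # wrong separators/order
    "2024-7-1": False,     # missing zero padding
}
for text, expected in samples.items():
    assert bool(iso_date.match(text)) is expected

print("regex behaves as expected on all samples")
```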

          • Nachorella@lemmy.sdf.org (+2) · 5 months ago

            It does seem like a good translator for less human-readable stuff like regex. I’ve dabbled with it a bit, but I’m a technical artist and haven’t found much use for it in the things I do.

    • flicker@lemmy.world (+7) · 5 months ago

      I figure they’re those “early adopters” who buy the New Thing! as soon as it comes out, whether they need it or not, whether it’s garbage or not, because they want to be seen as on the cutting edge of technology.

    • Appoxo@lemmy.dbzer0.com (+5) · 5 months ago

      Ray tracing is something I’d pay for even unasked, assuming it meaningfully impacts quality and doesn’t demand outlandish prices.
      And they’d need to put it in unasked and cooperate with devs, or else it won’t catch on quickly enough.
      Remember Nvidia Ansel?

        • alessandro@lemmy.ca (+1) · 5 months ago

          yeah but it didn’t try to lock him into a subscription plan or software ecosystem

          Not the AI’s fault: the first one (the killer) was remotely controlled by the product of a big corp (Skynet); the other was a local, offline one.

          Moral of the story: there’s a difference between the AI that runs locally on your GPU and the one that runs on Elon’s remote servers… and that difference may be life or death.

  • Nora@lemmy.ml (+30) · 5 months ago

    I was recently looking for a new laptop and I actively avoided laptops with AI features.

    • lamabop@lemmings.world (+18) · 5 months ago

      Look, me too, but the average punter on the street just looks at the new AI features and goes, “OK, sure, give it to me.” Tell them about the dodgy shit that goes with AI and you’ll probably get a shrug at most.

  • cygnus@lemmy.ca (+24) · 5 months ago

    The biggest surprise here is that as many as 16% are willing to pay more…

    • ShinkanTrain@lemmy.ml (+4/-1) · 5 months ago

      I mean, if framegen and supersampling solutions become so good on those chips that regular versions can’t compare, I guess I’d get the AI version. I wouldn’t pay extra compared to current pricing, though.

  • kemsat@lemmy.world (+26/-3) · 5 months ago

    What does AI enhanced hardware mean? Because I bought an Nvidia RTX card pretty much just for the AI enhanced DLSS, and I’d do it again.

    • WhyDoYouPersist@lemmy.world (+28/-1) · 5 months ago

      When they start calling everything AI, soon enough it loses all meaning. They’re gonna have to start marketing things as AI-z, AI 2, iAI, AIA, AI 360, AyyyAye, etc. Got their work cut out for em, that’s for sure.

      • kemsat@lemmy.world (+2/-1) · 5 months ago

        Just saying, I’d welcome some competition from other players in the industry. AI-boosted upscaling is a great use of the hardware, as long as it happens on your own hardware only.
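        For contrast, here is the naive baseline that learned upscalers (DLSS-style) improve on: nearest-neighbour upscaling just repeats pixels and adds no detail, whereas an AI upscaler predicts plausible detail it learned from training data (toy pure-Python sketch with a made-up “image” of ints):

```python
# The naive baseline that learned upscalers improve on:
# nearest-neighbour 2x upscaling just repeats each pixel, adding no
# detail. An AI upscaler instead predicts plausible detail learned
# from training data.

def upscale_2x(img):
    """Nearest-neighbour 2x upscale: duplicate each pixel both ways."""
    out = []
    for row in img:
        wide = [p for p in row for _ in range(2)]  # repeat horizontally
        out.append(wide)
        out.append(list(wide))                     # repeat vertically
    return out

img = [[1, 2],
       [3, 4]]
for row in upscale_2x(img):
    print(row)
```

        The whole pitch of AI-boosted upscaling is producing something sharper than this blocky result at a fraction of the cost of rendering at native resolution.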

  • alessandro@lemmy.ca (+21) · 5 months ago

    I don’t think the poll question was well made… “Would you like to part with your money for…” vaguely shakes hand in air “…AI?”

    People are already paying for “AI”, and were even before ChatGPT came out and popularized the term: DLSS.

  • UnderpantsWeevil@lemmy.world (+21) · 5 months ago

    Okay, but hear me out. What if the OS got way worse, and then I told you that paying me for the AI feature would restore it to near-baseline performance? What then, eh?

    • Honytawk@lemmy.zip (+17/-1) · 5 months ago
      • The ones who have investments in AI

      • The ones who listen to the marketing

      • The ones who are big Weird Al fans

      • The ones who didn’t understand the question

    • barfplanet@lemmy.world (+5) · 5 months ago

      I’m interested in hardware that can better run local models. Right now the best bet is a GPU, but I’d be interested in a laptop with dedicated chips for AI that would work with PyTorch. I’m a novice, but I know it takes forever on my current laptop.

      Not interested in running copilot better though.
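      A hedged sketch of the usual device-selection pattern in PyTorch code: whether a laptop NPU shows up depends on vendor-specific backends (e.g. DirectML or ONNX Runtime), so this only shows the common CUDA/CPU fallback, guarded so it also runs without torch installed:

```python
# Hedged sketch: typical accelerator selection in PyTorch code.
# NOTE: a laptop NPU generally needs a vendor backend to be usable;
# this only shows the common CUDA/CPU fallback, guarded so the
# snippet still runs on a machine without torch installed.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    device = "cpu (torch not installed)"

print("models would run on:", device)
```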

    • x0x7@lemmy.world (+5/-2) · 5 months ago

      Maybe people doing AI development who want the option of running local models.

      But baking AI into all consumer hardware is dumb. Very few want it. SaaS AI is a thing, and to the degree SaaS AI doesn’t offer the privacy of local AI, networked local AI on devices you don’t fully control offers even less. So it makes no sense for people who value convenience, and it offers no value to people who want privacy. It only offers value to people doing software development who need more playground options, and I can go buy a graphics card myself, thank you very much.

    • 31337@sh.itjust.works (+1) · 5 months ago

      I would if the hardware was powerful enough to do interesting or useful things, and there was software that did interesting or useful things. Like, I’d rather run an AI model to remove backgrounds from images or upscale locally, than to send images to Adobe servers (this is just an example, I don’t use Adobe products and don’t know if this is what Adobe does). I’d also rather do OCR locally and quickly than send it to a server. Same with translations. There are a lot of use-cases for “AI” models.

  • qaz@lemmy.world (+18/-1) · 5 months ago

    I would pay extra to be able to run open LLMs locally on Linux. I wouldn’t pay for Microsoft’s Copilot stuff that’s shoehorned into every interface imaginable while also causing privacy and security issues. The context matters.

    • Blue_Morpho@lemmy.world (+10/-1) · 5 months ago

      That’s why NPUs are actually a good thing. The ability to run LLMs locally instead of sending everything to Microsoft/OpenAI for data mining will be great.

      • schizo@forum.uncomfortable.business (+5/-1) · 5 months ago

        I hate to be that guy, but do you REALLY think that on-device AI is going to prevent all your shit being sent to anyone who wants it, in the form of “diagnostic data” or “usage telemetry” or whatever weasel-worded bullshit is in the terms of service?

        They’ll just send the results for “quality assurance” instead of doing the math themselves and save a bundle on server hosting.

        • chicken@lemmy.dbzer0.com (+4) · 5 months ago

          but do you REALLY think that on-device AI is going to prevent all your shit being sent to anyone who wants it

          Yes, obviously, especially if you are running all open source software.

        • alessandro@lemmy.ca (+2) · 5 months ago

          All your unattended data will be taken (and some of the attended data too). That doesn’t mean you should stop attending to your data. Even if you’re somehow forced to use Windows instead of an open alternative, it doesn’t mean you can’t dual boot or use other privacy-conscious devices when dealing with your sensitive data.

          Closed/proprietary OSes and hardware drivers can’t be considered safe by design.

        • Blue_Morpho@lemmy.world (+1) · 5 months ago

          I replied to the person above “locally on Linux”.

          Even in Windows, local queries give the possibility of control. Set your firewall and it cannot leak.

  • smokescreen@lemmy.ca (+17) · 5 months ago

    Pay more for a shitty ChatGPT clone in your operating system that can get exploited to hack your device. I see no flaw in this at all.