• Onno (VK6FLAB)@lemmy.radio · 21 days ago

    Welcome to the “brand new world” of IoT hardware, where you are the product and continued service depends entirely on how you can be monetized.

    • shalafi@lemmy.world · 21 days ago

      I’m assuming it runs on AI and the company has to provide the backend. So yeah, if you purchase something that requires a company’s infrastructure, it can certainly be bricked.

      • jonne@infosec.pub · 21 days ago

        Which is why you should only buy stuff that relies on local APIs and on-board processing.

        • Chozo@fedia.io · 21 days ago

          99.99% of the people willing to buy an emotional support robot for their children will have no idea what the words you said even mean.

          • Lost_My_Mind@lemmy.world · 21 days ago

            I’m confused how a robot even CAN be emotionally supportive. I didn’t even know this was a thing.

            • int_not_found · 20 days ago

              Programmed emotional support isn’t new. ELIZA was written in 1966 and was surprisingly effective given the crudeness of computers at the time.
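For context: ELIZA didn’t understand anything; it worked by matching the user’s input against a list of patterns and echoing fragments back with first/second-person pronouns swapped. A minimal sketch of that idea in Python (these rules are illustrative approximations, not Weizenbaum’s original DOCTOR script):

```python
import re

# A few rules in the spirit of ELIZA's DOCTOR script (simplified, hypothetical).
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I), "Tell me more about your {0}."),
]

# Swap perspective so the echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # generic fallback when nothing matches

print(respond("I feel lonely since my dog died"))
# → Why do you feel lonely since your dog died?
```

A handful of rules like these was enough for some users to attribute real empathy to the program, which is the same effect these robots trade on.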

          • jonne@infosec.pub · 21 days ago

            Yep, and that’s a shame. There should be some sort of government rating or warning put on stuff like that.