• General_Effort@lemmy.world · 3 months ago · +101/−1

    [French media] said the investigation was focused on a lack of moderators on Telegram, and that police considered that this situation allowed criminal activity to go on undeterred on the messaging app.

    Europe defending its citizens against the tech giants, I’m sure.

    • RedditWanderer@lemmy.world · 3 months ago (edited) · +70/−2

      There’s a lot of really, really dark shit on Telegram, that’s for sure, and it’s not like Signal, where they’re just a provider. They do have control over the content.

        • RedditWanderer@lemmy.world · 3 months ago · +27/−5

          I don’t recall CP/gore being readily available on those platforms, it gets reported/removed pretty quickly.

          • southsamurai@sh.itjust.works · 3 months ago · +11/−4

            You’re young. It really was a thing. It never stayed up long, and they found ways to make removal essentially instantaneous, but there was a time when it was easy to find very unpleasant things on Facebook, whether you wanted to or not. Gore specifically was easy to run across at one point. With CP, it was more offers to sell it.

            They fixed it, and it isn’t like that now, but it was a problem in the first year or two.

        • Kecessa@sh.itjust.works · 3 months ago · +7/−1

          So you don’t see the difference between platforms that actually have measures in place to try and prevent it and platforms that intentionally don’t?

          Man, Lemmings must be even dumber than Redditors or something

    • chiisana@lemmy.chiisana.net · 3 months ago · +26

      Safe-harbour-equivalent rules should apply, no? That is, the platform should not be held liable as long as it does not permit illegal activity, offers a proper reporting mechanism, and has documented workflows to investigate and act on reports.

      It feels like a slippery slope to arrest people on suspicion (until proven otherwise) of a lack of moderation.

      • rottingleaf@lemmy.world · 3 months ago · +5

        Telegram does moderation of political content they don’t like.

        Also Telegram does have means to control whatever they want.

        And sometimes they also hide certain content from select regions.

        Thus - if they make such decisions, then apparently CP and such are in their interest. Maybe to collect information for blackmail by some special services (Durov went to France from Baku, and Azerbaijan is friendly with Israel, and Mossad is even suspected of being connected to Epstein operation), maybe just for profit.

        • chiisana@lemmy.chiisana.net · 3 months ago · +4

          I don’t know how they manage their platform — I don’t use it, so it’s irrelevant to me personally — but was this proven anywhere in a court of law?