OK, it’s me again. I’ve been checking the sampled logs for my Cloudflare website and I’ve noticed some very peculiar requests.

Some context: I’m hosting my own static website (a personal blog) at home and serving it to the internet through a Cloudflare tunnel.

Upon inspecting them, it seems like they are bots and web crawlers trying to access directories and files that don’t exist on my server (since I’m not using WordPress). While I don’t really have any credentials or anything to lose on my website, and these attacks are harmless so far, this is kinda scary.
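A quick way to see exactly which paths the bots are hammering is to tally them from an access log. Here’s a minimal Python sketch — the sample log lines and the regex are just illustrative, point it at whatever log format you actually export:

```python
import re
from collections import Counter

# Hypothetical sample lines in common log format; real entries would
# come from your own access log or a Cloudflare log export.
SAMPLE_LOG = """\
203.0.113.7 - - [10/May/2025:12:00:01 +0000] "GET /wp-login.php HTTP/1.1" 404 153
203.0.113.7 - - [10/May/2025:12:00:02 +0000] "GET /wp-admin/setup-config.php HTTP/1.1" 404 153
198.51.100.9 - - [10/May/2025:12:00:03 +0000] "GET /index.html HTTP/1.1" 200 5120
"""

# Crude pattern matching the request path inside the quoted request line.
PATH_RE = re.compile(r'"[A-Z]+ (\S+) HTTP/')

def tally_probe_paths(log_text: str) -> Counter:
    """Count how often each requested path appears in the log text."""
    return Counter(m.group(1) for m in PATH_RE.finditer(log_text))

if __name__ == "__main__":
    for path, n in tally_probe_paths(SAMPLE_LOG).most_common():
        print(path, n)
```

The paths that cluster at the top (usually `/wp-login.php`, `/xmlrpc.php`, `/.env` and friends) tell you what the scanners are probing for.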

Should I worry? Is this normal internet behaviour? Should I expect even worse kinds of attacks? What can I do to improve security on my website and try to block these kinds of requests/attacks?

I’m still a noob, so this is a good opportunity for learning.

Thanks

  • ShortN0te@lemmy.ml · 12 points · 12 hours ago

    Those attacks you see are mostly (close to 100%) harmless bots and scripts. Yes, they are trying default passwords and exploits that got patched years ago.

    If you do not use default credentials and you run up-to-date software, there is nothing to worry about.

    Even brute force attacks are rare.

    This is just “noise” so to speak.

    If you are scared by this, you should reconsider hosting something on the internet. Yes, things like fail2ban can help, but only if they knock on your server multiple times, and mostly only to keep your logs clean.

  • hendrik@palaver.p3x.de · 23 points · 14 hours ago

    That looks like the internet. Every server gets bombarded with these requests. Generally: use good passwords, make sure your software blocks bots brute-forcing passwords after some sane number of tries, and keep everything updated.

    If you want some more attacks, install a mail server. Or expose VNC, Windows Remote Desktop, or a VoIP server. That gets the bots really worked up.

      • hendrik@palaver.p3x.de · 11 points · 14 hours ago

        Just make sure you actually enable the jails/filters for the services you use… I’ve seen people just install it, and by default it will only protect SSH and leave everything else as is.
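        For example, a hypothetical `/etc/fail2ban/jail.local` sketch enabling one of the web-server filters fail2ban ships with — filter names and log paths vary by distro and by the web server you actually run, so treat these values as placeholders:

        ```ini
        # /etc/fail2ban/jail.local — illustrative only
        [sshd]
        # Usually on by default, but be explicit.
        enabled = true

        # Shipped with fail2ban but NOT enabled by default;
        # only useful if you actually serve sites with nginx.
        [nginx-http-auth]
        enabled = true
        logpath = /var/log/nginx/error.log
        ```

        `fail2ban-client status` afterwards shows which jails are actually active, which is the quickest way to catch the “only sshd is protected” situation.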

      • bizdelnick@lemmy.ml · 2 points · 13 hours ago

        With fail2ban, a single bot behind a NAT can make the site inaccessible for all users behind that NAT.

        • cron · 3 points · 13 hours ago

          That’s true, but it might not really be a problem for most. Just set the jail time to something short (a few minutes, maybe an hour).

  • catloaf@lemm.ee · 19 points · 14 hours ago

    Yes, it’s normal. Attackers are scanning everything, all the time.

  • Ebby@lemmy.ssba.com · 9 points · 14 hours ago

    Should I worry?

    I’ve had this stuff in my logs since the late ’90s. It was concerning at first, but port scanning and scripts are the internet’s background static now.

    Is this normal internet behaviour?

    Yup. Welcome to self hosting!

    Should I expect even worse kinds of attacks?

    Not that it will happen, but good security expects attacks. I like to say, “obscurity is not security.”

    What can I do to improve security on my website and try to block these kinds of requests/attacks?

    As these scripts are targeting code you don’t run, they can be ignored relatively safely.

    You can take a couple of steps to lock things down, like not responding to ping on the WAN (less enticing to port scanners), locking down firewall settings, geolocation blocking, authentication, etc.
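    Since the site already sits behind Cloudflare, one option (depending on your plan and dashboard layout — treat this as a sketch) is a WAF custom rule with action “Block” that drops the common WordPress probe paths before they ever reach your tunnel, using an expression along these lines:

    ```
    (http.request.uri.path contains "/wp-login.php")
      or (http.request.uri.path contains "/wp-admin")
      or (http.request.uri.path contains "/xmlrpc.php")
    ```

    That won’t make you more secure against anything that matters (the requests were 404ing anyway), but it does keep the noise out of your logs at the edge.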

    That said, if the scripts change to target something you DO host, you may be in for a bad day. Good to stay on top of security patches in that case.

  • Kokesh@lemmy.world · 4 points · 14 hours ago

    I used to get SQL commands via URL parameters back in the early 2000s. I had no SQL running, so no problem there :)