• Squorlple@lemmy.world · 1 month ago

    My biggest gripe with cooking instructions is the non-specificity. “Stir pasta frequently”? How frequently? How continuously? Tell me in unit Hertz

          • prembil@lemmy.world · 1 month ago

            Just another of those internet image optical illusions. You won’t be fooling anyone on here 🧐

        • jettrscga@lemmy.world · 1 month ago

          I don’t understand the basis of the 24 Hz limit rumor. My monitors are 144 Hz, and if I limit them to 60 Hz and move my mouse around, I see fewer residual mouse-cursor “after-images” than I do at 144 Hz. It’s a crude test, but it shows the eye can perceive motion artifacts beyond 60 Hz.
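
          A rough back-of-the-envelope version of that cursor test (the mouse speed and the ~100 ms persistence window below are illustrative assumptions, not measurements):

          ```python
          # Sketch: distinct cursor "after-images" expected during a fast mouse sweep.
          # Both constants are illustrative assumptions.
          MOUSE_SPEED_PX_S = 2000   # assumed sweep speed, pixels per second
          PERSISTENCE_S = 0.1       # assumed window of visual persistence, seconds

          for refresh_hz in (24, 60, 144):
              gap_px = MOUSE_SPEED_PX_S / refresh_hz   # how far the cursor jumps per frame
              images = refresh_hz * PERSISTENCE_S      # roughly how many images linger at once
              print(f"{refresh_hz:>3} Hz: ~{images:.0f} cursor images, ~{gap_px:.0f} px apart")
          ```

          Higher refresh rates give more, closer-spaced cursor images in the same sweep, which matches what the test shows.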

          The eye can also perceive the flicker of LEDs driven from rectified 60 Hz AC; it’s very annoying.

          • PunchingWood@lemmy.world · 1 month ago

            I could never tell whether the people claiming they couldn’t see more than 24 Hz/FPS were serious or just making excuses for poor game optimization. They were either fanboys defending a shoddy product, or they simply had terrible eyes. But I think even in the latter case you’d still be able to tell the difference in smoothness.

            It’s one of those things where, once you’ve experienced a higher framerate in games, it’s very hard to go back to a lower setting.

            I find it hard to get used to in movies/shows, though. My TV has an option to interpolate frames for smoother playback so it looks like a higher refresh rate, but the result often looks unnatural. It was hard getting used to The Hobbit (I think it was The Desolation of Smaug), which was shown in 48 FPS. And Avatar: The Way of Water kept switching between lower and higher frame rates for regular and action scenes, which made for a jarring experience.

              • PunchingWood@lemmy.world · 1 month ago

                I believe 24 Hz works in movies because of the way cinemas are set up. The image projected onto the screen in a dark/dim room seems to “burn in” (persistence of vision, I think; not sure of the correct term), which can make it appear smoother. That’s why they can get away with it in cinemas. Plus it’s a consistent 24 Hz, which games (and The Way of Water) aren’t.

                People used the same excuse for games, claiming it made them more “cinematic”, but that was just absolute horseshit cover for games being poorly optimised, especially when the framerate wasn’t even locked to 24 FPS, and because home monitors and TVs don’t work the same way as cinema projectors.

                I’m sure that if all cinemas and media moved to a higher framerate/Hz it would eventually just feel normal, though. It just takes a lot of time to get used to, especially for cinema experiences.

            • SolarMonkey@slrpnk.net · 1 month ago

              I used to have a 4K TV that I used as a monitor. It was 60 Hz. When I was tired, my eyes would vibrate back and forth trying to play nice with the refresh rate, blurring everything. Very difficult to read, and a huge increase in headaches.

              Switched to a 120 Hz TV (all other specs equal) and the problem stopped entirely; it hasn’t resurfaced in the six years since.

              A person may not notice it directly, but it does matter.

              I don’t really notice it in movies and stuff, but those are so damned chaotic anyway that it probably doesn’t matter as much. (I don’t like live action; it’s difficult af to follow.)

              I haven’t really noticed it in games either, but I mostly play on console, where that’s not usually something you can tweak.

              • PunchingWood@lemmy.world · 1 month ago

                It’s often weird how little people notice when you toggle a setting on or off. But then I usually whip out the UFO test site (testufo.com) and they’re immediately convinced (it’s also easier to explain with).

                I have to say that on the PS5 the framerate differences have been quite noticeable, especially in first-party titles with a performance mode that goes up to 60+ FPS instead of the usual locked 30, like the God of War and Horizon games.

                • SolarMonkey@slrpnk.net · 1 month ago

                  I haven’t really run into PS5 issues, but then physical media is very difficult to find for the PS5, so I only have 4 games for it vs 50+ for the PS4 (I don’t buy digital games, ever).

                  But I guess I don’t really pay much attention to it either. As long as it works well enough I don’t usually mess with the display settings, other than turning gamma waaaaaaay up so I can see shit properly… my TV doesn’t support HDR (which I think became standard around 2017) or anything newer that recent games are built to use, so I mostly just leave the defaults alone. I definitely notice that some games are smoother than others, but that could just as easily be the textures or resource utilization.

                  Back when I was playing games on my phone, I’d actually turn down the refresh rate… sure, this game can run at 120, but it can also run at 30 or 60, so let’s see what the lowest I can stand is! I don’t do that anymore, but it was good for battery life :)

          • Hjalmar@feddit.nu · 1 month ago

            I think it’s the threshold below which most people perceive motion as jittery. You may still be able to differentiate between higher FPS settings, but above 24 Hz most people shouldn’t be able to make out discrete steps.

            That’s at least how I’ve come to understand it

          • SkunkWorkz@lemmy.world · 1 month ago

            24 Hz is roughly the lower limit. People will perceive 24 Hz as a smooth sequence, especially with motion blur, while anything below it starts to look choppy. Of course humans can perceive higher frequencies, but 24 Hz became the standard because celluloid film was expensive, especially in the early days of cinema: the fewer frames you shoot, the less film you need to buy and develop. Film back then was probably also not sensitive enough for the shorter exposure times that come with higher frame rates.
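
            To put rough numbers on the cost argument (16 frames per foot is standard 4-perf 35 mm; the price per foot here is a made-up placeholder, not a historical figure):

            ```python
            # Footage consumed scales linearly with frame rate on 4-perf 35 mm film,
            # which holds 16 frames per foot. The price is a placeholder, not a quote.
            FRAMES_PER_FOOT = 16
            PRICE_PER_FOOT = 0.60  # hypothetical stock + developing cost, $/ft

            for fps in (16, 24, 48):
                feet_per_min = fps * 60 / FRAMES_PER_FOOT
                print(f"{fps:>2} fps: {feet_per_min:.0f} ft/min, "
                      f"~${feet_per_min * PRICE_PER_FOOT:.0f}/min at the placeholder price")
            ```

            Doubling the frame rate doubles the footage (90 ft/min at 24 fps vs 180 ft/min at 48 fps), so the cost scales directly.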

          • gens@programming.dev · 1 month ago

            Your eyes are not digital. Nothing physical really is. Think about a camera flash: flashes can be well under 3.33 ms, the equivalent of over 300 fps, and you can still see them clearly (and painfully). The same goes for a monitor; it also has a “response time”, which is how long a pixel takes to transition from one colour to another (usually quoted “gray to gray”, as in one shade of gray to another; black to white takes longer, as it does for eyes).
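
            For reference, the frame-time arithmetic behind that 3.33 ms figure is just 1000 ms divided by the rate:

            ```python
            # Per-frame duration at common rates; 300 fps works out to ~3.33 ms per frame,
            # which is where the camera-flash comparison above comes from.
            for rate_hz in (24, 30, 60, 120, 144, 300):
                print(f"{rate_hz:>3} Hz -> {1000 / rate_hz:.2f} ms per frame")
            ```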

            So ofc you would see all the mice.

            It’s also why motion blur is a thing, even though it’s usually implemented badly. Seeing every motion on a TV or monitor in perfect sharpness feels weird, because you’re looking at a series of still pictures, not actual movement.

            Your brain makes movements out of it all.

            Anyway: 16 FPS is the bare minimum, 24 is good for most movies, 30 for slower games, 60 is the minimum for shooters (75 and above for faster ones, even though I played Xonotic at 45), and 120 for VR.

        • Prime@lemmy.sdf.org · 1 month ago

          No, it can see much more. Bonus: your brain can “see” more than 100 Hz too. Google Bundesen’s TVA (Theory of Visual Attention). Source: I worked on programs to measure it for my girlfriend’s PhD. Also, I play FPS :D