Sort of similar to the Great Filter theory, but applied to time travel technology.

  • 3volver@lemmy.world · 7 months ago

    Time travel within the same universe is not possible; it's a fun fiction that is always contradictory in some way. The only time travel that could work is the kind William Gibson uses in The Peripheral: every time you go back in time, a new parallel universe branches off, so your trip never affects the universe you came from.

    My theory is that we're one of the most advanced species in our galaxy, and yet we still can't reach another solar system. The odds of intelligent life arising from unintelligent life are extremely low, and Earth had life for a LONG time before humans evolved. Intelligent life is very difficult to produce: you need the perfect conditions and the perfect stressors over millions of years. On top of that, intelligent life that can actually reach another solar system is even less likely.
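
    As a rough back-of-the-envelope illustration (every factor below is my own made-up guess, Drake-equation style, not data), multiplying a handful of small probabilities quickly drives the expected number of starfaring neighbours down toward one or two:

    ```python
    # Illustrative numbers only -- each factor is an assumption, not a measurement.
    stars_in_galaxy   = 2e11    # rough order of magnitude for the Milky Way
    p_suitable_planet = 0.1     # assumed fraction of stars with a usable planet
    p_life_starts     = 0.1     # assumed chance life gets going at all
    p_intelligence    = 1e-5    # assumed chance that life becomes intelligent
    p_interstellar    = 1e-4    # assumed chance intelligence ever reaches another star

    expected = (stars_in_galaxy * p_suitable_planet * p_life_starts
                * p_intelligence * p_interstellar)
    print(f"Expected starfaring civilisations: {expected:.1f}")   # 2.0 with these guesses
    ```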

    There’s life out there thinking the same thing right now.

    • Reucnalts@feddit.de · 7 months ago

      One of my favourite theories about human evolution is the stoned ape theory. It supplies the conditions for the evolutionary leap: apes were forced out of the jungle, ventured into open grassland, and had to rely on different food sources. So they ate mushrooms, and eventually ones containing psilocybin. Small doses sharpen your eyesight, so you can hunt better; bigger doses are very sexually arousing, so more reproduction :)

  • cynar@lemmy.world · 7 months ago

    The universe seems to be keyed to disallow time travel. The speed-of-light limit in relativity sits exactly at the threshold where time travel would become possible. Conversely, quantum mechanics does allow for FTL transmission. What it doesn’t allow is information to flow along those links. It’s hit with a 0.5 error rate, which completely blocks FTL communication.
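
    A toy simulation (my own illustrative model in Python, not the real quantum formalism) of why that 0.5 error rate blocks signalling: whatever setting Alice picks, Bob's local statistics stay 50/50, so nothing he can measure on his own distinguishes her choices.

    ```python
    import random, math

    def entangled_pair(alice_angle, bob_angle):
        # Alice's raw outcome is an unbiased coin flip; Bob's outcome agrees with
        # hers with probability cos^2(angle difference) -- the standard quantum
        # prediction for a maximally entangled photon pair.
        a = random.choice([0, 1])
        p_same = math.cos(alice_angle - bob_angle) ** 2
        b = a if random.random() < p_same else 1 - a
        return a, b

    def bob_marginal(alice_angle, trials=100_000):
        # What Bob sees locally, with no access to Alice's results.
        ones = sum(entangled_pair(alice_angle, 0.0)[1] for _ in range(trials))
        return ones / trials

    # Alice tries to "signal" by choosing between two measurement settings:
    print(bob_marginal(0.0))           # ~0.5
    print(bob_marginal(math.pi / 4))   # ~0.5 -- statistically identical, so no message arrives
    ```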

    General relativity does allow for a few time-travel options. However, these sit well off in the sticks, in regimes where quantum gravity effects would dominate. Since we don’t have such a theory yet, our predictions there are likely wrong. Even within these theories, a time machine would require a “closed timelike curve”. These can, in theory, be made using several rapidly rotating black holes. Any ship traversing one could never arrive earlier than the moment the time machine was built.

    Basically, time travel is almost certainly blocked by our laws of physics. Any loopholes would be limited to the lifetime of the “machine” and would require stellar-scale engineering for even a few seconds of travel.

    • SmoothOperator@lemmy.world · 7 months ago

      Quantum mechanics does not allow for FTL transmission. Disallowing information flow is the same as disallowing transmission.

      • cynar@lemmy.world · 7 months ago

        It seems to allow it, in a sense. The errors are also left on the transmitting end. By sending those over a normal channel, the two signals can be combined to recreate the data. Something is shared, at some point.
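
        A toy analogy of that "combine the two signals" step (a one-time-pad sketch in Python, not the actual quantum protocol; all names here are mine): the pre-shared random key stands in for the entangled correlations, the "corrections" sent over an ordinary channel stand in for the errors, and neither piece on its own carries the message.

        ```python
        import secrets

        def share_randomness(n):
            key = [secrets.randbelow(2) for _ in range(n)]
            return key, list(key)                 # both ends hold identical random bits

        def sender_corrections(message_bits, key):
            # This part still has to travel at light speed or slower.
            return [m ^ k for m, k in zip(message_bits, key)]

        def receiver_decode(corrections, key):
            return [c ^ k for c, k in zip(corrections, key)]

        message = [1, 0, 1, 1, 0, 0, 1, 0]
        alice_key, bob_key = share_randomness(len(message))
        corrections = sender_corrections(message, alice_key)
        print(corrections)                            # looks like pure noise by itself
        print(receiver_decode(corrections, bob_key))  # [1, 0, 1, 1, 0, 0, 1, 0] -- data recreated
        ```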

        It’s definitely a “we’re not sure what’s actually going on” type of situation, though. Either both ends are drawing on some (otherwise) hidden data layer, or FTL transmission is allowed so long as no information flows (information as defined by quantum mechanics). It just turns out that weird entanglement-based systems are the only ones (we’ve found so far) able to send informationless transmissions.

        Both solutions would give deeper insights into reality, and its underpinnings. Unfortunately, we’ve not actually teased out which is happening.

        My gut feeling is that the speed of light is a side effect of a fixed/stable causality across all rest frames. Hidden information seems to be a lot more cumbersome.

        • SmoothOperator@lemmy.world · 7 months ago

          I’m not sure what you mean. If something is “shared”, but this something contains no information, how can we know that it was shared? In what sense does this something even exist?

          The perfect correlation of entangled particles is well established, and very cool, but perfect correlation does not require sharing of “something”. The perfect correlation is baked into the system from the start, from local interactions only.
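
          A minimal sketch (Python, purely illustrative) of that “baked in from the start” picture: a value fixed at the source reproduces the perfect correlation without anything being shared later.

          ```python
          import random

          # Local hidden-variable toy: each pair leaves the source carrying a
          # pre-agreed value, and each detector just reads its own copy -- no
          # communication between the two sides.
          def make_pair():
              shared = random.choice([0, 1])      # fixed locally when the pair is created
              return shared, shared

          pairs = [make_pair() for _ in range(10_000)]
          matches = sum(a == b for a, b in pairs)
          print(matches / len(pairs))             # 1.0 -- perfect correlation, purely local

          # Caveat (Bell's point): once the two sides measure at *different* angles,
          # no pre-agreed answer sheet can match the quantum cos^2(angle difference)
          # statistics at every setting -- which is where this simple picture ends.
          ```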

            • bunchberry@lemmy.world · 4 months ago

              That’s actually not quite accurate, although that is how it is commonly interpreted. The reason is that Bell’s theorem simply doesn’t show there are no hidden variables, and indeed Bell himself states very clearly what the theorem proves in the conclusion of his paper:

              “In a theory in which parameters are added to quantum mechanics to determine the results of individual measurements, without changing the statistical predictions, there must be a mechanism whereby the setting of one measuring device can influence the reading of another instrument, however remote. Moreover, the signal involved must propagate instantaneously, so that such a theory could not be Lorentz invariant.”[1]

              In other words, you can have hidden variables, but those hidden variables would not be Lorentz invariant. What is Lorentz invariance? To be “invariant” basically means to be absolute, that is to say, unchanged from one reference frame to another. The “Lorentz” here refers to Lorentz transformations in Minkowski space, i.e. the four-dimensional spacetime described by special relativity.
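
              For concreteness, here is the standard Lorentz boost along x and the Minkowski interval it leaves unchanged, the textbook example of a Lorentz-invariant quantity:

              ```latex
              \begin{aligned}
              t' &= \gamma\!\left(t - \frac{v\,x}{c^{2}}\right), &
              x' &= \gamma\,(x - v\,t), &
              \gamma &= \frac{1}{\sqrt{1 - v^{2}/c^{2}}},\\[4pt]
              s^{2} &= -c^{2}t^{2} + x^{2} + y^{2} + z^{2}
                     \;=\; -c^{2}t'^{2} + x'^{2} + y'^{2} + z'^{2}.
              \end{aligned}
              ```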

              This implies you can actually have hidden variables under one of two conditions:

              1. Those hidden variables are invariant under some framework other than special relativity. The signals involved would have to travel faster than light, which contradicts special relativity, so you would need to replace it with some other framework.
              2. Those hidden variables are variant, meaning they do indeed change with reference frame. This would allow local hidden variable theories, and would thus even allow current quantum mechanics to be interpreted as a statistical theory in a more classical sense, since it even evades the PBR theorem.[2]

              The first view is unpopular because special relativity is the basis of quantum field theory, and contradicting it would mean contradicting one of our best theories of nature. There has been some fringe research into reformulating special relativity to make it compatible with invariant hidden variables,[3] but given that quantum mechanics has been around for over a century and nobody has figured this out, I wouldn’t get your hopes up.

              The second view is unpopular because it can be shown to violate a more subtle intuition we all tend to have, one taken so much for granted that I’m not sure there’s even a name for it. The intuition is that not only should there be no mathematical contradictions within any single reference frame, so that an observer never sees the laws of physics break down, but that there should additionally be no contradictions when all possible reference frames are considered simultaneously.

              It is not physically possible to observe all reference frames simultaneously, and thus one can argue that such an assumption should be abandoned because it is metaphysical and not something you can ever observe in practice.[4] Note that inconsistency between all reference frames considered simultaneously does not mean observers will disagree over the facts: if one observer asks another for information about a measurement result, they are still acquiring information about that result from their own reference frame, just indirectly, so they would never run into a disagreement in practice.

              However, people still tend to find this notion of simultaneous consistency too intuitive to abandon, so the view remains unpopular and most physicists choose to just interpret quantum mechanics as if there are no hidden variables at all. The case against #1 you can argue is enforced by the evidence, but the case against #2 is more of a philosophical position, so ultimately the view that there are no hidden variables is not “proven” outright; it is proven only if you accept certain philosophical assumptions.

              There is actually a second way to restore local hidden variables which I did not go into detail on here: superdeterminism. Superdeterminism basically argues that if you had not just a theory describing how particles behave now, but a more holistic theory that includes the entire initial state of the universe going back to the Big Bang, tracing out how all particles evolved into the state they are in now, then you could place restrictions on how that system develops such that it always reproduces the correlations we see, even with hidden variables, in a way that is indeed Lorentz invariant.

              The obvious problem, though, is that it would never actually be possible to have such a theory. We cannot know the complete initial configuration of all particles in the universe, so it’s not obvious how you would derive the correlations between particles beforehand. You would instead have to just assume they already “know” how to be correlated, which makes them equivalent to nonlocal hidden variable theories, and then it is not entirely clear how they could be made Lorentz invariant. I’m not sure anyone has ever put forward a complete model in this framework either; the same issue applies to nonlocal hidden variable theories.