When I was in school, I was always told, “If you get a college degree, you’ll on average make 500k more over the lifetime of your career, regardless of what you get your degree in!”

Then, as I was finishing school, it was all about “If you get into tech you’ll make big bucks and always have jobs!”

Both of those have turned out not so great for a lot of people.

Then whenever women say they’re struggling with money online, they get pointed to OF… which pays next to nothing for 99% of creators. It’s also very presumptuous to suggest that, but we don’t even need to get into that.

So is there a field/career strategy that you feel like is currently being over pushed?

(My examples are from the USA; Nevada/Utah is where I grew up. Maybe it’s different in other parts of the USA, even.)

  • Churbleyimyam@lemm.ee · 2 days ago

    The world has been changing fast and I think the safest advice in terms of always having work is to learn something to do with bedrock infrastructure, like plumbing or welding.

    • bizarroland@fedia.io · 2 days ago

      As we approach the singularity, more and more things will be done by fewer people.

      No one has a plan for the singularity; everyone is just hoping that AI will figure it out.

      May God have mercy on us all.

      • wizardbeard@lemmy.dbzer0.com · 2 days ago

        We’ve been “rapidly approaching the singularity” for quite a while now, and the current tools being marketed as “AI” don’t actually have any “intelligence” to them. We are not going to magically turn what we have now into “AGI”; it’s simply not possible given our current models and techniques.

        Speaking as someone in tech: at absolute best, this is something we might see strides in by the time we all die of old age, and that’s being absurdly optimistic. The only people pushing the idea of a faster timeline are the ones with money to make by grifting off it.

        • bizarroland@fedia.io · 2 days ago

          I see where you’re coming from, but look at semiconductors. Right now Nvidia has dethroned Intel, and Nvidia’s own insiders have stated that they are using AI to design their chips, which are then used to power the AI that designs the next round of chips.

          Maybe the stuff that you and I have access to will never cross over into AGI territory, but that doesn’t mean there aren’t systems and processes in play that can.

          • wizardbeard@lemmy.dbzer0.com · edited · 2 days ago

            And here we have issues with the many different definitions of AI. Nvidia used machine learning to simulate countless iterations of their chip design and find the best configuration and layout (for the specific goals they set their AI to optimize for). They did not use ChatGPT or anything that has textual output. It literally cannot spontaneously develop that ability.

            It is constrained by the bounds that are inherently necessary to make it function and by the goals it was created to optimize for. It cannot just arbitrarily “choose” to go do something it isn’t being pointed at. It may do things that weren’t intended, but those are “happy accidents” related (again) to the goals it is given to optimize for. Like a delivery AI jumping off a balcony because that’s the fastest way down, since no goal weighting was given to self-preservation or to not damaging the package.
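
            To make that concrete, here’s a totally made-up toy sketch (invented numbers and action names, nothing to do with any real delivery system): the “best” action is simply whatever scores lowest on the objective it was handed, and nothing else gets a say unless you explicitly weight it in.

            ```python
            # Hypothetical toy planner: it only "cares" about what the cost function scores.
            actions = {
                "take_stairs":      {"seconds": 90, "damage": 0.0},
                "take_elevator":    {"seconds": 60, "damage": 0.0},
                "jump_off_balcony": {"seconds": 5,  "damage": 1.0},
            }

            def cost(action, weight_damage=0.0):
                # Only delivery time is scored unless a damage penalty is explicitly added.
                return action["seconds"] + weight_damage * action["damage"] * 1000

            best = min(actions, key=lambda name: cost(actions[name]))
            print(best)   # -> jump_off_balcony (fastest way down; nothing says "don't")

            safer = min(actions, key=lambda name: cost(actions[name], weight_damage=1.0))
            print(safer)  # -> take_elevator (only because we weighted damage in ourselves)
            ```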

            At the very least, until we have some way to codify the abstract concept of comprehension into a scoring system that can be optimized for, none of these things are going to even approach AGI. This is due to the simple reality of how they work under the hood, and don’t for a fucking second believe the charlatans saying that we can’t understand them. We may not be able to discretely track each and every step a model takes in modifying its weights, or each decision point when optimizing for specific output, but that’s a matter of the storage space it would take to record every step and the drastic speed loss that recording would cause. It is not some inherent untraceable magic in how they work.
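
            A trivial sketch of what I mean (a one-parameter toy, obviously nothing like a real model): every weight update is a perfectly ordinary, recordable step, and keeping the full trace is just a question of how much storage and slowdown you’re willing to pay for.

            ```python
            # Toy gradient descent: each step is discrete and traceable if you choose to pay for it.
            def train(steps=1000, lr=0.1, record_trace=False):
                w = 5.0                      # a single toy "weight"
                trace = []
                for step in range(steps):
                    grad = 2 * (w - 3.0)     # gradient of the loss (w - 3)^2
                    w -= lr * grad           # one discrete update, nothing magical
                    if record_trace:         # tracing costs storage and speed, not possibility
                        trace.append((step, w, grad))
                return w, trace

            w, trace = train(record_trace=True)
            print(round(w, 4), len(trace))   # -> 3.0 1000
            ```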

            Computers, even quantum computers, work through billions of discrete traceable steps occurring each second. AI still needs discrete inputs, discrete goal/optimization/math to discern good output from bad, even if we choose not to track each step in between.

            Put as simply as possible: you cannot duct tape infinite Speak & Spells together to spontaneously create an intelligence, and that is effectively what current AI is doing in ever-increasing amounts. We’re brute forcing it by throwing ever-increasing amounts of resources at it, with rare and minor improvements in the underlying math arriving at a far slower rate. The Nvidia chip thing is just improving the chips’ ability to do the math we’re already doing for this stuff even faster, so… more brute forcing.

            Edit: Also, Nvidia is making more money than they ever have riding this hype train. Of course they’re going to push the idea that absurd leaps of progress are right around the corner and that their products will get us there. They are the best in the market right now, but anything beyond that is pure conjecture to help drive sales. Their chips are not fundamentally doing anything new, just the same things more efficiently.