Lately, I was going through the blog of a math professor I had at a community college back when I was in high school. Having gone the path I did in life, I took a look at his credentials and found that he completed a computer science degree sometime in the 1970s. He had a curmudgeonly, standoffish personality, and his IT skills were nonexistent back when I was his student.

It’s fascinating to see his perspectives on computing and how many of the things I learned in my undergraduate degree were already being taught back in the 1950s. It also seems like the computer science degree was more intertwined with its electrical engineering fraternal twin back then.

Although the title of this post is inherently provocative, I’m curious to hear from those of you who did computer science, electrical engineering, or similar technical degrees in decades past. Are there topics or subjects that have been phased out over the years that you think leave younger programmers/engineers ill-equipped today? What common practices were you happy to see thrown in the dumpster and kicked away forever?

The community also seems like it was significantly smaller back then and more interconnected. Was nepotism as prevalent in the technology industry then as it is today?

This is just the start of a discussion, please feel free to share your thoughts!

  • mindlight@lemm.ee · 5 months ago

    No degree at all, but I’ve been working in IT since the ’90s.

    It’s fun that when I started in IT everything went from centralized (mainframe and terminals) to decentralized (PC). Then came Citrix and everything went towards centralized. Smartphones and apps came so we went decentralized again. Then cloud came and we essentially went centralized again.

    It’s all about trends, the pendulum swings back and forth…

  • Cyborganism@lemmy.ca · 5 months ago

    I did a computer science degree at the equivalent of a community college, then followed up with software engineering at an engineering university. Graduated in 2008.

    I find that software development, and IT in general, was a lot simpler back then than it is today. Nobody required any kind of certification to get a job.

    In the early 2000s, when you had a problem in your project, you really had to mess around and try things to find the solution. You couldn’t rely so much on Stack Overflow or similar sites.

    • chilicheeselies@lemmy.world · 5 months ago

      Things are much simpler now, even little things. Take error messages, for instance: they used to be cryptic as hell, but these days there is more of an emphasis on communication.

      The only thing more complex is the volume of choice. There are just soooo many ways to do something that picking one can be daunting. It’s led to a situation where you have to hire based on ability to learn rather than ability with a specific toolchain.

      • AggressivelyPassive@feddit.de · 5 months ago

        I wouldn’t say that.

        Software today, in the real business world, is extremely complex, simply because of all the layers you have to understand.

        Today I have to know about Kubernetes, Helm, CI/CD, security/policy scanners, Docker, Maven, Spring, Hibernate, 200 libraries, Java itself, JVM details, databases, and a bit of JavaScript, TypeScript, npm, and while we’re at it, React. And then of course the business logic.

        I’d argue that in today’s world, nobody actually understands their software completely. I’m not sure when exactly the shift from raw-dogging assembler and counting cycles to the mess of today happened, but software today is much, much more complex and complicated.