Yes, software is getting worse, as education and corporate culture are getting worse.
Where employees once had to understand what they were actually doing, much of that work is now auto-filled by IDEs and by languages that compile down to other languages, so employees need to know fewer and fewer fundamentals.
Which in turn means that when a low-level error occurs, either no one knows how to fix it, or the company refuses to hire someone who does because they’re “over-qualified” and would therefore “cost too much”.
Do you think complexity and scope stayed the same, or did they increase? Do people have to know more now to reach the same level of depth and surrounding knowledge?