Garrett (garote) wrote,

Getting obsolete...

For a long time now I’ve looked down on the younger generation of programmers mainly because they use frameworks and libraries willy-nilly without understanding how they work and what exactly they do, and call it "programming", or worse yet, "hacking".

But this year I’ve been realizing that I’m the old geezer on the porch complaining that his generation was somehow different when it was not.

Sure I learned about programming by entering machine language into a console, and went up from there. But I didn’t know jack shit about circuit design, and I still don’t know jack about it. In the past I’ve claimed this was different because circuit design was hardware design, and as a software person I was in a wholly different field, and justified in ignoring what lay beneath it.

But that division only appeared in retrospect, after the messy innovation that spawned the first solid platforms had taken place.

Looking around now, what divisions are starting to take shape? What core fields of study are being placed firmly on the wrong side of those divisions, doomed to fade away into dark corners of the industry?

Here's a list off the top of my head:

* Tomorrow's programmers are going to stop worrying almost entirely about WHERE their code is actually being run. And it would be hard to figure out anyway.

* Tomorrow's programmers are going to expect software to auto-optimize itself to a huge degree, by having an AI interactively refine their design. The very notion of optimizing something for a given platform will seem quaint.

* Tomorrow's programmers are going to rent all their development tools on a monthly basis. They will be auto-updated every 24 hours. Every keystroke they make while on the clock will be recorded, and much of it will be rewindable and branch-able like a git repository on steroids. Development in an offline state will be severely handicapped, perhaps even impossible, but it won't matter because everything will be online all the time, for almost zero energy cost.

* Tomorrow's programmers are going to expect to be able to take anyone's device anywhere, and with permission, authenticate to it with a fingerprint or iris scan or code key, and instantly start using their own personal development environment, picking up exactly where they left off. When they stand up and move more than 3 feet away from the machine it will sense this and auto-lock, and the programmer can move on to another machine. (This is almost the way it is already, for some online developers working exclusively in browsers.)

What changes do you foresee, that will render large parts of current knowledge, or process, useless or irrelevant?
  • 5 comments

Anonymous

December 6 2016, 04:12:15 UTC

So is all real engineering going to end up being done by the big three cloud providers? And will anybody not employed by them be condemned to "JavaScript/UI" activities? I can imagine backend jobs being eaten by "serverless" in ten years. I have no idea what I would do in such a case.

On the bright side, if you are right about the AI progress, developers themselves could be made obsolete.
I don't think developers will ever be obsolete ... we all pedal too fast on our stationary bicycles of learning, to let that happen... :D

JavaScript/UI stuff is moving down in the stack and getting crowded by variants, but I don't think it's going to eclipse other development, because the other frameworks that browsers rely on are also getting so much more sophisticated and flexible. Swift and Java are not going anywhere, and neither is C for certain sub-tasks. But why run any of this locally when you can code it up through a portal and run it "somewhere", 10,000 instances at a time, managed in a swarm, or instantly on a collective of pre-assigned devices?
Sure.

And bytes and loops are getting obsolete.
Loops are for people who don't know recursion! :D
Wow, that's level 3!