I'm a programmer, a few months shy of my 40th birthday. It's Saturday morning, my kids are home with my wonderful wife (who is pulling my weight on the domestic front), and I'm at a tech conference. It's a session on React Native, and the presenter is trying to convince us that it's truly the "next big thing" for mobile development. To me, it seems a bit like the JSPs of 15 years ago, with all the logic in the presentation code, but I'm "old", so I assume I just don't "get it".
The presenter blows through some slides, dazzles us with some impressive live coding, and then makes his way to the "name dropping" portion of the presentation, where he rattles off about half a dozen supporting tools that I never knew existed, including something called Pepperoni (seriously). As someone who just recently got the hang of Angular, it all makes me feel a little disheartened. "Here we go again", I think.
Of course I'm not really surprised. Over the past 20 years, I've taken seats on a number of bandwagons, and have generally enjoyed the rides. The buzz that comes with a new "disruption" in programming can be exciting - feeling a part of a community of technical innovators, championing something that will make things a little easier, quicker, cleaner, better. It can be fun. But on this particular morning, at the cusp of 40, I have to admit I feel a little drained. I know this is part of the job - if I want to stay relevant (and well paid), every so often I need to cast out some of the knowledge that I've so dutifully absorbed, and gear up for the next journey. It's just how it is.
As I think about it though, this regular ritual of my programming career doesn't seem to be a way of life for other professionals. The doctor at 40 doesn't seem to be worried about discovering that all his knowledge of the vascular system is about to evaporate in favor of some new organizing theory. The same goes for the lawyer, the plumber, the accountant, or the English teacher. While there are certainly unappealing aspects to these professions, it's safe to say that for each of them, mid-way through their career, the knowledge they've accumulated is relatively stable, and has afforded them some increased measure of respect and compensation. In programming, though, 20 years of experience does not seem to confer those same advantages.
Two Main Forces
Of course not all is so dismal in our profession - there are so many things to love about being a programmer - but in terms of the never-ending struggle to "keep up", it is an interesting feature that seems more or less unique to our field. Am I right though? Is programming really different in this regard? And if it is, then why? And what does it mean for our career trajectory? I'd like to try to answer all of this (because, why not) in terms of two concepts.
The first is knowledge decay. Everything we know, not just about programming, has an expiration: a point at which it is no longer useful. I learned how to drive a car when I was 16, and for the most part, that knowledge still serves me well. This piece of knowledge could be said to have a long half-life. For many professionals, their domain knowledge also has a relatively long half-life. Sure, new discoveries in medicine may displace some existing procedures, but there likely won't be a major overhaul in our understanding of human biology. When the expiration is long like this, knowledge can effectively be considered cumulative. The doctor is more knowledgeable than he was last year, because everything he learned in the past 12 months built on all that he knew before.
In programming, for good or bad, I'd assert that this is not exactly the case. Putting a (rather arbitrary) stake in the ground, I'd say that:
Half of what a programmer knows will be useless in 10 years.
This could be way off (and there are many caveats of course - read on!)...but it seems about right for me. If I learned nothing else from this point forward, I bet that only about half of my knowledge would still be usable in 2026 (long live SQL!), and the other half would probably be of no use (React Native, perhaps?). Now of course I will be gaining new knowledge to replace the dead stuff, but will it be enough? Will I have more (useful) knowledge in 2026 than I do now?
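Treated literally as a half-life, that stake in the ground gives a simple exponential-decay formula for the fraction of today's knowledge still useful after some number of years. A quick sketch (the 10-year figure is, as noted, arbitrary):

```python
# Exponential decay of a programmer's knowledge, assuming a
# (hypothetical, hand-wavy) 10-year half-life.
HALF_LIFE_YEARS = 10

def useful_fraction(years: float) -> float:
    """Fraction of today's knowledge still useful after `years` years."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

print(useful_fraction(10))  # 0.5  -> half is gone after a decade
print(useful_fraction(20))  # 0.25 -> three quarters gone after two
```

Nothing deep here, but it makes the compounding visible: the same rule that kills half your knowledge in 10 years quietly kills three quarters of it in 20.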
This brings me to the second concept, knowledge accumulation rate - the pace at which we add new things to our knowledge corpus. In every field, there is a certain threshold of knowledge that must be met in order to be "certified" (or at least hireable), and the early portion of a career is typically dedicated to acquiring this knowledge. In programming, however, because of the fast decay of knowledge, it seems like we never really transcend the "student" period. We know we must always be learning, and this makes the stages of our career a bit atypical.
The Three Stages
If I were to graph an average programmer's knowledge over the course of their career, keeping in mind knowledge decay and accumulation rate, I think it might look something like this:
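The shape of that curve falls out of a simple recurrence: each year, existing knowledge decays a little, and a stage-dependent amount of new knowledge gets added. Here's a toy simulation - every number below (the stage boundaries, the rates, the half-life) is invented purely for illustration:

```python
# Toy model of a programmer's useful knowledge over a 30-year career.
# All stage boundaries and rates are made-up numbers, not measured data.
HALF_LIFE_YEARS = 10
DECAY = 0.5 ** (1 / HALF_LIFE_YEARS)  # fraction surviving each year (~0.933)

def accumulation_rate(year: int) -> float:
    """New knowledge gained per year (arbitrary units), by career stage."""
    if year < 5:
        return 10.0  # eager apprentice: everything is new
    if year < 12:
        return 6.0   # rising star: productive, but less time to learn
    return 3.0       # steady veteran: meetings, mentoring, legacy code

trajectory = []
knowledge = 0.0
for year in range(30):
    knowledge = knowledge * DECAY + accumulation_rate(year)
    trajectory.append(knowledge)
# The curve climbs steeply, peaks around the "senior" promotion, then
# drifts down toward an equilibrium where decay offsets accumulation.
```

The equilibrium is the interesting part: once the accumulation rate drops, knowledge settles toward `rate / (1 - DECAY)`, the level at which new learning exactly cancels decay. That plateau is the "no coasting" state - holding steady takes continuous effort.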
In the beginning of our careers, in what we could call the eager apprentice stage, accumulating knowledge is relatively easy. Everything is new, and so each experience is a vehicle to gain more knowledge. Moreover, since we're younger, we often have fewer hard obligations, and so we probably don't mind spending a few nights and weekends picking up new languages and frameworks. Lastly, and importantly, the expectations on us from our employers are lower. Everyone understands that we're junior, and so, more easily than our colleagues, we can carve out a little time during the work day to fill in holes in our knowledge. This is a fun stage, but there's a persistent feeling that there's so much we don't know.
At some point though we cease to be novices, and we establish ourselves as productive, self-sufficient developers. For the first time, the gap between us and our colleagues (even the ones 10 years our senior!) does not seem so large. This fills us with vim and vigor, and so this is the rising star stage. The investment we made in learning has paid off, and just about everything we know is still useful - i.e. none of our knowledge has noticeably decayed. With this reservoir full of relevant knowledge, we begin to earn the respect of clients, peers, and managers, and with this respect comes titles, salary, and opportunities. Though we don't necessarily see it at the time, this is also an important point of inflection.
It's at this point that two things happen. First, that promotion to "senior" comes with something more than just money: greater expectations. Employers need their star programmers to be leaders - to help junior developers, review code, perform interviews, attend more meetings, and in many cases to help maintain the complex legacy software they helped build. All of this is eminently reasonable, but it comes, subtly, at the expense of our knowledge accumulation rate. The time we used to have to read tech blogs: gone. Second, it's also at this point that we first experience (or at least recognize) a little knowledge decay. Some of what we learned early in our career is now outdated. All that time "we" (read: I) spent learning GWT? Lost! Essentially, both forces, knowledge decay and knowledge accumulation rate, begin to work against us.
It's at this point that we enter the third and final stage, the ebb-and-flow of the steady veteran. We are knowledgeable and productive, yes, but we also understand that we may actually know fewer (useful) things than we did at a prior point in our career. A non-trivial amount of our knowledge has decayed, and we may not have had the time to accumulate enough new knowledge to compensate. This can be frustrating, and I think it's why it's at this point that so many of us bail for other pastures - management, sales, testing, or (my dream) farming. We realize that it'll require real effort just to maintain our level of proficiency - and without that effort, we could be worse at our jobs in 5 years than we are today. There is no coasting.
This is where I'm at. I still love to learn, but I appreciate that without some herculean effort, I will probably always remain in an equilibrium state hovering around the lower boundary of "expert". I'm ok with this, because I enjoy my personal life more than I want to be the next Martin Fowler (although I bet Martin has a kick-ass personal life too - that guy is amazing). Thinking about my career in terms of knowledge decay and accumulation though has changed my perspective a little.
First, I try to take the long view. I'm more wary of roles with excessively taxing expectations and few opportunities for novel experiences. I've seen quite a few colleagues take the bigger paycheck at an employer where there'll be little opportunity to work with new things and learn. In 5 years, they realize that much of their valuable knowledge has evaporated, and their pay is way out of whack with their actual worth. In some cases, I think making less money in the short term (at a better employer) will yield more money (and stability) over the course of a long career.
In the end, perhaps I haven't really forged any new ground here, but it's been useful for me to think about my career in terms of these two things: knowledge decay and knowledge accumulation. I'd love to hear any thoughts you have!