Programming technologies are evolving so fast that, whether you want to or not, you constantly have to learn more just to keep up. Often it is not even learning more in the same direction, but starting over from scratch.

Let's say you were a top-notch programmer in 1999 who quit for 10 years and went to a job interview in 2009 (funny even to imagine) - how much of your knowledge would still be needed? And if we take a carpenter, engineer, doctor or even mathematician - they all are still good specialists after 10 years.

So why is programming so unstable? Is it because the field is just relatively new, or because something important is still missing after 50 years and we can't find it and settle in that direction? Do you think the situation will change after some time?

Learning something new all the time is exciting and all, but it is starting to worry me that as I become older it will get harder and harder. After all, "you can't teach an old dog new tricks," and I'm afraid that in the end I'll just fall behind the college students and become one of those "COBOL dinosaurs" - only by then it will probably be "Java dinosaurs".


There are two reasons I'm a programmer:

  • I like making things
  • I like learning things

I'm also 55 years old, and am currently teaching myself Haskell.


Things I knew in 1999 that are still true in 2009


I disagree; programmers are not the only ones who have to keep learning to get better and keep up with technology. Everything evolves as time goes on, not just programming languages. For example, doctors have to keep learning new practices in order to be safer and better doctors. But that's the beauty of getting into something that's really your passion.


I'm more afraid of coming to a time when I'm not learning anything.


I turned 50 last week (gulp). I'm also a COBOL dinosaur. At least, I could still hack a COBOL program if I saw one, but I haven't worked in it for money since 1989. I'm some kind of C dinosaur, too. And Visual Basic. These days I'm mostly a Ruby dinosaur in the making. I've been programming professionally for about 30 years and learning since day 1.

The kind of people who stay programmers over the long term likely fall into two groups:

  • the ones who are happy to exercise one skill set for eternity;
  • the ones who are happy learning new stuff all the time.

So I suspect that most people who stay programmers do so because they're the sort of people who enjoy continuous learning.


As a programmer, when you become afraid of learning something new, it's time to quit. It is part of a programmer's job to always learn new things. A programmer spends less than 50% of the time actually writing code; the rest is thinking, learning new things, reading documentation, etc.

Then there are coders, they just write code... I'm sure you know the kind.


That's the beauty of programming. It grows and improves every day. It's not monotonous.


"And if we take a carpenter, engineer, doctor or even mathematician - they all are still good specialists after 10 years."

I'd say that all of those fields have changed in ten years. I stopped being a mechanical engineer fourteen years ago, and I doubt that I'd have a future in the field if I went back to it today.

It's stability that's the illusion. Change is the nature of the world.


As someone who left programming in 1999 and came back in 2007, it really wasn't hard. I had to learn a new language -- Smalltalk programmers aren't quite in the demand they were 10 years ago.

The important part of programming is the larger concepts. In my case: OOP, MVC, scalability and knowing how to put together large projects translated across languages and platforms.

Learning the syntax of a new language isn't a big deal.
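The larger concepts really do outlive any one language. As a rough illustration of the MVC idea in Python (the `Todo*` class names are purely illustrative, not from any framework):

```python
# A minimal Model-View-Controller sketch: the pattern itself, not any
# particular framework's API.

class TodoModel:
    """Model: owns the data and nothing else."""
    def __init__(self):
        self.items = []

    def add(self, text):
        self.items.append(text)


class TodoView:
    """View: renders the model's state; knows nothing about input handling."""
    def render(self, model):
        return "\n".join("- " + item for item in model.items)


class TodoController:
    """Controller: translates user actions into model updates."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_add(self, text):
        self.model.add(text)
        return self.view.render(self.model)


controller = TodoController(TodoModel(), TodoView())
controller.handle_add("learn a new language")
output = controller.handle_add("ship something")
print(output)
```

The same three-way split works whether you write it in Smalltalk, Java, or Ruby; only the spelling changes.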


I can only share my own experience. It is true that we lose energy with the years; however, the experience of so much time makes you a lot more intuitive. I tend now to look at things from a broader perspective and choose to drill down and really learn something when it is worth it - and I can tell you that even though you might have less energy it gets focused a lot more precisely so you spend the same time or less to get something done. You can't stop learning, that is a given. But it sure is a lot of fun.


That's an interesting question. Just the other day I was thinking deeply about it.

You should not worry. In fact, the most important thing about programming is methodology. I started programming at 18 or 19. I loved it because every day I learned something new and extraordinary. Today, 10 years later, I write much less code than ever before and make fewer errors, simply because I've improved my programming and methodology skills and gained experience on many projects.

Programming languages and frameworks change and appear every day, but you don't have to stay as up to date as some bloggers (I suspect those guys do no projects at all, because if you are in business you don't have time to write blogs, unless your business is a blog).

You know, these people talk about programming languages, but it all feels "virtual" to me, because the industry doesn't follow technology releases.

I first learned the .NET platform three years before having any real commercial experience with it. I moved from .NET 1.1 to 2.0 only two years after the new release. I'm still using .NET 2.0 now. What about 3.5? And .NET 4.0 is around the corner - why should I give up all the mastery I've built just to move to a new platform?

Of course one day I'll move and update my solutions to the new technology, but first let it mature. The ASP to ASP.NET transition was a horrible time; I'll never move to something new straight away. WCF on .NET 3.5 is a nice tool, but as I said, I prefer methodology over tools, so I'll first learn everything Thomas Erl has to teach in his SOA books. Such knowledge will last for 10 years; WCF, maybe 3?

I found out that object-oriented programming (OOP) has been around since the 90s, and in the 80s, before I was even born, some folks were already talking about object technology. Yet even now, in my shiny days, 70% of the IT industry barely knows OOP and design patterns at all.

Most people don't know that a turtle keeps mating until the last day of its life. Some live 100 years, and no younger turtle ever threatens an old one.

The "dinosaur guys" are the guys who rule the world; they carry life experience. Look at the Agile Manifesto people: there are no brilliant eighteen-year-olds, or even thirty-year-olds, changing anything there.

Relax, avoid anxiety.


I'll answer your question with my own:

"Why would anyone want to work in a field where after 10 years, they haven't learned anything new?"

Also, see everyone who said something about lawyers/doctors needing to learn new things constantly. (Thankfully, we have progressed from bloodletting and leeches as a remedy to medicine, therapy, etc.)


Every profession that isn't a complete dead end has some element of learning. You can't really run away from the responsibility of constantly learning new things. Whether you are a doctor, a programmer, or a janitor, if you're not learning anything new then either:

  1. you're doing a really poor job,
  2. or the job you're working on is not a fit for you

Of course, when we're talking about learning new things, we're not talking about learning everything you never needed to know (as back at school); instead, we're talking about learning stuff that will fascinate you and keep you engaged. Learn what you think is fun and try to find new things to play with; it may become useful later in your career. If you cannot learn anything, then maybe the profession in question isn't working out for you.

PS. I'm not afraid to learn new programming languages/frameworks/patterns/whatever. All of the new stuff tends to interweave with what came before.


The saddest thing I have ever seen was a 50-year-old guy who was hired at a company I worked for. After 3 months, he said goodbye. He was in tears, saying that no matter how hard he tried, he couldn't understand our codebase (and his task in it), and that he'd probably move to another area of programming. Game (technology) code 10 years ago was far less complex than it is now.

My fear as a programmer is that this will one day happen to me: that there will be too many new technologies, APIs, patterns, and interfaces for me to keep up with.


A hundred new fancy languages may come along, but there really is a lot you can learn as a programmer that's just like any other profession. You learn different methods for approaching certain problems.

There's always some kind of if-else-while-do-for-openfile-whatever in every language. You just need to catch on to how they work, and then you're on track. Of course, it usually takes a couple of years to master a given language, but getting started with a new one is more or less always the same.

Logic is logic.
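Those universal building blocks are easy to spot. A quick Python sketch (the function and file contents are made up for illustration); swap the keywords and braces and the same shape works in C, Java, or Ruby:

```python
import os
import tempfile

def count_long_lines(path, min_length=10):
    """Count lines in a file longer than min_length characters."""
    count = 0
    with open(path) as f:                             # open a file ("openfile")
        for line in f:                                # loop ("for" / "while")
            if len(line.rstrip("\n")) > min_length:   # branch ("if-else")
                count += 1
    return count

# Quick demo on a throwaway file.
_fd, _path = tempfile.mkstemp()
with os.fdopen(_fd, "w") as _f:
    _f.write("short\nthis line is definitely long enough\n")
result = count_long_lines(_path)
os.unlink(_path)
print(result)  # one line exceeds 10 characters
```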


The plumbing profession in the Victorian era would have given a similar impression (the modern toilet had just been invented, and was in a frequent cycle of upgrading).

It seems to me that the greater the leverage your "trade" has, the more frequent its development. Toilets once had such leverage too, when they were dramatically changing people's lives.


Improvements and changes in programming languages are mostly incremental and new methodologies evolve over time. So I would think that it would be fairly easy to keep up. But then what do I know, I am only 27 ;)


As a programmer, when you are tired of learning new things, it's time to go into management. Go work for a company like Lockheed Martin, where management is nothing but burned-out engineers and programmers.


No, you can always teach an old dog new tricks.

The market is a huge place and there is room for all, even for dinosaurs.


I don't think that assembler or C will disappear in 10 years. Of course, new languages will appear, but the fundamentals will always be there. So you need to learn new approaches and algorithms - the base of CS. New technologies and languages are only new ways of expressing the old tricks, more quickly and more efficiently. (IMHO)


I wrote my first computer program (in FORTRAN) in 1963 or so. I programmed a computer (UNIVAC 1005, in assembler) in the Army for about 8 months in 1971. That was when I decided that I wanted to learn more about the field. I went to school (UMASS, Amherst) on the GI Bill and I have been programming ever since I graduated in 1974.

I consider myself very fortunate indeed to have learned it when I did and to have been able to continue programming professionally without interruption for 35 years. The way I feel now, I have no desire to stop programming at any time in the foreseeable future. I think that when the man in the black robe, carrying the scythe, comes for me, he will hear something like this:

"Ok, Ok. I'll be right along just let me run these tests first."

Programmers have to constantly keep up with their field because it is constantly changing. For example, right now you can see a paradigm shift from single-execution thinking to multiple-execution thinking (truly parallel systems across multiple CPUs, not just multi-threading). Language extensions and new languages are being created to facilitate this, and it is very probable that one or more of these technologies will become de facto standards in the next few years. Programmers will need to know the new technologies or risk falling behind.
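As a rough sketch of that shift: moving from a sequential loop to fanning work out across workers often changes only the orchestration, not the work itself. Here is the idea using Python's standard `concurrent.futures` (threads stand in for what a real parallel system would do with processes or multiple machines):

```python
from concurrent.futures import ThreadPoolExecutor

def work(n):
    """The unit of work is unchanged; only how it is scheduled differs."""
    return n * n

# Old, single-execution thinking: one item at a time.
sequential = [work(n) for n in range(8)]

# Newer, multiple-execution thinking: the same work fanned out to workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(work, range(8)))

# Same results, different execution model.
assert sequential == parallel
```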

I'm not worried about this trend, as it is a trend which remains true in many professions. Eventually, as I get older, I may find it more difficult to learn the new technologies, but by then, I will be ready for retirement!


Being relatively new definitely plays a role, though I think there is some flexibility that is also important to point out. Creating something virtual offers enough freedom that it causes a bit of a paradox: how many times do you go back and try to improve what was done? That isn't often the case in medicine and law. There can be similar medical situations, like chronic diseases, but if a surgeon takes out your appendix or tonsils, it isn't likely to be repeated, is it? Human physiology probably hasn't changed much in the last 100 years, while computers have changed rather dramatically. If a lawyer loses a case, aside from appealing, what else can be done? In contrast, there are many programming situations where things are done repeatedly and even have their own acronyms (ERP, CRM, and CMS, to give a few examples) where scale plays a role in what gets put in or replaced. Do humans ever upgrade our bodies? No, and rarely are there massive legal changes that forever alter how courts work. How things plug together is still evolving; I doubt courtrooms and operating tables get changed as often as the UI on various software packages does, all in the name of being better for the end user. Think about how much the mouse has shifted over the past couple of decades, and now we even have touch screens in some cases, which are cool but not yet common everywhere.

In a way I think it has settled in a direction: there will be continual improvements to both the process and the technology, with people being the part that messes things up, as we are a little complicated, you know. Think about how much the telephone has changed in the last 30 years: from a rotary dial with 0-9 to phones with entire QWERTY keyboards and various screens, some that slide and flip to do things far removed from those old TV shows where you had to dial the number manually and the phone wasn't quick if you needed help ASAP. The changes in the computing landscape over the last 30 years have been nothing short of awesome; just think back to the first video game consoles compared to what we have today. There are a few different catalysts for this. New hardware arrives with features we hadn't seen before, like multi-core CPUs and GPUs, which make designs easier in some ways and harder in others, depending on what part your code plays. To complement that hardware come new software models, where things like SaaS and cloud computing are somewhat disruptive; I doubt we will ever have a long period without something disruptive in some way, shape, or form. Lastly, look at how software is developed and how that keeps changing: waterfall is still used and suits some cultures, while Agile is still working out various kinks and has multiple interpretations.

The World Wide Web isn't even 20 years old yet, but look at all the things that have used it in some way, shape, or form:

  1. the proliferation of the internet and the tearing down of online service providers that walled off their services, like AOL, CompuServe, or Prodigy, to name a few;
  2. the birth of e-commerce, where people can buy almost anything and have it delivered to their door without interacting with anyone except the delivery guy, who may have no idea what is in the package;
  3. globalization, where those of us plugged into the web share knowledge and try to advance the knowledge we have in the world;
  4. communities that have grown in part from all the jobs this has created, whether in India, the U.S., or elsewhere. How many software engineers are in the metro Seattle-Tacoma area or Silicon Valley and still work there?

I don't fear learning my whole life; I think it will be awesome to see the powers I'll have in the coming decades as new tools and process ideas emerge to make what I do easier, and I may also move into other areas, like architecture or management, as career progressions.

I can think back to what I was using in 1999 for programming:

  • OS: NT 4.0, Windows 95
  • Web server: IIS 2.0, 3.0, using ISAPI Extensions and Client/Server structure.
  • DB: MS-SQL 6.5
  • SC: Visual Source Safe 6.0
  • IDE: Visual Studio 6.0 using C++
  • Hardware: Intel Pentium 2, 333MHz with 64 MB of RAM and a 4 GB hard drive split into 2 partitions on a cat5 network using ISDN lines to connect to the internet. 17" Monitor for each PC as I had 2 PCs.

What do I have now for development:

  • OS: Windows server 2000, 2003, 2008, Windows XP
  • Web server: IIS 5.0, 5.1, 6.0 and 7.0, using ISAPI filters, Websites, web applications, ASP, ASP.Net 1.1,2.0, 3.5
  • DB: MS-SQL 2005 and MS-SQL 2008
  • SC: Visual Source Safe and Subversion
  • IDE: Microsoft Visual Studio 2003, 2005, 2008 using ASP.Net/C#, JavaScript, VB.Net, HTML, XML
  • Hardware: Intel Core 2 Duo E6750, 2.0 GHz, 4 GB of RAM with a 150 GB hard drive on a Gigabit network with a direct broadband connection with our own corporate firewall. Dual 17" Monitors now.

So, has my job become simpler? Maybe if I didn't have legacy systems to worry about. For anyone who has read the whole thing, I thank you for your patience as I spew out part of my brains and heart here.


To earn more, you have to learn more.

Like a plant, once you stop growing, you're dying.


Anyone who isn't learning their entire lives probably isn't very successful.

Whether it's learning technical things, as a programmer would, or learning more about relationships and business, as a salesperson would, it all just comes down to personal preference.


No, I am not afraid. Some things change, but a lot of stuff just gets improved, and you don't have to start over from the beginning when you learn something new.

Sure, there are new languages and language features, but they are often not that different from what came before them (C# has some features that Java lacks, but they are not that difficult to learn). Every ten to fifteen years you have to spend four months to a year learning a new paradigm, but there is more and more help each time, and you have some time to learn it in.

And of course a lot of the issues involve people, who are so easy to understand (read two good books on psychology, cross it off the list for life).

But really no matter what you choose, improvement is required - otherwise you will end up like General Motors or Chrysler.


By the way, most of the hot and valuable professions have the same property - lawyers and stock experts as well as programmers. We all work with new information.


The reason for the instability of programming is change.

In a digital environment, change happens with astonishing frequency. The example of a doctor has been given, but it doesn't fully apply. After all, how often are revolutionary new treatments or surgeries pioneered? In the digital world this can happen within months, or at most a few years. What was relevant to a programmer two years ago may no longer be.

So you add everything together: changing hardware, changing software, changing standards, changing everything.

Yet, in programming, things can have long periods of stability. I used Pascal for the first time (it was my first language) back in the days of Delphi 4, but didn't truly get into it until Delphi 6; we've got Delphi 10+ or something now, but at last glance the language hasn't changed so much as the scenery around it. I can still take that knowledge and apply it to Object Pascal by using FPC. And, as far as I know, no huge changes have been made to C or C++ in the last ten years either, though the interpreted languages have made huge strides.

But if you look, programming has done a decent job of keeping up. Standards, browsers, protocols, and the like for the internet have changed vastly and improved greatly since I got started with HTML 4 almost 9 years ago. Yet it's still HTML I use, albeit a dialect of version 4.

It all depends on where you look, but it'll all change. For better (XP/W7) or for worse (ME/Vista) change happens in software design.


The day I quit learning is the day I die. Still, I worry that as I get older, employers will just assume I don't have the ability to learn new technologies and will pass me over for someone younger and cheaper, whether I can learn the new technology or not (or already have).

It's a great argument for starting your own business, though.


Why? With software, once a problem is solved, we should not solve it again. So we are always solving new problems or learning the newest solutions to problems.

Afraid? Heck no. We are in software's golden age when the problems to solve are delicious. That's why I am doing this work.


A Mathematician has to keep up with the latest proofs (assuming he is a pure mathematician) or the latest software (assuming he is a statistician).

A doctor has to keep up with the newest medicines and treatments.

An engineer has to keep up with the latest technologies.

A carpenter is probably the only one on that list who doesn't have to do much learning.

It's a fast-moving world, and nobody can hold on to a job for 20 years now without learning anything new. Just take a look at all those GM workers.


When you know your field, start inventing something for someone else to learn.


You don't have to unless you want to. There are still COBOL dinosaurs making tons of money because they're still willing to code in an antiquated language on an antiquated platform.

See this, Steve Yegge @ OSCON '07

"Java's my Dad's language, I'm not gonna use it" - Steve Yegge

With bleeding-edge technologies (currently JS frameworks) you'll spend most of your time chasing shadows, trying to learn as many different platforms as possible, even though most of them will fall out of the mainstream as soon as their advantages are formalized and included in the core (just as Flash movies will be deprecated by HTML5).

If you want to spend time learning less and making more money, pinpoint which languages are going to dominate a niche.

Such as

  • Python for dynamically typed scripting
  • C for systems development
  • Java/C# for statically typed programming
  • JavaScript for browser client-side scripting
  • Lua for games and 3D environment scripting

Learn as many languages as you choose, learn how they apply to current hardware technologies, and make a ton of money. I have worked with a few guys who have been coding primarily in Fortran since the '70s, and they are definitely not struggling to find work.

Note: I would mention ASP.NET/PHP for server-side scripting but I honestly think Python will eventually replace both. LISP doesn't deserve mention because it has always been around and will always be around to give us a clear indicator of which programmers obsess more about exploring programming theory than efficiently writing functional code.


It is more frightening to stop learning. The rate of change in many industries follows the same pattern. Operating in a more connected world means everyone has to keep learning. Find anyone besides the Amish in America who isn't trying to increase profits and improve efficiency. Historically, many industries underwent a revolutionary period where new methods and technologies transformed entire segments of the population. The agricultural revolution had a 30-year flash point in the nineteenth century; those not learning were literally swept out of the market and into the cities as unskilled workers. Consider yourself blessed to be living in today's age, where there are so many tools available and so many interesting problems to be solved.


Speaking for myself, I think there are a variety of reasons that software development keeps changing as a profession. As the question alluded to, this is a profession that has only been around for 50 years or so. I'll agree with others that we're not unique in needing to continually update our skills, and also that not everyone in our profession does so.

As others have said, our ways of doing things are constantly changing. One of the reasons for this is that we're always finding new ways to use what we do. For example, 20 years ago the profession was already 30 years old, but Berners-Lee was just inventing the World Wide Web. Since then, we've invented all sorts of new ways to develop for the web, as well as new ways to use it.

Also, over time, we've adopted practices of other industries. When I first got into software development, we were just "computer programmers". The joke was that you gave a programmer an assignment and then put him in a closet for six months at the end of which you had a program. Initially, my friends and I thought of the term "software engineer" as just a more fancy sounding job title. Now, we apply true engineering principles to what we do.

And, perhaps I'm too cynical, but I feel there have been times when the changes are brought about by commercial interests. A company with enough clout can force - or at least try to force - the industry to adopt their way of doing things in an effort to give them an advantage in the market place. In my opinion, this is happening less than it once did, largely thanks to the open source movement.


If there wasn't something new to learn every day I don't think I would enjoy programming as much as I do. In fact, learning probably gives me the most beneficial, and sustained, pleasure of any pursuit in life.


I don't anticipate the situation changing-- the job will always require more learning-- it's an exciting job because it is so dynamic.

I don't listen to anyone who says I can't learn new things. I missed the early part of the internet boom and then took a couple of years off. Coming back required lots of reading and studying, but I built my expertise back. You can catch up. Two years ago I was starting to feel concerned about the future, like you; I loved Java and thought I'd never find anything better. But then I was thrown into a Rails project, it was a great experience, and I'm eager to do more.

I have to disagree with those who say all the important stuff is universal and transferable. Maybe it's true and I just don't recognize it, but nearly everything has changed for me. We used to work on month-long projects in preparation for a big release; now we're releasing weekly. My day is filled with new practices: pairing, writing our own tests (first), using a completely different programming model (and language). All my skill in managing memory has no value. I used grep, sed, and awk for what are now trivial refactorings in my IDE. IDE settings take care of what used to be hours of meetings and arguments about code style. I used to pride myself on my design skills, but I'm starting to think that "high level" skill isn't all that critical, given the evolution of refactoring tools. We'll see.

That being said, there is some sort of wisdom one collects over the years, which hopefully compensates when learning new things is harder (but not impossible!).


After high school I briefly studied law and psychology. It was fun and interesting for a while, but I always missed the ability to construct things, taking ideas from draft to reality.

I've always loved computers, mostly for their ability to teach me something new every day. Even if subjects like law are in constant change, and the opportunity to learn exists, it's not the same.

That's the reason I can see myself working with technology and computers for the next 20-40 years: as long as I stay updated, it will never grow old.

And to be truly honest, I think programming Lisp and hacking Perl scripts together beats cognitive psychology and business law any day.


If you're not learning every day in whatever you do, you're not doing it right.


I think you could easily come back after 10 years away if your fundamentals were solid. Sure, you'd have to learn some new APIs but that's trivial compared to learning the underlying concepts. If you really understood them then, you'll still understand them now (though you might need to refresh your memory).

Inheritance, polymorphism, encapsulation, abstraction: these are the hard concepts. How you represent inheritance in Java vs. C++ vs. Python vs. Ada 95 is just syntax.
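For instance, here is what those concepts look like in Python (the `Shape` hierarchy is an illustrative example, not from any codebase); spelling the same relationships in Java or C++ changes only the keywords:

```python
class Shape:
    """Abstraction: callers depend on this interface, not on any subclass."""
    def area(self):
        raise NotImplementedError


class Square(Shape):              # inheritance: a Square is-a Shape
    def __init__(self, side):
        self._side = side         # encapsulation: state kept behind methods

    def area(self):               # polymorphism: overrides Shape.area
        return self._side ** 2


class Circle(Shape):
    def __init__(self, radius):
        self._radius = radius

    def area(self):
        return 3.14159 * self._radius ** 2


# Polymorphic dispatch: this loop neither knows nor cares which subclass it has.
shapes = [Square(2), Circle(1)]
print([round(s.area(), 2) for s in shapes])
```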


I think people who love learning for their whole life become programmers and not the other way round.

New languages are invented every day because developers hate doing the same thing twice.



Programming is like poetry: repetition is worth nothing.
So you should constantly learn, to produce something new.