This question exists for historical reasons as it was valuable to the Stack Overflow community during the evolution of the site. It is not an example of an objective, on topic question and should not be used as an excuse to post something similar.

Recently I have noticed a number of questions on SO that look something like this:

I am writing a small program to keep a list of the songs that I keep on my iPod. I'm thinking about writing it as a 3-tier MVC Ruby on Rails web application with TDD, DDD and IoC, using a factory pattern to create the classes and a singleton to store my application settings. Do you think I'm taking the right approach?

Do you think that we're handing novice programmers a very sharp knife and telling them, "Don't cut yourself with this"?

NOTE: Despite the humorous tone, this is a serious (and programming-related) question.


If patterns, testing, managed languages and best practices are handing them a sharp knife, then at least they can cut with it.

Compare that to the unmanaged spaghetti C and Perl that I was taught in University - they handed me a spork and told me to butcher the cow!


I sense something of an age/generation divide on this.

Twenty years ago you knew someone was a bad programmer because they came to work and wrote 25 C programs that either sucked or didn't work at all. Today those same kinds of people write 5 programs in 5 different languages, and they still don't work.

It's not hard to tell who is jumping around technologies because they are curious and who is just desperately trying to find one they can succeed at.

The only real difference I see is that today there is an amazing amount of buzzword BS to throw about in an effort to explain why their work is sub-par.

In the end, the good developers find their path. The bad ones, unfortunately, seem to disproportionately end up being your boss.


Absolutely. But mostly because we are taught such bad habits at university.

Nearly all these answers are provided by senior programmers. As a junior programmer, I might be able to provide a different insight into this question.

At university, I was taught interfaces, patterns and software design, and I thought it was all rubbish. The interfaces felt like wasted, repeated code, the patterns nonsensical, and the design like wasted time that could have been spent programming.

This was, in large part, thanks to its being taught by computer science academics who did very little programming and often didn't understand the practicalities of coding.

They taught us what the material was, but not why to use it, or how. Even worse, I wasn't convinced it was even a good idea.

This is, in my understanding, very common among university graduates.

So when we get into an actual programming job and get handed that sharp knife, we wonder why you didn't just give us a bread knife, and we will probably end up using the blunt side to do all the work.

For those of us who use the sharp side, it's a very steep learning curve. After 3-4 years of programming badly, we have to unlearn all those bad habits as well as learn this crazy world of designing software, not hacking code.

Most university assignments are small projects, usually based on a specific algorithm or getting a specific data structure to work. Code is thrown out after submission and never reused.

In employment, a large portion of my work is changing very large existing code-bases. Code is only thrown out if it does not do the job. A poorly coded, robust solution is still a robust solution.

The very moment you first have to troubleshoot legacy code is when you see the value in the sharp side of the knife. That moment when you first see a class with 6k+ LOC is very much an "OH $#!^" moment.

Maintainable code is not taught at university...


I think the problem with patterns, especially for more inexperienced coders, is they make you think in terms of the solution, rather than the actual problem staring you in the face.

Someone, with much more experience than myself, put it nicely: http://realtimecollisiondetection.net/blog/?p=44


It's useful to have all these techniques and tools, but only through experience can you really be comfortable knowing which one to choose. In school they often just teach you the syntax and everything possible in a language, but give you much less guidance in using it. The frameworks, patterns, and new languages that arise in practice often give you insight into understanding how to apply the technologies.


The only thing novice programmers lack IMO is the ability to choose among the various offerings and strike the best balance. Other than that I think they should go nuts and try everything with a critical eye and find their comfort zone.

Ah yes, that formal education thing. Also important.


So long as these people aren't getting dogmatic about their approaches to programming, it's fine to explore the latest in buzzwords and see if it works for you. I like to think that this is just a giant evolutionary experiment in software engineering, and we're trying to determine what works best through trial and error. In my evolutionary analogy, programming ideas are genes, and success in software engineering is the fitness by which the genes are judged.

Sites like this facilitate horizontal gene transfer. We're all just bacteria I suppose. :)


As a young developer with no formal education myself I have experienced both sides of this coin. When I began writing code I wrote awful spaghetti code with UI and logic all intertwined, not because I lacked understanding of what a class was, but because I lacked an understanding of why I should do this when it seemed to work just fine with everything inside Form1.cs.

As I became comfortable with C# as a language, I began for the first time to explore the idea of software design. I think I would be less well-rounded as a developer if I was not exposed to some of these buzzwords and design patterns that I discovered as a result.

For example, prior to hearing a Hanselminutes episode about TDD I had never even heard of unit testing; while I am aware that unit testing itself is not all that TDD is, I believe the knowledge I gained as a result of investigating it has made me a much better programmer.

To address the question directly, I believe exposing young developers such as myself to the principles of software design is essential if they are to grow into good developers. However, one must walk before one can run; there is a time in every developer's life when he must write crappy spaghetti code, if only so he has a frame of reference when considering what he has learned.

Further to this, as more experienced developers we have a responsibility to guide those less experienced; without guidance many young developers will simply cling to one pattern or methodology without the perspective necessary to understand its strengths and weaknesses.


We should beat new programmers over the head with a copy of "Code Complete" until they learn that the number one goal when writing code is to simplify complexity. Simplify, simplify, simplify!


As with most endeavors, we must seek balance.

On the one hand, our schools should teach new programmers core concepts in problem solving (divide and conquer, estimation, etc.) along with fundamentals such as sequence, alternation, repetition for imperative languages, generalization hierarchies for OO languages, and functional composition for functional languages.

On the other hand, software development is much more than a collection of programming skills. Our schools should teach analysis and design, (unit, system, and integration) testing, fundamentals of networking, fundamentals of security, source code management techniques, code quality measures and more, alongside those core concepts.


I believe that novice programmers should be made aware of them. I think that not introducing novice programmers to advanced concepts would be the biggest disservice we could do to them. They won't be experts at the advanced concepts, but at least it will give them a path of study so they can grow as programmers. They'll grow by making mistakes and being mentored by more experienced programmers. Novice programmers really shouldn't be writing code without the supervision of a senior programmer to help guide them through their mistakes. Code reviews and pair programming are great ways to accomplish this without giving novice programmers enough rope to hang themselves and the rest of the team.


I understand where you're coming from. The poor novices are led to believe they must know how to architect systems, make decisions about the team's development process, or even design important classes (or some other significant unit of code).

At best, they get a harsh dose of reality very quickly thanks to some attentive and relatively competent mentors.

At worst, these novices become dogmatic, treating each buzzword as some sort of silver bullet; meanwhile, tried-and-true best practices known for 30+ years are ignored and/or discarded. Look at the whole agile fiasco, for instance.

Who in their right mind actually believes the agile manifesto (which is great when understood in the proper context) means discarding things like functional specs or black box testing? Yet I recall reading many little gems like these not so long ago...


I think design patterns are useful for two reasons:

  1. They help provide a vocabulary for communication between programmers.

  2. These concepts have been reinvented time and again. Design patterns (and architecture patterns) pass along some of that historical community knowledge, giving you tips on consequences of the pattern -- what trade-offs you're making, what pitfalls you may want to avoid.
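To make the vocabulary point concrete, here is a hypothetical Python sketch (names are mine, not from the answer) of the Observer pattern. Saying "the playlist is the subject, the UI is an observer" communicates the whole design in one sentence; the comments note the trade-off the pattern's documented consequences warn about.

```python
class Playlist:
    """Subject in the Observer pattern: notifies observers of new songs."""

    def __init__(self):
        self._songs = []
        self._observers = []  # callbacks interested in changes

    def subscribe(self, callback):
        self._observers.append(callback)

    def add_song(self, title):
        self._songs.append(title)
        # Trade-off: observers are decoupled from the subject,
        # but control flow becomes harder to trace in a debugger.
        for notify in self._observers:
            notify(title)


added = []
playlist = Playlist()
playlist.subscribe(added.append)   # e.g. a UI list keeping itself in sync
playlist.add_song("Karma Police")
print(added)  # → ['Karma Police']
```

The pattern name buys you the shared vocabulary; the comment about debuggability is exactly the kind of historical "consequences" knowledge the pattern write-ups pass along.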


What ever happened to creating the right size solution for the problem? Seriously, you should apply the right design patterns and practices when they are appropriate.

It feels more like we give people a large hammer and say every problem you see is just another nail.


Working in a team environment on a large enterprise application, the most important thing to me is readability and intention.

Used properly, design patterns and unit tests tell a story. Used improperly, patterns obscure what's going on and create an even bigger mess. Sometimes I'd rather just look through a 500+ line function than trace through endless poor and leaky abstractions.

The rub is that to learn how to use a pattern correctly, you have to force it a few times. I'm certainly guilty of that.


Novice programmers should use tested recipes; the alternative would be for them to reinvent the wheel.

If they get stuck on a recipe, they should ask someone more experienced... or Stack Overflow.

Once they've mastered a recipe, they can improve it... or invent new ones.


Yes, and not just young programmers. I see many people who (take ASP.Net MVC for example) want to learn the latest framework, pattern, whatever, and hammer that sucker home. Someone said this above, but the real problem is that developers don't know that these are tools, and they need to choose the right tool for the job. There was a question on SO which asked, if the developer learned MVC, what websites could he not build. Totally wrong thinking.

One site I am working on is a morph from WebForms to MVC. Is it warranted? Yes: there are NO events, no need for viewstate, nothing WebForms about the site. My problem? I am trying to implement the repository pattern, pipes and filters, and a bunch of other stuff for a project that has 2 tables and very little data.

My saving grace is that I know what I am doing is overboard, but I am doing it to learn the pattern. I learned last night this pattern was not right for me. Do I still think the repository with pipes and filters is cool? Heck yea! Did it waste over 3 hours of my time last night that could have been spent finishing the project? Heck yea!
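For readers who haven't met it, here is a minimal Python sketch of the repository idea (the names and the in-memory store are illustrative, not from the original project): the rest of the code talks to an interface, so the backing store can be swapped. For a two-table site, this indirection is exactly the overhead being described.

```python
from typing import Protocol


class SongRepository(Protocol):
    """The interface the rest of the application depends on."""

    def add(self, title: str) -> None: ...
    def all(self) -> list[str]: ...


class InMemorySongRepository:
    """One swappable backing store; a database-backed one could replace it.

    The payoff of the extra layer only shows up when you actually need
    to swap stores or test in isolation -- not on a 2-table project.
    """

    def __init__(self) -> None:
        self._songs: list[str] = []

    def add(self, title: str) -> None:
        self._songs.append(title)

    def all(self) -> list[str]:
        return list(self._songs)


repo: SongRepository = InMemorySongRepository()
repo.add("Paranoid Android")
print(repo.all())  # → ['Paranoid Android']
```

The design choice to note: callers never see `_songs` or any SQL, which is the whole point of the pattern, and also why it feels like pure ceremony on a tiny project.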


The problem with design patterns, is that they don't make a whole lot of sense in procedural code. And most "OO" code is procedural code in disguise.

When you get these concepts down:

  • A method call is really a message being sent, not an instruction or a command
  • Your program is not a series of instructions, but a number of objects in a particular arrangement sending messages to each other
  • Interfaces are not so much abstractions of 'things' as they are message definitions, and should generally be defined by the object that wants to send the messages
  • Most methods should be void

Then the design patterns book will make a lot of sense. Until then, it will just cause more heartache than anything.

Once you do get those concepts down, many of the patterns will just become obvious.
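A hypothetical Python sketch of those bullets (the class and names are mine, purely for illustration): in the "tell, don't ask" style, the caller sends the object a void message and the object decides what to do, instead of the caller interrogating state and deciding for it, which is procedural code in OO disguise.

```python
class Employee:
    """An object that receives messages rather than being interrogated."""

    def __init__(self, salary: int) -> None:
        self.salary = salary
        self.paid = False

    # The "ask" style would be the *caller* doing:
    #     if e.salary > 0 and not e.paid: e.paid = True
    # i.e. procedural code wearing an object costume.
    def pay(self) -> None:  # a void method: a message, not a query
        if self.salary > 0 and not self.paid:
            self.paid = True


e = Employee(salary=100)
e.pay()        # tell the object what happened; let it decide
print(e.paid)  # → True
```

Seen this way, patterns like Observer or Command stop looking like ceremony and start looking like named arrangements of objects sending messages.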


No, I think the blogosphere gives young programmers the idea that they can and should use tools that they don't understand. At university it's our job to give students the opportunity to understand the tools they use...


Generally no, because most university computer science education will have taught them the other side of the software engineering scale.


Some people seem to think that we were able to build good software even though our education was not perfect, but that, in contrast, the younger ones will be totally helpless and unable to do anything at all unless we take them by the hand and show them step by step.

I think this is not true. There are lots of smart people that will fall asleep in the middle of reading "Design Patterns" and still be able to apply them just when they are needed, not more.

Maybe some people will blindly try to use design patterns everywhere, but that's comparable to, and no worse than, the horrors some people committed with pointer arithmetic, ownership, bit operations and spaghetti code in C++ for decades.


Got to fall down to get back up again. If they cut themselves, it builds character; it separates the wheat from the chaff. I would rather have someone working with me who has failed to deliver on a project at least once. Then I don't have to put up with the crying when the budget fails to come through, the project has been scrapped, and they have spent way too many hours on throwaway code. Keep it sweet, keep it simple. Ship it.


The most useful thing that you can learn in an educational environment is something that is useful to you in 10, 20 or 100 years time.

BDD, requirements gathering, user stories, OOP: these are all things that were applicable before the first computer was even built. If I were to go back to university again, I think the acid test would be "is this going to be useful to me in 20 years?". If they want to give examples of how those things are implemented in a contemporary setting, such as Cucumber for BDD, then that's all fine, but the emphasis should be on the idea rather than the tools.

Call me a pervert but I use Mike Cohn's "User Stories Applied" book for all sorts of projects in my life from artistic projects to life planning.