I have had these questions for several years and haven't received a straight answer from anyone in the teaching profession.

  1. What are the most important things you have learned from higher level education as they apply to programming?

  2. What things were sorely lacking from your formal education?

  3. Was there anything you were taught that was counter-productive for your career or down-right wrong?

I recently went into my first development-related job after a long period of self-guided learning, several summers of community college, and two years of university level education (in progress). My limited experience is that the multiple years of formal education had little or nothing to do with actually working on a product that will be of use to people. The order of relevance seems backwards to me: self-guided learning (books, forums, blogs, etc.) was most valuable, followed by community college courses, and then lastly, university level courses.

For those of you who don't know, let me clue you in to how the first few years at university work (and this is at a well respected, technical university):

C++ courses are on the way out; my school, at least, is going exclusively to higher-level languages (Java).

The remaining C++ classes are taught the following way:

  • Memorize the exact syntax of the language because we're going to test you on paper without an IDE. Absolutely no reference material of any kind.
  • There's this thing called OO. Imagine that there's a Shape. Now imagine there's a Rectangle. The Rectangle "is a" Shape, you see? Moving on. (And if you're really lucky, they might mention that Shape could be abstract.)
  • Then there's a huge lecture on how hard pointers are and how they aren't that important because other languages don't use them. Then there's a half-hearted attempt to explain them, with lots of warnings that you can get an A without knowing anything about them.
  • And then there's recursion. There are even more apologies and hand-waving, and then all of a sudden you never have to think about pointers or recursion ever again (or so they lead you to believe).
  • The rest of the course is about familiarizing you with GUIs, plus some discussion of data structures (and if you're lucky, they might make you actually implement one or two). Oh, and by the way, they practically write the data structure code for you and don't stress full understanding. Just memorize when to use an array and when to use a linked list and you'll be fine.

The same story applies to a class called "Software Engineering". It is only offered in Java.

  • Memorize syntax
  • Do some basic GUI work
  • This is what unit testing and source control are (the bright spot of the course, they actually mention them!)
  • And then the rest of the course is about Agile and Extreme Programming with rushed, partnered projects that consist of one screen that does one simple calculation (repeat, slightly modified, 10 times).

Now it's not all bad, I've had a good experience in a class about DBMSs and one about algorithm design. But most of it just seems... wrong. Are all schools like this?


I went to both the local university and DeVry, so I've seen two sides of the (possibly poly-dimensional) metaphorical fence.


A very strong emphasis on data structures at the university, including their big-O performance and when one structure is appropriate for a given kind of storage or retrieval and another is not. I've seen people who didn't get much of this, and it's a serious detriment.

A strong emphasis on the software development life-cycle at DeVry. Finding problems ahead of time with requirements gathering and use cases has proven very useful. The final project (which was basically writing a program under contract, all steps involved) was invaluable. I don't know if university students get as much of the business side of things, but that exposure has served me well.

Lots of database practice with SQL at DeVry. There were 3 courses (only 2 required, IIRC), but my teacher was excellent, and at this point I'm using SQL almost daily. Having a very strong grasp of how relational databases work and how they should be designed and structured is a very useful skill. I don't know if university students get much (if any) of this. We didn't get just SELECTs and UPDATEs; we got JOINs, stored procedures, and tons more.

The C++ classes I received at the university were great because (along with the data structures, which were integrated) they put a huge emphasis on pointers. You had to understand them to do decently, which was a good plus. The fact that I later had a required class in microcontrollers (using Motorola HC12s) helped push the understanding of pointers and how it all works with hardware. This opened some students' eyes from the "memory is a bunch of boxes" stuff you get in Programming 101.

I may not use pointers in my professional programming directly (I do Java), but the understanding of how all that works makes designing data structures much better.

Of course today I think they teach in .NET or Java, so much of this is lost.


There was very little debugging, very little source control. Neither place did much (if any) of either one while I was attending. I was barely instructed on unit tests. I think they were only in the course because of the professor, not the curriculum. These would have all been useful things to know.

Debugging can be such an art that it would have been nice to have a class just on that. We got some "here is how to use the debugger" stuff (this is how you step, this is Y, this is Z), but never "here is how you approach a problem like this". That kind of information would have been quite useful.

Visual design and design of user interfaces is something that I've basically picked up bits of through reading online and experience. Having courses on some basic rules would have been useful and might help the glut of unusable interfaces out there.


The only thing that jumps out at me was my database teacher's instruction to never use SELECT *. It has actually proven very useful in many situations (specifically when writing DAOs), since you don't have to keep adding one column at a time. I understand not using it when generating reports and such, but the blanket "never" I was taught proved too strong.


Are all schools like that? From what I've heard they are usually heavy theory (the university I went to) or heavy business and getting things done (DeVry). The combination made a very good education, actually.

But they both have failings. The university was dumbing things down (going to Java from C++) and didn't provide any real-world experience. If you wanted that, you'd need to get an internship and hope they taught you very well (project management, project design, etc).

DeVry seemed like they didn't get much of the theory from what I saw (I joined with credits already). They used Java for everything when I was there, but I think they've switched to .NET. Pointers are probably mysteries to many students, given only a basic "this is what they are, we won't use them" explanation.

In the time since I left, they implemented something they were calling the tracks system. This meant that instead of being a CIS student, you were a Digital Criminal Justice student or some such. You were a DB-heavy student, a web-application-heavy student, a computer forensics student, etc. From what I was able to see, it just meant teaching less general material with a more laser-like focus. While the DB students probably got better at DB work than most CIS students ever did, they didn't get as much of everything else. This just produces pigeonholed people who can't work outside of the little area they've been trained in.


Universities are not supposed to teach you only language-specific things. They are supposed to teach you the ability to learn new languages, paradigms, etc. on your own. So the focus should naturally be on things that are common to most programming languages: how to compute things - for example, data structures and algorithms. Furthermore, they should expose students to various different kinds of knowledge so that students can go further on their own if they want to.

As my friends who learned Scheme as their first language quoted their lecturer: you should be able to learn any programming language within a week after you've learned Scheme. What could be lacking if one has such learning ability? :-)


Software development and Computer Science aren't the same thing. If your ultimate goal is to go to work for a company and churn out code, you would probably be better served by continuing to work.

On the other hand, if you're interested in why and how what you're doing is working, Computer Science might be a bit more relevant. Only if you're willing to take the time, though.


1) Maybe I went to a really good school but didn't realize it. It has a good reputation, but not the top. Anyway, I don't understand the reason for the frequent comments (here and on other sites) about how school only teaches theory but nothing practical that you really use on the job. It has been quite a while, so I certainly don't remember much of it, but in addition to the many long, late nights programming smaller assignments, some of the practical things I had as assignments were:

  • Implement a SQL database
  • Build a Linker/Assembler for a fictional PC
  • An Artificial Intelligence Project (Mine was language processing)
  • Add language features to an existing compiler (Pascal in my case)
  • Build a computer from TTL Chips
  • Create a 3D Graphics Library from just the pixel/line routines (yes it was slow, but I learned the math)

Plus, I learned how a computer works and all the parts, from the bit-level AND/OR chips to the ALU, the ripple counter, etc. I understand how binary arithmetic is done inside the computer, and I know how an OS and its innards work. Plus I learned math and how to apply it. This doesn't count the numerous projects I worked on because I was inspired by something I learned in class.

Do I use this stuff every day? No. But from time to time, I've had to apply something I learned from almost all of it, even if it was just using what I learned about how to learn. People who don't believe that a formal education is very useful in the real world are short-changing themselves and limiting their growth potential. I like variety; any more than 6 months doing the same type of work and I become mind-numbingly bored. I've found the only way to even have the opportunity to work on varied types of tasks is to prove that you are capable of it. If you haven't had the math, you can't pick up a book and learn many of the more interesting topics such as DSP, computer vision, or 3D graphics, to name just a few. I received that background in school, so I can become the local GPS expert or DSP guru or whatever other specialty role is required in a reasonably short time. I have my formal education to thank for that ability.

2) They did not do a good job of connecting the dots as far as making you understand WHY they are teaching you certain things. They also did not teach the importance of the soft skills.

3) Not that I can think of.


1) University will teach you in-depth theory. It has no immediate practical use, but it forms your mind and prepares you for your future growth. You are made aware of things like algorithm complexity, so that when you write a nested loop inside a nested loop, you don't wonder why the server takes 20 minutes to respond to your request. You learn about finite automata and Turing machines, which gives you the confidence to argue on Slashdot that all programming languages are equivalent.

2) The practical side of the art of programming is that you must code many thousands of lines of code to learn. University courseload will never achieve that. The challenge that comes from setting up your own goals, trying to write the code that solves your problem, iterating, not being quite happy with it, tweaking it some more over months and months... that's what will happen in real life (at work). But not during College. So you must have some of the self-made attitude as well.

3) I wasn't taught anything that was wrong on paper. It's more what was left unsaid, or the emphasis on certain things that proved to be unimportant. If you are smart enough, you do exactly what you said in your example (horror story): you listen, you learn what sounds smart, and you dismiss what sounds bogus.


I'm a high-school teacher at a technical school in Italy. Over a three-year course (students aged 16 to 19) I try to teach:

  • 1st year: MASM: small simple routines on strings, then the same small routines in C using vi and gcc
  • 2nd year: structs, lists, and trees in C using vi and gcc; IPC and pthreads using gcc in any IDE the students configure (generally they like NetBeans)
  • 3rd year: advanced OOP in C++; socket.h and design patterns

Meanwhile, in another parallel course my students learn:

  • 1st year: basic programming on vectors and matrices in C (gcc)
  • 2nd year: basic OOP and GDI+ in .NET (generally C#)
  • 3rd year: SQL and advanced .NET using various DBs

I tend to give class tests that have to be done on paper, writing directly in C/C++ code. I always emphasize "talking code".

Can you tell me if you think this approach is good? Thanks in advance (and sorry for my English).

  1. Thinking algorithmically.
  2. More Practical Programming.
  3. Not that I can think of (but some wasn't very useful).

  1. Problem-solving through decomposition and abstraction.
  2. Tools and processes (Source code control, unit testing, deployment, etc.)
  3. I can't think of anything counter-productive, but perhaps useless: In school we had to flowchart every program and I haven't created a flowchart since (20 years).

Sounds like you went to a shitty school.


Personally, I think your experience is not entirely representative of what incoming students should expect to find in a well-respected university. Certainly, at my university, CS students start with Scheme to learn the basics and start writing larger and larger programs using recursion and the other principles that come with a LISP-ey language.

It isn't until they learn all of the "hard" functional programming stuff that goes along with Scheme that they're introduced to higher-level languages, which are then used as the means for teaching principles like OO design, pointers (in C++, of course), data structures, algorithms, etc. These principles are taught pretty well, in my opinion; lectures tend to be clear, questions are answered correctly, and to make the concepts concrete, students are instructed to come up with some sort of useful program designed to teach them whatever concept is being covered.

You also implied that some of the concepts are taught in a hand-wavy lecture or two. From what I've seen, universities tend to have full term courses on topics like Data Structures and Algorithms.

Given all that, I think it would be unfair to categorically state that the education you get from a proper university is not as useful as what you'd pick up by reading JoS or SO.

That being said, it does depend quite a bit on who your professors are and such. Most importantly, I find it very useful to take more challenging courses whenever possible. For instance, if there's an "honors" version of the same course being offered, taking that course generally ensures that both the professor and your classmates are of a higher caliber than otherwise, and the course itself tends to go into greater depth and detail.

Now, to answer your questions:

  1. Not sure yet.
  2. There are always going to be some things left out of a university education, but in my experience the framework and background knowledge you acquire from a university are generally more than enough to enable you to figure out the answers yourself or with a reference.
  3. No. Although they might have overstated the importance of commenting in my early programming courses.


1. What are the most important things you have learned from higher level education as they apply to programming?

The most important thing related to programming I feel I learned was how to learn. As has been mentioned, most of the curriculum from my CS program has very little immediate practical value (at least, I have yet to see very much). However, the curriculum did teach me how to go about learning something new. Also, critical thinking. Being able to become familiar with an application domain, and then analyze and solve a problem has been a very useful skill that my university professors helped develop.

The most important thing I learned at all from any of my CS classes was that there are all kinds of people, and you'll have to work with all of them. We studied sections of the book Dinosaur Brains, which gives very good advice for how to interact with people of differing social styles.

2. What things were sorely lacking from your formal education?

A compilers class. My university's CS program was very small, and more focused on software engineering than pure computer science, and didn't have a compilers class. At the time, I was grateful for that. Now, I sort of feel like I was robbed a little.

3. Was there anything you were taught that was counter-productive for your career or down-right wrong?

I have yet to see anything that fits that description. All of my university professors had extensive backgrounds in industry, and they made it a goal to prepare us for industry and for software development in the real world. Of course, it's only been 6 months, so I may yet find something.


Here is my experience at the University of California (UC):

There is generally more emphasis on freedom and theory than on application - the general thought being that if you understand the theory, you can and should be able to apply it. I took "Software Engineering" courses that involved coming up with a project of your own in groups that lasted the whole semester. The different approaches (agile, waterfall, XP, etc.) were taught, and we were encouraged to use the agile approach. Much emphasis was placed on freedom: we could use whatever source control we wanted (I believe some people even just emailed each other back and forth, but SVN was provided and some used Git), whatever language we felt comfortable with, etc. The project didn't have to be useful, hard, or practical - some projects were rather trivial and others seemed really well planned (e.g., an iPhone flashlight and a polished Gauntlet clone that supported multiple players across a network). I would say that this course gave you the opportunity to explore the different possibilities of software development, but only if you pushed yourself. The class itself presented many applicable software practices such as debugging, design, and code review, which was very useful.

We also had self-paced courses in which you could pick a language, follow a guide, and teach yourself the language. This was an easy way to motivate yourself to go through a language with help and guidelines. In most classes, though, the professors spend at most one or two sessions going through a language and expect you to work through the rest on your own time, seeking help if you feel you need it.

For other courses, they were all heavily focused on theory. This isn't bad if you want to go into kernel hacking or into redesigning the core components of a DBMS - both classes are offered. Anything that neared practical would be either far too time-consuming or trivial. One of the more interesting courses offered was "Artificial Intelligence" - I would have to say it was one of my favorite courses (in part due to the great professor and what the course itself offered - AI through Pacman). Another good course that came closer to practical work was "Compilers" - SVN was used, and clean code and documentation were expected. You also had to do your own testing and provide the test cases, which were then run against other groups' code.

I'm a transfer student, and prior to transferring, at the community college we didn't touch any source control system, we didn't do any actual debugging, we didn't focus much on what's practical, nor did we do in-depth theory work. However, the experience is good if you are relatively new to programming.

I would say that university provides an 80% theory / 20% practical split in most cases. If you push yourself and find projects to work on (by yourself or with others), you can gain quite a bit of practical experience. The university provided many opportunities to find projects to work on - they won't necessarily be the type of work you would do as a software engineer, but for the most part they are all pretty interesting (and open-ended). If you want to push yourself and your university has a good Computer Science program, I would say the university will provide you with a better overall experience.

  1. Domain Knowledge outside of CS. I apply computer science to a specific scientific knowledge. Double competency was key for me.
  2. Commercial Software development practices.
  3. Lambda Calculus.

  1. If your code is not readable, clearly, by a human - fucking rewrite it.
  2. OO is about "separation of concerns"; test-driven development and refactoring; evolutionary design; software development is about people.
  3. "The more you plan upfront, the better"


What are the most important things you have learned from higher level education as they apply to programming?

Problem solving techniques - e.g. greedy algorithms, dynamic programming

Theory - NP-completeness, randomized algorithms, etc.

Mathematics - a good mathematical mind...


What things were sorely lacking from your formal education?

The applied side is totally useless because it's not taught well, nor is it required. We have first- and second-year students who think Python and Java are all there is to programming, and who go splat against a wall when they have to learn about operating systems and write C programs. They don't understand heap/stack space, pointers, or stack frames, and they look at me funny when (as an n-th-year student) I point out that they're writing through a char* that points to nowhere.


Was there anything you were taught that was counter-productive for your career or down-right wrong?

Our software engineering course still taught the waterfall model... enough said

Can't you sense the bitterness? ;)

  1. Algorithm design & analysis, data structures, lots of courses dealing with processor architecture, OO design, operating systems, databases, compilers, and we worked with a wide variety of languages (x86 & MIPS assembly, C++, Java, C#, Scheme, Common Lisp, Prolog). I also loved my AI and computer graphics classes (computer graphics was heavy on the mathematics behind 3D rendering, but we did have a few OpenGL-based projects).

  2. The business side of the field. I learned quickly after graduating that being a professional programmer is only about providing business solutions. As a software engineer, you solve business problems, not intellectual ones. You don't play with AI, write ray tracers, or build compilers - you build seemingly useless web apps for your clients based solely on vague, conflicting requirements. It's also your job, not your client's, to determine what your client wants out of a product. You only do what makes money, not what's actually interesting. I loved CS while I was blind to this business aspect - now I kinda wish I never chose that as my major.

  3. Not that I can think of.


What are the most important things you have learned from higher level education as they apply to programming?

Anthropology: the tools for observing cultures, cultural situations, and interactions are very relevant to observing a business world situation, getting requirements, avoiding producing work that meets requirements but only looks good on paper, etc...

And the historically charged courses on how to read ancient theology texts for inferences, assumptions, and what lies between the lines turned out to help me decipher arcane AT&T communications for inferences, assumptions, and what lies between the lines...

And writing, of course, has something to do with writing programs people can read.

And the math and CS stuff was also good: math for building the brain, CS that included a software engineering course that was basically equivalent to Code Complete. But good humanities courses, and in particular good CS courses, build transferable skills even when they don't give buzz-words for your resume.

What things were sorely lacking from your formal education?

There were some things that were pretty bad. But I've received enough advantages that I don't want to write indictments here.

It sounds like you had an unfortunate experience. But at least some of what's out there is helpful.

Was there anything you were taught that was counter-productive for your career or down-right wrong?

Almost certainly. But there was also something that helped me learn from my mistakes and learn from when I was wrong.

One closing comment:

There is no sport I've seen (I know, there's probably one on the web) where one sits on a weight machine in some artificial posture and makes a rather stylized motion. I've never walked in on friends watching sports on TV and seen a camera close-up of some athlete bench-pressing his heart out.

But that doesn't mean that weight training is irrelevant. I've never heard of an NFL player who does not lift weights or perform some comparable strength training.

Education is something like that. Not exactly--there are many good programmers who are self-taught, and some of the best are self-taught, even if that means they've devised their own strength training program.

Physical strength is a transferable advantage in sports, and education is a transferable advantage in programming.


1. What are the most important things you have learned from higher level education as they apply to programming?

This has been covered well by others.

2. What things were sorely lacking from your formal education?

This has also been covered well by others.

3. Was there anything you were taught that was counter-productive for your career or down-right wrong?

Most CS programs beat into students the incredibly wrong notion that elegant technical solutions are the #1 reason for software development success. This is all they focus on. In reality, as most of us have realized, nothing could be further from the truth. Correctly managing political and project-based factors and constraints is far more important to the outcome of a solution than the technical aspect. The real-world learning curve would shrink if universities taught the importance of soft skills in software development.


I got the feeling that what my CS professors were teaching me was how to be a CS professor.


What are the most important things you have learned from higher level education as they apply to programming?

My undergrad major was Math, which teaches the importance of definitions, logic, and clear exposition. It is vaguely like law in this regard. This had huge implications for my later studies in computer science. For one, some subjects are directly related to math (e.g. algorithms, theory of computation), so the benefits there are clear. Beyond that, math helped me as a communicator, whether writing an email message or helping a domain expert write a requirements document.

As a secondary influence, various classes in the humanities helped me learn to write. Clarity is one thing, but grace in written communication is invaluable in the workplace.

What things were sorely lacking from your formal education?

I'm not a good system administrator. I admire those who can troubleshoot a networking issue. It would have been good to have some more hands-on training.

Also, I'm not especially savvy when it comes to business and international commerce. e.g. Being offered gigs in other countries, or understanding how to do freelance consulting. I've learned a lot in this regard, but there are a ton of issues to consider.

Was there anything you were taught that was counter-productive for your career or down-right wrong?

I was at a top-shelf grad school, and the head of the department specifically told me he would not divulge what was "hot" in the market for jobs. One might think that he wanted me to pursue my passion, but I think he was covering for himself. He said it was my job to figure out where the bull market was. Nice.

Also, when I was in school, there was very little concern for robust code. Just hammer out the solution and turn it in. It was a surprise to learn in the real world that so much time is spent dealing with problems. e.g. What if the input isn't valid? What if the database goes down? Similarly, there was virtually no talk of testing, but that probably reflected the state of the art. I don't know what schools teach now in terms of unit testing.

  1. Object-oriented programming
  2. Source control, unit testing and basic design patterns
  3. Not that I can think of

Most good schools are not like this. There are a bunch of trade schools that hand out worthless degrees, but most of the top ones do a good job of teaching theory rather than coding rules. As for the classes (this applies especially to "professional development" classes) that pretty much just regurgitate what was in a book: some people learn better that way than by reading and experimenting on their own, but if you can learn from a book, I think that is more efficient both time- and money-wise.

  1. Object programming and component programming have been very useful for me. I can keep improving my programming skill through the team project and the graduation project. MSDNAA at my university also helps - free MS software!
  2. There was too little SVN, too little of newer technologies like AJAX and jQuery, and too little Agile (TDD etc.).