I hear this stuff a lot, since I'm still going to school while employed as a developer. Professors come out with strange, odd, and sometimes downright incorrect sayings all the time, like "INSERT and DELETE SQL statements are almost never used in the real world".

It seems the less experience a professor has, the more they want to go on and on about "the real world" and how things will be different.

I once had to listen to a 30-minute rant from a professor teaching a database class who said, "In the real world, tables always have prefixes. This book is awful because it doesn't have prefixes on any of its tables."

Ah, I revised the title, so the question is: What have you heard about "the real world" from academia that is completely incorrect?

Also, I'm not trying to bash academia or professors in general. As with everything in life, you'll run into everyone from "amazing" to "awful" and everything in between.

For something that's gotten so many views, maybe it should be reopened :) Ah, 3 votes left!


Academia lie #27: that I'd require post-scholastic knowledge of integral calculus.


"You will have to relearn everything every six months... Because the industry changes so fast."


I find that a lot of things I learn in university have absolutely no practical use in the "real world".

I think what you should gain from school is that you are essentially "learning to learn", and it's those kinds of subtle, transferable skills (thinking logically, timekeeping, communication, planning, etc.) that you should gain and nourish at school.

The actual content isn't what matters, I think. Technology changes so quickly; it's learning the paradigms of thought you need as a developer that counts.


I heard that in the real world, junior programmers start at $60,000 and go up from there. I heard that people two years out of school were making $100,000 minimum.

I was in school during 1999 and 2000.


The Waterfall methodology was the way the real world works when it comes to project management and development.


Sorting algorithms. I had to write my own for lord knows how many projects. Now, in the real world, most languages I use can simply use .Sort() and I can move on.
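To illustrate the contrast, here is a rough sketch in Python: the kind of merge sort a course project demands next to the one-liner that replaces it in practice (Python's `sorted` standing in for `.Sort()`).

```python
# Coursework version: a hand-rolled merge sort.
def merge_sort(items):
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged = []
    i = j = 0
    # Merge the two sorted halves, always taking the smaller head.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

# Real-world version: the built-in does the same job in one call.
data = [5, 2, 9, 1, 5, 6]
assert merge_sort(data) == sorted(data) == [1, 2, 5, 5, 6, 9]
```

Knowing how the hand-rolled version works is still worth something; you just rarely get to (or should) write it on the job.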


I was told so many lies it wasn't funny.

Things like: you'll have clear requirements.

You'll work with smart people.

You won't use open source.

And so many other things... I wish I could go back to college and warn students of the horrible things that await them.


Problem is... The people that are most passionate about software development are actually out here developing software already.

Academia is good for learning concepts. It's a good place to learn that INSERT and DELETE actually exist and how to use them purely conceptually.
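For what it's worth, the statements themselves take about five lines to demonstrate. A minimal sketch using SQLite (the `users` table and its rows are made up for illustration):

```python
import sqlite3

# In-memory database purely for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# INSERT: add rows (parameterized to avoid SQL injection).
conn.executemany("INSERT INTO users (name) VALUES (?)", [("alice",), ("bob",)])

# DELETE: remove rows matching a condition.
conn.execute("DELETE FROM users WHERE name = ?", ("bob",))

remaining = [row[0] for row in conn.execute("SELECT name FROM users")]
print(remaining)  # ['alice']
```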

However, academia is rarely a good place to learn patterns: what technology to use in which situations, class design patterns, and good architecture. That's why this site is so popular, and the patterns are what make you a professional developer.

Learn the concepts in the academic world... then learn how to use them in the real world. Usually the only exception is when you have a teacher/professor who has a living example of a popular real-world application out there.

  • We were taught about various development methodologies, but I've yet to work at a company that uses any development methodology.
  • The salary expectations we were given were not even close to reality. It took three years before I could even find a job in my field.
  • "They ask for our grads by name!" ... Not from my experience. They cram 4 years of work into 2 years of schooling, but because of that they can't give a bachelor's degree. I would have been better off at a real university.
  • We had to learn Oracle because supposedly nobody used SQL Server. Everywhere I've been has used SQL Server.
  • They said we would need Discrete Math in the real world. I haven't used it since graduation.
  • We were told that memos had to be three pages long. I told them nobody would ever read a memo that long, but they didn't believe me. I think one of my coworkers went to the same school because his memos are five pages long.

The only instructor who had a clue about the real world was someone who had been in the real world for many years and became an instructor later in life.


In truth, I didn't hear much about the real world while I was at university. And I was thinking earlier today how much better university courses would be if they predominantly taught the stuff that's actually needed in the real world.


1) That industrial problems are never as challenging, difficult, interesting or worthwhile as those in academia.

2) That minimizing the constant factor doesn't matter and isn't worthwhile.


One of the Big Lies is about 'software process'. The problem is that practically no two companies agree on what is the correct software process, so it is impossible to teach anything that a majority of students will encounter in the Real World. So many students feel lied to when they discover that everything they've been taught about software process is irrelevant, or even downright wrong.

Now if we taught the software process called OhmigodthedeadlineisWHEN???, we might prepare at least a significant minority of students for their real-world development experiences...


A professor for one class led me to believe that employers in the real world expect every line of every method to be as optimized as possible. While being able to optimize code is a great skill, more often than not a "good enough" piece of code is more cost-efficient than the "uber-optimized" version. On small and medium projects, extremely optimized code may only gain you 0.01 to 0.1 seconds. The real world taught me to code everything first and then come back and optimize the bottlenecks. I am not saying to code sloppily, but if a "good enough" algorithm takes an hour to implement and the "uber-optimized" version takes a week, start with the good-enough one and see whether you actually need the extra speed in testing.

"Premature optimization is the root of all evil" are words to live by :)
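The workflow above can be sketched in a few lines. This is a made-up example (the duplicate-counting functions are hypothetical): write the clear version first, verify correctness, and only reach for the faster one when measurement says the slow one is a real bottleneck.

```python
import timeit

# "Good enough" version: clear but quadratic (slices and scans on every step).
def count_duplicates_naive(items):
    return sum(1 for i, x in enumerate(items) if x in items[:i])

# Optimized version: linear, written only after profiling justified the effort.
def count_duplicates_fast(items):
    seen, dupes = set(), 0
    for x in items:
        if x in seen:
            dupes += 1
        seen.add(x)
    return dupes

data = list(range(1000)) * 2  # every value appears twice -> 1000 duplicates

# Correctness first: both versions must agree before comparing speed.
assert count_duplicates_naive(data) == count_duplicates_fast(data) == 1000

# Then measure; optimize only where timing shows a real bottleneck.
naive_t = timeit.timeit(lambda: count_duplicates_naive(data), number=3)
fast_t = timeit.timeit(lambda: count_duplicates_fast(data), number=3)
print(f"naive: {naive_t:.3f}s  fast: {fast_t:.3f}s")
```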


Oh, they deluded me into thinking that NP-completeness, once proven, was the end of that problem.


The same DB prof from the original question also said, "In the real world, database triggers are used all the time."

I've never seen a trigger in any of my professional experience. I'm not saying that no one uses them, but I certainly haven't seen them used "all the time".
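For anyone who never ran into one either: a minimal sketch of the kind of trigger that prof presumably had in mind, using SQLite for illustration (the audit-log schema is made up).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER);
    CREATE TABLE audit_log (account_id INTEGER, old_balance INTEGER, new_balance INTEGER);

    -- Fires automatically whenever a balance changes; the application
    -- code never writes to audit_log directly.
    CREATE TRIGGER log_balance_change
    AFTER UPDATE OF balance ON accounts
    BEGIN
        INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
    END;
""")

conn.execute("INSERT INTO accounts (balance) VALUES (100)")
conn.execute("UPDATE accounts SET balance = 150 WHERE id = 1")

log = conn.execute("SELECT * FROM audit_log").fetchall()
print(log)  # [(1, 100, 150)]
```

Audit logging like this is one of the few places triggers genuinely show up, which may be where the "all the time" claim came from.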


My professors haven't actually lied to me, but I noticed some real deficits in the stuff that we are exposed to on a technical level. Take for instance the department web server, which I was obliged to use on a number of projects.

The thing was essentially a hacked rsync that threw files onto Apache. No dynamic includes even, and if I was lucky an old version of PHP was running. I made some very bad static HTML back then.

When I went into the real world, I realized how stupid this was, that nobody would ever operate in such a way, and that I could do something nontrivial with web development and dynamic languages (Python, Perl ftw). So yeah, basic IT can be lacking in academia.


I was told that Haskell would come in handy. I have never used it; in fact, in an interview when I mentioned it, the interviewer had no idea what I was talking about.


My professors said that you will not get past low-paying web development without a degree. After three years of studying for a BS in CS, I do medium-pay contract jobs at a big company. I plan to finish my education to get a high-paying job there, but I never had to do low-paying web dev jobs.


I was told with a straight face that dynamic memory (new, malloc) was evil and should not be used.