I'm wondering if anyone has had a situation where it's clear that you know more than everyone else on your team, including the lead and/or manager, and how you handled a situation like this?

For example, let's say you spend your free time learning ASP.NET MVC, NHibernate (or LINQ to SQL/Entity Framework/ORM of choice), Subversion (or Git) and an IoC container. You also study up on the Repository pattern, Pipes and Filters, and DDD, and have begun to dabble in WPF. Then you interview for a developer job at "Acme Corporation". During the interview, the team lead relates the following information:

  • The programmers use SourceSafe for their version control system.
  • All apps are ASP.NET 2.0 WebForms, with all logic put in the code-behind, and for data access they use Typed Datasets filled via SqlDataAdapters.
  • All classes are programmed to concrete implementations (as opposed to interfaces).
  • No unit testing is done at all, and testing is performed by running the app and playing around with it.
  • The impression you get is that neither the lead nor the team are aware of any other way to do things.

How do you handle a situation like this? Do you not take the job, because the skills you've learned on your own are "better" than the skills the team uses? What if you are in the early stages of your career and need some experience under your belt? Do you take the job and try to get the team to use better tools, at the risk of being fired? Or do you take the job and use Typed Datasets in code-behind pages and VSS, ignoring what you know are better ways of writing code?

To put it a more blunt way: How do you handle situations where you are Elvis in a sea of Morts?


Don't take a job for which you are overqualified - you will only be miserable.

If, however, you do take the job (for whatever reason) you cannot complain since you knew what you were getting into in the first place. I would caution you to keep an open mind as well - just because you feel that you are Elvis doesn't mean that those around you cannot teach you anything :)


Personally, I wouldn't take the job, especially if there's a sea of them, which means change will be resisted. There are jobs out there that do things "right" (i.e. your preferred way) and that take juniors too, so my advice would be to keep looking. Don't sell out just yet :)


Let's answer your question with another question... You say that they do not appear to be aware of any other way to do things. The question is, would they be interested if they knew? The sad truth is that in this economy, holding out for the ideal job is not an option for everyone (speaking as someone who is recently unemployed).

It really is a fine line. If you think that taking the job will dumb you down in the process, definitely skip it. If you still have a follow-up interview, take an example from their coding style, step up to a whiteboard, and show them the how and why of what you know and what you would propose as an alternative solution. Don't try to sell it as the "right" solution, just as an option. If they see the potential in it, there is a good chance they might be interested in changing.

I previously (still feels weird saying that) worked at a shop that was just like the one you described. They really did not know any better. It is easy for a busy shop to get caught in a rut and as technology and methodologies move on, they do not have a chance to stay current. We were able to start by getting things up to the current framework version, then selling the developers features by going - "Hey, this is cool, check this out... See this for loop where you are just looking for a value, look at what LINQ can do." Getting them excited about how much more productive they could be was a selling point enough, and (almost) everyone started looking for ways to improve their code.
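To make that selling point concrete, the loop-to-LINQ conversion I'm describing typically looks something like this (a minimal sketch with made-up data, not our actual code):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Demo
{
    static void Main()
    {
        var orders = new List<int> { 12, -3, 45, 7 };

        // The old way: loop until you find the value you want.
        int firstNegative = 0;
        foreach (var o in orders)
        {
            if (o < 0)
            {
                firstNegative = o;
                break;
            }
        }

        // The LINQ way: say what you want, not how to find it.
        int viaLinq = orders.FirstOrDefault(o => o < 0);

        Console.WriteLine(firstNegative == viaLinq); // prints True
    }
}
```

Seeing five lines of bookkeeping collapse into one declarative call was usually enough to get people curious.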


Many (big) organisations are still rather 'backward' in terms of the technology they are using, especially where there is integration with legacy platforms, mainframe systems and so on.

The big ones can't really leap to new technology even though they might want to. Corporate bureaucracy aside, and focusing on just the technical aspect: "if it ain't broke, don't fix it". Especially so if the system supports the backbone day-to-day running of the business.

It's up to you to accept the challenge to move 'back in time'. It could spell opportunity for you to make some advancement to their system or even methodology, though I don't recommend you jump the gun the moment you join. Fit in first (if you can) and see how your modern skills and knowledge can help the organisation.


Provide input to your co-workers in a casual way; don't come across as a "know-it-all". I believe you should put your all into your job and leave nothing out.

Given you are in the early stages of your career, a mantra I used to go by was this...

As long as the job offered a learning opportunity and allowed you to grow as a developer, the job was worthwhile.

So perhaps the backward team may be an opportunity for you to teach/inspire others to progress in the way you have, and this provides you the opportunity to adopt a senior/mentor role.


I generally dislike the better-than-thou attitude.

The cost of "updating" the skills of a team of 9-5ers is huge when you consider how much time you probably put into reading blogs and updating your skills. All their homebrew helpers, utilities and expertise in the tools they use would disappear.

And anyway, what's so wrong about the way they do things? Can you prove that using SVN, NHibernate, and IoC containers will make their jobs easier?

I use all the tools you mention in your post and am one of those guys who reads all the time and keeps his skills updated. I think the key thing to remember is that all of this may just be a passing fad, or some new tool will make what we are starting to do now (TDD, IoC, ORMs) look like the Typed DataSets of 2003.


I guess this fits -

I'm a junior in high school, and I'm taking a Web Design course, which they just started offering this year, because I thought it'd be fun and an easy course. (Note that I know XHTML, CSS, PHP, SQL, Python, and a good amount of JS.)

On the first day of class, the teacher casually commented that he had started learning HTML and CSS just a month before he was supposed to start teaching it. It irritates the hell out of me, because I want to shout out "YOU'RE DOING THAT WRONG!" or "TABLES ARE FOR DATA, NOT DESIGN!!!", but I somehow manage to hold myself back from doing that.

Bleh. It irks me that he's getting paid ~$40,000 a year to teach stuff that he has no experience in, besides what he learned in his several "HTML and CSS for Beginners" books that he has on his desk, which he has to refer to every other question.

I just grin and bear it, and get that easy A. :-p.


I'd say it all depends on the team you're working with, what you're working on, and whether you have the desire and (potential) position to effect change. Otherwise, you will probably get bored and frustrated.

As an example, 2-3 years ago our company was designing a new product and I wanted to go with NHibernate and a proper object model; I couldn't convince our manager, and we went down the (weakly typed) DataSet path. Fast-forward to today, and our manager and team are now using Linq2Sql and considering EF. We are still (stuck) with the DataSet approach for some parts, but you know what, it does work.

Although I think it was a mistake and cost us a lot of time and money re-implementing what NHibernate already does (and better), and we still don't have clear separation (Linq2Sql is exposed in our presentation layer), I'm still nibbling away at the edges, and whenever possible trying to influence our design so that we do depend on abstractions, write unit tests, enforce proper separation of concerns, etc. Why? It's an interesting job that pays well, and I work with a good team of people. That counts for a lot in my opinion.
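By "depend on abstractions" I mean roughly this kind of seam between the presentation layer and the data access code (hypothetical names, and obviously a toy version of the real thing):

```csharp
using System.Collections.Generic;

// The presentation layer sees only this interface...
public interface ICustomerRepository
{
    IEnumerable<string> GetCustomerNames();
}

// ...while the Linq2Sql (or DataSet, or NHibernate) details live
// behind it, in one swappable implementation.
public class Linq2SqlCustomerRepository : ICustomerRepository
{
    public IEnumerable<string> GetCustomerNames()
    {
        // real data access would go here
        return new List<string>();
    }
}
```

With that seam in place you can hand the presentation layer a fake `ICustomerRepository` in unit tests, and swapping Linq2Sql for something else later stops being a rewrite.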


Are you looking for a job, or a career?

The fact is that many people out there are interested in a job. Not many of them would read up on, say, Design Patterns, or learn new technology after working hours. And even during working hours, they might not be that interested either.

I'm thinking you are interested in a career. However, the person sitting next to you would simply be someone holding a job. Whatever you tried to teach or share would fall on deaf ears.

The team lead or architect could be in a bad spot too. Despite their knowledge of better or cutting-edge solutions (OSGi, Terracotta, Spring, etc.), their teammates might not be able to execute them well. And if you were to say, fire them and hire better ones, the company might not feel the same way. The longer you take to deliver, the worse the company does, and you will be the one who gets chopped eventually.

So the logical choice could be for the team lead/architect to 'dumb down' too. Stick with the most basic solution: JSP/servlets and plain SQL rather than Spring/Hibernate. If a build tool like Maven makes everyone confused, despite its benefits, stick to simple batch files. Distributed version control like Mercurial might be cool, but most people would not know anything beyond SVN (and sometimes CVS).

If you were to take up this position and have a strong desire to make things better, I would advise introducing changes incrementally and gauging the rate of adoption. If they resist and simply cannot see the benefits, drop it. Being a workplace, there will be politics involved, and the last thing you want is everyone at the office against you.

Sorry if I sounded negative, and I probably am. Let's just say that I was never lucky enough to be in a place described by many developers online like, erm, Thoughtworks.


My personal opinion on this is:

Worst case: Prepare to be fired if you attempt to push for any change or even remotely try to get anyone on your side.

Best case: During code reviews they ask you why you write code like the following (defensive guards and specific exception handling):

    try {
        // code goes here
    } catch (IllegalArgumentException iae) {
        // handle bad arguments specifically
    } catch (CustomException ce) {
        // handle the domain-specific failure
    }

    // guard clause: validate before doing any work
    if (value < 0 || myCustomObject == null)


An important part of the decision, when the job doesn't pass the Joel test, is to accept that whatever they are doing probably does work for them. It might not work as well as it could, but at least they have found a way to get some work done despite their tools. The real question would be, are they actually getting work done? If not, then don't take the job. If they are getting things done, then if the work they do interests you, (even if the tools they use don't meet your ideals), then you should consider taking the job.

They're getting work done because they are flexible enough to work within the constraints they have. Maybe they can continue to improve. Maybe you can help them.


What is it costing you to have this job? That's the big question that comes to my mind. If the co-workers are driving you nuts and every day seems worse than the day before, I'd consider quitting and not having that job. On the other hand, if the pay is huge, the benefits rock, and I only have to tune out a little, I might tolerate the pain, though I probably would try to fix things a few times before settling into a blah spot on the team.

Know what your dealbreakers are, what price you put on the almost-but-not-quite dealbreakers, and which things may help offset them. Where I work now has gone from SourceSafe to SVN and brought in various other changes that have helped a lot, and I'd like to think how we work will continue to change over the next few years. Is it the awesomest place on Earth? Nah, but very few places would be. Sometimes it can even be interesting to see what solutions you have to come up with on older technology: the latest and greatest may not have your current problem, but you aren't upgrading to it anytime soon.