I will soon have finished my postgraduate studies and will therefore need to decide which way to turn once I am out in the wild. All they have taught us all these years is Java and various Java technologies. "Makes things portable", "Will run on any platform", "Is free", they keep repeating. Okay, fine, cool, be that as it may.

However, I would like to develop applications for the Windows platform and start using the Microsoft technology stack. I don't even know what constitutes the "Microsoft technology stack"! They avoid this topic as much as they can at the Uni.

There is one particular teacher who can't maintain even a basic level of professionalism and refrain from openly dissing everything Microsoft. It's okay to have a personal antipathy toward a product or an entire line of products; I myself am "not exactly promoting" the Mac, but I try to stay neutral about it in public so as not to offend those who enjoy it.

The teacher I mentioned above keeps claiming that he knows 'no one' who develops apps solely for Windows and actually gets paid for it. He keeps glorifying the various Linux distributions (in my opinion, well beyond their actual merit) and tries hard to dissuade me from going the Microsoft way. Because of the way my university has chosen to teach its content, I have not had much time to learn to work with Microsoft technologies (and also did not see much point in it). Indeed, I have never developed anything for the .NET platform.

But I would like to! I spent years developing Windows applications in Delphi before I joined the Uni, and I enjoyed it. I would like to get back to it. Not necessarily Delphi, but C# is looking swell. I am a little worried, though, by the fact that there's been a lot of Microsoft hate-talk lately, Macs are on the rise, Linux distros are becoming more and more widely spread and their community grows. Would I be getting on board of a sinking ship if I decided to take the MS path? I am talking about ten-plus years from now; is it not going to be all dead?

31 accepted

Educators in universities tend to have different views, priorities and agendas from those working in the real world, and I would not pay much attention to anything your teacher has been saying, as it doesn't appear particularly accurate to me.

Personally, I develop with a range of tools and platforms and would urge anyone else to do the same; sticking to one platform is not a particularly healthy or future-proof way to go.

As regards Microsoft? I've been developing on the Microsoft platform for close to 20 years now, and I struggle to remember a more exciting time for MS developers. Visual Studio 2010 and .NET 4 are out in April, and they look sweet so far!

There is so much choice out there for developers these days; drink it all in is my advice!


a lot of Microsoft hate-talk lately,

Pretty much ever since they wrote that open letter about people copying their BASIC, there's been a disconnect between Microsoft and the hobbyist Unix and academic communities. It hasn't made them go away.

Macs are on the rise,

That's news to me - I have seen precisely two Macs in the workplace in nearly twenty years. The top three Google hits for 'mac market share' contradict each other.

For example, Ars Technica's readership has more Windows 7 installations than OS X, with Windows 7 growing faster; though in terms of the value of sales, I think Apple is growing faster.

iPhones are on the rise, as were iPods. In general, the future may well have more lighter-weight platforms, and Windows doesn't seem very competitive in that market at the moment. I've never seen a Zune in real life, even in a shop, and the Windows Mobile and Windows XP Embedded devices I've seen have been a bit clunky.

But having more lightweight platforms doesn't necessarily equate to a boost for platform-agnostic environments such as Java: on many devices you have to create a device-specific application just to use its features well, and not every platform will run Java.

Linux distros are becoming more and more widely spread and their community grows.

I use desktop Linux at home, but not at work; all my employers have had a big installed base of Windows, mostly XP, though they often use embedded Linux.

There seemed to be a bit of a Linux rush with netbooks, but they were a bit pants, and hardware has since improved enough to carry the overhead of running Windows. People are used to using Microsoft tools on their desktops, so I'm not sure the gains will be long-term. There isn't a good substitute for Word on Linux, i.e. one with the same level of detail in its user-friendliness that doesn't look like it hasn't been refreshed in ten years; functionally OpenOffice is mostly OK.

Would I be getting on board of a sinking ship if I decided to take the MS path?

Maybe, maybe not. I strongly suspect that while there are desktop computers, there will be Windows. If you want to use newer platforms well, learn their ins-and-outs rather than learning something which insulates you from them.

If I knew definitely how the IT market would pan out, I'd be an investment guru not a software engineer.


I would say that I'm getting pretty anti-Microsoft. But seriously, Windows isn't going to disappear any time soon.

I wouldn't worry about lack of experience in .NET. Just start a project, learn the ins and outs, and you'll get the hang of it quickly.

There's also no need to tie yourself down to any platform. You're good at Java, you'll get good at C#, and the world will be your oyster, as long as you keep investing in yourself and learning new languages when you can.


Open-source advocates have been proclaiming the death of proprietary technologies, including Windows, for twenty years now, and every year some new open-source project screams that it will be the ultimate solution for every user. Yet I still can't imagine my mom even understanding the difference between an operating system and Outlook, let alone wanting anything that typical open-source projects promise, or caring at all. I think nobody knows what things will look like in ten years, even if many people say they do.

Anyway. Just do what you think will help you in the near term, or just what is fun. Switching to another operating system, programming language, or framework doesn't take decades.


I don't think so, but ten years is a really long time in this field, and I honestly don't think anyone can give an accurate guess about that. To be on the safe side, I think one should not focus on a single technology or programming language, and should instead keep learning new stuff continuously.


It's sad but true: academia has a small but significant proportion of idiots who have never had to earn a living creating real code that does real work. But stepping aside from that argument, look at the numbers. Search the job sites and see how many Java jobs there are, and how many C#, within your area. Let's be objective about this.

By the way, you'll find it very easy to transfer Delphi skills to the C# world. C# positively drips with the character of Anders Hejlsberg, Delphi's original architect... I had no problem at all.


Look at the figures here. The top tag (C#) outnumbers the next (Java) by almost a factor of 2.

10 years is an eon in developer time. Even if it is a "sinking ship" (a position that I'd strongly oppose), avoiding it because it will be dead in even 3 or 4 years time would be the wrong motivation.

Learning such a popular technology is never a bad idea, and the lessons you learn on the way will help you no matter where you end up. As you grow as a developer, you come to realise that the really important skills have very little to do with the platform of the moment.

It's a desirable skill right now, so if you want to then I say go for it.


It's silly to think this is an "either/or" decision. However I can understand that you want to know how to focus your efforts.

Ask not merely whether Microsoft based technologies are a "sinking ship." Whether or not they're "sinking" is not the only risk posed by focusing on their technologies.

Do some analysis. What are other ways in which such a decision might put you at a disadvantage?

How frequently do they change their technologies? If you had invested in developing deep expertise in Visual Basic 10 or 15 years ago, how much of that would still be reaping rewards today? If you learned UNIX programming (in C or Perl, for example) 10 or 15 years ago, how much of that is still current today?

How much competition is there? Are there more opportunities for UNIX/Linux (and Mac OS X) programmers, or more for MS Windows? Now, how many qualified people are in those markets competing for those positions? (I personally chose Linux, as a sysadmin, because it was an opportunity for me to be a big fish in a small pond rather than a minnow in the ocean; but I made that decision when the Linux kernel was at version 0.99p10, and it was a very small pond back then.)

How lucrative are the available positions? What are the median entry level salaries for those focused on one or another technology? What are they after 10 years?

I'm still not a programmer. I'm a sysadmin who happens to do some advanced scripting and a bit of programming as part of my work.

However, I'm suggesting that you re-think your question.


No, Microsoft isn't going to vanish overnight (unless someone finds a software patent they infringe and successfully sues them for all they've got, which is unlikely). I openly hate them (just like your teacher) because I suffer so much from their products, but if you see an opportunity to make money, go ahead. No one but you can make your decisions.

Just keep one thing in mind: "I enjoyed working with Delphi" doesn't mean it will pay your bills. The market around Windows is pretty competitive, but it's also huge. So I suggest that you have a look, learn something (maybe Delphi was nice but .NET just isn't for you), and find a job that pays your rent. If that's Java, that's fine, since you'll gather experience anywhere, and you can learn .NET in your spare time. If it's a .NET job, you can learn and get paid for it.


With any tech there are no promises, but neither Java nor .NET will die out any year soon.

According to JobServe (still the best UK IT job indicator IMHO), there are 1509 job adverts mentioning C# right now and 1739 job adverts mentioning Java. So both markets are very healthy.

MS is increasingly offering cheaper dev tools, which may eventually build grass-roots momentum, but any Uni with an eye on its budget will focus on open-source and free dev tools. For cheap MS dev tools, try WebsiteSpark or the Express tools. Perhaps these will eventually feed into Unis, but somehow I doubt it.


When I was in college around 2002, I had a professor telling us that unless you were a great programmer, you might want to reconsider your major because there weren't going to be any programming jobs left in the US. Let's just say I'm glad I didn't listen to him. Windows may lose market share here or there, but it is not going to disappear in the next 10 years.


Regardless of a particular professor's preferences, I think the best way to guarantee your marketability is to be an expert in general software development practices: data structures, algorithms, performance, security, OOA/OOD, language definitions, etc. And the best way to do that is to experience lots of languages, operating systems, and development environments. I think Joel is right on when he talks about working in languages like Lisp and Haskell: not because you are going to find a job requiring Haskell knowledge, but because of the concepts and approaches that you will learn.


My personal story seems very similar to yours. At my Italian university they taught us almost everything (Java, C/C++, Unix: fork, shell scripting, and so on) except Microsoft technologies. When I started working, the company asked me to build an application that was a mix of Visual Basic 6 and MFC, so I had to learn the VB language and the MFC library. Actually, for developing Windows applications with a decent GUI in a short timeframe, they were not that bad.

Nowadays, customers ask more and more for web applications, and in the web environment MS seems to be weak. I believe this is because of four main things:

1) There are other strong players, like Google.

2) Companies do not dislike Linux on the server (perhaps with cPanel on top to make things easy), because they can find cheap, reliable web-hosting services, and their end customers see only the web application's interface, with little interest in what server runs behind it (and if they are interested, tell them you are running MS, and they will be as happy as children in a garden).

3) The PC world is threatened by smartphones. I don't mean they will replace the PC, but customers more frequently want applications to run on the iPhone too, and they will ask for them to run on the new Apple tablet. So if you build an app (even a web app) that runs only on Windows, your customers might not be happy at all, especially since they can easily find a competitor who makes a web app that works.

4) Some Linux GUIs (Kubuntu) and other applications (like OpenOffice) are starting to be easy enough for end users, so there is a better chance that normal people will adopt them on their PCs to save money.

I don't dislike C#, but I personally gave up on MS languages. I still remember the nightmare of upgrading an application written in VB6 to VB.NET; it was very frustrating, and we had to almost rewrite it all. I wonder why applications written for Java/PHP 1.0.0 still work on Java/PHP 3000.0.0 (apart from minor things).

On the other side, it has to be said that MS and Bill are packed with money they could use to wipe their noses, so to be defeated they would need to make many, many, many mistakes in sequence, which does not happen easily in the business world. As long as they have money and own the PC world (almost as a monopoly), they can make mistakes (like Windows Vista) and still raise their heads and try again.

The latest one from MS, to me, is their attempt to make the whole web-client world need Silverlight to write user interfaces (HTML + CSS + JS have worked great so far, but MS seems to say: "why keep using them? We don't make any money out of them!"). They push it because programmers, like sheep, will waste their time learning Silverlight in place of more standard web technologies and become the new MS slaves for the next 100 years.

If you want to be one of them, go ahead, but be careful: the new Apple iPad does not support Silverlight. Are you sure your customers (especially the managers, who will surely buy an iPad) will be happy if your web app won't work on their beautiful new iPad? Do you think they will let you say, "Oh, actually it's the iPad that does not properly support my website/web app, it's not my fault"? Apple rightfully adopted a good marketing strategy, the same disgusting one used by MS for years; letting the iPad work with Silverlight would have been a huge marketing mistake. If MS takes over the web, it is the end not only of a dream, but of millions of small (and big) companies that are very happy to work in the web environment because they don't need to be MS slaves anymore.


I've asked a similar question to this, and all I can suggest is that learning in your own time is probably going to be the easiest way to do this. The knowledge you gained from your current course means that you at least have experience developing mobile applications, whether that helps you build Windows Mobile/Windows Phone 7 applications or Android applications.

It's indeed a shame that your lecturer is so against Microsoft, for whatever reason, but it doesn't take a genius to work out that a lot of software is developed on and for Windows for a reason.

Again, the best way is to learn in your own time: buy a few books on C# (Jon Skeet wrote a brilliant one :p), and then, if you feel it's what you want, you can continue to progress in C# or move to another .NET language.

Here's a link to my question too, might be useful to you:

Future prospects after completing University


A university graduate should find no obstacle in learning any current IT framework or technology. You know your algorithms and your data structures, you've probably put together a compiler or two, and you have popped your cherry with C. What technology you use depends entirely on what type of job you want to do.

Learning MS technologies is a safe bet for a job; knowing C# and the .NET framework in general will land you a job pretty fast. Java jobs tend to be fewer, though in my experience they were also higher paying.

My point is this: learn the Microsoft technologies you are interested in, but also make the effort to research the alternatives. Learn Java too and get acquainted with the massive amount of Java-based open source frameworks (something you will find lacking in the MS world). Learn Python, Scala and all the rest.

Also keep in mind that different kinds of jobs require different technologies. I currently work in Scientific computing, a field in which Microsoft really doesn't have anything to offer.


Looking at the development of new technologies and tools, Microsoft has really been producing some neat new stuff lately. Technologies such as Silverlight will become more prominent, Azure will do some neat things for cloud computing, and the new Visual Studio (2010) is truly awesome (IntelliTrace, for example).

I don't expect Microsoft to drop off within the next 10 years :)


My suggestion is to learn to program using the Microsoft stack if you want to. In my opinion, familiarity with a programming language matters more than familiarity with a platform, since every platform is documented. If you do not know the platform but you know what you want to do, there is always the API documentation to turn to. But if you have no idea how to do what you want, then we have a problem.

Also, I agree with Skilldrick; you don't have to tie yourself down to any platform.

Suggestion: if you want to be able to program for any platform, learn C/C++. The platform specifics you can learn as needed.


I don't think I agree with your teacher, but I'd advise caution in "going the Microsoft path" anyway.

History has demonstrated that Microsoft has a tendency to quickly deprecate technology; ActiveX and COM/DCOM are the most prominent examples I can think of. One year they were "the next big thing", and the next they were abandoned.

Microsoft has kept .NET alive longer than ActiveX or COM/DCOM. But can we be sure that it will not abandon it in 10 years?

As others have said, 10 years is a lot of time.

So my advice is: don't put all your eggs in one basket.

If there's money to be made, sure, go "the Microsoft path". Just make sure you have a way out: invest some time in learning other technologies, and maybe even build a small project with them. Then, if the day arrives when Microsoft decides to stop supporting your technology, you can jump to another.