For a long time, the only IDE I knew was the VB6 IDE, which is fairly outdated (ca. 1998) and not very feature-rich unless you purchase third-party add-ons. You can set breakpoints and watches, and there are other now-commonplace amenities, such as IntelliSense.
So, when I saw Eclipse for the first time, I was amazed at how much the IDE could do for you (the Generate Getters and Setters command, the quick-fix feature, the Refactoring menu, etc.). The same thing happened when we moved to Visual Studio and I saw all the auto-generated Form Designer code for the first time.
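For readers who haven't used it, this is the sort of boilerplate Eclipse's Generate Getters and Setters command produces with one click (the class and field names here are just an illustration, not from any real project):

```java
// A plain field plus the accessor pair that Eclipse generates for it.
public class Person {
    private String name; // illustrative field

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
```

Trivial to write by hand, but the IDE turns it into a two-second operation, which is exactly the kind of convenience I'm asking about.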
These extras are great for productivity and just plain getting things done, but I have to wonder: are we becoming too reliant on IDEs? How many programmers actually understand the rationale behind the refactoring suggestions their IDEs give them? Do they need to understand why the IDE is suggesting a particular change to the code they just wrote?
I just wonder, especially as more and more productivity features get crammed into IDEs, how much thinking future programmers will actually do if the IDE shields them from having to think things through before they code, and instead becomes a crutch (i.e., "I can write crap code, because the IDE will refactor it for me").
Sometimes I find it very liberating to just fire up a text editor and code in it for a while, to prove to myself that I still actually know the language I'm coding in.
What are your thoughts on this? Are super-friendly IDEs jam-packed with productivity features ultimately harming programmers by hiding too many details of the language and shielding them from making real design decisions?