And how did you come to understand it in the end? What efforts and skills helped you get there?

Edit: Adding a list of the skills proposed in answers (with popularity > 3)

  • Understanding humans
  • Understanding when to stop development
  • Learning it the hard way (a lot of work)
  • Realizing that not everyone thinks like a programmer
  • Programming in school courses
  • Learning older languages too (C?!)
  • Communicating and clarifying to align expectations
  • Knowing how to say no, and how to organize and prioritize problems and bugs




... and I'm not being facetious. Understanding humans is the most valuable software development skill you can have. After all, and I totally believe this, most software bugs arise from the requirements. If you can't understand your users' needs, then you've set yourself on an uphill trek before you write a single ++.

... and a further edit: you could do worse than trying to empathise. I've read Adaptive Path's short book on design recently, and I catch myself thinking about it in the context of software development all the time.

Preet Sangha contributed a great comment - 'people are hard, even the simple ones', even if it might be slightly harsh towards simple people (like me ;) ).


For me, the concept of "good enough" was (and still is) the hardest thing to deal with. While I want to produce code that is elegant, performs well, and meets all the requirements, sometimes you have to know when to draw a line in the sand and declare the release done.

Often things boil down to the holy trinity of software development: fast, cheap, and quality, where you generally only get to pick two of the three for any particular problem. Never forget the overall priorities of what you are creating.


Multithreaded programming.
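For anyone stuck at the same point, the usual first hurdle is shared mutable state. A minimal sketch (not from the answer above; names are illustrative) of why a mutex matters when two threads touch one counter:

```cpp
#include <mutex>
#include <thread>

int counter = 0;
std::mutex counter_mutex;

// Each worker increments the shared counter. Without the lock, the
// read-modify-write of ++counter races and the total comes up short.
void worker(int iterations) {
    for (int i = 0; i < iterations; ++i) {
        std::lock_guard<std::mutex> lock(counter_mutex);
        ++counter;
    }
}

int run_counter(int iterations) {
    counter = 0;
    std::thread t1(worker, iterations);
    std::thread t2(worker, iterations);
    t1.join();
    t2.join();
    return counter;  // deterministic only because of the mutex
}
```

Delete the lock_guard line and the result becomes nondeterministic, which is exactly what makes this topic so hard to debug.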


The ASP.NET page lifecycle.

Especially in relation to dynamic controls.

There are no skills that can help you here. You must get down into the trenches and wage war with it, day in and day out for months, until it finally lodges into your brain.


Monads. (Not that I understand more than 5% of them.)

Looking at how they make it possible to fake imperative programming, and how that in turn makes STM possible, helped.
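The "chaining fallible steps" part of the Maybe monad can at least be sketched outside Haskell. Here std::optional stands in for Maybe, and bind_opt is a hypothetical helper, not anything standard:

```cpp
#include <optional>

// Maybe-style bind: short-circuits on "nothing", otherwise feeds the
// value into the next step. The chaining is the monadic part.
template <typename T, typename F>
auto bind_opt(const std::optional<T>& m, F f) -> decltype(f(*m)) {
    if (!m) return std::nullopt;
    return f(*m);
}

// A step that can fail: halving only works for even numbers.
std::optional<int> half(int x) {
    if (x % 2 != 0) return std::nullopt;
    return x / 2;
}

// Two fallible steps composed with no explicit nullopt checks.
std::optional<int> quarter(int x) {
    return bind_opt(half(x), half);
}
```

This is only a toy: real monads also require the unit/bind laws to hold, which is the part that takes the other 95% of the understanding.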


I agree, people.

Not just for requirements, either.

Somehow in the MIT AI Lab I picked up the habit of gratuitously insulting people outside of the "in crowd" for the things they did.

It took a stupidly long time, and a Dale Carnegie course, to learn that, Gee, if you insult somebody they might not be inclined to work with you.


User Interface Design - Over time realizing that not everyone thinks like a programmer!

Test Driven Design - I read a lot of books/blogs and practiced.


That software doesn't have to be perfect in order to be released. Sometimes it is best to get something out there and in front of users.


Regular expressions
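For anyone at the same stage, a minimal C++ &lt;regex&gt; example (the date pattern is an arbitrary illustration, not from the answer):

```cpp
#include <regex>
#include <string>

// Extract the first ISO-style date (YYYY-MM-DD) from a string,
// or return an empty string when nothing matches.
std::string first_date(const std::string& text) {
    static const std::regex date_re(R"((\d{4})-(\d{2})-(\d{2}))");
    std::smatch m;
    if (std::regex_search(text, m, date_re)) {
        return m[0].str();  // m[1..3] hold the captured groups
    }
    return "";
}
```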


Object-oriented development. I learned it through a will to survive.


That I am not the greatest programmer that has ever lived. I was a bit cocky when I finished college; in some respects I still am. However, thanks to my project lead and a coworker, I learned that I was in fact not the greatest programmer, through some very mean-spirited code reviews when I started my career fresh out of college.

From this experience I learned that there is always someone better out there, and that once I stop trying to learn and understand, I have effectively become the world's worst programmer. This is because the world of software development is always changing, whether it is new ways of developing or new practices. There is always something that can make a software developer better. Yes, even if they program in COBOL.

I would take an intern who has the desire to learn over an experienced developer who no longer does.


char **ptr
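For readers hitting the same wall: a char** is just "the address of a char*". A small sketch (hypothetical names, C-style code) of the classic use, letting a function re-aim its caller's pointer:

```cpp
// Writing through the char** re-aims the caller's inner pointer,
// which is how C functions hand a freshly chosen string back.
void pick_greeting(const char **out, bool formal) {
    *out = formal ? "Good day" : "hi";
}

const char *current_greeting() {
    const char *greeting = nullptr;
    pick_greeting(&greeting, true);  // pass the pointer's own address
    return greeting;
}
```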


I guess pointers and memory. OK, that's two.

I got through it by being compelled to solve college programming problems that involved both.


In the very early days, pointers in C. Once you truly grok that though, you get a significantly better insight into the way the hardware works, and you're good to go.

A good understanding of C helped me so much. You listening, Jeff? ;-)


The Internet. It still boggles my mind how all the packets and signals can be routed so efficiently all the way around the world. I had no problem with things like pointers or OOP, but I still don't think I even have a clue how the Internet works.


Asymmetric encryption took me a while to grasp.
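What often makes it click is seeing how little arithmetic is involved. A toy RSA round trip with textbook-sized primes (p=61, q=53, so n=3233, e=17, d=2753); insecure by design and purely illustrative:

```cpp
// Square-and-multiply modular exponentiation; the toy modulus is
// small enough that the intermediate products fit in 64 bits.
long long modpow(long long base, long long exp, long long mod) {
    long long result = 1;
    base %= mod;
    while (exp > 0) {
        if (exp & 1) result = result * base % mod;
        base = base * base % mod;
        exp >>= 1;
    }
    return result;
}

// The asymmetry: anyone may know the public pair (n=3233, e=17),
// but only the holder of d=2753 can invert the encryption.
long long rsa_encrypt(long long msg) { return modpow(msg, 17, 3233); }
long long rsa_decrypt(long long ct)  { return modpow(ct, 2753, 3233); }
```

Real RSA adds padding, much larger primes, and constant-time arithmetic; this only shows why encrypting and decrypting can use different keys.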


In college I had courses that used Pascal and C++, and the toughest hurdle back then was understanding pointers. I remember one of my instructors tried to help by comparing a pointer and its data to a street address and a house. I understood the analogy, but it never helped me in actual practice, especially when dealing with multiple levels of indirection, which was often the case when working with pointers.

This was a problem for me until one semester I took a class in assembly programming and machine architecture. It was then that I started to really grok pointers, and after that I was able to deal with pointers and multiple levels of indirection fairly effortlessly, giving me the confidence to pursue a career in programming (my first job was heavy C++/COM).


APL - never did get it, though! Never wanted to get it either... just needed to make it through the class!

Here is an example of the syntax from Wikipedia:

The following function "life", written in Dyalog APL, takes a boolean matrix and calculates the new generation according to Conway's Game of Life:

[Image of the APL source for "life"; not preserved here]


1) Bosses, management, and the way companies function, especially in large but "flat" organizations (read: design by committee).

2) Understanding co-workers who obsess over technologies (usually languages or paradigms) that are totally irrelevant to both the work and the problem at hand.

3) The primary problem: getting specifications for what should be done that are not an unworkable ivory-tower decree.


Aligning expectations. The only way to address that is communication, clarification (if needed) and reiteration.


Pointers was one, OOP was another, but the first one for me was recursion. I remember struggling and struggling with it, and thinking it was so stupid...

And then suddenly, I got it.

It was a total flash of insight or divine understanding, and I switched to wanting to do everything recursively. ("When you have a hammer...")

It's also one of the examples I show "non-programmers" (who happen to be scripters, like HTML designers) when I'm showing them how interesting programming can be.
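The flash of insight is hard to transmit, but the standard demonstration of the pattern still helps: solve a strictly smaller problem and trust the recursive call. A sketch (my own examples, not the answerer's):

```cpp
// Each call handles one step and trusts the recursive call for the
// rest; the base case stops the descent.
long long factorial(int n) {
    if (n <= 1) return 1;          // base case
    return n * factorial(n - 1);   // strictly smaller subproblem
}

// The same shape, applied to a different problem: count digits by
// peeling one off per call.
int digit_count(long long n) {
    if (n < 10) return 1;
    return 1 + digit_count(n / 10);
}
```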


Code generation. Back when giant sloths dug up our network cables, and saber-toothed tigers ate the repairmen, I worked in IBM 370 assembler language, and we had to write programs that would write programs that would emit programs that would do things. I took a long time to get that thing working, and wasn't confident of the results.

Then, a long time later, I learned Lisp, and ran into the DEFMACRO. With that and the backtick notation, it was real easy to generate code. Except that it was, and is, difficult to do it correctly.

I think the easiest environment I've found is Perl's eval function, but that may be because I've never done anything ambitious with it.


Unification: a concept used when constructing mechanical theorem provers in Artificial Intelligence. The explanation/description of Unification seems simple enough that it seems, at first, to be pretty straightforward. When you try to actually do it, though, it turns out to be surprisingly hard.


Writing, then knowing how to explain what I'm doing to another developer. I guess my most difficult concept to grasp is called "communication".


I think what did it for me was when I realized the other people in the room didn't know more than I did. They may have had an understanding of different things, but I knew as much as they did (or in some cases more). I'm not trying to be cocky or anything. For a long time I was always under the impression that everybody else knew more than I did, so I just hung back. It was when I realized that I had as much to contribute as everybody else that things really turned around for me.




The concept of picking my battles: solving some problems/issues/bugs now, and others later.

Sometimes the choice is mine, sometimes it's a task I'm assigned.

As a person I like making myself useful and solving people's problems, so I quickly become the "go to" guy in any office for fixing those small problems that won't take a lot of time but just haven't gotten done for ages. Learning to say no, and to organize and prioritize problems and bugs, has been a long, hard climb so far.

I have to say, during this I've grown to really appreciate the work that project managers and senior workmates can do. If good people are in those positions, it really makes my life easier.


The switch to object orientation (yeah, I'm an old-timer).


A way to get rich without giving up the heart of a programmer.

I still haven't thought of the solution, but I will definitely let you guys know ;-)


Dynamically allocated memory with malloc and free. I couldn't get my mind around the idea that there were mutable locations ("variables") that had no names.
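That "locations with no names" idea can be made concrete in a few lines (C-style allocation, compiled as C++ here; the function is mine, purely illustrative):

```cpp
#include <cstdlib>

// The int lives at an address malloc chose; it has no name of its
// own. `slot` is merely one pointer that currently refers to it.
int stash_and_read(int value) {
    int *slot = (int *)std::malloc(sizeof *slot);
    if (slot == nullptr) return -1;  // allocation can fail
    *slot = value;        // write through the pointer
    int out = *slot;      // read it back the same way
    std::free(slot);      // the nameless location is handed back
    return out;
}
```

Lose the pointer before calling free and the nameless location is unreachable forever: a memory leak, which is the flip side of the same idea.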


That communication is as important as the technology that I work with.


Developers with 10 years of experience who write things like :

if (url.find("http://www.oursite.com/page.jsp?value1=1&value2=2.... ") == -1) {
    ...
} else if (url.find("http://www.oursite.com/page.jsp?someothervalue=A&someothervalue=b.... ") == -1) {
    ...
} else if (
    ... }

You get the idea. I am not making this up.

Them, and the people that hire them.


Prolog and logic programming. The way through it? Think of it this way: your Prolog programs are trees, and the rest is just depth-first search.

Prolog is an absolutely mind-bending way to write code, but once you get good you can do some amazing things.
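The "trees plus depth-first search" reading can be sketched in another language. This hypothetical ancestor() mirrors the two Prolog clauses ancestor(X,Y) :- parent(X,Y) and ancestor(X,Y) :- parent(X,Z), ancestor(Z,Y), with the fact database as a vector:

```cpp
#include <string>
#include <utility>
#include <vector>

// The "database" of facts, like Prolog's parent/2 clauses.
const std::vector<std::pair<std::string, std::string>> parents = {
    {"tom", "bob"}, {"bob", "ann"}, {"ann", "joe"}};

// Clause 1: parent(X,Y) implies ancestor(X,Y).
// Clause 2: parent(X,Z) and ancestor(Z,Y) imply ancestor(X,Y).
// Trying each fact in turn and recursing on failure is exactly the
// depth-first search Prolog performs for you.
bool ancestor(const std::string& x, const std::string& y) {
    for (const auto& p : parents)
        if (p.first == x && p.second == y) return true;
    for (const auto& p : parents)
        if (p.first == x && ancestor(p.second, y)) return true;
    return false;
}
```

What Prolog adds on top of this sketch is unification of variables and backtracking over bindings, which is where the real mind-bending starts.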


That I wanted to be a software developer more than I wanted to be a writer.

I grew up believing I wanted to be a writer. All through the first decade and a half of my career in software, I felt like programming was just something I was doing to pay the bills while I wrote in my free time. In the mid-90s, I finally achieved my goal: I wrote a column for the San Francisco Chronicle. I wrote articles for Wired and New Scientist. I even got paid - a lot, really, though anything at all would be more than most - for writing fiction for a web site.

And I really didn't like it. This was a hard pill to swallow. The actual feeling of sitting down and finding something to say was painful. If there wasn't a looming deadline, I didn't do it. If there was a looming deadline, my life became a misery of half-finished sentences and sleepless nights. I completely went to pieces when I was in the late stages of finishing a piece. It was horrible.

It was certainly great to have written. Writing for the newspaper could be wonderful. It was a pretty remarkable feeling to be in a cafe and hear people discussing ideas that I'd just finished putting into words the night before. And writing for magazines meant that I got to sit down and play with robots with Will Wright, interview Pattie Maes about virtual dogs, hang out with the smartest guy I've ever met (Bob McHenry, general editor of the Encyclopedia Britannica) and talk to Richard Garfield about the genesis of Magic: The Gathering. I was invited to an ultra-special and extremely expensive "gathering of the digital elders," which was eleven different kinds of wrong. (I was invited in my capacity as a digital elder. Seriously. A lot of weird things happened in the first Web bubble.)

But it was just awful work, getting stuff out of my head and onto the page. Meanwhile, I still paid the bills by developing software, and it gradually dawned on me that I actually liked developing software. And that I didn't really like writing.

I spent the next 15 years approaching my work with what Thomas Lux once called "a positive condescension." The nagging feeling that I was missing something by working in software was gone. I ended up getting really good at it.


Multithreading for sure :(


Logs in college algebra; they just didn't make sense.


Polymorphism was the trickiest of the big three OO concepts for me.

The literature I see today does a better job of illustrating it through analogy than when I was first learning it. Then there was moving on to the differences between inclusional and operational polymorphism. They seem simple now, but at the time I didn't get it.
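For inclusional (subtype) polymorphism specifically, the standard C++ illustration is dispatch through a base reference; the shapes here are the usual textbook stand-ins:

```cpp
// One interface, many behaviours: the call site below never
// mentions which concrete shape it is talking to.
struct Shape {
    virtual ~Shape() = default;
    virtual double area() const = 0;
};

struct Square : Shape {
    double side;
    explicit Square(double s) : side(s) {}
    double area() const override { return side * side; }
};

struct Triangle : Shape {
    double base, height;
    Triangle(double b, double h) : base(b), height(h) {}
    double area() const override { return 0.5 * base * height; }
};

// Works for any Shape, including ones written years later.
double report(const Shape& s) { return s.area(); }
```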


The Adapter pattern
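Since the answer is just the pattern's name, here is a compact sketch of its shape (all names invented for illustration): an existing class's interface is wrapped to fit the one the client expects.

```cpp
#include <string>

// The interface the client code was written against.
struct JsonLogger {
    virtual ~JsonLogger() = default;
    virtual std::string log_json(const std::string& msg) const = 0;
};

// An existing ("legacy") class with an incompatible interface.
struct PlainLogger {
    std::string log_plain(const std::string& msg) const {
        return "LOG: " + msg;
    }
};

// The adapter satisfies JsonLogger by delegating to PlainLogger,
// translating between the two interfaces as it goes.
struct PlainToJsonAdapter : JsonLogger {
    PlainLogger inner;
    std::string log_json(const std::string& msg) const override {
        return "{\"line\": \"" + inner.log_plain(msg) + "\"}";
    }
};
```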


Initially, I had some trouble due to the impedance mismatch between the relational and object-oriented worlds. Also, the SQL way of thinking in terms of sets.


1) EJBs. They were my first foray into remote calls. Finally one day it just clicked. I still don't have the different types of transactions down, though.

2) Incomplete requirements. At first I just didn't understand why you couldn't, with enough time, get a complete set of requirements before the project started. This goes with the "People" answer that was already given.



Politics. It's a bit like "people", but it tends to involve personalities, and people trying to prove they are higher up than others.

It's made even worse if the political people are, or used to be, technical as well!


Concurrent debugging, and keeping my head cool during it!


Volumes 1, 2, and 3 of "The Art of Computer Programming". I bought the books because they are supposed to be the ultimate reference for all things programming... I'm still frustrated that after 25 years I've never referred to them.


Pointers vs. references in C++, and when to use one over the other.

It took discussions with other developers and research (aka "Google") to comprehend.
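The rule of thumb that eventually stuck for many of us: a reference is a permanent alias that can't be null, while a pointer is a reseatable address that can be. A small sketch (functions are mine, illustrative only):

```cpp
// Reference parameter: always refers to a real int, no null check.
void bump_ref(int& r) { ++r; }

// Pointer parameter: may legitimately be null, so check it.
void bump_ptr(int* p) {
    if (p != nullptr) ++*p;
}

int demo() {
    int a = 1, b = 10;
    bump_ref(a);         // a is now 2
    bump_ptr(&b);        // b is now 11
    bump_ptr(nullptr);   // legal to pass; the function must cope
    int* p = &a;
    p = &b;              // a pointer can be re-aimed...
    int& r = a;
    r = b;               // ...but this assigns THROUGH the reference:
    return a;            // a now holds b's value; r still aliases a
}
```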


The benefit of using callback functions (and implementing them in C). I still have a hard time using them in everyday problem solving. Sometimes I wonder if they are as useless as recursive functions.

Sometimes it helps to explain to others the concept you are trying to learn, but with callback functions it didn't work. I guess when I write my first program in which I use callback functions myself, I will really understand the concept.
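If it helps anyone in the same spot, the everyday C-style shape (a function pointer; names are illustrative) is that the caller supplies the "what" and the library supplies the "when":

```cpp
// A callback type: any function taking and returning int.
typedef int (*transform_fn)(int);

// "Library" code: walks the array and defers the per-element
// decision to whatever callback the caller handed in.
void map_in_place(int* data, int n, transform_fn fn) {
    for (int i = 0; i < n; ++i) data[i] = fn(data[i]);
}

// "Caller" code: the behaviour being plugged in.
int triple(int x) { return 3 * x; }

int sum_of_tripled() {
    int values[] = {1, 2, 3};
    map_in_place(values, 3, triple);  // library calls back into triple
    return values[0] + values[1] + values[2];
}
```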


Cross-domain single sign-on.


The difference between pointers and references.


Learning to edit with Vim was pretty tough at first. Fortunately I had some spare time that summer: I read the docs and tutorials, chatted with friends, and hung out on the irc.freenode.net channel.

Once I got the basics down I found I could edit faster than with a conventional editor, and every extra trick I learned since then is a bonus!

I wouldn't recommend it for everyone, but if you are interested in using a "power" editor, be prepared to make some initial investment before reaping rewards.

As for Support Vector Machines, I never did manage to understand them.


For some reason, the thing that keeps coming to mind is trying to derive predicate calculus from first principles, not having actually studied it or anything, during the GRE Subject Test in Computer Science.

That test was freakin' hard, yo.


Web development. Moving from stateful fat clients to web-based thin clients was really difficult to get my head around. Of course, a lot of this was because I tried to learn it by reading articles that assumed you already knew about sessions and web server internals. Once I finally figured out that I could put things into the session in one call and retrieve them in a subsequent call, things got much easier! I did, however, develop god-like cookie-encoding sk1llz... :)