Parallel Programming is Entering the Mainstream – or Is It?
Alan Zeichick is convinced that parallel programming (threading in this case) is conquering the desktop. To measure how far this adoption goes in a specific organization, he proposes a Threading Maturity Model (ThMM). But I would not have phrased this article’s headline the way I did if I did not still see some question marks, and this article attempts to explain them a tiny bit…
But before I start with the question marks: I am also convinced that parallel programming in general, and threading especially, is going to be used more often, simply because today’s architectures require it, as I have stated e.g. in my previous article about why I love parallel programming. I even went as far as to proclaim:
… we are in the middle of a revolution right now! It is a parallel revolution, and this time it is for real.
The last part of the sentence was added because people have basically been saying parallel programming is the next big thing since the very beginning of the computing age, and yet until recently parallel systems have remained expensive and little used outside of computing centers. During my time studying for my diploma, a friend of mine bought a dual-processor Celeron system and proclaimed that soon every system on the market would be parallel. Did not happen. When the Pentium Pro chips first appeared, many people were so excited about their multiprocessing capabilities that they voiced the same opinion. Not that time, though. One probably does not need to look too far into the past to see similar examples of people going all crazy and proclaiming that parallel programming is all the rage and mainstream. Yet, it has not happened until today.
Therefore I was sceptical about this claim as well and have talked to many older and more experienced people from different fields at conferences about this issue, yet they all seem to agree: this time it’s different, the parallel architectures are here to stay and parallel programming is entering the mainstream. And that convinced me.
But of course, who am I to judge what is in the mainstream? I am sitting in my ivory tower (university, although I wish it was a tower. or ivory for that matter :P) all day, doing parallel programming. The group I am working in is already at level 5 (Adoption) of Zeichick’s scale. Yet, what counts is not me or my group. It’s the vendors and ISVs out there that are building and selling actual applications, not merely the prototypes and papers we produce in academia (which is, by the way, one of the main reasons I want to leave the university after my PhD at the end of this year). Are they adopting parallel programming yet? Do they realize the full potential and also the dangers of the new computer architectures sold in every supermarket today?
I don’t know the answer to this question. I suspect not. I know we are not often receiving requests for students trained in parallel programming, but that may be because I am working at a relatively small and unknown university. When I study the job offerings here in Germany, I rarely see one that asks for experience with threading or concurrent programming. That’s not a big problem for me, as I can always go back to sequential programming or to being a project manager, but still a little voice in the back of my head says the situation should be different and companies should be adapting now, not when it’s too late (see below for a description of too late).
Maybe the software shops don’t have any performance problems with their software. This is probably the best reason there is not to bother with parallel programming. There are reasons to use threads other than performance, of course, but I think performance remains the most prominent one. But this state won’t last forever. More and more shops will run into performance issues with their software, because current architectures are not getting much faster, but rather more parallel. The game companies are (as always) the first ones to feel this, and as I understand it they are working on the problem (I know Valve is, but I guess Valve is not the typical software shop). But others will surely follow. If they only start thinking about parallel programming by the time they realize their programs are too slow, it may already be too late.
Refactoring a big, existing codebase to use threads or some other parallel programming model is a huge undertaking (sometimes even comparable to a total rewrite). Testing it requires new tests, tools or sometimes even new testing frameworks (do you know if your testing framework is thread-safe? I bet it’s not!). It is not impossible, but the whole process takes a lot of time, especially when your developers need to be trained up front. If an ISV decides to do it when the first performance problems appear, they may have a hard time getting a stable release out the door in time. Or before their money runs out :?. If, on the other hand, the technical leadership of a software shop acknowledges the risk and decides to act in anticipation of the problem, the migration can be started early and will go more smoothly, because there is more time to get everything right.
Developers can be trained ahead of time. In some parallel programming systems (e.g. OpenMP) there is an easy way to switch off parallel execution and ship a sequential version with a single compiler switch. If you started parallelizing early enough and there are problems during the migration (and let’s be realistic: when was there ever a significant software development project without problems :|), you can still switch off parallelism and ship one more sequential version of your software to make your customers happy. If that version is already too slow because you started too late, you are out of luck.
But then again, I got side-tracked, and this is just me musing in my ivory tower. And that’s why I would like to hear from you: Do you see concurrent programming being adopted in your workplace? Which stage of the Threading Maturity Model is your company at? Or maybe you feel that all this talk about parallel programming becoming mainstream is just another hype that will die down as soon as the CPU manufacturers get their act together and start turning the clock-speed screw again? Have you seen demand for parallel programming skills increase lately?
Questions and yet more questions. All of these are highly subjective, therefore I would really like to hear your opinion on all this!