It has been said that there are only seven basic plots from which all stories derive. Textbooks define seven simple machines on which all mechanical devices depend. These numbers are in dispute, of course, just as the number of fundamental particles has grown from the neutron, electron, and proton that you may have been taught in grade school. When I sat around the campfire with the tribe as a child, it was pretty much just Earth, Air, Fire and Water.
You may be surprised to find that in 1936 the mathematician Alan Turing proved that all computing devices can be reduced to a few simple operations, an abstraction he called the Universal Turing Machine. For sure, the UTM is an abstract concept; programming in assembler (any assembler) would be a tremendous step up. But the basic idea is that all computing is at its heart built on a foundation of just a few very simple operations, which are combined macro-like to define more complex operations, which are in turn combined to define yet more complex operations, pretty much ad infinitum. It's macros all the way up until you finally arrive at languages like Ruby, Prolog, and Haskell.
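A toy version of that idea fits in a few lines. Below is a minimal sketch of a single-tape machine in Python; the encoding, the names, and the little bit-flipping program are my own illustrative choices, not Turing's original formulation.

```python
def run(program, tape, state="start", blank=" "):
    """Execute a tiny Turing machine until it reaches the 'halt' state.

    program maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is "L" or "R". The tape is kept as a sparse dict so the
    head can wander past either end of the initial input.
    """
    cells = dict(enumerate(tape))
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# A one-state program: flip each bit moving right, halt on blank.
FLIP = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run(FLIP, "1011"))  # prints "0100"
```

Everything here is table lookup, a write, and a head move; everything else in computing is, in principle, macros layered on top of operations this simple.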
Barry Karafin, computer scientist, professor, one-time CEO, and former head of Bell Laboratories, once said: "The half-life of any high technology is around five years". That's not true for all technologies, of course. As much as we hate to admit it, FORTRAN and COBOL have actually done pretty well for themselves and are still in use today. C has done remarkably better, serving as a portable assembler language in most of its applications. The jury is still out on C++ and Java, although I have more faith in the latter than in the former. It's not clear when or even if the Internet Protocol will ever be replaced, but probably not in my lifetime.
While it may be fun to learn the latest hot new language (or framework, or any other technology), and hence feel hip for at least a brief while, to me that hot new language just doesn't really feel all that new. It seems more like one of those movies that try to recapture the feeling of a television show from thirty or so years ago, and frequently do a terrible job of it. Even if a new technology is successful, you usually get the feeling that it won't be successful for all that long before it's replaced by the next big thing. The success of C, Java, and TCP/IP is that they've stuck around long enough to make becoming an expert in them worth your time.
I don't know how many basic algorithms or design patterns there are, but it's a lot more than seven. I'm pretty sure it's not infinite either. And I know that those algorithms and patterns aren't language-specific. Sure, a pattern might be easier to implement in one language than in another; I speak as someone who has done object-oriented design by implementing the C++ virtual table in assembler. But as long as your language is Turing-equivalent, I'm positive you can implement any algorithm or design pattern in it.
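To make that concrete, here is a sketch of hand-rolled virtual dispatch: the same vtable idea a C++ compiler generates for you, written in plain Python dicts and functions instead of assembler. The shape names and layout are purely illustrative.

```python
# Each "method" is just a function taking the object as its first argument.
def circle_area(self):
    return 3.14159 * self["radius"] ** 2

def square_area(self):
    return self["side"] ** 2

# Each "class" is a table of function pointers: the vtable.
CIRCLE_VTABLE = {"area": circle_area}
SQUARE_VTABLE = {"area": square_area}

# Each "constructor" stores a pointer to its class's vtable in the object,
# just as a C++ constructor plants the vtable pointer in the instance.
def new_circle(radius):
    return {"vtable": CIRCLE_VTABLE, "radius": radius}

def new_square(side):
    return {"vtable": SQUARE_VTABLE, "side": side}

def call(obj, method, *args):
    """Virtual dispatch: the moral equivalent of obj->vtbl->method(obj)."""
    return obj["vtable"][method](obj, *args)

shapes = [new_circle(1.0), new_square(2.0)]
print([round(call(s, "area"), 2) for s in shapes])  # prints [3.14, 4.0]
```

Nothing object-oriented is built into this code; the polymorphism is entirely in the pattern, which is exactly the point.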
That's why I find it a lot more satisfying to learn algorithms, design patterns, and basic techniques than to learn a new programming language. And that's why for many high technologies, I practice Just In Time Learning. I deliberately delay learning most new technologies until it becomes apparent that I'm actually going to need them. Then I ramp up quickly.
For sure, there are lots of things I've learned just for the sheer joy of it. Digital Aggregates has an Asterisk server because I got interested in the idea of an open source PBX. I have a Nokia N800 on my desk because I'm intrigued by the idea of a wireless internet appliance. I installed the CVM on it because I'm interested in using embedded Java in such devices. But I'm very careful where I spend my time in learning new things. This is partly because I am old and don't have much time left (see One Hundred Books), and partly because the value equation inside my head tells me it's just not worth the time I do have when compared to other things I could be spending my time learning. I prefer to learn high-leverage stuff.
Does this mean that I am arguing against the life-long learning that most career advisers say is absolutely necessary? Just the opposite. To practice JITL, you have to be really good at coming up to speed quickly. To do that, you must learn how to learn. And the only way to do that is by practicing learning new things. But you have to be picky about where you spend your time, because even though you are probably a lot younger than me, it is likely that your time is limited as well. (Whether the time of your children is similarly limited is a discussion we'll leave to the Singularity folks.)
Does it mean that I refuse to learn niche technologies that I need for my current project? No, indeed. But it does mean that learning those niche technologies is made easier by the fact that I can see in them patterns and techniques in common with stuff I already know.
Fortunately, my career has been one big rummage sale of projects whose learning requirements were all over the map, ranging from writing cryptic assembler code for resource-constrained digital signal processing chips to writing Java code for business process automation. Yet the same design patterns keep showing up, time and time again, regardless of the project or the language in which we implemented it. And in those rare moments in which I thought to myself "Hey... I've never seen this technique before!", I had the wonderful feeling that I was adding yet another useful tool to my toolbox.
The older I get, the more interested I become in some of the really hard problems, like process issues (people are always more challenging to deal with than technology), or in big-picture areas like finance and economics. Some of the perhaps more esoteric stuff on my list to learn includes rate monotonic analysis, game theory, and queueing theory. These aren't technologies, but techniques that can be applied to many areas, and for the foreseeable future and beyond. Many companies require their employees to take a certain number of days of training per year, then offer only the most introductory of classes in technologies that are likely to have disappeared in just a couple of years. Unless they are teaching a technology that the employee is going to use imminently, their training dollars would be better spent teaching more fundamental skills and more broadly applicable tools.
I'd better get a move on. Time is short.