Wednesday, September 26, 2007
This morning, National Public Radio's Morning Edition featured a story by John McChesney titled GI Bill's Impact Slipping in Recent Years. It's worth a listen. Some interesting facts from the story:
At the close of World War II, only 5% of Americans had college degrees.
Only 40% of the veterans of WW II had high school diplomas.
The GI Bill paid for full tuition and books and provided a modest living stipend.
Ten million WWII vets went to college on the GI Bill.
That number included fourteen who would go on to win the Nobel Prize.
Twenty-four who would go on to win the Pulitzer Prize.
The GI Bill would educate 91,000 scientists.
And 67,000 physicians.
As well as several presidents, senators, and Supreme Court justices.
How much did it cost? Educating the WWII veterans cost about $50 billion in 2007-adjusted dollars, and some estimates place the return to the U.S. economy at an equally adjusted $350 billion, a seven-fold ROI. The increase in tax revenue alone, just from the boost in earning power of the newly minted college graduates, more than paid for the GI Bill.
The story is part of a continuing series on the reduced effectiveness of the GI Bill. Given the statistics cited above, it would be hard not to be in favor of cranking up the investment in education for current veterans.
But surely there's a more general lesson to be learned here about the ROI of investing in education for your employees. I've personally witnessed a large technology company reduce its educational subsidy from "your entire degree" to "just the classes we think are pertinent to your current project", and downsize its employee training program from a rich set of intense forty-hour in-house classes to generic introductory web-based presentations. As both an employee and a former educator, I was heartbroken to see how little interest the company had in investing in its people, beyond reminding them at every six-month review that "continuous learning" was critical to their continued employment. How to accomplish that continuous learning was left as an exercise for the employee.
Do we really think the ROI for those WWII vets is that much greater than the ROI we get from investing in the education of our employees? And as an employee, doesn't it suggest to you that education is virtually free, given the likely increase in your earning power?
Employees should aggressively seek training and education, whether that means choosing to work for companies that provide and subsidize it, or paying for it themselves.
I'd be interested in comments from any veterans on their experiences, good and bad, with the GI Bill, and its impact on their lives. And comments and suggestions from readers in general on their experience with training and education provided or subsidized by their employer.
Thursday, September 20, 2007
After just a few hours of futzing around, I can now serve the entire Digital Aggregates Corporation web site off the N800 using some Java code I slung together based on the prior work of Jon Berg of turtlemeat.com. Sure, it's not the fastest or even the smallest web server on the planet. I'm not about to expose it outside the company firewall. And of course I'm sure I'm not the first to do this. But I can sit at my laptop and browse pages on a web server that I'm carrying around in my shirt pocket.
Here is the source zip and the jar file (served off the production web server) if you want to have this kind of joy yourself. I dropped the jar file into the Foundation directory that comes with the CVM distribution, placed a copy of the entire web site in a www directory, then fired the server up with this shell script.
./bin/cvm -cp chapparal-0.3.jar \
/home/user/www 300000 8080
I then pointed my laptop's web browser at http://192.168.1.106:8080/index.html and watched wackiness ensue! (The IP address is DHCP-served from behind my NAT firewall, so your mileage will vary.) The three arguments to the Server main are the root path prefixed to every file request, the number of milliseconds the server stays up, and the HTTP listen port to use.
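The chapparal source itself isn't reproduced here, but the shape of such a server is easy to sketch. Below is a hypothetical minimal version in Java -- the class and method names are mine, not the actual chapparal code -- that honors the same three arguments: document root, lifetime in milliseconds, and listen port.

```java
import java.io.*;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.file.*;

// Hypothetical sketch (not the actual chapparal source) of a tiny HTTP
// file server taking the same three arguments: document root, lifetime
// in milliseconds, and listen port.
public class TinyHttpServer {

    // Build the full response (status line, headers, body) for one request line.
    static byte[] responseFor(String root, String requestLine) throws IOException {
        if (requestLine == null || !requestLine.startsWith("GET ")) {
            return "HTTP/1.0 400 Bad Request\r\n\r\n".getBytes();
        }
        Path file = Paths.get(root, requestLine.split(" ")[1]);
        if (!Files.isReadable(file)) {
            return "HTTP/1.0 404 Not Found\r\n\r\n".getBytes();
        }
        byte[] body = Files.readAllBytes(file);
        ByteArrayOutputStream response = new ByteArrayOutputStream();
        response.write(("HTTP/1.0 200 OK\r\nContent-Length: "
            + body.length + "\r\n\r\n").getBytes());
        response.write(body);
        return response.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        String root = args[0];
        long lifetime = Long.parseLong(args[1]);
        int port = Integer.parseInt(args[2]);
        long deadline = System.currentTimeMillis() + lifetime;
        try (ServerSocket listener = new ServerSocket(port)) {
            listener.setSoTimeout(1000); // wake periodically to check the deadline
            while (System.currentTimeMillis() < deadline) {
                try (Socket client = listener.accept()) {
                    BufferedReader in = new BufferedReader(
                        new InputStreamReader(client.getInputStream()));
                    client.getOutputStream().write(responseFor(root, in.readLine()));
                } catch (java.net.SocketTimeoutException e) {
                    // no connection this interval; loop and re-check the deadline
                }
            }
        }
    }
}
```

A real server would also want MIME types, request logging, and some defense against `..` in paths, but this is enough to serve a static site out of a shirt pocket.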
I developed the whole thing on my laptop using Eclipse, tested it with JUnit, built it for the target with Ant using the 1.6 compiler in 1.4 source and target mode, and then downloaded the jar to the N800 via its own web browser.
Wicked mad fun!
Sun Microsystems offers a Java SE 5.0 implementation for Linux on an embedded PowerPC that you can use for a ninety-day free trial. (If you want to ship it as a product, you have to pay a royalty.) As an experiment I recently installed this JVM on such a system, and as a demo ran my Java-based pocket web server on it. Took almost no effort, worked like a charm. It makes a nice proof of concept.
Wednesday, September 19, 2007
In Just In Time Learning, I argued that we should concentrate on learning the abstract rather than the concrete, for example new design patterns instead of new programming languages, because of their broader applicability. Now it may appear that I am going to argue for just the opposite.
One of the best ways I've found to learn new design patterns is to learn new programming languages that depend upon those patterns. One of the problems I have with using C++ to teach object oriented programming is that you don't have to use OOP to use C++. This was by design. It allows C++ to more easily and incrementally penetrate into legacy C code bases. You can use C++ as a "better" C, and ignore all the OO stuff. But this allows you to cheat if you are using C++ to learn OOP. Better to use a language like Java, which forces you to use OOP. Well, I suppose you can cheat in Java too, if you care to make everything static, but it's a heckuva lot more work than to just do it the right way straight off. Learning Java forces you into an OO way of thinking.
This is a variation on the Sapir-Whorf Hypothesis. No, Sapir-Whorf doesn't have anything to do with Klingons, although it may have something to do with the Klingon language. In the early-to-mid twentieth century, linguist and anthropologist Edward Sapir and his student Benjamin Whorf formulated a hypothesis suggesting that any natural language conditions the thoughts of the people who use it. In other words, not only does human culture influence the language used by that culture, but the language in turn influences the thoughts of its speakers. If your natural language required that every sentence start with "Please", would children raised in such a society inevitably be more polite?
The hypothesis is essentially untestable, and has hence been controversial. Separating people from language for the purposes of a double-blind study is impossible. But as a computer scientist and software developer, I find it hard not to be compelled by it. Not only do the design patterns you use influence what programming language you may choose to develop in; the programming language you develop in may influence what design patterns you choose.
In undergraduate and graduate school (this was back in the Dark Ages, you understand), I was lucky enough to have been exposed to a wide range of programming languages. C, much less Java, hadn't even been invented by the time I was an undergraduate. There was a rich culture of experimentation and innovation regarding programming languages, because no one language had really taken root as the "default" language to use for software development the way C and Java have today.
For sure, FORTRAN was the language of choice (and still is) in scientific computing, and COBOL was the lingua franca of business applications. But every developer back then knew that neither FORTRAN nor COBOL was perfect for the burgeoning backlog of desired software applications, particularly for systems programming. That's why I ended up writing thousands of lines of assembler code (frequently using complex macro packages that made it look like a higher-level language) in the 1970s, and why C was invented at Bell Labs as a kind of structured assembler language. But it also led to dozens, maybe hundreds, of other programming languages being tossed out into the software development community to see if any of them would stick. Let a thousand flowers bloom; let a thousand schools of thought contend. Except unlike Mao, the computer science community meant it.
Some of these languages were domain specific. Some were designed around novel hardware or operating system architectures. Some tossed the whole idea of procedural programming out the window, leading to whole new techniques for designing algorithms. Learning and programming in these languages led to many new paradigms and patterns being imprinted on your brain.
But see, here's the thing. Many of these new schools of thought can be adapted to languages like C, C++, and Java. They can lead to new and better ways to solve problems: ways that are more efficient, more easily understood, and even more economically maintainable.
Some of the languages I learned in those Dark Ages persist today. Some do not. Some only existed for a short while in the feverish brains of my colleagues. But from each of them I learned something that I might not have learned had I just learned C or Java. Here are some examples.
Prolog. Prolog is a logic programming language. You do not write procedures or methods. You make logical statements that are themselves either true or false, or that can be evaluated as true or false once values are bound to their variables. A Prolog program might consist of hundreds or even thousands of logical statements, some standalone, some referring to one another. When you run the program, you set the initial conditions of any input variables, and the Prolog interpreter searches the entire solution space described by the Prolog program for a path through the search tree that succeeds. (There may be more than one.) If it exhaustively goes through all possible paths without finding one that succeeds, the program fails. Of course, there might be a print statement or other side effect along the way.
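The flavor of that search is easy to mimic in a conventional language. Here is a toy sketch in Java -- not Prolog, and the goal and its tiny domain are invented purely for illustration -- that binds candidate values to variables one at a time, recurses, and backtracks when a path through the tree fails:

```java
import java.util.*;

// A toy illustration, in Java rather than Prolog, of how a Prolog
// interpreter searches the solution space: bind a candidate value to
// each variable in turn, recurse, and backtrack on failure.
public class Backtrack {

    // The "logical statement": X + Y = 5 with X < Y, over the domain 1..4.
    static boolean goal(Map<String, Integer> b) {
        return b.get("X") + b.get("Y") == 5 && b.get("X") < b.get("Y");
    }

    static boolean solve(Map<String, Integer> bindings, List<String> vars) {
        if (vars.isEmpty()) {
            return goal(bindings);               // a leaf of the search tree
        }
        String v = vars.get(0);
        for (int candidate = 1; candidate <= 4; candidate++) {
            bindings.put(v, candidate);          // bind the variable
            if (solve(bindings, vars.subList(1, vars.size()))) {
                return true;                     // this path succeeded
            }
            bindings.remove(v);                  // backtrack, try the next value
        }
        return false;                            // all paths failed
    }

    public static void main(String[] args) {
        Map<String, Integer> bindings = new HashMap<>();
        if (solve(bindings, List.of("X", "Y"))) {
            System.out.println(bindings);        // first successful binding
        }
    }
}
```

A real Prolog interpreter unifies arbitrary terms rather than integers, but the bind-recurse-backtrack skeleton is the same.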
A Prolog program once formed part of my USENET posting signature. It was a disclaimer: these opinions belong to me; they do not belong to anyone else. Hacker humor. The bang (Prolog's "cut") prevented the interpreter from backtracking through the solution tree to try another path, and the fail caused the program to fail.
Prolog blew my mind because it was such a different approach to writing algorithms. Its syntax was very grammar-like (I was really into formal languages and parsing at the time), and I immediately set to work using it to implement a parser for another programming language. When I submitted a sample program in that language to my Prolog program, if the sample program was syntactically correct, the Prolog program succeeded; otherwise it failed. Side effects managed the symbol table and emitted pseudo-machine code.
Lisp. I was never a Lisp hacker, but the LISt Processing language was my introduction to the functional programming style and to the idea of representing complex structures as trees and performing transforms on them. Both of these ideas, functional programming and structure representation and transformation, became part of my everyday software design toolkit, showing up again and again in the systems I develop. I think if I were to encounter Lisp now, I would appreciate it so much more than I did as an undergraduate, having spent the past thirty years applying its lessons in countless systems and applications.
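Here is a tiny Java sketch of those two ideas together -- my own illustration, not anything from Lisp itself: the tree is immutable, and a transform builds a brand new tree instead of modifying the old one in place.

```java
import java.util.function.IntUnaryOperator;

// Represent structure as a tree and transform it functionally:
// the transform returns a new tree and leaves the input untouched.
public class TreeTransform {

    // An immutable binary tree node; a null child is an empty subtree.
    record Node(int value, Node left, Node right) {}

    // Apply f to every value, building a brand new tree.
    static Node map(Node n, IntUnaryOperator f) {
        if (n == null) return null;
        return new Node(f.applyAsInt(n.value()), map(n.left(), f), map(n.right(), f));
    }

    public static void main(String[] args) {
        Node t = new Node(1, new Node(2, null, null), new Node(3, null, null));
        Node doubled = map(t, v -> v * 2);
        System.out.println(doubled.value());   // the new root holds 2
        System.out.println(t.value());         // the original still holds 1
    }
}
```

Because nothing is ever mutated, partial trees can be shared and transforms composed freely, which is much of what Lisp taught.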
SNOBOL. SNOBOL is a string processing language built around pattern matching. Sure, that seems mundane now, but this predates UNIX and awk. SNOBOL taught me how to think of strings not as collections of bytes but as malleable forms that could be described and manipulated using complex patterns. Years later when I started using UNIX, writing parsers and data transformers in awk came naturally, partly because of my prior SNOBOL experience.
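That describe-then-manipulate mindset survives today in regular expression libraries everywhere, including Java's own regex classes. A trivial illustration (the input string is made up): describe the shape of a date, then rearrange its pieces.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Treat a string as a malleable form: describe the shape of an ISO
// date with a pattern, capture its pieces, and reassemble them.
public class Reshape {
    public static void main(String[] args) {
        Pattern date = Pattern.compile("(\\d{4})-(\\d{2})-(\\d{2})");
        Matcher m = date.matcher("logged 2007-09-19 by the server");
        if (m.find()) {
            // Rebuild the match in month/day/year order.
            System.out.println(m.group(2) + "/" + m.group(3) + "/" + m.group(1));
        }
    }
}
```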
Back in the day it was a little game to see how many lines of code it took to write an 80-80 lister. An 80-80 lister was a program that read in punch cards and printed their content to the line printer. Not that you really needed to write an 80-80 lister, the IBM utility IEBGENER worked just fine. But it was just a quick way to evaluate the power of a programming language. An 80-80 lister in SNOBOL looked like this.
OUTPUT = INPUT
It was almost cheating, really.
Simscript. Simscript was an invention of the RAND Corporation, that wacky place that also brought us the Delphi technique of estimation and most of the ideas behind Dr. Strangelove. Simscript is a procedural programming language with support for simulation in the same way that Java supports multithreading: as part of the language itself. I only used Simscript as a student (it is a closed, proprietary language, hence hard to get at), but all during my career I have found myself wishing I had it around. For some reason I keep encountering the need to write simulations with event streams whose emission rates meet some statistical distribution. What is a few thousand lines of code in C becomes a few hundred lines of code in Simscript. I learned the basic design patterns of simulation in Simscript as an undergraduate, and I keep applying them in my professional career over and over in other languages.
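That simulation pattern can be sketched in Java. Here is a minimal version -- my own illustration, not Simscript -- that generates an event stream whose inter-arrival times follow an exponential distribution, giving a Poisson arrival process at a chosen mean rate:

```java
import java.util.Random;

// Generate an event stream whose inter-arrival times are exponentially
// distributed, so the events form a Poisson process at the given rate.
public class EventStream {

    // Draw an exponentially distributed interval with mean 1/rate,
    // by inverting the CDF: t = -ln(U)/rate for uniform U in [0,1).
    static double nextInterval(Random random, double rate) {
        return -Math.log(1.0 - random.nextDouble()) / rate;
    }

    public static void main(String[] args) {
        Random random = new Random(5); // fixed seed for a repeatable run
        double rate = 2.0;             // a mean of two events per second
        double clock = 0.0;
        for (int event = 0; event < 5; event++) {
            clock += nextInterval(random, rate);
            System.out.printf("event %d at t=%.3f%n", event, clock);
        }
    }
}
```

Swap in a different distribution for nextInterval and the same skeleton models bursty traffic, service times, or failure arrivals.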
Forth. Forth is a stack-oriented threaded interpreted language that is still in use today. Here, "threaded" has nothing to do with multithreading; it indicates how the interpreter stores the intermediate form of the program. A Forth program consists of a free-form stream of words and numbers. New Forth words can be trivially defined. There is no compiler. Words are parsed and either executed directly or converted into some intermediate form (typically an address) in a dictionary as you type them in. Here is the definition of a simple Forth word that squares its input.
: SQUARE DUP * ;
This is a new word called SQUARE. It duplicates the value on the top of the stack so that there are two copies on the stack, then it multiplies those two values together while popping them off the stack, and pushes the result on the stack. The following Forth code squares the value of 12 and prints it out.
12 SQUARE ." SQUARE=" .
A useful Forth interpreter can be implemented in just a few kilobytes, and as such it remains popular in embedded circles. There is an IEEE standard for portable boot firmware that is Forth-based, and to this day you can walk up to many Solaris-based Sun workstations, hit a control sequence on the keyboard, and break into a Forth prompt from an interpreter in the boot ROM. I've embedded a Forth interpreter as a shell inside a C++ application, and have also used Forth's threaded interpretive approach in C and C++ code as a kind of self-modifying code model.
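The heart of a Forth outer interpreter is small enough to sketch in a few lines of Java. This is a toy of my own, not any real Forth: words live in a dictionary, operate on a shared data stack, and anything not in the dictionary is taken to be a number.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// A toy Forth-style outer interpreter: words are looked up in a
// dictionary and applied to a shared data stack; unknown tokens
// are pushed as numbers.
public class MiniForth {
    final Deque<Integer> stack = new ArrayDeque<>();
    final Map<String, Runnable> dictionary = new HashMap<>();

    MiniForth() {
        dictionary.put("DUP", () -> stack.push(stack.peek()));
        dictionary.put("*", () -> stack.push(stack.pop() * stack.pop()));
        dictionary.put(".", () -> System.out.println(stack.pop()));
        // : SQUARE DUP * ;  -- a new word defined from existing ones
        dictionary.put("SQUARE", () -> run("DUP *"));
    }

    void run(String program) {
        for (String word : program.trim().split("\\s+")) {
            Runnable definition = dictionary.get(word);
            if (definition != null) {
                definition.run();                   // execute a known word
            } else {
                stack.push(Integer.parseInt(word)); // otherwise it's a number
            }
        }
    }

    public static void main(String[] args) {
        MiniForth forth = new MiniForth();
        forth.run("12 SQUARE .");                   // prints 144
    }
}
```

A real threaded interpreter stores addresses rather than strings, but the dictionary-plus-data-stack architecture is the essence of it.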
SAS. SAS is another closed, proprietary language that I've been fortunate enough to have access to several times in my career. It's a statistical analysis system, not a language really, but more like a large collection of feature-rich statistical programs with a consistent, high level interface.
I love SAS. Because simulation keeps rearing its ugly head in my work, I keep needing a way to crunch a lot of numbers and produce beautiful charts to summarize them. Spreadsheets don't cut it when you're dealing with gigabytes of data. For SAS, that's all in a day's work.
SAS taught me the value of a consistent UI across a toolchain, and how that UI could look like a higher-level language. It also taught me how to manage large datasets, something else it necessarily does well.
FP. FP is a pure functional programming language, pure in the sense that it has no concept of variables. It was first described by IBM fellow John Backus, the inventor of the FORTRAN programming language and Backus-Naur syntax, in his Turing Award lecture. It combines some of the ideas behind Kenneth Iverson's APL (another interesting language) with a functional programming style in an effort to liberate software design from the constraints of the traditional von Neumann computer architecture. I never actually used a working FP compiler, but the language turned out to be the partial inspiration of much research during my graduate student years.
Functional programming has informed my software design for the past thirty years, breaking me free early on from the traditional thinking of "compute a value, store the result" and the need to store partial results in variables. The functional programming paradigm maps to Java particularly well, since exceptions can be used to handle errors when using a functional design model.
BAFL. BAFL was another functional language based partly on FP. It existed only in the research group from which my thesis project emerged. What made BAFL really different from other languages, besides being functional instead of procedural, is that it had lazy evaluation. It never did a computation until it was convinced you were actually going to use the result. Lazy evaluation is the interpretive equivalent of the common compiler optimization of not generating code for parts of the program which have no effect.
I remember once we wrote this huge BAFL program to compute some mathematical function or other, turned it loose, and... nothing. We made a couple of corrections, tried it again, and... still nothing. We were looking for the bug when we realized we never told it to do anything with the result, so it never computed it. We added a print statement, and then the interpreter finally executed the huge evaluation tree that it had built.
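Java can fake this kind of laziness with a Supplier: describing the computation does no work, and only a demand for the result forces evaluation. A toy sketch of my own, with the "huge evaluation tree" reduced to a single multiplication and a counter to prove when the work actually happens:

```java
import java.util.function.Supplier;

// Lazy evaluation sketched with Supplier: the expression is described
// up front but not computed until something demands the result.
public class Lazy {
    static int evaluations = 0; // counts how many times the work really runs

    static int expensive() {
        evaluations++;
        return 6 * 7; // stand-in for a huge evaluation tree
    }

    public static void main(String[] args) {
        Supplier<Integer> result = Lazy::expensive;    // describes, computes nothing
        System.out.println("evaluations so far: " + evaluations); // still 0
        System.out.println("answer: " + result.get()); // the demand forces it
    }
}
```

Without the final get() -- the moral equivalent of our missing print statement -- the program happily does nothing at all.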
BADJR. An implementation of BADJR, a language which only existed in the mind of my thesis advisor and mentor Bob Dixon, was my thesis project. In many ways it was the most bizarre programming language I have ever used. It had arbitrary precision fixed point arithmetic, implemented in two's complement base 100 binary coded decimal. It had no language constructs for iteration, instead opting to use efficient tail recursion in which the stack doesn't grow. And it had single-assignment variables: you could assign a variable a value once and only once, after which it was immutable.
When I describe this to people, particularly the single assignment part, a common initial reaction is "this can't work". The idea of single assignment seems more foreign than having no variables at all. But it is simply a very different programming style. The value of a particular variable in a particular stack frame can never be changed, but you can recurse and create a new copy of the same variable in a new stack frame and change that version.
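You can get a feel for the single-assignment style in Java by making every parameter final and doing all "mutation" through recursion, in tail-recursive form. A sketch of my own (bearing in mind that Java, unlike BADJR, does not eliminate tail calls, so its stack really does grow):

```java
// Single assignment mimicked in Java: no variable is ever reassigned;
// each recursive call creates fresh bindings in a new stack frame.
public class SingleAssignment {

    // Sum 1..n with no mutable variable anywhere, in tail-recursive form.
    static int sum(final int n, final int accumulator) {
        if (n == 0) {
            return accumulator;                 // the final binding is the answer
        }
        return sum(n - 1, accumulator + n);     // new frame, new bindings
    }

    public static void main(String[] args) {
        System.out.println(sum(10, 0)); // prints 55
    }
}
```

In a language that does optimize tail calls, this style costs nothing, which is exactly why BADJR could get away with having no iteration constructs at all.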
BADJR was sufficiently powerful that another graduate student used it to implement a Prolog interpreter for his thesis project (and he went on to Bell Labs a decade ahead of me -- I always was the slow one). The entire research program from which my thesis emerged was inspired by Japan's Fifth Generation Computer Systems Project, which was to use AI (okay, so now you know it failed), massively parallel computers, and dataflow architectures to take over the world. BADJR's single-assignment architecture mapped well to massively parallel dataflow architectures, and the 5G project had already expressed an intent to leverage Prolog in its AI work. So we thought this was pretty cool stuff.
And it was, in its own way. BADJR pounded recursion as a school of thought into me like no other effort had. Its arbitrary precision arithmetic taught me how multiplication and division actually work (and led to Mrs. Overclock giving me a copy of Knuth, including the ever-so-critical Seminumerical Algorithms, as a gift). And the two garbage collectors I had to implement for it (reference counting, and mark-and-sweep) helped me appreciate the issues underlying GC in the JVM decades later.
Assembler. Sure, laugh. I learned how and why programming languages work under the hood by writing tens of thousands of lines of IBM BAL and PDP-11 PAL assembler code (which made me a real C zealot as a result). That experience has served me well in my embedded work, in figuring out some of the more esoteric Linux kernel macros, and in fixing the occasional errant DSP application.
But more than that: my early assembler work circa 1976 was my introduction to object oriented design. No, really. The OS/360 I/O subsystem's assembler code introduced me to polymorphism, inheritance, modularity, and encapsulation, and to such OO mechanisms as dynamic binding and virtual tables. When C came along to be the portable assembler language I had always wanted, I used some of these same techniques, admittedly primitively implemented. When C++ finally came along, I breathed a sigh of relief: finally, a language that does all this for me.
Okay, maybe I really wasn't that smart. I'm not sure I really appreciated all those design patterns in the IOS. I just knew a good idea when I saw it, and the first rule of engineering is: steal all the good ideas.
Today I might recommend languages like Ruby and Haskell, although Prolog, Lisp, Simscript, Forth and SAS are still alive and kicking, and still very much worth learning just for the purposes of bending your head around their very different ideas of how to code. And assembler, maybe particularly Intel P4 assembler, still has its place too.
Learning new, and more importantly, radically different, programming languages will make you a better developer even if in your day job all you use is C or Java.
What new and different programming languages would you recommend?
Monday, September 17, 2007
Mrs. Overclock (a.k.a. Dr. Overclock, Medicine Woman) and I just returned from nearly three weeks in Japan. Five days of it were spent attending Nippon 2007, the 65th World Science Fiction Convention, in Yokohama. Nine days were spent with a group of over two dozen fans from the convention, travelling around Japan seeing the sights, having a grand time, and just generally making a nuisance of ourselves being henna gaijin. The remaining time Mrs. Overclock and I spent making a careful survey of the extensive and wonderful Tokyo subway systems, gawking like the small town rubes we really are.
First, the convention. The location of the World Science Fiction Convention is chosen by voting members of the WorldCon two years in advance from among the bidding locations. The WorldCon is completely fan-run. The most senior committee members may receive some compensation, since for them it is very much a full-time job, but for everyone else it is a labor of love. For a convention that pulls anywhere from 2000 to 8000 attendees, the fact that it is a volunteer effort alone makes it a remarkable undertaking.
Nippon 2007 had about 2200 attendees, of which about 800 were from outside of Japan. While many WorldCons have been held outside of the U.S., this was the first to be held in Japan, and hence the first to be run by Japanese fans. If my own experience is any indication, it was a complete success. The Japanese fans deserve a big domo arigato gozaimasu from this henna gaijin. Well done.
For many of us from the west, being in Japan is the closest we are likely to come to making first contact on another planet. For science fiction fans, this is a very resonant thing indeed. There were times I felt like I was living in The Mote in God's Eye by Niven and Pournelle, along with all that implies. (I might add that Mote is the only novel of any kind that I reread every decade or so.)
It is as difficult to quantify or qualify science fiction fandom as it is to precisely define science fiction. One of our tour group remarked that it is less than a family, but more than an organization. Mrs. Overclock characterizes it as a tribe, as in "Nothing good can come from dating outside your tribe." I like this a lot. Merriam-Webster says a tribe is "a social group comprising numerous families, clans, or generations together with slaves, dependents, or adopted strangers" or "a group of persons having a common character, occupation, or interest". Either of these serves as a workable definition of science fiction fandom.
I always know that I'll share some common interest with any fan. Perhaps I'm not into filk, maybe you aren't into anime, but we can both agree the latest William Gibson novel is just frackin' great. That's why travelling with fen (the plural of fan) almost always works well: you know the dinner conversation is going to be interesting, and almost all fen have a nearly insatiable thirst for the new and different, usually for the downright weird. They also seem to almost all have a wicked good sense of humor.
On our way to distant WorldCons, Mrs. Overclock and I play spot-the-fan in airports. It is cheating if they pull out the latest Gregory Benford novel. It is interesting to note that Mrs. Overclock and I had no problems identifying Japanese members of our tribe, regardless of the fact that we shared no spoken or written language with them. Apparently the fan gene transcends ethnic boundaries. Truly, this warms even my neutronium heart.
If science fiction is the religion of fandom, then it is a religion which has a huge and varied pantheon of gods. This is not unlike Shintoism, or for that matter Greek and Hindu mythology. And like those other pantheons, our gods walk among us. I had casual conversations at this particular convention with Gregory Benford, Larry Niven, and Joe Haldeman, and once I observed Mrs. Overclock chatting with Charles Stross. (For that matter, at least two members of our tour group were published authors, and Mrs. Overclock herself appears on a DVD and on two music CDs of science fictional sensibility that you can purchase on Amazon.com.)
Lest you think it's all about elves, Star Wars, and Spock ears, one of the panels I attended was a slide show and Q&A session by a forensic anthropologist who worked on a U.N. war crimes investigation team in Kosovo, Serbia. Holy crap, that was an eye opener; how far the Austro-Hungarian empire has crumbled. Other panels I attended covered the Sapir-Whorf Hypothesis (with Geoffrey Landis) -- no, that has nothing to do with Klingons, it's from the field of linguistics; the growth in public surveillance (with David Brin); the current thinking in life extension (with Greg Benford and Joe Haldeman); the Singularity (with Benford and Charles Stross); and legal aspects of current directions in intellectual property law (with Cory Doctorow). Only at a science fiction convention, folks. Maybe they should call it an "awesome mind blowing stuff convention".
"The future is already here. It's just not evenly distributed." -- William Gibson
This quote came to me while I found myself travelling with a group of more than two dozen fen at nearly 200 miles an hour across Japan in a superexpress or "bullet" train. It may be that every age thinks of itself as an age of miracles, but this is the age of miracles in which I live: that I can be sitting in first class comfort reading a book while travelling at the speed of a Formula 1 race car. I can do so in the company of a group from five different nations (U.S.A., Canada, Great Britain, Australia, and Sweden), yet all in the same tribe. And I can order an iced coffee from the lady with the food cart. Sometimes you just gotta step back and live in the moment.
Fandom is populated by an inordinate number of technologists, and perhaps rightly so, since it is they that must work in the future professionally. But our little group of tourists included two physicians, two lawyers, a nurse, a librarian, a retired school teacher, an airline pilot, someone running for public office, and a megalomaniacal supervillain. It was about evenly distributed in gender (provided we agree that there are two genders), and probably statistically typical for sexual orientation and physical abilities. We were probably not typical for number of college degrees per capita.
What the demographics won't tell you is that it was a damned interesting group of people with which to travel. It was with this group that we made our whirlwind tour of Tokyo, Mount Fuji, Hakone, Hiroshima, Miyajima, Kyoto, and Osaka, all the while dealing with the effects of Typhoon #9, a.k.a. "Fitow", as it hit the coast of Japan. It was sobering to stand near ground zero in Hiroshima and realize that with the exception of a single structure preserved for posterity, everything around us dated from one fateful millisecond in 1945. While we did not see every single Shinto shrine and Buddhist temple in Japan (it seems like there is one just around every corner), I'd like to think we hit all the big names. Despite the fact that I purified myself at every one, I still managed to uphold my tradition when travelling abroad and bring home a new strain of viral crud. When the epidemic hits North America, you can blame me.
Our tour would not have been nearly so successful (in fact, it could have been a total disaster -- getting fen to break their target lock with shiny pretty things is like herding cats) without the hard work of a lot of folks, and they also deserve a big domo: fearless leader and fan wrangler Ken Smookler, North American travel agent Alice Colody, our extraordinarily patient Japanese guides Kaori-san, Nobuyo-san, Yasuko-san, Yuki-san, and the many other expediters and drivers whose names I didn't catch. You brought 'em back alive!
"I don't think we're in Kansas anymore, Toto." -- Dorothy in The Wizard of Oz
Mrs. Overclock and I completed our trip with a few days spent exploring Tokyo. We stayed at The B Akasaka, a small boutique hotel that I would recommend. It is five minutes' walk from the Akasaka subway station on the Chiyoda line that is right next to (there is a God) a Starbucks. Every morning we could be found there with our lattes and curry pies, poring over maps and planning that day's campaign, while avoiding the commuter rush hour.
Tokyo is unbelievably huge, to us anyway (maybe less so if you live in New York City or London). It is divided into twenty-three wards, any one of which is the size of downtown Denver, Colorado. Tokyo and Yokohama are the first and second largest cities in Japan, and the Tokyo-Yokohama corridor is the most densely populated region on the planet. We took an hour-long bus ride from Yokohama to Tokyo and never once left the urban cityscape. I've read that it was this corridor that was the inspiration for William Gibson's U.S. eastern seaboard "Sprawl" in his novel Neuromancer.
But we mastered the London tube, so we were equally determined to tackle the Tokyo subway system. We managed to navigate our way not only across different subway lines, but even across different competing subway systems, the Tokyo Metro and the Toei. We found our objectives in Akihabara ("Electric City"), Asakusa, Shibuya, and Shinjuku. We had a good time. We drank a lot of beer, hydration under such circumstances being crucial to clear thinking. That, and we never had to drive.
Several times while in Japan, and never more so than while using the subway, it occurred to us that there are just so many successful design patterns for doing anything. And so it turns out that the Tokyo subway works much like the London, Chicago, and Washington D.C. subways. This also applies to getting a cab, ordering a meal, and paying for a scale model Tachikoma in an Akihabara robot store.
We did encounter some new patterns.
The shinkansen or "superexpress" trains (only called "bullet trains" in the West) have almost no luggage space. It's a good thing too, since a superexpress stops for only about one minute at each station (this is not an exaggeration). You have to be at the door with your luggage in hand ready to get off or on. The Japanese solve this problem by shipping luggage separately via motor freight. We used this successfully twice during our trip, packing for a few days in backpacks until we could rendezvous with our luggage. We also did laundry twice during the trip, having taken only a week's worth of clothes. Partly this was due to the desire to travel light, but also due to weight restrictions on domestic flights.
We successfully deciphered the bizarre Japanese system of street addresses that confounds even its cab drivers to find Mrs. Overclock's Japanese bead factory outlet. We are both pretty amazed that this worked. The Japanese system does not make use of street names. In fact, with the exception of very major thoroughfares, streets are not named. Although some Japanese cities are based on a grid system adopted from the Chinese, Tokyo, especially the older parts, is a maze of twisty passages which all look alike. Apparently this was a deliberate attempt by paranoid shoguns to make invasion difficult.
We ate at a typical Japanese lunch counter, where you begin by inserting coins into a vending machine. As you insert coins, possible meal selections are illuminated on the menu on the front. When you finally make a selection, you get a ticket. You give the ticket to the person behind the counter, and after a few minutes of hash slinging your meal appears. The staff never handles money. Mrs. Overclock and I discovered that the chicken katsu don was just like what we eat at our favorite Japanese restaurant near our home.
The Japanese have a very civilized approach to religion: you can have none, or you can have several simultaneously, and only practice when it suits you. I found this practical approach so compelling that I found myself praying at every Shinto shrine and Buddhist temple we visited, when I haven't been to a Western church in years except for weddings or funerals.
My month spent in the People's Republic of China in 1995 was a life changing experience for me, and it prepared me for three weeks spent being deaf, dumb, and illiterate in Japan. It is humbling to not be able to read signs or even ask simple questions. But I never really felt handicapped. Japan can be easily navigated, enjoyed, and savoured, without speaking Japanese.
And it should be. While the U.S. has a culture of individualism that (I would like to think) promotes innovation, Japan has a culture of group harmony that promotes efficient production. As peoples we are stronger together than we are separately, because our strengths are complementary. Both cultures have much to learn from one another.
Plus, they have lots of really cool stuff.