Take off your shoe and mindfully touch your big toe with your index finger. The sensations you experience are traveling along three different paths:
- the light reaches your eyes, then the visual cue travels a very short distance to your brain through your nervous system;
- the sensation of your finger touching your toe travels through your nervous system, up your arm to your spinal cord, then a short distance to your brain;
- the sensation of your toe touching your finger travels a relatively long distance up your leg, up your spinal cord, to your brain.
Here's the thing: there is about an 80 millisecond difference between the latency from your eyes and the latency from your toe, with your finger falling in between. 80ms is nearly a tenth of a second. That's easily perceptible by the human brain; we recognize a 1/10s delay routinely when listening to music. I own a mechanical watch - a vintage Eterna-Matic 36000 - that audibly ticks at 10Hz, or 10 times a second.
But this difference in latency isn't apparent when we do this little experiment. Why not? There are some shenanigans going on in our brain when it processes those sensations, and they affect our perception of time.
Googling for this topic reveals all sorts of interesting research, the most common result being "hearing is faster than seeing", in part because of the processing involved. A lot of the research aims to determine how fast is fast enough to seem "instantaneous" and is motivated by response-time studies. In recent decades, the computer gaming industry has been doing this kind of research.
But I was struck by the eerie similarity between this vagary of human perception and how design and implementation decisions can affect latencies, and therefore the ordering of events, in real-time systems - something I wrote about in this blog in Frames of Reference just a month earlier.