Consider the Thermometer
Reviewed by Mike Hoye / 2022-03-08
Keywords: Editorial
I want to tell you a story about measuring things. It starts with a simple one, the thermometer. Even though a rough understanding of the basic principles of the thermometer is two thousand years old, for centuries the whole idea of measuring temperature was dogged by superstition. How, the question went, could you measure an experience as subjective and ethereal as temperature? Even though you could demonstrate the basic idea in ancient Greece with glass tubes and a fire, the idea was absurd, like asking how much a poem weighs.
It was more than 1600 years between the first known glass-tube demonstrations of the principles involved and Santorio Santorio's decision to put a ruler to the side of one of those glass tubes. It was most of a century later before Carlo Renaldini proposed that Christiaan Huygens's suggestion of measuring against the freezing and boiling points of water be used as the anchor points of a universal scale. (Isaac Newton's proposal for the increments of that gradient was 12, incidentally, a decision I'm glad we didn't stick with. Forty years later Anders Celsius had a better idea.)
The first precision thermometers—using mercury, one of those unfortunately-reasonable-at-the-time decisions that have had distressing long-term consequences—were invented by Fahrenheit in 1714. More tragically, he also proposed the metric that bears his name, but the tool worked. And if there's one thing in tech that we all know and fear, it's that there's nothing quite as permanent as something temporary that works.
In 1900, Henry Bolton described this long evolution as "encumbered with erroneous statements that have been reiterated with such dogmatism that they have received the false stamp of authority." Today, of course, outside of the most extreme margins, these questions are behind us.
Computing, as a field, is much less than a century old; most of the metrics we've tried to establish haven't proved out, and so many of the benchmarks we've chosen are just…arbitrary. (Nobody's counting lines of code anymore, but why are sprints two weeks?)
But we've got tools that the smartest people in the 17th century didn't. We can take our raw material, these huge piles of precision data-ore, and feed it into math furnaces that can anneal it, polish it, and harden it to a point.
But computing is as broad as any field of human endeavour has ever been, at the intersection of math, engineering, art, and the social sciences. How could you measure something like that? The whole idea is absurd. You might as well be asking how much a poem weighs.
Which brings me to these lightning talks. It's still early days. This field is so young, so often dogmatic and superstitious. But I'd like to introduce you to people who are building thermometers.
Take a look around you, at the ridiculous comforts of modernity that can only exist because we share this single consistent yardstick. The glass in your windows, the beams in the walls. Bread baked just so, served on ceramic plates and cut with knives that hold an edge. And then ask yourself, what about all this software, all these ideas we take out of our heads and turn into machines? What could we build, if we weren't starting with loose clay, shaping it outside in the weather?
You might want to be there: I think it will be pretty cool.