I have just been reading Alien Embassy by Ian Watson (in the French translation published by Presses Pocket). This is a somewhat esoteric book, and I can’t say that I’m really fond of that genre. But it is an interesting story, it’s well written, and I enjoyed it.

However, something really puzzled me, and it has to do with what seems to be a deep (mis)understanding of relatively fundamental mathematics, namely irrational and imaginary numbers, somewhere in the first third of the book.

#### Irrational numbers

Unfortunately, I can’t quote the original text, only back-translate from the French, which may be hard to map onto the original wording. It reads something like this (I’d appreciate it if some reader could send me the actual original text):

– He’s talking about mathematics, whispered the voice of Klimt. Irrational numbers are numbers like pi, the constant ratio between the circumference and the diameter of a circle, you must know that.

– That’s about twenty-two sevenths, I added; I knew at least that!

– It’s a very important number. Without it, geometry could not exist, Klimt commented. It represents a true geometrical relation. Pi appears as soon as you draw a circle. Yet it is totally irrational. There is no explanation for the sum “twenty-two sevenths”. You can divide twenty-two by seven as long as you want, you will never get a really definitive answer. […]

This is bogus on two counts.

- First, the author reasons about 22/7 instead of reasoning about pi. The two numbers are really not that close: to four decimal places, 22/7 is 3.1428 whereas pi is 3.1415. It’s really bizarre to talk about the irrational nature of pi using a different number, 22/7, that merely happens to share its first three digits with it…
- But second, and more importantly, *irrational* in mathematics means precisely that the number cannot be expressed as the ratio of any two integers. That seems to have eluded the author entirely: he appears to think that 22/7 itself is irrational, and that “irrational” denotes a number with an infinite number of decimals, like 1/3 = 0.33333…
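The distinction is easy to demonstrate in a few lines of Python (a sketch of the argument above, not anything from the book): 22/7 is rational by construction, its decimal expansion merely repeats forever, while pi agrees with it only up to the third digit.

```python
from fractions import Fraction
import math

# 22/7 is a ratio of two integers, hence rational *by definition*,
# even though its decimal expansion never terminates.
approx = Fraction(22, 7)
print(float(approx))   # 3.142857142857143
print(math.pi)         # 3.141592653589793

# Long division of 22 by 7 cycles: the remainders repeat with period 6,
# so the digits repeat too -- 3.142857 142857 142857 ...
digits = []
r = 22 % 7
for _ in range(12):
    r *= 10
    digits.append(r // 7)
    r %= 7
print(digits)  # [1, 4, 2, 8, 5, 7, 1, 4, 2, 8, 5, 7]
```

A repeating expansion is exactly what characterizes rational numbers; pi, being irrational, has a decimal expansion that never settles into any repeating block.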

#### Imaginary numbers

There is a similar issue a couple of pages later:

– From what I know, our scientists consider mathematics as imaginary dimensions, said Klimt

– Imaginary dimensions? the Azuran fluttered. Imaginary? Ah, but that’s where you got it wrong. These other dimensions are anything but imaginary. They really exist. […]

Here, multiple layers of a very strange understanding of mathematics and physics mix together. I don’t think any scientist has ever considered *mathematics* as a whole to be imaginary dimensions. Mathematics uses symbols and relations between symbols, and I think most mathematicians would consider the mathematics we can ever talk about to form a countable set. There are imaginary *numbers*, a term that refers to complex numbers like *i* that have a negative square (e.g. *i*² = −1).
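Despite the suggestive name, imaginary numbers behave like any other numbers and obey ordinary arithmetic. Python’s built-in complex type makes this concrete (here `1j` plays the role of the mathematical *i*):

```python
# The imaginary unit: Python spells i as 1j.
i = 1j

# Its defining property: the square is -1.
print(i ** 2)              # (-1+0j)

# "Imaginary" numbers combine by ordinary arithmetic; the product of a
# complex number with its conjugate is even a plain real number.
print((2 + 3j) * (2 - 3j))  # (13+0j)
```

Nothing here is any more “imaginary” than the rest of arithmetic; the word is just an unfortunate historical label.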

Where imaginary *dimensions* do show up is, for example, in some presentations of Einstein’s special relativity, where time is treated as an imaginary dimension, as opposed to the three real spatial dimensions. This is a mathematical trick to account for the form of distance in space-time, which has three squares with one sign and one square with the opposite sign: *ds*² = −*dt*² + *dx*² + *dy*² + *dz*².
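To spell the trick out (in units where the speed of light c = 1, a convention I am assuming here): substituting an imaginary time coordinate w = *i*t gives dw² = −dt², which turns the odd-sign-out interval into a Euclidean-looking sum of four squares.

```latex
% Minkowski interval (c = 1), with time carrying the opposite sign:
%   ds^2 = -dt^2 + dx^2 + dy^2 + dz^2
% Substituting w = it, so that dw^2 = -dt^2, yields
%   ds^2 = dw^2 + dx^2 + dy^2 + dz^2
\[
  ds^2 = -dt^2 + dx^2 + dy^2 + dz^2
  \quad\xrightarrow{\;w \,=\, it\;}\quad
  ds^2 = dw^2 + dx^2 + dy^2 + dz^2
\]
```

The geometry then looks formally like ordinary four-dimensional Euclidean distance, which is the whole point of the trick.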

#### Why does it matter?

Why bother with two little errors in what is, after all, simply intended to be entertainment? Well, it shows the limits of terminology when we cross domain boundaries. What seems to have confused the author in both cases is that a term which has a very precise meaning in mathematics happens to be really far from the everyday meaning of the same word.

The problem is that this gave the author, and possibly his readers, a false sense of understanding. Curiously, this echoes in my mind a number of issues I had with popular science books, some of which were written by well-known names in physics. And it is also a problem with the operating systems people use on computers all the time: we tend to forget that a “desktop” or a “window” has nothing to do with the everyday meaning of the term, which may confuse beginners quite a bit.

I’m not sure there is much we can do about it, though, other than gently correcting the mistakes when we see them. After all, short of inventing new terminology all the time (like “quark” or “widget”), we just have to proceed by analogy (e.g. “mouse”, “complex numbers”) and then stick with that wording. We consciously forget that the word is the same, and the concepts end up being quite different in our brains.

But this has implications in another area I am interested in, concept programming. We give names to concepts, but these names overlap. Our brains know very well how to sort this out, but *only after we have been trained*. As a result, the big simplification I was expecting from concept programming might not happen after all…

If there’s one thing I’ve learned from working with the creator of Hungarian notation, it’s that giving something a name that is just a sequence of random characters can be better than coming up with an evocative name by analogy, one that will mislead someone new to the concept into thinking that they understand it.