It makes some sort of superficial sense to measure type in ems. Usually ‘m’ is the widest character in our alphabet, though ‘w’ is often just as wide and occasionally wider. Once you’ve fixed on ‘m’, for whatever reason, ‘n’ suggests itself as an obvious half unit: ‘m’ looks like two upside-down ‘u’s while ‘n’ is one. Leave aside the fact that one has three legs and the other two — the upside-down pockets justify you in claiming ‘n’ is half of ‘m’. But of course it isn’t. Just measure them. You could no doubt find a quirky typeface whose ‘n’ was exactly half the width of its ‘m’, but mostly this won’t be the case.

These reflections are prompted by an exchange on the SHARP listserv. Frank E. Blokland has a learned and detailed post about measuring ems. The discussion was started by someone asking how one might calculate the number of ems in a book. (I hope I don’t have to go into this here. The bare bones can be worked out from my earlier post, though as someone on the listserv commented, why would anyone want to do this!) Much of the scholarly discussion has started from the assumption that there is some rational basis for measuring in ems. I suspect that nothing of the sort is actually the case. In the early days of print there was no standardization in type. One printer’s ‘e’ might look like this and another’s like that. In the course of time type began to be a bit more standardized: all the printers in town might buy their type from the same type founder, and then all their ‘e’s would be identical. But people in another town or country would be buying theirs from a different founder, and their type design and proportions might be entirely different. It’s not like someone came down from Mount Sinai or Mount Nurem and announced “God saith, behold, this shall be unto you an em; and this eke shall be your en. And the Lord thy God saith let there be two ens unto every em.” The fact that when Johannes Gutenberg talked about an em he had in mind something which may have been a millimeter wider or narrower than what Aldus Manutius had in mind when he referred to one is neither here nor there. They were close enough, and that was fine for discussion and for getting the job done. [I should perhaps say that I have no idea whether Aldus’ em was or was not identical to anyone else’s. All that matters is that there was no God-given or man-made reason why it should be.]
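For what it’s worth, the bare-bones arithmetic is simple enough to sketch. The traditional em is the square of the type size, so the sum is just measure, lines, and pages multiplied out. All the figures below are invented for illustration, and a real casting off would of course have to allow for part lines, chapter openings, headings and so on:

```python
# A rough sketch of the "how many ems in a book" arithmetic.
# The em is the square of the type size: a line set in 12 pt type
# on a 24-pica measure is (24 * 12) / 12 = 24 ems wide.

def ems_in_book(type_size_pt: float, measure_picas: float,
                lines_per_page: int, pages: int) -> float:
    """Estimate the total number of ems of type in a book."""
    measure_pt = measure_picas * 12           # 1 pica = 12 points
    ems_per_line = measure_pt / type_size_pt  # one em equals the type size
    return ems_per_line * lines_per_page * pages

# e.g. 12 pt type, 24-pica measure, 40 lines a page, 320 pages:
print(ems_in_book(12, 24, 40, 320))  # → 307200.0
```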

I suspect that printers’ nomenclature grew up in a fairly informal workplace-culture environment. We’ve all worked in offices: co-workers will refer to things in a sort of in-group jargon. Early print workers had no concern for what researchers 5¾ centuries later would think. So, if they called a big space an em, and one half its size an en, Hinz in the corner wouldn’t shout out “But you can’t say that; an ‘n’ isn’t exactly half as wide as an ‘m’, lads”. Nobody would waste their time measuring anything with any precision: they were there to set type and print pages, and if they’d all called a big space a Wienerwurst, they’d still all have been able to throw you one when you needed it. And that was all that was needed. So they called it after the letter ‘m’. I hesitate to tell serious academics not to waste their time looking for any more rational answer, but I do doubt there’s one to be found.

But why did they choose ‘m’ rather than ‘w’, when the two are similar in width? And half of a ‘w’, the ‘v’, does echo the m/n relationship — rather better actually, as a ‘w’ is just two ‘v’s, without that duplicated middle-leg issue. I wonder if there’s any relevance in the letters’ pronunciation. Both ‘m’ and ‘n’ are pronounced much the same in English, French, German, and Dutch, while ‘v’ and ‘w’ sound quite different. Still, one can hardly imagine Hinz and his colleagues deciding against a “fvow” space on the grounds that people in other countries might be confused, while “en” would be internationally comprehensible! The fact that we call it “double-u” while the French have it as “double-v” may also have some relevance.* Obviously the French have the edge over Anglo-Saxons here, since ‘w’ does look like two ‘v’s. But hang on a minute: when you learned to write you were taught to form a ‘w’ as two joined-up ‘u’s. So can it really be the case that ‘w’ got its name in Britain under a scribal, manuscript-based regime, while in France it was named after a print-based picture? Of course ‘w’ doesn’t feature much in French and other Romance languages, so they wouldn’t really have needed to invent a name for it till later on, when the French Academy was no longer able to keep out foreign loan words. Anyway, I assume the nomenclature started in Germany. If you look at Fraktur type (Wikipedia shows the alphabet) you can see that ‘v’ and ‘w’, while obviously related, are not a double act in the intimate way that ‘m’ and ‘n’ are.

But this makes me wonder whether calculating things in “ems” may have originated in the scribal world and was simply taken over by printers. Now there’s something academic researchers could get their teeth into! Personally I doubt it. Again, working from a complete lack of specialized knowledge and using my common-sensical methodology, I’d say it seems unlikely that workers would start to talk in terms of standard units of measurement until there was something there to measure. A lump of type invites you to think about each character as a unit in a way that a line of handwritten script doesn’t. Also I’d say the evidence points to scribes being paid by the project or the job: why, if you were being paid by the em or the letter, would you devise an array of abbreviations which seem to appear in manuscripts mainly as a means of saving the scribes time and wrist power? (As well of course as saving rather expensive raw materials.)

Why should it be troublesome that em and en remained slightly slippery, imprecise terms until Apple plumped for a precise definition? The system worked.
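That modern precision is easy enough to illustrate. In digital type a font’s outlines live on a grid of “units per em” (commonly 1000 for PostScript-flavoured fonts, 2048 for TrueType), and one em simply equals whatever point size you set the type at. The glyph widths below are invented for illustration, but the scaling arithmetic is the standard one — and note that even here the ‘n’ refuses to be exactly half the ‘m’:

```python
# Scaling a glyph's width from font units to points.
# In digital fonts, one em = the point size; outlines are drawn on a
# units-per-em grid (1000 is a typical PostScript-style value).

UNITS_PER_EM = 1000

def advance_width_pt(glyph_units: int, point_size: float) -> float:
    """Convert a glyph's advance width in font units to points."""
    return glyph_units * point_size / UNITS_PER_EM

# A hypothetical 'm' of 833 units and 'n' of 556 units, set at 12 pt:
print(advance_width_pt(833, 12))  # → 9.996
print(advance_width_pt(556, 12))  # → 6.672 — not half the 'm'
```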



* The Oxford Dictionaries site tells us — “English uses the Latin alphabet of the Romans. However, this had no letter suitable for representing the speech sound /w/ which was used in Old English, though phonetically the sound represented by /v/ was quite close. In the 7th century scribes wrote uu for /w/; later they used the runic symbol known as wynn. European scribes had continued to write uu, and this usage returned to England with the Norman Conquest in 1066. Early printers sometimes used vv for lack of a w in their type. The name double-u recalls the former identity of u and v, which you can also see in a number of words with a related origin, for example flour/flower, guard/ward, or suede/Swede. (Based on the Oxford Companion to the English Language)”

LATER: I am ashamed to note that when speculating on this question I overlooked the obvious reason: there wasn’t any W till the early 17th century, so using it as a measuring standard in the early days of printing could never have arisen.