
units and measures


Tracy Connell

Sep 9, 2003, 7:37:09 AM
Just a quick question.

Why do the units and measures default to picas? Who works in picas? I normally work with points and have to change the units of every new document.

Basically, is there a way to make points the default unit when I open InDesign, instead of having to change it every time? It drives me nuts!!!!

Thanks

Tracy

Ken Grace

Sep 9, 2003, 8:45:13 AM
Edit > Preferences > Units & Increments

k


Ken Grace

Sep 9, 2003, 9:20:42 AM
When you set defaults in InDesign you can either set defaults for an
existing document, or for all new documents.

For an existing document, make the changes and they will stay when you
reopen it.

For all new documents, close all open documents, make the changes, and
restart InDesign.

k


Tracy Connell

Sep 9, 2003, 8:15:47 AM
Thanks, but I know how to change them. How do I make it stay on points? If I open a new document, it defaults back to picas.

I need it to stay on points so that when I open a new document I don't have to go into Edit > Preferences > Units & Increments every time, and so that when I close and reopen InDesign it stays on points.

Can I make Points the DEFAULT?

Thanks

Tracy

Tracy Connell

Sep 9, 2003, 8:19:39 AM
Never Mind...

Solution was to change them when no documents were open!

Tracy

Michael S. Flynn

Sep 9, 2003, 2:00:46 PM

> Who works in Picas?

I do.

Dave Saunders

Sep 9, 2003, 2:17:05 PM
Me too. I find picas to be the most convenient of the various possibilities. Sometimes, I'll enter a value like 0p25.5 rather than do the math in my head, if I happen to need 25.5 points.

Dave

Scott McCullough

Sep 9, 2003, 3:27:23 PM

>Who works in Picas?


I always work in picas. I got in the habit when I worked in a high-end textbook composition shop. There it was sink or swim--if you couldn't make sense of the designer's notations on the style pages for a book, you sure learned quickly. As Dave said, it's easy to enter point values in pica format. In addition, pica values are smaller and easier to remember than points or inches. Once you're accustomed to it, 25p6 is a lot easier to work with than 306 points or 4.25 inches.

Scott

M Blackburn

Sep 9, 2003, 5:10:51 PM

> Who works in Picas?


Me too. Almost exclusively.

Michael S. Flynn

Sep 9, 2003, 6:23:52 PM
I had two years of architecture, and coming from knowing most of the decimal equivalents of inches out to 32nds, I balked strongly at learning to use picas and points until I realized how much easier they were to work with, especially when I saw that they made dividing inches into odd sizes so much faster. 1/3, 1/6, 1/9 and 1/12 are odd sizes that conventional feet/inches don't handle well at all but are instantly at your fingertips with picas/points, in addition to the usual 1/16, 1/8, 1/4, 1/2 and 3/4. And knowing that you have 12 points per pica, 6 picas per inch and 72 points per inch makes converting something like 5/8" to 3p9 something you can do in your head as fast as you can type it.
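Michael's head arithmetic is easy to check in code. Here is a minimal Python sketch (the function name is mine; it assumes the desktop-publishing convention of exactly 72 points per inch and 12 points per pica):

```python
def inches_to_picas(inches: float) -> str:
    """Convert inches to pica notation like "3p9" (picas, "p", leftover points)."""
    points = inches * 72                 # 72 points per inch
    picas, rem = divmod(points, 12)      # 12 points per pica
    if rem == int(rem):
        rem = int(rem)                   # show whole points without a decimal
    return f"{int(picas)}p{rem}"

print(inches_to_picas(5 / 8))   # 5/8 inch -> 3p9
print(inches_to_picas(0.75))    # 3/4 inch -> 4p6
```

The 5/8" example matches Michael's figure: 0.625 in x 72 = 45 pt = 3 picas and 9 points.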

Tracy Connell

Sep 10, 2003, 6:17:50 AM
Well, guess I started something controversial there! Sorry!

I just don't get picas!

Tracy

Scott McCullough

Sep 10, 2003, 9:13:30 AM
No problem, Tracy. I think everyone took it in the spirit you meant. I do suggest you try learning the pica system--since you're already familiar with points, it's a relatively easy step to learn picas.

Cheers!

Scott

George Bilalis

Sep 10, 2003, 4:03:50 PM
Which brings back the controversy of the Anglo-Saxon measuring system as compared to the metric.

Tracy, for us in Europe this question is just that: having to think in duodecimal or hexadecimal terms instead of decimal.
But as the others say, we had to learn to use pica/point (along with cicero/point and didot/point, along with foot/inch) because it makes much sense in typography.

It's better to fit type height (always in pica points) to leading measured in pica/point than anything else, and it's only logical then to measure the column height in pica/point as well.
In Europe, where everybody uses cm as a unit, you can't expect an accurate fit in a column if it isn't measured in pica/point.

thanks
George

P.S. Who would think of measuring speed in furlongs per fortnight? <LOL>

Eelko Heuvelmans

Sep 11, 2003, 4:09:24 AM
Let's join this discussion, can't be bad :)
I'm a student in my third year of a total of four, and now I'm working for half a year on an assignment (a 'stage', we call it - an internship) at a company.

I never used InDesign before, and now I have to learn to script it... quite a challenge!

But, back on topic.
When I first used InDesign I thought, let's use points. That's pretty logical, I think, because when working on a screen you think in points.
But after a while I changed to centimeters, because the layout is easier to understand.
I never got the hang of those strange inches and picas??? Wow... I had never even heard of them, hihihi.

But as I read above, converting points and inches to picas is quite simple, so they should be the units to use if you want to be able to exchange your document with others?

I guess the discussion about which units to use is endless; everybody has their own favorite. Especially nowadays, when computers can lay out the page and convert if necessary?

George Bilalis

Sep 11, 2003, 10:02:22 AM
Scott, you wrote:

> Since picas are the international standard units of measure in high-end
> design and publishing, that would probably be the safest way to go.


As you know, the standard page in Europe (and the rest of the metric world) is A4 (210 mm x 297 mm), then A3, etc.
So we define an A4 page size, and then do we have to think of it as 49p7.276 x 70p1.89 (which is an approximation)?

This is an example of how 'international' a measuring system might be.

regards
George

Scott McCullough

Sep 11, 2003, 9:42:33 AM

> so it should be the units to use if you want to be able to exchange your
> document with others?


Eelko:

That's an excellent point--as long as you're the only person working on the files, it doesn't matter what units of measure you use. But if you're going to be collaborating with others, it's important to agree on basic conditions like units so everyone is speaking a common language. Since picas are the international standard units of measure in high-end design and publishing, that would probably be the safest way to go.

Scott

George Bilalis

Sep 11, 2003, 10:35:25 AM
Scott, the pica/point system WAS created long before you adopted metric in the U.S. So was the A4 format. (And so were the other measuring systems still used in print.)

It's just that DTP layout was invented later and had to use what was already established. But yes, I agree with you, it does add some complication <grin>

George

Scott McCullough

Sep 11, 2003, 10:10:16 AM
George:

That does add some complication, doesn't it? Too bad the point/pica system wasn't created after metric sizing became the standard; then we wouldn't be stuck with 8.5 x 11 in. (51p x 66p) as the default page size here in the U.S.

Scott
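Both page sizes quoted in this exchange (George's A4 figures upthread and Scott's US letter figures) come out of the same multiplication by 6 picas per inch. A hedged Python sketch, assuming the PostScript pica of exactly 1/6 inch (the function name is mine):

```python
MM_PER_INCH = 25.4

def mm_to_picas(mm: float) -> str:
    """Express a millimetre length in picas and points (PostScript pica)."""
    points = mm / MM_PER_INCH * 72       # 72 PostScript points per inch
    picas, rem = divmod(points, 12)      # 12 points per pica
    return f"{int(picas)}p{rem:.3f}"

print(mm_to_picas(210))    # A4 width  -> 49p7.276
print(mm_to_picas(297))    # A4 height -> 70p1.890
print(8.5 * 6, 11 * 6)     # US letter -> 51.0 66.0 picas
```

The A4 values land exactly on George's approximations, which is expected since picas and millimetres share no common whole-number ratio.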

John O

Sep 11, 2003, 10:43:40 AM
>before you adopted the metric in U.S.

Aside from the engineering disciplines, we haven't done this yet. Probably
never will, at the rate we're progressing. :-)

John O


George Bilalis

Sep 11, 2003, 10:51:14 AM
Sometimes John, legacy is stronger than reason.
Probably never, you are right

George

Ben Dupre

Sep 11, 2003, 12:35:23 PM
When scripting, doesn't it make sense to use metric units?

Can Visual Basic add, subtract, multiply, and divide using picas/points?

Ben

Dave Saunders

Sep 11, 2003, 12:45:09 PM
Makes little difference. Actually, it makes the most sense to use points in scripting, because some measures are in points anyway, so by forcing them all to points you get consistency throughout the script.

But I find it better to work in picas in my scripts because that's what I think in, so I just divide the measurements that are in points by 12 when I need to include them in arithmetic expressions with other measurements.

Dave
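A sketch of Dave's habit in Python (the variable names and measurements are mine and purely illustrative; InDesign scripting itself would be JavaScript or Visual Basic, but the arithmetic is identical):

```python
POINTS_PER_PICA = 12

def pica_to_points(picas: float) -> float:
    """Normalize a pica measurement to points for arithmetic."""
    return picas * POINTS_PER_PICA

# Hypothetical layout arithmetic: everything converted to points first,
# so pica measures and point measures can be mixed safely.
frame_width_pt = pica_to_points(30)            # a 30-pica text frame = 360 pt
gutter_pt = 9.0                                # a 9-point gutter
column_pt = (frame_width_pt - gutter_pt) / 2   # two equal columns
print(column_pt)                               # 175.5 pt per column
```

Keeping one internal unit and converting only at the edges is exactly the consistency Dave describes.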

Alan Gold

Sep 12, 2003, 12:15:08 AM
I think I read an article in a recent issue of Electronic Publishing magazine about how the size of a "point" has evolved over the last 100 years.
I believe I remember reading 15 years ago that the 72-points-to-the-inch value was a construct of desktop publishing. (Of course, that's a long time ago for me to remember, so I could be wrong.)
AG

Ken Grace

Sep 12, 2003, 5:59:34 AM
InDesign manual, p. 52:

"Like other graphic-arts software from Adobe and other companies, InDesign
uses Postscript points, which don't correspond exactly to traditional
printer points. There are 72.27 traditional printer points in an inch, as
opposed to 72 Postscript points."

k
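The difference Ken quotes is small but real: one physical inch holds 72.27 traditional printer's points but exactly 72 PostScript points, so a traditional point is about 0.4% smaller. A quick Python sketch of the conversion (function name is mine):

```python
TRADITIONAL_PPI = 72.27   # traditional printer's points per inch
POSTSCRIPT_PPI = 72.0     # PostScript (InDesign) points per inch

def traditional_to_postscript(pts: float) -> float:
    """Re-express a length given in traditional points in PostScript points."""
    return pts / TRADITIONAL_PPI * POSTSCRIPT_PPI   # convert via inches

# One inch of traditional points (72.27 of them) is exactly 72 PS points.
print(traditional_to_postscript(72.27))   # 72.0
```

In practice the discrepancy only matters when matching digital output against old metal-type specimens or rulers.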


George Bilalis

Sep 12, 2003, 6:10:16 AM
Alan,

The point is 1/12th of one of the following :
1. Pica
2. Cicero
3. Didot
These three being different in measure, a point can be any of three lengths. That's the definition of ambiguity.

You are right in saying it kept developing (as a definition). The original typographer's pica was not an exact subdivision of the inch (actually it was slightly less than 1/6 of an inch). Then some time ago (18 years maybe), after DTP was invented, the pica was rounded to an exact 1/6 of the inch, so now we have a typographer's pica (the old traditional definition) and a PostScript pica at exactly 1/6 of the inch.
Then along comes the 'point' in the pica system (and another 'point' in each of the Cicero and Didot systems).

That's not important now, as long as we know what we are talking about.

regards
George

Eelko Heuvelmans

Sep 12, 2003, 8:03:45 AM
Pfff... this stuff is quite confusing for a newbie, hahaha!
Anyway, it's good fun reading and learning about it!

But I will stick to the metric system for a while ;)

Dominic Hurley

Sep 12, 2003, 9:12:04 AM
As I recall, the original definition of the English-American pica was that there were 83 picas to 35 centimetres (or 996pts to 35cm). Though exactly why those numbers were chosen, I'm not sure.
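Dominic's 83:35 figure can be sanity-checked against the traditional point size quoted elsewhere in the thread. A small Python sketch (83 picas = 996 points per 35 cm):

```python
MM_PER_INCH = 25.4

# Dominic's definition: 996 points in 350 mm.
points_per_mm = 996 / 350
points_per_inch = points_per_mm * MM_PER_INCH
print(points_per_inch)   # about 72.28 points per inch
```

That comes out a shade over 72.27, consistent with the traditional printer's point, so the 83:35 ratio and the 72.27-points-per-inch figure describe essentially the same unit.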

M Blackburn

Sep 12, 2003, 10:20:00 AM
PS: By 'working piece', I don't mean it was used to set type; it was physically used in measuring the slugs being produced by typesetters and in producing copies of the standard.

M Blackburn

Sep 12, 2003, 10:11:25 AM

> The original typographer's pica was not an exact subdivision of the inch


Actually George, that isn't correct according to what I've read. The original pica WAS exactly a sixth of an inch. What happened was that the pica measure used as the standard was an actual working piece and over the years it wore down. When the problem was recognized, the piece was locked away to protect the standard, but by then the standard was what it was. Now because of DTP the original standard is back.

Dominic Hurley

Sep 12, 2003, 11:26:11 AM
That doesn't accord with the story I've read of the 83:35 ratio. But then, like so many of these things, who can say for sure now?

Ken Grace

Sep 12, 2003, 1:17:18 PM
Wasn't it a fixed division of the distance from Caxton's thumb to his elbow,
and got smaller when he shrank with age?

k


M Blackburn

Sep 12, 2003, 2:03:48 PM
Dominic,
It seems to me that your story deals with conversion between two systems of measurement and not the genesis of either. I really doubt any measure would ever be based on a conversion from an existing measurement system; that would render the new measure part of the original system, unable to be defined on its own. My story is more feasible, I do believe that there is a physical piece of metal (probably lead, eh) that is the defining measure of the pica, just as there is an iron rod sequestered away in London that is the defining standard for the foot. (The fact that, in the case of the foot, the original measure was based on something less definitive is not really the point. That rod was cast to represent the length of the barley grains or king's foot, whatever, and to provide an enduring standard that can physically be compared to copies.)

Alan Gold

Sep 12, 2003, 1:59:25 PM
Not sure if you can get to this article without going through their "free registration screen", but here's more about it:

<http://ep.pennnet.com/Articles/Article_Display.cfm?Section=Archives&Subsection=Display&ARTICLE_ID=183124&KEYWORD=points%20picas>

M Blackburn

Sep 12, 2003, 2:12:05 PM
Alan Gold's link is informative.

Dominic Hurley

Sep 13, 2003, 9:10:46 AM
It's certainly a confused story. I don't understand the first part, though:

"Simon Fournier proposed a system of 72 points per inch ... and published a printed scale ... Depending on the weather, the printed scale changed in size. Since the ruler was used as a reference, printers and font makers suffered from inconsistent tools and measures. ... François Didot proposed a solution by defining a point as exactly 1/72 of a French inch."

If the original point was defined as 1/72 of an inch, then why did Didot's proposal to define it later as 1/72 of a French inch solve the inconsistent ruler problem? Are we to assume that there was an absolute French inch that all printers had access to and that never changed whereas the only copy anyone had of the original inch was a weatherbeaten printed scale? Or maybe they'd just invented weatherproof rulers by then!

PS. Why could a measure not be based "on a conversion from an existing measurement system"? What's the difference between saying there are 83pc to 35cm and a foot is "the length of the king's foot"? Must every measure be taken from something that is not originally a measure? So a centimetre must be defined not as 1/100 of a metre but as the length of a barley corn or what have you?

M Blackburn

Sep 15, 2003, 2:44:50 PM
Dominic,

Just because any length can be measured in inches or in centimetres doesn't make the systems related. All measurement systems are initially based on something in the real world. The Imperial system was based on things such as barley grains or the length of the King's outstretched arm; metric was an attempt to establish a system based on scientific principles (I can't remember the specifics, but they screwed up a bit and had to fudge some of the values anyway). Conversion is necessary because the two systems are based on different fundamental values. If a new measurement system were based on a value from another system, it would be a subset of the original, unable to be defined without reference to the original. The inch wasn't defined because 2.54 barley grains were needed to make a predetermined length; the inch was developed because a predetermined number of whole grains came to a length that was then named an inch.

Richard Rönnbäck

Sep 15, 2003, 3:19:43 PM
Here is a comprehensive link on the origin of units and measures

http://www.wikipedia.org/wiki/Weights_and_measures


Dominic Hurley

Sep 15, 2003, 8:01:29 PM
You miss my point. You can easily define a new measure in terms of an existing measure, and this doesn't for evermore tie it to that existing measure, any more than inches are for evermore tied to barley grains. And it certainly doesn't make it a "subset" of the other measure.

I just don't see the essential difference between basing a measurement system on something from the physical world and basing it on another measurement system. You say that the latter means that it is "unable to be defined without reference to the original", but then logically isn't that also the case for the former: e.g., feet must always refer back to the King's foot (or whatever was first used)? Once a measure is defined, whether based on a physical object or an existing measure, it becomes valid and exists independent of its origin.

Besides which, defining a pica as being 1/83 of 35cm simply means it's 1/83 of 35/100 of 1/10,000,000 of a quadrant of the earth. So there's your physical object underlying the pica.

And your view doesn't seem to be shared by those responsible for determining measures: in 1959, the National Bureau of Standards and the United States Coast and Geodetic Survey redefined an inch as being equal to exactly 2.54 centimeters (ie, a foot was exactly 0.3048 meters). Does this mean inches are now a subset of centimetres? And the reference posted does indicate that a pica was at one stage defined as 1/83 of 35 centimetres.

Dave Saunders

Sep 15, 2003, 11:43:19 PM
You bring back a memory of a marketing manager I used to work with who was comparing our computer with theirs. He'd done a series of benchmarks and then normalized our results to 100 to show the relative differences in the various tests.

Then he was stupid enough to say to a customer: "Look how consistent our computer is compared to theirs!"

Dave

George Bilalis

Sep 16, 2003, 2:39:41 AM
With due respect to whatever has been said here, the fact is that an A4 page size (210mm x 297mm) is still defined as 49p7.276 x 70p1.89 (an approximation).

That's what measurement is all about; and then, who would think of measuring speed in furlongs per fortnight, even though a furlong is a measurement of distance and a fortnight a measurement of time?

regards
George

Dominic Hurley

Sep 16, 2003, 7:06:54 AM
> An A4 page size (210mm X 297mm) is still defined as 49p7,276 X 70p1,89.

True, George, but I tend to think of it as 595.2756pt x 841.89pt! Though, when I used FrameMaker more, I remembered it as 49.606pc x 70.157pc (FM doesn't do picas and points - only one or t'other).

And, Dave, I didn't follow your post at all except to pick up that you were calling me stupid. Cheers, Dave.

M Blackburn

Sep 16, 2003, 11:29:55 AM
Dominic,
Methinks you're missing the point. The inch IS what it is based on; a conversion is a conversion and nothing but. An inch would be 2.54 cm long even if the metric system didn't exist (in fact it did) or if there were no other system at all to convert the value to.

George,
I think you're missing the point a bit too. We aren't talking about measuring length in pounds or kilograms, we are confining the discussion to comparable measures.

Dave Saunders

Sep 16, 2003, 11:37:26 AM
Dominic,

Not at all. It was the marketing manager who was stupidly unable to relate different kinds of metrics to each other, even ones that he had himself created.

I absolutely was NOT calling you stupid. The thought never entered my head that anyone could imagine that's what I meant. I was simply elaborating on the theme that coming to grips with these issues and the relationships among units is something that can easily confuse people.

Sorry that you took that meaning. It was a completely unintentional implication.

Dave

Dave Saunders

Sep 16, 2003, 11:39:18 AM
PS. Unless you've changed your name and career. I do remember the name of the marketing manager in question. <g>

Ben Dupre

Sep 16, 2003, 12:18:14 PM
I think what is missing is type's indoctrination into the metric system. Paper was normalized at some point (at least outside the US) to an even number of millimeters, but type sizes remain in the old system, which is based on inches (English or French).

It makes more sense to me to measure distance in terms of the size of the planet I live on than in terms of barley grains or a dead man's foot. And it makes more sense to measure type in terms of the size of the paper it's printed on. Therefore the adoption of ISO standardized paper was a mistake. Or at least it was incomplete without redefining type sizes to some even division of the paper size.

Furthermore, the single most important feature of the metric system is not that the base units are scientific and well thought out. It is that the entire system is DECIMAL, just like our Arabic numbering system.

But then, how arbitrary is it to base a numbering system on the number of appendages attached to your hands? Our computer programming friends would probably be happier if the world counted in hexadecimal, octal, or (God help us) binary.

I'm getting dizzy from arguing in circles.

Ben

M Blackburn

Sep 16, 2003, 12:44:47 PM
Ben,
Though we weren't discussing the relative merits of different systems, my vote would be for metric over imperial. However, as to the metric system being well thought out, as I said, that didn't work out entirely as planned. If I remember the story properly, metric length was supposed to relate to the circumference of the earth but they got the initial measure wrong and were left with the dilemma of either fixing it or fudging it. They chose not to publicize the problem and eat fudge.

Marilyn Langfeld

Sep 16, 2003, 1:40:06 PM
Hi guys, great discussion. One point (she laughs) I can make is that, since the authors of PostScript were American and not typographers, they redefined picas to match their point of reference. And point of reference is, after all, what we're discussing here.

I've worked in both systems and use whichever one my clients use (depending on their location) per project. But either way, I define my column width in relation to the page size (inches or cm), but type size, leading and column depth in points and picas. I can see the reason to use only one measurement system in scripting. While I'd choose metric over inches for most measurements (even though I'm American), points make the most sense to me for type and leading.

George Bilalis

Sep 16, 2003, 1:47:27 PM
Can anyone please enlighten me what was the exact reason for defining:
1. a Pica of 12 points,
2. a Cicero of 12 larger points and
3. a Didot of 12 points in between,
all for typographers to use around the world?

George

Ben Dupre

Sep 16, 2003, 3:33:00 PM
The answer is further up the thread. It has to do with which inch you used: each of these units is equal to 1/6 of its respective inch. The preference for dodecimal numbering I don't understand.

The preference for a line of type to be about 1/6 of an inch seems to be pretty much universal. The difference in size between picas, ciceros, and didots is small.

Ben

Jay Chevako

Sep 16, 2003, 4:50:45 PM
Ben Dupre wrote:

> The preference for dodecimal numbering I don't understand.
>

Easily divisible by 2, 3 and 4.
Jay
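Jay's observation is the standard argument for base 12: it has more small divisors than 10, so halves, thirds and quarters of a pica all come out as whole points. A one-liner to check (the helper name is mine):

```python
def divisors(n: int) -> list[int]:
    """All whole-number divisors of n."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(12))  # [1, 2, 3, 4, 6, 12] -> clean halves, thirds, quarters
print(divisors(10))  # [1, 2, 5, 10]       -> no whole thirds or quarters
```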

Dominic Hurley

Sep 16, 2003, 7:10:18 PM
> Methinks you're missing the point.

The discussion has certainly got convoluted, and I'm not at all sure that we don't basically agree but are using different terms and thus talking past each other. So, I'll go back to basics and see if we can't come to some agreement.

I originally posted that I had understood that "the original definition of the English-American pica was that there were 83 picas to 35 centimetres". This was not a conversion as I understood it - I have never been referring to conversions.

But this may be the problem. We may be talking about different things when we say "conversion". By "conversion", I mean the expression of a pre-existing measure in terms of its relationship to another measure - that is to say, if the point already had a set length that just happened to equal 1/12 x 1/83 x 35cm, then saying it was that ratio would be a conversion. But if the point had a different length, not equal to that ratio, or if the point had not in fact previously existed at all, then to say it equalled 1/996 x 35cm would be a definition, not a conversion.

For example, to say that 2 inches = 5.08cm is a conversion because inches and centimetres already exist and have set lengths; however, if I create a new unit called an NU ("New Unit") and define it to be equal to 3.24cm, then that is a definition. (I then create a rod of this length out of some dimensionally stable alloy to use as my standard and then create rulers and measuring devices based on it and propagate my "New Unit". I then promptly lose all my money and become at most a footnote to measuring history.)

Based upon my understanding of "conversion", what ATF did was not express a conversion ratio for the point and inches but define a new point (different from the previous Hawks point) and this new point was defined as being 1/996 x 35cm.

An inch would be 2.54 cm long even if the metric system didn't exist (in fact it did) or if there were no other system at all to convert the value to.

I agree that a new measure based solely on an existing measure (like my "New Unit") would be useless if it only ever existed in terms of the original definition (in my case, 1 NU = 3.24cm). But, once I had created my NU rod standard, created my scales based on it, and circulated them, then the entire metric system could happily vanish and my NU would still exist too. All that matters is that you have the money or the clout to get your unit accepted, which ATF obviously did. In the same way, the foot would have been useless as a measure once the king whose foot was used had died and his foot decomposed except that he was a king with authority, not a peasant. What matters is not what we use to define a new measure, but whether we have the aforementioned money or clout to get that measure adopted. (And I have assumed that we are talking about such systems that have indeed achieved wide acceptance.)

And this is where it gets interesting - I don't think any measurement system is independent, which seems to be your original objection ("that would render the new measure part of the original system, unable to be defined on its own"). How can a metre be defined on its own?

When I entered into this discussion, I had thought that it was probably possible to create a measure based purely on objective, theoretical data. But after considering it, I don't see how that can be done. The metre was originally defined by reference to the earth, but I believe for a time at least the rod kept in France became the official "metre". But then this surely involves the rod being measured at a particular temperature and under who knows what other conditions. I'd be very interested to hear of a self-contained measure.

So, I do accept that underneath the point there must be a physical object somewhere. But a point-long object didn't have to have existed before the point was defined; the point system simply co-opted the physical standard used for the metre and redefined it.

I'm also interested in how you reconcile your two statements that:

"The original pica WAS exactly a sixth of an inch." (Post 27)

and:

"I really doubt any measure would ever be based on a conversion from an existing measurement system." (Post 32)

To me they seem contradictory.

Whatever, it still seems to be the case that the Fournier and PostScript points were defined as being exactly 1/72 of an inch, the Didot point was defined as being 1/72 of a French inch, and the ATF point was defined as being 1/996 x 35cm. But the pica/point system now exists independently of the Imperial and metric systems.

George Bilalis

Sep 17, 2003, 3:41:28 AM
Ben et al.,

thanks for detailed posts.
My question then, Ben, goes one step further: if each unit was based on a different definition of an "inch", then why was it necessary for each one to define a different "inch" (a bigger, a smaller and one in between), if not just to be different? There goes the logic of a "universal" measuring system!

And Dominic, we still have to convert between units: we convert the "US$" to the "€uro", the "foot" or "mile" to "meter" and "km", the "ounce" to "gram", etc.
Most conversions are approximate, rounding up or down, someone losing something, someone gaining something.

And it's not counting in duodecimal or hexadecimal (instead of decimal) that is the problem. It was the logic of 20 shillings to the old British pound, 12 pence to the shilling, and 4 farthings to the penny that was an irrational way of subdividing a unit of measure. (Or was that on purpose?)

thanks
George

Marilyn Langfeld

Sep 17, 2003, 9:11:34 AM
George,

You might be interested in this article:

<http://desktoppub.about.com/gi/dynamic/offsite.htm?site=http%3A%2F%2Fwww.oberonplace.com%2Fdtp%2Ffonts%2Fpoint.htm>

which explores your question somewhat. Here's a little of what is said:

Two most widely used point units are ATA points (also known as Anglo-Saxon
point) and Didot points. Anglo-Saxon point which is about 1/72.272" has
been used on the island of the United Kingdom and on the American continent.
The second point variant is the Didot point which is used in Europe. This
point unit is named after the French printer François Ambroise Didot (1730
- 1804) who defined the "point-based" typographical measurement system
now bearing his name.

Both systems define another unit of measurement of length equal to 12
respective points. It's the pica in the Anglo-Saxon system and the Cicero
in the Didot system:

1 pica (ATA) = 4.2175176 mm = 12 points (ATA)
1 Cicero = 4.531 mm = 12 points (Didot)
1 pica (TeX) = 4.217517642 mm = 12 points (TeX)
1 pica (Postscript) = 4.233333333 mm = 12 points (Postscript)


So, the Cicero equals 12 Didot points. And, as per usual, the Anglo Saxon and French created competing systems, with the French system being created first.

He doesn't address the question of why the pica and cicero were each defined as 12 respective points, but he does say there's a movement underway to propagate a fully metric system, this time developed by the Germans.

Since the Didot was defined as 1/72 of the French inch, I would guess that since 6x12=72, making a 12 point unit seemed logical. But why was the Didot defined as 1/72 of the French inch? Great question.
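The millimetre figures in the quoted article follow directly from the respective points-per-inch values. A small Python check, assuming 72.27 points per inch for the ATA/traditional pica and exactly 72 for PostScript:

```python
MM_PER_INCH = 25.4

# Millimetres per pica under two of the conventions quoted above.
pica_postscript_mm = 12 * MM_PER_INCH / 72.0    # exactly 1/6 inch
pica_ata_mm = 12 * MM_PER_INCH / 72.27          # traditional printer's pica

print(round(pica_postscript_mm, 4))  # 4.2333 mm, matching the quote
print(round(pica_ata_mm, 4))         # 4.2175 mm, matching the quote
```

Both results land on the values in the article's list, so the list is internally consistent with the 72 / 72.27 distinction discussed earlier in the thread.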

George Bilalis

Sep 17, 2003, 10:03:06 AM
Thanks Marilyn,

You described precisely the answer I was always seeking, while putting that question to my students.
One thing is what we use in our daily routines.
Another is why this is what we have to use.

You see, newcomers always tend to think in what is obvious to them, and for my students-employees that's metric (cm and mm). They don't think much about using picas, and I always had a hard time convincing them that it's the natural choice when we talk about type (where type height and leading are universally defined in pica points, whatever the page size is defined in).
To complicate things further, we can still buy metal measuring rulers in Greece (called "stigmometra", literally 'point-meters') made specially for typographic use; they are made in Germany and carry a Didot scale only. (Pica scales are very rarely seen around here.) In this case you can understand the confusion of an apprentice as to what is being measured, and the natural next choice of sticking to metric scales.

Of course we can measure the width of a column, or the page size, in Didot or Cicero points, but we still have to measure in picas when it comes to type height. Q.e.d.

Thanks again Marilyn
George

Marilyn Langfeld

Sep 17, 2003, 10:24:16 AM
Here's another good reference I've just found, that comes just a little closer to the answer you are seeking:

<http://www.sizes.com/tools/type.htm>

Quoting:

Type sizes were originally named; catalogs with such names appeared as
early as 1592.

Some of the names came from the type of book typically produced in that
size. “Cicero” was a size used to print editions of classical authors;
“Primer” was used to print religious books ordered by Henry VIII.

Another class of names were boasts of the type's beauty, such as “Paragon”
and “Nonpareil.”

The problem with using names was that there was no clear relationship
between the names, and they were not related to well-defined linear standards
like the inch. In the nature of printing by letterpress, to get the type
on the chase to lock up properly it all has to fit together.

Another disadvantage was that sometimes a name used to describe a body
size was also used as the name of a typeface. “English,” for example,
meant a typeface in the style called blackface as well as approximately
14 point type.

The French addressed the problem in 1723 with a royal order that the sizes
of type be fixed. In 1735 Pierre Simon Fournier proposed a system of points,
publishing a first version in 1737 and the final version in 1764. In the 1737
version there were 72 points to the pouce (1 pouce was then about
2.707 centimeters or 1.066 inches). By 1764, however, Fournier had dropped
the pouce and instead defined his point by a (badly) printed scale. It
seems likely, as Theodore de Vinne speculates in The Practice of Typography
(footnote, page 141), that in the intervening years Fournier had adjusted
his point so that it would fit existing sizes of type as well as possible.
In Fournier's system the cicéro (which plays the same role on the continent
as the pica does in English typesetting) is 12 pts. As it was used in
France in the 19th century, one of Fournier's points was approximately
0.35 millimeters.

The biggest failing of Fournier's system was that it was not related to
any other system of units. To remedy that, around 1785 François-Ambroise
Didot, a well-known Parisian typefounder, established a new system which
really returned to Fournier's 1737 definition of 72 points to the pouce,
so one Didot point was approximately 0.376 millimeters. Didot abolished
all names, replacing them with numerical sizes. In doing so he was forced
to make the basic size, cicéro, 11 points–which may have been what Fournier
was trying to avoid. Twelve is a very convenient basic size, since it
is easily divided into halves, thirds, and quarters that can be built
up from 1-point pieces. Even half of eleven is a special, non-integer
size. Nevertheless, after many decades in which both Fournier's and Didot's
systems were used side-by-side, Didot's prevailed, and is currently in
use in Europe (except Belgium) and certain other countries.


So, if Didot was trying to reconcile Fournier's system, and come up with divisible units that worked with type sizes already in existence, but named rather than defined as a mathematical progression, then perhaps 72 might have been a logical choice, as was already noted in this thread.

M Blackburn

unread,
Sep 17, 2003, 10:20:24 AM9/17/03
to

I'm also interested in how you reconcile your two statements


Dominic,
Hmmm. To avoid the contradiction I suppose I have to conclude that the pica is a subset of an established measurement system. It may have lost that status when the pica started to shrink and had to be defined on its own terms, but with the advent of the new postscript pica we're kinda back to the subset again. The physical entity of the typesetter's pica, be it a slug or mold, is no longer a valid standard. The pica is a sixth of an inch again.

You probably don't find my rationalization very satisfying, but consider this mind experiment. If for some strange reason, the inch happened to grow, what effect would that have on the pica? Would it have any effect on the centimetre?

George Bilalis

unread,
Sep 17, 2003, 12:47:50 PM9/17/03
to
Excellent again Marilyn,

Yes this is as close as it can go. And then:

Nevertheless, after many decades in which both Fournier's and Didot's
systems were used side-by-side, Didot's prevailed, and is currently in
use in Europe (except Belgium) and certain other countries.


What's bugging me is this 'except some' for a measuring system.
There is a centuries old church in Copenhagen with the left (or is it the right?) side longer than the other. The story is that the building was started by the Germans and finished by the Danes at a later time, who were following the original architectural plans using a different scale. <LOL>

Thanks
George

P.S. By the way, ID defines 0p1 as 0,3528 mm and 0c1 as 0,3759 mm, and doesn't mention Didot units (same as Quark)
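For what it's worth, the 0p1 value quoted above can be reproduced from the definition of the PostScript point (a sketch; the 0,3759 mm figure for 0c1 matches the classical Didot point):

```python
# Checking the InDesign values quoted above (ID displays comma decimal
# separators in some locales; dots are used here).
ps_point_mm = 25.4 / 72        # PostScript point: 1/72 of a 25.4 mm inch
didot_point_mm = 27.07 / 72    # Didot point from the ~27.07 mm French pouce

print(round(ps_point_mm, 4))      # 0.3528, matching ID's 0p1
print(round(didot_point_mm, 4))   # 0.376, close to ID's 0.3759 for 0c1
```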

Dominic Hurley

unread,
Sep 17, 2003, 7:26:56 PM9/17/03
to
If for some strange reason, the inch happened to grow, what effect would that have on the pica? Would it have any effect on the centimetre?

No, and this is where I would disagree that picas/points are a subset of an existing measurement system. I would argue that, for a measure to be accepted as a valid system, it has to gain a measure of independence from other systems, even though it may originally have been defined in terms of those other systems. At one stage, this would have meant the creation of a dimensionally stable physical rod to be stored in controlled conditions. So, if the inch standard was changed, picas and points wouldn't change, unless they were then redefined to again be 1/6 and 1/72 of the new inch and a new standard pica rod created.

But, in practice, I don't know whether ATF or anyone else did in fact create an actual pica standard rod or whether they just piggybacked off the metre rod or inch rod or whatever did exist. This would seem to be more practical. Instead of cutting a new rod, you just say that the pica is 35/8300 of the official metre rod. Here is a description of that rod:

"In 1927, the meter was more precisely defined as the distance, at 0°, between the axes of the two central lines marked on the bar of platinum-iridium kept at the Bureau International des poids
et Mesures, and declared Prototype of the meter by the 1st Conference Generale des Poids et Mesures, this bar being subject to standard atmospheric pressure and supported on two cylinders of at least one centimeter diameter, symmetrically placed in the same horizontal plane at a distance of 571 mm from each other."

But I don't think this makes picas/points a subset of inches or centimetres. If we accept that there is an unchanging physical standard to any measurement system (or as close as we can get to it), like a fraction of a quadrant of the earth, then any measure initially defined as being a set number of units of another measure can also be traced back to that unchanging constant (eg, one metre = 1/10,000,000 of a quadrant of the earth, and one pica = 35/83,000,000,000 of a quadrant of the earth). So, in that way, I would agree that all measurement systems are a subset of the earth (if that's your base), but not necessarily of each other.

Now the metre is defined as the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second; similarly, an ATF pica is 35/8300 of that length, or the length light travels in 35/2,488,277,401,400 of a second.
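Those fractions check out (a quick sketch, taking the modern exact value of the speed of light):

```python
# Verifying the fraction above. Since the metre is the distance light
# travels in 1/299,792,458 s, a 35/8300 m pica corresponds to the distance
# travelled in 35/(8300 * 299,792,458) of a second.
C = 299_792_458                 # speed of light in m/s (exact, by definition)

denominator = 8300 * C
print(denominator)              # 2488277401400, as stated in the post
```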

So, in summary, to gain that independence for your measurement system, you either create a physical reference after you've defined the measure, or you track your measure back to the physical reference underlying the measure you defined your measure against. (Or you create the measure originally from a physical object.)

I'm interested by your statement that "when the pica started to shrink and had to be defined on its own terms". What do you mean by "defined on its own terms"?

And, briefly:

And Dominic, we still have to convert between units.

True, and I have never disputed that. But M Blackburn and I have been discussing whether the ratio I quoted was a definition or a conversion.

Ben Dupre

unread,
Sep 18, 2003, 10:26:22 AM9/18/03
to
If... the inch happened to grow...

All of the type already cast and the moulds it was cast from would not change. Therefore the conversion standards would have to.

Inasmuch as there are existing physical references (scales, etc.), the measurement system stands on its own.

Ben

M Blackburn

unread,
Sep 18, 2003, 11:25:58 AM9/18/03
to
Dominic,
Whew! This is a long and convoluted story isn't it? Hopefully this will be my last word on the subject. I do not believe that the pica/point system is truly a subset of the inch; there is too much history and too many variables and previous standards. Still, the modern postscript pica seems to have aligned itself with the inch.

the original definition of the English-American pica was that there were
83 picas to 35 centimetres


The origins of the pica precede the metric system by over 200 years. There was confusion caused by small differences between principal advocates. Your example suggests to me that the pica was measured in centimetres in an attempt to nail the definition to something solid. But what is 4.216867469879518… mm? Nothing. It's what the specific pica that was chosen as the standard looks like when measured in millimetres. This I can see as nothing but a conversion; it's completely different from deciding that whatever distance light travels in a given amount of time will be a metre.

M Blackburn

unread,
Sep 18, 2003, 12:13:32 PM9/18/03
to

All of the type already cast and the moulds it was cast from would not
change


Ben,
I don't think you're treating the experiment fairly. If the inch grew (think supernaturally here) the iron bar that is kept in the Tower of London (or wherever) would grow too. Your pre-existing type and moulds would not be exempt from change IF the inch was used in their original construction. The platinum bar that defines the metre would not be affected, because recalculating its length from the principles used in the first place should result in the same value.

Dominic Hurley

unread,
Sep 18, 2003, 7:46:09 PM9/18/03
to
Your example suggests to me that the pica was measured in centimetres in an attempt to nail the definition to something solid. But what is 4.216867469879518… mm? Nothing. It's what the specific pica that was chosen as the standard looks like when measured in millimetres.

No - if we take the earlier reference posted as being an accurate summation of the history of the pica, then what occurred was not a conversion, because the pica had a different value before ATF redefined it. It's very simple - it's a conversion if and only if ATF's pica exactly equalled the previous (Hawks) pica, which it did not. The pre-existing pica (the Hawks pica) did not equal 35/83cm (4.217mm); it equalled 1/6 x inch (4.233mm). Thus, the ATF pica was a new pica, and it was defined as 35/83cm, not converted into cm. If they wanted to convert it, ATF could easily have given the conversion as 254/600cm. That they didn't shows that they redefined it, not converted it.
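To put numbers on this (a sketch, using the modern 25.4 mm inch for the 1/6-inch pica):

```python
# The ATF pica (35/83 cm) versus the older 1/6-inch pica: close, but not
# equal, which is why 35/83 cm reads as a redefinition rather than a
# conversion of the earlier pica.
atf_pica_mm = 350 / 83       # 35/83 cm, expressed in mm
sixth_inch_mm = 25.4 / 6     # 1/6 of a modern 25.4 mm inch

print(f"ATF pica:   {atf_pica_mm:.4f} mm")     # 4.2169 mm
print(f"1/6 inch:   {sixth_inch_mm:.4f} mm")   # 4.2333 mm
print(f"difference: {sixth_inch_mm - atf_pica_mm:.4f} mm")  # 0.0165 mm
```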

To go back to my example, when I defined my New Unit as equal to 3.24cm, it wasn't a conversion - the NU didn't exist before I defined it, so how could I convert it? It was a definition. Likewise, the ATF pica (which didn't exist before it was defined) was defined as 35/83cm.

If the inch grew ... the iron bar that is kept in the Tower of London (or wherever) would grow too.

I don't follow that. If whoever is now in charge of the inch specification redefined the inch (or, more likely, the yard), then the bar wouldn't grow; they'd cast a new bar or redefine their light distance equation.

If you mean that the bar itself grew (and we assume that the bar is the standard reference, not any fancy light distance or krypton wavelength equation), then, yes, any new scales constructed directly from it would also be bigger than the previous ones. And, if the pica scale had simply co-opted the same rule, then future pica scales would also be larger. But, if they had cast a new alloy ATF pica rod when they originally defined the ATF pica, then the ATF pica wouldn't change unless that standard rod also mysteriously grew.

Just because they originally defined the ATF pica in terms of centimetres, doesn't mean it will always retain that ratio if the metre scale changes. ATF defined an absolute length in terms of centimetres, not a ratio. I don't think they were saying that an ATF pica would be 35/83 of a notional measure called the metre; they were saying that it was 35/83 of the absolute measure called the metre. That the metre may subsequently be redefined does not change the absolute length it used to have (and thus the absolute length the pica has). Even if they did intend to define a ratio rather than an absolute length, it doesn't change my argument that you can define a measure in terms of the absolute length of an another measure. And, when you think about it, it is entirely logical to do so - if you create a new measure, it will be handy if it does have a simple fractional value of an existing common measure, so that conversions are simple.

The platinum bar that defines the metre would not be affected because recalculating its length from the principles used in the first place should result in the same value.

You're assuming that the yard and the metre are defined in different ways - the former from a metal bar and the latter from scientific principles. Your argument falls apart if both use the same objective basis, which I imagine they now do. Both the yard and the metre have at times been defined as the length of a physical bar, but now the metre is defined in terms of light speed and so, I imagine, is the yard. The whole point of dimensional measurement systems is to define an absolute length, not a ratio, and so as technology advances, they'll convert (or redefine) the measures in terms of the most dimensionally stable object they can measure (once this was a platinum-iridium bar, now it's light speed). This doesn't make the varying measurement systems subsets of each other, but they do all relate to the stability of that universal standard reference. And I don't see any way out of that. The only purely mathematical measure I could think of was degrees, but, as far as I can see, measures of length will always have to relate to an external objective object.

By the bye, I actually have on my desk in front of me an old ATF metal typescale with scales for agates, inches, picas, and 6-pt, 8-pt, and 10-pt ems on it. But it's of little practical use now, because the PostScript point was defined differently from the ATF point. If I hold a DTP ruler against my ATF ruler, it very simply demonstrates that the definition of the PS point (1/72 of an inch) was not a conversion of the ATF point, and the same is true of the ATF point and the Hawks point.
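The visible mismatch between the two rulers is easy to estimate (a sketch; the ATF point here is simply taken as 1/12 of the 35/83 cm ATF pica):

```python
# How far apart an ATF typescale and a PostScript (DTP) ruler drift
# over a typical scale length.
atf_point_mm = (350 / 83) / 12   # ATF point: 1/12 of a 35/83 cm pica
ps_point_mm = 25.4 / 72          # PostScript point: 1/72 of a 25.4 mm inch

scale_picas = 60                 # ten inches' worth of PostScript picas
gap_mm = scale_picas * 12 * (ps_point_mm - atf_point_mm)
print(f"over {scale_picas} picas the rulers diverge by {gap_mm:.2f} mm")  # ~0.99 mm
```

About a millimetre over a ten-inch scale is easily visible when the rulers are held side by side.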

M Blackburn

unread,
Sep 19, 2003, 2:23:15 PM9/19/03
to

was defined as 35/83cm, not converted into cm


Never said that, Dominic. To me, conversion means expressing a specific measure in terms of units of a different measuring system.

I don't follow that. …then the bar wouldn't grow;


It's a mind experiment meant to explore relationships, it's conceptual, not restricted to strict reality.

You're assuming that the yard and the metre are defined in different ways


Actually they are. Imperial measures are based on real world items. Metric distance (supposedly) is based on the concept of a universal constant. But that's not relevant except to my mind experiment.

the definition of the PS point (1/72 of an inch) was not a conversion
of the ATF point


Okay, but I never suggested that either.

These arguments are getting as confusing as the history of pica.

Dominic Hurley

unread,
Sep 20, 2003, 10:12:30 AM9/20/03
to
What I'm confused by is your current position. When you say that you "never said that" are you saying that you never said that the ATF pica (what I originally referred to as the English-American pica) was not defined in terms of centimetres? Or that you never said that the 35/83cm definition was a conversion, not a definition? It would help to get that clear, because having reread your posts it seems clear to me that you said both these things, so maybe it's something else that you never said.

I do, however, agree that you never suggested that the definition of the PS point was not a conversion of the ATF point but, in turn, I never suggested that you had. I gave it as an example both of the difference between conversion and definition and of a measure (the PS point) being based on a "a value from another system".

To me, conversion means expressing a specific measure in terms of units of a different measuring system.

You'd better clarify this too. Are you saying that, to you, definition equals conversion or do you agree that you first need to define a measure before you can convert it (ie, definition and conversion are not the same)?

Yes, I did get that your growing inch was a "mind experiment", but to my mind you didn't give enough information for it to be valid - I wasn't clear what you meant and what rules applied in your alternative world, so I couldn't be sure I was arguing the same thing as you.

Actually [the metre and the yard] are [defined in different ways]. Imperial measures are based on real world items. Metric distance (supposedly) is based on the concept of a universal constant.

No, they're not defined in different ways. They're both defined in terms of supposed constants. Even if you subscribe to the theory that the yard was originally defined as the distance between Henry the First's nose and his thumb, that was still intended to be a constant (there weren't supposed to be two distances between his thumb and his nose). And if that's a real-world item, why is the world less real than that? The metre was defined as a fraction of the circumference of the earth, and the world had been around long before Henry the First was (and is still here long after he died).

Furthermore, physical rods have served as the standard for both yards and metres. In 1819, the "First Report of the Commissioners Appointed to Consider Weights and Measures" gave the following information for the replacement of the British standard yard (which apparently kept shrinking), should it "be lost or impaired":

"the length of a pendulum vibrating seconds of mean solar time in London, on the level of the sea, and in a vacuum, is 39.1372 inches of this scale; and that the length of the metre employed in France, as the ten millionth part of the quadrantal arc of the meridian, has been found equal to 39.3694 inches."

Just 20 years prior, the metre was defined as one ten-millionth of the quadrant of the earth.

So, I don't see your basis for saying that Imperial and metric measurements are defined in different ways. And, if we take the situation existing today, the "international inch" was defined by the US and the British Commonwealth in 1958/59 to be exactly 25.4 mm (ie, it was defined in terms of the metric system of units). (Prior to that, in 1893, the US inch was defined as 100/3937 of a metre.)
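Those two definitions of the US inch are almost, but not quite, identical (a quick sketch):

```python
# Comparing the 1893 US inch (100/3937 of a metre) with the 1958/59
# international inch (exactly 25.4 mm).
inch_1893_mm = (100 / 3937) * 1000   # metres to millimetres
inch_intl_mm = 25.4

print(f"1893 US inch:       {inch_1893_mm:.7f} mm")   # 25.4000508 mm
print(f"international inch: {inch_intl_mm:.7f} mm")   # 25.4000000 mm
print(f"difference: about {(inch_1893_mm - inch_intl_mm) * 1_000_000:.0f} nm per inch")
```

The 1959 redefinition shaved about fifty nanometres off each inch, which is why the older definition survives only in specialised uses like the US survey foot.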

Yet another example of a measure based on an existing measurement.

M Blackburn

unread,
Sep 22, 2003, 11:24:56 AM9/22/03
to

(ie, it was defined in terms of the metric system of units).


Dominic,
We obviously disagree on some fundamental concepts.

I consider your examples of "definition" to be conversions. These are attempts to keep a formal connection between two systems, they aren't related to the genesis of either. That the inch can be defined as 25.4 mm doesn't say anything more of an inch than it does of the millimetre which can be expressed in terms of inches. It's a Catch 22.

And I stand by my opinion that there is a difference between the imperial and metric systems. It's conceptual, but if all physical standards of length were lost, theoretically metric values could be re-established.

Dominic Hurley

unread,
Sep 22, 2003, 7:32:05 PM9/22/03
to
We obviously disagree on some fundamental concepts.

That's why I asked you to define your understandings of "conversion" and "definition" and whether definition equals conversion. If you would do that, then we could easily ascertain exactly what we disagree on. For the record, I have been using the following dictionary (COED, 8th ed) definitions:

"'Define': give the exact meaning of, mark out the limits of" with the added understanding that, in terms of length measurements, a definition of a unit will specify the absolute length of that unit in terms of a (supposed) constant.

"'Convert': change (money, stocks, units in which a quantity is expressed, etc) into others of a different kind" and I take it as read that the units you're converting (and those you're converting into) must have a pre-existing absolute value before they can be converted (ie, they must be defined before they can be converted).

And, (to borrow legal phrasing), "definition" and "conversion" have corresponding meanings. So, you can convert inches into centimetres (or yards or what have you), only if there exists a definition for inches that sets out the inch's absolute length.

If you agree with these definitions, then, to maintain your position, you have to demonstrate that the ATF pica existed before it was defined as 35/83cm and that it had a value that exactly equalled 35/83cm. Even if the ATF pica existed prior to its definition as 35/83cm, if it did not exactly equal 35/83cm, then to say that it does equal that length is no longer a conversion. It's a redefinition. So, can you demonstrate that the ATF pica did exist with this exact length prior to this definition?

You also have to produce the current official definition of the yard/foot/inch that defines those units not in terms of metres. Can you do that?

These are attempts to keep a formal connection between two systems [Imperial and metric], they aren't related to the genesis of either.

No, they're not related to the original genesis, but they are the current definitions. That is, Imperial units have been redefined and their original genesis is irrelevant.

And you yourself agreed with this:

"... there is an iron rod sequestered away in London that is the defining standard for the foot. (The fact that, in the case of the foot, the original measure was based on something less definitive is not really the point. That rod was cast to represent the length of the barley grains or king's foot, whatever, and to provide an enduring standard that can physically be compared to copies.) [Emphasis added.]"

Here you explicitly accept that the genesis of a measurement system may no longer be relevant to its definition - that is, it can be redefined.

That the inch can be defined as 25.4 mm doesn't say anything more of an inch than it does of the millimetre which can be expressed in terms of inches.

Wrong. The inch is currently defined in terms of metres, and the metre is defined in terms of the distance covered by light in a vacuum over a set period of time. Both are definitions. Neither is a conversion. If I am, as you say, wrong on this, it should be very simple for you to prove that - find me the current official definition of the inch that does not express it as a set number of centimetres/millimetres/metres. I'll be happy to accept I'm wrong if you can find this information.

... there is a difference between the imperial and metric systems. It's conceptual, but if all physical standards of length were lost, theoretically metric values could be re-established.

What do you mean by "all physical standards"? I'm guessing that you think the yard is still defined in terms of a physical alloy rod kept somewhere (even though this is not the case), whereas the metre is defined in terms of the earth (or krypton or light). But all you're saying then is "if the old standard for yards vanishes then old yards are lost". You can similarly say "if the earth vanishes (or all krypton atoms, or the speed of light changes) then metres are lost". Both are valid statements and both are irrelevant to the discussion.

But to stick with your "what if" scenario in which all physical standards of length are lost, the foot/yard/inch can be re-established along with the metre, and that is presumably part of the reason why they were defined in terms of metres. Even if I were to accept that the definition is a conversion, not a definition, inches can still be re-established by applying that conversion to the re-established metre.

In any case, if you go back to the post that started all this, what you wrote was that "I really doubt any measure would ever be based on a conversion from an existing measurement system". So, even if you continue to classify the ATF 35/83cm definition as a conversion, not a definition, you must still concede that you were wrong in this.

George Bilalis

unread,
Sep 23, 2003, 4:10:40 AM9/23/03
to
Well, gentlemen,

I have been reading this elaborate thread, which keeps getting more so, with great interest. But however detailed, it still doesn't answer why we don't express paper sizes in Angstroms (10^-10 m), though that is the valid unit for measuring light wavelength, or why we use light-years to measure galactic dimensions.

Besides scientific research a unit of measure must be:
1. practical,
2. well defined (traceable to some standard),
3. internationally understood and accepted.

This I believe was the original question that started this thread (conversions or no conversions), and all the fuss about the old 'imperial' system being primitive was that it was not 'well defined' (and also not internationally accepted).

As it stands today (after rationalization of the 'imperial' system) it's used alongside the metric. It's only a matter of choice and personal taste (if not of legacy) which units a typographer uses.
Just my two pennies.

George

M Blackburn

unread,
Sep 23, 2003, 1:58:40 PM9/23/03
to
Dominic,
I'm guessing you received an e-mail from a disinterested observer outside the forum. His message has convinced me that the inch is now defined in metric terms (which in my mind relegates imperial measure to a subset of the metric system, since it can no longer be independently assessed). Here's a link supplied by Mr. Nygaard for those who may be interested. <http://www.ngs.noaa.gov/PUBS_LIB/FedRegister/FRdoc59-5442.pdf>

In regard to your questions about the definition of the pica and its various incarnations, I find the stories supplied by the links here somewhat confusing, to the point of seeming inconsistent at times. My original input was intended to relay the information that physical wear affected the historical measure of the pica – that has been established here – and that PostScript has redefined the pica to agree with the original intention, that being six to the inch. That's not controversial, is it? Unfortunately I got sucked in over my head and found myself dealing with details and definitions.

P.S. You could have chosen: "conversion (b) the changing of units, measurements, etc. from one system or expression to another." The idea being an expression of equivalents rather than the changing of one thing into another, as in the case of money – which in the real world hardly pays lip service to the idea of equivalence.

Dominic Hurley

unread,
Sep 24, 2003, 12:59:15 AM9/24/03
to
Yes, I did receive the e-mail and I thanked Mr Nygaard for his reference.

So, what's left? I disagree with the term "subset" in the context of metrics and Imperial measurements, but I don't see any point in arguing that.

I tried to follow the legislative trail here in New Zealand of the Imperial and metric measures, though I was hampered by not having access to the UK Imperial Act, but it is true that, for many of the earlier NZ Acts (ie, pre-1959), a table of equivalents was given, not a definition (as per the US Metric Act 1866 -
<http://lamar.colostate.edu/~hillger/laws/metric-act.html> ). These I agree, were conversions.

I also found some more information on the American-English pica, though it differs from that which I had previously found. Walter Tracy, in his Letters of Credit, says that the US Type Founders Association (not the ATF) decided to institute a standard pica, and it chose the pica used by the McKellar, Smiths, and Jordan foundry of Philadelphia. (This was one of the largest foundries around, and many typesetters used their type and foundries their pica, so it was thought that adopting their pica would mean the least change for the most typesetters and foundries.) However, Tracy is not clear on whether MS&J had defined their pica as 35/83cm, saying only that it was exactly equal to this. So, maybe it was a conversion, though in the absence of evidence to the contrary, I think it more likely that was the definition MS&J used when casting their type. Tracy does also say that this ratio caused some consternation among some members of the association, because they wanted a system that had a simple relation to an existing measurement system. This story is also given at <http://www.wikipedia.org/wiki/Typographic_unit> .

As I said above, this is far from being the definitive word and, for example, the account at <http://www.sizes.com/tools/type.htm> differs notably from this. I'm trying to get hold of Richard Hopkins' Origin of the American Point System for Printers' Type Measurement to see what he says.

My original input was intended to relay the information that physical wear affected the historical measure of the pica ... That's not controversial is it?

No, and it's clear that other measures have suffered this problem when they have used physical rod standard. It was when you wrote that the 35/83cm definition was a conversion and that you doubted a measure would ever be based on another measurement system that I got curious, because these statements were far from obvious to me.

As for Angstroms and paper sizes and the like, I must admit I can't really comment, George. You have been carrying on a separate discussion from M Blackburn and myself, and I don't see that they cross at any point.

George Bilalis

unread,
Sep 24, 2003, 4:34:20 AM9/24/03
to
The complete story now proves one thing to me:
Everybody's belief through time was that he had a better idea of how to redefine, or define anew, the same unit of measure.

This is enough proof to me that this certain "unit" doesn't comply with 2 out of 3 requirements, i.e.

1. the "unit" so defined was not internationally accepted
2. the "unit" so defined was not 'well defined' (traceable to some universal reference)

This 'imperial' way of thinking is absurdum ad continuum IMHO

regards
George

Gustavo Sanchez

unread,
Sep 24, 2003, 4:31:42 AM9/24/03
to
There is one thing that makes picas and ciceros a perfect tool for human calculations, and that is their being duodecimal (ie, based on the number twelve). The first time typographic measures were explained to me, I thought "These people are crazy", until I had to start making mental calculations myself to work with columns, gutters and that stuff (with paper, ruler and pencil).

If you use a decimal unit (that is, a unit divisible into 10 subunits), you'll get 'bad partitions' (fractions) quite fast: half a centimetre is 0.5; a third of a centimetre is 0.333 (repeating); a quarter is 0.25... That is avoided with a base-12 unit: half a pica is 0p6, a third of a pica is 0p4, a quarter of a pica is 0p3... That is because 12 is a very easily divisible number. Much friendlier than 10.

My 2c about why a pica/cicero is (was) so helpful.
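The point can be shown directly (a small sketch):

```python
# 12 has more divisors than 10, so common fractions of a pica fall on
# whole points, while fractions of a decimal unit usually don't.
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))   # [1, 2, 5, 10]
print(divisors(12))   # [1, 2, 3, 4, 6, 12]

# Halves, thirds, quarters, and sixths of a pica are all whole points:
for frac in (2, 3, 4, 6):
    print(f"1/{frac} pica = {12 // frac} points")
```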

M Blackburn

unread,
Sep 24, 2003, 2:01:21 PM9/24/03
to
Dominic,
New Zealand, eh. I thought maybe COED meant Canadian Oxford.

Okay, so "subset" might not be the best word, but if there are no longer independent standards for the imperial system, as Mr. Nygaard says, then it makes no sense to me to consider imperial measurement an independent system (except in common usage).

As for my doubt: both systems were originally invented independently, the current situation developed through a significant amount of singular circumstances, and I still doubt that this approach would be used if the goal wasn't to establish a commonality between two existing systems.

Scott McCullough

unread,
Sep 24, 2003, 3:13:04 PM9/24/03
to
Excellent point, Gustavo (pun intended). I'd never even considered that issue, but you're absolutely correct. No other number could be so easily divisible into common units.

Scott

Dominic Hurley

unread,
Sep 24, 2003, 6:49:21 PM9/24/03
to
COED = Concise Oxford English Dictionary. There is a New Zealand edition (and a comprehensive Dictionary of New Zealand English), but in my experience the concise edition has been the standard style reference for publishers here (alongside the New Zealand Style Book).

Re "Everybody's belief though time was that he had a better idea on how to re-define, or define anew the same unit of measure."

From my reading on the subject over the last few days, it's clear to me that the history of measures (length measures at least) has been a quest for a more accurate reference (ie, a dimensionally stable and constant objective measure of length). The pendulum was once considered for both Imperial and metric units (the unit would be defined as the length of the pendulum that gave a period of X seconds). However, gravity varies over the earth, so it wasn't constant enough. Likewise, when they measured the quadrant of the earth, they failed to take into account the flattening of the earth. An alloy rod was for a long time considered the most exact reference, then radiation, now the speed of light. The basis of the SI system is that its base units are defined in an absolute way, without referring to any other units, and are based not on physical objects (such as standard metre sticks or standard kilogram bars) but on stable properties of the universe. (The one exception is the kilogram, which is still defined as being the weight of the standard kilogram kept at the International Bureau of Weights and Measures in Paris:

"This one physical standard is still used because scientists can weigh objects very accurately. Weight standards in other countries can be adjusted to the Paris standard kilogram with an accuracy of one part per hundred million. So far, no one has figured out how to define the kilogram in any other way that can be reproduced with better accuracy than this. The 21st General Conference on Weights and Measures, meeting in October 1999, passed a resolution calling on national standards laboratories to press forward with research to 'link the fundamental unit of mass to fundamental or atomic constants with a view to a future redefinition of the kilogram.' The next General Conference, in 2003, will surely return to this issue."

Source: <http://www.unc.edu/~rowlett/units/sifundam.html>

See also <http://www.bipm.fr/enus/3_SI/si.html>

And, for those who really want to grapple with the subject, check out the diagram at <http://www.bipm.fr/enus/3_SI/si_fig.html> !

So, I think it was not so much that people wanted to redefine a unit of measure but that they wanted the reference underlying it to be as accurate as possible. But this did sometimes involve redefining the absolute length of the unit.

If we subscribe to the idea of absolute length (which we must, if we are to get anything done!), it doesn't bother me that we use the same reference for two measurement systems. On the contrary, I would consider it a waste of time to try to come up with two equally accurate but independent reference constants. This is why I don't consider Imperial units to be a subset of metric units: the former are not a neat fraction of the latter, one is a decimal system, the other is a bizarre system, etc, etc. But I certainly understand the point in calling it a subset.

George Bilalis

unread,
Sep 25, 2003, 4:00:13 AM9/25/03
to
Since this has gone way off the starting point, may I add a correction:

Dominic you wrote:

(The one exception is the kilogram, which is still defined as being the
weight of the standard kilogram kept at the International Bureau of Weights
and Measures in Paris:


I have to correct you: the kilogram, according to the definition of the MKSA (SI) system, is NOT a measure of weight (a unit of force) but a measure of mass. Mass and weight are totally different things. You can have a kgr of weight, that is, the weight of a kgr of mass when a unit of earth's gravitational acceleration (under standard conditions, 9.81 m/sec^2) is acting upon it. A kgr of mass weighs more at the poles and less at the equator of the earth.
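George's distinction between mass and weight is easy to see with numbers. A quick Python sketch (the g figures are approximate textbook values, not from the thread): the same 1 kg of mass has a different weight, in newtons, depending on local gravity.

```python
# Weight is a force: F = m * g (mass times local gravitational acceleration)
mass_kg = 1.0          # one kilogram of mass, the same everywhere
g_standard = 9.80665   # standard gravity, m/s^2
g_equator = 9.780      # approximate g at the equator, m/s^2
g_poles = 9.832        # approximate g at the poles, m/s^2

for label, g in [("standard", g_standard), ("equator", g_equator), ("poles", g_poles)]:
    weight_newtons = mass_kg * g
    print(f"{label}: {weight_newtons:.3f} N")
```

The mass never changes, but the weight at the poles comes out larger than at the equator, exactly as George says.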

Very interesting development on this thread.

George

Scott McCullough

unread,
Sep 25, 2003, 8:55:20 AM9/25/03
to
Tracy:

I don't know if you've stuck with us through this whole thread, but thanks for bringing it up. It's been very enlightening, if somewhat esoteric, and has forced us to reconsider some of what we've always just accepted as "the rules."

Scott

Dominic Hurley

unread,
Sep 25, 2003, 8:52:26 PM9/25/03
to
Yes, I should have written "the kilogram is still defined as being the mass of the standard kilogram". I do know the difference between mass and weight but the error slipped through because I added the information as an aside and so didn't scrutinise that copy well enough. But everyone can rest assured that I have absolutely no desire to engage in any further discussion on weight, mass, length, or on any other SI or Imperial units!

George Bilalis

unread,
Sep 26, 2003, 4:30:49 AM9/26/03
to
Well done Dominic. Amazing thread though!
I feel we all owe this one to Tracy

Cheers
George

M Blackburn

unread,
Sep 26, 2003, 10:30:09 AM9/26/03
to
Dominic,
Oops. Should have known. I think my Canadian Oxford would be a concise. I have an old Shorter at home, which is two larger volumes, and I see that the OED is now up to 20 volumes (I seem to remember it being 12).

Tracy and everyone else:
Think I'll sign off on this one too. It has been quite a thread, eh. And to think it all started with: "Just a quick question."

Stu Bloom

unread,
Sep 26, 2003, 1:21:04 PM9/26/03
to

Can visual basic add, subtract, multiply, and divide using Picas/points?


Yup.


Sub Main()
MsgBox I2P(P2I("1p7.75") + P2I("3p8.11"))
End Sub

Public Function P2I(sPicas As String) As Single
Dim sngPicas As Single
Dim sngPoints As Single
Dim nPos As Long

If IsNumeric(sPicas) Then ' interpret as all picas
sngPicas = CSng(sPicas)
sngPoints = 0
Else
nPos = InStr(LCase(sPicas), "p")
If nPos <= 1 Then ' illegal string passed
Err.Raise 13 ' type mismatch
ElseIf nPos = Len(sPicas) Then ' p at the end of the string
If IsNumeric(Left(sPicas, nPos - 1)) Then
sngPicas = CSng(Left(sPicas, nPos - 1))
sngPoints = 0
Else
Err.Raise 13 ' type mismatch
End If
Else ' p somewhere in the middle of string
If IsNumeric(Left(sPicas, nPos - 1)) And _
IsNumeric(Mid(sPicas, nPos + 1)) Then
sngPicas = CSng(Left(sPicas, nPos - 1))
sngPoints = CSng(Mid(sPicas, nPos + 1))
Else
Err.Raise 13 ' type mismatch
End If
End If
End If
P2I = sngPicas / 6 + sngPoints / 72
End Function

Public Function I2P(sngInches As Single, Optional nDecimalPlaces As Long = 2) As String
Dim nPicas As Long
Dim sngPoints As Single

nPicas = Int(sngInches * 6)
sngPoints = Round((sngInches * 6 - nPicas) * 12, nDecimalPlaces)
If sngPoints >= 12 Then ' rounding can push points to a full pica; carry it
nPicas = nPicas + 1
sngPoints = 0
End If
I2P = nPicas & "p" & sngPoints
End Function
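For anyone scripting outside Visual Basic, the same pica arithmetic ports easily. Here is a rough Python equivalent of the two functions above (names and behaviour mirror the VB sketch; this is not any InDesign API):

```python
def picas_to_inches(s: str) -> float:
    """Parse a pica string like '1p7.75' (or plain '3' meaning 3 picas) to inches."""
    s = s.strip().lower()
    if "p" in s:
        picas_part, _, points_part = s.partition("p")
        picas = float(picas_part) if picas_part else 0.0
        points = float(points_part) if points_part else 0.0
    else:
        picas, points = float(s), 0.0
    return picas / 6 + points / 72   # 6 picas per inch, 72 points per inch

def inches_to_picas(inches: float, decimals: int = 2) -> str:
    """Format inches as a pica string, e.g. 0.6875 -> '4p1.5'."""
    total_picas = inches * 6
    picas = int(total_picas)
    points = round((total_picas - picas) * 12, decimals)
    if points >= 12:   # rounding can carry a full pica
        picas, points = picas + 1, 0.0
    return f"{picas}p{points:g}"

# Same sum as the VB example's message box: 1p7.75 + 3p8.11
print(inches_to_picas(picas_to_inches("1p7.75") + picas_to_inches("3p8.11")))  # 5p3.86
```

One deliberate simplification: where the VB version raises a type-mismatch error on malformed strings, this sketch just lets `float()` raise a `ValueError`.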

Rikk

unread,
Sep 26, 2003, 1:49:23 PM9/26/03
to
I hate to have to burst everyone's bubble, but time and space are not constants. S. Hawking's theories have demonstrated a likelihood that the flow of time is not constant over the age of the universe. As the speed of light is a direct measure of time, it is also variable. In addition, Heisenberg's Uncertainty Principle states that position is unknowable, owing to the fact that the more precisely a measurement is taken, the more it disrupts the state of the sample. With a variable space/time and the inability to measure precisely, we can never know how long a point, pica, inch, meter, meridian, lightyear, etc. really is. But we can get close enough for rock-n-roll.

(Very tongue-in-cheek)

I loved the discussion and learned a lot. Thank you all.


M Blackburn

unread,
Sep 26, 2003, 3:02:41 PM9/26/03
to
Nobody said time and space are constants: the speed of light is. There is a Portuguese physicist (J... can't remember the name) who is suggesting that light may not be entirely constant either, or at least that it may not be the ultimate barrier current theory dictates, but that is far from established. Also, the speed of light is not a direct measure of time, and you are misrepresenting the Uncertainty Principle, which is that one can't know both the position and momentum of subatomic particles because measuring one disrupts the other (it's not a relative accuracy thing).

And one last thing: I think everyone here has demonstrated very clearly that they were confused at some point in time by this discussion. <g>

Rikk

unread,
Sep 26, 2003, 3:24:13 PM9/26/03
to
True enough.

Uncertainty does deal in the subatomic

I just felt left out.

<bg>

"M Blackburn" <mblac...@altamira.com> wrote in message
news:2ccd5...@webx.la2eafNXanI...
