
What standards would you change?


George W. Harris

unread,
Mar 31, 2004, 10:04:45 PM3/31/04
to
Wildepad <wild...@newsguy.com> wrote:

:Assume access to a non-paradoxical time travel machine.
:
:You might want to go back to a point where you can substitute
:something logical for what actually came into widespread use (after
:you become obscenely wealthy, of course).
:
:I've heard EEs saying it should have been a 200V@200cps electrical
:grid, others want all books bound at the top rather than on the side,
:and I doubt that few people would object to introducing the metric
:system 2000 years ago.
:
:Personally, I'd have made it a ten hour day, with ten waits in an
:hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
:moment, etc.
:
:
:What would you change and why?

Push-button phones are laid out like this:

1 2 3
4 5 6
7 8 9
* 0 #

while calculators are laid out like this:

7 8 9
4 5 6
1 2 3
{whatever}.

I'd make them the same.

Trivial, I know.

--
"The truths of mathematics describe a bright and clear universe,
exquisite and beautiful in its structure, in comparison with
which the physical world is turbid and confused."

-Eulogy for G.H.Hardy

George W. Harris For actual email address, replace each 'u' with an 'i'

Ben Bradley

unread,
Mar 31, 2004, 10:44:13 PM3/31/04
to
In rec.arts.sf.science, Wildepad <wild...@newsguy.com> wrote:

>Assume access to a non-paradoxical time travel machine.
>
>You might want to go back to a point where you can substitute
>something logical for what actually came into widespread use (after
>you become obscenely wealthy, of course).
>
>I've heard EEs saying it should have been a 200V@200cps electrical
>grid,

Well, there was/is the RS232 "standard" with all the various
combinations of baud, word length, parity, hardware or software flow
control, that actually make it dozens, if not hundreds or thousands,
of "standards." Then I'd make one of those "standards" compatible with
MIDI's baud rate of 31,250. Dunno how to actually fix it, though; all those
features really were needed way back in the days of 75 and 110 baud
TTYs and modems.
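
As a rough illustration of where those combinations come from, here is a
minimal C sketch (hand-rolled for this post, not any real UART's API) that
builds the bit pattern for one asynchronous character; the 7-data-bit,
even-parity, one-stop-bit setting in main() is just an example choice:

#include <stdio.h>

/* Build the bit sequence for one asynchronous character frame:
   1 start bit (0), `bits` data bits sent LSB-first, an optional
   even-parity bit, then `stop` stop bits (1).  Returns the bit count. */
static int frame_char(unsigned char c, int bits, int even_parity,
                      int stop, int out[16])
{
    int n = 0, i, ones = 0;
    out[n++] = 0;                      /* start bit */
    for (i = 0; i < bits; i++) {
        int b = (c >> i) & 1;          /* data bits, LSB first */
        ones += b;
        out[n++] = b;
    }
    if (even_parity)
        out[n++] = ones & 1;           /* pad the count of 1s to even */
    while (stop-- > 0)
        out[n++] = 1;                  /* stop bit(s) */
    return n;
}

int main(void)
{
    int bits[16], i;
    int n = frame_char('A', 7, 1, 1, bits);   /* 'A' framed as 7E1 */
    for (i = 0; i < n; i++)
        printf("%d", bits[i]);
    printf("\n");
    return 0;
}

Multiply the choices of word length (5-8), parity (none/odd/even) and stop
bits (1, 1.5 or 2) by every baud rate and flow-control scheme, and the
"dozens of standards" complaint writes itself.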

>others want all books bound at the top rather than on the side,
>and I doubt that few people would object to introducing the metric
>system 2000 years ago.
>
>Personally, I'd have made it a ten hour day, with ten waits in an
>hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
>moment, etc.
>
>
>What would you change and why?

-----
http://mindspring.com/~benbradley

DJensen

unread,
Mar 31, 2004, 10:36:48 PM3/31/04
to
Wildepad wrote:

> Assume access to a non-paradoxical time travel machine.
>
> You might want to go back to a point where you can substitute
> something logical for what actually came into widespread use (after
> you become obscenely wealthy, of course).
>
> I've heard EEs saying it should have been a 200V@200cps electrical
> grid, others want all books bound at the top rather than on the side,
> and I doubt that few people would object to introducing the metric
> system 2000 years ago.
>
> Personally, I'd have made it a ten hour day, with ten waits in an
> hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
> moment, etc.
>
> What would you change and why?

I'd want the [modern, western] calendar fixed, for sure: 10
months of 36 days (six weeks of six days each), plus an
intercalary week of 5 (6 with a leap day) at the end/start of
each year[1] -- which would be set around one of the equinoxes.

Circles of 100 degrees rather than 360 (which is related to the
calendar but doesn't need to be). This could give us something
like base-10 time too, in a roundabout way.

I'm sure there's all sorts of things that could be done in the
areas of city planning and mass transit... maybe divorcing the
width of train tracks from the width of two horses pulling a
cart? We've been passing that along since the Roman Empire.

[1] I -think- ancient Egypt arrived at a calendar like this at
some point, but it fell out of use as the Babylonian calendar
spread west.

--
DJensen

Derek Lyons

unread,
Mar 31, 2004, 11:00:38 PM3/31/04
to
DJensen <m...@no-spam-thanks.net> wrote:
>I'd want the [modern, western] calendar fixed, for sure: 10
>months of 36 days (six weeks of six days each), plus an
>intercalary week of 5 (6 with a leap day) at the end/start of
>each year[1] -- which would be set around one of the equinoxes.
>
>[1] I -think- ancient Egypt arrived at a calendar like this at
>some point, but it fell out of use as the Babylonian calendar
>spread west.

Why? Other than for financial calculations, such 'evenness' provides
no real practical benefit.

>I'm sure there's all sorts of things that could be done in the
>areas of city planning and mass transit... maybe divorcing the
>width of train tracks from the width of two horses pulling a
>cart? We've been passing that along since the Roman Empire.

No. That's been passing around since the beginning of the internet.

D.
--
Touch-twice life. Eat. Drink. Laugh.

Aaron Bergman

unread,
Mar 31, 2004, 11:20:40 PM3/31/04
to
In article <k7Mac.61014$1A6.1...@news20.bellglobal.com>,
DJensen <m...@no-spam-thanks.net> wrote:

> Circles of 100 degrees rather than 360 (which is related to the
> calendar but doesn't need to be). This could give us something
> like base-10 time too, in a roundabout way.

Ugh. 100 has too few factors. 360's got tons.

You could use gradians (IIRC). 400 grad == 360 deg.
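
A quick way to see the factor argument; just a throwaway C sketch of my
own, counting divisors:

#include <stdio.h>

/* Count how many positive integers divide n evenly. */
static int divisors(int n)
{
    int d, count = 0;
    for (d = 1; d <= n; d++)
        if (n % d == 0)
            count++;
    return count;
}

int main(void)
{
    printf("100 has %d divisors\n", divisors(100));   /* 9  */
    printf("360 has %d divisors\n", divisors(360));   /* 24 */
    printf("400 has %d divisors\n", divisors(400));   /* 15 */
    return 0;
}

360 splits cleanly into halves, thirds, quarters, fifths, sixths, eighths,
ninths, tenths and twelfths; 100 only manages halves, quarters, fifths,
tenths, twentieths and twenty-fifths.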

Aaron

LukeCampbell

unread,
Apr 1, 2004, 12:32:17 AM4/1/04
to
Wildepad wrote:
> What would you change and why?

I'd make sure there was a year zero.

Luke

--
To email me, take out the trash.

Coridon Henshaw

unread,
Apr 1, 2004, 12:34:49 AM4/1/04
to
Wildepad <wild...@newsguy.com> wrote in
news:kjbm609ej0d8ov9fv...@4ax.com:

> What would you change and why?

Microsoft is a 'standard' the world does not need.

--
Coridon Henshaw - http://www3.telus.net/csbh - "I have sadly come to the
conclusion that the Bush administration will go to any lengths to deny
reality." -- Charley Reese

Radovan Garabik

unread,
Apr 1, 2004, 3:52:09 AM4/1/04
to
Wildepad <wild...@newsguy.com> wrote:
> Assume access to a non-paradoxical time travel machine.
>
> You might want to go back to a point where you can substitute
> something logical for what actually came into widespread use (after
> you become obscenely wealthy, of course).
>
> I've heard EEs saying it should have been a 200V@200cps electrical
> grid, others want all books bound at the top rather than on the side,
> and I doubt that few people would object to introducing the metric
> system 2000 years ago.
>
> Personally, I'd have made it a ten hour day, with ten waits in an
> hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
> moment, etc.
>

Too complicated - too many units.
Something like French Revolution hours is not bad - 20 hours per day
(2x10 for day and night), 100 minutes, 100 seconds.

But it would be MUCH better to switch to the duodecimal system
anyway.

>
> What would you change and why?

I'll show Novial to Schleyer and Zamenhof. Or maybe Interlingua, to
increase the chance of international acceptance.
And we would not have to communicate in English (yuck!).

Crossposted to alt.language.artificial, soc.history.what-if

--
-----------------------------------------------------------
| Radovan Garabík http://melkor.dnp.fmph.uniba.sk/~garabik/ |
| __..--^^^--..__ garabik @ kassiopeia.juls.savba.sk |
-----------------------------------------------------------
Antivirus alert: file .signature infected by signature virus.
Hi! I'm a signature virus! Copy me into your signature file to help me spread!

Bryan Derksen

unread,
Apr 1, 2004, 4:23:50 AM4/1/04
to
On 1 Apr 2004 08:52:09 GMT, Radovan Garabik

<gar...@kassiopeia.juls.savba.sk> wrote:
>too complicated - too many units
>something like french revolution hours is not bad - 20 hours per day
>(2x10 for day and night), 100 minutes, 100 seconds.

Still too many units, IMO. I'd go all the way to the Swatch system;
divide the day into 1000 "beats", each 1 minute and 24.6 seconds long.
You can use centibeats for counting second-like intervals, and
dekabeats or hectobeats for hour-like ones.

http://www.swatch.com/internettime/home.php

>But anyway, it would be MUCH better to switch to duodecimal system
>anyway.

Hexadecimal. :)



>> What would you change and why?

It'd be nice to push XML's development (or something similar that
doesn't take as much space) as far into the past as possible, perhaps
as an adjunct to the ASCII standard; a standardized way of describing
the semantics of data files would make so many things much easier.
Perhaps try to make ASCII more Unicode-ready while I'm at it. It'll be
tricky balancing future expandability with the limitations of early
computers.

How about measuring all temperatures in kelvins? :)

Erik Max Francis

unread,
Apr 1, 2004, 5:04:56 AM4/1/04
to
Bryan Derksen wrote:

> Still too many units, IMO. I'd go all the way to the Swatch system;
> divide the day into 1000 "beats", each 1 minute and 24.6 seconds long.
> You can use centibeats for counting second-like intervals, and
> dekabeats or hectobeats for hour-like ones.

That's the standard metric approach to dividing up a day; Swatch didn't
invent it, they just chose to define it with @000 being their time
rather than GMT. The day is divided into decidays, centidays,
millidays, and so on.
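
As a concrete sketch (my own illustration in C; 86400 seconds per day
divided by 1000 gives 86.4 s per beat/milliday, and the reference meridian
is whatever you pick, BMT for Swatch or GMT for the metric version):

#include <stdio.h>

/* Convert a time of day at the chosen reference meridian into
   "beats" (millidays): 1000 per day, 86.4 seconds each. */
static double to_beats(int h, int m, double s)
{
    double secs = h * 3600.0 + m * 60.0 + s;
    return secs / 86.4;        /* 86400 s per day / 1000 */
}

int main(void)
{
    double b = to_beats(18, 30, 0.0);
    printf("18:30:00 is @%.2f beats, i.e. %.2f centidays\n", b, b / 10.0);
    return 0;
}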

--
__ Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
/ \ San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis
\__/ Patiently, I'm still / Holding out until
-- Sandra St. Victor

Bernardz

unread,
Apr 1, 2004, 7:12:33 AM4/1/04
to
In article <406BE948...@alcyone.com>, m...@alcyone.com says...

> Bryan Derksen wrote:
>
> > Still too many units, IMO. I'd go all the way to the Swatch system;
> > divide the day into 1000 "beats", each 1 minute and 24.6 seconds long.
> > You can use centibeats for counting second-like intervals, and
> > dekabeats or hectobeats for hour-like ones.
>
> That's the standard metric approach to dividing up a day; Swatch didn't
> invent it, they just chose to define it with @000 being their time
> rather than GMT. The day is divided into decidays, centidays,
> millidays, and so on.
>
>

If someone is proposing changes to time, what about the calendar?

The standard year is 365 days; divide by 7 and you get 52 weeks and one day.

Because of that one extra day we need a new calendar every year, as the
days of the week shift against the dates.

Well, what if we had one day a year set aside as a special holiday which
is not a weekday? Note that in a leap year we would need two such days.

We could then have a permanent calendar.

--
It is scary when you consider that kooky conspiracy theorists actually
vote.

Observations of Bernard - No 57

Dr John Stockton

unread,
Apr 1, 2004, 11:46:19 AM4/1/04
to
JRS: In article <kjbm609ej0d8ov9fv...@4ax.com>, seen in
news:rec.arts.sf.science, Wildepad <wild...@newsguy.com> posted at Wed,
31 Mar 2004 19:28:39 :

>Assume access to a non-paradoxical time travel machine.
>
>You might want to go back to a point where you can substitute
>something logical for what actually came into widespread use (after
>you become obscenely wealthy, of course).


Counting on the fingers, rather than on the fingers-and-thumbs, thus
getting base 8; or in a binary mode in the fingers of one hand, leading
to base 16.

If the machine won't go back that far, introduce Arabic Numerals, with
zero, at the beginning of (Graeco-Roman) Classical times,

If the machine won't go back that far, give Archbishop Ussher's ideas to
Dionysius Exiguus, so that the Year Zero could be what we now call 4713
BC; and persuade DE of the benefits of year-month-day order and sensible
month-lengths.

If the machine won't go back that far, make America speak German.

If the machine won't go back that far, make C, etc., start indexing at 1
rather than 0 - a fertile source of error to remove.

--
© John Stockton, Surrey, UK. ?@merlyn.demon.co.uk Turnpike v4.00 MIME. ©
Web <URL:http://www.merlyn.demon.co.uk/> - w. FAQish topics, links, acronyms
PAS EXE etc : <URL:http://www.merlyn.demon.co.uk/programs/> - see 00index.htm
Dates - miscdate.htm moredate.htm js-dates.htm pas-time.htm critdate.htm etc.

Jack Linthicum

unread,
Apr 1, 2004, 1:11:03 PM4/1/04
to
George W. Harris <gha...@mundsprung.com> wrote in message news:<612n60psj1to6jq6e...@4ax.com>...

> Wildepad <wild...@newsguy.com> wrote:
>
> :Assume access to a non-paradoxical time travel machine.
> :
> :You might want to go back to a point where you can substitute
> :something logical for what actually came into widespread use (after
> :you become obscenely wealthy, of course).
> :
> :I've heard EEs saying it should have been a 200V@200cps electrical
> :grid, others want all books bound at the top rather than on the side,
> :and I doubt that few people would object to introducing the metric
> :system 2000 years ago.
>

I know of one computer installer who would like for a world-wide
standard 50/60 cps, 110 220 200 volts. He took a very expensive
computer from the US to Geneva in 1979 and set it up and, after many
warnings, plugged it in. 50 cycles at 220-240 volts is not compatible
with 60 cycles and 110-120. But he confirmed the then experimentally
held belief that computers work on smoke, let the smoke out and the
computer stops.

Christian Weisgerber

unread,
Apr 1, 2004, 11:49:02 AM4/1/04
to
Ben Bradley <ben__b...@mindspring.example.com> wrote:

> Well, there was/is the RS232 "standard" with all the various
> combinations of baud, word length, parity, hardware or software flow
> control, that actually make it dozens, if not hundreds or thousands,
> or "standards."

Oh, you're complaining about so-called asynchronous mode, which was
invented sometime in the 1920s for electro-mechanical teletypes.
That has always been a poor man's technology. Proper serial
interfaces are of course synchronous and clocked, which EIA-232
(the standard) very much provides for. Run a sensible HDLC-based
framing protocol over it, and you're all set. The problem is, this
was rather expensive in the old days of 8-bit microprocessors and
even more so before.

The electrical part of EIA-232 (also defined in V.28) sucks, too.
You really want to use differential signals. Plenty of existing
standards to choose from, but of course they were always a bit more
expensive.

--
Christian "naddy" Weisgerber na...@mips.inka.de

Keith Morrison

unread,
Apr 1, 2004, 1:19:49 PM4/1/04
to
Wildepad wrote:

>>:What would you change and why?
>>
>> Push-button phones are laid out like this:
>>
>>1 2 3
>>4 5 6
>>7 8 9
>>* 0 #
>>
>> while calculators are laid out like this:
>>
>>7 8 9
>>4 5 6
>>1 2 3
>>{whatever}.
>>
>> I'd make them the same.
>>
>> Trivial, I know.
>

> It's not really trivial, and there are many minor quirks like this.
> One of my own complaints is that if you look at tv listings, they go
> from the lowest number at the top to the higher numbered channels at
> the bottom, but if you surf through the channels, pushing the top
> button on the remote takes you up in number but down the list.

Some satellite receivers allow you to arrange the TV guide listing
lowest-highest or highest-lowest. Mine is set up so the higher-
numbered channels are at the top.

In the whatever-hundred-channel universe, my bitch about TV listings
is that when you've got several affiliates of the same network, they should
be grouped together irrespective of channel number.

--
Keith

Keith Morrison

unread,
Apr 1, 2004, 1:34:38 PM4/1/04
to
Erik Max Francis wrote:

>>Still too many units, IMO. I'd go all the way to the Swatch system;
>>divide the day into 1000 "beats", each 1 minute and 24.6 seconds long.
>>You can use centibeats for counting second-like intervals, and
>>dekabeats or hectobeats for hour-like ones.
>
> That's the standard metric approach to dividing up a day; Swatch didn't
> invent it, they just chose to define it with @000 being their time
> rather than GMT. The day is divided into decidays, centidays,
> millidays, and so on.

Julian dating. 0 was noon Universal time, 1 Jan 4713 BCE.

It's now about 1830 UT, April 1, 2004, as I'm writing this, which
is a Julian Date of 2453097.27083 (18:30:00 UT, to be exact).
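
For anyone who wants to check that figure, here is a small C sketch of the
standard Gregorian-to-Julian-Date conversion (the usual integer-arithmetic
formula; my own illustration, not something posted above):

#include <stdio.h>

/* Julian Day Number at noon UT of a Gregorian calendar date
   (standard integer-arithmetic formula). */
static long jdn(int year, int month, int day)
{
    long a = (14 - month) / 12;
    long y = year + 4800 - a;
    long m = month + 12 * a - 3;
    return day + (153 * m + 2) / 5 + 365 * y
           + y / 4 - y / 100 + y / 400 - 32045;
}

/* Full Julian Date; the count starts at noon, hence the -12 hours. */
static double jd(int year, int month, int day, int h, int min, double s)
{
    return jdn(year, month, day)
           + (h - 12) / 24.0 + min / 1440.0 + s / 86400.0;
}

int main(void)
{
    /* 2004-04-01 18:30:00 UT -> 2453097.27083, matching the figure above */
    printf("JD = %.5f\n", jd(2004, 4, 1, 18, 30, 0.0));
    return 0;
}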

Note also that something similar was used (very inconsistently)
in the later Star Trek series regarding the Stardate. A year
had 1000 units, so "Stardate 51000.0" was 51 years after some
reference date.

--
Keith

Jack Linthicum

unread,
Apr 1, 2004, 3:00:42 PM4/1/04
to
Bernardz <Berna...@REMOVEhotmail.com> wrote in message news:<MPG.1ad6b240eb771472989a3f@news>...

> In article <406BE948...@alcyone.com>, m...@alcyone.com says...
> > Bryan Derksen wrote:
> >
> > > Still too many units, IMO. I'd go all the way to the Swatch system;
> > > divide the day into 1000 "beats", each 1 minute and 24.6 seconds long.
> > > You can use centibeats for counting second-like intervals, and
> > > dekabeats or hectobeats for hour-like ones.
> >
> > That's the standard metric approach to dividing up a day; Swatch didn't
> > invent it, they just chose to define it with @000 being their time
> > rather than GMT. The day is divided into decidays, centidays,
> > millidays, and so on.
> >
> >
>
> If someone is proposing changes to time what about the calendar.
>
> The standard year is 365 days divided by 7 you get 52 weeks and one day.
>
> Because of this one day extra we need every year a new calender as the
> days of the week change to the date.
>
> Well what if we had one day a year set aside as a special holiday which
> is not a week day. Note on a leap year we would need two such days.
>
> We could have now a permanent calendar

What for? Virtually every business gives away a new calendar every
year, and for those who are more artistically inclined the book stores
sell calendars with every theme from religion to your favorite
explosive device. I write on my calendar; I do not want to attend the
same funeral every year, a quirk of mine I picked up from my
undertaker grandfather and uncle. If we rectify the little things like
February and its lack of days, what are we going to do about those
clumsy religious holidays that depend on phases of the moon or the
complete lunar calendar? We have computers and radio stations to tell
us what day it is.

George W. Harris

unread,
Apr 1, 2004, 4:27:36 PM4/1/04
to
Wildepad <wild...@newsguy.com> wrote:

:Assume access to a non-paradoxical time travel machine.
:
:You might want to go back to a point where you can substitute
:something logical for what actually came into widespread use (after
:you become obscenely wealthy, of course).
:
:I've heard EEs saying it should have been a 200V@200cps electrical
:grid, others want all books bound at the top rather than on the side,
:and I doubt that few people would object to introducing the metric
:system 2000 years ago.

:
:Personally, I'd have made it a ten hour day, with ten waits in an


:hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
:moment, etc.

I'd establish a set of units (mass, distance, time)
such that c, G and h are all equal to 1.
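
That is essentially the Planck unit system. A short C sketch of the scales
it produces, using the reduced constant hbar as in the usual convention
(the post says h; with h instead of hbar every value shifts by a factor of
sqrt(2*pi)):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* approximate values in SI units */
    const double c    = 2.99792458e8;      /* m/s            */
    const double G    = 6.674e-11;         /* m^3 kg^-1 s^-2 */
    const double hbar = 1.054571817e-34;   /* J s            */

    /* base units in which c = G = hbar = 1 */
    double l_p = sqrt(hbar * G / pow(c, 3));   /* ~1.6e-35 m  */
    double t_p = l_p / c;                      /* ~5.4e-44 s  */
    double m_p = sqrt(hbar * c / G);           /* ~2.2e-8  kg */

    printf("Planck length: %.3e m\n",  l_p);
    printf("Planck time:   %.3e s\n",  t_p);
    printf("Planck mass:   %.3e kg\n", m_p);
    return 0;
}

Handy for physics, less handy for shopping, since everyday quantities come
out as enormous or tiny powers of ten in those units.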


--
Real men don't need macho posturing to bolster their egos.

George W. Harris For actual email address, replace each 'u' with an 'i'.

Erik Max Francis

unread,
Apr 1, 2004, 5:22:03 PM4/1/04
to
Keith Morrison wrote:

> Note also that something similar was used (very inconsistantly)
> in the latter Star Trek series regarding the Stardate. A year
> had 1000 units, so "Stardate 51000.0" was 51 years after some
> reference date.

It changed throughout the different series, so the definition wasn't
that clean or consistent.

--
__ Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
/ \ San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis

\__/ Wretches hang that jurymen may dine.
-- Alexander Pope

G. Waleed Kavalec

unread,
Apr 1, 2004, 5:30:36 PM4/1/04
to

"Erik Max Francis" <m...@alcyone.com> wrote in message
news:406C960B...@alcyone.com...

> Keith Morrison wrote:
>
> > Note also that something similar was used (very inconsistantly)
> > in the latter Star Trek series regarding the Stardate. A year
> > had 1000 units, so "Stardate 51000.0" was 51 years after some
> > reference date.
>
> It changed throughout the different series, so the definition wasn't
> that clean or consistent.

Gene Roddenberry once "explained" in some interview that stardates had to
take location relative to the galactic center into account.

It was a good out. ;-)


DJensen

unread,
Apr 1, 2004, 5:50:56 PM4/1/04
to
Derek Lyons wrote:
> DJensen <m...@no-spam-thanks.net> wrote:
>
>>I'd want the [modern, western] calendar fixed, for sure: 10
>>months of 36 days (six weeks of six days each), plus an
>>intercalary week of 5 (6 with a leap day) at the end/start of
>>each year[1] -- which would be set around one of the equinoxes.
>
> Why? Other than for financial calculations such 'evenness' provides
> not real practical benefit.

On a 10x36+5(sometimes 6) calendar, each month of each year in
each olympiad begins on the same day of the week, with drift of
one day (backwards) only every new olympiad (unless you decide
that the intercalary week doesn't affect which day the following
month starts, then no drift at all). Makes planning for future
events, down to the day of the week, much easier for everyone and
cuts back on a lot of holiday drift.

>>I'm sure there's all sorts of things that could be done in the
>>areas of city planning and mass transit... maybe divorcing the
>>width of train tracks from the width of two horses pulling a
>>cart? We've been passing that along since the Roman Empire.
>
> No. That's been passing around since the begining of the internet.

Well no, the story only gained a boost from the internet (what
urban legend hasn't?), but it's been urban legend material since
WWII. Note though that Snopes only faults the legend for some of
its embellishments and for ignoring the logic behind why the gauge
persisted; otherwise it's basically true (for most of North
America and, I assume, modern Britain):
http://www.snopes.com/history/american/gauge.htm

--
DJensen

DJensen

unread,
Apr 1, 2004, 5:51:50 PM4/1/04
to
Aaron Bergman wrote:
> DJensen <m...@no-spam-thanks.net> wrote:
>>Circles of 100 degrees rather than 360 (which is related to the
>>calendar but doesn't need to be). This could give us something
>>like base-10 time too, in a roundabout way.
>
> Ugh. 100 has too few factors. 360's got tons.
>
> You could use gradians (IIRC). 400 grad == 360 deg.

How about 1000 then?

--
DJensen

DJensen

unread,
Apr 1, 2004, 5:55:38 PM4/1/04
to
LukeCampbell wrote:
> Wildepad wrote:
>> What would you change and why?
>
> I'd make sure there was a year zero.

I don't see a practical reason for that.

--
DJensen

Bernard Peek

unread,
Apr 1, 2004, 6:09:53 PM4/1/04
to
In message <lj2p601hubar84l05...@4ax.com>, George W.
Harris <gha...@mundsprung.com> writes

>Wildepad <wild...@newsguy.com> wrote:
>
>:Assume access to a non-paradoxical time travel machine.
>:
>:You might want to go back to a point where you can substitute
>:something logical for what actually came into widespread use (after
>:you become obscenely wealthy, of course).
>:
>:I've heard EEs saying it should have been a 200V@200cps electrical
>:grid, others want all books bound at the top rather than on the side,
>:and I doubt that few people would object to introducing the metric
>:system 2000 years ago.
>:
>:Personally, I'd have made it a ten hour day, with ten waits in an
>:hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
>:moment, etc.
>
> I'd establish a set of units (mass, distance, time)
>such that c, G and h are all equal to 1.

Try foot, nanosecond, jupiter-mass.


--
Bernard Peek
London, UK. DBA, Manager, Trainer & Author. Will work for money.

nyra

unread,
Apr 1, 2004, 5:55:26 PM4/1/04
to
Jack Linthicum schrieb:

>
> Bernardz <Berna...@REMOVEhotmail.com> wrote in message news:<MPG.1ad6b240eb771472989a3f@news>...
> > In article <406BE948...@alcyone.com>, m...@alcyone.com says...
> > > Bryan Derksen wrote:
> > >
> > > > Still too many units, IMO. I'd go all the way to the Swatch system;
> > > > divide the day into 1000 "beats", each 1 minute and 24.6 seconds long.
> > > > You can use centibeats for counting second-like intervals, and
> > > > dekabeats or hectobeats for hour-like ones.
> > >
> > > That's the standard metric approach to dividing up a day; Swatch didn't
> > > invent it, they just chose to define it with @000 being their time
> > > rather than GMT. The day is divided into decidays, centidays,
> > > millidays, and so on.
> > >
> > >
> >
> > If someone is proposing changes to time what about the calendar.
> >
> > The standard year is 365 days divided by 7 you get 52 weeks and one day.
> >
> > Because of this one day extra we need every year a new calender as the
> > days of the week change to the date.

The days of the week also change from month to month. To make the
calendar more rational, I think the French Revolution calendar (12
months of 30 days each; 3 "decades" of ten days each per month; 5 or 6
extra days outside of the month/decade count) was a very viable
approach. One could fiddle with the individual parameters, but I think
a month-equivalent shouldn't be shorter than 20 days (18 months per
year) or longer than 60 (6 per year); and the number of days in the
"normal" year - without the extra days - should be easily
subdivisible, which rules out 364, which has the unwieldy 13 as a prime
factor.

> > Well what if we had one day a year set aside as a special holiday which
> > is not a week day. Note on a leap year we would need two such days.
> >
> > We could have now a permanent calendar
>
> What for? Virtually every business gives away a new calendar every
> year, for those who are more artistically inclined the book stores
> sell calendars with every theme from religion to your favorite
> explosive device.

With a variation on the Revolution calendar, the weekday of every day
in every month in every year would be instantly obvious (if it's the
16th of a "month", it's always the 6th day of the decade). The
practical value wouldn't be immense, but it'd be there.

> If we rectify the little things like
> February and its lack of days what are we going to do about those
> clumsy religious holidays that depend on phases of the moon or the
> complete lunar calendar.

You won't get to tell the churches when to celebrate Easter; they'll
use their own religious calendar for these purposes. The Islamic
calendar is completely asynchronous to ours[1], and Orthodox churches
still calculate religious holidays by the Julian calendar instead
of the Gregorian. I don't see why western Christianity should have a
problem with a discrepancy between the "secular" and "ritual"
calendars.

[1] IIRC they manage to fit 34 years into 33 solar years.

I would like to see a non-geocentric system of measurements; of
course, what to base such a system on? Measurements of the hydrogen
atom? And a system of time measurement which doesn't care about Earth
would necessarily be relegated to a mostly symbolical status, as
humans' life rhythm depends on day/night cycles - introducing a
"stardate day" of 18 earth hours just wouldn't work for many people.

--
Omnis clocha clochabilis, in clocherio clochando, clochans
clochativo clochare facit clochabiliter clochantes.
F. Rabelais, Gargantua


Warren Okuma

unread,
Apr 1, 2004, 6:34:25 PM4/1/04
to

"Wildepad" <wild...@newsguy.com> wrote in message
news:kjbm609ej0d8ov9fv...@4ax.com...

> Assume access to a non-paradoxical time travel machine.
>
> You might want to go back to a point where you can substitute
> something logical for what actually came into widespread use (after
> you become obscenely wealthy, of course).
>
> I've heard EEs saying it should have been a 200V@200cps electrical
> grid, others want all books bound at the top rather than on the side,
> and I doubt that few people would object to introducing the metric
> system 2000 years ago.
>
> Personally, I'd have made it a ten hour day, with ten waits in an
> hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
> moment, etc.
>
>
> What would you change and why?

Replace all other systems of measure with the metric system... grams,
meters, etc...

Why? Because it's easier on science and folks.


John

unread,
Apr 1, 2004, 8:51:27 PM4/1/04
to

"Wildepad" <wild...@newsguy.com> wrote in message
news:kjbm609ej0d8ov9fv...@4ax.com...
> Assume access to a non-paradoxical time travel machine.
>
>
> What would you change and why?

All countries to use the same television, power and plug standard(s), as
well as the same road rules and side-of-road driving.


Ben Bradley

unread,
Apr 1, 2004, 11:58:16 PM4/1/04
to

Where were you four or five years ago? It would have saved the
stupid people from so vehemently disagreeing with the smart people
about when the 21st Century started.
I dunno if it's true, but back around y2k I heard there had been no
disagreement that the 20th Century started in 1901.

-----
http://mindspring.com/~benbradley

Ben Bradley

unread,
Apr 2, 2004, 12:33:19 AM4/2/04
to
In rec.arts.sf.science, Wildepad <wild...@newsguy.com> wrote:

>On Thu, 1 Apr 2004 17:46:19 +0100, Dr John Stockton
><sp...@merlyn.demon.co.uk> wrote:
>
>>JRS: In article <kjbm609ej0d8ov9fv...@4ax.com>, seen in
>>news:rec.arts.sf.science, Wildepad <wild...@newsguy.com> posted at Wed,
>>31 Mar 2004 19:28:39 :
>>>Assume access to a non-paradoxical time travel machine.

Does that mean I can kill someone else's grandfather?

>>Counting on the fingers, rather than on the fingers-and-thumbs, thus
>>getting base 8; or in a binary mode in the fingers of one hand, leading
>>to base 16.
>

>There are 10 kinds of people in the world.
>Those who understand binary and those who don't.
>
>There are 11 kinds of people in the world.
>Those who understand unary and those who don't.
>
>There are 3 kinds of people in the world.
>Those who can count and those who can't.


>
>>If the machine won't go back that far, make America speak German.

I get the vague feeling that this thread just got Godwinized...

>>If the machine won't go back that far, make C, etc., start indexing at 1
>>rather than 0 - a fertile source of error to remove.

Hmm, someone wanted the year numbering scheme to start at 0 rather
than 1. Actually, in addition to my other post, this could have helped
Jesus, as his age would correspond to the year AD. As it was, I'm sure
he got confused and had to double-check himself at times: "Let's see,
it's the year 11 AD, but I'm actually only 10 years old..." ;)

There are two kinds of people in the world:
0. Those who start enumerating with the number zero, and
1. Those who don't.

>I, too, thought that at one time, but once I got used to it I found it
>eminently useful.

There are enough ways to fsck up in C (even just fencepost and
related counting errors, not to mention things like using == when
meaning = and vice versa) that changing the index (and presuming it's
actually better starting with 1 than with 0) won't fix more than maybe
one percent of the problems. Caveat coder.

-----
http://mindspring.com/~benbradley

Erik Max Francis

unread,
Apr 2, 2004, 12:40:25 AM4/2/04
to
Dr John Stockton wrote:

> If the machine won't go back that far, make C, etc., start indexing at
> 1
> rather than 0 - a fertile source of error to remove.

This is hardly an error -- it makes a great deal more sense from a
computer science perspective to index from 0 rather than 1. For one
thing, it's the lowest unsigned value of any unsigned type (whereas 1 is
not the lowest value of any integral type), and for another, since
array/pointer subscription works by addressing, 0 is the offset of the
first element of an array, not 1.

Some people don't like it, but indexing from 0 is not an arbitrary
choice, it was done for a very good, and still valid, reason.
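
The addressing point in two asserts (plain standard C, nothing beyond what
the language itself defines):

#include <stdio.h>
#include <assert.h>

int main(void)
{
    int a[4] = {10, 20, 30, 40};

    /* a[i] is defined as *(a + i): the subscript is an offset from
       the start of the array, so the first element sits at offset 0. */
    assert(a[0] == *(a + 0));
    assert(a[3] == *(a + 3));

    printf("first element: a[0] = %d (offset 0 from &a[0] = %p)\n",
           a[0], (void *)&a[0]);
    return 0;
}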

--
__ Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
/ \ San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis

\__/ Do not stand in a place of danger trusting in miracles.
-- (an Arab proverb)

Erik Max Francis

unread,
Apr 2, 2004, 1:05:14 AM4/2/04
to
LukeCampbell wrote:
>
> Wildepad wrote:
> > What would you change and why?
>
> I'd make sure there was a year zero.

That seems kind of a waste to blow your wish on, since this can easily
be handled by a notational change, rather than requiring an actual change
propagating back through history. Just write n AD as n, 1 BC as 0, and
n BC (n > 1) as -(n - 1).
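
The same mapping as a throwaway C helper, just for illustration (this is
the standard astronomical year numbering):

#include <stdio.h>

/* Astronomical year numbering: 1 BC -> 0, 2 BC -> -1, n BC -> -(n-1);
   AD years are unchanged. */
static int astronomical_year(int year, int is_bc)
{
    return is_bc ? -(year - 1) : year;
}

int main(void)
{
    printf("44 BC   -> %d\n", astronomical_year(44, 1));   /* -43 */
    printf(" 1 BC   -> %d\n", astronomical_year(1, 1));    /*  0  */
    printf("AD 2004 -> %d\n", astronomical_year(2004, 0)); /* 2004 */
    return 0;
}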

To actually change the historical account, you'd probably have to have
Western civilization adopt the concept of the number zero a little
earlier as well ...

Aaron Bergman

unread,
Apr 2, 2004, 2:24:37 AM4/2/04
to
Not have C be the standard language for computing.

Not allow programs to overwrite the execution stack.

Aaron

Radovan Garabik

unread,
Apr 2, 2004, 3:40:49 AM4/2/04
to
Wildepad <wild...@newsguy.com> wrote:
> On Thu, 1 Apr 2004 17:46:19 +0100, Dr John Stockton
> <sp...@merlyn.demon.co.uk> wrote:
>
>>If the machine won't go back that far, make America speak German.
>
> Ugh! While I'm not English's greatest fan, German is, imo, far worse
> -- any language that needs more than one word for 'the' or 'you' has
> too many unnecessary rules and odd structures.

Ugh! -- any language that needs 'the' (definite article) has
too many unnecessary rules and odd structures. Many languages do
fine without.
Anyway, English has (at least) two different words for 'the'
(though they are spelled the same). Ditto for 'you'

Radovan Garabik

unread,
Apr 2, 2004, 3:50:29 AM4/2/04
to
In rec.arts.sf.science Jack Linthicum <jackli...@earthlink.net> wrote:
> Bernardz <Berna...@REMOVEhotmail.com> wrote in message news:<MPG.1ad6b240eb771472989a3f@news>...
>> In article <406BE948...@alcyone.com>, m...@alcyone.com says...
>> > Bryan Derksen wrote:
>> >
>> > > Still too many units, IMO. I'd go all the way to the Swatch system;
>> > > divide the day into 1000 "beats", each 1 minute and 24.6 seconds long.
>> > > You can use centibeats for counting second-like intervals, and
>> > > dekabeats or hectobeats for hour-like ones.

remember that not every day has exactly 1000 beats, though

>> >
>> > That's the standard metric approach to dividing up a day; Swatch didn't
>> > invent it, they just chose to define it with @000 being their time
>> > rather than GMT. The day is divided into decidays, centidays,
>> > millidays, and so on.
>> >
>> >
>>
>> If someone is proposing changes to time what about the calendar.
>>
>> The standard year is 365 days divided by 7 you get 52 weeks and one day.

The biggest problem with calendar reforms is that the week is too ingrained
in people's minds, and that 7 is not exactly divisible by anything, nor is
it a divisor of anything 'natural'. But the combination of 5 working days +
2 free days seems to be an optimal one - make the week 6 days, and then you
either lose much work productivity with just 4 working days, or do not let
people have enough rest with 5 working days.

Anyway, with a duodecimal system and a 6-day week, the calendar can be quite
nicely organized into 10 months, each of 26 (a nice half-round number) days
(= 5 weeks), + 5 days left over for end-of-year orgies^Wholidays.
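
For anyone tripping over the figures: they are written in duodecimal, so a
decimal reader sees 12 months of 30 days. A tiny C check, under that
reading:

#include <stdio.h>

/* The post's figures are duodecimal: "10" months of "26" days,
   i.e. 12 months of 30 days (5 six-day weeks each) in decimal. */
int main(void)
{
    int months = 1 * 12 + 0;        /* duodecimal 10 -> 12 */
    int days   = 2 * 12 + 6;        /* duodecimal 26 -> 30 */
    int week   = 6;

    printf("%d months x %d days = %d days, + 5 extra = %d\n",
           months, days, months * days, months * days + 5);
    printf("each month is %d weeks of %d days\n", days / week, week);
    return 0;
}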

Erik Max Francis

unread,
Apr 2, 2004, 3:51:40 AM4/2/04
to
Radovan Garabik wrote:

> Ugh! -- any language that needs 'the' (definite article) has
> too many unnecessary rules and odd structures. Many languages do
> fine without.
> Anyway, English has (at least) two different words for 'the'
> (though they are spelled the same). Ditto for 'you'

What's different about two words that are spelled the same and
(presumably) mean the same? Pronunciation (/Di/ vs. /D@/ for _the_, for
example)?

--
__ Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
/ \ San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis

\__/ They love too much that die for love.
-- (an English proverb)

Radovan Garabik

unread,
Apr 2, 2004, 3:52:47 AM4/2/04
to

But people in power (= corporations) do not want this. See, when digital TV
emerged, it was a perfect opportunity to introduce a common standard
for worldwide TV. It did not happen.

And regional codes in DVD players....

Radovan Garabik

unread,
Apr 2, 2004, 3:57:59 AM4/2/04
to
Erik Max Francis <m...@alcyone.com> wrote:
> Radovan Garabik wrote:
>
>> Ugh! -- any language that needs 'the' (definite article) has
>> too many unnecessary rules and odd structures. Many languages do
>> fine without.
>> Anyway, English has (at least) two different words for 'the'
>> (though they are spelled the same). Ditto for 'you'
>
> What's different about two words that are spelled the same and
> (presumably) mean the same? Pronunciation (/Di/ vs. /D@/ for _the_, for
> example)?
>

Exactly. Such things should not be in an international auxiliary
language. 'The' is obscured by English spelling, but 'a/an' is
a nicer example of what I mean. It is not much better than German,
if we do not think about cases.

Erik Max Francis

unread,
Apr 2, 2004, 4:08:40 AM4/2/04
to
Radovan Garabik wrote:

> Erik Max Francis <m...@alcyone.com> wrote:
>
> > What's different about two words that are spelled the same and
> > (presumably) mean the same? Pronunciation (/Di/ vs. /D@/ for _the_, > > for example)?
>
> Exactly. Such things should not be in an international auxilliary
> language. 'The' is obscured by English spelling, but 'a/an' is
> a nicer example of what I mean. It is not much better than German,
> if we do not think about cases.

I certainly agree that the inconsistent pronunciation of different
English words, particularly common ones, is frustrating, but as a point
of language usage, they're still the same word as that term is usually
meant; your comment ("English has two different words for 'the'") was
extremely confusing to a native English speaker on its face.

Radovan Garabik

unread,
Apr 2, 2004, 4:47:28 AM4/2/04
to
Erik Max Francis <m...@alcyone.com> wrote:
> Radovan Garabik wrote:
>
>> Erik Max Francis <m...@alcyone.com> wrote:
>>
>> > What's different about two words that are spelled the same and
>> > (presumably) mean the same? Pronunciation (/Di/ vs. /D@/ for _the_, > > for example)?
>>
>> Exactly. Such things should not be in an international auxilliary
>> language. 'The' is obscured by English spelling, but 'a/an' is
>> a nicer example of what I mean. It is not much better than German,
>> if we do not think about cases.
>
> I certainly agree that the inconsistent pronunciation of different
> English words, particularly common ones, is frustrating, but as a point
> of language usage, they're still the same word is that term is meant;
> your comment ("English has two different words for 'the'") was extremely
> confusing to a native English speaker on its face.
>

Yes, I agree that for native speakers, some issues foreigners have
with their language are extremely confusing :-)

A German would probably be similarly confused by the claim that "der" and
"dem" are considered to be two words (it is just one word, it means the
same, just the second one is in a different case).

Radovan Garabik

unread,
Apr 2, 2004, 6:47:57 AM4/2/04
to
Wolfgang Schwanke <s...@sig.nature> wrote:
> Radovan Garabik <gar...@kassiopeia.juls.savba.sk> wrote in
> news:c4j9kv$2hmj20$4...@ID-89407.news.uni-berlin.de:
>
>> John <ju...@junk.com> wrote:
>
>>> All countries to use the same television, power and plug standard(s),
>>> as well as the same road rules and side-of-road driving.
>>
>> But people in power (=corporations) do not want this.
>
> I don't think that's the reason. The division is too deep to overcome
> easily.

>
>> See, when
>> digital TV emerged, it was a perfect opportunity to introduce common
>> standard for worlwide TV. It did not happen.
>
> Was it really an opportunity? It's considered desirable to have digital TV
> downwards compatible with the analogue television standard so that existing
> hardware (TVs, VCRs, cameras ...) isn't mass-obsoleted overnight. Therefore
> the digital transmission modes have to inherit the parameters (frame rate,
> scan lines) of the local analogue ones. Therefore the camps have to remain
> separate.

Framerate and scanlines can be rather freely modified (at least in
European satellite DVB); at worst you just get flicker and a
wrong aspect ratio.

I mean things like the incompatibility between US DirecTV and MPEG2 broadcasting
in Europe, different frequency ranges used, different ways of
controlling LNBs (so you cannot plug an LNB bought in Japan into a
European satellite receiver), etc...

I do not know about terrestrial DVB, but I suspect the situation will be
similar - different standards not exactly on purpose, but because they
are being developed independently ("why the heck should we copy those
pesky <fill in your favourite nation> just because they were the
first?")

>
>> And regional codes in DVD players....
>

> That one is of course on purpose, but as you can see it "works" completely
> independent of TV standards.

In exactly the same way, DVB broadcasting could work completely
independently of (legacy) TV standards - it is just a stream of
bytes, after all.

Michael Ash

unread,
Apr 2, 2004, 7:12:30 AM4/2/04
to
On Fri, 2 Apr 2004, Radovan Garabik wrote:

> A German would probably be too confused by the claim that "der" and
> "dem" is considered to be two words (it is just one word, it means the
> same, just the second one is in different case)

I don't know any German, but I assume 'in different case' means that "der"
and "dem" aren't interchangeable? The two pronunciations of "the" are,
unless I'm badly mistaken, completely interchangeable. Any place you can
say "thee" you can say "thuh", and vice-versa.

Radovan Garabik

unread,
Apr 2, 2004, 7:55:38 AM4/2/04
to
Michael Ash <mi...@mikeash.com> wrote:
> On Fri, 2 Apr 2004, Radovan Garabik wrote:
>
>> A German would probably be too confused by the claim that "der" and
>> "dem" is considered to be two words (it is just one word, it means the
>> same, just the second one is in different case)
>
> I don't know any German, but I assume 'in different case' means that "der"
> and "dem" aren't interchangeable?

Yes, they aren't. Similarly, in English you cannot interchange "take" and
"takes" freely (I take, he takes...). But are "take" and "takes" two
words, or just one in different forms?

> The two pronunciations of "the" are,
> unless I'm badly mistaken, completely interchangeable. Any place you can
> say "thee" you can say "thuh", and vice-versa.

Are you sure? My English is not perfect, but I guess if I spoke
something like "/D@/ apple is growing on /Di/ tree in /Di/ garden and
/D@/ apple is being watched by /Di/ cat" I would get at best funny
looks and encouraging comments about my good English pronunciation.

IMHO /D@/ is used before consonants, /Di/ before vowels.
/Di/ before consonants is used for emphasis; /D@/ before vowels should
not be used (only in rather rapid speech with extreme reduction in vowel
quality). It of course further depends on the exact variety of English
used.

Radovan Garabik

unread,
Apr 2, 2004, 8:04:27 AM4/2/04
to
Wolfgang Schwanke <s...@sig.nature> wrote:
> Radovan Garabik <gar...@kassiopeia.juls.savba.sk> wrote in
> news:c4jjtd$25pc82$1...@ID-89407.news.uni-berlin.de:
>
>>> Was it really an opportunity? It's considered desirable to have
>>> digital TV downwards compatible with the analogue television standard
>>> so that existing hardware (TVs, VCRs, cameras ...) isn't
>>> mass-obsoleted overnight. Therefore the digital transmission modes
>>> have to inherit the parameters (frame rate, scan lines) of the local
>>> analogue ones. Therefore the camps have to remain separate.
>>
>> framerate and scanlines can be rather freely modified (at least in
>> European satellite DVB), at worst you are just having flicker and
>> wrong aspect ratio
>
> Nowadays they can as standards converters have come down in price. But

It is not about the converter output, but about the framerate and resolution
of the digital signal. I am receiving a number of satellite DVB channels with
a great variety of resolutions - it all boils down to the receiver
producing an output signal converted to PAL, and as long as the digital
input is MPEG2 DVB-S compatible (in the US it isn't, as I understand),
it all works.

> that's a hardware issue, not one of transmission modes or digital TV. You
> can just as easily convert analogue standards. Does that mean they've
> suddenly become immaterial? Seems not. As long as not all households are
> equipped with such a beast, it remains an issue.


>
>> I do not know about terrestrial DVB,
>

> I do. I live in an area where it's in use. Terrestrial analogue TV has been
> switched off last year, it's all digital now. (That is Berlin, Germany.
> Other major cities are to follow soon)
>
> (As opposed to the UK, where AFAIK they've had digital terrestrial for
> several years almost nationwide, but the analogue transmitters remain in
> place for a lenghty transition period)
>
> You get a box where you plug in your antenna, and out comes a conventional
> 625/50/PAL signal which you can feed to your regular TV or VCR. This is a
> necessary feature.

Well, but I was not speaking about the OUTPUT of the black box, but
about the digital signal that is being fed INTO the black box.
In an ideal world, you would take a German TV and a German DVB-T black box
(receiver, set-top box or whatever it is called) and happily use the said
combination in the USA (neglecting the issue of 230 vs. 110 volts).
I have no idea if it works, but probably it does not.

Leif Magnar Kjønnøy

unread,
Apr 2, 2004, 8:11:17 AM4/2/04
to
In article <2004040207...@agamemnon.twistedsys.net>,

Michael Ash <mi...@mikeash.com> wrote:
>
>I don't know any German, but I assume 'in different case' means that "der"
>and "dem" aren't interchangeable?

No more than "he" and "him" are interchangeable in English.

--
Leif Kjønnøy, Geek of a Few Trades. http://www.pvv.org/~leifmk
Disclaimer: Do not try this at home.
Void where prohibited by law.
Batteries not included.

James F. Cornwall

unread,
Apr 2, 2004, 10:49:41 AM4/2/04
to
Erik Max Francis wrote:
>
> Dr John Stockton wrote:
>
> > If the machine won't go back that far, make C, etc., start indexing at
> > 1
> > rather than 0 - a fertile source of error to remove.
>
> This is hardly an error -- it makes a great deal more sense from a
> computer science perspective to index from 0 rather than 1. For one
> thing, it's the lowest unsigned value of any unsigned type (whereas 1 is
> not the lowest value of any integral type), and for another, since
> array/pointer subscription works by addressing, 0 is the offset of the
> first element of an array, not 1.
>

The problem is that while it makes sense from a computer science
perspective, a very large number of programs are written by people with
engineering or geology or meteorology or ... perspectives. Most of
these people look at an array of "stuff" and perceive that the first
entry in that array is entry number 1, not entry number zero...

We look at each entry in terms of its position in the array, not how far
it's offset from the start of the array. We don't *care* that the
assembly code generated by our compiler uses offsets, regardless of what
our favorite language says.

Personally, I prefer the "starts at 1" style for high-level languages.
Offsets and starting from zero are fine at the assembly language level,
but 1-indexing works better for the majority of people who are writing
code. IME, anyway.

Jim Cornwall
(Done Fortran, C, assembler, ....)


> Some people don't like it, but indexing from 0 is not an arbitrary
> choice, it was done for a very good, and still valid, reason.
>
> --
> __ Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
> / \ San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis
> \__/ Do not stand in a place of danger trusting in miracles.
> -- (an Arab proverb)


--

****************************************************
** Facilior veniam posterius quam prius capere! **
****************************************************
** James F. Cornwall, sole owner of all opinions **
** expressed in this message... **
****************************************************

Keith Morrison

unread,
Apr 2, 2004, 11:36:30 AM4/2/04
to
Radovan Garabik <gar...@kassiopeia.juls.savba.sk> wrote:

>> The two pronunciations of "the" are,
>> unless I'm badly mistaken, completely interchangeable. Any place you can
>> say "thee" you can say "thuh", and vice-versa.
>
>Are you sure? My english is not perfect, but I guess if I spoke
>something like "/D@/ apple is growing on /Di/ tree in /Di/ garden and
>/D@/ apple is being watched by /Di/ cat" I would get at the best funny
>looks and encouraging comments about my good English pronunciation.

No, you wouldn't. I'm a native English speaker and the sentence comes
out perfectly well both ways. Using /Di/ makes it sound like you are
emphasising something or another, but no more than pronouncing it
"runnING" instead of "runnin" means those are two different words.

Michael Ash

unread,
Apr 2, 2004, 12:17:38 PM4/2/04
to
On Fri, 2 Apr 2004, Radovan Garabik wrote:

> > I don't know any German, but I assume 'in different case' means that "der"
> > and "dem" aren't interchangeable?
>
> Yes, they aren't. Similar as in English you cannot interchange "take" and
> "takes" freely (I take, he takes...). But, are "take" and "takes" two
> words or just one, but in different forms?

Got it. As to the "take" thing, I will strongly state that I have no idea.
:)

> > The two pronunciations of "the" are,
> > unless I'm badly mistaken, completely interchangeable. Any place you can
> > say "thee" you can say "thuh", and vice-versa.
>
> Are you sure? My english is not perfect, but I guess if I spoke
> something like "/D@/ apple is growing on /Di/ tree in /Di/ garden and
> /D@/ apple is being watched by /Di/ cat" I would get at the best funny
> looks and encouraging comments about my good English pronunciation.
>
> IMHO /D@/ is used before consonants, /Di/ before vowels.

I'm obviously blinded by proximity, and I see that you're right as far as
common usage goes. But I'm not sure if pronouncing it the other way is
'wrong' or just 'different'. It seems to me that this is how it's normally
said, but that it's not always said that way and it's not required.

Anyway, if you don't like "the", just switch to Chinese. :) If only they
had a reasonable writing system....

Michael Ash

unread,
Apr 2, 2004, 12:18:34 PM4/2/04
to
On Fri, 2 Apr 2004, James F. Cornwall wrote:

> Personally, I prefer the "starts at 1" style for high order languages.
> Offsets and starting from zero is fine at the assembly labguage level,
> but 1-indexing works better for the majority of people who are writing
> code. IME, anyway.

Thinking of C as a high-level language is a common mistake. It's much more
accurate to group it with assembly. :)

DJensen

unread,
Apr 2, 2004, 12:37:08 PM4/2/04
to

For something that only comes up once a century at most it's
really not a practical change to make. Besides, counting from
zero would just confuse the same stupid people -- 'how can there
be 11 eggs in a dozen? this is one egg, not zero!'

--
DJensen

Alan Lothian

unread,
Apr 2, 2004, 12:58:10 PM4/2/04
to
In article <c4j9gl$2hmj20$3...@ID-89407.news.uni-berlin.de>, Radovan
Garabik <gar...@kassiopeia.juls.savba.sk> wrote:

>
> The biggest problem

Not so. The biggest problem is that sexist, speciesist and self-styled
scientificist so-called authorities have deluded us into believing that
the "earth" goes round the "sun" and have a typical anal-retentive
obsession that a "year" must end up at the same place on each "orbit"
every time. Just because they're good at long division doesn't mean we
should be slaves to their nonsense. The year starts when the God-king
is slain over the tilled soil before the new crops are sown, as any
fule kno. And it ends when the God-King says so, OK?

--
"The past resembles the future as water resembles water" Ibn Khaldun

My .mac.com address is a spam sink.
If you wish to email me, try atlothian at blueyonder dot co dot uk

DJensen

unread,
Apr 2, 2004, 12:48:00 PM4/2/04
to
Radovan Garabik wrote:
> Michael Ash <mi...@mikeash.com> wrote:
>>On Fri, 2 Apr 2004, Radovan Garabik wrote:
>>>A German would probably be too confused by the claim that "der" and
>>>"dem" is considered to be two words (it is just one word, it means the
>>>same, just the second one is in different case)
>>
>>I don't know any German, but I assume 'in different case' means that "der"
>>and "dem" aren't interchangeable?
>
> Yes, they aren't. Similar as in English you cannot interchange "take" and
> "takes" freely (I take, he takes...). But, are "take" and "takes" two
> words or just one, but in different forms?

Conjugation creates distinct forms, thus distinct words. 'Take'
is no more the same word as 'takes' than 'suis' is the same word
as 'sommes' en francais.

--
DJensen

Hop David

unread,
Apr 2, 2004, 1:09:19 PM4/2/04
to

Wildepad wrote:
> Assume access to a non-paradoxical time travel machine.
>

> You might want to go back to a point where you can substitute
> something logical for what actually came into widespread use (after
> you become obscenely wealthy, of course).
>
> I've heard EEs saying it should have been a 200V@200cps electrical
> grid, others want all books bound at the top rather than on the side,
> and I doubt that few people would object to introducing the metric
> system 2000 years ago.
>
> Personally, I'd have made it a ten hour day, with ten waits in an
> hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
> moment, etc.
>
> What would you change and why?

I'd make the circle constant 6.2831... instead of 3.1415... That would
get rid of many annoying powers of 2 I'm always running into.
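
A small sketch of what that buys (TAU here is just a made-up name for the
proposed constant, 2*pi):

#include <stdio.h>
#include <math.h>

#define TAU 6.283185307179586   /* proposed circle constant, = 2*pi */

int main(void)
{
    double r = 3.0;

    printf("circumference = TAU * r       = %f\n", TAU * r);
    printf("quarter turn  = TAU / 4       = %f rad\n", TAU / 4);
    printf("full-turn check: sin(TAU)     = %e\n", sin(TAU));
    /* the factor of 2 resurfaces in the area formula, though: */
    printf("area          = (TAU / 2) r^2 = %f\n", (TAU / 2) * r * r);
    return 0;
}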

--
Hop David
http://clowder.net/hop/index.html

Dr John Stockton

unread,
Apr 2, 2004, 11:47:32 AM4/2/04
to
JRS: In article <8csp601f66c5phc6g...@4ax.com>, seen in
news:rec.arts.sf.science, Ben Bradley <ben_nospa...@mindspring.exa
mple.com> posted at Thu, 1 Apr 2004 23:58:16 :

>
> Where were you four or five years ago? It would have saved the
>stupid people from so vehemently disagreeing with the smart people
>about when the 21st Century started.
> I dunno if it's true, but back around y2k I heard there had been no
>disagreement that the 20th Century started in 1901.

Not so, AIUI; the German Empire got that wrong. But, for the 21st
Century, only Cuba & maybe China got it right.

--
© John Stockton, Surrey, UK. ?@merlyn.demon.co.uk Turnpike v4.00 MIME. ©
Web <URL:http://www.merlyn.demon.co.uk/> - w. FAQish topics, links, acronyms
PAS EXE etc : <URL:http://www.merlyn.demon.co.uk/programs/> - see 00index.htm
Dates - miscdate.htm moredate.htm js-dates.htm pas-time.htm critdate.htm etc.

Joshua P. Hill

unread,
Apr 2, 2004, 2:55:55 PM4/2/04
to
On Thu, 01 Apr 2004 03:04:45 GMT, George W. Harris
<gha...@mundsprung.com> wrote:

> Push-button phones are laid out like this:
>
>1 2 3
>4 5 6
>7 8 9
>* 0 #
>
> while calculators are laid out like this:
>
>7 8 9
>4 5 6
>1 2 3
>{whatever}.
>
> I'd make them the same.
>
> Trivial, I know.

That's because when Bell Labs was researching the layout of the
Touch-Tone keypad, they found that test subjects made fewer dialing
errors with the inverted layout than with the traditional adding
machine (now calculator) layout.

--

Josh

To reply by email, delete "REMOVETHIS" from the address line.

Joshua P. Hill

unread,
Apr 2, 2004, 3:23:09 PM4/2/04
to
On Wed, 31 Mar 2004 19:28:39 -0600, Wildepad <wild...@newsguy.com>
wrote:

>Assume access to a non-paradoxical time travel machine.
>
>You might want to go back to a point where you can substitute
>something logical for what actually came into widespread use (after
>you become obscenely wealthy, of course).
>
>I've heard EEs saying it should have been a 200V@200cps electrical
>grid, others want all books bound at the top rather than on the side,
>and I doubt that few people would object to introducing the metric
>system 2000 years ago.
>
>Personally, I'd have made it a ten hour day, with ten waits in an
>hour, ten minutes in a wait, ten moments in a minute, ten seconds in a
>moment, etc.
>
>
>What would you change and why?

Well, I wouldn't go with 200 Volts -- it's more efficient but more
hazardous. Rather I'd go with the Japanese 100V, which still leaves
you with 200 Volts for power-hungry appliances if you use both phases.
200 Hz, OTOH, probably makes sense -- smaller transformers (aircraft
use 400 Hz for that reason).

Hmmmm . . . I'd go to the metric system, of course, or better yet,
something based on natural units, the ones which minimize the number
of constants. And a better calendar/clock.

Our archaic English spelling would definitely go. No accents though --
problematic on a keyboard -- we probably should have a few more
characters in the alphabet.

The QWERTY keyboard is a disaster -- I'd use the Dvorak or something
similar. The Linotype keyboard was even better.

Irregular verbs and noun gender would disappear from foreign
languages, though not from English, since I already know it. For that
matter, foreign languages would disappear & we'd standardize on the
language that I happen to speak.

Capitalization: who came up with the loony rules that say that the
first letter of most but not all words is capitalized in titles? Would
the universe die a horrible death if we just capitalized every word?
For that matter, why do we have capital letters anyway? They're pretty
darn redundant. I'd do away with them and bring back thorn and ash,
just to confuse people.

Why do zeroes look like o's, ones like l's, and so forth? Lots of
silly orthographic conventions that seem to have been devised by the
writers of stylebooks and the teacher's union.

External testicles. I mean, there's a reason for them, but I can't
believe that if nature hadn't worked a bit she couldn't have come up
with something less kludgy.

The piano keyboard made lots of sense in the Renaissance, when tuning
was Pythagorean and music was modal, but it doesn't work well for the
tempered scale. If the gaps were omitted, it seems to me that you'd be
able to play in any key by using one of only two possible fingerings.
And while we're changing things, how about making a similar change to
the conventions for musical notation, which are similarly antiquated.

FM radio: they really messed that up when they introduced the pilot
tone stereo system. TV's moving on to ATSC, but we messed that up,
too: we should have gone with the system being used elsewhere in the
world (we have Zenith to "thank" in both cases . . . ). And AM radio,
of course, should be history by now.

Computer mice! I can't think of a more perverse technology. The mouse
slows you down, wastes desk space, and causes sore fingers and carpal
tunnel syndrome, as well as encouraging the development of an
interface that is at the same time perversely baroque and
underpowered. And while they're at it, I'd get rid of any icon that
doesn't have text on it -- I mean, there's a reason the world
abandoned hieroglyphics . . . And that brings up Windows: we should
have an elegant OS, something better than Unix, which while better
than Windows shows its age. And then there's the X86 architecture . .

Joshua P. Hill

unread,
Apr 2, 2004, 3:25:13 PM4/2/04
to
On Thu, 1 Apr 2004 16:49:02 +0000 (UTC), na...@mips.inka.de (Christian
Weisgerber) wrote:

>Ben Bradley <ben__b...@mindspring.example.com> wrote:
>
>> Well, there was/is the RS232 "standard" with all the various
>> combinations of baud, word length, parity, hardware or software flow
>> control, that actually make it dozens, if not hundreds or thousands,
>> or "standards."
>
>Oh, you're complaining about so-called asynchronous mode, which was
>invented sometime in the 1920s for electro-mechanical teletypes.
>That has always been a poor man's technology.

Impressive, though, that it's still in use after all these years . . .

Joshua P. Hill

unread,
Apr 2, 2004, 3:29:00 PM4/2/04
to
On Thu, 01 Apr 2004 23:58:16 -0500, Ben Bradley
<ben_nospa...@mindspring.example.com> wrote:

>In rec.arts.sf.science, DJensen <m...@no-spam-thanks.net> wrote:
>
>>LukeCampbell wrote:
>>> Wildepad wrote:

>>>> What would you change and why?
>>>

>>> I'd make sure there was a year zero.
>>
>>I don't see a practical reason for that.
>

> Where were you four or five years ago? It would have saved the
>stupid people from so vehemently disagreeing with the smart people
>about when the 21st Century started.
> I dunno if it's true, but back around y2k I heard there had been no
>disagreement that the 20th Century started in 1901.

Kinda makes you wonder if the Flynn effect is real, doesn't it . . .

Joshua P. Hill

unread,
Apr 2, 2004, 4:03:00 PM4/2/04
to
On 2 Apr 2004 11:47:57 GMT, Radovan Garabik
<gar...@kassiopeia.juls.savba.sk> wrote:

>framerate and scanlines can be rather freely modified (at least in
>European satellite DVB), at worst you are just having flicker and
>wrong aspect ratio
>
>I mean like incompatibility between USA DirecTV and MPEG2 broadcasting
>in Europe, different frequency ranges used, different ways of
>controlling LNBs (so you cannot plug an LNB bought in Japan into a
>European satellite receiver) etc...
>
>I do not know about terrestrial DVB, but I suspect situation will be
>similar - different standards not exactly on purpose, but because they
>are being developed independently ("why the heck should we copy those
>pesky <fill in your favourite nation> just because they were the
>first?")

What happened was something like this:

1. The Japanese national television network developed the original
1125 line, 60 Hz analog HDTV standard

2. They put it out for international discussion and approval

3. The French threw a wrench into the works, surprise, surprise

4. The aspect ratio, originally chosen to match the European
wide-screen movie standard of 1.66, was changed a little bit to make
it as incompatible as possible with every existing aspect ratio and
ugly to boot

5. The Americans introduced DVB and chose the Zenith 8VSB system for
modulation, with 16VSB reserved for cable use

6. The Europeans became interested in a superior standard called COFDM

7. Just about every test performed internationally favored COFDM, but
in a last-minute trial in the US 8VSB won, in part it seems because of
problems with the COFDM modulator used in the trials and in part
because American broadcast conditions are very different than those in
more densely-populated countries (COFDM is superior in urban areas
whereas 8VSB requires less power to broadcast over long distances)

8. Meanwhile, 8VSB modulators and sets were already being produced

9. Also, the Americans couldn't agree on which of several formats to
use, so they chose them all

10. Net result: both the world and the ATSC systems have too many
"standards" and are at the same time less versatile than they should
be

11. But we've been so slow to put them into use that they're pretty
much all obsolescent anyway -- compression technology has improved
since the introduction of MPEG-2.

Richard Kennaway

unread,
Apr 2, 2004, 4:47:06 PM4/2/04
to
Dr John Stockton <sp...@merlyn.demon.co.uk> wrote:
> Counting on the fingers, rather than on the fingers-and-thumbs, thus
> getting base 8; or in a binary mode in the fingers of one hand, leading
> to base 16.

We have computers to handle binary/octal/hex. For human purposes,
better to go way back to when the five-fingered plan for hands and feet
was established, and make it six-fingered instead. Result: base 12
notation, much more useful for people doing their own arithmetic.

-- Richard Kennaway

Erik Max Francis

unread,
Apr 2, 2004, 6:58:22 PM4/2/04
to
"James F. Cornwall" wrote:

> The problem is that while it makes sense from a computer science
> perspective, a very large number of programs are written by people
> with
> engineering or geology or meteorology or ... perspectives. For most
> of
> these people, we look at an array of "stuff" and perceive that the
> first
> entry in that array is entry number 1, not entry number zero...

Truth is, I haven't seen much difficulty with zero-based indices except
from people who come from a background of using languages with one-based
indices. On the other hand, I'm sure people familiar with zero-based
indices would have some initial difficulty when converting to a language
that uses one-based indices. So this really isn't a case of which is
"better," it's a case of which you're more familiar with.

Since there are actual low-level reasons to index from zero but none for
indexing from one, it sure seems a better idea to index from zero and
let the people who are used to silly languages catch up :-).
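
In C terms, the low-level reason is simply that subscription is defined
as an offset from the start of the array. A minimal illustration (the
array and its contents are arbitrary):

#include <stdio.h>

int main(void)
{
    int a[4] = {10, 20, 30, 40};

    /* a[i] is defined as *(a + i), so the first element is the one
       at offset 0 from the start of the array. */
    printf("%d %d\n", a[0], *(a + 0));   /* prints "10 10" */
    printf("%d %d\n", a[3], *(a + 3));   /* prints "40 40" */

    /* With 1-based subscripts the compiler would have to subtract 1
       from every index before forming the address. */
    return 0;
}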

--
__ Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
/ \ San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis

\__/ All men think all men mortal, save themselves.
-- Edmund Young

JJ Karhu

unread,
Apr 2, 2004, 7:03:53 PM4/2/04
to
I'd change the TV and movie standards so that there would be a single
framerate all around the world. And unify the TV systems while I'm at
it.

// JJ

Joshua P. Hill

unread,
Apr 2, 2004, 9:12:38 PM4/2/04
to
On Sat, 03 Apr 2004 03:03:53 +0300, JJ Karhu <kur...@modeemi.fi>
wrote:

>I'd change the TV and movie standards so that there would be a single
>framerate all around the world. And unify the TV systems while I'm at
>it.

That's what the Japanese wanted to do. Hell, everyone except the
French -- they complained that 60 Hz favored the 60 Hz countries, and
wanted a "compromise" that was equally far from everything -- 72 Hz or
something.

Joshua P. Hill

unread,
Apr 2, 2004, 9:17:16 PM4/2/04
to
On Fri, 02 Apr 2004 16:56:49 -0600, Wildepad <wild...@newsguy.com>
wrote:

>On Fri, 02 Apr 2004 15:23:09 -0500, Joshua P. Hill
><josh442R...@snet.net> wrote:
>
>>Our archaic English spelling would definitely go. No accents though --
>>problematic on a keyboard -- we probably should have a few more
>>characters in the alphabet.
>

>If it were me (and you can all be thankful it's not), I'd do with many
>less. Five keys on a keyboard is all you really need if you get into
>two-position duplex characters.

Sounds like it would be a lot of work to learn such an alphabet.
Better to let the keyboard decode things into a phonetically
appropriate alphabet, I think. And when they develop the alphabet,
they should Huffman code it so that the most common phonemes are
written with the fewest strokes . . .
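
To make the Huffman idea concrete, here's a minimal sketch in C. The
phoneme names and frequencies are invented for illustration, and the
O(n^2) merge is only there to keep the code short:

#include <stdio.h>

#define NSYM 6

int main(void)
{
    const char *phoneme[NSYM] = {"schwa", "t", "n", "s", "th", "zh"};
    double freq[NSYM]         = {0.30,    0.22, 0.18, 0.15, 0.10, 0.05};

    /* Node arrays: leaves 0..NSYM-1, internal nodes appended after. */
    double weight[2 * NSYM];
    int parent[2 * NSYM];
    int active[2 * NSYM];
    int n = NSYM;

    for (int i = 0; i < NSYM; i++) {
        weight[i] = freq[i];
        parent[i] = -1;
        active[i] = 1;
    }

    /* Repeatedly merge the two lightest active nodes. */
    while (1) {
        int a = -1, b = -1;
        for (int i = 0; i < n; i++) {
            if (!active[i]) continue;
            if (a < 0 || weight[i] < weight[a]) { b = a; a = i; }
            else if (b < 0 || weight[i] < weight[b]) { b = i; }
        }
        if (b < 0) break;              /* only one active node left: done */
        weight[n] = weight[a] + weight[b];
        parent[a] = parent[b] = n;
        active[a] = active[b] = 0;
        parent[n] = -1;
        active[n] = 1;
        n++;
    }

    /* Code length of a leaf = number of parent links up to the root,
       i.e. how many "strokes" that phoneme would take to write. */
    for (int i = 0; i < NSYM; i++) {
        int len = 0;
        for (int j = i; parent[j] >= 0; j = parent[j]) len++;
        printf("%-6s freq %.2f -> %d strokes\n", phoneme[i], freq[i], len);
    }
    return 0;
}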

>>Why do zeroes look like o's, ones like l's, and so forth? Lots of
>>silly orthographic conventions that seem to have been devised by the
>>writers of stylebooks and the teacher's union.
>

>I don't know if it's a myth or not, but I've always heard that a zero
>looks like a zero because it represents a hole.

Kind of cool if it's true.

>>External testicles. I mean, there's a reason for them, but I can't
>>believe that if nature hadn't worked a bit she couldn't have come up
>>with something less kludgy.
>

>From what I've heard, women are glad that God decided to try something
>different from udders.

He did?

Phillip Thorne

unread,
Apr 3, 2004, 12:33:54 AM4/3/04
to
On Wed, 31 Mar 2004, Wildepad <wild...@newsguy.com> proposed:

>Assume access to a non-paradoxical time travel machine.
>You might want to go back to a point where you can substitute
>something logical for what actually came into widespread use
>[...]

>What would you change and why?

Clocks and calendars are based on the natural movements of celestial
bodies, which tend to come in almost-but-not-quite integral ratios.
This causes me to become angry! Angry and tired. I would see to it
(declare! enjoin! compel!) that Luna orbited Earth in exactly 30
Earth-days, that Earth rotated exactly 30 times during one orbit, that
Mars had exactly twice Earth's year, and so forth. This would make
the job of armillary-sphere and orrery artisans so much easier.

These ratios have a tendency to be inconstant, however, given such
phenomena as tidal friction and conservation of angular momentum. To
compensate for these annoying-but-inevitable effects, I would declare
that the interface between Earth's oceans and sea-floors would have
zero friction.

Also, I would impose a second moon, as a momentum-sink. To avoid
confusion with the synchronized timekeeping moon ("Luna"), it would be
painted black; it would thereby provide an excitingly invisible
navigational anomaly for early intrepid space explorers.

After these fixes I would travel to the Big Bang and induce certain
regions of space-time to have universal constants just slightly-teensy
different from those prevailing in our neighborhood. This would be to
demonstrate the difference between "conditions must be exactly right"
and "conditions can vary by a small but nonzero amount;" thereby
edifying! educating! spiting! future physicists, SF authors, and
physicists-cum-SF authors who insist on misusing anthropic
cosmological principles in their narratives.

/- Phillip Thorne ----------- The Non-Sequitur Express --------------------\
| org underbase ta thorne www.underbase.org It's the boundary |
| net comcast ta pethorne site, newsletter, blog conditions that |
\------------------------------------------------------- get you ----------/

Peter Knutsen

unread,
Apr 3, 2004, 3:26:50 AM4/3/04
to

Joshua P. Hill wrote:

> <ben_nospa...@mindspring.example.com> wrote:
>> Where were you four or five years ago? It would have saved the
>>stupid people from so vehemently disagreeing with the smart people
>>about when the 21st Century started.
>> I dunno if it's true, but back around y2k I heard there had been no
>>disagreement that the 20th Century started in 1901.
>
> Kinda makes you wonder if the Flynn effect is real, doesn't it . . .

Apparently, it stopped working some years ago.

--
Peter Knutsen

Peter Knutsen

unread,
Apr 3, 2004, 3:27:47 AM4/3/04
to

Coridon Henshaw wrote:
> Microsoft is a 'standard' the world does not need.

I'd go for the entire IBM PC concept. Amigas should rule.

--
Peter Knutsen

Peter Knutsen

unread,
Apr 3, 2004, 3:30:14 AM4/3/04
to

Bryan Derksen wrote:
> Still too many units, IMO. I'd go all the way to the Swatch system;
> divide the day into 1000 "beats", each 1 minute and 26.4 seconds long.
> You can use centibeats for counting second-like intervals, and
> dekabeats or hectobeats for hour-like ones.
>
> http://www.swatch.com/internettime/home.php

I've always liked the concept for arranging online meetings when the
participants are from different time zones, but there is absolutely no
reason to use those "beats" locally.
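
For what it's worth, the conversion is trivial. A quick C sketch,
assuming the usual definition (seconds since midnight in UTC+1, divided
by 86.4):

#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);
    struct tm *utc = gmtime(&now);               /* broken-down UTC time */
    long sec = ((utc->tm_hour + 1) % 24) * 3600L /* shift to UTC+1       */
             + utc->tm_min * 60L
             + utc->tm_sec;
    printf("@%03ld\n", (long)(sec / 86.4) % 1000);
    return 0;
}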

--
Peter Knutsen

Michael Ash

unread,
Apr 3, 2004, 3:57:59 AM4/3/04
to
On Fri, 2 Apr 2004, Joshua P. Hill wrote:

> That's what the Japanese wanted to do. Hell, everyone except the
> French -- they complained that 60 Hz favored the 60 Hz countries, and
> wanted a "compromise" that was equally far from everything -- 72 Hz or
> something.

The solution should be obvious; just use a 300Hz framerate everywhere. :)

phil hunt

unread,
Apr 2, 2004, 4:11:25 PM4/2/04
to
On Fri, 02 Apr 2004 12:37:08 -0500, DJensen <m...@no-spam-thanks.net> wrote:
>Ben Bradley wrote:
>> In rec.arts.sf.science, DJensen <m...@no-spam-thanks.net> wrote:
>>>LukeCampbell wrote:
>>>>Wildepad wrote:
>>>>
>>>>>What would you change and why?
>>>>
>>>>I'd make sure there was a year zero.
>>>
>>>I don't see a practical reason for that.
>>
>> Where were you four or five years ago? It would have saved the
>> stupid people from so vehemently disagreeing with the smart people
>> about when the 21st Century started.


And the 21st century would be the one where the years are of the
form 21xx.

--
"It's easier to find people online who openly support the KKK than
people who openly support the RIAA" -- comment on Wikipedia
(Email: zen19725 at zen dot co dot uk)


phil hunt

unread,
Apr 2, 2004, 4:08:46 PM4/2/04
to
On Wed, 31 Mar 2004 22:36:48 -0500, DJensen <m...@no-spam-thanks.net> wrote:
>
>I'd want the [modern, western] calendar fixed, for sure: 10
>months of 36 days (six weeks of six days each), plus an
>intercalary week of 5 (6 with a leap day) at the end/start of
>each year[1] -- which would be set around one of the equinoxes.

Better to have 13 months of 28 days each, which is similar to the
lunar cycle (why do you think they're called months?) and is also a
whole number of weeks, so for any particular year any day-of-month
will be the same day-of-week for every month.
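
(Worked out: 13 x 28 = 364, so within any one year every month lines up
the same way and day-of-week is just ((day-of-month - 1) mod 7), counted
from whatever weekday the year starts on; park the leftover day -- two
in leap years -- outside the week, as the old International Fixed
Calendar proposal did with its "Year Day", and the alignment holds
perpetually.)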

The Year could start at the spring equinox since (at least in the
northern hemisphere, where most people live) since spring is
associated with new life. Or use the quarter-points and make the
year start on Beltane.

>
>Circles of 100 degrees rather than 360 (which is related to the
>calendar but doesn't need to be). This could give us something
>like base-10 time too, in a roundabout way.

I'd use 6400; easier for BOTE calculations of how big distant things
are (a 1 m object 1 km away subtends an angle of approximately 1
mil).
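
(Checking the arithmetic: 1 mil = 2*pi/6400 = 0.00098 radian, and
0.00098 radian x 1000 m = 0.98 m, so a 1 m object at 1 km is indeed
almost exactly 1 mil across.)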

phil hunt

unread,
Apr 2, 2004, 4:12:44 PM4/2/04
to
On 1 Apr 2004 08:52:09 GMT, Radovan Garabik <gar...@kassiopeia.juls.savba.sk> wrote:
>>
>> What would you change and why?
>
>I'll show Novial to Schleyer and Zamenhof. Or maybe Interlingua, to
>increase the chance of international acceptance.

That may well work.

Joshua P. Hill

unread,
Apr 3, 2004, 9:46:52 AM4/3/04
to
On Fri, 2 Apr 2004 22:11:25 +0100, ph...@invalid.email.address (phil
hunt) wrote:

>On Fri, 02 Apr 2004 12:37:08 -0500, DJensen <m...@no-spam-thanks.net> wrote:
>>Ben Bradley wrote:
>>> In rec.arts.sf.science, DJensen <m...@no-spam-thanks.net> wrote:
>>>>LukeCampbell wrote:
>>>>>Wildepad wrote:
>>>>>
>>>>>>What would you change and why?
>>>>>
>>>>>I'd make sure there was a year zero.
>>>>
>>>>I don't see a practical reason for that.
>>>
>>> Where were you four or five years ago? It would have saved the
>>> stupid people from so vehemently disagreeing with the smart people
>>> about when the 21st Century started.
>
>
>And the 21st century would be the one where the years are of the
>form 21xx.

Then you'd end up with a zeroeth century.

Joshua P. Hill

unread,
Apr 3, 2004, 9:47:12 AM4/3/04
to
On Sat, 03 Apr 2004 10:26:50 +0200, Peter Knutsen <pe...@knutsen.dk>
wrote:

That would certainly explain rap . . .

how...@brazee.net

unread,
Apr 3, 2004, 10:16:02 AM4/3/04
to

On 3-Apr-2004, Joshua P. Hill <josh442R...@snet.net> wrote:

> >And the 21st century would be the one where the years are of the
> >form 21xx.
>
> Then you'd end up with a zeroeth century.

Or possibly start off with one, except... we'd also have a negative zeroeth
century.

Joshua P. Hill

unread,
Apr 3, 2004, 10:17:11 AM4/3/04
to
On Sat, 3 Apr 2004 00:41:08 +0200, Wolfgang Schwanke <s...@sig.nature>
wrote:

>Joshua P. Hill <josh442R...@snet.net> wrote in
>news:rijr60t5octqro9jj...@4ax.com:


>
>> What happened was something like this:
>>
>> 1. The Japanese national television network developed the original
>> 1125 line, 60 Hz analog HDTV standard
>

>An interesting aspect is that digital TV is used for different purposes on
>both sides of the Atlantic. In the US its major application seems to be
>HDTV. In Europe, digital is mostly used to distribute a larger number of
>standard resolution programmes (625/50).

It is interesting, isn't it? Some of that I think is historical: the
Japanese were the first to develop and broadcast HDTV, after which a
number of companies in the US started devising /analog/ HDTV broadcast
standards, which involved either improvements to NTSC or a second
"helper" channel broadcast in the alternate-channel guard bands. In
the course of that, it became increasingly apparent that the analog
signal was so inefficient and corruptible that it made more sense to
switch to a digital system, but none of the HDTV system proponents
proposed one until General Instruments came out with an all-digital
proposal on the eve of submissions. Then the other major system
proponents made their systems digital as well.

Europe, meanwhile, was off working on MAC, a non-starter. I gather they
looked at the DVB activity in the States, said "this is nice," and
decided to use it without going to the expense of converting to HDTV.
Hard to say whether that made sense -- it may have sped up the
adoption curve a bit, but once you've seen HDTV, it's hard to go back
to standard definition, even digital component at 16:9. OTOH, by
ignoring the early adoption segment of the curve, Europe will probably
end up with a somewhat better system.

Seems to me an ideal system would be something along the lines of
completely variable frame rates from 16-75 Hz. and aspect ratio
variable from 1.33:1 to 2.35:1 to accommodate all existing film and
video formats, non-interlaced, primary standard (used for day-to-day
broadcasting) of 1.66:1, 1000 lines non-interlaced @ 60 Hz, COFDM in a
6 MHz channel -- achievable with contemporary compression technology
(current DVB systems all use MPEG-2, which is getting kind of long in
the tooth). International, of course. But then, I think the focus is
going to shift to IP anyway . . .

Joshua P. Hill

unread,
Apr 3, 2004, 10:27:35 AM4/3/04
to
On Sat, 03 Apr 2004 04:36:06 -0600, Wildepad <wild...@newsguy.com>
wrote:

>On Fri, 02 Apr 2004 21:17:16 -0500, Joshua P. Hill


><josh442R...@snet.net> wrote:
>
>>On Fri, 02 Apr 2004 16:56:49 -0600, Wildepad <wild...@newsguy.com>
>>wrote:
>>
>>>On Fri, 02 Apr 2004 15:23:09 -0500, Joshua P. Hill
>>><josh442R...@snet.net> wrote:
>>>
>>>>Our archaic English spelling would definitely go. No accents though --
>>>>problematic on a keyboard -- we probably should have a few more
>>>>characters in the alphabet.
>>>
>>>If it were me (and you can all be thankful it's not), I'd do with many
>>>less. Five keys on a keyboard is all you really need if you get into
>>>two-position duplex characters.
>>
>>Sounds like it would be a lot of work to learn such an alphabet.
>

>Not really -- it's both an outgrowth of and a reversion to a numbering
>system that I've been working on (well, working on off-and-on).
>
>It takes two symbols (one atop the other) to make up a letter or
>number (duplex).
>
>You can see how this works (hopefully more clearly than I can explain
>it) at:
><http://web.newsguy.com/wildepad/pk1>
>
>Since this is sort-of base 3, that gives a total of eight characters,
>but when used for text the 'meaning' of the letters changes with
>position (hence a two-position duplex system), giving 64
>quasi-syllables.
>
>How does that work? If 4=t and 6=g when in the first position, and 4=a
>and 6=o when it is in the second position (a sequence that then
>repeats):
>'46' is 'to', '46-64' is 'toga', '66-44' is 'goat'.
>
>This may look extremely complicated, especially when familiar symbols
>are being used in unfamiliar ways, but is surprisingly easy to get
>used to -- it is certainly not as difficult as Chinese characters are
>to a typical westerner.
>
>The keyboard would only need to have a space bar (between words), an
>enter (between sentences), and three characters (blank, 1, 0) [with
>the 'processor' (whether mechanical or electronic) advancing from top
>to bottom and then left to right after every press of a key and
>automatically creating a short dash between the quasi-syllables within
>each word]. If you were so inclined you could break it out to include
>all eight letters, or even one with all 64 quasi-syllables.
>
>
>If you really feel the need for punctuation (which I would like to do
>without -- the words should carry the meaning), there could be eight
>of them as characters/numbers standing alone (not combined into a
>quasi-syllable).
>
>
>If you're really interested (and there's no reason why you should be),
>the language constructed with these characters would be entirely
>logical, with no need for a dictionary -- the characters might
>represent, in order: class, number, presence, type, etc.
>
>A simplified example:
>
>if (in order of position)
>1= 'human' (class)
>1= 'single', 2 = 'two', 3 = '>2' (number)
>1 = 'unspecified', 2 = 'specified' (presence)
>1 = 'ungendered', 2 = 'male', 3 = 'female' (type)
>
>then:
>1111 = some person
>1122 = this man
>1312 = some men
>1223 = that pair of women
>etc. etc. etc.
>
>In this way you only need to break the world down into seven
>categories/classes (such as human, animal, vegetable, mineral, action
>(verbs), etc.), and maintain some consistency in 'meaning' of
>positions (for example, if 5 = verb, then a word that starts with 51
>would be a single instance (hit, kiss, etc.), 53 is something done
>regularly (eating, sleeping, etc.), and so on and so forth).
>
>In many ways, it would be like the old game of 20 questions -- each
>position refining the meaning a little bit more.
>
>
>Alas, I've never seriously gotten into determining the actual layout
>of the language. It would be a major undertaking to determine the best
>classes and what each position should mean.
>
>This is complicated by the fact that I would like to see a spectrum of
>relationships -- since the letters are also numbers, you can create
>word equations: woman + woman = trouble, man + money = fun, paint +
>house = lousy weekend.
>
>Since the letters can be written symmetrically, a form of poetry could
>develop that means one thing when read rightside-up and something
>quite different when read upside-down, or you could combine art and
>literature by using the letters as numbers for a bitmap.

Wouldn't that be hard to learn? Relatively few symbols and classes,
but each word would have to be analyzed . . .
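
To make the quoted toy example concrete, a decoder fits in a few lines
of C. Only the two mappings actually given above (4 -> t/a, 6 -> g/o)
are filled in; every other digit is left unknown:

#include <stdio.h>

/* First-position and second-position readings of a digit, per the
   mappings quoted above; anything not specified decodes as '?'. */
static char decode_digit(char d, int position)
{
    if (position == 0)
        return d == '4' ? 't' : d == '6' ? 'g' : '?';
    else
        return d == '4' ? 'a' : d == '6' ? 'o' : '?';
}

static void decode_word(const char *w)
{
    int pos = 0;
    for (; *w; w++) {
        if (*w == '-')
            continue;              /* dashes just separate quasi-syllables */
        putchar(decode_digit(*w, pos));
        pos = 1 - pos;             /* alternate first/second position */
    }
    putchar('\n');
}

int main(void)
{
    decode_word("46");      /* prints "to"   */
    decode_word("46-64");   /* prints "toga" */
    return 0;
}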

Joshua P. Hill

unread,
Apr 3, 2004, 10:29:11 AM4/3/04
to
On Sat, 3 Apr 2004 03:57:59 -0500, Michael Ash <mi...@mikeash.com>
wrote:

I was thinking that it would be more convenient to abolish France . .

Ash Wyllie

unread,
Apr 3, 2004, 8:39:48 AM4/3/04
to
James F. Cornwall opined

>Erik Max Francis wrote:
>>
>> Dr John Stockton wrote:
>>
>> > If the machine won't go back that far, make C, etc., start indexing at
>> > 1
>> > rather than 0 - a fertile source of error to remove.
>>
>> This is hardly an error -- it makes a great deal more sense from a
>> computer science perspective to index from 0 rather than 1. For one
>> thing, it's the lowest unsigned value of any unsigned type (whereas 1 is
>> not the lowest value of any integral type), and for another, since
>> array/pointer subscription works by addressing, 0 is the offset of the
>> first element of an array, not 1.
>>

>The problem is that while it makes sense from a computer science
>perspective, a very large number of programs are written by people with
>engineering or geology or meteorology or ... perspectives. For most of
>these people, we look at an array of "stuff" and perceive that the first
>entry in that array is entry number 1, not entry number zero...

>Looking at each entry in terms of its position in the array, not how far
>it's offset from the start of the array. We don't *care* that the
>assembly code generated by our compiler uses offsets regardless of what
>our favorite language says.

>Personally, I prefer the "starts at 1" style for high order languages.
>Offsets and starting from zero is fine at the assembly language level,
>but 1-indexing works better for the majority of people who are writing
>code. IME, anyway.

All of which explains why APL has the quad-IO (index origin) system
variable. Good old PL/I and Pascal (among others) have arrays where
both upper and lower bounds can be declared.

Choice is nice.

>> Some people don't like it, but indexing from 0 is not an arbitrary
>> choice, it was done for a very good, and still valid, reason.
>>


-ash
Cthulhu for President!
Why vote for a lesser evil?

Ash Wyllie

unread,
Apr 3, 2004, 8:47:52 AM4/3/04
to
Peter Knutsen opined

>Coridon Henshaw wrote:
>> Microsoft is a 'standard' the world does not need.

>I'd go for the entire IBM PC concept. Amigas should rule.

Hear, hear.

Karl M. Syring

unread,
Apr 3, 2004, 11:34:28 AM4/3/04
to
Michael Ash <mi...@mikeash.com> wrote in message news:<2004040212...@agamemnon.twistedsys.net>...

> On Fri, 2 Apr 2004, James F. Cornwall wrote:
>
> > Personally, I prefer the "starts at 1" style for high order languages.
> > Offsets and starting from zero is fine at the assembly language level,
> > but 1-indexing works better for the majority of people who are writing
> > code. IME, anyway.
>
> Thinking of C as a high-level language is a common mistake. It's much more
> accurate to group it with assembly. :)

Specifically, it is PDP-7 assembler. Interestingly, there are
sanitized versions of C, like Necula's CIL
(http://manju.cs.berkeley.edu/cil).

Karl M. Syring

Keith Morrison

unread,
Apr 3, 2004, 1:00:49 PM4/3/04
to
Joshua P. Hill <josh442R...@snet.net> wrote:

>>>> Where were you four or five years ago? It would have saved the
>>>>stupid people from so vehemently disagreeing with the smart people
>>>>about when the 21st Century started.
>>>> I dunno if it's true, but back around y2k I heard there had been no
>>>>disagreement that the 20th Century started in 1901.
>>>
>>> Kinda makes you wonder if the Flynn effect is real, doesn't it . . .
>>
>>Apparently, it stopped working some years ago.
>
>That would certainly explain rap . . .

I have to admit I thought much the same thing but my appreciation of
rap has gone up over the last few years. Sturgeon's Law is in full
effect, of course, but there are some examples of some really powerful
stuff out there.

--
Keith

Joshua P. Hill

unread,
Apr 3, 2004, 1:54:06 PM4/3/04
to

I'm sure you're right, but it's more fun to make fun of it. :-)

DJensen

unread,
Apr 3, 2004, 2:40:09 PM4/3/04
to
Wildepad wrote:
> Not really -- it's both an outgrowth of and a reversion to a numbering
> system that I've been working on (well, working on off-and-on).
>
> It takes two symbols (one atop the other) to make up a letter or
> number (duplex).
>
> You can see how this works (hopefully more clearly than I can explain
> it) at:
> <http://web.newsguy.com/wildepad/pk1>

These pages need a legend of some kind -- what are the circular
shapes? the rectangles? the spotted rectangles? the big black
diamonds? the diamonds with circles in them? I get that the
diamondoid line-shapes are the symbols though... I think.

[cut]


> The keyboard would only need to have a space bar (between words), an
> enter (between sentences), and three characters (blank, 1, 0) [with
> the 'processor' (whether mechanical or electronic) advancing from top
> to bottom and then left to right after every press of a key and
> automatically creating a short dash between the quasi-syllables within
> each word]. If you were so inclined you could break it out to include
> all eight letters, or even one with all 64 quasi-syllables.
>
> If you really feel the need for punctuation (which I would like to do
> without -- the words should carry the meaning), there could be eight
> of them as characters/numbers standing alone (not combined into a
> quasi-syllable).

There's a pretty good reason that punctuation isn't part of a
word's meaning -- <cat> and <cat,> mean the same thing, but the
common 'cat' element doesn't tell you where there should be a
pause, and there isn't necessarily a pause every time you
encounter it. How do you get each word capable of conveying the
parsing suggestion/instruction without getting <cat>,
<pause-cat>, <fullstop-cat>, <exclamation-cat>,
<interrogative-cat>, etc?

[cut]
> I'm not sure if the language I propose would have any letter used with
> greater frequency than any other.
>
> In any event, when writing out a number under my system, it is not
> necessary to lift the pen from the paper after each digit, and words
> can be written in one flourish instead of needing two strokes for some
> letters as we have to contend with.

The problem I see with that, and I've seen with a number of
artificial languages/alphabets (see the invented languages
sections at www.omniglot.com -- specifically Tenctonese, Mesa) is
that when your letters/symbols look so much alike or are all
composed of the same set of components, linked letters easily
start to become other letters by accident because there's
no/little visual break between them. If this symbol system is
meant to be easily written, haste and repetition will erase any
subtle precautions you've built in to prevent that. (I know most
of my vowels disappear when I write, and that's in English.)

If you haven't already, check out 12480:
http://www.omniglot.com/writing/12480.htm

I'm impressed/intrigued by your idea though. I've also been
working on a language/symbol set for a WIP, but it's not meant to
be hand-written.

--
DJensen

DJensen

unread,
Apr 3, 2004, 2:52:34 PM4/3/04
to
phil hunt wrote:
> On Wed, 31 Mar 2004 22:36:48 -0500, DJensen <m...@no-spam-thanks.net> wrote:
>
>>I'd want the [modern, western] calendar fixed, for sure: 10
>>months of 36 days (six weeks of six days each), plus an
>>intercalary week of 5 (6 with a leap day) at the end/start of
>>each year[1] -- which would be set around one of the equinoxes.
>
> Better to have 13 months of 28 days each, which is similar to the
> lunar cycle (why do you think they're called months?) and is also a
> whole number of weeks, so for any particular year any day-of-month
> will be the same day-of-week for every month.

Unless you're going to anchor the year to the lunar cycle, why
bother? If we're changing the number and lengths of months, why
retain the 7-day week? As I replied elsewhere, a 36-day month
also has the same day-of-week benefit (for each 4-year period).

> The Year could start at the spring equinox since (at least in the
> northern hemisphere, where most people live) since spring is
> associated with new life. Or use the quarter-points and make the
> year start on Beltane.

If your 13-month year is based on the moon (and if it's not it's
just as arbitrary as any other suggestion made so far) you're
going to run into trouble pretty quickly if you base any
significant dates (such as the end/start of the year) on the
solar cycle.

--
DJensen

how...@brazee.net

unread,
Apr 3, 2004, 4:27:00 PM4/3/04
to

On 3-Apr-2004, DJensen <m...@no-spam-thanks.net> wrote:

> Unless you're going to anchor the year to the lunar cycle, why
> bother? If we're changing the number and lengths of months, why
> retain the 7-day week? As I replied elsewhere, a 36-day month
> also has the same day-of-week benefit (for each 4-year period).

The interesting time period to me is the week. It appears to be entirely
the result of Genesis, but seems to have spread world wide. Did other
societies have a similar time cycle?


Also - what measurements of time did people use before the 24 hour clock
became standard?

Bernard Peek

unread,
Apr 3, 2004, 5:01:06 PM4/3/04
to
In message <E_Fbc.14568$lt2....@newsread1.news.pas.earthlink.net>,
how...@brazee.net writes

I've always wondered whether other countries use different units of
time. Does China use the second?


--
Bernard Peek
London, UK. DBA, Manager, Trainer & Author. Will work for money.

DJensen

unread,
Apr 3, 2004, 5:25:33 PM4/3/04
to
how...@brazee.net wrote:
> On 3-Apr-2004, DJensen <m...@no-spam-thanks.net> wrote:
>
>
>>Unless you're going to anchor the year to the lunar cycle, why
>>bother? If we're changing the number and lengths of months, why
>>retain the 7-day week? As I replied elsewhere, a 36-day month
>>also has the same day-of-week benefit (for each 4-year period).
>
>
> The interesting time period to me is the week. It appears to be entirely
> the result of Genesis, but seems to have spread world wide. Did other
> societies have a similar time cycle?

http://en.wikipedia.org/wiki/Week

It seems to be a more or less global standard, due to the
visibility of the sun, moon, Mars, etc. Sumer had it before the
Jews, but apparently the Chinese and Japanese arrived at it
independently of both civilizations.

> Also - what measurements of time did people use before the 24 hour clock
> became standard?

The medieval canonical hours clock didn't have hours of equal
lengths, and it appears that the day started around 6am rather
than midnight:

http://en.wikipedia.org/wiki/Canonical_hours

It's the only reference I noticed to a clock that didn't measure
hours as we do... I'd bet there's probably a traditional Muslim
time keeping system and far eastern clocks that predate/differ
from the modern 24-hours.

Taking a leap, I'd say we probably owe some of this 12-hour-day
idea to the Babylonians too.

--
DJensen

Leif Magnar Kj|nn|y

unread,
Apr 3, 2004, 6:30:43 PM4/3/04
to
In article <nHhbc.20435$j57.1...@news20.bellglobal.com>,
DJensen <m...@no-spam-thanks.net> wrote:
>Radovan Garabik wrote:
>>
>> Yes, they aren't. Similar as in English you cannot interchange "take" and
>> "takes" freely (I take, he takes...). But, are "take" and "takes" two
>> words or just one, but in different forms?
>
>Conjugation creates distinct forms, thus distinct words.

No no, different forms of the same verb. You won't find "takes" as
a separate entry in the dictionary.

--
Leif Kjønnøy, Geek of a Few Trades. http://www.pvv.org/~leifmk
Disclaimer: Do not try this at home.
Void where prohibited by law.
Batteries not included.

Joshua P. Hill

unread,
Apr 3, 2004, 6:33:30 PM4/3/04
to
On Sat, 03 Apr 2004 17:25:33 -0500, DJensen <m...@no-spam-thanks.net>
wrote:

>The medieval canonical hours clock didn't have hours of equal
>lengths, and it appears that the day started around 6am rather
>than midnight:
>
>http://en.wikipedia.org/wiki/Canonical_hours
>
>It's the only reference I noticed to a clock that didn't measure
>hours as we do... I'd bet there's probably a traditional Muslim
>time keeping system and far eastern clocks that predate/differ
>from the modern 24-hours.
>
>Taking a leap, I'd say we probably owe some of this 12-hour-day
>idea to the Babylonians too.

Found this:

Temporal hours were of little use to astronomers, and around 127 BCE
Hipparchus of Nicaea, working in the great city of Alexandria,
proposed dividing the day into 24 equinoctial hours. These equinoctial
hours, so called because they are based on the equal length of day and
night at the equinox, split the day into equal periods. (Despite his
conceptual advance, ordinary people continued to use temporal hours
for well over a thousand years: the conversion to equinoctial hours in
Europe was made when mechanical, weight driven clocks were developed
in the fourteenth century.)

The division of time was further refined by another Alexandria-based
philosopher, Claudius Ptolemaeus, who divided the equinoctial hour into
60 minutes, inspired by the scale of measurement used in ancient
Babylon.

http://africanhistory.about.com/library/weekly/aa033101b.htm

phil hunt

unread,
Apr 3, 2004, 3:23:33 PM4/3/04
to
On Sat, 03 Apr 2004 09:46:52 -0500, Joshua P. Hill <josh442R...@snet.net> wrote:
>>>>
>>>> Where were you four or five years ago? It would have saved the
>>>> stupid people from so vehemently disagreeing with the smart people
>>>> about when the 21st Century started.
>>
>>And the 21st century would be the one where the years are of the
>>form 21xx.
>
>Then you'd end up with a zeroeth century.

Of course; what other sensible name is there for the years 0-99 ?

Erik Max Francis

unread,
Apr 3, 2004, 6:51:12 PM4/3/04
to
phil hunt wrote:

> Of course; what other sensible name is there for the years 0-99 ?

Well for one thing, zeroth isn't a proper ordinal number.

For another, that means the preceding century ranges from -100 to -1.
What years do you call those, the minus zeroth century?

Note what happens either way -- the negative centuries all have the last
year in the century end in 1. So you're right back to the "confusing"
thing that you were trying to solve, it just happens in the BC years.

--
__ Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
/ \ San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis
\__/ Suffering is a journey which has an end.
-- Matthew Fox

Paul Colquhoun

unread,
Apr 3, 2004, 7:21:27 PM4/3/04
to
On Fri, 02 Apr 2004 14:30:27 -0600, Wildepad <wild...@newsguy.com> wrote:
| On Fri, 2 Apr 2004 15:49:41 GMT, "James F. Cornwall"

| <JCor...@cox.net> wrote:
|
|>Erik Max Francis wrote:
|>>
|>> Dr John Stockton wrote:
|>>
|>> > If the machine won't go back that far, make C, etc., start indexing at
|>> > 1
|>> > rather than 0 - a fertile source of error to remove.
|>>
|>> This is hardly an error -- it makes a great deal more sense from a
|>> computer science perspective to index from 0 rather than 1. For one
|>> thing, it's the lowest unsigned value of any unsigned type (whereas 1 is
|>> not the lowest value of any integral type), and for another, since
|>> array/pointer subscription works by addressing, 0 is the offset of the
|>> first element of an array, not 1.
|>>
|>
|>The problem is that while it makes sense from a computer science
|>perspective, a very large number of programs are written by people with
|>engineering or geology or meteorology or ... perspectives.
|
| Ah -- now there is the problem!
|
| <rant>
|
| Programs should be written by people who know and understand
| programming.
|
| Programmers don't routinely think they can design bridges, locate oil
| pockets, or predict the weather, but people from other disciplines all
| want to write programs and then complain when they have trouble doing
| it or the code doesn't run 'right' (i.e. it does what it was written
| to do and not what the pseudo-programmer thought it should).
|
| </rant>


I've always thought that a programming language should make it easy to
write a program that matches the problem space, not force you to adapt
the problem to match the language.

On the array front, the Pascal approach is very good. When you define
an array, you give the lowest and highest (or starting and ending, actually)
indexes, rather than assuming they start from 0 and giving the length.
This makes it easy to use when you need to store information about data
that naturally ranges between, say, 1000 and 2000 without having to
subtract the offset all the time.

Pascal also allows you to use any ordered, discrete type (characters,
enumerated types, etc) as array indexes.

This applies to a range of languages similar to Pascal, like Modula and Ada.

Starting all arrays at 0 is just to make the compiler writer's job easier,
not to help the programmers using the language.
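
The nearest C gets is doing the offset by hand, usually hidden behind a
macro. A minimal sketch -- the array name and bounds are just examples,
and there is none of Pascal's compile-time range checking:

#include <stdio.h>

/* Data naturally indexed by year, 1000..2000 inclusive. */
#define LO 1000
#define HI 2000

static int population[HI - LO + 1];

/* Shift the caller's "natural" index down into 0..(HI-LO). */
#define YEAR(i) population[(i) - LO]

int main(void)
{
    YEAR(1066) = 42;
    YEAR(2000) = 7;
    printf("%d %d\n", YEAR(1066), YEAR(2000));
    return 0;
}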


--
Reverend Paul Colquhoun, ULC. http://andor.dropbear.id.au/~paulcol
Asking for technical help in newsgroups? Read this first:
http://catb.org/~esr/faqs/smart-questions.html#intro

DJensen

unread,
Apr 3, 2004, 7:15:44 PM4/3/04
to
phil hunt wrote:

> On Sat, 03 Apr 2004 09:46:52 -0500, Joshua P. Hill <josh442R...@snet.net> wrote:
>
>>>>> Where were you four or five years ago? It would have saved the
>>>>>stupid people from so vehemently disagreeing with the smart people
>>>>>about when the 21st Century started.
>>>
>>>And the 21st century would be the one where the years are of the
>>>form 21xx.
>>
>>Then you'd end up with a zeroeth century.
>
> Of course; what other sensible name is there for the years 0-99 ?

The first century. Oh wait--

--
DJensen

DJensen

unread,
Apr 3, 2004, 7:19:38 PM4/3/04
to
Leif Magnar Kj|nn|y wrote:

> In article <nHhbc.20435$j57.1...@news20.bellglobal.com>,
> DJensen <m...@no-spam-thanks.net> wrote:
>
>>Radovan Garabik wrote:
>>
>>>Yes, they aren't. Similar as in English you cannot interchange "take" and
>>>"takes" freely (I take, he takes...). But, are "take" and "takes" two
>>>words or just one, but in different forms?
>>
>>Conjugation creates distinct forms, thus distinct words.
>
> No no, different forms of the same verb. You won't find "takes" as
> a separate entry in the dictionary.

You will find it has its own definition though.

Takes a looks at these dictionary entry for those word, then
tried to seeing how them was differently word by reads those
sentence here.

--
DJensen

J

unread,
Apr 3, 2004, 8:56:32 PM4/3/04
to
below...

> >The interesting time period to me is the week. It appears to be
entirely
> >the result of Genesis, but seems to have spread world wide. Did other
> >societies have a similar time cycle?

I recently looked into this... a 7-day week is more than just an invention
of one culture; it's too global, coming up in independent societies. But
at the same time it is not the only period that could be considered a week;
even today we have the 5-day business week. In ancient times week periods
ranged from 4-10 days [though 8 and 9 were absent in examples I found] --
but all of these were based on commerce, not really calendar weeks, and in
some cases a 7-day week went right alongside the business week.

There are only two explanations for the pandemic appearance of a 7-day week,
IMO:
1) a common ancestry, or subconscious thought;
a) via Genesis
b) via an Atlantis/Tower of Babel [which again is Genesis] type source
2) an ancient astronomical or other observable phenomenon that had a 7-day
cycle;
a) that doesn't exist in modern times
b) that goes unnoticed by modern minds

In either case, the matter is really moot. Whoever invented it, or why it
was invented, is lost to history. However, the reason why it is used today is
clear as a bell: Christians inherited the week from the Jews, and when
Constantine became a Christian the pattern for the rest of Western history
was set. It doesn't matter, religiously speaking, that the 7-day week
existed before Moses or even Abraham; technically speaking they don't see it
as their invention, only that it existed since the beginning and they
[speaking of the Jews] just continued using what God had already instituted.

Later, J


J

unread,
Apr 3, 2004, 8:59:06 PM4/3/04
to
below...

"Erik Max Francis" <m...@alcyone.com> wrote in message
news:406F4DF0...@alcyone.com...


> Note what happens either way -- the negative centuries all have the last
> year in the century end in 1. So you're right back to the "confusing"
> thing that you were trying to solve, it just happens in the BC years.

2 BC, 1 BC, 0 BC, 0 AD, 1 AD, 2 AD....

1st century BC, 0th century BC, 0th century AD, 1st century AD...

This must fly in the face of all logic... no wait, it doesn't, just defies
all convention.

Later, J


Terrafamilia

unread,
Apr 3, 2004, 10:06:40 PM4/3/04
to


> What would you change and why?

With all the talk about languages, writing/typing systems, and computers,
I wonder if Marain, the language of Iain Banks's Culture, would be feasible.

http://homepages.compuserve.de/Mostral/artikel/marain.html

Ciao,

Terrafamilia


Vadim S Kaplunovsky

unread,
Apr 3, 2004, 11:04:25 PM4/3/04
to
In article <406CFCC9...@alcyone.com>,

Erik Max Francis <m...@alcyone.com> wrote:
>Dr John Stockton wrote:
>
>> If the machine won't go back that far, make C, etc., start indexing at
>> 1
>> rather than 0 - a fertile source of error to remove.
>
>This is hardly an error -- it makes a great deal more sense from a
>computer science perspective to index from 0 rather than 1. For one
>thing, it's the lowest unsigned value of any unsigned type (whereas 1 is
>not the lowest value of any integral type), and for another, since
>array/pointer subscription works by addressing, 0 is the offset of the
>first element of an array, not 1.
>
>Some people don't like it, but indexing from 0 is not an arbitrary
>choice, it was done for a very good, and still valid, reason.
>
Starting array indices from 0, 1 or arbitrary offsets is a non-issue.
Any programmer that graduated from diapers can take care of offsets
without bothering to wake up :-).

The real problems with C arrays are: (1) Conflation of arrays, strings
and pointers; and (2) Pointer arithmetic. Between these two features,
out-of-bounds errors became pretty much un-controllable. Add to this
(3) Standard I/O functions using (1) and (2) and relying on the
*calling* function to check string-space availability, and you get
a really ugly mess breeding bugs to this day.
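
The classic illustration of (1)-(3) together is string handling, where
the callee has no idea how big the caller's buffer is. A minimal sketch
(buffer size arbitrary):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char buf[8];

    /* strcpy() trusts the caller to have checked the space; nothing
       stops this from overrunning buf:
           strcpy(buf, "this string is far too long");
       The defensive idiom bounds the copy and terminates by hand: */
    strncpy(buf, "this string is far too long", sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';

    /* Same story for input: gets() cannot be used safely at all,
       while fgets() at least takes the buffer size. */
    if (fgets(buf, sizeof buf, stdin) != NULL)
        printf("read: %s\n", buf);

    return 0;
}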

Standard Pascal has the opposite problems: (1) No way to make a pointer to
an existing object; and (2) no good way to deal with strings. Problem (2)
is solved by most modern extensions of Pascal, but every extension does
it differently, and that creates a problem of its own.

Historically, C was designed for kernel programming where run-time
out-of-bounds error-checking would not work anyway, so K&R didn't
bother facilitating such checking for other programs. And Pascal
was designed for teaching structured programming to CS freshmen,
so N.Wirth didn't bother making it useful for actual programming.
Both were sensible decisions at the time, and we have been suffering
the consequences ever since :-(.

--Vadim.
--
*******************************************************************
Vadim S. Kaplunovsky, | va...@physics.utexas.edu
Professor of Physics, | #include <std_disclaimer.h>
University of Texas at Austin. | #excuse bad_typing.

DJensen

unread,
Apr 3, 2004, 11:23:37 PM4/3/04
to
Wildepad wrote:
> On Sat, 03 Apr 2004 14:40:09 -0500, DJensen <m...@no-spam-thanks.net>
> wrote:

>>Wildepad wrote:
>>>You can see how this works (hopefully more clearly than I can explain
>>>it) at:
>>><http://web.newsguy.com/wildepad/pk1>
>>
>>These pages need a legend of some kind -- what are the circular
>>shapes? the rectangles? the spotted rectangles? the big black
>>diamonds? the diamonds with circles in them? I get that the
>>diamondoid line-shapes are the symbols though... I think.
>
> The whole idea behind the pages is that they are a grade-school primer
> found on an alien planet. Everyone accepts that science would be a
> universal language, a common denominator that would provide the
> foothold to understanding an alien race -- this is a sort of test to
> show how readily recognizable the most basic science is when presented
> in an unusual manner.

Then it's flying well over my head, because I don't get any of
the symbolism nor do I see any relation to science. But then I'm
no scientist.

>>There's a pretty good reason that punctuation isn't part of a
>>word's meaning -- <cat> and <cat,> mean the same thing, but the
>>common 'cat' element doesn't tell you where there should be a
>>pause, and there isn't necessarily a pause everytime you
>>encounter it. How do you get each word capable of conveying the
>>parsing suggestion/instruction without getting <cat>,
>><pause-cat>, <fullstop-cat>, <exclamation-cat>,
>><interrogative-cat>, etc?
>

> By proper writing. If someone is screaming as a feline, then the
> 'cat!' would be understood; if the sentence starts with 'Where is'
> then 'cat?' should be understood; etc. etc. etc. ad infinitum ad
> nauseum.

If you want to restrict it so questions must be asked using
particular words. The question mark shorthands that for us
though, to the point that you can turn any assertive statement
into a question with just a mark above the period, no need for
'who' or 'when' or the rest.

How does one scream on a clay tablet though? I'm not trying to
knock your idea, but it sounds like any deviation from a
straightforward sentence will become repetitive because it must
be phrased a certain way and include certain words.

>>I'm impressed/intrigued by your idea though. I've also been
>>working on a language/symbol set for a WIP, but it's not meant to
>>be hand-written.
>

> The reason I started with simple, straight lines is so that they could
> easily be impressed onto clay tablets or chiseled in stone.

I've started with simple lines (horizontal, vertical, four
curves) because I want them to be easily recognized while
carrying the most information in the fewest strokes, but I'm
designing the symbols for a society that no longer physically writes.

ObUrbanLegend(?): The international peace symbol was designed, in
part, so that it could be pressed into clay tablets to survive a
nuclear war.

--
DJensen

Erik Max Francis

unread,
Apr 4, 2004, 12:07:14 AM4/4/04
to
J wrote:

> 2 BC, 1 BC, 0 BC, 0 AD, 1 AD, 2 AD....
>
> 1st century BC, 0th century BC, 0th century AD, 1st century AD...

While there's no reason you can't have a year labeled zero after its
invention, it really still does not make sense to talk about a "zeroth
century." The "th" numbers are cardinal numbers and begin with one
(first). There isn't one before that, that's first.

You can't label these new years AD and BC, because they're not AD and BC
anymore. (J's AD and J's BC -- JAD and JBC -- are fine.)

> This must fly in the face of all logic... no wait, it doesn't, just
> defies
> all convention.

It comes awfully close to defying all logic. There are two year zeroes?
If you're interested in having years labelled with integers, instead of
skipping a number, it duplicates it! How is that better? Everyone will
still make all the exact same errors, they'll just do it the other way.

As I said earlier, there is nothing preventing you from remapping the
years so that 2 AD = 2, 1 AD = 1, 1 BC = 0, 2 BC = -1, and so on. You
can do this and it's merely a notational change, rather than redefining
the calendar. What you're doing is actually redefining things.

There's a very good reason that the Gregorian calendar doesn't contain a
year zero -- it's because it wasn't invented (well, exposed to
Westerners enough) yet! That can be remapped away with a new notational
system, but you really can't get rid of the fact that the first century
is supposed to be the _first_, and there can't be a "zeroth" before
that.

--
__ Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
/ \ San Jose, CA, USA && 37 20 N 121 53 W && AIM erikmaxfrancis

\__/ Sit loosely in the saddle of life.
-- Robert Louis Stevenson
