
7-bit character sets and shell


Timo Lehtinen

Oct 15, 1990, 11:18:02 AM
In article <keld.655930411@dkuugin> ke...@login.dkuug.dk (Keld J|rn Simonsen) writes:
>
> ...lines removed...
>
>However, in a 7-bit environment there do not seem to be any simple
>solution. One way of doing it is to provide an alternate representation
>for the special ASCII characters, including #$@[\]^`{|}~ .

No, not trigraphs in the shell too ? Aaaaarrrrggghhhhh.....
Please leave our precious special characters the way they are.

Timo

--
____/ ___ ___/ / Kivihaantie 8 C 25
/ / / SF-00310 HELSINKI, Finland
____ / / / Phone: +358 0 573 161, +358 49 424 012
Stream Technologies Inc. Fax: +358 0 571 384

Anssi Porttikivi

Oct 15, 1990, 4:27:05 PM
In <1990Oct15.1...@sti.fi>
t...@sti.fi (Timo Lehtinen) writes:

>Keld J|rn Simonsen writes


>>However, in a 7-bit environment there do not seem to be any simple
>>solution. One way of doing it is to provide an alternate representation
>>for the special ASCII characters, including #$@[\]^`{|}~ .
>
>No, not trigraphs in the shell too ? Aaaaarrrrggghhhhh.....
>Please leave our precious special characters the way they are.

Multigraphs, i.e. multi-character words in place of the traditional
one-character codes, are not feasible in practice. I guess what Keld
means is to make all Unix system software accept not only the current
characters in their special meanings, like the pipe character (the
vertical bar, which in the national 7-bit sets is the Scandinavian o
with two dots), but also alternative characters whose codes are not
used for national purposes.

A somewhat questionable solution would be to establish a convention
that the special characters are defined in environment variables, IF
all software would read them from there. Then you could define the
exclamation mark to be the pipe character and use Scandinavian o's
with dots in filenames. If there were software around that did not
respect the environment, you would simply set the environment back to
the old values and work the way you do now, so that environment-obeying
and non-obeying software would not confuse each other.

You could start by just modifying the shell to obey these environment
settings. Or first we could try running the shell behind a pipe,
somewhat like this:

"program to parse the command line like the shell but according
to the environment and translate the defined new special
characters to their old equivalents and the old characters to
their escaped versions" | sh
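A minimal sketch of that preprocessor idea, assuming the alternate pipe character comes from an environment variable (ALT_PIPE is an invented name) and is naively translated with sed before the line reaches sh:

```shell
# Hypothetical sketch of the 'preprocessor | sh' idea. ALT_PIPE is an
# invented variable name; a real version would also have to handle
# quoting and escape the old special characters, which this does not.
ALT_PIPE="${ALT_PIPE:-!}"

# Translate every occurrence of the alternate character back to '|'
# and hand the rewritten command line to the real shell.
echo "seq 3 $ALT_PIPE wc -l" | sed "s/$ALT_PIPE/|/g" | sh
```

Even this toy version shows the weak point: the translation happens before the shell's own quoting rules apply, so any literal use of the alternate character is clobbered.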

But the only real solution is to use a larger standard character
set. The user community should set a target date, say 1995, when
all organisations switch to 8-bit character sets and to them only!
Personally I think software should be character-code independent,
i.e. able to use any font or character set, as is usual in the Mac
environment. This means that characters should not be used as
commands but only as data, or at least all special characters should
always be configurable.

Eric Thomas SUNET

Oct 16, 1990, 6:53:26 PM
I have always been curious about something: what on earth is the purpose of
7-bit ASCII??? I am faced with a serious yet completely ridiculous problem at
the moment. I am using a VT320 connected to a Un*x box in order to telnet to
a VMS machine. I would like to run my VT320 in VT320 mode, not in VT100 mode.
Unfortunately, VT320 mode implies 8-bit ASCII, and the shell I am using refuses
to stty cs8. Well, it doesn't really refuse to do it; it accepts the command
with no error message but quietly resets to 7 bits afterwards to correct my
"mistake". One of the other shells doesn't do that, but then the telnet command
is kind enough to correct my mistake (again, without bothering to inform me),
so it really doesn't make any difference: I'm just stuck in VT100 mode.
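The settings in question are the standard ones for an 8-bit clean terminal line; a sketch, assuming a stty that supports these flags:

```shell
# Standard settings for an 8-bit clean line on a Unix that permits it:
# 8 data bits, no parity generation, don't strip the high bit on input.
stty cs8 -parenb -istrip
stty -a | grep cs8        # check whether the setting actually stuck
```

The second command is the giveaway: a shell or telnet that "corrects" you will show cs7 again.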

EBCDIC might be a plague, but at least there's one thing they didn't do wrong:
they didn't come up with a 7-bit EBCDIC (with only 7 bits you wouldn't even be
able to write 'HELLO WORLD', let alone 'hello world' :-) ). They never made
terminals where you have to try 25 parity/delay/stop-bit and speed combinations
before the thing deigns to print the login prompt. I have an 8-year-old dusty
3278 next to my VT320; it runs about 5 times faster than the VT320's maximum
speed of 19200 bps, doesn't even HAVE a setup mode where you might have to
wonder what parity/speed/xon/stop bits/delay you need, and I never saw it print
garbage on the screen, despite the use of an 8-bit character set. So, can
someone give me a technical reason why 7-bit ASCII is still in use???

Eric

Michel Wurtz

Oct 18, 1990, 9:21:14 AM
in article <22...@sunic.sunet.se>, er...@sunic.sunet.se (Eric Thomas SUNET) says:
>
> I have always been curious about something: what on earth is the purpose of
> 7-bit ASCII??? I am faced with a serious yet completely ridiculous problem at
> the moment. I am using a VT320 connected to a Un*x box in order to telnet to
> a VMS machine...
> ...
> ... So, can someone give me a technical reason why 7-bit ASCII is still
> in use???
I was faced with the same problem in our VMS+Unix computer environment.
The technical reason (which is also a historical reason) is that the 8th bit
is generally used for purposes other than multinational character sets (at
least in the original implementation of Unix): quoting in the shell and in vi,
and parity for transmission.
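As a sketch of that quoting use: the old Bourne shell marked a quoted character internally by setting its top bit, so an incoming byte that already had the 8th bit set could be mistaken for a quoted 7-bit character. The arithmetic:

```shell
# Sketch: marking a character as "quoted" by setting bit 8, as the
# old Bourne shell did internally. A quoted '*' (0x2A) becomes 0xAA,
# which collides with legitimate 8-bit text such as ISO 8859-1.
quoted=$(( 0x2A | 0x80 ))
printf '%d\n' "$quoted"
```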

In your case, even with 8-bit shells and editors, you have *NO* chance
of using 8 bits on your VMS machine, because the telnet protocol allows only
7-bit ASCII... (I suspect *US cultural imperialism* from the
designing staff in Berkeley :-)!

The only solution I found was to use an RS232 line between
our Unix and our VMS machine and software like Kermit (the 4C version
supports 8-bit ASCII) for remote login. File transfer with ftp is less
(but still) subject to problems.
--
, ,
/| /| Michel Wurtz m...@ign.uucp
/ |/ |. _ |_ _| Projet Semio
/ ||(_ | )(-| Tel: +33 1 43988103
Fax: +33 1 43988031

Institut Geographique National 2, Av Pasteur BP 68 F-94160 St Mande

David Wright

Oct 18, 1990, 5:48:58 PM
Meanwhile - someone had better fix sendmail, SMTP and 49 other things to
make them 8-bit clean. So far as I know most current mail systems strip
the 8th bit as a matter of course, and I think much news transport does too.
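What that stripping does to ISO 8859-1 text can be shown with tr, mapping every high byte down by 128:

```shell
# Simulate a mail relay that strips the 8th bit: every byte in the
# range 0200-0377 is mapped to its 7-bit counterpart, so the
# ISO 8859-1 'å' (octal 345) silently turns into 'e' (octal 145).
printf 'p\345 Unix\n' | tr '\200-\377' '\000-\177'
```

The output is "pe Unix" where "på Unix" was sent, and nothing along the way reports an error.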

No doubt some UNIX systems have 8-bit versions but until they are in general
use 8-bit text will be routinely munged when transmitted. Of course we
could all switch to X.400 for everything.......

Regards, "None shall be enslaved by poverty, ignorance or conformity"
David Wright STL, London Road, Harlow, Essex CM17 9NA, UK
d...@stl.stc.co.uk <or> ...uunet!mcsun!ukc!stl!dww <or> PSI%234237100122::DWW
<or> /g=David/s=Wright/org=STC Technology Ltd/prmd=STC plc/admd=Gold 400/co=GB

Michel Wurtz

Oct 24, 1990, 4:48:40 AM
In article <18...@hulda.erbe.se>, p...@erbe.se (Robert Claeson) writes:
> Have you tried negotiating TELNET binary mode? Also, while many UNIX
> TELNET clients (ie, the user command) do strip all data sent down to
> 7 bits by default, few, if any, TELNET servers (ie, "telnetd") do.

Huh? TELNET binary mode? That sounds great, but I don't know how to
negotiate it: at least on my site, there is no option to do this.

...(wait a couple of minutes here for experimentation :-)...

OK, I found a "mode" setup on a recent implementation, and it seems to
work (or half-work): I've tried it between two systems (a 386 with System V
and a Sun 4 with SunOS 4.1), and it works only in one direction (?)...
In any case, this feature is pretty much undocumented!
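For reference, binary transmission is TELNET option 0 (RFC 856); negotiating it means each side sending IAC WILL TRANSMIT-BINARY and the peer answering IAC DO TRANSMIT-BINARY. The raw bytes, shown with od:

```shell
# The byte sequences exchanged to negotiate TELNET binary mode
# (RFC 856): IAC=255, WILL=251, DO=253, TRANSMIT-BINARY=0.
printf '\377\373\000\377\375\000' | od -An -tu1
```

Since the option is negotiated separately for each direction of the connection, a half-working setup like the one above (8 bits one way, 7 the other) is exactly what the protocol allows.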
