Google Groups no longer supports new Usenet posts or subscriptions. Historical content remains viewable.
Dismiss

[comp.os.linux.advocacy] FAQ and Primer for COLA, Edition III

4 views
Skip to first unread message

High Plains Thumper

unread,
Sep 9, 2006, 6:54:36 AM9/9/06
to
Following is excerpted from:

http://www.faqs.org/faqs/linux/advocacy/faq-and-primer/

1.2 Welcome to comp.os.linux.advocacy

If you are new to Linux and/or comp.os.linux.advocacy,
welcome. It is hoped that you will will enjoy your time in
comp.os.linux.advocacy and find it educational. We also
hope that you will find Linux as useful for you. and that
in the ripeness of time that you will become a
contributing member of the Linux community.

COLA is like a meeting hall for Linux advocacy. A place
where those who advocate the use of Linux can meet and
discuss all things Linux. In addition it is a place were
individuals interested in Linux can come to gain an
understanding of the Linux and the Linux community and to
learn about the capabilities of Linux from those who are
experienced with the use, administration, and development
of Linux.

By using Linux as a user or sysadmin you are a member of
the Linux community of which this newsgroup is an asset.
The Linux community is world-wide and interconnected by
the internet and other networks gated to the internet.

The description that your news server delivers to you for
comp.os.linux.advocacy, or COLA for short, is "Benefits of
Linux compared to other operating systems". That
description is derived from the charter of COLA. Sometimes
advocacy groups are viewed as a place where the bickering
undesirables of other newsgroups are directed, in order to
remove a disruption from another group on the same general
subject. That is incorrect for COLA.

1.3 Contributing to this FAQ and Primer

All those who advocate the use of Linux are invited to
submit material and suggestions to be considered for
future versions of this document. Submissions should be
sent by email to mj...@mindspring.com. You may also post
your submissions in COLA; however, in that case you should
still email your submission as well, so that the
submission will not be missed as can happen if it were
posted in COLA only.

Submissions offered by those who may deemed to be hostile
to Linux, including but not limited to anti-Linux
propagandists, will not be accepted.

1.4 The Charter of comp.os.linux.advocacy

The charter of comp.os.linux.advocacy is:

For discussion of the benefits of Linux compared to
other operating systems.

That single sentence is the one and only charter of the
newsgroup comp.os.linux.advocacy. The newsgroup's charter
is for the newsgroup as a place for supporters of Linux to
gather to discuss Linux, for the betterment of the Linux
community and the promotion and development of Linux. It
supports this as a place for those who would like to learn
more about Linux to come to learn from those who know
Linux. It does not call for it to be a place where the
anti-Linux propagandists to gather in order to discredit
Linux.

DFS

unread,
Sep 9, 2006, 12:09:29 PM9/9/06
to
You left out the most important function of cola: to provide an online forum
for petulant, disgruntled Linux users to lie and whine endlessly about
Microsoft and the Windows operating system.

7

unread,
Sep 9, 2006, 4:23:47 PM9/9/06
to
DFS wrote on behalf of micoshaft:

> You left out the most important function of cola: to provide an online
> forum for petulant, disgruntled Linux users to lie and whine endlessly
> about Microsoft and the Windows operating system.

Oh no DFS! Say it ain't so!

Linonut

unread,
Sep 9, 2006, 10:45:34 PM9/9/06
to
After takin' a swig o' grog, DFS belched out this bit o' wisdom:

> You left out the most important function of cola: to provide an online forum
> for petulant, disgruntled Linux users to lie and whine endlessly about
> Microsoft and the Windows operating system.

That isn't in the charter, but it is a fine purpose for COLA. It's
tough to keep venting steam about Windows/MS fubars at work all the
time.

--
"It turns out Luddites don't know how to use software properly,
so you should look into that." -- Bill Gates, FOCUS interview
http://www.cantrip.org/nobugs.html

DFS

unread,
Sep 9, 2006, 11:01:46 PM9/9/06
to
Linonut wrote:
> After takin' a swig o' grog, DFS belched out this bit o' wisdom:
>
>> You left out the most important function of cola: to provide an
>> online forum for petulant, disgruntled Linux users to lie and whine
>> endlessly about Microsoft and the Windows operating system.
>
> That isn't in the charter, but it is a fine purpose for COLA.

It's kind of a sad reflection on the Linux "community" is what it is.

> It's tough to keep venting steam about Windows/MS
> fubars at work all the time.

Get used to it, my man, 'cause Windows isn't leaving your workplace for a
long, long, long time...

Roy Schestowitz

unread,
Sep 10, 2006, 1:11:06 AM9/10/06
to
__/ [ High Plains Thumper ] on Saturday 09 September 2006 11:54 \__

> <snip />


>
> 1.4 The Charter of comp.os.linux.advocacy
>
> The charter of comp.os.linux.advocacy is:
>
> For discussion of the benefits of Linux compared to
> other operating systems.
>
> That single sentence is the one and only charter of the
> newsgroup comp.os.linux.advocacy. The newsgroup's charter
> is for the newsgroup as a place for supporters of Linux to
> gather to discuss Linux, for the betterment of the Linux
> community and the promotion and development of Linux. It
> supports this as a place for those who would like to learn
> more about Linux to come to learn from those who know
> Linux. It does not call for it to be a place where the
> anti-Linux propagandists to gather in order to discredit
> Linux.

I hope you set this post to a weekly recurrence. It's more valuable than it
may seem on the surface.

Best wishes,

Roy

--
Roy S. Schestowitz | Download Othello: http://othellomaster.com
http://Schestowitz.com | SuSE Linux | PGP-Key: 0x74572E8E
6:05am up 51 days 18:17, 10 users, load average: 1.24, 0.60, 0.39
http://iuron.com - Open Source knowledge engine project

High Plains Thumper

unread,
Sep 10, 2006, 6:43:45 AM9/10/06
to
Roy Schestowitz wrote:
> High Plains Thumper on Saturday
>
>> Following is excerpted from:
>>
>> http://www.faqs.org/faqs/linux/advocacy/faq-and-primer/
>>
>> <snip />
>>
>> 1.4 The Charter of comp.os.linux.advocacy
>>
>> The charter of comp.os.linux.advocacy is:
>>
>> For discussion of the benefits of Linux compared to
>> other operating systems.
>>
>> That single sentence is the one and only charter of the
>> newsgroup comp.os.linux.advocacy. The newsgroup's
>> charter is for the newsgroup as a place for supporters
>> of Linux to gather to discuss Linux, for the betterment
>> of the Linux community and the promotion and
>> development of Linux. It supports this as a place for
>> those who would like to learn more about Linux to come
>> to learn from those who know Linux. It does not call
>> for it to be a place where the anti-Linux propagandists
>> to gather in order to discredit Linux.
>
> I hope you set this post to a weekly recurrence. It's more
> valuable than it may seem on the surface.

True. Minimum posting should be at least monthly, I'll give it
a shot. Also, we know that this so called second charter that
some have brought up in the past, in truth does not exist.

Reference:

http://groups.google.com/group/comp.os.linux.advocacy/browse_frm
/thread/3a36eaf74cc73f/dc0c7bfeb17f459f?lnk=st&q=&rnum=1
&hl=en#dc0c7bfeb17f459f

or http://tinyurl.com/g4f4k

> Following is the historical context:
>
> Mr. Dave Sill <d...@sws5.ctd.ornl.gov> posted to various
> newsgroups subject, "2nd RFD: comp.os.linux reorganization"
> Dated: 14 Oct 1994.
>
> Post is archived at
>
> ftp://ftp.isc.org/pub/usenet/news.announce.newgroups/comp/co
> mp .os.linux-reorg3
>
> or http://tinyurl.com/egj8s (user friendly of the same)
>
> Reorganisation was prompted by excessive levels of traffic
> in comp.os.linux.help newsgroup. Mr. Sill initiated a
> Request for Discussion (RFD), which after discussion and
> approval, resulted in 5 additional groups being added.
> c.o.l.help was relegated to c.o.l.misc with c.o.l.help
> being retired.
>
> c.o.l.help was replaced with:
>
> 1) comp.os.linux.advocacy - For discussion of the benefits


> of Linux compared to other operating systems.
>

> 2) comp.os.linux.development.apps - For questions and
> discussion regarding the writing of applications for Linux
> and the porting of applications to Linux.
>
> 3) comp.os.linux.hardware - For questions and discussion
> specific to a particular piece of hardware.
>
> 4) comp.os.linux.networking - For questions and discussion
> relating to networking or communications.
>
> 5) comp.os.linux.x - For questions and discussion relating
> to X Window System.
>
> Thus, group c.o.l.advocacy originated out of c.o.l.help.
>
> It was born out of a need to expand help categories.
> Charter for c.o.l.help was, "Discussion of Linux-specific
> questions and advice".
>
> Per RFD, "This proposed reorganization was prompted
> originally by the excessive level of traffic in
> comp.os.linux.help. Following a straw poll conducted by
> the proponent, an RFD was posted for breaking
> comp.os.linux.help into various subgroups and creating
> comp.os.linux.answers for separating the documentation and
> announcements currently posted to comp.os.linux.announce.
> During the discussion period, it became evident that it
> would be preferable to create the groups directly under
> comp.os.linux, rather than splitting comp.os.linux.help."
>
> A relationship between "help" and "advocacy" is
> established.

--
HPT


ed

unread,
Sep 10, 2006, 7:06:40 AM9/10/06
to
On Sun, 10 Sep 2006 10:43:45 +0000 (UTC)
High Plains Thumper <h...@singlecylinderbikes.com> wrote:

> True. Minimum posting should be at least monthly, I'll give it
> a shot. Also, we know that this so called second charter that
> some have brought up in the past, in truth does not exist.
>
> Reference:
>
> http://groups.google.com/group/comp.os.linux.advocacy/browse_frm
> /thread/3a36eaf74cc73f/dc0c7bfeb17f459f?lnk=st&q=&rnum=1
> &hl=en#dc0c7bfeb17f459f

ERRRgh:
http://www.hyphenologist.co.uk/killfile/antitrollfaqhtm.htm

What idiot wrote that in FP.

--
Regards, Ed :: http://www.gnunix.net
proud linux person
The story of Paul Bunyan and his blue ox, is based on the true story
of Chuck Norris and his throbing penis.

Peter Hayes

unread,
Sep 10, 2006, 8:20:59 AM9/10/06
to
In <lcLMg.2524$726...@bignews1.bellsouth.net> DFS wrote:
> Linonut wrote:
>> After takin' a swig o' grog, DFS belched out this bit o' wisdom:
>>
>>> You left out the most important function of cola: to provide an
>>> online forum for petulant, disgruntled Linux users to lie and whine
>>> endlessly about Microsoft and the Windows operating system.
>>
>> That isn't in the charter, but it is a fine purpose for COLA.
>
> It's kind of a sad reflection on the Linux "community" is what it is.

Not at all - demonstrating the awfulness of Woeful Windows is a service
to anyone wishing to discover alternatives.

>> It's tough to keep venting steam about Windows/MS
>> fubars at work all the time.
>
> Get used to it, my man, 'cause Windows isn't leaving your workplace
> for a long, long, long time...

And for every year that Windows monopolises the desktop, progress is
stifled. Linux and OS X demonstrate that admirably.

We've seen all those problems XP inflicted on the industry, Blaster,
Sobig, etc, at a cost of $$$bn. Now we're in for another round of these
exploits as Vista comes on stream.

Wise CEOs will hold back or bypass Vista altogether - what business
desktop needs Aero?

--

Peter

Linonut

unread,
Sep 10, 2006, 10:40:04 AM9/10/06
to
After takin' a swig o' grog, DFS belched out this bit o' wisdom:

> Linonut wrote:
>> After takin' a swig o' grog, DFS belched out this bit o' wisdom:
>>
>>> You left out the most important function of cola: to provide an
>>> online forum for petulant, disgruntled Linux users to lie and whine
>>> endlessly about Microsoft and the Windows operating system.
>>
>> That isn't in the charter, but it is a fine purpose for COLA.
>
> It's kind of a sad reflection on the Linux "community" is what it is.

No. It's a sad reflection on the reality of Windows and how
over-integrated and fragile it can be.

>> It's tough to keep venting steam about Windows/MS
>> fubars at work all the time.
>
> Get used to it, my man, 'cause Windows isn't leaving your workplace for a
> long, long, long time...

Of course not. But I've found I no longer need to use it for most
things. I need it for the following: Rational (IT will not provide us
with a Linux version, so we have to use the shit Windows version), Word
(OpenOffice can view Word files just fine now, but still has trouble
with Word's awful numbering/outlining support), and for Visual Studio
work, including debugging.

But, I can run that stuff through a VNC window or through the method
noted in the sig.

And, believe me, the developers and their managers are aware of the
problems of Windows. Linux will probably be used as the platform for one
of our projects, at some point.

I don't know why the Navy wanted Windows so badly for their network
infrastructure, unless Microsoft was lobbying heavily for it. Trying to
get NMCI to work almost sunk EDS, and nobody likes the system anyway, it
is a joke to some people.

However, the Word and e-mail stuff works well enough.

--
Boot your Windows operating system in a virtual machine on Linux
It's safer.
-- http://fabrice.bellard.free.fr/qemu/

DFS

unread,
Sep 10, 2006, 11:12:11 AM9/10/06
to
Linonut wrote:
> After takin' a swig o' grog, DFS belched out this bit o' wisdom:
>
>> Linonut wrote:
>>> After takin' a swig o' grog, DFS belched out this bit o' wisdom:
>>>
>>>> You left out the most important function of cola: to provide an
>>>> online forum for petulant, disgruntled Linux users to lie and whine
>>>> endlessly about Microsoft and the Windows operating system.
>>>
>>> That isn't in the charter, but it is a fine purpose for COLA.
>>
>> It's kind of a sad reflection on the Linux "community" is what it is.
>
> No. It's a sad reflection on the reality of Windows and how
> over-integrated and fragile it can be.

Like KDE and Gnome? Like every Linux distro that's taken seriously? That
"over-integrated"?


>>> It's tough to keep venting steam about Windows/MS
>>> fubars at work all the time.
>>
>> Get used to it, my man, 'cause Windows isn't leaving your workplace
>> for a long, long, long time...
>
> Of course not. But I've found I no longer need to use it for most
> things. I need it for the following: Rational (IT will not provide
> us with a Linux version, so we have to use the shit Windows version),
> Word (OpenOffice can view Word files just fine now, but still has
> trouble with Word's awful numbering/outlining support), and for
> Visual Studio work, including debugging.
>
> But, I can run that stuff through a VNC window or through the method
> noted in the sig.

So you work in Linux all day, while everyone else runs Windows?


> And, believe me, the developers and their managers are aware of the
> problems of Windows.

Which leads me to believe they're NOT aware of the problems of Linux.

> Linux will probably be used as the platform for
> one of our projects, at some point.

heee! That's some major commitment to Linux.

> I don't know why the Navy wanted Windows so badly for their network
> infrastructure, unless Microsoft was lobbying heavily for it. Trying
> to get NMCI to work almost sunk EDS, and nobody likes the system
> anyway, it is a joke to some people.

http://toolbar.netcraft.com/site_report?url=http://www.homeport.navy.mil

Do you really think it would have been a better system with all Linux/OSS
components?

> However, the Word and e-mail stuff works well enough.

All MS software works plenty well enough. Most of it is exceptional.

DFS

unread,
Sep 10, 2006, 11:22:59 AM9/10/06
to
Peter Hayes wrote:

> And for every year that Windows monopolises the desktop, progress is
> stifled. Linux and OS X demonstrate that admirably.

You must be talking about the "innovative" Xgl and spinning cubes. yawn...

> We've seen all those problems XP inflicted on the industry, Blaster,
> Sobig, etc, at a cost of $$$bn.

XP didn't inflict anything anywhere (except the best Windows desktop OS).

> Now we're in for another round of
> these exploits as Vista comes on stream.

How would you know what the future holds?

> Wise CEOs will hold back or bypass Vista altogether

'Cause cola nut Peter Hayes says so?

Nearly every company in the world runs on Windows - was founded on Windows,
and grew successful using Windows - so why should they hold off on the
latest and greatest version?

> - what business desktop needs Aero?

No business needs Aero (or Xgl or Aqua). So you turn it off if you don't
need it.


Peter Hayes

unread,
Sep 10, 2006, 12:40:08 PM9/10/06
to
In <c3WMg.17365$Ca4....@bignews7.bellsouth.net> DFS wrote:
> Peter Hayes wrote:
>
>> And for every year that Windows monopolises the desktop, progress is
>> stifled. Linux and OS X demonstrate that admirably.
>
> You must be talking about the "innovative" Xgl and spinning cubes.
> yawn...

Aero seems to be the big deal with Vista and it's well out of date in
comparison, even before it's released.

>> We've seen all those problems XP inflicted on the industry, Blaster,
>> Sobig, etc, at a cost of $$$bn.
>
> XP didn't inflict anything anywhere

It inflicted gaping security holes on its users. It's one reason Vista
is so late - Microsoft moved many of their Vista developers over to help
with XP SP2 because their users were screaming blue murder. Either that
or Homeland Security put a rocket up Gates' tail. Or both.

Either way, XP's security weaknesses cost users worldwide hundreds of
billions of dollars. And is still costing us.

> (except the best Windows desktop OS).

Wow!!! Such a stratospheric achievement.

>> Now we're in for another round of
>> these exploits as Vista comes on stream.
>
> How would you know what the future holds?

The past is a good teacher.

>> Wise CEOs will hold back or bypass Vista altogether
>
> 'Cause cola nut Peter Hayes says so?

No, because common sense says so.

> Nearly every company in the world runs on Windows - was founded on
> Windows, and grew successful using Windows

Only through the illegal leverage of a monopoly.

> - so why should they hold off on the latest and greatest version?

Why saddle themselves with yet another round of exploits when they're
still getting to grips with the last lot?

>> - what business desktop needs Aero?
>
> No business needs Aero (or Xgl or Aqua). So you turn it off if you
> don't need it.

So why invest in it?

A few corporates might invest in the hardware and software necessary to
make the Vista transition for their road warriors, the ROI elsewhere is
negative.

--

Peter

DFS

unread,
Sep 10, 2006, 3:17:50 PM9/10/06
to
Peter Hayes wrote:

> Either way, XP's security weaknesses cost users worldwide hundreds of
> billions of dollars. And is still costing us.

XP's security weaknesses didn't cost the world one red cent.

>> (except the best Windows desktop OS).
>
> Wow!!! Such a stratospheric achievement.

Well, it sure puts the beatdown on Linux...

>>> Now we're in for another round of
>>> these exploits as Vista comes on stream.
>>
>> How would you know what the future holds?
>
> The past is a good teacher.

OK, so Linux will always remain 10 years behind the rest of the world. Rex
Ballard will always claim credit for events he had nothing to do with. cola
will always whine about MS and Windows.

Bummer.

>>> Wise CEOs will hold back or bypass Vista altogether
>>
>> 'Cause cola nut Peter Hayes says so?
>
> No, because common sense says so.

This is cola: the charter says Linux "advocates" have to park their common
sense at the door.

>> Nearly every company in the world runs on Windows - was founded on
>> Windows, and grew successful using Windows
>
> Only through the illegal leverage of a monopoly.

Only because Windows offered them what they needed. They were more than
willing to buy it, too. Win 1.0 and 2.0 weren't successful... but 3.0+ took
the world by storm. Six years later a 96% market share in desktop operating
systems was controlled by Microsoft.

Linux, as usual, was nowhere to be seen.

>> - so why should they hold off on the latest and greatest version?
>
> Why saddle themselves with yet another round of exploits when they're
> still getting to grips with the last lot?

What company is dealing with exploits?


>>> - what business desktop needs Aero?
>>
>> No business needs Aero (or Xgl or Aqua). So you turn it off if you
>> don't need it.
>
> So why invest in it?

You're not so dense as you're trying to be, right? Life moves on. People
get tired of the same screens, the same furniture, the same auto...


> A few corporates might invest in the hardware and software necessary
> to make the Vista transition for their road warriors, the ROI
> elsewhere is negative.

Computers and software are a cost of doing business. Hopefully their
investment results in something worthwhile: easier maintenance, more
productive employees, better analysis, etc.

As for negative ROI... imagine trying to switch a big company to Linux.
Ouch!

Can it even be done? IBM, with all their technical talent, couldn't do it.
Supposedly Novell is trying, but I'm sure it's a nightmare to go into work
each day and deal with all the bullshit computer problems: systems being
migrated, files lost, thousands of Excel sheets that don't run in OO Calc,
Word documents that open up scrambled in OO Writer.


GreyCloud

unread,
Sep 10, 2006, 6:44:08 PM9/10/06
to
DFS wrote:

> Peter Hayes wrote:
>
>
>>Either way, XP's security weaknesses cost users worldwide hundreds of
>>billions of dollars. And is still costing us.
>
>
> XP's security weaknesses didn't cost the world one red cent.
>

http://www.macdailynews.com/index.php/weblog/comments/windows_worms_and_viruses_cost_companies_average_of_2_million_per_incident/

Windows worms and viruses cost companies average of $2 million per incident

Thursday, July 08, 2004 - 08:10 AM EDT

"Internet-based business disruptions triggered by worms and viruses are
costing companies an average of nearly $2-million in lost revenue per
incident, market researcher Aberdeen said on Tuesday," CNET reports.

"Out of 162 companies contacted, 84 per cent said their business
operations have been disrupted and disabled by Internet security events
during the last three years. Though the average rate of business
operations disruption was one incident per year, about 15 per cent of
the surveyed companies said their operations had been halted and
disabled more than seven times over a three-year period," CNET reports.

--
Where are we going?
And why am I in this handbasket?

High Plains Thumper

unread,
Sep 11, 2006, 7:01:53 AM9/11/06
to
ed wrote:

> High Plains Thumper wrote:
>
>> True. Minimum posting should be at least monthly, I'll give it
>> a shot. Also, we know that this so called second charter that
>> some have brought up in the past, in truth does not exist.
>>
>> Reference:
>>
>> http://groups.google.com/group/comp.os.linux.advocacy/browse_frm
>> /thread/3a36eaf74cc73f/dc0c7bfeb17f459f?lnk=st&q=&rnum=1
>> &hl=en#dc0c7bfeb17f459f
>
> ERRRgh:
> http://www.hyphenologist.co.uk/killfile/antitrollfaqhtm.htm
>
> What idiot wrote that in FP.

URL did not update when in Google frames. Message pointer remained at Post#
23, meant for Post# 13. You would have to follow the thread to know the
context.

http://tinyurl.com/prsvo

points at correct post in Google.

--
HPT

Linonut

unread,
Sep 11, 2006, 7:32:51 AM9/11/06
to
After takin' a swig o' grog, DFS belched out this bit o' wisdom:

> Linonut wrote:
>> After takin' a swig o' grog, DFS belched out this bit o' wisdom:
>>
>>> Linonut wrote:
>>>> After takin' a swig o' grog, DFS belched out this bit o' wisdom:
>>>>
>>>>> You left out the most important function of cola: to provide an
>>>>> online forum for petulant, disgruntled Linux users to lie and whine
>>>>> endlessly about Microsoft and the Windows operating system.
>>>>
>>>> That isn't in the charter, but it is a fine purpose for COLA.
>>>
>>> It's kind of a sad reflection on the Linux "community" is what it is.
>>
>> No. It's a sad reflection on the reality of Windows and how
>> over-integrated and fragile it can be.
>
> Like KDE and Gnome? Like every Linux distro that's taken seriously? That
> "over-integrated"?

Never learned about corba or dcop?

No. I mean over-integrated like Windows or (in the Linux world)
Nautilus.

I mean like an OS where a Windows update breaks "Update Fields" in
Windows, or screws up certain multiple-monitor configurations.

>> But, I can run that stuff through a VNC window or through the method
>> noted in the sig.
>
> So you work in Linux all day, while everyone else runs Windows?

Pretty much. There are a few other people who use Linux now and then at
work.

>> And, believe me, the developers and their managers are aware of the
>> problems of Windows.
>
> Which leads me to believe they're NOT aware of the problems of Linux.

They aren't yet, although I do make a point of saying that Linux is no
panacea. A serious project still takes serious hard work.

>> Linux will probably be used as the platform for
>> one of our projects, at some point.
>
> heee! That's some major commitment to Linux.

It's a start, your heehawing notwithstanding.

>> I don't know why the Navy wanted Windows so badly for their network
>> infrastructure, unless Microsoft was lobbying heavily for it. Trying
>> to get NMCI to work almost sunk EDS, and nobody likes the system
>> anyway, it is a joke to some people.
>
> http://toolbar.netcraft.com/site_report?url=http://www.homeport.navy.mil
>
> Do you really think it would have been a better system with all Linux/OSS
> components?

Yes. No question. EDS still might have produced some screwups, though.

By the way, their network is bigger than just that one navy.mil site.

>> However, the Word and e-mail stuff works well enough.
>
> All MS software works plenty well enough. Most of it is exceptional.

Tell me about this mythical exceptional Microsoft software. All I've
seen is MS Office, Project, Visio, Visual Studio, Internet Explorer, and
Windows itself. What else is out there?

--
Mayor Noche's Bomba Shelter. Where bad food and bad people go
together. Try our Sleepy Joe -- it's half dog-food, half downers.
Today's special -- red beans and reds.
-- The Firesign Theatre

Linonut

unread,
Sep 11, 2006, 7:36:20 AM9/11/06
to
After takin' a swig o' grog, GreyCloud belched out this bit o' wisdom:

> DFS wrote:
>
>> XP's security weaknesses didn't cost the world one red cent.
>
> http://www.macdailynews.com/index.php/weblog/comments/windows_worms_and_viruses_cost_companies_average_of_2_million_per_incident/
>
> Windows worms and viruses cost companies average of $2 million per incident
>
> Thursday, July 08, 2004 - 08:10 AM EDT
>
> "Internet-based business disruptions triggered by worms and viruses are
> costing companies an average of nearly $2-million in lost revenue per
> incident, market researcher Aberdeen said on Tuesday," CNET reports.
>
> "Out of 162 companies contacted, 84 per cent said their business
> operations have been disrupted and disabled by Internet security events
> during the last three years. Though the average rate of business
> operations disruption was one incident per year, about 15 per cent of
> the surveyed companies said their operations had been halted and
> disabled more than seven times over a three-year period," CNET reports.

And that is why, although DFS doesn't believe it, Linux is being
seriously deployed, or considered for deployment, wherever Windows
servers are being used.

The desktop is more difficult, due to the prevalence of the non-portable
Office formats, but at least IT can lock down its users Windows desktops
by fiat.

--
Yes, I've heard of "decaf." What's your point?

Mark Kent

unread,
Sep 11, 2006, 12:22:56 PM9/11/06
to
begin oe_protect.scr
High Plains Thumper <h...@singlecylinderbikes.com> espoused:

> Roy Schestowitz wrote:
>> High Plains Thumper on Saturday
>>
>>> Following is excerpted from:
>>>
>>> http://www.faqs.org/faqs/linux/advocacy/faq-and-primer/
>>>
>>> <snip />
>>>
>>> 1.4 The Charter of comp.os.linux.advocacy
>>>
>>> The charter of comp.os.linux.advocacy is:
>>>
>>> For discussion of the benefits of Linux compared to
>>> other operating systems.
>>>
>>> That single sentence is the one and only charter of the
>>> newsgroup comp.os.linux.advocacy. The newsgroup's
>>> charter is for the newsgroup as a place for supporters
>>> of Linux to gather to discuss Linux, for the betterment
>>> of the Linux community and the promotion and
>>> development of Linux. It supports this as a place for
>>> those who would like to learn more about Linux to come
>>> to learn from those who know Linux. It does not call
>>> for it to be a place where the anti-Linux propagandists
>>> to gather in order to discredit Linux.
>>
>> I hope you set this post to a weekly recurrence. It's more
>> valuable than it may seem on the surface.
>
> True. Minimum posting should be at least monthly, I'll give it
> a shot. Also, we know that this so called second charter that
> some have brought up in the past, in truth does not exist.

It would be useful to have it come up. I'd also quite like to see the
FAQ rejuvenated, as we still have a few questions coming up here on a
regular basis which could be handled in that way.

The present discussion about the CLI was well handled in the first FAQ
issue, I thought (but then I did write that bit :-).

>
> Reference:
>
> http://groups.google.com/group/comp.os.linux.advocacy/browse_frm
> /thread/3a36eaf74cc73f/dc0c7bfeb17f459f?lnk=st&q=&rnum=1
> &hl=en#dc0c7bfeb17f459f
>
> or http://tinyurl.com/g4f4k

<snip>

--
| Mark Kent -- mark at ellandroad dot demon dot co dot uk |
Although golf was originally restricted to wealthy, overweight Protestants,
today it's open to anybody who owns hideous clothing.
-- Dave Barry

Roy Schestowitz

unread,
Sep 11, 2006, 12:53:11 PM9/11/06
to
__/ [ Mark Kent ] on Monday 11 September 2006 17:22 \__


If suitable, the FAQ can be modified in a Wiki. *smile*

High Plains Thumper

unread,
Sep 11, 2006, 1:05:43 PM9/11/06
to
Mark Kent wrote:
> High Plains Thumper espoused:

>> Roy Schestowitz wrote:
>>
>>> I hope you set this post to a weekly recurrence. It's
>>> more valuable than it may seem on the surface.
>>
>> True. Minimum posting should be at least monthly, I'll
>> give it a shot. Also, we know that this so called second
>> charter that some have brought up in the past, in truth
>> does not exist.
>
> It would be useful to have it come up. I'd also quite like
> to see the FAQ rejuvenated, as we still have a few
> questions coming up here on a regular basis which could be
> handled in that way.
>
> The present discussion about the CLI was well handled in
> the first FAQ issue, I thought (but then I did write that
> bit :-).

CLI is something I am unfamiliar with, what is it?

--
HPT

DFS

unread,
Sep 11, 2006, 1:15:10 PM9/11/06
to
High Plains Thumper wrote:

> CLI is something I am unfamiliar with, what is it?

Common Language Infrastructure

http://en.wikipedia.org/wiki/Common_Language_Infrastructure

GreyCloud

unread,
Sep 11, 2006, 2:57:13 PM9/11/06
to
Linonut wrote:

That plus the fact that Intel themselves went Linux.

DFS

unread,
Sep 11, 2006, 3:02:55 PM9/11/06
to
GreyCloud wrote:

> That plus the fact that Intel themselves went Linux.

What does that mean? Sounds like a typical "advocate" claim.


GreyCloud

unread,
Sep 12, 2006, 1:06:05 AM9/12/06
to
DFS wrote:

Easy enuf to prove. Just go to http://www.intel.com and poke in Linux
in their search box. You'll get tons of info there. It is Intels claim
not mine.

Hadron Quark

unread,
Sep 12, 2006, 3:19:02 AM9/12/06
to
GreyCloud <mi...@cumulus.com> writes:

> DFS wrote:
>
>> GreyCloud wrote:
>>
>>>That plus the fact that Intel themselves went Linux.
>> What does that mean? Sounds like a typical "advocate" claim.
>
> Easy enuf to prove. Just go to http://www.intel.com and poke in Linux
> in their search box. You'll get tons of info there. It is Intels
> claim not mine.

There is tons of info : about their compilers etc for Linux.

"Supporting Linux" might be a better claim.

--
Who wants to remember that escape-x-alt-control-left shift-b puts you into
super-edit-debug-compile mode?
(Discussion in comp.os.linux.misc on the intuitiveness of commands, especially
Emacs.)

Mark Kent

unread,
Sep 12, 2006, 7:23:53 AM9/12/06
to
begin oe_protect.scr
Roy Schestowitz <newsg...@schestowitz.com> espoused:

Ah, yes - good point! Perhaps time to get back to that!

>
>
>>> Reference:
>>>
>>> http://groups.google.com/group/comp.os.linux.advocacy/browse_frm
>>> /thread/3a36eaf74cc73f/dc0c7bfeb17f459f?lnk=st&q=&rnum=1
>>> &hl=en#dc0c7bfeb17f459f
>>>
>>> or http://tinyurl.com/g4f4k
>>
>> <snip>

--
| Mark Kent -- mark at ellandroad dot demon dot co dot uk |

"... freedom ... is a worship word..."
"It is our worship word too."
-- Cloud William and Kirk, "The Omega Glory", stardate unknown

Mark Kent

unread,
Sep 12, 2006, 7:24:42 AM9/12/06
to
begin oe_protect.scr
High Plains Thumper <h...@singlecylinderbikes.com> espoused:

Umm, erm, yeah - good question. It's, err, well, it has a screen with
text on it, and, err, you, err, use the keyboard, I think...

--
| Mark Kent -- mark at ellandroad dot demon dot co dot uk |

GreyCloud

unread,
Sep 13, 2006, 12:53:04 AM9/13/06
to
Hadron Quark wrote:

> GreyCloud <mi...@cumulus.com> writes:
>
>
>>DFS wrote:
>>
>>
>>>GreyCloud wrote:
>>>
>>>
>>>>That plus the fact that Intel themselves went Linux.
>>>
>>>What does that mean? Sounds like a typical "advocate" claim.
>>
>>Easy enuf to prove. Just go to http://www.intel.com and poke in Linux
>>in their search box. You'll get tons of info there. It is Intels
>>claim not mine.
>
>
> There is tons of info : about their compilers etc for Linux.
>
> "Supporting Linux" might be a better claim.
>

They do. Like I said, there are long lists of support articles on Intel
chips for Linux. Also, considering that in that list is their own
report on Linux TCO where they claim to have saved millions of dollars
by leaving proprietary software systems. I almost thought that they
also changed from OpenVMS to Linux on their fab lines, but someone
reported in comp.os.vms that they hadn't. You only have to pay for
OpenVMS license just once if things are running smoothly.

High Plains Thumper

unread,
Sep 13, 2006, 7:52:30 AM9/13/06
to
Mark Kent wrote:
> High Plains Thumper espoused:
>> Mark Kent wrote:
>>
>>> The present discussion about the CLI was well handled in
>>> the first FAQ issue, I thought (but then I did write that
>>> bit :-).
>>
>> CLI is something I am unfamiliar with, what is it?
>
> Umm, erm, yeah - good question. It's, err, well, it has a screen with
> text on it, and, err, you, err, use the keyboard, I think...

Not a laptop with 2 knobs like an Etch-A-Sketch? :-)

--
HPT

Jim

unread,
Sep 13, 2006, 8:00:47 AM9/13/06
to
On or about 2006-09-13 Wednesday 12:52, I did witness the following events
concerning High Plains Thumper:

We know a song about that.

--
I hereby testify that the above statement is an accurate recollection of the
events mentioned therein.
http://dotware.co.uk
Registered Linux user #426308 -*- http://counter.li.org

Scott Nudds

unread,
Sep 14, 2006, 9:29:59 PM9/14/06
to
From: "Linonut" <lin...@bone.com>

> I don't know why the Navy wanted Windows so badly for their network
> infrastructure, unless Microsoft was lobbying heavily for it.

Probably because the Navy realizes that Unix/Linux sucks pig shit.

What happened to ADA?

Ahahahahahahahahaahah


Scott Nudds

unread,
Sep 14, 2006, 9:38:12 PM9/14/06
to

"Peter Hayes" <not_i...@btinternet.com> wrote in message

> It inflicted gaping security holes on its users. It's one reason Vista
> is so late - Microsoft moved many of their Vista developers over to help
> with XP SP2 because their users were screaming blue murder. Either that
> or Homeland Security put a rocket up Gates' tail. Or both.

Windows didn't inflict security holes, as much as the C programming
language does, and the C programming philosophy.

The C programming language has security namagement flaws and even memory
management flaws built right into the IO library. Flaws that any grade
school programmer would immediately recognize.

Unix/Linux is built on those flaws.


"Peter Hayes" <not_i...@btinternet.com> wrote in message


> Either way, XP's security weaknesses cost users worldwide hundreds of
> billions of dollars. And is still costing us.

Yup. The adoption of C as the language of choice has been costly. It was
a foolish mistake that will take decades more to resolve.

C is one of the worst languages ever developed. It is the DOS of
programming languages.


> > How would you know what the future holds?


"Peter Hayes" <not_i...@btinternet.com> wrote in message


> The past is a good teacher.

A strange statement coming from a Linux oozer, when Linux is just another
failed attempt to bring the failure of Unix to the market for the 3
thousanth time.

If Unix droids actually learned from history they wouldn't keep making the
same mistakes over, and over and over again thousands upon thousands of
times.

Linux is shit because Unix is shit, and Linux = Unix.= Shit Stick.

Peter Hayes

unread,
Sep 14, 2006, 7:32:06 PM9/14/06
to
In <gRkOg.78559$sS1....@read1.cgocable.net> Scott Nudds wrote:
>
> "Peter Hayes" <not_i...@btinternet.com> wrote in message
>> It inflicted gaping security holes on its users. It's one reason
>> Vista is so late - Microsoft moved many of their Vista developers
>> over to help with XP SP2 because their users were screaming blue
>> murder. Either that or Homeland Security put a rocket up Gates' tail.
>> Or both.
>
> Windows didn't inflict security holes, as much as the C programming
> language does, and the C programming philosophy.

Yet Linux is also largely written in C and it doesn't have the same
security problems as Windows. Bit of a conundrum, that, don't you think?

> The C programming language has security namagement flaws and even
> memory management flaws built right into the IO library. Flaws that
> any grade school programmer would immediately recognize.

So you're saying Microsoft's developers aren't even up to school grade
programmer level? Really?

>> Either way, XP's security weaknesses cost users worldwide hundreds of
>> billions of dollars. And is still costing us.
>
> Yup. The adoption of C as the language of choice has been costly.
> It was a foolish mistake that will take decades more to resolve.

Much OS X stuff is written in Objective C which presumably by your
thinking has the same errors as C, so why is it exploit free?

> C is one of the worst languages ever developed. It is the DOS of
> programming languages.

Rewrite Vista in x86 assembler, I say...

That should keep it off the streets for another decade at least... :-)



>> > How would you know what the future holds?
>> >

>>> The past is a good teacher.
>
> A strange statement coming from a Linux oozer, when Linux is just
> another failed attempt to bring the failure of Unix to the market for
> the 3 thousanth time.

Contemporary Linux distros have succeeded in bringing Unix to the market,
and you only need to look at OS X to realise that they're equally
successful.

--

Peter

GreyCloud

unread,
Sep 15, 2006, 1:19:34 AM9/15/06
to
Peter Hayes wrote:

>
>> C is one of the worst languages ever developed. It is the DOS of
>>programming languages.
>
>
> Rewrite Vista in x86 assembler, I say...
>
> That should keep it off the streets for another decade at least... :-)
>

LOL! Indeed it would.
Of course they'd have to fire their coders for new ones, if any exist today.

GreyCloud

unread,
Sep 15, 2006, 1:21:10 AM9/15/06
to
Scott Nudds wrote:

During that time, ADA was a huge resource hog. ADA did generate tight
code tho, but no one wanted to use it so it died. I remember the DEC
version of ADA that took up all of the 32mb of memory we had and only
for one user on a vax.

Scott Nudds

unread,
Sep 15, 2006, 6:35:37 AM9/15/06
to

"Peter Hayes" <not_i...@btinternet.com> wrote in message
> Yet Linux is also largely written in C and it doesn't have the same
> security problems as Windows. Bit of a conundrum, that, don't you think?

Not really. Unix has been rewritten 3,000 times or so over the last 50
years, windows only 4-5 times. And while Windows has been the target of
exploits simply because of it's market share, Unix/Linux has taken the time
to avoid using those language features that produce buffer overflow
problems.

Now the question is, what kind of fucked up shit language comes with an
I/O library that BY STANDARD CONTAINS MULTIPLE BUFFER OVERFLOW CONDITIONS
that even a grade school programmer can easily spot?

Such a spectacular language failure can only be a result of a
spectacularly poor language.


> > The C programming language has security namagement flaws and even
> > memory management flaws built right into the IO library. Flaws that
> > any grade school programmer would immediately recognize.

"Peter Hayes" <not_i...@btinternet.com> wrote in message


> So you're saying Microsoft's developers aren't even up to school grade
> programmer level? Really?

No, I reserve that observation to the incompetents who designed the C
programming language and it's I/O library.


> > Yup. The adoption of C as the language of choice has been costly.
> > It was a foolish mistake that will take decades more to resolve.

"Peter Hayes" <not_i...@btinternet.com> wrote in message


> Much OS X stuff is written in Objective C which presumably by your
> thinking has the same errors as C, so why is it exploit free?

Does it? That isn't clear at all. Does it use the same I/O library?
That's highly doubtful given that it's an OOP language.

Stupid... Stupid... Linux ShitLicker.

> > C is one of the worst languages ever developed. It is the DOS of
> > programming languages.

"Peter Hayes" <not_i...@btinternet.com> wrote in message


> Rewrite Vista in x86 assembler, I say...

I agree, it would be 1/10th the size and run 4-8 times faster.


> > A strange statement coming from a Linux oozer, when Linux is just
> > another failed attempt to bring the failure of Unix to the market for
> > the 3 thousanth time.

"Peter Hayes" <not_i...@btinternet.com> wrote in message


> Contemporary Linux distros have succeeded in bringing Unix to the market,

Giving copies away free hasn't worked to get people to use the Linux Shit
Stick. So some Linux promoters are actually begining to pay to have their
OS included on machines.

You would have to pay me a great deal of money in order to use Linux.

Using Linux is as disgusting an experience as falling into a sewer.


Scott Nudds

unread,
Sep 15, 2006, 6:36:57 AM9/15/06
to

"GreyCloud" <mi...@cumulus.com> wrote in message

> During that time, ADA was a huge resource hog. ADA did generate tight
> code tho, but no one wanted to use it so it died.

Another AmeriKKKan white elephant.


GreyCloud

unread,
Sep 15, 2006, 4:43:31 PM9/15/06
to
Scott Nudds wrote:

Objective-C is just a small extension to C. The Gnu C compiler now
gives out warnings to not use certain functions and that your code is
not secure. Apple did their homework on this one and contributed to
this project. And yet no one can show any damages to OS X as compared
to the damages done by Microsoft windows using IE and OE.

GreyCloud

unread,
Sep 15, 2006, 4:44:57 PM9/15/06
to
Scott Nudds wrote:

Yet it did what it was designed for... and it worked very good at making
tight valid code. Too bad it did die off... because DOD was paying
handsomely to those that could code in ADA.

Peter Hayes

unread,
Sep 16, 2006, 6:02:18 AM9/16/06
to
In <pfKdncvv3eXmjZbY...@bresnan.com> GreyCloud wrote:
> Scott Nudds wrote:
>> "Peter Hayes" <not_i...@btinternet.com> wrote in message
>>
>>>Much OS X stuff is written in Objective C which presumably by your
>>>thinking has the same errors as C, so why is it exploit free?
>>
>>
>> Does it? That isn't clear at all. Does it use the same I/O
>> library?
>> That's highly doubtful given that it's an OOP language.
>
> Objective-C is just a small extension to C. The Gnu C compiler now
> gives out warnings to not use certain functions and that your code is
> not secure. Apple did their homework on this one and contributed to
> this project. And yet no one can show any damages to OS X as compared
> to the damages done by Microsoft windows using IE and OE.

I would hope that Microsoft have also avoided any insecure functions
when writing Vista.

They should have done so years ago, but evidently not...

--

Peter

Linonut

unread,
Sep 16, 2006, 8:43:03 AM9/16/06
to
After takin' a swig o' grog, Peter Hayes belched out this bit o' wisdom:

> I would hope that Microsoft have also avoided any insecure functions
> when writing Vista.
>
> They should have done so years ago, but evidently not...

HANDLEs -- the gateway to Chaos.

--
Intel: where Quality is job number 0.9998782345!

Hadron Quark

unread,
Sep 16, 2006, 12:38:34 PM9/16/06
to
GreyCloud <mi...@cumulus.com> writes:

> Scott Nudds wrote:
>
>> "GreyCloud" <mi...@cumulus.com> wrote in message
>>
>>>During that time, ADA was a huge resource hog. ADA did generate tight
>>>code tho, but no one wanted to use it so it died.
>> Another AmeriKKKan white elephant.
>>
>
> Yet it did what it was designed for... and it worked very good at
> making tight valid code. Too bad it did die off... because DOD was
> paying handsomely to those that could code in ADA.

And still are.

--
No alcohol, dogs or horses.

GreyCloud

unread,
Sep 16, 2006, 2:34:19 PM9/16/06
to
Peter Hayes wrote:

I moved code over to VS6 that gcc 4.0 caught as dangerous... VS6 never
gave any warnings. I don't know if the later versions of VS will report
dangerous functions or not.

GreyCloud

unread,
Sep 16, 2006, 2:37:14 PM9/16/06
to
Hadron Quark wrote:

I wonder what platform they are writing ADA on these days?
I know back around 1991 or so, DOD sent down a memorandum that we
convert our projects over to ADA. We just had a vax785 and DECs version
of ADA needed all 32m of ram and only one user could get on. Later, it
seemed that everyone just ignored that memorandum and kept on using what
they already had. More likely it was just a budget problem and couldn't
justify the expenditure... and consequently a few projects just shutdown.

Hadron Quark

unread,
Sep 16, 2006, 3:29:17 PM9/16/06
to
GreyCloud <mi...@cumulus.com> writes:

I worked on Convex using Verdix ADA.

--
Japan, n:
A fictional place where elves, gnomes and economic imperialists
create electronic equipment and computers using black magic. It
is said that in the capital city of Akihabara, the streets are
paved with gold and semiconductor chips grow on low bushes from
which they are harvested by the happy natives.

Scott Nudds

unread,
Sep 17, 2006, 1:21:38 AM9/17/06
to

"Linonut" <lin...@bone.com> wrote in message news:-

> HANDLEs -- the gateway to Chaos.

C standard library -> Guaranteed buffer overflow problems.


GreyCloud

unread,
Sep 16, 2006, 11:00:21 PM9/16/06
to
Hadron Quark wrote:

> GreyCloud <mi...@cumulus.com> writes:
>
>
>>Hadron Quark wrote:
>>
>>
>>>GreyCloud <mi...@cumulus.com> writes:
>>>
>>>
>>>>Scott Nudds wrote:
>>>>
>>>>
>>>>
>>>>>"GreyCloud" <mi...@cumulus.com> wrote in message
>>>>>
>>>>>
>>>>>
>>>>>>During that time, ADA was a huge resource hog. ADA did generate tight
>>>>>>code tho, but no one wanted to use it so it died.
>>>>>
>>>>>Another AmeriKKKan white elephant.
>>>>>
>>>>
>>>>Yet it did what it was designed for... and it worked very good at
>>>>making tight valid code. Too bad it did die off... because DOD was
>>>>paying handsomely to those that could code in ADA.
>>>
>>>And still are.
>>>
>>
>>I wonder what platform they are writing ADA on these days?
>>I know back around 1991 or so, DOD sent down a memorandum that we
>>convert our projects over to ADA. We just had a vax785 and DECs
>>version of ADA needed all 32m of ram and only one user could get on.
>>Later, it seemed that everyone just ignored that memorandum and kept
>>on using what they already had. More likely it was just a budget
>>problem and couldn't justify the expenditure... and consequently a few
>>projects just shutdown.
>
>
> I worked on Convex using Verdix ADA.
>

I'm not familiar with Convex or Verdix. See what happens when you
retire? You eventually lose touch with whats going on out there. :-/

ed

unread,
Sep 17, 2006, 5:53:57 AM9/17/06
to

That's exactly what windows is made of

--
Regards, Ed :: http://www.openbsdhacker.com
just another python hacker
The film "Brokeback Mountain" was originally pitched as an off-beat
romance starring Mr. T and Chuck Norris. The sole reason the two
legends declined the starring roles is because if Mr. T and Chuck
Norris were to kiss, God would die.

Scott Nudds

unread,
Sep 17, 2006, 4:45:39 PM9/17/06
to

> > C standard library -> Guaranteed buffer overflow problems.


"ed" <e...@noreply.com> wrote in message
news:20060917105...@localhost.localdomain...


> That's exactly what windows is made of

Every C programmer who has ever used the standard C library for user I/O has
written a program that suffers from buffer overflow errors and is prone to
buffer overflow exploits.

This is obvious to every programmer who can code at a grade school level or
better.

Yet these problems were not readily apparent to the morons who developed the
language, or the I/O library, or the tens of thousands of people involved in
the standardization of that library.

The recommended solution to this "issue" was to use a larger input buffer to
make the problem less likely.

Over 90% of all of computer exploits are buffer overflow exploits.

Slit from ear to ear, the throats of these FUCKING IGNORANT SHIT LICKERS.


Scott Nudds

unread,
Sep 17, 2006, 4:46:55 PM9/17/06
to

"GreyCloud" <mi...@cumulus.com> wrote in message news:AL-

> I'm not familiar with Convex or Verdix. See what happens when you
> retire? You eventually lose touch with whats going on out there. :-/

When you retire, you should have better grip on the road.


ed

unread,
Sep 17, 2006, 2:03:29 PM9/17/06
to
On Sun, 17 Sep 2006 13:45:39 -0700
"Scott Nudds" <nos...@foo.com> wrote:

> > > C standard library -> Guaranteed buffer overflow problems.
>
>
> "ed" <e...@noreply.com> wrote in message
> news:20060917105...@localhost.localdomain...
> > That's exactly what windows is made of
>
> Every C programmer who has ever used the standard C library for user
> I/O has written a program that suffers from buffer overflow errors and
> is prone to buffer overflow exploits.
>
> This is obvious to every programmer who can code at a grade school
> level or better.
>
> Yet these problems were not readily apparent to the morons who
> developed the language, or the I/O library, or the tens of thousands
> of people involved in the standardization of that library.
>
> The recommended solution to this "issue" was to use a larger input
> buffer to make the problem less likely.

Where did you get that impression? It's not the generally accepted
solution when writing C to make a huge memory buffer because the
programmer cannot navigate the memory correctly.

--
Regards, Ed :: http://www.usenix.org.uk
just another linux person
The A-Team, in fact, chronicles President T's diplomatic visits to
the US in the 1980s. They were an unparalleled success.

GreyCloud

unread,
Sep 17, 2006, 2:12:36 PM9/17/06
to
Scott Nudds wrote:

Not with the high price of gas you won't.

Scott Nudds

unread,
Sep 17, 2006, 8:12:14 PM9/17/06
to

> On Sun, 17 Sep 2006 13:45:39 -0700
> "Scott Nudds" <nos...@foo.com> wrote:
> > The recommended solution to this "issue" was to use a larger input
> > buffer to make the problem less likely.


From: "ed" <e...@noreply.com>
> Where did you get that impression?

From the mouthes of C ShitLickers of course.


From: "ed" <e...@noreply.com>
> It's not the generally accepted
> solution when writing C to make a huge memory buffer because the
> programmer cannot navigate the memory correctly.

Navogating large buffer spaces is no different than navigating small ones.

And of course Anyone who has ever written a program that uses the standard C
I/O library for user I/O has necessarily written a program that suffers from
buffer overflow errors and is succeptable to buffer overflow exploits.

Shit Licker in the C programming community caused this problem. They lived
with it for decades as the problem persisted and grew. They ignored the
obvious failure of their pet language, and the cock suckers knowingly made
it an official standard.

Shit is the official food of the C programming community.

Slit their fucking ignorant throats. All of em.


ed

unread,
Sep 18, 2006, 1:25:25 PM9/18/06
to
On Sun, 17 Sep 2006 17:12:14 -0700
"Scott Nudds" <nos...@foo.com> wrote:


You are wrong. Buffer overflow errors occur because idiots who learn to
code with syntax sugar don't realise what is happening at the lower
levels. Those who learn with ASM understand C pretty well.

Just because a few people don't understand C does not mean there is a
problem with the language. If that were true I could happily crush .net
by it not being portable. I'm guessing you're just a shit C programmer
and never figured out how to do things right. Hell, you keep talking
shit, literally.

--
Regards, Ed :: http://www.ednevitable.co.uk
just another python person
When Chuck Norris dines at Chinese buffets, under the tip he leaves
"go play in traffic you fucking Asian".

Scott Nudds

unread,
Sep 18, 2006, 7:26:49 PM9/18/06
to

"ed" <e...@noreply.com> wrote in message
news:20060918183...@localhost.localdomain...

> You are wrong. Buffer overflow errors occur because idiots who learn to
> code with syntax sugar don't realise what is happening at the lower
> levels.

Partly but that doesn't explain why functions wich necessarily <MUST>
produce buffer overflows are peppered through the Standard C I/O library.

If you use the Standard C or C++ I/O library for user input your program
<WILL> suffer from buffer overflow failures.

The gift of Buffer overflows, and buffer overflow exploits is built right
into the standard IO package for this shit language.


"ed" <e...@noreply.com> wrote in message

news:20060918183...@localhost.localdomain...


> Those who learn with ASM understand C pretty well.

I programmed in Assembler for 20 years. C suck pig shit. It is brain
dead and one of the worst and most offensive languages ever created. K&R
should be publicly drawn and quartered. I'll drive the horses myself.

Those two fuckups have done more damage to the computing community than
anything Microsoft could ever have done.


"ed" <e...@noreply.com> wrote in message

news:20060918183...@localhost.localdomain...


> Just because a few people don't understand C does not mean there is a
> problem with the language.

There is a problem with the language you fucking moron because it's standard
IO Library DOESNT WORK, NEVER HAS WORKED, and NEVER WILL WORK.

Those who wrote it and those who standardized it, and those who use it
without complaint have Shit for Brains and are guilty of criminal stupidity.

The Ghost In The Machine

unread,
Sep 18, 2006, 6:00:08 PM9/18/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Mon, 18 Sep 2006 16:26:49 -0700
<4iDPg.98708$sS1....@read1.cgocable.net>:

>
> "ed" <e...@noreply.com> wrote in message
> news:20060918183...@localhost.localdomain...
>> You are wrong. Buffer overflow errors occur because idiots who learn to
>> code with syntax sugar don't realise what is happening at the lower
>> levels.
>
> Partly but that doesn't explain why functions wich necessarily <MUST>
> produce buffer overflows are peppered through the Standard C I/O library.
>
> If you use the Standard C or C++ I/O library for user input your program
> <WILL> suffer from buffer overflow failures.

Wow. He's clairvoyant. So tell me, O Superintelligent
One, how the following program will suffer from a buffer
overflow failure.

Take your time.

----8< >8----

#include <stdio.h>

int main(int argc, char **argv)
{
char buf[1024];
int lno = 0;

while(fgets(buf, sizeof(buf), stdin) != NULL)
{
lno++;
printf("%7d ", lno);
fputs(buf, stdout);
}

return 0;
}

----8< >8----

(No, it's not a perfect line number counter. Were I to want that,
I'd use getc(). But it meets all of Scott Nudds' criteria
for "having a buffer overflow failure".)

>
> The gift of Buffer overflows, and buffer overflow exploits is built right
> into the standard IO package for this shit language.

So what language should we be using instead?

[1] Python? An idea, but probably not low enough for device I/O.
It works reasonably well at the application level, of course.

[2] C++? Glorified C, with many of the same problems,
although one can forestall many (but not all) of the
stupider issues such as accessing freed memory with
careful usage.

[3] Java? Same problem as Python.

[4] C#? Same problem as Python.

[5] Assembly?

[6] Pascal?

[7] Modula?

[8] Ada?

[9] APL? (There's a thought!)

[10] Something else?

>
>
> "ed" <e...@noreply.com> wrote in message
> news:20060918183...@localhost.localdomain...
>> Those who learn with ASM understand C pretty well.
>
> I programmed in Assembler for 20 years. C suck pig shit. It is brain
> dead and one of the worst and most offensive languages ever created. K&R
> should be publicly drawn and quartered. I'll drive the horses myself.

OK. So what should replace it?

>
> Those two fuckups have done more damage to the computing community than
> anything Microsoft could ever have done.
>
>
> "ed" <e...@noreply.com> wrote in message
> news:20060918183...@localhost.localdomain...
>> Just because a few people don't understand C does not mean there is a
>> problem with the language.
>
> There is a problem with the language you fucking moron because it's standard
> IO Library DOESNT WORK, NEVER HAS WORKED, and NEVER WILL WORK.

Which explains why many Microsoft programs use C++, an extension of C
and having most of the same library routines (in addition to its own).

>
> Those who wrote it and those who standardized it, and those who use it
> without complaint have Shit for Brains and are guilty of criminal stupidity.
>
> Slit their fucking ignorant throats.
>

That won't solve the problem, although it might make one
feel better for a short time, especially if one happens
to be a violent, drug-crazed maniacal psychopath. ;-)

Me, I prefer genteeler solutions, such as replacing C++
with C# or Java, or perhaps designing a new language --
'D'? -- which cannot possibly buffer overflow (since
perhaps among other things the length of the buffer is
implicitly passed in, a la Pascal) and doesn't have many
of the problems with C's powerful but highly abusable
preprocessor, yet can be easily generated from a large
subset of properly-developed (FSVO) current C programs.
For example, 'D' might borrow part of Java/Python's
"import" construct.

BTW, VMS had "pass by descriptor", basically an 8-byte
header describing type, length, and pointer. For whatever
reason, they didn't catch on.

--
#191, ewi...@earthlink.net
Windows Vista. Because it's time to refresh your hardware. Trust us.

Scott Nudds

unread,
Sep 18, 2006, 9:49:17 PM9/18/06
to

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote in message
news:paf3u3-

> Wow. He's clairvoyant. So tell me, O Superintelligent
> One, how the following program will suffer from a buffer
> overflow failure.

> while(fgets(buf, sizeof(buf), stdin) != NULL)

It's quite simple...

fgets(*buf, buflen)

DOESN'T WORK. IT IS BROKEN.
IT'S ALWAYS BEEN BROKEN.
IT WAS WRITTEN BY INCOMPETENTS
IT CONTINUES TO BE DUPLICATED TO THIS DAY BY INCOMPETENTS
IT WAS STANDARDIZED BY INCOMPETENTS.

Anyone who uses this function has shit for brains.

Here is it's source....


/* pANS stdio -- fgets */
#include "iolib.h"
char *fgets(char *as, int n, FILE *f){
int c;
char *s=as;
c = EOF;

/* No test is performed to see if a char=0 is part of the stream */
/* Hence if a char=0 is found, all of the stream following will */
/* be ignored by the calling routine because it must assume */
/* that the returned buffer is terminated with a char=0 */

while(n>1 && (c=getc(f))!=EOF){
*s++=c;
--n;
if(c=='\n') break;
}

if(c==EOF && s==as || ferror(f)) return NULL;
if(n) *s='\0';
return as;
}

Also note that the function as defined takes an int as the buffer length
since int can be either signed or unsigned depending on the compiler,
allowing negative buffer sizes places artificial limits on the size that
buffers can be and also places complications on how sizes can be reported if
the function is to be replaced with something more rational.

Also note that the function returns a pointer to the start of the buffer.
(as). as does not change and hence the return value is nothing more than a
binary null or (as). returning (s) on the other hand would allow the length
of the input stream that was read to be determined. Returning a character
count would be even better.

Further by defining the function to return a signed integer, and limiting
buffer sizes to sizeof(unsigned int)>>1 then a return value of <0 for error,
0 = EOF, >0 = valid input = # of chars read, is available.

Incompetent programmers write code as you see it.

Competent programmers write code as I have just described.

C is nothing but pure, unadulterated incompetence.

Slit their fucking ignorant throats.


---
The fgets ("file get string") function is similar to the gets function. This
function is deprecated -- that means it is obsolete and it is strongly
suggested you do not use it -- because it is dangerous. It is dangerous
because if the input data contains a null character, you can't tell. Don't
use fgets unless you know the data cannot contain a null. Don't use it to
read files edited by the user because, if the user inserts a null character,
you should either handle it properly or print a clear error message.
---

C was designed by incompetent fools, implemented by incompetent fools.
Standardized by a legion of incompetent fools, and they should all be strung
up by their genitals and whipped until they are dead.

How would a competent programmer write a similar input function.
Presuming we have fixed length strings, the buffer should be first cleared.
next the stream should be read as before, keeping all characters accept new
line or coniderig char=0 to also be a new line. EOF should be treated like
a newline

The function should <not> append anything to the input stream but return it
raw <without the terminating character> and should return the number of
characters read (less the terminating character)

The Ghost In The Machine

unread,
Sep 18, 2006, 8:00:03 PM9/18/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Mon, 18 Sep 2006 18:49:17 -0700
<FnFPg.66515$ED.4...@read2.cgocable.net>:

>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote in message
> news:paf3u3-
>> Wow. He's clairvoyant. So tell me, O Superintelligent
>> One, how the following program will suffer from a buffer
>> overflow failure.
>
>> while(fgets(buf, sizeof(buf), stdin) != NULL)
>
> It's quite simple...
>
> fgets(*buf, buflen)
>
> DOESN'T WORK. IT IS BROKEN.
> IT'S ALWAYS BEEN BROKEN.
> IT WAS WRITTEN BY INCOMPETENTS
> IT CONTINUES TO BE DUPLICATED TO THIS DAY BY INCOMPETENTS
> IT WAS STANDARDIZED BY INCOMPETENTS.

I see. What function should be used in its stead?
It is important that we identify all such functions and
replace them. For example, fgets() should never be used,
obviously.

(I would suggest readline() but that gets into forbidden
territory. This function, and any derivations thereof,
are covered under the GPL license -- and as any Wintrool
knows, the GPL license is a viral, dangerous license,
tending to free code which shouldn't be freed but buried
in peat moss so as not to expose major Linux security
weaknesses such as the above.)

>
> Anyone who uses this function has shit for brains.
>
> Here is it's source....

One version thereof. I'm unpacking 2.3.6-r4 now on my Gentoo box.
It is far different from this pANS (?) version, and is calling
_IO_getline().

(I'd unpack 2.4 but Gentoo is complaining about nptl.)

>
>
> /* pANS stdio -- fgets */
> #include "iolib.h"
> char *fgets(char *as, int n, FILE *f){
> int c;
> char *s=as;
> c = EOF;
>
> /* No test is performed to see if a char=0 is part of the stream */
> /* Hence if a char=0 is found, all of the stream following will */
> /* be ignored by the calling routine because it must assume */
> /* that the returned buffer is terminated with a char=0 */
>
> while(n>1 && (c=getc(f))!=EOF){
> *s++=c;
> --n;
> if(c=='\n') break;
> }
>
> if(c==EOF && s==as || ferror(f)) return NULL;
> if(n) *s='\0';
> return as;
> }
>
> Also note that the function as defined takes an int as the buffer length
> since int can be either signed or unsigned depending on the compiler,
> allowing negative buffer sizes places artificial limits on the size that
> buffers can be and also places complications on how sizes can be reported if
> the function is to be replaced with something more rational.

A valid if slightly old complaint. Back in the PDP 11 days one was
lucky if total memory was 120k, and a process could only have 32 k
of space for code and 32k of space for data. At some point POSIX will
want to revisit this, of course...or simply replace this routine
with something Scott Nudds Approved(tm).

>
> Also note that the function returns a pointer to the start of the buffer.
> (as). as does not change and hence the return value is nothing more than a
> binary null or (as). returning (s) on the other hand would allow the length
> of the input stream that was read to be determined. Returning a character
> count would be even better.

A valid complaint.

>
> Further by defining the function to return a signed integer, and limiting
> buffer sizes to sizeof(unsigned int)>>1 then a return value of <0 for error,
> 0 = EOF, >0 = valid input = # of chars read, is available.
>
> Incompetent programmers write code as you see it.
>
> Competent programmers write code as I have just described.
>
> C is nothing but pure, unadulterated incompetence.
>
> Slit their fucking ignorant throats.

Of course. We should be using C# instead. It's the Language Of The
Future(tm).

>
>
> ---
> The fgets ("file get string") function is similar to the gets function.

It is virtually identical, except that fgets() has an integer length
and a file pointer.

> This
> function is deprecated

Both functions are deprecated. The standards committee just doesn't
know it yet. :-)

> -- that means it is obsolete and it is strongly
> suggested you do not use it -- because it is dangerous. It is dangerous
> because if the input data contains a null character, you can't tell. Don't
> use fgets unless you know the data cannot contain a null. Don't use it to
> read files edited by the user because, if the user inserts a null character,
> you should either handle it properly or print a clear error message.
> ---
>
> C was designed by incompetent fools, implemented by incompetent fools.
> Standardized by a legion of incompetent fools, and they should all be strung
> up by their genitals and whipped until they are dead.

You do have issues. Let GWB do the torture. We're here to implement
a brand new language for Windows.

I think....? :-)

>
> How would a competent programmer write a similar input function.

First, he would not. Someone else would write it, and he'd use it,
since it's such a general problem.

If one were to write it, given the criteria you've espoused, one might
write it as follows:

bool readBuffer(const char * inPrompt, unsigned inPromptLen,
fid_t outfid, bool throwOnError, fid_t infid, char * readBuffer,
unsigned readBufferSize, unsigned * readBufferReturn) throws IOException
{
/* implementation */
}

The function returns success or failure, and can throw an exception.
(This is a C++ capability. It's not used that often.)

inPrompt, inPromptLen, and outFid are typically things like "$ ", 2,
and 1 (1 = standard output; I forget the symbolic form for an integer
fid and it depends on whether one wants to discuss Portable Library --
FILE * -- or integer descriptors). readBuffer, readBufferSize, and
readBufferReturn should be obvious.

fid_t is an abstract type for now.

We can quibble as to whether some of these could be globals (e.g.,
throwOnError could be a global set somewhere; this leads to threading
issues).

> Presuming we have fixed length strings, the buffer should be first cleared.
> next the stream should be read as before, keeping all characters accept new
> line or coniderig char=0 to also be a new line. EOF should be treated like
> a newline
>
> The function should <not> append anything to the input stream but return it
> raw <without the terminating character> and should return the number of
> characters read (less the terminating character)
>

Check std::readline() for something that is closer to your
specifications.

Scott Nudds

unread,
Sep 18, 2006, 11:44:16 PM9/18/06
to

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> I see. What function should be used in its stead?
> It is important that we identify all such functions and
> replace them. For example, fgets() should never be used,
> obviously.

Then why is the function in the standard I/O library?

Who was the incompetent cock sucker who wrote it?
Who was the incompetent cock sucker who accepted it in the library?
Who were the incompetent cock suckers who have used it?
Who are the incompetent cock suckers who voted for it's inclusion in the
standard?

Slit their incompetent throats.......
All of them.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> (I would suggest readline() but that gets into forbidden
> territory. This function, and any derivations thereof,
> are covered under the GPL license -- and as any Wintrool
> knows, the GPL license is a viral, dangerous license,
> tending to free code which shouldn't be freed but buried
> in peat moss so as not to expose major Linux security
> weaknesses such as the above.)

More importantly readline isn't part of the standard C I/O library and it
calls fgets() the function that I just pegged as BROKEN.

HENCE READLINE IS BROKEN AS WELL.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> One version thereof. I'm unpacking 2.3.6-r4 now on my Gentoo box.
> It is far different from this pANS (?) version, and is calling
> _IO_getline().

Which illustrates another reason why C IS SUCH A FUKING PIECE OF SHIT.
Version incompatibility is inescapable, and hence incompatibility is
inescapable.


> > Also note that the function as defined takes an int as the buffer length
> > since int can be either signed or unsigned depending on the compiler,
> > allowing negative buffer sizes places artificial limits on the size that
> > buffers can be and also places complications on how sizes can be
reported if
> > the function is to be replaced with something more rational.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> A valid if slightly old complaint. Back in the PDP 11 days one was
> lucky if total memory was 120k, and a process could only have 32 k
> of space for code and 32k of space for data. At some point POSIX will
> want to revisit this, of course...or simply replace this routine
> with something Scott Nudds Approved(tm).

All the more reason to have the functions work and return usable data
rather than be broken and return less than optimal data. In both instances
this necessitates reproducing the code without errors in every application,
and hence increasing the size of every application that would like to
actually work rather than fail.

In other words, the C I/O library fails at it's secondary task as well.
Being convenient and utilitarian.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Of course. We should be using C# instead. It's the Language Of The
> Future(tm).

C like LinSux has all the components to be a valid language that has merrit.
C++ goes perhaps 80% the way there, but still fails on too many fronts as a
result of the need for backward compatibility.

Pointers in themselves are not an issue. Incompetent language design and
implementation are.

C was designed and developed by a legion of incompetents....

> > C was designed by incompetent fools, implemented by incompetent fools.
> > Standardized by a legion of incompetent fools, and they should all be
strung
> > up by their genitals and whipped until they are dead.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> You do have issues. Let GWB do the torture. We're here to implement
> a brand new language for Windows.
>
> I think....? :-)

DotNet.....


> > How would a competent programmer write a similar input function.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> First, he would not. Someone else would write it, and he'd use it,
> since it's such a general problem.

That someone else had better be a competent programmer or else you end up
with the incompetent results that were produced by K&R & the other
intellectual criminals.
.

> If one were to write it, given the criteria you've espoused, one might
> write it as follows:
>
> bool readBuffer(const char * inPrompt, unsigned inPromptLen,
> fid_t outfid, bool throwOnError, fid_t infid, char * readBuffer,
> unsigned readBufferSize, unsigned * readBufferReturn) throws IOException
> {
> /* implementation */
> }

uint ReadBuffer(*buffer, uint Buflen, StreamHandleType sthdl, uint flags)
begin
' Implementation
end ReadBuffer

The function returns # of chars, EOF, failure or throws an exception.

But instead C gives the user fgets and gets.......

Fucking incompetents 40 years of this incompetence.

Slit their fucking throats... All of them...

The Ghost In The Machine

unread,
Sep 19, 2006, 1:00:03 PM9/19/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Mon, 18 Sep 2006 20:44:16 -0700
<p3HPg.98744$sS1....@read1.cgocable.net>:

>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> I see. What function should be used in its stead?
>> It is important that we identify all such functions and
>> replace them. For example, fgets() should never be used,
>> obviously.
>
> Then why is the function in the standard I/O library?

Because nobody has petitioned the POSIX group for a replacement.
(Even after acceptance of such, there will be an interlude, as code is
mutated.)

>
> Who was the incompetent cock sucker who wrote it?

Many. Implementations exist all over the spectrum, from
Whitespace C on VMS way way back when to Amiga to Atari to
Linux today, FreeBSD, HURD, and yes, even Windows.

> Who was the incompetent cock sucker who accepted it in the library?

Who indeed.

> Who were the incompetent cock suckers who have used it?

Even more than those who have implemented it. It's a
simple if slightly off-color solution to the problem of
"how do I read a line anyway?".

C has two basic, tragic flaws. Unlike Pascal or FORTRAN,
C's strings *have no intrinsic length*. One is expected
to go NUL-hunting. Second, C has no method to pass an
array/buffer length as a parameter; one has to explicitly
use sizeof() or constants:

fgets(char * buf, int len, FILE * fp);

as opposed to a more rational

fgets2(array[1..*] of char, stream_type t);

where fgets2() would simply extract the length from
the buffer using Java's '.length' or some such,
and use that. There's no possibility of buffer
overflow there.

(A third flaw, which Java partially addresses [I don't
know about C# but suspect it does too]: C's char is too
small to hold modern Unicode characters. Various
proposals have been floated and accepted to take care of
this issue, and we now have a right mess, reflected in
X's total inability to display multibyte characters properly
without helper libraries such as pango and cairo. At least,
I've not figured out to do it. To its credit, Windows can
at least display Unicode out of the box, though it is also
hampered by C's and C++'s many issues.)

For whatever reason, Kernigan and Ritchie decided both
were A Good Thing(tm). Maybe it was back then, when
memory was far tighter than it is now and one would
take all day to squeeze out a few bytes of code. One
could construe it as an ugly tradeoff.

I'm old enough to remember DOS overlays. That was -- painful.
640k for systems which can now handle more than 640 *megabytes*.
(640 gigabytes of RAM, soon.)

> Who are the incompetent cock suckers who voted for it's
> inclusion in the standard?

You already asked that.

>
> Slit their incompetent throats.......
> All of them.

You're probably talking about killing half of the software developers on
the planet -- including me! :-P . (The other half use VB. :-) )

I'd hate to think what your thoughts on FORTRAN might be. :-)

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> (I would suggest readline() but that gets into forbidden
>> territory. This function, and any derivations thereof,
>> are covered under the GPL license -- and as any Wintrool
>> knows, the GPL license is a viral, dangerous license,
>> tending to free code which shouldn't be freed but buried
>> in peat moss so as not to expose major Linux security
>> weaknesses such as the above.)
>
> More importantly readline isn't part of the standard C I/O library and it
> calls fgets() the function that I just pegged as BROKEN.
>
> HENCE READLINE IS BROKEN AS WELL.
>

OK. Noted. You might have to discuss that with Richard Stallman. :-)

>
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> One version thereof. I'm unpacking 2.3.6-r4 now on my Gentoo box.
>> It is far different from this pANS (?) version, and is calling
>> _IO_getline().
>
> Which illustrates another reason why C IS SUCH A FUKING PIECE OF SHIT.
> Version incompatibility is inescapable, and hence incompatibility is
> inescapable.

Noted.

>
>
>> > Also note that the function as defined takes an int as the buffer length
>> > since int can be either signed or unsigned depending on the compiler,
>> > allowing negative buffer sizes places artificial limits on the size that
>> > buffers can be and also places complications on how sizes can be
> reported if
>> > the function is to be replaced with something more rational.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> A valid if slightly old complaint. Back in the PDP 11 days one was
>> lucky if total memory was 120k, and a process could only have 32 k
>> of space for code and 32k of space for data. At some point POSIX will
>> want to revisit this, of course...or simply replace this routine
>> with something Scott Nudds Approved(tm).
>
> All the more reason to have the functions work and return usable data
> rather than be broken and return less than optimal data. In both instances
> this necessitates reproducing the code without errors in every application,
> and hence increasing the size of every application that would like to
> actually work rather than fail.
>
> In other words, the C I/O library fails at it's secondary task as well.
> Being convenient and utilitarian.

Good. Suggest a replacement.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Of course. We should be using C# instead. It's the Language Of The
>> Future(tm).
>
> C like LinSux has all the components to be a valid language that has merrit.

C has *no* components of *any* sort, unless one counts
separately-compilable object files. It is an invalid,
incompetent language, according to what I believe are your
measurement criteria.

[1] It has no provisions for string length, as mentioned
heretofore. All strings are merely character arrays
terminated with NUL. This introduces a large number
of kinks in a program. strlen(), of course, is an O(N)
workaround, and can suffer in some contexts such as device
space -- fortunately, device space is rarely encountered
outside of the kernel context. Byte storage versus number
of characters introduce further kinks...read up on UTF-8
sometime; it's a mess.

[2] It does not have packaging, polymorphism, or any OO
design tools at all, beyond maybe the ability to have a
function pointer. (C++ has a method pointer, which helps
a little.)

[3] It cannot assign one array to another or use
variably-sized arrays. (GNU CC *can* use a variable to
create an array on the stack but it cannot change its size
afterwards.) In other words, arrays aren't first-class
objects, if I understand the term correctly.

[4] Functions aren't first-class objects. In other words, one cannot
do things like

h = f + g
h(5)

in a reasonable fashion. (C++ can define operator() for
a class and model f and g as objects, but that's not quite
the same thing. Java has no operator overloading at all.)

> C++ goes perhaps 80% the way there, but still fails on too many fronts as a
> result of the need for backward compatibility.
>
> Pointers in themselves are not an issue. Incompetent language design and
> implementation are.

Pointers are a reflection of incompetence. They should
ideally not be used anywhere.

>
> C was designed and developed by a legion of incompetents....

AFAIK C was designed by two individuals, Kernigan and
Ritchie. However, I'd have to research the matter.
Of course many individuals have designed C *compilers*,
gcc, icc, and Microsoft C++ to name three. Amiga actually
had two at one point: Manx and Lattice. There's probably
been a fair number for DOS and other systems.

>
>
>
>> > C was designed by incompetent fools, implemented by incompetent fools.
>> > Standardized by a legion of incompetent fools, and they should all be
> strung
>> > up by their genitals and whipped until they are dead.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> You do have issues. Let GWB do the torture. We're here to implement
>> a brand new language for Windows.
>>
>> I think....? :-)
>
> DotNet.....

Noted.

>
>
>> > How would a competent programmer write a similar input function.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> First, he would not. Someone else would write it, and he'd use it,
>> since it's such a general problem.
>
> That someone else had better be a competent programmer or else you end up
> with the incompetent results that were produced by K&R & the other
> intellectual criminals.
> .

It is a tradeoff.

>
>> If one were to write it, given the criteria you've espoused, one might
>> write it as follows:
>>
>> bool readBuffer(const char * inPrompt, unsigned inPromptLen,
>> fid_t outfid, bool throwOnError, fid_t infid, char * readBuffer,
>> unsigned readBufferSize, unsigned * readBufferReturn) throws IOException
>> {
>> /* implementation */
>> }
>
> uint ReadBuffer(*buffer, uint Buflen, StreamHandleType sthdl, uint flags)
> begin
> ' Implementation
> end ReadBuffer

We can quibble as you like; I was returning a success code and storing
the number of characters successfully read (it is possible to read part
of the data and then fail) in a parameter passed by reference. You
are simply returning the number of characters successfully read;
I don't see an error code return.)

(I'm assuming the '*' means pass by reference here.)

This declaration does have a problem, however.
"uint flags" is invalid. Use SET of ENUM instead.

In a Pascal-ish dialect this might render:

function ReadBuffer(buffer: array[1..*] of char,
buflen: unsigned_integer,
var sthdl: stream_handle_type,
modifier_flags: set of (ASYNCHRONOUS, LINEBUFFERED, CRLF, NEWLINE, ...)
) : unsigned_integer;
begin
...
end;

or one can go with

function ReadBuffer(buffer: array[1..*] of char,
buflen: unsigned_integer,
var bufret: unsigned_integer,
var sthdl: stream_handle_type,
modifier_flags: set of (ASYNCHRONOUS, LINEBUFFERED, CRLF, NEWLINE, ...)
) : boolean;
begin
...
end;

depending on which is more important to return and which
can be relegated to a VAR parameter or simply dropped.

One of the more annoying aspects of Win32 C #include files
(I've not looked at C# variants) is figuring out which
flags are relevant to which routines. The documentation
tries, but ultimately it's probably better to use a SET
of ENUM. Regrettably, C does not support such; neither
does C++, although in a pinch one can cook up a bitvector.

One can quibble as to whether the buflen is really
necessary in this case (in Java, for instance, one can get
the length of an array using '.length'; I would think C#
has something similar), but without more linguistic info
it's hard to say. For example, one might contemplate
functions to create an array of char which shares another
array of char, but which is shorter and offset -- substr(),
for lack of a better name. However, this falls beyond the
ReadBuffer() specifications as I currently understand them.

>
> The function returns # of chars, EOF, failure or throws an exception.

EOF does not fit in the unsigned integer return scheme.
This is another problem with contemporary computer
languages; extension of the basic primitives is impossible.
Smalltalk is the only exception I know of, and I know
little about it, though www.squeak.org is an interesting
playbox therefor.

>
> But instead C gives the user fgets and gets.......
>
> Fucking incompetents 40 years of this incompetence.
>
> Slit their fucking throats... All of them...
>

Except for the Microsoft ones. We want to keep them. They make money.
:-)

Right?

Scott Nudds

unread,
Sep 19, 2006, 6:26:07 PM9/19/06
to

> In comp.os.linux.advocacy, Scott Nudds

> > Then why is the function in the standard I/O library?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Because nobody has petitioned the POSIX group for a replacement.
> (Even after acceptance of such, there will be an interlude, as code is
> mutated.)

"Replacement" you fucking moron? There should be nothing to replace because
these fucking incompetently written functions shouldn't be in any standard
in the first place.

What king of SHIT do these FUCKING RETARDS have for brains, when they ALLOW
THE CONSIDERATION, let alone ALLOW THE ADOPTION OF INCOMPETENTLY WRITTEN
FUNCTIONS IN ANY STANDARD?

Allowing INCOMPETENCE IN A STANDARD IS JUST ANOTHER SIGN OF INCOMPETENCE.

It's a sign of GREATER INCOMPETENCE in fact because an individual can have a
lapse of judgement and produce a bad function, but a group of competent
individuals should be immune to this kind of failure because they should
check each other's work.

SO WHAT DOES THIS TELL US ABOUT THE C-COMMUNITY AS A WHOLE - as they make
up the entirety of the standards committie?

It tells you that THE ENTIRE C PROGRAMMING COMMUNITY HAS SHIT FOR BRAINS.

They are unworthy of living. Too stupid to live. Waste of skin. Unwelcome
to breathe my air.
Slit their fucking ignorant throats. All of em.

Scott Nudds

unread,
Sep 19, 2006, 7:59:04 PM9/19/06
to

> Scott Nudds wrote:
> > Who was the incompetent cock sucker who wrote it?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Many. Implementations exist all over the spectrum, from
> Whitespace C on VMS way way back when to Amiga to Atari to
> Linux today, FreeBSD, HURD, and yes, even Windows.

Incompetents, upon incompetents, upon incompetents.

So what makes all of these C shit lickers to incompetent? Is it bad
drinking water? Have they not been beaten enogh as children? Or is it that
genetically predisposed Shit Licker are naturally attracted to languages
like C that are STINKING PILES OF FESTERING SHIT?


> Scott Nudds wrote:
> > Who was the incompetent cock sucker who accepted it in the library?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Who indeed.

Yes who? It wasn't just one individual. It was every individual in the
standards organization - they all knew the flaws and they all voted to
standardize them. And if they didn't know the flaws - given their obvious
nature - they are equally incompetent.

You just can't get around it. These fucking cocksuckser are too stupid to
be allowed to continue breathing.
Slit their throats....


> Scott Nudds wrote:
> > Who were the incompetent cock suckers who have used it?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Even more than those who have implemented it. It's a
> simple if slightly off-color solution to the problem of
> "how do I read a line anyway?".

Oh, wow, highly complex. Obviously vastly too complex for the most
advanced C Monkey. Input a line of text... 12 versions of how to do that,
and none work properly.

It takes a special kind of CockSucker society to standardize that.

Slit their throats... Every one of them...


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> C has two basic, tragic flaws. Unlike Pascal or FORTRAN,
> C's strings *have no intrinsic length*. One is expected
> to go NUL-hunting. Second, C has no method to pass an
> array/buffer length as a parameter; one has to explicitly
> use sizeof() or constants:

These are not flaws, but simply inconveniences. Store the string length in
the first character or two of the string if you like. Or process the
buffer until you get char=0, or just whitespace the string's tail.

The problem in this instance (fgets) is that the fuckups who wrote the
funciton can't make up their small little minds weather they intend to use
the function for binary or ASCII input. As a result, they can't manage the
null characters properly, as their management is different in both
instances.

So what a competent programmer would do in this instance would be to
generalize by providing a flag that tells the funciton how to manage the
null. In one instance, given the binary read swith it would just fill the
entire buffer with data, not concerning itself with eol or null. In the
text read mode, it would end at a null and eol, and replace eol with null so
that the ending condition is consistant.

Other modes could be provided to replace EOL with a space for example, or
any other host of options.

It isn't rocket science. But people in the C universe just can't seem to
manage the most trivial of tasks.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> where fgets2() would simply extract the length from
> the buffer using Java's '.length' or some such,
> and use that. There's no possibility of buffer
> overflow there.

One design creterion for the langauge was to have nothing hidden to the
programmer. Hidden length variables are just that unless they are properly
implemented. They can be implemented of course simply by storing the length
at a negative offset from the string pointer itself.

However this complicates string allocation,, struct interpretation etc.

Fixed length strings are often a pain in the ass to use, but they need not
be <IF> the rest of the environment is competently written.

C is <NOT> an exemple of a competently designed langauge.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> (A third flaw, which Java partially addresses [I don't
> know about C# but suspect it does too]: C's char is too
> small to hold modern Unicode characters. Various
> proposals have been floated and accepted to take care of
> this issue, and we now have a right mess, reflected in
> X's total inability to display multibyte characters properly
> without helper libraries such as pango and cairo. At least,
> I've not figured out to do it. To its credit, Windows can
> at least display Unicode out of the box, though it is also
> hampered by C's and C++'s many issues.)

Unicode is another spectacular failure brought to the world by the same
shit licking morons who standardized C.

It's very simple. Use ASCII And if you need more characters use
Extended ASCII, you have 256 characters in total there.

Need more characters yet? OK, go to the 65536 you can get with 2 bytes.
Need more still. Come to me and I will send you to the hell where you are
skinned alive.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> For whatever reason, Kernigan and Ritchie decided both
> were A Good Thing(tm).

Ya, well once you have shit for brains - as these morons do - then it's
hard not to have shit for brains.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Maybe it was back then, when
> memory was far tighter than it is now and one would
> take all day to squeeze out a few bytes of code. One
> could construe it as an ugly tradeoff.

At the time, Japanese computers were quite capable of displaying thousands
of characters.

It's not rocket science. But far too complex for cocksucking morons like
K&R to comprehend.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> I'm old enough to remember DOS overlays. That was -- painful.
> 640k for systems which can now handle more than 640 *megabytes*.
> (640 gigabytes of RAM, soon.)

And the inclusion of 1 instruction could have made all of that mess go away.
That instruction being the simple return of 4 into any of the 80x86's
registers. Instruction to have the Mnemonic SegSize.

Later 80x86 CPU's return SegSize = 8 allows the applicaiton to support 16
megs of RAM.
SegSize = 16 = 4 gig.


> > Who are the incompetent cock suckers who voted for it's
> > inclusion in the standard?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> You already asked that.

Potentially different people that share the blame. Slit their ignorant
throats.


> > Slit their incompetent throats.......
> > All of them.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> You're probably talking about killing half of the software developers on
> the planet -- including me! :-P . (The other half use VB. :-) )

So let it be written... So let it be done....


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> I'd hate to think what your thoughts on FORTRAN might be. :-)

Fortran doesn't suffer from the same level of incompetence as shown in the
C universe.
In the C universe, incompetence is epidemic, and originates in the basic
design criterion of the language itself.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Good. Suggest a replacement.

C is too brain dead to be fixed with simple tweaks. It's an abortion from
the ground up, which is a shame since it has some nice features. My
suggestion - my insistance - is that C be abandoned entirely, and those who
refuse, be collected, ground up, and used for fertilizer.

C++ goes quite a way to answering many of C's failures. But it too needs
to be completely abandoned.


> > C like LinSux has all the components to be a valid language that has
merrit.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> C has *no* components of *any* sort, unless one counts
> separately-compilable object files. It is an invalid,
> incompetent language, according to what I believe are your
> measurement criteria.

I am referrig to abstract components like structures, unions, functions,
pointers etc. It's all there, but ineptly implemented. Just like Linux.
It's all there, but in general, it's ineptly implemented.

As a result, unworthy of consideration.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> [1] It has no provisions for string length, as mentioned
> heretofore. All strings are merely character arrays
> terminated with NUL. This introduces a large number
> of kinks in a program. strlen(), of course, is an O(N)
> workaround, and can suffer in some contexts such as device
> space -- fortunately, device space is rarely encountered
> outside of the kernel context. Byte storage versus number
> of characters introduce further kinks...read up on UTF-8
> sometime; it's a mess.

Look, the entire concept is fucked. It's fucked from the word go. The
foundation is cracked from the beginning. It's rotten to the core.

Only fools build on quicksand. And yet people continue to build on
quicksand. What does that tell you?


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> [2] It does not have packaging, polymorphism, or any OO
> design tools at all, beyond maybe the ability to have a
> function pointer. (C++ has a method pointer, which helps
> a little.)

Fuck, it doesn't even have block type checking. It's defaults are
backwards with every module level definition having global scope.

It doesn't define what a fucking integer is. Or what size a character is.
Neither does it specify how or when incerment/decrement is performed.

OO is all about data encapsulation and isolation. C is by design overtly
hostile to the very concept of data encapsulation.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Java has no operator overloading at all.)

Which is probably a good thing. Operator overloading has very limited
applicability. Particularly where the operator precidence is fixed.
Operator creation on the other hand would have great utility.


> > C++ goes perhaps 80% the way there, but still fails on too many fronts
as a
> > result of the need for backward compatibility.
> >
> > Pointers in themselves are not an issue. Incompetent language design
and
> > implementation are.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Pointers are a reflection of incompetence. They should
> ideally not be used anywhere.

You mean, they should be hidden wherever possible.

Iterating over a list via pointer is still faster than array references.

The problem is not pointers, but unconstrained pointers. The problem with
languages like C that implement pointers is that they do not implement
bracketed pointers.


> > C was designed and developed by a legion of incompetents....

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> AFAIK C was designed by two individuals, Kernigan and Ritchie.

And they are the perpetrators of the majority of the crimes. But they
didn't write the entire I/O library. And they were not the exclusive members
of the standards committee. They didn't do all of the post rev1.0
development.

Other incompetents are responsible for those failures.


> We can quibble as you like; I was returning a success code and storing
> the number of characters successfully read (it is possible to read part
> of the data and then fail) in a parameter passed by reference. You
> are simply returning the number of characters successfully read;
> I don't see an error code return.)

0 = no characters read (eof) -n = error.


> This declaration does have a problem, however.
> "uint flags" is invalid. Use SET of ENUM instead.

It is because there is no uint type defined in c. But the intent is clear
Unsigned Int.

> One of the more annoying aspects of Win32 C #include files
> (I've not looked at C# variants) is figuring out which
> flags are relevant to which routines. The documentation
> tries, but ultimately it's probably better to use a SET
> of ENUM. Regrettably, C does not support such; neither
> does C++, although in a pinch one can cook up a bitvector.

C and C++ both provide enum, but don't do a *2 scale, but rather a simple
increment. Hence it is not convenient for flag creation.


> One can quibble as to whether the buflen is really
> necessary in this case (in Java, for instance, one can get
> the length of an array using '.length'; I would think C#
> has something similar), but without more linguistic info
> it's hard to say. For example, one might contemplate
> functions to create an array of char which shares another
> array of char, but which is shorter and offset -- substr(),
> for lack of a better name. However, this falls beyond the
> ReadBuffer() specifications as I currently understand them.

That is typically the way I do it in Machine code. Ptr-> buffer
with -offset = buffer specific details, length, # of chars contained, etc...

> > The function returns # of chars, EOF, failure or throws an exception.
>
> EOF does not fit in the unsigned integer return scheme.
> This is another problem with contemporary computer
> languages; extension of the basic primitives is impossible.

EOF = 0 in this instance.


> > But instead C gives the user fgets and gets.......
> >
> > Fucking incompetents 40 years of this incompetence.
> >
> > Slit their fucking throats... All of them...

> Except for the Microsoft ones. We want to keep them. They make money.
> :-) Right?

Wrong. I have no love of Microsoft and note that from 1980 to 1995 they
managed to grow as a result of inept competiton from the Unix community.
Inept competition that continues until this day.

However with the advent of Win32, and increasingly more so as time has
passed, Microsoft has produced ever improving code, and now produces most
probably the best products available. with some minor exceptions for nich
products.

Linux./Unix however continues to flounder, fumbling around with attempt
after attempt to make itself more compatible with past Unix failures.

Unix/Linux is no real competition to Microsoft. It could be. But the
Unix/Linux community are too content with living in a 1970 teletype driven
world.

The Ghost In The Machine

unread,
Sep 19, 2006, 5:00:15 PM9/19/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Tue, 19 Sep 2006 15:26:07 -0700
<2vXPg.66716$ED.6...@read2.cgocable.net>:

>
>> In comp.os.linux.advocacy, Scott Nudds
>> > Then why is the function in the standard I/O library?
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Because nobody has petitioned the POSIX group for a replacement.
>> (Even after acceptance of such, there will be an interlude, as code is
>> mutated.)
>
> "Replacement" you fucking moron?

Yes, replacement, unless you want to completely get rid of all
line-reading capability. I am not going to suggest that a line reader
be bug-for-bug or parameter-for-parameter compatible, in this context,
but one has to be able to read a line, if only to accept a line of user
text from a file as it is being parsed.

> There should be nothing to replace because
> these fucking incompetently written functions shouldn't be in any standard
> in the first place.

If wishes were horses beggars would ride. In any event, they were
written and now need to be replaced. The question is: how?

>
> What king of SHIT do these FUCKING RETARDS have for brains, when they ALLOW
> THE CONSIDERATION, let alone ALLOW THE ADOPTION OF INCOMPETENTLY WRITTEN
> FUNCTIONS IN ANY STANDARD?

The complacency of the user base.

>
> Allowing INCOMPETENCE IN A STANDARD IS JUST ANOTHER SIGN OF INCOMPETENCE.
>
> It's a sign of GREATER INCOMPETENCE in fact because an individual can have a
> lapse of judgement and produce a bad function, but a group of competent
> individuals should be immune to this kind of failure because they should
> check each other's work.
>
> SO WHAT DOES THIS TELL US ABOUT THE C-COMMUNITY AS A WHOLE - as they make
> up the entirety of the standards committie?
>
> It tells you that THE ENTIRE C PROGRAMMING COMMUNITY HAS SHIT FOR BRAINS.
>
> They are unworthy of living. Too stupid to live. Waste of skin. Unwelcome
> to breathe my air.
> Slit their fucking ignorant throats. All of em.
>

I would not recommend you try that approach. There's some other
individuals out there who are rather more competent...law enforcement.
:-)

The Ghost In The Machine

unread,
Sep 19, 2006, 7:00:09 PM9/19/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Tue, 19 Sep 2006 16:59:04 -0700
<aSYPg.101211$sS1....@read1.cgocable.net>:

>
>> Scott Nudds wrote:
>> > Who was the incompetent cock sucker who wrote it?
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Many. Implementations exist all over the spectrum, from
>> Whitespace C on VMS way way back when to Amiga to Atari to
>> Linux today, FreeBSD, HURD, and yes, even Windows.
>
> Incompetents, upon incompetents, upon incompetents.
>
> So what makes all of these C shit lickers to incompetent? Is it bad
> drinking water? Have they not been beaten enogh as children? Or is it that
> genetically predisposed Shit Licker are naturally attracted to languages
> like C that are STINKING PILES OF FESTERING SHIT?
>

I don't do judgements. I do fixes. How do we fix the codebase?
Reimplementation in C# or Squeak might be interesting.

>
>
>
>> Scott Nudds wrote:
>> > Who was the incompetent cock sucker who accepted it in the library?
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Who indeed.
>
> Yes who? It wasn't just one individual. It was every individual in the
> standards organization - they all knew the flaws and they all voted to
> standardize them. And if they didn't know the flaws - given their obvious
> nature - they are equally incompetent.
>
> You just can't get around it. These fucking cocksuckser are too stupid to
> be allowed to continue breathing.
> Slit their throats....

Gad, take a chill pill, man. Just lock them up for life for "failure to
present a solution that improves computer science" or something. No
need to shed blood here. :-)

>
>
>> Scott Nudds wrote:
>> > Who were the incompetent cock suckers who have used it?
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Even more than those who have implemented it. It's a
>> simple if slightly off-color solution to the problem of
>> "how do I read a line anyway?".
>
> Oh, wow, highly complex. Obviously vastly too complex for the most
> advanced C Monkey. Input a line of text... 12 versions of how to do that,
> and none work properly.
>
> It takes a special kind of CockSucker society to standardize that.

Yes, a government-paid one, apparently.

>
> Slit their throats... Every one of them...
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> C has two basic, tragic flaws. Unlike Pascal or FORTRAN,
>> C's strings *have no intrinsic length*. One is expected
>> to go NUL-hunting. Second, C has no method to pass an
>> array/buffer length as a parameter; one has to explicitly
>> use sizeof() or constants:
>
> These are not flaws, but simply inconveniences.

They are outright, irretrievable flaws, and variations on these
flaws are responsible for most of the buffer overflows. For example,
try this:

char * readMyData(char * inbuffer)
{
fgets(inbuffer, sizeof(inbuffer), stdin);

return inbuffer;
}

Very very sloppy programming, I know. Guess what sizeof() returns?
It's not the size of the buffer -- in this case, it's the size of the
*pointer*, and therefore one is guaranteed to read all of 3 bytes and a
NUL (or 7 bytes if one's lucky enough to have a 64-bit architecture).

If one's lucky the 3 bytes might include a newline. Might be good
for a US state code or various two-letter country affairs.

Contrast that to this Java example:

class Example
{
public String readMyData(StringBuffer sb)
{
/* implementation elided */
}
}

Several differences are immediately apparent.

[1] Strings in Java are *immutable*, except by using very hackish
introspection techniques. Passing one in is therefore pointless
if one expects to modify it; it must be part of the return, either
alone (as in here) or as part of a response class which is returned
as a chunk.

[2] StringBuffers and StringBuilders have no inherent limitation
regarding length -- until one runs out of resource, of course.

[3] Strings can handle arbitrarily large chars.

> Store the string length in
> the first character or two of the string if you like. Or process the
> buffer until you get char=0, or just whitespace the string's tail.

The buffer is initially uninitialized. One interesting (and rather
gruesome) test might be along the lines of the following:

void hardTest()
{
char tmpbuf[256];

for(int i =0; i < sizeof(tmpbuf); i++) tmpbuf[i] = (char) (rand() * 256);

routineToTest(tmpbuf);
}

Good luck, routineToTest. You'll need it.

In practice, programs don't usually explicitly bother to initialize
buffers like this using calls to rand() or random(), but they might as
well; a typically assembly sequence just does

hardTest:
PUSH BP
MOV BP, SP
LEA -256(BP), SP
LEA -256(BP), SI
PUSH SI
CALL _routineToTest

or some such. The memory between BP-256 and BP? Might as well be
random; certainly this routine doesn't initialize it, and if the caller
to this routine called someone else that scribbled all over stack,
well...

>
> The problem in this instance (fgets) is that the fuckups who wrote the
> funciton can't make up their small little minds weather they intend to use
> the function for binary or ASCII input. As a result, they can't manage the
> null characters properly, as their management is different in both
> instances.

I for one would think fgets() is exclusively for ASCII, but it's
otherwise a good point. In any event, binary presents its own
challenges; fwrite() does well enough but isn't very portable
across machine types.

(Guess how I know. :-) )

>
> So what a competent programmer would do in this instance would be to
> generalize by providing a flag that tells the funciton how to manage the
> null. In one instance, given the binary read swith it would just fill the
> entire buffer with data, not concerning itself with eol or null. In the
> text read mode, it would end at a null and eol, and replace eol with null so
> that the ending condition is consistant.
>
> Other modes could be provided to replace EOL with a space for example, or
> any other host of options.
>
> It isn't rocket science. But people in the C universe just can't seem to
> manage the most trivial of tasks.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> where fgets2() would simply extract the length from
>> the buffer using Java's '.length' or some such,
>> and use that. There's no possibility of buffer
>> overflow there.
>
> One design creterion for the langauge was to have nothing hidden to the
> programmer. Hidden length variables are just that unless they are properly
> implemented.

.length is not a variable, but an attribute. One could liken it to
C++'s "this", which isn't really a variable either, but a context.
Or one can think of sizeof(), a pseudo-function.

Of course the compiler might have to store it in a register
(not necessarily the same one, and it can "hand off" to
another register during part of its processing).

> They can be implemented of course simply by storing the length
> at a negative offset from the string pointer itself.

That would work to some extent, for pointers to strings.

>
> However this complicates string allocation,, struct interpretation etc.

Depends. C has another flaw: *it can't tell the difference between
an array and a pointer*. In short, the following works:

char hello[] = "Hello, World!";

puts(hello);

despite the fact that puts(const char *) is the declaration. The
problems in this should be obvious to anyone who's used PASCAL
or perhaps Modula-2.

If puts() is dumb enough to try to look at
*((long *) (p - 4)), assuming that's where the length of
the string would be, things get ... messy.

>
> Fixed length strings are often a pain in the ass to use, but they need not
> be <IF> the rest of the environment is competently written.
>
> C is <NOT> an exemple of a competently designed langauge.

Agreed here. Not sure what to replace it with yet, though, and
Linus might have some ideas thereon. :-) After all, he
created this kernel, and used C for its implementation (along
with the necessary assembly "glue"). Others followed his lead.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> (A third flaw, which Java partially addresses [I don't
>> know about C# but suspect it does too]: C's char is too
>> small to hold modern Unicode characters. Various
>> proposals have been floated and accepted to take care of
>> this issue, and we now have a right mess, reflected in
>> X's total inability to display multibyte characters properly
>> without helper libraries such as pango and cairo. At least,
>> I've not figured out to do it. To its credit, Windows can
>> at least display Unicode out of the box, though it is also
>> hampered by C's and C++'s many issues.)
>
> Unicode is another spectacular failure brought to the world by the same
> shit licking morons who standardized C.

Unicode has some major problems, most of them having to do with
backwards compatibility concerns. The ISO-8859-* is ugly, but
only a tiny part of the problem apparently. Fonts in X are very
troublesome, requiring Cairo or Pango to wrap over the worst of the
issues.

For its part Windows Unicode is one of the few bits they got more or
less *right*. And even then, they had to split the codebase -- there's
two variants of every text-based routine.

>
> It's very simple. Use ASCII And if you need more characters use
> Extended ASCII, you have 256 characters in total there.

Whoopee. The Chinese will not be very happy with you. The mathematicians
will of course want to pick a bone as well. Arabic is an interesting
language, written *backwards* (well, OK, it's a viewpoint from the US
side of the fence :-) ). Chinese is written *sideways*, traditionally.
Japanese has *three character* sets to fiddle with: traditional kanjii,
hiragana, and katakana.

>
> Need more characters yet? OK, go to the 65536 you can get with 2 bytes.
> Need more still. Come to me and I will send you to the hell where you are
> skinned alive.

And how does one read these characters?

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> For whatever reason, Kernigan and Ritchie decided both
>> were A Good Thing(tm).
>
> Ya, well once you have shit for brains - as these morons do - then it's
> hard not to have shit for brains.
>
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Maybe it was back then, when
>> memory was far tighter than it is now and one would
>> take all day to squeeze out a few bytes of code. One
>> could construe it as an ugly tradeoff.
>
> At the time, Japanese computers were quite capable of displaying thousands
> of characters.

AFAIK, they still are.

>
> It's not rocket science. But far too complex for cocksucking morons like
> K&R to comprehend.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> I'm old enough to remember DOS overlays. That was -- painful.
>> 640k for systems which can now handle more than 640 *megabytes*.
>> (640 gigabytes of RAM, soon.)
>
> And the inclusion of 1 instruction could have made all of that mess go away.
> That instruction being the simple return of 4 into any of the 80x86's
> registers. Instruction to have the Mnemonic SegSize.
>
> Later 80x86 CPU's return SegSize = 8 allows the applicaiton to support 16
> megs of RAM.
> SegSize = 16 = 4 gig.

64 GB is now possible. I don't know the details.

>
>
>> > Who are the incompetent cock suckers who voted for it's
>> > inclusion in the standard?
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> You already asked that.
>
> Potentially different people that share the blame. Slit their ignorant
> throats.
>
>
>> > Slit their incompetent throats.......
>> > All of them.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> You're probably talking about killing half of the software developers on
>> the planet -- including me! :-P . (The other half use VB. :-) )
>
> So let it be written... So let it be done....

I'd be careful if I were you. You might have to ask the Legislature to
change the Penal Code (section 187 in California; other localities will
vary) first.

>
>
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> I'd hate to think what your thoughts on FORTRAN might be. :-)
>
> Fortran doesn't suffer from the same level of incompetence as shown in the
> C universe.

No, but it does have its own special challenges. You might want to grab
the SPICE source sometime, for example; its ideas regarding memory
allocation are ... quaint.

> In the C universe, incompetence is epidemic, and originates in the basic
> design criterion of the language itself.
>
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Good. Suggest a replacement.
>
> C is too brain dead to be fixed with simple tweaks.

I was referring to fgets(), but OK, go ahead; expand scope.

> It's an abortion from
> the ground up, which is a shame since it has some nice features.

Such as...? I see no real useful features in C.

> My
> suggestion - my insistance - is that C be abandoned entirely, and those who
> refuse, be collected, ground up, and used for fertilizer.
>
> C++ goes quite a way to answering many of C's failures. But it too needs
> to be completely abandoned.

Good. Now suggest a replacement for both C and C++.
Is C# sufficient, for example? One might have some work
to do in order to rewrite the Linux kernel, but in theory
it's possible.

Or did you prefer Smalltalk/Squeak, Python, or Java? Java
has its own silly issues.

>
>
>> > C like LinSux has all the components to be a valid language that has
> merrit.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> C has *no* components of *any* sort, unless one counts
>> separately-compilable object files. It is an invalid,
>> incompetent language, according to what I believe are your
>> measurement criteria.
>
> I am referrig to abstract components like structures, unions, functions,
> pointers etc. It's all there, but ineptly implemented.

No, it's not all there. It's not even close to all there.

Structures: Yes.
Unions: An abomination leading to many bugs.
Classes: Nope.
Class permissions: n/a
(C++ has public:, protected:, private:. Java introduces
package level by omitting the keyword.)
Functions: Second-level only.
(A first-level object would allow for assignments, copying,
and even some operators on that object. Function pointers,
like arrays, can be referenced, but that's about it; certainly
they cannot be modified.)
Arrays: Second-level only.
Pointers: Another abomination.
Polymorphism: Nope.
(A polymorphic collection allows for many object types, all of them
having a common subbase, in the collection. For example,
for(std::list<item*>::iterator i = l1.begin();
i != l1.end(); i++) { (*i)->print(); }
where l1 is a list of things that are known to be subclasses
of 'class item', which contains 'virtual void print() = 0;'.
Java and C# have similar concepts.)
Methods: Nope.
(C++ introduced methods.)
Nested classes: Not even close.
Overloading: Nope.
(C++ introduced the concept of having two functions with the same
name but different signatures.)
Methods on primitives: Nope.
(A feature in Smalltalk.)
Object introspection: Nope.
(Java allows for lookup of methods and fields on an object, and
can even call such methods. There is a Proxy class as well.
C of course has nothing, not even on structures and unions.)
Object property-setting-to-call: Nope.
(C# is the only language I know of that has this.)
Class modifiability: Nope.
(This interesting capability allows for addition and editing
of methods on existing objects. Smalltalk has it. Not many
other languages do.)
Templates: Nope.
(C has no templates; everything uses the preprocessor. C++
introduces templates, with many issues. Java's templates are also
problematic. I don't know what C# has.)
Exceptions: Nope.
(C++ has them; they're not used that often. Java has them and
they're used extensively.)
Virtual: Nope.
(Closest C can get is a function pointer, and such is actually used
in the Linux kernel.)
Threading support: Nope.
(External libraries pick up some of the slack.)
Array Length: Nope.
(Java has a.length. C++ STL has dynamic classes that attempt to
mimic arrays but that's not strictly speaking part of the language.)
Dynamic Array Creation: Nope.
(GCC has an extention that allows
int n = ....;
int a[n]; Java allows int[] a = new int[n]. That's as close
as they get. C can point to an arbitrary size array, but
that's a different idea and is an abomination anyway.)
Dynamic Array Size Modification: Nope.
(C uses realloc(), which only works on pointers.)
Arbitrary method call: Nope.
(Another Smalltalkism, although Java's invoke on introspected
methods comes close.)
Transparent casting: Nope.
(I'm not sure what to call this but in C++ an object will
occasionally be constructed or cast when one wants to call a method
on another object related thereto. This can lead to subtle bugs but
can also be very useful.)
Dynamic cast checking: Nope.
(C++ has several types of casts, which may be overkill. Java has
ClassCastException.)
Pointer Arithmetic: Yep.
(This is one reason C gets away with so much. If one declares
int a[10];
int *b = a;
then b[0] == a[0] and *(b+1) == a[1]. One can contrast
this with Draco, which did not have the implicit multiplication,
or with a modified Pascal with the addr() construct. In a
more proper language one would write
int *b = &a; or int *b = addr(a);.)


> Just like Linux.
> It's all there, but in general, it's ineptly implemented.
>
> As a result, unworthy of consideration.

Of course. And you've made a detailed study of the Vista source
code to ensure that all of your objections have been met, I trust?

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> [1] It has no provisions for string length, as mentioned
>> heretofore. All strings are merely character arrays
>> terminated with NUL. This introduces a large number
>> of kinks in a program. strlen(), of course, is an O(N)
>> workaround, and can suffer in some contexts such as device
>> space -- fortunately, device space is rarely encountered
>> outside of the kernel context. Byte storage versus number
>> of characters introduce further kinks...read up on UTF-8
>> sometime; it's a mess.
>
> Look, the entire concept is fucked. It's fucked from the word go. The
> foundation is cracked from the beginning. It's rotten to the core.

True.

>
> Only fools build on quicksand. And yet people continue to build on
> quicksand. What does that tell you?

You said it was cracked, not quicksand. Of course Unix is built on C.
What does *that* tell *you*?

All it tells me is that Unix is a tried and true hack. :-) C is a devil
of a mess but it's a well-known devil. I'm not sure regarding its
replacement yet; you aren't giving me a lot of options beyond .NET.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> [2] It does not have packaging, polymorphism, or any OO
>> design tools at all, beyond maybe the ability to have a
>> function pointer. (C++ has a method pointer, which helps
>> a little.)
>
> Fuck, it doesn't even have block type checking. It's defaults are
> backwards with every module level definition having global scope.
>
> It doesn't define what a fucking integer is.

Actually, it does, just rather vaguely. I'd have to look. Of course
part of the issue back then was variable sizes of machine word types.
Honeywell, for instance, had an old machine that had 9-bit bytes.

> Or what size a character is.
> Neither does it specify how or when incerment/decrement is performed.
>
> OO is all about data encapsulation and isolation. C is by design overtly
> hostile to the very concept of data encapsulation.

Correct.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Java has no operator overloading at all.)
>
> Which is probably a good thing. Operator overloading has very limited
> applicability. Particularly where the operator precidence is fixed.
> Operator creation on the other hand would have great utility.

Java doesn't have that, either; neither does C++. The only language I
know that has such is a rather obscure dialect at one point named ICL; I
believe it was from CAL TECH. Of course its idea of an operator was
\<word>, but one could create as many such as one desired.

Smalltalk might have something but I'd have to look.

>
>
>> > C++ goes perhaps 80% the way there, but still fails on too many fronts
> as a
>> > result of the need for backward compatibility.
>> >
>> > Pointers in themselves are not an issue. Incompetent language design
> and
>> > implementation are.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Pointers are a reflection of incompetence. They should
>> ideally not be used anywhere.
>
> You mean, they should be hidden wherever possible.

If you prefer. Certainly they should not be out in the open where abuse
is rampant.

>
> Iterating over a list via pointer is still faster than array references.

Implementation detail.

>
> The problem is not pointers, but unconstrained pointers. The problem with
> languages like C that implement pointers is that they do not implement
> bracketed pointers.
>
>
>> > C was designed and developed by a legion of incompetents....
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> AFAIK C was designed by two individuals, Kernigan and Ritchie.
>
> And they are the perpetrators of the majority of the crimes. But they
> didn't write the entire I/O library. And they were not the exclusive members
> of the standards committee. They didn't do all of the post rev1.0
> development.
>
> Other incompetents are responsible for those failures.
>
>
>
>
>
>
>> We can quibble as you like; I was returning a success code and storing
>> the number of characters successfully read (it is possible to read part
>> of the data and then fail) in a parameter passed by reference. You
>> are simply returning the number of characters successfully read;
>> I don't see an error code return.)
>
> 0 = no characters read (eof) -n = error.

Negative numbers are not possible in an unsigned integer context.

>
>
>> This declaration does have a problem, however.
>> "uint flags" is invalid. Use SET of ENUM instead.
>
> It is because there is no uint type defined in c. But the intent is clear
> Unsigned Int.

Unsigned set of bitflags, you mean. A lot of Win32 calls are of this
type, ORing flags all over the place. Useful, but can be problematic.

>
>
>
>> One of the more annoying aspects of Win32 C #include files
>> (I've not looked at C# variants) is figuring out which
>> flags are relevant to which routines. The documentation
>> tries, but ultimately it's probably better to use a SET
>> of ENUM. Regrettably, C does not support such; neither
>> does C++, although in a pinch one can cook up a bitvector.
>
> C and C++ both provide enum, but don't do a *2 scale, but rather a simple
> increment. Hence it is not convenient for flag creation.

Assignment of arbitrary integers to enum values is possible, if a bit
ugly.

>
>
>> One can quibble as to whether the buflen is really
>> necessary in this case (in Java, for instance, one can get
>> the length of an array using '.length'; I would think C#
>> has something similar), but without more linguistic info
>> it's hard to say. For example, one might contemplate
>> functions to create an array of char which shares another
>> array of char, but which is shorter and offset -- substr(),
>> for lack of a better name. However, this falls beyond the
>> ReadBuffer() specifications as I currently understand them.
>
> That is typically the way I do it in Machine code. Ptr-> buffer
> with -offset = buffer specific details, length, # of chars contained, etc...
>
>
>
>> > The function returns # of chars, EOF, failure or throws an exception.
>>
>> EOF does not fit in the unsigned integer return scheme.
>> This is another problem with contemporary computer
>> languages; extension of the basic primitives is impossible.
>
> EOF = 0 in this instance.

Problematic in some communications. Sockets in particular may have no
data available in non-blocking mode. This is not an EOF but simply
an indication that there's no data available right now.

>
>
>> > But instead C gives the user fgets and gets.......
>> >
>> > Fucking incompetents 40 years of this incompetence.
>> >
>> > Slit their fucking throats... All of them...
>
>> Except for the Microsoft ones. We want to keep them. They make money.
>> :-) Right?
>
> Wrong. I have no love of Microsoft and note that from 1980 to 1995 they
> managed to grow as a result of inept competiton from the Unix community.
> Inept competition that continues until this day.

And they are still growing.

>
> However with the advent of Win32, and increasingly more so as time has
> passed, Microsoft has produced ever improving code, and now produces most
> probably the best products available. with some minor exceptions for nich
> products.
>
> Linux./Unix however continues to flounder, fumbling around with attempt
> after attempt to make itself more compatible with past Unix failures.
>
> Unix/Linux is no real competition to Microsoft. It could be.

No, it cannot. Microsoft is the holder of a number of C#/.NET patents,
the best language/environment known to the modern world (the
Microsoft-centric variant, anyway).

> But the
> Unix/Linux community are too content with living in a 1970 teletype driven
> world.
>

1960's. Get your story straight.

Scott Nudds

unread,
Sep 19, 2006, 10:08:47 PM9/19/06
to

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> Yes, replacement, unless you want to completely get rid of all
> line-reading capability.

No, you get rid of the entire festering piece of shit. The entire language.
It's all garbage. Flush it.


> > There should be nothing to replace because
> > these fucking incompetently written functions shouldn't be in any
standard
> > in the first place.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> If wishes were horses beggars would ride.

If K&R and other C proponents weren't complete incompetents.there would be
no need would there.

I am a firm believer that people who are criminally incompetent should be
held responsible for their incompetence.

Don't you?

I advise the abandonment of incompetent designs. Don't you?

As a result I insist upon the abandonment of C and corrective action taken
against the incompetents who mis-designed the language.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> In any event, they were written and now need to be replaced. The
question is: how?

Difficult since so much C code is so badly written.

And that is what makes the crimes of K&R and their pet langauge so great.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> The complacency of the user base.

Slit their throats... They are unworthy of existance.


> > SO WHAT DOES THIS TELL US ABOUT THE C-COMMUNITY AS A WHOLE - as they
make
> > up the entirety of the standards committie?
> >
> > It tells you that THE ENTIRE C PROGRAMMING COMMUNITY HAS SHIT FOR
BRAINS.
> >
> > They are unworthy of living. Too stupid to live. Waste of skin.
Unwelcome
> > to breathe my air.
> > Slit their fucking ignorant throats. All of em.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> I would not recommend you try that approach. There's some other
> individuals out there who are rather more competent...law enforcement.
> :-)

In the middle of the night, you wake these incompetent morons up, escort
them to the back of the waiting van where they are bound, dumped into a
grinder, and their remains subsequently applied to the nearest wheat field.

C is one of the worst languages ever mis-conceived. It is incompetent,
it's inventors incompetent, it's adopters incompetent, and it's users
incompetent. Combined they have cost the world trillions in wasted effort
and decades in wasted time.

Find them, and destroy them.


cc

unread,
Sep 19, 2006, 7:43:07 PM9/19/06
to

Only someone who has never been laid could complain so virulently about
a programming language.

The Ghost In The Machine

unread,
Sep 19, 2006, 9:00:35 PM9/19/06
to
In comp.os.linux.advocacy, cc
<ruw...@hotmail.com>
wrote
on 19 Sep 2006 16:43:07 -0700
<1158709387....@e3g2000cwe.googlegroups.com>:

A possibility but a little beyond the scope of the present discussion,
methinks. :-) However, it is clear that Nudds is a perfectionist.
(It is also clear he may need something to calm his nerves.
A good woman might do that, or just a punch in the nose.
If he's real lucky...but never mind.)

In any event, fgets() is a problematic call, as is the
entire C language, for various reasons. Not that it
doesn't work -- a meat grinder can kill you but it also is
useful in grinding meat. :-) Just try not to put a hand
in during operation, and make sure maintenance is done
properly and buy from a reputable supplier, as opposed
to one using the advertising tagline "We make bombs and
grinder parts While-U-Wait" or some such.

I am not certain what to replace C with, in any event.
C# has some similar problems if one turns on the
unsafe qualifier, AIUI. Java doesn't deal very well
with assembler -- a necessity in certain contexts.
(JNI *does* work, though -- slowly.) Squeak is in its own
little world. C++ has most of C's blemishes, although
some of them have been papered over with masking tape,
and it also has the ability to do OO things fairly easily
(and trip itself up almost as easily).

Is there a perfect language? A good question. I wish I had
a good answer. There are, of course, plenty of workable
environments, and I'd personally prefer to frame the question in
its comparative form anyway -- which one is "better"? And then
there's the metrics to consider; how does one determine "better"?
Where does "better" apply?

I'm also curious as to Nudds' thinking regarding certain
CSS software suppliers. Clearly, Vista is The Future
(the future *what* is another question), but AFAICT Nudds
either has an "in" with the developers and is therefore
protected by NDA and cannot disclose, or is "out" and
therefore knows nothing about Vista's internals, and can
make educated guesses same as the rest of us, but that's
about it.

cc

unread,
Sep 19, 2006, 9:20:32 PM9/19/06
to

The Ghost In The Machine wrote:
> In comp.os.linux.advocacy, cc
> <ruw...@hotmail.com>
> wrote
> on 19 Sep 2006 16:43:07 -0700
> <1158709387....@e3g2000cwe.googlegroups.com>:
> >
>
> A possibility but a little beyond the scope of the present discussion,
> methinks. :-) However, it is clear that Nudds is a perfectionist.

That's a nice way of putting it.

> (It is also clear he may need something to calm his nerves.
> A good woman might do that, or just a punch in the nose.
> If he's real lucky...but never mind.)
>

I think it's clear by Scott's dedication to feces and his homoerotic
pseudonym that there is no good woman for him. He prefers the company
of men.

Scott Nudds

unread,
Sep 20, 2006, 12:36:22 AM9/20/06
to

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> I don't do judgements. I do fixes. How do we fix the codebase?
> Reimplementation in C# or Squeak might be interesting.

I don't attempt to fix that which can not be fixed. Typically you can only
push the problem deeper into more esoteric and less predictable territory
where it will crop up and cause all sorts of nonsense problems later on when
people forget the esoteric conditions where the so called "fix" no longer
fixes anything.

Many people make their living out of not solving problems in exactly this
way.

They are part of the problem, not part of the solution. Indeed they do
nothing but perpetuate the failure.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Gad, take a chill pill, man. Just lock them up for life for "failure to
> present a solution that improves computer science" or something. No
> need to shed blood here. :-)

I don't think the jails would be large enough. I'm willing to excempt
juvi's who were too young to understand the extent of their crime.


> > It takes a special kind of CockSucker society to standardize that.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Yes, a government-paid one, apparently.

You have a point. Groups that are under government oversite often try to be
produce a solution that speaks to the concerns of all people. Some people
need white, others insist on black, so the result is Black or White as an
option, on thursday, when it rains. Or Grey.

Successful groups however adopt a vision and stick to it.

Open source generally follows the first method, while closed source often
the second. And this is why closed source is generally of better quality
and consistancy than open source.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> They are outright, irretrievable flaws, and variations on these
> flaws are responsible for most of the buffer overflows. For example,
> try this:
>
> char * readMyData(char * inbuffer)
> {
> fgets(inbuffer, sizeof(inbuffer), stdin);
>
> return inbuffer;
> }
>
> Very very sloppy programming, I know. Guess what sizeof() returns?
> It's not the size of the buffer -- in this case, it's the size of the
> *pointer*, and therefore one is guaranteed to read all of 3 bytes and a
> NUL (or 7 bytes if one's lucky enough to have a 64-bit architecture).

Of course. The language has no real concept of buffers, At run time it has
no concept at all. It's the compiler that adds up the size of the objects
and then defines some constants to match those instances where the sizeof
operator are used.

You can of course include the buffer size along with the buffer. But when
you do it is another matter. It makes sense to do it with strings at all
times since their meaning is clear. But how about just a general buffer
that is used for input of integers and ascii strings and unicode strings.

It's very convenient to be able to define a complex object as a struct and
then not have to worry about how big it is - correcting for the hidden data
that may be included by the compiler in it's various components. Alignment
constraints also result in a situation where for example an integer may not
be able to be placed 2 bytes before a string, - this is particularly true in
a language like C in which an integer has no defined size.

Now <THAT> is truly a failure of the language.. It's failure to define the
size of it's variables. If C were rationally defined, it would be defined
to use the types.

uint8
sint8
uint16
sint16
uint32
sint32
uint64
sint64

etc.

Individual language bastards could then define int as any type they liked.
But the base types would all be well defined.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> [1] Strings in Java are *immutable*, except by using very hackish
> introspection techniques. Passing one in is therefore pointless
> if one expects to modify it; it must be part of the return, either
> alone (as in here) or as part of a response class which is returned
> as a chunk.

type string
uint16 buflen
uint16 strlen
char buf[buflen]
char zero = 0
end type

Pass pointers to that, rather than pointers to the raw buffer.


> [2] StringBuffers and StringBuilders have no inherent limitation
> regarding length -- until one runs out of resource, of course.

Both a limitation and a liability.


> [3] Strings can handle arbitrarily large chars.

No value.

> > Store the string length in
> > the first character or two of the string if you like. Or process the
> > buffer until you get char=0, or just whitespace the string's tail.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> The buffer is initially uninitialized.

You don't know that, but with C, you must assume that.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> or some such. The memory between BP-256 and BP? Might as well be
> random; certainly this routine doesn't initialize it, and if the caller
> to this routine called someone else that scribbled all over stack,
> well...

You can't prevent that and it's not your problem to worry about others
clobbering your data. If they do, it's their fault.

Machines can however be built to prevent such things. It requires the
implementation in hardware of safe pointers in which each pointer is defined
to take a starting address and an ending address, and any attempt to use
that pointer outside of those addresses cause a critical application fault.

C provides no direction to hardware manufacturers to implement such
pointers. hence they are not implemented.

> > One design creterion for the langauge was to have nothing hidden to the
> > programmer. Hidden length variables are just that unless they are
properly
> > implemented.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> .length is not a variable, but an attribute. One could liken it to
> C++'s "this", which isn't really a variable either, but a context.
> Or one can think of sizeof(), a pseudo-function.

Variable, attribute, it's all the same. <unless> the attribute is peppered
as a constant through the application. This is how C implements sizeof().
The compiler knows what the value of sizeof(item) is, and when it sees the
pseudofunction it does a replace with the precomputed constant.

The problem with doing this of course is that the size doesn't track along
with the movement of the item reference. To do that, you have to tie the
attribute to the object, and that means physically storing the data (or a
reference to it) along side of the item itself as a variable. And that
value is hidden.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Depends. C has another flaw: *it can't tell the difference between
> an array and a pointer*. In short, the following works:
>
> char hello[] = "Hello, World!";
>
> puts(hello);
>
> despite the fact that puts(const char *) is the declaration. The
> problems in this should be obvious to anyone who's used PASCAL
> or perhaps Modula-2.
>
> If puts() is dumb enough to try to look at
> *((long *) (p - 4)), assuming that's where the length of
> the string would be, things get ... messy.

Yup. That is undoubtedly why hidden data was deemed to be a bad thing. I
tend to agree with the decision in this case, and note again that if
additional data needs to be included along with your strings you can
manually define a structure to include them.

What would be nice however raises the issue of variable initialization. It
would be nice if string t not only defined t as a string type but also
cleared t.buffer and set t.size to sizeof(t.buffer)

C++ can do this of course, and is one of the reasons why C++ is far superior
to C.

> > C is <NOT> an exemple of a competently designed langauge.
>
> Agreed here. Not sure what to replace it with yet, though, and
> Linus might have some ideas thereon. :-) After all, he
> created this kernel, and used C for its implementation (along
> with the necessary assembly "glue"). Others followed his lead.

C++ is not quite an acceptable replacement. But requires some simple but
extensive alterations to correct the remaining flaws in the langauge.
Things like defining the sizes of variable types, the inclusion of block
typing, Cleaning up pointer reference syntax, the rewrite of the I/O
library (hardly used anymore) And the correction of a host of other flaws
that I no longer remember.


> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Unicode has some major problems, most of them having to do with
> backwards compatibility concerns. The ISO-8859-* is ugly, but
> only a tiny part of the problem apparently. Fonts in X are very
> troublesome, requiring Cairo or Pango to wrap over the worst of the
> issues.

Unicode needs to be wished into a corn field, along with Linux.


> > It's very simple. Use ASCII And if you need more characters use
> > Extended ASCII, you have 256 characters in total there.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Whoopee. The Chinese will not be very happy with you.

The Chinese speak english at this point. The Japanese as well So do the
people of India, and Pakistan, and most people in Europe.

Unicode fosters separateness, isolation, an inability to communicate.

Unicode is not only technical stupidity, it is pure unadulterated evil.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> The mathematicians will of course want to pick a bone as well.

Let them. For them we will define a mathematical symbol set that is
distinct from ASCII.
Do you feel a need to use a tripple path integral as your window Title?
No. So no need for support of this character set there or essentially
anywhere else for that matter.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Arabic is an interesting
> language, written *backwards* (well, OK, it's a viewpoint from the US
> side of the fence :-) ).

Only the numbers are written backwards from what I understand. They can
all speak english too you know.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Japanese has *three character* sets to fiddle with: traditional kanjii,
> hiragana, and katakana.

What about the Martians? The Lipicysesians? How about the shrews.. God
don't forget about the shrews.....

> > Need more characters yet? OK, go to the 65536 you can get with 2
bytes.
> > Need more still. Come to me and I will send you to the hell where you
are
> > skinned alive.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> And how does one read these characters?

You mean, how are they displayed on a page? Specifically are you asking what
direction you advance in order to move to the next character position?

My question is what is the size of a page?


> > At the time, Japanese computers were quite capable of displaying
thousands
> > of characters.
>
> AFAIK, they still are.

Illustrating why Unicode has never been needed.

It's not rocket science. But far too complex for cocksucking morons like
K&R to comprehend.

> >> I'm old enough to remember DOS overlays. That was -- painful.
> >> 640k for systems which can now handle more than 640 *megabytes*.
> >> (640 gigabytes of RAM, soon.)
> >
> > And the inclusion of 1 instruction could have made all of that mess go
away.
> > That instruction being the simple return of 4 into any of the 80x86's
> > registers. Instruction to have the Mnemonic SegSize.
> >
> > Later 80x86 CPU's return SegSize = 8 allows the applicaiton to support
16
> > megs of RAM.
> > SegSize = 16 = 4 gig.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> 64 GB is now possible. I don't know the details.

64 bit pointers in 64 bit CPU's with the upper addressing bits truncated.

> > So let it be written... So let it be done....

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> I'd be careful if I were you. You might have to ask the Legislature to
> change the Penal Code (section 187 in California; other localities will
> vary) first.

Once the first few end up feeding wheat fields, the others will get the
picture.


> No, but it does have its own special challenges. You might want to grab
> the SPICE source sometime, for example; its ideas regarding memory
> allocation are ... quaint.

I know. Ferrite core memory was hard to make when Fortran was developed.


> > It's an abortion from
> > the ground up, which is a shame since it has some nice features.

> Such as...? I see no real useful features in C.

Pointers are unique to C and are the principle way in which it provides a
speed advantage over alternate languages. Also C integer types are <not>
overflow checked for every operation and this also proves a considerable
speed advantage. Not doing a jcc or jo after ever add eax,figbert, means
less pipeline stalls as the CPU tries to figure out weather to pre-process
the fall though code or the branch code.

Not checking for overflow also provides for cleaner modulo arithmetic which
quite often comes in handy for bit manipulation, encryption, and a host of
other uses.

Type conversion is generally good, Struct and Union implementation is
good.

Other langauge have some of these features, but the first two I indicate
are unique.


> > My
> > suggestion - my insistance - is that C be abandoned entirely, and those
who
> > refuse, be collected, ground up, and used for fertilizer.
> >
> > C++ goes quite a way to answering many of C's failures. But it too
needs
> > to be completely abandoned.
>
> Good. Now suggest a replacement for both C and C++.
> Is C# sufficient, for example?

I can define a replacement. It's rather trivial. But no good replacement
exists at the moment. This is not because there is a problem doing so, it's
strictly because there is no effort to do so.


> One might have some work
> to do in order to rewrite the Linux kernel, but in theory
> it's possible.

Why recreate yet another version of a loser OS?

> Or did you prefer Smalltalk/Squeak, Python, or Java? Java
> has its own silly issues.

Python is just brain dead. Not only no block typing, but no block
delineation at all.
Java suffers from the same brain dead syntax as C. Smalltalk I prefer to C
and squeak I've never heard of before other than the occasional squeen from
the mouse that is living in my kitchen.


> > I am referrig to abstract components like structures, unions,
functions,
> > pointers etc. It's all there, but ineptly implemented.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> No, it's not all there. It's not even close to all there.
>
> Structures: Yes.
> Unions: An abomination leading to many bugs.

I see you haven't done much in the way of file parsing or header
processing. Lets say you have a header for a file of type xyz. There are 3
different versions of the file type and each has it's own header structure.
With a union you read the header, then process the version number, then use
that to overlay the right union to get the proper interpretation of the
data.

They also allow buffers to be re-used in different contexts of course.

Unions are quite useful.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Classes: Nope.

Not really needed. There are other means of encapsulation


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Class permissions: n/a

Since classes aren't needed....

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Pointers: Another abomination.

A direct mapping onto a machine register and hence fast and efficient, not
requiring a multiply and add for every variable reference as is typically
required for array indexing.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Polymorphism: Nope.

A nice, but unnecessary feature.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Methods: Nope.

Since you don't need classes, you don't need methods obviously since
methods are class functions.

But you can still encapsulate your data by simply putting the code in a
module and setting access permissions accordingly.

You will have to pass your object ID along with your function calls
manually.

Oh well, ugly but viable.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> (C++ introduced methods.)

OOP programming existed long before C++


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Nested classes: Not even close.

Never had a use for them.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Overloading: Nope.

Generally useless. Should be replaced with operation definition.

Define operator <zyz> = function(left,right)


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> (C++ introduced the concept of having two functions with the same
> name but different signatures.)

That's polymorphysm.


> Methods on primitives: Nope.
> (A feature in Smalltalk.)

No need.


> Object introspection: Nope.

No objects hence no need.


> Object property-setting-to-call: Nope.
> (C# is the only language I know of that has this.)

VB has it as well.


> Class modifiability: Nope.
> (This interesting capability allows for addition and editing
> of methods on existing objects. Smalltalk has it. Not many
> other languages do.)

Sounds dangerous.

> Templates: Nope.
> (C has no templates; everything uses the preprocessor. C++
> introduces templates, with many issues. Java's templates are also
> problematic. I don't know what C# has.)

A solution looking for a problem.

> Exceptions: Nope.
> (C++ has them; they're not used that often. Java has them and
> they're used extensively.)

C has exceptions. At least all the versions of C I have ever had the
displeasure to use.

> Virtual: Nope.
> (Closest C can get is a function pointer, and such is actually used
> in the Linux kernel.)

Not sure what you mean by "virtual". Sounds like another poorly chosen
keyword.


> Threading support: Nope.
> (External libraries pick up some of the slack.)

Agreed, much better to have it directly specified at the language level.

> Array Length: Nope.
> (Java has a.length. C++ STL has dynamic classes that attempt to
> mimic arrays but that's not strictly speaking part of the language.)

Compiler knows it, and can pass it along to the application. What you do
with it is your concern.

I have no problem with this.


> Dynamic Array Creation: Nope.

Use pointer/index referencing. For linear arrays. Even for
multi-dimentional arrays with some difficulty.

> Dynamic Array Size Modification: Nope.

Malloc what you need and overlay your array of structures.


> Arbitrary method call: Nope.
> (Another Smalltalkism, although Java's invoke on introspected
> methods comes close.)

Unknown....

> Transparent casting: Nope.
> (I'm not sure what to call this but in C++ an object will
> occasionally be constructed or cast when one wants to call a method
> on another object related thereto. This can lead to subtle bugs but
> can also be very useful.)

I prefer all casting to be explicit. fewer errors.

> Pointer Arithmetic: Yep.
> (This is one reason C gets away with so much. If one declares
> int a[10];
> int *b = a;
> then b[0] == a[0] and *(b+1) == a[1]. One can contrast
> this with Draco, which did not have the implicit multiplication,
> or with a modified Pascal with the addr() construct. In a
> more proper language one would write
> int *b = &a; or int *b = addr(a);.)

The problem is with your interpretation of a[10] you think a[10]
represent an array of 10 items. In reality it is a pointer to an array of
10 items.

Approached this way the subtle contextual switch vanishes. But might
agree, the contextual switch shouldn't be there in the first place.

Problem is , if you do this then

int a[10]
int *b

b=a

&b returns the address of variable b while
&a returns the address, not of a but of the int [10]

So now you have another kind of unwanted contextual switch.


> > Just like Linux.
> > It's all there, but in general, it's ineptly implemented.
> >
> > As a result, unworthy of consideration.


> Of course. And you've made a detailed study of the Vista source
> code to ensure that all of your objections have been met, I trust?

Linux ineptitude doesn't need source code to see. You only need to install
it to see how much of a shit stick it is.


> You said it was cracked, not quicksand. Of course Unix is built on C.
> What does *that* tell *you*?

That shit breeds shit.

> > Fuck, it doesn't even have block type checking. It's defaults are
> > backwards with every module level definition having global scope.
> >
> > It doesn't define what a fucking integer is.


> Actually, it does, just rather vaguely. I'd have to look.

No, the language doesn't define it. The implementation does. And every
implementation can be different.
So in other words, there is no definition.


> Of course part of the issue back then was variable sizes of machine word
types.
> Honeywell, for instance, had an old machine that had 9-bit bytes.

Yup. I don't care. Emulate 8 bit bytes or refuse to compile.

> > Operator creation on the other hand would have great utility.
>
> Java doesn't have that, either; neither does C++. The only language I
> know that has such is a rather obscure dialect at one point named ICL; I
> believe it was from CAL TECH. Of course its idea of an operator was
> \<word>, but one could create as many such as one desired.
>
> Smalltalk might have something but I'd have to look.

Operator overloading is really not a selling feature. It's only usable in
very,very limited situations.


> > You mean, they should be hidden wherever possible.
>
> If you prefer. Certainly they should not be out in the open where abuse
> is rampant.

I spent 20 years programming in assembler so pointer use is second nature
to me. The implemntation in hardware of safe pointers would solve all
pointer problems of course. C provides no directition for such
advancements.


> > Iterating over a list via pointer is still faster than array references.
>
> Implementation detail.

One that compilers are not very good at resolving. Typically they will
generate a multiply and an add for every variable reference in an array, but
with a pointer they only do an index, which can often be combined and run in
parallel with the memory access.


> > 0 = no characters read (eof) -n = error.
>
> Negative numbers are not possible in an unsigned integer context.

Which is why I would define the return type as signed rather than just
"int" which leaves the sign/not signed state undefined (C incompetence)


> Unsigned set of bitflags, you mean. A lot of Win32 calls are of this
> type, ORing flags all over the place. Useful, but can be problematic.

Bitflags are small and compact and you can pass a whack of em in one
variable.
nice and efficient like.

> > C and C++ both provide enum, but don't do a *2 scale, but rather a
simple
> > increment. Hence it is not convenient for flag creation.
>
> Assignment of arbitrary integers to enum values is possible, if a bit
> ugly.

Ya, and you can use multiple defines.


> Problematic in some communications. Sockets in particular may have no
> data available in non-blocking mode. This is not an EOF but simply
> an indication that there's no data available right now.

Then return an error condition-8= socket data not available.

I suppose you could also do -1 = EOF. 0 = no data available.


> > Wrong. I have no love of Microsoft and note that from 1980 to 1995 they
> > managed to grow as a result of inept competiton from the Unix community.
> > Inept competition that continues until this day.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> And they are still growing.

Aided by the incompetence of the Linux community.

> > Linux./Unix however continues to flounder, fumbling around with attempt
> > after attempt to make itself more compatible with past Unix failures.
> >
> > Unix/Linux is no real competition to Microsoft. It could be.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> No, it cannot. Microsoft is the holder of a number of C#/.NET patents,
> the best language/environment known to the modern world (the
> Microsoft-centric variant, anyway).

Compiling to virtual assembler has a primary advantage of obscuring the
source from whence the code came. The secondary advantage is translation
into a universal standard langauge that can be then cross assembled onto a
target instruction set.

For the moment Linux has C++ and since the sourse is open, the source can
be open.

>
> > But the
> > Unix/Linux community are too content with living in a 1970 teletype
driven
> > world.

> 1960's. Get your story straight.

The University I attended was still using punchcards in 1981.

I was disgusted.


Scott Nudds

unread,
Sep 20, 2006, 12:39:31 AM9/20/06
to

"cc" <ruw...@hotmail.com> wrote

> Only someone who has never been laid could complain so virulently about
> a programming language.

Curses... Exposed again....


Scott Nudds

unread,
Sep 20, 2006, 12:55:38 AM9/20/06
to

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> but AFAICT Nudds
> either has an "in" with the developers and is therefore
> protected by NDA and cannot disclose, or is "out" and
> therefore knows nothing about Vista's internals, and can
> make educated guesses same as the rest of us, but that's
> about it.

I have <no> knowledge about Vista's internal workings at all. But I do
know how Microsoft works, and I do know what is coming down the pipe in
terms of hardware.

Vista is mostly hype as are all of Microsof's products. Some trivial new
feature is added like a picture of a barking dog, and suddenly is's a got to
have feature that is the focal point of the entire computing universe.

The bulk of Vista is going to be a new interface layer upon which a new
generation of applciatons will be run. Sure it also is a cumulative bug
fix, and a rewrite of some key funcitons to make them function better.

Key to vista for Microsoft will the the deeper imbeding of DRM, and
install security to ensure that anyone who runs Vista is a paying customer.

However equally clear is the vision that Vista will foster for a 3d
working environment. This environment will fail of course, but it will
remain there, sitting in the background for those who have the hardware to
use it. The rest of you will be running a slightly modified version of the
existing XP API.

Over the next 5 years or so, hardware upgrades will allow the Aero
interface to move into the mainstream on the personal desktop, and possibly
on the corporate desktop as well - they won't have much choice. But by then
the hardware costs will be much lower, and performance improved.

In the intervening time, Microsoft may bring out Aero Light or something
to speed acceptance.

In the background, applications that were written to target the x86
specifically will increasinlgy be written in DotNet and this will give them
instant portability when the PC platform splits into multiple incompatible -
multi-cpu camps. Microsoft will need to port Vista and the JIT compiler,
and the rest of the worlds applications will come free. Write once, run
anywhere in the Microsoft Universe. Single executable distribution. No
hastles, no muss, no fuss. No source code insecurity.

Meanwhile Linux will continue to be a 2/10ths of a percent player that
will always think it's 1969 and that it's connected to a wise 360 printing
teletype.

Pathetic....

Scott Nudds

unread,
Sep 20, 2006, 1:01:27 AM9/20/06
to

> > A possibility but a little beyond the scope of the present discussion,
> > methinks. :-) However, it is clear that Nudds is a perfectionist.

"cc" <ruw...@hotmail.com> wrote in message


> That's a nice way of putting it.

Software should be crystaline in form and function so that defects in
structure and purity can easily be detected.


"cc" <ruw...@hotmail.com> wrote in message


> I think it's clear by Scott's dedication to feces and his homoerotic
> pseudonym that there is no good woman for him. He prefers the company
> of men.

Ho Hummm... Another pimple faced 16 year old Linux Shit Licker who
doesn't have a clue. Who doesn't have an idea. Who doesn't have a dime,
and not even enough imagination to create an amusing or accurate insult.

cc

unread,
Sep 19, 2006, 10:14:43 PM9/19/06
to

Scott Nudds wrote:
> > > A possibility but a little beyond the scope of the present discussion,
> > > methinks. :-) However, it is clear that Nudds is a perfectionist.
>
> "cc" <ruw...@hotmail.com> wrote in message
> > That's a nice way of putting it.
>
> Software should be crystaline in form and function so that defects in
> structure and purity can easily be detected.
>
>
> "cc" <ruw...@hotmail.com> wrote in message
> > I think it's clear by Scott's dedication to feces and his homoerotic
> > pseudonym that there is no good woman for him. He prefers the company
> > of men.
>
> Ho Hummm... Another pimple faced 16 year old Linux Shit Licker who


Yikes, who said I use Linux? And again with your poop fetish...


> doesn't have a clue. Who doesn't have an idea. Who doesn't have a dime,
> and not even enough imagination to create an amusing or accurate insult.


What does having money have to do with creating insults? Did you pay
someone for "Linux Shit Licker"? If so, I'd ask for my money back.

As far as not being able to create an amusing or accurate insult, well
amusing is subjective you know. You seem to be fond of all things
poopy, so I guess that's what keeps you happy. I'm not particularly
amused by it, but hey whatever gets you off. And as for accurate...well
I saw no denial. I calls 'em likes I sees 'em.

GreyCloud

unread,
Sep 19, 2006, 10:28:10 PM9/19/06
to
cc wrote:

>> C is one of the worst languages ever mis-conceived. It is incompetent,
>>it's inventors incompetent, it's adopters incompetent, and it's users
>>incompetent. Combined they have cost the world trillions in wasted effort
>>and decades in wasted time.
>>
>> Find them, and destroy them.
>
>
> Only someone who has never been laid could complain so virulently about
> a programming language.
>

LOL! If he ever got laid, he'd forget everything he ever learned. :-))

Scott Nudds

unread,
Sep 20, 2006, 2:59:21 AM9/20/06
to

"cc" <ruw...@hotmail.com> wrote in message
> What does having money have to do with creating insults? Did you pay
> someone for "Linux Shit Licker"? If so, I'd ask for my money back.

Yawn.......


Scott Nudds

unread,
Sep 20, 2006, 3:06:53 AM9/20/06
to

"GreyCloud" <mi...@cumulus.com> wrote

> LOL! If he ever got laid, he'd forget everything he ever learned. :-))

Yawn.........


tha...@tux.glaci.remove-this.com

unread,
Sep 20, 2006, 10:35:45 AM9/20/06
to
The Ghost In The Machine <ew...@sirius.tg00suus7038.net> wrote:
>
> In any event, fgets() is a problematic call, as is the
> entire C language, for various reasons. Not that it
> doesn't work -- a meat grinder can kill you but it also is
> useful in grinding meat. :-) Just try not to put a hand
> in during operation, and make sure maintenance is done
> properly and buy from a reputable supplier, as opposed
> to one using the advertising tagline "We make bombs and
> grinder parts While-U-Wait" or some such.

The thing is, C was originally designed to be just one step
up from a macro assembler, and in that respect it succeeds at
its goals. Certainly there are some legitimate gripes about
lack of consistency across different platform implementations
(i.e. size of primitives, etc), but judging it by the standards
of other languages with more rigid type checking and better
memory management and so on is not really a fair comparison.
It was meant to be a small, flexible tool that could be
implemented on the less powerful hardware of its day. Sure,
people choose it now for higher level tasks that are better
suited to other languages, but that is hardly the fault of
any design choice in C. If I use a wrench to pound in a nail,
its not the fault of the wrench.

Later,

Thad

The Ghost In The Machine

unread,
Sep 20, 2006, 11:00:06 AM9/20/06
to
In comp.os.linux.advocacy, cc
<ruw...@hotmail.com>
wrote
on 19 Sep 2006 19:14:43 -0700
<1158718483.0...@d34g2000cwd.googlegroups.com>:

>
> Scott Nudds wrote:
>> > > A possibility but a little beyond the scope of the present discussion,
>> > > methinks. :-) However, it is clear that Nudds is a perfectionist.
>>
>> "cc" <ruw...@hotmail.com> wrote in message
>> > That's a nice way of putting it.
>>
>> Software should be crystaline in form and function so that defects in
>> structure and purity can easily be detected.
>>
>>
>> "cc" <ruw...@hotmail.com> wrote in message
>> > I think it's clear by Scott's dedication to feces and his homoerotic
>> > pseudonym that there is no good woman for him. He prefers the company
>> > of men.
>>
>> Ho Hummm... Another pimple faced 16 year old Linux Shit Licker who
>
>
> Yikes, who said I use Linux? And again with your poop fetish...

Hm. Hard to say from your postings but since you've attacked Scott
Nudds' perfect crystal (Aero?) he probably made an assumption. :-)

>
>
>> doesn't have a clue. Who doesn't have an idea. Who doesn't have a dime,
>> and not even enough imagination to create an amusing or accurate insult.
>
>
> What does having money have to do with creating insults? Did you pay
> someone for "Linux Shit Licker"? If so, I'd ask for my money back.

He probably borrowed the concept from The Kids In The Hall. :-)

>
> As far as not being able to create an amusing or accurate insult, well
> amusing is subjective you know. You seem to be fond of all things
> poopy, so I guess that's what keeps you happy. I'm not particularly
> amused by it, but hey whatever gets you off. And as for accurate...well
> I saw no denial. I calls 'em likes I sees 'em.
>

Probably the best we all can do.

The Ghost In The Machine

unread,
Sep 20, 2006, 11:00:06 AM9/20/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Tue, 19 Sep 2006 21:39:31 -0700
<2Z0Qg.101560$sS1....@read1.cgocable.net>:

No, that's another C library, I'm afraid. It's primarily designed to
handle things like vi. :-)

The Ghost In The Machine

unread,
Sep 20, 2006, 11:00:05 AM9/20/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Tue, 19 Sep 2006 21:36:22 -0700
<6W0Qg.101558$sS1....@read1.cgocable.net>:

>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> I don't do judgements. I do fixes. How do we fix the codebase?
>> Reimplementation in C# or Squeak might be interesting.
>
> I don't attempt to fix that which can not be fixed.

Who says this cannot be fixed? Given enough energy *anything*
can be fixed. But one has to know what one's fixing; there's
a lot of bad C code out there. Can't pull a boulder with a rubber
band and all that.

> Typically you can only
> push the problem deeper into more esoteric and less predictable territory
> where it will crop up and cause all sorts of nonsense problems later on when
> people forget the esoteric conditions where the so called "fix" no longer
> fixes anything.
>
> Many people make their living out of not solving problems in exactly this
> way.
>
> They are part of the problem, not part of the solution. Indeed they do
> nothing but perpetuate the failure.

Oh, so you *do* want to go around slashing throats then?
That's...comforting.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Gad, take a chill pill, man. Just lock them up for life for "failure to
>> present a solution that improves computer science" or something. No
>> need to shed blood here. :-)
>
> I don't think the jails would be large enough. I'm willing to excempt
> juvi's who were too young to understand the extent of their crime.

Don't worry. The Juvi's are being programmed into using VB.

The compiler cannot do it alone. Consider the above again,
with this call:

char * callit()
{
char * buf[*] = (char *[*]) malloc(256);
readMyData(buf);
}

In an ideal system this would actually work, and the [*]
construct would be a hint to the compiler that he has to
[a] look up/calculate the actual size of the buffer at
runtime, and [b] pass whatever is relevant so that anyone
passed that buffer would know it as well.

>
> You can of course include the buffer size along with the buffer.

It has to be done.

> But when
> you do it is another matter. It makes sense to do it with strings at all
> times since their meaning is clear. But how about just a general buffer
> that is used for input of integers and ascii strings and unicode strings.

A "general buffer"? Are you referring to a structure, or to some sort
of uninterpreted area of memory that hasn't been processed yet?

Why define a struct like that? The compiler should *assume* it.
In other words, the following:

string s = "Hi";
printf("%d %d %*.*s\n", s.buflen, s.strlen, s.strlen, s.strlen, s.buf);

would actually work. Or, if one wants to get even sillier:

printf("%d %d %*.*s\n", "Hi".buflen, "Hi".strlen,
"Hi".strlen, "Hi".strlen, "Hi".buf);

would work as well. (This actually *does* work in Java, except
for the printf().)

>
>
>> [2] StringBuffers and StringBuilders have no inherent limitation
>> regarding length -- until one runs out of resource, of course.
>
> Both a limitation and a liability.

There's only so much virtual memory on one's system. :-)

>
>
>> [3] Strings can handle arbitrarily large chars.
>
> No value.

You're right. We should restrict ourselves to ASCII.
No point in gumming up the works with all of that thar
foreign-type stuff like Arabic and Chinese and Klingon
and Ebonics. Just good old American English.

>
>
>
>> > Store the string length in
>> > the first character or two of the string if you like. Or process the
>> > buffer until you get char=0, or just whitespace the string's tail.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> The buffer is initially uninitialized.
>
> You don't know that, but with C, you must assume that.

Depends on the context. In any event, initialization is
not a given.

char buf[1024];

and

void routine
{
static char buf[1024];
}

are initialized (to zeroes, actually).

void routine()
{
char buf[1024];
}

is not. Confusing? You bet your sweet bippy.

>
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> or some such. The memory between BP-256 and BP? Might as well be
>> random; certainly this routine doesn't initialize it, and if the caller
>> to this routine called someone else that scribbled all over stack,
>> well...
>
> You can't prevent that and it's not your problem to worry about others
> clobbering your data. If they do, it's their fault.

I can prevent it by redefining the problem to use a different language.

>
> Machines can however be built to prevent such things. It requires the
> implementation in hardware of safe pointers in which each pointer is defined
> to take a starting address and an ending address, and any attempt to use
> that pointer outside of those addresses cause a critical application fault.
>
> C provides no direction to hardware manufacturers to implement such
> pointers. hence they are not implemented.

An interesting hypothesis. Too bad LISP machines never took off.

>
>
>
>> > One design creterion for the langauge was to have nothing hidden to the
>> > programmer. Hidden length variables are just that unless they are
> properly
>> > implemented.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> .length is not a variable, but an attribute. One could liken it to
>> C++'s "this", which isn't really a variable either, but a context.
>> Or one can think of sizeof(), a pseudo-function.
>
> Variable, attribute, it's all the same. <unless> the attribute is peppered
> as a constant through the application. This is how C implements sizeof().
> The compiler knows what the value of sizeof(item) is, and when it sees the
> pseudofunction it does a replace with the precomputed constant.
>
> The problem with doing this of course is that the size doesn't track along
> with the movement of the item reference. To do that, you have to tie the
> attribute to the object, and that means physically storing the data (or a
> reference to it) along side of the item itself as a variable. And that
> value is hidden.

And hopefully nonsettable. An interesting point.

You've already explained variable types -- and I believe standards are
in place for some of them (e.g., int32). What, precisely, is "block
typing"? Are you referring to encapsulation of every memory block with
a RTTI descriptor?

>
>
>> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Unicode has some major problems, most of them having to do with
>> backwards compatibility concerns. The ISO-8859-* is ugly, but
>> only a tiny part of the problem apparently. Fonts in X are very
>> troublesome, requiring Cairo or Pango to wrap over the worst of the
>> issues.
>
> Unicode needs to be wished into a corn field, along with Linux.

OK. And what would it be replaced with? I'd suggest wchar_t.

>
>
>> > It's very simple. Use ASCII And if you need more characters use
>> > Extended ASCII, you have 256 characters in total there.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Whoopee. The Chinese will not be very happy with you.
>
> The Chinese speak english at this point. The Japanese as well So do the
> people of India, and Pakistan, and most people in Europe.
>
> Unicode fosters separateness, isolation, an inability to communicate.
>
> Unicode is not only technical stupidity, it is pure unadulterated evil.

Well, there you have it. I'm sure your proposal can be forwarded to the
government and then to the ISO straightaway.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> The mathematicians will of course want to pick a bone as well.
>
> Let them. For them we will define a mathematical symbol set that is
> distinct from ASCII.
> Do you feel a need to use a tripple path integral as your window Title?
> No. So no need for support of this character set there or essentially
> anywhere else for that matter.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Arabic is an interesting
>> language, written *backwards* (well, OK, it's a viewpoint from the US
>> side of the fence :-) ).
>
> Only the numbers are written backwards from what I understand. They can
> all speak english too you know.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Japanese has *three character* sets to fiddle with: traditional kanjii,
>> hiragana, and katakana.
>
> What about the Martians? The Lipicysesians? How about the shrews.. God
> don't forget about the shrews.....

I don't see any Martians. Asians I see every day. :-) I'll worry about
grokking Martian when Martians come to visit.

>
>
>
>> > Need more characters yet? OK, go to the 65536 you can get with 2
> bytes.
>> > Need more still. Come to me and I will send you to the hell where you
> are
>> > skinned alive.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> And how does one read these characters?
>
> You mean, how are they displayed on a page? Specifically are you asking what
> direction you advance in order to move to the next character position?
>
> My question is what is the size of a page?

If you want to expand the scope that much, yes, what is the method by
which one transliterates arbitrary characters into a bitmap, and
then printed?

As for page size...research "A4". Even in the US we have 8 1/2"x11"
and 8 1/2" x 14", plus a fair number of other sizes in various contexts.

>
>
>> > At the time, Japanese computers were quite capable of displaying
> thousands
>> > of characters.
>>
>> AFAIK, they still are.
>
> Illustrating why Unicode has never been needed.
>
> It's not rocket science. But far too complex for cocksucking morons like
> K&R to comprehend.

K&R didn't invent Unicode, methinks.

>
>
>> >> I'm old enough to remember DOS overlays. That was -- painful.
>> >> 640k for systems which can now handle more than 640 *megabytes*.
>> >> (640 gigabytes of RAM, soon.)
>> >
>> > And the inclusion of 1 instruction could have made all of that mess go
> away.
>> > That instruction being the simple return of 4 into any of the 80x86's
>> > registers. Instruction to have the Mnemonic SegSize.
>> >
>> > Later 80x86 CPU's return SegSize = 8 allows the applicaiton to support
> 16
>> > megs of RAM.
>> > SegSize = 16 = 4 gig.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> 64 GB is now possible. I don't know the details.
>
> 64 bit pointers in 64 bit CPU's with the upper addressing bits truncated.
>
>
>
>> > So let it be written... So let it be done....
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> I'd be careful if I were you. You might have to ask the Legislature to
>> change the Penal Code (section 187 in California; other localities will
>> vary) first.
>
> Once the first few end up feeding wheat fields, the others will get the
> picture.

No doubt they will...and then go after you. The police don't like
lawbreakers. They're funny that way. :-)

>
>
>> No, but it does have its own special challenges. You might want to grab
>> the SPICE source sometime, for example; its ideas regarding memory
>> allocation are ... quaint.
>
> I know. Ferrite core memory was hard to make when Fortran was developed.
>
>
>> > It's an abortion from
>> > the ground up, which is a shame since it has some nice features.
>
>> Such as...? I see no real useful features in C.
>
> Pointers are unique to C

Hardly:

TYPE
something = ^RECORD
...
END;

LISP had nothing *but* pointers, unless someone stuffed a value in the
cdr (since a cdr might have been 15 bits that was a little tough to do
unless the value was a small integer).

And of course assembly had machine addresses, which is what a pointer
is, for the most part.

> and are the principle way in which it provides a
> speed advantage over alternate languages. Also C integer types are <not>
> overflow checked for every operation and this also proves a considerable
> speed advantage. Not doing a jcc or jo after ever add eax,figbert, means
> less pipeline stalls as the CPU tries to figure out weather to pre-process
> the fall though code or the branch code.
>
> Not checking for overflow also provides for cleaner modulo arithmetic which
> quite often comes in handy for bit manipulation, encryption, and a host of
> other uses.
>
> Type conversion is generally good, Struct and Union implementation is
> good.
>
> Other langauge have some of these features, but the first two I indicate
> are unique.

So you're talking about a speed-versus-safety tradeoff.

>
>
>> > My
>> > suggestion - my insistance - is that C be abandoned entirely, and those
> who
>> > refuse, be collected, ground up, and used for fertilizer.
>> >
>> > C++ goes quite a way to answering many of C's failures. But it too
> needs
>> > to be completely abandoned.
>>
>> Good. Now suggest a replacement for both C and C++.
>> Is C# sufficient, for example?
>
> I can define a replacement. It's rather trivial.

Then do so, and submit it.

> But no good replacement
> exists at the moment.

That's because you haven't *defined* one yet. Until you
define one (and presumably patent and market it to
a sufficiently large group as well), the problem will
remain unsolved. Once the world of course recognizes
the sheer unadulterated brilliance of your contribution,
then they'll all say *BOP* "Wow! We could have been
doing it that way for all of these decades!".

> This is not because there is a problem doing so, it's
> strictly because there is no effort to do so.

So put in the effort. You've identified a problem. Fix it!

>
>
>> One might have some work
>> to do in order to rewrite the Linux kernel, but in theory
>> it's possible.
>
> Why recreate yet another version of a loser OS?

Why indeed? Did you have an alternate in mind?
FreeBSD will also need rewriting. HURD probably
will, too. AmigaOS is based on BCPL but probably
has been rewritten in that dreadful language; it will
need restructuring. MacOS (*not* MacOSX) was written
in Pascal but MacOSX is probably mostly C; that will
require rewriting.

It's a lot of work, but worth it in order to get rid
of every scrap of "loser C code".

>
>
>
>> Or did you prefer Smalltalk/Squeak, Python, or Java? Java
>> has its own silly issues.
>
> Python is just brain dead. Not only no block typing, but no block
> delineation at all.
> Java suffers from the same brain dead syntax as C. Smalltalk I prefer to C
> and squeak I've never heard of before other than the occasional squeen from
> the mouse that is living in my kitchen.

So like I said...put in the effort.

>
>
>> > I am referrig to abstract components like structures, unions,
> functions,
>> > pointers etc. It's all there, but ineptly implemented.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> No, it's not all there. It's not even close to all there.
>>
>> Structures: Yes.
>> Unions: An abomination leading to many bugs.
>
> I see you haven't done much in the way of file parsing or header
> processing. Lets say you have a header for a file of type xyz. There are 3
> different versions of the file type and each has it's own header structure.
> With a union you read the header, then process the version number, then use
> that to overlay the right union to get the proper interpretation of the
> data.
>
> They also allow buffers to be re-used in different contexts of course.
>
> Unions are quite useful.

But an abomination nonetheless.

union
{
int v;
char b[12];
} u;

u.v = 1234;
printf("%s\n", u.b);

Now, in a language such as Pascal, which has variant types, this might
be disallowed and the code throw an exception, if I'm not mistaken.

Java doesn't have unions.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Classes: Nope.
>
> Not really needed. There are other means of encapsulation

Describe.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Class permissions: n/a
>
> Since classes aren't needed....
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Pointers: Another abomination.
>
> A direct mapping onto a machine register and hence fast and efficient, not
> requiring a multiply and add for every variable reference as is typically
> required for array indexing.

Not all that direct, and not a register. A pointer is merely a virtual
address in most implementations, and a virtual address is just that: it
requires translation by the MMU. The real, physical address could be
anywhere in RAM.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Polymorphism: Nope.
>
> A nice, but unnecessary feature.

Noted.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Methods: Nope.
>
> Since you don't need classes, you don't need methods obviously since
> methods are class functions.

No, they're methods. :-) But C++ treats them as functions, for the most
part; the names are mangled to protect the innocent.

>
> But you can still encapsulate your data by simply putting the code in a
> module and setting access permissions accordingly.

And where are these "access permissions" defined?

>
> You will have to pass your object ID along with your function calls
> manually.

What object ID?

>
> Oh well, ugly but viable.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> (C++ introduced methods.)
>
> OOP programming existed long before C++
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Nested classes: Not even close.
>
> Never had a use for them.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Overloading: Nope.
>
> Generally useless. Should be replaced with operation definition.
>
> Define operator <zyz> = function(left,right)

Ah, ICL.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> (C++ introduced the concept of having two functions with the same
>> name but different signatures.)
>
> That's polymorphysm.

No, that's overload. Polymorphism requires classes; overloading of
functions does not.

>
>
>> Methods on primitives: Nope.
>> (A feature in Smalltalk.)
>
> No need.
>
>
>> Object introspection: Nope.
>
> No objects hence no need.

Structures?

>
>
>> Object property-setting-to-call: Nope.
>> (C# is the only language I know of that has this.)
>
> VB has it as well.
>
>
>> Class modifiability: Nope.
>> (This interesting capability allows for addition and editing
>> of methods on existing objects. Smalltalk has it. Not many
>> other languages do.)
>
> Sounds dangerous.

It is. Smalltalk also has a safety net -- the unknown message.

>
>
>
>> Templates: Nope.
>> (C has no templates; everything uses the preprocessor. C++
>> introduces templates, with many issues. Java's templates are also
>> problematic. I don't know what C# has.)
>
> A solution looking for a problem.

Noted.

>
>
>
>> Exceptions: Nope.
>> (C++ has them; they're not used that often. Java has them and
>> they're used extensively.)
>
> C has exceptions. At least all the versions of C I have ever had the
> displeasure to use.

Erm....not that *I* know of. Of course C++ does, but that's a different
language.

>
>
>
>> Virtual: Nope.
>> (Closest C can get is a function pointer, and such is actually used
>> in the Linux kernel.)
>
> Not sure what you mean by "virtual". Sounds like another poorly chosen
> keyword.

Virtual is inherent to C++'s implementation of polymorphism. Briefly
put:

class Item { virtual void print() = 0; }
class IntegerItem : public Item { int i; void print() { printInt(i); }}
class DoubleItem : public Item { double d; void print() { printDouble(d); }}

struct ItemPrinter
{
void operator()(Item * p) { p->print(); }
} itemPrinter;

std::list<Item *> listOfItems;

std::for_each(listOfItems.begin(), listOfItems.end(), itemPrinter);

or if you prefer, skip ItemPrinter and just use a for loop:

for(std::list<Item*>::const_iterator i = listOfItems.begin();
i != listOfItems.end(); i++) { (*i)->print(); }

>
>
>> Threading support: Nope.
>> (External libraries pick up some of the slack.)
>
> Agreed, much better to have it directly specified at the language level.
>

Java has the "synchronized" keyword. Smalltalk might have message
atomicity of some sort.

>
>
>> Array Length: Nope.
>> (Java has a.length. C++ STL has dynamic classes that attempt to
>> mimic arrays but that's not strictly speaking part of the language.)
>
> Compiler knows it, and can pass it along to the application. What you do
> with it is your concern.
>
> I have no problem with this.
>
>
>> Dynamic Array Creation: Nope.
>
> Use pointer/index referencing. For linear arrays. Even for
> multi-dimentional arrays with some difficulty.
>
>> Dynamic Array Size Modification: Nope.
>
> Malloc what you need and overlay your array of structures.
>
>
>> Arbitrary method call: Nope.
>> (Another Smalltalkism, although Java's invoke on introspected
>> methods comes close.)
>
> Unknown....
>

Useful in certain command dispatch contexts. For example, one might
implement in Java a command parser that maps the first word of a
command line to a method name, and subsequent words to arguments
associated with that name.

>
>
>> Transparent casting: Nope.
>> (I'm not sure what to call this but in C++ an object will
>> occasionally be constructed or cast when one wants to call a method
>> on another object related thereto. This can lead to subtle bugs but
>> can also be very useful.)
>
> I prefer all casting to be explicit. fewer errors.

I'm inclined to agree, but in any event C botched it there too;
casts are very hard to find. (The modern casting variant is
easier to search for but is not used nearly as often.)

>
>
>
>> Pointer Arithmetic: Yep.
>> (This is one reason C gets away with so much. If one declares
>> int a[10];
>> int *b = a;
>> then b[0] == a[0] and *(b+1) == a[1]. One can contrast
>> this with Draco, which did not have the implicit multiplication,
>> or with a modified Pascal with the addr() construct. In a
>> more proper language one would write
>> int *b = &a; or int *b = addr(a);.)
>
> The problem is with your interpretation of a[10] you think a[10]
> represent an array of 10 items. In reality it is a pointer to an array of
> 10 items.

No, it is an array of 10 items. The "a" cannot be moved; nor can it be
assigned to.

PASCAL allows

PROGRAM test;

TYPE
int10array: array[1..10] of integer;
VAR
a, b: int10array;

BEGIN
a := b;
END.

However, the equivalent in C will generate an error during compile time.

int a[10], b[10];

int main()
{
a = b;
}

test.c.:7: error: incompatible types in assignment

One can try a typedef but that doesn't work either.

In PASCAL, if one wants a pointer, one must declare it as a pointer,
and the following will *NOT* work:

TYPE
int10array: array[1..10] of integer;
intptr: ^integer;
int10arrayptr: ^int10array;
VAR
a: intptr;
b: int10arrayptr;
BEGIN
new(b);
a := b; (* ERROR! *)
END.

In short, PASCAL knows the difference between a pointer to an array,
an array, and a pointer to a value. C gets very sloppy.

For its part Java treats everything as a pointer except for primitives.
This means that initializers such as

String[][] a = new String[][]{
new String[]{
"a11", "a12", "a13"
},
new String[]{
"a21",
},
new String[]{
"a31", "a32"
},
};

actually make sense. One can also do things such as

a[1] = new String[5];

later on. All this does is replace the second row.

>
> Approached this way the subtle contextual switch vanishes. But might
> agree, the contextual switch shouldn't be there in the first place.
>
> Problem is , if you do this then
>
> int a[10]
> int *b
> b=a
>
> &b returns the address of variable b while
> &a returns the address, not of a but of the int [10]
>
> So now you have another kind of unwanted contextual switch.

That too. It's a problem. PASCAL is clean if wordy.
C is a mess.

>
>
>> > Just like Linux.
>> > It's all there, but in general, it's ineptly implemented.
>> >
>> > As a result, unworthy of consideration.
>
>
>> Of course. And you've made a detailed study of the Vista source
>> code to ensure that all of your objections have been met, I trust?
>
> Linux ineptitude doesn't need source code to see. You only need to install
> it to see how much of a shit stick it is.

Ah, of course, how silly of me. Vista's expertise is widely known.

>
>
>
>
>> You said it was cracked, not quicksand. Of course Unix is built on C.
>> What does *that* tell *you*?
>
> That shit breeds shit.
>
>> > Fuck, it doesn't even have block type checking. It's defaults are
>> > backwards with every module level definition having global scope.
>> >
>> > It doesn't define what a fucking integer is.
>
>
>> Actually, it does, just rather vaguely. I'd have to look.
>
> No, the language doesn't define it. The implementation does. And every
> implementation can be different.
> So in other words, there is no definition.
>
>
>> Of course part of the issue back then was variable sizes of machine word
> types.
>> Honeywell, for instance, had an old machine that had 9-bit bytes.
>
> Yup. I don't care. Emulate 8 bit bytes or refuse to compile.
>
>
>
>> > Operator creation on the other hand would have great utility.
>>
>> Java doesn't have that, either; neither does C++. The only language I
>> know that has such is a rather obscure dialect at one point named ICL; I
>> believe it was from CAL TECH. Of course its idea of an operator was
>> \<word>, but one could create as many such as one desired.
>>
>> Smalltalk might have something but I'd have to look.
>
> Operator overloading is really not a selling feature. It's only usable in
> very,very limited situations.

Like matrix multiplication, I suppose.

>
>
>> > You mean, they should be hidden wherever possible.
>>
>> If you prefer. Certainly they should not be out in the open where abuse
>> is rampant.
>
> I spent 20 years programming in assembler so pointer use is second nature
> to me. The implemntation in hardware of safe pointers would solve all
> pointer problems of course. C provides no directition for such
> advancements.
>
>
>> > Iterating over a list via pointer is still faster than array references.
>>
>> Implementation detail.
>
> One that compilers are not very good at resolving. Typically they will
> generate a multiply and an add for every variable reference in an array, but
> with a pointer they only do an index, which can often be combined and run in
> parallel with the memory access.
>
>
>> > 0 = no characters read (eof) -n = error.
>>
>> Negative numbers are not possible in an unsigned integer context.
>
> Which is why I would define the return type as signed rather than just
> "int" which leaves the sign/not signed state undefined (C incompetence)

But that leaves out half of the integers. Anything greater than
2^31 and less than 2^32-1 would be impossible. Java made a very
strange decision for example to not allow unsigned integers;
certain algorithms have difficulties.

>
>
>
>
>> Unsigned set of bitflags, you mean. A lot of Win32 calls are of this
>> type, ORing flags all over the place. Useful, but can be problematic.
>
> Bitflags are small and compact and you can pass a whack of em in one
> variable.
> nice and efficient like.

So can sets of enums. PASCAL did need a hint: the PACKED keyword.

>
>
>
>> > C and C++ both provide enum, but don't do a *2 scale, but rather a
> simple
>> > increment. Hence it is not convenient for flag creation.
>>
>> Assignment of arbitrary integers to enum values is possible, if a bit
>> ugly.
>
> Ya, and you can use multiple defines.
>
>
>> Problematic in some communications. Sockets in particular may have no
>> data available in non-blocking mode. This is not an EOF but simply
>> an indication that there's no data available right now.
>
> Then return an error condition-8= socket data not available.
>
> I suppose you could also do -1 = EOF. 0 = no data available.

Unix went the "0" route, with error EAGAIN in some contexts.

>
>
>> > Wrong. I have no love of Microsoft and note that from 1980 to 1995 they
>> > managed to grow as a result of inept competiton from the Unix community.
>> > Inept competition that continues until this day.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> And they are still growing.
>
> Aided by the incompetence of the Linux community.

Well yes, but once Linux's throat is slit and Microsoft Windows achieves
100% market share everywhere, we won't have to worry. Microsoft's
competence is known far and wide; that's why there are so many Linux
CERT advisories -- over 100,000 of them at last count.

>
>
>
>> > Linux./Unix however continues to flounder, fumbling around with attempt
>> > after attempt to make itself more compatible with past Unix failures.
>> >
>> > Unix/Linux is no real competition to Microsoft. It could be.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> No, it cannot. Microsoft is the holder of a number of C#/.NET patents,
>> the best language/environment known to the modern world (the
>> Microsoft-centric variant, anyway).
>
> Compiling to virtual assembler has a primary advantage of obscuring the
> source from whence the code came.

This is an advantage?

> The secondary advantage is translation
> into a universal standard langauge that can be then cross assembled onto a
> target instruction set.

Guess what gcc does. Go on. Guess.

Hint: it's not solid or liquid or plasma.

>
> For the moment Linux has C++ and since the sourse is open, the source can
> be open.

For the moment. We'll have to take care of that now, won't we? After
all, there are so many holes in Linux. Any hacker can find one.
Because Microsoft Windows Vista has closed source code, it's
very secure; so secure, in fact, that nobody's ever heard of
a blue screen of death, whereas BSODs happen all the time in
Linux boxes.

They're so well known in fact that they've been seen in widebanner
displays in Melbourne, Australia, airport locales, and even
machine control units.

>
>
>
>>
>> > But the
>> > Unix/Linux community are too content with living in a 1970 teletype
> driven
>> > world.
>
>> 1960's. Get your story straight.
>
> The University I attended was still using punchcards in 1981.
>
> I was disgusted.
>

GreyCloud

unread,
Sep 20, 2006, 2:37:04 PM9/20/06
to
Scott Nudds wrote:

Well, did you ever get any pussy in your life?

<guffaw>

The Ghost In The Machine

unread,
Sep 20, 2006, 3:00:05 PM9/20/06
to
In comp.os.linux.advocacy, tha...@tux.glaci.remove-this.com
<tha...@tux.glaci.remove-this.com>
wrote
on Wed, 20 Sep 2006 14:35:45 +0000 (UTC)
<eerjk1$ie1$1...@tux.glaci.com>:

> The Ghost In The Machine <ew...@sirius.tg00suus7038.net> wrote:
>>
>> In any event, fgets() is a problematic call, as is the
>> entire C language, for various reasons. Not that it
>> doesn't work -- a meat grinder can kill you but it also is
>> useful in grinding meat. :-) Just try not to put a hand
>> in during operation, and make sure maintenance is done
>> properly and buy from a reputable supplier, as opposed
>> to one using the advertising tagline "We make bombs and
>> grinder parts While-U-Wait" or some such.
>
> The thing is, C was originally designed to be just one step
> up from a macro assembler, and in that respect it succeeds at
> its goals.

There is that, and I even have a printout of a SmallC
compiler. It had 4 files and maybe 25 or so pages (each
page maybe 66 or so lines of 80 column code, fanfold).

Wiki suggests this is one of many variants:

http://en.wikipedia.org/wiki/Small-C

I suspect this is SmallC.lha (since I was heavily into
Amiga hardware at one point) but would have to look for
it, and at it. The main focus would be to build a system
from scratch using little more than a binary patch editor
and an ability to write to a floppy.

gcc of course is now far bigger and depends on tools such as
bison/byacc/yacc and [f]lex, but the greatest trees can grow
from the smallest acorns.

> Certainly there are some legitimate gripes about
> lack of consistency across different platform implementations
> (i.e. size of primitives, etc), but judging it by the standards
> of other languages with more rigid type checking and better
> memory management and so on is not really a fair comparison.
> It was meant to be a small, flexible tool that could be
> implemented on the less powerful hardware of its day. Sure,
> people choose it now for higher level tasks that are better
> suited to other languages, but that is hardly the fault of
> any design choice in C. If I use a wrench to pound in a nail,
> its not the fault of the wrench.

Depends on whether the documentation coming with the wrench is
sufficiently readable. :-) In any event, a hammer may not
be appropriate for turning a bolt, though that can be done
as well, with a lot of work.

I'll admit I'm not familar with Nudd's implementation of libc.
It's not glibc, that's clear.

>
> Later,
>
> Thad

Scott Nudds

unread,
Sep 20, 2006, 9:27:41 PM9/20/06
to

<tha...@tux.glaci.remove-this.com> wrote

> The thing is, C was originally designed to be just one step
> up from a macro assembler, and in that respect it succeeds at
> its goals.

Well, not really. It optimizes like shit. Typically generating code that
is 2-4-8 times larger and 2-4-8 times slower than tuned assembler.


<tha...@tux.glaci.remove-this.com> wrote


> Certainly there are some legitimate gripes about
> lack of consistency across different platform implementations
> (i.e. size of primitives, etc), but judging it by the standards
> of other languages with more rigid type checking and better
> memory management and so on is not really a fair comparison.

Why not?


<tha...@tux.glaci.remove-this.com> wrote


> It was meant to be a small, flexible tool that could be
> implemented on the less powerful hardware of its day.

It's a tool. I'll give it that. But it's a very poor tool given it's wide
variety of sanctioned implementation specific behaviours.

K&R could always offer a public apology.... I would reduce their jail time
if they did.

<tha...@tux.glaci.remove-this.com> wrote


> Sure, people choose it now for higher level tasks that are better
> suited to other languages, but that is hardly the fault of
> any design choice in C. If I use a wrench to pound in a nail,
> its not the fault of the wrench.

So what you do is you go to the hardware store to find a hammer, and nails
and all you find are wrenches and screws because everyone is using their
wrench as a hammer to hammer in screws.

I don't blame C for being an Abomination. I blame K&R and those who were
involved in the standardization of the filth that is the C langauge.

Scott Nudds

unread,
Sep 20, 2006, 9:29:05 PM9/20/06
to

"GreyCloud" <mi...@cumulus.com> wrote in message

> Well, did you ever get any pussy in your life?

Yawn......


Scott Nudds

unread,
Sep 20, 2006, 10:30:23 PM9/20/06
to

> In comp.os.linux.advocacy, Scott Nudds

> > I don't attempt to fix that which can not be fixed.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Who says this cannot be fixed? Given enough energy *anything*
> can be fixed.

You can't fix 2 so that it equals 1. You can however sometimes fix the
facts around the evidence, but that will hopefully get you impeached,
criminally convicted and executed.

You can't fix C, because the flaws that make up C are central to the C
language. A fix is necessarily a replacement.


> > Typically you can only
> > push the problem deeper into more esoteric and less predictable
territory
> > where it will crop up and cause all sorts of nonsense problems later on
when
> > people forget the esoteric conditions where the so called "fix" no
longer
> > fixes anything.
> >
> > Many people make their living out of not solving problems in exactly
this
> > way.
> >
> > They are part of the problem, not part of the solution. Indeed they do
> > nothing but perpetuate the failure.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Oh, so you *do* want to go around slashing throats then?
> That's...comforting.

I oppose the perpetuation of failure. Hence I oppose C, and the C Shit
Lickers who promote and defend it.


> > I don't think the jails would be large enough. I'm willing to excempt
> > juvi's who were too young to understand the extent of their crime.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Don't worry. The Juvi's are being programmed into using VB.

Excellent. VB is vastly superior to C. The latest versions even have <<
and >> binary operators, although unfortunately they are arithmatic shifts
not logical shifts.

Once Microsoft implements unsigned binaries that don't have overflow
checking I suspect this unfortunate << >> behaviour will be able to be
avoided.

VB is slick. VB.NET is very slick. Joyful. Beauty. Round in all the
right places. Jucy, Enviting, Sweet, Savory, Warm, Fuzzy, and shaved in all
the right places.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> The compiler cannot do it alone. Consider the above again,
> with this call:
>
> char * callit()
> {
> char * buf[*] = (char *[*]) malloc(256);
> readMyData(buf);
> }
>
> In an ideal system this would actually work, and the [*]
> construct would be a hint to the compiler that he has to
> [a] look up/calculate the actual size of the buffer at
> runtime, and [b] pass whatever is relevant so that anyone
> passed that buffer would know it as well.

Traditionally an empty brace flags the kind of behaviour you are looking
for.

The syntax need not be so cryptic however.

void Callit() {
const BUFSIZE = 256
char buf[] = malloc(BUFSIZE)

readMyData(&buf[0],BUFSIZE)

}

or

void Callit() {
const BUFSIZE = 256
char *buf = malloc(BUFSIZE)

readMyData(buf,BUFSIZE)

}


> > You can of course include the buffer size along with the buffer.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> It has to be done.

Yes. If you don't like that I can understand your pain. But <not>
explicitly including it means that it must be included in a hidden form, and
that is against the language's design goals.

You can certianly develop a langauge that does this, and yes I agree it
would be better, but I don't consider it a language flaw.

Having an undefined size for an integer on the other hand and not having
a set range <IS> a serious flaw.


> > But when
> > you do it is another matter. It makes sense to do it with strings at
all
> > times since their meaning is clear. But how about just a general buffer
> > that is used for input of integers and ascii strings and unicode
strings.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> A "general buffer"? Are you referring to a structure, or to some sort
> of uninterpreted area of memory that hasn't been processed yet?

A structure is a high level format that is imposed on a buffer of some
sort. The buffer is the storage area and the structure is the
interpretation of that area.

I am referring to a situation where there may be alternate interpretations
to a data area depending upon the data contained within.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Why define a struct like that? The compiler should *assume* it.
> In other words, the following:
>
> string s = "Hi";
> printf("%d %d %*.*s\n", s.buflen, s.strlen, s.strlen, s.strlen, s.buf);
>
> would actually work. Or, if one wants to get even sillier:

I have no problem with that.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> printf("%d %d %*.*s\n", "Hi".buflen, "Hi".strlen,
> "Hi".strlen, "Hi".strlen, "Hi".buf);
>
> would work as well. (This actually *does* work in Java, except
> for the printf().)

It's the price you pay for logically handled strings.

> >> [3] Strings can handle arbitrarily large chars.
> >
> > No value.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> You're right. We should restrict ourselves to ASCII.
> No point in gumming up the works with all of that thar
> foreign-type stuff like Arabic and Chinese and Klingon
> and Ebonics. Just good old American English.

There are 256 characters available, 220 or so are available for display
purposes. I grant you the freedom to use 2 byte charactes and define
another 65300 characters for these languages. The lower 256 however will
always be ascii and control.

If you need more charactrs than please have your head checked for worms.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Depends on the context. In any event, initialization is
> not a given.
>
> char buf[1024];
>
> and
>
> void routine
> {
> static char buf[1024];
> }
>
> are initialized (to zeroes, actually).
>
> void routine()
> {
> char buf[1024];
> }
>
> is not. Confusing? You bet your sweet bippy.

I don't think there is any guarantee of buffer initialization in the C
standard. And yes it is inconsistant and confusing. And another reason why
C is a Shit Stick.

Fortunatly C is just about dead at this point.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> > You can't prevent that and it's not your problem to worry about others
> > clobbering your data. If they do, it's their fault.
>
> I can prevent it by redefining the problem to use a different language.

Well. not really. Errant pointers are always a problem. Even when the
langauge doesn't explicitly provide them.

Delimited pointers are the solution to the pointer problem. But they
require some hardware assistance to work properly and efficiently.


> > Machines can however be built to prevent such things. It requires the
> > implementation in hardware of safe pointers in which each pointer is
defined
> > to take a starting address and an ending address, and any attempt to use
> > that pointer outside of those addresses cause a critical application
fault.
> >
> > C provides no direction to hardware manufacturers to implement such
> > pointers. hence they are not implemented.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> An interesting hypothesis. Too bad LISP machines never took off.

Lisp doesn't lend itself very well to bit twiddeling, and that is largely
what PC's do. It wouild be kinda like trying to build an OS out of
MicroSoft word Macro's.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> And hopefully nonsettable. An interesting point.

It would be settable as shown unfortunately, you can write protect blocks,
but not individual variables.
There isn't much of an alternative I'm afraid. Delimited pointers would
be a solution though. By necessity they can be write protected.


> You've already explained variable types -- and I believe standards are
> in place for some of them (e.g., int32). What, precisely, is "block
> typing"? Are you referring to encapsulation of every memory block with
> a RTTI descriptor?

I'm not sure what RTTI means, but block typing is simply the process of
checking the end of a block to help ensure that
it matches the block that it was intended to end.

do
while

for
next

case
end_case

function
end_function

etc...


> > Unicode needs to be wished into a corn field, along with Linux.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> OK. And what would it be replaced with? I'd suggest wchar_t.

I wouldn't replace it with anything. I wouild just adopt a standard for the
upper 128 characters in an 8 bit byte and leave it at that.

There is no value in supporting ZULU as a language type.

Supporting ZULU does nothing bur promote disunity, misunderstanding and
violence.

> >> > It's very simple. Use ASCII And if you need more characters use
> >> > Extended ASCII, you have 256 characters in total there.
> >
> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> >> Whoopee. The Chinese will not be very happy with you.
> >
> > The Chinese speak english at this point. The Japanese as well So do
the
> > people of India, and Pakistan, and most people in Europe.
> >
> > Unicode fosters separateness, isolation, an inability to communicate.
> >
> > Unicode is not only technical stupidity, it is pure unadulterated
evil.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Well, there you have it. I'm sure your proposal can be forwarded to the
> government and then to the ISO straightaway.

Do you think the ISO would adopt standards that extinguish the need for
thier own existance?


> > What about the Martians? The Lipicysesians? How about the shrews..
God
> > don't forget about the shrews.....

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> I don't see any Martians. Asians I see every day. :-) I'll worry about
> grokking Martian when Martians come to visit.

Unicode isn't going to be very Uni on that day now is it?

I have never understood the mindset in which people argue that disunity
and the inability to communicate should be not only accepted but promoted.

Unicode is an embodyment of that sick philosophy.


> > You mean, how are they displayed on a page? Specifically are you asking
what
> > direction you advance in order to move to the next character position?
> >
> > My question is what is the size of a page?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> If you want to expand the scope that much, yes, what is the method by
> which one transliterates arbitrary characters into a bitmap, and
> then printed?

If you wish to convey a text message and you are printing backwards you had
better know the starting position on the page. If you intend to print down
the page you had better know how many lines you have.

Martians of course will want you to print into the page, and you should be
looking for ways to supprt that as well.

I'll spare you the requirements of the pandimentional.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> As for page size...research "A4". Even in the US we have 8 1/2"x11"
> and 8 1/2" x 14", plus a fair number of other sizes in various contexts.

Is that your standard? All printing will be to manuscript size A4?
Portrate not lanscape I presume.

Excellent. Submit your suggestion to the ISO.

> > It's not rocket science. But far too complex for cocksucking morons
like
> > K&R to comprehend.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> K&R didn't invent Unicode, methinks.

And yet still too complex for their small chimplike minds.


> > Once the first few end up feeding wheat fields, the others will get
the
> > picture.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> No doubt they will...and then go after you. The police don't like
> lawbreakers. They're funny that way. :-)

A small price to pay for a better world.

> > Pointers are unique to C
>
> Hardly:

Well, at the time it was developed.


> TYPE
> something = ^RECORD
> ...
> END;

> LISP had nothing *but* pointers, unless someone stuffed a value in the
> cdr (since a cdr might have been 15 bits that was a little tough to do
> unless the value was a small integer).

Well, no. Lisp has refernces but no pointers.


> And of course assembly had machine addresses, which is what a pointer
> is, for the most part.

No, that's exactly what a pointer is.

> So you're talking about a speed-versus-safety tradeoff.

Not generally. But speed is a consideration. But even here C fails...


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Then do so, and submit it.

Done so many times. But not submitted to the ISO who would never accept it.

http://www.youtube.com/watch?v=KoekkotLxgA&NR


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> That's because you haven't *defined* one yet. Until you
> define one (and presumably patent and market it to
> a sufficiently large group as well), the problem will
> remain unsolved. Once the world of course recognizes
> the sheer unadulterated brilliance of your contribution,
> then they'll all say *BOP* "Wow! We could have been
> doing it that way for all of these decades!".

At this point, there is no point. C has done it's damage and there is
nothing that can be done to undo it.

PASM cross assemblers are now entering the mainstream and both C and C++
will soon be relegated to the dustbin of history.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> So put in the effort. You've identified a problem. Fix it!

I don't have the decades needed to write a new optimizing compiler. But I
have poduced a specification. You should be able to find it online. But it
matters not. C and C++ have already done their damage and the world is
moving on.

> Why indeed? Did you have an alternate in mind?

Windows is the only viable alternative. But of course you know that.

The Ghost In The Machine

unread,
Sep 20, 2006, 9:00:07 PM9/20/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Wed, 20 Sep 2006 19:30:23 -0700
<6akQg.106224$sS1....@read1.cgocable.net>:

Run Time Type Identification. Basically, handed a pointer, one can
determine what type it belongs to.

> but block typing is simply the process of
> checking the end of a block to help ensure that
> it matches the block that it was intended to end.
>
> do
> while
>
> for
> next
>
> case
> end_case
>
> function
> end_function
>
> etc...

Ah, a code construct.

>
>
>> > Unicode needs to be wished into a corn field, along with Linux.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> OK. And what would it be replaced with? I'd suggest wchar_t.
>
> I wouldn't replace it with anything. I wouild just adopt a standard for the
> upper 128 characters in an 8 bit byte and leave it at that.
>
> There is no value in supporting ZULU as a language type.
>
> Supporting ZULU does nothing bur promote disunity, misunderstanding and
> violence.

Supporting anything that's nonASCII and nonEnglish supports
disunity, misunderstanding, and violence.

>
>
>
>> >> > It's very simple. Use ASCII And if you need more characters use
>> >> > Extended ASCII, you have 256 characters in total there.
>> >
>> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> >> Whoopee. The Chinese will not be very happy with you.
>> >
>> > The Chinese speak english at this point. The Japanese as well So do
> the
>> > people of India, and Pakistan, and most people in Europe.
>> >
>> > Unicode fosters separateness, isolation, an inability to communicate.
>> >
>> > Unicode is not only technical stupidity, it is pure unadulterated
> evil.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Well, there you have it. I'm sure your proposal can be forwarded to the
>> government and then to the ISO straightaway.
>
> Do you think the ISO would adopt standards that extinguish the need for
> thier own existance?

If the case is strongly enough presented, perhaps. As in "your
existence is now illegal. Go away."

Failing that, I'm not certain. But they can read handwriting on the
wall, and if there's enough handwriting things get interesting.

>
>
>
>
>> > What about the Martians? The Lipicysesians? How about the shrews..
> God
>> > don't forget about the shrews.....
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> I don't see any Martians. Asians I see every day. :-) I'll worry about
>> grokking Martian when Martians come to visit.
>
> Unicode isn't going to be very Uni on that day now is it?
>
> I have never understood the mindset in which people argue that disunity
> and the inability to communicate should be not only accepted but promoted.
>
> Unicode is an embodyment of that sick philosophy.

Exactly. ASCII is the only standard ever needed.

>
>
>> > You mean, how are they displayed on a page? Specifically are you asking
> what
>> > direction you advance in order to move to the next character position?
>> >
>> > My question is what is the size of a page?
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> If you want to expand the scope that much, yes, what is the method by
>> which one transliterates arbitrary characters into a bitmap, and
>> then printed?
>
> If you wish to convey a text message and you are printing backwards you had
> better know the starting position on the page. If you intend to print down
> the page you had better know how many lines you have.
>
> Martians of course will want you to print into the page, and you should be
> looking for ways to supprt that as well.
>
> I'll spare you the requirements of the pandimentional.
>
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> As for page size...research "A4". Even in the US we have 8 1/2"x11"
>> and 8 1/2" x 14", plus a fair number of other sizes in various contexts.
>
> Is that your standard? All printing will be to manuscript size A4?
> Portrate not lanscape I presume.
>
> Excellent. Submit your suggestion to the ISO.

My standard? Hardly. A4 is a European variant of paper size
and well known in England.

In any event, all printing should be done on 8 1/2" x 11".
This is, after all, an American problem.

>
>
>
>> > It's not rocket science. But far too complex for cocksucking morons
> like
>> > K&R to comprehend.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> K&R didn't invent Unicode, methinks.
>
> And yet still too complex for their small chimplike minds.
>
>
>> > Once the first few end up feeding wheat fields, the others will get
> the
>> > picture.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> No doubt they will...and then go after you. The police don't like
>> lawbreakers. They're funny that way. :-)
>
> A small price to pay for a better world.
>
>> > Pointers are unique to C
>>
>> Hardly:
>
> Well, at the time it was developed.
>
>
>> TYPE
>> something = ^RECORD
>> ...
>> END;
>
>> LISP had nothing *but* pointers, unless someone stuffed a value in the
>> cdr (since a cdr might have been 15 bits that was a little tough to do
>> unless the value was a small integer).
>
> Well, no. Lisp has refernces but no pointers.

I suppose one could construe it as such. CAR and CDR were 15 bit values
in an old computer system.

>
>
>> And of course assembly had machine addresses, which is what a pointer
>> is, for the most part.
>
> No, that's exactly what a pointer is.

Not in FORTRAN SPICE. The concept got extended and corrupted there.

>
>
>
>> So you're talking about a speed-versus-safety tradeoff.
>
> Not generally. But speed is a consideration. But even here C fails...
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Then do so, and submit it.
>
> Done so many times. But not submitted to the ISO who would never accept it.
>
> http://www.youtube.com/watch?v=KoekkotLxgA&NR

Don't be stupid. The ISO is obligated to at least consider it,
presumably.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> That's because you haven't *defined* one yet. Until you
>> define one (and presumably patent and market it to
>> a sufficiently large group as well), the problem will
>> remain unsolved. Once the world of course recognizes
>> the sheer unadulterated brilliance of your contribution,
>> then they'll all say *BOP* "Wow! We could have been
>> doing it that way for all of these decades!".
>
> At this point, there is no point. C has done it's damage and there is
> nothing that can be done to undo it.

Oh, I dunno about that. If one gets a sufficiently good language out
there it'll take the world by storm. Something COOL. Something that
eventually doesn't become a hash of its former self.

>
> PASM cross assemblers are now entering the mainstream and both C and C++
> will soon be relegated to the dustbin of history.

Cross assemblers? What good would *those* do? RISC doesn't have AAD;
X86 doesn't have instructions executed following jumps.

Granted, a cross assembler can be used to develop code on one platform
for eventual execution on another, much like cross-compilers. However,
the cross compiler can accept code originally intended for the
originating box, with proper porting controls.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> So put in the effort. You've identified a problem. Fix it!
>
> I don't have the decades needed to write a new optimizing
> compiler.

You don't need to write the *compiler*. You merely need to specify
a *language*. Do that, and someone else might write the compiler,
if you present it strongly enough.

> But I
> have poduced a specification. You should be able to find it
> online. But it matters not. C and C++ have already done
> their damage and the world is moving on.

OK. Got a website?

You didn't write .NET or MSIL (at least, not as far as I know). If
that's your entry, then fine; state so and have done with it.

>
>
>
>> Why indeed? Did you have an alternate in mind?
>
> Windows is the only viable alternative. But of course you know that.
>

So Windows is now a computer language. Interesting.

Scott Nudds

unread,
Sep 21, 2006, 3:36:08 AM9/21/06
to

> > I'm not sure what RTTI means,

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Run Time Type Identification. Basically, handed a pointer, one can
> determine what type it belongs to.

Iew.. Considerable cost there, both in terms of storage space and processing
overhead.

Best to avoid that kind of nonsense, or relegate it to a special class of
variables where the overhead is minimized.


> > but block typing is simply the process of
> > checking the end of a block to help ensure that
> > it matches the block that it was intended to end.
> >
> > do
> > while
> >
> > for
> > next
> >
> > case
> > end_case
> >
> > function
> > end_function
> >
> > etc...

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> Ah, a code construct.

No, assigning a block type to a block, so that the end type can be matches
with the block type to ensure
correspondance.

It's a good thing.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Supporting anything that's nonASCII and nonEnglish supports
> disunity, misunderstanding, and violence.

Nor omgnik kusk oc kuoy ssae htp uamm omru oykc ufog.

But transmitting graphics as UUEncoded text is just peachy...

Right Ghost?

> > Do you think the ISO would adopt standards that extinguish the need
for
> > thier own existance?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> If the case is strongly enough presented, perhaps. As in "your
> existence is now illegal. Go away."

Yes I suppose that is the only way they would disband.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Failing that, I'm not certain. But they can read handwriting on the
> wall, and if there's enough handwriting things get interesting.

True. But those hands I've found are attached to small monkey brains of
the type that praise the glory of the Linux Shit Stick.

Hence they usually end up grunting and smearing their feces on that wall.


> > Unicode is an embodyment of that sick philosophy.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Exactly. ASCII is the only standard ever needed.

And what character set does UUEncoding use?

Isn't UUEncodeing used in the XFer of all form of binary data from smut,
smut and more smut, to mp3's, dvd's and the like.

Seems to me that if ASCII is capable of that, it's capable of
communicating with ZULU's

Aahahahahahahahah......

> > Excellent. Submit your suggestion to the ISO.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> My standard? Hardly. A4 is a European variant of paper size
> and well known in England.

But the adoption of A4 as <the> standard size for a page for text output
is your idea.

What's holding you back?


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> In any event, all printing should be done on 8 1/2" x 11".
> This is, after all, an American problem.

I kinda bet it will be done in ASCII.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Don't be stupid. The ISO is obligated to at least consider it,
> presumably.

Meaningless and pointless.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Oh, I dunno about that. If one gets a sufficiently good language out
> there it'll take the world by storm. Something COOL. Something that
> eventually doesn't become a hash of its former self.

Just look at the DotNet languages. They are the future.you can pretty much
forget everything else.

> > PASM cross assemblers are now entering the mainstream and both C and
C++
> > will soon be relegated to the dustbin of history.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Cross assemblers? What good would *those* do? RISC doesn't have AAD;
> X86 doesn't have instructions executed following jumps.

Well, then I guess to perform an Ascii Adjust after a Division one will
have to inline some code to peroform the equivalent operation, and then
optimize the resulting code stream.

You know, CPU's don't have an "width = x(10)*pi" instruction either but
the compilers do manage to produce a stream of instructions that perform the
desired task.

As to execution during jump, this is simply a matter of altering
instruction ordering. I'll relegate that to the peep hole optimizer.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Granted, a cross assembler can be used to develop code on one platform
> for eventual execution on another, much like cross-compilers. However,
> the cross compiler can accept code originally intended for the
> originating box, with proper porting controls.

A properly constructed language wouldn't need any alteration at all unless
it was manipulating hardware directly. But lets exclude those programs for
the moment.

JIT compilation provides two nice advantages. Code security and code
security.

Code security in the sense that reverse compiling the code is difficult,
and Code security in the sense that altering the code is difficult.

Executables are generally smaller than the source, so there are speed
advantages here as well, and of course the instruction set of the primary
compiler's target (virtual) CPU can be chosen to give hardware manufacturers
some guide to where to guide the next generation of hardware acceleration.

You might have noticed that I'm big on restricting pointer access by
altering the structure of the base CPU pointer register.


> > I don't have the decades needed to write a new optimizing
> > compiler.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> You don't need to write the *compiler*. You merely need to specify
> a *language*. Do that, and someone else might write the compiler,
> if you present it strongly enough.

Done.


> > But I
> > have poduced a specification. You should be able to find it
> > online. But it matters not. C and C++ have already done
> > their damage and the world is moving on.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> OK. Got a website?

Nope. But somewhere I still have the specification.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> So Windows is now a computer language. Interesting.

Pretty much. Windows applications are nearly universally written in one
of the Microsoft langauges that run under Microsoft Visual Studio.

There are some that are written in Borland products of course, but that's
a small minority

Things will become even more tightly integrated between language and OS
with Vista DotNet.

I think you will generally like the future computing universe. Between
modules, you will have a data flow type environment where messages are
passed between independent objects each with their own serialization buffers
to maintain some form of order. The volume of messages that can be
processed at any time will be in direct proportion to the number of CPU
cores available, and it won't matter where those cores are. Local, remote,
it just won't matter.

Underlying that, you will have some more traditional code and
connectiviity that generates those messages and responds to them.

Individual processes then will be limited in speed by the speed of the
cores that are running the individual modules.

On the module level you will see objects. On the code level you will see
OOP objects and traditional code.

This is the Future of computing as seen and as will be delivered by
Microsoft.

GreyCloud

unread,
Sep 21, 2006, 1:26:01 AM9/21/06
to
Scott Nudds wrote:

Ah, too bad.

Scott Nudds

unread,
Sep 21, 2006, 5:26:13 AM9/21/06
to

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> So you're talking about a speed-versus-safety tradeoff.

There is no inherent safety problem that is associated with union
constructs. But yes, if you wish to avoid them, then you have to implement
a less efficient method of manipulation.


> > Why recreate yet another version of a loser OS?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Why indeed? Did you have an alternate in mind?

Just use windows. It functions well enough. Just clean up the API.
DotNet will do that if properly used by Microsoft.


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> It's a lot of work, but worth it in order to get rid
> of every scrap of "loser C code".

Eventually it will just be machine translated away.

That's the only hope at this point.

> > Java suffers from the same brain dead syntax as C. Smalltalk I prefer
to C
> > and squeak I've never heard of before other than the occasional squeen
from
> > the mouse that is living in my kitchen.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> So like I said...put in the effort.

Squeek is a smalltalk 80.envionment I see. A little mousie universe all
on it's own.


> > I see you haven't done much in the way of file parsing or header
> > processing. Lets say you have a header for a file of type xyz. There
are 3
> > different versions of the file type and each has it's own header
structure.
> > With a union you read the header, then process the version number, then
use
> > that to overlay the right union to get the proper interpretation of the
> > data.
> >
> > They also allow buffers to be re-used in different contexts of course.
> >
> > Unions are quite useful.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> But an abomination nonetheless.

It's the most efficient way of processing headers. Unions allow you to
change interpretive contexts depending on the data found. Very useful.


>
> union
> {
> int v;
> char b[12];
> } u;
>
> u.v = 1234;
> printf("%s\n", u.b);
>
> Now, in a language such as Pascal, which has variant types, this might
> be disallowed and the code throw an exception, if I'm not mistaken.
>
> Java doesn't have unions.

Microsoft Visual Basic does. It doesn't have unions though. Don't know
about the DotNet version. Should check that out.

I see no reason why variant types should enter into the picture here, as
everything is well typed to <not> be a variant.


> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> >> Classes: Nope.
> >
> > Not really needed. There are other means of encapsulation

> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> Describe.

There is module level encapsulation of course, and then there is
translation unit encapsulation.

Both can be exploited to provide the same manner of encapsulation as you
get with classes. But. you have to have some standards in place on what you
are going to call your class instantiation and deinstantiation functions.

Creating a new object for example would actually just call a module
(class) function that returns a pointer to a newly allocated structure.
Future references to that "object" would call the root object and pass the
instance of the object (the pointer to the object's data area, or perhaps an
index to a table containing that data), to the object. Variable references
within the object are then made via the pointer provided.

This is just a manual version of what is happening behind the scenes
anyhow.

Nothing special.

The point is, class like data encapuslation doesn't demand class
nomenclature.


> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> >> Pointers: Another abomination.
> >
> > A direct mapping onto a machine register and hence fast and efficient,
not
> > requiring a multiply and add for every variable reference as is
typically
> > required for array indexing.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Not all that direct, and not a register. A pointer is merely a virtual
> address in most implementations, and a virtual address is just that: it
> requires translation by the MMU. The real, physical address could be
> anywhere in RAM.

Yup, which at runtime is transferred into a pointer register prior to use.

Don't confuse pointer with address. A pointer is a variable that contains
an address. The address itself is a number reperesnting a position in RAM.
A poitner is a variable (and hence a CPU register) that holds an address.

> > Since you don't need classes, you don't need methods obviously since
> > methods are class functions.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> No, they're methods. :-) But C++ treats them as functions, for the most
> part; the names are mangled to protect the innocent.

No, they are simply function calls. Albeit with one or more hidden
parameters.
You can even look at the Opcode level and lo and behold you will find that
they are structured
just like a function call.

Of course, it couldn't be otherwise.

> > But you can still encapsulate your data by simply putting the code in
a
> > module and setting access permissions accordingly.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> And where are these "access permissions" defined?

In the module header and in the module itself of course. I am speaking of
the lovely C characteristic of defaulting
all module level labels to have global scope.

The C default behaviours are invariably exactly opposite those dictated by
common sense.

If the inventors of C could fuck something up. They did. And did so
willingly and with great pleasure.


> > You will have to pass your object ID along with your function calls
> > manually.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> What object ID?

A number that identifies the object. This could be an index to a
translation table that holds pointers to the object data areas or a direct
pointer to the objects data area.

> > Define operator <zyz> = function(left,right)
>
> Ah, ICL.

A lot more useful than simple operator overloading.


> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> >> (C++ introduced the concept of having two functions with the same
> >> name but different signatures.)
> >
> > That's polymorphysm.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> No, that's overload. Polymorphism requires classes; overloading of
> functions does not.

Really? I'll have to review.

Just goes to show you how bad the Nomenclature that these ShitLickers have
develped actually is.


> > No objects hence no need.

> Structures?

Stuctures aren't really considered objects. But aggregate types.

> >> Virtual: Nope.
> >> (Closest C can get is a function pointer, and such is actually used
> >> in the Linux kernel.)
> >
> > Not sure what you mean by "virtual". Sounds like another poorly
chosen
> > keyword.
>
> Virtual is inherent to C++'s implementation of polymorphism. Briefly
> put:
>
> class Item { virtual void print() = 0; }
> class IntegerItem : public Item { int i; void print() { printInt(i); }}
> class DoubleItem : public Item { double d; void print() {
printDouble(d); }}
>
> struct ItemPrinter
> {
> void operator()(Item * p) { p->print(); }
> } itemPrinter;
>
> std::list<Item *> listOfItems;
>
> std::for_each(listOfItems.begin(), listOfItems.end(), itemPrinter);
>
> or if you prefer, skip ItemPrinter and just use a for loop:
>
> for(std::list<Item*>::const_iterator i = listOfItems.begin();
> i != listOfItems.end(); i++) { (*i)->print(); }

Sorry, I find that shit above impossible to follow - particularly at 1:30 in
the morning...
You are just referring to the syntax used to employ polymorphysm.

Again, shit syntax on the part of C++ developers.


> >> Threading support: Nope.
> >> (External libraries pick up some of the slack.)
> >
> > Agreed, much better to have it directly specified at the language
level.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Java has the "synchronized" keyword. Smalltalk might have message
> atomicity of some sort.

Iew.. yuck. You can't do anything with that. At a minimum you want do have
a parallelizing option in the definition of a function.

uint Wongomatic_status = 0

void wongomatic(uint a, uint b) parallel {
inc(Wongomatic_status)
Code here
dec(Wongomatic_status)
}

void do_two() {
Wongomatic (1,1)
Wongomatic (1,2)
wait(Wongomatic_status)
}

> >> Arbitrary method call: Nope.
> >> (Another Smalltalkism, although Java's invoke on introspected
> >> methods comes close.)
> >
> > Unknown....

> Useful in certain command dispatch contexts. For example, one might
> implement in Java a command parser that maps the first word of a
> command line to a method name, and subsequent words to arguments
> associated with that name.

Still unclear. Sounds like you want to implement a state machine.


> >> Transparent casting: Nope.
> >> (I'm not sure what to call this but in C++ an object will
> >> occasionally be constructed or cast when one wants to call a method
> >> on another object related thereto. This can lead to subtle bugs
but
> >> can also be very useful.)
> >
> > I prefer all casting to be explicit. fewer errors.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> I'm inclined to agree, but in any event C botched it there too;
> casts are very hard to find. (The modern casting variant is
> easier to search for but is not used nearly as often.)

The goal was to use a minimum number of symbols. Hence everything is hard
to find. It was a stupid idea from the very beginning. See Block
Typing....

> >> Pointer Arithmetic: Yep.
> >> (This is one reason C gets away with so much. If one declares
> >> int a[10];
> >> int *b = a;
> >> then b[0] == a[0] and *(b+1) == a[1]. One can contrast
> >> this with Draco, which did not have the implicit multiplication,
> >> or with a modified Pascal with the addr() construct. In a
> >> more proper language one would write
> >> int *b = &a; or int *b = addr(a);.)
> >
> > The problem is with your interpretation of a[10] you think a[10]
> > represent an array of 10 items. In reality it is a pointer to an array
of
> > 10 items.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> No, it is an array of 10 items. The "a" cannot be moved; nor can it be
> assigned to.

To be more precise, it's the address of an array of 10 items. a is not an
array. It is the address of an array.

>
> PASCAL allows
>
> PROGRAM test;
>
> TYPE
> int10array: array[1..10] of integer;
> VAR
> a, b: int10array;
>
> BEGIN
> a := b;
> END.
>
> However, the equivalent in C will generate an error during compile time.
>
> int a[10], b[10];
>
> int main()
> {
> a = b;
> }
>
> test.c.:7: error: incompatible types in assignment
>
> One can try a typedef but that doesn't work either.

Yes, because the language is interpreting a and b to be addresses, and as
you say they are fixed and are not variables. hence a=b is a request to
assign the fixed address b to the fixed address a, which makes no sense.

It's equivalent to saying &a[0] = &b[0]

By increasing the compiler's smarts it could perform a proper
interpretation, and then do a block copy of b to a if they are compatible,
or you could arrange to have a and b be pointers to a data area rather than
the data area itself. that way a=b would assign the address contained in
pointer b to pointer a.

But then how would you ever reference the original buffer pointed to by a
again?

> In short, PASCAL knows the difference between a pointer to an array,
> an array, and a pointer to a value. C gets very sloppy.

Ultimately there is only one kind of pointer, and that pointer translates
to a CPU pointer regisrer that holds an address. I don't see how C is being
inconsistant here.


> For its part Java treats everything as a pointer except for primitives.
> This means that initializers such as
>
> String[][] a = new String[][]{
> new String[]{
> "a11", "a12", "a13"
> },
> new String[]{
> "a21",
> },
> new String[]{
> "a31", "a32"
> },
> };
>
> actually make sense. One can also do things such as
>
> a[1] = new String[5];
>
> later on. All this does is replace the second row.

Disgusting.

> > Approached this way the subtle contextual switch vanishes. But might
> > agree, the contextual switch shouldn't be there in the first place.
> >
> > Problem is , if you do this then
> >
> > int a[10]
> > int *b
> > b=a
> >
> > &b returns the address of variable b while
> > &a returns the address, not of a but of the int [10]
> >
> > So now you have another kind of unwanted contextual switch.
>
> That too. It's a problem. PASCAL is clean if wordy.
> C is a mess.

The solution is simple in the case of C though. Just avoid the use of
"a" as a standin for &a[0].

> > Linux ineptitude doesn't need source code to see. You only need to
install
> > it to see how much of a shit stick it is.
>
> Ah, of course, how silly of me. Vista's expertise is widely known.

Linux inferiority is widely known. Vista has only been hinted at. We will
know in November when it is released.

Your hope of it being 10 to 20 times slower than exisiting XP will of course
be dashed, and in 1 day, Vista will have more users than Linux has managed
to obtain in 15 years.

I feel your sense of loss already.


> > Operator overloading is really not a selling feature. It's only
usable in
> > very,very limited situations.

> Like matrix multiplication, I suppose.


What of matrix multiplication. You do know that C++ is allowed to alter
the order of it's operations for operators of the same precidence.

So for Matricies A,B,C, how do you ensure that A=B*C = A=B*C when the
compiler could compute A=C*B

And of course you do realize that with matricies multiplication isn't
reflexive.

How about vector multiplication? What are you going to use for cross and
dot products?

Much better to define your own operators and have them all have fixed
precedence.


> But that leaves out half of the integers. Anything greater than
> 2^31 and less than 2^32-1 would be impossible.

Ya, well I guess you will have to live with using two reads to process
strings more than
2 billion charactres long.

> Java made a very strange decision for example to not allow unsigned
integers;
> certain algorithms have difficulties.

Really? -20 points for Java.

> >> Unsigned set of bitflags, you mean. A lot of Win32 calls are of this
> >> type, ORing flags all over the place. Useful, but can be problematic.
> >
> > Bitflags are small and compact and you can pass a whack of em in one
> > variable.
> > nice and efficient like.
>
> So can sets of enums. PASCAL did need a hint: the PACKED keyword.

Convenient, but unnecessary.


> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> >> And they are still growing.
> >
> > Aided by the incompetence of the Linux community.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Well yes, but once Linux's throat is slit and Microsoft Windows achieves
> 100% market share everywhere, we won't have to worry. Microsoft's
> competence is known far and wide; that's why there are so many Linux
> CERT advisories -- over 100,000 of them at last count.

Either get with the program of competing with Microsoft, or get out of the
way. LinTards have this habit of standing around with their thumbs up their
backside and pretending that the Linux Shit Stick is a threat to Microsoft,
when in reality it is one of Microsoft's key assets in that it is preventing
real competition from developing.

Linux is a great diversion of resources and effort that like all Unix
variants that came before it, is going nowhere fast.


> > Compiling to virtual assembler has a primary advantage of obscuring
the
> > source from whence the code came.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> This is an advantage?

Just ask any software company that wishes to develop for the environment.
They will have a keen interest in <not> releasing their source code to the
public.


> > The secondary advantage is translation
> > into a universal standard langauge that can be then cross assembled onto
a
> > target instruction set.
>
> Guess what gcc does. Go on. Guess.

It translates to Pcode. Which is something different than a virtual CPU.
Close but more abstract.

> Hint: it's not solid or liquid or plasma.

Quarkonium?
QuantumFroth?


> > For the moment Linux has C++ and since the sourse is open, the source
can
> > be open.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> For the moment. We'll have to take care of that now, won't we? After
> all, there are so many holes in Linux. Any hacker can find one.
> Because Microsoft Windows Vista has closed source code, it's
> very secure; so secure, in fact, that nobody's ever heard of
> a blue screen of death, whereas BSODs happen all the time in
> Linux boxes.

I don't think I've had a non-hardware related BSOD in the last half
decade. Perhaps longer.

And who is this "We" you are referring to? There isn't enough
organization in the LinTard community for there to be any "we".

It's all I, I, I, I, I, and thats why there are 300 or so different
Distro's, and 5 GUI's, and 500 text editors, and 900 different shells, etc,
etc, etc.

Open source suffers from a lack of decipline in it's proponents. That's why
Open source code is almost universally pathetic.

The Ghost In The Machine

unread,
Sep 21, 2006, 12:00:05 PM9/21/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Thu, 21 Sep 2006 00:36:08 -0700
<IEoQg.72634$ED.5...@read2.cgocable.net>:

It's a code construct. I don't know about it being a good thing but
then VI has spoiled me regarding parenthesis, bracket, and brace
matching. No doubt that can be done in this new system.

It's not unreasonable, though many languages opt instead for '{}' (C)
or 'begin end' (Pascal, Modula-2).

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Supporting anything that's nonASCII and nonEnglish supports
>> disunity, misunderstanding, and violence.
>
> Nor omgnik kusk oc kuoy ssae htp uamm omru oykc ufog.
>
> But transmitting graphics as UUEncoded text is just peachy...
>
> Right Ghost?

Correct. Pictures should be allowed but non-ASCII should not be.
If the proper headers are included HTML should also be allowed,
along with RTF, Microsoft Word (encoded as necessary), and
of course Excel spreadsheets.

These are all standard formats.

>
>
>
>> > Do you think the ISO would adopt standards that extinguish the need
> for
>> > thier own existance?
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> If the case is strongly enough presented, perhaps. As in "your
>> existence is now illegal. Go away."
>
> Yes I suppose that is the only way they would disband.

There's a thought. Think Congress could pass a law making
Microsoft the Official OS? Failing that, importers could
impound Linux-based Chinese machines.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Failing that, I'm not certain. But they can read handwriting on the
>> wall, and if there's enough handwriting things get interesting.
>
> True. But those hands I've found are attached to small monkey brains of
> the type that praise the glory of the Linux Shit Stick.
>
> Hence they usually end up grunting and smearing their feces on that wall.
>
>
>> > Unicode is an embodyment of that sick philosophy.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Exactly. ASCII is the only standard ever needed.
>
> And what character set does UUEncoding use?

Standard ASCII. The data is encoded 6 bits at a time, if
memory serves.

>
> Isn't UUEncodeing used in the XFer of all form of binary
> data from smut, smut and more smut, to mp3's, dvd's and the like.

There are two encoding methods. UUEncoding is the unofficial
but widely supported method. Base64 is the official one.

The two are generally distinguishable if one is looking at
encoded text by the fact that UUEncoded text starts with the
letter 'M' (a coded line length).

>
> Seems to me that if ASCII is capable of that, it's capable of
> communicating with ZULU's

ASCII is not by itself capable of UUEncoding; one needs a codec.

>
> Aahahahahahahahah......
>
>
>
>> > Excellent. Submit your suggestion to the ISO.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> My standard? Hardly. A4 is a European variant of paper size
>> and well known in England.
>
> But the adoption of A4 as <the> standard size for a page
> for text output is your idea.
>
> What's holding you back?

Nothing. It's already possible on Linux boxes.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> In any event, all printing should be done on 8 1/2" x 11".
>> This is, after all, an American problem.
>
> I kinda bet it will be done in ASCII.

All printing should also be done in ASCII, yes.

>
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Don't be stupid. The ISO is obligated to at least consider it,
>> presumably.
>
> Meaningless and pointless.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Oh, I dunno about that. If one gets a sufficiently good language out
>> there it'll take the world by storm. Something COOL. Something that
>> eventually doesn't become a hash of its former self.
>
> Just look at the DotNet languages. They are the future.you can pretty much
> forget everything else.

True. We should start implementing a Python.NET.

>
>
>
>> > PASM cross assemblers are now entering the mainstream and both C and
> C++
>> > will soon be relegated to the dustbin of history.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Cross assemblers? What good would *those* do? RISC doesn't have AAD;
>> X86 doesn't have instructions executed following jumps.
>
> Well, then I guess to perform an Ascii Adjust after a Division one will
> have to inline some code to peroform the equivalent operation, and then
> optimize the resulting code stream.
>
> You know, CPU's don't have an "width = x(10)*pi" instruction
> either but the compilers do manage to produce a stream of
> instructions that perform the desired task.

This is true.

>
> As to execution during jump, this is simply a matter of altering
> instruction ordering. I'll relegate that to the peep hole optimizer.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Granted, a cross assembler can be used to develop code on one platform
>> for eventual execution on another, much like cross-compilers. However,
>> the cross compiler can accept code originally intended for the
>> originating box, with proper porting controls.
>
> A properly constructed

.NET

> language wouldn't need any alteration at all unless
> it was manipulating hardware directly. But lets exclude
> those programs for the moment.
>
> JIT compilation provides two nice advantages. Code security and code
> security.

JIT compilation is an abomination. Java uses it.

>
> Code security in the sense that reverse compiling the code is difficult,
> and Code security in the sense that altering the code is difficult.
>
> Executables are generally smaller than the source, so there are speed
> advantages here as well, and of course the instruction set of the primary
> compiler's target (virtual) CPU can be chosen to give hardware manufacturers
> some guide to where to guide the next generation of hardware acceleration.
>
> You might have noticed that I'm big on restricting pointer access by
> altering the structure of the base CPU pointer register.

I'm not entirely sure how one "alters the structure of the base CPU
pointer register". Perhaps you can expound thereon?

>
>
>> > I don't have the decades needed to write a new optimizing
>> > compiler.
>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> You don't need to write the *compiler*. You merely need to specify
>> a *language*. Do that, and someone else might write the compiler,
>> if you present it strongly enough.
>
> Done.

OK. Where's a website giving details?

>
>
>> > But I
>> > have poduced a specification. You should be able to find it
>> > online. But it matters not. C and C++ have already done
>> > their damage and the world is moving on.
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> OK. Got a website?
>
> Nope. But somewhere I still have the specification.

Fine. I'd like to see it.

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> So Windows is now a computer language. Interesting.
>
> Pretty much. Windows applications are nearly universally written in one
> of the Microsoft langauges that run under Microsoft Visual Studio.
>
> There are some that are written in Borland products of course, but that's
> a small minority
>
> Things will become even more tightly integrated between language and OS
> with Vista DotNet.
>
> I think you will generally like the future computing universe. Between
> modules, you will have a data flow type environment where messages are
> passed between independent objects each with their own serialization buffers
> to maintain some form of order. The volume of messages that can be
> processed at any time will be in direct proportion to the number of CPU
> cores available, and it won't matter where those cores are. Local, remote,
> it just won't matter.
>
> Underlying that, you will have some more traditional code and
> connectiviity that generates those messages and responds to them.
>
> Individual processes then will be limited in speed by the speed of the
> cores that are running the individual modules.
>
> On the module level you will see objects. On the code level you will see
> OOP objects and traditional code.
>
> This is the Future of computing as seen and as

ITYM "it".

> will be delivered by Microsoft.
>

For a price.

The Ghost In The Machine

unread,
Sep 21, 2006, 12:00:05 PM9/21/06
to
In comp.os.linux.advocacy, GreyCloud
<mi...@cumulus.com>
wrote
on Wed, 20 Sep 2006 23:26:01 -0600
<0JadnYuwMOb_v4_Y...@bresnan.com>:

> Scott Nudds wrote:
>
>> "GreyCloud" <mi...@cumulus.com> wrote in message
>>
>>>Well, did you ever get any pussy in your life?
>>
>>
>> Yawn......
>>
>>
>
> Ah, too bad.
>

It's quite clear that Vista.NET Will Fix Everything(tm),
according to Scott Nudds. This might include sexual
escapades though it's hard to say. Personally, I think
women transcend computers anyway; they use them as a tool.

Shmuel (Seymour J.) Metz

unread,
Sep 21, 2006, 12:40:28 PM9/21/06
to
begin In <eerjk1$ie1$1...@tux.glaci.com>, on 09/20/2006

>The thing is, C was originally designed to be just one step up from a
>macro assembler,

ITYM down; the C preprocessor is pathetic.

>but judging it by the standards of other languages with more
>rigid type checking and better memory management and so on is
>not really a fair comparison.

Well, not when you're using it on a DEC PDP-7, 9 or 15. But when it's
ported to more capable machines then it's fair to compare it to other
languages for those machines.

>It was meant to be a small, flexible tool that could be implemented
>on the less powerful hardware of its day.

But that's not how it's being used today. Nor was it the best that
could have been done on the PDP-9.

--
Shmuel (Seymour J.) Metz, SysProg and JOAT <http://patriot.net/~shmuel>

Unsolicited bulk E-mail subject to legal action. I reserve the
right to publicly post or ridicule any abusive E-mail. Reply to
domain Patriot dot net user shmuel+news to contact me. Do not
reply to spam...@library.lspace.org

Scott Nudds

unread,
Sep 21, 2006, 4:59:46 PM9/21/06
to

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> It's quite clear that Vista.NET Will Fix Everything(tm), according to
Scott Nudds.

Hmmm, that's not something that I would say, but Vista in conjunction with
DotNet will solve a variety of problems, and has the potential of solving
others, including the initiation of a phase out of older API's that were
hacked into existance and shouldn't really be present. Hopefully they will
be depreciated in Visa and gone in Vista II. Certainly the DotNet API goes
a long way in encapsulating the most desirable functions from the older
API's.

DotNet also solves the problem of code portability between dissimilar
machines, and the sheer bulk of the API makes the Win32 interface look
miniscule in comparison.

Emulation/Traslation interfaces like Wine will therefore take an order of
magnitude longer to consturuct - longer than the morph time of the interface
itself. Hence Wine type efforts are essentially pointless.

Aero is also positioned well to direct the evolution of graphic standards -
DirectX is already doing this of course with it's shader standards.

Aero however won't be adopted rapidly by users. All reports are that this
Aero system requires a lot of compute horsepower. This will <NOT> be a
problem in 3-4 years when most machines will be multi-core. Effectively 1
core will be able to be dedicated to the UI.

Compared to the introduction of earlier versions of Windows. Vista adoption
will be quite slow due to bad press and Microsofts failure to make a case
for why....

Vista is more of a great product for Microsoft than a prodcut that offers
people anything they really need. So explaining why upgrades should be done
is somewhat difficult.

But one thing can easily be predicted. 24 hours after the store doors are
opened, Vista will have a larger installed base of users than Linux has
managed to gain for itself over the entire 15 years of it's existance....


"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Personally, I think women transcend computers anyway; they use them as a
tool.

They use men as tools too.. But that's because women are superior.


tha...@tux.glaci.remove-this.com

unread,
Sep 21, 2006, 2:06:57 PM9/21/06
to
The Ghost In The Machine <ew...@sirius.tg00suus7038.net> wrote:
>
> I'll admit I'm not familar with Nudd's implementation of libc.
> It's not glibc, that's clear.

I actually worked on a lightweight glibc implementation for an
embedded linux project. It's actually rather impressive how small
you can squeeze that monster when you decide to support only one
platform and throw out a lot of the less used functions. It
was quite the battle deciding what should stay and what should go.
The application people wanted pretty much everything, but the
testing and certification people wanted to keep it as small as
possible. We had to strike a balance somewhere in between.

Thad

Scott Nudds

unread,
Sep 21, 2006, 6:06:32 PM9/21/06
to

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> It's a code construct. I don't know about it being a good thing but
> then VI has spoiled me regarding parenthesis, bracket, and brace
> matching. No doubt that can be done in this new system.
>
> It's not unreasonable, though many languages opt instead for '{}' (C)
> or 'begin end' (Pascal, Modula-2).

HLL's exist to make coding more convenient and less error prone. Since
block nesting errors can largely be
prevented with the implementation of block typing, and since block typing
presents no significant burdens to compiler writers, and doesn't in any way
alter code generation efficiency, it follows that to be true to the design
goals of every HLL, block typing is an implementation must.


> > But transmitting graphics as UUEncoded text is just peachy...
> >
> > Right Ghost?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Correct. Pictures should be allowed but non-ASCII should not be.

The point is that LinTards and UniTards have designed their communication
systems so that any kind of binary data is transmitted as ASCII. Yet here
you can't seem to exist without some method of printing that highly prized
ZULU language to the computer screen.

So what you need to do is develop a system of text encryption that
simultaneously uses characters that are 4,3,2,and 1 byte long to so that you
can define character sets for ZULU, Tik and ancient Sumarian. so that should
a PC fall through some crack in time and finds itself back 30,000 years ago
in the hands of a Neandertal, they will be able to tap at the keys to
compose a message to the future in their own languge. Magically supported
by the Unicode character set.

And then you intend to store that data as a series of 8 bit characters and
for the purposes of transmission, convert it to a series of 7 bit
characters, doing a reconversion at the destination.

You know, there is no Egyptian hyroglyph for Unicode FuckTard, and no key to
press on my keyboard to make this symbol appear even if there was such a
symbol. So you can just have to imagine that it appears inside the
following bracket you ().

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> If the proper headers are included HTML should also be allowed,
> along with RTF, Microsoft Word (encoded as necessary), and
> of course Excel spreadsheets.

What's the Zulu word for :"GRAFTABL" and where are the keys on my keyboard
needed to type in native Zulu?


> > Yes I suppose that is the only way they would disband.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> There's a thought. Think Congress could pass a law making Microsoft the
Official OS?

No need to pass a law. Why not just prevent the Microsoft Monopoly from
being broken up as required by Law...

Oop, Congressional Republicans and the Bushie White House have already
done that.....

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> Failing that, importers could impound Linux-based Chinese machines.

I wonder how many of those pentium class machines will be found in Iran in
the next couple of years. I read yesterday how the Justice Department has
just fined an American Company 150,000 for sending 2,000 low speed pentium
motherboards to the middle east because they were cross shipped to Iran in
violation of U.S. export restrictions.

I guess no one has managed to tell the Repugs in the U.S. congress that
there is more computing power in a land fill C64 than was available during
the Manhatten project.


> > And what character set does UUEncoding use?

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Standard ASCII. The data is encoded 6 bits at a time, if
> memory serves.

ASCII uses 7 bits with the lower 32 bytes and the upper 1 byte used as
various terminal control characters.

So you think that ASCII is quite suitable for the transmission of digital
video, but isn't up to snuff for displaying font required to display the
Zulu language.

That's real odd because if you photographed the Zulu text you could just
transmit it like you do video, or any other ASCII encrypted binary data.


> > Isn't UUEncodeing used in the XFer of all form of binary
> > data from smut, smut and more smut, to mp3's, dvd's and the like.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> There are two encoding methods. UUEncoding is the unofficial
> but widely supported method. Base64 is the official one.
>
> The two are generally distinguishable if one is looking at
> encoded text by the fact that UUEncoded text starts with the
> letter 'M' (a coded line length).

Wow, it starts with the letter M, does it. That's some great standard you
have there. You know Sesime Street often starts with the letters P, Q and
L.

UniTard fuckups have actually managed to creat a system of encoding and
transmission where 8 bit binary data is converted to 7 bit Ascii, so that it
can be converted back to 8 bit binary data for display and manipulation and
then convered back to 7 bit data for trasmission where it is converted back
to 8 bit data for reception, and converted back to 7 bit data for expansion
and then converted back to 8 bit data for final storage.

It's brilliant I tells ya. Fucking stupid Unix ShitLickers.


> > But the adoption of A4 as <the> standard size for a page
> > for text output is your idea.
> >
> > What's holding you back?

> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> Nothing. It's already possible on Linux boxes.

But not a standard so different people may have different ideas on what a
page size should be. Now how are you going to propely format your Zulu
script then?

Your solution was to set A4 as the standard page size for display. OK.
Why don't you approach the ISO and have them define that standard page size
as the standard page size for all textual display?

I'm waiting....

> > "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
> >> In any event, all printing should be done on 8 1/2" x 11".
> >> This is, after all, an American problem.
> >
> > I kinda bet it will be done in ASCII.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> All printing should also be done in ASCII, yes.

So you stick is to be able to display Zulu, Coptic, Ancient Sumarian,
Egyptian glyphs, Aztec knots and the like on screen but not be able to print
them?

You LinTards are the origin of all half baked - unworkable - non-solutions
aren't you?

> > Just look at the DotNet languages. They are the future.you can pretty
much
> > forget everything else.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> True. We should start implementing a Python.NET.

I see, so you want to support a language like Python that has no block
delimiters and which loses all assemblance of program flow if you load it
into an editor that happens to convert white space in the form of spaces to
white space in the form of tabs?

You see, it's suggestions like yours, that have the rest of us saying that
you LinTards have shit for brains.


> > JIT compilation provides two nice advantages. Code security and code
> > security.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> JIT compilation is an abomination. Java uses it.

Java does use it. DotNet uses it too. It's a requirement for the
security of write once, run anywhere applications and applets.

Welcome to the real world Ghost. PASM is here to stay.

> > Code security in the sense that reverse compiling the code is
difficult,
> > and Code security in the sense that altering the code is difficult.
> >
> > Executables are generally smaller than the source, so there are speed
> > advantages here as well, and of course the instruction set of the
primary
> > compiler's target (virtual) CPU can be chosen to give hardware
manufacturers
> > some guide to where to guide the next generation of hardware
acceleration.
> >
> > You might have noticed that I'm big on restricting pointer access by
> > altering the structure of the base CPU pointer register.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote


> I'm not entirely sure how one "alters the structure of the base CPU
> pointer register". Perhaps you can expound thereon?

By implementing in hardware pointer registers that contain address
delimiters as well as the pointer itself.

I have written about this in reasonable detail. Haven't you been
listening?


> > Nope. But somewhere I still have the specification.
>

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> Fine. I'd like to see it.

Given your reluctance to read what I have written about delimited pointers
in the past, I'm not going to give finding the specification a high
priority. You undoubtely will refuse to read it as well.

> > will be delivered by Microsoft.

"The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote

> For a price.

Absolutely. You say that as if you find profit abhorent. Are you a
Communist Ghost?

The Ghost In The Machine

unread,
Sep 21, 2006, 4:00:03 PM9/21/06
to
In comp.os.linux.advocacy, Scott Nudds
<nos...@foo.com>
wrote
on Thu, 21 Sep 2006 13:59:46 -0700
<aqAQg.107692$sS1....@read1.cgocable.net>:

>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> It's quite clear that Vista.NET Will Fix Everything(tm), according to
> Scott Nudds.
>
> Hmmm, that's not something that I would say, but Vista in conjunction with
> DotNet will solve a variety of problems, and has the potential of solving
> others, including the initiation of a phase out of older API's that were
> hacked into existance and shouldn't really be present. Hopefully they will
> be depreciated in Visa and gone in Vista II.

Deprecated, you mean.

> Certainly the DotNet API goes
> a long way in encapsulating the most desirable functions from the older
> API's.

And getting rid of the least desirable ones.

Replacing Apache will be interesting. I'm assuming
Microsoft will engage in a campaign to require Web servers
to install IIS in order to allow people to use the new,
improved IE7.0 stuff.

>
> DotNet also solves the problem of code portability between dissimilar
> machines, and the sheer bulk of the API makes the Win32 interface look
> miniscule in comparison.

Say not "bulk". Say "capability".

>
> Emulation/Traslation interfaces like Wine will therefore take an order of
> magnitude longer to consturuct - longer than the morph time of the interface
> itself. Hence Wine type efforts are essentially pointless.

It's a moving target.

>
> Aero is also positioned well to direct the evolution of graphic standards -
> DirectX is already doing this of course with it's shader standards.
>
> Aero however won't be adopted rapidly by users.

Why not?

> All reports are that this
> Aero system requires a lot of compute horsepower.

Horsepuckey. If one is willing to take a User Experience of 1, one can
probably use Aero with as little as 256 MB and a subpar non-GL graphics
card.

> This will <NOT> be a
> problem in 3-4 years when most machines will be multi-core. Effectively 1
> core will be able to be dedicated to the UI.

And another core will be in the card itself -- the GPU. It's getting
close, and will probably get even closer as time goes on.

Effectively, every machine is already dual-core -- CPU
and GPU. Vista will be ready. Will Linux?

(It turns out yes, but it takes some setup. Vista comes ready out of
the box; Linux may have to battle its own demons in the X area.)

>
> Compared to the introduction of earlier versions of Windows. Vista adoption
> will be quite slow due to bad press and Microsofts failure to make a case
> for why....

Don't worry. That will be highly accelerated once Microsoft starts
spending ad revenue. Presumably this will be in the Christmas
timeframe.

>
> Vista is more of a great product for Microsoft than a prodcut that offers
> people anything they really need. So explaining why upgrades should be done
> is somewhat difficult.

New data search engine (WinFS).
New IE capabilities such as tabbed browsing.
New .NET APIs on the local box that can connect to specialized servers,
ready to do cool things. I'm not sure what those cool things are but
presumably Microsoft will think of something.

>
> But one thing can easily be predicted. 24 hours after the store doors are
> opened, Vista will have a larger installed base of users than Linux has
> managed to gain for itself over the entire 15 years of it's existance....

It already does. How many of you out there have the beta? :-)

(Not me. I lack the hardware. But there's probably a few early
adopters slavering over the new functionality Vista gives them,
especially for game development.)

>
>
> "The Ghost In The Machine" <ew...@sirius.tg00suus7038.net> wrote
>> Personally, I think women transcend computers anyway; they use them as a
> tool.
>
> They use men as tools too.. But that's because women are superior.
>

The Ghost In The Machine

unread,
Sep 21, 2006, 4:00:03 PM9/21/06
to
In comp.os.linux.advocacy, Shmuel (Seymour J.) Metz
<spam...@library.lspace.org.invalid>
wrote
on Thu, 21 Sep 2006 13:40:28 -0300
<4512ce8c$1$fuzhry+tra$mr2...@news.patriot.net>:

> begin In <eerjk1$ie1$1...@tux.glaci.com>, on 09/20/2006
> at 02:35 PM, tha...@tux.glaci.remove-this.com said:
>
>>The thing is, C was originally designed to be just one step up from a
>>macro assembler,
>
> ITYM down; the C preprocessor is pathetic.

I'll admit I was seeing better even back then. The VMS assembler
preprocessor was very capable, for instance.

>
>>but judging it by the standards of other languages with more
>>rigid type checking and better memory management and so on is
>>not really a fair comparison.
>
> Well, not when you're using it on a DEC PDP-7, 9 or 15. But when it's
> ported to more capable machines then it's fair to compare it to other
> languages for those machines.
>
>>It was meant to be a small, flexible tool that could be implemented
>>on the less powerful hardware of its day.
>
> But that's not how it's being used today. Nor was it the best that
> could have been done on the PDP-9.
>


--

GreyCloud

unread,
Sep 21, 2006, 4:30:51 PM9/21/06
to
The Ghost In The Machine wrote:

> In comp.os.linux.advocacy, GreyCloud
> <mi...@cumulus.com>
> wrote
> on Wed, 20 Sep 2006 23:26:01 -0600
> <0JadnYuwMOb_v4_Y...@bresnan.com>:
>
>>Scott Nudds wrote:
>>
>>
>>>"GreyCloud" <mi...@cumulus.com> wrote in message
>>>
>>>
>>>>Well, did you ever get any pussy in your life?
>>>
>>>
>>> Yawn......
>>>
>>>
>>
>>Ah, too bad.
>>
>
>
> It's quite clear that Vista.NET Will Fix Everything(tm),
> according to Scott Nudds. This might include sexual
> escapades though it's hard to say. Personally, I think
> women transcend computers anyway; they use them as a tool.
>

Maybe .NET is a brand name of panty hose.
He's certainly deluded then.

GreyCloud

unread,
Sep 21, 2006, 4:31:50 PM9/21/06
to
Scott Nudds wrote:

Yet even beta version of Vista is now getting hosed by viruses and other
malware. So much for M$ promises of security.

It is loading more messages.
0 new messages