I would like to have your opinion about CSE 3310 Validator v2.00 BETA 1E.
I'm using HomeSite 2.5 (beta), and I have found it useful to validate my
pages on my computer before uploading them to the server and validating
them with the Kinder, Gentler Validator or with Webtechs. I am also using
the Spyglass validator, but it seems to give strange results.
Do you think that CSE 3310 Validator or any validation software of this
kind is reliable?
Thanks for your help and ciao from Italy.
-------------------------------------------
Carlo Mario Chierotti - Acqui Terme, Italia
http://aquarius.net/Cmc_eng.htm
-------------------------------------------
in girum imus nocte et consumimur igni
>Hello.
>
>I would like to have your opinion about CSE 3310 Validator v2.00 BETA 1E.
>I'm using HomeSite 2.5 (beta), and I have found it useful to validate my
>pages on my computer before uploading them to the server and validating
>them with the Kinder, Gentler Validator or with Webtechs. I am also using
>the Spyglass validator, but it seems to give strange results.
I'm testing CSE 3310 right now, and I'm about ready to plunk down my
hard earned cash for it.  So far it's done everything I've wanted it
to and more.
bald...@airmail.net (Paul Schmehl)
http://www.utdallas.edu/~pauls/
"Certainly the pleasures of youth are great, but they
are nothing compared to the pleasures of adultery."
From "Anguished English" by Richard Lederer
> I would like to have your opinion about CSE 3310 Validator v2.00 BETA 1E.
I just downloaded it, and once the 150-document trial is used up I'll
gladly pay the $25 for it.
It rocks, and you don't need UNIX or an online validator, which lags
if you're using a modem.
It's a Win95/NT validator and it does everything I want.  I'm going
through my site correcting errors.  It rocks.
Vikram
___________________________________________________________
Vikram Pant		       vik...@vikrampant.com
  The House That Vikram Built - http://www.vikrampant.com
  MIDI-Fest (140MB+ of MIDI) - http://www.midifest.com
| I would like to have your opinion about CSE 3310 Validator v2.00
| BETA 1E [...] I have found it useful to validate my pages on my computer
| before uploading them to the server and validating them with the
| Kinder, Gentler Validator or with Webtechs. I am also using the Spyglass
| validator, but it seems to give strange results.
| Do you think that CSE 3310 Validator or any validation software of
| this kind is reliable?
KGV and Webtechs are CGI applications built around the nsgmls/SP
parsing package (See <URL:http://www.jclark.com/sp/>.) The SP parsing
library is widely recognized as the best freeware implementation of a
validating SGML parser. (The Spyglass validator is also apparently
based on SP, but unlike nsgmls it handles only a limited built-in set
of DTDs.) Since HTML is formally defined in SGML, anything passed by
KGV or Webtechs is practically certain to be formally correct. This is
what _validation_ means.
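A validator works from the DTD named in the document's own DOCTYPE
declaration and checks everything after it against that DTD; for HTML
3.2, for instance, the declaration is simply

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">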
However, even though an HTML document may be syntactically _correct_,
it need not be _robust_ in relation to the existing crop of HTML
user-agents (browsers, crawlers, etc.). Certain usages, while legal,
could be unsafe, i.e. are generally known to trigger bugs in (popular)
software. For instance, a bug of long standing in the Mosaic family
(NCSA, Netscape, Explorer) was the inability to recognize attribute
value literals in single-quotes: purely as a style/robustness issue,
it made sense to use double-quotes only. Another example is endtags
for TR and TD elements not really being "optional" for Netscape.
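Purely as an illustration, both of these are valid, but only the first
is robust in the above sense:

<TD ALIGN="right">cell text</TD>
<TD ALIGN='right'>cell text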
Programs that check for such gotchas are known as "checkers" or
"linters." Examples are WebLint and HTMLchek (See the list at
<URL:http://www.htmlhelp.com/links/validators.html>.) Their primary
function is to generate warnings, the more nagging the better!:-)
Almost all such linters are *not* validators: they use ad hoc parsing
heuristics as a simplification in order to focus on their built-in
sets of known problematic forms. Here and there, they will generate
bogus errors (as distinct from warnings) and perhaps even miss real
errors.
So, using both a validator to check for correctness and a linter to
check for robustness is a very good combination. 
CSE is a good product, but it's inappropriately named. It's a linter,
not a validator. The "Program Limitations" section in the WinHelp file
of the distribution leaves little doubt:
= 1. HTML Validator does not and cannot completely check a document
= for 100% correct syntax. 
This rules it out as a _validator_.
As a field test, running CSE on Mike Meyer's Bug Test page (see
<URL:http://www.phone.net/home/mwm/bugs.html>) proved instructive.
Out of the box, CSE considers double quotes in text and multi-line
attribute value literals to be "errors". They're definitely not, but
there can be no objection to a linter throwing warnings instead. It
also appears to be unaware of the full range of entity references
defined for HTML 2.0. An ability to read DTDs could have saved the
tedium of entering about 100+ definitions manually in the "Character
Entities" dialog box. Finally (and this was predictable), CSE still
doesn't understand comment declaration syntax: it issues a well-known
but misleading "definition", and misdiagnoses the problem on a line in
the Bug Test (line 37) that is actually treated differently by
Netscape 4.0 (wrong) and Explorer 3.0 (right).
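To illustrate the first two points, markup along these lines is
perfectly legal, whatever CSE may say about it:

<P>He said "hello" to me.</P>
<IMG SRC="logo.gif"
     ALT="alternate text that happens to
     span source lines">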
Nevertheless, its linting features are reasonably comprehensive, and
in its configuration dialog it allows element hierarchies (i.e. "this
can contain that") to be specified, which is a definite step up from
the other linters. The ability to read and write actual DTDs -- i.e.
published, if not well-known, structural specifications -- would be a
very welcome addition.
:ar
>of DTDs.) Since HTML is formally defined in SGML, anything passed by
>KGV or Webtechs is practically certain to be formally correct. This is
>what _validation_ means.
That may be more the case in recent weeks, as it seems a number of new
DTDs have been added. But even at that, if they consider it actually
worth 'reverse engineering' some Netscape tags just in order to
include them in a definition or configuration, then I think that's
where you lose me, at least.
Again, you are right to demand some standard, even a host of these, as
you can select from radio buttons at these sites. But a standard which
is not supported entirely is not as good as one which is, but which
seems non-standard because you only prefer the former and think the
market ought to be ignored, at any rate.
>However, even though an HTML document may be syntactically _correct_,
>it need not be _robust_
Correctness begs the question of a standard, I think. And I wonder
what you mean by, robust. You suggested, below, referring to Mosaic,
that it referred to inconsistencies and incompatibilities between
browsers. What might be an example of something that was, and
something that wasn't 'robust' - maybe even a couple of examples?
>it made sense to use double-quotes only. Another example is endtags
>for TR and TD elements not really being "optional" for Netscape.
What happens if you leave them off in NN?
>CSE is a good product, but it's inappropriately named. It's a linter,
>not a validator.
Is it unable to flag open containers, scoping errors, and the like?
What would a validator do that CSE would not? What does CSE provide
that we know other validators don't?
>= 1. HTML Validator does not and cannot completely check a document
>= for 100% correct syntax. 
>
>This rules it out as a _validator_. 
But, seriously, wouldn't it simply rule out all others, as well? since
_they_ can't even recognize certain tags and attributes that people
might well want to use.
>As a field test, running CSE on Mike Meyer's Bug Test page (see
><URL:http://www.phone.net/home/mwm/bugs.html>) proved instructive.
>Out of the box, CSE considers double quotes in text and multi-line
>attribute value literals to be "errors". They're definitely not,
Basically, what CSE picked up were unknown character aliases/entities
and the kludgy comment stuff meant to fool browsers; and disallowing
single quoted values (which can't be disabled from what I can see; but
then prob not a good thing to do, in the first place). Ironically, of
course, it passes HTML 2.0 but fails Wilbur and Cougar (barely, but
still fails). The question is, after filling in the character entities
(which admittedly, should have been in the configuration file), would
you really want a document like that passed as 'legal'; particularly
since all the comment trickery, in a real web page, would likely be
triggered by something as prosaic as a typo? Would you feel better
that it passed a 'validator', knowing where and why it failed CSE?
(seems I recall something of this line a few weeks ago, from folks
carrying the water for the purist side in all this)
>defined for HTML 2.0. An ability to read DTDs could have saved the
>tedium of entering about 100+ definitions manually
Should have been there. The neat thing is, you can type them in. And
it didn't miss 100, more like 15-20 or so.
>but misleading "definition", and misdiagnoses the problem on a line in
>the Bug Test (line 37) that is actually treated differently by
>Netscape 4.0 (wrong) and Explorer 3.0 (right).
If you mean line 34, it reads (line 37 isn't flagged, the one that
trips up NN):
<!--  -- --> Your parser terminates on dash-dash-> whether it
terminates the comment or not -->
Perhaps you've found a rather esoteric bug in CSE. I think I'd want
this flagged anyway, even if it was proposed to Netscape and Microsoft
as legal. As for NN 4, I don't know (I don't have it, anymore). But NN
3 and IE 3 handle the line in the same way (so does Lynx 2.7). So . . .
(in fact, it's ironic that Lynx seems less 'gullible' than NN, in that
it matches IE 3's take on the page where it describes various character
entities, even though the DOS version might not have a full character
set and so simply can't show them).
Peace.
> ar...@nmds.com (Arjun Ray) wrote:
> 
> >of DTDs.) Since HTML is formally defined in SGML, anything passed by
> >KGV or Webtechs is practically certain to be formally correct. This is
> >what _validation_ means.
> 
> That may be more the case in recent weeks,
No, it has meant that all along.  It's just you that's on the learning
curve here.
> >However, even though an HTML document may be syntactically _correct_,
> >it need not be _robust_
> 
> Correctness begs the question of a standard, I think.
Does that mean something?
> And I wonder
> what you mean by, robust. 
I have the impression that experienced HTML authors already know
what that means.  You could too, if you paid attention to the postings
that you are following-up to.
> What would a validator do that CSE would not?
You have already been told, several times: it guarantees formal
compliance with an SGML DTD - nothing more, and nothing less.  Linters
are also valuable tools, but they do a different job.  If you weren't so
busy exercising your democratic right to post, you might have grasped
this the first time around. 
In article <3314fde...@news.pacbell.net>,
1023...@compuserve.com (Mark Johnson) wrote:
> ar...@nmds.com (Arjun Ray) wrote:
> >it made sense to use double-quotes only. Another example is endtags
> >for TR and TD elements not really being "optional" for Netscape.
> 
> What happens if you leave them off in NN?
With nested tables, Netscape gets greatly confused if the cells etc
are not explicitly closed.
> >CSE is a good product, but it's inappropriately named. It's a linter,
> >not a validator.
> 
> Is it unable to flag open containers, scoping errors, and the like?
> What would a validator do that CSE would not? What does CSE provide
> that we know other validators don't?
A validator, by definition, checks against a DTD. If it does not
do that (and CSE apparently doesn't), then it _is not a validator_
no matter how useful it can be.
> But, seriously, wouldn't it simply rule out all others, as well? since
> _they_ can't even recognize certain tags and attributes that people
> might well want to use.
That's not the point.
> <!--  -- --> Your parser terminates on dash-dash-> whether it
> terminates the comment or not -->
Just for the record, this is a legal comment, and a browser must
not display any of the text. Lynx, with the right options set,
does get this right. It is actually one of the few browsers that
does.
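Roughly: a comment declaration is '<!', then any number of comments
each delimited by a pair of '--'s, then '>'. So, for example,

<!-- first comment -- -- second comment -->

is a single legal declaration (and so is the empty one, <!>); the line
quoted above parses the same way.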
-- 
E-mail: gala...@htmlhelp.com .................... PGP Key: 512/63B0E665
Maintainer of WDG's HTML reference: <http://www.htmlhelp.com/reference/>
>It's just you that's on the learning curve here.
You mean, generally? You seem less gracious in the flames you start
than others of your school.
>You have already been told,
I've been "told" lots of things by a few people who seem like they
never learn.
Peace.
>With nested tables, Netscape gets greatly confused if the cells etc
>are not explicitly closed.
I think you mean IE.
How about:
<table bgcolor=white width="80%"><tr>
<td><td>
  <table bgcolor=green width="100%"><tr>
    <td> </td>
    <td align=right>
    <table bgcolor=silver width="60%"><tr>
      <td> 
    </table>
  </table>
</table>
Netscape 3 seems to handle it. Not so IE. Then again, this really
isn't something you _want_ to do. But just to make note of it.
>A validator, by definition, checks against a DTD. If it does not
>do that (and CSE apparently doesn't),
Actually, it does, if you set up the configuration in that way. But a
DTD, least I've seen, isn't good enough for actual web page work, is
it? That requires, instead, knowledge of tags and attributes that may
even be specific to NN or IE, or whatever else comes along. You don't
want the tools you use preaching to you - NoNetscapeNow. It's just
lame. And you probably get sick of my saying this, and I realize that,
but what's the standard, here - the market and the browsers most
everyone use, or some rarely used and rarely seen academic thing that
a few folks seem to have sense of 'discovery' about?
>> <!--  -- --> Your parser terminates on dash-dash-> whether it
>> terminates the comment or not -->
>
>Just for the record, this is a legal comment, and a browser must
>not display any of the text. Lynx, with the right options set,
>does get this right. It is actually one of the few browsers that
>does.
Well, true, I think Lynx makes an interesting 'linter', of sorts, as
someone else suggested to me. And I've been checking my pages against
it. Beyond that . . .
Peace.
|> A validator, by definition, checks against a DTD. If it does not
|> do that (and CSE apparently doesn't),
|
| Actually, it does, if you set up the configuration in that way. 
No, it doesn't, because it can't. It is *not* written according to the
syntax actually involved in validation: its configuration dialogs
address only a subset of the provisions expressible in a DTD (and, as
it happens, in the SGML declaration.) There are usages permitted in
every known HTML DTD, which CSE is unaware of, that lead to some of
the bogus "error" messages it emits; and there are constraints imposed
by every known HTML DTD, which CSE is unable to impose, that lead to
known true errors not being detected at all.
An example of the first kind of variance is minimization applied to
attribute value specifications. Normally, these must be in the full
'name=quoted-value' syntax. Minimization consists of omitting (1) the
quotes, (2) the '=' _value indicator_, and (3) the attribute *name*.
This is permissible only when the attribute's legal values comprise a
_name token group_ (a fixed enumeration of name tokens.) 'ALIGN', for
instance, is one such attribute, so '<H1 center>' is legal usage.
Precisely the same minimization is involved for an attribute such as
ISMAP, where the 'ISMAP' retained in normal usage from the full form
<IMG ... ISMAP="ISMAP"> is the *value*, not the *name*.
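To illustrate, each pair below means exactly the same thing, the
second form of each being the minimized one:

<H1 ALIGN="center">Heading</H1>
<H1 center>Heading</H1>

<IMG SRC="map.gif" ISMAP="ISMAP">
<IMG SRC="map.gif" ISMAP>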
Examples of the second kind of variance are uniqueness constraints
(e.g. only one TITLE element is permitted per document) and pure
element content models (certain elements, such as UL, DL, OL, TABLE,
TR, etc. cannot contain free-standing text directly: the text has to
be in a contained sub-element.)  
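For instance, a fragment like

<UL>
stray text at this level is an error
<LI>text inside a list item is fine
</UL>

violates every HTML DTD, and is exactly the kind of true error referred
to above.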
| But a DTD, least I've seen, isn't good enough for actual web page
| work, is it? 
You haven't seen enough to hold an opinion anywhere close to informed.
| That requires, instead, knowledge of tags and attributes that may
| even be specific to NN or IE, or whatever else comes along. 
Did it ever occur to you to download the validation package, install
it on your own system, and learn how to edit DTDs? 
Or do you prefer simply babbling?
| You don't want the tools you use preaching to you - NoNetscapeNow.
You appear to lack any ability to keep your canards relevant.
| It's just lame. And you probably get sick of my saying this, and I
| realize that,
People could be sick of your garrulous rubbish.
| but what's the standard, here - the market and the browsers most
| everyone use, or some rarely used and rarely seen academic thing
| that a few folks seem to have sense of 'discovery' about?
A published formal specification exists. Produce either a competing
specification, or the source code for a reference implementation.
Anything that permits independent and objective verification.
Just stop your idiotic FUD.
:ar
In article <3318d6a2...@news.pacbell.net>,
1023...@compuserve.com (Mark Johnson) wrote:
> gala...@htmlhelp.com (Arnoud "Galactus" Engelfriet) wrote:
> >A validator, by definition, checks against a DTD. If it does not
> >do that (and CSE apparently doesn't),
> 
> Actually, it does, if you set up the configuration in that way. 
A DTD is a formal specification of a language using SGML syntax.
It is not the same as a lot of descriptions of the form "BODY
goes inside HTML" and "A goes inside TT, B, I, EM, ...".
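For instance, the declaration for the HTML element itself in the 3.2
DTD reads, roughly,

<!ELEMENT HTML O O  (HEAD, BODY)>

meaning both its tags are omissible (the two O's) and its content is
exactly one HEAD followed by one BODY.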
> But a
> DTD, least I've seen, isn't good enough for actual web page work, is
> it? 
That depends. You can write your own DTD based on some public DTD,
adding and removing stuff that you know is different. Some people
here do so, to ensure that they cover the common problems (like
missing ALT text, not-so-optional table cell closing tags, etc)
and newer features (frames, embed, etc). 
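For example, making TD's closing tag compulsory is just a matter of
changing the tag minimization in its element declaration from '- O'
(end tag omissible) to '- -', along the lines of

<!ELEMENT TD - - %body.content>

and making ALT compulsory means changing its declared default in IMG's
attribute-list declaration from #IMPLIED to #REQUIRED. (The exact
content models and entity names depend on the DTD you start from.)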
> lame. And you probably get sick of my saying this, and I realize that,
> but what's the standard, here - the market and the browsers most
> everyone use, or some rarely used and rarely seen academic thing that
> a few folks seem to have sense of 'discovery' about?
On the Internet, the standard is what is published by the IETF - the
RFCs, STDs and related documents. In the case of HTML and related
protocols, this authority is now with the W3C.
>it happens, in the SGML declaration.) There are usages permitted in
>every known HTML DTD, which CSE is unaware of,
You may have something there. It did flag some likely bogus comment
lines that were technically legal in that bugs.html. But like I said,
is this something you'd want to go unflagged, particularly since in
the real world something like that is likely just a typo?
>Examples of the second kind of variance are uniqueness constraints
>(e.g. only one TITLE element is permitted per document) and pure
>element content models (certain elements, such as UL, DL, OL, TABLE,
>TR, etc. cannot contain free-standing text directly: the text has to
>be in a contained sub-element.)  
You mean virtually anything outside the td container? I believe you
are right, though some tags can float around out there (with
unpredictable effects?). I don't suppose CSE can catch that. I guess
it should. Maybe next version, who knows?
>You haven't seen enough to hold an opinion anywhere close to informed.
You mean DTDs? As they often say - get real. Why do you think I went
with CSE, which was only $15 at the time? The on-line validators, at
that time, were kicking back all kinds of errors cause they didn't
'speak' Netscape, or Explorer, or whatever. It was frustrating. And I
wasn't going to keep playing that game. If CSE misses text at the
wrong scope in a table container, then - oh, well (next time). It's
better than the so-called 'validators', that seemed to choke on
basically everything you sent their way, just because they preferred
to preach NoNetscapeNow.
>Or do you prefer simply babbling?
If you understood what I'm saying in this, you wouldn't ask such an
ignorant question. Conversely, one might say it does appear, however,
that _you_ prefer simply flaming.
>| You don't want the tools you use preaching to you - NoNetscapeNow.
>
>You appear to lack any ability to keep your canards relevant.
You just have no clue what I'm referring to with this. You've got your
Lynx standard, your anti-industry standard so tapped into your
imagination, that you can't imagine a standard actually set by
industry and used by over 90% of the people surfing the net.
>People could be sick of your garrulous rubbish.
You're trying too hard.
>Just stop your idiotic FUD.
FUD . . . again? Did we agree that was Farsically Understated
Depositions, or Fat Uncle Dutch? I forget.
Anyway.
Peace.
>You can write your own DTD based on some public DTD,
>adding and removing stuff that you know is different. Some people
>here do so, to ensure that they cover the common problems (like
>missing ALT text, not-so-optional table cell closing tags, etc)
>and newer features (frames, embed, etc). 
How does one do that? And is it just simply more trouble than it's
worth, compared with CSE, backed up by say, KGV? And, please, don't
make me have to put this in the category of 'ask a silly question',
based on your reply. Anyhow.
>RFCs, STDs and related documents. In the case of HTML and related
>protocols, this authority is now with the W3C.
Well, just make sure to tell the guys in Menlo Park and Redmond
about it. I'm sure it would get a chuckle from the bunch, or so it
would appear (and not to say I _approve_ of competing standards, any
more than I would have thought an RTF war a good thing - just that the
'browser war', if it isn't over yet, has taken HTML along for the
ride).
Peace.
In article <331a98f4...@pbinews.pacbell.net>,
1023...@compuserve.com (Mark Johnson) wrote:
> gala...@htmlhelp.com (Arnoud "Galactus" Engelfriet) wrote:
> >You can write your own DTD based on some public DTD,
> 
> How does one do that?
First, you read an FAQ or tutorial on SGML (there are several
available through Yahoo or a Websearch on "SGML NEAR tutorial"),
and then you download the HTML 3.2 DTD and start hacking. :-)
> And is it just simply more trouble than it's
> worth, compared with CSE, backed up by say, KGV? 
That depends. If you know the syntax of a DTD, it is quite easy to
change it to something you want. Adding an element is a few minutes'
work, even if you're not used to it.
The tricky part is finding an SGML parser which can apply the DTD
to your document. James Clark's SP or nsgmls is the most popular
freeware parser. It is available from (IIRC) ftp://ftp.jclark.com/pub/sp
Unfortunately, I have yet to find a good "How to write your own
DTD and validate with it offline" guide.. perhaps I need to write
one. :-) But if you need help with specific stuff, feel free to
mail me.
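The rough shape of it: keep the DTD in a local file, point the document
at it with a system identifier in the DOCTYPE, for example

<!DOCTYPE HTML SYSTEM "my-html.dtd">

(any local filename will do), then run nsgmls with the -s option on the
document. Silence means it parsed as valid; anything else is an error
report. That's a sketch rather than a recipe, of course; catalogs and
the SGML declaration are exactly where the missing guide would help.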
> ar...@nmds.com (Arjun Ray) wrote:
>> There are usages permitted in every known HTML DTD, which CSE is
>> unaware of,
>
> You may have something there. 
Your theories are irrelevant. RTFM.
<URL:http://www.w3.org/pub/WWW/MarkUp/SGML/>
> Examples of the second kind of variance are uniqueness constraints
> [...] and pure element content models [...]
>
> You mean [clueless question]? I believe you are right, 
Your believing anything is of no consequence. RTFM.
<URL:http://www.w3.org/pub/WWW/MarkUp/>
>> You haven't seen enough to hold an opinion anywhere close to
>> informed.
>
> You mean DTDs? 
No. What DTDs are. Try this, for something simple:
<URL:http://www.dclab.com/wizzy/w19a.htm>
(and <URL:http://www.dclab.com/wizintro.htm> for more.)
> As they often say - get real. Why do you think I went with CSE, 
> which was only $15 at the time? 
The SP package is free: <URL:http://www.jclark.com/sp/>. Get real.
> The on-line validators, at that time, were kicking back all kinds of
> errors cause they didn't 'speak' Netscape, or Explorer, or whatever.
> It was frustrating. [moan, groan, etc.]
The SP package has always been free. Whining that free online services
based on a freely available package aren't serving one's wants, is a
tantrum characteristic of the slothful or the incompetent.
>> Or do you prefer simply babbling?
>
> If you understood what I'm saying 
That's the problem. You're not saying anything.
> one might say it does appear, however, that _you_ prefer simply
> flaming.
-sigh-. You shouldn't use words you don't understand. "Flaming" is one
such. You labor under the delusion that you are entitled to be taken
seriously while you parade your ignorance. This is a technical forum,
where published references and objective verification are the basis
for any rational discussion. Your manifest inability to deal with such
terms of discourse is why you confuse disabusing with "flaming".  
>>| You don't want the tools you use preaching to you - NoNetscapeNow.
>>
>> You appear to lack any ability to keep your canards relevant.
>
> You just have no clue what I'm referring to with this. You've got
> your Lynx standard, your anti-industry standard so tapped into your
> imagination, that you can't imagine a standard actually set by
> industry and used by over 90% of the people surfing the net.
-yawn-. In reprising the stale catechistic blather of the many fatuous
bores to have afflicted ciwah over the last two years, the least you
could do is not be so tedious. The thread is on validation software,
but a bee in your bonnet insists that this banal rant is relevant. OK,
let's consider it, briefly. 
First, acquaint yourself with "The Sons of Martha" by Rudyard Kipling.
It makes a fundamental point on engineering as a social activity.
<URL:http://www.lexmark.com/data/poem/kipli01.html#kipling12>
Next, read RFCs 1601-3 carefully. You need to grasp some elementary
concepts; in particular, what "standard" means on the Internet. The
acronym RFC is also explained.
   <URL:http://ds.internic.net/rfc/rfc1601.txt>
   <URL:http://ds.internic.net/rfc/rfc1602.txt>
   <URL:http://ds.internic.net/rfc/rfc1603.txt>
Finally, the article "The Populist Evangel" from ciwah Jan '96 (try
Dejanews) will explain the meaning of your bromides. They form the
credo of every whining coward sniping from the sidelines, secure in
the sanctimony of his puling righteousness that the burden of his
inadequacies shall be made the duty of others to bear, and that his
uninformed petulance shall suffice to impugn the energy, dedication
and hard work of the many people who *volunteered* for any standards
process -- without whom no standard would exist at all.
Standards exist for a reason. It's your problem to figure out what
that reason is. 
|> Just stop your idiotic FUD.
|
| FUD . . . again? Did we agree that was Farsically Understated
| Depositions, or Fat Uncle Dutch? I forget.
The premise, that you knew anything to forget, is lamentably false.
You may begin your education here:
<URL:http://www.denken.or.jp/cgi-bin/JARGON>
And while you're there, tracking down edifying terms such as "FUD",
"flame", "RTFM" and "burble", don't forget to look up:
*plonk*
-- 
NETSCAPISM /net-'sca-,pi-z*m/ n (1995): habitual diversion of the mind to
    purely imaginative activity or entertainment as an escape from the
    realization that the Internet was built by and for someone else.
                                                  -- Erik Naggum
(completely off-topic comment from me)
>NETSCAPISM /net-'sca-,pi-z*m/ n (1995): habitual diversion of the mind to
>    purely imaginative activity or entertainment as an escape from the
>    realization that the Internet was built by and for someone else.
>                                                  -- Erik Naggum
  Amazing... of all the places on the Internet; of all the groups in Usenet,
ciwah was the *last* place I'd expected *that* particular S-of-a-B to be
quoted...
*shakes her head*
-- 
  Tina Marie Holmboe
Unless explicitly stated otherwise, the   /                ti...@htmlhelp.com  /
opinions expressed are mine, and should  / http://www.htmlhelp.com/%7Etina/  /
in no way be associated with the WDG.   /         The Web Design Group      /
>Your theories are irrelevant.
Ever seen The Boston Strangler, the old film starring Tony Curtis,
with the trendy and shopworn splitscreens? There's a scene where they
pick up a guy, a paranoid, who they 'like' for the murders, who
repeatedly shouts back to their questions - Irrelevant! Immaterial!
and such. He's afraid to confront things, which I guess was the point
of the screenplay, at that point; just as the strangler character
Curtis plays, Albert, fears coming to terms with himself. Maybe such
'theories', whatever they were, aren't so "irrelevant", save to
someone who is afraid to consider he is, possibly, wrong about some
things; and despite his background and intelligence, otherwise. But,
I'm just supposing here.
>Your believing anything is of no consequence.
Immaterial and irrelevant, no doubt.
>[yours is a] tantrum characteristic of the slothful or the incompetent.
I never claimed to be as smart as you. I only suggested that
immaterial, irrelevant may not really be, to one thinking clearly.
>That's the problem. You're not saying anything.
Then to repeat myself, I really don't believe the few things I've
said, that seem so controversial to you, are immaterial, irrelevant,
is all.
>> one might say it does appear, however, that _you_ prefer simply
>> flaming.
>
>-sigh-. You shouldn't use words you don't understand. "Flaming"
But I do understand the UseNet term, flame, as in:
>You labor under the delusion
>you parade your ignorance.
>Your manifest inability
>you confuse disabusing with "flaming".  
To which I might only reply that if it were all so immaterial,
irrelevant, anyone, not just myself, might wonder how it's gotten you
so worked up in your 'indifference'.
>-yawn-.
Which translates - immaterial! irrelevant! (correct me if I'm wrong)
>reprising the stale catechistic blather
>fatuous bore
>afflicted ciwah
>the least you could do is not be so tedious.
>this banal rant is relevant. OK
Fine. And it's immaterial, to boot. (and this _is_ getting repetitive,
even were I some sort of therapist, which I certainly am not)
>First, acquaint yourself with "The Sons of Martha" by Rudyard Kipling
Can _I_ say immaterial, irrelevant? Is it allowed?
>You need to grasp some elementary
>concepts; in particular, what "standard" means on the Internet.
Which I'm going to interpret as meaning - guess we're going to have to
agree to disagree.
>|> Just stop your idiotic FUD.
>|
>| FUD . . . again? Did we agree that was Farsically Understated
>| Depositions, or Fat Uncle Dutch? I forget.
>
>The premise,
I forget the Finest Unnecessary Department, Five Unpleasant Days,
Funny Until Departure, Fully Underwritten Disaster, Famously
Undercooked Delicacies, and the Fastest Underwater Duck.
Well . . . been fun. Keep the faith, I guess - even if it's wrong.
Peace.
>   *plonk*
>Unfortunately, I have yet to find a good "How to write your own
>DTD and validate with it offline"
Still, one can have all the definitions and configurations they want.
If they aren't up to handling the tags and attributes that ship with
NN or IE, seriously, how can they be used, professionally, to help
create web sites when something like CSE is available to do the better
job?
Peace.
In article <331f35c3...@news.pacbell.net>,
1023...@compuserve.com (Mark Johnson) wrote:
> gala...@htmlhelp.com (Arnoud "Galactus" Engelfriet) wrote:
> >Unfortunately, I have yet to find a good "How to write your own
> >DTD and validate with it offline"
> 
> Still, one can have all the definitions and configurations they want.
With a DTD? Sure. I admit it's not as easy as CSE apparently is,
you have to write the DTD manually, but then I'm one of those people
who writes Web pages by hand. :-)
> If they aren't up to handling the tags and attributes that ship with
> NN or IE, 
A DTD isn't "up to" anything more than you write it to be. If you
want to add support for BLINK, to name one tag, just add
<!ELEMENT BLINK - - (%text)*>
to the DTD. Frames and assorted stuff get a little more difficult,
because Netscape never properly defined where and how these elements
go, but my frames section at htmlhelp.com gives one possible DTD
fragment for frames.
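One wrinkle: declaring the element isn't quite enough on its own, since
it also has to appear in some content model before it is allowed
anywhere. The usual shortcut is to add it to the relevant parameter
entity, something like

<!ENTITY % text "#PCDATA | %font | %phrase | %special | %form | BLINK">

though the exact entity names depend on which DTD you started from.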
> seriously, how can they be used, professionally, to help
> create web sites when something like CSE is available to do the better
> job?
I don't understand why you're constantly arguing against the use of
a DTD and SGML validator/parser, and in favor of CSE. You can do
_anything_ in a DTD, as long as it follows SGML rules. Require
closing tags on paragraphs, table cells and lists? One change per
element. Add the TYPE attribute to STYLE? Just type it in. 
For my documents, I just downloaded SP (a free SGML parser), copied
the W3C DTD, modified it to suit my needs, and then I just type
cd \websites\htmlhelp\reference
make docs validate zip
to generate the documents, validate them at HTML 3.2, and then zip
them up in one file which I upload to my server.
I suppose this is more of a GUI vs CLI debate now; I really don't
see how I can generate, validate and upload 111 files in 16
directories using GUI tools only.
>In article <331f35c3...@news.pacbell.net>,
>1023...@compuserve.com (Mark Johnson) wrote:
>> Still, one can have all the definitions and configurations they want.
>
>With a DTD? Sure. I admit it's not as easy as CSE apparently is,
After thinking about it, a bit, I suppose the answer is - no, it's
not, but probably not so much more difficult; one could just punch up
the Spyglass configurations, I guess. Still, the mistakes of CSE seem
to occur at tricks around the margins, at worst, and seem, at that,
helpful more often than not compared with a 'true' validator. So . . .
I'm happy with it.
>> If they aren't up to handling the tags and attributes that ship with
>> NN or IE, 
>
>A DTD isn't "up to" anything more than you write it to be.
Again, suppose you're right - but if I ever find the time. And that's
the thing. Meanwhile . . . CSE. And I'm happy with it. Perhaps _you_
might consider punching up a few DTD files so that they are NN and IE
compatible, and then putting them on your site or posting to a misc
binaries group?
>I don't understand why you're constantly arguing against the use of
>a DTD and SGML validator/parser, and in favor of CSE.
It works. And the others I've used, don't. Simple.
>You can do _anything_ in a DTD, as long as it follows SGML rules.
Understood. Fair enough. Just requires the extra work to make it do
that. And nobody's bothered, yet. CSE validates the page. That's what
I want.
>For my documents, I just downloaded SP (a free SGML parser), copied
>the W3C DTD, modified it to suit my needs, and then I just type
>
>cd \websites\htmlhelp\reference
>make docs validate zip
>
>to generate the documents, validate them at HTML 3.2, and then zip
>them up in one file which I upload to my server.
Well, but with respect, CSE can be called by an entry in a pop-up menu
right off the selected file, and automatically dumps the output to a
text processor of your choosing. What could be easier? I mean if
you're suggesting a validation/upload in one step, what happens if the
validator kicks back an error (when do you get to fix it)?
>I suppose this is more of a GUI vs CLI debate now; I really don't
>see how I can generate, validate and upload 111 files in 16
>directories using GUI tools only.
You mean upload regardless? If you were so sure they were
syntactically correct, why use the validator, in the first place? Do I
misunderstand you, here?
Peace.
In article <3322a15d...@news.pacbell.net>,
1023...@compuserve.com (Mark Johnson) wrote:
> gala...@htmlhelp.com (Arnoud "Galactus" Engelfriet) wrote:
> >With a DTD? Sure. I admit it's not as easy as CSE apparently is,
> 
> After thinking about it, a bit, I suppose the answer is - no, it's
> not, but probably not so much more difficult; 
As I understand it, you configure things in CSE using pulldown
menus and such. You have to write a DTD by hand. There could be
visual DTD generators, although I have never seen them. It doesn't
bother me, though. I only write a DTD once, and then use it. :-)
> >A DTD isn't "up to" anything more than you write it to be.
> 
> Again, suppose you're right - but if I ever find the time. 
So because the publicly available DTDs do not support what you want,
and you don't have time to write your own DTD, you think DTDs are
inadequate? That sounds a bit short-sighted to me.
I believe that HTML Pro supports every element/attribute known to
mankind; if you use that, you should be able to validate anything.
> And that's
> the thing. Meanwhile . . . CSE. And I'm happy with it. Perhaps _you_
> might consider punching up a few DTD files so that they are NN and IE
> compatible, and then putting them on your site or posting to a misc
> binaries group?
I'd love to, but it is rather hard to reverse engineer these elements.
There seem to be new table attributes weekly. :-) And HTML Pro already
covers most of them.
> Well, but with respect, CSE can be called by an entry in a pop-up menu
> right off the selected file, and automatically dumps the output to a
> text processor of your choosing. What could be easier? 
For 111 files? I'd absolutely hate to click/select "CSE check" on
111 source files. That's more a GUI shortcoming, but still.. 
> I mean if
> you're suggesting a validation/upload in one step, what happens if the
> validator kicks back an error (when do you get to fix it)?
The process is aborted when something generates an error along the way.
I don't upload before all errors/warnings have been fixed, just like
I won't release a program until it compiles without warnings.
> >I suppose this is more of a GUI vs CLI debate now; I really don't
> >see how I can generate, validate and upload 111 files in 16
> >directories using GUI tools only.
> 
> You mean upload regardless? If you were so sure they were
> syntactically correct, why use the validator, in the first place? Do I
> misunderstand you, here?
Saying that I rarely if ever make syntactic mistakes is probably
a bit arrogant. :-) I should have added that it's really three
separate steps. I don't upload until it's valid.
Now if only I could get Perl to run under OS/2..
Anything that follows HTML Pro's *strict* syntax¹.
¹ See <URL:news:5fi77g$r58$1...@hal.cs.duke.edu> for some examples.
-- 
Heikki "Hezu" Kantola, <Heikki....@IKI.FI>
By sending advertisements or other inappropriate e-mail to the above
address, you agree to pay FIM 500 for proofreading services, for each
hour or part thereof.