
Web considered harmful


mw

Mar 22, 2014, 8:43:59 AM

Web considered harmful
======================

Over the past decade, the internet has seen a transition from
single-task protocols to the web to the extent that new functionality
is often only exposed as a web-API with a proprietary protocol.

While the base protocol (HTTP) and the information serializations
(HTML, XML, JSON) are standardized, the methods for extracting
information from the received data vary from website to website.

The solution in the 1990s was to make a standardized protocol,
e.g. IMAP or NNTP, which could be used to access email or news in a
standardized manner.

For interfacing with, say, google mail, however, a client application
will have to speak the google mail API which is incompatible with the
mail API of another provider. This transition is turning the internet
into a collection of walled gardens with the obvious drawback that
most websites -- if an API is present at all -- will only have the
official client implementation to said API available. Mostly there
will be a few closed-source implementations provided by the vendor,
most commonly a combination of the following:

* a website (often with mandatory javascript)

* a mobile website (possibly without javascript, but optimized for
small screens and thus not very practical on a desktop browser and
often not exposing all available features)

* Android or iPhone app (sometimes not exposing all available
features, restricted to a single platform)

leaving users little choice in case they are using a different
platform or want to collect their data in a unified format.

Even worse is receiving information from websites where no API exists.
There is no standard for logging into websites which have a mandatory
username/password login prompt, and implementations will have to handle
cookies, Referer headers (ridiculously, many websites mandate one for
XSRF protection even though the standard makes them optional) and
site-specific form locations to which POST and GET requests will need
to be made in a site-specific order.

For the most part, there has been no effort to change any aspect of
this problem, which has existed for more than 10 years. On the
contrary, companies have successively discontinued support for open
web standards such as RSS/Atom.

Conclusion: The web as it is now is harmful to the open standard
culture of the internet.

Related readings (please expand):
https://www.gnu.org/philosophy/javascript-trap.html

Comments and discussion would be appreciated.

Doc O'Leary

Mar 22, 2014, 12:11:55 PM

In article <lgk0if$vbv$1...@news.albasani.net>, mw <mw0...@arcor.de>
wrote:

> Comments and discussion would be appreciated.

How is the web to blame? This has played out for not just a decade
online, but for all forms of media in the past. People are cheap.
Companies are controlling. Advertisers ooze in to fill any empty space.
If you don't like it, work to change *those* dynamics, on the web and
beyond.

--
iPhone apps that matter: http://appstore.subsume.com/
My personal UDP list: 127.0.0.1, localhost, googlegroups.com, teranews.com,
and probably your server, too.

mw

Mar 23, 2014, 7:42:33 AM

>> Comments and discussion would be appreciated.
>
> How is the web to blame? This has played out for not just a decade
> online, but for all forms of media in the past. People are cheap.
> Companies are controlling. Advertisers ooze in to fill any empty space.
> If you don't like it, work to change *those* dynamics, on the web and
> beyond.

While I'm all for fighting the commercialization of the internet, this
isn't the point I'm complaining about.

For most protocols, essential things like login are a straight-forward
process:

1. Open TCP connection
2. Negotiate protocol specific things which need to be negotiated
first
3. Send username/password

that's it.
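
To make the comparison concrete: a minimal sketch with Python's
standard imaplib (the host and credentials here are made up), and the
same few lines work unchanged against any IMAP provider:

    import imaplib

    # Steps 1 and 2: open the TCP connection and negotiate TLS.
    conn = imaplib.IMAP4_SSL("imap.example.org")
    # Step 3: send username/password -- identical for every provider.
    conn.login("alice", "hunter2")
    conn.select("INBOX")
    conn.logout()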

For the web, it is highly specific to each website, except for, e.g.,
HTTP authentication, which is hardly used at all anymore (and besides,
it doesn't have a logout option in any browser I've used so far, so I
will be logged in until I terminate the browser, if I'm not mistaken),
and SSL client certificate authentication, which was never widespread
in the first place.

1. Open TCP connection to foo.com
2. Send HTTP GET to /login.php with Connection: keep-alive because I
want to omit the TCP connection reopening from this post. Await
response. Throw response away except for Set-Cookie header.
3. Send HTTP POST request to /login.php with headers
Cookie: [Content or previous Set-Cookie]
Referer: http://foo.com/login.php

And a body formatted as application/x-www-form-urlencoded
containing the name of the form, the name of the username field and
the name of the password field (notice the x-? It makes me
doubt whether posting form values to a website is an official
standard)

4. Hopefully you're logged in now.
You can send more GET requests over the same connection, of course
while keeping track of your Referer headers as well, most of the
time manually because most HTTP libraries don't do it
automatically. A similar problem exists with cookies.
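
For contrast, here is a rough sketch of the above dance using Python's
third-party requests library (every name in it -- the URLs, the "user"
and "pass" field names -- is a hypothetical stand-in for whatever
foo.com actually uses). A Session object does at least keep cookies
between requests, but everything else stays site-specific:

    import requests

    session = requests.Session()  # carries cookies across requests

    # Step 2: fetch the login page to obtain the session cookie (and,
    # usually, a hidden anti-XSRF token buried in the form HTML).
    session.get("http://foo.com/login.php")

    # Step 3: POST the form, with the Referer header many sites insist on.
    session.post(
        "http://foo.com/login.php",
        data={"user": "alice", "pass": "hunter2"},
        headers={"Referer": "http://foo.com/login.php"},
    )

    # Step 4: hopefully logged in; later requests reuse the cookie jar.
    page = session.get("http://foo.com/inbox.php")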

The problem here is that, while you can e.g. expect SMTP, IMAP, NNTP
etc. to follow roughly the same logic for login, the forms and input
fields will be called different names on different websites. This
means that for any website you want to interact with, you basically
have to re-write your code, re-write it again whenever the login form
for that particular website changes for some reason, and deal with
hidden inputs and the like.

Another problem is that every login potentially takes multiple
reconnects in case the connection is closed after a request. And even
then, it takes multiple HTTP requests to get past the login form.
There is no library I've seen, for any programming language, that
offers a halfway ready-made solution for keeping cookies and Referer
headers and so on around in the way the web currently mandates.

The only failsafe client for this abhorrent mess is a full-fledged
browser with javascript enabled. Too bad it isn't particularly
well-scriptable and a huge resource hog if you just want a tiny bit of
information.

Add to that that real-world HTML is tag soup without a real standard,
and good luck finding an HTML5 parser supporting the whole standard,
including backwards compatibility for non-UTF-8 encodings, for most
programming languages as well. (gumbo doesn't.)

Poul-Henning Kamp has written an article about HTTP/2.0 still not
doing away with cookies:
https://www.varnish-cache.org/docs/trunk/phk/http20.html

Which, to me, raises the question: What problems do the successors to
HTTP/1.1 actually solve?

So I guess my main complaint is the lack of standards on the web and
the lack of infrastructure to deal with it.

And while I like to keep this discussion about the technological side
of things, I would certainly say that the web *enables* corporations
to abandon open standards and -- as the FSF put it -- allows
corporations to run proprietary software on your computer without your
explicit consent, a problem that, in my opinion, didn't exist on this
scale before the web went all 2.0. Speaking of which, fighting this
issue alone seems like a fight against windmills. It's easier to rant
about the state of the web on mostly dead newsgroups. (I honestly
don't know if there even is any other place to discuss this.)


(By the way, your address still says "usenet2013")

Doc O'Leary

Mar 23, 2014, 1:15:04 PM

In article <lgmhb9$oru$1...@news.albasani.net>, mw <mw0...@arcor.de>
wrote:

> >> Comments and discussion would be appreciated.
> >
> > How is the web to blame? This has played out for not just a decade
> > online, but for all forms of media in the past. People are cheap.
> > Companies are controlling. Advertisers ooze in to fill any empty space.
> > If you don't like it, work to change *those* dynamics, on the web and
> > beyond.
>
> While I'm all for fighting the commercialization of the internet, this
> isn't the point I'm complaining about.

It should be. You're focussing on symptoms when you should be fighting
the disease.

> For most protocols, essential things like login are a straight-forward
> process:
>
> 1. Open TCP connection
> 2. Negotiate protocol specific things which need to be negotiated
> first
> 3. Send username/password
>
> that's it.

Nonsense. Go read some actual RFCs and you'll see that even things you
might expect to be similar, like login, still require things be done
differently. Differing web APIs are no better or worse than the
differences between the NNTP and IMAP protocols.

> The problem here is that, while you can e.g. expect SMTP, IMAP, NNTP
> etc. to follow roughly the same logic for login,

No, you can't expect that. There is no meta-standard for login. If
there were, we wouldn't have things like PAM and LDAP and countless
other ways to manage authentication.

> the forms and input
> fields will be called different names on different websites.

Just like the different non-web protocols do. Your evidence doesn't
support your position.

> This
> means that for any website you want to interact with, you basically
> have to re-write your code, re-write it again whenever the login form
> for that particular website changes for some reason, and deal with
> hidden inputs and the like.

Same would be true for any changing protocol. If a company, say
Microsoft or Google, wasn't happy with IMAP as-is, you'll run into
exactly the same problems with their extensions. Because, like I said,
companies are controlling. Not a web issue.

> There is no library I've seen, for any programming language, that
> offers a halfway ready-made solution for keeping cookies and Referer
> headers and so on around in the way the web currently mandates.

And I can't find a standard libnntp to use for writing my own
newsreader. The problems you describe aren't limited to the web.

> The only failsafe client for this abhorrent mess is a full-fledged
> browser with javascript enabled. Too bad it isn't particularly
> well-scriptable and a huge resource hog if you just want a tiny bit of
> information.

Perhaps there are better ways to get that information than the
human-readable web? If not, why not? Could it be . . . oh, I don't
know . . . because . . . "People are cheap. Companies are controlling.
Advertisers ooze in to fill any empty space."

> Add to that that real-world HTML is tag soup without a real standard,
> and good luck finding an HTML5 parser supporting the whole standard,
> including backwards compatibility for non-UTF-8 encodings, for most
> programming languages as well. (gumbo doesn't.)

Use a published web API instead of screen scraping, then. If you're
saying that's not available, why not? Could it be . . .

> So I guess my main complaint is the lack of standards on the web and
> the lack of infrastructure to deal with it.

The lack of meta-standards is wide-reaching. Best practices are too
easily *not* followed everywhere. It only stands out on the web because
that's what people flocked to. It'd be spread out more if people used
other protocols more. Like I said, read a few RFCs and you'll see that
wheels get re-invented, incompatibly, all the time.

> And while I like to keep this discussion about the technological side
> of things, I would certainly say that the web *enables* corporations
> to abandon open standards and -- as the FSF put it -- allows
> corporations to run proprietary software on your computer without your
> explicit consent, a problem that, in my opinion, didn't exist on this
> scale before the web went all 2.0.

But to frame that as a web issue, or even a technical issue, is
short-sighted. People, both on the corporate side and on the consumer
side, push at the edges of *any* whiz-bang thing they're given. Over
time, it becomes an arms race between each party's interests. That the
web is the popular battlefield is purely incidental.

> Speaking of which, fighting this
> issue alone seems like a fight against windmills. It's easier to rant
> about the state of the web on mostly dead newsgroups. (I honestly
> don't know if there even is any other place to discuss this.)

Pick your battles. The way to "fight" it is to offer better
alternatives. And it is not enough for those to be *technical*
alternatives. Your real focus does need to be on balancing all the
human desires that have made the web the mess it became (just like TV
and radio and magazines and newspapers and on and on).

> (By the way, your address still says "usenet2013")

Because I haven't received any spam at that address yet! :-)

user1

Aug 26, 2014, 12:46:51 PM

On 22/03/14 12:43, mw wrote:
> Web considered harmful

Hi mw,

I agree - but what is the solution?

Personally, I reckon that the public will get what it accepts.

My guess is - the only way to combat the trend would be to increase
awareness and provide more plentiful alternatives.

Maybe the only way to stop people using such harmful services is to
increase awareness of the downsides.

I sometimes think that the 'new-breed' of Internet users (google,
web-sites and web-mail) do not appreciate the true freedom and value of
the Internet and its information services - and do not appreciate how
the 'new-breed' of web-services actually undermine this true freedom and
value.

Whereas once the 'new-breed' were in the minority, they are now in the
majority, and their advertisement-fuelling presence, coupled with their
relative naivety about such freedoms and values, has led to big business
driving the progression.

Fair enough to say that we have come a long way from scrolling marquees,
flash-only sites and java-applets - but at what cost?

Unless WE are the Internet, the Internet is not what it should be.

'The Internet is for everyone'.
- see <http://rfc3271.org/rfc3271.html>

Perhaps someone could create a 'masterplan' of ideas that are fully
integrated, and provide the fullness of functionality for all things,
and are all based on current standards. If nothing else, to inspire the
community to pursue the idea further, in greater numbers, and with a
common cause.

Does anyone know of any worthwhile things along these lines?
(apart from www.gnu.org, of course)

user1

--

Long live the real Internet.

Scientific (she/her)

Dec 17, 2021, 3:33:33 PM

I have noticed that after all these years too - I fucking hate modern
Internet. I fucking hate how social media has taken over us, I fucking
hate how hard it is to do anything in modern Web.

I will take the good ol' times of internetworking on Unix command line
in 80s over this modern crap every day.

--
There is no verifiable evidence that gender dysphoria can be treated in
other ways than transitioning. None whatsoever.
Gender identity becomes unchangeable by age 4, something transphobes
fail to understand.
Scaring trans people away from transitioning and repressing their
identities *IS* conversion therapy.

Ant

Dec 18, 2021, 7:05:27 AM

You can still use the lynx web browser. ;) However, most web sites
don't work with it. :(

--
Cold & windy winter rain storm came & left a big mess! Dang coldness, colony, works, strikes, software upgrades, free games, spams, (scam/fraud)s, greeds, inflations, illness, life, etc. again. :(
Note: A fixed width font (Courier, Monospace, etc.) is required to see this signature correctly.
/\___/\ Ant(Dude) @ http://aqfl.net & http://antfarm.home.dhs.org.
/ /\ /\ \ Please nuke ANT if replying by e-mail.
| |o o| |
\ _ /
( )

Doc O'Leary

Dec 18, 2021, 2:01:17 PM

For your reference, records indicate that
"Scientific (she/her)" <sci...@is.truth> wrote:

> On 3/22/14 12:43 PM, mw wrote:

Quite a necro, but I approve! :-)

> I have noticed that after all these years too - I fucking hate modern
> Internet. I fucking hate how social media has taken over us, I fucking
> hate how hard it is to do anything in modern Web.

I’m right there with you. One of my projects for 2022 is going to be to
move away from the web as a primary means of sending or receiving
information. I’m looking at things like Jekyll to get away from having
a heavy stack for my site(s), but even that might be too closely tied to
the way the modern web works.

> I will take the good ol' times of internetworking on Unix command line
> in 80s over this modern crap every day.

Well, it’s not like everything was perfectly executed back then, either.
For example, no standardization on configuration files has been a constant
annoyance for decades. But there is a lot to be said for text file formats
of increasing complexity based on need. I mean, web browsers do *so* much
these days, yet if you hand them a bit of Markdown they’re left clueless?

--
"Also . . . I can kill you with my brain."
River Tam, Trash, Firefly


David

Jan 29, 2022, 1:45:24 PM

On 2021-12-18, Doc O'Leary wrote:
> On 2021-12-17, Scientific (she/her) wrote:
> Quite a necro, but I approve! :-)
>
>> I have noticed that after all these years too - I fucking hate modern
>> Internet. I fucking hate how social media has taken over us, I fucking
>> hate how hard it is to do anything in modern Web.
>
> I’m right there with you. One of my projects for 2022 is going to be to
> move away from the web as a primary means of sending or receiving
> information. I’m looking at things like Jekyll to get away from having
> a heavy stack for my site(s), but even that might be too closely tied to
> the way the modern web works.
>
>> I will take the good ol' times of internetworking on Unix command line
>> in 80s over this modern crap every day.
>
> Well, it’s not like everything was perfectly executed back then, either.
> For example, no standardization on configuration files has been a constant
> annoyance for decades. But there is a lot to be said for text file formats
> of increasing complexity based on need. I mean, web browsers do *so* much
> these days, yet if you hand them a bit of Markdown they’re left clueless?

At least there's reader mode, but that's like using uBlock Origin
instead of serving only what's needed.

I'm surprised that Gemini managed to get quite popular within like one
or two years and Firefox still cannot render Markdown natively.


meff

Jan 29, 2022, 3:28:37 PM

On 2021-12-17, Scientific (she/her) <sci...@is.truth> wrote:
> On 3/22/14 12:43 PM, mw wrote:
> I have noticed that after all these years too - I fucking hate modern
> Internet. I fucking hate how social media has taken over us, I fucking
> hate how hard it is to do anything in modern Web.

You're trying to solve an emotional or social problem with a technical
solution. More people like the Web than the curmudgeons who
don't. People have voted with their feet.

> I will take the good ol' times of internetworking on Unix command line
> in 80s over this modern crap every day.

Ableists only apply?

rtr

Jan 30, 2022, 2:00:29 AM

I think one of the main strengths of gemini is that it's simple enough
that a lot of people can get their hands dirty implementing servers and
clients for it, but it's also modern enough that we don't have to deal
with esoteric behavior such as with gopher.
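
To illustrate that simplicity, a complete Gemini request fits in a few
lines of Python (the server name is made up; certificate verification
is turned off here only because most Gemini servers use self-signed
certificates, which real clients handle with trust-on-first-use):

    import socket, ssl

    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # self-signed certs are the norm

    with socket.create_connection(("gemini.example.org", 1965)) as sock:
        with ctx.wrap_socket(sock, server_hostname="gemini.example.org") as tls:
            # The entire request is one URL followed by CRLF.
            tls.sendall(b"gemini://gemini.example.org/\r\n")
            # The first line of the reply is "<status> <meta>\r\n",
            # e.g. "20 text/gemini", followed by the page itself.
            print(tls.recv(4096).decode("utf-8", "replace"))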

--
Give them an inch and they will take a mile.
--
gemini://rtr.kalayaan.xyz

ne...@zzo38computer.org.invalid

Jan 30, 2022, 8:13:49 PM

David <da...@arch.invalid> wrote:
> On 2021-12-18, Doc O'Leary wrote:
> > On 2021-12-17, Scientific (she/her) wrote:
> >> On 2014-03-22, mw wrote:

> >>> Over the past decade, the internet has seen a transition from
> >>> single-task protocols to the web to the extent that new functionality
> >>> is often only exposed as a web-API with a proprietary protocol.
> >>>
> >>> While the base protocol (HTTP) and the information serializations
> >>> (HTML, XML, JSON) are standardized, the methods for extracting
> >>> information from the received data vary from website to website.

That is the case; also, these formats are more complicated than they
should be in some ways but also lack some things, unfortunately (e.g.
JSON only supports Unicode text and only floating point numbers, not
binary data or 64-bit integers (unless encoded); HTTP, HTML, and XML
have more problems).
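
As an example of the encodings mentioned above (a sketch with Python's
standard json module): binary data is usually carried as base64 text,
and 64-bit integers as strings, so that nothing downstream rounds them
through a float:

    import base64, json

    payload = {
        # binary data has no JSON type, so carry it as base64 text
        "blob": base64.b64encode(b"\x00\xff raw bytes").decode("ascii"),
        # 2**63 - 1 survives as a string; as a bare number it may be
        # rounded to a float by JavaScript-based consumers
        "big_id": str(2**63 - 1),
    }
    text = json.dumps(payload)

    decoded = json.loads(text)
    blob = base64.b64decode(decoded["blob"])
    big_id = int(decoded["big_id"])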

> >>> The solution in the 1990s was to make a standardized protocol,
> >>> e.g. IMAP or NNTP, which could be used to access email or news in a
> >>> standardized manner.

Yes, and we can still do such things as needed. I also have some other ideas
that I mention farther below.

We can still use protocols such as NNTP, IRC, etc; we can also make up new
protocols if they are needed. Multiple protocols for accessing the same
messages would also work.

I would want to promote supporting any suitable protocols, file formats, etc
but this isn't common.

> >>> For interfacing with, say, google mail, however, a client application
> >>> will have to speak the google mail API which is incompatible with the
> >>> mail API of another provider. This transition is turning the internet
> >>> into a collection of walled gardens with the obvious drawback that
> >>> most websites -- if an API is present at all -- will only have the
> >>> official client implementation to said API available. Mostly there
> >>> will be a few closed-source implementations provided by the vendor,
> >>> most commonly a combination of the following:

> >>> leaving users little choice in case they are using a different
> >>> platform or want to collect their data in a unified format.

True. Sometimes specialized formats will be needed for some applications
(and existing formats may be unsuitable), but they should be documented,
and conversion software could be available if appropriate.

> >>> Even worse is receiving information from websites where no API exists.

It is bad, yes. In this way it is necessary to do without, but some web
pages have other obstructive things that can get in the way even if you
are just trying to view them normally, too.

> >>> There is no standard for logging into websites which have a mandatory
> >>> username/password login prompt, and implementations will have to handle
> >>> cookies, Referer headers (ridiculously, many websites mandate one for
> >>> XSRF protection even though the standard makes them optional) and
> >>> site-specific form locations to which POST and GET requests will need
> >>> to be made in a site-specific order.

Actually there is (HTTP basic/digest auth), but it isn't commonly used, and
most web browsers do not provide the user much control over it (such as a
command to log out, options to persist sessions, etc).
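
For reference, that standard login looks like this from a client (a
sketch with Python's third-party requests library and a made-up host);
the server asks for it with a 401 response and a WWW-Authenticate
header, and no site-specific form is involved:

    import requests

    # HTTP basic auth (RFC 7617): the same mechanism works against any
    # server that supports it, with no site-specific form to scrape.
    resp = requests.get(
        "https://api.example.org/private",
        auth=("alice", "hunter2"),  # sent as an Authorization header
    )
    print(resp.status_code)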

> >>> For the most part, there has been no effort to change any aspect of
> >>> this problem, which has existed for more than 10 years. On the
> >>> contrary, companies have successively discontinued support for open
> >>> web standards such as RSS/Atom.
> >>>
> >>> Conclusion: The web as it is now is harmful to the open standard
> >>> culture of the internet.

I agree. However, standards being open does not automatically make
them good (but it does make them better than proprietary systems).

> >>> Related readings (please expand):
> >>> https://www.gnu.org/philosophy/javascript-trap.html

One of the things that article says is: "Browser users also need a convenient
facility to specify JavaScript code to use instead of the JavaScript in a
certain page." I very much agree with this; it is a very important feature.
Furthermore, there may be some things that a user might want their alternative
scripts to do that the ones included in the document cannot do (e.g. access
other programs and files on the user's computer, bypass CORS, etc).

They also mention Java, Flash, Silverlight, etc. It is true, JavaScript is not
the only way; furthermore, Java and JavaScript are only the programming
languages, which are not themselves bad, but I think embedding them in
documents in this way is bad (but common). There is also WebAssembly, too.
So, I will just call these programs in the document "document scripts"
instead (as opposed to JavaScript code which is part of the web browser
itself, etc).

Even if a program is free software, the user does not necessarily want to execute
that program on their computer, so the above is important, as is such things
as whitelisting (possibly with cryptographic hashes to identify them). (User
specified whitelisting should also be how "secure contexts" are implemented;
the existing implementation is no good. Actually, whitelisting by cryptographic
hash both solves spies tampering with data in non-TLS, and the server operator
changing it to undesirable things in TLS, too; secure contexts fail to solve
the latter thing.)

Some of the criteria for nontrivial scripts are a bit strange, such as the
criteria that arrays cannot have more than fifty elements. (An actual memory
management system to restrict memory allocation might be better. It could
also restrict execution time, etc, as needed.)

Also in some cases, it may be wanted to change the definition of some functions
before the script is executed.

Free JavaScript code is insufficient, though. There will also need to be
ways to make the data interoperable, including outside of the web browser.

Also, even if a script is allowed to run, if it requests (for example) camera
access, it should allow the user to specify the command or device filename to
use as input. This way, web apps that use it can work even if you do not have
a camera. The same is true for other things, such as audio input/output, MIDI,
game controls, etc. It is even true for keyboard commands, so that it
doesn't override the keyboard commands, or allows user customization, etc.

Another thing to do, other than scripts, is CSS. I thought of the idea
of "meta CSS" to allow the end user to customize the interpretation of
CSS and all of
the priorities, etc. ARIA also helps a bit (or at least it would, if it were
implemented; I mention this a bit more below). For example, one thing that a
user might want to do is to skip animations (at least, I often find CSS
animations to be annoying, and a waste of energy). Another thing would be to
specify rules that are disabled in the presence of other rules (for example,
sometimes you might want CSS).

> >> I have noticed that after all these years too - I fucking hate modern
> >> Internet. I fucking hate how social media has taken over us, I fucking
> >> hate how hard it is to do anything in modern Web.

I agree; it is difficult to do many things. But, I don't use Facebook, etc.

> > I'm right there with you. One of my projects for 2022 is going to be to
> > move away from the web as a primary means of sending or receiving
> > information. I'm looking at things like Jekyll to get away from having
> > a heavy stack for my site(s), but even that might be too closely tied to
> > the way the modern web works.

> >> I will take the good ol' times of internetworking on Unix command line
> >> in 80s over this modern crap every day.

Yes, it is better. Modern designs have problems; one is that command-line
access does not work very well, and there are many other problems, too,
including not letting the user specify what they want and assuming things
other than what the user had specified, etc. Programs also do not work
together very well, unlike on UNIX, which can use pipes, etc. to use
programs together.

> > Well, it's not like everything was perfectly executed back then, either.
> > For example, no standardization on configuration files has been a constant
> > annoyance for decades. But there is a lot to be said for text file formats
> > of increasing complexity based on need. I mean, web browsers do *so* much
> > these days, yet if you hand them a bit of Markdown they're left clueless?

There are a few reasons why they would not implement Markdown, one of which is
that there are a few different variants, so they aren't always compatible.

It is common that they implement the bad stuff, though some of the good
features are not implemented, and some good features are even being
removed, too.

However, one feature I find useful due to this mess is the web developer
console. Even not being a web developer, it is useful as a end user, too.
In a few cases, the document.evaluate command might be able to extract data.

> At least there's reader mode, but that's like using uBlock Origin
> instead of serving only what's needed.

There are some other problems with the reader mode too.

I would want to implement an "ARIA view" mode, which mostly ignores the CSS
(possibly with a few exceptions, such as still paying attention to whether
or not it specifies a fixed pitch font) in favour of using most HTML commands
(except those specifying colours) and ARIA properties, to render the document.
(For example, some web applications use custom widgets, but have the suitable
ARIA properties; then they can be used to display standard widgets in place of
the custom ones. Simply disabling CSS doesn't work; I have tried.)

One of my ideas is also to have request/response overriding in the client
software that can be configured by the user. This would make many other
options unnecessary, such as cookies, language, etc; this is a unified
method which does this and a lot more, including things that we have not
thought of yet (if the end user can think of it).

Another thing that could be done is alternative providers. In this way, it
is possible to provide things in many ways without being locked in and
without being restricted to specific complicated software, etc.

There is also one more thing I considered in the case of HTTP, which would
allow you to serve Markdown, MathML, FLIF, FLAC, etc, and allows better user
customization, accessibility, efficiency (if native implementations are
available), possibly reducing bandwidth requirements, etc. It is a new
response header, which can occur any number of times. If the response is
not understood, then it can load one of those instead but without changing
the current document URL. This way, if the user has enabled this feature
(the user should always be allowed to disable or override stuff; the above
request/response overriding already does this in this case), then it would
automatically just work as far as the user can see, without needing to do
anything special, etc.

Web browsers (and other programs) need better user control, instead of
removing good features and adding bad ones, or assuming that the user
wanted something other than what is specified, etc. I think UNIX philosophy
is much better, instead.

Some way to specify common links for data and alternative protocols will
also be necessary (possibly <link rel="alternate"> might do). The alternate
protocols might not have a MIME type, but can still specify the URL.

It is unfortunate that fixing it involves more things like that instead of
just making it in a simpler way, but it seems necessary, to me.

Fortunately, much of the above is not needed in the case of Gemini, which
does not have these problems. However, I think that the Gemini format and
protocol is perhaps a bit too simple (while Markdown is too complicated,
and HTTP and HTML are too complicated, and PDF is too complicated, etc; FTP
is also bad but for other reasons). But, for most of the things that Gemini
is used for, it is probably OK (although, in addition to the current
specification, it should also implement an "insecure-gemini" scheme which
is the same but without TLS and in which 6x responses are not allowed).

I may have other things to write, but will do so later, instead of now.

--
Don't laugh at the moon when it is day time in France.

Doc O'Leary

Feb 2, 2022, 11:04:11 PM

For your reference, records indicate that
ne...@zzo38computer.org.invalid wrote:

> There are a few reasons why they would not implement Markdown, one of which is
> that there are a few different variants, so they aren't always compatible.

Neither are all the variants of HTML compatible, but you presumably
wouldn’t argue that as a reason browsers shouldn’t handle *any* HTML,
right? My point is that there are many document formats that have a more
or less direct conversion to features that are supported by HTML, yet
feeding one to a “modern” browser that has kitchen-sink support for just
about everything else under the sun leaves them dumbfounded. I mean, a
basic CSV file should be trivially easy to display as any other table would
be, but is there any major browser that does that?
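
(Trivial in the sense that a few lines of Python, sketched here, already
produce the table markup a browser could just as well generate natively:)

    import csv, html, io

    def csv_to_html_table(text: str) -> str:
        """Render CSV text as a bare HTML table."""
        out = ["<table>"]
        for row in csv.reader(io.StringIO(text)):
            cells = "".join(f"<td>{html.escape(c)}</td>" for c in row)
            out.append(f"<tr>{cells}</tr>")
        out.append("</table>")
        return "\n".join(out)

    print(csv_to_html_table("name,age\nalice,42\nbob,7\n"))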

> It is common that they implement the bad stuff, though some of the good
> features are not implemented, and some good features are even being
> removed, too.

What can be said to be bad or good are in the eye of the beholder. I
personally dislike the focus on publisher-controlled presentation. CSS
was supposed to move us away from that, but most browsers don’t make it
easy to override sites so that the visitor can define their own unique
view of a usable web.

> It is unfortunate that fixing it involves more things like that instead of
> just making it in a simpler way, but it seems necessary, to me.

Well, I’d say it’s only “necessary” in the sense that some people can’t
see beyond bloating one app until it does everything they need. I can
easily see a tool developed with the Unix Philosophy in mind, but I can
also see that most users wouldn’t actually use it, because they are quite
happy living in an online world where the presentation is controlled by
someone else whose aim is continued engagement.

ne...@zzo38computer.org.invalid

Feb 3, 2022, 2:56:33 PM

One problem in general is that software is not designed for advanced users.
Computer software should be designed for advanced users.

Doc O'Leary <drol...@2017usenet1.subsume.com> wrote:
> > There are a few reasons why they would not implement Markdown, one of which is
> > that there are a few different variants, so they aren't always compatible.
>
> Neither are all the variants of HTML compatible, but you presumably
> wouldn’t argue that as a reason browsers shouldn’t handle *any* HTML,
> right? My point is that there are many document formats that have a more
> or less direct conversion to features that are supported by HTML, yet
> feeding one to a “modern” browser that has kitchen-sink support for just
> about everything else under the sun leaves them dumbfounded. I mean, a
> basic CSV file should be trivially easy to display as any other table would
> be, but is there any major browser that does that?

It is a valid point. It ought to be possible to make extensions in a web
browser to implement whatever format you want, including overriding its
built-in handling of any format, including using extensions written in
native code (loaded from .so files, or using pipes), that the end user can
set up if wanted. (The same should be true for character encodings and
protocols too, in addition to file formats, audio filters, I/O interfaces,
etc.)

Furthermore, I had mentioned the possibility that if the end user has not disabled
document scripts, then there is possibility to display even if there is no handling
of that file format built-in or configured by the user, by extra HTTP headers.

> What can be said to be bad or good are in the eye of the beholder. I
> personally dislike the focus on publisher-controlled presentation. CSS
> was supposed to move us away from that, but most browsers don’t make it
> easy to override sites so that the visitor can define their own unique
> view of a usable web.

I agree with you; I also dislike the focus on publisher-controlled presentation.
Even if the CSS can be overridden (or disabled), this is not good enough. It is
one thing why I think that ARIA is important to fix it.

There can also be the addition of meta-CSS, which includes codes that can
be specified only by the user and not by the publisher, and can use CSS
codes as additional criteria, and can change the meaning of certain CSS
properties too.

If a new browser must be written, another alternative is just to not implement
CSS at all, maybe. Some things will not work without CSS, but maybe if you have
HTML and ARIA, and possibility of user customizations (even if it is its own
simplified kind of variant of CSS that only can be used by the end user) then
it might be suitable for most, maybe.

Another feature I would want is to remove many animations.

> > It is unfortunate that fixing it involves more things like that instead of
> > just making it in a simpler way, but it seems necessary, to me.
>
> Well, I’d say it’s only “necessary” in the sense that some people
> can’t see beyond bloating one app until it does everything they need.

It makes sense to have different things in different programs, but it is
sometimes suitable to have multiple protocols/formats available in one
interface, even if it calls external programs to do so.

For example, IRC can be a separate program, but it can make sense to support
HTTP, HTML, Gemini, Gopher, etc together in one program, although I think that
it might be better having the core program not supporting any of these and only
the interface which calls extensions to implement them, instead. This way, you
can use the links between them, bookmark, etc.

Additionally, to support the end user defining pipes, etc. for I/O makes it
better. This is what many older UNIX programs do, and what I would hope for,
too. (For example, I had designed a music player program that just writes to
stdout; you can pipe to aplay to play it back, or to sox to convert it, etc.
So, the web browser ought to be designed in such a way, too.)

(Other programs do this too, e.g. the UNIX shell executing other programs
and using pipes to combine multiple programs, the SQLite command shell
loading SQLite extensions to use their functions and virtual tables in one
interface, a picture editor or sound editor GUI program loading plug-ins
for file formats, etc, and others; so the web browser should do so, too.)

> I can easily see a tool developed with the Unix Philosophy in mind, but I can
> also see that most users wouldn’t actually use it, because they are quite
> happy living in an online world where the presentation is controlled by
> someone else whose aim is continued engagement.

Multiple programs can be made for similar purpose, and I think that is what is
needed. Unfortunately, WWW is rather difficult and extra stupid. But, I would
hope that it can be done (even if some features are excluded; I can live with
that, and actually deliberately want to exclude some features, and for some
of them to be implemented in an entirely different way than what the existing
implementations are currently doing).

Programs with subsets/supersets of features, and different sets of features,
can also be possible; this should not exclude such a possibility.

The lack of capabilities of WebExtension is problematic. One thing that will
partially help is to allow loading native code extensions (.so files, or you
can sometimes use pipes which sometimes can mean you might not need to write
an extension for the web browser). Such native code extensions could call back
into the JavaScript interface, too.

Extensions added through the extension catalog should not be allowed to
load native code; to do so you must install it by yourself instead.

meff

Feb 3, 2022, 9:38:33 PM

On 2022-02-03, ne...@zzo38computer.org.invalid <ne...@zzo38computer.org.invalid> wrote:
> One problem in general is that software is not designed for advanced users.
> Computer software should be designed for advanced users.
One could say this about _anything_ no? Cars should be made for
advanced users, tools should be made for advanced users, kitchens
should be made for advanced users, and so on. I think the reality is
that most humans are not advanced users of most things.

Largely I think this thread is about technology people lamenting a
past where the net was only for other technology people. But the net
is infinitely wide. There's space for everyone on here. There doesn't
need to be gatekeeping on the net. We're not running out of internet
any time soon.

> For example, IRC can be a separate program, but it can make sense to support
> HTTP, HTML, Gemini, Gopher, etc together in one program, although I think that
> it might be better having the core program not supporting any of these and only
> the interface which calls extensions to implement them, instead. This way, you
> can use the links between them, bookmark, etc.

With HTTP/2 and HTTP/3 this doesn't necessarily need to be
true. HTTP/2 and HTTP/3 are good enough at this point to give you a
duplex channel.

Doc O'Leary

Feb 4, 2022, 3:49:38 PM

For your reference, records indicate that
ne...@zzo38computer.org.invalid wrote:

> One problem in general is that software is not designed for advanced users.
> Computer software should be designed for advanced users.

Underlying that, it is often the case that software is not designed *by*
advanced users. Which is to say, that even if the developers are tech
rock stars, they are usually answering to some MBA who doesn’t have a
clue what it means to have software that is well-architected.

> If a new browser must be written, another alternative is just to not implement
> CSS at all, maybe. Some things will not work without CSS, but maybe if you have
> HTML and ARIA, and possibility of user customizations (even if it is its own
> simplified kind of variant of CSS that only can be used by the end user) then
> it might be suitable for most, maybe.

I think the kitchen-sink nature of the modern web is just too brittle to
*not* need a completely restructured browser. Trying to jam everything
into HTML, including ARIA, is not a great approach. I mean, if there are
parts of a web page that are semantically navigation links, I’m not sure
why that is getting served up as part of the page content in the first
place, never mind layering CSS on top of it to display it in some
particular way that is not in the viewer’s best interest.

> Another feature I would want is to remove many animations.

Auto-loading videos (especially with sound) are something I could do
without, too. I remember when there used to be a click-to-play
extension that disabled Adobe Flash, but now that multimedia is “standard”
on the modern web, it has become harder and harder to eliminate such
things, especially on mobile platforms.

Another feature along those lines would be to put a limit on how much data
you’ll allow a page to load. There is no web page that I want to visit
sight-unseen that requires 400MB of data to be loaded and consumes 2GB of
RAM.

> It makes sense to have different things in different programs, but it is
> sometimes suitable to have multiple protocols/formats available in one
> interface, even if it calls external programs to do so.

Sure. Even browsers themselves these days spin up additional processes to
sandbox pages for security and UI responsiveness. The problem remains
that, for the modern web, things are all fundamentally controlled by the
remote server. So long as that transaction is more about rendering a page
a certain way rather than transferring information for the user to do with
as they please, the web will increasingly become bogged down by its own
weight.

> For example, IRC can be a separate program, but it can make sense to support
> HTTP, HTML, Gemini, Gopher, etc together in one program, although I think that
> it might be better having the core program not supporting any of these and only
> the interface which calls extensions to implement them, instead. This way, you
> can use the links between them, bookmark, etc.

I’ve always liked the idea of a common UI over some kind of middleware. I
mean, whether it’s email or Usenet or Reddit or chat, I should be able to
use *whatever* software I like for viewing messages in a conversation. But I
do acknowledge that most people are simply unable or unwilling to separate
the content from its presentation.

Doc O'Leary

Feb 5, 2022, 1:42:13 PM

For your reference, records indicate that
meff <em...@example.com> wrote:

> Largely I think this thread is about technology people lamenting a
> past where the net was only for other technology people. But the net
> is infinitely wide. There's space for everyone on here. There doesn't
> need to be gatekeeping on the net. We're not running out of internet
> any time soon.

I would argue somewhat the opposite. We *are* definitely running out of
Internet that is free and open for people. That especially applies to
the web, where large corporations have exercised vast power to manipulate
people to act against their own best interest. Complaints of
“gatekeeping” on Usenet ring hollow; if the “space” provided by Facebook
and Twitter are more to your liking, go there and try to have this kind
of discussion.

> With HTTP/2 and HTTP/3 this doesn't necessarily need to be
> true. HTTP/2 and HTTP/3 are good enough at this point to give you a
> duplex channel.

HTTP/3 is so different from HTTP/2 that they shouldn’t even be discussed
as related protocols. It leaves me stepping back even further from
the request semantics and questioning what people are even looking to
accomplish. Too many things (e.g., microservice APIs) are jammed through
HTTP simply because web stacks are so common, not because they’re a good
way to get the job done.

So, if anything, I’m lamenting the past where the web was *just* the web.
It was a particular kind of information system, exchanging mainly HTML
documents, that people could easily read and link to. Then it lost sight
of the Unix Philosophy and tried to become everything to everybody. So
(again, in full acknowledgement of the irony of discussing this on Usenet
when so many people have had their attention absorbed by web forums
controlled by social media companies) I ask you: what do you think the
WWW *shouldn’t* do?

meff

Feb 5, 2022, 6:24:16 PM

On 2022-02-05, Doc O'Leary <drol...@2017usenet1.subsume.com> wrote:
> I would argue somewhat the opposite. We *are* definitely running out of
> Internet that is free and open for people.

I agree with your sentiment but not your diagnosis. Getting people to
care about a free and open Web is the fight that's being lost
here. The Internet is as it always was. Tier 1s are peering and asking
for transit, as are Tier 2 and Tier 3. While IPv4s have become
expensive due to exhaustion, IPv6 /64s cost peanuts. You can get an IP
address and send a packet to another IP address any day (well,
depending on if the ISPs have put the recipient behind CGNAT or not.)
Unfortunately _people_ don't care about the freedom and openness
anymore.

It's up to us to _educate_ folks about what's lacking. I
don't find this complaining-behind-closed-doors behavior particularly
conducive to this though. We need to remind people about
why free and open communication is important, no matter whether the
captor is a corporation or a government. You can't achieve that by
calling names, in fact people are even less likely to listen to you if
you call them names.

> That especially applies to
> the web, where large corporations have exercised vast power to manipulate
> people to act against their own best interest. Complaints of
> “gatekeeping” on Usenet ring hollow; if the “space” provided by Facebook
> and Twitter are more to your liking, go there and try to have this kind
> of discussion.

I don't find it gatekeeping as much as complaining. My father loves to
lament times gone by but his memory conveniently edits away all the
downsides. Again I find this behavior unproductive and closed
minded. You'll never get people to care about freedom if you start out
by insulting them or complaining about them. My father remains
unpopular at dinner parties.

> HTTP/3 is so different from HTTP/2 that they shouldn’t even be discussed
> as related protocols. It leaves me stepping back even further from
> the request semantics and questioning what people are even looking to
> accomplish. Too many things (e.g., microservice APIs) are jammed through
> HTTP simply because web stacks are so common, not because they’re a good
> way to get the job done.

The authors of QUIC (the standard that eventually became HTTP/3) had
started by trying to create a non-Web protocol from the ground up. The
trouble was middleboxes. Middleboxes would throw away anything that
wasn't on a small set of explicitly allowed ports (HTTP, HTTPS, SMTP) or
wasn't just TCP traffic. I fully admit IMO that Google used their
influence to jam their vision of the future of network transit into
the IETF which is why QUIC was chosen as HTTP/3, but HTTP/3 did start
out trying to be a different way of transiting packets over the
net. Unfortunately, ISPs do not want to upgrade or evolve their
middleboxes in any way.

> So, if anything, I’m lamenting the past where the web was *just* the web.
> It was a particular kind of information system, exchanging mainly HTML
> documents, that people could easily read and link to. Then it lost sight
> of the Unix Philosophy and tried to become everything to everybody. So
> (again, in full acknowledgement of the irony of discussing this on Usenet
> when so many people have had their attention absorbed by web forums
> controlled by social media companies) I ask you: what do you think the
> WWW *shouldn’t* do?

I don't think it should or should not do anything. I am not an
architect of humanity. I am not God. I'm fine with humans doing what
they will. The Web is only as useful as the entities that produce
content for it and the entities that consume content on it. I would
like to see a world where humans once again understand why early
Internet pioneers fought so hard for neutral networks, but at the end
of the day I recognize that my views are minority ideas and that all I
can do is try to sway hearts and minds, not tell others what to
do. Most importantly I may be _wrong_ and the others may be right. I
respect the will of other free humans.

I'd like to try to meet others in the middle. That might mean offering
Web interfaces for Usenet, writing about the forgotten parts of the
Net that still have posters like you and me. One "carrot" I like to
offer folks is freedom from censorship; unmoderated newsgroups have
nobody telling
you what is and is not verboten. Nobody can systematically silence
you. Others may killfile you, but nobody has power over your voice on
Usenet the way Reddit can just ban people and entire
communities. The same goes for other net technologies like email.

But it's also important to understand why the status quo exists
(instead of just getting angry at it.) CGNAT makes P2P technology
nearly impossible on the web. Email is overrun with spam. Mobile
phones consume too much battery to keep persistent connections
open. Most middleboxes block UDP packets. ISPs prioritize downlinks
over uplinks and offer terrible QoS on uplinks. Most non-Web traffic
is unencrypted and leaks personal information to middleboxes. The web
has succeeded because it was relatively simple for ISPs to operate, so
most of the complexity was pushed up to the application protocol (with
stateful kludges like cookies.)

I'm hoping that if HTTP/3 can actually become a net standard that
middleboxes respect, that we can _finally_ start sending UDP packets,
which would be more convenient for mobile devices and for many
protocols. Wireguard tunnels (or Zerotier) and services built atop
them, like Tailscale, have brought E2EE IP tunnels to people in an
accessible way. Meshnets like CJDNS and Yggdrasil are out there which
can tunnel over regular IPv4 connections. I use my energy to educate
my friends and family about the importance of a free and open Internet
and encourage more tinkering-happy friends of mine to play around with
the "real" Internet, the one with IP packets flowing freely between
hosts.

Doc O'Leary

Feb 6, 2022, 2:21:55 PM

For your reference, records indicate that
meff <em...@example.com> wrote:

> The Internet is as it always was.

It clearly isn’t, nor should anyone expect it to be. A *crapload* has
changed since the start of Eternal September. Non-technical people
don’t care *at all* about things like IPv6 or HTTP/3, though. They
don’t even care about the application layer, and couldn’t even begin
to tell you anything about how the WWW works beyond opening their
browser and entering a URL.

> It's up to us to _educate_ folks about what's lacking.

Nope. You need to talk with more non-technical people. They are already
aware of the ills of the modern Internet, but they feel helpless to do
anything about it. I have friends who know that WhatsApp is toxic, but
can’t bring themselves to abandon it because the network effect is too
strong. Other friends leave their smart phones at the door along with
their shoes when they get home because notifications of all kinds have
become too demanding of their attention. People *know* they’re being
manipulated, but the FOMO is overpowering for them.

> You can't achieve that by
> calling names, in fact people are even less likely to listen to you if
> you call them names.

Who advocated that? I *will* say someone is doing something wrong if I
think they’re doing something wrong, though. And there is *a lot* wrong
with the modern Internet.

> I don't find it gatekeeping as much as complaining. My father loves to
> lament times gone by but his memory conveniently edits away all the
> downsides. Again I find this behavior unproductive and closed
> minded. You'll never get people to care about freedom if you start out
> by insulting them or complaining about them. My father remains
> unpopular at dinner parties.

That’s sad. You have been successfully manipulated into thinking that
criticism is closed minded and should be viewed as unpopular. You’ve
fallen for the relentless positivity that pushes social media engagement.
The world will never get better if people are unable or unwilling to face
our problems head on. Maybe your father’s problem is that he’s choosing
to attend vacuous dinner parties?

> The
> trouble was middleboxes. Middleboxes would throw away anything that
> wasn't on a small set of explicitly allowed ports (HTTP, HTTPS, SMTP) or
> wasn't just TCP traffic.

And that is a problem that definitely should be solved, but the *right*
solution is not to ham-fistedly jam even *more* under the umbrella of
the WWW. If you want to make the argument of “is as it always was”,
you can’t just roll over for every power play to co-opt standards that
Google makes.

> Most importantly I may be _wrong_ and the others may be right.

But you can’t actually sort that out unless you take a position in the
first place. And solutions can both be wrong *and yet* popular. Yes,
people are free to do what they will, but part of that should be the
adult responsibility of, as you have done, acknowledging that they *can*
be wrong.

> Nobody can systematically silence
> you. Others may killfile you, but nobody has power over your voice on
> Usenet the way Reddit can just ban people and entire
> communities. The same goes for other net technologies like email.

Yet another thing that isn’t “is as it always was”. Modern email,
despite still being based on open protocols, is largely controlled by
gatekeepers like Google and Amazon. You *will* be systematically
silenced for reasons of their choosing. Worse, cloud providers are
more than happy to mix traffic from abusive customers in with
legitimate users, turning them into human shields.

> I'm hoping that if HTTP/3 can actually become a net standard that
> middleboxes respect, that we can _finally_ start sending UDP packets,

And while I can respect that as the ends, I don’t accept that the means
of achieving it is respectable. History has shown that ISPs are more
than willing to drag their feet or completely torpedo technology advances
just because it is easier to do nothing new. If you truly want to open
innovation back up, as I keep saying, you should *not* be asking for
changes that can be restricted to *just* the WWW.

You mention spam, and that is another *great* example of how problems are
not getting solved on the modern Internet. I have *actual* solutions for
spam, which is why I can give a valid email on my Usenet posts. But the
big guys don’t really want to eliminate spam, because it gives them too
much control over users. Some people look at what Google is doing and
actually think they represent best practices!

So, no, I don’t really expect adding more to WWW standards is going to
make things better for anyone. Neither do I think nostalgia for a past
Internet is particularly productive. My argument remains that we need
to be looking at what is right and wrong about what we’re doing, and
make changes for the better. For me, that means moving away from the
WWW to systems that aren’t trying to act as the be-all solution for
everything online.

My aim for 2022 is to downgrade my web pages to be mostly static and
ideally serverless. I’m going to see if I can move away from HTML-only
and go with simpler text formats like Markdown, CSV, and YAML. I’ve
done similar projects in the past when I abandoned Drupal, so I know it
can be done. The upside of browsers having a kitchen-sink approach is
that you can turn it around on itself and force it to function almost
like a usable information system! :-)
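
(For what that plan might look like: a minimal sketch assuming the
third-party Python markdown package and made-up file locations -- walk
the Markdown sources, render each one to a bare static page:)

    import pathlib
    import markdown  # third-party: pip install markdown

    # Render every Markdown source file to a minimal static HTML page.
    for src in pathlib.Path("pages").glob("*.md"):
        body = markdown.markdown(src.read_text(encoding="utf-8"))
        page = f"<!DOCTYPE html>\n<title>{src.stem}</title>\n{body}\n"
        src.with_suffix(".html").write_text(page, encoding="utf-8")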

😉 Good Guy 😉

Mar 1, 2023, 10:02:56 PM

On 22/03/2014 12:43, mw wrote:
> Web considered harmful
> ======================



Can anything be done about this? Without the Web we can't survive in
the 21st century, so we need to do something to make the Web a safe
place for people to do their business.

We can't live without Amazon because that is where all gadgets are.

Please help us!!




--
We do not live to ourselves and we do not die to ourselves; if we live, we live to the Lord, and if we die, we die to the Lord.

So then, whether we live or whether we die, we are the Lord's.

Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning