The Small Website Discoverability Crisis


Joel Souza

Nov 24, 2023, 8:15:53 AM
to arqhp
A brief essay/reflection on the web being driven by algorithms instead of curation.



https://www.marginalia.nu/log/19-website-discoverability-crisis/


There are a lot of small websites on the Internet: Interesting websites, beautiful websites, unique websites.

Unfortunately they are incredibly hard to find. You cannot find them on Google or Reddit, and while you can stumble onto them with my search engine, it is not a very directed way of finding them.

It is an unfortunate state of affairs. Even if you do not particularly care for becoming the next big thing, it’s still discouraging to put work into a website and get next to no traffic beyond the usual bots.

You get a dead-sea effect. Traffic is evaporating, and small websites are dying, which brings even fewer visitors. Rinse and repeat.

Blogs limp along through RSS and Atom, but relying on feeds shapes everything you write into a blog entry. It’s stifling, homogenizing. The blogosphere, what remains of it, is incredibly samey.

I feel there ought to be a solution to this, a better way of doing things that can help. Perhaps the Internet as a whole is an irredeemable mess that will never mend, but maybe we can (somehow) make it easier for those who are actually looking to find what they seek.

Maybe there are lessons to be drawn from what works on Gemini, and what doesn't work on HTTP, that can be synthesized into a sketch of a solution.

Gemini seems to be discovering automatic link feeds (e.g. Antenna), and at Gemini scale they work pretty well. But I'm just going to state it: automatic link feeds do not seem to work on HTTP any more. You end up with a flood of astroturfing, vapid click-bait and blogspam (i.e. Reddit). Stemming the flood demands a ton of moderation and still yields dismal results.

As a whole, I think centralized and algorithmic approaches are extremely exposed to manipulation when applied to the Internet.

Web rings are cute, but I think they are a bit too random to help. Likewise, curated link directories were a thing back when the Internet was in its infancy, but maintaining such a directory is a full-time job.

You could go for some sort of web-of-trust model that only allows trusted submitters access to an automatic link feed, but that practice is exclusionary and creates yet more walled gardens, which impairs the very discoverability I'm trying to improve.

Instead, perhaps there is a much simpler solution.

Simple federated bookmarking

A proposal, dear reader: Create a list of bookmarks linking to websites you find interesting, and publish it for the world to see. You decide what constitutes “interesting”.

The model is as recursive as it is simple. There is nothing preventing a list of bookmarks from linking to another list of bookmarks.
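To make the recursive model concrete, here is a minimal sketch in Python (standard library only) of how such lists could be traversed. It assumes each bookmark list is an ordinary HTML page of <a> links; the starting URL and the heuristic for spotting further bookmark lists are placeholder assumptions, not part of the proposal.

# Sketch: walk published bookmark lists, printing every link found,
# and hop into pages that look like further bookmark lists.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def fetch_links(url):
    """Download a page and return its outbound links as absolute URLs."""
    with urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    return [urljoin(url, href) for href in collector.links]

def walk_bookmarks(start_url, max_hops=2):
    """Print every link reachable by hopping between bookmark lists."""
    seen = set()
    frontier = [(start_url, 0)]
    while frontier:
        url, hops = frontier.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            links = fetch_links(url)
        except OSError:
            continue  # skip unreachable pages
        for link in links:
            print(link)
            # Hypothetical heuristic: a URL mentioning "bookmark" or
            # ending in "/links" is treated as another list to follow.
            if hops < max_hops and ("bookmark" in link or link.rstrip("/").endswith("/links")):
                frontier.append((link, hops + 1))

if __name__ == "__main__":
    # Placeholder URL; substitute any published bookmark list.
    walk_bookmarks("https://example.com/bookmarks.html")

The point of the sketch is that no registry or central index is needed: the lists themselves form the graph, and anyone can follow it from any starting point.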

Creating a bookmark list is a surprisingly fun project; it has some of the appeal of scrapbooking, and the end result is also appealing to browse through.

It's a bit strange that almost nobody seems to be doing this. Looking through a sample of personal websites, very few of them have links to other personal websites. A hyperlink isn't a marriage proposal. It is enough to find some redeeming quality in a website to link to it. It costs nothing, and it helps bring traffic to pages that you yourself think deserve it.

If we actually want these small websites to flourish as a healthy community, we need to promote each other much more than we do. It is advertisement, yes, but in earnest. I like it when other people link to my stuff. What sort of hypocrite would I then be if I only ever linked to my own websites?

Leading by example, I set up my own list of bookmarks:


--
Joel Souza

Irapuan Martinez

Nov 28, 2023, 1:26:12 PM
to ar...@googlegroups.com
On Fri, Nov 24, 2023 at 10:15 AM Joel Souza <joel....@gmail.com> wrote:
It is an unfortunate state of affairs. Even if you do not particularly care for becoming the next big thing, it’s still discouraging to put work into a website and get next to no traffic beyond the usual bots.

The dude is blaming the algorithms (ah, always the algorithms) when what actually discouraged websites was… social networks.

Small, cool websites still exist today, but people are simply using Instagram (previously, Facebook) as their content editor. They get more visits that way, to the point where investing in a website isn't worthwhile.

Why would someone use a communication channel to publish content that should live on a website?

You have to understand that websites should be easy to use and to build at home. Or it should be possible to hire someone whose hourly rate isn't prohibitive to maintain the content. But we made everything more complex, and today it's easier to open an Instagram account than to install a content editor. Websites used to be static; today you can publish every day. And since everyone moved to social networks, you can buy a promotion package that sidesteps the algorithm to attract more visitors. It works better than the esoteric practices of SEO.

Social networks were the end of the web. We built little walled gardens and institutionalized them. What you used to blog, today you do as reels or whatever. Since everyone is inside the walled garden, you only publish for them. Besides being free and ridiculously simple, it has an audience. It only cost us that wild web of the '90s that everybody loved.


A proposal, dear reader: Create a list of bookmarks linking to websites you find interesting, and publish it for the world to see. You decide what constitutes “interesting”.

Google already did exactly this. Except that instead of you publishing your bookmarks for nobody to see, it understands that the links you point to deserve credit.
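(The mechanism alluded to here is PageRank: every outbound link passes a share of the linking page's score to its target. Below is a minimal power-iteration sketch of that core idea, not Google's actual implementation; the three-site graph is invented purely for illustration.)

def pagerank(graph, damping=0.85, iterations=50):
    """graph maps each page to the list of pages it links to."""
    n = len(graph)
    rank = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            if outlinks:
                # Each link passes an equal share of this page's score.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # A page with no outlinks spreads its score evenly.
                for target in graph:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# "c" is linked by both "a" and "b", so it ends with the highest score.
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))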

