There have been some hiccups with the sljfaq.org/cgi/ pages since yesterday.
Usually the pages are protected against overuse, but because I show advertising on them, I don't protect them against accesses from Google. Somebody worked out that this gave them unlimited access to my web sites if they fetched the pages through Google Docs, via a spreadsheet. It looks as if some of the people doing this were scraping my sites and then using the data for their own web sites or something.
I've written to Google about this, but the usual "Rain Man" problems with Google apply in spades, and they don't seem to be doing anything about it. Anyway, in my attempts to block Google Docs access I damaged the site slightly. It should be OK now, but some of the outputs were cached, so some people might have to clear their browser caches to access the site again.
As a result of this I'm currently doing a big rewrite of the sljfaq.org/cgi/ front end. It needs to be moved away from CGI, not only to make it easier to mitigate this kind of abuse, but also to make each access less of a burden on the server. At the moment an access to e2k.cgi "costs" the compilation of a lot of quite complicated Perl modules, and that cost is paid again for every single request. Even when a request is refused, the compilation phase of the Perl CGI script still runs once for that request, so a stream of abusive requests uses a large amount of resources.
In 2019 I tried to switch over to a persistent process model, but all of the Perl solutions that I tried seemed to require a lot of resources just to run, even without serving any user requests. So I'm going to write the front end in Go, with the back end remaining in Perl, possibly as a server process and possibly as a plain executable; I'm not sure yet.