Mat et al.,
You make the point that unless we have someone to call it does not serve much use; but if single-file wikis can provide this service, it would be trivial to build a nascent network. A loose network starts with a single node. Of course, initially we will only have a test network or node.
I have started experimenting, but I am getting some errors.
I believe I understand where you are coming from: are you hoping to ensure openness and connectivity? Are you looking for a Tiddlyverse network?
If I recall correctly, Jed's original example and vision was a peer-to-peer or networked messaging platform, and that is a valuable and serious application of TW Federation. Personally, my priority is first to enable my own controlled wikis to intercommunicate, then perhaps to publish content through a subscription service, and later to build a more open and generic network. I have always felt that part of the lack of progress with TW Federation is that we are not taking the intermediate steps first, although Jed has enabled this.
As Jed put it, mixing HTTPS and HTTP could be a hard security restriction. I can not only live with that, I think it is an important limitation, and the same goes for the need for a client/server component. If my site is HTTPS, I would not want someone pulling content out of it in clear-text HTTP; if my site is HTTP, anyone can pull it in clear text. HTTPS can only work if both nodes participate in it. I also like the idea that unless I install the server component, I have not opened my wiki's content to a more programmatic "query and extract" process (although it is easy to achieve by other means). I am not saying that we can't allow a generic, non-plugin way to access a TiddlyWiki, only that it can be defeated, and there are some wikis I may not want to leak.
I believe approaching the "Tiddlyverse network" is a logical or configuration-standards problem, not a technical one. Basically, publishing "that a particular content type is available at a particular endpoint", and listing it in or managing a directory etc., is a matter of "policy and practice". Developing some de facto standards that, on adoption, may one day become de jure standards is the way to go with larger networks. I want to build the internet, not Facebook (if that makes sense).
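To make the "policy and practice" idea concrete, the announcement could be nothing more than a small descriptor a node publishes at an agreed location, saying which content types it offers and at which endpoints. The field names and file name below are purely illustrative, not a proposal for the standard itself:

```python
# Purely illustrative: a hypothetical feed descriptor a node might publish at an
# agreed "well-known" location. None of these field names are an actual standard.
import json

descriptor = {
    "node": "https://tonys-wiki.example.org",      # hypothetical node address
    "feeds": [
        {"content-type": "message",   "endpoint": "/feeds/messages.json"},
        {"content-type": "blog-post", "endpoint": "/feeds/blog-feed.json"},
    ],
}

# Write it out so it can be served as a static file alongside the wiki.
with open("tiddlyverse-directory.json", "w", encoding="utf-8") as f:
    json.dump(descriptor, f, indent=2)
```

Whether that lives in a tiddler, a static JSON file or a central directory wiki is exactly the kind of practice we would need to agree on.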
My idea here is: let's get the "pull" mechanism working (by this I mean establishing practices, examples, further how-tos and some de facto standards), then have two nodes pulling from each other. For example, imagine I pull standard messages from your wiki, and it tells me you have blog posts I can also pull, so I pull them and republish them. Then we look at a central exchange wiki for a/each network, and the journey continues. A step at a time.
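As a rough illustration of the pull side, here is a minimal sketch, assuming the remote node runs the standard Node.js TiddlyWiki server (tiddlywiki --listen) and has been configured to allow external filters; the remote address and the tag being pulled are made up for illustration only:

```python
# Minimal sketch of the "pull" side. Assumes the remote node runs the standard
# Node.js TiddlyWiki server and allows external filters; the address and tag
# below are assumptions for illustration.
import json
import urllib.parse
import urllib.request

REMOTE = "http://example.org:8080"        # hypothetical remote node
FILTER = "[tag[public-message]]"          # whatever we have agreed to publish/pull

def pull_tiddlers(base_url: str, tw_filter: str) -> list:
    """Fetch the tiddlers matching a filter from a remote TiddlyWiki server."""
    query = urllib.parse.urlencode({"filter": tw_filter})
    url = f"{base_url}/recipes/default/tiddlers.json?{query}"
    with urllib.request.urlopen(url) as response:
        # Note: this listing normally omits the text body; full tiddlers can be
        # fetched individually once we know their titles.
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    for tiddler in pull_tiddlers(REMOTE, FILTER):
        print(tiddler.get("title"))
```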
Not unlike my suggestion about libraries being a mechanism, I see value in letting a wiki publish a subset of its content for consumption by other wikis, rather than the consumer needing to load the whole wiki (efficiency) or arguably being able to pull anything at all from the wiki (selective publishing). The advantage of a separate file or folder is that I can apply access security to each published content feed, allowing private as well as public interchanges.
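A hedged sketch of that publishing step, again assuming a local Node.js TiddlyWiki server and using made-up paths and filters: export the agreed subset to a standalone feed file in a folder whose access control the web server manages.

```python
# Sketch of the publishing side: export a filtered subset of a local wiki to a
# standalone JSON "feed" file that can be served, and access-controlled, on its
# own. The port, filter and output path are assumptions for illustration.
import json
import pathlib
import urllib.parse
import urllib.request

LOCAL = "http://localhost:8080"                              # local wiki server
FILTER = "[tag[blog]!is[system]]"                            # the published subset
FEED_FILE = pathlib.Path("/var/www/feeds/blog-feed.json")    # folder protected by the web server

def export_feed() -> None:
    """Pull the chosen subset from the local wiki and write it as a static feed."""
    query = urllib.parse.urlencode({"filter": FILTER})
    url = f"{LOCAL}/recipes/default/tiddlers.json?{query}"
    with urllib.request.urlopen(url) as response:
        tiddlers = json.loads(response.read().decode("utf-8"))
    FEED_FILE.write_text(json.dumps(tiddlers, indent=2), encoding="utf-8")

if __name__ == "__main__":
    export_feed()
```

Run on a schedule, that gives one feed per audience, and the private/public distinction becomes simply a matter of how each feed folder is protected.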
On the http/https issue, it may be possible to build a server that acts as a gateway between the two protocols, so that HTTP and HTTPS sites can exchange tiddlers, making it clear to HTTPS users that the later leg will not be HTTPS.
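To show the shape of such a gateway, here is a minimal sketch: an HTTPS front that forwards GET requests to an HTTP-only wiki. The upstream address, port and certificate paths are assumptions, and the point stands that the last leg stays plain HTTP:

```python
# Minimal sketch of an HTTPS->HTTP gateway, so an HTTPS-only consumer can pull
# tiddlers from an HTTP-only wiki. The upstream address, port and certificate
# paths are assumptions; the leg to the upstream wiki remains plain HTTP, which
# the gateway should make clear to its users.
import ssl
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://plain-http-wiki.example.org:8080"   # hypothetical HTTP-only node

class GatewayHandler(BaseHTTPRequestHandler):
    def do_GET(self) -> None:
        # Forward the incoming HTTPS request path to the HTTP upstream and relay
        # the response back to the HTTPS client.
        with urllib.request.urlopen(UPSTREAM + self.path) as upstream:
            body = upstream.read()
            self.send_response(upstream.status)
            self.send_header("Content-Type",
                             upstream.headers.get("Content-Type", "application/json"))
            self.send_header("Content-Length", str(len(body)))
            # Be explicit that the upstream leg was not encrypted.
            self.send_header("X-Gateway-Upstream", "http")
            self.end_headers()
            self.wfile.write(body)

if __name__ == "__main__":
    server = HTTPServer(("0.0.0.0", 8443), GatewayHandler)
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain("gateway-cert.pem", "gateway-key.pem")   # hypothetical certificate
    server.socket = context.wrap_socket(server.socket, server_side=True)
    server.serve_forever()
```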
Regards
Tony