Push Pull Download Bittorrent

Kristin Dampeer

Aug 21, 2024, 12:40:55 AM
to pearbnsadilad

So if a site is hosted across many different computers, what happens when the site owner issues an update or content changes? Would it then have to send the update to all the computers hosting a piece of the code? And would anyone browsing the internet risk getting an outdated version? I mean, the web is constantly changing, so all the browsers would have to constantly push and pull from the original sites.

Looking forward to it. Are there any plans for making regular web site resources available through torrent? Something like, I go to a video streaming website and the browser looks at the ETag/Last-Modified header and starts streaming the video from multiple peers. Of course there are obvious security and privacy concerns for something like this.

I really like that I can click on a magnet link or a torrent file, see the content, and download it if I want to. However, it seems to be way slower than using any torrent client. I know it's still a work in progress, but am I wrong to expect it to connect to and download from regular torrent users?

Thanks for the feedback. We have some fixes that we'll be getting out as soon as possible that should make the experience a lot smoother. As for downloads being slower than in other torrent clients, this may be the cost of streaming versus standard prioritization. When the default magnet link view shows up and you try to play back media, the content is streamed so that playback can start as quickly as possible. The download speed in this case will differ from traditional BitTorrent downloading.

I have a complete static website in a torrent; all images and stylesheets load fine on index.html, but browsing within the site is just not working - no reaction to clicking. Is there some test torrent with a full site available? The one single torrent I found in the forum (no docs?!) is for that game, which is just one page.

I wish to know that too. I understand that at this moment Maelstrom only supports static websites, but I think this (really awesome) project will catch on only with some kind of (even limited) support for dynamic websites. What's the roadmap?

I believe you should make a simple webpage where one enters the URL of their website (or at least a path on their computer), and then downloads its .torrent/magnet from that page. (Eventually, maybe an option to feed a BitTorrent client automatically; I guess you know this stuff better than me.)

It's based on http(s) instead of the BitTorrent protocol, so there is no way to make updating torrents work without HTTP servers. It seems the BitTorrent protocol could be extended to support updates without using HTTP links (for instance, using the same mechanism the Tor Project uses for .onion links). Did you consider that option?

Hi, I am trying to add thickness to a curved shape. I understand from previous threads that Fredo6's Joint Push Pull should do this, but it no longer seems to be available from the warehouse. When I use the Joint Pull tool from the warehouse I get the attached result, which creates a break at each joint on the surface because the pull is purely normal to the surface. Is there another tool available?
[Attachment: image984191 (18.8 KB)]
[Attachment: Curved Peak for cap.skp (213.7 KB)]

Often git is used in a centralized fashion, with a central bare repository which changes are pulled from and pushed to using normal git commands. That works fine, if you don't mind having a central repository.

But it can be harder to use git in a fully decentralized fashion, with no central repository, and still keep repositories in sync with one another. You have to remember to pull from each remote, and merge the appropriate branch after pulling. It's difficult to push to a remote, since git does not allow pushes into the currently checked out branch.

git annex sync makes it easier using a scheme devised by Joachim Breitner. The idea is to have a branch synced/master (actually, synced/$currentbranch) that is never directly checked out, and serves as a drop-point for other repositories to use to push changes.

When you run git annex sync, it merges the synced/master branch into master, receiving anything that's been pushed to it. (If there is a conflict in this merge, automatic conflict resolution is used to resolve it.) Then it fetches from each remote, and merges in any changes that have been made to the remotes too. Finally, it updates synced/master to reflect the new state of master, and pushes it out to each of the remotes.
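The drop-point idea can be demonstrated with plain git alone (no git-annex required). In this sketch, repository "b" cannot push into "a"'s checked-out master, so it pushes to synced/master instead, and "a" later merges that branch; the repo names and commit messages are made up for the demo:

```shell
#!/bin/sh
# Plain-git sketch of the synced/master drop-point scheme.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Repo "a": the receiving repository, with master checked out.
git init -q -b master a
(cd a && git -c user.email=a@x -c user.name=a commit -q --allow-empty -m base)

# Repo "b": a clone that makes a change.
git clone -q a b
(cd b && git -c user.email=b@x -c user.name=b commit -q --allow-empty -m change)

# b pushes its master to a's drop-point branch, never to a's checked-out master.
(cd b && git push -q origin master:synced/master)

# On a's side, "sync" means merging whatever landed on the drop-point.
(cd a && git merge -q --ff-only synced/master)
(cd a && git log --oneline -1)   # b's "change" commit is now on a's master
```

Pushing directly to a's master would be refused (receive.denyCurrentBranch), which is exactly why the never-checked-out synced/master branch exists.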

This way, changes propagate around between repositories as git annex sync is run on each of them. Every repository does not need to be able to talk to every other repository; as long as the graph of repositories is connected, and git annex sync is run from time to time on each, a given change, made anywhere, will eventually reach every other repository.

Note that by default, git annex sync only synchronises the git repositories, but does not transfer the content of annexed files. If you want to fully synchronise two repositories' content, you can use git annex sync --content. You can also configure preferred content settings to make only some content be synced.
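The content-sync behaviour above can be sketched as a pair of configuration commands; the remote name origin and the *.mp3 glob are illustrative assumptions, not from this thread, so verify the expression syntax against the git-annex preferred-content documentation:

```shell
# Transfer annexed file contents as well as git metadata:
git annex sync --content

# Preferred content setting: only files matching the pattern are wanted
# by this remote, so --content syncs only those to it.
git annex wanted origin "include=*.mp3"
```

These are configuration steps rather than a runnable demo; git annex sync --content will thereafter respect the preferred-content expression.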

I came upon git-annex a few months ago. I saw immediately how it could help with some frustrations I've been having. One in particular is keeping my vimrc in sync across multiple locations and platforms. I finally took the time to give it a try after I hit my boiling point this morning. I went through the walkthrough and now I have an annex everywhere I need it. git annex sync and my vimrc is up-to-date, simply grand!

By default, git annex sync will sync to all remotes, unless you specify a remote. So, I have to specify, e.g., git annex sync origin. I can simplify this with aliases, I suppose, but I do a lot of teaching non-programmer scientists... so it'd be nice to be able to configure this (so beginning users don't have to keep track of as many things).
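The alias workaround mentioned above might look like this; the alias name "synco" is made up, and the demo redirects HOME so it doesn't touch a real ~/.gitconfig:

```shell
#!/bin/sh
set -e
export HOME=$(mktemp -d)   # keep the demo out of the real ~/.gitconfig

# Hypothetical alias so a plain "git synco" syncs only with origin:
git config --global alias.synco "annex sync origin"
```

A student would then run git synco instead of remembering to type the remote name each time, though as the poster notes, a configurable default would avoid the alias entirely.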

I'm trying to teach beginning scientist programmers (mostly graduate students), and a common scenario is to fork some scientific code. I'd like forking on github to be mundane, and not trigger warnings, and generally have as little for folks to explicitly keep track of as possible (this seems to be a common concern we share, which leads you to prefer syncing to all remotes without the option to configure the default behavior!).

However, I am currently working with students on forking and fixing up scientific code where the upstream maintainer doesn't want to allow pushes upstream, except via pull request. So, part of our approach is to set up some common shared datasets in git annex (and these just end up in our fork). If we have an "upstream" remote, git annex will try to sync with it, and report an error.

So - that's why I'd like to be able to configure the deactivation of syncing to a given remote (e.g., "upstream"). However, if you have other suggestions to smooth the workflow, I would also like to hear those!
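If I recall the git-annex per-remote options correctly, sync participation can already be switched off per remote with a config key; treat the exact key name as something to verify against the git-annex man page, and note the remote URL below is a placeholder:

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q demo && cd demo
git remote add upstream https://example.com/scientific-code.git

# Exclude this remote when "git annex sync" runs with no arguments
# (key name as recalled from the git-annex docs; verify locally):
git config remote.upstream.annex-sync false
```

With that set, a bare git annex sync would skip "upstream" while still syncing the fork and any data remotes.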

@Dav what kind of url does the upstream remote have? Perhaps it would be sufficient to make sync skip trying to push to git:// and http[s]:// remotes. Both are unlikely to accept pushes and in the cases where they do accept pushes it would be fine to need a manual git push.

I have a central repo and client repos. I want to copy all content to the central repo after a commit. Right now, I use "git annex group central backup", "git annex wanted central standard", and a hook that triggers "git annex sync --content" after each commit. Maybe there is a more efficient way to do this? Thanks for sharing thoughts.
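The hook the poster describes could be installed like this; the repository layout is a throwaway demo, and only the hook body reflects the described setup:

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q repo && cd repo

# Post-commit hook: after each commit, sync git state and annexed
# content (matching the "git annex sync --content" trigger above).
cat > .git/hooks/post-commit <<'EOF'
#!/bin/sh
git annex sync --content
EOF
chmod +x .git/hooks/post-commit
```

Combined with the "central" repository being in the backup group with the standard preferred-content setting, every commit pushes new content to it automatically.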

I too feel that syncing all remotes by default is the right thing to do, but I think it should be limited to the 'master' and 'git-annex' branch. I often create branches that I want to keep local and do not want them to be synced. But I want 'master' and 'git-annex' branches to be synced with all remotes.

Well, I have two git annex-ed repositories where "git remote -v" properly lists the other repo, and "git annex sync foo" manages to pull from foo, but "git annex sync" without a remote name simply does a local sync. Also, neither command pushes anything anywhere.

My way of working with git-annex doesn't seem to mesh well with the Assistant or even with git annex sync. I seem to have a bit of a control need when it comes to what gets committed when. But here's my workflow approximating what it does, with a twist. I have this in git config on mylaptop:

I don't need a synced/git-annex. If upstream is not up-to-date I fetch and merge. In this case upstream happens to be a bare git repo, so I don't need synced/master either. If upstream is non-bare, I use synced/master -- or sometimes I keep upstream usually checked out on an orphan branch and just switch into master to check things and then switch away to avoid conflict. If I can avoid it, I prefer not to have several branches where I don't know which one is the latest one.

If I just do git push, close the lid and run into the forest, it may or may not hit a non-fast-forward error on master and git-annex ... but it always succeeds in pushing to the mylaptop remote on my server.

If I have added a batch of files, I usually push first to all my remotes, to get that precious metadata up there. At that point I don't care if there's a conflict upstream. Then I git annex copy to wherever, fetch all remotes, git annex merge, maybe merge master if I have to (usually not), then push to all remotes again. It's less of a bother than it sounds like. I don't even have any handy aliases for this, I prefer to just get the for loop from my command-line history.
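The "push to all remotes" for loop pulled from shell history presumably looks something like the one below, shown here against throwaway repositories so it actually runs (plain git; the git-annex branch is omitted, and the remote names are made up):

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q --bare r1.git
git init -q --bare r2.git

git init -q -b master work && cd work
git -c user.email=me@x -c user.name=me commit -q --allow-empty -m "batch of files"
git remote add r1 ../r1.git
git remote add r2 ../r2.git

# The loop from command-line history: push master to every remote.
for r in $(git remote); do
  git push -q "$r" master
done
```

In a real git-annex workflow the push would list both branches, e.g. git push "$r" master git-annex, to get the "precious metadata" out before copying content.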
