On Fri, 15 Jun 2018 08:23:18 -0700 (PDT)
Rob <lar...@gmail.com> wrote:
> Recently, I discovered the excellent and useful plugin, leo_cloud.py
> (thanks Terry Brown!) My initial interest was to find an easier way
> to sync myLeoSettings across multiple machines, locations, and OSes. I
> see additional uses for this plugin beyond just the settings files.
>
> First, a potential `gotcha` in case someone else sees something
> similar. I was getting a series of errors when loading my settings
> file which contained numerous @leo_cloud nodes. Something to the
> effect that ('xxx') was not JSON serializable. I don't claim to
> understand exactly what that means, but after some searching I found
> the culprit. One of my abbreviations had something like [item1,
> item2, etc.]. Apparently, the JSON module can't handle this kind of
> data (set or array or something). After I modified the offending
> abbreviation, the errors ceased and all is well.
You have that pretty much worked out - it saves the subtree, including
unknownAttributes (p.v.u), as JSON, so all the values in p.v.u have to
be something JSON can represent. I forget what it does with things it
can't represent; it may convert sets to lists and dates to strings.
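To illustrate the two behaviors above - the "not JSON serializable" error Rob hit, and the set-to-list / date-to-string conversion I mentioned - here's a small standalone sketch using Python's json module directly (the `jsonable` helper and the sample data are mine, not leo_cloud's actual code):

```python
import json
import datetime

def jsonable(obj):
    # Fallback for types the json module can't serialize directly:
    # sets become sorted lists, dates become ISO strings.
    if isinstance(obj, set):
        return sorted(obj)
    if isinstance(obj, (datetime.date, datetime.datetime)):
        return obj.isoformat()
    raise TypeError("Object of type %s is not JSON serializable"
                    % type(obj).__name__)

# Stand-in for a p.v.u dict holding values JSON can't represent.
ua = {"tags": {"b", "a"}, "when": datetime.date(2018, 6, 15)}

try:
    json.dumps(ua)  # fails: sets are not JSON serializable
except TypeError as e:
    print("error:", e)

# With a fallback, the same data round-trips cleanly.
print(json.dumps(ua, default=jsonable, sort_keys=True))
```

So any abbreviation (or other p.v.u value) that is a plain string, number, list, or dict is fine; anything else either errors out or gets coerced on the way through.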
> Second, a question (Terry?) The docs provide examples of how to
> configure the @leo_cloud nodes to connect to local file system
> locations and to a GitHub repo. It is suggested that other cloud
> services can be used, like DropBox, but no example is given. It's not
> obvious what parameters (URL? Credentials?) to use in the case of
> DropBox or others.
What I was thinking was that you'd just use the local file system
location approach, but pointing into a folder that was being sync'ed
by DropBox or whatever.
Cheers -Terry