[Plone-Users] Data.fs maintenance, Packing by wget?

Peter Fraterdeus

Aug 7, 2007, 6:08:46 AM
to plone...@lists.sourceforge.net
I've got a nightly cron repozo script, and a manual shell snippet which will rm old repozo fsz's, cleaning up the archive, after a Data.fs compact. However, is it considered good practice to do, for instance, an automatic weekly compact of the Data.fs?
In which case I'd be able to also automate the archive cleanup....

I've seen references to using wget or curl to initiate the Data.fs compact process.
What's the best practice on this? I found this: http://blogs.translucentcode.org/mick/2004/03/10/using_wget_to_pack_zope/

Many thanks for pointers!

Peter

--
AzByCx DwEvFu GtHsIr JqKpLo MnNmOl PkQjRi ShTgUf VeWdXc YbZa&@
>ARTQ: Help stop in-box bloat! Always Remember to Trim the Quote!<

Semiotx Inc http://typeandmeaning.com Sign up for "Type and Meaning" !
Creative/IT facilitation "Free Range IT" Plone CMS Typography
-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*-:-*
Peter Fraterdeus http://www.alphabets.com : Sign up for "MiceType"!
Galena, Illinois Design Philosophy Fonts Lettering Letterpress Wood Type
Dubuque, Iowa http://www.fraterdeus.com
Photography Irish Fiddle Political Observation
http://flickr.com/photos/pfraterdeus

_______________________________________________
Plone-Users mailing list
Plone...@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/plone-users

Andreas Jung

Aug 7, 2007, 6:16:48 AM
to Peter Fraterdeus, plone...@lists.sourceforge.net

--On 7. August 2007 05:08:46 -0500 Peter Fraterdeus <pet...@semiotx.com>
wrote:

> I've got a nightly cron repozo script, and a manual shell snippet which
> will rm old repozo fsz's, cleaning up the archive, after a Data.fs
> compact. However, is it considered good practice to do, for instance, an
> automatic weekly compact of the Data.fs? In which case I'd be able to
> also automate the archive cleanup....

If your Data.fs grows fast, you have to pack more often than when it grows
slowly. *You* have to monitor how much it grows during the week or month and
then decide when to pack and how often. "Best practice" does not mean that
the best-practice approach is always good for you.

>
> I've seen references to using wget or curl to initiate the Data.fs
> compact process. What's the best practice on this? I found this:
> http://blogs.translucentcode.org/mick/2004/03/10/using_wget_to_pack_zope/
>
>

curl, wget... it does not matter. There are tons of tools available for
performing such HTTP requests; choose one. In addition: if you're
running ZEO, use zeopack.py for packing the Data.fs over a dedicated
ZEO connection.
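[Editor's note: for illustration, a minimal Python sketch of the HTTP pack request that wget or curl would make. The host, port, storage name "main", and the admin credentials are placeholders; adjust them for your instance.]

```python
# Sketch: trigger a ZODB pack over HTTP, as wget/curl would.
# Host, port, user, and password below are placeholders.
import base64
import urllib.request

def build_pack_request(host="localhost", port=8080, days=3.0,
                       user="admin", password="secret"):
    # The ZMI pack action lives on the Control_Panel; "main" is the
    # default storage name.
    url = ("http://%s:%d/Control_Panel/Database/main/manage_pack?days:float=%s"
           % (host, port, days))
    req = urllib.request.Request(url)
    # Basic auth header, as wget's --http-user/--http-passwd would send it.
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    return req

# To actually pack:  urllib.request.urlopen(build_pack_request())
```

The equivalent one-liner with curl would be roughly `curl -u admin:secret 'http://localhost:8080/Control_Panel/Database/main/manage_pack?days:float=3'` (same placeholder credentials).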

-aj

Peter Fraterdeus

Aug 7, 2007, 6:44:13 AM
to plone...@lists.sourceforge.net
Hi Andreas

Thanks very much for this; I was not aware of zeopack.py, which of course would be a best practice ;-)

Ciao

PF

>...


>>
>>I've seen references to using wget or curl to initiate the Data.fs
>>compact process. What's the best practice on this? I found this:
>>http://blogs.translucentcode.org/mick/2004/03/10/using_wget_to_pack_zope/
>>
>
>curl, wget...it does not matter. There are tons of tools available for performing such HTTP requests, choose one. In addition: if you're
>running ZEO: use zeopack.py for packing the Data.fs using a dedicated
>ZEO connection.
>
>-aj

Peter Fraterdeus

Aug 7, 2007, 1:14:05 PM
to plone...@lists.sourceforge.net
>
>curl, wget...it does not matter. There are tons of tools available for performing such HTTP requests, choose one. In addition: if you're
>running ZEO: use zeopack.py for packing the Data.fs using a dedicated
>ZEO connection.

Hmmm.


I'm trying: ((for example))
> PYTHONPATH=/opt/Zope-2.9/lib/python ./zeopack.py -p8100

When Andreas says "using a dedicated ZEO connection", does that mean I should first set up a distinct ZEO on a different port? (I haven't run multiple ZEOs before)

Thanks for pointers!

Andreas Jung

Aug 7, 2007, 1:19:08 PM
to Peter Fraterdeus, plone...@lists.sourceforge.net

--On 7. August 2007 12:14:05 -0500 Peter Fraterdeus <pet...@semiotx.com>
wrote:

> I'm trying: ((for example))
> > PYTHONPATH=/opt/Zope-2.9/lib/python ./zeopack.py -p8100
>
> When Andreas says "using a dedicated ZEO connection" does that mean I
> should first set up a distinct Zeo on a different port? (I haven't used
> multiple Zeos before)

Huh? You have *one* ZEO server running on port XXX, and both your ZEO
clients and your zeopack script connect to the same server using the same
port settings.

-aj

Peter Fraterdeus

Aug 7, 2007, 5:35:53 PM
to plone...@lists.sourceforge.net
>>
>>I'm trying: ((for example))
>> > PYTHONPATH=/opt/Zope-2.9/lib/python ./zeopack.py -p8100
>>
>>When Andreas says "using a dedicated ZEO connection" does that mean I
>>should first set up a distinct Zeo on a different port? (I haven't used
>>multiple Zeos before)
>
>Huh? You have *one* ZEO server running on port XXX, and both your ZEO clients and your zeopack script connect to the same server using the same
>port settings.
>
>-aj

OK, that's what I'm doing, and it's working.
I wasn't quite sure from the language in the prior reply whether there was some other approach that might be more efficient, or have some other advantage.

Many thanks

PF

Nick Davis

Aug 8, 2007, 4:41:58 AM
to plone...@lists.sourceforge.net
Peter Fraterdeus wrote:
> I've got a nightly cron repozo script, and a manual shell snippet which will rm old repozo fsz's, cleaning up the archive, after a Data.fs compact. However, is it considered good practice to do, for instance, an automatic weekly compact of the Data.fs?
> In which case I'd be able to also automate the archive cleanup....
>
> I've seen references to using wget or curl to initiate the Data.fs compact process.
> What's the best practice on this? I found this: http://blogs.translucentcode.org/mick/2004/03/10/using_wget_to_pack_zope/


Peter,
This might not be elegant, but it has worked on our live system for nearly
two years now, and is simple for dumb people like me to understand. ;-)

Simply have a script, let's call it packzodb.py, containing:

dbs = app.Control_Panel.Database
dbs['main'].manage_pack(days=3)

And get cron to run a command that does this:
zopectl run packzodb.py
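[Editor's note: for instance, a weekly crontab entry; the day, time, and instance path here are purely illustrative.]

```shell
# Pack every Sunday at 03:30; adjust the instance path to your setup.
30 3 * * 0 /opt/instance/bin/zopectl run /opt/instance/packzodb.py
```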

That's it. You do need ZEO for this to work, so that this Zope client can
connect separately to ZEO while the main Zope client(s) is still running.
If you haven't set up ZEO yet, that's probably something you'd want to do
for a load of other reasons anyway.

Note the days=3 above; you can change it to whatever you like.

We originally packed to days=7 but went down to 3 when the Data.fs grew
more rapidly. The disadvantage of packing down to fewer days is that you
lose history from undo. It's more fiddly to get back anything a user
deleted by mistake if you haven't got 'undo'.
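[Editor's note: the days argument is just a cutoff: revisions older than now minus that many days are discarded at pack time, and the undo history for that window goes with them. A rough sketch of the arithmetic, as an illustration, not Zope's actual implementation:]

```python
import time

def pack_cutoff(days, now=None):
    # Pack keeps only object revisions newer than this timestamp;
    # anything older can no longer be undone afterwards.
    if now is None:
        now = time.time()
    return now - days * 86400  # 86400 seconds in a day

# days=3 keeps roughly three days of undo history; days=7 keeps a week.
```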

Cheers
Nick

PS If someone knows of a reason why this way sucks, please do tell. ;-)

--
Nick Davis
Web Application Developer
University of Leicester
http://www2.le.ac.uk
