Priki wiki fails stress test at about 2 MB


car...@googlemail.com

Sep 24, 2007, 3:54:53 PM
to Priki, mar
Hi all

Our new 'Stress Test' installation of Priki wiki failed today after
adding a few more pages past the last crash state. My guess is that
the snapshot file fails at around 2 MB of data: my last backup
yesterday showed 1.996 MB in the snapshot and 533 KB in the journal
file. The error today was similar to the original failure I posted
about here previously.

Persistent flat-file data storage has limitations, especially when
so many automatic links and content relations are being built up. I
noticed a performance degradation in saving pages leading up to
yesterday's crash: it originally took about 4-5 seconds to save a
page when I first installed the wiki, and that gradually crept up
to 17 seconds towards the end. Today, when I clicked to save, the
page hung for over 60 seconds, then returned the Java error below.
The wiki still works for viewing, but as before, we can't save new
data. It appears that when it tries to save and fails, it crashes
and breaks the Prevayler system (as before).

I think Priki wiki is intrinsically flawed in the way it handles
data. It's fine for small wikis with minimal links and data, but it
will always fail with large data sets. I have been especially
careful about entering data for this stress test, and I see no
other factor that could confound my findings.

I personally think this is an excellent wiki that needs to mature
into scalability. A database, instead of flat-file persistence, is
the way to go. Whether you will ever do that remains to be seen,
but now we know for sure that it was not our fault, or the server,
or any other human error, that brought our wiki down today.

This wiki has two brilliant features that I love and will always
want to recapture: the autolinking of words to page titles and the
related-content list. The second feature is not unique, as other
wikis call this 'backlinks' or 'what links here'. The autolinks,
however, are pretty unique, and my searches so far have not
revealed anything like them. Maybe the parsing of data to create
autolinks is at the heart of the failure. There is simply not
enough memory and CPU power to deal with so much processing at
large data levels. Remember, the autolinks have to be discovered
and are written to that one big file as part of the save process.
That's a lot of work for large data sets! (We use 2.4 GHz and 2 GB
of memory on a dedicated Tomcat server.)
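
Something like this, I imagine (a hypothetical sketch, not Priki's
actual code; the class and the word-equals-title matching rule are
my assumptions):

    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.Set;

    // Hypothetical sketch of naive autolink discovery: every word of the
    // saved page is checked against the set of existing page titles. The
    // work per save grows with the size of the page, and re-linking all
    // referencing pages multiplies it again as the wiki grows.
    public class AutolinkSketch {

        public static Set<String> discoverAutolinks(String pageText, Set<String> titles) {
            Set<String> links = new HashSet<String>();
            for (String word : pageText.split("\\s+")) {
                if (titles.contains(word)) { // one lookup per word
                    links.add(word);
                }
            }
            return links;
        }

        public static void main(String[] args) {
            Set<String> titles = new HashSet<String>(
                    Arrays.asList("Tomcat", "Prevayler"));
            System.out.println(discoverAutolinks(
                    "we run Tomcat with Prevayler", titles)); // prints the discovered titles
        }
    }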

You have a good product with Priki Wiki, but it is not scalable,
and you need to make this clear to users. If you could make it work
for large data sets, you would certainly have a winner, but right
now it has serious limitations.

Please take my report seriously; it would be great to see this
wiki mature in scalability.

Richard

Klaus Wuestefeld

Sep 24, 2007, 7:46:06 PM
to pr...@googlegroups.com
I believe you when you say Priki has limitations, but I believe
that has nothing to do with Prevayler. Flat-fileness has nothing to
do with it. The entire javafree.org portal is a prevalent system
with over 150 MB and thousands of daily visitors.

I understand Priki generates the snapshot only when ordered, right?
So being slow on all other operations has nothing to do with being
saved to a "flat file", does it?

The problem with Priki is probably something else. I believe Vitor can
enlighten us.

See you, Klaus.

Vitor Fernando

Sep 24, 2007, 8:30:42 PM
to pr...@googlegroups.com
Hello Richard,

Are you extending Priki?

> Our new 'Stress Test' installation of Priki wiki failed today after
> adding a few more pages past the last crash state. My guess is that
> the snapshot file fails at around 2 MB of data: my last backup
> yesterday showed 1.996 MB in the snapshot and 533 KB in the journal
> file. The error today was similar to the original failure I posted
> about here previously.

Can't be that. JavaFree has snapshots of 40 MB, always OK.

> Persistent flat-file data storage has limitations, especially when
> so many automatic links and content relations are being built up. I
> noticed a performance degradation in saving pages leading up to
> yesterday's crash: it originally took about 4-5 seconds to save a
> page when I first installed the wiki, and that gradually crept up
> to 17 seconds towards the end. Today, when I clicked to save, the
> page hung for over 60 seconds, then returned the Java error below.
> The wiki still works for viewing, but as before, we can't save new
> data. It appears that when it tries to save and fails, it crashes
> and breaks the Prevayler system (as before).

Hmm... does it show any error message?

> I think Priki wiki is intrinsically flawed in the way it handles
> data. It's fine for small wikis with minimal links and data, but it
> will always fail with large data sets. I have been especially
> careful about entering data for this stress test, and I see no
> other factor that could confound my findings.
>
> I personally think this is an excellent wiki that needs to mature
> into scalability. A database, instead of flat-file persistence, is
> the way to go. Whether you will ever do that remains to be seen,
> but now we know for sure that it was not our fault, or the server,
> or any other human error, that brought our wiki down today.
 
Yes, Priki runs only on Prevayler. However, it would be impossible to use the same approach with any kind of relational database. When the user posts a new text, Priki runs two or more "selects" for each word of the text. Afterwards, it loads all referencing posts and "re-posts" all of them to refresh the references. That is a lot of processing per post for any relational database.
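
Roughly this shape, if it helps (a hypothetical sketch of the work involved, not Priki's real code):

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Set;

    // Hypothetical sketch of the save-time cascade: posting a page also
    // refreshes every page that references it, so the cost of one save
    // grows with the number of referrers, on top of the per-word lookups.
    public class RepostCascade {

        // title -> titles of the pages that link to it (my assumption,
        // not Priki's real data structure; left unpopulated in this sketch)
        private final Map<String, Set<String>> referrersOf =
                new HashMap<String, Set<String>>();
        private final Map<String, String> textOf = new HashMap<String, String>();

        public void post(String title, String text) {
            textOf.put(title, text);
            Set<String> referrers = referrersOf.get(title);
            if (referrers == null) return;
            for (String referrer : referrers) {
                refreshAutolinks(referrer); // "re-post" to recompute its links
            }
        }

        private void refreshAutolinks(String title) {
            // stand-in for re-parsing the referrer's text against all titles
        }
    }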
 
> This wiki has two brilliant features that I love and will always
> want to recapture: the autolinking of words to page titles and the
> related-content list. The second feature is not unique, as other
> wikis call this 'backlinks' or 'what links here'. The autolinks,
> however, are pretty unique, and my searches so far have not
> revealed anything like them. Maybe the parsing of data to create
> autolinks is at the heart of the failure. There is simply not
> enough memory and CPU power to deal with so much processing at
> large data levels. Remember, the autolinks have to be discovered
> and are written to that one big file as part of the save process.
> That's a lot of work for large data sets! (We use 2.4 GHz and 2 GB
> of memory on a dedicated Tomcat server.)

The text of a post transaction is saved as a String, not as a graph of objects, which makes the serializer's work very easy and fast. The problem with the long processing time is related to the text parser. Maybe a new bug? I don't know.
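
You can verify that the payload itself is cheap to serialize with a quick, self-contained check (nothing Priki-specific here):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;

    // Self-contained check: serializing a post whose payload is a single
    // String is trivial work, so the journal write is not the slow part.
    public class SerializeCheck {
        public static void main(String[] args) throws IOException {
            StringBuilder sb = new StringBuilder();
            while (sb.length() < 500000) sb.append("some page text ");
            String text = sb.toString(); // roughly 500 KB of text

            long start = System.nanoTime();
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            ObjectOutputStream out = new ObjectOutputStream(bytes);
            out.writeObject(text);
            out.close();
            long elapsedMs = (System.nanoTime() - start) / 1000000L;
            System.out.println(bytes.size() + " bytes in " + elapsedMs + " ms");
        }
    }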

> You have a good product with Priki Wiki, but it is not scalable,
> and you need to make this clear to users. If you could make it work
> for large data sets, you would certainly have a winner, but right
> now it has serious limitations.

It is not a highly scalable system, but it can easily hold data sets that fit in less than 1 GB of RAM.

If you are not extending Priki, I think you have found a bug. Can you replicate the problem? Send it to me, please.

> Please take my report seriously; it would be great to see this
> wiki mature in scalability.
>
> Richard

--
Vitor Fernando Pamplona
http://vitorpamplona.com

richard strauss

Sep 26, 2007, 9:39:57 AM
to pr...@googlegroups.com
"Are you extending Priki?"
No, we have only made changes to the language file.


"Can't be that. JavaFree has snapshots of 40 MB, always OK. "
Ok, It was our observation that the wiki failed at about 2mb last time, as well as this time.


"hum... it shows any error message?"
Here is the message
The server encountered an internal error () that prevented it from fulfilling this request._

exception

java.lang.RuntimeException: Unable to produce a copy of the prevalent system for trying out transactions before applying them to the real system.
    org.prevayler.implementation.publishing.censorship.StrictTransactionCensor.produceNewFoodTaster(StrictTransactionCensor.java:53)
    org.prevayler.implementation.publishing.censorship.StrictTransactionCensor.royalFoodTaster(StrictTransactionCensor.java:44)
    org.prevayler.implementation.publishing.censorship.StrictTransactionCensor.approve(StrictTransactionCensor.java:28)
    org.prevayler.implementation.publishing.CentralPublisher.approve(CentralPublisher.java:72)
    org.prevayler.implementation.publishing.CentralPublisher.publishWithoutWorryingAboutNewSubscriptions(CentralPublisher.java:63)
    org.prevayler.implementation.publishing.CentralPublisher.publish(CentralPublisher.java:49)
    org.prevayler.implementation.PrevaylerImpl.publish(PrevaylerImpl.java:64)
    org.prevayler.implementation.PrevaylerImpl.execute(PrevaylerImpl.java:59)
    org.priki.service.Prevalence.execute(Prevalence.java:97)
    org.priki.actions.PrikiAction.postWikiword(PrikiAction.java:206)
    sun.reflect.GeneratedMethodAccessor223.invoke(Unknown Source)
    sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    java.lang.reflect.Method.invoke(Method.java:597)
    com.opensymphony.xwork.DefaultActionInvocation.invokeAction(DefaultActionInvocation.java:300)
    com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:166)
    com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
    com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
    com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
    com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
    com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
    com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
    com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
    com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
    com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
    com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
    com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
    com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
    org.priki.interceptor.I18nInterceptor.intercept(I18nInterceptor.java:85)
    com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
    com.opensymphony.xwork.DefaultActionProxy.execute(DefaultActionProxy.java:116)
    com.opensymphony.webwork.dispatcher.ServletDispatcher.serviceAction(ServletDispatcher.java:272)
    com.opensymphony.webwork.dispatcher.ServletDispatcher.service(ServletDispatcher.java:237)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
--------------------
    org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:350)

------------------------------------------------------------
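
For what it's worth, the top frames seem to be Prevayler's "royal
food taster": the StrictTransactionCensor tries to produce a copy of
the whole prevalent system before applying each transaction, and it
is that copy step that fails here. From my reading of the Prevayler
2.x factory API, the taster can be switched off, something like this
(a sketch only; WikiSystem is a hypothetical stand-in, and I may be
misreading the API):

    import java.io.Serializable;
    import org.prevayler.Prevayler;
    import org.prevayler.PrevaylerFactory;

    // Sketch based on my reading of the Prevayler 2.x API: with transaction
    // filtering off, no copy of the system (the "food taster") is produced
    // before each transaction. The trade-off is that a transaction which
    // throws halfway can leave the live system inconsistent, which is
    // exactly what the taster exists to prevent.
    public class NoFoodTaster {

        static class WikiSystem implements Serializable {} // hypothetical stand-in

        public static void main(String[] args) throws Exception {
            PrevaylerFactory factory = new PrevaylerFactory();
            factory.configurePrevalentSystem(new WikiSystem());
            factory.configurePrevalenceDirectory("prevalenceBase");
            factory.configureTransactionFiltering(false); // no StrictTransactionCensor
            Prevayler prevayler = factory.create(); // ready to execute transactions
        }
    }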


 " If you are not extending priki, I think you found a bug. Can you replicate the problem? Send it to me, please."
The problem is easy for me to replicate, but takes a lot of input! The wiki fails at about 2mb of data in the snapshot file. Its happened twice at the same amount of data. Once the wiki fails, we can read it OK, but we can not save anymore data, we get the error as above, whenever we try to save any new pages or edit a page.

Thank you, Richard

richard strauss

Sep 30, 2007, 5:14:07 AM
to pr...@googlegroups.com
So, do you guys have any ideas about what is going wrong here? What exactly do you need to resolve the issue?

Thanks.

Vitor Fernando

Sep 30, 2007, 12:11:03 PM
to pr...@googlegroups.com
Hello Richard,

I've made a new stress test with a 5 MB text in a single transaction. Priki saves the journal and takes a snapshot correctly. So, I need more instructions to replicate the problem. Maybe a full graph with the words? Is that what you are doing?
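
For reference, here is the shape of my test versus, I suspect, the shape of your data (a hypothetical sketch; post() is a stub standing in for Priki's save):

    // Hypothetical sketch of the two test shapes: one 5 MB text in a single
    // transaction (what I tested) versus many small, heavily cross-linked
    // pages (what a real wiki accumulates). Only the second shape grows the
    // title set and the referrer cascade on every save.
    public class TestShapes {

        static void post(String title, String text) {
            // stub standing in for Priki's save
        }

        public static void main(String[] args) {
            StringBuilder big = new StringBuilder();
            while (big.length() < 5 * 1024 * 1024) big.append("lorem ipsum ");
            post("BigPage", big.toString()); // one page, no cross-links

            for (int i = 0; i < 2000; i++) { // many pages, dense links
                post("Page" + i, "see Page" + (i / 2) + " and Page" + (i / 3));
            }
        }
    }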

Thanks.

richard strauss

Oct 2, 2007, 10:40:13 AM
to pr...@googlegroups.com
OK, Vitor, I'm going to create a new wiki and make all the content available to you. Our existing wiki is for our internal content only, so I can't send you the database files. It will take some time to do this, but I will post back once it's online. I will make the content structure similar to the one that failed, to provide a fair comparison.

Thanks for your help.