Our new 'Stress Test' installation of Priki wiki failed today after
adding a few more pages beyond the last crash state. My guess is that
the snapshot file fails at around 2 MB of data, as my backup yesterday
showed 1.996 MB in the snapshot and 533 KB in the journal file.
Today's error was similar to the original failure described in my
previous posts here.
Persistent flat-file data storage has limitations, especially when so
many automatic links and content relations are being built up. I
noticed performance degrading on page saves in the run-up to
yesterday's crash: saving a page originally took about 4-5 seconds
when I first installed the wiki, and that gradually crept up to 17
seconds towards the end. Today, when I clicked save, the page hung for
over 60 seconds, then returned the Java error below. The wiki still
works for viewing, but as before, we can't save new data. It appears
that when a save attempt fails, it crashes and breaks the Prevayler
system (as before).
I think Priki wiki is intrinsically flawed in the way it handles data.
It's fine for small wikis with minimal links and data, but it will
always fail with large data sets. I have been especially careful about
entering data for this stress test, and I see no other factor that
could confound my findings.
I personally think this is an excellent wiki that needs to mature into
scalability. A database, instead of flat-file persistence, is the way
to go. Whether you will ever do that remains to be seen, but now we
know for sure that it was not our fault, the server, or any other
human error that brought our wiki down today.
This wiki has two brilliant features that I love and will always want
to recapture: the autolinking of words to page titles, and the
related-content list. The second feature is not unique, as other wikis
call this 'backlinks' or 'what links here'. The autolinks, however,
are pretty unique, and my searches so far have not revealed anything
like them. Maybe the parsing of data to create autolinks is at the
heart of the failure. There is simply not enough memory and CPU power
to deal with so much processing at large data levels. Remember, the
autolinks have to be discovered and then written to that one big file
as part of the save process. That's a lot of work for large data sets!
(We use a 2.4 GHz CPU and 2 GB of memory on a dedicated Tomcat server.)
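To see why that discovery step gets slower as the wiki grows, here is a minimal sketch of how autolinking might work: scan each saved page's text against every known page title. This is an illustration, not Priki's actual code, and the names (`discoverLinks`, `AutolinkSketch`) are made up. The cost grows with pages times titles, which would match save times creeping from 4-5 seconds up to 17.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of autolink discovery: on every save, check the
// page text against every known page title. Cost grows with the number
// of titles, so saves slow down as the wiki gets bigger.
public class AutolinkSketch {

    // Return the titles that occur somewhere in the given page text.
    static List<String> discoverLinks(String pageText, List<String> allTitles) {
        List<String> links = new ArrayList<>();
        String lower = pageText.toLowerCase();
        for (String title : allTitles) {       // one pass per known title
            if (lower.contains(title.toLowerCase())) {
                links.add(title);
            }
        }
        return links;
    }

    public static void main(String[] args) {
        List<String> titles = List.of("Prevayler", "StressTest", "Backlinks");
        List<String> found = discoverLinks(
            "Our StressTest wiki uses Prevayler for persistence.", titles);
        System.out.println(found); // [Prevayler, StressTest]
    }
}
```

Real implementations usually avoid this linear scan with an index or a trie, which is one way the autolink feature could be kept without the scaling cost.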
You have a good product with Priki wiki, but it is not scalable, and
you need to make this clear to users. If you could make it work for
large data sets, you would certainly have a winner, but right now it
has serious limitations.
Please take my report seriously; it would be great to see this wiki
mature in scalability.
Richard
I understand Priki generates the snapshot only when ordered to, right?
So slowness in all the other operations has nothing to do with the
data being saved to a "flat file", does it?
The problem with Priki is probably something else. I believe Vitor can
enlighten us.
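For readers unfamiliar with the prevalence pattern Klaus is referring to: every change is appended to a journal as it happens, while a full snapshot of the in-memory state is written only on request. A toy model (not Priki's or Prevayler's real code; all names here are illustrative):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy model of the prevalence pattern: saves append to a small journal
// immediately; a full snapshot is taken only when explicitly requested.
// Ordinary page saves therefore never rewrite the big snapshot file.
public class PrevalenceToy {
    private final Map<String, String> pages = new HashMap<>(); // prevalent system
    private final List<String> journal = new ArrayList<>();    // stand-in for the journal file
    private String snapshot = "";                              // stand-in for the snapshot file

    void savePage(String title, String body) {
        journal.add("PUT " + title);  // journaled on every save
        pages.put(title, body);
    }

    void takeSnapshot() {
        snapshot = pages.toString();  // full dump, only on demand
        journal.clear();              // journal can restart after a snapshot
    }

    int journalEntries()     { return journal.size(); }
    String currentSnapshot() { return snapshot; }
}
```

Under this model, slow saves would point at work done per transaction (such as autolink discovery, or copying state before applying the change), not at snapshot writing.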
See you, Klaus.
The server encountered an internal error () that prevented it from fulfilling this request.
*exception*
java.lang.RuntimeException: Unable to produce a copy of the prevalent system for trying out transactions before applying them to the real system.
org.prevayler.implementation.publishing.censorship.StrictTransactionCensor.produceNewFoodTaster(StrictTransactionCensor.java:53)
org.prevayler.implementation.publishing.censorship.StrictTransactionCensor.royalFoodTaster(StrictTransactionCensor.java:44)
org.prevayler.implementation.publishing.censorship.StrictTransactionCensor.approve(StrictTransactionCensor.java:28)
org.prevayler.implementation.publishing.CentralPublisher.approve(CentralPublisher.java:72)
org.prevayler.implementation.publishing.CentralPublisher.publishWithoutWorryingAboutNewSubscriptions(CentralPublisher.java:63)
org.prevayler.implementation.publishing.CentralPublisher.publish(CentralPublisher.java:49)
org.prevayler.implementation.PrevaylerImpl.publish(PrevaylerImpl.java:64)
org.prevayler.implementation.PrevaylerImpl.execute(PrevaylerImpl.java:59)
org.priki.service.Prevalence.execute(Prevalence.java:97)
org.priki.actions.PrikiAction.postWikiword(PrikiAction.java:206)
sun.reflect.GeneratedMethodAccessor223.invoke(Unknown Source)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
java.lang.reflect.Method.invoke(Method.java:597)
com.opensymphony.xwork.DefaultActionInvocation.invokeAction(DefaultActionInvocation.java:300)
com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:166)
com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
com.opensymphony.xwork.interceptor.AroundInterceptor.intercept(AroundInterceptor.java:35)
com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
org.priki.interceptor.I18nInterceptor.intercept(I18nInterceptor.java:85)
com.opensymphony.xwork.DefaultActionInvocation.invoke(DefaultActionInvocation.java:164)
com.opensymphony.xwork.DefaultActionProxy.execute(DefaultActionProxy.java:116)
com.opensymphony.webwork.dispatcher.ServletDispatcher.serviceAction(ServletDispatcher.java:272)
com.opensymphony.webwork.dispatcher.ServletDispatcher.service(ServletDispatcher.java:237)
javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
--------------------
org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:350)
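The exception message at the top of the trace is informative: Prevayler's `StrictTransactionCensor` tries each transaction on a copy of the prevalent system before applying it to the real one. Such a copy is typically produced by serializing the whole object graph and reading it back. A JDK-only sketch of that copy technique (an illustration of the general approach, not Prevayler's source):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Serialization-based deep copy, the usual way to "produce a copy" of a
// prevalent system. The entire object graph is buffered in memory, so on
// a large wiki with a 2 GB heap this step is a plausible place to fail.
public class DeepCopySketch {

    @SuppressWarnings("unchecked")
    static <T extends Serializable> T deepCopy(T system) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(system);  // whole object graph written to memory
            }
            try (ObjectInputStream in =
                     new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
                return (T) in.readObject();  // rebuilt as an independent copy
            }
        } catch (Exception e) {
            throw new RuntimeException("Unable to produce a copy", e);
        }
    }

    public static void main(String[] args) {
        java.util.HashMap<String, String> wiki = new java.util.HashMap<>();
        wiki.put("FrontPage", "welcome");
        java.util.HashMap<String, String> copy = deepCopy(wiki);
        copy.put("NewPage", "draft");
        System.out.println(wiki.containsKey("NewPage")); // false: original untouched
    }
}
```

If this is what is failing, the cost is per-transaction and proportional to total wiki size, which would also explain the steadily growing save times Richard reported.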