ERROR: TBC - !MESSAGE General exception in (= (<http://spinrdf.org/spr#rowCount> ?arg1) 0)


Tim Smith

unread,
Aug 24, 2020, 7:17:14 PM
to topbrai...@googlegroups.com
Hi,

I am attempting to write graphs to disk using an SWP.  Unfortunately, while I cannot reproduce it on demand, the actual writing of the triples to a file only succeeds about 50% of the time.  When it fails, TBC chews up my CPUs and consumes memory until it ultimately hits the heap limit.  The closest thing to a stack trace I've been able to get is the error pasted below; I recovered it from a log last week, and it occurred around the same time as the problem.  When this problem occurs, I have to kill TBC, as it never stops and is not interruptible.

The SWP that does the writing is below.  It calls a number of other SWPs and then attempts to do four things:

1.  Dump ui:tempGraphWorking using ui:dumpGraph
2.  Attempt to write ui:tempGraphTags to a file.  It sometimes hangs here.
3.  Attempt to clear ui:tempGraphWorking
4.  Attempt to write ui:tempGraphWorking to a file.  It hangs here most often.

Each ui:update is wrapped in a transaction.

I'm at a loss as to why this is happening.  Not being able to consistently execute these SWPs is putting my project in jeopardy.

Any help is greatly appreciated!

Tim



<ui:group let:inputFileGraph="{= IRI(?inputFile) }" let:outputFileGraph="{= IRI(?outputGraph) }">
    <!-- Load the input file into a graph -->
    <h3>The input file is {= ?inputFile }  Graph: {= ?inputFileGraph }</h3>
    <h3>The output graph is {= ?outputGraph }  Graph: {= ?outputFileGraph }</h3>{= smf:trace("Loading the input graph...") }<ui:setContext ui:queryGraph="{= ui:graphWithImports(?inputFileGraph) }">
        <l5x:ProcessController_Programs/>
        <l5x:ProcessAOIs/>
        <l5x:ProcessBlocks/>
        <l5x:ProcessICons/>
        <l5x:ProcessIRefs/>
        <l5x:ProcessOCons/>
        <l5x:ProcessORefs/>
        <l5x:ProcessWires/>
        <l5x:ProcessTags/>
    </ui:setContext>
    <ui:dumpGraph ui:filePath="/data.example.com/Temp/DATA_Line_Connections-temp_output.ttl" ui:graph="ui:tempGraphWorking"/>
    <!-- Try to insert into the output graph -->OUTPUT Graph: {= ?outputFileGraph }<ui:setContext ui:queryGraph="&lt;http://data.example.com/Instances/Global_Tags&gt;">
        {= smf:trace("Writing tag graph... ") }<ui:transaction>
            <ui:update ui:updateQuery="{!
                    INSERT {
                        ?s ?p ?o .
                    }
                    WHERE {
                        GRAPH ui:tempGraphTags {
                            ?s ?p ?o .
                        } .
                    } }"/>
        </ui:transaction>
    </ui:setContext>
    <ui:setContext ui:queryGraph="&lt;http://data.example.com/instances/Line_Connections&gt;">
        {= smf:trace("Clearing output graph... ") }<ui:transaction>
            <ui:update ui:updateQuery="{!
                    DELETE {
                        ?s ?p ?o .
                    }
                    WHERE {
                        BIND (IRI(&lt;http://data.example.com/instances/Line_Connections&gt;) AS ?graph) .
                        GRAPH ?graph {
                            ?s ?p ?o .
                            FILTER NOT EXISTS {
                                ?graph ?p ?o .
                            } .
                        } .
                    } }"/>
        </ui:transaction>
        <ui:transaction>{= smf:trace("Writing output graph... ") }<ui:update ui:updateQuery="{!
                    INSERT {
                        ?s ?p ?o .
                    }
                    WHERE {
                        GRAPH ui:tempGraphWorking {
                            ?s ?p ?o .
                        } .
                    } }"/>
        </ui:transaction>
    </ui:setContext>
    <h2> Finished!!! </h2>{= smf:trace("Finished!") }</ui:group>


!ENTRY org.apache.jena.sparql.engine.iterator.QueryIterFilterExpr 2 0 2020-08-21 15:49:47.958
!MESSAGE General exception in (= (<http://spinrdf.org/spr#rowCount> ?arg1) 0)
!STACK 0
java.lang.NullPointerException
at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:936)
at org.topbraid.spin.swp.internal.sprn.SPRNTables.getTable(SPRNTables.java:92)
at org.topbraid.spin.swp.internal.sprn.RowCountFunction.exec(RowCountFunction.java:19)
at org.topbraid.jenax.functions.AbstractFunction1.exec(AbstractFunction1.java:35)
at org.topbraid.jenax.functions.AbstractFunction.exec(AbstractFunction.java:110)
at org.apache.jena.sparql.expr.E_Function.evalSpecial(E_Function.java:89)
at org.apache.jena.sparql.expr.ExprFunctionN.eval(ExprFunctionN.java:100)
at org.apache.jena.sparql.expr.ExprNode.eval(ExprNode.java:93)
at org.apache.jena.sparql.expr.ExprFunction2.eval(ExprFunction2.java:76)
at org.apache.jena.sparql.expr.ExprNode.isSatisfied(ExprNode.java:41)
at org.apache.jena.sparql.engine.iterator.QueryIterFilterExpr.accept(QueryIterFilterExpr.java:49)
at org.apache.jena.sparql.engine.iterator.QueryIterProcessBinding.hasNextBinding(QueryIterProcessBinding.java:69)
at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:114)
at org.apache.jena.sparql.engine.iterator.QueryIteratorWrapper.hasNextBinding(QueryIteratorWrapper.java:38)
at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:114)
at org.apache.jena.sparql.engine.iterator.QueryIteratorWrapper.hasNextBinding(QueryIteratorWrapper.java:38)
at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.nextBinding(QueryIteratorBase.java:153)
at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.next(QueryIteratorBase.java:131)
at org.apache.jena.sparql.engine.iterator.QueryIteratorBase.next(QueryIteratorBase.java:40)
at org.apache.jena.sparql.engine.QueryExecutionBase.execAsk(QueryExecutionBase.java:365)
at org.topbraid.sparql.LockOptimizedQueryExecution.execAsk(LockOptimizedQueryExecution.java:200)
at org.topbraid.spin.arq.SPINARQFunction.executeBody(SPINARQFunction.java:281)
at org.topbraid.spin.arq.SPINARQFunction.exec(SPINARQFunction.java:258)
at org.apache.jena.sparql.expr.E_Function.evalSpecial(E_Function.java:89)
at org.apache.jena.sparql.expr.ExprFunctionN.eval(ExprFunctionN.java:100)
at org.apache.jena.sparql.expr.ExprNode.eval(ExprNode.java:93)
at org.apache.jena.sparql.expr.ExprFunction1.eval(ExprFunction1.java:68)
at org.topbraid.spin.swp.engine.expressionhandlers.DefaultExpressionHandler.evaluate(DefaultExpressionHandler.java:67)
at org.topbraid.spin.swp.engine.SWPEngine.evaluateArguments(SWPEngine.java:1013)
at org.topbraid.spin.swp.engine.SWPEngine.evaluateArguments(SWPEngine.java:956)
at org.topbraid.spin.swp.engine.SWPEngine.createNodes(SWPEngine.java:703)
at org.topbraid.spin.swp.engine.SWPEngine.createNodes(SWPEngine.java:688)
at org.topbraid.spin.swp.engine.SWPEngine.addChildNodes(SWPEngine.java:417)
at org.topbraid.spin.swp.engine.SWPEngine.addChildNodes(SWPEngine.java:395)
at org.topbraid.spin.swp.engine.control.internal.GroupControlElement.run(GroupControlElement.java:20)
at org.topbraid.spin.swp.engine.SWPEngine.createNodes(SWPEngine.java:731)
at org.topbraid.spin.swp.engine.SWPEngine.createNodesFromPrototype(SWPEngine.java:821)
at org.topbraid.spin.swp.engine.SWPEngine.createNodes(SWPEngine.java:724)
at org.topbraid.spin.swp.engine.SWPEngine.createNodes(SWPEngine.java:688)
at org.topbraid.spin.swp.engine.SWPEngine.run(SWPEngine.java:1473)
at org.topbraid.spin.swp.engine.XMLEngine.run(XMLEngine.java:106)
at org.topbraid.spin.swp.engine.XMLEngine.runService(XMLEngine.java:202)
at org.topbraid.team.system.AutoTransitionsManager.checkAutoTransitions(AutoTransitionsManager.java:50)
at org.topbraid.team.system.AutoTransitionsManager.access$0(AutoTransitionsManager.java:46)
at org.topbraid.team.system.AutoTransitionsManager$1.run(AutoTransitionsManager.java:36)
at java.lang.Thread.run(Thread.java:748)

Holger Knublauch

unread,
Aug 24, 2020, 7:36:07 PM
to topbrai...@googlegroups.com

Hi Tim,

Given that it sounds urgent, let's see if we can solve this together; here is some more background info.

The stack trace is caused by a background thread that calls the SWP script teamwork:CheckAutoTransitionsService periodically. This is for workflows that carry a teamwork:autoTransitionHours property in a transition. From our standard examples, I see that teamwork:ReviewRequiredwithEscalationWorkflowTemplate uses this property.

I am not sure yet whether there is a bug in the handling of this, but to minimize the risk, you could try deactivating the feature. One way to achieve this would be to remove any workflow that uses teamwork:autoTransitionHours. Another would be to add a ui:override of teamwork:CheckAutoTransitionsService that makes it a no-op, e.g. <ui:group />. I have attached a file that does the latter, with the effect that auto-transitions would no longer be checked. It would help us narrow down the issue if you could place that file in your workspace and observe whether the problem still happens.

If the problem disappears, then you have a patch and can at least continue while we investigate on our side why this script causes issues. If the problem remains, then we can treat this particular error as a red herring and will need to investigate your specific script, which could become a matter of getting hold of a complete example that we can run ourselves.

Sounds good?

Holger

avoidAutoTransitions.ui.ttlx

Irene Polikoff

unread,
Aug 24, 2020, 10:16:07 PM
to topbrai...@googlegroups.com
Tim,

This is orthogonal to your issue, but I wanted to make sure that you know about export to S3. You can place one or more asset collections into a basket and then select “Export to S3”. This will create a file for each selected collection in an S3 bucket. Optionally, these files can be zipped.

You do need to have an S3 bucket configured. To learn more, go to https://doc.topquadrant.com/6.4/basket/#Export_to_S3.
