Maven Specs2 Suite skipped

Marino Borra

Jan 27, 2016, 7:51:11 PM
to specs2-users
Hi all,

When I run a suite of "linked" specifications, the linked specs are skipped. The documentation says to use the "all=true" argument.

But with Maven / JUnitRunner / Surefire, passing the parameter "-Dspecs2.all=true" does not work:

<argLine>-Dspecs2.all=true</argLine>
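
Just for completeness, I believe the same flag could also be passed through Surefire's systemPropertyVariables instead of argLine (assuming specs2 really does read system properties prefixed with "specs2.", as its documentation suggests). A minimal sketch:

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <configuration>
                    <!-- sets specs2.all=true as a system property in the forked test JVM -->
                    <systemPropertyVariables>
                        <specs2.all>true</specs2.all>
                    </systemPropertyVariables>
                </configuration>
            </plugin>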

Thanks in advance
Marino

etorreborre

Jan 28, 2016, 11:47:09 PM
to specs2-users
Hi Marino,

Can you try with just `<argLine>-Dspecs2.all</argLine>`?

What you did should work though, so I need a bit of time to reproduce and diagnose what's wrong. I'm travelling at the moment, so don't expect a quick answer.

Eric.

Marino Borra

Jan 29, 2016, 6:50:29 PM
to specs2-users
Hi Eric, I tried your suggestion, but unfortunately it does not work.

Here is what I do:

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <configuration>
                    <argLine>
                        -Dspecs2.all -XX:-UseSplitVerifier -Xmx1024m -XX:MaxPermSize=256m -javaagent:${settings.localRepository}/org/powermock/powermock-module-javaagent/${org.powermock.version}/powermock-module-javaagent-${org.powermock.version}.jar
                    </argLine>
                    <useSystemClassLoader>true</useSystemClassLoader>
                    <threadCount>10</threadCount>
                    <includes>
                        <include>**/Test*.java</include>
                        <include>**/*Test.java</include>
                        <include>**/*TestCase.java</include>
                        <include>**/*Spec.java</include>
                    </includes>
                </configuration>
            </plugin>

@RunWith( classOf[JUnitRunner] )
class SparkSuiteSpec extends Specification {

    var sc: SparkContext = _

    def is = args( asap = true ) ^ s2"""
      These are all the specifications that require Spark
      ${ "SchemaUtilities child" ~ new SchemaUtilitiesSpec( sc ) }
      """
}

class SchemaUtilitiesSpec( sc: SparkContext ) extends Specification with IsolatedMockFactory with Checkers {

    "Spark SchemaUtilities" should {

        "..." in {
            ok
        }
    }
}

My goal is to have a suite that starts Spark and propagates the SparkContext to the "linked" specs.
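
To make the goal concrete, here is a rough sketch of the shape I'm after (the class name and SparkConf settings are just placeholders, and I'm assuming specs2's AfterAll trait can be used to stop the context once the whole suite, including the linked SchemaUtilitiesSpec above, has run with -Dspecs2.all):

import org.apache.spark.{ SparkConf, SparkContext }
import org.junit.runner.RunWith
import org.specs2.Specification
import org.specs2.runner.JUnitRunner
import org.specs2.specification.AfterAll

@RunWith( classOf[JUnitRunner] )
class SparkSuiteSketch extends Specification with AfterAll {

    // created on first use, so the linked child spec receives a live context
    lazy val sc: SparkContext =
        new SparkContext( new SparkConf().setMaster( "local[2]" ).setAppName( "spark-specs" ) )

    // stop Spark once the whole suite has finished
    def afterAll(): Unit = sc.stop()

    def is = s2"""
      These are all the specifications that require Spark
      ${ "SchemaUtilities child" ~ new SchemaUtilitiesSpec( sc ) }
      """
}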

Thx
Marino

etorreborre

Jan 30, 2016, 2:12:00 AM
to specs2-users
Hi Marino,

This is actually my mistake. The JUnitRunner was not using the `all` option.

You can give version `3.7-20160130063509-4591208` a go and things should work better now.

Eric.

Marino Borra

Jan 31, 2016, 7:21:18 AM
to specs2-users
Hi Eric,

Ok, thanks a lot! But I don't see version 3.7-20160130063509-4591208, sorry.

Marino

etorreborre

Jan 31, 2016, 11:16:46 PM
to specs2-users

Marino Borra

Feb 1, 2016, 6:19:32 AM
to specs2-users
Ok, I have looked for it under specs2-junit_2.10; Scala 2.11 is too recent for my project.

Marino

etorreborre

Feb 2, 2016, 1:22:17 PM
to specs2-users
Hi Marino,

I am sorry, I was travelling and didn't have time to publish a version for 2.10 (that being said, if you are using Spark, I think they recently made 2.11 their default Scala version).

Unfortunately I still don't have proper wifi access (and maybe not for the next 10 days), so I can't publish the jars for 2.10 yet. In the meantime you can clone the specs2 project and run `sbt +publishLocal` to publish a Scala 2.10 version locally.
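
You'd then point the Maven build at the locally published version, along the lines of the sketch below (the version number is only an example of what a local publish might produce; and if Maven doesn't pick up artifacts published to the local Ivy cache, `sbt +publishM2` publishes to ~/.m2 instead):

    <dependency>
        <groupId>org.specs2</groupId>
        <artifactId>specs2-junit_2.10</artifactId>
        <!-- example version: use whatever the local publish actually produces -->
        <version>3.7-SNAPSHOT</version>
        <scope>test</scope>
    </dependency>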

E.

etorreborre

Feb 5, 2016, 5:31:45 AM
to specs2-users

Marino Borra

Feb 7, 2016, 8:50:46 AM
to specs2-users
Hi Eric, I will try it immediately.

Thank you so much.

Marino