Update Embedded Maven

Sam Jones

Feb 25, 2016, 12:37:11 PM
to Scala IDE User
I am trying to build the Spark Examples that I downloaded from:
The problem is that I keep getting this error:

[WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion failed with message:

Detected Maven Version: 3.3.3 is not in the allowed range 3.3.9.
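
For reference, that check is issued by the maven-enforcer-plugin's RequireMavenVersion rule configured in the project's pom.xml. A minimal sketch of such a rule, assuming the usual enforcer setup (the exact Spark pom may differ):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-versions</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- fails the build when the Maven that runs it is older than 3.3.9 -->
          <requireMavenVersion>
            <version>3.3.9</version>
          </requireMavenVersion>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>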


I went to Eclipse preferences and then Maven -> Installations. It shows EMBEDDED with version 3.3.3.

At the bottom of that window, it says "Note: Embedded runtime is always used for dependency resolution"

How can I change the embedded Maven to 3.3.9?

Simon Schäfer

Feb 25, 2016, 12:52:09 PM
to scala-i...@googlegroups.com
By adding a new installation? There is an "Add" button in the preferences dialog, where you can select a newer installation that is installed somewhere on your system.
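
A quick sanity check before adding it, assuming the archive was unpacked to ~/apache-maven-3.3.9 (a hypothetical path, adjust to your system):

~/apache-maven-3.3.9/bin/mvn -version

If that reports Apache Maven 3.3.9, that directory is the one to select in the "Add" dialog.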

Sam Jones

Feb 25, 2016, 1:22:49 PM
to Scala IDE User
That's the first thing I did. I have Maven 3.3.9 downloaded; I clicked the Add button and pointed it at that location, but I still get the same error.
The problem seems to be that the embedded runtime is always used for dependency resolution, so I need to update the embedded runtime.

Renato Garcia

Feb 25, 2016, 7:34:13 PM
to scala-i...@googlegroups.com
I think that in order to upgrade the embedded Maven you need to upgrade the m2e plugin; however, 1.6.2 is the latest release and it still bundles Maven 3.3.3.

Having said that, the error you mentioned earlier is issued by a plugin (the maven-enforcer-plugin, per the RequireMavenVersion rule in the warning), so you'd just need to select 3.3.9 to run the build and it would work, unless of course this enforcement in the Spark build is related to some dependency resolution issue.
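
Concretely, running the build with the standalone 3.3.9 instead of the embedded 3.3.3 satisfies the rule, e.g. (the path is an assumption, adjust to where you unpacked it):

~/apache-maven-3.3.9/bin/mvn -DskipTests clean package

As a last resort the version check alone can be bypassed with -Denforcer.skip=true, though that only silences the enforcer rather than fixing the runtime mismatch.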