Hi all,
I recently moved from SparkR 1.5.2 to 1.6.0. While experimenting with SparkR:::newJObject("java.util.HashMap"), I noticed the behaviour has changed: it now returns an "environment" instead of a "jobj":
> print(class(SparkR:::newJObject("java.util.HashMap"))) # SparkR 1.5.2
[1] "jobj"
> print(class(SparkR:::newJObject("java.util.HashMap"))) # SparkR 1.6.0
[1] "environment"
Moreover, the returned environment is apparently empty: calling ls() on it gives character(0). This problem only happens with some Java classes, and I am not able to say exactly which classes trigger it.
If I create an instance of another class such as java.util.BitSet, it works fine. I thought it might be related to parameterized (generic) types, but it works correctly with ArrayList and with HashSet, which both take a type parameter.
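For reference, here is the minimal script I use to compare the two cases (it assumes an initialized SparkR backend via sparkR.init(); the class names are just the examples above, and which ones misbehave may vary):

```r
library(SparkR)
sc <- sparkR.init()  # start the backend so newJObject can reach the JVM

# HashMap comes back as a plain R environment under 1.6.0 in my tests...
print(class(SparkR:::newJObject("java.util.HashMap")))

# ...while BitSet still comes back as a "jobj"
print(class(SparkR:::newJObject("java.util.BitSet")))

# ArrayList and HashSet, which are also generic types, behave like BitSet
print(class(SparkR:::newJObject("java.util.ArrayList")))
print(class(SparkR:::newJObject("java.util.HashSet")))
```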
Any suggestions on this change of behaviour (apart from "do not use private functions" :-) )?
Thank you very much