Nondeterministic results with PSL?


ebrahim.a...@gmail.com

Apr 22, 2019, 6:30:46 PM
to PSL Users
I have a myriad of rules of the form:

VN_Frame(V,F) & VN_Frame_type(F,'get-13.5.1') & VN_Agent(F,N) -> VN_Type(N,'animate') |VN_Type(N,'organization') | VN_Type(N,'wn_hypernym=organization%1:14:00') | VN_Type(N,'wn_hypernym=organization%1:14:01') | VN_Type(N,'sumo=Human') | VN_Type(N,'sumo=Nation') .

where I'm declaring rules which state that when a verb exists in a certain frame (F) and some noun (N) fills the Agent role in that frame, we can infer that the type of that noun is either 'animate', an 'organization', a 'Human', or a 'Nation' (in this example).

Ignoring the soundness of my rules, or any flaws in the overall logic of my source code, I'm noticing some strange results: the inferred types vary wildly between runs.

For example, on one run I may get results for all the above-mentioned types where 'Human' is weighted with a certainty of, say, 0.9. If I run the inference again, it may drop to 0.07, and 'organization' will be 0.88. No matter how many times I run inference on my data, I always seem to get nondeterministic results.
Not only are the results nondeterministic, but, as stated earlier, they also tend to vary wildly.

In other words, irrespective of the quality or soundness of my logic, I would expect my results to be the same no matter how many times I run it.

Is it possible in any case for PSL to exhibit this kind of behavior? Is it expected under certain circumstances? Any light that you can shed on these baffling results would be greatly appreciated.

Thanks,
Ebrahim

ebrahim.a...@gmail.com

Apr 24, 2019, 5:37:17 PM
to PSL Users

To clarify the line of questioning above: Should I expect identical sets of weights for each prediction inferred when using PSL?

Eriq Augustine

Apr 26, 2019, 10:26:36 PM
to ebrahim.a...@gmail.com, PSL Users
Hey Ebrahim,

Sorry for the late reply.

The behavior you are describing sounds like it could be intended.
PSL does convex inference to make predictions, but it is possible that there are many solutions that all achieve the optimal objective.
The values of the random variables are initialized randomly (as is standard for optimization), so landing on different optimal solutions seems reasonable.

To get more consistent results, you can force PSL to initialize the random variables to a set value (namely zero).
Just set the config option "admmreasoner.initiallocalvalue" to "zero".
This wiki page covers setting config options: https://github.com/linqs/psl/wiki/Configuration
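For example, with the 2.x command-line interface, config options can typically be passed with the -D flag (the model and data file names here are just placeholders; check the wiki page above for the flags your version supports):

java -jar psl-cli.jar --infer --model model.psl --data model.data -D admmreasoner.initiallocalvalue=zero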

However, based on the wide range of values you cite, you may want to consider adding a negative prior instead:
!VN_Type(N, T)
(Priors are covered here if you need it: https://github.com/linqs/psl/wiki/Rule-Specification#priors)
This should really restrict the optimal solution and provide more consistent results (even without setting that config value).
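In the standard weighted-rule syntax, a weighted version of that prior would look something like this (the weight 1.0 and the squared potential are just illustrative choices):

1.0: !VN_Type(N, T) ^2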

-eriq


ebrahim.a...@gmail.com

Apr 29, 2019, 3:00:59 PM
to PSL Users
Hi Eriq,

Thanks for the suggestions; declaring a negative prior up front seems to have helped significantly with the consistency of the inferences.

I actually declared my prior up front as: ~VN_Type(N, T) .

This makes sense too, right?

Eriq Augustine

Apr 29, 2019, 4:00:31 PM
to ebrahim.a...@gmail.com, PSL Users
Yeah, that works as a prior.
("!" and "~" are synonymous.)

-eriq

ebrahim.a...@gmail.com

Apr 29, 2019, 4:11:02 PM
to PSL Users
Would unweighted priors also be reasonable in this context?

Eriq Augustine

Apr 29, 2019, 4:52:11 PM
to ebrahim.a...@gmail.com, PSL Users
No, you should give them a weight.
Priors are usually given a smaller weight than the rest of your rules.
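As a sketch, if the inference rules in a model carry a weight like 20.0, the prior might carry something like 1.0 (both weights, and the simplified rule below, are purely illustrative):

20.0: VN_Frame(V,F) & VN_Agent(F,N) -> VN_Type(N, 'animate') ^2
1.0: ~VN_Type(N, T) ^2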

-eriq

