TL;DR: According to the NAL-4 and NAL-5 rules, it probably would be derived.
On the one hand, from the premise <(X * Y) --> (a * a)>.,
NARS can get <X --> a>. and <Y --> a>. by the compositional (decompositional) rules.
(The definition of products in NAL-4 also supports this.)
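In variable form, the NAL-4 theorems behind this are roughly as follows (stated informally; the exact formulations differ between the book and the implementations):

<<$1 --> $2> <=> <($1 * $3) --> ($2 * $3)>>.
<<$1 --> $2> <=> <($3 * $1) --> ($3 * $2)>>.

which is why a product-to-product inheritance like <(X * Y) --> (a * a)>. can be decomposed componentwise, even though the decomposition itself is usually implemented as a separate structural rule.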
On the other hand,
we have
<<(X * Y) --> (a * a)> <=> <(A * B) --> (b * b)>>.
so NARS can get <(A * B) --> (b * b)>. by NAL-5 detachment.
(Maybe going through a conversion step, analogous to the one in NAL-1, to get the implication <<(X * Y) --> (a * a)> ==> <(A * B) --> (b * b)>> first.)
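As a sketch, the detachment step takes two premises (truth values omitted; the product judgment is the one from the question):

<<(X * Y) --> (a * a)> <=> <(A * B) --> (b * b)>>.
<(X * Y) --> (a * a)>.

and yields <(A * B) --> (b * b)>.; in many implementations this equivalence case is handled directly by the analogy truth function, without building the intermediate implication.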
BTW, just like the derivation of
<X --> Y>. and <Y --> X>.,
NARS can use the premise <(A * B) --> (b * b)>. to get the conclusions <A --> b>. and <B --> b>.,
then use them to compose <(B * A) --> (b * b)>. directly,
which is not actually related to
<<($1 * $2) --> (a * a)> <=> <($2 * $1) --> (a * a)>>.
(Some detailed steps, sketched in Narsese right after this list:
1. begin from the common term b: use <A --> b>. and the b-B relation to compose <(B * A) --> (B * b)>. first,
2. also combine the b-B relation to get <(B * b) --> (b * b)>.,
3. then use NAL-1 deduction to get <(B * A) --> (b * b)>. by eliminating the common term (B * b).
)
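A Narsese sketch of these three steps, assuming the usual NAL-4 structural composition and NAL-1 deduction (truth values omitted):

<A --> b>.
<B --> b>.
<(B * A) --> (B * b)>.
<(B * b) --> (b * b)>.
<(B * A) --> (b * b)>.

Here the third line composes <A --> b>. with the extra term B, the fourth composes <B --> b>. with the extra term b, and the last follows by deduction over the shared middle term (B * b).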
The whole process goes just like this.
(The upper task is derived from the lower tasks; Image 2 shows the backward inference of NARS.)
It can be represented as two groups of Narsese sentences:
<(*,X,Y) --> (*,a,a)>.
<X --> a>?
<Y --> a>?
<A --> b>.
<B --> b>.
<(*,B,A) --> (*,b,b)>?
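Combining the two groups with the equivalence from above gives a single end-to-end input (a sketch; whether one run answers the question depends on the implementation and its inference budget):

<(*,X,Y) --> (*,a,a)>.
<<(*,X,Y) --> (*,a,a)> <=> <(*,A,B) --> (*,b,b)>>.
<(*,B,A) --> (*,b,b)>?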
Also, the processing of NAL in ONA may be much simplified, so perhaps you can try some alternative versions of NARS such as
OpenNARS or
PyNARS...