Symbols default to `commutative=True`, but I wonder if we should also default to `finite=True`, since that would be consistent with `x - x` evaluating to 0. (If `x` is infinite then `x - x` is `nan`.) In the discussion here I point out how simplifying `x - x -> 0` without the necessary assumption (that `x` is finite) is inconsistent with leaving `x**2 < 0` unevaluated. The inequality remains unevaluated because the comparison would be invalid if `x` had both real and imaginary parts, e.g. `x = 1 + I` gives `x**2 = 2*I`, and `2*I < 0` is not a valid comparison.
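To make the inconsistency concrete, here is a minimal sketch (assuming a reasonably recent SymPy; the inline comments show the output I would expect):

```python
# Sketch of the inconsistency: x - x auto-simplifies as if x were finite,
# while x**2 < 0 is left unevaluated in case x is not real.
from sympy import Symbol, I, oo

x = Symbol('x')               # plain symbol: only commutative=True is assumed

print(x - x)                  # 0 -- silently assumes x is finite
print(oo - oo)                # nan -- that simplification is wrong for infinite x

print(x**2 < 0)               # x**2 < 0 -- unevaluated, since x might be non-real
print(((1 + I)**2).expand())  # 2*I -- e.g. x = 1 + I would give the invalid 2*I < 0
```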
I suspect that this would have minimal impact on current behavior; making `real=True` a default assumption as well would, I suspect, have a greater impact. Without the latter, all inequalities would continue to be unevaluated for plain/vanilla symbols because they would have to entertain the possibility of an invalid result, e.g. `Eq(1/x, 0)` would still not evaluate to False (even with `x` known to be finite) when `x` is not known to be real or imaginary.
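A rough way to compare the two proposals is to spell the candidate defaults out as explicit assumptions on separate symbols. This is only a sketch; in particular, whether `Eq(1/x, 0)` resolves may depend on the assumptions and on the SymPy version, so the loop below just prints what each case gives:

```python
# Compare a vanilla symbol with symbols carrying the proposed default assumptions.
from sympy import Symbol, Eq

v = Symbol('v')                # current default (plain/vanilla)
f = Symbol('f', finite=True)   # what a finite=True default would give
r = Symbol('r', real=True)     # what a real=True default would give (implies finite)

# Inequalities: unevaluated for the vanilla symbol, resolvable with real=True.
print(v**2 < 0)                # v**2 < 0 -- unevaluated
print(r**2 < 0)                # False

# Equalities: see how much each assumption actually buys for Eq(1/x, 0).
for s in (v, f, r):
    print(s, s.assumptions0.get('finite'), s.assumptions0.get('real'), Eq(1/s, 0))
```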
/c