As Jim said, one needs the files to see exactly what is going on, but this may give you a bit of perspective. One case I ran into was a model that converged in Linux but not in Windows. It turned out that the default floating-point round-off option was different for the two operating systems.
This made no substantial difference until about 200-300 function evaluations, at which point the
numbers had changed enough that the function minimizer took a different branch in the code
and went off in a different direction. One run found a good local minimum and the other got
stuck in a flat region.
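To see how small round-off differences can snowball into a different search path, here is a minimal, purely illustrative Python sketch (not the ADMB minimizer itself): the same sum accumulated in two different orders stands in for two rounding modes, and a strict comparison on such values is the kind of branch that can flip.

# Illustrative only: accumulating the same terms in a different order,
# standing in for a different round-off setting, can change the last
# digit or two of the result.
import random

random.seed(12345)
terms = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

f_a = sum(terms)             # "OS A" accumulation order
f_b = sum(reversed(terms))   # "OS B" accumulation order

print(f"f_a = {f_a:.17g}")
print(f"f_b = {f_b:.17g}")
print(f"difference = {f_a - f_b:.3e}")

# Once two runs' function values straddle a threshold, a branch such as a
# line-search acceptance test (f_new < f_best) goes different ways, and
# from then on the two searches follow different paths.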
So the first thing to do is to save all of the function evaluations for your model (to 16-17 significant figures)
and see if this is what is going on. If so, it does not indicate that the Mac is superior; it is just the luck of the draw. I was using Linux on the PC, but you can compare Windows the same way.
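A minimal sketch of that comparison, assuming each run writes one objective-function value per line to a plain text file at full precision (the file names and log format here are hypothetical, not something ADMB produces by default):

def read_values(path):
    """Read one floating-point value per line, skipping blank lines."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def first_divergence(a, b, rel_tol=1e-12):
    """Index of the first pair of values whose relative difference exceeds
    rel_tol, or None if the common prefix of the two runs agrees."""
    for i, (x, y) in enumerate(zip(a, b)):
        if abs(x - y) / max(abs(x), abs(y), 1.0) > rel_tol:
            return i
    return None

mac_vals = read_values("fevals_mac.txt")   # hypothetical log from the Mac run
pc_vals  = read_values("fevals_pc.txt")    # hypothetical log from the PC run

i = first_divergence(mac_vals, pc_vals)
if i is None:
    n = min(len(mac_vals), len(pc_vals))
    print(f"Function values agree over the common prefix ({n} evaluations).")
else:
    print(f"First divergence at evaluation {i}: "
          f"{mac_vals[i]:.17g} vs {pc_vals[i]:.17g}")

The evaluation index where the two logs first part ways tells you whether the runs were identical up to round-off for a while (as in my case) or differed from the very start.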
I also had a simple model that converged with the Excel Solver but not with ADMB. Same thing, I decided:
just the luck of the minimization path chosen.
Of course, there could also be a bug in the code. You should take the converged Mac solution
and verify that the PC version gives the same function value at those parameter values.
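A minimal sketch for that final check, assuming you already have the two reported objective values in hand (how you re-evaluate the model on the PC at the Mac estimates depends on your setup); the 12-significant-figure tolerance is just a rough rule of thumb:

# Compare two reported objective-function values, e.g. the converged Mac
# value and the value the PC build returns at the same parameter vector.
# Usage example (values are supplied on the command line):
#   python compare_objective.py 1234.56789012345 1234.56789012349
import sys

def rel_diff(a: float, b: float) -> float:
    """Relative difference, guarded against division by zero."""
    return abs(a - b) / max(abs(a), abs(b), 1.0)

if __name__ == "__main__":
    mac_obj, pc_obj = map(float, sys.argv[1:3])
    d = rel_diff(mac_obj, pc_obj)
    if d < 1e-12:
        print(f"Values agree to roughly 12+ significant figures (rel. diff {d:.2e}); "
              "the difference is probably just the minimization path.")
    else:
        print(f"Values disagree (rel. diff {d:.2e}); worth suspecting a code or build problem.")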