The navX-Micro can be part of a SLAM (Simultaneous Localization and Mapping) solution. SLAM is a sophisticated technique that fuses information from multiple sensors: cameras or LIDAR, IMUs like the navX-Micro, wheel encoders, and potentially other sensors providing relative measurements (e.g., barometers) or absolute measurements (e.g., GPS).
Note, however, that the navX-Micro addresses only part of the "Localization" (the L in SLAM) side of the problem, so you'll need other sensors plus sophisticated sensor fusion software to implement full SLAM.
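To give a feel for what "sensor fusion" means at the smallest scale, here is a toy complementary filter that blends a gyro's angular rate (smooth short-term, but drifts) with a compass heading (noisy, but absolute). This is just an illustrative sketch with made-up parameter values, not navX or ROS code:

```python
def wrap_degrees(angle):
    """Wrap an angle to the range [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0

def fuse_heading(heading, gyro_rate_dps, compass_heading, dt, alpha=0.98):
    """One filter step: trust the gyro short-term, the compass long-term.

    heading          -- current fused heading estimate, degrees
    gyro_rate_dps    -- gyro angular rate, degrees per second
    compass_heading  -- absolute heading from the compass, degrees
    dt               -- time step in seconds
    alpha            -- blend factor (hypothetical value; tune for your sensors)
    """
    gyro_estimate = heading + gyro_rate_dps * dt        # integrate the gyro
    error = wrap_degrees(compass_heading - gyro_estimate)
    return wrap_degrees(gyro_estimate + (1.0 - alpha) * error)
```

Real fusion stacks (Kalman filters, the ROS robot_localization package) are far more capable, but they follow the same idea: combine a drifting relative sensor with a noisy absolute one.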
As for path planning, which requires knowledge of (a) the external surroundings, (b) robot orientation, and (c) robot position, the navX-Micro can provide the orientation information (b). The yaw/pitch/roll data gives you orientation relative to the starting position. The compass data can additionally give you absolute orientation with respect to the earth's magnetic field - but using a compass on a robot is tricky due to magnetic interference.
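One common pattern is to combine a relative yaw reading with a known starting orientation to get a field-absolute heading. A minimal sketch (the function names and the idea of a known initial heading are mine, not part of any navX API):

```python
def wrap_degrees(angle):
    """Wrap an angle to the range [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0

def absolute_heading(initial_heading_deg, relative_yaw_deg):
    """Turn a relative IMU yaw into an absolute heading.

    initial_heading_deg -- the robot's known heading at startup
                           (e.g., measured by hand or from field geometry)
    relative_yaw_deg    -- yaw reported by the IMU since startup
    """
    return wrap_degrees(initial_heading_deg + relative_yaw_deg)
```

The angle wrap matters: without it, a robot that starts at 170 degrees and turns 20 degrees more would report 190 degrees instead of -170.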
To calculate robot position, you'll need other information, for instance relative motion measurements from robot wheel encoders, or absolute measurements from distance sensors - anything from something as simple as an ultrasonic sensor all the way up to a LIDAR ranging system.
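As a concrete example of combining wheel encoders with IMU-style heading, here is the standard differential-drive dead-reckoning update. It's a simplified sketch (assumes flat ground, no wheel slip; the parameter names are mine):

```python
import math

def odometry_update(x, y, heading_deg, d_left, d_right, track_width):
    """Advance a (x, y, heading) pose estimate by one encoder sample.

    d_left, d_right -- distance traveled by each wheel since the last
                       update (same units as x, y, track_width)
    track_width     -- distance between the left and right wheels
    Returns the new (x, y, heading_deg) tuple.
    """
    d_center = (d_left + d_right) / 2.0                       # forward travel
    d_theta = math.degrees((d_right - d_left) / track_width)  # heading change
    # Use the midpoint heading for a better arc approximation
    theta = math.radians(heading_deg + d_theta / 2.0)
    return (x + d_center * math.cos(theta),
            y + d_center * math.sin(theta),
            heading_deg + d_theta)
```

In practice you would replace the encoder-derived heading change with the IMU's yaw (which is usually far less noisy than differencing encoders), but the position part of the update is the same. Note how errors accumulate with every step - that drift is exactly why absolute sensors (ultrasonic, LIDAR, GPS) are needed to correct the estimate over time.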
So there's a lot to this - it's a fascinating problem, but not simple. The Robot Operating System has a localization package (http://wiki.ros.org/robot_localization) that you might look into for some background on this.
If you can provide more precise information about what you are trying to accomplish, folks on the forum might be able to provide more detailed suggestions.