I don't know how else that can be done.
But I think you had better figure out why it cannot find all the URLs
on your site. Google and other search engine robots will be unable to
discover your URLs through crawling either, and that's much more
important than the sitemap. The sitemap can help somewhat, but with no
incoming links to them, such URLs are orphans and will at best end up
in the omitted list.
Perhaps you are switching back and forth between www and non-www URLs
in your navigation.
Perhaps you have some JavaScript- or Flash-based navigation with no
alternative HTML navigation.
Perhaps you have disallowed certain pages in robots.txt, which cuts off
access to other pages.
Perhaps you have subdomains or folders with their own entry points (so
they are like separate websites) that need to be crawled separately.
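If you suspect the robots.txt case, you can check which pages a crawler is allowed to fetch with Python's standard urllib.robotparser. This is just a quick sketch; the rules and the example.com URLs here are hypothetical, so substitute your own robots.txt and page URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: /private/ is disallowed for all bots.
# In practice, use rp.set_url("http://yoursite.com/robots.txt"); rp.read()
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# A blocked page: crawlers can't reach it, or any pages linked only from it
print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
# An unblocked page
print(rp.can_fetch("Googlebot", "http://example.com/public/page.html"))   # True
```

Note that if the only links to some section of your site pass through a disallowed page, everything behind it becomes undiscoverable even though those pages themselves aren't disallowed.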
On Feb 16, 6:33 am, "darrenma1...@gmail.com" <darrenma1...@gmail.com> wrote: