Welcome to the group!
It's smart of you to think about this before getting started.
Depending on how your site is structured, there are a few different
ways to block the duplicate content pages from crawlers. I would look
into these help docs on robots.txt (http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40360) and on robots meta tags.
Also, search through the group archives a while and see what solutions other
webmasters have come up with.
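
As a quick sketch of the robots.txt approach: assuming your duplicate pages all live under one directory, say /printable/ (a hypothetical path for illustration), a rule like this would keep compliant crawlers out of it:

```
User-agent: *
Disallow: /printable/
```

If the duplicates aren't grouped under a single path, the robots meta tag route works page by page instead; adding this to the <head> of each duplicate page keeps it out of the index while still letting crawlers follow its links:

```
<meta name="robots" content="noindex, follow">
```
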
Hope this helps,
On Sep 5, 9:18 am, Dino P wrote: