
how to migrate file share to new server


Rich

Sep 12, 2008, 4:46:08 PM
I currently have a Win 2K server with about 300GB of data in a file share that I want to migrate to a better server running Win 2K3 at a remote location. What I think would be best is to set up constant replication/syncing of the data from my current server to the new one, so any changes made on the old server would be reflected on the new one. Then one evening, after everyone is done with the old server, I would stop that file share and point everyone's shortcuts to the new share. How could I best accomplish this? Another option would be to have it sync both ways so I can slowly move users over one by one to the new server until they are all on it, then stop the file share on the old server. Thanks in advance for the help.

Danny Sanders

Sep 12, 2008, 5:28:12 PM
Look into robocopy. Something like:

robocopy d:\folder e:\folder /e /sec /w:3 /r:3 /log:folder.log

You can schedule it as a .bat file. Use the log to determine if any files
are not copied.
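A minimal .bat along these lines, with placeholder server names and log path (not from the thread), might look like:

```bat
@echo off
rem /e copies subfolders including empty ones; /sec copies NTFS security
rem along with the data; short /r and /w values keep locked files from
rem stalling the whole run. /log+ appends, so repeated runs share one log.
robocopy \\oldserver\share \\newserver\share /e /sec /r:3 /w:3 /log+:C:\logs\migrate.log

rem Robocopy exit codes of 8 and above indicate real failures; lower
rem values are informational (files copied, extras found, etc.).
if %ERRORLEVEL% GEQ 8 echo Errors occurred - check C:\logs\migrate.log
```

Scheduled as a task, the appended log makes it easy to spot files that never copy.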


hth
DDS

"Rich" <Ri...@discussions.microsoft.com> wrote in message
news:D3B68B41-F64A-432A...@microsoft.com...

DaveMills

Sep 12, 2008, 6:58:05 PM
On Fri, 12 Sep 2008 15:28:12 -0600, "Danny Sanders" <DSan...@NOSPAMciber.com>
wrote:

>Look into robocopy. Something like robocopy d:\folder e:\folder /e /sec /w:3
>r:3 /log:folder.
>You can schedule it as a .bat file. Use the log to determine if any files
>are not copied.

You can make it respond to changes in the source folder to trigger a new sync
too, so when the source is updated it will re-run the copy, adding changed
files.

Another way would be to use DFSR and set up replication between the two shares,
but not allow users to use the link to the new share until replication has
completed. Then enable the new link. If you are removing the old server, disable
that link. Finally, when everyone has switched to the new link, remove the
replication group and wait for the message in the event log that replication has
been removed before deleting the old copy of the data. If you jump the gun on
this, DFSR can notice the delete and sync that delete with the new copy: poof,
no files. I always take a backup of the data before deleting the old copy; once
bitten, twice shy <grin>
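The cutover evening described above could be scripted roughly as follows; the share and path names are illustrative only, and the last step has to run on the new server:

```bat
@echo off
rem Take the old share offline so nobody writes during the final pass.
net share data /delete

rem One last robocopy pass to catch changes made since the previous sync.
robocopy d:\data \\newserver\d$\data /e /sec /r:3 /w:3 /log:C:\logs\final.log

rem On the new server: publish the share under the same name.
net share data=d:\data
```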



>
>
>hth
>DDS
>
>"Rich" <Ri...@discussions.microsoft.com> wrote in message
>news:D3B68B41-F64A-432A...@microsoft.com...
>>I currently have a Win 2K server with about 300GB of data in a file share,
>> that i want to migrate to a better server running Win 2K3 at a remote
>> location. What i think would be best would be if i could set up a
>> constant
>> replication/syncing of the data from my current server to the old one, so
>> any
>> changes made on old would be reflected on new. Then one evening after
>> everyone is done with the old server, i would stop that file share, and
>> point
>> everyone's shortcuts to the new share. how could i best accomplish this?
>> or
>> another option would be to have it sync both ways and i can just slowly
>> move
>> users over one by one to the new server until they are all on it, then
>> stop
>> the file share on the old server. thanks in advance for the help.
>

--
Dave Mills
There are 10 types of people, those that understand binary and those that don't.

Rich

Sep 14, 2008, 8:46:01 PM
Thanks to you both, Robocopy definitely sounds like the way to go.

Rich

Sep 15, 2008, 9:14:10 AM
Since I have about 340GB of data, it would be nice to do one big copy over
the weekend when nobody is using the server, then throughout the week have it
keep updating whenever anything is changed on the current server. How do you
set it up like that?

Danny Sanders

Sep 15, 2008, 10:25:58 AM
Stop the share on Friday evening and perform the robocopy over the weekend.
Restart the share Monday morning and run robocopy again with the /MON or /MOT
switch.
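A sketch of that Monday-morning monitor job (server names and log path are placeholders; see the robocopy documentation for the exact change-count semantics of /mon):

```bat
rem /mon:1 re-runs the copy when changes are detected in the source;
rem /mot:5 waits at least 5 minutes between passes. The process stays
rem resident until you stop it, so leave it running in its own console
rem or as a scheduled task.
robocopy \\oldserver\share \\newserver\share /e /sec /r:3 /w:3 /mon:1 /mot:5 /log+:C:\logs\sync.log
```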

hth
DDS

"Rich" <Ri...@discussions.microsoft.com> wrote in message

news:395A1372-CDD8-4D92...@microsoft.com...

Rich

Sep 15, 2008, 11:01:02 AM
Thanks. So do I only need to run /MON once and it will just keep
constantly checking and syncing to the new server?

Danny Sanders

Sep 15, 2008, 12:58:03 PM
> Thanks. So do I only need to run /MON once and it will just keep
> constantly checking and syncing to the new server?


I haven't used that switch, so I can't tell you what to expect. It should be
easy enough to create two folders, dump some Word files in one, and test.
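That test could be as simple as two throwaway local folders (the names here are arbitrary):

```bat
rem Sandbox for the monitor switches: source, destination, one test file.
mkdir C:\robotest\src C:\robotest\dst
echo hello > C:\robotest\src\a.txt

rem Leave this running in one console window...
robocopy C:\robotest\src C:\robotest\dst /e /mon:1 /mot:1

rem ...then add or edit files under src from a second window and watch
rem them appear in dst after the next pass.
```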

hth
DDS

"Rich" <Ri...@discussions.microsoft.com> wrote in message

news:1F142AE5-88BA-4474...@microsoft.com...

DaveMills

Sep 15, 2008, 5:40:11 PM
I agree. I cannot remember the switches I used, but I tested it on a folder like
"Home Folders", at the user level. Then I made changes and watched what
happened. It only took a few tests to see which switches worked best, then I
moved up a level and did the real job. I left it running for a few days while I
got up to date.

Another nice feature of Robocopy is that if you run into errors (e.g. a user
has denied Admin any access), it pops up an error but keeps retrying (2^16
times). So you simply open a new Explorer window, fix the permissions, and
Robocopy just continues.

On Mon, 15 Sep 2008 10:58:03 -0600, "Danny Sanders" <DSan...@NOSPAMciber.com>
wrote:

There are 10 types of people, those that understand binary and those that don't.

Mike

Sep 30, 2008, 2:55:02 PM
Rich - I'm in the midst of the exact same type of project, with about 340GB of
data that I am moving to a clustered file server. I'm using ROBOCOPY to make
the first pass of the data, then I plan to re-run the ROBOCOPY script daily to
pick up any changes. Then one weekend I plan to stop the share, run ROBOCOPY
one last time, point the DFS target to the new location, and disable the
referral to the old server.

This is what my ROBOCOPY script looks like:

robocopy {source} {destination} /copyall /e /zb /MIR /log:share_data.log
/np /tee /r:2 /w:1 /ndl
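The daily re-run can be scheduled from the command line too; the task name, script path, and time below are examples (and the /st time format varies slightly between Windows versions):

```bat
schtasks /create /tn "ShareSync" /tr "C:\scripts\sharesync.bat" /sc daily /st 22:00:00
```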

Kevin Winter

Sep 6, 2010, 1:20:27 PM
I tried robocopy, but somehow it stops working and fails because of a very deep folder structure. Are there any alternatives? I have already downloaded the newest version available from Microsoft.
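One common cause of that kind of failure is paths past the ~260-character limit (older robocopy builds, or runs with the /256 switch, don't handle very long paths). A workaround that sometimes helps is shortening the path robocopy sees with subst; the drive letter and paths here are examples only:

```bat
rem Map a drive letter partway down the deep tree, copy from there,
rem then release the mapping.
subst X: "D:\share\very\deep\subtree"
robocopy X:\ \\newserver\share\very\deep\subtree /e /sec /log:deep.log
subst X: /d
```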

Kevin



Pedro Sanchez

Sep 7, 2010, 2:19:01 PM
We had the same problem you are describing in our organisation. In the end we used a commercial product called CopyRight2 from Sys-Manage. You can download a trial version here http://www.sys-manage.com/PRODUCTS/CopyRight/tabid/64/Default.aspx and see if it solves your problem too.

Regards
Pedro Sanchez

