How to use RoboCopy and RichCopy to migrate terabytes of data

When I tried using RoboCopy for my data migration tasks, I found that it would take way too long, since the transfer speed was quite poor.

Looking for alternatives, I found Microsoft RichCopy, which did the trick.

But RichCopy comes up with errors like "file not found" and "directory not found", and I have a hard time trusting its output.

So instead I used RoboCopy afterwards to verify the data. RoboCopy will only copy changes or files that RichCopy missed, and even when RoboCopy says "modified" in its output, that does not mean it copies all the data again; it only copies the changes. I used this command, which caught the missing links from RichCopy:

robocopy <source> <destination> /XO /ZB /E /FFT /DCOPY:T /COPYALL /R:1 /W:1 /V /NDL /MT:20 /LOG:<location&name>.log /TEE
/XO – Excludes files from <source> that are older than the <destination> files.
/ZB – Uses Restart mode. If access is denied, this option uses Backup mode.
/E – Copies subdirectories, including empty directories.
/DCOPY:T – Copies directory time stamps.
/COPYALL – Copies all security information on files and folders.
/R: – Specifies the number of retries on failed copies. The default value is 1,000,000 (one million retries).
/W: – Specifies the wait time between retries, in seconds. The default value of N is 30 (wait time 30 seconds).
/V – Verbose output showing skipped files
/NDL – Specifies that directory names are not to be logged.
/MT – Creates multi-threaded copies with N threads. N must be an integer between 1 and 128. The default value for N is 8.
/LOG:<LogFile> – Writes the status output to the log file (overwrites the existing log file).
/TEE – Writes the status output to the console window, as well as to the log file.
/FFT – If the two servers have different file systems, files can be re-copied even though nothing has changed. The /FFT switch makes the modified-time comparison more tolerant (2-second granularity).
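If you script this verification pass, note that robocopy does not use the usual zero-means-success convention: its exit code is a bit mask. A minimal PowerShell sketch (the source, destination and log paths below are placeholders, not from the original post):

```powershell
# Placeholder paths - substitute your own source, destination and log file.
robocopy D:\Data \\fileserver\Data /XO /ZB /E /FFT /DCOPY:T /COPYALL /R:1 /W:1 /V /NDL /MT:20 /LOG:C:\Logs\verify.log /TEE

# Robocopy exit codes are bit flags: 0 = nothing to copy, 1 = files copied,
# 2 = extra files detected, 4 = mismatches, 8 and above = failed copies.
if ($LASTEXITCODE -ge 8) {
    Write-Warning "Robocopy reported failed copies - review C:\Logs\verify.log"
}
```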

This way I increased the copy speed, and I'm confident that the data is intact after the migration.

Use this info at your own risk.


Hotfix and Update List for Windows Server 2012 and R2

Look here to find recommended updates regarding Hyper-V and Windows Server 2012 and R2 installations.

Windows Server 2012 R2

Hotfix and update list for Windows Server 2012 R2 failover clusters

Update list for Windows Server 2012 R2


Windows Server 2012

Hotfix and update list for Windows Server 2012 Failover clusters

Update list for Windows Server 2012

How to install SCCM 2012 prerequisites the easy way

Very simple: to install all the required components for a simple, easy and quick installation, start PowerShell as administrator and run these commands:

Import-Module ServerManager
Install-WindowsFeature Web-Windows-Auth
Install-WindowsFeature Web-ISAPI-Ext
Install-WindowsFeature Web-Metabase
Install-WindowsFeature Web-WMI
Install-WindowsFeature BITS
Install-WindowsFeature RDC
Install-WindowsFeature NET-Framework-Features
Install-WindowsFeature Web-Asp-Net
Install-WindowsFeature Web-Asp-Net45
Install-WindowsFeature NET-HTTP-Activation
Install-WindowsFeature NET-Non-HTTP-Activ
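Since Install-WindowsFeature accepts several feature names at once, the eleven calls above can also be collapsed into one (same feature names, just combined into a single command):

```powershell
# One call instead of eleven - Install-WindowsFeature takes a list of names.
Install-WindowsFeature Web-Windows-Auth, Web-ISAPI-Ext, Web-Metabase, Web-WMI,
    BITS, RDC, NET-Framework-Features, Web-Asp-Net, Web-Asp-Net45,
    NET-HTTP-Activation, NET-Non-HTTP-Activ
```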



If you want to install SQL Server 2012 SP1 in one line for your lab, use this as an example:

E:\setup.exe /ACTION=Install /IACCEPTSQLSERVERLICENSETERMS /UpdateEnabled=1 /FEATURES=SQLEngine,RS,SSMS /installshareddir="C:\Program Files\SQL" /InstallsharedwowDir="C:\Program Files (x86)\SQL" /instancedir="C:\Program Files\SQL" /Instancename=MSSQLSERVER /q /AGTSVCACCOUNT="NT Authority\System" /INSTALLSQLDATADIR="C:\Program Files\SQL" /SQLBACKUPDIR="C:\Program Files\SQL\Backup" /SQLCOLLATION=SQL_Latin1_General_CP1_CI_AS /SQLSYSADMINACCOUNTS="\domain admins" "\ml-admin" /SQLSVCACCOUNT="NT Authority\System" /SQLTEMPDBDIR="C:\MSSQLSERVER\TEMPDB" /SQLTEMPDBLOGDIR="C:\MSSQLSERVER\LOGS" /SQLUSERDBDIR="C:\MSSQLSERVER\USERDB" /SQLUSERDBLOGDIR="C:\MSSQLSERVER\Logs" /RSSVCACCOUNT="NT AUTHORITY\SYSTEM" /INDICATEPROGRESS

You can find all the commands at this site:

Have a nice day 🙂

How to convert a WIM or ISO file to VHDX or VHD – Easy

Quite often you will need to convert a WIM or ISO file to a VHDX or VHD file.

This mostly happens when doing operating system deployments.

Today I was preparing for the PowerShell Deployment Toolkit and ran into a fantastic PowerShell tool that can convert Windows Server 2008 R2, 2012 and 2012 R2, and Windows 7, 8 and 8.1 .ISO files directly to .VHD or .VHDX.

I can recommend trying out WIM2VHD for Windows 8 and 8.1; it worked perfectly for me and saved me a lot of time.
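For reference, a run of the Convert-WindowsImage script (the Windows 8/8.1 successor to WIM2VHD) looks roughly like this; the parameter names and values below are from memory, so check the script's own help before relying on them, and substitute your own ISO path and edition:

```powershell
# Sketch only - verify parameter names with: Get-Help .\Convert-WindowsImage.ps1 -Full
.\Convert-WindowsImage.ps1 -SourcePath .\WindowsServer2012R2.iso `
                           -Edition ServerStandard `
                           -VHDFormat VHDX `
                           -VHDPath .\WS2012R2.vhdx `
                           -SizeBytes 40GB
```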


Performance Comparison: VMware Workstation vs Windows 8

I currently have a cluster environment running on Workstation 10 on my Windows 8.1 machine, and I see a serious lack of performance on the virtual machines.

My laptop has 16 GHz of combined CPU power, 32 GB of memory and 2 SSD drives in RAID 0.

But... I AM running a Hyper-V cluster inside VMware Workstation 10, and the disks are differencing disks. Still, I expected more.

So, reading on the internet, I found a blog where a user has compared disk and CPU performance between VMware Workstation 9 and Windows 8 (yes, both products have gained a version number since, but still).

He discovered that Hyper-V in Windows 8 is 4 times faster, which could explain my issue at hand.

I know, tests are what tests are, but this just fits my situation.

Try reading his comparison; it makes sense.


I will return with my findings after going back to Windows 8.1 Hyper-V, where I will install my Hyper-V cluster inside Hyper-V.

Oh no, wait... that was why I switched in the first place. Oh well, a dual-boot setup must be obtained: one Windows 8.1 with Hyper-V to rock the socks, and one Windows 8.1 with VMware Workstation 10 to test Hyper-V clustering.

/I’ll be back

Data Deduplication in Windows 8.1 – How-to guide

To enable this feature, which is NOT supported by Microsoft, you need to grab some files from a Windows Server 2012 R2 installation before deduplication can be enabled. Remember: Data Deduplication requires a dedicated drive to work.

Storage savings on VHD/VHDX files are over 80%, which is just perfect for laptop environments.

Everything written here worked for me, and I am running deduplication on my work laptop, but use it at your own risk.

The files needed can be obtained here from my SkyDrive (soon to be OneDrive).

Yes, you can extract them yourself, but it is not that easy, so you choose.

Place the files in a folder and open a CMD prompt as administrator. Go into the folder where you placed the files and run this command:

dism /online /add-package /packagepath:<package1.cab> /packagepath:<package2.cab> ...

(one /packagepath: argument for each of the .cab files you placed in the folder)

And afterwards

dism /online /enable-feature /featurename:Dedup-Core /all

Now you can see the feature enabled under Windows Features
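You can also confirm the state of the feature from the same elevated prompt; a quick sketch (the exact feature names in the output may vary):

```powershell
# List all optional features and filter for the deduplication ones.
dism /online /get-features /format:table | findstr /i dedup
```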


To enable it for a specific drive (for me the D drive) run the following command in an elevated PowerShell prompt:

Enable-DedupVolume -Volume D:

Set-DedupVolume -Volume D: -OptimizeInUseFiles

To begin Deduplication process, run this:

Start-DedupJob -Volume D: -Type Optimization

When using the drive for Hyper-V VMs (test environments only), run:

Enable-DedupVolume -Volume D: -UsageType HyperV

To see how the job is progressing, run:

Get-DedupJob

To see status and savings, run:

Get-DedupVolume -Volume D:
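If you only want the headline numbers, you can select the savings properties; SavedSpace and SavingsRate are the property names as I know them, so verify with Get-Member if they differ on your build:

```powershell
# Show only the volume and its space savings.
Get-DedupVolume -Volume D: | Select-Object Volume, SavedSpace, SavingsRate
```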


Remember that Data Deduplication runs as scheduled jobs, so change the schedules to fit your needs. To see the current schedules, run:

Get-DedupSchedule

To change the schedules (and yes, you will want to), see my post on this matter HERE.

I hope this helps you in your quest to save disk space 🙂

Data Deduplication – How to optimize on Windows

On Windows Server 2012 there is a great feature called Data Deduplication.

I'm running this feature on my Hyper-V test environment and on Windows 8.1, and the disk savings when running virtual machines are fantastic! Try Data Deduplication.

Data Deduplication requires a dedicated disk to be activated; just creating a partition to dedicate does not work.

But all Data Deduplication is done by scheduled maintenance jobs running in the background. If you wish to change these maintenance windows to fit your needs, read on.

Heads up: when running on a laptop or in a test environment, running these jobs in the middle of the night (the default) is not an option. So...


How to change Data Deduplication maintenance schedules

See all commands by running this in a PowerShell prompt: Get-Command -Module Deduplication

Or visit Technet:

See the existing schedules on your computer/server by running: Get-DedupSchedule

Open a PowerShell administrator console and run the following commands:

To create a new optimization job at 12:00 on weekdays, and move the built-in scrubbing and garbage collection jobs to 11:00, run:

New-DedupSchedule -Name "WeeklyOptimization" -Type Optimization -Days Monday,Tuesday,Wednesday,Thursday,Friday -Start 12:00 -DurationHours 2

Set-DedupSchedule -Name "WeeklyScrubbing" -Type Scrubbing -Start 11:00 -DurationHours 2 -Days Monday,Tuesday,Wednesday,Thursday,Friday -Priority Normal

Set-DedupSchedule -Name "WeeklyGarbageCollection" -Type GarbageCollection -Start 11:00 -DurationHours 2 -Days Monday,Tuesday,Wednesday,Thursday,Friday -Priority Normal

If you are planning to run Hyper-V VMs on this drive (currently for testing only), run this command:

Enable-DedupVolume -Volume D: -UsageType HyperV

To start the jobs manually, run these commands:

Start-DedupJob D: -Type Scrubbing

Start-DedupJob D: -Type GarbageCollection

Start-DedupJob D: -Type Optimization
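To queue all three job types and watch them until they finish, a small polling loop works. This is a sketch assuming the D: volume from the examples above; the property names (Type, State, Progress) are as I recall them and may differ slightly between builds:

```powershell
# Queue the three job types against the deduplicated volume.
Start-DedupJob D: -Type GarbageCollection
Start-DedupJob D: -Type Scrubbing
Start-DedupJob D: -Type Optimization

# Poll every 30 seconds until no jobs are left running or queued.
while (Get-DedupJob) {
    Get-DedupJob | Format-Table Type, State, Progress
    Start-Sleep -Seconds 30
}
```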

To see the current status of the jobs, run: Get-DedupJob


I hope that this will help you to better utilize Data Deduplication.