Currently, I am working on an eRoom to SharePoint migration project using Metalogix. The Metalogix eRoom console has a feature to run a pre-migration check that identifies possible issues that could occur during migration, including the following:
- Document URL length
- Document Size
- Document Name
- Unsupported file types
Identifying these possible issues up front is necessary so that you can plan for the resulting failures by talking to the end user. The Metalogix eRoom console generates a report for each of these issues based on the source eRoom and the target SharePoint site where the content will be migrated. These reports are generated as .csv files, and the pre-migration check produces four such files per eRoom. We have to migrate approximately 230 eRooms to the SharePoint 2010 environment, so we wanted to run the pre-migration check on all 230 rooms and generate a consolidated report that we could review with the business user.
Combining these 230 CSV files for every issue by hand could take days of manual effort. To avoid that, I decided to use Windows PowerShell, whose cmdlets make this kind of consolidation a quick job.
I have created a PowerShell function that accepts three parameters; a sketch of the parameter block follows the list.
- $Path: Path of the directory where all the CSV files are stored.
- $MergedFileName: Name of the file that will be created by merging the target .csv files.
- $Filter: Any filter criteria that could be used to filter the list of target CSV files to be merged.
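Based on that description, the parameter block looks roughly like the following. This is a minimal sketch; the function name Merge-CSVFiles and the parameter attributes are my own illustration, and only the three parameter names come from the description above. The complete version is in the attached script.
function Merge-CSVFiles
{
    param(
        [Parameter(Mandatory = $true)]
        [string]$Path,            # Directory where all the CSV files are stored
        [Parameter(Mandatory = $true)]
        [string]$MergedFileName,  # Name of the merged output file
        [Parameter(Mandatory = $false)]
        [string]$Filter           # Optional filter, e.g. "*DocumentName*.csv"
    )
    # ... body as described in the following sections ...
}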
Based on the path and filter criteria, it finds a list of files to be merged.
if ($Filter -ne $null -and $Filter -ne "")
{
    $AllFilesToMerge = Get-ChildItem -Path $Path -File -Filter $Filter -Recurse
}
else
{
    # -File keeps directories out of the list so only CSV files are imported
    $AllFilesToMerge = Get-ChildItem -Path $Path -File -Recurse
}
Then it processes each file by importing it.
$csvFilePath = $FileToMerge.FullName
$csvContent = Import-Csv -Path "$csvFilePath"
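For context, these two lines sit inside a loop over the files found in the previous step. The exact loop is in the attached script; a minimal sketch, assuming the variable names used above, would be:
# Collection that will accumulate one PSObject per CSV row across all files
$CSVContentList = @()
foreach ($FileToMerge in $AllFilesToMerge)
{
    $csvFilePath = $FileToMerge.FullName
    $csvContent  = Import-Csv -Path $csvFilePath
    # ... per-row processing shown in the next snippet ...
}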
Then it reads the file content based on the headers of the CSV file. Every row in the CSV is stored in a temporary PSObject, which is then added to a script-level collection ($CSVContentList) that accumulates rows from all files.
Note: You need to update the script to match your own CSV headers.
if ($csvContent -ne $null) {
    foreach ($entry in $csvContent) {
        # You can also get the headers of a particular .csv file like below - commented
        # $CSVContentObj = $csvContent | Get-Member -MemberType NoteProperty | Select-Object -ExpandProperty Name
        # Creating a custom PS object to hold the content temporarily
        $CSVContentObj = 1 | Select-Object RoomName, SourceName, SourceURL, RectifiedName
        $CSVContentObj.RoomName      = $roomName # TODO: Update these headers manually before using this function
        $CSVContentObj.SourceName    = $entry.SourceName
        $CSVContentObj.SourceURL     = $entry.SourceURL
        $CSVContentObj.RectifiedName = $entry.RectifiedName
        # Adding the custom PS object to the collection
        $CSVContentList += $CSVContentObj
    }
} else {
    Write-Host -ForegroundColor Yellow "File doesn't have any content"
}
Finally, if the collection that holds all the content is not null, the function exports it to the target CSV file.
if ($CSVContentList -ne $null) {
    # Getting the current script execution location
    $CurrentLoc = Get-Location
    if ($MergedFileName.IndexOf(".csv") -gt 0) {
        $csvPath = "$CurrentLoc\$MergedFileName"
    } else {
        $csvPath = "$CurrentLoc\$MergedFileName.csv"
    }
    # Exporting the content as a CSV file (-NoTypeInformation suppresses the #TYPE header line)
    $CSVContentList | Export-Csv -Path $csvPath -NoTypeInformation
    Write-Host -ForegroundColor Green "Files merged successfully and exported at $csvPath"
}
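To illustrate, a call to the function might look like this. The function name Merge-CSVFiles, the folder path, and the filter value are hypothetical examples; adjust them to match the attached script and your environment.
# Merge all "Document Name" pre-migration reports into a single consolidated CSV
Merge-CSVFiles -Path "C:\PreMigrationReports" -MergedFileName "DocumentNameIssues.csv" -Filter "*DocumentName*.csv"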
The complete PowerShell script is attached to this article.