Copying files via PowerShell remoting channel

There are a few ways to do this, and in PowerShell 5.0 you can simply use Copy-Item with the -ToSession parameter.
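For reference, the modern approach is a one-liner (PowerShell 5.0+ on both ends; the computer name here is a placeholder):

```powershell
# Copies a local file into the remote session over the remoting channel.
# 'server01' is a placeholder -- substitute your own host.
$Session = New-PSSession -ComputerName 'server01'
Copy-Item -Path $localPath -Destination $remotePath -ToSession $Session
```

If you're stuck on an earlier version, read on.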

However, I came up with the following solution, which is fairly reliable and avoids the session's restriction on the size of deserialised objects (10MB by default) when transferring larger files.

Note that $localPath and $remotePath are set to what you’d expect. $Session is a PS Remoting Session created with, e.g. New-PSSession.
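To make the setup concrete, here is a minimal sketch of the variables the snippet below assumes (host and paths are placeholders):

```powershell
# Placeholder values -- substitute your own host and paths.
$Session    = New-PSSession -ComputerName 'server01'
$localPath  = 'C:\temp\report.zip'
$remotePath = 'C:\temp\report.zip'
```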

(ReportInfo and ReportError are just functions which output either to the console or to TeamCity, depending on where the script is being run; they're part of our testing system.)

# Work out the file name and size up front
$fileName = [IO.Path]::GetFileName( $localPath )
$contentsizeMB = [math]::Ceiling( (Get-Item $localPath).Length / 1MB )

ReportInfo "Copying $fileName ($contentsizeMB MB) from $localPath to $remotePath on $($Session.ComputerName) ..."

# Open local file
try
{
    [IO.FileStream]$filestream = [IO.File]::OpenRead( $localPath )
    ReportInfo "Opened local file for reading"
}
catch
{
    ReportError "Could not open local file $localPath because:" $_.Exception.ToString()
    Return $false
}

# Open remote file. The variable must be global-scoped, otherwise it is
# discarded when this Invoke-Command scope ends and the later calls can't see it.
# -ErrorAction Stop ensures a remote failure surfaces as a catchable error here.
try
{
    Invoke-Command -Session $Session -ErrorAction Stop -ScriptBlock {
        Param($remFile)
        [IO.FileStream]$global:filestream = [IO.File]::OpenWrite( $remFile )
    } -ArgumentList $remotePath
    ReportInfo "Opened remote file for writing"
}
catch
{
    ReportError "Could not open remote file $remotePath because:" $_.Exception.ToString()
    Return $false
}

# Copy file in chunks
$chunksize = 1MB
[byte[]]$contentchunk = New-Object byte[] $chunksize
$bytesread = 0
while (($bytesread = $filestream.Read( $contentchunk, 0, $chunksize )) -ne 0)
{
    try
    {
        $percent = $filestream.Position / $filestream.Length
        ReportInfo ("Copying {0}, {1:P2} complete, sending {2} bytes" -f $fileName, $percent, $bytesread)
        Invoke-Command -Session $Session -ErrorAction Stop -ScriptBlock {
            Param($data, $bytes)
            $global:filestream.Write( $data, 0, $bytes )
        } -ArgumentList $contentchunk,$bytesread
    }
    catch
    {
        ReportError "Could not copy $fileName to $($Session.ComputerName) because:" $_.Exception.ToString()
        Return $false
    }
}

# Close remote file
try
{
    Invoke-Command -Session $Session -ErrorAction Stop -ScriptBlock {
        $global:filestream.Close()
    }
    ReportInfo "Closed remote file"
}
catch
{
    ReportError "Could not close remote file $remotePath because:" $_.Exception.ToString()
    Return $false
}

# Close local file
try
{
    $filestream.Close()
    ReportInfo "Closed local file, copy complete"
}
catch
{
    ReportError "Could not close local file $localPath because:" $_.Exception.ToString()
    Return $false
}

The chunk size is set to 1MB, which seems a good compromise given the 10MB restriction. Why not just pass the IO.FileStream object through and run the loop remotely? I've had issues with that in the past: the remote end tends to dial back to the local end to interact with the object, rather than using the existing TCP connection. It's safer to just chunk the contents over.
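As an optional sanity check after the copy, you can compare hashes on both ends (a sketch using Get-FileHash, available in PowerShell 4.0+; ReportError is the reporting function mentioned above):

```powershell
# Compare local and remote hashes to verify the transfer was lossless.
$localHash  = (Get-FileHash $localPath).Hash
$remoteHash = Invoke-Command -Session $Session -ScriptBlock {
    Param($path)
    (Get-FileHash $path).Hash
} -ArgumentList $remotePath

if ($localHash -ne $remoteHash)
{
    ReportError "Hash mismatch for $fileName after copy"
}
```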
