Can you compress files on a network share using PowerShell without downloading them to the client?

I have a network share hosted by a server (\\SERVER) which is being accessed by other servers/clients.
\\SERVER\SHARE\Folder\File
If I wanted to compress Folder and everything in it, is it possible to do this using PowerShell WITHOUT the files being downloaded to the machine that is running the command?
The files in question are large, and there is not enough room on the C drive to download and compress them on the client. For example, if I navigate to the network share from the client using Windows File Explorer and select the folder to compress, it starts downloading the files to the client and then fails due to insufficient free space on the client.
What about PowerShell's Invoke-Command Option?
I do have the option to Invoke-Command from the client to the server; however, the C drive of \\SERVER is too small to handle the request as well. There is a D drive (which hosts the actual \SHARE) that has plenty of space, though. I would have to tell PowerShell to compress the files on that drive somehow, instead of the default, which would be the C drive.
Error when running the PowerShell command below
Compress-Archive -Path "\\SERVER\SHARE\Folder" -DestinationPath "\\SERVER\SHARE\OtherFolder\Archive.zip"
Exception calling "Write" with "3" argument(s): "Stream was too long."
At
C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Archive\Microsoft.PowerShell.Archive.psm1:820
char:29
+ ... $destStream.Write($buffer, 0, $numberOfBytesRead)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : IOException

The problem is caused by a limitation of Compress-Archive: its maximum file size is 2 GB. The documentation mentions this:
The Compress-Archive cmdlet uses the Microsoft .NET API
System.IO.Compression.ZipArchive to compress files. The maximum file
size is 2 GB because there's a limitation of the underlying API.
As for a solution: compress smaller files, or use another tool such as 7-Zip. There's a PowerShell module available for it, though manual compression via 7z.exe is not that complex either. Since 7-Zip is not a native tool, install either it or the PowerShell module first.
# Alias for the 7-Zip command-line executable
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
$src = "D:\somedir"
$tgt = "D:\otherdir\archive.7z"
# "a" adds the source directory to the target archive
sz a $tgt $src
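If you prefer the module route mentioned above, the 7Zip4Powershell module from the PowerShell Gallery wraps the same tool. A minimal sketch, assuming its Compress-7Zip cmdlet and parameter names match the version you install (verify against the module's documentation):
# Sketch only: cmdlet and parameter names are assumed from the 7Zip4Powershell
# module's documentation; check them against your installed version.
Install-Module -Name 7Zip4Powershell -Scope CurrentUser
Compress-7Zip -Path "D:\somedir" -ArchiveFileName "D:\otherdir\archive.7z"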
If the source files are small enough that a single file will never create an archive larger than the limit, consider compressing each file by itself. For example:
$srcDir = "C:\somedir"
$dstDir = "D:\archivedir"
# List all the files, not subdirectories
$files = Get-ChildItem $srcDir -Recurse | Where-Object { -not $_.PSIsContainer }
foreach ($f in $files) {
    # Build the archive name from the full file path, replacing \ and :
    # with _ so there are no name collisions.
    $src = $f.FullName
    $dst = Join-Path $dstDir ($src.Replace('\', '_').Replace(':', '_') + ".zip")
    # Remove -WhatIf to actually create the archives
    Compress-Archive -WhatIf -Path $src -DestinationPath $dst
}
As a side note: use Enter-PSSession or Invoke-Command to run the script on the file server, as sketched below. There you can use local paths, though UNC paths should also work reasonably well; they are processed via loopback, so the data doesn't go through the network.
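For the original question, a sketch of combining the two ideas might look like this; it assumes 7-Zip is installed on SERVER and that the share's content lives on its D: drive:
# Sketch: run 7-Zip on the file server itself, so no file data crosses the network.
# Assumes 7-Zip is installed on SERVER and the share maps to D:\SHARE there.
Invoke-Command -ComputerName SERVER -ScriptBlock {
    & "$env:ProgramFiles\7-Zip\7z.exe" a "D:\SHARE\OtherFolder\Archive.7z" "D:\SHARE\Folder"
}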

Related

PowerShell - Checking # of files in a folder across a domain

I'm trying to count the number of font files (which have various extensions) inside the local fonts folder of every computer in my domain at work, to verify which computers have an up-to-date font installation, using PowerShell.
So far I have
Write-Host ( Get-ChildItem c:\MyFolder | Measure-Object ).Count;
as a means of counting the files. I'm just at a loss as to how exactly to replicate this and get an output that indicates the file count for that path on every computer in my domain (the file path is the same for each).
How should I best proceed?
You will have to run the command against every computer. Assuming you have some sort of domain admin privilege and can access the admin shares on all computers, you can use the c$ share.
The code below takes a list of computers from a single-column CSV with no headers and runs the command against the admin share on each:
$computers = Import-Csv -Path C:\computers.csv -Header Computer
foreach ($c in $computers)
{
    Write-Host (Get-ChildItem "\\$($c.Computer)\c$\MyFolder" | Measure-Object).Count
}
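If you also want to limit the count to font files and keep the results in a reusable form, a variant like the sketch below might help; the fonts folder path and the extension list are assumptions you should adjust:
# Sketch: count font files (assumed extensions) per computer and emit objects
# so the results can be sorted or exported. Adjust the path and extensions as needed.
$fontExtensions = '.ttf', '.otf', '.ttc', '.fon'
$computers = Import-Csv -Path C:\computers.csv -Header Computer
$results = foreach ($c in $computers)
{
    $path = "\\$($c.Computer)\c$\Windows\Fonts"
    $count = (Get-ChildItem $path |
        Where-Object { $fontExtensions -contains $_.Extension } |
        Measure-Object).Count
    [PSCustomObject]@{ Computer = $c.Computer; FontCount = $count }
}
$results | Export-Csv -Path C:\fontcounts.csv -NoTypeInformation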

Change Visual Studio build path to a RAM disk

Currently I have Visual Studio 2017 v15.4.2.
Is it possible to set a different build path for projects? For example, instead of
C:\Users\[UserName]\source\repos\[MyProject]\[bin|obj]
move it to
M:\Users\[UserName]\source\repos\[MyProject]\[bin|obj]
Note that the project itself stays on C, but the temporary build files are moved somewhere else. I have drive M, which is a 16 GB RAM disk.
Benefits of using a RAM disk (the reasons that tempt me to do this):
faster build times (no real I/O)
the SSD doesn't wear out from repeated rebuilds
projects are inherently cleaned up (which brings the following benefits)
faster sharing: your projects are not filled with unnecessary files, so you can easily share folders with others (code is usually less than 1 MB, but build objects can go beyond 1 GB)
fast backups: for the same reason, your project folders always remain clean and you can back up projects much faster (especially when you have many projects, e.g. you would only back up 100 MB instead of 10 GB)
less chance of locked files (which cause build desyncs, errors, etc.); in that case, formatting the RAM disk is easier than mucking with VS settings or restarting it
Drawbacks:
you need much more RAM; in my case I have 32 GB, of which I can spare 16 GB for it
if you reset VS or the computer, you lose the compiled objects and have to rebuild (once)
But the benefits of using a RAM disk clearly outweigh its drawbacks.
OK, now that I've reasonably convinced you why I want this, give me paths :)
Fully working solution for "obj" directory
Create an environment variable BUILD_RAMDRIVE which points to your RAM drive.
Create a "Directory.Build.props" file inside your solution directory with this content:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <BuildRamdrive>$([System.Environment]::GetEnvironmentVariable("BUILD_RAMDRIVE",System.EnvironmentVariableTarget.Machine))</BuildRamdrive>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(BuildRamdrive)' != '' AND '$(MSBuildProjectFile)' != ''">
    <BaseIntermediateOutputPath>$(BuildRamdrive)\Projects\$(SolutionName)\$(MSBuildProjectFile)\obj\</BaseIntermediateOutputPath>
    <IntermediateOutputPath>$(BuildRamdrive)\Projects\$(SolutionName)\$(MSBuildProjectFile)\obj\$(Configuration)\</IntermediateOutputPath>
    <MSBuildProjectExtensionsPath>$(BuildRamdrive)\Projects\$(SolutionName)\$(MSBuildProjectFile)\obj\</MSBuildProjectExtensionsPath>
  </PropertyGroup>
</Project>
That's all :-)
This MSBuild script takes the BUILD_RAMDRIVE environment variable from your computer, then redirects the obj files of all projects inside the solution to a $BUILD_RAMDRIVE\Projects\$(SolutionName)\$(MSBuildProjectFile)\obj\ directory.
If your computer has no BUILD_RAMDRIVE environment variable, it does nothing.
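Since the props file reads the variable at machine scope, one way to create it is from an elevated PowerShell prompt; this is only a sketch and assumes R: is your RAM-disk drive letter:
# Create the machine-level BUILD_RAMDRIVE variable (run from an elevated prompt).
# R: is an assumed RAM-disk drive letter; adjust it to yours.
[Environment]::SetEnvironmentVariable('BUILD_RAMDRIVE', 'R:', 'Machine')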
It is possible to change in the project files (at least for .NET applications) which bin and obj paths should be used, but you need to do this manually for every project, and if you check in these changes it might cause problems for someone else who is trying to build.
Instead, you can change the bin and obj directories to be symbolic links to a RAM disk. This will not affect your project and solution files (except that git might treat these links as files, so you might want to add them to your .gitignore).
I've created two PowerShell scripts to manage this. I run them in the same directory as the solution file. Be aware that the bin and obj directories will be removed when you run them.
This script removes the bin and obj folders in all directories where there is a csproj file and replaces them with symbolic links:
$ramDiskDrive = "R:"
# Find all project files...
$projectFiles = Get-ChildItem -Filter *.csproj -Recurse
# Get project directories
$projectDirectories = $projectFiles | ForEach-Object { $_.DirectoryName } | Get-Unique
# Create a bin-directory on the RAM-drive
$projectDirectories | ForEach-Object { New-Item -ItemType Directory -Force -Path "$ramDiskDrive$($_.Substring(2))\bin" }
# Remove existing bin-directories
$projectDirectories | ForEach-Object { Remove-Item "$($_)\bin" -Force -Recurse }
# Link bin-directories to ramdisk
$projectDirectories | ForEach-Object { cmd /c mklink /D "$($_)\bin" "$ramDiskDrive$($_.Substring(2))\bin" }
# Create an obj-directory on the RAM-drive
$projectDirectories | ForEach-Object { New-Item -ItemType Directory -Force -Path "$ramDiskDrive$($_.Substring(2))\obj" }
# Remove existing obj-directories
$projectDirectories | ForEach-Object { Remove-Item "$($_)\obj" -Force -Recurse }
# Link obj-directories to ramdisk
$projectDirectories | ForEach-Object { cmd /c mklink /D "$($_)\obj" "$ramDiskDrive$($_.Substring(2))\obj" }
This script will remove the symbolic links:
# Find all project files...
$projectFiles = Get-ChildItem -Filter *.csproj -Recurse
# Get project directories
$projectDirectories = $projectFiles | ForEach-Object { $_.DirectoryName } | Get-Unique
# Remove the bin-directory links to the ramdisk
$projectDirectories | ForEach-Object { cmd /c rmdir "$($_)\bin" }
# Remove the obj-directory links to the ramdisk
$projectDirectories | ForEach-Object { cmd /c rmdir "$($_)\obj" }
If you run the same script twice you will get error messages, but these can be ignored.
All this said, in my experience there is no major difference between using a RAM disk and an SSD. It is faster, but not by as much as you might expect.

Update files on FTP server folder hierarchy with local files from a single folder

I have a little-big problem. I need to copy/overwrite JPG files from my local FOLDER to server FOLDERS.
Is there a way to search and match the JPG files on the SERVER with my files on LOCAL and overwrite them in the server folders? I do it manually and it takes a lot of time.
There are 50,000 JPGs on the server and I need to overwrite 20,000 of them in a short time.
Many thanks for answers!!
There's no magic way to do your very specific task. You have to script it.
If you are on Windows, it's rather trivial to write a PowerShell script for this, using WinSCP .NET assembly and its Session.EnumerateRemoteFiles method:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
HostName = "ftp.example.com"
UserName = "username"
Password = "password"
}
$remotePath = "/remote/path";
$localPath = "C:\local\Path";
# Connect
Write-Host "Connecting..."
$session = New-Object WinSCP.Session
$session.SessionLogPath = "upload.log"
$session.Open($sessionOptions)
# Enumerate remote files
$fileInfos =
$session.EnumerateRemoteFiles(
$remotePath, "*.*", [WinSCP.EnumerationOptions]::AllDirectories)
# And look for a matching local file for each of them
foreach ($fileInfo in $fileInfos)
{
$localFilePath = (Join-Path $localPath $fileInfo.Name)
if (Test-Path $localFilePath)
{
Write-Host ("Found local file $localFilePath matching remote file " +
"$($fileInfo.FullName), overwriting..."
# Command-out this line with # for a dry-run
$session.PutFiles($localFilePath, $fileInfo.FullName).Check()
}
else
{
Write-Host ("Found no local file matching remote file " +
"$($fileInfo.FullName), skipping..."
}
}
Write-Host "Done"
Save the script to a file (SortOutFiles.ps1), extract the contents of the WinSCP .NET assembly package alongside the script, and run it like:
C:\myscript>powershell -ExecutionPolicy Bypass -File SortOutFiles.ps1
Connecting...
Found local file C:\local\path\aaa.txt matching remote file /remote/path/1/aaa.txt, overwriting...
Found local file C:\local\path\bbb.txt matching remote file /remote/path/2/bbb.txt, overwriting...
Found local file C:\local\path\ccc.txt matching remote file /remote/path/ccc.txt, overwriting...
Done
You can first dry-run the script by commenting out the line with the $session.PutFiles call.
(I'm the author of WinSCP)
download "Filezilla"... Upload your local files (all 50000 images).. If a image is already there in server,, it will ask you options.. select 'overwrite' and use 'apply for all'...

How to get the Dropbox folder in Powershell in Windows

The same question exists for Python here: How can I get the Dropbox folder location programmatically in Python?, and here for OS X: How to get the location of currently logined Dropbox folder
Same thing in PowerShell: I need the path of the Dropbox folder to copy files to it (I build software and then copy it to Dropbox to share with my team).
This Dropbox help page tells us where this info is stored, i.e. in a JSON file in the user's AppData: https://www.dropbox.com/help/4584
function GetDropBoxPathFromInfoJson
{
    $DropboxPath = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop | ConvertFrom-Json | % 'personal' | % 'path'
    return $DropboxPath
}
The line above is taken from: https://www.powershellgallery.com/packages/Spizzi.Profile/1.0.0/Content/Functions%5CProfile%5CInstall-ProfileEnvironment.ps1
Note that it doesn't check if you've got a Dropbox business account, or if you have both. It just uses the personal one.
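If you do have a business account, the same info.json reportedly contains a separate 'business' entry next to 'personal'. A hedged variant that prefers the personal path and falls back to the business one could look like this:
# Sketch: prefer the personal Dropbox path, fall back to the business one.
# Assumes info.json exposes 'personal' and/or 'business' objects, each with a 'path' property.
function GetDropBoxPathFromInfoJsonWithFallback
{
    $info = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop | ConvertFrom-Json
    if ($info.personal) { return $info.personal.path }
    if ($info.business) { return $info.business.path }
    throw "No Dropbox path found in info.json"
}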
You can then use this base Dropbox folder to build your final path, for example:
$targetPath = Join-Path -Path (GetDropBoxPathFromInfoJson) -ChildPath 'RootDropboxFolder\Subfolder1\Subfolder2'
if (-not (Test-Path -Path $targetPath)) { throw "Path '$targetPath' not found!" }
--
An alternative way is to use the host.db file, as shown on this page:
http://bradinscoe.tumblr.com/post/75819881755/get-dropbox-path-in-powershell
$base64path = gc $env:appdata\Dropbox\host.db | select -index 1 # -index 1 is the 2nd line in the file
$dropboxPath = [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($base64path)) # convert from base64 to ascii

I want to fetch the name of the latest updated folder at a particular path on an FTP server

Using this command, I am able to get the most recently updated folder on Unix:
ls -t1 | head -1
But how can I do the same on an FTP server from Windows?
I want to get the name of the most recently updated folder at a particular path on the FTP server. Could anyone please help?
There's no easy way to do this with Windows shell commands.
You can:
Use ftp.exe to execute ls /path c:\local\path\listing.txt to save a directory listing to a text file.
Exit ftp.exe.
Parse the listing and find the latest files. Not an easy task for Windows shell commands.
It would be way easier with a PowerShell script.
You can use the FtpWebRequest class, though it does not have an easy way to retrieve a structured directory listing either; it offers only the ListDirectoryDetails and GetDateTimestamp methods.
See Retrieving creation date of file (FTP).
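A rough sketch of pulling the raw listing with FtpWebRequest is shown below; the host, credentials and path are placeholders, and the returned text is server-specific, so it still has to be parsed:
# Sketch: fetch an unstructured directory listing via FtpWebRequest.
# Host, credentials and path are placeholders; the listing format depends on the server.
$request = [System.Net.FtpWebRequest]::Create("ftp://ftp.example.com/path/")
$request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
$request.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$listing = $reader.ReadToEnd()
$reader.Close()
$response.Close()
$listing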
Or use a 3rd-party library for the task.
For example with WinSCP .NET assembly you can do:
param (
$sessionUrl = "ftp://user:mypassword@example.com/",
$remotePath = "/path"
)
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.ParseUrl($sessionUrl)
# Connect
$session = New-Object WinSCP.Session
$session.Open($sessionOptions)
# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)
# Select the most recent file
$latest =
$directoryInfo.Files |
Where-Object { -Not $_.IsDirectory } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
# Any file at all?
if ($latest -eq $Null)
{
Write-Host "No file found"
}
else
{
Write-Host "The latest file is $latest"
}
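Since the question asks for the most recently updated folder rather than file, a small tweak to the filter (a sketch; the rest of the script stays the same) would be:
# Sketch: pick the most recently modified subdirectory instead of a file,
# skipping the "." and ".." entries that the listing may include.
$latest =
    $directoryInfo.Files |
    Where-Object { $_.IsDirectory -and ($_.Name -ne ".") -and ($_.Name -ne "..") } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1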
See full example Downloading the most recent file.
(I'm the author of WinSCP)
