upload entire folder to a specific folder on Azure blob - azure-blob-storage

I know how to upload an entire folder to a container on Azure blob storage; it is this:
Get-ChildItem -File -Recurse | Set-AzureStorageBlobContent -Container $ContainerName -Context $ctx
My question is: if I have a folder under my container, say it is "test", how can I upload all the files/subfolders of my local folder to the Azure blob path mycontainer/test/?
Thanks

To create a folder in a container, we just need to use the combination of the path and file name as the blob name.
The object returned by Get-ChildItem has a property called FullName. We can then use the Substring method to remove the drive letter.
Get-ChildItem -File -Recurse C:\Test\ | ForEach-Object { Set-AzureStorageBlobContent -File $_.FullName -Blob $_.FullName.Substring(3) -Container uploaded -Context $ctx }
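To land everything under mycontainer/test/ specifically, a hedged variant of the same idea is to build the blob name from the path relative to the local folder and prefix it with "test/". The container name mycontainer and the local folder C:\Test below are taken from the question; forward slashes are used so the blobs show up as virtual folders:
$localRoot = "C:\Test\"
Get-ChildItem -File -Recurse $localRoot | ForEach-Object {
    # Blob name = test/<path relative to C:\Test>, with / as the virtual-folder separator
    $relativePath = $_.FullName.Substring($localRoot.Length).Replace('\', '/')
    Set-AzureStorageBlobContent -File $_.FullName -Blob ("test/" + $relativePath) -Container mycontainer -Context $ctx
}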

You can also give the command-line tool AzCopy a try, or its core library, the Azure Storage Data Movement Library, which supports fast transfers of folders and can be paused and resumed.
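For example, with AzCopy v10 a whole local folder can be pushed under the test virtual folder in one call; the storage account name, SAS token, and local path below are placeholders:
azcopy copy "C:\Test" "https://<storage-account>.blob.core.windows.net/mycontainer/test?<SAS-token>" --recursive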

It is now possible to add files into a folder in blob storage. All you need to do is download Microsoft Azure Storage Explorer; you can then use it to edit files and folders inside your blob storage, or open the app directly from the Azure portal.

Related

Powershell - Checking # of files in a folder across a domain

So I'm trying to count the number of font files (with different extensions) inside the local font folder of every computer in my domain at work, to verify which computers have an up-to-date font installation, using PowerShell.
So far I have
Write-Host ( Get-ChildItem c:\MyFolder | Measure-Object ).Count;
as a means of counting the files. I'm just at a loss on how exactly to replicate this and get an output that indicates the file count for that path for every computer on my domain (the file path is the same for each).
How should I best proceed?
You will have to run the command against every computer. Assuming you have some sort of domain admin privilege and can access the admin shares on all computers, you can use the c$ share.
The code below takes a list of computers from a single-column CSV with no headers and runs the command against the admin share on each:
$computers = Import-Csv -Path C:\computers.csv -Header Computer;
foreach($c in $computers)
{
    Write-Host (Get-ChildItem "\\$($c.Computer)\c$\MyFolder" | Measure-Object).Count;
};
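If you also want the output to show which computer each count belongs to (rather than a bare number), a small variation along the same lines, with the same CSV path and folder assumptions as above, emits objects you can sort or export:
$computers = Import-Csv -Path C:\computers.csv -Header Computer
$results = foreach ($c in $computers) {
    # Emit one object per computer so the output can be piped to Export-Csv, Sort-Object, etc.
    [PSCustomObject]@{
        Computer  = $c.Computer
        FileCount = (Get-ChildItem "\\$($c.Computer)\c$\MyFolder" -File | Measure-Object).Count
    }
}
$results | Format-Table -AutoSize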

Can you compress files on a network share using PowerShell without downloading them to the client?

I have a network share hosted by a server (\\SERVER) which is being accessed by other servers/clients.
\\SERVER\SHARE\Folder\File
If I wanted to compress Folder and everything in it, is it possible to do using PowerShell WITHOUT having the files be downloaded to the machine that is running the command?
The files in question are large, and there is not enough room on the C drive to download and compress the files on the client. So for example if I navigated to the network share from the client using Windows File Explorer, and selected the folder to compress, it would start downloading the files to the client and then fail due to insufficient free space on the client.
What about PowerShell's Invoke-Command Option?
I do have the option to Invoke-Command from the client to the server; however, the C drive of \\SERVER is too small to handle the request as well. There is a D drive (which hosts the actual \SHARE) that has plenty of space, though. I would have to tell PowerShell to compress the files on that drive somehow, instead of the default, which would be the C drive.
Error when running the below PowerShell Command
Compress-Archive -Path "\\SERVER\SHARE\Folder" -DestinationPath "\\SERVER\SHARE\OtherFolder\Archive.zip"
Exception calling "Write" with "3" argument(s): "Stream was too long."
At
C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Archive\Microsoft.PowerShell.Archive.psm1:820
char:29
+ ... $destStream.Write($buffer, 0, $numberOfBytesRead)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : IOException
The problem is caused by Compress-Archive's limits. Its maximum file size is 2 GB. Documentation mentions this:
The Compress-Archive cmdlet uses the Microsoft .NET API
System.IO.Compression.ZipArchive to compress files. The maximum file
size is 2 GB because there's a limitation of the underlying API.
As for a solution: compress smaller files, or use another tool such as 7-Zip. There's a PowerShell module available for it, though manual compression is not that complex either. As 7-Zip is not a native tool, install either it or the PowerShell module.
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
$src = "D:\somedir"
$tgt = "D:\otherdir\archive.7z"
sz a $tgt $src
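If you would rather stay in PowerShell, one such module is 7Zip4Powershell from the PowerShell Gallery; a minimal sketch, assuming that module and the same source and target as above:
Install-Module -Name 7Zip4Powershell -Scope CurrentUser
# Same source and target as the 7z.exe example, produced via the module instead
Compress-7Zip -Path "D:\somedir" -ArchiveFileName "D:\otherdir\archive.7z" -Format SevenZip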
If the source files are small enough that a single file will never create an archive larger than the limit, consider compressing each file by itself. For example:
$srcDir = "C:\somedir"
$dstDir = "D:\archivedir"
# List all the files, not subdirs
$files = gci $srcDir -Recurse | ? { -not $_.PSIsContainer }
foreach($f in $files) {
    # Create a new name for the compressed archive from the full file path,
    # replacing \ and : with _ so there are no name collisions.
    $src = $f.FullName
    $dst = Join-Path $dstDir ($src.Replace('\', '_').Replace(':', '_') + ".zip")
    # Remove -WhatIf to actually create the archives
    Compress-Archive -WhatIf -Path $src -DestinationPath $dst
}
As a side note: use Enter-PSSession or Invoke-Command to run the script on the file server. There you can use local paths, though UNC paths should work pretty well too - those are processed via loopback, so the data isn't going through the network.
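For the setup in the question, a sketch of that side note: run the compression on \\SERVER itself via Invoke-Command, using paths local to its D: drive so no data crosses the network. This assumes 7-Zip is installed on the server, and D:\Share below is a placeholder for wherever \\SERVER\SHARE actually lives:
Invoke-Command -ComputerName SERVER -ScriptBlock {
    # Paths here are local to SERVER; D:\Share is assumed to be the folder behind \\SERVER\SHARE
    & "$env:ProgramFiles\7-Zip\7z.exe" a "D:\Share\OtherFolder\Archive.7z" "D:\Share\Folder"
}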

Change Visual studio build path for Ram disk

Currently I have Visual Studio 2017 v15.4.2.
Is it possible to set a different build path for projects? For example, instead of
C:\Users\[UserName]\source\repos\[MyProject]\[bin|obj]
move it on
M:\Users\[UserName]\source\repos\[MyProject]\[bin|obj]
Note that the project itself stays on C, but the temporary files are moved somewhere else. I have drive M, which is a 16GB RAM disk.
Benefits of using a RAM disk (the reasons tempting me to do this):
faster build times (no real IO)
SSD doesn't wear out with repetitive rebuilds.
projects are inherently cleaned up (which brings following benefits)
share faster: your projects are not filled with unnecessary files, so you can easily share folders with others (code size is usually less than 1MB, but build objects can go beyond 1GB)
fast backups: for the same reason, your project folders always remain cleaned up and you can back up projects much faster (especially when you have many projects, e.g. you would only back up 100MB instead of 10GB)
less chance of creating locked files (which cause build desync, errors etc.); in that case, formatting the RAM disk is easier than mucking with VS settings or restarting it
Drawbacks:
you need much more RAM; in my case I have 32GB, of which I can spare 16GB for it
if you restart VS or the computer, you lose the compiled objects and have to rebuild (once)
But the benefits of using a RAM disk clearly outweigh its drawbacks.
OK, now that I have reasonably convinced you why I want this, give me the paths :)
Fully working solution for "obj" directory
Create an environment variable BUILD_RAMDRIVE which points to your RAM drive.
Create a "Directory.Build.props" file inside your solution directory with this content:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <BuildRamdrive>$([System.Environment]::GetEnvironmentVariable("BUILD_RAMDRIVE",System.EnvironmentVariableTarget.Machine))</BuildRamdrive>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(BuildRamdrive)' != '' AND '$(MSBuildProjectFile)' != ''">
    <BaseIntermediateOutputPath>$(BuildRamdrive)\Projects\$(SolutionName)\$(MSBuildProjectFile)\obj\</BaseIntermediateOutputPath>
    <IntermediateOutputPath>$(BuildRamdrive)\Projects\$(SolutionName)\$(MSBuildProjectFile)\obj\$(Configuration)\</IntermediateOutputPath>
    <MSBuildProjectExtensionsPath>$(BuildRamdrive)\Projects\$(SolutionName)\$(MSBuildProjectFile)\obj\</MSBuildProjectExtensionsPath>
  </PropertyGroup>
</Project>
That's all :-)
This MSBuild script takes the BUILD_RAMDRIVE environment variable from your computer and then redirects the obj files of all projects inside the solution to the $BUILD_RAMDRIVE\Projects\$(SolutionName)\$(MSBuildProjectFile)\obj\ directory.
If your computer has no BUILD_RAMDRIVE environment variable, it does nothing.
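Since the props file reads the variable at machine scope, it has to be set at that scope as well; a one-line sketch from an elevated PowerShell prompt, assuming the RAM disk is mounted as M: as in the question:
# Machine scope requires an elevated prompt; restart Visual Studio afterwards so it picks up the new variable
[System.Environment]::SetEnvironmentVariable("BUILD_RAMDRIVE", "M:", [System.EnvironmentVariableTarget]::Machine)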
It is possible to change in the project files (at least for .NET applications) which bin and obj paths should be used. But you need to change this manually in every project, and if you check in these changes it might cause problems for someone else who is trying to build.
Instead, you can change the bin and obj directories to be symbolic links to a RAM disk. This will not affect your project and solution files (except that git might treat these links as files, so you might want to add them to your .gitignore).
I've created two PowerShell scripts to manage this. I'm running these in the same directory as the solution file. Be aware that the bin and obj directories will be removed when you run them.
This script will remove the bin and obj folders in all directories where there is a csproj file and replace them with symbolic links:
$ramDiskDrive = "R:"
# Find all project files...
$projectFiles = Get-ChildItem -Filter *.csproj -Recurse
# Get project directories
$projectDirectories = $projectFiles | ForEach-Object { $_.DirectoryName } | Get-Unique
# Create a bin-directory on the RAM-drive
$projectDirectories | ForEach-Object { New-Item -ItemType Directory -Force -Path "$ramDiskDrive$($_.Substring(2))\bin" }
# Remove existing bin-directories
$projectDirectories | ForEach-Object { Remove-Item "$($_)\bin" -Force -Recurse }
# Link bin-directories to ramdisk
$projectDirectories | ForEach-Object { cmd /c mklink /D "$($_)\bin" "$ramDiskDrive$($_.Substring(2))\bin" }
# Create a obj-directory on the RAM-drive
$projectDirectories | ForEach-Object { New-Item -ItemType Directory -Force -Path "$ramDiskDrive$($_.Substring(2))\obj" }
# Remove existing obj-directories
$projectDirectories | ForEach-Object { Remove-Item "$($_)\obj" -Force -Recurse }
# Link obj-directories to ramdisk
$projectDirectories | ForEach-Object { cmd /c mklink /D "$($_)\obj" "$ramDiskDrive$($_.Substring(2))\obj" }
This script will remove the symbolic links:
# Find all project files...
$projectFiles = Get-ChildItem -Filter *.csproj -Recurse
# Get project directories
$projectDirectories = $projectFiles | ForEach-Object { $_.DirectoryName } | Get-Unique
# Unlink bin-directories to ramdisk
$projectDirectories | ForEach-Object { cmd /c rmdir "$($_)\bin" }
# Unlink obj-directories to ramdisk
$projectDirectories | ForEach-Object { cmd /c rmdir "$($_)\obj" }
If you run the same script twice you will get error messages, but these can be ignored.
All this said, in my experience there is no major difference between using a RAM disk and an SSD. It is faster, but not as much as you might expect.

How to get the Dropbox folder in Powershell in Windows

Same question exists for Python here: How can I get the Dropbox folder location programmatically in Python?, or here for OSX: How to get the location of currently logined Dropbox folder
Same thing in PowerShell: I need the path of Dropbox to copy files to it (building software and then copying it to Dropbox to share with the team).
This Dropbox help page tells us where this info is stored, i.e. in a JSON file in the user's AppData: https://www.dropbox.com/help/4584
function GetDropBoxPathFromInfoJson
{
    $DropboxPath = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop | ConvertFrom-Json | % 'personal' | % 'path'
    return $DropboxPath
}
The line above is taken from: https://www.powershellgallery.com/packages/Spizzi.Profile/1.0.0/Content/Functions%5CProfile%5CInstall-ProfileEnvironment.ps1
Note that it doesn't check if you've got a Dropbox business account, or if you have both. It just uses the personal one.
You can then use this base Dropbox folder to build your final path, for example:
$targetPath = Join-Path -Path (GetDropBoxPathFromInfoJson) -ChildPath 'RootDropboxFolder\Subfolder1\Subfolder2'
if (-not (Test-Path -Path $targetPath)) { throw "Path '$targetPath' not found!" }
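For the build-and-share scenario in the question, the resolved folder can then feed an ordinary copy; the build output path and Dropbox subfolder below are hypothetical:
# Hypothetical build output and Dropbox subfolder; the base path comes from the function above
$buildOutput = 'C:\MyProject\bin\Release'
$shareFolder = Join-Path -Path (GetDropBoxPathFromInfoJson) -ChildPath 'TeamShare\Builds'
Copy-Item -Path "$buildOutput\*" -Destination $shareFolder -Recurse -Force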
--
An alternative way is to use the host.db file, as shown on this page:
http://bradinscoe.tumblr.com/post/75819881755/get-dropbox-path-in-powershell
$base64path = gc $env:appdata\Dropbox\host.db | select -index 1 # -index 1 is the 2nd line in the file
$dropboxPath = [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($base64path)) # convert from base64 to ascii

PowerShell Copying only New added folders and email results

Is it possible to copy only the new folders that have been added at the source location?
I have a Source location that is updated with folders every 5 minutes. The PS1 script will run every 5 minutes and copy all the folders to the destination location.
The issue I'm having is that it's copying over everything. I only want it to match against what has already been copied over and copy only the newly added folders, instead of copying everything that is already there again. Is this possible?
Also, if possible, once only the recently added folders have been copied, can the script then email out a notification of what it has done?
So far I have the following:
Copy-Item -Recurse \\192.168.1.37\d$\Transactions\* -Destination D:\UK_Copy\Transactions_Bk -Force –Verbose
You could create a script that just loops. Something like:
$Source = "\\192.168.1.37\d$\Transactions\"
$Destination = "D:\UK_Copy\Transactions_Bk"
$LastScan = Get-ChildItem $Destination -Recurse
while ($true) {
    $NewScan = Get-ChildItem $Source -Recurse
    # Copy anything that is new or changed compared to the last scan,
    # rebuilding the destination path from the source path
    Compare-Object $NewScan -DifferenceObject $LastScan -PassThru |
        ForEach-Object { Copy-Item $_.FullName -Destination ($_.DirectoryName -replace [regex]::Escape($Source.TrimEnd('\')), $Destination) -Force }
    $LastScan = $NewScan
    Start-Sleep -Seconds 300
}
That will get a listing of the files in the destination, then look for files in the source that are new or have changed compared to the destination, and copy those over. It then saves that scan as the last known listing to compare against, sleeps for 5 minutes, and loops again, looking for files that are new or updated since the last scan.
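The question also asks for an email after each pass; a minimal sketch using Send-MailMessage that could sit inside the loop right after the copy, with the compare results captured in a variable first (the SMTP server and addresses are placeholders):
# Capture the compare results before copying, then mail a summary if anything was new.
# smtp.example.com and the addresses are placeholders.
$newItems = Compare-Object $NewScan -DifferenceObject $LastScan -PassThru
if ($newItems) {
    Send-MailMessage -SmtpServer 'smtp.example.com' -From 'copyjob@example.com' -To 'team@example.com' `
        -Subject "Transactions copy: $($newItems.Count) new item(s)" -Body ($newItems.FullName -join "`r`n")
}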
