How to get the Dropbox folder in PowerShell in Windows

Same question exists for Python here: How can I get the Dropbox folder location programmatically in Python?, or here for OSX: How to get the location of currently logined Dropbox folder
Same thing in PowerShell: I need the path of the Dropbox folder so I can copy files to it (I build software and then copy it to Dropbox to share with the team).

This Dropbox help page tells us where this info is stored: in a JSON file in the user's AppData: https://www.dropbox.com/help/4584
function GetDropBoxPathFromInfoJson
{
    # Read info.json, then drill into the 'personal' entry's 'path' property
    $DropboxPath = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -ErrorAction Stop |
        ConvertFrom-Json | ForEach-Object 'personal' | ForEach-Object 'path'
    return $DropboxPath
}
The line above is taken from: https://www.powershellgallery.com/packages/Spizzi.Profile/1.0.0/Content/Functions%5CProfile%5CInstall-ProfileEnvironment.ps1
Note that it doesn't check if you've got a Dropbox business account, or if you have both. It just uses the personal one.
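If you do need the business account, here is a minimal sketch of a variant that can read either entry from the same info.json (assuming, per the help page above, that the file exposes optional 'personal' and 'business' objects, each with a 'path' property):
function GetDropBoxPath
{
    param ([ValidateSet('personal', 'business')] [string] $AccountType = 'personal')
    # Parse the whole info.json and pick the requested account's path
    $info = Get-Content "$ENV:LOCALAPPDATA\Dropbox\info.json" -Raw -ErrorAction Stop | ConvertFrom-Json
    if (-not $info.$AccountType) { throw "No '$AccountType' account found in info.json" }
    return $info.$AccountType.path
}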
You can then use this base Dropbox folder to build your final path, for example:
$targetPath = Join-Path -Path (GetDropBoxPathFromInfoJson) -ChildPath 'RootDropboxFolder\Subfolder1\Subfolder2'
if (-not (Test-Path -Path $targetPath)) { throw "Path '$targetPath' not found!" }
--
An alternative way is to use the host.db file, as shown on this page:
http://bradinscoe.tumblr.com/post/75819881755/get-dropbox-path-in-powershell
$base64path = Get-Content $env:APPDATA\Dropbox\host.db | Select-Object -Index 1 # -Index 1 is the 2nd line in the file
$dropboxPath = [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($base64path)) # convert from Base64 to ASCII
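Note that newer Dropbox clients may no longer write a readable host.db, so prefer the info.json approach above where it is available.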

Related

Powershell - Checking # of files in a folder across a domain

So I'm trying to count the number of font files (which have various extensions) inside the local font folder of every computer in my domain at work, to verify which computers have an up-to-date font installation, using PowerShell.
So far I have
Write-Host ( Get-ChildItem c:\MyFolder | Measure-Object ).Count;
as a means of counting the files. I'm just at a loss on how to replicate this and get output that indicates the file count for that path on every computer in my domain (the file path is the same on each).
How should I best proceed?
You will have to run the command against every computer. Assuming you have some sort of domain admin privilege and can access the admin shares on all computers, you can use the c$ share.
The code below takes a list of computers from a single-column CSV with no headers and runs the command against the admin share on each:
$computers = Import-Csv -Path C:\computers.csv -Header Computer;
foreach ($c in $computers)
{
    Write-Host (Get-ChildItem "\\$($c.Computer)\c$\MyFolder" | Measure-Object).Count;
};
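If you want each count labeled with its computer name, and a file you can review later, here is a small variation on the same loop (the output path is just an example):
$computers = Import-Csv -Path C:\computers.csv -Header Computer;
$results = foreach ($c in $computers)
{
    # One record per computer: its name plus the file count of its font folder
    [pscustomobject]@{
        Computer  = $c.Computer
        FileCount = (Get-ChildItem "\\$($c.Computer)\c$\MyFolder" -ErrorAction SilentlyContinue | Measure-Object).Count
    }
};
$results | Export-Csv -Path C:\FontCounts.csv -NoTypeInformation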

Can you compress files on a network share using PowerShell without downloading them to the client?

I have a network share hosted by a server (\\SERVER) which is being accessed by other servers and clients.
\\SERVER\SHARE\Folder\File
If I wanted to compress Folder and everything in it, is it possible to do this using PowerShell WITHOUT having the files downloaded to the machine that is running the command?
The files in question are large, and there is not enough room on the C drive to download and compress the files on the client. So for example if I navigated to the network share from the client using Windows File Explorer, and selected the folder to compress, it would start downloading the files to the client and then fail due to insufficient free space on the client.
What about PowerShell's Invoke-Command Option?
I do have the option to use Invoke-Command from the client to the server; however, the C drive of \\SERVER is too small to handle the request as well. There is a D drive (which hosts the actual \SHARE) that has plenty of space, though. I would have to tell PowerShell to compress the files on that drive somehow, instead of the default, which would be the C drive.
Error when running the below PowerShell Command
Compress-Archive -Path "\\SERVER\SHARE\Folder" -DestinationPath "\\SERVER\SHARE\OtherFolder\Archive.zip"
Exception calling "Write" with "3" argument(s): "Stream was too long."
At
C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Archive\Microsoft.PowerShell.Archive.psm1:820
char:29
+ ... $destStream.Write($buffer, 0, $numberOfBytesRead)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : IOException
The problem is caused by Compress-Archive's limits. Its maximum file size is 2 GB. Documentation mentions this:
The Compress-Archive cmdlet uses the Microsoft .NET API
System.IO.Compression.ZipArchive to compress files. The maximum file
size is 2 GB because there's a limitation of the underlying API.
As for a solution: compress smaller files, or use another tool such as 7-Zip. There's a PowerShell module for 7-Zip, though manual compression is not that complex either. As 7-Zip is not a native tool, install either it or the module first.
# Alias 7-Zip's command-line executable; adjust the path if it's installed elsewhere
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
$src = "D:\somedir"
$tgt = "D:\otherdir\archive.7z"
sz a $tgt $src # 'a' adds the contents of $src to the archive $tgt
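If even a single 7z archive would be unwieldy, 7-Zip can also split the output into fixed-size volumes with the -v switch, for example sz a -v1g $tgt $src for 1 GB parts.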
If the source files are small enough that no single file will ever create an archive larger than the limit, consider compressing each file by itself. An example:
$srcDir = "C:\somedir"
$dstDir = "D:\archivedir"
# List all the files, not subdirs
$files = Get-ChildItem $srcDir -Recurse | Where-Object { -not $_.PSIsContainer }
foreach ($f in $files) {
    # Create a new name for the compressed archive. Keep the file path, but
    # replace \ and : with _ so there are no name collisions.
    $src = $f.FullName
    $dst = Join-Path $dstDir ($src.Replace('\', '_').Replace(':', '_') + ".zip")
    Compress-Archive -Path $src -DestinationPath $dst -WhatIf # drop -WhatIf to actually compress
}
As a side note: use Enter-PSSession or Invoke-Command to run the script on the file server itself. There you can use local paths, though UNC paths should work pretty well too; they are processed via loopback, so the data isn't going through the network.
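A minimal sketch of that server-side approach, assuming 7-Zip is installed on SERVER and the share's data lives under D: there (the local path below is a placeholder to adjust):
Invoke-Command -ComputerName SERVER -ScriptBlock {
    # Runs on SERVER itself, so only local D: paths are involved
    & "$env:ProgramFiles\7-Zip\7z.exe" a "D:\Share\OtherFolder\Archive.7z" "D:\Share\Folder"
}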

Push certificate to multiple windows servers

Can you please help me with where to add credentials (all Windows servers have the same credentials) in this script for pushing files?
# Point the script to a text file with a list of computers
$Computers = "C:\File Copy\Source Server\ComputerList.txt"
# Set the variable for the source file location
$Source = "C:\File Copy\prod.csv"
# Set the variable for the file destination
$Destination = "File copy\Servers"
# Get the content of $Computers and copy $Source to the destination on each
Get-Content $Computers | ForEach-Object { Copy-Item $Source -Destination (Join-Path "\\$_\c`$\" $Destination) }
Apologies in advance, I am not an expert on PowerShell, so I don't know much. Can you explain a bit?
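As a hedged sketch of one way to add credentials: open a remoting session per server with an explicit credential and use Copy-Item -ToSession (available since PowerShell 5.0), reusing the paths from the script above:
$cred = Get-Credential # prompted once; all servers share the same credentials
Get-Content "C:\File Copy\Source Server\ComputerList.txt" | ForEach-Object {
    $session = New-PSSession -ComputerName $_ -Credential $cred
    Copy-Item "C:\File Copy\prod.csv" -Destination "C:\File copy\Servers" -ToSession $session
    Remove-PSSession $session
}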

I want to fetch the name of the latest updated folder at particular path of FTP server

Using this command I am able to get the latest updated folder in Unix:
ls -t1 | head -1
But how can I get the same from an FTP server on Windows? I want to get the name of the latest updated folder at a particular path on the FTP server. Could anyone please help?
There's no easy way to do this with Windows shell commands.
You can:
Use ftp.exe to execute ls /path c:\local\path\listing.txt to save a directory listing to a text file.
Exit ftp.exe.
Parse the listing and find the latest files. Not an easy task for Windows shell commands.
It would be way easier with a PowerShell script.
You can use the FtpWebRequest class, though it does not have an easy way to retrieve a structured directory listing either. It offers only the ListDirectoryDetails and GetDateTimestamp methods.
See Retrieving creation date of file (FTP).
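A minimal sketch of the FtpWebRequest route (host, user, and password are placeholders; the listing it returns is server-specific text that you still have to parse for names and timestamps):
$request = [System.Net.WebRequest]::Create("ftp://example.com/path/")
$request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
$request.Credentials = New-Object System.Net.NetworkCredential("user", "mypassword")
$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$listing = $reader.ReadToEnd() # raw listing, one line per entry
$reader.Close()
$response.Close()
$listing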
Or use a 3rd-party library for the task.
For example with WinSCP .NET assembly you can do:
param (
    $sessionUrl = "ftp://user:mypassword@example.com/",
    $remotePath = "/path"
)
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.ParseUrl($sessionUrl)

# Connect
$session = New-Object WinSCP.Session
$session.Open($sessionOptions)

# Get the list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)

# Select the most recent file
$latest =
    $directoryInfo.Files |
    Where-Object { -Not $_.IsDirectory } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Any file at all?
if ($latest -eq $Null)
{
    Write-Host "No file found"
}
else
{
    Write-Host "The latest file is $latest"
}
See full example Downloading the most recent file.
(I'm the author of WinSCP)

PowerShell Copying only New added folders and email results

Is it possible to copy, from the source location, only the new folders that have been added?
I have a Source location that is updated with folders every 5 minutes. The PS1 script will run every 5 minutes and copy all the folders to the destination location.
The issue I'm having is that it copies everything over. I only want it to match against what has already been copied and copy only the newly added folders, instead of copying everything that is already there again. Is this possible?
Also, if possible, once only the recently added folders have been copied, can the script then email out a report of what it has done?
So far I have the following:
Copy-Item -Recurse \\192.168.1.37\d$\Transactions\* -Destination D:\UK_Copy\Transactions_Bk -Force -Verbose
You could create a script that just loops. Something like:
$Source = "\\192.168.1.37\d$\Transactions"
$Destination = "D:\UK_Copy\Transactions_Bk"
$LastScan = Get-ChildItem $Destination -Recurse
while ($true) {
    $NewScan = Get-ChildItem $Source -Recurse
    # '<=' marks items that are in the new source scan but not in the last one
    $new = Compare-Object -ReferenceObject $NewScan -DifferenceObject $LastScan -Property Name -PassThru |
        Where-Object { $_.SideIndicator -eq '<=' }
    foreach ($item in $new) {
        # Map the source path onto the destination and copy
        Copy-Item $item.FullName -Destination ($item.FullName -replace [regex]::Escape($Source), $Destination) -Force
    }
    $LastScan = $NewScan
    Start-Sleep -Seconds 300 # wait 5 minutes before the next scan
}
That takes a listing of the files in the destination, then looks for files in the source whose names are new compared to that listing and copies those over. It then keeps that scan as the last known listing to compare against, sleeps for 5 minutes, and loops, looking for files added since the last scan.
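For the emailing part of the question, here is a hedged sketch that could run at the end of each loop iteration, reusing the $new list from the loop above (SMTP server and addresses are placeholders):
if ($new) {
    # One line per copied item in the notification body
    $body = ($new | ForEach-Object { $_.FullName }) -join "`r`n"
    Send-MailMessage -SmtpServer "smtp.example.com" -From "backup@example.com" `
        -To "team@example.com" -Subject "New folders copied" -Body $body
}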
