Copy nested folder structures - powershell-4.0

I am copying all items from one directory to another. The source contains files and subfolders that themselves contain files, and the command I am using is simple:
(Get-ChildItem -Recurse -Path $source).ForEach({$_ | Copy-Item -Destination $dest})
It works, but the problem is that it doesn't copy files within subfolders to the destination subfolders they belong to; everything lands in the main destination folder instead. Can I modify this command in a concise way, so it's still a one-liner, to do that?
The difference between this and similar questions is that I have nested folder structure with sub-folders and files within them.
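A minimal sketch of one way to do this (assuming $source and $dest are already defined): Copy-Item can recurse on its own and preserves the folder tree, which makes the Get-ChildItem loop unnecessary.

```powershell
# Sketch: let Copy-Item do the recursion itself so the subfolder
# structure is preserved ($source and $dest are assumed to exist).
Copy-Item -Path (Join-Path $source '*') -Destination $dest -Recurse -Force
```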

Related

PS script copy new files

Pretty new to writing PS scripts. I wrote this up and have been actively using it, although it still requires some manual intervention to achieve my goal, which I would like to automate completely.
I will try my best to explain clearly;
I am trying to copy '.bak' files to a specific directory from a source folder that has files dropped into it daily. The problem is that, the way I created the script, every time it runs it creates a new folder containing some of the same files that were copied previously.
The files being copied all follow the same name structure in date sequence;
xxxx_2018_01_01_2131231.bak
xxxx_2018_01_02_2133212.bak
xxxx_2018_01_03_2199531.bak
How could I write the script so that it copies newer files only and not what has already been copied previously?
It would also be nice to only create a new folder when a certain part of the file name changes.
Here is the script;
$basedir = "Path:\path"
$today = (Get-Date).ToString('MM_dd_yy')
$Filter = '*.bak'
$location = New-Item -Path $basedir -Type Directory -Name $today
Copy-Item -Path 'Path:\path' -Destination $location -Filter $Filter -Recurse
Any pointers are greatly appreciated!
Thanks in advance for your help!
I'm not sure if there is an easy way to code this, but the general answer would be to use the Get-ChildItem cmdlet.
"The Get-ChildItem cmdlet gets the items in one or more specified locations. If the item is a container, it gets the items inside the container, known as child items. You can use the -Recurse parameter to get items in all child containers and use the -Depth parameter to limit the number of levels to recurse."
By using Get-ChildItem, you could get the listing of files in both directories and then compare them by name. Then build an if() test based on the criteria you wish to use to compare them.
It's not the complete answer, but it is a good starting point.
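As a sketch of that starting point ($src and $dst are placeholder paths, not from the question), Get-ChildItem plus a name check can skip files that were already copied:

```powershell
# Sketch: copy only .bak files whose names are not already present
# anywhere under the destination ($src and $dst are placeholders).
$src = 'C:\drop'
$dst = 'C:\backup'
$existing = (Get-ChildItem -Path $dst -Filter '*.bak' -Recurse).Name
Get-ChildItem -Path $src -Filter '*.bak' |
    Where-Object { $existing -notcontains $_.Name } |
    Copy-Item -Destination $dst
```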
Thanks everyone for pitching in, much appreciated!
I have switched over to the batch file route and have created the following to accomplish my goal;
@echo off
setlocal
set _source="C:\users\user1\desktop\source"
set _dest="C:\users\user1\desktop\dest"
robocopy %_source% %_dest% *.bak /mir /XC /XN /XO
Any opinion on this script is encouraged!

How to use short-cut paths to Compress-Archive to zip current folder into same destination

I am using Compress-Archive and want to zip the current directory into the same path. However I do not want to have to type out the entire file path both times. Is there an easy way to do this?
I am using windows 10 pro.
This works for the most part: Compress-Archive . test.zip. But I want the zip to be on the same level as the current directory, so I need to put it back one level.
Something like this is what I want:
path/test
path/test.zip
What I am getting:
path/test
path/test/test.zip
It is going inside the actual folder, which is not what I want.
You probably want this:
Compress-Archive * ..\test.zip
The wildcard * avoids that the name of the folder is put inside the zip.
Using .. for the output path we go one level up in the directory tree.
This command will fail if test.zip already exists. Either add parameter -update to update the archive or add -force to overwrite the archive. Both can be used even if the archive does not already exist.
If the current working directory is "t", it can be included using the following command. I would note that I do not think putting the destination .zip file in the directory being compressed is a good idea.
Compress-Archive -Path $(Get-ChildItem -Recurse -Exclude t.zip) -DestinationPath .\t.zip -Force
It is shorter if you are willing to use aliases and cryptic switches.
Compress-Archive $(gci -r -e t.zip) .\t.zip -Force
If I have misinterpreted your situation, please leave a comment or improve the information provided by editing the question.

How can I organize recovered files into separate folders by file type?

I've got 218GB of assorted files recovered from a failing hard drive using PhotoRec. The files do not have their original file names and they're not sorted in any manner.
How can I go about sorting the files into separate folders by file type? I've tried searching for .jpg, for example, and I can copy those results into a new folder. But when I search for something like .txt, I get 16GB of text files as the result and there's no way I've found to select them all and copy them into their own folder. The system just hangs.
This is all being done on Windows 10.
Open PowerShell and change to the recovered data folder:
cd c:\...\recovered_files
Make a directory for the text files:
mkdir text_files
Do the move:
mv *.txt text_files
You really just want to move/cut the files like this instead of copying, because moving the files is just a name change (very fast), but to copy would have to duplicate all of the data (quite slow).
If your files are distributed among many directories, you would need to use a find command. In Linux, this would be quite simple with the find command. In Windows, I have never tried anything like this. On MSDN there is an article about PowerShell that features an example which seems reminiscent of what you want to do. MSDN Documentation
The gist of it is that you would use the command:
cd <your recovered files directory containing the recup_dir folders>
Get-ChildItem -Path ".\*.txt" -Recurse | Move-Item -Verbose -Destination "Z:\stock_recovered\TXT"
Note that the destination is outside of the search path, which might be important!
Since I have never tried this before, there is NO WARRANTY. Supposing it works, I would be curious to know.
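One possible generalization of the approach above, sketched rather than tested (Z:\sorted is a hypothetical target path), groups every recovered file by extension instead of handling one type at a time:

```powershell
# Sketch: move every recovered file into a subfolder named after its
# extension (e.g. "jpg", "txt"); Z:\sorted is a hypothetical target.
Get-ChildItem -Path .\recup_dir* -File -Recurse | ForEach-Object {
    $ext = $_.Extension.TrimStart('.').ToLower()
    if (-not $ext) { $ext = 'no_extension' }
    $target = Join-Path 'Z:\sorted' $ext
    if (-not (Test-Path $target)) {
        New-Item -ItemType Directory -Path $target | Out-Null
    }
    Move-Item -Path $_.FullName -Destination $target
}
```

As in the answer above, moving rather than copying keeps this fast, since a move on the same volume is only a rename.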

Windows 10 - how to compress file without creating a parent folder

In Windows 10, from PowerShell, I want to compress a folder without it creating a parent folder inside the zip file.
currently I use:
Compress-Archive -Path . ../abc.zip
Is it possible to create the archive without the parent folder?
You might try this (I didn't test it and I don't even have any Windows machine installed, so it might not work):
gci c:\folder\* | Compress-Archive -DestinationPath c:\abc.zip

Create an Array containing filenames in a directory

I'm trying to copy one or two specific files from a bunch of directories (hence why I don't want to/can't use *) from one directory to another using a batch script.
Basically I want to navigate into a "root directory" and from that list all the sub-directories using dir /AD-H /B then I want to cd into each of those directories and xcopy /y into a directory I have stored in a variable.
I've tried some examples I've found on the web, but when I've modified them they have not been able to handle the switches properly.
Thanks
Look into PHP and list the directory recursively into an array. Here is an example, but you would need to modify it to fit your needs.
With PowerShell, you can use something like:
Get-ChildItem C:\ | Where-Object { $_.PSIsContainer } | ForEach-Object { Copy-Item -Path (Join-Path $_.FullName 'MyFile1.ABC') -Destination ("E:\Test\" + $_.Name) }
Replace C:\ with the "Root Directory" to copy from, and replace "E:\Test\" with the "Root Directory" to copy to (or, to use an environment variable DestX, replace "E:\Test\" with $env:DestX).