Find any mention of "google" in all files and subdirectories (Windows)

I've tried various versions of this command, hoping to get a hit. I put a .txt file that contained only the string "google" in the directory and ran commands like these:
findstr /S "google" ./
findstr /S "*google*" ./
findstr /S ".google." .\
findstr /S /C:"google" c:\my\directory
What I'm thinking is that I might need to pipe the output through some format?
Please tell me, in general: what am I doing wrong, and how can I do this properly?

You have to specify the file in which to search for the string.
Eg:
findstr /S "google" test_file.txt
To search in all files in a folder, add *.
Eg:
findstr /S "google" .\*
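Putting the pieces together for the original question, a sketch that searches a whole directory tree for the literal string (the path is a placeholder for your own directory; /I makes the match case-insensitive and /C: treats the argument as a literal string rather than a list of words):
findstr /S /I /C:"google" "C:\my\directory\*"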

The PowerShell way for this could be:
Get-ChildItem -Path 'D:\Test' -File -Recurse |
Select-String -Pattern 'google' -SimpleMatch |
Select-Object Path, LineNumber, Line # properties you want returned
This would return
Path LineNumber Line
---- ---------- ----
D:\Test\SomeOtherTextFile.txt 6 google
D:\Test\test.txt 1 google
D:\Test\Blah\blah.txt 3 find stuff with Google

Regarding findstr, I'd add /I to force case-insensitivity. It shouldn't need any output redirection just to work; it will print to the screen. /S searches the current directory and all subdirectories; if you specify a directory, it will search that directory and all its subdirectories:
findstr /i /s "google" .\*
The "*" seems to be required.
findstr /i /s "google" c:\temp\*
I did some testing before posting, and it seems that if you don't specify the "*", it just runs forever.
Regarding redirection of the output, my first instinct was to use ">" or ">>" to redirect and append the output into a file. However, it doesn't seem to respect line endings! The output runs together as a single line.
With pure PowerShell, you can use Get-ChildItem with Select-String & Out-File instead.
Get-ChildItem -Recurse -File |
Select-String -pattern "google" |
Out-File c:\temp\select-String-test.txt -Append
If you are starting your search at the root of c:\ you are bound to get some bloody red access denied errors. Generally I don't care about those files, so I'll throw in -ErrorAction SilentlyContinue.
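With that flag added to Get-ChildItem (the cmdlet that actually hits the access-denied errors), the pipeline sketched above becomes:
Get-ChildItem -Recurse -File -ErrorAction SilentlyContinue |
Select-String -Pattern "google" |
Out-File c:\temp\select-String-test.txt -Append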

Listing files and folders and outputting it as .csv

I'm trying to write a batch script to list all the folders, subfolders and files inside a directory, and then output everything to a .csv file. I have tried using the tree command, but I also need the "Creation date" and "Last modified" date to be included, and tree doesn't seem to support that. Is there any other way to do it?
Example:
tree "C:\Windows" /A /F > "C:\Log.csv"
Use a PowerShell one-liner:
Get-ChildItem -Recurse | Select-Object FullName,CreationTime,LastWriteTime | Export-Csv C:\log.csv -NoTypeInformation
If necessary, wrap it in a batch file:
powershell -NoP -C "Get-ChildItem -Recurse | Select-Object FullName,CreationTime,LastWriteTime | Export-Csv C:\log.csv -NoTypeInformation"
What about Dir /S *.*?
The /S stands for "go through the directory and all subdirectories".
Sorry, I missed the part where you needed the creation dates. Those can be obtained as mentioned in this post.
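For what it's worth, plain dir can at least display creation times: the /T:C switch changes which timestamp the listing shows (this is still a plain-text listing, not CSV, and the path here is just an example):
dir /S /T:C "C:\Windows" > C:\Log.txt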

Store first result of where command into a variable and stop searching

I have a particular file on my C drive and I want to be able through a .bat script to search for that file and change the directory into the Folder containing this file.
What I have so far is to retrieve the search result into a variable :
For /F "delims=" %%i in ('where /R C:\ *testfile.jar') do set "RESULT=%%i"
The issue is that the where command does not stop after finding the file; it continues to search the whole C drive for other similar files. However, in this case I know that this file only exists once on my drive, so how can I make the where command stop after finding the path?
The only way I know to do this without searching the entire device before setting the variable is to use PowerShell. The cmd FOR loop will complete the command before it does the DO block. PowerShell pipes push the data through as it is found.
@ECHO OFF
FOR /F "delims=" %%f IN (' ^
powershell -NoProfile -Command ^
"Get-ChildItem -Path 'C:\' -Recurse -Filter '*testfile.jar' -ErrorAction SilentlyContinue |" ^
"Select-Object -First 1 |" ^
"ForEach-Object { $_.FullName }" ^
') DO (
SET "FN=%%~f"
)
ECHO "%FN%"
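If a pure PowerShell session is an option, you can skip the batch wrapper entirely and change into the folder directly. A sketch using the same hypothetical file name (Select-Object -First 1 stops the upstream pipeline after the first match, so the drive isn't scanned any further):
$hit = Get-ChildItem -Path 'C:\' -Recurse -Filter '*testfile.jar' -ErrorAction SilentlyContinue |
Select-Object -First 1
if ($hit) { Set-Location $hit.DirectoryName }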

cmd Search for files from list of partial filenames then copy to folder

I have a text file listing approximately 120,000 filenames. Many of the files on the list are in a folder or its subfolders, but with slight variations in the filenames.
So I want to search using the list of partial filenames and copy the matches to another folder.
Each line on the list is a name and a title separated by a bar for example:
A Name|The Title
John Smith|A Life
The files are various text formats and all have extra stuff in the filenames like:
A Name - The Title V1.4 (html).lit
John Smith - A Life: Living on the Edge [MD] (pdf).rar
I've tried the code from this thread
and this thread, but neither is finding any of the files. Can anyone help, please?
This PowerShell script assumes that a file should be copied if both the first field ("name") and the second field ("title") appear anywhere in the filename. When you are confident that the correct files will be copied, remove the -WhatIf from the Copy-Item command.
Note that this does not address the issue of multiple files with the same name.
If you wanted to require the "name" field to be at the beginning of the string, you could add it to the match expression. $_.Name -match '^'+$pair.name. If you want the matches to be case sensitive, use -cmatch.
$sourcepath = 'C:\src'
$targetpath = 'C:\other'
$searchpairs = Import-Csv -Header "name","title" -Delimiter "|" -Encoding ASCII -path .\mdb.txt
foreach ($pair in $searchpairs) {
Get-ChildItem -Recurse -File -Path $sourcepath |
Where-Object { ($_.Name -match $pair.name) -and ($_.Name -match $pair.title) } |
ForEach-Object { Copy-Item $_.FullName $targetpath -WhatIf}
}
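One caveat: -match treats the name and title fields as regular expressions, so list entries containing characters like ( ) [ ] . or + will match incorrectly or error out. If your list may contain such characters (an assumption about your data), one way to guard against it is to escape the fields first:
$sourcepath = 'C:\src'
$targetpath = 'C:\other'
$searchpairs = Import-Csv -Header "name","title" -Delimiter "|" -Encoding ASCII -Path .\mdb.txt
foreach ($pair in $searchpairs) {
    $name  = [regex]::Escape($pair.name)   # treat the fields as literal text
    $title = [regex]::Escape($pair.title)
    Get-ChildItem -Recurse -File -Path $sourcepath |
        Where-Object { ($_.Name -match $name) -and ($_.Name -match $title) } |
        ForEach-Object { Copy-Item $_.FullName $targetpath -WhatIf }
}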
Adjust the paths and you should be good with this one:
@ECHO OFF
REM **************************************************
REM Adjust location of list
SET list=C:\adjust\path\list.txt
REM Source dir
SET source=C:\adjust\path\source
REM Target dir
SET destination=C:\adjust\path\destination
REM **************************************************
FOR /F "tokens=1,* delims=|" %%A IN (%list%) DO (
ECHO.
ECHO %%A - %%B
CALL :copy "%%A - %%B"
)
ECHO.
ECHO Done^!
PAUSE
EXIT
:copy
FOR /R "%source%" %%F IN (*) DO (
ECHO "%%~nF" | FINDSTR /C:%1 >nul && COPY "%%~fF" "%destination%\%%~nxF" && EXIT /B
)
EXIT /B
Be aware: this requires the scheme in list.txt to be A|B and the scheme of every file to be copied to be *A - B* (including the spaces), where * may be zero or more characters.
Might not be the solution you are looking for, but I ditched batch scripting years ago. I use Powershell instead, and simply call the Powershell script from batch file.
Here is the code in case you are interested,
searchfiles.ps1
$searchStrings = @("city", "sniper") # Declare an array to iterate over.
foreach ($string in $searchStrings) {
Get-ChildItem -Path D:\Movies\Movies | ? { $_ -match $string }
}
And now call the Powershell script from batch file.
searchfiles.bat
Powershell.exe -ExecutionPolicy RemoteSigned -Command "& searchfiles.ps1 -Verb RunAs"
Hope it helps!
That's not to say I don't use batch scripting at all. I will use them only for simpler operations, like calling another script, or opening a folder, etc. With Powershell, I love taking the help of the underlying .NET framework and sweet piping!

List files with path and file size only in Command Line

Windows Command Line (or maybe PowerShell).
How can I list all files, recursively, with full path and filesize, but without anything else and export to a .txt file. Much preferably a code that works for whichever current directory I am in with the Command Line (so does not require manual entering of the target directory).
None of these provides path\filename and filesize only:
dir /s > filelist.txt
dir /s/b > filelist.txt
dir /s/o:-d > filelist.txt
Desired output (fullpath\file.ext filesize):
c:\aaa\file.ext 7755777
c:\aaa\bbb\1.txt 897667
c:\aaa\bbb\2.ext 67788990
c:\aaa\bbb\nnn\a.xls 99879000
PowerShell:
gci -rec -file|%{"$($_.Fullname) $($_.Length)"} >filelist.txt
earlier PowerShell versions:
gci -rec|?{!$_.PSIsContainer}|%{"$($_.Fullname) $($_.Length)"} >filelist.txt
Batch file:
(@For /F "Delims=" %%A in ('dir /B/S/A-D') Do @Echo %%~fA %%~zA) >filelist.txt
Cmdline
(@For /F "Delims=" %A in ('dir /B/S/A-D') Do @Echo %~fA %~zA) >filelist.txt
Get-ChildItem -Recurse | select FullName,Length | Format-Table -HideTableHeaders | Out-File filelist.txt
The OP's chosen answer used PowerShell (they commented that they used Get-ChildItem -Recurse | select Length,LastWriteTime,FullName | Format-Table -Wrap -AutoSize | Out-File filelist.txt), which was almost what I wanted for processing in Excel. Thanks for that part.
Unfortunately, (as they mentioned) the output had wrapped lines for long file paths, which isn't quite what I wanted.
The following command will produce a CSV formatted file (no wrapped lines):
Get-ChildItem -Recurse | select Length,LastWriteTime,FullName | Export-Csv -path filelist.csv -NoTypeInformation
Kudos to https://stackoverflow.com/a/23434457/7270462 for the tip about Export-Csv
forfiles /s /c "cmd /c echo @path @fsize" >filelist.txt
The following removes the wrapping issue you have:
Get-ChildItem -Recurse | select Length,LastWriteTime,FullName | Format-Table -Wrap -AutoSize | out-string -width 4096 | clip
Have a look at this Reference.
Go to the folder, Shift+Right Click ---> PowerShell, and then put in this code:
gci -rec -file|%{"$($_.Length)~$($_.Name)~$($_.FullName)"} >filelist.txt
Ctrl+A then Ctrl+C ---> Copy into Excel

Batch renaming with forfiles, or potentially Powershell

I have an issue when trying to bulk-rename files within folders using forfiles and PowerShell. I need to search for files with _T_ in their names, so I use the search pattern *_T_*.cr2, as they are all RAW files. In a given folder there will be around 143 RAW files, 69 of which will have _T_ in their names and the rest won't. What I want to achieve is a quick command that weeds out all files with _T_ in them and then adds an extra _ before the filename.
So before: Ab01_T_gh.cr2 and after _Ab01_T_gh.cr2
My issue is that searching for the _T_ files causes the command to keep executing over and over, so a file eventually looks like _____________Ab01etc until it hits the Windows filename-length limit.
Here's what my forfiles command looks like:
forfiles /S /M *_T_*.cr2 /C "cmd /c rename @file _@file"
It works but it works a little too well.
I also tried Powershell with the same result.
Get-ChildItem -Filter "*_T_*.cr2" -Recurse | Rename-Item -NewName { "_" + $_.Name}
Perhaps there's a way I could split up the "find" and "rename" parts of the code? Any help would be greatly appreciated! I can't manually separate out the _T_ files as it would be very time intensive as each parent folder will sometimes have 75 subfolders with 143 RAW files in each.
This works fine:
@echo off
setlocal EnableDelayedExpansion
for %%a in (*_T_*.cr2) do (
set "z=%%a"
set "z=!z:~0,1!"
if not "!z!"=="_" rename "%%a" "_%%a"
)
I had to drop forfiles since I could not work with the @ variables (maybe with findstr).
I resorted to a simple for loop on the pattern, and only rename if the name doesn't already start with an underscore.
Recursive version:
@echo off
setlocal EnableDelayedExpansion
for /R . %%a in (*_T_*.cr2) do (
echo %%~na%%~xa
set "z=%%~na"
set "z=!z:~0,1!"
if not "!z!"=="_" rename "%%a" "_%%~na%%~xa"
)
Here's a PowerShell version based on your original code:
Get-ChildItem -Filter "*_T_*.cr2" -Recurse |
Where-Object { $_.Name.Substring(0,1) -ne '_' } |
Rename-Item -NewName { "_" + $_.Name }
