How to combine two commands to get one output using command prompt - cmd

I have a folder on a shared network drive with a large number of text files. I am required to list the file name, size, and number of lines/rows in each file. I am able to use the command prompt to get the output separately, but I cannot seem to combine them.
This works perfectly to list the file name and size:
DIR /s "files location*.txt" > Directory.txt
This works for the line count:
for %f in ("files location*.txt") do find /v /c "" "%f"
I tried the following to combine them, but the output file was empty and the command prompt window showed the full file location and name without the line count:
DIR /s "files location*.txt" | for %f in ("files location*.txt") do find /v /c "" "%f" > Directory.txt

I think this question has been here before. Put these two files into the same directory. The directory should be in the PATH variable. Many things could be done to make this more flexible using parameters. If you are on a supported Windows system, PowerShell will be available. If you have PowerShell 6 or higher, change powershell to pwsh.
=== Get-FileLineCount.bat
@ECHO OFF
powershell -NoLogo -NoProfile -File "%~dp0Get-FileLineCount.ps1"
EXIT /B
=== Get-FileLineCount.ps1
Get-ChildItem -File -Path 'C:\src\t' -Filter '*.txt' |
    ForEach-Object {
        [PSCustomObject]@{
            LastWriteTime = $_.LastWriteTime
            Length        = $_.Length
            LineCount     = (Get-Content -Path $_.FullName | Measure-Object).Count
            FileName      = $_.FullName
        }
    }
This produces the following output.
LastWriteTime Length LineCount FileName
------------- ------ --------- --------
2021-04-08 08:14:59 3 1 C:\src\t\abc.txt
2021-04-08 08:16:39 8 1 C:\src\t\abc-utf-8.txt
2019-07-08 11:38:36 30 1 C:\src\t\append.txt
2019-07-08 11:38:36 36 12 C:\src\t\appendtemp.txt
2020-03-06 09:48:51 104 25 C:\src\t\Combined.txt
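For completeness, the two cmd commands from the question can also be combined directly: find /v /c "" already prints each file name together with its line count, and the %~z modifier expands to the file's size, so redirecting the whole loop (rather than each iteration) collects everything in one file. A minimal sketch for an interactive prompt, using the same placeholder path as the question (double the percent signs inside a batch file):
(for %f in ("files location*.txt") do @(echo %~nxf %~zf bytes & find /v /c "" "%f")) > Directory.txt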

Related

Saving results after a for loop in Windows CMD

I have been trying to create a line of code to ping a range of IP addresses in the Windows command prompt and, after it finishes, save the results in a text file. I am using a for loop to do the pinging, but I can't figure out how to save the results in a text file.
This is what I am using:
for /l %i in (1,1,64) do @ping 10.39.63.%i -w 1500 -n 1 | find "Reply"
I tried using the following code to save the results in a text file, but it only saves the output of the last command performed by CMD:
for /l %i in (1,1,64) do @ping 10.39.63.%i -w 100 -n 1 | find "Reply" >C:\Users\brymed\Desktop\test.txt
I want to keep it simple, so it'd be awesome to use only a line of code, but I am open to suggestions. Thank you.
This is not difficult using PowerShell. The $Hosts variable is a list of IP addresses to ping. The results are written to a file.
$Hosts = @()
foreach ($i in 1..64) { $Hosts += "10.39.63.$i" }
Test-Connection -Count 1 $Hosts |
    Select-Object -Property Address,BufferSize,Latency,Status |
    Out-File -FilePath "$Env:USERPROFILE/Desktop/test.txt" -Encoding ascii
If you -must- run this in cmd.exe, the code can be formatted to do so.
pwsh.exe -NoLogo -NoProfile -Command ^
    "$Hosts = @();" ^
    "foreach ($i in 1..64) { $Hosts += \"10.39.63.$i\" };" ^
    "Test-Connection -Count 1 $Hosts -ErrorAction SilentlyContinue |" ^
    "Select-Object -Property Address,BufferSize,Latency,Status |" ^
    "Out-File -FilePath \"$Env:USERPROFILE/Desktop/test.txt\" -Encoding ascii"
Get PowerShell Core from https://github.com/PowerShell/PowerShell
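If a plain cmd one-liner is preferred, the original loop can also be fixed by redirecting the output of the whole loop instead of each individual ping; with a per-iteration >, every iteration overwrites the file, which is why only the last result survived. A sketch for an interactive prompt (double the percent signs in a batch file):
(for /l %i in (1,1,64) do @ping 10.39.63.%i -w 100 -n 1 | find "Reply") > C:\Users\brymed\Desktop\test.txt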

Batch file: Combine echos into one line

I want to dump all the file names in a folder without extension into a text file. They should be in one line separated by commas.
So in my folder I have
File1.bin
File2.bin
....
With
(for %%a in (.\*.bin) do @echo %%~na,) >Dump.txt
I got
File1,
File2,
But what I want in the end is a text file with everything on one line, so one long combined string.
File1,File2,...
I'm kinda stuck here and probably need something other than echo.
Thanks for trying to help.
Try like this:
@echo off
setlocal enableDelayedExpansion
for %%a in (.\*.bin) do (
    <nul set /p "=%%~na,"
)
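To get the result into Dump.txt instead of on the console, the whole loop's output can be redirected, in the same way as in the question; a minimal sketch assuming the .bin files from the question:
@echo off
rem Write all base names, comma-separated, on a single line into Dump.txt.
(for %%a in (.\*.bin) do <nul set /p "=%%~na,") > Dump.txt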
Check also the accepted answer here and dbenham's one.
You could also leverage powershell from a batch-file for this task:
#"%__APPDIR__%WindowsPowerShell\v1.0\powershell.exe" -NoProfile -Command "( Get-Item -Path '.\*' -Filter '*.bin' | Where-Object { -Not $_.PSIsContainer } | Select-Object -ExpandProperty BaseName ) -Join ',' | Out-File -FilePath '.\dump.txt'"
This could probably be shortened, if necessary, to:
@PowerShell -NoP "(GI .\*.bin|?{!$_.PSIsContainer}|Select -Exp BaseName) -Join ','>.\dump.txt"

Start multiple exe with arguments

I have around 5000 folders each containing a dos executable and required files.
Currently I am using a for loop to call the code below. It takes a long time to execute them one by one, as each execution takes around 5 seconds.
Is there an option where I can execute all the exe files at the same time ?
Any ideas?
Thanks
I tried using
start "" 1/ddd.exe input.dat
start "" 2/ddd.exe input.dat
start "" 3/ddd.exe input.dat
.
.
.
in a batch file. input.dat has the arguments to pass on to the exe, but the exe opens up a new window and it is not taking the arguments. The first argument is "2", which runs to a certain part of the exe, and the second is any number, to exit the program after it has finished.
You tried:
for /D %%a in (*) do (
    echo processing: %%a
    start /B "Name" cmd.exe "cd %%a & ddy.exe < parameters.txt"
)
I would prefer start /D "%%a" /min "Name" cmd.exe /c "ddy.exe < parameters.txt". /B causes them to use the same console and they may block each other. /D sets the working folder (no need for cd), /min minimizes the windows to keep your screen clean.
And don't forget /c with the cmd command (without it, you get no parallel processes).
As a whole:
for /D %%a in (*) do (
    echo processing: %%a
    start /D "%%a" /min "Name" cmd.exe /c "ddy.exe < parameters.txt"
)
This is a -very- minimalistic script to run N commands at a time from a list. If you are on a supported Windows system, it will have PowerShell.
There is no error checking or proper help information. It writes stdout to the specified log file, but does nothing with the exit code from the command. If something fails, it would need to be identified from the log file.
To use this, put the following code into the file Invoke-JobList.ps1
Create a .csv file with the commands you want to run and a different log file name for each command. The log file name cannot be the same for multiple commands. If you have 5000 commands to process, you will probably need to write a script/program to produce it; a small generator sketch is shown after the example invocation below.
I provided a sample .csv file and a batch file that I used for testing. You do not need to use to.bat.
=== Get-Content .\Invoke-JobList.ps1
[CmdletBinding()]
Param (
    [Parameter(Mandatory=$true)]
    [string[]]$jobFile
    ,[Parameter(Mandatory=$false)]
    [int]$nConcurrent = 2
)

$jobs = Import-Csv -Path $jobFile
$jobHash = @{}
$nJobsRunning = 0

foreach ($job in $jobs) {
    if ($nJobsRunning -lt $nConcurrent) {
        Write-Verbose -Message "starting command $($job.command)"
        $j = Start-Job -ScriptBlock ([ScriptBlock]::Create($job.command))
        $jobHash[$j] = $job.logfile
        $nJobsRunning++
    }
    while ($nJobsRunning -ge $nConcurrent) {
        # wait for one or more jobs to reach the state Completed
        $jobsRunning = Get-Job
        foreach ($jobRun in $jobsRunning) {
            if (($null -ne $jobHash[$jobRun]) -and ($jobRun.State -eq 'Completed')) {
                Receive-Job -Job $jobRun | Out-File -FilePath $jobHash[$jobRun]
                Remove-Job -Job $jobRun
                $jobHash.Remove($jobRun)
                $nJobsRunning--
            }
        }
    }
}

Write-Verbose -Message $($nJobsRunning.ToString() + " remaining jobs")
# Wait for all remaining jobs to complete
while ($nJobsRunning -gt 0) {
    $jobsRunning = Get-Job
    foreach ($jobRun in $jobsRunning) {
        if (($null -ne $jobHash[$jobRun]) -and ($jobRun.State -eq 'Completed')) {
            Receive-Job -Job $jobRun | Out-File -FilePath $jobHash[$jobRun]
            Remove-Job -Job $jobRun
            $jobHash.Remove($jobRun)
            $nJobsRunning--
        }
    }
}
=== Get-Content .\joblist3.csv
command,logfile
C:\src\jobs\to.bat 10,ss-001.txt
C:\src\jobs\to.bat 10,ss-002.txt
C:\src\jobs\to.bat 10,ss-003.txt
C:\src\jobs\to.bat 10,ss-004.txt
C:\src\jobs\to.bat 10,ss-005.txt
C:\src\jobs\to.bat 10,ss-006.txt
C:\src\jobs\to.bat 10,ss-007.txt
=== Get-Content .\to.bat
@ECHO OFF
SET "TO=%1"
IF "%TO%" == "" (SET "TO=5")
REM Cannot use TIMEOUT command
ping -n %TO% localhost
EXIT /B 0
Invoke it with parameters.
.\Invoke-JobList.ps1 -jobFile joblist3.csv -nConcurrent 3 -Verbose
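As noted above, a joblist .csv for thousands of folders is best produced by a small script. A hedged cmd sketch, assuming the layout from the question (each subfolder of the current directory holds ddd.exe and input.dat); the log file names here are only placeholders:
@ECHO OFF
REM Hypothetical generator: writes one csv row per subfolder.
REM Each command switches into the folder and feeds input.dat to ddd.exe.
> joblist.csv ECHO command,logfile
FOR /D %%a IN (*) DO >> joblist.csv ECHO cmd /c "cd /d %CD%\%%a & ddd.exe < input.dat",log-%%a.txt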

file listing - Including all subfolders with filename, path, date and size

I am trying to get a list of all files within a directory including those within all subfolders.
The columns I would like are the Filename, Path, Size and Date.
I have tried to do my own research and come close but not yet hit the full solution.
I can get the filepath and filename together with date and size using the command below; unfortunately, I cannot get the files within subfolders.
dir /t > filelist1.txt
The CMD command below does get the filenames from all subfolders, but I cannot get it to produce dates.
(@For /F "Delims=" %A in ('dir /B/S/A-D') Do @Echo %~fA %~zA) >filelist.txt
I thought maybe this would include dates, but it didn't work.
(@For /F "Delims=" %A in ('dir /B/S/A/D') Do @Echo %~fA %~zA) >filelist.txt
This also gives me the path and filename together, which I can accept (I will use Excel to separate them), but is it possible to have the path and filename separated?
Also, is it possible to have those columns separated by tabs for easier Excel import?
This could be done with %~ variables in a cmd.exe batch script, but it is easier and more readable in PowerShell. The output is in a file named FileList.csv.
Get-ChildItem -File -Recurse |
    ForEach-Object {
        [PSCustomObject]@{
            FileName = $_.Name
            Path     = $_.Directory
            Size     = $_.Length
            Date     = $_.LastWriteTime
        }
    } |
    Export-Csv -Path './FileList.csv' -Delimiter "`t" -Encoding ASCII -NoTypeInformation
If you do not want to run a separate .ps1 file from the .bat script, the code can, with enough effort, be embedded in the .bat script itself.
powershell -NoLogo -NoProfile -Command ^
    "Get-ChildItem -File -Recurse |" ^
    "ForEach-Object {" ^
    "[PSCustomObject]@{" ^
    "FileName = $_.Name;" ^
    "Path = $_.Directory;" ^
    "Size = $_.Length;" ^
    "Date = $_.LastWriteTime" ^
    "}" ^
    "} |" ^
    "Export-Csv -Path './FileList.csv' -Delimiter \"`t\" -Encoding ASCII -NoTypeInformation"

Get length of the longest file path in a folder and its sub folders

I'm looking for a script that can be run from the command line (batch/PowerShell) that will go over a folder and its subfolders and return a number which is the length of the longest file path.
I already saw some batch and PowerShell scripts like
How do I find files with a path length greater than 260 characters in Windows?
but none of them satisfies my request.
Note that it's possible that a file path will be more than 256 characters.
PowerShell:
((Get-ChildItem -Recurse).FullName | Measure-Object -Property Length -Maximum).Maximum
Command line:
powershell -exec Bypass -c "((dir -rec).FullName | measure Length -max).Maximum"
Edit
Related to the error Get-ChildItem : The specified path, file name, or both are too long: read Maximum Path Length Limitation and related [PowerShell]-tagged Stack Overflow threads.
PS D:\PShell> ((Get-ChildItem "D:\odds and ends" -Directory -Recurse).FullName | Measure-Object -Property Length -Maximum).Maximum
242
PS D:\PShell> ((Get-ChildItem "D:\odds and ends" -Recurse -ErrorAction SilentlyContinue).FullName | Measure-Object -Property Length -Maximum).Maximum
242
Note that -ErrorAction SilentlyContinue in the above command merely suppresses the display of error messages. However, I know that the 242 value returned by the latter command is wrong.
My workaround applies cmd /C dir /B /S instead of (Get-ChildItem -Recurse).FullName as follows:
PS D:\PShell> $x = (. cmd /C dir /B /S "D:\odds and ends")
PS D:\PShell> $y = ( $x | Measure-Object -Property Length -Maximum).Maximum
PS D:\PShell> $y
273
PS D:\PShell> $z = $x | Where-Object { $_.Length -gt 260 }
PS D:\PShell> $z.GetTypeCode()
String
PS D:\PShell> $z
D:\odds and ends\ZalohaGogen\WDElements\zalohaeva\zaloha_honza\Music\Jazz\!Kompilace\Saint Germain des Pres Cafe Vol. 1 to 8 - The Finest Electro Jazz Complication\Saint Germain Des Pres Cafe Vol. 7 - The Finest Electro Jazz Complication\CD 1\Configuring and Using Inte.txt
PS D:\PShell>
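For convenience, the workaround can be collapsed into a single line callable from cmd, in the same style as the earlier command (a sketch; it measures the length of every line that dir /B /S prints and so is not affected by Get-ChildItem's long-path errors):
powershell -exec Bypass -c "(cmd /c dir /B /S 'D:\odds and ends' | Measure-Object -Property Length -Maximum).Maximum"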
