I want to dump all the file names in a folder without extension into a text file. They should be in one line separated by commas.
So in my folder I have
File1.bin
File2.bin
....
With
(for %%a in (.\*.bin) do @echo %%~na,) >Dump.txt
I got
File1,
File2,
But what I want in the end is a text file with one long combined string:
File1,File2,...
I'm kinda stuck here and probably need something other than echo.
Thanks for trying to help.
Try like this:
@echo off
setlocal enableDelayedExpansion
(for %%a in (.\*.bin) do (
    <nul set /p=%%~na,
)) >Dump.txt
Check also the accepted answer here and dbenham's one.
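For reference, another common pure-batch pattern (just a sketch of the same idea, not necessarily the exact code from those answers) accumulates the names in a variable with delayed expansion and writes one combined line:
@echo off
setlocal enableDelayedExpansion
set "list="
rem Build one comma-separated string of base names (note: this breaks on
rem names containing "!" and is limited by the ~8 KB variable size).
for %%a in (.\*.bin) do set "list=!list!%%~na,"
rem Drop the trailing comma, then write the single combined line.
if defined list set "list=!list:~0,-1!"
>Dump.txt echo(!list!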
You could also leverage PowerShell from a batch file for this task:
#"%__APPDIR__%WindowsPowerShell\v1.0\powershell.exe" -NoProfile -Command "( Get-Item -Path '.\*' -Filter '*.bin' | Where-Object { -Not $_.PSIsContainer } | Select-Object -ExpandProperty BaseName ) -Join ',' | Out-File -FilePath '.\dump.txt'"
This could probably be shortened, if necessary, to:
@PowerShell -NoP "(GI .\*.bin|?{!$_.PSIsContainer}|Select -Exp BaseName) -Join ','>.\dump.txt"
I have a folder on a shared network drive with a large number of text files. I am required to list the file name, size and number of lines/rows in each file. I am able to use the command prompt to get the output separately, but I cannot seem to combine them.
This works perfectly to list the file name and size:
DIR /s "files location*.txt" > Directory.txt
This works for the line count:
for %f in ("files location*.txt" ) do find /v /c "" "%f"
I tried the following to combine them, but the output file was empty and the command prompt window showed the full file location and name without the line count:
DIR /s "files location*.txt" | for %f in ("files location*.txt") do find /v /c "" "%f" > Directory.txt
I think this question has been here before. Put these two (2) files into the same directory. The directory should be in the PATH variable. Many things could be done to make this more flexible using parameters. If you are on a supported Windows system, PowerShell will be available. If you have PowerShell 6 or higher, change powershell to pwsh.
=== Get-FileLineCount.bat
@ECHO OFF
powershell -NoLogo -NoProfile -File "%~dp0Get-FileLineCount.ps1"
EXIT /B
=== Get-FileLineCount.ps1
Get-ChildItem -File -Path 'C:\src\t' -Filter '*.txt' |
    ForEach-Object {
        [PSCustomObject]@{
            LastWriteTime = $_.LastWriteTime
            Length        = $_.Length
            LineCount     = (Get-Content -Path $_.FullName | Measure-Object).Count
            FileName      = $_.FullName
        }
    }
This produces the following output.
LastWriteTime Length LineCount FileName
------------- ------ --------- --------
2021-04-08 08:14:59 3 1 C:\src\t\abc.txt
2021-04-08 08:16:39 8 1 C:\src\t\abc-utf-8.txt
2019-07-08 11:38:36 30 1 C:\src\t\append.txt
2019-07-08 11:38:36 36 12 C:\src\t\appendtemp.txt
2020-03-06 09:48:51 104 25 C:\src\t\Combined.txt
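If the listing should end up in a file rather than on the console (the question redirected the DIR output to Directory.txt), the same pipeline can be sent to Export-Csv; the output path below is just an example:
Get-ChildItem -File -Path 'C:\src\t' -Filter '*.txt' |
    ForEach-Object {
        [PSCustomObject]@{
            LastWriteTime = $_.LastWriteTime
            Length        = $_.Length
            LineCount     = (Get-Content -Path $_.FullName | Measure-Object).Count
            FileName      = $_.FullName
        }
    } |
    Export-Csv -Path '.\Directory.csv' -NoTypeInformation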
I am trying to get a list of all files within a directory including those within all subfolders.
The columns I would like are the Filename, Path, Size and Date.
I have tried to do my own research and come close but not yet hit the full solution.
I can get the file path and filename together with date and size using the command below; unfortunately, it does not include files within subfolders.
dir /t > filelist1.txt
The CMD command below does get the filenames from all subfolders, but I cannot get it to produce dates.
(@For /F "Delims=" %A in ('dir /B/S/A-D') Do @Echo %~fA %~zA) >filelist.txt
I thought maybe this would include the dates, but it didn't work.
(@For /F "Delims=" %A in ('dir /B/S/A/D') Do @Echo %~fA %~zA) >filelist.txt
This also gives me the path and filename together, which I can accept (I will use Excel to separate them), but is it possible to have the path and the filename separated?
Also, is it possible to have those columns separated by tabs for easier Excel import?
This could be done with %~ modifiers in a cmd.exe batch script, but it is easier and more readable in PowerShell.
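For comparison, a minimal cmd.exe sketch using the %~ modifiers (the root path and output file name are placeholders; for a tab-separated file, the spaces between the modifiers would have to be literal TAB characters):
@echo off
rem Columns: FileName, Path, Size, Date.
(for /r "C:\path\to\root" %%A in (*) do @echo %%~nxA %%~dpA %%~zA %%~tA) >filelist.txt
The PowerShell version below writes its output to a file named FileList.csv.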
Get-ChildItem -Recurse |
    ForEach-Object {
        [PSCustomObject]@{
            FileName = $_.Name
            Path     = $_.Directory
            Size     = $_.Length
            Date     = $_.LastWriteTime
        }
    } |
    Export-Csv -Path './FileList.csv' -Delimiter "`t" -Encoding ASCII -NoTypeInformation
If you do not want to run the .ps1 file from the .bat script, the code can, with enough effort, be put into the .bat script itself.
powershell -NoLogo -NoProfile -Command ^
"Get-ChildItem -Recurse |" ^
"ForEach-Object {" ^
"[PSCustomObject]@{" ^
"FileName = $_.Name;" ^
"Path = $_.Directory;" ^
"Size = $_.Length;" ^
"Date = $_.LastWriteTime" ^
"}" ^
"} |" ^
"Export-Csv -Path './FileList.csv' -Delimiter \"`t\" -Encoding ASCII -NoTypeInformation"
Some time ago, @Magoo was nice enough to help me in working out a FOR /F command to archive files into 7-Zip:
Using FOR or FORFILES batch command to individually archive specific files
Since then I've expanded it somewhat and want to do more elaborate things with it. However, to keep this question simple, I've included the basics to try and get this working, which I haven't had much success at.
I'm rather new to PowerShell and I have some specific reasons to use this instead of batch files, moving forward.
I understand that some more experienced users may note that I will have a reduction in performance by using such statements in PowerShell, but it isn't an important issue for me.
$env:Path += ";C:\Program Files\7-Zip"
$sourcedir = read-host "Enter the directory to archive: "
foreach ($aname in {
'cmd /c dir /s/b /a-d "$sourcedir\*.iso" '
'cmd /c dir /s/b /a-d "$sourcedir\*.daa" '
'cmd /c dir /s/b /a-d "$sourcedir\*.nrg" '
'cmd /c dir /s/b /a-d "$sourcedir\*.flp" '}
) {
IF NOT EXIST $aname.7z (
echo 7z a -t7z "$aname.7z" "$aname" -mx9 -mmt >> Z:\test\7z-log.txt
ECHO "$aname" archived.
) ELSE (
ECHO "$aname" archive file already exists.
)
}
I got into some trouble with the IF EXIST statement; even when I removed the IF and had a one-line ECHO to simplify it even further, I couldn't get it to output what I wanted.
So, I tried a different approach:
$env:Path += ";C:\Program Files\7-Zip"
$sourcedir = read-host "Enter the directory to archive: "
$dir_iso = ForEach-Object { cmd /c dir /s/b /a-d "$sourcedir\*.iso" }
$dir_daa = ForEach-Object { cmd /c dir /s/b /a-d "$sourcedir\*.daa" }
$dir_nrg = ForEach-Object { cmd /c dir /s/b /a-d "$sourcedir\*.nrg" }
$dir_flp = ForEach-Object { cmd /c dir /s/b /a-d "$sourcedir\*.flp" }
foreach ($aname in $dir_iso,$dir_daa,$dir_nrg,$dir_flp) {
ECHO "$aname" archived.
}
But what this did was clump all items of each type together and then append "archived" to that set. Something like:
C:\folder1\iso1.iso C:\folder1\iso2.iso C:\folder1\iso3.iso archived.
C:\folder2\image.nrg archived.
C:\folder3\app1.flp C:\folder3\app2.flp archived.
instead of:
C:\folder1\iso1.iso archived.
C:\folder1\iso2.iso archived.
C:\folder1\iso3.iso archived.
C:\folder2\image.nrg archived.
C:\folder3\app1.flp archived.
C:\folder3\app2.flp archived.
I'm having a real hard time with getting this to work. Can anyone help?
Thanks.
The first thing I see here is that you are using this to get file information from the filesystem:
$dir_iso = ForEach-Object { cmd /c dir /s/b /a-d "$sourcedir\*.iso" }
More specifically, cmd /c dir /s/b /a-d "$sourcedir\*.iso". This would translate easily to Get-ChildItem:
$dir_iso = Get-ChildItem -Path $sourcedir -Filter "*.iso" -Recurse -File | Select-Object -ExpandProperty FullName
Path: the folder you are checking.
Filter: you want only files ending with .iso.
Recurse: all subdirectories are checked.
File: returns only files and not directories (PowerShell 3.0 or higher; a simple pre-3.0 equivalent is sketched after this list).
Select-Object -ExpandProperty FullName: returns just the full paths of the files found, as an array.
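A minimal sketch of that pre-3.0 equivalent, assuming the PSIsContainer filter is what is meant:
$dir_iso = Get-ChildItem -Path $sourcedir -Filter "*.iso" -Recurse |
    Where-Object { -not $_.PSIsContainer } |
    Select-Object -ExpandProperty FullName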
IF EXIST can be replaced by Test-Path:
If(Test-Path "$aname.7z"){
Do stuff...
}
As for the foreach loop foreach ($aname in $dir_iso,$dir_daa,$dir_nrg,$dir_flp), there are a couple of good approaches, but the simplest transition would be:
$dir_iso + $dir_daa + $dir_nrg + $dir_flp | ForEach-Object{
Do Stuff
}
I would probably build the file collection in one variable to begin with, to avoid having to concatenate the arrays together:
$files = Get-ChildItem -Path $sourcedir -Recurse -File | Where-Object{ $_.Extension -match "\.(iso|daa|nrg|flp)$" } | Select-Object -ExpandProperty FullName
$files | ForEach-Object{
Write-Host "$_"
}
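Putting those pieces together, a rough end-to-end sketch (it assumes 7z.exe is reachable through the $env:Path line from the question, actually runs the archive command instead of just echoing it to a log, and puts each archive next to its source file):
$env:Path += ";C:\Program Files\7-Zip"
$sourcedir = Read-Host "Enter the directory to archive"

Get-ChildItem -Path $sourcedir -Recurse -File |
    Where-Object { $_.Extension -match '\.(iso|daa|nrg|flp)$' } |
    ForEach-Object {
        $archive = "$($_.FullName).7z"
        if (Test-Path $archive) {
            Write-Host "$($_.FullName) archive file already exists."
        }
        else {
            # Create the 7z archive next to the source file.
            7z a -t7z $archive $_.FullName -mx9 -mmt | Out-Null
            Write-Host "$($_.FullName) archived."
        }
    }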
Use write-host instead of echo.
You have an array of arrays, so you'll need an inner loop.
foreach ($aname in $dir_iso,$dir_daa,$dir_nrg,$dir_flp) {
foreach ($bname in $aname) {
write-host "$bname" archived.
}
}
Another possibility; this pipes the commands together rather than storing each in a separate variable.
Get-Item $sourcedir\* |
Where {$_.Extension -like ".iso" -or $_.Extension -like ".daa" -or $_.Extension -like ".nrg" -or $_.Extension -like ".flp"} |
Foreach-Object {
if (-Not (Test-Path "$($_.BaseName).7z"))
{
Write-Host $_.FullName not yet archived.
}
else
{
Write-Host $_.FullName already archived.
}
}
I have an XML file Testing.Config with the following content:
<?xml version="1.0" encoding="utf-8"?>
<connectionStrings>
<add name="name1" connectionString="user id=id1;password=password1;"/>
<add name="name2" connectionString="user id=id2;password=password2;"/>
<add name="name3" connectionString="user id=id3;password=password3;"/>
</connectionStrings>
I need to parse this file and obtain the id and password key-value pairs from the connectionString attribute of the tag identified by the provided name attribute, for example name=name1.
Example
Input:
name=name1
Returns:
id=id1
password=password1
@echo off
set "xml_file=test.xml"
set /p search_for=Enter name:
for /f "skip=2 tokens=7,9 delims=;= " %%a in ('find """%search_for%""" "%xml_file%"') do (
set "id=%%~a"
set "pass=%%b"
)
echo id : %id%
echo password : %pass%
This works if all connection strings are on separate lines and each string fits on one line. Change the location of xml_file as needed.
You can also try xpath.bat (the better option, in my opinion), a small script that lets you get XML values by XPath expression without using external binaries:
call xpath.bat connection.xml "//add[@name = 'name1']/@connectionString"
Since you indicated (in comments on the question) that PowerShell is also okay, put the following code in a script file (let's say Foo.ps1)
param
(
[Parameter(Mandatory=$true)]
[string] $ConfigFilePath,
[Parameter(Mandatory=$true)]
[string] $Name
)
([xml](Get-Content -LiteralPath $ConfigFilePath)).connectionStrings.add |
Where-Object {$_.name -eq $name} |
ForEach-Object {($_.connectionString -split ' ')[1] -split ';'}
and then run the script with parameters to get the output.
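For example (using the file and name from the question):
.\Foo.ps1 -ConfigFilePath .\Testing.Config -Name name1
which prints id=id1 and password=password1 (plus an empty element from the trailing semicolon).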
In case someone comes looking for this like I did... the xpath.bat linked by npocmaka works great, but not for files on network paths, e.g. \\server\foo\bar.xml. And I couldn't get Vikus's PS script to work for attributes. So... I managed to turn Vikus's PS into this:
param
(
[Parameter(Mandatory=$true)]
[string] $XMLFile,
[Parameter(Mandatory=$true)]
[string] $Xpath
)
[xml]$xml = Get-Content -Path $XMLFile
$value = Select-XML -xml $xml -xpath $Xpath
$value.ToString()
Which I then collapsed to this for use in a .cmd file:
set psc="(Select-XML -xml ([xml](Get-Content -Path %xmlfile%)) -xpath %xpath%).ToString()"
for /f %%a in ('powershell %psc%') do set myvar=%%a
Though it's worth noting that neither %xmlfile% nor %xpath% can have spaces. That requires escaping all the parentheses so that the powershell command doesn't have to be wrapped in double quotes:
for /F %a in ('powershell ^(Select-XML -xml ^([xml]^(Get-Content -Path "test.xml"^)^) -xpath "/Computers/Computer/Name/@id"^).ToString^(^)') do @echo %a
FWIW, they're all markedly slower than my original fragile monster:
@for /F eol^=^#^ tokens^=2^,5^,7^ delims^=^<^>^" %%i in (%xmlfile%) do @(
    @if "%targetnodename% ATTR1=" == "%%i" (
        set myvar1=%%j
        set myvar2=%%k
    )
)
I'm developing an application and some paths have to be changed across the whole project. The paths are fixed and the files can be edited (they are .cshtml files).
So I think I can use a batch file to change all the http://localhost.com occurrences to http://domain.com, for example (I know about relative and absolute paths, but here I HAVE to do it this way :-) )
So if you have code that can make those changes in the files, it would be marvellous!
To complete my question, here are the paths of the files and directories:
MyApp
MyApp/Views
MyApp/Views/Index/page1.cshtml
MyApp/Views/Index/page2.cshtml
MyApp/Views/Another/page7.cshtml
...
Thanks for helping me :-)
Something like this might work as well:
#!/bin/bash
s=http://localhost.com
r=http://example.com
cd /path/to/MyApp
grep -rl "$s" * | while IFS= read -r f; do
sed -i "s|$s|$r|g" "$f"
done
Edit: Or not, since you just switched from bash to batch-file. A batch solution might look like this:
@echo off
setlocal EnableDelayedExpansion
for /r "C:\path\to\MyApp" %%f in (*.cshtml) do (
    (for /f "usebackq delims=" %%l in ("%%f") do (
        set "line=%%l"
        echo(!line:http://localhost.com=http://domain.com!
    )) >"%%~ff.new"
    del /q "%%~ff"
    ren "%%~ff.new" "%%~nxf"
)
Doing this in batch is really, really ugly, though (error-prone too), and you'd be far better off using sed for Windows, or (better yet) doing it in PowerShell:
$s = "http://localhost.com"
$r = "http://example.com"
Get-ChildItem "C:\path\to\MyApp" -Recurse -Filter *.chtml | ForEach-Object {
(Get-Content $_.FullName) |
ForEach-Object { $_ -replace [regex]::Escape($s), $r } |
Set-Content $_.FullName
}
Note that if -Filter gives you trouble in older PowerShell versions, you can do it like this instead:
Get-ChildItem "C:\path\to\MyApp" -Recurse | Where-Object {
-not $_.PSIsContainer -and $_.Extension -eq ".cshtml"
} | ForEach-Object {
(Get-Content $_.FullName) |
ForEach-Object { $_ -replace [regex]::Escape($s), $r } |
Set-Content $_.FullName
}
You can do this:
find /MyApp -name "*.cshtml" -type f -exec sed -i 's#http://localhost.com#http://domain.com#g' {} +
Explanation
find /MyApp -name "*.cshtml" -type f looks for files with .cshtml extension in /MyApp structure.
sed -i 's/IN/OUT/g' replaces the text IN with OUT in the files.
Hence, sed -i 's#http://localhost.com#http://domain.com#g' replaces http://localhost.com with http://domain.com.
-exec ... {} + runs the given command on the files found by find.