Windows Batch script to read pom.properties file within .jar - windows

I'm looking for a simple way to read the 2nd line of the pom.properties file that is placed within the META-INF folder of a compiled .jar. (see here: http://maven.apache.org/guides/getting-started/index.html#How_do_I_add_resources_to_my_JAR). I often need to know the date in that file and it's just a pain to have to open the jar every time and dig down into it. I want a Windows batch script that I can run via right-clicking on a .jar (so I'll need help with the Windows registry command as well). The result of the batch command can just be displayed in a cmd window (a nice bonus would be the value being copied to the clipboard, too).
In short: I want to be able to right-click on a .jar file in Windows Explorer > select 'Get Maven Generated Date' (or whatever) > and have the 2nd line of the pom.properties file printed to the console (and copied to the clipboard).
I know this can't be too hard, I just don't know quite what to look for :).
Thanks in advance for any help.

Note that .NET 4.5 (or later) is required to use the System.IO.Compression.FileSystem assembly.
Add-Type -AssemblyName System.IO.Compression.FileSystem
$sourceJar = "<source-jar-here>"
$jarArchive = [System.IO.Compression.ZipFile]::OpenRead($sourceJar)
try
{
    foreach ($archiveEntry in $jarArchive.Entries)
    {
        if ($archiveEntry.Name -like "*pom.properties")
        {
            $tempFile = [System.IO.Path]::GetTempFileName()
            try
            {
                [System.IO.Compression.ZipFileExtensions]::ExtractToFile($archiveEntry, $tempFile, $true)
                # Maven writes the build date on the second line of pom.properties
                $mavenDate = (Get-Content $tempFile -TotalCount 2)[1]
                Write-Host $mavenDate
                $mavenDate | clip.exe   # optional: copy the value to the clipboard as well
            }
            finally
            {
                Remove-Item $tempFile
            }
        }
    }
}
finally
{
    $jarArchive.Dispose()
}
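For the right-click part of the question: the script above does not wire itself into Explorer, but a context-menu entry can be registered under the ProgID that .jar files are associated with. The snippet below is only a rough sketch run from an elevated prompt; the script path C:\scripts\Get-MavenDate.ps1 and the menu caption are assumptions, the saved copy of the script would take the jar path as a parameter (e.g. param($sourceJar)) instead of the hard-coded placeholder, and the "jarfile" ProgID depends on how .jar is registered on your machine.
# Sketch only: path, caption and ProgID are assumptions, adjust to your setup
$key = 'Registry::HKEY_CLASSES_ROOT\jarfile\shell\Get Maven Generated Date\command'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name '(default)' -Value `
    'powershell.exe -NoExit -ExecutionPolicy Bypass -File "C:\scripts\Get-MavenDate.ps1" "%1"'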

Related

Windows PowerShell - Input file name, output file path

I've just started using PowerShell and I have a task where I need to be able to have the file path displayed on screen when I enter the file name.
Is there a script that allows me to do the below?
Ex 1: I enter "test.txt" and I get "C:\Program Files...."
Ex 2: I enter a file name "My Documents" and I also get its path.
I have searched online on how to do this but I didn't quite find what I was looking for and all the queries/answers were too complicated for me to understand.
Can anyone help me out, please?
Thanks in advance!
Here is a starter sample for you.
This example searches only within the paths listed in the Path system environment variable. It also only looks for files and does not recurse through those paths.
So anything you could access directly from the command line should be available to you through it.
Now, if you want to search the whole drive, you could replace the $DefaultPaths assignment with Get-ChildItem -Path 'C:\' -Recurse, but doing that each time won't be very efficient.
You could do it and it will work... but it will be slow.
To search the whole drive or the whole filesystem, there are alternative methods that might work better. Some examples:
Using a database, which you have to build and maintain, to index all the files so that when you search, results are instantaneous and/or very fast
Parsing the MFT (only on a Windows/NTFS filesystem) instead of using Get-ChildItem (this is not something natively doable through a simple cmdlet, though)
Relying on third-party software and interfacing with it (for example, Void Tools' Everything search engine already parses the MFT and builds its own database, allowing users to search a Windows NTFS filesystem instantly. It also has its own SDK you can call from PowerShell to retrieve what you seek instantly. The caveat is that you need the software installed for that solution to work.)
Example: Searching through all paths defined in the Path variable
# What you are looking for. Accepts wildcard characters (*)
$Filter = 'notepad.exe'
# Get the system Path environment variable as an array
$DefaultPaths = $env:Path -split ';'
$Paths =
    Foreach ($P in $DefaultPaths) {
        # Search for files matching the specified filter.
        # Ignore errors (often a path that no longer exists but is still listed in Path)
        $MatchingFiles = Get-ChildItem -Path $P -Filter $Filter -File -ErrorAction SilentlyContinue
        if ($MatchingFiles.Count -gt 0) {
            $MatchingFiles.Directory.FullName
        }
    }
$Paths | Out-String | Write-Host -ForegroundColor Cyan
Output for a notepad.exe search using this method:
C:\Windows\system32
C:\Windows
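If you specifically want to type a name and get the path back, a minimal sketch could look like the following. It assumes a whole-drive search from C:\ is acceptable, with the slowness caveat above; the prompt text is just an example.
# Prompt for a name, then search the whole C: drive (slow, see the caveat above)
$name = Read-Host 'Enter the file or folder name'
Get-ChildItem -Path 'C:\' -Recurse -Filter $name -ErrorAction SilentlyContinue |
    Select-Object -ExpandProperty FullName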

How to show file and folder names along with terminal icons without showing current directory?

I want to remove the space/area taken up by showing the
Directory: C:\Users\varun\Desktop\Projects\advanced-react-patterns-v2
when I run the command:
Get-ChildItem | Format-Wide
Additional details:
Using Windows Terminal & Powershell
Font used in the screenshot: TerminessTTF NF
Used Terminal-Icons
Note: The command
Get-ChildItem -Name
failed to show the terminal icons, which is kind of my main goal here.
When you are using a Format-* command, you are using the default formatting output for the File and Directory objects, which groups files by directory - hence the directory name at the top.
If you wanted to bypass this, you would have to write your own format.ps1xml file and then apply the formatting to your output.
$files = Get-ChildItem
foreach ($file in $files) {
    # Tag each object with a custom type name so the custom view applies to it
    $file.PSObject.TypeNames.Insert(0, 'Custom.Output.Type')
    $file
}
A small sample of format XML for the specified type name; customise as you wish.
<View>
    <Name>CustomFileFormatting</Name>
    <ViewSelectedBy>
        <TypeName>Custom.Output.Type</TypeName>
    </ViewSelectedBy>
    <TableControl>
        <AutoSize />
        <TableHeaders>
            <TableColumnHeader>
                <Label>FullName</Label>
                <Alignment>Left</Alignment>
            </TableColumnHeader>
        </TableHeaders>
        <TableRowEntries>
            <TableRowEntry>
                <TableColumnItems>
                    <TableColumnItem>
                        <PropertyName>FullName</PropertyName>
                    </TableColumnItem>
                </TableColumnItems>
            </TableRowEntry>
        </TableRowEntries>
    </TableControl>
</View>
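To actually apply the view, the XML above would need to be wrapped in the usual <Configuration><ViewDefinitions> envelope, saved to a .ps1xml file, and loaded into the session. A minimal sketch, assuming it is saved as Custom.Format.ps1xml (a file name chosen here for illustration):
# Load the custom view into the current session (file name is an assumption)
Update-FormatData -PrependPath .\Custom.Format.ps1xml

# Tag the objects and let the custom view render them
Get-ChildItem | ForEach-Object {
    $_.PSObject.TypeNames.Insert(0, 'Custom.Output.Type')
    $_
}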

Files bulk renaming - match a predefined text file

Good day,
I am trying to rename/organize files based on the match/lookup found in the text file.
I have a couple of hundred Cyrillic(Russian) named media files in a folder like this:
файл 35.avi
файл34.avi
файл2 4.avi
файл14.avi
*note that some files have spaces
The text file, with the desired names, looks like this:
файл 35.avi| 4. файл 35.avi
файл34.avi| 3. файл34.avi
файл2 4.avi| 1. файл2 4.avi
файл14.avi| 2. файл14.avi
The reason it looks that way (with | as a separator) is that I tried using "Bulk Renaming Utility", which uses the pipe | as a separator for its "Rename Pairs" function. So essentially, the filename to the right of the pipe | is the final product. Unfortunately, that function does not work with Cyrillic (Russian) or other non-standard characters.
I found PowerShell script HERE which appears to be almost what I need except that it does not match file names before renaming.
Similarly, I found this Python script HERE which does what I need, but it's for Ubuntu. Unfortunately, I am on Windows 7 and not sure it applies to me.
Any recommendations?
Thank you very much for your time!
You could read the text file into a hashtable, where the key is the old name (the value on the left hand side of the |), and the value is the new name:
$RenameTable = @{}
Get-Content textfile.txt | ForEach-Object {
    $OldName, $NewName = $_.Split('|')
    $RenameTable[$OldName] = $NewName
}
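Two details may need attention with these particular names: the text file has to be read with the right encoding for the Cyrillic keys to match, and the new names in your sample have a leading space after the |. A variant of the loop above, assuming the text file is UTF-8 encoded, could be:
$RenameTable = @{}
Get-Content textfile.txt -Encoding UTF8 | ForEach-Object {
    $OldName, $NewName = $_.Split('|')
    # Trim the leading space that follows the | in the sample pairs
    $RenameTable[$OldName] = $NewName.Trim()
}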
Then rename the files based on what is in the hashtable:
Get-ChildItem .\folder\with\avi\files | Rename-Item -NewName {
    if ($RenameTable.ContainsKey($_.Name)) {
        $RenameTable[$_.Name]
    } else {
        $_.Name
    }
}

Recursive "touch" on fileserver

There's something I want to accomplish while administering my Windows file server:
I want to change the "Last Modified" date of all the folders on my server (just the folders and subfolders, not the files within them) to be the same as the "Created" (or maybe "Last Modified") date of the most recent file within the folder. (In many cases, the date on the folder is MUCH newer than the newest file within it.)
I'd like to do this recursively, from the deepest subfolder to the root. I'd also like to do this without me manually entering any dates and times.
I'm sure that with a combination of scripting and a Windows port of "touch" I could maybe accomplish this. Do you have any suggestions?
This closed topic seems really close but I'm not sure how to only touch folders without touching the files inside, or how to get the most-recent-file's date. Recursive touch to fix syncing between computers
If it's for backup purposes, in Windows there is the Archive flag (instead of modifying the timestamp). You can set it recursively with ATTRIB /S (see ATTRIB /?)
If it is for other purposes, you can use a touch.exe implementation together with a recursive FOR:
FOR /R (see FOR /?)
http://ss64.com/nt/for_r.html
http://ss64.com/nt/touch.html
I think that you can do this in PowerShell. I just tried throwing something together and it seems to work correctly. You could invoke this in PowerShell using Set-DirectoryMaxTime(".\Directory") and it will operate recursively on each directory under that.
function Set-DirectoryMaxTime([System.IO.DirectoryInfo]$directory)
{
    # Grab a list of all the files in the directory
    $files = Get-ChildItem -File $directory

    # Get the current CreationTime of the directory we are looking at
    $maxdate = Get-Date $directory.CreationTime

    # Find the most recently edited file's LastWriteTime
    foreach ($file in $files)
    {
        if ($file.LastWriteTime -gt $maxdate) { $maxdate = $file.LastWriteTime }
    }

    # This needs to be in a try/catch block because there is a reasonable chance of it failing
    # if a folder is currently in use
    try
    {
        # Give the directory a LastWriteTime equal to the newest file's LastWriteTime
        $directory.LastWriteTime = $maxdate
    }
    catch
    {
        # One of the directories could not be updated
        Write-Host "Could not update directory: $directory"
    }

    # Get all the subdirectories of this directory
    $subdirectories = Get-ChildItem -Directory $directory

    # Jump into each of the subdirectories and do the same thing to each of their CreationTimes
    foreach ($subdirectory in $subdirectories)
    {
        Set-DirectoryMaxTime($subdirectory)
    }
}
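One usage note: because the parameter is typed as [System.IO.DirectoryInfo], a relative path string is resolved against the process working directory rather than the current PowerShell location, so a full path is the safer way to call it. For example, against a whole share (the path below is just a placeholder for your file server's root):
# Paste or dot-source the function first, then point it at the root folder
Set-DirectoryMaxTime("D:\FileShare")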

How to get the IDE to update solution explorer window after adding project items in nuget ps?

During nuget install I give the user a command they can run. This command basically scans some files and creates some code templates and then inserts them into the current project. This works just fine - except for the fact that Solution Explorer does not update its tree view with the new files. I know this works because I can unload and reload the project file and the files are there.
In case it helps, here is the code I use to add the files to the project - the second function is what the user actually calls.
function add-to-project ($itemType, $project)
{
    process
    {
        $bogus = $project.Xml.AddItem($itemType, $_)
    }
}
# Parse a file
function Write-TTree-MetaData ($Path = $(throw "-Path must be supplied"))
{
    $p = Get-Project
    Write-Host "Inserting the results of the parsing into project" $p.Name
    $ms = Get-MSBuildProject
    $destDir = ([System.IO.FileInfo] $p.FullName).Directory

    # Run the parse now
    CmdTFileParser -d $destDir.FullName $Path

    # Now, attempt to insert them all into the project
    $allFiles = Get-ChildItem -Path $destDir.FullName
    $allFiles | ? {$_.Extension -eq ".ntup"} | add-to-project "TTreeGroupSpec" $ms
    $allFiles | ? {$_.Extension -eq ".ntupom"} | add-to-project "ROOTFileDataModel" $ms

    # Make sure everything is saved!
    $ms.Save()
    $p.Save()
}
This code causes a funny dialog to pop up - "The project has been modified on disk - please reload" - and hopefully the user will reload, and then the files show up correctly... But it would be nice to avoid that and just have the script cause whatever is needed to happen. Perhaps I have to figure out how to unload and re-load the project?
What do I need to do to force solution explorer to update? Many thanks!
By using the MSBuild project you are bypassing Visual Studio and directly updating the MSBuild project file on disk. The easiest way to get Visual Studio to update the Solution Explorer window is to use the Visual Studio project object instead, which you get from the Get-Project command.
Below is a simple example which adds a file to the project and changes its ItemType to be ROOTFileDataModel. The example assumes you have a packages.config file in your project's root directory which is not currently added to the project, so it is not showing in Solution Explorer initially.
# Get project's root directory.
$p = Get-Project
$projectDir = [System.IO.Path]::GetDirectoryName($p.FileName)
# Add packages.config file to project. Should appear in Solution Explorer
$newFile = [System.IO.Path]::Combine($projectDir, "packages.config")
$projectItem = $p.ProjectItems.AddFromFile($newFile)
# Change file's ItemType to ROOTFileDataModel
$itemType = $projectItem.Properties.Item("ItemType")
$itemType.Value = "ROOTFileDataModel"
# Save the project.
$p.Save()
The main Visual Studio objects being used here are the Project, ProjectItem and the ProjectItems objects. Hopefully the above code can be adapted to your specific requirements.
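Applied to the helper from the question, a rough sketch of the rewiring (untested against your package; AddFromFile and the ItemType property are the same EnvDTE members used above) might look like this: the DTE project from Get-Project is passed in instead of the MSBuild one, and each file is added through ProjectItems.AddFromFile before its ItemType is set.
function add-to-project ($itemType, $project)
{
    process
    {
        # Add the file through the DTE project so Solution Explorer picks it up
        $projectItem = $project.ProjectItems.AddFromFile($_.FullName)
        $projectItem.Properties.Item("ItemType").Value = $itemType
    }
}

# Inside Write-TTree-MetaData, pass the DTE project ($p) instead of the MSBuild one ($ms)
$p = Get-Project
$destDir = ([System.IO.FileInfo] $p.FullName).Directory
$allFiles = Get-ChildItem -Path $destDir.FullName
$allFiles | ? {$_.Extension -eq ".ntup"} | add-to-project "TTreeGroupSpec" $p
$allFiles | ? {$_.Extension -eq ".ntupom"} | add-to-project "ROOTFileDataModel" $p
$p.Save()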
