Access file extracted by previous step - octopus-deploy

I need to access a JSON file that was extracted in a previous step. I can't seem to find the correct variable for this.
For example, my step called 'Get Package' will download and extract the package to a temp folder within the work folder.
I then have another step that is trying to get the contents of one of the JSON files that was extracted.
I have tried the following variables but both say that the file cannot be found:
$json = Get-Content "$($OctopusParameters['Octopus.Action.Package[Get Package].ExtractedPath'])\config.json" -Raw | ConvertFrom-Json
$json = Get-Content "$($OctopusParameters['Octopus.Action.Package[Get Package].OriginalInstalledPath'])\config.json" -Raw | ConvertFrom-Json
I was expecting it to go to the folder created in the 'Get Package' step and find the file, but it looks in the temp folder of the step that is currently running, where the config file obviously doesn't exist.

The variable may be different if you are using a Custom Directory.
If not, this should work:
$OctopusParameters["Octopus.Action[YourPreviousStepName].Output.Package.InstallationDirectoryPath"]


How can I convert part of a filename to become the file extension?

I downloaded a backup folder of about 3,000 files from our email service provider. None of the files have an associated filetype; instead the file extension was appended to the name of each individual file. For example:
community-involvement-photo-1-jpg
social-responsibility-31-2012-png
report-02-12-15-pdf
I can manually change the last dash to a period and the files work just fine. I'm wondering if there is a way to batch convert all of the files so they can be sorted and organized properly. I know in the Command Line I can do something like ren *. *.jpg but there are several different file types contained in the folder, so it wouldn't work for all of them. Is there any way I can tell it to convert the last "-" in each file name into a "." ?
I'm on Windows 10; unable to install any filename conversion programs unless I want to go through weeks of trouble with the IT group.
$ordner = "c:\temp\pseudodaten"
$Liste = (get-childitem -path $Ordner).Name
cd $ordner
foreach ($Datei in $Liste) {
$Length = $datei.length
$NeuerName=$Datei.Substring(0,$Length-4)+"."+$datei.Substring($Length - 3, 3)
rename-item -Path $Datei -NewName $NeuerName
}
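If the extensions are not always exactly three characters, replacing the last "-" itself is more robust. A minimal sketch using a regex lookahead so that only the final dash matches:
# Rename every file so its last "-" becomes a "." (e.g. report-02-12-15-pdf -> report-02-12-15.pdf)
Get-ChildItem -Path "c:\temp\pseudodaten" -File |
    Rename-Item -NewName { $_.Name -replace '-(?=[^-]+$)', '.' }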

Adding part of source path to destination path - and more

I feel like this one should be easy, but it's giving me a bit of trouble. The goal is to get downloaded wsus updates migrated to a separate drive that can be burned to a disk and moved to a secure environment.
The current process was set up by another guy and I am taking over. He has 2 batch files that run; we will call them first.bat and second.bat.
first.bat is run and spits out a log of how many new files there are. We'll call it new.txt; it simply contains the hash file paths for the changes, e.g. C:\folder\sub\1A\HASHFILE.cab
Then we copy the file paths in new.txt by hand and manually paste them into second.bat, adding an xcopy command and a destination folder on a new drive, e.g. xcopy C:\folder\sub\1A\HASHFILE.cab E:\export\1A. However, we have to manually add in the hash folder identifier (1A, for example). I would like the script to pick it up from the source folder path and add it to the destination to eliminate the potential for human error.
What I am trying to write a shell script to accomplish is this.
run first.bat
export the info from new.txt and modify it to
add an xcopy command for each new file in new.txt, as well as the destination folder path
automate the addition of the hash folder callout (i.e. append 1A to the destination folder path: E:\export\1A)
run second.bat
Anyone have any ideas? I would like to wrap all of this up (or a similar function if that's easier, I imagine it might be) into a handy script that will automate this tedious process.
I have tinkered with a few ideas; I can get it to spit out the destination path, but it doesn't actually move anything. Here is what I have so far, which successfully tags the hash marker onto the end of the destination folder but does nothing else:
$source = "C:\temp\test folder\"
$dest = "C:\Temp2\export\"
$folders = Get-ChildItem $source | Where-Object {$_.PSIsContainer -eq $true} | Sort-Object
foreach ($folder in $folders){
$fname = $folder.FullName
$sourcefolder = $fname.Replace($source,"")
$newdest = $dest + $sourcefolder
Write-Output $newdest
}
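To go from printing the paths to actually copying the files, a Copy-Item inside the same loop is probably the simplest extension. A sketch under the same assumptions (each hash folder under $source should be mirrored into $dest):
$source = "C:\temp\test folder\"
$dest = "C:\Temp2\export\"
foreach ($folder in Get-ChildItem $source -Directory) {
    # Mirror the hash folder name (e.g. 1A) onto the destination path
    $newdest = Join-Path $dest $folder.Name
    # Create the destination folder if needed, then copy the contents across
    New-Item -ItemType Directory -Path $newdest -Force | Out-Null
    Copy-Item -Path (Join-Path $folder.FullName '*') -Destination $newdest -Recurse -Force
}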

Show all files in folder and subfolders and list names and encoding

I want to see all the files in a folder and its subfolders and list their encodings.
I know that you can use git ls-files to see the files and file * to get the name plus its encoding.
But I need help with how to do both at the same time.
The reason is that we have a problem with encoding and need to see which files are encoded in what way. So I guess a PS script would work fine as well.
I think the best way to solve this in PowerShell is to first get your files with the following script:
$folder = Get-ChildItem -Path "YourPath"
and in a foreach ($file in $folder) use one of the following scripts to get the encoding (which is straightforward)
https://www.powershellgallery.com/packages/PSTemplatizer/1.0.20/Content/Functions%5CGet-FileEncoding.ps1
https://vertigion.com/2015/02/04/powershell-get-fileencoding/
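For a quick look without pulling in one of those functions, a minimal BOM-sniffing sketch may be enough. Note it only recognizes files with a byte-order mark; everything else is reported as unknown, which is exactly the case the linked Get-FileEncoding functions handle with extra heuristics:
Get-ChildItem -Path "YourPath" -Recurse -File | ForEach-Object {
    # Read the first three bytes to look for a byte-order mark
    $stream = [System.IO.File]::OpenRead($_.FullName)
    $bom = New-Object byte[] 3
    $null = $stream.Read($bom, 0, 3)
    $stream.Close()
    $encoding = 'no BOM (ASCII/UTF-8/ANSI)'
    if ($bom[0] -eq 0xEF -and $bom[1] -eq 0xBB -and $bom[2] -eq 0xBF) { $encoding = 'UTF-8 with BOM' }
    elseif ($bom[0] -eq 0xFF -and $bom[1] -eq 0xFE) { $encoding = 'UTF-16 LE' }
    elseif ($bom[0] -eq 0xFE -and $bom[1] -eq 0xFF) { $encoding = 'UTF-16 BE' }
    [PSCustomObject]@{ File = $_.FullName; Encoding = $encoding }
}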

How to show file and folder names along with terminal icons without showing current directory?

I want to remove the space/area taken up by showing the
Directory: C:\Users\varun\Desktop\Projects\advanced-react-patterns-v2
when I run the command:
Get-ChildItem | Format-Wide
Additional details:
Using Windows Terminal & Powershell
Font used in the screenshot: TerminessTTF NF
Used Terminal-Icons
Note: The command
Get-ChildItem -Name
failed to show the terminal icons, which is kind of my main goal here.
When you are using a Format-* command, you are using the default formatting output for the File and Directory objects, which groups files by directory; hence the directory name at the top.
If you want to bypass this, you have to write your own format.ps1xml file and then add the formatting to your output.
$files = Get-ChildItem
foreach ($file in $files) {
    # Prepend a custom type name so PowerShell selects our view instead of the default
    $file.PSObject.TypeNames.Insert(0,'Custom.Output.Type')
    $file
}
Small sample of XML for the specified type name; customise as you wish.
<View>
    <Name>CustomFileFormatting</Name>
    <ViewSelectedBy>
        <TypeName>Custom.Output.Type</TypeName>
    </ViewSelectedBy>
    <TableControl>
        <AutoSize />
        <TableHeaders>
            <TableColumnHeader>
                <Label>FullName</Label>
                <Alignment>Left</Alignment>
            </TableColumnHeader>
        </TableHeaders>
        <TableRowEntries>
            <TableRowEntry>
                <TableColumnItems>
                    <TableColumnItem>
                        <PropertyName>FullName</PropertyName>
                    </TableColumnItem>
                </TableColumnItems>
            </TableRowEntry>
        </TableRowEntries>
    </TableControl>
</View>
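To wire this up, the <View> element needs to sit inside the usual <Configuration><ViewDefinitions> wrapper of a format.ps1xml file, which you then load into the session. A sketch, assuming the XML above is saved as Custom.format.ps1xml in the current directory:
# Load the custom view definition (prepended so it takes precedence over the defaults)
Update-FormatData -PrependPath .\Custom.format.ps1xml
# Tag the output objects so the custom view is applied
Get-ChildItem | ForEach-Object {
    $_.PSObject.TypeNames.Insert(0, 'Custom.Output.Type')
    $_
}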

Request assistance writing a PS script to search for a list of files (path + filename) against all Windows servers in my environment

What I'm trying to accomplish:
Create a PS script to run from a single Admin machine, but search against C$ on all Windows servers in AD.
Search for a specific list of paths\filenames that I provide.
If ANY of the specific list of paths\filenames are found on a server, THEN output the server name, and paths\filenames to a *.CSV file titled "Badfiles.csv" on the Admin machine.
Was trying to build from the following syntax but admittedly my old brain is not good at this stuff, and I've only specified a single file here - how do I refer to a list of multiple paths\files? Thank you for helping an old lady out. :)
$name= gc env:computername
$computers= get-content -path C:\Windows\Temp\v.bat
$csvfile = "c:\temp\$badfiles.csv"
foreach ($computer in $computers) {
"\$computer\C$\" | Get-ChildItem -recurse -filter "*.bat"
}
To refer to a list of items, whether those are files or computer names, you will need to use what is called an array.
You can create an array in many ways. In your case it might be best to create a list in a txt file; then in PowerShell you read the list contents using Get-Content, save the result in a variable, and it will automatically be stored as an array!
Then you iterate through each item using what is called a foreach loop, which basically lets you take each item in the array, do something with it, then move on to the next one, and so on until every item has been dealt with.
Now, the most important part of what you want to achieve is not clear. Let me explain.
To check if a file exists you can use Test-Path. It returns true or false, and you can then act upon that result. You need to define the exact path and name of a file for this check to work.
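For example, a single check looks like this (hypothetical server and file, just to show the shape):
# Returns $true if the file exists on the remote admin share, $false otherwise
Test-Path -Path "\\server01\c$\temp\virus.bat"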
If you don't know the exact names and paths of the files to check, you can use Get-ChildItem, similar to what you did in the code you provided. The caveat is that you have to narrow the scope of the file search as much as you can. In your example you search for the .bat file extension on the whole machine, and that can cause issues: a typical C drive holds hundreds of thousands, if not millions, of files and folders, and parsing all of them can take a long time.
So this is an important distinction to understand, and what confuses me is that you say "Search for a specific list of paths\filenames that I provide", yet in the code you use Get-ChildItem to get all files instead of providing a list of filenames.
From here on I will assume you have a list of filenames with exact, known paths.
Now, in your given code, I can see you have found some of the right commands, but they need to be arranged differently to produce the results you need.
Please review this example code that might help you further:
Example ComputerList.txt file content (list of computer hostnames to check):
computer1
serverXYZ
laptop123
Example FileList.txt file content (list of files to check for on each of the above computers):
c:\temp\virus.bat
c:\games\game.exe
c:\Pictures\personal.jpg
Now the PowerShell code:
# Gets the list of items from TXT files and saves them as arrays in variables
$ComputerNames = Get-Content 'c:\temp\ComputerList.txt'
$FileList = Get-Content 'c:\temp\FileList.txt'
# Define the path and name of CSV report
$BadFiles = "c:\temp\badfiles.csv"
# Define the foreach loop that will iterate through each hostname in computer list
foreach($computer in $ComputerNames){
# Define foreach loop that will iterate through each file in the list and test their path in the current computer in the current iteration
foreach($file in $FileList){
# Test the path of the current file in the loop and append the CSV file if it was found
# Convert the file path to C$ share path
$file = $file -replace("c:","c$")
# Define path of file to test
$FileToTest = "\\$computer\$file"
if (test-path $FileToTest -ErrorAction SilentlyContinue){
# This block will run only when a bad file has been found
# This part can be tricky but it is used to make sure we properly format the current bad file entry when we append it to the resulting CSV file
$BadFile = "" | select Computer,File
# Save information about current computer
$BadFile.computer = $computer
# Save information about current file
$BadFile.file = $file
# Append the entry to an array of found bad files
$BadFileList += $badfile
}
}
}
# When done iterating through every computer and file, save the results in a CSV file
$BadFileList | ConvertTo-Csv -NoTypeInformation | Out-File $BadFiles
The above is a full code snippet you can test and run in your environment. First please create the two TXT files and make sure you run PowerShell with the appropriate permissions to access the C$ network shares of the servers.
The snippet should work but I have not tested it myself. Let me know if there are any errors.
Please test and feel free to ask if you have any follow up questions.
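Also, since the goal is all Windows servers in AD rather than a hand-maintained list, the ActiveDirectory module (part of RSAT) could build $ComputerNames for you instead of ComputerList.txt. A sketch, assuming the module is available on the admin machine:
# Pull server names straight from AD instead of reading ComputerList.txt
Import-Module ActiveDirectory
$ComputerNames = (Get-ADComputer -Filter 'OperatingSystem -like "*Server*"' -Properties OperatingSystem).Name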
