I have a table with filenames in it, and also a userID for the person whose file it is. What I need to do is rename each file so that the userID is in the name of the file. The file right now might be 01234_main1.3gp and the user might be 987654, so I would want it to become 987654_main1.3gp.
What I have below renames the files to System.Data.DataRow+_main1.3gp. I know the variable works; immediately before the rename command I do a Write-Output and the correct number sequence is there.
$items = Get-ChildItem -Name -Path 'c:\main1'
foreach ($item in $items) {
    $Query = @"
SELECT [su_ID] as $SU_ID
FROM [database].[dbo].[table]
WHERE (assetLog_m1audio = '$item')
"@
    $su_ID = Invoke-Sqlcmd -ServerInstance surveyname -Database database -Query $Query
    Write-Output $su_ID
    Rename-Item C:\main1\$item $su_ID+'_main1.3gp'
}
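Given that symptom, the likely cause is that Invoke-Sqlcmd returns a System.Data.DataRow object, and concatenating that object into the new name stringifies its type name instead of the su_ID value. A minimal sketch of a repaired loop (an assumption on my part, keeping the server, database and column names from the question; the as $SU_ID alias is dropped because it expands to nothing inside the double-quoted here-string):
$items = Get-ChildItem -Name -Path 'c:\main1'
foreach ($item in $items) {
    $Query = @"
SELECT [su_ID]
FROM [database].[dbo].[table]
WHERE (assetLog_m1audio = '$item')
"@
    # Invoke-Sqlcmd returns a DataRow; extract the su_ID column value from it
    $row = Invoke-Sqlcmd -ServerInstance surveyname -Database database -Query $Query
    $su_ID = $row.su_ID
    # quote the new name so the value and the suffix are joined before binding to -NewName
    Rename-Item -Path "C:\main1\$item" -NewName "${su_ID}_main1.3gp"
}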
I need everyone's help getting back into PowerShell. I currently have a directory tree with a lot of folders; you can see the structure in the screenshot below.
[screenshot of the directory tree]
I want to share the folders "C" and "F" across the whole directory tree at once, with multiple users having view and edit permissions. I hope someone can help; I'm quite lost on this.
Hi khuchatvui and welcome to Stack Overflow!
New-SmbShare can be used for creating shared folders.
If I understand correctly, you only want to share folders with a specific name that exist at multiple levels. SMB share names have to be unique, so that will be a challenge if you want a specific share name.
You could partly automate this process by prompting for each share name during creation:
Solution 1 - prompt for name
$FoldersToShare = Get-ChildItem -Path C:\Tests\ -Recurse -Directory |
    Where-Object { $_.Name -eq 'F' -or $_.Name -eq 'C' } |
    Select-Object -ExpandProperty FullName
foreach ($folder in $FoldersToShare) {
    New-SmbShare -Name (Read-Host -Prompt "Enter the sharename for $($folder)") -Path $folder -ChangeAccess "domain\groupname"
}
If there is no pattern in the folders you want to share, but the names are unique, you could make a list of all the folders you want to share like this:
Solution 2 - create unique folder names
Get-ChildItem -Path c:\tests -Directory -Recurse | Select-Object Name, FullName | Export-Csv -NoTypeInformation -NoClobber -Delimiter ';' -Path C:\Tests\Stackoverflow\FoldersToShare.csv
Then, modify that list using a text editor or Excel so it only contains the folders you want to share.
Finally, use PowerShell to import the contents of the modified CSV file and loop through the entries with New-SmbShare to create the shared folders:
$FoldersToShare = Import-Csv -Path C:\Tests\Stackoverflow\FoldersToShare.csv -Delimiter ';'
foreach ($folder in $FoldersToShare) {
    New-SmbShare -Name $folder.Name -Path $folder.FullName -ChangeAccess "domain\groupname"
}
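The original question also asked for multiple users with view and edit permissions. As a hedged sketch on top of the loop above (the account names domain\Viewers and domain\Editors are placeholders, not from the question), you could grant read access to one group and change access to another with Grant-SmbShareAccess after each share is created; alternatively, New-SmbShare itself accepts -ReadAccess and -ChangeAccess lists:
foreach ($folder in $FoldersToShare) {
    $share = New-SmbShare -Name $folder.Name -Path $folder.FullName
    # viewers get read-only access, editors get change (read/write) access
    Grant-SmbShareAccess -Name $share.Name -AccountName 'domain\Viewers' -AccessRight Read -Force
    Grant-SmbShareAccess -Name $share.Name -AccountName 'domain\Editors' -AccessRight Change -Force
}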
For my solution, I created the folder structure from your image under C:\Tests
C:\Tests\
    A
        A1
            C
            F
        A2
            C
    B
        B1
            C
        B2
            C
We're migrating our FTP and I would like to migrate only the folders that have had files in them used or written in the last 6 months. I would think this would be something I could find all over the place with Google, but all the scripts I've found have the same fatal flaw.
It seems everything I find depends on the "Date modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date modified" of years ago, yet when you dig into them, there are files being created and written as recently as today.
Example:
D:\Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, say "Log6", and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with a LastWriteTime -lt (Get-Date).AddDays(-180) filter. If anything "new" is found, don't add the overarching directory to the array; if not, list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = {$_.LastWriteTime -lt (Get-Date).AddDays(-180)}
#$OldStuff = gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
    gci "D:\FTP\BELAMIINC" -File | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments so you can follow the thought process.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)
# Get all the child folders recursive
$folders = Get-ChildItem $path -Recurse -Directory -Force
# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName, the File and the File's dates for reference
            [pscustomobject]@{
                FullName       = $folder.FullName
                File           = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime  = $file.LastWriteTime
            }
            break inner
        }
    }
}
$result | Out-GridView
If you need to find the folders that don't have recently modified files, you can compare the $folders array against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
I need to export quite a big CSV file from Oracle once a week.
I tried two approaches:
1. Adapter.Fill(dataset)
2. Looping through columns and rows to save into a CSV file one line at a time.
The first one runs out of memory (the server machine has only 4 GB of RAM); the second one takes about an hour, as there are over 4 million rows to export.
Here's code #1:
#Your query. It cannot contain any double quotes otherwise it will break.
$query = "SELECT manycolumns FROM somequery"
#Oracle login credentials and other variables
$username = "username"
$password = "password"
$datasource = "database address"
$output = "\\NetworkLocation\Sales.csv"
#creates a blank CSV file and makes sure it's in ASCII
Out-File $output -Force ascii
#This will look for the "Oracle.ManagedDataAccess.dll" file inside the "C:\Oracle" folder. We usually have two versions of Oracle installed, so the adapter can be in different locations. Needs changing if Oracle is installed elsewhere.
$location = Get-ChildItem -Path C:\Oracle -Filter Oracle.ManagedDataAccess.dll -Recurse -ErrorAction SilentlyContinue -Force
#Establishes connection to Oracle using the DLL file
Add-Type -Path $location.FullName
$connectionString = 'User Id=' + $username + ';Password=' + $password + ';Data Source=' + $datasource
$connection = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($connectionString)
$connection.open()
$command=$connection.CreateCommand()
$command.CommandText=$query
#Creates a table in memory and fills it with results from the query. Then, export the virtual table into CSV.
$DataSet = New-Object System.Data.DataSet
$Adapter = New-Object Oracle.ManagedDataAccess.Client.OracleDataAdapter($command)
$Adapter.Fill($DataSet)
$DataSet.Tables[0] | Export-Csv $output -NoTypeInformation
$connection.Close()
And here's code #2:
#Your query. It cannot contain any double quotes otherwise it will break.
$query = "SELECT manycolumns FROM somequery"
#Oracle login credentials and other variables
$username = "username"
$password = "password"
$datasource = "database address"
$output = "\\NetworkLocation\Sales.csv"
$tempfile = $env:TEMP + "\Temp.csv"
#creates a blank CSV file and makes sure it's in ASCII
Out-File $tempfile -Force ascii
#This will look for the "Oracle.ManagedDataAccess.dll" file inside the "C:\Oracle" folder. Needs changing if Oracle is installed elsewhere.
$location = Get-ChildItem -Path C:\Oracle -Filter Oracle.ManagedDataAccess.dll -Recurse -ErrorAction SilentlyContinue -Force
#Establishes connection to Oracle using the DLL file
Add-Type -Path $location.FullName
$connectionString = 'User Id=' + $username + ';Password=' + $password + ';Data Source=' + $datasource
$connection = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($connectionString)
$connection.open()
$command=$connection.CreateCommand()
$command.CommandText=$query
#Reads results column by column. This way you don't have to specify how many columns it has.
$reader = $command.ExecuteReader()
while($reader.Read()) {
    $props = @{}
    for($i = 0; $i -lt $reader.FieldCount; $i+=1) {
        $name = $reader.GetName($i)
        $value = $reader.Item($i)
        $props.Add($name, $value)
    }
    #Exports each line to the CSV file. Works best when the file is on a local drive as it saves after each line.
    New-Object PSObject -Property $props | Export-Csv $tempfile -NoTypeInformation -Append
}
Move-Item $tempfile $output -Force
$connection.Close()
Ideally, I would like to use the first code, as it is much faster than the second one, but somehow avoid running out of memory.
Do you guys and gals know if there's some way to "fill" the first 1 million records, append them to the CSV, clear the DataSet, fill the next 1 million, etc.? After the code finishes running, the CSV weighs ~1.3 GB, but while it runs even 8 GB of memory is not enough (my laptop has 8 GB, but the server has only 4 GB, and it really hits it hard).
Any tips will be appreciated.
In the *nix community we love one-liners!
You can set markup to 'csv on' in sqlplus (>= 12)
Create the query file
cat > query.sql <<EOF
set head off
set feed off
set timing off
set trimspool on
set term off
spool output.csv
select
object_id,
owner,
object_name,
object_type,
status,
created,
last_ddl_time
from dba_objects;
spool off
exit;
EOF
Spool the output.csv like this:
sqlplus -s -m "CSV ON DELIM ',' QUOTE ON" user/password@\"localhost:1521/<my_service>\" @query.sql
Another option is SQLcl (the SQL Developer CLI tool; binary name 'sql', renamed by me to 'sqlcl').
Create the query file (Note! term on|off)
cat > query.sql <<EOF
set head off
set feed off
set timing off
set term off
set trimspool on
set sqlformat csv
spool output.csv
select
object_id,
owner,
object_name,
object_type,
status,
created,
last_ddl_time
from dba_objects
where rownum < 5;
spool off
exit;
EOF
Spool the output.csv like this:
sqlcl -s system/oracle@\"localhost:1521/XEPDB1\" @query.sql
Voilà!
cat output.csv
9,"SYS","I_FILE#_BLOCK#","INDEX","VALID",18.10.2018 07:49:04,18.10.2018 07:49:04
38,"SYS","I_OBJ3","INDEX","VALID",18.10.2018 07:49:04,18.10.2018 07:49:04
45,"SYS","I_TS1","INDEX","VALID",18.10.2018 07:49:04,18.10.2018 07:49:04
51,"SYS","I_CON1","INDEX","VALID",18.10.2018 07:49:04,18.10.2018 07:49:04
And the winner is sqlplus, for 77k rows! (with the rownum < 5 filter removed)
time sqlcl -s system/oracle@\"localhost:1521/XEPDB1\" @query.sql
real 0m23.776s
user 0m39.542s
sys 0m1.293s
time sqlplus -s -m "CSV ON DELIM ',' QUOTE ON" system/oracle@localhost/XEPDB1 @query.sql
real 0m3.066s
user 0m0.700s
sys 0m0.265s
wc -l output.csv
77480 output.csv
You can experiment with formats in SQL Developer.
select /*CSV|HTML|JSON|TEXT|<TONSOFOTHERFORMATS>*/ * from dba_objects;
If you are loading CSV into the database, this tool will do it!
https://github.com/csv2db/csv2db
Best of luck!
Thank you all for the responses; I learned about Oracle scripts and SQL*Plus, which I never knew existed. I will probably use them in the future, but I guess I would have to update my Oracle Developer package first.
I have found a way to edit my code to work using the documentation here:
https://docs.oracle.com/database/121/ODPNT/OracleDataAdapterClass.htm#i1002865
It's not perfect, as it pauses every 1 million rows, saves the output, and re-runs the query, which gets re-evaluated (the one I'm running takes about 1-2 minutes to evaluate).
It's basically the same as running one piece of code x times (where x is the ceiling of the number of rows in millions), doing "fetch first 1'000'000 rows only", then "offset 1'000'000 rows fetch next 1'000'000 rows only", etc., and appending each batch to the bottom of the CSV.
Here's the code:
#Your query. It cannot contain any double quotes otherwise it will break.
$query = "SELECT
A lot of columns
FROM
a lot of tables joined together
WHERE
a lot of conditions
"
#Oracle login credentials and other variables
$username = 'myusername'
$password = 'mypassword'
$datasource = 'TNSnameofmyDatasource'
$output = "$env:USERPROFILE\desktop\Sales.csv"
#creates a blank CSV file and makes sure it's in ASCII, as that's what the output of my query is
Out-File $output -Force ascii
#This will look for the "Oracle.ManagedDataAccess.dll" file inside the "C:\Oracle" folder. Needs changing if Oracle is installed elsewhere.
$location = Get-ChildItem -Path C:\Oracle -Filter Oracle.ManagedDataAccess.dll -Recurse -ErrorAction SilentlyContinue -Force
#Establishes connection to Oracle using the DLL file
Add-Type -Path $location.FullName
$connectionString = 'User Id=' + $username + ';Password=' + $password + ';Data Source=' + $datasource
$connection = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($connectionString)
$connection.open()
$command=$connection.CreateCommand()
$command.CommandText=$query
#Creates a table in memory to be filled up with results from the query using ODAC
$DataSet = New-Object System.Data.DataSet
$Adapter = New-Object Oracle.ManagedDataAccess.Client.OracleDataAdapter($command)
#Declaring variables for the loop
$fromrecord = 0
$numberofrecords = 1000000
$timesrun = 0
#Loop as long as the number of rows in the virtual table equals the specified $numberofrecords
while(($timesrun -eq 0) -or ($DataSet.Tables[0].Rows.Count -eq $numberofrecords))
{
    $DataSet.Clear()
    $Adapter.Fill($DataSet, $fromrecord, $numberofrecords, '*') | Out-Null #Suppresses writing the number of rows filled to the console
    Write-Progress "Saved: $fromrecord Rows"
    $DataSet.Tables[0] | Export-Csv $output -Append -NoTypeInformation
    $fromrecord = $fromrecord + $numberofrecords
    $timesrun++
}
$connection.Close()
I currently have a CSV which contains one column that lists many file FullNames (i.e. "\\server\sub\folder\file.ext").
I am attempting to import this CSV, move each file to a separate location, and append a GUID to the beginning of the filename in the new location (i.e. GUID_File.ext). I've been able to move the files and generate the GUID_, but I haven't been able to store and reuse the existing filename.ext; it gets cut off and the file ends up being just a GUID_. I'm just not sure how to store the existing filename for reuse.
$Doc = Import-CSV C:\Temp\scripttest.csv
ForEach ($line in $Doc)
{
    $FileBase = $Line.basename
    $FileExt = $Line.extension
    Copy-Item -Path $line.File -Destination "\\Server\Folder\$((new-guid).guid.replace('-',''))_$($FileBase)$($FileExt)"
}
If possible, I also need to store all the new GUID_File.ext names back into a CSV and write any errors to another file.
I currently have a CSV which contains 1 column that lists many file FullNames. (i.e. "\\server\sub\folder\file.ext").
This isn't a CSV. It's just a plaintext file with a list.
Here's how you can accomplish your goal, however:
foreach ($path in (Get-Content -Path C:\Temp\scripttest.csv))
{
    $file = [System.IO.FileInfo]$path
    $prefix = (New-Guid).Guid -replace '-'
    # use $file.Name so only the filename, not the full original path, ends up in the destination
    Copy-Item -Path $file.FullName -Destination "\\Server\Folder\${prefix}_$($file.Name)"
}
This will take your list, convert the item into a FileInfo type it can work with, and do the rest of your logic.
Based on:
$FileBase = $line.basename
$FileExt = $line.extension
it sounds like you mistakenly think that the $line instances representing the objects returned from Import-Csv C:\Temp\scripttest.csv are [System.IO.FileInfo] instances, but they're not:
What Import-Csv outputs are [pscustomobject] instances whose properties reflect the column values of the input CSV, and the values of these properties are invariably strings.
You must therefore use $line.<column1Name> to refer to the column containing the full filenames, where <column1Name> is the name defined for the column of interest in the header line (the 1st line) of the input CSV file.
If the CSV file has no header line, you can specify the column names by passing an array of column names to Import-Csv's -Header parameter, e.g.,
Import-Csv -Header Path, OtherCol1, OtherCol2, ... C:\Temp\scripttest.csv
I'll assume that the column of interest is named Path in the following solution:
$Doc = Import-Csv C:\Temp\scripttest.csv
ForEach ($rowObject in $Doc)
{
    $fileName = Split-Path -Leaf $rowObject.Path
    Copy-Item -Path $rowObject.Path `
              -Destination "\\Server\Folder\$((new-guid).guid.replace('-',''))_$fileName"
}
Note how Split-Path -Leaf is used to extract the filename, including extension, from the full input path.
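For example:
Split-Path -Leaf '\\server\sub\folder\file.ext'   # returns: file.ext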
If I read your question carefully, you want to:
- copy the files listed in the 'File' column of the CSV file
- prepend a GUID to each new filename
- create a new CSV file where the new filenames are stored for later reference
- track any errors and write those to a (log) file
Assuming you have an input CSV file looking something like this:
File,Author,MoreStuff
\\server\sub\folder\file.ext,Someone,Blah
\\server\sub\folder\file2.ext,Someone Else,Blah2
\\server\sub\folder\file3.ext,Same Someone,Blah3
Then the script below hopefully does what you want.
It creates new filenames by prepending a GUID, and copies the files listed in the File column of the CSV to a destination path.
It outputs a new CSV file in the destination folder like this:
OriginalFile,NewFile
\\server\sub\folder\file.ext,\\anotherserver\sub\folder\38f7bec9e4c0443081b385277a9d253d_file.ext
\\server\sub\folder\file2.ext,\\anotherserver\sub\folder\d19546f7a3284ccb995e5ea27db2c034_file2.ext
\\server\sub\folder\file3.ext,\\anotherserver\sub\folder\edd6d35006ac46e294aaa25526ec5033_file3.ext
Any errors are listed in a log file (also in the destination folder).
$Destination = '\\Server\Folder'
$ResultsFile = Join-Path $Destination 'Copy_Results.csv'
$Logfile = Join-Path $Destination 'Copy_Errors.log'
$Doc = Import-CSV C:\Temp\scripttest.csv
# create an array to store the copy results in
$result = @()
# loop through the csv data using only the column called 'File'
ForEach ($fileName in $Doc.File) {
    # check if the given file exists; if not, write to the errors log file
    if (Test-Path -Path $fileName -PathType Leaf) {
        $oldBaseName = Split-Path -Path $fileName -Leaf
        # or do $oldBaseName = [System.IO.Path]::GetFileName($fileName)
        $newBaseName = "{0}_{1}" -f (New-Guid).ToString("N"), $oldBaseName
        # (New-Guid).ToString("N") returns the Guid without hyphens, same as (New-Guid).Guid.Replace('-','')
        $destinationFile = Join-Path $Destination $newBaseName
        try {
            Copy-Item -Path $fileName -Destination $destinationFile -Force -ErrorAction Stop
            # add an object to the results array to store the original filename and the full filename of the copy
            $result += New-Object -TypeName PSObject -Property @{
                'OriginalFile' = $fileName
                'NewFile'      = $destinationFile
            }
        }
        catch {
            Write-Error "Could not copy file to '$destinationFile'"
            # write the error to the log file
            Add-Content $Logfile -Value "$((Get-Date).ToString("yyyy-MM-dd HH:mm:ss")) - ERROR: Could not copy file to '$destinationFile'"
        }
    }
    else {
        Write-Warning "File '$fileName' does not exist"
        # write the error to the log file
        Add-Content $Logfile -Value "$((Get-Date).ToString("yyyy-MM-dd HH:mm:ss")) - WARNING: File '$fileName' does not exist"
    }
}
# finally create a CSV with the results of this copy.
# the CSV will have two headers 'OriginalFile' and 'NewFile'
$result | Export-Csv -Path $ResultsFile -NoTypeInformation -Force
Thank you to everyone for the solutions. All of them worked, and worked well. I chose Theo's as the answer because his solution also solved the error logging and stored all the newly renamed GUID_File.ext files alongside the existing CSV info.
Thank you all.
I am trying to loop through all files in a folder, no matter the type, and replace a string with one that is input by the user.
I can do this now with the code below, but only with one type of file extension.
This is my code:
$NewString = Read-Host -Prompt 'Input New Name Please'
$scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
$InputFiles = Get-Item "$scriptPath\*.md"
$OldString = 'SolutionName'
$InputFiles | ForEach {
    (Get-Content -Path $_.FullName).Replace($OldString,$NewString) | Set-Content -Path $_.FullName
}
echo 'Complete'
How do I loop through the files, no matter the extension?
So whether it is .md, .txt, .cshtml or something else, it will replace the string as instructed.
To get all the files in a folder you can use Get-ChildItem. Add the -Recurse switch to also include files inside sub-folders.
E.g. you could rewrite your script like this:
$path = 'c:\tmp\test'
$NewString = Read-Host -Prompt 'Input New Name Please'
$OldString = 'SolutionName'
Get-ChildItem -Path $path |
    Where-Object { !$_.PSIsContainer } |
    ForEach-Object { (Get-Content $_.FullName).Replace($OldString, $NewString) | Set-Content -Path $_.FullName }
This will first get all the files inside the folder defined in $path, then replace the value given in $OldString with what the user entered when prompted, and finally save the files.
Note: the script doesn't check whether the content of a file actually changed, so every file's modified date gets updated. If this information is important to you, you need to add a check to see whether the file contains $OldString before changing and saving it.
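A minimal sketch of that check (assuming PowerShell 3.0+ for Get-Content -Raw and PowerShell 5.0+ for Set-Content -NoNewline), reusing the variables from above:
Get-ChildItem -Path $path | Where-Object { !$_.PSIsContainer } | ForEach-Object {
    # read the whole file as one string
    $content = Get-Content -Path $_.FullName -Raw
    # only rewrite the file when it actually contains the old string,
    # leaving the modified date of untouched files intact
    if ($content -and $content.Contains($OldString)) {
        $content.Replace($OldString, $NewString) | Set-Content -Path $_.FullName -NoNewline
    }
}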