Powershell not setting exact modified time? - windows

So I don't really mind and it's not super important, but I wrote a PowerShell script to set the modified time on some codec-converted family films. The original files' extension has been renamed to MP44 and the resulting files' extension is MP4. I just want to understand why the times are not exactly the same.
Directory of C:\Users\zzz\Desktop\family videos and images\hero 2021
2022-03-18 05:24 PM <DIR> .
2022-03-16 07:18 AM <DIR> ..
2020-09-02 07:15 PM 79,353,358 ffmpeg.exe
2020-09-02 07:15 PM 79,214,606 ffplay.exe
2020-09-02 07:15 PM 79,249,422 ffprobe.exe
2022-03-16 09:45 PM 482 go.bat
2021-12-24 09:14 PM 4,000,895,516 GX010066.MP44
2021-12-24 09:14 PM 741,345,212 GX010066.MP44.ffmpeg.mp4
2021-12-24 10:25 PM 4,003,210,355 GX010067.MP44
2021-12-24 10:25 PM 687,471,776 GX010067.MP44.ffmpeg.mp4
2021-12-24 09:15 PM 4,001,034,065 GX020066.MP44
2021-12-24 09:15 PM 719,404,024 GX020066.MP44.ffmpeg.mp4
2021-12-24 10:27 PM 3,629,297,689 GX020067.MP44
2021-12-24 10:27 PM 635,027,797 GX020067.MP44.ffmpeg.mp4
2021-12-24 09:17 PM 4,000,513,626 GX030066.MP44
2021-12-24 09:17 PM 690,608,291 GX030066.MP44.ffmpeg.mp4
2021-12-24 09:17 PM 714,946,960 GX040066.MP44
2021-12-24 09:17 PM 125,647,486 GX040066.MP44.ffmpeg.mp4
2022-03-18 05:36 PM 537 timechanger.ps1
17 File(s) 24,187,221,202 bytes
2 Dir(s)
Directory shows same times/dates.
I guess it's more of a windows question than a powershell question. Why would explorer show different modified dates?
Here is my powershell for posterity:
$files = Get-ChildItem *.mp44 | Select-Object Name, LastWriteTime
foreach ($file in $files) {
    $currentname = $file.Name
    #$currentname
    $currentsubstring = $currentname.Substring(0,8)
    #$currentsubstring
    $WriteTime = $file.LastWriteTime
    #$WriteTime
    Write-Host "Going to get-item $currentsubstring.MP44.ffmpeg.mp4 and set $WriteTime"
    $targetfile = "$currentsubstring.MP44.ffmpeg.mp4"
    $Targettime = $WriteTime
    $bacon = Get-Item $targetfile
    $bacon.LastWriteTime = $Targettime
}
Thanks in advance for your time and assistance.

Because the 'Date' column in Windows Explorer that you have showing there is the Creation Date property, not the Last Modified Date.
Notice how the "Date" column matches the "Date created" column, not the "Date modified" column in my screenshot.
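If you want the converted file to match the original on both properties, you can copy CreationTime as well as LastWriteTime. A minimal sketch, assuming the same naming convention as the script above (the file names here are just for illustration):
$source = Get-Item 'GX010066.MP44'
$target = Get-Item 'GX010066.MP44.ffmpeg.mp4'
# Copy both timestamps so Explorer's "Date" (creation) and "Date modified" columns agree
$target.LastWriteTime = $source.LastWriteTime
$target.CreationTime  = $source.CreationTime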

Related

How to fetch log content between two dates in Windows PowerShell?

In our project, we are getting AppDynamics logs (application logs) and machine logs, and sometimes the size of the logs grows and eats up the disk space. What I am trying to do is get the content between two dates, like 10 Nov and 13 Nov, and delete the rest. Since we are working in a Windows environment, this needs to be done in PowerShell. It is easier to handle such things in Linux, but I am not good at PowerShell scripting. Below is the code snippet.
[AD Thread-Metric Reporter1] 10 Nov 2019 14:47:32,899 ERROR ManagedMonitorDelegate - Error sending metrics - will requeue for later transmission
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation in effect: /controller/instance/702/metrics
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[AD Thread-Metric Reporter1] 11 Nov 2019 14:46:32,899 ERROR ManagedMonitorDelegate - Error sending metrics - will requeue for later transmission
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation in effect: /controller/instance/702/metrics
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[extension-scheduler-pool-5] 13 Nov 2019 18:45:40,634 INFO ReportMetricsConfigSupplier - Basic metrics will be collected and reported through the SIM extension because SIM is enabled.
[extension-scheduler-pool-8] 14 Nov 2019 18:47:18,650 INFO ReportMetricsConfigSupplier - Basic metrics will be collected and reported through the SIM extension because SIM is enabled. ]
Code Snippet with file paths
# Get Start Time
$startDTM = (Get-Date)
$zstart = Read-Host -prompt '
Enter your start date in "10 Nov" format. Start date must be earlier than stop date.'
$zstop = Read-Host -prompt 'Enter your stop date in "13 Nov" format. Stop date must be later than start date.'
# $zstart = '10 Nov'
# $zstop = '13 Nov'
$zstart= Select-String $zstart "$env:userprofile\Desktop\machine-log.txt" | Select-Object -ExpandProperty LineNumber
$zstop= Select-String $zstop "$env:userprofile\Desktop\machine-log.txt" | Select-Object -ExpandProperty LineNumber
$AppLog = gc $env:userprofile\Desktop\machine-log.txt
$i = 0
$array = @()
foreach ($line in $AppLog){
foreach-object { $i++ }
if (($i -ge $zstart) -and ($i -le $zstop))
{$array += $line}}
$array | Out-File -encoding ascii -filepath $env:userprofile\Desktop\logfile-output.txt
The ERROR I get while executing the script:
8923 8924 8925 8926 8927 8928 8929 8930 8931 8932 8933 8934 8935 8936 8937 8938 8939 8940 8941 8942 8943
8944 8945 8946 8947 8948". Error: "Cannot convert the "System.Object[]" value of type "System.Object[]" to
type "System.Int32"."
At C:\Users\xa_abbasmn\Documents\Logs\test.ps1:13 char:5
+ if (($i -ge $zstart) -and ($i -le $zstop))
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : ComparisonFailure
Could not compare "18" to "1 15 16 17 31 45 46 47 48 62 76 77 91 105 106 107 121 135 136 137 151 152 196 210
211 225 239 240 241 255 269 270 271 285 299 300 314 315 329 330 331 332 346 390 404 447 448 449 463 477 478
492 506 507 508 522 536 537 538 552 566 567 581 582 583 627 641 642 643 657 671 685 686 687
The PowerShell and Windows versions:
Name: Windows PowerShell ISE Host - Version : 5.1.14409.1018
Name: Microsoft Windows Server 2012 R2 Standard 64bit
Your help will be highly appreciated.
Best regards,
Windows 10 64-bit. Powershell 5. Does not require admin privileges.
How to quickly and efficiently extract text from a large logfile from input1 to input2 using powershell 5?
Sample logfile below.
For testing purposes copy your logfile to the desktop and name it logfile.txt
What is the default text editor in Windows Server 2012 R2 Standard 64-bit? See line 42.
What program do you have associated with .txt files? See line 42.
# Get Start Time
$startDTM = (Get-Date)
$zstart = Read-Host -prompt '
Enter your start date in "10 Nov" format (w/o quotes). Start date must be earlier than stop date.'
$zstop = Read-Host -prompt 'Enter your stop date in "13 Nov" format (w/o quotes). Stop date must be later than start date.'
# $zstart = '10 Nov'
# $zstop = '13 Nov'
$zstart= Select-String $zstart "$env:userprofile\Desktop\logfile.txt" | Select-Object -ExpandProperty LineNumber
$zstop= Select-String $zstop "$env:userprofile\Desktop\logfile.txt" | Select-Object -ExpandProperty LineNumber
$AppLog = gc $env:userprofile\Desktop\logfile.txt
$i = 0
$array = @()
foreach ($line in $AppLog){
foreach-object { $i++ }
if (($i -ge $zstart) -and ($i -le $zstop))
{$array += $line}}
$array | Out-File -encoding ascii -filepath $env:userprofile\Desktop\logfile-edited.txt
# begin get file size
$y = (Get-ChildItem "$env:userprofile\Desktop\logfile.txt" | Measure-Object -property length -sum)
$y = [System.Math]::Round(($y.Sum /1KB),2)
$z = (Get-ChildItem "$env:userprofile\Desktop\logfile-edited.txt" | Measure-Object -property length -sum)
$z = [System.Math]::Round(($z.Sum /1KB),2)
# end get file size
# get Stop Time
$endDTM = (Get-Date)
Write-Host "
Extracted $z KB from $y KB. The extraction took $(($endDTM-$startDTM).totalseconds) seconds"
start-process -wait notepad $env:userprofile\Desktop\logfile-edited.txt
remove-item $env:userprofile\Desktop\logfile-edited.txt
exit
logfile.txt:
[AD Thread-Metric Reporter1] 09 Nov 2019 14:48:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[AD Thread-Metric Reporter1] 10 Nov 2019 14:47:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[AD Thread-Metric Reporter1] 11 Nov 2019 14:46:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[extension-scheduler-pool-5] 13 Nov 2019 18:45:40,634 INFO ReportMetricsConfigSupplier - Basic metrics will be collected
[extension-scheduler-pool-8] 14 Nov 2019 18:47:18,650 INFO ReportMetricsConfigSupplier - Basic metrics will be collected
Results of running script on logfile.txt:
[AD Thread-Metric Reporter1] 10 Nov 2019 14:47:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[AD Thread-Metric Reporter1] 11 Nov 2019 14:46:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[extension-scheduler-pool-5] 13 Nov 2019 18:45:40,634 INFO ReportMetricsConfigSupplier - Basic metrics will be collected
PowerShell in four hours at YouTube
Parse and extract text with PowerShell at Bing
How to quickly and efficiently extract text from a large logfile from line x to line y using powershell 5?
How to quickly and efficiently extract text from a large logfile from date1 to date2 using powershell 5?
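For what it's worth, the "Cannot convert the System.Object[] value ... to type System.Int32" error in the question happens because Select-String returns one LineNumber per match, so $zstart and $zstop become arrays whenever a date appears on several lines. A small sketch of one way to guard against that, reusing the logfile.txt name from the answer above (an assumption, not part of the original answer):
# Keep only the first match of the start date and the last match of the stop date,
# so the -ge/-le comparisons see single integers instead of arrays.
$zstart = Select-String $zstart "$env:userprofile\Desktop\logfile.txt" |
    Select-Object -First 1 -ExpandProperty LineNumber
$zstop = Select-String $zstop "$env:userprofile\Desktop\logfile.txt" |
    Select-Object -Last 1 -ExpandProperty LineNumber
# Slice the file directly by index (Get-Content output is 0-based, LineNumber is 1-based)
$lines = Get-Content "$env:userprofile\Desktop\logfile.txt"
$lines[($zstart - 1)..($zstop - 1)] |
    Out-File -Encoding ascii "$env:userprofile\Desktop\logfile-edited.txt"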

ORACLE 11G Data Downloading

I’m using Toad for Oracle to access an 11g environment and my dataset has 1,000,000 records. While downloading, I want to split the data into 10 files of 100,000 records each. I do not have the authority to create tables; I can only run queries and download. Is there a way I can split the files while downloading?
You can split a text file using PowerShell or a Linux shell.
How to Split Large Text File into Smaller Files in Linux
Example for Linux:
#split -l 100000 test.txt new
Example for Windows:
How can I split a text file using PowerShell?
split_log.ps1
param(
    [string]$input_file = "",
    [string]$count_line = ""
)
$lineCount = 0
$fileCount = 1
foreach ($line_file in Get-Content $input_file)
{
    Write-Output $line_file | Out-File -Encoding ASCII -Append $input_file"_"$fileCount".out"
    $lineCount++
    if ($lineCount -eq $count_line)
    {
        $fileCount++
        $lineCount = 0
    }
}
PS C:\АСУ\Stackoverflow\split_log> ls
Directory: C:\АСУ\Stackoverflow\split_log
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 17.12.2018 6:03 10959355 1124.sql
-a--- 17.12.2018 7:27 357 split_log.ps1
PS C:\АСУ\Stackoverflow\split_log> .\split_log.ps1 .\1124.sql 10000
PS C:\АСУ\Stackoverflow\split_log> ls
Directory: C:\АСУ\Stackoverflow\split_log
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 17.12.2018 6:03 10959355 1124.sql
-a--- 17.12.2018 7:50 2461667 1124.sql_1.out
-a--- 17.12.2018 7:50 2461458 1124.sql_2.out
-a--- 17.12.2018 7:50 2461340 1124.sql_3.out
-a--- 17.12.2018 7:50 2461352 1124.sql_4.out
-a--- 17.12.2018 7:50 1113540 1124.sql_5.out
-a--- 17.12.2018 7:27 357 split_log.ps1
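Appending one line at a time with Out-File gets slow on files in the multi-gigabyte range. A rough alternative sketch (not from the original answer; the parameter names are kept the same for comparison) reads the file in chunks instead:
param(
    [string]$input_file = "",
    [int]$count_line = 100000
)
$fileCount = 1
# -ReadCount makes Get-Content emit arrays of $count_line lines,
# so each chunk can be written to its own output file in one call.
Get-Content $input_file -ReadCount $count_line | ForEach-Object {
    $_ | Out-File -Encoding ASCII "${input_file}_$fileCount.out"
    $fileCount++
}
Invocation is the same as before, e.g. saved as a hypothetical split_chunks.ps1 and called with .\split_chunks.ps1 .\1124.sql 100000.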

Windows 7 and 8: dir > list.txt makes the file sizes unreadable

2016-10-18 09:38 PM 32ÿ032 error messages plus.docx
2002-10-18 02:19 PM 195 filestruc.bat
2002-10-18 02:20 PM 1ÿ588ÿ209 filestruc.txt
2004-03-07 03:26 AM 275ÿ792 FileStrucSize.txt
Result: the file sizes are unreadable. How do I solve this problem?
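One likely cause, and a workaround (not from the thread, so treat it as an assumption): dir writes its output in the console (OEM) code page, and in locales whose digit-grouping separator is a non-breaking space that byte shows up as ÿ when the redirected text file is opened as ANSI. Generating the listing from PowerShell with an explicit encoding sidesteps the mismatch; a sketch with an assumed output path:
# Write a directory listing with readable sizes and a known encoding
Get-ChildItem |
    Select-Object LastWriteTime, Length, Name |
    Format-Table -AutoSize |
    Out-File -Encoding utf8 "$env:userprofile\Desktop\list.txt"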

Perl's retrieval of file create time incorrect

I am attempting to use Perl to rename files based on the folder they are in and the time they were created. Files GOPR1521.MP4 and GOPR7754.MP4 were created on two different cameras at the same time and date, and I want their names to indicate that. For example, .../GoProTravisL/GOPR1521.mp4 created at 12:32:38 should become 123238L_GOPR1520.mp4, and GOPR7754.MP4 becomes 123239R_GOPR7754.MP4. Right now the only problem is the time stamps. I would think it's a problem with the wrong time zone or hour offset, but the minutes are off too. Is there something in Perl I am missing when getting time stamps? Below is the Perl code, what it outputs for times for each file, and what Finder on OS X says the creation times are.
Code:
#!/usr/bin/perl
use Time::Piece;
use File::stat;
use File::Find;
use File::Basename;
use File::Spec;

@files = <$ARGV[0]/>;
find({ wanted => \&process_file, no_chdir => 1 }, @files);

sub process_file {
    my($filename, $dirs, $suffix) = fileparse($_, qr/\.[^.]*/);
    if ((-f $_) && ($filename ne "" )) {
        #print "\n\nThis is a file: $_";
        #print "\nFile: $filename";
        #print "\nDIR: $dirs";
        my(@parsedirs) = File::Spec->splitdir($dirs);
        my @strippeddirs;
        foreach my $element ( @parsedirs ) {
            push @strippeddirs, $element if defined $element and $element ne '';
        }
        $pardir = pop(@strippeddirs);
        #print "\nParse DIR: ", $pardir;
        #print "\nFile creation time: ";
        $timestamp = localtime(stat($_)->ctime)->strftime("%H%M%S"); #gives time stamp
        print $timestamp;
        $newname = $timestamp . substr($pardir,-1) ."_". $filename . $suffix;
        print "\nRename: $dirs$filename$suffix to $dirs$newname\n";
        #rename ($dirs . $filename . $suffix, $dirs . $newname) || die ( "Error in renaming: " . $! );
    } else {
        print "\n\nThis is not file: $_\n";
    }
}
Output of time stamps for each file:
/Volumes/Scratch/Raw/2016-03-21/GoProTravisL/
File: GOPR1520
File creation time: 05-55-21
File: GOPR1521
File creation time: 05-56-18
File: GOPR1522
File creation time: 05-57-44
File: GOPR1523
File creation time: 05-58-49
File: GP011520
File creation time: 05-59-53
/Volumes/Scratch/Raw/2016-03-21/GoProTravisR
File: GOPR7754
File creation time: 06-02-48
File: GOPR7755
File creation time: 06-04-19
File: GOPR7756
File creation time: 06-06-27
File: GOPR7757
File creation time: 00-06-16
File: GP017754
File creation time: 00-19-30
File: GP027754
File creation time: 00-22-20
Actual file times using ls:
MacTravis:2016-03-21 travis$ ls -lR /Volumes/Scratch/Raw/2016-03-21
total 0
drwxr-xr-x 8 travis admin 272 Apr 9 21:25 GoProTravisL
drwxr-xr-x 9 travis admin 306 Apr 9 21:25 GoProTravisR
/Volumes/Scratch/Raw/2016-03-21/GoProTravisL:
total 21347376
-rw------- 1 travis admin 4001240088 Mar 21 12:04 GOPR1520.MP4
-rw------- 1 travis admin 1447364149 Mar 21 12:31 GOPR1521.MP4
-rw------- 1 travis admin 2140532053 Mar 21 12:45 GOPR1522.MP4
-rw------- 1 travis admin 1649133454 Mar 21 13:00 GOPR1523.MP4
-rw------- 1 travis admin 1691562945 Mar 21 12:21 GP011520.MP4
/Volumes/Scratch/Raw/2016-03-21/GoProTravisR:
total 31941008
-rw------- 1 travis admin 4001129586 Mar 21 12:04 GOPR7754.MP4
-rw------- 1 travis admin 2166255754 Mar 21 12:31 GOPR7755.MP4
-rw------- 1 travis admin 3202301883 Mar 21 12:45 GOPR7756.MP4
-rw------- 1 travis admin 2466803806 Mar 21 12:08 GOPR7757.MP4
-rw------- 1 travis admin 4001257192 Mar 21 11:27 GP017754.MP4
-rw------- 1 travis admin 516025454 Mar 21 11:29 GP027754.MP4
ctime is the "time of last status change", which I believe is the time the inode was last modified. It is NOT the file's creation time[1]. ls lists the file modification time, so simply change from using ctime to using mtime.
Historically, the time at which a file was created wasn't tracked by the file systems used on Unix systems. Some newer file systems do track it, but I am unsure how to access it (nor is it needed here).

Convert epoch to local time in perl on windows

We are trying to convert the output received from the code below.
The current output is in this form:
testingwindows,1446727960,1446728560,kkulka11,testingwin
testingwindows1,1446727160,141228560,kkulka11,testingwin
testingwindows2,1446727120,1446728560,kkulka11,testingwin
testingwindows3,1446727960,1446728560,kkulka11,testingwin
The output required is something like
testingwindows from Fri Oct 3 13:51:05 2015 GMT to Mon Nov 9 13:51:05 2015 GMT by kkulka11 for testingwin.
testingwindows1 from Fri Oct 2 13:51:05 2015 GMT to Mon Nov 9 13:51:05 2015 GMT by kkulka11 for testingwin.
testingwindows2 from Fri Oct 2 13:51:05 2015 GMT to Mon Nov 9 13:51:05 2015 GMT by kkulka11 for testingwin.
testingwindows3 from Fri Oct 12 13:51:05 2015 GMT to Mon Nov 9 13:51:05 2015 GMT by kkulka11 for testingwin.
This is my current code
if ( $COMMAND eq 'queryone' ) {
    my $msend_query = "$MCELL_HOME\\bin\\mquery";
    my @args_query = (
        $msend_query,
        "-q",
        "-c", "$MCELL_HOME\\etc\\mclient.conf",
        "-n", "$CS_BLACKOUT_CELL",
        "-d",
        "-f", "csv",
        "-a", "CS_EMB_GBF_BLACKOUTS",
        "-s", "blackout_host,start_timestamp,stop_timestamp,userid,reason",
        "-w", "blackout_host: == '${BLACKOUTHOST}'"
    );
    system(@args_query);
We tried using perl -pe 's/(\d{10})/gmtime($1)/e' but were not able to convert, and it gives this error:
'o~}go⌂⌂t⌂x⌂w' is not recognized as an internal or external command,
operable program or batch file.
when we used the code as
if ( $COMMAND eq 'queryone' ) {
    my $msend_query = "$MCELL_HOME\\bin\\mquery";
    my $mqt = "$MCELL_HOME\\mqt.pl";
    my @args_query = (
        $msend_query,
        "-q",
        "-c", "$MCELL_HOME\\etc\\mclient.conf",
        "-n", "$CS_BLACKOUT_CELL",
        "-d",
        "-f", "csv",
        "-a", "CS_EMB_GBF_BLACKOUTS",
        "-s", "blackout_host,start_timestamp,stop_timestamp,userid,reason",
        "-w", "blackout_host: == '${BLACKOUTHOST}'"
    ) | $mqt;
    system(@args_query);
Need an expert's quick help and guidance to achieve the output in a human-readable format.
Edit:
Updated the code as per Jacob's comments but still have not received the desired output. Please suggest.
if ( $COMMAND eq 'queryone' ) {
    my $msend_query = "$MCELL_HOME\\bin\\mquery";
    my @args_query = (
        $msend_query,
        "-q",
        "-c", "$MCELL_HOME\\etc\\mclient.conf",
        "-n", "$CS_BLACKOUT_CELL",
        "-d",
        "-f", "csv",
        "-a", "CS_EMB_GBF_BLACKOUTS",
        "-s", "blackout_host,start_timestamp,stop_timestamp,userid,reason",
        "-w", "blackout_host: == '${BLACKOUTHOST}'"
    );
    chomp;
    my @parts = split(/,/, system(@args_query));
    $parts[1] = localtime($parts[1]);
    $parts[2] = localtime($parts[2]);
    printf("%s from %s to %s by %s for %s\n", @parts);
}
Output:
M:\AbhayBackup\PerlKK>test.pl -q -h testingwin
testingwin
sub: testingwin
testingwin,1446727960,1446728560,kkulka11,testingwin
0 from Thu Jan 1 05:30:00 1970 to Thu Jan 1 05:30:00 1970 by for
while (<DATA>) {
    chomp;
    my @parts = split(/,/, $_);
    $parts[1] = localtime($parts[1]);
    $parts[2] = localtime($parts[2]);
    printf("%s from %s to %s by %s for %s\n", @parts);
}
__DATA__
testingwindows,1446727960,1446728560,kkulka11,testingwin
testingwindows1,1446727160,141228560,kkulka11,testingwin
testingwindows2,1446727120,1446728560,kkulka11,testingwin
testingwindows3,1446727960,1446728560,kkulka11,testingwin
Output:
testingwindows from Thu Nov 5 05:52:40 2015 to Thu Nov 5 06:02:40 2015 by kkulka11 for testingwin
testingwindows1 from Thu Nov 5 05:39:20 2015 to Sun Jun 23 07:09:20 1974 by kkulka11 for testingwin
testingwindows2 from Thu Nov 5 05:38:40 2015 to Thu Nov 5 06:02:40 2015 by kkulka11 for testingwin
testingwindows3 from Thu Nov 5 05:52:40 2015 to Thu Nov 5 06:02:40 2015 by kkulka11 for testingwin
Edit: based on your most recent update to the question, I now believe that you're trying to capture the output of the command and process it. Since you haven't provided a minimal, complete, and verifiable example, and because I have no idea what mquery is and you haven't provided an explanation for that, either, I present to you this guess:
if ($COMMAND eq 'queryone') {
    my @lines = `$MCELL_HOME\\bin\\mquery -q -c $MCELL_HOME\\etc\\mclient.conf -n $CS_BLACKOUT_CELL -d -f csv -a CS_EMB_GBF_BLACKOUTS -s blackout_host,start_timestamp,stop_timestamp,userid,reason -w blackout_host: == '${BLACKOUTHOST}'`;
    for (@lines) {
        chomp;
        my @parts = split(/,/, $_);
        $parts[1] = localtime($parts[1]);
        $parts[2] = localtime($parts[2]);
        printf("%s from %s to %s by %s for %s\n", @parts);
    }
}
