robocopy MAXAGE / MINAGE value with hours and minutes - windows

I am trying to copy files from one server to another every hour, as files are being created. I was using Robocopy for the copying, and it's very useful. But now I am really stuck with this.
I need to give MINAGE a value in minutes, or something like that.
If I run robocopy after 2 PM, I should be able to copy only files created before 2 PM.
Robocopy's MAXAGE and MINAGE accept only a date, not a time.
Any suggestions?

Why don't you use the /MIR option and run the job every 60 minutes via Task Scheduler?
Another way could be:
/MOT:m : Monitors the source, and runs again in m minutes if changes are detected.
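For example, a minimal sketch (the share names and interval are placeholders):
robocopy \\SERVER1\share \\SERVER2\share /MIR /MOT:60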
My last resort (non robocopy way):
# Copy only files created within the last 60 minutes (filter first, then copy)
Get-ChildItem C:\src\*.* | Where-Object { $_.CreationTime -ge (Get-Date).AddMinutes(-60) } | Copy-Item -Destination C:\dest\
You could even run that via Task Scheduler.
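For instance, if you save the one-liner above as a script, registering an hourly task might look like this (the task name and script path are hypothetical):
schtasks /Create /SC HOURLY /TN "HourlyCopy" /TR "powershell -NoProfile -File C:\Scripts\copy_recent.ps1"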

xxcopy has better date/time granularity.

MINAGE and MAXAGE refer to the creation date of the file.
MINLAD and MAXLAD refer to the last access date of the file.
Use a combination of both.
Source: http://social.technet.microsoft.com/Forums/scriptcenter/en-US/b5cb685e-32f6-4eed-855d-e710ca4b203f/what-is-the-date-in-robocopys-minage-
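For example, a sketch combining the two kinds of filter (the thresholds are arbitrary):
robocopy C:\src C:\dst /MINAGE:1 /MAXLAD:7
This skips files created within the last day, as well as files that have not been accessed in the last 7 days.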

Related

win cmd: remove whitespace > not enough storage to process this command

I have a long (several million lines) data sheet in plain txt. Looks like this:
cellnumber x-coordinate y-coordinate z-coordinate temperature
1 -6.383637190E-01 2.408539131E-02 -5.244855285E-01 3.081549136E+02
2 -6.390314698E-01 2.286404185E-02 -5.245100260E-01 3.081547595E+02
3 -6.381718516E-01 2.373264730E-02 -5.236577392E-01 3.081547591E+02
4 -6.360489130E-01 2.259869128E-02 -5.245736241E-01 3.081547591E+02
5 -6.369081736E-01 2.253472991E-02 -5.236831307E-01 3.081547591E+02
6 -6.382256746E-01 2.215057984E-02 -5.237988830E-01 3.081547591E+02
7 -6.381900311E-01 2.126700431E-02 -5.245448947E-01 3.081547591E+02
8 -6.373924613E-01 2.117809094E-02 -5.238834023E-01 3.081547591E+02
I currently have only the Windows command line and need to get rid of the whitespace at the beginning (its length is not constant, since it shrinks as the cellnumber increases) so that I get
cellnumber x-coordinate y-coordinate z-coordinate temperature
1 -6.383637190E-01 2.408539131E-02 -5.244855285E-01 3.081549136E+02
2 -6.390314698E-01 2.286404185E-02 -5.245100260E-01 3.081547595E+02
3 -6.381718516E-01 2.373264730E-02 -5.236577392E-01 3.081547591E+02
4 -6.360489130E-01 2.259869128E-02 -5.245736241E-01 3.081547591E+02
5 -6.369081736E-01 2.253472991E-02 -5.236831307E-01 3.081547591E+02
6 -6.382256746E-01 2.215057984E-02 -5.237988830E-01 3.081547591E+02
7 -6.381900311E-01 2.126700431E-02 -5.245448947E-01 3.081547591E+02
8 -6.373924613E-01 2.117809094E-02 -5.238834023E-01 3.081547591E+02
May I ask for a solution? I don't have a clue; I'm not really experienced with this. Thanks!
I guess TrimStart may be my friend.
EDIT: I have put together this:
@ECHO OFF
set "victim=testJana.txt"
SETLOCAL
REM "tokens=*" strips the leading whitespace from each line
FOR /F "tokens=*" %%A IN (%victim%) DO (
IF NOT "%%A"=="" >>"%victim%_edited.txt" ECHO %%A
)
ENDLOCAL
pause
It works fine for smaller files, but I'm getting the message
not enough storage to process this command
Any idea how to deal with this?
I would suggest using PowerShell.
First, second and third edits: to be executed in the powershell.exe shell, in the directory where the data.txt file is placed
(good point by @lit in another post to add -ReadCount):
Get-Content -ReadCount 500 -Path .\path_to_your_source\data.txt | % {$_ -replace '^ +', '' -replace ' +', ' '} | Set-Content -Path .\path_to_your_output\data_no_additional_spaces.txt
Why does -ReadCount make sense? Here it sends 500 lines at a time through the pipeline.
Here is the info from Microsoft's documentation:
-ReadCount
Specifies how many lines of content are sent through the pipeline at a
time. The default value is 1. A value of 0 (zero) sends all of the
content at one time.
This parameter does not change the content displayed, but it does
affect the time it takes to display the content. As the value of
ReadCount increases, the time it takes to return the first line
increases, but the total time for the operation decreases. This can
make a perceptible difference in very large items.
This reads the data, replaces the extra spaces, and then saves the result into data_no_additional_spaces.txt.
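If you want to see the effect of -ReadCount yourself, you could compare timings, for example (a rough sketch using the same file):
Measure-Command { Get-Content -ReadCount 1 .\data.txt | % { $_ -replace '^ +', '' -replace ' +', ' ' } }
Measure-Command { Get-Content -ReadCount 500 .\data.txt | % { $_ -replace '^ +', '' -replace ' +', ' ' } }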
This answer was meant for the powershell.exe shell, not the cmd.exe shell where you normally run your *.bat files. In PowerShell you have scripts called *.ps1.
If you store the above command in trim_space.ps1 and then launch it as follows (you need to have the script in the same directory as the data being transformed), you will see it executed:
powershell.exe -ExecutionPolicy Bypass -Command "& 'C:\path_to_script\trim_space.ps1'"
Fourth edit
To address your:
it works fine for smaller files but I'm getting the message not enough
storage to process this command
Any idea how to deal with this?
You have to process the file in chunks, which your batch file is not doing right now. You get to the point where you exhaust all the available memory, and the command naturally fails. You need an approach that limits how many lines are processed at once, like -ReadCount. With batch files, I imagine one batch file could call another that processes only a limited part of the file.
Using PowerShell, you can limit how much data is processed at a time in the pipeline.
Get-Content -Path .\testJana.txt -ReadCount 1000 |
ForEach-Object { $_ -replace '^ +', '' } |
Out-File -FilePath .\testJana_edited.txt -Encoding ASCII
If you want to run this from a cmd.exe shell, put the PowerShell code above into a file named sourcelimit.ps1 and use the following in a .bat script file.
powershell -NoProfile -File .\sourcelimit.ps1

Robocopy with simple history of files backup

Do you know how I can use robocopy to make an incremental copy?
A simple example of what I would like to have:
\\SOURCE : all files
\\DESTINATION\2016-01-01 : all files
\\DESTINATION\2016-01-02 : only all files modified or created from 2016-01-01
\\DESTINATION\2016-01-03 : only all files modified or created from 2016-01-02
etc...
Thanks for the help,
Pierre
You should be able to leverage these switches:
/MAXAGE:n : MAXimum file AGE - exclude files older than n days/date.
/MINAGE:n : MINimum file AGE - exclude files newer than n days/date.
(If n < 1900 then n = no of days, else n = YYYYMMDD date).
(More switches to tailor your solution)
So to do something historical:
ROBOCOPY \\Source \\DESTINATION\2016-01-01
ROBOCOPY \\Source \\DESTINATION\2016-01-02 /MINAGE:20160102 /MAXAGE:20160101
ROBOCOPY \\Source \\DESTINATION\2016-01-03 /MINAGE:20160103 /MAXAGE:20160102
If it's to take a backup for today, this will suffice on the day of the backup:
Robocopy \\source \\destination /MAXAGE:1
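If you want to automate the date-stamped folders instead of typing the dates by hand, a small PowerShell sketch (paths are placeholders) could build the destination name for you:
# Build today's destination folder name, e.g. \\DESTINATION\2016-01-02
$dest = "\\DESTINATION\{0:yyyy-MM-dd}" -f (Get-Date)
# Copy only files created or modified within the last day
robocopy \\SOURCE $dest /MAXAGE:1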

Automating scripts, Rscripts

I'm having a lot of trouble automating my .R files, and I'm having trouble understanding the information regarding it. But here goes:
I'm using Windows 7 and simply want to automatically run an R script every morning at 08:00. The .R file produces its output by itself, so I don't want a separate output file. I've created a .bat file like this:
"C:\R\R-3.0.1\bin\x64\Rscript.exe" "C:\R\R-3.0.1\bin\x64\Scripts\24AR_v1bat.R"
Echo %DATE% %TIME% %ERRORLEVEL% >> C:\R\R-3.0.1\bin\x64\scripts\24AR_v1.txt
When I run this manually, it works perfectly, both with and without the:
--default-packages=list
When I run it through the cmd window, it works perfectly. Yet when I try to run it through Task Scheduler, it runs but does not work. (I get either a 1 or a 2 error in my error-message file.)
I've looked at R Introduction - Invoking R from the command line, and help(Rscript) but I still can't manage to get it to work.
NEW EDIT: I found that skipping the MS SQL call lets my code run from the scheduler. Not sure if I should make a new question about that?
EDIT: Adding the R-script
# 24 Hour AR-model, v1 ----------------------------------------------------
#Remove all variables from the workspace
#rm(list=ls())
# Loading Packages
library(forecast)
#Get spot-prices System from 2012-01-01 to today
source("/location/Scripts/SQL_hourlyprices.R")
sys <- data.frame()
sys <- spot
rm(spot)
# Ordering the data, first making a matrix with names: SYS
colnames(sys) <- c("date","hour","day","spot")
hour <-factor(sys[,2])
day <-factor(sys[,3])
dt<-sys[,1]
dt<-as.Date(dt)
x<-sys[,4]
q <-ts(x, frequency=24)
x0<- q[hour==0]
x1<- q[hour==1]
x0 <-ts(x0, frequency=7)
x1 <-ts(x1, frequency=7)
# ARIMA MODELS
y0<-Arima(x0,order=c(2,1,0))
y1<-Arima(x1,order=c(2,1,1))
fr0 <- forecast.Arima(y0,h=1)
fr1 <- forecast.Arima(y1,h=1)
h1<-as.numeric(fr0$mean)
h2<-as.numeric(fr1$mean)
day1 <-Sys.Date()+1
atable<-data.frame()
runtime<-Sys.time()
atable<-cbind(runtime,day1,h1,h2)
options(digits=4)
write.table(atable, file="//location/24ar_v1.csv",
append=TRUE,quote=FALSE, sep=",", row.names=F, col.names=F)
But as I said, I can manually run the code with the batch file and have it work perfectly, yet with the scheduler it won't work.
After hours of trying everything, it seems the problem was that I had:
source("/location/Scripts/SQL_hourlyprices.R")
Where I simply had a SQL-call inside:
sqlQuery(dbdata2, "SELECT CONVERT(char(10), [lokaldatotid],126) AS date,
DATEPART(HOUR,lokaldatotid) as hour,
DATENAME(DW,lokaldatotid) as dag,
pris as spot
FROM [SpotPriser] vp1
WHERE (vp1.boers_id=0)
AND (vp1.omraade_id=0)
AND lokaldatotid >='2012-01-01'
GROUP BY lokaldatotid, pris
ORDER BY lokaldatotid, hour desc") -> spot
When I moved this directly into the script and deleted the source line, the script would run with the scheduler.
I have no idea why....
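(For reference: dbdata2 in the snippet above is presumably an RODBC connection opened earlier in the script, along these lines; the DSN name is a placeholder.)
library(RODBC)
dbdata2 <- odbcConnect("MyDSN")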

Powershell or other Windows method to copy datestamped html file to network share

I'm new to PowerShell, so I'm a serious noob.
But I wanted to see if anyone could help with the following.
We have a folder on a server that has reports written to it every night.
The reports are named in the following format:
DiskSpaceReport_26102012.html
and location of C:\Powershell\WebReport\
I would like a PS script to copy one of these files from the folder using a date range of -8 days from the date the script runs; the script would be run as part of a Windows scheduled task or through a SQL Agent job.
So at present there are 8 files in the folder, dating from Friday 26 Oct back to Friday 19 Oct.
I would like the process to run today and copy the file dated 8 days back from today's date.
So it would copy the file named DiskSpaceReport_19102012.html.
And this process should repeat weekly on Friday, copying the file from 8 days ago.
The copy is to a network share
\\Server01\Powershell\Webreports_Archive
And as I mentioned in the title, I don't mind if this is easier to do via robocopy in a batch file, for example.
Would prefer it via PS though.
The following will do what you want:
$pastdays = -8
$pastdate = [datetime]::Now.AddDays($pastdays)
# Format as ddMMyyyy so single-digit days and months keep their leading zero
$filename = "DiskSpaceReport_" + $pastdate.ToString("ddMMyyyy") + ".html"
Copy-Item -Path "C:\Powershell\WebReport\$($filename)" "\\Server01\Powershell\Webreports_Archive"
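To run this weekly on Fridays as a scheduled task, something along these lines might work (the task name, start time, and script path are hypothetical):
schtasks /Create /SC WEEKLY /D FRI /ST 06:00 /TN "ArchiveDiskSpaceReport" /TR "powershell -NoProfile -File C:\Scripts\copy_report.ps1"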
regards
Jon

Query windows event log for the past two weeks

I am trying to export a Windows event log, but limit the exported events not by number but by the time the events were logged. I am trying to do that on Windows 7 and newer. So far my efforts are focused on using wevtutil.
I am using wevtutil, and my command line now is: wevtutil epl Application events.evtx. The problem here is that I export the whole log, and it can be quite big, so I want to limit it to just the last 2 weeks.
I have found this post, but first of all it does not seem to produce any output on my system (yes, I have changed the dates and times), and second, it seems to depend on the date format, which I am trying to avoid.
Here is the modified command I ran:
wevtutil qe Application "/q:*[System[TimeCreated[@SystemTime>='2012-10-02T00:00:00' and @SystemTime<'2012-10-17T00:00:00']]]" /f:text
I had to replace the < and > with the actual symbols as I got a syntax error otherwise. This command produces empty output.
The problem is due to /q: being inside quotes. It should be outside, like:
wevtutil qe Application /q:"*[System[TimeCreated[@SystemTime>='2012-10-02T00:00:00' and @SystemTime<'2012-10-17T00:00:00']]]" /f:text
This works just fine for me.
For the events of the last 2 weeks, you could also use timediff, to avoid hard-coding dates.
Windows uses milliseconds, so it would be 1000 * 86400 (seconds, = 1 day) * 14 (days) = 1209600000.
For your query, that would look like
wevtutil qe Application /q:"*[System[TimeCreated[timediff(@SystemTime) <= 1209600000]]]" /f:text /c:1
I added /c:1 to get only 1 event in the example, since there are many events in the last 2 weeks.
You may also want to only list warning and errors. For that, you can use (Level=2 or Level=3). (For some reason, Level<4 doesn't seem to work for me on Win7)
wevtutil qe Application /q:"*[System[(Level=2 or Level=3) and TimeCreated[timediff(@SystemTime) <= 1209600000]]]" /f:text /c:1
I don't know how you feel about PowerShell, but it's available on all the systems you tagged.
From a powershell prompt, see Get-Help Get-EventLog -Examples for more info.
If you have to do this from a .cmd or .bat file, then you can call powershell.exe -File powershell_script_file_name
where powershell_script_file_name has the Get-EventLog command(s) you need in it.
This example gives all the Security event log failures; I use it to audit systems:
Get-EventLog -LogName security -newest 1000 | where {$_.entryType -match "Failure"}
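If you specifically want the last two weeks rather than the newest 1000 events, Get-EventLog also accepts an -After parameter (a sketch against the Application log):
Get-EventLog -LogName Application -After (Get-Date).AddDays(-14)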
I strongly recommend using LogParser for this kind of task:
logparser -i:evt file:query.sql
With query.sql containing something like this:
SELECT
TimeGenerated,EventID,SourceName,Message
FROM Application
WHERE TimeGenerated > TO_TIMESTAMP(SUB(TO_INT(SYSTEM_TIMESTAMP()), 1209600))
ORDER BY TimeGenerated DESC
The somewhat unintuitive date calculation converts the system time (SYSTEM_TIMESTAMP()) to an integer (TO_INT()), subtracts 1209600 seconds (60 * 60 * 24 * 14 = 2 weeks) and converts the result back to a timestamp (TO_TIMESTAMP()), thus producing the date from 2 weeks ago.
You can parameterize the timespan by replacing the fixed number of seconds with MUL(86400, $days) and changing the commandline to this:
logparser -i:evt file:query.sql+days=14
You can also pass the query directly to logparser:
logparser -i:evt "SELECT TimeGenerated,EventID,SourceName,Message FROM ..."
