I am working on a VBScript to unzip multiple files one after another.
I am using the code below:
Dim folder(3)
folder(0) = "UBO90R1"
folder(1) = "UBO90R2"
folder(2) = "UBO100R1"
folder(3) = "UBO100R2"
For i = 0 To 3
unzip_Source = "D:\Autobackup\" & folder(i) & ".zip"
unzip_destination = "D:\Autobackup_unzip\" & folder(i) &"\"
Call ExtractFilesFromZip(unzip_Source,unzip_destination)
WScript.Echo "unzip Finished"
Next
Sub ExtractFilesFromZip(pathToZipFile, dirToExtractFiles)
Dim fso
Set fso = CreateObject("Scripting.FileSystemObject")
pathToZipFile = fso.GetAbsolutePathName(pathToZipFile)
dirToExtractFiles = fso.GetAbsolutePathName(dirToExtractFiles)
If (Not fso.FileExists(pathToZipFile)) Then
WScript.Echo "Zip file does not exist: " & pathToZipFile
Exit Sub
End If
If Not fso.FolderExists(dirToExtractFiles) Then
WScript.Echo "Directory does not exist: " & dirToExtractFiles
Exit Sub
End If
dim sa : Set sa = CreateObject("Shell.Application")
Dim zip : Set zip = sa.NameSpace(pathToZipFile)
Dim d : Set d = sa.NameSpace(dirToExtractFiles)
d.CopyHere zip.items, 4
Do Until zip.Items.Count <= d.Items.Count
WScript.Sleep(200)
Loop
End Sub
The problem I currently have is that if the folder or file already exists, a dialog opens asking the user to choose an option (Overwrite, Keep both files, etc.).
If I change the following line in the code
d.CopyHere zip.items, 4
'Option 4 doesn't show the progress bar
to
d.CopyHere zip.items, 16
'Option 16 overwrites the existing files, but it shows the progress bar.
I would like to overwrite the existing file without any dialog box and without any progress bar.
PS: Code to unzip copied from here.
Looking at the help, we find the following:
Type: FILEOP_FLAGS
Flags that control the file operation. This member can take a combination of the following flags.
FOF_ALLOWUNDO
Preserve undo information, if possible.
Prior to Windows Vista, operations could be undone only from the same process that performed the original operation.
In Windows Vista and later systems, the scope of the undo is a user session. Any process running in the user session can undo another operation. The undo state is held in the Explorer.exe process, and as long as that process is running, it can coordinate the undo functions.
If the source file parameter does not contain fully qualified path and file names, this flag is ignored.
FOF_CONFIRMMOUSE
Not used.
FOF_FILESONLY
Perform the operation only on files (not on folders) if a wildcard file name (*.*) is specified.
FOF_MULTIDESTFILES
The pTo member specifies multiple destination files (one for each source file in pFrom) rather than one directory where all source files are to be deposited.
FOF_NOCONFIRMATION
Respond with Yes to All for any dialog box that is displayed.
FOF_NOCONFIRMMKDIR
Do not ask the user to confirm the creation of a new directory if the operation requires one to be created.
FOF_NO_CONNECTED_ELEMENTS
Version 5.0. Do not move connected files as a group. Only move the specified files.
FOF_NOCOPYSECURITYATTRIBS
Version 4.71. Do not copy the security attributes of the file. The destination file receives the security attributes of its new folder.
FOF_NOERRORUI
Do not display a dialog to the user if an error occurs.
FOF_NORECURSEREPARSE
Not used.
FOF_NORECURSION
Only perform the operation in the local directory. Do not operate recursively into subdirectories, which is the default behavior.
FOF_NO_UI
Windows Vista. Perform the operation silently, presenting no UI to the user. This is equivalent to FOF_SILENT | FOF_NOCONFIRMATION | FOF_NOERRORUI | FOF_NOCONFIRMMKDIR.
FOF_RENAMEONCOLLISION
Give the file being operated on a new name in a move, copy, or rename operation if a file with the target name already exists at the destination.
FOF_SILENT
Do not display a progress dialog box.
FOF_SIMPLEPROGRESS
Display a progress dialog box but do not show individual file names as they are operated on.
FOF_WANTMAPPINGHANDLE
If FOF_RENAMEONCOLLISION is specified and any files were renamed, assign a name mapping object that contains their old and new names to the hNameMappings member. This object must be freed using SHFreeNameMappings when it is no longer needed.
FOF_WANTNUKEWARNING
Version 5.0. Send a warning if a file is being permanently destroyed during a delete operation rather than recycled. This flag partially overrides FOF_NOCONFIRMATION.
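Since the options argument to CopyHere is a combination of these values, you can pass both at once: 4 (FOF_SILENT, no progress dialog) plus 16 (FOF_NOCONFIRMATION, respond "Yes to All" to any prompt) should suppress both dialogs. A sketch of the one line to change in ExtractFilesFromZip:
d.CopyHere zip.items, 4 + 16 ' = 20: no progress dialog, overwrite existing files without prompting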
I used to run 3 SAS EG projects on a daily basis. A couple of days ago, we got a "SAS Scheduler" that now runs them during the night (the first at 00:00, the second at 01:00, the third at 03:00). Each SAS project has multiple SAS programs.
All in all, that is great news, but it also means I can't check the logs directly anymore.
To keep track of the night jobs, I am trying to find the best way to export the log files for each project. I recently found out about the SAS Project Log, which summarizes the logs from all the programs within a SAS project.
I discovered CaseySmith's answer on the SAS Community forum, basically tweaking the .vbs script to save the SAS Project log file to a .txt using the following code:
Set objProjectLog = objProject.ProjectLog
objProjectLog.Clear()
objProjectLog.Enabled = True
'strProjectLog = objProjectLog.Text
objProjectLog.SaveAs "c:\temp\projectLog.txt"
But 1) it is a .txt file, not a .log file, and 2) I don't know where to add it in my current .vbs script:
Option Explicit
Dim app
Call dowork
'shut down the app
If not (app Is Nothing) Then
app.Quit
Set app = Nothing
End If
Sub dowork()
On Error Resume Next
'----
' Start up Enterprise Guide using the project name
'----
Dim prjName
Dim prjObject
prjName = "C:\Users\kermit\Desktop\Project.egp" 'Project Name
Set app = CreateObject("SASEGObjectModel.Application.8.1")
If Checkerror("CreateObject") = True Then
Exit Sub
End If
'-----
' open the project
'-----
Set prjObject = app.Open(prjName,"")
If Checkerror("app.Open") = True Then
Exit Sub
End If
'-----
' run the project
'-----
prjObject.run
If Checkerror("Project.run") = True Then
Exit Sub
End If
'-----
' Save the new project
'-----
prjObject.Save
If Checkerror("Project.Save") = True Then
Exit Sub
End If
'-----
' Close the project
'-----
prjObject.Close
If Checkerror("Project.Close") = True Then
Exit Sub
End If
End Sub
Function Checkerror(fnName)
Checkerror = False
Dim strmsg
Dim errNum
If Err.Number <> 0 Then
strmsg = "Error #" & Hex(Err.Number) & vbCrLf & "In Function " & fnName & vbCrLf & Err.Description
'MsgBox strmsg 'Uncomment this line if you want to be notified via MessageBox of Errors in the script.
Checkerror = True
End If
End Function
In the end, what I would like is to run a program in the morning that scans the 3 project log files for notes, warnings and errors and emails me the results. So, is there a way to export the SAS Project Log (not manually) to a folder?
So, first, what is this code doing?
Set objProjectLog = objProject.ProjectLog
objProjectLog.Clear()
This clears the project log. It needs to be done before your project is run - otherwise the log contains data from past runs. So put this before prjObject.Run().
objProjectLog.Enabled = True
'strProjectLog = objProjectLog.Text
objProjectLog.SaveAs "c:\temp\projectLog.txt"
This then exports the project log to a text file. You can of course call that text file whatever you want. You need this code to appear after your project runs and somewhere before it closes - right after prjObject.Run() is probably fine.
You will also need to update the names to match your .vbs file's: their snippet uses objProject and your script uses prjObject, but those are the same object - just match the names.
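Putting that together, here is a sketch of how the relevant part of dowork() could look once the project-log calls are added (the output path is just an example):
' Sketch: project-log handling added around the existing run step
Dim objProjectLog
Set objProjectLog = prjObject.ProjectLog
objProjectLog.Clear()           ' drop log entries from previous runs
objProjectLog.Enabled = True    ' make sure project logging is on
prjObject.run
If Checkerror("Project.run") = True Then
Exit Sub
End If
objProjectLog.SaveAs "c:\temp\projectLog.txt"   ' export the log right after the run
If Checkerror("ProjectLog.SaveAs") = True Then
Exit Sub
End If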
Second - what else could you do? If VBS isn't your thing, you have a lot of other ways you could do this.
Export your EG project to a .sas file, then schedule this in base SAS with the normal output options. This may also be possible via the scheduling interface.
Use PROC PRINTTO to redirect your log inside your SAS code.
Copy your EG project to a location you can see. The EG project does contain the log of everything that was run - so there's no reason you couldn't just open the .egp and look at it, just make sure you're not doing that with the production file since you might forget to close out.
My preference is not to schedule EG projects, but to schedule .sas programs; use EG as the development environment and then export to .sas. This gives you more flexibility. But there are a lot of different ways to skin this cat.
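Once the project log is exported to a text file, the morning check described in the question could itself be a small VBScript. This is only a sketch, assuming the log was saved as c:\temp\projectLog.txt and that lines start with the usual SAS prefixes (NOTE:, WARNING:, ERROR:); wiring up the email is left out:
' Sketch: count NOTE/WARNING/ERROR lines in an exported project log
Const ForReading = 1
Dim fso, ts, logLine, notes, warnings, errors
Set fso = CreateObject("Scripting.FileSystemObject")
Set ts = fso.OpenTextFile("c:\temp\projectLog.txt", ForReading)
notes = 0 : warnings = 0 : errors = 0
Do Until ts.AtEndOfStream
logLine = ts.ReadLine
If Left(logLine, 5) = "NOTE:" Then notes = notes + 1
If Left(logLine, 8) = "WARNING:" Then warnings = warnings + 1
If Left(logLine, 6) = "ERROR:" Then errors = errors + 1
Loop
ts.Close
WScript.Echo "Notes: " & notes & ", Warnings: " & warnings & ", Errors: " & errors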
First of all, I'm really new to programming. I have all these ideas about what I want to do but can't seem to figure them out. Anyway, I want a VBScript or batch file - anything at this point - that, when executed, asks for user input, e.g. "Name of the file you want to search for". When I type in, say, hello.txt and hit Enter, it should ask another question: where do you want me to look for this file? If I type in C:\ or any given drive letter, it should search that entire drive, including folders inside of folders, if I'm not specific about the path. For example, if I enter C:\Program Files (x86)\, it should search that directory and all of the folders in that directory, not the entire C:\ drive.
I'm thinking that to achieve this I need to call a function: the second question asks for a location, so when I type one in, the script should run a specific set of code with the location I entered substituted in. That way it isn't only working for a location written into the code, and I don't have to rewrite the script every single time I want to search for a new file - otherwise asking those questions would be completely pointless.
Sorry for the rambling, but I have looked everywhere and found things like this but nothing close. It would be greatly appreciated if someone could help me out or point me in the right direction.
This is what I have tried for user input. It's nothing close to what I want, but here it is:
Dim Input
Input = InputBox("Enter your name")
MsgBox ("You entered: " & Input)
It asks for your name and then displays the name you entered. I need this concept, but when I type something in, it should call a function and execute it. Hope someone knows what I'm talking about. Thanks.
Here's a script to do that; the comments explain what each part does. Right now it just outputs everything to a message box, but you can do what you want with the results.
Dim fso, userfile, userdir, fileslist()
Set fso = CreateObject("Scripting.FileSystemObject")
ReDim fileslist(-1) ' Find results will be stored in this dynamic array
userfile = InputBox("Enter file to search for") ' Get file to search for
userdir = InputBox("Enter search directory") ' Get search directory
If Len(userfile) < 1 Then ' Check length of file name to ensure something was entered.
MsgBox "No file name entered", 4096 + 16 ' Message box. 4096 = "System modal", essentially always on top.
ElseIf Len(userdir) < 1 Then ' Check length of dir
MsgBox "No directory entered", 4096 + 16 ' 16 = exclamation/error
ElseIf Not fso.FolderExists(userdir) Then ' Make sure search directory actually exists
MsgBox "Folder " & userdir & " doesn't exist", 4096 + 16
Else
FindFile userfile, userdir ' Call FindFile sub, with the user's file and dir args passed to it
If UBound(fileslist) >= 0 Then ' After sub completes, check whether any results were found.
MsgBox Join(fileslist, vbCrLf), 4096, "Results" ' If so, output the whole array ("join"), one result per line (delimited with vbcrlf)
Else
MsgBox "File " & userfile & " not found", 4096 + 48 ' Otherwise file not found, message stating so
End If
End If
Sub FindFile(searchname, searchdir) ' Two parameters: file name, search directory
On Error Resume Next ' Enable error handling so we don't crash out on access denied errors
Dim file, folder, subfolder
For Each file In fso.GetFolder(searchdir).Files ' Process each file (as a file object) in the search directory
If LCase(searchname) = LCase(file.Name) Then ' See if file name matches. Using LCase to convert both to lowercase for case insensitivity.
ReDim Preserve fileslist(UBound(fileslist) + 1) ' If match found then increase array size by 1
fileslist(UBound(fileslist)) = file.Path ' Store the file path in newly added array entry
End If
Next
' Now the recursive bit. For any subfolders in current search directory, call FindFile again
' with (1) the same file name originally passed in as "searchname", and (2) a new search
' directory of the subfolder's name. FindFile then starts again on this new directory: finds files,
' adds matches to the fileslist array, then does the same on each subfolder found. This
' is how it searches each subfolder (and subfolders of subfolders... etc) in a directory
For Each subfolder In fso.GetFolder(searchdir).SubFolders
FindFile searchname, subfolder.Path
Next
On Error GoTo 0
End Sub
The goal of the following VBscript is to prepend a user-defined string to all files with a particular extension within a specified directory:
directory = "C:\Users\xxxxxxxx\Desktop\Test\" 'include final backslash
extension = ".doc" 'include period, ex: ".tab"
''''''''''''''''''''''''''''''''''''''''''
addStr = InputBox("Enter the text you would like to prepend:", , "xxxxxxxx_xxxxxxxxxx_x_xx_xxx_")
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFolder = objFSO.GetFolder(directory)
Set colFiles = objFolder.Files
For Each file In colFiles
absPath = objFSO.GetAbsolutePathName(file)
currentExtension = objFSO.GetExtensionName(absPath)
If StrComp(currentExtension, Mid(extension, 2)) = 0 Then
file.Name = addStr & objFSO.GetFileName(file)
End If
Next
The script generally works well, but occasionally demonstrates this problematic behavior:
When running the script on a directory with lots of files and/or with files with long names, the script appears to iterate back over the collection of files (i.e. prepends to files that have already been prepended) and does so until the filenames become too long to be recognized by the FSO, crashing the script.
The threshold of the number of files/length of filenames at which this occurs appears to be very distinct and reproducible. For example, if I create a target directory (e.g. "...\Desktop\Test") with a file named '1.doc' that is copied/pasted several times, the script will properly rename up to 31 files, but it demonstrates the problematic behavior with 32+ files. Similarly, if I run the script twice over 31 files (generated in the same manner), the script demonstrates the problematic behavior on the second run.
Any thoughts as to the underlying issue are very much appreciated--thanks in advance!
You may have issues here because you're modifying files while iterating over them. Try creating an array of file names first and then iterating over the array, changing the names.
ReDim a(colFiles.Count - 1)
i = 0
For Each File In colFiles
a(i) = File.Path
i = i + 1
Next
For i = 0 To UBound(a)
If StrComp(objFSO.GetExtensionName(a(i)), Mid(extension, 2)) = 0 Then
With objFSO.GetFile(a(i))
.Name = addStr & .Name
End With
End If
Next
The reason the above behaviour occurs is that when you initially call Set colFiles = objFolder.Files, the first 32 files are retrieved and placed into a cache. Once those 32 files are processed, the system retrieves the next 32 filenames that have not been enumerated yet.
Since you renamed the files after the initial call, the system sees them as new filenames that have not been processed yet, and because their names still sort first alphabetically, they are placed into the next 32-file batch and processed again.
The solution by Bond is the standard workaround for this issue. Due to limitations of VBScript, it is the only practical resolution.
I am looking for a way to monitor a folder so that it executes a batch file once the folder hits 10 files. It would be cool if it used VBScript or any other type of solution like that.
Any help would be appreciated. Thanks.
Refer to this question: batch file to monitor additions to download folder
Note Nick's final solution where he counts files.
I would recommend that any test like this be executed via Task Scheduler.
Simple Example
rem Counting files (excluding directories with /A-D)...
set /a count = 0
for /f "tokens=*" %%P IN ('dir "C:\examplefolder" /A-D /B') do (set /a count += 1)
rem 10 or more files?
if %count% GEQ 10 call AnotherBatchFileHere.bat
The equivalent VBScript for this would involve obtaining a folder object and checking the count of its files collection. The Last Modified Date for the folder could also be examined to determine if something has changed or when.
Looping through the folder's .Files collection will let you examine the dates, size etc. of each file individually. Since this is a collection of file objects, any file object method can be executed directly or the file object can be passed off to a subroutine for processing. A similar .Subfolders collection enumerates folders created within this folder as folder objects in case you wish to monitor that situation as well.
File methods include .Copy, .Move, .Delete and .OpenAsTextStream, and the file properties .DateLastModified, .DateLastAccessed, .Attributes and .Name are updateable.
Note that the .Name property includes the file extension and if you change the name you may need to call FSO.GetExtensionName() to get that extension and append it to the new name before assigning it back to the property.
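For example, here is a small sketch of renaming a file object while keeping its extension (the new base name is just a placeholder; FSO and oFile are the objects used in the example script below):
' Sketch: rename oFile but keep its original extension
Dim newBase, ext
newBase = "processed_" & FSO.GetBaseName(oFile.Name)   ' hypothetical new base name
ext = FSO.GetExtensionName(oFile.Name)                 ' e.g. "txt"
oFile.Name = newBase & "." & ext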
The Subfolders collection also has a .Add() method which can create a new child folder
.SubFolders.Add("NewFolderName")
and instead of the file object's .OpenAsTextStream method, folder objects have a .CreateTextFile() method which returns an open text stream object to a new text file created in that folder. A clever use could be to create a text stream used by your subroutines to log your file processing activities to a log file. Or read a text file directly and process its contents.
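As a small sketch of that idea (assuming a folder object such as the oFldr used below; the log file name is just a placeholder):
' Sketch: open a log stream in the folder and record activity to it
Dim logStream
Set logStream = oFldr.CreateTextFile("watch.log", True)   ' True = overwrite an existing file
logStream.WriteLine Now & " - started watching " & oFldr.Path
logStream.Close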
A basic example script to watch for 10 files in a folder:
Set FSO = WScript.CreateObject("Scripting.FileSystemObject")
WatchFolder FSO.GetFolder("c:\watched")
WScript.Quit
Sub WatchFolder(oFldr)
While True
If oFldr.Files.Count >= 10 Then
WScript.Echo oFldr.Files.Count , "files in" , ofldr.Path , _
"Last Modified" , oFldr.DateLastModified
For Each oFile In oFldr.Files
WScript.Echo "File" , oFile.Name , _
"Last Modified" , oFile.DateLastModified , _
"Created" , oFile.DateCreated , _
"Size" , oFile.Size
' call subroutine to optionally process file
KillJunkFile oFile
Next
Exit Sub
End If
WScript.Sleep 2000 ' wait 2 seconds before checking again.
Wend
End Sub
Sub KillJunkFile(oTestFile)
' delete any file named junk.txt
If LCase(oTestFile.Name) = "junk.txt" Then
oTestFile.Delete True ' true forces the delete
End If
End Sub
Note that the WatchFolder() subroutine will loop until at least 10 files are in the watched folder. Otherwise you have to kill the task to stop it, or add some termination logic that checks something on your system that tells it to quit looping - something like a specially named file, a registry entry, an environment variable, etc. You could also comment out the While and Wend loop keywords and have Windows Task Scheduler run the script every hour if it takes that long for enough files to appear.
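A sketch of the specially named file idea - the marker file name is just a placeholder, and the check belongs inside the While loop above:
' Sketch: stop watching when a marker file appears
If FSO.FileExists("c:\watched\stop.watching") Then Exit Sub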
Ok, this is my problem.
I'm writing a logon script that copies Microsoft Word templates from a server path onto a local path on each computer. This is done using a check for group membership.
If MemberOf(ObjGroupDict, "g_group1") Then
oShell.Run "%comspec% /c %LOGONSERVER%\SYSVOL\mydomain.com\scripts\ROBOCOPY \\server\Templates\Group1\OFFICE2003\ " & TemplateFolder & "\" & " * /E /XO", 0, True
End If
Previously I used the /MIR switch of robocopy, which is excellent.
But if a user is a member of more than one group, the /MIR switch removes the content copied for the first group when it mirrors the content of the second group, meaning I can't have both.
This is "solved" by not using the /MIR switch and just let the content get copied anyway.
BUT the whole idea of having the templates on a server is that I can control the content the users receive through the script. So if I delete a file or folder from the server path, the deletion doesn't replicate to the local computer, since I don't use the /MIR switch anymore. Comprende?
So, what do I do?
I did a small script that basically checks the folders and files and then removes them accordingly, but this actually ended up being the same functionality as the /MIR switch anyway. How do I solve this problem?
Edit: I've found that what I actually need is a routine that scans my local template folder for files and folders and checks if the same structure exists in any of the source template folders.
The server template folders are set up like this:
\\fileserver\templates\group1\
\\fileserver\templates\group2\
\\fileserver\templates\group3\
\\fileserver\templates\group4\
\\fileserver\templates\group5\
\\fileserver\templates\group6\
And the script that does the copying is structured like this (pseudo):
If User is MemberOf (group1) Then
RoboCopy.exe \\fileserver\templates\group1\ c:\templates\workgroup *.* /E /XO
End if
If User is MemberOf (group2) Then
RoboCopy.exe \\fileserver\templates\group2\ c:\templates\workgroup *.* /E /XO
End if
If User is MemberOf (group3) Then
RoboCopy.exe \\fileserver\templates\group3\ c:\templates\workgroup *.* /E /XO
End if
Etc etc
With the /E switch, I make sure it copies subfolders as well. And the /XO switch only copies files and folders that are newer than those in my local path.
But it doesn't consider whether the local path contains files or folders that don't exist on the server template path.
So after the copying is done, I would like to check whether the files and folders in my c:\templates\workgroup actually exist in any of the sources, and if they don't, delete them from my local path. Something that could be combined with these membership checks, perhaps?
Using a lookup table
I'd suggest an approach that puts all templates into one common file server directory and uses a lookup table to assign templates to groups.
The benefit would be that your templates would be guaranteed to be in sync; i.e. you don't have to worry that a template for group A, B, and C is really the same in all group specific folders on your file server.
Another bonus is a maintainable configuration table which allows you to assign templates to groups without the need to make changes to your logon script.
The lookup table config file would look something like
group1;\templateA.dot;\templateA.dot
group2;\B\templateB.dot;\B\templateB.dot
group3;\B\C\templateC.dot;\templateC.dot
with column 1 listing your AD group names; column 2 the source path and column 3 the target path.
This would also allow for flattening your template folder on the client side.
In any case, you avoid having to maintain multiple copies of all your templates on your file server, and adding more groups or templates doesn't require touching your logon script - just the config file.
In your logon script you can iterate over all lines and copy the ones with matching groups
Logon script code
open lookup table config file
For Each line In lookup table
If MemberOf(ObjGroupDict, groupname_column_value) Then
execute Robocopy templatename_column_value local_target
End If
Next
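A minimal runnable sketch of that loop, assuming the semicolon-delimited config file shown above (the config path is a placeholder, and MemberOf/ObjGroupDict come from your existing script). It uses FileSystemObject.CopyFile for brevity, but you could build a Robocopy command line from the same columns instead:
Const ForReading = 1
Dim fso, cfg, cfgLine, parts
Set fso = CreateObject("Scripting.FileSystemObject")
Set cfg = fso.OpenTextFile("\\fileserver\templates\groups.cfg", ForReading)
Do Until cfg.AtEndOfStream
cfgLine = Trim(cfg.ReadLine)
If Len(cfgLine) > 0 Then
parts = Split(cfgLine, ";")   ' parts(0)=group, parts(1)=source path, parts(2)=target path
If MemberOf(ObjGroupDict, parts(0)) Then
' overwrite = True; nested target folders are assumed to exist (or be flattened)
fso.CopyFile "\\fileserver\templates" & parts(1), "c:\templates\workgroup" & parts(2), True
End If
End If
Loop
cfg.Close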
Removing old files on the client
Here's a script that removes files in the template directory on the user's machine that are not present in one of the group folders copied from the server. For clarity, the code is at the end of this answer. Here's how to use it in your current solution that doesn't use /MIR.
In the code for each group copied, add one additional call to ListFiles - this tracks the files copied from the server:
If User is MemberOf (group3) Then
RoboCopy.exe \\fileserver\templates\group3\ c:\templates\workgroup *.* /E /XO
ListFiles("\\fileserver\templates\group3\", userTemplates)
End if
Do this for each group copied. (It is ok if the same template appears in more than one group.)
After all groups have been copied, you add this code block:
ListFiles "c:\templates\workgroup", toDelete
removeAllFrom toDelete, userTemplates
This lists all files in the user's local templates folder into toDelete. All the files just copied are then removed from that set, leaving only the files that were not copied from the server. We can then print the files to delete and, finally, actually delete them.
echoDictionary "deleting old user templates", toDelete
' deleteFiles "c:\templates\workgroup", toDelete
The call to deleteFiles is commented out - probably wise to do a trial run first! The first argument to deleteFiles is the user's template directory; it should not have a trailing slash.
With these changes in place, any files in the templates folder on the user's machine that were not copied from the server will be deleted, effectively providing multi-directory synchronization.
Now comes the script. The first block can be pasted to the top of your file, and the remainder at the bottom, to help avoid clutter.
' Script to remove local files not present in one of the group folders on the file server
Set fs = CreateObject("Scripting.FileSystemObject")
Set userTemplates = CreateObject("Scripting.Dictionary")
userTemplates.CompareMode = 1
Set toDelete = CreateObject("Scripting.Dictionary")
toDelete.CompareMode = 1
' Everything below here is just procedures, so they can go at
' the bottom of your script if desired.
Sub deleteFiles(basedir, dictionary)
for each key in dictionary.Keys
' folder entries added by ListSubFolders end with "\" - delete those as folders
if Right(key, 1) = "\" then
fs.DeleteFolder basedir & "\" & Left(key, Len(key) - 1)
else
fs.DeleteFile basedir & "\" & key
end if
next
End Sub
Sub echoDictionary(msg, dictionary)
for each key in dictionary.Keys
Wscript.Echo msg & ": " & key
next
End Sub
Sub removeAllFrom(target, toRemove)
for each key in toRemove.Keys
if target.Exists(key) then
target.remove key
end if
next
End Sub
Sub ListFiles(folderName, dictionary)
Set folder = fs.GetFolder(folderName)
ListSubFolders folder, "", dictionary
End Sub
Sub ListSubFolders(folder, prefix, dictionary)
Set files = folder.Files
For Each file in files
qualifiedName = prefix & file.Name
dictionary.add qualifiedName, file
Next
For Each Subfolder in folder.SubFolders
qualifiedName = prefix & Subfolder.Name & "\"
ListSubFolders Subfolder, qualifiedName, dictionary
dictionary.add qualifiedName, Subfolder
Next
End Sub