OK, this is my problem.
I'm writing a logon script that copies Microsoft Word templates from a server path to a local path on each computer. This is done using a check for group membership.
If MemberOf(ObjGroupDict, "g_group1") Then
oShell.Run "%comspec% /c %LOGONSERVER%\SYSVOL\mydomain.com\scripts\ROBOCOPY \\server\Templates\Group1\OFFICE2003\ " & TemplateFolder & "\" & " * /E /XO", 0, True
End If
Previously I used the /MIR switch of Robocopy, which is excellent.
But if a user is a member of more than one group, the /MIR switch removes the content copied for the first group when it mirrors the content of the second group, so I can't have both sets of content.
This is "solved" by not using the /MIR switch and just letting the content get copied anyway.
BUT the whole idea of having the templates on a server is that I can control the content the users receive through the script. So if I delete a file or folder from the server path, the deletion no longer replicates to the local computer, since I don't use the /MIR switch anymore. Comprende?
So, what do I do?
I wrote a small script that checks the folders and files and removes them accordingly, but that ended up being the same functionality as the /MIR switch anyway. How do I solve this problem?
Edit: I've found that what I actually need is a routine that scans my local template folder for files and folders and checks if the same structure exists in any of the source template folders.
The server template folders are set up like this:
\\fileserver\templates\group1\
\\fileserver\templates\group2\
\\fileserver\templates\group3\
\\fileserver\templates\group4\
\\fileserver\templates\group5\
\\fileserver\templates\group6\
And the script that does the copying is structured like this (pseudo):
If User is MemberOf (group1) Then
RoboCopy.exe \\fileserver\templates\group1\ c:\templates\workgroup *.* /E /XO
End if
If User is MemberOf (group2) Then
RoboCopy.exe \\fileserver\templates\group2\ c:\templates\workgroup *.* /E /XO
End if
If User is MemberOf (group3) Then
RoboCopy.exe \\fileserver\templates\group3\ c:\templates\workgroup *.* /E /XO
End if
Etc etc
With the /E switch, I make sure subfolders are copied as well, and the /XO switch excludes older files, so only files newer than the ones already in my local path get copied.
But it doesn't consider whether the local path contains files or folders that don't exist on the server template path.
So after the copying is done, I would like to check whether any of the files or folders in my c:\templates\workgroup actually exist in any of the sources, and delete them from my local path if they don't. Something that could be combined with these membership checks, perhaps?
Using a lookup table
I'd suggest an approach that puts all templates into one common file server directory and uses a lookup table to assign templates to groups.
The benefit is that your templates are guaranteed to be in sync; i.e. you don't have to worry about whether a template used by groups A, B, and C is really the same in each group-specific folder on your file server.
Another bonus is a maintainable configuration table which allows you to assign templates to groups without the need to make changes to your logon script.
The lookup table config file would look something like
group1;\templateA.dot;\templateA.dot
group2;\B\templateB.dot;\B\templateB.dot
group3;\B\C\templateC.dot;\templateC.dot
with column 1 listing your AD group names, column 2 the source path, and column 3 the target path.
This would also allow for flattening your template folder on the client side.
In any case, you avoid having to maintain multiple copies of all your templates on your file server, and adding more groups or templates doesn't require touching your logon script, just the config file.
In your logon script you can iterate over all lines and copy the ones with matching groups:
Logon script code
open lookup table config file
For Each line In lookup table
If MemberOf(ObjGroupDict, groupname_column_value) Then
execute Robocopy templatename_column_value local_target
End If
Next
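A minimal VBScript sketch of that loop might look like the one below. The config file path, the semicolon delimiter, and the local target folder are assumptions, while MemberOf and ObjGroupDict come from the existing logon script:
' sketch only: read the lookup table and robocopy the matching rows
' (config path and local target folder are examples; robocopy.exe is assumed to be on the PATH)
Set fso = CreateObject("Scripting.FileSystemObject")
Set oShell = CreateObject("WScript.Shell")
Set cfg = fso.OpenTextFile("\\fileserver\templates\templates.cfg", 1)   ' 1 = ForReading
Do Until cfg.AtEndOfStream
    cols = Split(Trim(cfg.ReadLine), ";")                               ' group;source;target
    If UBound(cols) = 2 Then
        If MemberOf(ObjGroupDict, cols(0)) Then
            srcDir = fso.GetParentFolderName("\\fileserver\templates" & cols(1))
            dstDir = fso.GetParentFolderName("c:\templates\workgroup" & cols(2))
            oShell.Run "robocopy """ & srcDir & """ """ & dstDir & """ """ & _
                       fso.GetFileName(cols(1)) & """ /XO", 0, True
        End If
    End If
Loop
cfg.Close
Robocopy creates the target subfolder if it doesn't exist yet, which keeps the flattened client-side layout working without extra code.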
Removing old files on the client
Here's a script that removes files in the template directory on the user's machine that are not present in any of the group folders copied from the server. For clarity, the code is at the end of this answer. Here's how to use it in your current solution, the one that doesn't use /MIR.
In the code for each group copied, add one additional call to ListFiles; this records the files copied from the server:
If User is MemberOf (group3) Then
RoboCopy.exe \\fileserver\templates\group3\ c:\templates\workgroup *.* /E /XO
ListFiles("\\fileserver\templates\group3\", userTemplates)
End if
Do this for each group copied. (It is ok if the same template appears in more than one group.)
After all groups have been copied, you add this code block:
ListFiles "c:\templates\workgroup", toDelete
removeAllFrom toDelete, userTemplates
This lists all files in the user's local templates folder into toDelete. Everything just copied from the server is then removed from that set, leaving only the files that were not copied. We can then print the files to delete, and then actually delete them:
echoDictionary "deleting old user templates", toDelete
' deleteFiles "c:\templates\workgroup", toDelete
The call to deleteFiles is commented out; it's probably wise to do a trial run first. The first argument to deleteFiles is the user's template directory, and it should not have a trailing backslash.
With these changes in place, any files in the templates folder on the user's machine that were not copied from the server will be deleted, effectively providing multi-directory synchronization.
Now for the script itself. The first block can be pasted at the top of your file and the remainder at the bottom, to help avoid clutter.
' script to remove files not present in one of the group folders on the file server
Set fs = CreateObject("Scripting.FileSystemObject")
Set userTemplates = CreateObject("Scripting.Dictionary")
userTemplates.CompareMode = 1
Set toDelete = CreateObject("Scripting.Dictionary")
toDelete.CompareMode = 1
' -- below here are just procedures, so they can go at
' -- the bottom of your script if desired
Sub deleteFiles(basedir, dictionary)
    ' folder keys end with "\" (see ListSubFolders), so use DeleteFolder for those
    for each key in dictionary.Keys
        if Right(key, 1) = "\" then
            fs.DeleteFolder basedir & "\" & Left(key, Len(key) - 1), True
        else
            fs.DeleteFile basedir & "\" & key, True
        end if
    next
End Sub
Sub echoDictionary(msg, dictionary)
for each key in dictionary.Keys
Wscript.Echo msg & ": " & key
next
End Sub
Sub removeAllFrom(target, toRemove)
for each key in toRemove.Keys
if target.Exists(key) then
target.remove key
end if
next
End Sub
Sub ListFiles(folderName, dictionary)
Set folder = fs.GetFolder(folderName)
ListSubFolders folder, "", dictionary
End Sub
Sub ListSubFolders(folder, prefix, dictionary)
    Set files = folder.Files
    For Each file in files
        qualifiedName = prefix & file.Name
        ' the same template may be copied from more than one group, so avoid duplicate keys
        If Not dictionary.Exists(qualifiedName) Then dictionary.add qualifiedName, file
    Next
    For Each Subfolder in folder.SubFolders
        qualifiedName = prefix & Subfolder.Name & "\"
        ListSubFolders Subfolder, qualifiedName, dictionary
        If Not dictionary.Exists(qualifiedName) Then dictionary.add qualifiedName, Subfolder
    Next
End Sub
Related
I'm making a project out of creating a script to use at work to automate one of our processes.
I'd like the script to take a username as input, search the specified user profile path for any files of type .doc, .docx, .pdf, .pst, etc., and copy them as-is to a created folder on a network drive location.
My main question is: what is the command or chain of commands to check folders and subfolders, starting at the specified user path, for just files with those extensions and copy them, without getting into a situation where it copies the same file over and over again? Sorry if that's confusing.
This answer provides sample code for recursively traversing a folder tree. A list of extensions could be handled by creating a dictionary:
Set extensions = CreateObject("Scripting.Dictionary")
extensions.CompareMode = vbTextCompare 'case-insensitive
extensions.Add "doc", True
extensions.Add "docx", True
extensions.Add "pdf", True
extensions.Add "pst", True
...
and then checking the extension of the processed files like this:
For Each f In fldr.Files
If extensions.Exists(objFso.GetExtensionName(f.Name)) Then
f.Copy targetFolder & "\"
End If
Next
The trailing backslash is required when the destination is a folder, otherwise you'd have to specify the full target path including the target filename.
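For completeness, a minimal sketch of the whole traversal might look like this; the source profile path, the target share, and the CopyFolderTree name are assumptions, and it simply wires the extension check from above into a recursive walk:
Set objFso = CreateObject("Scripting.FileSystemObject")
Set extensions = CreateObject("Scripting.Dictionary")
extensions.CompareMode = vbTextCompare   ' case-insensitive
extensions.Add "doc", True
extensions.Add "docx", True
extensions.Add "pdf", True
extensions.Add "pst", True

' example paths; point these at the user profile and the network share
CopyFolderTree objFso.GetFolder("C:\Users\someuser"), "\\server\share\backup"

Sub CopyFolderTree(fldr, targetFolder)
    Dim f, sf
    For Each f In fldr.Files
        If extensions.Exists(objFso.GetExtensionName(f.Name)) Then
            f.Copy targetFolder & "\", True    ' True = overwrite an existing copy
        End If
    Next
    For Each sf In fldr.SubFolders
        CopyFolderTree sf, targetFolder        ' note: this flattens everything into one target folder
    Next
End Sub
Because every file lands in one flat target folder, files with identical names in different subfolders will overwrite each other; keep the relative path in the target name if that matters for your case.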
I think I have understood most of the requirements; this can be achieved more easily with a batch (.bat) file approach within Windows. A batch file can run commands such as copy, delete, etc.
So create a file called test.bat, and inside the file add the script below:
::XCOPY source [destination]
XCOPY "C:\Temp\*.doc" "C:\Temp\another"
What does this do? It uses an XCOPY command to copy any files within the C:\Temp directory which have a .doc extension. The files will be copied over to a folder called C:\Temp\another.
XCOPY takes two primary arguments: source and destination. The source is where the files currently live, and the destination is where you want to copy them to. More info on all of the available options can be found at:
http://support.microsoft.com/kb/240268
To run the file, just double-click it, or schedule it to run whenever required.
Let me know if this meets your requirement; I didn't fully understand the bit about an input for a username.
I need to write a VBScript to copy files from a local directory (Windows) to another (a shared drive) and verify the counts at the end to make sure everything copied successfully. Any ideas on what the script would look like?
Here is what I recorded using GUI UFT:
SystemUtil.Run "C:\Users\Downloads"
Window("Documents").WinObject("Items View").WinList("Items View").Activate "Unified Functional Testing"
Window("Documents").WinObject("Items View").WinList("Items View").Select "APITest1"
Window("Documents").WinObject("ShellView").WinMenu("ContextMenu").Select "Copy"
Window("Documents").Restore Window("Documents").WinTreeView("WinTreeView").Select "Desktop;This PC;Downloads"
Window("Documents").WinObject("ShellView").WinMenu("ContextMenu").Select "Paste"
To copy files from one folder to another, why record it using QTP/UFT? The script recorded by QTP will not be reliable (it might not work every time). QTP supports VBScript, and it is easy to copy files from one folder to another using VBScript directly.
To copy all files from the temp1 folder to the temp2 folder, just these two lines will do:
Set oFSO = CreateObject("Scripting.FileSystemObject")
oFSO.CopyFile "C:\vIns\temp1\*.*" , "C:\vIns\temp2\" , TRUE
Once the files are copied, you want to compare the file counts. (This assumes the temp2 folder was empty before copying the files.)
iTemp1Count = oFSO.getFolder("C:\vIns\temp1\").Files.Count
iTemp2Count = oFSO.getFolder("C:\vIns\temp2\").Files.Count
If iTemp1Count = iTemp2Count Then
Msgbox "all files are copied"
Else
Msgbox "Something is wrong!!!"
End If
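If the destination folder might not start out empty, a small sketch like this (using the same example paths) verifies each source file individually rather than relying on the counts alone:
' check that every file in temp1 also exists in temp2
copiedOK = True
For Each oFile In oFSO.GetFolder("C:\vIns\temp1\").Files
    If Not oFSO.FileExists("C:\vIns\temp2\" & oFile.Name) Then
        copiedOK = False
        MsgBox "Missing in target: " & oFile.Name
    End If
Next
If copiedOK Then MsgBox "all files are copied"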
I am looking for a way to monitor a folder so that a batch file is executed once it hits 10 files. It would be cool if it used VBScript or some other solution along those lines.
Any help would be appreciated.
Thanks
Refer to this question: batch file to monitor additions to download folder
Note Nick's final solution where he counts files.
I would recommend that any check like this be executed via Task Scheduler.
Simple Example
rem Counting files...
set /a count = 0
for /f "tokens=*" %%P IN ('dir "C:\examplefolder" /A /b') do (set /a count += 1)
rem 10 or more files?
if %count% GEQ 10 call AnotherBatchFileHere.bat
The equivalent VBScript for this would involve obtaining a folder object and checking the count of its files collection. The Last Modified Date for the folder could also be examined to determine if something has changed or when.
Looping through the folder's .Files collection will let you examine the dates, size etc. of each file individually. Since this is a collection of file objects, any file object method can be executed directly or the file object can be passed off to a subroutine for processing. A similar .Subfolders collection enumerates folders created within this folder as folder objects in case you wish to monitor that situation as well.
File methods include .Copy .Move .Delete .OpenAsTextStream and the file properties .DateLastModified .DateLastAccessed .Attributes and .Name are updateable.
Note that the .Name property includes the file extension and if you change the name you may need to call FSO.GetExtensionName() to get that extension and append it to the new name before assigning it back to the property.
The Subfolders collection also has a .Add() method which can create a new child folder
.SubFolders.Add("NewFolderName")
and instead of the file object's .OpenAsTextStream method, folder objects have a .CreateTextFile() method which returns an open text stream for a new text file created in that folder. A clever use would be to have your subroutines write their file-processing activity to such a log file, or to read an existing text file directly and process its contents.
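Before the full watcher example below, here is a minimal sketch of the piece the question actually asks for: running a batch file once the count reaches 10, and logging the event. The watched folder, log file, and batch file paths are all examples:
' check once, log, and run the batch file if the threshold is reached (paths are examples)
Set FSO = CreateObject("Scripting.FileSystemObject")
Set oShell = CreateObject("WScript.Shell")
Set Fldr = FSO.GetFolder("c:\watched")
If Fldr.Files.Count >= 10 Then
    Set Log = FSO.OpenTextFile("c:\scripts\watch.log", 8, True)   ' 8 = ForAppending, create if missing
    Log.WriteLine Now & " - " & Fldr.Files.Count & " files found, running batch"
    Log.Close
    oShell.Run "c:\scripts\process.bat", 1, True                  ' wait for the batch to finish
End If
Run this way from Task Scheduler, a single check like this replaces the polling loop used in the larger example.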
A basic example script to watch for 10 files in a folder:
Set FSO = WScript.CreateObject("Scripting.FileSystemObject")
WatchFolder FSO.GetFolder("c:\watched")
WScript.Quit
Sub WatchFolder(oFldr)
While True
If oFldr.Files.Count >= 10 Then
WScript.Echo oFldr.Files.Count , "files in" , ofldr.Path , _
"Last Modified" , oFldr.DateLastModified
For Each oFile In oFldr.Files
WScript.Echo "File" , oFile.Name , _
"Last Modified" , oFile.DateLastModified , _
"Created" , oFile.DateCreated , _
"Size" , oFile.Size
' call subroutine to optionally process file
KillJunkFile oFile
Next
Exit Sub
End If
WScript.Sleep 2000 ' wait 2 seconds before checking again.
Wend
End Sub
Sub KillJunkFile(oTestFile)
' delete any file named junk.txt
If LCase(oTestFile.Name) = "junk.txt" Then
oTestFile.Delete True ' true forces the delete
End If
End Sub
Note that the WatchFolder() subroutine will loop until at least 10 files are in the watched folder. Otherwise you have to kill the task to stop it, or add some termination logic that checks something on your system telling it to quit looping: a specially named file, a registry entry, an environment variable, etc. You could also comment out the While and Wend loop keywords and have Windows Task Scheduler run the script every hour, if it takes that long for enough files to appear.
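For example, a stop-file check (the file name is just an example) could go inside the While loop, right before the Sleep call:
' quit watching if a stop file appears
If FSO.FileExists("c:\watched\stop.txt") Then Exit Sub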
I have created a tool which picks up a file from a specific location, copies it, zips it and then puts it in another location. The user has to select the required folders from that location.
Is there any way I can add an option to the tool so that the user can see the list of available folders at that location, or some way to point the user directly to that location? I only need the folder names.
I tried it with cmd, but since the location is not on my computer (it's a shared folder on another computer) I don't know how to access it. Any help or hint is very much appreciated. My tool is in VBScript and ASP.
You can use a FileSystemObject to get the contents of a directory.
Set fso = CreateObject("Scripting.FileSystemObject")
Set my_folder = fso.GetFolder("C:\Example")
Then, use the Folder object to get its contents.
set sub_folders = my_folder.subFolders
for each f in sub_folders
wscript.echo( f.name & VBNEWLINE )
next
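The same approach works for the shared location mentioned in the question: GetFolder accepts a UNC path (the server and share names below are placeholders), provided the account running the script has access to the share.
Set fso = CreateObject("Scripting.FileSystemObject")
Set remote_folder = fso.GetFolder("\\otherserver\sharedfolder")   ' placeholder UNC path
For Each f In remote_folder.SubFolders
    WScript.Echo f.Name   ' only the folder names
Next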