I have created a tool which picks up a file from a specific location, copies it, zips it and then puts it at another location. The user has to select the required folders from the location.
Is there any way through which I can create an option in the tool so that the user can see the list of available folders at that location, or some way to direct the user directly to that location? I only need the folder names.
I tried it with cmd, but since the location is not on my computer (it's a shared folder on another computer) I don't know how to access that location. Any help or hint is very much appreciated. My tool is in VBScript and ASP.
You can use a FileSystemObject to get the contents of a directory.
Set fso = CreateObject("Scripting.FileSystemObject")
Set my_folder = fso.GetFolder("C:\Example")
Then, use the Folder object to get its contents.
Set sub_folders = my_folder.SubFolders
For Each f In sub_folders
    WScript.Echo f.Name & vbNewLine
Next
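Since the share lives on another computer, the same approach works with a UNC path, because the FileSystemObject accepts them. A minimal sketch (the \\otherpc\share path is a placeholder, not a real location from the question):

Set fso = CreateObject("Scripting.FileSystemObject")
'The share below is a placeholder; substitute the real \\server\share path
Set remote_folder = fso.GetFolder("\\otherpc\share\Example")
For Each f In remote_folder.SubFolders
    WScript.Echo f.Name
Next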
I want a macro to save backups of my Personal.xlsb -- Why won't this work for me?
Workbooks("Personal.xlsb").SaveCopyAs "C:Users\Tom\Documents\Test.xlsb"
I get runtime error 1004 saying Excel cannot access the file "C:Users\Tom\Documents\Test.xlsb" -- which I wouldn't think it would need to access but instead to create.
(I know I can manually copy Personal.xlsb from one place to another.)
As explained by my teacher Leila Gharani, the key is: even under a different name, a copy of the open Personal.xlsb file can only be saved to the XLSTART folder. So the code to save a copy of Personal.xlsb in another folder (substitute your own user name for Tom) is:
Workbooks("Personal.xlsb").SaveCopyAs _
"C:\Users\Tom\AppData\Roaming\Microsoft\Excel\xlstart\Personal(1).xlsb" _
'can only save it to this folder
Name "C:\Users\Tom\AppData\Roaming\Microsoft\Excel\xlstart\Personal(1).xlsb" As _
"C:\Users\Tom\Documents\Personal-Test.xlsb" _
'can name as (and thus move to) any folder & name
I use a separate program like SyncBackFree to make a backup of Personal.xlsb, because then it is also possible to save it to another computer, NAS, or USB storage.
Thanks @TommyExcels. This helped me.
To build on the post a little, there's a property you can use for the XLSTART folder, which means you don't have to put the user name in the path:
Workbooks("Personal.xlsb").SaveCopyAs _
Application.StartupPath & "\Personal(1).xlsb"
'can only save it to this folder
Name Application.StartupPath & "\Personal(1).xlsb" As _
"C:\temp\Personal-Test.xlsb"
I'm making a project out of creating a script to use at work to automate one of our processes.
I'd like the script to take a username as input, search that user's profile path for any files with extensions such as .doc, .docx, .pdf, .pst, etc., and copy them as-is to a newly created folder on a network drive location.
My main question is: what command, or chain of commands, will check the folders and subfolders starting at the specified user path for just the files with those extensions and copy them, without getting into a situation where the same file is copied over and over again? Sorry if that's confusing.
This answer provides sample code for recursively traversing a folder tree. A list of extensions could be handled by creating a dictionary:
Set extensions = CreateObject("Scripting.Dictionary")
extensions.CompareMode = vbTextCompare 'case-insensitive
extensions.Add "doc", True
extensions.Add "docx", True
extensions.Add "pdf", True
extensions.Add "pst", True
...
and then checking the extension of the processed files like this:
For Each f In fldr.Files
    If extensions.Exists(objFso.GetExtensionName(f.Name)) Then
        f.Copy targetFolder & "\"
    End If
Next
The trailing backslash is required when the destination is a folder, otherwise you'd have to specify the full target path including the target filename.
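For completeness, here's a minimal recursive sketch tying the pieces together. The CopyMatchingFiles name, the profile path, and the network destination are placeholders rather than the linked answer's actual code, and the extensions dictionary from above is assumed to be in scope:

Set objFso = CreateObject("Scripting.FileSystemObject")
targetFolder = "\\server\share\backup"   'assumed network destination

CopyMatchingFiles objFso.GetFolder("C:\Users\someuser"), targetFolder

Sub CopyMatchingFiles(fldr, dest)
    Dim f, sf
    For Each f In fldr.Files
        If extensions.Exists(objFso.GetExtensionName(f.Name)) Then
            f.Copy dest & "\"   'overwrites an existing copy of the same name
        End If
    Next
    For Each sf In fldr.SubFolders
        CopyMatchingFiles sf, dest   'recurse into subfolders
    Next
End Sub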
I think I have understood most of the requirements, and this can be achieved more easily by using a .BAT file approach within Windows. A batch (.bat) file can run commands such as copy, delete, etc.
So create a file called test.bat, and inside the file add the below script:
::XCOPY source [destination]
XCOPY "C:\Temp\*.doc" "C:\Temp\another"
What does this do? Well, it uses the XCOPY command to copy any files within the C:\Temp directory which have a .doc extension. The files will be copied over to a folder called C:\Temp\another.
XCOPY takes two primary arguments: source and destination. Source is where the file currently lives, and destination is where you want to copy the files to. More info on all of the available options can be found at:
http://support.microsoft.com/kb/240268
In order to run the file, just double click it, or schedule it to run whenever required.
Let me know if this meets your requirement; I didn't fully understand the bit about an input for a username.
I have this very complicated requirement.
We have a bunch of zipped files downloaded from an ftp server into a folder in our local directory.
Then we use the code below to unzip the files.
Set objZip = CreateObject("XStandard.Zip")
Set FSO = CreateObject("Scripting.FileSystemObject")
Set fldr = FSO.GetFolder("C:\MUSK\FTP\MainFolder\")
For Each fil In fldr.Files
    If LCase(Right(fil.Name, 4)) = ".zip" Then
        zipFilePath = fil.Path
        objZip.UnPack zipFilePath, ("C:\MUSK\FTP\Current\")
    End If
Next
So far so good.
Here is where problems come in.
These downloaded files have the following naming convention:
filename_month-day-year.zip
Example: Assuming today is May 16, 2012, the filename looks like this:
myFile_5-16-2012.zip
Our requirement is to grab the downloaded zipped files and place them in their correct folder.
For instance, we have folders named according to month and year.
Example: We have JAN2012, FEB2012, etc.
So taking myFile_5-16-2012.zip as an example, myFile_5-16-2012.zip is for MAY2012.
We would like to use the script above to grab the myFile_5-16-2012.zip and place it in the appropriate folder. In this example, the appropriate folder would be MAY2012 and then unzip it.
Basically, the MonthYear folder will replace this:
objZip.UnPack zipFilePath, ("C:\MUSK\FTP\Current\")
In other words, instead of the Current folder, it will be MAY2012 or whatever MonthYear combination.
Is this possible?
I would be more than happy to clarify. Sorry if I confused anyone.
This is pretty straightforward. I would:
Create a function which converts the file name to the appropriate MMMYYYY format
Use the FileSystemObject to determine if the folder name created in step 1 exists, and create if needed
Pass the full directory to your XStandard.Zip object
Check out the supported methods of FileSystemObject here:
http://msdn.microsoft.com/en-us/library/z9ty6h50(v=vs.85).aspx
You'll need .FolderExists and .CreateFolder, at least.
A quick VBScript function I whipped up; it could probably use some error checking and whatnot. Enjoy.
' Parse the date; assumes the file name is in foo_M-D-YYYY.ext format
' (note CDate is locale-dependent, so M-D-YYYY must match the system's date order)
Function parseDate(s)
    Dim dt
    dt = CDate(Split(Split(s, "_")(1), ".")(0))
    parseDate = UCase(MonthName(Month(dt), True)) & Year(dt)   'e.g. "MAY2012"
End Function
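Putting the three steps together on top of the question's own loop might look like the sketch below. The C:\MUSK\FTP paths and the XStandard.Zip object come from the question; targetFolder is just an illustrative variable name:

Set objZip = CreateObject("XStandard.Zip")
Set FSO = CreateObject("Scripting.FileSystemObject")
Set fldr = FSO.GetFolder("C:\MUSK\FTP\MainFolder\")

For Each fil In fldr.Files
    If LCase(Right(fil.Name, 4)) = ".zip" Then
        targetFolder = "C:\MUSK\FTP\" & parseDate(fil.Name)   'e.g. C:\MUSK\FTP\MAY2012
        If Not FSO.FolderExists(targetFolder) Then FSO.CreateFolder targetFolder
        objZip.UnPack fil.Path, targetFolder & "\"
    End If
Next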
I have a .zip file that starts with a parent directory. I need to read that dir from the file, then search my HD to see if that dir name already exists. If it exists, I then delete it and replace it with the contents of the .zip file.
All of this I can do, except read the .zip without actually unzipping the file.
The .zip file can be upwards of 2G in size so I want to avoid unzipping, then reading the dir, then copying.
The reason I don't just unzip directly to the location and force an overwrite is that for some reason when using the CopyHere method to unzip, it ignores the switches that would normally force the overwrite and still prompts the user if they want to overwrite.
Code to unzip files:
Set objSA = CreateObject("Shell.Application")
Set objSource = objSA.NameSpace(pathToZipFile).Items ()
Set objTarget = objSA.NameSpace(extractTo)
objTarget.CopyHere objSource,4
Here is a similar question on SO.
How to list the contents of a .zip folder in c#?
I've used this library myself and it works well: http://dotnetzip.codeplex.com/. There is even a treeview example that appears to read the zip without extraction.
You will need the DLLs on the server, but I wouldn't say you have to install them. ;)
You can use For Each on your objSource object, for example:
Dim objSA, objSource, item
Set objSA = CreateObject("Shell.Application")
Set objSource = objSA.NameSpace(pathToZipFile).Items()
For Each item In objSource
    WScript.Echo item.Name
Next
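Building on that, here's a sketch of how the enumeration could drive the overwrite check. targetRoot is a placeholder for the directory you extract into; IsFolder and Name are standard FolderItem properties:

Set fso = CreateObject("Scripting.FileSystemObject")
targetRoot = "C:\ExtractTarget"   'placeholder extraction directory

For Each item In objSource
    If item.IsFolder Then
        If fso.FolderExists(targetRoot & "\" & item.Name) Then
            fso.DeleteFolder targetRoot & "\" & item.Name, True   'force-delete the old copy
        End If
    End If
Next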
Assuming that you can use an external application, try downloading 7-Zip and then have your script execute it with the l (list) command. This should give you some output that you should be able to parse in some way.
Sample from the help file: 7z l archive.zip
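From VBScript, one way to capture that listing is Exec on WScript.Shell. A rough sketch, assuming 7z.exe is on the PATH and reusing the pathToZipFile variable from the question:

Set sh = CreateObject("WScript.Shell")
Set proc = sh.Exec("7z l """ & pathToZipFile & """")
listing = proc.StdOut.ReadAll()   'full text listing of the archive
WScript.Echo listing              'parse this to pull out the top-level folder name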
I'm not sure if it is possible to read the contents of a zip without extracting it.
If you are just trying to avoid a time-consuming copy operation on the data, you could try unzipping to a temp directory and then using a "move" function. A move is usually less time-consuming than a copy because, when source and destination are on the same volume, it doesn't actually re-write the data on the disk; it just updates the file system to point at where the data is.
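As a rough sketch of that idea (the paths are placeholders; a move only updates file-system pointers when source and destination are on the same drive):

Set fso = CreateObject("Scripting.FileSystemObject")
tempDir = "C:\ExtractTemp\ParentDir"   'where the zip was just extracted
finalDir = "C:\Data\ParentDir"         'where the data should end up

'... extract the zip into C:\ExtractTemp with CopyHere, as in the question ...

If fso.FolderExists(finalDir) Then fso.DeleteFolder finalDir, True
fso.MoveFolder tempDir, finalDir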
Ok, this is my problem.
I'm writing a logon script that basically copies Microsoft Word templates from a server path to a local path on each computer. This is done using a check for group membership.
If MemberOf(ObjGroupDict, "g_group1") Then
oShell.Run "%comspec% /c %LOGONSERVER%\SYSVOL\mydomain.com\scripts\ROBOCOPY \\server\Templates\Group1\OFFICE2003\ " & TemplateFolder & "\" & " * /E /XO", 0, True
End If
Previously I used the /MIR switch of robocopy, which is excellent.
But if a user is a member of more than one group, the /MIR switch removes the content from the first group, since it is mirroring the content from the second group. Meaning, I can't have both contents.
This is "solved" by not using the /MIR switch and just let the content get copied anyway.
BUT the whole idea of having the templates on a server is that I can control the content the users receive through the script. So if I delete a file or folder from the server path, this doesn't replicate on the local computer, since I don't use the /MIR switch anymore. Comprende?
So, what do I do?
I did a small script that basically checks the folders and files and then removes them accordingly, but this actually ended up being the same functionality as the /MIR switch anyway. How do I solve this problem?
Edit: I've found that what I actually need is a routine that scans my local template folder for files and folders and checks if the same structure exists in any of the source template folders.
The server template folders are set up like this:
\\fileserver\templates\group1\
\\fileserver\templates\group2\
\\fileserver\templates\group3\
\\fileserver\templates\group4\
\\fileserver\templates\group5\
\\fileserver\templates\group6\
And the script that does the copying is structured like this (pseudo):
If User is MemberOf (group1) Then
RoboCopy.exe \\fileserver\templates\group1\ c:\templates\workgroup *.* /E /XO
End if
If User is MemberOf (group2) Then
RoboCopy.exe \\fileserver\templates\group2\ c:\templates\workgroup *.* /E /XO
End if
If User is MemberOf (group3) Then
RoboCopy.exe \\fileserver\templates\group3\ c:\templates\workgroup *.* /E /XO
End if
Etc etc
With the /E switch, I make sure it copies subfolders as well. And the /XO switch only copies files and folders that are newer than those in my local path.
But it doesn't consider whether the local path contains files or folders that don't exist on the server template path.
So after the copying is done, I would like to check whether any of the files or folders in my c:\templates\workgroup actually exist in any of the sources, and if they don't, delete them from my local path. Something that could be combined with these membership checks, perhaps?
Using a lookup table
I'd suggest an approach that puts all templates into one common file server directory and use a lookup table to assign templates to groups.
The benefit would be that your templates would be guaranteed to be in sync; i.e., you wouldn't have to worry whether a template shared by groups A, B, and C is really the same in all the group-specific folders on your file server.
Another bonus is a maintainable configuration table which allows you to assign templates to groups without the need to make changes to your logon script.
The lookup table config file would look something like
group1;\templateA.dot;\templateA.dot
group2;\B\templateB.dot;\B\templateB.dot
group3;\B\C\templateC.dot;\templateC.dot
with column 1 listing your AD group names; column 2 the source path and column 3 the target path.
This would also allow for flattening your template folder on the client side.
In any case you can avoid having to maintain multiple copies of all your templates on your file server, and adding more groups or templates doesn't require touching your logon script, just the config file.
In your logon script you can iterate over all lines and copy the ones with matching groups:
Logon script code
open lookup table config file
For Each line In lookup table
    If MemberOf(ObjGroupDict, groupname_column_value) Then
        execute Robocopy templatename_column_value local_target
    End If
Next
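A concrete version of that loop could look roughly like the following. The lookup file path and the semicolon delimiter are assumptions; MemberOf, ObjGroupDict and oShell are the objects already used in the question, and robocopy is assumed to be reachable on the PATH:

Const ForReading = 1
Set fso = CreateObject("Scripting.FileSystemObject")
Set config = fso.OpenTextFile("\\fileserver\templates\lookup.txt", ForReading)

Do Until config.AtEndOfStream
    fields = Split(config.ReadLine, ";")   'group ; source path ; target path
    If UBound(fields) = 2 Then
        If MemberOf(ObjGroupDict, fields(0)) Then
            srcDir = "\\fileserver\templates" & fso.GetParentFolderName(fields(1))
            dstDir = "c:\templates\workgroup" & fso.GetParentFolderName(fields(2))
            oShell.Run "ROBOCOPY """ & srcDir & """ """ & dstDir & """ " & _
                       fso.GetFileName(fields(1)) & " /XO", 0, True
        End If
    End If
Loop
config.Close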
Removing old files on the client
Here's a script that removes files in the template directory on the user's machine that are not present in any of the group folders copied. For clarity, the code is at the end of this answer. Here's how to use the script in your current solution that doesn't use /MIR.
In the code for each group copied, add one additional call to ListFiles; this tracks the files copied from the server:
If User is MemberOf (group3) Then
    RoboCopy.exe \\fileserver\templates\group3\ c:\templates\workgroup *.* /E /XO
    ListFiles "\\fileserver\templates\group3\", userTemplates
End if
Do this for each group copied. (It is ok if the same template appears in more than one group.)
After all groups have been copied, you add this code block:
ListFiles "c:\templates\workgroup", toDelete
removeAllFrom toDelete, userTemplates
This lists all files in the user's local templates folder to toDelete. All the files just copied are then removed from that set, leaving just the files that were not copied from the server. We can then print the files to delete, and then actually delete them.
echoDictionary "deleting old user templates", toDelete
' deleteFiles c:\templates\workgroup", toDelete
The call to deleteFiles is commented out; it's probably wise to do a trial run first! The first argument to deleteFiles is the user's template directory, and it should not have a trailing slash.
With these changes in place, any files in the templates folder on the user's machine that were not copied from the server will be deleted, effectively providing multi-directory synchronization.
Now comes the script. The first block can be pasted to the top of your file, and the remainder at the bottom, to help avoid clutter.
' script to remove files not present in one of the group folders on the file server
Set fs = CreateObject("Scripting.FileSystemObject")
Set userTemplates = CreateObject("Scripting.Dictionary")
userTemplates.CompareMode = 1
Set toDelete = CreateObject("Scripting.Dictionary")
toDelete.CompareMode = 1
' -- under here are just procedures, so they can go at
' -- the bottom of your script if desired
Sub deleteFiles(basedir, dictionary)
    Dim key, path
    For Each key In dictionary.Keys
        path = basedir & "\" & key
        If Right(key, 1) = "\" Then
            ' folder entries end with "\" (see ListSubFolders below); strip it for the FSO calls
            path = Left(path, Len(path) - 1)
            If fs.FolderExists(path) Then fs.DeleteFolder path
        ElseIf fs.FileExists(path) Then
            fs.DeleteFile path
        End If
    Next
End Sub

Sub echoDictionary(msg, dictionary)
    For Each key In dictionary.Keys
        WScript.Echo msg & ": " & key
    Next
End Sub

Sub removeAllFrom(target, toRemove)
    For Each key In toRemove.Keys
        If target.Exists(key) Then
            target.Remove key
        End If
    Next
End Sub

Sub ListFiles(folderName, dictionary)
    Set folder = fs.GetFolder(folderName)
    ListSubFolders folder, "", dictionary
End Sub

Sub ListSubFolders(folder, prefix, dictionary)
    For Each file In folder.Files
        qualifiedName = prefix & file.Name
        dictionary.Add qualifiedName, file
    Next
    For Each Subfolder In folder.SubFolders
        qualifiedName = prefix & Subfolder.Name & "\"
        ListSubFolders Subfolder, qualifiedName, dictionary
        dictionary.Add qualifiedName, Subfolder   'folder keys end with "\"
    Next
End Sub