I've got an application that's going to generate about 20,000 files. To keep it simple, I'm thinking of putting all these files in the same folder. Am I likely to run into performance issues doing so? My software targets Windows XP, Vista, and 7, as well as Mac OS X.
Any idea?
I know that Doxygen has a feature to break the generated files up into multiple directories because of slowness when accessing files in large directories under Windows. I don't know whether this affects just FAT, just NTFS, or both.
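If you want to hedge against this, the Doxygen-style workaround is easy to implement yourself: shard files into a fixed number of subdirectories based on a stable hash of the filename, so no single directory ever holds all 20,000 entries. A minimal sketch (the names and shard count are illustrative, not a recommendation for any particular filesystem):

```python
import hashlib
from pathlib import Path

def shard_path(base_dir: str, filename: str, shards: int = 256) -> Path:
    """Map a filename to one of `shards` subdirectories using a stable
    hash, so files are spread evenly instead of piling up in one folder."""
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) % shards
    return Path(base_dir) / f"{bucket:02x}" / filename
```

Because the hash is deterministic, the same filename always maps to the same subdirectory, so lookups stay O(1) without any index.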
Related
I'm now facing a problem: files with unknown extensions (such as .pxv and .cf) have appeared on my computer, and I don't know how many there are.
I don't need these files. Is there a command that will permanently delete every file with those extensions from my computer?
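One cautious way to do this is to script the deletion with a dry run first. Below is a minimal sketch in Python using pathlib; the extension set and root path are illustrative, and deleting files by extension across a whole machine is risky, so always inspect the dry-run listing before deleting anything:

```python
from pathlib import Path

def delete_by_extension(root: str, extensions: set[str],
                        dry_run: bool = True) -> list[Path]:
    """Recursively collect files whose suffix is in `extensions`.
    Deletes them only when dry_run is False; always returns the matches."""
    matched = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            matched.append(path)
            if not dry_run:
                path.unlink()
    return matched

# Preview first, then delete for real:
# delete_by_extension("C:/", {".pxv", ".cf"})                  # just lists
# delete_by_extension("C:/", {".pxv", ".cf"}, dry_run=False)   # deletes
```

The dry-run default means a careless call lists files instead of destroying them, which is the safer failure mode for an operation like this.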
We need to distribute lots of small JPG files to offline systems. Right now we send them as a 7z (or plain zip) archive, which is 800 MB (230K files), and use 7-Zip to unzip it. Unzipping takes about an hour on a fairly large four-core machine.
Is there a way on Windows 7 (or Windows Server 2008) to create and unpack a package of files of this size in a more reasonable time frame?
(I will entertain even far-out answers, such as putting this all in a single CloudDB database as binary blobs and shipping that to the target machine, or creating a VM or a virtual disk image - but I will need some pointers to tips on doing that sort of thing.)
So here's your far-out answer: ;)
The problem probably doesn't lie in computing power. The filesystem and/or hard disk are most likely the bottleneck.
For Windows 7 (and, as far as I know, Server 2008 as well) you could use a Virtual Hard Disk instead of zipping the files. Windows 7 has native support for VHD files and can present the contents as a drive or a subfolder via Disk Management, so there would be no need to unzip anything.
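For reference, a diskpart script along these lines can create, attach, and format such a VHD (paths, size in MB, and drive letter are illustrative; run it from an elevated prompt with `diskpart /s script.txt`):

```
create vdisk file="C:\images\photos.vhd" maximum=1024 type=expandable
select vdisk file="C:\images\photos.vhd"
attach vdisk
create partition primary
format fs=ntfs quick label="photos"
assign letter=V
```

Once the files are copied onto drive V:, you detach the VHD and ship the single .vhd file; the target machine just attaches it instead of spending an hour extracting an archive.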
I had the same problem, and solved it. The issue is likely the Windows Attachment Service, which subjects downloaded or attached zip files to additional scrutiny for security reasons.
To bypass this:
Right-click the file
Choose Properties
Check Unblock
For more info, see: Why is WinZip slow?
I spoke to some colleagues, and they may have an easier solution. Since the size is under 4 GB and I want read-only access, I can create an ISO image and then mount it on Windows 7 or Windows Server 2008 using this Microsoft utility:
This utility enables users of Windows XP, Windows Vista, and Windows 7 to mount ISO disk image files as virtual CD-ROM drives.
I find that when I install applications (.dmg files), they all end up in the folder /Volumes. Until a few days ago, they were installed into /Applications. I don't know whether my MacBook suffered some kind of attack or improper use, but in any case the behavior changed, and I don't know how to fix it. Has anyone had this kind of problem? Does anyone know how to make my MacBook install applications into /Applications by default again, or how to specify the install folder when the default isn't the one I want?
You need to begin by understanding the difference between installing an application and running it from a disk image. The whole disk-image situation is understandably confusing for users, but considering that this is a programming Q&A, here are the important points.
Macintosh applications are stored in special directories with the extension .app. Because of the extension, these directories are recognized by the Finder and treated as special bundles, which are shown to the user as a single icon that cannot easily be opened further. (There are other types of bundles as well, but .app bundles deal specifically with applications.)
Because Macintosh applications are actually directories full of files and other directories, they cannot be downloaded as a single file through the Internet without some kind of packaging. Recently there has been a move to package these in standard zip files, because they are understood well by many platforms. For many years prior to this, though, Macintosh applications were distributed on mountable Disk Images (.dmg format files), which themselves were multi-file containers which could support a variety of files and directories.
The key problem in both cases is that applications, once downloaded, don't necessarily move themselves to the most obvious location (the Applications folder on the boot volume, where Apple-installed applications are stored). Zip files usually automatically decompress, but are left inside of your Downloads directory, and Disk Images are usually downloaded to the Downloads directory and then mounted on the desktop, showing up as a new volume under /Volumes and appearing in the Finder as a disk.
In most cases, applications can be run from any of these locations, leading to the particularly confusing situation of:
Download a disk image
Disk image file goes to the Downloads folder
OS X mounts the disk image
User runs the application just fine from the disk image
User reboots the Mac
Application appears to have disappeared
In this case, the application isn't gone, but the disk image was unmounted by the reboot, so it isn't obvious to most users where the application went.
The most straightforward solution for users is to copy the applications to their Applications folder in order to make sure it is easy to find.
Obscure note: this works well for disk images (which can subsequently be deleted), but may cause some confusion for applications decompressed from zip files if the application was downloaded to a disk other than the boot volume. In this case, copying the application may leave you with two copies: one in the download location and one in the Applications folder. This can be very confusing if you delete the application, as the Finder will still find the copy in the Downloads folder. It can also be confusing when you manually download an update, as that may leave multiple copies of the application in your Downloads folder, usually named "My app", "My app 1", "My app 2", and so on.
I am using Delphi 7 on Windows 7 and have an application which uses a TFileListBox component to iterate through files in a directory and copies them to a backup directory.
I'm experiencing some strange behavior whereby the TFileListBox is detecting files which do not exist in the directory. The directory I am copying from contains 75 files, but the TFileListBox detects over 100.
I changed my explorer settings to display hidden/system files but still cannot see where these extra files are coming from.
I was wondering whether Windows 7's Previous Versions feature was playing a part in this problem, as I am fairly sure that the extra files the TFileListBox is detecting once resided in this directory but were deleted...
Any help on this would be much appreciated.
We have worked out from the comments above that the issue is related to the Virtual Store, which is used when your application is virtualized. The Virtual Store was introduced with Windows Vista as part of the move to running applications without administrator rights. These files are appearing in the Virtual Store because your application is writing to the Program Files directory, to which standard users do not have write privileges.
Virtualization was introduced to help deal with legacy applications that were not going to be recompiled to take account of the new Vista policies. Nowadays you simply should not be building a virtualized application.
You can disable virtualization by linking an application manifest to your application that includes the <requestedExecutionLevel level="asInvoker"/> section.
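For reference, a minimal application manifest containing that section looks roughly like this (the assembly name and version are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <assemblyIdentity version="1.0.0.0" name="MyApp" type="win32"/>
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>
```

With this manifest linked in, Windows treats the application as Vista-aware and stops redirecting its writes to the Virtual Store, which is why the problems mentioned below then surface.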
When you do this, you will no doubt find some other problems because your application may attempt to write to the program files directory, the HKLM section of the registry, etc. Whilst it may seem painful to make these changes, they are worth the effort.
What would be the negative effects of installing a legacy 32-bit app into C:\Program Files instead of C:\Program Files (x86)?
It could cause a problem based on what your application does.
For example, if your app queries for the Program Files folder, the WOW emulation layer will return Program Files (x86). Thus if you're trying to find things relative to where you're installed, you'll fail.
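To make that failure mode concrete: under WOW64 a 32-bit process sees %ProgramFiles% pointing at the (x86) folder, while %ProgramW6432% names the 64-bit one. The sketch below passes the environment in as a plain mapping so the logic can be demonstrated anywhere; the environment values shown are illustrative of a 64-bit Windows machine:

```python
def install_dir(env: dict[str, str], app_name: str) -> str:
    """Where an installer or app that naively joins %ProgramFiles% with
    its own name will look -- for a 32-bit process under WOW64, this is
    always the (x86) folder."""
    return env.get("ProgramFiles", r"C:\Program Files") + "\\" + app_name

# Environment as seen by a 32-bit process on 64-bit Windows (illustrative):
wow64_env = {"ProgramFiles": r"C:\Program Files (x86)",
             "ProgramW6432": r"C:\Program Files"}

# If the app was actually placed in C:\Program Files\MyApp, this lookup
# resolves to the (x86) folder instead, and relative file access fails:
print(install_dir(wow64_env, "MyApp"))  # C:\Program Files (x86)\MyApp
```

This is exactly the mismatch described above: the files live in one folder, but the path the process computes points at the other.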
I don't think it matters. You can run a 64-bit app from your Desktop, an external drive, etc., the same way you can run a 32-bit app. I think the difference is purely for organization.
Or, say you are developing both a 32-bit and a 64-bit version of an application: you could install both and run them side by side by putting them in the separate Program Files folders.
None. I believe the two folders are there for organizational purposes only.
None whatsoever. I do it all the time, and have never encountered any ill effects. I believe it's purely organizational.
Probably none, if the application you want to install doesn't have a parallel 64-bit version.
If it does, and you decide to install it, by default (if it uses the same folder name) it will overwrite the existing 32-bit application.