Quicker way to create duplicate Virtual PC images?

I use Virtual PC to create fresh environments for testing my installer. But I must be doing something wrong, because a VPC image with Vista or XP inside takes around 15 GB of disk space (that includes VS2005/VS2008 installed in them).
To create a new copy for testing I copy and paste the folder that has the .vhd, .vmc and .vsv files inside. After using the new VPC image for testing I then delete that copied folder. This works, but copying 15 GB each time takes a long time. Is there some faster/more efficient approach?

Use differencing/undo disks. This means that when you shut down your VPC you'll be asked if you want to save changes; simply answer no and you'll be back to where you started.

Doesn't VirtualPC have a fake-write/snapshot mode? That way it should not write to your original disk at all unless you say so at the end of the session.
If it doesn't, you might seriously want to consider VMware or VirtualBox, as these do have this feature and it's REALLY useful for things like this.
Edit: it looks like VPC does have a feature like this called differencing disks. Have a look at this:
http://www.andrewconnell.com/blog/articles/UseVirtualPCsDifferencingDisksToYourAdvantage.aspx
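
If you'd rather script the setup than click through the wizard, note that later versions of Windows (7 and up) expose differencing-disk creation directly through the VirtDisk API; conceptually it's the same thing Virtual PC's wizard does. A minimal sketch, assuming hypothetical paths, and keeping in mind this is the OS API rather than anything Virtual PC 2007 itself ships:

    // Sketch: create a differencing VHD whose parent is an existing base image.
    // Writes go to the child; the parent stays pristine, so you can delete the
    // child after each test run and recreate it in seconds.
    // Assumes Windows 7+ (virtdisk.h); link against virtdisk.lib.
    #include <initguid.h>
    #include <windows.h>
    #include <virtdisk.h>
    #include <stdio.h>

    int wmain(void)
    {
        VIRTUAL_STORAGE_TYPE storageType;
        storageType.DeviceId = VIRTUAL_STORAGE_TYPE_DEVICE_VHD;
        storageType.VendorId = VIRTUAL_STORAGE_TYPE_VENDOR_MICROSOFT;

        CREATE_VIRTUAL_DISK_PARAMETERS params;
        ZeroMemory(&params, sizeof(params));
        params.Version = CREATE_VIRTUAL_DISK_VERSION_1;
        // ParentPath is what makes this a differencing disk.
        params.Version1.ParentPath = L"C:\\VPC\\base.vhd";   // placeholder

        HANDLE hDisk = INVALID_HANDLE_VALUE;
        DWORD rc = CreateVirtualDisk(&storageType,
                                     L"C:\\VPC\\test-child.vhd", // placeholder
                                     VIRTUAL_DISK_ACCESS_ALL,
                                     NULL,                       // default security
                                     CREATE_VIRTUAL_DISK_FLAG_NONE,
                                     0,
                                     &params,
                                     NULL,                       // synchronous
                                     &hDisk);
        if (rc != ERROR_SUCCESS) {
            fwprintf(stderr, L"CreateVirtualDisk failed: %lu\n", rc);
            return 1;
        }
        CloseHandle(hDisk);
        wprintf(L"Differencing disk created.\n");
        return 0;
    }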

VPC has a so-called undo disk: you create something similar to a restore point, and in VPC you can roll back to that version. Ideal for testing setups.

Sounds like you need to use differencing virtual hard disks rather than creating a new copy every time.
Instructions here

Another option: you can use Microsoft's ImageX to store VHDs in WIM format. If you have multiple images you are constantly reusing, this is an incredible way to manage VMs. I have a slew of Windows XP and 2003 images I keep in compressed WIM format.
You can capture the VMs by mounting them in Windows PE and then capturing them to a network drive.
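
ImageX is just the command-line front end for wimgapi.dll, so you can also call the library directly if you want to automate captures. A rough sketch, assuming the wimgapi.h/wimgapi.lib that ship with the Windows AIK (paths and drive letters are placeholders):

    // Sketch: capture a directory tree (e.g. a mounted VM volume) into a
    // compressed WIM using wimgapi.dll, the library behind ImageX.
    #include <windows.h>
    #include <wimgapi.h>
    #include <stdio.h>

    int wmain(void)
    {
        DWORD created = 0;
        HANDLE hWim = WIMCreateFile(L"D:\\images\\xp-base.wim",  // placeholder
                                    WIM_GENERIC_WRITE | WIM_GENERIC_READ,
                                    WIM_CREATE_ALWAYS,
                                    0,
                                    WIM_COMPRESS_LZX,   // maximum compression
                                    &created);
        if (!hWim) {
            fwprintf(stderr, L"WIMCreateFile failed: %lu\n", GetLastError());
            return 1;
        }

        // wimgapi needs a scratch directory for temporary files.
        WIMSetTemporaryPath(hWim, L"D:\\temp");              // placeholder

        // Capture the tree rooted at the mounted volume into the WIM.
        HANDLE hImage = WIMCaptureImage(hWim, L"V:\\", 0);   // placeholder
        if (!hImage) {
            fwprintf(stderr, L"WIMCaptureImage failed: %lu\n", GetLastError());
            WIMCloseHandle(hWim);
            return 1;
        }

        WIMCloseHandle(hImage);
        WIMCloseHandle(hWim);
        return 0;
    }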

Also, you mentioned copy & paste; this is not the best way to copy large amounts of data within Windows. At least use xcopy; robocopy is even faster.

Also, another option if you are looking to duplicate the images for use on other physical machines: you can convert the disk to a dynamically expanding disk, which will reduce the size of the virtual disk and make it easier to copy. This also allows for more rapid backups, which looks to be part of what your testing workflow does anyway. The downside is that dynamically expanding disks tend to be slightly slower, performance-wise, than fixed-size disks.
However, if all you are doing is using it for testing on the same machine, see the answers above. Differencing is the way to go.

Related

Unable to shrink partition size

I am trying to install Linux on a computer that has Windows 7. The first step was shrinking the disk size, but Windows did not allow any reduction. Thus I followed a number of steps to disable "unmovable" files:
I disabled the Page File
I disabled hibernation
I disabled System Protection
After that nothing seemed to have changed, so I checked the disk fragmentation: it was 11% fragmented. I have since run at least 4 defrags, and I have also defragged the free space using Defraggler.
As of now the disk looks like this (screenshot omitted; it showed a handful of files still sitting at the end of the volume).
Right now, Windows refuses to shrink the partition by any amount (I imagine that the files at the end of the disk are the troublesome ones).
Coming from a Linux background, I am unsure what else needs to be done in order to shrink the partition.
Are you using the Windows Disk Management tool to do the shrink? Here's a link for that method.
https://www.howtogeek.com/howto/windows-vista/resize-a-partition-for-free-in-windows-vista/
Also make sure the recycle bin on that drive is empty.
I finally figured it out.
The easiest way is just to use a live USB with GParted on it, since that will allow you to move Windows-protected files around (the Windows OS is not loaded under the live distro).
If only defragmenting is needed, you can use Hiren's Boot CD and the Defraggler included on it for the same purpose.
I had the same problem on Windows 10. It turned out that antivirus software running on the machine prevented defragmentation from happening properly. I actually had to temporarily uninstall the antivirus. After that, the Disk Management tool was able to correctly shrink the volume.

Mounting a stream encoder as a drive in Windows?

For a variety of reasons, revolving around the cost of copying and the travails of the Windows filesystem, I need to mount a stream encoder as a drive, so that incoming data can simply be blindly directed at this "drive", aggregated, and encoded, without the source program being any the wiser. This would be basically trivial on Linux, but seems to be an uphill struggle on Windows.
Specifically, I'd like to be able to "mount" a tar builder, which I know sounds odd, but there's a compelling reason for doing this. Is there a utility or library that deals with this? Perhaps an obscure part of the Windows API?
This looks promising... but it seems intended to mount folders or similar, rather than "devices." I do have control of where the data is written, so I can specify an arbitrary path.
Having experience with virtual drives (see our Virtual Storage product line), I can say that your task needs some redefining. As was said in the comments, drives (or rather, filesystems) in Windows are expected to actually behave as filesystems (unlike in the Unix world), and as such they must support certain read and enumeration operations, which is not something a write-only encoder target would naturally provide.
Probably the closest you can get is a virtual drive in memory whose contents are then passed to your application in some way. The user would drag the data to your drive, and on unmounting (or on some other command) the drive contents would be passed to the other program.
Several of our products can be used for your task (see CallbackDisk, Callback File System and SolFS OS Edition on Virtual Storage page), yet they are all commercial products. If you have a one-time or short-term task, you can build something for your use with a trial key.
There are free approaches to your task, namely Pismo File Mount and Dokan, but I don't know how well they fit.
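
To give an idea of the Dokan route: a user-mode program registers callbacks and can simply forward every write to the encoder instead of any real storage. A very rough skeleton, assuming Dokan 1.x's C API (callback names and signatures differ between Dokan versions, encode_bytes() is a hypothetical stand-in for your tar builder, and almost everything except WriteFile is stubbed out, so treat this as a shape, not a working mirror):

    // Skeleton: a Dokan 1.x user-mode "drive" that pipes every write into an
    // encoder instead of a disk. Link against dokan1.lib.
    #define WIN32_NO_STATUS
    #include <windows.h>
    #undef WIN32_NO_STATUS
    #include <ntstatus.h>
    #include <dokan.h>

    static void encode_bytes(LPCWSTR name, LPCVOID buf, DWORD len, LONGLONG off)
    {
        // Hypothetical: hand the data to your tar builder / stream encoder here.
    }

    static NTSTATUS DOKAN_CALLBACK MyCreateFile(LPCWSTR FileName,
        PDOKAN_IO_SECURITY_CONTEXT SecurityContext, ACCESS_MASK DesiredAccess,
        ULONG FileAttributes, ULONG ShareAccess, ULONG CreateDisposition,
        ULONG CreateOptions, PDOKAN_FILE_INFO DokanFileInfo)
    {
        return STATUS_SUCCESS;          // accept every open
    }

    static NTSTATUS DOKAN_CALLBACK MyWriteFile(LPCWSTR FileName, LPCVOID Buffer,
        DWORD NumberOfBytesToWrite, LPDWORD NumberOfBytesWritten,
        LONGLONG Offset, PDOKAN_FILE_INFO DokanFileInfo)
    {
        encode_bytes(FileName, Buffer, NumberOfBytesToWrite, Offset);
        *NumberOfBytesWritten = NumberOfBytesToWrite;
        return STATUS_SUCCESS;
    }

    int wmain(void)
    {
        DOKAN_OPTIONS options;
        DOKAN_OPERATIONS ops;
        ZeroMemory(&options, sizeof(options));
        ZeroMemory(&ops, sizeof(ops));

        options.Version = DOKAN_VERSION;
        options.MountPoint = L"T:\\";      // the fake "drive"

        ops.ZwCreateFile = MyCreateFile;   // everything else left NULL
        ops.WriteFile = MyWriteFile;

        // Blocks until the drive is unmounted.
        return DokanMain(&options, &ops);
    }

The catch, as noted above, is everything this skeleton does not implement: directory enumeration, file metadata, and reads, which Windows will ask for whether you like it or not.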

Change Journal for Blocks in Windows (NTFS)

I have written a backup tool that is able to backup files and images of volumes for Windows. To detect which files have changed I use the Windows Change Journal. I already use the shadow copy functionality to do a consistent copy of both the files and the volume images.
To detect which blocks have changed I currently use hashes. This means the whole volume has to be read once (to see which blocks have changed, hashes of all blocks have to be calculated; a sketch of this scan is below).
The backup tool integrated into Windows 7 is able to create incremental volume images without checking all blocks, but I wasn't able to find an API for a kind of block-level change journal.
Does anybody know how to access this information?
(I'm willing to dive deep into NTFS internals - even reading and parsing special files)
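
For concreteness, the brute-force scan I'm doing today is essentially this (a sketch; the 64 KB block size and the FNV-1a hash are stand-ins for my real choices, and raw volume access needs administrator rights):

    // Sketch of the brute-force approach: read the raw volume sequentially
    // and hash every block to detect changes between runs.
    #include <windows.h>
    #include <stdio.h>
    #include <stdint.h>

    #define BLOCK_SIZE (64 * 1024)

    static uint64_t fnv1a(const unsigned char *p, size_t n)
    {
        uint64_t h = 1469598103934665603ULL;
        for (size_t i = 0; i < n; i++) {
            h ^= p[i];
            h *= 1099511628211ULL;
        }
        return h;
    }

    int wmain(void)
    {
        // Opening \\.\C: gives raw, sector-aligned access to the volume.
        HANDLE hVol = CreateFileW(L"\\\\.\\C:", GENERIC_READ,
                                  FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                                  OPEN_EXISTING, 0, NULL);
        if (hVol == INVALID_HANDLE_VALUE) {
            fwprintf(stderr, L"open failed: %lu\n", GetLastError());
            return 1;
        }

        static unsigned char buf[BLOCK_SIZE];
        DWORD got;
        uint64_t block = 0;
        while (ReadFile(hVol, buf, BLOCK_SIZE, &got, NULL) && got > 0) {
            // A real tool would compare against the previous run's hash table.
            printf("block %llu: %016llx\n",
                   (unsigned long long)block++,
                   (unsigned long long)fnv1a(buf, got));
        }
        CloseHandle(hVol);
        return 0;
    }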
I don't think block-level change info is available anywhere. Most probably the Windows 7 integrated backup installs a file system filter driver, as some backup products and anti-virus software do. A filter driver can intercept all file system calls and in this way know which blocks changed (see the skeleton below). If you do this, you can basically build your own change journal that works at block level, but only for the files you are interested in.
I would really like to know a better answer myself here.
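
To show the shape of such a filter: a bare-bones minifilter that registers for IRP_MJ_WRITE and can see every write's offset and length, which is the raw material for a block-level change journal. This is kernel-mode code (WDK, fltKernel.h) with all the logging and deployment plumbing elided; a sketch, not a working driver:

    // Skeleton of a file system minifilter that observes writes.
    // Requires the WDK, an .inf, and driver signing to actually run.
    #include <fltKernel.h>

    static PFLT_FILTER gFilter;

    static FLT_PREOP_CALLBACK_STATUS FLTAPI
    PreWrite(PFLT_CALLBACK_DATA Data, PCFLT_RELATED_OBJECTS FltObjects,
             PVOID *CompletionContext)
    {
        UNREFERENCED_PARAMETER(FltObjects);
        UNREFERENCED_PARAMETER(CompletionContext);

        // Offset and length of the write: record these to know which
        // blocks of which file changed.
        DbgPrint("write at %I64d, %lu bytes\n",
                 Data->Iopb->Parameters.Write.ByteOffset.QuadPart,
                 Data->Iopb->Parameters.Write.Length);

        return FLT_PREOP_SUCCESS_NO_CALLBACK;
    }

    static NTSTATUS FLTAPI
    Unload(FLT_FILTER_UNLOAD_FLAGS Flags)
    {
        UNREFERENCED_PARAMETER(Flags);
        FltUnregisterFilter(gFilter);
        return STATUS_SUCCESS;
    }

    static const FLT_OPERATION_REGISTRATION Callbacks[] = {
        { IRP_MJ_WRITE, 0, PreWrite, NULL },
        { IRP_MJ_OPERATION_END }
    };

    static const FLT_REGISTRATION FilterRegistration = {
        sizeof(FLT_REGISTRATION),       // Size
        FLT_REGISTRATION_VERSION,       // Version
        0,                              // Flags
        NULL,                           // ContextRegistration
        Callbacks,                      // OperationRegistration
        Unload,                         // FilterUnloadCallback
        // remaining optional callbacks default to NULL
    };

    NTSTATUS DriverEntry(PDRIVER_OBJECT DriverObject, PUNICODE_STRING RegistryPath)
    {
        UNREFERENCED_PARAMETER(RegistryPath);
        NTSTATUS status = FltRegisterFilter(DriverObject, &FilterRegistration,
                                            &gFilter);
        if (NT_SUCCESS(status)) {
            status = FltStartFiltering(gFilter);
            if (!NT_SUCCESS(status))
                FltUnregisterFilter(gFilter);
        }
        return status;
    }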
When you say Windows Change Journal, I take it you are referring to the NTFS USN journal? It looks very much like the Windows 7 backup uses a combination of VSC and the NTFS USN journal to detect changes and create incremental images, much like you are already doing.
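
Reading the USN journal itself is just a couple of DeviceIoControl calls, if that helps; note that it tells you which files changed and why, not which blocks. A sketch (needs administrator rights; error handling trimmed):

    // Sketch: enumerate NTFS change-journal (USN) records on C: via
    // FSCTL_QUERY_USN_JOURNAL / FSCTL_READ_USN_JOURNAL.
    #include <windows.h>
    #include <winioctl.h>
    #include <stdio.h>

    int wmain(void)
    {
        HANDLE hVol = CreateFileW(L"\\\\.\\C:", GENERIC_READ,
                                  FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                                  OPEN_EXISTING, 0, NULL);
        if (hVol == INVALID_HANDLE_VALUE) {
            fwprintf(stderr, L"open failed: %lu\n", GetLastError());
            return 1;
        }

        USN_JOURNAL_DATA journal;
        DWORD bytes;
        if (!DeviceIoControl(hVol, FSCTL_QUERY_USN_JOURNAL, NULL, 0,
                             &journal, sizeof(journal), &bytes, NULL)) {
            fwprintf(stderr, L"query failed: %lu\n", GetLastError());
            return 1;
        }

        READ_USN_JOURNAL_DATA read = {0};
        read.StartUsn = journal.FirstUsn;   // or the USN saved at last backup
        read.ReasonMask = 0xFFFFFFFF;       // all change reasons
        read.UsnJournalID = journal.UsnJournalID;

        BYTE buf[64 * 1024];
        if (DeviceIoControl(hVol, FSCTL_READ_USN_JOURNAL, &read, sizeof(read),
                            buf, sizeof(buf), &bytes, NULL)) {
            // The first 8 bytes are the next USN to continue from.
            PUSN_RECORD rec = (PUSN_RECORD)(buf + sizeof(USN));
            while ((PBYTE)rec < buf + bytes) {
                PWSTR name = (PWSTR)((PBYTE)rec + rec->FileNameOffset);
                wprintf(L"%.*s  reason=0x%08lx\n",
                        (int)(rec->FileNameLength / sizeof(WCHAR)), name,
                        (unsigned long)rec->Reason);
                rec = (PUSN_RECORD)((PBYTE)rec + rec->RecordLength);
            }
        }
        CloseHandle(hVol);
        return 0;
    }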

Compiling code on an external drive

To make things easier when switching between machines (my workstation at the office and my personal laptop), I have thought about using an external hard drive to store my working directory. Specifically, I am looking at FireWire 800 drives (most are 5400 rpm with an 8 MB cache). What I am wondering is whether anyone has experience doing this with Visual Studio projects, and what sort of performance hit they saw.
It depends on the size of the project. The throughput is low and the latency is high, so you're going to get hit every which way, but due to the latency you'll be hit harder if you have a lot of little files rather than a few large ones.
Have you considered simply carrying around a Git or other distributed repository and updating the machine repositories as you move around? Then you can compile locally and treat the drive as a roving server. Since only changes will be moved across, it should be faster, and your code will be 'backed up' in more places.
If you forget the drive, or it breaks or is lost/stolen, you can still sit down at a PC and program: with no code missing if you're at the last PC you used, or with very little missing (which will come back later with a resync anyway).
And it's just a hop, skip, and a jump away from simply using the network to move the changes between the systems, if you don't want to carry the drive around later.
I use VMware and the virtual machines are on an external USB drive. Performance is fine. You might have some issues with the drive letter changing - not an issue if you use virtual machines.
Granted, I work in an industry where Personal Information and Intellectual Property are king, but I don't like that idea at all. That hard drive disappears and you have a big problem.
Why not Remote Desktop into the work machine?
EDIT Stipud Spelingg

Unmovable Files on Windows XP

When I defragment my XP machine I notice that there is a block of "Unmovable Files". Is there a file attribute I can use to make my own files unmovable?
Just to clarify, I want a way to programmatically tell Windows that a file that I create should be unmovable. Is this possible, and if so, how can I do it?
Thanks,
Terry
A lot of system files cannot be moved after the system boots, such as the page file and registry database files.
This utility runs before Windows boots to defragment those files. I have it set to run at every boot, and it works well for me on several machines.
Note that the very first time you boot up with this utility set to run, it may take several minutes to defrag. After that first run though, it finishes in just 3 or 4 seconds.
Edit0: To respond to your clarification - that link says Windows has marked the page file and registry files as open for exclusive access. So you should be able to do the same thing with the LockFile API call. However, that's not an attribute of the file itself; you'd have to actually run some background program that locks the file for exclusive access.
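
A minimal sketch of that background program, assuming all you want is to hold a file hostage while it runs (the path is a made-up placeholder):

    // Sketch: hold a file open with no sharing, and byte-range lock it too,
    // so other processes (a defragmenter included) can't get at it while
    // this program runs.
    #include <windows.h>
    #include <stdio.h>

    int wmain(void)
    {
        // Deny all sharing: no other process can open the file at all.
        HANDLE h = CreateFileW(L"C:\\data\\pinned.bin",       // placeholder
                               GENERIC_READ | GENERIC_WRITE,
                               0,                  // no FILE_SHARE_* flags
                               NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL,
                               NULL);
        if (h == INVALID_HANDLE_VALUE) {
            fwprintf(stderr, L"open failed: %lu\n", GetLastError());
            return 1;
        }

        // Belt and braces: also take an exclusive byte-range lock over the
        // whole file, like the page-file/registry exclusivity mentioned above.
        if (!LockFile(h, 0, 0, MAXDWORD, MAXDWORD))
            fwprintf(stderr, L"LockFile failed: %lu\n", GetLastError());

        wprintf(L"Holding the file; press Enter to release.\n");
        getchar();

        UnlockFile(h, 0, 0, MAXDWORD, MAXDWORD);
        CloseHandle(h);
        return 0;
    }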
There are no file attributes that you can place on your files to mark them as immovable. The only way (I think) to keep a file from being moved during defragmentation is to have some other process hold the file open (for read or write; I'm not even sure whether it needs to be opened in exclusive mode).
Quite frankly, I cannot think of a reason you'd want your files not to move, unless you have specific requirements about where on the disk platter your files reside. Defragmentation should generally lead to faster disk access, and that seems desirable in all cases :-)
This usually means that the file is in use by some process. If you're defragmenting, you'll likely see this with a lot of system files. If the file should legitimately be movable and is stuck (it's being held by a process that runs at startup but shouldn't be, for example), the most useful way of resolving the problem is to remove all permissions on the file, reboot, restore the permissions, and then get rid of the file/run the program that's trying to use it.
I suppose the ugly way is to have an application boot on startup, check every few seconds if defrag is running and if so open the file in exclusive mode.
This is really ugly and I don't recommend it unless there is no cleaner solution.
Terry, the answers all mention ways to deal with files that are unmovable during defragmentation. From your question it appears that you are in fact wanting to make your personal files unmovable. Can you please clarify what is appealing about making your files unmovable?
I assume you're using the defragger that comes with Windows. Some commercial ones, like Diskeeper, can move some of these files (usually system files). You can try their trial versions.
Contig might serve your purpose http://technet.microsoft.com/en-us/sysinternals/bb897428.aspx
I'm relatively certain I ran across some methods/attributes you could access programmatically to do exactly what you want. This was back in the NT4 days, though, and my memory isn't that good.
For a more complete solution, try Raxco's PerfectDisk. While it is a commercial product, it does a very good job and supports boot-time defrag of system files. The first defrag takes longer than, say, Diskeeper, but it's a single-pass defragger and supports defragging with very little free space left on the drive. Overall it's a much smarter defrag program than any other I've seen, and it supports systems of any size.
http://www.raxco.com/
First try to move (or delete) the files in Safe Mode. If you cannot, try to move (or delete) the files from Linux.
But be careful: if those are Windows system files, Windows will then fail to boot.
Some reasons why files can be unmovable: the file is too big, the file is open/in use, insufficient security privileges, the file is being accessed by another computer, and many other things.