My operating system is Windows 10 and I have a problem with the free space dropping for no reason.
A couple of days ago I ran some Python code in a Jupyter notebook, and in the middle of the execution my C drive ran out of space (there had been ~50 GB free). Since then, the C drive's free space changes significantly (it even shrinks to a few MB) for no obvious reason.
Since then I have found some huge files in a PyCharm temporary directory and freed 47 GB of space, but after a short time it ran out of space again (I am not even running any code anymore)!
When I restart, the free space gradually starts to increase, and again after some time it shrinks to a few GB or even MB.
PS: I installed WinDirStat to show me the breakdown of the disk space, and it shows 93 GB under this path: C:\ProgramData\Microsoft\Search\Data\Applications\Windows\Files\Windows.edb, but I can't open the Data folder in File Explorer, and it shows 0 bytes when I open the folder properties.
Windows.edb is the index database of the Windows Search function. It provides data to speed up searching in the file system by indexing files. There are several guides on the internet about reducing its size. The radical way would be deleting it, but I do not recommend this. You would have to turn Windows Search off to do so:
net stop "Windows Search"
del %PROGRAMDATA%\Microsoft\Search\Data\Applications\Windows\Windows.edb
net start "Windows Search"
You wrote in your question that the file suddenly grew while your program was running. Perhaps your program creates files that then get indexed. Those files should be excluded from indexing; you can set that on the folder where the files are created. If all of this fails, you could finally turn indexing off, which slows down Windows Search.
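Independent of the search index, it can help to find out which files are actually eating the space, as WinDirStat did here. A generic sketch using only the Python standard library (the path you scan is just an example; adjust it to the suspect folder):

```python
import os

def largest_files(root, top=10):
    """Walk a directory tree and return the `top` biggest files as (size, path) pairs."""
    sizes = []
    for dirpath, _dirnames, filenames in os.walk(root, onerror=lambda e: None):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes.append((os.path.getsize(path), path))
            except OSError:
                pass  # permission denied, file vanished, etc.
    return sorted(sizes, reverse=True)[:top]

# Example (path is hypothetical; run as administrator to see protected folders):
# for size, path in largest_files(r"C:\ProgramData"):
#     print(f"{size / 2**30:6.2f} GiB  {path}")
```

Note that, as in the question, folders you lack permission to read will simply be skipped, so a size of 0 in such a listing does not prove the folder is empty.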
As explained by Dropbox, Smart Sync is a feature "that helps you save space on your hard drive. Access every file and folder in your Dropbox account from your computer, using virtually no hard drive space. ... With Smart Sync, content on your computer is available as either online-only, local, or in mixed state folders."
Last night and this morning, I moved a large quantity of files from an external disk into Dropbox folders on my MacBook (macOS Mojave version 10.14.4), then selected those Dropbox folders to be "online-only". The files synced with Dropbox in the cloud rather quickly -- I saw them appear in the local folders of a desktop computer that shares the Dropbox -- but the grey icons (for "online only") took a long time to display in Finder. (More than twenty hours later, two larger folders still show the blue icon, for "syncing", even though their contents have long since appeared on the other computer.)
With growing alarm, I watched as each new directory added to Dropbox ratcheted up the amount of space used on the MacBook to dangerous levels (93%) even as large directories marked as "online only" continued to sync to the Dropbox cloud. I could only restore available space by moving some content back to an external disk.
Confusingly, information about how much space really remained was inconsistent. df showed 58 GB available:
Filesystem 1G-blocks Used Available Capacity Mounted on
/dev/disk1s1 465 403 58 88% /
while About this Mac => Storage showed 232 GB available.
According to one source, "the Storage tab in About This Mac ... can be useful as it is the only guide to what types of data are taking up storage space, but when you want to know how much space is used or free on any volume or disk, use Disk Utility: it’s much more likely to be accurate." Confusingly, however, my Disk Utility displayed both results:
433.68 GB used, 3.95 GB on other volumes, 62.45 GB free
Capacity 500.07 GB, Available: 232 GB (169.55 GB purgeable), Used: 433 GB
As explained by Dropbox, "setting files to be online only will free up space on your hard drive within minutes (as long as your computer is online and able to sync to Dropbox). However: ... macOS 10.13 (High Sierra) uses ... APFS. With APFS, the operating system takes snapshots of the file system and available hard drive space. These snapshots may not update after you've used Smart Sync to set Dropbox files as online only. This means that hard drive space you freed up with Smart Sync may not be immediately reflected or available if this snapshot hasn't updated. This hard drive space should eventually be freed up by the OS, but the amount of time this will take can vary. This isn't a behavior specific to Dropbox, but instead the designed behavior of macOS." On APFS, the placeholders for "online-only files use a small amount of space on your hard drive to store information about the file, such as its name and size. This uses less space than the full file." Indeed, files marked as "online-only" continue to show their non-zero (online) sizes (e.g., with ls and os.path.getsize()) as if they were still available locally.
I gather this is a macOS (i.e., APFS) issue, not specific to Dropbox.
My question: If Disk Utility shows 232 GB "available" but only 62.45 GB "free", what are the consequences? Would bad things happen if I were to add another 100 GB of files to the disk?
I am of course reluctant to add more content than space free just "as an experiment" but see how this could happen unintentionally.
This helped me: https://www.cbackup.com/articles/dropbox-taking-up-space-on-mac-6688.hmtl.html#A1
Solution 4. Clear the Dropbox cache folder
Generally, there is a hidden folder containing the Dropbox cache, stored in your Dropbox root folder and named ".dropbox.cache". You can only see this folder if viewing hidden files and folders is enabled in the operating system.
If you delete a large number of files from Dropbox but your computer's hard drive does not reflect these deletions, the deleted files may still be held in the cache folder. You can manually clear the cache to reclaim some space on the hard drive by following the steps below:
Open the Finder and select Go to folder... from the Go menu.
A dialog box should appear. Now copy and paste the following line into the box and press the return key:
~/Dropbox/.dropbox.cache
This will take you directly to the Dropbox cache folder. Delete the files in your cache by dragging them out of the Dropbox cache folder and into your Trash.
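Before emptying it, it may be worth measuring how much space the cache actually holds. A small Python sketch (the path is the one from the steps above; the folder only exists if Dropbox is installed and your Dropbox folder is in the default location):

```python
import os

# Path from the steps above; adjust if your Dropbox folder lives elsewhere.
cache = os.path.expanduser("~/Dropbox/.dropbox.cache")

total = 0
for dirpath, _dirs, files in os.walk(cache):
    for name in files:
        try:
            total += os.path.getsize(os.path.join(dirpath, name))
        except OSError:
            pass  # a file may vanish while Dropbox is running

print(f"{total / 2**20:.1f} MiB in {cache}")
```

If the folder does not exist, `os.walk` simply yields nothing and the total is 0.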
I have just noticed that my C drive is getting full, whereas it still had 30 GB free space 3 days ago.
Given the last few days' activity, I can't find any reason for this.
Now I realize that my C free space is getting lower and lower even though there's no current activity on my PC (except that it's turned on).
Every 2 minutes, I lose approximately 100 MB of free space, even though I don't download anything.
I launched my antivirus and I have closed my internet connection in order to see if the free space would stop decreasing, but it continued decreasing at the same pace.
I checked the Task Manager and noticed there was a piece of software running which I think was named "One Drive setup.exe" (during the past weeks, I had many pop-up windows saying I had to update OneDrive, but there was a problem with the auto-update, etc., but I didn't care because I don't even know what OneDrive is and I don't think I use it). So I killed this running task.
I thought it had stopped the loss of free space (I even gained 100 MB), but the decrease started again.
Now I connected to Internet again.
I got 300 MB of free space back, and it has now seemed constant for 4 minutes. Maybe these little ups and downs are due to the antivirus scan currently running.
But what can explain the loss of 30 GB during the past 2 or 3 days?
Could it be Windows Update? How can I check this on Windows 10?
Could it be a virus or something bad?
Please answer quickly, I only have 17 GB left :-(
Thanks
Which version of Windows OS are you using?
Turn off/disable System Restore points; this way you will be able to recover some space. Other than that, use CCleaner (https://www.piriform.com/ccleaner/download) to clean your system.
I believe they released a patch for this. But you can also use the built-in Disk Cleanup tool (https://support.microsoft.com/en-us/help/4026616/windows-disk-cleanup-in-windows-10).
Also, uninstall OneDrive/Google Drive unless you actively use these services. OneDrive syncs files with the cloud so that you can use them offline.
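To pin down when exactly the space disappears (and correlate it with Windows Update, antivirus scans, etc.), one option is to log free space at intervals and watch the deltas. A minimal Python sketch using only the standard library (path, sample count, and interval are placeholders to adjust):

```python
import shutil
import time

def log_free_space(path="C:\\", samples=5, interval=120):
    """Sample free space on the drive containing `path` every `interval`
    seconds, print each reading, and return the list of readings in bytes."""
    readings = []
    for i in range(samples):
        free = shutil.disk_usage(path).free
        readings.append(free)
        print(f"{time.strftime('%H:%M:%S')}  free: {free / 2**20:,.0f} MiB")
        if i < samples - 1:
            time.sleep(interval)
    return readings
```

Leaving this running while the machine is otherwise idle would show whether the ~100 MB/2 min loss is steady or bursty, which narrows down the culprit.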
I am trying to install Linux on a computer that has Windows 7. The first step was shrinking the disk size, but Windows did not allow any reduction. Thus I followed a number of steps to disable "unmovable" files:
I disabled the Page File
I disabled hibernation
I disabled System Protection
After that nothing seemed to have changed so I checked the disk fragmentation and it was 11% fragmented. I have since then run at least 4 defrags and I have also defragged the free space using Defraggler.
As of now the disk looks like this
Right now, Windows refuses to shrink the partition by any amount (I imagine that the files at the end of the disk are the troublesome ones).
Coming from a Linux background, I am unsure what else needs to be done in order to shrink the partition.
Are you using the Windows Disk Management tool to do the shrink? Here's a link for that method.
https://www.howtogeek.com/howto/windows-vista/resize-a-partition-for-free-in-windows-vista/
Also make sure the recycle bin on that drive is empty.
I finally figured it out.
The easiest way is just to use a Live USB with GParted on it since that will allow you to move Windows protected files around (the windows OS is not loaded on the live distro).
If only defragmenting is needed, one can use Hiren's BootCD and the included Defraggler for the same purpose.
I had the same problem on Windows 10. Turns out it was antivirus software that was running on the machine that prevented defragmentation happen properly. I actually had to temporarily uninstall antivirus. After that, the Disk Management tool was able to correctly shrink the volume.
I created a backup disk image of my disk yesterday and the software told me to close all Windows programs to make sure the process finishes successfully.
I did that, but I was wondering what happens if some program writes to the disk nevertheless during the process. Windows 7 is a complex system, and surely various log files and such are written continuously (the disk has one partition, which contains the Windows installation too). How does the backup software handle the disk content changing during image creation?
What is the algorithm in this case?
Snapshotting, or "Shadow Copy" as Microsoft calls it; see Shadow Copy on Wikipedia.
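The copy-on-write idea behind snapshotting can be modeled in a few lines. This is a toy Python sketch of the general algorithm, not Microsoft's actual VSS implementation:

```python
class CowSnapshot:
    """Toy model of copy-on-write snapshotting.

    The live 'disk' keeps changing, but any block overwritten after the
    snapshot was taken is preserved first, so a reader of the snapshot
    always sees the disk exactly as it was at snapshot time.
    """

    def __init__(self, disk):
        self.disk = disk          # live blocks, mutated by the running system
        self.preserved = {}       # block index -> content at snapshot time

    def write(self, index, data):
        # Writes go through the snapshot layer: save the old block the
        # first time it is modified, then let the write proceed.
        if index not in self.preserved:
            self.preserved[index] = self.disk[index]
        self.disk[index] = data

    def read_snapshot(self, index):
        # The backup reads preserved blocks where they exist, live ones otherwise.
        return self.preserved.get(index, self.disk[index])


disk = ["block0", "block1", "block2"]
snap = CowSnapshot(disk)
snap.write(1, "changed-during-backup")    # the OS keeps writing during the backup
backup_image = [snap.read_snapshot(i) for i in range(len(disk))]
print(backup_image)   # ['block0', 'block1', 'block2'] -- the snapshot-time view
```

So the backup software reads a frozen, consistent view while the live system continues writing; only the blocks that actually change during the backup cost extra space.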
I have to write a bat script for a test scenario where the software that we are testing fails to write to file due to a disk full error. The test script must be automated, so that we can run it on overnight tests, for example. The test script must also work at different computers, so installing a software like a virtual machine wouldn't be the best solution in this case.
How can I simulate that error in a Windows environment?
You could try writing to a full floppy disk.
Edit:
In light of your edited question, you could set up a network share with no disk space quota and write to that. The error will then be produced regardless of the logged on user or machine.
For Windows XP or later:
This command can get the amount of free space for the c:\ drive:
for /f "usebackq tokens=1-5" %%A in (`dir c:\ ^| find "bytes free"`) do (
set FREE_SPACE=%%C
)
Replace c:\ with your drive, as needed.
You can then take some space away from this value so you have a little room to work with:
set /a FREE_SPACE=FREE_SPACE-1024
or however much space you want to keep free.
You can use the fsutil command to create a file to fill up the free space on the disk:
fsutil file createnew c:\spacehog.dat %FREE_SPACE%
Run your test, writing to the drive. After you write 1024 bytes or so you should run out of space.
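If the test machines are not all Windows, the same fill-the-disk trick can be sketched cross-platform in Python using only the standard library (the file name and the free-space margin are arbitrary choices; data is actually written, since a merely truncated file can be sparse and reserve no blocks on some filesystems):

```python
import os
import shutil

def fill_disk(path, keep_free=1024, chunk=1024 * 1024):
    """Grow a filler file at `path` until roughly `keep_free` bytes remain
    free on the containing filesystem. Returns the number of bytes written.
    Delete the file afterwards to release the space."""
    folder = os.path.dirname(os.path.abspath(path)) or "."
    written = 0
    with open(path, "wb") as f:
        while shutil.disk_usage(folder).free > keep_free + chunk:
            f.write(b"\0" * chunk)
            f.flush()
            os.fsync(f.fileno())   # force allocation so free space really drops
            written += chunk
    return written
```

As with the fsutil variant, run the test, then remove the filler file to restore the space.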
Download and install TrueCrypt. You can then create a virtual partition of whatever size you want (a couple of megabytes), mount it and then fill it with a couple of documents.
Best Option: Microsoft's consume program
Reasons:
It tests the system disk (vs a separate drive)
It's fast - run the program to fill the disk instantly, stop when no longer needed
It's easy - No creating and deleting files. No extra test partition hanging around. Installation is required, but you can use a simple command afterward.
It's scriptable
Steps:
Install the Windows Server 2003 Resource Kit Tools (Works fine on Windows 7)
cd "%ProgramFiles(x86)%\Windows Resource Kits\Tools" (or wherever it's installed)
consume.exe -disk-space
Command output:
C:\Program Files (x86)\Windows Resource Kits\Tools>consume.exe
Universal Resource Consumer - Just an innocent stress program, v 0.1.0
Copyright (c) 1998, 1999, Microsoft Corporation
consume RESOURCE [-time SECONDS]
RESOURCE can be one of the following:
-physical-memory
-page-file
-disk-space
-cpu-time
-kernel-pool
C:\Program Files (x86)\Windows Resource Kits\Tools>consume.exe -disk-space
Consume: Message: Total disk space: 96049 Mb
Consume: Message: Free disk space: 14705 Mb
Consume: Message: Free per user space: 14705 Mb
Consume: Message: Attempting to use: 14705 Mb
Consume: Message: Reattempting to use: 14705 Mb
Consume: Message: Sleeping ...
Other Options:
Windows 7 has a virtual hard drive feature. Basically do the following: Computer Management > Disk Management > Action Menu > Create VHD > Right click disk and Initialize > Right click
Generate large files (this should be instant) until your disk is full, using a shell command or the Dummy File Generator program. Another program: SpaceHog.
It might seem like a bit much, but one thing I can think of is to use a virtual machine, and set its virtual disk to just big enough to fit the OS on. Fill it with some garbage files to tip it over the edge, then run your program.
Create a secondary partition, fill it with junk and then run your program there.
You could setup a small ramdisk and write to that. See this page for some free ramdisk products.
Create a new user accout, set a quota for it, and use runas to run your app as that user. (not exactly the same as disk full, but should have similar consequences.)
The operating system will respond differently to its system drive filling up than to other drives filling up, and so your application will too, surely? Simply filling a drive irrespective of which physical medium is used isn't going to be an accurate test.
Can't you mock the file system event for a full disk? Why would you want to wait until the disk is full? Wouldn't you want to monitor disk space periodically and warn the user when the disk is within a percentage margin of filling up? Rather than waiting until the disk space situation is terminal, simply prevent your application from working until the issue is resolved; not doing so could affect any data IO and be unrecoverable!
If the test has to be a hard integration test, then automating a virtual machine, deploying the application, and then filling the remaining space with a recursive script is feasible.
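The periodic-check idea suggested above could look like this in Python (the percentage margin is an arbitrary example; `shutil.disk_usage` works on any mounted path):

```python
import shutil

def space_warning(path, margin_percent=10):
    """Return a warning string if free space on the filesystem containing
    `path` drops below `margin_percent` of its capacity, else None."""
    usage = shutil.disk_usage(path)
    free_percent = usage.free / usage.total * 100
    if free_percent < margin_percent:
        return f"only {free_percent:.1f}% free on {path}"
    return None
```

The application would call this periodically and refuse risky writes (or warn the user) once it returns a warning, instead of discovering the problem via a failed write.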
The best thing that works on every computer (as testing is not necessarily done on a dedicated machine) would be a ramdrive/ramdisk that could be set up on the fly.
So far I have only found a Virtual Disk SDK (see here) that could maybe be included in your build process.
A different idea: maybe your test computers could be set up to write to a shared network folder (that is full) mounted as a drive?
I have made a modification to the above script to make it compatible with Windows 7, essentially adding the switch "/-c" to the for statement. This removes the thousands separator, as fsutil does not accept it in the statement.
for /f "usebackq tokens=1-5" %%A in (`dir /-c d:\ ^| find "bytes free"`) do (set FREE_SPACE=%%C)
fsutil file createnew d:\largefile.txt %FREE_SPACE%
Use a very small iSCSI target.