Bash console crashed and 6GB storage space got used up - bash

I'm using Windows 10, and yesterday I installed bash.
I have no prior coding experience, and using bash was a bit of a nightmare for me. I just needed it to run a simple script that joins two images into one, for a batch of 800 pairs of images.
I made a lot of mistakes along the way, as I was adapting code found online.
In the process, some of the code produced an ever-growing output file, and I had to kill the console from Task Manager just to end the process. System resource usage would spike.
After I had killed the console a couple of times, I realized over 6GB of disk space had been used up.
The problem is I can't find any way of reclaiming that space, since I can't see what's using it. I've tried Disk Cleanup, but that didn't help.
How can I clear the 6GB of space used up by bad code and a prematurely ended bash console?

You have a couple of options:
You can run history in bash and work out from the command-line history where the runaway file was written.
Alternatively, run:
find / -size +5G -ctime 0
This searches the whole filesystem for files larger than 5GB that changed in the last 24 hours (an exact -size 6G match could miss the file if it isn't exactly that size, so a threshold is safer). Increase the -ctime value if you don't find the file in question.
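If neither of those turns up the culprit, a disk-usage sweep will show where the space went. A minimal sketch, assuming GNU du and sort (as in WSL bash); the starting directory and the head count are arbitrary:

du -ahx / 2>/dev/null | sort -rh | head -n 20

This lists the 20 largest files and directories on the root filesystem; -x keeps du from descending into other mounts.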

Related

How to monitor memory usage of all processes in Linux?

I'm developing a program running on embedded Linux (Debian Buster), and I've found that the program sometimes has performance issues. After some debugging, I suspect the issue might not be in my program: it seems the OS starts swapping, and my program gets swapped out to the filesystem.
To verify this, I used the code here, and it turns out my program occupies much less physical memory after about 500 seconds, which matches the hypothesis.
Now I want to find which process suddenly takes lots of memory at that point, but I don't know how.
Is there any way to keep monitoring the memory usage of all processes (or the top 10) on the system and dump it to a log file? Any tools or commands would be good.
Thanks.
I'm developing a program running on embedded Linux
It would be helpful if you could specify which embedded Linux you are working on; based on that, there are tools someone could suggest.
For Linux in general, you could use:
top -p [PID]
You can get the PID with:
ps [options]
I am not sure whether using the command line is a problem for you?
dump to a log file
You don't need touch for that: redirecting the output with > or >> creates the log file for you, and grep can filter the output down to the processes you care about.
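As a concrete illustration, a small shell loop can snapshot the top memory consumers periodically. A sketch assuming a procps-style ps; the interval and the mem.log file name are arbitrary:

# log the 10 biggest memory consumers every 10 seconds
while true; do
    date >> mem.log
    ps -eo pid,rss,vsz,comm --sort=-rss | head -n 11 >> mem.log
    sleep 10
done

Each snapshot records resident (RSS) and virtual (VSZ) sizes; comparing the snapshots around your 500-second mark should show which process grows.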

C drive free space drops very fast with no obvious reason

My operating system is Windows 10, and I have a problem with the free space dropping for no reason.
A couple of days ago I ran some Python code in a Jupyter notebook, and in the middle of execution my C drive ran out of space (there had been ~50 GB free). Since then, the C drive's free space changes significantly (it even shrinks to a few MBs) for no obvious reason.
I have since found some huge files in a PyCharm temporary directory and freed 47GB of space, but after a short time it ran out of space again (I am not even running any code anymore)!
When I restart, the free space gradually starts to increase, and again after some time it shrinks to a few GB or even MBs.
PS. I installed WinDirStat to show me disk-space statistics, and it shows 93 GB under this path: C:\ProgramData\Microsoft\Search\Data\Applications\Windows\Files\Windows.edb, but I can't open the Data folder in File Explorer, and its properties show 0 bytes.
Windows.edb is the index database of the Windows Search function. It holds data that speeds up searching the file system by indexing files. There are several guides on the internet about reducing its size. The radical way would be to delete it, but I do not recommend this. You would have to turn Windows Search off to do so:
net stop "Windows Search"
del %PROGRAMDATA%\Microsoft\Search\Data\Applications\Windows\Windows.edb
net start "Windows Search"
You wrote in your question that the file suddenly grew while your program was running, so presumably your program creates files that then get indexed. Those files should be set to not be indexed; you can do that for the folder where the files are created. If all of this fails, you can finally turn indexing off entirely, at the cost of slower Windows Search.

Is it possible to recover a running VBScript file, if the original file was already deleted?

I have a VBScript that runs continuously on my system to monitor a web page in Internet Explorer.
I permanently deleted this VBScript file from its original location by mistake. However, the script is still in RAM and is still running and monitoring the web page.
This script is very important to me, but I have lost it :(
I want to know if there is any way to recover the code of the VBScript file from the system's RAM or from a temporary file, as the script is still running.
I am not allowed to use any file recovery software, so please don't suggest installing third-party data recovery tools.
Try using the 'ADPlus.vbs' script from WinDbg:
1. http://msdn.microsoft.com/en-us/windows/hardware/hh852365
2. http://support.microsoft.com/kb/286350
As the code was still running, I followed the process below to recover it:
Go to Task Manager
Select the process and create a dump
Open an online dump analyser (www.osronline.com)
Upload the dump file
Download the dump analysis
The dump analysis gave me about 95% of the correct code; the code within some loops was distorted or changed. As the owner of the code, I was able to correct it.
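If you'd rather inspect the dump yourself instead of uploading it, a strings pass over the dump file can also recover readable script fragments. A minimal sketch, assuming GNU strings is available (e.g. via Cygwin or WSL); process.dmp and fragments.txt are placeholder names:

# -e l extracts UTF-16LE strings, which is how script source
# typically sits in a Windows process's memory
strings -e l process.dmp | grep -E "Sub |Function |Dim |Set " > fragments.txt

Expect the same kind of noise the HxD answer below mentions; the grep just narrows the haystack to lines that look like VBScript.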
Use HxD; it can view all the RAM of any running process on the fly. It is commonly used to hack running games and the like.
After locating your script, you may need to clear the alphanumeric mess interleaved with your code; Notepad++ and some regex knowledge can be useful for that.

Using START in a cmd file starting more than 2K processes

I tried to wrap a little command in a batch file to save myself from typing it all the time, but the result was a mess! I ended up with thousands of cmd processes and was unable to stop them with CTRL+C.
The command was quite simple: START iisreset
System: Win7 64-bit
Why is that happening?
EDIT:
With some help and additional tests, I can now say that the batch command START within a *.cmd file causes this mess. It opens a new command window from every window until the system gives out. Maybe you get lucky and hit CTRL-C at exactly the right moment, but that really takes luck. In any case I will not use this command in future, and the problem also does not seem to occur on all machines. (Read the comments for the full history of this.)
It works OK on Windows 7 Pro, 64-bit, but based on the other things you've tried, it looks like it might be a bug... You could try raising a bug report (although that seems like a non-trivial exercise).

Clear file cache to repeat performance testing

What tools or techniques can I use to remove cached file contents to prevent my performance results from being skewed? I believe I need to either completely clear, or selectively remove cached information about file and directory contents.
The application that I'm developing is a specialised compression utility, and is expected to do a lot of work reading and writing files that the operating system hasn't touched recently, and whose disk blocks are unlikely to be cached.
I wish to remove the variability I see in IO time when I repeat the task of profiling different strategies for doing the file processing work.
I'm primarily interested in solutions for Windows XP, as that is my main development machine, but I can also test using linux, and so am interested in answers for that environment too.
I tried SysInternals CacheSet, but clicking "Clear" doesn't result in a measurable increase (restoration to timing after a cold-boot) in the time to re-read files I've just read a few times.
Use Sysinternals' RAMMap app.
The Empty / Empty Standby List menu option will clear the Windows file cache.
For Windows XP, you should be able to clear the cache for a specific file by opening the file using CreateFile with the FILE_FLAG_NO_BUFFERING options and then closing the handle. This isn't documented, and I don't know if it works on later versions of Windows, but I used this long ago when writing test code to compare file compression libraries. I don't recall if read or write access affected this trick.
A command line utility can be found here
From the source:
EmptyStandbyList.exe is a command line tool for Windows (Vista and
above) that can empty:
process working sets,
the modified page list,
the standby lists (priorities 0 to 7), or
the priority 0 standby list only.
Usage:
EmptyStandbyList.exe workingsets|modifiedpagelist|standbylist|priority0standbylist
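For the cache-clearing scenario in this question, the standby lists are what hold cached file data, so (likely from an elevated prompt):

EmptyStandbyList.exe standbylist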
A quick search turns up these options for Linux:
Unmount and remount the partition holding the files
sync && echo 1 > /proc/sys/vm/drop_caches
(the write to drop_caches must be done as root; echo 3 instead of 1 also drops the dentry and inode caches)
#include <fcntl.h>
int posix_fadvise(int fd, off_t offset, off_t len, int advice);
with advice option POSIX_FADV_DONTNEED:
The specified data will not be accessed in the near future.
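If you'd rather not write C for that, GNU dd (coreutils 8.11 and later) exposes the same advice through its nocache flag. A sketch, with testfile.bin as a stand-in file name:

# advise the kernel to drop cached pages for the whole file;
# count=0 means nothing is actually copied
dd if=testfile.bin iflag=nocache count=0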
I've found one technique (other than rebooting) that seems to work:
Run a few copies of MemAlloc
With each one, allocate large chunks of memory a few times
Use Process Explorer to observe the System Cache size reducing to very low levels
Quit the MemAlloc programs
It isn't selective, though. Ideally I'd like to be able to clear just the portions of memory caching the disk blocks of the files I no longer want cached.
For a much better view of the Windows XP filesystem cache, try ATM by Tim Murgent: it lets you see both the filesystem cache Working Set size and the Standby List size in a more detailed and accurate view. For Windows XP you need the old version 1 of ATM, which is available for download here, since V2 and V3 require Server 2003, Vista, or higher.
You will observe that although Sysinternals CacheSet reduces the "Cache WS Min", the actual data continues to exist in the standby lists, from where it can be reused until it is replaced by something else. To force that replacement, use a tool such as MemAlloc, flushmem by Chad Austin, or Consume.exe from the Windows Server 2003 Resource Kit Tools.
As the question also asked about Linux, there is a related answer here.
The command line tool vmtouch allows for adding and removing files and directories from the system file cache, amongst other things.
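For example (the file name is just a placeholder):

# evict a file's pages from the page cache, then check residency
vmtouch -e big_test_file.dat
vmtouch big_test_file.dat

The second call reports how many of the file's pages are resident, so you can confirm the eviction before each timing run.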
There's a Windows API call, SetSystemFileCacheSize (https://learn.microsoft.com/en-us/windows/desktop/api/memoryapi/nf-memoryapi-setsystemfilecachesize), that can be used to flush the file system cache. It can also be used to limit the cache size to a very small value, which looks perfect for these kinds of tests.
