Why does some software require a system restart on Windows?
Meanwhile, I have never encountered such a situation in Linux-based distros.
It is innate to the way Windows was designed. Loading an executable doesn't load the file into RAM. Windows creates a memory-mapped file for the executable instead. Chunks of the program get loaded into RAM on demand, only as needed: a page fault copies 4096 bytes from the file. The RAM pages themselves are not backed by the paging file. If RAM is needed for other processes then Windows simply unmaps the page and throws away the bytes it contains. If the process again lands on the page then a page fault reloads RAM from the file. Very efficient; this mattered a great deal when you needed to run a 32-bit operating system and many processes in only 16 megabytes of RAM. Still efficient today, but not as critical as it once was.
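You can observe the same demand-paging behavior from user code. Here is a minimal Python sketch (the file name is hypothetical, and the file is assumed to be non-empty): creating a memory mapping copies no data up front, and bytes are pulled in one page at a time only as they are touched.

import mmap

with open("bigfile.bin", "rb") as f:  # hypothetical large file
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        # Creating the mapping does no read I/O at all.
        # Touching a byte triggers a page fault that pulls in one
        # page (4096 bytes) from the file on demand.
        first = m[0]             # faults in the first page
        middle = m[len(m) // 2]  # faults in a page near the middle
        print(first, middle)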
One side-effect of the memory-mapped file is that it puts a write lock on the file. That's necessary to prevent another process from altering the executable. Otherwise it would be disastrous: RAM could end up containing a mix of old and new bytes from the file, which is guaranteed to make the program malfunction.
Of course that makes life harder for programs that intentionally want to change the executable, including the malicious variety, by the way. So the processes that have the file loaded must be stopped; that releases the write lock. An update delivered through Windows Update tends to touch executables that cannot easily be unloaded, since they are part of the operating system. That is why such updates tend to require a reboot: the file is updated as part of the boot sequence when the machine restarts.
One way to bypass the lock is to rename the file. The lock only protects the file data, not the directory entry. You can then create a new directory entry with the same name as the old one. And the next time the process gets started, it will use the new entry. One minor complication is that you have to eventually delete the renamed file.
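Here is a hedged sketch of that rename trick in Python (paths are hypothetical; imagine running it while app.exe is executing): the direct overwrite is refused because of the write lock, but renaming the directory entry and creating a fresh file under the old name both succeed.

import os
import shutil

APP = r"C:\MyApp\app.exe"      # hypothetical: the running, locked executable
NEW = r"C:\MyApp\app_new.exe"  # hypothetical: the freshly downloaded update

# Overwriting the running executable fails: the image mapping holds
# a write lock on the file data.
try:
    with open(APP, "r+b"):
        pass
except PermissionError as err:
    print("direct write refused:", err)

# Renaming succeeds: the lock protects the file data, not the directory entry.
os.rename(APP, APP + ".old")

# A new file can now be created under the original name; the running process
# keeps using the renamed (still locked) old file until it exits.
shutil.copyfile(NEW, APP)

# The leftover .old file must eventually be deleted, e.g. on the
# program's next start, once the old process has exited.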
One thing I can think of is that some software requires certain services to be running in order to work properly. The restart likely registers these services among the ones that start automatically when you boot the computer, so that the program can run smoothly.
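As an illustration only (the service name is hypothetical), an installer or an administrator can mark an existing Windows service to start automatically at boot with the built-in sc tool; note that the space after start= is required:

sc config MyService start= auto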
My Mac keeps alerting me about low disk space. When I check the system storage, it shows that 170+ GB is occupied by the system. I am not sure where my space is being used.
I tried a few cleaner tools as well, but they didn't help much.
Please help me resolve this.
After doing research on various Mac forums and Stack Exchange, I figured out that it's mostly because of the following:
Log files (possibly crash logs or Docker files)
Email messages stored in Outlook (almost ~20 GB in my case)
Core files written when the system restarts (~10 GB)
Docker images (~70 GB in my case)
Non-system documents, downloads, and iTunes files
So the question is how to find out which of these things are unnecessary and safe to delete, given that these system files are not directly visible.
I tried a few tools like CleanMyMac, but they were all paid, so I couldn't get much help there.
To clean up unnecessary non-system files, you can use macOS's built-in storage management tool directly: just click Optimize Storage and it will show all the non-system files.
To clean up unnecessary system files, use the command below:
sudo find -x / -type f -size +10G
This command will give you all the files occupying more than 10 GB. You can analyze the files and delete them as necessary.
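If nothing that large turns up, a complementary approach (a sketch; it assumes your sort supports the -h flag for human-readable sizes) is to look at per-directory totals rather than individual files:

sudo du -x -h -d 2 / 2>/dev/null | sort -h | tail -25

This stays on the root filesystem (-x), summarizes two directory levels deep, and lists the 25 largest entries last.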
The highlighted cores are nothing but state files your Mac uses to resume from its last state when it restarts, so they are safe to delete.
The next step is to delete the hidden tmp folder.
It will show its size as 0 bytes because your user doesn't have permission to read it, but it can occupy a huge amount of space. So delete it with root permissions.
Now, look for any Docker images present on your system and clean them all (Docker.raw).
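A sketch of one way to do that: Docker's own prune command removes stopped containers, unused networks, and dangling data, and the -a flag also removes images not referenced by any container. Whether the Docker.raw file itself shrinks afterwards depends on your Docker Desktop version; deleting Docker.raw outright resets Docker's storage entirely.

docker system prune -a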
Using all these steps, I was able to free up more than 100 GB.
I recently found that this issue was caused by a memory leak in one of the Java applications I was running. I had to open Activity Monitor, search for the Java processes, and Force Quit them; rinse and repeat every time my space runs out. Also, fix your code where you can to get rid of memory leaks.
I am trying to run my Chef/Kitchen tests, which use Vagrant/VirtualBox, on an ESXi VMware cluster guest to test cookbook integration. I will likely move to a VMware version for Kitchen/Vagrant and remove the VirtualBox portion at some point, but I would like to see this work first.
I have virtualization support turned on in the VM, so it does function, but it is excruciatingly slow. Where the full converge finishes in under 4 minutes on my desktop, it takes nearly 40 minutes in the VM. Checking performance on the VMware VM, everything seems acceptable, but the VirtualBox VM inside it shows very high Hardware Interrupt service time. All other metrics seem to be about average. Where HI rarely gets above 1-2 on normal systems, it is steadily >30 even while idle in the VirtualBox guest, and often above 50.
Any ideas on what to look for or magic settings I may have missed?
After reading the very helpful article linked by itfdev at https://egustafson.github.io/esxi-nested-virtualbox.html, I have found that what I want to do will likely always be slow, due to the passage quoted below:
Disk Performance
During my initial experimentation with nested VMs I observed a clear decrease in performance of the nested VM. My initial experimentation mostly only went as far as installing the OS on the nested VM. Installing an OS is generally a disk-intensive activity.
Disk virtualization is more expensive than most. Nesting virtualized disks will accumulate "virtualization debt" quicker than other virtualized components. The short, but rambling explanation goes something like this:
In my inner VM I write a block to "disk". This traverses the inner OS's file system code and is mapped to a sector on the inner VM's virtual block device. The write is then passed to the outer VM, traverses its file system code, and is mapped to the outer VM's virtual block device. Finally, the block is passed to the host (physical) file system, mapped through to a sector, and placed on the actual physical device. If your head is spinning now, it should be: that's three times the block is passed through file system code on its eventual path to a physical write.
This problem is understood in the virtualization community, and there are methods for avoiding differing degrees of the penalty based on the requirements of an installation. I will not cover these here. My point: if your nested VM’s strike you as slow, this may be a significant part of the why.
VirtualBox, running inside a virtual environment, can only use "software" virtualization. It's slow, of course, and it will consume a lot of processor time. ESXi on a physical host uses hardware acceleration (VT-x or similar), and its performance is close to that of the real host. You should not install a VM inside a VM.
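For reference, whether hardware-assisted virtualization is exposed to an ESXi guest at all (the setting the question says was already enabled) is controlled by a line in the guest's .vmx configuration file:

vhv.enable = "TRUE"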
I previously installed PostgreSQL 9.2 on my Mac using the EnterpriseDB installer. As such I had amended .bash_profile to read export PATH=/opt/local/lib/postgresql92/bin:$PATH, and everything was working just fine.
Then I had a hard drive corruption and had to reformat my computer and reinstall OSX. Initially I had to reinstall Snow Leopard (that's the version of the recovery discs I had), and then re-upgrade to Mountain Lion (which I was running prior to my crash). I then used a Time Machine backup with Migration Assistant to restore my Users, Applications, and "Other Files".
Looking around everything seemed to be back where it was before the crash. However, now when I try to do anything PostgreSQL-related, I get the error:
psql: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/tmp/.s.PGSQL.5432"?
Reading around online I found that this might be simply because my PostgreSQL server had not been started when I performed the system restoration. The official docs say to use the following command:
$ postgres -D /usr/local/pgsql/data
But I don't have a folder at /usr/local/pgsql; the only directory with data I can find is /Library/PostgreSQL/9.2/data. So I switched to the postgres user via sudo su postgres and ran postgres -D /Library/PostgreSQL/9.2/data, which gave:
2013-08-18 11:38:09 SGT FATAL: could not create shared memory segment: Invalid argument
2013-08-18 11:38:09 SGT DETAIL: Failed system call was shmget(key=5432001, size=32374784, 03600).
2013-08-18 11:38:09 SGT HINT: This error usually means that PostgreSQL's request for a shared memory segment exceeded your kernel's SHMMAX parameter. You can either reduce the request size or reconfigure the kernel with larger SHMMAX. To reduce the request size (currently 32374784 bytes), reduce PostgreSQL's shared memory usage, perhaps by reducing shared_buffers or max_connections.
If the request size is already small, it's possible that it is less than your kernel's SHMMIN parameter, in which case raising the request size or reconfiguring SHMMIN is called for.
The PostgreSQL documentation contains more information about shared memory configuration.
Where do I go from here? This whole thing is a bit strange; I don't remember ever having to start the server when I initially installed PostgreSQL...
EDIT: I also tried initdb -D /Library/PostgreSQL/9.2/data in case the db cluster was missing, but got:
initdb: directory "/Library/PostgreSQL/9.2/data" exists but is not empty
If you want to create a new database system, either remove or empty
the directory "/Library/PostgreSQL/9.2/data" or run initdb
with an argument other than "/Library/PostgreSQL/9.2/data".
So it should still be there, restored along with most of the other stuff on my system, right?
Your kernel parameters need tweaking. The relevant documentation is the PostgreSQL manual's chapter on kernel resources. Configure those according to your system specifications (memory) and you should be good.
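For instance, on OS X the System V shared memory limits can be raised at runtime with sysctl. The values below are only illustrative: they are sized to cover the ~32 MB request from your error message, kern.sysv.shmmax must be a multiple of 4096, and kern.sysv.shmall is measured in 4 kB pages (10240 pages × 4096 = 41943040 bytes).

sudo sysctl -w kern.sysv.shmmax=41943040
sudo sysctl -w kern.sysv.shmall=10240

To make the settings persist across reboots, put them in /etc/sysctl.conf; note that some OS X versions insist that all five kern.sysv.shm* parameters be set together there.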
As a footnote, I would like to add that from PostgreSQL 9.3 onwards the above tweaking is no longer necessary, because the server switched to mmap-based shared memory and only allocates a tiny System V segment.
This looks like a common question, but I was not able to find an answer to it. When we install Windows programs, what exactly happens? What files are copied where? What is written to the registry?
Most programs come with an installation program named Setup.exe or Install.exe. When you install a program, the installation program usually does the following:
Looks for a previous version of the program on your hard disk. If it finds a previous version, the program may ask whether you want to replace it.

Creates a folder in which to store the program files. Most installation programs ask where you'd like this folder. Some installation programs also create additional folders within this folder. Windows creates a folder named Program Files, usually in C:\ (if Windows is stored in a partition or drive other than C, the Program Files folder is usually in the same partition). We recommend you install all your programs in folders within the Program Files folder.

Note: some software vendors have the bad habit of installing application programs in locations other than your Program Files folder. You can't do much about this; the additional folders may clutter up your root folder, but they don't do any harm.

Copies the files onto your hard disk. If the program files are compressed, the installation program uncompresses them. Usually, the installation program copies most of the files into the program's folder, but it may also put some files into your C:\Windows, C:\Windows\System, or other folders.

Checks your system for the files and hardware it needs to run. For example, an Internet connection program might check for a modem.

Adds entries to the Windows Registry to tell Windows which types of files the program works with, where the program's files are stored, and other information about the program (a concrete sketch follows at the end of this answer).

Adds a command for the program to your Start | All Programs menu (some programs add submenus to the Start | All Programs menu to contain several commands). The installation program may also add a shortcut to your Windows desktop to make running the program easy for you. You can change the position of the program's command on the Start menu, get rid of the command, or create one if the installation program doesn't make one. You can also create a shortcut icon on the desktop, if the installation program hasn't done so, or move or delete the program's shortcut.

Asks you a series of questions to configure the program for your system. The program may ask you to type additional information, like Internet addresses, passwords, or software license numbers. It may also ask which users should be able to run the program.
Every installation program is different, because it comes with the application program, not with Windows. If your computer is connected to a LAN or to the Internet, the installation program may configure your program to connect to other computers on the network.
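To make the registry step concrete, here is a minimal Python sketch of the kind of entries an installer writes to register a file type. The extension, ProgID, and paths are all hypothetical; real installers usually write under HKEY_LOCAL_MACHINE for all users, while this example uses the per-user hive so it runs without admin rights.

import winreg

# Map the (hypothetical) .myext extension to a ProgID.
with winreg.CreateKey(winreg.HKEY_CURRENT_USER,
                      r"Software\Classes\.myext") as key:
    winreg.SetValueEx(key, "", 0, winreg.REG_SZ, "MyApp.Document")

# Tell Windows which command opens files of that type.
command = r"Software\Classes\MyApp.Document\shell\open\command"
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, command) as key:
    winreg.SetValueEx(key, "", 0, winreg.REG_SZ,
                      r'"C:\Program Files\MyApp\myapp.exe" "%1"')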
Problem:
I keep getting Hash: Element not found errors.
Technical Details:
uTorrent 3.2.3 (latest as of this writing)
Running about 30 Torrents (all downloading)
Win 7 64 bit
Dell N5050 :sigh:
Symptoms:
Force recheck is disabled (sometimes)
When I resume the torrent, as it halts when this happens, it proceeds smoothly until the next Hash: Element not found error
It doesn't happen at a particular percentage
Solutions Attempted:
Searched online a lot to find a few below
Re-download elsewhere. Set the download folder, change it, and re-download the torrent. NO! DOESN'T WORK! And it's FRUSTRATING that I had to DELETE my 90% downloaded torrent!!
Good 'ol thump. Swear at the screen making heavy fist thumps and hand gestures. Surprisingly, this doesn't work!
Force recheck. Doesn't help and sometimes not available.
Disk I/O errors. Came across an article which said this might be due to disk I/O errors.
Realized I was using a DELL laptop
Realized HDD had failed on a previous DELL
Tried Solution #2 again. Same results.
This seemed like the most likely explanation for the problem, hence I read articles about HDD checking and downloaded a few suggested tools to check HDD health
Interestingly, the HDD was A-OK
None of these worked!
I got this error when my hard drive ran out of disk space, so I think it is related to some file/disk access issue, depending on where you are writing to.
I was trying to download some large files to a network drive (Windows XP to Samba) and I was getting the same Element Not Found error.
In my case, enabling the disk cache solved the issue. I had to uncheck the Disable Windows caching of disk writes and Disable Windows caching of disk reads options under uTorrent Options -> Preferences -> Advanced -> Disk Cache (thereby enabling the cache).
Source: http://forum.utorrent.com/topic/34159-error-element-not-found/page-2#entry251137
I really think this question belongs on Super User, though.
The working solution turned out to be pretty simple.
Check your Anti-Virus!
My antivirus was quietly quarantining a few suspect files.
I added those files to the exclusion list.
All is well again.