rm -rf ~$* on macbook - what now? [closed] - macos

So I recently, by total mistake, ran the command:
rm -rf ~$*
So now my terminal shell looks crappy and all my files are gone. Great success.
My terminal shows the user as:
User%
How do I get it back to "User#Machine" format?

This isn't really an answer, just a bit to add onto the advice from @swa66.
I don't know what kind of Mac you have, but if you have one where you can pull the hard drive out yourself, you might want to consider doing that. There are numerous tools on the market that can recover deleted files and directories as long as you have not written over the data. If you can put a new bare drive into your Mac, you can install a fresh copy of macOS and your third-party apps, etc., as swa66 advised. Then purchase one of the reputable disk-recovery apps, attach your pulled-out drive via an external enclosure or dock (I like docks best), and proceed to recover your important files. It takes some work, and it requires a bit of expenditure if you don't already have a suitable bare drive, an external enclosure or dock, and the recovery software. But depending on the value of your lost data, it may be worth it to you. As swa66 said, drive-recovery services are extremely expensive, so if you have not overwritten your data with new data or repartitioned, you can have good success retrieving the most common file types yourself.
If you cannot pull your drive out but you have access to another Mac, there is the option of using target disk mode to access your drive from the other Mac, either to image the drive for later recovery attempts or for direct recovery - but make sure the recovery software supports target disk mode. If your lost data is important, be very careful what you do with your computer, to avoid overwriting the lost data. rm -rf does not actually overwrite or remove the data on the disk, so it is still there, but the locations of the files on the disk are now free to be overwritten by anything. For example, don't install recovery software onto the same drive you are trying to recover from.

Restore from backup
To get your files back, you have but one easy option: restore from backup.
Let's hope you made TimeMachine backups on a regular basis.
rm -rf on the command line removes files and directories recursively, no mercy, no second guesses, no second chances.
The ~$*: I'm unsure what it expanded to. $* in bash expands to the arguments given to the script, but since it likely expanded to nothing, you probably nuked the home directory ~ of the user that executed this, and everything in it that could be erased recursively is gone. That's typically far too much to still have a stable environment.
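A safe habit when you are unsure what a word will expand to: preview it with echo before handing it to a destructive command. A minimal sketch (exact expansion varies by shell; in some shells ~$* collapses straight to your home directory, which is presumably what happened here):

echo rm -rf ~$*    # prints the expanded command instead of running it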
So: restore from backup as your only simple option.
If you can't do that, there are two options left: start over, or recover (some) data.
Start over
Myself, I'd just rebuild the system from scratch if I didn't want to, or was unable to, restore a backup. It is the only way to be sure you have a stable system again, where directories like Desktop, Downloads, Library, etc. still exist with their proper permissions and contents.
Recover (some) data
If you stop using the system ASAP, there's a chance that some services might find some valuable data on your hard disk. No guarantees at all, so consider it a last resort at best. It will not restore your system to working condition, but it might recover some valuable data.
What to do if you want to keep this option open:
Stop using the system NOW and shut it down. Every write your system does to the hard disk potentially overwrites the data you might want to recover.
If you have a system with removable hard disks: most modern Macs are not easy to open, and swapping the hard disk yourself is not recommended for end users and is even likely to void the warranty, so take care! Replace the disk in the machine with a new one and start rebuilding on that new disk. Use the old disk only as the target of the recovery; never boot from it or otherwise write to it.
If you have a system without an easily removable hard disk, you'll have to stop using the system until the valuable data has been recovered. If you're going for the DIY path below, you will have to bring the system up in "target disk mode". See here for how to do that: https://support.apple.com/en-us/HT201462
You now have two options:
DIY: I honestly have never had any success with this in real cases, but you can find software that claims to do it for you. Obviously nothing is guaranteed, and the best you can hope for is to recover some of the valuable data files. This software is typically not cheap, but it is significantly cheaper than the next option.
Professional data recovery service: get the disk to the service of your choice. Expect this to be extremely expensive, with no guarantee of results.
Lessons learned
Every incident should allow for an after-the-fact point in time where you learn from the experience. Without trying to preach too much:
Be careful with rm -rf ... it is powerful
Make backups regularly. On macOS, Time Machine is easy and painless, and costs you next to nothing compared to this pain. Time Machine can back up to an external drive, an Apple Time Capsule, a partition on a NAS, ... If you leave it connected, you'll have hourly backups.
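As an aside, Time Machine can also be driven from the command line with the bundled tmutil tool; a minimal sketch (the volume name is a placeholder):

sudo tmutil setdestination /Volumes/MyBackupDrive    # choose the backup target
tmutil startbackup    # kick off a backup immediately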

Related

System Storage Taking Up Way Too Much Space in macOS Mojave [closed]

My Mac keeps alerting me about low disk space. When I check the storage, it shows that 170+ GB is occupied by "System". I am not sure where my space is getting used.
I tried a few cleaner tools as well, but they didn't help much.
Please help me resolve this.
After doing research on various Mac forums and Stack Exchange, I figured out that it's mostly because of the following:
Log files (possibly crash logs / Docker files)
Email messages stored in Outlook (in my case almost 20 GB)
Core files written when the system restarts (~10 GB)
Docker images (~70 GB in my case)
Your non-system documents / downloads / iTunes media
So the question is: how do you find which of these are unnecessary and safe to delete? These system files are not directly visible.
I tried a few tools like CleanMyMac, but they were all paid, so I couldn't get much help there either.
To clean up unnecessary non-system files, you can use the built-in macOS storage management tool directly. Just click Optimize Storage and it will show all the non-system files.
To clean up unnecessary system files, use the command below:
sudo find -x / -type f -size +10G
This command will list all files occupying more than 10 GB. You can analyze the files and delete them as necessary.
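If you want the sizes printed as you go, here is a variant of the same command (the 1 GB threshold is just an example):

sudo find -x / -type f -size +1G -exec ls -lh {} + 2>/dev/null    # list big files with human-readable sizes, ignoring permission errors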
The cores it finds are saved process/system state, written out when something crashes or the system restarts, so they are safe to delete.
The next step is to clean out a hidden tmp folder.
It will show a size of 0 bytes because your user won't have permission to read it, but it can be occupying a huge amount of space. Delete its contents with root permissions.
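On macOS the hidden tmp folder is normally /private/tmp (/tmp is a symlink to it) - verify this on your own machine before deleting anything. A sketch of checking its real size and clearing it:

sudo du -sh /private/tmp    # the real size, readable only as root
sudo rm -rf /private/tmp/*    # clear its contents (safest right after a reboot)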
Now look for any Docker images present on your system, and clean them all (Docker.raw).
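Docker Desktop keeps images and containers in one ever-growing file, typically Docker.raw under ~/Library/Containers/com.docker.docker (the exact path can vary by Docker version). A sketch for checking and shrinking it:

ls -lh ~/Library/Containers/com.docker.docker/Data/vms/0/data/Docker.raw    # how big has it grown?
docker system prune -a    # remove unused containers, images and networks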
Using all these steps I was able to free up more than 100 GB.
I recently found that this issue was caused by a memory leak in one of the Java applications I was running. I had to open Activity Monitor, search for Java processes, and Force Quit them. Rinse and repeat every time my space runs out. Also fix your code where you can to get rid of memory leaks.
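If you hit the same thing, you can also spot runaway Java processes from the terminal. A small sketch (my assumption about the mechanism: a leaking process forces macOS to grow the swap files under /private/var/vm, which is where the disk space goes):

ps aux | grep '[j]ava'    # list running Java processes with their PIDs
ls -lh /private/var/vm    # swap files live here and grow under memory pressure
kill <pid>    # replace <pid> with the offending process ID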

VS 2017 Installation failed

I installed VS 2017 on Windows 7. After some time I receive this error:
MSI: C:\ProgramData\Microsoft\VisualStudio\Packages\Microsoft.VisualStudio.MinShell.Msi,version=15.6.27421.1\Microsoft.VisualStudio.MinShell.Msi.msi, Properties: REBOOT=ReallySuppress ARPSYSTEMCOMPONENT=1 MSIFASTINSTALL="7" VSEXTUI="1" VS7.3643236F_FC70_11D3_A536_0090278A1BB8="G:\Program Files (x86)\Microsoft Visual Studio\2017\Community"
Return code: 1632
Return code details: The Temp folder is on a drive that is full or is inaccessible. Free up space on the drive or verify that you have write permission on the Temp folder.
Log
G:\TEMP\dd_setup_20180318121545_006_Microsoft.VisualStudio.MinShell.Msi.log
I have checked the G: drive where TEMP is located. It has 200 GB free.
BUT one strange thing: this folder and all other folders are read-only. I uncheck it in Properties, close the Properties dialog, open it again: it is read-only again.
I can modify the folder, and even the MSI installer could: it created the log file there. But in the middle of the installation the error occurs.
What is it and how can I solve this problem?
I ran it with logging:
Machine policy value 'DisableUserInstalls' is 0
SRSetRestorePoint skipped for this transaction.
Note: 1: 1336 2: 3 3: C:\Windows\Installer\
MainEngineThread is returning 1632
No System Restore sequence number for this installation.
User policy value 'DisableRollback' is 0
Machine policy value 'DisableRollback' is 0
Incrementing counter to disable shutdown. Counter after increment: 0
Note: 1: 1402 2: HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Installer\Rollback\Scripts 3: 2
Note: 1: 1402 2: HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Installer\InProgress 3: 2
Decrementing counter to disable shutdown. If counter >= 0, shutdown will be denied. Counter after decrement: -1
Restoring environment variables
Decrementing counter to disable shutdown. If counter >= 0, shutdown will be denied. Counter after decrement: -1
MainEngineThread is returning 1632
Disc Space Reclaiming - Quick Wins: too much to read? The essential options (arguably) are called out below.
Final Summary
This issue turned out to be caused by a redirected TEMP folder and a redirected C:\Windows\Installer cache folder - with the latter being on an unavailable drive.
Please be careful redirecting system folders, in particular C:\Windows\Installer. It is a super-hidden system folder and side-effects are very common.
You must make sure that the relocated folder has the same ACL permissions the original folder had. This is crucially important for security reasons. For one thing, the whole folder could be deleted by someone who does not understand what it is for - making all packages un-uninstallable and un-maintainable. There are also other security reasons.
Also: putting this folder on the network is not technically sound in my opinion - problems will result. A local drive is also problematic if drive letters change. Which brings me to the next point:
Lacking Space for your System SSD Drive?
If your real issue is lacking disk space on your system SSD drive, please consider some alternatives listed below. Proceed with care and at your own risk with every option. Most of them should be harmless.
Disc Space Visualizing: I have an ancient tool called SpaceMonger.exe which shows me a visual representation of whatever is taking up my disc space. Very useful. It seems this tool is no longer supported. Maybe check https://en.wikipedia.org/wiki/WinDirStat for a similar tool (untested by me - run it by virustotal.com).
DriverStore: And a word to the resident hacker in all computer guys: no, no - don't try to redirect %SystemRoot%\System32\DriverStore (!). "Seductive The Dark Side Is". "Run Forrest, Run!". "Careful With That Axe Eugene". Etc... You get the picture. Leaving out Monty Python allusions for now. Seriously: I do not know what low-level stuff could be involved in the boot process. One would have to ask Raymond Chen, but don't. He has important things to do. However: pnputil.exe, DriverStore Explorer - your own risk. Don't do it :-).
Overall Suggestions
UPDATE: For laptops I like to use a high-capacity, low-profile USB flash drive and / or a high-capacity SD card permanently sitting in a port to hold my downloads and installers, VS help files, maybe even source code (riskier). An obvious, but somewhat "clunky" option.
One can combine this drive with the Library feature in Windows Explorer to show the flash drive under whatever library you want (Downloads, Videos, Pictures, Source, etc...).
My preferred desktop disc cleanup options below would be: 7, 19, 2, 18, 1, 6, 11, 12 (in that order).
Preferred options for laptops: 7, 19, 2, 18, 6, 10 (reduce max cache sizes), 15, 17, 3 (in that order).
The real-world approach for me is a slightly different order: 2 (purge obsolete Windows Updates - this may also trim WinSxS, but I am not positive), then 19 (uninstall unnecessary software - can be relatively quick). Then I run SpaceMonger.exe to find space hogs and move them - this often involves zapping the Downloads folder (7) and purging, moving or clouding media files (Pictures, Videos, Music). Then 6 for developer PCs (trimming Visual Studio and uninstalling useless SDKs and help files), 9 (eliminate hibernation - not great for laptops), and 18 (enable compression - can take forever). Finally I might zap the recovery partitions (laptops) and create a new partition in their place to allow data files to be stored there (freeing up system-partition space). This zapping is a high-risk operation, obviously - very error-prone (especially if inexperienced users use the diskpart command-line tool or a Linux live-boot tool, both described below). And obviously verify that you have installation media AND a valid license key before wiping out recovery partitions - it has to be mentioned. The data files I move are usually: the source code repository, the Downloads folder, the Outlook PST file, images and videos, etc... This procedure should reclaim many gigabytes of disc space. Don't do it for fun, though - but the risk should be acceptable for most of these options (barring the recovery-partition zapping, which is relatively simple to do but error-prone).
Cleanup Options
Apply healthy skepticism to these options. Not all of them are terribly useful in many cases - I am just attempting to mention all kinds of tweaks. Potential easy, big wins without much configuration and fiddling could be 2, 6, 7, 9 and 18. Options 2 and 18 are almost always time consuming but very effective: maybe hours for option 2 (especially on Windows 7 & 8 - do not abort it while it is running), and even longer for option 18 on a large computer or a slow disk (but that operation can be cancelled).
Option 0, Cloud Storage, is an implied overall option in this day and age: OneDrive, Google Drive, Dropbox, etc... Download data files on demand.
1. My Documents: It is generally much better to move user data folders to a network location or another, local drive (best) than to redirect system folders! Few system entanglements.
I wouldn't move the desktop or the other folders found here: HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders - I would move only "My Documents". Just right-click it in Windows Explorer, go to Properties, and use the Location tab there to help you move it. Be careful whilst doing this - a backup is in order first.
Pictures and Video might also be OK to move, but not the desktop or the other special folders - they may be involved in the boot or logon process (erroneous packages could cause that even for My Documents - nothing is without risk).
Streaming and media files from apps such as iTunes or similar can obviously totally hog a disc with limited capacity. I use SpaceMonger.exe to get an overview and then move the files somewhere else.
For computers with multiple users there will obviously be multiple "My Documents" folders to redirect.
2. Microsoft's Disk Cleanup Tool: Run cleanmgr.exe and select Clean up system files, as described here: https://serverfault.com/q/573208/20599 (top).
UPDATE Oct 2018: the "Downloads" folder is now a cleanup option - DO NOT ENABLE IT! It deletes the whole Downloads folder without asking. This issue appears to have been corrected as of Oct 2021.
You can now zap the uninstall data for applied Windows Updates - this can give you back several gigabytes on your system drive (5.36 GB on one of my machines). For Windows 7 I have seen dozens of gigabytes being purgeable.
This tool might also slim down and shrink the WinSxS directory (the Win32 side-by-side assembly folder). I am not 100% positive.
Obviously you can also remove unnecessary packages via Add / Remove Programs and remove system restore points (use the tool's More Options tab to access these features).
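Tip: cleanmgr can also be scripted, which is handy if you do this regularly. A sketch using its built-in switches (the number is an arbitrary settings-profile ID):

REM pick the cleanup categories once and store them under profile 1
cleanmgr /sageset:1
REM run that saved profile silently from then on
cleanmgr /sagerun:1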
3. Third Party Cleanup Applications: third-party tools such as CCleaner may be able to free even more space by wiping out cache files and temporary files for all kinds of applications and tools. This particular tool suffered a malware attack recently; use it at your own risk.
Personal opinion / suggestion: use only for test boxes or non-critical machines. The cleanup is quite awesome, but it also involves some risks (lost login passwords, lost system logs, etc...). Self-evident, but it should probably be mentioned.
My 2 cents: not a corporate solution, but may be fine for advanced home users who like to experiment and to keep their machines tuned.
4. Administrative Installations: for large MSI files, performing an administrative install prevents the caching of the whole MSI file in C:\Windows\Installer. You must then install from a proper network share so that files remain available for repair operations.
An administrative installation essentially extracts embedded CAB files from the MSI and allows the creation of a network installation point where all computers can pull files from instead of caching all files locally.
The generic method for running an administrative installation is msiexec /a File.msi. More details in the links below.
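For example (file name and target folder are placeholders; /qb just shows a basic progress bar):

msiexec /a "SomePackage.msi" TARGETDIR="D:\ExtractedPackages\SomePackage" /qb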
Extract MSI from EXE
What is the purpose of administrative installation initiated using msiexec /a?
How can I eliminate the huge, cached MSI files in C:\Windows\Installer?
There is a whole lot of installer caching going on - it is a little out of hand if you ask me.
5. Mounted Drives: some people dabble with mounting external drives as folders on their system drive. In other words, another drive shows up as a regular folder on your system drive and functions as such (sample).
This I have no experience with, and I have doubts about its reliability over time. For all I know it might actually be better than several other options if you do it right (and never take out the physical drive).
I would do data file folders only (not settings folders, or core OS folders such as the desktop). Maybe for source control folders. If the link breaks, the data should still be safe and the system can still boot (and the link re-established).
UPDATE: Windows Explorer's "Include in library" feature is an alternative (do have a peek). I like to create a "Source Code Library" with included folders from here and there.
6. Visual Studio: the obvious cleanup options for Visual Studio (for completeness):
If you have downloaded MSDN help locally (Help => Add and Remove Help Content), remove items as appropriate and rely on online help instead, or change the Local store path towards the bottom of that dialog to keep the content on another drive.
Or, if you have several versions of SDKs you do not need, or Visual Studio features you do not need, get rid of them (in Visual Studio: Tools => Get Tools and Features... - remove unnecessary features; I often use the Individual Components view).
7. Downloads Folder: I am sure I have forgotten many viable ways to gain some more workspace without wrecking your box. One would be to clean out your Downloads folder and move all installers to a network location - this might be the biggest save of all for some people.
This also works great for laptops - it is just about the first thing I would do for a laptop with little disc space. If you will not have access to your network share of installers - for example whilst traveling - then just use a thumb drive or external hard drive to hold your installers and ISO files.
For computers with multiple users there will obviously be multiple Downloads folders, each potentially full of stuff. Use a disk space visualizer to see (see the link at the top of this list).
8. Page File: some people move the system page file (pagefile.sys) from the system drive to another drive. Back in the day this left me with an unbootable system, but perhaps things are better now. It is not the first thing I would do, though - this is very core OS stuff.
Obviously impossible for a laptop with only one drive (unless you erase the recovery partition and create a real, visible partition in its place).
I find this option risky; maybe I should have put it in the "dis-honorable mentions" part below.
Be careful. Maybe the "last known good" feature or System Restore can help you if you run into problems?
9. Hibernation File: the hibernation file (hiberfil.sys) on Windows lives on the system drive, and I am not aware of any way to move it elsewhere, for fundamental technical reasons. However, you can disable hibernation to get rid of the whole file. This will free up a few gigabytes on a modern computer.
You obviously lose the ability to put your machine into hibernation (memory dumped to disk), but sleep mode (low-power use mode / standby) should still be available.
Hibernation may be more desirable to keep enabled on laptops (with it disabled, a laptop whose battery runs out whilst traveling cannot auto-hibernate, and you could lose data).
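Disabling hibernation is a one-liner from an elevated command prompt:

REM deletes hiberfil.sys; run 'powercfg /hibernate on' to get it back
powercfg /hibernate off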
10. Application Temp & Cache Folders: the above-mentioned CCleaner can wipe out a lot of temporary files for various applications (though I don't really recommend it - I use cleanmgr.exe instead, and CCleaner only for test boxes).
Web browsers (Firefox, Opera, Vivaldi, Chrome, IE, Edge, Safari, etc...) can also spam the disk with lots of cache files and downloaded junk. It is possible to redirect all of these folders, though I prefer to cap them at an acceptable maximum size.
Plenty of other applications, of all kinds, leave trash on the system over time. Some of which can be cleaned with CCleaner mentioned above (or another such tool). Again not a tool recommendation. Use the cleanup features inside the application itself if available.
For computers with multiple users there will obviously be multiple cache folders to restrict and clean.
11. Special Data-Heavy Applications' Storage Folders: some applications can store enormous data files on your system drive (outside "My Documents") that can be moved to other drives.
The biggest suspect is probably Outlook (in older versions at least) - or other email software (Thunderbird, Lotus Notes, etc...). For Outlook there is a single *.PST file storing all email and attachments, or a similar sync file if connected to Exchange. This file can be moved to a different drive with relative ease. Some even resort to using only the web interface for their email and eliminate the local PST file entirely (good for laptops).
Without going overboard, MS-SQL databases could be another type of massive data file that could be moved to a different drive with relative ease.
And this list could be made very big, but diminishing returns to add any more (web server folders, virtual machine images, media / video files (mentioned above), virtualized applications maybe, etc...).
For computers with multiple users there will obviously be multiple storage locations to redirect.
12. Source Control Working Folder & Repository: for a developer this is 100% self-evident - almost embarrassing to list - but I just want to have it mentioned. It is related to the previous point, but I add it as its own bullet: move both your working folder and your source code repository (if different, and if local) to a different drive than the system drive. For example Git, Mercurial, Perforce, StarTeam, etc...
13. Build Process Junk: beyond moving source control folders to other drives, certain build processes can generate huge log files that spam the system in unexpected locations. I hear MSBuild tends to enthusiastically create log files sprinkled across the system, and I am not sure the normal Microsoft cleanup tools (for example cleanmgr.exe, mentioned above) detect them. And your source tree could hold lots of object files you can zap.
14. Visual Studio Code: a silly option, but for ad-hoc developer laptops or traveling tech workers, one could potentially rely on the smaller, multi-platform Visual Studio Code instead of Visual Studio for small development testing / work. It is a significantly smaller install. Personal note: a bit of an odd tool :-). There is also a browser version now.
Visual Studio Code (cross-platform).
What are the differences between Visual Studio Code and Visual Studio?
https://code.visualstudio.com/docs/supporting/faq
Download: https://www.visualstudio.com/
15. Windows Store Apps & Per User Installations: if there are multiple users on the box, several Store apps could be installed multiple times, once per user. Some cleanup could be done here if need be - I suppose some games can be quite big. And in the day and age of side-by-side installation features, are we now to deploy everything per user? Odd.
16. Tweak Each Package Installation: almost every package you install can be tweaked slightly during installation to put fewer files on the system partition.
Redirect the application installation folder: this is an option I personally dislike, but it is used a lot. For every installation you redirect the installation folder to a different drive and folder hierarchy than the regular ProgramFilesFolder. This is done on a per-package basis, and not all packages support it. Typically you go to a "Custom" installation dialog where you perform "feature selection" (choosing which setup features to install).
Leave out optional features: most packages you install have optional components that you can leave out, or even run from source in the case of some MSI packages. Certain developer tools can often be trimmed quite a bit without too many side effects. Large games are often best installed to a regular non-SSD hard drive that is not the system drive.
17. Uninstall Windows Components: a few components can be added to / removed from Windows itself. Click Turn Windows Features On or Off in the old-style Add / Remove Programs Control Panel applet. You can turn off / remove certain .NET versions, IE, IIS, Windows Media Player, Message Queue Server, Print to PDF, PowerShell and various other components. Maybe not that much space to gain from this (though there are security benefits to removing some components - for example support for SMB 1.0 / CIFS file sharing, or IIS).
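The same components can be scripted via DISM from an elevated prompt if you prefer; a sketch (feature names vary by Windows version, so list them first):

REM list all optional features and their current state
DISM /Online /Get-Features
REM e.g. turn off SMB 1.0 support
DISM /Online /Disable-Feature /FeatureName:SMB1Protocol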
18. Enable Compression For System Drive: you can enable compression on the whole system drive - with some performance penalty - provided the file system is NTFS. Simply right-click the system drive => Properties => Compress this drive to save disk space. This can take quite some time (old HDs are slow; SSDs are faster). You can also compress individual folders. I like to enable the "Show compressed or encrypted NTFS files in color" option in Windows Explorer (Folder Options => View tab).
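Individual folders can also be compressed from the command line with the built-in compact tool; a sketch (the path is a placeholder):

REM compress a folder and everything below it
compact /c /s:"C:\SomeBigFolder"
REM undo it later if the performance hit annoys you
compact /u /s:"C:\SomeBigFolder"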
19. Uninstall Unnecessary Software: the forgotten, obvious option also mentioned under item 2 above - you should obviously uninstall any software that is not needed anymore. Common disk hogs: games, weird SDKs and development tools installed for testing, expired trial versions of various software, etc... To uninstall: Windows key + R, type appwiz.cpl and hit Enter.
20. User Data Cleanup: uninstalled applications can leave a lot of junk behind in %UserProfile% and %AllUsersProfile%. Cleanup is, as usual, risky - use caution - but there can be lots of junk here, sometimes gigabytes.
Great care must be taken during such cleanup. Zip up the folder first. "Big wins only" - why nitpick with tiny text files? The returns diminish quickly if you get bogged down in these folders. Use disc-space visualization tools to find the hogs.
%AllUsersProfile% - shared data
%UserProfile% and %UserProfile%\AppData - user specific data, remember to clean for all users (if multiple).
21. Stray Package Caches: as mentioned above, a lot of caching goes on for MSI packages (and other installer packages). It is likely that many of these packages are left behind after uninstall (this was the case with cached InstallShield setups back in the day, at least).
The most commonly known caching locations are described here: Cache locations for (MSI) packages. Clean at your own risk, obviously - I repeat it, and I mean it. Several gigabytes are commonly stored here.
Paths inline (just a selection, there can be many others):
WiX: %ProgramData%\Package Cache
Installshield: %SystemRoot%\Downloaded Installations (older IS setups) and %LocalAppData%\Downloaded Installations (newer IS setups)
Advanced Installer: [AppDataFolder][|Manufacturer]\[|ProductName] [|ProductVersion]\install
Visual Studio: %AllUsersProfile%\Microsoft\VisualStudio\Packages. See important tip in comment below (disable cache).
22. Package Distribution Cache Folders: SCCM and other package distribution systems have cache folders that can get really big, for example ccmcache. These folders can usually be cleaned out or re-configured to take less space.
There are no doubt numerous other little tricks, but please don't redirect system folders!
Alternative Approaches
(Dis)-Honorable Mentions: the items below are not recommendations, but alternative approaches. They are higher risk than the options above (which should be good enough), and they fit best if you are setting up a new laptop fresh, or reinstalling one, and want to get rid of pesky vendor recovery partitions that you can do without.
Let's state the obvious with conviction: A lot of data is lost every year using these tools. So coffee or caffeine first. Glasses on. Look around. Adjust any pony tails and beards (ladies too). Speak to yourself in the third person. Assume a demonstrably insane posture and shout out "I do!" to really commit to the imminent disaster! Good luck! Fire in the hole! "Fire for effect". SNAFU. FUBAR. OK, enough already... I have had bad experiences - but no huge disasters (knock on wood) - with all these tools. Enough said - be careful, your data is important. Wife's baby pictures, your uncommitted code, etc...
diskmgmt.msc or diskpart.exe (Windows): open the partition manager (diskmgmt.msc) and wipe out any recovery or hidden partitions that you can live without, then expand your system partition to fill the whole physical disk, or create a new visible partition.
A factory reset is then no longer possible (it could be outdated anyway), and you need installation media to reinstall (downloadable?).
Careful what you wipe out - it is unrecoverable. Partitions are often protected and untouchable; they are also unmovable and un-expandable in many cases.
Maybe create a new, visible partition replacing the recovery partition and move data files and your downloads folder there to make more room on your system partition?
If the partitions are protected, you can use diskpart to delete them instead, or see the next bullet point for gparted. It is very easy to mess things up with diskpart, though (it is a command-line tool).
gparted (Linux): you may be prevented from deleting a recovery partition from diskmgmt.msc (protected partitions). If you are adamant and insist, you can boot into a Linux Live Disc / System (booted from removable media) and delete using gparted for example.
I have done this to get rid of obsolete and useless recovery partitions and / or malware, and it worked just fine. But frankly, I trust the gparted app about as far as I can toss it. No offence to gparted, but playing well with Windows is challenging. A backup is crucial and mandatory for such risky endeavors - obviously.
Though risky (a Linux tool is updating the partition tables where your Windows partitions are declared) this may work for laptops where there is nowhere to redirect data folders since there is only one physical disk and you want the full disk for your system partition.
I think gparted even allows you to try to resize existing partitions at this point. I have never tried it. Good luck if you try. "Fire in the hole!".
Cloning: some use imaging tools or disk cloning features (hardware) to clone the old disk onto a bigger one. Backups essential obviously. Far from my comfort zone - just mentioning it. Not really relevant for this list (which was supposed to be about simple and effective measures to gain more disc space).
I believe there are features for this in gparted as well. Never tested.
Various hardware solutions. I gave them up years ago.
Why am I skeptical? Malware. Disk errors. Encryption. NTFS complexity? AD problems (old & new drive in use post-clone)? Etc...
Several hard drive vendors seem to deliver proprietary solutions for this - these may be better tested than generic approaches?
File System Allocation Size: the file system used and its allocation unit size affect the available space. I never bothered to look much at this, but a lot of space can be wasted by allocation-size issues: Would SSD drives benefit from a non-default allocation unit size?
Allocation size cannot be easily / safely changed for a disk in use. There may be tools that can do it, but the benefits are uncertain.
Modern Windows versions require NTFS as the system-partition file system. Other file systems such as FAT32 or exFAT have lower overhead (especially on smaller partitions, where more space is available) and are potentially faster, but they have more limitations. For FAT32 the biggest limitation is probably the 4 GB maximum file size - not viable today.
The rest of this answer (below) was written during debugging - I will leave it in. It contains generic and general-purpose debugging options.
VC+ Runtimes
As seen in the link towards the bottom, other people have seen the same deployment error. Before getting into too much debugging, let's try the simplest approach possible: please try to install the VC++ runtimes for 2017 (and perhaps 2015) from here:
The latest supported Visual C++ downloads.
Potential General Fixes
This seems to be the best online discussion of this problem. I would first try the suggestion to run this tool: Microsoft Install and Uninstall Troubleshooter.
You can try this list of fixes as well. Crucially, I would also try a reboot before trying again, to release any potentially locked files - just to wipe the slate clean. The system's event log might have further information on the error (sometimes even beyond what is in an msiexec.exe log).
ACLs
What is the ACL (Access Control List) for your TEMP folder on that G: drive?
UPDATE: also make sure the hidden folder C:\Windows\Installer exists and has the correct permission settings. You need to show protected operating system files in Windows Explorer to see this folder.
Verbose Logging
Try to create a proper, verbose log for the MSI install in question (much more informative than the log you refer to). This gives you something to start with to figure out what is happening. You can find some information on how to do logging here.
I would enable logging for all MSI installations for debugging purposes. See installsite.org on logging (section "Globally for all setups on a machine") for how to do this.
I prefer to have this default logging switched on for dev and test boxes. Typically you suddenly see an MSI error and wish you had a log - this way you always have one ready in %tmp%.
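For reference, the registry policy behind that installsite.org advice looks like this (apply from an elevated prompt; verbose logs then land in %tmp% as MSI*.LOG files):

reg add "HKLM\Software\Policies\Microsoft\Windows\Installer" /v Logging /t REG_SZ /d voicewarmupx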
Quick Testing
In your case, I would go to C:\ProgramData\Microsoft\VisualStudio\Packages\Microsoft.VisualStudio.MinShell.Msi,version=15.6.27421.1\ to see if the MSI package is present on disk, and then I would launch it with logging enabled:
msiexec.exe /I "Microsoft.VisualStudio.MinShell.Msi.msi" /QN /L*V "C:\msilog.log"
Alternatively, I would just double-click the MSI file and see if I get a better, interactive error message. You will most likely need the verbose log to get any real information, though.
See link in comment below (concrete error).
Just check C:\Windows\Temp and C:\Windows\Installer: do they exist, and are they writable?
In my case I had deleted C:\Windows\Installer previously and forgotten about it, so I had to recreate it.
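If you are in the same boat, recreating the folder with its super-hidden attributes is straightforward from an elevated prompt (verify the ACLs against a healthy machine afterwards):

mkdir C:\Windows\Installer
REM restore the system + hidden attributes
attrib +S +H C:\Windows\Installer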
The same error happens if UAC is disabled: the Visual Studio installer can't write anything to TEMP if User Account Control is off. The solution is to enable UAC.
In my case it was VS 2019 on Windows Server 2012 R2.

How to take a snapshot of the entire system of Macbook Pro OS x 10.8 Mountain Lion

I want to be able to take a snapshot of the current system and go back to it whenever I mess up my files. I looked at the Time Machine solution, but realized it's only a good solution when I know which file I am looking for. But sometimes an installation process creates binary files in multiple system paths, which are very hard to locate and identify. Say I installed a package, but then I felt I shouldn't have; uninstallation might still leave files around. So what are some graceful solutions for going back to a state of the machine when everything was nice and clean?
Use Disk Utility (Applications | Utilities).
Click on the HDD and then click New Image. You can choose whether or not to compress the image. If you don't have much stuff on the drive it shouldn't be more than 30-40 GB. Once you have the .dmg file, stick it somewhere for backup purposes.
Also, create a recovery disk/stick with the recovery tool.
I dunno about "graceful", but Carbon Copy Cloner is definitely an easy solution for rolling back to a previous state. You can make an exact clone of your drive, then restore it if something goes horribly wrong. I use CCC to make periodic backups of my Macs, as a sort of secondary backup to Time Machine, which is easy to use but in which I don't have total confidence.
You can restore an entire system from a Time Machine snapshot, but it requires booting from the Recovery Partition or a Recovery disk. Basically, once you've rebooted in recovery mode, you can choose Restore From Time Machine Backup and then you'll be asked to locate the drive. Once you've done that, a list of Time Machine snapshots will be presented for restoring.
I haven't done this recently, but there are indications that the time of the backups may always be in PST, so be careful when looking at the times.
While OS X comes with Time Machine, it also has the well-known (in the Linux community!) command-line tool rsync.
With Google, I'm sure you can find many articles on how to use it; here's an interesting blog post on why its author uses rsync alongside Time Machine.
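For the curious, a typical backup invocation might look like this (paths are placeholders; the -E flag of Apple's bundled rsync preserves extended attributes and resource forks):

# -a preserves permissions and times, --delete mirrors removals to the backup
sudo rsync -aE --delete /Users/yourname/ /Volumes/Backup/yourname/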

Let's say I am writing my code and then my PC dies - how necessary is it to do a complete scan if I don't want my later source code to be contaminated?

Let's say I am writing a Ruby on Rails program and, while I am editing a file, the machine blue-screens. In this case, how necessary is it to re-scan the whole hard drive if I don't want my future files to be damaged?
Say the OS was deleting a tmp file at the moment my computer crashed and still had pointers to some sectors on the hard drive. If my newly created files happen to land in those sectors, then the next time the OS cleans up files it may think the "left-over" sectors weren't cleaned last time, clean them again, and damage our source code (especially with Ruby on Rails, where source code may be generated by Rails and not by us, so we may not know why our Rails server doesn't work if a file is affected). We can rely on SVN, but what if a file is affected before we check it in?
I think the official answer will be: "always scan the disk after a crash or power outage, check the data and even the free space, and attempt to fix any bad sectors". The thing is, with hard drives as big as they are nowadays, that could take 2 hours. And especially at work, we cannot wait 2 hours in the middle of the day.
Does someone know whether, with a modern OS - XP, Vista, Mac OS, or Linux (for those times when the power cord comes loose and the machine doesn't shut down properly, just dying at 0% battery) - our source code is safe? Do they structure their writes to sectors so that, at worst, a sector is wasted rather than sectors overlapping?
With a modern journaling file system (ext3/4, NTFS), the only problem would be that a file could be left in a "half-written" state. Obviously scanning is not going to help with that (that's what backups are for). The file system itself cannot be corrupted. If you are using something like FAT, then yes, you should worry about this.
There's really only one issue here: is any file currently being written left in some kind of "half-written" state?
The primary cause of this would be the application/editor writing the file when the machine dies halfway through. In that case, the file being written is, well, half done. If it was overwriting the original file, the original file is "gone" and the new one is "half done". If you don't have a backup of the file, then, well, you have a problem.
As for a file having dangling pointers, or references to sectors that were never written, or some such thing: that problem depends on your file system.
The major, modern file systems are journaled and "won't allow" this to happen. You may have a "half-written" file, but that's because the application only got to write half of it, not because the file system lost track of a sector pointer.
If you're playing file system games for performance or whatever (such as using UFS without logging), then you would want to run fsck to clean up the file system's metadata.
But if you're using a modern operating system and file system (i.e. anything from the past 5 years), you won't have this problem.
Finally, if you do have version control running, just run "svn status": it will show you any "corrupted" files, since they will have changed, and it will detect that as well.
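For instance, after an unclean shutdown, a quick working-copy check might look like this (the file name is hypothetical):

svn status    # 'M' = modified, '!' = missing; anything unexpected deserves a look
svn diff app/models/user.rb    # inspect a file you did not knowingly change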
I found some information at
http://en.wikipedia.org/wiki/Journaling_file_system
Journalized file systems
File systems may provide journaling, which provides safe recovery in the event of a system crash. A journaled file system writes some information twice: first to the journal, which is a log of file system operations, then to its proper place in the ordinary file system. Journaling is handled by the file system driver, and keeps track of each operation taking place that changes the contents of the disk. In the event of a crash, the system can recover to a consistent state by replaying a portion of the journal. Many UNIX file systems provide journaling including ReiserFS, JFS, and Ext3.
In contrast, non-journaled file systems typically need to be examined in their entirety by a utility such as fsck or chkdsk for any inconsistencies after an unclean shutdown. Soft updates is an alternative to journaling that avoids the redundant writes by carefully ordering the update operations. Log-structured file systems and ZFS also differ from traditional journaled file systems in that they avoid inconsistencies by always writing new copies of the data, eschewing in-place updates.

Locking Executing Files: Windows does, Linux doesn't. Why?

I noticed when a file is executed on Windows (.exe or .dll), it is locked and cannot be deleted, moved or modified.
Linux, on the other hand, does not lock executing files and you can delete, move, or modify them.
Why does Windows lock when Linux does not? Is there an advantage to locking?
Linux has a reference-count mechanism, so you can delete a file while it is executing; it will continue to exist as long as some process (which previously opened it) holds an open handle to it. The directory entry for the file is removed when you delete it, so it cannot be opened any more, but processes already using the file can still use it. Once all processes using the file terminate, it is deleted automatically.
Windows does not have this capability, so it is forced to lock the file until all processes executing from it have finished.
I believe the Linux behavior is preferable. There are probably some deep architectural reasons, but the prime (and simple) reason I find most compelling is that on Windows you sometimes cannot delete a file and have no idea why - all you know is that some process is keeping it in use. On Linux this never happens.
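You can watch this behavior yourself on a Linux box; a harmless sketch using a copied binary:

cp /bin/sleep /tmp/mysleep
/tmp/mysleep 60 &    # start it running in the background
rm /tmp/mysleep    # the unlink succeeds even while it runs
ls -l /proc/$!/exe    # the kernel still shows '/tmp/mysleep (deleted)'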
As far as I know, Linux does lock executables when they're running - however, it locks the inode. This means you can delete the "file", but the inode stays on the filesystem, untouched, and all you really deleted is a link.
Unix programs use this way of thinking about the filesystem all the time: create a temporary file, open it, delete the name. Your file still exists, but the name is freed up for others to use, and no one else can see it.
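The same trick works straight from the shell; a small sketch:

exec 3<> /tmp/scratch.$$    # create the file and keep descriptor 3 open on it
rm /tmp/scratch.$$    # the name is gone...
echo "still writable" >&3    # ...but the file lives as long as fd 3 is open
exec 3>&-    # closing it lets the kernel reclaim the space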
Linux does lock the files. If you try to overwrite a file that's executing, you will get ETXTBSY ("Text file busy"). You can however remove the file, and the kernel will delete it when the last reference to it is removed. (If the machine wasn't cleanly shut down, these files are the cause of the "Deleted inode had zero d-time" messages when the filesystem is checked: they weren't fully deleted, because a running process had a reference to them, and now they are.)
This has some major advantages: you can upgrade a running process by deleting the executable, replacing it, then restarting the process. Even init can be upgraded like this - replace the executable and send it a signal, and it'll re-exec() itself without requiring a reboot. (This is normally done automatically by your package management system as part of its upgrade.)
Under Windows, replacing a file that's in use appears to be a major hassle, generally requiring a reboot to make sure no processes are running.
There can be some problems: if you have an extremely large logfile and you remove it, but forget to tell the process that was logging to that file to reopen it, it'll hold the reference, and you'll wonder why your disk didn't suddenly get a lot more free space.
You can also use this trick under Linux for temporary files: open the file, delete it, then continue to use it. When your process exits (for whatever reason - even power failure), the file will be deleted.
Programs like lsof and fuser (or just poking around in /proc/<pid>/fd) can show you which processes have files open that no longer have a name.
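lsof even has a dedicated switch for exactly this case:

lsof +L1    # list open files with a link count below 1, i.e. deleted but still held open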
I think Linux / Unix doesn't use the same locking mechanics because it was built from the ground up as a multi-user system - which means expecting the possibility of multiple users using the same file, maybe even for different purposes.
Is there an advantage to locking? Well, it could possibly reduce the number of pointers the OS has to manage, but nowadays that saving is pretty negligible. The biggest advantage I can think of is that you avoid some user-visible ambiguity: if user A is running a binary file and user B deletes it, the actual file has to stick around until user A's process completes; yet if user B or any other user looks for it on the file system, they won't be able to find it - but it will continue to take up space. Not really a huge concern to me.
I think it's largely a question of backwards compatibility with Windows' file systems.
I think you're being too absolute about Windows. Normally, it doesn't allocate swap space for the code part of an executable; instead it keeps a lock on the executable & DLLs. If discarded code pages are needed again, they're simply reloaded. But with /SWAPRUN, these pages are kept in swap. This is used for executables on CDs or network drives. Hence, Windows doesn't need to lock those files.
For .NET, look at Shadow Copy.
Whether the executed code in a file should be locked or not is a design decision, and MS simply decided to lock, because it has clear advantages in practice: that way you don't need to know which code, in which version, is used by which application. This is a major problem with the Linux default behaviour, which is simply ignored by most people: if system-wide libs are replaced, you can't easily know which apps use the code of those libs. Most of the time, the best you can get is that the package manager knows some users of those libs and restarts them.
But that only works for general and well-known things like maybe Postgres and its libs. The more interesting scenario is when you develop your own application against some 3rd-party libs and those get replaced, because most of the time the package manager simply doesn't know your app. And that's not only a problem of native C code or such - it can happen with almost everything: just use httpd with mod_perl and some Perl libs installed using a package manager, and let the package manager update those Perl libs for whatever reason. It won't restart your httpd, simply because it doesn't know the dependencies. There are plenty of examples like this one, simply because any file can potentially contain code that is in use in memory by any runtime - think of Java, Python and all such things.
So there's a good reason to hold the opinion that locking files by default may be a good choice. You don't need to agree with those reasons, though.
So what did MS do? They created an API which gives the calling application the chance to decide whether files should be locked or not - but the default of this API is to give the first calling application an exclusive lock. Have a look at the CreateFile API and its dwShareMode argument. That is the reason you might not be able to delete files in use by some application: it simply didn't care about your use case, used the default values, and therefore got an exclusive lock from Windows on the file.
Please don't believe people who tell you that Windows doesn't use ref counting on HANDLEs, or doesn't support hard links, or such - that is completely wrong. Almost every API using HANDLEs documents its behaviour regarding ref counting, and you can easily read in almost any article about NTFS that it does indeed support hard links and always did. Since Windows Vista it has supported symlinks as well, and the support for hard links has been improved by providing APIs to read all of the hard links for a given file, etc.
Additionally, you may simply want to have a look at the structures used to describe a file in e.g. ext4 compared to those of NTFS - they have a lot in common. Both work with the concept of extents, which separates data from attributes like the file name, and an inode is pretty much just another name for an older, but similar, concept. Even Wikipedia lists both file systems in its article.
There's really a lot of FUD on the net around file locking in Windows compared to other OSs, just like there is about defragmentation. Some of this FUD can be ruled out by simply reading a bit on Wikipedia.
NT variants have the openfiles command, which will show which processes have handles on which files. It does, however, require enabling the system global flag 'maintain objects list'. Running openfiles /local /? tells you how to do this, and also warns that a performance penalty is incurred by doing so.
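Concretely, the sequence looks like this (the flag change takes effect after a restart):

REM enable the 'maintain objects list' global flag
openfiles /local on
REM then list open files and the processes holding them
openfiles /query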
Executables are progressively mapped into memory when run: portions of the executable are loaded as needed. If the file could be modified out from under the mapping before all sections were loaded, it could cause major instability.
