Ubuntu broken after running mv /* [closed] - shell

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 8 years ago.
After using this command
root@localhost:/var/www/google# mv /* ./
mv: cannot move 'dev' to '/dev': Device or resource busy
mv: cannot move 'proc' to '/proc': Device or resource busy
mv: cannot move 'run' to '/run': Device or resource busy
mv: cannot move 'sys' to '/sys': Device or resource busy
mv: cannot move 'var' to a subdirectory of itself, '/var'
After that, every command fails.
I then wanted to zip my files as a backup, and that fails too.
Can somebody help me? Thank you.
I want to restore the system to normal.
Failing that, how can I at least zip my files with some zip tool?

Judging from the comments, you were running as root and the current directory was /var/www/google when you ran the command:
mv /* ./
This has moved everything movable from / to /var/www/google. One side effect is that the commands that normally live in /bin are now in /var/www/google/bin and those that live in /usr/bin are now in /var/www/google/usr/bin.
Do not reboot. Do not log out.
If you do, you will have to reinstall from scratch.
Temporarily, you can do:
PATH=/var/www/google/bin:/var/www/google/usr/bin:$PATH
cd /var/www/google
mv * /
These steps undo the primary damage (you should be able to reboot after this, but don't).
You then need to move the directories that are now in / but belong in /var/www/google back into their correct place.
You should create a new terminal session and check that your system is working sanely (do not close the open terminal until you've demonstrated that all is OK again).
Don't work as root unless you have to, and only for the minimum time necessary (one command at a time?).
If any of this fails, you should probably assume that a reinstall will be necessary. Or take the machine to someone who has the experience to help you fix the problems. There are endless things that could go wrong. Mercifully for you, the /dev directory was not moved; that avoids a lot of problems. However, the /etc directory was moved; commands could get upset about that.
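The recovery sequence above can be rehearsed safely before touching the real system; a sketch using a throwaway directory standing in for / (all paths here are illustrative, not the asker's real layout):

```shell
#!/bin/sh
# Rehearse the accident and its reversal in a temporary directory
# standing in for /, so nothing real is touched.
set -e
root=$(mktemp -d)
mkdir -p "$root/bin" "$root/etc" "$root/var/www/google"

cd "$root/var/www/google"
mv "$root/bin" "$root/etc" .   # the accident: top-level dirs land in the web root

mv ./* "$root/"                # the fix: move everything back up
ls "$root"                     # bin, etc and var are at the top level again
```

The same shape applies to the real incident, except that PATH must first be pointed at the relocated /var/www/google/bin so that mv itself can still be found.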

Try to revert it:
cd /var/www/google
mv ./* /
Good luck.
P.S. To zip:
zip archive.zip /path/to/zip/*
EDIT
/var/www/google/bin/mv /var/www/google/* /

Related

System Storage Taking Up Way Too Much Space in macOS Mojave [closed]

Closed 10 months ago.
My Mac keeps sending frequent alerts about low disk space. When I check the storage breakdown, it shows 170+ GB occupied by "System", and I'm not sure where my space is being used.
I tried a few cleaner tools as well, but they didn't help much.
Please help me resolve this.
After doing research on various Mac forums and Stack Exchange, I figured out that it's mostly because of the following:
Log files (Might be crash log files/docker files)
Your email messages stored in outlook (in my case it was almost ~20 GB)
Logs related to cores when a system restarts (~ 10 GB)
Docker Images (This had ~70 GB in my case).
Your non-system documents/downloads/iTunes media
So the question is: how do you find which of these are unnecessary and safe to delete? These system files are not directly visible.
I tried a few tools like CleanMyMac, but they were all paid, so I couldn't get much help there either.
To clean up unnecessary non-system files, you can use macOS's built-in storage management tool directly: click "Optimize Storage" and it will show all the non-system files.
To clean up unnecessary system files, use the command below:
sudo find -x / -type f -size +10G
This command will give you all the files occupying more than 10 GB. You can analyze the files and delete them as necessary.
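The -size filter can be tried at a smaller threshold in a scratch directory first, to see what the output looks like before running it against / (file names below are made up):

```shell
#!/bin/sh
# Create one large and one small file, then ask find for files over 1 MiB.
set -e
d=$(mktemp -d)
dd if=/dev/zero of="$d/big.dmp" bs=1024 count=2048 2>/dev/null  # 2 MiB
printf 'tiny' > "$d/small.txt"

find "$d" -type f -size +1M     # prints only .../big.dmp
```

Both BSD (macOS) and GNU find accept the M suffix for mebibytes, so the same invocation works in either place.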
The cores it highlights are just state files your Mac uses to resume from its last state after a restart, so they are safe to delete.
The next step is to delete a hidden tmp folder.
It will show a size of 0 bytes because your user doesn't have permission to read it, but it can occupy an enormous amount of space, so delete it with root permissions.
Now, look for any Docker images present on your system and clean them all up (Docker.raw).
Using all these steps I was able to clean almost 100+ GB.
I recently found that this issue was caused by a memory leak in one of the Java applications I was running. I had to open Activity Monitor, search for Java processes, and Force Quit them. Rinse and repeat every time my space runs out. Also, fix your code where you can to get rid of memory leaks.

rm -rf ~$* on macbook - what now? [closed]

Closed 5 years ago.
So I recently, by complete mistake, ran the command:
rm -rf ~$*
Now my terminal shell looks crappy and all my files are gone. Great success.
My terminal shows the user as:
User%
How do I get it back to "User#Machine" format?
This isn't really an answer, just a bit to add onto the advice from @swa66.
I don't know what kind of Mac you have, but if yours is one whose hard drive you can pull out yourself, you might want to consider doing that. There are numerous tools on the market that can recover deleted files and directories as long as you have not written over the data. If you can put a new bare drive into your Mac, you can install a fresh copy of macOS and your third-party apps, etc., as swa66 advised. Then purchase one of the reputable disk-recovery apps, attach the pulled drive via an external enclosure or dock (I like the docks best), and proceed to recover your important files. It takes some work, and it requires some expenditure if you don't already have a suitable bare drive, an external enclosure or dock, and the recovery software, but depending on the value of your lost data it may be worth it to you. As swa66 said, drive-recovery services are extremely expensive, so if you have not overwritten your data or repartitioned, you can have good success retrieving the most common file types yourself.
If you cannot pull your drive out but have access to another Mac, there is the option of using Target Disk Mode to access your drive from the other Mac, either to image it for later recovery attempts or for direct recovery, but you have to make sure the recovery software supports Target Disk Mode. If your lost data is important, be very careful what you do with your computer to avoid overwriting it. rm -rf does not actually overwrite or remove the data on the disk; it is still there, but the locations of the files are now available to be overwritten by anything. For example, don't install recovery software onto the same drive you are trying to recover from.
Restore from backup
To get your files back, you have but one easy option: restore from backup.
Let's hope you made Time Machine backups on a regular basis.
rm -rf on the command line removes files and directories recursively, no mercy, no second guesses, no second chances.
The ~$*: I'm unsure what it expanded to. $* in bash expands to the arguments given to a script, but since it likely expanded to nothing, you probably nuked the home directory ~ of the user who ran this, and everything in it that could be erased recursively is gone. That's typically far too much to still have a stable environment.
So: restore from backup as your only simple option.
If you can't do that, there are two options left: start over, or recover (some) data.
Start over
Myself, I'd just rebuild the system from scratch if I didn't want to, or was unable to, restore a backup. It is the only way to be sure of having a stable system again where directories like Desktop, Downloads, Library, etc. still exist with their proper permissions and contents.
Recover (some) data
If you stop using the system ASAP, there's a chance that some services might find some valuable data on your hard disk. No guarantees at all, so consider it a last resort at best. It will not restore your system to working condition, but it might recover some valuable data.
What to do if you want to keep this option open:
Stop using the system NOW; shut it down. Every write your system makes to the hard disk potentially overwrites the data you might want to recover.
If you have a system with a removable hard disk: most modern Macs are not easy, nor recommended, for end users to swap disks themselves, and doing so is even likely to void the warranty, so take care!
-> Replace the disk in the machine with a new one and start rebuilding on that new disk. Use the old disk only as a target of the recovery; never boot from it or otherwise write to it.
If you have a system without an easily removable hard disk, you'll have to stop using the system until the valuable data has been recovered. If you're going for the DIY path below, you will have to bring the system up in Target Disk Mode. See here for how to do that: https://support.apple.com/en-us/HT201462
You now have two options:
DIY: I honestly have never had any success with this in real cases, but it is possible to find software that will claim to do this for you. Obviously nothing will ever be guaranteed and the best you can hope for is to recover some of the valuable data files. This software is typically not cheap, but significantly cheaper than the next option.
Professional data recovery service. Get the disk to the service of your choice. Expect this to be extremely expensive, without any guarantee to results.
Lessons learned
Every incident should allow for an after-the-fact moment where you learn from the experience. Without trying to preach too much:
Be careful with rm -rf ... it is powerful
Make backups regularly. On macOS, Time Machine is easy and painless and costs you next to nothing compared to this pain. Time Machine can back up to an external drive, an Apple Time Capsule, a partition on a NAS, ... If you leave the drive connected, you'll have hourly backups.

Why does this occur with hidden files [closed]

Closed 8 years ago.
I found 2 hidden files on my D:\ drive by accident. One day I opened a picture (exactly 1 picture on D:\), accidentally pressed the right arrow key, and another picture appeared that looked familiar, and then another. Maybe this isn't important, but the names are: 1. AlbumArtSmall.jpg, 2. Folder.jpg. I think these 2 jpg files are hidden, but when I tried to unhide them via Tools > Folder Options > View > Show hidden files and folders, nothing changed (no such files show up on D:\). After that I checked again with the command prompt:
1. %drive% d:
2. D:> dir
3. there is no such file (those 2 jpg file)
But when I check with a Cygwin terminal, which as far as I know is Linux-like (my OS is Windows 7, by the way):
1. /cygdrive/d
2. $ dir
3. there is such file (those 2 jpg file)
I know this is not a big problem, but I'm curious why it happened. If I wanted to delete these 2 files I could do nothing in Explorer; maybe there is a way to delete them with a Cygwin command, but my question is not so much how to delete them as why this occurs.
Thanks in advance, and sorry for my English.
They are probably marked hidden and system. You can display them in Explorer by selecting, in addition to "show hidden files", the "show system files" option.
On the command line, dir /a will show you hidden and system files too. To delete them from the command line, type attrib -h -s -r *.jpg to remove the hidden, system, and read-only attributes from all jpg files (for example), then a normal del file.jpg.
Cygwin does not recognize Windows-style hidden files (and thus shows them to you), because on Linux hidden files are marked with a leading dot. If you use a Windows shell on "Unix-hidden" files you will see them too, but Cygwin should not.
For deletion: Afaik you can make them visible and then simply delete them.
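The Unix-side convention mentioned above is easy to see in a sandbox: `ls` skips names that start with a dot, while `ls -a` shows them (file names below are made up):

```shell
#!/bin/sh
# On Unix, "hidden" just means the file name starts with a dot.
set -e
d=$(mktemp -d)
touch "$d/.hidden.jpg" "$d/visible.jpg"

ls "$d"        # prints only: visible.jpg
ls -a "$d"     # also prints . .. and .hidden.jpg
```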
Command prompt won't show hidden files if you execute dir. If you want to see these hidden files, then use dir /a.

Robocopy crashes trying to delete very deeply nested (1000+) subfolders on Windows [closed]

Closed 8 years ago.
I'm using the robocopy command
robocopy empty_dir super_subfoldered_folder /s /mir
rmdir empty_dir
rmdir super_subfoldered_folder
to delete the folders in one go, but during this command robocopy.exe stops working.
I have tried deleting from a path that starts at least 50 subfolders inside the main folder; it still crashes.
I've tried renaming folders to "1", but Windows doesn't let me get past 100+ folders and there are at least 1000 more. I tried creating a new drive letter with subst j: ., renaming some folders, and deleting the mapping, but this takes forever because of their number.
I tried dir /x and del with the shortened name -> doesn't work.
Is there another way to delete those folders ?
A file path in Windows goes through several layers before it reaches the actual file system driver. As a result there are two limits: 1) the MAX_PATH (260) limitation introduced by the top-level API, and 2) the 32K limit actually used by the file system. Since you already have that path, it is obviously within the limits of the file system. Try using the path with "\\?\" added to the front. This is an indicator that the Win32 API should not parse the string but pass it directly to the file system (http://msdn.microsoft.com/en-us/library/windows/desktop/aa365247(v=vs.85).aspx). This will only work if the process making the call is Unicode and 64-bit (on a 64-bit system). Otherwise the string must be converted and/or marshaled and you are back to the 260 limit.
Windows has a subdirectory depth limit and it's not very deep.
You may get a better result by booting up a Live Linux distro on cd or USB, like Ubuntu, and using the GUI file manager to delete the tree.
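The reason the Linux route works: tools like rm -rf descend into a tree using relative paths, so the total path length never matters. A sandbox sketch building a 200-level-deep chain and deleting it in one command (the depth is picked arbitrarily for the demo):

```shell
#!/bin/sh
# Build a deeply nested directory chain, then remove it in one command.
set -e
d=$(mktemp -d)
cd "$d"
i=0
while [ "$i" -lt 200 ]; do mkdir sub; cd sub; i=$((i+1)); done

cd "$d"
rm -rf sub                     # descends level by level; depth is no obstacle
[ ! -e sub ] && echo "deleted"
```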

How to repair/isolate hard drive bad blocks [closed]

Closed 8 years ago.
During the last month, Ubuntu has started having problems: it shuts down suddenly without any apparent reason. I figured out that the problem is the hard disk; if I run this command:
$ sudo badblocks -sv -b 512 /dev/sda
I get 24 bad blocks, all in the Linux partition (I have Windows on another one and it does not have the same problem). The question is whether there is a way (other than replacing the disk) to avoid these shutdowns. Maybe by isolating the bad blocks?
Software/file system bad blocks marking is mostly a thing of the past; recent drives automatically relocate bad blocks in a transparent way.
If you start getting bad blocks "visible" to software it probably means that the hard drive is exhausting the reserve of free replacement blocks, so it's probably failing. You should check the SMART status of the disk to see if this is actually confirmed by the other SMART attributes, do a backup and get ready to replace your drive.
I found a good tutorial that might help you: http://www.ehow.com/how_6864409_fix-bad-sectors-ubuntu.html
Open the terminal > type the command mount and follow the steps:
Choose a filesystem to repair. For example, you might choose the filesystem named "/home" if the output from the "mount" command includes this line:
/dev/mapper/vg0-home on /home type ext3 (rw)
Type the "umount" command to unmount the filesystem. To unmount the "/home" filesystem, for example, issue the command "sudo umount /home".
Type the "fsck" command to repair the filesystem. The "fsck" command stands for "file system check"; it scans the disk for bad sectors and labels the ones that aren't working. To run fsck on the /home filesystem, issue the command "sudo fsck /dev/mapper/vg0-home". Replace "/dev/mapper/vg0-home" with the output from your "mount" command, as appropriate.
Type the "mount" command to remount the repaired filesystem. If you repaired the "/home" filesystem, then use the command "sudo mount /home".
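The umount/fsck/mount cycle above can be practiced without touching a real disk by building a small filesystem image in an ordinary file (requires e2fsprogs; the image and sizes here are arbitrary):

```shell
#!/bin/sh
# Make a 4 MiB ext2 filesystem inside a regular file and fsck it.
set -e
img=$(mktemp)
dd if=/dev/zero of="$img" bs=1024 count=4096 2>/dev/null
mke2fs -Fq "$img"               # -F: it's a regular file, not a device
fsck.ext2 -fn "$img"            # -f: force a full check, -n: change nothing
```

On a real partition the sequence is the same idea, except the filesystem must be unmounted first and fsck is run against the device node rather than a file.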
Spinrite (grc.com) is the best tool I know of for recovering bad sectors and getting the drive to use spare sectors in their place. It's not cheap, but it works. If any of your friends own a copy, you are allowed to borrow it. I've used it for 7 years now. It's good for periodic maintenance too.
