iTerm can't delete directories - terminal

I just started working with iTerm and I've been playing around, creating folders and files and such. Now I want to get rid of those test folders. I used rmdir, rm -r and commands like that, and I successfully emptied the folders, but they still show up in my main folder.
When I try to delete them, it says the folders aren't empty, even though they are.
Does anybody know how to fix this? I just want to clean up my main folder. :(
Thanks in advance.
I tried various delete commands.
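A hedged guess, since no directory listing is shown: on macOS, Finder quietly drops hidden .DS_Store files into folders, so a folder that looks empty to rmdir often isn't. Checking for hidden entries and then force-removing the folder (testdir is a placeholder name) usually settles it:

# List everything, including hidden entries such as .DS_Store
ls -la testdir
# Remove the folder and all of its contents, hidden files included
rm -rf testdir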

Related

How do I diagnose input/output errors?

I am totally baffled. I used
sudo cp -a /my/home/dir /backup/home/dir
to back up my home directory and found a few files (about 20) that didn't copy due to input/output errors. These files look fine: some are .jpgs, some are .gifs, and one is my VirtualBox VDI file. They work fine in the original home dir, but they JUST WON'T COPY. I tried copying them manually. I tried copying them with Nautilus. I tried changing the permissions to 777 and made sure the ownership was non-root... still no dice. I get:
cp: reading `/my/home/dir/subfolder/abc_def.gif': Input/output error
I'm scratching my head; while I could live with losing a .gif or .jpg here and there, I don't want to lose my VDI file. Do I need to add a --force to the cp command? Is there any way to find out more about why these particular files aren't copying? In the case of the .jpgs, they're all in one folder of images I took during a recent trip, shot with the same camera, on the same CF card, and transferred the same way at the same time.
Totally baffled. Any help would be fantastic. Ideally, a way to force copy these files. They seem to be fine, usable, and I trust them, so I've no clue why they're not getting copied.
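One way to dig for more detail, sketched under the assumption of a Linux box with smartmontools and GNU ddrescue installed (device and file names below are placeholders):

# The kernel log usually records the underlying read error
dmesg | tail -n 50
# Ask the drive for its own health report (replace /dev/sda with your disk)
sudo smartctl -a /dev/sda
# ddrescue keeps reading past errors and salvages whatever it can
sudo ddrescue /my/home/dir/subfolder/abc_def.gif /backup/abc_def.gif rescue.log

Input/output errors on otherwise fine-looking files usually point to bad sectors on the disk rather than anything wrong with the files themselves, which is why changing permissions and ownership made no difference.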

Running program/macro to rename, add files to flash drive

I have a huge batch of flash drives that I need to move files onto. I'd also love to rename the drives (they're all called NO NAME by default). I'd love to plug two drives in, run a terminal script on the computer to accomplish all of that (most importantly the file moving). Then remove the drives, put the next two in, run it again, etc. until I'm done. All of the drives are identically named.
Is batch executing like this possible, and does anyone know how to go about doing it?
I figured it out. Put each one in and run this command to rename the drive and then move the files into it:
diskutil rename /Volumes/OLDNAME "NEWNAME" && cp -r ~/Desktop/sourceFolder/. /Volumes/NEWNAME
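To take the manual renaming out of it, a minimal sketch that renames and fills every mounted NO NAME volume in one go (NEWNAME and sourceFolder are placeholders; macOS remounts a volume after renaming it, hence the sleep; test on a single drive first):

i=1
for vol in "/Volumes/NO NAME"*; do
  # Rename each drive with a counter, wait for the remount, then copy
  diskutil rename "$vol" "NEWNAME$i" && sleep 2 && cp -r ~/Desktop/sourceFolder/. "/Volumes/NEWNAME$i"
  i=$((i+1))
done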

Git not detecting a directory

I've found questions similar to this, but they didn't work in my case, so I'm posting this.
On my Windows machine, I have a folder, say abc, and inside it a directory bin, where all my files live. For some reason, Git does not detect any contents of the directory abc unless I move the files from bin up into abc.
What should I do to make Git detect the files while they're inside the bin directory?
As @dennisschagt mentioned, the .gitignore file is relevant in situations like these. Since it had been auto-generated for me, I didn't notice the fact that it had an entry 'bin' in it, which was the root cause of the problem. Removing it solved my problem of the folder and its contents going undetected. Thank you all for the help.
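For anyone hitting the same thing, Git can tell you exactly which ignore rule is swallowing a path (the path below is illustrative):

# Prints the .gitignore file, line number, and pattern that matches the path
git check-ignore -v abc/bin/somefile

If that points at a 'bin' entry, delete the line from .gitignore (or use git add -f to override it) and the files reappear in git status.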

Wget Not Downloading Every Folder

Hey, I have a bash script running a wget command to get a directory:
wget -r -nH --cut-dirs=5 "ftp://$USER:$PASS@stumpyinc.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"
And what it's supposed to do is download a folder structure that looks like this:
$theme_root/Library/Themes/$theme_name.theme/Icons
For some reason, it won't download any folder that's inside the $theme_name.theme folder. There's also a UIImages folder in there that's not showing up, although files that are in that folder are being downloaded. Does anyone notice anything that I might have done wrong? Thanks in advance!
EDIT
If you add --level=inf, it works perfectly!
Wget's default retrieval depth is 5 levels, per the wget manual. If the files you're after sit deeper than that relative to your starting position, wget will not descend to them. You can pass a larger --level value or, as in your edit, --level=inf.
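With that, the command from the question becomes:

wget -r -nH --cut-dirs=5 --level=inf "ftp://$USER:$PASS@stumpyinc.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"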

Rsync symlinked content

I use rsync to upload websites to a server. I use it something like this:
rsync -auvzhL . username@host:/home/username/foldername
I only want to update things that are newer on my computer, and not to delete things on the server that I don't have.
This all worked fine until I decided to symlink some files in the folder to other ones. Now if I change such a file, it doesn't get included in the rsync unless I delete and recreate the symlink, presumably because the symlink's date is the time the link was created, not the time its content changed.
Is there any way to either force rsync to always copy certain files or, better, update the modified date on a symlink without deleting and recreating it?
In the end I just delete and recreate all the symlinks before rsyncing. I'm sure there is a better way, I just can't work it out!
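Two things worth trying instead, both using standard flags (whether they beat recreating the links depends on your setup, so treat this as a sketch):

# touch -h updates the symlink's own mtime without following it to the target
find . -type l -exec touch -h {} +
# Or sidestep timestamps entirely: --checksum makes rsync compare file contents
rsync -auvzhL --checksum . username@host:/home/username/foldername

--checksum reads every file on both ends, so it is slower, but it catches changed content no matter what the dates say.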
