Can't delete file from server [closed] - ftp

Closed. This question is not about programming or software development. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed last month.
Hi,
I'm using FileZilla over FTP and recently noticed that a few of my old directories are still on the server. I can't delete those old directories because of one single file inside; whenever I try to delete it, the server responds: 550 Forbidden filename
Does anyone know how to fix that?
Thanks!

There are a couple of potential workarounds. The error is often caused by "invisible" (non-printing) characters in the filename.
First, try renaming the file and then deleting it (ideally in a shell: mv 3.jpg deleteme.jpg).
Second, create a PHP file in the directory that contains the offending file, with this code:
<?php
unlink('3.jpg');
?>
Then load the PHP file in a browser, or run it from a shell:
php myphpfile.php
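If the true filename can't even be typed, listing names programmatically and deleting by the exact string the system reports usually works. Here is a minimal local sketch of that idea in Python (a plain local directory stands in for the FTP server; over FTP the same approach works with ftplib's nlst() and delete()):

```python
import glob
import os
import tempfile

# Create a file whose name contains an invisible control byte,
# mimicking the "3.jpg" that refuses to be deleted by its visible name.
d = tempfile.mkdtemp()
bad = os.path.join(d, "3\x1b.jpg")  # looks like "3.jpg" in many listings
open(bad, "w").close()

# Glob for everything, inspect the real names, and delete by the
# exact name the OS reports rather than the name we see on screen.
for path in glob.glob(os.path.join(d, "*")):
    print(repr(os.path.basename(path)))  # repr() exposes the hidden byte
    os.unlink(path)
```

The key point is that repr() (or ls | cat -v in a shell) reveals bytes that a normal listing hides, so you can pass the genuine name to the delete call.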

Related

Where does the file ''$'\033\033\033' come from in Linux? [closed]

Closed 11 days ago.
In a directory on a Linux server I discovered a file with this strange name.
From the command history I can tell it was probably created by this command:
sudo docker logs <container_id> -n 200000 | less
I suspect I typed some combination of keys in less (probably starting with s, which saves to a file).
Does anyone know what exactly happened?
P.S. If you want to remove such a file, see How to escape the escape character in bash?
I have discovered that such a file is created when you press s in a piped less, which prompts you for a log file name. If you press Escape three times and then Enter, you get a file whose name is three literal escape characters.
The s command itself is genuinely useful for saving the contents of a piped less.
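The linked question covers the details, but the short version is bash's $'...' quoting, which lets you spell out the escape bytes literally. A minimal reproduction in a throwaway directory (assumes bash):

```shell
# Work in a temp dir so nothing real is touched.
cd "$(mktemp -d)"

# Create a file whose name is three literal ESC bytes,
# like the one less produces after s, Esc Esc Esc, Enter.
touch $'\033\033\033'

# A plain ls shows a seemingly blank name; cat -v renders it as ^[^[^[.
ls | cat -v

# The same quoting removes it.
rm $'\033\033\033'
```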

Ubuntu terminal removing multiple partial files using wildcard [closed]

Closed 12 months ago.
This may be easy for Linux users, but I am having a hard time figuring out how to delete multiple (partial) files using a wildcard.
sudo rm logs/archived/remove_me.2022.* and sudo rm logs/archived/remove_me.2022.? do not seem to work.
I am getting the error rm: cannot remove 'logs/archived/remove_me.*': No such file or directory
I am currently in /var/lib/tomcat8/ trying to remove these logs inside logs/archived.
I have been removing them one by one, but there are a lot of files (going back to 2020, with daily and partial files).
For example, from inside /var/lib/tomcat8/logs/archived/ I want to remove all log files starting with remove_me.2021.*
Below is a sample of the files I want to remove. There are also other files in this directory that should not be removed.
remove_me.2022-03-02.1.log.gz
remove_me.2022-03-02.2.log.gz
remove_me.2022-03-02.3.log.gz
remove_me.2022-03-02.4.log.gz
remove_me.2022-03-02.5.log.gz
remove_me.2022-03-03.1.log
remove_me.2022-03-03.2.log
remove_me.2022-03-03.3.log
remove_me.2022-03-03.4.log
remove_me.2022-03-03.5.log
remove_me.2022-03-03.6.log
remove_me.2022-03-03.7.log
remove_me.2022-03-03.8.log
remove_me.2022-03-03.9.log
remove_me.2022-03-03.10.log
There are two issues here. First, the shell expands the wildcard before sudo runs, i.e. with the current user's permissions; if that user cannot read the directory, the glob matches nothing, and bash then passes the literal pattern through to rm, which produces exactly the "No such file or directory" error shown. Second, the pattern remove_me.2022.* requires a literal dot after 2022, but the filenames use a dash (remove_me.2022-03-02…), so it would not match even with the right permissions.
Solve both by becoming the superuser first and using a pattern that matches the actual names:
sudo -i
cd /var/lib/tomcat8/logs/archived/
rm remove_me.2022*
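Both failure modes can be demonstrated without sudo or tomcat at all, since they are pure bash glob behavior (filenames below mirror the question's sample):

```shell
# Work in a temp dir with two sample log files.
cd "$(mktemp -d)"
touch remove_me.2022-03-02.1.log.gz remove_me.2022-03-03.1.log

# The dotted pattern matches nothing, so bash hands rm the
# literal string "remove_me.2022.*" -- echo makes that visible.
echo remove_me.2022.*

# Dropping the dot lets the glob expand to both files.
echo remove_me.2022*
rm remove_me.2022*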

wget recursive not working as expected [closed]

Closed 6 years ago.
Wondering if I am overlooking the obvious.
I am trying to use
wget -rl 0 -A "*.fna.gz" ftp://ftp.ncbi.nlm.nih.gov/genomes/genbank/bacteria/Acinetobacter_nosocomialis/all_assembly_versions
to download all the files matching *.fna.gz in all the directories under ftp://ftp.ncbi.nlm.nih.gov/genomes/genbank/bacteria/Acinetobacter_nosocomialis/all_assembly_versions/
If you visit the above link, you will see a list of directories starting with GCA. I want all the files in those directories that match *.fna.gz, but I get nothing when I run the command. Is wget not recognizing the GCA* entries as directories, or is there something wrong with my wget command?
I am suspicious because when I try to download the directories with FileZilla I get:
GCA_000248315.2_ASM24831v2: Not a regular file
Error: Critical file transfer error
Those entries are not directories but symbolic links to somewhere else, and the file listing carries no information about the type of the link target (directory, plain file, or otherwise). wget therefore assumes a plain file and does not follow the link.
Apparently this is a bug on the server, which displays symbolic links to directories as ordinary files. As @Steffen Ullrich noted, "There is no information in the file listing which gives the type of the target file, i.e. if directory or plain file or whatever. Thus wget will probably assume plain file and not follow it." Thanks to codesquid_ on the FileZilla IRC for the clarification.
A follow-up question about a workaround is at https://stackoverflow.com/questions/35307325/recursive-wget-cant-get-files-within-symbolic-link-directories
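To see why wget stops, it helps to know that an FTP client classifies each entry by the first character of its LIST line ('d' = directory, 'l' = symlink, '-' = plain file). A small sketch with fabricated listing lines (the permissions, dates, and sizes below are illustrative, not the NCBI server's real output):

```python
# Three made-up LIST lines: a symlink (like the GCA_* entries),
# a plain file, and a real directory.
listing = """\
lrwxrwxrwx 1 ftp ftp   10 Jan 01  2016 GCA_000248315.2_ASM24831v2 -> ../../all/GCA_000248315.2_ASM24831v2
-rw-r--r-- 1 ftp ftp  512 Jan 01  2016 README.txt
drwxr-xr-x 2 ftp ftp 4096 Jan 01  2016 real_directory
"""

def entry_type(line: str) -> str:
    # The leading character of a Unix-style LIST line encodes the type.
    return {"d": "dir", "l": "symlink", "-": "file"}.get(line[0], "other")

types = [entry_type(line) for line in listing.splitlines()]
print(types)  # → ['symlink', 'file', 'dir']
```

Only the 'd' entry would be recursed into; the symlink's target type is simply not present in the listing, which matches the behavior described above.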

creating a hardlink to a file contained in one subdirectory into another subdirectory [closed]

Closed 8 years ago.
Hey, I'm using Ubuntu and having an issue with the ln command.
I have a folder called myName.2, and inside it are three folders: notes, assignments, and web.
Inside notes I created three files called linux.txt, unix.txt, and shell.txt. I have now changed into the web folder and want to create hard links to those files here, so I type the command:
ln /home/admin/3000/Assignment1/myName.2/notes/linux.txt
However, the terminal tells me:
ln: accessing `/home/admin/3000/Assignment1/myName.2/notes/linux.txt': No such file or directory
I went to the properties of the linux.txt file and copied the path straight from there.
Any hints would be much appreciated, thanks!
It's easy to make typos in a long absolute path, and simpler to use a relative path. From inside the web directory, try:
ln ../notes/linux.txt
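The whole layout from the question can be recreated and linked in a few lines (a temp directory stands in for the asker's real /home/admin path):

```shell
# Rebuild the directory tree from the question in a temp dir.
cd "$(mktemp -d)"
mkdir -p myName.2/notes myName.2/assignments myName.2/web
touch myName.2/notes/linux.txt

# From inside web, a relative path reaches the sibling notes folder.
cd myName.2/web
ln ../notes/linux.txt    # creates a hard link named linux.txt here
ls -l linux.txt          # link count is now 2
```

Because a hard link is just another directory entry for the same inode, both names now refer to the identical file.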

Syncing dot files with dropbox [closed]

Closed 9 years ago.
I put all my dotfiles in $HOME/Dropbox/dotfiles
and make a hard link (I think that's the way to go; for instance, vim doesn't load an rc file through a soft link):
ln $HOME/Dropbox/dotfiles/.vimrc $HOME/.vimrc
As long as I edit the file inside the Dropbox directory, everything works as expected. But when I edit the hard-linked file ($HOME/.vimrc), the original file changes accordingly, yet Dropbox won't sync! (The same happens with the iCloud mobile documents folder.)
Any ideas?
Use soft links. Hard links make it so that Dropbox can't tell when the file is updated: Dropbox doesn't poll the contents of every single file you have, it just watches modification events on the paths located inside your Dropbox folder.
This is exactly what I use for syncing my dotfiles with Dropbox:
$ ln -s ~/Dropbox/dotfiles/.vimrc .vimrc
and vim loads the soft-linked vimrc just fine.
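The setup can be sketched end to end in a throwaway directory (the temp dir stands in for $HOME, and the .vimrc contents are just an example line):

```shell
# A stand-in for $HOME with a Dropbox-style dotfiles folder.
cd "$(mktemp -d)"
mkdir -p Dropbox/dotfiles
echo 'set number' > Dropbox/dotfiles/.vimrc

# Symlink with an absolute target so the link works from any cwd.
ln -s "$PWD/Dropbox/dotfiles/.vimrc" .vimrc

# Reading through the link shows the Dropbox copy's contents.
cat .vimrc
```

Edits made through the symlink land on the real file inside Dropbox, so the sync client sees the change on a path it actually watches.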
