Unzip a file and then display it in the console in one step - shell

I have access to a remote server over SSH, but only read (no write) access. There is a gzipped log file on the server that I want to read.
Because the file system is read-only, I cannot extract the file first and then read it: when I try to unzip it, I get the message "Read-only file system".
My idea was to pipe the output of the gunzip command into some other command that reads from standard input and displays the content in the console. That way the unzipped file is never written to the file system (where I have no write permission) but is shown directly in the console. So far I haven't managed to do it.
How can I achieve this? And is there a better way to do it?

Since you cannot write the extracted files to disk, first view the archive's table of contents and the paths inside it. Once you have a path, you can print a file's contents with the -p option of the unzip command.
View contents
zipinfo your.zip
View file contents
unzip -p latest.zip wordpress/wp-config-sample.php
In case it is a .gz file, use gunzip -c your.gz instead (the -c flag writes the decompressed data to standard output).
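Putting the pieces together (logfile.gz is a placeholder name), you can stream the decompressed data straight into a pager or filter without ever writing to disk:

```shell
# Decompress to stdout and page through it; nothing is written to disk.
gunzip -c logfile.gz | less

# zcat is shorthand for gunzip -c, handy for filtering on the fly.
zcat logfile.gz | grep ERROR
```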
Hope this helps!

Related

Reading and redirecting zip file by Less

I'm trying to copy a zip file located on a server using an ssh2 library.
My approach is to read it with the less command and write it out on the client side:
less -r -L -f zipfile
but the output file is bigger than the original.
I know this is not good practice, but I have to do it this way.
So how can I handle this to get my zip file onto the client machine?
Is less mandatory for doing this?
You can simply use scp for that, by giving the user and host and then the source and destination paths, as in the example below (this copies foobar.txt from the server to a local directory):
scp your_username@remotehost.edu:foobar.txt /some/local/directory

Append to a remote file via FTP in shell script

Is there any way to write/append to a remote file via FTP? I need to append certain content to a file located on the server. Is there any way to do this with a shell script?
You can use cURL with the --append flag:
(FTP/SFTP) When used in an upload, this makes curl append to the target file instead of overwriting it. If the remote file doesn't exist, it will be created. Note that this flag is ignored by some SFTP servers (including OpenSSH).
See cURL man page.
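As a sketch (the host, credentials, and file names here are placeholders), appending a local file to a remote one over FTP with curl looks like this:

```shell
# Upload extra.txt (-T) and append it to the remote file instead of
# overwriting it (--append). Placeholder host and credentials.
curl --append -T extra.txt --user myuser:mypass \
  ftp://ftp.example.com/logs/app.log
```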

Rsync Log File

I'm trying to use rsync to transfer files from my main PC to my server. Once the files are transferred, I want to be able to move the files around on my PC without rsync sending them again when I rerun it.
I think I can do this by having rsync write out a log file with the names of the files it transfers. Then reference that same file as the exclude list.
I'm having trouble getting the format of the log file to be readable as an exclude list. It needs to only print out the file or folder names.
Here is the current command I'm running.
rsync -avz --exclude-from=Desktop/file.txt --log-file=Desktop/file.txt --log-file-format=%i Desktop/Source Desktop/Destination
What do I need to do to make the log file only output the name of the files or folders?
You could grab the list with find . > log.txt before running rsync.
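One way to sketch that idea (paths taken from the question): snapshot the file names with find before the first run, so the names are relative to the source tree and match what rsync expects in an exclude list, then feed that file to --exclude-from on later runs:

```shell
# Record everything currently under the source tree, stripping the
# leading "./" so the paths match rsync's exclude-list format.
(cd Desktop/Source && find . -type f | sed 's|^\./||') > Desktop/file.txt

# Later runs skip anything already listed; newly added files still transfer.
rsync -avz --exclude-from=Desktop/file.txt Desktop/Source Desktop/Destination
```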

DOS ftp listing to local file

I'm trying to find a way to see whether a file exists on an FTP site via DOS. I tried a get command on the file, hoping that if it didn't exist, nothing would be downloaded to my local directory. However, it seems it still downloads the file, just as an empty one. That doesn't work for me, because the file I'm looking for is itself an empty trigger file, so I can't tell the difference.
I would like to dump a listing (ls) of the FTP directory to a text file on my local drive, so I try:
ls > listing.txt
It creates the listing.txt file locally, but it's always empty even though there are files on the FTP site.
What are my options with this?
I have used both dir > listing.txt and ls > listing.txt, and every time listing.txt is empty, even though there are files in the directories I'm running those commands on.
Sorry if I didn't make this clear, but I need the listing for an automated process, not just to look at manually.
Unless you're on FreeDOS, you're probably not using DOS; more likely you're using ftp.exe in the Windows console. If that's the case, don't use a normal file redirect. Instead, use the ls syntax of the standard Windows ftp client:
ls [RemoteDirectory] [LocalFile]
So you can do ls . listing.txt to write a list of the files in the current remote directory to listing.txt. The listing.txt file will appear in your user directory, e.g. C:\Users\user.
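For the automated case, the standard Windows ftp client can read its commands from a script file via the -s: option. A sketch (host, credentials, and file names are placeholders; -n suppresses auto-login so the script's user command supplies the credentials):

```
rem list.ftp -- commands for the unattended session
open ftp.example.com
user myuser mypass
ls . listing.txt
bye
```

Run it with ftp -n -s:list.ftp, then check listing.txt for the trigger file's name.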

cron job to update a file from ftp

I tried something like:
wget ftp://username:password@ftp.somedomain.com/bla/blabla.txt -x /home/weptile/public_html/bla/blabla.txt
Apparently -x writes the output :) I thought it would overwrite the file I need.
So what I'm trying to do is do daily updates on blabla.txt in this specific subdirectory from an external ftp file. I want to get the file from ftp and overwrite the old file on my server.
Use wget -N: it overwrites the existing local file, but only when the remote copy is newer (timestamping).
If you get stuck on stuff like this, try man wget or heck, even Google.
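A sketch of the cron entry (URL and paths taken from the question; the schedule is an assumption). -N compares timestamps and replaces the local copy only when the remote file is newer, and -P sets the directory to download into:

```shell
# crontab entry: fetch the file daily at 03:00 into the target directory
0 3 * * * wget -N -P /home/weptile/public_html/bla/ ftp://username:password@ftp.somedomain.com/bla/blabla.txt
```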
