How can I send an HTTPS request from a file? - shell

Let's assume I have a file request.txt that looks like:
GET / HTTP/1.0
Some_header: value
text=blah
I tried:
cat request.txt | openssl -s_client -connect server.com:443
Unfortunately it didn't work and I need to manually copy & paste the file contents. How can I do it within a script?

cat is not suited to downloading remote files; it's best used for files local to the file system running the script. To download a remote file there are other commands that handle this better.
If your environment has wget installed, you can download the file by URL. That would look like:
wget https://server.com/request.txt
If your environment has curl installed, you can do the same. That would look like:
curl -O https://server.com/request.txt
Note that if you want to store the response in a variable for further modification, you can do that as well with a bit more work.
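For example, a minimal sketch (reusing the hypothetical URL above) that captures the body in a shell variable:
response=$(curl -fsS https://server.com/request.txt)   # -f: fail on HTTP errors, -sS: silent but still show errors
printf '%s\n' "$response"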
Also worth noting: if you really must use cat to fetch a remote file, it's possible, but it typically requires ssh, and I'm not a fan of that method because it needs ssh access to a file that is already publicly available over HTTP/S. I can't think of a practical reason to go about it this way, but for the sake of completeness I wanted to mention that it can be done, though it probably shouldn't be.
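For completeness, that would look something like the following (the remote path is hypothetical, and it assumes you have ssh access to the machine serving the file):
ssh user@server.com 'cat /var/www/html/request.txt' > request.txt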

Related

MIME type check via SFTP connection

I want to list images by SFTP and save this list, so another script may further process it.
Unfortunately, there are also many other files there, so I need to identify which are images. I am filtering out everything with the wrong file extension, but I would like to go a step further and also check the content of the file.
Downloading everything to check it with file --mime-type on the local machine is too slow. Is there a way to check the MIME type of a file on the remote SFTP server before downloading it?
We found a way: downloading only the first 64 bytes. It is a lot faster than downloading the whole file, but still enough to see whether it looks like an image:
curl "sftp://sftp.example.com/path/to/file.png" -u login:pass -o img.tmp -r 0-64
file --mime-type img.tmp
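Building on that, a rough sketch (hypothetical host, credentials and file list) that filters a list of remote paths down to images only:
while read -r f; do
  curl -s "sftp://sftp.example.com/$f" -u login:pass -o head.tmp -r 0-64
  file --mime-type -b head.tmp | grep -q '^image/' && echo "$f" >> images.txt   # -b prints just the type, e.g. image/png
done < remote_files.txt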
MIME type is supported by SFTP version 6 and newer only.
Most SFTP clients and servers, including the most widespread one, OpenSSH, support SFTP version 3 only.
Even the servers that I know of to support SFTP version 6, like Bitvise or ProFTPD mod_sftp, do not support the "MIME type" attribute.
So while in theory it's possible to determine the MIME type of remote files over SFTP, in practice you won't be able to do it.
You can run any command remotely using ssh:
ssh <destination> '<command_to_run>'
In this case that would be something like:
ssh <remote_machine_name_or_ip_address> 'file --mime-type ./*'
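A sketch of filtering that output down to images (hypothetical host and path):
ssh user@sftp.example.com 'file --mime-type path/to/*' | grep ': image/' | cut -d: -f1 > images.txt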

Use curl to download a Dropbox folder via shared link (not public link)

Dropbox makes it easy to programmatically download a single file via curl (e.g. curl -O https://dl.dropboxusercontent.com/s/file.ext). It is a little trickier for a folder (a regular directory folder, not a zipped one). The shared link for a folder, as opposed to a file, does not link directly to the zipped folder (Dropbox automatically zips the folder before it is downloaded). It would appear that you could just add ?dl=1 to the end of the link, as this directly starts the download in a browser. This, however, points to an intermediary HTML document that redirects to the actual zip file and does not seem to work with curl. Is there any way to use curl to download a folder via a shared link? I realize that the best solution would be to use the Dropbox API, but for this project it is important to keep it as simple as possible. Additionally, the solution must be incorporated into a bash shell script.
It does appear to be possible with curl by using the -L option. This forces curl to follow the redirect. Additionally, it is important to specify an output name with a .zip extension, as the default will be a random alpha-numeric name with no extension. Finally, do not forget to add the ?dl=1 to the end of the link. Without it, curl will never reach the redirect page.
curl -L -o newName.zip https://www.dropbox.com/sh/[folderLink]?dl=1
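As a sketch of how that can sit inside a bash script (the link ID and output names are placeholders to replace with your own):
#!/bin/bash
link="https://www.dropbox.com/sh/folderLink?dl=1"   # replace folderLink with your shared-link ID
curl -L -o folder.zip "$link"
unzip folder.zip -d folder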
Follow redirects (use -L). Your immediate problem is that curl is not following redirects.
Set a filename (optional). Dropbox already sends a Content-Disposition header with its own filename, so there is no reason to specify one if you use the correct curl flags. Conversely, you can force a filename of your choosing.
Use one of these commands:
curl https://www.dropbox.com/sh/AAbbCCEeFF123?dl=1 -O -J -L
Preserves/writes the remote filename (-O, -J) and follows any redirects (-L).
This same line works for both individually shared files and entire folders.
Folders will save as a .zip automatically (based on the folder name).
Don't forget to change the parameter ?dl=0 to ?dl=1.
OR:
curl https://www.dropbox.com/sh/AAbbCCEeFF123?dl=1 -L -o [filename]
Follows any redirects (-L) and sets a filename (-o) of your choosing.
NOTE: Using the -J flag in general:
WARNING: Exercise judicious use of this option, especially on Windows. A rogue server could send you the name of a DLL or other file that could possibly be loaded automatically by Windows or some third party software.
Please consult: https://curl.haxx.se/docs/manpage.html#OPTIONS (See: -O, -J, -L, -o) for more.

wget syncing with changing remote HTTP files

I want to ensure an authoritative remote file is in sync with a local file, without necessarily re-downloading the entire file.
I mistakenly used wget -c http://example.com/filename
If "filename" is appended to remotely, that works fine. But if it is prepended to, e.g. "bar" is prepended to a file just containing "foo", then in my test the downloaded file wrongly ended up containing "foo\nfoo" instead of "bar\nfoo".
Can anyone else suggest a different efficient http downloading tool? Something that looks at server caching headers or etags?
I believe that wget -N is what you are looking for. It turns on timestamping and allows wget to compare the local file timestamp with the remote timestamp. Keep in mind that you might still encounter corruption if the local file timestamp cannot be trusted e.g. if your local clock is drifting too much.
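For example, against the URL from the question:
wget -N http://example.com/filename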
You could very well use curl: http://linux.about.com/od/commands/l/blcmdl1_curl.htm
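If you go the curl route, a rough sketch of a conditional re-download: -z/--time-cond compares the local file's timestamp against the server's Last-Modified header, and the --etag-* options (curl 7.68.0 or newer) do the same with ETags:
curl -z filename -o filename http://example.com/filename
curl --etag-compare etag.txt --etag-save etag.txt -o filename http://example.com/filename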

How to resume an ftp download at any point? (shell script, wget option)?

I want to download a huge file from an ftp server in chunks of 50-100MB each. At each point, I want to be able to set the "starting" point and the length of the chunk I want. I won't have the "previous" chunks saved locally (i.e. I can't ask the program to "resume" the download).
What is the best way of going about that? I use wget mostly, but would something else be better?
I'm really interested in a pre-built/built-in function rather than using a library for this purpose... Since wget (and, I think, ftp too) allows resumption of downloads, I don't see why that would be a problem... (I can't figure it out from all the options, though!)
I don't want to keep the entire huge file at my end, just process it in chunks... FYI, I'm having a look at "continue FTP download after reconnect", which seems interesting.
Use wget with the -c option.
Extracted from the man page:
-c / --continue
Continue getting a partially-downloaded file. This is useful when you want to finish up a download started by a previous instance of Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file.
For those who'd like to use command-line curl, here goes:
curl -u user:passwd -C - -o <partial_downloaded_file> ftp://<ftp_path>
(leave out -u user:pass for anonymous access)
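If you need explicit chunks with a chosen start and length rather than resumption, curl's -r/--range option takes a start-stop byte range; a sketch with a hypothetical host and 50 MB chunks:
curl -u user:passwd -r 0-52428799 -o chunk.0 "ftp://ftp.example.com/path/hugefile"
curl -u user:passwd -r 52428800-104857599 -o chunk.1 "ftp://ftp.example.com/path/hugefile"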
I'd recommend interfacing with libcurl from the language of your choice.

How can I ftp multiple files?

I have two Unix servers between which I need to FTP some files.
The directory structure is almost the same, except for a slight difference:
server a                    server b
miabc/v11_0/a/b/c/*.c       miabc/v75_0/a/b/c/
miabc/v11_0/xy/*.h          miabc/v11_0/xy/
There are many modules:
miabc
mfabc
The directory structure inside them is the same on both servers except for the 11_0 and 75_0 parts, and the directory structure inside different modules differs.
How can I FTP all the files in all modules into the corresponding module on the second server (b) using a scripting language like awk, Perl, shell, or ksh?
I'd say if you want to go with Perl, you have to use Net::FTP.
Once, I needed a script that diffs a directory/file structure on an FTP server against a corresponding directory/file structure on a local hard disk, which led me to write this script. I don't know if it is efficient or elegant, but you might find one or another idea in it.
hth / Rene
Note that you need to use the correct path of the directory where you want to send the files.
You can create a small script with PHP.
PHP provides good FTP functions, so you can easily FTP your files with it. Before that, check your FTP settings on the IIS server or FileZilla.
I have used the following code (in PHP) for sending files over FTP:
$conn_id = ftp_connect($FTP_HOST) or die("Couldn't connect to ".$FTP_HOST);
$login_result = ftp_login($conn_id, $FTP_USER, $FTP_PW);
$fp = fopen($local_file, 'r');                        // open the local file for reading
ftp_fput($conn_id, $remote_file, $fp, FTP_BINARY);    // upload it to the given remote path
This code is just for reference; go through the PHP manual before using it.
I'd use a combination of Expect, lftp and a recursive function to walk the directory structure.
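For instance, a rough lftp sketch (hypothetical credentials and paths) that pushes a local module tree into the renamed directory on the other server:
lftp -u user,pass ftp://serverb <<'EOF'
mirror -R /path/to/miabc/v11_0 /miabc/v75_0
quit
EOF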
If the file system supports symlinking or hard linking, I would use a simple wget to mirror the FTP server. On one of them, when you're wgetting, just hack the directory v11_0 to point to 75_0; wget won't know the difference.
server a:
go to /project/servera
wget the whole thing. (this should place them all in /project/servera/miabc/v11_0)
server b:
go to /project/serverb
create a directory /project/serverb/miabc/75_0, link it to /project/servera/v11_0:
ln -s /project/serverb/miabc/75_0 /project/servera/v11_0
wget server b; the symlink will be followed, so when wget tries to cd into 75_0 it will find itself in /project/servera/v11_0
Don't make the project harder than it needs to be: read the docs on wget, and ln. If wget doesn't follow symbolic links, file a bug report, and use a hard link if your FS supports it.
It sounds like you really want rsync instead. I'd try to avoid any programming in solving this problem.
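For example, something like this (hypothetical paths; the trailing slashes make rsync copy the directory contents rather than the directories themselves):
rsync -av /path/to/miabc/v11_0/ user@serverb:/path/to/miabc/v75_0/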
I suggest you log in on either server first and go to the appropriate path, miabc/v75_0/a/b/c. From there you need to sftp to the other server:
sftp user@servername
Go to the appropriate path from which the files need to be transferred.
Then run the command mget *
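The same idea can be scripted by feeding sftp its commands on stdin, a sketch assuming key-based authentication and hypothetical paths:
cd /path/to/miabc/v75_0/a/b/c      # local destination directory on this server
sftp user@servera <<'EOF'
cd /path/to/miabc/v11_0/a/b/c
mget *
EOF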
