How to wget an HTML file from an administrative share on a Windows webserver - windows

Is there any way I can use wget on Unix to transfer an HTML file from a Windows administrative share?
The file path I am trying to access looks like this example:
www.webserv.com/share$/reportfolder/index.html
This is a very obtuse way of creating a monitoring script on an OS without the use of curl, Nagios, or extra Perl modules.
Any help given will be greatly appreciated.

Wget works only with the HTTP, HTTPS, and FTP protocols (see its man page).
If you need to access Windows files from Unix, you have several choices:
FTP
Samba (http://www.samba.org); see the smbclient sketch after this list
SSH
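If the path really is an SMB administrative share rather than a plain web path, Samba's smbclient can fetch a single file non-interactively. A minimal sketch, assuming smbclient is installed; the host, share, credentials, and paths below are placeholders:
# connect to the administrative share and download one file
smbclient '//webserv/share$' -U user%password -c 'get reportfolder/index.html /tmp/index.html'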

Related

Transfer file in Secure Shell

I use Secure Shell as a Chrome extension (https://chrome.google.com/webstore/detail/secure-shell/pnhechapfaindjhompbnflcldabbghjo?hl=da)
Now I have finished some programming and I need the file on my computer.
How can I transfer the file to my computer? Can I send it by email or something?
I have tried yanking all the lines in vim, but I still can't get them copied to my Windows clipboard.
One entertaining (and moderately ridiculous) approach would be sprunge.
Run this on the remote machine:
cat myFile | curl -F 'sprunge=<-' http://sprunge.us
And then visit the URL it prints on your local machine. :D
I presume that you are using Windows and trying to download your file from a Linux-like OS.
You can use MobaXterm, which comes with file transfer features:
http://mobaxterm.mobatek.net
On the CLI you can use scp to download and upload.
Another option is FileZilla, using the SFTP protocol.
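For the scp route, a minimal sketch; the user, host, and path are placeholders and assume the remote machine runs an SSH server:
# download: copy the remote file into the current local directory
scp user@remotehost:/home/user/myFile .
# upload works the same way in reverse
scp myFile user@remotehost:/home/user/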

How can I send an HTTPS request from a file?

Let's assume I have a file request.txt that looks like:
GET / HTTP/1.0
Some_header: value
text=blah
I tried:
cat request.txt | openssl s_client -connect server.com:443
Unfortunately it didn't work, and I have to manually copy and paste the file contents instead. How can I do this within a script?
cat is not ideally suited to downloading remote files; it's best used for files local to the file system running the script. To download a remote file, there are other commands that handle this better.
If your environment has wget installed, you can download the file by URL (see the wget documentation for usage examples). That would look like:
wget https://server.com/request.txt
If your environment has curl installed, you can likewise download the file by URL (see the curl documentation for examples). That would look like:
curl -O https://server.com/request.txt
Please note that if you want to store the response in a variable for further processing, you can do this as well with a bit more work.
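As a sketch of that, assuming curl and the placeholder URL from above:
# -s silences the progress meter, -f makes curl fail on HTTP errors
response=$(curl -sf https://server.com/request.txt)
echo "$response"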
Also worth noting: if you really must use cat to fetch a remote file, it is possible, but it typically requires ssh, i.e. shell access to a file that is already publicly available over HTTP/S. I can't think of a practical reason to go about it this way, but for the sake of completeness I wanted to mention that it can be done, though it probably shouldn't be.
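For completeness, that ssh-based variant might look like this sketch, where the user, host, and remote path are all placeholders:
# run cat on the remote machine and redirect its output into a local file
ssh user@server.com cat /var/www/request.txt > request.txt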

Copy files from authenticated windows server to Unix server

I have a set of zip files that need to be copied from an authenticated Windows server to a Unix server, which is also authenticated.
I have tried using Pentaho but without success. Is there another way to do this copy, such as with scripts?
Thanks in advance.
Assuming your server supports SSH...
PuTTY comes with a utility called pscp which works the same as scp.
To copy a file you would typically do this:
pscp myfile.zip me@myserver:/my_directory/.
There is also WinSCP if you want something more GUI-based.
Use the scp command. For more detail, visit http://www.garron.me/en/linux/scp-linux-mac-command-windows-copy-files-over-ssh.html
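A hedged sketch of that, assuming the Windows server runs an SSH service (for example Win32 OpenSSH) and using placeholder names and paths:
# pull every zip from the Windows server into the current Unix directory
scp 'user@winserver:C:/data/*.zip' .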

Windows Command Line FTP to deploy website

Trying to set up a post-build script on my CI server to push changes to our web server by FTP. In as few lines as possible, how can I push a folder of files to my web server using Windows FTP? For example, the deployment folder is:
c:\deployment\*.*
How can I recursively push all the files to replace those on the web server?
I'm open to using cmd or PowerShell - MS Windows only.
Thanks
Windows' built-in command-line FTP client doesn't support recursion. The easiest approach is to use a different FTP client; NcFTP will do what you're looking for. See the manual page for ncftpput. The syntax is basically as follows:
cd c:\deployment
ncftpput -u user -p pass -R ftp.ftpserver.com /path/on/ftp/server .\*
Or if your web server also runs an ssh service, then rsync would be even better.
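For instance, a hedged rsync sketch, assuming rsync is available on the Windows box (e.g. via Cygwin or WSL) and the web server accepts SSH; the paths are placeholders:
# -a preserves attributes, -z compresses, --delete removes files gone from the source
rsync -avz --delete /cygdrive/c/deployment/ user@webserver:/var/www/site/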
Fsync is good; I have been using it for a long time. It pushes only what has changed, recurses of course, can exclude files, and tracks what has changed on the client side (much faster). Its biggest drawback: no SFTP support.

How can I ftp multiple files?

I have two Unix servers between which I need to FTP some files.
The directory structure is almost the same except for a slight difference, like:
server a                  server b
miabc/v11_0/a/b/c/*.c     miabc/v75_0/a/b/c/
miabc/v11_0/xy/*.h        miabc/v11_0/xy/
There are many modules:
miabc
mfabc
The directory structure inside the modules is the same on both servers except for 11_0 versus 75_0, and the structure differs from module to module.
How can I FTP all the files in every module into the corresponding module on server b, using a scripting language such as awk, Perl, shell, or ksh over FTP?
I'd say if you want to go with Perl, you have to use Net::FTP.
Once, I needed a script that diffs a directory/file structure on an FTP server against a corresponding directory/file structure on a local hard disk, which led me to write this script. I don't know if it is efficient or elegant, but you might find one or another idea in it.
hth / Rene
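If you would rather script the stock ftp client from the shell instead of using Perl, a here-document works; a minimal sketch with placeholder credentials and paths (note plain FTP sends the password unencrypted):
# -n suppresses auto-login so the credentials can be given explicitly
ftp -n serverb <<'EOF'
user myuser mypassword
lcd miabc/v11_0/a/b/c
cd miabc/v75_0/a/b/c
prompt
mput *.c
bye
EOF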
Note that you need to use the correct path of the directory where you want to send the files.
You can create a small script with PHP; PHP provides good FTP functions, so you can easily FTP your file. Before that, check the FTP settings of your IIS server or FileZilla.
I have used the following PHP code for sending files over FTP:
$conn_id = ftp_connect($FTP_HOST) or die("Couldn't connect to " . $FTP_HOST);
$login_result = ftp_login($conn_id, $FTP_USER, $FTP_PW);
// ftp_fput() uploads an already-open local file handle to the remote path
ftp_fput($conn_id, $remote_file, $local_handle, FTP_BINARY);
This code is just for reference; go through the PHP manual before using it.
I'd use a combination of Expect, lftp and a recursive function to walk the directory structure.
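lftp alone can handle the recursion; a minimal sketch with placeholder names:
# mirror -R reverses direction (local to remote) and recurses automatically
lftp -u myuser,mypassword -e 'mirror -R miabc/v11_0 miabc/v75_0; quit' serverb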
If the file system supports symlinks or hard links, I would use a simple wget to mirror the FTP server. On one of them, when you're wgetting, just point the directory v11_0 at 75_0 with a link; wget won't know the difference.
server a:
go to /project/servera
wget the whole thing (this should place everything in /project/servera/miabc/v11_0)
server b:
go to /project/serverb
create a directory /project/serverb/miabc/75_0 and link it to /project/servera/v11_0:
ln -s /project/serverb/miabc/75_0 /project/servera/v11_0
wget server b; when wget tries to cwd into 75_0, it will follow the link and find itself in /project/servera/v11_0.
Don't make the project harder than it needs to be: read the docs on wget and ln. If wget doesn't follow symbolic links, file a bug report, and use a hard link if your FS supports it.
It sounds like you really want rsync instead. I'd try to avoid any programming in solving this problem.
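For instance, a single hedged rsync invocation (assuming SSH access between the servers; all names are placeholders) copies one module's tree into the renamed directory on server b:
# the trailing slash on the source copies its contents, not the directory itself
rsync -avz miabc/v11_0/ user@serverb:miabc/v75_0/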
I suggest you log in to either server first and go to the appropriate path, e.g. miabc/v75_0/a/b/c/. From there you need to sftp to the other server:
sftp user@servername
Go to the appropriate path whose files need to be transferred.
Run the command mget *
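To make that non-interactive, sftp can read commands from a batch file; a minimal sketch with placeholder names, assuming key-based authentication (the -b option disables password prompts):
# batch.txt contains the same commands you would type interactively:
#   cd miabc/v75_0/a/b/c
#   lcd miabc/v11_0/a/b/c
#   mget *
sftp -b batch.txt user@servername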
