Copy file using a URL from the command line - Windows

I have a batch script that collects some data and uploads it to other servers, using xcopy in a Windows 7 command line. I want that script to also collect some files that are on SharePoint, so I need to fetch them using a URL, and I need to log in.
xcopy can't do the job, but are there other programs that can?

Theoretically, you can bend cURL to download a file from a SharePoint site. If the site is publicly available, it's all very simple. If not, you'll have to authenticate first, and this might be a problem.
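Many on-premises SharePoint installations accept NTLM authentication, which curl supports natively. A minimal sketch, assuming NTLM is in play and with a made-up URL and credentials:
curl --ntlm -u DOMAIN\user:password -O "https://sharepoint.example.com/sites/team/Shared%20Documents/report.xlsx"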

wget for Windows, maybe? http://gnuwin32.sourceforge.net/packages/wget.htm
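If the SharePoint share happens to accept basic HTTP authentication, a wget one-liner could look like this (URL and credentials are hypothetical):
wget --user=myuser --password=mypass https://sharepoint.example.com/docs/report.xlsx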

The login part can be done using cURL, supplying the user name and password as POST arguments. You can supply POST args using the -d or --data flag. Once you are logged in (and have the required permission), you can fetch the required file and then simply transfer it using xcopy, as you are already doing for the local files.
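A rough sketch of that flow, assuming a form-based login and using a cookie jar to carry the session between the two requests (URLs and field names are invented for illustration):
curl -c cookies.txt -d "username=me" -d "password=secret" https://sharepoint.example.com/login
curl -b cookies.txt -O https://sharepoint.example.com/docs/report.xlsx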

Related

Need to create/delete text file on Unix box from Windows

I have a requirement where I need to create and delete a text file on Unix from my Windows server, where I have Informatica installed.
Using a workflow I was able to place the file on Unix, but I could not find a way to delete an already existing file.
Also, the client does not want us to install any additional software like PuTTY on the Windows server.
Please feel free to ask for more information if required.
If you are able to drop a .sh (shell script) onto the Unix box and execute it there, then you can delete an empty folder with the "rmdir yourfolder" command; if your folder has anything under it, you will need to use "rm -r yourfolder"; and if the files inside the folder are write-protected, you can force removal with "rm -rf yourfolder". To delete a single file, plain "rm yourfile.txt" is enough.
Just make sure you navigate to the correct folder before deleting anything.
Br, Aljaž.
Use a Command Task in the Informatica workflow to invoke the syntax mentioned by @Aljaz
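A minimal sketch of the shell script such a task could drop and execute on the Unix box (the path is hypothetical):
#!/bin/sh
# Remove the previously delivered file; -f suppresses the error when the
# file does not exist, so the script is safe to re-run.
rm -f /data/inbound/yourfile.txt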

How can I send an HTTPS request from a file?

Let's assume I have a file request.txt that looks like:
GET / HTTP/1.0
Some_header: value
text=blah
I tried:
cat request.txt | openssl s_client -connect server.com:443
Unfortunately it didn't work, and I have to manually copy & paste the file contents instead. How can I do it within a script?
cat is not ideally suited to downloading remote files; it's best used for files local to the file system running the script. To download a remote file, there are other commands that handle this better.
If your environment has wget installed, you can download the file by URL. That would look like:
wget https://server.com/request.txt
If your environment has curl installed, you can download the file by URL. That would look like:
curl -O https://server.com/request.txt
Please note that if you want to store the response in a variable for further modification, you can do this as well with a bit more work.
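For example, in a POSIX shell, using command substitution (the URL is the question's placeholder):
response=$(curl -s https://server.com/request.txt)
echo "$response"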
Also worth noting: if you really must use cat to fetch a remote file, it is possible, but it requires ssh access to a file that is already publicly available over HTTP/S. I can't think of a practical reason to do it this way, but for the sake of completeness I wanted to mention that it could be done, though it probably shouldn't be.
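For completeness, that cat-over-ssh variant would look something like this, assuming you have an ssh account on the server (host and path are hypothetical):
ssh user@server.com 'cat /var/www/html/request.txt' > request.txt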

Copy files from authenticated Windows server to Unix server

I have a set of zip files that need to be copied from an authenticated Windows server to a Unix server, which is authenticated too.
I have tried using Pentaho but without success. Is there any other way to do this copy, such as with scripts or some similar method?
Thanks in advance.
Assuming your server supports ssh...
PuTTY comes with a utility called pscp, which works the same as scp.
To copy a file you would typically do this:
pscp myfile.zip me@myserver:/my_directory/.
There is also WinSCP if you want something more GUI-oriented.
Use the scp command. For more detail, visit http://www.garron.me/en/linux/scp-linux-mac-command-windows-copy-files-over-ssh.html
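For this question's zip files, a hedged pscp example (host, paths, and password are made up; -pw supplies the password non-interactively):
pscp -pw mypassword C:\data\*.zip me@unixserver:/data/incoming/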

Downloading FTP files in Q (KDB)

Does anybody know if there is any way to download files from an FTP server directly from Q (kdb)? I know that it's possible to use HTTP, but I haven't seen any examples using FTP. It seems the only way is to write a wrapper around something like curl, but maybe that has already been done? Any thoughts?
Why not one of the following:
Write a script to fetch the file, then start the q processing.
Use a system command to call any Linux/DOS commands you want, then use the kdb key command to check that the files exist as expected.
Use a system call to curl without a file destination; its default destination is stdout, so the file contents will be returned to q as the return value of system:
data:system"curl ftp://wherever/whatever"
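Combining the second and third options, a small q sketch that fetches to a local file and then verifies it arrived (the URL and file name are hypothetical):
system "curl -s -o data.csv ftp://ftp.example.com/data.csv"   / fetch quietly to a local file
`data.csv in key `:.   / 1b if the file now exists in the current directory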
For Linux, you can simply run any curl command or other system command from q. I used the following, for example:
system "curl --proxy my_proxy_details ftp://ftp.microsoft.com/developr/visual_c/README.TXT -o README.txt"
The -o option names the downloaded file.
Similarly, you can run other curl commands or other system commands to get FTP files into q.
This site has good curl examples:
http://www.cyberciti.biz/faq/curl-download-file-example-under-linux-unix/

Windows Command Line FTP to deploy website

Trying to set up a post-build script on my CI server to push changes to our web server by FTP. In as few lines as possible, how can I push a folder of files to my web server using Windows FTP? For example, the deployment folder is:
c:\deployment\*.*
How can I recursively push all files, replacing those on the web server?
I'm open to using cmd or PowerShell - MS Windows only
Thanks
Windows' built-in command-line FTP client doesn't support recursion. The easiest way would be to use a different FTP client. NcFTP will do what you're looking for; see the manual page for ncftpput. The syntax is basically as follows:
cd c:\deployment
ncftpput -u user -p pass -R ftp.ftpserver.com /path/on/ftp/server .\*
Or if your web server also runs an ssh service, then rsync would be even better.
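A hedged sketch of that rsync alternative, assuming cwRsync (or rsync under Cygwin) on the Windows side; the host and remote path are made up:
rsync -avz /cygdrive/c/deployment/ user@webserver.example.com:/var/www/site/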
Fsync is good; I have been using it for a long time. It pushes only what has changed, with recursion of course, and it can exclude files too. It tracks what has changed on the client side (much faster). Its biggest drawback: no SFTP support.
