Upload to S3 via shell script without aws-cli, possible? - bash

As the title says, is it possible to upload to S3 via shell script without aws-cli-tools?
If so, how?
What I'm trying to do is read from a txt file on S3 (which is public, so no authentication is required).
But I want to be able to overwrite whatever is in the file (which is just a number).
Thanks in advance,
Fadi

Yes, you can! You basically emulate the API calls the SDK would make for you using standard Linux command-line utilities.
Look at:
https://aws.amazon.com/code/Amazon-S3/943
and/or
http://tmont.com/blargh/2014/1/uploading-to-s3-in-bash
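For the use case above, here is a minimal sketch of that approach, assuming AWS Signature Version 2 (the scheme the linked posts use; newer regions require Signature V4) and placeholder bucket, key, and credentials:

#!/bin/sh
# Overwrite a small text object on S3 with curl, signed with AWS Signature V2.
# Bucket name, object key, and credentials below are placeholders (assumptions).
bucket=my-bucket
file=counter.txt
contentType="text/plain"
dateValue=$(date -R)
resource="/${bucket}/${file}"
stringToSign="PUT\n\n${contentType}\n${dateValue}\n${resource}"
s3Key=AKIAIOSFODNN7EXAMPLE
s3Secret=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
signature=$(printf "%b" "${stringToSign}" | openssl sha1 -hmac "${s3Secret}" -binary | base64)

echo "42" > "${file}"   # the number the object should now contain
curl -X PUT -T "${file}" \
  -H "Host: ${bucket}.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "Content-Type: ${contentType}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  "https://${bucket}.s3.amazonaws.com/${file}"

Reading the public file back needs no signing at all: curl -s https://my-bucket.s3.amazonaws.com/counter.txt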

I use s3cmd, which is a command-line tool written in Python.
It uses the (RESTful) web APIs.
s3cmd put --recursive
and
s3cmd sync
would be the interesting bits:
Synchronize a directory tree to S3
s3cmd sync LOCAL_DIR s3://BUCKET[/PREFIX] or s3://BUCKET[/PREFIX] LOCAL_DIR
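For the single-file case above, a minimal sketch (assuming s3cmd has already been set up with s3cmd --configure; bucket and key names are hypothetical):

echo "42" > counter.txt
s3cmd put counter.txt s3://my-bucket/counter.txt    # overwrites the existing object
s3cmd get --force s3://my-bucket/counter.txt        # read it back, overwriting the local copy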

Related

How to execute a bash file containing curl instructions within a Google storage bucket and directly copy the contents to the bucket?

I have a bash script file where each line is a curl command to download a file. This bash file is in a Google bucket.
I would like to either execute the file directly from storage and copy its downloaded contents there, or execute it locally and copy its contents directly to the bucket.
Basically, I do not want to keep these files on my local machine. I have tried things along these lines, but it either failed or simply downloaded everything locally.
gsutil cp gs://bucket/my_file.sh - | bash gs://bucket/folder_to_copy_to/
Thank you!
To do so, the bucket needs to be mounted on the pod (the pod would see it as a directory).
If the bucket supports NFS, you would be able to mount it as shown here.
Also, there is another way as shown in this question.
Otherwise, you would need to copy the script to the pod, run it, then upload the generated files to the bucket, and lastly clean everything up.
The better option is to use a Filestore instance, which can be easily mounted using CSI drivers, as mentioned here.
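A minimal sketch of that copy-run-upload fallback using gsutil alone (bucket and folder names are hypothetical, and the script is assumed to download everything into ./downloads):

gsutil cp gs://bucket/my_file.sh .                            # fetch the script from the bucket
bash my_file.sh                                               # run it; the curl lines download locally
gsutil -m cp -r ./downloads gs://bucket/folder_to_copy_to/    # push the results back to the bucket
rm -rf ./downloads my_file.sh                                 # clean up the local copies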

Copy files from authenticated Windows server to Unix server

I have a set of zip files that need to be copied from an authenticated Windows server to a Unix server, which is authenticated too.
I have tried using Pentaho but have not had any success. Is there any other way this copy can be done, such as with scripts or some similar method?
Thanks in advance.
Assuming your server supports SSH:
PuTTY comes with a utility called pscp, which works the same as scp.
To copy a file you would typically do this:
pscp myfile.zip me@myserver:/my_directory/.
There is also WinSCP if you want something with more of a GUI.
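To move the whole set of zips in one go, pscp also accepts wildcards (user, host, and paths here are hypothetical):

pscp C:\data\*.zip me@myserver:/my_directory/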
Use the scp command. For more detail, visit http://www.garron.me/en/linux/scp-linux-mac-command-windows-copy-files-over-ssh.html

Windows Command Line FTP to deploy website

Trying to set up a post-build script on my CI server to push changes to our web server by FTP. In as few lines as possible, how can I push a folder of files to my web server using Windows FTP? For example, the deployment folder is:
c:\deployment\*.*
How can I recursively push all the files, replacing those on the web server?
I'm open to using cmd or PowerShell - MS Windows only
Thanks
Windows' built-in command-line FTP client doesn't have recursion built-in. The easiest way would be to use a different FTP client. NcFTP will do what you're looking for. See the manual page for ncftpput. The syntax is basically as follows:
cd c:\deployment
ncftpput -u user -p pass -R ftp.ftpserver.com /path/on/ftp/server .\*
Or if your web server also runs an ssh service, then rsync would be even better.
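A minimal sketch of that rsync route (assuming an rsync build such as cwRsync on the Windows side; host and paths are hypothetical):

# mirror the deployment folder; --delete removes server-side files no longer present locally
rsync -avz --delete /cygdrive/c/deployment/ deployuser@webserver:/var/www/site/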
Fsync is good; I have been using it for a long time. It pushes only what has changed, recurses of course, can exclude files, and tracks on the client side (much faster) what has changed. Its biggest drawback: no SFTP. /ProductList/Fsync.html

Copy file using a URL from the command line

I have a batch script that is used to collect some data and upload it to other servers, using xcopy in a Windows 7 command line. I want that script to collect some files that are on SharePoint, so I need to get them using a URL, and I need to log in.
xcopy can't do the job, but are there other programs that can do it?
Theoretically, you can bend cURL to download a file from a SharePoint site. If the site is publicly available, it's all very simple. If not, you'll have to authenticate first, and this might be a problem.
wget for windows maybe? http://gnuwin32.sourceforge.net/packages/wget.htm
The login part can be done using cURL, supplying the user name and password as POST arguments. You can supply POST args using the -d or --data flag. Once you are logged in (and have the required permission), you can fetch the required file and then simply transfer it using xcopy, as you are already doing for the local files.
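A minimal sketch of that flow (the URL, form-field names, and file name are hypothetical; many SharePoint sites use Windows authentication instead, in which case curl's --ntlm -u user:pass would replace the form login):

# log in and store the session cookie; the form field names vary per site (assumption)
curl -c cookies.txt -d "username=me" -d "password=secret" "https://sharepoint.example.com/login"
# fetch the file with the saved cookie, then hand it to the existing xcopy step
curl -b cookies.txt -o report.xlsx "https://sharepoint.example.com/sites/docs/report.xlsx"
xcopy report.xlsx \\otherserver\share\ /Y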

How can I ftp multiple files?

I have two Unix servers between which I need to FTP some files.
The directory structure is almost the same, except for a slight difference, like:
server a                  server b
miabc/v11_0/a/b/c/*.c     miabc/v75_0/a/b/c/
miabc/v11_0/xy/*.h        miabc/v11_0/xy/
There are many modules:
miabc
mfabc
The directory structure inside them is the same on both servers except for v11_0 vs v75_0, and the directory structure differs from module to module.
How can I FTP all the files in every module into the corresponding module on server b with a scripting language like awk, Perl, shell, or ksh?
I'd say if you want to go with Perl, you have to use Net::FTP.
Once, I needed a script that diffs a directory/file structure on an FTP server against a corresponding directory/file structure on a local hard disk, which led me to write this script. I don't know if it is efficient or elegant, but you might find one or another idea in it.
hth / Rene
Note that you need to use the correct path of the directory where you want to send the files.
You can create a small script with PHP, which provides good FTP functions, so you can easily FTP your files. Before that, check the FTP settings of your IIS server or FileZilla.
I have used the following PHP code for sending files over FTP:
$conn_id = ftp_connect($FTP_HOST) or die("Couldn't connect to ".$FTP_HOST);
$login_result = ftp_login($conn_id, $FTP_USER, $FTP_PW);
$fp = fopen($local_file, 'r');                       // open the local file for reading
ftp_fput($conn_id, $remote_file, $fp, FTP_BINARY);   // upload it to the given remote path
fclose($fp);
ftp_close($conn_id);
This code is just for reference; go through the PHP manual before using it.
I'd use a combination of Expect, lftp and a recursive function to walk the directory structure.
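A sketch of the lftp side of that, using its reverse-mirror mode to upload (host, credentials, and paths are hypothetical):

# mirror -R uploads a local tree to the remote server, recursing as needed
lftp -u user,password serverb <<'EOF'
mirror -R miabc/v11_0/a/b/c miabc/v75_0/a/b/c
mirror -R miabc/v11_0/xy    miabc/v11_0/xy
EOF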
If the file system supports symlinking or hardlinking, I would use a simple wget to mirror the FTP server. When you're wgetting one of them, just hack the directory v11_0 to point to 75_0; wget won't know the difference.
server a:
go to /project/servera
wget the whole thing. (this should place them all in /project/servera/miabc/v11_0)
server b:
go to /project/serverb
create a directory /project/serverb/miabc, and inside it link 75_0 to /project/servera/miabc/v11_0:
ln -s /project/servera/miabc/v11_0 /project/serverb/miabc/75_0
wget server b; the symlink will be followed, so when wget tries to cwd into 75_0 it will find itself in /project/servera/miabc/v11_0
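Put together as commands, that might look like this (hosts, login details, and wget's -nH flag to drop the hostname directory are assumptions):

# mirror server a; files land under /project/servera/miabc/v11_0
mkdir -p /project/servera && cd /project/servera
wget -m -nH "ftp://user:pass@servera/miabc/"
# point server b's 75_0 tree at the local copy of server a's v11_0
mkdir -p /project/serverb/miabc
ln -s /project/servera/miabc/v11_0 /project/serverb/miabc/75_0
# mirror server b; anything under 75_0 is written through the symlink
cd /project/serverb
wget -m -nH "ftp://user:pass@serverb/miabc/"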
Don't make the project harder than it needs to be: read the docs on wget and ln. If wget doesn't follow symbolic links, file a bug report, and use a hard link if your FS supports it.
It sounds like you really want rsync instead. I'd try to avoid any programming in solving this problem.
I suggest you log in to either of the servers first and go to the appropriate path, e.g. miabc/v75_0/a/b/c/. From there, do an sftp to the other server:
sftp user@servername
Go to the appropriate path from which the files need to be transferred,
and run the command mget *
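Spelled out (assuming key-based or interactive login works; host and paths are hypothetical):

cd miabc/v75_0/a/b/c        # on server b, in the directory that should receive the files
sftp user@servera <<'EOF'
cd miabc/v11_0/a/b/c
mget *
EOF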
