When using GNU make, I'd like to copy a file from a remote server rather than recompute it, if possible. To do this I've been keeping a local "dummy" file which records (via its timestamp) when the file was last created and copied to the remote server. The gist of what I want to do is below: computed.file is the file itself and computed.file.remote is the dummy.
computed.file: computed.file.remote
<copy computed.file from remote server>
computed.file.remote:
<command to create computed.file>
<copy computed.file to remote server>
touch computed.file.remote
However, this forces the file to be copied both to and from the remote server when both rules are invoked, even though the file already exists locally once the second rule has created it.
Is there another way to do this?
Well, you can do something like this:
computed.file: computed.file.remote
if [ $< -nt $@ ]; then <copy computed.file from remote server>; fi
computed.file.remote:
<command to create computed.file>
<copy computed.file to remote server>
touch -r computed.file $@
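For concreteness, here is a minimal sketch of the same two rules with the copy placeholders filled in. It assumes the remote copy lives on a host named remotehost under /data/ and that passwordless scp access is available; adjust to your own transfer commands (the build command stays as a placeholder):
computed.file: computed.file.remote
    if [ $< -nt $@ ]; then scp remotehost:/data/computed.file $@; fi
computed.file.remote:
    <command to create computed.file>
    scp computed.file remotehost:/data/computed.file
    touch -r computed.file $@
Remember that recipe lines in a makefile must start with a tab character.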
I'm doing an automated file transfer from a local server to a remote server when a job is built in Jenkins.
Using the Execute shell step in the build section, I have this code:
ssh root@10.x.x.53 '
sftp -v -o IdentityFile=/root/.ssh/userkey userkey@10.x.x.11 <<EOF
lcd "/NAS/Migration/Automation"
mput *.*
ls
bye
EOF
'
I have tried different things, such as date (date +%D%H%M%s) and mput *.* *.*.%TIMESTAMP#yyyyddmmhhnnss% (which I believe is batch syntax, but it isn't working here), yet I cannot append a date to the end of the files copied to the remote server.
I want all the files being uploaded to be renamed with a timestamp appended, for example:
file1.tar --> file1_010822235959.tar
file2.tar --> file2_010922000059.tar
randomzipfile.zip --> randomzipfile_010922030659.zip
filetype.gzip --> filetype_010922153041.gzip
Is this possible in the execute shell in Jenkins?
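This isn't answered in the thread, but one possible approach, sketched here under the assumption that the files may be renamed on the host that holds them before the sftp step, is to append the timestamp locally and then mput the renamed files. The mmddyyHHMMSS format below is a guess at the pattern in the examples; adjust as needed:
cd /NAS/Migration/Automation
ts=$(date +%m%d%y%H%M%S)
for f in *.*; do
    mv "$f" "${f%.*}_${ts}.${f##*.}"   # e.g. file1.tar -> file1_010822235959.tar
done
sftp -v -o IdentityFile=/root/.ssh/userkey userkey@10.x.x.11 <<EOF
lcd "/NAS/Migration/Automation"
mput *.*
bye
EOF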
For example, I have two servers: Server A and Server B.
Server A has a directory called /testdir with some files. I need a shell script that will run on Server B to download (via FTP) the files from Server A's /testdir. The download should happen automatically whenever a new file is added to /testdir on Server A, and files that have already been transferred should be skipped.
Consider using 'lftp' incremental transfer (mirror). As an alternative, 'wget' has similar mirroring functionality:
With wget:
wget --mirror -nH ftp://serverA/testdir
With lftp:
lftp
open ftp://serverA/
mirror /testdir .
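To make the transfer happen automatically, as the question asks, one option is to run lftp non-interactively from cron. A sketch, assuming anonymous FTP access and that /local/testdir is the destination on Server B (mirror only downloads files that are new or changed):
#!/usr/bin/env bash
# sync-testdir.sh - pull new files from Server A's /testdir
lftp -e "mirror --only-newer --verbose /testdir /local/testdir; bye" ftp://serverA
A crontab entry such as */5 * * * * /path/to/sync-testdir.sh would then poll every five minutes.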
I am new to FTP configuration. What I am trying to do is as follows:
I am running a shell script on my localhost and downloading some files to my machine. Now I want the downloaded files to be stored in a temporary directory first and then transferred to a location (another directory) that I specify. I feel this mechanism is achievable with FTP and will be helpful when I host this on a domain, but I can't find resources from which to teach myself how to set it up.
OK, having visited many sites, here are some resources you might find handy:
For configuring vsftpd, here's a manual of how to install, configure and use.
To receive many files recursively via FTP, you can use wget (taken from this site):
cd /tmp/ftptransfer
wget --mirror --user=foo --password=bar ftp://ftp.originsite.com/path/to/folder
Sending many files recursively is harder; many people find the only way of doing so is to tar and send. The only problem is that the files remain tarred until you go to the other machine (remotely or via ssh) and extract them manually. There is an alternative, not using FTP but using ssh and pipes, which lets you have the files extracted on the target machine:
tar -cf - /tmp/ftptransfer | ssh geek@targetsite "cd target_dir; tar -xf -"
Explained:
tar is the application to make tar files
-c: create file
-f -: file name is "stdout"
/tmp/ftptransfer: include this folder and all subdirectories in the tar
|: Make a pipe to the next program (connect stdout to stdin)
ssh: Secure Shell program
geek@targetsite: username @ machine name you want to connect to
"...": command to send to the remote host
cd target_dir: change to the output directory
tar -xf -: extract the archive received on stdin
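If bandwidth is a concern, the same pipe can compress the stream in flight; a small variation of the command above (same assumptions):
tar -czf - /tmp/ftptransfer | ssh geek@targetsite "cd target_dir; tar -xzf -"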
For configuring SSH on Ubuntu, have a look here.
If you need more help, don't be afraid to ask! :)
I'm running CentOS 6.
I need to upload some files every hour to another server.
I have SSH access with a password to the server, but ssh keys etc. are not an option.
Can anyone help me out with a .sh script that uploads the files via scp and delete the original after a successful upload?
For this, I'd suggest using rsync rather than scp, as it is far more powerful. Just put the following in an executable script. Here, I assume that all the files (and nothing more) are in the directory local_dir/.
#!/usr/bin/env bash
rsync -azrp --progress --password-file=path_to_file_with_password \
local_dir/ remote_user#remote_host:/absolute_path_to_remote_dir/
if [ $? -ne 0 ] ; then
echo "Something went wrong: don't delete local files."
else
rm -r local_dir/
fi
The options are as follows (for more info, see, e.g., http://ss64.com/bash/rsync.html):
-a, --archive Archive mode
-z, --compress Compress file data during the transfer
-r, --recursive recurse into directories
-p, --perms Preserve permissions
--progress Show progress during transfer
--password-file=FILE Get password from FILE
--delete-after Receiver deletes after transfer, not during
Edit: removed --delete-after, since that's not the OP's intent
Be careful when setting the permissions for the file containing the password. Ideally only you should have access to the file.
As usual, I'd recommend playing a bit with rsync to get familiar with it. It is best to check the return value of rsync (using $?) before deleting the local files.
More information about rsync: http://linux.about.com/library/cmd/blcmdl1_rsync.htm
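Since the question mentions uploading every hour, the script can be scheduled with cron once it works. A sketch, assuming the script above is saved as /usr/local/bin/upload.sh and made executable:
# added via crontab -e: run at minute 0 of every hour, keep a log
0 * * * * /usr/local/bin/upload.sh >> /var/log/upload.log 2>&1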
Is there a way to publish a web site from Visual Studio 2008 using SCP or SFTP? I know it is possible to publish to my local filesystem and then perform the transfer with SCP, but I'd like something more seamless (e.g. part of Visual Studio). Does this feature exist? An addin perhaps?
The built in system for publishing pages is a little bit limited.
One thing that I find useful is that WinSCP has a feature called "Keep Remote Directory up to Date". What it does is set a bunch of file-system watchers on your local system, and if you change something locally, it will automatically upload it. Using that together with publishing to a local directory makes things easy.
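The same feature is also available from WinSCP's command-line scripting, in case you want to start it without opening the GUI. A sketch (the session URL and both paths are placeholders, and it assumes winscp.com is on the PATH):
winscp.com /command "open sftp://user@example.com/" "keepuptodate C:\publish /var/www/site" "exit"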
If you have Windows 10 and bash/linux subsystem installed and a Linux/BSD server you can:
Combine ssh and rsync
I prefer to use rsync through an ssh pipe, since it won't upload files that were not modified, so it's more efficient.
From Visual Studio, publish to a folder, say I:/www/WebProject.
Then use this command, which uploads changes only and, thanks to --delete, removes files that were deleted from (or are absent in) the publish folder:
bash -c "rsync -avH --delete --progress -e ssh /mnt/i/www/WebProject server:/var/www/"
Where I:/www/WebProject is the local folder where the project was published, and /var/www the remote directory of the web application.
Preparation (to do once)
You need to do a bit of work to allow ssh to log in without a password, using keys instead.
Let's say your bash username is the same as on the server; if not, just use username@server.
name your server:
add xx.xx.xx.xx server to the file c:/windows/system32/drivers/etc/hosts
add your server to hosts from bash with echo "xx.xx.xx.xx server" | sudo tee -a /etc/hosts (a plain sudo echo ... >> /etc/hosts would fail, because the redirection is not run as root)
from bash, generate your keys:
ssh-keygen
then [enter] (no passphrase)
send your public key to the server, in your home folder:
scp ~/.ssh/id_rsa.pub servername:~/
from your server (ssh server then password):
cat id_rsa.pub >> .ssh/authorized_keys && rm id_rsa.pub
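As a side note, the key-copying steps above can usually be collapsed into a single command where ssh-copy-id is available (it ships with most OpenSSH installations):
ssh-copy-id server   # appends ~/.ssh/id_rsa.pub to ~/.ssh/authorized_keys on the server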
Now you can ssh and scp without a password. IMO this is way better than FileZilla or plain scp.