Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 9 years ago.
I am migrating jobs from one Jenkins server to another. The job directory structure looks like this:
job1/
  config.xml
  buildnumber
  lastBuild
job2/
  config.xml
  buildnumber
  lastBuild
job3/
  config.xml
  buildnumber
  lastBuild
As shown above, we have a lot of jobs. I need to copy only each job's directory and the config.xml inside it to the corresponding job on the other Unix (CentOS) server. If I use scp -r, it copies all the subfolders; I need only the job directory name and its config.xml.
You can use rsync:
rsync -av --include='job[123]/config.xml' --exclude='job[123]/*' -e ssh job[123] remote-server:/your/desired/path/
or you can use tar:
tar -cpvf - job[123]/config.xml | ssh remote-server "cd /your/desired/path/; tar -xpf -"
or, if your tar supports -C:
tar -cpvf - job[123]/config.xml | ssh remote-server tar -xpf - -C /your/desired/path/
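The job[123] glob only covers three jobs. Since the question mentions many jobs, here is a hedged sketch that uses find to pick up every job's config.xml regardless of name; the sample tree and paths are invented for illustration:

```shell
# Sketch: archive only jobs/<name>/config.xml for every job, whatever its name.
# The sample tree below is made up for demonstration.
mkdir -p jobs/jobA jobs/jobB
touch jobs/jobA/config.xml jobs/jobA/buildnumber
touch jobs/jobB/config.xml jobs/jobB/lastBuild

# Select each job's config.xml (exactly one level below jobs/) and tar them,
# preserving the job-name directory layout.
find jobs -mindepth 2 -maxdepth 2 -name config.xml -print0 \
  | tar --null -T - -cpf configs.tar

tar -tf configs.tar
# On a real migration you would stream instead of writing a local file:
#   find ... -print0 | tar --null -T - -cpf - \
#     | ssh remote-server "tar -xpf - -C /your/desired/path/"
```

The -mindepth 2 -maxdepth 2 pair ensures only files directly inside a job directory match, so stray config.xml files elsewhere are ignored.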
How can I copy a directory from a remote machine to my local desktop? I am accessing the remote machine via ssh in the terminal.
There are lots of ways...
Using scp
On your Mac, in Terminal, make a directory on your Desktop to store the remote files:
mkdir ~/Desktop/remote
Then use scp to copy some files from the remote host to your Mac:
scp -r remoteHost:path/to/directory/* ~/Desktop/remote
Using rsync
Make a directory on your Desktop into which to copy the remote files:
mkdir ~/Desktop/remote
Now use rsync to make a synchronised copy of a directory on the remote machine in that folder on your Mac:
rsync -av RemoteMachine:path/to/directory/* ~/Desktop/remote
Using tar and ssh
In the Terminal on your Mac, run this:
ssh remoteMachine "tar -cf - /path/to/SomeDirectory" > ~/Desktop/RemoteDir.tar
That logs into the remote machine, creates a tar file of the specified directory, and writes it to stdout. The stream is picked up on your Mac and redirected into a tar file on your Desktop, which you can inspect with the Archiver or similar.
There is an easier way: install an FTP client such as FileZilla. https://filezilla-project.org/download.php
I'm trying to write a script that keeps a tar archive in sync with a folder. I am dealing with a lot of files and don't want to rebuild the tar every time the script runs; I want it to only add/remove files that have been added to or removed from the folder since the last run. Here's what I have:
# Create the tar if it doesn't exist, but don't overwrite it if it does
tarFile=/home/MyName/data.tar
touch -a "$tarFile"
cd /home/MyName
# Update the tar with files newer than the archived copies
tar -uv --exclude='dirToTar/FileIWantToExclude' -f "$tarFile" dirToTar
This works great for adding files, but if a file is deleted from dirToTar, it doesn't get removed from data.tar.
Unfortunately, tar just doesn't support this. As an alternative, you could use zip, like this:
zip -r -FS myArchiveFile.zip dirToZip
Not "tar" like you asked for, but it does seem to work nicely. Another alternative would be to use 7z (the 7-Zip archiver), which may give you better compression. Its command-line options are obscure, but this works:
7z u -up1q0r2x2y2z1w2 myArchiveFile.7z dirToZip
(I found documentation for these 7z command-line options here: https://www.scottklement.com/p7zip/MANUAL/switches/update.htm. I don't know why it's so hard to find this documentation...).
If, for some reason, you don't want the compression provided by zip or 7z, there are ways to disable that too, so zip or 7z just create a file container kind of like tar does.
In the end, though, I think you should just re-create the archive each time. I suspect that the time saved doing the kind of synchronization you ask for is probably small.
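If re-creating each time is the route, here is a small sketch (directory and file names are hypothetical) that at least skips the rebuild when nothing changed; note that deleting a file updates the directory's own mtime, so deletions are detected too:

```shell
# Sketch: rebuild data.tar only when dirToTar changed since the last build.
# Sample contents are made up for demonstration.
mkdir -p dirToTar && touch dirToTar/a.txt
tar -cpf data.tar dirToTar            # initial archive

# find prints something only if any entry (or the directory itself, whose
# mtime changes when files are created or deleted) is newer than the archive.
if [ -n "$(find dirToTar -newer data.tar -print -quit)" ]; then
    tar -cpf data.tar dirToTar        # rebuild from scratch
fi
```

Rebuilding from scratch this way keeps the archive exact (no stale entries) while paying the full tar cost only on runs where the folder actually changed.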
Hey, I'm looking for a shell script to transfer compressed archives from server A to server B. Only archives that have not already been transferred should be copied.
Please don't say plain scp or rsync, because those copy all the files from server A to server B.
I want a script that checks whether each file already exists on server B; if it does not, the script should transfer it from server A.
As Oli points out, this is exactly what rsync does. But if you want to go the manual way, then take a look at my answer here: rsync to backup one file generated in dynamic folders
For the comparison, you could first ssh to server A (localhost stands in for it here), run a command, and store its output locally:
ssh localhost "find /var/tmp/ -name \* -exec du -sm {} \;" > /tmp/out.txt
head /tmp/out.txt
531 /var/tmp/
0 /var/tmp/aaa
1 /var/tmp/debian
You now have a local file with the remote filenames and sizes; feel free to expand on this as required.
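Building on that listing idea, here is a sketch of the comparison step using comm(1), which prints names present on server A but missing on server B. The host and path names are invented, and the two printf lines stand in for real directory listings:

```shell
# Sketch: find archive names that exist locally (server A) but not remotely
# (server B). comm(1) requires both inputs to be sorted.
printf 'a.tar.gz\nb.tar.gz\nc.tar.gz\n' | sort > local.txt   # e.g. ls on server A
printf 'b.tar.gz\n'                     | sort > remote.txt  # e.g. ssh serverB "ls /backups"

# comm -23 prints lines unique to the first file: archives not yet on server B
comm -23 local.txt remote.txt > missing.txt
cat missing.txt
# In the real script, each missing name would then be copied, e.g.:
#   while read -r f; do scp "/backups/$f" serverB:/backups/; done < missing.txt
```

This only compares by name; if partially transferred files are a concern, comparing sizes (as in the du listing above) or checksums would be a natural extension.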
I have a small shell script that starts a program when I double-click it. (I have set the permissions to allow executing the script).
I want to be able to copy that script to another computer so that the new user can double-click it without needing to know anything about chmod or permissions. But I can't find out how to preserve the execute permission when I copy the file.
I can usually find answers with Google but this has me defeated - I guess I am not expressing my question properly.
Thanks
Use rsync or tar.
rsync -p file user@host:destdir
plus other options you might need.
Or
tar cvzf file.tar file
then copy (or email, etc.) file.tar to the other machine and extract the file:
tar xpvzf file.tar
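A quick local round trip (the scratch directories and script name are made up) showing that the execute bit survives tar when you extract with -p:

```shell
# Sketch: verify that tar preserves the execute permission end to end.
mkdir -p src dst
printf '#!/bin/sh\necho hello\n' > src/run.sh
chmod +x src/run.sh

tar -C src -cvzf script.tar.gz run.sh    # pack on machine A
tar -C dst -xpvzf script.tar.gz          # unpack on machine B (-p keeps modes)

ls -l dst/run.sh    # the x bits are still set
```

After extraction, the recipient can run dst/run.sh directly with no chmod needed.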
I'm trying to copy my .profile, .rvm and .ssh folders/files to a new computer and keep getting a "not a regular file" response. I know how to use the cp and ssh commands, but I'm not sure how to use them to transfer files from one computer to another.
Any help would be great, thanks!
You can do this with the scp command, which uses the ssh protocol to copy files across machines. It extends the syntax of cp to allow references to other systems (for directories such as .rvm and .ssh, add -r to copy recursively; that is what causes the "not a regular file" error otherwise):
scp username1@hostname1:/path/to/file username2@hostname2:/path/to/other/file
Copy something from this machine to some other machine:
scp /path/to/local/file username@hostname:/path/to/remote/file
Copy something from another machine to this machine:
scp username@hostname:/path/to/remote/file /path/to/local/file
Copy with a port number specified:
scp -P 1234 username@hostname:/path/to/remote/file /path/to/local/file
First zip or gzip the folders:
Use the following command:
zip -r NameYouWantForZipFile.zip foldertozip/
or
tar -pvczf BackUpDirectory.tar.gz /path/to/directory
then copy the archive down with scp:
scp username@yourserver.com:~/serverpath/BackUpDirectory.tar.gz ~/Desktop
You may also want to look at rsync if you're doing a lot of files.
If you're going to making a lot of changes and want to keep your directories and files in sync, you may want to use a version control system like Subversion or Git. See http://xoa.petdance.com/How_to:_Keep_your_home_directory_in_Subversion