Copy a file from server to local hard disk [closed] - macos

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I'm trying to copy a database dump SQL file from my server to my hard disk using the macOS Terminal (OpenSSH client). I know the command should be something like:
scp [user@]host:mydump.sql mydump_local.sql
But I found that it copied the file onto the same server instead of my hard disk (i.e., with ls * I see both mydump.sql and mydump_local.sql there).
What am I doing wrong?

First, don't ssh into the remote server. Second, I find this to be a very good resource for scp syntax: Example syntax for Secure Copy (scp)
The one you're looking for is this: to copy the file "foobar.txt" from a remote host to the local host
$ scp your_username@remotehost.edu:foobar.txt /some/local/directory
If you're still having issues, please post the exact command you're using.

Try using ./mydump_local as the destination when you want the file placed in the current directory.

If you first make an ssh connection to the remote host, you get a remote shell. If you copy your file in that remote shell, the scp runs on the remote server itself.
Don't ssh to the remote host; just use scp from your local machine.

Related

Copy a directory when logged in via ssh to my desktop via terminal [closed]

Closed 6 years ago.
How can I copy a directory from a remote machine to my local desktop? I'm accessing the remote machine via ssh in the terminal.
There are lots of ways...
Using scp
On your Mac, in Terminal, make a directory on your Desktop to store the remote files:
mkdir ~/Desktop/remote
Then use scp to copy some files from the remote host to your Mac:
scp remoteHost:path/to/directory/* ~/Desktop/remote
Using rsync
Make a directory on your Desktop into which to copy the remote files:
mkdir ~/Desktop/remote
Now use rsync to make a synchronised copy of a directory on the remote machine in that folder on your Mac:
rsync -av RemoteMachine:path/to/directory/* ~/Desktop/remote
Using tar and ssh
In the Terminal on your Mac, run this:
ssh remoteMachine "tar -cf - /path/to/SomeDirectory" > ~/Desktop/RemoteDir.tar
That will log into the remote machine, create a tar file of the specified directory, and write it to stdout. The stream is picked up on your Mac and redirected into a tar file on your Mac's Desktop that you can inspect with the Archiver or similar.
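The pipe mechanics are easy to try without a remote machine: tar writes the archive to stdout on one side and the shell redirects it on the other. A local sketch (no ssh involved; the directory and file names are made up):

```shell
#!/bin/sh
set -e

# Throwaway directory standing in for /path/to/SomeDirectory
workdir=$(mktemp -d)
mkdir -p "$workdir/SomeDirectory"
echo "hello" > "$workdir/SomeDirectory/file.txt"

# tar -cf - writes the archive to stdout; the shell redirects it to a
# file, which is exactly what happens on the local side of the ssh pipe
(cd "$workdir" && tar -cf - SomeDirectory) > "$workdir/RemoteDir.tar"

# Inspect the archive without extracting it
tar -tf "$workdir/RemoteDir.tar"
```

With ssh in the middle, only the left-hand side of the redirect changes; the archive bytes travel over the ssh channel instead of a local pipe.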
There is an easier way: install an FTP client like FileZilla. https://filezilla-project.org/download.php

shell script for comparing files between two servers [closed]

Closed 9 years ago.
Hey, I'm looking for a shell script to transfer compressed archives from server A to server B. Only the compressed archives which have not yet been transferred should be copied from server A to server B.
Please don't say scp or rsync, because those will copy all the files from server A to server B.
I want a script which checks whether each file exists on server B. If a file does not exist on server B, it should transfer that file from server A to server B.
As Oli points out, this is exactly what rsync does... But if you want to go the manual way, then take a look at my answer here: rsync to backup one file generated in dynamic folders
What you could also do for the comparison is first ssh to host A, run a command, and store its output locally:
ssh localhost "find /var/tmp/ -name \* -exec du -sm {} \;" > /tmp/out.txt
head /tmp/out.txt
531 /var/tmp/
0 /var/tmp/aaa
1 /var/tmp/debian
You now have a local file with the remote filenames and sizes; feel free to expand as required.
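To go from such a listing to "archives present on server A but missing on server B", you can compare two sorted name lists with comm. A local sketch with two throwaway directories standing in for the servers (file names are made up):

```shell
#!/bin/sh
set -e

# Stand-ins for the archive directories on server A and server B
a=$(mktemp -d); b=$(mktemp -d)
touch "$a/dump1.tar.gz" "$a/dump2.tar.gz" "$b/dump1.tar.gz"

# In real use each list would come from ssh, e.g.:
#   ssh serverA 'ls /path/to/archives' | sort > /tmp/a.txt
ls "$a" | sort > /tmp/a.txt
ls "$b" | sort > /tmp/b.txt

# comm -23 prints lines only in the first file:
# archives present on A but missing on B
comm -23 /tmp/a.txt /tmp/b.txt
```

Each name the comparison prints can then be fed to scp in a loop, which gives you the "only transfer what's missing" behaviour the question asks for.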

Executing the following steps in bash [closed]

Closed 8 years ago.
Here are the steps I am doing again and again, and I was wondering if I can write a script which does this.
I have two local accounts:
thrust
hduser
Now, I am writing Java code in Eclipse in my thrust account.
After the code runs satisfactorily, I do:
mvn clean
cp -r /home/thrust/projectA /tmp/
su - hduser
cp -r /tmp/projectA /home/hduser/
cd /home/hduser/projectA
mvn package
Is there a way I can automate all these steps?
Or is there a way I can write code on this thrust account and the code automatically syncs with the hduser account?
Given that you are writing code (and you are doing it "again and again"), it seems you should be using a revision control system (like Subversion or Git), either with a local repository or with a hosting service (for example: GitHub or Bitbucket).
If you don't want to use an RCS, you can create a shell script to automate what you are already doing, as suggested by @iamnotmaynard.
Also, take a look at rsync or ssh. They can help you to copy files from one user to another more easily (rsync can also help you to keep these files synchronized).
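The copy steps from the question can be sketched as a small script. This is a local mock: mktemp directories stand in for /home/thrust and /home/hduser, and the mvn and su steps are reduced to comments, since how they run depends on your setup:

```shell
#!/bin/sh
set -e

# Stand-ins for the two home directories
src=$(mktemp -d)   # would be /home/thrust
dst=$(mktemp -d)   # would be /home/hduser
mkdir -p "$src/projectA"
echo "public class A {}" > "$src/projectA/A.java"

# (cd "$src/projectA" && mvn clean) would go here

# Copy the project across; -a preserves modes and timestamps
cp -a "$src/projectA" "$dst/"

# (cd "$dst/projectA" && mvn package) would then run as hduser

ls "$dst/projectA"
```

With the real paths filled in, running the script replaces the manual cp/su sequence; a version-control checkout in each account, as suggested above, is still the cleaner long-term answer.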

How to mount a specific file system on a disk image? [closed]

Closed 11 years ago.
I have a disk image file containing multiple file systems, such as HFS (Journaled) in addition to Joliet or UDF. I want to mount whatever non-HFS file system is there. First, I attach the image without mounting:
$ hdiutil attach -nomount path/to/image.iso
/dev/disk3 Apple_partition_scheme
/dev/disk3s1 Apple_partition_map
/dev/disk3s2 Apple_HFS
Then, the man page for mount seems to say that I can mount non-HFS file systems like this:
$ mount -a -t nohfs /dev/disk3s2 /tmp
But the response is
mount: exec /System/Library/Filesystems/nohfs.fs/Contents/Resources/mount_nohfs for /private/tmp: No such file or directory
which sounds like it just doesn't understand the documented "no" prefix for filesystem types that you don't want to mount. Is there any way to make this work, or must I know what specific file system I want to mount?
EDIT TO ADD: Would someone care to explain the negative votes and close votes?
First, you don't want the -a option, as that tells it to mount everything listed in /etc/fstab; your disk image isn't listed there, so that's incorrect. Second, I'm not sure why the "no" prefix isn't working, but you should be able to do it by specifying the correct filesystem to use (cd9660 would be the one to use for a Joliet image). Third, if the hybrid format is done the way I've seen, you'll want to mount /dev/disk3, not /dev/disk3s2:
mkdir /tmp/mountpoint
mount -t cd9660 /dev/disk3 /tmp/mountpoint

How to copy files across computers using SSH and MAC OS X Terminal [closed]

Closed 8 years ago.
I'm trying to copy my .profile, .rvm and .ssh folders/files to a new computer and keep getting a "not a regular file" response. I know how to use the cp and ssh commands but I'm not sure how to use them in order to transfer files from one computer to another.
Any help would be great, thanks!
You can do this with the scp command, which uses the ssh protocol to copy files across machines. It extends the syntax of cp to allow references to other systems:
scp username1@hostname1:/path/to/file username2@hostname2:/path/to/other/file
Copy something from this machine to some other machine:
scp /path/to/local/file username@hostname:/path/to/remote/file
Copy something from another machine to this machine:
scp username@hostname:/path/to/remote/file /path/to/local/file
Copy with a port number specified:
scp -P 1234 username@hostname:/path/to/remote/file /path/to/local/file
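Whichever form you use, scp won't tell you if a file arrived truncated after an interrupted connection; comparing checksums on both ends catches that. A local sketch (plain cp stands in for the scp transfer; file names are made up):

```shell
#!/bin/sh
set -e

src=$(mktemp); dst=$(mktemp)
echo "database dump contents" > "$src"

# The scp transfer would go here; cp stands in for it locally
cp "$src" "$dst"

# cksum prints a CRC and byte count; equal values on both
# ends mean the copy is intact
local_sum=$(cksum "$src" | awk '{print $1}')
remote_sum=$(cksum "$dst" | awk '{print $1}')
[ "$local_sum" = "$remote_sum" ] && echo "checksums match"
```

On the real remote end you would run cksum over ssh, e.g. `ssh username@hostname "cksum /path/to/remote/file"`, and compare the first field.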
First zip or gzip the folders:
Use the following command:
zip -r NameYouWantForZipFile.zip foldertozip/
or
tar -pvczf BackUpDirectory.tar.gz /path/to/directory
for gzip compression. Then use scp:
scp -r username@yourserver.com:~/serverpath/public_html ~/Desktop
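Before copying a .tar.gz around, you can list its contents to confirm it was built correctly; a local sketch with made-up names:

```shell
#!/bin/sh
set -e

workdir=$(mktemp -d)
mkdir -p "$workdir/foldertozip"
echo "data" > "$workdir/foldertozip/notes.txt"

# Create the gzipped tar as in the answer above
(cd "$workdir" && tar -pczf BackUpDirectory.tar.gz foldertozip)

# -tzf lists a gzipped archive's contents without extracting it
tar -tzf "$workdir/BackUpDirectory.tar.gz"
```

The same `tar -tzf` check works on the receiving machine after the scp, which confirms the archive survived the transfer.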
You may also want to look at rsync if you're doing a lot of files.
If you're going to be making a lot of changes and want to keep your directories and files in sync, you may want to use a version control system like Subversion or Git. See http://xoa.petdance.com/How_to:_Keep_your_home_directory_in_Subversion
