I have a specific problem.
I have a server (say x). First I have to connect to server x using ssh x@domain. Then there is an internal server y that I again have to connect to over ssh. I have to download a folder from server y. I tried using scp after logging into x:
scp -r /data/home/path /Users/username/Desktop
I got the following error
cp: cannot create regular file `/Users/username/Desktop': No such file or directory
Please help me download the folder.
That error is because the destination /Users/username/Desktop doesn't exist on server X.
However, there's more going on here. That command just copies the folder locally on x, because there is no host information in either path.
You should run from X:
scp -r usery@servery:/data/home/path destination
And then repeat the process from your own machine, this time using server x's host information.
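A rough sketch of the full two-hop copy (the usernames and the temporary path on x are placeholders):

# step 1, run on x: pull the folder over from y
scp -r usery@servery:/data/home/path /tmp/path
# step 2, run on your local machine: pull it down from x
scp -r userx@serverx:/tmp/path /Users/username/Desktop

If your local OpenSSH is 7.3 or newer, you can also do it in one step by using x as a jump host:

scp -r -o ProxyJump=userx@serverx usery@servery:/data/home/path /Users/username/Desktop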
I want to create a watcher that will automatically sync file changes from a local directory to a remote docker container. I need to find a way to transfer the files efficiently. I will also need it for a one time upload command which would transfer a complete folder from local directory to a remote docker container.
I figure one solution would be to scp to a tmp directory on the remote host, and then run docker cp via ssh to copy the files from tmp directory. Is that a good solution? Is there anything better?
By the way, if anyone knows a file sync utility for that use case, please let me know. I tried to search, but it seems like it's not the most popular development workflow?
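The scp-plus-docker-cp approach described above would look roughly like this (host, container name, and paths are placeholders):

scp -r ./src user@remotehost:/tmp/src
ssh user@remotehost docker cp /tmp/src mycontainer:/app/

That works for a one-time upload, but it copies everything on every run and leaves stale files in /tmp.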
I would try using rsync for local-to-remote host syncing. From there, volume-mount the directory into the docker container.
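A minimal sketch of that setup, with host, paths, and image name as placeholders:

# sync the local folder to the remote host (also works as a one-time upload)
rsync -avz --delete ./src/ user@remotehost:/srv/app/src/
# on the remote host: mount the synced directory into the container
docker run -d -v /srv/app/src:/app/src --name myapp myimage

For the watcher part, a tool like fswatch can re-run the sync whenever something changes:

fswatch -o ./src | while read _; do rsync -avz --delete ./src/ user@remotehost:/srv/app/src/; done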
How to download file(s) from a remote server directory to the local machine in PuTTY?
I found the command for uploading a file from my local machine to a remote directory, but it is not working for me, though there is no error message.
pscp c:\documents\foo.txt fred@example.com:/tmp/foo
(Question is probably more suited to Superuser)
You have your parameters in the wrong order. Please refer to the documentation:
https://the.earth.li/~sgtatham/putty/0.70/htmldoc/Chapter5.html#pscp
To download, you need:
pscp [options] [user@]host:source target
What you have there is the opposite, it's for doing an upload.
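So, flipping your example around (the local destination path is just an illustration):

pscp fred@example.com:/tmp/foo c:\documents\foo.txt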
I have to change some settings inside "etc/httpd/conf.d/phpMyAdmin.conf".
I can't download this file using FileZilla. I also tried the sudo nano command in PuTTY, but it returns empty. I don't know how to change the permissions for this file.
I have spent more than an hour on this. Please guide me if someone knows how to resolve it.
EC2 is a computer rental service, not a web hosting service, so you won't be able to connect with FTP (FileZilla) unless you run an FTP server on your EC2 instance.
As for editing the file while you're connected through SSH (putty), you need to make sure that you're properly referencing the file you want. Try running "sudo nano /etc/httpd/conf.d/phpMyAdmin.conf". Note the leading "/" on the file path; it's important.
So I installed a LAMP stack on a Google Cloud instance with Debian Wheezy 7. Everything is working fine, but I am not able to get FTP working. I am following this tutorial by DigitalOcean.
I am stuck at the last step, where I need to make vsftpd allow the user to write outside the chroot directory.
The error I get is
hetunandu_gmail_com@lamp:~$ mkdir /root/hetunandu/files
mkdir: cannot create directory `/root/hetunandu/files': Permission denied
Then when I use sudo with it I get this error:
hetunandu_gmail_com@lamp:~$ sudo mkdir /root/hetunandu/files
mkdir: cannot create directory `/root/hetunandu/files': No such file or directory
Where do I go from here?
Also, I don't know how to set up my username and password for FTP.
I followed the tutorial and could not replicate your issue. I initially got "Permission denied" but you can circumvent this by running:
$ sudo su
and then
$ mkdir -p /root/$USER/files
Why not use /home/$USER? I'm not sure why you want to create the folders under /root.
As for your second question, regarding the username and password, I am not sure I understand. From the Developers Console > Compute Engine > VM Instances, click SSH and that should log you in with root privileges. Then you can create all the users you want:
$ sudo adduser test_user
Please don't use FTP, as it's an insecure clear-text protocol that will let others see your password, easily get access to your instance, read/modify/delete your files, etc.
Instead, you should use secure protocols such as SCP or SFTP with public key authentication.
Here are some options to transfer files to/from your GCE VM instance:
sftp CLI tool, as described in this answer
gcloud compute copy-files, as described in this answer
WinSCP with SFTP
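For example, with gcloud (instance name and zone here are made up):

gcloud compute copy-files ~/local-file.txt my-instance:~/ --zone us-central1-a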
I'm a newbie and all I want is to set up an EC2 instance for Fever RSS.
Here is my info: OS X 10.9.2, AWS with an Ubuntu 12.04 LTS AMI. I set up LAMP on EC2 following this guide: http://www.robotmedia.net/2011/04/how-to-create-an-amazon-ec2-instance-with-apache-php-and-mysql-lamp/
Now I can ssh to my server's public IP using Terminal. After connecting to the server, I typed
scp -i /path/to/keypair.pem /path/to/test.txt ubuntu@theServerPublicIP:~/
and got the error as follows:
Warning: Identity file keypair.pem not accessible: No such file or directory.
I have tried to resolve the problem by:
1. Changed the permissions of the .pem file to 600 on my OS X machine:
chmod 600 keypair.pem
then ran ssh and scp again, and got the same error. Then I changed its permissions to 400 on my OS X machine,
chmod 400 keypair.pem
and redid ssh and scp, and got the same error.
2. Rewrote the file paths using ~/path/to/file for both keypair.pem and test.txt, then redid ssh and scp; got the same error.
3. Rewrote the file paths using /Users/myUserName/path/to/file for both files and redid ssh and scp; got the same error.
4. cd'd to the folder containing keypair.pem and test.txt (I put them in the same folder) and tried both of the above path styles; got the same error for each.
5. Changed the path on the server. I tried "~", "~/", "/", "/var/www/"; for all of them I still got the same error.
I also tried ForkLift because I saw the developer of Fever using it in the demo video. I tried all the options for connection: sftp..., but couldn't connect to the server.
Please help to get the test.txt uploaded... then I will be able to upload the fever folder.
Thanks!
If you have to do this frequently, I advise you to create an alias.
For example: I was running a web server on an EC2 instance and had the HTML contents in a local dir awsplaywww:
$ alias syncaws="rsync -avrz --delete /home/sanket/workspace/awsplaywww/ -e ssh sanket@awsplay1.ddns.net:/var/www/html/"
Now every time I update an HTML file or something and need to send it back to the server, I just open a terminal, type syncaws, and the job is done!
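If you want the alias to survive new terminal sessions, append it to your shell startup file, e.g.:

echo 'alias syncaws="rsync -avrz --delete /home/sanket/workspace/awsplaywww/ -e ssh sanket@awsplay1.ddns.net:/var/www/html/"' >> ~/.bashrc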