Laravel SSH put files to remote server

I am trying to upload a file from my Laravel project to a remote server.
$fileName = base_path().'/Storage/app/public/uploads/'.$file;
SSH::into('production')->put($fileName, './');
The result is a blank screen with no errors or anything and the file is not on the remote server. I know my ssh config is correct (keys and username/host stuff) because this works fine:
SSH::into('production')->run('ls', function($line)
{
echo $line.PHP_EOL;
});
What am I missing? What can I do to see any verbose logging of the SSH call?
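One way to debug this outside Laravel is to check that the local path actually exists (note the capital-S Storage in base_path().'/Storage/...', whereas the framework's directory is normally lowercase storage/) and to try the same transfer by hand with verbose output. This is only a sketch: the file name, key, user, and host below are placeholders for whatever your 'production' entry in the remote config points at.
ls -l storage/app/public/uploads/myfile.jpg   # does the local file actually exist? (lowercase storage/)
# -v prints the whole SSH negotiation, so auth and permission failures become visible
scp -v -i ~/.ssh/id_rsa \
    storage/app/public/uploads/myfile.jpg \
    deployer@production.example.com:/home/deployer/myfile.jpg
It may also be worth checking whether put() wants a full remote file path rather than a directory like './'.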

Related

In Ruby, Copy text from remote host(s) to local host

Disclaimer: I am very new to Ruby!
I am currently writing a program (using Ruby) to SSH into a remote host from my local host. The program must then copy the Docker processes that are running and print them to a *.txt file on my local host. Please note: I do not need to be inside any containers, I simply need to record the processes.
How can I copy the Docker processes on a remote host to a *.txt file on the local host?
I have got the SSH part down, but due to some constraints I cannot publish any of my code.
I appreciate any responses, and have a good day everyone!
You can try saving the output of ps into a text file on that server and then using the net-scp gem to move the output file to your local host.
# download a file from a remote server
require 'net/scp'
Net::SCP.download!("remote.host.com", "username",
  "/remote/path", "/local/path",
  :ssh => { :password => "password" })
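That answer assumes the text file already exists on the remote host. A minimal way to create it there first (host, user, and path are hypothetical; docker ps is assumed here, swap in ps -eaf | grep docker if you want the host-level processes instead):
# run the command on the remote host and write its output to a remote file,
# then download /tmp/docker_processes.txt with Net::SCP.download! as above
ssh username@remote.host.com 'docker ps > /tmp/docker_processes.txt'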
You need to run a remote command over ssh and then store the output of that command in your txt file.
ssh user@xxx.xxx.xxx.xxx 'ps -eaf | grep docker' > output.txt
I guess this is what you need.

Unix script to download file in other server

I have a scenario where I need to write a shell script which has to download a file into a remote directory.
E.g.: I log into Server A, and from there my script needs to log in to Server B and download a file there using the wget command.
Any suggestions on this?
Instead of doing an SSH into Server B and then doing a wget, I would suggest that you first download the file onto Server A and then scp it to Server B at the path you want.
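A sketch of that approach (the URL, paths, and host below are placeholders):
# 1) download the file onto Server A
wget -O /tmp/myfile.tar.gz "https://example.com/path/myfile.tar.gz"
# 2) copy it from Server A to Server B (assumes ssh access to Server B)
scp /tmp/myfile.tar.gz user@serverB:/target/directory/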

Connect to FTP via Shell Script issue

While I am able to log in fine manually through the command line, for some reason my login fails when I try to log in programmatically from my Unix shell script. I am using the exact same script that succeeds for another FTP server, and I know the values I'm passing are correct. Could this be a configuration issue on the FTP server side?
I get the error: 530 User cannot log in. Login failed.
Here is the code I'm using:
ftp -inv $FTPSERVER << EOF
user $USERNM $PASS
lcd $DLPATH
binary
prompt
mget *.txt
bye
EOF
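One thing worth trying, assuming the classic BSD/netkit ftp client: run it with debugging enabled so you can see the exact USER/PASS exchange. Also note that an unquoted here-document expands shell-special characters ($, backquotes, backslashes) in $PASS, which can silently change the password that is actually sent.
# minimal debugging sketch; -d prints the client/server dialogue
ftp -dinv "$FTPSERVER" <<EOF
user $USERNM $PASS
bye
EOF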

Cronjob not executing the Shell Script completely

This is Srikanth from Hyderabad.
I am the Linux administrator at a corporate company. We have a Squid server, so I prepared a backup Squid server so that when the LIVE Squid server goes down I can put the backup server into LIVE.
My Squid servers run CentOS 5.5. I have prepared a script to back up all configuration files in /etc/squid/ on the LIVE server to the backup server, i.e. it copies all files from the LIVE server's /etc/squid/ to the backup server's /etc/squid/.
Here is the script, saved as squidbackup.sh in the directory /opt/ with permissions 755 (rwxr-xr-x):
#!/bin/sh
username="<username>"
password="<password>"
host="<Server IP>"

expect -c "
spawn /usr/bin/scp -r $username@$host:/etc/squid /etc/
expect {
    \"*password:*\" {
        send \"$password\r\"
        interact
    }
    eof {
        exit
    }
}
"
Kindly note that this will be executed on the backup server, which will check for the user mentioned in the script. I have created that user on the LIVE server and used the same in the script.
When I execute the script with the command below
[root@localhost ~]# sh /opt/squidbackup.sh
everything works fine: the script downloads all the files from /etc/squid/ of the LIVE server to /etc/squid/ of the backup server.
Now the problem arises: if I set this in crontab as below, or with other timings,
50 23 * * * sh /opt/squidbackup.sh
it does not download all the files. I don't know what's wrong; the cron job downloads only a few files from /etc/squid/ of the LIVE server to /etc/squid/ of the backup server.
Only a few files are downloaded when cron executes the script. If I run the script manually, it downloads all files perfectly, without any errors or warnings.
If you have any more questions, please go ahead and post them.
I kindly request any solutions that are available.
Thank you in advance.
Thanks for your interest. I have tried what you said; it shows the output below, but previously I used to get the same output mailed to the user on the Squid backup server.
Even the cron logs show the same, and I was not able to work out the exact error from the lines below.
Please note that only a few files are downloaded when run from cron.
spawn /usr/bin/scp -r <username>@ServerIP:/etc/squid /etc/
<username>@ServerIP's password:
Kindly check if you can suggest anything else.
Try the simple options first. Capture stdout and stderr as shown below; the resulting log file should point to the problem.
Looking at the script, you also need to specify the full path to expect. That could be an issue.
50 23 * * * sh /opt/squidbackup.sh >/tmp/cronout.log 2>&1
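If cron's minimal environment turns out to be the cause, a common fix is to give the job an explicit PATH (or call expect by its absolute path inside the script). The paths below are assumptions; verify them with which expect and which scp.
# crontab sketch: cron runs with a minimal environment, so spell out PATH
PATH=/bin:/usr/bin:/usr/local/bin
50 23 * * * /bin/sh /opt/squidbackup.sh >/tmp/cronout.log 2>&1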

Transfer files between servers without downloading and uploading

I need to get one (huge) file from one server to another and I have a slow Internet connection. I tried using Transmit, the FTP program, but I believe it downloads the file and then uploads it to the other server.
So, is there a way to move it directly from one server to the other, either using an FTP client or the Mac terminal, without having to download and upload the file?
If you have shell access to one of the servers, simply log in to that server using telnet or ssh. Start a simple ftp client in the shell and log into the other server. Use a basic ftp command (put or get) to copy the file. Usually, though, sysadmins are likely to make shell access difficult.
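If you do have shell access, a minimal sketch of that route looks like this (hosts, credentials, and file name are placeholders):
ssh user@server1.example.com
# then, on server1, push the file straight to the second server:
ftp -inv server2.example.com <<EOF
user username password
binary
put hugefile.zip
bye
EOF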
If you have no shell access, but you do have a webserver with PHP, then the easiest is to write a simple PHP program to do the job. Upload it and trigger it from a browser. Here's one I wrote:
<?php
// qdftp.php - Quick & Dirty FTP
// Place this script in a web accessible
// folder alongside the file you want to send,
// then invoke it from a browser.
//===============================
$server = "123.123.123.123"; //target server address or domain name
$user = "username"; //username on target server
$pass = "password"; //password on target server
$file = "myfile.zip"; //source file
//================================
$sessid = ftp_connect($server); //connect
$login_ok = ftp_login($sessid, $user, $pass); //login
if ((!$sessid) || (!$login_ok)):
    echo "failed to connect: check hostname, username & password";
    exit; //failed? bail!
endif;
$xfer = ftp_put($sessid, $file, $file, FTP_BINARY); //transfer
echo "file transfer " . ($xfer ? "succeeded" : "failed");
ftp_close($sessid);
?>
Then trigger it from your browser
http://mysourceserver.com/qdftp.php
Last thing: delete qdftp.php when you're done - it's got your username and password!
The FTP protocol does not support 3rd-party transfers like this. You might try:
$ scp host1:file host2:file2
But I don’t know how that’s implemented. It might bounce everything through you again, which is what you’re trying to avoid. That is, I suspect that it does this, which routes everything through your local pipe:
$ (ssh host1 "cat <file") | (ssh host2 "cat >file")
But you really want:
$ ssh host1 "scp file host2:"
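A hedged version of that last command, assuming key-based authentication with agent forwarding allowed (host names and paths are placeholders):
# -A forwards your SSH agent to host1 so the scp from host1 to host2 can
# authenticate without copying private keys around; the file travels directly
# between the two servers instead of through your local connection.
ssh -A user@host1 'scp /path/to/hugefile user@host2:/destination/path/'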
