How to include a sub-script in a script run in a remote shell? - macos

I am running a local bootstrap.sh script from OSX against a remote Ubuntu server. The script contains some if/else logic that sources a specific subscript.sh when a specific condition is met.
I am running that local script with:
ssh user@host "bash -s" < ~/projects/projectname/bootstrap.sh
I am having trouble getting subscript.sh sourced (loaded/included) on the remote side.

You can't. You're only sending the contents of bootstrap.sh to the remote shell. It's attempting to source subscript.sh on the remote machine, and it isn't there.
You'll need to either copy subscript.sh (or both scripts!) to the remote machine, or insert the contents of subscript.sh into bootstrap.sh in place of the source command.
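If you want to keep everything on the OSX side, one option is to concatenate the two files and pipe the result to the remote shell. This is only a minimal sketch, assuming subscript.sh just defines functions and variables and that the source line has been removed from bootstrap.sh:
# subscript.sh goes first so its definitions exist before bootstrap.sh runs
cat ~/projects/projectname/subscript.sh ~/projects/projectname/bootstrap.sh | ssh user@host "bash -s"
Alternatively, copy both scripts over and run them in place (the /tmp destination is just an example); then bootstrap.sh can keep its source line as long as it points at the copied subscript.sh:
scp ~/projects/projectname/bootstrap.sh ~/projects/projectname/subscript.sh user@host:/tmp/
ssh user@host "bash /tmp/bootstrap.sh"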

What I would recommend is to rsync your bootstrap.sh from your local computer to your server. You should be able to do this with your ssh credentials.
A very handy utility is Transmit. It costs $25 and lets you cleanly mount your server as if it were a portable hard drive (Transmit can also do synchronizations). All you need is your ssh credentials, and it is very user friendly.
If you are allowed to install software on your server, I would install qsub on it (actually, first just check whether it is already installed). Then just mount your computer's drive and you can submit scripts with qsub (I actually would just run a small server on your Mac). This is what I use to work with a Linux cluster from my OSX computer.
Alternatively, you can run a small file server on your OSX machine and mount it from your Linux server.

Related

How to SSH from local linux into specific directory on windows 10 remote

I want to ssh from my local Linux computer into a specific directory on a Windows 10 remote. The shell used on the remote is Git Bash. I don't want to have to change the directory every time I log into the remote using ssh.
For Linux remotes this is easily done with something like:
ssh -t user@x.x.x.x "cd /targetDir ; \$SHELL --login"
The question is how can the same thing be achieved for Windows 10 remotes? If nothing else works I would also accept changing the default entry point in git bash for any ssh sessions on the remote.
Please note that I am not looking for help setting up ssh (already works). I just want to jump right into a specific directory when a session is started.
I was able to figure this out myself. The following command gets the job done; using double and single quotes together is required to make it work (the nesting order doesn't matter).
ssh -t user@x.x.x.x "'cd /targetDir ; bash'"

How to run a script from local on remote but at some point continue running the script on the local server?

I need to run a bash script that gathers some parameters from server-1. From my local server I run the script with
ssh user@server-1 bash -s < script.sh
I then need to use those parameters in all kinds of commands on my local server, and server-2 is also involved. But the script keeps running on server-1 because of
ssh user@server-1 bash -s < script.sh
Maybe I could use two scripts, but I want them to live only on the local server, and putting more commands into the script after the ssh call doesn't seem to work.
I would place the script on the remote server and remotely execute it via SSH.
If the script is going to change over time, then break the process up into three steps:
1. gather any additional parameter from remote machine
2. copy script to remote machine using scp
3. ssh to "remote execute" script on remote machine
I am not sure which parameters you need from the remote system.
I would try to hand them over as command-line options to the script in step 3.
Otherwise, "hack"/patch them in before step 2.
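A minimal sketch of that flow, assuming for illustration that the gathered parameter is the remote host's FQDN and that the worker script name and paths shown here are just placeholders:
#!/bin/bash
# 1. gather a parameter from the remote machine (example: its hostname)
param=$(ssh user@server-1 'hostname -f')

# 2. copy the worker script to the remote machine
scp ./worker.sh user@server-1:/tmp/worker.sh

# 3. remotely execute the script, passing the parameter as a command-line option
ssh user@server-1 "bash /tmp/worker.sh --param '$param'"

# anything after this point runs back on the local server,
# so the gathered parameter can also be used with server-2 here
ssh user@server-2 "echo received: '$param'"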

SCP file from ssh session to localhost

I have a headless file server on which I store and manage downloads and media, but occasionally I have to transfer small files back to my computer (Mac, using bash shell). The problem is that some files have more user-friendly names and commonly have spaces in them, and they are buried in the file directory hierarchy I have set up on my server.
When I'm using scp from my local machine, I don't have tab completion, so I have to manually type out the entire path and name with spaces escaped. When I ssh into the server first, the command:
scp /home/me/files/file\ name\ with\ spaces.png Me@localhost:/Users/Me/MyDirectory
fails with the error "Permission denied, please try again" even though I'm entering my local machine user password properly.
I've learned a little bit of sftp since I've been told that may be a better tool for file transfer. However, the utility seems outdated and I still don't have tab completion after establishing a connection to the server (on my Terminal when pressing Tab I just get a tab character).
My question is this: what can I do to allow tab completion while using scp from my Mac? Or am I using incorrect syntax for scp while in an ssh session, and is there something in that command I should fix? Or, is there a (better? newer?) tool other than sftp that would offer tab completion on a server?
Finally, if none of these problems have simple solutions, is there some package I could install (e.g. a completion package from Homebrew or the like) that would facilitate better tab-completion with any of these commands?
Looks to me like this is just some incorrect scp usage.
This is the format of the command:
scp ./localFile.txt remoteUser@remoteHost:/remoteFile.txt
You were so close, but you have localhost set where you should have your remoteHost.
localhost is the name that resolves to the machine you are currently on, so in your workflow you are ssh'ing to a machine and then trying to scp the file to the same machine you are already ssh'd into.
What you need to do, is figure out the IP address, or the physical host name of the computer that you are trying to connect to, and use that instead.
scp ./localFile.txt remoteUser@192.168.1.100:/remoteFile.txt
# where 192.168.1.100 would be the IP of your Mac
I am assuming the reason you were getting permission denied is that you were entering the login credentials for your Mac, but unknowingly trying to log in again to your headless machine.
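Applied to the original command, the corrected version run from the ssh session on the server would look something like the line below (192.168.1.100 is just an example address for the Mac, which also needs Remote Login enabled under Sharing to accept the connection):
scp "/home/me/files/file name with spaces.png" Me@192.168.1.100:/Users/Me/MyDirectory
Quoting the whole path is an alternative to escaping each space with a backslash.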

Shell Script program to download files from linux remote server

I am very new to shell scripting. I want to download some files from a remote Linux server, so how can I proceed with that? The remote server is SSH-based.
First of all, an FTP service is the better choice for getting files from a remote server.
If only the sshd service is available, then you can use the SSH-based commands sftp or scp.
However, the sftp and scp commands prompt interactively for a password, which is a problem in a shell script, so you have to get help from the expect command. See Automate scp file transfer using a shell script.
Besides expect, you can also set up a trust relationship (SSH keys) between the two machines; then you can use scp without a password. See http://www.linuxproblem.org/art_9.html
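A minimal sketch of the key-based approach (the host name and paths are placeholders):
# one-time setup: create a key pair and install the public key on the remote server
ssh-keygen -t ed25519
ssh-copy-id user@remote-server
# after that, a script can download files without any password prompt
scp user@remote-server:/path/to/remote/file.tar.gz /local/download/dir/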

Automatically copying files from a Linux machine to a Windows machine

I need to automatically copy files from a linux machine to a windows one every day.
I'm looking for something simple and secure like scp, rsync, or sftp. Unfortunately, I'm at a loss as to how to set this up on the Windows machine.
Does anyone know how to do this?
You can try mounting the Windows drive as a mount point on the Linux machine, using smbfs; you would then be able to use normal Linux scripting and copying tools such as cron and scp/rsync to do the copying.
You can find rsync for Windows in Cygwin. With that you can set up an rsync server on the Windows box and run a cron job on your Linux machine that rsyncs all the files to the Windows machine. We used to do that and it worked fine.
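A rough sketch of the Linux-side cron job (the host name, the module name 'backup', and the source path are placeholders; the rsync daemon on the Windows/Cygwin side has to export a matching module in its rsyncd.conf):
# crontab entry: push /srv/data to the 'backup' module on the Windows box at 03:00 every day
0 3 * * * rsync -av /srv/data/ windowshost::backup/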
"I'm at a loss of how to set this up on the Windows machine." Windows is the client or the server? At a loss means what, specifically? What can't you do?
"linux machine to a windows" can be done two ways.
Linux is the client. Windows runs an FTP or SCP or SSH server. Linux has a client and pushes the file to Windows. Look at FileZilla for a free Windows FTP server. Also, Windows often has an FTP service that's turned off; turn it on.
Windows is the client. Windows periodically pulls the file from the Linux server. This is easier, since Linux already has all the necessary servers available. You do, however, need to start them on Linux.
There are scores of sftp and scp clients for Windows. Windows comes with an ftp client. Google for "sftp client" and you'll find WinSCP, PuTTY, FileZilla, and long lists of other free clients.
I haven't used it in years now, but you could try Unison from http://www.cis.upenn.edu/~bcpierce/unison/
It could be done with 'smbclient', which acts much like an FTP client to a Windows share. Check out the manpage: man smbclient and look for ways to script it with the -c option, or man expect to drive it.
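For the smbclient route, a rough sketch of a non-interactive copy (the share name, credentials, and paths are placeholders) could be:
# copy one file into the 'destination' folder of the Windows share without prompting
smbclient //WINDOWSHOSTNAME/sharename -U YOUR_WINDOWS_USERNAME%YOUR_WINDOWS_PASSWORD -c "cd destination; lcd /path/to/dir; put file.txt"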
Here's how I'd probably do it though:
1. Pick which user you're going to be when you sync the files. Log in as this user and type 'id' to get the numeric user ID. You will use this ID in step 4.
2. Become 'root'.
3. mkdir /mnt/sharename
4. Edit your /etc/fstab file and add an entry something like the line below. Replace the user ID of 500 with your user ID, sharename with your Windows share name, and WINDOWSHOSTNAME with your host name or IP address. If you don't know the share names, run smbclient -L WINDOWSHOSTNAME.
//WINDOWSHOSTNAME/sharename /mnt/sharename cifs credentials=/root/smblogin,uid=500,noauto,user 0 0
5. Edit /root/smblogin and put the following two lines in it:
username=YOUR_WINDOWS_USERNAME
password=YOUR_WINDOWS_PASSWORD
6. Log in as the user from step 1.
7. Try mounting the share: mount /mnt/sharename
If that succeeds, then write a script to do it automatically. Let's call it 'backup.sh':
#!/bin/sh
# mount the share if it is not already mounted
df | grep -q /mnt/sharename
if test $? -ne 0 ; then
    mount /mnt/sharename
fi
# copy the directory tree onto the Windows share
cp -r /path/to/dir /mnt/sharename/destination/
Use cron to run the script.
Type crontab -e
Put the following in the file:
PATH=/bin:/usr/bin
# Backup at 2:15 A.M. every day. Run 'man 5 crontab' for help on the time format
15 2 * * * /path/to/backup.sh
You may try WinSCP and its scripting support. And Windows has some kind of cron-like facility in its management tools (scheduled tasks), doesn't it?
