Uploading files from multiple directories to an SFTP site using Shell Scripting - shell

I'm trying to upload items from multiple folder locations locally to an SFTP site. I'm using an existing shell script that I know works for uploads from a single local location, but I can't figure out how to make it work for uploads from multiple local locations.
I'm fairly new to coding and have only basic experience with batch scripting and some minor editing of existing shell scripts, so I would appreciate any help that can be given.
Here's a sample of my existing single-location upload script:
open sftp://(userid):(password)@(sftp site) -hostkey="(hostkey)"
pwd
ls
lcd "(local directory)"
lls
cd (remote directory)
ls
put * -filemask=|*/ ./
exit
This has worked well for us previously. I'm now trying to clean up some of our existing scripts by combining them into one process that runs as an automated task, but I can't figure out how to chain multiple uploads like this together.

Just repeat the upload code for each location:
cd /remote/directory
lcd /local/directory1
put * -filemask=|*/ ./
lcd /local/directory2
put * -filemask=|*/ ./
Though if it's really a WinSCP script, you can use just one command like:
put -filemask=|*/ /local/directory1/* /local/directory2/* /remote/directory/
See the documentation for the put command:
put <file> [ [ <file2> ... ] <directory>/[ <newname> ] ]
...
If more parameters are specified, all except the last one specify set of files to upload. Filename can be replaced with Windows wildcard to select multiple files. To upload all files in a directory, use mask *.
The last parameter specifies target remote directory and optionally operation mask to store file(s) under different name. Target directory must end with slash. ...
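If WinSCP isn't a hard requirement, the stock OpenSSH sftp client can do the same multi-directory upload in batch mode. Here is a minimal sketch, assuming key-based (non-interactive) authentication; the host, user, and directory names are placeholders:
#!/bin/bash
# Upload the contents of two local directories to one remote directory
# using the OpenSSH sftp client; -b - reads the batch commands from stdin.
# put * transfers the regular files in each directory (no recursion into
# subdirectories, similar in spirit to the -filemask=|*/ exclusion above).
sftp -b - user@sftp.example.com <<'EOF'
cd /remote/directory
lcd /local/directory1
put *
lcd /local/directory2
put *
bye
EOF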

Related

How to specify a sibling folder of a folder mentioned in a variable in a shell script?

I have a folder path stored in a variable ${PROJECT_DIR}.
I want to navigate up into its parent folder, and back down into a folder called "Texture Packer", i.e. ${PROJECT_DIR} and "Texture Packer" are siblings.
How do I specify this in a shell script?
So far I have:
TP=/usr/local/bin/TexturePacker
# create all assets from tps files
${TP} "${PROJECT_DIR}/../Texture Packer/*.tps"
But this is incorrect, since TexturePacker can't find the files at that path. The error message displays:
TexturePacker:: error: Can't open file
/Users/john/Documents/MyProj/proj.ios_mac/../Texture Packer/*.tps for
reading: No such file or directory
EDIT: The following seems to work but isn't clean:
#! /bin/sh
TP=/usr/local/bin/TexturePacker
if [ "${ACTION}" = "clean" ]
then
# remove sheets - please add a matching expression here
# Some unrelated stuff
else
cd ${PROJECT_DIR}
cd ..
cd "Texture Packer"
# create all assets from tps files
${TP} *.tps
fi
exit 0
You're on the right track; the problem is that wildcards (like *.tps) don't get expanded when they're in quotes. The solution is to leave that part of the path outside of the quotes:
${TP} "${PROJECT_DIR}/../Texture Packer"/*.tps
BTW, I almost always recommend against using cd in scripts. It's too easy to lose track of where the current directory will be at various points in the script, or to have an error occur and the rest of the script run in the wrong place. Also, any relative paths you're using (e.g. those supplied by the user as arguments) change meaning every time you cd. Basically, it's an opportunity for things to go weirdly wrong.
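If you do need TexturePacker to run from inside that directory, a subshell keeps the cd from leaking into the rest of the script. A minimal sketch, using the same paths as the question:
#!/bin/sh
TP=/usr/local/bin/TexturePacker
# The parentheses run the cd in a subshell, so the working directory of
# the surrounding script is left untouched.
(
  cd "${PROJECT_DIR}/../Texture Packer" || exit 1
  "${TP}" *.tps
)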

how to convert a list of image URLs to a zip file of images?

Does anyone know how to batch-download images relying on just a list of image URLs as the data source? I've looked through applications, but all I could find was this: http://www.page2images.com/ (which only takes a screenshot of each URL).
So have a server running whatever you'd like.
Send an array of image names to the server - use whatever language you want, but have the function do a for loop over the array
Execute wget https://image.png for each entry (let's say you use NodeJS; this would be something like execSync('wget ' + imgList[i]) from the child_process module) - this will download everything to your current directory
Once the for loop is finished, the next step is to zip all your items: tar -zcvf files.tar.gz ./ - this will create a tarball of all the files within that directory
Download that tar
If you want to get fancy with this, you should create a randomly named directory and execute all your commands to point inside that directory. So you would say wget -P ./jriyxjendoxh/ https://image.png to get the file into the randomly named folder. Then at the end tar -zcvf files.tar.gz jriyxjendoxh/*
Then, to make sure you have all the files downloaded, you can create a semaphore that blocks the creation of the tar ball until the number of files on disk equals the count of the passed-in array. That would be a real fancy way to make sure all the files are downloaded.
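If you'd rather skip the server layer entirely, the same idea works as a plain shell script. A minimal sketch, assuming the URLs sit one per line in a file called urls.txt and that a downloads/ working directory is acceptable (both names are my own, not from the question):
#!/bin/bash
# Download every URL listed in urls.txt into ./downloads, then pack the
# results into a single tarball.
mkdir -p downloads
while IFS= read -r url; do
  wget -P downloads "$url"
done < urls.txt
tar -zcvf files.tar.gz downloads/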
Hi there. You could try Free Download Manager or, if you have Linux, use the wget command with the text file as the source list (e.g. wget -i urls.txt).

How to create a batch file in Mac?

I need to find a solution at work to backup specific folders daily, hopefully to a RAR or ZIP file.
If it was on a PC, I would have done it already. But I don't have any idea how to approach it on a Mac.
What I basically want to achieve is an automated task, that can be run with an executable, that does:
compress a specific directory (/Volumes/Audio/Shoko) to a rar or zip file.
(in the zip file, exclude all *.wav files in all subdirectories and a directory named "Videos").
move it to a network share (/Volumes/Post Shared/Backup From Sound).
(or compress directly to this folder).
automate the file name of the Zip file with dynamic date and time (so no duplicate file names).
Shut down the Mac when finished.
I want to say again, I don't usually use a Mac, so things like which kind of file the script should go in are not trivial for me yet.
I have tried to put Mark's bash lines (from the first answer, below) in a txt file and executed it, but it had errors and didn't work.
I also tried to use Automator, but it's too plain, no advanced options.
How can I accomplish this?
I would love a working example :)
Thank You,
Dave
You can just make a bash script that does the backup and then you can either double-click it or run it on a schedule. I don't know your paths and/or tools of choice, but something along these lines:
#!/bin/bash
FILENAME=`date +"/Volumes/path/to/network/share/Backup/%Y-%m-%d.tgz"`
cd /directory/to/backup || exit 1
tar -cvzf "$FILENAME" .
You can save that on your Desktop as backup and then go in Terminal and type:
chmod +x ~/Desktop/backup
to make it executable. Then you can just double click on it - obviously after changing the paths to reflect what you want to backup and where to.
Also, you may prefer to use some other tools, such as rsync, but the method is the same.
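To cover the parts of the question the tar example doesn't touch (excluding *.wav files and the Videos directory, a timestamped name on the network share, and shutting down afterwards), here is a hedged sketch using zip with the paths given in the question; the shutdown step will prompt for an administrator password:
#!/bin/bash
# Timestamped archive name on the network share, so names never collide.
FILENAME="/Volumes/Post Shared/Backup From Sound/$(date +%Y-%m-%d_%H-%M-%S).zip"
cd "/Volumes/Audio/Shoko" || exit 1
# -r recurses into subdirectories; -x skips every *.wav and the Videos folder.
zip -r "$FILENAME" . -x "*.wav" "Videos/*"
# Shut down the Mac once the archive has been written.
sudo shutdown -h now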

Creating an executable file to download a file, then upload the file to new location

I'm having trouble finding the correct method to accomplish a relatively simple task.
I'm trying to make a simple executable that I can run/schedule to run.
That
1. Downloads a file from an intranet location (192.168.100.112/file.txt)
2. Uploads the new version of the file to the web (fpt.website.com/docs/file.txt)
There are 5 pdf files that auto-generate on an intranet, and I would like to keep the web versions updated. Ideally I'd create one executable that does all 5 files at once, and also have the ability to do each one individually.
thanks
Use the Windows ftp command. It has a -s option for providing ftp "scripts". Basically, just add all the commands you need to accomplish your task to something.txt, for example:
open 192.168.100.112
get file.txt
close
open fpt.website.com
cd docs
put file.txt
close
bye
then do:
ftp -s:something.txt
You could make ftp scripts, one for each upload, then put all five ftp commands in a batch file.

regex/wildcard in scp

Is it possible to use a wildcard in scp?
I am trying to achieve:
loop
{
substitue_host (scp path/file.jar user@host:path1/foo*/path2/jar/)
}
I keep on getting "scp: ambiguous target"
Actually, I am calling an API with source and dest that uses scp underneath and loops over different hosts to put files.
Thanks!
In general, yes, it is certainly possible to use a wildcard in scp.
But, in your scp command, the second argument is the target, the first argument is the source. You certainly cannot copy a source into multiple targets.
If you were trying to copy multiple jars, for example, then the following would certainly work:
scp path/*.jar user@host:path2/jar/
"ambigious target" in this case is specifically complaining that the wildcard you're using results in multiple possible target directories on the #host system.
--- EDIT:
If you want to copy to multiple directories on a remote system and have to determine them dynamically, a script like the following should work:
dir_list=$(ssh user@host ls -d '/path1/foo*/path2/jar/')
for dir in $dir_list; do
    scp path/file.jar "user@host:$dir"
done
The dir_list variable will hold the results of the execution of the ls on the remote system. The -d is so that you get the directory names, not their contents. The single quotes are to ensure that wildcard expansion waits to execute on the remote system, not on the local system.
And then you'll loop through each dir to do the remote copy into that directory.
(All this is ksh syntax, btw.)
