regex/wildcard in scp - shell

Is it possible to use a wildcard in scp?
I am trying to achieve:
loop
{
substitute_host (scp path/file.jar user@host:path1/foo*/path2/jar/)
}
I keep on getting "scp: ambiguous target".
Actually I am calling an API with source and dest that uses scp underneath and loops over different hosts to put files.
Thanks!

In general, yes, it is certainly possible to use a wildcard in scp.
But in your scp command, the first argument is the source and the second argument is the target. You cannot copy one source into multiple targets.
If you were trying to copy multiple jars, for example, then the following would work:
scp path/*.jar user@host:path2/jar/
"ambiguous target" in this case is specifically complaining that the wildcard you're using results in multiple possible target directories on the remote host.
--- EDIT:
If you want to copy to multiple directories on a remote system and have to determine them dynamically, a script like the following should work:
dir_list=$(ssh user@host ls -d '/path1/foo*/path2/jar/')
for dir in $dir_list; do
scp path/file.jar user@host:"$dir"
done
The dir_list variable will hold the results of executing ls on the remote system. The -d is so that you get the directory names, not their contents. The single quotes ensure that the wildcard expansion happens on the remote system, not on the local one.
Then you loop through each directory to do the remote copy into it.
(All this is ksh syntax, btw.)
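Since the question mentions looping over different hosts, the same idea extends with an outer loop. A minimal sketch, assuming a hypothetical hosts variable naming the target machines:
hosts="host1 host2 host3"   # hypothetical list of hosts to deploy to
for host in $hosts; do
    # expand the wildcard remotely on each host, then copy the jar into every match
    dir_list=$(ssh user@$host ls -d '/path1/foo*/path2/jar/')
    for dir in $dir_list; do
        scp path/file.jar user@$host:"$dir"
    done
done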

Related

I want to get the latest file names under each directory of gcs

I want to know the path to the latest file under each directory using gsutil ls.
Shell script:
for dir in "${dir_list[@]}"; do
    file+=$(gsutil ls -R "${dir}" | tail -n 1);
done
Running the command in a loop like this is very slow.
I want the final output to be:
gs://bucket/dir_a/latest.txt
gs://bucket/dir_b/latest.txt
gs://bucket/dir_c/latest.txt
gs://bucket/dir_d/latest.txt
Is there another way?
There is no other strategy, for a good reason: directories don't exist in GCS. So you need to scan all the files, get their metadata, keep the one that is the latest, and do that for each "similar prefix".
A prefix is what you perceive as a directory: "/path/to/prefix/". That's why you can only search by prefix in GCS, not by file pattern.
So, you could build a custom app which, for each distinct prefix (directory), creates a concurrent process (fork) dedicated to that prefix. That way you get parallelization. It's not so simple to write, but you can!
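A minimal shell sketch of that idea, keeping the question's own "gsutil ls -R | tail -n 1" heuristic and assuming dir_list is an array of the prefixes to scan; each prefix gets its own background job so the listings run in parallel:
dir_list=(gs://bucket/dir_a/ gs://bucket/dir_b/)   # hypothetical prefixes
for dir in "${dir_list[@]}"; do
    # one background job per prefix; each prints the last entry it lists
    ( gsutil ls -R "${dir}" | tail -n 1 ) &
done
wait   # block until every background listing has finished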

How to scp multiple files from remote to local in a folder other than ~/?

I'm trying to make a bash/expect script that takes input like user, host, password, and file names and then copies the files from remote to local. From what I've read so far, scp'ing multiple files from remote to local works just fine when you assume the files are coming from ~/, i.e.:
scp -r user@host:"file1 file2 file3" .
would copy file1, file2, and file3 from ~/ into the current working directory. But I need to be able to pass in another directory as an argument, so my bash script looks like this (but doesn't work, as I'll explain):
eval spawn scp -oStrictHostKeyChecking=no -oCheckHostIP=no -r $user@$host:$dir/"$file1 $file2 $file3" ~/Downloads
This doesn't work after the first file; the shell raises a "No such file or directory" error for every file after the first, which I assume means that the script only applies $dir to the first file, then falls back to ~/ and of course can't find the files there. I've looked everywhere for an answer on this but can't find it, and it would be super tedious to do this one file at a time.
Assuming your remote login shell understands brace expansion, this should work:
scp $user@$host:$dir/"{$file1,$file2,$file3}" ~/Downloads
If you want to download multiple files matching a specific pattern, you can do the following, for example to fetch all zip files:
scp -r user@host:/path/to/files/"*.zip" /your/local/path
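To wire that into a script, here is a minimal sketch (a hypothetical wrapper, not from the answer) that joins an arbitrary list of file names into one brace expression. The quotes keep the expansion on the remote side, and note that braces only expand when they contain at least two names:
#!/bin/bash
# Usage (hypothetical): ./fetch.sh alice example.com /var/data a.jar b.jar c.jar
user=$1 host=$2 dir=$3
shift 3
printf -v joined '%s,' "$@"                          # join the remaining args with commas
scp "$user@$host:$dir/{${joined%,}}" ~/Downloads     # quoted, so the remote shell expands the braces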

Uploading files from multiple directories to an SFTP site using Shell Scripting

I'm trying to upload items from multiple local folder locations to an SFTP site. I'm using an existing shell script that I know works for uploads from a single local location, but I can't figure out how to make it work for uploads from multiple local locations.
I'm fairly new to coding and have only basic experience with batch scripting and some minor editing of existing shell scripts, so I would appreciate any help that can be given.
Here's a sample of my existing single-location upload script:
open sftp://(userid):(password)@(sftp site) -hostkey="(hostkey)"
pwd
ls
lcd "(local directory)"
lls
cd (remote directory)
ls
put * -filemask=|*/ ./
exit
This has worked well for us previously, but I'm trying to clean up some of our existing scripts by combining them into one process that runs as an automated task, and I can't figure out how to chain multiple uploads like this together.
Just repeat the upload code for each location:
cd /remote/directory
lcd /local/directory1
put * -filemask=|*/ ./
lcd /local/directory2
put * -filemask=|*/ ./
Though if it's really a WinSCP script, you can use just one command like:
put -filemask=|*/ /local/directory1/* /local/directory2/* /remote/directory/
See the documentation for the put command:
put <file> [ [ <file2> ... ] <directory>/[ <newname> ] ]
...
If more parameters are specified, all except the last one specify set of files to upload. Filename can be replaced with Windows wildcard to select multiple files. To upload all files in a directory, use mask *.
The last parameter specifies target remote directory and optionally operation mask to store file(s) under different name. Target directory must end with slash. ...
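Putting the pieces together, a complete script might look like this (a sketch; the placeholders and the -filemask=|*/ exclusion of subfolders are carried over from the original script):
open sftp://(userid):(password)@(sftp site) -hostkey="(hostkey)"
put -filemask=|*/ /local/directory1/* /local/directory2/* /remote/directory/
exit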

Bash union of two directories in one statement

I'm trying to run a command that takes one location input (intended for a single directory of files), but I need to run it on files in several locations. While I'd normally run it on */*.type, I'm looking for some way to run the command over (*/dirA/*.type AND dirB/*.type).
I basically need all files of *.type within a directory structure, but they're not all at the same directory level (or I'd just do */*/*.type or something to grab them all). Unfortunately they're in a particular layout for a reason, which I can't just reorganize to make this command run.
Is there any bash shortcut/command/whatever-it's-called that I could use to get both sets of files at once?
You can say:
dir{A,B}/*.type
For example, running this with the ls command:
root@do:/tmp# ls dir{A,B}/*.type
dirA/test.type dirB/test.type
If the command works when you pass one wildcard in, that means it is expecting a list of file names. In that case you can pass it two wildcards just as easily:
command */dirA/*.type dirB/*.type
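If the nesting becomes too irregular for globs, a find-based variant (a sketch, assuming the command accepts the file list as arguments) collects *.type files at any depth:
# -print0 / -0 keep file names containing spaces intact
find . -name '*.type' -print0 | xargs -0 command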

Batch FTP mget command not working with wildcard?

I have written a batch script that logs into my FTP server, then navigates to a directory. I am having trouble with the mget command: I want it to download every .dat file in the directory, but it simply returns this error:
Cannot access file '/home/minecraft/multicraft/servers/server267/world/players/*.dat':No such file or directory.
200 Type set to: ANSI
Cannot find list of remote files
Here is my script (run from cmd):
open 66.71.244.202
USER
PASSWORD
cd /world
cd players
mget *.dat
That is by design. The most recent update to the FTP specification (RFC 3659) explicitly forbids it (see section 2.2.2):
For the commands defined in this specification, all pathnames are to be treated literally. That is, for a pathname given as a parameter to a command, the file whose name is identical to the pathname given is implied. No characters from the pathname may be treated as special or "magic", thus no pattern matching (other than for exact equality) between the pathname given and the files present in the NVFS of the server-FTP is permitted.
Clients that desire some form of pattern matching functionality must obtain a listing of the relevant directory, or directories, and implement their own file name selection procedures.
When you execute your script file with ftp, globbing has to stay enabled so that wildcards work in the script; the -g switch disables it. For example:
ftp -n -i -s:scriptfile.txt
should work, but
ftp -n -i -g -s:scriptfile.txt
will not.
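If the stock Windows ftp client still refuses the wildcard, the listing-and-filtering approach the RFC describes is exactly what some other clients implement for you. A sketch using lftp (a substitute client, not part of the original answer; it expands the glob itself from a directory listing):
# fetch every .dat file in one shot; lftp does the client-side globbing
lftp -u USER,PASSWORD -e 'cd /world/players; mget *.dat; quit' 66.71.244.202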
I know this is old, but it might help someone. I had the same issue with wildcards on MGET from Windows FTP, but it was not consistent: it worked talking to some remote systems, but not to all of them.
My script was doing this:
cd /folder/folder
mget ./-400TA/folder/*_XYZ
In the folder structure I have a set of different folders that begin with hyphens, and for whatever reason the script cd's down to just above there and uses the relative path in the MGET. I had the same issue that some of you reported: if I connected interactively and typed the commands one by one, it worked, but in batch it didn't.
I followed the suggestions in this and other posts, but no joy. I don't have access to the remote systems at the moment to look at them to figure out why some worked and some didn't.
However, what I did find was this. Changing my script as follows:
cd /folder/folder/-400TA/folder
mget *_XYZ
did the trick. Simple. There's some strange interaction going on somewhere, possibly with folder protections or something, but it just shows that trying out different things may get you there in the end.
I would make sure glob is on; when it is turned off, the file name arguments in the put and get commands are taken literally and wildcards will not be looked at.
More info:
glob: Toggle filename expansion for mdelete, mget and mput. If globbing is turned off with glob, the file name arguments are taken literally and not expanded. Globbing for mput is done as in csh. For mdelete and mget, each remote file name is expanded separately on the remote machine and the lists are not merged. Expansion of a directory name is likely to be different from expansion of the name of an ordinary file: the exact result depends on the foreign operating system and ftp server, and can be previewed by doing 'mls remote-files -'. Note: mget and mput are not meant to transfer entire directory subtrees of files. That can be done by transferring a tar archive of the subtree (in binary mode).
Once you are inside ftp, check the glob setting and turn it on if it is off. The default behaviour is on; from the command line, connecting to ftp with the -g option turns file name globbing off.
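For example, a quick interactive check (glob is a toggle that reports its new state; the exact wording may vary by client):
ftp> glob
Globbing off.
ftp> glob
Globbing on.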
It could also very well be a firewall issue where it is not permitting or forwarding the server's inbound connection. That happened to me.
