How to show subdirectories using SFTP ls command - bash

I'm trying to get the list of subdirectories in a folder:
echo "ls -1 /path/to/folder/*/" | sftp -i /path/to/key user@host | grep -v 'sftp>'
If there is more than one subdirectory, I get a list of subdirectories:
/path/to/folder/subdirectory1/
/path/to/folder/subdirectory2/
If there is only one subdirectory I get nothing.
Thank you for your suggestions.
Note: using SSH is not allowed

If there is only one subdirectory I get nothing.
You should get nothing only if that single subdirectory is empty, because ls, when given a single directory argument, lists the directory's contents rather than the directory itself. With the normal ls we could solve this simply with the -d option, but unfortunately sftp's ls doesn't have that option. The only way that comes to mind is to filter the desired directories out of a long listing:
echo "ls -l /path/to/folder" | sftp -i /path/to/key user@host | awk '/^d/{print "/path/to/folder/"$NF}'
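To see what the awk filter does, you can feed it a sample long listing locally (the names below are made up). Note that $NF is the last whitespace-separated field, so this breaks on file names containing spaces:

```shell
printf '%s\n' \
  'drwxr-xr-x 2 user group 4096 Jan  1 10:00 subdirectory1' \
  '-rw-r--r-- 1 user group  120 Jan  1 10:00 file.txt' \
  'drwxr-xr-x 2 user group 4096 Jan  1 10:00 subdirectory2' |
awk '/^d/{print "/path/to/folder/"$NF}'
# /path/to/folder/subdirectory1
# /path/to/folder/subdirectory2
```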

Related

How to pass path to xargs command in Linux

I have the following code, which removes old files in a directory based on their timestamps:
ls -tp | grep -v '/$' | tail -n +2 | xargs -I {} rm -- {}
I am trying to make an executable script out of this and I don't want to cd into the directory where the above command should be run, but rather simply pass the path e.g. /tmp/backups/ to it.
How would I do this? By appending the path to each command (ls, grep, tail, xargs and rm)?
Assuming that your path is the first parameter of your script, you could do a
cd "${1:-/tmp/backups}" # Use the parameter, supply default if none
# Do your cleanup here
If afterwards you have more work to do in the original working directory, just do a
cd - >/dev/null
# Do what ever you need to do in the original working directory
The sole purpose of redirecting stdout into the bit bucket is that cd - by default prints to stdout the directory it changes into.
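A minimal demonstration of that cd - behavior (the paths are chosen arbitrarily):

```shell
cd /tmp
cd /etc           # work somewhere else for a while
cd - >/dev/null   # back to /tmp; without the redirection this prints "/tmp"
pwd               # → /tmp
```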

Bash script to separate files into directories, reverse sort and print in an HTML file works on some files but not others

Goal
Separate files into directories according to their filenames, run a Bash script that reverse sorts them and assembles the content into one file (I know steps to achieve this are already documented on Stack Overflow, but please keep reading...)
Problem
Scripts work on all files but two
State
Root directory
dos-18-1-18165-03-for-sql-server-2012---15-june-2018.html
dos-18-1-18165-03-for-sql-server-2016---15-june-2018.html
dos-18-1-18176-03-for-sql-server-2012---10-july-2018.html
dos-18-1-18197-01-for-sql-server-2012---23-july-2018.html
dos-18-1-18197-01-for-sql-server-2016---23-july-2018.html
dos-18-1-18232-01-for-sql-server-2012---21-august-2018.html
dos-18-1-18232-01-for-sql-server-2016---21-august-2018.html
dos-18-1-18240-01-for-sql-server-2012---5-september-2018.html
dos-18-1-18240-01-for-sql-server-2016---5-september-2018.html
dos-18-2-release-notes.html
dos-18-2-known-issues.html
Separate the files into directories according to their SQL Server version or name
ls | grep "^dos-18-1.*2012.*" | xargs -i cp {} dos181-2012
ls | grep "^dos-18-1.*2016.*" | xargs -i cp {} dos181-2016
ls | grep ".*notes.*" | xargs -i cp {} dos-18-2-release-notes
ls | grep ".*known.*" | xargs -i cp {} dos-18-2-known-issues
Result (success)
/dos181-2012:
dos-18-1-18165-03-for-sql-server-2012---15-june-2018.html
dos-18-1-18176-03-for-sql-server-2012---10-july-2018.html
dos-18-1-18197-01-for-sql-server-2012---23-july-2018.html
dos-18-1-18232-01-for-sql-server-2012---21-august-2018.html
dos-18-1-18240-01-for-sql-server-2012---5-september-2018.html
/dos181-2016:
dos-18-1-18165-03-for-sql-server-2016---15-june-2018.html
dos-18-1-18197-01-for-sql-server-2016---23-july-2018.html
dos-18-1-18232-01-for-sql-server-2016---21-august-2018.html
dos-18-1-18240-01-for-sql-server-2016---5-september-2018.html
/dos-18-2-known-issues
dos-18-2-known-issues.html
/dos-18-2-release-notes
dos-18-2-release-notes.html
Variables (all follow this pattern)
dos181-2012.sh
file="dos181-2012"
export
dos-18-2-known-issues
file="dos-18-2-known-issues"
export
Reverse sort and assemble (assumes /$file exists; after testing all lines of code I believe this is where the problem lies):
cat $( ls "$file"/* | sort -r ) > "$file"/"$file".html
Result (success and failure)
dos181-2012.html has the correct content in the correct order.
dos-18-2-known-issues.html is empty.
What I have tried
I tried to ignore the two files in the command:
cat $( ls "$file"/* -i (grep ".*known.*" ) | sort -r ) > "$file"/"$file".html
Result: The opposite occurs
dos181-2012.html is empty
dos-18-2-known-issues.html is not empty
Thank you
I am completely baffled. Why do these scripts work on some files but not others? (I can share more information about the file contents if that will help, but the file contents are nearly identical.) Thank you for any insights.
First off, your question is quite incomplete. You start off well, showing the input files and directories, but then you talk about variables and $file without showing the code they come from. So I based my answer on the explanation in the first paragraph and what I deduced from the rest of the question.
I did this:
#!/bin/bash
cp /etc/hosts dos-18-1-18165-03-for-sql-server-2012---15-june-2018.html
cp /etc/hosts dos-18-1-18165-03-for-sql-server-2016---15-june-2018.html
cp /etc/hosts dos-18-1-18176-03-for-sql-server-2012---10-july-2018.html
cp /etc/hosts dos-18-1-18197-01-for-sql-server-2012---23-july-2018.html
cp /etc/hosts dos-18-1-18197-01-for-sql-server-2016---23-july-2018.html
cp /etc/hosts dos-18-1-18232-01-for-sql-server-2012---21-august-2018.html
cp /etc/hosts dos-18-1-18232-01-for-sql-server-2016---21-august-2018.html
cp /etc/hosts dos-18-1-18240-01-for-sql-server-2012---5-september-2018.html
cp /etc/hosts dos-18-1-18240-01-for-sql-server-2016---5-september-2018.html
cp /etc/hosts dos-18-2-release-notes.html
cp /etc/hosts dos-18-2-known-issues.html
DIRS='dos181-2012 dos181-2016 dos-18-2-release-notes dos-18-2-known-issues'
for DIR in $DIRS
do
if [ ! -d $DIR ]
then
mkdir $DIR
fi
done
cp dos-18-1*2012* dos181-2012
cp dos-18-1*2016* dos181-2016
cp *notes* dos-18-2-release-notes
cp *known* dos-18-2-known-issues
for DIR in $DIRS
do
/bin/ls -1r $DIR >$DIR.html
done
The cp commands are just to create the files with something in them.
You did not specify how the directory names were produced, so I went with the easy option and listed them in a variable ($DIRS). These could be built based on the filenames, but you did not mention that.
Then created the directories (first for).
Then 4 cp commands. Your code is very complicated for something so basic: the shell does wildcard expansion for cp (as for rm, mv, ls, ...), so there is no need for a grep and xargs pipeline to copy files around.
Finally, the last for loop lists each directory's files (ls) one per line (-1) in reverse sort order (-r). The result of that ls is sent to a ".html" file with the same name as the directory.
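A quick way to check the reverse listing on its own (the directory name here is arbitrary):

```shell
rm -rf /tmp/dirdemo && mkdir -p /tmp/dirdemo
touch /tmp/dirdemo/a.html /tmp/dirdemo/b.html /tmp/dirdemo/c.html
ls -1r /tmp/dirdemo
# c.html
# b.html
# a.html
```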

Listing without "headers" for every directory

Is it possible to list every file in the root directory, including the path, but without every directory appearing as its own section? Something like the below, but without a "header" for every directory:
ls -l $PWD/*
When you want to see hidden files you can use
find $PWD -maxdepth 1 | xargs ls -ld
Or use grep -v "${PWD}/\." when you don't want them.
Or pipe the output through awk to drop the headers:
ls -l * | awk '$1!="total"'
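As a rough sketch of the find variant above (the directory and file names are made up): adding -mindepth 1 keeps the starting directory itself out of the listing, and note that xargs will mangle names containing spaces.

```shell
rm -rf /tmp/lsdemo && mkdir -p /tmp/lsdemo/subdir
touch /tmp/lsdemo/file1
cd /tmp/lsdemo
find "$PWD" -mindepth 1 -maxdepth 1 | xargs ls -ld
# one long-format line per entry, with full paths and no "total" header
```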

ssh to another server and list only the directories inside a directory

dev1.ab.com is the current server, I have to ssh to tes1.ab.com, move inside the path /spfs/tomcat/dir and list all the directories (no files) present there.
For this I am using the below command from the command line of dev1.ab.com
ssh tes1.ab.com "ls -1d / /spfs/tomcat/dir"
but in the output I am getting the directories of the home /spfs/tomcat instead. I have also tried
ssh tes1.ab.com "ls -lrt /spfs/tomcat/dir | ls -1d /"
ssh tes1.ab.com "cd /spfs/tomcat/dir | ls -1d /"
and ended up with nearly the same result.
Can someone help me write a command, run from the command line of dev1.ab.com, that sshes to tes1.ab.com and outputs the list of directories present in /spfs/tomcat/dir?
Use find instead of ls.
ssh tes1.ab.com "find /my/folder -type d"
add -maxdepth 1 to list only the first level
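Sketched locally (the path is hypothetical; over ssh the find command is identical). Note that plain -maxdepth 1 also prints the starting directory itself, so add -mindepth 1 if you only want the entries inside it:

```shell
rm -rf /tmp/finddemo && mkdir -p /tmp/finddemo/d1 /tmp/finddemo/d2
touch /tmp/finddemo/f.txt
find /tmp/finddemo -mindepth 1 -maxdepth 1 -type d
# prints /tmp/finddemo/d1 and /tmp/finddemo/d2 (order may vary)
```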
you could use the following
ssh tes1.ab.com 'ls -ld /spfs/tomcat/dir/*/'
to list all the directories alone inside /spfs/tomcat/dir
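The trailing-slash glob only matches directories, which you can verify locally (the names here are invented):

```shell
rm -rf /tmp/globdemo && mkdir -p /tmp/globdemo/dirA /tmp/globdemo/dirB
touch /tmp/globdemo/file.txt
cd /tmp/globdemo
ls -1d */
# dirA/
# dirB/
```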

ls first file of given type from given directory

This searches only in the pwd, despite having specified the directory. Probably caused by the third argument.
supported="*mov *mp4"
ls /home/kv/m $supported | head -1
Removing the filter brings up the first file found by ls, but what can I use to tell ls to consider only the file types listed in $supported? It's worth mentioning that the extensions mustn't be case-sensitive.
ls /home/kv/m | head -1
ls /home/kv/m | grep -i -E '\.(mov|mp4)$' | head -1
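The grep filter is easy to check in isolation; the sample file names below are invented:

```shell
printf '%s\n' movie.MOV clip.mp4 readme.txt song.mp3 |
grep -i -E '\.(mov|mp4)$' | head -1
# movie.MOV
```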
Run it in a subshell, and cd to the directory first
first=$( cd /home/kv/m && ls $supported | head -1 )
You might want to shopt -s nullglob first, in case there are no .mov or .mp4 files.
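Putting the subshell idea together as a sketch (the test directory and file names are made up). The unquoted $supported relies on the shell's glob expansion inside the subshell, and plain globs are case-sensitive, so a .MOV file would be missed unless you also set nocaseglob in bash:

```shell
shopt -s nullglob            # bash: a glob with no matches expands to nothing
supported="*mov *mp4"
rm -rf /tmp/m_demo && mkdir -p /tmp/m_demo
touch /tmp/m_demo/a.mov /tmp/m_demo/b.mp4 /tmp/m_demo/notes.txt
first=$( cd /tmp/m_demo && ls $supported | head -1 )
echo "$first"                # → a.mov
```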
