curl to SFTP and list files in directory - shell

I am using the below curl command in my shell script to connect to an SFTP remote directory.
curl -k "sftp://url.test.com/test_folder" --user "username:password"
Is there a way to list the files in the directory test_folder?

Yes: end the URL with a trailing slash, to indicate to curl that it is in fact a directory! Like this:
curl -k sftp://url.test.com/test_folder/ --user "username:password"
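If you only need the bare file names (no permissions, sizes, or dates), curl's -l/--list-only flag requests a name-only listing for FTP and SFTP; a minimal example with the same placeholder host and credentials:
curl -k --list-only "sftp://url.test.com/test_folder/" --user "username:password"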

Related

iterate through specific files using webHDFS in a bash script

I want to download specific files from an HDFS directory, with names starting with "total_conn_data_". Since I've got many files, I want to write a bash script.
Here's what I do:
myPatternFile="total_conn_data_*.csv"
for filename in `curl -i -X GET "https://knox.blabla/webhdfs/v1/path/to/the/directory/?OP=LISTSTATUS" -u username`; do
curl -i -X GET "https://knox.blabla/webhdfs/v1/path/to/the/directory/$filename?OP=OPEN" -u username -L -o "./data/$filename" -k;
done
But it does not work, since curl -i -X GET "https://knox.blabla/webhdfs/v1/path/to/the/directory/?OP=LISTSTATUS" -u username sends back JSON text, not file names.
How should I do this? Thanks
curl returns the raw JSON response as-is; it does not extract file names for you. You will have to use other tools such as jq and sed to parse that output and get the list of files.
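As a sketch of that approach (assuming the standard WebHDFS LISTSTATUS response shape, where names live under FileStatuses.FileStatus[].pathSuffix, and keeping the question's placeholder host and path), you can extract the matching names with jq and grep; note that -i is dropped so response headers don't pollute the JSON:
# List the directory, pull out the file names, keep only the wanted prefix,
# then download each matching file.
curl -s -k -u username "https://knox.blabla/webhdfs/v1/path/to/the/directory/?OP=LISTSTATUS" \
  | jq -r '.FileStatuses.FileStatus[].pathSuffix' \
  | grep '^total_conn_data_.*\.csv$' \
  | while read -r filename; do
      curl -s -k -u username -L \
        "https://knox.blabla/webhdfs/v1/path/to/the/directory/$filename?OP=OPEN" \
        -o "./data/$filename"
    done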

How can I verify curl response before passing to bash -s?

I have a script that looks like:
curl -sSL outagebuddy.com/path/linux_installer | bash -s
Users can install a Linux client for the site using the command that is provided to them. I'm thinking there should be an intermediate step that verifies curl got a 2XX response and downloaded the content successfully before passing it to bash. How can I do that?
Without a user-managed temporary file:
if script=$(curl --fail -sSL "$url"); then
bash -s <<<"$script"
fi
If you don't mind having an intermediate file (which you certainly need if you want to make sure the curl command worked fully), then you can use:
if curl --fail -sSL <params> -o script.sh
then
bash script.sh
fi
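If you want the 2XX check to be explicit rather than implied by --fail, curl's -w '%{http_code}' write-out variable reports the final response status; a minimal sketch using the question's URL:
url="https://outagebuddy.com/path/linux_installer"
# -w '%{http_code}' prints the status of the last response after any redirects
status=$(curl -sSL -o script.sh -w '%{http_code}' "$url")
# only run the script when the final response was 2XX
if [ "$status" -ge 200 ] && [ "$status" -lt 300 ]; then
    bash script.sh
fi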

Bash Delete folder from Artifactory with Curl

I am attempting to delete all folders from our Artifactory that are older than 2 weeks.
I find all the files that are older than 2 weeks with this command:
curl -s -u username:password "URL/api/search/dates?dateFields=created&from=${Two_Years_Ago}&to=${Two_Weeks_Ago}&repos=generic-sgca" | jq -r '.results[].uri' | sed 's=/api/storage=='
Which will return the following:
https://URL/generic-sgca/KsfSettingsEngine/0.2.0-123456/latest/export/conanfile.py
I can delete the "conanfile.py" by running this command:
curl -u username:password -X DELETE https://URL/generic-sgca/KsfSettingsEngine/0.2.0-123456/latest/export/conanfile.py
However, I want to delete the entire 0.2.0-123456 folder, not just the conanfile.py file.
When I try this command:
curl -u username:password -X DELETE https://URL/generic-sgca/KsfSettingsEngine/0.2.0-123456
it returns a "Could not locate artifact" error. How can I delete the entire Artifactory folder with a curl command?
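One hedged possibility, based on Artifactory's documented Delete Item API (a DELETE on a folder path removes it recursively): trim each search hit back to the folder you want before deleting. The sed pattern below is an assumption tied to the question's .../latest/export/conanfile.py layout, and URL and credentials are the question's placeholders; if the server still says "Could not locate artifact", check whether your base URL needs the /artifactory context path the REST API expects:
# Search, strip the /api/storage prefix, cut back to the version folder,
# dedupe, and DELETE each folder (recursive per the Delete Item API).
curl -s -u username:password "URL/api/search/dates?dateFields=created&from=${Two_Years_Ago}&to=${Two_Weeks_Ago}&repos=generic-sgca" \
  | jq -r '.results[].uri' \
  | sed 's=/api/storage==' \
  | sed 's=/latest/export/conanfile.py$==' \
  | sort -u \
  | while read -r folder; do
      curl -u username:password -X DELETE "$folder"
    done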

Using curl to download a file is having an issue

I am trying to download a file from a remote server using curl
curl -u username:password -O https://remoteserver/filename.txt
In my case a file filename.txt is getting created, but the content of the file says "virtual user logged in". It is not downloading the actual file.
I am not sure why this is happening. Any help on why the download is not working would be appreciated.
Try this in terminal:
curl -u username:password -o filedownload.txt https://remoteserver/filename.txt
This command, with -o, will copy the contents of filename.txt to filedownload.txt in the current working directory.
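If that still saves the wrong content, a hedged diagnostic is to rerun with -v so curl prints the full protocol exchange, which shows whether the server is actually serving the file or just returning a login/landing message:
# -v prints request/response details to stderr; the host is the question's placeholder
curl -v -u username:password -O https://remoteserver/filename.txt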

Save file to specific folder with curl command

In a shell script, I want to download a file from some URL and save it to a specific folder. What is the specific CLI flag I should use to download files to a specific folder with the curl command, or how else do I get that result?
I don't think you can give a path to curl, but you can cd to the location, download, and cd back.
cd target/path && { curl -O URL ; cd -; }
Or using subshell.
(cd target/path && curl -O URL)
Both ways will only download if the path exists. -O keeps the remote file name. After the download, it will return to the original location.
If you need to set the filename explicitly, you can use the lowercase -o option:
curl -o target/path/filename URL
The --output-dir option is available since curl 7.73.0:
curl --create-dirs -O --output-dir /tmp/recipes https://example.com/pancakes.jpg
curl doesn't have an option for that (without also specifying the filename), but wget does. The directory can be relative or absolute. Also, the directory will automatically be created if it doesn't exist.
wget -P relative/dir "$url"
wget -P /absolute/dir "$url"
It works for me:
curl http://centos.mirror.constant.com/8-stream/isos/aarch64/CentOS-Stream-8-aarch64-20210916-boot.iso --output ~/Downloads/centos.iso
where --output lets me set the path, name, and extension of the file I want to save.
Use redirection:
This works to drop a curl-downloaded file into a specified path:
curl https://download.test.com/test.zip > /tmp/test.zip
Obviously "test.zip" is whatever arbitrary name you want to label the redirected file- could be the same name or a different name.
I actually prefer #oderibas solution, but this will get you around the issue until your distro supports curl version 7.73.0 or later-
For PowerShell on Windows, you can add a relative path plus filename to the --output flag:
curl -L http://github.com/GorvGoyl/Notion-Boost-browser-extension/archive/master.zip --output build_firefox/master-repo.zip
Here build_firefox is a relative folder.
Use wget
wget -P /your/absolute/path "https://jdbc.postgresql.org/download/postgresql-42.3.3.jar"
For Windows, in PowerShell, curl is an alias of the cmdlet Invoke-WebRequest and this syntax works:
curl "url" -OutFile file_name.ext
For instance:
curl "https://airflow.apache.org/docs/apache-airflow/2.2.5/docker-compose.yaml" -OutFile docker-compose.yaml
Source: https://krypted.com/windows-server/its-not-wget-or-curl-its-iwr-in-windows/
Here is an example using Batch to create a safe filename from a URL and save it to a folder named tmp/. I do think it's strange that this isn't an option in the Windows or Linux curl versions.
@echo off
set url=%1
rem %%~nxf expands to the name and extension of the URL's last path segment
for %%f in (%url%) do (
    curl --create-dirs -L -v -o tmp/%%~nxf.txt %url%
)
The above Batch file takes a single input, a URL, and derives a file name from it. If the URL has no trailing file name, the download is saved as tmp/.txt. So it's not all done for you, but it gets the job done on Windows.
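A hypothetical invocation, assuming the script above is saved as fetch.bat:
fetch.bat https://example.com/downloads/report
rem saves the response body as tmp/report.txt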
