Delete folder from Artifactory with curl - bash

I am attempting to delete all folders from our Artifactory that are older than 2 weeks.
I find the files that are all older than 2 weeks with this command:
curl -s -u username:password "URL/api/search/dates?dateFields=created&from=${Two_Years_Ago}&to=${Two_Weeks_Ago}&repos=generic-sgca" | jq -r '.results[].uri' | sed 's=/api/storage=='
Which will return the following:
https://URL/generic-sgca/KsfSettingsEngine/0.2.0-123456/latest/export/conanfile.py
I can delete the "conanfile.py" by running this command:
curl -u username:password -X DELETE https://URL/generic-sgca/KsfSettingsEngine/0.2.0-123456/latest/export/conanfile.py
However, I want to delete the entire 0.2.0-123456 folder, not just the conanfile.py file.
When I try this command:
curl -u username:password -X DELETE https://URL/generic-sgca/KsfSettingsEngine/0.2.0-123456
it returns a "Could not locate artifact" error. How can I delete the entire Artifactory folder with a curl command?
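For what it's worth, Artifactory's Delete Item API can remove a folder (recursively) as well as a single file, so one way to approach this is to trim each URI returned by the search down to the version folder and DELETE that path instead. A rough, untested sketch reusing the variable names from the question:
#!/bin/bash
# Sketch only: derive the version folder from each file URI returned by the
# search and DELETE that folder. The three dirname calls assume the layout
# <version>/latest/export/<file>; adjust them to match your repository.
curl -s -u username:password "URL/api/search/dates?dateFields=created&from=${Two_Years_Ago}&to=${Two_Weeks_Ago}&repos=generic-sgca" \
| jq -r '.results[].uri' \
| sed 's=/api/storage==' \
| while read -r file_uri; do
    folder_uri=$(dirname "$(dirname "$(dirname "$file_uri")")")
    echo "deleting: $folder_uri"
    curl -u username:password -X DELETE "$folder_uri/"
done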

Bash script to download latest release from GitHub

Looking for a simple way to download a .zip from the latest GitHub release.
There are other similar questions, but I haven't been able to get them to work. :(
Trying to pull latest release from https://github.com/CTCaer/hekate
Currently I've got:
#!/bin/bash
curl -s https://api.github.com/repos/CTCaer/hekate/releases/latest | jq -r ".assets[] | select(.name | test(\"hekate_ctcaer\")) | .browser_download_url"
I'm trying to fetch the URL of the latest .zip and only grab "hekate_ctcaer_X.X.X_Nyx_X.X.X.zip".
I saw someone trying to achieve this with 'Xidel', so I'm open to trying that if someone knows the syntax to grab a specific file from the GitHub API.
As I understand it (?), the GitHub API spits out an array for the release 'assets', so I'm trying to specify an item in this array that matches "hekate_ctcaer" and download that file.
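For completeness, a small sketch of how the jq filter above could be fed straight into curl, assuming it matches exactly one asset:
#!/bin/bash
# Sketch: capture the asset URL printed by the jq filter and download it.
url=$(curl -s https://api.github.com/repos/CTCaer/hekate/releases/latest \
  | jq -r '.assets[] | select(.name | test("hekate_ctcaer")) | .browser_download_url')
# -L follows the redirect to the asset, -O keeps the remote file name
curl -sLO "$url"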
GitHub repositories are also accessible as plain git remotes, so here is another train of thought.
Use git ls-remote to fetch the latest release tag:
git -c 'versionsort.suffix=-' ls-remote --tags --sort='v:refname' http://github.com/CTCaer/hekate.git \
| tail --lines=1 \
| cut --delimiter='/' --fields=3
This example outputs v5.8.0.
Then clone the remote repo:
git clone --branch v5.8.0 http://github.com/CTCaer/hekate.git
Finally, zip the repo into an archive:
zip hekate.zip -r hekate/
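Put together as a single script (a sketch of the same three steps, with the tag captured in a variable):
#!/bin/bash
# Sketch combining the steps above: find the latest tag, clone it, zip it.
tag=$(git -c 'versionsort.suffix=-' ls-remote --tags --sort='v:refname' http://github.com/CTCaer/hekate.git \
  | tail --lines=1 \
  | cut --delimiter='/' --fields=3 \
  | sed 's/\^{}$//')   # annotated tags can show a trailing ^{}; strip it
git clone --branch "$tag" http://github.com/CTCaer/hekate.git
zip -r hekate.zip hekate/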
This fetches the zipball URL of the latest tag and downloads it as latest.zip:
curl -sL https://api.github.com/repos/CTCaer/hekate/tags \
| jq -r '.[0].zipball_url' \
| xargs -I {} curl -sL {} -o latest.zip
I saw someone trying to achieve this with 'Xidel'
I assume you're referring to my answer here. That answer is tagged batch-file, so you first of all have to swap the quotes for bash ("function('string')" --> 'function("string")'). And secondly, you're right: you have to select the appropriate object in the "assets"-array.
$ xidel -s "https://api.github.com/repos/CTCaer/hekate/releases/latest" \
-f '$json/(assets)()[starts-with(name,"hekate_ctcaer")]/browser_download_url' \
--download '{substring-after($headers[starts-with(.,"Content-Disposition")],"filename=")}'
This downloads 'hekate_ctcaer_5.8.0_Nyx_1.3.0.zip' in the current dir.
With r8389 or newer you can just use --download ..
Also, how would I modify this for github.com/Atmosphere-NX/Atmosphere/releases/tag/1.3.2 to get the .zip AND the .bin?
Strictly speaking you'd have to raise a new question for this, but ok.
It appears that (at the moment) v1.3.2 is also the latest release for this repo, so you can use...
$ xidel -s "https://api.github.com/repos/Atmosphere-NX/Atmosphere/releases/latest" \
-e '$json'
or alternatively...
$ xidel -s "https://api.github.com/repos/Atmosphere-NX/Atmosphere/releases" \
-e '$json()[tag_name="1.3.2"]'
The "assets"-array here has just 2 objects; one with the zip-file and one with the bin-file, so just "follow" (--follow / -f) the 2 "browser_download_url"-keys to download:
$ xidel -s "https://api.github.com/repos/Atmosphere-NX/Atmosphere/releases" \
-f '$json()[tag_name="1.3.2"]//browser_download_url' \
--download .

curl to SFTP and list files in directory

I am using the below curl command in my shell script to connect to a remote SFTP directory.
curl -k "sftp://url.test.com/test_folder" --user "username:password"
Is there a way to list the files in the directory test_folder?
Yes: end the URL with a trailing slash, to indicate to curl that it is in fact a directory! Like this:
curl -k sftp://url.test.com/test_folder/ --user "username:password"
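If you only want the bare file names, curl's --list-only option (supported for SFTP as well as FTP) returns a name-only listing, e.g. with the same placeholder host:
curl -k --list-only "sftp://url.test.com/test_folder/" --user "username:password"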

How to iterate all files inside folder and run curl command to install AEM packages

I am trying to create a shell script that iterates over all the zip files and installs them in the AEM package manager using curl.
The single curl command below works; it properly installs the package on the respective AEM instance.
curl -u admin:admin -F file=@"content-ope.zip" -F name="content-ope.zip" -F force=true -F install=true http://localhost:4502/crx/packmgr/service.jsp
But we have to install many zip files, so we plan to keep them all in one folder, iterate over the zip files, and install each one with curl. I tried while and for loops but was unable to read all the .zip files from the shell script.
Does anyone have an idea how to do this?
I wrote that exact thing, see here:
https://gist.github.com/ahmed-musallam/07fbf430168d4ac57bd8c89d8be9bca5
#!/bin/bash
# this script will install ALL zip packages in current directory the AEM instance at 4502
for f in *.zip
do
    echo "installing: $f"
    curl -u admin:admin -F file=@"$f" -F name="$f" -F force=true -F install=true http://localhost:4502/crx/packmgr/service.jsp
    echo "done."
done
Instead of using curl you can just copy the files over to the install folder of the AEM instance. These will get installed automatically. https://helpx.adobe.com/in/experience-manager/6-3/sites/administering/using/package-manager.html#FileSystemBasedUploadandInstallation
find . -maxdepth 1 -name "*.zip" -exec curl -u admin:admin -F file=@"{}" -F name="{}" -F force=true -F install=true http://localhost:4502/crx/packmgr/service.jsp ";"
Note, this will substitute ./foo.zip instead of foo.zip. If you need to strip the ./, you should probably write a shell script wrapping your curl command that accepts the zip file name as an argument and strips the ./ before passing it to curl, as sketched below.
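A minimal sketch of such a wrapper, assuming the same admin:admin instance at localhost:4502 as above:
#!/bin/bash
# upload_package.sh - upload one AEM package, stripping any leading directory part
f="$1"
name=$(basename "$f")
echo "installing: $name"
curl -u admin:admin -F file=@"$f" -F name="$name" -F force=true -F install=true http://localhost:4502/crx/packmgr/service.jsp
and call it from find:
find . -maxdepth 1 -name "*.zip" -exec ./upload_package.sh "{}" ";"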

How to upload source version to Azure Setting using curl?

My objective is simple: get hold of the last commit hash when I run my app
Attempts:
I started with the git-last-commit package, but where the app runs it is a plain directory, and the repository is outside this folder:
/config
/deployments
/diagnostics
/ipaddr_0
/locks
/repository
/wwwroot
The website runs inside wwwroot and the git repo is in repository; I couldn't get hold of it programmatically.
So I tried the Kudu API and it's as easy as a curl POST ... but how can I pass the commit hash as curl data?
I've tried:
$ git log -n1 --pretty=format:"%H" | curl -X POST -H 'Content-Type: application/json' https://$AZURE_LOGIN:$AZURE_PASS@$AZURE_APPNAME.scm.azurewebsites.net/api/settings -d '{ "SOURCE_VERSION":"&> /dev/stdin" }'
and
$ git log -n1 --pretty=format:"%H" | curl -X POST -H 'Content-Type: application/json' https://$AZURE_LOGIN:$AZURE_PASS@$AZURE_APPNAME.scm.azurewebsites.net/api/settings -d '{ "SOURCE_VERSION":"#d" }'
only to find that it sends literally what I write and not the piped value
The idea was to have this as a Bitbucket pipeline step to be executed for every deployment...
Do any of you have a trick to accomplish this?
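One way around this (a sketch, not tested against Kudu) is to capture the hash in a shell variable first and let the shell expand it inside the JSON body, instead of piping it into curl:
#!/bin/bash
# Capture the commit hash, then interpolate it into the request body.
HASH=$(git log -n1 --pretty=format:"%H")
curl -X POST -H 'Content-Type: application/json' \
  "https://$AZURE_LOGIN:$AZURE_PASS@$AZURE_APPNAME.scm.azurewebsites.net/api/settings" \
  -d "{ \"SOURCE_VERSION\": \"$HASH\" }"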

Issue when using curl to download a file

I am trying to download a file from remote server using curl
curl -u username:password -O https://remoteserver/filename.txt
In my case a file filename.txt is getting created, but the content of the file says "virtual user logged in". It is not downloading the actual file.
I am not sure why this is happening. Any help on why the download is not working would be appreciated.
Try this in terminal:
curl -u username:password -o filedownload.txt https://remoteserver/filename.txt
This command with -o will copy the contents of filename.txt to filedownload.txt in the current working directory.
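If the server is actually returning a login page rather than the file, it can help to follow redirects and make curl fail on HTTP errors instead of saving whatever comes back; a variant to try (same placeholder URL and credentials):
# -L follows redirects, --fail exits non-zero on HTTP 4xx/5xx instead of saving the error page,
# -v shows the request/response headers so you can see what the server really sends.
curl -u username:password -L --fail -v -o filedownload.txt https://remoteserver/filename.txt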
