Git post-receive hook, send curl commit message to Discord Webhook - bash

I am trying to post an information-message in our discord every time someone pushes to the master.
I have a post-receive bash script looking like this:
#!/bin/bash
while read oldrev newrev ref
do
    if [[ $ref =~ .*/master$ ]];
    then
        tail=$(git log -1 --pretty=format:'%h %cn: %s%b' $newrev)
        url='https://discordapp.com/api/webhooks/validapikey'
        curl -i \
            -H "Accept: application/json" \
            -H "Content-Type:application/json" \
            -X POST \
            --data '{"content": "'$tail'"}' $url
    fi
done
If I output $tail to a file I get the expected string
6baf5 user: last commit message
but the message does not get posted on Discord.
If I replace $tail with "hello", it gets posted.

3 suggestions:
a) Quote the data argument so the shell does not split $tail: -d "{\"content\": \"${tail}\"}" (see the sketch after this list)
b) You can write the hook in the same language your project uses, like Python or NodeJS, instead of bash (keep the same file name and make it executable).
c) To avoid having to maintain this on each dev machine, you can version this logic inside your repo using https://pypi.org/project/hooks4git or any other tool that provides git hook management.
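Quoting alone still fails if the commit message itself contains double quotes or newlines, so it can help to let a JSON tool do the escaping. A minimal sketch of the whole hook under that approach, assuming jq is installed on the server (the webhook URL is a placeholder):
#!/bin/bash
while read oldrev newrev ref
do
    if [[ $ref =~ .*/master$ ]]; then
        # Build the payload with jq so quotes and newlines in the message are escaped
        payload=$(git log -1 --pretty=format:'%h %cn: %s%b' "$newrev" | jq -Rs '{content: .}')
        curl -i \
            -H "Accept: application/json" \
            -H "Content-Type: application/json" \
            -X POST \
            --data "$payload" \
            'https://discordapp.com/api/webhooks/validapikey'
    fi
done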

How to use SSH in bash script to download files from bitbucket?

At my work, the current method of downloading custom git pre-commit hooks (from a Bitbucket repo) uses curl in a bash script as shown below,
where $USERNAME, $PASSWORD and $build_support_url are previously assigned.
...<some code>...
# Download templates from http://swbuilds to home dir
echo "Downloading pre-commit.pl hook"
curl -u $USERNAME:$PASSWORD --fail --show-error --silent --output ~/.git_template/hooks/pre-commit $build_support_url/pre-commit.pl
echo "Downloading prepare-commit-msg.py hook"
curl -u $USERNAME:$PASSWORD --fail --show-error --silent --output ~/.git_template/hooks/prepare-commit-msg $build_support_url/prepare-commit-msg.py
echo "Downloading commit-msg.py hook"
curl -u $USERNAME:$PASSWORD --fail --show-error --silent --output ~/.git_template/hooks/commit-msg $build_support_url/commit-msg.py
# Force the execute bit to be set
chmod a+x ~/.git_template/hooks/*
# Also download the customer list used by the pre-commit hook
echo "Downloading customer list"
curl -u $USERNAME:$PASSWORD --fail --show-error --silent --output ~/.git_template/customer_list.txt $build_support_url/customer_list.txt
# Configure Git templates
git config --global init.templatedir '~/.git_template'
...<some more code>...
This downloads the pre-commit hooks from the link $build_support_url/pre-commit.pl and places them in the ~/.git_template folder.
However, since this process uses curl over HTTPS, the script requires a password every time it is run.
To avoid that hassle, I have been told to edit the script so that it uses SSH to download the files (which doesn't require a password).
Any suggestions on how to use SSH in this script to obtain those files?
Thanks.
PS: I have only a crude idea of backends and APIs (I only know the basic HTTP requests like GET and POST); hope that gives a little more context to the situation.
You can store your credentials in a ~/.netrc file so curl uses them, as explained here.
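A minimal sketch of such a file, assuming the hostname and credentials below are placeholders:
# ~/.netrc (restrict permissions with: chmod 600 ~/.netrc)
machine bitbucket.example.com
login myusername
password mypassword
With that in place, calling curl with -n/--netrc picks up the credentials automatically and the -u $USERNAME:$PASSWORD argument can be dropped.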
The solution was just to use the 'git archive' command instead of curl
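For example, assuming the hooks live in a repo reachable over SSH (the repo path and file name below are placeholders, and the server must allow git-upload-archive), something like this fetches a single file without prompting for a password:
# Extract one file from the remote repo over SSH into the hooks directory
git archive --remote=ssh://git@bitbucket.example.com/team/build-support.git HEAD pre-commit.pl \
    | tar -xO pre-commit.pl > ~/.git_template/hooks/pre-commit
chmod a+x ~/.git_template/hooks/pre-commit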

POST multiple files with -d in curl

I'm using curl to create several classifications. I have written the json for the many classifications and they are in one folder. I would like to create all the classifications in one go. But using curl I can only create them one at a time. How could I make them in one request?
curl -u admin:admin -H "Content-Type: application/json" -X POST -d @pii.json http://127.0.0.1:21000/api/atlas/v2/types/typedefs
The curl manual for -d says 'Multiple files can also be specified'. How can I do this? All my attempts have failed.
Do I need a bash script instead? If so, could you help me - I'm not a coder and I'm struggling without an example!
Thanks in advance.
You probably don't want to use multiple -d with JSON data since curl concatenates multiple ones with a & in between. As described in the man page for -d/--data:
If any of these options is used more than once on the same command
line, the data pieces specified will be merged together with a
separating &-symbol. Thus, using '-d name=daniel -d skill=lousy' would
generate a post chunk that looks like 'name=daniel&skill=lousy'.
You can however easily and conveniently pass several files on stdin to let curl use them all in one go:
cat a.json b.json c.json | curl -d @- -u admin:admin -H "Content-Type: application/json" http://127.0.0.1:21000/api/atlas/v2/types/typedefs
(please note that -X POST has no place on a command line that uses -d)
I found the following to work in the end:
<fileToUpload.dat xargs -I % curl -X POST -T "{%}" -u admin:admin -H "Content-Type: application/json" http://127.0.0.1:21000/api/atlas/v2/types/typedefs
Where fileToUpload.dat contained a list of the .json files.
This seemed to work over Daniel's answer, probably due to the contents of the files. Hopefully this is useful to others if Daniel's solution doesn't work for them.
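For completeness, a sketch of how fileToUpload.dat can be produced, assuming all the .json files sit in the current directory:
# One file name per line, consumed by the xargs command above
ls *.json > fileToUpload.dat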
I needed to upload all the *.json files from a folder via curl, so I made this little script.
nfiles=*.json
echo "Enter user:"
read user
echo "Enter password:"
read -s password
for file in $nfiles
do
    echo -e "\n----$file----"
    curl --user $user:$password -i -X POST "https://foo.bar/foo/bar" \
        -H "Content-Type: application/json" -d "@$file"
done
Maybe it fits your needs.

How to upload source version to Azure Setting using curl?

My objective is simple: get hold of the last commit hash when I run my app
Attempts:
I started to use the git-last-commit package, but where the app runs it's a normal directory, and the repository is outside that folder:
/config
/deployments
/diagnostics
/ipaddr_0
/locks
/repository
/wwwroot
The website runs inside wwwroot and the git repo is in repository. I couldn't get hold of it programmatically.
So I tried the Kudu API and it's as easy as a curl POST ... but how can I pass the commit hash as curl data?
I've tried:
$ git log -n1 --pretty=format:"%H" | curl -X POST -H 'Content-Type: application/json' https://$AZURE_LOGIN:$AZURE_PASS@$AZURE_APPNAME.scm.azurewebsites.net/api/settings -d '{ "SOURCE_VERSION":"&> /dev/stdin" }'
and
$ git log -n1 --pretty=format:"%H" | curl -X POST -H 'Content-Type: application/json' https://$AZURE_LOGIN:$AZURE_PASS@$AZURE_APPNAME.scm.azurewebsites.net/api/settings -d '{ "SOURCE_VERSION":"#d" }'
only to find that it sends literally what I write and not the piped value
The idea was to have this as a Bitbucket pipeline step to be executed for every deployment...
Does any of you have some trick to accomplish this?
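One way around this is to capture the hash in a shell variable with command substitution and splice it into the JSON body, instead of piping it into curl. A minimal sketch, assuming the same Kudu settings endpoint and credentials as above:
# Capture the commit hash, then embed it in the JSON body
HASH=$(git log -n1 --pretty=format:"%H")
curl -X POST -H 'Content-Type: application/json' \
    -d "{ \"SOURCE_VERSION\": \"$HASH\" }" \
    "https://$AZURE_LOGIN:$AZURE_PASS@$AZURE_APPNAME.scm.azurewebsites.net/api/settings"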

Running Curl in Run Shell Script with arguments in Automator

I'm trying to send a curl command in Automator via 'Run Shell Script', with arguments, but having no luck. I'm using /bin/bash and passing input as arguments. Here is my script, but I keep getting Bad Request from IFTTT. I gather it's to do with not using the args correctly (if I just put "value1":"test" it works fine). How should I format the $1?
for f
do
    curl -X POST -H "Content-Type: application/json" -d '{"value1":$1}' \
        https://maker.ifttt.com/trigger/Automator/with/key/heremykey
done
Thanks!
You should pass valid JSON. There is no built-in JSON support in Bash, so you need an external tool such as PHP or Node:
#!/bin/bash -
function json_encode {
    printf "$1" | php -r 'echo json_encode(stream_get_contents(STDIN));'
}

for f
do
    value=`json_encode "$f"`
    curl -X POST -H "Content-Type: application/json" -d "{\"value1\":$value}" \
        https://maker.ifttt.com/trigger/Automator/with/key/heremykey
done
The script sends a {"value1": ...} string for each item in "$@" (because the short form of the for loop iterates over "$@").
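If PHP is not available, jq can do the same JSON escaping; a small sketch of the loop under that assumption:
for f
do
    # jq -n builds a fresh JSON object; --arg passes $f in already escaped
    payload=$(jq -n --arg v "$f" '{value1: $v}')
    curl -X POST -H "Content-Type: application/json" -d "$payload" \
        https://maker.ifttt.com/trigger/Automator/with/key/heremykey
done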

Triggering builds of dependent projects in Travis CI

We have our single page javascript app in one repository and our backend server in another. Is there any way for a passing build on the backend server to trigger a build of the single page app?
We don't want to combine them into a single repository, but we do want to make sure that changes to one don't break the other.
Yes, it is possible to trigger another Travis job after a first one succeeds. You can use the trigger-travis.sh script.
The script's documentation tells how to use it -- set an environment variable and add a few lines to your .travis.yml file.
It's possible, yes, and it's also possible to wait for the related build's result.
I discovered trigger-travis.sh from the previous answer, but before that I had implemented my own solution (for full working source code, cf. the pending pull request PR196 and the live result).
References
Based on the Travis API v3 documentation:
trigger a build (triggering-builds)
get build information (resource/builds)
You will need a Travis token, and you must set up this token as a secret environment variable on the Travis portal.
Following this doc, I was able to trigger a build and wait for it.
1) make .travis_hook_qa.sh
(extract) - to trigger a new build:
REQUEST_RESULT=$(curl -s -X POST \
    -H "Content-Type: application/json" \
    -H "Accept: application/json" \
    -H "Travis-API-Version: 3" \
    -H "Authorization: token ${QA_TOKEN}" \
    -d "$body" \
    https://api.travis-ci.org/repo/${QA_SLUG}/requests)
(it's the equivalent of trigger-travis.sh). You can customize the build definition via $body.
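For illustration, a minimal $body for the v3 requests endpoint might look like this (the branch and message are assumptions to adapt):
body='{
  "request": {
    "branch": "master",
    "message": "Triggered by upstream backend build"
  }
}'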
2) make .travis_wait_build.sh
(extract) - to wait for the just-created build and get its build info:
BUILD_INFO=$(curl -s -X GET \
    -H "Content-Type: application/json" \
    -H "Accept: application/json" \
    -H "Travis-API-Version: 3" \
    -H "Authorization: token ${QA_TOKEN}" \
    https://api.travis-ci.org/repo/${QA_SLUG}/builds?include=build.state\&include=build.id\&include=build.started_at\&branch.name=master\&sort_by=started_at:desc\&limit=1 )
BUILD_STATE=$(echo "${BUILD_INFO}" | grep -Po '"state":.*?[^\\]",'|head -n1| awk -F "\"" '{print $4}')
BUILD_ID=$(echo "${BUILD_INFO}" | grep '"id": '|head -n1| awk -F'[ ,]' '{print $8}')
You will have to poll until your timeout expires or the build reaches the expected final state.
Reminder: possible Travis build states are created|started and then passed|failed.
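A sketch of such a polling loop, assuming the BUILD_ID and QA_TOKEN variables from the extract above (the 10-second interval and 60-attempt limit are arbitrary choices):
# Poll the build state until it reaches a final state or we give up
for i in $(seq 1 60); do
    BUILD_INFO=$(curl -s \
        -H "Travis-API-Version: 3" \
        -H "Authorization: token ${QA_TOKEN}" \
        https://api.travis-ci.org/build/${BUILD_ID})
    BUILD_STATE=$(echo "${BUILD_INFO}" | grep -Po '"state":.*?[^\\]",' | head -n1 | awk -F "\"" '{print $4}')
    echo "build ${BUILD_ID} state: ${BUILD_STATE}"
    if [ "${BUILD_STATE}" = "passed" ]; then exit 0; fi
    if [ "${BUILD_STATE}" = "failed" ]; then exit 1; fi
    sleep 10
done
echo "timed out waiting for build ${BUILD_ID}"
exit 1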
