Add specific header to bitbake wget fetcher - embedded-linux

I need to set a specific header to fetch an archive from a resource using the wget fetcher, analogous to:
wget --header "PRIVATE-ACCESS-TOKEN:blablablablabla https://some-resource...."
How can I set specific headers using that fetcher?
Thanks in advance!

You can do this in various ways; here are some:
Download the file manually and place it in the downloads directory (DL_DIR); a shell sketch of this is shown after the list of options.
Override the do_fetch task:
do_fetch() {
    bbnote "Fetching some file ..."
    wget ...
}
Note, however, that do_unpack uses SRC_URI, so you still need to set SRC_URI to the file URL for the unpack step. Here is an example I tested with the wget package itself:
LICENSE="CLOSED"
SRC_URI = "http://ftp.gnu.org/gnu/wget/wget2-2.0.0.tar.gz"
do_fetch(){
bbwarn "Fetching wget"
wget http://ftp.gnu.org/gnu/wget/wget2-2.0.0.tar.gz
}
After running do_fetch, the file is downloaded into the downloads directory, and do_unpack then unpacks it under the recipe's WORKDIR.
Specify your own wget command line for the wget fetcher:
FETCHCMD_wget = "/usr/bin/env wget --header "PRIVATE-ACCESS-TOKEN:blablablablabla""
The default wget command is defined in poky/bitbake/lib/bb/fetch2/wget.py:
self.basecmd = d.getVar("FETCHCMD_wget") or "/usr/bin/env wget -t 2 -T 30 --passive-ftp --no-check-certificate"
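Coming back to option 1: a minimal shell sketch of the manual download, assuming the default build/downloads location and using placeholder values for the archive name and URL:
# Sketch: fetch the archive by hand with the required header and drop it into
# DL_DIR so BitBake's fetcher can find it (checksums in the recipe still have to match).
DL_DIR="$HOME/poky/build/downloads"   # assumption: adjust to your actual DL_DIR
wget --header "PRIVATE-ACCESS-TOKEN:blablablablabla" \
     -O "$DL_DIR/archive.tar.gz" \
     "https://some-resource.example.com/archive.tar.gz"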
For more information, check the fetchers section of the BitBake User Manual.

Related

Mailgun attach file Gitlab CI

I'm trying to send a CSV file (an artifact from GitLab CI) over Mailgun.
Regular mail works well, but when I add an attachment it fails with an error:
curl: (26) Failed to open/read local data from file/application
My YAML file:
artifact:
  paths:
    - report_folder/result.csv

send_email:
  script: curl --user "api:$Mailgun_API_KEY"
    "https://api.mailgun.net/v3/$Mailgun_domain/messages"
    -F from='Gitlab <gitlab@example.com>'
    -F to=xxx@mail.com
    -F subject='test'
    -F text='hello form mailgun'
    -F attachment='@report_folder/result.csv'
I guess something is wrong with the file path in the last line, but I've tried different combinations and nothing works so far.
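For reference, curl's -F name=@path form reads the attachment from a path relative to the directory where curl runs; a minimal local test of just the Mailgun call, assuming the same variables are exported in the shell, might look like:
# Sketch: run the Mailgun call by itself; the file must exist at this relative
# path in the directory where curl is executed (e.g. the CI job's working directory).
ls -l report_folder/result.csv   # confirm the artifact is actually present
curl -s --user "api:$Mailgun_API_KEY" \
  "https://api.mailgun.net/v3/$Mailgun_domain/messages" \
  -F from='Gitlab <gitlab@example.com>' \
  -F to=xxx@mail.com \
  -F subject='test' \
  -F text='hello form mailgun' \
  -F attachment=@report_folder/result.csv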

how to download a file via wget when the target path includes wildcards

Here is an elegant example of how to download a file and copy it to the /etc/yum.repos.d folder:
REPOSITORY_SERVER=master_machine01
wget -nd -r -P /etc/yum.repos.d/ -A ".repo" "http://$REPOSITORY_SERVER/ambari/centos7/2.6.2.2-1/ambari.repo"
After the above command, the ambari.repo file is copied to /etc/yum.repos.d/.
Note: the path of the ambari.repo file on the server is:
ls -ltr /var/www/html/ambari/centos7/2.6.2.2-1/ambari.repo
-rw-r--r-- 1 root users 304 Jun 11 2018 /var/www/html/ambari/centos7/2.6.2.2-1/ambari.repo
So this is the simple case.
Now, what about paths that can differ, such as:
$REPOSITORY_SERVER/ambari/centos7/2.6.2.3-1/ambari.repo
or
$REPOSITORY_SERVER/ambari/centos7/2.6.2.2-4/ambari.repo
How can we use the CLI with wildcards then?
We tried the following:
wget -nd -r -P /etc/yum.repos.d/ -A ".repo" "http://$REPOSITORY_SERVER/ambari/centos7/*/ambari.repo"
but we get
HTTP request sent, awaiting response... 404 Not Found
2021-11-28 18:40:07 ERROR 404: Not Found.
or even with a backslash:
wget -nd -r -P /etc/yum.repos.d/ -A ".repo" "http://$REPOSITORY_SERVER/ambari/centos7/\*/ambari.repo"
but we get the same error.
Any idea how to resolve this issue?
"how to use the cli with Wildcards"
It is not possible to perform glob expansion with the HTTP protocol; these are unrelated technologies.
"how to resolve this issue?"
Devise and implement a method of getting the list of available files under a certain path from the HTTP server. For example, contact the server administrator and ask them about it. Or, if the HTTP server supports serving a directory listing, recursively filter the listing to find all matching paths. Or find and query some other page that contains all the links and extract the matching ones from the response, and so on.
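If the server does expose a directory index, a minimal shell sketch of the listing-and-filter approach could look like this (the index URL and the grep pattern are assumptions about how the server formats its listing):
# Sketch: pull the directory index, extract the version subdirectories, and fetch
# each candidate ambari.repo; assumes the server renders a plain HTML index page.
REPOSITORY_SERVER=master_machine01
for dir in $(curl -s "http://$REPOSITORY_SERVER/ambari/centos7/" | grep -oP 'href="\K[^"]+/'); do
    wget -nd -P /etc/yum.repos.d/ "http://$REPOSITORY_SERVER/ambari/centos7/${dir}ambari.repo"
done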

run cURL to upload a file without header

My aim is to upload a given file to ServiceNow, and I use the code below in a shell script to do it. The $ variables are declared elsewhere; SERVICENOW_API_URL=https://dev80516.service-now.com/api/now/attachment/file and the table name is sys_attachment.
$ curl $SERVICENOW_API_URL?table_name=$SERVICENOW_ATTACHMENT_TABLE_NAME\&table_sys_id=$SERVICENOW_ATTACHMENT_TABLE_SYS_ID\&file_name=$UPLOAD_FILE_NAME --request POST --header \"Accept:application/json\" --user $SERVICENOW_USER_NAME:$SERVICENOW_PASSWORD --header \"Content-Type\:application/json\" -F "uploadFile=@./blank.txt"
Whenever I upload a file, the uploaded file has the kind of headers shown below added to it. Please suggest changes to my command so the file is uploaded without these headers.
--------------------------565a353520eb07b1
Content-Disposition: form-data; name="uploadFile"; filename="blank.txt"
Content-Type: text/plain
--------------------------565a353520eb07b1--
Also, when a zipped file is uploaded to ServiceNow and then downloaded back from ServiceNow, I am unable to extract/unzip the file.
Wow, it seems command substitution with backticks works perfectly for text files:
-F "uploadFile=`cat blank.txt`"

Using CURL to download file and view headers and status code

I'm writing a Bash script to download image files from Snapito's web page snapshot API. The API can return a variety of responses indicated by different HTTP response codes and/or some custom headers. My script is intended to be run as an automated Cron job that pulls URLs from a MySQL database and saves the screenshots to local disk.
I am using curl, and I'd like to do these three things with a single curl command:
Extract the HTTP response code
Extract the headers
Save the file locally (if the request was successful)
I could do this using multiple curl requests, but I want to minimize the number of times I hit Snapito's servers. Any curl experts out there?
Or if someone has a Bash script that can respond to the full documented set of Snapito API responses, that'd be awesome. Here's their API documentation.
Thanks!
Use the dump headers option:
curl -D /tmp/headers.txt http://server.com
Use curl -i (include HTTP header) - which will yield the headers, followed by a blank line, followed by the content.
You can then split out the headers / content (or use -D to save directly to file, as suggested above).
There are three options: -i, -I, and -D.
> curl --help | egrep '^ +\-[iID]'
-D, --dump-header FILE Write the headers to FILE
-I, --head Show document info only
-i, --include Include protocol headers in the output (H/F)
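Putting the options together for the original question, a minimal sketch that saves the body, dumps the headers, and captures the status code in a single request (the URL and file names are placeholders, not Snapito specifics):
# Sketch: one request that writes the body to a file, the headers to another file,
# and prints the HTTP status code so a script can branch on it.
URL="https://example.com/snapshot"   # placeholder for the real Snapito API URL
STATUS=$(curl -s -o snapshot.png -D headers.txt -w '%{http_code}' "$URL")
if [ "$STATUS" -eq 200 ]; then
    echo "saved snapshot.png"
else
    echo "request failed with HTTP $STATUS (see headers.txt)" >&2
fi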

How can I update jenkins plugins from the terminal?

I am trying to create a bash script for setting up Jenkins. Is there any way to update a plugin list from the Jenkins terminal?
On a fresh setup there are no plugins available in the list, i.e.:
java -jar jenkins-cli.jar -s http://localhost:8080 install-plugin dry
won't work.
A simple but working way is to first list all installed plugins, look for updates, and install them.
java -jar /root/jenkins-cli.jar -s http://127.0.0.1:8080/ list-plugins
Each plugin that has an update available shows the new version in parentheses at the end, so you can grep for those:
java -jar /root/jenkins-cli.jar -s http://127.0.0.1:8080/ list-plugins | grep -e ')$' | awk '{ print $1 }'
If you call install-plugin with the plugin name, it is automatically upgraded to the latest version.
Finally, you have to restart Jenkins.
Putting it all together (can be placed in a shell script):
UPDATE_LIST=$( java -jar /root/jenkins-cli.jar -s http://127.0.0.1:8080/ list-plugins | grep -e ')$' | awk '{ print $1 }' );
if [ ! -z "${UPDATE_LIST}" ]; then
    echo Updating Jenkins Plugins: ${UPDATE_LIST};
    java -jar /root/jenkins-cli.jar -s http://127.0.0.1:8080/ install-plugin ${UPDATE_LIST};
    java -jar /root/jenkins-cli.jar -s http://127.0.0.1:8080/ safe-restart;
fi
You can actually install plugins from the computer terminal (rather than the Jenkins terminal):
Download the plugin from the plugin site (http://updates.jenkins-ci.org/download/plugins)
Copy that plugin into the $JENKINS_HOME/plugins directory
At that point either start Jenkins or call the reload settings service (http://yourservername:8080/jenkins/reload)
This will enable the plugin in Jenkins, assuming that Jenkins is started. For example:
cd $JENKINS_HOME/plugins
curl -O http://updates.jenkins-ci.org/download/plugins/cobertura.hpi
curl http://yourservername:8080/reload
Here is how you can deploy Jenkins CI plugins using Ansible, which of course is used from the terminal. This code is a part of roles/jenkins_ci/tasks/main.yaml:
- name: Plugins
  with_items:                               # PLUGIN NAME
    - name: checkstyle                      # Checkstyle
    - name: dashboard-view                  # Dashboard View
    - name: dependency-check-jenkins-plugin # OWASP Dependency Check
    - name: depgraph-view                   # Dependency Graph View
    - name: deploy                          # Deploy
    - name: emotional-jenkins-plugin        # Emotional Jenkins
    - name: monitoring                      # Monitoring
    - name: publish-over-ssh                # Publish Over SSH
    - name: shelve-project-plugin           # Shelve Project
    - name: token-macro                     # Token Macro
    - name: zapper                          # OWASP Zed Attack Proxy (ZAP)
  sudo: yes
  get_url: dest="{{ jenkins_home }}/plugins/{{ item.name | mandatory }}.jpi"
           url="https://updates.jenkins-ci.org/latest/{{ item.name }}.hpi"
           owner=jenkins group=jenkins mode=0644
  notify: Restart Jenkins
This is a part of a more complete example that you can find at:
https://github.com/sakaal/service_platform_ansible/blob/master/roles/jenkins_ci/tasks/main.yaml
Feel free to adapt it to your needs.
You can update the plugin list with this command line:
curl -s -L http://updates.jenkins-ci.org/update-center.json | sed '1d;$d' | curl -s -X POST -H 'Accept: application/json' -d @- http://localhost:8080/updateCenter/byId/default/postBack
FYI -- some plugins (mercurial in particular) don't install correctly from the command line unless you use their short name. I think this has to do with triggers in the jenkins package info data. You can simulate jenkins' own package update by visiting 127.0.0.1:8080/pluginManager/checkUpdates in a javascript-capable browser.
Or if you're feeling masochistic you can run this python code:
import json
import urllib2
import requests

UPDATES_URL = 'https://updates.jenkins-ci.org/update-center.json?id=default&version=1.509.4'
PREFIX = 'http://127.0.0.1:8080'

def update_plugins():
    "look at the source for /pluginManager/checkUpdates and downloadManager in /static/<whatever>/scripts/hudson-behavior.js"
    raw = urllib2.urlopen(UPDATES_URL).read()
    jsontext = raw.split('\n')[1]  # ugh, JSONP: keep only the JSON payload line
    json.loads(jsontext)  # i.e. error if not parseable
    print 'received updates json'
    # post the update-center data back to Jenkins
    postback = PREFIX + '/updateCenter/byId/default/postBack'
    reply = requests.post(postback, data=jsontext)
    if not reply.ok:
        raise RuntimeError(("updates upload not ok", reply.text))
    print 'applied updates json'
And once you've run this, you should be able to run jenkins-cli -s http://127.0.0.1:8080 install-plugin mercurial -deploy.
With a current Jenkins version, the CLI can be used via SSH. This has to be enabled on the "Global Security Settings" page in the administration interface, as described in the docs. Furthermore, the user who triggers the updates must add their public SSH key.
With the modified shell script from the accepted answer, this can be automated as follows; you just have to replace HOSTNAME and USERNAME:
#!/bin/bash
jenkins_host=HOSTNAME   # e.g. jenkins.example.com
jenkins_user=USERNAME
jenkins_port=$(curl -s --head https://$jenkins_host/login | grep -oP "^X-SSH-Endpoint: $jenkins_host:\K[0-9]{4,5}")

function jenkins_cli {
    ssh -o StrictHostKeyChecking=no -l "$jenkins_user" -p $jenkins_port "$jenkins_host" "$@"
}

UPDATE_LIST=$( jenkins_cli list-plugins | grep -e ')$' | awk '{ print $1 }' );
if [ ! -z "${UPDATE_LIST}" ]; then
    echo Updating Jenkins Plugins: ${UPDATE_LIST};
    jenkins_cli install-plugin ${UPDATE_LIST};
    jenkins_cli safe-restart;
else
    echo "No updates available"
fi
This greps the SSH port used by the Jenkins CLI and then connects via SSH without checking the host key, as the key changes with every Jenkins restart.
Then all plugins with an available update are upgraded, and afterwards Jenkins is restarted.
In Groovy
The Groovy path has one big advantage: it can be added to a "system groovy script" build step in a job without any change.
Create the file 'update_plugins.groovy' with this content:
jenkins.model.Jenkins.getInstance().getUpdateCenter().getSites().each { site ->
    site.updateDirectlyNow(hudson.model.DownloadService.signatureCheck)
}
hudson.model.DownloadService.Downloadable.all().each { downloadable ->
    downloadable.updateNow();
}

def plugins = jenkins.model.Jenkins.instance.pluginManager.activePlugins.findAll {
    it -> it.hasUpdate()
}.collect {
    it -> it.getShortName()
}
println "Plugins to upgrade: ${plugins}"

long count = 0
jenkins.model.Jenkins.instance.pluginManager.install(plugins, false).each { f ->
    f.get()
    println "${++count}/${plugins.size()}.."
}

if(plugins.size() != 0 && count == plugins.size()) {
    println "restarting Jenkins..."
    jenkins.model.Jenkins.instance.safeRestart()
}
Then execute this curl command:
curl --user 'username:token' --data-urlencode "script=$(< ./update_plugins.groovy)" https://jenkins_server/scriptText
