My requirement is to run Postman scripts in Bamboo. We have a repository for the collection JSON; however, as my environment file has some sensitive data like client_id, secret_id, username, password, etc., I can't push it to my repo.
Please advise me how I can run my collections in Bamboo using Newman.
You can use your environment variables if you have a Postman account:
You'll need to get the X-Api-Key for your Postman project (from your Postman account); then you'll be able to use Postman API calls to get the collection and environment IDs (see the Postman API documentation).
Install Newman via npm (see the Newman documentation for details).
Run a command-line task from your Bamboo build that triggers the Postman test run.
The command will look like this:
newman run "https://api.getpostman.com/collections/{{collectionId}}?apikey={{ApiKey}}" \
  -e "https://api.getpostman.com/environments/{{EnvironmentId}}?apikey={{ApiKey}}"
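One way to meet the original requirement (no secrets in the repo) is to store them as Bamboo plan variables and inject them at run time with Newman's `--env-var` flag. A minimal sketch, assuming hypothetical plan variables `postman_api_key`, `collection_id`, `environment_id`, `client_id` and `client_secret` (Bamboo exposes plan variables with a `bamboo_` prefix, and password-type variables are masked in the build log):

```shell
#!/usr/bin/env bash
# Sketch of a Bamboo script task. All bamboo_* variable names are
# assumptions: Bamboo exposes plan variables with this prefix, and
# password-type variables are masked in the build log.
set -euo pipefail

run_collection() {
  local api_key="${bamboo_postman_api_key:?}"
  local collection_id="${bamboo_collection_id:?}"
  local environment_id="${bamboo_environment_id:?}"

  # Secrets are injected at run time, so nothing sensitive is committed.
  newman run "https://api.getpostman.com/collections/${collection_id}?apikey=${api_key}" \
    -e "https://api.getpostman.com/environments/${environment_id}?apikey=${api_key}" \
    --env-var "client_id=${bamboo_client_id}" \
    --env-var "client_secret=${bamboo_client_secret}"
}

# Bamboo sets bamboo_buildNumber, so this only fires inside a build:
if [ -n "${bamboo_buildNumber:-}" ]; then
  run_collection
fi
```

Each `--env-var` overrides the variable of the same name in the downloaded Postman environment, so the stored environment can hold harmless placeholder values.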
I have a Jenkins job that clones the repository from GitHub and then runs an Execute Shell step that invokes the Newman command with the collection and environment JSON files.
After that, some of the services in the collection work fine.
But the file-upload-related services are failing.
The error is displayed as: File load error: <"file name">, no such file
Can you help me resolve this issue?
All my services are automated and all work fine in the Postman and Newman tools.
But now I'm trying to run my collection on Jenkins by cloning the repository from GitHub and running Newman from the shell script.
Expectation:
In the collection, I have 15 services for which I have to upload different data files, and some of the services depend on those responses.
So I want to upload the files while running the Jenkins job.
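A common cause of this error is that Newman resolves the file paths stored in form-data request bodies relative to its working directory, so the run fails when the data files are missing from the cloned workspace or the paths don't match. A sketch of the Jenkins shell step, assuming the upload files are committed under a hypothetical `test-data/` folder; the `--working-dir` flag tells Newman where to resolve relative file paths:

```shell
#!/usr/bin/env bash
# Hypothetical Jenkins "Execute Shell" step. File names and the
# test-data/ folder are assumptions; the point is that Newman resolves
# relative file paths against --working-dir (default: the current dir).
set -euo pipefail

run_collection() {
  cd "${WORKSPACE:?}"   # Jenkins checks the repository out here
  newman run collection.json \
    -e environment.json \
    --working-dir "${WORKSPACE}/test-data"
}

# JENKINS_URL is only set inside a Jenkins build:
if [ -n "${JENKINS_URL:-}" ]; then
  run_collection
fi
```

For this to work, the paths recorded in the collection's requests must match the layout of the checked-out repository.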
While trying to set the build status of a commit through ssh, I was experiencing some difficulties. I first set the build status successfully, using a GitHub personal access token. Based on this answer, I created the following curl command:
#!/bin/bash
USER="red"
REPO="code"
COMMIT_SHA="6ec8d6ef221c3e317fa20b1f541770b8f46f065c"
MY_TOKEN="somelongpersonaltoken"
curl -H "Authorization: token $MY_TOKEN" \
  --request POST \
  --data '{"state": "failure", "description": "Failed!", "target_url": "https://www.stackoverflow.com"}' \
  "https://api.github.com/repos/$USER/$REPO/statuses/$COMMIT_SHA"
This sets the build status, shown as a red cross in the GitHub UI:
Next, I retrieved the GitHub commit status, using:
GET https://api.github.com/repos/$USER/$REPO/commits/$COMMIT_SHA/statuses
Which outputs:
[{"url":"https://api.github.com/repos/... ,"state":"failure","description":"Failed!","target_url":"https://www.stackoverflow.com","context":"default","created_at":"2021-12-19T10:10:20Z","updated_at":"2021-12-19T10:10:20Z"...,"site_admin":false}}]
Which is as expected.
Then for the second part, I tried to omit using a GitHub personal access token, and use my ssh credentials to set the commit build status. However, this answer seems to suggest that that is currently not possible. Hence, I would like to ask:
How can I set a GitHub commit build status using ssh credentials in Bash?
I stand by my 2013 answer and confirm, in late 2021, that using SSH for GitHub API URL seems not supported.
Even the latest GitHub CLI gh api command only proposes HTTPS calls, not SSH.
Makes an authenticated HTTP request to the GitHub API and prints the response.
The endpoint argument should either be a path of a GitHub API v3 endpoint, or "graphql" to access the GitHub API v4.
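For completeness, the same status POST can be expressed with `gh api` (it still goes over HTTPS under the hood; gh authenticates with a token, not SSH). A sketch, reusing the owner/repo/SHA values from the question:

```shell
#!/usr/bin/env bash
# Sketch: set a commit status via the GitHub CLI.
# Arguments: owner repo commit-sha
set -euo pipefail

set_status() {
  gh api "repos/$1/$2/statuses/$3" \
    --method POST \
    -f state=failure \
    -f description='Failed!' \
    -f target_url='https://www.stackoverflow.com'
}

# Example: set_status red code 6ec8d6ef221c3e317fa20b1f541770b8f46f065c
```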
I'm trying to run a collection from a text file on a free web host, and I can run it easily with:
newman run %https://mysite.txt%
Now I'm trying to locally capture the requests sent during the newman run, so I'm adding this command in CMD:
set HTTP_PROXY=127.0.0.1:62248
This should allow my app to record Newman's requests when run like this:
newman run %https://mysite.txt% --env-var HTTP_PROXY --insecure
(It works perfectly with locally hosted files.)
However, since the txt file is hosted over HTTPS, I'm getting the following error:
error: collection could not be loaded
unable to fetch data from url "https://mysite.txt"
tunneling socket could not be established, cause=getaddrinfo ENOTFOUND 62248
Can I run Newman collection from a secure https web address while locally recording it, or should I download and save it first locally and then run it locally?
My current temporary workaround (hopefully I'll find a better one) is to download the file from wherever it's stored and run Newman locally with the downloaded file.
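The `getaddrinfo ENOTFOUND 62248` part of the error is a hint: without a scheme, the proxy string `127.0.0.1:62248` is likely parsed as a URL with protocol `127.0.0.1:` and host `62248`. A sketch of the probable fix (POSIX-sh syntax; on Windows CMD use `set VAR=value` instead of `export`): give the proxy an explicit `http://` scheme and set `HTTPS_PROXY` as well, since the collection is fetched over HTTPS. Note that `--env-var` defines a Postman environment variable for scripts rather than setting the process proxy, so it isn't needed here.

```shell
# Explicit scheme on the proxy URL; proxy port taken from the question.
export HTTP_PROXY="http://127.0.0.1:62248"
# The collection is fetched over https, so the https proxy must be set too:
export HTTPS_PROXY="http://127.0.0.1:62248"
newman run "https://mysite.txt" --insecure
```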
There is a need to upload the script from the local machine to the server and then run it over there. Can someone please let me know how I can achieve this?
Just copy it from the local machine to the "server" using SCP for Linux or SMB for Windows. Once done, you can log into the server over SSH or RDP and execute your JMeter test in command-line non-GUI mode.
If you want fully unattended/automated execution consider the following:
Set up a version control system, e.g. use GitHub to store your script(s)
Configure a webhook to trigger an action when you commit the file
Install Jenkins on the server
Configure Jenkins to listen to the GitHub webhook and kick off a build running a JMeter test when it fires
This way, whenever you add a new script or update an existing one, it will automatically trigger the job that executes the test; check out the How to Integrate Your GitHub Repository to Your Jenkins Project article for detailed steps if needed.
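The manual copy-then-run flow can be sketched like this (host name, user and paths are assumptions):

```shell
#!/usr/bin/env bash
# Copy a JMeter script to the server over SCP, then run it there in
# command-line non-GUI mode. Host, user and paths are assumptions.
set -euo pipefail

deploy_and_run() {
  local server="$1" script="$2" remote_dir="$3"
  scp "$script" "$server:$remote_dir/"
  ssh "$server" "jmeter -n -t '$remote_dir/$script' -l '$remote_dir/results.jtl'"
}

# Example: deploy_and_run user@test-server test-plan.jmx /opt/jmeter/tests
```

The `-n` flag runs JMeter non-GUI and `-l` writes the results file, which you can copy back with scp afterwards.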
I'm creating a Rundeck job which will be used to rollback an application. My .jar files are stored in a Nexus repository and I would like to add an option to Rundeck where I can choose a .jar version from Nexus and then run the rollback job on this.
I have tried using this plugin: https://github.com/nongfenqi/nexus3-rundeck-plugin, but it doesn't seem to be working. When I am logged in to Nexus I can access the JSON file listing the artifacts from my browser, but when I am logged off the JSON file is empty, even if the Nexus server is running.
When adding the JSON URL as a remote URL option in Rundeck like the picture below, I get no option to choose from when running the job, even if I am logged in to Nexus, as shown by picture number 2. Is there a way to pass user credentials with options, or any other workaround for this?
I would recommend installing Apache/HTTPD locally on your Rundeck server and using a CGI script for this.
Write a CGI script that queries your Nexus 3 service for the versions available for the jar file and echoes the results in JSON format.
Place the script in /var/www/cgi-bin/ with the executable bit enabled. You can test it like so:
curl 'http://localhost/cgi-bin/script-name.py'
In your job you can configure your remote URL accordingly.
I find using a local CGI script to be much more reliable and flexible. You can also handle any authentication requirements there.
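A minimal sketch of such a CGI script in shell (the Nexus search endpoint is real in Nexus 3, but the service URL, repository and component name here are assumptions; the credentials stay on the server, not in the Rundeck job):

```shell
#!/bin/bash
# Hypothetical CGI script for /var/www/cgi-bin/: queries the Nexus 3
# search REST API and prints the available versions as a JSON array,
# which Rundeck can consume as a remote-URL option.
# NEXUS_USER/NEXUS_PASS, the URL, repository and name are assumptions.

print_versions() {
  echo "Content-Type: application/json"
  echo
  curl -s -u "$NEXUS_USER:$NEXUS_PASS" \
    "http://localhost:8081/service/rest/v1/search?repository=maven-releases&name=myapp" |
    jq -c '[.items[].version]'
}

# The web server sets GATEWAY_INTERFACE when invoking a CGI program:
if [ -n "${GATEWAY_INTERFACE:-}" ]; then
  print_versions
fi
```

The Rundeck option's remote URL then points at `http://localhost/cgi-bin/script-name` and the job user never sees the Nexus credentials.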