Collect output of script and publish it to Confluence in Jenkins - shell

I have a shell script that logs on to a given server, finds all the WARs running on it, and prints their names and versions to the console.
I need to take this output and display it on a page in Confluence.
I went through this plugin for Jenkins and can connect and print static content on the page I provide.
The output that my script creates is dynamic. How do I, say, store it in a variable and use it under Post-build Actions -> Publish to Confluence -> Wiki Markup Replacements?
I have googled around and experimented with Jenkins, to no avail. I have seen the '/env-vars.html' page, but none of those variables are of use to me, since my data is dynamic.

I referred to this to accomplish the task.
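For context, what I am after would look roughly like this in a scripted pipeline (a sketch only; list-wars.sh and WARS_REPORT are placeholder names, and whether the Confluence plugin expands environment variables in its replacement fields is exactly what I am unsure about):

    node {
        // Run the existing shell script and capture its stdout as a string
        def report = sh(returnStdout: true, script: './list-wars.sh myserver').trim()
        // Expose it as an environment variable so that later steps
        // which expand ${VARS} could pick it up
        env.WARS_REPORT = report
        echo "Captured report:\n${report}"
    }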

Related

Is there a way to publish Protractor test results in Confluence and have an overview?

We run our Protractor regression tests in GitLab CI and we have Jasmine HTML reports. Right now only the QA team monitors them and checks failures, if any.
But we would like to make them more visible. The devs have also asked us if we can make the results visible in a single place instead of having to go to the GitLab job and browse for artifacts. Also, would it be possible to have an overview of pass/fail tests over time?
I'm not sure how and where to start. Any pointers would be appreciated.
You're looking for the expose_as keyword for artifacts. The full docs are here: https://docs.gitlab.com/ee/ci/yaml/#artifactsexpose_as.
If you use expose_as with your artifacts, GitLab CI will link them in any applicable Merge Request with the name you give in this field.
For example (from the docs):
test:
  script: ["echo 'test' > file.txt"]
  artifacts:
    expose_as: 'artifact 1'
    paths: ['file.txt']
In this example, a Merge Request for this pipeline will have a link called "artifact 1" that opens the file "file.txt".
This also works for directories, but if there's more than one file it will open in the job's artifacts browser (like you currently do).
There are some caveats, like:
If you use a variable in the artifacts path field, expose_as won't work
Max of 10 artifacts can be exposed
Glob patterns won't work
If Gitlab Pages is enabled, some file extensions will be automatically rendered using Pages (html, xml, txt, etc.).

How to add a custom link in a multibranch Jenkins Pipeline

We are archiving a lot of HTML reports in a Jenkins pipeline (scripted Pipeline). These are accessible through a link "Last Successful Artifacts" on the job page as usual. But we would like to create an additional custom link that points to one of these reports (which is generated whether the build is successful or not).
I found the DocLink plugin, but it's not listed on the pipeline compatibility list and I'm not able to figure out how this eventually could be used in a pipeline.
The HTML Publisher Plugin is another one I was looking at. But it's not suited to our use case, since it requires us to gather all the reports and publish them again. It also puts all the content in an iframe, but all we need is a link to one of the already archived HTML reports.
Here is an example of adding a summary link to a build:
manager.createSummary("document.png").appendText("<a href='"+ pom.url + "'>View Maven Site</a>", false)
Since that method accepts HTML and could be abused for XSS, you need to approve such calls:
https://jenkins.io/doc/book/managing/script-approval/
For more examples, look here: https://wiki.jenkins.io/display/JENKINS/Groovy+Postbuild+Plugin
For Pipeline, the Badge plugin was extracted from the Groovy Postbuild plugin, and it can create the summary using something like:
createSummary icon:'package.png', text: "<a href='$pom.url'>View Maven Site</a>"
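In context, a scripted-pipeline sketch (the reports/index.html path is an assumption; ${env.BUILD_URL}artifact/ is the standard URL prefix for archived artifacts):

    node {
        // Archive the generated reports whether the build passes or fails
        archiveArtifacts artifacts: 'reports/**', allowEmptyArchive: true
        // Add a summary link pointing at one of the archived HTML reports
        // (requires the Badge plugin; the HTML string must be script-approved)
        createSummary icon: 'document.png',
            text: "<a href='${env.BUILD_URL}artifact/reports/index.html'>Test Report</a>"
    }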

Play! Framework 2.1.3 PDF problems

I am working on a school project in which we have designed a web application that takes in a lot of user info, creates a PDF, and then should display that PDF to the user so they can print or save it. We are using Play! Framework 2.1.3 as our framework and server, with Java on the server side. I create the PDF with Apache's PDFBox library. Everything works as it should in development mode, i.e. launching on localhost with Play's run command. The issue is that when we put it up on the server and launch with Play's start command, it seems to take a snapshot of the directory (or at least the assets/public folder), which is where I am housing the output.pdf file(s); I have attempted to move the file elsewhere, but that still results in a 404 error. Initially I believed this to be a caching problem on the Linux machine we were deploying to, and I tried many of the tricks to keep the browser from caching the PDF,
like using JavaScript to append a timestamp to the filename,
or using this cache-control directive from the Play! documentation:
"assets.cache./public/stylesheets/output.pdf"="max-age=0"
Then I tried to save the PDF under a different filename each time, pass back the name of that file, and reference it directly through the file structure in the HTML,
which also works fine with the run command but not with start.
Finally I came to the conclusion that when the start command is issued, Play bundles up the files, so only the files that exist at that moment can be seen.
I read the documentation here:
http://www.playframework.com/documentation/2.1.x/Production
where I noticed this part:
When you run the start command, Play forks a new JVM and runs the default Netty HTTP server. The standard output stream is redirected to the Play console, so you can monitor its status.
So it looks like the fact that it forks a new JVM is what is causing my pain.
So my question really is: can this be worked around in some way, so that a web app can create and display a PDF form? (If I cannot get this to work, the only solution I can see is to simulate the form with HTML and fill it out from there, which I really think is a bad way to do this.)
This seems like something that should have a solution, but I cannot seem to find or come up with one. Please help.
I have looked here:
http://www.playframework.com/documentation/2.1.x/JavaStream
The answer may be in there, but I'm not getting it to work; I am still pretty novice with the Play! Framework.
You are trying to deliver the generated PDF file to the user by placing it in the assets directory, and putting a link to it in the HTML. This works in development mode because Play finds the assets in the directory. It won't work in production because the project is wrapped up into a jar file when you do play dist, and the contents of the jar file can't be modified by the Play application. (In dev mode, Play has a classpath entry for the directory. In production, the classpath points to the jar file).
You are on the right lines with JavaStream. The way forward is:
Generate the PDF somewhere in your local filesystem (I recommend the temp directory).
Write a new Action in your Application object that opens the file you generated, and serves it instead of a web page.
Check out the Play docs for serving files. This approach also has the advantage that you can specify the filename that the user sees. There is an overloaded function Controller.ok(File file, String filename) for doing this. (When you generate the file, you should give it a unique name, otherwise each request will overwrite the file from a previous request. But you don't want the user to see the unique name).
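A minimal sketch of such an action in Play 2.1 Java, using the overload mentioned above (the class name, the downloadPdf action, and the temp-directory naming scheme are all assumptions for illustration):

    import java.io.File;
    import play.mvc.Controller;
    import play.mvc.Result;

    public class Pdfs extends Controller {
        public static Result downloadPdf(String token) {
            // The PDF was generated earlier into the system temp directory
            // under a unique, request-specific name (the token)
            File pdf = new File(System.getProperty("java.io.tmpdir"), token + ".pdf");
            if (!pdf.exists()) {
                return notFound("Report not found");
            }
            response().setContentType("application/pdf");
            // Serve the file, letting the user see a friendly name
            // instead of the unique one
            return ok(pdf, "report.pdf");
        }
    }

Add a matching route (e.g. GET /pdf/:token controllers.Pdfs.downloadPdf(token: String)) and link to it from your page instead of linking into the assets folder.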

Get XML Reports in TeamCity from Google Test

I am trying to figure out how to run unit tests, using Google Test, and send the results to TeamCity.
I have run my tests and output the results to an XML file, using the command-line argument --gtest_output="xml:test_results.xml".
I am trying to get this XML to be read by TeamCity. I don't see how I can get XML reports passed to TeamCity during the build/run...
Except through XML Report Processing:
I added XML Report Processing, added Google Test, then...
it asks me to specify monitoring rules, and I added the path to the XML file... I don't understand what monitoring rules are, or how to create them...
[Still, I can see nowhere in the generated XML any indication that it intends to talk to TeamCity...]
In the log, I have:
Google Test report watcher
[13:06:03][Google Test report watcher] No reports found for paths:
[13:06:03][Google Test report watcher] C:\path\test_results.xml
[13:06:03]Publishing internal artifacts
And, of course, no report results.
Can anyone please direct me to a proper way to import the XML test results file into TeamCity? Thank you so much!
Edit: is it possible that XML Report Processing only processes reports that were created during the build (which Google Test doesn't do), ignoring previously generated reports as "out of date" while simply saying that it can't find them, or that they are in the wrong format, or however I should read the message above?
I found a bug report which shows that XML reports not generated during the build are ignored, leading a newbie like me to believe that they may not have been generated correctly.
Two simple solutions:
1) Create a post-build script
2) Add a build step that calls the test executable with the command-line argument (a sketch follows). Example:
Add a build step
Add a build feature - XML report processing
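Concretely, the build step can be just the test binary itself (a sketch; unit_tests.exe is a placeholder name, and --gtest_output is the standard Google Test flag):

    unit_tests.exe --gtest_output=xml:test_results.xml

The XML Report Processing build feature then watches test_results.xml, which now gets (re)generated during the build itself.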
I had similar problems getting it to work. This is how I got it working.
When you call your google test executable from the command line, prepend %teamcity.build.checkoutDir% to the name of your xml file to set the path to it like this:
--gtest_output=xml:%teamcity.build.checkoutDir%test_results.xml
Then when configuring your additional build features on the build steps page, add this line to your monitoring rules:
%teamcity.build.checkoutDir%test_results.xml
Now the paths match and are in the build directory.

How do I include the full URL of the FTP location in a Hudson email after a build is published via the FTP plugin?

I am using the Hudson "Publish artifacts to FTP" task after a build to put an installer up on a web site.
I would like to automatically add that link to my email.
Unfortunately, Hudson makes up a directory name based on the time and date and places the file there.
Is there a way to get that value and put it in the build success email or otherwise automatically create the full url?
Hopefully you are using the "Hudson Email Extension" plugin. This gives you many more customization options than the email support built into the core.
If you are, you might consider putting the token:
${ENV, var} - Displays an environment variable
...into the email. You could set an environment variable in your build script to the FTP link and then insert it into the email. I'm sorry I don't use the Publish Artifacts to FTP plugin myself, but you should be able to mimic the way that plugin sets the FTP destination, and then stick it into an environment variable, which the Email Extension Plugin can then use.
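For illustration, assuming your build script can reconstruct the generated directory name and export it as FTP_URL (a hypothetical variable name), the email template could then contain:

    Download the installer: ${ENV, var="FTP_URL"}

which the Email Extension plugin expands when the message is sent.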
