How to mock script invocation with Pester?

I want to create a mock with Pester that is called instead of a script.
A simplified example is below.
somecmd.ps1
# This is the script I want to not be called in my test.
Write-Host "Output from somecmd.ps1"
throw "Error, should have been mocked"
somemodule.psm1
# This is the module I am testing.
function ModuleFunction {
    .\somecmd
}
function somecmd {
    Write-Host "Output from somecmd in somemodule"
    throw "Error, should never be called"
}
somemodule.Tests.ps1
# This is my Pester test.
BeforeAll {
    Import-Module .\somemodule.psm1 -Force
}
Describe 'SomeTest' {
    It 'Should mock' {
        # How must I declare the mock below so that somecmd.ps1 is not invoked?
        Mock somecmd { Write-Host "Output from mock" } -ModuleName somemodule
        ModuleFunction
    }
}
As stated in the comment above, my problem is that I cannot figure out how to declare the mock so that the test uses it instead of actually invoking somecmd.ps1.
I tried looking into this issue, but following the suggestion there did not solve my problem.
Unfortunately, in my real scenario, it is not an option for me to re-write the module to better support testing.
I am running:
PowerShell: 5.1.19041.1682
Pester: 5.3.3
Does anyone have an idea?

From the GitHub Issue you linked, I don't think you can mock a *.ps1 script directly.
However, a simple workaround is to wrap your call to .\somecmd.ps1 in a separate function, and then mock that...
somemodule.psm1
function Invoke-SomeCmd {
    .\somecmd
}
function ModuleFunction {
    # do some stuff
    Invoke-SomeCmd
    # do some more stuff
}
somemodule.Tests.ps1
BeforeAll {
    Import-Module .\somemodule.psm1 -Force
}
Describe 'SomeTest' {
    It 'Should mock' {
        Mock "Invoke-SomeCmd" {
            Write-Host "Output from mock"
        } -ModuleName somemodule
        ModuleFunction
    }
}
and then you get this output:
Starting discovery in 1 files.
Discovery found 1 tests in 327ms.
Running tests.
Output from mock
[+] C:\src\so\pester\somemodule.tests.ps1 992ms (274ms|433ms)
Tests completed in 1.02s
Tests Passed: 1, Failed: 0, Skipped: 0 NotRun: 0
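If you also want to assert that the mocked wrapper was actually invoked, rather than just inspecting the console output, Pester 5's Should -Invoke assertion can be added to the test. A small sketch extending the It block above:
It 'Should mock' {
    Mock "Invoke-SomeCmd" { Write-Host "Output from mock" } -ModuleName somemodule
    ModuleFunction
    # Verify the mocked wrapper was called exactly once inside the module
    Should -Invoke -CommandName Invoke-SomeCmd -ModuleName somemodule -Times 1 -Exactly
}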


How can I set a Jenkins job to pass when the tests actually fail

I'm trying to do something similar to what this question describes:
Jenkins failed build: I Want it to pass
I want to create a pipeline job in Jenkins for all the known-bug tests. I want the job to PASS (be green) when all the tests FAIL, and to fail when even one test passes.
I found this solution:
stage('Tests') {
    steps {
        script {
            // running the tests
            status = sh "${mvnHome}/bin/mvn clean test -e -Dgroups=categories.knownBug"
            if (status === "MARK-AS-UNSTABLE") {
                currentBuild.result = "STABLE"
            }
        }
    }
}
but got an error
Unsupported operation in this context # line 47, column 39.
if (status === "MARK-AS-UNSTABLE") {
--- EDIT ---
Thanks to @yrc I changed the code to:
try {
    sh "${mvnHome}/bin/mvn clean test -e -Dgroups=categories.knownBug"
} catch (err) {
    echo "Caught: ${err}"
    currentBuild.result = "STABLE"
}
It did fix the error message, but I want the job to pass when the tests are failing. Right now both the tests and the job fail.
Just wrap your execution in a try-catch block. Note that "STABLE" is not a valid value for currentBuild.result, which is why your job still fails; use "SUCCESS" instead:
try {
    sh "${mvnHome}/bin/mvn clean test -e -Dgroups=categories.knownBug"
} catch (err) {
    echo "Caught: ${err}"
    // "SUCCESS" is a valid build result; "STABLE" is not and falls back to FAILURE
    currentBuild.result = "SUCCESS"
}
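If you also need the inverse, failing the job when the known-bug tests unexpectedly pass, a sketch using sh's returnStatus option (which returns the exit code instead of throwing on failure) could look like this:
script {
    // returnStatus: true makes sh return the exit code instead of throwing
    def status = sh(script: "${mvnHome}/bin/mvn clean test -e -Dgroups=categories.knownBug", returnStatus: true)
    if (status != 0) {
        // The known-bug tests failed, which is the expected outcome
        currentBuild.result = 'SUCCESS'
    } else {
        // Every known-bug test passed: fail the build so someone reviews them
        error 'Known-bug tests passed unexpectedly'
    }
}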

Only run another PP file after one has completed

A chocolatey provider is required to install packages. This works, but only after another .pp file has finished executing.
The problem is that Puppet evaluates both files under the node statement and errors with an invalid provider. My current workaround is to comment out the second .pp file, let the first one run, then uncomment it and rerun with puppet agent --test, after which everything works.
I have tried tags and used an if statement with the tagged function, but that doesn't seem to work either.
class windows::chocolatey {
  exec { 'set_executionpolicy':
    command  => "set-executionpolicy unrestricted -force -scope process;
      (iex((new-object net.webclient).DownloadString('https://chocolatey.org/install.ps1')))>\$null 2>&1",
    provider => 'powershell',
    creates  => 'C:/ProgramData/chocolatey',
  }
}
node "web-iis-02" {
class { 'windows':} #chocolatey installing to allow atom.pp to work
class { 'atom': } # init.pp below install using chocolatey
#installs package
class atom {
if tagged(windows) {
include atom::pakages
notify { "Calling Pakagepp script": }
}
}
# if tagged, init.pp above calls this:
class atom::pakages {
  include chocolatey
  package { 'Atom':
    ensure   => 'latest',
    provider => 'chocolatey',
  }
}
I get this from pakages.pp:
Error: Failed to apply catalog: Parameter provider failed on
Package[Atom]: Invalid package provider 'chocolatey' (file:
/etc/puppetlabs/code/environments/production/modules/atom/manifests/pakages.pp, line: 3)
Try adding a require dependency, so the atom class is declared after the windows class:
class { 'windows': }
class { 'atom':
  require => Class['windows'],
}
or quick and dirty:
class { 'windows': }
-> class { 'atom': }
You'll need to remove that tagged condition as it isn't needed.
I can't quite tell from your question which classes depend on which, but I'm pretty sure it is require you need. You may need to add a require for the chocolatey class:
class { 'atom':
  require => Class['windows', 'chocolatey'],
}
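Putting it together, a sketch of the node definition with both dependencies declared explicitly (assuming the chocolatey class is available, e.g. from the puppetlabs/chocolatey module) would be:
node "web-iis-02" {
  class { 'windows': }
  class { 'atom':
    # atom is only declared once windows and chocolatey are applied
    require => Class['windows', 'chocolatey'],
  }
}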

Fetch credentials depending on environment

I can fetch credentials as explained in the example taken from here: https://jenkins.io/doc/book/pipeline/syntax/#environment
stage('Example') {
    environment {
        CREDS = credentials('MY_CREDS_DEV')
    }
    steps {
        sh 'echo hello'
    }
}
But what I want to do is to get credentials based on some condition.
For example I have MY_CREDS_DEV and MY_CREDS_QA defined in Jenkins credentials. And I have a property ENV=dev defined in Jenkins 'Prepare an environment for the run' section.
I'd like to access credentials based on my environment, i.e. ENV property.
I tried to use CREDS = credentials('MY_CREDS_' + ${ENV}), and I tried extracting the string concatenation into a separate function and calling it like CREDS = credentials(concatenate(${ENV})), but I got Internal function call parameters must be strings.
So seems I can put only a string to credentials() function which basically means to hardcode it. But how can I choose which credentials to use - dev or qa?
Use CREDS = credentials('MY_CREDS_' + ENV) or CREDS = credentials("MY_CREDS_${ENV}"). ${ENV} on its own does not evaluate to 'dev' but to ${'dev'}, which is not a string.
For completeness:
In fact, after playing around with the Groovy console, it looks like ${ENV} will try to call a function called $ with the closure parameter {ENV}, which in turn would return 'dev'. It would give the same result as ENV if you had defined a function like:
def $(Closure closure) {
    closure()
}
But most probably that's not what you wanted to do.
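So, assuming ENV is set to dev or qa before the pipeline runs, the environment block from the question becomes:
stage('Example') {
    environment {
        // Resolves to MY_CREDS_DEV or MY_CREDS_QA depending on ENV
        CREDS = credentials("MY_CREDS_${ENV}")
    }
    steps {
        sh 'echo hello'
    }
}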
Got this working on Jenkins 2.190.2 with a little Groovy. I haven't tested on earlier versions; it just happens to be the one I'm on now. It works fine with multiple stages.
pipeline {
    agent {
        label "xxxxx"
    }
    environment {
        ROLE = getRole()
    }
    stages {
        stage("write to s3 etc") {
            environment {
                AWS = credentials("${ROLE}")
            }
            steps {
                script {
                    sh """
                        aws s3 sync build/ "s3://xxxxxxxxxxxx"
                    """
                }
            }
        }
    }
}
def getRole() {
    def branchName = "${env.BRANCH_NAME}"
    if (branchName == "xxxxxx") {
        return 'some_credential_string'
    } else {
        return 'some_other_credential_string'
    }
}
If you would like to use different credentials based on a condition, you can do so as in the following example:
stage ("Example") {
steps {
script {
if ( params.TEST_PARAMETER == "test_value1" ) {
withCredentials([string(credentialsId: env.CREDENTIALS_1, variable: 'SOME_VARIABLE')]) {
yourFunction()
}
}
else {
withCredentials([string(credentialsId: env.CREDENTIALS_2, variable: 'SOME_VARIABLE')]) {
yourFunction()
}
}
}
}
}
You would need to define yourFunction at the end of your Jenkinsfile. In this case, when TEST_PARAMETER is test_value1, CREDENTIALS_1 will be used from the Jenkins credentials list; otherwise CREDENTIALS_2 will be used. You could handle more options by converting the if/else into a switch statement, as sketched below.
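A minimal sketch of that switch variant (CREDENTIALS_DEFAULT is a hypothetical fallback ID, not something from the original question):
script {
    def credsId
    switch (params.TEST_PARAMETER) {
        case 'test_value1':
            credsId = env.CREDENTIALS_1
            break
        case 'test_value2':
            credsId = env.CREDENTIALS_2
            break
        default:
            credsId = env.CREDENTIALS_DEFAULT // hypothetical fallback
    }
    withCredentials([string(credentialsId: credsId, variable: 'SOME_VARIABLE')]) {
        yourFunction()
    }
}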
Hope this helps.

Can't read a file param in a Jenkins pipeline

Example:
pipeline {
    agent any
    stages {
        stage('Parse CSV') {
            steps {
                script {
                    def fileToParse = readFile(params.FileLocation)
                }
                echo fileToParse
            }
        }
    }
}
I configured the job from the GUI; the file location parameter is called FileLocation. I uploaded a file and tried to read it. When I try to access params.FileLocation it returns null, as if it doesn't recognise it.
Your problem is with variable scope. You def the variable inside the script {} block, then try to use it outside of it. One easy fix is to def the variable outside the pipeline {} block at the global level, or just use params.FileLocation in your echo statement.
def fileToParse

pipeline {
    agent any
    stages {
        stage('Parse CSV') {
            steps {
                script {
                    fileToParse = readFile(params.FileLocation)
                }
                echo fileToParse
                echo params.FileLocation
            }
        }
    }
}
The file parameter type is not supported in pipeline jobs, and it has been removed from the documentation as well.
https://issues.jenkins-ci.org/browse/JENKINS-27413
Check available parameters: https://jenkins.io/doc/book/pipeline/syntax/#parameters
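Since the file parameter type isn't supported in pipelines, a common workaround is to pass a path as a string parameter and read the file from the workspace. A sketch (the parameter default and path are just examples):
pipeline {
    agent any
    parameters {
        // Example default; point this at a file already present in the workspace
        string(name: 'FileLocation', defaultValue: 'data/input.csv', description: 'Path to the CSV file')
    }
    stages {
        stage('Parse CSV') {
            steps {
                script {
                    def fileToParse = readFile(params.FileLocation)
                    echo fileToParse
                }
            }
        }
    }
}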

How to define and call custom methods in build.gradle?

As part of my project, I need to read files from a directory and perform some operations on them, all within the build script. For each file the operation is the same (reading some SQL queries and executing them). I think it's a repetitive task that would be better written inside a method. Since I'm new to Gradle, I don't know how this should look. Please help.
One approach is given below:
ext.myMethod = { param1, param2 ->
// Method body here
}
Note that this is created at the project scope, i.e. it is globally available to the project and can be invoked anywhere in the build script as myMethod(p1, p2), which is equivalent to project.myMethod(p1, p2).
The method can be defined under different scopes as well, such as within tasks:
task myTask {
    ext.myMethod = { param1, param2 ->
        // Method body here
    }
    doLast {
        myMethod(p1, p2) // This will resolve 'myMethod' defined in the task
    }
}
If you define a method with ext in any other *.gradle file, it becomes accessible project-wide. For example, here is a
versioning.gradle
// ext makes the method callable project-wide
ext.getVersionName = { ->
    try {
        def branchout = new ByteArrayOutputStream()
        exec {
            commandLine 'git', 'rev-parse', '--abbrev-ref', 'HEAD'
            standardOutput = branchout
        }
        def branch = branchout.toString().trim()
        if (branch.equals("master")) {
            def stdout = new ByteArrayOutputStream()
            exec {
                commandLine 'git', 'describe', '--tags'
                standardOutput = stdout
            }
            return stdout.toString().trim()
        } else {
            return branch
        }
    } catch (ignored) {
        return null
    }
}
build.gradle
task showVersion {
    doLast {
        // Use the inherited method (the old '<<' task syntax was removed in Gradle 5)
        println 'VersionName: ' + getVersionName()
    }
}
Without the ext.method() format, the method will only be available within the *.gradle file where it is declared. The same applies to properties.
You can define methods in the following way:
// Define an extra property
ext.srcDirName = 'src/java'

// Define a method
def getSrcDir(project) {
    return project.file(srcDirName)
}
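For example, a task could then use the method like this (printSrcDir is just an illustrative name):
task printSrcDir {
    doLast {
        // getSrcDir and srcDirName come from the snippet above
        println getSrcDir(project)
    }
}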
You can find more details in the Gradle documentation, Chapter 62: Organizing Build Logic.
An example with a root object containing methods.
hg.gradle file:
ext.hg = [
    cloneOrPull: { source, dest, branch ->
        if (!dest.isDirectory())
            hg.clone(source, dest, branch)
        else
            hg.pull(dest)
        hg.update(dest, branch)
    },
    clone: { source, dest, branch ->
        dest.mkdirs()
        exec {
            commandLine 'hg', 'clone', '--noupdate', source, dest.absolutePath
        }
    },
    pull: { dest ->
        exec {
            workingDir dest.absolutePath
            commandLine 'hg', 'pull'
        }
    },
]
build.gradle file
apply from: 'hg.gradle'
// clone expects (source, dest, branch); the dest and branch values here are illustrative
hg.clone('path/to/repo', file('repo'), 'default')
Somehow, maybe because it's five years since the OP, none of the
ext.someMethod = { foo ->
    methodBody
}
approaches worked for me. Instead, a simple function definition got the job done in my Gradle file:
def retrieveEnvvar(String envvar_name) {
    // System.getenv returns null (not "") when the variable is unset
    def value = System.getenv(envvar_name)
    if (!value) {
        throw new InvalidUserDataException("\n\n\nPlease specify environment variable ${envvar_name}\n")
    } else {
        return value
    }
}
And I call it elsewhere in my script with no prefix, i.e. retrieveEnvvar("APP_PASSWORD").
This is 2020 so I'm using Gradle 6.1.1.
@ether_joe, the top-voted answer by @InvisibleArrow above does work; however, you must define the method before you call it, i.e. earlier in the build.gradle file.
You can see an example here. I have used this approach with Gradle 6.5 and it works.
With Kotlin DSL (build.gradle.kts) you can define regular functions and use them.
It doesn't matter whether you define your function before the call site or after it.
println(generateString())

fun generateString(): String {
    return "Black Forest"
}

tasks.create("MyTask") {
    println(generateString())
}
If you want to import and use a function from another script, see this answer and this answer.
In my react-native project's build.gradle:
def func_abc(y) { return "abc" + y }
and then:
def x = func_abc("y")
If you want to check the value:
throw new GradleException("x=" + x)
or
println "x=" + x
