Copying to TeamCity's Out directory before running unit tests - continuous-integration

So my situation is that I finally finished configuring TeamCity for CI. I got it to run my unit tests with some friendly help on SO.
However, many unit tests fail because there needs to be a config file alongside the unittests.dll once it's built and ready to run.
I've written a simple Command Line step with:
copy %system.teamcity.build.checkoutDir%\xx.configfile <destination>
The destination is the problem: I need it to be the Out directory TeamCity creates.
TC creates SYSTEM_<machinename> <datetime>\OUT. An example:
C:\TeamCity\buildAgent\temp\buildTmp\SYSTEM_GIDEON 2015-07-02 16_51_09\Out
In there is my unittests.dll and I want to copy my config file there. What environment var or (anything else) can I use in the command line script?
My steps are: (1) Build Tests, then (2) Copy Config, then (3) Run Tests. After step (1) I have that xxx\xxx\Out directory, and I need to get that directory from some variable.
I'm using TeamCity 9.0.2.

I don't think your problem is to do with TeamCity; it's to do with the way MSTest works. You need your .config file to be a DeploymentItem so that your tests deploy it to the directory MSTest runs the tests in.
To be honest, I'm surprised you don't have this problem running locally, which makes me think you must be using some other test runner (like ReSharper) on your local machines, since you haven't seen the problem there.
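For the copy step itself, here is a minimal sketch of what the Command Line step could look like, assuming the test project compiles to a UnitTests\bin\Release folder under the checkout directory (that output path is a guess; substitute the one your build actually produces):
rem copy the config next to the compiled unittests.dll (the output path here is an assumption)
copy "%system.teamcity.build.checkoutDir%\xx.configfile" "%system.teamcity.build.checkoutDir%\UnitTests\bin\Release\"
With the file sitting next to unittests.dll and referenced from the tests as a deployment item (for example [DeploymentItem("xx.configfile")] on a test class), MSTest takes care of carrying it into the SYSTEM_<machinename> <datetime>\Out folder it creates for the run, so the copy step never needs to know that folder's name in advance.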

Related

Gradle Cucumber test generates build folder on daemon instead of project directory

I have a test automation project where I basically run Cucumber tests via a Gradle task. What's weird is that the build folder is generated in the Gradle daemon folder instead of the project directory, e.g.
/Users/my_user/.gradle/daemon/5.6
Whereas it should be on:
/Users/my_user/my_project/build
Weirdly enough, this only seems to happen on my local machine. Is there anything I might have missed in setting up Gradle?
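One quick diagnostic (just a sketch, and it assumes the Cucumber run is wired into the regular Gradle test task; substitute whatever task you actually invoke) is to run the build once without the daemon and see where the build folder ends up:
# run once without the Gradle daemon; if the build folder now lands under the
# project directory, the daemon's working directory is the likely culprit
./gradlew test --no-daemon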

Deploying Oracle Service Bus With Maven: Deploys Fine From One Directory But Fails From Another

I'm attempting to create an automated build and deployment for an OSB (Oracle Service Bus) composite. Such a system consists of two commands (run via command prompt from the directory in which the POM resides) after setting up Maven and the OSB plugin on the build server:
mvn package
mvn deploy -DoracleServerUrl=http://serverurl:port -DoracleUsername=username -DoraclePassword=password
This fails in the build system with the following exception:
The session cannot be activated due to the existence of conflicts.
But I believe, at its core, this is because the build system creates the package with the first command during the build phase and then deploys it with the second command during the release phase.
If I take the code directly and run the two commands from Directory 1:
D:\OSBComposites\HelloWorldOSBService\HelloWorldOSBService
the commands run and the composite deploys fine.
If I literally copy the same code from Directory 1 to Directory 2 and run the same commands from Directory 2:
D:\OSBComposites\HelloWorldOSBService\HelloWorldOSBService2
the second command fails with the same exception cited above.
This isn't a one-off situation either - I can recreate it dozens of times consistently. Running the commands from Directory 1 always succeeds while running the commands from Directory 2 always fails with the exception noted above.
And yes, this is a simple default HelloWorld composite - as simple as can be with no references to absolute paths.
Is there a cache in Maven or OSB that's "remembering" the original path from which the composite was first deployed or some other mechanism that prevents a composite from being deployed from a different location?
If your pom.xml resides in /path/directory1/pom.xml, your OSB project gets deployed as directory1; redeploying it as directory2 could then cause the conflicts you observe.
If you need to deploy it from a different location, you could place it in /path2/directory1/pom.xml instead.
For your example, this should work: copy your project's contents to a path like the one below, keeping the same leaf directory name, and then run the Maven deployment.
D:\OSBComposites\HelloWorldOSBService2\HelloWorldOSBService
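In concrete terms, using the commands and paths from the question (the xcopy flags are just one way to do the copy), that would be:
rem copy the project, keeping the leaf directory name HelloWorldOSBService
xcopy /E /I D:\OSBComposites\HelloWorldOSBService\HelloWorldOSBService D:\OSBComposites\HelloWorldOSBService2\HelloWorldOSBService
cd /d D:\OSBComposites\HelloWorldOSBService2\HelloWorldOSBService
mvn package
mvn deploy -DoracleServerUrl=http://serverurl:port -DoracleUsername=username -DoraclePassword=password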

Where does Jenkins store the project source

I have a Jenkins job that uses a script to build my project. The script fails on the following line: mvn -e -X -Dgit='$git' release:prepare.
To track down the cause, I want to go to the Jenkins server and run mvn -e -X -Dgit='$git' release:prepare from the command line, to see if it works.
Does Jenkins store the projects' source code somewhere, such that I can go to that folder and call Maven?
If yes, then where?
Yes, it stores the project files for the job by default at
/var/lib/jenkins/workspace/{your-job-name}
This is where Jenkins expects the project files to be, or where it pulls them from source control before it starts working/building with them.
Quote from Andrew M.:
"Hudson/Jenkins doesn't quite work that way. It stores configurations and job information in /var/lib/jenkins by default (if you're using the .deb package). If you want to setup persistence for a specific application, that's something you'll want to handle yourself - Hudson is a continuous integration server, not a test framework.
Check out the Wiki article on Continuous Integration for an overview of what to expect."
From this question on Server Fault.
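Putting that together: assuming the default installation path above and that the job runs on the machine where Jenkins itself is running (both assumptions), reproducing the failing step by hand would look roughly like this, with your-job-name as the placeholder from above:
# switch to the job's workspace and re-run the failing goal with the same flags
cd /var/lib/jenkins/workspace/your-job-name
mvn -e -X -Dgit='$git' release:prepare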
This worked for me:
/var/jenkins/workspace/JobNameExample
but if your build machine (node) is different from the one where Jenkins is running (the manager), you need to specify it:
/var/jenkins/workspace/JobNameExample/label/NodeName
where you can define the label for the node as well.
Jenkins currently stores its workspace files in /var/jenkins_home/workspace/project_name
(I am running it from Docker, though!)

Configure Mocha Test Runner in Bamboo

I've configured Mocha tests and run them in WebStorm, so I know the module is working properly. But I can't seem to make them run from a Bamboo task. The task finishes with Success, but 0 tests are executed.
This is my configuration at the moment:
app/ is my working directory. I also tried app/node_modules/mocha/bin/ and other possibilities. I'm not sure which of all the mocha-named files in the module is actually the Mocha executable...
Or maybe the problem lies in the test directories? I've got test files in app/test/unit/models/ and app/test/unit/services/, respectively. In WebStorm, though, I configured it with the general test directory, just /app/test. Configuring the Mocha task in Bamboo with the specific test folders did not yield results...
I believe the problem comes from wrong directory configuration in the task, but I've already tried every path I can think of and I've got no idea what's missing or wrong...
I noticed from your screenshot that the "Parse test results produced by this task" box isn't checked. This is what tells Bamboo to parse the output of the tests that you run.
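As for which of the mocha-named files is the executable: npm puts a launcher for every locally installed package under node_modules/.bin, so, assuming app/ really is the working directory and the tests live under app/test, a plain shell invocation would look roughly like this:
cd app
# node_modules/.bin/mocha is the command-line entry point for the local install;
# --recursive makes it descend into test/unit/models and test/unit/services
./node_modules/.bin/mocha --recursive test
If that run finds the tests, the remaining work is mapping the same working directory and test path into the Bamboo task (and, as noted above, ticking the parse-results option).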

How to access test artifacts from Jenkins if test fails

I have a Maven project which performs a number of time consuming tests as part of the integration-test Maven cycle. I'm using Jenkins as the CI server.
During the integration test a number of files are produced in the target folder. For example, an "actual" BMP file is produced and compared to an "expected" BMP file. If the test fails, I need to look at the files in the target folder to determine how to deal with the error. Maybe the actual BMP looks fine and so it should be promoted to the new expected BMP. On the other hand, it may reveal a problem that requires a code fix.
The thing is, I don't have any way to access these files other than to SSH into the CI server and manually scp them over to my own machine for closer inspection. It would be extremely helpful if I could access these files from the Jenkins web interface.
I tried using the build-helper-maven-plugin to attach the relevant files as Maven artifacts, but the problem is that there is no suitable Maven phase that executes after integration-test when a test fails.
What can I do? Can I use the "Copy Artifact" plugin for this?
1) The files in the target folder can be accessed using a link such as /ws/projectname/target/filename...
2) Rather than typing the URL each time, the SideBar plugin can be used to add a link to the file to Jenkins' left-hand menu, making it easily accessible.
You need to copy your files into your workspace in a build step and archive them from there - Jenkins lets you specify artifacts only relative to the workspace.
I usually create a directory keyed by the BUILD_ID in the workspace, so that artifacts from different builds do not get mixed up in case I do not clean the workspace and archive from there (specifying ${BUILD_ID}/**/* in the archiving step).
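A minimal sketch of that copy step as an "Execute shell" build step (target/*.bmp is only an example pattern based on the files described in the question; adjust it to whatever you need to inspect):
# collect interesting test output under a per-build directory inside the workspace
mkdir -p "$WORKSPACE/$BUILD_ID"
cp -r target/*.bmp "$WORKSPACE/$BUILD_ID/" || true
# then archive with the pattern ${BUILD_ID}/**/* in the "Archive the artifacts" post-build action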
If your build fails before it reaches the copying step and therefore never does the copy, take a look at this question.
