MSTest unable to copy test data files to build server - visual-studio-2013

I am trying to implement automated unit testing on each build using TFS.
Problem statement:
I have created a few XML files that store test data and set them to Copy Always. When the tests run locally, the files are picked up from the bin folder. When I schedule a build, the build process looks for the files in the Out folder under TestResults on the build server. The Out folder contains the DLLs but not the XML files, so the tests cannot find the data and the build partially fails.

You can specify additional files to deploy in your test settings file:
More details here - https://msdn.microsoft.com/en-us/library/ms182475.aspx

You could also use the DeploymentItem Attribute.
https://msdn.microsoft.com/en-us/library/microsoft.visualstudio.testtools.unittesting.deploymentitemattribute.aspx
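For example, a test that reads one of the XML data files could be decorated like this (the file name, folder and test class below are only illustrative, not taken from the question):

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class OrderImportTests // hypothetical test class
{
    public TestContext TestContext { get; set; } // populated by the MSTest runner

    // Copies TestData\orders.xml from the build output into a TestData
    // subfolder of the deployment directory before the test runs.
    [TestMethod]
    [DeploymentItem(@"TestData\orders.xml", "TestData")]
    public void LoadsOrdersFromXml()
    {
        var path = System.IO.Path.Combine(TestContext.DeploymentDirectory, "TestData", "orders.xml");
        Assert.IsTrue(System.IO.File.Exists(path));
    }
}

The first argument is a path relative to the test assembly's output directory; the optional second argument is the subfolder it is copied to under the deployment directory.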

Related

How to pass a directory with test files to an xUnit test?

I have an xUnit test subproject which includes a directory where test data files are stored. Does xUnit provide any feature to access test data included with the source files?
I think you could just get the current directory:
var workingDirectoryPath = System.IO.Directory.GetCurrentDirectory();
This will give you a working path like:
C:\Users\XXX\Documents\Visual Studio 2017\Projects\PROJECT_NAME\bin\Debug\netcoreapp2.2
Then traverse upwards to the test project directory. From there you can proceed to the test data directory and perform any operations you like.
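A rough sketch of that approach (the TestData folder name and the three-level traversal are assumptions about your project layout):

using System.IO;
using Xunit;

public class TestDataLocationTests
{
    [Fact]
    public void CanLocateTestDataDirectory()
    {
        // bin\Debug\netcoreapp2.2 -> up three levels -> test project root (assumed layout)
        var projectDir = Path.GetFullPath(
            Path.Combine(Directory.GetCurrentDirectory(), "..", "..", ".."));

        // Hypothetical folder in the test project that holds the data files
        var dataDir = Path.Combine(projectDir, "TestData");

        Assert.True(Directory.Exists(dataDir));
    }
}

A common alternative is to mark the data files as Copy to Output Directory in the test project, in which case they end up directly under the current directory and no traversal is needed.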

Automatically generate Code Coverage during nightly build

I am having trouble getting the code coverage .coverage file generated in the nightly build.
What I have: I've configured my build to use a .runsettings file, with Type of run settings set to CodeCoverageEnabled.
The build correctly runs all the required unit tests and measures code coverage, using only a selected set of assemblies (specified in the .runsettings file).
In the build report, within VS2013, I can manually export the code coverage file (a .coverage file).
What I need:
I need to configure the build to automatically generate that .coverage file in a target folder.
How do I do that?
The .coverage file is produced as part of the test results. You can use the .runsettings file to set an output path for the test results:
<ResultsDirectory>c:\TestResults</ResultsDirectory>
The .coverage file will be present in a subfolder within the results directory.
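Put together, the relevant parts of a .runsettings file might look like this (the results path and module filter below are placeholders, not the asker's actual values):

<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
  <RunConfiguration>
    <!-- Test results, including the .coverage file, land in a subfolder here -->
    <ResultsDirectory>c:\TestResults</ResultsDirectory>
  </RunConfiguration>
  <DataCollectionRunSettings>
    <DataCollectors>
      <DataCollector friendlyName="Code Coverage" uri="datacollector://Microsoft/CodeCoverage/2.0">
        <Configuration>
          <CodeCoverage>
            <ModulePaths>
              <Include>
                <!-- Placeholder pattern: restrict coverage to your own assemblies -->
                <ModulePath>.*MyCompany.*\.dll$</ModulePath>
              </Include>
            </ModulePaths>
          </CodeCoverage>
        </Configuration>
      </DataCollector>
    </DataCollectors>
  </DataCollectionRunSettings>
</RunSettings>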
If you want to push it to another location, you can do that with a post-build script in your nightly build's process template.

Create a deployment package to a folder in TeamCity

I currently have TeamCity set up on a server that does the following:
1) Builds the project
2) Runs unit tests
3) Publishes the project through IIS if successful
I want to add another step that produces a .zip file of what was deployed, placed in a directory on the server running TeamCity. I understand this is possible through artifacts, but everything I've tried so far hasn't given me the publish output.
I've tried options such as "** => C:\TC\Test.zip" but that includes the actual code implementation files.
Is there a way in which to publish a zip containing the publish result?
I've been trying this for hours without luck, so hopefully I can get an answer.
You can use this to zip all files relative to the root of the build:
**/* => artifacts.zip
To zip all files relative to a folder named publish:
publish/**/* => artifacts.zip
If your published files are not included in the zip, they may have been published somewhere outside the root of the build.

How to access test artifacts from Jenkins if test fails

I have a Maven project which performs a number of time consuming tests as part of the integration-test Maven cycle. I'm using Jenkins as the CI server.
During the integration test a number of files are produced in the target folder. For example, an "actual" BMP file is produced and compared to an "expected" BMP file. If the test fails, I need to look at the files in the target folder to determine how to deal with the error. Maybe the actual BMP looks fine and so it should be promoted to the new expected BMP. On the other hand, it may reveal a problem that requires a code fix.
The thing is I don't have any way to get access to these files, other than to ssh into the CI server and manually scp the files over to my own machine for closer inspection. It would be extremely helpful if I could access these files from the Jenkins web interface.
I tried using the build-helper-maven-plugin to attach the relevant files as Maven artifacts, but the problem is that there is no suitable Maven phase that executes after integration-test when a test fails.
What can I do? Can I use the "Copy Artifact" plugin for this?
1) The files in the target folder can be accessed using a link such as /ws/projectname/target/filename...
2) Rather than typing the URL each time, the SideBar plugin can be used to add a link to the file in Jenkins' left-hand menu, making it easily accessible.
You need to copy your files into your workspace in a build step and archive them from there - Jenkins lets you specify artifacts only relative to the workspace.
I usually create a directory keyed by the BUILD_ID in the workspace, so that artifacts from different builds do not get mixed up in case I do not clean the workspace and archive from there (specifying ${BUILD_ID}/**/* in the archiving step).
If your build fails before it reaches the copy step and therefore never copies the files, take a look at this question.

Build an Oracle ADF application JAR file using Ant

I am trying to write Ant scripts to build JAR files for an Oracle ADF application, and I noticed some differences between the contents generated by the build (deploy might be a better word, actually) process within JDeveloper and those generated by the Ant process:
META-INF/adfc-config.xml
META-INF/adflibWEBINDEX.txt
META-INF/adfm.xml
META-INF/faces-config.xml
META-INF/jar-adf-config.xml
META-INF/jar-connections.xml
META-INF/jax-ws-catalog.xml
META-INF/oracle.adf.common.services.ResourceService.sva
META-INF/task-flow-registry.xml
Does anyone know how these files are generated and how to edit the Ant scripts to include them?
I know that some of these exist in the project folder structure, but when I compare their contents against the same files in the JAR produced by the JDeveloper build, they are different. So I assume there is something more than a simple copy going on here.
Cheers,
Mo
Your best bet is to use "Create buildfile from project" in JDeveloper. This will produce the calls to ojdeploy necessary to create/update all the extra artifacts.
http://www.oracle.com/technetwork/articles/adf/part4-098813.html
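If you want to keep a hand-written build file instead, a rough sketch of invoking ojdeploy from Ant looks like this (the JDeveloper home, workspace, project and profile names below are all placeholders):

<!-- Run JDeveloper's headless deployer so the ADF-specific META-INF
     artifacts are generated rather than merely copied. -->
<target name="deploy-adf-library">
  <exec executable="${jdeveloper.home}/jdev/bin/ojdeploy" failonerror="true">
    <arg value="-workspace"/>
    <arg value="${basedir}/MyApplication/MyApplication.jws"/> <!-- placeholder workspace -->
    <arg value="-project"/>
    <arg value="MyModel"/>                                    <!-- placeholder project -->
    <arg value="-profile"/>
    <arg value="adflibMyModel"/>                              <!-- placeholder deployment profile -->
  </exec>
</target>

Because ojdeploy drives JDeveloper's own deployment machinery, it should produce the META-INF files listed in the question the same way an in-IDE deploy does.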
