XUnit conditional tests

I am writing some xUnit tests to validate the correctness of an XML document. Some of the XML elements are optional, so I would like to write an xUnit test that runs only if the optional element is present.
I have an XML builder class that creates the document and inserts the optional elements if they are present in the DTO.

It's not the unit testing framework's responsibility to decide whether a test should run or not; however, it's up to you which assertions you perform in your test case. You could simply return from the test case if the optional element is not present, and perform the assertions if it is.
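A minimal sketch of that early-return pattern with xUnit and LINQ to XML; BuildDocument() and the "OptionalElement" name are hypothetical stand-ins for your builder class and the optional element:

using System.Xml.Linq;
using Xunit;

public class OptionalElementTests
{
    [Fact]
    public void OptionalElement_HasExpectedValue_WhenPresent()
    {
        // BuildDocument() stands in for your XML builder class.
        XDocument doc = BuildDocument();
        XElement optional = doc.Root.Element("OptionalElement");

        if (optional == null)
            return; // element absent: skip the assertions, test passes vacuously

        Assert.Equal("expected value", optional.Value);
    }

    static XDocument BuildDocument() =>
        new XDocument(new XElement("Root")); // placeholder implementation
}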


JMeter Custom Plugin Variable Substitution

Context
I am developing a custom JMeter plugin which generates test data dynamically from a tree-like structure.
The editor for the tree generates GUI input fields as needed, so I have no fixed set of configuration properties to set on the respective TestElement. Instead, I serialize the tree as a whole in the GUI class, store the result as a single property, and deserialize it in the config element, where it is processed further during test execution.
Problem
This works just fine, except that JMeter variable/function expressions like ${foo} or ${__bar(..)} in the dynamic input fields are not evaluated. As far as I understand the JMeter source code, the evaluation is triggered when the respective property setters in org.apache.jmeter.testelement.TestElement are used, which is not possible for my plugin.
Unfortunately, I was not able to find a proper implementation that can be used in my config element to evaluate such expressions explicitly after deserialization.
Question
I need a pointer to JMeter source code or documentation for evaluating variable/function expressions explicitly.
After I managed to set up the JMeter project properly in my IDE, I found org.apache.jmeter.engine.util.CompoundVariable, which can be used like this:
import org.apache.jmeter.engine.util.CompoundVariable;
import org.apache.jmeter.functions.InvalidVariableException;

CompoundVariable compoundVariable = new CompoundVariable();
compoundVariable.setParameters("${foo}"); // throws InvalidVariableException if the expression is malformed
// execute() returns the value of the expression in the current context
String value = compoundVariable.execute();

JMeter assertion modularity (can I re-use assertions?)

I am working on a test plan for our REST web application, and we have several common test types with common criteria we want to test for. For example, when creating entities through the API we have a common set of expectations for the JSON response: id should be set, created date should be set, etc.
Now, I would like to model my plans like this:
Thread Group
  Users (Simple Controller)
    User Create Tests (Simple Controller)
      Create Test 1 (Sampler)
      Create Test 2 (Sampler)
      Create Test 3 (Sampler)
      Common Creation Asserts (Module Controller)
    User Delete Tests (Simple Controller)
      Samplers...
      Common Delete Asserts (Module Controller)
  Events (Simple Controller)
    Event Create Tests (Simple Controller)
      Samplers...
      Common Creation Asserts (Module Controller)
    Event Delete Tests (Simple Controller)
      Samplers...
      Common Delete Asserts (Module Controller)
Thread Group for common assertions (disabled)
  Common Creation Assertions (Simple Controller)
    BSF Assertion 1
    BSF Assertion 2
    BSF Assertion 3
  Common Delete Assertions (Simple Controller)
    Asserts...
Now, I understand how scoping works, and that if I placed assertions directly where the Module Controllers are, they would be invoked for each sampler. However, I'd rather not copy, paste, and maintain numerous copies of the same assertions in each of these locations; hence I want a way to define assertions once and invoke them where appropriate.
However, with this approach, the assertions placed in the common Simple Controllers of the disabled Thread Group are never invoked (confirmed by using a BSF Assertion with logging messages). If I place an additional sampler in the common-assertions Simple Controller, it is invoked, but only a single time.
I'm using JMeter 2.12 but have confirmed that JMeter 2.8 behaves the same way.
So, how can I use JMeter to define assertions once, and re-use them anywhere?
Thanks!
There is no way to do this.
You can try factoring out the common parts by using variables within assertions; for example, if it's a Response Assertion, you can put the expected pattern in a variable and reference it from each assertion.
I ended up getting creative.
Using JSR223 assertions in JavaScript, I've accomplished what I wanted. This is a natural fit because all the response data I want to test is JSON; YMMV.
In User Defined Variables I define the tests I want to perform using JavaScript.
Tests like:
TEST_JSON:
try {
    eval('var obj = ' + prev.getResponseDataAsString());
} catch (e) {
    setFailed();
}

TEST_RESULT_SUCCESS:
if (obj.status != "success") {
    setFailed();
}
Then in the assertion(s) I can do something like:
eval(vars.get("TEST_JSON"));
eval(vars.get("TEST_RESULT_SUCCESS"));
And I don't have to re-write tests over and over and over.
I even have some utility functions that I can add to my assertion by doing
eval(vars.get("TEST_UTIL"));
which allows me to print additional logging from my assertions if I desire.
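Note that setFailed() is not a JMeter built-in; presumably it lives in the TEST_UTIL variable. A hedged sketch of what such helpers might look like, relying on the AssertionResult and log objects that JMeter binds in JSR223 assertions:

// Hypothetical content of the TEST_UTIL variable; eval'd inside a
// JSR223 assertion, where JMeter binds 'AssertionResult' and 'log'.
function setFailed(message) {
    AssertionResult.setFailure(true);
    AssertionResult.setFailureMessage(message || "assertion failed");
}
function logInfo(message) {
    log.info("[assertion] " + message);
}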

MSTest - data-driven test from a method?

Anyone know how? There is a way to put in a data source, but what is the syntax to have the data injected from a method?
I need to test all classes with specific attributes. The test basically validates certain attributes in certain assemblies (checking whether the database is in sync).
For that it would be nice to use one data-driven test that has a "driver" method that feeds in the names or types of the classes to test.
You can use the DataSourceAttribute http://msdn.microsoft.com/en-us/library/microsoft.visualstudio.testtools.unittesting.aspx
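A sketch of the usual DataSourceAttribute wiring, assuming a CSV file; classes.csv and the ClassName column are hypothetical. Note that classic MSTest pulls rows from an external data source rather than from a method:

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class AttributeSyncTests
{
    // Set by the framework; exposes the current data row.
    public TestContext TestContext { get; set; }

    [TestMethod]
    [DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
                "|DataDirectory|\\classes.csv", "classes#csv",
                DataAccessMethod.Sequential)]
    public void ValidatesAttributesForEachClass()
    {
        // "ClassName" is a hypothetical column in the CSV file.
        string className = TestContext.DataRow["ClassName"].ToString();
        // ... resolve the type and validate its attributes here
    }
}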

How to initialize test class resources in Visual Studio Unit Testing framework?

I'm using the unit testing framework in .NET in C++/CLI to test unmanaged C++ code.
For example, I would like an instance of System::Random to generate random values throughout the test methods.
Do I need to put this as a member variable in my test class?
If yes, where can I put the initialization code? The ClassInitialize() method that is generated is static for some reason, and it only has access to a TestContext, which I read is only for using test data from external sources.
You can add static properties to your test class and initialize them in the ClassInitialize() method if you need them to be available to all tests. If you want them initialized per test, then using the TestInitialize() method is better.
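A minimal C# sketch of the static-member approach; the same attributes apply in C++/CLI, where the member would be a static field of the ref class:

using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class RandomizedTests
{
    // Static because ClassInitialize runs once, before any test instance exists.
    static Random _random;

    [ClassInitialize]
    public static void InitClass(TestContext context)
    {
        _random = new Random(42); // a fixed seed keeps failures reproducible
    }

    [TestMethod]
    public void UsesSharedRandom()
    {
        int value = _random.Next(0, 100);
        Assert.IsTrue(value >= 0 && value < 100); // stand-in for real assertions
    }
}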
Are you sure you want to use random values in your unit tests? Typically you'd want to use known values (good values, bad values, edge cases, etc) so that your tests are predictable. Using multiple tests with various values where you know the expected behavior (outcome) is more typical than using random values.

Is it possible to create data-driven tests with MSpec?

With MSpec, is it possible to create data-driven tests?
For example, NUnit has the TestCase attribute that allows for multiple data-driven cases.
[TestFixture]
public class ExampleOfTestCases
{
    [TestCase(1, 2, 3)]
    [TestCase(3, 3, 6)]
    [TestCase(2, 2, 4)]
    public void when_adding_two_numbers(int number1, int number2, int expected)
    {
        Assert.That(number1 + number2, Is.EqualTo(expected));
    }
}
That's not possible. I would advise against driving MSpec with data; use NUnit or MbUnit if you need row tests or combinatorial tests (and MSpec when you describe behavior).
Follow-up: Aeden, TestCases/RowTests are not possible with MSpec and likely never will be. Please use NUnit for such cases, as it is the best tool for that job. MSpec excels when you want to specify system behavior (when an order is submitted => should notify the fulfilment service). For TestCases with MSpec you would need to create a context for every combination of inputs, which might lead to class explosion.
MSpec is also great when you want to have a sane test structure that is easy to learn. Instead of starting with a blank sheet of paper (think NUnit's [Test] methods) MSpec gives you a template (Establish, Because, It) that you can build your specifications around. Contrast this to the example you give where Arrange, Act and Assert are combined into one line of code.
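For comparison, a minimal sketch of that Establish/Because/It template, using the addition scenario from the question:

using Machine.Specifications;

[Subject("Adding two numbers")]
public class when_adding_two_numbers
{
    static int number1, number2, result;

    Establish context = () => { number1 = 1; number2 = 2; };

    Because of = () => result = number1 + number2;

    It should_return_the_sum = () => result.ShouldEqual(3);
}

A separate context class like this per combination of inputs is exactly the "class explosion" the answer above warns about.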
