I made a custom function block that uses FB_XmlSrvRead and FB_XmlSrvWrite from the Tc2_TcXmlDataSrv library to read and write an XML file. The functionality of the custom function block is tested with the TcUnit library. After moving the unit tests to a new project, only the tests that used either an XML read or an XML write failed; the other unit tests in the new project still worked.
The XML function blocks reported the following error:
Error code: 1808
Name: ADSERR_DEVICE_SYMBOLNOTFOUND
Description: Symbol not found.
I tried to write different symbols, but each time it failed. I'm quite sure the symbols exist.
It turned out that I had added {attribute 'hide'} above the program that I used to execute the unit tests. Apparently the symbols of a hidden program can no longer be resolved over ADS, which matches the ADSERR_DEVICE_SYMBOLNOTFOUND error, so only the unit tests that needed to read or write an XML file failed. Other unit tests were not affected.
A fellow developer wrote test code that uses Mockito's verify:
verify(spyService, times(1)).register("ios", any());
The funny thing is that this code runs fine on their local machine and in the Jenkins build environment, and the code went to production. But it failed on my local machine with the following:
Caused by: org.mockito.exceptions.misusing.InvalidUseOfMatchersException:
Invalid use of argument matchers!
4 matchers expected, 1 recorded:
-> at com.tile.services.rest.service.insurance.InsuranceCoverageServiceTest.testNonPartnerTilesWithoutTosAcceptance(InsuranceCoverageServiceTest.java:163)
This exception may occur if matchers are combined with raw values:
//incorrect:
someMethod(anyObject(), "raw String");
When using matchers, all arguments have to be provided by matchers.
For example:
//correct:
someMethod(anyObject(), eq("String by matcher"));
For more info see javadoc for Matchers class.
Given that the project is well structured via Gradle, the JUnit version is the same everywhere.
Do you know what may be causing this inconsistency?
Did you already use this?
verify(spyService, times(1)).register(eq("ios"), any());
In your line you are using a specific value "ios" and a matcher any(), and we cannot mix both in the same call; when one argument is given by a matcher, all arguments must be matchers, which is why "ios" is wrapped in eq().
keep it strong
This is a question about unit testing (Jest + @testing-library/react).
Hi. I started using @nrwl/react recently.
It is an amazing product and I'm excited to build a monorepo project with Nx.
By the way, there is afterEach(cleanup); in the generated template test file.
This is my sample project.
https://github.com/tyankatsu0105/reproducibility-react-test-nx/blob/master/apps/client/src/app/app.spec.tsx#L7
However, react-testing-library doesn't need cleanup when using Jest:
https://testing-library.com/docs/react-testing-library/api#cleanup
Please note that this is done automatically if the testing framework you're using supports the afterEach global (like mocha, Jest, and Jasmine). If not, you will need to do manual cleanups after each test.
In fact, I see an error when I remove afterEach(cleanup); from the test files:
Found multiple elements with the text:
thanks!
I've got a Java/Maven project that uses RestDocs to generate documentation of our endpoints.
For those unfamiliar with RestDocs: it monitors tests during the build and uses information from the tests to generate "snippets" that are then used to create API documentation with Asciidoc during the mvn package phase.
The problem I'm facing is that if the asciidoc (.adoc) file is referencing a non-existent snippet, the doc gets generated and will say something like:
"Unresolved directive in myDoc.adoc - include::{snippets}/error-example/response-fields.adoc[]"
Since this happens after the tests, I don't know how to test for something like this so that the build can stop and let the developer know that they need to update either the .adoc or the snippet.
Any guidance would be much appreciated.
Adding this to the configuration of the Asciidoctor Maven plugin fails the build when a snippet that the .adoc expects is not found:
<logHandler>
    <outputToConsole>true</outputToConsole>
    <failIf>
        <containsText>include file not found</containsText>
    </failIf>
</logHandler>
I have a package whose init() validates that specific environment variables are set for the process, and panics otherwise.
I do this to ensure the process is properly configured at launch.
The problem is that this approach is not really testable (using _test.go files), since the environment doesn't exist in the test suite.
What's the best way to solve this problem?
Do you want to be able to test the validation, or just skip it entirely in the test file? Either way is going to use the same basic approach, which is to separate out the validation code into its own file that doesn't build during tests. If you just want to skip validation entirely during the test, put the whole init() function into that file. If you want to test validation, just make the validation code call your own shim to get environment values, and put your shim in that non-test file, and put a separate shim into a file that only compiles during tests.
You can control whether a file builds during tests using the file name or a build constraint in the file header: _test.go files are only compiled by go test, and for the opposite direction you can add a build constraint and pass a custom tag with go test -tags=... (note that go test does not apply a tag automatically).
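Here is a minimal sketch of that layout, split across three files as the comments indicate. All concrete names are assumptions for illustration only: the config package, the skipenvcheck tag, and the APP_API_KEY / APP_DB_URL variables do not come from the question.

// config.go - always compiled; holds the validation logic and a shim for reading env vars.
package config

import (
    "fmt"
    "os"
)

// getenv is the shim: production code uses os.LookupEnv, tests can swap in a fake.
var getenv = os.LookupEnv

// validateEnv returns an error for the first required variable that is missing.
func validateEnv(keys ...string) error {
    for _, k := range keys {
        if _, ok := getenv(k); !ok {
            return fmt.Errorf("missing required environment variable %q", k)
        }
    }
    return nil
}

// init_prod.go - holds the panicking init(); excluded from test runs when tests
// are built with: go test -tags skipenvcheck
//go:build !skipenvcheck

package config

func init() {
    if err := validateEnv("APP_API_KEY", "APP_DB_URL"); err != nil {
        panic(err)
    }
}

// config_test.go - only compiled by go test; replaces the shim and tests the validation.
package config

import (
    "os"
    "testing"
)

func TestValidateEnv(t *testing.T) {
    // Fake environment where only APP_API_KEY is set; restore the real shim afterwards.
    getenv = func(key string) (string, bool) { return "value", key == "APP_API_KEY" }
    defer func() { getenv = os.LookupEnv }()

    if err := validateEnv("APP_API_KEY"); err != nil {
        t.Fatalf("expected APP_API_KEY to validate, got %v", err)
    }
    if err := validateEnv("APP_DB_URL"); err == nil {
        t.Fatal("expected an error for missing APP_DB_URL")
    }
}

Running go test -tags skipenvcheck keeps the panicking init() out of the test binary while still letting the tests exercise the validation itself.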
I am trying to figure out how to run unit tests, using Google Test, and send the results to TeamCity.
I have run my tests and output the results to an XML file, using the command-line argument --gtest_output="xml:test_results.xml".
I am trying to get this XML to be read by TeamCity, but I don't see how I can get XML reports passed to TeamCity during the build/run, except through XML Report Processing:
I added XML Report Processing, added Google Test, then...
it asks me to specify monitoring rules, and I added the path to the XML file... I don't understand what monitoring rules are, or how to create them...
[Still, I can see nowhere in the generated XML any indication that it intends to talk to TeamCity...]
In the log, I have:
Google Test report watcher
[13:06:03][Google Test report watcher] No reports found for paths:
[13:06:03][Google Test report watcher] C:\path\test_results.xml
[13:06:03]Publishing internal artifacts
And, of course, no report results.
Can anyone please direct me to a proper way to import the XML test results file into TeamCity? Thank you so much!
Edit: is it possible that XML Report Processing only processes reports that were created during the build (which Google Test doesn't do?), and ignores previously generated reports as "out of date", while simply saying that it can't find them, or that they are in the wrong format, or... however I should read the message above?
I found a bug report that shows that XML reports not generated during the build are ignored, which made a newbie like me believe that they may not have been generated correctly.
Two simple solutions:
1) Create a post-build script.
2) Add a build step that calls the test executable from the command line with the --gtest_output argument. Example:
Add build step
Add build feature: XML report processing
I had similar problems getting it to work. This is how I got it working.
When you call your Google Test executable from the command line, prepend %teamcity.build.checkoutDir% to the name of your XML file to set its path, like this:
--gtest_output=xml:%teamcity.build.checkoutDir%test_results.xml
Then when configuring your additional build features on the build steps page, add this line to your monitoring rules:
%teamcity.build.checkoutDir%test_results.xml
Now the paths match and are in the build directory.