I have a package whose init() validates that specific environment variables are set, and panics otherwise.
I do this to ensure the process is properly configured at launch.
The problem is that this approach is not really testable (via _test.go files), since those variables are not set when the test suite runs.
What's the best way to solve this problem?
Do you want to be able to test the validation, or just skip it entirely during tests? Either way the basic approach is the same: separate the validation code into its own file that doesn't build during tests.

If you just want to skip validation during tests, put the whole init() function into that file.

If you want to test the validation, have the validation code call your own shim to read environment values, put the production shim in that non-test file, and put a separate shim into a file that only compiles during tests.
You can control whether a file builds by putting a build constraint in the file header. Note that go test does not apply a build tag of its own, so either pass a custom tag when running the tests (go test -tags sometag) or, for the test-only shim, use a _test.go file, which only compiles during go test.
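A minimal sketch of the shim arrangement, assuming a custom testenv tag that you pass with go test -tags testenv (the file names, the tag, and the variable names here are placeholders, not from the question):

// envcheck.go -- production shim; excluded when the testenv tag is set.
//go:build !testenv

package config

import "os"

// getenv reads from the real process environment.
func getenv(key string) string { return os.Getenv(key) }

// envcheck_fake.go -- test shim; only built with go test -tags testenv.
//go:build testenv

package config

// fakeEnv starts with values that satisfy init(), so the package loads cleanly under test;
// individual tests mutate it and call Validate() again to exercise the failure paths.
var fakeEnv = map[string]string{"API_KEY": "dummy", "DB_URL": "dummy"}

// getenv reads from the in-memory map instead of the process environment.
func getenv(key string) string { return fakeEnv[key] }

// validate.go -- built in both modes.
package config

import "fmt"

func init() { Validate() }

// Validate panics unless every required variable is non-empty.
// The variable names are illustrative only.
func Validate() {
	for _, key := range []string{"API_KEY", "DB_URL"} {
		if getenv(key) == "" {
			panic(fmt.Sprintf("missing required environment variable %q", key))
		}
	}
}

A plain go build or go test keeps the original behaviour; go test -tags testenv swaps in the fake environment so the panic path can be exercised deliberately.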
I want to use standard Laravel facades, such as Str and Config, in my PestPHP Feature tests. Is there some concise way I can get PestPHP to do this for all of my tests in my tests/Feature directory?
I thought having uses(Tests\TestCase::class)->in('Feature'); in my Pest.php file would be sufficient, but all I ever get is this error: "A facade root has not been set."
The problem appears to be that Pest does not boot Laravel when it loads your test files, but only when it runs certain global closures, e.g. the test functions test() and it(). So you can't use Pest's "simplified" test script structure to initialize file-wide values (the equivalent of class attributes) at the top of the file, and beforeAll() does not work for this either. Instead, you have to use beforeEach() in each test file where facades will be needed.
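A minimal sketch of that pattern (the config key and the property name are only illustrative):

<?php
// tests/Feature/ExampleTest.php

use Illuminate\Support\Facades\Config;

beforeEach(function () {
    // By the time beforeEach runs, Pest has booted Laravel through Tests\TestCase,
    // so the facade root exists and these calls no longer fail.
    $this->appName = Config::get('app.name');
});

it('can use a facade prepared in beforeEach', function () {
    expect($this->appName)->toBeString();
});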
See these issues: https://github.com/pestphp/pest/issues/237, https://github.com/pestphp/pest/issues/33#issuecomment-658022624, https://github.com/pestphp/pest/issues/246.
I have JMeter tests (JMX files) that use Module Controllers, and they sometimes throw an error at startup when a Module Controller has no target controller selected.
Is there a way to verify that every (enabled) Module Controller has a target controller assigned?
As of JMeter 5.4.1 it is not possible to "validate" Module Controllers without actually running your tests. The Test Plan Tree is built at runtime from the test elements provided, and controllers like the Module Controller or Include Controller modify the Test Plan Tree on the fly, assembling one big test plan from reusable fragments or external scripts.
So your options are:
Perform a dry run with 1-2 users/iterations to see whether your test still works, produces a .jtl file without errors, etc.
It's also possible to run JMeter from Java code, so you could build a utility that checks the test plan against your acceptance criteria; a rough sketch follows below.
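For example, something along these lines, using the JMeter APIs to load the .jmx and flag Module Controllers without a target (the JMeter home path, the .jmx file name, and the ModuleController.node_path property name are assumptions; check them against your JMeter version):

import java.io.File;

import org.apache.jmeter.control.ModuleController;
import org.apache.jmeter.save.SaveService;
import org.apache.jmeter.testelement.TestElement;
import org.apache.jmeter.testelement.property.NullProperty;
import org.apache.jmeter.util.JMeterUtils;
import org.apache.jorphan.collections.HashTree;
import org.apache.jorphan.collections.HashTreeTraverser;

public class ModuleControllerCheck {

    public static void main(String[] args) throws Exception {
        String jmeterHome = "/opt/apache-jmeter-5.4.1";                 // assumed install path
        JMeterUtils.setJMeterHome(jmeterHome);
        JMeterUtils.loadJMeterProperties(jmeterHome + "/bin/jmeter.properties");
        JMeterUtils.initLocale();

        HashTree plan = SaveService.loadTree(new File("testplan.jmx")); // assumed file name

        plan.traverse(new HashTreeTraverser() {
            @Override
            public void addNode(Object node, HashTree subTree) {
                if (node instanceof ModuleController) {
                    TestElement mc = (TestElement) node;
                    // The GUI saves the selected target under "ModuleController.node_path";
                    // a missing property is treated here as "no controller selected".
                    if (mc.isEnabled()
                            && mc.getProperty("ModuleController.node_path") instanceof NullProperty) {
                        System.out.println("Module Controller without a target: " + mc.getName());
                    }
                }
            }

            @Override
            public void subtractNode() { }

            @Override
            public void processPath() { }
        });
    }
}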
I'm using Protractor and Jasmine and would like to organize my E2E tests in the best way.
Example:
There is a set of tests that check the registration function (registering with valid credentials, registering as an existing user, etc.).
I need to run those tests in three different projects. The tests are the same, but the credentials differ. One project might have 3 fields in the registration form, another one 6.
Now everything is organized in a very complicated way:
each individual test is written as a function rather than as an "it"
there is a function that calls all of those test functions
for each suite there is a file with a describe block
inside that file there is a single "it" that calls the function containing all the tests
there is a test suite for each project
I believe there is an established practice for organizing this properly, so that each test lives in its own "it". I'd be happy to see some links or advice.
Thank you in advance!
Since it's a broad question, I will point you to a few links. You should probably look at Protractor's page-object model. It will help you simplify and standardize how your tests are organized, in a way that is readable and easy to use. Here's the link to it as described by the Protractor team.
page-object model
However, if you want to know why such a pattern is needed: plain Protractor tests have several shortcomings that can be solved by using it. A detailed explanation is here
shortcomings of protractor and how to overcome them
EDIT: Based on your comments, I feel that you are trying to make a unified file/function that can serve all the suites that will use it. To handle that, add a generalized function (to fill the form fields, in your case), export that function, and then require it in your test suites; a rough sketch follows below. Here's a sample link to it -
Exports and require
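To make this concrete, a small page-object sketch (the selectors, field names, and the --params wiring are assumptions, not taken from your project):

// registration.page.js -- one page object shared by all three projects.
const RegistrationPage = function () {
  this.get = (url) => browser.get(url);

  // fields: { fieldName: cssSelector }, values: { fieldName: text }.
  // Only the fields defined for this project get filled, so the same spec
  // works whether the form has 3 inputs or 6.
  this.fillForm = async (fields, values) => {
    for (const name of Object.keys(fields)) {
      if (values[name] !== undefined) {
        await element(by.css(fields[name])).sendKeys(values[name]);
      }
    }
  };

  this.submit = () => element(by.css('button[type="submit"]')).click();
};

module.exports = new RegistrationPage();

// registration.spec.js -- each check is its own "it"; the project-specific data
// (URL, selectors, credentials) comes from a config passed per suite, e.g. via
// --params.project=./projects/projectA.js.
const page = require('./registration.page');
const project = require(browser.params.project);

describe('registration', () => {
  beforeEach(() => page.get(project.registrationUrl));

  it('registers with valid credentials', async () => {
    await page.fillForm(project.fields, project.validUser);
    await page.submit();
    expect(await element(by.css('.welcome')).isPresent()).toBe(true); // placeholder selector
  });

  it('rejects an already registered user', async () => {
    await page.fillForm(project.fields, project.existingUser);
    await page.submit();
    expect(await element(by.css('.error')).isPresent()).toBe(true); // placeholder selector
  });
});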
Hope this helps.
I'm working on a project regarding website test automation, and I hope someone can help me with this question.
How would you recommend setting up automated test processes that don't constantly need to be updated, covering each of the following core flows for a website:
login
register
sign in with Facebook
save an item
delete an item
check that a few key pages (both logged in and logged out) are working
Thanks in advance.
I'm not sure I understand completely.
But here is an example of a project setup.
Split your KeywordTest section into two folders:
A Test folder: it should contain all the KeywordTests that are called when you run your testsuite. Each of those KeywordTests should test a specific feature (Verify_Login_Fail, Verify_Login_Success...)
A Library folder: it should contain all your KeywordTests that are reusable and that might be used often from your Test folder KeywordTests. It's a kind of function library. It avoids code repetition and is easier to maintain.
For instance, you can create a KeywordTest that takes as parameter a Login and Password and that proceeds with the actions on your website to log a user in.
Store the data that might change often in a file or a database rather than hardcoding it. For instance, a Login.csv file where you store all the login/password combinations you want to test.
Write all the steps in a class that lives in your library package, and call those methods from a test class in your test package. Use TestNG in the test class, then create a test suite from the test classes and run it; a rough sketch follows below.
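A sketch of that library/test split with TestNG, using Selenium WebDriver and the Login.csv idea from the previous answer (the class names, element IDs, and CSV format are placeholders):

// library/LoginActions.java
package library;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

public class LoginActions {
    private final WebDriver driver;

    public LoginActions(WebDriver driver) {
        this.driver = driver;
    }

    // One reusable step: log a user in with the given credentials.
    public void login(String login, String password) {
        driver.findElement(By.id("username")).sendKeys(login);
        driver.findElement(By.id("password")).sendKeys(password);
        driver.findElement(By.id("submit")).click();
    }
}

// test/LoginTest.java
package test;

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

import library.LoginActions;

public class LoginTest {
    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        driver = new ChromeDriver();
    }

    // Each line of Login.csv is assumed to be "login,password".
    @DataProvider(name = "credentials")
    public Object[][] credentials() throws Exception {
        List<Object[]> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new FileReader("Login.csv"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                rows.add(line.split(","));
            }
        }
        return rows.toArray(new Object[0][]);
    }

    @Test(dataProvider = "credentials")
    public void verifyLogin(String login, String password) {
        new LoginActions(driver).login(login, password);
        // Assert on whatever your site shows after login; this is only a placeholder.
    }

    @AfterClass
    public void tearDown() {
        driver.quit();
    }
}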
I have the following folder structure:
/main
/loader.php
/build.xml
/components
/package1
/class1.php
/package2
/class2.php
/tests
/package1
/class1.test.php
/package2
/class2.test.php
When I run the web application I load loader.php first and include other components by calling Loader::load( 'package_name' ). Then all necessary files are included. The good thing here is that I don't need to include loader.php within the class files, because I can rely on having a working instance of Loader.
The unit test classes simulate this behaviour by including all necessary classes explicitly. So there is also no problem with phing and PHPUnit.
But now I want to generate a coverage report with phing and Xdebug. The problem here is that phing seems to load every single PHP file to create the coverage database. Unfortunately it stops because it cannot find the Loader class that is used in the PHP files.
I could easily add an include statement to every class file, but I wonder whether there is a way to include files only if code coverage analysis is inspecting the file?
Another idea: I could also configure the coverage analysis so that it scans the unit test directory and therefore finds all necessary includes. Then I'd need to filter out classes that match a pattern like /Test$/i or so. Is that possible?
I looked for ages for something similar.
I ended up with the changes below. Basically, you tell the PHP CLI to prepend a PHP file that contains your loading logic.
In the php.ini of my CLI I've set the following:
auto_prepend_file = autoload.php
I made sure that the file was on my include path (/usr/share/php in my case) and put the following lines in it (I use Zend Framework, which is also on my include path):
require_once "Zend/Loader/Autoloader.php";
$autoloader = Zend_Loader_Autoloader::getInstance();
$autoloader->registerNamespace('Model_');
Now, what you could do instead is define your own autoload function and register whatever needs to be autoloaded for your setup, but you get the idea.
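For the layout in your question, the prepended file could look roughly like this (the paths and the class-to-file mapping are guesses; adjust them to whatever Loader::load() actually resolves):

<?php
// autoload.php -- prepended by the CLI so Loader exists in every file,
// plus a fallback autoloader for classes referenced before Loader::load() runs.

require_once __DIR__ . '/main/loader.php';   // adjust to where loader.php really lives

spl_autoload_register(function ($class) {
    // Naive mapping, e.g. Package1_Class1 -> components/package1/class1.php.
    $path = __DIR__ . '/components/' . str_replace('_', '/', strtolower($class)) . '.php';
    if (is_file($path)) {
        require_once $path;
    }
});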
It's an ugly hack, but it got things done for me.
Wkr
Jeroen