Laravel unit tests in the wild - laravel-4

Where would be a good place to find Laravel unit tests in the wild? Similar to this: Good Cucumber examples in the wild?
but for Laravel. I know there are lots of examples out there, but I'm looking for real tests based on a real project.

You could look at Laravel's own test suite, which is quite extensive and is based on PHPUnit. See https://github.com/laravel/framework/tree/master/tests.

Related

Unit Testing with Laravel

I'm using PHP with the Laravel 5.5 framework.
I recently started writing unit tests for my code and I have a few questions:
What is the best way to interact with my database?
Should I use an in-memory DB like SQLite, or mock everything with Mockery?
If I have an interaction with the DB, is that still unit testing or integration testing?
Thank you for the answers in advance.
I work in a company where we strive for 80% code coverage. In general we test mostly end-to-end, with a database and with external calls mocked; we use SQLite so our test suite can run quickly in a local environment. When it makes sense, we unit test; as an example, a tax service I wrote for different countries was unit tested, because it was very input/output based.
Why we prefer end-to-end:
It's quicker if you don't have to write unit, integration and end-to-end tests
You test the endpoint that will actually be used
I prefer to run with a real database if you are running with Continuous Integration
There are drawbacks with SQLite; mainly, it does not behave like other relational databases, and there are a lot of settings and limitations. Off the top of my head, I had problems with foreign key enforcement, etc.
So to answer your question:
It's smart to use SQLite, at least locally
In unit testing you are only testing one class and mocking everything else; you are basically testing that the code can execute. Note this is an oversimplified view of a very complex subject.
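As a rough sketch of the SQLite-locally approach (assuming a standard Laravel 5.5 setup; the model and table names are just placeholders), you can point the test environment at an in-memory database in phpunit.xml and let RefreshDatabase migrate it for every test:

    // phpunit.xml (Laravel's default one already has an <env> section):
    //   <env name="DB_CONNECTION" value="sqlite"/>
    //   <env name="DB_DATABASE" value=":memory:"/>

    namespace Tests\Feature;

    use Illuminate\Foundation\Testing\RefreshDatabase;
    use Tests\TestCase;

    class UserPersistenceTest extends TestCase
    {
        // Runs the migrations against the in-memory SQLite DB before each test.
        use RefreshDatabase;

        /** @test */
        public function it_stores_a_user()
        {
            factory(\App\User::class)->create(['email' => 'alice@example.com']);

            // The assertion queries the same in-memory connection the app code uses.
            $this->assertDatabaseHas('users', ['email' => 'alice@example.com']);
        }
    }

Strictly speaking, a test like this is an integration test rather than a unit test, which matches the distinction made above.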

Laravel API testing unit vs feature

I am using Laravel 5.6 to write an API with no views, just endpoints.
I am very new to testing, and my understanding is that unit tests are from a programmer's perspective and feature tests are from a user's perspective.
As I am only creating an API, would I be right to assume that I will only be writing unit tests, and that I am safe to remove the tests/Feature directory altogether?
My tests will consist of things like
public function it_authenticate_a_user()
Sorry if it's a dull question; I am only trying to learn.
Thanks
No, it's not a good idea to write only unit tests.
A true unit test verifies that a single class or function works as expected. However, it does NOT verify that the whole application works as expected - it's perfectly possible for the application to have 100% coverage from unit tests, but not actually work because the components don't quite fit together as expected. You should also write functional tests for your endpoints. The majority of your tests should be unit tests, but it's a good idea to make sure every API method is covered by functional tests too, just to make sure the pieces fit together.
Put it this way: at Google they advocate a model called the testing pyramid, which gives a typical ratio of 70% unit tests, 20% functional tests, and 10% high-level acceptance tests. It should not be seen as rigid, and for an API I see little need for acceptance tests, but it gives some idea as to a healthy mix of tests.
An API is in some ways easier to test than a conventional web app since it's stateless and each method is relatively simple, but it's just as important it has functional test coverage. It's straightforward to test API routes in Laravel - just do the setup, make the request and check the response is correct and any changes have been made.
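To make the distinction concrete, here is a minimal sketch of a feature (functional) test for a hypothetical /api/login endpoint in Laravel 5.6; the route and the shape of the JSON response are assumptions, not something your app necessarily has:

    namespace Tests\Feature;

    use Illuminate\Foundation\Testing\RefreshDatabase;
    use Tests\TestCase;

    class LoginTest extends TestCase
    {
        use RefreshDatabase;

        /** @test */
        public function it_authenticates_a_user()
        {
            $user = factory(\App\User::class)->create([
                'password' => bcrypt('secret'),
            ]);

            // Exercise the real route, middleware, controller and database together.
            $response = $this->postJson('/api/login', [
                'email'    => $user->email,
                'password' => 'secret',
            ]);

            // Assert on the HTTP contract rather than on any single class.
            $response->assertStatus(200)
                     ->assertJsonStructure(['token']);
        }
    }

A unit test, by contrast, would instantiate a single class (say, a token service) directly and mock its collaborators.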

How to set up a testing environment for rxjs-v5?

The official docs just give a short example of it:
https://github.com/ReactiveX/rxjs/blob/master/spec/observables/interval-spec.ts
The official testing setup involves too many things.
I need a simple example so that I can start writing unit tests for my RxJS app.

How complex should smoke tests be?

So we've been running a daily build on our current project for many months at this point. The smoke tests that go along with that daily build aren't very complex, though - we run a few nUnit tests on our main class library (which, admittedly, doesn't offer great code coverage), and we make sure that things compile and build. The application in question is an ASP.NET site which consumes some business objects (which include LINQ-to-SQL).
Are there more complex smoke tests that we should be running, particularly on the ASP.NET sites? How would we develop a smoke test for an ASP.NET site, for that matter?
As well as unit testing, it can be good to deploy the site to a staging server with some example data, as close to live-like as possible. Then use an HTTP traffic-generating script to simulate user traffic and sessions. You can monitor debug logging, exceptions and other testing code on the back-end. You can also take performance measurements here.
Much like a more intense, iterative version of playing with it in the browser yourself.
You can do this by defining (or discovering through inspection) your public resources and their inputs. The scripts can then try to cause validation problems, odd permutations of site flow and other things that test the entire context of the site in a live setting.
If testing is not complete ... from unit testing through to "does it play nice with real data and traffic", then you are ultimately going to be running around like a headless chicken fixing bugs later.
Smoke tests, by nature, should be superficial: Does it compile? Deploy? Does the welcome page load? Maybe load a test page which does a query against the database to see that this connection works, too. That's it.
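A smoke check at that level can be little more than a handful of HTTP requests against the freshly deployed site. Below is a rough sketch as a plain PHP script (the language used elsewhere in this thread), with made-up URLs; the same idea translates directly to an ASP.NET deployment:

    <?php
    // Hypothetical URLs on the freshly deployed build; adjust to your environment.
    $checks = [
        'https://staging.example.com/'        => 200, // welcome page renders
        'https://staging.example.com/dbcheck' => 200, // page that runs a trivial DB query
    ];

    foreach ($checks as $url => $expected) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_exec($ch);
        $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);

        printf("%s => %d %s\n", $url, $status, $status === $expected ? 'OK' : 'FAIL');
        if ($status !== $expected) {
            exit(1); // fail the daily build as soon as one check breaks
        }
    }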
You should not be doing smoke tests. Are you aware of the etymology of that term? A "smoke test", in electronics, is when you turn on the power and see if any smoke comes out.
You should be doing more comprehensive unit tests; enough to give you good code coverage. This is what you should do on every build. You should also try to do a deployment, and run some "installation verification tests".

Tagging unit-tests

I'm working on a PHP project with solid unit-tests coverage.
I've noticed that lately I've been making very tricky manipulations with the unit-test Command-Line Test Runner's --filter command.
Here is this command's explanation from official documentation:
--filter
Only runs tests whose name matches the given pattern. The pattern can be either the name of a single test or a regular expression that matches multiple test names.
I often use it because sometimes it becomes very useful to run just a single test suite or test case from the whole test base.
I'm wondering if this is good practice or not?
I have heard that it is sometimes considered good practice to leave running the whole test suite to your Continuous Integration machine, if you know for sure that you have modified only one component and are 100% confident that it won't fail other components' unit tests.
What do you think about it?
Some time ago I thought that we shouldn't care so much about the time required to run the whole suite of unit tests, but when you have very complicated business logic and unit tests, this can take significant time.
I understand that "real" unit tests shouldn't interact with the DB and should use mock/stub objects instead; I agree with that. But sometimes it is much easier (cheaper) to use DB fixtures for the tests.
Please give me some advice on how this problem can be solved.
Good unit tests should:
Have clear method names and variable names to act as documentation
Run fast. This is also possible for tests with complicated business logic. A test should run in an average time of around 0.1 seconds.
Test exactly one thing in one test method
Not integrate with external resources like the filesystem, email, databases, web services, and everything else. You can create separate database integration tests to test your database interaction. These tests will be slower than your unit tests most of the time. I put my integration tests in a separate project and run them only when I am working on the integration code. I also run them on all builds on the CI server.
Be completely isolated from each other. When you have tests depending on each other, you cannot see what the problem is just from reading which tests failed. You might have to debug to find the problem. Isolated tests will save you a lot of time.
Personally, I don't use category names in my tests. I use 2 test projects per application: one for the unit tests and one for the integration tests and slower tests.
In reaction to: "But sometimes, it is much easier (cheaper) to use DB fixtures for the tests."
When your code is written well, it will be easier to mock. I don't know about mocking frameworks in PHP, but I use them in other languages and they save me a lot of time. Writing tests first and code later might help you design your code to be more easily testable.
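PHP does have mature mocking frameworks; Mockery (mentioned in an earlier question above) is a common choice. A minimal sketch, with a hypothetical repository interface and service defined inline so the example is self-contained:

    use Mockery;
    use PHPUnit\Framework\TestCase;

    // Hypothetical collaborators, just for illustration.
    interface InvoiceRepository
    {
        public function amountsFor($customerId);
    }

    class InvoiceService
    {
        private $repository;

        public function __construct(InvoiceRepository $repository)
        {
            $this->repository = $repository;
        }

        public function totalFor($customerId)
        {
            return array_sum($this->repository->amountsFor($customerId));
        }
    }

    class InvoiceServiceTest extends TestCase
    {
        public function testTotalIsSummedFromTheRepository()
        {
            // Mock the collaborator instead of touching the database.
            $repository = Mockery::mock(InvoiceRepository::class);
            $repository->shouldReceive('amountsFor')
                       ->with(42)
                       ->once()
                       ->andReturn([100, 250]);

            $service = new InvoiceService($repository);

            $this->assertSame(350, $service->totalFor(42));

            Mockery::close(); // verifies the expectations set on the mock
        }
    }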
Personally I learned to test better by
reading blogs about it
reading books about it
reading tested code written by others
writing a lot of tests, of course. It took me a few thousand tests to become good at it.
I often use it because sometimes it becomes very useful to run just a single test suite or test case from the whole test base.
I'm wondering if this is good practice or not?
Sure, as long as you run the full set of unit-tests occasionally (via a CI server sounds perfect)
Running the "interesting" tests regularly is better than running all the tests rarely..
I'd address the issue by having a subset of tests ("smoke tests") that take 1 minute or less that must be run before committing, then run the full set of tests from your CI server.
If your full set of tests takes > 15 minutes then I'd look to divide them and run them in parallel.
Then you can use --filter to run the tests you're most interested in first, then the smoke tests prior to commit, and have the rest run from the CI server.
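Assuming PHPUnit (whose Command-Line Test Runner documentation is quoted in the question), groups work well for carving out that quick pre-commit subset; the group and test names below are just examples:

    use PHPUnit\Framework\TestCase;

    /**
     * @group integration
     */
    class OrderRepositoryTest extends TestCase
    {
        public function testPersistsAnOrder()
        {
            // ... slow, DB-backed test body would go here ...
            $this->assertTrue(true);
        }
    }

    // Quick pre-commit run: leave the slow group out.
    //   phpunit --exclude-group integration
    //
    // While working on one component: run a single test case or method.
    //   phpunit --filter OrderRepositoryTest
    //   phpunit --filter testPersistsAnOrder
    //
    // CI server: run everything, every build.
    //   phpunit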
