Background
I'm currently debugging some old unit tests in my code base, and I found a failing unit test at this point:
$this->seeInDatabase('table', [
'amount' => //some value,
]);
in my phpunit.xml I'm running the tests against my testing env:
<php>
<env name="APP_ENV" value="testing"/>
Question
I would like to check the value that's actually stored in the database in my unit test environment. However, when I put a breakpoint right before the seeInDatabase line, nothing appears in the database at all. The same happens when I try to run tinker like so:
php artisan tinker --env=testing
Any ideas?
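One quick way to see which database the test run is actually pointed at is to dump the connection details from inside the failing test itself, right before the assertion. This is only a diagnostic sketch using Laravel's DB facade:
// Temporary diagnostics just before the seeInDatabase() call:
var_dump(\DB::connection()->getName());          // which connection the test is using
var_dump(\DB::connection()->getDatabaseName());  // which database it points at
Also bear in mind that if the test class uses a transaction or rollback trait, the inserted rows are rolled back and will never be visible from a separate tinker session.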
Related
I am using phpunit in connection with jenkins, and I want to skip certain tests by setting the configuration in the XML file phpunit.xml
I know that I can use on the command line:
phpunit --filter testStuffThatBrokeAndIOnlyWantToRunThatOneSingleTest
How do I translate that to the XML file, since the <filters> tag is only for code coverage?
I would like to run all tests apart from testStuffThatAlwaysBreaks
The fastest and easiest way to skip tests that are either broken or you need to continue working on later is to just add the following to the top of your individual unit test:
$this->markTestSkipped('must be revisited.');
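For example, placed at the top of the test you want to skip (using the test name from the question):
public function testStuffThatAlwaysBreaks()
{
    $this->markTestSkipped('must be revisited.');

    // Nothing below this line executes; PHPUnit reports the test as skipped.
}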
If you can deal with ignoring the whole file then
<?xml version="1.0" encoding="UTF-8"?>
<phpunit>
<testsuites>
<testsuite name="foo">
<directory>./tests/</directory>
<exclude>./tests/path/to/excluded/test.php</exclude>
^-------------
</testsuite>
</testsuites>
</phpunit>
Sometimes it's useful to skip all tests from a particular file based on custom condition(s) defined as PHP code. You can easily do that in the setUp() method, where markTestSkipped() works as well.
protected function setUp()
{
    if (your_custom_condition) {
        $this->markTestSkipped('all tests in this file are inactive for this server configuration!');
    }
}
your_custom_condition can be passed via a static class method/property, a constant defined in the PHPUnit bootstrap file, or even a global variable.
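As a minimal sketch of the bootstrap-constant variant (the constant name and the condition behind it are just placeholders for your own check):
// tests/bootstrap.php -- the file referenced by the bootstrap attribute in phpunit.xml
define('SKIP_SERVER_SPECIFIC_TESTS', getenv('HAS_SPECIAL_SERVER_CONFIG') === false);

// In the test class:
protected function setUp()
{
    if (SKIP_SERVER_SPECIFIC_TESTS) {
        $this->markTestSkipped('all tests in this file are inactive for this server configuration!');
    }
}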
I have created a very simple JHipster app and deployed it to Heroku. Everything worked fine, so I added a new field to my very simple object and redeployed. I got the following error:
2016-09-07T12:32:49.375947+00:00 heroku[router]: at=info method=POST path="/api/tsts?cacheBuster=1473251569324" host=deplyjhip.herokuapp.com request_id=2b7190f7-0301-456d-87a9-7342640aad9d fwd="5.2.192.47" dyno=web.1 connect=0ms service=17ms status=500 bytes=532
2016-09-07T12:32:49.361875+00:00 app[web.1]: 2016-09-07 12:32:49.361 ERROR 3 --- [io-40257-exec-5] o.h.engine.jdbc.spi.SqlExceptionHelper : ERROR: column "amend" of relation "tst" does not exist
2016-09-07T12:32:49.361530+00:00 app[web.1]: 2016-09-07 12:32:49.361 WARN 3 --- [io-40257-exec-5] o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 0, SQLState: 42703
I know what happens. When I redeploy using:
./gradlew -Pprod bootRepackage -x test
heroku deploy:jar --jar build/libs/*war
it doesn't run ./gradlew liquibaseDiff
How do I run liquibase diff and apply the changes on the heroku DB?
It seems you didn't fully migrate your new field. It looks like you just added the attribute to the entity class in the domain package, but didn't add any Liquibase migration. You have two options to achieve that:
manual migration
Just create a "YYYYMMDDHHmmss_add_field_to_my_entity.xml" file in the src/main/resources/config/liquibase/changelog directory with content like
<?xml version="1.0" encoding="utf-8"?>
<databaseChangeLog
xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.4.xsd">
<changeSet id="YYYYMMDDHHmmss" author="you">
<addColumn tableName="your_table">
<column name="column_name" type="??" />
</addColumn>
</changeSet>
</databaseChangeLog>
to make your changes happen. Do not edit a changelog file that has already been applied in the past!
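For the new changelog to be picked up, it also needs to be referenced from the master changelog (src/main/resources/config/liquibase/master.xml). A sketch of what that include could look like; the exact classpath prefix and file name depend on your JHipster version, so treat this as an assumption to verify:
<include file="classpath:config/liquibase/changelog/YYYYMMDDHHmmss_add_field_to_my_entity.xml" relativeToChangelogFile="false"/>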
adding the field to your DB, and then liquibase:diff
Here you can edit your database using tools you know in order to add that column via SQL, and then run either ./mvnw liquibase:diff or ./gradlew liquibaseDiffChangelog to generate the migration.
In order to run ./gradlew liquibaseDiff I set the db connection details in gradle/liquibase.gradle as follows:
args "--username=USERNAME"
args "--password=PASSSWORD"
args "--url=jdbc:postgresql://ec2-BLA.bla.eu-west-1.compute.amazonaws.com:5432/DATABASENAME?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory"
It doesn't work without: ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory
This generated the changelog, so I added it to master.xml and redeployed the app (basically step 2 from David Steiman's answer).
I am having exactly the same issue. I used jhipster entity foo, added a new attribute, ran ./mvnw liquibase:diff, then used heroku deploy:jar target/*.war, and it gives me the same error: column "xxx" of relation "xxx" does not exist. I didn't find any solution :( other than removing the database from Heroku and then running jhipster heroku again.
The accepted answer does not really solve the question asked. The question is why jhipster heroku does not run Liquibase migration scripts even after defining new migrations.
Just add this to the Procfile as a separate line:
release: ./mvnw liquibase:update
Or you can manually run this command in the run console of your app on Heroku:
./mvnw liquibase:update
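If you prefer to trigger it from your local machine with the Heroku CLI instead of the web run console, something like this should be equivalent (the app name is a placeholder):
heroku run "./mvnw liquibase:update" --app your-app-name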
I have a fresh installation of Laravel 5.1 and am trying to run automated tests using Elixir. According to the documentation, I can run gulp tdd and have my tests execute automatically each time a file is saved. I have the initial ExampleTest.php which has this test:
public function testBasicExample()
{
    $this->visit('/')
         ->see('Laravel 5');
}
This test asserts that the default welcome.blade.php file shows "Laravel 5". Each time I save the ExampleTest.php file, the automated tests do execute, and that's great. But when I change and save the welcome.blade.php file, the tests do not execute automatically.
Is this the desired behaviour or not? If not, what could be causing it?
By default, Elixir comes with two tasks for your test suites: one for PHPUnit and the other for phpspec. In your gulpfile, the phpUnit method is called on the mix object for the PHPUnit test suite.
mix.phpUnit();
mix.phpSpec();
And then you need to type gulp watch from the terminal.
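For context, a minimal gulpfile.js wiring up these tasks looks something like this (only a sketch; whether the watcher also picks up Blade views depends on your laravel-elixir version):
var elixir = require('laravel-elixir');

elixir(function (mix) {
    // Registers the PHPUnit task; gulp watch / gulp tdd re-runs it when watched files change.
    mix.phpUnit();
});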
I'm building an application that needs to create a new database, perform migrations and seed db data via a web page.
I'm trying to achieve this with the following code in Laravel 4.2. Note: this is within a controller I've set up.
Artisan::call("migrate", array(
"--env" => "production"
));
No matter what environment I pass with the "--env" option, the migration is run on the environment the site is currently running in. I.e., if I'm running in my local environment and I run the above, it will execute the migration on the local environment, which isn't what I'm looking to do.
If I run the equivalent command php artisan --env=production migrate from the command line, I get the results I'm looking to achieve. For the time being, I'm getting past this via passthru(), but I'd like to take advantage of the Artisan facade if I can.
Does anyone know what's going on with this?
This isn't a pleasant way to do it, but it works.
Assuming your Artisan environment is based on $_SERVER['HTTP_HOST'], and you know the HTTP_HOST that will load your environment, you can set it manually before calling start.php.
I used this to define the Artisan environment based on the base_url I was using in a Behat profile. That way I could set up my database fixtures before running tests.
/**
 * @BeforeSuite
 */
public static function runFixtures(SuiteEvent $suiteEvent) {
    // Assumes these use statements at the top of the context file:
    //   use Behat\Behat\Event\SuiteEvent;
    //   use Symfony\Component\Console\Output\StreamOutput;

    // Get the environment domain
    $parameters = $suiteEvent->getContextParameters();
    $baseUrl = $parameters['base_url'];
    $urlParts = parse_url($baseUrl);
    $_SERVER['HTTP_HOST'] = $urlParts['host'];

    // Now call start.php
    require_once 'bootstrap/start.php';

    // Call Artisan
    $stream = fopen('php://output', 'w');
    Artisan::call(
        'migrate:refresh',
        [
            '--seed' => true,
        ],
        new StreamOutput($stream)
    );
}
--env is the option to specify the application's environment when the application is starting. In other words, if you specify the --env option, Laravel will use your specified environment instead of running its environment detection method.
So, if you run artisan via the CLI with the --env option, artisan can detect the --env option from the $_SERVER variable in the start file, set the application environment, and run your command.
In contrast, when you call Artisan::call(), Laravel resolves the console application class (Illuminate\Console\Application) and runs your command. Because your application has already started, the Application just runs your command without detecting the environment. Moreover, the latest version of the migrate command class uses the application environment to get a database connection.
Therefore, when you call Artisan::call(), the --env option is completely ignored.
Just my opinion: if you really want to avoid using the passthru() function, you can add the production database connection in app/config/database.php under a unique name, e.g. production, while keeping your default database connection pointed at your usual one. When you want to migrate the production database, just call Artisan::call('migrate', array('--database' => 'production', '--force' => true)) instead of changing the environment.
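A minimal sketch of what that extra connection entry in app/config/database.php could look like (host, database name, and credentials are placeholders):
'connections' => array(

    // ... your existing connections ...

    // Connection targeted by Artisan::call('migrate', array('--database' => 'production', '--force' => true))
    'production' => array(
        'driver'    => 'mysql',
        'host'      => 'your-production-host',
        'database'  => 'your_production_db',
        'username'  => 'your_user',
        'password'  => 'your_password',
        'charset'   => 'utf8',
        'collation' => 'utf8_unicode_ci',
        'prefix'    => '',
    ),

),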
I am trying to run some acceptance tests in my Laravel application. While functional tests trigger the testing environment, acceptance tests do not. Is this a bug or a feature of acceptance tests? The main problem bothering me is that they do not use (+ populate + clean up) the testing database; they only connect to the dev database (which is used when no other env is specified, e.g. testing or production), and this often makes those tests fail when I run them multiple times.
This is my configuration:
codeception.yml
paths:
    tests: app/tests
    log: app/tests/_log
    data: app/tests/_data
    helpers: app/tests/_helpers
settings:
    bootstrap: _bootstrap.php
    suite_class: \PHPUnit_Framework_TestSuite
    colors: true
    memory_limit: 1024M
    log: true
modules:
    config:
        Db:
            dsn: 'mysql:host=localhost;dbname=testdb'
            user: 'root'
            password: 'root'
            dump: 'app/tests/_data/dump.sql'
            populate: true
            cleanup: true
acceptance.suite.yml
class_name: WebGuy
modules:
    enabled:
        - PhpBrowser
        - WebHelper
        - Db
    config:
        PhpBrowser:
            url: 'http://localhost/'
functional.suite.yml
class_name: TestGuy
modules:
    enabled: [Filesystem, TestHelper, Laravel4, Db]
Thanks for your help!
"Acceptance" tests are not run in the testing environment. The reason is when Laravel is in the testing environment, it disables filters by default. So therefore the testing environment is only for unit and functional tests.
Acceptance tests should be run in another environment (like dev or a specific one to Codeception).
Because Codeception 2.x now uses Guzzle to get a page response, it is possible to detect when you are on your host machine and Codeception is doing a specific request. That way you can have a "testing" environment and also a "codeception" environment specifically for your acceptance tests.
If you are using Homestead, I do this in my start.php file to detect whether Codeception is running; if it is, I specifically put the app into a 'codeception' environment, otherwise I let my environment detection run normally:
if ((gethostname() === 'homestead') && (isset($_SERVER['REMOTE_ADDR'])) && ($_SERVER['REMOTE_ADDR'] === '127.0.0.1'))
{
    $env = $app->detectEnvironment(['codeception' => ['homestead']]);
}
else
{
    $env = $app->detectEnvironment(['dev' => ['homestead']]);
}
Then in my 'codeception' environment, I set up an SQLite file database and run acceptance tests against that (which is faster than testing against MySQL).
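As a sketch of that last step: in Laravel 4 the 'codeception' environment can get its own database config from app/config/codeception/database.php, which overrides the defaults for that environment. The file location follows Laravel 4's per-environment config convention; the values below are placeholders:
<?php

// app/config/codeception/database.php
return array(

    'default' => 'sqlite',

    'connections' => array(
        'sqlite' => array(
            'driver'   => 'sqlite',
            // File-based SQLite database used only for acceptance tests.
            'database' => __DIR__ . '/../../database/codeception.sqlite',
            'prefix'   => '',
        ),
    ),

);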
First, you must understand that Codeception acceptance tests do not run inside the Laravel testing environment. Rather, Codeception uses Guzzle to make external HTTP requests. Assuming your acceptance tests run against localhost, you are running in your standard development environment; it is just like using your browser.
That said, here is how I use Codeception Acceptance tests to run against a Laravel testing environment. I run Vagrant with a LAMP stack on Ubuntu.
Edit your /etc/hosts file. Add test.localhost to the 127.0.0.1 line. Do not remove the other hosts there. If you use WAMP, MAMP, or another stack, the setup will be similar.
/etc/hosts
127.0.0.1 localhost test.localhost
Set up your environment handling with Laravel. Below is code from the bootstrap/start.php file in your Laravel root dir. Pay attention to the 'testing' line.
bootstrap/start.php
$env = $app->detectEnvironment(array(
    'local' => array('your-machine-name'),
    'testing' => array('test.localhost')
));
Make sure your Codeception acceptance test is configured to hit the right domain/URL. The code below is from my own acceptance tests for an API. Notice the url: sections use test.localhost. That is the URL that Codeception will hit for the tests.
app/tests/acceptance.suite.yml
class_name: ApiGuy
modules:
    enabled: [PhpBrowser, REST, ApiHelper, Db, FileSystem]
    config:
        PhpBrowser:
            url: http://test.localhost/
        REST:
            url: http://test.localhost/api/v1/
Putting it all together
We edited the /etc/hosts file so that Codeception can find test.localhost. Editing your web server config to handle test.localhost is outside the scope of this answer.
We edited Laravel's bootstrap/start.php so that it knows that any requests coming into test.localhost should run in the testing environment.
We edited Codeception's acceptance.suite.yml file to tell it to run all tests against http://test.localhost
Now, assuming the above is done correctly, you should be able to run codecept run acceptance and see the output of your tests.