Testing multiple databases in Laravel with phpunit - laravel

In my application I have tests that use Sqlite and seeding to cover my functionality. So my phpunit env value is set to:
<env name="DB_DEFAULT" value="sqlite" />
However, I'd also like a small subset of my tests to look at my current database data, so within one test file I want to connect to my MySQL connection.
The reason for this is I want to check if we have images for all the records in one of my tables, so the test needs to access the latest data, dummy data would not help in this case.
I presumed I could just connect using the following:
$pdo = DB::connection('mysql')->getPdo();
However, the test only connects if I specify "sqlite" here, which is the name of my sqlite connection.
The testing I want to do is not based on a Model, so I don't want to have a solution built into my Model file, I'd like to switch database just in this 1 test file if possible.
How do I configure a small number of tests to use a different database connection, the existing MySQL connection?

So, the reason I could not connect was that there were no DB_ values in my .env.testing file. The mysql connection was then falling back to the Laravel config defaults, so the connection did not error.
.env.testing
DB_DEFAULT=mysql
DB_HOST=database
DB_NAME=my_database_name
DB_USER=root
DB_PASS=root
Adding the lines above got everything working and now my test file can access the current local database.
public function __construct()
{
    $this->imgPath = dirname(__FILE__, 3) . '/public/images/';
}

public function testImages()
{
    $items = DB::connection('mysql')
        ->select(DB::raw('SELECT image_filename FROM items'));

    foreach ($items as $item) {
        $this->assertFileExists($this->imgPath . $item->image_filename . '.jpg');
    }
}
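For reference, the same check can also be written against the query builder instead of raw SQL. This is just an alternative sketch of the test above (using the standard `table()` and `pluck()` query-builder methods), not a change to it:

```php
public function testImages()
{
    // Query the named mysql connection directly; no model involved.
    $filenames = DB::connection('mysql')
        ->table('items')
        ->pluck('image_filename');

    foreach ($filenames as $filename) {
        $this->assertFileExists($this->imgPath . $filename . '.jpg');
    }
}
```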

Related

Using Memcache Directly in Laravel 5 instead of using through Cache

I am using a local memcache server for storing values. It works fine if I define Memcache as the selected driver for Cache in config/cache.php. However, if I use memcache outside Laravel, access is much faster than going through Laravel controllers with Cache::get().
I need to store a decent amount of data in Memcache, and it will be accessed across the system. So I was trying to use memcache directly, but I am getting the following error:
[2016-08-23 14:11:19] local.ERROR: Symfony\Component\Debug\Exception\FatalThrowableError: Class 'App\Http\Controllers\Memcache' not found in....
My Code is as following :
namespace App\Http\Controllers;

use Illuminate\Http\Request;
use App\Http\Requests;
use Cache;
use stdClass;
use DB;
use Memcache;

class InternalCommunication extends Controller
{
    public function update_stock_prices_memcache()
    {
        echo "\n before the memcache obj creation " . microtime(true);
        $memcache = new Memcache();
        $memcache->connect('localhost', 11211) or die("Could not connect");
        //$res1 = $memcache->set('key1', "Some value 2");
        $res1 = $memcache->get('key1');
        .....
Just to be clear - the memcache packages are installed and working fine, as I could get things working via Cache:: as well as by accessing memcache directly outside the Laravel installation.
Appreciate any sort of help I can get.
As per the Laravel cache docs, you need to set the memcached configuration in config/cache.php and specify the driver to use as "memcached".
Then simply use \Cache as in the example below.
// to get value
$value = \Cache::get('key');

// to set value
$minutes = 30;
\Cache::put('key', 'value', $minutes);
You can also specify the store in your code if you have multiple caches:
$value = \Cache::store('memcached')->get('key');
I was able to figure out the changes needed to access memcache directly within Laravel. The following code is working smoothly for me now; note the class is Memcached (with a "d"), not Memcache.
use Memcached;
....

$memcache = new Memcached;
$memcache->addServer('localhost', 11211) or die("Could not connect");
$res1 = $memcache->get('key1');
....
This is definitely faster than Cache::get using memcache driver!
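If you end up using the raw Memcached client in several places, one option is to bind a single shared instance in Laravel's service container. This is only a sketch: the binding name `memcached.raw` is an illustrative choice, not a Laravel convention.

```php
use Memcached;

class MemcacheServiceProvider extends \Illuminate\Support\ServiceProvider
{
    public function register()
    {
        // Share one Memcached client application-wide instead of
        // reconnecting in every controller method.
        $this->app->singleton('memcached.raw', function () {
            $memcache = new Memcached;
            $memcache->addServer('localhost', 11211);

            return $memcache;
        });
    }
}

// Anywhere else in the application:
// $value = app('memcached.raw')->get('key1');
```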

Laravel 5.1 unit testing with session

Given this test:
echo "\n";
echo Session::getId();
$this->call('GET', '/');
echo "\n";
echo Session::getId();
The output shows that the Session ID is not persisted after the request:
fb7e02798f043fac798a424547f0d01acd0dbdc0
83133c07abdbba5bc32f74eaf14362a69406ca45
As far as I can tell they should be the same. My config/session settings:
'driver' => env('SESSION_DRIVER', 'file'),
'lifetime' => 120,
'expire_on_close' => false,
'domain' => "test.com",
The same test works in 4.2. I'm not entirely sure whether there is an additional requirement for using sessions in unit tests, or whether I should be using the facade implementations.
Edit: no session issues when browsing the site normally.
phpunit.xml
<env name="APP_ENV" value="local"/>
<env name="CACHE_DRIVER" value="file"/>
<env name="SESSION_DRIVER" value="file"/>
<env name="QUEUE_DRIVER" value="sync"/>
Unit testing is about testing single classes
I think the intention of unit testing is not fully understood in this example.
Unit testing is not about touching classes you don't want to test. Therefore it should not include third-party libraries or resources, and it should be about just one class.
Every other code dependency should be implemented with a test double.
If I included the concrete session in my unit test, I would actually be testing that object too. Why would I? It would mean that if Laravel changed the session implementation, I would have to change my unit tests.
That would mean losing the advantage of the unit test, which should be about change that happens to just one class. And with that I would lose the benefit of the test: seeing whether other classes still work after a change.
Integration testing
The visit method is more like integration testing, since it loads the routing mechanism, the controller, and the other dependencies defined in the controller.
You can manage the session data by setting up the expected values for each test case (or test method).
For example, if you have a checkout process which contains a form in each checkout step:
public function testFormStep1()
{
    //$this->visit...
}

public function testFormStep2()
{
    $this->session([
        //... data from step 1 you need in step 2
    ]);

    //$this->visit...
}
In this way you have control over each test case. You don't want to depend on other test cases; tests should be done in isolation.
So, for example, if step 1 of a form can only pass to step 2 when all form fields are present, you can simply add another test case which simulates fields being left empty in step 1:
public function testFormStep2Error()
{
    $this->session([
        //... incomplete data from step 1
    ]);

    //... verify form step 2 should do something when form in step 1 is not completed
}
Selenium
Another option is Selenium, which sits more at the system level, because it drives a real browser and goes through the web server with HTTP requests. The session is bound to the browser by a session cookie (and ID), so you can preserve the session across requests.
Unit tests run in the CLI/terminal, and the CLI does not work with persistent sessions.
When running tests, Laravel will automatically set the configuration environment to testing. Laravel automatically configures the session and cache to the array driver while testing, meaning no session or cache data will be persisted while testing.
See: http://laravel.com/docs/5.1/testing
Actually, this is almost the same message as in version 4.2, so I have no idea how you could have had persistent sessions in unit tests before.
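Given that Laravel forces the array session driver while testing, a common workaround is to seed the session explicitly before the request instead of relying on the session ID persisting between calls. A sketch, assuming a Laravel 5.1 test case (where `withSession()` and `assertSessionHas()` are provided by the framework's testing traits); the key and value are illustrative:

```php
public function testHomePageWithSeededSession()
{
    // Seed session data for this request rather than expecting
    // the session ID to persist across separate call()s.
    $this->withSession(['some_key' => 'some value'])
        ->call('GET', '/');

    $this->assertSessionHas('some_key', 'some value');
}
```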

Using SESSION in parameter.ini file

I need to change the database name for a particular login in the parameters.ini file in Symfony 2.0. I tried using the session, but it is not working. Is it possible to use a session value in the parameters.ini file?
My code (parameters.ini file):
<?php
session_start();

$dbnamenew = '';
if ($_SESSION['test_db'] != "") {
    $dbnamenew = $_SESSION['test_db'];
} else {
    $dbnamenew = 'test';
}
?>
; These parameters can be imported into other config files
; by enclosing the key with % (like %database_user%)
; Comments start with ';', as in php.ini
[parameters]
database_driver="pdo_mysql"
database_host="localhost"
database_port="22"
database_name="<?php echo $dbnamenew; ?>"
database_user="user"
database_password="pass"
mailer_transport="smtp"
mailer_host="localhost"
mailer_user=""
mailer_password=""
locale="en"
secret="b538e3680321a85b2e39a3d1772e0b711ff9371c"
Even if you could use PHP in the configuration file it wouldn't work, since the parsed configuration is cached (so it is only executed once).
If you have a finite number of databases, define a database connection for each one. You can then choose one at runtime based on the variable taken from the session.
If you use Doctrine's entity manager as well, or if you have an unbounded number of databases, you will have to alter the connection parameters on the fly. You can do that with an event listener.
Related answer describing how to change the database connection: Symfony 2 : multiple and dynamic database connection
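As a sketch of the first suggestion (a finite set of databases), you can define one DBAL connection per database in the bundle configuration and pick one at runtime. The connection names and the session key `test_db` here are illustrative, not required names:

```yaml
# app/config/config.yml
doctrine:
    dbal:
        default_connection: default
        connections:
            default:
                driver:   pdo_mysql
                host:     "%database_host%"
                dbname:   test
                user:     "%database_user%"
                password: "%database_password%"
            customer:
                driver:   pdo_mysql
                host:     "%database_host%"
                dbname:   customer_db
                user:     "%database_user%"
                password: "%database_password%"
```

Then in a controller you can select a connection by name, e.g. `$name = $request->getSession()->get('test_db', 'default');` followed by `$conn = $this->getDoctrine()->getConnection($name);`.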

CodeIgniter custom encrypt function

In my previous project I created a custom encryption function for login. How can I use it in CI? Here is my code:
function sha_password($username, $password)
{
    $username = strtoupper($username);
    $password = strtoupper($password);

    return sha1($username . ':' . $password);
}
and I called it like this to get the encrypted password:
$password = strtoupper(sha_password($username, $password));
How can I make it work in CI?
You can place it in various places:
a model - if you have a model for a user: $user->getEncryptedPassword();
a library - in my project I have a libuser library that holds the encryption function, so I call it with $this->libuser->encrypt_password();
a controller (MY_Controller, for example) - create a function and call it with $this->encrypt_user_password(..)
or just drop it in one of the files that will always be loaded, in config or similar
If you don't plan on changing it, you could inline it as $encpass = sha1(strtoupper($username.':'.$password)); although I wouldn't go there.
Options 1 and 2 are the most recommended.
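A minimal sketch of option 2 (a library), assuming CodeIgniter's standard library conventions; the file and class name `Userenc` are illustrative choices:

```php
<?php
// application/libraries/Userenc.php

class Userenc
{
    // Same scheme as the original code: uppercase both parts,
    // SHA-1 the "USERNAME:PASSWORD" string, then uppercase the
    // hex digest (matching the strtoupper() at the call site).
    public function encrypt_password($username, $password)
    {
        $username = strtoupper($username);
        $password = strtoupper($password);

        return strtoupper(sha1($username . ':' . $password));
    }
}
```

Loaded the usual CI way: `$this->load->library('userenc');` then `$password = $this->userenc->encrypt_password($username, $password);`.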

Run Doctrine 2 CLI Tools Tasks from a script run from the browser

A little background information: I am working on integrating Doctrine into a CodeIgniter application. I have it working, but I would like to be able to run the Doctrine command line (CLI) tasks from the browser, i.e. not from the command line script.
The reason I desire this is because I will be running Doctrine and CodeIgniter on a shared hosting package where I will not have command line access.
This seems like a very basic feature, but is not readily available with Doctrine 2.
My last-ditch effort will be going into the command line tool, figuring out how the tasks are executed, and duplicating that code in a CodeIgniter controller.
If there is any simpler way to do this, please let me know.
Thanks!
Unanswered duplicate posted a while back.
For the following
$doctrine = \Zend_Registry::get('doctrine');
$em = $doctrine->getEntityManager();
$tool = new \Doctrine\ORM\Tools\SchemaTool($em);
Get the SQL to update the current schema:
$sqlArray = $tool->getUpdateSchemaSql($em->getMetadataFactory()->getAllMetadata());
Update the schema with the current metadata
$res = $tool->updateSchema($em->getMetadataFactory()->getAllMetadata());
Create the schema.
$res = $tool->createSchema($em->getMetadataFactory()->getAllMetadata());
This belongs in an install script. Just create the connection and verify it:
$conn = $doctrine->getConnection();
$sql = "SELECT * FROM users";

try {
    $stmt = $conn->query($sql); // Simple (too simple?)
    die('Already installed');
} catch (Exception $e) {
    // Table not found, continue
}
Then create your schema.
You probably don't want to try to run the command-line tools without a command line.
However, you can do it yourself in scripts pretty simply. For instance, if you wanted to do the things that orm:schema-tool:* does, you'd start here
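If you really must drive the console commands themselves from a web-run script, Doctrine 2's console is a Symfony Console application, so it can in principle be invoked programmatically. A hedged sketch, assuming `$em` is an EntityManager you have already bootstrapped and that your Doctrine 2 version ships the `ConsoleRunner` helpers (availability varies across 2.x releases):

```php
use Doctrine\ORM\Tools\Console\ConsoleRunner;
use Symfony\Component\Console\Input\StringInput;

// $em is your already-configured EntityManager.
$helperSet = ConsoleRunner::createHelperSet($em);
$app = ConsoleRunner::createApplication($helperSet);
$app->setAutoExit(false); // keep the web request alive after the command runs

// Run a task by name, exactly as you would on the command line.
$app->run(new StringInput('orm:schema-tool:update --force'));
```

Be careful exposing anything like this on shared hosting; schema commands are destructive and should be locked behind authentication.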
