Even when running a single simple test like the following:
public function test_simple()
{
    $user = factory(User::class)->create();

    $this->browse(function ($browser) use ($user) {
        $browser->visit('/login')
                ->type('email', $user->email)
                ->type('password', 'secret')
                ->press('Login')
                ->assertPathIs('/home');
    });
}
Laravel Dusk runs very slowly, taking 14-16 seconds for this test alone.
How can I speed it up? If I end up running something like 100 tests, it would take extremely long.
Any solutions?
For me, the thing that was taking a lot of time was running the migrations before each test and migrate:rollback after each test, as done by the DatabaseMigrations trait. Here is how I solved it on my project:
Remove usage of the DatabaseMigrations trait from your test cases.
Before starting the tests, call ./artisan migrate:fresh --seed --env=dusk. I do it in a bash file that I call to run the tests.
In your base DuskTestCase class, in the setUp method, call a command that deletes all data in your tables (see the sketch after these steps). Here is what I do for MySQL (inside an Artisan command):
$command = 'mysql -u test_database -Nse "show tables" test_database | while read table; do mysql -u test_database -e "SET FOREIGN_KEY_CHECKS = 0; delete from $table" test_database; done;';
$result = exec($command);
After the above command, run $this->artisan('db:seed');
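For illustration, a minimal sketch of what that setUp method could look like, assuming the MySQL loop above lives in a custom Artisan command (the db:truncate name here is made up; depending on your PHPUnit version, setUp may also need a void return type):

// tests/DuskTestCase.php (sketch only)
protected function setUp()
{
    parent::setUp();

    // wipe all table data instead of re-running every migration
    // ("db:truncate" is a hypothetical command wrapping the MySQL loop above)
    $this->artisan('db:truncate');

    // re-seed the now-empty tables, as described in the last step
    $this->artisan('db:seed');
}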
Now the tests run much faster.
Solving your issue might depend on many things (your dev environment setup, your database, or something else).
In my case it was the migrations that were taking a lot of time for every test. When you are using Laravel Dusk, you are using the DatabaseMigrations trait, which runs the migrations over and over, so if you can improve this part it might be a big gain for you. You talked about 100 tests, so assuming applying your migrations takes 10 seconds and you could decrease that to 4 seconds, in total you would save 600 seconds.
You can read more about how I made my Laravel Dusk tests 3 times faster than they were initially at https://laradevtips.com/2018/07/23/make-laravel-dusk-tests-3-times-faster/ - I don't know if it will solve the issue in your case, but in mine (tested today) the gain is really impressive.
I'm wondering if anyone has more details on how to schedule Joomla to be put into "Offline/Maintenance Mode" at a specific date and time. I found this post here on Stack Overflow, and I'm wondering if anyone has been successful in either (1) implementing a custom plugin to add this functionality, or (2) using a script that sets the site into offline/maintenance mode, perhaps with a cronjob to run the script at a specific time, or (3) whether there is an extension that already exists that simply adds this offline/maintenance mode scheduling feature.
Based on the previous post I linked to above, I'm not sure if a plugin would work or how best to go about the script-and-cronjob technique. From my understanding of the responses in that post, it sounded like the script and cronjob would be the only way to accomplish this. If someone could let me know whether they were successful implementing this, and how, that would be great; any suggestions or direction on how to go about it would also be helpful.
Using a plugin for such a small task would not be worth it, in my opinion.
I would rather use a little script like this:
<?php
// Make sure this is only called through command line
if (php_sapi_name() !== "cli") die('Only command line');
// Replace by your joomla configuration file path
$configuration_file_path = '/var/www/joomla/configuration.php';
if (!empty($argv[1])) {
    $offline = 1;
} else {
    $offline = 0;
}
// Retrieve configuration file content
$configuration_content = file_get_contents($configuration_file_path);
// Replace the offline line by the calculated value
$configuration_content = preg_replace('/(.*)public \$offline =(.*)/m', '$1public $offline = \'' . $offline . '\';' , $configuration_content);
// Write back the configuration file
file_put_contents($configuration_file_path, $configuration_content);
This script can be called through the command line:
php offline.php 1 #to enable offline status
php offline.php 0 #to disable offline status
If you need to, you can run it through a cronjob by editing /etc/crontab or by adding it in your hosting settings:
# Offline at 4AM each day
0 4 * * * www-data php /path/of/your/script/offline.php 1 >> /dev/null 2>&1
# Online at 4:05AM each day
5 4 * * * www-data php /path/of/your/script/offline.php 0 >> /dev/null 2>&1
I'm trying to set up Cypress on my local machine and run tests in parallel, but I can't find any information on how to do this.
Technically, it's possible. Cypress does not recommend it, since running multiple Cypress instances on the same machine consumes a lot of resources (CPU above all) and slows the whole machine down to the point where the results aren't very useful.
Anyway, if you have limited resources and you cannot use the official Dashboard, or you don't have more than one CI server available, you can run your tests on a single machine by launching cypress run multiple times, dividing your test suite into multiple folders.
I've created an npm library called cypress-parallel (https://github.com/tnicola/cypress-parallel) that (after the first run) balances and splits the test suite into multiple subsets based on the test run history, and for each subset it launches a Cypress command. It also collects the results from all the spec files, logging them at the end of the execution. In terms of performance, it seems that with 2 processes you can improve your overall test execution time by up to 40%.
Well, I kind of run them in parallel locally. A few notes:
I have a MacBook, so this is implemented for macOS.
My application runs in a Docker container, and I only need one instance to run multiple tests at the same time. Via my terminal I created multiple files, splitting the specs into separate .command files like this:
echo "cd <PROJECT_DIRECTORY> && npx cypress run --spec cypress/integration/<SPECS_DIRECTORY>/*" > cypress.command; chmod +x cypress.command
You can stack multiple directories/files behind --spec, so --spec cypress/integration/<SPECS_DIRECTORY>/* cypress/integration/<SPECS_DIRECTORY2>/* is also valid.
Let's say I have 2 of those .command files. I can start them with this command:
open cypress-01.command cypress-02.command
This will launch two separate terminals, both running the specs mentioned in each file.
This reduced the runtime of my local tests from 1.5 hours to 15 minutes.
A) The most "naive" (one minute and done) solution, assuming you are on Linux/macOS, that actually worked somewhat decently (just to re-run regression locally) is a bash script that backgrounds each run with a simple & at the end:
# keep videos/screenshots; trashing assets while parallel runs write to them can cause issues
export CYPRESS_trashAssetsBeforeRuns=false
XDG_CONFIG_HOME=/tmp/cyhome1 cypress run --spec cypress/integration/first.spec.js,cypress/integration/another1.spec.js &
XDG_CONFIG_HOME=/tmp/cyhome2 cypress run --spec cypress/integration/another2.spec.js,cypress/integration/another3.spec.js &
B) But if you want something a tiny bit more "sophisticated", continue reading:
However, in our tests we run the same regression on 4 datacenters (AWS, GCP), and on each we run multiple brands (some for redundancy, some specific to that DC location), so for our needs we don't need spec balancing; rather, we parallelize the Cypress processes themselves.
So far it seems that for this to work well you need a couple of prerequisites, as you can read here. We've had to solve a few issues:
1. Xvfb race condition
2. The ability to limit the number of threads
3. Profile locking issue
4. Image access issues
We start Xvfb prior to running our parallel run script:
# Start the X11 server once, to avoid the race condition between threads (issue 1)
Xvfb :96 &
# Make all cypress instances connect to the spawned X11 server
export DISPLAY=:96
# See 4) - keep assets so parallel runs don't delete each other's videos/screenshots
export CYPRESS_trashAssetsBeforeRuns=false
# Read below - that's where the parallelization happens (Node.js 10+)
node ./main.js
There are better, more robust solutions out there, but this seems to have worked for us. The bash above runs the main.js below.
Each array of brands is executed in parallel, but execution within each array is awaited in series via forEachSeries; without it you would just run everything in parallel (instead of 2 threads you'd have 4).
So as long as you can create an array, the number of first-level arrays defines the number of parallel threads. You can search for a balanced-arrays function and use it to balance the array; if you decide to balance specs instead of "brands" as we do below, you just need to change the command passed to awaitedSpawn() to something like XDG_CONFIG_HOME=/tmp/cyhome${cfg.id} cypress run --spec ${cfg.spec}.
// main.js
// this part is synchronous
if (run.THREADS_NO >= 2) {
    // 2 threads with 2 brands each; each inner array becomes one thread
    const threads = [
        [{ brand: "brand1" }, { brand: "brand2" }],
        [{ brand: "brand3" }, { brand: "brand4" }],
    ];
    threads.forEach((threadBrandInfo) => {
        asyncWrapper(threadBrandInfo); // defined in async_stuff.js below
    });
}
// async_stuff.js
// courtesy of https://github.com/toniov/p-iteration
const { exec } = require('child_process');

exports.forEachSeries = async (array, callback, thisArg) => {
    for (let i = 0; i < array.length; i++) {
        if (i in array) {
            await callback.call(thisArg || this, await array[i], i, array);
        }
    }
};

const awaitedSpawn = (cmd) => {
    // launch the command and collect its output as it streams in
    const child = exec(cmd);
    let stdout = '';
    let stderr = '';
    child.stdout.on('data', (data) => { stdout += data; });
    child.stderr.on('data', (data) => { stderr += data; });

    return new Promise((resolve, reject) => {
        child.on('close', (code) => {
            if (code === 0) {
                resolve(stdout);
            } else {
                reject(new Error(stderr || stdout));
            }
        });
    });
};

const asyncWrapper = async (brandsConfigs) => {
    await exports.forEachSeries(brandsConfigs, async (cfg) => {
        return awaitedSpawn(`XDG_CONFIG_HOME=/tmp/cyhome${cfg.brand} cypress run`)
            .then((res) => {
                console.log(res);
                return res;
            })
            .catch((e) => {
                console.error(e.message);
            });
    });
};
The XDG_CONFIG_HOME=/tmp/cyhome1 part of the code above solves the profile locking issue (3).
For the image access issues (4), simply set the Cypress option trashAssetsBeforeRuns=false.
One way of doing that is via cypress.json, or via an environment variable as in the bash scripts above.
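For example, in cypress.json that could look like this (a minimal config sketch):

{
  "trashAssetsBeforeRuns": false
}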
I have created an npm tool called orchestrator (https://github.com/0xIslamTaha/orchestrator) that lets you run all your specs on one machine. It uses Docker underneath and splits all the specs across multiple Docker machines.
Features:
Open source.
Automatically splits all specs.
Supports multiple browsers.
Generates a handsome HTML report.
Easily configurable.
Works great with Docker.
Fully documented.
There is an open-source use case repo (ready to go).
Articles:
Cypress parallelization with the Orchestrator — part 1
Cypress parallelization with the Orchestrator — part 2 — ShowCase
Show Cases:
Orchestrator-Public-Use-Case
On Linux, you can use GNU parallel. You can then run Cypress on 8 cores, for example, with:
find cypress/integration/ -name '*.js' | parallel -j8 npx cypress run --spec {}
Add the --tty parameter to keep colors. Add --group so the outputs are not mixed together. I did not manage to use those two parameters at the same time and keep colors, though.
Bazel + rules_nodejs is capable of running multiple Cypress tests in parallel on the same machine. But the experience of writing Cypress tests in Bazel will be quite different from what you are used to.
https://github.com/bazelbuild/rules_nodejs/tree/2.0.0-rc.3/examples/cypress
Try this, but note that it could bring your system to a halt.
Add these scripts to your package.json and run npm run cy:runBrowsers:
"cy:runSpec" : "npx cypress run --spec 'cypress\\e2e\\ecommerceDemo\\*' --headed",
"cy:runBrowsers": "npm run cy:runSpec -- --browser chrome | npm run cy:runSpec -- --browser edge"
It's weird: when developing on localhost, everything works fine and the default page shows.
After uploading to the server, it just shows a blank page!
It's driving me crazy!
echo 'outside route';
Route::get('/', function()
{
    echo 'inside route';
    return View::make('hello');
});
Both echoes work, but View::make('hello') just doesn't work; views/hello.php is the default file.
You might have to fix your permissions on the remote server, as it might be a cache issue.
1) Run a recursive chmod on your storage path (*assuming you already have proper file ownership)
cd /path/to/laravel
chmod -R 755 app/storage
2) Clear cache with Artisan
php artisan cache:clear
3) Refresh the page; it should work now.
*If you are running the HTTP server as a different user (for example, you're on Ubuntu and Apache runs as the user www-data), you might want to set file ownership for the Laravel app files as well:
chown -R www-data .
EDIT:
Just a remark about your code example: remember that if you want to use the Blade templating engine, you have to name your files accordingly. If you want a Blade template called 'something', you place your code in app/views/something.blade.php and then refer to it with, for example, View::make('something').
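For example, a minimal sketch using the Laravel 4 paths from the question, where app/views/something.blade.php contains the Blade markup (e.g. <h1>Hello, {{ $name }}</h1>):

// app/routes.php
Route::get('/', function()
{
    // the array keys become variables inside the Blade template
    return View::make('something', array('name' => 'World'));
});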
My website hosting server is hostmonster.com.
My application uses the CodeIgniter framework.
I have code which sends emails to my users, and I want to make it automatic.
I have used the cPanel of the hosting service and tried to set the command as
php -q www.mysite.com/admin admin sendDailyEmail
My controller is admin and the method is sendDailyEmail; the controller lives inside the application/controllers/admin folder.
I have also set up a reminder email to myself whenever the cronjob runs.
The email subject reads
Cron php -q /home1/username/public_html/admin admin sendDailyEmail
and the body says
No input file specified
Where do I go wrong?
I have never run cronjobs and this is my first time.
I am not good at giving command-line instructions either.
My admin sendDailyEmail code is as follows:
function sendDailyEmail() {
    $data = $this->admin_model->getDailyData();
    foreach ($data as $u) {
        if ($u->Daily) {
            //if(!$u->Amount){
            if ($u->Email == 'myemail@gmail.com') {
                $user['user_data']['FirstName'] = $u->FirstName;
                $user['user_data']['LastName'] = $u->LastName;
                $user['user_data']['Id'] = $u->Id;
                $this->email->clear();
                $this->email->to($u->Email);
                $this->email->from('alerts@mysite.com', 'MySite');
                $this->email->subject("My Subject");
                $msg = $this->load->view('emails/daily_view', $user, true);
                $this->email->message($msg);
                if ($this->email->send())
                    $data['message'] = "Daily Emails have been sent successfully";
                else
                    $data['message'] = "Daily Emails Sending Failed";
            }
        }
    }
    $data['main_content']['next_view'] = 'admin_home_view';
    $this->load->view('includes/admin_template', $data);
}
You can use wget and set the time for whatever you like:
wget http://www.mysite.com/admin/sendDailyEmail
You can also use curl:
curl --silent http://www.mysite.com/admin/sendDailyEmail
For CodeIgniter 2.2.0
You can try this:
php-cli /home/username/public_html/index.php controller method
or, in your case:
php-cli /home/username/public_html/index.php admin sendDailyEmail
It works fine for me.
Cheers!
CodeIgniter handles the command line differently for running crons, etc.
Read:
http://www.codeigniter.com/user_guide/general/cli.html
So you should run:
php index.php admin admin sendDailyEmail
(that may need adjusting, based on your code above)
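For example, the full crontab line could look roughly like this (the path is taken from the question and will likely need adjusting for your account):
# send the daily emails at 08:00 every day
0 8 * * * cd /home1/username/public_html && php index.php admin admin sendDailyEmail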
Have a look at an article I just wrote that goes a little deeper into it all:
http://codebyjeff.com/blog/2013/10/setting-environment-vars-for-codeigniter-commandline
I was facing the same issue, but the following worked for me:
wget http://www.yoursite.com/controller/function
I am trying to learn how to do my first cron job using CodeIgniter. In the past, it seemed the only way to do this with CI was to use the wget command instead of php.
The CodeIgniter User Guide, however, says that now you can do this from the command line, for example by running:
$ cd /path/to/project;
$ php index.php controller method
This works great using Terminal on my local setup. But when I use a similar command in the cron section of cPanel on my shared hosting, the task just returns the contents of index.php.
I'm not entirely sure what cPanel does with this command, so unsure as to whether it's using the command line at all.
Could someone explain how I might be able to set up a cron job on shared hosting using CodeIgniter please?
Here is the example code from the CodeIgniter user guide:
tools.php
<?php
class Tools extends CI_Controller {

    public function message($to = 'World')
    {
        echo "Hello {$to}!".PHP_EOL;
    }
}
?>
It's going to depend on your host. Cron jobs can really screw things up if you're not careful, so a lot of shared hosts don't allow them. You probably need to be on some virtual container (like a VPS, Virtuozzo, etc.) to do this. This isn't a CodeIgniter issue, but a hosting provider issue. Call them first.
We worked around this exact issue as follows:
Set up a normal PHP file that is scheduled by cron. Nothing to do with CodeIgniter yet.
Inside it, make an fsocket or cURL request that performs your regular CodeIgniter call, just as you would from the web.
Here's an example (say, cron.php)
#!/usr/local/bin/php.cli
<?php
DEFINE('CRON_CALL_URL', 'https://my_server/'); // base URL of your CodeIgniter install
DEFINE('CRON_HTTPS_PORT', 443);                // port to use during the fsocket connection
DEFINE('CRON_SSL_PREFIX', 'ssl://');           // prefix to be used on the host when using SSL

$current_time = time();
$md5_hash = md5('somevalue'.$current_time);

$url = CRON_CALL_URL.'MYCTRL/MYMETHOD';
$parts = parse_url($url);

// hash + timestamp let the controller verify the call came from this script
$parts['query'] = 'md5_hash='.$md5_hash.'&time='.$current_time;

$fp = fsockopen(CRON_SSL_PREFIX.$parts['host'],
                isset($parts['port']) ? $parts['port'] : CRON_HTTPS_PORT,
                $errno, $errstr, 30);

if (!$fp) {
    // connection failed; log $errno / $errstr here if needed
} else {
    if (!array_key_exists('query', $parts)) $parts['query'] = null;

    $out  = "POST ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($parts['query'])."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    if (isset($parts['query'])) $out .= $parts['query'];

    fwrite($fp, $out);
    fclose($fp);
}
?>
NOTE: Make sure that in your MYCTRL/MYMETHOD function you have
ignore_user_abort(true);
That way, when the fsocket connection is closed, your script will still run to the end.
We actually have a bunch of these fsockets for various reasons. If you need to make sure that the call to that controller/method came from the cron script, pass some additional hash values that only cron and the script know. Once the script is called, it has access to any CodeIgniter functions. Works like a charm.
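For illustration, the receiving controller method could verify that hash roughly like this (a sketch only; MYCTRL/MYMETHOD and the 'somevalue' secret follow the example above):

// application/controllers/myctrl.php (sketch)
class Myctrl extends CI_Controller {

    public function mymethod()
    {
        // keep running even after cron.php closes the socket
        ignore_user_abort(true);

        // recompute the hash from the shared secret and the posted timestamp
        $time = $this->input->post('time');
        $hash = $this->input->post('md5_hash');
        if ($hash !== md5('somevalue'.$time)) {
            show_error('Not allowed', 403);
            return;
        }

        // ... the actual scheduled work goes here ...
    }
}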
I've set up hundreds of CI cronjobs on shared hosting like this: create a short PHP script which calls the CI controller as if it were a web browser.
So, script.php contains this:
#!/usr/local/bin/php -f
<?php
// call the CodeIgniter controller over HTTP, as a browser would
file_get_contents('http://example.com/cronjob/');
?>
Then set your cronjob in cPanel to call script.php (e.g. /home/example/public_html/script.php).
When it runs, script.php will call the CodeIgniter cronjob controller. There you have the entire CI framework at your disposal.
If you are going to call it like a web browser anyway, why not replace the cronjob command with:
wget http://example.com/cronjob/
instead of creating something new, or simply:
curl --silent http://example.com/cronjob/