I am using CodeIgniter. I am fetching a time from the database.
When I fetch it, the hour and the seconds come out right, but the minutes are wrong.
In the database the time is,
12:25:51
and the output I am getting is,
12:08:51
In the view, I am fetching it like this,
$cur_date = date("Y-m-d", $date);
foreach ($result as $k => $v)
{
    if ($cur_date == date('Y-m-d', strtotime($v['clock_in'])))
    {
        echo date('H:m:s', strtotime($v['clock_out']));
    }
}
So where, actually, is the problem?
Run a test to see if there is a clock difference between the web server and the DB server. SELECT CURTIME(); is the command for getting the time from the DB, and date('H:i:s', time()); is the command for getting the time from the server where PHP is installed. Make a controller and echo them line after line. Of course, for the DB command you would have to use CI's $this->db->query();.
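A minimal sketch of such a test controller (the controller name is mine, and it assumes the database library is loaded):
class Timecheck extends CI_Controller {

    public function index()
    {
        // Time as the MySQL server sees it
        $row = $this->db->query('SELECT CURTIME() AS db_time')->row();
        echo $row->db_time . ' (DB)' . PHP_EOL;

        // Time as the server running PHP sees it
        echo date('H:i:s', time()) . ' (PHP)';
    }
}
If the two lines differ, that points to a timezone or clock mismatch between the two servers.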
I want to check my stored data in the database every minute to see if the current time has exceeded the 'close_at' time, so that I can change the 'status' column to closed.
I've been thinking that I could ->get() all the rows from the table and loop over each row's 'close_at' to change the status, but that's neither effective nor efficient, is it?
Got any ideas?
To build on the above answers into something more comprehensive:
Make sure cron is executing Laravel's scheduler every minute:
* * * * * php8.0 /home/yourdomain.com/artisan schedule:run >> /dev/null 2>&1
Make a new Laravel command to hold the relevant logic:
php artisan make:command CloseJobs
Write the necessary code in that class - in this case, retrieve all the relevant open jobs which need to be closed and close them, which we can do in one easy line (if you also update 'status' through mass assignment elsewhere, make sure it is added to the $fillable array on your model):
protected $signature = 'close:jobs';
....
$count = Job::where('status', 'open')
    ->where('close_at', '<', now())
    ->update(['status' => 'closed']);
Tell Laravel to execute that task every minute, in App\Console\Kernel.php:
$schedule->command('close:jobs')->everyMinute();
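Put together, the command class might look roughly like this (a sketch assuming Laravel 8+, where the model lives at App\Models\Job):
<?php

namespace App\Console\Commands;

use App\Models\Job;
use Illuminate\Console\Command;

class CloseJobs extends Command
{
    protected $signature = 'close:jobs';
    protected $description = 'Close all open jobs whose close_at time has passed';

    public function handle()
    {
        // One UPDATE query; no need to load and loop over every row
        $count = Job::where('status', 'open')
            ->where('close_at', '<', now())
            ->update(['status' => 'closed']);

        $this->info("Closed {$count} jobs.");

        return 0;
    }
}
Because the whole thing runs as a single UPDATE, checking every minute stays cheap even with a large table.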
I am using a form. I am unable to redirect after the process completes. It works properly when I have a small amount of data to process, but when the data volume is high it takes 10 to 15 minutes, and then it fails to redirect to the other location.
I am doing the following.
view.php
// ... doing my work ...
// after completing the functionality:
$controller->redirect_function($path);
controller.php
function redirect_function($path) {
    redirect($path);
}
Please help
Your script must be dying before completion because of the max_execution_time value set in your php.ini.
You can use the set_time_limit() function to increase the maximum time a script is allowed to run.
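For example, at the top of the long-running script (both calls are standard PHP; 0 means no limit):
// Allow this request to run until it finishes
set_time_limit(0);

// Or cap it at a generous value instead, e.g. 30 minutes
ini_set('max_execution_time', 1800);
Keep in mind the web server itself (e.g. a FastCGI or proxy timeout) may still cut the request off independently of PHP's limit.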
I have about 25,000 rows in my DB table 'movies' (InnoDB, 17.5 MB).
And when I try to get them all to display in my admin panel, nothing happens. Just 5-8 seconds of pending and then a white screen. No errors displayed, just nothing. (Max execution time is 3600 seconds, because it's on my local machine.) My simple-as-hell code:
public function index()
{
$data['movies'] = Movies::all();
dd('This var_dump & die never fires');
// return view('admin.movies', $data);
}
I just wonder why it doesn't perform the query and simply dies without a declaration of war.
I didn't find anything interesting in .env or config/database.php to explain what happens in such situations.
PS: yes, I can do server-side pagination and search and take only 10-25 records from the DB; the question is not about that.
Looks like you are running out of memory. Try querying half of the results, or maybe just 100, to see if that at least fixes the white page. If so, use chunk:
Movies::chunk(200, function ($movies)
{
    foreach ($movies as $movie)
    {
        var_dump($movie);
    }
});
You should definitely look at your storage/logs directory to verify the error. It's quite possible that fetching 25k rows takes too much memory.
In fact, as you mentioned, in real life there is no need to fetch that many rows at once unless you are exporting them to CSV or XLS.
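A quick way to confirm the memory theory (plain PHP, nothing Laravel-specific assumed):
// Compare the configured limit with what the request actually peaked at
var_dump(ini_get('memory_limit'));
var_dump(round(memory_get_peak_usage(true) / 1048576) . ' MB');
If the peak is at or near the limit, the white screen is almost certainly an out-of-memory fatal, which should also show up in storage/logs/laravel.log.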
One of the main purposes of caching is to save resources and avoid things like hitting your database on every request. In light of this, I'm confused about what CodeIgniter does in a controller when it encounters a cache() statement.
For example:
$this->output->cache(5);
$data=$this->main_model->get_data_from_database();
$this->load->view("main/index", $data);
I realize that the cached main/index HTML file will be shown for the next 5 minutes, but during these 5 minutes will the controller still execute the get_data_from_database() step? Or will it just skip it?
Note: the CodeIgniter documentation says you can put the cache() statement anywhere in the controller function, which confuses me even more about what's getting executed.
I can answer my own question. NOTHING in the controller function other than the cached output gets executed during the time in which the cache is set.
To test this yourself, do a database INSERT or something else that would leave a trace (e.g. write to a blank file).
I added the following code below my cache() statement, and it only inserted into the some_table table the first time I loaded the controller function, not the 2nd time (within the 5-minute span).
$this->db->insert('some_table', array('field_name' => 'value1') );
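In context, the test looks roughly like this (a sketch reusing the model and view names from the question):
$this->output->cache(5);

// Leaves a trace: this row only appears on the first, uncached request
$this->db->insert('some_table', array('field_name' => 'value1'));

$data = $this->main_model->get_data_from_database();
$this->load->view("main/index", $data);
While the cached copy is valid, CodeIgniter serves it before the controller is ever invoked, so neither the INSERT nor the model call runs.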
I think this can be verified by enabling the Profiler in your controller and checking whether any query is run. Make sure this is enabled only for your IP if you're using it in a production environment.
$this->output->enable_profiler(TRUE);
-- EDIT 1 --
This will be visible only once. As soon as the cached page is stored, the profiler result won't be visible again (so you might want to delete the cache file and refresh the page).
-- EDIT 2 --
You might also use:
log_message('info', 'message');
inside your model, then change $config['log_threshold'] to 3 in config.php and check the log file.
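That is, in application/config/config.php (threshold 3 enables informational messages in CodeIgniter's log levels):
$config['log_threshold'] = 3;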
-- EDIT 3 --
The SELECT will certainly be executed unless you have enabled database caching. In that case, you'll see the cached result of the query in the cache folder.
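For reference, database caching is toggled per query like this (it assumes a cache directory is configured in application/config/database.php):
// Cache the result of the next query on disk
$this->db->cache_on();
$query = $this->db->query("SELECT * FROM some_table");
$this->db->cache_off();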
In Magento I write a number of small command-line scripts to do things like set a new attribute on a number of products. I am finding that updating 900 products takes about 6 hours to complete.
Loading the individual products goes as fast as I would expect, but the save, once I have made the change, takes a very long time.
I am attaching how I am loading the products in case there is something I can do to better optimize the process. Any help here would be greatly appreciated.
$product = Mage::getModel('catalog/product')->load($magento_id);
$product->setMadeInUsa(1);
try {
    $product->save();
} catch (Exception $e) {
    echo "ERROR: " . $e->getMessage() . "\n";
}
The code runs without error, but it takes forever.
Mage::getSingleton('catalog/product_action')
->updateAttributes(array($product->getId()), $attributesData, $storeId);
This code only updates the attributes you want to change. The first parameter is an array of product IDs, the second is an array of attribute codes and values, and the third is the store ID you wish to update.
This is MUCH faster than saving the entire model.
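For the made_in_usa example from the question, the call might look like this (the attribute code and the store ID 0 for the default scope are my assumptions):
// One targeted UPDATE per attribute instead of a full EAV model save
$attributesData = array('made_in_usa' => 1);
$storeId = 0; // default/admin scope

Mage::getSingleton('catalog/product_action')
    ->updateAttributes(array($magento_id), $attributesData, $storeId);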
Try first setting indexing to Manual and then reindexing after the update is done. This should improve the performance. However, the ultimate solution, if you are going to do the import often, is to follow the code ideas you can find in the 'update attributes' mass action, which is optimized for saving many products at once.
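A rough sketch of wrapping the bulk update that way (Magento 1 indexer API; verify the behavior on your installation):
// Put every indexer into manual mode before the bulk update
$processes = Mage::getSingleton('index/indexer')->getProcessesCollection();
foreach ($processes as $process) {
    $process->setMode(Mage_Index_Model_Process::MODE_MANUAL)->save();
}

// ... run the product updates here ...

// Reindex once at the end, then restore real-time mode
foreach ($processes as $process) {
    $process->reindexAll();
    $process->setMode(Mage_Index_Model_Process::MODE_REAL_TIME)->save();
}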