Automatic script in CodeIgniter

How can I make sure a script runs automatically every 24 hours? It should compare the current time with a time stored in the database, and if the current time is greater, make changes to the database. I am using CodeIgniter.

A cron job is the solution, but it is not a script itself; it's a server-side scheduler that runs a command at a given interval. You can set it up from your hosting panel, but not all hosting providers support it :(
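If cron is available, one approach is a CLI-only CodeIgniter controller that does the comparison, with cron calling it once a day. A minimal sketch (the controller, table, and column names below are hypothetical):

<?php
// application/controllers/Tasks.php (hypothetical controller)
class Tasks extends CI_Controller {

    public function daily_check()
    {
        // Refuse to run from the browser; this is meant for cron only
        if (!$this->input->is_cli_request()) {
            show_error('This task can only be run from the command line.');
        }

        // Update every row whose stored time is older than the current time
        // (hypothetical table "items" with a DATETIME column "expires_at")
        $this->db->where('expires_at <', date('Y-m-d H:i:s'));
        $this->db->update('items', array('status' => 'expired'));
    }
}

A crontab entry that invokes it every 24 hours could then look like:

0 0 * * * php /path/to/index.php tasks daily_check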

Related

In CI/CD pipelines, how do you run a portion of the build process on an interval?

Assume I'm using a GitLab pipeline and there is a build process that gets everything ready for production. There is a 3rd party database that needs to be downloaded, e.g. a MaxMind Geo database. I don't want to strain their servers every time we run a build, so I'd want to only download the latest database once a month.
What tactics can I use to save a "last run" date, check it, and take action to download the DB if the last run date is more than a month ago?
I would use the cache option in the .gitlab-ci.yml.
Create a file named "update_date" once you update the DB, then cache it.
In logic.py (Python is just an example; write it however you'd like), check that the file exists and that its date is not more than 30 days old; in any other case, update the DB. A sketch of that check follows the snippet below.
db_update:
  script:
    - python logic.py
  cache:
    paths:
      - ./update_date
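Since the answer notes the language is up to you, here is the same freshness check sketched in PHP rather than Python (the download command is a placeholder):

<?php
// Update the DB only when the cached marker file is missing or stale.
$marker = './update_date';
$maxAge = 30 * 24 * 60 * 60; // 30 days in seconds

if (!file_exists($marker) || time() - filemtime($marker) > $maxAge) {
    // Download the fresh third-party database here (placeholder command):
    // exec('curl -o GeoLite2.mmdb https://example.com/GeoLite2.mmdb');
    touch($marker); // record when we last updated, and let CI cache the file
}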

Laravel Scheduler (withoutOverlapping)

I have two apps running on the same server.
It seems that when I add withoutOverlapping() to the scheduled jobs and manage the base cron entry via cron itself, these two apps block each other's execution.
Could that be?
Yes, withoutOverlapping only works per application.
Laravel creates a file in the storage folder with a hash of the job. This way, if the file exists, Laravel knows the job is still running. The one application cannot possibly know if the other one is currently running a job because it does not have access to the storage folder of the other application.
If your code looks like the following
$schedule->command('process:queue 0')->everyMinute()->withoutOverlapping();
$schedule->command('process:queue 1')->everyMinute()->withoutOverlapping();
This is because the same command with different parameters may be considered overlapping.
I.e., the hash of the job considers only the command signature, not the parameters.
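To illustrate the mechanism, here is a simplified sketch of a file-based lock of the kind described above; this is not Laravel's actual implementation:

<?php
// Both scheduled variants hash to the same name, so they share one lock file.
$lock = storage_path('framework/schedule-'.sha1('process:queue'));

if (file_exists($lock)) {
    return; // the scheduler assumes the job is still running and skips it
}

touch($lock);
try {
    // ... run the command ...
} finally {
    @unlink($lock); // release the lock once the job finishes
}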

How to email time-out notice from Google Cloud Storage?

I have a gsutil script that periodically backs up data to Google Cloud Storage.
The gsutil backup script runs on my local box.
I would like to run a script (or service) on Google Cloud Storage that emails a warning to the administrator when no backup has been made in 24 hours.
I am new to cloud services. Please point me in the right direction.
Where would such a script be located? Is there a similar example script?
Thank you.
There's no built-in feature that accomplishes this. However, you could accomplish something like this with another monitor program.
For example, I might edit my backup script such that after successfully completing a backup, it writes the current time to a "last_successful_backup.txt" file. Then, I'd put a cronjob wherever I keep my monitors and alerting systems that would check the "last_successful_backup.txt" file every few hours and set off an alarm if the time it contains is older than 24 hours.
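A minimal sketch of such a monitor in PHP, assuming the backup script writes a Unix timestamp to the file (the path and email address are placeholders):

<?php
// check_backup.php: run from cron every few hours (hypothetical name)
$file = '/var/monitoring/last_successful_backup.txt';
$last = file_exists($file) ? (int) trim(file_get_contents($file)) : 0;

if (time() - $last > 24 * 60 * 60) {
    mail('admin@example.com', 'Backup overdue',
        'No successful backup has been recorded in the last 24 hours.');
}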
What about spinning up a Google VM and sending the emails from the instance, using, say, SendGrid, Mailgun, or Mailjet?

How do I run a partial reindex in Magento 1.13?

I would like to be able to set all indexes to "Update on schedule" and then have them all update automatically like Magento says they should in the background. The problem is, this doesn't happen. There is no cron job that automatically reindexes (See this related question).
So, if I have to create my own cron job, how do I do this exactly in an efficient way? I don't want to run "php shell/indexer.php reindexall". That does full rebuilds of index tables. Sure, I could do that nightly, but that means that no changes will be reflected on the frontend until the next day. That's not an acceptable solution. If I run full reindexes throughout the day, I end up with the same problem that I have right now - table locks and slowness due to reindexing while people are working in the admin.
Magento's new "partial reindexing" should fix this right?
This is my understanding of how it works:
I edit an entity that has a related index (e.g. A product).
A database trigger adds a record to related change log tables.
Some process later reads the change log tables and reindexes these specific entities
Concrete example
I update a value in "catalog_product_entity_varchar".
The database trigger "trg_catalog_product_entity_varchar_after_update" flags this product as changed by inserting a new version into "catalog_product_flat_cl" and "catalogsearch_fulltext_cl".
A partial reindex process reads these change log tables and reindexes only the mentioned products into "catalog_product_flat" and "catalogsearch_fulltext", respectively.
If this were the case, the reindexing process would be minimal and could be run often, even every minute, to the point where indexing becomes almost unnoticeable to admin users. (I say every minute because Magento tells us this is possible:)
In this release, however, the flat catalog is updated for you — either every minute, or according to your Magento cron job.
Where is this mystical partial reindex? How do I call it instead of reindexing everything?
Is there a reindexPartial()?
The enterprise_refresh_index cron job appears to run this. It runs every time the Magento cron runs. See Enterprise_Index_Model_Observer::refreshIndex().
This is not intended to run manually because of the need to establish a lock file. It is easiest just to run the cron.php file if you need a manual reindex.
I believe I just have a project specific issue with this not running.
The partial reindexing is executed through the cron scheduler built into Magento. You do not need to run the actual indexer.php file. Instead, you must set up Magento's built-in cron scheduler as described in the documentation.
Documentation: http://www.magentocommerce.com/wiki/groups/227/setting_up_magento_in_cron
You simply execute the cron.php file, which will in turn call the partial reindexing process.
php5-cli -f /home/USERNAME/public_html/cron.php
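A typical crontab entry for this might look like the following (the five-minute interval is just an example; Magento decides on each run which scheduled tasks are due):

*/5 * * * * php5-cli -f /home/USERNAME/public_html/cron.php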
How it works:
A change to an entity is made and it is flagged to be reindexed.
A cronjob executes the cron.php file
Magento checks to see which cron tasks it will run, and runs the partial reindexing process
The indexing process will see the changed entity and update the index tables with the new values.

Good way to demo a classic ASP web site

What is the best way to save data in session variables in a classic ASP web site?
I am maintaining a classic ASP web site and want to allow my users to demo all functionality of the site; this means allowing them to delete records.
The closest example I have seen so far is the demos of the Telerik controls, where they save the dataset in session on first load and allow the user to manipulate the data.
How can I achieve the same in ASP with an MS Access backend?
If you want to persist the state over multiple pages (e.g. to demo your complete application), then it's a bit tricky.
I would suggest copying the MDB file for each session and using the copied version. This would ensure that every session uses its own data.
Create a version of your Access DB which will be used as a fresh template for each user.
On session start, copy the template and name it after the user's session ID.
Use the individual MDB.
Note: The only drawback I can see here is that you need to remove the unused MDB files, as they can pile up after some time. You could do it with a scheduled task, or even on session start before you create a new one.
I am not sure what you can use to check whether a file is still in use, but you could check the file's creation date, or maybe the LDB lock file can help you as well (if it does not exist, the database is not in use).
You can store a connection or even an object in a session variable, as long as you remember what kind of variable you stored when you retrieve it. I have never stored a dataset in a session variable, but I have stored a lot of arrays in session variables, so you can use the ADO GetRows method to load a complete dataset into a session variable.
How big is the Access database? If your database is small enough (relative to the server capacity, expected number of users, and so forth) then I like the idea of using a fresh copy of the database for each user that runs the demo.
With this approach, you simplify your possible code paths. Otherwise this "are we in demo mode or not?" logic will permeate a heck of a lot of your code.
I'd do it like this...
When the user begins the demo, make a copy of the Access DB for that user to use. If your db is foo.mdb, copy it to /tempdb/foo_1234567890.mdb where 1234567890 is the user's session ID.
Alter the user's connection string to point to the fresh database copy. From this point on, your app can operate like "normal" with no further modifications.
Have a scheduled task that deletes all files in /tempdb with last-modified times more than __ hours in the past. If you don't have the ability to schedule tasks on the server (perhaps you're in a shared hosting environment, etc) then you could do this at the same time you do step #1.
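Sketched in PHP for consistency with the other examples on this page; a classic ASP version would use FileSystemObject and Session.SessionID instead, and all paths here are placeholders:

<?php
// demo_setup.php: give each demo session its own throwaway database copy
session_start();

$template = '/data/foo.mdb';
$copy     = '/tempdb/foo_'.session_id().'.mdb';

if (!file_exists($copy)) {
    copy($template, $copy); // fresh data for this session only
}

// Point the connection string at the session's private copy.
$connStr = 'Provider=Microsoft.Jet.OLEDB.4.0;Data Source='.$copy.';';

// Cleanup: remove copies that have not been touched for more than 24 hours.
foreach (glob('/tempdb/foo_*.mdb') as $old) {
    if (time() - filemtime($old) > 24 * 60 * 60) {
        @unlink($old);
    }
}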
