Magento: product bulk import on cron run

I have more than 2000 products that are updated on a cron run. The procedure is to delete each product first and then import it again. I am using
$product->delete() for the delete, but the whole script takes a long time to run and I get a
500 Internal Server Error. How can I optimize this?
I really need a way out of this. I have also increased max_execution_time in my php.ini.

Rather than deleting everything, break this into three parts:
Delete only the products that will no longer exist according to the received feed.
Insert/update the other products.
If for some reason you still want to delete everything, delete a smaller chunk of products at a time and loop until all are deleted (a rough sketch follows below).
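For the chunked delete, a minimal, untested sketch of what the cron script could look like on Magento 1.x. The $skusToDelete list and the chunk size of 100 are assumptions for illustration, not the original poster's code:

<?php
// Rough sketch: delete products in chunks during a cron run (Magento 1.x).
// $skusToDelete is assumed to hold the SKUs missing from the latest feed;
// the chunk size of 100 is arbitrary.
require_once 'app/Mage.php';
Mage::app('admin');

// delete() outside an admin session requires the isSecureArea flag.
Mage::register('isSecureArea', true);

$skusToDelete = array(/* SKUs missing from the feed */);

foreach (array_chunk($skusToDelete, 100) as $chunk) {
    $collection = Mage::getModel('catalog/product')->getCollection()
        ->addAttributeToFilter('sku', array('in' => $chunk));

    foreach ($collection as $product) {
        $product->delete();
    }

    // Release the loaded models so memory does not build up across chunks.
    $collection->clear();
}

Mage::unregister('isSecureArea');

Splitting the work this way keeps each iteration short, so the job is less likely to hit max_execution_time in one long run.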

Related

Updating Same Record Multiple Times at the Same Time on Oracle

I have a Voucher Management System, and on that system I run a campaign that works on a first-click, first-served basis. In the first minutes after the campaign starts, many people try to get vouchers, and every time they do, the same update mechanism runs.
When updating the number of issued vouchers, I noticed a bottleneck: all of these updates try to update the same row within a short time, so transactions queue up and wait for the current update to finish.
After the campaign, I can see that some updates waited for 10 seconds. How can I solve this?
First, I tried to minimize the query execution time, but it is already a simple query.
Assuming everyone is doing something like:
SELECT ..
FROM voucher_table
WHERE <criteria indicating the first order>
FOR UPDATE
then obviously everyone (except one person) is going to queue up until the first person commits. Effectively you end up with a single-user system.
You might want to check out the SKIP LOCKED clause (i.e. FOR UPDATE SKIP LOCKED), which lets a session lock the next eligible row it can get and skip over rows that another session has already locked, rather than waiting on them.

CRM 2015 - Delete takes almost 35 to 40 seconds if I select multiple records from Project-Product, Quote-Product, Order-Product (not a bulk delete)

In CRM 2015, it takes 35 to 40 seconds to delete more than 5 records from the Project Product, Quote Product, or Order Product list. It's not a bulk delete.
Sometimes I see an unresponsive/wait screen partway through the delete process.
How can I fix this issue or reduce the time?
Due to internal processes including recalculation of the parent Order values, deleting child records like an Order Product can take a while.
When you delete a single Order Product, the system will recalculate the parent Order's total, etc. This happens for each record, and deleting multiple records of course takes longer.
There may also be other processes happening - either custom or system processes. You can check if there are any custom ones, but the system processes are largely a black box.
I have seen situations where a client occasionally needed to create an invoice with over 10,000 lines. Since creating each line triggers a recalculation, normal automation options were timing out. I wound up creating a console app to add the lines to the monster invoices in batches.

Magento determine if report script is running

I have created a Magento module that will, based on some filters, create a CSV file with the order data. This report takes anywhere from 15–40 minutes to run depending on the selected filters. Since there is a lot of data, I used straight queries to generate the report.
So what I am trying to do now, is to make sure that when this report is being generated, no one else can run it. So I need to be able to detect that the query is running. Any suggestions on the best approach to this?
Use a lock file, e.g. report.lock. When the report starts, check whether this file exists and return an error if it does; otherwise create the file. Delete it once the report is complete.
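A minimal sketch of that lock-file guard; the path, message, and report logic are placeholders:

<?php
// Rough sketch of a lock-file guard around the report run.
$lockFile = Mage::getBaseDir('var') . DS . 'report.lock';

if (file_exists($lockFile)) {
    // Another run is already in progress (or a previous run crashed).
    Mage::throwException('The order report is already being generated.');
}

file_put_contents($lockFile, getmypid());   // mark the report as running

try {
    // ... run the long report queries and write the CSV ...
} catch (Exception $e) {
    unlink($lockFile);   // don't leave a stale lock behind on failure
    throw $e;
}

unlink($lockFile);

If a crashed run leaving a stale lock file is a concern, flock() on an open file handle is a slightly more robust variant, since the OS releases the lock when the process dies.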

Magento 1.5.0.1 catalog_product_price index issue

I am using Magento 1.5.0.1 with 600,000 products. Indexing is a major issue, especially catalog_product_price index.
1/ Towards the end of the indexing process a query is run: DELETE FROM catalog_product_index_price. This has the effect of removing every item from our site, so the site displays 'There are no products matching the selection.' for all categories, the home page, and search results.
2/ The process of inserting from catalog_product_index_price_idx into catalog_product_index_price takes 10 minutes, so we have a 10-minute window with no products on the site. I am absolutely certain this is a bug; there is no way someone intended for indexing to remove all products for a period of time. Even if it were only 10 seconds, this would not be right for an ecommerce website.
3/ For some reason the DELETE FROM catalog_product_index_price sometimes leaves a few products in the table, so when the insert from catalog_product_index_price_idx into catalog_product_index_price runs, the indexer throws an integrity constraint error because of duplicate entries. This ends the indexing process and leaves the site with no products. We run indexing in the early hours of the morning, so sometimes the site has no products for several hours if the index fails.
Does anyone know of a fix to these issues or a better way to update prices on the site that does not require us to index?
Firstly, well done for running Magento with 600k products; that is the most I have heard of.
The best way to work around this, I think, would be to override the indexing process so that instead of truncating and rebuilding the price index, it replaces rows line by line. This would probably take longer overall, but it would remove the window with no products.
One thing you could try is replacing the DELETE FROM with TRUNCATE TABLE, which might be more reliable for your "still some items in the table" issue.
Ultimately, though, I think you are going to end up building a bespoke optimised indexer. A rough idea of the row-by-row replacement is sketched below.
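To make "replace line by line" concrete, here is a rough sketch (not core code) of upserting the freshly built rows over the live ones and then pruning rows that no longer exist, so the live table is never emptied. It assumes the stock catalog_product_index_price schema and abbreviates the column list:

<?php
$resource = Mage::getSingleton('core/resource');
$adapter  = $resource->getConnection('core_write');

$idx  = $resource->getTableName('catalog_product_index_price_idx');
$live = $resource->getTableName('catalog_product_index_price');

// 1) Upsert the freshly built index rows over the live ones.
$adapter->query("
    INSERT INTO {$live}
        (entity_id, customer_group_id, website_id, tax_class_id,
         price, final_price, min_price, max_price, tier_price)
    SELECT entity_id, customer_group_id, website_id, tax_class_id,
           price, final_price, min_price, max_price, tier_price
      FROM {$idx}
    ON DUPLICATE KEY UPDATE
        price       = VALUES(price),
        final_price = VALUES(final_price),
        min_price   = VALUES(min_price),
        max_price   = VALUES(max_price),
        tier_price  = VALUES(tier_price)
");

// 2) Remove live rows that are no longer present in the new index data.
$adapter->query("
    DELETE live FROM {$live} AS live
      LEFT JOIN {$idx} AS idx
        ON idx.entity_id = live.entity_id
       AND idx.customer_group_id = live.customer_group_id
       AND idx.website_id = live.website_id
     WHERE idx.entity_id IS NULL
");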

Exporting 8million records from Oracle to MongoDB

I have an Oracle database with 8 million records and I need to move them to MongoDB.
I know how to import data into MongoDB from a JSON file using the import command, but I want to know whether there is a better way to do this, given these issues:
Given the limit on execution time, how do I handle it?
The database is growing every second, so what is the plan for making sure that every record has been moved?
Given the limit on execution time, how do I handle it?
Don't do it with the JSON export / import. Instead, write a script that reads the data, transforms it into the correct format for MongoDB, and then inserts it there (a rough sketch follows the list of reasons below).
There are a few reasons for this:
Your tables / collections will not be organized the same way. (If they are, then why are you using MongoDB?)
This will allow you to monitor progress of the operation. In particular you can output to log files every 1000th entry or so to get some progress and be able to recover from failures.
This will test your new MongoDB code.
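A hedged sketch of such a script, assuming PHP with the oci8 extension and the mongodb/mongodb library; the table, columns, collection name, and batch size are placeholders:

<?php
require 'vendor/autoload.php';

$oracle = oci_connect('user', 'pass', '//oracle-host/ORCL');
$mongo  = new MongoDB\Client('mongodb://localhost:27017');
$orders = $mongo->selectDatabase('shop')->selectCollection('orders');

$stid = oci_parse($oracle,
    'SELECT order_id, customer_id, total, created_at FROM orders ORDER BY order_id');
oci_execute($stid);

$batch = array();
$count = 0;

while ($row = oci_fetch_assoc($stid)) {
    // Reshape the relational row into the document structure you actually want.
    $batch[] = array(
        '_id'        => (int) $row['ORDER_ID'],
        'customerId' => (int) $row['CUSTOMER_ID'],
        'total'      => (float) $row['TOTAL'],
        'createdAt'  => new MongoDB\BSON\UTCDateTime(strtotime($row['CREATED_AT']) * 1000),
    );

    if (count($batch) === 1000) {
        $orders->insertMany($batch);
        $count += count($batch);
        $batch  = array();
        // Log progress so a failed run can be resumed from the last logged id.
        error_log("migrated {$count} rows, last order_id {$row['ORDER_ID']}");
    }
}

if ($batch) {
    $orders->insertMany($batch);
}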
The database is growing every second, so what is the plan for making sure that every record has been moved?
There are two strategies here.
Track the entries that are updated and re-run your script on newly updated records until you are caught up.
Write to both databases while you run the script that copies the data. Then, once the script has finished and everything is up to date, you can cut over to using just MongoDB.
I personally suggest #2; it is the easiest method to manage and test while maintaining up-time. It's still going to be a lot of work, but it will allow the transition to happen. A simple dual-write shape is sketched below.
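To illustrate strategy #2, a minimal dual-write sketch; the class, table, and field names are hypothetical placeholders for your application code:

<?php
class OrderWriter
{
    private $oracle;
    private $orders;

    public function __construct($oracleConnection, MongoDB\Collection $orders)
    {
        $this->oracle = $oracleConnection;
        $this->orders = $orders;
    }

    public function save(array $order)
    {
        // 1) Keep Oracle as the system of record for now.
        $stid = oci_parse($this->oracle,
            'INSERT INTO orders (order_id, customer_id, total) VALUES (:id, :cid, :total)');
        oci_bind_by_name($stid, ':id',    $order['order_id']);
        oci_bind_by_name($stid, ':cid',   $order['customer_id']);
        oci_bind_by_name($stid, ':total', $order['total']);
        oci_execute($stid);

        // 2) Mirror the write into MongoDB so it is already up to date at cut-over.
        $this->orders->replaceOne(
            array('_id' => (int) $order['order_id']),
            array(
                '_id'        => (int) $order['order_id'],
                'customerId' => (int) $order['customer_id'],
                'total'      => (float) $order['total'],
            ),
            array('upsert' => true)
        );
    }
}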
