Not able to generate more than 20 pages (approx.) with wkhtmltopdf

I am trying to generate a PDF with 30-35 pages using wkhtmltopdf, but there are blank pages after 20 pages (sometimes 21 or 22). To confirm this, I tried generating the same page 35 times in a loop (please note there is no error in the HTML file).
I am using NReco.PdfGenerator (C#).

If wkhtmltopdf.exe throws a timeout exception, use the JavaScript profiler (Chrome) and
optimize your code as much as possible.
After refactoring based on the approach above, we also added the
--no-stop-slow-scripts parameter along with a 3-minute timeout to the
wkhtmltopdf exe. Now I can generate more than 80 pages :)
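For readers driving wkhtmltopdf from PHP via shell_exec (as in the questions below) rather than through NReco.PdfGenerator, a rough sketch of the same flag-plus-timeout idea follows; the file names and the GNU timeout wrapper are assumptions, and in NReco the flag would instead go into the converter's custom wkhtmltopdf arguments.
<?php
// Hedged sketch: same idea as the answer above, but for a direct command-line call.
$cmd = sprintf(
    'timeout 180 wkhtmltopdf --no-stop-slow-scripts %s %s 2>&1',  // 180 s ~ the 3-minute timeout
    escapeshellarg('report.html'),   // placeholder input file
    escapeshellarg('report.pdf')     // placeholder output file
);
$output = shell_exec($cmd);          // inspect $output if pages still come out blank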

Related

Magento Import "blank" errors even when uploading export file

I am running into issues when uploading inventory files using the Dataflow import. I am only importing about 500 items, but at record 42 it seems to find an error and just displays a blank red bar with no description of the error. If I run the import with fewer than 40 lines, I don't get an error.
To rule out my format/coding, I exported the product stock file and then tried importing that same file, and I get the same "blank" errors. I have tried just about everything I could search for without any luck, so I'm seeing if anyone else has an idea or something else I could try.
Magento version - 1.9.2.1
You should try getting your server limits increased, something like this:
max_execution_time = 36000 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
memory_limit = 1024M ; Maximum amount of memory a script may consume
Or, try adding the following code to index.php:
ini_set('memory_limit', '1024M');
ini_set('max_execution_time', 36000);
Also Check out Customer CSV export fails with blank page
I might be late, but here's what I did to overcome the same issue.
Split your CSV product file into smaller CSV files of 40 products each, either with the help of a free online CSV splitter - http://splitfile.vavro.me - or with a short script like the one sketched after these steps. (I had a CSV file of 3000+ products, so it got split into 75 files.)
Then upload all those small CSV files to your server's "var/import/" folder. Now log in to your Magento back-end, go to System > Import/Export > Dataflow - Profiles, choose your profile, then go to 'Run Profile'. Here you can see all the uploaded CSV files listed in a drop-down. Now execute the files one by one. It's a pretty long process, but worth it.
Note that after executing each file, you need to wait at least 2 minutes for the indexes to refresh. Otherwise you might end up getting the same red blank errors again!
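A minimal offline sketch of that splitting step, assuming a standard export at var/export/products.csv and a chunk size of 40 (both paths and the chunk size are assumptions to adjust):
<?php
// Split a large product CSV into chunks of 40 rows, repeating the header in each chunk.
$source    = 'var/export/products.csv';
$chunkSize = 40;

$in     = fopen($source, 'r');
$header = fgetcsv($in);                     // keep the header row for every chunk
$rows   = [];
$part   = 1;

while (($row = fgetcsv($in)) !== false) {
    $rows[] = $row;
    if (count($rows) === $chunkSize) {
        writeChunk($header, $rows, $part++);
        $rows = [];
    }
}
if ($rows) {
    writeChunk($header, $rows, $part);      // remainder
}
fclose($in);

function writeChunk(array $header, array $rows, int $part): void
{
    $out = fopen(sprintf('var/import/products_part_%03d.csv', $part), 'w');
    fputcsv($out, $header);
    foreach ($rows as $row) {
        fputcsv($out, $row);
    }
    fclose($out);
}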

wkhtmltopdf runtime for many PDF creations

I am using wkhtmltopdf on my Ubuntu server to generate PDFs from HTML templates.
wkhtmltopdf is started from a PHP script with shell_exec.
My problem is that I want to create up to 200 PDFs at (almost) the same time, which makes the runtime of wkhtmltopdf stack up for every PDF: one file needs 0.6 seconds, 15 files need 9 seconds.
My idea was to start wkhtmltopdf in a screen session to decrease the runtime, but I can't make it work from PHP, and this might not make much sense anyway, because I also want to merge all the PDFs into one after creation, so I would have to check whether every session had terminated.
Do you have any ideas how I can decrease the runtime for this number of PDFs, or can you give me advice on how to realize this correctly and smartly with screen?
My script looks like the following:
loop up to 200 times {
- get data for html-template from database
- fill template-string and write .html-file
- create pdf out of html-template via shell_exec("wkhtmltopdf....")
- delete template-file
}
merge all generated pdfs together to one and send it via mail
Thank you in advance, and sorry for my bad English.
Best wishes
Just create a single large HTML file and convert it in one pass instead of merging multiple PDFs afterwards.
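As a rough sketch of that approach (fetchRecords() and fillTemplate() are placeholders for the questioner's own database and template code, and the temp paths are assumptions):
<?php
// Build one combined HTML document instead of 200 separate ones.
$sections = [];
foreach (fetchRecords() as $record) {           // up to ~200 records from the database (placeholder helper)
    $sections[] = fillTemplate($record);        // returns the filled HTML fragment (placeholder helper)
}

// A CSS page break after each section keeps the records on separate PDF pages.
$html = '<html><head><style>.record { page-break-after: always; }</style></head><body>'
      . '<div class="record">' . implode('</div><div class="record">', $sections) . '</div>'
      . '</body></html>';

file_put_contents('/tmp/all.html', $html);
shell_exec('wkhtmltopdf /tmp/all.html /tmp/all.pdf');   // single conversion, no merge step needed
unlink('/tmp/all.html');
// /tmp/all.pdf now contains everything and can be mailed directly.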

Generating documentation for a huge PHP file

I have a huge PHP file (approx. 2 MB). As it's a third-party file, I can't change it. Now I want to generate documentation for it using an automatic documentation generator.
I tried Doxygen, ApiGen and phpDocumentor, but each of them hangs or exhausts the memory.
Is there a way to generate documentation for this file?
You could try increasing the memory_limit setting in your php.ini.
If this fails, you could split the file up into several new ones and later "stitch" the output of the generator together.
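For a one-off run, the limit can also be raised on the command line instead of editing php.ini; a hedged example for phpDocumentor (the phar path, file name and -f/-t options are assumptions to check against your installed version):
php -d memory_limit=-1 phpDocumentor.phar -f huge-third-party-file.php -t docs/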

PHP (and CodeIgniter) upload size limited to 1MB

I have a function (I'm using CodeIgniter) that uploads a file, resizes it, and saves details into a database.
I have no problems in uploading images up to 1MB, so I know that permissions work ok.
However, as soon as I try to upload something above 1MB, the function becomes really slow, and after a while I'm presented with a blank page.
These are the main values in the php.ini file:
post_max_size: 32M
max_input_time: 60
max_execution_time: 30
file_uploads: 1
upload_max_filesize: 32M
According to this I should have plenty of time and megabytes to upload the file successfully.
What else could this depend on?
UPDATE (following Mike's and Minboost's questions below)
a. logs are clean, no sign of problems there, and the log actually shows that the page was processed in 0.03 seconds!
b. memory_limit is 96 MB
c. I'm not applying XSS filters on this
...any additional ideas?
The thing I don't understand is that it takes a very long time to upload a file even on my Mac (localhost); I managed to upload a 2.7 MB picture, but I had to wait a few minutes. There seems to be a step change (for the worse) above the 500 KB threshold: uploads are smooth and fast below that and become very slow above it.
It could also depend on memory_limit.
Are you checking the error logs? What errors are returned? Make sure you're not XSS-filtering the upload file form field. Also, I've had to try this before:
set max_allowed_packet higher in /etc/my.cnf and restart MySQL.
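For reference, a minimal sketch of that change (the 64M value is an assumption; pick something comfortably above your largest upload):
# /etc/my.cnf
[mysqld]
max_allowed_packet = 64M
# then restart MySQL, e.g.
sudo service mysql restart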

Rendering large collections of articles to PDF fails in MediaWiki with mwlib

I have installed the MediaWiki Collection extension and mwlib to render articles (or collections of articles) to PDF. This works very well for single articles and for collections of up to 20 articles.
When I render larger collections, the percentage counter on the parsing page (which counts up to 100% when rendering succeeds) is stuck at 1%.
Looking at mwrender.log, I see an Error 32 (Broken pipe). Searching the internet reveals that Error 32 can be caused by the receiving process (the part after the pipe) crashing or not responding.
From here it is hard to proceed. Where should I look for more clues? Could it be that the connection to the MySQL server dies?
The whole appliance is running on a Turnkey Linux MediaWiki VM.
I'm using the PDF Export extension and it works with more than 20 articles. Maybe try that?
I figured out the problem myself.
mw-render spawns a parallel request for every article in a collection. This means that for a collection of 50 pages, 50 simultaneous requests are made. Apache could handle this, but the MySQL database behind MediaWiki could not.
You can limit the number of threads that mw-render spawns with the --num-threads=NUM option. I couldn't find where mw-serve calls mw-render, so I just limited the maximum number of workers Apache could spawn to 10.
mw-render automatically retries requests for articles whose first attempt fails, so this approach worked.
I rendered a PDF with 185 articles within 4 minutes; the resulting PDF had 300+ pages.
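For reference, sketches of the two knobs mentioned above (the Apache 2.2 prefork syntax is an assumption based on the Turnkey appliance of that era; adjust for your setup):
# mw-render option documented above, added wherever mw-render is invoked:
#   --num-threads=10
# What was actually done in this answer: cap Apache's prefork workers in the Apache config:
<IfModule mpm_prefork_module>
    MaxClients 10
</IfModule>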
