Laravel - Export PDF with huge data (~10k rows, >80 cols)

I'm writing an export feature for report data, using PhpSpreadsheet with the TCPDF library.
But when the data is huge (~10k rows, >80 columns), the output is only a single blank page.
I tried chunking the data and exporting it as multiple PDF files (1.pdf, 2.pdf,...), then merging them into one file with the pdftk library, but still had no success.
Additionally, when I export many columns, the PDF does not show all of them, because the PDF's paper size is too small.
Can anyone help me? What is the best solution for exporting huge data, and which library should I use?
Thanks, everyone!

You should give Laravel Snappy PDF a try, or my second favorite, DOMPDF.
But for large data I definitely recommend Laravel Snappy PDF: it shells out to a wkhtmltopdf binary, so the heavy rendering happens outside PHP's memory limits.
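For illustration, here is a minimal sketch with the barryvdh/laravel-snappy package, assuming a wkhtmltopdf binary is installed on the server; the route, the reports.export Blade view, and the Report model are hypothetical placeholders. A larger paper size plus landscape orientation also addresses the cut-off columns:

use Barryvdh\Snappy\Facades\SnappyPdf;
use App\Models\Report;                     // hypothetical model

Route::get('/report/pdf', function () {
    $rows = Report::all();                 // ~10k rows; consider chunking the query

    return SnappyPdf::loadView('reports.export', ['rows' => $rows])
        ->setPaper('a3')                   // larger page so >80 columns fit
        ->setOrientation('landscape')
        ->download('report.pdf');
});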

Related

Looking for the optimum options for Lightweight Charts to visualize data in CSV files or a database

I am a Python developer and new to JavaScript and Lightweight Charts.
I am noticing that all Lightweight Charts code samples use a JavaScript array to initialize chart data. My candle bar data reside in a database that I can export to one or more CSV files.
What are the practical options for Lightweight Charts to visualize data in CSV files or a database?

Export selected data into a PDF using an RDF file

I am currently trying to convert a simple table into a PDF file using an existing .rdf file.
My first approach was to look for a new program that can do this, because I want to replace the current 'Oracle Reports' program.
Is there any other program that supports converting SQL data into a PDF using an .rdf file?
I tried writing a Python 3 script to do just that, but I didn't know where to start.
Oracle APEX 21.2 (the latest at the time of writing) has a package named APEX_DATA_EXPORT that can take a SELECT statement and export it into various formats, one of them being PDF. The example in the documentation shows how to generate a PDF from a simple query. After calling apex_data_export.export, you can take the BLOB the function returns and do whatever you need with the PDF.
There are not very many options for styling and formatting the table, but Oracle does plan on adding additional printing capabilities for PDFs in the future.

SSRS - Unzip image varbinary(max) data and display

I'm working with a database-driven application that allows users to upload images, which are then zipped and stored in the database as varbinary(max). I am now trying to get those images to display within an SSRS report (using BI 2005).
How can I convert the file data (which is 65,438 characters long when zipped and 65,535 characters when not zipped) into a normal varbinary format that I can then display in SSRS?
Many thanks in advance!
You'll have to embed a reference to a DLL in your project and use a function to decompress the data within SSRS; see, for example, SharpZipLib. Consider storing the data uncompressed if possible, as the CPU/space trade-off is unlikely to be in your favour here: image data is likely to have a poor compression ratio (it is usually already compressed).

How can one export filtered data to csv or excel format?

This will give me all the current filtered data:
dim.top(Infinity)
I need help either pushing the data back to the server and then exporting it to CSV/Excel, or doing everything client-side.
Any ideas would be appreciated.
This isn't really a dc.js question. There isn't anything built into dc.js or d3.js for writing data out in CSV or JSON format, probably because writing these formats is much easier than parsing them.
Take a look at, for example:
How to export JavaScript array info to csv (on client side)?
and its correction (it apparently has a bug) here:
JavaScript array to CSV
I don't know if there are libraries to do this. It's not very complicated to write by hand.
EDIT: I wasn't aware that d3 does have CSV output as well:
https://github.com/mbostock/d3/wiki/CSV#format
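If you do take the server-side route mentioned in the question, a minimal PHP sketch of the receiving end could look like this; the endpoint and the assumption that the filtered rows arrive as a JSON array of flat objects are both hypothetical:

// Hypothetical endpoint: receives the filtered rows as JSON, returns CSV.
$rows = json_decode(file_get_contents('php://input'), true);

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

$out = fopen('php://output', 'w');
fputcsv($out, array_keys($rows[0]));   // header row from the first record's keys
foreach ($rows as $row) {
    fputcsv($out, $row);               // one CSV line per record
}
fclose($out);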

CodeIgniter PHPExcel reader displays data very slowly

I am using CodeIgniter with PHPExcel.
Basically I am reading data from Excel files with PHPExcel and displaying it on the website. Currently it takes a long time to load.
What it basically does is create JSON files through the PHPExcel library, and the data is read from the JSON once the page has loaded.
But I am facing slow loads now. Looking at the JSON files, I saw that each is around 3.5 MB, and I am reading data from more than 3 such files.
Can anyone suggest any workarounds or optimisations? I have read about "reading in chunks".
Can we read just a few rows for the first request, like the basic filtering we generally do when fetching from a database?
Maybe you should use a DB to avoid loading the files every time? If you want to keep the JSON format, MongoDB or PostgreSQL (with a JSON field) would be perfect. Or just parse the fields from the Excel file and load them into a normalized DB.
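For the one-off import, the "reading in chunks" the question mentions is done in PHPExcel with a read filter. Here is a minimal sketch of that documented pattern, loading one chunk at a time; the file name, chunk size, row limit, and the insert step are placeholders:

// Read filter that only admits one chunk of rows at a time.
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read the heading row, plus the rows of the current chunk.
        return $row == 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$reader = PHPExcel_IOFactory::createReader('Excel2007');
$filter = new ChunkReadFilter();
$reader->setReadFilter($filter);

for ($startRow = 2; $startRow <= 20000; $startRow += 1000) {
    $filter->setRows($startRow, 1000);
    $excel = $reader->load('report.xlsx');       // placeholder file name
    // ... insert this chunk's rows into the database here ...
    $excel->disconnectWorksheets();              // free memory before the next chunk
    unset($excel);
}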
