Failed to store cell in cache - caching

I am using multiple CSV files to populate the tables in a database.
Since I am on Symfony, I created a command that reads the files from a given directory in a defined order.
Each CSV file corresponds to a table in my DB.
File sizes differ from file to file, and some files contain more than 65 thousand lines.
Locally, my script has been running for 3 days; it is progressing, but it is heavy.
On a staging server, my script stops after just a few minutes with the error:
CRITICAL: Error thrown while running command "symfony-command-name". Message: "Failed to store cell BB18118 in cache" {"exception":"[object] (PhpOffice\PhpSpreadsheet\Exception(code: 0): Failed to store cell BB18118 in cache at project/vendor/phpoffice/phpspreadsheet/src/PhpSpreadsheet/Collection/Cells.php:393)
I am using Symfony FilesystemAdapter and SimpleCacheBridge (1.1).
In my Symfony command I do:
$pool = new FilesystemAdapter();
$simpleCache = new SimpleCacheBridge($pool);
Settings::setCache($simpleCache); // PhpOffice\PhpSpreadsheet\Settings
// loop directories and call Symfony service ...
In the Symfony service:
$spreadSheet = IOFactory::load($csvPath);
$sheet = $spreadSheet->getActiveSheet()->toArray();
// loop sheet and databases operations ...
Stack:
PHP: 7.4
Symfony: 5.3.9
phpspreadsheet: 1.18
SimpleCache: 1.1
Any help, please?

I got the Symfony cache working like this:
composer require symfony/cache
Code
use Symfony\Component\Cache\Adapter\FilesystemAdapter;
use Symfony\Component\Cache\Psr16Cache;
$cache = new FilesystemAdapter();
$psr16Cache = new Psr16Cache($cache);
Settings::setCache($psr16Cache);
Anyway, it did not solve my memory issue.
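A note on the loop itself, since memory keeps growing across files: PhpSpreadsheet keeps a loaded spreadsheet's cells alive until the object is explicitly disconnected, so releasing each one before loading the next is the usual pattern for long-running imports. A rough sketch of how the command and service pieces could fit together ($csvPaths and the database step are placeholders, not from the original code):
use PhpOffice\PhpSpreadsheet\IOFactory;
use PhpOffice\PhpSpreadsheet\Settings;
use Symfony\Component\Cache\Adapter\FilesystemAdapter;
use Symfony\Component\Cache\Psr16Cache;

// In the command: register the PSR-16 cell cache once, before any file is loaded.
Settings::setCache(new Psr16Cache(new FilesystemAdapter()));

// In the service: load, extract, then release each spreadsheet.
foreach ($csvPaths as $csvPath) {
    $spreadSheet = IOFactory::load($csvPath);
    $rows = $spreadSheet->getActiveSheet()->toArray();

    // ... database operations on $rows ...

    // Break internal references so the cells can be garbage collected
    // before the next (possibly 65k+ line) file is loaded.
    $spreadSheet->disconnectWorksheets();
    unset($spreadSheet);
}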

Related

How to download spatie/browsershot generated file into User/Downloads?

In a Laravel 5.8 app I use https://github.com/spatie/browsershot,
and if I save the file as
$save_to_file= 'file.pdf';
Browsershot::html(htmlspecialchars_decode($pdf_content))
->showBackground()
->save($save_to_file);
it is generated and saved in the /public dir of my app on my local OS.
If I try to set the path to the ‘Downloads’ directory of my Kubuntu 18 as
$save_to_file= '/home/currentuser/Downloads/file.pdf';
Browsershot::html(htmlspecialchars_decode($pdf_content))
->showBackground()
->save($save_to_file);
I get the error:
Symfony\Component\Process\Exception\ProcessFailedException
The command "PATH=$PATH:/usr/local/bin NODE_PATH=`npm root -g` node '/mnt/_work_sdb8/wwwroot/lar/votes/vendor/spatie/browsershot/src/../bin/browser.js' '{"url":"file:\/\/\/tmp\/0906513001561868598\/index.html","action":"pdf","options":{"path":"\/home\/serge\/Downloads\/file.pdf","args":[],"viewport":{"width":800,"height":600},"displayHeaderFooter":false,"printBackground":true}}'" failed. Exit Code: 1(General error) Working directory: /mnt/_work_sdb8/wwwroot/lar/votes/public Output: ================ Error Output: ==============
1) Is there a way to download the generated file into ‘Downloads’ (OS-independently)?
2) I think I can use the PHP remove function, but again, how do I define the ‘Downloads’ directory (OS-independently)?
I received the exact same error. No idea why I got it. I followed all the steps mentioned in the docs and tried various resources, but still could not figure it out.
Finally, this worked:
$save_to_file= '/var/www/laravel/storage/app/file.pdf';
Browsershot::url('https://www.google.com')
->noSandbox()->format('a4')->save($save_to_file);
The ->noSandbox() was the key. Let me know if this works for you.
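On question 1 from the original post: in the general case the PHP process cannot write into a visitor's own Downloads folder, so the OS-independent approach is to save the PDF somewhere the app can write to and then return it as a download response; the browser then puts it wherever the user's downloads normally go. A minimal sketch, assuming it runs inside a Laravel controller action ($pdf_content is the same HTML string as in the question, the file name is arbitrary):
use Spatie\Browsershot\Browsershot;

// Save into Laravel's storage dir, which the app can always write to.
$save_to_file = storage_path('app/file.pdf');

Browsershot::html(htmlspecialchars_decode($pdf_content))
    ->showBackground()
    ->noSandbox()
    ->save($save_to_file);

// Returning this from the controller streams the file to the browser,
// which drops it into the user's own Downloads location.
return response()->download($save_to_file, 'file.pdf');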

Required field 'uncompressed_page_size' was not found in serialized data! Parquet

I am getting the below error while trying to save a Parquet file from a local directory using PySpark.
I tried Spark 1.6 and 2.2; both give the same error.
It displays the schema properly but throws the error when writing the file.
base_path = "file:/Users/xyz/Documents/Temp/parquet"
reg_path = "file:/Users/xyz/Documents/Temp/parquet/ds_id=48"
df = sqlContext.read.option( "basePath",base_path).parquet(reg_path)
out_path = "file:/Users/xyz/Documents/Temp/parquet/out"
df2 = df.coalesce(5)
df2.printSchema()
df2.write.mode('append').parquet(out_path)
org.apache.spark.SparkException: Task failed while writing rows
Caused by: java.io.IOException: can not read class org.apache.parquet.format.PageHeader: Required field 'uncompressed_page_size' was not found in serialized data! Struct: PageHeader(type:null, uncompressed_page_size:0, compressed_page_size:0)
In my own case, I was writing a custom Parquet parser for Apache Tika when I experienced this error. It turned out that if the file is being used by another process, the ParquetReader will not be able to access uncompressed_page_size, hence the error.
Verify if other processes are not holding on to the file.
Temporarily resolved by the Spark config:
"spark.sql.hive.convertMetastoreParquet": "false"
Although it has an extra cost, it is a workaround for now.

apache-pig: ERROR 1066: Unable to open iterator for alias

I am trying to run a Pig script on bulk Wikipedia page statistics data.
To start off with, I am just doing a basic filter like:
A = LOAD '/data' using PigStorage(' ') as (project:chararray, page:chararray, requests:int, size:int);
B = FILTER A BY project == 'en';
dump B;
This works fine if I load 2-3 files, but it errors out if I load all the files. The error is:
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias B
To confirm that there are no corrupted records, I made several copies of the file that was working and ran the above script, but no luck. Please advise!

pig + hbase + hadoop2 integration

Has anyone successfully loaded data into hbase-0.98.0 from pig-0.12.0 on hadoop-2.2.0 (a hadoop-2.2.0 + hbase-0.98.0 + pig-0.12.0 combination) without encountering this error:
ERROR 2998: Unhandled internal error.
org/apache/hadoop/hbase/filter/WritableByteArrayComparable
with a line of log trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/filter/WritableByteArrayComparable
I searched the web and found a handful of problems and solutions, but all of them refer to pre-hadoop2 and hbase-0.94.x, which are not applicable to my situation.
I have a 5-node hadoop-2.2.0 cluster, a 3-node hbase-0.98.0 cluster, and a client machine with hadoop-2.2.0, hbase-0.98.0, and pig-0.12.0 installed. Each of them functions fine separately: HDFS, MapReduce, region servers, and Pig all work fine. To complete a "loading data to HBase from Pig" example, I have the following export:
export PIG_CLASSPATH=$HADOOP_INSTALL/etc/hadoop:$HBASE_PREFIX/lib/*.jar
:$HBASE_PREFIX/lib/protobuf-java-2.5.0.jar:$HBASE_PREFIX/lib/zookeeper-3.4.5.jar
and when I try to run: pig -x local -f loaddata.pig
boom, the following error: ERROR 2998: Unhandled internal error. org/apache/hadoop/hbase/filter/WritableByteArrayComparable (this must be the 100+th time I have gotten it, after countless attempts to figure out a working setup).
The trace log shows: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/filter/WritableByteArrayComparable
The following is my Pig script:
REGISTER /usr/local/hbase/lib/hbase-*.jar;
REGISTER /usr/local/hbase/lib/hadoop-*.jar;
REGISTER /usr/local/hbase/lib/protobuf-java-2.5.0.jar;
REGISTER /usr/local/hbase/lib/zookeeper-3.4.5.jar;
raw_data = LOAD '/home/hdadmin/200408hourly.txt' USING PigStorage(',');
weather_data = FOREACH raw_data GENERATE $1, $10;
ranked_data = RANK weather_data;
final_data = FILTER ranked_data BY $0 IS NOT NULL;
STORE final_data INTO 'hbase://weather' USING
org.apache.pig.backend.hadoop.hbase.HBaseStorage('info:date info:temp');
I have successfully created an HBase table 'weather'.
Has anyone had success with this and would be generous enough to share it with us?
Rebuild Pig with:
ant clean jar-withouthadoop -Dhadoopversion=23 -Dhbaseversion=95
By default it builds against HBase 0.94; 94 and 95 are the only options.
If you know which jar file contains the missing class, e.g. org/apache/hadoop/hbase/filter/WritableByteArrayComparable, then you can use the pig.additional.jars property when running the pig command to ensure that the jar file is available to all the mapper tasks.
pig -D pig.additional.jars=FullPathToJarFile.jar bulkload.pig
Example:
pig -D pig.additional.jars=/usr/lib/hbase/lib/hbase-protocol.jar bulkload.pig

Joomla Fatal error

I am new to Joomla and I have created a component folder named com_joomlabook
and, within that, two files: joomlabook.php and joomlabook.html.php.
But when I run it in the browser using http://localhost/joomla/Joomla_1.5.7-Stable-Full_Package/administrator/index.php?option=com_joomlabook
I get: Fatal error: require_once() [function.require]: Failed opening required '' (include_path='.:/usr/share/php:/usr/share/pear') in /home/ntdg/public_html/joomla/Joomla_1.5.7-Stable-Full_Package/administrator/components/com_joomlabook/joomlabook.php on line 6. That is the getPath line. Why? Please advise.
Did you "install" the component or add a row to the components table? Joomla doesn't just see what folders are there and run scripts.
I actually just wrote a blog post last week on how to create your own components quickly, which might be useful: http://infuseddesign.co.uk/blog/7-joomla/24-building-joomla-components
