Why is Laravel truncating large text in the db - laravel

My table field is LONGTEXT, but when I try to save a text roughly three textbook pages long, it is saved to the db incomplete: the text gets truncated. What is wrong? Does Laravel set a default maximum length for a payload?
Is it a problem regarding my database?
Any kind of help is appreciated.
Table Schema
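
Laravel does not impose its own length cap on saved attributes, so truncation like this usually points at the database: either the column MySQL actually created is narrower than LONGTEXT (a migration using $table->string() or $table->text() produces VARCHAR(255) or TEXT, not LONGTEXT), or MySQL is running outside strict mode and silently truncates oversized values instead of raising an error. A minimal check-and-fix sketch, using a hypothetical articles table with a body column:

-- Inspect what MySQL actually created for the column (hypothetical table name):
SHOW CREATE TABLE articles;

-- If the column is VARCHAR(255) or TEXT rather than LONGTEXT, widen it;
-- LONGTEXT holds up to 4 GB:
ALTER TABLE articles MODIFY body LONGTEXT;

-- If strict mode is off, MySQL truncates with only a warning; check the mode:
SELECT @@sql_mode;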

Related

What is the use of db_datas column in ir_attachment table in Odoo12?

I am working on Odoo12. I read at https://apps.odoo.com/apps/modules/8.0/product_image_filestore/ that images are stored in the filestore in later versions of Odoo, i.e. 9-13.
If that is true, then why is there a db_datas field in the ir_attachment table? It shows binary data in this column, even though the table also has store_fname holding the file path. Is Odoo storing both the binary form and the file path of the image? Won't that increase the size of the database?
pgAdmin shows a NULL value of db_datas as "<binary data>". If you run a SELECT query or view these rows in phpPgAdmin, you'll see NULL.

Saving float changes to a float or to a varchar2 column?

I need to save before and after value changes of certain fields of an items table to an items_log table. Changes are saved by an after change trigger on the items table.
Some of the items table columns are varchar2 type and some are number(*) type.
What is the better approach? Saving to separate pairs of before/after number fields and before/after varchar2 fields? Or conserving space by saving everything to a single pair of before/after varchar2 fields?
The purpose of this log table is to record which user changed a field and the before and after values.
Could saving a float value to a string field lead to an unexpected deviation from the original value?
Thanks in advance
"What is the better approach?"
There is no "better" approach. There is only an approach that's good enough for your application. If your table will have a few thousand rows in it, it doesn't really matter. If your table will have a few million rows, then space may be more of a concern.
If your goal is to display to a user what changes occurred to your item and it's not going to see a lot of activity, storing everything as a varchar may be good enough. You probably don't want to store rows for fields that did not change.
I use APC's approach often. The items_log table is the same as the item table, and includes a history id, timestamp, action (I, U, or D), and user along with all the columns of the item row. Everything is maintained by a trigger. There are also built-in Oracle auditing features to do auditing for you.
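
A minimal sketch of that shadow-table trigger, assuming hypothetical item_id and price columns (a real items table would list all of its columns) and an items_log table with log_action, changed_by, and changed_at audit columns:

CREATE OR REPLACE TRIGGER items_audit
AFTER INSERT OR UPDATE OR DELETE ON items
FOR EACH ROW
BEGIN
  IF DELETING THEN
    -- For deletes, only the :OLD values exist.
    INSERT INTO items_log (item_id, price, log_action, changed_by, changed_at)
    VALUES (:OLD.item_id, :OLD.price, 'D', USER, SYSTIMESTAMP);
  ELSE
    -- For inserts and updates, log the row as it looks after the change.
    INSERT INTO items_log (item_id, price, log_action, changed_by, changed_at)
    VALUES (:NEW.item_id, :NEW.price,
            CASE WHEN INSERTING THEN 'I' ELSE 'U' END, USER, SYSTIMESTAMP);
  END IF;
END;
/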

Magento Terms & Conditions max character limit

I have a problem here that even after hours of searching with my friend Google, I'm still getting no results...
My Terms & Conditions are larger than the maximum character limit of the Magento section for them.
I would like to know if one of you could please help me locate the file and the line to edit to raise that maximum character limit, so I can enter my complete Terms & Conditions without problems.
Thank you very much in advance for your time.
Sincerely,
Nicolas
The T&C content is stored in the checkout_agreement table in a field named content
This field is assigned the datatype TEXT, which holds a maximum of 65,535 bytes (roughly 64 kB); how many characters fit depends on how many bytes your UTF-8 encoded text uses.
You would need to change the datatype to MEDIUMTEXT, which allows up to about 16 MB, or LONGTEXT, which allows up to about 4 GB.
Testing this will be necessary to make sure no validation limits have been imposed on the entry template.
You can modify the structure of the checkout_agreement table by changing the data type of the content field from TEXT to LONGTEXT to allow for more characters.
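
For reference, the schema change both answers describe boils down to one statement; take a full database backup first, and re-apply any NOT NULL or DEFAULT attributes the column already carries:

ALTER TABLE checkout_agreement MODIFY content LONGTEXT;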

Storing and retrieving value inconsistency in a table in Oracle

I am facing a weird problem. I have a table (observation_measurement) in an Oracle DB and it has many fields. One field is observation_name; it stores the names of different measurements together with their values read from a text file.
For example, observation_name stores four measurements a, b, c, d (the names of the measurements) and their corresponding values 1, 2, 3, 4.
Later the same text file is read again. This time the file has only three measurements a, b, d (c is not there) with values 7, 8, 9, and these are stored in the table. So if I ask for the latest values of all observation_names, I should get a=7, b=8, c=null, d=9. But it is giving me
a=7, b=8, c=3, d=9. I don't know why it is returning the old data for measurement c.
Any ideas?
NULL has to be handled specially in Oracle, with IS NULL or IS NOT NULL rather than ordinary comparisons.
Presumably your update logic applies some validation to the column and leaves NULL values untreated.
Since that validation fails on NULL, the old value is retained in the table.
Can you please update your question with the query used to UPDATE the table?
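
The pitfall being hinted at, sketched with hypothetical column names since the actual UPDATE isn't shown in the question: in Oracle a comparison with NULL is never true, so update logic that compares against the incoming value simply skips the row when that value is NULL, and the old measurement survives.

-- :new_value is NULL when measurement 'c' is absent from the file.
UPDATE observation_measurement
   SET obs_value = :new_value
 WHERE observation_name = 'c'
   AND obs_value <> :new_value;   -- comparison with NULL matches no rows, the old 3 stays

-- DECODE treats two NULLs as equal, giving a NULL-safe "has it changed?" test:
UPDATE observation_measurement
   SET obs_value = :new_value
 WHERE observation_name = 'c'
   AND DECODE(obs_value, :new_value, 0, 1) = 1;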

How to increase the maximum size of a CSV field in Magento, and where is this located?

I have one field when importing that can contain large data. It seems that CSV has an unofficial limitation of about 65000 (likely 65535*) characters, as both LibreOffice Calc and Magento truncate the data for that particular field. I have investigated thoroughly and I'm certain it is not because of a special character or quotes; the data is pretty straightforward, and the lines are similar in format to each other.
Question: how do I increase that size? Or at least, where should I look to find it?
Note: I counted in LibreOffice Writer and it was about 65040, but with carriage return characters it could probably reach 65535.
I changed:
1) in table catalog_category_entity_text
type of field "value" from "text" to "longtext"
2) in file app/code/core/Mage/ImportExport/Model/Import/Entity/Abstract.php
const DB_MAX_TEXT_LENGTH = 65536;
to
const DB_MAX_TEXT_LENGTH = 16777215;
and all is OK.
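
The table half of that change can be done with a single statement (back up first, and keep any NOT NULL/DEFAULT attributes the column already has); the constant edit in Abstract.php then lets the importer accept values up to the new limit:

ALTER TABLE catalog_category_entity_text MODIFY `value` LONGTEXT;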
You are right, there is a limitation in Magento, because it creates text fields as TEXT in the MySQL database and, according to the MySQL docs, this kind of field supports a maximum of 65,535 bytes.
http://dev.mysql.com/doc/refman/5.0/es/storage-requirements.html
So you could change the column type in your Magento database to use MEDIUMTEXT. I guess the correct place is in the catalog_product_entity_text table, where you should modify the 'value' field type to match your needs. But please, keep in mind this is dangerous. Make a full backup before trying. And you may even need to play with core files... not recommended!
I'm having the same issue with 8 products out of a list of more than 400, and I think I'm not going to mess with the Magento core and database; we can shorten the description strings for those few products.
The CSV format itself couldn't care less. Since Microsoft Access allows Memo fields, which can hold quite a bit of data, I've exported 2-3k descriptions in CSV format to be imported into Magento quite successfully.
Your limitation comes either from a spreadsheet that imposes a cell-size or export limit, or from the field you are trying to import into having a maximum length set for it in its table.
You can determine the latter by using phpMyAdmin to see what the maximum character setting is for that field.
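
If phpMyAdmin isn't at hand, the same check can be run as a query against information_schema; the table names below are the two EAV text tables mentioned in these answers:

SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
  FROM information_schema.COLUMNS
 WHERE TABLE_SCHEMA = DATABASE()
   AND COLUMN_NAME = 'value'
   AND TABLE_NAME IN ('catalog_product_entity_text', 'catalog_category_entity_text');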
