Cannot import solution because index size too large - dynamics-crm

I am experiencing the following error for a custom entity:
"Index size exceeded the size limit of 900 bytes. The key is too large. Try removing some columns or making the strings in string columns shorter."
I looked at the key and it previously had a max length of 300. I reduced it to 20 since it is a phone number field, but it still fails to import with the same error. I also increased it to 450 based on similar Dynamics questions I found online, but no dice. How can I get around this error? Where should I be looking?

Is your field a find column in the quick find view?
If yes, that's the reason: an index is created automatically over the find columns, and there is a limit on the combined key size it can cover. Shortening a single column won't help if the other find columns still push the total over the limit.
This is a limitation on the SQL Server side:
https://learn.microsoft.com/en-us/sql/sql-server/maximum-capacity-specifications-for-sql-server?view=sql-server-2017
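To see why shortening a single column may not help, here is a rough Python sketch of the arithmetic SQL Server applies, assuming the find columns are nvarchar (2 bytes per declared character); the column names and lengths below are hypothetical examples, not taken from any real entity:

```python
# Sketch: worst-case key size of a composite index over nvarchar columns.
# SQL Server nvarchar uses 2 bytes per declared character, and the classic
# index key limit is 900 bytes.

SQL_SERVER_INDEX_KEY_LIMIT = 900  # bytes

def index_key_bytes(nvarchar_lengths):
    """Worst-case key size in bytes for an index over nvarchar columns
    with the given declared lengths."""
    return sum(2 * n for n in nvarchar_lengths)

# Quick Find builds ONE index over ALL find columns, so shortening a
# single column is not enough if the combined size still exceeds 900:
find_columns = {"phonenumber": 20, "name": 300, "description": 200}
total = index_key_bytes(find_columns.values())
print(total, total > SQL_SERVER_INDEX_KEY_LIMIT)  # prints: 1040 True
```

Note that a single nvarchar(450) column is exactly 900 bytes, which is why 450 shows up as a magic number in similar Dynamics questions.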


FoxPro ERP throwing "Numeric Overflow" error. No support

So, a company I work at has an older ERP system that uses FoxPro 4 or 5. There is no support for the system, so I am trying to use skills that I don't possess. I'm good with servers and even networks, but not coding. I have attached links to two similar errors that are occurring for two different users in different departments using different computers. Your help would be appreciated.
FoxPro Error 1
FoxPro Error 2
Well, the problem is exactly what it says on the tin. It looks like the issue is with the field BODY.COST. The field will have a maximum capacity, for example N(12, 2) would allow numbers up to 999999999.99 to be stored in it.
The system is attempting to put a number that is bigger than the defined capacity into this field. You can see it is a GATHER MEMVAR statement in both cases. This statement takes memory variables and updates a database table using them. One of the memory variables has ended up with a bigger number in it than the database field (looks like BODY.COST) that is intended to store it has capacity for.
Beyond that, with no support and no source code you are really limited to looking at what the user is trying to post and seeing if that gives you any clues. Is that the extent of the error dumps or are those just snippets?
The messages are saying that you are trying to store a larger value than the field would accept. This happens with numeric and float fields in Foxpro. In both of the messages, the table was indirectly aliased as "BODY" and the problematic field is "COST".
As a solution, using VFP5 (do not use a later version; note there was no VFP4), you can change all the numeric and float fields to either the Currency or the Double data type.
Currency has high precision and is suggested for monetary values (though the values need not be monetary). Its range is –922337203685477.5808 to 922337203685477.5807, which is above what a numeric/float field can support.
If you think that is not enough range, then you can use Double (roughly ±10^308; VFP keeps about 15 significant digits, so you lose precision beyond that).
I would go with Currency.
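The capacity arithmetic for a FoxPro numeric field can be sketched in Python (the N(12, 2) width and the BODY.COST field come from the answer above; the sign digit is ignored for simplicity):

```python
# Sketch of how a FoxPro numeric field N(w, d) limits stored values:
# w is the total width including the decimal point, d the decimal places,
# so N(12, 2) leaves 12 - 1 - 2 = 9 integer digits (max 999999999.99).
# The sign is ignored here; negative values consume one more position.

def numeric_field_max(width, decimals):
    """Largest value an N(width, decimals) field can hold."""
    int_digits = width - 1 - decimals if decimals else width
    return float("9" * int_digits + ("." + "9" * decimals if decimals else ""))

def fits(value, width, decimals):
    """True if value can be stored without a 'Numeric overflow' error."""
    return abs(value) <= numeric_field_max(width, decimals)

print(numeric_field_max(12, 2))       # prints: 999999999.99
print(fits(1_000_000_000.00, 12, 2))  # prints: False -> numeric overflow
```

A GATHER MEMVAR hitting this is exactly the situation in the error dumps: a memory variable holds a value that fails this check for the target field.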

Magento Terms & Conditions max character limit

I have a problem here that even after hours of searching with my friend Google, I'm still getting no results...
My Terms & Conditions text is larger than the maximum character limit of the Magento section for it.
So I would like to know if one of you could please help me locate the file and the line to edit, to make that maximum larger and let me put in my whole Terms & Conditions without problems.
Thank you very much in advance for your time.
sincerely,
Nicolas
The T&C content is stored in the checkout_agreement table in a field named content
This field is assigned the datatype TEXT, which holds at most 65,535 bytes; the actual number of characters depends on how many bytes your UTF-8 encoded text uses.
You could change the datatype to MEDIUMTEXT (maximum length 16 MB) or LONGTEXT (maximum length 4 GB).
Testing this will be necessary to make sure no validation limits have been imposed on the entry template.
You can modify the table structure of table checkout_agreement by changing the data type of content field from TEXT to LONGTEXT to allow for more characters.
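A quick Python sketch of why the limit is in bytes rather than characters, which matters for UTF-8 encoded terms (the MySQL size constants are from the MySQL documentation):

```python
# MySQL TEXT stores at most 65,535 BYTES, not characters, so UTF-8
# multibyte text hits the limit sooner than its character count suggests.

MYSQL_TEXT_MAX = 65_535               # bytes
MYSQL_MEDIUMTEXT_MAX = 16_777_215     # bytes (16 MB)
MYSQL_LONGTEXT_MAX = 4_294_967_295    # bytes (4 GB)

def fits_in_text(s: str, limit: int = MYSQL_TEXT_MAX) -> bool:
    """True if the UTF-8 encoding of s fits in a column of the given size."""
    return len(s.encode("utf-8")) <= limit

ascii_terms = "a" * 60_000      # 60,000 chars -> 60,000 bytes: fits
accented_terms = "é" * 60_000   # 60,000 chars -> 120,000 bytes: truncated
print(fits_in_text(ascii_terms))     # prints: True
print(fits_in_text(accented_terms))  # prints: False
```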

rethinkdb: "RqlRuntimeError: Array over size limit" even when using limit()

I'm trying to access a constant number of the latest documents of a table ordered by a "date" key. Note that the date, unfortunately, was implemented (not by me...) such that the value is set as a string, e.g "2014-01-14", or sometimes "2014-01-14 22:22:22". I'm getting a "RqlRuntimeError: Array over size limit 102173" error message when using the following query:
r.db('awesome_db').table("main").orderBy(r.desc("date"))
I tried to overcome this problem by specifying a constant limit, since for now I only need the latest 50:
r.db('awesome_db').table("main").orderBy(r.desc("date")).limit(50)
Which ended with the same error. So, my questions are:
How can I get a constant number of the latest documents by date?
Is ordering by a string-based date field possible? Does this issue have something to do with my first question?
The reason you get an error here is that the orderBy gets evaluated before the limit, so it orders the entire table in memory, which is over the array limit. The way to fix this is by using an index. Try doing the following:
table.indexCreate("date")
table.indexWait()
table.orderBy({index: r.desc("date")}).limit(50)
That should be equivalent to what you have there but uses an index so it doesn't require loading the entire table into memory.
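As a plain-Python analogy (not RethinkDB internals), the indexed query behaves like a streaming top-k selection rather than a full in-memory sort:

```python
# Plain-Python analogy: orderBy without an index must materialize and sort
# the whole table, while an index lets the server stream documents in order
# and stop after the first 50.

import heapq

# A generated stand-in for the table; the field layout is hypothetical.
docs = ({"id": i, "date": f"2014-01-{(i % 28) + 1:02d}"} for i in range(1_000_000))

# Like orderBy(r.desc("date")).limit(50) WITH an index: keeps only ~50
# documents in memory instead of sorting a million-element array.
latest_50 = heapq.nlargest(50, docs, key=lambda d: d["date"])
print(len(latest_50), latest_50[0]["date"])  # prints: 50 2014-01-28
```

As for the second question: zero-padded "YYYY-MM-DD" strings compare lexicographically in chronological order, so ordering by such a string field does work, and a value like "2014-01-14 22:22:22" still sorts after "2014-01-14" for the same day.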
Alternatively, you can raise the array limit through the run options, although this just moves the ceiling rather than avoiding the in-memory sort. In Go (gorethink):
ro := r.RunOpts{ArrayLimit: 500000}
r.DB("wrk").Table("log").Run(sessionArray[0], ro)
The equivalent in Python:
r.db('awesome_db').table("main").run(session, array_limit=500000)

PeopleSoft Payroll Interface Field length

I have added a field to a Payroll Interface definition. I am using the delivered field TEXT254. The field where you define the length in bytes (on the field definition table) is three characters, so it would appear that you can define the length as 999 bytes. The PI process fails when I set the length to 999 bytes; it worked once I lowered it to 150 bytes. I am experimenting with it, slowly increasing the value, and I'm wondering if anyone knows what the limit really is. Our PI takes 3 hours to run, so experimenting takes a long time.
edit - I cut down the runtime by getting rid of all but one company. The largest byte size that I seem to be able to get to run is 240. I did some research, and it looks like when you build your tables, Oracle will set the field to VARCHAR2(n*3) where n is the size of the field specified in AppDesigner. Sure enough, the script generated by the Project...Build sets my field to VARCHAR2(762).
This is what I found - the data that the PI exports is pretty much unlimited - in the PI_PARTIC_EXPT table, the EXPORT_ROW field is 250 characters. If the row you're exporting exceeds this, a new row is inserted with a new sequence number (export_seq), and the data is continued in the EXPORT_ROW field in this new row.
There is, however, a limit to an individual field that the PI can handle, and that is 240 characters; once I limited the field to 240 characters, all was well.
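The continuation behavior described above can be sketched as follows (the 250-character EXPORT_ROW length and the export_seq sequence number come from the answer; the splitting code itself is only an illustration, not PeopleSoft source):

```python
# Sketch of the PI_PARTIC_EXPT continuation mechanism: when a logical
# export row exceeds EXPORT_ROW's 250 characters, additional rows are
# written with an incremented sequence number (export_seq).

EXPORT_ROW_LEN = 250

def to_export_rows(row_data: str):
    """Split one logical export row into (export_seq, chunk) tuples."""
    return [
        (seq, row_data[i:i + EXPORT_ROW_LEN])
        for seq, i in enumerate(range(0, len(row_data), EXPORT_ROW_LEN), start=1)
    ]

rows = to_export_rows("X" * 620)  # a 620-character logical row
print([(seq, len(chunk)) for seq, chunk in rows])
# prints: [(1, 250), (2, 250), (3, 120)]
```

So the exported row as a whole is effectively unlimited; the 240-character ceiling applies per individual field, not per row.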

How to increase maximum size of csv field in Magento, where is this located

I have one field when importing that can contain large data. It seems that CSV has an unofficial limitation of about 65,000 (likely 65,535*) characters, as both LibreOffice Calc and Magento truncate the data for that particular field. I have investigated thoroughly and I'm certain it is not because of a special character or quotes; the data is pretty straightforward, and the lines are similar in format to each other.
Question: how can I increase that size? Or at least, where should I look to find it?
Note: I counted in LibreOffice Writer and it was about 65,040, but with carriage-return characters it could probably reach 65,535.
I changed:
1) in table catalog_category_entity_text, the type of the value field from text to longtext
2) in file app/code/core/Mage/ImportExport/Model/Import/Entity/Abstract.php,
const DB_MAX_TEXT_LENGTH = 65536;
to
const DB_MAX_TEXT_LENGTH = 16777215;
and all is OK.
You are right, there is a limitation in Magento, because it sets text fields as TEXT in the MySQL database and, according to the MySQL docs, this kind of field holds a maximum of 65,535 bytes.
http://dev.mysql.com/doc/refman/5.0/es/storage-requirements.html
So you could change the column type in your Magento database to MEDIUMTEXT. I guess the correct place is the catalog_product_entity_text table, where you should modify the value field's type to match your needs. But please keep in mind this is dangerous: make a full backup before trying. And you may even need to play with core files... not recommended!
I'm having the same issue with 8 products from a list of more than 400, and I think I'm not going to mess with Magento core and database, we can reduce the description strings for those few products.
The CSV format itself couldn't care less. Since Microsoft Access allows Memo fields, which can hold quite a bit of data, I've exported 2-3k-character descriptions in CSV format and imported them into Magento quite successfully.
Your limitation comes either from a spreadsheet that has a per-cell or export limitation, or from the field you are importing into having a maximum character length set in its table.
You can check the latter by using phpMyAdmin to see what the maximum character setting is for that field.
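To back up the point that CSV itself does not truncate, here is a small Python check; the only limit in Python's csv module is its own parser guard, csv.field_size_limit(), which defaults to 131,072 and can be raised if needed:

```python
# The CSV format imposes no field-size limit; it is the consuming tool
# (spreadsheet or database column) that truncates. Python's csv module
# round-trips a 100,000-character field without loss.

import csv
import io

big_field = "x" * 100_000
buf = io.StringIO()
csv.writer(buf).writerow(["sku-1", big_field])

buf.seek(0)
row = next(csv.reader(buf))
print(len(row[1]))  # prints: 100000 -- nothing truncated by CSV itself
```

If a field longer than the parser guard ever shows up, csv.field_size_limit(new_limit) raises the guard and returns the old value.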
