Convert Int32 to Oracle NUMBER(5) with EF4

I am using EF 4 (database first, model fully generated from it) with an Oracle 10g database, and I have a problem with one field.
My field is defined as a NUMBER(5) in my database. In my model, EF has defined it as a short.
My problem is that I have some values greater than 32,767 (the max of a short).
I found this post: Entity Framework generates short instead of int. I followed the instructions and it works; my model now contains Int32 values.
But I have a new problem:
Error 2019: Member Mapping specified is not valid. The type 'Edm.Int32[Nullable=True,DefaultValue=]' of member 'XX' in type 'Model.XXX' is not compatible with 'OracleEFProvider.number[Nullable=True,DefaultValue=,Precision=5,Scale=0]' of member 'XX' in type 'Model.Store.XXX'.
This error always shows up in the Error List tab of Visual Studio. However, the build succeeds, and it half works:
reading a value from the database works
writing a value does not work: 99999 was transformed into -31073 (see edit)
Is there a solution to make it work both ways?
BTW, is there any way to tell Entity Framework to use Int32 for Oracle INTEGER fields? It uses decimal by default.
EDIT
While debugging step by step, I found why my value was -31073. I had forgotten about this line:
dao.Value = (short)dto.Value;
Both values were int, but the narrowing conversion to short on that line was the cause.
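For reference, a minimal sketch (plain console code, not from the question) of why that cast yields exactly -31073:
using System;

class ShortOverflowDemo
{
    static void Main()
    {
        int value = 99999;                          // fits in an int, not in a short
        short truncated = unchecked((short)value);  // keeps only the low 16 bits
        // 99999 mod 65536 = 34463, which read back as a signed 16-bit value is -31073
        Console.WriteLine(truncated);               // prints -31073
        // Keeping both sides as int (dao.Value = dto.Value;) avoids the truncation.
    }
}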

I found how to remove the error.
I edited the .edmx file in XML mode and found my field in the SSDL section:
<Property Name="SIT_INSEE" Type="number" Precision="5" />
I removed the Precision="5" attribute and the warning disappeared.

Just to add my two cents in case anyone else is having similar problems: I noticed that if you add the following mapping to web.config and then recreate the edmx model from scratch (delete it and re-generate it from the database), it resolves some of these issues. Simply adding the values to web.config on its own resolved nothing (my guess is that regenerating rebuilds some of the code behind the scenes).
<oracle.dataaccess.client>
  <settings>
    <add name="int32" value="edmmapping number(9,0)" />
  </settings>
</oracle.dataaccess.client>

Related

Datamapper type for char (aka character) field type

I'm working with an open source database. I'm trying to map it to classes with DataMapper, and later I'm going to make changes following a model-driven approach instead of a database-driven one.
But first I would like to map the open source database in an exact way. This database is a PostgreSQL one and in some tables there are some fields with a character type.
How can I map the character type in DataMapper? This type is not among its primitive types, nor in dm-types, nor in dm-types-legacy.
In case it gives more information: I'm not actually writing the model by hand; I'm using dm-is-reflective, which automatically maps an existing database table. It gives me the following error:
/var/lib/gems/1.9.1/gems/dm-is-reflective-1.0.0/lib/dm-is-reflective/is/adapters/data_objects_adapter.rb:141:in `reflective_lookup_primitive': bpchar not found for DataMapper::Adapters::PostgresAdapter (TypeError)
EDIT
It was a problem with dm-is-reflective and not with the DataMapper core, which handles the char type fine as a String with a length set. I answered with the solution to the problem.
godfat, the maintainer of dm-is-reflective, quickly solved this issue :) Many thanks to him!
https://github.com/godfat/dm-is-reflective/issues/3#issuecomment-5726650

Entity Framework is padding out my text fields although they are not Fixed Length

I am building an MVC3 site using Entity Framework 4 and I'm having a problem with fixed length fields.
When I look at my code while debugging, it shows that MyEntity.Title="Hello name " with the title padded out to the maximum length of the field.
This is usually a question of having a fixed field length in the EDMX file or using a char data type in the underlying database rather than a varchar. In this case neither of those is correct; however, it is possible that the problem fields were of fixed length originally. I have manually changed each field in the EDMX (and the model has been regenerated), and the fields were never fixed length in the database (which was the starting point for the application), so I guess that the need to pad out the fields is being stored somewhere in the Entity Framework configuration and hasn't been updated.
The problem occurs with new records when they are added to the database: when the object is created the Title is correct, but when it is instantiated from the database it is padded.
What do I need to do in order to get rid of the padding, which is really screwing up my string comparisons unless I trim everything?
It turns out that in the .EDMX file the padded fields were still listed as nchar. This wasn't visible through the Model Editor; the only way to change it was to right-click on the model in Visual Studio, select "Open With..." and use an XML editor. The offending fields looked like this:
<Property Name="MyProperty" Type="nchar" Nullable="false" MaxLength="50" />
Changing the Type to nvarchar and running the template again seemed to clear the problem up.
Existing fields do not get updated in the model when you update it from the database. You either have to delete the entities from the model, or manually change those fields to the new values.
Check the property types in the Model Browser and make sure they are correct.
Change your Title field to have the Fixed Length property set equal to true. It's probably defaulting to none :)
Make sure you make the change in the dbase first and then update your edmx.
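Until the model is fixed, the comparisons mentioned in the question can be made padding-tolerant by trimming; a stopgap sketch only (variable names are placeholders):
// Trailing spaces come from the nchar column, so trim before comparing.
bool sameTitle = string.Equals(
    (myEntity.Title ?? string.Empty).TrimEnd(),
    (expectedTitle ?? string.Empty).TrimEnd(),
    StringComparison.Ordinal);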

Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.

Please help, I have been trying to fix this error for the better part of 8 hours so far. I have a report in Crystal Reports that just started throwing this error. I changed a field in the view that is attached to the report, so I opened up my XSD file in VS2010, renamed the current DT to ViewTracker0, and then pulled in the ViewTracker view. I added my queries from the old DT, ensured that there is no primary key, double-checked that the field lengths are the same as in the DB, and made sure that each column name matches the DB. I can preview my data fine in the XSD, and in SQL I can run my queries and everything is returned correctly. But when I run my report, it dies every time with this error.
Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.
What do I need to check next?
Have you tried verifying the database within the Crystal Reports designer, then running the report?
Try traversing GetErrors(), described here:
http://www.fransson.net/blog/failed-to-enable-constraints-one-or-more-rows-contain-values-violating-non-null-unique-or-foreign-key-constraints/
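A rough sketch of that approach (the typed DataSet and adapter names are placeholders, and it assumes the exception surfaces from the adapter's Fill call):
// Requires System.Data and System.Diagnostics.
try
{
    viewTrackerAdapter.Fill(reportDataSet.ViewTracker0);   // the Fill that currently throws
}
catch (ConstraintException)
{
    foreach (DataTable table in reportDataSet.Tables)
    {
        if (!table.HasErrors)
            continue;

        // GetErrors() returns only the rows that violated a constraint;
        // RowError and GetColumnError say which constraint and column.
        foreach (DataRow row in table.GetErrors())
        {
            Debug.WriteLine(table.TableName + ": " + row.RowError);
            foreach (DataColumn column in row.GetColumnsInError())
                Debug.WriteLine("  " + column.ColumnName + ": " + row.GetColumnError(column));
        }
    }
}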

Mapping a long text string in Oracle and NHibernate

Using NHibernate 3.1 with both SQL Server and Oracle DBs, we need to store a text string that is longer than 4000 characters. The text is actually XML, but that is not important - we just want to treat it as raw text. With SQL Server, this is easy. We declare the column as NVARCHAR(MAX) and map it thusly:
<property name="MyLongTextValue" length="100000"/>
The use of the length property tells NHibernate to expect a string that may be longer than 4000 characters.
For the life of me, I cannot figure out how to make this work on Oracle 11g. I've tried declaring the column as both XMLTYPE and LONG with no success. In the first case, we end up with ORA-01461: can bind a LONG value only for insert into a LONG column when trying to insert a row. In the second case, the data is inserted correctly but comes back as an empty string when querying.
Does anyone know how to make this work? The answer has to be compatible with both SQL Server and Oracle. I'd rather not have to write custom extensions such as user types and driver subclasses. Thanks.
You should use something like this:
<property name="MyLongTextValue" length="100000" type="StringClob" not-null="false"/>
This should work with the Oracle CLOB type and the SQL Server NTEXT type.
Make sure the property on your model is nullable
public virtual string MyLongTextValue {get;set;}
You should always use the Oracle.DataAccess provider when dealing with CLOBs.
For anyone this may interest, I solved my problem by following step 3 of this article:
3. Using correct Mapping attributes: type="AnsiString"
Normally we can use the default type="String" for CLOB/NCLOB. Try type="AnsiString" if the two steps above do not work.
<property name="SoNhaDuongPho" column="SO_NHA_DUONG_PHO" type="AnsiString"/>
In my case I set it with FluentNHibernate:
.CustomType("AnsiString")
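In context, that call sits on the property mapping; a sketch using the column name from the XML above:
// Inside a ClassMap<T> constructor:
Map(x => x.SoNhaDuongPho)
    .Column("SO_NHA_DUONG_PHO")
    .CustomType("AnsiString");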
You might be interested in this article.
<property column="`LARGE_STRING`" name="LargeString" type="StringClob" sql-type="NCLOB" />
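For completeness, an equivalent Fluent NHibernate mapping might look like the following sketch (the entity class and its Id are assumed):
using FluentNHibernate.Mapping;

public class MyEntity
{
    public virtual int Id { get; set; }
    public virtual string LargeString { get; set; }
}

public class MyEntityMap : ClassMap<MyEntity>
{
    public MyEntityMap()
    {
        Id(x => x.Id);
        Map(x => x.LargeString)
            .Column("LARGE_STRING")
            .CustomType("StringClob")   // maps the property as a CLOB-backed string
            .CustomSqlType("NCLOB")     // only affects generated schema
            .Length(100000);
    }
}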

"Cannot change DataType of a column once it has data" error in Visual Studio 2005 DataSet Designer

I've got a DataSet in VisualStudio 2005. I need to change the datatype of a column in one of the datatables from System.Int32 to System.Decimal. When I try to change the datatype in the DataSet Designer I receive the following error:
Property value is not valid. Cannot change DataType of a column once
it has data.
From my understanding, this should be changing the datatype in the schema for the DataSet. I don't see how there can be any data to cause this error.
Does anyone have any ideas?
I get the same error, but only for columns whose DefaultValue is set to anything other than the default <DBNull>. The way I got around this issue was:
Set the column's DefaultValue to <DBNull>
Save and reopen the dataset
Since filled DataTables do not allow a change to the schema, a workaround can be applied as follows (see the sketch below):
Make a new DataTable
Use the DataTable's Clone method to create a table with the same structure, and change the type of that column
Finally, use the DataTable's ImportRow method to populate it with data
HTH
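A rough sketch of that workaround (table and column names are placeholders, and it assumes the old values convert cleanly to the new type):
// Clone() copies the structure but no rows, so the column type can still be changed.
DataTable original = dataSet.Tables["MyTable"];
DataTable modified = original.Clone();
modified.Columns["Amount"].DataType = typeof(decimal);

// ImportRow copies each row; compatible values are converted to the new column type.
foreach (DataRow row in original.Rows)
    modified.ImportRow(row);

// Swap the tables if the DataSet should keep using the original name.
dataSet.Tables.Remove(original);
dataSet.Tables.Add(modified);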
For those finding this via Google who have a slightly different case, where the table already has data and you are adding a new column (like me): if you create the column and set the data type in separate statements, you get this same exception. However, if you do it in the same statement, it works fine.
So, instead of this:
var column = myTable.Columns.Add("Column1");
column.DataType = typeof(int); //nope, exception!
Do this:
var column = myTable.Columns.Add("Column1", typeof(int));
I have found a workaround. If I delete the data column and add it back with the different data type, then it works.
Close the DataSet in the visual designer
Right click the dataset, choose Open With...
Choose XML (Text) Editor
Find the column in the XML; in your dataset it will look something like:
<xs:element name="DataColumn1"
msprop:Generator_ColumnVarNameInTable="columnDataColumn1"
msprop:Generator_ColumnPropNameInRow="DataColumn1"
msprop:Generator_ColumnPropNameInTable="DataColumn1Column"
msprop:Generator_UserColumnName="DataColumn1"
type="xs:int"
minOccurs="0" />
Change the type="xs:int" to type="xs:decimal"
Save and close the XML editor
You may need to right click the DataSet again and choose Run Custom Tool
It's an old question, but it can still happen in VS 2019.
Solution:
Change the DefaultValue to <DBNull>
Save the Dataset
Close the DataSet Designer
Re-Open the Designer
Now it should be possible to change the type without any problem.
