NHibernate schema export issue with Oracle Blob field

I'm having an issue creating my Oracle DB with the SchemaExport function of NHibernate.
For a property defined as byte[], it creates a DB column of type RAW (which is limited to 2000 bytes).
That type is too small for my needs, and I need NHibernate to create a BLOB column instead.
How can I achieve that?
I tried declaring the field in the mapping file (I use XML mapping, thus hbm files), specifying either type="Binary" or type="BinaryBlob", but neither has the desired effect: the created column is always a RAW.
Can anyone help me here?

<property name="prop">
<column name="blobcolumn" sql-type="BinaryBlob">
</property>
Update: maybe this could also do the trick:
<property name="prop" type="Binary" length="1000000"/>

I had a similar problem, and the solution is the length attribute:
<property name="Attachment" length="5224880"/>
If you specify no length, then whatever you write in the type attribute, the column will end up as RAW(2000) in Oracle, because RAW's maximum is 2000 bytes. But if you say you need about 5 MB (here 5224880 bytes), NHibernate switches to BLOB automatically because it is bigger than 2000 bytes.
So given the .NET property
public virtual byte[] Attachment { get; set; }
the proper mapping would be
<property name="Attachment" length="5224880"/>
Or you can explore OracleLiteDialect.cs in the NHibernate source code.
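Alternatively, you can force the generated column type directly. Note that sql-type takes the database type (BLOB here), while the NHibernate type goes in the type attribute; the snippet in the question puts the NHibernate type name into sql-type. A sketch (untested) of a mapping that should emit a BLOB column:
<property name="prop" type="BinaryBlob">
  <column name="blobcolumn" sql-type="BLOB" />
</property>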

If someone wants a convention-based way to translate the byte[] type into a BLOB in the database, I came up with this:
using FluentNHibernate.Conventions;
using FluentNHibernate.Conventions.AcceptanceCriteria;
using FluentNHibernate.Conventions.Inspections;
using FluentNHibernate.Conventions.Instances;

public class ByteArrayToDbBlobConvention : IPropertyConvention, IPropertyConventionAcceptance
{
    // Only apply this convention to byte[] properties.
    public void Accept(IAcceptanceCriteria<IPropertyInspector> criteria)
    {
        criteria.Expect(x => x.Type == typeof(byte[]));
    }

    // Force the column's SQL type to BLOB.
    public void Apply(IPropertyInstance instance)
    {
        instance.CustomSqlType("BLOB");
    }
}
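The convention only takes effect once it is registered with Fluent NHibernate's configuration; a minimal sketch (MyClassMap and the connection-string key are hypothetical placeholders for your own):
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;

var sessionFactory = Fluently.Configure()
    .Database(OracleClientConfiguration.Oracle10
        .ConnectionString(c => c.FromConnectionStringWithKey("MyDb"))) // hypothetical key
    .Mappings(m => m.FluentMappings
        .AddFromAssemblyOf<MyClassMap>()                 // hypothetical mapping class
        .Conventions.Add<ByteArrayToDbBlobConvention>()) // register the convention
    .BuildSessionFactory();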


Don't overwrite extra fields when serializing BSON?

I'm using the C# driver for MongoDB and trying to edit some MongoDB elements. When deserializing BSON, I'm using the [IgnoreExtraElements] attribute to filter out fields I don't really care about editing. The problem is when I try to serialize the elements back into the Mongo database: instead of changing only the fields I've edited, serializing the elements back overwrites the whole object.
For example, I'm changing a Word element with C# properties:
[BsonId]
public ObjectId _id;
public string word;
In MongoDB it also has the element "conjugations", which is a kind of complicated array I don't want to serialize or mess with. But when I try something similar to the below code, it blanks out the conjugations array.
MongoWord word = collection.FindOneAs<MongoWord>(Query.EQ("word","hello"));
word.word = "world";
collection.Save(word);
How can I avoid overwriting extra fields in the Mongo database? Right now I'm trying to write an Update.Set builder query and only update the fields I've changed using Reflection/Generics. Is there something easy like a reverse [IgnoreExtraElements], or an update setting that I'm missing here?
That's why IgnoreExtraElements must be specified manually. You are essentially opting in to potentially losing data.
The correct way to handle this is to actually support extra elements. You can see this section in the documentation for how to do this: https://mongodb.github.io/mongo-csharp-driver/2.12/reference/bson/mapping/#supporting-extra-elements
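For reference, the extra-elements approach from that documentation boils down to a catch-all member; a minimal sketch applied to the Word class from the question:
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

public class MongoWord
{
    [BsonId]
    public ObjectId _id;
    public string word;

    // Catch-all for "conjugations" and any other element not mapped above,
    // so they round-trip unchanged when the document is saved.
    [BsonExtraElements]
    public BsonDocument ExtraElements;
}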
This is the function I ended up using. I didn't want to use [BsonExtraElements], since it seemed unnecessary to pull down the extra elements just to save them again when they hadn't been edited. Instead I'm using the Mongo UpdateBuilder to only update the fields that I've changed/brought down from Mongo. The solution is problematic if I need to clear fields by setting them to null.
/// <summary>
/// Update a Mongo object without overwriting fields that C# thinks are null or doesn't know about.
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="model">The mongo object to update</param>
/// <param name="collection">The collection the object should go in</param>
/// <param name="objectId">The Bson ObjectId of the existing object in the collection</param>
public void UpdateNotNullColumns<T>(T model, MongoCollection<T> collection, ObjectId objectId)
{
    if (objectId == default(ObjectId))
    {
        return;
    }

    // Build an update query with all the non-null fields, using reflection.
    Type type = model.GetType();
    FieldInfo[] fields = type.GetFields();
    UpdateBuilder builder = new UpdateBuilder();
    foreach (var field in fields)
    {
        BsonValue bsonValue = BsonValue.Create(field.GetValue(model));
        if (bsonValue != null)
        {
            builder.Set(field.Name, bsonValue);
        }
    }

    // Actually update!
    collection.Update(Query.EQ("_id", objectId), builder);
}
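With the Word example from the question, a call would look something like this:
MongoWord word = collection.FindOneAs<MongoWord>(Query.EQ("word", "hello"));
word.word = "world";
UpdateNotNullColumns(word, collection, word._id); // only non-null fields are $set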

NHibernate: How to know if, on Flush(), SQL will be sent?

I'm a bit puzzled by NHibernate's IsDirty() method.
Directly after getting a (very large) complex object from my database, NHibernate's ISession.IsDirty() gives 'true'.
IFacadeDAL fd = new FacadeDAL();
// Session's not dirty
IProject proj = fd.GetByID<IProject, string>("123611-3640");
// Session is dirty
However, if I call Commit() like so:
using (ITransaction trans = Facade.Session.Transaction)
{
    trans.Begin();
    Facade.Session.Save(entity);
    trans.Commit();
    return true;
}
this results in no SQL (except for "exec sp_reset_connection").
I have read that due to 'mapping choices' you can get "ghosts" in your session (causing the session to say it's dirty), but wouldn't it then also try to update something? Also, if this is caused e.g. by "converting" a SQL bit to a C# bool, I don't think I can change it... (no clue whether that could be a cause for ghosts, though).
Update 2:
There are several (SQL Server) views and tables involved here. This is the (very) simplified class:
public class Project : IProject
{
    private string id;
    private List<IPlantItem> plantItems;

    public Project() { }

    public virtual string ID
    {
        get { return id; }
    }

    public virtual IEnumerable<IPlantItem> PlantItems
    {
        get { return plantItems; }
    }
}
PlantItems are stored in a table, so I expect that when I change anything in a PlantItem, IsDirty should change to true.
My question is: is there a way to check whether the session at that point, on Flush() (or in my case on Commit(), for that matter), would generate actual SQL statements? And if not: is there another way of (manually) storing some sort of snapshot of the session to compare the current session to?
Update 1: I should really also mention these aspects:
- my FlushMode is set to 'None';
- the underlying data of the 'IProject' object itself is based on a SQL view, and therefore most properties in the mapping are set to update="false";
- when I actually change something in an object and use the same method for saving, SQL update statements are sent (and thus everything is committed just fine).
In my experience, ghosts can be caused by a nullable int column in the database being mapped to an ordinary (non-nullable) int.
When the entity gets hydrated, the nullable db int is converted to zero, and hence the entity is now dirty.
Another way to get dirty records is by specifying a wrong type in the XML mapping, e.g.
public enum Sex
{
    Unspecified,
    Male,
    Female
}
...
public virtual Sex Sex { get; set; }
and specify an int in the mapping.
<property name="Sex" type="int"/>
See this link, which explains in more detail how to test your mappings.
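A quick way to test a mapping for ghosts, based on the symptom described in the question, is to load an entity and immediately ask the session whether it is dirty; a sketch:
using (var session = sessionFactory.OpenSession())
{
    var proj = session.Get<Project>("123611-3640");
    // True right after loading means hydration itself dirtied the
    // entity, i.e. the mapping produces a ghost.
    bool hasGhost = session.IsDirty();
}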
If some of your entities are dirty - and therefore the ISession is dirty - then you have a mismatch between the properties and the database. For example, imagine you have a column in a table that is nullable, but in your code it is set as not null (an int, for example). NHibernate will consider it dirty, because its current value (0 in the case of an integer) is different from the value that came from the db (null). Look for "Ghost properties NHibernate" on Google.
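If you want to see exactly which entities NHibernate considers dirty before any SQL is issued, one option is an interceptor; a minimal sketch using NHibernate's EmptyInterceptor (the logging is illustrative):
using System;
using NHibernate;
using NHibernate.Type;

public class DirtyLoggingInterceptor : EmptyInterceptor
{
    // Called during flush for every entity NHibernate is about to UPDATE.
    public override bool OnFlushDirty(object entity, object id,
        object[] currentState, object[] previousState,
        string[] propertyNames, IType[] types)
    {
        Console.WriteLine("Dirty entity: {0} #{1}", entity.GetType().Name, id);
        return false; // we did not modify the entity's state
    }
}

// Usage: pass the interceptor when opening the session, e.g.
// var session = sessionFactory.OpenSession(new DirtyLoggingInterceptor());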

Finding whether a property is a primary key in the POCO T4 template generator

I'm using the POCO T4 template generator that comes with VS 2012. I made a few changes to include the Entity.Name, but I'm not able to figure out the primary key.
public string EntityClassOpening(EntityType entity)
{
    return string.Format(
        CultureInfo.InvariantCulture,
        "{0} {1}partial class {2}{3}<{4},{5}>{6}",
        Accessibility.ForType(entity),
        _code.SpaceAfter(_code.AbstractOption(entity)),
        _code.Escape(entity),
        ": EntityBase",
        entity.Name,
        entity.Name,
        _code.StringBefore(" ", _typeMapper.GetTypeName(entity.BaseType)));
}
I can't find a way to get the primary key from the EntityType object hierarchy. It exposes the properties, but a property does not carry anything to say it is a primary key.
Any help appreciated.
Just in case anyone is trying to do this while migrating RIA Services stuff: I'm using the standard DbContext template in VS 2013 and have added two things to the entities template.
First you need:
using System.ComponentModel.DataAnnotations;
I put it just under the //---- block near the top.
Then I modified the bit of code that looks like this (just search for the first line). My changes are the ef.IsKey check and the added [Key()] attribute.
var simpleProperties = typeMapper.GetSimpleProperties(entity);
if (simpleProperties.Any())
{
    foreach (var edmProperty in simpleProperties)
    {
#>
<# if (ef.IsKey(edmProperty)) { #>
    [Key()]
<# } #>
    <#=codeStringGenerator.Property(edmProperty)#>
<#
    }
}
Use the EntityType.KeyMembers property to get the properties that make up the primary key.
I added this to the TypeMapper section and was delighted with the results:
public IEnumerable<EdmProperty> GetPrimaryKeyProperties(EntityType type)
{
    return type.KeyMembers.Select(s => (EdmProperty)s);
}
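In the template you can then test any property against that list, for example (illustrative):
var keyProperties = typeMapper.GetPrimaryKeyProperties(entity);
bool isPrimaryKey = keyProperties.Contains(edmProperty); // requires System.Linq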

Guid values in Oracle with fluentnhibernate

I've only been using Fluent NHibernate for a few days, and it has been going fine until I tried to deal with Guid values and Oracle. I have read a good few posts on the subject, but none that help me solve the problem I am seeing.
I am using Oracle 10g express edition.
I have a simple test table in Oracle:
CREATE TABLE test (Field RAW(16));
I have a simple class and interface for mapping to the table
public class Test : ITest
{
    public virtual Guid Field { get; set; }
}

public interface ITest
{
    Guid Field { get; set; }
}
The class map is simple:
public class TestMap : ClassMap<Test>
{
    public TestMap()
    {
        Id(x => x.Field);
    }
}
I start by trying to insert a simple, easily recognised guid value:
00112233445566778899AABBCCDDEEFF
Here's the code:
var test = new Test { Field = new Guid("00112233445566778899AABBCCDDEEFF") };
// test.Field == 00112233445566778899AABBCCDDEEFF here.
session.Save(test);
// After the save the guid has changed: test.Field == 09a3f4eefebc4cdb8c239f5300edfd82.
// This value is different for each run, so I presume NHibernate is assigning
// a value internally.
transaction.Commit();
IQuery query = session.CreateQuery("from Test");
// or
// IQuery query = session.CreateSQLQuery("select * from Test").AddEntity(typeof(Test));
var t = query.List<Test>().Single();
// t.Field == 8ef8a3b10e704e4dae5d9f5300e77098
// This value never changes between runs.
The value actually stored in the database also differs each time; for the run above it was
EEF4A309BCFEDB4C8C239F5300EDFD82
Truly confused....
Any help much appreciated.
EDIT: I always delete the data from the table before each test run. Also, using ADO directly works with no problems.
EDIT: OK, my first problem was that even though I thought I was dropping the data from the table via the SQL command line for Oracle, when I viewed the table via the Oracle UI it still had data, and the first guid was, as I should have expected, 8ef8a3b10e704e4dae5d9f5300e77098.
Fluent NHibernate still appears to be altering the guid value on save. It alters it to the value it stores in the database, but I'm still not sure why it is doing this or how/if I can control it.
If you intend to assign the id yourself, you will need to use a different id generator than the default, which is guid.comb. You should be using assigned instead. So your mapping would look something like this:
Id(x => x.Field).GeneratedBy.Assigned();
You can read more about id generators in the NHibernate documentation here:
http://www.nhforge.org/doc/nh/en/index.html#mapping-declaration-id-generator

Azure Table Storage, WCF Service and Enum

Here's my problem. A class which defines an order has a property called PaymentStatus, which is an enum defined like so:
public enum PaymentStatuses : int
{
    OnDelivery = 1,
    Paid = 2,
    Processed = 3,
    Cleared = 4
}
And later on, in the class itself, the property definition is very simple:
public PaymentStatuses? PaymentStatus { get; set; }
However, if I try to save an order to the Azure Table Storage, I get the following exception:
System.InvalidOperationException: The type 'Order+PaymentStatuses' has no settable properties.
At this point I thought using enum isn't possible, but a quick Google search returned this: http://social.msdn.microsoft.com/Forums/en-US/windowsazure/thread/7eb1a2ca-6c1b-4440-b40e-012db98ccb0a
This page lists two answers, one of which seems to ignore the problems and suggests that using an enum in Azure Storage is fine.
Now, I don't NEED to store the enum in Azure Table Storage as such; I could just as well store a corresponding int. However, I do need this property to be exposed in the WCF service.
I've tried making the property use get and set to return the enum from a stored integer, and removing this property from Azure by using the WritingEntity event on my DataContext, but I get that exception before the event for this entity is fired.
At this point I'm at a loss: I don't know what else I can do to have this property in WCF as an enum, but have Azure store just the int.
Enum is not supported. Even though it is defined like an int, it is really not an integral type supported by Table Storage. Here is the list of supported types. An enum is just a string expression of an integral number, with an object-oriented flavor.
You can store an int in table storage and then convert it using Enum.Parse.
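For example, a sketch of both directions:
int stored = (int)PaymentStatuses.Paid;              // write side: persist the int
var status = (PaymentStatuses)Enum.Parse(
    typeof(PaymentStatuses), stored.ToString());     // read side, as suggested above
var status2 = (PaymentStatuses)stored;               // or simply cast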
Here's a simple workaround:
public int MyEnumValue { get; set; } // for use by the Azure client libraries only

[IgnoreProperty]
public MyEnum MyEnum
{
    get { return (MyEnum)MyEnumValue; }
    set { MyEnumValue = (int)value; }
}
It would have been nicer if a simple backing value could have been employed rather than an additional (public!) property, without the hassle of overriding ReadEntity/WriteEntity of course. I opened a user voice ticket that would facilitate that, so you might want to upvote it.
Yeah, I was having this same problem.
I changed my property, which was previously an enum, to an int. This int property now parses the incoming int and saves it into a variable of the same enum type, so the code that was
public CompilerOutputTypes Type { get; set; }
is changed to
private CompilerOutputTypes type;
public int Type
{
    get { return (int)type; }
    set { type = (CompilerOutputTypes)value; }
}
Just suggestions...
I remember that in WCF you have to mark enums with special attributes: http://msdn.microsoft.com/en-us/library/aa347875.aspx
Also, when you declare PaymentStatuses? PaymentStatus, you are declaring Nullable<PaymentStatuses> PaymentStatus. The ? syntax is just syntactic sugar. Try removing the ? and see what happens (you could add a PaymentStatuses.NoSet = 0, because the default value for an Int32 is 0).
Good luck.
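For reference, the attributes from that MSDN page applied to the enum in the question would look like this (a sketch):
using System.Runtime.Serialization;

[DataContract]
public enum PaymentStatuses : int
{
    [EnumMember] OnDelivery = 1,
    [EnumMember] Paid = 2,
    [EnumMember] Processed = 3,
    [EnumMember] Cleared = 4
}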
Parv's solution put me on the right track, but I had to make some minor adjustments.
private string _EnumType;
private EnumType _Type;

public string EnumType
{
    get { return _Type.ToString(); }
    set
    {
        _EnumType = value;
        try
        {
            _Type = (EnumType)Enum.Parse(typeof(EnumType), value);
        }
        catch (Exception)
        {
            _EnumType = "Undefined";
            _Type = [mynamespace].EnumType.Undefined;
        }
    }
}
I have come across a similar problem and have implemented a generic object flattener/recomposer API that will flatten your complex entities into flat EntityProperty dictionaries and make them writeable to Table Storage, in the form of DynamicTableEntity.
The same API will then recompose the entire complex object back from the EntityProperty dictionary of the DynamicTableEntity.
This is relevant to your question because the ObjectFlattenerRecomposer API supports flattening property types that are normally not writeable to Azure Table Storage, like Enum, TimeSpan, all Nullable types, ulong and uint, by converting them into writeable EntityProperties.
The API also handles the conversion back to the original complex object from the flattened EntityProperty dictionary. All the client needs to do is tell the API: I have this EntityProperty dictionary that I just read from an Azure table (in the form of DynamicTableEntity.Properties); can you convert it to an object of this specific type? The API will recompose the full complex object, with all of its properties, including 'Enum' properties with their original correct values.
All of this flattening and recomposing of the original object is done transparently to the client (user of the API). The client does not need to provide any schema or any knowledge about the complex object that it wants to write; it just passes the object to the API as 'object' to flatten it. When converting back, the client only needs to provide the actual type it wants the flattened EntityProperty dictionary to be converted to. The generic ConvertBack method of the API will simply recompose the original object of type T and return it to the client.
See the usage example below. The objects do not need to implement any interface like 'ITableEntity' or inherit from a particular base class either. They do not need to provide a special set of constructors.
Blog: https://doguarslan.wordpress.com/2016/02/03/writing-complex-objects-to-azure-table-storage/
Nuget Package: https://www.nuget.org/packages/ObjectFlattenerRecomposer/
Usage:
// Flatten the object (e.g. of type Order) and convert it to an EntityProperty dictionary.
Dictionary<string, EntityProperty> flattenedProperties = EntityPropertyConverter.Flatten(order);

// Create a DynamicTableEntity and set its PK and RK.
DynamicTableEntity dynamicTableEntity = new DynamicTableEntity(partitionKey, rowKey);
dynamicTableEntity.Properties = flattenedProperties;

// Write the DynamicTableEntity to Azure Table Storage using the client SDK.

// Read the entity back from Azure Table Storage as a DynamicTableEntity using the same PK and RK.
DynamicTableEntity entity = [Read from Azure using the PK and RK];

// Convert the DynamicTableEntity back to the original complex object.
Order order = EntityPropertyConverter.ConvertBack<Order>(entity.Properties);
