I'm trying to create an Entity Data Model using Visual Studio 2012 and Oracle 10g.
I'm getting this error:
Error 1 Running transformation: The types of all properties in the
Dependent Role of a referential constraint must be the same as the
corresponding property types in the Principal Role. The type of
property 'QUARTAL_SEC_ID' on entity 'Model.QUARTAL' does not match the
type of property 'SEC_ID' on entity 'Model.SEC' in the referential
constraint 'FK_QUARTAL_SEC_ID'.
Indeed, sec_id is number(32) and quartal_sec_id is number(10) in the database. I can't change them there.
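For reference, the precisions the model designer reads can be confirmed straight from the data dictionary:
SELECT table_name, column_name, data_precision, data_scale
FROM   user_tab_columns
WHERE  table_name IN ('SEC', 'QUARTAL')
AND    column_name IN ('SEC_ID', 'QUARTAL_SEC_ID');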
I have changed both types to Int32 in the Mapping Details window in VS 2012, but it doesn't help.
Is it possible to solve this problem in VS 2012? Can I force it to accept different types on the two sides of a foreign key?
I can't map number(10) (Oracle) to decimal (VS), because then I get an additional error:
Error 2 Error 2019: Member Mapping specified is not valid. The type
'Edm.Decimal[Nullable=False,DefaultValue=,Precision=,Scale=]' of
member 'QUARTAL_SEC_ID' in type 'Model.QUARTAL' is not compatible with
'OracleEFProvider.number[Nullable=False,DefaultValue=,Precision=10,Scale=0]'
of member 'QUARTAL_SEC_ID' in type 'Model.Store.QUARTAL'.
I am currently using EF 5 with Oracle, and for number types we have our entity properties set as decimals.
This article outlines the data type mappings.
I'm using the Common Data Service for Apps connector in Azure Data Factory to load data into Dynamics 365.
I've done this successfully before using the entity key. See this question: Loading records into Dynamics 365 through ADF.
Now I'm trying to use an alternate key to upsert records into the account entity (in this case, an insert).
In Dynamics
I've created two custom fields on the account entity:
Field name      Data Type    Field Type  Max Length
====================================================
xyz_srcsystem   Single Line  Simple      50
xyz_srccode     Single Line  Simple      50
Then I created a key on account which contains both of these fields, named:
xyz_alternatekeyaccount
In ADF
Then I used a Copy Data activity in ADF to copy data from a SQL view into the account entity, using the CDS connector as a target.
This is my source SQL statement:
SELECT
CAST(NULL as uniqueidentifier) as accountid,
'ADFTest1' as accountnumber, 'ADF Test 1' as [description],
'nmcdermaid#xyz.com.au' as emailaddress1,
CAST('TST' AS NVARCHAR(50)) as xyz_srcsystem,
CAST('1' AS NVARCHAR(50)) as xyz_srccode
In the target, in the Alternate key name field I entered the alternate key name: xyz_alternatekeyaccount
The error I get when I run the pipeline is
Invalid type for entity id value
Some tests to rule out edge cases:
If I put in a dummy alternate key name, I get "Cannot retrieve key information of alternate key 'xyz_alternatekeyaccountx' for entity 'account'". This implies it is finding the real alternate key correctly
If I remove the alternate key from the connector, it drops back to the other usual set of errors that I see
When I pull the entity into SQL using the CDM connector, the custom attributes arrive as NVARCHAR(MAX)
I've tried casting to these data types: NVARCHAR(MAX), NVARCHAR(50), VARCHAR(MAX), VARCHAR(50)
If I use the normal key (not an alternate key), and get the datatype wrong (anything other than GUID), I'll get the same error
Also see this docs GitHub issue I raised:
https://github.com/MicrosoftDocs/azure-docs/issues/59028
When I changed the source SQL to this, it worked:
SELECT
'ADFTest1' as accountnumber, 'ADF Test 1' as [description],
'nmcdermaid#xyz.com.au' as emailaddress1,
CAST('TST' AS NVARCHAR(50)) as xyz_srcsystem,
CAST('1' AS NVARCHAR(50)) as xyz_srccode
Note: the difference is I did not include the true primary key in the source dataset.
Note that if you want to UPSERT a new record (an INSERT) and it isn't based on an alternate key, you have to include a NULL primary key.
I'm attempting to deploy a database project, Operations, that builds without errors. The deployment is failing with an "Invalid object name" error. The object name in the error is a three-part name of an object in another database project, Company. The name is valid: the object exists in Company, there is a database reference in the Operations project to the Company project, and the three-part name appears in other objects in the Operations project without causing an error.
As far as I can see, the reason for the error is that the three-part name appears in a join which uses a COLLATE clause, e.g.
CREATE VIEW BranchDetails
AS
SELECT b.BranchID, bu.LocationID
FROM Branch b
INNER JOIN Company.dbo.BusinessUnit bu on b.BranchAbbreviation = bu.[Name] COLLATE SQL_Latin1_General_CP1_CI_AS;
The COLLATE clause is needed because the two databases have different collations, and it's not possible to join on columns with different collations without converting the collation of one column or the other.
Is there some workaround that will allow me to deploy the database without the "Invalid object name" error?
EDIT: It looks like the COLLATE clause may have been a red herring. I replaced the view with a dummy one:
CREATE VIEW BranchDetails
AS
SELECT NULL AS BranchID, NULL AS LocationID;
Then I dealt with other errors until I was able to deploy the database. I then copied the original text of the BranchDetails CREATE VIEW back into the script, and the database built and deployed without error.
The only explanations I can think of are:
1) In copying the view text out and then back in again later, I somehow omitted something that was causing the error;
2) The error was actually related to one of the other errors I fixed when I replaced the view with the dummy text, and Visual Studio was giving me an incorrect error message pointing to the wrong .sql file. I suppose this is possible, as the other errors related to a server name, used in four-part names in a stored procedure, that does not exist in our dev environment.
I've got the following objects:
CREATE FUNCTION CONSTFUNC RETURN INT
  DETERMINISTIC
AS
BEGIN
  RETURN 1;
END;

CREATE TABLE "FUNCTABLE" (
  "ID"   NUMBER(*,0) NOT NULL,
  "VIRT" NUMBER GENERATED ALWAYS AS ("CONSTFUNC"()) NULL
);
However, the FUNCTABLE => CONSTFUNC dependency is not listed in ALL_DEPENDENCIES or USER_DEPENDENCIES. Is there anywhere I can access this dependency information in the dictionary?
I just created your function and table in 11G (11.1) and can confirm your findings. I couldn't find anything in the Oracle docs either.
If you drop the function, the table status remains "VALID", but when you select from the table you get ORA-00904: "CHAMP"."CONSTFUNC": invalid identifier. This suggests that Oracle itself isn't aware of the dependency.
It might be worth asking this question on asktom.oracle.com, because Tom Kyte will have access to more information - he may even raise a bug about it if need be.
The expression used to generate the virtual column is listed in the DATA_DEFAULT column of the [DBA|ALL|USER]_TAB_COLUMNS views.
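For example, a query along these lines shows the expression behind the VIRT column from the question:
SELECT column_name, data_default
FROM   user_tab_columns
WHERE  table_name = 'FUNCTABLE'
AND    column_name = 'VIRT';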
I've tried to do that.
HasManyToMany<YechidotDoarInGroup>(x => x.Col_yig)
    .Table("PigToYig")
    .ChildKeyColumn("YIG_GROUP_RECID")
    .ParentKeyColumn("PIG_GROUP_RECID");
but I've got:
ORA-00942: table or view does not exist
I am trying to establish a HasManyToMany connection not by ID, but by some other property.
First I got a "too long" error message. When I tried to enter my own table name as an alias, it wasn't recognized. What should I do?
The problem may well be this:
.Table("PigToYig")
Oracle object names are, by default, stored in UPPER case; only names created in double quotes keep the exact case given. In other words, if your table has the default naming, you may need to pass in this instead ...
.Table("PIGTOYIG")
It depends on how NHibernate converts those names into SQL (I'm not familiar with NHibernate).
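A quick SQL illustration of the case rule ("YigToPig" below is just a made-up second name):
CREATE TABLE PigToYig  (id NUMBER);   -- stored in the dictionary as PIGTOYIG
SELECT * FROM pigtoyig;               -- works: unquoted names are upper-cased
SELECT * FROM "PigToYig";             -- ORA-00942: no object with that exact mixed case
CREATE TABLE "YigToPig" (id NUMBER);  -- quoted: stored exactly as "YigToPig"
SELECT * FROM yigtopig;               -- ORA-00942: resolves to YIGTOPIG
SELECT * FROM "YigToPig";             -- works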
Call the Table() method before all of your other mapping declarations:
public class EmployeeMap : ClassMap<Employee>
{
    public EmployeeMap()
    {
        // set the table name first, before any other mapping
        Table("EMPLOYEE");

        // your declarations
        Id(x => x.IdEmployee);
    }
}
Cause: The table or view entered does not exist, a synonym that is not allowed here was used, or a view was referenced where a table is required. Existing user tables and views can be listed by querying the data dictionary. Certain privileges may be required to access the table. If an application returned this message, the table the application tried to access does not exist in the database, or the application does not have access to it.
Action:
Check each of the following:
* the spelling of the table or view name.
* that a view is not specified where a table is required.
* that an existing table or view name exists.
Source: ora-code.com
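As the quoted text suggests, the data dictionary is the quickest way to check which of those causes applies, for example:
SELECT owner, object_name, object_type
FROM   all_objects
WHERE  object_name IN ('PIGTOYIG', 'PigToYig');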
I'm working in two different Oracle schemas on two different instances of Oracle. I've defined several types and type collections to transfer data between these schemas. The problem I'm running into is that even though the types have exactly the same definitions (the same scripts were used to create both sets in the schemas), Oracle sees them as different objects that are not interchangeable.
I thought about casting the incoming remote type object as the same local type but I get an error about referencing types across dblinks.
Essentially, I'm doing the following:
DECLARE
  MyType LocalType; -- note, same definition as the RemoteType (same script)
BEGIN
  REMOTE_SCHEMA.PACKAGE.PROCEDURE@DBLINK( MyType ); -- MyType is an OUT param
  LOCAL_SCHEMA.PACKAGE.PROCEDURE( MyType );         -- IN param
END;
That fails because the REMOTE procedure call can't understand MyType, since Oracle treats LocalType and RemoteType as different object types.
I also tried declaring MyType as follows:
MyType REMOTE_SCHEMA.RemoteType@DBLINK;
but I get another error about referencing types across dblinks. CASTing between types doesn't work either, because in order to cast I need to reference the remote type across the dblink - same issue, same error. I've also tried using SYS.ANYDATA as the object that crosses between the two instances, but it gets a similar error.
Any ideas?
UPDATE:
I tried declaring the object type on both sides of the DBLINK using the same OID (retrieved manually using SYS_OP_GUID()), but Oracle still "sees" the two objects as different and throws a "wrong number or types of arguments" error.
I have read the Oracle Documentation and it is not very difficult.
You need to add an OID to your type definitions in both databases.
You can use a GUID as OID.
SELECT SYS_OP_GUID() FROM DUAL;
SYS_OP_GUID()
--------------------------------
AE34B912631948F0B274D778A29F6C8C
Now create your UDT in both databases with the SAME OID.
create type testlinktype oid 'AE34B912631948F0B274D778A29F6C8C' as object
( v1 varchar2(10) , v2 varchar2(20) );
/
Now create a table:
create table testlink
( name testlinktype);
insert into testlink values (testlinktype ('RC','AB'));
commit;
Now you can select from the table via the dblink in the other database:
select * from testlink@to_ora10;
NAME(V1, V2)
--------------------------
TESTLINKTYPE('RC', 'AB')
If you get error ORA-21700 when you try to select via the dblink the first time, just reconnect.
I think the underlying issue is that Oracle doesn't know how to automatically serialize/deserialize your custom type over the wire, so to speak.
Your best bet is probably to pass an XML (or other) representation over the link.
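A minimal sketch of that idea, assuming you write your own serialization helpers on each side (GET_AS_XML and FROM_XML below are hypothetical procedures, not built-ins):
DECLARE
  v_payload VARCHAR2(32767);  -- plain text payload; scalar types cross the dblink without object-type issues
  MyType    LocalType;
BEGIN
  -- the remote side serializes its RemoteType instance to XML text (hypothetical helper)
  REMOTE_SCHEMA.PACKAGE.GET_AS_XML@DBLINK(v_payload);

  -- the local side parses the text back into the local type (hypothetical helper)
  MyType := LOCAL_SCHEMA.PACKAGE.FROM_XML(v_payload);

  LOCAL_SCHEMA.PACKAGE.PROCEDURE(MyType);
END;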