LINQ insert with no primary key

I need to insert records into a table that has no primary key using LINQ to SQL. The table is poorly designed; I have NO control over the table structure. It consists of a few varchar fields, a text field, and a timestamp. It is used as an audit trail for other entities.
What is the best way to accomplish the inserts? Could I extend the LINQ partial class for this table and add a "fake" key? I'm open to any hack, however kludgy.

LINQ to SQL isn't meant for this task, so don't use it. Just wrap the insert in a stored procedure and add the procedure to your data model. If you can't do that, write a normal function with a bit of inline SQL.
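A minimal sketch of the stored-procedure route, assuming SQL Server and a hypothetical audit table dbo.AuditTrail whose timestamp column is filled in by the server; the procedure, column, and parameter names are made up:

CREATE PROCEDURE dbo.InsertAuditEntry
    @EntityName varchar(100),
    @Action     varchar(50),
    @Details    varchar(max)
AS
BEGIN
    SET NOCOUNT ON;
    -- The timestamp column is omitted; SQL Server generates it on insert.
    INSERT INTO dbo.AuditTrail (EntityName, [Action], Details)
    VALUES (@EntityName, @Action, @Details);
END

Once the procedure is dragged from Server Explorer onto the DBML designer, LINQ to SQL exposes it as a method on the DataContext, so the insert stays strongly typed without the table needing a key.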

Open your DBML file in the designer, and give the mapping a key, whether your database has one or not. This will solve your problem. Just beware, however, that you can't count on the column being used for identity or anything else if there isn't a genuine key in the database.

I was able to work around this using a composite key.
I had a similar problem with a table containing only two columns: username, role.
This table obviously does not require an identity column. So, I created a composite key with username and role. This enabled me to use LINQ for adding and deleting entries.

You might use the DataContext.ExecuteCommand method to run your own custom insert statement.
Or, you might mark a column as the primary key in the designer; this allows the objects to be tracked for inserts/updates/deletes by the DataContext. This works even if the column isn't really an enforced primary key in the database (how would LINQ know?). If you're only doing inserts and never re-use a key value within the same DataContext, you'll be fine.

Related

How to insert data into destination table without having any primary key in Talend

I am using Talend for ETL but don't have much experience with it. I have two tables, for example account and account_roles: the account table has id, name, password, etc. fields, and the account_roles table has account_id, which is a foreign key to the account table's primary key, plus one more field.
Both of the fields in account_roles contain duplicates. I want to save account_roles to the destination with update-and-insert logic using Talend.
But I am getting an error because there is no column that can be treated as a primary key in the account_roles table, so Talend can't update or insert into it.
How do I deal with this situation? I tried the tDBOutput advanced option use_field_option, but it still requires unique entries.
Is there any possible solution to this issue? I also want to know whether it would work if I made the column a foreign key in the account_roles table, and if yes, how to create a foreign key in Talend Open Studio; that is my second question.
Attaching snapshots of my tables and tMap below:
I want to know how I can load my tables into the database when I don't have any primary key. Kindly help me.
First question
I think you should define the primary key on the physical account_roles table. Talend will use both the key indication of the dbOutput component and the physical key of the table.
In order to delete duplicate rows, you can also use a tUniqRow before the dbOutput. The key you indicate in the tUniqRow is not directly linked to the database; it is only the key on which tUniqRow will base its deduplication.
Second question
It's not possible to delegate the foreign-key verification to Talend. But you can do this verification in your database by placing foreign keys on your table. If an id is not present in the reference table, the database raises an error, which Talend reports back.
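For the second question, the foreign key is created in the database itself rather than in Talend; assuming the column names from the question (account.id as the primary key, account_roles.account_id referencing it), something like:

ALTER TABLE account_roles
  ADD CONSTRAINT fk_account_roles_account
  FOREIGN KEY (account_id) REFERENCES account (id);

With that constraint in place, any row with an account_id that does not exist in account is rejected by the database, and the error surfaces in the Talend job.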

Oracle SQL Data Modeler missing a PRIMARY KEY on DDL script export

The diagram has over 40 tables, most of them have a primary key defined.
For some reason there is this one table, which has a primary key defined, but that's being ignored when I export the model to a DDL script.
This is the "offending" key (even though it's checked, it is nowhere to be found in the generated DDL script):
Has anybody had the same problem? Any ideas on how to solve it?
[EDIT] This is where the key is defined:
And this is the DDL preview (yes, the primary key shows up there):
This is what happens if I try to generate the DDL for just that table (primary key still not generated):
I was finally able to identify and reproduce the problem.
It was a simple conflict of constraints.
Table MIEMBROS had a mandatory 1-to-n relationship (foreign key) from another table on its primary key column, and vice versa (there was a foreign key on MIEMBROS against the other table's primary key).
This kind of relationship between two tables makes it impossible to add a record to either of them: the insert operation returns an error complaining about the foreign key constraint pointing to the other table.
Anyway, I realized that one of the relationships was 0-to-n, so I simply unchecked the "mandatory" checkbox on the foreign key definition and everything went fine.
So, in a nutshell: the Data Modeler "fails" silently if you define a mutual relationship (two foreign keys, one on each table against the other table) on non-nullable unique columns, by not generating the primary key of one of the tables.
Such odd behavior, if you ask me!
"This kind of relationship between two tables makes it impossible to add a record to either of them: the insert operation returns an error complaining about the foreign key constraint pointing to the other table."
Actually, if you have deferred constraints, this is not impossible. The constraints can be enforced, for example, at commit time rather than immediately at insert time.
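For example, in Oracle the two foreign keys can be declared deferrable so the circular references are only checked at commit; the second table and the column names here are made up for illustration, with MIEMBROS taken from the question:

ALTER TABLE miembros
  ADD CONSTRAINT fk_miembros_otra
  FOREIGN KEY (otra_id) REFERENCES otra_tabla (id)
  DEFERRABLE INITIALLY DEFERRED;

ALTER TABLE otra_tabla
  ADD CONSTRAINT fk_otra_miembros
  FOREIGN KEY (miembro_id) REFERENCES miembros (id)
  DEFERRABLE INITIALLY DEFERRED;

-- Both rows can now be inserted in the same transaction;
-- the foreign keys are only checked at COMMIT.
INSERT INTO miembros   (id, otra_id)    VALUES (1, 10);
INSERT INTO otra_tabla (id, miembro_id) VALUES (10, 1);
COMMIT;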
From the Data Modeler menu under File, I used Export -> DDL File. The keys appeared in the DDL, then when I went back to the diagram and did DDL Preview, it showed all the missing stuff.

Does Oracle automatically create a secondary index for FOREIGN KEY columns?

I'm currently developing on Oracle. I have several tables for which I defined FOREIGN KEY constraints. I have already read this SQL Server-oriented question and this MySQL-oriented one, but I could find none about Oracle.
So the question is always the same: in order to optimize query performance, for the columns on which I create a FOREIGN KEY constraint, do I also have to create an explicit secondary index? Doesn't Oracle automatically create an index on FOREIGN KEY columns to boost performance during JOINs?
I usually perform queries in which the WHERE clause compare against those columns.
No, Oracle doesn't automatically create indexes on foreign key columns, even though in 99% of cases you probably should. Apart from helping with queries, the index also improves the performance of delete statements on the parent table.
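Creating the index yourself is a one-liner; child and parent_id below are placeholders for your own table and foreign key column:

CREATE INDEX child_parent_id_ix ON child (parent_id);

Besides speeding up joins and the WHERE clauses mentioned in the question, an index on the foreign key column also keeps Oracle from taking a full table lock on the child table when a parent row is deleted or its key is updated.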

Make column unique in two tables in our database

I have hit a bump at my current company: they have an account and a member, and for some reason or another the two are stored in separate tables.
Right now both a member and an account can be registered. That's fine, except that a member and an account can have the same username, which is of course just wrong, especially since the username is used to log in to the same system, just with different functionality levels.
Right now we are doing the check at the application level, and we're wondering if it's possible to get the database to enforce the two columns to be unique, like a union of the two tables.
We can't set them up as primary or foreign keys at the moment, but that's for the future anyway. Right now we're looking for a quick fix. In the future I will probably merge the databases and add all members as new rows in the account table, with a boolean IsMember column.
In general, I agree with the consensus opinion that it's better to fix the design than to kluge a fix using triggers. However, a properly implemented trigger-based solution is still probably better than your current situation.
If you're going to use triggers, the right way to do it is to:
1. Create a new table that will contain nothing but usernames, with a primary key enforcing uniqueness (this may, in fact, be a good candidate for an index-organized table).
2. Create before-insert triggers on both existing tables that add the new username to the new table. If the new username already exists, an error will be thrown, preventing the insert of both rows. Of course, the application will need to be able to handle this error gracefully (presumably it already can, for scenarios in which the new username already exists in the table it's being added to).
The wrong way to do this would be to make the trigger select from the other table in order to verify uniqueness; a sketch of the right way follows below.
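A minimal sketch of that approach in Oracle-style SQL, assuming hypothetical tables accounts and members that each have a username column:

-- One row per username across both tables; the primary key enforces uniqueness.
CREATE TABLE all_usernames (
  username VARCHAR2(100) CONSTRAINT all_usernames_pk PRIMARY KEY
);

CREATE OR REPLACE TRIGGER tr_accounts_username
BEFORE INSERT ON accounts
FOR EACH ROW
BEGIN
  -- Fails with ORA-00001 if the username is already taken in either table.
  INSERT INTO all_usernames (username) VALUES (:new.username);
END;
/

CREATE OR REPLACE TRIGGER tr_members_username
BEFORE INSERT ON members
FOR EACH ROW
BEGIN
  INSERT INTO all_usernames (username) VALUES (:new.username);
END;
/

You would also need to seed all_usernames from the existing rows and add delete/update triggers to keep it in sync, which the sketch leaves out.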
You can add a trigger that enforces your requirement.
The recommended triggers tend to be really brittle with concurrent transactions.
What you can do (AFAIK) is to create a materialized view containing the union of the columns in question and put a unique constraint on that column, as sketched below.
Make sure you do some performance tests, though.
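A rough sketch of that materialized-view approach on Oracle, again assuming hypothetical tables accounts and members with a username column; a fast-refresh-on-commit view over a UNION ALL needs the marker column and rowids shown here:

CREATE MATERIALIZED VIEW LOG ON accounts WITH ROWID (username) INCLUDING NEW VALUES;
CREATE MATERIALIZED VIEW LOG ON members  WITH ROWID (username) INCLUDING NEW VALUES;

CREATE MATERIALIZED VIEW all_usernames_mv
  REFRESH FAST ON COMMIT
AS
  SELECT 'A' AS src, a.ROWID AS rid, a.username FROM accounts a
  UNION ALL
  SELECT 'M' AS src, m.ROWID AS rid, m.username FROM members m;

-- The constraint lives on the view's container table, so a duplicate
-- username across the two tables makes the refresh, and thus the COMMIT, fail.
ALTER TABLE all_usernames_mv ADD CONSTRAINT all_usernames_uq UNIQUE (username);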
Since you use a soft-delete pattern, a trigger could be used (on each table) as a temporary measure.
By inserting a disabled record into the other table, you will get a failure if a matching record already exists there.
Remember this will not enforce the rule on existing data; only records that are inserted will be checked.
Something like this:
-- Insert a disabled row into the accounts table too
CREATE OR REPLACE TRIGGER tr_member_chk
BEFORE INSERT ON members
FOR EACH ROW
BEGIN
  INSERT INTO accounts (name, id, etc, isenabled)
  VALUES (:new.name, :new.id, :new.etc, 0);
END;
/

-- Insert a disabled row into the members table too
CREATE OR REPLACE TRIGGER tr_account_chk
BEFORE INSERT ON accounts
FOR EACH ROW
BEGIN
  INSERT INTO members (name, id, etc, isenabled)
  VALUES (:new.name, :new.id, :new.etc, 0);
END;
/

Maximum number of columns in a LINQ to SQL object?

I have 62 columns in a table under SQL Server 2005, and LINQ to SQL doesn't handle the updates, though reading works just fine. I tried re-adding the table to the model and creating a new data model, but nothing worked. I'm guessing I've hit a maximum number of columns limit on an object; can anyone explain that?
I suspect there is some issue with an identity or timestamp column (something autogenerated on the SQL server). Make sure that any column that is autogenerated is marked that way in the model. You might also want to look at how it is handling concurrency. If you have triggers that update any values on the row after it is updated (changing values) and it is checking all columns on updates, this would cause the update to fail. Typically I create my tables with a timestamp column -- LINQ2SQL picks this up when I generate the model and uses it alone for concurrency.
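If the table lacks such a column, adding one is a small change; the table name below is hypothetical, and SQL Server 2005 calls the rowversion type timestamp:

-- Only one timestamp column is allowed per table; SQL Server maintains it
-- automatically on every insert and update, so LINQ to SQL can use it for
-- optimistic concurrency instead of comparing all 62 columns.
ALTER TABLE dbo.BigTable ADD RowVer timestamp;

After regenerating the model, the designer should pick the column up as the version (time stamp) member and use it alone for update checks.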
Solved; it was one of the following two:
- I was using a UniqueIdentifier column that was not set as the primary key.
- I set the unique ID column as the primary key, checked the properties of the same column in Server Explorer and it was still not showing as a primary key, refreshed the connection, dropped the same table onto the model again, and voila.
So I assume I made a change to my model some time before, deleted the table from the model, and added it again from Server Explorer without refreshing the connection, and it never worked.
The question is: does the VS Server Explorer maintain its own copy of the table schema and require a connection refresh every time a change is made in the database?
There is no limit to the number of columns LINQ to SQL will handle.
Have you got other tables updating successfully?
What else is different about how you are accessing the table content?
