VS 2013: why is the id (primary key) decrementing? - visual-studio-2013

I have a problem with VS 2013. When I create a new table in my service-based database and set my id as the primary key with Identity Increment = 1 (Is Identity: true), I can fill the table from within VS 2013 (Table Data), and the primary key auto-increments when I add a new record.
But when I add a new data source, drag it onto my form, and debug my app, the primary key for each new record starts at -1 and decrements step by step.

The DataSet + DataAdapter stack will do that; the negative values are temporary, client-side keys.
They should turn positive after you commit the records to the actual DB.
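The typed DataSet typically hands out -1, -2, … on the client precisely so its placeholders can never collide with values the server has already assigned. The real, positive values come from the identity column on the server once the update is committed. A minimal sketch of such a column, assuming SQL Server and hypothetical table and column names:
CREATE TABLE MyTable (
    Id   INT IDENTITY(1, 1) NOT NULL PRIMARY KEY,  -- seed 1, increment 1: assigns 1, 2, 3, ... on insert
    Name NVARCHAR(100) NOT NULL
);
Until the adapter's Update call pushes the rows, only the client-side placeholders exist; after the commit (and a refresh of the identity values), the grid shows the server-assigned keys.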

Related

How to insert data into destination table without having any primary key in Talend

I am using Talend for ETL and I don't have much experience with it. I have two tables, for example account and account_roles. The account table has id, name, password, etc., and the account_roles table has account_id, which is a foreign key to the account table's primary key, plus one more field.
Both fields in account_roles contain duplicates. I want to load account_roles into the destination with update-and-insert logic using Talend.
But I am getting an error because there is no column that can be treated as a primary key in the account_roles table, so Talend can't update or insert into it.
How do I deal with this situation? I tried the tDBOutput advanced option use_field_option, but it still requires unique entries.
Is there any possible solution to this issue? I also want to know whether making account_id a foreign key in the account_roles table would help, and if so, how to create a foreign key in Talend Open Studio; that is my second question.
Attaching snapshots of my tables and tMap below.
I want to know how I can load my tables into the database when I don't have any primary key! Kindly help me.
First question
I think you should place the primary key on the physical account_roles table. Talend uses both the key indication on the dbOutput component and the physical key of the table.
To remove duplicate rows, you can also put a tUniqRow before the dbOutput. The key you indicate in tUniqRow is not directly linked to the database; it is only the key on which tUniqRow deduplicates.
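Once the duplicates are gone, the physical key on account_roles could be a composite of its two columns. A sketch, where role_id is a stand-in for whatever your second column is actually called:
ALTER TABLE account_roles
  ADD CONSTRAINT pk_account_roles PRIMARY KEY (account_id, role_id);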
Second question
It's not possible to delegate the foreign-key verification to Talend, but you can do this verification in your database by placing foreign keys on your table. If an id is not present in the referenced table, the database returns an error, which Talend then reports back.
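A sketch of declaring that foreign key at the database level, using the column names from the question:
ALTER TABLE account_roles
  ADD CONSTRAINT fk_account_roles_account
  FOREIGN KEY (account_id) REFERENCES account (id);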

How to update a column which is also a primary key?

There is a name field in the UI which is also the primary key column in the underlying table. There is a requirement to make that field editable in the UI. There should be an ID column serving as the primary key, but there isn't, and it is not feasible to introduce one now.
Is there any alternative design that could be used in such a scenario?
The UI is in Swing and DB is Oracle.
First of all, I don't know who thought a name field could be a primary key; that is about the worst database design there is.
Yes, you had better change it to an ID column as the primary key, one that is never updated afterwards. Since you can't have more than one primary key, you need to perform a bit of a circus here (sketched in SQL after these steps):
Drop the existing primary key first, since a single table can only have one.
Create your ID column and allow NULL.
Then update this column from a sequence.
Once your ID column is populated, create the primary key on it.
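A sketch of those steps in Oracle SQL; the table, column, and sequence names are hypothetical, so adjust them to your schema:
ALTER TABLE my_table DROP PRIMARY KEY;                    -- step 1 (add CASCADE if child tables reference it)

ALTER TABLE my_table ADD (id NUMBER);                     -- step 2: new column, nullable for now

CREATE SEQUENCE my_table_seq START WITH 1 INCREMENT BY 1;
UPDATE my_table SET id = my_table_seq.NEXTVAL;            -- step 3: populate it from the sequence

ALTER TABLE my_table MODIFY (id NOT NULL);
ALTER TABLE my_table ADD CONSTRAINT my_table_pk PRIMARY KEY (id);  -- step 4: make it the primary key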
You can only have one primary key, but you can have any number of unique indexes on a table. So let the existing primary key remain the immutable primary key, and have the application use it internally for everything. Add another column to the table, create a unique index on it, and let the users modify that other field.
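A sketch of that approach in Oracle SQL, where display_name is a hypothetical name for the new editable column:
ALTER TABLE my_table ADD (display_name VARCHAR2(100));
UPDATE my_table SET display_name = name;                  -- seed it from the current key value
ALTER TABLE my_table ADD CONSTRAINT uq_my_table_display_name UNIQUE (display_name);
The application keeps joining and updating on name internally, while the UI edits only display_name.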
Another alternative would be to declare all child tables with foreign keys ON UPDATE CASCADE. That way, any update to the primary key would cascade to the child tables. Once it's implemented in production, quit the company, run fast in the other direction, and write an article about how you were the first person ever to use ON UPDATE CASCADE in a production setting.
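For illustration only, this is roughly what such a declaration looks like in engines that support it (Oracle, the database in this question, does not offer ON UPDATE CASCADE); all names here are hypothetical:
CREATE TABLE child_table (
    child_id    INT PRIMARY KEY,
    parent_name VARCHAR(100) NOT NULL,
    CONSTRAINT fk_child_parent
        FOREIGN KEY (parent_name) REFERENCES parent_table (name)
        ON UPDATE CASCADE   -- renaming the parent row renames it here too
);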

Unique constraint exception in Hibernate

I am using one Oracle DB with two copies of the same application: one deployed to a server and one running on localhost through a remote connection. The problem is a primary key constraint violation when both applications insert rows into the same table. Suppose there are 5 rows in the table, the last one having primary key value 5, and both applications currently hold 5 as the last value in their session. If one application inserts a new row, it increments the primary key to 6 and inserts successfully. If the second application then inserts a row into the same table, it also tries primary key 6, because the last value in its session was 5; since a row with primary key 6 already exists (inserted by the first app), it throws a unique constraint exception.
Can anyone suggest how to take the current value of the primary key from the DB rather than from the session? Thanks in advance.
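One common way out, sketched here assuming Oracle and with hypothetical table and sequence names, is to let a database sequence assign the key so that neither application relies on a value cached in its session:
CREATE SEQUENCE entries_seq START WITH 6 INCREMENT BY 1;

INSERT INTO entries (id, payload)
VALUES (entries_seq.NEXTVAL, 'inserted by either application');  -- concurrent inserts get distinct ids
In Hibernate terms, that corresponds to mapping the id with a sequence-based identifier generator instead of assigning it from a value held by the application.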

Data generation plan failed because of duplicate PRIMARY KEY constraints

In a VS2010 database project, I try to generate test data for a table that already has existing data (by clicking 'No' when prompted). The identity column (which is the primary key) is a SQL computed value, so I cannot change the data generator for that column.
So why doesn't the data generation plan recognize the existing primary key values in the database, instead of always trying to insert duplicates? It seems the plan always starts from the seed value, not from the next available identity value. Can I force the data generation plan to start from some other seed value for this particular table?
To answer my second question: it seems that from the schema view I can set some other seed to start from. Still, I don't understand why the data generation plan doesn't automatically recognize the next available IDENTITY value.
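Outside the data generation plan, you can also inspect or move the identity high-water mark yourself in T-SQL; the table name below is hypothetical:
DBCC CHECKIDENT ('dbo.MyTable', NORESEED);        -- report the current identity value and the column's current maximum
DBCC CHECKIDENT ('dbo.MyTable', RESEED, 1000);    -- reseed so the next inserted row gets 1001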

Unique constraint violation during insert: why? (Oracle)

I'm trying to create a new row in a table. There are two constraints on the table: one is on the key field (DB_ID), the other constrains the field ENV to one of several allowed values. When I do an insert, I do not include the key field among the fields I'm inserting, yet I'm getting this error:
unique constraint (N390.PK_DB_ID) violated
Here's the SQL that causes the error:
insert into cmdb_db
(narrative_name, db_name, db_type, schema, node, env, server_id, state, path)
values
('Test Database', 'DB', 'TYPE', 'SCH', '', 'SB01', 381, 'TEST', '')
The only thing I've been able to turn up is the possibility that Oracle might be trying to assign an already in-use DB_ID if rows were inserted manually. The data in this database was somehow restored/moved from a production database, but I don't have the details as to how that was done.
Any thoughts?
Presumably, since you're not providing a value for the DB_ID column, that value is being populated by a row-level before insert trigger defined on the table. That trigger, presumably, is selecting the value from a sequence.
Since the data was moved (presumably recently) from the production database, my wager would be that when the data was copied, the sequence was not modified as well. I would guess that the sequence is generating values that are much lower than the largest DB_ID that is currently in the table leading to the error.
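A typical shape for such a trigger; the trigger and sequence names here are hypothetical, while the table and column come from the question:
CREATE OR REPLACE TRIGGER cmdb_db_bi
BEFORE INSERT ON cmdb_db
FOR EACH ROW
BEGIN
  IF :NEW.db_id IS NULL THEN
    SELECT db_id_seq.NEXTVAL INTO :NEW.db_id FROM dual;  -- assign the key from the sequence
  END IF;
END;
/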
You could confirm this suspicion by looking at the trigger to determine which sequence is being used and doing a
SELECT <<sequence name>>.nextval
FROM dual
and comparing that to
SELECT MAX(db_id)
FROM cmdb_db
If, as I suspect, the sequence is generating values that already exist in the database, you could increment the sequence until it was generating unused values or you could alter it to set the INCREMENT to something very large, get the nextval once, and set the INCREMENT back to 1.
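A sketch of that "jump the sequence forward" trick, with a hypothetical sequence name:
ALTER SEQUENCE db_id_seq INCREMENT BY 1000;   -- pick something larger than the gap to MAX(db_id)
SELECT db_id_seq.NEXTVAL FROM dual;           -- consume one value to leap past the existing keys
ALTER SEQUENCE db_id_seq INCREMENT BY 1;      -- restore the normal increment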
Your error looks like you are duplicating an already existing primary key in your DB. You should modify your SQL so the table generates its own primary key, using something like the IDENTITY keyword (SQL Server syntax shown here):
CREATE TABLE [DB] (
  [DBId] bigint NOT NULL IDENTITY,
  ...
  CONSTRAINT [DB_PK] PRIMARY KEY ([DBId] ASC)
);
It looks like you are not providing a value for the primary key field DB_ID. If that is a primary key, you must provide a unique value for that column. The only way not to provide it would be to create a database trigger that, on insert, would provide a value, most likely derived from a sequence.
If this is a restoration from another database and there is a sequence on this new instance, it might be trying to reuse a value. If the old data had unique keys from 1 - 1000 and your current sequence is at 500, it would be generating values that already exist. If a sequence does exist for this table and it is trying to use it, you would need to reconcile the values in your table with the current value of the sequence.
You can use SEQUENCE_NAME.CURRVAL to see the current value of the sequence (if it exists, of course). Note that CURRVAL is only defined within a session after NEXTVAL has been referenced at least once in that session.
