How to pass JSON array in PLSQL Rest Service - oracle

I am trying to pass a JSON array to a REST service in EBS (12.2.10) but am getting the following error:
"-40491 ORA-40491: invalid input data type for JSON_TABLE"
I created the following types:
CREATE OR REPLACE EDITIONABLE TYPE XRCL_CB_INBOUND_TALLY_OBJ AS OBJECT (
  TRANSACTION_DATE VARCHAR2(30),
  TRANSACTION_TYPE VARCHAR2(5),
  ORGANIZATION_ID  VARCHAR2(5),
  DOCUMENT_ID      VARCHAR2(25),
  DOCUMENT_LINE_ID VARCHAR2(25),
  SKU_CODE         VARCHAR2(25),
  QUANTITY         VARCHAR2(10),
  SUBINVENTORY     VARCHAR2(25),
  LOT_NUMBER       VARCHAR2(25)
);

CREATE OR REPLACE EDITIONABLE TYPE XRCL_CB_INBOUND_TALLY_NT AS TABLE OF XRCL_CB_INBOUND_TALLY_OBJ;
Below is the JSON object which I am passing as a parameter:
{
  "TALLYQUANTITY_Input": {
    "RESTHeader": {
      "Responsibility": "ROCELL",
      "RespApplication": "XRCL",
      "SecurityGroup": "STANDARD",
      "NLSLanguage": "AMERICAN"
    },
    "InputParameters": {
      "P_TRANSACTION_LINES": [
        {
          "TRANSACTION_TYPE": "IO",
          "TRANSACTION_DATE": "01/02/2022 12:00:00 AM",
          "ORGANIZATION_ID": "121",
          "DOCUMENT_ID": "1",
          "DOCUMENT_LINE_ID": "1",
          "SKU_CODE": "RC.001.000102.MA.03",
          "QUANTITY": "1",
          "LOT_NUMBER": "1013A.B.7.J.G",
          "SUBINVENTORY": "Saleable"
        }
      ]
    }
  }
}

The issue was resolved by passing the parameters in the same order as they are defined in the XSD sequence tag.
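For illustration, a sketch of the reordered payload, assuming the XSD sequence lists the attributes in the same order as the object type above (TRANSACTION_DATE first, then TRANSACTION_TYPE, and so on; RESTHeader omitted for brevity):

```json
{
  "TALLYQUANTITY_Input": {
    "InputParameters": {
      "P_TRANSACTION_LINES": [
        {
          "TRANSACTION_DATE": "01/02/2022 12:00:00 AM",
          "TRANSACTION_TYPE": "IO",
          "ORGANIZATION_ID": "121",
          "DOCUMENT_ID": "1",
          "DOCUMENT_LINE_ID": "1",
          "SKU_CODE": "RC.001.000102.MA.03",
          "QUANTITY": "1",
          "SUBINVENTORY": "Saleable",
          "LOT_NUMBER": "1013A.B.7.J.G"
        }
      ]
    }
  }
}
```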

Related

How do I update an Android Room column from notNull=true to notNull=false?

Problem: With Android Room, using a pre-populated database, I cannot seem to get the table columns to change from notNull=true to notNull=false. The pre-populated database schema is correct, but I cannot get Android Room to update correctly to match.
What I have done: I edited the JSON schema file, removing the NOT NULL for the specific columns, and under the fields I updated the same field column information to "notNull": false. I tried a migration, not knowing if it was correct, using ALTER TABLE Notes ADD COLUMN 'QuestionID' INTEGER, and it actually updated the JSON file to NOT NULL again. I can't seem to find information on how to do this. The Entity does not have these annotations, and I wasn't sure it was necessary to define these at the Entity, as this DB has other tables without these annotations and they are passing through compilation without issue. I'm sure this is another 80/20 rule where I'm stupid and missing something.
Example table in the JSON file. The Question, Quote, Term and Deleted fields need to be notNull=false but keep changing back to true, while the pre-populated table is correct.
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` (`NoteID` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `SourceID` INTEGER NOT NULL, `CommentID` INTEGER NOT NULL, `QuestionID` INTEGER NOT NULL, `QuoteID` INTEGER NOT NULL, `TermID` INTEGER NOT NULL, `TopicID` INTEGER NOT NULL, `Deleted` INTEGER NOT NULL, FOREIGN KEY(`SourceID`) REFERENCES `Source`(`SourceID`) ON UPDATE NO ACTION ON DELETE NO ACTION , FOREIGN KEY(`CommentID`) REFERENCES `Comment`(`CommentID`) ON UPDATE NO ACTION ON DELETE NO ACTION , FOREIGN KEY(`TopicID`) REFERENCES `Topic`(`TopicID`) ON UPDATE NO ACTION ON DELETE NO ACTION )",
"fields": [
{
"fieldPath": "noteID",
"columnName": "NoteID",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "sourceID",
"columnName": "SourceID",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "commentID",
"columnName": "CommentID",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "questionID",
"columnName": "QuestionID",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "quoteID",
"columnName": "QuoteID",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "termID",
"columnName": "TermID",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "topicID",
"columnName": "TopicID",
"affinity": "INTEGER",
"notNull": true
},
{
"fieldPath": "deleted",
"columnName": "Deleted",
"affinity": "INTEGER",
"notNull": true
}
]
The schema in the JSON file is generated and based upon the Entity; changing it will make no difference. It isn't even required (except if using AutoMigration).
The pre-populated database schema is correct but I cannot get Android Room to update correctly to match:
You have to either change the Entities accordingly or convert the pre-populated database accordingly, noting again that the Entities define what Room expects.
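If converting the pre-populated database is the chosen route, note that SQLite cannot change a column's nullability in place; the usual pattern (a sketch only, with an assumed trimmed-down column list) is to rebuild the table:

```sql
-- Sketch: SQLite has no ALTER COLUMN, so to change a column's
-- NOT NULL constraint the table is rebuilt. Columns trimmed for
-- illustration.
BEGIN TRANSACTION;
CREATE TABLE Note_new (
    NoteID INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
    QuestionID INTEGER,   -- nullability changed here
    Deleted INTEGER
);
INSERT INTO Note_new (NoteID, QuestionID, Deleted)
    SELECT NoteID, QuestionID, Deleted FROM Note;
DROP TABLE Note;
ALTER TABLE Note_new RENAME TO Note;
COMMIT;
```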
The language used matters as to the exact answer.
With Kotlin then Note could be:-
data class Note(
    @PrimaryKey(autoGenerate = true)
    val NoteId: Long,
    val SourceID: Long?,
    val CommentID: Long?,
    val QuestionID: Long?,
    val QuoteID: Long?,
    val TermID: Long, // <<<<< NOT NULL
    val TopicID: Long?,
    val Deleted: Long?
)
The generated java then shows the table create as :-
_db.execSQL("CREATE TABLE IF NOT EXISTS `Note` (`NoteId` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `SourceID` INTEGER, `CommentID` INTEGER, `QuestionID` INTEGER, `QuoteID` INTEGER, `TermID` INTEGER NOT NULL, `TopicID` INTEGER, `Deleted` INTEGER, FOREIGN KEY(`SourceID`) REFERENCES `Source`(`SourceID`) ON UPDATE NO ACTION ON DELETE NO ACTION , FOREIGN KEY(`CommentID`) REFERENCES `Comment`(`CommentID`) ON UPDATE NO ACTION ON DELETE NO ACTION , FOREIGN KEY(`TopicID`) REFERENCES `Topic`(`TopicID`) ON UPDATE NO ACTION ON DELETE NO ACTION )");
i.e. those declared as Long? do not have NOT NULL (the TermID column has NOT NULL because Long rather than Long? was used).
With Java, the column's field cannot be a primitive type if NULLs are to be allowed, as primitives MUST have a value and cannot be null, so Room will derive NOT NULL for them. Only the object type (e.g. Long rather than long) will be taken as allowing NULLs. To force NOT NULL on an object-typed field, the @NotNull annotation needs to be used.
So Java equivalent (named JavaNote to allow both to be used/compiled) could be :-
@Entity(
    foreignKeys = {
        @ForeignKey(entity = Source.class, parentColumns = {"SourceID"}, childColumns = {"SourceID"}),
        @ForeignKey(entity = Comment.class, parentColumns = {"CommentID"}, childColumns = {"CommentID"}),
        @ForeignKey(entity = Topic.class, parentColumns = {"TopicID"}, childColumns = {"TopicID"})
    }
)
class JavaNote {
    @PrimaryKey(autoGenerate = true)
    long NoteID = 0; // primitives cannot be NULL and thus imply NOT NULL
    Long SourceID;
    Long CommentID;
    Long QuestionID;
    Long QuoteID;
    @NotNull
    Long TermID; // or long TermID
    Long TopicID;
    Long Deleted;
}
The generated java then has the table create as :-
_db.execSQL("CREATE TABLE IF NOT EXISTS `JavaNote` (`NoteID` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `SourceID` INTEGER, `CommentID` INTEGER, `QuestionID` INTEGER, `QuoteID` INTEGER, `TermID` INTEGER NOT NULL, `TopicID` INTEGER, `Deleted` INTEGER, FOREIGN KEY(`SourceID`) REFERENCES `Source`(`SourceID`) ON UPDATE NO ACTION ON DELETE NO ACTION , FOREIGN KEY(`CommentID`) REFERENCES `Comment`(`CommentID`) ON UPDATE NO ACTION ON DELETE NO ACTION , FOREIGN KEY(`TopicID`) REFERENCES `Topic`(`TopicID`) ON UPDATE NO ACTION ON DELETE NO ACTION )");
Again, TermID has purposely been coded to use NOT NULL.
The generated java is available after compiling.
It is found in the generated java (use the Android view) in the class whose name is that of the class annotated with @Database, suffixed with _Impl. The statements themselves are in the createAllTables method.

AUTO_INCREMENT in H2 database doesn't work when requesting with Postman

I want to persist TODOs in an H2 DB backing a Spring Boot application.
The following SQL script initializes the DB and it works properly:
DROP TABLE IF EXISTS todos;

CREATE TABLE todos (
  id INT AUTO_INCREMENT PRIMARY KEY,
  title VARCHAR(50) NOT NULL UNIQUE,
  description VARCHAR(250) NOT NULL,
  completion_date DATE,
  priority VARCHAR(6) CHECK (priority IN ('LOW', 'MEDIUM', 'HIGH'))
);

INSERT INTO todos (title, description, priority) VALUES
  ('Create xxx Todo', 'An xxx-TODO must be created.', 'HIGH'),
  ('Delete xxx Todo', 'An xxx-TODO must be deleted.', 'HIGH'),
  ('Update xxx Todo', 'An xxx-TODO must be updated.', 'MEDIUM'),
  ('Complete xxx Todo', 'An xxx-TODO must be completed.', 'LOW');
Console output when starting Spring Boot:
Hibernate: drop table if exists todos CASCADE
Hibernate: drop sequence if exists hibernate_sequence
Hibernate: create sequence hibernate_sequence start with 1 increment by 1
Hibernate: create table todos (id bigint not null, completion_date date, description varchar(250) not null, priority varchar(250) not null, title varchar(50) not null, primary key (id))
Hibernate: alter table todos add constraint UK_c14g1nqfdaaixe1nyw25h3t0n unique (title)
I implemented the controller, service and repository in Java within the Spring Boot application.
I used Postman to test the implemented functionality. Getting all Todos works well, but creating a Todo fails for the first 4 times because of an
org.h2.jdbc.JdbcSQLIntegrityConstraintViolationException: Unique index or primary key violated: "PRIMARY KEY ON PUBLIC.TODOS(ID) [1, 'Create xxx Todo', 'An xxx TODO must be created.', NULL, 'HIGH']"
This is the request body:
{
  "title": "Creating xxxx Todo via API",
  "description": "An xxxx TODO was created via API.",
  "id": null,
  "completionDate": null,
  "priority": "LOW"
}
This exception occurs 4 times with the following response:
{
  "timestamp": "2021-05-25T17:32:57.129+00:00",
  "status": 500,
  "error": "Internal Server Error",
  "message": "",
  "path": "/api/todo/create"
}
With the fifth attempt the Todo gets created:
{
  "title": "Create xxxx Todo via API",
  "description": "An xxxx TODO was created via API.",
  "id": 5,
  "completionDate": null,
  "priority": "LOW"
}
and the ID 5 was assigned to this record.
Hence, the problem seems to be the number of inserted records during the H2 start-up when Spring Boot starts and initializes the H2 database.
In the Todo entity I annotated the id as follows:
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private Long id;
How can I solve this problem so that I can access the creation endpoint of the Spring Boot application via Postman?
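One direction often suggested for this symptom (an assumption here, not stated in this thread) is that GenerationType.AUTO makes Hibernate manage its own hibernate_sequence starting at 1, which collides with the rows already inserted by the init script; delegating id assignment to the database's AUTO_INCREMENT column sidesteps that. A sketch of the changed mapping:

```java
// Sketch: let H2's AUTO_INCREMENT column assign ids instead of a
// Hibernate-managed sequence, so ids of pre-inserted rows are not re-used.
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Todo {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;
    // ... other fields unchanged
}
```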

Apache NiFi put value to serial column

I have a database table with the following structure:
CREATE TABLE fact.cabinet_account (
  id serial NOT NULL,
  account_name text NULL,
  cabinet_id int4 NULL,
  CONSTRAINT cabinet_account_account_name_key UNIQUE (account_name),
  CONSTRAINT cabinet_account_pkey PRIMARY KEY (id),
  CONSTRAINT cabinet_account_cabinet_id_fkey FOREIGN KEY (cabinet_id) REFERENCES fact.cabinet(id)
);
And I have a JSON from InvokeHttp which I want to put to database:
{
  "login" : "some_maild@gmail.com",
  "priority_level" : 5,
  "is_archive" : false
}
I'm using QueryRecord with this script:
SELECT
19 AS cabinet_id,
login AS account_name
FROM FLOWFILE
I'm trying to UPSERT in the PutDatabaseRecord processor and got the error:
ERROR: value NULL at column "id"
How to put value for serial column with Apache NiFi?
UPDATE
My JSON looks like (before PutDatabase):
[ {
  "account_name" : "email1@maximagroup.ru",
  "priority_level" : 1000,
  "cabinet_id" : 19
}, {
  "account_name" : "email2@gmail.com",
  "priority_level" : 1,
  "cabinet_id" : 19
}, {
  "account_name" : "email3@umww.com",
  "priority_level" : 1000,
  "cabinet_id" : 19
} ]
PutDatabaseRecord looks like this (configuration screenshot not included):
Try making the operation INSERT rather than UPSERT
An UPSERT needs to check whether the given id exists, to know if it should insert or update, which it can't do as no id is provided.
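As an illustration (a sketch against the table above, not from the thread), a plain INSERT that omits the id column entirely lets the serial default assign it:

```sql
-- Sketch: leave the serial column out of the column list so Postgres
-- fills it from the sequence backing "id".
INSERT INTO fact.cabinet_account (account_name, cabinet_id)
VALUES ('email1@maximagroup.ru', 19);
```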

How can I create a relationship between `json` column and a `int` (id) column in Hasura + Postgres?

I have 2 tables, users and posts.
Table users has the columns id and post; the post column contains an array of the form [1, 2, 3, 4, 5], where 1, 2, 3, 4, 5 are ids in the posts table.
Table posts has the columns id and text.
Table users:
https://i.stack.imgur.com/ywdS7.png
Table posts:
https://i.stack.imgur.com/IBdpb.png
In Hasura I made an array relationship:
https://i.stack.imgur.com/311sd.png
Next, I made the following request:
{
  users_test {
    postz {
      id
    }
  }
}
I would like to receive such data in response:
postz: [
   {
     text: 'qwe'
   },
   {
     text: 'sdf'
   }
]
But with this request, I get the following error:
{
"errors": [
{
"extensions": {
"internal": {
"statement": "SELECT coalesce(json_agg(\"root\" ), '[]' ) AS \"root\" FROM (SELECT row_to_json((SELECT \"_5_e\" FROM (SELECT \"_4_root.ar.root.postz\".\"postz\" AS \"postz\" ) AS \"_5_e\" ) ) AS \"root\" FROM (SELECT * FROM \"public\".\"users_test\" WHERE ('true') ) AS \"_0_root.base\" LEFT OUTER JOIN LATERAL (SELECT coalesce(json_agg(\"postz\" ), '[]' ) AS \"postz\" FROM (SELECT row_to_json((SELECT \"_2_e\" FROM (SELECT \"_1_root.ar.root.postz.base\".\"id\" AS \"id\" ) AS \"_2_e\" ) ) AS \"postz\" FROM (SELECT * FROM \"public\".\"posts\" WHERE ((\"_0_root.base\".\"post\") = (\"id\")) ) AS \"_1_root.ar.root.postz.base\" ) AS \"_3_root.ar.root.postz\" ) AS \"_4_root.ar.root.postz\" ON ('true') ) AS \"_6_root\" ",
"prepared": true,
"error": {
"exec_status": "FatalError",
"hint": "No operator matches the given name and argument type(s). You might need to add explicit type casts.",
"message": "operator does not exist: json = integer",
"status_code": "42883",
"description": null
},
"arguments": [
"(Oid 114,Just (\"{\\\"x-hasura-role\\\":\\\"admin\\\"}\",Binary))"
]
},
"path": "$",
"code": "unexpected"
},
"message": "postgres query error"
}
]
}
What am I doing wrong and how can I fix it?
A few suggestions:
There are some typos in your query, as far as I can tell. Try:
{
  users {
    id
    posts {
      text
    }
  }
}
You don't need the post column on the users table. You just need a user_id column on the posts table, and a foreign key constraint from the posts table to the users table using the user_id and id columns of the tables respectively. Check out the docs here:
https://docs.hasura.io/1.0/graphql/manual/schema/relationships/create.html#step-3-create-an-array-relationship
https://docs.hasura.io/1.0/graphql/manual/schema/relationships/database-modelling/one-to-many.html
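The suggested modelling can be sketched as follows (an assumed minimal column set, not the asker's full schema):

```sql
-- Sketch of the one-to-many modelling described above: posts points
-- back at users via user_id, so no json array column is needed.
CREATE TABLE users (
    id serial PRIMARY KEY
);
CREATE TABLE posts (
    id serial PRIMARY KEY,
    text text,
    user_id integer NOT NULL REFERENCES users (id)
);
```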
If you have to have the post array column for some reason, you can use computed fields to create a "relationship" between a json array and another table’s id.
https://docs.hasura.io/1.0/graphql/manual/schema/computed-fields.html#table-computed-fields
Your function would:
Take in the json array column
Extract the ids
Return select * from table where id in ids
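A sketch of such a function, assuming the users.post column is jsonb (names here are illustrative, not taken from the linked example):

```sql
-- Computed-field sketch: expand the jsonb id array and select the
-- matching rows from posts.
CREATE FUNCTION user_posts(u users)
RETURNS SETOF posts AS $$
    SELECT p.*
    FROM posts p
    WHERE p.id IN (
        SELECT (jsonb_array_elements_text(u.post))::integer
    )
$$ LANGUAGE sql STABLE;
```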
Example:
https://jsonb-relationships-hasura.herokuapp.com/console/api-explorer
Computed field definition at: https://jsonb-relationships-hasura.herokuapp.com/console/data/schema/public/tables/authors/modify
Run these queries:
# Get list of articles for each author
query {
  authors {
    id
    name
    articles
  }
}

# Get actual articles for each author
query {
  authors {
    id
    name
    owned_articles {
      id
      title
    }
  }
}

How to merge old and new values of entity in JPA with shortest code?

I have two tables, Customer and Address:
Customer Table:
CREATE TABLE `customer` (
  `customer_id` smallint(5) unsigned NOT NULL AUTO_INCREMENT,
  `user_name` varchar(45) NOT NULL,
  `password` varchar(45) NOT NULL,
  `encrypt_key` varchar(200) NOT NULL,
  `first_name` varchar(45) NOT NULL,
  `last_name` varchar(45) NOT NULL,
  `email` varchar(50) DEFAULT NULL,
  `active` tinyint(1) NOT NULL DEFAULT '1',
  `last_update` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  `self_description` varchar(2000) NOT NULL,
  PRIMARY KEY (`customer_id`),
  KEY `idx_last_name` (`last_name`)
) ENGINE=InnoDB AUTO_INCREMENT=101 DEFAULT CHARSET=utf8;
Address Table:
CREATE TABLE `address` (
  `account_id` smallint(5) unsigned NOT NULL AUTO_INCREMENT,
  `customer_id` smallint(5) unsigned NOT NULL,
  `address_type` varchar(15) NOT NULL, -- Office, Branch-1, Branch-2,
  `door_num` varchar(50) NOT NULL,
  `landmark` varchar(150) DEFAULT NULL,
  `street` varchar(50) DEFAULT NULL,
  `area_name` varchar(25) NOT NULL,
  `district` varchar(25) NOT NULL,
  `city` varchar(25) NOT NULL,
  `postal_code` varchar(10) DEFAULT NULL,
  `phone1` varchar(20) NOT NULL,
  `phone2` varchar(20),
  `last_update` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`account_id`),
  KEY `idx_fk_city` (`city`)
) ENGINE=InnoDB AUTO_INCREMENT=201 DEFAULT CHARSET=utf8;
In Customer.java
@OneToMany(fetch = FetchType.EAGER, cascade = CascadeType.ALL)
@JoinColumn(name = "customer_id", nullable = false)
private List<Address> addresses;
In Address.java
Nothing regarding customer, because I am using uni-direction.
In CustomerDaoImpl.java
public boolean updateEntity(Customer customer) {
    session = sessionFactory.openSession();
    tx = session.beginTransaction();
    session.saveOrUpdate(customer);
    tx.commit();
    session.close();
    return false;
}
The issue is that when I do an update, it creates a new user every time. But I need to update the customer along with its child addresses and theme objects.
My request body:
{
  "customerId": 102,
  "addresses": [
    {
      "accountId": 203,
      "addressType": "main office",
      "areaName": "area3",
      "city": "city3",
      "district": "district3",
      "doorNum": "89",
      "landmark": "landmark3",
      "phone1": "646432365465",
      "phone2": "4534542355675",
      "postalCode": "453245",
      "street": "street3"
    }
  ],
  "active": 1,
  "email": "bbb@gmail.com",
  "encryptKey": "wwwwwfsad",
  "firstName": "ccc",
  "lastName": "ddd",
  "password": "user2",
  "selfDescription": "user2",
  "userName": "user2",
  "theme": {
    "themeId": 402,
    "description": "theme2",
    "name": "theme2",
    "categoryId": 301
  }
}
What changes do I have to do?
If you want to update a record in the database, you first have to retrieve the record from the database into the Persistence Context, then update its column values. I think you are trying to update database records directly, and that is wrong: it just creates a new object with a different id and saves it in the database.
Customer retrievedCustomer = session.get(Customer.class, customer.getCustomerId());
// make your changes on retrievedCustomer ...
session.update(retrievedCustomer);
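The retrieve-then-update advice can be illustrated with a plain-Java sketch (a hypothetical trimmed-down Customer class, no Hibernate involved): copy only the non-null incoming values onto the retrieved entity, so fields the client did not send keep their old values.

```java
// Minimal sketch of merging incoming values into a retrieved entity.
// The Customer class here is hypothetical and trimmed to two fields.
class Customer {
    Long customerId;
    String email;
    String firstName;
}

class CustomerMerger {
    // Copy only the non-null incoming values onto the managed entity,
    // so unspecified fields keep their old values.
    static Customer merge(Customer managed, Customer incoming) {
        if (incoming.email != null) managed.email = incoming.email;
        if (incoming.firstName != null) managed.firstName = incoming.firstName;
        return managed;
    }
}

public class MergeDemo {
    public static void main(String[] args) {
        Customer managed = new Customer();
        managed.customerId = 102L;
        managed.email = "old@example.com";
        managed.firstName = "ccc";

        Customer incoming = new Customer();
        incoming.email = "bbb@example.com"; // only the email changes

        Customer merged = CustomerMerger.merge(managed, incoming);
        System.out.println(merged.email + " " + merged.firstName);
        // prints "bbb@example.com ccc"
    }
}
```

The merged object would then be passed to session.update(...) in the real DAO.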
