How to return a DTO - Spring

There is a select that returns:
current_id -> integer
current_date -> date
current_position -> character_varying (255)
current_relevant_url -> character_varying (255)
current_restrictions_probability -> character_varying (255)
current_url_changed -> character_varying (255)
current_identifier_id -> integer
phrase_id -> integer
previous_id -> integer
previous_date -> date
previous_position -> character_varying (255)
previous_relevant_url -> character_varying (255)
previous_restrictions_probability -> character_varying (255)
previous_url_changed -> character_varying (255)
previous_identifier_id -> integer
Here it is:
@Query(value = """
select
current.id as current_id,
current.date as current_date,
current.position as current_position,
current.relevant_url as current_relevant_url,
current.restrictions_probability as current_restrictions_probability,
current.url_changed as current_url_changed,
current.identifier_id as current_identifier_id,
current.phrase_id as phrase_id,
previous.id as previous_id,
previous.date as previous_date,
previous.position as previous_position,
previous.relevant_url as previous_relevant_url,
previous.restrictions_probability as previous_restrictions_probability,
previous.url_changed as previous_url_changed,
previous.identifier_id as previous_identifier_id
from (
select
id,
date,
position,
relevant_url,
restrictions_probability,
url_changed,
identifier_id,
phrase_id
from overoptimisation
where identifier_id = :currentOveroptimisationIdentifierId) current
left join (
select
id,
date,
position,
relevant_url,
restrictions_probability,
url_changed,
identifier_id,
phrase_id
from overoptimisation
where identifier_id = :previousOveroptimisationIdentifierId) previous
on current.phrase_id = previous.phrase_id
""",
nativeQuery = true)
List<OveroptimisationDto> report(@Param("previousOveroptimisationIdentifierId") Integer previousOveroptimisationIdentifierId,
@Param("currentOveroptimisationIdentifierId") Integer currentOveroptimisationIdentifierId);
DTO:
@Getter
@Setter
public class OveroptimisationDto {
Integer currentId;
Date currentDate;
String currentPosition;
String currentRelevantUrl;
String currentRestrictionsProbability;
String currentUrlChanged;
Integer currentIdentifierId;
Integer phraseId;
Integer previousId;
Date previousDate;
String previousPosition;
String previousRelevantUrl;
String previousRestrictionsProbability;
String previousUrlChanged;
Integer previousIdentifier_id;
}
Error:
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [org.springframework.data.jpa.repository.query.AbstractJpaQuery$TupleConverter$TupleBackedMap] to type [com.example.marketing3.arsenkin.dtos.OveroptimisationDto]
Have I made a mistake somewhere in naming?

As mentioned in @Jens' answer, you should create an interface for the projection.
public interface Overoptimisation {
Integer getCurrentId();
Date getCurrentDate();
String getCurrentPosition();
String getCurrentRelevantUrl();
String getCurrentRestrictionsProbability();
String getCurrentUrlChanged();
Integer getCurrentIdentifierId();
Integer getPhraseId();
Integer getPreviousId();
Date getPreviousDate();
String getPreviousPosition();
String getPreviousRelevantUrl();
String getPreviousRestrictionsProbability();
String getPreviousUrlChanged();
Integer getPreviousIdentifierId();
}
Then you should declare your method like:
@Query(value = """
select
current.id as current_id,
current.date as current_date,
current.position as current_position,
current.relevant_url as current_relevant_url,
current.restrictions_probability as current_restrictions_probability,
current.url_changed as current_url_changed,
current.identifier_id as current_identifier_id,
current.phrase_id as phrase_id,
previous.id as previous_id,
previous.date as previous_date,
previous.position as previous_position,
previous.relevant_url as previous_relevant_url,
previous.restrictions_probability as previous_restrictions_probability,
previous.url_changed as previous_url_changed,
previous.identifier_id as previous_identifier_id
from (
select
id,
date,
position,
relevant_url,
restrictions_probability,
url_changed,
identifier_id,
phrase_id
from overoptimisation
where identifier_id = :currentOveroptimisationIdentifierId) current
left join (
select
id,
date,
position,
relevant_url,
restrictions_probability,
url_changed,
identifier_id,
phrase_id
from overoptimisation
where identifier_id = :previousOveroptimisationIdentifierId) previous
on current.phrase_id = previous.phrase_id
""",
nativeQuery = true)
List<Overoptimisation> report(@Param("previousOveroptimisationIdentifierId") Integer previousOveroptimisationIdentifierId,
@Param("currentOveroptimisationIdentifierId") Integer currentOveroptimisationIdentifierId);

Spring Data JPA will not map a native query result onto a plain DTO class like this, but you may use an interface as the return type and it should work just fine.

Just an addition:
Actually there are 3 types of projections in Spring Data JPA:
Interface-based Projections (as used above)
Class-based Projections (DTOs)
Dynamic Projections
You can read more about them and how to use them here.
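For comparison, a class-based (DTO) projection uses a JPQL constructor expression instead of a native query. The following is only a minimal sketch: it assumes an Overoptimisation entity with id, position and identifierId properties (the entity mapping is not shown in the question), and the DTO name is made up for illustration.
// Hypothetical DTO for a class-based projection; all fields are set through the constructor.
public class PositionDto {
    private final Integer id;
    private final String position;

    public PositionDto(Integer id, String position) {
        this.id = id;
        this.position = position;
    }

    public Integer getId() { return id; }
    public String getPosition() { return position; }
}

// Repository method: note this is JPQL (a constructor expression), not a native query.
@Query("select new com.example.marketing3.arsenkin.dtos.PositionDto(o.id, o.position) " +
       "from Overoptimisation o where o.identifierId = :identifierId")
List<PositionDto> findPositions(@Param("identifierId") Integer identifierId);
A dynamic projection is the same idea with the target type passed as an extra Class<T> parameter on the repository method.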

Related

How do I use Room's prepackagedDatabaseCallback?

With version 2.3.0-alpha03, Room has a PrepackagedDatabaseCallback. The documentation says:-
This callback will be invoked after the pre-package DB is copied but before Room had a chance to open it and therefore before the RoomDatabase.Callback methods are invoked. This callback can be useful for updating the pre-package DB schema to satisfy Room's schema validation.
So how could I use this to circumvent the Invalid Schema expected .... found ....?
Could I use this to introduce Triggers, as room doesn't have annotations for the creation of triggers?
Note: this is intended as an ask-and-answer-your-own-question post.
So how could I use this to circumvent the Invalid Schema expected .... found ....?
The following, albeit long-winded, example shows how to correct a schema accordingly.
Could I use this to introduce Triggers, as Room doesn't have annotations for the creation of triggers?
Yes; although this hasn't actually been tested, the example caters for the creation of triggers.
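For instance, a trigger could be created in the callback with a plain execSQL call. This is a minimal sketch only: the trigger name, the person_log table and its column are invented for illustration (they are not part of the example below), and the callback would be passed to createFromAsset just like prePkgDbCallback later on.
// Sketch: create a trigger on the copied database before Room opens it.
// person_log and log_person_insert are hypothetical; Person matches the entity used below.
RoomDatabase.PrepackagedDatabaseCallback triggerCallback =
        new RoomDatabase.PrepackagedDatabaseCallback() {
    @Override
    public void onOpenPrepackagedDatabase(@NonNull SupportSQLiteDatabase db) {
        super.onOpenPrepackagedDatabase(db);
        // Room has no annotation for triggers, so issue the DDL by hand.
        db.execSQL("CREATE TABLE IF NOT EXISTS person_log (personid INTEGER)");
        db.execSQL("CREATE TRIGGER IF NOT EXISTS log_person_insert "
                + "AFTER INSERT ON Person "
                + "BEGIN INSERT INTO person_log VALUES(new.personid); END");
    }
};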
Notes regarding the example
First, it should be noted that when the callback was initially looked at, there were issues. These have now been addressed, but require 2.4.0-beta02 or greater. The issues were:-
Not allowing version 0 in the pre-packaged database (often this would be the case)
Empty Database passed to PrePackagedDatabaseCallback
Database updates applied in PrePackagedDatabaseCallback Lost/Undone
The fix applied was:
Correctly open pre-package database during PrepackagedDatabaseCallback invocation.
The actual pre-packaged database was not being opened because only the file name, rather than the full path, was being used when creating the temporary open helper. The framework open helper will only open an existing file if the name of the database is a path (starts with '/'); otherwise it creates a new file. This change fixes it by using the absolute file path as the name.
The Flow
When control is passed to the callback the asset database has been copied. So it's just a matter, in theory, of ALTERing the TABLES and VIEWS (if any) to match the schema that room expects.
Often, trying to match what Room expects with what Room finds frustrates people. The example below will overcome minor/small/often-missed issues. It works by:-
Dropping all non-sqlite or non-android components other than the tables (i.e. views, indexes and triggers from the copied assets database),
Renaming the tables to allow the ones that room expects to be created,
Creating new tables using Room's expected schema,
as copied from the java that Room generates
Copying the data from the renamed (asset) tables into the newly created tables
INSERT OR IGNORE has been used so constraint conflicts, such as NOT NULL, UNIQUE, CHECK will not result in an exception (change INSERT OR IGNORE to just INSERT to fail)
Creating the other components that Room expects (views, indexes and triggers (note Room only has triggers for FTS tables)).
Dropping the now redundant renamed tables
Doing a VACUUM to clean up the database,
and finally adding any triggers.
It does expect that the source (asset) has the columns in the correct order, and that nulls are not present in columns on which Room has a NOT NULL constraint. However, as INSERT OR IGNORE is used, such rows will simply not be inserted rather than causing an exception.
Other than copying some code from the generated Java, the process is automated and should cope with most/many assets without modification.
The Code
The vast majority of the code is in the @Database class OtherDatabase. Note that this should cope with many databases with a little tailoring (see the comments for changes to make):-
@SuppressLint({"Range"}) /*<<<<< used due to bug/issue with getColumnIndex introduced with SDK 31 */
@Database(
entities = {Person.class, Company.class, CompanyPersonMap.class}, //<<<<< CHANGED ACCORDINGLY
version = OtherDatabase.DATABASE_VERSION, /* note due to shared usage of DBVERSION OtherDatabase used <<<<< CHANGE ACCORDINGLY */
exportSchema = false /* change to suit */
)
@TypeConverters({AllTypeConverters.class})
abstract class OtherDatabase extends RoomDatabase {
public static final String DATABASE_NAME = "otherprepackageddatabasecallbacktest.db"; //<<<<< CHANGE AS REQUIRED
public static final int DATABASE_VERSION = 1; //<<<<< CHANGE AS REQUIRED
public static final String ASSET_FILE_NAME = "prepackageddatabasecallbacktest.db"; //<<<<< CHANGED AS REQUIRED
/**
* sqlite_master table and column names !!!!!DO NOT CHANGE!!!!!
*/
private static final String SQLITE_MASTER_TABLE_NAME = "sqlite_master";
private static final String SQLITE_MASTER_COL_NAME = "name";
private static final String SQLITE_MASTER_COL_TYPE = "type";
private static final String SQLITE_MASTER_COL_TABLE_NAME = "tbl_name";
private static final String SQLITE_MASTER_COL_SQL = "sql";
abstract AllDao getAllDao(); //<<<<< CHANGE ACCORDINGLY
private static volatile OtherDatabase instance = null; //<<<<< CHANGE ACCORDINGLY
public static OtherDatabase /*<<<< CHANGE ACCORDINGLY */ getInstance(Context context) {
if (instance == null) {
instance = Room.databaseBuilder(context, OtherDatabase.class, OtherDatabase.DATABASE_NAME)
.allowMainThreadQueries() /*<<<<< USED FOR BREVITY CONVENIENCE */
.createFromAsset(ASSET_FILE_NAME, prePkgDbCallback) /* 2nd parameter is the THE CALL BACK to invoke */
.build();
}
return instance;
}
/* THE CALLBACK */
static final PrepackagedDatabaseCallback prePkgDbCallback = new PrepackagedDatabaseCallback() {
final static String assetTablePrefix = "asset_"; /* prefix used for renamed tables - should not need to be changed */
private List<SQLiteMasterComponent> sqLiteMasterComponentArray; /* store for sqlite_master extract */
@Override
public void onOpenPrepackagedDatabase(@NonNull SupportSQLiteDatabase db) {
super.onOpenPrepackagedDatabase(db);
sqLiteMasterComponentArray = buildComponentsList(db); /* gets relevant rows from sqlite_master */
dropNonTableComponents(sqLiteMasterComponentArray,db); /* everything except the tables */
renameTableComponents(sqLiteMasterComponentArray, assetTablePrefix,db); /* rename the tables using prefix */
/*<<<<< TAILOR (the db.execSQL's below) AS APPROPRIATE - SEE COMMENTS THAT FOLLOW >>>>>*/
/* copied from the @Database classes generated java e.g. leave indexes till later
_db.execSQL("CREATE TABLE IF NOT EXISTS `Person` (`personid` INTEGER, `firstName` TEXT, `lastName` TEXT, `middleNames` TEXT, `dateOfBirth` INTEGER, PRIMARY KEY(`personid`))");
_db.execSQL("CREATE INDEX IF NOT EXISTS `index_Person_firstName` ON `Person` (`firstName`)");
_db.execSQL("CREATE INDEX IF NOT EXISTS `index_Person_lastName` ON `Person` (`lastName`)");
_db.execSQL("CREATE TABLE IF NOT EXISTS `company` (`companyid` INTEGER, `companyName` TEXT, `city` TEXT, `state` TEXT, `country` TEXT, `notes` TEXT, PRIMARY KEY(`companyid`))");
_db.execSQL("CREATE TABLE IF NOT EXISTS `company_person_map` (`companyid_map` INTEGER NOT NULL, `personid_map` INTEGER NOT NULL, PRIMARY KEY(`companyid_map`, `personid_map`), FOREIGN KEY(`companyid_map`) REFERENCES `company`(`companyid`) ON UPDATE CASCADE ON DELETE CASCADE , FOREIGN KEY(`personid_map`) REFERENCES `Person`(`personid`) ON UPDATE CASCADE ON DELETE CASCADE )");
_db.execSQL("CREATE INDEX IF NOT EXISTS `index_company_person_map_personid_map` ON `company_person_map` (`personid_map`)");
*/
/* Create the tables as per Room definitions - ***** CREATE TABLES COPIED FROM GENERATED JAVA *****
only indexes, views, triggers (for FTS) should be done after the data has been copied so :-
data is loaded faster as no index updates are required.
triggers don't get triggered when loading the data which could result in unexpected results
*/
db.execSQL("CREATE TABLE IF NOT EXISTS `Person` (`personid` INTEGER, `firstName` TEXT, `lastName` TEXT, `middleNames` TEXT, `dateOfBirth` INTEGER, PRIMARY KEY(`personid`));");
db.execSQL("CREATE TABLE IF NOT EXISTS `company` (`companyid` INTEGER, `companyName` TEXT, `city` TEXT, `state` TEXT, `country` TEXT, `notes` TEXT, PRIMARY KEY(`companyid`))");
db.execSQL("CREATE TABLE IF NOT EXISTS `company_person_map` (`companyid_map` INTEGER NOT NULL, `personid_map` INTEGER NOT NULL, PRIMARY KEY(`companyid_map`, `personid_map`), FOREIGN KEY(`companyid_map`) REFERENCES `company`(`companyid`) ON UPDATE CASCADE ON DELETE CASCADE , FOREIGN KEY(`personid_map`) REFERENCES `Person`(`personid`) ON UPDATE CASCADE ON DELETE CASCADE )");
copyData(sqLiteMasterComponentArray, assetTablePrefix,db); /* copy the data from the renamed asset tables to the newly created Room tables */
/* Create the other Room components - ***** CREATE ? COPIED FROM GENERATED JAVA *****
Now that data has been copied create other Room components indexes, views and triggers
(triggers would only be for FTS (full text search))
again sourced from generated Java
*/
db.execSQL("CREATE INDEX IF NOT EXISTS `index_Person_firstName` ON `Person` (`firstName`)");
db.execSQL("CREATE INDEX IF NOT EXISTS `index_Person_lastName` ON `Person` (`lastName`)");
db.execSQL("CREATE INDEX IF NOT EXISTS `index_company_person_map_personid_map` ON `company_person_map` (`personid_map`)");
dropRenamedTableComponents(sqLiteMasterComponentArray, assetTablePrefix,db); /* done with the renamed tables so drop them */
db.execSQL("VACUUM"); /* cleanup the database */
createTriggers(sqLiteMasterComponentArray,db); /* create any triggers */
}
};
static int dropNonTableComponents(List<SQLiteMasterComponent> components, SupportSQLiteDatabase db) {
int rv = 0;
for(SQLiteMasterComponent c: components) {
if (!c.type.equals("table") ) {
db.execSQL("DROP " + c.type + " IF EXISTS " + c.name);
rv++;
}
}
return rv;
}
static int dropRenamedTableComponents(List<SQLiteMasterComponent> components, String prefix, SupportSQLiteDatabase db) {
int rv = 0;
int maxForeignKeyCount = 0;
for (SQLiteMasterComponent c: components) {
if (c.type.equals("table") && c.foreignKeyCount > maxForeignKeyCount) {
maxForeignKeyCount = c.foreignKeyCount;
}
}
for (int i= maxForeignKeyCount; i >= 0; i--) {
for (SQLiteMasterComponent c: components) {
if (c.type.equals("table") && c.foreignKeyCount == i) {
db.execSQL("DROP " + c.type + " IF EXISTS " + prefix + c.name);
rv++;
}
}
}
return rv;
}
static int renameTableComponents(List<SQLiteMasterComponent> components, String prefix, SupportSQLiteDatabase db) {
int rv = 0;
db.execSQL("PRAGMA foreign_keys = ON"); // Just in case turn foreign keys on
for(SQLiteMasterComponent c: components) {
if (c.type.equals("table")) {
db.execSQL("ALTER TABLE " + c.name + " RENAME TO " + prefix + c.name);
rv++;
}
}
return rv;
}
/*
NOTE tables with fewest Foreign Key definitions done first
NOTE makes an assumption that this will not result in FK conflicts
TODO should really be amended to ensure that a table with FK's is only attempted when all of it's parent tables have been loaded
*/
static int copyData(List<SQLiteMasterComponent> components, String prefix, SupportSQLiteDatabase db) {
int rv = 0;
int maxForeignKeyCount = 0;
for (SQLiteMasterComponent c: components) {
if (c.type.equals("table") && c.foreignKeyCount > 0) {
maxForeignKeyCount = c.foreignKeyCount;
}
}
for (int i=0; i <= maxForeignKeyCount; i++) {
for (SQLiteMasterComponent c: components) {
if (c.type.equals("table") && c.foreignKeyCount == i) {
db.execSQL("INSERT OR IGNORE INTO " + c.name + " SELECT * FROM " + prefix + c.name + ";");
rv++;
}
}
}
return rv;
}
static int createTriggers(List<SQLiteMasterComponent> components, SupportSQLiteDatabase db) {
int rv = 0;
for (SQLiteMasterComponent c: components) {
if (c.type.equals("trigger")) {
//TODO should really check if the sql includes IF NOT EXISTS and if not add IF NOT EXISTS
db.execSQL(c.sql);
rv++;
}
}
return rv;
}
/**
* Build the list of required SQLiteMasterComponents to save having to access
* sqlite_master many times.
* @param db the SupportSQLiteDatabase to access
* @return the list of SQLiteMasterComponents extracted
*/
@SuppressLint("Range")
static List<SQLiteMasterComponent> buildComponentsList(SupportSQLiteDatabase db) {
final String FOREIGN_KEY_FLAG_COLUMN = "foreign_key_flag";
ArrayList<SQLiteMasterComponent> rv = new ArrayList<>();
Cursor csr = db.query("SELECT *," +
/* Column to indicate whether or not FK constraints appear to have been defined
* NOTE!! can be fooled
* e.g. if a column is defined as `my badly named FOREIGN KEY column ` ....
* */
"(" +
"instr(" + SQLITE_MASTER_COL_SQL + ",'FOREIGN KEY') > 0) + " +
"(instr(" + SQLITE_MASTER_COL_SQL + ",' REFERENCES ')> 0) " +
"AS " + FOREIGN_KEY_FLAG_COLUMN + " " +
"FROM " + SQLITE_MASTER_TABLE_NAME + " " +
/* do not want any sqlite tables or android tables included */
"WHERE lower(" + SQLITE_MASTER_COL_NAME + ") NOT LIKE 'sqlite_%' AND lower(" + SQLITE_MASTER_COL_NAME + ") NOT LIKE 'android_%'");
while (csr.moveToNext()) {
SQLiteMasterComponent component = new SQLiteMasterComponent(
csr.getString(csr.getColumnIndex(SQLITE_MASTER_COL_NAME)),
csr.getString(csr.getColumnIndex(SQLITE_MASTER_COL_TYPE)),
csr.getString(csr.getColumnIndex(SQLITE_MASTER_COL_TABLE_NAME)),
csr.getString(csr.getColumnIndex(SQLITE_MASTER_COL_SQL)),
csr.getInt(csr.getColumnIndex(FOREIGN_KEY_FLAG_COLUMN)),
0);
if (csr.getInt(csr.getColumnIndex(FOREIGN_KEY_FLAG_COLUMN)) > 0) {
component.foreignKeyCount = component.getForeignKeyCount(db);
}
rv.add(component);
}
csr.close();
return (List<SQLiteMasterComponent>) rv;
}
/**
* Class to hold a row from sqlite_master
*/
private static class SQLiteMasterComponent {
private String name;
private String type;
private String owningTable;
private String sql;
private int foreignKeyFlag = 0;
private int foreignKeyCount = 0;
SQLiteMasterComponent(String name, String type, String owningTable, String sql, int foreignKeyFlag, int foreignKeyCount) {
this.name = name;
this.type = type;
this.owningTable = owningTable;
this.sql = sql;
this.foreignKeyFlag = foreignKeyFlag;
this.foreignKeyCount = foreignKeyCount;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public String getOwningTable() {
return owningTable;
}
public void setOwningTable(String owningTable) {
this.owningTable = owningTable;
}
public String getSql() {
return sql;
}
public void setSql(String sql) {
this.sql = sql;
}
public int getForeignKeyFlag() {
return foreignKeyFlag;
}
public void setForeignKeyFlag(int foreignKeyFlag) {
this.foreignKeyFlag = foreignKeyFlag;
}
public boolean isForeignKey() {
return this.foreignKeyFlag > 0;
}
public int getForeignKeyCount() {
return foreignKeyCount;
}
public void setForeignKeyCount(int foreignKeyCount) {
this.foreignKeyCount = foreignKeyCount;
}
/**
* Retrieve the number of rows returned by PRAGMA foreign_key_list
* @param db The SupportSQLiteDatabase to access
* @return The number of rows, i.e. the number of Foreign Key constraints
*/
private int getForeignKeyCount(SupportSQLiteDatabase db) {
int rv =0;
Cursor csr = db.query("SELECT count(*) FROM pragma_foreign_key_list('" + this.name + "');");
if (csr.moveToFirst()) {
rv = csr.getInt(0);
}
csr.close();
return rv;
}
}
}
Working Example
The Asset database
The asset database contains 3 tables, person, company and company_person_map
person DDL is
CREATE TABLE person (personid INTEGER PRIMARY KEY, firstName TEXT, lastName TEXT, middleNames TEXT, dateOfBirth DATE);
Room expects:-
CREATE TABLE IF NOT EXISTS `Person` (`personid` INTEGER, `firstName` TEXT, `lastName` TEXT, `middleNames` TEXT, `dateOfBirth` INTEGER, PRIMARY KEY(`personid`))
note the type DATE v INTEGER for the dateOfBirth column, which Room would not accept, hence an expected ... found .... error
company DDL is
CREATE TABLE company (companyid INTEGER PRIMARY KEY, companyName TEXT, city TEXT, state TEXT, country TEXT, notes TEXT);
Room expects:-
CREATE TABLE IF NOT EXISTS `company` (`companyid` INTEGER, `companyName` TEXT, `city` TEXT, `state` TEXT, `country` TEXT, `notes` TEXT, PRIMARY KEY(`companyid`))
both are comparable, so Room would accept it
company_person_map DDL is
CREATE TABLE company_person_map (
companyid_map INTEGER NOT NULL REFERENCES company(companyid) ON DELETE CASCADE ON UPDATE CASCADE,
personid_map INTEGER NOT NULL REFERENCES person(personid) ON DELETE CASCADE ON UPDATE CASCADE,
PRIMARY KEY (companyid_map, personid_map));
Room expects:-
CREATE TABLE IF NOT EXISTS `company_person_map` (
`companyid_map` INTEGER NOT NULL,`personid_map` INTEGER NOT NULL,
PRIMARY KEY(`companyid_map`, `personid_map`),
FOREIGN KEY(`companyid_map`) REFERENCES `company`(`companyid`) ON UPDATE CASCADE ON DELETE CASCADE ,
FOREIGN KEY(`personid_map`) REFERENCES `Person`(`personid`) ON UPDATE CASCADE ON DELETE CASCADE )
although the statements differ significantly in text, they are comparable, so Room would accept it
The asset tables person, company and company_person_map are populated with sample data; person 11 is not mapped to a company.
The AllDao interface is not reproduced here; none of its inserts are used, and not all of its queries either.
The invoking Activity is:-
public class MainActivity extends AppCompatActivity {
OtherDatabase dbOther;
AllDao daoOther;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
dbOther = OtherDatabase.getInstance(this);
daoOther = dbOther.getAllDao();
for(Person p: daoOther.getAllPeople()) {
Log.d("DBINFO",
"Person is " + p.firstName
+ " " + p.middleNames
+ " " + p.lastName
+ " Date of Birth is " + p.dateOfBirth
+ " ID is " + p.id
);
}
for(CompanyWithPeople cwp: daoOther.getCompanyWithPeople()) {
logCompany(cwp.company,"DBINFO","");
for(Person p: cwp.person) {
logPerson(p,"DBINFO","\n\t");
}
}
}
void logCompany(Company c, String tag, String lineFeed) {
Log.d(tag,lineFeed + "Company is " + c.companyName
+ " Location is " + c.city + ", " + c.state + ", " + c.country
+ " ID is " + c.companyId
+ "\n\t Notes: " + c.notes
);
}
void logPerson(Person p, String tag, String lineFeed) {
Log.d(tag, lineFeed + p.firstName + " " + p.middleNames
+ " " + p.lastName + " Date of Birth is " + p.dateOfBirth
+ " ID is " + p.id
);
}
}
Result
2021-11-28 19:56:01.928 D/DBINFO: Person is Robert John Smith Date of Birth is Sat Dec 27 17:46:33 GMT+10:00 1969 ID is 1
2021-11-28 19:56:01.929 D/DBINFO: Person is Julie Mary Armstrong Date of Birth is Mon Jan 12 23:22:04 GMT+10:00 1970 ID is 2
2021-11-28 19:56:01.929 D/DBINFO: Person is Andrea Susan Stewart Date of Birth is Mon Jan 05 04:56:09 GMT+10:00 1970 ID is 3
2021-11-28 19:56:01.929 D/DBINFO: Person is Mary Belinda Allway Date of Birth is Mon Jan 12 00:15:21 GMT+10:00 1970 ID is 4
2021-11-28 19:56:01.929 D/DBINFO: Person is Lisa Elizabeth Brooks Date of Birth is Sat Jan 03 03:51:21 GMT+10:00 1970 ID is 5
2021-11-28 19:56:01.930 D/DBINFO: Person is Stephen Colin Cobbs Date of Birth is Tue Jan 06 14:01:55 GMT+10:00 1970 ID is 6
2021-11-28 19:56:01.930 D/DBINFO: Person is Breane Cath Davidson Date of Birth is Thu Jan 01 22:30:14 GMT+10:00 1970 ID is 7
2021-11-28 19:56:01.930 D/DBINFO: Person is Trevor Arthur Frankston Date of Birth is Sat Jan 10 03:47:02 GMT+10:00 1970 ID is 8
2021-11-28 19:56:01.930 D/DBINFO: Person is George Howard Erksine Date of Birth is Sun Jan 11 00:47:02 GMT+10:00 1970 ID is 9
2021-11-28 19:56:01.930 D/DBINFO: Person is Uriah Stanley Jefferson Date of Birth is Mon Dec 29 19:11:31 GMT+10:00 1969 ID is 10
2021-11-28 19:56:01.931 D/DBINFO: Person is Valerie Alana Singleton Date of Birth is Thu Jan 01 11:45:07 GMT+10:00 1970 ID is 11
2021-11-28 19:56:01.931 D/DBINFO: Person is Vladimir Oscar Whitworth Date of Birth is Sat Jan 10 00:29:45 GMT+10:00 1970 ID is 12
2021-11-28 19:56:01.936 D/DBINFO: Company is Allbright Construction Location is Sydney, NSW, Australia ID is 1
Notes:
2021-11-28 19:56:01.936 D/DBINFO: Julie Mary Armstrong Date of Birth is Mon Jan 12 23:22:04 GMT+10:00 1970 ID is 2
2021-11-28 19:56:01.936 D/DBINFO: Mary Belinda Allway Date of Birth is Mon Jan 12 00:15:21 GMT+10:00 1970 ID is 4
2021-11-28 19:56:01.937 D/DBINFO: Stephen Colin Cobbs Date of Birth is Tue Jan 06 14:01:55 GMT+10:00 1970 ID is 6
2021-11-28 19:56:01.937 D/DBINFO: Trevor Arthur Frankston Date of Birth is Sat Jan 10 03:47:02 GMT+10:00 1970 ID is 8
2021-11-28 19:56:01.937 D/DBINFO: Uriah Stanley Jefferson Date of Birth is Mon Dec 29 19:11:31 GMT+10:00 1969 ID is 10
2021-11-28 19:56:01.937 D/DBINFO: Vladimir Oscar Whitworth Date of Birth is Sat Jan 10 00:29:45 GMT+10:00 1970 ID is 12
2021-11-28 19:56:01.937 D/DBINFO: Company is Dextronics Location is Slough, Berkshire, England ID is 2
Notes:
2021-11-28 19:56:01.937 D/DBINFO: Robert John Smith Date of Birth is Sat Dec 27 17:46:33 GMT+10:00 1969 ID is 1
2021-11-28 19:56:01.938 D/DBINFO: Andrea Susan Stewart Date of Birth is Mon Jan 05 04:56:09 GMT+10:00 1970 ID is 3
2021-11-28 19:56:01.938 D/DBINFO: Lisa Elizabeth Brooks Date of Birth is Sat Jan 03 03:51:21 GMT+10:00 1970 ID is 5
2021-11-28 19:56:01.938 D/DBINFO: Breane Cath Davidson Date of Birth is Thu Jan 01 22:30:14 GMT+10:00 1970 ID is 7
2021-11-28 19:56:01.938 D/DBINFO: George Howard Erksine Date of Birth is Sun Jan 11 00:47:02 GMT+10:00 1970 ID is 9

SQLRPGLE & JSON_OBJECT CTE Statements -101 Error

This program compiles correctly (we are on V7R3), but when run it receives an SQLCOD of -101 and an SQLSTATE of 54011, which states: Too many columns were specified for a table, view, or table function. This is a very small JSON document being created, so I do not think that is the issue.
The RPGLE code:
dcl-s OutFile sqltype(dbclob_file);
xfil_tofile = '/ServiceID-REFCODJ.json';
Clear OutFile;
OutFile_Name = %TrimR(XFil_ToFile);
OutFile_NL = %Len(%TrimR(OutFile_Name));
OutFile_FO = IFSFileCreate;
OutFile_FO = IFSFileOverWrite;
exec sql
With elm (erpRef) as (select json_object
('ServiceID' VALUE trim(s.ServiceID),
'ERPReferenceID' VALUE trim(i.RefCod) )
FROM PADIMH I
INNER JOIN PADGUIDS G ON G.REFCOD = I.REFCOD
INNER JOIN PADSERV S ON S.GUID = G.GUID
WHERE G.XMLTYPE = 'Service')
, arr (arrDta) as (values json_array (
select erpRef from elm format json))
, erpReferences (refs) as ( select json_object ('erpReferences' :
arrDta Format json) from arr)
, headerData (hdrData) as (select json_object(
'InstanceName' : trim(Cntry) )
from padxmlhdr
where cntry = 'US')
VALUES (
select json_object('header' : hdrData format json,
'erpReferenceData' value refs format json)
from headerData, erpReferences )
INTO :OutFile;
Any help with this would be very much appreciated; this is our first attempt at creating JSON for sending and we have not experienced this issue before.
Thanks,
John
I am sorry for the delay in getting back to this issue. It has been corrected; the issue was with the VALUES statement.
This is the corrected code needed to make it work:
Select json_object('header' : hdrData format json,
'erpReferenceData' value refs format json)
INTO :OutFile
From headerData, erpReferences

Query returning -> ORA-19011: Character string buffer too small

I'm running the query below, and Oracle returns the following message: ORA-19011: Character string buffer too small.
When I uncomment the last line it works; I think that's because it then returns fewer rows. How can I solve this?
select substr(xmlagg(xmlelement (e, str||',')).extract ('//text()'),1,(Length(Trim(xmlagg (xmlelement (e, str||',')).extract ('//text()')))-2)) motivos
FROM (
select to_char(lo_valor) str
from editor_layout el,
editor_versao_documento ev,
editor_documento ed,
editor_layout_campo elc,
editor_campo ec,
editor_registro er,
editor_registro_campo erc,
pw_editor_clinico pec,
pw_documento_clinico pdc
where pec.cd_editor_registro = er.cd_registro
and pdc.cd_documento_clinico = pec.cd_documento_clinico
and ed.cd_documento = ev.cd_documento
and el.cd_versao_documento = ev.cd_versao_documento
and el.cd_layout = elc.cd_layout
and elc.cd_campo = ec.cd_campo
and elc.cd_campo = erc.cd_campo
and er.cd_registro = erc.cd_registro
and er.cd_layout = el.cd_layout
and ec.ds_identificador = 'nut_orientacoes_entregues_1'
AND To_Char(lo_valor) IS NOT null
--and Trunc(pdc.dh_documento) BETWEEN Trunc(SYSDATE-5) AND Trunc(SYSDATE)
)
The xmlagg(xmlelement ...) call returns a CLOB, and you are trying to stuff it into a VARCHAR2, since you call substr on it, for example. That will fail if the CLOB is too large.
The reason it works when you uncomment the last line is that, outside that date range, there are rows whose contents make the aggregated string too long; the filter excludes them.

Parameters in NamedQuery

I am working with JSF 2.0, PrimeFaces 3.4, Oracle, and NetBeans 7.2.1, and I get the following error:
ADVERTENCIA: Local Exception Stack:
Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.3.2.v20111125-r10461): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: java.sql.SQLSyntaxErrorException: ORA-00936: missing expression
Error Code: 936
Call: SELECT t1.OPT_ID, t1.OPT_ANT_FM_DET FROM SAE_PACIENTE t0, SAE_OPTOMETRIA t1 WHERE ((((t0.P_ID = ) AND (t0.P_IDENTIFICACION = ?)) AND (t0.P_TIPO_IDENTIFICACION = ?)) AND (t0.P_ID = t1.OPT_P_ID))
I have two beans, SaeOptometria and SaePaciente; in SaeOptometria I have this attribute, which is the relationship to the SaePaciente bean:
@JoinColumn(name = "OPT_P_ID", referencedColumnName = "P_ID")
@ManyToOne
private SaePaciente optPId;
I also have the following named query:
@NamedQuery(name = "SaeOptometria.findByPaciente", query = "SELECT s FROM SaeOptometria s INNER JOIN s.optPId d WHERE d.pId=s.optPId AND d.pIdentificacion = :pIdentificacion AND d.pTipoIdentificacion = :pTIdentificacion"),
I am passing the parameters :pIdentificacion and :pTIdentificacion in the SaeOptometriaFacade file with the following code:
public List<SaeOptometria> consultaporIdYTI(String UIpIdentificacion, String UIpTipoIdentificacion){
Query query= em.createNamedQuery("SaeOptometria.findByPaciente");
query.setParameter("pIdentificacion", UIpIdentificacion);
query.setParameter("pTIdentificacion", UIpTipoIdentificacion);
List<SaeOptometria> SaeOptometriaList= query.getResultList();
return SaeOptometriaList;
}
To clarify, the parameters pIdentificacion and pTipoIdentificacion in the NamedQuery belong to the SaePaciente bean:
@Id
@SequenceGenerator(name="SEQ1",sequenceName = "SAE_AINCR_PACIENTE",allocationSize=1)
@GeneratedValue(strategy=GenerationType.IDENTITY, generator="SEQ1")
@Basic(optional = false)
@NotNull
@Column(name = "P_ID")
private Integer pId;
@Size(max = 15)
@Column(name = "P_IDENTIFICACION")
private String pIdentificacion;
@Size(max = 20)
@Column(name = "P_TIPO_IDENTIFICACION")
private String pTipoIdentificacion;
Why doesn't it work?
Thanks.
Your query seems wrong:
SELECT s FROM SaeOptometria s INNER JOIN s.optPId d WHERE d.pId=s.optPId AND d.pIdentificacion = :pIdentificacion AND d.pTipoIdentificacion = :pTIdentificacion
FROM SaeOptometria s INNER JOIN TO_WHAT
After your comment, I realized my first understanding was wrong.
You join both in the FROM clause and in the WHERE clause. Try doing only one of them.
Can you try the following queries and report the results?
First one: the join is provided by JPA, without supplying IDs explicitly; that information comes from your JPA mapping.
SELECT s FROM SaeOptometria s INNER JOIN s.optPId d WHERE d.pIdentificacion = :pIdentificacion AND d.pTipoIdentificacion = :pTIdentificacion
Second one: you join explicitly, listing both entities and matching the IDs yourself.
SELECT s FROM SaeOptometria s, SaePaciente d WHERE s.optPId.pId = d.pId AND d.pIdentificacion = :pIdentificacion AND d.pTipoIdentificacion = :pTIdentificacion
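For reference, this is roughly how the first suggested query could be wired up and called, reusing the names from the question (an untested sketch):
// On the SaeOptometria entity - join only through the mapped relationship
@NamedQuery(name = "SaeOptometria.findByPaciente",
    query = "SELECT s FROM SaeOptometria s INNER JOIN s.optPId d "
          + "WHERE d.pIdentificacion = :pIdentificacion "
          + "AND d.pTipoIdentificacion = :pTIdentificacion")

// In the facade, as in the question
TypedQuery<SaeOptometria> q = em.createNamedQuery("SaeOptometria.findByPaciente", SaeOptometria.class);
q.setParameter("pIdentificacion", UIpIdentificacion);
q.setParameter("pTIdentificacion", UIpTipoIdentificacion);
List<SaeOptometria> result = q.getResultList();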

How to generate a string from a complex format similar to a regular expression?

We know String.format() can accept a format in this style:
%[argument_index$][flags][width][.precision]conversion
which is well known as C printf() format, and introduced in java 1.5.
My task is to generate output from a complex format that includes repeated or optional parameters.
For example, with the format:
select %1 from %2 where %3, with the values %1->'*', %2->'users', %3->'age>20'
it returned:
select * from users where age>20
the format can be supported by String.format().
However, I expect a format similar to:
select %1{, } from (%2(as %3)){,} (where %4 (and %5))?
when: %1->'*', %2->'users', %3->null, %4->'age>20'
it returned:
select * from users where age>20
when: %1->String{'name','age'}, %2->'users', %3->'u', %4->'age>20', %5->null
it returned:
select name, age from users as u where age>20
when: %1->String{'name','age'}, %2->'users', %3->'u', %4->null, %5->null
it returned:
select name, age from users as u
when: %1->String{'name','age'}, %2->String{'users','orders'}, %3->{'u','o'}, %4->'age>20', %5->'u.id=o.userid'
it returned:
select name, age from users as u,orders as o where age>20 and u.id=o.userid
I think you can now see what I mean. Is there a mature library to do such complex 'anti-regexp' work?
Maybe you are looking for a custom IFormatProvider/ICustomFormatter (this example is in C#)?
class SqlFormatter:IFormatProvider, ICustomFormatter
{
public object GetFormat(Type formatType)
{
return this;
}
public string Format(string format, object arg, IFormatProvider formatProvider)
{
StringBuilder concat = new StringBuilder();
string[] formatParts = format.Split(':');
switch (formatParts[0])
{
case "sqlfield":
string sep = "";
if (formatParts.Length>1)
{
sep = formatParts[1];
}
string[] fields = (string[]) arg;
concat.Append(fields[0]);
for (int fieldIndex = 1; fieldIndex < fields.Length; fieldIndex++)
{
concat.Append(sep);
concat.Append(fields[fieldIndex]);
}
break;
case "sqltables":
concat.Append(arg.ToString());
break;
default:
break;
}
return concat.ToString();
}
}
Use it like this:
String sql = String.Format(
new SqlFormatter()
, "select {0:sqlfield:,} from {1:sqltables} "
, new string[]{"name","lname"}
, "person" );
Will give:
"select name,lname from person "
I'll leave the rest of the implementation (and robustness etc) to you...
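Since the question is about Java, and I am not aware of a mature library implementing exactly this 'anti-regexp' mini-language, here is a minimal hand-rolled sketch of the same idea in Java. The helper names (join, optional) are invented; it covers only the repeated-with-separator and optional parts of the examples above.
import java.util.StringJoiner;

public class SqlTemplate {

    // Repeated part: joins the values with the given separator, e.g. {"name","age"} -> "name, age".
    static String join(String sep, String... values) {
        StringJoiner joiner = new StringJoiner(sep);
        for (String v : values) {
            joiner.add(v);
        }
        return joiner.toString();
    }

    // Optional part: returns prefix + clause when the clause is present, otherwise an empty string.
    static String optional(String prefix, String clause) {
        return clause == null ? "" : prefix + clause;
    }

    public static void main(String[] args) {
        // select name, age from users as u where age>20
        System.out.println("select " + join(", ", "name", "age")
                + " from users as u"
                + optional(" where ", "age>20"));

        // select * from users   (optional where-part omitted)
        System.out.println("select " + join(", ", "*")
                + " from users"
                + optional(" where ", null));
    }
}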
