Which is a better idea: running a large SQL SELECT query in Spring Boot, or executing only what is required in the query and writing the logic at the service level? - spring-boot

I am able to obtain all of the required data from a single SQL query. Is it good practice to execute a big SELECT query through JPA in Spring Boot, or should I get only the required fields from the select query and write the logic at the service level?
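For illustration, a minimal sketch of the second option: fetch only the required columns with a Spring Data JPA interface projection and keep the remaining logic in the service. The Order entity, OrderSummary projection, and the "OPEN" status are made up for this example (javax.persistence imports shown; Spring Boot 3+ uses jakarta.persistence).

import java.math.BigDecimal;
import java.util.List;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;

// Hypothetical entity backing the query.
@Entity
@Table(name = "orders")
class Order {
    @Id
    private Long id;
    private String status;
    private BigDecimal total;
}

// Interface-based projection: only the columns the service logic actually needs.
interface OrderSummary {
    Long getId();
    BigDecimal getTotal();
}

interface OrderRepository extends JpaRepository<Order, Long> {
    // Spring Data selects just these columns instead of the full row.
    List<OrderSummary> findByStatus(String status);
}

@Service
class OrderService {
    private final OrderRepository repository;

    OrderService(OrderRepository repository) {
        this.repository = repository;
    }

    BigDecimal totalOpenAmount() {
        // The business logic lives here instead of inside one big SQL statement.
        return repository.findByStatus("OPEN").stream()
                .map(OrderSummary::getTotal)
                .reduce(BigDecimal.ZERO, BigDecimal::add);
    }
}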

Related

NiFi || Can we execute multiple SQL queries in a single database session?

I have a requirement in which I have to execute multiple SQL queries in NiFi using the ExecuteSQL processor.
These queries are dependent on each other, as I am storing data in a session-based temporary table.
So I need to know whether, if I do so, all the queries will get executed in a single database session.
Example:
Query A inserts data into Temp_A; now I need that data in the next query, so will that be possible?
Note: I am talking about session-based temporary tables only.
You can use the ExecuteSQL processor, whose SQL Pre-Query property can contain multiple commands separated with semicolons (;).
All of those commands, together with the main SQL query, will be executed using the same connection for one flow file.
Note that the same SQL connection could be reused for the next flow file.
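As an aside, here is a small plain-JDBC sketch (not NiFi code; the table name and the in-memory H2 URL are just for illustration) of why the same connection matters: a session-scoped temporary table filled by one statement is only visible to later statements on that same connection.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SameSessionTempTableDemo {
    public static void main(String[] args) throws Exception {
        // One connection == one database session; hypothetical in-memory H2 URL.
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement stmt = conn.createStatement()) {

            // Equivalent of the "SQL Pre-Query" commands: run on the same connection.
            stmt.execute("CREATE LOCAL TEMPORARY TABLE Temp_A(id INT)");
            stmt.execute("INSERT INTO Temp_A VALUES (1), (2), (3)");

            // Equivalent of the main query: it can see Temp_A because the session is the same.
            try (ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM Temp_A")) {
                rs.next();
                System.out.println("Rows visible in this session: " + rs.getInt(1));
            }
        }
        // A different connection (a new session) would not see Temp_A at all.
    }
}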

How to query data from a Postgres table, e.g. 'pg_stat_statements', in Spring Data?

Hello, I am using a Spring Boot application and my requirement is to query the pg_stat_statements table for some metrics. I am confused because I can't define a repository, as this is not an entity.
Try creating a view over this table, adding row_number() as the id column of the view; then create an entity based on the view and use it in HQL.
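A rough sketch of that approach, assuming the view shown in the comment below and illustrative column names (javax.persistence shown; Spring Boot 3+ uses jakarta.persistence, and the pg_stat_statements columns differ between Postgres versions):

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import org.springframework.data.jpa.repository.JpaRepository;

// View assumed to exist on the database side:
// CREATE VIEW pg_stat_statements_view AS
//   SELECT row_number() OVER () AS id, s.* FROM pg_stat_statements s;

// Read-only entity mapped onto the view rather than onto a real table.
@Entity
@Table(name = "pg_stat_statements_view")
class PgStatStatement {

    @Id
    private Long id;               // the row_number() column added in the view

    @Column(name = "query")
    private String query;

    @Column(name = "calls")
    private Long calls;

    public Long getId() { return id; }
    public String getQuery() { return query; }
    public Long getCalls() { return calls; }
}

interface PgStatStatementRepository extends JpaRepository<PgStatStatement, Long> {
    // Ordinary Spring Data / HQL queries now work against the view.
}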

Searching across all tables in Oracle

I am developing a Java app which can connect to an Oracle database and select column names from any table. After selecting columns, I have to query the data from the tables the user selects in my Java app. Now my question is: how can I join all the tables in the database so that the query returns data successfully? I want to be able to connect to any Oracle schema; I will write the logic in Java, but I am unable to find a query that can extract the data from all tables. I tried a natural join among all tables, but it depends on the joined columns having the same names. So I want to know a generic way which can work in all conditions.
As others have mentioned, there are existing tools out there that you should probably leverage before trying to roll your own complex solution.
That said, if you wish to roll your own solution, you could look into using some of Oracle's dictionary views, such as:
SELECT * FROM all_tables;
SELECT * FROM all_tab_cols;
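If you do roll your own, a hedged JDBC sketch of reading those dictionary views from the Java app (the connection string, credentials, and the schema name are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OracleDictionaryBrowser {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1", "user", "password")) {

            String sql = "SELECT table_name, column_name, data_type "
                       + "FROM all_tab_cols WHERE owner = ? ORDER BY table_name, column_id";

            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, "HR"); // schema chosen by the user in the app
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        // From here the app can let the user pick tables/columns
                        // and build its own SELECT statements.
                        System.out.printf("%s.%s (%s)%n",
                                rs.getString("table_name"),
                                rs.getString("column_name"),
                                rs.getString("data_type"));
                    }
                }
            }
        }
    }
}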

SSIS - Iterating with SQL Server Data in ForEachLoop to Dataflow with Oracle Backend and Inserting Results to SQL Server

Hey EXPERIENCED SSIS DEVELOPERS, I need your help.
High-Level Requirements
Query a SQL Server table (on a different server than my SSIS server), resulting in a result set of about 200-300k records.
Use three output columns from each row to look up data in an Oracle database.
Insert or update a SQL Server table with the results.
Use SSIS.
SQL Server 2008
Sounds easy, right?
Here is what I have done:
Created an Execute SQL Task on the Control Flow that gets a recordset from SQL Server. Very fast, easy query, like SELECT field1, field2, field3 FROM table WHERE condition > 0. That's it. Takes less than a second.
Created a variable (evaluated as an expression) for the Oracle query that uses the result set from the above in its WHERE clause.
Created a ForEachLoop Container that takes the results (from #1 above) and, for each row in the recordset, runs a Data Flow that uses the Oracle query (from #2 above) with data access mode "SQL command from variable" against an Oracle data source. Fast, simple query with only about 6 columns returned.
Data Conversion - for obvious reasons - changing 3 columns from Oracle data types to SQL Server data types.
OLE DB Destination inserting into SQL Server using Fast Load into a staging table.
It works perfectly! Hooray! Bad news - it is very, very slow. When I say slow, I mean it processes 3000 records per hour. Holy moly - so freaking slow.
Question: am I missing a way to speed it up? It seems like the ForEachLoop Container is the bottleneck. Growl.
Important Points:
- I have NO write access in the Oracle environment, so don't even suggest a potential solution that requires it. Not a possibility. At all.
- Oracle sources do not allow for direct parameter definition, so no SELECT field FROM table WHERE ?. Don't suggest it - it doesn't work.
Ideas:
- Should I find a way to break down the results of the Execute SQL Task and send them through several ForEachLoop Containers for faster processing?
- Is there another design that is more appropriate?
- Is there a script I can use that would be faster?
- Would it be faster to create a temporary table in memory, populate it, and then use the results to bulk insert into SQL Server? Does this work when using an Oracle data source?
- ANY OTHER IDEAS?

H2 Database triggers

Generally, when you use the H2 database you have to create a custom class and implement the "fire" method to write a trigger.
For my project I am using batches for inserts. I need to use a trigger to make a kind of complex data-integrity check on the table I want to insert into, which is not possible using CHECK. So I have to run a SELECT statement in the trigger method to perform the check.
Since there could be many inserts, I would like to avoid a server round trip for each trigger SELECT statement (that's why I am using batches for inserts). Does the H2 database send a request to the DB from the "fire" method every time I make a SELECT statement there, or is this trigger method somehow integrated into the database engine itself?
Thanks,
Lubos
Triggers are executed on the server side, so there are no server round trips when executing triggers.
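For reference, a minimal sketch of such a trigger. The parent/child tables, the package name, and the check itself are made up; the Connection handed to fire is H2's internal connection, so the SELECT below stays inside the engine rather than making a client round trip.

package com.example;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import org.h2.api.Trigger;

// Registered with something like:
// CREATE TRIGGER child_check BEFORE INSERT ON child
//   FOR EACH ROW CALL "com.example.IntegrityCheckTrigger";
public class IntegrityCheckTrigger implements Trigger {

    @Override
    public void init(Connection conn, String schemaName, String triggerName,
                     String tableName, boolean before, int type) {
        // nothing to prepare for this sketch
    }

    @Override
    public void fire(Connection conn, Object[] oldRow, Object[] newRow) throws SQLException {
        // Hypothetical check: the referenced parent row must already exist.
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT COUNT(*) FROM parent WHERE id = ?")) {
            ps.setObject(1, newRow[1]); // column index depends on your table layout
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                if (rs.getInt(1) == 0) {
                    throw new SQLException("Integrity check failed for row " + newRow[0]);
                }
            }
        }
    }

    @Override
    public void close() { }

    @Override
    public void remove() { }
}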
