I'm using Spring Boot and Hibernate and am having trouble scheduling a cron task using values stored in the database. Reading the values from a properties file works fine.
I defined the property in the application properties file: cron.expression=* * * ? * * and used it in the class as @Scheduled(cron="${cron.expression}").
But instead of getting the values from the properties file, I want to get them from a database table. Can someone share any sample code?
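A common pattern for this is to drop @Scheduled and register the task through SchedulingConfigurer instead, so the cron expression is re-read from the database before each run. The sketch below assumes a hypothetical CronRepository with a findLatestExpression() lookup; replace it with your own Spring Data repository or JdbcTemplate query:

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.SchedulingConfigurer;
import org.springframework.scheduling.config.ScheduledTaskRegistrar;
import org.springframework.scheduling.support.CronTrigger;

// Hypothetical lookup for the cron expression stored in a database table,
// e.g. backed by Spring Data JPA or a JdbcTemplate query.
interface CronRepository {
    String findLatestExpression();
}

@Configuration
@EnableScheduling
public class DbCronConfig implements SchedulingConfigurer {

    private final CronRepository cronRepository;

    public DbCronConfig(CronRepository cronRepository) {
        this.cronRepository = cronRepository;
    }

    @Override
    public void configureTasks(ScheduledTaskRegistrar registrar) {
        registrar.addTriggerTask(
            () -> System.out.println("task running"), // your task body here
            triggerContext -> {
                // Re-read the expression on every trigger evaluation, so a
                // change in the table takes effect without a restart.
                String cron = cronRepository.findLatestExpression();
                // Spring Framework 6 Trigger API; on 5.x implement
                // nextExecutionTime(triggerContext) instead.
                return new CronTrigger(cron).nextExecution(triggerContext);
            });
    }
}
```

Note this is a wiring sketch, not a drop-in class: the repository interface and its method name are illustrative, and you still need an entity/table holding the expression.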
I have an architectural requirement to store the data in ADLS under a medallion model, and I am trying to write to ADLS using Delta Live Tables as a precursor to creating the Delta table.
I've had success using CREATE TABLE {dlt_tbl_name} USING DELTA LOCATION {location_in_ADLS} to create the Delta table without Delta Live Tables; however, the goal is to use Delta Live Tables, and I don't see how this method is supported there.
Does anyone have a suggestion? At this point I'm guessing that writing to ADLS isn't supported.
If you look into the documentation, you can see that you can specify the path parameter for @dlt.table (for Python). Similarly, you can specify the LOCATION parameter when using SQL (docs). You just need to make sure that you've provided all the necessary Spark configuration parameters at the pipeline level (with a service principal or SAS token). The following code works just fine:
In Python:
import dlt

@dlt.view
def input():
  return spark.range(10)

@dlt.table(
  path="abfss://test@<account>.dfs.core.windows.net/dlt/python"
)
def python():
  return dlt.read("input")
In SQL:
CREATE OR REFRESH LIVE TABLE sql
LOCATION 'abfss://test@<account>.dfs.core.windows.net/dlt/sql'
AS SELECT * FROM LIVE.input
I'm externalizing SQL queries by using named queries in the file META-INF/jpa-named-queries.properties.
Since I have many queries, it would be good to split them across multiple properties files, e.g.:
META-INF/jpa-named-queries-product.properties
META-INF/jpa-named-queries-logistic.properties
META-INF/jpa-named-queries-payment.properties
How can I achieve this with Spring Data? @EnableJpaRepositories(namedQueriesLocation = "") only accepts a single location.
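One workaround (a sketch, not a built-in Spring Data feature) is to merge the per-domain files into a single Properties object yourself; the merged result could then be handed to Spring Data, e.g. via new PropertiesBasedNamedQueries(merged) inside a custom repository factory. The merge itself needs only the JDK:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class NamedQueryMerger {

    // Merge several Properties objects; later sources win on duplicate keys.
    public static Properties merge(Properties... sources) {
        Properties merged = new Properties();
        for (Properties source : sources) {
            merged.putAll(source);
        }
        return merged;
    }

    // Load each classpath location (e.g. the three files above) and merge.
    public static Properties loadAndMerge(String... classpathLocations) throws IOException {
        Properties merged = new Properties();
        for (String location : classpathLocations) {
            try (InputStream in = NamedQueryMerger.class.getClassLoader()
                    .getResourceAsStream(location)) {
                if (in == null) {
                    throw new IOException("resource not found: " + location);
                }
                merged.load(in);
            }
        }
        return merged;
    }
}
```

Wiring the merged Properties into @EnableJpaRepositories still requires a custom JpaRepositoryFactoryBean, which is beyond this sketch; the class and method names above are illustrative.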
I am working on a Spring Batch application where I read from a stored procedure in a database and write the results to an XML file.
My writer is an org.springframework.batch.item.xml.StaxEventItemWriter.
I am trying to handle duplicates using the approach described here: Spring Batch how to filter duplicated items before send it to ItemWriter
However, in my situation I don't want to skip a duplicate, but rather override the record already written to the XML by my ItemWriter.
How can I achieve it?
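One caveat: StaxEventItemWriter streams events to the file, so a record that has already been flushed cannot be rewritten in place. A common alternative is to keep only the last occurrence of each key before the writer runs, for example in an ItemProcessor or listener that buffers items by key. Below is a minimal sketch of that keep-last logic in plain Java; the key extractor is an assumption, and the Spring Batch wiring is omitted:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class KeepLastDeduper {

    // Keeps the LAST occurrence of each key while preserving the position of
    // the first occurrence, i.e. a later duplicate "overrides" the earlier
    // record instead of being skipped.
    public static <K, T> List<T> dedupKeepLast(List<T> items, Function<T, K> key) {
        Map<K, T> byKey = new LinkedHashMap<>();
        for (T item : items) {
            byKey.put(key.apply(item), item); // put() replaces the old value
        }
        return new ArrayList<>(byKey.values());
    }
}
```

In a real job you would buffer the items first (for example by reading all rows before the write step) and pass the deduplicated list to the StaxEventItemWriter.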
I have read everywhere how to read data with a Spring Batch ItemReader and write it to a database with an ItemWriter, but I want to just read data using Spring Batch and then somehow access that list of items outside the job, because I need to perform the remaining processing after the job has finished.
The reason I want to do this is that I need to perform a lot of validations on every item: I have to check each item's xyz field against a list that is not available within the job. After all that processing, I have to insert the information into different tables using JPA. Please help me out!
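One simple approach is to make the step's writer collect the items into an in-memory list and read that list after the job completes; Spring Batch ships org.springframework.batch.item.support.ListItemWriter for exactly this. A plain-Java sketch of the idea, shown without the Spring Batch interfaces:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// In a real job this class would implement
// org.springframework.batch.item.ItemWriter<T> (or you would simply use the
// built-in ListItemWriter) and be registered as the step's writer bean.
public class CollectingWriter<T> {

    private final List<T> items = new ArrayList<>();

    // Called once per chunk by the step.
    public void write(List<? extends T> chunk) {
        items.addAll(chunk);
    }

    // After the job has finished, the launching code reads the result here.
    public List<T> getItems() {
        return Collections.unmodifiableList(items);
    }
}
```

After jobLauncher.run(...) returns, call getItems() on the same bean instance and do the validation and JPA inserts there. Note this only works for data sets that fit in memory.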
I've never used Spring Batch before, but it seems like a viable option for what I am attempting to accomplish. I have about 15 CSV files for 10 institutions that I need to process nightly, stashing the CSV data into staging tables in an Oracle database.
A CSV file may look something like this:
DEPARTMENT_ID,DEPARTMENT_NAME,DEPARTMENT_CODE
100,Computer Science & Engineering,C5321
101,Math,M333
...
However, when I process a row and add it to the database, I need to fill in an institution ID, which is determined by the folder being processed at the time.
The database table would look like this:
INSTITUTION_ID,DEPARTMENT_ID,DEPARTMENT_NAME,DEPARTMENT_CODE
1100,100,Computer Science & Engineering,C5321
There is also validation that needs to be done on each row of the CSV files. Is that something Spring Batch can handle as well?
I've seen references to CustomItemReader and CustomItemWriter, but I'm not sure that is what I need. The examples I've seen are basic, just dumping a CSV exactly as it is into a matching table.
Yes, all the tasks you have described can be done with Spring Batch:
For the reader you can use a MultiResourceItemReader with a wildcard pattern matching your file names.
To validate the rows from the files, you can use an ItemProcessor and handle the validation there.
For your case you do not need a custom item writer; you can configure a database item writer in your XML file.
I suggest you use the XML-based approach for the Spring Batch implementation.
The XML configures the whole architecture of your batch:
job -- step -- chunk -- reader -- processor -- writer
To track errors and exceptions you can implement listeners at each stage:
-- StepExecutionListener
-- ItemReadListener
-- ItemProcessListener
-- ItemWriteListener
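The enrich-and-validate step described above can be sketched in plain Java. The column layout and the validation rule below are assumptions based on the sample CSV; in a real job this logic would sit in an ItemProcessor, and the institution ID would be derived from the folder currently being processed:

```java
public class DepartmentProcessor {

    // Hypothetical row model matching the sample CSV columns.
    public record Department(Long institutionId, long departmentId,
                             String name, String code) {}

    private final long institutionId;

    // The institution ID comes from outside the row, e.g. from the folder
    // being processed at the time.
    public DepartmentProcessor(long institutionId) {
        this.institutionId = institutionId;
    }

    // Validates the row and fills in the institution ID. Returning null from
    // a Spring Batch ItemProcessor filters the row out of the chunk.
    public Department process(Department in) {
        if (in.name() == null || in.name().isBlank()) {
            return null; // invalid row: skip it
        }
        return new Department(institutionId, in.departmentId(), in.name(), in.code());
    }
}
```

The same shape works whether the job is configured in XML or in Java config; only the bean registration differs.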