Problem with DBRider: NoSuchTableException "the table 'MY_NAME_TABLE' does not exist in schema 'null' " - dbunit

Maybe this will be helpful information.
When I used the "database-rider" framework with my PostgreSQL database in a Spring Boot application, it didn't work properly.
DBUnit is case-sensitive and expects uppercase table names.
I solved my problem by disabling the case-sensitive option: I added a dbunit.yml file in the datasets folder and set the following properties:
caseInsensitiveStrategy: !!com.github.database.rider.core.api.configuration.Orthography 'LOWERCASE'
properties:
  caseSensitiveTableNames: false

Related

How do I escape a Freemarker template in an insert statement run by a Flyway migration?

I have a Spring Boot app that allows the user to define a Freemarker template. I store that in my database. I want to preload one as part of a migration in Flyway:
insert into templates values (1001, '${foo.bar}');
When I put that in my migration script, I get an error about it being an unrecognized placeholder.
org.flywaydb.core.api.FlywayException: Unable to parse statement in /projects/so-example/src/main/resources/db/migration/V1__insert_initial_data.sql at line 1 col 1. See https://flywaydb.org/documentation/knownparserlimitations for more information: No value provided for placeholder: ${foo.bar}. Check your configuration!
I know how to disable placeholders (-Dflyway.placeholderReplacement=false) and that seems to get me past it for now.
However, I'd like to know how to escape this particular instance so I could use placeholders elsewhere.
Thanks!
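For what it's worth, the command-line flag mentioned above can also be set per-application. A hedged sketch for a Spring Boot application.properties, using Spring Boot's Flyway property names; the `%{` prefix in the second option is an arbitrary choice, not a Flyway default:

```properties
# Option 1: turn placeholder replacement off entirely
spring.flyway.placeholder-replacement=false

# Option 2: keep placeholders available but move them off Freemarker's
# ${...} syntax by choosing a different prefix (any unused token works)
spring.flyway.placeholder-prefix=%{
```

With option 2, `${foo.bar}` in migration scripts is left alone, while `%{my_placeholder}` would still be substituted by Flyway.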

How do I log every db change using Spring Boot in a file?

I need to log every database change to a file. And I need to show how data looked before and how data looked after.
For example, if I change a person's name, it would be OK if the following line were appended to my log file: "updated person table: field 'first_name' was changed from 'Alex' to 'George' (person_id = 4622)".
And if I insert new person: "inserted to person table: 'person_id' = 4623; 'last_name' = 'Smith'; 'first_name' = 'John'; 'status' = 0".
I make changes to the db with services that use interfaces extending JpaRepository, and in theory I could add log-appending code to every method of every service. But that's a lot of code, that approach doesn't follow the DRY principle, and I believe there is a better solution I haven't found yet. What should I use to achieve this?
I use Spring Boot version 2.0.3.RELEASE and my database is MariaDB.
Log file can have any format, but, I guess, the best would be yaml.
You can use JDBC logging with log4jdbc - see my related post.
To get it working in a Spring Boot app, just add log4jdbc-spring-boot-starter to your project, then set the JDBC logging properties as follows:
logging.level.resultsettable=info
logging.level.sqltiming=info
logging.level.sqlonly=fatal
logging.level.audit=fatal
logging.level.resultset=fatal
logging.level.connection=fatal
Then you get full SQL queries, their execution time and their results in the app log.
To make queries print in one line you can use these settings of log4jdbc:
log4jdbc.dump.sql.addsemicolon=true
log4jdbc.dump.sql.maxlinelength=0
log4jdbc.trim.sql.extrablanklines=false
Then you get in your log something like this:
2018-08-27 14:36:14.183 INFO 5452 --- [127.0.0.1] jdbc.sqltiming : SELECT 1; {executed in 1 msec}
2018-08-27 14:36:14.184 INFO 5452 --- [127.0.0.1] jdbc.resultsettable :
|---------|
|?column? |
|---------|
|1 |
|---------|
You can also output those SQL queries and/or their results to a different log file.
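One way to route those loggers to their own file is through Logback, Spring Boot's default logging backend. A minimal sketch for logback-spring.xml; the file name and pattern are arbitrary choices, and the logger names match the `jdbc.sqltiming` / `jdbc.resultsettable` loggers visible in the sample output above:

```xml
<configuration>
  <appender name="SQL_FILE" class="ch.qos.logback.core.FileAppender">
    <file>sql.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger : %msg%n</pattern>
    </encoder>
  </appender>

  <!-- additivity="false" keeps these loggers out of the main app log -->
  <logger name="jdbc.sqltiming" level="INFO" additivity="false">
    <appender-ref ref="SQL_FILE"/>
  </logger>
  <logger name="jdbc.resultsettable" level="INFO" additivity="false">
    <appender-ref ref="SQL_FILE"/>
  </logger>
</configuration>
```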
You could turn on show_sql and log Hibernate's output to a file, but out of the box that won't give you the data used in the changes, just the shape of the SQL. To get the data as well you'd need to spy on the JDBC driver.
You could use Spring Data Envers to log changes to audit tables, then export from those tables to files if you wish.
Or you could look at ways to do the auditing at the database level.
These are the options that come first to mind.

postgresql update a bytea field with image from desktop

I'm trying to write an update SQL statement in PostgreSQL (PG Commander) that will update a user's profile image column.
I've tried this:
update mytable set avatarImg = pg_read_file('/Users/myUser/profile.png')::bytea where userid=5;
but got: ERROR: absolute path not allowed
Read the file in the client.
Escape the contents as bytea.
Insert into database as normal.
(Elaborating on Richard's correct but terse answer; his should be marked as correct):
pg_read_file is really only intended as an administrative tool, and per the manual:
The functions shown in Table 9-72 provide native access to files on the machine hosting the server. Only files within the database cluster directory and the log_directory can be accessed.
Even if that restriction didn't apply, using pg_read_file would be incorrect; you'd have to use pg_read_binary_file. You can't just read text and cast to bytea like that.
The path restrictions mean that you must read the file using the client application, as Richard says. Read the file from the client, set it as a bytea bind parameter in your SQL, and send the query.
Alternately, you could use lo_import to read the server-side file in as a binary large object, then read that as bytea and delete the binary large object.
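Richard's three client-side steps can be sketched as follows. This is a minimal illustration, assuming the hex input format for bytea (PostgreSQL 9.0+); the table, column, and path are the hypothetical ones from the question. In a real application you would pass the bytes as a bind parameter through your driver (e.g. JDBC's setBytes) rather than splicing a literal into the SQL.

```java
public class ByteaUpdate {
    // Render raw bytes as a PostgreSQL bytea literal in hex format: '\x0123ab...'
    static String byteaHexLiteral(byte[] data) {
        StringBuilder sb = new StringBuilder("'\\x");
        for (byte b : data) {
            sb.append(String.format("%02x", b));
        }
        return sb.append("'").toString();
    }

    public static void main(String[] args) {
        // Step 1: read the file on the client, e.g.
        //   byte[] img = java.nio.file.Files.readAllBytes(
        //       java.nio.file.Paths.get("/Users/myUser/profile.png"));
        // A stand-in (the PNG signature bytes) keeps this example self-contained.
        byte[] img = {(byte) 0x89, 'P', 'N', 'G', '\r', '\n'};

        // Step 2: escape the contents as bytea.
        String literal = byteaHexLiteral(img);

        // Step 3: update as normal.
        System.out.println("update mytable set avatarImg = "
                + literal + "::bytea where userid = 5;");
    }
}
```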
pg_read_file can read files only from the data directory path. If you would like to know your data directory path, use:
SHOW data_directory;
For example it will show:
/var/lib/postgresql/data
Copy your file to that directory. After that you can use just the file name in your query.
UPDATE student_card SET student_image = pg_read_file('up.jpg')::bytea;
Or you can use the pg_read_binary_file function:
UPDATE student_card SET student_image = pg_read_binary_file('up.jpg')::bytea;

Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.

Please help, I have been trying to fix this error for the better part of 8 hours so far. I have a report in Crystal Reports that just started throwing this error. I changed a field in the view that is attached to the report, so I opened my XSD file in VS2010, renamed the current DataTable to ViewTracker0, and then pulled in the ViewTracker view. I added my queries from the old DataTable, ensured that there is no primary key, double-checked that the lengths of the fields match the database, and made sure that each column name matches the DB. I can preview my data fine in the XSD, and in SQL I can run my queries and everything is returned correctly. But when I run my report, it dies every time with this error:
Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.
What do I need to check next?
Have you tried verifying the database within the Crystal Reports designer, then running the report?
Try traversing GetErrors(), described here:
http://www.fransson.net/blog/failed-to-enable-constraints-one-or-more-rows-contain-values-violating-non-null-unique-or-foreign-key-constraints/

Enterprise Architect Oracle long field column properties

I have a little problem with Enterprise Architect by Sparx Systems.
I'm trying to model a database schema for Oracle. I created a table whose primary key has data type LONG. But when I try to modify the column properties (to set AutoNum = true), I see empty properties. I read the EA documentation and saw that I need to set this property to generate the sequence syntax.
When I change the data type to NUMBER, or switch the database to MySQL (for example), everything is fine: the properties are there, so I'm able to modify the AutoNum value.
Did you have a similar problem and find a solution? Or maybe I'm doing something wrong.
Regards
It's because Oracle uses sequences instead of an autoincrement option. I've checked, and I think you have to use the NUMBER column type and then set the AutoNum property (you also have to select Generate Sequences in the options to get the proper DDL code). Instead of the LONG data type you can set PRECISION and SCALE options on the NUMBER type; e.g. NUMBER(8) means an 8-digit number, and precision can go up to 38, so unless you want to store info about every star in the universe it will be enough for your scenario :)
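For context, the sequence-based pattern that a "Generate Sequences" option corresponds to looks roughly like this in Oracle DDL. A sketch with hypothetical table and sequence names; EA's generated code may differ, and on Oracle 12c+ an identity column can replace the trigger entirely:

```sql
CREATE TABLE person (
  id NUMBER(8) NOT NULL,
  CONSTRAINT person_pk PRIMARY KEY (id)
);

CREATE SEQUENCE person_seq;

-- Classic pre-12c pattern: a trigger fills the key from the sequence.
CREATE OR REPLACE TRIGGER person_bi
BEFORE INSERT ON person
FOR EACH ROW
BEGIN
  SELECT person_seq.NEXTVAL INTO :NEW.id FROM dual;
END;
/
```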
