Specifying the index type in Liquibase's createIndex

I want to create an index on my MySQL table, but I cannot find a way to specify the index type among the available attributes of 'createIndex'.
So, how can I do this?

Liquibase does not currently support specifying index types. To support it, there would need to be extra attributes in the XML, and the MySQL code in Liquibase would need to be updated to look for those attributes and generate the proper SQL. Ideally, the MySQL code that inspects the database would also detect which index type is used on existing indexes and emit the proper XML in a generated changelog.
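In the meantime, Liquibase's raw `<sql>` change type is a common escape hatch for anything `createIndex` cannot express. A hedged sketch (changeset id, author, table, and index names are placeholders; MySQL's `USING` clause accepts BTREE, and HASH on engines that support it):

```xml
<changeSet id="example-1" author="example">
    <!-- Fall back to raw SQL because createIndex has no attribute for the index type -->
    <sql>CREATE INDEX idx_customer_name ON customer (name) USING BTREE</sql>
    <rollback>
        <sql>DROP INDEX idx_customer_name ON customer</sql>
    </rollback>
</changeSet>
```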

Related

Old value in _aud table in envers

I have integrated Hibernate Envers in Spring Boot. My requirement is to also keep the old value of a column, when its value changes, in the *_AUD tables. However, I can't see any such feature in the Hibernate Envers plugin.
Please suggest.
Thanks
Unfortunately, what you are looking to do simply isn't supported.
It's one thing to think of an entity storing basic type values such as strings or numeric data and having its old/new value represented by two columns in the audit table; however, once you move beyond basic entity mappings to ones with relationships between entity types or collections, you begin to see that trying to store old/new data in the same row isn't efficient, and in some cases isn't feasible.
That said, you can still read the audit history and deduce these old/new values in a variety of ways, including the Envers Query API, Debezium, or even basic database triggers.
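With the Envers Query API, each revision of an entity can be fetched as a full snapshot (e.g. via `AuditReaderFactory.get(em).createQuery().forRevisionsOfEntity(...)`), so deducing old/new values is just a diff over successive snapshots. A stdlib-only sketch of that diffing step (the class and method names here are mine for illustration, not Envers APIs):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Objects;

/** Given successive revision snapshots of one entity (property -> value),
 *  reports each property change as "property: old -> new". */
public class RevisionDiff {
    public static List<String> changes(List<Map<String, String>> revisions) {
        List<String> out = new ArrayList<>();
        for (int i = 1; i < revisions.size(); i++) {
            Map<String, String> prev = revisions.get(i - 1);
            for (Map.Entry<String, String> e : revisions.get(i).entrySet()) {
                String old = prev.get(e.getKey());
                if (!Objects.equals(old, e.getValue())) {
                    out.add(e.getKey() + ": " + old + " -> " + e.getValue());
                }
            }
        }
        return out;
    }
}
```

In practice you would populate the snapshot list from the Envers revision query results; the comparison logic stays the same.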

ODI 12c LKM in mapping without IKM

In ODI 12c, an LKM is used in a mapping to load data from the source to the staging area, but I do not need an IKM to insert the data from staging into the target. Can an ODI mapping perform only the first phase, i.e. the LKM phase? Running both KMs in my case doubles the time.
That's possible, but you'd need to use an LKM that is written that way.
I don't think there is one out of the box, but you should be able to write your own fairly easily.
The main change is in the Java BeanShell code (see A. Substitution API Reference): you would need to change the call from the collection table:
…INTO TABLE <%=snpRef.getTable("L", "COLL_NAME", "W")%>
to the target table:
…INTO TABLE <%=snpRef.getTable("L", "TARG_NAME", "A")%>
That's the main thing. You would also need to adjust the fields, etc. The post ODI - Load data directly from the Source to the Target without creating any Temporary table describes the steps in more detail, but once you see how powerful the API is, you can do pretty much anything.
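To make the difference concrete, here is roughly what the two substitutions resolve to at runtime (schema and table names are illustrative; by ODI convention the "W" work-area call resolves to the C$_ loading table, while the "A" call resolves to the actual target):

```sql
-- snpRef.getTable("L", "COLL_NAME", "W"): data lands in the C$_ staging table
... INTO TABLE ODI_WORK.C$_0CUSTOMER ...

-- snpRef.getTable("L", "TARG_NAME", "A"): data goes straight into the target
... INTO TABLE APP_SCHEMA.CUSTOMER ...
```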

Rewrite PK and related FK based on an oracle sequence

I want to migrate a subset of customer data from one shared database environment to another shared database environment. I use Hibernate and have quite a few ID and FK_ID columns that are auto-generated from an Oracle sequence.
I have a Liquibase changelog, exported from Jailer, which contains the customer-specific data.
I want to be able to rewrite all of the sequence ID columns so that they don't clash with what's already in the target database.
I would like to avoid building something that my company has to manage, and would prefer to upstream this to liquibase.
Is anyone aware of anything within Liquibase that might be a good place to start?
I would like to either do this on the Liquibase XML before passing it to the 'update' command, or as part of the update command itself. Ideally as part of the update command itself.
I am aware that I would need to make liquibase aware of which columns are PK sequence columns and the related FK columns. The database structure does have this all well defined, so I should be able to read this into the update process.
Alternatively, I had thought I could use the extraction model CSV from Jailer.
Jailer - http://jailer.sourceforge.net/
I would suggest that for one-time data migrations like this, Liquibase is not the best tool. It is really better suited to schema management than data management. I think an ETL tool such as Pentaho would be a better solution.
I actually managed to figure this out myself with Liquibase's command-line 'update' command, using a custom change exec listener.
1) I pushed an MR to Liquibase to allow registration of a change exec listener
2) I implemented my own change exec listener that intercepts each insert statement and rewrites each FK and PK field to one not yet allocated in the target database. I achieve this using an Oracle sequence. To avoid a round trip to the database for each new sequence value, I implemented my own version of Hibernate's sequence caching
https://github.com/liquibase/liquibase/pull/505
https://github.com/pellcorp/liquibase-extensions
This turned out to be quite a generic solution, and together with some fixes upstreamed to Jailer to improve its Liquibase export support, it's a very viable and reusable approach.
Basic workflow is:
1) Export a subset of data from source db using jailer to liquibase xml
2) Run the Liquibase update command, with the custom change exec listener, against the target.
3) TODO Run the jailer export on the target db and compare with the original source data.
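The sequence-caching trick mentioned in the answer can be sketched independently of Liquibase: fetch one block of ids per database round trip and hand them out locally. A minimal, stdlib-only illustration (the names and the hi/lo-style block numbering are my assumptions, not the extension's actual code):

```java
import java.util.function.LongSupplier;

/** Hands out ids from a locally cached block, calling the underlying
 *  sequence (one DB round trip) only when the current block runs out. */
public class SequenceCache {
    private final LongSupplier sequence; // e.g. SELECT my_seq.NEXTVAL FROM dual
    private final long blockSize;
    private long next;
    private long remaining = 0;

    public SequenceCache(LongSupplier sequence, long blockSize) {
        this.sequence = sequence;
        this.blockSize = blockSize;
    }

    public synchronized long next() {
        if (remaining == 0) {
            // hi/lo style: sequence value n reserves ids [n*blockSize, (n+1)*blockSize)
            next = sequence.getAsLong() * blockSize;
            remaining = blockSize;
        }
        remaining--;
        return next++;
    }
}
```

With a block size of 50, fifty inserts cost a single sequence fetch instead of fifty, which is exactly why the listener approach stays fast on large exports.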

Map Oracle timestamp type to java.sql.Timestamp or java.util.Date using Hibernate?

I have an existing database with hundreds of tables using TIMESTAMP(6) as the data type for some columns.
When I reverse engineered this database using Hibernate, the Java type Serializable is used to map these columns.
How can I get Hibernate to automatically map columns in these hundreds of tables to java.sql.Timestamp or java.util.Date?
This question is related to How to map oracle timestamp to appropriate java type in hibernate? However, the answer to that question was a workaround involving an Ant task that modifies the generated classes. I don't want to modify the generated classes; I want to get the right type in the first place.
I see there's a solution like
<sql-type jdbc-type="OTHER" hibernate-type="java.sql.Timestamp" />
I wouldn't mind if the solution was more like
<sql-type jdbc-type="TIMESTAMP" hibernate-type="java.sql.Timestamp" />
But I don't want all unmapped types (OTHER) to go to java.sql.Timestamp. I would like to simply tell Hibernate to map these Oracle timestamps to Java's timestamp class.
When I use NetBeans to do the mapping, it detects the Oracle timestamp and converts it properly. I cannot use NetBeans, though, because I want an automated tool in my build script, so I'm using Hibernate.
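If the reverse engineering is done with Hibernate Tools, the mapping can typically be declared once in hibernate.reveng.xml rather than per class. A hedged sketch (which jdbc-type to match depends on what the Oracle driver actually reports for TIMESTAMP(6) columns, so verify against your generated output):

```xml
<hibernate-reverse-engineering>
    <type-mapping>
        <!-- Map Oracle TIMESTAMP columns to java.sql.Timestamp -->
        <sql-type jdbc-type="TIMESTAMP" hibernate-type="timestamp" />
    </type-mapping>
</hibernate-reverse-engineering>
```

Because the rule is keyed on the JDBC type rather than a column or table name, it applies across all tables in one place, which is what the "hundreds of tables" scenario needs.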

How to query the session in ASP.NET MVC with a dynamic query

I want to store some user data in memory, like some in-memory noSQL database.
But later on I want to query that data with a dynamic query constructed from user input. That query is stored in a classic DB as a string, so when I need to query the data stored in memory I would like to parse that string and construct the desired query (by some known rules).
I looked at Redis and found that it is no longer maintained for Windows. I have also looked at RavenDB, but its main query language is LINQ, even though dynamic Lucene queries can be created.
Can you suggest another in-memory DB that works with ASP.NET and can be queried with a dynamically created query? Maybe I haven't seen all the options.
I prefer a name-value or JSON-based noSQL store so its schema can be easily modified without the constraints of relational DBs.
I would suggest simply using SQLite. It can easily be used as an in-memory database (just open the database using ":memory:" instead of a file name).
You can emulate a key/value store with a simple two-column table with a primary key.
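The two-column emulation mentioned above might look like this (table and key names are illustrative; the value column can hold JSON strings, which fits the name-value requirement):

```sql
-- Open the connection with ":memory:" instead of a file name to stay in RAM.
CREATE TABLE kv (k TEXT PRIMARY KEY, v TEXT);

INSERT INTO kv (k, v) VALUES ('user:42', '{"name": "Ann"}');

-- The WHERE clause can be built dynamically from the stored query string.
SELECT v FROM kv WHERE k = 'user:42';
```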
Here are a few links you might find helpful:
http://www.sqlite.org/inmemorydb.html
How to create asp.net web application using sqlite
