I'm trying to use Liquibase with several different DB vendors, and for each of them I have different SQL scripts.
What I was trying to do is this:
<?xml version="1.1" encoding="UTF-8" standalone="no"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext"
xmlns:pro="http://www.liquibase.org/xml/ns/pro" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog-ext http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd http://www.liquibase.org/xml/ns/pro http://www.liquibase.org/xml/ns/pro/liquibase-pro-3.10.xsd http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.9.xsd">
<includeAll path="oracleLiquibase" />
<includeAll path="postgresql" />
<includeAll path="sqlserver" />
<includeAll path="mysql" />
</databaseChangeLog>
Each of these folders contains the SQL files to be executed, each with the dbms property set, since I cannot add a dbms attribute to the includeAll tag.
As an example, two SQL files for two different DB vendors look like this:
FunctionFile.sql (for postgresql)
-- liquibase formatted sql
-- changeset ion.grigoras:myFunction dbms=postgresql
CREATE OR REPLACE FUNCTION myFunction(...)
RETURNS BIGINT
LANGUAGE plpgsql
AS $function$
declare
-- vars;
BEGIN
-- body;
END;
$function$;
FunctionFile.sql (for oracle)
--liquibase formatted sql
-- changeset ion.grigoras:myFunction dbms=oracle
CREATE OR REPLACE function myFunction(...) return NUMBER is
BEGIN
-- body;
END;
/
I'm launching a Spring Boot application with an Oracle datasource, so I'm expecting only the FunctionFile.sql for Oracle to be executed, but what I see in the logs is that a changeset has failed, and that changeset came from an SQL file for PostgreSQL.
I don't understand why the PostgreSQL changeset is being executed, since I set the dbms property for each changeset in the formatted SQL files.
I would definitely like to avoid using <sqlFile /> in the changelog.xml file, since I have a moderate number of files for each DB vendor; keeping them separate in different folders and using <includeAll /> is the perfect scenario in my opinion.
I'm not very familiar with the formatted SQL notation, but I believe you have an error in your changeset syntax. Attributes should be separated from their values by a colon (:), according to the documentation:
--changeset author:id attribute1:value1 attribute2:value2 [...]
So I'd try the following syntax:
-- changeset ion.grigoras:myFunction dbms:postgresql
and
-- changeset ion.grigoras:myFunction dbms:oracle
I am trying to use cmd to parse a NITF file into XML using Apache Daffodil.
In cmd, I run .\daffodil.bat parse --schema nitf.dfdl.xsd 2301573_3.ntf
The nitf.dfdl.xsd, nitf_common_types.dfdl.xsd, nitf_extension_types.dfdl.xsd and the NITF file are contained in the same folder as the daffodil.bat file. The NITF schemas can be found here
I get the error:
[error] Schema Definition Error: Error loading schema due to org.xml.sax.SAXParseException;
DaffodilXMLLoader: Unable to resolve
schemaLocation='com/tresys/nitf/xsd/nitf_common_types.dfdl.xsd'.
Schema context: file:/C:/Users/rinat/OneDrive/Desktop/WORK%20STUFF/apache-daffodil-3.4.0-
bin/apache-daffodil-3.4.0-bin/bin/nitf.dfdl.xsd Location in
file:/C:/Users/rinat/OneDrive/Desktop/WORK STUFF/apache-daffodil-3.4.0-bin/apache-daffodil-
3.4.0-bin/bin/nitf.dfdl.xsd
How do I resolve this?
The nitf schema imports files using the full path, so it expects the imported files to be in a com/tresys/nitf/xsd/... directory on the classpath. If you are copying the files out of those expected paths, then you'll need to modify the xs:import statements to also not use those paths. For example, these lines in nitf.dfdl.xsd:
<xs:import namespace="urn:nitfCommonTypes" schemaLocation="com/tresys/nitf/xsd/nitf_common_types.dfdl.xsd" />
<xs:import namespace="urn:nitfExtensionTypes" schemaLocation="com/tresys/nitf/xsd/nitf_extension_types.dfdl.xsd" />
Need to be changed so that the schemaLocation attributes read:
<xs:import namespace="urn:nitfCommonTypes" schemaLocation="nitf_common_types.dfdl.xsd" />
<xs:import namespace="urn:nitfExtensionTypes" schemaLocation="nitf_extension_types.dfdl.xsd" />
The other DFDL schema files may need a similar change.
When I run the Saxon command line to validate multiple DITA files:
a) using the -s option for a folder does not work.
b) using a wildcard for the files does, but is limited to a single topic type:
C:\Users\542470>java -cp C:\Tools\SaxonEE11-3J\saxon-ee-11.3.jar com.saxonica.Validate -catalog:C:\Tools\dita-schemas\catalog-dita.xml -xi:on -xsiloc:on -xsdversion:1.1 "C:\Tools\SaxonEE11-3J\garage\tasks\*"
Saxon license expires in 25 days
Warning at xs:import on line 42 column 73 of softwareDomain.xsd:
SXWN9018 The schema document at urn:oasis:names:tc:dita:xsd:xml.xsd:1.3 is not being read
because schema components for this namespace are already available
Warning at xs:import on line 42 column 73 of uiDomain.xsd:
SXWN9018 The schema document at urn:oasis:names:tc:dita:xsd:xml.xsd:1.3 is not being read
because schema components for this namespace are already available
Warning at xs:import on line 63 column 73 of commonElementMod.xsd:
SXWN9018 The schema document at urn:oasis:names:tc:dita:xsd:xml.xsd:1.3 is not being read
because schema components for this namespace are already available
Warning at xs:import on line 31 column 78 of topicMod.xsd:
SXWN9018 The schema document at urn:oasis:names:tc:dita:xsd:ditaarch.xsd:1.3 is not being
read because schema components for this namespace are already available. To force the
schema document to be read, set --multipleSchemaImports:on
Error on line 13 column 11 of garagetaskoverview.dita:
XQDY0084 One validation error was reported: Cannot validate <Q{}concept>: no element
declaration available
In this case, the topics validated with no errors, but the concept topic type was not recognized. I am using the DITA-OT/Oxygen garage DITA samples to test the command line. Validating a single DITA file causes no problems. The issue only occurs when mixing DITA topic types in the same folder.
DITA topic types used:
<concept id="taskconcept" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="urn:oasis:names:tc:dita:xsd:concept.xsd:1.3"
xml:lang="en-US">...
<task id="changeoil" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="urn:oasis:names:tc:dita:xsd:task.xsd:1.3"
xml:lang="en-US">...
Note: Having thousands of files eliminates the option of listing the files to validate.
All the DITA XML Schemas are in no namespace. If Saxon has some kind of schema caching, then once it loads the "urn:oasis:names:tc:dita:xsd:task.xsd:1.3" schema for the first validated task, it considers that it already has a schema for the null namespace, so it might re-use the "task.xsd" schema to also validate the concept file.
I do not see a command-line setting to avoid this schema cache. Maybe you can iterate over all the files in the folder using a "for" loop in a Windows bat file and run the validation process for each file, instead of running the validation on the entire folder.
You could also ask directly on the Saxonica users list for advice about this cache.
I'm not an expert on DITA, but I think that all the DITA schema modules are compatible with each other in the sense that you can combine any selection of modules into a single schema. For example, you could write a schema document that has xs:include's for any subset of DITA modules that you want to use. I would suggest using such a composite schema in the -xsd option to the Saxon validate command.
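As a rough sketch of what such a composite schema document could look like (an assumption, not a tested setup — I'm reusing the URNs from your instance files, which should resolve through the same catalog-dita.xml; whether the two shells can actually be combined this way depends on the DITA schema distribution):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- Assumed: these URNs resolve via catalog-dita.xml, as in the
       xsi:noNamespaceSchemaLocation values of the instance files -->
  <xs:include schemaLocation="urn:oasis:names:tc:dita:xsd:concept.xsd:1.3"/>
  <xs:include schemaLocation="urn:oasis:names:tc:dita:xsd:task.xsd:1.3"/>
</xs:schema>
```

You would then pass this file to the -xsd option of the Validate command, alongside the -catalog option you already use.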
Alternatively, try using the option --multipleSchemaImports:on - this should cause Saxon to load a schema module for a particular namespace (or for the null namespace) even if it already has a schema module for that namespace loaded. (But note this can cause failures if two schema modules have overlapping definitions - I don't know if this applies to DITA.)
However, you're going to get more control over a task like this if you write a little Java application to invoke Saxon repeatedly, rather than trying to do everything in a single command from the command line.
I finally used a "for" loop in a batch file.
command-line:
batch-validate C:\XML-WORK\repair\DITA-xsd\topics > testlog.txt 2> testerrors.txt
batch file command:
for %%i in (*) do java com.saxonica.Validate -catalog:C:\Tools\dita-schemas\catalog-dita.xml -xi:on -xsiloc:on -xsdversion:1.1 "%%i"
Note: Set the classpath to the Saxon jar and change directory (cd) to the folder to be validated before running the batch file. The loop variable is quoted so that file names containing spaces still work.
It took over 3 hours to validate some 4,000 files. I was using a trial version of Saxon. If this is always the expected result, using a batch file is not feasible.
My YAML master changelog:
databaseChangeLog:
  - includeAll:
      path: db/changelog/changes/
my two SQL files:
file1:
-- liquibase formatted sql
-- changeset user1:1
CREATE TABLE public.version
(
...
);
CREATE TABLE version_files
(
...
);
-- rollback DROP TABLE version_files;
-- rollback DROP TABLE version;
file2:
-- liquibase formatted sql
-- changeset user1:2
INSERT into version(id, version_json) values('12334', '{"a":1, "b":2}'::jsonb)
-- rollback DELETE FROM Version WHERE id='12334';
When I run mvn liquibase:update, or when I just start my Spring Boot application, the statements from the SQL files are executed just fine; however, the tag column of the databasechangelog table is never set. As a result, every call to mvn liquibase:rollback -Dliquibase.rollbackTag=1 fails because the tag "1" cannot be found.
If I set a tag "manually" with mvn liquibase:tag, then it works and I can roll back to this tag. But what is the easiest/cheapest way to modify my code so that some kind of tag is set "automatically" when updating the database with mvn liquibase:update or when starting the Spring Boot app?
You can have a separate changeset in your changelog file that defines a tag, using the tagDatabase change type.
When running an update command, Liquibase will first execute the tagDatabase changeset and then deploy your SQL scripts.
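A minimal sketch of what that could look like in the YAML master changelog (the id/author values are placeholders; the tag changeset is placed before the includeAll so that rolling back to tag "1" undoes the included scripts):

```yaml
databaseChangeLog:
  - changeSet:
      id: tag-before-update
      author: user1
      changes:
        - tagDatabase:
            tag: "1"
  - includeAll:
      path: db/changelog/changes/
```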
I am using Propel 2 with Postgres database. When I run the following command, it appears to work, but in the generated schema.xml file, I do not get any column information:
vendor\bin\propel reverse -v --verbose "db"
Reading database structure of database `db` using dsn `pgsql:host=localhost;dbname=db`
SchemaParser `Propel\Generator\Reverse\PgsqlSchemaParser` chosen
Successfully reverse engineered 8 tables
Writing XML file to generated-reversed-database\schema.xml
Schema reverse engineering finished.
The generated schema.xml looks like this:
<?xml version="1.0" encoding="utf-8"?>
<database name="db" defaultIdMethod="native" defaultPhpNamingMethod="underscore">
<table name="sometable" idMethod="native" phpName="SomeTable">
<unique name="sometable_name_key">
<unique-column name="name"/>
</unique>
</table>
... other tables, no columns here ...
</database>
I am completely stumped on why this is happening. Any help is appreciated.
I'm trying to create two tables for a component I'm writing for Joomla. I have this in the XML manifest file:
<install> <!-- Runs on install -->
<sql>
<file driver="mysql" charset="utf8">sql/install1.mysql.utf8.sql</file>
</sql>
<sql>
<file driver="mysql" charset="utf8">sql/install2.mysql.utf8.sql</file>
</sql>
</install>
But it installs only the first file, and hence only the first table. And if I put all the SQL commands in one line, I get an error from Joomla:
JInstaller::Install: Cannot find Joomla XML setup file
What am i doing wrong?
Thanks for the help,
John.
Ok, I found the problem. In fact, I was using the wrong collation in my database :(