We use Liquibase with Oracle, so we have some packages with procedures inside.
In our case the changelog structure looks like this:
master.xml
|- release1
|  |- release-master.xml
|- release2
|  |- release-master.xml
|- softObjects
|  |- package-master.xml
|  |- packages
|  |  |- somePackage.pkb
|  |  |- somePackage.pks
Change sets inside the "releases" use runOnChange=false, and the "softObjects" (objects that can be created using CREATE OR REPLACE) use runOnChange=true, as recommended in the best practices:
Try to maintain separate changelog for Stored Procedures and use
runOnChange="true". This flag forces LiquiBase to check if the
changeset was modified. If so, liquibase executes the change again.
So every update installs some "delta" change sets from the "releases" and AFTER that reinstalls all "softObjects" that were changed; in the usual case everything is OK.
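For illustration, a soft-object change set in package-master.xml might look roughly like this (a sketch only; the id/author values and the splitStatements/endDelimiter settings are placeholders, not my actual changelog):

<changeSet id="somePackage" author="dev" runOnChange="true">
    <sqlFile path="packages/somePackage.pks" relativeToChangelogFile="true"
             splitStatements="false" endDelimiter="/"/>
    <sqlFile path="packages/somePackage.pkb" relativeToChangelogFile="true"
             splitStatements="false" endDelimiter="/"/>
</changeSet>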
But if I need to set up a new DB, I run into a problem:
a change set from the second release uses somePackage (v1), but in the next release I need to change the logic/API of somePackage, so I end up with somePackage (v2), which cannot be used by the previously created change sets. So now I have a change set that, on update, will try to use the wrong package version.
To avoid this I could add soft objects directly inside the release folder, without runOnChange=true, when I create them. When I need to change one, I just copy the previous version of the file into my new release and make the changes inside the COPY.
This approach has some disadvantages:
you create a lot of copied files that can consist of thousands of lines of
code (yes, I know that isn't good)
the version control system recognizes each copy as a new file (a good
challenge for reviewers)
What am I getting wrong? How should I work with "softObjects" if these objects might change?
I have created a Cloud Build trigger with the event "push to branch". I have added 2 files in the included-files filter, so whenever I update those files my Cloud Build triggers fine. But I want to know which file was updated, because based on that I need to implement some other logic.
I can use $(body.commit) and it prints all the commit information, but I only need the list of files that were modified in this commit.
Here is a code sample to find out which Cloud Run directory has changed in my project:
- name: 'gcr.io/cloud-builders/git'
  id: "Init devops and list changes"
  entrypoint: bash
  args:
    - -c
    - |
      git fetch --depth=2
      # List the Cloud Run changes
      git diff --dirstat=files,0 HEAD^ HEAD -- cloud-run/ | sed 's/^[ 0-9.]\+% //g' | cut -d'/' -f2 | uniq > /workspace/change-in-cloudrun
I fetch the previous commit and compare what changed, and I save the result in a workspace file so it can be reused in later steps.
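For example, a later step could read that file and act on each changed directory (a sketch; the deploy command and region are only illustrative assumptions):

- name: 'gcr.io/cloud-builders/gcloud'
  id: "Handle changed Cloud Run directories"
  entrypoint: bash
  args:
    - -c
    - |
      # Each line of the workspace file is one changed directory under cloud-run/
      while read -r service; do
        echo "Changed Cloud Run directory: ${service}"
        # e.g. gcloud run deploy "${service}" --source "cloud-run/${service}" --region europe-west1
      done < /workspace/change-in-cloudrun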
So, let's say on the first of this month, I create branch A, with a migration file named 2020_04_01_113108_modify_request_logs_table.php
But let's say I don't merge this branch into my Master branch yet, and then I start working on branch B 2 days later, with a migration file named 2020_04_03_113108_create_label_logs_table.php
So on the 4th, I merge branch B into master and run php artisan migrate, and it runs the second migration.
And then on the 6th, I'm finally ready to merge branch A into master and run php artisan migrate. Is there anything that's going to go wrong with this migration? Does the migration system care that the dates of the files happened out of order? Will it ignore the A-branch file because it's already run a migration with a date later than that file?
Migrations that haven't been executed yet will be.
To check this beforehand, you can run php artisan migrate:status to see which ones have already been executed ('Yes') and which ones haven't ('No').
The output will look like this:
+------+-------------------------------------------------------------------+-------+
| Ran? | Migration | Batch |
+------+-------------------------------------------------------------------+-------+
| Yes | 2019_12_12_184629_create_users_table | 1 |
| Yes | 2020_03_27_153830_create_another_table | 1 |
| No | 2020_04_01_090622_modify_user_table | |
| Yes | 2020_04_11_102846_update_level | 1 |
| No | 2020_04_22_094132_dummy_migration | |
+------+-------------------------------------------------------------------+-------+
Actually, Laravel resolves this out of the box. All previously run migrations are stored in your database, in the migrations table. When running new migrations, Laravel compares them against the migrations that have already been run for this application by looking in that table.
I did some research and found the methods in the framework that implement the logic described above. You can check them here.
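In simplified terms (a sketch of the idea only, not the framework's exact code), the check is essentially a diff between the migration files on disk and the names recorded in the migrations table:

// Sketch: Laravel's Migrator does this with collections, but the idea is the same.
$files = [
    '2020_04_01_113108_modify_request_logs_table',
    '2020_04_03_113108_create_label_logs_table',
];
$ran = ['2020_04_03_113108_create_label_logs_table']; // rows already in the migrations table

$pending = array_diff($files, $ran);
// ['2020_04_01_113108_modify_request_logs_table'] -- still runs, even though its date is older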
I am totally green at this kind of thing; I tried reading some tutorials and still couldn't do it on my own.
Here lies the problem: I have 2 files (build, compile) which are somehow supposed to take other files and create a working Warcraft 3 map.
These are the instructions I followed:
To build:
$ ./scripts/compile # most basic way of calling compile
$ ./scripts/build # most basic way of building a map
build
Script which takes an unprotected map, applies any build settings passed via argv
to it, and then turns it into a working Warcraft 3 map file.
Options         | Default                 | Description
----------------|-------------------------|---------------------------------------------------------------------
env             | beta                    | map environment: each environment has default build settings
debug_script    | false                   | debug this build script
do_jasshelper   | true                    | turns vJass & ZINC into JASS
do_compile      | true                    | turns ../src into out.j. When false, looks for {map_script_path}
do_optimizer    | false                   | uses Vexorian's map optimizer to protect and make the map run faster
do_widgetizer   | false                   | uses PitzerMike's map widgetizer to make map load faster
debug           | false                   | whether the --debug flag should be passed to jasshelper
launchwc3       | false                   | whether the script should launch wc3 with the map loaded on exit
map_unpro_path  | base-maps/{highest}.w3x | the base map file to inject script into
map_script_path | ../out.j                | map script path to load into map
map_output_path | ITT_{commit}_{time}.w3x | path where to put the compiled map
setting up
This application requires Ruby
$ git clone git@github.com:theQuazz/island-troll-tribes
$ cd island-troll-tribes
$ scripts/build
It might be an easy thing to do, but I am really bad at this kind of stuff. If someone could explain what to do step by step, I would greatly appreciate it.
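From what I can tell, the whole sequence should be roughly this (assuming Ruby is installed and the scripts are executable; the comments are my reading of the options table above):

$ git clone git@github.com:theQuazz/island-troll-tribes
$ cd island-troll-tribes
$ ./scripts/compile   # turns the vJass & ZINC sources into out.j
$ ./scripts/build     # injects the script into the base map and writes the .w3x file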
Hi, I have downloaded the Standard Set from the JMeter Plugins site.
I installed it as it says here:
http://jmeter-plugins.org/wiki/PluginInstall/
The problem is that I don't see any option in the Listener menu that lets me add a new Graphs Generator Listener as described here:
http://jmeter-plugins.org/wiki/GraphsGeneratorListener/
I need to create a Transactions per Second graph, but I don't know how to do it.
I would really appreciate it if you could help me out.
Thanks in advance.
If you do not see the extra elements in the menu, there is something wrong with your jmeter-plugins installation.
Make sure you unpacked the zip in the folder above 'bin'; that is not clear in the instructions.
At the same folder level as 'bin' there should also be a 'lib', and beneath this, 'ext'.
Check that the 'ext' folder contains the jmeter plugin jar and other support files.
It should look like this:
|-apache-jmeter-2.11
| |-bin
| | |-jmeter.bat
| | |-ApacheJMeter.jar
| | |-...
| |-lib
| | |-ext
| | | |-JMeterPlugin-Standard.jar
| | | |-...
| | |-...
| |-...
|-...
Make sure you restart jmeter after moving these files. If you have the jar in the right place, the extensions will be loaded and available from menu options.
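A sketch of that unpacking step (the paths and zip name are assumptions; adjust them to your versions):

$ cd /path/to/apache-jmeter-2.11                         # the folder above 'bin'
$ unzip ~/Downloads/JMeterPlugins-Standard-VERSION.zip   # contents land in lib/ext and friends
$ ls lib/ext/JMeterPlugin-Standard.jar                   # confirm the plugin jar is in place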
To install the JMeter Plugins,
-> Copy the JMeterPlugins.jar file from JMeterPlugins-VERSION.zip
-> Paste the file to JMETER_INSTALL_DIR/libexec/lib/ext
Is it possible to create aliases when I enter a certain folder?
What I want:
I use Composer a lot (a PHP package manager), which installs binaries in ./vendor/bin. I would like to run these binaries directly from the project root.
For example:
/path/to/project
| - composer.json // dictates dependencies for the project
| - vendor // libs folder for composer, is created by composer
| | - bin // if lib has bin, composer creates this folder
| | | - phpunit // binary
| | | - phinx // binary
| | - somelib1 // downloaded by composer
| | - somelib2 // downloaded by composer
Is it possible to get this to work:
> cd /path/to/project
> phpunit
And get phpunit to execute?
Something like "sensing" the composer.json file, dynamically finding the binaries in ./vendor/bin, and then doing something like alias="./vendor/bin/<binary-name> $@" automatically?
I use OS X 10.9 and the bundled Terminal app.
You can override cd, trap my_function DEBUG to run something on every command, or add a command into PS1 or PROMPT_COMMAND.
These have different behaviour and caveats, and I can't recommend doing any of them for this use case (after having used each of them at some point). They are bad solutions to X-Y problems.
An alternative which is much less likely to break things horribly is to create a custom function to do both things:
cdp() {
  cd "$@" && phpunit
}
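If the goal is to run any of the vendor binaries (not just phpunit) from the project root, a variation on the same function can prepend the project's vendor/bin to PATH instead. This is a sketch only; checking for composer.json is an assumed way to detect a Composer project:

cdp() {
  cd "$@" || return
  # Assumed detection: if this looks like a Composer project, expose its local
  # binaries for the current shell session.
  if [ -f composer.json ] && [ -d vendor/bin ]; then
    PATH="$PWD/vendor/bin:$PATH"
  fi
}

# Usage:
#   cdp /path/to/project
#   phpunit            # now resolves to /path/to/project/vendor/bin/phpunit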