How to divide the changelog.yaml into subsections?

I would like to separate each change of the databaseChangeLog in YAML format into its own file (again YAML) and include/import it somehow, in such a way that I can use a FileSystemAccessor or ClassPathAccessor to load it again.
Is there an example of how to do that?
Thanks,
Dieter

The way to do this is described at https://www.liquibase.org/bestpractices.html
The example given shows XML formatted changelogs, but the basic idea would be the same for YAML formatted changelogs.
Leave me a comment and I can generate a sample in YAML.

After some searching I found this in the Liquibase repo:
https://github.com/liquibase/liquibase/blob/master/liquibase-core/src/test/resources/liquibase/parser/core/yaml/doubleNestedChangeLog.yaml
It is an example of how you can separate your YAML files, analogous to the XML approach shown in SteveDonies' link.
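For reference, the master changelog in YAML can pull in the individual files with include entries, mirroring the XML <include> element. A minimal sketch (the file names, ids, and the addColumn change are only placeholders):

databaseChangeLog:
  - include:
      file: db/changelog/001-initial-schema.yaml
  - include:
      file: db/changelog/002-add-customer-email.yaml

Each included file is then an ordinary changelog of its own, for example db/changelog/002-add-customer-email.yaml:

databaseChangeLog:
  - changeSet:
      id: 002-add-customer-email
      author: dieter
      changes:
        - addColumn:
            tableName: customer
            columns:
              - column:
                  name: email
                  type: varchar(255)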

Related

Documenting YAML configuration files

We are using YAML as a config file format and want to add comments to it that we can convert to documentation, in the same way that Sphinx Autodoc, Doxygen, or Roxygen (in R) work.
I'm aware of this take that suggests description should be an integral part of the YAML document: Documenting yaml
I disagree with this. Our YAML files are configuration, not data - our documentation should show how to change and adapt the YAML rather than describe what's in the YAML right now.
I found:
http://chrisbcole.me/yamldoc/
https://yamldocs.dev/
https://github.com/Jakski/sphinxcontrib-autoyaml
https://github.com/ted-dunstone/yaml2doc
None of these appears to be in very wide use; am I missing something?

.properties to .yaml conversion while preserving comment?

There is an earlier discussion which asks about this: How to convert from application.properties to application.yml in Spring Boot?
The solutions discussed there work, but there is a limitation with #comments.
Comments (#comments) in the .properties file are not carried forward into the .yaml file after conversion.
So I want to check whether there is any tool (online, IDE-based, or offline) that supports conversion with the comments carried forward into the .yaml file.
One approach is to write a small script that will (a rough sketch follows below):
- read each line of the properties file
- keep track of the comments
- put the grouped property into a data structure together with the comment
- write the data structure to the YAML file
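For instance, a minimal version in Java, assuming flat keys (dotted keys are written as-is rather than nested into a YAML tree) and placeholder file names:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class PropertiesToYaml {
    public static void main(String[] args) throws IOException {
        List<String> pendingComments = new ArrayList<>(); // comments waiting for the next property
        StringBuilder yaml = new StringBuilder();

        for (String line : Files.readAllLines(Paths.get("application.properties"))) {
            String trimmed = line.trim();
            if (trimmed.isEmpty()) {
                continue;
            } else if (trimmed.startsWith("#") || trimmed.startsWith("!")) {
                // keep track of the comment until we reach the property it belongs to
                pendingComments.add("# " + trimmed.replaceFirst("^[#!]\\s*", ""));
            } else {
                int eq = trimmed.indexOf('=');
                if (eq < 0) {
                    continue; // this sketch ignores malformed or continued lines
                }
                String key = trimmed.substring(0, eq).trim();
                String value = trimmed.substring(eq + 1).trim();
                // write the collected comments directly above the converted property
                pendingComments.forEach(c -> yaml.append(c).append('\n'));
                pendingComments.clear();
                yaml.append(key).append(": ").append(value).append('\n');
            }
        }
        Files.write(Paths.get("application.yaml"), yaml.toString().getBytes());
    }
}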

How can I comment in a .prettierrc file

I know a JSON file cannot contain comments unless I add something like "_comment": "comment content".
However, having that in a .prettierrc file is going to interfere with the Prettier configuration.
Is there any good way of commenting?
Prettier's configuration file doesn't have to be JSON. It can be written in JavaScript too (.prettierrc.js or prettier.config.js), with as many comments as you want.
See https://prettier.io/docs/en/configuration.html
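For example, the same kind of options expressed as a .prettierrc.js (the option values here are only illustrative):

// .prettierrc.js
module.exports = {
  // we prefer single quotes in this project
  singleQuote: true,
  // wrap at 100 characters instead of the default 80
  printWidth: 100,
};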

conditional include in asciidoc

I am using Spring RestDoc together with AsciiDoc to describe my REST API. RestDoc generates different files depending on whether request parameters, response fields, etc. are described. I would like to have one template that conditionally includes whatever file exists.
something like this:
Request:
include::{reqresPath}/http-request.adoc[]
Response:
include::{reqresPath}/http-response.adoc[]
Parameters:
ifeval::[{{reqresPath}/request-parameters.adoc}.exists]
include::{reqresPath}/request-parameters.adoc[]
endif::[]
ifeval::[{{reqresPath}/request-parameters.adoc}.exists]
include::{reqresPath}/request-parameters.adoc[]
endif::[]
Or at least I would like to exclude the warnings in case of a missing file, but I could not figure out how to suppress them.
As of today, there is no operator available for ifeval which can be used to check the existence of a file.
The way I would go is to write an extension for Asciidoctor, which can also be done in Java. If your project is big enough, I would suggest going for this solution.
The most extreme way is to make a custom TemplatedSnippet which generates an empty snippet to be included...
I hope there is a better way to do this.
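If you do go the extension route, an include processor along the lines of the skeleton below could resolve the snippet includes itself and simply skip files that do not exist. This is only a rough sketch against what I believe the AsciidoctorJ extension API looks like (class names, method signatures, and the path handling should be double-checked for your version):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Map;
import org.asciidoctor.ast.Document;
import org.asciidoctor.extension.IncludeProcessor;
import org.asciidoctor.extension.PreprocessorReader;

public class OptionalIncludeProcessor extends IncludeProcessor {

    @Override
    public boolean handles(String target) {
        // only take over includes of the generated snippet files
        return target.endsWith(".adoc");
    }

    @Override
    public void process(Document document, PreprocessorReader reader,
                        String target, Map<String, Object> attributes) {
        try {
            Path file = Paths.get(target); // resolution against {reqresPath} is glossed over here
            if (Files.exists(file)) {
                String content = new String(Files.readAllBytes(file));
                reader.push_include(content, target, target, 1, attributes);
            }
            // if the snippet does not exist, include nothing and emit no warning
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}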
Edit:
Take a look at http://asciidoctor.org/docs/user-manual/#by-tagged-regions

How to convert hadoop sequence file to json format?

As the name suggests, I'm looking for some tool which will convert the existing data from a Hadoop sequence file to JSON format.
My initial googling has only turned up results related to Jaql, which I'm desperately trying to get to work.
Is there any tool from Apache available for this very purpose?
NOTE:
I have a Hadoop sequence file sitting on my local machine and would like to get the data in the corresponding JSON format.
So, in effect, I'm looking for some tool/utility which will take a Hadoop sequence file as input and produce output in JSON format.
Thanks
Apache Hadoop might be a good tool for reading sequence files.
All kidding aside, though, why not write the simplest possible Mapper Java program that uses, say, Jackson to serialize each key and value pair it sees? That would be a pretty easy program to write.
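Along those lines, since the file is sitting on the local machine, it does not even need to be a MapReduce job; a small standalone reader can walk the sequence file and print one JSON object per record. A sketch, assuming the key and value classes have useful toString() implementations and using a placeholder path:

import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.util.ReflectionUtils;
import com.fasterxml.jackson.databind.ObjectMapper;

public class SequenceFileToJson {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.getLocal(conf);
        Path path = new Path("/path/to/local/sequence-file"); // placeholder path

        SequenceFile.Reader reader = new SequenceFile.Reader(fs, path, conf);
        Writable key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
        Writable value = (Writable) ReflectionUtils.newInstance(reader.getValueClass(), conf);

        ObjectMapper mapper = new ObjectMapper();
        while (reader.next(key, value)) {
            // rely on the Writables' toString(); real records may need a proper converter
            Map<String, String> record = new LinkedHashMap<>();
            record.put("key", key.toString());
            record.put("value", value.toString());
            System.out.println(mapper.writeValueAsString(record));
        }
        reader.close();
    }
}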
I thought there must be some tool which would do this, given that it's such a common requirement. Yes, it should be pretty easy to code, but then again, why do so if something already exists that does just that?
Anyway, I figured out how to do it using Jaql. Here is a sample query that worked for me:
read({type: 'hdfs', location: 'some_hdfs_file', inoptions: {converter: 'com.ibm.jaql.io.hadoop.converter.FromJsonTextConverter'}});
