Update code generated by Swagger code-gen - spring-boot

I have generated code from a swagger.yaml file using swagger-codegen in Spring.
Now I have updated the swagger.yaml file for my API and added a few more HTTP operations.
Is it possible to update the previously generated code automatically, without merging it manually?

I guess you are talking about the controllers generated by codegen, which you have then implemented. They are overwritten on each generation, which means you have to manually merge the code to add the changes every time... which is really annoying.
The best workflow I could find is to use the interfaceOnly option to generate only the model and interface classes in the target directory, and then manually create the controllers that implement those interfaces.
Let's say you update your API specification with one more GET operation: the interface is regenerated with that new operation and you just have to adjust your controller to implement the new method (quick and easy with a modern IDE). Everything else remains the same, and you keep more control over your code (splitting controllers into different folders, etc.).
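For illustration, a simplified sketch of the result (names follow the petstore example, not your API, and the real generated interface carries additional Swagger annotations). The interface below stands in for what swagger-codegen regenerates on every build, while the controller is the hand-written class that never gets overwritten:

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

// Simplified stand-in for the generated interface (lives under
// target/generated-sources and is never edited by hand):
interface PetApi {
    @RequestMapping(value = "/pet/{petId}", method = RequestMethod.GET)
    ResponseEntity<Pet> getPetById(@PathVariable("petId") Long petId);
}

// Simplified stand-in for a generated model class:
class Pet {
}

// Your hand-written controller: when the regenerated interface gains a new
// operation, the compiler (or your IDE) points at the missing method here.
@RestController
public class PetApiController implements PetApi {

    @Override
    public ResponseEntity<Pet> getPetById(Long petId) {
        // real implementation goes here
        return new ResponseEntity<>(new Pet(), HttpStatus.OK);
    }
}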
Here is the config I used for the plugin:
<plugin>
    <groupId>io.swagger</groupId>
    <artifactId>swagger-codegen-maven-plugin</artifactId>
    <version>2.2.3</version>
    <executions>
        <execution>
            <goals>
                <goal>generate</goal>
            </goals>
            <configuration>
                <inputSpec>./api-contract/petstore.yml</inputSpec>
                <language>spring</language>
                <configOptions>
                    <sourceFolder>swagger</sourceFolder>
                    <java8>true</java8>
                    <interfaceOnly>true</interfaceOnly>
                </configOptions>
            </configuration>
        </execution>
    </executions>
</plugin>
You can check a complete example project using Spring Boot with swagger-codegen-maven-plugin here.
Cheers

Having the same issue, I found that a workable solution is to lean on git:
1) Commit the current status.
2) Run the generator.
3) Use git to stage the intended changes, but do not stage the reversal of your manual edits.
4) Commit and continue.
I have only just started with this approach, but it seems to work, at least for php-slim, where only one file (index.php) is changed when regenerating.

Related

When should I use MapStruct or converters with Java 8 to avoid error-prone code?

At work, we use MapStruct in many Spring Boot projects with Java 8 RESTful applications, whenever we need to map an Entity to a DTO, a DTO to a Response, or in similar cases. But today a friend showed me a big advantage of using a simple Converter instead of MapStruct.
This is a simple example using MapStruct:
@Mapper(componentModel = "spring", unmappedTargetPolicy = ReportingPolicy.IGNORE)
public interface AccountMapper {

    @Mapping(source = "customerBank.customerId", target = "customerId")
    AccountResponse toResponse(AccountBank accountBank);
}
It works perfectly, but if someone renames the customerId attribute and forgets to change this mapper, we get a runtime error.
The advantage of a Converter is that we get a compile-time error instead of a runtime error.
Please let me know if someone has managed to avoid runtime errors in a scenario like the one above while using MapStruct, since a plain Converter does not bring the same mapping convenience.
My question is: is it possible to use MapStruct efficiently, i.e. without being prone to runtime errors?
If I understand correctly, you would like a compile-time error for wrong property names in MapStruct custom mappings.
If so, you need to add the annotation processor as a build plugin in your pom.xml (if you use Maven):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.5.1</version>
    <configuration>
        <source>1.8</source>
        <target>1.8</target>
        <annotationProcessorPaths>
            <path>
                <groupId>org.mapstruct</groupId>
                <artifactId>mapstruct-processor</artifactId>
                <version>${mapstruct.version}</version>
            </path>
        </annotationProcessorPaths>
    </configuration>
</plugin>
And of course declare a property for MapStruct version:
<properties>
    <mapstruct.version>1.4.1.Final</mapstruct.version>
</properties>
After compiling the project with this plugin in place, the annotation processor generates a full implementation,
public class AccountMapperImpl implements AccountMapper
in the target/generated-sources/annotations folder.
You can check the generated source of the implementation class; every mapping is spelled out there and type-checked.
If a property name used in a @Mapping annotation does not exist, the compiler raises an error.
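For reference, a simplified sketch of roughly what the generated class looks like (the real one also contains null checks for the nested property and a @Generated annotation):

// Simplified sketch of the class MapStruct generates under
// target/generated-sources/annotations:
public class AccountMapperImpl implements AccountMapper {

    @Override
    public AccountResponse toResponse(AccountBank accountBank) {
        if (accountBank == null) {
            return null;
        }
        AccountResponse accountResponse = new AccountResponse();
        // "customerBank.customerId" was resolved at build time, so renaming
        // the property without updating @Mapping breaks the compilation
        // instead of failing at runtime.
        accountResponse.setCustomerId(accountBank.getCustomerBank().getCustomerId());
        return accountResponse;
    }
}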

Maven site generation using advanced Markdown?

We are using Markdown in our Maven generated site. Works like a charm. AFAIK the plugin uses Flexmark under the hood, which supports the Admonition extensions.
We would like to use them too, the infoboxes are quite helpful for documentation. Our site configuration in the pom.xml looks like this:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-site-plugin</artifactId>
    <version>3.8.2</version>
</plugin>
How could we configure it to recognise the additional markdown?
This question is a little old, but I have been trying to figure out the same thing in order to use the GitLab extension for math equations, and the short answer is that I don't think you can do it out of the box.
The maven-site-plugin uses the doxia-module-markdown module for Markdown, which uses Flexmark internally, but it is pre-configured with a fixed set of extensions.
Here is the relevant snippet from that code:
// Initialize the Flexmark parser and renderer, once and for all
static
{
    MutableDataSet flexmarkOptions = new MutableDataSet();

    // Enable the extensions that we used to have in Pegdown
    flexmarkOptions.set( com.vladsch.flexmark.parser.Parser.EXTENSIONS, Arrays.asList(
            EscapedCharacterExtension.create(),
            AbbreviationExtension.create(),
            AutolinkExtension.create(),
            DefinitionExtension.create(),
            TypographicExtension.create(),
            TablesExtension.create(),
            WikiLinkExtension.create(),
            StrikethroughExtension.create()
    ) );

    // ...
}
I think you could fork that module, add the extensions you need, and then add your fork as a dependency of the maven-site-plugin, which might look like this:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-site-plugin</artifactId>
    <version>3.8.2</version>
    <dependencies>
        <dependency>
            <groupId>${my-forked-groupId}</groupId>
            <artifactId>${my-forked-artifactId}</artifactId>
            <version>${my-forked-version}</version>
        </dependency>
    </dependencies>
</plugin>
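For what it's worth, the change inside such a fork would roughly amount to appending the extra extensions to that static list. This is only a sketch, and it assumes the matching flexmark-ext-gitlab / flexmark-ext-admonition artifacts are added as dependencies of the fork:

// Hypothetical change in the forked doxia-module-markdown parser class:
static
{
    MutableDataSet flexmarkOptions = new MutableDataSet();

    flexmarkOptions.set( com.vladsch.flexmark.parser.Parser.EXTENSIONS, Arrays.asList(
            EscapedCharacterExtension.create(),
            AbbreviationExtension.create(),
            AutolinkExtension.create(),
            DefinitionExtension.create(),
            TypographicExtension.create(),
            TablesExtension.create(),
            WikiLinkExtension.create(),
            StrikethroughExtension.create(),
            // extra extensions needed by our site:
            com.vladsch.flexmark.ext.gitlab.GitLabExtension.create(),
            com.vladsch.flexmark.ext.admonition.AdmonitionExtension.create()
    ) );

    // ... rest of the original initialization unchanged ...
}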
In my opinion this is less than ideal, and I will probably explore other (non-Maven) options to get the result I'm looking for, but I will still try this out in the next day or so; if I get it working I'll post a link to the code in case anybody else can benefit from it.

Integrate Activiti Modeler using Maven

How can one integrate Activiti Modeler into their own web application and keep all the advantages Maven offers?
The problem is that in Maven, Activiti Modeler is part of Activiti Explorer. There are several questions online from people who want to develop their own web applications and use the Modeler to edit processes, but don't need the other Explorer features.
For example, Activiti BPM without activiti-explorer or How To Integrate Activiti Modeller Into own Web Application
I have managed to do this using the Maven overlay feature:
1) Include an overlay of the Explorer web app, but only the Modeler files:
pom.xml:
<dependencies>
    <dependency>
        <groupId>org.activiti</groupId>
        <artifactId>activiti-webapp-explorer2</artifactId>
        <version>${activiti-version}</version>
        <type>war</type>
        <scope>compile</scope>
    </dependency>
    ....
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-war-plugin</artifactId>
            <version>2.6</version>
            <configuration>
                <overlays>
                    <overlay>
                        <groupId>org.activiti</groupId>
                        <artifactId>activiti-webapp-explorer2</artifactId>
                        <includes>
                            <include>WEB-INF/classes/stencilset.json</include>
                            <include>editor-app/**</include>
                            <include>modeler.html</include>
                        </includes>
                    </overlay>
                </overlays>
            </configuration>
        </plugin>
    </plugins>
</build>
2) Add the Modeler Spring resources. They are used to retrieve and save models (note: not process definitions, which are a different thing) and also to serve the stencilset:
<dependency>
    <groupId>org.activiti</groupId>
    <artifactId>activiti-modeler</artifactId>
    <version>${activiti-version}</version>
</dependency>
3) That would be it, but it won't actually work unless your application is called "activiti-explorer". Add a file called "editor-app/app-cfg.js" to your project with the following content:
editor-app/app-cfg.js:
'use strict';

var ACTIVITI = ACTIVITI || {};

ACTIVITI.CONFIG = {
    'contextRoot': window.location.pathname.substring(0, window.location.pathname.lastIndexOf('/'))
};
This is a copy of the original app-cfg.js, which contains the strange "/activiti-explorer/service" setting for the context root. We change it to a more generic setting; it will be used to retrieve models from the repository. Our file overlays the one that ships with the Explorer webapp.
Notes:
a) You have to manage the conversion of process definitions to models yourself. For an idea, see https://github.com/Activiti/Activiti/blob/activiti-5.19.0.1/modules/activiti-explorer/src/main/java/org/activiti/editor/ui/ConvertProcessDefinitionPopupWindow.java
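A rough, untested sketch of such a conversion, along the lines of what the linked Explorer class does (the wrapper class is mine; the calls are the standard Activiti 5 RepositoryService / BpmnJsonConverter API):

import org.activiti.bpmn.model.BpmnModel;
import org.activiti.editor.language.json.converter.BpmnJsonConverter;
import org.activiti.engine.RepositoryService;
import org.activiti.engine.repository.Model;
import org.activiti.engine.repository.ProcessDefinition;

public class ProcessDefinitionToModelConverter {

    private final RepositoryService repositoryService;

    public ProcessDefinitionToModelConverter(RepositoryService repositoryService) {
        this.repositoryService = repositoryService;
    }

    /** Creates an editable Modeler model from a deployed process definition. */
    public Model convert(ProcessDefinition definition) throws Exception {
        // Read the deployed BPMN and turn it into the JSON the Modeler edits.
        BpmnModel bpmnModel = repositoryService.getBpmnModel(definition.getId());
        byte[] editorJson = new BpmnJsonConverter().convertToJson(bpmnModel)
                .toString().getBytes("UTF-8");

        // Store it as a Model so it shows up in the Modeler's repository.
        Model model = repositoryService.newModel();
        model.setName(definition.getName());
        model.setKey(definition.getKey());
        repositoryService.saveModel(model);
        repositoryService.addModelEditorSource(model.getId(), editorJson);
        return model;
    }
}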
b) I had to avoid using a single Jackson ObjectMapper for everything; I haven't researched why this didn't work:
<bean id="objectMapper" class="com.fasterxml.jackson.databind.ObjectMapper">
</bean>

<mvc:annotation-driven>
    <!-- using the same objectMapper leads to stencilset resource being served as string -->
    <mvc:message-converters register-defaults="true">
        <bean class="org.springframework.http.converter.json.MappingJackson2HttpMessageConverter">
            <property name="objectMapper" ref="objectMapper"/>
        </bean>
    </mvc:message-converters>
</mvc:annotation-driven>
Don't do this (or research it further), as it actually breaks the stencilset resource serving part of activiti-modeler: it starts serving the stencilset as a malformed string instead of normal JSON.
c) I had no idea how to inject the CSRF security headers into the Modeler's save function, so I switched CSRF protection off. If you don't use Spring Security, disregard this note.

How do I control spring injections that vary between the test environment and the production environment?

I'm setting up a CI situation in which I will deploy my web app to a test environment. In this test environment, I want the business objects used by the app to be mocks of the real ones; the mocks will return static test data. I'm using this to run tests against my UI. I'm controlling the injection of these business-object dependencies with Spring; it's a Struts 2 application, for what that's worth.
My question is Maven-related, I think. What is the best way to have my Maven build determine whether the Spring configuration should inject the mocks or the real thing? Is this a good use for Maven profiles? Are there other alternatives?
Spring itself has support for profiles (if you're using 3.1 or newer). For a web application you can use a context parameter in web.xml to set the default profile for different environments:
<context-param>
    <param-name>spring.profiles.default</param-name>
    <param-value>test</param-value>
</context-param>
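For illustration, here is one way the active profile then switches between the mock and the real business beans (the class names below are made up; the same split can be expressed in XML with nested <beans profile="..."> elements):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

interface ManagerA {
    String fetchData();
}

class ManagerAImpl implements ManagerA {
    public String fetchData() { return "real data"; }
}

class ManagerAMockImpl implements ManagerA {
    public String fetchData() { return "static test data"; }
}

// Only the configuration matching the active (or default) profile is applied.
@Configuration
@Profile("prod")
class ProdBusinessConfig {
    @Bean
    public ManagerA managerA() { return new ManagerAImpl(); }
}

@Configuration
@Profile("test")
class TestBusinessConfig {
    @Bean
    public ManagerA managerA() { return new ManagerAMockImpl(); }
}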
Edit: For Maven & Jenkins, you should be able to set the parameter for a build job as follows:
First, let Maven filter your XML resources (in this example only files ending with xml are filtered; others are included without filtering) by adding the following to your pom.xml inside the <build></build> tags:
<resources>
    <resource>
        <directory>src/main/webapp</directory>
        <filtering>true</filtering>
        <includes>
            <include>**/*xml</include>
        </includes>
    </resource>
    <resource>
        <directory>src/main/webapp</directory>
        <filtering>false</filtering>
        <excludes>
            <exclude>**/*xml</exclude>
        </excludes>
    </resource>
</resources>
Then, parameterize the context-param in your web.xml:
<context-param>
    <param-name>spring.profiles.default</param-name>
    <param-value>${env.SPRINGPROFILE}</param-value>
</context-param>
Then parameterize the build job in Jenkins to set the desired string parameter for SPRINGPROFILE (for example test or prod): https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Build
It's probably a bad idea to do anything with the build of the web app artifact ( Maven best practice for generating artifacts for multiple environments [prod, test, dev] with CI/Hudson support? ). While you could use various mechanisms to produce a WAR file with different configurations of the Spring injections for different contexts, the WAR artifact should be the same every time it's built.
In order to extract the configuration out of the WAR, I have used Spring 3's ability to pull in override values from an external property file. I define the default, i.e. production, values for my business objects, and I configure Spring to check for the existence of a properties file, which I deploy only when the app is in a testing environment and requires mock injections. If that properties file exists, its values are injected instead. Here's the relevant bit of the Spring config file:
<!-- These are the default values -->
<util:properties id="defaultBeanClasses">
    <prop key="myManagerA">com.myco.ManagerAImpl</prop>
    <prop key="myManagerB">com.myco.ManagerBImpl</prop>
</util:properties>

<!-- Pull in the mock overrides if they exist. -->
<context:property-placeholder
    location="file:///my/location/mockBeans.properties"
    ignore-resource-not-found="true"
    properties-ref="defaultBeanClasses"/>

<!-- The beans themselves. -->
<bean id="managerA" class="${myManagerA}"/>
<bean id="managerB" class="${myManagerB}"/>
And here is the contents of the external "mockBeans.properties" file:
#Define mock implementations for core managers
myManagerA=com.myco.ManagerAMockImpl
myManagerB=com.myco.ManagerBMockImpl
This works nicely. You can even include the mockBeans.properties file in the actual WAR if you like, just not in the live location; the test environment task would then be to move it to the location the Spring config points at. Alternatively, you could keep the mock properties in a completely different project.

Loading a partial Spring context

I am not much of a Spring expert, but was given a legacy system with a huge context file (not separated into modules).
I want to add some unit tests - that validates different parts of the system, with the actual production configuration.
I started using the ClassPathXmlApplicationContext/FileSystemXmlApplicationContext classes to load the context, however - that takes forever.
Is it possible to load only parts of the context file (recursively) without the need to separate the original file into modules?
Update:
I'll just post here my implementation of Ralph's solution using maven:
my pom.xml:
<plugin>
    <groupId>com.google.code.maven-config-processor-plugin</groupId>
    <artifactId>maven-config-processor-plugin</artifactId>
    <version>2.0</version>
    <configuration>
        <namespaceContexts>
            <s>http://www.springframework.org/schema/beans</s>
        </namespaceContexts>
        <transformations>
            <transformation>
                <input>context.xml</input>
                <output>context-test.xml</output>
                <config>test-context-transformation.xml</config>
            </transformation>
        </transformations>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>process</goal>
            </goals>
            <phase>test</phase>
        </execution>
    </executions>
</plugin>
my test-context-transformation.xml:
<processor>
    <add>
        <name>/s:beans</name>
        <value>
            <![CDATA[
                default-lazy-init="true"
            ]]>
        </value>
    </add>
</processor>
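A test can then load the transformed copy and fetch only the beans it exercises; with default-lazy-init="true" only those beans and their dependencies are instantiated, which is what makes startup fast. A minimal sketch (the path and bean name are hypothetical):

import org.springframework.context.support.FileSystemXmlApplicationContext;

public class PartialContextSmokeTest {

    public static void main(String[] args) {
        FileSystemXmlApplicationContext ctx =
                new FileSystemXmlApplicationContext("target/generated-config/context-test.xml");
        try {
            // Only this bean and its dependency graph are created on demand.
            Object manager = ctx.getBean("someManager");
            // ... exercise the bean ...
        } finally {
            ctx.close();
        }
    }
}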
If you are trying to run "unit" tests, you will not require the full application context at all. Just instantiate the class you want to test (and maybe its collaborators, though mocking may be a better option) and off you go. Unit tests should concentrate on single components in isolation - otherwise they are not unit tests.
If you are trying to run a full integration test by creating the complete object hierarchy defined in your application context, then it may be easiest by first refactoring your context and splitting it into modules - as you were suggesting already.
I guess it does not work out of the box, but you can try this (it is just an idea, I don't know if it works):
Spring supports so-called lazy initialization; the idea is to add it to all the beans.
I can imagine two ways:
1) A simple tool that creates a copy of the original configuration XML file and adds default-lazy-init="true" to the container-level <beans> declaration.
2) Try to do it programmatically, with a post-processor that "injects" the default-lazy-init="true" behaviour into the configuration (see the sketch below).
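A minimal sketch of the programmatic route: note that the hook that runs early enough to change bean definitions is a BeanFactoryPostProcessor rather than a regular BeanPostProcessor (which only runs once beans are already being created). Register something like this as a bean in the context you load for tests:

import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;

// Marks every bean definition as lazy before any bean is instantiated,
// so only the beans a test actually requests get created.
public class LazyInitBeanFactoryPostProcessor implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory)
            throws BeansException {
        for (String beanName : beanFactory.getBeanDefinitionNames()) {
            beanFactory.getBeanDefinition(beanName).setLazyInit(true);
        }
    }
}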
