lesscss-maven-plugin not generating Source map? - maven

I am currently using the lesscss-maven-plugin from org.codehaus.mojo; the build is OK and it compiles properly.
However, no source map is being generated.
In my pom.xml:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>lesscss-maven-plugin</artifactId>
<version>1.0-beta-3</version>
<configuration>
<force>true</force>
<lessJs>${project.basedir}/src/main/resources/lessjs/less.1.7.5-min.js</lessJs>
<sourceDirectory>${project.build.directory}/${project.build.finalName}/</sourceDirectory>
<outputDirectory>${project.build.directory}/${project.build.finalName}-LESS</outputDirectory>
<compress>true</compress>
<includes>
<include>r/assets/bootstrap.less</include>
<include>r/assets/page.less</include>
</includes>
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
</goals>
<phase>prepare-package</phase>
</execution>
</executions>
</plugin>
In less.js I added this at the top:
less: {
env: "development",
"development": {
options: {
compress: true,
yuicompress: true,
optimization: 2,
sourceMap: true,
sourceMapFilename: "r/css/styles.map"
},
files: {
"r/css/bootstrap.css": "r/assets/bootstrap.less",
"r/css/page.css": "r/assets/page.less"
}
}
}
Did I forget something? TIA.

Related

How to use suppressions in dependency-track

In my maven pom I have:
<properties>
<suppressions>./suppressions.json</suppressions>
</properties>
<plugin>
<groupId>dev.iabudiab</groupId>
<artifactId>dependency-track-maven-plugin</artifactId>
<version>2.1.0</version>
</plugin>
And in the root of my project I have a suppressions.json like this:
{
"suppressions": [
{
"by": "cve",
"cve": "CVE-2020-7019",
"expiration": "2024-12-31",
"note": "Bladi bla, reason because bladi"
}
]
}
Yet, after my build in Bamboo, I still see CVE-2020-7019 among the vulnerabilities.

Spring Boot Swagger UI not working with latest version

I am trying to add the Swagger UI in my Spring Boot Application, but I am unable to access the swagger-ui.html.
pom.xml
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.3.1.RELEASE</version>
</parent>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-spring-webflux</artifactId>
<version>3.0.0</version>
</dependency>
Code:
@Configuration
public class SwaggerConfig {
@Bean
public Docket api() {
return new Docket(DocumentationType.SWAGGER_2).select()
.apis(RequestHandlerSelectors.basePackage("com.learnings.search.web")).build().pathMapping("/")
.enableUrlTemplating(false);
}
}
@SpringBootApplication(exclude = { DataSourceAutoConfiguration.class, CassandraAutoConfiguration.class,
KafkaAutoConfiguration.class })
@EnableSwagger2
public class SearchApp {
public static void main(String[] args) {
SpringApplication.run(SearchApp.class, args);
}
}
I am able to access the swagger resources:
/swagger-resources
[
{
"name": "default",
"url": "/v2/api-docs",
"swaggerVersion": "2.0",
"location": "/v2/api-docs"
}
]
/swagger-resources/configuration/ui
{
"deepLinking": true,
"displayOperationId": false,
"defaultModelsExpandDepth": 1,
"defaultModelExpandDepth": 1,
"defaultModelRendering": "example",
"displayRequestDuration": false,
"docExpansion": "none",
"filter": false,
"operationsSorter": "alpha",
"showExtensions": false,
"showCommonExtensions": false,
"tagsSorter": "alpha",
"validatorUrl": "",
"supportedSubmitMethods": [
"get",
"put",
"post",
"delete",
"options",
"head",
"patch",
"trace"
],
"swaggerBaseUiUrl": ""
}
Please let me know if something has changed in the latest version, because I have been using the swagger-ui in my other projects.
In 3.0.0, swagger-ui.html has been moved to swagger-ui/index.html.
If you are using Springfox 3, you also need to add this dependency:
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-boot-starter</artifactId>
<version>3.0.0</version>
</dependency>
Also, add the @EnableOpenApi annotation to the main class of your Spring Boot application.
Hope this works. Thanks!
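Put together, the main class might look like the sketch below. This is only a sketch, assuming springfox-boot-starter is on the classpath and that @EnableOpenApi lives in the springfox.documentation.oas.annotations package (as in Springfox 3); the class name is reused from the question.

```java
// Sketch: with springfox-boot-starter on the classpath, @EnableOpenApi
// registers the OpenAPI docs and the UI is served at /swagger-ui/index.html.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import springfox.documentation.oas.annotations.EnableOpenApi;

@SpringBootApplication
@EnableOpenApi
public class SearchApp {
    public static void main(String[] args) {
        SpringApplication.run(SearchApp.class, args);
    }
}
```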

Kafka spring boot application producer and unable to reflect that with Kafka Sink Connector with Avro format

My goal: I have a Spring Boot Kafka producer with Avro serialization, and I expect the messages pushed to the topic to be picked up by the Confluent JDBC sink connector and inserted into MySQL/Oracle database tables. The producer can Avro-serialize, and a Spring Boot consumer can Avro-deserialize, but my sink connector is not working. I cannot work out what kind of payload the sink connector expects, or how the Spring Boot producer should be written so that the connector can cope with it.
Thanks in advance :)
This is the application.yml of the Spring Boot application:
server:
port: 9000
spring.kafka:
bootstrap-servers: "localhost:9092"
properties:
schema.registry.url: "http://localhost:8081"
specific.avro.reader: true
producer:
key-serializer: "io.confluent.kafka.serializers.KafkaAvroSerializer"
value-serializer: "io.confluent.kafka.serializers.KafkaAvroSerializer"
app:
topic: event_pay2
This is the payload used to create the schema from the Spring Boot application:
{
"schema": {
"type": "struct",
"fields": [
{
"type": "string",
"optional": false,
"field": "userid"
},
{
"type": "string",
"optional": false,
"field": "regionid"
},
{
"type": "string",
"optional": false,
"field": "gender"
}
],
"optional": false,
"name": "oracle.user"
},
"payload": {
"userid": "User_1",
"regionid": "Region_5",
"gender": "FEMALE"
}
}
Pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.0.5.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.kafka</groupId>
<artifactId>kafka-producer-example</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>kafka-producer-example</name>
<description>Demo project for Spring Boot</description>
<repositories>
<repository>
<id>confluent</id>
<url>http://packages.confluent.io/maven</url>
</repository>
</repositories>
<properties>
<java.version>1.8</java.version>
<confluent.version>4.0.0</confluent.version>
</properties>
<dependencies>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.confluent</groupId>
<artifactId>kafka-avro-serializer</artifactId>
<version>${confluent.version}</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
<version>1.8.2</version>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.avro</groupId>
<artifactId>avro-maven-plugin</artifactId>
<version>1.8.2</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>schema</goal>
</goals>
<configuration>
<sourceDirectory>${project.basedir}/src/main/resources/avro/</sourceDirectory>
<outputDirectory>${project.build.directory}/generated/avro</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
This is my REST call, i.e. how I push a message onto the Kafka topic:
@PostMapping("/publish/avrodata")
public String sendMessage(@RequestBody String request) {
sender.send(request);
return "Published successfully";
}
Finally, my sink connector configuration:
"name": "JDBC_Sink_EVENT_PAY",
"connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
"tasks.max": "1",
"topics": "event_pay2",
"connection.url": "jdbc:mysql://localhost:3306/user",
"connection.user": "****",
"connection.password": "****",
"auto.create": "true",
"auto.evolve":"true",
"key.converter": "io.confluent.connect.avro.AvroConverter",
"key.converter.schema.registry.url": "http://localhost:8081",
"key.converter.schemas.enable": "true",
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://localhost:8081",
"value.converter.schemas.enable": "true"
Always, always debug your topic before setting up a Connector. Use kafka-avro-console-consumer for this. If that doesn't work, then Connect + AvroConverter likely won't work either (in my experience), and you can reduce the problem space.
If I read your code correctly, you've sent a String, not an Avro object:
public String sendMessage(@RequestBody String request) {
sender.send(request); // <--- here and ^^^^^^ here
return "Published successfully";
}
Instead, you need to parse your input request into an object that was created as part of your /src/main/resources/avro schema data, not just forward through the incoming request as a string.
And that AVSC file might look something like
{
"type": "record",
"namespace": "oracle.user",
"name": "User",
"fields": [
{ "type": "string", "name": "userid" },
{ "type": "string", "name": "regionid" },
{ "type": "string", "name": "gender" }
]
}
This would create an oracle.user.User object, meaning your KafkaTemplate would need to be something like KafkaTemplate&lt;String, oracle.user.User&gt; sender.
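Concretely, the REST endpoint from the question could be reworked along these lines. This is a sketch only: User stands for the class the avro-maven-plugin would generate from the AVSC above, UserRequest is a hypothetical DTO mirroring the JSON fields, and sender is assumed to be a KafkaTemplate&lt;String, oracle.user.User&gt;.

```java
// Sketch only: build the Avro-generated User from the incoming request
// instead of forwarding the raw JSON string to Kafka.
@PostMapping("/publish/avrodata")
public String sendMessage(@RequestBody UserRequest request) {
    // oracle.user.User is the class avro-maven-plugin generates from the AVSC
    oracle.user.User user = oracle.user.User.newBuilder()
            .setUserid(request.getUserid())
            .setRegionid(request.getRegionid())
            .setGender(request.getGender())
            .build();
    sender.send("event_pay2", user); // sender: KafkaTemplate<String, oracle.user.User>
    return "Published successfully";
}
```

With an Avro object on the wire, KafkaAvroSerializer registers the schema and the AvroConverter on the Connect side can decode it.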

How to integrate ReactJS with Redux and Spring MVC

I am a ReactJS newbie. I used ReactJS on my company's project before, but that project was configured by someone else; I just applied what I know about ReactJS, such as state, props, components, and Redux.
Now I want to create a project by myself, step by step, in which I use ReactJS with Redux to manage state, with Spring as a RESTful server combined with JPA.
Please help me
You can integrate Spring and ReactJS using two approaches:
You can build an isomorphic web app, which uses the Nashorn engine (JDK 8).
You can use the Pusher API (I have used the Pusher API in my app).
As I cannot explain the full workings here, here are tutorials for reference:
using the Pusher API
isomorphic web app
There is no simple answer.
You can use the Maven plugin https://github.com/eirslett/frontend-maven-plugin.
Add something like this to your pom.xml:
<plugin>
<groupId>com.github.eirslett</groupId>
<artifactId>frontend-maven-plugin</artifactId>
<version>1.2</version>
<configuration>
<installDirectory>target</installDirectory>
</configuration>
<executions>
<execution>
<id>install node and npm</id>
<goals>
<goal>install-node-and-npm</goal>
</goals>
<configuration>
<nodeVersion>v4.4.5</nodeVersion>
<npmVersion>3.9.2</npmVersion>
</configuration>
</execution>
<execution>
<id>npm install</id>
<goals>
<goal>npm</goal>
</goals>
<configuration>
<arguments>install</arguments>
</configuration>
</execution>
<execution>
<id>webpack build</id>
<goals>
<goal>webpack</goal>
</goals>
</execution>
</executions>
</plugin>
and you need to add a package.json like this:
{
"dependencies": {
"react": "^15.4.2",
"react-dom": "^15.4.2",
"requirejs": "^2.3.2"
},
"devDependencies": {
"babel-core": "^6.22.1",
"babel-loader": "^6.2.10",
"babel-plugin-transform-regenerator": "^6.22.0",
"babel-preset-es2015": "^6.22.0",
"babel-preset-react": "^6.22.0",
"babel-preset-stage-0": "^6.22.0",
"react-frame-component": "^0.6.6",
"webpack": "^2.2.1",
"webpack-dev-server": "^2.3.0"
},
"scripts": {
"start": "webpack-dev-server --progress --inline --hot",
"build": "webpack -d"
}
}
But the list of dependencies is up to you.
And you need a webpack.config.js, something like this:
var path = require('path');
var webpack = require('webpack');
var packageJSON = require('./package.json');
module.exports = {
entry: [
'webpack/hot/only-dev-server',
'./src/main/resources/static/App.js'],
devtool: 'sourcemaps',
cache: true,
output: {
path: __dirname,
filename: './src/main/resources/static/built/bundle.js',
publicPath: 'http://localhost:3000/'
},
resolve: {extensions: ['.js', '.jsx']},
plugins: [
new webpack.HotModuleReplacementPlugin()
,new webpack.LoaderOptionsPlugin({
debug: true
})
],
module: {
loaders: [
{
test: path.join(__dirname, '.'),
exclude: /(node_modules)/,
loader: 'babel-loader',
query: {
cacheDirectory: true,
presets: ['es2015', 'react']
}
},
]
},
devServer: {
noInfo: false,
quiet: false,
lazy: false,
watchOptions: {
poll: true
}
}
};

Is there a way to minify an ExtJS application without Sencha CMD?

I have an existing ExtJS app developed using ExtJS 4.2.1.
I am using Closure minifier through Maven plugin minify-maven-plugin.
The generated minified JS files (without merge) works fine.
However, generated merged minified file throws undefined errors because the definition comes later in the merged file.
My question is, is there a way I can figure out the order I have to provide the plugin? (I don't want to use Sencha Cmd)
The app folder follows the structure
app/common, app/controller, app/model, app/proxy, app/store, app/utils, app/view
At the moment, this is how I have defined the build process in the Maven POM file:
<plugins>
<plugin>
<groupId>com.samaxes.maven</groupId>
<artifactId>minify-maven-plugin</artifactId>
<version>1.7.4</version>
<executions>
<execution>
<id>default-minify</id>
<phase>prepare-package</phase>
<goals>
<goal>minify</goal>
</goals>
<inherited>false</inherited>
<configuration>
<charset>UTF-8</charset>
<!-- <skipMerge>true</skipMerge> -->
<webappSourceDir>${basedir}/src/main</webappSourceDir>
<jsSourceDir>js/app</jsSourceDir>
<jsTargetDir>js/app</jsTargetDir>
<jsEngine>CLOSURE</jsEngine>
<closureLanguage>ECMASCRIPT5</closureLanguage>
<closureAngularPass>true</closureAngularPass>
<nosuffix>true</nosuffix>
<webappTargetDir>${project.build.directory}</webappTargetDir>
<jsSourceIncludes>
<jsSourceInclude>**/*.js</jsSourceInclude>
</jsSourceIncludes>
</configuration>
</execution>
</executions>
</plugin>
Why not use Sencha Cmd? It does exactly what you want!
Maybe it helps to know that you can use Sencha Cmd without the Sencha application structure. If you only want to merge the files, use the concatenate command.
If you really don't want to use Sencha Cmd, then you have to take care of all the extends, requires, mixins and so on... and I would not recommend that!
For example, use Sencha Cmd with manual paths and exclude the ExtJS classes:
sencha compile --classpath=myApp/src,extjs/src -debug=false exclude -all and include -namespace MyApp.* and concat bundle.js
The extjs/src path is the path where your ExtJS classes are.
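To illustrate what "taking care of all the extends, requires, mixins" amounts to: it is essentially a topological sort of the class dependency graph, emitting each file only after everything it depends on. Below is a minimal sketch in Java; the class names and the requires map are made up, and a real tool would have to build the map by parsing the Ext.define calls.

```java
import java.util.*;

public class LoadOrder {
    // Depth-first post-order visit: each class is emitted after its
    // dependencies, which is the order a merged/minified file needs.
    static List<String> order(Map<String, List<String>> requires) {
        List<String> out = new ArrayList<>();
        Set<String> done = new HashSet<>();
        for (String cls : requires.keySet()) {
            visit(cls, requires, done, out);
        }
        return out;
    }

    static void visit(String cls, Map<String, List<String>> requires,
                      Set<String> done, List<String> out) {
        if (!done.add(cls)) return; // already emitted (sketch assumes no cycles)
        for (String dep : requires.getOrDefault(cls, List.of())) {
            visit(dep, requires, done, out);
        }
        out.add(cls);
    }

    public static void main(String[] args) {
        // Hypothetical dependency map, mimicking Ext.define(...).requires
        Map<String, List<String>> requires = new LinkedHashMap<>();
        requires.put("MyApp.view.Main", List.of("MyApp.store.Users"));
        requires.put("MyApp.store.Users", List.of("MyApp.model.User"));
        requires.put("MyApp.model.User", List.of());
        System.out.println(order(requires));
        // prints [MyApp.model.User, MyApp.store.Users, MyApp.view.Main]
    }
}
```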
Since your question is about the order of files for minification, here is that information:
We had a similar requirement where I could not use Sencha Cmd as-is to minify files, so I created a JSB file on my own [I know this is not recommended :( ].
What I did was create the JSB file below [please note: the sequence of files is very important]:
{
"projectName": "ProductName",
"builds": [
/** This file is for production purpose **/
{
"name": "ProductName - Production",
"target": "all-app.js",
"compress": true,
"files": [
/** utils **/
{
"path": "app/util/",
"name": "util.js"
},
/** models **/
{
"path": "app/model/",
"name": "MyModel.js"
},
/** stores **/
{
"path": "app/store/",
"name": "MyStore.js"
},
/** custom components **/
{
"path": "resources/ux/form/",
"name": "MySearchField.js"
},
/** views **/
{
"path": "app/view/admin/",
"name": "MyView.js"
},
/** controllers **/
{
"path": "app/controller/",
"name": "Window.js"
},
/** app.js **/
{
"path": "",
"name": "app.js"
}
]
},
/** This file is for debug purpose **/
{
"name": "ProductName - debug",
"target": "all-app-debug.js",
"compress": false,
"files": [
/** utils **/
{
"path": "app/util/",
"name": "util.js"
},
/** models **/
{
"path": "app/model/",
"name": "MyModel.js"
},
/** stores **/
{
"path": "app/store/",
"name": "MyStore.js"
},
/** custom components **/
{
"path": "resources/ux/form/",
"name": "MySearchField.js"
},
/** views **/
{
"path": "app/view/admin/",
"name": "MyView.js"
},
/** controllers **/
{
"path": "app/controller/",
"name": "Window.js"
},
/** app.js **/
{
"path": "",
"name": "app.js"
}
]
}
],
"resources" : []
}
If you create a minified file using the above sequence, it will work in your case as well.
Hope this helps.
Thanks!