Querydsl fetchCount() & fetch() NullPointerException, Connection is closed - spring-boot

My goal is to implement a DAO method with pagination, sorting, and filtering. For pagination I need to first get the count, then set offset & limit and fetch the result (so just one "page" from the database):
@Slf4j
@Repository
public class UserJdbcRepository {

    private final SQLQueryFactory queryFactory;

    @Autowired
    public UserJdbcRepository(DataSource dataSource) {
        Configuration configuration = new Configuration(new OracleTemplates());
        configuration.setExceptionTranslator(new SpringExceptionTranslator());
        this.queryFactory = new SQLQueryFactory(configuration, dataSource);
    }

    public Page<User> findAll(BooleanExpression predicate, Pageable pageable) {
        QUser u = new QUser("u");
        SQLQuery<Tuple> sql = queryFactory
                .select(u.userId /* other columns omitted */)
                .from(u)
                .where(predicate);

        long count = sql.fetchCount();
        List<Tuple> results = sql.fetch();

        // Conversion of List<Tuple> to List<User> omitted
        return new PageImpl<>(users, pageable, count);
    }
}
fetchCount() executes correctly, but fetch() throws a NullPointerException:
java.lang.NullPointerException: null
at com.querydsl.sql.AbstractSQLQuery.fetch(AbstractSQLQuery.java:502) ~[querydsl-sql-4.4.0.jar:na]
From debugging I found that the root cause is in com.querydsl.sql.AbstractSQLQuery:
java.sql.SQLException: Connection is closed
If I create a second query sql2 (identical to the first one), then it works (of course):
SQLQuery<Tuple> sql2 = queryFactory
        .select(... /* same as the first query */);

long count = sql.fetchCount();
List<Tuple> results = sql2.fetch();
My question is whether the connection should really be closed after fetchCount() is called, or whether I have some misconfiguration.
I have Spring Boot 2.4.5, spring-data-commons 2.5.0, Oracle driver ojdbc8 21.1.0.0, and Querydsl 4.4.0:
<dependency>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-sql</artifactId>
<version>${querydsl.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-sql-spring</artifactId>
<version>${querydsl.version}</version>
<scope>compile</scope>
</dependency>
<plugin>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-maven-plugin</artifactId>
<version>${querydsl.version}</version>
<executions>
<execution>
<goals>
<goal>export</goal>
</goals>
</execution>
</executions>
<configuration>
<jdbcDriver>oracle.jdbc.OracleDriver</jdbcDriver>
<jdbcUrl>jdbc:oracle:thin:@//localhost:1521/XE</jdbcUrl>
<jdbcUser>user</jdbcUser>
<jdbcPassword>password</jdbcPassword>
<sourceFolder>${project.basedir}/src/main/java</sourceFolder>
<targetFolder>${project.basedir}/src/main/java</targetFolder>
<packageName>org.project.backend.repository.querydsl</packageName>
<schemaToPackage>true</schemaToPackage>
<schemaPattern>project</schemaPattern>
<tableNamePattern><!-- omitted --></tableNamePattern>
</configuration>
</plugin>

The issue was caused by misconfiguration. The Querydsl documentation has a Spring integration section which mentions that SpringConnectionProvider must be used. So I changed my constructor, and it now works as expected:
@Autowired
public UserJdbcRepository(DataSource dataSource) {
    Configuration configuration = new Configuration(new OracleTemplates());
    configuration.setExceptionTranslator(new SpringExceptionTranslator());
    // wrong: this.queryFactory = new SQLQueryFactory(configuration, dataSource);
    Provider<Connection> provider = new SpringConnectionProvider(dataSource);
    this.queryFactory = new SQLQueryFactory(configuration, provider);
}
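An alternative wiring (not from the original post, just a sketch of the same fix) is to expose the factory as a Spring bean once, so every repository shares it and the connection stays bound to the current Spring transaction:
import java.sql.Connection;
import javax.inject.Provider;
import javax.sql.DataSource;
import com.querydsl.sql.Configuration;
import com.querydsl.sql.OracleTemplates;
import com.querydsl.sql.SQLQueryFactory;
import com.querydsl.sql.spring.SpringConnectionProvider;
import com.querydsl.sql.spring.SpringExceptionTranslator;
import org.springframework.context.annotation.Bean;

@org.springframework.context.annotation.Configuration
public class QuerydslConfiguration {

    @Bean
    public SQLQueryFactory sqlQueryFactory(DataSource dataSource) {
        Configuration configuration = new Configuration(new OracleTemplates());
        configuration.setExceptionTranslator(new SpringExceptionTranslator());
        // SpringConnectionProvider hands out the connection bound to the current
        // Spring transaction instead of letting Querydsl open and close its own
        Provider<Connection> provider = new SpringConnectionProvider(dataSource);
        return new SQLQueryFactory(configuration, provider);
    }
}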
I also found the useful method fetchResults(), which includes the count for pagination purposes (so there is no need to call fetchCount() explicitly):
public Page<User> findAll(BooleanExpression predicate, Pageable pageable) {
    QUser u = new QUser("u");
    SQLQuery<Tuple> sql = queryFactory
            .select(u.userId /* other columns omitted */)
            .from(u)
            .where(predicate);

    sql.offset(pageable.getOffset());
    sql.limit(pageable.getPageSize());

    QueryResults<Tuple> queryResults = sql.fetchResults();
    long count = queryResults.getTotal();
    List<Tuple> results = queryResults.getResults();

    // Conversion of List<Tuple> to List<User> omitted
    return new PageImpl<>(users, pageable, count);
}
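For completeness, a call site could look like this; the predicate below is a made-up example, any BooleanExpression built from the generated QUser works:
QUser u = new QUser("u");
// hypothetical filter; replace with the real filtering predicate
Page<User> page = userJdbcRepository.findAll(u.userId.isNotNull(), PageRequest.of(0, 20));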

Related

Accessing files in a Jar using ClassPathResource

I have a Spring application that I must convert to a jar. In this application I have a unit test:
@BeforeEach
void setUp() throws IOException {
    // facturxHelper = new FacturxHelper();
    facturxService = new FacturxService();
    // String pdf = "facture.pdf"; // invalid pdfa1
    String pdf = "resources/VALID PDFA1.pdf";
    // InputStream sourceStream = new FileInputStream(pdf);
    InputStream sourceStream = getClass().getClassLoader().getResourceAsStream(pdf);
    byte[] sourceBytes = IOUtils.toByteArray(sourceStream);
    this.b64Pdf = Base64.getEncoder().encodeToString(sourceBytes);
}
@Test
void createFacturxMin() throws Exception {
    // create a Factur-X from the request object
    FacturxRequestMin request = FacturxRequestMin.builder()
            .pdf(this.b64Pdf)
            .chorusPro(Boolean.FALSE)
            .invoiceNumber("FA-2017-0010")
            .issueDate("13/11/2017")
            .buyerReference("SERVEXEC")
            .seller(TradeParty.builder()
                    .name("Au bon moulin")
                    .specifiedLegalOrganization(LegalOrganization.builder()
                            .id("99999999800010")
                            .scheme(SpecifiedLegalOrganizationScheme.FR_SIRENE.getSpecifiedLegalOrganizationScheme())
                            .build())
                    .postalAddress(PostalAddress.builder()
                            .countryId(CountryIso.FR.name())
                            .build())
                    .vatId("FR11999999998")
                    .build())
            .buyer(TradeParty.builder()
                    .name("Ma jolie boutique")
                    .specifiedLegalOrganization(LegalOrganization.builder()
                            .id("78787878400035")
                            .scheme(SpecifiedLegalOrganizationScheme.FR_SIRENE.getSpecifiedLegalOrganizationScheme())
                            .build())
                    .build())
            .headerMonetarySummation(HeaderMonetarySummation.builder()
                    .taxBasisTotalAmount("624.90")
                    .taxTotalAmount("46.25")
                    .prepaidAmount("201.00")
                    .grandTotalAmount("671.15")
                    .duePayableAmount("470.15")
                    .build())
            .build();
    FacturXAppManager facturXAppManager = new FacturXAppManager(facturxService);
    FacturxResponse facturxResponse = facturXAppManager.createFacturxMin(request);
    Gson gson = new GsonBuilder().setPrettyPrinting().create();
    String json = gson.toJson(facturxResponse);
    System.out.println(json);
}
The aim of the application is to create an XML file and embed it into the PDF. My issue concerns XML validation against an XSD.
Here is an extract of the code:
public static boolean xmlValidator(String fxGuideLine, String xmlString) throws Exception {
    System.out.println("xmlValidator() called");
    File xsdFile = null;
    Source source = new StreamSource(new StringReader(xmlString));
    // a lot of if/else statements selecting the XSD used for validation are omitted
    try {
        xsdFile = new ClassPathResource(FacturxConstants.FACTUR_X_MINIMUM_XSD).getFile();
    } catch (IOException e) {
        throw new FacturxException(e.getMessage());
    }
    // validation of the XML content
    try {
        SchemaFactory schemaFactory = SchemaFactory
                .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = schemaFactory.newSchema(xsdFile);
        Validator validator = schema.newValidator();
        validator.validate(source);
        return true;
    } catch (SAXException | IOException e) {
        throw new FacturxException(e.getLocalizedMessage());
    }
    ...
}
In the constants class, I added the path to the XSD file:
public static final String FACTUR_X_MINIMUM_XSD = "resources/xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
In my POM file I want to include the resource files in the built jar.
<build>
<finalName>${project.artifactId}</finalName>
<resources>
<resource>
<directory>src/main/resources</directory>
<includes>
<include>*</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>3.3.0</version>
<configuration>
<outputDirectory> ${project.build.outputDirectory}\resources</outputDirectory>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.4.2</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</build>
When I run a simple mvn clean package, everything works perfectly. So far so good.
The next step is where my problem comes in. Let's say I want to use this dependency in another application (a Spring Boot application). The previously compiled jar is a high-level API that I want to integrate.
I ran the following command line:
mvn install:install-file -Dfile=myapi.jar -DgroupId=fr.myapi -DartifactId=graph-api-sharepoint -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar
I added my dependency correctly in my new project; that's perfect.
To check that the import worked correctly, I created a simple unit test with the same code (I do have a VALID PDFA1.pdf in my resources folder). So far so good.
When running the test I get the following error:
class path resource [resources/xsd/BASIC-WL_XSD/FACTUR-X_BASIC-WL.xsd] cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/.m2/repository/fr/myapi/1.1.0/myapi-1.1.0.jar!/resources/xsd/BASIC-WL_XSD/FACTUR-X_BASIC-WL.xsd
How can I fix this issue? I read many posts but none of the fixes solved my problem. I also think that I will have the same issue when compiling the Spring Boot app as a jar.
As mentioned, using a File won't work. In the current code I updated it to use an InputStream:
InputStream is = new ClassPathResource(FacturxConstants.FACTUR_X_MINIMUM_XSD).getInputStream();
xsdSource = new StreamSource(is);
If my XSD path doesn't have the resources prefix:
public static final String FACTUR_X_MINIMUM_XSD = "xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
I have the following exception:
class path resource [xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd] cannot be opened because it does not exist
If I put
public static final String FACTUR_X_MINIMUM_XSD = "resources/xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
the response is the following:
src-resolve: Cannot resolve the name 'ram:ExchangedDocumentContextType' to a(n) 'type definition' component.
I also updated the SchemaFactory and Schema implementation:
SchemaFactory schemaFactory =
SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
Schema schema = schemaFactory.newSchema(xsdSource);
Validator validator = schema.newValidator();
validator.validate(source);
return true;
public static final String FACTUR_X_MINIMUM_XSD = "resources/xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
is wrong. It should be (assuming src/main/resources/xsd is the actual location you are using):
public static final String FACTUR_X_MINIMUM_XSD = "/xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
Then your code is using a java.io.File, which won't work, because a java.io.File must be a physical file on the file system, and this one isn't, since it is inside a jar file. You need to use an InputStream.
public static boolean xmlValidator(String fxGuideLine, String xmlString) throws Exception {
    System.out.println("xmlValidator() called");
    Source source = new StreamSource(new StringReader(xmlString));
    // a lot of if/else statements selecting the XSD used for validation are omitted
    try {
        InputStream xsd = new ClassPathResource(FacturxConstants.FACTUR_X_MINIMUM_XSD).getInputStream();
        StreamSource xsdSource = new StreamSource(xsd);
        SchemaFactory schemaFactory = SchemaFactory
                .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = schemaFactory.newSchema(xsdSource);
        Validator validator = schema.newValidator();
        validator.validate(source);
        return true;
    } catch (SAXException | IOException e) {
        throw new FacturxException(e.getLocalizedMessage());
    }
    ...
}
This loads the schema using an InputStream.
Thanks to M. Deinum, I was able to find a solution. I indeed had to use StreamSource. However, this didn't solve the following issue:
src-resolve: Cannot resolve the name 'ram:ExchangedDocumentContextType' to a(n) 'type definition' component.
As I used several XSD files, I implemented a way to retrieve a list of sources using PathMatchingResourcePatternResolver (from Spring):
private static Source[] buildSources(String fxGuideLine, String pattern) throws SAXException, IOException {
    List<Source> sources = new ArrayList<>();
    PathMatchingResourcePatternResolver patternResolver = new PathMatchingResourcePatternResolver();
    Resource[] resources = patternResolver.getResources(pattern);
    for (Resource resource : resources) {
        StreamSource dtd = new StreamSource(resource.getInputStream());
        dtd.setSystemId(resource.getURI().toString());
        sources.add(dtd);
    }
    return sources.toArray(new Source[sources.size()]);
}
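For reference, this is roughly how the sources could be fed into the validation; the classpath pattern below is an assumption based on the folder layout shown earlier, not taken from the original code:
// resolve all XSDs of the profile from the classpath; the pattern is illustrative
Source[] xsdSources = buildSources(fxGuideLine, "classpath*:xsd/MINIMUM_XSD/*.xsd");

SchemaFactory schemaFactory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
// newSchema(Source[]) lets the imported schemas (e.g. the ram: types from the error) resolve each other via their systemIds
Schema schema = schemaFactory.newSchema(xsdSources);
schema.newValidator().validate(new StreamSource(new StringReader(xmlString)));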

Spring Boot Application using Oracle - ORA-01000: maximum open cursors exceeded while using Spring Jdbctemplate

I know there are a lot of solutions for this on the internet, but nothing seems to work for me.
I have the following entries in the pom.xml file for my JDK 11 app:
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.3.9.RELEASE</version>
</parent>
<dependency>
<groupId>com.oracle.database.jdbc</groupId>
<artifactId>ojdbc8</artifactId>
<version>21.1.0.0</version>
</dependency>
<dependency>
<groupId>com.oracle.database.ha</groupId>
<artifactId>ons</artifactId>
<version>21.1.0.0</version>
</dependency>
<dependency>
<groupId>com.oracle.database.jdbc</groupId>
<artifactId>ucp</artifactId>
<version>21.1.0.0</version>
</dependency>
I am using a connection pool with the following config:
platform: oracle
url: jdbc:oracle:thin:@localhost:1521:TEST
connectionFactoryClassName: oracle.jdbc.pool.OracleDataSource
fastConnectionFailoverEnabled: true
username: test
password: test
initialPoolSize: 3
minPoolSize: 0
maxPoolSize: 12
inactivityTimeout: 300
queryTimeout: 600
validateConnectionOnBorrow: true
I am only querying a table, with no adds or updates to Oracle records, something like this:
PoolDataSource dataSource = PoolDataSourceFactory.getPoolDataSource();

@Bean
public DaoSpringJdbcImpl listenerDAO(final DataSource dataSource) {
    NamedParameterJdbcTemplate template = new NamedParameterJdbcTemplate(dataSource);
    DaoSpringJdbcImpl listenerDAO = new DaoSpringJdbcImpl(template);
    return listenerDAO;
}

public class DaoSpringJdbcImpl {
    private NamedParameterJdbcTemplate jdbcTemplate;

    public DaoSpringJdbcImpl(NamedParameterJdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public void method() {
        List<String> results = jdbcTemplate.query(SQL_QUERY, params,
            new RowMapper<String>() {
                public String mapRow(ResultSet rs, int rowNum) throws SQLException {
                    return rs.getString(1); // JDBC columns are 1-based
                }
            });
    }
}
So every time my application runs a query it opens a new cursor and never closes it, ultimately resulting in the open-cursors exception.
PS: I tried adding the property spring.jdbc.getParameterType.ignore=true, which didn't work.

flink elasticsearch connector

I used the following code to connect Flink to Elasticsearch, but when running it with Flink a lot of errors are displayed. The program first reads data from a port, then processes each line, and displays the word counts. The main problem is the connection to Elasticsearch, which unfortunately fails. What is causing these errors? Which classes do you need to connect a minimal Flink program to Elasticsearch?
public class Elastic {
public static void main(String[] args) throws Exception {
// the port to connect to
final int port;
try {
final ParameterTool params = ParameterTool.fromArgs(args);
port = params.getInt("port");
} catch (Exception e) {
System.err.println("No port specified. Please run 'SocketWindowWordCount --port <port>'");
return;
}
// get the execution environment
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
// get input data by connecting to the socket
DataStream<String> text = env.socketTextStream("localhost", port, "\n");
// parse the data, group it, window it, and aggregate the counts
DataStream<WordWithCount> windowCounts = text
.flatMap(new FlatMapFunction<String, WordWithCount>() {
@Override
public void flatMap(String value, Collector<WordWithCount> out) {
for (String word : value.split("\\s")) {
out.collect(new WordWithCount(word, 1L));
}
}
})
.keyBy("word")
.timeWindow(Time.seconds(5), Time.seconds(1))
.reduce(new ReduceFunction<WordWithCount>() {
@Override
public WordWithCount reduce(WordWithCount a, WordWithCount b) {
return new WordWithCount(a.word, a.count + b.count);
}
});
// print the results with a single thread, rather than in parallel
windowCounts.print().setParallelism(1);
text.print().setParallelism(1);
env.execute("Socket Window WordCount");
List<HttpHost> httpHosts = new ArrayList<HttpHost>();
httpHosts.add(new HttpHost("127.0.0.1", 9200, "http"));
httpHosts.add(new HttpHost("10.2.3.1", 9200, "http"));
httpHosts.add(new HttpHost("my-ip",9200,"http"));
ElasticsearchSink.Builder<String> esSinkBuilder = new ElasticsearchSink.Builder<String>(
httpHosts,
new ElasticsearchSinkFunction<String>() {
public IndexRequest createIndexRequest(String element) {
Map<String, String> json = new HashMap<String, String>();
json.put("data", element);
return Requests.indexRequest()
.index("iran")
.type("int")
.source(json);
}
@Override
public void process(String element, RuntimeContext ctx, RequestIndexer indexer) {
indexer.add(createIndexRequest(element));
}
}
);
esSinkBuilder.setBulkFlushMaxActions(1);
final Header[] defaultHeaders = new Header[]{new BasicHeader("header", "value")};
esSinkBuilder.setRestClientFactory(new RestClientFactory() {
@Override
public void configureRestClientBuilder(RestClientBuilder restClientBuilder) {
restClientBuilder.setDefaultHeaders(defaultHeaders)
.setMaxRetryTimeoutMillis(10000)
.setPathPrefix("a")
.setRequestConfigCallback(new RestClientBuilder.RequestConfigCallback() {
@Override
public RequestConfig.Builder customizeRequestConfig(RequestConfig.Builder builder) {
return builder.setSocketTimeout(10000);
}
});
}
});
text.addSink(esSinkBuilder.build());
}
// Data type for words with count
public static class WordWithCount {
public String word;
public long count;
public WordWithCount() {
}
public WordWithCount(String word, long count) {
this.word = word;
this.count = count;
}
@Override
public String toString() {
return word + " : " + count;
}
}
}
my elasticsearch version: 7.5.0
my flink version: 1.8.3
my error:
sudo /etc/flink-1.8.3/bin/flink run -c org.apache.flink.Elastic /root/FlinkElastic-1.0.jar --port 9000
------------------------------------------------------------
The program finished with the following exception:
java.lang.RuntimeException: Could not look up the main(String[]) method from the class
org.apache.flink.Elastic:
org/apache/flink/streaming/connectors/elasticsearch/ElasticsearchSinkFunction
at org.apache.flink.client.program.PackagedProgram.hasMainMethod(PackagedProgram.java:527)
at org.apache.flink.client.program.PackagedProgram.<init>(PackagedProgram.java:246)
... 7 more
Caused by: java.lang.NoClassDefFoundError:
org/apache/flink/streaming/connectors/elasticsearch/ElasticsearchSinkFunction
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at org.apache.flink.client.program.PackagedProgram.hasMainMethod(PackagedProgram.java:521)
... 7 more
Caused by: java.lang.ClassNotFoundException:
org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader.loadClass(FlinkUserCodeClassLoaders.java:120)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 13 more
my pom:
<groupId>org.apache.flink</groupId>
<artifactId>FlinkElastic</artifactId>
<version>1.0</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.6.1</version>
<configuration>
<source>6</source>
<target>6</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-elasticsearch6_2.11</artifactId>
<version>1.8.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.8.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>1.8.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>1.8.3</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
Please find the Flink Elastic connector code here. I have used the dependencies and versions mentioned below.
Flink: 1.10.0
ElasticSearch: 7.6.2
flink-connector-elasticsearch7
Scala: 2.12.11
SBT: 1.2.8
Java: 11.0.4
A point to note here: from Elasticsearch 6.x onwards they fully support the REST elastic client, while up to Elasticsearch 5.x they were using the Transport elastic client.
1. Flink DataStream
val inputStream: DataStream[(String, String)] = ...
ESSinkService.sinkToES(inputStream, index)
2. ElasticsearchSink function
package demo.elastic
import org.apache.flink.streaming.api.scala._
import org.apache.log4j._
import org.apache.flink.api.common.functions.RuntimeContext
import org.apache.flink.streaming.connectors.elasticsearch7.{ElasticsearchSink, RestClientFactory}
import org.apache.flink.streaming.connectors.elasticsearch.{ActionRequestFailureHandler, ElasticsearchSinkFunction, RequestIndexer}
import org.apache.http.HttpHost
import org.elasticsearch.client.{Requests, RestClientBuilder}
import org.elasticsearch.common.xcontent.XContentType
import org.elasticsearch.action.ActionRequest
import org.apache.flink.streaming.api.datastream.DataStreamSink
class ESSinkService {
val logger = Logger.getLogger(getClass.getName)
val httpHosts = new java.util.ArrayList[HttpHost]
httpHosts.add(new HttpHost("localhost", 9200, "http"))
httpHosts.add(new HttpHost("localhost", 9200, "http"))
def sinkToES(counted: DataStream[(String, String)], index: String): DataStreamSink[(String, String)] = {
val esSinkBuilder = new ElasticsearchSink.Builder[(String, String)](
httpHosts, new ElasticsearchSinkFunction[(String, String)] {
def process(element: (String, String), ctx: RuntimeContext, indexer: RequestIndexer) {
indexer.add(Requests.indexRequest
.index(element._2 + "_" + index)
.source(element._1, XContentType.JSON))
}
}
)
esSinkBuilder.setBulkFlushMaxActions(2)
esSinkBuilder.setBulkFlushInterval(1000L)
esSinkBuilder.setFailureHandler(new ActionRequestFailureHandler {
override def onFailure(actionRequest: ActionRequest, throwable: Throwable, i: Int, requestIndexer: RequestIndexer): Unit = {
println("#######On failure from ElasticsearchSink:-->" + throwable.getMessage)
}
})
esSinkBuilder.setRestClientFactory(new RestClientFactory {
override def configureRestClientBuilder(restClientBuilder: RestClientBuilder): Unit = {
/*restClientBuilder.setDefaultHeaders(...)
restClientBuilder.setMaxRetryTimeoutMillis(...)
restClientBuilder.setPathPrefix(...)
restClientBuilder.setHttpClientConfigCallback(...)*/
}
})
counted.addSink(esSinkBuilder.build())
}
}
object ESSinkService extends ESSinkService
Note: For more details click here.
A couple of things:
Flink doesn't yet support Elasticsearch 7. An ES7 connector will be released along with Flink 1.10.
You must include the flink/elasticsearch dependency in your project -- this error suggests you haven't included it:
ClassNotFoundException:
org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction
See the elasticsearch docs for more info.
Your Flink application code runs in the task managers. Each task manager must be able to find all of your application's dependencies in its CLASSPATH. The connector classes are not included out-of-the-box, so you will need to either build an uber jar (i.e., a fat jar, or jar with dependencies), or copy the flink-connector-elasticsearch6_2.11 jar file into the lib directory of every machine in the cluster. See the docs on connector dependencies for more details.
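If you go the uber-jar route with Maven, a minimal sketch of the Shade plugin configuration follows (version and placement are illustrative, not taken from the question's pom); it bundles the connector classes into the job jar so the task managers can load them:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>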

Hikari + Hibernate + Postgres: can't connect

I have an application that uses C3P0 with Hibernate 5. I would like to try out Hikari, but I'm unable to get the application to run.
Maven
<dependency>
<groupId>com.zaxxer</groupId>
<artifactId>HikariCP</artifactId>
<version>3.4.1</version>
</dependency>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.2.8</version>
</dependency>
Hibernate version is: 5.2.17.Final
Spring configuration
public LocalSessionFactoryBean sessionFactory()
throws Exception
{
LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
sessionFactory.setDataSource( dataSource() );
...
}
private DataSource dataSource()
{
String jdbcUrl = "jdbc:postgresql://localhost/demo";
String user = "john";
String pw = "pwd123";
String url = String.format( "%s?user=%s&password=%s", jdbcUrl, user, pw);
final HikariDataSource ds = new HikariDataSource();
ds.setMaximumPoolSize( 10 );
ds.setDataSourceClassName( "org.postgresql.ds.PGSimpleDataSource" );
ds.addDataSourceProperty( "url", url );
...
return ds;
}
I have tried different permutations of the above approach, including passing the username and password into the DataSource directly:
ds.addDataSourceProperty( "user", "john" );
ds.addDataSourceProperty( "password", "pw123" );
But I always end up with this error:
Caused by: java.sql.SQLFeatureNotSupportedException
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:135)
which is caused by this Hikari function:
@Override
public Connection getConnection(String username, String password) throws SQLException
{
throw new SQLFeatureNotSupportedException();
}
Look at this issue. It says you have to use either jdbcUrl or dataSourceClassName, but not both. I see you are using both, so just use one, as mentioned in that issue, and check whether that helps.
Apart from that, I just checked when that getConnection(String username, String password) gets called: it looks like it is called when you have hibernate.connection.username and/or hibernate.connection.password set. Make sure you don't have those two properties set somewhere.
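Following that advice, a minimal sketch of the dataSource() method configured via jdbcUrl only (the values are reused from the question; the setters are standard HikariCP API):
private DataSource dataSource()
{
    final HikariDataSource ds = new HikariDataSource();
    ds.setMaximumPoolSize( 10 );
    // use jdbcUrl instead of dataSourceClassName; do not set both
    ds.setJdbcUrl( "jdbc:postgresql://localhost/demo" );
    ds.setUsername( "john" );
    ds.setPassword( "pwd123" );
    return ds;
}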

Spring-Boot Elasticsearch EntityMapper can not be autowired

Based on this answer and the comments, I implemented the code to retrieve the scores of an Elasticsearch query.
public class CustomizedHotelRepositoryImpl implements CustomizedHotelRepository {
private final ElasticsearchTemplate elasticsearchTemplate;
@Autowired
public CustomizedHotelRepositoryImpl(ElasticsearchTemplate elasticsearchTemplate) {
super();
this.elasticsearchTemplate = elasticsearchTemplate;
}
@Override
public Page<Hotel> findHotelsAndScoreByName(String name) {
QueryBuilder queryBuilder = QueryBuilders.boolQuery()
.should(QueryBuilders.queryStringQuery(name).lenient(true).defaultOperator(Operator.OR).field("name"));
NativeSearchQuery nativeSearchQuery = new NativeSearchQueryBuilder().withQuery(queryBuilder)
.withPageable(PageRequest.of(0, 100)).build();
DefaultEntityMapper mapper = new DefaultEntityMapper();
ResultsExtractor<Page<Hotel>> rs = new ResultsExtractor<Page<Hotel>>() {
@Override
public Page<Hotel> extract(SearchResponse response) {
ArrayList<Hotel> hotels = new ArrayList<>();
SearchHit[] hits = response.getHits().getHits();
for (SearchHit hit : hits) {
try {
Hotel hotel = mapper.mapToObject(hit.getSourceAsString(), Hotel.class);
hotel.setScore(hit.getScore());
hotels.add(hotel);
} catch (IOException e) {
e.printStackTrace();
}
}
return new PageImpl<>(hotels, PageRequest.of(0, 100), response.getHits().getTotalHits());
}
};
return elasticsearchTemplate.query(nativeSearchQuery, rs);
}
}
As you can see, I needed to create a new instance with DefaultEntityMapper mapper = new DefaultEntityMapper();, which should not be necessary, because it should be possible to @Autowire the EntityMapper. If I do so, I get an exception that there is no such bean.
Description:
Field entityMapper in com.example.elasticsearch5.es.cluster.repository.impl.CustomizedCluserRepositoryImpl required a bean of type 'org.springframework.data.elasticsearch.core.EntityMapper' that could not be found.
Action:
Consider defining a bean of type 'org.springframework.data.elasticsearch.core.EntityMapper' in your configuration.
So does anybody know whether it is possible to autowire EntityMapper directly, or does the bean need to be created manually using the @Bean annotation?
I use spring-data-elasticsearch-3.0.2.RELEASE.jar, which contains the core package.
My pom.xml:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-elasticsearch</artifactId>
</dependency>
I checked out the source code of spring-data-elasticsearch. There is no bean/component definition for EntityMapper. It seems that answer is wrong. I tested it on my project and got the same error.
Consider defining a bean of type 'org.springframework.data.elasticsearch.core.EntityMapper' in your configuration.
I couldn't find any other option except defining a @Bean.
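For reference, a minimal sketch of such a configuration, reusing the DefaultEntityMapper that the question already instantiates by hand (the class and method names are illustrative):
@Configuration
public class ElasticsearchMapperConfig {

    // expose the mapper as a bean so it can be @Autowired instead of instantiated manually
    @Bean
    public EntityMapper entityMapper() {
        return new DefaultEntityMapper();
    }
}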
