Elasticsearch client connection impossible

I am having problems with the Elasticsearch client.
For my final developer-studies project, I designed a Java application that manages several databases: MySQL, MongoDB, and Elasticsearch.
I was able to read, create, and update data from one database to another.
Everything was designed and coded against local databases, and it all worked very well.
Once all the tests passed, we decided to run it against the real databases on remote servers.
That is when the problems began, in particular the one I have been stuck on for two long days.
With TransportClient it worked locally, but against the real server it throws a NoNodeAvailableException saying that none of the configured nodes are available. I have tried a lot of things to make it work, without success.
So I decided to try connecting and querying over REST instead. Here is my code; the error I get is not very detailed: a plain ConnectException.
Code:
// Get the connection client for the ES database
public static RestHighLevelClient getConnection()
{
    // Singleton: only build the client once
    if (client == null)
    {
        try
        {
            // Set credentials
            final CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
            credentialsProvider.setCredentials(AuthScope.ANY, new UsernamePasswordCredentials("Login", "pwd"));
            // Builder
            RestClientBuilder builder = RestClient.builder(new HttpHost("ServerIP", 9200, "http"))
                .setHttpClientConfigCallback(new HttpClientConfigCallback()
                {
                    @Override
                    public HttpAsyncClientBuilder customizeHttpClient(HttpAsyncClientBuilder httpClientBuilder)
                    {
                        System.out.println("Setting taken !");
                        return httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
                    }
                });
            // Create the client and sanity-check the cluster
            client = new RestHighLevelClient(builder);
            ClusterHealthRequest healthReq = new ClusterHealthRequest("looppy-cluster");
            ClusterHealthResponse response = client.cluster().health(healthReq, RequestOptions.DEFAULT);
            int numberOfNodes = response.getNumberOfNodes();
            System.out.println("Nodes Nb : " + numberOfNodes);
        }
        catch (Exception ex)
        {
            System.out.println("Client connection problem");
            ex.printStackTrace();
        }
    }
    return client;
}
And here is the exception I get:
java.net.ConnectException
at org.elasticsearch.client.RestClient$SyncResponseListener.get(RestClient.java:943)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:227)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1256)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1231)
at org.elasticsearch.client.ClusterClient.health(ClusterClient.java:146)
at utils.ElasticSearchMng.getConnection(ElasticSearchMng.java:86)
at utils.ElasticSearchMng.getLoopiesFromES(ElasticSearchMng.java:133)
at program.Program$1.run(Program.java:96)
at java.awt.event.InvocationEvent.dispatch(Unknown Source)
at java.awt.EventQueue.dispatchEventImpl(Unknown Source)
at java.awt.EventQueue.access$500(Unknown Source)
at java.awt.EventQueue$3.run(Unknown Source)
at java.awt.EventQueue$3.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(Unknown Source)
at java.awt.EventQueue.dispatchEvent(Unknown Source)
at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.run(Unknown Source)
Caused by: java.net.ConnectException
at org.apache.http.nio.pool.RouteSpecificPool.timeout(RouteSpecificPool.java:168)
at org.apache.http.nio.pool.AbstractNIOConnPool.requestTimeout(AbstractNIOConnPool.java:561)
at org.apache.http.nio.pool.AbstractNIOConnPool$InternalSessionRequestCallback.timeout(AbstractNIOConnPool.java:822)
at org.apache.http.impl.nio.reactor.SessionRequestImpl.timeout(SessionRequestImpl.java:183)
at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processTimeouts(DefaultConnectingIOReactor.java:210)
at org.apache.http.impl.nio.reactor.DefaultConnectingIOReactor.processEvents(DefaultConnectingIOReactor.java:155)
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:348)
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:192)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64)
at java.lang.Thread.run(Unknown Source)
I'm getting desperate and I really don't understand what's happening; I would appreciate it a lot if someone can help!
Thanks a lot, guys.

The problem is finally resolved: the port was wrong. The people who manage the servers had changed it; once they told me the new port, everything worked right away ;)
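For anyone hitting the same symptom: a plain TCP reachability check, with no Elasticsearch dependency at all, points at a wrong port or a firewall immediately. This is a minimal JDK-only sketch; "ServerIP" and 9200 are placeholders taken from the question, not real values:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    /** Returns true if a TCP connection to host:port succeeds within timeoutMs. */
    public static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // Covers ConnectException (closed port), timeouts, and unknown hosts
            return false;
        }
    }

    public static void main(String[] args) {
        // "ServerIP" and 9200 are placeholders; use the actual host and port.
        System.out.println(isReachable("ServerIP", 9200, 2000)
                ? "Port is reachable"
                : "Cannot connect: wrong port, firewall, or bind address");
    }
}
```

If this prints the failure branch while the cluster is up, the client configuration is not the problem; the host, port, or network path is.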

Related

AWS serverless Spring Boot setup is failing

Here is my setup:
@SpringBootApplication
@ComponentScan(basePackages = "com.amazonaws.serverless.sample.springboot.controller")
public class Application extends SpringBootServletInitializer {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

public class LambdaHandler implements RequestStreamHandler {
    private SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;
    private static ObjectMapper mapper = new ObjectMapper();
    private Logger log = LoggerFactory.getLogger(LambdaHandler.class);

    @Override
    public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context)
            throws IOException {
        if (handler == null) {
            try {
                handler = SpringBootLambdaContainerHandler.getAwsProxyHandler(Application.class);
            } catch (ContainerInitializationException e) {
                e.printStackTrace();
                outputStream.close();
                return;
            }
        }
        AwsProxyRequest request = mapper.readValue(inputStream, AwsProxyRequest.class);
        AwsProxyResponse resp = handler.proxy(request, context);
        mapper.writeValue(outputStream, resp);
        // just in case it wasn't closed by the mapper
        outputStream.close();
    }
}
Each time I try to invoke my Lambda from the local console with the command "serverless invoke local -f hello", I get the exception below:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.serverless.InvokeBridge.invoke(InvokeBridge.java:102)
at com.serverless.InvokeBridge.<init>(InvokeBridge.java:40)
at com.serverless.InvokeBridge.main(InvokeBridge.java:153)
Caused by: java.lang.NoClassDefFoundError: javax/servlet/ServletContainerInitializer
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at com.amazonaws.serverless.proxy.spring.SpringBootLambdaContainerHandler.initialize(SpringBootLambdaContainerHandler.java:167)
at com.amazonaws.serverless.proxy.spring.SpringBootLambdaContainerHandler.getAwsProxyHandler(SpringBootLambdaContainerHandler.java:77)
at com.techprimers.serverless.services.AWSLambdaHandler.handleRequest(AWSLambdaHandler.java:172)
... 7 more
Caused by: java.lang.ClassNotFoundException: javax.servlet.ServletContainerInitializer
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 23 more
The exception is thrown at the line handler = SpringBootLambdaContainerHandler.getAwsProxyHandler(Application.class);
Please help me resolve this. I have tried various ways of setting up Spring Boot with AWS Lambda but failed every time. Please help me fix this, or help me set it up some other way. Thanks.
In your logs we can see:
NoClassDefFoundError: javax/servlet/ServletContainerInitializer
ClassNotFoundException: javax.servlet.ServletContainerInitializer
Could you please check whether the required libraries are loaded? Something must be missing or not loaded correctly.
Also, make sure a dependency like this is declared:
<dependency>
    <groupId>com.amazonaws.serverless</groupId>
    <artifactId>aws-serverless-java-container-spring</artifactId>
    <version>1.3.2</version>
</dependency>
Also double-check the steps in:
https://github.com/awslabs/aws-serverless-java-container/wiki/Quick-start---Spring-Boot
I had the same issue. I was using version 0.2 of aws-serverless-java-container-spring and had to switch to version 1.3.2, as the former didn't have the SpringBootLambdaContainerHandler class.
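When chasing a NoClassDefFoundError like this one, it can help to probe the runtime classpath directly before repackaging and redeploying. A small hypothetical diagnostic (the helper name is mine, not part of the serverless framework) that checks whether a class is loadable:

```java
public class ClasspathProbe {
    /** Returns true if the named class can be loaded on the current classpath. */
    public static boolean isOnClasspath(String className) {
        try {
            // initialize=false: just resolve the class, don't run static initializers
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The class the stack trace complains about; it comes from the servlet-api jar.
        System.out.println("javax.servlet.ServletContainerInitializer present: "
                + isOnClasspath("javax.servlet.ServletContainerInitializer"));
    }
}
```

Running this inside the same deployment artifact that fails tells you whether the servlet-api classes actually made it into the bundle.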

How to upload an image to a remote server using the SMB protocol in Java, working on both Windows and Linux?

I am trying to upload an image over the SMB protocol, but I am unable to connect to the remote server.
Here is my code:
String user = "admin";
String pass = "admin";
byte[] imageBytes = Base64.decodeBase64(resourceDTOReq.getImageData());
String path = "smb://ip/sharedfolderName/";
NtlmPasswordAuthentication auth = new NtlmPasswordAuthentication("", user, pass);
SmbFile smbFile = new SmbFile(path, auth);
// try-with-resources so the stream is closed even on failure
try (SmbFileOutputStream smbfos = new SmbFileOutputStream(smbFile)) {
    smbfos.write(imageBytes);
}
System.out.println("Upload completed!");
I am getting the following exception:
jcifs.smb.SmbException: Failed to connect: 0.0.0.0<00>/172.20.2.198
jcifs.util.transport.TransportException
java.net.ConnectException: Connection refused: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
at java.net.PlainSocketImpl.connect(Unknown Source)
at java.net.SocksSocketImpl.connect(Unknown Source)
at java.net.Socket.connect(Unknown Source)
at jcifs.smb.SmbTransport.ssn139(SmbTransport.java:196)
at jcifs.smb.SmbTransport.negotiate(SmbTransport.java:249)
at jcifs.smb.SmbTransport.doConnect(SmbTransport.java:322)
at jcifs.util.transport.Transport.run(Transport.java:241)
at java.lang.Thread.run(Unknown Source)
at jcifs.util.transport.Transport.run(Transport.java:258)
at java.lang.Thread.run(Unknown Source)
Please help me get through this. Thanks in advance!

Impala Kerberos authentication issue with long running program

I developed a Java program that runs on a Hadoop edge node (Cloudera CDH 5.9.1). It runs indefinitely, as it is a (micro)service replying to requests. It receives requests from a webapp running somewhere else, extracts data from Impala through the Impala JDBC driver, and sends the data back.
It runs well for roughly 10 hours, then becomes unable to get Impala JDBC connections. You will find the stack trace at the end of this question. In short, it throws javax.security.auth.login.LoginException: Unable to obtain Principal Name for authentication.
My understanding is that the Impala JDBC driver becomes unable to authenticate against Kerberos because the Kerberos ticket expired. So, to make sure the user the program runs as has a Kerberos ticket, I added a piece of code before the JDBC connection creation. I read these two Stack Overflow threads:
Should I call ugi.checkTGTAndReloginFromKeytab() before every action on hadoop?
HBase Kerberos connection renewal strategy
They explain the issue and the solution very well. Here is my attempt to implement it (I don't understand why it fails for Impala):
public class KerberosRelogger {
    private final static Logger logger = LogManager.getLogger();
    public static String userName;
    public static String user;
    public static String keytab;
    public static Configuration conf;
    private static UserGroupInformation ugi;

    static {
        try {
            init();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void init() throws IOException {
        userName = System.getProperty("user.name");
        user = userName + "@MY_REALM";
        keytab = System.getProperty("user.home") + "/" + userName + ".keytab";
        conf = new Configuration();
        conf.addResource(new Path("file:///etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("file:///etc/hadoop/conf/hdfs-site.xml"));
        UserGroupInformation.setConfiguration(conf);
        ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(user, keytab);
        logger.debug("userName : {}", userName);
        logger.debug("user : {}", user);
        logger.debug("keytab : {}", keytab);
    }

    public static void testHadoopConnexion() throws IOException {
        ugi.checkTGTAndReloginFromKeytab();
        FileSystem fs = FileSystem.get(conf);
        FileStatus[] statuses = fs.listStatus(new Path("/"));
        List<String> dirNames = new ArrayList<>();
        for (FileStatus status : statuses) {
            dirNames.add(status.getPath().getName());
        }
        logger.debug("HDFS root dirs : " + dirNames);
    }

    private static Connection createImpalaConnection() throws IOException {
        ugi.checkTGTAndReloginFromKeytab();
        String IMPALAD_HOST = "my_node";
        String jdbcUrl = "jdbc:impala://" + IMPALAD_HOST + ":21050;AuthMech=1;KrbRealm=MY_REALM;KrbHostFQDN=" + IMPALAD_HOST + ";KrbServiceName=impala;SSL=1";
        Properties props = new Properties();
        props.put("user", userName);
        props.put("password", "");
        try {
            Connection connection = DriverManager.getConnection(jdbcUrl, props);
            return connection;
        } catch (SQLException e) {
            throw new RuntimeException("Fail to create JDBC connection", e);
        }
    }

    private static void testImpalaConnection() throws IOException, SQLException {
        int n = 0;
        try (Connection cnx = createImpalaConnection()) {
            String sql = "show databases";
            try (PreparedStatement ps = cnx.prepareStatement(sql)) {
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        n++;
                    }
                }
            }
        }
        logger.debug("{} Impala databases", n);
    }

    public static void main(String[] args) {
        ScheduledExecutorService scheduledExecutorService = Executors.newSingleThreadScheduledExecutor();
        scheduledExecutorService.scheduleAtFixedRate(new Runnable() {
            @Override
            public void run() {
                try {
                    logger.debug("Test Hadoop connection");
                    testHadoopConnexion();
                    logger.debug("Hadoop connection - SUCCESS");
                } catch (Exception e) {
                    logger.error("Hadoop connection - FAIL", e);
                }
                System.out.println();
                try {
                    logger.debug("Test Impala connection");
                    testImpalaConnection();
                    logger.debug("Impala connection - SUCCESS");
                } catch (Exception e) {
                    logger.error("Impala connection - FAIL", e);
                }
            }
        }, 1, 10, TimeUnit.SECONDS);
    }
}
If the ticket expires, or if I run "kdestroy" on the command line, the Hadoop HDFS listing still works, but the Impala JDBC driver is unable to get a connection.
Why can Impala not authenticate while the Hadoop API calls work? If the Hadoop API works, it means the programmatic relogin worked and a "fresh" ticket is there. Am I right? Why does Impala fail on its side?
Exception stacktrace
Caused by: java.sql.SQLException: [Simba][ImpalaJDBCDriver](500168) Error creating login context using ticket cache: Unable to obtain Principal Name for authentication .
at com.cloudera.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.hivecommon.api.HiveServer2ClientFactory.createClient(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.impala.core.ImpalaJDBCConnection.establishConnection(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.core.LoginTimeoutConnection.connect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.common.AbstractDriver.connect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at java.sql.DriverManager.getConnection(DriverManager.java:664) ~[?:1.8.0_92]
at java.sql.DriverManager.getConnection(DriverManager.java:208) ~[?:1.8.0_92]
...
Caused by: com.cloudera.support.exceptions.GeneralException: [Simba][ImpalaJDBCDriver](500168) Error creating login context using ticket cache: Unable to obtain Principal Name for authentication .
at com.cloudera.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.hivecommon.api.HiveServer2ClientFactory.createClient(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.impala.core.ImpalaJDBCConnection.establishConnection(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.core.LoginTimeoutConnection.connect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.common.AbstractDriver.connect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at java.sql.DriverManager.getConnection(DriverManager.java:664) ~[?:1.8.0_92]
at java.sql.DriverManager.getConnection(DriverManager.java:208) ~[?:1.8.0_92]
...
Caused by: javax.security.auth.login.LoginException: Unable to obtain Principal Name for authentication
at com.sun.security.auth.module.Krb5LoginModule.promptForName(Krb5LoginModule.java:841) ~[?:1.8.0_92]
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:704) ~[?:1.8.0_92]
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617) ~[?:1.8.0_92]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_92]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_92]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_92]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_92]
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755) ~[?:1.8.0_92]
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195) ~[?:1.8.0_92]
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682) ~[?:1.8.0_92]
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680) ~[?:1.8.0_92]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_92]
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680) ~[?:1.8.0_92]
at javax.security.auth.login.LoginContext.login(LoginContext.java:587) ~[?:1.8.0_92]
at com.cloudera.jdbc.kerberos.Kerberos.getSubjectViaTicketCache(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.hivecommon.api.HiveServer2ClientFactory.createTransport(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.hivecommon.api.HiveServer2ClientFactory.createClient(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.impala.core.ImpalaJDBCConnection.establishConnection(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.core.LoginTimeoutConnection.connect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at com.cloudera.jdbc.common.AbstractDriver.connect(Unknown Source) ~[ImpalaJDBC41.jar:ImpalaJDBC_2.5.41.1061]
at java.sql.DriverManager.getConnection(DriverManager.java:664) ~[?:1.8.0_92]
at java.sql.DriverManager.getConnection(DriverManager.java:208) ~[?:1.8.0_92]
...

Spring Redis running a script: how to pass the expiry time value

I use this Lua script:
local lock = redis.call('get', KEYS[1])
if not lock then
    return redis.call('SETEX', KEYS[1], ARGV[1], ARGV[2])
end
return false
From my Spring Boot application I call Redis with the script:
DefaultRedisScript<Boolean> redisScript = new DefaultRedisScript<Boolean>();
redisScript.setScriptSource(new ResourceScriptSource(new ClassPathResource("checkandset2.lua")));
redisScript.setResultType(Boolean.class);
System.out.println(redisTemplate.execute(redisScript, Collections.singletonList("value123"), "10", "key123"));
I always get this exception:
java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.String
at org.springframework.data.redis.serializer.StringRedisSerializer.serialize(StringRedisSerializer.java:32)
at org.springframework.data.redis.core.script.DefaultScriptExecutor.keysAndArgs(DefaultScriptExecutor.java:116)
at org.springframework.data.redis.core.script.DefaultScriptExecutor$1.doInRedis(DefaultScriptExecutor.java:63)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:202)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:164)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:152)
at org.springframework.data.redis.core.script.DefaultScriptExecutor.execute(DefaultScriptExecutor.java:60)
at org.springframework.data.redis.core.script.DefaultScriptExecutor.execute(DefaultScriptExecutor.java:54)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:298)
at com.masary.ledger.ResisScriptTestClass.msisdnJustRechargedException(ResisScriptTestClass.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
When I use
System.out.println(redisTemplate.execute(redisScript, Collections.singletonList("value123"), new Long(10), "key123"));
I get this exception:
org.springframework.data.redis.RedisSystemException: Unknown redis exception; nested exception is java.lang.ClassCastException: [B cannot be cast to java.lang.Long
at org.springframework.data.redis.FallbackExceptionTranslationStrategy.getFallback(FallbackExceptionTranslationStrategy.java:48)
at org.springframework.data.redis.FallbackExceptionTranslationStrategy.translate(FallbackExceptionTranslationStrategy.java:38)
at org.springframework.data.redis.connection.jedis.JedisConnection.convertJedisAccessException(JedisConnection.java:212)
at org.springframework.data.redis.connection.jedis.JedisConnection.evalSha(JedisConnection.java:3173)
at org.springframework.data.redis.connection.jedis.JedisConnection.evalSha(JedisConnection.java:3158)
at org.springframework.data.redis.connection.DefaultStringRedisConnection.evalSha(DefaultStringRedisConnection.java:1374)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.springframework.data.redis.core.CloseSuppressingInvocationHandler.invoke(CloseSuppressingInvocationHandler.java:57)
at com.sun.proxy.$Proxy182.evalSha(Unknown Source)
at org.springframework.data.redis.core.script.DefaultScriptExecutor.eval(DefaultScriptExecutor.java:81)
at org.springframework.data.redis.core.script.DefaultScriptExecutor$1.doInRedis(DefaultScriptExecutor.java:71)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:202)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:164)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:152)
at org.springframework.data.redis.core.script.DefaultScriptExecutor.execute(DefaultScriptExecutor.java:60)
at org.springframework.data.redis.core.script.DefaultScriptExecutor.execute(DefaultScriptExecutor.java:54)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:298)
at com.masary.ledger.ResisScriptTestClass.msisdnJustRechargedException(ResisScriptTestClass.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
...
Caused by: java.lang.ClassCastException: [B cannot be cast to java.lang.Long
at org.springframework.data.redis.connection.jedis.JedisScriptReturnConverter.convert(JedisScriptReturnConverter.java:53)
at org.springframework.data.redis.connection.jedis.JedisConnection.evalSha(JedisConnection.java:3171)
... 46 more
Any advice on how I can pass the expiry time value to the Lua script?
I'm a newbie myself and was having a similar problem. Try sending the string "10" instead of the Long. That worked for me.
The thread might be old, but this might be helpful for others who stumble upon it.
For me, the reason was using JdkSerializationRedisSerializer as the value serializer, as in redisTemplate.setValueSerializer(new JdkSerializationRedisSerializer()); this is also the default.
Here is the fix:
private static void redisEvalForExpire(String scriptText, List<String> keys, Object[] argv) {
    // Spring serialized the data to something like "\xac\xed\x00\x05t\x00\x0", which Redis did not recognize as an integer
    redisTemplate.execute(new DefaultRedisScript<>(scriptText), new StringRedisSerializer(), new StringRedisSerializer(), keys, argv);
}
This says: use StringRedisSerializer for every value. When we do set a 2 with JdkSerializationRedisSerializer, 2 is serialized to something like:
"\xac\xed\x00\x05sr\x00\x11java.lang.Integer\x12\xe2\xa0\xa4\xf7\x81\x878\x02\x00\x01I\x00\x05valuexr\x00\x10java.lang.Number\x86\xac\x95\x1d\x0b\x94\xe0\x8b\x02\x00\x00xp\x00\x00\x00\x02"
but with StringRedisSerializer, 2 is sent as "2", which Redis can easily recognize.
Official Documentation
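To see concretely why Redis rejects the JDK-serialized value, you can compare the bytes the two serializers emit. A JDK-only sketch (JdkSerializationRedisSerializer effectively uses standard Java serialization via ObjectOutputStream; the 0xAC 0xED prefix below is the Java serialization stream magic):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.nio.charset.StandardCharsets;

public class SerializerBytes {
    /** Serializes a value with standard Java serialization, as the JDK serializer would. */
    public static byte[] jdkSerialize(Object value) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(value);
        }
        return bytes.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] jdk = jdkSerialize(2);                       // what Redis receives with the JDK serializer
        byte[] str = "2".getBytes(StandardCharsets.UTF_8);  // what it receives with StringRedisSerializer
        System.out.println("JDK serializer: " + jdk.length + " bytes, starts with "
                + String.format("0x%02X%02X", jdk[0] & 0xFF, jdk[1] & 0xFF));
        System.out.println("String serializer: " + str.length + " byte: '" + (char) str[0] + "'");
    }
}
```

The JDK form is dozens of opaque bytes, so Redis cannot treat it as an integer for SETEX; the string form is a single ASCII byte that Redis parses directly.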

HTTPS connection rejected by server

I am trying to write a standalone program that will be called by a bat file. I found something like a trust-all certificate manager:
TrustManager[] trustAllCerts = new TrustManager[]{
    new X509TrustManager() {
        public java.security.cert.X509Certificate[] getAcceptedIssuers() {
            return null;
        }
        public void checkClientTrusted(java.security.cert.X509Certificate[] certs, String authType) {
        }
        public void checkServerTrusted(java.security.cert.X509Certificate[] certs, String authType) {
        }
    }
};
// Install the all-trusting trust manager
try {
    SSLContext sc = SSLContext.getInstance("SSL");
    sc.init(null, trustAllCerts, new java.security.SecureRandom());
    HttpsURLConnection.setDefaultSSLSocketFactory(sc.getSocketFactory());
    URL url = new URL("--insert server url here--");
    //url.openStream();
    // URLConnection urlConnection = url.openConnection();
    HttpsURLConnection urlConnection = (HttpsURLConnection) url.openConnection();
} catch (MalformedURLException mfe) {
    mfe.printStackTrace();
} catch (Exception e) {
    e.printStackTrace();
}
}
It works fine in my IDE, but after I build it and run it via the bat file, it keeps giving me this error:
javax.net.ssl.SSLException: Server key
at sun.security.ssl.Handshaker.throwSSLException(Unknown Source)
at sun.security.ssl.ClientHandshaker.processMessage(Unknown Source)
at sun.security.ssl.Handshaker.processLoop(Unknown Source)
at sun.security.ssl.Handshaker.process_record(Unknown Source)
at sun.security.ssl.SSLSocketImpl.readRecord(Unknown Source)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(Unknown Source)
at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
at sun.net.www.protocol.https.HttpsClient.afterConnect(Unknown Source)
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(Unknown Source)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(Unknown Source)
at java.net.HttpURLConnection.getResponseCode(Unknown Source)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getResponseCode(Unknown Source)
at sss.Sss.main(Sss.java:65)
Caused by: java.security.NoSuchAlgorithmException: NONEwithRSA Signature not available
at java.security.Signature.getInstance(Unknown Source)
at sun.security.ssl.JsseJce.getSignature(Unknown Source)
at sun.security.ssl.RSASignature.<init>(Unknown Source)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at java.lang.Class.newInstance0(Unknown Source)
at java.lang.Class.newInstance(Unknown Source)
at java.security.Provider$Service.newInstance(Unknown Source)
at sun.security.jca.GetInstance.getInstance(Unknown Source)
at java.security.Signature.getInstance(Unknown Source)
at sun.security.ssl.JsseJce.getSignature(Unknown Source)
at sun.security.ssl.RSASignature.getInstance(Unknown Source)
at sun.security.ssl.HandshakeMessage$DH_ServerKeyExchange.<init>(Unknown Source)
... 13 more
Am I doing something wrong? Do I need a signed certificate in order to connect to the server?
I have found the answer to the problem above: you need to include sunjce_provider.jar in the lib folder of the standalone program.
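The missing-algorithm cause can be confirmed from code before touching any jars: list the installed security providers and check whether the signature algorithm named in the stack trace is available. A minimal JDK-only sketch:

```java
import java.security.NoSuchAlgorithmException;
import java.security.Provider;
import java.security.Security;
import java.security.Signature;

public class SignatureCheck {
    /** Returns true if the named signature algorithm is available from any installed provider. */
    public static boolean hasSignature(String algorithm) {
        try {
            Signature.getInstance(algorithm);
            return true;
        } catch (NoSuchAlgorithmException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // See which providers the standalone runtime actually loaded
        for (Provider p : Security.getProviders()) {
            System.out.println(p.getName());
        }
        // The handshake in the stack trace failed because this algorithm was missing
        System.out.println("NONEwithRSA available: " + hasSignature("NONEwithRSA"));
    }
}
```

Running this both from the IDE and from the bat-file build shows whether the provider list differs between the two environments, which is exactly what the missing sunjce_provider.jar causes.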
