Cannot use Jedis when in Pipeline. Please use Pipeline or reset jedis state

I am having trouble executing pipelined commands in Spring Data Redis using StringRedisTemplate, with spring-data-redis 1.6.1, Spring Boot 1.3.2, and Jedis (tried both 2.7.3 and 2.8.0).
The code:
public void saveUserActivityEvents(Event... events) {
    List<Object> results = stringRedisTemplate.executePipelined(
            new RedisCallback<Object>() {
                public Object doInRedis(RedisConnection connection) throws DataAccessException {
                    StringRedisConnection stringRedisConn = (StringRedisConnection) connection;
                    for (int i = 0; i < events.length; i++) {
                        Event event = events[i];
                        String userId = getUserId(event.getUser());
                        String eventType = event.getEventType();
                        String itemId = event.getItem();
                        Integer amount = event.getAmount() == null ? 0 : Integer.parseInt(event.getAmount());
                        Double timestamp = Double.valueOf(event.getTimestamp());
                        Map<String, String> valueMap = new HashMap<String, String>();
                        valueMap.put("itemId", itemId);
                        valueMap.put("userId", userId);
                        String userItemEventsKey = StrSubstitutor.replace(Constants.KEY_USER_ITEM_EVENTS, valueMap);
                        String userItemsKey = StrSubstitutor.replace(Constants.KEY_USER_ITEMS, valueMap);
                        stringRedisConn.zAdd(userItemsKey, timestamp, itemId);
                        stringRedisConn.hIncrBy(userItemEventsKey, eventType, amount);
                        long expireInMs = TimeoutUtils.toMillis(getExpiryTimeInDays(event.getUser()), TimeUnit.DAYS);
                        stringRedisConn.pExpire(userItemEventsKey, expireInMs);
                    }
                    return null;
                }
            });
}
It throws the exception in the subject when executing pExpire.
I've also tried the different flavour suggested in the reference guide:
execute(redisCallback, true, true)
The same result. Any idea?
Thanks
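A workaround worth trying, assuming pExpire is the one command here that is not pipeline-safe in this combination of versions: fall back to expire with second resolution, which may avoid the failing code path. A minimal sketch against the code above:

// Hedged sketch: replace the pExpire call with a second-resolution expire.
// TimeUnit.DAYS.toSeconds is standard JDK; getExpiryTimeInDays is the helper
// already used in the question.
long expireInSeconds = TimeUnit.DAYS.toSeconds(getExpiryTimeInDays(event.getUser()));
stringRedisConn.expire(userItemEventsKey, expireInSeconds);

If that also fails, upgrading spring-data-redis is the other avenue, since per-command pipeline support has improved in later releases.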

Related

Simple aggregation fails in the Java Elasticsearch 8.0+ client

I have a simple method that performs a terms aggregation using Elasticsearch 8.0.
It works with RestHighLevelClient, but with the new ElasticsearchClient I get empty buckets.
Can someone please help me resolve this?
public void aggregate(ElasticsearchClient client) throws ElasticsearchException, IOException {
    String field = "loglevel";
    Map<String, Long> buckets = new HashMap<String, Long>();
    SearchResponse<SspDevLog> response = client.search(fn -> fn
            .aggregations("loglevel", a -> a.terms(v -> v.field(field))), SspDevLog.class);
    Map<String, Aggregate> aggrs = response.aggregations();
    for (Map.Entry<String, Aggregate> entry : aggrs.entrySet()) {
        Aggregate aggregate = entry.getValue();
        StringTermsAggregate sterms = aggregate.sterms();
        Buckets<StringTermsBucket> sbuckets = sterms.buckets();
        List<StringTermsBucket> bucArr = sbuckets.array();
        for (StringTermsBucket bucObj : bucArr) {
            buckets.put(bucObj.key(), bucObj.docCount());
        }
    }
    System.out.println(buckets);
}
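There is no definitive answer here, but a common cause of empty buckets is running a terms aggregation on an analyzed text field, since terms aggregations need a keyword (or otherwise doc-values-enabled) field. A hedged sketch, where the index name is hypothetical and "loglevel" is assumed to be a text field with a keyword sub-field:

SearchResponse<SspDevLog> response = client.search(s -> s
        .index("ssp-dev-log")   // hypothetical index name; the original search targets no index
        .size(0)                // hits are not needed, only the aggregation buckets
        .aggregations("loglevel", a -> a
                .terms(t -> t.field("loglevel.keyword"))),   // keyword sub-field assumed
        SspDevLog.class);

It is also worth confirming that the RestHighLevelClient version of the call targeted the same index and field.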

Unit test in Spring Boot using Mockito

While executing the JUnit 4 test, it throws a NullPointerException, and the save method called in the unit test returns null. I am using Mockito with JUnit 4 to mock the method. Can someone help me out with this?
**Service Method.**
public Result save(Map inputParams) {
    Result result = new Result();
    logger.info("::::::::::::::: save ::::::::::::::::" + inputParams);
    try {
        String name = inputParams.get("name").toString();
        String type = inputParams.get("type").toString();
        CoreIndustry coreIndustry = coreIndustryDao.findByName(name);
        if (coreIndustry != null) {
            result.setStatusCode(HttpStatus.FOUND.value());
            result.setMessage(Messages.NAME_EXIST_MESSAGE);
            result.setSuccess(false);
        } else {
            CoreIndustry coreIndustryNew = new CoreIndustry();
            coreIndustryNew.setName(name);
            coreIndustryNew.setType(type);
            coreIndustryNew.setInfo(new Gson().toJson(inputParams.get("info")));
            System.out.println("CoreIndustry Info is: " + coreIndustryNew.getInfo());
            CoreIndustry coreIndustryData = coreIndustryDao.save(coreIndustryNew);
            System.out.println("Saved data is: " + coreIndustryData.getName() + " " + coreIndustryData.getType());
            result.setData(coreIndustryData);
            result.setStatusCode(HttpStatus.OK.value());
            result.setMessage(Messages.CREATE_MESSAGE);
            result.setSuccess(true);
        }
    } catch (Exception e) {
        logger.error("::::::::::::::: Exception ::::::::::::::::" + e.getMessage());
        result.setStatusCode(HttpStatus.INTERNAL_SERVER_ERROR.value());
        result.setSuccess(false);
        result.setMessage(e.getMessage());
    }
    return result;
}
**Controller**
@PostMapping(path = "/industry/save")
public Result save(@RequestBody Map<String, Object> stringToParse) {
    logger.debug("save---------------" + stringToParse);
    Result result = industryService.save(stringToParse);
    return result;
}
**Unit Test**
@RunWith(SpringRunner.class)
@SpringBootTest
public class IndustryServiceTest {
    @MockBean
    private CoreIndustryDao coreIndustryDao;
    private IndustryService industryService;

    @Test
    public void getAll() {
        System.out.println("::::::: Inside of GetAll Method of Controller.");
        // when(coreIndustryDao.findAll()).thenReturn(Stream.of(
        //         new CoreIndustry("Dilip", "Brik", "Brik Industry"))
        //         .collect(Collectors.toList()));
        // assertEquals(1, industryService.getAll().setData());
    }

    @Test
    public void save() {
        ObjectMapper oMapper = new ObjectMapper();
        CoreIndustry coreIndustry = new CoreIndustry();
        coreIndustry.setId(2L);
        coreIndustry.setName("Dilip");
        coreIndustry.setType("Business");
        HashMap<String, Object> map = new HashMap();
        map.put("name", "Retail");
        map.put("type", "Development");
        coreIndustry.setInfo(new Gson().toJson(map));
        when(coreIndustryDao.save(any(CoreIndustry.class))).thenReturn(new CoreIndustry());
        Map<String, Object> actualValues = oMapper.convertValue(coreIndustry, Map.class);
        System.out.println("CoreIndustry field values are: " + coreIndustry.getName() + " " + coreIndustry.getInfo());
        Result created = industryService.save(actualValues);
        CoreIndustry coreIndustryValue = (CoreIndustry) created.getData();
        Map<String, Object> expectedValues = oMapper.convertValue(coreIndustryValue, Map.class);
        System.out.println("Getting saved data from CoreIndustry: " + expectedValues);
        System.out.println("Getting saved data from CoreIndustry: " + coreIndustryValue.getName());
        assertThat(actualValues).isSameAs(expectedValues);
    }
}
I am new to Spring Boot. I have run and debugged the save method without finding the cause.
It would be great if someone could help me out. Thank you.
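A hedged sketch of the likely fix: the industryService field in the test is never initialized (there is no @Autowired on it), so industryService.save(...) throws a NullPointerException, and the stub thenReturn(new CoreIndustry()) hands back an empty entity rather than the one being saved. Something along these lines, assuming IndustryService is a bean in the test context and Mockito 2+ is on the classpath:

@RunWith(SpringRunner.class)
@SpringBootTest
public class IndustryServiceTest {

    @MockBean
    private CoreIndustryDao coreIndustryDao;

    @Autowired                       // was missing, leaving the field null
    private IndustryService industryService;

    @Test
    public void save() {
        // no existing industry with this name, so the service takes the save branch
        when(coreIndustryDao.findByName(anyString())).thenReturn(null);
        // echo back whatever entity the service tries to persist (Mockito 2+)
        when(coreIndustryDao.save(any(CoreIndustry.class)))
                .thenAnswer(invocation -> invocation.getArgument(0));
        // ... build the input map and assert on the returned Result as before
    }
}

Note also that assertThat(actualValues).isSameAs(expectedValues) compares object identity and would fail even for equal maps; isEqualTo(...) is probably what was intended.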

Can I modify the consumer auto.offset.reset to latest for Kafka Streams?

Working with Kafka 0.10.1.0, I used this config:
val props = new Properties
props.put(StreamsConfig.APPLICATION_ID_CONFIG, applicationId)
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, broker)
props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String.getClass)
props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.Integer.getClass)
props.put(StreamsConfig.consumerPrefix(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG), "latest")
but the line props.put(StreamsConfig.consumerPrefix(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG), "latest")
does not take effect. What is the reason?
I read the code of org.apache.kafka.streams.StreamsConfig, which contains the following:
private static final Map<String, Object> CONSUMER_DEFAULT_OVERRIDES;
static {
    Map<String, Object> tempConsumerDefaultOverrides = new HashMap<>();
    tempConsumerDefaultOverrides.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "1000");
    tempConsumerDefaultOverrides.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    tempConsumerDefaultOverrides.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
    CONSUMER_DEFAULT_OVERRIDES = Collections.unmodifiableMap(tempConsumerDefaultOverrides);
}

public Map<String, Object> getConsumerConfigs(StreamThread streamThread, String groupId, String clientId) throws ConfigException {
    final Map<String, Object> consumerProps = getClientPropsWithPrefix(CONSUMER_PREFIX, ConsumerConfig.configNames());
    // disable auto commit and throw exception if there is user overridden values,
    // this is necessary for streams commit semantics
    if (consumerProps.containsKey(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG)) {
        throw new ConfigException("Unexpected user-specified consumer config " + ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG
                + ", as the streams client will always turn off auto committing.");
    }
    consumerProps.putAll(CONSUMER_DEFAULT_OVERRIDES);
    // bootstrap.servers should be from StreamsConfig
    consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, this.originals().get(BOOTSTRAP_SERVERS_CONFIG));
    // add client id with stream client id prefix, and group id
    consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
    consumerProps.put(CommonClientConfigs.CLIENT_ID_CONFIG, clientId + "-consumer");
    // add configs required for stream partition assignor
    consumerProps.put(StreamsConfig.InternalConfig.STREAM_THREAD_INSTANCE, streamThread);
    consumerProps.put(StreamsConfig.REPLICATION_FACTOR_CONFIG, getInt(REPLICATION_FACTOR_CONFIG));
    consumerProps.put(StreamsConfig.NUM_STANDBY_REPLICAS_CONFIG, getInt(NUM_STANDBY_REPLICAS_CONFIG));
    consumerProps.put(ConsumerConfig.PARTITION_ASSIGNMENT_STRATEGY_CONFIG, StreamPartitionAssignor.class.getName());
    consumerProps.put(StreamsConfig.WINDOW_STORE_CHANGE_LOG_ADDITIONAL_RETENTION_MS_CONFIG, getLong(WINDOW_STORE_CHANGE_LOG_ADDITIONAL_RETENTION_MS_CONFIG));
    if (!getString(ZOOKEEPER_CONNECT_CONFIG).equals("")) {
        consumerProps.put(StreamsConfig.ZOOKEEPER_CONNECT_CONFIG, getString(ZOOKEEPER_CONNECT_CONFIG));
    }
    consumerProps.put(APPLICATION_SERVER_CONFIG, getString(APPLICATION_SERVER_CONFIG));
    return consumerProps;
}
Will CONSUMER_DEFAULT_OVERRIDES override the config I set?
Yes. This is a bug in 0.10.1.0, which was fixed in 0.10.1.1 and beyond:
https://issues.apache.org/jira/browse/KAFKA-4361
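For context, the failure mode is visible in the source above: getConsumerConfigs() first collects the user's prefixed properties and then calls consumerProps.putAll(CONSUMER_DEFAULT_OVERRIDES), so the default "earliest" silently replaces the user's "latest". A minimal, self-contained Java illustration of the ordering problem:

import java.util.HashMap;
import java.util.Map;

public class OverrideOrderDemo {
    public static void main(String[] args) {
        // stand-in for StreamsConfig.CONSUMER_DEFAULT_OVERRIDES
        Map<String, Object> defaults = new HashMap<>();
        defaults.put("auto.offset.reset", "earliest");

        Map<String, Object> consumerProps = new HashMap<>();
        consumerProps.put("auto.offset.reset", "latest"); // the user's prefixed value
        consumerProps.putAll(defaults);                   // applied last in 0.10.1.0

        // prints "earliest": the user's value was clobbered
        System.out.println(consumerProps.get("auto.offset.reset"));
    }
}

The KAFKA-4361 fix applies the defaults first and the user-supplied overrides afterwards, so the prefixed setting survives.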

CloseableHttpClient.execute freezes once every few weeks despite timeouts

We have a Groovy singleton that uses PoolingHttpClientConnectionManager (httpclient 4.3.6) with a pool size of 200 to handle very high concurrent connections to a search service and process the XML response.
Despite having specified timeouts, it freezes about once a month but runs perfectly fine the rest of the time.
The Groovy singleton is below. The method retrieveInputFromURL seems to block on client.execute(get):
@Singleton(strict=false)
class StreamManagerUtil {

    // Instantiate once and cache for lifetime of Singleton class
    private static PoolingHttpClientConnectionManager connManager = new PoolingHttpClientConnectionManager();
    private static CloseableHttpClient client;
    private static final IdleConnectionMonitorThread staleMonitor = new IdleConnectionMonitorThread(connManager);
    private int warningLimit;
    private int readTimeout;
    private int connectionTimeout;
    private int connectionFetchTimeout;
    private int poolSize;
    private int routeSize;
    PropertyManager propertyManager = PropertyManagerFactory.getInstance().getPropertyManager("sebe.properties")

    StreamManagerUtil() {
        // Initialize all instance variables in singleton from properties file
        readTimeout = 6
        connectionTimeout = 6
        connectionFetchTimeout = 6
        // Pooling
        poolSize = 200
        routeSize = 50
        // Connection pool size and number of routes to cache
        connManager.setMaxTotal(poolSize);
        connManager.setDefaultMaxPerRoute(routeSize);
        // ConnectTimeout : time to establish connection with GSA
        // ConnectionRequestTimeout : time to get connection from pool
        // SocketTimeout : waiting for packets from GSA
        RequestConfig config = RequestConfig.custom()
                .setConnectTimeout(connectionTimeout * 1000)
                .setConnectionRequestTimeout(connectionFetchTimeout * 1000)
                .setSocketTimeout(readTimeout * 1000).build();
        // Keep alive for 5 seconds if server does not have keep alive header
        ConnectionKeepAliveStrategy myStrategy = new ConnectionKeepAliveStrategy() {
            @Override
            public long getKeepAliveDuration(HttpResponse response, HttpContext context) {
                HeaderElementIterator it = new BasicHeaderElementIterator(
                        response.headerIterator(HTTP.CONN_KEEP_ALIVE));
                while (it.hasNext()) {
                    HeaderElement he = it.nextElement();
                    String param = he.getName();
                    String value = he.getValue();
                    if (value != null && param.equalsIgnoreCase("timeout")) {
                        return Long.parseLong(value) * 1000;
                    }
                }
                return 5 * 1000;
            }
        };
        // Close all connections older than 5 seconds. Run as separate thread.
        staleMonitor.start();
        staleMonitor.join(1000);
        client = HttpClients.custom()
                .setDefaultRequestConfig(config)
                .setKeepAliveStrategy(myStrategy)
                .setConnectionManager(connManager)
                .build();
    }

    private retrieveInputFromURL(String categoryUrl, String xForwFor, boolean isXml) throws Exception {
        URL url = new URL(categoryUrl)
        GPathResult searchResponse = null
        InputStream inputStream = null
        HttpResponse response;
        HttpGet get;
        try {
            long startTime = System.nanoTime();
            get = new HttpGet(categoryUrl);
            response = client.execute(get);
            int resCode = response.getStatusLine().getStatusCode();
            if (xForwFor != null) {
                get.setHeader("X-Forwarded-For", xForwFor)
            }
            if (resCode == HttpStatus.SC_OK) {
                if (isXml) {
                    extractXmlString(response)
                } else {
                    StringBuffer buffer = buildStringFromResponse(response)
                    return buffer.toString();
                }
            }
        } catch (Exception e) {
            throw e;
        } finally {
            // Release connection back to pool
            if (response != null) {
                EntityUtils.consume(response.getEntity());
            }
        }
    }

    private extractXmlString(HttpResponse response) {
        InputStream inputStream = response.getEntity().getContent()
        XmlSlurper slurper = new XmlSlurper()
        slurper.setFeature("http://xml.org/sax/features/validation", false)
        slurper.setFeature("http://apache.org/xml/features/disallow-doctype-decl", false)
        slurper.setFeature("http://apache.org/xml/features/nonvalidating/load-dtd-grammar", false)
        slurper.setFeature("http://apache.org/xml/features/nonvalidating/load-external-dtd", false)
        return slurper.parse(inputStream)
    }

    private StringBuffer buildStringFromResponse(HttpResponse response) {
        StringBuffer buffer = new StringBuffer();
        BufferedReader rd = new BufferedReader(new InputStreamReader(response.getEntity().getContent()));
        String line = "";
        while ((line = rd.readLine()) != null) {
            buffer.append(line);
            System.out.println(line);
        }
        return buffer
    }

    public class IdleConnectionMonitorThread extends Thread {
        private final HttpClientConnectionManager connMgr;
        private volatile boolean shutdown;

        public IdleConnectionMonitorThread(PoolingHttpClientConnectionManager connMgr) {
            super();
            this.connMgr = connMgr;
        }

        @Override
        public void run() {
            try {
                while (!shutdown) {
                    synchronized (this) {
                        wait(5000);
                        connMgr.closeExpiredConnections();
                        connMgr.closeIdleConnections(10, TimeUnit.SECONDS);
                    }
                }
            } catch (InterruptedException ex) {
                // Ignore
            }
        }

        public void shutdown() {
            shutdown = true;
            synchronized (this) {
                notifyAll();
            }
        }
    }
}
I also found this in the log, leading me to believe it happened while waiting for response data:
java.net.SocketTimeoutException: Read timed out
    at java.net.SocketInputStream.socketRead0(Native Method)
    at java.net.SocketInputStream.read(SocketInputStream.java:150)
    at java.net.SocketInputStream.read(SocketInputStream.java:121)
    at sun.security.ssl.InputRecord.readFully(InputRecord.java:465)
Findings thus far:
We are using Java 1.8u25. There is an open JDK issue describing a similar scenario:
https://bugs.openjdk.java.net/browse/JDK-8075484
HttpClient had a similar report, https://issues.apache.org/jira/browse/HTTPCLIENT-1589, but that was fixed in the 4.3.6 version we are using.
Questions
Can this be a synchronisation issue? From my understanding, even though the singleton is accessed by multiple threads, the only shared data is the cached CloseableHttpClient.
Is there anything else fundamentally wrong with this code or approach that may be causing this behaviour?
I do not see anything obviously wrong with your code. I would strongly recommend setting the SO_TIMEOUT parameter on the connection manager, though, to make sure it applies to all new sockets at creation time, not at the time of request execution.
It would also help to know what exactly 'freezing' means. Are worker threads getting blocked waiting to acquire connections from the pool, or waiting for response data?
Please also note that worker threads can appear 'frozen' if the server keeps sending bits of chunk-coded data. As usual, a wire / context log of the client session would help a lot:
http://hc.apache.org/httpcomponents-client-4.3.x/logging.html
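For reference, a minimal sketch of the recommendation above, assuming HttpClient 4.3.x: configuring the socket timeout at the connection-manager level via SocketConfig makes it apply when sockets are created, not only when a request executes.

import org.apache.http.config.SocketConfig;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;

// Apply SO_TIMEOUT to every socket the pool creates; 6 seconds matches the
// readTimeout already used in the question's RequestConfig.
PoolingHttpClientConnectionManager connManager = new PoolingHttpClientConnectionManager();
connManager.setDefaultSocketConfig(SocketConfig.custom()
        .setSoTimeout(6 * 1000)
        .build());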

Spring ArrayIndexOutOfBoundsException with SimpleJdbcCall

This is the Oracle procedure I'm trying to call:
PROCEDURE GetCoreReportExtras
( pnAssignment IN NUMBER,
pnUserRole in NUMBER,
psAreaMenu in VARCHAR2,
pnAreaLevel in NUMBER,
curReportList OUT outcur,
psLDO in VARCHAR2 default 'none',
pnAcisNumber in NUMBER default 0);
Here's my Java code:
private class GetStandardReportExtrasSPV2 {
    int nAreaLevel;
    int nAssignment;
    int nUserRole;
    int nAcisNum = 0;
    String strAreaMenu;
    String strLDO = null;
    private SimpleJdbcTemplate simpleJdbcTemplate;
    private SimpleJdbcCall procGetReportExtras;

    public GetStandardReportExtrasSPV2(DataSource ds, int nUserRole, String strAreaMenu,
            int nAssignment, int nAreaLevel, String strLDO, int nAcisNum) {
        this.simpleJdbcTemplate = new SimpleJdbcTemplate(ds);
        JdbcTemplate jdbcTemplate = new JdbcTemplate(ds);
        jdbcTemplate.setResultsMapCaseInsensitive(true);
        this.procGetReportExtras =
                new SimpleJdbcCall(jdbcTemplate)
                        .withCatalogName("package")
                        .withProcedureName("proc")
                        .returningResultSet(REPORT_LIST,
                                ParameterizedBeanPropertyRowMapper.newInstance(Report.class));
    }

    public List<Report> getReportsList() {
        Map<String, Object> params = new HashMap<String, Object>();
        params.put(USER_ASSIGNMENT, nAssignment);
        params.put(USER_ROLE, nUserRole);
        params.put(AREA_MENU, strAreaMenu);
        params.put(AREA_LEVEL, nAreaLevel);
        params.put(SEGMENT, strLDO);
        params.put(ACIS_NUMBER, nAcisNum);
        SqlParameterSource in = new MapSqlParameterSource().addValues(params);
        Map m = procGetReportExtras.execute(new HashMap<String, Object>(0), in);
        return (List) m.get(REPORT_LIST);
    }
}
When getReportsList() calls execute(), I get the following exception:
java.lang.ArrayIndexOutOfBoundsException: 2
    at org.springframework.jdbc.core.metadata.CallMetaDataContext.matchInParameterValuesWithCallParameters(CallMetaDataContext.java:555)
    at org.springframework.jdbc.core.simple.AbstractJdbcCall.matchInParameterValuesWithCallParameters(AbstractJdbcCall.java:419)
    at org.springframework.jdbc.core.simple.AbstractJdbcCall.doExecute(AbstractJdbcCall.java:364)
    at org.springframework.jdbc.core.simple.SimpleJdbcCall.execute(SimpleJdbcCall.java:173)
Any hints as to what I'm doing wrong?
The issue is here:
Map m = procGetReportExtras.execute(new HashMap<String, Object>(0), in);
You are calling the overloaded method that takes Object.... Per its documentation, this method takes an
"optional array containing the in parameter values to be used in the call. Parameter values must be provided in the same order as the parameters are defined for the stored procedure."
In your call, those two array elements happen to be an empty HashMap and a SqlParameterSource.
The call fails in CallMetaDataContext#matchInParameterValuesWithCallParameters(). Source:
public Map<String, ?> matchInParameterValuesWithCallParameters(Object[] parameterValues) {
    Map<String, Object> matchedParameters = new HashMap<String, Object>(parameterValues.length);
    int i = 0;
    for (SqlParameter parameter : this.callParameters) {
        if (parameter.isInputValueProvided()) {
            String parameterName = parameter.getName();
            matchedParameters.put(parameterName, parameterValues[i++]); // fails here
        }
    }
    return matchedParameters;
}
It expects the array you pass to contain 7 elements (because that is what your stored procedure declares), but it only has 2: the HashMap and the SqlParameterSource. When it tries to access the 3rd one (index 2), it throws an ArrayIndexOutOfBoundsException.
You probably wanted to use the execute() method that accepts a SqlParameterSource:
Map m = procGetReportExtras.execute(in);
In other words, you are calling the wrong overload, SimpleJdbcCall.execute(Object... args); use SimpleJdbcCall.execute(SqlParameterSource):
procGetReportExtras.execute(in);
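If metadata lookup keeps causing trouble (for example, because of the defaulted psLDO and pnAcisNumber parameters), a hedged alternative is to declare the parameters explicitly and skip column metadata access. This is a sketch, not the asker's code: OracleTypes.CURSOR comes from the Oracle JDBC driver, Types from java.sql, and the declared names must match the procedure's parameter names, which means the keys in the SqlParameterSource must use them too.

SimpleJdbcCall call = new SimpleJdbcCall(jdbcTemplate)
        .withCatalogName("package")
        .withProcedureName("proc")
        .withoutProcedureColumnMetaDataAccess()
        .declareParameters(
                new SqlParameter("pnAssignment", Types.NUMERIC),
                new SqlParameter("pnUserRole", Types.NUMERIC),
                new SqlParameter("psAreaMenu", Types.VARCHAR),
                new SqlParameter("pnAreaLevel", Types.NUMERIC),
                new SqlOutParameter("curReportList", OracleTypes.CURSOR,
                        ParameterizedBeanPropertyRowMapper.newInstance(Report.class)),
                new SqlParameter("psLDO", Types.VARCHAR),
                new SqlParameter("pnAcisNumber", Types.NUMERIC));
Map<String, Object> m = call.execute(in);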
