I use RestTemplate for a GET request with a request parameter as follows:
final String Url = "http://localhost:8080/player/{nickname}";
final Map<String, String> nickname = new HashMap<String, String>();
nickname.put("nickname", score.getPlayer());
String Uri = "http://localhost:8081/game/{code}";
final Map<String, String> code = new HashMap<String, String>();
code.put("code", score.getGamecode());
Player player = restTemplate.getForObject(Url, Player.class, nickname);
Game game = restTemplate.getForObject(Uri, Game.class, code);
but I get a "Connection refused" error in the console, and the following error in Postman:
"I/O error on GET request for \"http://localhost:8080/player/sahar1\": Connection refused: connect; nested exception is java.net.ConnectException: Connection refused: connect"
Could you help me send a GET request with a request parameter?
the POST method:
@PostMapping
public ResponseEntity<?> createScore(@RequestBody @JsonView(Views.class) @Valid Score score) {
final String Url = "http://localhost:8080/player?nickname={nickname}";
final Map<String, String> nickname = new HashMap<String, String>();
nickname.put("nickname", score.getPlayer());
String Uri = "http://localhost:8081/game?code={code}";
final Map<String, String> code = new HashMap<String, String>();
code.put("code", score.getGamecode());
Player player = restTemplate.getForObject("http://localhost:8080/player/" + nickname, Player.class);
Game game = restTemplate.getForObject(Uri, Game.class, code);
if ((repo.findByScoreid(score.getScoreid())) == null) {
if((player!= null) && (game!=null)) {
HashMap<String, Object> map = new HashMap<String, Object>();
map.put("score", score.getScore());
map.put("date", score.getDate());
m = new ArrayList<>(); // was null here, which threw a NullPointerException on the next line
m.add(map);
score.setHistory(m);
repo.save(score);
return ResponseEntity.status(201).body("Created!");
}
else return ResponseEntity.status(400).body("Bad Request!");
}
else
return ResponseEntity.status(409).body("Conflict!");
}
}
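For reference, a minimal sketch of both parameter styles with RestTemplate (host, port, and types taken from the question; this assumes the target services are actually running, since "Connection refused" means nothing is listening on that port):
// Path-variable style: {nickname} is expanded from the map
String playerUrl = "http://localhost:8080/player/{nickname}";
Map<String, String> playerVars = Collections.singletonMap("nickname", score.getPlayer());
Player player = restTemplate.getForObject(playerUrl, Player.class, playerVars);
// Request-parameter style: ?code={code} is expanded the same way
String gameUrl = "http://localhost:8081/game?code={code}";
Game game = restTemplate.getForObject(gameUrl, Game.class,
        Collections.singletonMap("code", score.getGamecode()));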
I have a SearchRequest object with all the Elasticsearch (ES) query data set. I cannot use the RestHighLevelClient for my use case because it requires the endpoint to be passed at instantiation time, and I get the ES endpoint dynamically based on some condition. One way is to always create a new RestHighLevelClient, which would be inefficient. The other way is to create a static CloseableHttpClient on service start and make an HttpPost request with the dynamic endpoint. I want to take the latter approach but don't know how to convert the SearchRequest object into a JSON query string.
Any code reference/snippet would be very helpful.
private final CloseableHttpClient client;
public GenericElasticSearchResponse search(@Nonnull final SearchRequest searchRequest,
        @Nonnull final RoutingConfig route) {
final URIBuilder builder = new URIBuilder()
.setScheme(route.getScheme())
.setHost(route.getESEndpoint())
.setPort(Optional.ofNullable(route.getPort())
.orElse(80))
.setPath("/sessions*/_search");
final URI uri = builder.build();
final ContentType contentType = ContentType.create("application/json", "UTF-8");
final HttpPost httpPost = new HttpPost(uri);
httpPost.setEntity(entity); // `entity` is exactly the part I don't know how to build from searchRequest
final CloseableHttpResponse response = client.execute(httpPost);
final String responseEntity;
try (final Reader reader = new InputStreamReader(response.getEntity().getContent(), Charsets.UTF_8)) {
responseEntity = CharStreams.toString(reader);
}
final SearchResponse searchResponse = objectMapper.readValue(responseEntity, SearchResponse.class);
return new ElasticSearchResponse(searchResponse);
}
I found that searchRequest.source().toString() actually returns the JSON form of the SearchRequest. Following is the complete code snippet for hitting ES via the Apache client:
final EndpointConfig endpoint = route.getEndpoint();
final URIBuilder builder = new URIBuilder()
.setScheme(endpoint.getScheme())
.setHost(endpoint.getHost())
.setPort(Optional.ofNullable(endpoint.getPort())
.orElse(HTTPS_PORT))
.setPath(Optional.ofNullable(endpoint.getQueryPath())
.orElse(StringUtils.EMPTY));
final URI uri = builder.build();
final ContentType contentType = ContentType.create("application/json", "UTF-8");
final String queryString = searchRequest.source().toString();
final StringEntity entity = new StringEntity(queryString, contentType);
final HttpPost httpPost = new HttpPost(uri);
httpPost.setEntity(entity);
final CloseableHttpResponse response = sendRequest(httpPost);
final String responseEntity;
try (final Reader reader = new InputStreamReader(response.getEntity().getContent(), Charsets.UTF_8)) {
responseEntity = CharStreams.toString(reader);
}
log.info("ElasticSearchClient response: Code: {}, Entity {}", response.getCode(), responseEntity);
SearchResponse searchResponse = null;
if (Objects.nonNull(responseEntity)) {
searchResponse = parseResponse(responseEntity, searchRequest, response.getCode());
log.info("ElasticSearchClient searchResponse- {} ", searchResponse);
}
return new ElasticSearchResponse(searchResponse);
} catch (final URISyntaxException e) {
throw new IllegalStateException(
String.format("Invalid URI. host: %s", route.getEndpoint()), e);
} catch (final IOException e) {
throw new IllegalStateException("ElasticSearch Request failed.", e);
}
private SearchResponse parseResponse(#Nonnull final String responseEntity,
#Nonnull final SearchRequest searchRequest,
final int responseCode) {
if (responseCode >= 400 || responseCode < 200) {
log.info("ES error response - {} ", responseEntity);
final ESErrorResponse response = GSON.fromJson(responseEntity, ESErrorResponse.class);
throw new IllegalStateException("ES request failed with status " + responseCode + ": " + response);
}
SearchResponse searchResponse = null;
final NamedXContentRegistry registry = new NamedXContentRegistry(getDefaultNamedXContents());
final XContentParser parser;
try {
parser = JsonXContent.jsonXContent.createParser(registry,
DeprecationHandler.THROW_UNSUPPORTED_OPERATION, responseEntity);
searchResponse = SearchResponse.fromXContent(parser);
} catch (IOException e) {
throw new IllegalStateException("Error while parsing response ", e);
}
return searchResponse;
}
public static List<NamedXContentRegistry.Entry> getDefaultNamedXContents() {
final Map<String, ContextParser<Object, ? extends Aggregation>> map = new HashMap<>();
map.put(TopHitsAggregationBuilder.NAME, (p, c) -> ParsedTopHits.fromXContent(p, (String) c));
map.put(StringTerms.NAME, (p, c) -> ParsedStringTerms.fromXContent(p, (String) c));
return map.entrySet().stream()
.map(entry -> new NamedXContentRegistry.Entry(Aggregation.class, new ParseField(entry.getKey()), entry.getValue()))
.collect(Collectors.toList());
}
private CloseableHttpResponse sendRequest(final HttpPost httpPost) throws IOException {
return client.execute(httpPost);
}
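To illustrate the key line, a minimal sketch (the index name and query here are made up for the example):
// Build a request the usual way...
SearchRequest searchRequest = new SearchRequest("sessions-2020");
searchRequest.source(new SearchSourceBuilder()
        .query(QueryBuilders.termQuery("sessionId", "abc-123")));
// ...then source().toString() is the JSON body to POST to /_search
String queryJson = searchRequest.source().toString();
StringEntity entity = new StringEntity(queryJson, ContentType.create("application/json", "UTF-8"));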
There seems to be an issue when I use AggregatingReplyingKafkaTemplate with template.setReturnPartialOnTimeout(true): it returns a timeout exception even if partial results are available from consumers.
In the example below, I have 3 consumers replying to the request topic and I've set the reply timeout to 10 seconds. I've explicitly delayed the response of Consumer 3 to 11 seconds, so I expect the responses back from Consumers 1 and 2 and to return partial results. However, I am getting a KafkaReplyTimeoutException. Appreciate your inputs. Thanks.
I followed the code based on the unit test below.
[ReplyingKafkaTemplateTests][1]
I've provided the actual code below:
@RestController
public class SumController {
@Value("${kafka.bootstrap-servers}")
private String bootstrapServers;
public static final String D_REPLY = "dReply";
public static final String D_REQUEST = "dRequest";
@ResponseBody
@PostMapping(value="/sum")
public String sum(@RequestParam("message") String message) throws InterruptedException, ExecutionException {
AggregatingReplyingKafkaTemplate<Integer, String, String> template = aggregatingTemplate(
new TopicPartitionOffset(D_REPLY, 0), 3, new AtomicInteger());
String resultValue ="";
String currentValue ="";
try {
template.setDefaultReplyTimeout(Duration.ofSeconds(10));
template.setReturnPartialOnTimeout(true);
ProducerRecord<Integer, String> record = new ProducerRecord<>(D_REQUEST, null, null, null, message);
RequestReplyFuture<Integer, String, Collection<ConsumerRecord<Integer, String>>> future =
template.sendAndReceive(record);
future.getSendFuture().get(5, TimeUnit.SECONDS); // send ok
System.out.println("Send Completed Successfully");
ConsumerRecord<Integer, Collection<ConsumerRecord<Integer, String>>> consumerRecord = future.get(10, TimeUnit.SECONDS);
System.out.println("Consumer record size "+consumerRecord.value().size());
Iterator<ConsumerRecord<Integer, String>> iterator = consumerRecord.value().iterator();
while (iterator.hasNext()) {
currentValue = iterator.next().value();
System.out.println("response " + currentValue);
System.out.println("Record header " + consumerRecord.headers().toString());
resultValue = resultValue + currentValue + "\r\n";
}
} catch (Exception e) {
System.out.println("Error Message is "+e.getMessage());
}
return resultValue;
}
public AggregatingReplyingKafkaTemplate<Integer, String, String> aggregatingTemplate(
TopicPartitionOffset topic, int releaseSize, AtomicInteger releaseCount) {
//Create Container Properties
ContainerProperties containerProperties = new ContainerProperties(topic);
containerProperties.setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
//Set the consumer Config
//Create Consumer Factory with Consumer Config
DefaultKafkaConsumerFactory<Integer, Collection<ConsumerRecord<Integer, String>>> cf =
new DefaultKafkaConsumerFactory<>(consumerConfigs());
//Create Listener Container with Consumer Factory and Container Property
KafkaMessageListenerContainer<Integer, Collection<ConsumerRecord<Integer, String>>> container =
new KafkaMessageListenerContainer<>(cf, containerProperties);
// container.setBeanName(this.testName);
AggregatingReplyingKafkaTemplate<Integer, String, String> template =
new AggregatingReplyingKafkaTemplate<>(new DefaultKafkaProducerFactory<>(producerConfigs()), container,
(list, timeout) -> {
releaseCount.incrementAndGet();
return list.size() == releaseSize;
});
template.setSharedReplyTopic(true);
template.start();
return template;
}
public Map<String, Object> consumerConfigs() {
Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,bootstrapServers);
props.put(ConsumerConfig.GROUP_ID_CONFIG, "test_id");
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringDeserializer.class);
return props;
}
public Map<String, Object> producerConfigs() {
Map<String, Object> props = new HashMap<>();
// list of host:port pairs used for establishing the initial connections to the Kafka cluster
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
bootstrapServers);
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
org.apache.kafka.common.serialization.StringSerializer.class);
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, org.apache.kafka.common.serialization.StringSerializer.class);
return props;
}
public ProducerFactory<Integer,String> producerFactory() {
return new DefaultKafkaProducerFactory<>(producerConfigs());
}
#KafkaListener(id = "def1", topics = { D_REQUEST}, groupId = "D_REQUEST1")
#SendTo // default REPLY_TOPIC header
public String dListener1(String in) throws InterruptedException {
return "First Consumer : "+ in.toUpperCase();
}
#KafkaListener(id = "def2", topics = { D_REQUEST}, groupId = "D_REQUEST2")
#SendTo // default REPLY_TOPIC header
public String dListener2(String in) throws InterruptedException {
return "Second Consumer : "+ in.toLowerCase();
}
#KafkaListener(id = "def3", topics = { D_REQUEST}, groupId = "D_REQUEST3")
#SendTo // default REPLY_TOPIC header
public String dListener3(String in) throws InterruptedException {
Thread.sleep(11000);
return "Third Consumer : "+ in;
}
}
[1]: https://github.com/spring-projects/spring-kafka/blob/master/spring-kafka/src/test/java/org/springframework/kafka/requestreply/ReplyingKafkaTemplateTests.java
template.setReturnPartialOnTimeout(true) simply means the template will consult the release strategy on timeout (with the timeout argument = true, to tell the strategy it's a timeout rather than a delivery call).
It must return true to release the partial result.
This is to allow you to look at (and possibly modify) the list to decide whether you want to release or discard.
Your strategy ignores the timeout parameter:
(list, timeout) -> {
releaseCount.incrementAndGet();
return list.size() == releaseSize;
});
You need return timeout ? true : { ... } — that is, release unconditionally when the timeout flag is set.
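Applied to the strategy above, a minimal sketch:
// Release whatever replies have arrived when the timeout fires,
// otherwise keep waiting for the full releaseSize replies
(list, timeout) -> {
    releaseCount.incrementAndGet();
    return timeout || list.size() == releaseSize;
}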
Hi, I need to consume a REST operation which accepts an XML payload and a PDF file. Basically, a JAXB object is converted to an XML string and uploaded as an XML file, so the multipart request uploads an XML file and a PDF file.
The server-side code of the REST operation is as follows:
public class CompanyType extends MediaType {
    public static final String XML_STRING = "application/company+xml";
}
@POST
@Path("/upload")
@Consumes("multipart/mixed")
@Produces(CompanyType.XML_STRING)
public UploadResponseObject upload(MultiPart multiPart){
UploadRequestObject req = multiPart.getBodyParts().get(0).getEntityAs(UploadRequestObject.class);
BodyPartEntity bpe = (BodyPartEntity) multiPart.getBodyParts().get(1).getEntity();
byte[] pdfBytes = IOUtils.toByteArray(bpe.getInputStream());
....
....
}
Client-side code to consume the REST operation:
@Autowired
private RestTemplate rt;
public UploadResponseObject callMultipartUploadOperation(UploadRequestObject req, java.io.File target) throws Exception {
String url = "http://<host-name>:<port>/service-name/upload");
MultiValueMap<String, Object> mv = new LinkedMultiValueMap<String, Object>();
this.rt = new RestTemplate();
this.rt.setMessageConverters(getMessageConverter());
String id = <random number generated from 1 to 50000>;
// Add xml entity
org.springframework.http.HttpHeaders xmlFileHeaders = new org.springframework.http.HttpHeaders();
xmlFileHeaders.add("Content-Type", "application/company+xml");
HttpEntity<String> xmlFile = new HttpEntity<String>(createXMLString(req), xmlFileHeaders);
mv.add(id + ".xml", xmlFile);
// Add pdf file
org.springframework.http.HttpHeaders fileHeaders = new org.springframework.http.HttpHeaders();
fileHeaders.add("Content-Type", "application/pdf");
FileSystemResource fsr = new FileSystemResource(target);
HttpEntity<FileSystemResource> fileEntity = new HttpEntity<FileSystemResource>(
fsr, fileHeaders);
String filename = target.getName();
mv.add(filename, fileEntity);
HttpEntity<UploadRequestObject> ereq = new HttpEntity<UploadRequestObject>(req, getRequestHeaders());
ResponseEntity<UploadResponseObject> res = this.rt.postForEntity(url, ereq, UploadResponseObject.class);
return res.getBody();
}
private List<HttpMessageConverter<?>> getMessageConverter() {
List<HttpMessageConverter<?>> messageConverters = new ArrayList<HttpMessageConverter<?>>();
Jaxb2Marshaller jaxb2Marshaller = new Jaxb2Marshaller();
jaxb2Marshaller.setClassesToBeBound(UploadResponseObject.class);
MarshallingHttpMessageConverter mhmc = new MarshallingHttpMessageConverter(jaxb2Marshaller);
List<org.springframework.http.MediaType> supportedMediaTypes = new ArrayList<org.springframework.http.MediaType>();
supportedMediaTypes.add(new org.springframework.http.MediaType("application", "company+xml"));
mhmc.setSupportedMediaTypes(supportedMediaTypes);
messageConverters.add(mhmc);
// Add Form and Part converters
FormHttpMessageConverter fmc = new FormHttpMessageConverter();
fmc.addPartConverter(new Jaxb2RootElementHttpMessageConverter());
messageConverters.add(fmc);
return messageConverters;
}
When the line below is executed from the client code,
ResponseEntity<UploadResponseObject> res = this.rt.postForEntity(url, ereq, UploadResponseObject.class);
the following exception is thrown:
org.springframework.web.client.RestClientException: Could not write request: no suitable HttpMessageConverter found for request type [org..types.UploadRequestObject] and content type [application/company+xml]
Please advise on the changes needed to make the client-side code work.
After much trial and error, I was able to find the solution.
Client side code:
@Autowired
private RestTemplate rt;
public UploadResponseObject callMultipartUploadOperation(UploadRequestObject req, java.io.File target) throws Exception {
String url = "http://<host-name>:<port>/service-name/upload");
MultiValueMap<String, Object> mv = new LinkedMultiValueMap<String, Object>();
this.rt = new RestTemplate();
this.rt.setMessageConverters(getMessageConverter());
String id = <random number generated from 1 to 50000>;
// Add xml entity
org.springframework.http.HttpHeaders xmlFileHeaders = new org.springframework.http.HttpHeaders();
xmlFileHeaders.add("Content-Type", "application/company+xml");
HttpEntity<String> xmlFile = new HttpEntity<String>(createXMLString(req), xmlFileHeaders);
mv.add(id + ".xml", xmlFile);
// Add pdf file
org.springframework.http.HttpHeaders fileHeaders = new org.springframework.http.HttpHeaders();
fileHeaders.add("Content-Type", "application/pdf");
FileSystemResource fsr = new FileSystemResource(target);
HttpEntity<FileSystemResource> fileEntity = new HttpEntity<FileSystemResource>(
fsr, fileHeaders);
String filename = target.getName();
mv.add(filename, fileEntity);
HttpEntity<UploadRequestObject> ereq = new HttpEntity<UploadRequestObject>(req, getRequestHeaders());
ResponseEntity<UploadResponseObject> res = this.rt.postForEntity(url, ereq, UploadResponseObject.class);
return res.getBody();
}
Message converters:
private List<HttpMessageConverter<?>> getMessageConverter() {
List<HttpMessageConverter<?>> messageConverters = new ArrayList<HttpMessageConverter<?>>();
Jaxb2Marshaller jaxb2Marshaller = new Jaxb2Marshaller();
jaxb2Marshaller.setClassesToBeBound(UploadResponseObject.class);
MarshallingHttpMessageConverter mhmc = new MarshallingHttpMessageConverter(jaxb2Marshaller);
List<org.springframework.http.MediaType> supportedMediaTypes = new ArrayList<org.springframework.http.MediaType>();
supportedMediaTypes.add(new org.springframework.http.MediaType("application", "company+xml"));
supportedMediaTypes.add(new org.springframework.http.MediaType("multipart", "form-data"));
mhmc.setSupportedMediaTypes(supportedMediaTypes);
messageConverters.add(mhmc);
// Add Form and Part converters
FormHttpMessageConverter fmc = new FormHttpMessageConverter();
fmc.addPartConverter(new Jaxb2RootElementHttpMessageConverter());
fmc.addPartConverter(new ResourceHttpMessageConverter());
messageConverters.add(fmc);
return messageConverters;
}
Request headers :
private org.springframework.http.HttpHeaders getRequestHeaders() throws Exception { // parameterless, to match the call sites above
....
.....
org.springframework.http.HttpHeaders httpHeaders = new org.springframework.http.HttpHeaders();
httpHeaders.set("Accept", "applicaiton/company+xml");
httpHeaders.set("Content-Type", "multipart/form-data");
String consumer = "<AppUserId>";
httpHeaders.set("consumer", consumer);
String tmStamp= getCurrentTimeStamp();
httpHeaders.set("timestamp", tmStamp);
...
...
return httpHeaders;
}
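One thing worth flagging: in both snippets the multipart map mv is built but never sent; the entity that is posted wraps req directly. If the intent is to put the XML and PDF parts on the wire, a sketch of posting the map itself (same names as above) would be:
// Post the multipart map rather than the bare request object
HttpEntity<MultiValueMap<String, Object>> multipartRequest =
        new HttpEntity<>(mv, getRequestHeaders());
ResponseEntity<UploadResponseObject> res =
        this.rt.postForEntity(url, multipartRequest, UploadResponseObject.class);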
I am facing problems with JdbcPagingItemReader: I am stuck in an endless loop. I read that the ItemReader contract requires read() to return null when the data is exhausted, but I haven't managed to implement it correctly. Can somebody show me an example?
public List<TransactionDTO> getTransactions(Integer chunk, LocalDateTime startDate, LocalDateTime endDate)
throws Exception {
final TransactionMapper transactionMapper = new TransactionMapper();
final SqlPagingQueryProviderFactoryBean sqlPagingQueryProviderFactoryBean = new SqlPagingQueryProviderFactoryBean();
sqlPagingQueryProviderFactoryBean.setDataSource(dataSource);
sqlPagingQueryProviderFactoryBean.setSelectClause(env.getProperty("sql.fromdates.select"));
sqlPagingQueryProviderFactoryBean.setFromClause(env.getProperty("sql.fromdates.from"));
sqlPagingQueryProviderFactoryBean.setWhereClause(env.getProperty("sql.fromdates.where"));
sqlPagingQueryProviderFactoryBean.setSortKey(env.getProperty("sql.fromdates.sort"));
final Map<String, Object> parametros = new HashMap<>();
parametros.put("startDate", startDate);
parametros.put("endDate", endDate);
final JdbcPagingItemReader<TransactionDTO> itemReader = new JdbcPagingItemReader<>();
itemReader.setDataSource(dataSource);
itemReader.setQueryProvider(sqlPagingQueryProviderFactoryBean.getObject());
// TODO esto debe ser el chunk
itemReader.setPageSize(1);
itemReader.setFetchSize(1);
itemReader.setRowMapper(transactionMapper);
itemReader.afterPropertiesSet();
itemReader.setParameterValues(parametros);
ExecutionContext executionContext = new ExecutionContext();
itemReader.open(executionContext);
List<TransactionDTO> list = new ArrayList<>();
TransactionDTO primerDto = itemReader.read();
while (primerDto != null) { // BUG: primerDto is never reassigned, so this condition never changes
list.add(itemReader.read());
}
itemReader.close();
return list;
}
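For reference, a minimal sketch of the standard read loop: reassign the current item on every iteration and stop when read() returns null, which is how the reader signals the end of the data:
List<TransactionDTO> list = new ArrayList<>();
TransactionDTO dto;
// read() returns null once the reader is exhausted, ending the loop
while ((dto = itemReader.read()) != null) {
    list.add(dto);
}
itemReader.close();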
I'd like to fetch sonar.timemachine.period1 via wsclient.
Seeing that it doesn't offer such a method, I decided to bake one myself:
private Map<String, String> retrievePeriodProperties(final WsClient wsClient, int requestedPeriod) {
if (requestedPeriod > 0) {
final WsRequest propertiesWsRequestPeriod =
new GetRequest("api/properties/sonar.timemachine.period" + requestedPeriod);
final WsResponse propertiesWsResponsePeriod =
wsClient.wsConnector().call(propertiesWsRequestPeriod);
if (propertiesWsResponsePeriod.isSuccessful()) {
String resp = propertiesWsResponsePeriod.content();
Map<String, String> map = new HashMap<>();
map.put(Integer.toString(requestedPeriod), resp);
return map;
}
}
return new HashMap<>();
}
but it always returns an empty Map<>.
Any leads on where I can go from here?
You can use org.sonar.api.config.Settings to fetch properties defined in SonarQube.
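A minimal sketch of that approach, assuming a Settings instance is injected into your component by the SonarQube container (constructor injection is the usual pattern; the class and method names here are illustrative):
// Settings is provided by the SonarQube plugin container
private final Settings settings;
public PeriodFetcher(Settings settings) {
    this.settings = settings;
}
private String retrievePeriodProperty(int requestedPeriod) {
    // getString returns null if the property is not set
    return settings.getString("sonar.timemachine.period" + requestedPeriod);
}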