Hi, in our application we create certain dynamic integration flows that we remove and create on the fly.
Mostly things have worked great, but we sometimes see the error below, especially when the integration flow tries to remove dependent beans. Can someone comment on whether this is a bug or whether we are missing something? Error trace below:
java.lang.NullPointerException: null
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resetBeanDefinition(DefaultListableBeanFactory.java:912)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.removeBeanDefinition(DefaultListableBeanFactory.java:891)
at org.springframework.integration.dsl.context.IntegrationFlowContext.removeDependantBeans(IntegrationFlowContext.java:203)
at org.springframework.integration.dsl.context.IntegrationFlowContext.remove(IntegrationFlowContext.java:189)
Code that removes the integration flow:
flowContext.remove("flowId");
Update: the invoking code
if (discoveryService.isFLowPresent(flowId)) {
    LOG.debug("Removing and creating flow [{}]", flowId);
    discoveryService.removeIntegrationFlow(fc.getFeedId());
    LOG.debug("Removing old job and creating a fresh one with new params [{}]", flowId);
    try {
        discoveryService.createFlow(fc.getFeedId());
    } catch (ExecutionException e) {
        throw new IllegalStateException(
                "Error while starting flow for Integration adapter [" + fc.getFeedId() + "]", e);
    }
}
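As an aside, one defensive pattern is to check the registration before removing. This is only a sketch, assuming a Spring Integration version that exposes getRegistrationById on IntegrationFlowContext; it narrows, but does not eliminate, any race between concurrent removals:

```java
// Sketch, assuming Spring Integration 5.x IntegrationFlowContext.
// Checking the registration first avoids calling remove() for an id
// that is already gone; it does not fully protect against two threads
// removing the same flow at the same time.
IntegrationFlowContext.IntegrationFlowRegistration registration =
        flowContext.getRegistrationById(flowId);
if (registration != null) {
    flowContext.remove(flowId);
}
```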
I have a Dataflow job written in Apache Beam with Java. I am able to run the Dataflow job in GCP through these steps:
Created a Dataflow template from my code, then uploaded the template to Cloud Storage.
Created the job directly from the template option available in GCP -> Dataflow -> Jobs.
This flow is working fine.
I want to do the same through a Java app: I have an API, and when someone sends a request to that API, I want to start this Dataflow job through the template I have already stored in Cloud Storage.
I could see a REST API is available to implement this approach, as below:
POST /v1b3/projects/project_id/locations/loc/templates:launch?gcsPath=template-location
But I didn't find any reference or samples for this. I tried the approach below.
In my Spring Boot project I added this dependency:
<!-- https://mvnrepository.com/artifact/com.google.apis/google-api-services-dataflow -->
<dependency>
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-dataflow</artifactId>
<version>v1b3-rev20210825-1.32.1</version>
</dependency>
and added the below code in the controller:
public static void createJob() throws IOException {
    GoogleCredential credential = GoogleCredential.fromStream(new FileInputStream("myCertKey.json"))
            .createScoped(java.util.Arrays.asList("https://www.googleapis.com/auth/cloud-platform"));
    try {
        Dataflow dataflow = new Dataflow.Builder(new LowLevelHttpRequest(), new JacksonFactory(),
                credential).setApplicationName("my-job").build(); // <-- this gives the error

        // RuntimeEnvironment
        RuntimeEnvironment env = new RuntimeEnvironment();
        env.setBypassTempDirValidation(false);
        // all my env configs added

        // parameters
        HashMap<String, String> params = new HashMap<>();
        params.put("bigtableEmulatorPort", "-1");
        params.put("gcsPath", "gs://bucket//my.json");
        // all other params

        LaunchTemplateParameters content = new LaunchTemplateParameters();
        content.setJobName("Test-job");
        content.setEnvironment(env);
        content.setParameters(params);

        dataflow.projects().locations().templates().launch("project-id", "location", content);
    } catch (Exception e) {
        log.info("error occured", e);
    }
}
This gives {"id":null,"message":"'boolean com.google.api.client.http.HttpTransport.isMtls()'"}
The error occurs on this line itself:
Dataflow dataflow = new Dataflow.Builder(new LowLevelHttpRequest(), new JacksonFactory(),
credential).setApplicationName("my-job").build();
This is because the Dataflow builder expects an HttpTransport as its first argument, but I passed a LowLevelHttpRequest().
I am not sure whether this is the correct way to implement this. Can anyone suggest any ideas on how to implement it? Any examples or references?
Thanks a lot :)
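For what it's worth, the isMtls() message often points at mismatched google-api-client / google-http-client versions on the classpath, and the first builder argument does need to be an HttpTransport. A minimal sketch of how the builder call could look (the template path here is a placeholder, not from the question):

```java
// Sketch, assuming google-api-client is on the classpath alongside
// google-api-services-dataflow. GoogleNetHttpTransport.newTrustedTransport()
// supplies the HttpTransport the builder expects; GoogleCredential already
// implements HttpRequestInitializer, so it can be passed as the third argument.
HttpTransport transport = GoogleNetHttpTransport.newTrustedTransport();
Dataflow dataflow = new Dataflow.Builder(transport, JacksonFactory.getDefaultInstance(), credential)
        .setApplicationName("my-job")
        .build();

LaunchTemplateResponse response = dataflow.projects().locations().templates()
        .launch("project-id", "location", content)
        .setGcsPath("gs://my-bucket/templates/my-template") // placeholder template path
        .execute();
```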
I have a Spring Boot API which deals with a lot of processes in the backend. I need to stream the status to the frontend. Since I am new to Spring Boot, can anyone help me with how to achieve this scenario?
Note: the application is going to be containerized in the future and I cannot use any cloud service for this.
As there is not much to go off of, I will try my best:
If you are using Log4j2 you could simply use the SocketAppender (external link)
If not:
I did something similar recently; you will need to somehow turn your logs into a stream. I'd advise using the information found here (non-external link)
Stream<String> logStream; // a java.util.stream.Stream of log lines, however you capture them

@GetMapping("/stream-sse-mvc")
public SseEmitter streamSseMvc() {
    SseEmitter emitter = new SseEmitter();
    ExecutorService sseMvcExecutor = Executors.newSingleThreadExecutor();
    sseMvcExecutor.execute(() -> {
        try {
            long id = 0;
            for (Iterator<String> it = logStream.iterator(); it.hasNext(); ) {
                SseEmitter.SseEventBuilder event = SseEmitter.event()
                        .id(String.valueOf(id++))
                        .name("EVENT_TYPE")
                        .data(it.next());
                emitter.send(event);
            }
            emitter.complete();
        } catch (Exception ex) {
            emitter.completeWithError(ex);
        }
    });
    return emitter;
}
There might be better ways to reach your endpoints, but without knowing what frameworks you are using this is hard to answer. Essentially, what we are doing is capturing all log output to a stream which is then broadcast by an SSE emitter.
I've been looking on Google for a while now and I can't find a solution to my problem.
The problem is the default behavior of RabbitTemplate's methods, namely convertSendAndReceive() and convertSendAndReceiveAsType().
When you invoke these methods and the message is not processed and replied to (default direct reply-to with queue=amq.rabbitmq.reply-to), the RabbitTemplate simply returns a null response instead of indicating that the message was not replied to.
That is pretty important when you send an almost empty body to a queue and expect to receive, say, a user's books: with a null response you can't tell whether the user has no books or the message wasn't processed in time.
Example invocation
final List<String> messages = rabbitTemplate.convertSendAndReceiveAsType("getMessagesQueue", 0, new ParameterizedTypeReference<>() {});
I found a workaround for this: using AsyncRabbitTemplate, as its RabbitConverterFuture throws an exception on .get(timeout). But that's not my go-to; I don't want to have to use AsyncRabbitTemplate just to get notified of an unprocessed message.
Example
final AsyncRabbitTemplate.RabbitConverterFuture<List<String>> messages = asyncRabbitTemplate.convertSendAndReceiveAsType("getMessagesQueue", 0, new ParameterizedTypeReference<>() {});
try {
messages.get(5000, TimeUnit.MILLISECONDS);
} catch (InterruptedException | ExecutionException | TimeoutException e) {
// message not processed
}
My problem is how to configure the RabbitTemplate itself (not wrap template calls with aspects, decorators, proxies or similar) to actually throw some exception instead of returning null values.
There is currently no such feature; feel free to open a new feature request on GitHub. https://github.com/spring-projects/spring-amqp/issues
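In the meantime, if a plain helper at the call sites is acceptable (not the template-level configuration you asked for), a null check can at least make the failure explicit. This is only a sketch; the class and method names are hypothetical:

```java
// Hypothetical helper: turns RabbitTemplate's null "no reply" result
// into an explicit exception at the call site.
public class ReplyGuard {

    static <T> T requireReply(T reply, String queueName) {
        if (reply == null) {
            throw new IllegalStateException(
                    "No reply received from queue [" + queueName + "] within the reply timeout");
        }
        return reply;
    }
}
```

A call site would then read something like `requireReply(rabbitTemplate.convertSendAndReceiveAsType("getMessagesQueue", 0, new ParameterizedTypeReference<>() {}), "getMessagesQueue")`.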
I have created a Lambda function which is triggered by a DynamoDB stream. I am trying to process DynamoDB events and put them into a Kinesis stream after some transformation. The Lambda has full access to both DynamoDB and the Kinesis stream.
I am using CloudWatch to check the logs and can see that the DynamoDB events are successfully processed. But when I try to create the Kinesis client (present in a different class), the code fails. I tried logging the error and even printing it, but it did not help. Sometimes the logs end with this message:
END RequestId: {some request id}
Other times, I get the following error
log4j:WARN No appenders could be found for logger (com.amazonaws.AmazonWebServiceClient).
The code fails at the time of creation of the Kinesis client. I can see the log messages / print statements before the creation of the Kinesis client, but right at that line the code fails. I am not sure what the problem is. Can someone please help me out?
Here is the class in which the code fails
private AmazonKinesis kinesisClient;
private AmazonKinesisClientBuilder clientBuilder;
private String streamName;

public TestKinesisPut(String streamName) {
    this.streamName = streamName;
    BasicAWSCredentials awsCreds = new BasicAWSCredentials("ACCESS_KEY", "SECRET_KEY");
    System.out.println("aws creds are: " + awsCreds);
    clientBuilder = AmazonKinesisClientBuilder.standard()
            .withRegion(Regions.AP_SOUTH_1)
            .withCredentials(new AWSStaticCredentialsProvider(awsCreds));
    System.out.println("Credentials are set: \n " + clientBuilder);
    try {
        System.out.println("About to build new kinesis client");
        // the code fails after this line
        kinesisClient = clientBuilder.build();
        System.out.println("Kinesis client built");
    } catch (Exception e) {
        System.out.println("failed to initialize producer: " + e.getMessage());
        kinesisClient = null;
    }
}
Thanks
After a few days of head scratching I decided to tinker with the configuration of my Lambda function. It looks like the problem was caused by an OutOfMemoryError. I increased the memory of my Lambda function and it started working.
It seems that at the time of creation of the Kinesis client, the JVM was running out of metaspace. I did some research and found this Stack Overflow thread; please refer to the link for a detailed discussion of a similar scenario.
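A related mitigation, sketched below under the assumption that the AWS SDK v1 and aws-lambda-java-events are on the classpath (the handler class name is hypothetical): building the client once per container rather than per invocation reduces class-loading and metaspace pressure inside the Lambda JVM.

```java
// Sketch: the client is created once when the Lambda container starts
// (static initializer) and reused across invocations, instead of being
// rebuilt inside the handler or a per-request constructor.
public class DynamoToKinesisHandler implements RequestHandler<DynamodbEvent, Void> {

    private static final AmazonKinesis KINESIS = AmazonKinesisClientBuilder.standard()
            .withRegion(Regions.AP_SOUTH_1)
            .build();

    @Override
    public Void handleRequest(DynamodbEvent event, Context context) {
        // transform the DynamoDB records and put them on the Kinesis stream here
        return null;
    }
}
```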
My builds have been failing due to some of the integration tests I've been running. I'm stuck on why it won't work. Here is an example of the output:
I'm using Maven to build first, and it then runs the JUnit tests. I'm seeing this 401 Unauthorized message in every single test, and I believe that's what is causing the builds to fail. To my mind, this means there are some permissions / authentication parameters that need to be set. Where would I go about doing this in JUnit?
Edit
@Test
public void testXmlHorsesNonRunners() throws Exception {
    String servletUrl = SERVER + "sd/date/2013-01-13/horses/nonrunners";
    Document results = issueRequest(servletUrl, APPLICATION_XML, false);
    assertNotNull(results);
    // debugDocument(results, "NonRunners");
    String count = getXPathStringValue(
            "string(count(hrdg:data/hrdg:meeting/hrdg:event/hrdg:nonrunner/hrdg:selection))",
            results);
    assertEquals("non runners", "45", count);
}
If you can, try to ignore the detail. Effectively, this is making a request. This is a sample of a test that uses the issueRequest method, which is what makes the HTTP requests. (This is a big method, which is why I didn't post it originally. I'll try to make it as readable as possible.)
logger.info("Sending request: " + servletUrl);
HttpGet httpGet = null;
// InputStream is = null;
DefaultHttpClient httpclient = null;
try {
    httpclient = new DefaultHttpClient();
    doFormLogin(httpclient, servletUrl, acceptMime, isIrishUser);
    httpGet = new HttpGet(servletUrl);
    httpGet.addHeader("accept", acceptMime);
    // but more importantly now add the user agent header
    setUserAgent(httpGet, acceptMime);
    logger.info("executing request" + httpGet.getRequestLine());
    // Execute the request
    HttpResponse response = httpclient.execute(httpGet);
    // Examine the response status
    StatusLine statusLine = response.getStatusLine();
    logger.info(statusLine);
    switch (statusLine.getStatusCode()) {
        case 401:
            throw new HttpResponseException(statusLine.getStatusCode(), "Unauthorized");
        case 403:
            throw new HttpResponseException(statusLine.getStatusCode(), "Forbidden");
        case 404:
            throw new HttpResponseException(statusLine.getStatusCode(), "Not Found");
        default:
            if (300 < statusLine.getStatusCode()) {
                throw new HttpResponseException(statusLine.getStatusCode(), "Unexpected Error");
            }
    }
    // Get hold of the response entity
    HttpEntity entity = response.getEntity();
    Document doc = null;
    if (entity != null) {
        InputStream instream = entity.getContent();
        try {
            // debugContent(instream);
            doc = documentBuilder.parse(instream);
        } catch (IOException ex) {
            // In case of an IOException the connection will be released
            // back to the connection manager automatically
            throw ex;
        } catch (RuntimeException ex) {
            // In case of an unexpected exception you may want to abort
            // the HTTP request in order to shut down the underlying
            // connection and release it back to the connection manager.
            httpGet.abort();
            throw ex;
        } finally {
            // Closing the input stream will trigger connection release
            instream.close();
        }
    }
    return doc;
} finally {
    // Release the connection.
    closeConnection(httpclient);
}
I notice that your test output shows HTTP/1.1 500 Internal Server Error a couple of lines before the 401 error. I wonder if the root cause could be hiding in there. If I were you I'd try looking for more details about what error happened on the server at that point in the test, to see if it could be responsible for the authentication problem (maybe the failure is in a login controller of some sort, or is causing a session to be cancelled?)
Alternately: it looks like you're using the Apache HttpClient library to do the request, inside the issueRequest method. If you need to include authentication credentials in the request, that would be the code you'd need to change. Here's an example of doing HTTP Basic authentication in HttpClient, if that helps. (And more examples, if that one doesn't.)
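For illustration, preemptively supplying Basic credentials with HttpClient 4.x might look roughly like the sketch below. The host, port, path, and credentials are placeholders, and your app may well use form login instead, as the doFormLogin call suggests:

```java
// Sketch, assuming Apache HttpClient 4.x (as DefaultHttpClient implies).
// Host, port, URL, username and password below are placeholders.
DefaultHttpClient httpclient = new DefaultHttpClient();
httpclient.getCredentialsProvider().setCredentials(
        new AuthScope("localhost", 8080),
        new UsernamePasswordCredentials("testuser", "testpass"));
HttpGet get = new HttpGet("http://localhost:8080/sd/date/2013-01-13/horses/nonrunners");
HttpResponse response = httpclient.execute(get);
```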
(I'd second the observation that this problem probably isn't specific to JUnit. If you need to do more research, I'd suggest learning more about HttpClient, and about what this app expects the browser to send. One possibility: use something like Chrome Dev Tools to peek at your communications with the server when you do this manually, and see if there's anything important that the test isn't doing, or is doing differently.
Once you've figured out how to log in, it might make sense to do it in a @Before method in your JUnit test.)
HTTP permission denied has nothing to do with JUnit. You probably need to set your credentials while making the request in the code itself. Show us some code.
Also, unit testing is not really meant to access the internet. Its purpose is for testing small, concise parts of your code which shouldn't rely on any external factors. Integration tests should cover that.
If you can, try to mock your network requests using EasyMock or PowerMock and make them return a resource you would load from your local resources folder (e.g. test/resources).