I am wondering how we can automate testing of this functionality.
I am working on a Spring Boot microservice where we use a GemFire cache. Right now I am testing it manually for the scenarios below:
Is the data purged correctly after the TTL is reached?
Retrieving the data from the cache if the object exists
So, I know we can have a separate service which calls GemFire and makes sure that the object exists in the cache (for step 2). But I'm not really sure how we can automate testing for step 1.
And the whole point I am wondering about is: do we really need an entirely new service to test this, given the overhead? Are there any tools or a better approach for testing this functionality?
Since you're using Spring Boot and VMware GemFire together, I really hope you're taking advantage of the huge help and functionality spring-boot-data-gemfire provides out of the box. If you are, then you'd be delighted to know that there's yet another project, spring-test-data-geode, which can be used to write unit and integration tests when building Spring Data for Apache Geode & VMware GemFire applications. You should really give it a try, as it greatly helps in managing the scope and lifecycle of mock VMware GemFire/Apache Geode objects, along with cleaning up all resources used by real objects during integration tests.
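For illustration, here's a minimal sketch of the kind of unit test spring-test-data-geode enables; it assumes the project's @EnableGemFireMockObjects annotation, and the class and test names are my own:

```java
import static org.assertj.core.api.Assertions.assertThat;

import org.apache.geode.cache.client.ClientCache;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.gemfire.config.annotation.ClientCacheApplication;
import org.springframework.data.gemfire.tests.mock.annotation.EnableGemFireMockObjects;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@ContextConfiguration
public class MockClientCacheUnitTests {

    @Autowired
    private ClientCache clientCache;

    @Test
    public void clientCacheIsInjectedAndMocked() {
        // No real cluster is required; the mock's lifecycle is managed for you.
        assertThat(clientCache).isNotNull();
    }

    @ClientCacheApplication
    @EnableGemFireMockObjects // swaps real GemFire objects for managed mocks
    static class TestConfiguration { }
}
```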
As a side note, if you're using the data expiration functionality shipped out of the box with VMware GemFire, I really don't see an actual need (other than the peace of mind that comes with "I've tested everything I could") to include custom tests in your testing suite; you should only test what you own. The functionality itself is already thoroughly tested as part of the VMware GemFire / Apache Geode project, and you can see some (certainly not all) examples of such tests in the following links: ExpirationDUnitTest, RegionExpirationDistributedTest, ReplicateEntryIdleExpirationDistributedTest.
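For reference, a rough sketch of what configuring that out-of-the-box entry TTL looks like declaratively with SDG's @EnableExpiration annotation; the region name and timeout are made up for illustration, and exact enum locations can vary by SDG version:

```java
import org.springframework.data.gemfire.config.annotation.ClientCacheApplication;
import org.springframework.data.gemfire.config.annotation.EnableExpiration;
import org.springframework.data.gemfire.config.annotation.EnableExpiration.ExpirationPolicy;
import org.springframework.data.gemfire.config.annotation.EnableExpiration.ExpirationType;
import org.springframework.data.gemfire.expiration.ExpirationActionType;

// Entries in the "Customers" Region are destroyed 60 seconds after creation.
@ClientCacheApplication
@EnableExpiration(policies = @ExpirationPolicy(
        timeout = 60,
        action = ExpirationActionType.DESTROY,
        regionNames = "Customers",
        types = ExpirationType.TIME_TO_LIVE))
public class CacheConfiguration { }
```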
Cheers.
I have had some success using Testcontainers; here is the code used to create the container, and a sample test. It works by executing gfsh commands on the container, but it is slow.
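To give a flavor of the approach, here's a rough sketch (not my actual code from the links above); the Geode image name, the keep-alive command, and the gfsh flags are my assumptions:

```java
import org.junit.Test;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.utility.DockerImageName;

public class GeodeTtlContainerTest {

    @Test
    public void entriesExpireAfterTtl() throws Exception {
        try (GenericContainer<?> geode =
                     new GenericContainer<>(DockerImageName.parse("apachegeode/geode"))
                             .withCommand("sleep", "infinity")) { // keep the container alive
            geode.start();

            // Run gfsh inside the container: start a locator and a server, then
            // create a region whose entries are destroyed five seconds after creation.
            geode.execInContainer("gfsh",
                    "-e", "start locator --name=locator",
                    "-e", "start server --name=server --locators=localhost[10334]",
                    "-e", "create region --name=Customers --type=REPLICATE"
                            + " --enable-statistics=true"
                            + " --entry-time-to-live-expiration=5"
                            + " --entry-time-to-live-expiration-action=DESTROY");

            // A full test would map the server port, connect a client, put an
            // entry, wait past the TTL, and assert the entry is gone.
        }
    }
}
```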
I have a mobile app whose back-end currently runs as a NodeJS Cloud Function, but I'm nowhere near as comfortable with NodeJS as I am with Java. So I've rewritten the API in Java; however, when it comes to deploying that as a Cloud Function or on Cloud Run, the cold-start performance is obviously not very good. I'm seeing around 15 seconds of cold-start time once I add in the dependencies that I need, which is not going to work. I do have a "warmup" endpoint that I call immediately when a user logs into the mobile app to kick off the initialization of the API back-end, which does help a little.
I've also been playing around with GraalVM and generating a native image for a while, and while I can get a basic hello-world app and some slightly more elaborate examples working, my app has dependencies like gRPC and Cloud Firestore, among others, and I have not been successful in generating a native image for it with Micronaut, Quarkus, or Spring Boot.
I considered running on a managed instance group with a minimum of one instance, so there's always at least one up and running, ready to serve requests, but I would then need a Cloud Load Balancer in front, and I've read some horror stories where the Cloud Load Balancer wound up costing folks a lot more than they had expected.
Is there a way to front a managed instance group using Cloud Endpoints? I see that you can do it with a single VM instance, but not across a group, which leads me to believe that in that case I would need a Cloud Load Balancer to do what I need.
Cost-effectiveness is important, because my app is super new and is not generating any revenue at all yet, and since it's just me funding it using personal money, my infrastructure budget is not super high :)
TL;DR: Looking for tips on the cheapest way to host a Java-based API app on a framework like Micronaut, Quarkus, or Spring Boot on GCP while maintaining good performance and elasticity.
Any insight would be greatly appreciated.
I wrote an article on Java framework cold starts on Cloud Run (the results are outdated: after the article's release and discussions with Googlers, the team updated the Cloud Run platform and the way Java containers are managed, and they now start quickly!)
Anyway, your question seems relevant at first, but ultimately it isn't, really. I will explain why.
Firstly, the cold start is a temporary issue. Your first request is slow, and the dozens or hundreds after it are very fast. Is that really a problem?
If so, the min-instance feature (only available on Cloud Run for Anthos for now) is coming to the managed version. With it, you never really scale to zero: an instance is kept warm and starts instantly (but, as a counterpart, it won't be free).
Secondly, if you look for maintainability, I recommend the framework that you already know. You will be more efficient at improving your code, fixing your issues, and saving your time (and time is money), which matters much more than infrastructure considerations!
All the Java frameworks are relatively close when optimized (a naive Spring Boot app on Cloud Run starts in 20s, and in 2s after packaging optimizations!). Of course, native compilation (with GraalVM) is the fastest, but it's not really stable for now and has several side effects (I wouldn't recommend it for production).
Personal opinion: I'm a big fan of Spring Boot and its ecosystem. But Micronaut, with its AOT compilation and annotations compliant with Spring Boot idioms, is absolutely awesome. Quarkus is more recent, and I don't have a real opinion on it (I've never used it in a production/real project).
I would say you need Micronaut or Quarkus in combination with GraalVM if you're targeting performance. Define your services to be run as serverless functions/lambdas.
My experience is primarily with Micronaut serverless applications, and it is manageable to have an API service running as a function/lambda with a boot time of 100-500 ms. Cold starts are not a big issue anymore if you enable provisioning (the feature has been available in AWS since December 2019), so you can skip the so-called warming.
How to make your lambda faster?
Keep your package size as small as possible (remove all big libraries of which only a fraction is used); keep the package size to a max of 20 MB. On every cold start this package is fetched and decompressed.
If you use a JVM technology for your services, try to migrate them to GraalVM, where the boot-up overhead is reduced to a minimum:
Micronaut + GraalVM
Quarkus + GraalVM
Helidon + GraalVM
Use cloud infrastructure configs to reduce the cold starts.
This is what AWS provides; not sure about GCP:
https://aws.amazon.com/about-aws/whats-new/2019/12/aws-lambda-announces-provisioned-concurrency/
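For completeness, here's a hedged sketch of turning on provisioned concurrency from the AWS SDK for Java v2 (this is normally done once via the console or infrastructure-as-code; the function name, alias, and count are placeholders):

```java
import software.amazon.awssdk.services.lambda.LambdaClient;
import software.amazon.awssdk.services.lambda.model.PutProvisionedConcurrencyConfigRequest;

public class ProvisionedConcurrencySetup {

    public static void main(String[] args) {
        try (LambdaClient lambda = LambdaClient.create()) {
            lambda.putProvisionedConcurrencyConfig(
                    PutProvisionedConcurrencyConfigRequest.builder()
                            .functionName("my-api-function")     // placeholder name
                            .qualifier("live")                   // alias or version
                            .provisionedConcurrentExecutions(2)  // instances kept warm
                            .build());
        }
    }
}
```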
Note: IMHO, AWS has a better setup for serverless applications so far compared to GCP in terms of boot-up and cold starts.
We are developing test cases for a microservice using Spring Boot. One of the requirements is that, for each JUnit test case, we need to:
start the project,
test a unit case, and
then stop the project.
I feel this is an anti-pattern, but this is the requirement.
I looked around the internet but couldn't find a solution for this. I was able to start a web server, but it gave no response, which might be because the project is not assigned to the server.
Does anyone have any idea on how this can be achieved?
PS: We don't want to use Mockito
Beforehand, I want to make clear that this is a very bad practice and should be avoided. This approach does not correctly implement the concept of unit tests, because you are testing an entire running system, so JUnit wouldn't be the correct tool.
I poked around and I can't seem to find a Runner that is able to do this (which does not surprise me, honestly). The most similar Runner may be SpringJUnit4ClassRunner, which provides you a complete Spring context in your test space, but it won't go live with the application.
An approach I'd suggest, if you really want to go with this, is to use a tool like REST Assured to do end-to-end API-layer tests against the live application. But this implies that you have to find another way to start the app, and then point the REST Assured tests at that running app: maybe a shell script that starts the app, runs the REST Assured test suites, and then, when the suite ends, shuts the server down. A sketch of such a test is below.
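A minimal sketch along those lines; it assumes the app was started separately (e.g., by the shell script), and the port and endpoint are hypothetical:

```java
import static io.restassured.RestAssured.given;

import org.junit.Test;

public class UserApiEndToEndTests {

    @Test
    public void getUserReturnsOk() {
        given()
                .baseUri("http://localhost:8080") // app started outside the test
        .when()
                .get("/api/users/42")             // hypothetical endpoint
        .then()
                .statusCode(200);
    }
}
```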
I highly suggest you chat with your product/management teams to avoid this kind of stuff, since the tests will take FOREVER to run, and if you persist data you will be polluting your local or remote DBs, or other systems reached through REST or SOAP calls.
I am working on a sample application using Spring Boot, Spring Data JPA, and Spring Data Elasticsearch. I want to be able to run the unit tests as part of a pipeline build, but they require Elasticsearch to be running, since the service makes calls to said ES server. SQL works fine because I am using an in-memory H2 instance.
I have implemented some code to attempt to launch ES as an "embedded" server. The embedded server works just fine, but it seems, at least from what I can tell, that it is started AFTER the context loads; most importantly, after ElasticSearchConfiguration does its thing.
I think I need to refactor the code out of AbstractElasticsearchTest into a separate class that can run before ElasticSearchConfiguration generates the client/template, but I am not sure how to do it, nor how to Google said process.
Is there some mechanism in Spring Boot that could be used to start the embedded servers prior to running any of the configurations? Or is there some way I could enhance ElasticSearchConfiguration to do it prior to creating the client/template, but only when running the unit tests?
Edit:
So, just to be a little more specific: what I am looking for is a means/way either to run ES 5 in "embedded" mode OR to mock up the Spring Data ES code enough that it works on the CI server. The code linked above is currently mixing unit tests with integration tests, I know, as it's making calls to a physical ES server. That's what I am trying to correct: I should be able to stub/mock enough of the underlying Spring Data code to make the unit tests think they're talking to the real deal. I can then change the tests that check whether the documents made it to ES, and tests of things like type-ahead searches, to be integration tests instead, so they do not run when CI or Sonar runs.
Ok, so for those that might come back here in the future, this commit shows the changes I made to get ES to run as "embedded".
The nuts and bolts of it were to start the node as "local" and then physically return node.client(). Then, in the Spring bean method that provides the client, check whether "embedded" is turned on: if so, start the node and return its Client (the local one); if not, just build the client as normal.
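For those who can't follow the commit link, here's a rough reconstruction of that pattern (not the actual code from the commit); the "embedded" property name and settings are assumptions, and the exact node bootstrap varies across ES 5.x versions:

```java
import java.net.InetAddress;

import org.elasticsearch.client.Client;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.node.Node;
import org.elasticsearch.transport.client.PreBuiltTransportClient;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ElasticsearchClientConfig {

    @Value("${elasticsearch.embedded:false}")
    private boolean embedded;

    @Bean(destroyMethod = "close")
    public Client elasticsearchClient() throws Exception {
        if (embedded) {
            // Start an in-process node bound to the local transport and
            // hand back its client.
            Node node = new Node(Settings.builder()
                    .put("transport.type", "local")
                    .put("http.enabled", false)
                    .put("path.home", "target/elasticsearch")
                    .build());
            node.start();
            return node.client();
        }
        // Otherwise, build the client against the real cluster as normal.
        return new PreBuiltTransportClient(Settings.EMPTY)
                .addTransportAddress(new InetSocketTransportAddress(
                        InetAddress.getByName("localhost"), 9300));
    }
}
```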
I'm a total beginner with the Spring Framework and am trying to figure out whether it even fits my use case before investing time in learning it.
I'm responsible for a standalone Java project (used as a jar by a server) which basically serves requests from that server and, in turn, makes service calls to various internal services.
This standalone Java project currently has all of its service calls hard-coded. I want to use Spring to inject dependencies so I can make this stuff testable.
I have no idea how Spring works. Does it even apply to standalone jars, or is it only for 'running applications'?
If I make my standalone project 'Spring-enabled', when the server uses my jar, will it automagically work by creating beans, or is there some requirement on the server side?
In short, yes, you can use Spring in a standalone jar application (a "console application", if you will); we do it all the time at work. You just need to create the ApplicationContext yourself when your application starts; see for example here: http://www.devdaily.com/blog/post/java/load-spring-application-context-file-java-swing-application
This is just one example I pulled straight out of Google; there are probably numerous others. Still, you really need to read at least the basics from the Spring documentation to get started, otherwise you'll probably hit a wall pretty soon.
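To make the idea concrete, a minimal sketch of bootstrapping the context yourself; the context file name and the MyService bean are hypothetical placeholders:

```java
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class Main {

    // Hypothetical service interface wired up in applicationContext.xml.
    public interface MyService {
        void handleRequest();
    }

    public static void main(String[] args) {
        // Bootstrap Spring yourself; no server or container is required.
        ApplicationContext context =
                new ClassPathXmlApplicationContext("applicationContext.xml");
        MyService service = context.getBean(MyService.class);
        service.handleRequest();
    }
}
```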
Does anybody have experience with the Spring Integration project as an embedded ESB?
I'm highly interested in use cases such as:
Reading files from a directory on a scheduled basis
Getting data from a JDBC data source
Modularity and the possibility to start/stop/redeploy modules on the fly (e.g., one module scans a directory on a schedule, another queries a JDBC data source, etc.)
Repeat/retry policies
UPDATE:
I found answers to all my questions except "Getting data from a JDBC data source". Is that technically possible?
Remember, "ESB" is just a marketing term designed to sell more expensive software, it's not a magic bullet. You need to consider the specific jobs you need your software to do, and pick accordingly. If Spring Integration seems to fit the bill, I wouldn't be too concerned if it doesn't look much like an uber-expensive server installation.
The Spring Integration JDBC adapters are available in 2.0, and we just released GA last week. Here's the relevant section from the reference manual: http://static.springsource.org/spring-integration/docs/latest-ga/reference/htmlsingle/#jdbc
This link describes the FileSucker with Spring Integration. Read up on your Enterprise Integration Patterns for more info, I think.
I kind of think you need to do a bit more investigation yourself, or have a couple of tries at some of your use cases. Then we can discuss what's good and bad.
JDBC Adapters appear to be a work in progress.
Even if there were no specific adapter available, remember that Spring Integration is a thin wrapper around POJOs. You'll be able to access JDBC in any component, e.g., your service activators; a sketch is below.
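To illustrate that point, a small sketch of a POJO service activator using JdbcTemplate directly; the channel names, table, and query are hypothetical:

```java
import java.util.Map;

import javax.sql.DataSource;

import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.jdbc.core.JdbcTemplate;

public class OrderLookupActivator {

    private final JdbcTemplate jdbcTemplate;

    public OrderLookupActivator(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    // Receives an id from the "orderIds" channel, queries the database,
    // and sends the row to the "orders" channel.
    @ServiceActivator(inputChannel = "orderIds", outputChannel = "orders")
    public Map<String, Object> lookupOrder(Long orderId) {
        return jdbcTemplate.queryForMap("SELECT * FROM orders WHERE id = ?", orderId);
    }
}
```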
See here for a solution based on a polling inbound channel adapter too.