We are in the process of building a new interoperability platform for our company, and we're looking into several data integration platforms as potential foundation candidates. One of these candidates is Spring Cloud Data Flow.
We would like to know whether there is long-term support for this platform and/or a maintenance plan for the coming years.
We weren't able to find clear answers anywhere, and we would greatly appreciate it if anyone here could point us to one.
Any links are helpful. Thank you!
I am a complete fresher to Spring Boot. I have learned to perform basic CRUD operations using a REST API. Is that basic knowledge enough to work on a Spring Boot project?
No one is technically perfect!
Every day we explore new things and implement new solutions as new business requirements come in. A developer should possess a good set of problem-solving skills, because it's common for developers to run into multiple programming problems while building just about any solution.
If your lead assigns you a task to explore Quarkus and implement a simple CRUD operation using the Go language, what will you do?
Is "I don't know Golang" your answer?
Qualities of a good junior software programmer
Learn new things daily that are useful to the growth of the company, your team, and yourself.
Problem Solving and Logical Thinking
Written and Verbal Communication
Teamwork
Interpersonal skills
Time management
Knowing how to search for answers on Stack Overflow like a pro. This is a very important skill; really, I am not joking.
Health - physical and mental health is an important asset. Don't take work matters personally; manage stress, etc.
Coming to the technical side:
It is good to know at least one programming language; in your case, Java is fine. But if you have the skill set listed above, you can learn anything very easily.
Regarding Spring Boot:
Do you want to become an expert in the Spring Framework? Work on one big project, whatever the domain.
Refer to https://www.baeldung.com/
Once you become a pro, refer to https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/
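Since the question is about basic CRUD over REST, here is a minimal sketch of what that looks like in Spring Boot. The `Book` record and the in-memory map are hypothetical stand-ins; a real project would typically use Spring Data JPA and a database instead.

```java
// Minimal Spring Boot CRUD sketch. Book and the in-memory store are
// illustrative only; a real project would use a repository and a database.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

record Book(Long id, String title) {}

@RestController
@RequestMapping("/api/books")
class BookController {

    private final Map<Long, Book> store = new ConcurrentHashMap<>();
    private final AtomicLong ids = new AtomicLong();

    @PostMapping                       // Create
    public Book create(@RequestBody Book book) {
        long id = ids.incrementAndGet();
        Book saved = new Book(id, book.title());
        store.put(id, saved);
        return saved;
    }

    @GetMapping("/{id}")               // Read
    public ResponseEntity<Book> read(@PathVariable Long id) {
        Book book = store.get(id);
        return book == null ? ResponseEntity.notFound().build()
                            : ResponseEntity.ok(book);
    }

    @PutMapping("/{id}")               // Update
    public ResponseEntity<Book> update(@PathVariable Long id, @RequestBody Book book) {
        if (!store.containsKey(id)) return ResponseEntity.notFound().build();
        Book updated = new Book(id, book.title());
        store.put(id, updated);
        return ResponseEntity.ok(updated);
    }

    @DeleteMapping("/{id}")            // Delete
    public ResponseEntity<Void> delete(@PathVariable Long id) {
        return store.remove(id) == null ? ResponseEntity.notFound().build()
                                        : ResponseEntity.noContent().build();
    }
}
```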
As a starter this is fine, but if your application is going to face real customers/users and you want to evolve it over time, then you will need to consider concepts like the ones below:
Login/logout security with Spring Security (if stateless, JWT is a good choice).
Evolving code and database together across versions of your software (you can use Liquibase to manage and version DB changes).
Handling exceptions as aspects with Spring.
Having coded business errors so your team can classify and resolve them accordingly (see the sketch after this list).
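To make the last two points concrete, here is a minimal sketch of centralised exception handling with coded business errors, using Spring's `@RestControllerAdvice`. The `ErrorCode`, `BusinessException`, and `ApiError` names are hypothetical, for illustration only.

```java
// Centralised exception handling with coded business errors.
// ErrorCode, BusinessException and ApiError are hypothetical names.
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

enum ErrorCode { ORDER_NOT_FOUND, PAYMENT_DECLINED, VALIDATION_FAILED }

class BusinessException extends RuntimeException {
    private final ErrorCode code;
    BusinessException(ErrorCode code, String message) {
        super(message);
        this.code = code;
    }
    ErrorCode getCode() { return code; }
}

record ApiError(String code, String message) {}

@RestControllerAdvice
class GlobalExceptionHandler {

    // Every business error leaves the API with a stable, searchable code,
    // so the team can classify it and resolve it accordingly.
    @ExceptionHandler(BusinessException.class)
    ResponseEntity<ApiError> handleBusiness(BusinessException ex) {
        return ResponseEntity.status(HttpStatus.UNPROCESSABLE_ENTITY)
                .body(new ApiError(ex.getCode().name(), ex.getMessage()));
    }

    // Fallback handler: never leak stack traces to clients.
    @ExceptionHandler(Exception.class)
    ResponseEntity<ApiError> handleUnexpected(Exception ex) {
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body(new ApiError("INTERNAL_ERROR", "Unexpected error"));
    }
}
```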
For example: you have an IT estate where a mix of batch and real-time data sources exists across multiple systems, e.g. ERP, project management, assets, website, monitoring, etc.
The aim is to integrate the data sources into a cloud environment (agnostic).
There is a need for reporting and analytics on combinations of all data sources.
Inevitably, some source systems are not capable of streaming, hence batch loading is required.
There are potential use cases for performing functionality/changes/updates based on the ingested data.
Given a steer for creating a future-proofed platform, architecturally, how would you look to design it?
It's a very open-ended question, but there are some good principles you can adopt to help steer you in the right direction:
Avoid point-to-point integration, and get everything going through a few common points - ideally one. Using an API gateway can be a good place to start; the big players (Azure, AWS, GCP) all have their own options, plus there are plenty of decent independent ones like Tyk or Kong.
Batches and event streams are totally different, but even so you can still potentially route them all through the gateway so that you get centralised observability (reporting, analytics, alerting, etc.).
Use standards-based API specifications where possible. A good REST-based API, built on a proper resource model, is a non-trivial undertaking, so it may not fit if you are dealing with lots of disparate legacy integrations. If you do adopt REST, use OpenAPI to specify the APIs. Using this standard not only makes things easier for consumers, but also gets you better tooling, as many design, build, and test tools support OpenAPI (see the sketch after these points). There's also AsyncAPI for event/async APIs.
Do some architecture. Moving sh*t to the cloud doesn't remove the sh*t; it just moves it to the cloud. Don't recreate old problems in a new place.
Work out the logical components in your new solution: what does each of them do (what's its reason to exist)? Don't forget ancillary components like API catalogues, etc.
Think about layering the integration, usually depending on how the APIs will be consumed and what role they need to play (e.g. system interfaces, orchestration, experience APIs, etc.).
Want to handle data in a consistent way regardless of source (your 'agnostic' comment)? You'll need to think through how data is ingested and processed. This might lead you into more data / ETL centric considerations rather than integration ones.
Co-design. Is the integration mainly data coming in or going out? Is the integration with 3rd parties or strictly internal?
If you are designing for external / 3rd party consumers then a co-design process is advised, since you're essentially designing the API for them.
If the APIs are for internal use, consider designing them as if for external use, so that if/when you decide to open them up later it's not so hard.
Take a step back:
Continually ask yourselves "what problem are we trying to solve?". Usually, a technology initiative is successful if there's a well-understood reason for doing it, with solid buy-in from the business (non-IT).
Who wants the reporting, and why - what problem are they trying to solve?
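To make the OpenAPI suggestion above concrete, here is a minimal sketch of a Spring controller carrying swagger-annotations, which tools such as springdoc-openapi can turn into an OpenAPI document. The controller and all of its names are hypothetical.

```java
// A controller annotated so an OpenAPI spec can be generated from it
// (e.g. by springdoc-openapi). All names are illustrative only.
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
class AssetController {

    record Asset(String id, String name) {}

    @Operation(summary = "Fetch a single asset by its identifier",
               description = "Part of the system-interface layer over the asset system")
    @ApiResponse(responseCode = "200", description = "Asset found")
    @ApiResponse(responseCode = "404", description = "No asset with that id")
    @GetMapping("/assets/{id}")
    public Asset getAsset(@PathVariable String id) {
        // Hypothetical lookup; a real implementation would call the asset system.
        return new Asset(id, "pump-station-7");
    }
}
```

Whether you generate the spec from code like this or write the spec first and generate code from it, the point is the same: consumers get a contract, and you get to plug into the OpenAPI tooling ecosystem.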
As you mentioned, it's an IT estate, i.e. an enterprise-level solution with a mix of batch and real-time sources, so first you have to identify the end goal of this migration. You can think about refactoring applications. If you are trying to make the estate event-driven, then assess the refactoring effort and cost. Separation of responsibility is the key factor for refactoring and migration.
If you are thinking about future-proofing your solution, then consider the cloud for storing and processing your data. It won't necessarily be cheap, but a mix of cloud and on-prem could be the way to go. Cloud providers offer services to move your data at minimal cost, and cloud-native solutions exist for performing analysis on your data. A database migration service in AWS or Azure can move the data and then capture ongoing changes, so you can keep using your on-prem DB and apps while performing analysis for reporting in the cloud. That will ease the load on your transactional DB. Most data syncs from on-prem to the cloud are near real time.
Either I am really bad at searching, or there is no detailed comparison between App Insights and the ELK stack?
All monitoring is going to be used for a simple Web API; there are going to be tons of endpoints, but user traffic should not be too high.
So my question: are there any general points/differences to consider when choosing between ELK and App Insights? Personally, I've never had a chance to set up either, but before setting up a test environment it would be nice to know in advance what to expect and look for.
I'm from the App Insights team. I think the link provided by @rickvdbosch in a comment gives quite a good perspective. It is over a year old at this point, so some items regarding App Insights have evolved since then.
I think App Insights and ELK are quite different offerings. The former is a managed offering (you can set it up within a couple of minutes), focused on a very broad range of out-of-the-box experiences (collecting incoming/outgoing requests, exceptions, smart alerts, availability monitoring, analytics, live metrics, an application map, and end-to-end transactions across apps).
My understanding of ELK is that it has very powerful UI visualisation and powerful dashboards (though there are adapters for Kibana to work with Azure Monitor). For scenarios where there is a need to store a lot of data (highly loaded apps with adaptive sampling still store a limited amount of data), an ELK solution might be cheaper to run.
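For what it's worth, beyond the out-of-the-box collection mentioned above, App Insights also supports manual tracking through its SDKs. A minimal sketch using the classic Java SDK's `TelemetryClient`, assuming the instrumentation key/connection string is configured outside the code:

```java
// Manual telemetry with the classic Application Insights Java SDK
// (com.microsoft.applicationinsights:applicationinsights-core).
// Assumes the instrumentation key / connection string is configured externally.
import com.microsoft.applicationinsights.TelemetryClient;

public class MonitoringExample {
    private static final TelemetryClient telemetry = new TelemetryClient();

    public static void main(String[] args) {
        telemetry.trackEvent("OrderPlaced");          // custom business event
        try {
            throw new IllegalStateException("demo");
        } catch (IllegalStateException ex) {
            telemetry.trackException(ex);             // shows up under Failures
        }
        telemetry.trackMetric("QueueDepth", 42.0);    // custom metric
        telemetry.flush();                            // push before process exit
    }
}
```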
The final decision was to use ELK, as the servers already have all the configuration in place, because another team uses it, and mainly because the logging will need a lot of customisation.
I work in DevOps for a fairly large company that is in the process of transitioning to microservices. This is a new area for most of the people involved, and some of the governing requests seem like bad practice to me, but I don't have the expertise to argue otherwise.
The request is to generate a report before deploying that would list any new APIs/events (Kafka is our messaging service) in a microservice.
The path being recommended is for devs to follow a style guide and then scrape the source code during the CI/CD pipeline to generate a report that can be compared with previous reports to identify any new APIs.
This seems backwards and unsustainable, but I've been unable to find another solution that would satisfy their request. I've recommended deploying to dev first and then using a tracing tool to identify any API changes or event subscriptions, but they insist on having the report before deploying.
I'm hoping for any advice on best practice to accomplish this.
Tracing and detecting version changes is definitely over-engineering. What's simpler, as @zenwraight has mentioned, is to version your APIs. While tracing through services to explore the different versions and schemas could be a potential solution, it requires a lot more investment up front, and if that's not the bread and butter of the company, I would rather use a vendor product that supports something like this.
If discovery is a mechanism that is needed, I would recommend something that publishes internal API docs using a tool like Swagger, so that you can search whether there's an API you can consume.
And finally, to support moving between versions, I would recommend having an API onboarding process for the services, so that teams can notify the consumers of specific versions that their services are coming to the end of their lifecycle and that they will need to migrate to newer ones.
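If the pre-deployment report really is non-negotiable, one comparatively low-effort approach (assuming each service publishes an OpenAPI spec, e.g. via the Swagger tooling mentioned above) is to diff the spec built in CI against the one from the last release, rather than scraping source code. A rough sketch using the swagger-parser library; the file locations are hypothetical and would come from the CI workspace:

```java
// Rough sketch: diff two OpenAPI specs and report operations added since the
// last release. Uses io.swagger.parser.v3:swagger-parser; the file paths
// are hypothetical placeholders.
import io.swagger.v3.oas.models.OpenAPI;
import io.swagger.v3.parser.OpenAPIV3Parser;
import java.util.HashSet;
import java.util.Set;

public class ApiReport {

    public static void main(String[] args) {
        Set<String> previous = operations("previous-release/openapi.yaml");
        Set<String> current = operations("build/openapi.yaml");

        current.removeAll(previous); // keep only newly added operations
        if (current.isEmpty()) {
            System.out.println("No new API operations since the last release.");
        } else {
            System.out.println("New API operations:");
            current.forEach(op -> System.out.println("  " + op));
        }
    }

    // Flatten a spec into "METHOD /path" strings for a simple set comparison.
    private static Set<String> operations(String specLocation) {
        Set<String> ops = new HashSet<>();
        OpenAPI api = new OpenAPIV3Parser().read(specLocation);
        if (api == null || api.getPaths() == null) {
            return ops; // spec missing or empty
        }
        api.getPaths().forEach((path, item) ->
                item.readOperationsMap().keySet().forEach(method ->
                        ops.add(method + " " + path)));
        return ops;
    }
}
```

Kafka event contracts could get the same treatment with AsyncAPI documents.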
As per the title. I don't know if this is the right place or way to ask this; admins, feel free to edit/move/close the question if appropriate.
I'd like to get pointers to recent material clarifying the market trends, as well as real-life examples. Even pseudo-pundit, Gartner-like stuff is OK. Thanks.
I am curious about the second part of the question. What is the basis of your statement that 'the ESB thing' appears to be fading? I don't believe it is.
The problem with ESBs, however, is that some vendors call their product an ESB when it is actually much, much more than that. In some companies this happened to their integration product just because Gartner or some other analyst firm said that ESBs were hot. The marketing strategy changed: the product is called an ESB, and maybe some things are added that are expected in an ESB.
Paul Fremantle of WSO2 wrote a very good article about what an ESB really is [1].
As for OSGi: the first company I saw using it in their middleware was WSO2. I have heard that TIBCO, another middleware vendor, is also moving, or has moved, towards using it in their Active Matrix platform.
OSGi may help in various ways. The most important is that it decreases the effort of installing the platform: install a minimum on each system used to deploy the application, and during deployment the components required to run the application will be added. You do not have to worry about having installed the right plug-ins, add-ons, and whatnot. This is what both WSO2 and TIBCO are doing.
With some vendors, you see that you need to install an awful amount of software, of which you may end up using just a small part (e.g. IBM WebSphere). Because of this, you may have to use over-dimensioned systems, which adds extra cost.
OSGi may prevent this.
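For context on why: OSGi's unit of deployment is a bundle, a JAR whose manifest declares what it imports and exports, so the container can resolve and install only what an application actually needs. A minimal sketch of a bundle's lifecycle hook:

```java
// A minimal OSGi bundle activator: the framework calls start/stop when the
// bundle is installed into or removed from the running container.
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {

    @Override
    public void start(BundleContext context) {
        // Register services, wire up dependencies resolved by the framework.
        System.out.println("Bundle started: " + context.getBundle().getSymbolicName());
    }

    @Override
    public void stop(BundleContext context) {
        // Release resources; the rest of the platform keeps running.
        System.out.println("Bundle stopped: " + context.getBundle().getSymbolicName());
    }
}
```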
Have a look at the presentation of WSO2 about the WSO2 Carbon platform [2].
The statement at the end of the presentation says it all:
Adapt the middleware to your architecture, not the architecture to the middleware
So yes, I think OSGi has a future in enterprise apps.
[1] http://wso2.org/library/2913
[2] http://www.slideshare.net/wso2.org/the-carbon-story-presentation-855666
Disclaimer:
I am in no way affiliated with WSO2, TIBCO or IBM. I am a certified TIBCO BusinessWorks Developer and have been developing applications for the IBM WebSphere Process Server platform. Above all, I am a WSO2 Enthusiast.
I would say yes. WSO2 is proof of that. Check the following links:
http://osgi.dzone.com/articles/carbon-osgi-and-soa
http://www.infoworld.com/d/developer-world/wso2-upgrades-osgi-middleware-695