VDM OData version compatibility - s4sdk

A conceptual question about VDM usage. Assume my OData service evolves in an SAP S/4HANA Cloud system and I am consuming it in a microservice. Since the VDM needs the EDMX file to generate entity classes, assume my OData service gains a new field or drops a field that I do not use. If I do not update my EDMX and do not regenerate the classes, will my call still work? Second question: if one of the fields I do use changes and I need to ensure zero downtime, how do I handle two versions of the generated classes at the same time?

The generated OData VDM ultimately performs an OData call based only on the fields that are actually used. So if you do not use the fields that were removed, this should not be a problem. Note, however, that such removals would have to be made in a new version of the SAP S/4HANA service.
Since breaking changes affect all consumers, independent of whether the Java or JavaScript VDM of the SAP S/4HANA Cloud SDK is used, developers of services in SAP S/4HANA have to follow an API guideline that includes specific deprecation rules.
So, if a breaking change is really required, then according to the S/4HANA API guideline a new version of the service has to be published, and it will be made available under a different URL. This gives you the possibility to migrate from the old version to the new one without interruptions.
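To see why unused fields do not matter, here is a minimal, self-contained sketch (plain Java, deliberately not the real SAP Cloud SDK API; all names are illustrative) of how a typed OData query only serializes the fields the caller selects:

```java
import java.util.List;

// Sketch of how a typed OData query builder only puts the selected
// fields into the request URL. A field that was removed server-side
// but never selected is simply absent from the request.
public class ODataQuerySketch {

    static String buildQueryUrl(String servicePath, String entitySet, List<String> selectedFields) {
        // Only explicitly selected fields end up in $select.
        return servicePath + "/" + entitySet + "?$select=" + String.join(",", selectedFields);
    }

    public static void main(String[] args) {
        String url = buildQueryUrl(
                "/sap/opu/odata/sap/API_BUSINESS_PARTNER", // hypothetical service path
                "A_BusinessPartner",
                List.of("BusinessPartner", "FirstName"));
        System.out.println(url);
        // → /sap/opu/odata/sap/API_BUSINESS_PARTNER/A_BusinessPartner?$select=BusinessPartner,FirstName
    }
}
```

The real VDM builds the request the same way in spirit: the wire format only ever mentions what your code references.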

Related

Different polling delay for different suppliers in Spring Cloud Stream Function

I'm trying to implement suppliers using Spring Cloud Function and Kafka. I need one supplier to publish every 10 seconds and another to publish every 30 seconds. From the documentation I can see that I can change the delay using the spring.cloud.stream.poller.fixed-delay property.
But I need to set a different delay for each topic. Is there any way to do it?
From the spring-cloud-function perspective there isn't any kind of polling, as that is not the responsibility of the framework.
From the spring-cloud-stream perspective, which uses spring-cloud-function, there is indeed the mechanism you have described. However, keep in mind that spring-cloud-stream is primarily designed to support the concept of microservices (it is not a general messaging framework), and in microservices we embrace the "do one thing, but do it well, without affecting others" approach. So having more than one supplier kind of goes against this model.
If you are building a general-purpose messaging app, then I'd suggest using the Spring Integration framework, which provides all the necessary hooks to accomplish what you need, but will require a bit more configuration detail.
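The underlying idea, two independent suppliers on their own schedules, can be sketched with a plain ScheduledExecutorService (delays shortened from the question's 10s/30s so the sketch finishes quickly; this illustrates the concept, not the Spring Integration API):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Two "suppliers" publishing on independent fixed delays, which is
// exactly what a single global poller property cannot express.
public class TwoSuppliers {

    static boolean fastOutpacesSlow() throws InterruptedException {
        ScheduledExecutorService pool = Executors.newScheduledThreadPool(2);
        AtomicInteger fast = new AtomicInteger();
        AtomicInteger slow = new AtomicInteger();
        // Each supplier gets its own fixed delay, independent of the other.
        pool.scheduleWithFixedDelay(fast::incrementAndGet, 0, 50, TimeUnit.MILLISECONDS);
        pool.scheduleWithFixedDelay(slow::incrementAndGet, 0, 150, TimeUnit.MILLISECONDS);
        Thread.sleep(400);
        pool.shutdownNow();
        return fast.get() > slow.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("fast supplier fired more often: " + fastOutpacesSlow());
    }
}
```

Spring Integration's poller abstraction wires the equivalent scheduling declaratively, per flow rather than globally.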

Versioning cross micro service dependencies

Architecture overview
Consider the following simplified microservice architecture:
Service A - Portal
Service B - API
Service A depends on Service B.
Problem statement
If I want to build a new backward-compatible feature in service A that requires changes to service B, then by the definition of semantic versioning I'll have to create a new minor version of both services. However, service A requires the new minor version of service B to be deployed. How can I effectively manage this dependency? Do I need to create a new major version of service A to signal the changed dependency? I want to avoid service A being deployed while service B hasn't been deployed yet...
So basically; how should I version changes which are non-breaking (i.e. minor) in components itself, but will break the overall application if versions don't match?
You don't need to break the existing API contracts. Let's call your existing API (service B) myapi/v1/products. You won't change anything in the existing API or this endpoint at all. You will create a new version, call it myapi/v2/products, and deploy it. It's your choice where and how you host that endpoint. This endpoint has all the latest changes you want. Your portal is not affected yet.
Now you deploy the portal and use myapi/v1/ for features that require backward compatibility and myapi/v2/ for new features. This way you can manage API versioning without breaking features.
Hope that helps!
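The side-by-side versioning described above can be sketched as a simple path-prefix dispatch (plain Java with hypothetical payloads; in practice the two versions may even be deployed as separate services):

```java
import java.util.Map;
import java.util.function.Function;

// Sketch of side-by-side API versions: /v1 keeps the old contract,
// /v2 carries the new one, and both stay deployed at the same time.
public class VersionedApi {

    static final Map<String, Function<String, String>> ROUTES = Map.of(
            "myapi/v1/products", id -> "{\"id\":\"" + id + "\"}",              // old contract, untouched
            "myapi/v2/products", id -> "{\"id\":\"" + id + "\",\"stock\":0}"); // new contract

    static String handle(String path, String id) {
        return ROUTES.getOrDefault(path, x -> "404").apply(id);
    }

    public static void main(String[] args) {
        // The portal calls v1 for existing features and v2 for new ones.
        System.out.println(handle("myapi/v1/products", "42"));
        System.out.println(handle("myapi/v2/products", "42"));
    }
}
```

Because v1 never changes, the portal can be deployed before or after the v2 backend without breaking existing features.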

Data Migration using Spring

We are beginning the process of re-architecting the systems within our company.
One of the key components of the work is a new data model which better meets our requirements.
A major part of the initial phase of the work is to design and build a data migration tool.
This will take data from one or more existing systems and migrate it to the new model.
Some requirements:
Transformation of data to the new model
Enrichment of data, with default values or according to business rules
Integration with existing systems to pull data
Integration with Salesforce CRM which is being introduced into the company.
Logging and notification about failures
Within the Spring world, which is the best Spring project to use as the underlying framework for such a data migration tool?
My initial thoughts are to look at implementing the tool using Spring Integration.
This would:
Through the XML or DSL, allow the high-level data flow to be seen, understood, and edited (possibly using a visual tool such as an STS plugin). Being able to view the high-level flow in this way is a big advantage.
Provide connectors to work with different data sources.
Allow transformer components to be built to migrate data formats.
Use routers to route the data in the new model to endpoints that connect with other systems.
However, are there other Spring projects, such as Spring Data or Spring Batch, which are a better match for the requirements?
Very much appreciate feedback and ideas.
I would certainly start with spring-integration, which provides a bare-bones implementation of the Enterprise Integration Patterns that are at the core of most, if not all, of the requirements you listed.
It is also an exceptionally good problem-modelling tool, which helps you better understand the problem and then envision its implementation as one cohesive integration flow.
Later on, once you have a clear understanding of how things work, it would be simple to take it to the next level by introducing the other frameworks you mentioned/tagged, such as spring-cloud-data-flow and spring-cloud-stream.
Overall this question is rather broad, so consider following the above pointers to get started, and then raise more concrete questions.

protobuf version compatibility

I am using a cloud service which is built on protobuf 2.0. This cloud service cannot be changed.
We now have a client that connects to this cloud service, built on .NET Core 2.0.
As far as I have tested, .NET Core only works with protobuf 3.0 syntax.
And the 3.0 syntax is a little different from 2.0. If I deploy a client with protobuf 3.0 in C# on .NET Core 2.0, can it consume the service built on protobuf 2.0?
The actual binary serialization format hasn't changed at all in this time, so there are no fundamental blockers.
The biggest feature difference between proto2 and proto3 is the treatment of default / optional values. Proto3 has no concept of "the default value is 4" (defaults are always zero/nil), and has no concept of explicitly specifying a value that happens to also be the default value (non-zero values are always sent, zeros are never sent). So if your proto2 schema makes use of non-zero defaults, it can be awkward to transition.
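That pitfall can be made concrete with a small simulation: a proto3-style encoder never puts zero values on the wire, so a proto2-style reader with a non-zero field default fills in that default where the sender actually meant zero (plain Java, purely illustrative of the semantics, not a real protobuf codec):

```java
import java.util.HashMap;
import java.util.Map;

// Simulates the proto2-vs-proto3 default-value mismatch. The proto3-style
// writer omits zero values from the wire; the proto2-style reader, whose
// schema declares a default of 4, then reads the default instead of zero.
public class DefaultValuePitfall {

    // proto3-style encode: zero means "not set", so it is never written.
    static Map<String, Integer> encodeProto3(int retries) {
        Map<String, Integer> wire = new HashMap<>();
        if (retries != 0) wire.put("retries", retries);
        return wire;
    }

    // proto2-style decode with a hypothetical schema default of 4 for "retries".
    static int decodeProto2(Map<String, Integer> wire) {
        return wire.getOrDefault("retries", 4);
    }

    public static void main(String[] args) {
        // The sender meant 0, but the receiver sees its schema default.
        System.out.println(decodeProto2(encodeProto3(0))); // prints 4, not 0
        System.out.println(decodeProto2(encodeProto3(7))); // prints 7
    }
}
```

If the proto2 schema only uses zero defaults, this mismatch never materializes and the two sides interoperate cleanly.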
As far as I have tested, .NET Core only works with protobuf 3.0 syntax.
That statement makes me think you're not using protobuf-net (per the tags), but are in fact using Google's C# implementation - Jon's original port was proto2 only, and the version migrated into the Google codebase is proto3 only. However, protobuf-net (a separate implementation) has no such limitation and supports both proto2 and proto3 on all platforms, including .NET Core. It does have a different API, however. protobuf-net can be found on NuGet, with a .proto processing tool available here (it also provides access to all the "protoc" outputs if you want to compare with the Google version).

How to Implement Database Independence with Entity Framework

I have used the Entity Framework to start a fairly simple sample project. In the project, I have created a new Entity Data Model from a SQL Server 2000 database. I am able to query the data using LINQ to Entities and display values on the screen.
I have an Oracle database with an extremely similar schema (I am trying to be exact, but I do not know all the details of Oracle). I would like my project to be able to run on both the SQL Server and Oracle data stores with minimal effort. I was hoping that I could simply change the connection string of my Entity Data Model and the Entity Framework would take care of the rest. However, it appears that it will not work as seamlessly as I thought.
Has anyone done what I am trying to do? Again, I am trying to write an application that can query (and update) data from either a SQL Server or an Oracle database with minimal effort using the Entity Framework. A secondary goal is to not have to re-compile the application when switching between data stores. If I have to "Update Model from Database", that might be OK because I wouldn't have to recompile, but I'd prefer not to go that route. Does anyone know of any steps that might be necessary?
What is generally understood by the term "Persistence Ignorance" is that your entity classes are not flooded with framework dependencies (important for N-tier scenarios). This is not the case right now, as entity classes must implement certain EF interfaces ("IPOCO"), as opposed to being plain old CLR objects. As another poster has mentioned, there is a solution called the Persistence Ignorance (POCO) Adapter for Entity Framework V1 for that, and EF V2 will support POCO out of the box.
But I think what you really had in mind was database independence. With one big configuration XML that includes the storage model, the conceptual model, and the mapping between the two, from which a typed ObjectContext is generated at design time, I also find it hard to imagine how to transparently support two databases.
What probably looks more promising is applying a database-independent ADO.NET provider like the one from DataDirect. DataDirect has also announced EF support for Q3/2008.
http://blogs.msdn.com/jkowalski/archive/2008/09/09/persistence-ignorance-poco-adapter-for-entity-framework-v1.aspx
The main problem is that the entity framework was not designed with persistence ignorance in mind. I would honestly look at using something other than entity framework.