How do I access an OData service using the SAP S/4HANA Cloud SDK in an existing application?

I have a Java application on SAP Cloud Platform Cloud Foundry that integrates with SAP S/4HANA Cloud (my company's ERP system) by calling APIs (OData services) in that system. I heard about the SAP S/4HANA Cloud SDK and that it makes such scenarios much easier.
How can I leverage the SAP S/4HANA Cloud SDK? Currently, my code to call SAP S/4HANA looks like this (simplified and joined together) for the scenario of retrieving product master data. I have created the S4Product class myself as a representation of the response. The baseUrl and authHeader are determined beforehand by talking to the destination service on SAP Cloud Platform.
StringBuilder url = new StringBuilder(baseUrl);
url.append("/sap/opu/odata/sap/API_PRODUCT_SRV/A_Product");
url.append("&$select=Product,CreationDate");
url.append("&$filter=ProductType eq '1'");
url.append("&$top=10");
URL urlObj = new URL(url.toString());
HttpURLConnection connection = (HttpURLConnection) urlObj.openConnection();
connection.setRequestMethod("GET");
connection.setRequestProperty("Content-Type", "application/json");
connection.setRequestProperty("Accept", "application/json");
connection.setRequestProperty("Authorization",authHeader);
connection.setDoInput(true);
final InputStreamReader in = new InputStreamReader(connection.getInputStream());
String response = CharStreams.toString(in);
List<S4Product> result = Arrays.asList(new Gson().fromJson(response, S4Product[].class));
Now I'm asked to do something similar with business partners. How do I do this for the business partner OData service, using the SDK? Do I have to create a new application if I want to use the SDK?

With the Java virtual data model of the SAP S/4HANA Cloud SDK, your code would be replaced by something like the following.
final List<Product> products = new DefaultProductMasterService()
.getAllProduct()
.select(Product.PRODUCT, Product.CREATION_DATE)
.filter(Product.PRODUCT_TYPE.eq("1"))
.top(10)
.execute();
This handles everything you have done manually before, in a fluent and type-safe API. In this case, the class Product is provided by the SAP S/4HANA Cloud SDK, so there is no need to create it yourself. It is a Java representation of the entity type, with all of its fields, which we use to define the select and filter query options.
And for your question about business partners, it would look quite similar to this.
final List<BusinessPartner> businessPartners = new DefaultBusinessPartnerService()
.getAllBusinessPartner()
.select(BusinessPartner.BUSINESS_PARTNER /* more fields ... */)
// example filter
.filter(BusinessPartner.BUSINESS_PARTNER_CATEGORY.eq("1"))
.execute();
BTW, this also covers talking to the destination service and applying authentication headers - you no longer need to do this manually.
You can use the SAP S/4HANA Cloud SDK in any Java project. Just include the dependencies com.sap.cloud.s4hana.cloudplatform:scp-cf (for Cloud Foundry) and com.sap.cloud.s4hana:s4hana-all.
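For example, in a Maven project those two dependencies could be declared roughly as below; the version property is just a placeholder for whichever SDK release you use.
<!-- Sketch of the dependencies mentioned above; ${s4hana.sdk.version} is a placeholder. -->
<dependency>
    <groupId>com.sap.cloud.s4hana.cloudplatform</groupId>
    <artifactId>scp-cf</artifactId>
    <version>${s4hana.sdk.version}</version>
</dependency>
<dependency>
    <groupId>com.sap.cloud.s4hana</groupId>
    <artifactId>s4hana-all</artifactId>
    <version>${s4hana.sdk.version}</version>
</dependency>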

Related

How to leverage Armeria's fantastic JSON to gRPC transcoding in a Spring Boot project

We have an existing Spring Boot project with a terrible API management system, so we want to do something like what grpc-gateway does, but we don't want to add a sidecar to our existing service. We found that Armeria has a wonderful JSON-to-gRPC transcoding function. How do we leverage this in our existing Spring Boot project?
We found that Armeria has a wonderful JSON-to-gRPC transcoding function.
I guess a minimal example may look like the following:
final GrpcService grpcService = GrpcService.builder()
        .addService(new MyGrpcService())      // your gRPC service implementation
        .enableHttpJsonTranscoding(true)      // enable HTTP/JSON transcoding
        .build();

final ServerBuilder sb = Server.builder();
sb.http(8080);                                // port to listen on
sb.service(grpcService)                       // register the gRPC service and its routes
  .serviceUnder("/foo", grpcService);         // optionally also mount it under a path prefix
final Server server = sb.build();

Runtime.getRuntime().addShutdownHook(new Thread(() -> server.stop().join()));
server.start().join();                        // start the server
How do we leverage this in our existing Spring Boot project?
Armeria also offers Spring Boot integration. An example can be found in the following repository.
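For illustration, a minimal sketch of that integration with the Armeria Spring Boot starter might look like the following (class names are from com.linecorp.armeria:armeria-spring-boot2-starter as I recall them; MyGrpcService is the service from the snippet above):
import com.linecorp.armeria.server.grpc.GrpcService;
import com.linecorp.armeria.spring.ArmeriaServerConfigurator;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ArmeriaGrpcConfig {

    // The starter manages the embedded Armeria server; this bean customizes it.
    @Bean
    public ArmeriaServerConfigurator armeriaServerConfigurator() {
        return serverBuilder -> serverBuilder.service(
                GrpcService.builder()
                        .addService(new MyGrpcService())  // your existing gRPC service
                        .enableHttpJsonTranscoding(true)  // expose HTTP/JSON endpoints too
                        .build());
    }
}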
You can also ask on Slack or in the GitHub issues if you have any additional or follow-up questions.

How can an ff4j client consume the REST API exposed by an ff4j server (Spring MVC)?

I have used ff4j-sample-springboot2x-jdbc-postgres to set up my ff4j server along with the DB part, and exposed the API, accessible at "/api/ff4j". This is my admin component.
Now I want a client microservice (also a Spring Boot app) to consume this REST API and use ff4j.check() and other methods. Is there a quick way (less code; I need to do this for many apps) to consume the API on the client side?
P.S. For the server I have simply used a DB and ff4j-spring-starter (just like in the sample); I have not used any Jersey/Jetty dependencies.
The client accesses the FF4j service using its own FF4j object.
When you create an FF4j object, you define the FeatureStore and PropertyStore that it uses, which determines how the store is accessed. In your case, you want to use a FeatureStoreHttp and a PropertyStoreHttp.
// Point both HTTP stores at the REST API exposed by the admin component.
String ff4jApiEndPoint = "https://my-ff4j-service.org/api/ff4j";
FeatureStoreHttp fStore = new FeatureStoreHttp(ff4jApiEndPoint);
PropertyStoreHttp pStore = new PropertyStoreHttp(ff4jApiEndPoint);
// The client-side FF4j instance delegates feature and property lookups to the remote API.
FF4j client_ff4j = new FF4j(fStore, pStore);
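Once that is in place, the client microservice can evaluate flags as if the store were local; each call is resolved against the remote /api/ff4j endpoint. The feature name below is just an example:
// Hypothetical feature id; check() goes over HTTP to the admin component's REST API.
if (client_ff4j.check("new-checkout-flow")) {
    // feature-flagged code path
}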

Create Kafka connector during startup without REST API

I have to leverage a Kafka connector as a source, but creating the connector via the REST API in production is something I want to avoid. Is it possible to create the connector during startup without using the REST API?
I understand the REST API provides the flexibility to dynamically create/configure connectors and size the tasks, but I really want to know if the same can be done during startup by providing configuration instead.
Currently I start the connector in distributed mode by supplying a properties file, and I want to specify the database, filters, transformers, and other details directly in that file.
Let me know if there is a way to achieve this.

Spring Cloud Connector Plan Information

I am using Spring Cloud Connector to bind to databases. Is there any way to get the plan of the bound service? When I extend an AbstractCloudConfig and do
cloud().getSingletonServiceInfosByType(PostgresqlServiceInfo.class)...
I get information on the URL and how to connect to the Postgres database. PostgresqlServiceInfo and the others do not carry along the plan data. How can I extend the service info in order to read this information from VCAP_SERVICES?
Thanks
By design, the ServiceInfo classes in Spring Cloud Connectors carry just enough information to create the connection beans necessary for an app to consume the service resources. Connectors was designed to be platform-neutral, and fields like plan, label, and tags that are available on Cloud Foundry are not captured because they might not be available on other platforms (e.g. Heroku).
To add the plan information to a ServiceInfo, you'd need to write your own ServiceInfo class that includes a field for the value, then write a CloudFoundryServiceInfoCreator to populate the value from the VCAP_SERVICES data that the framework provides as a Map. See the project documentation for more information on creating such an extension.
Another (likely easier) option is to use the newer java-cfenv project instead of Spring Cloud Connectors. java-cfenv supports Cloud Foundry only, and gives access to the full set of information in VCAP_SERVICES. See the project documentation for an example of how you can use this library.
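For instance, a minimal java-cfenv sketch could look like the following (the service label and method names are as I recall them from the library; adjust to your actual binding):
import io.pivotal.cfenv.core.CfEnv;
import io.pivotal.cfenv.core.CfService;

public class PlanLookup {
    public static void main(String[] args) {
        CfEnv cfEnv = new CfEnv();                                   // parses VCAP_SERVICES
        CfService service = cfEnv.findServiceByLabel("postgresql");  // assumed service label
        System.out.println("plan: " + service.getPlan());            // the field Connectors drops
        System.out.println("uri:  " + service.getCredentials().getUri());
    }
}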

Azure alternative to Spring Cloud Data Flow process

I'm looking for the Azure alternative to the Data Flow model of source-processor-sink.
I want the three entities to be separate microservices, and I want to use messaging as the link between them.
Basically, the source app takes the data from another service and sends it to the processor, while the processor app acts on it and sends a relevant notification/alert to the sink.
I'm aware I can use RabbitMQ for the messaging, but I need to know which one will be better in Azure, Service Bus topics or Event Hubs, and how can I use them?
At the moment, there isn't a Spring Cloud Stream binder implementation for Azure Event Hubs.
Unless we have this, neither the out-of-the-box nor custom apps can be built as messaging microservices, where Spring Cloud Stream provides the programming model and Spring Cloud Data Flow lets you orchestrate the individual microservices into a data pipeline (i.e., source-processor-sink) via the DSL or drag-and-drop GUI.
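For reference, a minimal processor in that programming model might look like the sketch below (the annotation-based Spring Cloud Stream API of that era; the String payload and alert logic are placeholders):
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.messaging.handler.annotation.SendTo;

// Receives messages from the source over the binder (e.g. RabbitMQ) and emits to the sink.
@EnableBinding(Processor.class)
public class AlertProcessor {

    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String process(String measurement) {
        return "ALERT: " + measurement;   // placeholder alerting logic
    }
}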
Microsoft was exploring the binder implementation in the past; possibly it would end up in Azure Spring Boot project. Feel free to drop an issue on their backlog.
