Is there a way to update connector configuration using the MSK Connect API? - apache-kafka-connect

I have an S3 connector deployed on MSK Connect, and a repository on GitHub with the JSON connector configuration file. I'd like to update the connector's configuration on demand via MSK's REST API. I've checked the API documentation, but it seems the UpdateConnector API only allows modifying the capacity configuration. The CreateConnector API does allow providing a connector configuration, but it returns an error if the connector already exists.
I could delete and then recreate the connector, but this doesn't seem like a good approach.
Is there another way to update a running connector configuration?

If the Connect REST API is not directly accessible in other ways, then it seems that delete/recreate is the only option.
For sink connectors, that's a relatively safe option because consumer offsets are tracked by the connector name itself, and there's no state stored outside of the internal Connect topics.
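If you end up scripting the delete/recreate flow against the AWS API, a minimal sketch using boto3's kafkaconnect client might look like the following. The connector name, config file path, and the remaining required create_connector fields (capacity, cluster settings, plugins, service execution role) are placeholders to fill in from your own setup:

# Sketch of the delete/recreate approach with boto3's "kafkaconnect" client.
# Names, ARNs, and the contents of create_kwargs are placeholders.
import json
import time

import boto3

client = boto3.client("kafkaconnect")

def recreate_connector(name, config_path, create_kwargs):
    """Delete an existing connector, wait for it to disappear, then recreate it."""
    # Look up the connector ARN by name.
    connectors = client.list_connectors(connectorNamePrefix=name)["connectors"]
    for c in connectors:
        if c["connectorName"] == name:
            client.delete_connector(connectorArn=c["connectorArn"])
            # Poll until deletion finishes; CreateConnector fails while it still exists.
            while any(
                x["connectorName"] == name
                for x in client.list_connectors(connectorNamePrefix=name)["connectors"]
            ):
                time.sleep(15)
            break

    with open(config_path) as f:
        connector_config = json.load(f)  # the JSON config kept in the GitHub repo

    # create_kwargs must supply the other required fields: capacity,
    # kafkaCluster, kafkaClusterClientAuthentication,
    # kafkaClusterEncryptionInTransit, kafkaConnectVersion, plugins,
    # and serviceExecutionRoleArn.
    client.create_connector(
        connectorName=name,
        connectorConfiguration=connector_config,
        **create_kwargs,
    )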

Based on an answer I got from AWS, it's indeed not supported at the moment. It's on their roadmap, but they don't have an ETA yet.
They suggest following the information here for upcoming improvements.

Related

How to read a file on a Windows shared folder from Spring Boot?

I am confused and have googled everything, but there's no answer:
I have an Excel file stored somewhere like this on Windows; it's a shared file under 'Network':
\\[serverName]\[folderName]\[folderName]\[folderName]\[folderName]\ZNAC.XLSX
I can only read/download the file from there.
Everything works fine when I read it locally; it works both via SMB and by declaring the file path directly as an InputStream.
But when I deploy to SAP Cloud Foundry, it always ends with a FileNotFoundException, and I've tried a lot of approaches with no change.
I am wondering if the cloud instance is looking for the file internally rather than externally.
I tried SMB as well, but it doesn't work.
I found there is something called 'volume services' on Cloud Foundry, but it's not usable in SAP Cloud Foundry.
Can anyone help me make my application read an external file from SAP Cloud Foundry?
To read a file from an external share, you must first create a volume service for the corresponding share (NFS or SMB) and start it.
Then you must bind the service instance to the CF app like this:
cf bind-service YOUR-APP SERVICE-NAME -c '{"uid":"UID","gid":"GID","mount":"OPTIONAL-MOUNT-PATH","readonly":true}'
The detailed guide is here
https://docs.cloudfoundry.org/devguide/services/using-vol-services.html#smb
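Once the service is bound, the path where the share is mounted inside the container is published in the app's VCAP_SERVICES environment variable. As a minimal sketch (the label and instance names depend on the volume service you created), you can locate it like this:

# Minimal sketch: find the mounted share's container path from VCAP_SERVICES.
# The label/name of the bound volume service depends on your own setup.
import json
import os

vcap = json.loads(os.environ["VCAP_SERVICES"])
for label, instances in vcap.items():
    for instance in instances:
        for mount in instance.get("volume_mounts", []):
            # container_dir is where the share is mounted inside the container
            print(label, instance["name"], mount["container_dir"])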
SAP Cloud Platform / SAP BTP does not have a service that allows you to access SMB drives. One possibility would be to use an SMB/Samba Java client library and configure the firewall / SAP Cloud Connector accordingly. We once implemented something like that, but there are some challenges along the way.
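Since the app is Spring Boot, a Java SMB library such as jCIFS or SMBJ would be the natural fit; purely as an illustration of the same idea, here is a sketch in Python using the third-party smbprotocol package (server, share, credentials, and path are placeholders):

# Illustration only: reading a file over SMB with the third-party
# "smbprotocol" package (pip install smbprotocol). Server, share,
# credentials, and path below are placeholders.
import smbclient

smbclient.register_session("serverName", username="user", password="secret")

with smbclient.open_file(r"\\serverName\folderName\ZNAC.XLSX", mode="rb") as f:
    data = f.read()

print(f"read {len(data)} bytes")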
Another, easier possibility would be to create an on-premise service (e.g. REST) that allows you to access the files. This service needs to be made available to SCP as well, for example through SAP API Management.

Personalized Microsoft Teams connector config page

Question
How can I make my Microsoft Teams connector config page look like the ones below?
Details
I'm creating a Teams app with a connector. The documentation explains how to make your own config page (for example here and here). It works for me 👍.
But in the app directory, I saw multiple existing connectors that share the same config page structure (name of the connector, then webhook, then installation instructions).
This structure would suit me well, but I haven't found how to configure my connector to use that template.
Thanks for the help 😀.

Please migrate off JSON-RPC and Global HTTP Batch Endpoints - Dataflow Template

I received an email with the title above as the subject. It says it all. I'm not directly using the specified endpoint (storage#v1). The project in question is a postback catcher that funnels data into BigQuery:
App Engine > Pub Sub > Dataflow > Cloud Storage > BigQuery
A related question here indicates that Dataflow might be using it indirectly. I'm only using the Cloud PubSub to GCS Text template.
What is the recommended course of action if I'm relying on a template?
I think the warning may come from a Dataflow job that uses an old version of the storage API. Please upgrade your Dataflow/Beam SDK version beyond 2.5.
Since you're using the PubsubToText template, the easiest way to do it would be:
Stop your pipeline. Be sure to select "Drain" when asked.
Relaunch the pipeline using the newest version (which happens automatically if you're using the UI), from the same subscription.
Check the SDK version. It should be at least 2.7.
After that you should not see any more warnings.
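If you'd rather script the drain/relaunch than go through the UI, a minimal sketch with the Google API Python client could look like the following. The project, region, job ID, bucket, and the template's parameter names are assumptions to verify against the template documentation:

# Sketch: drain the running job, then relaunch the Pub/Sub-to-GCS-Text
# template at its latest version. Project, region, job ID, bucket, and
# parameter names are placeholders/assumptions -- verify them against
# the template docs before use.
from googleapiclient.discovery import build

dataflow = build("dataflow", "v1b3")
project, region, job_id = "my-project", "us-central1", "existing-job-id"

# 1. Drain the current pipeline (equivalent to selecting "Drain" in the UI).
dataflow.projects().locations().jobs().update(
    projectId=project,
    location=region,
    jobId=job_id,
    body={"requestedState": "JOB_STATE_DRAINED"},
).execute()

# 2. Relaunch from the Google-provided template; "latest" picks up the
#    newest release, which brings the newer SDK with it.
dataflow.projects().locations().templates().launch(
    projectId=project,
    location=region,
    gcsPath="gs://dataflow-templates/latest/Cloud_PubSub_to_GCS_Text",
    body={
        "jobName": "pubsub-to-gcs-relaunched",
        "parameters": {
            "inputTopic": "projects/my-project/topics/my-topic",
            "outputDirectory": "gs://my-bucket/output/",
            "outputFilenamePrefix": "postback-",
        },
    },
).execute()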

How to create a .audit file to upload a custom Amazon AWS audit for Audit Cloud Infrastructure

I am new to Nessus Audit Cloud Infrastructure. I have infrastructure on the AWS cloud with Unix-based machines.
Audit Cloud Infrastructure requires an audit file; without one, the following error appears:
Error (400): At least one audit must be added to this policy in the
'Compliance' section.
I have tried to find a sample .audit file that I can import for my policy, but I haven't found anything.
According to the slides below, there is a tool called i2a, but I haven't been able to get it, nor have I been able to reach the support portal (maybe because I am using the trial version for now):
https://www.slideshare.net/jderienzo/create-a-nessus-audit-file-30230893
If anyone has a source of .audit files, please share it, especially for Unix. Or at least provide a link/video etc. so I can create my own .audit file.
I have tried the link below, but it didn't work for me:
https://www.tenable.com/blog/version-2-of-windows-compliance-checks-available-for-testing
Any help is appreciated in advance!!
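For orientation while searching: .audit files are plain text in Tenable's compliance-check syntax, and a minimal Unix check generally looks something like the sketch below. This is a hedged illustration based on the publicly documented format, not an official sample; validate it against Tenable's current compliance-checks reference before use.

<check_type:"Unix">

<custom_item>
  type        : FILE_CHECK
  description : "Ensure /etc/shadow is owned by root"
  file        : "/etc/shadow"
  owner       : "root"
</custom_item>

</check_type>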

Multitenancy in Apache NiFi

I am working on a cloud-based application using Apache NiFi, for which we need to support multitenancy. But the current NiFi implementation only supports role-based access for users, for a single flow.
I understand that the flow state is saved as a single compressed XML file per NiFi instance, so whoever logs into that instance sees the same flow. Our requirement is to create unique flows for each user login. I tried replicating the state-saving gzipped XML file for each user, but couldn't succeed, as the FlowService/FlowController that loads the XML file is instantiated at application startup and is a singleton. Please correct me if I am wrong with this approach, or is there any other solution for adding multitenant support to NiFi? I also wonder about the reasoning behind NiFi being a single-user application.
Multi-tenant support will be introduced in Apache NiFi 1.0.0. There is a BETA release available [1]. This will support assigning permissions on a per-component basis. However, the different tenants still share a canvas. There have been discussions of introducing a workspace concept that could provide visually separate dataflows.
[1] https://nifi.apache.org/download.html
