I'm using Google Cloud Composer, which essentially gives you an Apache Airflow environment. Since my application is written in Go, I was looking for a Go client library to create and execute DAGs in the Cloud Composer Airflow environment. Please let me know if one exists.
Following your clarification: no, you can't! Composer is a managed version of Apache Airflow, where DAGs are described in Python, and in Python only.
You can reach the Composer/Airflow API from Go, and you can generate the Python code with Go and Go templates. You can also define a custom operator in Airflow that runs a Go binary with the right parameters; however, the custom operator itself must be written in Python.
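A minimal sketch of that last pattern, assuming Airflow 2 import paths; the binary name and its flags are invented for illustration, and shipping the binary via the environment's GCS bucket (mounted on Composer workers under /home/airflow/gcs) is one option, not the only one:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="run_go_binary",          # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # The Go program does the real work; this Python file only orchestrates it.
    run_go = BashOperator(
        task_id="run_go_task",
        # Hypothetical path/flags: a pre-compiled Go binary shipped to the workers.
        bash_command="/home/airflow/gcs/data/mytool --date {{ ds }}",
    )
```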
For our Airflow projects (running Airflow 2.0.1) I have implemented some general tests that verify DAG validity, check that each DAG has an owner/email, check the parsing time, and so on.
Now I am trying to set up a CI/CD pipeline that runs these tests and then pushes the DAGs to the Cloud Composer bucket. These tests, however, will obviously fail if I use any Airflow connection or variable, as those have not been created on the runners yet.
What I do not want to do is use mocking, since I would have to specify each connection/variable, which is a bit too much work for general tests. How do you deal with connections/variables for testing in different environments (development/testing/production)?
Since you're using Airflow 2, you can use the stable REST API to create or update variables in the desired environment. You just need to call that API from the CI/CD pipeline.
Check the official documentation for creating a variable and updating a variable.
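For illustration, an upsert from a CI job might look like the sketch below. The base URL and basic-auth credentials are placeholders; note that Cloud Composer fronts the Airflow API with Google-signed tokens rather than basic auth, so adapt the auth to your deployment:

```python
import requests

BASE_URL = "https://my-airflow.example.com/api/v1"  # hypothetical
AUTH = ("ci-user", "ci-password")                   # hypothetical

def upsert_variable(key: str, value: str) -> None:
    # Try to update first; fall back to create if the variable is missing.
    resp = requests.patch(
        f"{BASE_URL}/variables/{key}",
        json={"key": key, "value": value},
        auth=AUTH,
    )
    if resp.status_code == 404:
        resp = requests.post(
            f"{BASE_URL}/variables",
            json={"key": key, "value": value},
            auth=AUTH,
        )
    resp.raise_for_status()

upsert_variable("environment", "testing")
```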
I have recently been researching different build tools, and GCP Cloud Build is one of the candidates:
https://github.com/GoogleCloudPlatform/cloud-builders
One of the requirements is to loop over an array of values, writing the build function only once and running the combinations in parallel.
However, I did not find any mention of matrix builds for Cloud Build.
This functionality is provided by the Jenkins matrix plugin (https://www.jenkins.io/blog/2019/11/22/welcome-to-the-matrix/) and by GitHub Actions (https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstrategymatrix).
Cloud Build is not designed to work like Jenkins out of the box, and the links you included do not mention how to do this with Cloud Build.
As indicated in [1], the best option for integrating Jenkins with Google Cloud is Google Kubernetes Engine (GKE).
[1] https://cloud.google.com/jenkins
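That said, if you want to approximate a matrix on Cloud Build itself, one common workaround (a workaround, not a built-in feature) is a small driver script that submits one build per combination using user-defined substitutions; every name below is illustrative:

```python
import itertools
import subprocess

GO_VERSIONS = ["1.15", "1.16"]   # hypothetical matrix axes
TARGETS = ["linux", "darwin"]

for go_version, target in itertools.product(GO_VERSIONS, TARGETS):
    subprocess.run(
        [
            "gcloud", "builds", "submit", ".",
            "--config", "cloudbuild.yaml",
            # User-defined substitutions must start with an underscore.
            "--substitutions", f"_GO_VERSION={go_version},_TARGET={target}",
            "--async",  # submit all combinations without waiting, i.e. in parallel
        ],
        check=True,
    )
```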
I received an email with the title above as the subject; it says it all. I'm not directly using the specified endpoint (storage#v1). The project in question is a postback catcher that funnels data into BigQuery:
App Engine > Pub Sub > Dataflow > Cloud Storage > BigQuery
A related question here indicates that Dataflow might be using it indirectly. I'm only using the Cloud Pub/Sub to GCS Text template.
What is the recommended course of action if I'm relying on a template?
I think the warning may come from a Dataflow job that uses an old version of the storage API. Please upgrade your Dataflow/Beam SDK version beyond 2.5.
Since you're using our PubsubToText template, the easiest way to do it would be:
1. Stop your pipeline. Be sure to select "Drain" when asked.
2. Relaunch the pipeline using the newest version (which happens automatically if you're using the UI), from the same subscription.
3. Check the SDK version. It should be at least 2.7.
After that you should not see any more warnings.
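For reference, the same steps driven from a script might look like the sketch below; the job ID, job name, subscription, bucket, and especially the template parameter names are assumptions to verify against your own job before use:

```python
import subprocess

REGION = "us-central1"    # placeholder
OLD_JOB_ID = "old-job-id" # placeholder: find it with `gcloud dataflow jobs list`

# Step 1: drain the running job so in-flight data is flushed before it stops.
subprocess.run(
    ["gcloud", "dataflow", "jobs", "drain", OLD_JOB_ID, "--region", REGION],
    check=True,
)

# Step 2: relaunch from the latest released template, reading the same subscription.
# The parameter names below are assumptions; check the template's documentation.
subprocess.run(
    [
        "gcloud", "dataflow", "jobs", "run", "pubsub-to-gcs-text",
        "--region", REGION,
        "--gcs-location", "gs://dataflow-templates/latest/Cloud_PubSub_to_GCS_Text",
        "--parameters",
        "inputSubscription=projects/my-project/subscriptions/my-sub,"
        "outputDirectory=gs://my-bucket/output/,"
        "outputFilenamePrefix=events-",
    ],
    check=True,
)
```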
I've used Nutch and Elasticsearch many times before; however, I believe I was using the default setup, where Nutch used the binary transport method for communicating with Elasticsearch. It was simple and worked out of the box, so I've used it a lot.
I've been in the process of updating my crawl system, and it seems the better option now is to use the Jest REST API library.
However, I'm a bit confused about it...
First, how do I install the Jest library to be used with Nutch and Elasticsearch? I know I can download or clone it via GitHub, but how is it connected?
Do I literally just update the dependencies in the /indexer-elastic-rest *.xml files for Nutch and then rebuild with ant?
My first install of Nutch was from the binary zip. I just recently started using the src package, so ant/maven is somewhat new to me, which is why this is all a bit confusing. All the blogs and articles just say to "rebuild with ant"...
Second, does the Jest library take care of all the Java REST API code, or do I have to write Java code now?
This might seem like a weird question, but I want to know whether it is possible to access a Go chaincode installed on the peers from a Composer script file.
That is, from the script.js file of the BNA, is it possible to access the Go chaincode installed on the peers on the Hyperledger Fabric side?
I want to invoke a chaincode 'mycc' from the script.js file and call its functions from Composer.
Any suggestions? An example would also be good. Thanks!
It is possible to invoke another chaincode from a Composer transaction script. You would need to use the getNativeAPI() function. It is introduced in the Composer documentation about halfway down this document, where I think the example uses the native API to access the 'current' network (chaincode).
There is an additional example in this tutorial where the getNativeAPI().invokeChaincode method is used to connect to a different Business Network.
I have not seen other examples, but this should work for connecting to other chaincodes.
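To make it concrete, here is a hedged sketch of such a transaction processor function. Only getNativeAPI().invokeChaincode itself comes from the documentation; the transaction type, the function name and arguments passed to 'mycc', and the channel name are all guesses for illustration:

```javascript
/**
 * Sample transaction processor that calls out to another chaincode
 * via the Fabric native API.
 * @param {org.example.basic.SampleTransaction} tx The sample transaction (hypothetical type)
 * @transaction
 */
async function invokeOtherChaincode(tx) {
    // getNativeAPI() exposes the underlying Fabric stub, so invokeChaincode()
    // takes the target chaincode name, its argument list, and the channel.
    const response = await getNativeAPI().invokeChaincode(
        'mycc',                     // target chaincode name, from the question
        ['invoke', 'a', 'b', '10'], // function name + args expected by mycc (assumed)
        'composerchannel'           // channel the chaincode is deployed on (assumed)
    );
    console.log('mycc response', response);
}
```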