How to set up an upstream and downstream trigger in Google Cloud Build? - google-cloud-build

I have to run Cloud Build triggers one after another, and only if the first trigger passes should it move on to the next one.
All the triggers are for different repos.
How can I achieve this?
Thanks in advance!

Cloud Build publishes messages on a Google Pub/Sub topic when your build's state changes, such as when your build is created, when your build transitions to a working state, and when your build completes. You can have a look at the doc here for more info.
You could set up a Pub/Sub-triggered Cloud Function to process these events and programmatically launch the next Cloud Build via the API (see the API tab here). This is a bit cumbersome, since you have to define your builds in the API call's body, but as of now there is no chaining capability in Cloud Build.
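A minimal sketch of the decision logic such a Cloud Function could use, assuming the standard shape of messages on the cloud-builds topic. The repo-to-trigger mapping and all names here are placeholders, not part of the Cloud Build API:

```python
import base64
import json

# Hypothetical mapping: which downstream trigger to run after a given
# upstream repo's build succeeds.
NEXT_TRIGGER = {"upstream-repo": "downstream-trigger-id"}

def next_trigger_for(pubsub_message):
    """Decode a message from the cloud-builds Pub/Sub topic and return
    the id of the downstream trigger to run if the upstream build
    succeeded, otherwise None."""
    build = json.loads(base64.b64decode(pubsub_message["data"]))
    if build.get("status") != "SUCCESS":
        return None
    repo = build.get("source", {}).get("repoSource", {}).get("repoName")
    return NEXT_TRIGGER.get(repo)
```

With the trigger id in hand, the function would then call the Cloud Build API (projects.builds.create, or run an existing trigger) with the build body for the downstream repo.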

Related

Bot Framework Orchestrator create snapshots at runtime / Orchestrate between multiple KBs

I am creating a ChatBot using the Bot Framework SDK v4 with Orchestrator.
As per the docs, Orchestrator works by using a pretrained model to create a snapshot of .lu & .qna files, and then dispatches to the correct LUIS or QnAMaker service depending on user input by referring to the Orchestrator snapshot.
However, for my bot, which has several knowledge bases with several dozen contributors, this is a very tedious task. You have to export each knowledge base using the bf CLI via
bf qnamaker:kb:export -o filename.qna --kbId="mykbid" --qnaFormat
and then move the resulting .qna file into the CognitiveModels folder, then create the Orchestrator snapshot manually. Then I also have to redeploy the bot to Azure just to update the Orchestrator snapshot.
This means that every time any knowledge base is updated, I have to repeat this manually so that Orchestrator keeps dispatching to the correct KB. I feel like this should be something that's automated at runtime, but I don't see a way to do this anywhere in the docs.
Since LUIS models aren't updated as regularly I thought that I could set the default case as the QnaMaker, so any unknown intents get mapped to QnAMaker by default and therefore the snapshot doesn't have to be updated as often. But the problem with that is if you have multiple QnAMaker KBs, there's no way to federate the KBs into a single QnAMaker endpoint (AFAIK; if it is possible this would solve my issue) so you have to rebuild Orchestrator snapshot to dispatch between different KB services as well.
Are there any suggestions on how to automate creating Orchestrator snapshots?
As far as I am aware, there is no automated method of accomplishing this. If everything you need to do is in the CLI, you could write a script to automate the task for you, and either run it as needed or use something like cron to run it automatically (assuming the machine running the script is on at the specified time).
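Such a script could look like the sketch below. The KB names/ids and output paths are placeholders; the export flags come from the question above, and the `bf orchestrator:create` flags are an assumption to verify against `bf orchestrator:create --help`:

```python
import subprocess

# Hypothetical KB name -> kbId mapping for this bot.
KBS = {"faq": "mykbid1", "support": "mykbid2"}

def export_cmd(name, kb_id, out_dir="CognitiveModels"):
    """Build the bf CLI invocation that exports one knowledge base."""
    return ["bf", "qnamaker:kb:export",
            "-o", f"{out_dir}/{name}.qna",
            f"--kbId={kb_id}", "--qnaFormat"]

def refresh_snapshots():
    # Export every KB into the CognitiveModels folder...
    for name, kb_id in KBS.items():
        subprocess.run(export_cmd(name, kb_id), check=True)
    # ...then rebuild the Orchestrator snapshot from the exported files.
    subprocess.run(["bf", "orchestrator:create",
                    "--in", "CognitiveModels",
                    "--out", "generated"], check=True)
```

Run from cron (or a CI job), followed by your usual redeploy step, this removes the manual export/snapshot cycle.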

Consume Hyperledger composer event in Angular Based Application which uses REST API

I am a little new to UI frameworks, so please help me understand: is there a way to consume an event if I have built a plain Angular-based app that uses the Composer REST API for its UI? (Note: it is not a Node.js application.)
Because as per documentation it says:
Node.js applications can subscribe to events from a business network by using the composer-client.BusinessNetworkConnection.on API call. Events are defined in the business network model file and are emitted by specified transactions in the transaction processor function file.
A pointer to a blog post or documentation link would be helpful.
We plan to expose events through the composer-rest-server; however, that work is not yet complete. See:
https://github.com/hyperledger/composer/issues/1177
Until that work is finished you will need to subscribe to events from within your Node.js express application and then publish them (via web sockets?) to your frontend.

Can we trigger concourse job with HTTP request?

I would like to configure my slack outgoing webhook such that it can trigger concourse job over http. Is there a way that we can trigger concourse jobs via http requests instead of manually clicking on job from web UI?
Concourse has a feature where you can trigger resource checks (and hence the jobs watching them) via HTTP if you use a webhook token:
https://concourse-ci.org/resources.html#resource-webhook-token
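A hedged sketch of the endpoint shape described in those docs; the team, pipeline, resource, and token values below are placeholders for your own setup:

```python
def concourse_webhook_url(base, team, pipeline, resource, token):
    """Build the webhook URL that forces a check of one resource;
    a POST to it makes Concourse re-check, which can trigger jobs."""
    return (f"{base}/api/v1/teams/{team}/pipelines/{pipeline}"
            f"/resources/{resource}/check/webhook?webhook_token={token}")

# e.g. POST it from curl, or from the Slack relay discussed below:
# requests.post(concourse_webhook_url("https://ci.example.com", "main",
#                                     "my-pipeline", "my-repo", "s3cret"))
```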
Yes, you can apparently trigger a Concourse job via HTTP request, as pointed out in this answer on Stack Overflow.
However, you would need to implement an intermediary script (or Slack app) that receives the request from Slack and then triggers the Concourse job accordingly. That would not work with a simple outgoing webhook.
Also, please note that Outgoing Webhooks are now outdated. Better to use either a custom bot (based on the Events API or RTM API) or a slash command to trigger them. The former is more flexible and powerful, the latter easier to implement. I would advise looking into all the mentioned concepts to see which best fits your requirements.

Schedule CronJobs with the PHP Buildpack

For my PHP web app I am using the PHP Buildpack. Now I would like to schedule a task that should be triggered every month. Normally I would use cron jobs for that.
How can I achieve that within the Swisscom Application Cloud?
Swisscom App Cloud is based on open source Cloud Foundry.
Upstream Cloud Foundry doesn't have a feature equivalent to cron jobs (a task scheduler). Stay tuned; I guess this feature will soon be implemented, because lots of people are migrating from Heroku to CF, and Heroku offers a cron job feature. Subscribe to the Swisscom App Cloud newsletter to read announcements.
There are workarounds for scheduling tasks; see Scheduling tasks on Cloud Foundry on blog.pivotal.io for a Ruby/Rake-based example. Sorry, I didn't find example code for PHP. There is no elegant solution! You need to implement some kind of workaround yourself. It would be great if you published your code to GitHub.
If you only need cron jobs in the data store, MariaDB, for example, offers Events:
Events are named database objects containing SQL statements that are to be executed at a later stage, either once off, or at regular intervals.
They function very similarly to the Windows Task Scheduler or Unix cron jobs.
We had a similar issue. As written by @Fyodor, there is no native solution in Cloud Foundry. We did some research and found vendors like https://www.iron.io/.
Finally, we ended up with a very simple solution.
We expose all our background jobs via an https interface.
Since we already use Jenkins for CI/CD and it has lots of scheduling capabilities, we use our existing Jenkins to trigger these jobs via a simple cURL call to the HTTP endpoints.
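The trigger side of that pattern can be sketched as below. The job URL and the shared-secret header name are assumptions about how such an endpoint might be exposed and protected, not part of any platform API:

```python
import urllib.request

def build_job_request(url, token):
    """Build the POST request that Jenkins (or plain curl) would send
    to kick off one background job; a shared-secret header guards the
    endpoint against outside callers."""
    return urllib.request.Request(url, method="POST",
                                  headers={"X-Job-Token": token})

def trigger_job(url, token):
    """Fire the job and fail loudly so the Jenkins build goes red if
    the job endpoint did not respond with HTTP 200."""
    with urllib.request.urlopen(build_job_request(url, token)) as resp:
        if resp.status != 200:
            raise RuntimeError(f"job failed with HTTP {resp.status}")
```

The failing-loudly part matters: it turns Jenkins' build status into a monitor for your scheduled jobs for free.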

How to detect from cloud code if any changes have been made to my class in Parse cloud

I want to deploy a cloud service which tracks whether any changes have been made to any of my classes in my app in Parse cloud. How should I go about doing this?
I know that I probably have to use a background job on Parse, but how exactly do I go about doing this?
