Use Laravel Queue with Google Tasks on App Engine Standard

I'm running Laravel 6 on Google App Engine Standard and trying to make Laravel queues work natively with Google Cloud Tasks.
Currently I'm creating (dispatching) and handling tasks with custom classes, but I would like to use Google Tasks as a native Laravel queue driver, so I can call Job::dispatch() or dispatch(new Job) and use Mail and Notification with Queueable. Unfortunately it is beyond my abilities, and I can't figure out how everything is interconnected in Laravel.
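(For orientation, the way these pieces interconnect in Laravel is: a service provider registers a connector for a new driver name, the connector builds a Queue implementation, and that implementation decides where pushed job payloads go. Below is a minimal, hypothetical sketch of that wiring; CloudTasksConnector and CloudTasksQueue are made-up names, and the actual Cloud Tasks API call is elided.)

<?php
use Illuminate\Contracts\Queue\Queue as QueueContract;
use Illuminate\Queue\Connectors\ConnectorInterface;
use Illuminate\Queue\Queue as BaseQueue;
use Illuminate\Support\Facades\Queue;

// In a service provider's boot(): register a "cloudtasks" driver,
// then select it via QUEUE_CONNECTION / config/queue.php.
Queue::extend('cloudtasks', function () {
    return new CloudTasksConnector();
});

class CloudTasksConnector implements ConnectorInterface
{
    public function connect(array $config)
    {
        return new CloudTasksQueue($config);
    }
}

class CloudTasksQueue extends BaseQueue implements QueueContract
{
    protected $config;

    public function __construct(array $config)
    {
        $this->config = $config;
    }

    public function push($job, $data = '', $queue = null)
    {
        // createPayload() serializes the job exactly as the built-in drivers do;
        // that payload would become the body of a Cloud Task targeting a route
        // that hands it back to Laravel for execution.
        $payload = $this->createPayload($job, $queue, $data);
        // ... call the Cloud Tasks API here (elided) ...
    }

    // size(), pushRaw(), later(), and pop() must also be implemented to
    // satisfy the Queue contract; pop() can return null for a push-based queue.
}

With a driver like this selected in config/queue.php, Job::dispatch() and queued Mail/Notification would route through push().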

Using the Google Cloud API, you can create multiple queues for different target applications deployed on Google App Engine, in either the Standard or Flexible environment (check here).
Here you can find detailed instructions on how to associate your Laravel project with Google Cloud Tasks for processing asynchronous jobs.
-Basically, you will create a task queue with a queue.yaml file to handle the Cloud Tasks.
-Before creating the task:
---Pass the route of the API and the payload object for the task.
---Authenticate the user ID in the payload.
-Build/create the task. Using the method demonstrated in the example, it will use the Google API to build a Cloud Task and pass it to the task queue (a minimal sketch follows this list).
-Create the API routes in api.php.
-Create a TaskController which routes each API call to the specific function (e.g. associateApp()).
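To make the steps above concrete, here is roughly what the pieces could look like. First, a queue.yaml defining the queue (the name and rate are placeholders):

queue:
- name: laravel-queue
  rate: 5/s
  retry_parameters:
    task_retry_limit: 3

And a minimal sketch of the task-creation step using the google/cloud-tasks PHP client; the project ID, location, queue name, handler route, and payload shape are all assumptions:

<?php
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\Tasks\V2\AppEngineHttpRequest;
use Google\Cloud\Tasks\V2\CloudTasksClient;
use Google\Cloud\Tasks\V2\HttpMethod;
use Google\Cloud\Tasks\V2\Task;

$client = new CloudTasksClient();

// Fully qualified queue name: projects/<project>/locations/<location>/queues/<queue>.
$queueName = CloudTasksClient::queueName('my-project', 'us-central1', 'laravel-queue');

// App Engine target: Cloud Tasks will POST this payload back to the given route.
$request = new AppEngineHttpRequest();
$request->setRelativeUri('/api/tasks/handle');
$request->setHttpMethod(HttpMethod::POST);
$request->setHeaders(['Content-Type' => 'application/json']);
$request->setBody(json_encode(['userId' => 42, 'job' => 'sendWelcomeEmail']));

$task = new Task();
$task->setAppEngineHttpRequest($request);

$client->createTask($queueName, $task);

The TaskController bound to /api/tasks/handle then decodes the body and dispatches to the matching function.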

Related

Is it possible to have a multi-endpoint REST API on Google Cloud Functions? (AWS Lambda migration to GCF)

My company has been using AWS Lambda for many years to run our Spring Boot REST API. We are migrating to GCP, and they want me to deploy our code to GCF the same way we did with AWS Lambda, but I am not sure that GCF works that way.
According to Google, Cloud Functions is only good for single endpoints and can only work as a web server using the Functions Framework.
Spring has a document that uses the GcfJarLauncher, but that is still in alpha, and I can only get it to work for a single endpoint. Any additional functions I put into the code are ignored, and every endpoint triggers the same function.
There were some posts here on SO that talked about using functional beans to map to multiple functions, but I couldn't fully get it working, and my boss isn't interested in that approach.
I've also read of people putting the endpoint in the request payload and then mapping to the proper function, but we are not interested in doing that either.
TLDR/Conclusion:
Is it even possible to deploy our app to GCF, or do we need to use Cloud Run (as Google suggests in my first link)?

Create a GCS V4 signed URL via Google Cloud Workflows

Before I conclude that I can't do this with Google Cloud Workflows alone, I just wanted to check with the community that I'm not missing anything...
I have a Google Cloud Workflows program which exports data from BigQuery to GCS and then sends an email to a user with a URL in the body of the email. I want this URL to be signed.
The gcloud CLI and the language-specific libraries all come with nice helpers for this, but I can't access any of them directly from Google Cloud Workflows. I considered implementing my own sub-workflow to perform the logic described in the "signing URLs manually" documentation, but I don't think I can do that from Workflows alone. (I could easily create some Cloud Function that I call, and in that case I could just use the helper from the Python SDK, for example, but I'm trying to avoid that.) The following operations from the Python example are blockers; logic that I believe I can't do from Google Cloud Workflows alone, unless anyone knows of public web services I could call to get around them:
canonical_request_hash = hashlib.sha256(canonical_request.encode()).hexdigest()
signature = binascii.hexlify(google_credentials.signer.sign(string_to_sign)).decode()
Everything else I could just about do in a fairly long and drawn-out sub-workflow... it would be tedious, but possible.
Cloud Workflows does not natively support hashing or RSA signing in its standard library, and both are core requirements of the GCS URL signing algorithm.
As also advised in the public docs, Cloud Workflows and its sub-workflows should primarily be used as an orchestration layer: invoking services, parsing responses, and constructing inputs for other connected services. Separate services (Cloud Functions, Cloud Run, etc.) should perform any work that is too complex for Workflows or that is not natively supported by Workflows expressions and the standard library.
The solution for the above use case is to either:
a) Create a service (triggered from the Cloud Workflow), such as a Cloud Function, that generates the signed GCS URLs,
OR b) Generate the GCS signed URL as an independent task, outside and after execution of the core workflow operation, as shown in this sample.
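For option a), here is a minimal sketch of such a Cloud Function in PHP, assuming the functions-framework-php and google/cloud-storage packages. The function name and query parameters are placeholders, and the runtime service account needs the ability to sign (a key file or the iam.serviceAccounts.signBlob permission):

<?php
use Google\Cloud\Storage\StorageClient;
use Google\CloudFunctions\FunctionsFramework;
use Psr\Http\Message\ServerRequestInterface;

// Register the HTTP entry point with the Functions Framework.
FunctionsFramework::http('signUrl', 'signUrl');

function signUrl(ServerRequestInterface $request): string
{
    $params = $request->getQueryParams();

    // Locate the object named in the request (placeholder parameter names).
    $storage = new StorageClient();
    $object = $storage->bucket($params['bucket'])->object($params['object']);

    // V4 signed URL valid for 15 minutes; the SHA-256 hashing and RSA
    // signing that Workflows lacks happen inside the client library.
    return $object->signedUrl(new \DateTime('+15 minutes'), ['version' => 'v4']);
}

The workflow can then invoke this function with an http.get step and place the returned URL in the email body.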

Laravel best practices for listening to JSON data from an external API

I have a use case for building an automation tool that needs to get data from an external API and notify a specific user.
I'm thinking about using a cron job to fetch the API every 2 minutes and trigger an event to notify the user.
Is there an alternative approach, something to listen to an API?
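For the polling route, Laravel's scheduler covers the every-2-minutes cron without extra infrastructure. A minimal sketch (the endpoint URL and event class are placeholders; the Http facade assumes Laravel 7+, on older versions use Guzzle directly):

<?php
// app/Console/Kernel.php
namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;
use Illuminate\Support\Facades\Http;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        $schedule->call(function () {
            // Poll the external API (placeholder URL).
            $data = Http::get('https://api.example.com/updates')->json();

            // Fire an event; a listener can then notify the user.
            event(new \App\Events\ExternalDataReceived($data));
        })->cron('*/2 * * * *');
    }
}

The only crontab entry needed is the standard one that runs php artisan schedule:run every minute. If the external API supports webhooks, that is the usual way to "listen": expose a route and let the provider call you instead of polling.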

BotFramework v4 Running Multiple Bots

I need to build a single chatbot instance using Bot Framework v4 that can handle multiple endpoints, and thus multiple AppId/secret pairs. I have seen notes online and in the Bot Framework samples saying it is possible, but I cannot find any specific examples for Bot Framework v4.
Can anyone provide a sample of how to handle such a scenario? For example, I would need endpoints /messages/hr and /messages/payroll, and depending on which endpoint is used, the right AppId/secret is used and the specific MainMenuHrDialog or MainMenuPayrollDialog is launched.
In general, is it recommended to handle bots for different domains in the same bot project, or is it better to have separate projects for different domains, with a shared NuGet package for common tasks?
So if I understand correctly, your desire to use different AppIds and secrets is going to require multiple web app instances of similar Bot Framework template code, each executing different functions via an extension of the existing api/messages endpoint (the default chatbot messaging endpoint). I'd recommend setting up a couple of Azure Web App instances along with a couple of Bot Channels Registrations for connecting channels to your bot logic. Here's a decent resource for that: https://learn.microsoft.com/en-us/azure/bot-service/bot-builder-tutorial-deploy-basic-bot?view=azure-bot-service-4.0&tabs=csharp
Though it doesn't apply directly to your scenario, you might want to check out this sample repo: https://github.com/microsoft/BotBuilder-Samples/tree/main/samples/typescript_nodejs/16.proactive-messages. It shows how you could open up those extra endpoints, /api/messages/hr or /api/messages/payroll. Additionally, I'm not sure how necessary the extra AppIds and secrets are for you, but if your requirement is to ensure authorization when accessing these endpoints, I'd recommend looking into this prebuilt sample as well: https://github.com/microsoft/BotBuilder-Samples/tree/main/samples/javascript_nodejs/18.bot-authentication. It contains some information about authentication and how you might differentiate between users using a combination of conversation.activity.id and tokenResponses from Azure AD.

Consume Hyperledger Composer events in an Angular-based application which uses the REST API

I am a little new to UI frameworks, so please help me understand: is there a way to consume an event if I have built a plain Angular-based app which uses the Composer REST API for the UI (note: not a Node.js application)?
I ask because the documentation says:
Node.js applications can subscribe to events from a business network by using the composer-client.BusinessNetworkConnection.on API call. Events are defined in the business network model file and are emitted by specified transactions in the transaction processor function file.
A redirect to a blog post or documentation link would be helpful.
We plan to expose events through the composer-rest-server, but that work is not yet complete. See:
https://github.com/hyperledger/composer/issues/1177
Until that work is finished, you will need to subscribe to events from within your Node.js Express application and then publish them (via WebSockets?) to your frontend.
