Create GCS V4 signed URL via Google Cloud Workflows

Before I conclude that I can't do this with Google Cloud Workflows alone, I just wanted to check with the community that I'm not missing anything...
I have a Google Cloud Workflows program which exports data from BigQuery to GCS and then sends an email to a user with a URL in the body of the email. I want this URL to be signed.
The gcloud CLI and the language-specific libraries all come with nice helpers to do this, but I can't access any of that directly from Google Cloud Workflows. I considered implementing my own sub-workflow which would perform the logic described in the "signing URLs manually" documentation, but I don't think I can do this from Workflows alone. (I could easily create some cloud function which I call, and in that case I could just use the helper from the Python SDK, for example, but I'm trying to avoid that.) The following functionality from the Python example constitutes a blocker: logic that I believe I can't implement from Google Cloud Workflows alone, unless anyone knows of public web services that I can call to get around this?
# SHA-256 hash of the canonical request (no hashing in Workflows expressions)
canonical_request_hash = hashlib.sha256(canonical_request.encode()).hexdigest()
# RSA signature over the string-to-sign using the service account's key (no signer in Workflows)
signature = binascii.hexlify(google_credentials.signer.sign(string_to_sign)).decode()
Everything else I could just about do in a fairly long and drawn-out sub-workflow... tedious, but possible.

Cloud Workflows does not natively support hashing or RSA signing in its standard library, both of which are core requirements of the GCS URL-signing algorithm.
As also advised in the public docs, Cloud Workflows and sub-workflows should primarily be used as an orchestration layer: invoking services, parsing responses, and constructing inputs for other connected services. Separate services (such as Cloud Functions or Cloud Run) should be created to perform any work that is too complex for Workflows, or for operations that are not natively supported by Workflows expressions and its standard library.
The solution for the above use case is either to:
a) Create a service (triggered from the Cloud Workflow), such as a Cloud Function, to generate signed GCS URLs (a minimal sketch follows below).
OR b) Generate the GCS signed URL as an independent task, outside and after execution of the core workflow operation, as shown in this sample.
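
For option a), here is a minimal sketch of such a Cloud Function in Python. Everything here is an assumption for illustration: an HTTP-triggered function receiving the bucket and object name as JSON, with a runtime service account that is able to sign (e.g. via a key file, or via IAM signBlob if it holds roles/iam.serviceAccountTokenCreator):

from datetime import timedelta

import functions_framework
from google.cloud import storage

@functions_framework.http
def sign_gcs_url(request):
    # Hypothetical HTTP function the workflow can call with http.post.
    # Expects JSON like {"bucket": "my-bucket", "object": "export.csv"}.
    params = request.get_json()
    blob = storage.Client().bucket(params["bucket"]).blob(params["object"])
    # V4 signing needs credentials that can sign: a service-account key,
    # or IAM signBlob if the account holds Service Account Token Creator.
    url = blob.generate_signed_url(
        version="v4",
        expiration=timedelta(hours=1),
        method="GET",
    )
    return {"signed_url": url}

The workflow then just makes an authenticated http.post call to this function and drops the returned URL into the email body.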

Related

Is it possible to have a multi-endpoint REST API on Google Cloud Functions? (AWS Lambda migration to GCF)

My company has been using AWS Lambda for many years to run our Spring Boot REST API. We are migrating to GCP, and they want me to deploy our code to GCF the same way we did with AWS Lambda, but I am not sure GCF works that way.
According to Google, Cloud Functions are only good for single endpoints and can only work as a web server using the Functions Framework.
Spring has a document that uses the GcfJarLauncher, but that is still in alpha, and I can only get it to work for a single endpoint. Any additional functions I put into the code are ignored, and every endpoint triggers the same function.
There were some posts here on SO that talked about using Functional Beans to map to multiple functions, but I couldn't fully get it working, and my boss isn't interested in that.
I've also read of people putting the endpoint in the request payload and then mapping to the proper function, but we are not interested in doing that either.
TLDR/Conclusion:
Is it even possible to deploy our app to GCF, or do we need to use Cloud Run (as Google suggests in my first link)?
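
For context on the "single endpoint" model mentioned above: the Functions Framework hands every request to one entry point, so multi-endpoint behavior means dispatching on the request path yourself. A minimal, hypothetical sketch (shown in Python for brevity; the Functions Framework for Java supports the same pattern, and the route names here are illustrative, not from the question):

import functions_framework

@functions_framework.http
def api(request):
    # One HTTP entry point; routes are dispatched manually on the path.
    if request.path == "/users" and request.method == "GET":
        return {"users": []}
    if request.path == "/orders" and request.method == "POST":
        return {"created": True}, 201
    return {"error": "not found"}, 404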

List all the scheduled snapshots in a given project and region programmatically (golang)

I am trying to use a golang client to programmatically list all the scheduled snapshot policies in a given project and region, and to describe them.
I am able to fetch them using gcloud commands, but I'm wondering how I can do the same programmatically (preferably with the Compute golang client)?
gcloud compute resource-policies list --project myproject
gcloud compute resource-policies describe my-snapshot-policy --project myproject --region myregion
Thanks in advance.
Per @john-hanley, you are encouraged to demonstrate your own attempt to solve the problem in your question.
Google provides SDKs for all of its services. There are two flavors, and this can be confusing. The original style, which you can find for any Google service, is called the API Client Libraries. For Google Cloud Platform, many (!) of the services also (!) have Cloud Client Libraries. See Google Client Libraries Explained.
For Compute in Golang, there's a new Cloud Client Library.
You can see examples of its use here. I encourage you to follow Google's style including by using Application Default Credentials.
You will want to use a ResourcePoliciesClient and the client's Get and List methods.
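
A minimal sketch of those two calls. It is shown here with the Python Cloud Client Library (google-cloud-compute) rather than Go, as an illustration of the shape of the API; the Go client (cloud.google.com/go/compute/apiv1) exposes the same ResourcePoliciesClient surface with List and Get methods. Project, region, and policy names are the ones from the question:

from google.cloud import compute_v1

# Equivalent of `gcloud compute resource-policies list --project myproject`.
# Note: list() is per-region; aggregated_list() covers all regions at once.
client = compute_v1.ResourcePoliciesClient()
for policy in client.list(project="myproject", region="myregion"):
    print(policy.name, policy.snapshot_schedule_policy)

# Equivalent of `gcloud compute resource-policies describe my-snapshot-policy`.
policy = client.get(
    project="myproject",
    region="myregion",
    resource_policy="my-snapshot-policy",
)
print(policy)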

Does a Google Container Engine SDK/API exist?

I am planning to launch a container cluster from an SDK/API. At present I am fine with any language, but I prefer a Node.js SDK. As far as I have seen, I could not find any Container Engine SDK. Here is the Node.js SDK for GCP, which does not cover Container Engine; in fact it contains SDKs for only a very few GCP services.
I came across the OAuth API for Container Engine, but it involves human intervention to launch it. I am looking for service-account-based authentication for the SDK.
Are there Container Engine SDKs available?
Update after discussion with Robert Lacok:
This is the code I tried to use for the Container API with an API key; it does not work. It expects an OAuth 2 token, or some credentials other than a service account. I tried an API key and it didn't work, and I don't know how to use service-account authentication with the API.
Here is my source code:
Here is the error:
I see a method for Application Default Credentials, but I don't think it will be useful for my use case: I am trying to create a container cluster from AWS Lambda, so I can't use Application Default Credentials. Are there any other options?
The API for Google Container Engine is very limited at the moment: all the features are in alpha status, and because they can change, not many people are incorporating them into the SDKs they are developing.
These are the currently available APIs: https://cloud.google.com/sdk/gcloud/reference/container/
And here are the alpha APIs: https://cloud.google.com/sdk/gcloud/reference/alpha/container/
What you probably want to do is make calls to the REST API and use the client library for OAuth2 authentication.
You can browse the API documentation and see that every method has a short how-to for a number of languages, Node.js being one of them. Have a look here for an example of how to create a container cluster.
You also mentioned service-account authentication. The preferred way to do this is to use Application Default Credentials; you can have a little read about them here.
In short, you want to set an environment variable GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json, which points to a service-account key you generated in the console.
Then the client library will take care of the rest (getting the OAuth tokens and whatnot).
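
Since the question specifically rules out Application Default Credentials (the caller runs in AWS Lambda), here is a minimal sketch of loading a service-account key explicitly. It is shown with the Python client (google-cloud-container) as an illustration; the Node.js client accepts credentials in a similar way, and all names and paths here are placeholders:

from google.cloud import container_v1
from google.oauth2 import service_account

# Explicit service-account credentials instead of ADC; useful when
# running outside GCP, e.g. from AWS Lambda. The path is a placeholder.
creds = service_account.Credentials.from_service_account_file(
    "/path/to/key.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
client = container_v1.ClusterManagerClient(credentials=creds)

# Create a small cluster; project, zone, and name are illustrative.
operation = client.create_cluster(
    request={
        "parent": "projects/my-project/locations/us-central1-a",
        "cluster": {"name": "demo-cluster", "initial_node_count": 1},
    }
)
print(operation.name)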

How do I create a StorageController in Node.js for Azure File Sync?

If I wish to create a StorageController to generate the SAS tokens in order to make Azure file syncing work with Xamarin.Forms, is that possible? All the documentation that I have seen only mentions a way to do it in ASP.NET.
Reference:
https://azure.microsoft.com/en-us/blog/file-management-with-azure-mobile-apps/
It seems I can create a SAS token in Node.js, but how do I tie it to the /table endpoint?
http://azure.github.io/azure-storage-node/
There is significant work involved in creating the equivalent of a StorageController for the Node.js backend. You could create a simple custom API endpoint that generates SAS tokens and consume those from your client application, i.e. interact directly with Azure Storage rather than going through the file-sync plugin for Azure Mobile Apps.
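
A sketch of the core of such an endpoint, generating a short-lived SAS token. It is shown with the Python azure-storage-blob (v12) helper purely to illustrate the idea; the azure-storage-node library linked above offers equivalent SAS-generation functions, and the account, container, and key values are placeholders:

from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

def make_sas(account_name, account_key, container, blob_name):
    # Return a SAS the mobile client can use to talk to Azure Storage
    # directly; every argument here is a placeholder.
    return generate_blob_sas(
        account_name=account_name,
        container_name=container,
        blob_name=blob_name,
        account_key=account_key,
        permission=BlobSasPermissions(read=True, create=True, write=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    )

The client then appends the returned token to the storage URL as a query string.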

Is Parse an adequate solution here?

I'm contemplating using Parse as a platform for my app, as I'm trying to avoid creating and managing the cloud infrastructure myself.
For the sake of simplicity let's say that my app will hook into an Exchange Server and will need to leverage some hosted Machine Learning service to categorize my e-mail and report on insights found.
I'm assuming that Parse would store my core data, while the hosted ML will store the "Big Data" associated with processing for insights.
I'm also expecting my app to receive push notifications generated by the hosted ML service.
Does this sound like a plausible way to go about it and leverage Parse, or am I better off developing the backend myself?
I think parse.com is the right place for your requirements, because they have everything you need: core data storage, push notifications, a Cloud Code module which can be integrated with Heroku, social integration, and user-management functionality.
They also have a large set of client libraries for desktop and mobile apps (Node, Java, .NET, etc.), as well as libraries for embedded devices.
The biggest advantage is that everything is already set up, and you are focused on software development, not on infrastructure. This is my opinion.
I've been experimenting with the above stack and so far have been really impressed. It seems like a viable path forward. The Cloud Code capability of Parse is very solid and easy to work with. If you want to run services outside of Parse code, this is also possible: just issue REST calls.
