Set up Terraform Stackdriver alerts based on GCP bucket - google-cloud-stackdriver

I am trying to set up Stackdriver alerting policies through Terraform, based on Cloud Storage bucket conditions.
Whenever a file lands in the GCP bucket, it should trigger an email notification to our mailboxes (not using SendGrid).
For now, I have this email notification working through the GCP console via Stackdriver, but I am trying to incorporate it into Terraform.
Any guidance is really appreciated. Thank you

Figured it out via terraform import of the Google monitoring policies. Everything is now managed through Terraform, and the Stackdriver notifications on bucket changes are hooked up as well.
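For reference, a minimal sketch of what the imported resources can look like in HCL. This is not the exact policy from the import: the bucket name, email address, metric (storage/object_count), threshold, and alignment settings below are placeholder assumptions, and since object_count is only sampled periodically you may prefer a request-count based condition for prompter alerts.

```hcl
# Hypothetical email channel for the alert notifications.
resource "google_monitoring_notification_channel" "email" {
  display_name = "Bucket alerts"
  type         = "email"

  labels = {
    email_address = "team@example.com" # placeholder address
  }
}

# Hypothetical policy that fires when the bucket contains any objects.
resource "google_monitoring_alert_policy" "bucket_objects" {
  display_name = "Objects present in bucket"
  combiner     = "OR"

  conditions {
    display_name = "Object count above zero"

    condition_threshold {
      # Placeholder metric/filter; adjust to whatever condition you built in
      # the console before running `terraform import`.
      filter          = "metric.type=\"storage.googleapis.com/storage/object_count\" AND resource.type=\"gcs_bucket\" AND resource.label.bucket_name=\"my-bucket\""
      comparison      = "COMPARISON_GT"
      threshold_value = 0
      duration        = "60s"

      aggregations {
        alignment_period   = "300s"
        per_series_aligner = "ALIGN_MEAN"
      }
    }
  }

  notification_channels = [google_monitoring_notification_channel.email.id]
}
```

After importing the existing policy and channel with terraform import, iterate on the configuration until terraform plan shows no diff.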

Related

How to send Lambda logs to StackDriver instead of CloudWatch?

I am considering sending my logs to StackDriver instead of CloudWatch. But the docs seem to only describe how to do it with EC2. What about Lambda? I would prefer to send logs directly to StackDriver instead of having StackDriver read from CloudWatch, to remove the CloudWatch costs entirely.
Stackdriver supports the metric types from Amazon Lambda listed in this article
To use these metrics in charting or alerting, your Google Cloud Platform project or AWS account must be associated with a Workspace.
After you have a Workspace, you can add more GCP projects and AWS accounts to it using the Adding monitored projects instructions.
If you plan to monitor more than just your host project, then the best practice is to use a new, empty GCP project to host the Workspace and then to add the projects and AWS accounts you want to monitor to your Workspace. This lets you choose a useful name for your host project and Workspace, and gives you a little more flexibility in moving monitored projects between Workspaces. The diagram in the docs shows Workspace W monitoring GCP projects A and B and AWS account D.
Monitoring creates an AWS connector project when you add an AWS account to a Workspace. The connector project has a name beginning with AWS Link, and it has the same parent organization as the Workspace. To get the name and details about your AWS connector projects, go to the Inspecting Workspace section.
In the GCP Console, AWS connector projects appear as regular GCP projects. Don't use connector projects for any other purpose, and don't delete them while your Workspace is still connected to your AWS account.

Creating OpenStack VMs using alerts from Splunk

As per my understanding, in AWS we can combine CloudWatch and Elastic Beanstalk to automate VM creation. For example, we can configure CloudWatch to trigger an alert for a certain condition and, depending on that, create or alter a VM. Is there a way to do the same with OpenStack using Terraform scripts?
Currently, we are creating and managing OpenStack VMs using Terraform and Ansible scripts. We have Splunk for dashboards and alerts. Is there a way to execute Terraform scripts for VMs when we get an alert from Splunk? Please correct me if my understanding is wrong.
Is there a way to execute terraform scripts for VM's as we get an alert from Splunk?
AWX (or its Tower friend) will trivially(?) do that via /api/v2/job_templates/{id}/launch/; or, if there needs to be some API massaging (either to keep the credentials out of Splunk or to reshape the webhook payload), then I would guess a Lambda function could do that.
I would guess that if you are using Terraform to drive Ansible (instead of the other way around), then you could use Atlantis or TerraHub in roughly the same manner.

How to set up Elastic alerts on Amazon Elasticsearch

I've been looking for a tutorial on how to get alerts from Amazon Elasticsearch.
I'm using Metricbeat on my server instance to collect logs and everything is fine, but now I have to find a way to send alerts for my memory and CPU. I read something about Elastic alerting being able to send alerts to email or Slack, but I don't know how to use it on Amazon Elasticsearch.
If anybody has a tutorial, that would help me.
Thanks in advance.
You need X-Pack to be able to configure Watchers to send email or Slack alerts. But AWS Elasticsearch does not offer X-Pack features. For this exact reason we moved away from AWS Elasticsearch to Elastic Cloud, and we couldn’t be happier.

Terraform: CloudWatch logs to Elasticsearch

I am trying to push CloudWatch logs to Elasticsearch using either a Lambda function or Amazon Kinesis. I have the log groups set up and the Elasticsearch domain running using Terraform. Please suggest how I can push the logs from the log group to Elasticsearch, and please share if you have the Terraform code for the same.
This answer documents some example Terraform code for creating a lambda and Cloudwatch subscription that ships logs from a Cloudwatch log group to a Sumologic HTTP collector (just a basic HTTP POST endpoint). The Cloudwatch subscription invokes the Lambda every time a new batch of log entries is posted to the log group.
The cloudwatch-sumologic-lambda referred to in that Terraform code was patterned off of the Sumologic Lambda example.
I'd imagine you would need to do something similar, rewriting the Lambda to format the HTTP request however Elasticsearch requires. I'd bet some quick googling on your part will turn up plenty of examples.
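As a rough sketch of that shape of wiring in Terraform (log group → subscription filter → Lambda), assuming the package lambda.zip contains your own code that decodes the gzipped CloudWatch payload and POSTs it to the Elasticsearch endpoint; the names, runtime, and log group below are placeholders:

```hcl
# Placeholder log group to subscribe to.
resource "aws_cloudwatch_log_group" "app" {
  name = "/aws/my-app"
}

# Minimal execution role for the Lambda.
resource "aws_iam_role" "lambda_exec" {
  name = "cloudwatch-to-es-lambda"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}

# The shipping function; lambda.zip is assumed to hold code that reformats
# the log events and sends them to Elasticsearch.
resource "aws_lambda_function" "logs_to_es" {
  filename      = "lambda.zip"
  function_name = "cloudwatch-logs-to-elasticsearch"
  role          = aws_iam_role.lambda_exec.arn
  handler       = "index.handler"
  runtime       = "nodejs18.x"
}

# Allow CloudWatch Logs to invoke the function.
resource "aws_lambda_permission" "allow_cloudwatch" {
  statement_id  = "AllowExecutionFromCloudWatchLogs"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.logs_to_es.function_name
  principal     = "logs.amazonaws.com"
  source_arn    = "${aws_cloudwatch_log_group.app.arn}:*"
}

# Ship every new batch of log events in the group to the Lambda.
resource "aws_cloudwatch_log_subscription_filter" "to_lambda" {
  name            = "ship-to-elasticsearch"
  log_group_name  = aws_cloudwatch_log_group.app.name
  filter_pattern  = ""
  destination_arn = aws_lambda_function.logs_to_es.arn

  depends_on = [aws_lambda_permission.allow_cloudwatch]
}
```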
Alternatively to all this Terraform config though, you can just go to your Cloudwatch console, select the log group you're interested in and select "Stream to Amazon ElasticSearch".
Though I think that will only work if you're using the AWS "ElasticSearch service" offering - meaning if you installed/configured ElasticSearch on some EC2 instances yourself it probably won't work.

Connect Hadoop cluster to multiple Google Cloud Storage buckets in multiple Google projects

Is it possible to connect my Hadoop cluster to multiple Google Cloud projects at once?
I can easily use any Google Storage bucket in a single Google project via the Google Cloud Storage Connector, as explained in this thread: Migrating 50TB data from local Hadoop cluster to Google Cloud Storage. But I can't find any documentation or example of how to connect to two or more Google Cloud projects from a single map-reduce job. Do you have any suggestions/tricks?
Thanks a lot.
Indeed, it is possible to connect your cluster to buckets from multiple different projects at once. Ultimately, if you're using the instructions for using a service-account keyfile, the GCS requests are performed on behalf of that service account, which can be treated more or less like any other user. You can either add the service account email your-service-account-email@developer.gserviceaccount.com to all the different cloud projects owning buckets you want to process, using the permissions section of cloud.google.com/console and simply adding that email address like any other member, or you can set GCS-level access to add that service account like any other user.
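The answer above describes doing this in the console; if you happen to manage those projects with Terraform, a minimal sketch of the equivalent grants could look like the following. The project ID, bucket name, and service-account email are placeholders:

```hcl
# Placeholder service account used by the GCS connector on the cluster.
locals {
  connector_sa = "serviceAccount:your-service-account-email@developer.gserviceaccount.com"
}

# Project-level read access in another project whose buckets the cluster reads.
resource "google_project_iam_member" "other_project_reader" {
  project = "other-project-id"
  role    = "roles/storage.objectViewer"
  member  = local.connector_sa
}

# Or a narrower, bucket-level grant on a specific bucket in a third project.
resource "google_storage_bucket_iam_member" "bucket_reader" {
  bucket = "bucket-in-third-project"
  role   = "roles/storage.objectViewer"
  member = local.connector_sa
}
```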
