Whenever Tableau sends a data-driven alert, I want to trigger an external API call - aws-lambda

I am using Tableau Server's data-driven alerts. Whenever an alert's threshold is passed, we send an email to users. However, I also want to call an external API from my application (not in Tableau).
I have considered using AWS SNS to trigger a Lambda, but I was hoping to see whether anyone else has faced a similar use case. It does not seem like Tableau's REST APIs provide enough metadata to handle this use case for data-driven alerts.
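For what it's worth, the SNS-to-Lambda half of that idea is straightforward. Below is a minimal sketch in Python, assuming the alert pipeline publishes the alert details as JSON to an SNS topic; the external endpoint URL and payload shape are illustrative assumptions, not anything Tableau provides.

import json
import urllib.request

EXTERNAL_API_URL = "https://example.com/alerts"  # hypothetical endpoint

def handler(event, context):
    # SNS invokes the Lambda with one or more records; forward each one.
    for record in event["Records"]:
        alert = json.loads(record["Sns"]["Message"])
        req = urllib.request.Request(
            EXTERNAL_API_URL,
            data=json.dumps(alert).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            print("External API responded with", resp.status)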

Related

How to Transfer Data Between Multiple Microservices?

As part of my project, I'd like to use microservices. The application is a store website where the admin can add products and the user can order and buy them.
I envision implementing four services: admin service, user service, product service, and order service.
I had trouble handling data between multiple services, but I solved it by duplicating some necessary data using message brokers.
I can apply this solution between the product, user, and order services because I only need some of the data, not all of it.
Now, my question is about the admin service, because in this service I need to access all of the data; for example, the admin should have a list of users and the ability to add new products or update them.
How can I handle data between these services and the admin service?
Should I duplicate all data inside the admin service?
Should I use a REST API?
No, that's wrong; it seems you want to run away from the problem. In general, duplication is an anti-pattern, especially in the case you describe.
The way you are thinking about the admin service is wrong:
"because in this service I need to access all of the data"
I don't think you need such a service. Access to data based on the user must be handled by an identity server (OIDC/OAuth), which is a separate service that controls access to the endpoints.
For example, the product service provides (1) return the product list, (2) return an individual product's data, and (3) create a product. The first two can be accessed by both users and admins, but the third must be restricted to admins. One of the identity server's duties is to identify the user when the user interacts (logs in) with the services.
ADMIN Scenario
The user client requests the create-product endpoint (on a service, e.g. the product service).
The client app (the front-end app) is configured against the identity server; it realizes the required identity tokens are missing and redirects to the identity server's login.
NOTE: the identity server also identifies the client app itself, which I have skipped here.
The user client logs in and gets the required token, based on their claims, roles, etc.
The user client requests the create-product endpoint with the tokens included in the request header.
The endpoint (product service) receives the request and checks the header (the services are also configured against the identity server and the user claims),
getting the user's claims info.
Create-product requires the admin role; if it is there, then there we go, otherwise no access.
The image uses IdentityServer4. There are also several alternatives, and you can implement it yourself using OAuth and OIDC protocol libraries.
So the admin just makes requests to the relevant service; the data is not fetched through a separate service for this purpose. (A minimal role check is sketched below.)
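To make the role check concrete, here is a minimal sketch in Python, assuming the identity server issues a JWT with a role claim. The signing-key handling and claim names are assumptions for illustration, not IdentityServer4 specifics.

import jwt  # PyJWT

SIGNING_KEY = "signing-key"  # assumption: symmetric key, for brevity

def can_create_product(auth_header: str) -> bool:
    # The client app sends "Authorization: Bearer <token>".
    token = auth_header.removeprefix("Bearer ").strip()
    try:
        claims = jwt.decode(token, SIGNING_KEY, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return False  # missing, expired, or forged token: no access
    # create-product requires the admin role; anything else is rejected.
    return claims.get("role") == "admin"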
Communication between services:
The most difficult part of microservices is wiring them up, and the wiring is a direct consequence of your design (I recommend a deep study of Domain-Driven Design).
Asynchronous communication:
To avoid coupling between services, you mostly use asynchronous communication, where you pass events through brokers (e.g., RabbitMQ, Kafka, Redis, etc.). In this style, the source service that sends the event does not care about the response and does not wait for it; it is just always ready to listen for any result event. For example:
the inventory service creates an item,
123|shoe-x22|22units
and fires an event with the data 123|shoe-x22 (maybe a duplicate, or maybe just the ID) to the product service so it can create its own record, but it does not wait for a response from the product service saying whether the record was created successfully or not (sketched below).
As you can see, this scenario is unreliable in the face of faults and you need to handle that, so in this case you have to study the CAP theorem, sagas, and circuit breakers.
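A minimal sketch of that fire-and-forget publish, assuming RabbitMQ via the pika package; the queue name and event fields are illustrative.

import json
import pika

# The inventory service emits the event and moves on; no reply is awaited.
connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="product-events", durable=True)

event = {"id": 123, "name": "shoe-x22"}  # duplicate only what the consumer needs
channel.basic_publish(
    exchange="",
    routing_key="product-events",
    body=json.dumps(event).encode("utf-8"),
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message
)
connection.close()  # the product service reacts whenever it consumes the event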
Synchronous communication:
In this case, the service insists on having a response back immediately, which pushes the services toward tighter coupling. If you need performance, you can use gRPC communication; otherwise, a simple API call to the relevant service will do. In the case of gRPC, I recommend libraries like MassTransit, which can also be used for implementing gRPC with minimal coupling.
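For the plain-API-call case, a minimal sketch using the requests package; the service URL is an assumption, and the short timeout is what keeps a slow dependency from stalling the caller (pair it with retries or a circuit breaker).

import requests

# Synchronous style: the caller blocks until the product service answers.
resp = requests.get("http://product-service/products/123", timeout=2)
resp.raise_for_status()  # surface failures instead of silently continuing
product = resp.json()
print(product)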
Some requests need data from multiple services
If you are in such a situation, you have two options.
Most microservice architectures use an API gateway (e.g., nginx, Ocelot, etc.), which provides reverse proxying, load balancing, SSL termination, and so on. One of its abilities is to merge multiple responses to a request, but it just merges them; it does not change the data structure of the responses.
If you need to return a desired response data structure, you can create an aggregator service which itself calls the other services, gathers the data, wraps it in the desired format, and returns it (sketched below).
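A minimal sketch of such an aggregator; the service URLs and response fields are assumptions for illustration.

import requests

def get_order_summary(order_id):
    # Call the two services, then wrap both responses in the desired shape.
    order = requests.get(f"http://order-service/orders/{order_id}", timeout=2).json()
    user = requests.get(f"http://user-service/users/{order['userId']}", timeout=2).json()
    return {
        "orderId": order["id"],
        "items": order["items"],
        "customerName": user["name"],
    }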
So in the end, Domain-Driven Design is still the key. I think I have talked too much; I hope this helps you out there.

Getting emails via LocalStack / AWS SES

I am trying to use LocalStack for my end-to-end tests, but I cannot read the emails sent via LocalStack/SES. Is there a way to do that?
I want my Cypress e2e tests to invoke my backend services, the backend services compose an email containing a link and send the email via LocalStack/SES. I then want my e2e tests to wait for that email, read the link sent in it, and proceed.
I managed to invoke LocalStack's SES to send the email, and I am aware that the moto library backing LocalStack stores the sent messages in memory. Is there a way to read those messages?
The sent messages can be retrieved via a service API endpoint (GET /_localstack/ses) or from the filesystem.
Messages are also saved to the data directory (DATA_DIR, see Configuration). If the data directory is not available, the temporary directory (TMPDIR) is used. The files are saved as JSON in the ses/ subdirectory and organised by message ID.
Reference:
https://docs.localstack.cloud/aws/ses/
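For example, your e2e setup could query that endpoint directly; a minimal sketch, assuming LocalStack is running on the default edge port 4566 (field names follow the LocalStack docs linked above):

import requests

# Ask LocalStack's SES introspection endpoint for all sent messages.
resp = requests.get("http://localhost:4566/_localstack/ses", timeout=5)
resp.raise_for_status()
for message in resp.json()["messages"]:
    print(message["Id"], message["Subject"])  # extract the emailed link here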
LocalStack uses Moto, and Moto does expose the ability to check the sent emails. It is discussed here:
https://github.com/spulec/moto/issues/1392
Taking a look at the code for LocalStack, it doesn't look like they expose a function to access this information:
https://github.com/localstack/localstack/blob/master/localstack/services/ses/ses_starter.py
You will need to make a pull request to LocalStack and add a function which exposes the ses_backend, or specifically its sent_messages array:
from moto.ses import ses_backend
ses_backend.sent_messages
I am not using LocalStack for SES; I am running the E2E tests against our real quality-assurance testing environment (real SES). In that case, you can use one of the following.
What you need is a programmatic way of reading the inbox and checking the emails by subject, and maybe the body too.
Mailosaur https://mailosaur.com/
The API is simple to use with Java, which fit our use case. In addition, the response from their sales/support was good. They also have SMS services, which we are not using.
MailSlurp https://www.mailslurp.com/
This was our first choice, just because it's the first one we found and it looked pretty good. But we tried to contact them and never got a reply; they still haven't replied, and we sent half a dozen emails.
MailTrap https://mailtrap.io/
This third service is suitable if you're working with JavaScript (please use TypeScript), as it's REST-based. For Java, REST APIs end up being quite verbose in code, which I am not a fan of; but if you are in JavaScript, this option is suitable.
The prices of each are comparable; MailTrap has a free option too.

How can I have WebSockets connect to a Google Cloud Function?

I have a WebSocket that I'm subscribing to and when an event comes in, I want to trigger a Google Cloud Function. Is this possible?
For example, I'm listening to https://alpaca.markets/docs/api-documentation/api-v2/streaming/
and whenever I get trade_updates, I want to run a function on Google Cloud Functions.
All trigger types for Cloud Functions are listed here: https://cloud.google.com/functions/docs/calling
WebSockets are not directly supported as a trigger type.
The closest I can think of is setting up an architecture where some long-running code listens to the WebSocket and then triggers Cloud Functions through Pub/Sub or one of the other supported trigger types (sketched below).
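A minimal sketch of that listener, assuming the websockets and google-cloud-pubsub packages; the project, topic, and stream URL are illustrative.

import asyncio
import websockets
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "trade-updates")  # hypothetical

async def listen():
    # Forward each WebSocket event to Pub/Sub, which in turn triggers
    # the Cloud Function subscribed to the topic.
    async with websockets.connect("wss://example.com/stream") as ws:
        async for message in ws:  # assuming text frames
            publisher.publish(topic_path, data=message.encode("utf-8"))

asyncio.run(listen())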

Laravel best practices listening to JSON data from external API

I have this use case of building an automation tool that needs to get data from an external API and notify a specific user.
I'm thinking about using a cron job to poll the API every 2 minutes and trigger an event to notify the user.
Is there an alternative approach, something to listen to an API?

Intercepting PouchDB communications with CouchDB backend

I am considering PouchDB & CouchDB as an alternative to Amazon Cognito Sync for a hybrid mobile app that will need data synced between devices and users. I have PouchDB working in a small sample app that syncs with a local CouchDB.
I need to be able to intercept the communications back and forth between PouchDB and CouchDB in Java, in order to do things in response to these sync events; sort of like Amazon Cognito Sync's sync triggers. I also keep thinking of it much like Spring AOP's around advice.
Since CouchDB has a REST interface, I thought I could point PouchDB at my application server, which has a controller listening for any request with the db name as the base. When a request from PouchDB comes in, the Java REST controller can optionally do something, then forward the request to the real REST endpoint of CouchDB and get a response, then optionally do something again, then return the response to PouchDB.
Does this seem like a feasible solution? I am currently working on trying to get this concept working. Has anyone else done anything like this? Any major pitfalls to this approach? Currently, I'm using Java 8 with Spring Boot & Jersey.
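The pass-through proxy idea itself is feasible; here is a minimal sketch of the concept. The asker's stack is Java/Spring, but this uses Python/Flask purely to keep the example short; the CouchDB URL and hook placement are illustrative.

import requests
from flask import Flask, Response, request

app = Flask(__name__)
COUCHDB = "http://localhost:5984"  # the real CouchDB behind the proxy

@app.route("/<path:path>", methods=["GET", "PUT", "POST", "DELETE"])
def proxy(path):
    # "Before" hook: inspect or react to the incoming sync request here.
    upstream = requests.request(
        method=request.method,
        url=f"{COUCHDB}/{path}",
        params=request.args,
        data=request.get_data(),
        headers={k: v for k, v in request.headers if k.lower() != "host"},
    )
    # "After" hook: react to CouchDB's response before returning it.
    return Response(upstream.content, status=upstream.status_code,
                    content_type=upstream.headers.get("Content-Type"))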
I think the architecture goes like this:
Data is empty everywhere.
When data changes, the device where the change happened pushes it via a REST API.
Your 'master' server sends a notification (GCM or APNs) to the devices.
In your notification listener, you check the type of notification and you sync the data.
If a new device connects, you add it to your 'list of devices to sync' and send it a push notification to sync the data.
Keep a list of connected devices.
The same idea goes for every device/web browser: you have a local cache that you push to the master when it changes locally.
You will have many cases to deal with, and I don't think there are open-source projects that offer the same pattern as Cognito Sync.
Also think about scalability: devices don't have to poll your master; the master sends notifications to trigger devices to download the data (sketched below).
You have to deal with diffs, regular checks, and so on ...
Good luck
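A minimal sketch of the "master notifies devices" step, assuming Firebase Cloud Messaging (GCM's successor) via the firebase-admin package; the token list and message shape are illustrative.

import firebase_admin
from firebase_admin import messaging

firebase_admin.initialize_app()  # reads GOOGLE_APPLICATION_CREDENTIALS

connected_devices = ["device-token-1", "device-token-2"]  # your device list

def notify_sync():
    # Tell every registered device that new data is available to pull.
    for token in connected_devices:
        message = messaging.Message(
            data={"type": "sync"},  # the notification listener checks this type
            token=token,
        )
        messaging.send(message)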
