How to send an event to Azure Event Hubs from a Locust task? - python-asyncio

I want to send a message to Event Hubs from a Locust task, but Locust doesn't appear to support async tasks (asyncio). How can I do it?
When I run locust -f, the process gets stuck
and then stops with a send-event timeout.
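One common workaround, since Locust tasks run synchronously (on gevent), is to run your own asyncio event loop in a background thread and submit coroutines to it from the task. The sketch below uses a stub coroutine in place of the Event Hubs async client, so the names and the send logic are illustrative, not the real SDK:

```python
import asyncio
import threading

# Hypothetical async "send" standing in for the Event Hubs async client
# (e.g. an EventHubProducerClient call); replace with your real coroutine.
async def send_event(body):
    await asyncio.sleep(0)  # pretend network I/O
    return f"sent:{body}"

# Run a private event loop in a daemon thread so synchronous Locust
# tasks can submit coroutines without blocking on asyncio.run().
_loop = asyncio.new_event_loop()
threading.Thread(target=_loop.run_forever, daemon=True).start()

def send_from_sync(body, timeout=5):
    # run_coroutine_threadsafe bridges sync code (a Locust @task)
    # and the asyncio loop running in the other thread.
    future = asyncio.run_coroutine_threadsafe(send_event(body), _loop)
    return future.result(timeout)
```

Inside a Locust @task you would then just call send_from_sync(...). Alternatively, the azure-eventhub package also ships a synchronous client, which avoids asyncio entirely.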

Related

is there a gRPC "completion queue" for synchronous server-side streaming?

I am working on implementing a gRPC service and I would like to use the server-streaming functionality. But since I am working with a Ruby client, from what I understand, the request must be synchronous. Is there anything to know when working with synchronous streams? What happens if the client is slower to process the stream than the server? Is each message sent in the stream queued somewhere so the client can process it? If yes, is there a size limit to that queue?
It seems I am able to process all the streams I receive, but what does that look like in a production environment? There is little information on what happens to a message after it is received by the client but before it is processed.
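Roughly speaking, gRPC rides on HTTP/2 flow control: messages a slow client hasn't read accumulate in bounded buffers (the flow-control window plus any client-side queue), and once those fill, the server's writes block instead of queueing unboundedly. A toy stdlib model of that behaviour, with the window stood in by a bounded queue:

```python
import queue
import threading
import time

# Toy model of gRPC's flow control: a bounded buffer between a fast
# server (producer) and a slow client (consumer). When the buffer is
# full, put() blocks -- analogous to the HTTP/2 flow-control window
# pausing the server until the client reads more data.
buf = queue.Queue(maxsize=3)
sent = []

def server():
    for i in range(10):
        buf.put(i)          # blocks while the "window" is full
        sent.append(i)
    buf.put(None)           # end-of-stream marker

def client(out):
    while True:
        msg = buf.get()
        if msg is None:
            break
        time.sleep(0.01)    # slow consumer
        out.append(msg)

received = []
t = threading.Thread(target=server)
t.start()
client(received)
t.join()
```

So nothing is lost with a slow client; the server is simply throttled, and the effective "queue size" is bounded by the flow-control window rather than growing without limit.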

Heroku Scheduler for Discord bot to send daily message

I currently have a discord.js bot hosted on Heroku. I want to schedule a daily message, but Heroku Scheduler takes a single .js file to run a scheduled task. If I were to schedule it, I would have to copy the basic code from my index.js and create another instance of the bot to send the scheduled message. Is there a better way to schedule a Discord message on Heroku?
If you really want to keep it simple, you can use the Discord API directly, without discord.js. Take a look at the documentation for the Create Message endpoint. You can use a library like axios to make the request in your single scheduled .js file.
You can also use the @discordjs/rest package, which allows you to more easily make requests to the Discord API without having to spin up an entire gateway-connected bot.
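For illustration, here is the shape of the Create Message call, sketched in Python with the stdlib since it is just an HTTP POST (the channel ID and token are placeholders; in your scheduled .js file the equivalent axios call has the same URL, headers, and body):

```python
import json
import urllib.request

API_BASE = "https://discord.com/api/v10"

def build_message_request(channel_id, content, bot_token):
    # POST /channels/{channel.id}/messages with a Bot token.
    url = f"{API_BASE}/channels/{channel_id}/messages"
    data = json.dumps({"content": content}).encode()
    return urllib.request.Request(
        url,
        data=data,
        headers={
            "Authorization": f"Bot {bot_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# The scheduled job would then send it:
# urllib.request.urlopen(build_message_request(CHANNEL_ID, "Daily message!", TOKEN))
```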

How can I trigger spring batch execution from vuejs

I am trying to trigger a Spring Batch execution from an endpoint. I have implemented a service on the backend, so from Vue I am trying to make a call to that endpoint.
async trigger(data) {
  let response = await Axios.post('')
  console.log(response.data.message)
}
My backend service returns the response "Batch started" and does the execution in the background, since it is async, but does not respond again once the job has executed (I see the status only in the console). In such a scenario, how can I await the call from Vue until the service execution completes? I understand that the service sends no response once execution is complete/failed. Are there any changes I need to make, either on the backend or the frontend, to support this? Please let me know your thoughts.
It's like you said: the backend service is asynchronous, which means that once a line of code has been executed, it moves on to the next line. If no next line exists, the function exits, the script closes, and the server sends an empty response back to the frontend.
Your options are:
Implement a websocket that broadcasts back when the service has completed, and use that instead.
Use a timeout function to watch for a flag change within the service that indicates the service has finished its duties, or
Don't use an asynchronous service.
how can i await the call from vue for the service execution to complete
I would not recommend that, since the job may take too long to complete and you don't want your web client to wait that long for a reply. When configured with an asynchronous task executor, the job launcher immediately returns a JobExecution with an ID, which you can inspect later on.
Please check Running Jobs from within a Web Container in the documentation for more details and code examples.
My suggestion is that you should make the front-end query for the job status instead of waiting for the job to complete and respond because the job may take very long to complete.
Your API to trigger the job start should return the job ID, you can get the job ID in the JobExecution object. This object is returned when you call JobLauncher.run.
You then implement a Query API in your backend to get the status of the job by job ID. You can implement this using the Spring JobExplorer.
Your front-end can then call this Query API to get the job status. You should do this at an interval (e.g. every 30 seconds or 5 minutes, depending on your job). This will prevent your app from getting stuck waiting for the job, and avoid time-out errors.
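That polling loop can be sketched as follows. Here fetch_status stands in for the HTTP call to your (hypothetical) status endpoint backed by JobExplorer, and the terminal states mirror Spring Batch's BatchStatus values:

```python
import time

# Terminal BatchStatus values after which polling can stop.
TERMINAL = {"COMPLETED", "FAILED", "STOPPED", "ABANDONED"}

def poll_until_done(job_id, fetch_status, interval=1.0, max_polls=100):
    # fetch_status(job_id) is a stand-in for GET /jobs/{id}/status.
    for _ in range(max_polls):
        status = fetch_status(job_id)
        if status in TERMINAL:
            return status
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} did not finish")
```

In the real app the same loop lives in the Vue client (e.g. via setInterval), clearing the timer once a terminal status comes back.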

Best way to schedule one-time events in serverless environments

Example use case
Send the user a notification 2 hours after signup.
Options considered
setTimeout(() => { /* send notification */ }, 2*60*60*1000); is not an option in serverless environments since the function terminates after execution (so it has to be stateless).
CloudWatch events can schedule lambda invocations using cron expressions - but this was designed for repetitive invocations (there's a limit of 100 rules/region).
I have not seen scheduling options in AWS SNS/SQS or GCP Pub/Sub. Are there alternatives with scheduling?
I want to avoid (if possible) setting up a dedicated message broker (overkill) or stateful/non-serverless instance - is there a serverless way to do this?
I can queue the events in a database and invoke a lambda function every minute to poll the database for events to execute in that minute... is there a more elegant solution?
Use AWS Step Functions; they are serverless workflows that don't have the 15-minute limit AWS Lambda does. You can design a workflow in Step Functions that integrates with API Gateway, Lambda, and SNS to send email and text notifications as follows:
Create a REST API via API gateway that will invoke a Lambda function passing in for example, the destination address (email, phone #) of the SNS notification, when it should be sent, notification method (e.g. email, text, etc.).
The Lambda function on invocation will invoke the Step function passing in the data (Lambda is needed because API Gateway currently can't invoke Step functions directly).
The Step function is basically a workflow, you can define states for waiting (like waiting for the specified time to send the notification e.g. 30 seconds), and states for invoking other Lambda functions that can use SNS to send out an email and/or text notifications.
A rudimentary example is provided by AWS with their Task Timer example.
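For reference, a minimal sketch of what such a state machine definition could look like in Amazon States Language (the state names, the sendAt input field, and the Lambda ARN are all illustrative):

```json
{
  "Comment": "Wait until the scheduled time, then invoke the notifier",
  "StartAt": "WaitForSendTime",
  "States": {
    "WaitForSendTime": {
      "Type": "Wait",
      "TimestampPath": "$.sendAt",
      "Next": "SendNotification"
    },
    "SendNotification": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:123456789012:function:send-notification",
      "End": true
    }
  }
}
```

The Wait state with TimestampPath is what makes this suit one-time events: each execution carries its own target time as input, so no cron rule is consumed per scheduled notification.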
Features for this are coming to GCP, but not very soon. So, today, the solution is to poll a database.
You can do that with Datastore/Firestore with the execution datetime indexed (to avoid reading all the documents each minute). But be careful of traffic spikes: you could create a hotspot.
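The polling approach can be sketched with an in-memory stand-in for the indexed collection (in Datastore/Firestore the filter below would be a single range query on the indexed due timestamp, not a scan):

```python
# Stand-in for a collection of (due_ts, payload) documents.
events = []

def schedule(due_ts, payload):
    # In the real system this writes a document with an indexed due_ts.
    events.append((due_ts, payload))

def poll_due(now):
    # Invoked every minute (e.g. by a scheduled function): fetch and
    # remove everything whose due timestamp has passed, then act on it.
    due = [e for e in events if e[0] <= now]
    for e in due:
        events.remove(e)
    return [payload for _, payload in due]
```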
You can use Cloud Scheduler on Google Cloud Platform. As stated in the official documentation:
Cloud Scheduler is a fully managed enterprise-grade cron job scheduler. It allows you to schedule virtually any job, including batch, big data jobs, cloud infrastructure operations, and more. You can automate everything, including retries in case of failure to reduce manual toil and intervention. Cloud Scheduler even acts as a single pane of glass, allowing you to manage all your automation tasks from one place.
Here you can check a quickstart for using it with Pub/Sub and Cloud Functions.

Microservice and RabbitMQ

I am new to Microservices and have a question with RabbitMQ / EasyNetQ.
I am sending messages from one microservice to another microservice.
Each microservice is a Web API. I am using CQRS, where my command handler consumes messages off the queue and does some business logic. In order to call the handler, it needs to make a request to the API method.
I would like the message-consuming code to run without having to explicitly call the API endpoint. Is there an automated way of doing this?
One suggestion could be to create a separate solution, a Console App, that connects to RabbitMQ and starts listening: a while loop reads messages, then calls the Web API endpoint to handle the business logic every time a new message arrives on the queue.
My aim is to create a listener or a startup task that, once messages are in the queue, will automatically pick them up and continue with the command handler, but I am not sure how to do this "automatic" way as I describe it. I was thinking of utilising an Azure WebJob that will be continuously running and act as the consumer.
Looking for a good architectural way of doing it.
Programming language being used is C#
Much Appreciated
The recommended way of hosting a RabbitMQ subscriber is to write a Windows service using something like the Topshelf library, and subscribe to bus events inside that service on its start. We did that in multiple projects with no issues.
If you are using Azure, the best place to host RabbitMQ subscriber is in a "Worker Role".
I am using CQRS where my Command Handler would consume message off the Queue and do some business logic. In order to call the handler, it will need to make a request to the API method.
Are you sure this is real CQRS? CQRS occurs when you handle queries and commands differently in your domain logic. Receiving a message via a class called CommandHandler and just reacting to it is not yet CQRS.
My aim is to create a listener or a startup task where once messages are in the queue it will automatically pick it up from the Queue and continue with command handler but not sure how to do the "Automatic" way as i describe it. I was thinking to utilise Azure Webjob that will continuously be running and it will act as the Consumer. Looking for a good architectural way of doing it.
The easier you do that, the better. Don't go searching for complex solutions until you have tried all the simple ones. When I was implementing something similar, I just ran a pool of message-handler scripts using Linux cron. A handler popped a message off the queue, processed it, and terminated. Simple.
I think that using the CQRS pattern you will have events as well, and corresponding event handlers. As you are using RabbitMQ for asynchronous communication between command and query, any message put on a specific channel on RabbitMQ can be listened for by a callback method.
Receiving messages from the queue is more complex. It works by subscribing a callback function to a queue. Whenever we receive a message, this callback function is called by the Pika library.
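As a stdlib sketch of that subscribe-a-callback shape (a stand-in queue instead of a real broker; with Pika you would pass the callback to basic_consume and call start_consuming in a long-lived host process such as a Windows service or WebJob):

```python
import queue
import threading

# Stand-in broker queue and a record of what the handler did.
messages = queue.Queue()
handled = []

def command_handler(body):
    # Stand-in for your CQRS command handler's business logic.
    handled.append(f"handled:{body}")

def consume(q, callback):
    # Long-running consumer loop: blocks until a message arrives and
    # invokes the callback -- no API endpoint needs to be called.
    while True:
        body = q.get()
        if body is None:      # shutdown sentinel
            break
        callback(body)

worker = threading.Thread(target=consume, args=(messages, command_handler))
worker.start()
messages.put("CreateOrder")
messages.put(None)
worker.join()
```

The point is that the consumer is its own always-running process hosting the loop; publishing to the queue is the only trigger.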