How can I manage background tasks in FastAPI? - multiprocessing

I am creating a FastAPI application that collects user data in the background 24/7 and saves it to the DB. The API communicates with the data collector, and I want it to be able to use multiple processes.
I was wondering:
How would I go about implementing something like this in a multi-worker environment? I don't want the data collection task and the Discord bot to run in every process, but I also need to access the data throughout the API.
I learned that FastAPI has a BackgroundTask functionality, but that doesn't suit my need either.
Do I have to use some kind of inter-process communication? If so, what do you recommend?
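One pattern that fits what's described here, sketched under assumptions (Redis as the shared store, uvicorn as the server; key names and intervals are placeholders): run the collector exactly once as its own OS process, and let every API worker read its output from the store.

```python
# collector.py -- run once, as its own OS process (e.g. via systemd),
# so the 24/7 collection loop doesn't start in every uvicorn worker.
import json
import time

import redis

r = redis.Redis(host="localhost", port=6379)

def collect_once() -> dict:
    # placeholder for the real data collection
    return {"ts": time.time(), "value": 42}

while True:
    r.set("latest_sample", json.dumps(collect_once()))
    time.sleep(5)
```

```python
# api.py -- start with e.g. `uvicorn api:app --workers 4`; every worker
# only reads the shared store, so no worker owns the collector.
import json

import redis
from fastapi import FastAPI

app = FastAPI()
r = redis.Redis(host="localhost", port=6379)

@app.get("/latest")
def latest():
    raw = r.get("latest_sample")
    return json.loads(raw) if raw else {}
```

A multiprocessing.Manager, a database table, or a message queue would serve the same role; the point is that the collector (and the Discord bot) exist exactly once, while the stateless API workers only read.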

Related

Where are the chat messages stored in socket.io

Hi,
I was in the middle of developing a PHP chat that stores the messages and user sessions in text files when I learned about socket.io, and I would like to switch to this method instead because it seems to save CPU usage on my server (currently I have to do GET requests on the PHP files to load messages and users every 2 seconds).
My question is: are the messages stored somewhere with socket.io? How about the user sessions with their data such as username and profiles? I get all that from a database via PHP as of now and then store it in a temporary text session file.
Thanks.
are the messages stored somewhere with socket.io?
No. Socket.io is just a WebSocket library. As the application developer, it's your responsibility to use socket.io to receive incoming messages and run your own code that loads or generates responses, and vice versa. So asking that question is like asking "are the messages stored somewhere with PHP?".
How about the user sessions with their data such as username and profiles?
Again, socket.io is just a library for using WebSockets. It does not have any user or profile management features built in. It's your responsibility to build or integrate that into your NodeJS code, which then uses socket.io.
I get all that from a database via PHP as of now and then store it in a temporary text session file.
You'll need to keep on doing just that (though you should also store the chat messages in your database, as text files won't scale to more than a handful of concurrent users).
Note that unlike PHP websites, NodeJS applications do have semi-persistent, in-memory state (so you can load data from your database and use it in-between requests instead of reloading everything for each new page request or websocket message), but remember that that state will be lost when the NodeJS server shuts down, crashes, or is killed by your web host's management infrastructure. Also, when you eventually scale to multiple server instances, each instance is isolated from the others, so their in-memory state won't be shared. If you absolutely need to share some kind of low-latency (but not in-memory) state, consider using something like memcached or Redis.
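To make the "your own code does the persisting" point concrete, here is a minimal sketch. It uses the python-socketio package instead of Node purely to keep the examples on this page in one language, and SQLite as a stand-in for the real database; the event and table names are made up.

```python
# chat_server.py -- run with: uvicorn chat_server:app
import sqlite3

import socketio

sio = socketio.AsyncServer(async_mode="asgi")
app = socketio.ASGIApp(sio)

# SQLite stands in for the real database discussed above
conn = sqlite3.connect("chat.db")
conn.execute("CREATE TABLE IF NOT EXISTS messages (user TEXT, text TEXT)")

@sio.event
async def chat_message(sid, data):
    # socket.io only delivered the message; storing it is our job
    conn.execute("INSERT INTO messages VALUES (?, ?)",
                 (data["user"], data["text"]))
    conn.commit()
    await sio.emit("chat_message", data)  # relay to connected clients
```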

Worker Service in a Microservice Architecture

Soon I'll start a project based on a Microservice Architecture, and one of the components I need to develop is a Worker Service (or Daemon).
I have some conceptual questions about this.
I need to create a worker service that sends emails and SMS. This worker service needs data to send these emails. I also need to create a microservice that allows users to create the lists of emails to be sent by this worker service. But both of them need to consume data from the same database.
Should my worker service consume a microservice resource to get the data, or is it OK for the worker service to have a connection to the same database as my microservice?
Or is it best that my worker service also has the API endpoints to let users create new lists of emails, add or modify configuration, and all the other functionality I need to implement? That sounds like a good idea, but I'd end up with a component that has two responsibilities, so I have some doubts about that.
Thanks in advance.
Two microservices sharing a connection to the same database is usually a bad idea, because each service should be the owner of its own data model and no one else should access it directly. If a service needs data from another service's domain, it should get it by calling the owner via an API, or by replicating the model in a read-only way in its own database and updating it through events, for example.
However, I think that for your current use case the best option is to provide the worker with all the information it needs to send an email (address, subject, body, attached files...), so the worker's only responsibility is to send emails, not to fetch the information.
It could also provide the functionality to send emails in batches. In the end, the service's responsibility is still only one, "to send emails", but it can provide different ways to do it (single emails, batches, with attached files, etc.).
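A minimal sketch of that single-responsibility worker, assuming Python and plain smtplib (broker wiring such as RabbitMQ or SQS is left out): each job message already carries everything needed, so the worker never reads another service's database.

```python
# email_worker.py -- jobs arrive fully self-contained
import smtplib
from dataclasses import dataclass, field
from email.message import EmailMessage

@dataclass
class EmailJob:
    to: str
    subject: str
    body: str
    attachments: list = field(default_factory=list)  # (filename, bytes) pairs

def send(job: EmailJob, host: str = "localhost") -> None:
    msg = EmailMessage()
    msg["To"] = job.to
    msg["From"] = "noreply@example.com"  # placeholder sender
    msg["Subject"] = job.subject
    msg.set_content(job.body)
    for name, blob in job.attachments:
        msg.add_attachment(blob, maintype="application",
                           subtype="octet-stream", filename=name)
    with smtplib.SMTP(host) as smtp:
        smtp.send_message(msg)

# A batch is then just a list of self-contained jobs:
#   for job in batch: send(job)
```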

Laravel best practices listening to JSON data from external API

I have a use case of building an automation tool that needs to get data from an external API and notify a specific user.
I'm thinking about using a cron job to fetch the API every 2 minutes and trigger an event to notify the user.
Is there any alternative approach, something that listens to an API?
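Polling on a schedule is a reasonable baseline when the external API offers no webhooks or push channel. In Laravel the idiomatic home for this is the task scheduler; the sketch below shows the same poll-and-notify pattern in Python purely as a language-neutral illustration (the URL and notify() are placeholders).

```python
# poll_notify.py -- fetch every 2 minutes, notify only on change
import time

import requests

API_URL = "https://example.com/external-api"  # placeholder
last_seen = None

def notify(payload: dict) -> None:
    print("notify user:", payload)  # swap in mail/SMS/push

while True:
    data = requests.get(API_URL, timeout=10).json()
    if data != last_seen:  # only raise the event when something changed
        notify(data)
        last_seen = data
    time.sleep(120)
```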

Store serviceAccountKey.json file in third party server

I have an Android app which gets its data from a Firebase Realtime Database. To update the Realtime Database automatically, I've written a Python script that crawls data from a website and processes it, then sends the data to my Firebase Realtime Database using the Admin SDK. I want to store and execute the script on my server, so that it runs automatically twice a day. Is it safe to upload my serviceAccountKey.json along with it? If not, how can I achieve my desired functionality?
Yes, it is fine to store the service account JSON file on your own server; that's the intended use case. Just make sure it's not exposed to users in any way.
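For reference, a minimal sketch of that setup with the firebase_admin package; the key path, database URL, and payload below are placeholders. Keeping the key outside the web root, readable only by the script's user, and out of version control is the "not exposed in any way" part.

```python
# update_db.py -- runs twice a day (e.g. from cron) on your server
import firebase_admin
from firebase_admin import credentials, db

cred = credentials.Certificate("/srv/keys/serviceAccountKey.json")
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://your-project-id-default-rtdb.firebaseio.com",
})

def push(data: dict) -> None:
    # overwrite the node this crawler owns
    db.reference("/crawled").set(data)

if __name__ == "__main__":
    push({"updated": True})  # placeholder for the crawled payload
```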

How do AJAX requests work exactly?

I had in mind making a website that pulls data from Steam statistics to show how many players are currently playing a specific game. Is this possible, or does it have to be an actual API?
If you're planning on this website only ever being a platform that displays the information you're pulling from the Steam API when users access your site, then no, you don't necessarily have to create a public-facing API.
However, if you're creating this application with the idea that other applications are going to programmatically retrieve the data you're displaying on the site, then yes, you should create an API that allows other applications to consume your data.
The best way to do the latter would be to create an independent service that consumes data from the Steam API and transforms it, then makes it available publicly (with authentication if deemed necessary). This API would then be consumed by a client that you create (your website that displays the data) as well as any other application that could make use of the data.
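A sketch of that independent service, assuming FastAPI for the wrapper: it proxies Steam's public GetNumberOfCurrentPlayers endpoint and re-exposes the count. Caching, authentication, and error handling are omitted, and the response shape should be checked against Valve's docs.

```python
# steam_proxy.py -- run with: uvicorn steam_proxy:app
import requests
from fastapi import FastAPI

app = FastAPI()
STEAM_URL = ("https://api.steampowered.com/ISteamUserStats/"
             "GetNumberOfCurrentPlayers/v1/")

@app.get("/players/{appid}")
def players(appid: int):
    resp = requests.get(STEAM_URL, params={"appid": appid}, timeout=10)
    resp.raise_for_status()
    count = resp.json()["response"].get("player_count")
    return {"appid": appid, "player_count": count}
```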

Resources