RTK Query: fetch data every X minutes and add it to stored data - react-redux

I have a "live graph" with many data points.
I want to use rtk-query to fetch the data from time A to now and store it.
Then, every X minutes, I want to call the API again in order to fetch the data from the last time point to now.
Two questions:
How do I trigger the call every X minutes?
How do I add to the store and not totally invalidate it?

If you want to call the API every X minutes, you can implement it with a polling interval.
The same could also be implemented with socket.io polling or with a plain WebSocket connection. If you do not want to cache the data, you can simply create a Redux middleware for socket.io - check my code. Or, if it's necessary to cache WebSocket data, you can use the RTK Query onCacheEntryAdded function -
here is a link.
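To make the second part concrete, here is a minimal sketch of the "append instead of replace" idea. The merge step is written as a pure function; the point shape (`{ t, v }`) and the endpoint name in the comments are assumptions for illustration, not taken from the question.

```typescript
// Assumed data-point shape: timestamp t and value v, sorted by t.
type Point = { t: number; v: number };

// Append only the points newer than the last stored timestamp,
// so each refetch extends the cached series instead of replacing it.
function mergePoints(existing: Point[], incoming: Point[]): Point[] {
  const lastT = existing.length > 0 ? existing[existing.length - 1].t : -Infinity;
  return existing.concat(incoming.filter((p) => p.t > lastT));
}

// In an RTK Query endpoint this would plug into the `merge` option, with
// polling requested at the hook call site (endpoint name is hypothetical):
//
//   getPoints: builder.query<Point[], { from: number }>({
//     query: ({ from }) => `points?from=${from}`,
//     serializeQueryArgs: ({ endpointName }) => endpointName, // one cache entry
//     merge: (cached, fresh) => mergePoints(cached, fresh),
//   })
//
//   useGetPointsQuery({ from: lastT }, { pollingInterval: 5 * 60 * 1000 })
```

With `merge` defined, refetches update the existing cache entry rather than discarding it, which answers the "not totally invalidate it" half of the question.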


How to stop ABP from querying the empty AbpUserOrganizationUnits table

We use ABP with an MSSQL database hosted on Azure.
As part of a cost-optimization effort, we need to make as few requests to the DB as possible.
During an investigation I found out that ABP makes around 50 million requests per month to the AbpUserOrganizationUnits table, and we don't use this table at all.
I would like to disable all calls to this table.
Making 50 million requests to the database just to receive an empty set is not what a normal product should do.
I would like to know if there is a way to stop these requests, redirect them to a Redis cache, or even stop them inside the API.

GraphQL: how to read values with a delay using the useSubscription hook

I have created a GraphQL realtime server and a client that reads values using the useSubscription hook. Can anyone tell me how to create a delay of a few seconds when reading the values?
Right now on the server side, I have created a change stream on a MongoDB collection, and whenever an object is inserted into that collection it pushes the data to the GraphQL subscription and the client receives it. The problem is that the process entering the data inserts many values per second, and I need just the latest one, roughly one value every 5 seconds. So I need to create some sort of delay, either on the server side or the client side.
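One client-side option is to keep only the most recent subscription value and surface it once per interval. A minimal sketch, with names that are illustrative rather than part of any GraphQL client API:

```typescript
// Holds the newest value pushed since the last take().
class LatestSampler<T> {
  private latest: T | undefined;
  private dirty = false;

  // Call on every incoming subscription message.
  push(value: T): void {
    this.latest = value;
    this.dirty = true;
  }

  // Call from a timer every N ms; returns the newest value since the
  // last take, or undefined if nothing new arrived.
  take(): T | undefined {
    if (!this.dirty) return undefined;
    this.dirty = false;
    return this.latest;
  }
}

// Hypothetical wiring with useSubscription:
//   const sampler = new LatestSampler<Data>();
//   useSubscription(DOC, { onData: ({ data }) => sampler.push(data.data) });
//   setInterval(() => { const v = sampler.take(); if (v) setDisplayed(v); }, 5000);
```

This way the UI updates at most once every 5 seconds with the latest value, no matter how fast the change stream fires.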

Apollo Client v3 Delete cache entries after given time period

I am wondering if there is a way to expire cached items after a certain time period, e.g., 24 hours.
I know that Apollo Client v3 provides methods such as cache.evict and cache.gc which are a good start and I am already using; however, I want a way to delete cache items after a given time period.
What I am doing at the minute is adding a TimeToLive field to every object in my Apollo schema; when the backend returns an object, the field is populated with the current time + 24 hours (i.e. the time in 24 hours' time). Then when I query the data in the front end, I check whether the TimeToLive field of the returned data is in the future (if not, that means the data was definitely retrieved from the cache), in which case I call the refetch function, which forces the query to fetch the data from the server. However, this doesn't seem like the best way to do things, mainly because I have to iterate over every result in the returned data and check if any of the returned objects are expired; and if so, everything is refetched.
Another solution I thought of was to use something like React Native Queue and have a background task that periodically checks the cache and deleted items that have expired. But again, I am not totally sold on this solution.
For a little bit of context: I am building a cooking / recipes app, and recipes / posts are cached on the device. My concern is that a user could delete a post, but everyone else who has that post cached would still be able to see it; by expiring the cached item, they would at least only be able to see it for a number of hours before it is removed. However, there might be a better way to do this altogether, i.e. have the server contact clients that hold the cached item (though I couldn't think of any low-lift solutions at the time of writing this).
apollo-invalidation-policies replaces the Apollo Client InMemoryCache with InvalidationPolicyCache, and within the typePolicies you can specify a timeToLive field. If an object is accessed beyond its TTL, it is evicted and no data is returned.
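The underlying TTL-on-read idea is simple enough to sketch standalone. This is not Apollo's API, just a minimal illustration of "evict on access when past the deadline"; the explicit `now` parameter makes the behavior easy to reason about:

```typescript
// Minimal sketch of time-to-live eviction: entries store an expiry
// deadline, and reads past the deadline evict and return nothing.
class TtlCache<K, V> {
  private store = new Map<K, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  set(key: K, value: V, now: number = Date.now()): void {
    this.store.set(key, { value, expiresAt: now + this.ttlMs });
  }

  // Returns undefined (and evicts the entry) once past its TTL.
  get(key: K, now: number = Date.now()): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (now >= entry.expiresAt) {
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }
}
```

Evicting lazily on access, as apollo-invalidation-policies does, avoids the background-task approach mentioned in the question: nothing needs to scan the cache periodically.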

faster large data exports in laravel to avoid timeouts

I need to generate a report from the database with thousands of records. The report is generated on a monthly basis, and at times the user might want a report spanning three months. As per the current records, a single month's data set can already reach about 5,000 rows.
I am currently using vue-excel, which makes an API call to the Laravel API; the API returns the resource, which vue-excel then exports. The resource does not only return the model data; there are related data sets I also need to fetch.
This works fine for smaller data sets, i.e. when I am fetching around 3,000 records, but for anything larger the server times out.
I have also tried Laravel Excel with the query concern; I timed them both and they take the same amount of time, because Laravel Excel was also mapping to fetch the relations.
So basically, my question is: is there a better way to do this, so as to get the data faster and avoid the timeouts?
Just put this at the start of the function:
ini_set('max_execution_time', 84000); // 84000 is in seconds
This will override Laravel's built-in maximum script runtime.
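Raising the time limit only delays the problem as data grows. An alternative is to fetch the report in pages from the client so no single request runs long. A sketch, where the page-fetching callback and its endpoint are hypothetical and would be wired to whatever paginated route the Laravel API exposes:

```typescript
// Accumulate a full report by requesting one page at a time; an empty
// page signals the end. fetchPage is any paginated API call.
async function fetchAllPages<T>(
  fetchPage: (page: number) => Promise<T[]>,
  maxPages = 1000 // safety bound against a misbehaving endpoint
): Promise<T[]> {
  const rows: T[] = [];
  for (let page = 1; page <= maxPages; page++) {
    const batch = await fetchPage(page);
    if (batch.length === 0) break; // no more data
    rows.push(...batch);
  }
  return rows;
}
```

Each request then stays well under the server's time limit, and on the Laravel side the per-page query can eager-load the relations to avoid the N+1 mapping cost mentioned above.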

What is the best strategy to sync data between a DB and a Redis cache?

We are using an Oracle DB and would like to use a Redis caching mechanism. We add some subset of the DB data to the cache. Does it sync with the DB automatically when the data changes in the DB, or will we have to implement a sync strategy? If so, what is the best way to do it?
does it sync with DB automatically when there is a change in the data in DB
No, it doesn't.
we will have to implement the sync strategy, if yes, what is the best way to do it.
This will depend on your particular case. Caches are usually synced in two common ways:
Data cached with expiration. Once cached data has expired, a background process adds fresh data to the cache, and so on. Usually different data is refreshed at different intervals: 10 minutes, 1 hour, every day...
Data cached on demand. When a user requests some data, the request goes through the non-cached path and stores the result in the cache, and a limited number of subsequent requests read the cached data directly while the cache is available. In terms of cache invalidation interval, this approach can fall back on #1 as well.
Now I believe that you've enough details to think about what could be your best strategy in your particular case!
In addition to what mathias wrote, you can look at the problem from a dynamic/static perspective:
Real-time approach: each time a process changes the DB data, you dispatch an event or a message to a queue, where a worker handles the corresponding indexing of the cache. Some might even implement it as a DB trigger (which I don't like).
Static/delayed approach: once a day/hour/minute, depending on your needs, a process does a batch indexing of the whole DB data into the cache.
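The "cached on demand" strategy above is the classic cache-aside pattern. A minimal sketch, where the `Cache` interface stands in for a Redis client (get/set with a TTL) and `loadFromDb` stands in for the Oracle query; all names here are illustrative:

```typescript
// Stand-in for a Redis client: get a key, or set it with a TTL in seconds.
interface Cache {
  get(key: string): string | undefined;
  set(key: string, value: string, ttlSec: number): void;
}

// Cache-aside read: hit the cache first, fall back to the DB on a miss,
// and populate the cache so subsequent reads skip the DB until the TTL.
function getWithCacheAside(
  cache: Cache,
  key: string,
  ttlSec: number,
  loadFromDb: () => string
): string {
  const cached = cache.get(key);
  if (cached !== undefined) return cached; // cache hit: no DB round trip
  const fresh = loadFromDb(); // cache miss: query Oracle
  cache.set(key, fresh, ttlSec); // next readers hit the cache
  return fresh;
}
```

The TTL bounds how stale cached data can get, which is how this on-demand approach falls back on expiration-based invalidation.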
