DynamoDB delete trigger OldImage - aws-lambda

I would like to do some cleanup after a record has been deleted in my DynamoDB table. It would be pretty great if I could use triggers to do this. Unfortunately, it seems that OldImage is not passed into "REMOVE" events. The problem is that I need some record attributes other than the keys in order to perform my cleanup, and I can't read the record anymore once the event has fired. Is there any other way to read the attributes of a deleted record from within a trigger?

Change the DynamoDB stream's view type to include new and old images (StreamViewType: NEW_AND_OLD_IMAGES). "REMOVE" events will then carry the deleted item's attributes in OldImage.
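As a minimal sketch with boto3 (the table name here is hypothetical), you can switch an existing table's stream to carry both images:

import boto3

dynamodb = boto3.client("dynamodb")

# Enable a stream that carries both the old and new item images.
# Note: if a stream is already enabled with a different view type,
# you must disable it first and then re-enable it with the new type.
dynamodb.update_table(
    TableName="my-table",  # hypothetical table name
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)

Once the stream includes old images, the Lambda handler can read them on deletes:

def handler(event, context):
    for record in event["Records"]:
        if record["eventName"] == "REMOVE":
            # Full pre-delete attributes, in DynamoDB JSON format.
            old_item = record["dynamodb"]["OldImage"]
            # ...perform cleanup using old_item...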

Related

Create entry in related table when inserting an item into Hasura?

I have two tables; one has a foreign key to the other. For ease of use, let's say I have a table named 'Boy' with a foreign key iceCreamId to a table called 'IceCream'.
In Hasura, the only way I can see to create an entry in IceCream when I insert into Boy is through the query itself.
Is there a way to trigger a default IceCream insert from the backend when a 'Boy' is inserted? I don't like relying on the frontend to do this.
Hasura builds on top of the functionality of your database and works best when you embrace everything the underlying DB has to offer.
This can easily be accomplished using a database trigger (I'm assuming you're using Postgres here).
Triggers let you run additional logic in the database itself, either for validation or to perform creates, updates, and deletes when records are inserted, updated, or deleted.
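A minimal sketch of that approach, assuming Postgres with the psycopg2 driver and the table/column names from the question (the 'flavor' column, the default value, and the connection string are made up):

import psycopg2

# Install a BEFORE INSERT trigger on "Boy" that creates a default
# "IceCream" row and points the new Boy's foreign key at it.
DDL = """
CREATE OR REPLACE FUNCTION create_default_ice_cream() RETURNS trigger AS $func$
BEGIN
    IF NEW."iceCreamId" IS NULL THEN
        INSERT INTO "IceCream" (flavor)   -- 'flavor' is a made-up column
        VALUES ('vanilla')
        RETURNING id INTO NEW."iceCreamId";
    END IF;
    RETURN NEW;
END;
$func$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS boy_default_ice_cream ON "Boy";
CREATE TRIGGER boy_default_ice_cream
    BEFORE INSERT ON "Boy"
    FOR EACH ROW EXECUTE FUNCTION create_default_ice_cream();  -- PG 11+; EXECUTE PROCEDURE on older versions
"""

conn = psycopg2.connect("dbname=app")  # hypothetical connection string
with conn, conn.cursor() as cur:
    cur.execute(DDL)
conn.close()

With this in place, every insert into "Boy" gets a linked IceCream row without the frontend doing anything.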
You can simply use Hasura event triggers
Hasura can be used to create event triggers on tables in the Postgres database. Event triggers reliably capture events on specified tables and invoke webhooks to carry out any custom logic.
So, in your case, you can set up a simple event trigger on create (insert) of the Boy table, and then create the IceCream row in your webhook.
One thing that's even cooler about event triggers is that you can manually invoke/retry the same operation in case of failure!
For more info: Hasura Event Triggers
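A minimal sketch of such a webhook, assuming Flask and psycopg2 (the route, the 'flavor' column, and the connection string are made up; Hasura POSTs a JSON payload carrying the operation and the new row):

from flask import Flask, jsonify, request
import psycopg2

app = Flask(__name__)

@app.route("/boy-created", methods=["POST"])  # hypothetical route
def boy_created():
    payload = request.get_json()
    event = payload["event"]
    if event["op"] != "INSERT":        # only react to newly created Boys
        return jsonify(ok=True)
    new_boy = event["data"]["new"]     # the freshly inserted Boy row
    conn = psycopg2.connect("dbname=app")  # hypothetical connection string
    with conn, conn.cursor() as cur:
        # Create the default IceCream and link the Boy to it.
        cur.execute(
            'INSERT INTO "IceCream" (flavor) VALUES (%s) RETURNING id',
            ("vanilla",),
        )
        ice_cream_id = cur.fetchone()[0]
        cur.execute(
            'UPDATE "Boy" SET "iceCreamId" = %s WHERE id = %s',
            (ice_cream_id, new_boy["id"]),
        )
    conn.close()
    return jsonify(ok=True)

If the event trigger also listens for updates, guard against the UPDATE issued here re-invoking the webhook.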

How to prevent overwrites/duplicates in dynamoDB from triggering lambda

I have a DynamoDB table that is getting data written to it from a Java app:
mapper.save(row, DynamoDBMapperConfig.builder().withSaveBehavior(SaveBehavior.CLOBBER).build());
I want a Lambda to trigger off new items in the DDB so that their keys can be put onto SNS. However, if an item in DynamoDB is being overwritten (i.e., we received duplicate data), we do NOT want to do anything with it.
How do I handle that? I control both the Lambda and the code writing to DDB, but not the source of the data.
I don't think we can prevent the Lambda from being triggered when an item is overwritten in DynamoDB. But once the Lambda is triggered, we can identify whether it's a new record or an existing one.
The input to the Lambda function is a DynamoDB stream event whose records contain an OldImage attribute (provided the stream is configured to include old images). If OldImage is present, an existing record was modified, and the Lambda can simply return without doing any processing.
Each record also contains the full snapshots from before and after the write in the OldImage and NewImage attributes, so we can additionally compare attribute values to decide whether the write was a real change or just an overwrite.
You need an IF, CASE, or similar check on the stream record's eventName: if it is INSERT (a new item), run your code; if it is MODIFY (an existing item was overwritten), skip it. There is an example in the DynamoDB documentation.
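A minimal sketch of that check in a Python Lambda handler (the SNS topic ARN is made up):

import json

import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:new-items"  # hypothetical

def handler(event, context):
    for record in event["Records"]:
        # INSERT = brand-new item; MODIFY = an existing item was overwritten.
        if record["eventName"] != "INSERT":
            continue
        keys = record["dynamodb"]["Keys"]  # still in DynamoDB JSON format
        sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps(keys))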

Algolia Update Index On Relational Database change with Laravel Scout

I have implemented Algolia for my Movies table, with Actors as a related table, and it works fine.
Problem:
When I update any movie, it also updates the Algolia index (that's good). But how can I update the index when I make a change in the related table (for example, updating an actor of a movie)?
How can I push a specific record manually with Laravel Scout?
Thanks
The issue lies in Laravel's events. What's happening is that Scout listens for an 'updated' event, which only fires in Laravel when the model object is saved and is dirty (i.e., its values differ from those in the DB).
There are two ways you can do this.
The bad, lazy way is simply to call ->touch() on the model prior to saving; this forces the updated_at field to change and ultimately fires the updated event. This is bad because you waste a DB query.
The second and preferable way is to register another observer on 'saved', which fires regardless of whether the object is dirty. You'll likely want either to check whether the model is dirty and only index when it isn't (to prevent double indexing from the updated event), or to de-register the 'updated' listener that ships with Scout.

How to perform recaching using memcached?

I am using memcached to cache a table's records. My code is as follows:
@custom_profiles = Rails.cache.fetch('custom_profiles') do
  CustomProfile.where(:status => "Active").to_a  # to_a forces the query so the results, not the lazy relation, get cached
end
But whenever a new record is added, the cache is not updated. I want to refresh this value whenever the table is edited. If you have any ideas about this, please share.
You can delete the item from the cache like this:
Rails.cache.delete('custom_profiles')
You might want to do this in a model callback (e.g., after_save / after_destroy) or an observer.
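The same invalidation pattern outside Rails, sketched with pymemcache (the cache key and the save function are illustrative only):

from pymemcache.client.base import Client

cache = Client(("localhost", 11211))

def save_custom_profile(profile):
    profile.save()  # hypothetical persistence call
    # Drop the stale cached list; the next fetch repopulates it from the DB.
    cache.delete("custom_profiles")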

LINQ context SubmitChanges

Regarding the SubmitChanges order (Insert, Update, Delete): is there a way to change that order? I need the deletes executed first, then any updates, then any new inserts. I have a datagrid where the user can add, change, and delete rows and then submit. Since each grid row must have a unique item chosen in it (via a dropdown), the user can delete a row and then try to use the deleted dropdown item in a new row; the submit then fails, because the item the user wants to delete still exists in the database when the inserts run first. Is there a setting that lets me control the update order, or do I have to perform the updates myself?
I have not tried this, but you could consider the following. First, get the change set using DataContext.GetChangeSet(). Then run through ChangeSet.Deletes, calling Table<T>.DeleteOnSubmit on a new instance of your DataContext. Rinse and repeat for ChangeSet.Updates and ChangeSet.Inserts.
Good luck.
I don't believe it's possible to do this. You would have to process the changes in the order you want, and call SubmitChanges() after each insert, update or delete. If you want the whole thing within the scope of a transaction, use the TransactionScope object.
