I have a class (a protobuf message), OrderChange, that represents when an order (imagine Amazon.com) changes:
message OrderChange {
  Order old_order = 1;
  Order new_order = 2;
}

message Order {
  OrderType order_type = 1;
  OrderCategory order_category = 2;
  OrderStatus order_status = 3;
  // many more fields
}

enum OrderType {
  ORDER_TYPE_RETAIL = 0;
  ORDER_TYPE_BUSINESS = 1;
}

enum OrderCategory {
  ORDER_CATEGORY_ELECTRONICS = 0;
  ORDER_CATEGORY_FOOD = 1;
  ORDER_CATEGORY_FURNITURE = 2;
  ORDER_CATEGORY_FITNESS = 3;
  ORDER_CATEGORY_HOUSEHOLD = 4;
}

enum OrderStatus {
  ORDER_STATUS_PAID = 0;
  ORDER_STATUS_SHIPPED = 1;
  ORDER_STATUS_DELIVERED = 2;
}
For each OrderChange object, I want to trigger some code for each matching combination of conditions. For example, if the Order is RETAIL, FURNITURE, and PAID, I want to send a specific email.
OrderType and OrderCategory probably won't change between old_order and new_order, so my code can look at only new_order for these fields. Other fields, such as OrderStatus, will change, and my code can compare old_order and new_order to know what changed.
My problem is that Order has many fields, each with many values, so the number of conditional combinations is huge. Using only if/else or switch/case would be unmaintainable code.
So my question is, what pattern can I use to make these conditionals more maintainable?
Maybe I can break the fields into handlers: a handler for each OrderType, where each of these handlers contains a list of handlers for each OrderCategory, and so on, nesting one field per level until there are no more fields. My issue is that, beyond OrderType perhaps being the highest-level field, there is no clear hierarchical relationship among the other fields. As in, it's not clear whether OrderCategory handlers should contain OrderStatus handlers or the other way around.
How should I design this?
Is the intent that only one action should be triggered? Say you have actions that should run on RETAIL + PAID and on RETAIL + FURNITURE + PAID; these clearly compete with each other, and you would need some kind of priority system to ensure that only one of them handles the change. Something like Chain of Responsibility could be used, where you register the handlers in priority order. If, on the other hand, you want all eligible handlers to process it, then you can simply iterate over all registered change handlers and pass each one the change notification, which is effectively just the Observer pattern.
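For example, here is a minimal C# sketch of both dispatch styles (the dispatcher type and the registration example are illustrative, not from your codebase; the enum member names in the usage comment assume protobuf's standard C# codegen, which strips the ORDER_TYPE_ style prefixes):

using System;
using System.Collections.Generic;
using System.Linq;

public class OrderChangeDispatcher
{
    // Each handler is a predicate plus an action; registration order is priority order.
    private readonly List<(Func<OrderChange, bool> Matches, Action<OrderChange> Handle)> handlers
        = new List<(Func<OrderChange, bool>, Action<OrderChange>)>();

    public void Register(Func<OrderChange, bool> matches, Action<OrderChange> handle)
    {
        handlers.Add((matches, handle));
    }

    // Chain of Responsibility: only the first (highest-priority) matching handler runs.
    public void DispatchFirst(OrderChange change)
    {
        var handler = handlers.FirstOrDefault(h => h.Matches(change));
        handler.Handle?.Invoke(change);
    }

    // Observer style: every matching handler runs.
    public void DispatchAll(OrderChange change)
    {
        foreach (var h in handlers.Where(h => h.Matches(change)))
        {
            h.Handle(change);
        }
    }
}

// Usage, e.g. for the RETAIL + FURNITURE + PAID email:
// dispatcher.Register(
//     c => c.NewOrder.OrderType == OrderType.Retail
//       && c.NewOrder.OrderCategory == OrderCategory.Furniture
//       && c.NewOrder.OrderStatus == OrderStatus.Paid,
//     c => SendFurniturePaidEmail(c));   // SendFurniturePaidEmail is hypothetical

Either way, the combinatorial explosion is tamed because each combination is one predicate/action registration instead of a branch in a nested if/else tree.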
I'm pulling data from a third-party API. The API call runs multiple times a day. If the same data is already present in the table, the run should ignore that record; if there are any changes, it should update that record; and it should insert a new record if anything new shows up in the JSON received.
I'm using the code below for inserting any new data.
var input = JsonConvert.DeserializeObject<List<DeserializeLookup>>(resultJson).ToList();
var entryset = input.Select(y => new Lookup
{
    lookupType = "JOBCODE",
    code = y.Code,
    description = y.Description,
    isNew = true,
    lastUpdatedDate = DateTime.UtcNow
}).ToList();
await _context.Lookup.AddRangeAsync(entryset);
await _context.SaveChangesAsync();
But after the first run, when the API runs again, it inserts the same data into the table again, so duplicate entries end up in the table. To handle this, I used a foreach loop as below before inserting data into the table.
foreach (var item in input)
{
    if (!_context.Lookup.Any(r => r.code == item.Code))
    {
        //above insert code
    }
}
But this doesn't work as expected either. Also, the API takes a lot of time to run when I use the foreach loop. Is there a solution to this in .NET Core 3.1?
It will be better if you try it this way: load the existing codes once, so you don't query the database for every item, and then insert only the missing records in a single batch.

var existingCodes = _context.Lookup.Select(r => r.code).ToHashSet();
var newList = new List<Lookup>();
foreach (var item in input)
{
    if (!existingCodes.Contains(item.Code))
    {
        newList.Add(new Lookup
        {
            lookupType = "JOBCODE",
            code = item.Code,
            description = item.Description,
            isNew = true,
            lastUpdatedDate = DateTime.UtcNow
        });
    }
}
await _context.Lookup.AddRangeAsync(newList);
await _context.SaveChangesAsync();
I'm on my phone, so forgive me for not being able to format the code in my response. The solution to your problem is something I actually just encountered myself while syncing data from an Azure Function and a third-party app into a SQL database.
Depending on your table schema, you need one column with a unique identifier. Make this column a primary key (the first step to preventing duplicates). Here's a resource for that: https://www.w3schools.com/sql/sql_primarykey.ASP
The next step is your stored procedure. You'll need to perform what's commonly referred to as an UPSERT: you MERGE a table with the incoming data, matching on a specified column (whichever is your primary key).
That would look something like this:
MERGE Table_1 AS T1
USING Incoming_Data AS source
ON T1.column1 = source.column1
-- you can use an AND / OR operator here for matching on additional values or combinations
WHEN MATCHED THEN
    UPDATE SET T1.column2 = source.column2
    -- etc. for more columns
WHEN NOT MATCHED THEN
    INSERT (column1, column2, column3)
    VALUES (source.column1, source.column2, source.column3);
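If you'd rather issue the upsert from your .NET Core code instead of a stored procedure, a rough sketch of the same idea with EF Core 3.1's ExecuteSqlRawAsync could look like this (the Lookup table and column names are assumed from the question's code, the SQL would need to match your actual schema, and SqlParameter comes from Microsoft.Data.SqlClient):

foreach (var item in input)
{
    await _context.Database.ExecuteSqlRawAsync(
        @"MERGE Lookup AS target
          USING (SELECT @code AS code) AS source
          ON target.code = source.code
          WHEN MATCHED THEN
              UPDATE SET target.description = @description,
                         target.lastUpdatedDate = @now
          WHEN NOT MATCHED THEN
              INSERT (lookupType, code, description, isNew, lastUpdatedDate)
              VALUES ('JOBCODE', @code, @description, 1, @now);",
        new SqlParameter("@code", item.Code),
        new SqlParameter("@description", item.Description),
        new SqlParameter("@now", DateTime.UtcNow));
}

Note this still does one round trip per item; for large batches you would stage the incoming rows in a temp table or table-valued parameter and MERGE once, as in the stored-procedure version above.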
First of all, you should decouple the format in which you get your data from your actual data handling. In your case: get rid of the JSON before you actually interpret the data.
Alas, I haven't got a clue what your data represents, so let's assume your data is a sequence of customer Orders. When you get new data, you want to add all new Orders, and you want to update changed Orders.
So somewhere you have a method that takes your JSON data as input and produces a sequence of Orders as output:
IEnumerable<Order> InterpretJsonData(string jsonData)
{
    ...
}
You know JSON better than I do; besides, this conversion is a bit beside your question.
You wrote:
So, if the same data is present in the table it should ignore that record, else if there are any changes it should update that record or insert a new record
You need an Equality Comparer
To detect whether there are added or changed customer Orders, you need something to detect whether Order A equals Order B. There must be at least one unique field by which you can identify an Order, even if all the other values of the Order are changed.
This unique value is usually called the primary key, or the Id. I assume your Orders have an Id.
So if your new Order data contains an Id that was not available before, then you are certain that the Order was Added.
If your new Order data has an Id that was already in previously processed Orders, then you have to check the other values to detect whether it was changed.
For this you need equality comparers: one that says that two Orders are equal if they have the same Id, and one that checks all values for equality.
A standard pattern is to derive your comparer from class EqualityComparer<Order>:

class OrderComparer : EqualityComparer<Order>
{
    public static IEqualityComparer<Order> ByValue = new OrderComparer();
    ... // TODO implement
}
First I'll show you how to use this to detect additions and changes; then I'll show you how to implement it.
Somewhere you have access to the already processed Orders:
IEnumerable<Order> GetProcessedOrders() {...}
var jsondata = FetchNewJsonOrderData();
// convert the jsonData into a sequence of Orders
IEnumerable<Order> orders = this.InterpretJsonData(jsondata);
To detect which Orders are added or changed, you could make a Dictionary of the already processed Orders and check the new Orders one by one to see whether they have changed:
IEqualityComparer<Order> comparer = OrderComparer.ByValue;
Dictionary<int, Order> processedOrders = this.GetProcessedOrders()
    .ToDictionary(order => order.Id);

foreach (Order order in orders)
{
    if (processedOrders.TryGetValue(order.Id, out Order originalOrder))
    {
        // order already existed. Is it changed?
        if (!comparer.Equals(order, originalOrder))
        {
            // unequal!
            this.ProcessChangedOrder(order);
            // remember the changed values of this Order
            processedOrders[order.Id] = order;
        }
        // else: no changes, nothing to do
    }
    else
    {
        // Added!
        this.ProcessAddedOrder(order);
        processedOrders.Add(order.Id, order);
    }
}
Immediately after processing the changed / added Order, I remember the new value, because the same Order might be changed again.
If you want this in a LINQ fashion, you have to GroupJoin the Orders with the processed Orders, to get "Orders with their zero or more previously processed Orders" (there will be zero or one previously processed Order).
var ordersWithPreviouslyProcessedOrder = orders.GroupJoin(this.GetProcessedOrders(),
    order => order.Id,                   // from every Order take the Id
    processedOrder => processedOrder.Id, // from every previously processed Order take the Id
    // parameter resultSelector: from every Order, with its zero or more previously
    // processed Orders, make one new object:
    (order, previouslyProcessedOrders) => new
    {
        Order = order,
        ProcessedOrder = previouslyProcessedOrders.FirstOrDefault(),
    })
    .ToList();
I use GroupJoin instead of Join because this way I also get the "Orders that have no previously processed Orders" (= new Orders). With a simple Join you would not get them.
I do a ToList, so that in the next statements the group join is not done twice:
var addedOrders = ordersWithPreviouslyProcessedOrder
    .Where(orderCombi => orderCombi.ProcessedOrder == null);
var changedOrders = ordersWithPreviouslyProcessedOrder
    .Where(orderCombi => orderCombi.ProcessedOrder != null
        && !comparer.Equals(orderCombi.Order, orderCombi.ProcessedOrder));
Implementation of "Compare by Value"
// equal if all values are equal
public override bool Equals(Order x, Order y)
{
    if (x == null) return y == null; // true if both null, false if x null but y not null
    if (y == null) return false;     // because x is not null
    if (Object.ReferenceEquals(x, y)) return true;
    if (x.GetType() != y.GetType()) return false;

    // compare all properties one by one:
    return x.Id == y.Id
        && x.Date == y.Date
        && ...
}
For GetHashCode there is one rule: if X equals Y, then they must have the same hash code. If they are not equal, there is no rule, but lookups are more efficient if unequal objects have different hash codes. Make a tradeoff between calculation speed and hash code uniqueness.
In this case: If two Orders are equal, then I am certain that they have the same Id. For speed I don't check the other properties.
public override int GetHashCode(Order x)
{
    if (x == null)
        return 0x34339d98; // just a hash code for all null Orders
    else
        return x.Id.GetHashCode();
}
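The by-Id comparer mentioned earlier (equal if the Ids are equal) was not spelled out above; a minimal sketch, assuming Order.Id is an int as in the Dictionary above, could be:

class OrderByIdComparer : EqualityComparer<Order>
{
    public static IEqualityComparer<Order> ById = new OrderByIdComparer();

    // equal if the Ids are equal, whatever the other values are
    public override bool Equals(Order x, Order y)
    {
        if (x == null) return y == null;
        if (y == null) return false;
        return x.Id == y.Id;
    }

    public override int GetHashCode(Order x)
    {
        return x == null ? 0 : x.Id.GetHashCode();
    }
}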
I have a RepeatedPtrField<M::Table> and a protobuf message M as:
message M {
  message Table {
    optional string guid = 1;
    optional int64 schema_version = 2;
    optional int64 data_version = 3;
    repeated Column column = 4;
  }
  repeated Table table = 1;
}
How do I create an instance of M containing the contents of the RepeatedPtrField? I can write a for loop to copy the data explicitly, but I am looking for something more concise, preferably with a std::move()-style optimization.
If you're using a recent version of Protobuf, such as 3.6.0, RepeatedPtrField defines a move constructor, and you can use std::move to achieve your goal.
If you're using an older version, you have to call Swap to do the work, as you mentioned in the comment.
This code will auto-generate a new part number. It is a post-processing BPM on the GetNewPart BO:
int iPartnum = 0;
string cPartid = string.Empty;
Erp.Tables.Company Company;
foreach (var ttpart_xRow in ttPart)
{
    var ttpartRow = ttpart_xRow;
    Company = (from Company_Row in Db.Company
               where Company_Row.Company == Session.CompanyID
               select Company_Row).FirstOrDefault();
    iPartnum = (int)(decimal)Company["AutoGenerate_c"] + 1;
    cPartid = System.Convert.ToString(iPartnum);
    ttpartRow.PartNum = cPartid;
    Services.Lib.UpdateTableBuffer._UpdateTableBuffer(Company, "AutoGenerate_c", iPartnum);
}
Is it just not working or is there an error message?
Services.Lib.UpdateTableBuffer._UpdateTableBuffer(Company,"AutoGenerate_c", iPartnum);
I have personally never used or even seen this Lib item, so I can't vouch for it. I would update the object manually inside a transaction scope, because I doubt GetNewPart ever touches the database, and therefore it probably doesn't create a transaction.
using (System.Transactions.TransactionScope txScope = IceDataContext.CreateDefaultTransactionScope()) // start the transaction
{
    // your logic goes here
    Db.Validate();
    txScope.Complete(); // commit the transaction
}
As a side note, I try to keep these sorts of things off of the Company record, because nearly every process in the system touches it, and I don't want a process to lock it up or cause weird race conditions. I generally prefer to reserve a record that will only get touched for this specific purpose, so I have a UDCodeType/UDCode for this sort of thing.
In my web app I'm receiving data every 3-4 seconds from an AJAX call to an API, like this:
$http.get('api/invoice/collecting').success(function(data) {
    $scope.invoices = data;
});
Then I display the data, like this: http://jsfiddle.net/geUe2/1/
The problem is that every time I do $scope.invoices = data, ng-repeat rebuilds the DOM area shown in the jsfiddle, and I lose all the <input> values.
I've tried:
angular.extend()
the deep version of jQuery.extend()
some other merging/extending/deep-copying functions
but they can't handle a situation like this:
On my client I have [invoice1, invoice2, invoice3] and the server sends me [invoice1, invoice3], so I need invoice2 to be deleted from the view.
What are the ways to solve this problem?
Check the ng-repeat docs and the related question Angular.js - Data from AJAX request as a ng-repeat collection.
You could use the track by option:
variable in expression track by tracking_expression – You can also provide an optional tracking function which can be used to associate the objects in the collection with the DOM elements. If no tracking function is specified the ng-repeat associates elements by identity in the collection. It is an error to have more than one tracking function to resolve to the same key. (This would mean that two distinct objects are mapped to the same DOM element, which is not possible.) Filters should be applied to the expression, before specifying a tracking expression.
For example: item in items track by item.id is a typical pattern when the items come from the database. In this case the object identity does not matter. Two objects are considered equivalent as long as their id property is same.
You need to collect the data from the DOM when an update from the server arrives. Save whatever data is relevant (it could be only the input values), and don't forget to include the identifier of the data object, such as data._id. All of this should be saved in a temporary object such as $scope.oldInvoices.
Then, after collecting it from the DOM, re-update the DOM with the new data (the way you are doing right now): $scope.invoices = data.
Now use underscore.js's _.findWhere to check whether each saved data._id is present in the new data update, and if so, re-assign (you can use angular.extend here) the saved data value to the relevant invoice.
It turned out that @luacassus's answer about the track by option of the ng-repeat directive was very helpful, but it didn't solve my problem: track by was adding new invoices coming from the server, but there was a problem with clearing inactive invoices.
So this is my solution to the problem:
function change(scope, newData) {
    if (!scope.invoices) {
        scope.invoices = [];
        jQuery.extend(true, scope.invoices, newData);
    }

    // Update invoices that are present in scope.invoices with the data from
    // the server, and remove the ones the server no longer sends
    for (var i = 0; i < scope.invoices.length; i++) {
        var isInvoiceFound = false;
        for (var j = 0; j < newData.length; j++) {
            if (scope.invoices[i] && scope.invoices[i].id && scope.invoices[i].id == newData[j].id) {
                isInvoiceFound = true;
                jQuery.extend(true, scope.invoices[i], newData[j]);
            }
        }
        if (!isInvoiceFound) {
            scope.invoices.splice(i, 1);
            i--; // splice shifted the next element into position i, so stay at this index
        }
    }

    // Add invoices that came from the server but are not present in scope.invoices
    for (var j = 0; j < newData.length; j++) {
        var isInvoiceFound = false;
        for (var i = 0; i < scope.invoices.length; i++) {
            if (scope.invoices[i] && scope.invoices[i].id && scope.invoices[i].id == newData[j].id) {
                isInvoiceFound = true;
            }
        }
        if (!isInvoiceFound) {
            scope.invoices.push(newData[j]);
        }
    }
}
In my web app I'm using jQuery's .extend(). There are good alternatives in the lodash library.
I'm trying to work with a LINQ result set of 4 tables retrieved with the HTML Agility Pack. I'd like to process each one slightly differently by setting a variable for each (the switch statement below) and then processing the rows within the table. The variable would ideally be the index of each table in the set, 0 to 3, and would be used in the switch statement and to select the rows. I haven't been able to locate an index property, but I have seen one used in situations such as SelectChildNode.
My question is: can I refer to items within a LINQ result set by index? My "ideal scenario" is the commented-out code at the end. Thanks in advance.
var ratingsChgs = from table in htmlDoc.DocumentNode
                      .SelectNodes("//table[@class='calendar-table']")
                      .Cast<HtmlNode>()
                  select table;

String rtgChgType;
for (int ratingsChgTbl = 0; ratingsChgTbl < 4; ratingsChgTbl++)
{
    switch (ratingsChgTbl)
    {
        case 0:
            rtgChgType = "Upgrades";
            break;
        case 1:
            rtgChgType = "Downgrades";
            break;
        case 2:
            rtgChgType = "Coverage Initiated";
            break;
        case 3:
            rtgChgType = "Coverage Reit/ Price Tgt Changed";
            break;
    }

    //This is what I'd like to do:
    //var tblRowsByChgType = from row in ratingsChgs[ratingsChgTbl].SelectNodes("tr")
    //                       select row;

    //Processing of returned rows.
}
ElementAt does what you're asking for. I don't recommend using it in your example, though, because each time you call it, your initial LINQ query is executed again. The easy fix is to make ratingsChgs a List or an Array.
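For example, a minimal sketch of that fix, reusing the names from your question (materialize the query once, then index into the list):

var ratingsChgTables = ratingsChgs.ToList(); // the query runs exactly once here

for (int ratingsChgTbl = 0; ratingsChgTbl < ratingsChgTables.Count; ratingsChgTbl++)
{
    var tblRowsByChgType = ratingsChgTables[ratingsChgTbl].SelectNodes("tr");
    // Processing of returned rows.
}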
You can also refactor out the switch statement; it is overkill when you only need to iterate through a list of items. Here is a possible solution:
var ratingsChgs = from table in htmlDoc.DocumentNode
                      .SelectNodes("//table[@class='calendar-table']")
                      .Cast<HtmlNode>()
                  select table;

var rtgChgTypeNames = new List<string>
{
    "Upgrades",
    "Downgrades",
    "Coverage Initiated",
    "Coverage Reit/ Price Tgt Changed"
};
var changeTypes = ratingsChgs.Zip(rtgChgTypeNames, (changeType, name) => new
{
    Name = name,
    Rows = changeType.SelectNodes("tr")
});

foreach (var changeType in changeTypes)
{
    var name = changeType.Name;
    var rows = changeType.Rows;
    //Processing of returned rows.
}
Also, why not take your rating change type names from the HTML doc itself? It seems odd to have table information defined in the business logic.
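For instance, if the page happens to put a heading before each table (an assumption; the question doesn't show the page structure), something like this would pull the names from the document instead of hard-coding them:

var changeTypes = ratingsChgs.Select(table => new
{
    // assumes each table is preceded by an element such as <h2> naming it
    Name = table.SelectSingleNode("preceding-sibling::h2[1]")?.InnerText.Trim(),
    Rows = table.SelectNodes("tr")
});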