I have two sources: Source A and Source B.
Source A passes through a drug lookup and then a router that checks whether the NDC is present; rows go to either a present table or a not-present table.
Source B does the same but looks for the GCN number instead; depending on whether it is present, rows go to a GCN present table or a GCN not-present table.
I am currently using these group filter conditions in the router:
ISNULL(NDC_DRUG_CODE_LOOKUP)
NOT ISNULL(NDC_DRUG_CODE_LOOKUP)
ISNULL (GCN_CODE_out_LKP)
NOT ISNULL(GCN_CODE_out_LKP)
The problem is that when the lookup value and the GCN or NDC code match, the rows are not routed properly.
So my question is: should I use two different sorters, or is there a better way to code this?
Using multiple sorters is not the right option because it reduces performance. I'm not sure exactly what your requirement is, but I hope the following is what you are expecting. Use conditions in the router along these lines:
ISNULL(NDC_DRUG_CODE_LOOKUP)
NOT ISNULL(NDC_DRUG_CODE_LOOKUP)
ISNULL(GCN_CODE_out_LKP)
NOT ISNULL(GCN_CODE_out_LKP)
If the conditions are not working, the issue is probably with your lookups. Try creating an output target for each lookup and test the scenario.
Try using an unconnected lookup: call it twice, once for GCN and once for NDC, create four flags, and route them in a single router as per your requirement. One more suggestion: if you are using ISNULL in the router or any other transformation, try defaulting NULLs to some value (like NVL in Oracle), because when both sides of a comparison are NULL it does not match.
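As a rough sketch of that flag approach (the lookup and port names here are made up, so adjust them to your mapping), an Expression transformation could call the unconnected lookups and derive two flags, and the router groups would then test the flags instead of testing for NULL:

NDC_FOUND = IIF(ISNULL(:LKP.LKP_NDC_DRUG(NDC_DRUG_CODE)), 0, 1)
GCN_FOUND = IIF(ISNULL(:LKP.LKP_GCN(GCN_CODE)), 0, 1)

Router groups (name: filter condition):
NDC_PRESENT: NDC_FOUND = 1
NDC_NOT_PRESENT: NDC_FOUND = 0
GCN_PRESENT: GCN_FOUND = 1
GCN_NOT_PRESENT: GCN_FOUND = 0

Because the flags are always 0 or 1, the group filters never have to compare NULL to NULL.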
Hope this helps...
I want to implement this logic without the Aggregator stage, basically through the Transformer stage, to merge these records based on the ID column. In my case there is no possibility of getting multiple values for the same field for the same ID.
I have this input data:
ID|VAL1|VAL2|VAL3|BAL1|BAL2|BAL3
10001|5|0|0|1000|0|0
10001|0|10|0|0|1200|0
10001|0|0|11|11|0|10500
and I want my output to be:
ID|VAL1|VAL2|VAL3|BAL1|BAL2|BAL3
10001|5|10|11|1000|1200|10500
Is it possible to implement this? If so, thanks in advance!
There are at least two options to do that:
Using the loop within the transformer
Storing the data of the previous row (with the help of stage variables) until LastRowInGroup
Some things are common to both approaches:
Get the data sorted before it reaches the transformer
Use LastRowInGroup() as the output constraint
Remember that stage and loop variables are processed top down, so their order matters; a variable higher up that refers to one defined further down sees that variable's previous (old) content
Be aware that this is a little advanced; the Aggregator would probably be the easier solution. A rough stage-variable sketch follows.
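For example, a minimal stage-variable sketch for the merge above (the link and variable names are made up; it assumes the input is sorted and partitioned on ID and that, as in the sample, the columns you don't want from a row are 0):

Stage variables (evaluated top down, svNewGroup initialised to 1):
svVal1: If svNewGroup Then lnk_in.VAL1 Else svVal1 + lnk_in.VAL1
svVal2: If svNewGroup Then lnk_in.VAL2 Else svVal2 + lnk_in.VAL2
svVal3: If svNewGroup Then lnk_in.VAL3 Else svVal3 + lnk_in.VAL3
svBal1: If svNewGroup Then lnk_in.BAL1 Else svBal1 + lnk_in.BAL1
svBal2: If svNewGroup Then lnk_in.BAL2 Else svBal2 + lnk_in.BAL2
svBal3: If svNewGroup Then lnk_in.BAL3 Else svBal3 + lnk_in.BAL3
svNewGroup: LastRowInGroup(lnk_in.ID)

Output link constraint: LastRowInGroup(lnk_in.ID)
Output derivations: ID = lnk_in.ID, VAL1 = svVal1, VAL2 = svVal2, ... BAL3 = svBal3

Because svNewGroup sits below the accumulators, it still holds the previous row's value when they are evaluated, which is exactly the "a new group starts here" signal mentioned above.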
I have this situation: starting from a table, I have to check all the records that match a key. If records are found, I have to check another table using a key from the first table, and so on, for more or less five levels. Is there a way to do this recursively, or do I have to write all the code "by hand"? The language I am using is Visual FoxPro. If this is not possible, is it at least possible to use recursion to populate a treeview?
You can set a relation between tables. For example:
USE table_1.dbf IN 0 SHARED
USE table_2.dbf IN 0 SHARED
SET ORDER TO TAG key_field OF table_2.cdx IN table_2
SET RELATION TO key_field INTO table_2 ADDITIVE IN table_1
The first two commands open table_1 and table_2. Then you set the order/index of table_2; if you don't have an index on the key field, this will not work. The final command sets the relation between the two tables on the key field.
From here you can browse both tables and table_2's records will be filtered based on table_1's key field. Hope this helps.
If the tables have similar structure or you only need to look at a few fields, you could write a recursive routine that receives the name of the table, the key to check, and perhaps the fields you need to check as parameters. The tricky part, I guess, is knowing what to pass down to the next call.
I don't think I can offer any more advice without at least seeing some table structures.
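For illustration only, a skeleton of such a routine might look like the following; every table, tag, and field name here is a placeholder, and it assumes each table has an index tag on its key field plus a field that carries the key for the next level:

* Sketch of a recursive check across up to five levels (all names are placeholders)
PROCEDURE CheckLevel
LPARAMETERS tcTable, tuKey, tnLevel
LOCAL llFound, luNextKey
SELECT 0
USE (tcTable) AGAIN SHARED ALIAS curLevel
SET ORDER TO TAG key_field          && index tag on this table's key field
llFound = SEEK(tuKey, "curLevel")
IF llFound
    luNextKey = curLevel.next_key   && key that drives the lookup in the next table
ENDIF
USE IN curLevel
IF llFound AND tnLevel < 5
    * NextTableName() is a hypothetical helper returning the next table's name
    llFound = CheckLevel(NextTableName(tnLevel + 1), luNextKey, tnLevel + 1)
ENDIF
RETURN llFound
ENDPROC

The same shape would work for populating a treeview: add a node for the current match before recursing.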
Sorry for answering so late, but the problem was, of course, that recursion wasn't a viable solution since I had to search inside multiple tables. So I resolved it by doing a simple two-level search in the tables that I needed.
Thank you very much for the help, and sorry again for answering so late.
I have a use case where I am mapping two tables to the same object.
In this object I have a string called source, and I want to be able to set the table name or the database name as this variable's value.
Any ideas on how to achieve this?
I have thought about iterating over my list and manually setting it but this has the potential to waste a fair chunk of time.
I appreciate this is somewhat of an odd request, so that may be the only way, but I am hoping for a solution that maps the source variable while Hibernate is mapping everything else.
If I have understood your issue correctly, your solution might be @MappedSuperclass: you create an abstract class holding the fields common to the two tables, and then extend it in the two entities you want, each of which points to a different table.
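A rough sketch of that idea (class, table, and field names are made up; javax.persistence annotations are used here):

import javax.persistence.*;

@MappedSuperclass
public abstract class PaymentRecord {
    @Id
    private Long id;
    // ...the other columns shared by both tables go here

    @Transient
    private String source;   // not a database column

    public String getSource() { return source; }
    public void setSource(String source) { this.source = source; }
}

@Entity
@Table(name = "payments_a")
class PaymentA extends PaymentRecord {
    public PaymentA() { setSource("payments_a"); }   // Hibernate calls the no-arg constructor
}

@Entity
@Table(name = "payments_b")
class PaymentB extends PaymentRecord {
    public PaymentB() { setSource("payments_b"); }
}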
You could try to achieve this with a load event listener or an Interceptor. In the listener/interceptor you can check what the data source is and populate the source field accordingly.
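If you go the interceptor route, a minimal sketch might look like this (assuming Hibernate 5's EmptyInterceptor; the entity and setter names follow the sketch above and are equally made up):

import java.io.Serializable;
import org.hibernate.EmptyInterceptor;
import org.hibernate.type.Type;

public class SourceInterceptor extends EmptyInterceptor {
    @Override
    public boolean onLoad(Object entity, Serializable id, Object[] state,
                          String[] propertyNames, Type[] types) {
        if (entity instanceof PaymentRecord) {
            // derive the source from the concrete class (one class per table)
            ((PaymentRecord) entity).setSource(entity.getClass().getSimpleName());
        }
        return false; // the loaded state array itself was not changed
    }
}

The interceptor is then registered on the SessionFactory (or Session) when it is built.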
In the end I ended up using a formula to map my variable to a select statement, which was sufficient for what I needed.
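For reference, that kind of mapping can be as simple as the following (the literal is a made-up example; org.hibernate.annotations.Formula accepts any SQL fragment, including a subselect):

@Formula("'PAYMENTS_A'")
private String source;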
I'm trying to build a report in AX 2009 (SP1, currently rollup 6) with a primary data source of the SalesQuotationLine table. Due to how our inventory is structured, I need to apply a filter that shows only certain categories of items (in this case, non-service items as defined in the InventTable). However, it seems that there is a problem in the link between the SalesQuotationLine and InventTable such that only two specific items will ever display.
We have tested this against the Sales Quotation Details screen as well, with the same results. Executing a query such as this:
...will only show quotes that have one of the specific items mentioned earlier. If we change the Item Type to something else (for example to Item), the result is an empty set. We are also getting this issue on one of our secondary test servers, which for all intents and purposes is a fresh install.
There don't seem to be any issues with the data mapping from one table to the other, and we are not experiencing this issue with any other table set. Is this a real issue, or am I just missing something?
After analyzing the results from a SQL Profiler run during execution of the query, it seems the issue was a system bug. When selecting a table to join to SalesQuotationLine, you have two options: 'Items' and 'Items (Item Number)'. Regardless of which one you select, the query joins the InventTable with the relation "SalesQuotationLine.ProjTransCode = InventTable.ItemId".
After comparing the table to other layers in the system, I found that the following block of code had been removed from the createLine method (in the SYP layer):
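// populate ProjTransCode for item-type quotation lines so the ItemId-based join can resolve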
if (this.ProjTransType == QuotationProjTransType::Item)
{
this.ProjTransCode = this.ItemId;
}
Since the ProjTransCode is no longer being populated, the join does not work except on certain quote lines that do have the ProjTransCode populated.
In addition, there is no directly defined relation to the InventTable - the link is only maintained via an Extended Data Type that is used on the SalesQuotationLine.ItemId field. Adding this relation in manually solved the problem.
I have a simple query based on tables from two different linked servers. I need both servers to be changeable since we're moving from DEV to UAT to Production. I'm using an expression to set the Connection String and Password for server A. So, using that as a base, I set up a Data Flow Task and an 'OLE DB Source' to extract the data I need. Ultimately I'd like my query to look like this:
Select * from A.Payments p1
Full Outer Join ?.Payments p2 on p1.Id = p2.Id
where p1.OrderDesc is null or p2.OrderDesc is null
Is there a way around it? Can I use a variable or some kind of dynamic query? I haven't managed to parse a project parameter and run one. Thank you very much for your help.
This is done by making the Data Source SQL an expression.
Right-click the Data Flow Task and then click the ellipsis [...] beside "Expressions". Among the available properties you will find the SqlCommand of your Data Flow source, which is the one to set.
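For example, with a package variable holding the second server's name (the variable name User::ServerB is made up), the expression assigned to that SqlCommand property could be something like:

"SELECT * FROM A.Payments p1 FULL OUTER JOIN [" + @[User::ServerB] + "].Payments p2 ON p1.Id = p2.Id WHERE p1.OrderDesc IS NULL OR p2.OrderDesc IS NULL"

Change the User::ServerB variable (or bind it to a project parameter) per environment and the query follows along.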
It's not the most intuitive thing to be fair.