How to remove unwanted nested columns? - azure-blob-storage

I've been tasked with altering the company's Event Export from our PlayFab environment into Azure. Initially we set it up to export all events, but after looking at the data we found that some of the exported data is data we don't want for legal reasons. I was exploring the "Use custom query" method and was trying to build a query that returns all data except the columns I want to exclude. The problem is that these columns are nested. I've tried using the project-away operator to exclude just one column for now, but when I run the query below
['events.all']
| project-away EventData.ColumnToExclude
| limit 100
I get an error, which I assume is because project-away does not support nested columns. Is there an easy way to exclude the column without having to flatten the data or list all my columns? (Our developers might create new events without notice, so listing columns won't work.)
UPDATE 1:
I've found that project-away is the syntax for removing a column from a table, but what I needed was a way to remove a key from a JSON/dynamic object, so bag_remove_keys() is the correct approach:
['events.all']
| project EventData=bag_remove_keys(EventData, dynamic(['Key1', 'Key2', '$.Key3.SubKey1']))
But now I'm facing another issue. When I use the '$.' notation for subkeys, I get the error below:
Query execution has resulted in error (0x8000FFFF): Partial query failure: Catastrophic failure (message: 'PropertyBagEncoder::Add: non-contiguous entries: ', details: '').
[0]Kusto.Data.Exceptions.KustoDataStreamException: Query execution has resulted in error (0x8000FFFF): Partial query failure: Catastrophic failure (message: 'PropertyBagEncoder::Add: non-contiguous entries: ', details: '').
Timestamp=2022-01-31T13:54:56.5237026Z
If I don't list any subkeys I don't get this issue, and I can't understand why.
UPDATE 2:
I found that bag_remove_keys() has a bug. With the query below I get the error described in UPDATE 1:
datatable(d:dynamic)
[
    dynamic(
        {
            "test1": "val",
            "test2": {},
            "test3": "val"
        }
    )
]
| extend d1=bag_remove_keys(d, dynamic(['$.SomeKey.Sub1', '$.SomeKey.Sub2']))
However, if I move the "test2" key to the end, I don't get an error, but d1 will not show the "test2" key in the output.
Also, if I pass bag_remove_keys() a key that matches a key from the input, like | extend d1=bag_remove_keys(d, dynamic(['$.SomeKey.Sub1', '$.SomeKey.Sub2', 'test1'])), then again it does not error but removes "test2" from the output.
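An interim workaround might be to remove only top-level keys with bag_remove_keys() and rebuild the nested part separately, along these lines (a rough, untested sketch reusing the key names from UPDATE 1 and assuming bag_pack() and bag_merge() are available on the cluster):
['events.all']
| extend CleanKey3 = bag_remove_keys(EventData.Key3, dynamic(['SubKey1']))
| project EventData = bag_merge(bag_pack('Key3', CleanKey3), bag_remove_keys(EventData, dynamic(['Key1', 'Key2', 'Key3'])))
| limit 100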

Thanks for reporting it, Andrei. It is a bug and we are working on a fix.
Update: the fix has been checked in and will be deployed within two weeks; please open a support ticket if you need it earlier.

Related

Informatica Cloud: use field in pre/post SQL commands

I am trying to delete a set of data in the target table based on a column (year) from the lookup in IICS (Informatica Cloud).
I want to solve this problem using pre/post SQL commands, but the constraint is that I can't pass the year column to my query.
I tried this:
delete from sample_db.tbl_emp where emp_year = {year}
I want to delete all the employees for a specific year that I get from the lookup return.
For example: if I get the year '2019', all records in table sample_db.tbl_emp containing emp_year=2019 must be deleted.
I am not sure how this works in Informatica Cloud.
Any leads would be helpful.
How are you getting the year value? Pre/post SQL may not be the way to go unless you need to do this as part of another transformation, i.e., before or after the transformation runs. Also, does your org only have ICDI, or also ICAI? ICAI may be a better solution depending on how the value is being provided.
The following steps would help you achieve this.
Create an input-output parameter in your mapping.
Assign the result of your lookup in an expression transformation to the parameter using SetMaxVariable
Use the parameter in your target pre SQL as
delete from sample_db.tbl_emp where emp_year = $$parameter
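For example, the assignment in the Expression transformation could look like this (a sketch; o_year is a placeholder name for the lookup return port in your mapping):
    SETMAXVARIABLE($$parameter, o_year)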
Let me know if you have any further questions.

ActiveRecord#create does not allow setting the id

I need to batch insert > 100,000 records.
The id will not be created by the DB, and I have to use a given UUID.
Doing this in a loop with mymodel.new, assigning the ID and then saving each record, works but is way too slow (approx. 20 min).
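Roughly, the per-record approach looks like this (a sketch; the model name and attribute hash are placeholders):
records.each do |attrs|
  m = MyModel.new(attrs.except('id'))  # mass assignment without the protected id
  m.id = attrs['id']                   # assign the given UUID directly
  m.save!
end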
When I create an array 'records' and use mymodel.create(records) I run into the 'cannot mass assign id' problem.
I've tried all the solutions I could find:
'attr_accessible :id, ...' for the model: works for every attribute but id.
(Re)defining 'def self.attributes_protected_by_default; [] end': no effect.
One suggestion was to call create with ':without_protection => true', but create does not take more than one argument.
So none of these solutions helped.
What else can I do?
Finally, I found a solution which might not be elegant in the Rails way, but it solves my performance problem.
At first I tried what @Albin suggested, only to find that create(records) is not much faster (still > 15 min).
My solution now is:
Create a temporary CSV file
db_tmp = File.open("tmp_file", "w")
records = ""
@data_records.each do |row|
  records << "#{row['id']},#{row['field_1']},#{row['field_2']}, ... \n"
end
db_tmp.write(records)
db_tmp.close
Execute SQL with a load data command
sql = "load data infile 'tmp_file' into table my_table
fields terminated by ',' optionally enclosed by '\"'
(id,field_1,field_2, ... )"
ActiveRecord::Base.connection.execute(sql)
The whole process now takes less than 1 (!) minute, including getting the data over the network and parsing the original JSON message into a hash.
I'm aware that this does not clarify how create could be tricked into allowing ID assignment, but the performance problem is solved.
Another point is that my solution bypasses any validation defined for the model. This is not a problem because in this case I know I can rely on the integrity of the data I'm receiving; if there's a problem, the load would fail and execute would raise an exception.
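One caveat: load data infile reads 'tmp_file' from the database server's filesystem, so if the Rails app and MySQL run on different hosts the LOCAL variant may be needed instead (this assumes local_infile is enabled on both client and server):
sql = "load data local infile 'tmp_file' into table my_table
fields terminated by ',' optionally enclosed by '\"'
(id,field_1,field_2, ... )"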

Microsoft Access can't append all the records in the append query

I found numerous threads online about my problem, but I'm very new to Access and therefore find them hard to understand.
I'm having the following problem with my Access Database.
Microsoft Access set 0 fields to Null due to a type of conversion failure, and it didn't add 0 records to the table due to key violations, 0 records due lock violations, and 0 records due to validation rule violations. Do you want to run query anyway?
I'm using Access 2013, and I have imported SharePoint list data into it.
The following is my query:
INSERT INTO [Content Metadata Master] ([Content Name], [Metadata Name], [Value])
SELECT Content.Name as [Content Name], 'Author 1' as [Metadata Name] , [Gold Metadata].[Author 1] as Value1
FROM Content, [Gold Metadata]
WHERE ((([Gold Metadata].[Case Number])='OM-0057' And ([Gold Metadata].[Case Number])=[Content].[Name]));
Can anyone please explain in simple terms how this problem occurs and how to resolve it without losing data?
Many Thanks,
Samadhan
Without knowing the exact structure of the incoming data and of the table accepting it, it's impossible to give a complete answer. But your problem is the "type conversion failure" part: one or more of your values does not match the data type of its destination field.
That means you may be trying to insert a string into a number field, or a date into a Yes/No field, and so on.
Something is inconsistent between what you are pushing in, and what is trying to accept it.
Check the data types.
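For example, if [Value] in [Content Metadata Master] is a Text field while [Author 1] in [Gold Metadata] is numeric (an assumption about your schema), an explicit conversion in the SELECT would avoid the mismatch:
INSERT INTO [Content Metadata Master] ([Content Name], [Metadata Name], [Value])
SELECT Content.Name, 'Author 1', CStr([Gold Metadata].[Author 1])
FROM Content, [Gold Metadata]
WHERE [Gold Metadata].[Case Number]='OM-0057' AND [Gold Metadata].[Case Number]=Content.Name;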

QlikView: Matching columns of two indirectly linked tables does not work

Following is the data model of the dashboard I am facing a problem with:
[data model screenshot]
What I want to achieve: in Case there is a field named Manufacturing_Date, and in MWODefetcs there is a field named Defect_Date. Whenever a record is selected from a table showing cases from Case, the corresponding records should be shown in another table based on an exact match of Manufacturing_Date=Defect_Date.
As simple as it sounds, I cannot seem to accomplish it. I have tried the following expressions to no avail:
Count({<[Defect_Date_text]=p([Manu_text]),FaultID=,DEFECT_CODE=>}MFG_BARCODE_NUM)
sum({$<Defect_Date ={"=$(Manufacturing_Date__c)"}>}Defect_Date)
Do the two tables need to be directly linked? Is the intermediary iFaults table preventing me from accomplishing this?
Please help.
You should use the P() set expression like this:
sum({$<Defect_Date = P(Manufacturing_Date__c)>} Defect_Date)
P() restricts Defect_Date to the set of possible Manufacturing_Date__c values under the current selection, so the two tables do not need to be directly linked for the match to work.

Filtering Quotes by InventTable

I'm trying to build a report in AX 2009 (SP1, currently rollup 6) with a primary data source of the SalesQuotationLine table. Due to how our inventory is structured, I need to apply a filter that shows only certain categories of items (in this case, non-service items as defined in the InventTable). However, it seems that there is a problem in the link between the SalesQuotationLine and InventTable such that only two specific items will ever display.
We have tested this against the Sales Quotation Details screen as well, with the same results. Executing such a query will only show quotes that have one of the specific items mentioned earlier. If we change the Item Type to something else (for example to Item), the result is an empty set. We are also getting this issue on one of our secondary test servers, which for all intents and purposes is a fresh install.
There don't seem to be any issues with the data mapping from one table to the other, and we are not experiencing this issue with any other table set. Is this a real issue, or am I just missing something?
After analyzing the results of a SQL Profiler run during the execution of the query, it seems the issue was a system bug. When selecting a table to join to the SalesQuotationLines, you have two options: 'Items' and 'Items (Item Number)'. Regardless of which one you select, the query joins the InventTable with the relation "SalesQuotationLines.ProjTransCode = InventTable.ItemId".
After comparing the table to other layers in the system, I found that the following block of code had been removed from the createLine method (in the SYP layer):
if (this.ProjTransType == QuotationProjTransType::Item)
{
    this.ProjTransCode = this.ItemId;
}
Since the ProjTransCode is no longer being populated, the join does not work except on certain quote lines that do have the ProjTransCode populated.
In addition, there is no directly defined relation to the InventTable; the link is only maintained via an Extended Data Type used on the SalesQuotationLine.ItemId field. Adding this relation manually solved the problem.
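If existing quote lines also need the field backfilled, a one-off job along these lines could do it (a rough, untested sketch using the table and field names mentioned above):
static void backfillProjTransCode(Args _args)
{
    SalesQuotationLine quotationLine;
    ;
    ttsbegin;
    // copy ItemId into ProjTransCode where it was never populated
    while select forupdate quotationLine
        where quotationLine.ProjTransType == QuotationProjTransType::Item
           && quotationLine.ProjTransCode == ''
    {
        quotationLine.ProjTransCode = quotationLine.ItemId;
        quotationLine.update();
    }
    ttscommit;
}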
