I'm using the Informatica BDM tool (10.2.2) for data quality purposes. While creating a profile on one column, I'm facing an issue.
Issue:
The column has 100 records, of which 50 are valid, 25 are invalid, and 25 are null. When I create a rule on the column and add that rule to the profile, I get 50 as valid and 50 as invalid, i.e. the nulls are also showing as invalid.
Code: IIF(INSTR(col,'',1,1)||INSTR(col,'',1,2)='T'||'R', 'Valid', 'Invalid')
Note: I want only 2 results to be seen in the profile (Valid/Invalid).
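I suspect the rule needs to handle NULLs explicitly, something like the sketch below (the INSTR test is only a placeholder for my real validity condition, and the NULL branch could be 'Valid' or 'Invalid' as required), but I'm not sure this is the right approach:
-- Classify NULLs explicitly instead of letting them fall through
-- to the final 'Invalid' branch.
IIF(ISNULL(col), 'Invalid',
    IIF(INSTR(col, 'T', 1, 1) > 0, 'Valid', 'Invalid'))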
Please help me to resolve the issue.
Thanks.
I've been tasked with altering the company's event export from our PlayFab environment into Azure. Initially, we set it up to export all events, but after looking at the data we found some exported data that we don't want for legal reasons. I was exploring the "Use custom query" method and was trying to build a query that returns all data except the columns I want to exclude. The problem is that these columns are nested. I tried using the project-away operator to exclude one column for now, but when I run the query below
['events.all']
| project-away EventData.ColumnToExclude
| limit 100
I get this error
I'm assuming that's because it does not support nested columns. Is there an easy way to exclude the column without having to flatten the data or list all my columns? (Our developers might create new events without notice, so that won't work.)
UPDATE 1:
I've found that project-away is the syntax for removing a column from a table, but what I needed was a way to remove a key from a JSON/dynamic object, so I found that bag_remove_keys() is the correct approach:
['events.all']
| project EventData=bag_remove_keys(EventData, dynamic(['Key1', 'Key2', '$.Key3.SubKey1']))
But now I am facing another issue. When I use the '$.' notation for subkeys, I get the error below:
Query execution has resulted in error (0x8000FFFF): Partial query failure: Catastrophic failure (message: 'PropertyBagEncoder::Add: non-contiguous entries: ', details: '').
[0]Kusto.Data.Exceptions.KustoDataStreamException: Query execution has resulted in error (0x8000FFFF): Partial query failure: Catastrophic failure (message: 'PropertyBagEncoder::Add: non-contiguous entries: ', details: '').
Timestamp=2022-01-31T13:54:56.5237026Z
If I don't list any subkeys, I don't get this issue, and I can't understand why.
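For example, this variant with only top-level keys (the names are hypothetical) runs without error:
['events.all']
| project EventData=bag_remove_keys(EventData, dynamic(['Key1', 'Key2']))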
UPDATE 2:
I found that bag_remove_keys() has a bug. With the query below, I get the error described in UPDATE 1:
datatable(d:dynamic)
[
    dynamic(
        {
            "test1": "val",
            "test2": {},
            "test3": "val"
        }
    )
]
| extend d1=bag_remove_keys(d, dynamic(['$.SomeKey.Sub1', '$.SomeKey.Sub2']))
However, if I move the "test2" key to the end, I don't get an error, but d1 will not show the "test2" key in the output.
Also, if bag_remove_keys() includes a key that matches a key from the input, like | extend d1=bag_remove_keys(d, dynamic(['$.SomeKey.Sub1', '$.SomeKey.Sub2', 'test1'])), then again it will not error, but it will remove "test2" from the output.
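For completeness, the non-erroring variant described above, with "test2" moved to the end, looks like this; it runs, but d1 silently drops "test2":
datatable(d:dynamic)
[
    dynamic({"test1": "val", "test3": "val", "test2": {}})
]
| extend d1=bag_remove_keys(d, dynamic(['$.SomeKey.Sub1', '$.SomeKey.Sub2']))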
Thanks for reporting it, Andrei. It is a bug and we are working on a fix.
Update: the fix has been checked in and will be deployed within two weeks. Please open a support ticket if you need it earlier.
I am using the COPY command to load data from a CSV file into a table via an internal stage.
After loading the data, I use the query below to get the number of rows loaded and failed:
SELECT * FROM TABLE(RESULT_SCAN('Copy_Query_ID'));
I also use the query below to get the actual failed records:
SELECT * FROM TABLE(VALIDATE("Table_Name", JOB_ID => 'Copy_Query_ID'));
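For reference, the full sequence I run looks roughly like this (table and column names are placeholders; JOB_ID => '_last' targets the most recent COPY executed in the session):
-- Load, then immediately inspect the COPY's own result row
COPY INTO my_table (col1, col2, col3, col4, col5, col6)
FROM (SELECT $1, $2, $3, $4, SYSDATE(), '10/20/2020' FROM @%my_table);
SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));
-- Records the COPY rejected
SELECT * FROM TABLE(VALIDATE(my_table, JOB_ID => '_last'));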
It worked fine a few times, but today I noticed that the first query shows this:
ROWS_PARSED  ROWS_LOADED  ERRORS_SEEN
10000        9600         400
So I was expecting 400 rows in the second query's result, but instead I see 10,400 records: all rows once, plus an additional 400 records for some other errors. If all rows are error rows, why were they loaded? Can I not use these queries for this purpose?
Note: my file has 6 fields, but I use only 4 of them in the COPY; the remaining two fields I populate with SYSDATE() and a literal. Maybe this is the reason for the mismatch?
COPY INTO table (col1, col2, col3, col4, col5, col6)
FROM (SELECT $1, $2, $3, $4, SYSDATE(), '10/20/2020' FROM @%table);
So I am guessing VALIDATE is not looking at my new values for fields 5 and 6, and is instead taking those values from the file?
I found numerous threads online about my problem, but I'm very new to Access, so they're hard to understand.
I'm having the following problem with my Access Database.
Microsoft Access set 0 fields to Null due to a type conversion failure, and it didn't add 0 records to the table due to key violations, 0 records due to lock violations, and 0 records due to validation rule violations. Do you want to run the query anyway?
I'm using Access 2013, and I have imported SharePoint list data into it.
The following is my query:
INSERT INTO [Content Metadata Master] ([Content Name], [Metadata Name], [Value])
SELECT Content.Name as [Content Name], 'Author 1' as [Metadata Name] , [Gold Metadata].[Author 1] as Value1
FROM Content, [Gold Metadata]
WHERE ((([Gold Metadata].[Case Number])='OM-0057' And ([Gold Metadata].[Case Number])=[Content].[Name]));
Can anyone please explain in simple terms how this problem occurs and how to resolve it without losing data?
Many Thanks,
Samadhan
Without knowing the exact structure of the data coming in and of the table accepting it, it's impossible to give a complete answer. But your problem is "due to a type conversion failure". This means that one or more of your values is not of the expected type.
In other words, you are trying to insert a string into a number field, or a date into a Boolean, and so on.
Something is inconsistent between the data you are pushing in and the table that is accepting it.
Check the data types.
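As a quick way to surface the mismatch, you can force an explicit conversion so a bad value fails loudly instead of being silently set to Null. A sketch using the query from the question (CStr() is illustrative and assumes [Value] is a Text field):
INSERT INTO [Content Metadata Master] ([Content Name], [Metadata Name], [Value])
SELECT Content.Name, 'Author 1', CStr([Gold Metadata].[Author 1])
FROM Content, [Gold Metadata]
WHERE [Gold Metadata].[Case Number]='OM-0057'
  AND [Gold Metadata].[Case Number]=Content.Name;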
While working I ran into a very weird thing in the phpFox script.
I added a new field to the user table; it is a TINYINT with a default value of 0. I then worked on letting the user set the value through links, and that finally succeeded. But when I try to read the value with getUserBy('name_of_the_field'), it gives me NULL, even though I checked the database table and found that the field has a value. Could you help me, please?
getUserBy() does not fetch every field in the user table; there is a predefined list of columns that it will fetch.
You will need to get this field in a different way, or write a plug-in for the hook "user.service_auth___construct_query" so that it loads your new field. I have not tried this, but I believe it should work as a plug-in for that hook:
$this->database()->select('u.my_new_field,');
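A fuller sketch of what that plug-in file might contain (the hook name comes from above; the file location follows the usual phpFox plug-in layout, which I have not verified):
<?php
// module/custom/include/plugin/user.service_auth___construct_query.php
// Append our column to the SELECT that loads the user row; the
// trailing comma matters because phpFox concatenates these fragments.
$this->database()->select('u.my_new_field,');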
Please help; I have been trying to fix this error for the better part of 8 hours so far. I have a report in Crystal Reports that just started throwing this error. I changed a field in the view that is attached to the report, so I opened my XSD file in VS2010, renamed the current DataTable to ViewTracker0, and then pulled in the ViewTracker view. I added my queries from the old DataTable, ensured that there is no primary key, double-checked that the length of each field matches the database, and made sure each column name matches the database. I can preview my data fine in the XSD, and in SQL I can run my queries and everything is returned correctly. But when I run the report, it dies every time with this error:
Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.
What do I need to check next?
Have you tried verifying the database within the Crystal Reports designer, then running the report?
Try traversing GetErrors(), as described here:
http://www.fransson.net/blog/failed-to-enable-constraints-one-or-more-rows-contain-values-violating-non-null-unique-or-foreign-key-constraints/
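In case it helps, a minimal sketch of that technique in C# (the fill delegate and DataSet are placeholders for your typed ViewTracker0 setup):
using System;
using System.Data;

static class ConstraintDebug
{
    // Run the fill that throws "Failed to enable constraints" and
    // dump every row/column that violated a constraint.
    public static void FillAndDump(DataSet ds, Action fill)
    {
        try
        {
            fill(); // e.g. () => adapter.Fill(ds.ViewTracker0)
        }
        catch (ConstraintException)
        {
            foreach (DataTable table in ds.Tables)
                foreach (DataRow row in table.GetErrors())
                {
                    Console.WriteLine("{0}: {1}", table.TableName, row.RowError);
                    foreach (DataColumn col in row.GetColumnsInError())
                        Console.WriteLine("  {0}: {1}", col.ColumnName, row.GetColumnError(col));
                }
            throw; // rethrow after logging the details
        }
    }
}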