PowerCenter - SetVariable is always zero - ETL

I have a problem when assigning a mapping variable.
It is assigned in an expression, but when I check it in the Workflow it is always zero.

I found the problem: a field value coming from a flat file exceeded the precision of the decimal type (11).
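For context, the expression involved looks something like this (variable and port names are hypothetical):
-- Expression transformation (names are hypothetical)
-- SETVARIABLE stores the incoming port value into the mapping variable;
-- it kept showing zero until the decimal(11) port was widened so the
-- flat-file value no longer overflowed it
SETVARIABLE($$V_LAST_AMOUNT, IN_AMOUNT)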
Thank you

Related

DECIMAL value out of range

I am trying to publish data from our SAS environment into a remote Hadoop/Hive database (as sequence files). I'm performing basic tests by taking some source data from our business users and using a data step to write out to the Hadoop library.
I'm getting errors indicating that a value at row X is out of range.
For example:
ERROR: Value out of range for column BUY_RT1, type DECIMAL(5, 5). Disallowed value is: 0.
The source data has a numeric format of 6.5, and the actual value is .00000.
Why is .00000 out of range? Would the format for Hadoop need to be DECIMAL(6, 5)?
I get the same error when the value is 0.09:
ERROR: Value out of range for column INT_RT, type DECIMAL(5, 5). Disallowed value is: 0.09
You may need to check the actual values in SAS. If a numeric value in SAS has a format applied, you will see the formatted (possibly rounded) version of the numeric value wherever you output the value, but the underlying numeric may still have more significant digits that you're not seeing, due to the format.
For example, you say your source data has a format of 6.5 and the 'actual value' is 0.00000; are you sure that's the actual value? To check, you could try comparing the value to a literal 0, or writing the value to the SAS log with a different format like BEST32. (e.g. put BUY_RT1 best32.;).
If this is the problem, the solution is to properly round the source numeric values, rather than just applying a format.
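If it helps, here is a minimal sketch of that check and of the rounding fix, assuming a SAS data set WORK.SRC with the column BUY_RT1 (both names are hypothetical):
/* show the full underlying value rather than the formatted 6.5 version */
data _null_;
    set work.src;
    put BUY_RT1= best32.;
run;
/* round the stored value itself (not just its format) to 5 decimals so it fits DECIMAL(5,5) */
data work.src_rounded;
    set work.src;
    BUY_RT1 = round(BUY_RT1, 0.00001);
run;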

Get warnings while exporting data to file in ibm datastage

As the title says: when I export data to a sequential file in IBM DataStage, I get warnings like this:
When validating export schema: At field "ALLOCATERATE": "null_field" length (4) must match field's fixed width (7)
I set "null_field_value" to "null" in the "Format" tab,
and "ALLOCATERATE" is a decimal field (other fields of types such as date/time/timestamp get the same warning).
Although I set the "pad char" option in "type defaults" as the help document says, I still get the warnings.
It seems that the job treats the decimal/timestamp fields as fixed-length char fields.
Is there a way to eliminate those warnings? Thanks for any help.
Jason,
The null field value should be "" and not "null"; as written, it will output the word "null" (which is 4 characters long, and is where your error comes from). Or, even better, just don't set it at all.
Null field value. Specifies the value written to null field if the source is set to null
Which, in my understanding, will overwrite the null with the contents of the specified value.
As for the padding, from the same link above:
Record length. Select Fixed where fixed length fields are being written.
This is an additional property that you need to configure (it's in your screenshot).
I hope this points you in the right direction.
Dan

Is it possible to multiply by -1 in a CRM Dynamics 2016 Workflow?

I'm trying to create a workflow that would make the target field value a negative number. I want to create related records as credits and debits and then be able to sum them up to get a net value.
I've tried to update the field to -1 and then multiply it accordingly, but I get an error stating that the value needs to be between 0 and 1,000,000,000,000. I've also tried just multiplying the value by -1, but that doesn't work either: the workflow runs, but the value doesn't change.
Building on the comment from #MarioZG, it looks like your CRM field doesn't allow negative numbers.
When you set up a number field (Decimal, Currency, Floating Point, or Whole Number) you can specify the range of acceptable values. Here's a quick screenshot of the Whole Number's properties.
I actually figured this one out. I created two different workflows to create records in one entity with different "Types". Then, on the account, I had one rollup field to sum one type and another rollup field to sum the other type, and a calculated field to subtract one from the other.
Use the Decimal Number data type for the field (looking at your use case); that way you have the flexibility to store all sorts of numbers.

Cannot identify the origin of an error - hypothesis: decimal format conflicts

While executing a DB2 (V8) stored procedure, I get the following error:
SQL0304N A value cannot be assigned to a host variable because the value is
not within the range of the host variable's data type. SQLSTATE=22003
I did not set up any tracing or specific error handling, and since the error only occurs in our client's validation environment, which I am not allowed to play with, I do not have many options but to analyze my code again.
Here is the result of my analysis so far. Google is not much help...
My "10 pages" procedure opens a CURSOR over a set of data, goes through it, and computes values for each element to be inserted into a table.
I have checked (hopefully) all my variable types against the data types used to fill them and against the data types of the target table, and I do not see any conflict there.
Since there are a lot of decimal numbers, multiplications and additions, my only hypothesis is that a computed value becomes too large for a defined variable. Could anyone confirm that this would produce that error? And would it also apply if the computation produces more digits after the decimal point than the target variable type allows (e.g. 100000.123 into decimal(6,2))?
I also tried to find a way to debug DB2 SQL PL through a client, but I did not find any solution. If you have any suggestions...
Many thanks in advance for any clue :)
I'll answer myself...
First, my last question: I did not find any way to debug DB2 SQL PL through a client (with DB2 V8, at least).
After I was authorized to work on our client's integration environment, I could confirm my hypothesis was right: the variable receiving the multiplication result was sometimes too small (decimal(10,2)) for the computed value.
The solution adopted was to change the variable to decimal(15,2). Since the final value to insert still had to be decimal(10,2) per the client's requirements, we validated the following with our client:
1 - Check the variable value:
if (myval > 9999999.99) then
    set myval = 9999999.99;
end if;
=> back to the decimal(10,2) requirement
2 - Cast back to decimal(10,2) at insert time. This last bit of code also solves the issue of too many digits after the decimal point, which was causing an error at insert time as well:
insert into mytable values (
    ... ,
    CAST(myval AS DECIMAL(12, 2)),
    ...
)
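For reference, here is a minimal end-to-end sketch of the pattern (procedure, variable, and table names are hypothetical), in DB2 SQL PL:
-- hypothetical sketch: compute in a wider variable, cap it, then cast back on insert
CREATE PROCEDURE demo_decimal_fix ()
LANGUAGE SQL
BEGIN
    DECLARE v_rate   DECIMAL(10,6);
    DECLARE v_amount DECIMAL(10,2);
    DECLARE v_result DECIMAL(15,2);     -- wide enough for the intermediate result

    SET v_rate   = 1234.567890;
    SET v_amount = 99999999.99;
    SET v_result = v_rate * v_amount;   -- would overflow a DECIMAL(10,2) variable

    IF (v_result > 9999999.99) THEN     -- cap agreed with the client
        SET v_result = 9999999.99;
    END IF;

    INSERT INTO mytable (my_decimal_col)        -- mytable / my_decimal_col are hypothetical
    VALUES (CAST(v_result AS DECIMAL(10, 2)));  -- the CAST also drops extra decimal digits
END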

Oracle: error while trying to use formulas

I created an element with an input value of type "Day". When I write a formula, I get this error.
Any idea what's wrong?
APP-FF-33232: EATC_EXTRA_DAYS_ENTRY_EFFECTIVE_DATE_ENTRY_VALUE has null or not found allowed, but no default set specified.
Cause: If a Database Item has null allowed, or not found allowed, then the item must also specify a default set to be used to provide default values in the event of these occurring. The item named has one of these conditions allowed, but the default set column in the FF_DATABASE_ITEMS table is null.
Action: Please refer to your local support representative.
I'm not an expert in Oracle Apps (to say the least), but the error message is fairly clear. You (or someone) have written a Fast Formula which references a database item, EATC_EXTRA_DAYS_ENTRY_EFFECTIVE_DATE_ENTRY_VALUE. Apparently this item can be null, in which case your formula needs to provide a default value. Something like:
default for EATC_EXTRA_DAYS_ENTRY_EFFECTIVE_DATE_ENTRY_VALUE is 01-JAN-2010
Or perhaps you can use SYSDATE or CURRENT_DATE rather than a fixed value.
Solution to the error: you called a database item in the Fast Formula;
you need to initialize it to a specific date:
alias EATC_EXTRA_DAYS_ENTRY_EFFECTIVE_DATE_ENTRY_VALUE as day
default for day is 01-jan-2010
