Show row values horizontally in BIRT report

My scenario is that the default table shown in BIRT looks like this:
col1 | col2
val1 | val1
val2 | val2
val3 | val3
but I want to show the table like this:
column 1 | val1 val2 val3
column 2 | val1 val2 val3
Please give me a suggestion on how I can do this; it would be really helpful.
Thanks in advance.

You can create two headers and map the values in the column as val1 val2 val3, as shown in the example image below:

Related

How to replace all values of column with null values in nifi

I have a CSV file where:
column1 has names
column2 is age
For example:
column1, column2
Maria., 24
Sunio., 65
Morris., 45
There are 100 fields.
I want to replace column1 values with NULL:
Expected Output:
column1, column2
NULL, 24
NULL, 65
NULL, 45
How can I achieve this?
I have tried the UpdateRecord processor, but was not successful.
With the UpdateRecord processor you can update any column to any value you want.
Here is an example:
Note that there is no such thing as a null value in CSV; if you want to treat some value as null, you should specify that on the CSVReader or CSVRecordSetWriter.
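As a minimal sketch of the UpdateRecord approach (the processor and controller-service names are standard NiFi; the property values below are assumptions for this particular CSV):

```
UpdateRecord
  Record Reader:              CSVReader
  Record Writer:              CSVRecordSetWriter
  Replacement Value Strategy: Literal Value
  /column1:                   NULL     <- dynamic property; replaces every
                                          column1 value with the text "NULL"
```

Because CSV itself has no null representation, writing the literal string NULL produces the expected output directly; alternatively, configure the reader/writer to map a chosen string to null as described above.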

How to call an Oracle stored procedure using multiple param values from a table

I have a requirement to call a stored procedure whose multiple input parameter values come from a table.
For example:
sp(p1, p2, p3, ...);
Table looks like this:
Col1 | Col2
-----+-------
p1 | val1
p2 | val2
p3 | val3
How can we achieve this in a PL/SQL block?
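One common approach (a sketch only; the table name params and the three-parameter procedure signature are assumptions, since neither is given in the question) is to select each parameter value into a variable and then call the procedure:

```sql
-- Sketch, assuming a table params(col1, col2) holding the parameter
-- names and values, and a procedure sp taking three parameters.
DECLARE
  v_p1 params.col2%TYPE;
  v_p2 params.col2%TYPE;
  v_p3 params.col2%TYPE;
BEGIN
  -- Look up each parameter value by its name in col1
  SELECT col2 INTO v_p1 FROM params WHERE col1 = 'p1';
  SELECT col2 INTO v_p2 FROM params WHERE col1 = 'p2';
  SELECT col2 INTO v_p3 FROM params WHERE col1 = 'p3';
  -- Call the procedure with the values read from the table
  sp(v_p1, v_p2, v_p3);
END;
/
```

If the number of parameters is not fixed, you would instead fetch the rows in a cursor loop and build the call dynamically (e.g. with EXECUTE IMMEDIATE), but for a known parameter list the simple form above is enough.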

JPA spring boot insert into child table based on condition

I have a one-to-many relationship between a parent and a child table, but I want to insert into the child table only when a condition holds, while always inserting into the parent table.
I think it would be helpful if you could provide the data structure with example records, like:
table 1
column1 | column 2 | column 3
value | value | value
table 2
column1 | column 2 | column 3
value | value | value

How to create KSQL table from a topic with composite key?

I have some topic data with the fields stringA stringB and I am just trying to use that as the key when creating a KSQL table from a topic.
Here's an example. First, I'll create and populate a test stream
ksql> CREATE STREAM TEST (STRINGA VARCHAR,
STRINGB VARCHAR,
COL3 INT)
WITH (KAFKA_TOPIC='TEST',
PARTITIONS=1,
VALUE_FORMAT='JSON');
Message
----------------
Stream created
----------------
ksql> INSERT INTO TEST (STRINGA, STRINGB, COL3) VALUES ('A','B',1);
ksql> INSERT INTO TEST (STRINGA, STRINGB, COL3) VALUES ('A','B',2);
ksql> INSERT INTO TEST (STRINGA, STRINGB, COL3) VALUES ('C','D',3);
ksql>
ksql> SET 'auto.offset.reset' = 'earliest';
Successfully changed local property 'auto.offset.reset' to 'earliest'. Use the UNSET command to revert your change.
ksql> SELECT * FROM TEST EMIT CHANGES LIMIT 3;
+--------------+--------+---------+----------+------+
|ROWTIME |ROWKEY |STRINGA |STRINGB |COL3 |
+--------------+--------+---------+----------+------+
|1578569329184 |null |A |B |1 |
|1578569331653 |null |A |B |2 |
|1578569339177 |null |C |D |3 |
Note that ROWKEY is null.
Now I'll create a new stream, populated from the first, create the composite column, and set it as the key. I'm also including the original fields themselves, but this is optional if you don't need them:
ksql> CREATE STREAM TEST_REKEY AS
SELECT STRINGA + STRINGB AS MY_COMPOSITE_KEY,
STRINGA,
STRINGB,
COL3
FROM TEST
PARTITION BY MY_COMPOSITE_KEY ;
Message
------------------------------------------------------------------------------------------
Stream TEST_REKEY created and running. Created by query with query ID: CSAS_TEST_REKEY_9
------------------------------------------------------------------------------------------
Now you have a stream of data with the key set to your composite key:
ksql> SELECT ROWKEY , COL3 FROM TEST_REKEY EMIT CHANGES LIMIT 3;
+---------+-------+
|ROWKEY |COL3 |
+---------+-------+
|AB |1 |
|AB |2 |
|CD |3 |
Limit Reached
Query terminated
You can also inspect the underlying Kafka topic to verify the key:
ksql> PRINT TEST_REKEY LIMIT 3;
Format:JSON
{"ROWTIME":1578569329184,"ROWKEY":"AB","MY_COMPOSITE_KEY":"AB","STRINGA":"A","STRINGB":"B","COL3":1}
{"ROWTIME":1578569331653,"ROWKEY":"AB","MY_COMPOSITE_KEY":"AB","STRINGA":"A","STRINGB":"B","COL3":2}
{"ROWTIME":1578569339177,"ROWKEY":"CD","MY_COMPOSITE_KEY":"CD","STRINGA":"C","STRINGB":"D","COL3":3}
ksql>
With this done, we can now declare a table on top of the re-keyed topic:
CREATE TABLE TEST_TABLE (ROWKEY VARCHAR KEY,
COL3 INT)
WITH (KAFKA_TOPIC='TEST_REKEY', VALUE_FORMAT='JSON');
From this table we can query the state. Note that the composite key AB only shows the latest value, which is part of the semantics of a table (compare to the stream above, in which you see both values - both stream and table are the same Kafka topic):
ksql> SELECT * FROM TEST_TABLE EMIT CHANGES;
+----------------+---------+------+
|ROWTIME |ROWKEY |COL3 |
+----------------+---------+------+
|1578569331653 |AB |2 |
|1578569339177 |CD |3 |
Just an update to @Robin Moffat's answer: use the below
CREATE STREAM TEST_REKEY AS
SELECT STRINGA + STRINGB AS MY_COMPOSITE_KEY,
STRINGA,
STRINGB,
COL3
FROM TEST
PARTITION BY STRINGA + STRINGB ;
instead of
CREATE STREAM TEST_REKEY AS
SELECT STRINGA + STRINGB AS MY_COMPOSITE_KEY,
STRINGA,
STRINGB,
COL3
FROM TEST
PARTITION BY MY_COMPOSITE_KEY ;
NOTE: column ordering matters
Worked for me! (CLI v0.10.1, Server v0.10.1)

Explode Hive Map data object into long format

I have a map data type in a table with a fairly large number of key/value pairs (10-30). When I explode the keys and values, I get the result below:
SELECT id, key, value
FROM tbl1
LATERAL VIEW explode(map_field) feature_cols AS key, value
Results:
id, key1, value1
id, key2, value2
id, key3, value3
However, I would like to see:
id, key1, key2, key3
1, value1, value2, value3
Is there any command that either produces my desired format, or is there any command to convert exploded output to the long format I desire?
We need to transpose the rows into columns after the lateral view explode. You can write the query as below; the MAX aggregate and GROUP BY collapse the exploded rows back into one row per id:
SELECT id,
       MAX(CASE WHEN key = 'key1' THEN value END) AS key1,
       MAX(CASE WHEN key = 'key2' THEN value END) AS key2,
       MAX(CASE WHEN key = 'key3' THEN value END) AS key3
FROM (SELECT id, key, value
      FROM tbl1
      LATERAL VIEW explode(map_field) feature_cols AS key, value) temp
GROUP BY id
