Drop-down list formula in Quickbase

I have a simple question: is it possible to fill a drop-down list with two values from a table?
I have a table with field X and field Y, and I want the drop-down in my form to show:
Value1 YField - Value1 XField
Value2 YField - Value2 XField
Value3 YField - Value3 XField
...
Or do I have no choice but to add another drop-down to select my value from, with a Text (formula) field under it that uses the selected value to build what I want?
I would like to avoid overloading the form if possible.
Thank you!

So I have solved my problem:
What I did is add a field to my table holding the concatenated value
YField - XField
and in my drop-down list I link a reference to that field.
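For reference, that extra field can be a Text (formula) field; a minimal sketch, assuming the fields are literally named YField and XField:
[YField] & " - " & [XField]
The drop-down then points at this formula field, so each choice displays both values without adding extra elements to the form.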

Related

Power Query: dealing with a column containing text values and tables

I am dealing with a column that has text values mixed with table data.
I would like to operate on this column as follows:
if the value is a Table: aggregate the table by combining its values with a comma separator
if the value is text: keep the original value
The result would be a column containing only text values.
I use this function to aggregate the values for Table, but I get an error when the value is a text:
= Table.AggregateTableColumn(#"Lignes filtrées1", "value.1", {{"Element:Text", each Text.Combine(List.Transform(_, (x) => Text.From(x)), ", "), "Desired Result"}})
Would you have some tips to help me with this problem?
Thanks in advance.
It is hard to tell what you have in your table.
As an example, if the embedded table had a single column (say, TableColumn1), then you could add a column like
#"Added Custom" = Table.AddColumn(#"PriorStepName", "Custom", each try Text.Combine([Column1][TableColumn1],",") otherwise [Column1])
and get the combined, comma-separated text in the new Custom column, with plain text values kept as-is.
If your table is more complicated, you'd probably have to unpivot it first, or otherwise give us a hint about how to transform it into a single cell.
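For a self-contained illustration, here is a sketch you can paste into a blank query; the column names (value.1, TableColumn1) are assumptions matching the example above:
let
    // Hypothetical sample: value.1 holds either plain text or a nested table
    Source = #table({"value.1"}, {
        {"plain text"},
        {#table({"TableColumn1"}, {{"a"}, {"b"}, {"c"}})}
    }),
    // try/otherwise: combine the nested table's rows, or keep the text as-is
    Combined = Table.AddColumn(Source, "Custom",
        each try Text.Combine([value.1][TableColumn1], ", ") otherwise [value.1])
in
    Combined
The try ... otherwise pattern is what makes the mixed column workable: the table branch errors on text rows, and the fallback simply returns the original value.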

Cannot update a row with a bind variable in UPDATE statement

I am using Oracle SQL Developer 4.0.0.13.
Query:
UPDATE employes
SET emptime = systimestamp
WHERE emp_id = 123
AND emp_device = :abc;
Field definition: emp_device CHAR(20 BYTE)
Value: 99998000000008880999 (this value is present in the table)
When I run the above query in SQL Developer, it asks me for the value of the bind variable; I paste it into the text box, and it returns 0 rows updated.
But when I remove the bind variable in the update query and specify the actual value, it updates the column value. Below is the query.
Query:
UPDATE employes
SET emptime = systimestamp
WHERE emp_id = 123
AND emp_device = 99998000000008880999;
-- (works)
Also, when I add some trailing spaces in the bind variable text box and TRIM the bind value in the query, the update works. Below is the query.
Query:
UPDATE employes
SET emptime = systimestamp
WHERE emp_id = 123
AND emp_device = trim(:abc);
-- (works --- :abc value is '99998000000008880999 ')
I do not know what is wrong with it. Can someone please take a look and suggest a solution?
You are using the CHAR type for your emp_device column. Note that CHAR always blank-pads the stored string out to its fixed width (see the Oracle documentation on CHAR comparison semantics).
You should use VARCHAR2 as the datatype if you are expecting a string, or just NUMBER, since your example consists purely of numeric values.
In the dialog box, enter your parameter as '99998000000008880999', including the apostrophes.
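If the column must stay CHAR, here is a sketch of two common workarounds, reusing the names from the question:
-- 1) Blank-pad the bind value to the declared column width
UPDATE employes
SET emptime = systimestamp
WHERE emp_id = 123
AND emp_device = RPAD(:abc, 20);
-- 2) Or normalize both sides before comparing
UPDATE employes
SET emptime = systimestamp
WHERE emp_id = 123
AND TRIM(emp_device) = TRIM(:abc);
Note that applying TRIM to the column side prevents a plain index on emp_device from being used, so the RPAD variant is usually preferable.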

Decode in insert statement

I have many columns in a table, of which two are:
A : number
B : varchar
I am trying to insert a value into B based on the value of A, from Java.
insert into table(A,B) values (? , decode('A',110,'ABC',NA));
This gives me an error: illegal number.
So I tried the below:
insert into table(A,B) values (? , decode('A','110','ABC','NA'));
This always inserts NA into the column.
Can someone please help me with this?
Instead of using the string literal 'A', you should provide the actual content of the variable A, for example as a second bind parameter.
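A sketch of what that looks like (my_table is a hypothetical name standing in for the question's table; binding the same Java value twice is the assumption here):
INSERT INTO my_table (A, B)
VALUES (?, DECODE(?, 110, 'ABC', 'NA'));
From Java, set both placeholders to the same value (ps.setInt(1, a); ps.setInt(2, a);), so DECODE compares the number itself rather than the literal string 'A'.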

String-to-map conversion in Hive

I have a table with four columns:
C1 C2 C3 C4
--------------------
x1 y1 z1 d1
x2 y2 z2 d2
Now I want to convert it into a map data type with key/value pairs and load it into a separate table.
create table test
(
level map<string,string>
)
row format delimited
COLLECTION ITEMS TERMINATED BY '&'
map keys terminated by '=';
Now I am using the SQL below to load the data:
insert overwrite table test
select str_to_map(concat('level1=',c1,'&','level2=',c2,'&','level3=',c3,'&','level4=',c4) from input;
Select query on the table:
select * from test;
{"level1":"x1","level2":"y1","level3":"z1","level4":"d1=\\"}
{"level1":"x2","level2":"y2","level3":"z2","level4":"d2=\\"}
I don't get why I am getting the extra "=\\" in the last value.
I double-checked the data, but the issue persists.
Can you please help?
str_to_map(text, delimiter1, delimiter2) - Creates a map by parsing text.
It splits text into key-value pairs using two delimiters: the first delimiter separates pairs, and the second separates key and value. If only one parameter is given, default delimiters are used: ',' as delimiter1 and '=' as delimiter2.
You can get this info by running this command:
describe function extended str_to_map
In your syntax there are two errors:
insert overwrite table test
select str_to_map(concat('level1=',c1,'&','level2=',c2,'&','level3=',c3,'&','level4=',c4) from input;
First, one closing bracket ) is missing.
Second (not really an error as such): you have not given the delimiters, so the function falls back to the defaults, which is why you are getting the stray characters in your result.
To get the output in the desired format, you should try this query:
insert overwrite table test
select str_to_map(concat('level1=',c1,'&','level2=',c2,'&','level3=',c3,'&','level4=',c4),'&','=') from input;
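On recent Hive versions you can sanity-check the delimiter arguments on a literal before running the insert (hypothetical values):
select str_to_map('level1=x1&level2=y1', '&', '=');
-- {"level1":"x1","level2":"y1"}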

Reading a CSV with a column header and loading it into Hive tables

I have a CSV file with the column header inside the file.
e.g.
Column1 Column2 Column3
value1 value2 value3
value1 value2 value3
value1 value2 value3
value1 value2 value3
Now I want to create the Hive table using this header and then load the file, without the header line, into the table.
Can anyone please suggest what approach should be followed in this case?
You can specify
tblproperties ("skip.header.line.count"="1");
see this SO question (Hive External table-CSV File- Header row)
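A minimal sketch of the full DDL (the column names and types follow the example; the field delimiter and path are placeholder assumptions):
create external table my_csv (
  Column1 string,
  Column2 string,
  Column3 string
)
row format delimited
fields terminated by ','
location '/path/to/csv/dir'
tblproperties ("skip.header.line.count"="1");
With this property set, queries skip the first line of each file, so the header never shows up as data.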
You should remove the header line before loading the data into HDFS; there are no other options here.
