My table has a field that is binary_double. However, I want to convert it to varchar.
The column currently stores sample values as binary_double. It looks like this:
69623829
I want the result to be returned in the same format when I convert it to varchar, so the expected output is like this:
69623829
I have tried this:
select
convert(varchar(20),r.col_14,1)
from sample_table r
The error message is:
ORA-00936: missing expression
P.S.: I am just starting off in PL/SQL.
I suggest using the TO_CHAR function. See TO_CHAR.
As in
SELECT TO_CHAR(col_14, '99999999') FROM sample_table
Please see the linked documentation for the format model that matches your requirements.
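If you would rather not hard-code the width, the 'TM9' (text minimum) format model is another option; a minimal sketch, assuming the column is named col_14 as in the question:
SELECT TO_CHAR(r.col_14, 'TM9') FROM sample_table r
TM9 prints the value in fixed (non-scientific) notation with no leading spaces.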
Use the CAST function:
CAST ( { expr | ( subquery ) | MULTISET ( subquery ) } AS type_name )
so:
select CAST(col_14 as varchar2(20)) from sample_table
Oracle live compiler - CAST as varchar2
For more, check this link:
CAST function
I am looking for a built-in UDF or any other method to convert the values of a string column to integer in my Phoenix table, for sorting with SELECT and ORDER BY. I searched the Apache Phoenix language manual, but to no avail. Any other suggestions are also welcome.
Actual Query
select "values" from "test_table"
I tried the approach below, but it did not work:
select TO_NUMBER("values", '\u00A4') from "test_table"
TO_NUMBER returns a decimal, but you can cast the result to INTEGER:
SELECT CAST(TO_NUMBER(MY_COLUMN) AS INTEGER) FROM MY_DB
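Since the goal in the question is sorting, the same expression can go straight into the ORDER BY; a minimal sketch using the quoted identifiers from the question:
SELECT "values"
FROM "test_table"
ORDER BY CAST(TO_NUMBER("values") AS INTEGER)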
select TO_NUMBER("values") from "test_table";
see https://phoenix.apache.org/language/functions.html#to_number
I have a function which returns column names, and I am trying to use the column name as part of my SELECT statement, but my results come back as the column name instead of the values.
FUNCTION returning column name:
get_col_name(input1, input2)
Can I use this query to get the results of that column from the table?
SELECT GET_COL_NAME(input1,input2) FROM TABLE;
There are a few ways to run dynamic SQL directly inside a SQL statement. These techniques should be avoided since they are usually complicated, slow, and buggy. Before you do this try to find another way to solve the problem.
The below solution uses DBMS_XMLGEN.GETXML to produce XML from a dynamically created SQL statement, and then uses XML table processing to extract the value.
This is the simplest way to run dynamic SQL in SQL, and it only requires built-in packages. The main limitation is that the number and type of columns is still fixed. If you need a function that returns an unknown number of columns you'll need something more powerful, like the open source program Method4. But that level of dynamic code gets even more difficult and should only be used after careful consideration.
Sample schema
--drop table table1;
create table table1(a number, b number);
insert into table1 values(1, 2);
commit;
Function that returns column name
create or replace function get_col_name(input1 number, input2 number) return varchar2 is
begin
if input1 = 0 then
return 'a';
else
return 'b';
end if;
end;
/
Sample query and result
select dynamic_column
from
(
select xmltype(dbms_xmlgen.getxml('
select '||get_col_name(0,0)||' dynamic_column from table1'
)) xml_results
from dual
)
cross join
xmltable
(
'/ROWSET/ROW'
passing xml_results
columns dynamic_column varchar2(4000) path 'DYNAMIC_COLUMN'
);
DYNAMIC_COLUMN
--------------
1
If you change the inputs to the function, the new value is 2, from column B. Use this SQL Fiddle to test the code.
I have data in the date column as below.
reportDate
21-Jan-17
02-FEB-17
I want to write a query to fetch the data for 01/21/2017.
The query below is not working in Oracle:
SELECT * FROM tablename where reportDate=to_date('01/21/2017','mm/dd/yyyy')
What is the data type of reportDate? It may be DATE or VARCHAR2 and there is no way to know by just looking at it.
Run describe table_name (where table_name is the name of the table that contains this column) and see what it says.
If it's a VARCHAR2 then you need to convert it to a date as well. Use the proper format model: 'dd-Mon-rr'.
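For example, a sketch of the VARCHAR2 case (assuming the stored strings really look like 21-Jan-17):
where to_date(reportDate, 'dd-Mon-rr') = to_date('01/21/2017', 'mm/dd/yyyy')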
If it's DATE, it may have a time-of-day component; you could apply trunc() to it, but for performance it is better to avoid calling functions on your columns if you can. In this case (if it really is the DATE data type) the where condition should be
where report_date >= to_date('01/21/2017','mm/dd/yyyy')
and report_date < to_date('01/21/2017','mm/dd/yyyy') + 1
Note that the date on the right-hand side can also be written, better, as
date '2017-01-21'
(This is the ANSI standard date literal, which requires the keyword date and exactly the format shown, since it doesn't use a format model; use - as the separator and the format yyyy-mm-dd.)
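Putting the pieces together (a sketch, assuming reportDate really is a DATE column):
SELECT *
FROM tablename
WHERE reportDate >= DATE '2017-01-21'
  AND reportDate <  DATE '2017-01-21' + 1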
The query should be something like this
SELECT *
FROM table_name
WHERE TRUNC(column_name) = TO_DATE('21-JAN-17', 'DD-MON-RR');
The TRUNC function strips the time-of-day portion from the date value in that column, so the comparison matches every row from that day.
This is the output I got when I executed it in SQL Developer:
https://i.stack.imgur.com/blDCw.png
I have 2 Oracle 11g databases with a table containing a XMLType column and some test data differing only in the separator (.,) for the milliseconds of the timestamp:
create table TEST_TIMESTAMP (
ID number(19,0) constraint "NN_TEST_TIMESTAMP_ID" not null,
DOC xmltype constraint "NN_TEST_TIMESTAMP_DOC" not null
);
insert into TEST_TIMESTAMP values ( 1, xmltype('<?xml version="1.0" encoding="utf-8"?><test><ts>2015-04-08T04:55:33.11</ts></test>'));
insert into TEST_TIMESTAMP values ( 2, xmltype('<?xml version="1.0" encoding="utf-8"?><test><ts>2015-04-08T04:55:33,11</ts></test>'));
When I try to extract the timestamp with the following statements, it fails either with the first document on one database or with the second document on the other database.
select x.*
from TEST_TIMESTAMP t,
xmltable(
'/test'
passing t.DOC
columns
ORIGINAL varchar2(50) path 'ts',
RESULT timestamp with time zone path 'ts'
) x
where t.ID = 1;
select x.*
from TEST_TIMESTAMP t,
xmltable(
'/test'
passing t.DOC
columns
ORIGINAL varchar2(50) path 'ts',
RESULT timestamp with time zone path 'ts'
) x
where t.ID = 2;
The error I get:
ORA-01858: a non-numeric character was found where a numeric was expected
01858. 00000 - "a non-numeric character was found where a numeric was expected"
*Cause: The input data to be converted using a date format model was
incorrect. The input data did not contain a number where a number was
required by the format model.
*Action: Fix the input data or the date format model to make sure the
elements match in number and type. Then retry the operation.
The only differences between those databases I've found are:
DB1: version=11.2.0.1.0, NLS_CHARACTERSET=AL32UTF8 -> fails on document 2
DB2: version=11.2.0.2.0, NLS_CHARACTERSET=WE8MSWIN1252 -> fails on document 1
DB1 has the behaviour that I would expect. Does anybody know why those databases behave differently and how to fix the issue in DB2?
Thanks in advance,
Oliver
My guess is that the nls_timestamp_format is different between the two databases.
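To check that guess, you could compare the relevant session settings on both databases; a quick sketch (for a timestamp with time zone column, NLS_TIMESTAMP_TZ_FORMAT is the setting most likely to matter):
select parameter, value
from nls_session_parameters
where parameter in ('NLS_TIMESTAMP_FORMAT', 'NLS_TIMESTAMP_TZ_FORMAT');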
However, rather than forcing the implicit conversion down at the XMLTABLE level, I would do an explicit conversion in the select list:
with test_timestamp as (select 1 id, xmltype('<?xml version="1.0" encoding="utf-8"?><test><ts>2015-04-08T04:55:33.11</ts></test>') doc from dual union all
select 2 id, xmltype('<?xml version="1.0" encoding="utf-8"?><test><ts>2015-04-08T04:55:33,11</ts></test>') doc from dual)
select x.original,
to_timestamp(x.original, 'yyyy-mm-dd"T"hh24:mi:ss,ff2') result
from test_timestamp t,
xmltable('/test' passing t.doc
columns original varchar2(50) path 'ts') x;
ORIGINAL RESULT
-------------------------------------------------- --------------------------------------------------
2015-04-08T04:55:33.11 08/04/2015 04:55:33.110000000
2015-04-08T04:55:33,11 08/04/2015 04:55:33.110000000
N.B. I found that using "ss.ff2" errored, but "ss,ff2" handled both cases just fine. I'm not sure whether that relies on some other NLS setting, though.
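If you would rather keep a dot in the format model, another option is to normalize the separator before converting; a sketch (whether this sidesteps the NLS sensitivity entirely is something to verify on both databases):
to_timestamp(replace(x.original, ',', '.'), 'yyyy-mm-dd"T"hh24:mi:ss.ff2') result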
I need to be able to reconstruct a table column by using the column data in DBA_TAB_COLUMNS, so to develop this I need to understand what each column refers to. I'm looking to understand what DATA_TYPE_MOD is -- the documentation (http://docs.oracle.com/cd/B19306_01/server.102/b14237/statviews_2094.htm#I1020277) says it is a data type modifier, but I can't seem to find any columns with this field populated, or any way to populate it with a dummy column. Is anyone familiar with this field?
The DATA_TYPE_MOD column of the [ALL|DBA|USER]_TAB_COLUMNS data dictionary views gets populated when a column of a table is declared as a reference to an object type using the REF datatype (the column then contains the object identifier (OID) of the object it points to).
create type obj as object(
  item number
);
/
create table tb_1(
  col ref obj
);
select t.table_name
, t.column_name
, t.data_type_mod
from user_tab_columns t
where t.table_name = 'TB_1'
Result:
TABLE_NAME   COLUMN_NAME   DATA_TYPE_MOD
-----------  ------------  -------------
TB_1         COL           REF
Oracle has a PL/SQL package that can be used to generate the DDL for creating a table. You would probably be better off using this.
See GET_DDL on http://docs.oracle.com/cd/B19306_01/appdev.102/b14258/d_metada.htm#i1019414
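For example, a minimal sketch against the tb_1 table created above (the call returns a CLOB containing the CREATE TABLE statement):
select dbms_metadata.get_ddl('TABLE', 'TB_1') from dual;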
And see also:
How to get Oracle create table statement in SQL*Plus