Convert Milliseconds to Hours, minutes and seconds - time

Yes I know there are others who have asked the same question but their solution won't work in this case.
Here is my problem.
I'm summing a very large number of integers. In fact, so many that a plain SUM overflows.
So I do this:
Sum(cast(LotsofIntegers as decimal)) which gives me 3472201304
I want to view this in hh:mm:ss. The problem is that the DATEADD function won't accept such a large number; otherwise I could just do this:
CONVERT(VARCHAR, DATEADD(ms, Sum(cast(LotsofIntegers as decimal)), 0), 114)
which is the common solution.
I'd rather not have to do this the very hard way with a whole bunch of divisions.
Can anyone assist?

Try this (MySQL syntax; convert to your RDBMS of choice):
SELECT CONCAT(
    CAST((@hours := FLOOR(SUM(msec) / 3600000)) AS CHAR),
    ":",
    CAST((@minutes := FLOOR((SUM(msec) - @hours * 3600000) / 60000)) AS CHAR),
    ":",
    CAST((@seconds := FLOOR((SUM(msec) - @hours * 3600000 - @minutes * 60000) / 1000)) AS CHAR)
) FROM my_table WHERE 1;
On a table of 5,000,001 rows with a sum of 25,000,706,152 (under 64-bit arithmetic), I get the correct answer of 6944:38:26.
The problem appears to be that most common RDBMSs' date-arithmetic functions (including DATEADD in MSSQL, which you appear to be using) can only accept on the order of 2^31 milliseconds of difference because of a 4-byte signed internal representation. This is a limitation of the RDBMS in question; apparently working with more than a couple of billion msec at a time was outside the envisioned use cases. So unfortunately, unless patches are released to extend the built-in functionality (or there exists an upgraded function I haven't heard of!), the long way is the way to do it.
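That long way is only a handful of integer divisions, though. A minimal T-SQL sketch, using the total from the question and assuming it fits in a BIGINT (hours are left unpadded, minutes and seconds are zero-padded):

DECLARE @total_ms BIGINT = 3472201304;  -- the sum from the question

SELECT CAST(@total_ms / 3600000 AS VARCHAR(20)) + ':'                        -- whole hours
     + RIGHT('0' + CAST((@total_ms / 60000) % 60 AS VARCHAR(2)), 2) + ':'    -- minutes 00-59
     + RIGHT('0' + CAST((@total_ms / 1000) % 60 AS VARCHAR(2)), 2)           -- seconds 00-59
     AS hhmmss;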

Related

Oracle DECODE statement failing on RHEL but works on HPUX

I've isolated a very specific code piece that works on our HP-UX Oracle 11.2 environment, but fails on our RHEL 7.1 Oracle 11.2 environment. Any pointers as to why this would happen?
Everything else (except PSU level) is generally the same.
TIA
AND v_effective_date
    BETWEEN DECODE (pet.attribute1, 'OVERTIMEVACCOMP',
                    fnd_date.canonical_to_date(prv2.result_value),
                    TO_DATE ('01/01/0001', 'DD/MM/YYYY'))
        AND DECODE (pet.attribute1, 'OVERTIMEVACCOMP',
                    fnd_date.canonical_to_date(prv3.result_value),
                    TO_DATE ('01/01/0001', 'DD/MM/YYYY'))
Most Entity-Attribute-Value models have a fatal flaw: stringly-typed data.
If all values are stored as strings it's critical that attribute filtering occurs before those values are converted into a type. But Oracle's query optimizations make it almost impossible to enforce a specific order of operations in SQL.
This question has a simple example of how bizarre this out-of-order execution can get. It's a bit extreme, but it will hopefully show how unpredictable the order of operations can be. You wouldn't think this query could fail, but it does:
WITH data AS (SELECT 1 AS cond, 10 AS num, 0 AS div FROM DUAL)
SELECT
CASE WHEN cond = 2 THEN (CASE WHEN MAX(div) = 0 THEN 0 ELSE SUM(num / div) END)
ELSE -1
END AS result
FROM data
GROUP BY cond;
ORA-01476: divisor is equal to zero
We don't know exactly how Oracle implements the order. Maybe it's different between RHEL and HPUX, maybe it's different on Thursdays. Unfortunately, even using a LEAST may not be bullet-proof. That function may logically operate in order, and it may normally use short-circuit evaluation, but it's not guaranteed to always run in that order. You may have just switched from one 99.9% solution to another 99.9% solution.
There are only two fool-proof solutions to this, discussed in more detail in my answer here. Either change the table to use a different column for different types, or add an inline view with a ROWNUM to every query (sketched below). Neither of them is pleasant.
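To illustrate the second option, here is a minimal sketch of the ROWNUM trick; the table and column names are made up, since the real ones aren't in the question. Referencing ROWNUM stops Oracle from merging the inline view, so the attribute filter is applied before the date conversion ever sees the other rows:

SELECT *
FROM (
    SELECT eav.attribute1, eav.result_value, ROWNUM AS rn    -- ROWNUM blocks view merging
    FROM element_values eav
    WHERE eav.attribute1 = 'OVERTIMEVACCOMP'                 -- attribute filter runs first
) filtered
WHERE fnd_date.canonical_to_date(filtered.result_value) <= v_effective_date;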

What datatype to use for daytime (hour, minute) in Oracle? [duplicate]

In one field I need to store not a date-time pair, i.e. a standard Oracle date:
01/10/2009 22:10:39
but the time only:
22:10:39
I think that would save disk space (I have 2 million rows) or provide faster processing.
You could try the INTERVAL DAY TO SECOND data type, but it won't save you any disk space ... it is very suitable for this purpose though.
create table t1 (time_of_day interval day (0) to second(0));
insert into t1 values (TO_DSINTERVAL('0 23:59:59'));
select date '2009-05-13'+time_of_day
from t1;
It takes 11 bytes though.
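For display, a small example (building on the t1 table above) of formatting the stored interval back into a time string; the anchor date is arbitrary:

SELECT TO_CHAR(DATE '2009-05-13' + time_of_day, 'HH24:MI:SS') AS time_of_day
FROM t1;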
Your best bet would probably be storing "seconds since midnight" as a number field.
SELECT to_char( SYSDATE, 'SSSSS' ) FROM dual;
You can extract the time from a date as a string like this:
to_char(sysdate,'HH.MI.SS')
but there is no time-only data type that will help you save space.
You can use:
TO_CHAR(<DATE_COLUMN>, '<TIME_FORMAT>');
For example:
TO_CHAR(SYSDATE, 'HH24:MI:SS');
For the available time format elements you can check here.
You would save a few MB of disk space (which is nothing nowadays) and you would gain next to nothing in performance.
You could use a column of NUMBER type for storing the number of seconds since midnight as suggested, just don't forget about the constraints.
(You'd probably use NUMBER(5, 0) which uses 1-3 bytes depending on the stored value, instead of a constant 7 bytes used by a DATE column)
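A minimal sketch of that approach, with a range constraint as suggested; the table and column names are only illustrative:

CREATE TABLE event_times (
    time_of_day NUMBER(5,0) NOT NULL
        CONSTRAINT time_of_day_range CHECK (time_of_day BETWEEN 0 AND 86399)
);

-- Store the current time as seconds since midnight.
INSERT INTO event_times VALUES (TO_NUMBER(TO_CHAR(SYSDATE, 'SSSSS')));

-- Convert back to HH24:MI:SS for display by adding the seconds to an arbitrary day.
SELECT TO_CHAR(TRUNC(SYSDATE) + time_of_day / 86400, 'HH24:MI:SS') AS time_of_day
FROM event_times;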

Real Time issues: Oracle Performance tuning (types / indexes / plsql / queries)

I am looking for a real time solution...
Below are my DB columns. I am using Oracle 10g. Please help me define table types / indexes and tuned PL/SQL / queries (both) for the updates and inserts.
The insert and update queries are simple, but here we need to take care of performance because my system will execute them about 200 times per second.
Let me know... should I use procedures or simple queries? I'd like tuned PL/SQL and queries with proper DB table types / indexes.
I would really like to see the performance of my system under a continuous 200 updates per second.
DB table (columns) (I can change the structure if required, so please let me know...)
Play ID - ID
Type - Song or Message
Count - Summation of total play
Retries - Summation of total play, if failed.
Duration - Total Duration
Last Updated - Last Updated Date Time
Thanks in advance ... let me know in case of any confusion...
You've not really given a lot of detail about WHAT you are updating etc.
As a basis for writing your update statements: don't use PL/SQL unless you cannot achieve what you want to do in SQL, as the context switching alone will hurt your performance before you even get round to processing any records.
If you are able to create indexes specifically for the update then index the columns that will appear in your update statement's WHERE clause so the records can be found quickly before being updated.
As for inserting, look up the benefits of the /*+ append */ hint for inserting records to see if it will benefit your particular case.
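For reference, the hint goes immediately after the INSERT keyword; a purely illustrative example with a made-up archive table (the hint mainly pays off for bulk INSERT ... SELECT, so test it against your actual workload):

INSERT /*+ APPEND */ INTO play_history_archive      -- hypothetical target table
SELECT * FROM play_history WHERE last_updated < SYSDATE - 30;
COMMIT;   -- a direct-path insert requires a commit before the table can be queried again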
Finally, the table structure you will use will depend on many factors that you haven't even begun to touch on with the details you've supplied. I suggest you either do some research on DB structure or ask your DBAs for a 101 class in it.
Best of luck...
EDIT:
In response to:
Play ID - ID (here the id would be a song name like abc.wav, so maybe VARCHAR2, not yet decided... what's your opinion... is it fine if the primary key is of type VARCHAR2? Any suggestions are most welcome...)
Type - Song or Message (VARCHAR2)
Count - Summation of total play (Integer)
Retries - Summation of total play, if failed (Integer)
Duration - Total Duration (Integer)
Last Updated - Last Updated Date Time (DateTime)
There is nothing wrong with having a PRIMARY KEY of a VARCHAR2 data type (though there is often debate about the value of having a non-specific PK, i.e. a sequence). You must, however, ensure your PK is unique; if you can't guarantee this then it would be worth having a sequence as your PK rather than having to introduce another column to maintain uniqueness.
As for declaring your table columns as INTEGER, they will eventually be resolved to NUMBER anyway, so I'd just create the table columns as NUMBER (unless you have a very specific reason for creating them as INTEGER).
Finally, the DATETIME column: you only need to declare it as a DATE datatype unless you need real precision in the time portion, in which case declare it as a TIMESTAMP datatype.
As for helping you with the structure of the table itself (i.e. which columns you want etc.) then that is not something I can help you with as I know nothing of your reporting requirements, application requirements or audit requirements, company best practice, naming conventions etc. I'm afraid that is something for you to decide for yourself.
For performance though, keep indexes to a minimum (i.e. only index columns that will aid your UPDATE's WHERE clause search), only update the minimum data possible and, as suggested before, research the APPEND hint for inserts; it may help in your case but you will have to test it for yourself.
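Since the workload sounds like incrementing per-song counters, a single-statement MERGE (upsert) is also worth benchmarking against separate INSERT/UPDATE round trips. A minimal sketch with made-up table, column and bind names:

MERGE INTO play_stats ps
USING (SELECT :play_id AS play_id, :play_type AS play_type,
              :duration AS duration, :failed AS failed
       FROM dual) src
   ON (ps.play_id = src.play_id)
WHEN MATCHED THEN UPDATE SET
    ps.play_count   = ps.play_count + 1,
    ps.retries      = ps.retries + src.failed,
    ps.duration     = ps.duration + src.duration,
    ps.last_updated = SYSDATE
WHEN NOT MATCHED THEN INSERT
    (play_id, play_type, play_count, retries, duration, last_updated)
    VALUES (src.play_id, src.play_type, 1, src.failed, src.duration, SYSDATE);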

Oracle - selected records between two dates (inclusive) when converting from date string

I have the following Oracle query
SELECT *
FROM table
WHERE date_opened
BETWEEN ((TO_DATE('2011-08-01', 'yyyy-mm-dd') - to_date('01-JAN-1970','DD-MON-YYYY')) * (86400))
AND ((TO_DATE('2011-08-31', 'yyyy-mm-dd') - to_date('01-JAN-1970','DD-MON-YYYY')) * (86400))
that nearly works, but it doesn't include the records that are dated 2011-08-31. Any ideas? It probably has something to do with how I am converting my date strings...
UPDATE: I really should have said that the date is actually a UNIX timestamp. That is why I am using the 86400 and 01-JAN-1970
Thank you :)
If the upper bound of an interval is not included in your results, then it's likely that you're building an "exclusive" filter with respect to the upper bound. So just add one day to the upper bound. I.e.
AND ((TO_DATE(...) - to_date(...) + 1) * (86400)) - 1
In Oracle, +1 will add one day when used in date time arithmetic.
Note: BETWEEN ... AND creates an inclusive filter, as Ollie stated, but your arithmetic may change that behaviour by transforming things to seconds.
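Written out in full, the same idea as a half-open range (strictly less than midnight of the following day); the table name here is a stand-in, since the question's was anonymized:

SELECT *
FROM my_table
WHERE date_opened >= (TO_DATE('2011-08-01', 'yyyy-mm-dd') - TO_DATE('01-JAN-1970', 'DD-MON-YYYY')) * 86400
  AND date_opened <  (TO_DATE('2011-09-01', 'yyyy-mm-dd') - TO_DATE('01-JAN-1970', 'DD-MON-YYYY')) * 86400;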
You aren't including anything that happened after midnight on the last day.
try:
AND ((TO_DATE('2011-09-01', 'yyyy-mm-dd') - to_date('01-JAN-1970','DD-MON-YYYY')) * (86400) - 1)
You only want to compare the date without the time component implicit in Oracle dates.
WHERE trunc(date_opened) BETWEEN . . .

How to efficiently convert text to number in Oracle PL/SQL with non-default NLS_NUMERIC_CHARACTERS?

I'm trying to find an efficient, generic way to convert from string to number in PL/SQL, where the local setting for NLS_NUMERIC_CHARACTERS is unpredictable -- and preferably I won't touch it. The input format is the programming standard "123.456789", but with an unknown number of digits on each side of the decimal point.
select to_number('123.456789') from dual;
-- only works if nls_numeric_characters is '.,'
select to_number('123.456789', '99999.9999999999') from dual;
-- only works if the number of digits in the format is large enough
-- but I don't want to guess...
to_number accepts a 3rd parameter, but in that case you have to specify the second parameter too, and there is no format spec for "default"...
select to_number('123.456789', null, 'nls_numeric_characters=''.,''') from dual;
-- returns null
select to_number('123.456789', '99999D9999999999', 'nls_numeric_characters=''.,''') from dual;
-- "works" with the same caveat as (2), so it's rather pointless...
There is another way using PL/SQL:
CREATE OR REPLACE FUNCTION STRING2NUMBER (p_string varchar2) RETURN NUMBER
IS
    v_decimal char;
BEGIN
    SELECT substr(VALUE, 1, 1)
      INTO v_decimal
      FROM NLS_SESSION_PARAMETERS
     WHERE PARAMETER = 'NLS_NUMERIC_CHARACTERS';

    return to_number(replace(p_string, '.', v_decimal));
END;
/
select string2number('123.456789') from dual;
which does exactly what I want, but it doesn't seem efficient if you do it many, many times in a query. You cannot cache the value of v_decimal (fetch once and store in a package variable) because the function doesn't know whether you've changed the session value of NLS_NUMERIC_CHARACTERS, and then it would break again.
Am I overlooking something? Or am I worrying too much, and Oracle does this a lot more efficient then I'd give it credit for?
The following should work:
SELECT to_number(:x,
translate(:x, '012345678-+', '999999999SS'),
'nls_numeric_characters=''.,''')
FROM dual;
It builds the correct second argument (e.g. 999.999999 for the question's '123.456789') with an efficient translate, so you don't have to know how many digits there are beforehand. It works with all supported Oracle number formats (up to 62 significant digits, apparently, in 10.2.0.3).
Interestingly, if you have a really big string the simple to_number(:x) will work whereas this method will fail.
Edit: support for negative numbers thanks to sOliver.
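To see what translate actually produces as the format model (purely illustrative):

SELECT TRANSLATE('-123.456789', '012345678-+', '999999999SS') AS fmt
FROM dual;
-- fmt = 'S999.999999', which to_number then accepts as its format argument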
If you are doing a lot of work per session, an option may be to use
ALTER SESSION SET NLS_NUMERIC_CHARACTERS = '.,'
at the beginning of your task.
Of course, if lots of other code is executed in the same session, you may get funky results :-)
However we are able to use this method in our data load procedures, since we have dedicated programs with their own connection pools for loading the data.
Sorry, I noticed later that your question was for the other way round. Nevertheless it's noteworthy that for the opposite direction there is an easy solution:
A bit late, but today I noticed the special format masks 'TM9' and 'TME', described as "the text minimum number format model returns (in decimal output) the smallest number of characters possible" at https://docs.oracle.com/cloud/latest/db112/SQLRF/sql_elements004.htm#SQLRF00210.
It seems as if TM9 was invented just to solve this particular problem:
select to_char(1234.5678, 'TM9', 'NLS_NUMERIC_CHARACTERS=''.,''') from dual;
The result is '1234.5678' with no leading or trailing blanks, and a decimal POINT despite my environment containing NLS_LANG=GERMAN_GERMANY.WE8MSWIN1252, which would normally cause a decimal COMMA.
select to_number(replace(:X,'.',to_char(0,'fmd'))) from dual;
btw
select to_number(replace('1.2345e-6','.',to_char(0,'fmd'))) from dual;
and if you want to be more strict
select to_number(translate(:X,to_char(0,'fmd')||'.','.'||to_char(0,'fmd'))) from dual;
Is it realistic that the number of digits is unlimited?
If we assume it is, then isn't that a good reason to look into the requirements more carefully?
If we do have that fantastic situation where the initial string is super long, then the following does the trick:
select to_number(
           '11111111.2222',
           'FM' || lpad('9', 32, '9') || 'D' || lpad('9', 30, '9'),
           'NLS_NUMERIC_CHARACTERS=''.,'''
       )
  from dual;
