How to calculate working time in Oracle PL/SQL

How do I calculate working time? E.g.:
7.5 = 7 h 30 min (working hours)
0.75 = 45 min (break)
8 = 8 h (planned hours)
How do I get a result such as -15 min? The query below returns 00:15; is it possible to get a negative value, or is there a better approach?
SELECT to_char(time '0:0:0' + numtodsinterval((7.5 + 0.75 - 8), 'hour'), 'hh24:mi')
FROM dual;

You have the arithmetic backwards: to get a negative number you want 8 - (7.5 + 0.75).
Don't use a time; just use the interval (and extract the sign, hour and minute components using string functions if you want a different format):
SELECT numtodsinterval(8 - (7.5 + 0.75), 'hour') AS interval,
       REGEXP_REPLACE(
         numtodsinterval(8 - (7.5 + 0.75), 'hour'),
         '([+-]?)(\d+) (\d+):(\d+):(\d+\.?\d*)',
         '\1\3:\4'
       ) AS hhmm
FROM   DUAL;
Outputs:

INTERVAL                       HHMM
-----------------------------  ------
-000000000 00:15:00.000000000  -00:15
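For illustration outside the database, here is a minimal Python sketch of the same arithmetic and sign-preserving HH:MM formatting (the variable names are my own, not from the original query):

worked = 7.5    # 7 h 30 min of working hours
pause = 0.75    # 45 min break
planned = 8.0   # 8 h of planned hours

# Same arithmetic as the answer: planned - (worked + pause) = -0.25 h.
diff_minutes = round((planned - (worked + pause)) * 60)  # -15

# Format as a signed HH:MM string, mirroring the REGEXP_REPLACE output.
sign = '-' if diff_minutes < 0 else ''
hours, minutes = divmod(abs(diff_minutes), 60)
print(f'{sign}{hours:02d}:{minutes:02d}')  # -> -00:15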

Related

What is the syntax in Oracle to round any number to the greatest/highest place value of that number?

I have a wide variety of numbers, in the ten thousands, thousands, hundreds, etc. I would like to round each to its highest place value, e.g.:
Starting #: 2555.5
Correctly rounded: 3000
More examples (in the same report):
Given: 255
Rounded: 300
Given: 25555
Rounded: 30000
Given: 2444
Rounded: 2000
But with the ROUND() or CEIL() functions I get the following:
Given: 2555.5
Did not want: 2556
Any ideas? Thank you in advance.
You can combine numeric functions like this:
SELECT
  col,
  ROUND(col / POWER(10, TRUNC(LOG(10, col)))) * POWER(10, TRUNC(LOG(10, col))) AS rounded
FROM Data
See fiddle
Explanation:
LOG(10, number) gets the power to which you must raise 10 in order to get the number. E.g., LOG(10, 255) = 2.40654 and 10^2.40654 = 255.
TRUNC(LOG(10, col)) is the number of digits minus one, i.e. the count of digits after the leading digit (2 for 255).
POWER(10, TRUNC(LOG(10, col))) converts, e.g., 255 to 100.
Then we divide the number by this power of ten. E.g., for 255 we get 255 / 100 = 2.55.
Then we round: ROUND(2.55) = 3.
Finally we multiply this rounded result by the previous divisor: 3 * 100 = 300.
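As a cross-check outside the database, here is a minimal Python sketch of the same steps (the function name is mine, and math.log10 plays the role of Oracle's LOG(10, ...)):

import math

def round_to_highest_place(x):
    # Power of ten of the leading digit, e.g. 2555.5 -> 10^3 = 1000.
    place = 10 ** math.trunc(math.log10(x))
    # Divide, round to a single leading digit, multiply back.
    # Note: Python's round() rounds exact halves to even, unlike Oracle's ROUND.
    return round(x / place) * place

for n in (2555.5, 255, 25555, 2444):
    print(n, round_to_highest_place(n))  # -> 3000, 300, 30000, 2000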
By using the Oracle ROUND function with a negative second parameter (the number of digits), we can simplify the select command (see fiddle):
SELECT
  col,
  ROUND(col, -TRUNC(LOG(10, col))) AS rounded
FROM Data
You can also use this to round to other fractions, such as quarters of the main place value:
ROUND(4 * col, -TRUNC(LOG(10, col))) / 4 AS quarters
see fiddle
Similar to what Olivier built, you can use a combination of functions to round the numbers as you need. I built a similar method, except instead of using LOG, I used LENGTH to get the number of non-decimal digits.
WITH
  nums (num) AS
    (SELECT 2555.5 FROM DUAL
     UNION ALL
     SELECT 255 FROM DUAL
     UNION ALL
     SELECT 25555 FROM DUAL
     UNION ALL
     SELECT 2444 FROM DUAL)
SELECT num,
       ROUND(num, (LENGTH(TRUNC(num)) - 1) * -1) AS rounded
FROM nums;
NUM       ROUNDED
_________ __________
2555.5    3000
255       300
25555     30000
2444      2000

Avg of avgs in variable time windows

Context:
Each activity has a grade.
Activities belong to a subject, and the subject_avg is simply the average of its activities' grades in a given time range.
The global_avg is the average of many subject_avgs (i.e., not to be confused with the average of all activity grades).
Problem:
"Efficiently" calculate global_avg in variable time windows.
"Efficiently" calculating subject_avg for a single subject works by accumulating the count and grade sum of its activities:
        date   grade
act1    day 1  0.5
act2    day 3  1
act3    day 3  0.8
act4    day 6  0.6
act5    day 6  0

date    avg_sum  activity_count
day 1   0.5      1
day 3   2.3      3
day 6   2.9      5
I called it "efficient" because if I need subject_avg between any 2 dates, I can obtain it with simple arithmetic over the second table:
subject_avg (day 2 to 5) = (2.3 - 0.5) / (3 - 1) = 0.9
Calculating global_avg:

subjectA
date    avg_sum  activity_count
day 1   0.5      1
day 3   2.3      3
day 6   2.9      5

subjectB
date    avg_sum  activity_count
day 4   0.8      1
day 6   1.8      2
global_avg (day 2 to 5) = (subjectA_avg + subjectB_avg) / 2 = (0.9 + 0.8) / 2 = 0.85
I have hundreds of subjects, so I need to know: is there any way I could pre-process the subject_avgs so that I don't need to individually calculate each subject's average in the given time window before calculating global_avg?
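For concreteness, here is a minimal Python sketch of the running-total tables in the question and the range arithmetic built on them (the names and dictionary layout are my own illustration):

# Cumulative (grade_sum, activity_count) checkpoints per subject,
# keyed by day, as in the tables above.
subject_a = {1: (0.5, 1), 3: (2.3, 3), 6: (2.9, 5)}
subject_b = {4: (0.8, 1), 6: (1.8, 2)}

def range_avg(checkpoints, start_day, end_day):
    # Latest checkpoint at or before a given day, defaulting to (0, 0).
    def at(day):
        keys = [d for d in checkpoints if d <= day]
        return checkpoints[max(keys)] if keys else (0.0, 0)
    (s1, c1), (s0, c0) = at(end_day), at(start_day - 1)
    if c1 == c0:
        return None  # no activities in the window
    return (s1 - s0) / (c1 - c0)

# subject_avg for days 2..5, then global_avg as the average of averages.
avgs = [range_avg(s, 2, 5) for s in (subject_a, subject_b)]
avgs = [a for a in avgs if a is not None]
print(sum(avgs) / len(avgs))  # -> (0.9 + 0.8) / 2 = 0.85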

How do I get the counter clockwise value using the modulo operator?

Let's say we have a 24-hour clock where all times are represented in minutes. That gives us 24 * 60 possible time points, from 0 hours to 24 hours. The clockwise distance between two time points T1 and T2 is simply |T1 - T2|, since the time is represented in minutes.
Now, how do I obtain the counterclockwise distance between T1 and T2? Would I do something like
(-|T1 - T2|) % 1440?
Have you considered:
24 * 60 - |T1 - T2|
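A quick Python sketch of both directions (my own illustration; it uses signed modular arithmetic, which handles the wrap-around at midnight, rather than |T1 - T2|):

MINUTES_PER_DAY = 24 * 60  # 1440

def clockwise(t1, t2):
    # Distance going forward around the clock from t1 to t2.
    return (t2 - t1) % MINUTES_PER_DAY

def counterclockwise(t1, t2):
    # Going the other way around is the complement of the clockwise distance.
    return (MINUTES_PER_DAY - clockwise(t1, t2)) % MINUTES_PER_DAY

t1, t2 = 23 * 60, 1 * 60  # 23:00 and 01:00
print(clockwise(t1, t2))         # -> 120
print(counterclockwise(t1, t2))  # -> 1320 (= 1440 - 120)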

Calculate days between two dates [always keeping months of max 30 days]

Assuming each month always has 30 days, I'd like to calculate the days between two given dates.
FROM 05/04/2020
TO 20/12/2020
result: 256 days (NOT the 259 days we would get with real month lengths)
With simple mathematical subtraction between dates I get the wrong result:
(Date.new(2019,12,20) - Date.new(2019,4,5)).floor
=> 259
To overcome this I had to create a pretty complex algorithm:
days += inclusive_days_in_range(
  position_data[:workFrom],
  position_data[:workFrom].at_end_of_month
)
months = inclusive_months_in_range(
  position_data[:workFrom].at_beginning_of_month.next_month,
  position_data[:workTo].at_end_of_month.prev_month
)
days += months * MAX_DAYS_IN_MONTHS
days += inclusive_days_in_range(
  position_data[:workTo].at_beginning_of_month,
  position_data[:workTo]
)
Is there a simple way?
Similar to #CarySwoveland's answer, but using a dot product:
require 'matrix'

# Map "dd/mm/yyyy" to a day count, treating every month as 30 days
# (and hence every year as 360 days).
def ndays str
  Vector[*str.split('/').map(&:to_i)].dot [1, 30, 360]
end
> ndays('20/12/2020') - ndays('05/04/2020') + 1
=> 256
Add +1 since it seems like you want the number of days, inclusive.
Another approach would be to count the number of months, multiply by 30, subtract the days into the month of the FROM date, and add the days of the TO date.
Counting months has already been answered on Stack Overflow here: Find number of months between two Dates in Ruby on Rails
so I'll use that as a reference to get the months. Then it's just a matter of addition and subtraction:
from_date = Date.new(2019,4,5)
to_date = Date.new(2019,12,20)
num_months = (12*(to_date.year-from_date.year))+(to_date.month-from_date.month)
# We add 1 to make it inclusive, otherwise you get 255
num_days = (num_months*30) + to_date.day - from_date.day + 1
# Map "dd/mm/yyyy" to a day count from "day zero", with 30-day months.
def days_from_zero(date_str)
  d, m, y = date_str.split('/').map(&:to_i)
  d + 30*(m + 12*y)
end

days_from_zero("05/04/2020") - days_from_zero("4/04/2020")  #=> 1
days_from_zero("20/12/2020") - days_from_zero("05/04/2020") #=> 255
days_from_zero("05/04/2020") - days_from_zero("20/12/2020") #=> -255
days_from_zero("05/04/2020") - days_from_zero("3/6/20")     #=> 719942

Basic Velocity Algorithm?

Given the following dataset for articles on my site:
Article 1
2/1/2010 100
2/2/2010 80
2/3/2010 60
Article 2
2/1/2010 20000
2/2/2010 25000
2/3/2010 23000
where column 1 is the date and column 2 is the number of pageviews for an article. What is a basic velocity calculation that can be done to determine whether an article is trending upwards or downwards over the most recent 3 days?
Caveats: each article only knows its own pageview totals, not the site-wide total. Ideally the result is a number between 0 and 1. Any pointers to what this class of algorithms is called?
Thanks!
Update: your data actually already is a list of velocities (pageviews/day). The following answer simply shows how to find the average velocity over the past three days. See my other answer for how to calculate pageview acceleration, which is the real statistic you are probably looking for.
Velocity is simply the change in a value (delta pageviews) over time:
For article 1 on 2/3/2010:
delta pageviews = 100 + 80 + 60
= 240 pageviews
delta time = 3 days
pageview velocity (over last three days) = [delta pageviews] / [delta time]
= 240 / 3
= 80 pageviews/day
For article 2 on 2/3/2010:
delta pageviews = 20000 + 25000 + 23000
= 68000 pageviews
delta time = 3 days
pageview velocity (over last three days) = [delta pageviews] / [delta time]
= 68,000 / 3
= 22,666 + 2/3 pageviews/day
Now that we know the maximum velocity, we can scale all the velocities to get relative velocities between 0 and 1 (or between 0% and 100%):
relative pageview velocity of article 1 = velocity / MAX_VELOCITY
= 80 / (22,666 + 2/3)
~ 0.0035294
~ 0.35294%
relative pageview velocity of article 2 = velocity / MAX_VELOCITY
= (22,666 + 2/3) / (22,666 + 2/3)
= 1
= 100%
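A minimal Python sketch of this average-velocity-and-scaling calculation (my own illustration of the arithmetic above):

# Daily pageviews for the most recent 3 days; each value is
# already a velocity in pageviews/day.
articles = {
    'article 1': [100, 80, 60],
    'article 2': [20000, 25000, 23000],
}

# Average velocity over the window: total pageviews / number of days.
velocity = {name: sum(days) / len(days) for name, days in articles.items()}

# Scale by the maximum so every article lands between 0 and 1.
max_velocity = max(velocity.values())
relative = {name: v / max_velocity for name, v in velocity.items()}

print(velocity)  # -> {'article 1': 80.0, 'article 2': 22666.66...}
print(relative)  # -> {'article 1': 0.0035..., 'article 2': 1.0}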
"Pageview trend" likely refers to pageview acceleration, not velocity. Your dataset actually already is a list of velocities (pageviews/day). Pageviews are non-decreasing values, so pageview velocity can never be negative. The following describes how to calculate pageview acceleration, which may be negative.
PV_acceleration(t1,t2) = (PV_velocity{t2} - PV_velocity{t1}) / (t2 - t1)
("PV" == "Pageview")
Explanation:
Acceleration is simply change in velocity divided by change in time. Since your dataset is a list of page view velocities, you can plug them directly into the formula:
PV_acceleration("2/1/2010", "2/3/2010") = (60 - 100) / ("2/3/2010" - "2/1/2010")
= -40 / 2
= -20 pageviews per day per day
Note the data for "2/2/2010" was not used. An alternate method is to calculate three PV_accelerations (each over a date range going back only a single day) and average them. There is not enough data in your example to do this for three days, but here is how to do it for the last two:
PV_acceleration("2/2/2010", "2/3/2010") = (60 - 80) / ("2/3/2010" - "2/2/2010")
= -20 / 1
= -20 pageviews per day per day
PV_acceleration("2/1/2010", "2/2/2010") = (80 - 100) / ("2/2/2010" - "2/1/2010")
= -20 / 1
= -20 pageviews per day per day
PV_acceleration_average = (-20 + -20) / 2
= -20 pageviews per day per day
This alternate method did not make a difference for the article 1 data because the page view acceleration did not change between the two days, but it will make a difference for article 2.
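The same acceleration calculation as a short Python sketch (again my own illustration, using a list of daily velocities rather than date strings):

# Daily pageview velocities for article 1, oldest first (2/1 .. 2/3).
velocities = [100, 80, 60]

# One-day accelerations: change in velocity over one day.
accelerations = [v2 - v1 for v1, v2 in zip(velocities, velocities[1:])]
print(accelerations)  # -> [-20, -20]

# Averaging the one-day accelerations, as in the alternate method.
print(sum(accelerations) / len(accelerations))  # -> -20.0 pv/day/day

# Endpoint-only method: (last - first) / elapsed days.
print((velocities[-1] - velocities[0]) / (len(velocities) - 1))  # -> -20.0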
Just a link to an article about the 'trending' algorithms that Reddit, StumbleUpon and Hacker News use, among others:
http://www.seomoz.org/blog/reddit-stumbleupon-delicious-and-hacker-news-algorithms-exposed
