How do I do time/hour arithmetic in a Google spreadsheet?
I have a value that is time (e.g., 36:00:00) and I want to divide it by another time (e.g., 3:00:00) and get 12. If I divide just one by the other, I get 288:00:00 when what I want is 12 (or 12:00:00).
Note that using the HOUR() function doesn't work, because HOUR() of 36:00:00 returns 12 (the hours wrap around past 24).
When the number returned by your formula is formatted as a time and you want it formatted as a plain number, change the format of the cell to a plain number format: click the cell, then click Format, Number, Normal.
Time values in Google spreadsheet are represented as days and parts of days. For example, 36:00:00 is the formatted representation of the number 1.5 (a day and a half).
Suppose you divide 36:00:00 by 3:00:00, as in your example. Google Spreadsheet performs the calculation 1.5 divided by 0.125, which is 12. The result tells you that you have 12 3-hour intervals in a 36-hour time period. 12, of course, is not a time interval. It is a unitless quantity.
Going the other way, it is possible to format any number as a time. If you format 12 as a time, it's reasonable to expect that you will get 288:00:00. 12 days contain 288 hours.
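Outside the spreadsheet, the same serial-number arithmetic looks like this (a minimal Python sketch of the model described above, using the values from the question):

# Sheets stores durations as fractions of a day
thirty_six_hours = 36 / 24   # 1.5 days
three_hours = 3 / 24         # 0.125 days

# Division cancels the units, leaving a plain count of intervals
print(thirty_six_hours / three_hours)  # 12.0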
Google Sheets now has a duration formatting option. Select: Format -> Number -> Duration.
Example of calculating time:
work-start    work-stop    lunchbreak    effective time
07:30:00      17:00:00     1.5           8    [=((B2-A2)*24)-C2]
If you subtract one time value from another, the result represents a fraction of 24 hours, so multiplying the result by 24 gives you the value in hours.
In other words: the operation is multiplication, but the meaning is a change of the number's unit (from days to hours).
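To illustrate the subtract-then-multiply-by-24 step, here is a small Python sketch using the work-start/work-stop values from the table above:

from datetime import timedelta

start = timedelta(hours=7, minutes=30)
stop = timedelta(hours=17)

# The difference as a fraction of a day, i.e. the spreadsheet's raw value
fraction_of_day = (stop - start) / timedelta(days=1)

print(fraction_of_day * 24)  # 9.5 hours (before subtracting the lunch break)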
You can use Google Sheets' TIME(h, m, s) function. If you want to add times to each other (or perform other arithmetic operations), you can specify either a cell or a call to TIME for each input of the formula.
For example:
B3 = 10:45
C3 = 20 (minutes)
D3 = 15 (minutes)
E3 = 8 (hours)
F3 = B3+time(E3,C3+D3,0) equals 19:20
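A quick Python check of the same arithmetic, with timedelta standing in for the cell values:

from datetime import timedelta

b3 = timedelta(hours=10, minutes=45)               # B3 = 10:45
result = b3 + timedelta(hours=8, minutes=20 + 15)  # TIME(E3, C3+D3, 0)

print(result)  # 19:20:00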
I had a similar issue and fixed it like this:
format each of the cells as Time
format the total cell (the sum of all the times) as Duration
I used the TO_PURE_NUMBER() function and it worked.
So much simpler: look at this
B2: 23:00
C2: 1:37
D2: = C2 - B2 + ( B2 > C2 )
Why it works: time is a fraction of a day. The comparison B2>C2
returns TRUE (1) or FALSE (0); if TRUE, one day (24 hours) is added.
http://www.excelforum.com/excel-general/471757-calculating-time-difference-over-midnight.html
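The same trick sketched in Python, with times as fractions of a day (using the B2/C2 values above):

def elapsed(start, end):
    # Times are fractions of a day; add a day when the interval crosses midnight
    return end - start + (start > end)

b2 = 23 / 24               # 23:00
c2 = (1 + 37 / 60) / 24    # 1:37
print(elapsed(b2, c2) * 24)  # ~2.617 hours, i.e. 2:37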
If you have a duration in h:mm, the actual value stored in that cell is the duration converted to a real number of days: the hours divided by 24.
For example, 6:45, or 6 hours 45 minutes, is 6.75 hours, and 6.75 / 24 = 0.28125 (in other words, 6:45 is 28.125% of a day). If you use a column to convert your durations into plain numbers (for example, converting 6:45 into 0.28125), then you can do your multiplication or division and get the correct answer.
In case you want to format it within a formula (for example, if you are concatenating strings and values), the format option mentioned above is not available, but you can use the TEXT formula:
=TEXT(B1-C1,"HH:MM:SS")
So, for the question's example, with concatenation:
="The number of " & TEXT(B1,"HH") & " hour slots in " & TEXT(C1,"HH") & " is " & TEXT(C1/B1,"HH")
In a fresh spreadsheet with 36:00:00 entered in A1 and 3:00:00 entered in B1:
=A1/B1
in, say, C1 returns 12.
Type the values in single cells, because Google Sheets can't handle duration formats at all, in any way, shape, or form. Or you have to learn to write scripts and graduate as a chopper pilot; that is also an option.
Related
I have a CNV data file, where the first column is "minutes from start."
Minutes;Temperature
0;15.5
60;15.8
120;15.6
180;16.1
....
I would like to plot this data with the x-axis as time (DAYS), so that every 1440 minutes is one day, then comes day 2, etc. What is the best way to do this?
Simply divide the minutes by 60 and by 24 (or 1440). Then you will have days.
Note: column values, e.g. $1, are always taken as floats, so you don't have to worry about gnuplot's integer division (which can lead to unexpected results if you don't know about it).
plot "YourFile.dat" u ($1/1440):2 with lines
I have a sheet where I record my working hours (this is more for me to remind me to stop working than anything else). For every day, I have three possible shifts - early, normal & late, and I have a formula which will sum up any times put into these columns and give me the daily total hours.
To summarise the duration of time spent working in a day, I use the following formula: =(C41-B41)+(E41-D41)+12+(G41-F41) which is:
early end time minus early start time
normal end time minus normal start time PLUS 12 hours
late end time minus late start time
Which gives me output like this:
What I cannot seem to achieve is, the ability to sum the daily totals into something which shows me the total hours worked over 1-week. If I attempt to sum the daily totals together for the example image shown, I get some wild figure such as 1487:25:00 when formatting as 'Duration' or 23:25:00 when formatted as 'Time'!
All my cells where I record the hours worked are formatted as 'Time'
When using arithmetic operations on date values in Google Sheets, it's important to remember that the internal representation of a date is numeric: the number of days since a fixed epoch (December 30, 1899, in Google Sheets).
What follows from that, is that if you want to add 12 hours to a time duration, you should not write "+12" because that will in fact add 12 days. Instead add "+12/24". In other words, try the following formula instead of the one you are using now:
=(C41-B41)+(E41-D41)+(12/24+G41-F41)
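To see numerically why the bare +12 adds days, here is a small Python sketch (the 9.5-hour shift is an illustrative value, not the asker's actual data):

# Sheets stores times as fractions of a day
duration = 9.5 / 24          # a 9.5-hour shift

wrong = duration + 12        # adds 12 DAYS
right = duration + 12 / 24   # adds 12 hours

print(wrong * 24)   # 297.5 hours
print(right * 24)   # 21.5 hours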
I'm creating an app to monitor water quality. The temperature data is updated every 2 minutes to a Firebase real-time database. The app has two requirements:
1) It should alert the user when the temperature exceeds 33 degrees or drops below 23 degrees - this part is done.
2) It should alert the user when there is a big temperature fluctuation, after analysing the data every 30 minutes - this part I'm confused about.
I don't know what algorithm to use to detect a big temperature fluctuation over a period of time and alert the user. Can someone help me with this?
For a period of 30 minutes, your app would give you 15 values.
If you want to figure out a big change in this data, then there is one way to do so.
You can implement the following method:
Calculate the mean and the standard deviation of the values.
Subtract the mean from each value and take the absolute value of the result.
Check whether the absolute value is greater than one standard deviation; if it is, you have a big fluctuation.
See this example for better understanding:
Let's suppose you have these values for 10 minutes:
25,27,24,35,28
First Step:
Mean = 27 (approx.)
One standard deviation = 3.8
Second Step: Absolute(Data - Mean)
abs(25-27) = 2
abs(27-27) = 0
abs(24-27) = 3
abs(35-27) = 8
abs(28-27) = 1
Third Step
Check whether any of the subtractions is greater than the standard deviation:
abs(35-27) gives 8, which is greater than 3.8.
So there is a big fluctuation. If all the subtracted results are less than the standard deviation, then there is no fluctuation.
You can still improve the result by selecting two or three standard deviations instead of one.
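Here is a compact Python sketch of the method above; the function name and the one-standard-deviation cutoff are just illustrative:

import statistics

def big_fluctuations(values, k=1.0):
    # Return readings more than k standard deviations away from the mean
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)  # population standard deviation
    return [v for v in values if abs(v - mean) > k * sd]

print(big_fluctuations([25, 27, 24, 35, 28]))  # [35]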
Start by defining what you mean by fluctuation.
You don't say what temperature scale you're using. Fahrenheit, Celsius, Rankine, or Kelvin?
Your sampling rate is a new data value every two minutes. Do you define fluctuation as the absolute value of the difference between the last value and the current one? That's defensible.
If the maximum allowable absolute difference is some multiple of your 33 - 23 = 10 degree range, you're in business.
I'm currently dealing with a system which uses an unknown timestamp mechanism.
The system is running on a Windows machine, so my first thought was that it uses some kind of Windows epoch for its timestamps, but it appears it does not.
My goal is to convert these timestamps to Unix timestamps.
A few examples:
The following timestamp: 2111441659 converts to: 2013-10-01 11:59
2111441998 to 2013-10-01 17:14
2111443876 to 2013-10-02 14:36
2111444089 to 2013-10-02 17:57
(All dates are GMT+2)
I've tried to calculate the reference date using the data above, but somehow I get a different result with every single timestamp.
Could anybody shed some light on this rather odd problem?
To me the number seems too small to be milliseconds. My first guess was then seconds, but looking at the speed at which this number varies, I think minutes is a better guess. Doing some math on it: 2111441659/60/24/365 = 4017.20254756, which suggests the epoch might be sometime in the year -2000?
Here is a list of common epochs in computing but the year -2000 is not really there :) How are you obtaining this timestamp?
P.S. Are you sure the year is set to 2013 on this machine and not to 4013? :) That would then fit with the .NET epoch of January 1, year 1.
In order to distinguish your timestamp from Unix timestamp, let's call yours The Counter.
So we have four counter values with their corresponding DateTime value. The first thing to do is calculate the counter's unit correspondence to a real time unit, let's say a second.
In order to do that, we need (1) the difference d between two counter values and (2) the difference s between their corresponding DateTimes, in seconds.
Considering the first two values we have d1=2111441998-2111441659=339. The difference between 2013-10-01 11:59 and 2013-10-01 17:14 (in seconds) is s1=18900. Consequently, the counter's unit corresponds to u1=s1/d1=55.7522123894 seconds.
But if we do the same with pairs #2 and #3, we will find that u2=40.9584664536 seconds.
Similarly, pairs #3 and #4 give us u3=56.6197183114 seconds.
My conclusion therefore, is that there's no alignment between the counter values and the corresponding DateTimes provided. That's the reason why you get a different result with each sample.
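For reference, the unit calculations above can be reproduced in a few lines of Python (just a verification sketch):

from datetime import datetime

samples = [(2111441659, "2013-10-01 11:59"), (2111441998, "2013-10-01 17:14"),
           (2111443876, "2013-10-02 14:36"), (2111444089, "2013-10-02 17:57")]

for (c1, t1), (c2, t2) in zip(samples, samples[1:]):
    d = c2 - c1  # counter difference
    s = (datetime.fromisoformat(t2) - datetime.fromisoformat(t1)).total_seconds()
    print(f"d={d}, s={s:.0f}, unit={s / d:.4f} s")
# units come out as 55.7522, 40.9585 and 56.6197 seconds - no consistent alignment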
Finally, after many hours of comparing the timestamps with the datetimes, trying to discover the logic between them, I've found the answer by reverse engineering the software which generates the timestamps.
It turns out that the integer timestamps are actually bitwise representations* of the datetimes.
In pseudocode:
year = TimeStamp >> 20;
month = (TimeStamp >> 16) & 15;
day = (TimeStamp >> 11) & 31;
hour = (TimeStamp >> 6) & 31;
minute = TimeStamp & 63;
*I'm not sure if this is the correct term for it, if not, please correct me.
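Here is a runnable Python version of that pseudocode, extended with a conversion to a Unix timestamp; the to_unix helper and the fixed GMT+2 offset are my additions based on the question, so adjust as needed:

from datetime import datetime, timezone, timedelta

def decode(ts):
    # Unpack the bit-packed timestamp into its date/time fields
    year   = ts >> 20
    month  = (ts >> 16) & 0xF
    day    = (ts >> 11) & 0x1F
    hour   = (ts >> 6)  & 0x1F
    minute = ts & 0x3F
    return year, month, day, hour, minute

def to_unix(ts, utc_offset_hours=2):
    # Convert a packed timestamp to a Unix timestamp, assuming a fixed GMT+2 offset
    y, mo, d, h, mi = decode(ts)
    tz = timezone(timedelta(hours=utc_offset_hours))
    return int(datetime(y, mo, d, h, mi, tzinfo=tz).timestamp())

print(decode(2111441659))   # (2013, 10, 1, 11, 59)
print(to_unix(2111441659))  # 1380621540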
I want to count how many times in a specific month (for example, January) a red cell appears, given the conditional-formatting rule: equal to or more than 15 days -> red cell. How can I do that?
My table looks like this
A            B
16.02.2013   15
17.01.2012   20
01.02.2013    4
26.04.2012   10
01.01.2012   21
20.04.2012    7
The answer for January is 2
What do I do if I want to make the count by month and year?
You can use the COUNTIFS function for this.
Assuming C1 holds the start date you want to count from, and C2 holds the end date you want to count to:
=COUNTIFS(A:A;">="&C1;A:A;"<="&C2;B:B;">15")
Not sure if you can tie the condition directly to the formatting rule.
I'm guessing that entries in ColumnB greater than or equal to 15 have been formatted red. (Which could imply a flaw in @Taemyr's answer, where >15 should perhaps be >=15.) Also, the likes of "(for example: January)" implies that a solution that works conveniently for many months would be appreciated, so I suggest a PivotTable.
Due to PivotTable constraints, identifying values greater than a cutoff is easier in the source data. In this case it should only require filtering to select the values that have already been formatted red, and adding a flag, say x, to these.
Then the PivotTable could be filtered for x only, with the ColumnA values (ROWS) grouped by Years and Months, and the VALUES field being Count of B.
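For comparison, here is a small Python sketch of the same month-and-year count, using the >=15 "red" rule discussed above and the sample rows from the question:

from datetime import datetime

rows = [("16.02.2013", 15), ("17.01.2012", 20), ("01.02.2013", 4),
        ("26.04.2012", 10), ("01.01.2012", 21), ("20.04.2012", 7)]

counts = {}
for date_str, value in rows:
    if value >= 15:  # the "red cell" rule
        d = datetime.strptime(date_str, "%d.%m.%Y")
        counts[(d.year, d.month)] = counts.get((d.year, d.month), 0) + 1

print(counts)  # {(2013, 2): 1, (2012, 1): 2}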