Facing the time bomb in computers? [closed]

This question is about representing time with an int that stores the number of seconds since January 1, 1970.
When will programs that use this representation face a time bomb, and how should you proceed when that happens?

It is 2015 today. The approximate number of seconds since 1/1/1970 is
(2015 - 1970) * 365.25 * 24 * 60 * 60 = 1,420,092,000
That is the number of seconds in 45 years.
An unsigned 32-bit int can store the value
4,294,967,295
which leaves us with
2,874,875,295 seconds ~ 91 years to go from now, i.e. around 2106.
We still have some time to go.
In case a signed int is used, refer to this link (Thank you PM for the comment).
The signed int can store
2,147,483,647
which leaves us with
727,391,647 seconds ~ 23 years to go from now, i.e. 2038.
And thus the name of this problem: the Year 2038 problem.
That is, it can arise before our retirement.
Whatever your specific concern, please refer to this link on SO.
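To pin down the exact rollover dates rather than year-level estimates, here is a minimal sketch in Python; it assumes the counter holds seconds since the Unix epoch (1970-01-01 00:00:00 UTC) in a signed or unsigned 32-bit integer.

from datetime import datetime, timezone

# Assumption: the counter holds seconds since 1970-01-01 00:00:00 UTC.
INT32_MAX = 2**31 - 1    # 2,147,483,647 (signed 32-bit)
UINT32_MAX = 2**32 - 1   # 4,294,967,295 (unsigned 32-bit)

# Last representable instant with a signed 32-bit counter.
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))   # 2038-01-19 03:14:07+00:00

# Last representable instant with an unsigned 32-bit counter.
print(datetime.fromtimestamp(UINT32_MAX, tz=timezone.utc))  # 2106-02-07 06:28:15+00:00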

Related

Is there a command to batch change time zones of files? [closed]

I recently went to Japan (UTC+9) from the UK; however, when I returned I forgot to change the time zone on my DSLR camera back to GMT. Since the files are now on my Linux machine, is there a way to batch change the time stamps of the pictures I have taken since I came home, so that they're on GMT rather than 9 hours ahead?
I found this, using exiftool:
exiftool "-AllDates+=1:12:28 14:54:32" -verbose *.jpg
to adjust all JPG image dates by adding 1 year, 12 months, 28 days, 14 hours, 54 minutes, and 32 seconds.
At photo.stackexchange: How to shift EXIF date/time created by time in days, hours, minutes?
So to shift all your photos back by 9 hours, you could run:
exiftool "-AllDates-=09:00:00" /path/to/IMG*.JPG
Then, to set the file system date/time from the EXIF info:
exiftool '-FileModifyDate<DateTimeOriginal' /path/to/IMG*.JPG
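If you would rather script both steps over a whole folder, here is a minimal sketch in Python; the directory path and file pattern are placeholders of my own, and it assumes exiftool is installed and on your PATH.

import glob
import subprocess

# Placeholder path/pattern: adjust to wherever the affected photos live.
photos = glob.glob("/home/user/Pictures/japan/IMG*.JPG")

# Step 1: shift all EXIF dates back by 9 hours (UTC+9 -> GMT).
subprocess.run(["exiftool", "-AllDates-=09:00:00", *photos], check=True)

# Step 2: copy the corrected EXIF DateTimeOriginal onto the file system timestamp.
subprocess.run(["exiftool", "-FileModifyDate<DateTimeOriginal", *photos], check=True)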

How to calculate a 4-week period of the billing year [closed]

I'm refactoring a method that should return 4-week periods. A period is calculated as the number of 4-weekly blocks in the current billing year. There are thirteen periods per billing year. The first billing year began on 24/08/2008.
I have created a formula:
def period
  period = (((self.utc.to_time - Time.new(2008,8,23,0,0,0,0)) / 60 / 60 / 24 / 7 / 52).modulo(1) * 13).ceil
  period == 0 ? 13 : period
end
Over the years, it has become inaccurate, and we're now seeing a shift of a whole period: bookings in Period 1 are showing as Period 2. I tried re-calculating the formula with no success. I feel the formula is too complex anyway, and can probably be achieved more simply.
I would just use Date instead of Time and the calculation would be much easier.
DAYS_PER_PERIOD = 28

def period
  days = Date.today - Date.new(2008, 8, 23)
  periods = days / DAYS_PER_PERIOD
  periods.to_i # would return a `Rational` instead of an `Integer` otherwise
end
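If what you actually need is the period number within the billing year (1 through 13), here is a minimal sketch of the same date arithmetic in Python; it assumes the first billing year starts on 24/08/2008 as stated in the question, and that every billing year is exactly 13 * 28 = 364 days with no realignment.

from datetime import date

DAYS_PER_PERIOD = 28
PERIODS_PER_YEAR = 13                 # one billing year = 13 * 28 = 364 days
BILLING_START = date(2008, 8, 24)     # first day of the first billing year

def billing_period(today=None):
    """Return the 4-week period (1..13) within the current billing year."""
    today = today or date.today()
    days_elapsed = (today - BILLING_START).days
    return days_elapsed // DAYS_PER_PERIOD % PERIODS_PER_YEAR + 1

print(billing_period(date(2008, 8, 24)))  # 1 -> first day of period 1
print(billing_period(date(2008, 9, 21)))  # 2 -> day 28, period 2 begins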

Maintain realtime counter [closed]

The following is an interview question which shows up here and here:
Given a timer time() with nanosecond accuracy and given the interface
interface RealTimeCounter:
    void increment()
    int getCountInLastSecond()
    int getCountInLastMinute()
    int getCountInLastHour()
    int getCountInLastDay()
The getCountInLastX functions should return the number of times increment was called in the last X (second, minute, hour, or day).
Here's one suggested solution (paraphrased from the above blog entry):
Maintain an ArrayList of timestamps. When asked for a given count, say the count for the last second, perform a binary search for (timer.time() - ONE_SECOND_IN_NANOSECONDS) and return list.length() - list_index. Then, as a background process running at regular intervals, trim the data: since we only need to maintain data for the last day, we can delete all entries older than one day.
Please critique this solution or offer a better performing one.
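To make the critique concrete, here is a minimal sketch of that suggested solution in Python; the class and method names mirror the interface above, trim is the background cleanup step, and time.monotonic_ns stands in for the nanosecond timer.

import bisect
import time

NANOS_PER_SECOND = 1_000_000_000

class RealTimeCounter:
    """List-of-timestamps approach: append on increment, binary-search on query."""

    def __init__(self, timer=time.monotonic_ns):
        self._timer = timer       # nanosecond-resolution clock
        self._timestamps = []     # stays sorted because time only moves forward

    def increment(self):
        self._timestamps.append(self._timer())

    def _count_since(self, window_ns):
        cutoff = self._timer() - window_ns
        # Index of the first timestamp inside the window.
        idx = bisect.bisect_right(self._timestamps, cutoff)
        return len(self._timestamps) - idx

    def get_count_in_last_second(self):
        return self._count_since(NANOS_PER_SECOND)

    def get_count_in_last_minute(self):
        return self._count_since(60 * NANOS_PER_SECOND)

    def get_count_in_last_hour(self):
        return self._count_since(3600 * NANOS_PER_SECOND)

    def get_count_in_last_day(self):
        return self._count_since(86400 * NANOS_PER_SECOND)

    def trim(self):
        """Drop entries older than a day; meant to run periodically in the background."""
        cutoff = self._timer() - 86400 * NANOS_PER_SECOND
        del self._timestamps[:bisect.bisect_right(self._timestamps, cutoff)]

In this sketch, increment appends in O(1) and each query is O(log n), but memory still grows with the number of increments in the last day, which is one obvious point of critique.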

same algorithm execution time on different computers? [closed]

If an algorithm with O(n log n) time complexity executes in two seconds on one computer, how long does it take a computer that is 100 times faster to execute the same algorithm? Is it 2/100 seconds? As far as I know, Big O notation is a function of the input size and has nothing to do with the execution time of the same algorithm on different computers. Am I right?
The complexity does not say anything about the running time.
Example: I'm working on a very complex algorithm in a small team, and the exact same algorithm runs in 30 minutes on my computer, in 60 minutes on a team member's computer (with performance similar to mine), and in 15 minutes on a netbook with a very slow processor but a different OS and a special extension. The running time also differs between compilers.
The complexity only gives you a hint of how much more time your algorithm will need as the input grows.
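As a toy illustration of that distinction (my own model, not part of the answer): suppose the running time is c * n * log n with a machine-dependent constant c. A 100 times faster machine shrinks c by a factor of 100, so under this idealized model the same input does finish in about 2/100 seconds, but how the time grows with n is unchanged.

import math

def modelled_time(n, machine_speed, c=1e-8):
    """Toy model: running time = c * n * log2(n) / machine_speed, in seconds."""
    return c * n * math.log2(n) / machine_speed

n = 10_000_000
slow = modelled_time(n, machine_speed=1)
fast = modelled_time(n, machine_speed=100)

print(slow / fast)                       # 100.0 -> same input, 100x faster machine, 1/100 of the time
print(modelled_time(2 * n, 1) / slow)    # ~2.09 -> doubling n roughly doubles the time on either machine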

Acceptable Ratio for Dev hours vs. Debugging hours? [closed]

On a recent project with roughly 6,000 hours of development, a little over 1,000 hours went towards "debugging"/"fixes". Does this sound acceptable, high, or low?
I also understand that this is a rather dynamic question while also requesting a rather simple answer; however, I'm just looking for a rough estimate/average based on past project experiences. :)
Grateful for any and all input!
Pressman (2000) gives 30-40% as the total amount of project time for integration, testing and debugging, so your figure (a little over 1,000 of 6,000 hours, roughly 17%) looks a little low, but it depends on how you calculate it!
