Purpose of nanoseconds in database programming [closed] - time

Just saw this question: Nanoseconds lost coming from MongoDB ISODate Object
I'm asking because a 3 GHz processor can execute at most about 9 instructions per nanosecond (certainly not enough to store anything into MongoDB), and memory is roughly ten times slower anyway. I'm also not sure how precise network time synchronization is.
I'm wondering what purpose a nanosecond timestamp value serves, and how it is typically used, in any (standard) programming language or system.

info ls mentions a --time-style=full-iso option, with this description:
List timestamps in full using ISO 8601 date, time, and time
zone format with nanosecond precision, e.g., `2002-03-30
23:45:56.477817180 -0700'. This style is equivalent to
`+%Y-%m-%d %H:%M:%S.%N %z'.
This is useful because the time output includes all the
information that is available from the operating system. For
example, this can help explain `make''s behavior, since GNU
`make' uses the full timestamp to determine whether a file is
out of date.
(Your mileage may vary.)
Edited to add: Also, RFC 4122, which defines UUIDs, gives several algorithms for generating them, including a few "time-based" algorithms. These all require 100ns resolution. Since it would not make sense to offer a "tenths-of-microseconds" field, this requires a "nanoseconds" field, even if meaningful 1ns resolution is not offered.
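As a concrete illustration of where that 100 ns granularity shows up on a mainstream platform: .NET's DateTime.Ticks counts 100-nanosecond intervals, so the sub-second part of a timestamp is always a multiple of 100 ns. A minimal C# sketch (not tied to the linked MongoDB question):

using System;

class TickDemo
{
    static void Main()
    {
        // One .NET "tick" is 100 ns - the same granularity that RFC 4122
        // time-based UUIDs work with.
        DateTime now = DateTime.UtcNow;

        long subSecondTicks = now.Ticks % TimeSpan.TicksPerSecond;
        Console.WriteLine(now.ToString("O"));
        Console.WriteLine($"Sub-second part: {subSecondTicks} ticks = {subSecondTicks * 100} ns");
    }
}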
(And I feel compelled to point out that, in the question you link to, the asker is actually only storing up to milliseconds in MongoDB. The question asks about "nanoseconds" only because that is the field that stores sub-second resolution.)

Related

What Is "Time" When Talking About Big-O Notation [closed]

I am getting ahead on next semester's classes and just had a question about big-O notation. What is the time factor measured in? Is it measured in milliseconds, in nanoseconds, or is it just an arbitrary measure based on the number of inputs, n, used to compare different versions of algorithms?
It kinda sorta depends on how exactly you define the notation (there are many different definitions that ultimately describe the same thing). We defined it on Turing machines, where time is defined as the number of computation steps performed. On real machines it'd be similar - for instance, the number of atomic instructions performed. As some of the comments have pointed out, the unit of time doesn't really matter anyway, because what's measured is the asymptotic performance, that is, how the performance changes with increasing input size.
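To make that concrete, here is a toy C# sketch (array sizes and contents are arbitrary) that treats "time" as a count of comparisons rather than milliseconds or nanoseconds; the count grows in proportion to n, which is exactly the O(n) behaviour of linear search:

using System;

class StepCounting
{
    // "Time" as a step count: how many comparisons a linear search performs,
    // independent of how many nanoseconds each comparison takes on real hardware.
    static long LinearSearchSteps(int[] data, int target)
    {
        long steps = 0;
        foreach (int x in data)
        {
            steps++;
            if (x == target) break;
        }
        return steps;
    }

    static void Main()
    {
        foreach (int n in new[] { 1_000, 10_000, 100_000 })
        {
            int[] data = new int[n];  // all zeros, so a target of -1 is never found
            Console.WriteLine($"n = {n,7}: {LinearSearchSteps(data, -1)} steps");
        }
    }
}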
Note that this isn't really a programming question and probably not a good fit for the site. It's more of a CompSci thing, but I think the Computer Science Stack Exchange site is meant for postgraduates.

Looking for a measure of time estimation in issue tracking software [closed]

We have a standard issue tracking system (home-grown, for internal use only) and plan to add planning capabilities, since every task already has the data needed to make an estimate.
So each task has:
a more or less accurate estimated time
an accurate spent time
a more or less accurate percentage of completeness
an accurate beginning/schedule date
a task owner
We also have a scheduled version, which is a group of tasks.
We don't know how to phrase the question we want to answer:
how many hours must we still spend before the release,
according to the tasks' time data and the version's schedule date?
or:
will we finish the version by the specified schedule date?
PS: It seems that the percentage of completeness is the least accurate figure, so we have decided to drop it...
Estimation is trickier than it looks. For example, when people are asked to give time estimates, they generally systematically underestimate (it's called "optimism bias").
My best suggestion is that you should get a book on the topic and read it. McConnell's Software Estimation: Demystifying the Black Art is a good place to start.
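For what it's worth, the arithmetic behind "how many hours until the release" fits in a few lines. The sketch below is C#; the record fields mirror the list in the question, while the max(estimate - spent, 0) rule and the 14-day schedule date are illustrative assumptions, not a recommended model:

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical task record mirroring the fields listed in the question.
record TaskItem(string Owner, double EstimatedHours, double SpentHours);

class ReleaseForecast
{
    static void Main()
    {
        var tasks = new List<TaskItem>
        {
            new("alice", EstimatedHours: 16, SpentHours: 10),
            new("bob",   EstimatedHours:  8, SpentHours: 12),  // already over its estimate
            new("carol", EstimatedHours: 24, SpentHours:  0),
        };

        // Naive remaining work: estimate minus time already spent, never below zero
        // (an overrun is simply treated as "done").
        double remainingHours = tasks.Sum(t => Math.Max(t.EstimatedHours - t.SpentHours, 0.0));

        // Stand-in for the version's schedule date.
        DateTime scheduleDate = DateTime.UtcNow.AddDays(14);
        double hoursUntilSchedule = (scheduleDate - DateTime.UtcNow).TotalHours;

        // NB: effort hours are not calendar hours; a real model would account
        // for team capacity, working days, and the optimism bias noted above.
        Console.WriteLine($"Remaining work: {remainingHours} h of effort");
        Console.WriteLine(remainingHours <= hoursUntilSchedule
            ? "Looks achievable before the schedule date (by this naive model)."
            : "Looks at risk (by this naive model).");
    }
}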

Google-like Calculator Program for Windows [closed]

I love using Google for quick back-of-the-envelope calculations. For instance, if I want to know the approximate weight of a carbon-12 ion with charge state 4, I search for
12 u -4*electron mass in u
and get the answer 11.9978057 atomic mass units. More complex things, such as the cyclotron frequency of this ion in some magnetic field, are just as easy:
1/(2*pi)*4* (elementary charge)/(12 u - 4*(electron mass)) * 5.1125 Tesla
This returns the correct answer, 26.174171 MHz. The fact that I can enter 12 u - 4*(electron mass) and Google converts the units on the fly is really helpful to me. WolframAlpha can do even more, but Google is a lot quicker and does not ask for a subscription after my nth query.
As an offline solution, I used a Matlab script in which I had most constants defined, but Matlab takes 30 seconds to 1 minute to start up, which is frustrating. Mathematica is not much faster to start up, either. Also, for technical reasons I have to use network licenses, so these programs are no longer offline solutions. I switched to Excel (which loads quite fast), where I have a sheet that uses named ranges. This is semi-convenient, but it just feels wrong.
Is there any lightweight Windows program that provides this functionality offline?
You can use the Units program that was originally developed for UNIX. There is a native Windows port that is based on version 1.87 (2008). The current version of the UNIX tool is 2.01 (2012).
Units was originally designed to do simple unit conversions, but it also supports evaluating mathematical expressions. It requires you to specify the unit of the output and gives you two lines as a result: the first line is the result you want, and the second line is its inverse.
This program has three major shortcomings when compared to the Google math expression evaluation:
You have to know the unit that you want to get in advance. (I don't always know it, and sometimes I just don't care. Often this unit is "1", as for the result of the calculation sin(pi).)
It does not tell you how it interpreted the units that you entered. Google always returns a parsed version of the input string, so that you can see where Google misunderstood you.
It is quite strict when it comes to variable names. Multi-word names are not permitted, so electron mass is called electronmass (m_e also works).
The installer .exe is easy enough to use, but on my Windows XP machine it did not set the PATH environment variable correctly. I set up a simple shortcut on my Desktop that points to C:\Programs\GnuWin32\bin\units.exe.
Overall, Units is a nice and quick calculator that starts up a few thousand times faster than Matlab or Mathematica - but the user interface has some shortcomings.
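(For comparison, the cyclotron-frequency example from the question can also be reproduced offline by hard-coding a few CODATA constants, in the spirit of the Matlab script mentioned there; this C# sketch is only an illustration and prints roughly 26.17 MHz:)

using System;

class Cyclotron
{
    // CODATA constants, hard-coded for offline use.
    const double ElementaryCharge = 1.602176634e-19;   // C
    const double AtomicMassUnitKg = 1.66053906660e-27; // kg
    const double ElectronMassU    = 5.48579909e-4;     // electron mass in u

    static void Main()
    {
        double massU  = 12.0 - 4 * ElectronMassU;   // carbon-12 ion, charge state 4+
        double massKg = massU * AtomicMassUnitKg;
        double b      = 5.1125;                     // magnetic field in tesla

        double freqHz = 1.0 / (2 * Math.PI) * 4 * ElementaryCharge / massKg * b;

        Console.WriteLine($"Ion mass:            {massU:F7} u");          // ~11.9978057 u
        Console.WriteLine($"Cyclotron frequency: {freqHz / 1e6:F6} MHz"); // ~26.174 MHz
    }
}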

Max line count of one file? [closed]

What is the maximum number of lines of code one .cs file should hold? Are there any industry standards or best practices?
I don't know how many, but there is a number that is considered a best practice.
It's all about maintainability, and there's much more to it than the number of lines in a source file. What about the number of methods in a class? The cyclomatic complexity of a method? The number of classes in a project?
It's best to run a code metrics tool, such as NDepend, on your source code to find out more.
As few as possible to do the job required whilst remaining readable.
If you're spilling over into the thousands upon thousands of lines, you might want to ask yourself what exactly this file is doing and how you can split it up to express the activity better.
Day to day, if I find a class that is more than 1,000 lines long, I am always suspicious that either the class is responsible for too much or its responsibility is expressed badly.
However, as with every rule of thumb, each case should be assessed on its own merits.
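As a crude, do-it-yourself complement to a full metrics tool, even a short C# script can flag unusually long files; the 1,000-line threshold below just echoes the rule of thumb above and is not a standard:

using System;
using System.IO;
using System.Linq;

class LongFileReport
{
    static void Main(string[] args)
    {
        string root = args.Length > 0 ? args[0] : ".";

        // Raw line count is only a rough proxy for "this class may do too much".
        var longFiles = Directory.EnumerateFiles(root, "*.cs", SearchOption.AllDirectories)
            .Select(file => (Name: file, Lines: File.ReadLines(file).Count()))
            .Where(f => f.Lines > 1000)              // arbitrary threshold
            .OrderByDescending(f => f.Lines);

        foreach (var f in longFiles)
            Console.WriteLine($"{f.Lines,6}  {f.Name}");
    }
}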

What applications are there that I can pass data to as it's generated and have them analyze some statistics? [closed]

The basic requirement is to pass some command type and execution time (possibly other data as well, but that's the basic data we're concerned with at the moment) from C# code (either managed code, or something that can take data periodically from the command line) and perform some statistical analysis on it: average time for each command type, standard deviation, some charts would be nice, etc.
Something that can do this in real time might be preferable, but I guess it's also acceptable to save the data ourselves and just pass it in to be analyzed.
We could write something up for this ourselves, but it seems like there should already be something out there.
Edit: Basically we're looking for something with a low learning curve that can do what's mentioned above; something that would be faster to learn and use than coding it manually.
I may be off base here, but would custom Windows perfmon objects and counters do this? You could create an object for each command type, with a counter for execution time, then use Perfmon's logging, charting and reporting facilities. Or export the Perfmon data to Excel/Access for fancier stuff.
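If you do end up writing it yourselves after all, the core aggregation is only a few lines of C#; a minimal sketch with made-up sample data (per-command average and standard deviation):

using System;
using System.Collections.Generic;
using System.Linq;

class CommandStats
{
    static void Main()
    {
        // Hypothetical samples: (command type, execution time in ms).
        var samples = new List<(string Command, double Ms)>
        {
            ("load", 120), ("load", 135), ("load", 110),
            ("save",  40), ("save",  55),
        };

        foreach (var group in samples.GroupBy(s => s.Command))
        {
            double[] times = group.Select(s => s.Ms).ToArray();
            double mean = times.Average();
            // Population standard deviation; divide by (n - 1) instead for a sample estimate.
            double stdDev = Math.Sqrt(times.Select(t => (t - mean) * (t - mean)).Average());
            Console.WriteLine($"{group.Key}: avg {mean:F1} ms, stddev {stdDev:F1} ms over {times.Length} runs");
        }
    }
}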
