What Is "Time" When Talking About bigO Notation [closed] - algorithm

Closed 9 years ago.
I am getting a head start on next semester's classes and just had a question about Big-O notation. What is the time factor measured in? Is it measured in milliseconds or nanoseconds, or is it just an arbitrary measure based on the number of inputs, n, used to compare different versions of algorithms?

It kinda sorta depends on how exactly you define the notation (there are many different definitions that ultimately describe the same thing). We defined it on Turing machines, where time is the number of computation steps performed. On real machines it'd be similar: for instance, the number of atomic instructions performed. As some of the comments have pointed out, the unit of time doesn't really matter anyway, because what's measured is the asymptotic performance, that is, how the performance changes with increasing input size.
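To make that concrete, here is a toy Python sketch (my own illustration, not from the answer): "time" is just a count of abstract steps, and all Big-O cares about is how that count grows with n.

def count_steps(data):
    # Count abstract "steps" instead of wall-clock time: one step per
    # element examined, much like counting the moves of a Turing machine.
    steps = 0
    for _ in data:
        steps += 1
    return steps

# The unit is irrelevant; what matters is that the count grows
# linearly with n, i.e. the loop is O(n).
for n in (100, 1000, 10000):
    print(n, count_steps(range(n)))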
Note that this isn't really a programming question and probably not a good fit for the site. It's more of a CompSci thing, but I think the Computer Science Stack Exchange site is meant for postgraduates.

Related

Performance analysis of Sorting Algorithms [closed]

Closed 5 years ago.
I am trying to compare the performance of a couple of sorting algorithms. Are there any existing benchmarks that could make my task easier? If not, I want to create my own benchmark. How would I achieve this?
A couple of things I need to consider:
Test on different possible input permutations
Test at different input sizes
Keep the hardware configuration consistent across all the algorithms
The major challenge is in implementing the sorting algorithms themselves: if one of my implementations happens to be inefficient, it will produce misleading results. How would I tackle this?
If someone comes up with their own sorting algorithm tomorrow, how would they compare it against the existing ones?
I am flexible about the programming language, but I would really appreciate it if someone could suggest some functions available in Python.
Well, I think you are having trouble understanding what a doubling ratio test is. I know only the basics of Python, so I got this timing code from here:
#!/usr/bin/python
import time

# measure wall time
t0 = time.time()
# Call the main function of your sorting class here; when the sorting
# process ends, the time taken by the algorithm is printed below.
procedure()
print(time.time() - t0, "seconds wall time")
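For the doubling ratio test itself, here is a minimal sketch (my_sort and the input sizes are placeholders of mine, not from the original answer): time the sort at n, 2n, 4n, ... and look at the ratio of successive timings.

import random
import time

def my_sort(a):
    # Placeholder: swap in the sorting implementation you want to test.
    return sorted(a)

def wall_time(n):
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    my_sort(data)
    return time.perf_counter() - t0

# Double n repeatedly and watch the ratio of successive times:
# a ratio near 2 suggests O(n log n), near 4 suggests O(n^2).
prev = wall_time(1 << 13)
for k in range(14, 19):
    t = wall_time(1 << k)
    print("n = %7d  time = %.4fs  ratio = %.2f" % (1 << k, t, t / prev))
    prev = t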

What is the difference between the design of algorithms and the analysis of algorithms? [closed]

Closed 5 years ago.
I am new to algorithms. What is the difference between the design of algorithms and the analysis of algorithms?
The design of an algorithm is the process of inventing the algorithm. You work out what steps to take, the order in which to take them, etc. (Think of it like writing the code for the algorithm). The analysis of an algorithm is where you work out mathematically how efficient it is, prove that it's correct in all cases, etc.
Think of the design as writing the code and the analysis as justifying why that code works and why it's efficient.
Algorithm design is a specific set of instructions for completing a task.
Algorithms have also been called "recipes". Perhaps a more accurate description would be that algorithm designs are patterns for completing a task in an efficient way.
Analysis of algorithms is the determination of the amount of resources (such as time and storage) necessary to execute them, usually described as the running time (time complexity) and storage requirements (space complexity) of an algorithm, stated as a function relating the input length to the number of steps.
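As a small illustration (my own example, not from either answer): the design is the code you write; the analysis is the reasoning attached to it.

# Design: the concrete steps of linear search, in a chosen order.
def linear_search(items, target):
    for i, item in enumerate(items):  # examine elements left to right
        if item == target:
            return i
    return -1

# Analysis: in the worst case (target absent) the loop runs len(items)
# times, so the running time is O(n); the algorithm is correct because
# every index is examined exactly once before failure is reported.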

Examples of integrals that can't be done correctly by Wolfram Alpha [closed]

Closed 2 years ago.
Does anybody know examples of indefinite or definite integrals that can be done in terms of elementary functions by a good first-year or second-year student, but which Wolfram Alpha (or Mathematica) evaluates incorrectly?
In other words, I want to find some problems for a mathematics test where students cannot easily find the answer with Wolfram Alpha and just copy it into their papers.
Thanks in advance.
It is probably impossible. The set of functions known to first- and second-year students is limited, whereas Mathematica uses a symbolic algebra system to transform integrals, backed by a big repository of function properties.
http://functions.wolfram.com/
For example, for hypergeometric functions alone there are 218,254 formulas!
Methods of calculating integrals are explained on wolframalpha.com as step-by-step solutions for Pro users ($4.75/mo billed annually, or $6 billed monthly).
Calculating integrals by computer is nowadays at a level comparable to computer chess. You would have to talk to each student individually.

What would an algorithm to simulate human interaction look like? [closed]

Closed 8 years ago.
Let's suppose that androids which are physically just like humans are a reality.
What would be an algorithm to make it interact with human beings if we want it to:
1) be indistinguishable from regular people in behavior
2) be equally friendly to everyone, as far as possible?
I understand that it is very hard to write an algorithm like that. I can, however, imagine an android simulating human behavior fairly well with some sort of machine learning technique.
But how would we train it? Collecting the data would also be a big problem.
Which machine learning technique would be ideal?
If you consider requirement 1 to be a hard requirement, such an algorithm would beat the Turing Test at least to some extent, so it would be a pretty advanced (world-class) algorithm.
Your problem basically equates to beating the Turing Test, so check the linked article to see the scientific literature produced by people working on this problem.
Assuming massive data availability and essentially unbounded processing power, I believe an artificial neural network would be the best candidate to base such an algorithm on.

Cyclomatic Number [closed]

Closed 6 years ago.
I am trying to understand McCabe's cyclomatic number, and I have learned what it is actually for, i.e. it is used to indicate the complexity of a program: it directly measures the number of linearly independent paths through a program's source code (from Wikipedia).
But I want to know which software entity and attribute it really measures.
Cyclomatic complexity, CC, is measured at the granularity of the function or method. Sometimes it is summed up for a class and called the Weighted Method Count, WMC, i.e. the sum of the CCs of every method in the class.
Cyclomatic complexity analyses the code: it looks at the loops and branches in the code and assumes that the more loops and branches there are, the more complex the code is.
Complexity is then linked to maintainability; the assumption is that the higher the complexity, the more difficult the code is to maintain.
It is used to measure the complexity of methods and classes.
A complexity of 3 is not bad for a method; if it is greater than 3, the method is a candidate for refactoring.
It encourages writing small methods, so that there is a high possibility of code reuse.
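As a rough illustration of both answers (a sketch of my own, assuming the simple rule "decision points + 1"; real tools count more constructs):

import ast

# Each branch or loop adds one linearly independent path, so a crude
# estimate of CC is the number of decision nodes plus one.
DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler,
             ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(func_node):
    return 1 + sum(isinstance(n, DECISIONS) for n in ast.walk(func_node))

source = """
class Stack:
    def push(self, x):
        self.items.append(x)
    def pop(self):
        if not self.items:
            raise IndexError("empty")
        return self.items.pop()
"""
tree = ast.parse(source)
for node in ast.walk(tree):
    if isinstance(node, ast.FunctionDef):
        print(node.name, cyclomatic_complexity(node))
# push -> 1, pop -> 2; the Weighted Method Count for the class
# is their sum: 3.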
