Pseudo random number - random

I need to find a function f(t) which generates pseudo random numbers in the range [0, 1) with a uniform distribution. Results for the same t must be equal.

It seems that a linear congruential generator might be the way to go.
https://rosettacode.org/wiki/Linear_congruential_generator
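For illustration, here is a minimal LCG sketch in Python. Treating t as an integer seed is an assumption (the question doesn't say what t is), and the constants are the widely used glibc-style LCG parameters, not anything prescribed by the question:

```python
def f(t):
    """Deterministic pseudo-random value in [0, 1) for integer t.

    One step of a linear congruential generator keyed on t: the same
    t always maps to the same output.
    """
    a, c, m = 1103515245, 12345, 2**31
    return ((a * t + c) % m) / m
```

Note that a single LCG step keyed directly on t is only weakly scrambled; iterating the recurrence a few times, or running t through a proper integer hash first, would make nearby inputs look less correlated.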

Related

Generating g(x)=2e^(-2x) in Matlab

I want to generate 10000 random variables with the distribution function g(x)=2e^(-2x). I'm thinking of using random but struggle to understand how to get the (-2x) part of the density function. Is random the way to go, or is there another way?
I presume you mean g(x) is the density function, not the distribution function. What you've described is the density of an exponential random variable with rate λ = 2. You can generate these in Matlab with the exprnd() function. Note that Matlab parameterizes things in terms of the mean mu, which is the inverse of the rate, so to get variates with rate 2 you specify a mean of 1/2.
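As a concrete sketch of the same draw (in Python/NumPy rather than Matlab, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng()
# g(x) = 2*exp(-2x) is the Exponential(rate = 2) density; like Matlab's
# exprnd, NumPy parameterizes by the mean, so scale = 1/2.
samples = rng.exponential(scale=0.5, size=10_000)
```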

Generating Gaussian Random Numbers without a Uniform Random Number Generator

I know many uniform random number generators (RNGs) based on algorithms, physical systems, and so on. Eventually, all of these lead to uniformly distributed random numbers. It's interesting and important to know whether there are Gaussian RNGs, i.e. algorithms or something else that create Gaussian random numbers directly. More precisely: I don't want to use transformations such as Box–Muller or the Marsaglia polar method to get Gaussians from uniform RNGs. I am interested in whether there is a paper, algorithm, or even an idea for creating Gaussian random numbers without any use of uniform RNGs. Put another way: we pretend that we don't know uniform random number generators exist.
As already noted in answers/comments, by virtue of the CLT a sum of iid random numbers can be made into a reasonable-looking Gaussian. If the incoming stream is uniform, this is basically the Bates distribution. Ami Tavory's answer pretty much amounts to using Bates in disguise. You could also look at the closely related Irwin-Hall distribution; at n = 12 or higher it looks a lot like a Gaussian.
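As a sketch of the Irwin-Hall trick at n = 12 (in Python, and assuming a uniform source is still permitted as the raw input stream, which the question arguably rules out):

```python
import numpy as np

rng = np.random.default_rng()

def approx_gaussian(size):
    # Irwin-Hall with n = 12: the sum of 12 U(0,1) draws has mean 6
    # and variance 12 * (1/12) = 1, so subtracting 6 yields an
    # approximate standard normal.
    return rng.uniform(size=(size, 12)).sum(axis=1) - 6.0
```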
There is one method which is used in practice and does not rely on transforming U(0,1): the Wallace method (Wallace, C. S. 1996. "Fast Pseudorandom Generators for Normal and Exponential Variates." ACM Transactions on Mathematical Software.), also called the Gaussian pool method. I would advise reading the description here and seeing whether it fits your purpose.
As others have noted, it's a bit unclear what your motivation for this is, and therefore I'm not sure whether the following answers your question.
Nevertheless, it is possible to generate (an approximation of) this without the specific uniform-RNG transformations that you mention.
As with any RNG, we have to have some source of randomness (or pseudo-randomness). I'm assuming, therefore, that there is some limitless sequence of binary bits which are independently equally likely to be 0 or 1 (note that it's possible to counter that this is a uniform discrete binary RNG, so I'm unsure if this answers your question).
Choose some large fixed n. For each invocation of the RNG, generate n such bits, sum them as x, and return
(2x - n) / √n
Since the sum x of n fair bits is Binomial(n, 1/2), with mean n/2 and variance n/4, by the de Moivre–Laplace theorem this standardized quantity is approximately normal with mean 0 and variance 1.
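A minimal sketch of this bit-summing procedure in Python (the choice n = 1024 and the use of secrets.randbits as the bit source are illustrative assumptions, not part of the answer above):

```python
import math
import secrets  # stand-in bit source; any fair bit stream works

def normal_from_bits(n=1024):
    # x ~ Binomial(n, 1/2): mean n/2, variance n/4.
    x = sum(secrets.randbits(1) for _ in range(n))
    # Standardize: (x - n/2) / (sqrt(n)/2) = (2x - n) / sqrt(n).
    return (2 * x - n) / math.sqrt(n)
```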

How to generate random numbers when given the value of a probability density function?

When given a set of values derived from a probability density function f, like this
{f(X1), f(X2), ..., f(Xn)}
we don't know the exact form of f; all we know is that the probability density function is a generalized Gaussian distribution.
Is it possible to generate the random numbers Xi if each Xi belongs to the range [-3, 3]?
The most straightforward way that I can see is this. Assuming that you have a large number of points {f(X1), ..., f(Xn)}, plot them as a distribution and fit a generalized Gaussian curve through them. After this, you can use rejection sampling to generate further numbers from the same distribution, as in the sketch below.
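Here is the rejection-sampling step in Python. The function name and signature are hypothetical, and the sketch assumes the curve-fitting step is already done, producing a density function pdf that accepts NumPy arrays:

```python
import numpy as np

rng = np.random.default_rng()

def rejection_sample(pdf, lo=-3.0, hi=3.0, pdf_max=None, size=1000):
    """Draw `size` samples from `pdf` on [lo, hi] by rejection.

    `pdf` is the fitted generalized-Gaussian density (assumed to be
    vectorized over NumPy arrays); `pdf_max` bounds it from above.
    """
    if pdf_max is None:
        xs = np.linspace(lo, hi, 10_001)
        pdf_max = float(pdf(xs).max())   # crude numeric bound on the density
    out = []
    while len(out) < size:
        x = rng.uniform(lo, hi)          # uniform proposal on [lo, hi]
        u = rng.uniform(0.0, pdf_max)    # uniform height under the envelope
        if u < pdf(x):                   # accept with prob. pdf(x) / pdf_max
            out.append(x)
    return np.array(out)
```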

Using VB 6.0 to generate pseudorandom numbers with a Gaussian distribution

I would like to generate some pseudorandom numbers on (-infinity, infinity) with a Gaussian distribution of standard deviation s and mean m. Any suggestions about how to do this? I'd appreciate any pointers in the right direction, as there seems to be a huge literature out there on how best to generate pseudorandom numbers.
You can generate a Gaussian distribution (also known as a normal distribution) by using a uniform random number generator and an appropriate algorithm. Check out [stackoverflow link to Gaussian algorithms][1]
Do you really want to go from +/- infinity? Does that make sense?
A simple algorithm to use is the Box-Muller method.
Normal Dist. Random # = SQRT(-2*LN(RAND()))*SIN(2*PI()*RAND())
The Box-Muller method is mathematically exact if implemented with a perfect uniform random number generator and infinite precision. (Oops: in that formula, mu/mean = 0, sigma = 1, and the random numbers are between 0 and 1.) See http://mathworld.wolfram.com/Box-MullerTransformation.html
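Here is that formula as a small sketch (in Python rather than VB 6.0, purely for illustration), generalized to mean m and standard deviation s:

```python
import math
import random

def gauss_box_muller(m=0.0, s=1.0):
    u1 = random.random()  # U[0, 1)
    u2 = random.random()
    # Box-Muller: 1 - u1 lies in (0, 1], so log() never sees zero.
    z = math.sqrt(-2.0 * math.log(1.0 - u1)) * math.sin(2.0 * math.pi * u2)
    return m + s * z
```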

Pseudorandom Number Generation with Specific Non-Uniform Distributions

I'm writing a program that simulates various random walks (with differing distributions). At each timestep, I need randomly generated two-dimensional step distances and angles drawn from the random walk's distribution. I'm hoping someone can check my understanding of how to generate these random numbers.
As I understand it I can use Inverse Transform Sampling as follows:
If f(x) is the pdf of our random walk, which has a non-uniform distribution, and y is a random number from a uniform distribution, then if we let f(x) = y and solve for x, we have a random number from the non-uniform distribution.
Is this a feasible solution?
Not quite. The function that needs to be inverted is not f(x), the pdf, but F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt, the cdf. The good thing is that F is monotone, so it actually has a unique inverse (unlike f).
There are multiple other ways of generating random numbers according to a given distribution. For example, if the cdf F is difficult to compute or to invert, rejection sampling can be a good option if f is easy to compute.
You are close, but not quite. Every probability density function (pdf) has a corresponding cumulative distribution function (cdf). An important property of the CDF is that its values always lie between 0 and 1. Because it is relatively easy to draw a random number between 0 and 1, we can use that to work our way backwards to the distribution. So changing the word pdf to CDF in your question makes the statement correct.
As an aside, for this to make sense computationally you need an easy-to-calculate inverse of the CDF. One way to do this is to fit a polynomial approximation to the CDF and invert that function. There are more advanced techniques for simulating probability distributions with messy densities. See this book chapter for the details.
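For a concrete illustration of inverse transform sampling, here is a sketch in Python using the exponential distribution, chosen only because its CDF inverts in closed form (the question's actual step-length distribution isn't specified):

```python
import math
import random

def exponential_variate(lam):
    # CDF F(x) = 1 - exp(-lam * x) inverts in closed form:
    # x = F^{-1}(y) = -ln(1 - y) / lam.
    y = random.random()                 # y ~ U[0, 1)
    return -math.log(1.0 - y) / lam     # map y back through the inverse CDF
```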
