Are there any efficient random BYTES generators (without using divisions)?

I'm trying to generate some 8-bit random numbers with C++ and don't want to use divisions (like rand() % 8 or any other scaling method).
One algorithm I found online is the Park-Miller-Carta Pseudo-Random Number Generator.
It is a 32-bit random number generator that uses no divisions. With these random numbers I'm trying to extract the lower or upper 8 bits to get random bytes, but this does not seem to work: those bits are not very random.
Are there any tricks to fix this, or other algorithms that can do the job?

How about XORing together the four bytes of a 32-bit random integer?
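As a sketch of that idea, here is a fold of all four bytes of a 32-bit output into one byte. Marsaglia's xorshift32 is used below as a stand-in for any division-free 32-bit generator such as Park-Miller-Carta; the seed is arbitrary (but must be nonzero):

#include <cstdint>

// xorshift32 (Marsaglia): a division-free 32-bit generator, standing in
// here for any 32-bit PRNG such as Park-Miller-Carta.
static uint32_t state = 0x12345678u; // arbitrary nonzero seed

uint32_t xorshift32() {
    state ^= state << 13;
    state ^= state >> 17;
    state ^= state << 5;
    return state;
}

// Fold the four bytes together with XOR so every bit of the 32-bit
// output influences the resulting byte.
uint8_t random_byte() {
    uint32_t x = xorshift32();
    return (uint8_t)(x ^ (x >> 8) ^ (x >> 16) ^ (x >> 24));
}

The XOR fold mixes the weaker low-order bits with the stronger high-order ones, which addresses exactly the problem described in the question.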

Related

Floating point random number generation in Verilog

Is there a way to generate random floating point numbers in either Verilog or SystemVerilog? More specifically, through some hardware implementation?
A floating point number is just a collection of bits. As such, generating a random floating point number can be done by generating random bits and then interpreting them as a float (or a real, as the type is called in both VHDL and Verilog).
A standard way of generating a series of random bits in hardware is a PRBS generator (Pseudo-Random Bit Sequence generator): a linear feedback shift register with feedback taps chosen to give a maximum-length sequence. There are various polynomials, depending on how long a sequence you want.
For an exact implementation I suggest you search for PRBS; a software model is sketched below.
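As an illustration only, here is a minimal software model of PRBS7 (polynomial x^7 + x^6 + 1, one common maximal-length choice); in hardware the same logic is a 7-bit shift register plus a single XOR gate:

#include <cstdint>

// Software model of a 7-bit maximal-length LFSR (PRBS7, x^7 + x^6 + 1).
// Each call shifts once and returns the next pseudo-random bit; the
// sequence repeats after 2^7 - 1 = 127 bits.
uint8_t prbs7_next_bit(uint8_t &state) { // state must start nonzero
    uint8_t bit = ((state >> 6) ^ (state >> 5)) & 1u; // taps 7 and 6
    state = (uint8_t)(((state << 1) | bit) & 0x7Fu);
    return bit;
}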

Uniform pseudo-random number generator between two numbers on GPU

Just need a fast way to generate pseudo-random numbers between -1 and 1 on the GPU.
I've been looking at the xorshift random number generators, but I can't figure out how to constrain them between two numbers.
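One standard mapping, sketched here in plain C++ (the same arithmetic works inside a GPU kernel): take the top 24 bits of the xorshift output, scale them into [0, 1) with a single multiply, then stretch and shift into [-1, 1).

#include <cstdint>

// Map a 32-bit xorshift output to [-1, 1). The top 24 bits fit exactly
// in a float mantissa, so u lands in [0, 1) without rounding up to 1.0.
float to_signed_unit(uint32_t x) {
    float u = (x >> 8) * (1.0f / 16777216.0f); // u in [0, 1)
    return 2.0f * u - 1.0f;                    // result in [-1, 1)
}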

Computing entropy/disorder

Given an ordered sequence of a few thousand 32-bit integers, I would like to know how measures of their disorder or entropy are calculated.
What I would like is to be able to calculate a single value of the entropy for each of two such sequences and be able to compare their entropy values to determine which is more (dis)ordered.
I am asking here, as I think I may not be the first with this problem and would like to know of prior work.
Thanks in advance.
UPDATE #1
I have just found this answer, which looks great, but it would give the same entropy even if the integers were sorted. It only measures the entropy of the individual ints in the list and disregards their (dis)order.
Entropy calculation generally:
http://en.wikipedia.org/wiki/Entropy_%28information_theory%29
Furthermore, you have to find the frequency of each of your integers: sort the list and iterate over it counting runs of equal values (a hash map works just as well). Afterwards, you can use the formula.
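A minimal sketch of that calculation (counting frequencies with a hash map; sorting first and counting runs gives the same result):

#include <cmath>
#include <cstdint>
#include <unordered_map>
#include <vector>

// Shannon entropy of a sequence, treating each distinct value as a
// symbol: H = -sum(p_i * log2(p_i)). Note that this ignores ordering
// entirely, which is exactly the limitation raised in the update above.
double shannon_entropy(const std::vector<uint32_t> &values) {
    std::unordered_map<uint32_t, size_t> counts;
    for (uint32_t v : values) ++counts[v];
    double h = 0.0;
    double n = static_cast<double>(values.size());
    for (const auto &kv : counts) {
        double p = kv.second / n;
        h -= p * std::log2(p);
    }
    return h;
}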
I think I'll have to code a Shannon entropy in 2D: arrange the list of 32-bit ints as a series of 8-bit bytes and run Shannon's formula on that; then, to capture how ordered they may be, take the bytes eight at a time and form a new list of bytes composed of bit 0 of each of the eight, followed by bit 1 of each of the eight, ... up to bit 7 of the eight; then the same for the next eight original bytes, and so on.
I'll see how it goes/codes...
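For concreteness, here is one way that regrouping step could look, written as an 8x8 bit transpose (a hypothetical helper based on my reading of the scheme above; trailing bytes beyond a multiple of eight are simply dropped):

#include <cstdint>
#include <vector>

// 8x8 bit transpose: from each group of eight bytes, build eight new
// bytes where output byte k collects bit k of every input byte.
std::vector<uint8_t> transpose_bits(const std::vector<uint8_t> &in) {
    std::vector<uint8_t> out;
    for (size_t g = 0; g + 8 <= in.size(); g += 8) {
        for (int bit = 0; bit < 8; ++bit) {
            uint8_t b = 0;
            for (int j = 0; j < 8; ++j)
                b |= (uint8_t)(((in[g + j] >> bit) & 1u) << j);
            out.push_back(b);
        }
    }
    return out;
}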
Entropy is a function of probabilities, not of data (arrays of ints, or files). Entropy is a measure of disorder, but when the function is modified to take data as input, it loses this meaning.
The only true way to generate a measure of disorder for data is to use Kolmogorov complexity. This has problems too: in particular, it is uncomputable, and it is not strictly well defined, since one must arbitrarily pick a base language. This lack of well-definedness can be resolved if the disorder being measured is relative to something that is going to process the data. So when considering compression on a particular computer, the base language would be the assembly language of that computer.
So you could define the disorder of an array of integers as follows:
The length of the shortest program, written in that assembly language, that outputs the array.

More random numbers

So I get that all the built-in functions only return pseudo-random numbers, as they use the clock or some other hardware input to get the number.
So here's my idea: if I take two pseudo-random numbers and combine them with a bitwise operation, would the result still be pseudo-random, or would it be closer to truly random?
I figured that if I fiddled with the numbers a bit, the result would be less replicable. Or am I getting this wrong?
On a side note, why is pseudo-random a problem?
It will not be more random, and there is a big risk that the result will be less random (less uniformly distributed). Which bitwise operator were you thinking of?
Let's assume the 4-bit random numbers 0101 and 1000. ORed together, they give 1101. With OR there would be a clear bias towards 1111, and with AND towards 0000 (a 75% chance of getting a 1 or a 0, respectively, in each bit position).
I don't think XOR or XNOR would be biased. But you also wouldn't get any more randomness out of it (see Pavium's answer).
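A quick simulation makes the bias concrete: with uniform inputs, OR should set each bit about 75% of the time, AND about 25%, and XOR about 50%.

#include <cstdio>
#include <cstdlib>

// Combine pairs of random 4-bit values with OR, AND and XOR, and count
// how often each result bit is set.
int main() {
    const int trials = 100000;
    long or_ones = 0, and_ones = 0, xor_ones = 0;
    for (int i = 0; i < trials; ++i) {
        unsigned a = rand() & 0xFu, b = rand() & 0xFu;
        for (int bit = 0; bit < 4; ++bit) {
            or_ones  += ((a | b) >> bit) & 1u;
            and_ones += ((a & b) >> bit) & 1u;
            xor_ones += ((a ^ b) >> bit) & 1u;
        }
    }
    const double total = 4.0 * trials;
    printf("P(bit=1): OR %.3f  AND %.3f  XOR %.3f\n",
           or_ones / total, and_ones / total, xor_ones / total);
    return 0;
}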
Algorithms executed by computers are deterministic.
You can only generate truly random numbers if there's a non-deterministic input.
Pseudo-random numbers follow a repeating sequence. Maybe a long sequence, but the repetition makes them predictable and therefore not truly random.
You can't generate truly random numbers from two pseudo-random numbers.
EDITED: to put the sentences in a more logical order.

Best method of generating a number with 256 random bits?

What is the best method of generating a number with 256 random bits?
Does concatenating random bytes work?
using System;
using System.Security.Cryptography;

byte[] data = new byte[32];
RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider();
rng.GetBytes(data); // GetNonZeroBytes would skip zero bytes and bias the number
string number = BitConverter.ToString(data, 0).Replace("-", "");
Furthermore, would it be appropriate to sort a deck of cards using non-duplicates of these numbers?
Whether or not you can concatenate random bytes depends on the random number generator that you are using. Some random number generators exhibit serial correlation. For these random number generators, concatenating would be bad.
If you are using these random numbers for cryptographic purposes, you should look at Blum Blum Shub. Otherwise, look at the Mersenne Twister.
For shuffling a finite set, look at the Fisher-Yates shuffle.
The correct way to shuffle a deck of cards is with a Knuth shuffle (Fisher-Yates). It's simple and perfect: perfect meaning that all possible card orderings are equally likely, assuming the use of a good RNG.
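A minimal sketch of that shuffle (std::mt19937 is used here purely as an example source of randomness):

#include <random>
#include <utility>
#include <vector>

// Fisher-Yates (Knuth) shuffle: walk the deck from the back, swapping
// each card with a uniformly chosen card at or before it. Every
// permutation is equally likely if each index is drawn uniformly.
void shuffle_deck(std::vector<int> &deck, std::mt19937 &rng) {
    for (size_t i = deck.size(); i > 1; --i) {
        std::uniform_int_distribution<size_t> dist(0, i - 1);
        std::swap(deck[i - 1], deck[dist(rng)]);
    }
}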
Yes, concatenating random bytes would work.
EDIT: Not sure why you would need 256 bits to shuffle a deck of cards; can you expand on that part further?
If the random byte generator is good, any method works equally well, and also your card shuffling approach is appropriate.
