What is "gate count" in synthesis result and how to calculate - vhdl

I'm synthesizing my design with Design Compiler and comparing it against another design (as an evaluation in my report). The Synopsys tool can easily report the area with a command, but all the papers I've read care about gate count.
My question is: what is gate count, and how do I calculate it?
I googled and read that gate count is calculated as total_area/NAND2_area. Is that true?
Thanks for reading, and please don't blame me for a stupid question :(.

Synthesised area is often quoted as gate count in NAND2 equivalents. You are correct:
(total area) / (NAND2 area).
Older tools and libraries used to report this number; a few years ago I noticed a shift toward tools just providing areas in square microns. However, gate count is a nicer number to get your head around, and it is portable between different process geometries.
40K gates for implementation A is clearly smaller than 50K gates for implementation B. It is much harder to compare 100000 um^2 for implementation A on process X vs 65000 um^2 for implementation B on process Y.
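For concreteness, here is a minimal Python sketch of the conversion. The numbers are made up for illustration; in practice the total cell area comes from Design Compiler's report_area output and the NAND2 area comes from your target library's databook.

# Illustrative values only -- substitute your own report_area total
# and your library's 2-input NAND cell area.
total_area_um2 = 100000.0   # total cell area reported by synthesis (um^2)
nand2_area_um2 = 2.5        # area of the library's NAND2 cell (um^2)

gate_count = total_area_um2 / nand2_area_um2
print(f"Gate count: {gate_count:,.0f} NAND2 equivalents")   # -> 40,000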

Related

What do the digits in WELL generators represent?

In pseudo-random number generators like WELL512a, WELL1024, and WELL44497b, I understand what WELL (well-equidistributed long-period linear) stands for, but I can't find any information on the suffix.
I'm writing a paper on RNGs and I'm not sure if this is relevant.
This is, I believe, log2(RNG period). Thus, WELL512a will have a period of 2^512, WELL1024 will have a period of 2^1024, etc.
Reference: http://www.iro.umontreal.ca/~lecuyer/myftp/papers/wsc05rng.pdf, Table 1
This is an old question, and I'm sure the OP has moved on, but others may be interested in the answer. @SeverinPappadeux's answer is pretty much correct. The number n in the suffix is roughly the number of bits in the internal state. The period is 2^n - 1. The letters after the numbers indicate different variants of the PRNG with the corresponding period; the different letters don't have any meaning other than indicating different versions.
The Wikipedia page is very brief:
https://en.wikipedia.org/wiki/Well_equidistributed_long-period_linear
This is the official paper on the WELL generators:
http://www.iro.umontreal.ca/~lecuyer/myftp/papers/wellrng.pdf
The table on page 9 lists parameters for the various WELL generators. You have to study the paper to understand the parameters, but the upper Δ1 in the right-hand column is worth noticing. Zero is the best value for Δ1: it's the number of dimensions in which the random numbers are not equidistributed. So it's worth noticing, for example, that Δ1 is not zero for WELL19937a or WELL19937b, but it is zero for WELL19937c. Thus if you want a WELL generator and like the idea of a generator with period 2^19937 - 1, and you don't mind 624 words of state (624 * 32 = 19968), it's probably slightly better to use WELL19937c rather than the other two. (This is probably one reason why WELL19937c is currently the default generator for the Apache Commons Math lib, release 3.6.1, btw.)
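The suffix-to-state-size arithmetic is easy to check yourself. A quick Python sketch (the variant names follow the WELL paper; the rest is just ceiling division with 32-bit words):

import math

# Suffix n is (roughly) the number of state bits; with 32-bit words the
# state occupies ceil(n / 32) words, and the period is 2^n - 1.
for n in (512, 1024, 19937, 44497):
    words = math.ceil(n / 32)
    print(f"WELL{n}: period 2^{n} - 1, state = {words} x 32-bit words")
# WELL19937 -> 624 words, matching the 624 * 32 = 19968 figure above.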

Generate Random Numbers non-algorithmically

I am looking for a satisfying solution of how to generate a random number.
I looked at this, this, this and this.
But am looking for something else.
Most of the posts mention using this pseudo-random function, R[n+1] = (a * R[n-1] + b) % n, or some other mathematical function.
But, weirdly, I am not looking for these; I want a non-algorithmic answer. Precisely, an "interview" answer: something easy to understand, that won't make the interviewer feel that I mugged up a method :) .
For an interview question, a common answer might be to look at the intervals between keystrokes (ask the user to type something), disk seek times, or input from a disconnected source; that will give you thermal noise from inside your mic socket or whatever.
LavaRnd uses a digital camera with the lens cap on, which is a version of the last.
Some operating systems allow indirect access to some of this random input, usually through a secure random function; slower, but more secure than the usual RNG.
Depending on what job the interview is for, you can talk about testing the raw data to check for entropy, and concentrating the entropy by using a cryptographic hash function like SHA-256.
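As a concrete illustration of that last point, here is a minimal Python sketch that harvests inter-keystroke intervals and concentrates them through SHA-256. It is a toy, not production code: a real harvester would test the raw samples for bias and keep an entropy estimate before trusting the output.

import hashlib
import time

def keystroke_entropy(n_events=16):
    deltas = []
    last = time.perf_counter_ns()
    for _ in range(n_events):
        input("press Enter... ")      # wait for a keystroke
        now = time.perf_counter_ns()
        deltas.append(now - last)     # the low bits of the interval are the noisy part
        last = now
    raw = b"".join(d.to_bytes(8, "little") for d in deltas)
    # Concentrate the (biased, correlated) raw samples down to 256 bits.
    return hashlib.sha256(raw).digest()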
There are also specialised, and expensive, hardware cards which use various quantum effects to generate true random numbers.
Take the system time, add a seed, and take it modulo the upper limit. If the upper limit is less than 0, multiply it by -1, and then subtract the result from the max afterwards... not very strong, but it meets your requirement?
If you have a UI and only need a couple of random numbers, you can ask the user to move the mouse around, enter a few seeds, or type a few words, and use those as seeds.

Relationship between flopping and metastability

While doing clock domain crossings (rate matched) we usually double-flop the data to avoid metastable states. Double flopping just reduces the probability of metastability; triple flopping reduces it further.
How do I calculate the probability of metastability, and its relationship to the number of clock-domain-crossing flops used?
The canonical answer to metastability queries always involves referring to articles written by the late, great Peter Alfke. In particular, the XAPP094 appnote - don't worry about its age, the theory is still the same.
There are also numbers for some more recent families available - although I can't see anything for the 6 and 7 series as yet.
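To give the flavour of the numbers, here is a small Python sketch of the classic MTBF model used in that appnote and elsewhere: MTBF = exp(t_r / tau) / (T0 * f_clk * f_data), where t_r is the resolution time available before the next stage samples. Each extra synchronizer flop adds roughly one clock period to t_r, so MTBF improves exponentially with the number of stages. All constants below are made-up illustrative values; real tau and T0 come from device characterization data.

import math

def mtbf_seconds(stages, f_clk, f_data, tau, t0, t_setup):
    t_clk = 1.0 / f_clk
    # Resolution time: slack in the first stage, plus one full period per extra flop.
    t_r = (t_clk - t_setup) + (stages - 1) * t_clk
    return math.exp(t_r / tau) / (t0 * f_clk * f_data)

f_clk, f_data = 100e6, 10e6              # 100 MHz clock, 10 MHz async data rate
tau, t0, t_setup = 0.5e-9, 1e-9, 0.5e-9  # hypothetical device constants

for n in (1, 2, 3):
    print(f"{n}-flop synchronizer: MTBF ~ "
          f"{mtbf_seconds(n, f_clk, f_data, tau, t0, t_setup):.2e} s")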

When to stop the looping in random number generators?

I'm not sure Stack Overflow is the right place to ask this question, because it is half programming and half mathematics. Also, I'm really sorry if my question is stupid ^_^
I'm studying Monte Carlo simulations via the "Monte Carlo Methods" book. One of the first things I must learn about is random number generators. The basic algorithm of an RNG is:
1. Initialize: Draw the seed S_0 from the distribution µ on S. Set t = 1.
2. Transition: Set S_t = f(S_{t-1}).
3. Output: Set U_t = g(S_t).
4. Repeat: Set t = t + 1 and return to Step 2.
(µ is a probability distribution on the finite set of states S; the input is S_0 and the random number we desire is the output U_t.)
It is not hard to understand, but the problem is that I don't see the random factor, which should lie in the number of repeats. How can we decide when to stop the RNG loop? All the examples I've read that implement an RNG loop 100 times, and they return the same values for a specific seed. That is not random at all >_<
Can someone explain what I'm missing here? Any help will be appreciated. Thanks, everyone.
You can't get a true sequence of random numbers on a computer, without specialized hardware. (Such specialized hardware performs the equivalent of an initial roll of the dice using physics to provide the randomness. Electronic ones often use the electronic noise of specialized diodes at constant temperatures; others use radioactive decay events.)
Without that specialized hardware, what you can generate are pseudorandom numbers which, as you've observed, always generate the same sequence of numbers for the same initial seed. For simple applications, you can often get away with generating an initial seed from the time of invocation, which is effectively random.
And when I say "simple applications," I am excluding cryptography. (Not just that, but especially that.)
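To make the determinism concrete, here is a minimal Python sketch of the S_t = f(S_{t-1}), U_t = g(S_t) scheme from the question, using a toy linear congruential generator as f (the constants are the classic C-library ones, chosen for illustration, not as a recommendation):

def make_rng(seed):
    state = seed                                      # S_0: the seed
    def next_u():
        nonlocal state
        state = (1103515245 * state + 12345) % 2**31  # transition: S_t = f(S_{t-1})
        return state / 2**31                          # output: U_t = g(S_t), in [0, 1)
    return next_u

rng_a, rng_b = make_rng(42), make_rng(42)
print([round(rng_a(), 4) for _ in range(3)])
print([round(rng_b(), 4) for _ in range(3)])   # same seed -> identical sequence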
Sometimes when you are trying to debug a simulation, you actually want a reproducible stream of "random" numbers, so you might deliberately set the stream to start with a specific seed.
For instance, in the answer to Creating a facet_wrap plot with ggplot2 with different annotations in each plot, rcs starts by creating a reproducible set of data using the R code
set.seed(1)
df <- data.frame(x=rnorm(300), y=rnorm(300), cl=gl(3,100)) # create test data
before going on to demonstrate how to answer the actual question.

How exactly does a PC/Mac generate random numbers of either 0 or 1?

This question is NOT about how to use any language to generate a random number in an arbitrary interval. It is about generating either 0 or 1.
I understand that many random generator algorithms manipulate a very basic random(0 or 1) function, take a seed from the user, and use an algorithm to generate various random numbers as needed.
The question is: how does the CPU generate either 0 or 1? If I throw a coin, I can generate heads or tails; that's because I physically throw the coin and let nature decide. But how does the CPU do it? There must be an action the CPU performs (like throwing a coin) to get either 0 or 1 randomly, right?
Could anyone tell me about it?
Thanks
(This has several facets and thus several algorithms. Keep in mind that there are many different forms of randomness used for different purposes, but I understand your question in the way that you are interested in actual randomness used for cryptography.)
The fundamental problem here is that computers are (mostly) deterministic machines. Given the same input in the same state they always yield the same result. However, there are a few ways of actually gathering entropy:
User input. Since users bring outside input into the system, you can derive some bits from it, similar to how you could use radioactive decay or line noise.
Network activity. Again, an outside source of stuff.
Generally interrupts (which kinda include the first two).
As alluded to in the first item, noise from peripherals, such as audio input or a webcam can be used.
There is dedicated hardware that can generate a few hundred MiB of randomness per second. Usually they give you random numbers directly instead of their internal entropy, though.
How exactly you derive bits from that is up to you but you could use time between events, or actual content from the events, etc. – generally eliminating bias from entropy sources isn't easy or trivial and a lot of thought and algorithmic work goes into that (in the case of the aforementioned special hardware this is all done in hardware and the code using it doesn't need to care about it).
Once you have a pool of actually random bits you can just use them as random numbers (/dev/random on Linux does that). But this has downsides, since there is usually little actual entropy and possibly a higher demand for random numbers. So you can invent algorithms to “stretch” that initial randomness in a manner that makes it still impossible or at least very difficult to predict anything about following numbers (/dev/urandom on Linux or both /dev/random and /dev/urandom on FreeBSD do that). Fortuna and Yarrow are so-called cryptographically secure pseudo-random number generators and designed with that in mind. You still have a very good guarantee about the quality of random numbers you generate, but have many more before your entropy pool runs out.
In any case, the CPU itself cannot give you a random 0 or 1. There's a lot more involved and this usually includes the complete computer system or special hardware built for that purpose.
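In practice that means you ask the operating system's entropy-backed generator rather than trying to coax a bit out of the CPU yourself. A minimal Python sketch (os.urandom reads from the kernel's CSPRNG, which is fed by the entropy sources listed above):

import os

def random_bit():
    return os.urandom(1)[0] & 1   # one byte from the kernel CSPRNG, keep the low bit

print([random_bit() for _ in range(8)])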
There is also a second class of computational randomness: Plain vanilla pseudo-random number generators (PRNGs). What I said earlier about determinism – this is the embodiment of it. Given the same so-called seed a PRNG will yield the exact same sequence of numbers every time¹. While this sounds idiotic it has practical benefits.
Suppose you run a simulation involving lots of random numbers, maybe to simulate interaction between molecules or atoms that involve certain probabilities and unpredictable behaviour. In science you want results anyone can independently verify, given the same setup and procedure (or, with computing, the same algorithms). If you used actual randomness the only option you have would be to save every single random number used to make sure others can replicate the results independently.
But with a PRNG all you need to save is the seed and remember what algorithm you used. Others can then get the exact same sequence of pseudo-random numbers independently. Very nice property to have :-)
Footnotes
¹ This even includes the CSPRNGs mentioned above, but they are designed to be used in a special way that includes regular re-seeding with entropy to overcome that problem.
A CPU can only generate a uniform random number, U(0,1), which ranges from 0 to 1. Mathematically, it is a random variable U on the range [0,1]. Example draws of a U(0,1) random number would be 0.28100002, 0.34522, 0.7921, etc. Every value between 0 and 1 is equally likely, i.e., the values are equiprobable.
You can generate binary random variates that are either 0 or 1 by mapping a draw of U(0,1) to 0 if U(0,1) <= 0.5 and to 1 if U(0,1) > 0.5, since in theory an equal number of draws of U(0,1) will fall below 0.5 and above 0.5.
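That thresholding is one line of code; here is a quick Python sketch using the standard library's uniform generator:

import random

u = random.random()           # a U(0,1) draw in [0, 1)
bit = 0 if u <= 0.5 else 1    # threshold at 0.5, as described above
print(u, "->", bit)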
