I want to generate a random number between 1,000 and 9,999 and convert the return value to a string. Thus I have:
(Math.floor(Math.random() * 9999) + 1000).toString()
However, this always returns a number that starts with a one. What am I doing wrong? Also, if I want a number in the range 1 to 9999 whose digits are not repeated, how do I modify the equation?
Expected: a random number that doesn't always begin with a 1.
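For reference, a minimal sketch of the intended arithmetic, written in Python rather than JavaScript (the scale-and-offset logic is the same: multiply by the size of the range, 9000, then add the lower bound, 1000); the rejection loop for distinct digits is just one possible approach:

import random

# 1000..9999: scale by the range size (9000), then offset by the lower bound (1000)
n = random.randrange(9000) + 1000
s = str(n)

# 1..9999 with no repeated digits: keep drawing until all digits are distinct
while True:
    m = random.randint(1, 9999)
    if len(set(str(m))) == len(str(m)):
        break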
The Maxima CAS random function takes a floating-point number as input and gives a floating-point number as output.
I need a floating-point number with more digits, so I use bfloat with increased precision.
I have tried:
random(1.0b0)
bfloat(random(1.0));
The best result was:
bfloat(%pi)/6.000000000000000000000000000000000000000000b0
5.235987755982988730771072305465838140328615665625176368291574320513027343810348331046724708903528447b-1
but it is not random.
One way to generate a random bigfloat is to generate an integer with the appropriate number of bits and then rescale it to get a number in the range 0 to 1.
Note that random(n) returns an integer in the range 0 to n - 1 when n is an integer, therefore: bfloat(random(10^fpprec) / 10^fpprec).
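For comparison only, the same generate-an-integer-then-rescale idea sketched in Python (not Maxima), using the decimal module in place of bfloat; the precision of 50 digits is just an example standing in for fpprec:

import random
from decimal import Decimal, getcontext

getcontext().prec = 50                    # plays the role of fpprec
digits = getcontext().prec
n = random.randrange(10**digits)          # uniform integer in [0, 10^digits)
x = Decimal(n) / Decimal(10**digits)      # rescaled high-precision value in [0, 1)
print(x)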
I think I've hit the upper limit for generating random numbers.
Is there any workaround for this?
I need to generate ten unique 19-digit random numbers in one formula/string.
I have this: =TRUNC (RAND() * (9999999999999999999 - 1) + 1) but the spreadsheet rewrites it to:
=TRUNC (RAND() * (9999999999999990000 - 1) + 1)
so I guess the limit is 9999999999999990000.
Desired output format in A1:
2459759093970314589,6393667943286134368,4897561254458152397, etc.
Numeric precision in Excel is ~15 significant digits. An alternative would be to generate the random numbers as strings (the first character needs to be in the range 1 to 9, the others in the range 0 to 9). Something like this:
=CONCAT(
TRUNC(RAND()*9+1);
TRUNC(RAND()*10);
TRUNC(RAND()*10);
TRUNC(RAND()*10);
...
)
=TRANSPOSE(regexextract(JOIN("",ArrayFormula(RANDBETWEEN(row(INDIRECT("a1:a"&A1*A2))^0-1,9))),rept("(\d{"&A1&"})",A2)))
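To make the digit-by-digit idea from the CONCAT answer concrete outside a spreadsheet, here is a rough Python sketch (the function name is mine); at 19 digits, duplicate draws are astronomically unlikely, but you could deduplicate with a set if strict uniqueness is required:

import random

def random_19_digit_string():
    first = str(random.randint(1, 9))                             # leading digit 1-9
    rest = "".join(str(random.randint(0, 9)) for _ in range(18))  # remaining digits 0-9
    return first + rest

# ten of them, comma-separated like the desired A1 output
print(",".join(random_19_digit_string() for _ in range(10)))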
I have 6 variables 0 ≤ n₁,...,n₆ ≤ 12 and I'd like to build a hash function to do the direct mapping D(n₁,n₂,n₃,n₄,n₅,n₆) = S and another function to do the inverse mapping I(S) = (n₁,n₂,n₃,n₄,n₅,n₆), where S is a string (a-z, A-Z, 0-9).
My goal is to keep the length of S to 3 characters or less.
I thought that, since the variables have 13 possible values, a single letter (a-z) should be able to represent 2 of them, but I realized that a simple sum collides (1 + 12 = m and 2 + 11 = m), so I still don't know how to write such a function.
Is there any approach to build a function that does this mapping and returns a small string?
Using the whole ASCII range to represent S is an option if necessary.
You can convert a set of numbers in any given range to numbers in any other range using base conversion.
Binary is base 2 (0-1), decimal is base 10 (0-9). Your 6 numbers are base 13 (0-12).
Checking whether a conversion would be possible involves counting the number of possible combinations of values for each set. With each number in the range [0,n] (thus base n+1), we can go from all 0's to all n's, thus each number can take on n+1 values and the total number of possibilities is (n+1)^numberCount. For 6 decimal digits, for example, it would be 10^6 = 1000000, which checks out, since there are 1000000 possible numbers with (at most) 6 digits, i.e. numbers < 1000000.
Lower- and uppercase letters and numbers (26+26+10) would be base 62 (0-61), but, following from the above, 3 such values would be insufficient to represent your 6 numbers (13^6 > 62^3). To do conversion from/to these, you can do the conversion to a set of base 62 numbers, then have appropriate if-statements to convert 0-9 <=> 0-9, a-z <=> 10-35, A-Z <=> 36-61.
You can represent your data in 3 bytes (since 256^3 >= 13^6), although these wouldn't necessarily be printable characters - 32-126 is considered the standard printable range (which is still too small of a range), 128-255 is the extended range and may not be displayed properly in any given environment (to give the best chance of properly displaying it, you should at least avoid 0-31 and 127, which are control characters - you can convert 0-... to the above ranges by adding 32 and then adding another 1 if the value is >= 127).
Many/most languages allow you to give a numeric value to represent a character, so it should be fairly simple to output the result once you do the base conversion, although some languages use Unicode to represent characters, which could make it a bit less trivial to work with ASCII.
If the numbers had specific constraints, that would reduce the number of possible combinations, thus possibly making it fit into a smaller set or range of numbers.
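A quick numeric check of the counting argument above (plain Python arithmetic; the 62^4 line is an extra data point showing four base-62 characters would be enough):

print(13**6)    # 4826809  combinations of the six base-13 values
print(62**3)    # 238328   too few: 3 base-62 characters are insufficient
print(62**4)    # 14776336 enough: 4 base-62 characters would suffice
print(256**3)   # 16777216 enough: 3 full bytes suffice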
To do the actual base conversion:
It might be simplest to first convert it to a regular integral type (typically binary or decimal), where we don't have to worry about the base, and then convert it to the target base (although first make sure your value will fit in whichever data type you're using).
Consider how binary works:
1101 is 13 = 2^3 + 2^2 + 2^0
13 % 2 = 1    13 / 2 = 6
 6 % 2 = 0     6 / 2 = 3
 3 % 2 = 1     3 / 2 = 1
 1 % 2 = 1
The remainders above, read from bottom to top, give 1101 = our number.
Using the same idea, we can convert to/from any base as follows: (pseudo-code)
int convertFromBase(array, base):
    output = 0
    for each i in array
        output = base*output + i
    return output

int[] convertToBase(num, base):
    output = []
    while num > 0
        output.append(num % base)
        num /= base
    output.reverse()
    return output
You can also extend this logic to situations where each number is in a different range by changing what you divide or multiply by at each step (a detailed explanation of that is perhaps a bit beyond the scope of the question).
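Putting the pieces together, here is a rough, runnable Python version of the pseudocode above, combined into the direct and inverse mappings the question calls D and I (the function names direct/inverse and the base-62 alphabet ordering - digits, then a-z, then A-Z, matching the 0-9 / 10-35 / 36-61 scheme above - are my choices, not the only ones):

ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

def convert_from_base(digits, base):
    value = 0
    for d in digits:
        value = base * value + d
    return value

def convert_to_base(value, base):
    digits = []
    while value > 0:
        digits.append(value % base)
        value //= base
    digits.reverse()
    return digits or [0]

def direct(n1, n2, n3, n4, n5, n6):
    # pack the six base-13 values into one integer, then re-express it in base 62
    packed = convert_from_base([n1, n2, n3, n4, n5, n6], 13)
    return "".join(ALPHABET[d] for d in convert_to_base(packed, 62))

def inverse(s):
    packed = convert_from_base([ALPHABET.index(c) for c in s], 62)
    digits = convert_to_base(packed, 13)
    return [0] * (6 - len(digits)) + digits   # left-pad back to six values

print(direct(1, 2, 3, 4, 5, 6))            # at most 4 characters, since 62^4 > 13^6
print(inverse(direct(1, 2, 3, 4, 5, 6)))   # [1, 2, 3, 4, 5, 6]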
I thought as the variables have 13 possible values, a single letter
(a-z) should be able to represent 2 of them
This reasoning is wrong. In fact, to represent two variables (i.e. any combination these variables might take) you will need 13 x 13 = 169 symbols.
For your example, the 6 variables can take 13^6 (= 4826809) different combinations. In order to represent all possible combinations you will need 5 letters (a-z), since 26^5 (= 11881376) is the smallest power of 26 that yields more than 13^6 combinations.
For full 8-bit characters, 3 symbols should suffice, since 256^3 > 13^6.
If you are still interested in code that does the conversion, I will be happy to help.
I am quite new to KDB+ and have a question about generating random numbers.
Let's say I want to create num unique random numbers.
When I use this:
q)10?10
q)-10?10
I get 10 random numbers on the first line and 10 unique random numbers on the second line (range from 0 to 9).
When I want to introduce a variable like this:
q)num:10
q)num?10 / works
q)-num?10 / doesn't work
The generation of unique randoms does not work.
What's the correct syntax for this?
Thanks in advance
The leading minus is only read as part of a number when it is attached directly to a numeric literal (as in -10); with a variable, use the neg keyword instead. This will give you num unique numbers between 0 and 9:
q)(neg num)?10