Time.time in Unity

I saw a video about how to move a cube the way the snake moves in a Snake game.
In this video ( https://www.youtube.com/watch?v=aT2zNLSFQEk&list=PLLH3mUGkfFCVNs51eK8ftCAlI3hZQ95tC&index=11 ) he declares a float named **lastMove** with no value (zero by default), uses it in a condition, subtracts it from Time.time, and then assigns Time.time to **lastMove**.
My question is: what is the effect of **lastMove** in the condition when it has no value?
If I remove it from the if statement the game runs fast, but if it stays in the if statement time passes much more slowly.

What he does is continuously check whether time - lastMove is bigger than a given predefined interval (timeBetweenMoves). Time keeps increasing each frame while lastMove stays fixed, so at some point the condition becomes true. When it does, he updates lastMove with the current value of time to "reset the loop", i.e. to make the difference smaller than the interval again. The point of doing this is to move only at a fixed interval (0.25 secs) instead of every frame. Like this:
interval = 0.25 (timeBetweenMoves)
time (secs) | lastMove | time - lastMove
-----------------------------------------
0.00 | 0 | 0
0.05 | 0 | 0.05
0.10 | 0 | 0.10
0.15 | 0 | 0.15
0.20 | 0 | 0.20
0.25 | 0 | 0.25
0.30 | 0 | 0.30 ---> bigger than interval: MOVE and set lastMove to this (0.30)
0.35 | 0.30 | 0.05
0.40 | 0.30 | 0.10
0.45 | 0.30 | 0.15
0.50 | 0.30 | 0.20
0.55 | 0.30 | 0.25
0.60 | 0.30 | 0.30 ---> bigger than interval: MOVE and set lastMove to time (0.60)
0.65 | 0.60 | 0.05
0.70 | 0.60 | 0.10
...
This is kind of a throttling.
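Here is a minimal C# sketch of that pattern (the names timeBetweenMoves and Move() follow the description above; the exact code in the video may differ):
using UnityEngine;

public class SnakeMovement : MonoBehaviour
{
    public float timeBetweenMoves = 0.25f; // fixed interval between moves, in seconds
    private float lastMove;                // no initial value, so it defaults to 0

    void Update()
    {
        // Time.time grows every frame while lastMove stays fixed,
        // so the difference eventually exceeds the interval.
        if (Time.time - lastMove > timeBetweenMoves)
        {
            Move();               // take one step
            lastMove = Time.time; // "reset the loop"
        }
    }

    void Move()
    {
        // hypothetical step: move the cube one unit forward
        transform.position += Vector3.forward;
    }
}
Because lastMove defaults to 0, the very first Time.time - lastMove is just Time.time. That is also why removing lastMove from the if statement (leaving Time.time > timeBetweenMoves) makes the game run fast: after the first 0.25 secs that condition is true on every frame, so the cube moves every frame.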

Related

How to do proper performance testing and analysis of alacritty and zsh

I've been working on my setup lately and have been trying to determine where my 2.3 s terminal load times are coming from. I'm fairly new to Linux performance testing in general, but I have determined a few things.
The first thing I should mention is that terminal is a shell script with the following contents:
#!/bin/sh
WINIT_X11_SCALE_FACTOR=1.5 alacritty "$@"
The stats on launching the terminal program (alacritty) and its shell (zsh -l):
> perf stat -r 10 -d terminal -e $SHELL -slc exit
Performance counter stats for 'terminal -e /usr/bin/zsh -slc exit' (10 runs):
602.55 msec task-clock # 0.261 CPUs utilized ( +- 1.33% )
957 context-switches # 1.532 K/sec ( +- 0.42% )
92 cpu-migrations # 147.298 /sec ( +- 1.89% )
68,150 page-faults # 109.113 K/sec ( +- 0.13% )
2,188,445,151 cycles # 3.504 GHz ( +- 0.17% )
3,695,337,515 instructions # 1.70 insn per cycle ( +- 0.08% )
791,333,786 branches # 1.267 G/sec ( +- 0.06% )
14,007,258 branch-misses # 1.78% of all branches ( +- 0.09% )
10,893,173,535 slots # 17.441 G/sec ( +- 0.13% )
3,574,546,556 topdown-retiring # 30.5% Retiring ( +- 0.11% )
2,888,937,632 topdown-bad-spec # 24.0% Bad Speculation ( +- 0.41% )
3,125,577,758 topdown-fe-bound # 27.1% Frontend Bound ( +- 0.16% )
2,189,183,796 topdown-be-bound # 18.4% Backend Bound ( +- 0.47% )
924,852,782 L1-dcache-loads # 1.481 G/sec ( +- 0.07% )
38,308,478 L1-dcache-load-misses # 4.16% of all L1-dcache accesses ( +- 0.09% )
3,445,566 LLC-loads # 5.517 M/sec ( +- 0.20% )
725,990 LLC-load-misses # 20.97% of all LL-cache accesses ( +- 0.36% )
2.30683 +- 0.00331 seconds time elapsed ( +- 0.14% )
The stats on launching just the shell (zsh):
Performance counter stats for '/usr/bin/zsh -i -c exit' (10 runs):
1,548.56 msec task-clock # 0.987 CPUs utilized ( +- 3.28% )
525 context-switches # 323.233 /sec ( +- 21.17% )
16 cpu-migrations # 9.851 /sec ( +- 11.33% )
90,616 page-faults # 55.791 K/sec ( +- 2.63% )
6,559,830,564 cycles # 4.039 GHz ( +- 3.18% )
11,317,955,247 instructions # 1.68 insn per cycle ( +- 3.69% )
2,351,473,571 branches # 1.448 G/sec ( +- 3.46% )
46,539,165 branch-misses # 1.91% of all branches ( +- 1.31% )
32,783,001,655 slots # 20.184 G/sec ( +- 3.18% )
10,776,867,769 topdown-retiring # 32.5% Retiring ( +- 3.28% )
5,729,353,491 topdown-bad-spec # 18.2% Bad Speculation ( +- 6.90% )
11,083,567,578 topdown-fe-bound # 33.3% Frontend Bound ( +- 2.34% )
5,458,201,823 topdown-be-bound # 15.9% Backend Bound ( +- 4.51% )
3,180,211,376 L1-dcache-loads # 1.958 G/sec ( +- 3.10% )
126,282,947 L1-dcache-load-misses # 3.85% of all L1-dcache accesses ( +- 2.37% )
14,347,257 LLC-loads # 8.833 M/sec ( +- 1.48% )
2,386,047 LLC-load-misses # 16.33% of all LL-cache accesses ( +- 0.77% )
1.5682 +- 0.0550 seconds time elapsed ( +- 3.51% )
The stats on launching the shell (zsh) with zmodload zsh/zprof:
num calls time self name
-----------------------------------------------------------------------------------
1) 31 78.54 2.53 77.09% 50.07 1.62 49.14% antigen
2) 2 23.24 11.62 22.81% 15.93 7.96 15.63% compinit
3) 2 7.31 3.66 7.18% 7.31 3.66 7.18% compaudit
4) 1 8.27 8.27 8.12% 7.29 7.29 7.16% _autoenv_source
5) 1 6.93 6.93 6.80% 6.93 6.93 6.80% detect-clipboard
6) 1 5.18 5.18 5.08% 5.18 5.18 5.08% _autoenv_hash_pair
7) 1 2.49 2.49 2.45% 2.45 2.45 2.41% _zsh_highlight_load_highlighters
8) 2 1.01 0.51 0.99% 1.01 0.51 0.99% _autoenv_stack_entered_contains
9) 10 0.91 0.09 0.89% 0.91 0.09 0.89% add-zsh-hook
10) 1 0.94 0.94 0.92% 0.87 0.87 0.85% _autoenv_stack_entered_add
11) 1 0.85 0.85 0.84% 0.85 0.85 0.84% async_init
12) 1 0.49 0.49 0.49% 0.49 0.49 0.48% _zsh_highlight__function_callable_p
13) 1 0.45 0.45 0.44% 0.45 0.45 0.44% colors
14) 3 0.38 0.13 0.37% 0.35 0.12 0.35% add-zle-hook-widget
15) 6 0.34 0.06 0.34% 0.34 0.06 0.34% is-at-least
16) 2 15.14 7.57 14.86% 0.27 0.13 0.26% _autoenv_chpwd_handler
17) 1 5.46 5.46 5.36% 0.26 0.26 0.26% _autoenv_authorized_env_file
18) 1 0.23 0.23 0.22% 0.23 0.23 0.22% regexp-replace
19) 11 0.19 0.02 0.19% 0.19 0.02 0.19% _autoenv_debug
20) 2 0.10 0.05 0.10% 0.10 0.05 0.10% wrap_clipboard_widgets
21) 16 0.09 0.01 0.09% 0.09 0.01 0.09% compdef
22) 1 0.08 0.08 0.08% 0.08 0.08 0.08% (anon) [/home/nate-wilkins/.antigen/bundles/zsh-users/zsh-autosuggestions/zsh-autosuggestions.zsh:458]
23) 2 0.05 0.02 0.05% 0.05 0.02 0.05% bashcompinit
24) 1 0.06 0.06 0.06% 0.04 0.04 0.04% _autoenv_stack_entered_remove
25) 1 5.50 5.50 5.40% 0.03 0.03 0.03% _autoenv_check_authorized_env_file
26) 1 0.04 0.04 0.04% 0.03 0.03 0.03% complete
27) 1 0.88 0.88 0.87% 0.03 0.03 0.03% async
28) 1 0.03 0.03 0.02% 0.03 0.03 0.02% (anon) [/usr/share/zsh/functions/Misc/add-zle-hook-widget:28]
29) 2 0.01 0.00 0.01% 0.01 0.00 0.01% env_default
30) 1 0.01 0.01 0.01% 0.01 0.01 0.01% _zsh_highlight__is_function_p
31) 1 0.00 0.00 0.00% 0.00 0.00 0.00% _zsh_highlight_bind_widgets
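(For reference, the report above comes from the standard zsh/zprof pattern: load the module before everything else in ~/.zshrc and print the report at the very end.)
# first line of ~/.zshrc
zmodload zsh/zprof
# ... the rest of the config ...
# last line of ~/.zshrc
zprof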
Lastly I have a perf run with a corresponding flamegraph:
perf-run --out alacritty --command "terminal -e $SHELL -slc exit"
But I'm not sure how to interpret the flamegraph since it seems to have everything in it and not just the command that was run.
So my question is:
What is taking up the most time in my terminal setup and is there another approach I could use to better determine where the problem is coming from?
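One way to narrow the flamegraph down to just the shell startup (a sketch of standard perf usage; the perf-run wrapper may be recording system-wide) is to record the command directly:
perf record -g -- zsh -i -c exit
perf report
That records only the zsh process tree, so the resulting profile should contain the shell startup alone rather than everything running on the system.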

Cumulative return for the following data

I was reading a blog on how to calculate the cumulative return of a stock for each day.
The formula described in the blog for the cumulative return was
(1 + TodayReturn) * (1 + Cumulative_Return_Of_Previous_Day) - 1, but I am still not able to reproduce the cumulative return it provided.
Can someone please make clear how the cumulative return has been calculated in the table given below? That would be a lot of help.
Thanks in advance.
| Days | Stock Price | Return | Cumulative Return|
---------------------------------------------------
| Day 1 | 150 | | |
| Day 2 | 153 | 2.00 % | 2.00 % |
| Day 3 | 160 | 4.58 % | 6.67 % |
| Day 4 | 163 | 1.88 % | 8.67 % |
| Day 5 | 165 | 1.23 % | 10.00 % |
---------------------------------------------------
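Working through the table with that formula (small discrepancies come from the returns being rounded to two decimals):
Day 2: cumulative return = first return = 2.00 %
Day 3: (1 + 0.0458) * (1 + 0.0200) - 1 = 0.0667 -> 6.67 %
Day 4: (1 + 0.0188) * (1 + 0.0667) - 1 = 0.0868 -> 8.67 % (up to rounding)
Day 5: (1 + 0.0123) * (1 + 0.0867) - 1 = 0.1001 -> 10.00 % (up to rounding)
Equivalently, the cumulative return on any day is Price_that_day / Price_Day1 - 1, which matches the table exactly: 160/150 - 1 = 6.67 %, 163/150 - 1 = 8.67 %, 165/150 - 1 = 10.00 %.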

Using bash to extract numbers and convert to CSV file

I am quite new to using bash for extraction and I am not sure what search terms to look for. I would like to extract data for some variables from a very large log file.
Sample of logfile
temp[min,max]=[ 24.0000000000000 .. 834.230000000000 ]
CHANGE working on TEMS
RMS(TEMS)= 6.425061887244621E-002 DIFMAX: 0.896672707535103
765 1 171
CHANGE working on PHI
RMS(PHI )= 1.92403467949391 DIFMAX: 62.3113693145351
765 1 170
CHANGE working on TEMP
RMS(TEMP)= 6.425061887244621E-002 DIFMAX: 0.896672707535103
765 1 171
PMONI working
TIMSTP working
COPEQE working : INFO
DELT = 630720000.000000 sec
Courant-Number in x,y,z:
Max. : 5.05 , 0.00 , 6.93
Min. : 0.00 , 0.00 , 0.00
Avg. : 0.568E-02, 0.00 , 0.383
PROBLEM: Courant-Number(s) greater than 1 : 11.9802093558263
max. TEMP-Peclet in X: 653 1
170
max. TEMP-Peclet in Y: 653 1
170
Temperature-Peclet-Number in x,y,z:
Max. : 0.357 , 0.00 , 0.313E-01
Min. : 0.00 , 0.00 , 0.00
Avg. : 0.307E-03, 0.00 , 0.435E-03
Temperature-Neumann-Number in x,y,z:
Max.: 64.9 , 64.9 , 64.9
Min.: 0.619E-02, 0.619E-02, 0.619E-02
Avg.: 35.5 , 35.5 , 35.5
PROBLEM: Temp-Neumann-Number greater than 0.5 : 194.710793368065
(Dominating: Courant-Number)
DRUCK working
KOPPX working
#########################################################################
STRESS PERIOD: 1 1
1 of 100 <<<<<
Time Step: 50 ( 1.0% of 0.315E+13 sec )(0.631E+09 sec )
#########################################################################
### Continues on ###
I managed to extract the lines relating to the variables I am looking for using bash.
grep -A 3 'Courant-Number in x,y,z' logfile.log > courant.txt
grep -A 2 'Max.' courant.txt > courant_max.txt
to get this...
Max. : 0.146E+04, 0.00 , 0.169E+04
Min. : 0.00 , 0.00 , 0.00
Avg. : 1.15 , 0.00 , 0.986
--
Max. : 0.184E+04, 0.00 , 0.175E+04
Min. : 0.00 , 0.00 , 0.00
Avg. : 1.13 , 0.00 , 1.05
--
Max. : 0.163E+04, 0.00 , 0.172E+04
Min. : 0.00 , 0.00 , 0.00
Avg. : 1.13 , 0.00 , 1.17
I would like to convert this data to a CSV file with the following columns, thus making a total of 9 columns.
Max_x | Max_y | Max_z | Min_x | Min_y | Min_z | Avg_x | Avg_y | Avg_z
I would like to continue to use bash to get this data. Any inputs will be most appreciated.
Thanks!
You've got a good start. I had a much worse solution a bit earlier, but then I learned about paste -d.
grep -A 3 'Courant-Number in x,y,z' logfile.log |
grep -A 2 'Max.' |
grep -v -- '--' |
sed 's/^.*://' |
paste -d "," - - - |
sed 's/ *//g'
find 'Courant-Number in x,y,z' plus the 3 following lines
keep 'Max.' plus the 2 following lines
drop the '--' group-separator lines
strip the 'Max. :' / 'Min. :' / 'Avg. :' labels (everything up to the colon)
join every three lines with commas
remove the remaining whitespace
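To also write the header row from the question, prepend it before appending the pipeline output (courant.csv is just a hypothetical output name):
printf 'Max_x,Max_y,Max_z,Min_x,Min_y,Min_z,Avg_x,Avg_y,Avg_z\n' > courant.csv
grep -A 3 'Courant-Number in x,y,z' logfile.log |
grep -A 2 'Max.' |
grep -v -- '--' |
sed 's/^.*://' |
paste -d "," - - - |
sed 's/ *//g' >> courant.csv
On the three blocks shown above, this yields rows like
0.146E+04,0.00,0.169E+04,0.00,0.00,0.00,1.15,0.00,0.986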

OR-relation in Bayesian Networks

How do you represent an OR-relation in a Bayesian Network? For example, P(A | B OR C).
I also wonder how you would calculate the probability of such an expression.
Thank you in advance!
This is not particularly well-posed, because one cannot sum over the conditioned variables in a conditional distribution. However, an example may help. Assume that B and C are binary variables and introduce a variable Z = B OR C. Let's define the following joint distribution P(A,B,C):
A B C | Z | P(A,B,C)
------+---+----------
0 0 0 | 0 | 0.02
0 0 1 | 1 | 0.22
0 1 0 | 1 | 0.06
0 1 1 | 1 | 0.08
1 0 0 | 0 | 0.18
1 0 1 | 1 | 0.24
1 1 0 | 1 | 0.17
1 1 1 | 1 | 0.03
Now, by the definition of a conditional distribution, P(A|Z) = P(A,Z)/P(Z). So, summing up terms
P(Z = 0) = 0.02 + 0.18 = 0.20
P(Z = 1) = 0.22 + 0.06 + 0.08 + 0.24 + 0.17 + 0.03 = 0.80
and P(A,Z)
A | Z | P(A, Z) | P(A | Z)
--+---+---------+---------
0 | 0 | 0.02 | 0.10
1 | 0 | 0.18 | 0.90
0 | 1 | 0.36 | 0.45
1 | 1 | 0.44 | 0.55
Notice that once we condition on Z, the two sets of terms with Z held constant each sum to 1.0.
So, in short, there isn't a generic way of calculating P(A | B or C); you need to look at the joint distribution in order to calculate the appropriate probabilities.
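A small Python sketch of the same computation, with the joint table above hard-coded (purely illustrative):
# joint distribution P(A,B,C) from the table above, keyed by (a, b, c)
joint = {
    (0, 0, 0): 0.02, (0, 0, 1): 0.22, (0, 1, 0): 0.06, (0, 1, 1): 0.08,
    (1, 0, 0): 0.18, (1, 0, 1): 0.24, (1, 1, 0): 0.17, (1, 1, 1): 0.03,
}

# Z is a deterministic function of its parents: Z = B OR C
def z(b, c):
    return int(b or c)

# marginalize out B and C to get P(Z) and P(A, Z)
p_z = {0: 0.0, 1: 0.0}
p_az = {}
for (a, b, c), p in joint.items():
    zv = z(b, c)
    p_z[zv] += p
    p_az[(a, zv)] = p_az.get((a, zv), 0.0) + p

# conditional P(A | Z) = P(A, Z) / P(Z)
for (a, zv), p in sorted(p_az.items()):
    print(f"P(A={a} | Z={zv}) = {p / p_z[zv]:.2f}")
Running it reproduces the P(A | Z) column of the table: 0.10, 0.45, 0.90 and 0.55.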

How to get a substring in awk

This is one line of the input file:
FOO BAR 0.40 0.20 0.40 0.50 0.60 0.80 0.50 0.50 0.50 -43.00 100010101101110101000111010
And an awk command that checks whether the character at a certain position of column 13 is a "1" or a "0".
Something like:
awk -v values="${values}" '{if (substr($13,1,1)==1) printf values,$1,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12,$13}' foo.txt > bar.txt
The values variable works; in the above example I just want to check whether the first bit is equal to "1".
EDIT
OK, so I guess I wasn't very clear in my question. The $13 in the substr call is in fact the bitstring. So this awk should pass through all the lines in foo.txt that have a "1" at position 1 of the bitstring in column $13. Hope this clarifies things.
EDIT 2
OK, let me break it down. The code above is an example, so the input line is one of MANY lines, and not all lines have a 1 at position 8. I've double-checked that the positions I test contain both values, so in any case I should get some output. The thing is, it doesn't find any "1"s at the positions I choose in any line, but when I say it has to find a "0" it returns all lines.
$ cat file
FOO BAR 0.40 0.20 0.40 0.50 0.60 0.80 0.50 0.50 0.50 -43.00 100010101101110101000111010
FOO BAR 0.40 0.20 0.40 0.50 0.60 0.80 1.50 1.50 1.50 -42.00 100010111101110101000111010
$ awk 'substr($13,8,1)==1{ print "1->"$0 } substr($13,8,1)==0{ print "0->"$0 }' file
0->FOO BAR 0.40 0.20 0.40 0.50 0.60 0.80 0.50 0.50 0.50 -43.00 100010101101110101000111010
1->FOO BAR 0.40 0.20 0.40 0.50 0.60 0.80 1.50 1.50 1.50 -42.00 100010111101110101000111010
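As the demo shows, substr($13, 8, 1) really does return the character at (1-based) position 8 of the 13th field, so if the matches look inverted, double-check the field number and the start position. For the original goal of keeping every line whose bitstring starts with "1", a minimal form would be:
awk 'substr($13,1,1)=="1"' foo.txt > bar.txt
(a pattern with no action prints the whole matching line).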
