Low FPS Oculus Quest game - performance

I have a problem with a VR game I am developing in Unity3D: when I build the project and play it on my Oculus Quest 2, it does no more than 40-45 fps with 20 ms of latency.
I followed many YouTube tutorials and disabled all the graphics settings like shadows, lights, post-processing, etc.
I think the problem is in the code, but I am not sure; analyzing the performance, I suspect the issue could be the GC Alloc.
I have 2.2 KB of GC allocations.
I would like to reach 72 fps and 8-10 ms of latency.
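For context on the GC Alloc suspicion: even a couple of kilobytes allocated every frame in Update() will trigger periodic garbage collections, and on the Quest's mobile CPU each collection can show up as a frame spike. A minimal sketch of the usual per-frame culprits and their allocation-free alternatives (the component and buffer names below are illustrative, not from the question):

    using UnityEngine;

    // Hypothetical MonoBehaviour showing common sources of per-frame
    // GC Alloc and their allocation-free alternatives.
    public class AllocationExample : MonoBehaviour
    {
        // Preallocated buffer, reused every frame instead of allocating a new array.
        private readonly RaycastHit[] hits = new RaycastHit[16];

        private void Update()
        {
            // BAD: Physics.RaycastAll allocates a fresh array on every call,
            // which shows up as GC Alloc in the Profiler.
            // RaycastHit[] allHits = Physics.RaycastAll(transform.position, transform.forward);

            // BETTER: the NonAlloc variant writes into the preallocated buffer.
            int count = Physics.RaycastNonAlloc(
                new Ray(transform.position, transform.forward), hits);

            // BAD: string concatenation allocates a new string each frame.
            // Debug.Log("hit count: " + count);

            if (count > 0)
            {
                // ... react to the hits without allocating ...
            }
        }
    }

The Profiler's CPU view shows the GC Alloc per frame; enabling Deep Profile narrows it down to the call sites responsible, so you can confirm whether the 2.2 KB comes from your scripts or from a plugin.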

Related

Metal API - GPU FPS is 0

I have been playing around with Signed Distance Functions and the frame rate dropped to 30 fps, so I took a look at the debugger in Xcode
and realized that all the processing seems to be done on the CPU, while the GPU appears not to be running at all.
Almost all of my code lives in a Metal file as a compute shader; the CPU only sets up and launches the app.
What could possibly be happening here? Is there any way for me to test and inspect this issue?
I am using macOS 10.12.2 and Xcode 8.3.2.
You shouldn't pay too much attention to those gauges; they're both lying to you. The GPU meter always reports 0 utilization on some AMD GPUs, despite the fact that your SDF raymarcher is probably quite taxing on the GPU. The apparently high CPU utilization is actually caused by the fact that the frame time is calculated from the beginning of the frame to the end, rather than the amount of time the CPU is actually busy (e.g., if the GPU takes 30 ms to complete the frame, that will show up as ~30ms on the CPU, even though it was mostly idle during that time). Notice that the CPU utilization is actually only ~3% on the left; this is a more accurate reflection of how little work you're doing to encode the frame.
In short: the gauges are unreliable. Your shader is expensive, and that's why your frame rate is suffering.

Very low fps, except when running something else in the background

So, I provide development support for a private server of a game called Metin2.
The game has low requirements, so it runs pretty smoothly, but there is a certain zone in the game where, seemingly at random and instantly, the fps drops from 40 to 0.1 and the game looks like a PowerPoint presentation.
The solution the community has come up with (by pure luck and coincidence) is to run Counter-Strike 1.6 in the background (another game would probably also work), and then the game runs smoothly.
So basically, my question is: how does consuming more CPU and RAM actually improve the fps in that zone of the game? The game processes are independent.
Is the zone a high-load area, and does this occur for all users?
I ask because it reminds me of some frame-spike issues I've had before, which were related to Intel SpeedStep/C1M/turbo/something-something BIOS CPU options.
The short of it is that in some 3D applications, at times when load was low, it would be too low: whichever feature decided that less performance was needed would underclock the CPU, causing spikes. Running another game in the background keeps the CPU busy enough that it never downclocks, which would explain the workaround. This was some years ago now, but maybe it's a worthwhile train of thought.

What is the Android standard or benchmark for battery drain by an application?

I was working on my phone-book application and recorded the battery drain for 1 hour.
It drained 5% in 1 hour. I want to know whether there is an Android standard or benchmark for the battery drain of an application over a given time.
There really is no answer for this, because your app had barely any effect on the battery during that time.
The device is on and awake, powering the processor at speed, and generating and displaying graphics. Your app is doing very few calculations compared to what is going on behind the scenes.
Battery life also varies by device, battery health, backlight level, Wi-Fi, Bluetooth, NFC, and other factors, one of which is your app, which sits very low on the list of power consumers.
Until you start calculating pi, or doing other similarly intense work, you will not see significant power consumption attributable to your app alone.

How to reduce XNA game CPU usage while nothing worth computing is happening?

A fresh XNA game project consumes quite a bit of CPU while its window is active: on my desktop PC, about 30% of one core of a 2-core processor. When the window loses focus, the game goes into idle mode and consumes about 1% of CPU.
In an image viewer application I recently made using XNA, redrawing frames when no image manipulation is going on doesn't make much sense, so I'm using the SuppressDraw() method, which, as the name suggests, suppresses spending resources on drawing the next frame and shows the last drawn frame instead. But the application still keeps wasting CPU on a very simple input update.
How do I reduce the CPU usage of an XNA application when it doesn't need much?
Quoting from this question:
According to this discussion on the Xbox Live Indie Games forum, apparently on some processors (and OSes) XNA takes up 100% CPU time on one core when the default value of Game.IsFixedTimeStep is used.
A common solution (one that worked for me as well) is to put the following in your Game constructor:
IsFixedTimeStep = false;
More details here.
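Putting the two pieces together, a minimal sketch of such a Game subclass might look like this (the sceneDirty flag is hypothetical, standing in for whatever signals that the displayed image actually changed):

    using Microsoft.Xna.Framework;

    public class ViewerGame : Game
    {
        // Standard template field; keeps the device manager alive.
        private readonly GraphicsDeviceManager graphics;
        private bool sceneDirty = true; // hypothetical: set when a redraw is needed

        public ViewerGame()
        {
            graphics = new GraphicsDeviceManager(this);
            IsFixedTimeStep = false; // avoid the 100% busy-wait on one core
        }

        protected override void Update(GameTime gameTime)
        {
            // ... handle input; set sceneDirty = true when the image changes ...
            if (!sceneDirty)
                SuppressDraw(); // keep showing the last drawn frame
            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.CornflowerBlue);
            // ... draw the current image ...
            sceneDirty = false;
            base.Draw(gameTime);
        }
    }

With IsFixedTimeStep disabled, Update runs as fast as the OS schedules it rather than spinning to hit the fixed tick, and SuppressDraw keeps the idle path cheap.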

Windows Phone 7 Frame Rate Performance

Reading Jeff Wilcox on frame rate counters, I realized my application rarely hits 60 fps. I'm not satisfied with the overall performance of my app (compared to its iPhone counterpart), but the numbers seem weird to me.
When the app is doing nothing, even just after launch, it is sometimes at 0 fps, and the highest I hit is 50 fps.
Overall, my application is not blazing fast, but not really slow either. So how should I interpret the numbers? How can I spot the issue that gives my app a bad fps?
A low frame rate doesn't necessarily indicate poor performance.
If you're testing on an actual device and you see poor performance, an investigation may uncover a problem that is also what's impacting the frame rate.
Don't worry too much about getting a high frame rate all the time. Focus on actual performance experienced by the user.
If the actual performance is poor and the frame rate is low, that's when you should worry about the frame rate.
What's important is testing on an actual device and what performance is like there.
Jeff Wilcox notes in his post that:
Frame rate counters may be 0 when there is no animation being updated on the thread at any particular moment. You can add a very simple, continually animating and repeating, animation to your application during development & testing if you want to ensure that there is always some frame rate value available.
So the 0 fps reading is not an issue, since no screen updates needed to be rendered at that moment.
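If you want the counter to always have a value during development, a minimal sketch of such a continually repeating animation might look like this (assuming a Silverlight page with an element named "spinner"; the names are illustrative, not from the post):

    using System;
    using System.Windows;
    using System.Windows.Media;
    using System.Windows.Media.Animation;

    public partial class MainPage
    {
        // Debug-only helper: spin an on-screen element forever so the
        // frame rate counter always has an animation to measure.
        private void StartDebugAnimation()
        {
            var rotate = new RotateTransform();
            spinner.RenderTransformOrigin = new Point(0.5, 0.5);
            spinner.RenderTransform = rotate;

            var animation = new DoubleAnimation
            {
                From = 0,
                To = 360,
                Duration = new Duration(TimeSpan.FromSeconds(1)),
                RepeatBehavior = RepeatBehavior.Forever // never stops
            };

            Storyboard.SetTarget(animation, rotate);
            Storyboard.SetTargetProperty(animation, new PropertyPath("Angle"));

            var storyboard = new Storyboard();
            storyboard.Children.Add(animation);
            storyboard.Begin();
        }
    }

Remember to remove (or compile out) such a helper before shipping, since it exists only to keep the counters honest during testing.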
