I inherited a Unity project which has a lag problem when changing scenes. In the Unity editor it's about 500 milliseconds, but on a device it can take up to 2-3 seconds to change scene.
I suspect that one of the problems might be the audio files, which are played every time a button is pressed. I tried disabling some components, but I have never worked with audio files in Unity, so I don't really know what I am doing.
This is the profiler screen (sorry for the green tint):
How can I find out whether the problem actually is the audio files?
Can you spot anything in my profiler image?
I read that audio files should be kept in the StreamingAssets folder, but in this project they are stored in a general "Audio" folder. Is this correct?
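One way to narrow this down is an editor script that lists every AudioClip's import load type; large clips set to "Decompress On Load" are a common cause of long scene loads on device. A minimal sketch (the menu path and logging are just illustrative):

    using UnityEditor;
    using UnityEngine;

    // Lists every AudioClip in the project together with its import load type.
    // Clips set to "Decompress On Load" are fully decompressed into memory when
    // a scene referencing them loads, which can stall scene changes on device.
    public static class AudioImportAudit
    {
        [MenuItem("Tools/Audit Audio Import Settings")] // illustrative menu path
        public static void Audit()
        {
            foreach (string guid in AssetDatabase.FindAssets("t:AudioClip"))
            {
                string path = AssetDatabase.GUIDToAssetPath(guid);
                var importer = AssetImporter.GetAtPath(path) as AudioImporter;
                if (importer == null)
                    continue;

                AudioImporterSampleSettings settings = importer.defaultSampleSettings;
                Debug.Log($"{path}: {settings.loadType}");
            }
        }
    }

As far as I know, StreamingAssets is only needed when you want to load files by path at runtime; for short button sounds, "Compressed In Memory" is usually a better fit than moving the files there.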
All my animations were working fine while I was adding a boss fight in level 12.
I got it working, but the player ship, which I wasn't working on, was only showing its back half, as if it had been cropped! Yet it still animated as normal.
When I tried a clean build and tested level 1, the player ship was still cropped, along with some other animated nodes, like the weapon power-up symbols.
All the other animations are working ok.
Any ideas what could cause this?
I finally worked out what was wrong and how to fix it. I had made lots of atlas files, but an Xcode update didn't like the way I had made them or the location I put them in. I deleted all the atlas files and recreated them inside the Assets.xcassets catalog, using the menu option to create a sprite atlas.
Now all my animations work as they should.
I want to record the video feed captured by Vuforia and then play the scene back, allowing the tracked image marker to be enabled or disabled on playback. I know Vuforia lets me access camera properties through Vuforia.CameraDevice.Instance, but there doesn't seem to be a way to override the incoming image with a prerecorded one.
I know I could record the state (position and rotation) of the objects during the recording, but it seems more elegant for them to track in real time based on a prerecorded video feed. Thanks.
I attempted this as well, to no avail.
From: Is it possible to use Vuforia without a camera?
...but the Vuforia SDK prevents the use of any other source than the camera.
I guess the main reason for this is that camera management is handled entirely inside the Vuforia SDK, probably to make it easier to use: managing the camera ourselves is at best a boring task (lines and lines of code to repeat in each project...) and at worst a huge pain in the ass (especially on Android, where some devices don't behave as expected).
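Given that limitation, the fallback mentioned in the question (recording the pose of the tracked content and replaying it) is probably the practical route. A minimal sketch, where the class and field names are illustrative and "target" is assumed to be the content parented under the ImageTarget:

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    // Samples the pose of a tracked object every frame while recording, then
    // replays the samples on the same timeline without live tracking.
    public class PoseRecorder : MonoBehaviour
    {
        public Transform target; // e.g. the content parented under the ImageTarget

        private struct Sample
        {
            public float time;
            public Vector3 position;
            public Quaternion rotation;
        }

        private readonly List<Sample> samples = new List<Sample>();
        private bool recording;
        private float startTime;

        public void StartRecording()
        {
            samples.Clear();
            startTime = Time.time;
            recording = true;
        }

        public void StopRecording()
        {
            recording = false;
        }

        private void Update()
        {
            if (recording)
            {
                samples.Add(new Sample
                {
                    time = Time.time - startTime,
                    position = target.position,
                    rotation = target.rotation
                });
            }
        }

        public IEnumerator Playback()
        {
            float playbackStart = Time.time;
            foreach (Sample s in samples)
            {
                // Wait until this sample's timestamp, then apply the recorded pose.
                while (Time.time - playbackStart < s.time)
                    yield return null;
                target.SetPositionAndRotation(s.position, s.rotation);
            }
        }
    }

Playback would be started with StartCoroutine(recorder.Playback()); interpolating between samples would smooth out frame-rate differences between recording and playback.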
I'm making a game with Swift, and I went into the project editor to rename the project. Now when I try to run it, it just gives me a gray screen. My app is called Scene Transition, and under the Products folder both Scene Transition.app and Scene Transition Tests.xctest are red. I'm not sure if this has something to do with it.
I would suggest following the instructions in this YouTube video. In general, renaming things in Xcode without considering the configuration changes is a bad idea (this actually holds for just about every programming language and environment I can think of).
I am using a custom item renderer with an image and a line of text. I have the image set to blendMode="luminosity", and when I run it in the desktop AIR runtime to debug and test, it appears to run perfectly fine without any lag. However, if I test it on a mobile device, the list lags a ridiculous amount, to the point where it is unusable. Applying the blend mode to just one item of the list is enough to cause the lag, and it gets worse the more images use it, but even with just one it is clearly evident and practically unusable.
Does anyone know a way to improve the performance of this, or is there no workaround?
Thanks
I want to run two applications simultaneously: one that analyzes the image from a webcam, written using OpenCV (the image is acquired through a callback function), and an application that goes into fullscreen mode (say, a 3D game). The problem is that while the fullscreen application is running, the webcam image stream stops: the frames simply don't turn up, and the callback function isn't called. This seems to be an issue with OpenCV; to test it, a simple application displaying the image from the camera was prepared.
Why would the image stream be blocked by fullscreen mode? How can I bypass this?
Thanks for any hints.
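For what it's worth, the test application described above boils down to a frame-grab loop like the following. This sketch uses the OpenCvSharp bindings rather than the C/C++ API the question presumably used, but the behaviour under test is the same: if a fullscreen app blocks the stream, Read() stops delivering frames.

    using System;
    using OpenCvSharp;

    // Minimal camera probe: grabs frames in a loop and reports when they stop
    // arriving, e.g. after another application goes fullscreen.
    internal static class CameraProbe
    {
        private static void Main()
        {
            using (var capture = new VideoCapture(0)) // default webcam
            using (var frame = new Mat())
            {
                int count = 0;
                while (true)
                {
                    if (!capture.Read(frame) || frame.Empty())
                    {
                        Console.WriteLine("No frame received");
                        continue;
                    }

                    Cv2.ImShow("camera", frame); // the cvNamedWindow() equivalent
                    Console.WriteLine($"frame {count++}");

                    if (Cv2.WaitKey(1) == 27) // Esc quits
                        break;
                }
            }
        }
    }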
Your question does not say whether you have searched for the problem in the OpenCV community first, so I post this as a hint just in case: http://tech.groups.yahoo.com/group/OpenCV/
Also check the list of issues; maybe it's a known bug: https://code.ros.org/trac/opencv/report/1
I'm not an OpenCV expert, so this is closer to a suggestion than an answer, but I've experienced something similar on my multi-monitor setup, using a number of media players on the second monitor and some fullscreen apps on the first.
In my limited testing, it comes down to what method is used to render the 3D app: DirectX seems to stop media players, OpenGL doesn't.
So it might not be OpenCV that has the problem; it may be what DirectX does to the hardware during a fullscreen game.
Actually, the behaviour of the OpenCV camera stream is strange. It seems to depend on the native OpenCV window (cvNamedWindow()) that shows the output image from the webcam. If that window is on the same screen that went fullscreen, the stream continues; if the camera window is placed on another screen, the stream stops.
Another curious thing happens with screen resolution changes: if you change the screen resolution while the camera window is not visible (closed or even minimized), the image stream gets blocked.
These are just my observations on the topic; maybe they'll be helpful for someone.