ARCore Unity debugging

For example, I try to print to the console and nothing appears. The script is attached to my main camera, so we can rule that out, and yes, the script is active. Any help will be appreciated. What I am working on right now: I want to tap on a certain part of my GameObject in AR and, while my phone is connected to the computer, see the name of the part I tapped in the console.
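
For reference, the tap-and-log behaviour described above is usually handled with a raycast from the touch position into the scene. Here is a minimal sketch, assuming each part of the model has its own Collider; the TapLogger name and the arCamera field are placeholders for illustration, not anything from the original project:

    using UnityEngine;

    public class TapLogger : MonoBehaviour
    {
        // Placeholder field: assign whichever camera renders the AR scene in the Inspector.
        public Camera arCamera;

        private void Update()
        {
            // Only react when a new touch begins.
            if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
                return;

            // Cast a ray from the touch position into the scene.
            Ray ray = arCamera.ScreenPointToRay(Input.GetTouch(0).position);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                // Appears in the device log (visible via logcat when the phone is connected).
                Debug.Log("Tapped part: " + hit.collider.gameObject.name);
            }
        }
    }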

When you run the app on an Android device, the log does not go to the Unity Editor console. In order to see it, you need to open Android Studio and use the Logcat tab to view the device log.
Alternatively, you can run logcat from the command line:
adb logcat
More information: https://developer.android.com/studio/command-line/logcat.html
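Unity tags its messages on Android, so the device log can be narrowed down to Unity output only, for example:
adb logcat -s Unity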

I have worked with ARCore and had the same frustrating experience when it comes to debugging.
If you just want console prints, then use Log Viewer, which can catch and show them on Android too. (https://assetstore.unity.com/packages/tools/log-viewer-12047)
However, because I wanted more control and the ability to test and debug my game logic right inside the Editor (without deploying to the phone all the time), I wrote a little plugin that allows me to do just that. This plugin simulates the operation of ARCore inside the Unity Editor. You can just hit Play and ARCore will be simulated for you, so you can freely develop and debug your game logic. Moreover, you can then build and deploy the project without changing anything, and ARCore will work like normal on your phone.
Using it is very similar to native ARCore, so you will not have much difficulty getting into it. It does not cover ALL features of ARCore yet, but it covers the basics. You can still use native ARCore for the rest.
You can find it here: https://github.com/VR-House/Eazy-ARCore-Interface

In current versions of Unity you can output logs from connected devices to the Unity Editor. To do this, make a Development Build of your project and connect the Console window to your device.
But I would recommend a more advanced way of testing AR in the Editor, with a plugin I wrote for my own project and then decided to release so everyone else can benefit from it.
AR Foundation Editor Remote plugin:
https://forum.unity.com/threads/ar-foundation-editor-remote-test-and-debug-your-ar-project-in-the-editor.898433/

I use debugging like this in a C# script:
Debug.Log("Debug message and image name " + Image.Name);
To see this in real time, I use Android Device Monitor (it's in the Android SDK folder, usually \Users\AppData\Local\Android\sdk\tools\monitor.bat). If a device is connected, it appears in the devices list and can be selected.
Or if I want to see the debug log on the device, I put these lines in the script:
private void OnGUI()
{
    GUIStyle style = new GUIStyle();
    style.fontSize = 50;
    GUI.Box(new Rect(350, 0, 500, 500), Image.Name, style);
}
In Rect(350, 0, 500, 500), the first two values are the position and the last two are the width and height of the rectangle.

Related

Has anyone got camera errors on a Windows machine after using the camera for OpenCV projects?

I was working on an OpenCV project. There I used OpenCV to get video captures and then applied some algorithms for face detection and face swapping. After working on this project for about two weeks, I got errors in my camera (the Windows Camera application). I have attached one error, but sometimes I get different errors as well. I want to know: did I get those errors because of this OpenCV project?
If your OpenCV application is still running, the Windows Camera app cannot access the camera at the same time. If that is not the cause, a basic reset always saves lives!
Press Windows key + I to open the Settings app.
Select System > Apps & features and find Camera in the list.
Click the Advanced options link.
Then press the Reset button and wait for the process to finish.
Secondly, check whether the camera works properly in Device Manager.
Lastly, try to open it from another app (like Skype or Discord). Otherwise, update or roll back the camera driver.

How to debug when using Google VR (Daydream) and Unity?

I'm using the latest Google VR SDK and Unity 5.6. I've got a Daydream headset/controller and I'm trying to develop a game. I've been stuck on a problem for a while now, which in a typical Unity environment I should be able to solve very quickly, but because I'm forced to build and run the code on the device each time I want to test, I'm unable to view the console or see any errors/warnings that are thrown.
Any idea how I can debug using Unity, or even emulate the Daydream controller/headset within Unity? I've seen that a controller emulator exists, but it appears you still have to run on the device; in that scenario you need two phones, one acting as the controller and another as the 'screen'.
Thanks
Check out this asset: Log Viewer.
Using this asset you can see the same log console in the Unity Editor at runtime. I don't know about Daydream, but it worked for me on Android, Oculus, and Gear VR.

Display app icon on top of all windows in Android Wear?

I am developing a Wear app, and I want the app icon to be displayed in a window on top of the watch face. I have tried using WindowManager but failed to get the result.
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        ViewGroup.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_PHONE,
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
        PixelFormat.TRANSLUCENT);
params.gravity = Gravity.CENTER | Gravity.CENTER;
params.x = 160;
params.y = 160;
windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
windowManager.addView(mfloatview, params);
My question: is it possible to add a floating app icon in Android Wear?
I'm not sure if this feature (displaying an app icon on top of all windows) is available. But you can try checking the documentation on Watch Face Complications.
A complication is any feature in a watch face that displays more than hours and minutes. For example, a battery indicator is a complication. The Complications API is for both watch faces and data provider apps.
Since you are using Android Wear 2.0, the Complications API will be available for use.
Hope this helps.
I tried this one and it works for Android Wear also. Apply the same concept to Wear and it works:
http://www.androidhive.info/2016/11/android-floating-widget-like-facebook-chat-head/

Programming selectable Pebble Watchfaces (Time)

I have started programming Pebble watchfaces on my Pebble Time, and although the watchface displays fine, it is loaded as an application, not a watchface: it is available in the list of applications, not in the watchfaces sub-menu. I found that there are online watchface development sites that will produce actual watchfaces that get installed as such.
What is the difference? What would I have to add to the code, or to the manifest file, to make it a watchface? The samples with the SDK were all for applications, not watchfaces.
Thanks :-)
Watch apps can do more things than watch faces can. For example, watch faces can't respond to button clicks. The SDK (CloudPebble or the C SDK) generates different code for the two types of programs. So you have to tell it what you are trying to build.
In the appinfo.json file there is a section like:
"watchapp": {
    "watchface": false
},
Change false to true and it should start showing up as a watch face.

Visual recognition mouse-click automation utility?

Is anybody aware of any kind of simple Mac OS X utility that can take a given image and find whether the image is on the screen (perhaps with a certain variance threshold) and then position the mouse and/or click on the area that matches the image? Please don't respond about how this is a terrible idea and shouldn't be done. This is an important task for testing and cannot be easily accomplished by triggering events or the like.
Project SIKULI:
"Sikuli is a visual technology to automate and test graphical user interfaces (GUI) using images (screenshots). Sikuli includes Sikuli Script, a visual scripting API for Jython, and Sikuli IDE, an integrated development environment for writing visual scripts with screenshots easily. Sikuli Script automates anything you see on the screen without internal API's support. You can programmatically control a web page, a Windows/Linux/Mac OS X desktop application, or even an iphone or android application running in a simulator or via VNC."
www.sikuli.org
Free for Mac OS X
