IB: Use Auto Layout? - xcode

I am slowly becoming more comfortable with programming in a Mac environment after moving across from Windows where I was using C#.
The transition has been smooth so far, but with a very steep learning curve getting used to Objective-C. I am finding Objective-C an amazing spin on a C-style language and am so happy I made the move.
I am using Xcode 4.6 on OS X 10.8.2, targeting iOS 6.1.2.
The application I'm working on is an OS X checklist that will need to communicate with an iOS equivalent.
The OS X side of it comes first, and it includes an NSOutlineView used as a sidebar.
I have been having a little trouble positioning controls and getting them to size the way I want.
I have just found a switch in Xcode's IB that may solve the issue.
In the File Inspector there is a switch for "Use Auto Layout", which is currently ticked. When I build the project, I get an application that runs.
As soon as I remove the tick and rebuild, the application crashes with the following errors, plus more:
2013-02-24 17:00:17.988 ServiceCheck[1633:303] *** Assertion failure in -[NSTableRowData insertRowsAtIndexes:withRowAnimation:], /SourceCache/AppKit/AppKit-1187.34/TableView.subproj/NSTableRowData.m:5408
2013-02-24 17:00:17.989 ServiceCheck[1633:303] An uncaught exception was raised
2013-02-24 17:00:17.989 ServiceCheck[1633:303] insertRowsAtIndexes:withRowAnimation: can not happen while updating visible rows!
I am having trouble working out what's needed to resolve the error so that I can manually size the controls to my liking.
To help, I can attach my code (once I work out how) so you can better understand what I am seeing.

Related

Weird Android Emulator and Mac tap-to-click sensitivity issue

I'm experiencing a really weird and frustrating issue with the Android Emulator on macOS Monterey.
I have "tap to click" enabled on my Macbook Pro (Mid 2015 15"), and it works fine in all other apps. But somehow, when the emulator window is active it seems to miss almost every other tap. If I click hard instead of tapping, it catches every click. The tap sensitivity in the Trackpad settings is set to "light".
So, it seems that the emulator window is somehow less sensitive to tapping than all other apps. I don't even know how this is possible, is there even such a thing as app-specific tap-sensitivity??
What's more, it's not only the emulator window itself that has this issue, but the emulator settings window as well. If I tap the "Enable clipboard sharing" toggle, it misses about 50% of the taps. If I click hard, it catches them 100%. If I try the same in some other app (tested with the "System Preferences" window), it catches 100% of the taps.
I have tested and tested this again to make sure I'm not biasing the results, but there really is a difference, and it's driving me nuts. I think it appeared after updating to Monterey, but not 100% sure of the exact timing correlation.
Any ideas??
My problem was really similar. I am using a Mac with an Apple mouse, and I was able to fix it by disabling the mouse wheel in the Android Emulator's Extended Controls.
Hope that helps.
I've noticed the same issue some time ago. Unfortunately, I didn't find any solutions.
However, there are a couple of good enough workarounds:
Launch the emulator in a tool window. This is the default approach in modern versions of Android Studio anyway; to enable/disable it, check Preferences -> Tools -> Emulator -> Launch in a tool window.
Use an alternative emulator. For instance, Genymotion doesn't have this issue.
I ran into a similar issue, and the solution I found was enabling "Tap to click" in System Preferences -> Trackpad.
I am new to the Android Emulator, but am experiencing the same issue in Ubuntu, even though I have tap-to-click disabled in the OS. I hate tap-to-click, so having an ultra-sensitive-to-touch Android screen emulated on my laptop is beyond frustrating.
Looking at the documentation, I came across the SOURCE_CLASS_POINTER input source class, whose documentation states:
The input source is a pointing device associated with a display. Examples: SOURCE_TOUCHSCREEN, SOURCE_MOUSE. A MotionEvent should be interpreted as absolute coordinates in display units according to the View hierarchy. Pointer down/up indicated when the finger touches the display or when the selection button is pressed/released. Use getMotionRange(int) to query the range of the pointing device. Some devices permit touches outside the display area so the effective range may be somewhat smaller or larger than the actual display size.
Reading that, I've come to believe this may actually be the default behavior: touchpad events are being interpreted as SOURCE_TOUCHSCREEN input rather than SOURCE_TOUCHPAD or SOURCE_MOUSE.
Unfortunately, I don't have a solution as much as a workaround:
I plugged in a mouse and tested pointer up/down movements over the screen, which that part of the documentation suggests should register as a press. However, with the mouse it only responds to clicks. That suggests the mouse is indeed being interpreted as a SOURCE_MOUSE-controlled pointer and not a SOURCE_TOUCHSCREEN-controlled pointer.
So unless we can find out how to make the AVD interpret a touchpad as a touchpad, and not a touchscreen, using a mouse seems like the best solution. (A probe like the sketch below can confirm which source class events actually arrive with.)
For reference, I'm including this link to the AVD manual: https://developer.android.com/studio/run/emulator
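If you want to check this on the guest side yourself, here's a minimal NDK probe. This is only a sketch: it assumes a native_app_glue-based app, the handler name is mine, and all it does is log whether each event arrives as a touchscreen or a mouse source.
#include <android/input.h>
#include <android/log.h>
#include <android_native_app_glue.h>

// Illustrative input handler for a native_app_glue app; install it with
// app->onInputEvent = probe_input_source; in android_main().
static int32_t probe_input_source(struct android_app* app, AInputEvent* event) {
    (void)app; // unused in this probe
    int32_t source = AInputEvent_getSource(event);
    if ((source & AINPUT_SOURCE_TOUCHSCREEN) == AINPUT_SOURCE_TOUCHSCREEN) {
        __android_log_print(ANDROID_LOG_DEBUG, "InputProbe", "touchscreen event");
    } else if ((source & AINPUT_SOURCE_MOUSE) == AINPUT_SOURCE_MOUSE) {
        __android_log_print(ANDROID_LOG_DEBUG, "InputProbe", "mouse event");
    }
    return 0; // not consumed; default handling continues
}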
UPDATE: Somehow over the period of about 18 hours and several restarts, my AVD no longer does tap-to-click on its virtual screen. It would be very hard to pinpoint exactly what changed because I've been updating packages frequently since I'm running a pre-alpha release of Ubuntu, but I think it's from using X11 instead of Wayland.
Which got me thinking: you could try changing your display server from Cocoa to X11. Thankfully, MacPorts, the macOS counterpart of the FreeBSD Ports Tree, makes it fairly easy to build this software from source. It contains build recipes for multi-platform Unix-like software, much like Homebrew but often allowing for more customization.
That tap issue was annoying enough that this is probably worth a shot.
(From the MacPorts website:) "The X11 windowing environment, for ports that depend on the functionality it provides to run. You have multiple choices for an X11 server." https://www.macports.org/install.php
I would build them in this order:
MacPorts: X11 - If you build it, you'll have a bunch of libraries already
MacPorts: QEMU - use the configure menu to select GTK3+. If there's no option for X11, note that -L/opt/X11/lib and -lX11 are linker flags rather than make options, so after you install X11, pass them through the linker (pointing them at your X11 lib dir):
make LDFLAGS="-L/opt/X11/lib -lX11"
Lastly, MacPorts: Android Platform tools
Related StackOverflow Q/As:
Compiling a C program that uses OpenGl in Mac OS X
Running x11 on Mac OS

Qt + VTK fails to run on Surface Pro X

I have an application I have been working on for the past year. It builds and runs normally on various machines (including macOS). I just got a Surface Pro X and was surprised to run into issues when rendering in a QVTKOpenGLWidget. The application runs, but when I open a window with a QVTKOpenGLWidget, it crashes with an access violation error in Qt5Gui.dll.
I changed from QVTKOpenGLWidget to QVTKOpenGLNativeWidget, which seemed to improve the situation somewhat, but I can't render anything. The window opens and stays open as long as I don't add any actor or call to render.
From looking at the logs, Qt seems to have trouble creating a context. The VTK logs keep popping up the following error:
vtkGenericOpenGLRenderWindow (0CAD3408): GLEW could not be initialized: Missing GL version
I have tried the following with no positive results:
// First attempt: request a compatibility-profile context as the default
auto format = QVTKOpenGLWidget::defaultFormat();
format.setProfile(QSurfaceFormat::CompatibilityProfile);
QSurfaceFormat::setDefaultFormat(format);

// Second attempt: use the native widget's recommended default format
QSurfaceFormat::setDefaultFormat(QVTKOpenGLNativeWidget::defaultFormat());
Has anyone figured out how to get this working? I don't think the problem is specific to the Surface Pro X; it likely affects other Windows GPU drivers as well.
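For anyone else digging into this: my understanding from VTK's documentation is that the default surface format has to be set before the QApplication is constructed for it to take effect. Below is a sketch of that ordering, plus a software-OpenGL fallback I have not verified on the Surface Pro X (Qt::AA_UseSoftwareOpenGL forces Qt's software rasterizer, sidestepping the native GPU driver at a performance cost):
#include <QApplication>
#include <QSurfaceFormat>
#include <QVTKOpenGLNativeWidget.h>

int main(int argc, char** argv) {
    // Per VTK's docs, set the default format before QApplication exists.
    QSurfaceFormat::setDefaultFormat(QVTKOpenGLNativeWidget::defaultFormat());

    // Untested fallback: avoid the native GL driver entirely.
    QCoreApplication::setAttribute(Qt::AA_UseSoftwareOpenGL);

    QApplication app(argc, argv);
    // ... create the window holding the QVTKOpenGLNativeWidget here ...
    return app.exec();
}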

Minimizing OpenGL app while preserving EGL Context results in HUGE PROBLEM

I'm using the OpenGL ES 3.0 API with the Android Studio NDK to create apps.
But I've encountered a huge problem. I created a demo app; all it does is change the background color of the screen from white to black and vice versa, every frame. When I minimize this app, I still see it rendering the background, mostly at the edges of the screen, not in full color but still very apparent. And it doesn't go away when I close the app, when I restart the device, or when I force-kill the app. Only a factory data reset fixes the issue, so it's not easy for me to debug this.
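For reference, the per-frame drawing amounts to this sketch (the function name and parameters are illustrative, not my exact code):
#include <EGL/egl.h>
#include <GLES3/gl3.h>

// Alternate the clear color between white and black on successive frames,
// then present the frame.
static void draw_frame(EGLDisplay display, EGLSurface surface) {
    static int frame = 0;
    const float v = (frame++ % 2 == 0) ? 1.0f : 0.0f;
    glClearColor(v, v, v, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    eglSwapBuffers(display, surface);
}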
This is the relevant code that I'm using for when the app is minimized and receives the APP_CMD_TERMINATE event:
// Detach the context, then destroy the window surface.
eglMakeCurrent(engine->display, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
eglDestroySurface(engine->display, engine->surface);
engine->display = EGL_NO_DISPLAY;
engine->surface = EGL_NO_SURFACE;
I've error-checked that eglDestroySurface() succeeds.
And I've put in debug messages to make sure that the main draw loop is NOT executing while the app is minimized. But the problem persists and I don't know what to do about it. Thanks for any help.
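For completeness, here is the fuller teardown I could switch to. This is a sketch that assumes the engine struct also keeps the EGLContext, and I haven't verified whether it changes the artifact:
// Destroy the surface AND the context, then release the display
// connection, so no EGL resources outlive the minimized app.
if (engine->display != EGL_NO_DISPLAY) {
    eglMakeCurrent(engine->display, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
    if (engine->surface != EGL_NO_SURFACE)
        eglDestroySurface(engine->display, engine->surface);
    if (engine->context != EGL_NO_CONTEXT)
        eglDestroyContext(engine->display, engine->context);
    eglTerminate(engine->display);
}
engine->display = EGL_NO_DISPLAY;
engine->context = EGL_NO_CONTEXT;
engine->surface = EGL_NO_SURFACE;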
UPDATE: well, no one has responded, and I still don't know what to do. Could it be related to threads?
UPDATE: Still can't determine what it is, but for some reason it's messing with the System UI. Willing to upload my entire source code somewhere if someone would be willing to go through this with me, as I'd really like to be able to continue working on my game engine.
Is it the "Strict Mode" developer option on the device settings, perhaps?
That one flashes the screen if an app is blocking.
It would explain why a factory reset changes behaviour.
The answer above is not the solution here. The comment by the user columbo was correct.
I've demoed switching from black to white at high frame rates on three different Android devices, and also on my Linux desktop, all via the OpenGL API, and the issue appeared on every one of them. So what he said must be correct: this is an artifact of LCD panel technology itself. Interestingly, displaying completely random colors does not cause the problem.

Configure platform page with platform specific options

We've begun evaluating Xamarin for an upcoming project involving both iOS and Android, with the overriding intention of producing a single UI layer (and some shared code, obviously). (I'm also new to C#.)
TL;DR
I've begun exploring Xamarin on iOS. I started with the Phoneword example and it worked well enough.
The first issue I found was running the code on the iPhone X, which I was able to solve by using MainPage.On<Xamarin.Forms.PlatformConfiguration.iOS>().SetUseSafeArea(true); in the platform App class
While testing this, I noticed some issues with the ListView not scrolling properly (the core issue was actually with the platform padding).
I then used (MainPage as Xamarin.Forms.NavigationPage).On<iOS>().EnableTranslucentNavigationBar(); to enable translucent navigation bars (as we're targeting iOS 11+) and now everything appears under the navigation bar.
This is easily fixed in Xcode, and after some research I've found that I need to use the UIKit.UIViewController.EdgesForExtendedLayout property. The immediate problem I'm facing is that the only "snippets" of code I can find are from the "View is displayed under status bar in iOS 7 and EdgesForExtendedLayout doesn't help" forum post.
Issue at hand...
The example solution snippets posted seem to be making use of a platform (iOS) specific solution. The problem is, I want to keep using the "cross-platform" code in the "platform" project and simply provide some custom configuration for the iOS platform which can apply these states.
I understand it could be possible to use a renderer, but this seems counterproductive, as I'd need one for both iOS and Android, when the cross-platform page is doing just fine as it is.
I understand that I could set up a DependencyService, but it seems annoying to have to include a specific "configuration" service just to solve this issue for iOS.
I was hoping it might be possible to set up an iOS Page that "overrides" some of the functionality of the cross-platform page and gives me access to things like viewDidLoad, so I can apply the iOS-specific configuration on a page-by-page basis. We could keep the cross-platform page as it is, but when running under iOS it would give me access to the iOS life cycle of the actual view...
I've been trying to search the documentation and tutorials and haven't yet come across anything which would seem to do this or something similar (not to say there isn't one, but I'm just not finding it).

Can I turn off saving in a Document Based app? (Swift for OSX)

I'm trying to make an extremely simple note-taking app for OSX: one that can have multiple windows open and where I can quickly write down something. I don't want to store anything anywhere.
Most importantly: it should not nag about saving on quitting the app.
I'm nearly there, but I am stuck at turning off saving.
Any ideas if this is possible for a Document Based swift-app?
(using Swift and Xcode, complete NOOB at this)
There are different types of applications; you can specify which when you start a new project.
Untick "Create Document-Based Application" when you create the project and it won't nag you about saving anything.
