Unity 5.2 Animator window / tab - animation

When I double-click an animator controller to open it, the Animator tab appears, but when I enter Play mode I don't get the usual live flow; I only get a static view of the states and the transition arrows between them. My parameters don't show the changes they go through either.
I have multiple animations and can switch between them when certain game conditions occur, but nothing shows in the window when I do, so I can't see the flow of control, what happens, what goes wrong, the switching, the progress bar, and so on.
I'm on the latest Unity 5.2.0f3, so I wondered whether it's just me or whether others are having a similar problem...

What you need to do is this: once you hit Play in the editor (with the Animator window docked on one side, of course), click the object in the Hierarchy whose animation flow you want to analyse. The Animator window will then start showing the active states and the progress bar.
Also, after upgrading to Unity 5.2, it is worth checking the values that were previously set on your transitions, for example "if vSpeed is greater than 0.1 then start walking". All my values had been messed up, i.e. changed.

Related

Whenever I move the camera on the left side of the screen, the cursor appears in-game (Unreal Engine 5)

The issue:
The mouse keeps escaping full-screen whenever I jump, move, rotate the camera, or keep spinning to the left. But only on the left side of the screen: not the right, not the middle, not the bottom, only the left side.
This has happened in other Unreal Engine 5 games as well, but not in every single one of them.
What I've tried so far is:
Disabled the Set bShowMouseCursor nodes. [No effect.]
Set up Event BeginPlay > Set Input Mode Game Only > Get Player Controller > Set Show Mouse Cursor (unchecked). [No effect.]
Created a new project from a blank template, then packaged it for testing. [Mixed results: it slightly reduced how often the mouse showed up on the left side of the screen.]
Set up Event BeginPlay > Set Input Mode Game and UI > Get Player Controller > Set Show Mouse Cursor (unchecked). [Mixed results: I had to hold right-click to move the camera, during which the cursor stopped showing, but as soon as I released it the cursor reappeared, the camera could no longer be moved, and the cursor sat permanently on the left side of the screen.]
Changed both the software and hardware cursors, but the cursor that appears is unchanged, so it's not the game's cursor showing up in-game; it's the actual Windows cursor.
Tested every option of Default Viewport Mouse Lock Mode.
Deleted both the Saved and Intermediate folders.
Someone told me it has something to do with how the mouse is locked (or not) to the viewport, and that this might be why the cursor escapes full-screen.
I don't have two monitors. The game is in full-screen. The issue persists in windowed mode as well. No widgets.
Tested it in a packaged build and as a standalone game. (The issue persisted.)
The issue happens only if I click the left mouse button; if I play the game without clicking it, the cursor does not appear in-game.
Update:
It appears that going into the game exe's compatibility settings and changing the high-DPI scaling behaviour override to "Application" fixes the problem, but it reduces the FPS. This is a temporary fix, but I need a permanent one.
This is a weird issue.
What is on the left side of your desktop?
Do you use any apps that have an overlay?
If you have any such apps, try disabling them temporarily at startup from Task Manager, restart, and check again.
In UE5, have you tried setting mouse capture to true, especially in fullscreen? Sometimes the mouse can escape otherwise; see the sketch below.
Are you using a standard resolution/aspect ratio?
Some non-standard resolutions and aspect ratios can change the way the base OS responds to captured/non-captured input.
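If you want to try forcing capture from code, here is a minimal C++ sketch of that idea, assuming a custom player controller class (AMyPlayerController is a placeholder name, not something from the project above); the same thing can be done with the equivalent Blueprint nodes or, depending on your engine version, via the Default Viewport Mouse Capture Mode project setting that sits next to the Mouse Lock Mode setting you already tested. Whether this cures the escaping cursor is not guaranteed.

#include "GameFramework/PlayerController.h"

void AMyPlayerController::BeginPlay()
{
    Super::BeginPlay();

    // Keep the OS cursor hidden while the game has focus.
    bShowMouseCursor = false;

    // Game-only input mode. SetConsumeCaptureMouseDown(true) makes the click
    // that causes the viewport to capture the mouse be swallowed by the capture
    // instead of also being forwarded to player input.
    FInputModeGameOnly InputMode;
    InputMode.SetConsumeCaptureMouseDown(true);
    SetInputMode(InputMode);
}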

wxWidgets: programmatically move to the next input control

I originally had code that set the focus to the first widget in a dialog, in the onInit method. But there was a problem with it: if I pressed TAB, focus did move to the next control (a wxTextCtrl), which got the blue 'focus' colour, but the 'focus' highlight was not removed from the previously focused widget. So it looked as if both the first and the second control had focus at the same time...
When cycling manually (by pressing TAB) full circle (to the last control and then wrapping around to the first), suddenly everything worked: when moving focus from the first control to the next, the first one visually lost focus (the blue colour was removed) as it should. From then on, only one item had the focus colour/highlight.
So instead of setting focus on the first control, I tried a different approach: I set the focus to the last control in the dialog, which is always the OK button. Next, I want to emulate programmatically that a TAB key press is received by the dialog. So I wrote this (inside Dialog::onInit):
m_buttonOK->SetFocus();
wxKeyEvent key;
key.SetEventObject(this);
key.SetEventType(wxEVT_CHAR);
key.m_keyCode=WXK_TAB;
ProcessWindowEvent(key);
Now the focus indeed moves away from the OK button, but it does not wrap around to the first control.
Only when I manually press TAB after the dialog has opened does the first item get focus.
Question: why does this wrapping around to set focus on first widget not work with the code shown above?
First of all, your initial problem is almost certainly related to not calling event.Skip() in one of your event handlers; see the note in the wxFocusEvent documentation.
Second, you can't send wx events to the native windows: they don't know anything about them. In this particular case you can use wxWindow::Navigate() to do what you want, but generally speaking what you're doing simply can't, and won't, work reliably.
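To illustrate both points, a minimal sketch; the handler name OnSetFocus and its binding are assumptions for the example, not code from the question:

// If you install your own focus handlers, always let default processing run too,
// otherwise the previously focused control can keep its highlight:
void MyDialog::OnSetFocus(wxFocusEvent& event)
{
    // ... custom handling ...
    event.Skip();   // let wxWidgets do its normal focus bookkeeping
}

// In Dialog::onInit, instead of synthesising a TAB key event:
m_buttonOK->SetFocus();
m_buttonOK->Navigate(wxNavigationKeyEvent::IsForward);  // moves focus to the next
                                                        // control in the tab order,
                                                        // wrapping to the first one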

Pausing an animation whilst Unity UI is active

I have two modals that I'm designing to go over my project in Unity: one for a quick tutorial and one for more information about the project. I also have an animation playing in the background.
I'd like the animation to pause whenever the modals are active and sitting over the top of it; if you can't see the animation, why have it playing? Exactly.
My code so far doesn't work, and I'm unsure why.
using UnityEngine;
using System.Collections;

public class ModalPauseController : MonoBehaviour
{
    public GameObject tutorialModal;
    public ScrubberAnimationController nac;

    public void TutorialModalIsOpen()
    {
        if (tutorialModal.activeInHierarchy == true)
        {
            Debug.Log("The tutorial panel is active!!");
            nac.UserClickedPauseButton();
        }
    }
}
nac.UserClickedPauseButton refers to another script, which pauses the animation.
So far, whenever I toggle the active state of the modals, whether via the Inspector or in game, I get no feedback in my debug log.
It's quite tricky writing code like that because, if you think about it, the code itself will not be active!
As a rule, say you have a panel "X" which is a popup or something like that, something which comes and goes. The rule is that you must have the "controller" for that thing on some other object, not on the panel itself!
Here's an example:
The "megabomb" is a popup sign which goes on and off. There's a script which controls the megabomb's on-offness.
Notice there is a "wrapper" for the megabomb item. In UI, the "wrapper" is very simply an empty panel; it is invisible. Note that the wrapper always stays active.
The script goes on the wrapper, and that's it.
Note that very commonly the "megabomb" item itself sits on a grey transparent background, so that when the popup appears the game behind goes all grey and you have the popup over that. In that case the grey background would be the object labelled "megabomb" in this example, and the actual popup (green or whatever) panel would sit below it. (Indeed, the popup panel itself will very likely have many parts: images, buttons, borders, etc.) Do not try to use the grey background as the wrapper which holds the script. You need a completely separate wrapper, outside of everything, which does nothing other than hold the script.
(BTW, this is yet another thing which trips up experienced developers who are new to Unity. Normally you'd have "some other class" run the popup, but there is "no other class" in Unity; everything is a component on a game object, since it is a component system rather than normal OO. You literally have to have another game object (the one labelled 'wrapper' in the example) which controls the game object in question (the one labelled 'megabomb' in the example), AND it has to be always active.)
Note: if you prefer, the "controller script" can actually live anywhere else (it does not have to be on a game object "holding" the item in question). However, certainly at first, I encourage you to use the wrapper arrangement shown here; it has many advantages (not to mention that it might ultimately become a prefab, etc.). Definitely do that when starting out.
It's just one of those things they don't mention about Unity: very often, many things in your scene will need a "wrapper" object. (The same is true for other reasons, such as positioning, relative rotations, and so on.)
BTW, as I mention in a comment: if you are making a popup, it must be done with Unity UI (i.e. click "Add Canvas", always remember to set it to 'Scale With Screen Size', then click "Add Panel", then click "Add Text").

mouse moved events are not detected by NSView

I am trying to make a simple application in which there is an empty red rectangle, and whenever the mouse moves over the upper half of the rectangle's border the cursor becomes a closed hand.
I started by selecting the Foundation command-line tool project template, made a transparent NSWindow, embedded an NSView containing the rectangle in it, and made the window accept mouse-moved events (via -setAcceptsMouseMovedEvents:). I have overridden -canBecomeKeyWindow and -canBecomeMainWindow to return YES. But somehow none of the -mouseMoved: events are being received by the NSView.
When I put the same code into a Cocoa application project, creating my window in the -applicationDidFinishLaunching: method, my view was able to receive -mouseMoved: events.
Why is it not receiving mouse-moved events when I use the Foundation command-line tool project?
I have also observed that whenever I make a window (Carbon or Cocoa) from the Foundation command-line tool project, the window doesn't become key even when I click the title bar: the title bar stays light grey instead of becoming dark grey. Why is this happening?
I have overridden -canBecomeKeyWindow and -canBecomeMainWindow of NSWindow to return YES.
I would agree with what Joshua has already said. Any application that is going to show a user interface, be it a faceless background process or one which shows up in the Dock, should be in the form of an application bundle, not a plain old Mach-O executable like the one the Foundation tool template creates.
Also, there are reasons why views do not respond to mouseMoved: events by default:
Mouse-moved events can quickly flood the event queue.
There is generally little reason to use mouseMoved:, as tracking areas are far more effective and efficient.
A while back, I wrote a little test app that demonstrates the difference between these two approaches: moving your mouse around the upper view for roughly 20 seconds results in 1000 events, while the lower view, which uses tracking areas, receives fewer than 50.
Sample GitHub project: https://github.com/NSGod/MouseMoved-vs-TrackingAreas
Again, as Joshua mentioned, it would be helpful if you could describe what you're trying to accomplish. If your app needs to be a background app (LSUIElement == 1) and present an interface without appearing in the Dock, then there are ways to do that (as Josh mentioned, a command-line, non-bundled app is not the way).
You have no event loop to detect events and pass them to your window because your program does not start an NSApplication. See the main.m file of a typical Cocoa application.
It might be helpful to describe what you're trying to accomplish by taking this approach. My guess is you're building a daemon but want a GUI interface to manage the otherwise "headless" daemon. That or you're building a new login management system. In either case, there are specific ways to do both and this isn't it. :-)

How does one use onmousedown/onmouseup correctly?

Whenever I write mouse-handling code, the onmousedown/onmouseup/onmousemove model always seems to force me to produce unnecessarily complex code that still ends up causing all sorts of UI bugs.
The main problem, which I see even in major pieces of software these days, is the "ghost mouse" event where you drag to outside the window and then let go. Once you return to the window, the application still thinks you have the mouse down even though the button is up. This is especially annoying when you're trying to highlight something that goes to the border of the screen.
Is there a RIGHT way to write mouse code or is the entire model just flawed?
Ordinarily one captures the mouse on mouse down, so that the mouse-move and mouse-up events go through your code regardless of the cursor moving out of your application window.
More recently this has become a problem when running in a VM or remote session: it's difficult for apps in those environments to track the mouse outside the machine's screen area, which is just a window on the host.
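In a desktop context, that capture pattern looks roughly like the following Win32 C++ sketch (the window-procedure plumbing and the g_dragging flag are assumptions for the example, not code from the question):

#include <windows.h>

static bool g_dragging = false;

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_LBUTTONDOWN:
        SetCapture(hwnd);      // route all mouse input to this window,
        g_dragging = true;     // even when the cursor leaves it
        return 0;

    case WM_MOUSEMOVE:
        if (g_dragging)
        {
            // update the drag / selection here
        }
        return 0;

    case WM_LBUTTONUP:
        g_dragging = false;
        ReleaseCapture();      // release is paired with the button-up,
        return 0;              // so no "ghost mouse" state is left behind

    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}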
I'm not sure what environment you're attempting to track mouse buttons in, but the best way to handle this is to have a mouse listener that tracks onmouseup 100% of the time after you've detected onmousedown.
That way, it doesn't matter what screen region the user releases the mouse button in. It will reset no matter where it happens.
