Unreal Engine: How to solve the problem of using Broadcast() of a Delegate in a C++/WinRT async function? [closed]

I am currently building voice recognition using C++/WinRT and binding it into Unreal Engine. Because I want my Blueprint widget to show the words when they are recognized, I created a DECLARE_MULTICAST_DELEGATE in my C++/WinRT file and use a function to Broadcast() it to notify the widget when words are recognized. Here is the code:
VoiceRecog.h
VoiceRecog.cpp-1
VoiceRecog.cpp-2
BP_WinRTVoiceRec (created from VoiceRecog.cpp)
The problem is that when words get recognized, it crashes. I checked the log and it said:
It seems like it did enter the function and broadcast, but got stuck at the bound event…
I've tested this logic by triggering the ActivateDispatcher() function from a normal Blueprint Actor, and it works fine… so I believe the problem is definitely with this async function.
Does anyone know how to solve this?
Thank you!

The problem is that you are firing the event from a thread that is not the game thread, and then trying to play audio from that thread. The audio system does not allow you to play sound from any thread other than the game thread.
It is generally good practice to make sure Blueprint events are fired from the game thread only, because you don't know what a consumer might try to do in that event, and many game framework elements can only be accessed from the game thread.
You can instead use the AsyncTask construct to fire the event on the game thread:
#include "Async/Async.h"
AsyncTask(ENamedThreads::GameThread, [=]() {
OnRecognized.Broadcast();
});
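For context, here is a minimal sketch of how that call might sit inside the C++/WinRT recognition handler. The class name (AVoiceRecog), the handler signature, and the idea of passing the recognized text are assumptions based on the question, not the actual VoiceRecog code from the post:

#include "Async/Async.h"
#include <winrt/Windows.Media.SpeechRecognition.h>

// Hypothetical handler attached to the WinRT continuous recognition session.
// WinRT invokes this on a background thread, never the game thread.
void AVoiceRecog::OnResultGenerated(
    winrt::Windows::Media::SpeechRecognition::SpeechContinuousRecognitionSession const& /*session*/,
    winrt::Windows::Media::SpeechRecognition::SpeechContinuousRecognitionResultGeneratedEventArgs const& args)
{
    // Copy the recognized text into an Unreal type before hopping threads.
    const FString RecognizedText(args.Result().Text().c_str());

    // Re-enter the game thread before touching anything a Blueprint consumes.
    AsyncTask(ENamedThreads::GameThread, [this, RecognizedText]()
    {
        // If the delegate carries the text, pass RecognizedText here instead.
        OnRecognized.Broadcast();
    });
}

Note that capturing this assumes the owning object outlives the queued task; in a real actor you would typically capture a TWeakObjectPtr and check it before broadcasting.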

Related

What is the difference between react-navigation's StackNavigator and SwitchNavigator? [closed]

As of this writing, the docs don't provide a description of SwitchNavigator's purpose.
I've tried using StackNavigator and SwitchNavigator interchangeably and I personally cannot spot a difference, although I'm certain there is one.
Can anyone explain the added benefit of SwitchNavigator over StackNavigator, or a scenario where one might be used over the other?
Here is a description from React Navigation:
The purpose of SwitchNavigator is to only ever show one screen at a time. By default, it does not handle back actions and it resets routes to their default state when you switch away. This is the exact behavior that we want from the authentication flow: when users sign in, we want to throw away the state of the authentication flow and unmount all of the screens, and when we press the hardware back button we expect to not be able to go back to the authentication flow. We switch between routes in the SwitchNavigator by using the navigate action.
Source: https://reactnavigation.org/docs/auth-flow.html

How can I detect touch input events using hooks in Windows 10? [closed]

I am unable to figure out how I can detect touch events made anywhere on a touch screen monitor using hooks. Is that even possible?
This may not be exactly what you're looking for, but there are a few ways to set hooks:
SetWinEventHook (active accessibility hook). Pros: it is a supported, high-level way to get events from Windows. Cons: it can slow down applications, especially if you are running an "out of context" hook.
SetWindowsHookEx. Pros: very low-level hooking into applications. Cons: it doesn't support out-of-context hooks, so you need to write your own IPC, and it is also sometimes unreliable (e.g. sometimes you miss events in Command Prompt).
Looking through the first API, I don't see anything specific to touch (although I would encourage you to grab the most recent Windows SDK and look at the different events). You could, however, simply look for cursor position changes to know where the user most recently touched.
The second API may give you the kind of control you want, because you can use a WH_CALLWNDPROC hook to trap touch events. Then again, a window only receives touch-related messages if it marks itself as touch-aware, so even this may not do what you want.
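To make the cursor-position idea concrete, here is a minimal sketch, not a drop-in solution, assuming a plain Win32 console app and that a tap on a touch screen also moves the system cursor. It uses SetWinEventHook with EVENT_OBJECT_LOCATIONCHANGE filtered to OBJID_CURSOR:

#include <windows.h>
#include <cstdio>

// Reports system-wide cursor position changes; on a touch screen a tap also
// moves the cursor, so this is a coarse proxy for "where the user last touched".
void CALLBACK WinEventProc(HWINEVENTHOOK, DWORD event, HWND, LONG idObject,
                           LONG, DWORD, DWORD)
{
    if (event == EVENT_OBJECT_LOCATIONCHANGE && idObject == OBJID_CURSOR)
    {
        POINT pt;
        if (GetCursorPos(&pt))
            printf("Cursor/touch now at (%ld, %ld)\n", pt.x, pt.y);
    }
}

int main()
{
    // WINEVENT_OUTOFCONTEXT delivers events to this process; no DLL injection needed.
    HWINEVENTHOOK hook = SetWinEventHook(
        EVENT_OBJECT_LOCATIONCHANGE, EVENT_OBJECT_LOCATIONCHANGE,
        nullptr, WinEventProc, 0 /* all processes */, 0 /* all threads */,
        WINEVENT_OUTOFCONTEXT);
    if (!hook)
        return 1;

    // The callback is dispatched while this thread pumps messages.
    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

    UnhookWinEvent(hook);
    return 0;
}

The out-of-context flag keeps everything in your own process at the cost of the events being delivered asynchronously, which is usually fine for "where did the user last touch" purposes.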

Xamarin Forms Bluetooth state change event [closed]

In my Xamarin Forms application I want to handle some functionality when the Bluetooth state changes. Is there any event that will fire when Bluetooth is turned on or off? Please help me.
Xamarin Forms doesn't have this capability built in, so you will need to use a third-party package or create your own native handlers for each platform and use dependency injection to access them from Forms. Some good links to get you started:
https://forums.xamarin.com/discussion/15794/ble-bluetooth-low-energy-cross-platform-support
And this is an open-source project with example code for interacting with Bluetooth on two platforms. Bluetooth and BLE are different; WP8 does not support BLE, but I believe UWP does, though that's only in preview.
https://github.com/xamarin/Monkey.Robotics/tree/master/Source/Platform%20Stacks

What's the best approach for handling a template picker in Cocoa? [closed]

I have been dealing with this situation for a while now. I have been programming in Objective-C for about four years, but only for the iPhone platform. I am currently working on my first Mac OS X application and would like some help on best practices for handling a template picker when the app launches, and more specifically the transition between the template picker view and the main window. For example, in the Pages application you select a template in the first view that is loaded, and after you have chosen your template that view disappears completely and you get a new view that is specifically for editing.
I currently have an application that loads the first view fine, allows selection in an IKImageBrowserView, and loads a separate view, but I don't know if this is the best approach, and I also can't seem to get the view with the image browser in it to disappear after I load the second view. If someone could shed some light on this situation for me it would be most appreciated.
There is not much wrong with what you're doing. A basic difference between iPhone and Mac programming is that on the iPhone we use UIKit and on the Mac we don't; most (though not all) of the classes used in Mac programming are prefixed with NS. Keep in mind that in Mac programming a single button launches a single action or calls a single method, unlike the iPhone SDK where you can connect a button to multiple method calls.
My suggestion is to show a working/busy indicator, and when the first view is done, don't close it there; instead fade its alpha down and load the next view. When the next view is loaded (keep its (x, y) position exactly the same), hide the previous view. That way the transition will look like a single, continuous app.
Hope this helps.

Mac: force another application to use a specific audio device [closed]

As the question already suggests, I'm trying to figure out whether it's possible to force another application/process to use a specific output device.
Not every application has a dropdown menu to select which audio device should be used, and such applications therefore always use the default device. When running multiple applications this isn't always the desired behavior. Plus, even if an application does have a device selection, it's almost always buried deep within the menus.
I was thinking of something like faking a change of the default device, but only for one application.
Looking forward to your answers :)
Greetings
Audio Hijack Pro already does this and it works great. Rogue Amoeba are the experts at Mac OS X audio.
