In my last project I used the BLE plugin:
adapter.DeviceDiscovered += (s, a) =>
{
    myDeviceList.Add(a.Device);
};
await adapter.StartScanningForDevicesAsync();
But right now I'm just scanning for devices and adding whatever is found directly to this list.
I want this scan to run continuously, and if a device is lost it should automatically be removed from the list.
The BLE plugin has the StartScanningForDevicesAsync method, but I don't know if it is useful for me. There is also this event:
event EventHandler<DeviceErrorEventArgs> DeviceConnectionLost;
//
// Summary:
// Occurs when a device has been disconnected. This occurs on intended disconnects
// after Plugin.BLE.Abstractions.Contracts.IAdapter.DisconnectDeviceAsync(Plugin.BLE.Abstractions.Contracts.IDevice).
Is this possible?
I think you can use something like this (pseudocode + C#):
StartTimer(TimeSpan.FromSeconds(30),
    () =>
    {
        if (!isScanning)
        {
            new Handler().PostDelayed(StopScan, 10000); // stop scanning after 10 sec
            isScanning = true;
            StartScan();
        }
        return true; // returning true tells the timer to fire again
    });
Here StartScan and StopScan are the functions you use for communicating with BLE.
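To connect this back to the question, here is a rough, untested sketch of how the timer approach could also drop devices that have stopped advertising. The lastSeen dictionary, the 30-second cutoff, the Id-based duplicate check, and the use of Xamarin.Forms Device.StartTimer and adapter.IsScanning are my own assumptions layered on top of Plugin.BLE, not something the answer above prescribes:
// Assumes: using System; using System.Collections.Generic; using System.Linq; using Xamarin.Forms;
private readonly Dictionary<Guid, DateTime> lastSeen = new Dictionary<Guid, DateTime>();

private void StartContinuousScan()
{
    adapter.DeviceDiscovered += (s, a) =>
    {
        lastSeen[a.Device.Id] = DateTime.UtcNow;        // remember when this device was last advertised
        if (!myDeviceList.Any(d => d.Id == a.Device.Id))
            myDeviceList.Add(a.Device);
    };

    Device.StartTimer(TimeSpan.FromSeconds(30), () =>
    {
        // remove devices that have not been seen within the cutoff window
        var cutoff = DateTime.UtcNow - TimeSpan.FromSeconds(30);
        foreach (var device in myDeviceList.Where(d => lastSeen[d.Id] < cutoff).ToList())
        {
            myDeviceList.Remove(device);
            lastSeen.Remove(device.Id);
        }

        // restart the scan if it has stopped, so discovery keeps running
        if (!adapter.IsScanning)
            _ = adapter.StartScanningForDevicesAsync();

        return true; // keep the timer firing
    });
}
How long a device may stay silent before it counts as "lost" (30 seconds here) is a tuning decision; nothing in the plugin defines it.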
This question is about running a non-blocking, high-performance activity in NativeScript for the simple task of reading and saving raw audio from the microphone by directly accessing the hardware through the native Android API. I believe I have pushed the NativeScript framework to the edge of its capabilities, and I need experts' help.
I'm building a WAV audio recorder in NativeScript for Android. The native implementation is described here (relevant code below).
In short, this can be done by reading the audio stream from an android.media.AudioRecord buffer and then writing the buffer to a file in a separate thread, as described:
Native Android implementation
startRecording() is triggered by a button press, and starts a new Thread that runs writeAudioDataToFile():
private void startRecording() {
    // ... init Recorder
    recorder.startRecording();
    isRecording = true;
    recordingThread = new Thread(new Runnable() {
        @Override
        public void run() {
            writeAudioDataToFile();
        }
    }, "AudioRecorder Thread");
    recordingThread.start();
}
Recording is stopped by setting isRecording to false (stopRecording() is triggered by a button press):
private void stopRecording() {
    isRecording = false;
    recorder.stop();
    recorder.release();
    recordingThread = null;
}
Reading and saving the buffer stops when isRecording is false:
private void writeAudioDataToFile() {
    // ... init file and buffer
    ByteArrayOutputStream recData = new ByteArrayOutputStream();
    DataOutputStream dos = new DataOutputStream(recData);
    int read = 0;
    while (isRecording) {
        read = recorder.read(buffer, 0, bufferSize);
        for (int i = 0; i < read; i++) {
            dos.writeShort(buffer[i]);
        }
    }
}
My NativeScript (TypeScript) implementation:
I wrote NativeScript TypeScript code that does the same as the native Android code above. Problem #1 was that I can't run while (isRecording), because the JavaScript thread would be busy inside the while loop and would never be able to catch the button click that runs stopRecording().
I tried to solve problem #1 by using setInterval for asynchronous execution, like this:
startRecording() is triggered by a button press, and sets a time interval of 10ms that executes writeAudioDataToFile():
startRecording() {
    this.audioRecord.startRecording();
    this.audioBufferSavingTimer = setInterval(() => this.writeAudioDataToFile(), 10);
}
writeAudioDataToFile() callbacks are queued up every 10ms:
writeAudioDataToFile() {
    let bufferReadResult = this.audioRecord.read(
        this.buffer,
        0,
        this.minBufferSize / 4
    );
    for (let i = 0; i < bufferReadResult; i++) {
        this.dos.writeShort(this.buffer[i]);
    }
}
Recording is stopped by clearing the time interval (stopRecording() is triggered by button press):
stopRecording() {
    clearInterval(this.audioBufferSavingTimer);
    this.audioRecord.stop();
    this.audioRecord.release();
}
Problem #2: While this works well, in many cases it makes the UI freeze for 1-10 seconds (for example after clicking a button to stop recording).
I tried to change the time interval that executes writeAudioDataToFile() from 10ms down to 0ms and up to 1000ms (while using a very big buffer), but then the UI freezes were longer and I experienced loss in the saved data (buffered data that was not written to the file).
I tried to offload this operation to a separate thread by using a NativeScript worker thread as described here, where startRecording() and stopRecording() are called by messages sent to the worker like this:
global.onmessage = function(msg) {
    if (msg.data === 'startRecording') {
        startRecording();
    } else if (msg.data === 'stopRecording') {
        stopRecording();
    }
}
This solved the UI problem, but created problem #3: the recorder was not stopped on time (i.e. recording stops 10 to 50 seconds after the 'stopRecording' msg.data is received by the worker thread). I tried different time intervals in the setInterval inside the worker thread (0ms to 1000ms), but that didn't solve the problem and even made stopRecording() execute with greater delays.
Does anyone have an idea of how to perform such a non-blocking high-performance recording activity in nativescript/javascript?
Is there a better approach to solve problem #1 (javascript asynchronous execution) that I described above?
Thanks
I would keep the complete Java implementation in actual Java. You can do this by creating a Java file in your plugin folder:
platforms/android/java, so maybe something like:
platforms/android/java/org/nativescript/AudioRecord.java
In there you can do everything threaded, so you won't be troubled by the UI being blocked. You can call the Java methods directly from NativeScript for starting and stopping the recording. When you build your project, the Java file will automatically be compiled and included.
You can generate typings from your Java class by grabbing classes.jar from the generated .aar file of your plugin ({plugin_name}.aar) and generating type declarations for it: https://docs.nativescript.org/core-concepts/android-runtime/metadata/generating-typescript-declarations
This way you have all the method/class/type information available in your editor.
On OS X my gamepads are recognised correctly by SDL_PollEvent() at application startup. However, when I hot-plug new gamepads or remove old ones, SDL_PollEvent() does not deliver either SDL_CONTROLLERDEVICEADDED or SDL_CONTROLLERDEVICEREMOVED. The same code works correctly on Windows when I hot-plug game controllers.
A more interesting note is that if I resize the window of my application, the hot plugging works. After a resize event all the hot-plug events are delivered. It almost seems that the gamepad events are put in some kind of waiting queue which is purged when the resize event happens. My SDL_PollEvent() code is quite standard, as seen below.
case SDL_CONTROLLERDEVICEADDED:
    if (SDL_IsGameController(e.cdevice.which))
    {
        SDL_GameController *pad = SDL_GameControllerOpen(e.cdevice.which);
        if (pad)
        {
            SDL_Joystick *joy = SDL_GameControllerGetJoystick(pad);
            int instanceID = SDL_JoystickInstanceID(joy);
            if (m_gameControllers.count(instanceID) == 0)
            {
                m_gameControllers.insert(std::make_pair(instanceID, pad));
            }
        }
    }
    break;
case SDL_CONTROLLERDEVICEREMOVED:
    {
        auto it = m_gameControllers.find(e.cdevice.which);
        if (it != m_gameControllers.end())
        {
            SDL_GameController* pad = m_gameControllers[e.cdevice.which];
            SDL_GameControllerClose(pad);
            m_gameControllers.erase(it);
        }
    }
    break;
Has anyone else experienced this?
After some struggling I found the solution: call SDL_PollEvent() from the main thread. Initially I called the gamepad handling method from the CVDisplayLink thread, which resulted in the described behaviour.
In my case the solution was simply to add a dispatch_async call to my gamepad handling function:
dispatch_async(dispatch_get_main_queue(), ^{ handleGamePad(); });
I have a problem where I'm more or less using the jsPlumb flow-chart demo example, but where there is only ever one drop target per window and there may be one or many drag targets. However, I want to forbid self-connections, so that a connection can be dragged from any window to any other window EXCEPT itself.
I was thinking that maybe you could use scopes, but this would mean a different scope for each window, which seems over the top. Does anyone have a tidy solution?
Thanks for the answers, they pointed me in the right direction. In the end I used "beforeDrop"; when binding "connection", it was detaching the source endpoint of the window as well as the connection.
The final solution was:
instance.bind("beforeDrop", function (info) {
    // console.log("before drop: " + info.sourceId + ", " + info.targetId);
    if (info.sourceId === info.targetId) { // source and target IDs are the same
        console.log("source and target IDs are the same - self connections not allowed.");
        return false;
    } else {
        return true;
    }
});
from http://www.jsplumb.org/doc/connections.html#draganddrop
Preventing Loopback Connections
In vanilla jsPlumb only, you can instruct jsPlumb to prevent loopback connections without having to resort to a beforeDrop interceptor. You do this by setting allowLoopback:false on the parameters passed to the makeTarget method:
jsPlumb.makeTarget("foo", {
    allowLoopback: false
});
Bind an event to get notified whenever a new connection is created. After the connection is created, check whether its source and target are the same; if so, detach that connection to avoid a self-loop. Code:
jsPlumb.bind("jsPlumbConnection", function(ci) {
    var s = ci.sourceId, c = ci.targetId;
    if (s === c) { // source and target IDs are the same
        jsPlumb.detach(ci.connection);
    } else {
        // keep the connection if the IDs are different (do nothing)
        // console.log(s + "->" + c);
    }
});
I am using the Titanium SDK's Titanium.Media.showCamera function to capture an image and store it to the SD card.
function captureImage() {
    var capturedImg;
    Titanium.Media.showCamera({
        success : function(event) {
            /* Holds the captured image */
            capturedImg = event.media;
            /* Condition to check the selected media */
            if (event.mediaType == Ti.Media.MEDIA_TYPE_PHOTO) {
                var window1 = Project.AddDocumentSaveView.init(capturedImg, docImgModel);
                window1.oldWindow = win;
                Project.UI.Common.CommonViews.addWindowToTabGroup(window1);
                activityInd.hide();
            }
        },
        cancel : function() {
        },
        error : function(error) {
            /* Called when there's an error */
            var a = Titanium.UI.createAlertDialog({
                titleid : Project.StringConstant.IMP_DOCS_CAMERA
            });
            if (error.code == Titanium.Media.NO_CAMERA) {
                a.setMessage(Project.StringConstant.IMP_DOCS_ERROR_WITH_CAMERA);
            } else {
                a.setMessage(Project.StringConstant.UNEXPECTED_ERROR + error.message);
            }
            a.show();
        }
    });
}
It works fine on iPhone and even on a Samsung Galaxy S2. But on one device, a Motorola Milestone, the application crashes when the picture is accepted after capturing.
Here is the log while the device was attached: Log for camera crash
I have tried many times but couldn't find the issue. I think it's some memory issue, but I am not sure.
Could someone look into it and help me find what the issue is?
Any help/suggestions will be appreciated.
Thanks
Everything in this block should be done after the camera is closed:
if (event.mediaType == Ti.Media.MEDIA_TYPE_PHOTO) {
}
The camera is memory intensive, and you are opening new windows and doing a bunch of other stuff in that callback... not good.
This is a long-standing issue in Titanium (TIMOB-12848).
On some devices the native camera app (Titanium calls it using an Intent) causes Android to destroy our app.
When Android tries to restart it, the previous state is not recovered, so the intent callback is not called.
I've found a simple workaround that minimizes this issue but doesn't resolve it. It just "masks" it somehow.
The workaround is discussed in the previous link, and the full example code is here.
I'm currently using the GeoCoordinateWatcher, which abstracts away the information used to retrieve the position and speed. It provides a status (Disabled/Ready/NoData/Initializing), but that's all.
I've seen a few apps, such as RunKeeper, that have a GPS signal strength indicator, but I wasn't sure whether that was accurate or whether it was calculated based on the HorizontalAccuracy property of the GeoCoordinate.
NOTE: I have read this link:
How to read GPS signal strength in Windows Mobile?
But that is dealing with WP 6.5, and I don't think it helps on WP7.
Speaking from experience (as the developer of the RunKeeper Windows Phone app), you can't access the GPS signal strength directly, but you can use the HorizontalAccuracy to display a relative strength indicator.
I use Rx Extensions to provide an observable position stream on the GeoCoordinateWatcher, then create an observable accuracy stream on top of that, so that I can subscribe to accuracy changes separately from position changes (rather than having to check and update on every position).
// Extension method.
public static IObservable<GeoPositionChangedEventArgs<GeoCoordinate>> GetPositionChangedEventStream(this GeoCoordinateWatcher watcher)
{
    return Observable.Create<GeoPositionChangedEventArgs<GeoCoordinate>>(observable =>
    {
        EventHandler<GeoPositionChangedEventArgs<GeoCoordinate>> handler = (s, e) =>
        {
            observable.OnNext(e);
        };
        watcher.PositionChanged += handler;
        return () => { watcher.PositionChanged -= handler; };
    });
}
// Usage:
var positionStream = this._watcher.GetPositionChangedEventStream();
var accuracyStream = positionStream.Select(p => p.Position.Location.HorizontalAccuracy);
...
accuracyStream.Subscribe((accuracy) =>
{
    // Do something with the accuracy.
});
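How you map HorizontalAccuracy (in meters) to a "bars" style indicator is entirely up to you. As a rough sketch, the thresholds and the signalIndicator element below are arbitrary values I picked for illustration, not anything the platform (or RunKeeper) defines:
// Rough sketch: translate accuracy (meters) into a relative 0-3 "signal" level.
// The threshold values are assumptions and should be tuned for your app.
private static int AccuracyToSignalLevel(double horizontalAccuracy)
{
    if (horizontalAccuracy <= 10) return 3;  // very good fix
    if (horizontalAccuracy <= 30) return 2;  // usable
    if (horizontalAccuracy <= 75) return 1;  // poor
    return 0;                                // effectively no useful fix
}

// e.g. inside the Subscribe callback (signalIndicator is a hypothetical UI element):
// signalIndicator.Level = AccuracyToSignalLevel(accuracy);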