I'm working on a mobile app, and my goal is to make an Arduino communicate with the smartphone. So far I can only read the first message sent by the Arduino, and only when the application is not active.
I'm using this function from the react-native-nfc-manager library:
getLaunchTagEvent()
After this event I can no longer read any other NDEF messages. How can I solve this?
The code is as follows:
componentDidMount() {
  NfcManager.isSupported()
    .then(supported => {
      this.setState({ supported });
      if (supported) {
        this._startNfc();
      }
    });
}
_startNfc() {
  if (Platform.OS === 'android') {
    NfcManager.getLaunchTagEvent()
      .then(tag => {
        console.log('launch tag', tag);
        if (tag) {
          this.setState({ tag });
        }
      })
      .catch(err => {
        console.log(err);
      });
  }
}
I am also trying to read the tag while the application is open, but the operation fails on the Arduino side. Any solutions?
The code is as follows:
readData = async () => {
  NfcManager.registerTagEvent(
    tag => {
      console.log('Tag Discovered', tag);
    },
    'Hold your device over the tag',
    {
      readerModeFlags:
        NfcAdapter.FLAG_READER_NFC_A | NfcAdapter.FLAG_READER_SKIP_NDEF_CHECK,
      readerModeDelay: 2,
    },
  );
}
The Arduino code is as follows:
#include "SPI.h"
#include "PN532_SPI.h"
#include "snep.h"
#include "NdefMessage.h"
PN532_SPI pn532spi(SPI, 10);
SNEP nfc(pn532spi);
uint8_t ndefBuf[128];
void setup() {
Serial.begin(9600);
Serial.println("NFC Peer to Peer-Send Message");
}
void loop() {
Serial.println("Send a message to Peer");
NdefMessage message = NdefMessage();
message.addTextRecord("Hello");
int messageSize = message.getEncodedSize();
if (messageSize > sizeof(ndefBuf)) {
Serial.println("ndefBuf is too small");
while (1) {
}
}
message.encode(ndefBuf);
if (0 >= nfc.write(ndefBuf, messageSize)) {
Serial.println("Failed");
} else {
Serial.println("Success");
}
delay(3000);
}
The use of SNEP (and LLCP) complicates things: it is a peer-to-peer protocol, peer-to-peer was deprecated in Android 10 and is not supported on iOS, and I'm not very familiar with it.
I'm not sure it is possible to read SNEP messages using enableReaderMode (which is what you have asked the react-native-nfc-manager library to use), because SNEP (and LLCP) is not a Type A technology.
If you look at the NFC standards diagram at https://pdfslide.net/documents/divnfc0804-250-nfc-standards-v18.html, it might be a Type F technology, so instead of NfcAdapter.FLAG_READER_NFC_A I would try NfcAdapter.FLAG_READER_NFC_F, or enable all of the technologies to be on the safe side (though I suspect this might not work either).
If that does not work: Android peer-to-peer normally expects to be sent only NDEF messages, and you have disabled the system NFC app from processing NDEF messages with NfcAdapter.FLAG_READER_SKIP_NDEF_CHECK, so I would try removing that flag and working with the Ndef tag technology type.
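As a rough, untested sketch (assuming the flag constants are exposed exactly as in your existing readData snippet), both adjustments together might look like:

readData = async () => {
  NfcManager.registerTagEvent(
    tag => {
      console.log('Tag Discovered', tag);
    },
    'Hold your device over the tag',
    {
      // Try Type F instead of Type A, and drop FLAG_READER_SKIP_NDEF_CHECK
      // so the system NFC app can still parse NDEF messages.
      readerModeFlags: NfcAdapter.FLAG_READER_NFC_F,
      readerModeDelay: 2,
    },
  );
}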
But I don't think any of that will help. The next thing I would try is to not use enableReaderMode with react-native-nfc-manager, but the underlying enableForegroundDispatch mechanism instead, by just calling NfcManager.registerTagEvent() without any reader-mode options.
Foreground dispatch interacts with the Android system NFC app at a later point in the chain of events, where the system NFC app creates Intents to share with other apps, either launching an app to handle the Intent or passing it to a running app that has asked to receive NFC Intents. This looks to be a common point between how the system NFC app handles real NFC tags and peer-to-peer SNEP messages, since a SNEP message can launch your app.
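A minimal sketch of that fallback (untested, reusing the callback from your own readData function):

readData = async () => {
  // No reader-mode options: the library should fall back to
  // enableForegroundDispatch, which also sees SNEP-delivered NDEF messages.
  NfcManager.registerTagEvent(tag => {
    console.log('Tag Discovered', tag);
  });
}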
Going forward, though, I would not use SNEP (peer-to-peer) at all, since it is deprecated; instead I would have the Arduino do Host Card Emulation to send the data (then you could use reader mode).
Using Xamarin, I'd like to use an AVAudioSinkNode to store, and eventually transfer, incoming audio data from a mic at the lowest latency possible (without going straight into AudioUnits and the deprecated AUGraphs). See my commented code below, where the SinkNode is connected to the default InputNode; it's giving me grief. I'm using Xamarin.Forms with a simple iOS dependency class. I can successfully hook up an InputNode through an fx node (Reverb, for example) and on out to the OutputNode. In this case, I've minimized my code to focus on the problem at hand:
public unsafe class AudioEngine : IAudioEngine
{
    AVAudioEngine engine;
    AVAudioInputNode input;
    AVAudioSinkNode sink;
    NSError error; // 'error' is used below, so declared here

    public AudioEngine()
    {
        ActivateAudioSession();
    }

    protected void ActivateAudioSession()
    {
        var session = AVAudioSession.SharedInstance();
        session.SetCategory(AVAudioSessionCategory.Playback, AVAudioSessionCategoryOptions.DuckOthers);
        session.SetActive(true);
        session.SetPreferredIOBufferDuration(0.0007, out error); // 32 byte buffer, if possible!

        engine = new AVAudioEngine();
        input = engine.InputNode; // to save on typing
        input.Volume = 0.5f;
        var format = input.GetBusInputFormat(0); // used for fx connections, but not used in this snippet. If I use this in the Input -> Sink connection, it crashes.

        sink = new AVAudioSinkNode(sinkReceiverHandler);
        engine.AttachNode(sink);

        try
        {
            //-----------------------------------------------------
            // Param #3 (format) is nil in all the Apple documentation and multiple examples.
            // In place of nil, NSNull.Null isn't accepted.
            // In place of nil, null throws a System.NullReferenceException (see stack dump).
            // In place of nil, using the InputNode's format crashes with
            // something about missing the Trampolines.g.cs file... no clue...
            engine.Connect(input, sink, null); // null doesn't work in place of nil
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.StackTrace); // exception messages included below
        }

        engine.Prepare();
        engine.StartAndReturnError(out error);
    }

    private unsafe int sinkReceiverHandler(AudioToolbox.AudioTimeStamp timeStamp, uint frames, ref AudioToolbox.AudioBuffers inputData)
    {
        // Do stuff with the data...
        return 0;
    }
}
I found a post related to the use of nil as a parameter in Xamarin.iOS that says the author of the library needs to include the [NullAllowed] attribute:
How to assign something to nil using Xamarin.iOS
My question is: Am I missing something obvious, or is this an oversight in the Xamarin library definition? I always assume it's my lack of expertise, but if this is a bug, how do I go about reporting it to Xamarin?
A follow up question: If this is a glitch, is there a viable workaround? Can I go in and tweak the Xamarin library definition manually? (which would break on any updates, I'm sure.) Or can I make a little library using Swift which I then import into my Xamarin project?
Just trying to think of options. Thanks for reading! Below is the Stack dump when I use null as a substitute for nil (again... NSNull.Null isn't considered a valid type in this case. It just doesn't compile):
{System.NullReferenceException: Object reference not set to an instance of an object
at AVFoundation.AVAudioFormat.op_Equality (AVFoundation.AVAudioFormat a, AVFoundation.AVAudioFormat b) [0x00000] in /Library/Frameworks/Xamarin.iOS.framework/Versions/13.18.3.2/src/Xamarin.iOS/AVFoundation/AVAudioFormat.cs:27
at AVFoundation.AVAudioEngine.Connect (AVFoundation.AVAudioNode sourceNode, AVFoundation.AVAudioNode targetNode, AVFoundation.AVAudioFormat format) [0x00024] in /Library/Frameworks/Xamarin.iOS.framework/Versions/13.18.3.2/src/Xamarin.iOS/AVAudioEngine.g.cs:120
at udptest.iOS.AudioEngine.ActivateAudioSession () [0x0009b] in /Users/eludema/dev/xamarin/udptest/udptest.iOS/AudioEngine.cs:43 }
THANKS!
This has been confirmed as a bug: the format parameter is marked [NullAllowed], but the code to actually handle that null wasn't wired up in the wrapper code. Here's the issue on the Xamarin.Mac/iOS GitHub repo:
https://github.com/xamarin/xamarin-macios/issues/9267
Thanks for the GitHub issue submission link, @SushiHangover!
I'm learning about NativeScript plugins and am trying to get the PubNub iOS SDK working. So far, with the TypeScript below, I am able to successfully configure the client, subscribe to channels, and publish messages. I'm also trying to receive messages by converting the "// Handle new message..." section to TypeScript, but haven't been able to get it working. How would I write this?
Objective-C:
// Initialize and configure PubNub client instance
PNConfiguration *configuration = [PNConfiguration configurationWithPublishKey:@"demo" subscribeKey:@"demo"];
self.client = [PubNub clientWithConfiguration:configuration];
[self.client addListener:self];

// Subscribe to demo channel with presence observation
[self.client subscribeToChannels:@[@"my_channel"] withPresence:YES];

// Handle new message from one of channels on which client has been subscribed.
- (void)client:(PubNub *)client didReceiveMessage:(PNMessageResult *)message {
    NSLog(@"Received message");
}

// Publish message
[self.client publish:@{@"message": @"this is my message"}
           toChannel:@"my_channel" withCompletion:^(PNPublishStatus *status) {
}];
TypeScript:
// Initialize and configure PubNub client instance
this.config = PNConfiguration.configurationWithPublishKeySubscribeKey("demo", "demo");
this.client = PubNub.clientWithConfiguration(this.config);
this.client.addListener();

// Subscribe to demo channel with presence observation
this.client.subscribeToChannelsWithPresence(channels, true);

// Handle new message from one of channels on which client has been subscribed.
?

// Publish message
this.client.publishToChannelWithCompletion(msgObj, channel, function(publishStatus) {
    console.log(publishStatus.data);
});
Looks like you are missing the PNObjectEventListener delegate here. You should implement the delegate and pass its instance to the addListener function for the didReceiveMessage callback to be invoked upon a new message.
For example, here you can see how the core framework implements UITextViewDelegate for TextView so it can be notified of changes and other events.
Since you are using TypeScript, take advantage of the typings for your PubNub library so it is easier to find the right syntax.
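A rough sketch of the delegate (untested; the method name clientDidReceiveMessage is an assumption derived from the client:didReceiveMessage: selector, following the usual NativeScript marshalling rules):

// Sketch: a NativeScript class implementing the PNObjectEventListener protocol.
class MessageListener extends NSObject {
  public static ObjCProtocols = [PNObjectEventListener];

  // The Objective-C selector client:didReceiveMessage: is assumed to
  // marshal to this camelCased method name.
  clientDidReceiveMessage(client, message) {
    console.log('Received message', message);
  }
}

// Keep a reference on your class so the delegate isn't garbage collected,
// then pass it to addListener instead of calling it with no arguments.
this.listener = MessageListener.new();
this.client.addListener(this.listener);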
In my last project I used the BLE plugin.
adapter.DeviceDiscovered += (s, a) =>
{
    myDeviceList.Add(a.Device);
};

await adapter.StartScanningForDevicesAsync();
But right now I'm just scanning for devices and adding whatever is found directly to this list.
I want this scan to run continuously, and if any device is lost, I want it to be removed from the list automatically.
The BLE plugin has the StartScanningForDevicesAsync method, but I don't know if this is useful for me. I also found this event:
event EventHandler<DeviceErrorEventArgs> DeviceConnectionLost;
//
// Summary:
//     Occurs when a device has been disconnected. This occurs on intended
//     disconnects after Plugin.BLE.Abstractions.Contracts.IAdapter.DisconnectDeviceAsync(Plugin.BLE.Abstractions.Contracts.IDevice).
Is this possible?
I think you can use something like this (pseudocode mixed with C#):
Device.StartTimer(TimeSpan.FromSeconds(30), () =>
{
    if (!isScanning)
    {
        // Stop scanning after 10 seconds.
        new Handler().PostDelayed(StopScan, 10000);
        isScanning = true;
        StartScan();
    }
    return true; // returning true tells the timer to fire again
});
Here StartScan and StopScan are the functions you use for communicating with BLE.
I am using the Titanium SDK's showCamera function to capture an image and store it to the SD card.
function captureImage() {
    var capturedImg;
    Titanium.Media.showCamera({
        success : function(event) {
            /* Holds the captured image */
            capturedImg = event.media;
            /* Condition to check the selected media */
            if (event.mediaType == Ti.Media.MEDIA_TYPE_PHOTO) {
                var window1 = Project.AddDocumentSaveView.init(capturedImg, docImgModel);
                window1.oldWindow = win;
                Project.UI.Common.CommonViews.addWindowToTabGroup(window1);
                activityInd.hide();
            }
        },
        cancel : function() {
        },
        error : function(error) {
            /* Called when there's an error */
            var a = Titanium.UI.createAlertDialog({
                titleid : Project.StringConstant.IMP_DOCS_CAMERA
            });
            if (error.code == Titanium.Media.NO_CAMERA) {
                a.setMessage(Project.StringConstant.IMP_DOCS_ERROR_WITH_CAMERA);
            } else {
                a.setMessage(Project.StringConstant.UNEXPECTED_ERROR + error.message);
            }
            a.show();
        }
    });
}
It works fine on iPhone and even a Samsung Galaxy S2, but on one device, a Motorola Milestone, the application crashes when the picture is accepted after capturing.
Here is the log while the device was attached: Log for camera crash
I tried many times but couldn't find the issue. I think it's some memory issue, but I am not sure.
Could someone look into it and help me find what the issue is?
Any help/suggestions will be appreciated.
Thanks
Everything in this block should be done after the camera is closed:
if (event.mediaType == Ti.Media.MEDIA_TYPE_PHOTO) {
}
The camera is memory-intensive, and inside that callback you are opening new windows and doing a bunch of other stuff... not good.
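One way to structure that (an untested sketch, reusing only the calls from your own snippet) is to stash the result in the success callback and defer the window work until the camera UI has closed:

success : function(event) {
    if (event.mediaType == Ti.Media.MEDIA_TYPE_PHOTO) {
        capturedImg = event.media;
        // Defer the heavy UI work so the camera activity can finish
        // closing before new windows are created.
        setTimeout(function() {
            var window1 = Project.AddDocumentSaveView.init(capturedImg, docImgModel);
            window1.oldWindow = win;
            Project.UI.Common.CommonViews.addWindowToTabGroup(window1);
            activityInd.hide();
        }, 500);
    }
},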
This is a long-standing issue in Titanium (TIMOB-12848).
On some devices the native camera app (Titanium calls it using an Intent) causes Android to destroy our app.
When Android tries to restart it, the previous state is not recovered, so the intent callback is never called.
I've found a simple workaround that minimizes this issue but doesn't resolve it; it just masks it somewhat.
The workaround is discussed in the previous link and the full example code is here.
I was going over the Boost.Asio examples, and I am wondering why there isn't a simple server/client example that prints a string on the server and then returns a response to the client.
I tried to modify the echo server but I can't really figure out what I'm doing at all.
Can anyone find me a template of a client and a template of a server?
I would like to eventually create a server/client application where the server receives binary data and just returns an acknowledgment to the client that the data was received.
EDIT:
void handle_read(const boost::system::error_code& error,
    size_t bytes_transferred) // from the server
{
    if (!error)
    {
        boost::asio::async_write(socket_,
            boost::asio::buffer("ACK", bytes_transferred),
            boost::bind(&session::handle_write, this,
                boost::asio::placeholders::error));
    }
    else
    {
        delete this;
    }
}
This returns only 'A' to the client. Also, in data_ I get a lot of garbage characters after the response itself. Those are my problems.
EDIT 2:
OK, so the main problem is with the client.
size_t reply_length = boost::asio::read(s,
    boost::asio::buffer(reply, request_length));
Since it's an echo server, the 'ACK' will only appear when the request length is more than 3 characters.
How do I overcome this?
I tried changing request_length to 4, but that only makes the client wait and not do anything at all.
Eventually I found out that the problem resides in this bit of code in the server:
void handle_read(const boost::system::error_code& error,
    size_t bytes_transferred) // from the server
{
    if (!error)
    {
        boost::asio::async_write(socket_,
            boost::asio::buffer("ACK", 4), // replaced bytes_transferred with the length of my message
            boost::bind(&session::handle_write, this,
                boost::asio::placeholders::error));
    }
    else
    {
        delete this;
    }
}
And in the client:
size_t reply_length = boost::asio::read(s,
    boost::asio::buffer(reply, 4)); // replaced request_length with the length of the custom message
The echo client/server is the simple example. What areas are you having trouble with? The client should be fairly straightforward since it uses the blocking APIs. The server is slightly more complex since it uses the asynchronous APIs with callbacks. When you boil it down to the core concepts (session, server, io_service) it's fairly easy to understand.