Measure time from code execution to Bluetooth Low Energy transmission on iPhone using Instruments - Xcode

I need to measure how long it takes from when my code executes a transfer call until the actual packets are sent over the air.
Is this possible using the Xcode developer tool "Instruments", or is it best to look for timestamps somewhere in my code?
All help is really appreciated

I have used a packet analyzer to debug traces over the air (http://www.fte.com/), but it's a very expensive tool.
Otherwise, you won't get a precise measurement; you have no idea what delays the hardware introduces.
Although it would be fun to have a look: set the connection interval of your tag, then log timestamps and check whether the delta you get is similar to the delta you set.
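As a rough illustration of that timestamp check (language-agnostic; shown here as a standalone C++ sketch rather than CoreBluetooth code, with a hypothetical onPacketEvent() hook standing in for whatever callback your stack fires per packet):

    // Minimal sketch of the timestamp-delta check suggested above.
    // onPacketEvent() is a hypothetical hook you would call from whatever
    // callback fires when a packet is handed to (or received from) the
    // radio stack; it is not part of any Apple API.
    #include <chrono>
    #include <cstdio>

    void onPacketEvent()
    {
        using clock = std::chrono::steady_clock;
        static clock::time_point last = clock::now();

        const clock::time_point now = clock::now();
        const auto deltaMs =
            std::chrono::duration_cast<std::chrono::milliseconds>(now - last).count();
        last = now;

        // If the tag's connection interval is e.g. 30 ms, the logged deltas
        // should cluster around 30 ms; larger, irregular deltas hint at
        // queuing delays before the packet actually goes over the air.
        std::printf("delta since previous packet event: %lld ms\n",
                    static_cast<long long>(deltaMs));
    }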

Related

Castalia 3.2 energy consumption

I'm developing several application protocols in Castalia 3.2. Compilation succeeds and the timing results are right, but when I try to obtain the energy consumption the result is 612000 for every node and for every protocol. I've tried CastaliaResults -i 100812-102156.txt -s energy, but the results are always 612000. I also set initialEnergy to 1000 in omnetpp.ini, and then the results are always 1000. The resulting time is right and the radio parameters are the defaults. Can anyone tell me why the results are not what I expected, and the steps to obtain the energy consumption?
You are correct in using CastaliaResults -i yourfile.txt -s energy to see the energy results; it's just that the results are not what you expect.
We cannot know the details of your protocols since you did not share them, but what seems to be happening is that your protocols keep the radio on all of the time.
If you are also transmitting packets in your simulations I would expect to see some very minor variation (Tx power is a little less than Rx/listening power). Are there any transmissions and do you see any minor difference?
Where does the number 612000 come from? Is this the total energy of your nodes? If so then obviously there is another problem: nodes do not have enough energy for the task you want them to complete.
Finally, I would encourage you to use the latest version of Castalia from the GitHub repository. There are several improvements compared to 3.2 and many bug fixes.

Play sound at an exact moment

I want to play a sound sample (about 2 minute long) and record keystrokes while the sample is playing. I need to know the exact time a key was pressed relative to the play start time, with a resolution of 1 millisecond or less.
First I tried NAudio and a WinForms application. I found it quite impossible to synchronize the playback start time and the keystroke time. I know when I start transferring the sample bytes to NAudio, but not the time it takes between passing the sample and the actual playback. I also know the time my application receives the KeyDown event, but not the time it actually takes for the event to go all the way from the keyboard hardware to my C# event handler.
This is more or less accurate - I get a 270 ms (±5 ms) difference between the delay reported by the application and the actual delay (I know the actual delay by recording the session and looking at the sample file; the recording is done on a different device, not the computer running the application, of course...).
This isn't good enough, because of the ±5 ms. This happens even when I disabled generation 2 GC during playback. I need to be more accurate than this.
I don't mind switching to C++ and unmanaged code, but I'd like to know which sound playback API to use, and whether there are more accurate ways to get the keyboard input than waiting for the WM_KEYDOWN message.
With NAudio, all output devices that implement the IWavePosition interface can report exactly where the playback position is. It can take a bit of trial and error to learn exactly how to convert the position into time, but this is your best approach to solving this problem.
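The conversion itself is just arithmetic over the output format. A minimal sketch, assuming 44.1 kHz / 16-bit / stereo PCM and an illustrative byte position (shown in C++ since the question mentions being open to unmanaged code; the same formula applies in C# to the byte position reported via IWavePosition):

    // Rough sketch of the position-to-time conversion mentioned above.
    // A byte position divided by the output format's average bytes per
    // second gives elapsed playback time. The format values below are
    // assumptions for illustration only.
    #include <cstdio>

    int main()
    {
        const long long positionBytes  = 1764000;  // example reported byte position (10 s of audio here)
        const int       sampleRate     = 44100;    // frames per second
        const int       channels       = 2;        // stereo
        const int       bytesPerSample = 2;        // 16-bit PCM

        const double bytesPerSecond = 1.0 * sampleRate * channels * bytesPerSample; // 176400
        const double playbackMs     = positionBytes / bytesPerSecond * 1000.0;      // 10000 ms here

        std::printf("playback position: %.3f ms\n", playbackMs);
        return 0;
    }

Comparing this playback time against a high-resolution timestamp taken in the KeyDown handler is what gives the keystroke offset relative to the audio actually coming out of the device.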

Using usb cable for random number generation

I have a thought, but am unsure how to execute it. I want to take a somewhat long USB cable and plug both ends into the same machine. Then I would like to send a signal from one end and time how long it takes to reach the other end. I think this should cause the signal to arrive at slightly different times, and that would give me random numbers.
Can someone suggest a language in which I could do this the quickest? I have zero experience in sending signals over usb and don't know where to start or how to start. Any help will be greatly appreciated.
I simply want to do this as a fun in home project, so I don't need anything official and just would like to see if this idea can work.
EDIT: What if I store the USB cable in liquid nitrogen, or a substance just as cold, in order to slow down the signal as much as possible? (I have access to liquid nitrogen.)
Sorry I can't comment (not enough rep), but the delay should always be the same through the wire. This might limit the true randomness of your numbers. Plus, the actual delay time in the wire might be shorter than even a CPU cycle.
If your operating system is Windows, you may run into this type of issue:
Why are .NET timers limited to 15 ms resolution?
Apparently the minimum time resolution on Windows is around 15ms.
EDIT: In response to your liquid nitrogen edit - according to this temperature vs. conductivity graph, you may have more luck with heat! Interestingly enough...
Temperature vs Conductivity: http://www.emeraldinsight.com/content_images/fig/1740240120008.png
I want to take a somewhat long usb cable and plug both ends into the same machine.
Won't work. A USB connection is always Host -> Device, and a PC can only be a Host. And the communication uses predictable 1 ms intervals - bad for randomness.
Some newer microcontrollers have both RNG and USB on chip, that way you can make a real USB RNG.
What if I store the usb cable in liquid nitrogen or a substance just as cold in order to slow down the signal
The signal would travel a tiny bit faster, as the resistance of the cable is lower.

Core Audio - Remote IO Based Low-Latency Metronome

I am trying to construct a low-latency metronome using Core Audio.
My idea is to use Remote IO, which should give me a timestamp for every packet of audio I produce. I want to remember the timestamp at which playback started and subtract it from the current timestamp to get the current position. Then I want to use that position to generate the audio for the metronome as needed.
After some research, I have found that this would be the best way to create a low-latency metronome. However, attempting its implementation and diving into this framework has been rather daunting. If anyone knows how I could put this together or perhaps even point me to sources where I could gather the information I would need to make it work, I would be most grateful!
Thank you.
Ignore the packet timestamps and count samples. If you position the start of each metronome sound an exact number of samples apart at a known sample rate, the tempo will be sub-millisecond accurate. Per-packet timestamp resolution is much less precise.
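A minimal sketch of that sample-counting idea, written as a Remote IO render callback in C++. It assumes the unit has been configured for 16-bit mono linear PCM at 44.1 kHz; the click buffer, tempo, and globals are placeholders for illustration, not Core Audio API:

    // Sketch of the sample-counting approach as a Remote IO render callback.
    // The only clock involved is the audio hardware's own sample clock;
    // the host timestamp (inTimeStamp) is deliberately ignored.
    #include <AudioToolbox/AudioToolbox.h>
    #include <cstring>

    static const double kSampleRate    = 44100.0;
    static const double kTempoBpm      = 120.0;
    static const UInt32 kFramesPerBeat = (UInt32)(kSampleRate * 60.0 / kTempoBpm); // 22050
    static SInt16       gClick[512];          // precomputed click waveform (placeholder)
    static const UInt32 gClickFrames   = 512;
    static UInt64       gFrameCounter  = 0;   // total frames rendered since playback start

    static OSStatus RenderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
    {
        SInt16 *out = (SInt16 *)ioData->mBuffers[0].mData;
        std::memset(out, 0, ioData->mBuffers[0].mDataByteSize); // silence by default

        for (UInt32 i = 0; i < inNumberFrames; ++i) {
            // Position within the current beat, in frames.
            UInt64 posInBeat = (gFrameCounter + i) % kFramesPerBeat;
            if (posInBeat < gClickFrames) {
                out[i] = gClick[posInBeat];   // inside a click: copy its samples
            }
        }
        gFrameCounter += inNumberFrames;      // count samples rather than wall-clock time
        return noErr;
    }

Because each click starts exactly kFramesPerBeat frames after the previous one, the tempo is locked to the sample rate and never drifts, which is exactly the sub-millisecond behavior described above.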

500Hz or higher serial port data recording

Hello, I'm trying to read some data from the serial port and record it to the hard drive. I'm using Visual C++ Express and made an application using Windows Forms.
The program basically sends a byte ("s") every t seconds; this triggers the device connected to the serial port to send back 3 bytes. The baud rate is currently 38400 bps. The time t is controlled by the Timer class of Visual C++.
The problem I have is that if I set the ticking time of the timer to 1 ms, the data is not recorded every 1 ms, but around every 15 ms. I've read that the resolution of the timer may be set to 15 ms, but I'm not sure about it. Anyhow, how can I make the timer event trigger every 1 ms instead of every 15 ms? Or is there another way to read the serial port data faster? I'm looking for 500 Hz or higher.
The device connected to the serial port is a 32-bit microcontroller whose program I also control, so I can easily change it, but I just can't figure out another way to make this transmission.
Thanks for any help!
Windows is not a real-time OS, and regardless of what period your timer is set to there are no guarantees that it will be consistently maintained. Moreover the OS clock resolution is dependent on the hardware vendor's HAL implementation and varies from system to system. Multi-media timers have higher resolution, but the real-time guarantees are still not there.
Apart from that, you need to do a little arithmetic on the timing you are trying to achieve. At 38400,N,8,1, you can only transfer at most 3.84 characters in 1ms, so your timing is tight in any case since you are pinging with one character and expecting three characters to be returned. You can certainly go no faster without increasing the bit rate.
A better solution would be to have the PC host send the required reporting period to the embedded target once then have the embedded target perform its own timing so that it autonomously emits data every period until the PC requests that it stop or sends a different period. Your embedded system is far more capable of maintaining hard-real-time constraints.
Alternatively you could simply have your device perform its sample and transmit the three characters with the timing entirely determined by the transmission time of the three characters, and stream the data constantly. This will give you a sample period of 781.25us (1280Hz) without any triggering from the PC and it will be truly periodic and jitter free. If you want a faster sample rate, simply increase the bit rate.
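For reference, the arithmetic behind those figures (with N,8,1 framing each byte costs 10 bits on the wire: 1 start + 8 data + 1 stop), as a small standalone C++ check:

    // Throughput arithmetic for 38400,N,8,1 with 3-byte samples.
    #include <cstdio>

    int main()
    {
        const double baud          = 38400.0;
        const double bitsPerByte   = 10.0;              // start + 8 data + stop
        const double bytesPerSec   = baud / bitsPerByte; // 3840 bytes/s = 3.84 bytes/ms
        const double samplePeriodS = 3.0 / bytesPerSec;  // 3-byte samples sent back-to-back

        std::printf("bytes per ms : %.2f\n", bytesPerSec / 1000.0);        // 3.84
        std::printf("sample period: %.2f us (%.0f Hz)\n",
                    samplePeriodS * 1e6, 1.0 / samplePeriodS);             // 781.25 us, 1280 Hz
        return 0;
    }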
Windows Forms timer resolution is about 15-20 ms. You can try a multimedia timer; see the timeSetEvent function.
http://msdn.microsoft.com/en-us/library/windows/desktop/dd757634%28v=vs.85%29.aspx
http://msdn.microsoft.com/en-us/library/windows/desktop/dd743609%28v=vs.85%29.aspx
Timer precision is set by the uResolution parameter (0 = maximal possible precision). In any case, you cannot count on getting a timer callback every millisecond - Windows is not a real-time system.
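A minimal timeSetEvent sketch, assuming a 1 ms periodic timer and a plain Win32/C++ entry point rather than the WinForms project itself (the callback runs on a worker thread, so it must not touch form controls directly):

    // Multimedia-timer sketch: 1 ms periodic callback via timeSetEvent.
    // Link against winmm.lib.
    #include <windows.h>
    #include <mmsystem.h>
    #include <cstdio>
    #pragma comment(lib, "winmm.lib")

    static void CALLBACK TimerProc(UINT uTimerID, UINT uMsg,
                                   DWORD_PTR dwUser, DWORD_PTR dw1, DWORD_PTR dw2)
    {
        // Keep this short: e.g. write the trigger byte to the serial port
        // here, or signal a dedicated serial thread to do it.
        std::printf("tick\n");
    }

    int main()
    {
        timeBeginPeriod(1);                           // request 1 ms system timer resolution
        MMRESULT id = timeSetEvent(1,                 // period: 1 ms
                                   0,                 // uResolution: 0 = best possible
                                   TimerProc,
                                   0,                 // user data passed to the callback
                                   TIME_PERIODIC);
        if (id == 0) { std::printf("timeSetEvent failed\n"); return 1; }

        Sleep(5000);                                  // let the timer run for 5 seconds
        timeKillEvent(id);
        timeEndPeriod(1);
        return 0;
    }

Even with this, expect occasional jitter; as noted in the answers above, pushing the timing onto the microcontroller is the only way to make it truly periodic.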
