Auto detect UART baud rate of a working device - terminal

I have a device which is continuously sending data through UART.
I'm trying to read it using a terminal application on a Windows-based PC. The problem is that I don't know at which baud rate the device is sending the data.
The data I'm getting at higher baud rates doesn't make any sense, so I have narrowed it down to 600 or lower among the standard baud rates available in the terminal.
Is there any software to detect the baud rate, or a method using a microcontroller?

No, not if you want to get done quickly. Ten years of doing this type of task says an oscilloscope or even an inexpensive USB-based logic analyzer is your best solution here. This isn't a software problem yet; it's a signal detection problem. You should be able to clear this up in a few minutes with the right instrument.
I'm assuming you're only doing this exercise because the transmitting part is one for which you cannot find a datasheet. If you had a datasheet in hand, that would clear this up, or at least suggest the possible baud rates you should try.

I tried the Realterm software and found that the data is coming at 300 baud, no parity, 8 data bits, and 1 or 2 stop bits. With the remaining options I get framing errors and break conditions in the software. Thank you all...

Old thread but I thought it might be useful.
Something I did was write a Python/pySerial script that kept cycling through different baud rates (300 - 115200) and listened and filtered for strings that aren't garbage. Something that decides easily, on the assumption that the output would be good clear text, so you know you have the right rate; see the sketch below.
It worked well enough and seemed to find the right rate more than fast enough to hit Esc and drop into the bootloader of my AP.
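A minimal sketch of that approach, assuming pySerial, a hypothetical port name (COM3), and a printable-text heuristic of my own; the original script may have differed:

    import serial

    BAUD_RATES = [300, 600, 1200, 2400, 4800, 9600, 19200, 38400, 57600, 115200]

    def looks_like_text(data, threshold=0.8):
        # Heuristic: mostly printable ASCII suggests we hit the right rate.
        if not data:
            return False
        printable = sum(1 for b in data if 32 <= b <= 126 or b in (9, 10, 13))
        return printable / len(data) >= threshold

    def detect_baud(port="COM3"):
        for baud in BAUD_RATES:
            with serial.Serial(port, baud, timeout=2) as ser:
                ser.reset_input_buffer()
                sample = ser.read(256)  # listen for a short burst
                if looks_like_text(sample):
                    return baud
        return None

    print("Detected baud rate:", detect_baud())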

Related

External memory Data Copy through SPI -- Speed

No amount of experience seems to be enough for the strange issues that pop up in serial communication buses. We are trying to implement a data copy from an external flash into SRAM. Below are the details of how we have configured our system.
Controller : RH850 (D1M1), PLL speed at 60MHz
External Flash (IS25LP128)
SPI speed: 5MHz (clocks observed using oscilloscope)
Data size: 4 MB
Now, in theory, if my SPI is operating at 5 MHz it should copy 5 Mbit/s. We are trying to copy 4 MB, so essentially that is 32 Mbit, and in theory our transfer should take about 7 seconds. OK, we have some implementation overheads. My driver code can accept only up to 64 KB per read call, so we chose to copy 40 KB about 100 times to achieve this, and we run this in a for loop. OK, let me add a whopping 5 seconds of overhead (sorry, RH850!), so 12 seconds in total; well, let's add some more buffer and call it a comfortable 15 seconds (max expected!). But when we run the code, it takes a whole 40 seconds to finish the copy. We have checked the clock: it is 5 MHz as expected, and at least the clocks are continuous.
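(For reference, a quick Python check of the ideal-case arithmetic above; it ignores per-read command and address overhead:)

    # Ideal-case transfer time at 5 MHz SPI (one data line, no overhead).
    spi_bits_per_second = 5_000_000
    data_bits = 4 * 1024 * 1024 * 8          # 4 MB is ~33.5 million bits
    print(data_bits / spi_bits_per_second)   # ~6.7 seconds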
Has anyone here faced this? What should we look into? Well, I know I have a flash driver provided by my vendor to dig into, but before I do that, I wanted to be sure! Any help will be really appreciated.
At first glance, I can think of at least ten things which may be responsible for this. One thing I'm sure of: this problem is complex. There is no simple "one line solution". The main suspect is the part that is not yours: the flash driver. So isolate the "pieces" one by one and verify them, starting from the bottom.
Is there an operating system? Is DMA in use? Is there an issue with memory or resource arbitration/sharing? Are interrupts in use, or polling? Are any higher-priority jobs running? Is data read from registers or memory-mapped? Does the driver use a generic SPI peripheral or a dedicated serial-flash controller (I don't know the RH850; some microcontrollers have one)?
Your post is not precise enough, so maybe these questions will help you. What would I do? Write my own driver!

Using usb cable for random number generation

I have a thought, but am unsure how to execute it. I want to take a somewhat long USB cable and plug both ends into the same machine. Then I would like to send a signal from one end and time how long it takes to reach the other end. I think the signal should arrive after slightly varying delays, and that would give me random numbers.
Can someone suggest a language in which I could do this the quickest? I have zero experience with sending signals over USB and don't know where or how to start. Any help will be greatly appreciated.
I simply want to do this as a fun at-home project, so I don't need anything official; I just would like to see if this idea can work.
EDIT: What if I store the USB cable in liquid nitrogen, or a substance just as cold, in order to slow down the signal as much as possible? (I have access to liquid nitrogen.)
Sorry I can't comment (not enough rep), but the delay should always be the same through the wire. This might limit the true randomness of your numbers. Plus, the actual delay time in the wire might be shorter than even a CPU cycle.
If your operating system is Windows, you may run into this type of issue:
Why are .NET timers limited to 15 ms resolution?
Apparently the minimum timer resolution on Windows is around 15 ms.
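A quick way to observe this yourself (a small Python sketch; on systems running at the default ~15 ms timer tick, a requested 1 ms sleep rounds up accordingly):

    import time

    samples = []
    for _ in range(20):
        start = time.perf_counter()
        time.sleep(0.001)  # ask for 1 ms
        samples.append(time.perf_counter() - start)

    print("mean actual sleep: %.2f ms" % (sum(samples) / len(samples) * 1000))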
EDIT: In response to your liquid nitrogen edit, according to these graphs, you may have more luck with heat! Interestingly enough...
Temperature vs Conductivity http://www.emeraldinsight.com/content_images/fig/1740240120008.png
I want to take a somewhat long usb cable and plug both ends into the same machine.
Won't work. A USB connection is always Host -> Device, and a PC can only be a host. And the communication uses predictable 1 ms intervals - bad for randomness.
Some newer microcontrollers have both an RNG and USB on chip; that way you can make a real USB RNG.
What if I store the usb cable in liquid nitrogen or a substance just as cold in order to slow down the signal
The signal would travel a tiny bit faster, as the resistance of the cable is lower.

Hardware: Shortest delay to send an output signal using Windows

I would like to set an output from my computer high or low. This will change roughly 5 times a second. I will be measuring the output on an oscilloscope. The important thing is to make the time between requesting the change in software and the output changing state as short as possible. The shorter the delay, the more accurate my result.
Does anyone know which of the following options has the shortest delay in a Windows environment (it has to be Windows)?
USB
Serial
Parallel Port
Something else?
I could try all three and measure the difference, but presumably someone else has done this already?
Thanks!
Serial and parallel will be lower latency than USB (I would expect), as the "height" of the stack between you and the port is smaller. Measuring the latency would be challenging - how will you know when the bit of code which writes to the port is executed? (One loop-back trick is sketched below.)
However, even with a low potential latency, the jitter induced by windows is likely to be quite large as well.
Vaguely related and quite interesting...
https://superuser.com/questions/419070/transatlantic-ping-faster-than-sending-a-pixel-to-the-screen/419167#419167
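One way to bound the latency in software, sketched here in Python/pySerial under the assumption that you jumper RTS back to CTS on the connector (the port name is hypothetical):

    import time
    import serial

    ser = serial.Serial("COM1")      # hypothetical port
    ser.rts = False
    time.sleep(0.1)                  # let the line settle

    start = time.perf_counter()
    ser.rts = True                   # request the output change
    while not ser.cts:               # busy-wait until the level loops back
        pass
    elapsed = time.perf_counter() - start
    print("round trip: %.0f us" % (elapsed * 1e6))
    ser.close()

Half the round trip approximates the one-way output latency, and repeating the measurement gives a feel for the jitter mentioned above.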
USB can transmit data the fastest by far, but note that raw throughput is not the same thing as request-to-pin latency.

Map device driver code to Logic Analyzer waveform

As per the SDIO specification, the sequence of operations (for a write transaction) is:
Command53 -- CommandLatency -- Command53Response -- ResponseLatency -- startbit -- write-number-of-bytes -- CRC -- endbit -- WriteLatency -- startbit -- CRC -- endbit -- busybit.
During benchmarking of the SDIO UART driver, the time values I got were higher than expected. A lot of latency was found, especially during write transactions.
Reasons for the latency could be the scheduler allocating processor time to other processes, delays in work queues, etc.
I would like to analyze and understand the latency. Maybe understanding the mapping between the device driver code and the Logic Analyzer waveform can provide some clues.
Can somebody shed some light on this?
Thank you.
EDIT 1:
Sorry! I assumed a few things.
In sdio_uart_transmit_chars() there is a call to sdio_out(), which in turn calls sdio_writeb(), and this call writes byte-wise (one byte at a time) to the SDIO UART device. I modified the driver to use sdio_writesb(), i.e. multi-byte mode. This noticeably reduced the time taken to write X bytes. Interestingly, as the size of the write data increased, there was an exponential increase in WriteLatency (as mentioned above).
This latency could be due to many reasons; these are the reasons I would like to understand.
Setup: I am using a Linux (v2.6.32) laptop and a loadable kernel module (a modified sdio_uart.c).
EDIT 2:
Maybe adding 'SDIO' to this question is misleading... (not sure at the moment). The reasons for the delay could be generic to any device driver interacting with hardware, and may be independent of the SDIO write process.
If somebody can point me to a related online resource, I would be happy to explore it and update the results here.
Hope I added more clarity this time. Please comment if the question is still not clear.
Thank you for your time.
EDIT 3:
Yes, I am looking at the signals on a Logic Analyzer (LA), and there are longer delays during and between writes than I expected.
To give an idea about time values:
For a 512-byte transfer: at the hardware level the write should theoretically take 50 microseconds (µs); however, in reality I got 200 µs.
This gap of 150 µs is what I want to understand.
Note:
1) I am rounding off the time values to simplify the case.
2) All the time values are measured at the kernel level; no user-space issue is involved here.
One thing worth looking at is whether your SD interface works by DMA, such that the driver can program the state machine and it then runs by itself, or whether getting the message out requires repeated servicing by the driver, which might be delayed by other kernel obligations.
You could also check whether there is an I/O bottleneck; for example, is the SD interface, or whatever bus it hangs off, also used for something else?
Finally, you could look for ways to increase the priority. At the extreme, you could switch to a real-time SD driver rather than a normal one.

Sending a (serial) break using the Windows (XP+) API

Is there a better way to send a serial break than the SetCommBreak - delay - ClearCommBreak sequence?
I have to communicate with a microcontroller that uses a serial break as the start of a packet at 115k2, and the SetCommBreak approach has two problems:
At 115k2 the break is well below 1 ms, and it gets timing-critical.
Since the break must be embedded in the packet stream at the correct position, I expect trouble with the FIFO.
Is there a better way of doing this, without moving the serial communication to a thread without a FIFO? The UART is typically a 16550+.
I have a choice in the sense that the microcontroller setup can be switched (with other firmware) to a more conventional packet format, but the manual warns that the "break" method provides hardware integrity checking of the serial link.
The compiler is Delphi (2009/XE), but any code or even just a reference is welcome.
The short answer is that serial programming with Windows is fairly limited :-(
You're right that the normal way of sending a break is with SetCommBreak(), and yes, you have to handle the delay yourself - which tends to mean the break ends up substantially longer than it needs to be. The good news is that this doesn't usually matter - most devices expecting a break will treat a much longer break in exactly the same way as a short one.
In the event that your microcontroller is fussy about the precise duration of the break, one way of achieving a shorter, precisely-defined break is to change the baud rate on the port to a slower rate, send a zero byte, then change it back again.
The reason that this works is that a byte sent to the serial port is sent as (usually) one start bit (a zero), followed by the bits in the byte, followed by one or more stop bits (high bits). A 'break' is a sequence of zero bits that is too long to be a byte - i.e. the stop bits don't come in time. By choosing a slower baud rate and sending a zero, you end up holding the line at zero for longer than the receiver expects a byte to be, so it interprets it as a break. (It's up to you whether to determine the baud rate to use by precise calculation or trial-and-error of what the microcontroller seems to like :-)
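For what it's worth, a minimal sketch of that baud-drop trick in Python/pySerial (the port name and rates are assumptions; in Delphi you would make the equivalent Win32 calls):

    import serial

    ser = serial.Serial("COM1", 115200)  # hypothetical port

    def send_short_break(ser):
        # At 57600 a 0x00 byte holds the line low for 9 bit times,
        # which a 115200 receiver sees as longer than one character
        # frame - a framing error, i.e. a break.
        ser.flush()               # wait for the transmit FIFO to drain
        ser.baudrate = 57600      # drop to half rate
        ser.write(b"\x00")        # start bit + 8 zero data bits, all low
        ser.flush()               # wait until the zero byte is out
        ser.baudrate = 115200     # restore the real rate

    send_short_break(ser)

The flush() calls matter for exactly the reason given below: the break must not be queued behind data still sitting in the transmit FIFO.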
Of course, either method (SetCommBreak() or baud changing) requires you to know when all data has been sent out of the serial port (i.e. there's nothing left in the transmit FIFO). This nice article about Windows serial programming describes how to use SetCommMask(), WaitCommEvent(), etc. to determine this.