I purchased a NodeMCU 0.9; it prints gibberish at 9600 or 115200 baud.
But at 74880 baud I get this error:
ets Jan 8 2013,rst cause:2, boot mode:(3,6)
load 0x4010f000, len 1384, room 16
tail 8
chksum 0xef
csum 0xef
csum err
ets_main.c
If someone knows how to correct this, please respond.
I found the answer: it's a cheap Chinese clone, and the code is erased after uploading, so the fault is with the NodeMCU board itself.
A proper generic board should work at 9600 baud.
I have an Arduino Uno R3 (actually distributed by Elegoo, but with the same exact components) and I thought about burning a bootloader onto an ATmega8 chip with it, using the "Arduino as ISP" feature in the Arduino IDE.
I looked at the specs for the ATmega8, and I would like to ask: everywhere I've looked online, it says the default CPU clock speed is 16 MHz, which makes sense because of the onboard crystal running at 16 MHz. However, I'm not sure the code I have already written is safe:
#define SPI_CLOCK (16000000/6) // 16 MHz crystal clock on the Arduino UNO.
I think that this code will be fine considering the specs. The example told me to set SPI_CLOCK to a value of 1000000/6, which is slow enough for an ATtiny85 (at 1 MHz), but since I want to use the full functionality of the crystal I have onboard and want a faster clock speed, is it safe to set SPI_CLOCK directly to 16000000/6?
Any help will be appreciated.
Thanks!
Anyways, AterLux answered my question:
Setting SPI clock speed on the programmer only affects how fast you can flash the device. It does not change how the flashed code works – AterLux
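For completeness: as I read the comments in the stock ArduinoISP example (where this define comes from), the /6 exists because each SCK pulse must span more than two CPU cycles of the target chip, so dividing the target's clock by 6 leaves a margin. A hedged side-by-side of the two settings:

// SPI_CLOCK only sets programming speed; the constraint is that each SCK
// pulse must last more than 2 CPU cycles of the *target* chip.
#define SPI_CLOCK (1000000/6)     // ArduinoISP default: safe even for a 1 MHz target
//#define SPI_CLOCK (16000000/6)  // only valid if the target truly runs at 16 MHz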
I am implementing a simple visible light communication module with two Arduinos, one as transmitter and one as receiver, sending a short text message of 120 characters. I have used Manchester encoding with on-off keying (OOK) modulation.
Altogether, in my message frame, with Manchester encoding plus preambles and an end-of-frame byte, there are 2480 bits. I have set one bit period to 500 microseconds. On the receiver side, I sample each bit four times, every (500/4) = 125 microseconds. Since each bit is 500 μs, the transmitter is sending 2000 bits/s, so a baud rate of 9600 bits/s should work. However, 9600 is not working, while any baud rate from 38400 up to 115200 works, and I can properly decode the short message at the receiver.
What is the explanation for this behavior? Why does a baud rate of 9600 not work even though I am only transmitting 2000 bits per second?
Further information: I have set the prescaler to 128, so the ADC sampling frequency is (16 MHz/128)/13 = 9.6 kHz.
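(For concreteness, here is a minimal sketch of the kind of transmitter loop described above. It is illustrative only: the pin number, the Manchester polarity and the MSB-first bit order are assumptions, not the asker's actual code.)

const int LED_PIN = 13;                   // hypothetical transmit LED pin
const unsigned int BIT_PERIOD_US = 500;   // one encoded bit = 500 us, as above

void sendManchesterBit(bool b) {
  // Assumed convention: 0 = light ON then OFF, 1 = OFF then ON
  digitalWrite(LED_PIN, b ? LOW : HIGH);
  delayMicroseconds(BIT_PERIOD_US / 2);   // first half-bit
  digitalWrite(LED_PIN, b ? HIGH : LOW);
  delayMicroseconds(BIT_PERIOD_US / 2);   // second half-bit
}

void sendManchesterByte(uint8_t v) {
  for (int i = 7; i >= 0; i--)            // MSB first (an assumption)
    sendManchesterBit((v >> i) & 1);
}

void setup() { pinMode(LED_PIN, OUTPUT); }
void loop()  { sendManchesterByte('A'); delay(1000); }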
When you suddenly started talking about "baud rates", it implied that you're using the hardware serial port on the Arduino. If so, then realise that feeding a 2,000 bit/s signal into a device expecting 9,600 bit/s has problems.
The way that an asynchronous UART works is that it waits for a start signal (bit), then decodes the next (typically) 9 signals at the current bit rate. It then checks that the 9th bit is a stop bit; if it isn't, it discards the byte.
Since your signal only changes at most once every 9600/2000 = 4.8 UART bit periods, the odds are that the 9th "stop" bit will be of the wrong sense, and the data will be lost.
Below is an ASCII diagram for the timing that I'm talking about.
I'll use the bitstream 00101101 for the signal produced by the circuit, with a . as a zero-width visual separator between signal bits;
I'll use a ^ to point out where the UART is sampling the bits;
I'll use a * to indicate a "correct" byte (insofar as the byte ends in a correct stop bit);
I'll use a ! to indicate an "incorrect" byte (insofar as the byte ends in an incorrect stop bit);
Of course, I'll assume a baud rate of 10,000 bit/s (an even 5 samples per signal bit rather than 4.8...)
00000.00000.11111.00000.11111.11111.00000.11111
^^^^^.^^^^!.......^^^^^.^^^^*.......^^^^^.^^^^*
This sequence would result in the UART recording the following 3 bytes:
Error
0xF0 (remember, UART data is sent LSB first...)
0xF0 (LSB first again...)
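To make the diagram concrete, here is a small stand-alone C++ simulation (an illustration, not Arduino code) that applies the same simplified UART rules, one sample per UART bit period, to the oversampled stream and prints exactly the Error, 0xF0, 0xF0 sequence shown above:

#include <cstdint>
#include <cstdio>
#include <string>

int main() {
    // The 2,000 bit/s signal 00101101, oversampled 5x to the assumed
    // 10,000 bit/s UART rate, exactly as in the diagram above.
    std::string line;
    for (char c : std::string("00101101"))
        line.append(5, c);

    // Simplified UART: a 0 while idle is a start bit; read 8 data bits
    // LSB first, then require a 1 as the stop bit or flag an error.
    for (size_t i = 0; i < line.size(); ) {
        if (line[i] == '1') { ++i; continue; }   // idle line, keep waiting
        if (i + 9 >= line.size()) break;         // not enough samples left
        uint8_t byte = 0;
        for (int b = 0; b < 8; ++b)
            byte |= (uint8_t)((line[i + 1 + b] - '0') << b);
        if (line[i + 9] == '1')
            printf("byte 0x%02X\n", byte);       // prints 0xF0 twice
        else
            printf("framing error\n");           // the first frame
        i += 10;                                 // move past the stop bit
    }
    return 0;
}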
I am trying to increase the baud rate for serial communication with a microcontroller using Matlab, and it seems Matlab does not want to cooperate. My very basic setup in Matlab is:
s = serial('/dev/tty.usbserial-A104VT0Q')
set(s,'BaudRate',256000)
fopen(s)
I get this error:
Open failed: BaudRate could not be set to the specified value.
This works fine (no error) if the baud rate is 115200 or a lower standard rate, but I get this error for the higher standard rates of 128000 or 256000 (Matlab lists standard rates here) or for non-standard rates. Why is this happening, and how can I set the baud rate higher?
If I set the baud rate from the microcontroller to 250000 and use the Arduino serial monitor at the same rate, it works fine (I get successful serial communication of bytes), so I don't see a hardware issue preventing this rate from being set. I suspect this is some quirk of Matlab, but my searching online hasn't found a solution.
More about my setup: using an Atmel ATUC3C1512C MCU on a custom board, the USART is going through an FTDI FT232RL (rated for up to 3 Mbaud) to USB, into a Macbook Pro running OS X 10.11.6, running Matlab R2016a.
Again, the whole setup works at a baud rate of 115200, but I would like to increase the rate for transmitting lots of data after a test is run, and I can't figure out what's preventing me from setting it higher. Thanks.
My application on the PC sends a file (2 MB) in chunks of 1 KB to an embedded device.
I use the FTDI Windows driver and the classic FT_Write() API function, as my code is cross-platform.
Note: the issues below appear when I use a 1 KB chunk size. A smaller chunk (I tried 64 bytes) works fine.
The problem is that the function returns "0 bytes sent" every couple hundred packets and then gets stuck. I found a workaround: purging both TX and RX, followed by an FT_ResetDevice() call, recovers the chip. It still happens every couple hundred packets, but at least I can send the whole file (2 MB).
But when I use a USB isolator (http://www.bb-elec.com/Products/USB-Connectivity/USB-Isolators/Compact-USB-Port-Guardian.aspx), the workaround fails.
I believe my workaround is not a graceful solution.
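A minimal sketch of that workaround, for reference (D2XX classic interface; the retry count, the Sleep() pause and the sendChunk name are illustrative simplifications, not my production code):

#include <windows.h>
#include "ftd2xx.h"

// When FT_Write() reports 0 bytes sent, purge both FIFOs and reset the
// device, then retry the chunk. Error handling is trimmed for brevity.
bool sendChunk(FT_HANDLE h, const char* data, DWORD len) {
    for (int attempt = 0; attempt < 3; ++attempt) {
        DWORD written = 0;
        if (FT_Write(h, (LPVOID)data, len, &written) == FT_OK && written == len)
            return true;                         // chunk went out normally
        FT_Purge(h, FT_PURGE_RX | FT_PURGE_TX);  // the recovery described above
        FT_ResetDevice(h);
        Sleep(10);                               // give the chip a moment
    }
    return false;
}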
Note: I use a large chunk because of a suggestion I found in the FTDI application note below:
When writing data to an FTDI device, as much data as possible should
be buffered in the application and written to the device in a single
write function call (either WriteFile for a VCP application using the
Win32 API, FT_Write if using the D2XX classic interface or
FT_WriteFile if using the D2XX FT_W32 interface). The result of this
is that the data will be written to the device with 64 bytes per USB
packet.
Any idea what the proper fix for these issues would be? Is it related to FTDI initialization? My driver version is 2.12.16.0 (3/9/2016).
I saw the same problem of FT_Write() not working right when too much data is passed, while working on the library for my USB device, Nusbio. I mostly work in synchronous bit-banging mode rather than UART, but after all it is the same hardware, driver and API.
There are the USB 2.0 specification and the FTDI FT232RL specification, and then there is the reality of the electrons and the bits. The expected transfer speeds never really match, at least at first. In other words, it is complicated (see more below in my referenced blog post).
In 2015 I was under the impression that with the FTDI FT232RL a chunk size of 384 bytes worked well, the number coming from the chip's datasheet (128-byte receive buffer plus 256-byte transmit buffer). A size of 500 bytes would still work, but above 600 bytes things would not.
I later used the FT231X, which has larger buffers (1 KB in total: 512-byte receive buffer and 512-byte transmit buffer), and was able to transfer 1 KB and 2 KB buffers with FT_Write(), therefore more than doubling my transfer speed. But above 2 KB things would not work.
In 2016, having read everything you can read about FTDI's USB 2.0 full-speed chips, I came to the conclusion that FT_Write() should support up to 64 KB (see the datasheets for the FT232RL, FT231X, FT232H, FT260 and FT4222).
I also did some research on pushing serial port communication from .NET faster than 115200 baud. Somehow I was able to update my C# library to send data in 32 KB buffers via FT_Write(), and it is working with both the FT232RL and the FT231X chips, but I can't tell you what changed. I was probably not completely understanding the ins and outs of the USB 2.0 full-speed FTDI technology.
For example, let's say you are using the FT232RL and transferring 384 bytes at a time with FT_Write(). Knowing that there is at least a 1 millisecond latency in USB 2.0 full speed whatever you do, from a USB point of view you are transferring at most 384 × 1000 / 1024 ≈ 375 KB/s in theory. That said, what baud rate does your embedded device support, and what baud rate are you using?
The FT232RL's maximum baud rate is 900,000, which would give you only 900000/(1+8+1) = 90,000 bytes/s, or about 87 KB/s. Right away you can tell there is going to be some problem; maybe the FTDI driver takes care of it, maybe not. I can't tell.
Redo the math based on the baud rate supported by your embedded device and a 384-byte buffer sent 1,000 times per second, then slow down your USB writes with a sleep() to match your baud rate. That is where I would start.
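As a concrete sketch of that last suggestion (every number here is an assumption, to be replaced with your device's real baud rate and a chunk size that suits your chip):

#include <windows.h>
#include "ftd2xx.h"

// Pace FT_Write() so USB never outruns the UART: send one chunk, then
// sleep for roughly the time the UART needs to drain it.
void sendPaced(FT_HANDLE h, const char* data, DWORD total) {
    const DWORD  CHUNK = 384;         // e.g. sized to the FT232RL FIFOs
    const double BAUD  = 115200.0;    // your embedded device's actual rate
    // 10 bit times per byte: start + 8 data + stop
    const DWORD msPerChunk = (DWORD)(CHUNK * 10.0 * 1000.0 / BAUD) + 1;

    for (DWORD off = 0; off < total; off += CHUNK) {
        DWORD len = (total - off < CHUNK) ? (total - off) : CHUNK;
        DWORD written = 0;
        FT_Write(h, (LPVOID)(data + off), len, &written);
        Sleep(msPerChunk);            // let the UART catch up
    }
}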
Do the Windows built-in COM port drivers support non-standard baud rates? (Actually, does Windows even have a built-in driver for COM1 and COM2?)
The reason I ask is that I'm having trouble getting a reliable connection to a device that uses the unusual baud rate of 5787. The device and PC talk briefly, then seem to lose the dialogue, then pick it up again. Once a long message is sent, it gets lost at the other end; a short time later the dialogue is back. This sounds to me like a classic baud rate mismatch: not quite close enough to be reliable, but close enough that some data gets through.
If I use an inexpensive PCI serial board it works without problems. It's only computers using on-board serial that I've found don't work properly.
Baud rates in a PC are controlled by a UART and a crystal. The crystal frequency determines what baud rates the serial port can generate; the rate is typically generated by a divide-by-16 counter. The crystal frequency in a standard PC is normally 1.8432 MHz, and dividing that by 16 gives you 115200, which is usually the maximum the COM port can do.
Inside the UART is a divisor latch (accessed via the DLAB bit). This further divides the clock. So essentially, to get 5787 baud you're dividing 115200 by 5787, which gives you 19.906687...
Since that's close to 20, you'd load the divisor latch with 20, and 115200 / 20 gives you 5760. Therefore you're probably getting 5760 baud out of the PC COM port. That's probably enough of a difference to cause the issue that you're seeing.
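The same arithmetic as a small stand-alone check:

#include <cmath>
#include <cstdio>

int main() {
    const double base   = 115200.0;  // 1.8432 MHz crystal / 16
    const double wanted = 5787.0;
    int    divisor = (int)std::lround(base / wanted);  // 19.9066... -> 20
    double actual  = base / divisor;                   // 5760 baud
    printf("divisor %d -> %.0f baud (%.2f%% low)\n",
           divisor, actual, 100.0 * (wanted - actual) / wanted);
    return 0;
}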
No, the difference from 5760 to 5787 is nowhere near enough to explain any sort of problem. UARTs identify the start of a byte from the leading edge of the start bit, then sample the data in the middle of each bit. This means they tolerate baud rate error up to the point where the predicted middle of a bit drifts onto an edge.
That is a half-bit error over one full frame, because each frame has a stop bit, so there is a re-synchronisation event per byte. Half a bit in ten bits (8 data, one start, one stop) is 5%. The difference from 5760 to 5787 is only about 0.5%, so miles inside the safe region.