Problems setting baud rate in MATLAB on macOS

I am trying to increase the baud rate for serial communication to a microcontroller using Matlab and it seems Matlab does not want to cooperate. My very basic setup in Matlab is:
s = serial('/dev/tty.usbserial-A104VT0Q')
set(s,'BaudRate',256000)
fopen(s)
I get this error:
Open failed: BaudRate could not be set to the specified value.
This works fine (no error) if the baud rate is 115200 or a lower standard baud rate, but I get this error for higher standard baud rates of 128000 or 256000 (Matlab lists standard rates here) or for non-standard rates. Why is this happening and how can I set the baud rate higher?
If I set the baud rate from the microcontroller to 250000 and use the Arduino serial monitor at the same rate, it seems to work fine (can get successful serial communication of bytes), so I don't see that there is a hardware issue preventing this rate from being set. So I suspect this is some quirk of Matlab, but my searching online hasn't found a solution.
More about my setup: using an Atmel ATUC3C1512C MCU on a custom board, the USART is going through an FTDI FT232RL (rated for up to 3 Mbaud) to USB, into a Macbook Pro running OS X 10.11.6, running Matlab R2016a.
Again, the whole setup works at a baud rate of 115200, but I would like to increase the rate for the transmission of lots of data after a test is run and I can't figure out what's preventing me from setting it higher. Thanks.
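For what it's worth, the FT232R itself is not the bottleneck: per FTDI's baud rate app note it derives baud from a 3 MHz base clock with divisors in steps of 1/8, so 250000 is generated exactly. The error is more likely the OS X driver or MATLAB rejecting rates outside the classic termios set. A quick sketch of the hardware side, assuming that documented divisor scheme:

```python
# Sketch: which baud rates the FT232R can generate, assuming its documented
# 3 MHz base clock and divisors in steps of 1/8 (FTDI app note AN232B-05).
BASE_CLOCK = 3_000_000

def closest_ftdi_baud(requested):
    divisor = round(BASE_CLOCK / requested * 8) / 8   # nearest eighth step
    actual = BASE_CLOCK / divisor
    error_pct = abs(actual - requested) / requested * 100
    return actual, error_pct

for baud in (115200, 250000, 256000):
    actual, err = closest_ftdi_baud(baud)
    print(f"{baud:>6} -> {actual:9.1f} baud ({err:.2f}% error)")
# 250000 comes out exact (3 MHz / 12); the others are within ~0.3%
```

So any of these rates is comfortably inside a UART's tolerance; the rejection happens in software before the chip is ever asked.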

Related

Is it safe to set SPI_CLOCK clock speed of 16 MHz on Arduino Uno?

I have an Arduino Uno R3 (actually, distributed from Elegoo but has the same exact components) and I thought about burning the ATMega 8 chip on it with a bootloader, using the "Arduino as ISP" feature on the Arduino IDE.
I looked at the specs for the ATMega 8 chip, and I would like to just ask - everywhere I've looked online, it says the default CPU clock speed is 16MHz, which makes sense because of the crystal clock onboard running at 16MHz. However, I'm not sure the code I have already written is safe:
#define SPI_CLOCK (16000000/6) // Internal clock speed 16 MHz for Arduino UNO.
I think that this code will be fine considering the specs. The example told me to set SPI_CLOCK to a value of 1000000/6, which is slow enough for an ATtiny85 (at 1 MHz), but since I want to use the full functionality of the crystal I have onboard and want a faster clock speed, is it safe to set SPI_CLOCK directly to 16000000/6?
Any help will be appreciated.
Thanks!
Anyways, AterLux answered my question:
Setting SPI clock speed on the programmer only affects how fast you can flash the device. It does not change how the flashed code works – AterLux
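To make AterLux's point concrete: the only constraint on the programmer's SCK is the target's serial-programming interface, which requires SCK to stay below roughly a quarter of the target's clock; the ArduinoISP example uses target clock / 6 for margin. A sketch of that rule of thumb (the /4 limit and /6 margin are the assumptions here):

```python
# Sketch: AVR serial-programming rule of thumb -- the programmer's SCK must
# stay below 1/4 of the target's clock; ArduinoISP uses /6 for margin.
def isp_sck_ok(sck_hz, target_clock_hz):
    return sck_hz < target_clock_hz / 4

SPI_CLOCK = 16_000_000 // 6                # the questioner's setting, ~2.67 MHz
print(isp_sck_ok(SPI_CLOCK, 16_000_000))   # fine for a 16 MHz target
print(isp_sck_ok(SPI_CLOCK, 1_000_000))    # far too fast for a 1 MHz ATtiny
```

So 16000000/6 is safe for a 16 MHz target, but would be the wrong value if the same sketch were later pointed at a 1 MHz part.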

STM32F411 I need to send a lot of data by USB with high speed

I'm using an STM32F411 with the USB CDC library, and the max speed for this library is ~1 Mb/s.
I'm creating a project where I have 8 microphones connected to ADC lines (this part works fine). I need a 16-bit signal, so I'm increasing accuracy by summing the first 16 readings from one line (the ADC gives only a 12-bit signal). In my project I need 96k 16-bit samples for one line, so that's 0.768M samples for all 8 lines. This data needs 12,000 Kbit of space, but the STM32 has only 128 KB of SRAM, so I decided to send about 120 chunks of 100 Kbit of data in one second.
The conclusion is I need ~11.72 Mb/s to send this.
The problem is that I'm unable to do that because CDC USB limits me to ~1 Mb/s.
The question is how to increase the USB speed to 12 Mb/s on the STM32F4. I need some pointer or library.
Or maybe should I set up "audio device" in CubeMX?
If the small b in your question means bytes, the answer is: it is not possible, as your micro has a Full Speed USB peripheral whose maximum speed is 12 Mbits per second.
If it means bits, then your ~1 Mb/s assumption is wrong, but you will still not reach a 12 Mbit payload transfer rate.
You may try to write your own class (only if b means bits), but I'm afraid you will not find a ready-made library. You will also need to write the device driver on the host computer.
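A quick check of the arithmetic above, using the figures stated in the question, shows why the answer is pessimistic even on the bit interpretation:

```python
# Quick check of the bandwidth arithmetic in the question above.
lines = 8
samples_per_line_per_s = 96_000
bits_per_sample = 16

required_bps = lines * samples_per_line_per_s * bits_per_sample
print(required_bps)                   # 12288000 bits per second
print(required_bps / (1024 * 1024))   # 11.71875 -> the ~11.72 Mb/s figure

USB_FS_RAW_BPS = 12_000_000           # USB Full Speed raw signalling rate
print(required_bps > USB_FS_RAW_BPS)  # True: over the raw FS rate even
                                      # before any protocol overhead
```

The requirement exceeds the raw Full Speed line rate before counting packet framing, so no CDC tweak can close the gap; a High Speed PHY or on-device reduction of the data is needed.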

Does a COM port on a Windows PC indicate the bit rate, or the baud rate?

If you search around the internet, you can easily find websites, images, and many (YouTube) videos that explain the various properties of COM/serial/RS232 ports. As far as I can tell, most of these state that the COM port dialogue box shows the baud rate (and not just on Windows), such as here, here and even on Sparkfun here. And this seems clearly false, since the dialogue explicitly states the bit rate. Here's an image from my Windows 8.1 PC as well:
And we know that bit rate isn't the same as baud rate. Also, numerous times I've heard people, e.g. in YouTube videos, talk about messing around with the "baud rate" on a Windows PC. Now I'm confused. What is going on here? It clearly states the bit rate, doesn't it? Am I missing something?
Despite being marked "bits per second", that dialog actually displays baud as a rate in symbols per second. (Symbols include data bits but also start, stop, and parity. For serial ports these are often also called "bits".)
Besides framing symbols, the other cause for a difference between bit rate and baud would be multilevel signalling -- however this doesn't apply to PC serial ports since they only use binary signalling, therefore one data symbol = one bit. Don't be confused by the fact that many serial-attached modems use a larger signal constellation, this refers to the link between the modem and computer, not between two modems.
The selections shown in the image in the question will result in 9600 baud, but only 960 bytes per second. (1 byte = 8 bits but due to start and stop intervals, the serial port sends 10 symbols per byte)
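The 9600 baud to 960 bytes per second figure generalises; a small sketch of the framing arithmetic:

```python
# Sketch: effective byte rate of an asynchronous serial link, counting the
# start/stop/parity framing symbols alongside the data bits.
def bytes_per_second(baud, data_bits=8, parity_bits=0, stop_bits=1):
    symbols_per_byte = 1 + data_bits + parity_bits + stop_bits  # 1 start bit
    return baud / symbols_per_byte

print(bytes_per_second(9600))                              # 960.0 for 8N1
print(bytes_per_second(9600, parity_bits=1, stop_bits=2))  # 800.0 for 8E2
```

With heavier framing (parity, two stop bits) the same baud rate delivers even fewer data bytes, which is exactly the gap between "bits per second" on the dialog and useful throughput.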
According to this answer:
What is the difference between baud rate and bit rate?
It looks like it's due to the fact that with early analog phones, bps = baud rate, i.e. 1 symbol = 1 bit. That would lead to the assumption that a UI designer at some point simply mixed the terms, expecting that COM ports were going to be used to plug in modems.
Modems don't use a strict digital transmission method, but instead use techniques like FSK, which allow one baud (your "symbol") to carry more than one bit of binary data. A phone line has a high-frequency limit of about 3300 Hz. If that were the cutoff, your modem couldn't send more than 2400 baud. By shifting the signal within one cycle, it's able to transmit more than 1 bit per baud. Pack 4 bits into each symbol and you raise the bit rate from 2400 to 9600.
At least that's what I remember from some 20 years ago.
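The modem arithmetic in that answer amounts to bit rate = baud times bits per symbol, where bits per symbol is the base-2 logarithm of the number of distinguishable signal states:

```python
import math

# Sketch: bit rate vs. baud rate under multilevel signalling.
def bit_rate(baud, signal_states):
    bits_per_symbol = int(math.log2(signal_states))
    return baud * bits_per_symbol

print(bit_rate(2400, 2))    # 2400: binary signalling, 1 bit per symbol
print(bit_rate(2400, 16))   # 9600: 16 states, 4 bits per symbol
```

A PC serial port always uses two signal states, which is why its bit rate and baud rate coincide (framing symbols aside).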

Non-standard comport baudrates in windows

Do the windows built in com port drivers support non-standard baudrates? (actually does windows have a built in driver for com1 & 2?)
The reason I ask is I'm having trouble getting a reliable connection to a device that uses the unusual baud rate 5787. The device and PC talk briefly, then seem to lose the dialogue, and then get it again. Once a long message is sent, it gets lost at the other end; a short time later the dialogue is back. This sounds to me like the classic baud rate mismatch. Not quite close enough to be reliable, but close enough that some data gets through.
If I use an inexpensive PCI serial board it works without problems. It's only computers that use on board serial I've found don't work properly.
Baudrates in a PC are controlled by a UART and a crystal. The crystal frequency determines what baudrates the serial port can generate. The baudrate is often generated by a divide by 16 counter. The crystal frequency for a standard PC is normally 1.8432 MHz. Dividing that by 16 gives you 115200 which is usually the maximum the com port can do.
Inside the UART is a divisor latch (accessed via the DLAB bit). This further divides the clock. So essentially, to get 5787 baud you're dividing 115200 by 5787, which gives you 19.906687...
Since that's close to 20, you'd load the divisor latch with 20. 115200 / 20 gives you 5760. Therefore you're probably getting 5760 baud out of the PC com port. That's probably enough of a difference to cause the issue you're seeing.
No, the difference from 5760 to 5787 is nowhere near enough to explain any sort of problem. UARTs identify the start of a byte from the leading edge of the start bit, then sample the data in the middle of each bit. This means they are tolerant of errors in baud rate up to the point where the predicted middle lands on an edge. That's a half-bit error over one full byte, because each byte has a stop bit, so there's a re-synchronise event per byte. One half bit in ten bits (8 data, one start, one stop) is 5%. The difference from 5760 to 5787 is only about 0.5%, so miles inside the safe region.
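The divisor arithmetic from both answers can be put in one place (assuming the standard 1.8432 MHz / 16 = 115200 base clock and an integer divisor latch):

```python
# Sketch: what baud a classic PC UART actually produces for a requested
# rate, and whether the error is inside the ~5% half-bit tolerance.
BASE = 115_200   # 1.8432 MHz crystal divided by 16

def nearest_uart_baud(requested):
    divisor = round(BASE / requested)   # integer divisor latch value
    actual = BASE / divisor
    error_pct = abs(actual - requested) / requested * 100
    return divisor, actual, error_pct

divisor, actual, err = nearest_uart_baud(5787)
print(divisor, actual)        # 20 5760.0
print(f"{err:.2f}% error")    # 0.47% -- well inside the ~5% tolerance
```

So a 0.5% mismatch alone cannot explain the dropped messages; the onboard port's problem must lie elsewhere (flow control, FIFO settings, or a driver issue are more plausible suspects).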

Can serially transmitting faster baud rate ruin a device?

I have a TV that accepts 9600 8N1 serial transmission to control it.
I have a USB to serial cable that I use to control this TV with my computer (to control things such as power off, power on, volume, etc). I just got the cable yesterday and I got everything to work. Generally by installing the drivers and going:
stty -F /dev/tty.usbserial 9600
echo -ne '\x08\x22\x01\x00\x01\x00\xd4' > /dev/tty.usbserial
^^ the above echo line turns the volume down on my tv by sending hex string serially. (I have a Samsung smart-tv)
Now I might have run the second line without the first a few times when opening a new terminal window. So I think it may have transmitted much faster than 9600 to the TV. And now my TV doesn't accept ANY serial commands, not even at 9600. Is there a chance I could've fried my TV by transmitting at a faster baud rate?
