ESP8266 + NodeMCU custom build + ws2812.init() causing reset

I am trying to get some ws2812 lights to work. I am using
NodeMCU custom build by frightanic.com
branch: 1.5.4.1-final
commit: 1885a30bd99aec338479aaed77c992dfd97fa8e2
SSL: false
modules: adc,file,gpio,http,i2c,net,node,ow,rtctime,spi,tmr,uart,websocket,wifi,ws2812
build built on: 2017-05-11 11:48
powered by Lua 5.1.4 on SDK 1.5.4.1(39cb9a32)
When I execute ws2812.init() the board resets with:
> =ws2812.init()
ets Jan 8 2013,rst cause:2, boot mode:(3,7)
load 0x40100000, len 24560, room 16
tail 0
chksum 0xb4
load 0x3ffe8000, len 2296, room 8
tail 0
chksum 0x09
load 0x3ffe88f8, len 136, room 8
tail 0
chksum 0x9d
csum 0x9d
I can call ws2812.write() and I see a signal on the output pin; however, the timing is not correct and the lights don't work.
What am I doing wrong? This is my first ESP8266 project, so I feel a bit clueless.
Thanks for any help.

Those ESP8266 chips are very picky about which pins you can use. Putting a voltage on a pin, or even just having a sensor output connected during boot-up, can cause problems like the one you mentioned. Try not to use GPIO 0, 2 or 15, as also discussed in this post.
The GPIO numbers are not necessarily the same as the pin labels printed on your board, so stay away from pins D3, D4 and D8.
Also, once you start using the WiFi functionality, even more pins become unusable. This can cause very weird behaviour without proper error codes, so be aware of it. I will try to find out for you which pins you can still use when WiFi is enabled.
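For reference, this is the usual NodeMCU devkit silkscreen-label to ESP8266 GPIO mapping, written here as a small Python lookup table (just for illustration; check your own board's pinout diagram):
# NodeMCU devkit label -> ESP8266 GPIO number
# D3, D4 and D8 map to the boot-strapping pins GPIO0, GPIO2 and GPIO15
D_TO_GPIO = {
    "D0": 16, "D1": 5,  "D2": 4,  "D3": 0,  "D4": 2,
    "D5": 14, "D6": 12, "D7": 13, "D8": 15,
}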

Related

Using Pin 26 for ADC?

Using pin 26 for the ADC seems to be discouraged in Toit.
From what I can see on https://docs.espressif.com/projects/esp-idf/en/latest/esp32/api-reference/peripherals/adc.html pin 26 should have an ADC channel, so what is the reason for that?
The ESP32 is limited around ADC2:
Since the ADC2 module is also used by the Wi-Fi, only one of them could get the preemption when using together, which means the adc2_get_raw() may get blocked until Wi-Fi stops, and vice versa.
Therefore I suggest using ADC1 pins if Wi-Fi is used for connectivity.
Since the ADC2 pins can't be used while Wi-Fi is active, you can use the ADC1 pins (see the sketch after this list):
GPIO32 ADC1_CH4
GPIO33 ADC1_CH5
GPIO34 ADC1_CH6
GPIO35 ADC1_CH7
GPIO36 ADC1_CH0
GPIO39 ADC1_CH3
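The question is about Toit, but just to illustrate that the ADC1 pins keep working with Wi-Fi active, here is a minimal MicroPython sketch (GPIO32 is an arbitrary pick from the list above):
# MicroPython on ESP32: read an ADC1 pin, which works even with Wi-Fi enabled
from machine import ADC, Pin

adc = ADC(Pin(32))          # GPIO32 is on ADC1
adc.atten(ADC.ATTN_11DB)    # full ~0-3.3 V input range
print(adc.read())           # raw 12-bit reading, 0-4095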
Check this video to learn more about ESP32 pins and their assigned peripherals:
https://www.youtube.com/watch?v=LY-1DHTxRAk&t=546s

ESP32 MicroPython MAX31865 SPI connection and data read

I need to read temperature data from a MAX31865 using SPI communication. First of all, I tried to read 4 bytes of data:
import machine
import ubinascii
spi = machine.SPI(1, baudrate=5000000, polarity=0, phase=0)
#baudrate controls the speed of the clock line in hertz.
#polarity controls the polarity of the clock line, i.e. if it's idle at a low or high level.
#phase controls the phase of the clock line, i.e. when data is read and written during a clock cycle
cs = machine.Pin(15, machine.Pin.OUT)
cs.off()
cs.on()
data = spi.read(4)
cs.off()
print(ubinascii.hexlify(data))
I tried many times with different code but the result is always the same: b'00000000'.
I am using ESP32 WROOM.
I used this pins:
ESP32 : D12 - D14 - 3V3 - GND - D15
Max31865: SDO - CLK - VIN - GND - CS
I am new to MicroPython and the ESP32.
I don't know what I should do. Are there any suggestions, recommended tutorials, or ideas?
Short answer: see if you can use CircuitPython and its drivers for MAX31865.
Long answer: there's a bunch of stuff to fix. I suspect you've been following the Adafruit tutorial for the MAX31855, but its SPI interface is very different from the MAX31865's.
Your SPI connection is missing the SDI pin. You have to connect it, as the communication is bidirectional. Also, I suggest using the default SPI pinout on the ESP32 side as described in the MicroPython documentation for the ESP32.
The SPI startup looks to be missing stuff. Looking at the SPI documentation, a call to machine.SPI() requires that you assign values to the arguments sck, mosi, miso. Those would be the pins on the ESP32 side where you've connected SCLK, SDI and SDO of the MAX31865 (note mosi means "master out, slave in" and miso means "master in, slave out").
The chip select signal on the MAX is inverted (that's what the bar above CS in the datasheet means). You have to set it low to activate the chip and high to disable it.
You can't just read data out of the chip; it has a protocol you must follow. First you have to request a conversion (the chip converts the RTD's resistance to a digital value). The datasheet for your chip explains how to do that; the relevant info starts on page 13 (it's a bit difficult to read for a beginner, but try anyway, as it's the authoritative source of information for this chip). On a high level, it works like this:
Write to Configuration register a value which initiates the conversion.
Wait for the conversion to complete.
Read from the RTD (Resistance-To-Digital) registers to get the conversion result.
Calculate the temperature value from the conversion result.
The code might be closer to this (not tested, and very likely to not work off the bat - but it should convey the idea):
import time
from machine import Pin, SPI
cs = Pin(15, Pin.OUT)
# Assuming you've rewired according to the default SPI pinout
spi = SPI(1, baudrate=100000, polarity=0, phase=0, sck=Pin(14), mosi=Pin(13), miso=Pin(12))
# Enable chip (CS is active low)
cs.off()
# Prime a 1-shot conversion by writing to the Configuration register.
# Register writes use the address with bit 7 set (register 0x00 -> write address 0x80);
# 0xA0 = VBIAS on + 1-shot (most drivers enable VBIAS a little earlier and wait ~10 ms).
spi.write(b'\x80\xa0')
# Wait for the conversion to complete (takes up to ~66 ms)
time.sleep_ms(100)
# Select the RTD MSBs register (read address 0x01) and read 1 byte from it
spi.write(b'\x01')
msb = spi.read(1)
# Select the RTD LSBs register (read address 0x02) and read 1 byte from it
spi.write(b'\x02')
lsb = spi.read(1)
# Disable chip
cs.on()
# Join the 2 bytes into one 16-bit value
result = msb[0] * 256 + lsb[0]
print(hex(result))
Convert result to temperature according to the section "Converting RTD Data Register Values to Temperature" in the datasheet.
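As a rough illustration of that last step, here is a sketch (untested) assuming a PT100 element and a 430 ohm reference resistor (the value used on Adafruit's MAX31865 breakout; check your board), using the simple linear approximation rather than the full Callendar-Van Dusen equation:
R_REF = 430.0                     # reference resistor on the breakout board (assumption)
R0 = 100.0                        # PT100 resistance at 0 degrees C
rtd_code = result >> 1            # drop the fault bit, keep the 15-bit ADC code
resistance = rtd_code / 32768.0 * R_REF
temp_c = (resistance / R0 - 1.0) / 0.00385   # linear PT100 approximation
print(temp_c)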
Side note 1: here spi = machine.SPI(1, baudrate=5000000, polarity=0, phase=0) you've configured a baud rate of 5 MHz, which is the maximum for this chip. Depending on how you've connected your devices, it may not work. The SPI protocol is synchronous and driven by the master device, so you can set any baud rate you want. Start with a much, much lower value, maybe 100 kHz or so, and increase it after you've figured out how to talk to the chip.
Side note 2: if you want your conversion result faster than the 100 ms sleep in my code, connect the DRDY line from the MAX to the ESP32 and wait for it to go low. That means the conversion is finished and you can read out the result immediately.
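Something along these lines, continuing the sketch above and assuming DRDY is wired to ESP32 GPIO4 (the pin choice is arbitrary):
drdy = Pin(4, Pin.IN)             # DRDY from the MAX31865 (example pin)
spi.write(b'\x80\xa0')            # start the 1-shot conversion as above
while drdy.value() == 1:          # DRDY drops low once the result is ready
    time.sleep_ms(1)
# ...now read the RTD registers as before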

LIRC issue with Raspberry Pi 3 - mode2 -d /dev/lirc0 shows no output

I just installed the latest version of LIRC (0.10.1-5.2) on my Raspberry Pi 3, running Raspbian (Debian Buster).
I am trying to get my Pi to take input from an IR remote using lirc.
I have made the necessary changes to these files:
/etc/lirc/lirc_options.conf
driver = default
device = /dev/lirc0
/boot/config.txt
dtoverlay=gpio-ir,gpio_in_pin=18,gpio_out_pin=17,gpio_in_pull=up
// I set mine up on GPIO pins 17 and 18 instead of 22 and 23
I have checked and cross-checked my circuit. Everything looks okay.
The challenge I'm facing right now is when I test my IR receiver using the following command,
mode2 -d /dev/lirc0
Nothing happens. There's no output at all. No pulses recorded.
Has anyone else experienced this issue?
Any help would be much appreciated.
After spending a great amount of time trying to figure out how to solve this issue, I was finally able to resolve it.
So hopefully my answer will help someone else.
First things first, it's important to note that the infrared device overlay has changed from lirc-rpi to gpio-ir.
Although I already had this change in my /boot/config.txt file, like below:
dtoverlay=gpio-ir,gpio_in_pin=18,gpio_out_pin=17,gpio_in_pull=up
// instead of dtoverlay=lirc-rpi
I just thought it was important to point out.
Since I am trying to get my Pi to take input from an IR remote using lirc, I decided to first test my IR sensor separately, to make sure it works.
To do that, I connected up the sensor like so:
Pin 1 is the output so we wire this to a visible LED and resistor
Pin 2 is ground
Pin 3 is VCC, connect to 3v3
You can find more detailed step-by-step instructions in this tutorial, which also shows how to wire up the circuit.
During this test, my LED lit up each time I pointed a remote at the receiver, which gave me hope that it was working just fine.
The next step was to test the IR receiver on my raspberry pi, which is the challenge I had in the beginning.
I re-wired my circuit, this time:
Pin 1 is DATA, goes to RPi pin 12 (GPIO 18)
Pin 2 is GND, goes to RPI pin 6 (GROUND)
Pin 3 is POWER, goes to RPi pin 1 (3v3)
Then I ran this command sudo /etc/init.d/lirc stop to make sure that service wasn't running.
I then ran the initial command mode2 -d /dev/lirc0, pointed my remote at the receiver and pressed random buttons, and voilà! I could now see pulses on the screen with each button press.
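For what it's worth, mode2 is just reading raw 32-bit samples from the lirc character device; here is a minimal Python sketch of the same idea, assuming /dev/lirc0 is in the default mode2 (pulse/space) mode:
# Read raw pulse/space samples from the LIRC device, roughly what mode2 does
import struct

PULSE_BIT = 0x01000000            # top byte 0x01 marks a pulse, 0x00 a space

with open('/dev/lirc0', 'rb', buffering=0) as dev:
    while True:
        sample = struct.unpack('I', dev.read(4))[0]
        duration = sample & 0x00FFFFFF        # length in microseconds
        kind = 'pulse' if sample & PULSE_BIT else 'space'
        print(kind, duration)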
Like you, I managed to get all the way to receiving pulses/data on the RPi 3, but I seem to have a problem with the output.
I have the USB strip light and my RPi with the IR receiver, so I can monitor which captured data corresponds to which button pressed on the remote keypad. That works just fine.
However, if I push the ON button I get data, and if I push the ON button again I get another set of data. The two sets of data don't match, with both mode2 and mode2 -r.
I get the feeling I'm missing a method to decode the output; I've noticed there is a huge number of manufacturers and they all have distinct code sets.
Here is one thread that exactly matches what I have (24-key IR remote):
http://woodsgood.ca/projects/2015/02/13/rgb-led-strip-controllers-ir-codes/
However, I don't see the same set of codes.
Try changing the device: mode2 -d /dev/lirc1
I also faced this issue.

How to stop debug info from printing to serial esp8266 websocketioclient

I'm building a project based on Socket-IO-client from here https://github.com/timum-viw/socket.io-client
I need to turn off all serial data being printed except what I send to serial myself. I've tried removing #define USE_SERIAL, removing or changing USE_SERIAL to use the Serial1 port, and also edited SocketIOClient.cpp to remove the debug statements there, but I'm still getting a lot of this...
[WS][0][sendFrame] sending Frame Done (4506us).
[WS][0][handleWebsocketWaitFor] size: 2 cWsRXsize: 0
[readCb] n: 2 t: 597463
[WS][0][handleWebsocketWaitFor][readCb] size: 2 ok: 1
[WS][0][handleWebsocket] ------- read massage frame -------
[WS][0][handleWebsocket] fin: 1 rsv1: 0 rsv2: 0 rsv3 0 opCode: 1
[WS][0][handleWebsocket] mask: 0 payloadLen: 1
[readCb] n: 1 t: 597478
[WS][0][handleWebsocket] text: 3
I want to receive a websocket message and print it to serial so an Arduino Mega can read it from that serial line. I don't want to have to parse all the serial output to find the message I actually need.
I'm using a basic NodeMCU ESP8266-12E module, coding with Arduino IDE 1.8.5. I'm sure there is something easy I'm missing.
It is very simple actually. In socketIoClient.h you have a define like so:
#define SOCKETIOCLIENT_DEBUG(...) Serial.print(__VA_ARGS__);
You can change it to:
#define SOCKETIOCLIENT_DEBUG(...) do{} while(0);
or simply:
#define SOCKETIOCLIENT_DEBUG(...)
That should rid you of the debug messages, but I reckon the ESP8266 by itself prints some debug messages to the UART too at certain events, like connecting to or disconnecting from an AP. You may need to follow this from Espressif's API reference:
"os_install_putc1((void *)uart1_write_char) in uart_init will set os_printf to be output from UART 1, otherwise, os_printf default output from UART 0."
os_printf is used as the default printing function by the OS as well as by many applications. Arduino's Serial lib should use UART0 directly and not os_printf, so doing the above would only rid you of the messages the OS produces.
I knew it was simple and I was overlooking something. Solomon's answer makes sense, but I solved it right before checking Stack Overflow and seeing his post. The solution was a combination of his answer (or commenting out the debug statements in the .cpp file) plus going into the Arduino IDE Tools menu and setting Debug port to "Disabled" and Debug Level to "None". I had already tried that before, but a mistake in my code had made me think it turned off all serial communication. Revisiting it with new code produced the desired result of only printing my
Serial.print();
statements. Thanks for the help though.

GPIO pins will not toggle (high/low) on beagleboard xm

I am trying to use the expansion header to control a couple motors and auxiliary task mechanism. For this I am using the appropriate pins as GPIO and merely attempting to send high or low signals as needed by the robot. (For instance, I might need the robot to move forward and so I'd send high signals on both sets of pins, whereas if I needed the robot to turn I'd send a high signal to one pin and a low to the other.)
However, the problem is that the pins will only stay high! I've followed the conventions for sysfs just via the terminal, and, although I'm able to set the "values", "active_lows", etc. to 0 or 1, I can't actually get the pins to send 0V. After checking the beagle.h file I used for u-boot it looks like the multiplexer mode is configured correctly. This is also reflected when I get the info from sys/class/gpio/gpio%/% and sys/kernel/debug/gpio. Furthermore I don't get any errors or indication from anywhere that there is something wrong...it just doesn't work!
What should I do? For the first time in my life I have seemingly exhausted the internet...
details:
Beagleboard xm rev c1
ubuntu 12.04
kernel 3.6.8-x4
I'm pretty new to the BeagleBoard and I have recently been trying to configure the GPIO pins on my classic BeagleBoard C4, which I believe should be fairly similar.
Half of my GPIO pins seemed to work fine and the other half seemed to remain high or low no matter what I did, even though they were configured the same way as the working pins in /sys/class/gpio/.
Have you tried using other GPIO pins?
I ended up following http://labs.isee.biz/index.php/Mux_instructions to configure the mux to mode 4, and now I can control the pins that were not working.
I basically used the command:
sudo echo 0x004 > /sys/kernel/debug/omap_mux/(mux 0 name)
where (mux 0 name) is the name of the mux entry for the GPIO pin you wish to configure,
i.e. for GPIO 183 on the BeagleBoard C4:
sudo echo 0x004 > /sys/kernel/debug/omap_mux/i2c2_sda
Though I had to change permissions to modify these files (with sudo echo the > redirection is still performed by your non-root shell, so it fails unless you relax the permissions or use sudo tee).
As I said, I am pretty new to the BeagleBoard and Ubuntu, but this worked for me, so I thought I would share it with you. I hope it is of some help.
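If it helps, here is a rough Python sketch (untested) of the same sysfs steps once the mux is in GPIO mode; GPIO 183 is just the example number from above, and you need to run it as root or fix the file permissions first:
# Export a GPIO through sysfs, set it as an output and toggle it a few times
import time

GPIO = 183                          # example pin from above
BASE = "/sys/class/gpio"

def write(path, value):
    with open(path, "w") as f:
        f.write(value)

try:
    write(BASE + "/export", str(GPIO))          # creates /sys/class/gpio/gpio183
except OSError:
    pass                                        # already exported
write("%s/gpio%d/direction" % (BASE, GPIO), "out")
for _ in range(5):                              # blink the pin high/low
    write("%s/gpio%d/value" % (BASE, GPIO), "1")
    time.sleep(0.5)
    write("%s/gpio%d/value" % (BASE, GPIO), "0")
    time.sleep(0.5)
write(BASE + "/unexport", str(GPIO))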
Regards;
Paul;
It seems that the beagleboard expansion pins are numbered in alternating fashion, as clearly and professionally depicted here.
Thanks to everyone for your help. I now know way more than I should about GPIO on OMAP systems (and so do you). Good luck on finals/life!
tl;dr I'm an idiot!
