Jump to PIC32MZ FreeRTOS application using bootloader

I have made a custom board with a PIC32MZ2048EMF100, a LAN8720, an ATWINC1500, etc.
I wrote a program that works very well on this board and uses FreeRTOS (because of the ATWINC1500). Now I'm trying to do OTA updates using a bootloader (I need the swapping between flash banks). (MPLAB + Harmony 3)
When I use the bootloader as a loadable project with the application (using hexmate to merge the two hex files), I set an LED after SYS_Initialize() and the LED blinks (so the bootloader does start the app), but the app then crashes inside SYS_Tasks() (in the prvCheckTasksWaitingTermination() function).
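For context, a minimal sketch of the Harmony-style main loop being described (assumptions: LED_On() stands in for whatever GPIO macro the board actually uses, and definitions.h is the usual Harmony 3 generated header; this only shows where the LED toggle and the crash sit, not the real project code):

#include <stdbool.h>
#include <stdlib.h>
#include "definitions.h"        /* Harmony 3 generated prototypes: SYS_Initialize(), SYS_Tasks() */

int main(void)
{
    SYS_Initialize(NULL);       /* clocks, BSP, drivers; also creates the FreeRTOS tasks */

    LED_On();                   /* placeholder pin macro: this point is reached when launched from the bootloader */

    while (true)
    {
        SYS_Tasks();            /* with FreeRTOS this ends up starting the scheduler; the crash
                                   is reported in the idle task's prvCheckTasksWaitingTermination() */
    }

    return EXIT_FAILURE;        /* never reached */
}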
Do I need some modifications in the linker scripts? Is this something specific to FreeRTOS?
Thanks

Related

Flashing ESP32's memory without installing the whole IDF?

Problem
I'm looking for a way to flash an ESP32 module's memory without installing the whole IDF software suite.
Why
Because I want to integrate an ESP32 onto a custom board along with a low-performance ARM-powered CPU which runs a tiny Linux distro (based on Debian), and I want to flash the ESP32 from this tiny Linux distro.
I know I could use the bootloader, but who will upload the initial bootloader? I don't want to do extra steps, so my idea is to embed the ESP32 module onto my custom board and let Linux flash it from the factory state (when its flash is empty, i.e. no preloaded bootloader). Or is the serial bootloader always preinstalled on all ESP32 modules (like on the ESP-WROOM-32)?
Why don't I want to use the IDF? Because I don't want to build or debug anything, I just want to flash myprogram.bin onto the ESP32. Also, as the board is low-performance, it would take ages to download everything needed to run the IDF.
Current state
The ESP32 module is now visible via UART (RX, TX, GND), and if I hold GPIO0 low, it runs the bootloader (my current module is embedded on a NodeMCU board, but there is no USB connected, this is raw UART!):
rst:0x1 (POWERON_RESET),boot:0x3 (DOWNLOAD_BOOT(UART0/UART1/SDIO_REI_REO_V2))
waiting for download
Could I expect the same behavior (controlling GPIO0 to run the bootloader) from all ESP32 modules, or does this work only because the folks at NodeMCU already preprogrammed some bootloader onto it?
I'm looking for a way to flash this ESP32, preferably without any Python script.
The ESP32 has a first-stage bootloader in ROM capable of writing to Flash - that's what's printing your output. You can talk to it if you know the protocol - this is implemented by the Python scripts in ESP IDF. If you don't want to use the official implementation because it's too heavy, you'll have to write your own implementation of this protocol which scratches your specific itch. Fortunately it's more or less documented and you can likely reverse engineer any missing knowledge from official Python scripts.
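If you do end up rolling your own implementation, the framing itself is small. As a hedged illustration (the byte values come from Espressif's serial protocol documentation, not from this thread; the actual UART I/O and reply parsing are left out), this is roughly what building the SLIP-framed SYNC request looks like in C:

/* Build the SYNC command of the ESP32 ROM serial protocol and dump it as hex.
 * Sending the frame over the UART and reading the response are not shown. */
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* SLIP-encode src into dst: frame with 0xC0, escape 0xC0 -> 0xDB 0xDC and 0xDB -> 0xDB 0xDD. */
static size_t slip_encode(const uint8_t *src, size_t len, uint8_t *dst)
{
    size_t out = 0;
    dst[out++] = 0xC0;                           /* frame start */
    for (size_t i = 0; i < len; i++) {
        if (src[i] == 0xC0)      { dst[out++] = 0xDB; dst[out++] = 0xDC; }
        else if (src[i] == 0xDB) { dst[out++] = 0xDB; dst[out++] = 0xDD; }
        else                     { dst[out++] = src[i]; }
    }
    dst[out++] = 0xC0;                           /* frame end */
    return out;
}

int main(void)
{
    /* SYNC payload: 0x07 0x07 0x12 0x20 followed by 32 x 0x55. */
    uint8_t payload[36] = { 0x07, 0x07, 0x12, 0x20 };
    memset(payload + 4, 0x55, 32);

    /* Command packet: direction, opcode, 16-bit payload size (LE), 32-bit checksum, payload. */
    uint8_t packet[8 + sizeof(payload)];
    packet[0] = 0x00;                            /* direction: request        */
    packet[1] = 0x08;                            /* opcode: SYNC              */
    packet[2] = sizeof(payload) & 0xFF;          /* payload size, little-endian */
    packet[3] = (sizeof(payload) >> 8) & 0xFF;
    memset(packet + 4, 0, 4);                    /* checksum: unused for SYNC */
    memcpy(packet + 8, payload, sizeof(payload));

    uint8_t frame[2 * sizeof(packet) + 2];       /* worst case: every byte escaped */
    size_t n = slip_encode(packet, sizeof(packet), frame);

    for (size_t i = 0; i < n; i++)               /* this is the byte stream to write() to the UART */
        printf("%02X ", frame[i]);
    printf("\n");
    return 0;
}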
Actually Espressif also provides a nice and small binary for flashing ESPs:
https://github.com/espressif/esp-serial-flasher
The serial flasher component provides a portable library for flashing Espressif SoCs (ESP32, ESP32-S2, ESP8266) from another host microcontroller. Espressif SoCs are normally programmed via a serial interface (UART). A port layer for the given host microcontroller has to be implemented if it is not already available.
One more (but very important) addition:
You have to modify this repo to make it work correctly, and you might also have to upload not just your binary, but also the bootloader and the partition table.
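For reference, a rough sketch of how the host side is meant to drive the library. The function names (esp_loader_connect, esp_loader_flash_start, esp_loader_flash_write, esp_loader_flash_finish and the ESP_LOADER_CONNECT_DEFAULT() initializer) are taken from the repo's README, so verify them against the version you check out, and the port layer for your host still has to exist. The 0x10000 application offset is just the typical default from an IDF partition table, not something fixed:

#include <stdint.h>
#include "esp_loader.h"                   /* from espressif/esp-serial-flasher; check the actual header name */

#define APP_FLASH_OFFSET  0x10000         /* assumed application offset (typical IDF default) */
#define BLOCK_SIZE        1024

/* Flash one binary image; returns 0 on success, -1 on any loader error. */
int flash_image(uint8_t *image, uint32_t image_size)
{
    esp_loader_connect_args_t args = ESP_LOADER_CONNECT_DEFAULT();

    if (esp_loader_connect(&args) != ESP_LOADER_SUCCESS)
        return -1;                        /* ROM loader did not answer the SYNC */

    if (esp_loader_flash_start(APP_FLASH_OFFSET, image_size, BLOCK_SIZE) != ESP_LOADER_SUCCESS)
        return -1;                        /* erase/prepare step failed */

    for (uint32_t sent = 0; sent < image_size; sent += BLOCK_SIZE) {
        uint32_t chunk = image_size - sent;
        if (chunk > BLOCK_SIZE)
            chunk = BLOCK_SIZE;
        if (esp_loader_flash_write(image + sent, chunk) != ESP_LOADER_SUCCESS)
            return -1;                    /* block write failed */
    }

    return (esp_loader_flash_finish(false) == ESP_LOADER_SUCCESS) ? 0 : -1;
}

The same call sequence would then be repeated for the bootloader and partition table images at their own offsets, which is the "not just your binary" point above.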

Linux: Serial Bootloader Application for the ATtiny1616

I am working on the Atmel ATtiny1616 microcontroller.
I am looking for a (Linux, C/Python based) serial bootloader application to program the ATtiny1616.
Could you please help me find where I can get the source code for it?
I'm going to use pyupdi for these new tinies (I use it on the ATtiny814, which has the same programming protocol).
For now, pyupdi:
Can read/write fuses
Can write FLASH
Can not read/verify FLASH
Can not read/write EEPROM
UPDI is another way to program the ATtiny1616, but as I said, I would like to program the ATtiny1616 using the Serial Bootloader Application.
I have found a reference link on the Microchip site: Serial Bootloader Application, and this application will work for me.

Can I create NodeMCU projects within VS Code using the PlatformIO extension?

I am attempting to use an ESP-32 dev kit to control WS2812 LED strips. I discovered there is firmware called NodeMCU for these dev kits which, from what I can tell, uses Lua scripts. There is an extension called PlatformIO for VS Code; I had used this to program an Arduino board to blink an LED.
I was wondering if it is possible to use PlatformIO to build the NodeMCU firmware and the Lua scripts, and then use PlatformIO to download everything to the ESP-32 dev kit. Is that possible?
I am thinking this can't be done, since there are only two framework selections, "Arduino" and "ESP-IDF", when I create a project, and neither of them is NodeMCU.
Thanks
With ESP-IDF you would write C code directly against the SDK. This can be done in PlatformIO. This has its advantages, but the major downside of course is that a development round trip takes some time: the complete build & install (flashing the binary) cycle is run for every bit you change in your source code.
With NodeMCU you build & install the firmware once and then only transfer the Lua files that changed. The downside here is that you need separate tools for separate tasks. See https://nodemcu.readthedocs.io/en/dev-esp32/ for details.
Build the firmware, either on Linux dev env, on a Linux VM (e.g. on Windows) or with Docker (quite simple, by yours truly).
Flash the firmware. Use esptool.py or the self-contained standalone GUI tool NodeMCU PyFlasher (by yours truly).
Upload Lua code from host to device. Use ESPlorer (very basic editor), NodeMCU Tool or the ChiliPeppr ESP32 Web IDE.

Debugging a DSP Application Remotely With GDB and Qt Creator

I have an image processing application which uses Qt and the TI video decoder example; it runs on a TI DaVinci DM6446.
I am using Qt Creator, and the compile process gives me two binaries, one for the ARM core and one for the DSP. The DSP binary has the extension ".x64p".
There is no problem if I start the app directly on the target board. But if I start it using gdb from the host, I can see the debug messages on the ARM side; however, it crashes immediately, because it is not able to open the DSP binary.
Is there any way to debug an ARM+DSP application without using TI CCS or a JTAG device?
OK, I can see that you have 3 "apps" here:
The main app for the ARM side
The codec
The server for the DSP side
If I am right, you can use Linux (e.g. the Ubuntu-based virtual machine that you can find on the TI website).
So run this VM and:
Build the codec (make all command)
Build the server (make build_server command)
Build the app (make all command)
Hope this helps.
But I want to ask you: how can I do this using CCS? I can build a separate project for the DSP or the ARM, but I want to build the whole system.
Help me if you can.
...Regards

Create virtual hardware, kernel, qemu for Android Emulator in order to produce OpenGL graphics

I am new to android and wish to play around with the emulator.
What I want to do is to create my own piece of virtual hardware that can collect OpenGL commands and produce OpenGL graphics.
I have been told that in order to do this I will need to write a Linux kernel driver to enable communication with the hardware. Additionally, I will need to write an Android user space library to call the kernel driver.
To start with I plan on making a very simple piece of hardware that only does, say 1 or 2, commands.
Has anyone here done something like this? If so, do you have any tips or possible links to extra information?
Any feedback would be appreciated.
Writing a hardware emulation is a tricky task and by no means easy. So if you really want to do this, I'd not start from scratch. In your case I'd first start with something simpler (because many of the libraries are already in place on the guest and the host side): implementing an OpenGL passthrough for ordinary Linux through qemu. What does it take?
First you add some virtual GPU into qemu, which also involves adding a new graphics output module that uses OpenGL (so far qemu uses SDL). Next you create DRI/DRM drivers in the Linux kernel that will run on the guest (Android uses its own graphics system, but for learning, DRI/DRM are fine), as well as in Mesa. On the host side you must translate what comes from qemu into OpenGL calls. Since the host-side GPU is doing all the hard work, your DRI/DRM part will be quite minimal and will just build a bridge.
The emulator that comes with Android SDK 23 already runs OpenGL, you can try this out with the official MoreTeapots example: https://github.com/googlesamples/android-ndk/tree/a5fdebebdb27ea29cb8a96e08e1ed8c796fa52db/MoreTeapots
I am pretty sure that it is hardware accelerated, since all those polygons are rendering at 60 FPS.
The AVD creation GUI in Android Studio has a hardware acceleration option, which should control options like:
==> config.ini <==
hw.gpu.enabled=yes
hw.gpu.mode=auto
==> hardware-qemu.ini <==
hw.gpu.enabled = true
hw.gpu.mode = host
hw.gpu.blacklisted = no
in ~/.android/avd/Nexus_One_API_24.a/.

Resources