Move PmodOLED from JB to JA Header - 32MX4 (MPLAB X IDE) - microchip

I am new to programming for microcontrollers and my first task is to accept WiFi connections using the PmodWiFi add-on. I read that the default pin/port for the WiFi chip is the JB header on the 32MX4. The problem is that I currently have the PmodOLED add-on installed in the JB header, so I want to move it to the JA header. In PmodOLED.h (which I have imported into my library), I changed the following lines to correspond to the ports in the JA header instead of the JB header:
#define prtSelect IOPORT_G
#define prtDataCmd IOPORT_B
#define prtReset IOPORT_D
#define prtVbatCtrl IOPORT_D
#define prtVddCtrl IOPORT_B
#define bitSelect BIT_9
#define bitDataCmd BIT_15
#define bitReset BIT_5
#define bitVbatCtrl BIT_4
#define bitVddCtrl BIT_14
However, when I run the code with the PmodOLED attachment inserted into the JA header, it does not respond and the screen stays blank. I would greatly appreciate it if you could tell me how to modify my code so that the PmodOLED add-on works in the JA header.
Thanks in advance for your help,
Guvvy

Related

Is it possible to set breakpoint at OpenGL functions?

Sometimes I need to find out which part of the code calls a certain OpenGL function, so I try this:
b glEnableVertexAttribArray
----------------------------
Breakpoint 3 at 0x7ffff0326c80 (2 locations)
But it doesn't work. Is there any way to make this work?
I'm using GDB on Ubuntu 18.04; my GPU is a GeForce GTX 1050 Ti.
If you look at your GL/glew.h header, you will see that it contains lines similar to the following:
#define GLEW_GET_FUN(x) x
#define glCopyTexSubImage3D GLEW_GET_FUN(__glewCopyTexSubImage3D)
#define glDrawRangeElements GLEW_GET_FUN(__glewDrawRangeElements)
#define glTexImage3D GLEW_GET_FUN(__glewTexImage3D)
#define glTexSubImage3D GLEW_GET_FUN(__glewTexSubImage3D)
When you call glewInit, these __glew* variables are populated with pointers extracted from your OpenGL implementation.
In your case, you should set a breakpoint on the contents of such a pointer: *__glewEnableVertexAttribArray.
For GLAD you would instead put a breakpoint on *glad_glEnableVertexAttribArray. Note the * in both cases: it tells the debugger to dereference the pointer and place the breakpoint at the actual function.

gcc and explicit 256B boundary alignment on ARM

I'm using the GCC toolchain for ARM and I need the linker to place a structure on a 256B boundary in memory. I've tried the aligned attribute:
volatile struct ohci_hcca hcca __attribute__ ((aligned (256)));
but with no luck:
nm out.elf | grep hcca
20011dc0 B hcca
I remember using this in the past for fields that had to sit on a 512B boundary, so I know it's possible, but I seem to be missing something this time.
Thanks for any guidance.
EDIT:
The mystery has been solved. The BSP library is composed of dozens of drivers, and one of their header files contained this:
#if defined __ICCARM__ || defined __CC_ARM || defined __GNUC__
//#pragma data_alignment=8 /* IAR */
#pragma pack(1) /* IAR */
#define __attribute__(...) /* IAR */
#endif /* IAR */
For __GNUC__ it doesn't make any sense, unless the author of this chunk is a member of a guerrilla force against GCC users. Ruined my entire evening :)

Warnings with a new Xcode project using FMDB, SqliteCipher and CocoaPods

I have installed CocoaPods and loaded the workspace as instructed.
I am getting these warnings which I do not understand, here is an example:
Pods-CipherDatabaseSync-SQLCipher
sqlite3.c
/Users/admin/Code/CipherDatabaseSync/Pods/SQLCipher/sqlite3.c:24035:13: Ambiguous expansion of macro 'MAX'
I have looked around for a couple of hours and I am stumped. Can someone please point me toward something that will provide some insight?
Thanks.
In the sqlite3.c file it looks as if MIN and MAX are defined in two different places. The first time is at line 214:
/* Macros for min/max. */
#ifndef MIN
#define MIN(a,b) (((a)<(b))?(a):(b))
#endif /* MIN */
#ifndef MAX
#define MAX(a,b) (((a)>(b))?(a):(b))
#endif /* MAX */
Then again at line 8519:
/*
** Macros to compute minimum and maximum of two numbers.
*/
#define MIN(A,B) ((A)<(B)?(A):(B))
#define MAX(A,B) ((A)>(B)?(A):(B))
I commented out the second definitions, and after cleaning and rebuilding the project all of the warnings went away.
Remove the MAX and MIN macro definitions from the sqlite3.c file, since they are already defined in system headers.
Issue solution:
Open the Xcode project's Build Settings
Add "-Wno-ambiguous-macro" to "Other C Flags"

__attribute__() macro and its effect on Visual Studio 2010 based projects

I have some legacy code with conditional preprocessing (#ifdef and #else) in which I found uses of the __attribute__ macro. A quick search showed that it is specific to GNU compilers. I have to build this legacy code in Visual Studio 2010 with the MSVC10 compiler, and it complains everywhere it sees __attribute__((unused)), even though the uses are protected by #ifndef and #ifdef. An example:
#ifdef __tone_z__
static const char *mcr_inf
#else
static char *mcr_inf
#endif
#ifndef _WINDOWS
__attribute__(( unused )) /* this is causing all the problem!! */
#endif
= "#(#) bla bla copyright bla";
#ifdef __tone_z__
static const char *mcr_info_2a353_id[2]
#else
static char *mcr_info_2a353_id[2]
#endif
__attribute__(( unused )) = { "my long copyright info!" } ;
I am really struggling to understand whether this is very poorly planned code or just my misunderstanding. How do I avoid the usual compiler and linker errors with this __attribute__() directive? I have started to get C2061 (missing/unknown identifier) errors. I have all the necessary header files and nothing is missing, except maybe the GNU compiler (which I don't want!!).
Also, the end-of-line characters seem to get messed up when I take the code to Windows... argh... I mean the Unix EOL versus the Windows EOL. How can I use this code without modifying the body? I can define _WINDOWS in my property sheet, but I cannot automatically adjust the EOL recognition.
Any help is appreciated!
Thanks.
My best guess is that _WINDOWS is, in fact, not defined by your compiler, and so the use of __attribute__ is not protected.
In my opinion, the best way to protect against attributes is to define a macro like this:
#define __attribute__(A) /* do nothing */
That should simply remove all __attribute__ instances from the code.
In fact, most code that has been written to be portable has this:
#ifdef __GNUC__
#define ATTR_UNUSED __attribute__((unused))
#else
#define ATTR_UNUSED
#endif
static TONE_Z_CONST char *integ_func ATTR_UNUSED = "my copyright info";
(Other __tone_z__ conditional removed for clarity only.)

Understanding the resource file of a program(Win32)

Say I have a simple Win32 program with a menu, and all syntax is correct. Here is the resource file:
#define IDR_MYMENU 101
#define IDI_MYICON 102
#define ID_FILE_EXIT 40001
#define ID_STUFF_GO 40002
#define ID_STUFF_GOSOMEWHEREELSE 40003
I'm a little confused about the declared constants. Could I use any old number to represent each option? Such as:
#define IDR_MYMENU 23
#define IDI_MYICON 412
#define ID_FILE_EXIT 40071
#define ID_STUFF_GO 40892
#define ID_STUFF_GOSOMEWHEREELSE 64982
or is there something behind those specific numbers? Thanks for any help!
You can use any valid 16-bit unsigned integer value (i.e. <= 65535). The values are used to uniquely identify a resource.
You can use almost any number. However, I have found that on WinCE some menu item identifiers are reserved for special system actions, so it is best to avoid values below 100.