Warnings with a new Xcode project using FMDB, SqliteCipher and CocoaPods - xcode

I have installed CocoaPods and loaded the workspace as instructed.
I am getting warnings that I do not understand; here is an example:
Pods-CipherDatabaseSync-SQLCipher
sqlite3.c
/Users/admin/Code/CipherDatabaseSync/Pods/SQLCipher/sqlite3.c:24035:13: Ambiguous expansion of macro 'MAX'
I have looked around for a couple of hours and I am stumped. Can someone please point me toward something that will provide some insight?
Thanks.

In the sqlite3.c file, MIN and MAX are defined in two different areas of the file. The first time is at line 214:
/* Macros for min/max. */
#ifndef MIN
#define MIN(a,b) (((a)<(b))?(a):(b))
#endif /* MIN */
#ifndef MAX
#define MAX(a,b) (((a)>(b))?(a):(b))
#endif /* MAX */
Then a second time at line 8519:
/*
** Macros to compute minimum and maximum of two numbers.
*/
#define MIN(A,B) ((A)<(B)?(A):(B))
#define MAX(A,B) ((A)>(B)?(A):(B))
I commented out the second set of definitions, and after cleaning and rebuilding the project all of the warnings went away.
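If you would rather not comment anything out, wrapping the second pair of definitions in the same #ifndef guards used at line 214 has the same effect. This is only a sketch of the idea, not a patch against any particular sqlite3.c revision:
/* Guarded redefinition: if MIN/MAX already exist (from line 214 or from a
   system header), these lines become no-ops and only one expansion of each
   macro remains visible to the compiler. */
#ifndef MIN
#define MIN(A,B) ((A)<(B)?(A):(B))
#endif /* MIN */
#ifndef MAX
#define MAX(A,B) ((A)>(B)?(A):(B))
#endif /* MAX */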

Remove the MAX and MIN macro definitions from the sqlite3.c file, since they are already defined in system headers.

Issue solution:
Open the Xcode project's Build Settings
Add "-Wno-ambiguous-macro" to "Other C Flags"

Related

Is it possible to set a breakpoint at OpenGL functions?

Sometimes I need to find out which part of the code calls a certain OpenGL function, so I tried this:
b glEnableVertexAttribArray
----------------------------
Breakpoint 3 at 0x7ffff0326c80 (2 locations)
But it doesn't work. Is there any way to make this work?
I'm using gdb on Ubuntu 18.04, and my GPU is a GeForce GTX 1050 Ti.
If you look at your GL/glew.h header, you will see that it contains lines similar to the following:
#define GLEW_GET_FUN(x) x
#define glCopyTexSubImage3D GLEW_GET_FUN(__glewCopyTexSubImage3D)
#define glDrawRangeElements GLEW_GET_FUN(__glewDrawRangeElements)
#define glTexImage3D GLEW_GET_FUN(__glewTexImage3D)
#define glTexSubImage3D GLEW_GET_FUN(__glewTexSubImage3D)
When you call glewInit, these __glew* variables are populated with pointers extracted from your OpenGL implementation.
In your case, you should set a breakpoint on the contents of such a pointer, i.e. on *__glewEnableVertexAttribArray.
For GLAD you will have to put a breakpoint on *glad_glEnableVertexAttribArray. Note the * in both cases: that tells your debugger to dereference the pointer and put the breakpoint at the correct location.
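As a concrete illustration, here is a small sketch. It assumes GLEW plus GLFW are installed (GLFW is only used here because GLEW cannot resolve the __glew* pointers without a current GL context); it prints the address stored in __glewEnableVertexAttribArray, which is the address the breakpoint ends up on:
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    /* GLEW can only fill in the __glew* pointers once a GL context exists. */
    if (!glfwInit())
        return 1;
    GLFWwindow *win = glfwCreateWindow(64, 64, "dummy", NULL, NULL);
    if (!win)
        return 1;
    glfwMakeContextCurrent(win);
    if (glewInit() != GLEW_OK)
        return 1;

    /* glEnableVertexAttribArray is a macro that expands to this pointer, so a
       breakpoint has to go on the code it points to, not on the macro name. */
    printf("__glewEnableVertexAttribArray = %p\n",
           (void *)__glewEnableVertexAttribArray);

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
In gdb, break *__glewEnableVertexAttribArray (issued after the pointer has been populated by glewInit) should then stop inside the driver's implementation.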

Same code doesn't work in my own project (strcpy) in Visual Studio 2017

I'm working on some exercises for school.
The projects I got from my teacher build without any errors.
When I copy the code into a new project created on my computer, I get this error:
Compiler Warning (level 3) C4996
I compared both projects' compiler settings and made them equal, but this didn't work.
So I tried creating a project property sheet from my teacher's project and inserting it into my own project. That didn't work either.
Can somebody help me solve this issue?
This is the code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    char s1[32];
    char s2[32];

    strcpy(s1, "abc def.");
    strcpy(s2, "ghi_x");
    printf("s1=\"%s\" en s2=\"%s\"\n", s1, s2);
    printf("s1 bevat %d symbolen en s2 bevat %d symbolen\n", strlen(s1), strlen(s2));
    printf("De functie strcmp(s1,s2) geeft %d als functiewaarde\n", strcmp(s1, s2));
    getchar();
    return 0;
}
The error I get is:
Severity Code Description Project File Line Suppression State Error C4996 'strcpy': This function or variable may be unsafe. Consider using strcpy_s instead. To disable deprecation, use _CRT_SECURE_NO_WARNINGS. See online help for details
A quick Google search shows that "Compiler Warning (level 3) C4996" means you're using functions the compiler considers deprecated. The most likely culprits are your str* functions, since they are generally unsafe. Switch to their strn* counterparts (e.g. strncpy), or follow one of the two routes the diagnostic itself offers: use strcpy_s, or define _CRT_SECURE_NO_WARNINGS to keep strcpy.
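For illustration, a minimal sketch of the two routes named in the diagnostic itself. The define must appear before any standard header is included (or be added to the project's preprocessor definitions), and strcpy_s is the MSVC / C11 Annex K variant, so it is not portable to every compiler:
/* Route 1: keep strcpy and suppress C4996 file- or project-wide. */
#define _CRT_SECURE_NO_WARNINGS

#include <stdio.h>
#include <string.h>

int main(void)
{
    char s1[32];
    char s2[32];

    strcpy(s1, "abc def.");            /* no C4996 thanks to the define above */

    /* Route 2: use the bounds-checked variant instead. */
    strcpy_s(s2, sizeof s2, "ghi_x");

    printf("s1=\"%s\" s2=\"%s\"\n", s1, s2);
    return 0;
}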

gcc and explicit 256B boundary alignment on ARM

I'm using the gcc toolchain for ARM and I need the linker to place a structure on a 256-byte boundary in memory. I've tried the aligned attribute:
volatile struct ohci_hcca hcca __attribute__ ((aligned (256)));
but with no luck:
nm out.elf | grep hcca
20011dc0 B hcca
I remember using this in the past for fields that had to be on a 512-byte boundary, so I know it's possible, but I seem to be missing something this time.
Thanks for any guidance.
EDIT:
The mystery's been solved. The BSP library is composed of dozens of drivers, and one of their header files contained this:
#if defined __ICCARM__ || defined __CC_ARM || defined __GNUC__
//#pragma data_alignment=8 /* IAR */
#pragma pack(1) /* IAR */
#define __attribute__(...) /* IAR */
#endif /* IAR */
For __GNUC__ it doesn't make any sense, unless the author of this chunk is a member of a guerrilla force against GCC users. Ruined my entire evening :)
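One possible workaround, sketched under the assumption that the offending vendor header (called bsp.h here purely hypothetically) is included before the declaration: undefine its __attribute__ macro so GCC's real keyword is visible again. The struct layout below is a placeholder.
#include "bsp.h"         /* hypothetical vendor header containing: #define __attribute__(...) */

#ifdef __GNUC__
#undef __attribute__     /* restore GCC's built-in __attribute__ keyword */
#endif

struct ohci_hcca {       /* placeholder layout; the real HCCA is 256 bytes */
    unsigned char raw[256];
};

/* The alignment request now actually reaches the compiler and linker. */
volatile struct ohci_hcca hcca __attribute__ ((aligned (256)));
With the macro out of the way, nm should report an address for hcca whose low byte is 00.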

Stringify Endpoint for Xcode LLVM Processor Macros

In the "Apple LLVM 7.0 - Preprocessing" section under the "Build Settings" tab, I've defined a Preprocessor Macros as:
STR(arg)=#arg
HUBNAME=STR("myhub")
HUBLISTENACCESS=STR("Endpoint=sb://abc-xyz.servicebus.windows.net/;SharedAccessKeyName=DefaultListenSharedAccessSignature;SharedAccessKey=JKLMNOP=")
In my code, I'm trying to refer to the value of HUBLISTENACCESS as a string:
SBNotificationHub* hub = [[SBNotificationHub alloc] initWithConnectionString:@HUBLISTENACCESS notificationHubPath:@HUBNAME];
But I'm getting errors from Xcode for the initialization of "hub":
Expected ';' at end of declaration
Unterminated function-like macro invocation
Unexpected '@' in program
I suspect that the definition of HUBLISTENACCESS in the Preprocessor Macros needs to be properly escaped but I've tried a few things and can't seem to get it right. Can somebody help me understand what I'm doing wrong?
There's one very obvious reason why what you were trying to do failed: you use // in the HUBLISTENACCESS value. As in C, everything after // is treated as a comment, so from the compiler's point of view your last line is actually:
HUBLISTENACCESS=STR("Endpoint=sb:
To test it, just remove one slash and it will work again. What you were effectively trying to do is define something like:
#define FOO //
which I don't think is possible. I honestly have no idea how you can do that within the Build Settings, but there are other ways to do it globally, for example via a PCH file (prefix header).
A simple line in the PCH will save all that trouble:
#define HUBLISTENACCESS @"Endpoint=sb://abc-xyz.servicebus.windows.net/;SharedAccessKeyName=DefaultListenSharedAccessSignature;SharedAccessKey=JKLMNOP="
Then use it as below (no more @ needed!):
NSLog(@"%@", HUBLISTENACCESS);
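For comparison, here is the same idea expressed in plain C rather than Objective-C (a sketch only; the values are the ones from the question). Keeping the connection string as an ordinary string-literal macro in a header means the // never passes through a build-settings field at all:
#include <stdio.h>

#define HUBNAME          "myhub"
#define HUBLISTENACCESS  "Endpoint=sb://abc-xyz.servicebus.windows.net/;" \
                         "SharedAccessKeyName=DefaultListenSharedAccessSignature;" \
                         "SharedAccessKey=JKLMNOP="

int main(void)
{
    /* Adjacent string literals are concatenated, so the // survives intact. */
    printf("hub        = %s\n", HUBNAME);
    printf("connection = %s\n", HUBLISTENACCESS);
    return 0;
}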

__attribute__() macro and its effect on Visual Studio 2010 based projects

I have some legacy code with conditional preprocessing (e.g. #ifdef and #else) in which I have found uses of the __attribute__ macro. A quick bit of research told me that it is specific to GNU compilers. I have to build this legacy code in Visual Studio 2010 with the MSVC10 compiler, and it complains everywhere it sees __attribute__((unused)), even though the uses are protected by #ifndef and #ifdef. An example is:
#ifdef __tone_z__
static const char *mcr_inf
#else
static char *mcr_inf
#endif
#ifndef _WINDOWS
__attribute__(( unused )) /* this is causing all the problem!! */
#endif
= "@(#) bla bla copyright bla";
#ifdef __tone_z__
static const char *mcr_info_2a353_id[2]
#else
static char *mcr_info_2a353_id[2]
#endif
__attribute__(( unused )) = { "my long copyright info!" } ;
I am really struggling to understand whether this is very poorly planned code or just my misunderstanding. How do I avoid the usual compiler and linker errors with this __attribute__() directive? I have started to get C2061 errors (missing/unknown identifiers). I have all the necessary header files and nothing is missing, except maybe a GNU compiler (which I don't want!).
Also, it seems that the end-of-line characters (and the ; terminators) get messed up when I take the code to Windows... argh... I mean the Unix end-of-line versus the Windows EOL. How can I use this code without modifying the body? I can define _WINDOWS in my property sheet, but I cannot automatically adjust the EOL recognition.
Any help is appreciated!
Thanks.
My best guess is that _WINDOWS is, in fact, not defined by your compiler, and so the use of __attribute__ is not protected.
In my opinion, the best way to protect against attributes is to define a macro like this:
#define __attribute__(A) /* do nothing */
That should simply remove all __attribute__ instances from the code.
In fact, most code that has been written to be portable has this:
#ifdef __GNUC__
#define ATTR_UNUSED __attribute__((unused))
#else
#define ATTR_UNUSED
#endif
static TONE_Z_CONST char *integ_func ATTR_UNUSED = "my copyright info";
(Other __tone_z__ conditional removed for clarity only.)
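For completeness, a self-contained, compile-checkable version of that pattern (the identifiers are illustrative, not taken from the real project):
/* Builds cleanly with both GCC (where the attribute is applied) and MSVC
   (where ATTR_UNUSED expands to nothing). */
#ifdef __GNUC__
#define ATTR_UNUSED __attribute__((unused))
#else
#define ATTR_UNUSED
#endif

static const char *mcr_inf ATTR_UNUSED = "@(#) bla bla copyright bla";

int main(void)
{
    return 0;
}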
