Every time I install software that gets built with GCC (from the AUR on my Arch-based system), I get many thousands of screen pages of warnings (on big packages maybe even millions) that I am not interested in at all. It makes everything else in my console almost unreachable, because I have to jump back to the beginning and scroll down from there; the scroll bar shrinks to less than a pixel. And it's also plain annoying and ridiculous.
If I were running gcc manually, I could simply append -w, but I'm not; it's done by yay or pacman or some installation script or who knows what. Is there a way to always enable -w for gcc, no matter what it is run from?
(Apart from that, GCC-based installations also tend to be a LOT slower than any other ones, twice they have even taken over an hour, but I assume that cannot be fixed so easily.)
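If the AUR builds go through makepkg (which is what yay drives under the hood), one place to force the flag globally is /etc/makepkg.conf, which makepkg sources as a shell fragment. A sketch, assuming the PKGBUILDs you build honor CFLAGS/CXXFLAGS rather than overriding them:

```shell
# In /etc/makepkg.conf (or a per-user ~/.makepkg.conf): append -w so GCC
# suppresses all warnings. Caveat: PKGBUILDs or build systems that set
# their own flags will ignore these variables.
CFLAGS="$CFLAGS -w"
CXXFLAGS="$CXXFLAGS -w"
```

Using ~/.makepkg.conf avoids touching the system file; either way this only affects packages whose build systems actually pass those variables through to gcc.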
While I'm using my computer, a blue window pops up for a second and then goes away. The label says Windows PowerShell. I've tried looking in Event Viewer, but I could not identify anything there since I'm a new user. What could be causing this?
Running Windows 10.
Sometimes installed programs open command prompts to run services or init tasks, so it's not completely unusual.
I've never seen it happen with PowerShell, however.
It could be innocent, just a program you have installed running init behavior, but it could also be malicious.
The first thing to try is checking which programs are set to start up automatically. If there is a load of bloat, you could try turning off the unnecessary ones and see if it still happens.
But realistically the only real way forward is to get a good-quality antivirus and run a full system scan over your PC to double-check. It won't give you 100% certainty, as things can possibly get past it, but realistically if it passes you should be fine.
I am using Kubuntu 10.10 with a 4-core CPU. When I use 'make -j2' to build a C++ project, two cores' CPU usage goes to 100%, the desktop environment becomes unresponsive, and the build makes no progress.
Version info:
GNU make version 3.81
gcc version 4.4.5 (Ubuntu/Linaro 4.4.4-14ubuntu5)
How can I resolve this problem? Thanks.
There's not really enough information here to give you a definitive answer. First, it's not clear whether this happens only when you run with -j2. What if you run without parallelism (no -j)? When you say "2 cores' CPU usage [goes to] 100%", what is happening on those CPUs? If you run "top" in another terminal and then start your build, what does top show?
Alternatively, if you run "make -d -j2" what program(s) is make running right before the CPU goes to 100%?
The fact that the desktop is unresponsive as well hints at some other problem, rather than CPU usage, since you have 4 cores and only 2 are busy. Maybe something is chewing up all your RAM? Does the system come back after a while (indicating that the OOM killer got involved and stomped something)?
If none of that helps, you can run make under strace, something like "strace -f make -j2" and see if you can figure out what is going on. This will generate a metric ton or two of output but if, when the CPU is pegged, you see something running over and over and over you might get a hint.
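As a concrete sketch of what reading "make -d" output looks like (the makefile and file names here are invented for illustration):

```shell
# Build a throwaway one-target makefile; the \t in the printf format
# produces the literal tab that make recipes require.
printf 'out.txt:\n\techo done > out.txt\n' > Makefile.demo

# -d logs every decision make takes; the lines just before the CPU pegs
# name the target/command it is about to run.
make -d -f Makefile.demo out.txt | grep "Must remake target"
```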
Basically I can see these possibilities:
It's not make at all, but rather whatever command make is running that's just bringing your system down. You imply it's just compiling C++ code so that seems unlikely unless there's a bug somewhere.
Make is recursing infinitely. Make will rebuild its own makefile, plus any included makefiles, then re-exec itself. If you are not careful defining rules for rebuilding included makefiles, make can decide they're always out of date and rebuild/re-exec forever.
Something else
Hopefully the above hints will set you on a path to discovering what's going on.
Are you sure the project is prepared for parallel compilation? Maybe the prerequisites aren't correctly ordered.
If you build the project with just "make", does the compilation finish? If it gets to the end, it is a target dependency problem.
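The kind of missing-prerequisite problem described above can be reproduced with a toy makefile (names invented): a serial build happens to work because make processes the targets left to right, but under -j that implicit ordering disappears.

```shell
# 'use' reads gen.txt but never declares 'gen' as a prerequisite.
printf 'all: gen use\ngen:\n\techo data > gen.txt\nuse:\n\tcat gen.txt > use.txt\n' > race.mk

make -f race.mk          # serial: gen happens to run first, so this succeeds
# make -j2 -f race.mk    # parallel: 'use' may start before gen.txt exists
```

The fix is to state the dependency explicitly (use: gen), after which any -j level is safe.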
After some pain and suffering, I managed to install everything necessary for MinGW to work on a computer not on the network.
It worked nicely for a couple of days, but now I'm experiencing very long delays before anything starts to happen after I give the "make" command to build my project.
I tried disabling the network, as suggested here: Why is MinGW very slow?
But it didn't help.
Note that it's not the actual compilation/linking that is slow; the startup of those processes seems to take forever, 5-10 minutes. Except if I have just run it, in which case it starts in 10-30 seconds.
I know, it used to take a lot longer to load those tapes on Commodore, but over the years I have grown impatient.
Any ideas?
Try doing make -r (without implicit rules). For me it was the difference between 30 seconds and a fraction of a second for a single .cpp file.
Explanation:
I had the same problem with MinGW make long ago. I used make -d to investigate. It was then obvious that make tries a gazillion implicit rules for every dependency file: if my file depended on shared_ptr.hpp, make checked for shared_ptr.hpp.o, shared_ptr.hpp.c, shared_ptr.hpp.cc, shared_ptr.hpp.v, shared_ptr.hpp.f, shared_ptr.hpp.r and dozens of other combinations. Of course those files didn't exist. It looks like checking a file's modification time/existence (when it doesn't actually exist) on Windows is a lot slower than on Linux (because on Linux I didn't see any difference with/without the -r switch).
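You can watch this implicit-rule probing yourself with a trivial makefile (names invented); the counts below compare how many built-in pattern rules make tries with and without -r:

```shell
# One explicit rule with a header prerequisite that has no rule of its own.
printf 'app.txt: dep.hpp\n\tcat dep.hpp > app.txt\n' > imp.mk
echo hdr > dep.hpp

# Without -r, make probes its built-in pattern rules against dep.hpp
# (dep.hpp.o, dep.hpp.c, ...); each probe is a file-existence check.
# grep -c exits nonzero when the count is 0, hence the || true guards.
make -d -f imp.mk | grep -c "Trying pattern rule" > with.txt || true
rm -f app.txt
make -r -d -f imp.mk | grep -c "Trying pattern rule" > without.txt || true

echo "pattern-rule probes: $(cat with.txt) with builtins, $(cat without.txt) with -r"
```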
As a current Xcode project of mine has gotten larger and larger, I've noticed that quite often, Xcode seems to "recompile the world" when I make a change to a single non-header file from emacs. Not always, but a lot. I think it might have always been doing this, but when the project was small, I never noticed or cared. Now that the project's fairly big, it's absolutely killing my productivity. How the heck do I stop this?
(Yeah ... answering my own question [https://meta.stackexchange.com/questions/17845/etiquette-for-answering-your-own-question]).
It took me a fair amount of tracking to nail this ... but definitely worth it.
It boils down to the lock files that emacs creates to detect simultaneous edits from multiple emacs processes. These files are (invalid) symlinks from .#<filename> to <host:pid>. Xcode absolutely hates these files (so do some other tools I use ... though I'm blanking on what they are right now ... might even have been xcodebuild). Xcode.app doesn't actually raise any errors, but it seems to chuck its dependency information. These lock files are not backup files: they exist when you've changed the contents of a file but not yet saved it, so what you get is behavior where just making a local change in an emacs buffer causes a "rebuild the world" even though nothing has been saved.
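To confirm these lock files are appearing in your source tree when the rebuilds happen, you can look for the dangling symlinks directly (a sketch; the directory and file names are invented, and the ln line just simulates what emacs does on an unsaved edit):

```shell
# Emacs lock files are dangling symlinks named .#<file>; create one
# artificially in a demo tree, then find it.
mkdir -p demo
: > demo/main.m
ln -sf 'host.example:1234' 'demo/.#main.m'   # simulated lock file

# -type l matches symlinks without following them, so dangling ones count.
find demo -name '.#*' -type l
```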
There isn't, at this point, any way of disabling these lock files. The issue was raised on the emacs list a few months ago but died out without any resolution.
To work around the problem, you have to disable the lock files at compile time. You do the normal configure dance, then in src/config.h, after the #includes for the OS and machine configs, add #undef CLASH_DETECTION.
I've filed a radar with Apple.
I am working on a project using Google's cmockery unit testing framework. For a while, I was able to build the cmockery project with no problems. e.g. "./configure", "make && make install" etc. and it took a reasonable amount of time (1-2 minutes or so.) After working on other miscellaneous tasks on the computer and going back to re-build it, it becomes horrendously slow. (e.g. after fifteen minutes it is still checking system variables.)
I did a system restore to earlier in the day and it goes back to working properly for a time. I have been very careful about monitoring any changes I make to the system, and have not been able to find any direct correlation between something I am changing and the problem. However, the problem inevitably recurs (usually as soon as I assume I must have accidentally avoided the problem and move on). The only way I am able to fix it is to do a system restore to a time when it was working. (Sometimes restarting the machine works as well, sometimes it does not.)
I imagine that the problem is between the environment and autoconf itself rather than something specific in cmockery's configuration. Any ideas?
I am using MinGW under Windows 7 Professional.
Make sure that antivirus software is not interfering. Often, antivirus programs monitor every file access; autoconf accesses many files during its operation and is likely to be slowed down drastically.
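A rough way to check whether per-file-access overhead (the signature of an on-access scanner) is what is slowing configure down, rather than CPU: create and re-check a pile of tiny files, which is essentially the access pattern autoconf produces. All names here are invented:

```shell
# Create 200 tiny files, then time re-reading their metadata; on a system
# where every file access is intercepted, the second step is dramatically
# slower than raw disk speed would suggest.
mkdir -p probe
for i in $(seq 1 200); do echo x > "probe/f$i"; done
time sh -c 'for f in probe/f*; do test -r "$f"; done'
```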