Using MacPorts to install py26-gtk switches off my Mac - macos

I'm trying to run sudo port install py26-gtk, but my MacBook gets no further than building gcc43; partway through that build it switches off without shutting down.
It is not overheating: I tried again with the MacBook sandwiched between two cold packs, and it stayed cool with the fans idle, i.e. the Mac was not trying to cool itself while the cold packs were in place.
Does anyone have any idea what is causing my Mac to switch off?
I'm running Mac OS X 10.6.3, Python 2.6, and MacPorts 1.8.2.

If this happens to you, it means you have a hardware problem. Let's hope you still have AppleCare. I can't help but get the impression that Apple hardware quality gets worse every year :(
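If you want some evidence for that diagnosis before paying for a repair, here is a rough sketch (the commands and log path assume Snow Leopard's logging, and gcc43-build.log is just a placeholder name): rerun only the failing port with debug output, and after the machine powers off and reboots, look for the shutdown-cause code the kernel records at boot. Negative codes generally point to hardware or power faults.
$ sudo port clean gcc43
$ sudo port -d install gcc43 2>&1 | tee gcc43-build.log
# after the machine switches off and you reboot:
$ grep "Previous shutdown cause" /var/log/kernel.log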

Related

Does Xcode 7.3.1 run on macOS High Sierra (10.13)?

Although I know this Xcode version is quite old, my company's project still needs it, and I haven't managed to make it run properly on my company's Mac.
Has anyone faced this issue as well? Any ideas would be very helpful!
PS: On my personal Mac, running the macOS High Sierra 10.13.1 beta (17B25c), Xcode 7.3.1 crashes every time I try to open it, without even reporting a reason.
It works on my machine... I know that doesn't help you, but I just installed it and it opens OK. Maybe upgrade from the beta to the full release?
After running Xcode 7.3.1 in a macOS High Sierra 10.13 VM, I can confirm that I can compile and run builds on the Simulator and on devices with iOS 6 and iOS 7.
Unfortunately, there is no cross-compatibility guide for downgrading to an older development environment (or older software in general) on a newer Mac, because wikis like this one only list minimum requirements:
https://de.wikipedia.org/wiki/Xcode
So my tip on this topic is really to use older hardware to test on a matching environment, or to try virtualization such as Parallels, Fusion, or VirtualBox. There are differences between virtualization products, too: if one fails (e.g. VirtualBox), another program (e.g. Parallels) may do the job well.
I actually use older Mac hardware for testing older software versions, and that is the best tip I can give you on this topic. It's a good idea to keep a repository of older hardware for the jobs where virtualization fails.
I know some Apple developers who use older Mac minis in their basement, remotely connected, to solve problems like this.

VirtualBox guest running OS X Mountain Lion: Xcode install error

I'm trying to install Xcode in a VirtualBox Hackintosh for app development purposes. I used MultiBeast and the Xcode installation is complaining about the display, but the way VirtualBox handles displays and graphics is beyond me.
I get the following error: Xcode: CVCGDisplayLink::setCurrentDisplay didn't find a valid display -- falling back to 60Hz.
So far I've followed the installation instructions from this link. I'm not sure if it makes a difference, but my host is Arch Linux on an Intel Haswell CPU with Iris Pro graphics.
Why am I getting this error?
Hmm... not sure if this is right or not. I did this, and it fixed something:
from http://www.macbreaker.com/2012/07/mountain-lion-virtualbox.html
You need to change the CPUID to a non-Haswell ID to fix the issue of 10.8 not booting on a Haswell CPU:
VBoxManage modifyvm {your vm name here} --cpuidset 00000001 000106e5 02100800 0098e3fd bfebfbff
But I'm having some trouble remembering the exact chain of events; it might've been another tweak later on that got Xcode to install properly.
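For reference, a sketch of applying that tweak from the host, and of undoing it if things get worse. The VM name "Mountain Lion" is a placeholder, and the VM must be powered off when you run these:
$ VBoxManage modifyvm "Mountain Lion" --cpuidset 00000001 000106e5 02100800 0098e3fd bfebfbff
# to drop all custom CPUID leaves again:
$ VBoxManage modifyvm "Mountain Lion" --cpuidremoveall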
Regardless, I feel dirty for even attempting to do this. I hoped I would never be put in a situation where I had to cheat on Linux, but it beats spending over a grand (even though my machine is top-notch) to work on a useless iPhone app. In other words, it sucks being an independent commodity producer.
I'm getting the same error, but on actual Apple hardware.
Despite that, the display link object does seem to work.
Maybe it's because it is not familiar with my Dell monitor.

iSight/FaceTime camera not working after upgrade

Right now I am running the OS X Yosemite beta (10.10). After upgrading, I lost my camera; I've tried everything in the forums, but nothing helped.
I can't use FaceTime, Skype, or anything else. Normally I would take it to an Apple Store, but the closest one is about 200 miles away.
Since I am beta testing OS X Yosemite, I knew I was going to have some problems, but I need this to work.
Also: I installed the update two days ago, as recommended in the App Store.
(It's not a hardware problem: the camera was working perfectly before the upgrade, and it is not found under System Information > USB.) It's also not user-specific.
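A rough Terminal equivalent of that System Information check, in case it helps others reproduce it (device names vary by model, and the SPCameraDataType report may not exist on older OS X releases):
$ system_profiler SPUSBDataType | grep -i -A 3 camera
$ system_profiler SPCameraDataType
If neither command lists the camera, the device really has dropped off the bus as described above.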
Open your terminal and type this command:
$ sudo killall VDCAssistant
Now go to Skype > Preferences > Audio/Video and you should see the webcam working.
Have a good one!
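If killing VDCAssistant alone doesn't bring the camera back, some Macs with a FaceTime HD camera also run a companion AppleCameraAssistant daemon; killing both is worth a try (process names vary by model and OS version, so treat this as a guess):
$ sudo killall VDCAssistant AppleCameraAssistant
Both daemons are relaunched automatically the next time an app asks for the camera.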
Apparently, this problem persists even in recent versions of OS X Yosemite. I also had it, and the answer is basically to reset the SMC; Apple provides instructions at https://support.apple.com/en-us/HT201295
Even so, I was not able to sort it out just by resetting the SMC on a mid-2012 MacBook Pro 15, at least not immediately. As I keep a few things connected to the USB ports, I suspected that might be the reason, so I closed all my applications (just in case), turned off the computer, and unplugged all the USB and Thunderbolt connectors (yes, including the LAN adapter) as well as the headphone connector to my speakers. THEN I reset the SMC (for computers with a non-removable battery: press and release Control-Shift-Option-Power [left Shift] simultaneously while connected to power). For me, it worked.
It might be useful to remember that the SMC is the System Management Controller, which takes care of the hardware at a low level, including I/O, the keyboard backlight, the speakers, the camera... so if you are going to reset it, it makes sense to relieve it of every possible burden first.
Resetting the SMC is the only thing that solved the problem on my mid-2012 MBP.
The issue appeared around the time I upgraded to High Sierra.
This issue is solved with the newest update of OS X Yosemite. Just update through the App Store app and everything will be back to normal.

To install Xcode on Windows XP

I would like to know if I can install Xcode on Windows XP. If possible, please provide a link to documentation as well. Thank you very much in advance for any help...
It's certainly possible.
There are two routes:
Install OSx86 (aka iATKOS / Kalyway) on a second partition/disk and dual boot.
Run Mac OS X Server under VMware.
The first route requires modifying (or using a pre-modified) image of Leopard that can be installed on a regular PC. This is not as hard as you would think, although your success/effort ratio will depend upon how closely the hardware in your PC matches that in Mac hardware - e.g. if you're running a Core 2 Duo on an Intel Motherboard, with a NVidia graphics card you are laughing. If you're running an AMD machine or something without SSE3 it gets a little more involved.
If you purchase (or already own) a version of Leopard, then this is a gray area, since the Leopard EULA states you may only run it on an "Apple Labeled" machine. As many point out, if you stick an Apple sticker on your PC, you're probably covered.
The second option is the more costly. The EULA for the workstation version of Leopard prevents it from being run under emulation, and as a result there's no support in VMware for this. Leopard Server, however, CAN be run under emulation and can be used for desktop purposes. Leopard Server and VMware are expensive, though.
If you're interested in option 1) I would suggest starting at Insanelymac and reading the OSx86 sections.
I do think you should consider whether the time you will invest is going to be worth the money you will save though. It was for me because I enjoy tinkering with this type of stuff and I started during the early iPhone betas, months before their App Store became available.
Alternatively you could pickup a low-spec Mac Mini from eBay. You don't need much horse power to run the SDK and you can always sell it on later if you decide to stop development or buy a better Mac.
No, you cannot install Xcode on a Windows machine; you need Mac OS to run Xcode.
However, you can install VMware Server on your Windows machine, install Mac OS in a virtual machine on it, and then install Xcode on that Mac OS system.
But to install Mac OS under VMware Server and start working with it, your hardware must support virtualization.
To install VMware, the following links might be useful:
http://www.petri.co.il/virtual_install_vmware_server.htm
http://www.virtuatopia.com/index.php/Installing_VMware_Server_2.0_on_Windows_Systems
Or there is always Google.
You can check whether your CPU supports virtualization here.
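For example, on a Windows host you can run Sysinternals Coreinfo from an administrator command prompt; with the -v switch it reports whether the CPU exposes Intel VT-x or AMD-V (you need to download coreinfo.exe from the Sysinternals site first):
coreinfo -v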
Since Xcode is Mac OS software, it's not possible to run Xcode without Mac OS. To get Mac OS you can dual-boot your computer, or simply use VMware to install Mac OS on your laptop or PC. The one real requirement for installing Mac OS, whether through dual boot or through VMware, is a fairly high-spec machine: at least a 5th-generation Core processor with at least 4 cores, and more than 4 GB of RAM, for Mac OS to function well. Mac OS can be installed on a 2nd-generation dual-core processor with 2 GB of RAM, but it will be far too slow to be worth it, and even after installing you won't be able to use Xcode efficiently with such a low configuration. So a 5th-generation processor with 4 cores and at least 4 GB of RAM is recommended.
If you have that configuration, then you need some files and software to install Mac OS on your laptop.
For installation through VMware you can refer to this video: https://www.youtube.com/watch?v=wodqGvug6e0
I have the required files (the same ones as in the video), but they are not uploaded to the internet.

MonoTouch on OS X VM within Windows?

We're getting into iOS development with MonoTouch. All of our machines are Mac Pros with Windows 7 installed via Boot Camp. I'm not crazy about rebooting into OS X just to access the MonoTouch IDE. I'm wondering if it's legal and possible to install OS X in a VM within Windows (since I'm already on Apple hardware, it should be OK, right?). Are there any other issues with Apple's SDK in a VM? (I heard they do some hardware checking of some sort.) Thanks in advance for any suggestions!
You can't really run OS X in a VM under Windows without going the hacking route. The only way to properly virtualize OS X is to run OS X Server under OS X itself, which is not what you want.
The best option for you is to do what I do: run OS X on your Mac, then use something like VMware or Parallels to run the Windows install you already have on your Boot Camp partition as a VM. Works beautifully.
Yup, Eduardo is right: running OS X on non-Apple hardware is considered illegal under Apple's license. Moreover, you may run into some issues when creating your developer account or submitting apps.
However, if you still want to go the hack way, you can refer to the OSx86 project or just search Google for "how to create a hackintosh".
