Is there a way to have Xcode ask before downloading documentation/API updates?

By default, Xcode is set up to download updates for the various documentation and API libraries available to the application. This can be disabled from the Xcode preferences screen. However, I'd prefer not to disable the automatic updates but rather to be prompted before they start, so that I can potentially dismiss them and download at a later time.
My reasoning is that I primarily work through a network connection that reaches the internet via a 4G wireless hotspot subject to overage fees. In fact, I'm connected to the internet through this device about 90% of the time I'm working on my Mac. When I need to download any large software update, I always take my MacBook to an open, direct connection to the internet and let it do what it needs to do.
This works fine for most software, but not Xcode.
I want my updates to remain automatic (so that I am at least informed that an update is available); however, I'd like to have the choice of whether to initiate them or not.
Is there something I can do to make Xcode ask before downloading?
A QUICK NOTE
I know how we technically minded folk are: half of you are still wondering why I work off a 4G hotspot and want to fix that problem instead of the one I asked about. (Yes, I tend to think this way too.)
However, I work in an environment whose IT department adamantly refuses to allow any operating systems other than Windows XP and Windows 7 onto their network. The Engineering team (which I work for) has to have an internet connection and an internal network for storing and backing up data, and we are developing iOS software that is integrated with our products. This is obviously problematic, since we need to use MacBooks to do our work.
Our solution to this dilemma has been to set up our own small LAN, whose only way of getting internet access is through cellular Wi-Fi. All the wireless ISP plans available in our region are tier-based, and overages are charged (at a reasonable rate) when we use more than our allotment of data. We don't mind going over our quota, but we need to keep it reasonable. Automatic updates like this start to take a huge bite out of that allotment when a few of us each have to download a few GB of data every month.

A firewall application like Little Snitch might be the best solution.
When Xcode tries to update itself, a connection alert is shown.
You can then simply Allow or Deny the connection.
This way you can precisely control all connections (and protect your privacy).
Other applications might also consume quite a lot of bandwidth.
P.S.
Why not connect a cheap XP box to the network and have it share the internet connection with all the Macs?

Related

Moving TFS working folder to network share

Our IT folks are telling us (the dev group) that we shall not have ANY files stored on our local hard drives, including our TFS working folders. This is ridiculous for a variety of reasons, but until I'm convinced it's a good idea I'll play along, and when no one is looking make a local working folder.
Does anyone have their working folder on a network share? How well does it work? Each developer would have their own folder in the share, but it would be on the network. My main concerns are performance, and the fact that we would need to be connected at all times in order to work.
From a TFS point of view it works without issue, but stay away from the local workspaces of TFS/VS 11.
I strongly feel for you on the compilation front: building a solution stored on the network is an absolute disaster in terms of performance.
You did not mention it, but I assume your network share is accessed through a mapped network drive.
By the way, may I ask why these guys don't want you to store files locally?
While it's not something I would typically recommend, if that is the policy and you have to adhere to it, it might be worthwhile to consider simply having server-side development VMs that your devs RDP into. I've seen companies do this before, and the big downside is that if you're not connected to the network you can't do anything.
There are some upsides too, though. You can easily increase resources (RAM, disk space, CPU, etc.) thanks to the virtualization infrastructure. If somebody's laptop dies they are not out of commission: just find a loaner machine, RDP into their VM, and they're up and running. If somebody leaves, you have a copy of their entire working machine that you can give to their replacement. All machines can be easily backed up. Etc. Compiling, and working within VS in general, should also be much faster than trying to work with a local Visual Studio reading from and writing to a network drive.

Network problem, suggestions sought

The LAN has about half a dozen Windows XP Professional PCs and one Windows 7 Professional PC.
A Jet/Access '97 database file is acting as the database.
The method of access is via DAO (DAO350.dll), and the front-end app is written in VB6.
When an instance is created it immediately opens a global database object, which it keeps open for the duration of its lifetime.
The Windows 7 machine had been acting as the file server for the last few months without any glitches.
Within the last week, instances of the app will work for a while (say 30 minutes) on the XP machines and then fail on database operations, reporting connection errors (e.g. "disk or network error" or "unable to find such-and-such a table").
Instances on the Windows 7 machine work normally.
Moving the database file to one of the XP machines has the effect that the app works fine on ALL the XP machines, but the error occurs on the Windows 7 machine instead.
Just before the problem became apparent a newer version of the app was installed.
Uninstalling and installing the previous version did not solve the problem.
No other network changes that I know of were made, although I am not entirely sure about this, as the hardware guy did apparently visit around the same time the problems arose, perhaps even to do something concerning online backup of data. (There is data storage on more than one computer.) Apparently he did not go near the Windows 7 machine.
Finally, I don't know very much about networks, so please forgive me if the information I provide here is superfluous or deficient.
I have tried turning off antivirus on the Windows 7 machine, restarting, etc., but nothing seems to work.
It is planned to move our database from Jet to SQL Server Express in the future.
I need some suggestions as to the possible causes of this so that I can investigate further. Any suggestions would be greatly appreciated.
UPDATE 08/02/2011
The issue has been resolved by the hardware guy, who visited the client today. The problem was that on this particular LAN the IP addresses were allocated dynamically, except for the Windows 7 machine, which had a static IP address.
The static address happened to lie within the range from which the dynamic addresses were being selected. This wasn't a problem until last week, when a dynamic address was generated that matched the static one, giving rise to the problems I described above.
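To illustrate (the addresses below are invented for the example, not the client's actual configuration):
    DHCP pool  : – (handed out dynamically)
    File server: static           (inside the pool)
    Conflict   : the pool eventually assigns to another host
    Fix        : move the static address outside the pool (e.g., or add a DHCP reservation for the server's MAC address
Once two hosts share one address, connections to the file server fail intermittently on whichever machines resolve to the wrong one, which matches the on-and-off behaviour described above.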
Thanks to everyone for their input and thanks for not closing the question.
Having smart knowledgeable people to call on is a great help when you're under pressure from an unhappy customer and the gaps in your own knowledge mean that you can't confidently state that your software is definitely not to blame.
I'd try:
Validate that the same DAO and ODBC drivers are used on both the XP and Windows 7 machines.
Is the LAN a single broadcast domain? If not, rewire. (If routers are required, make sure WINS is working.)
Upgrade to MS SQL. It could be just a day of work, and well worth it. ;-)
regards,
//t

best way to set up a VM for development (regarding performance)

I am trying to set up a clean VM that I will use in many of my development projects. Hopefully I will use it many times and for a long time, so I want to get it right and set it up so that performance is as good as possible. I have searched for a list of things to do, but strangely found only older posts, and none here.
My requirements are:
My host is Vista 32-bit and the guest is Windows 2008 64-bit, using VMware Workstation.
The VM should also be able to run on VMware ESX.
I cannot move to other products (VirtualBox etc.), but info about the performance of each one is welcome for reference. Anyway, I guess most of the advice would apply to other OSs and other VM products.
I need network connectivity to my LAN.
When developing/testing, the guest will run several Java processes and a DB, and perform some file I/O.
What I have found so far is:
HOWTO: Squeeze Every Last Drop of Performance Out of Your Virtual PCs: it's an old post, and about Virtual PC, but I guess most things still apply (and also apply to VMware).
I guess it makes a difference to disable all unnecessary services, but the ones mentioned in that post seem like too few; I specifically always disable Windows Search (as sketched below). Are there any other services I should disable?
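A sketch of how I do that from an elevated command prompt, assuming the stock service name (WSearch is Windows Search; verify names in services.msc before disabling anything):
    sc config WSearch start= disabled
    net stop WSearch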
You can try running the installation CD/DVD through vLite to remove unwanted crap. I'm not 100% sure whether Windows 2008 Server is supported, but you could give it a try. I've successfully stripped XP down to about 200 MB with nLite, keeping only the bare minimum I need for testing software. You might be able to do something similar to Windows 2008 with vLite.
"My host is Vista 32-bit and the guest is Windows 2008 64-bit"
First mistake. Seriously, why not run 64-bit on the host as well? That would give your VM a good memory space to work with, whereas now, even though it is possible with VMware, it goes through really nasty APIs in the Windows layer.
That said, why use Vista as the host at all? Why not directly load a 2008 R2 host, configure it into workstation mode (heck, you even get our friendly Aero if you install all the things the server leaves out by default) and be happy with it?
"I guess it makes a difference to disable all unnecessary services"
Hm, seriously? I run a couple of Hyper-V hosting servers on top of physical domain controllers without any reconfiguration and with good enough (i.e. great) performance. It helps that I don't have the typical workstation bottleneck (i.e. one overloaded hard disk). I never found a reason to disable any service to squeeze out the last bit of performance.
"The guest will run several Java processes and a DB, and perform some file I/O"
Well, get proper hardware for that, i.e. a hardware RAID controller and a LOT of drives, in accordance with your needs. A DB is I/O sensitive. VERY sensitive.

Why is WMDC/ActiveSync so flaky?

I'm developing a Windows Mobile app using the .NET Compact Framework 3.5 and VS2008, debugging with the Device Emulator V3 on Windows 7, and I seem to have constant problems getting Windows Mobile Device Centre (6.1) to connect.
Using the Emulator Manager (9.0.21022.8), I cradle the device over DMA in WMDC. The problem is that it's so flaky at actually connecting that it's becoming a pain.
I find that when I turn my computer on, before I can get it to connect I have to open WMDC, disable 'Connect over DMA', close WMDC, then reopen it, and then it might cradle. Often I have to do this twice before it will cradle.
Once it's cradled it's generally fine, but nothing seems consistent in getting it to connect.
Connecting with physical devices is often better, although not always. If I plug a PDA into a USB socket other than the one it was originally plugged into, it won't connect at all. Often the best/most reliable connection method seems to be Bluetooth, but that's quite slow.
Anybody got any tips or advice?
I thought I'd add some notes to this so I could close it off. Generally, I found the following to make it a little more stable:
1) Don't set up a partnership. While this isn't a major issue, if a partnership is in place, a blank dialog window pops up whenever I connect a device.
2) Always make sure you connect the device (via USB) to the same USB port it was on when the device driver was first installed. Moving the lead to a different port will often cause the device not to connect.
3) If it doesn't connect, open WMDC, un-tick 'Allow connections over DMA', close the WMDC screen, then re-open it and re-tick the DMA option; it will generally connect all of a sudden.
4) Also, if you're using the device emulator and have it cradled, ensure you disconnect the ActiveSync connection before saving the state of the device when closing it. If you fail to do this it will not be able to connect when you restart the device, until you fiddle about with the connection enough for it to realise that it's not actually connected.
If anybody else has any tips for making it more stable, feel free to add them.
Try deleting the existing device partnerships. That has helped me in the past when WMDC/ActiveSync was playing up.
Go to Control Panel -> Sync Center and delete the device partnerships you see listed there.
Then reconnect your device/emulator, and when the WMDC window comes up, make sure you create a new device partnership (in my experience the connection/debugging becomes flakier when you choose not to set up a partnership).
Another thing to point out about ActiveSync, which I learnt the hard way: the battery drains faster with ActiveSync because, even when the device is not in the cradle, ActiveSync is actually running in the background. Whether it is a bug or not I'm not sure, but it 'thinks' the device is still in the cradle and continuously polls for the connection. Here's the link that explains it. And here's the temporary fix.
The way I dealt with it is to run a small C program that looks for the ActiveSync process and kills it each time I un-cradle the Pocket PC.
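The post doesn't include the program itself, but a minimal sketch of that kind of helper using the Win32 Toolhelp API might look like the following. The process name is an assumption (WMDC's desktop process is typically wmdc.exe, classic ActiveSync's is wcesmgr.exe; check Task Manager for yours), and it should be built as an ANSI, non-Unicode project:

    #include <windows.h>
    #include <tlhelp32.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        const char *target = "wmdc.exe";  /* assumed process name - verify in Task Manager */

        /* Take a snapshot of all running processes. */
        HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPPROCESS, 0);
        if (snap == INVALID_HANDLE_VALUE)
            return 1;

        PROCESSENTRY32 pe;
        pe.dwSize = sizeof(pe);
        if (Process32First(snap, &pe)) {
            do {
                /* Case-insensitive match on the executable name. */
                if (_stricmp(pe.szExeFile, target) == 0) {
                    HANDLE proc = OpenProcess(PROCESS_TERMINATE, FALSE, pe.th32ProcessID);
                    if (proc != NULL) {
                        TerminateProcess(proc, 0);
                        CloseHandle(proc);
                        printf("terminated %s (pid %lu)\n", target, (unsigned long)pe.th32ProcessID);
                    }
                }
            } while (Process32Next(snap, &pe));
        }
        CloseHandle(snap);
        return 0;
    }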

Are there any drawbacks to running Visual Studio remotely?

Let's say you have a slow laptop which can't handle Visual Studio but a blazingly fast desktop that can. Let's also say that you want to develop in several rooms in your house. Are there any drawbacks to having Visual Studio running on the desktop and simply using the laptop as a way to access it remotely? I'd guess that the only thing that you would be concerned about would be the network latency, but if the two computers are on the same network that should be minimal.
Do it.
Since you are running Visual Studio in your own local network, the main drawbacks (security and latency) are not there. In addition, you get the speed of your desktop and the mobility of your laptop.
I do this a lot, even over broadband; I've never found speed to be a problem.
This is my standard working practice at work. There are times when you have issues (for example, opening TFS document attachments can fail), but overall the experience is fine.
It is also an added bonus that you can leave it running continually (i.e. overnight/weekends): you can kick off a build before you leave for the evening and come back to a packaged installer (or an error :) ).
I'm looking forward to (in a year or two) being able to do this over Hyper-V; then the application will run as though it IS on my laptop, with no remote desktop required.
No big drawbacks. I've been running VS 2008 remotely on a server 400 miles away, using GNU/Linux and rdesktop on my laptop, with the server (of course) running Windows. The only problem I encounter is that it is a mess to move files between the two; but if you have the desktop nearby and can install anything you like (FTP programs, for example), I can't see any drawbacks.
In a corporate work environment where I've tried this, I never felt particularly joyful. I tried using MSTSC and VNC.
Having a desktop with multiple monitors and trying to view it through a smaller laptop display is typically quite painful; there's never enough space.
Even when the PCs were on the same switch, there always seemed to be some delay in the mouse moving or typing. I'm sure you could adjust; I just found it a bit annoying.
We haven't tried serving up DevStudio from a Citrix server yet; that might be worth a go.
I work a lot with Visual Studio over broadband, which is OK.
If you are running Linux on your laptop, rdesktop is your friend. There are many options for gaining more speed, like using 8-bit colour instead of 16 or more; I don't know if mstsc offers such options. Visual Studio 2008 also has many speed-related options that can be enabled if the connection is too slow: disable fancy menus, etc.
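For example (the host name is invented, and options vary between rdesktop versions, so check the man page):

    rdesktop -z -P -a 8 -x m devbox

Here -z enables RDP compression, -P turns on persistent bitmap caching, -a 8 requests 8-bit colour, and -x m applies the 'modem' experience preset (no wallpaper, themes or menu animations).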
greetings
I think that a dual (or more) monitor set-up beats the ease of mobility of a laptop connecting to a remote desktop. I work at home at least two days a week using my laptop (a 17", 1920x1200 screen, basically what they call a "desktop replacement"), connected to VS and TFS over VPN, and I find that experience worse than the situation at work, where I have the 17" laptop screen AND a 24" TFT (also 1920x1200).
I have also found that running VS (or SQL Server Management Studio, for example) over an RDP session is just not like the real thing. It gets the job done, but the "feel" just isn't the same.
