Will a Google Colab notebook connected to a local GPU/runtime ever disconnect?

I am running a Google Colab notebook on my local GPU, using Colab's feature of connecting to a local runtime.
I was wondering whether, when running Colab this way, it still has the same issue of the runtime disconnecting after a few idle hours while training a model.
Thanks in advance,
Pedro

I have found from experience that when Google Colab is connected to a local runtime (e.g. a GPU on your own machine), it never disconnects.
The 12-hour limit only applies when you are using Google's resources; since none are being used in this setup, it does not apply.
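For reference, "connecting to a local runtime" just means pointing Colab at a Jupyter server running on your own machine. A rough sketch of the setup, assuming a working Jupyter install (the exact, current commands are in the local-runtimes page linked further down):

pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
# allow colab.research.google.com to connect to this local Jupyter server
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0

Then paste the localhost URL (including the token) that Jupyter prints into Colab's "Connect to local runtime" dialog.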

Notebooks run by connecting to virtual machines that have maximum lifetimes that can be as much as 12 hours. Notebooks will also disconnect from VMs when left idle for too long. However, maximum VM lifetime and idle timeout behavior may vary over time, or based on your usage, because this is necessary for Colab to be able to offer computational resources free of charge.

Related

How to run Google Colab from the terminal?

I am running a deep learning model in Google Colab and it works fine with the Colab notebook. The problem is that as the training of the deep learning model progresses in the cloud Colab notebook, my own computer's CPU and memory usage also start to go up. The RAM usage for the Colab notebook browser window alone is more than 500 MB, and it keeps climbing as the training progresses.
In Google Colab we have to keep our running notebooks open to train the model, otherwise we lose all our previous work and training stops. Can I run Google Colab from my terminal window instead of the browser? Is there any way to force Google Colab to run in the cloud alone, that is, to run the notebook without keeping my computer on?
Instructions for local execution are available here:
http://research.google.com/colaboratory/local-runtimes.html
If you combine these with an SSH tunnel, you can run the backend on your own GCE VMs, AWS, or anywhere else you like.
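For the SSH-tunnel variant, a minimal sketch (hostname and port are placeholders): start Jupyter on the remote VM as in the instructions above, then forward its port to your laptop and give Colab the localhost address:

ssh -N -L 8888:localhost:8888 your-user@your-remote-vm
# Colab's "Connect to local runtime" dialog then takes http://localhost:8888/?token=<token>

The -N flag just keeps the tunnel open without running a remote command.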

Is there any data on how fast Azure VM local drives are?

I'm experimenting with OnStart() in my Azure role using "small" instances. It turns out it takes about two minutes to unpack a 400-megabyte ZIP file located in "local storage" on drive D into a folder on drive E.
I thought maybe I should do it some other way, but I can't find any data about how fast the local disks on Azure VMs typically are.
Are there any test results for how fast Azure VM local disks are?
I just ran a comparison of disk performance between Azure and Amazon EC2. You can read it here, although you will probably want to translate it from Norwegian :-)
The interesting parts, though, are the following HD Tune screenshots.
First, a small instance at Amazon EC2 running Windows Server 2008:
Next, a small instance on Azure running Windows Server 2012:
This isn't a fair comparison, as some of the differences may be due to missing Windows 2012 drivers, but you may still find it useful.
As pointed out by Sandrino, though, small instances at Azure only get "moderate" I/O performance, and this may be an argument in favor of Amazon.
It all depends on your VM size: https://www.windowsazure.com/en-us/pricing/details/#cloud-services. As you can see, a small instance will give you moderate I/O performance, while medium/large/XXL instances will give you high I/O performance.
If you want specifics, I suggest you read through this blog post: Microsoft SQL Server 2012 VM Performance on Windows Azure Virtual Machines – Part I: I/O Performance Results. It covers the SQLIO tool, which can help people decide on moving their SQL Server infrastructure to Windows Azure VMs.
This tool is interesting since it might give you exactly the info you need (read and write MB/s).

Is there a way to have XCode ask before downloading documentation/API updates?

By default, XCode is set up to download updates associated with the various documentation and API libraries available to the application. This can be disabled from the XCode preferences screen. However, I'd prefer not to disable the automatic updates but, rather, to be prompted before they start so that I can dismiss them and download them at a later time.
My reasoning is that I primarily work through a network connection that reaches the internet via a 4G wireless hotspot that gets hit by overage fees. In fact, I'm connected to the internet through this device about 90% of the time I'm working on my Mac. When I need to download any kind of large software update, I take my MacBook to an open, direct-connect source to the internet and let it do what it needs to do.
This works fine for most software, but not XCode.
I want my updates to remain automatic (so that I am at least informed that an update is available); however, I'd like to have the choice of whether or not to initiate them.
Is there something that I can do to make XCode ask before downloading?
A QUICK NOTE
I know how we technically minded folk are: half of you are still wondering why I work off of a 4G hotspot and want to fix that problem instead of the one I asked about. (Yes, I tend to think this way too.)
However, I work in an environment with an IT department that adamantly refuses to allow any operating systems other than Win XP and Win 7 onto their network. The Engineering team (which I work for) has to have an internet connection and an internal network for storing and backing up data, and we are developing iOS software that is integrated with our products. This is obviously problematic since we need to use MacBooks to do our work.
Our solution to this dilemma has been to set up our own small LAN, and our only way of getting internet access is through cellular WiFi. All WiFi ISP plans available in our region are tier-based, and overages are charged (at a reasonable rate) when we use more than our allotment of data. We don't mind going over our quota; however, we need to keep it reasonable. Automatic updates like this can take a huge toll on our network when a few of us each have to download a few GB of data per month.
Software like Little Snitch (a firewall) might be the best solution.
When Xcode tries to update itself, Little Snitch shows a connection alert dialog.
Now you can simply Allow or Deny the connection.
This way you can accurately control all connections (and your privacy).
Other applications might also consume quite some bandwidth.
P.S.
Why not connect a cheap XP box to the network and have it share the internet connection with all the Macs?

Network problem, suggestions sought

The LAN has about a half dozen Windows XP Professional PCs and one Windows 7 Professional PC.
A Jet/Access '97 database file is acting as the database.
The method of access is via DAO (DAO350.dll), and the front-end app is written in VB6.
When an instance is created it immediately opens a global database object which it keeps open for the duration of its lifetime.
The Windows 7 machine has been acting as the file server for the last few months without any glitches.
Within the last week, instances of the app will work for a while (say 30 minutes) on the XP machines and then fail on database operations, reporting connection errors (e.g. disk or network error, or unable to find such-and-such a table).
Instances on the Windows 7 machine work normally.
Moving the database file to one of the XP machines has the effect that the app works fine on ALL the XP machines, but the error occurs on the Windows 7 machine instead.
Just before the problem became apparent, a newer version of the app was installed.
Uninstalling it and reinstalling the previous version did not solve the problem.
No other network changes that I know of were made, although I am not entirely sure about this, as the hardware guy did apparently visit around the same time the problems arose, perhaps even to do something concerning online backup of data. (There is data storage on more than one computer.) Apparently he did not go near the Win 7 machine.
Finally I know not very much about networks so please forgive me if the information I provide here is superfluous or deficient.
I have tried turning off the antivirus on the Win 7 machine, restarting, etc., but nothing seems to work.
It is planned to move our database from Jet to SQL Server Express in the future.
I need some suggestions as to the possible causes of this so that I can investigate it further. Any suggestions would be greatly appreciated.
UPDATE 08/02/2011
The issue has been resolved by the hardware guy, who visited the client today. The problem was that on this particular LAN the IP addresses were allocated dynamically, except for the Win 7 machine, which had a static IP address.
The static address happened to lie within the range from which the dynamic addresses were being selected. This wasn't a problem until last week when a dynamic address was generated that matched the static one and gave rise to the problems I described above.
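For anyone hitting something similar, a quick way to check for a duplicate IP from one of the Windows machines (the address below is just an example) is:

ipconfig /all
arp -a
ping 192.168.1.50

ipconfig /all shows each machine's address and whether it came from DHCP; if arp -a ever shows the same IP resolving to two different MAC addresses, or the example address still answers a ping while the machine that should own it is unplugged, you have a conflict. The usual fix is to move the static address outside the DHCP pool or reserve it on the DHCP server.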
Thanks to everyone for their input and thanks for not closing the question.
Having smart knowledgeable people to call on is a great help when you're under pressure from an unhappy customer and the gaps in your own knowledge mean that you can't confidently state that your software is definitely not to blame.
I'd try:
Validate that the same DAO and ODBC drivers are used on both the XP and Windows 7 machines.
Is the LAN a single broadcast domain? If not, rewire. (If routers are required, make sure WINS is working.)
Upgrade to MS SQL. It could be just a day of work, and well worth it ;-)
regards,
//t

Simplest way to get access to a remote server for computing tasks

I'm working on some academic research projects involving scraping large data sets from the web using Python. It's been inconvenient to work on my academic institution's Linux server because (1) I don't have superuser access, meaning I'm dependent on the IT staff to install my packages, and (2) my disk quota is somewhat limited (I would ideally want ~10 GB). What is the simplest way for me to get access to a machine that solves these problems? I don't need huge processing power; I just need access to a reasonably fast machine that runs 24/7, so that my programs can run continuously, and above all, something very simple to get running, use, and maintain, since I have a few non-CS people working on this project with me. Linux would be preferable, but I'd consider Windows too.
I'm aware of Amazon Web Services, but am wondering if there's something more appropriate to my specific needs.
By the way, it would be a huge bonus if I could get some sort of remote desktop access to this machine so I wasn't limited to using SSH and SFTP.
Suggestions?
EDIT: I can't use VirtualBox or Virtual PC because I need the program to be running around the clock, and I need to turn off my laptop often, etc.
If you do want to stick with running on your CS department's machines, use virtualenv to solve your package installation woes. And if disk space is an issue, you could use S3 (and perhaps FUSE) to store huge amounts of data extremely cheaply.
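To make the virtualenv suggestion concrete, here is a minimal sketch, assuming virtualenv itself is available (pip install --user virtualenv if not); the package names are just examples for a scraping project:

virtualenv ~/scrape-env
source ~/scrape-env/bin/activate
pip install requests beautifulsoup4

Everything installs under your home directory, so no superuser access is needed; note this only solves the package problem, not the disk quota.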
However, if that's not really what you're after, I can recommend Slicehost very highly. They give you a virtual private server - so you have complete control over what gets installed, users, admin, etc.
In principle, it's very much like EC2 (which I prefer to use for "real" servers), but has a friendly interface, great customer service and is aimed at smaller projects like yours.
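Whichever provider you pick, the part that lets the job run around the clock while your laptop is off is simply detaching it from your SSH session; a rough sketch (the script name is a placeholder):

nohup python scraper.py > scraper.log 2>&1 &

Alternatively, run it inside screen (or tmux) so you can log back in later and reattach with screen -r.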
Use x11vnc with ssh.
Run 'sudo apt-get install x11vnc' on your remote server.
Once you have that, you can access your remote server via vnc, but the great thing is that you can tunnel vnc over ssh like so:
ssh -X -C -L 5900:localhost:5900 remotehost x11vnc -localhost -display :0
For more details see the x11vnc manpage.
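Once the tunnel is up, point a VNC viewer on your own machine at the forwarded port, for example:

vncviewer localhost:0

Display :0 corresponds to TCP port 5900, which is exactly what the tunnel forwards, so the VNC traffic never leaves the SSH connection.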
Or, just set up remote desktop (which is actually VNC) on your Linux distribution. Most distributions come with a GUI to configure remote desktop access.
If you have a Linux machine you can use, then ssh -X will allow you to start GUI programs. It's not remote desktop, but it's close.
ssh -X whoever@whatever.com
firefox
Then bam. A Firefox window pops up on your desktop.
I have been pretty happy with TekTonic Virtual Private Servers. It's a virtualized environment, but you have full root access to install any packages you need. I'm not sure what your CPU and memory constraints are, but if they aren't too extensive then this should fit the bill nicely for you. I don't know if you would be able to enable a remote desktop, as I've never tried, but it may be possible to install the requisite packages.
The plans range from $15/mo to $100/mo; the $15/mo plan comes with 294 MB RAM, 13 GB of disk space, and a 2.6 GHz max CPU speed. I ran on that plan for quite a while and eventually moved up to the next level with double the disk/CPU/memory, and I've been quite happy with it. I've been with them since 2003 and have yet to find anyone who offers equivalent plans at these prices.
