I am about to travel to Europe (I'm Australian but imagine this is a similar circumstance for US users and simply flipped for European users).
However, there is the slim possibility I will need to do some Visual Studio work while I'm travelling.
As I see it I have three options:
1) Leave a desktop PC on at home and access it remotely via net cafes.
2) Carry a laptop with me on the trip, uploading files as required using public wifi.
3) Option 2, but with a cheap, light netbook that is miraculously capable of running VS.
Does anyone have any experience with, or advice on, any of these options?
For reference, this existing post suggests that using VS remotely over short distances is okay, but that it can be more problematic over longer distances. I've used VS via RDP to a US server before and it was pretty laggy, but for small changes I could get by.
Concerns I have that you may have some experience with:
Weight of luggage (I'd ideally like to travel light)
Security of laptop (I imagine it'll be too heavy to carry around all the time, so I'd have to leave it at the hotel/hostel etc. and hope for the best)
Security of data (don't want someone stealing RDP access to my home PC)
Security of FTP (don't want someone stealing FTP passwords over wireless)
I'd go with option #2 (carry a laptop that can run VS).
This way you can use the "more convenient" method if it works well (use it as an RDP client if the connection is low-latency enough), but you can still work locally if the connection you find is not reliable.
I think the bottom line is, always have a backup method when depending on networks that are far away and beyond your control.
Edit: Regarding the additional security concerns, most of those are things you should deal with anyway, traveling or not. If the stuff you're working with is that sensitive, you should probably improve the security of your remote work environment with a VPN and a more secure file transfer method. Before you take your laptop anywhere, know what your plan is if you lose it.
It's a vacation. How do you expect to rest up properly if you're always worrying about work. Leave the phone at home too.
I used to leave a home PC on with VS and use a service like GoToMyPC or LogMeIn.
Since I have started using a laptop, I just carry the thing with me with VPN connectivity on business trips along with a 3G data card.
But seriously, if on vacation, I do not want to take my laptop with me.
Security
First and foremost, encrypt the contents of the HDD - be safe.
If I am on a business trip, the laptop is with me so I am not as concerned with where it is. If I am on vacation, I do not know that I want to take one with me.
If it is important, then I would keep my laptop/PC at work on, with someone who has access to turn it on/reboot it. So I would carry a light laptop that lets me connect and work if I need to. If that goes down, I can always head into a cybercafe.
Database
If you are anticipating working, bring your dev database with you. I know it hogs space and memory (while in use), but pulling data over the wire has taken long enough to make me lose concentration.
Standalone
Make the laptop standalone so that it can work without a connection to VPN or internet - coverage is not the best / uniform in all areas.
Use TrueCrypt to encrypt your hard disk. Use VPN, SSH or something similar for remote connections. I always bring my laptop, but if I were to lose it, it's just a brick for the finder, and I have a good backup system that lets me get up and running on another computer quickly.
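For example, if you leave an SSH server reachable on the home machine, you can wrap both the RDP session and your file transfers in the encrypted tunnel instead of exposing them directly. A minimal sketch; the host name and account are placeholders:

    # Forward a local port to the home PC's RDP port (3389) through SSH,
    # so RDP traffic never crosses the cafe wifi in the clear.
    ssh -L 3390:localhost:3389 me@home.example.com

    # ...then point the RDP client at localhost:3390 instead of the home PC.

    # File transfer over the same encrypted channel, instead of plain FTP:
    sftp me@home.example.com

That also covers the RDP and FTP password concerns from the question.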
I tried installing VS2010 on my NetBook and it was a no-go. I was, however, able to install Expression Blend/Web which is good for most tasks.
Edit: To make this more useful... my netbook is HP Mini 1100 Series w/1GB RAM running Windows 7 "Starter"
Beware: I don't know where you are going in Europe, but do not count on a reliable internet connection in a hotel. It generally works, but when it does not, don't count on the personnel to repair it. Of course, if you also carry your own connection (3G or EDGE on your mobile phone), then this will not be a problem.
I suggest using option 2 when working on your source code.
I also recommend using Git so you can keep working with source control while disconnected from the office. When you get access, you can sync your whole repository with your office repository, as sketched below.
Of course, it all depends on which source control provider you are using.
For the occasional stuff that is not in Git, use a VPN for enhanced security.
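The disconnected workflow is roughly this (the repository URL and branch name are placeholders, assuming the office repository is reachable over Git):

    # Before leaving: take a full local copy of the office repository.
    git clone https://office.example.com/project.git

    # On the road, fully offline: commit locally as often as you like.
    git add -A
    git commit -m "work done while travelling"

    # Whenever you find a decent connection: sync both ways.
    git pull origin master
    git push origin master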
My experience:
1) Purchased a small netbook (a Samsung netbook with 2GB or so of RAM; I can look up the exact model number if anyone's interested, but I think it's comparable to, or just above, the NC10 - just comment if interested).
2) Internet is bad in Europe (at least the options available to travellers). Something to note.
3) The netbook performance was absolutely fine. You don't want to be doing too much dev because of the small screen (though it was only really an issue for me because I got sick of the trackpad and didn't have a separate mouse) but it's honestly pretty fast and easy to use for .NET MVC development in Visual Studio.
Just a question: is it a good idea to host machines with Hyper-V on a DC? This is my idea.
If the answer is no, can you explain why?
Thanks, and have a nice day.
First of all, your question doesn't seem to be related to programming in any way and should hence be considered off topic for SO. Server Fault would probably be a more suitable place for it (the question is somewhat old by now as well and you might already have found the info you need, but I've flagged it for the moderators to perhaps consider a move from SO to SF or have it closed altogether).
Secondly, as for your question:
Generally no, it's not a good idea, but there could be ifs and buts to everything, I guess.
For a smaller company with perhaps only the one existing server (and no budget to add machines or get professional help to make any bigger changes to their current setup), which also happens to be their DC, I guess it all comes down to what kind of workload the DC is under to begin with and just what will be hosted in Hyper-V. I'd personally still recommend against it, though.
It's not a good idea, as it's not a supported scenario from MS. I don't even know if Server Manager lets you install both.
You can host a VM with AD, but depending on your setup (cluster/Hyper-V in a domain or not, ...) you really should add a physical AD server (even a very small one) for Hyper-V to authenticate its services against when your AD VM has not started. It can save you a lot of time...
I've built a program via MS Access 2007 that I distribute via the Microsoft Access Runtime. My clients do not have Access. Recently I've received multiple requests for the application to be available for Mac. The volume of requests is low enough that it's not economical to rebuild the entire program in another language.
What would be the most economical method of allowing users to use the software on a Mac?
Is LibreOffice or Wine an option in this case, or is the only option for the user to purchase Windows and use a virtual environment?
LibreOffice Base: Extremely unlikely. Even if you were to get Base to connect to the Access tables, it almost certainly would not be able to use the Access forms, reports, macros, VBA code, etc.
Wine: Worth a try, but I wouldn't be at all surprised if there were issues, quite possibly serious ones. According to the WineHQ page here, Access 2010 gets a "Bronze" compatibility rating, meaning
Application works, but it has some issues, even for normal use; a game may not redraw properly or display fonts in wrong colours, be much slower than it should etc.
That same page lists "Visual Basic" as one of the things that did not work under Wine when it was last tested.
If I were you I would give the latest version of Wine a quick try to see if things have improved but I wouldn't spend more than a couple of hours tinkering with it. I suspect that a Virtual Machine running an actual copy of Windows is probably the only real option in this case.
I have secured the budget to upgrade the individual workstations and laptops. While newer, bigger screens were welcomed with enthusiasm, the thought of re-installing tools and settings caused most of them to blanch, and I got one "Do I really have to?".
How much downtime do you usually have when you move to a new machine?
Do you employ tools or scripts to set up your dev environment, tools, DBs, debuggers etc., specifically for a Windows environment?
Is there a standard image that you keep and then let devs move in and tweak the machine as necessary?
My company essentially virtualized in order to stop wasting so much time with upgrades/system failures.
Whenever a desktop/laptop failed, we'd have to spend a better part of a day fixing it and reloading the software.
So, we went out, bought iMacs for everyone and loaded Parallels (a VMware-like product for OS X) on them. Then we made a standard dev image for everyone, and just copied it to everyone's machines.
Essentially, if anyone's configuration got messed up, we just loaded in a fresh image and kept on truckin'. Saved a lot of time.
Some additional benefits:
When new software is out, we just make a new image and distribute it. No OS re-installs or anything like that.
If hardware changes, doesn't matter, just move the image.
You can run multiple os's concurrently for testing
You can take "snapshots" in your current image and revert if you really messed something up.
Multiple builds on the same machine...since you can run multiple os's.
Surprisingly the overhead of a virtualized system is quite low.
We only run the software on a real machine for performance tuning/testing purposes.
One day is generally enough for upgrades. I do keep digital copies of VS.NET, which makes it much easier to install.
When it comes to other tools, it's generally just better to go to the websites and install the latest versions.
Also, it's a good idea to install tools whenever you need them instead of trying to install everything at the same time.
The last time I upgraded to a new machine, I think it took about 4 hours to get most of the necessary tools reinstalled. Over time, I've had to re-install quite a few more tools, but I think it's worth it.
If you can get a ghost/image of the default tool set (Visual Studio 2003-2008, Eclipse, NetBeans, or whatever you're using), and all the major service packs, that would help a lot with the initial setup.
I think the downtime is definitely worth it, a new, faster machine will make anyone more productive.
You can have zero downtime by having both machines available, though you will not be as productive.
This depends on the number of tools needed by the development team. Tools such as Rational Software Architect can take hours to install on their own. The exercise of having the developers list the applications they need before moving in can help you optimize strategies to deploy effectively. Both machines should be available for a fixed period of time, and having them available can allow developers to both work and kick off long-running installs at the same time.
Creating a standard image based on the list provided to you can improve efficiency. Having the relevant software on a share could also let them cherry-pick as needed and give the development team the feeling that they can go back as necessary.
Tools to assist in capturing user settings exist. I have only ever had experience with Doctor Mover. If you have 100 or more developers to move it may be worth the cost. I can't complain too much, but it wasn't perfect.
I have never had a problem with just getting a list of all the software a particular user uses. In fact, I have never found the base install to be much of an issue. The parts I tend to spend the most time on are re-configuring all of the users' custom settings (very common with developers, I find). This is where it is very valuable to have the old machine around for a while, so that the user can at a minimum remote-desktop to it and see how they have things set up.
Depending on how your team works, I would highly recommend that every user receiving a new computer get the latest source tree from your source control repository rather than copying entire directories. And I would also recommend doing that before actually sending the old workstation elsewhere or even disconnecting it.
One of the great things about tools like CVS and SVN is that it is quite easy for developers to end up with an unofficial "personal branch" from things that are not properly checked in, merged, etc.
While it will cost time to deal with the shift if things are not properly synchronized, it is an invaluable opportunity to catch those things before they come back to haunt you later (a quick check is sketched below).
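With SVN, for instance, a quick status check on the old machine surfaces everything that never made it into the repository (the paths and annotations here are just illustrative):

    C:\work\project> svn status
    M       src\Billing.cs        <- modified locally, never committed
    ?       src\QuickHack.cs      <- never added to the repository

Anything flagged M should be committed or merged, and anything flagged ? should be added (or consciously discarded) before the old box goes anywhere.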
I'm a (happy?) user of Windows, but recently I've had problems that I don't know how to track down.
I have a WinXP system plus home and work Win2k3 systems. Some of them freeze intermittently for a short amount of time (from less than a second to a few seconds). There is no CPU usage spike and not much HDD activity. Neither Process Explorer nor Windows Task Manager shows any suspicious processes. The services also look OK.
On one of the computers, dragging and dropping (within Explorer windows or between windows and apps) freezes the machine for 10-20 sec. After this period I can continue to use drag & drop for some (long) time with no delays. I don't think it is a virus – it would probably have infected all the machines easily.
How can I know what is going on with my systems?
Update: Thank you for your suggestions. I solved the problem on one of the machines – it was a nasty rootkit. I needed to use a 3rd party tool to detect and remove it. How could I have diagnosed it without this tool?
This is most likely not faulty hardware.
On Windows, there are occasional messages that are broadcast system-wide to all top-level windows. If a window does not respond (or is slow in responding), then the whole system will appear to freeze. There is a built-in timeout and if exceeded, the system will assume that the window isn't going to respond and it skips the window (this could be the 10-20 second delay you're seeing although I think the timeout is a little higher than this).
I have not seen a solution for tracking these kinds of problems. You might experiment by creating a program that sends individual messages to each top-level window and records the time taken for each to respond (a rough sketch follows below). This isn't failsafe, but it's a starting point, and this is (if I recall correctly) the technique I used to identify such a problem with Adobe's iFilter (for the Microsoft indexing service).
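A minimal sketch of such a probe in Win32/C++; the WM_NULL choice, the 5-second timeout and the 200 ms "slow" threshold are just assumptions to start from, not anything official:

    // Probe every top-level window with WM_NULL (a no-op message) and time the reply.
    // A healthy window answers almost instantly; a hung one trips the timeout.
    #include <windows.h>
    #include <stdio.h>

    BOOL CALLBACK ProbeWindow(HWND hwnd, LPARAM)
    {
        char title[256] = "";
        GetWindowTextA(hwnd, title, sizeof(title)); // reads the cached title, never hangs

        DWORD_PTR result = 0;
        DWORD start = GetTickCount();
        LRESULT ok = SendMessageTimeoutA(hwnd, WM_NULL, 0, 0,
                                         SMTO_ABORTIFHUNG | SMTO_BLOCK,
                                         5000 /* ms */, &result);
        DWORD elapsed = GetTickCount() - start;

        if (!ok)
            printf("HUNG (no reply in 5s): %p \"%s\"\n", (void*)hwnd, title);
        else if (elapsed > 200)
            printf("SLOW (%lu ms): %p \"%s\"\n", elapsed, (void*)hwnd, title);

        return TRUE; // keep enumerating
    }

    int main()
    {
        EnumWindows(ProbeWindow, 0);
        return 0;
    }

Run it while the machine is in one of its freezes; whatever shows up as HUNG is your prime suspect.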
But before you go down this path, you said that these are recent problems. See if you can figure out what you might have installed recently and then uninstall it. This includes Windows patches as well as any new drivers or applications.
Are you able to peg it to a rough time-frame of when the symptoms started? If so, you could match the critical updates/installs in Add/Remove programs to that estimation and start looking there.
More generally, I find using MSCONFIG to temporarily turn off all startup programs and all non-Microsoft services can help quickly divide and conquer - If the symptoms disappear, you have a shorter list to work through.
Safe mode (with or without network - see next idea) is another way of narrowing the list of suspects.
Since it is multiple machines, if it were hardware it would have to be something common... Especially if it is two different locations. That said, network connectivity (or lack thereof) is the other frequent culprit. Bringing up a system in a standalone config (net cable unplugged/wireless radio disabled) will seem VERY slow at first, then once the timeouts and various retries have been exceeded, should zip along, especially if you are still running in a limited startup environment. I have had recalcitrant switches/routers be a problem, as well as sluggish external services (like an ISP's DNS) cause symptoms like this.
No floppy, optical, or other removable drive access at those times?
I would recommend a tool that can show files, COM objects and network addresses accessed within the application:
http://www.moduleanalyzer.com/
You can see the DLLs that use each resource and how long the accesses are taking.
The problem with Windows slowdowns is in general related to a DLL, running in one or more processes, that is doing some stuff inside the process.
In these situations you won't see anything in tools that monitor from a process perspective. You will need to see what is happening inside the process to spot any suspicious DLL or module.
This tool uses call stack information to see which module is accessing resources.
Try the application; it has a full-featured trial.
You probably have a faulty piece of hardware; from my experience, likely your HD. If you are connected to a network share (SMB) and having connectivity issues, that also could cause hangs. The drag-and-drop slowness in general points to the "explorer" process hanging, the same process used to communicate with network resources (file shares, for example).
To diagnose the activities or infiltration methods of a rootkit or other malware, you might check out the forums on Bleeping Computer; some of the volunteers there who help people remove such infestations may be willing to help you figure out where to look.
I recently cleaned up some malware through the help of an expert on that site which I also needed to use a third-party tool (in my case Malwarebytes) to remove, but the malware was relatively new such that this tool couldn't fully clean out the stuff until a more recent update to its definitions got released.
I still don't know how or where exactly to look on a given system for such an infestation, but that site might hook you up with someone who has that expertise. As long as you emphasize that you're looking for this to be able to track down such and not for purposes of writing your own malware I would hope they'd be receptive to your request.
As developers, we believe that not having local administrative access is going to severely handicap our productivity. We will be restricted from running IIS (we’re a web development shop), installing applications, running Microsoft power tools, etc. If you’re going through the FDCC process now, it would be great to hear how you are coping with these changes.
Having actively worked as a contract developer at a base that uses the AF Standard Desktop, I can tell you a few things.
1: And most important: don't fight it, and don't do what the first person suggested ("let them choke on it"). That is absolutely the wrong attitude. The military/government is fighting lack of funding, overstretched resources and a blossoming technology footprint that they don't understand. The steps they are taking may not be perfect, but they are under attack and we need to be helping, not hindering.
OK, that's off my chest.
2: You need to look at creating (and I know this is hard with funding the way it is) a local development lab. Every base that I have worked at has an isolated network segment you can get on that has external access but is isolated from the main gov network. You basically have your work PC for e-mail, reports etc. that is on the protected network, but you develop in your small lab. I've had a lab be 2 PCs tucked under my desk that were going to be returned during a tech refresh. In other words, be creative with making yourself a development machine + servers that are NOT restricted. Those machines are just not allowed to be connected to the main LAN segment.
3: Get the distributions of the desktop configurations. Part of your testing needs to be deploying/running on these configurations. Again, these configurations are not meant for development boxes. They are meant to be the machines the people use for day to day gov work.
4: If you are working on web solutions, be very aware of the restrictions on adding trusted sites, ActiveX components, certs, certain types of script execution that the configuration won't allow. Especially if you are trying to embed widgets/portlets/utils that require communications outside the deployed application domain.
5: Above all remember that very few of the people you work for understand the technology they are asking you to implement. They know they want function X but they want you to follow draconian security rule Y while achieving it. What that usually means is that the "grab some open source lib or plugin and go" is not an option. But, that is exactly what your managers think you are going to do because of the buzz around rapid development.
In summary, it's a mess out there. Try to help solve the problem.
While I've never been through the FDCC process, I once worked for a U.S. defense contractor whose policy was that no one had local administrative access to their machines. In addition, flash drives and CD-ROMs were disabled (if you wanted to listen to music on CDs, you had to have a personal CD player with headphones).
If you needed software installed you had to put in a work order. Someone would show up at your desk with the install media, login to a local admin account, and let you install the software (the reasoning being that you knew what to install better than they did). Surprisingly, the turnaround was pretty quick, usually around 1/2 an hour.
While an inconvenience, this policy didn't really cripple us. We were doing a combination of Java, C++ (MS Visual C++ and GNU/C++), VB 6.0 and some web development. For what little web development we did, we had a remote dev box we would RDP into for testing. Again, a bit of an inconvenience, but it didn't stop us from getting our jobs done.
Without ever having had the problem, today I'd probably try a virtualising solution to run these tools.
Or, as a friend of mine once opined: "Follow the process until They choke on it." In this case this'd probably mean calling the helpdesk each time you needed to have a modification to your local IIS config or you'd needed one of the powertools started.
From what I can tell, FDCC is only intended to be a recommended security baseline. I'd give some push back on the privileges that you require and see what they can come up with to accommodate your request. Instead of saying "I need to be a local administrator", I'd list the things that you need to be able to do and let them come up with a solution that works (which will likely be to let you administer your machine or a VM). You need to be able to run the debugger in Visual Studio, run a local web server (Cassini), install patches/updates to your dev tools on your schedule, ...
I recently moved to a "semi-managed" environment with SCCM that gets patches installed on a regular basis from a local update repository. I was doing this myself, but this is marginally more efficient for the enterprise and it makes the security office happy. I did get them to put me, and the other developers, in a special collection so that we could block breaking changes if needed (how could IE7 be a security update?). Not much broke except that now I need to update Windows Defender manually since I updated it more frequently than they do in the managed collection! It wasn't as extreme as your case, obviously, but I think that is, in part, due to the fact that I was able to present the case for things that I needed to do for my job that required more local control.
From the NIST FAQ on Securing WinXP.
Should I make changes to the baseline settings? Given the wide variation in operational and technical considerations for operating any major enterprise, it is appropriate that some local changes will need to be made to the baseline and the associated settings (with hundreds of settings, a myriad of applications, and the variety of business functions supported by Windows XP Systems, this should be expected). Of course, use caution and good judgment in making changes to the security settings. Always test the settings on a carefully selected test machine first and document the implemented settings.
This is quite common within financial institutions. I personally treat this as a game to see how much software I can run on my PC without any admin rights or sending requests to the support group.
So far I have done pretty well: I have only sent one software install request, which was for "Rational Software Architect" ('cos I needed the plugins from the "official" release). Apart from that I have Perl, PHP, Python and Apache all up and running. In addition I have a Jetty server, Maven, WinSCP, PuTTY, Vim and several other tools running quite happily on my desktop.
So it shouldn't really bother you that much, and, even though I am one of the worst offenders when it comes to installing unofficial software, I would recommend "no admin rights" to any shop remotely interested in securing their applications and networks.
One common practice is to give developers an "official" locked-down PC on which they can run the official applications and do their email, admin, etc., and a bare-bones development workstation to which they have admin rights.
Not having local administrative access to your workstation is a pain in the rear for sure. I had to deal with that while I was working for my university as a web developer in one of the academic departments. Every time I needed something installed such as Visual Studio or Dreamweaver I had to make a request to Computing Services.