Create a different Windows account for compiling?

I am experimenting with different open source projects just to see which ones I can work with, since I am a beginner. Of course, many projects have different dependencies and programs that you must install. I want to keep things organized and I don't want to pollute my main Windows account, since I also use this machine for everyday computing.
Will creating a separate Windows account on my computer help separate the dependencies for the projects? Are there any better alternatives (other than using virtual machines)?
Thanks

Although you ruled them out, virtual machines really are the best option if you're planning on working with a lot of different projects and you don't want to pollute your environment too much. If you build them right, you can have a baseline VM that is simple to revert to whenever you want to start from scratch because the environment got too polluted.
The problem with only using a separate account is that many tools and libraries install machine-wide, so they're still made available to all users on the machine and the separation doesn't keep things cleaned up. For example, Visual Studio typically installs for all users on the machine. COM dependencies aren't user-specific. Some things install Windows services that need to be running most of the time but that you don't use unless you're developing against them (like SQL Server Reporting Services).
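To see how much of this is machine-wide rather than per-user, you can compare the two standard Uninstall registry views. A minimal sketch (my own illustration, Windows-only, using the well-known Uninstall key; the 32-bit WOW6432Node view is ignored for brevity):

    # Compare per-user (HKCU) and machine-wide (HKLM) installed-program entries.
    # Machine-wide entries are visible to every account, which is why a second
    # Windows account doesn't isolate most development tools.
    import winreg

    UNINSTALL = r"Software\Microsoft\Windows\CurrentVersion\Uninstall"

    def installed_names(hive):
        names = []
        try:
            key = winreg.OpenKey(hive, UNINSTALL)
        except OSError:
            return names
        for i in range(winreg.QueryInfoKey(key)[0]):
            sub = winreg.EnumKey(key, i)
            try:
                with winreg.OpenKey(key, sub) as item:
                    names.append(winreg.QueryValueEx(item, "DisplayName")[0])
            except OSError:
                pass  # entry without a DisplayName
        winreg.CloseKey(key)
        return names

    per_user = installed_names(winreg.HKEY_CURRENT_USER)
    per_machine = installed_names(winreg.HKEY_LOCAL_MACHINE)
    print(f"Per-user installs: {len(per_user)}")
    print(f"Machine-wide installs (shared by every account): {len(per_machine)}")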

How about installing several operating systems (multi-boot)?
If you have enough money, you can buy a computer just for experiments.

Virtual machines are definitely the way to go - most non-trivial software modifies more than HKEY_CURRENT_USER state. If you don't want full-blown virtual machines (but oh, they're sweet, especially the ones supporting state snapshots!) you could look at something like Sandboxie.

Related

VB6 app - move server

I am supporting a vb6 application. I am trying to transfer the executable and DLL to a new server and I am prompted with component not registered errors. I have got round this by manually registering the components on the new server.
I have found two files with extensions of .000 and .001 that have registry commands in them (registering components). Can anyone explain how these files are generated? I have some experience creating installation files in VB.NET.
Repackaging and redeployment are not developer issues and really don't belong here. Such issues are more appropriate for someplace like ServerFault.
It is one thing to have lost all of the source code of an application, but even worse in some ways to have lost the deployment package. Sadly some shops fail to archive either of these.
However it was also common enough for shops to see RAD tools like VB6, Delphi, PowerBuilder, etc. as things to shove off on the worst of the worst of their developers. These poor slobs seldom got official Microsoft training that should have emphasized the importance of creating proper installers. For that matter even those courses tended to marginalize the topic. It doesn't help that the Web is full of "Mort teaching Mort" half-baked development even today, or that the pioneers who wrote many of the early serious VB programming books tended to be loose cannons and contrarians who didn't really believe deployment was a serious concern.
The end result is that lots of shops have machines with VB6 programs shoehorned onto them in a half-baked way. Often when deadlines loomed they let Old Mort install VB6 right onto the production server and let him hack away right there! So it's no wonder people get into trouble once a server needs to be replaced or its OS updated.
Those REG files with .000, .001, etc. extensions aren't anything normal that I'm aware of. For all I know they've fallen out of REGMON runs or some 3rd party packaging tool. Manual registry exports created using REGEDIT would normally have .REG extensions.
If you are actually "supporting" this application it implies that you have the source code, VB6 compiler, developer install packages for any 3rd party controls, and a writeup describing any special packaging and installation requirements (target machine DCOM/COM+ configuration, system requirements such as IIS or MSMQ or 3rd party DBMS Providers and Drivers, special folder requirements, software firewall rules, etc.).
From those it ought to be possible to compile a clean new copy of the EXE, DLLs, etc. and create a clean deployment package - even if some configuration still needs to be done manually before and after running the installer.
Without those you are a computer janitor and your question belongs over at ServerFault. It is no fun, I know. I've had to take part in such janitorial services myself all too often.
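As an aside, if the manual piece really is just COM registration, that step can at least be scripted. A rough sketch (my own illustration, nothing from the original deployment; the folder path is hypothetical, and regsvr32 needs to be run from an elevated prompt):

    # Register every DLL/OCX in a folder with regsvr32 (ships with Windows).
    import pathlib
    import subprocess

    def register_components(folder):
        for path in pathlib.Path(folder).glob("*"):
            if path.suffix.lower() not in (".dll", ".ocx"):
                continue
            # /s = silent; check the return code instead of a popup
            result = subprocess.run(["regsvr32", "/s", str(path)])
            status = "ok" if result.returncode == 0 else f"FAILED ({result.returncode})"
            print(f"{path.name}: {status}")

    if __name__ == "__main__":
        register_components(r"C:\Apps\LegacyVB6App")  # hypothetical target folder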

why is program installation a process?

This may very well be a stupid question, but when I was asked something much simpler than this, I didn't have much of an answer...
Why are most programs installed via some several-step process of adding and changing and whatnot? We have programs that can be run straight from a self-contained executable, but a large portion of programs cannot. Why is this? Is it because programs grow so much by needing to include everything within themselves? If that is the case, is it really so difficult to design an OS from the ground up to be completely modular, i.e. an OS with a standard set of modules that can be accessed and used by any 'standalone program' without requiring a lengthy install?
thanks
David Kirsch.
It's really a question of how complicated your program is. Many Windows programs have dependencies on the Visual C++, .NET, Java, etc. runtimes that are not delivered with the base OS. This means that for your program to work, those components must already be on the system.
If those prerequisites are missing, then your program won't even load, so you can't even get your program to check for them and tell the user to go and get them. This is where an installer comes in, as it generally doesn't have any prerequisites, but is able to sniff out the ones your program needs and can either tell you to go and get them, or try to install them for you.
Also, many programs need some logical, as well as physical, installation work to be done as part of deployment. This might mean registry changes (such as COM registration), changes to IIS (setting up a web application and virtual directory), or changes to the Service Control Manager (setting up Windows services).
In short, unless your program is extremely self-contained and has no external dependencies, an installer is the only way to get your application on the desktop.
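To make the prerequisite-check idea concrete, here is a minimal sketch (my own, Windows-only) of the kind of test an installer runs before the application itself can even load. It checks the standard .NET Framework 4.x registry key; the "Release" value of 378389 corresponds to .NET Framework 4.5:

    import winreg

    NDP_KEY = r"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"

    def dotnet4_installed():
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, NDP_KEY) as key:
                release, _ = winreg.QueryValueEx(key, "Release")
                return release >= 378389  # .NET Framework 4.5 or later
        except OSError:
            return False

    if not dotnet4_installed():
        print("Prerequisite missing: install the .NET Framework before running the app.")
    else:
        print("Prerequisites satisfied; continue with file copy, COM registration, etc.")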

Winlibre - An Aptitude-Synaptic for Windows. Would that be useful?

Last year, in the 2009 GSoC, I participated with an organization called Winlibre. The basic idea is to have a project similar to Aptitude (or apt-get) with a GUI like Synaptic, but for Windows, and initially holding only open source software. The project went OK; we finished what we considered a good starting point, but unfortunately, due to the developers' other occupations, the project has been idle almost since GSoC finished. Now I have some energy, time and interest to try to continue this development. The project was divided into 3 parts: a repository server (which I worked on, and which was going to store and serve packages and files), a package creator for developers, and the main app, which is the apt-get equivalent and its GUI.
I have been thinking about the project, and the first question that came to my mind is: is this project actually useful for developers and Windows users? Keep in mind that the idea is to solve dependency problems and install packages "cleanly". I'm not a Windows developer and just a casual user, so I really don't have a lot of experience with how things are handled there, but as far as I have seen, all installers handle those dependencies. Will Windows developers be willing to switch from installers to a package-based way of handling installations of open source software? Or is it OK just to create packages for already existing installers?
The packages concept is basically the same as .deb or .rpm files.
I still have some other questions, but basically I would like to make sure that it's useful in some way to users and Windows developers, and that developers would find this project interesting. If you have any questions, feedback, suggestions or criticisms, please don't hesitate to post them.
Thanks!!
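To make the dependency-resolution idea concrete: the client side of such a tool essentially has to compute an install order from a repository manifest. A minimal sketch (my own illustration, not Winlibre's actual code; the package names and manifest are hypothetical, and graphlib requires Python 3.9+):

    from graphlib import TopologicalSorter

    # Hypothetical repository manifest: package -> direct dependencies
    manifest = {
        "doxygen": ["graphviz"],
        "graphviz": [],
        "miktex": [],
        "some-editor": ["miktex", "graphviz"],
    }

    def install_order(wanted, manifest):
        ts = TopologicalSorter()
        seen = set()
        stack = list(wanted)
        while stack:                      # pull in transitive dependencies
            pkg = stack.pop()
            if pkg in seen:
                continue
            seen.add(pkg)
            deps = manifest.get(pkg, [])
            ts.add(pkg, *deps)
            stack.extend(deps)
        return list(ts.static_order())    # dependencies come first

    print(install_order(["doxygen", "some-editor"], manifest))
    # e.g. ['graphviz', 'miktex', 'doxygen', 'some-editor']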
Be sure to research previous efforts on this. Google turns up several similar/relevant efforts.
http://en.wikipedia.org/wiki/Package_management_system#Microsoft_Windows
http://windows-get.sourceforge.net
http://pina.plasmite.com
IIRC there was an RPM for Windows at some point.
Also, I think there was some guy (who used to work at MS) in the news recently who is basically starting up a very similar project. I can't find a link to this now.
But anyway, yeah, it would be awesome if there was such a standard tool and repository.
I can only speak for myself, but obviously I could definitely make use of such a tool as I found your post through googling! ;)
My two use cases for this tool would be the following:
1. I generally avoid re-installing my system as long as possible (in fact I only manage to do it when switching to a reasonable (not each and every) new version of Windows every few years, or when setting up new computers). But I still want my software to be up-to-date. I neither want to go to all the web pages and check manually whether there are compatibility issues with the new version of Doxygen, Graphviz and the latest version of MiKTeX, for example, nor do I want to navigate to the download pages and run the setups all by myself. I just want to schedule ONE SINGLE (!) tool, which checks whether there are new updates or not and updates those applications which are not in conflict with any other application's version.
2. If it unavoidably happens that I have to re-install my system, I don't want to fetch the new setups (and check compatibility) either. I don't even want to wait for one setup to finish in order to start the next one; I just want to check the tools I need, or even better, simply load my "WinApt XML" batch installation file, which gets the installers and handles the setups sequentially all by itself.
I don't know enough about the architecture of .deb or .rpm, but IMHO the most reasonable approach would be to maintain a DB with only the names, versions, dependencies and the download locations of the different versions. I mean, most of the tools available for Windows provide .msi packages anyway, which (I guess) contain the application itself plus some custom installation properties (I'm really not sure how scripting is handled, but I know that creating an MSI in Visual Studio gives very limited abilities to create custom installation steps, and I can only imagine this is due to limitations of the MSI format).
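To illustrate that DB idea, here is a minimal sketch (my own, not the tool's design) of an update check against such a manifest. The names, versions and URLs are hypothetical, and the third-party packaging library is assumed for version comparison:

    from packaging.version import Version  # pip install packaging

    repository = {
        "doxygen":  {"version": "1.8.2",  "url": "https://example.org/doxygen-1.8.2.msi"},
        "graphviz": {"version": "2.28.0", "url": "https://example.org/graphviz-2.28.0.msi"},
    }

    installed = {"doxygen": "1.7.4", "graphviz": "2.28.0"}  # what's on this machine

    def pending_updates(installed, repository):
        for name, info in repository.items():
            local = installed.get(name)
            if local is None or Version(local) < Version(info["version"]):
                yield name, local, info["version"], info["url"]

    for name, old, new, url in pending_updates(installed, repository):
        print(f"{name}: {old or 'not installed'} -> {new}  ({url})")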
I guess a GUI will be mandatory for Windows users ;) but I personally would prefer the additional ability to handle the setups with the console.
Well, I like the idea and would love to hear about that (or such a) tool in the future.
Cheers
Check out NSIS. It's an open source installer system for Windows (it builds its own executable installers rather than MSI packages). You might be able to use it as part of your package creation software.
http://nsis.sourceforge.net/Main_Page
For the ALT.NET tool/lib stack there have been some efforts in this direction: Horn Get.
However, the usability in a real-world project has been the subject of this SO question.

Tips to upgrade workstations for development team?

I have secured the budget to upgrade the individual workstations and laptops. While newer, bigger screens were welcomed with enthusiasm, the thought of re-installing tools and settings caused most of the team to blanch, and I got one "Do I really have to?".
How much downtime do you usually have when you move to a new machine?
Do you employ tools or scripts to set up your dev environment, tools, DBs, debuggers, etc., specifically for a Windows environment?
Is there a standard image that you keep and then let devs move in and tweak the machine as necessary?
My company essentially virtualized in order to stop wasting so much time with upgrades/system failures.
Whenever a desktop/laptop failed, we'd have to spend the better part of a day fixing it and reloading the software.
So we went out, bought iMacs for everyone and loaded Parallels (a VMware-like product for OS X) on them. Then we made a standard dev image for everyone, and just copied it to everyone's machines.
Essentially, if anyone's configuration got messed up, we just loaded in a fresh image and kept on truckin'. Saved a lot of time.
Some additional benefits:
When new software is out, we just make a new image and distribute it. No OS re-installs or anything like that.
If hardware changes, doesn't matter, just move the image.
You can run multiple OSes concurrently for testing.
You can take "snapshots" in your current image and revert if you really messed something up.
Multiple builds on the same machine, since you can run multiple OSes.
Surprisingly the overhead of a virtualized system is quite low.
We only run the software on a real machine for performance tuning/testing purposes.
One day is generally enough for upgrades. I do keep digital copies of VS.NET, which makes it much easier to install.
When it comes to other tools, it's generally better to just go to the websites and install the latest versions.
Also, it's a good idea to install tools whenever you need them instead of trying to install everything at the same time.
The last time I upgraded to a new machine, I think it took about 4 hours to get most of the necessary tools reinstalled. Over time, I've had to re-install quite a few more tools, but I think it's worth it.
If you can get a ghost/image of the default tool set (Visual Studio 2003-2008, Eclipse, NetBeans, or whatever you're using), and all the major service packs, that would help a lot with the initial setup.
I think the downtime is definitely worth it, a new, faster machine will make anyone more productive.
You can have zero downtime by having both machines available, but you will not have as much productivity.
This depends on the number of tools needed by the development team. Tools such as Rational Software Architect can take hours to install on their own. Having the developers list the applications they need before moving can help you optimize strategies to deploy effectively. Both machines should be available for a fixed period of time, so that developers can both work and kick off long-running installs at the same time.
Creating a standard image based on the list provided to you can improve efficiency. Having the relevant software on a share could also let them cherry-pick as needed and give the development team the feeling that they can go back as necessary.
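If the relevant software does live on a share, the cherry-picking can also be scripted: run each installer silently, one after another. A minimal sketch (my own illustration; the share paths and packages are hypothetical, though msiexec /i ... /qn is the standard silent MSI invocation and /quiet is a common vendor silent switch):

    import subprocess

    installers = [  # hypothetical share holding the team's standard tool set
        ["msiexec", "/i", r"\\fileserver\devtools\7zip.msi", "/qn"],
        [r"\\fileserver\devtools\python-3.12.0-amd64.exe", "/quiet"],
    ]

    for cmd in installers:
        print("Running:", " ".join(cmd))
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print("  failed with exit code", result.returncode)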
Tools to assist in capturing user settings exist. I have only ever had experience with Doctor Mover. If you have 100 or more developers to move it may be worth the cost. I can't complain too much, but it wasn't perfect.
I have never had a problem with just getting a list of all the software a particular user uses. In fact, I have never found the base install to be much of an issue. The part I tend to spend the most time on is re-configuring all of the users' custom settings (very common with developers, I find). This is where it is very valuable to have the old machine around for a while, so that the user can at a minimum remote-desktop to it and see how they have things set up.
Depending on how your team works, I would highly recommend having every user receiving a new computer get the latest source tree from your source control repository rather than copying entire directories over. And I would also recommend doing that before actually sending the old workstation elsewhere or even disconnecting it.
One of the great things about tools like CVS and SVN is that it is quite easy for developers to end up with an unofficial "personal branch" from things that are not properly checked in, merged, etc.
While it will cost time to deal with the shift if things are not properly synchronized, it is an invaluable opportunity to catch those things before they come back to haunt you later.
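One way to catch that before the old machine goes away is to sweep its working copies for anything not checked in. A minimal sketch (my own, assuming Subversion with svn on the PATH; the paths are hypothetical):

    import subprocess

    working_copies = [r"C:\work\project-a", r"C:\work\project-b"]  # hypothetical

    for wc in working_copies:
        # `svn status` prints nothing for a clean working copy
        result = subprocess.run(["svn", "status", wc],
                                capture_output=True, text=True)
        dirty = [line for line in result.stdout.splitlines() if line.strip()]
        if dirty:
            print(f"{wc}: {len(dirty)} item(s) not committed:")
            for line in dirty:
                print("   ", line)
        else:
            print(f"{wc}: clean")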

How to comply with the new Federal Desktop Core Configuration (FDCC), which will remove local administrator access for all users? [closed]

As developers, we believe that not having local administrative access is going to severely handicap our productivity. We will be restricted from running IIS (we’re a web development shop), installing applications, running Microsoft power tools, etc. If you’re going through the FDCC process now, it would be great to hear how you are coping with these changes.
Having actively worked as a contract developer at a base that uses the AF Standard Desktop, I can tell you a few things.
1: And most important: don't fight it, and don't do what another answer suggested and "let them choke on it". That is absolutely the wrong attitude. The military/government is fighting a lack of funding, overstretched resources and a blossoming technology footprint that it doesn't understand. The steps they are taking may not be perfect, but they are under attack and we need to be helping, not hindering.
OK, that off my chest.
2: You need to look at creating (and I know this is hard with funding the way it is) a local development lab. Every base that I have worked at has an isolated network segment that you can get on, which has external access and is isolated from the main government network. You basically have your work PC for e-mail, reports, etc. on the protected network, but you develop in your small lab. I've had a lab be 2 PCs tucked under my desk that were going to be returned during a tech refresh. In other words, be creative about making yourself a development machine + servers that are NOT restricted. Those machines are just not allowed to be connected to the main LAN segment.
3: Get the distributions of the desktop configurations. Part of your testing needs to be deploying/running on these configurations. Again, these configurations are not meant for development boxes. They are meant to be the machines the people use for day to day gov work.
4: If you are working on web solutions, be very aware of the restrictions on adding trusted sites, ActiveX components, certs, certain types of script execution that the configuration won't allow. Especially if you are trying to embed widgets/portlets/utils that require communications outside the deployed application domain.
5: Above all remember that very few of the people you work for understand the technology they are asking you to implement. They know they want function X but they want you to follow draconian security rule Y while achieving it. What that usually means is that the "grab some open source lib or plugin and go" is not an option. But, that is exactly what your managers think you are going to do because of the buzz around rapid development.
In summary, it's a mess out there. Try to help solve the problem.
While I've never been through the FDCC process, I once worked for a U.S. defense contractor whose policy was that no one had local administrative access to their machines. In addition, flash drives and CD-ROMs were disabled (if you wanted to listen to music on CDs, you had to have a personal CD player with headphones).
If you needed software installed you had to put in a work order. Someone would show up at your desk with the install media, log in to a local admin account, and let you install the software (the reasoning being that you knew what to install better than they did). Surprisingly, the turnaround was pretty quick, usually around half an hour.
While an inconvenience, this policy didn't really cripple us. We were doing a combination of Java, C++ (MS Visual C++ and GNU/C++), VB 6.0 and some web development. For what little web development we did, we had a remote dev box we would RDP into for testing. Again, a bit of an inconvenience, but it didn't stop us from getting our jobs done.
Without ever having had the problem, today I'd probably try a virtualising solution to run these tools.
Or, as a friend of mine once opined: "Follow the process until They choke on it." In this case that would probably mean calling the helpdesk each time you needed a modification to your local IIS config or needed one of the power tools started.
From what I can tell, FDCC is only intended to be a recommended security baseline. I'd give some push-back on the privileges that you require and see what they can come up with to accommodate your request. Instead of saying "I need to be a local administrator", I'd list the things that you need to be able to do and let them come up with a solution that works (which will likely be to let you administer your machine or a VM). You need to be able to run the debugger in Visual Studio, run a local web server (Cassini), install patches/updates to your dev tools on your own schedule, ...
I recently moved to a "semi-managed" environment with SCCM that gets patches installed on a regular basis from a local update repository. I was doing this myself, but this is marginally more efficient for the enterprise and it makes the security office happy. I did get them to put me, and the other developers, in a special collection so that we could block breaking changes if needed (how could IE7 be a security update?). Not much broke except that now I need to update Windows Defender manually since I updated it more frequently than they do in the managed collection! It wasn't as extreme as your case, obviously, but I think that is, in part, due to the fact that I was able to present the case for things that I needed to do for my job that required more local control.
From the NIST FAQ on Securing WinXP.
Should I make changes to the baseline settings? Given the wide variation in operational and technical considerations for operating any major enterprise, it is appropriate that some local changes will need to be made to the baseline and the associated settings (with hundreds of settings, a myriad of applications, and the variety of business functions supported by Windows XP systems, this should be expected). Of course, use caution and good judgment in making changes to the security settings. Always test the settings on a carefully selected test machine first and document the implemented settings.
This is quite common within financial institutions. I personally treat this as a game to see how much software I can run on my PC without any admin rights or sending requests to the support group.
So far I have done pretty well: I have only sent one software install request, which was for "Rational Software Architect" ('cos I need the plugins from the "official" release). Apart from that I have Perl, PHP, Python and Apache all up and running. In addition I have a Jetty server, Maven, WinSCP, PuTTY, Vim and several other tools running quite happily on my desktop.
So it shouldn't really bother you that much, and even though I am one of the worst offenders when it comes to installing unofficial software, I would recommend "no admin rights" to any shop remotely interested in securing its applications and networks.
One common practice is to give developers an "official" locked-down PC on which they can run the official applications and do their e-mail, admin, etc., and a bare-bones development workstation to which they have admin rights.
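For what it's worth, the "no admin rights" pattern above usually boils down to unpacking portable tools under the user profile and extending the per-user PATH, none of which needs elevation. A minimal sketch (my own illustration; the zip name and paths are hypothetical):

    import os
    import winreg
    import zipfile

    tool_zip = r"C:\Users\me\Downloads\some-portable-tool.zip"   # hypothetical
    dest = os.path.join(os.environ["LOCALAPPDATA"], "tools", "some-portable-tool")

    os.makedirs(dest, exist_ok=True)
    with zipfile.ZipFile(tool_zip) as zf:
        zf.extractall(dest)

    # Append to the per-user PATH (HKCU\Environment); takes effect for new
    # processes after a settings-change broadcast or the next login.
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, "Environment", 0,
                        winreg.KEY_READ | winreg.KEY_WRITE) as key:
        try:
            current, kind = winreg.QueryValueEx(key, "Path")
        except OSError:
            current, kind = "", winreg.REG_EXPAND_SZ
        if dest not in current.split(";"):
            new_value = (current + ";" + dest).strip(";")
            winreg.SetValueEx(key, "Path", 0, kind, new_value)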
Not having local administrative access to your workstation is a pain in the rear for sure. I had to deal with that while I was working for my university as a web developer in one of the academic departments. Every time I needed something installed such as Visual Studio or Dreamweaver I had to make a request to Computing Services.
