How can I test my applications against the popular virus scanners?

I need to find out whether my apps are being flagged as viruses by the most popular anti-virus packages (not the best packages, but the biggest by user base). Some background:
I have an application written in Delphi. Ever since the Delphi virus was found, I've had problems with false positives on my applications, particularly my demonstration versions for some reason (they all share the same code). AVG has been good, and I can now whitelist my files easily, but then I got the latest DevExpress installer and it was false-positived too. Given that this is getting more widespread, it struck me that I need to find out whether my apps are being flagged by the most popular anti-virus packages, and I would like to know how others go about this. I don't want people downloading our demonstration versions, getting an AV warning, and deciding not to try them.
The only options I have so far are buying a load of AV packages and putting them in VMs, or using a service like VirusTotal. The latter seemed an ideal option but for the fact that they limit the test to files under 20 MB, and my files are bigger than this. There is no paid option to expand that limit either. (I thought this an odd limit, but Kaspersky's free checker is limited to 1 MB!)
How do you check your applications?

VirusScan.jotti.org and VirusTotal.com may help
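If the upload size limit is the blocker, note that VirusTotal can also be queried by file hash, which skips the upload entirely (it only returns a report if someone has already submitted that exact binary). A minimal C# sketch, assuming the current VirusTotal v3 REST API; YOUR_API_KEY is a placeholder:

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Threading.Tasks;

class VirusTotalHashLookup
{
    static async Task Main(string[] args)
    {
        // Hash the binary locally; nothing is uploaded, so no size limit applies.
        string hash;
        using (var sha256 = SHA256.Create())
        using (var stream = File.OpenRead(args[0]))
            hash = BitConverter.ToString(sha256.ComputeHash(stream))
                               .Replace("-", "").ToLowerInvariant();

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("x-apikey", "YOUR_API_KEY"); // placeholder
            HttpResponseMessage response = await client.GetAsync(
                "https://www.virustotal.com/api/v3/files/" + hash);

            // 404: no scanner has seen this exact binary yet.
            // 200: a JSON report with per-engine verdicts.
            Console.WriteLine(response.StatusCode == HttpStatusCode.NotFound
                ? "No report for this hash yet - the file has not been submitted."
                : await response.Content.ReadAsStringAsync());
        }
    }
}
```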

http://online.us.drweb.com/
I couldn't see any file size limit on it.

My thoughts on this are as follows:
I set up a computer (nothing special) with a lot of disk space. I'll call this the ScanPC. Every time I do a build, the script will copy the new files to the ScanPC into a build-specific directory. This will ensure that I have an archive of all builds that can be examined. Any of these builds may have been released to customers.
I then install VMware Server and set up a number of virtual PCs. In each, I set up the anti-virus software to scan the network share, but in read-only mode so that no scanner can accidentally modify or remove a false positive. Each VM can then be updated automatically by its vendor, and hopefully each will have an email option to tell me when it spots a virus, which I will then know is a false positive and can report to the vendor.
The benefit of this is that I have a complete build archive (something I need anyway), and that old versions out with customers that trigger the AV are identified as well as the most recent ones. I can add or remove AV products as appropriate, and I only need a single computer (performance is not important).
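As a rough illustration of the copy step in that plan, here is a minimal C# sketch; the \\SCANPC\Builds share and the version argument are hypothetical, and setting the read-only attribute is just one way to approximate the "no scanner can modify the archive" requirement:

```csharp
// Minimal sketch of the post-build archive step described above:
// copy the build output to a build-specific folder on the ScanPC share.
using System;
using System.IO;

class ArchiveBuild
{
    static void Main(string[] args)
    {
        string outputDir = args[0];                  // e.g. .\bin\Release
        string version   = args[1];                  // e.g. 1.2.0.345
        string target    = Path.Combine(@"\\SCANPC\Builds", version);

        Directory.CreateDirectory(target);
        foreach (string file in Directory.GetFiles(outputDir))
        {
            string dest = Path.Combine(target, Path.GetFileName(file));
            File.Copy(file, dest, overwrite: false); // never clobber an archived build

            // Mark the archived copy read-only so a scanner's "disinfect"
            // action cannot silently modify or delete it.
            File.SetAttributes(dest, File.GetAttributes(dest) | FileAttributes.ReadOnly);
        }
    }
}
```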


Creating a Windows installer using C# Winforms instead of Installer tool [closed]

I have used InstallAware and InstallShield before, and they are pretty difficult to work with, and when something goes wrong it is very difficult to find and resolve the issue.
My question is: why can't we use a Windows application written in C# to do this?
I understand that the .NET Framework may not be installed on the destination computer, which is why I wonder why no one has ever used this architecture:
I would create a simple installer using InstallShield (or any similar tool) that just installs the .NET Framework, and then extracts and runs my own Windows application, written in C#, in elevated mode. My application would run a wizard with Back and Next buttons, and I would take care of everything in it (copying files, creating and starting Windows services, adding registry values, creating firewall exceptions, etc.).
Has anyone ever done this, and is there anything that prevents people from doing this?
In essence: don't try to re-invent the wheel. Use an existing deployment tool and stay with your day job :-). There are many such tools available. See links below.
And below, prolonged, repetitive musing:
Redux: IMHO, and with all due respect, making your own installer software is reinventing the wheel for absolutely no gain whatsoever, I am afraid. I believe that, as you create your own installer software, you will "re-discover" the complexities of deployment found by others who have walked that path, and you will find that such software can be quick to make but very hard to perfect. In the process you will expend lots of effort trying to wrap things up - and "the last meter is very long" as you curse yourself dealing with trifles that take up your time at the expense of what would otherwise pay the bills. Sorting out the bugs in any toolkit, for whatever technical feature, can take years or even decades. And no, I am not making it up; it is what all deployment software vendors deal with.
Many Existing Tools: there are many existing tools that implement such deployment functionality already - tools which are not based on Windows Installer (Inno Setup, NSIS, DeployMaster and heaps of other lesser-known efforts):
There is a list of non-MSI installer software here.
There is another list of MSI-capable software here.
My 2 cents - if you do not like MSI, choose one of the free, non-MSI deployment tools. How to create windows installer.
Corporate Deployment: The really important point (for me) is that corporate deployment relies on standardized packaging formats - such as MSI - to allow reliable, remote management of your software's deployment. Making your own installer will not impress any system administrators or corporate deployment specialists (at least until you sort out years of bugs and deficiencies). They want a standardized format that they know how to handle (which does not imply that they are all that impressed with existing deployment technology). Doing your deployment with standardized deployment formats can get your software corporate approval. If you make a weird deployment format that does unusual things on install and can't be easily captured and deployed on a large scale, your software is head-first out of any large corporation. No mercy - for real. These are busy environments and you will face little understanding for your unusual solution.
"File-Pushers": Those of us who push files around for a living know that the field of deployment is riddled with silly problems that quickly kill your productiveness in other endeavors - the ones that make you stand out in your field - your day job. Deployment is a high profile, low status endeavor - and we are not complaining. It is just what it is: a necessity that is harder to deal with than you might think. Just spend your time more wisely is what I would conclude.
Complexity: Maybe skim the section "The Complexity of Deployment" here: Windows Installer and the creation of WiX. It is astonishing to deal with all the silly bugs that happen in deployment. It is not just a file copy, though it might be easy to think it is. And if it happens to be just a file copy, then there are existing tools that do the job - free ones too. See links above. And if you think deployment is only file copying in general, then please skim this list of tasks a deployment tool should be capable of supporting: What is the benefit and real purpose of program installation?
Will your home-grown package handle the following? (just some random thoughts)
A malware-infected terminal server PC in Korea with Unicode characters in the path?
Symbolic links and NTFS junction point paths?
A laptop which shuts itself off in the middle of your file copy because it is out of battery?
Out of disk space situations? What about disk errors? And copy timeouts?
What about reboot requirements, for in-use files or some other reason? How are they to be handled? What if the system is in a reboot-pending state and you need to detect it before kicking off your install? (See the sketch after this list.)
How will you reliably install, configure and start and stop services?
How will you support uninstall and cleanup for your application?
Security software which flags your unknown, unrecognized, non-standard package as a security threat and quarantines it? How would you begin to deal with this? Who do you contact to get into the good graces of a "recognized binary" for elevation?
Non-standard NTFS permissioning (ACLs) and NT privileges? How do you detect them and degrade gracefully when you get permission denied (for whatever reason)?
Deployment of necessary runtimes for your application to work? (This has been done by many others before.) Download of the latest runtimes if your embedded ones are out of date? Etc...
Provide a standardized way to extract files from your installation binary?
Provide help and support for your setup binaries for users who try to use them?
Etc... This was just a random list of whatever came to mind quickly. There are obviously many issues.
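To make just one of these concrete - the reboot-requirements item above - here is a hedged C# sketch of detecting a reboot-pending state. The registry locations shown are commonly checked signals, not an exhaustive list:

```csharp
// Sketch: detect a reboot-pending state before kicking off an install.
// Treat this as illustrative rather than exhaustive; there are other signals.
using System;
using Microsoft.Win32;

class RebootCheck
{
    static bool IsRebootPending()
    {
        // Component Based Servicing (Vista+): key existence means reboot pending.
        using (var cbs = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\Windows\CurrentVersion\Component Based Servicing\RebootPending"))
            if (cbs != null) return true;

        // Windows Update: key existence means reboot required.
        using (var wu = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\RebootRequired"))
            if (wu != null) return true;

        // In-use files queued for rename/delete at next boot.
        using (var sm = Registry.LocalMachine.OpenSubKey(
            @"SYSTEM\CurrentControlSet\Control\Session Manager"))
            if (sm?.GetValue("PendingFileRenameOperations") != null) return true;

        return false;
    }

    static void Main() =>
        Console.WriteLine(IsRebootPending() ? "Reboot pending" : "No reboot pending");
}
```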
This was a bit over the top for what you asked, but don't be fooled into thinking deployment is something you can sort out a solution for in a few hours. And definitely don't take the job promising to do so - if that is what you are being asked. Just my two cents.
The above issues, and many others, are what people discover they have to handle when creating deployment software - for all but the most trivial deployments. Don't waste your time - use some established tool.
Transaction: If you are working in a corporation and just need to get your files to your testers, you can deploy using batch files for that matter - if you would like to. But you have to support it, and I guarantee you it will take a lot of your time. What do you do when the batch file fails half-way through due to a network error, and your testers are testing files that are inconsistent? Future deployment technologies may be better for such light-weight tasks. Perhaps the biggest feature of a deployment tool is to report whether the deployment completed successfully or not, to log the errors, and to roll the machine back to a stable state if something failed. Windows Installer does a lot of this work for you.
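To make that half-way failure point concrete, here is a minimal sketch of a "poor man's transaction" for a file-copy deployment: stage first, commit last, roll back on failure. It is illustrative only; the comments note where it still falls short of what Windows Installer provides:

```csharp
// Sketch: stage everything first, "commit" with a rename at the end,
// and discard the staging area if anything fails part-way.
using System;
using System.IO;

class TransactionalCopy
{
    static void Deploy(string sourceDir, string targetDir)
    {
        string staging = targetDir + ".staging";
        try
        {
            // Phase 1: copy everything to a staging folder. A failure
            // here leaves the real target untouched.
            Directory.CreateDirectory(staging);
            foreach (string file in Directory.GetFiles(sourceDir))
                File.Copy(file, Path.Combine(staging, Path.GetFileName(file)), true);

            // Phase 2: "commit" - swap staging into place. Still not
            // atomic across failures or reboots, which is the hard part
            // that real deployment engines solve.
            if (Directory.Exists(targetDir))
                Directory.Delete(targetDir, true);
            Directory.Move(staging, targetDir);
        }
        catch
        {
            // Roll back: discard the partial staging copy so testers
            // never see an inconsistent file set.
            if (Directory.Exists(staging))
                Directory.Delete(staging, true);
            throw;
        }
    }

    static void Main(string[] args) => Deploy(args[0], args[1]);
}
```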
Distribution: A lot of people feel they can "just replicate my build folder to the user's computers". The complexities involved here are many. There is a network involved, and a network can never be assumed to be reliable; you need lots of error handling here. Then there is the issue of transactions: how do you know when the computer is in a stable state and replication should stop? How often do you replicate - only on demand? How do you deal with the few computers that failed to replicate? How do you tell the users? These are distribution issues. Corporations have huge tools such as SCCM to deal with all these error conditions. Trying to re-implement all these checks, logging and features will take a long time. In the end you will have re-created an existing distribution system - full circle. And how do you do inventory of your computers when there is no product registered as installed, since only a batch file or script ran? And if you start replicating a lot of packages, how many times do you scan each file to determine whether it is up to date? How much network traffic do you want to create? Where does it end? The answer: I guess transactions must be implemented, with full logging, error tracking and rollback. Then you are full circle back to a distribution system like I mentioned above, and a supported package format as well.
This "just replicate my build folder to my users" ideas somehow remind me of this list: https://en.wikipedia.org/wiki/Fallacies_of_distributed_computing. Not a 100% match, but the issues are reminiscent. When networking is involved, things start to become very unpredictable and you need logging, error control, transactions, rollback, network communication, etc... We have re-discovered large scale deployment - the beast that it is.
Network: and let's say you want to replicate your build folder to 10,000 desktop machines in your enterprise. How do you kick off the replication? Do you start all replications at once and take down the trading floor of the bank as file replication takes over the whole network like a DDoS attack? Sorry - it is getting out of hand - please pardon the lunacy - but it really is upsetting that this replication approach is seen as viable for large-scale deployment with current technology. Built-in Windows features could help, but still need to be tested properly. You need scheduling, queuing, caching, regional distribution shares, logging, reporting / inventory, and God knows what else that a packaging / deployment system gives you already. And re-implementing it will be a pain train of brand new bugs to deal with.
Maybe one day we will see automatic output-folder replication based on automatic package generation that really works, via an intelligent and transacted distribution system. Many corporate teams are trying, and by using existing tools they get closer, with standard package formats in use. I guess current cloud deployment systems are moving in this direction with online repositories and easy, interactive installation, but we still need to package our software intelligently. It will be interesting to see what the future holds and what new problems result for packaging and distribution in the age of the cloud.
As we pull files directly from online repositories on demand, will we see a bunch of new problems? Malware, spoofing and injection? (Already problematic, but it could get worse.) Remote files deleted without warning (to get rid of vulnerable releases that should no longer be used - leaving users stranded)? Certificate and signature problems? Firewall and proxy issues? Auto-magic updates with unfortunate bugs hitting everyone immediately and unexpectedly? And the fallacies of the network and other factors as linked to above. Beats me. We will see.
OK, it became a rant as usual - and that last paragraph is heading overboard with speculation (and some of the issues already apply to current deployment). Sorry about that. But do try to get management approval to use an existing packaging & deployment solution - that is my only advice.
Links:
Stefan Kruger's Installsite.org twitter feed: https://twitter.com/installsite
Choosing a deployment tool:
How to create windows installer
What installation product to use? InstallShield, WiX, Wise, Advanced Installer, etc
Windows Installer and the creation of WiX
WiX quick start tips
More on dark.exe (a bit down the page)

Real Time Version Control Software

There's no shortage of traditional version control software, but I'm looking for something that doesn't require me to constantly commit. In other words, I am searching for background software that automatically keeps a history of all files in a directory. If possible, I would also like to be able to add commit notes myself for benchmarks later on.
I have no preference on whether the solution is a web service or a local service that I host myself, as long as it is free or has a reasonable one-time fee (no subscriptions, please). Performance and hard drive usage are not issues.
I hope that I'm not being too specific with my request. I searched the web for solutions, but I could not find any software that does what I want. For compatibility, I have Windows 7 64-bit and an AMD processor.
You should consider Dropbox. It's not strictly version control software, but it runs in the background and syncs your files to the cloud.
They give you 2GB of storage for free (more if you pay a subscription), and from their website, you can view the different versions of the files in your Dropbox folder.
I don't know of any software or services that do this.
I did come up with a quick idea, though.
It seems like a bit of a hack, and it is right off the top of my head (so issues may come up in implementing it).
Create an app/service that uses the FileSystemWatcher to detect changes/creations/deletions in the location(s) that you want to version.
When a detection is made, run a [bat/cmd/powershell/code/etc.] step that uses the command-line (or other) interface of [insert version control software here] to do a commit.
I think it's fairly straightforward and easy to implement, but that's the danger of "top of the head" ideas. A direction to look in for rolling your own, if nothing else (sketched below). :)
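Here is a rough sketch of that idea, using FileSystemWatcher with git as a stand-in for whatever version control CLI you pick; a real implementation would also need debouncing and exit-code checking:

```csharp
// Sketch: watch a directory and auto-commit every change.
// git is just the example backend; swap in any VCS with a CLI.
using System;
using System.Diagnostics;
using System.IO;

class AutoCommitWatcher
{
    static void Main(string[] args)
    {
        string repoPath = args[0];   // an existing git working copy

        var watcher = new FileSystemWatcher(repoPath)
        {
            IncludeSubdirectories = true,
            EnableRaisingEvents = true
        };

        FileSystemEventHandler onChange = (s, e) =>
        {
            // Note: real code should debounce; editors often fire
            // several events per save, and commits with nothing
            // staged will simply fail.
            RunGit(repoPath, "add -A");
            RunGit(repoPath, $"commit -m \"auto: {e.ChangeType} {e.Name}\"");
        };

        watcher.Changed += onChange;
        watcher.Created += onChange;
        watcher.Deleted += onChange;

        Console.WriteLine("Watching... press Enter to stop.");
        Console.ReadLine();
    }

    static void RunGit(string workDir, string arguments)
    {
        using (var p = Process.Start(new ProcessStartInfo("git", arguments)
        {
            WorkingDirectory = workDir,
            UseShellExecute = false
        }))
            p.WaitForExit();
    }
}
```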
I don't know of any stock solutions, but depending on your IDE/make system, you should be able to create a post-build event that commits the files to your "regular" version control system after a successful build (the version control system will need an external API or CLI for this to work). You can then add commit notes, etc., in the version control system at your leisure.

Speeding up WIX compiles

I have a WiX 3.0 installer that is building 88 slightly different builds (the cross product of 32-bit and 64-bit, 11 locales, and four editions: Beta, Retail, Evaluation, Different Evaluation).
Each build has slightly different contents in addition to localized UI, so I can't just build one configuration with multiple locales.
The resulting MSI is about 120 MB. I'm already using the CabCache.
The installer takes about 3-5 minutes per release to build, resulting in a pretty lengthy overall build time.
The installer build appears to be heavily disk-bound during linking (light.exe).
Clearly, making the disks faster could help. Does anybody have advice on how to set up a machine that could crank through these installers faster? (Or advice on reconfiguring my WiX project to build more efficiently?)
Get an SSD. Like one of those with internal RAID architecture from e.g. OCZ. SSD is every developer's upgrade of the decade. Plus more RAM if swapping is an issue.
If you have common parts (that are not localized), you can create a merge module with the common parts and then just add the differing stuff to each build.
I am not sure whether you have any say or communication with the developers of the application you are installing, but if you have to create that many MSIs mainly because of languages, have you considered offering one language MSI that delivers all the language-specific files to a resources directory, letting the user choose which language they would like to use (and only installing it if they need something other than the default language)? It might also be worth having the product built so that the user can pick the best language from within the application, with all languages installed from the start.
As for your question about speeding up the build, that is a tricky one. Using merge modules I would rule out right away, as I don't see any actual gain coming from that. Updating the hardware (as you said) will give some results, but it is hard to tell how big a jump you would be making and what kind of gain it would give. I think it might be best to go over your WXS with a fine-tooth comb and see what is really going on in there. You can sometimes find things left over from the development of the package, or from a previous tool, that are really slowing you down.
One example: my company recently switched to WiX from a more automated setup-creation utility (leaving the name out on purpose because I am listing its problems :P ), and it automatically created every folder under Windows that might possibly be needed by a running Windows application, as well as the Common Files folder, the current user profile, and many many more. I ended up erasing over 100 empty directories that this old technology was nice enough to add for me. That is just one example of an optimization. It is amazing what can be found when you take the time to REALLY review what is going on under the hood.
In your .wixproj file, add this just before the end of the file, inside a <PropertyGroup> tag:
<IncrementalGet>true</IncrementalGet>
This tells WiX to recompile only those files that have changed since the previous build.

How do I do whatever it was that the Windows Installer CleanUp Utility did?

Microsoft's "Windows Installer CleanUp Utility" could be used to help fix broken installations of MSI-installer based products. When the installer failed in some strange way and left corrupt data behind, so bad that even Add/Remove Programs couldn't help, you could often fix things by running this utility and then running the application's installer again.
I just discovered that Microsoft announced a couple weeks ago that they were discontinuing this utility. They didn't merely say "we're not supporting it anymore"; they seemingly removed it from their site entirely.
I have to support a Windows program for a whole bunch of users. Given the number of users, every so often something will go wrong, and this program has been invaluable for me, as a last-ditch line of defense.
I know I could point customers to some third party site that has a cached copy of it, but this seems dangerous (malware potential and such).
So, are there any replacement products? Or, if not, how can I myself do whatever it is that this program did?
To be clear, I'm not asking for help like "how do I programmatically modify the registry". I can do that fine. But I need to know what in the registry needs to be modified.
Thanks in advance.
The Windows Installer CleanUp utility was never intended to be used in the wild; it was only meant to be used by software developers. If you occasionally have end users needing to use WCU, you have some serious installer quality issues that should be addressed.
WCU only removes the Windows Installer metadata and doesn't actually uninstall any software, which leaves the machine in a very dirty state. These days, with test labs becoming virtualized, there's no reason to have this tool anymore: you just roll back to a prior snapshot and keep on working.
I've seen all kinds of online forums full of users who think they know what they are doing (and don't) suggesting WCU to solve various problems, so in the end Microsoft decided to try to get the horse back in the barn.
I have old copies of WCU archived in my CM system so if you'd like me to generate checksums to help you determine if you are getting a good copy just let me know.
The cleanup utility was a wrapper around the command-line utility msizap.exe, described here:
http://msdn.microsoft.com/en-us/library/aa370523%28VS.85%29.aspx#1
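For a sense of what that metadata is, here is a hedged C# sketch that enumerates the product registrations Windows Installer keeps, via the documented MsiEnumProducts/MsiGetProductInfo APIs in msi.dll. It is read-only and only helps you see what msizap would be zapping (msizap itself was typically invoked as something like msizap T! {ProductCode}):

```csharp
// Sketch: list the product registrations Windows Installer keeps
// (the metadata msizap.exe removes). Read-only and safe to run.
using System;
using System.Runtime.InteropServices;
using System.Text;

class InstallerMetadata
{
    [DllImport("msi.dll", CharSet = CharSet.Unicode)]
    static extern uint MsiEnumProducts(uint iProductIndex, StringBuilder lpProductBuf);

    [DllImport("msi.dll", CharSet = CharSet.Unicode)]
    static extern uint MsiGetProductInfo(string szProduct, string szProperty,
                                         StringBuilder lpValueBuf, ref uint pcchValueBuf);

    static void Main()
    {
        var productCode = new StringBuilder(39); // 38-char GUID + null terminator
        for (uint i = 0; MsiEnumProducts(i, productCode) == 0; i++)
        {
            uint len = 512;
            var name = new StringBuilder((int)len);
            MsiGetProductInfo(productCode.ToString(), "ProductName", name, ref len);
            Console.WriteLine("{0}  {1}", productCode, name);
        }
    }
}
```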

Tips to upgrade workstations for development team?

I have secured the budget to upgrade the individual workstations and laptops. While newer, bigger screens were welcomed with enthusiasm, the thought of re-installing tools and settings caused most of them to blanch, and I got one "Do I really have to?".
How much downtime do you usually have when you move to a new machine?
Do you employ tools or scripts to set up your dev environment, tools, DBs, debuggers, etc., specifically for a Windows environment?
Is there a standard image that you keep and then let devs move in and tweak the machine as necessary?
My company essentially virtualized in order to stop wasting so much time with upgrades/system failures.
Whenever a desktop/laptop failed, we'd have to spend the better part of a day fixing it and reloading the software.
So we went out, bought iMacs for everyone and loaded Parallels (a VMware-like product for OS X) on them. Then we made a standard dev image for everyone, and just copied it to everyone's machines.
Essentially, if anyone's configuration got messed, we just loaded in a fresh image and kept on truckin'. Saved a lot of time.
Some additional benefits:
When new software is out, we just make a new image and distribute it. No OS re-installs or anything like that.
If the hardware changes, it doesn't matter; just move the image.
You can run multiple OSes concurrently for testing.
You can take "snapshots" of your current image and revert if you really messed something up.
Multiple builds on the same machine, since you can run multiple OSes.
Surprisingly, the overhead of a virtualized system is quite low.
We only run the software on a real machine for performance tuning/testing purposes.
One day is generally enough for upgrades. I keep digital copies of VS.NET, which makes it much easier to install.
When it comes to other tools, it's generally better to just go to the websites and install the latest versions.
Also, it's a good idea to install tools whenever you need them instead of trying to install everything at the same time.
The last time I upgraded to a new machine, I think it took about 4 hours to get most of the necessary tools reinstalled. Over time, I've had to re-install quite a few more tools, but I think it's worth it.
If you can get a ghost/image of the default tool set (Visual Studio 2003-2008, Eclipse, NetBeans, or whatever you're using), and all the major service packs, that would help a lot with the initial setup.
I think the downtime is definitely worth it; a new, faster machine will make anyone more productive.
You can have zero downtime by keeping both machines available, though you will not be as productive during the transition.
This depends on the number of tools needed by the development team. Tools such as Rational Software Architect can take hours to install on their own. The exercise of having the developers list the applications they need before moving can help you optimize deployment strategies. Both machines should be available for a fixed period of time, allowing developers to work and kick off long-running installs at the same time.
Creating a standard image based on the list provided to you can improve efficiency. Having the relevant software on a share could also let them cherry-pick as needed and give the development team the feeling that they can go back as necessary.
Tools to assist in capturing user settings exist. I have only ever had experience with Doctor Mover. If you have 100 or more developers to move, it may be worth the cost. I can't complain too much, but it wasn't perfect.
I have never had a problem with just getting a list of all the software a particular user uses. In fact, I have never found the base install to be much of an issue. The part I tend to spend the most time on is re-configuring all of the user's custom settings (very common with developers, I find). This is where it is very valuable to have the old machine around for a while, so that the user can at a minimum remote-desktop to it and see how they had things set up.
Depending on how your team works, I would highly recommend having every user who receives a new computer get the latest source tree from your source control repository rather than copy entire directories over. I would also recommend doing that before actually sending the old workstation elsewhere or even disconnecting it.
One of the great things about tools like CVS and SVN is that it is quite easy for developers to end up with an unofficial "personal branch" from things that are not properly checked in, merged, etc.
While it will cost time to deal with the shift if things are not properly synchronized, it is an invaluable opportunity to catch those things before they come back to haunt you later.
