As IT guys, we all help that friend or family member in need of our services. And we all occasionally have to install some program that we'll never use again. So you install it, do your thing, and remove it afterwards. But with installing and removing a lot of software, things get left behind: registry keys, profiles, logs, sometimes even a service. I don't like that mess, and it slows down my computer. So every 4-6 months I have to do a clean install (I have some images to speed it up, but it's still annoying).
I'm looking for a way to "freeze" my system and start tracking all changes made, so I can remove them afterwards. Ideally I could store the changes as a file.
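The crude, manual version of what I'm after would be a before/after snapshot diff, something like this in PowerShell (paths are made up, and a real tool would obviously track far more than one hive and one folder):

    # Snapshot a registry hive and a directory tree before installing anything.
    New-Item -ItemType Directory -Force C:\snap | Out-Null
    reg export HKLM\SOFTWARE C:\snap\before.reg /y
    Get-ChildItem -Recurse 'C:\Program Files' |
        Select-Object -ExpandProperty FullName |
        Set-Content C:\snap\before-files.txt

    # ... install, use, and uninstall the program ...

    # Snapshot again and diff to see what got left behind.
    reg export HKLM\SOFTWARE C:\snap\after.reg /y
    Get-ChildItem -Recurse 'C:\Program Files' |
        Select-Object -ExpandProperty FullName |
        Set-Content C:\snap\after-files.txt
    Compare-Object (Get-Content C:\snap\before.reg) (Get-Content C:\snap\after.reg)
    Compare-Object (Get-Content C:\snap\before-files.txt) (Get-Content C:\snap\after-files.txt)

But doing that by hand for every hive and folder is exactly the chore I want a tool for.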
For example: I'm helping a friend out whose phone died, a Galaxy sWhatever. I have to install the drivers, KIES (I believe it was), and some tools to get root access. After it's done, I remove everything again. Two weeks later, he did it again. Reinstall drivers, Kies, tools ... NOT convenient.
So I'm looking for a way to capture/virtualize/save/monitor programs and installations. Perhaps get them to run modularly. (Samsung problem? Load my saved Android SDK session, load my Samsung drivers & tools session.)
It would be perfect if it worked with bigger stuff as well, like Visual Studio. It has several services (like SQL Server) and resources, and I only use it once every other month or so.
My search has led me to things like:
VMs (my current solution), but they mean large files, awkward file and device sharing, and updates (every .. bloody .. time!).
Sandboxing (not what I'm looking for).
Stuff like Docker, which is close to what I'm looking for, but I need it at a personal level, not inside a Linux VM.
Any ideas? Anyone facing similar challenges? How do you deal with this?
Thanks!
So recently, my friend wanted to debloat my laptop, then dug around in the registry with a program to make my computer faster. Long story short, nothing changed, but I had to fiddle around with Python for 3 days to get it to work again, the Microsoft Store straight up doesn't exist anymore (intentional, which in hindsight was stupid), and my iCloud Drive does/doesn't work.
I can't see the checkmarks in File Explorer, and the files sorta upload? I'm certain it's something to do with the registry, but I have no idea how to find it and get it back. I know the best option is just reinstalling Windows, but I'm afraid to do that, as there's quite a lot of data and settings on this laptop and it would eat up a whole day of backing up and fiddling.
I am lucky and thankful to be home for the holidays, and I wish everyone who reads this the best! I have an annual habit of doing Windows clean installs on many of my family members' PCs along with my own.
I use DISM in cmd/PowerShell on Windows to create custom images for certain PCs: adding drivers, removing preinstalled Windows apps, updating preinstalled programs, etc. I made a small PowerShell script that helps with the process, as it is very tedious. (I normally do this while watching TV or something else.)
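For anyone curious, the core of it is nothing fancy; a trimmed-down sketch using the built-in Dism PowerShell module looks roughly like this (paths and the package name are placeholders):

    # Mount the image so it can be serviced offline.
    New-Item -ItemType Directory -Force C:\mount | Out-Null
    Mount-WindowsImage -ImagePath C:\images\install.wim -Index 1 -Path C:\mount

    # Strip a preinstalled app from the image.
    Get-AppxProvisionedPackage -Path C:\mount |
        Where-Object DisplayName -like '*BingWeather*' |
        ForEach-Object { Remove-AppxProvisionedPackage -Path C:\mount -PackageName $_.PackageName }

    # Inject a folder of drivers.
    Add-WindowsDriver -Path C:\mount -Driver C:\drivers -Recurse

    # Commit the changes back into the .wim.
    Dismount-WindowsImage -Path C:\mount -Save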
That got me thinking. Google created the Android Flash Tool, which sends commands to Android devices directly from a website. It can even download new Android images/builds and flash them to the device. I also stumbled upon Simon Chan's WebADB.
Those two examples are pretty cool; massive kudos to the developers of both. I was just hoping for some rough ideas. Is running, say, dism.exe possible from the web? Like taking a cloud file (like Google's Android images) and running DISM to make some user-selected customizations?
This would be something like a web-based Rufus: formatting a user-selected USB stick and putting files on it. (This should be possible?) The next step, though, would require "talking to Windows" and accessing dism.exe directly on the local machine: mounting an ESD/WIM file that was just put on the USB stick, making changes to it with DISM, then unmounting and committing the changes back to the stick. Would this be possible?
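To make that concrete, the purely local version of those steps is straightforward PowerShell (the drive letter and paths are invented); the open question is whether a website could drive anything like it:

    # Prepare the user-selected USB stick and copy the image onto it.
    Format-Volume -DriveLetter E -FileSystem NTFS
    Copy-Item C:\images\install.wim E:\install.wim

    # Mount the image from the stick, customize, and commit the changes.
    New-Item -ItemType Directory -Force C:\mount | Out-Null
    Mount-WindowsImage -ImagePath E:\install.wim -Index 1 -Path C:\mount
    # ... user-selected customizations (drivers, apps, packages) go here ...
    Dismount-WindowsImage -Path C:\mount -Save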
This is just a very early-stage idea and would honestly probably be more hassle than it is worth. But I could totally work on it in my spare time just to learn. Frankly, before asking the above questions, I should have asked:
Can a website talk directly to "Windows"?
Can a website, say, tell Windows to unzip a file locally, or zip up a bunch of files?
Can it create folders, or do simple tasks such as writing files directly to a directory (without Chrome/File Explorer holding its hand)?
I have built websites before; I have used npm/Node and Angular, and I'm familiar with Google Firebase/GCP. However, this seems more complicated and out of my knowledge base. Hilariously, I am a computing security/networking engineer, and I can't even begin to fathom the sheer amount of security issues that would be possible with something like this. The site basically needs access to run cmd/terminals on the client machine. The thought of that gives me nightmares.
As computing, and the web in particular, continues to evolve with the advent of new APIs, PWAs, etc., it is interesting what one can do with a "simple" website. If what I am describing is not possible now, I hope that someday it can be, in a fully secure way.
Thank you to whoever reads this and responds! I am looking for a "yes/no, you're crazy" and hopefully a rough description of how/what. However, I am open to anything! Thank you again.
Mostly I am just sad, I guess. Yesterday I finished an iPhone app in Xcode 10.2.1, loaded it onto my phone (it works nearly perfectly), and shut down Xcode. The app is on my phone and working, but when I opened Xcode again, the code (view controller, AppDelegate, and storyboards) has no data. To be clear, the folders and files are still there, but the code/data is not. I did not have time to back up the finished version. Is it possible to retrieve these from my phone? Or is there some other place to look to find it? Or am I stuck rewriting it (there are some iterations, so it is not starting over completely, but it still sucks)?
thanks
Is it possible to retrieve these from my phone?
No, your phone contains only the compiled object code; it won't have any source code.
Or is there some other place to look to find it?
It's hard to imagine how the code could have simply disappeared, so one would think it's probably there somewhere. I wouldn't think that you could compile an app without saving the code, and if you saved your work then it certainly shouldn't just be gone. If you can remember even just a part of a phrase from the missing file(s), you can search your machine for files containing it. Use Spotlight or even just grep for that.
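For example, from Terminal (the phrase is obviously a placeholder):

    # Spotlight from the command line, then a plain grep fallback.
    mdfind -onlyin ~ "some phrase from the missing file"
    grep -rl "some phrase from the missing file" ~/Documents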
If you're unable to recover the file(s), then rewrite it as soon as you can while it's still fresh in your mind. And use the experience as a lesson. In the future you should do both of the following:
back up regularly: Use a backup system that works automatically. Apple's Time Machine works very well for this... all you need to do is plug in your backup disk and let the machine do its thing.
use revision control: There are a lot of options here, of course, but git is free and private GitHub accounts are also free, so you can save your work remotely. If you don't know how to use revision control, learn -- it's an essential development skill.
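The bare minimum to get an existing project folder under git and pushed somewhere safe is only a handful of commands (the remote URL is a placeholder):

    cd MyApp
    git init
    git add .
    git commit -m "Initial commit"
    git remote add origin https://github.com/you/myapp.git
    git push -u origin master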
Microsoft's "Windows Installer CleanUp Utility" could be used to help fix broken installations of MSI-installer based products. When the installer failed in some strange way and left corrupt data behind, so bad that even Add/Remove Programs couldn't help, you could often fix things by running this utility and then running the application's installer again.
I just discovered that Microsoft announced a couple weeks ago that they were discontinuing this utility. They didn't merely say "we're not supporting it anymore"; they seemingly removed it from their site entirely.
I have to support a Windows program for a whole bunch of users. Given the number of users, every so often something will go wrong, and this program has been invaluable for me, as a last-ditch line of defense.
I know I could point customers to some third party site that has a cached copy of it, but this seems dangerous (malware potential and such).
So, are there any replacement products? Or, if not, how can I myself do whatever it is that this program did?
To be clear, I'm not asking for help like "how do I programmatically modify the registry". I can do that fine. But I need to know what in the registry needs to be modified.
Thanks in advance.
The Windows Installer CleanUp utility was never intended to be used in the wild; it was only meant to be used by software developers. If you occasionally have end users needing to use WCU, you have some serious installer quality issues that should be addressed.
WCU only removes the Windows Installer metadata and doesn't actually uninstall any software, which leaves the machine in a very dirty state. These days, with test labs becoming virtualized, there's no reason to have this tool anymore: you just roll back to a prior snapshot and keep on working.
I've seen all kinds of online forums full of users who think they know what they are doing (and don't) suggest using WCU to solve various problems, so in the end Microsoft decided to try to get the horse back in the barn.
I have old copies of WCU archived in my CM system, so if you'd like me to generate checksums to help you determine whether you are getting a good copy, just let me know.
The cleanup utility was a wrapper around the command-line utility msizap.exe, described here:
http://msdn.microsoft.com/en-us/library/aa370523%28VS.85%29.aspx#1
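If memory serves, the core invocation was along these lines (the GUID is a placeholder for the product code of the broken install):

    # T = remove all Windows Installer data for the product, ! = don't prompt.
    msizap.exe T! {0A1B2C3D-0000-0000-0000-000000000000}

The metadata it zaps lives mainly under HKEY_CLASSES_ROOT\Installer\Products and HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Installer\UserData, which is where you'd have to dig if you were to reimplement what it did by hand.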
I have secured the budget to upgrade the individual workstations and laptops. While newer, bigger screens were welcomed with enthusiasm, the thought of re-installing tools and settings caused most of them to blanch, and I got one "Do I really have to?".
How much downtime do you usually have when you move to a new machine?
Do you employ tools or scripts to set up your dev environment, tools, DBs, debuggers, etc., specifically for a Windows environment (see the sketch below for the kind of thing I mean)?
Is there a standard image that you keep and then let devs move in and tweak the machine as necessary?
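To make the question concrete, by "script" I mean something along these lines (the share path and MSI names are invented):

    # Silently install the team's standard tools from a network share.
    $tools = '\\fileserver\devtools\git.msi',
             '\\fileserver\devtools\tortoisesvn.msi'
    foreach ($msi in $tools) {
        Start-Process msiexec.exe -ArgumentList "/i `"$msi`" /qn" -Wait
    }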
My company essentially virtualized in order to stop wasting so much time with upgrades/system failures.
Whenever a desktop/laptop failed, we'd have to spend a better part of a day fixing it and reloading the software.
So, we went out, bought iMacs for everyone, and loaded Parallels (a VMware-like product for OS X) on them. Then we made a standard dev image for everyone and just copied it to everyone's machines.
Essentially, if anyone's configuration got messed up, we just loaded in a fresh image and kept on truckin'. Saved a lot of time.
Some additional benefits:
When new software is out, we just make a new image and distribute it. No OS re-installs or anything like that.
If hardware changes, it doesn't matter; just move the image.
You can run multiple OSes concurrently for testing.
You can take "snapshots" in your current image and revert if you really messed something up.
Multiple builds on the same machine, since you can run multiple OSes.
Surprisingly, the overhead of a virtualized system is quite low.
We only run the software on a real machine for performance tuning/testing purposes.
One day is generally enough for upgrades. I keep digital copies of VS.NET, which makes it much easier to install.
When it comes to other tools, it's generally better to go to the websites and install the latest versions.
It's also a good idea to install tools whenever you need them instead of trying to install everything at the same time.
The last time I upgraded to a new machine, I think it took about 4 hours to get most of the necessary tools reinstalled. Over time, I've had to re-install quite a few more tools, but I think it's worth it.
If you can get a ghost/image of the default tool set (Visual Studio 2003-2008, Eclipse, NetBeans, or whatever you're using), and all the major service packs, that would help a lot with the initial setup.
I think the downtime is definitely worth it; a new, faster machine will make anyone more productive.
You can have zero downtime by having both machines available, but you will not have as much productivity.
This depends on the number of tools needed by the development team. Tools such as Rational Software Architect can take hours to install on their own. The exercise of having the developers list the applications they need before moving can help you optimize deployment strategies. Both machines should be available for a fixed period of time; having them both lets developers work and kick off long-running installs at the same time.
Creating a standard image based on the list provided to you can improve efficiency. Having the relevant software on a share could also let them cherry-pick as needed and give the development team the feeling that they can go back as necessary.
Tools to assist in capturing user settings exist. I have only ever had experience with Doctor Mover. If you have 100 or more developers to move, it may be worth the cost. I can't complain too much, but it wasn't perfect.
I have never had a problem with just getting a list of all the software a particular user uses. In fact, I have never found the base install to be much of an issue. The part I tend to spend the most time on is re-configuring all of the users' custom settings (very common with developers, I find). This is where it is very valuable to have the old machine around for a while, so that the user can at a minimum remote-desktop to it and see how they had things set up.
Depending on how your team works, I would highly recommend having every user who receives a new computer get the latest source tree from your source control repository rather than copying entire directories. And I would also recommend doing that before actually sending the old workstation elsewhere, or even disconnecting it.
One of the great things about tools like CVS and SVN is that it is quite easy for developers to end up with an unofficial "personal branch" from things that are not properly checked in, merged, etc.
While it will cost time to deal with the shift if things are not properly synchronized, it is an invaluable opportunity to catch those things before they come back to haunt you later.
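Before the old machine is wiped or handed off, a quick dry-run audit of each working copy will flag those stragglers:

    svn status        # 'M' = locally modified, '?' = never added to the repository
    cvs -nq update    # -n = dry run; 'M' and '?' lines mark local-only changes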