Does anyone know how to test in Delphi/VCL whether Dropbox (or a similar cloud syncing tool such as Google Drive or Microsoft OneDrive) is:
running on a client computer, and
whether they are set to synchronize a particular folder/directory?
I am asking because I have been having issues with the database I am using (NexusDB). When the program's database is installed in a folder that Dropbox is also synchronizing, the program crashes (Dropbox seems to temporarily lock files, making them unavailable to my program). I would like to be able to test for this and at least warn users.
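A minimal sketch of the two checks I have in mind, in C++ for illustration (the same Win32 calls are available from Delphi). It relies on the Dropbox desktop client writing an info.json file under %LOCALAPPDATA%\Dropbox (or %APPDATA%\Dropbox) that contains the sync root path; treat that location and the naive string-searching "parser" below as assumptions rather than a guaranteed interface, and the database path in main is purely hypothetical:

// Sketch: is Dropbox running, and is a given folder under its sync root?
// Assumes Dropbox's info.json under %LOCALAPPDATA%\Dropbox holds the
// sync root path; the JSON handling is a naive string search.
#include <windows.h>
#include <tlhelp32.h>
#include <wchar.h>
#include <algorithm>
#include <cctype>
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

bool IsProcessRunning(const wchar_t* exeName)
{
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPPROCESS, 0);
    if (snap == INVALID_HANDLE_VALUE) return false;
    PROCESSENTRY32W pe = { sizeof(pe) };
    bool found = false;
    if (Process32FirstW(snap, &pe)) {
        do {
            if (_wcsicmp(pe.szExeFile, exeName) == 0) { found = true; break; }
        } while (Process32NextW(snap, &pe));
    }
    CloseHandle(snap);
    return found;
}

// Pull the "path" value out of info.json (naive, as hedged above).
std::string GetDropboxRoot()
{
    char local[MAX_PATH] = {};
    if (!GetEnvironmentVariableA("LOCALAPPDATA", local, MAX_PATH)) return "";
    std::ifstream f(std::string(local) + "\\Dropbox\\info.json");
    if (!f) return "";
    std::stringstream ss; ss << f.rdbuf();
    std::string json = ss.str();
    size_t key = json.find("\"path\"");
    if (key == std::string::npos) return "";
    size_t colon = json.find(':', key);
    if (colon == std::string::npos) return "";
    size_t start = json.find('"', colon);
    if (start == std::string::npos) return "";
    size_t end = json.find('"', start + 1);
    if (end == std::string::npos) return "";
    std::string raw = json.substr(start + 1, end - start - 1);
    std::string path;                      // un-escape "\\" -> "\"
    for (size_t i = 0; i < raw.size(); ++i) {
        path += raw[i];
        if (raw[i] == '\\' && i + 1 < raw.size() && raw[i + 1] == '\\') ++i;
    }
    return path;
}

bool IsInsideDropbox(std::string folder)
{
    std::string root = GetDropboxRoot();
    if (root.empty()) return false;
    std::transform(folder.begin(), folder.end(), folder.begin(), ::tolower);
    std::transform(root.begin(), root.end(), root.begin(), ::tolower);
    // Prefix check only; a robust version should also require a path
    // separator right after the root to avoid "DropboxOther" matches.
    return folder.compare(0, root.size(), root) == 0;
}

int main()
{
    std::cout << "Dropbox running: " << IsProcessRunning(L"Dropbox.exe") << "\n";
    std::cout << "DB dir synced:   "
              << IsInsideDropbox("C:\\Users\\me\\Dropbox\\MyAppDB") << "\n"; // hypothetical path
}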
I notice, by the way, that TMS Software has a set of Delphi components for working with cloud syncing services, but I have not tried these yet.
I have a problem that I have a difficult time explaining, which makes searching online very hard. Here is my dilemma.
I'm migrating a VM. The purpose of this machine is to compile and send out daily/weekly/monthly reports. I know there are other ways (like Power BI), but this is the situation we are in right now. The old machine has Windows 10 Pro and Office 365 installed, while the new one has Windows 10 Enterprise and Office 2016. The machine runs 24/7 in the background, running specific tasks (via a system scheduler app) at given times, and as a virtual machine it has done so without issues since it was created. The reason for the migration is that we need to change domains and bring the machine under a new corporate policy, and we don't want to do this on a live server.
We've set up the VMs the same way, with the same programs and the same settings. Everything seems to be running smoothly except for this one thing, and here is the problem I have a hard time explaining or figuring out:
MS Access will update the tables, and the computer will run the tasks as scheduled, but it will not export the data to PDF unless I have a Remote Desktop connection open; it will not export the PDFs otherwise. MS Access uses an AutoExec macro where the PDF export is done with ExportWithFormatting. This works without issues on the old server.
We thought this was a permission or user-specific issue at first, but re-creating the tasks and changing paths did not help. Otherwise I would also expect problems with the tables updating, especially since everything works when an active Remote Desktop connection is running.
I'm lost and therefore hoping this community will be able to help or guide me to a solution.
I believe we found the reason for this. It was caused by Windows Easy Print and the machine's printer drivers, which for some reason behaved differently between the servers. After reinstalling the printer drivers and a few restarts, it started working. It exports from Access again.
This is at least solved.
This is probably a pretty easy question to answer for someone that is experienced with FS minifilters. I am trying to script the removal of a filter driver and device.
Some background: this driver is running on Windows 8/10 x64. The vendor that created the driver has not been helpful in fulfilling my request for a removal tool. Unfortunately, their MSI uninstall is buggy and only works about half the time you run it. They want us to upgrade to their newest version, which doesn't have the bug we are encountering during uninstallation; we aren't interested in continuing to use this software, so a paid upgrade seems frivolous. Their only suggestion has been to reimage the computers without the software that includes the FS minifilter device, which is out of the question because it is on 1000+ computers.
Basically, their official uninstaller does an API callback to one of their servers and verifies the machine's eligibility to uninstall:
Does the MAC address of the primary network adapter exist in their database?
Does the password you entered for uninstallation match what is set in their database?
If you are eligible, it runs an MSI uninstallation that disables the FS filter, removes the driver file, service files, and configuration, and restarts. The bug that is keeping us from doing a normal bulk removal (their way) is that the MSI freezes during the removal process (after checking eligibility) and requires us to restart a client computer up to 3 times to finish the uninstall.
I have been able to successfully remove the software and the device/driver by externally mounting the Windows file system and manually removing the driver file under System32\Drivers, along with all of the actual program files/services. I have not been able to do this while booted into the same partition where the minifilter is loaded: the running minifilter driver is protecting those program files, a registry key, and the actual .sys file under System32.
I've done some basic reverse engineering of their MSI. They are using custom actions to perform the removal: the first step is the removal of the service, and the second step is the removal of the minifilter. Both actions are done via an executable that is packaged in the MSI. I've extracted that executable and attempted to use it by running the same commands they run during the MSI, but I haven't had any luck. The minifilter just doesn't want to die.
They have some other custom actions that are loaded via DLL. Initial investigation makes me think it's all of their custom uninstall eligibility craziness.
It looks like their minifilter doesn't have an unload routine built in. Using FLTMC, I get these errors when attempting to detach and/or unload:
0x801f0010 Do not detach the filter from the volume at this time.
0x801f0014 Do not detach the filter from the volume at this time.
Does anyone know of a good way to unload a minifilter that is throwing those errors?
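For reference, the detach/unload operations FLTMC performs are also exposed through the user-mode filter management API, so the attempt can be scripted and the raw HRESULT inspected. A minimal sketch, assuming administrator rights and SeLoadDriverPrivilege; the filter name is a placeholder (the real one comes from "fltmc filters"):

// Sketch: attempt detach/unload via fltuser.h, which is what fltmc.exe
// calls underneath. Requires admin plus SeLoadDriverPrivilege.
// "VendorFilter" is a placeholder name. Link against fltlib.lib.
#include <windows.h>
#include <fltuser.h>
#include <cstdio>
#pragma comment(lib, "fltlib.lib")

int wmain()
{
    const wchar_t* filter = L"VendorFilter"; // placeholder name

    // Detach the filter's instance from one volume (NULL = default instance).
    HRESULT hr = FilterDetach(filter, L"C:", NULL);
    wprintf(L"FilterDetach: 0x%08lx\n", hr);

    // Ask the filter to unload entirely. A filter registered without an
    // unload routine (or one that fails the request in its unload
    // callback) refuses this, which matches the 0x801f00xx errors above.
    hr = FilterUnload(filter);
    wprintf(L"FilterUnload: 0x%08lx\n", hr);
    return 0;
}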
Try to kick FltMgr.sys out of the kernel by:
Renaming %SystemRoot%\system32\drivers\FltMgr.sys
Or changing HKLM\SYSTEM\CurrentControlSet\Services\FltMgr\Start to 0x4 (SERVICE_DISABLED; see the sketch below)
Reboot
Minifilters can't work without Filter Manager.
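If you take the registry route, the value change looks roughly like this (a minimal sketch; run it elevated, and be aware that disabling Filter Manager stops every minifilter on the machine, antivirus filters included, so treat it strictly as a last-resort diagnostic step):

// Sketch: disable the FltMgr service (Start = 4, SERVICE_DISABLED) so
// Filter Manager does not load on the next boot. Run elevated; this
// takes down every minifilter on the system, so use it only as a
// last-resort diagnostic and undo it afterwards.
#include <windows.h>
#include <cstdio>
#pragma comment(lib, "advapi32.lib")

int main()
{
    HKEY key;
    LSTATUS rc = RegOpenKeyExW(HKEY_LOCAL_MACHINE,
        L"SYSTEM\\CurrentControlSet\\Services\\FltMgr",
        0, KEY_SET_VALUE, &key);
    if (rc != ERROR_SUCCESS) {
        printf("RegOpenKeyExW failed: %ld\n", rc);
        return 1;
    }
    DWORD start = 4; // SERVICE_DISABLED; FltMgr is normally boot-start (0)
    rc = RegSetValueExW(key, L"Start", 0, REG_DWORD,
        reinterpret_cast<const BYTE*>(&start), sizeof(start));
    if (rc == ERROR_SUCCESS)
        printf("FltMgr Start set to 4; reboot to apply.\n");
    else
        printf("RegSetValueExW failed: %ld\n", rc);
    RegCloseKey(key);
    return rc == ERROR_SUCCESS ? 0 : 1;
}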
If you are desperate enough, look into Windows PE, available as part of the Windows Assessment and Deployment Kit.
A Windows PE image can be remotely installed onto a machine's hard disk and configured to perform whatever task you need done and then automatically reboot back into the original operating system. Doing it this way gives you the same access as externally mounting the infected file system, but can be automated. I've used this approach in the past to automate offline maintenance tasks on several hundred machines (e.g., changing a registry setting that Symantec Endpoint Protection was "protecting") and while getting it working is fiddly, once it is working it works well.
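As a rough illustration of what the automated offline pass could do once WinPE is booted, with the target Windows volume mounted as D: (the volume letter, service key name, and driver file name below are all placeholders):

// Sketch: offline removal from WinPE. Loads the target system's SYSTEM
// hive, deletes the filter's service key, then deletes the driver file.
// Names are placeholders; needs SeBackupPrivilege and SeRestorePrivilege.
#include <windows.h>
#include <cstdio>
#pragma comment(lib, "advapi32.lib")

static bool EnablePrivilege(const wchar_t* name)
{
    HANDLE token;
    if (!OpenProcessToken(GetCurrentProcess(),
                          TOKEN_ADJUST_PRIVILEGES, &token))
        return false;
    TOKEN_PRIVILEGES tp = {};
    tp.PrivilegeCount = 1;
    tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
    bool ok = LookupPrivilegeValueW(NULL, name, &tp.Privileges[0].Luid)
           && AdjustTokenPrivileges(token, FALSE, &tp, 0, NULL, NULL)
           && GetLastError() == ERROR_SUCCESS; // catches ERROR_NOT_ALL_ASSIGNED
    CloseHandle(token);
    return ok;
}

int wmain()
{
    // RegLoadKey needs both backup and restore privileges enabled.
    if (!EnablePrivilege(L"SeBackupPrivilege") ||
        !EnablePrivilege(L"SeRestorePrivilege")) {
        printf("Could not enable privileges\n");
        return 1;
    }
    // Mount the offline SYSTEM hive under a temporary name.
    LSTATUS rc = RegLoadKeyW(HKEY_LOCAL_MACHINE, L"OfflineSystem",
                             L"D:\\Windows\\System32\\config\\SYSTEM");
    if (rc != ERROR_SUCCESS) { printf("RegLoadKey: %ld\n", rc); return 1; }

    // ControlSet001 is usually the set CurrentControlSet points at;
    // "VendorFilter" stands in for the real service name.
    rc = RegDeleteTreeW(HKEY_LOCAL_MACHINE,
        L"OfflineSystem\\ControlSet001\\Services\\VendorFilter");
    printf("RegDeleteTree: %ld\n", rc);

    RegUnLoadKeyW(HKEY_LOCAL_MACHINE, L"OfflineSystem");

    // With the filter not running, the .sys file is no longer protected.
    if (!DeleteFileW(L"D:\\Windows\\System32\\drivers\\VendorFilter.sys"))
        printf("DeleteFile: %lu\n", GetLastError());
    return 0;
}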
My email address is in my profile, you're welcome to contact me if you decide on this approach and have questions about implementing it.
Alternatively, depending on your jurisdiction and circumstances, you might want to consider threatening the vendor with a lawsuit if they refuse to provide a proper solution. They broke your computers; it should be their job to fix them. From the sounds of it, they wouldn't even need to do any work, just let you have the upgraded version for a few weeks free of charge.
We have a problem at a customer site that is not reproducible (some data is shown incorrectly). The customer hits the problem several times per day, but we can't reproduce it in house.
We could use remote debugging to investigate the running process once the customer hits the problem, but this requires a developer PC to connect to the customer PC through lots of VPN software. In practice, this is almost impossible, since the customer does not want us to connect directly to the server running the application (often a Remote Desktop or Citrix system is also involved).
I know you can make a minidump of a running process to investigate it in a debugger, but then you cannot continue the process to see what is really going on.
Is there a possibility to make a dump of the process, copy the dump to the developer's PC, and continue the process there?
The application is a native, unmanaged C++ application.
Of course, all logic related to database connections, network connections, files, and so on would be unavailable, but in this case I am mainly interested in the internal logic.
If this is not possible, is this generally possible using a virtual machine instead?
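For the capture step mentioned above, a minimal sketch using DbgHelp's MiniDumpWriteDump (the PID and output path are placeholders; the target process keeps running after the dump is written). Note that debuggers such as WinDbg or Visual Studio can inspect a full-memory dump on another PC, but they cannot resume execution from it; a dump is a post-mortem snapshot:

// Sketch: write a full-memory dump of a live process with DbgHelp.
// PID and output path are placeholders. The target is not terminated;
// it keeps running once the dump has been written.
#include <windows.h>
#include <dbghelp.h>
#include <cstdio>
#pragma comment(lib, "dbghelp.lib")

int main()
{
    DWORD pid = 1234; // placeholder: the customer process's PID
    HANDLE process = OpenProcess(
        PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, FALSE, pid);
    if (!process) {
        printf("OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }
    HANDLE file = CreateFileW(L"C:\\temp\\app.dmp", GENERIC_WRITE, 0,
        NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (file == INVALID_HANDLE_VALUE) {
        printf("CreateFileW failed: %lu\n", GetLastError());
        CloseHandle(process);
        return 1;
    }
    // MiniDumpWithFullMemory captures the whole address space, which is
    // what you need to inspect internal program state afterwards.
    BOOL ok = MiniDumpWriteDump(process, pid, file,
        MiniDumpWithFullMemory, NULL, NULL, NULL);
    if (ok)
        printf("Dump written.\n");
    else
        printf("MiniDumpWriteDump failed: 0x%08lx\n", GetLastError());
    CloseHandle(file);
    CloseHandle(process);
    return ok ? 0 : 1;
}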
I am developing a webapp that will be used mostly on a LAN. I have deployed this app at several locations. Some of the locations run Windows and some run Linux (no X Window System). I need to know if there is software out there that can easily synchronize my files, stored somewhere in the cloud (the cloud service can be provided by the app developers, or different clouds can be used), on both the Linux and Windows machines. My English is a bit rusty, so I'm going to explain this in simple words.
I will work on my local machine. I want to upload the files somewhere in the cloud, and the clients installed on the LAN servers should synchronize the files. The client must be available for Linux on the console (as a daemon, if possible), while on Windows it can be something like Dropbox or Ubuntu One.
Does somebody know of such an app?
Dropbox is available for Linux.
You could also investigate unison.
I think "Git" is the best solution to develop your project in different machine.
You can sync your code with easy command through this app, and it will record all the version of your code.
Just google "Git tutorial", and you will find many useful introductions.
Eight years on, I think there is a great tool called Syncthing that should be considered.
https://syncthing.net/
Syncthing is a continuous file synchronization program. It synchronizes files between two or more computers and replaces proprietary sync and cloud services with something open, trustworthy and decentralized. Your data is your data alone and you deserve to choose where it is stored, if it is shared with some third party and how it's transmitted over the internet.
Check the list of Syncthing's goals for more details.
We have an app that currently installs itself into 'Program Files\Our App' and puts its internal data files into the common Application Data folder. This means the program is available to any user on that particular PC.
Now we want to make a multi-user version of this program, multiple PCs accessing the program at the same time across the network.
In the bad old days, under XP, we'd just have the user who installed the app 'share' the app directory, and off we'd go. In principle, is this still the 'right' way to do it under Vista/Windows 7?
We'd like to do this 'properly' and be as compliant as possible! Is there a recommended 'Microsoft' approach for doing this, or is it largely down to whatever we can get away with and subsequently support (hah!)? I've tried researching this on the MS websites but haven't found anything too helpful; it would be really useful to have an 'if you're trying to install this kind of thing, put it here' type guide for developers!
I think the recommended way to deal with this is one of the following:
Package the application as an MSI and distribute it via group policies to all machines on the domain. Yes, this will install it on every machine, but that's usually how it should be done.
Install the application once on a server with Terminal Services and push a shortcut for running the program on that server to every client machine. You can transparently use single applications on a terminal server; AFAIK you can even associate file types with them on the client machines.
Reading between the lines from other resources, and trying to make sense of it all, I've decided that the 'right' place to put the program data for this really is the AppData (Roaming) folder, and the right place for the program binaries themselves really is the Program Files folder of the host computer, which I can then share out.
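For anyone implementing this, a small sketch of resolving those locations programmatically with SHGetKnownFolderPath (minimal error handling; the example prints the per-user roaming folder, the common data folder mentioned in the question, and Program Files):

// Sketch: resolve the standard locations with SHGetKnownFolderPath.
// Minimal error handling; Vista and later only.
#include <windows.h>
#include <shlobj.h>
#include <knownfolders.h>
#include <cstdio>
#pragma comment(lib, "shell32.lib")
#pragma comment(lib, "ole32.lib")

static void PrintFolder(const KNOWNFOLDERID& id, const wchar_t* label)
{
    PWSTR path = NULL;
    if (SUCCEEDED(SHGetKnownFolderPath(id, 0, NULL, &path))) {
        wprintf(L"%ls: %ls\n", label, path);
        CoTaskMemFree(path); // the shell allocates; the caller must free
    }
}

int wmain()
{
    PrintFolder(FOLDERID_RoamingAppData, L"Per-user (roaming) data"); // e.g. C:\Users\<user>\AppData\Roaming
    PrintFolder(FOLDERID_ProgramData,    L"Shared (common) data");    // e.g. C:\ProgramData
    PrintFolder(FOLDERID_ProgramFiles,   L"Program binaries");        // e.g. C:\Program Files
    return 0;
}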