How to tell if OneDrive file sync has completed - Windows

I have an MS Access database that copies itself to a OneDrive folder on shutdown. I don't want Access to exit completely until the file has finished copying to the OneDrive cloud, but I haven't been able to figure out how to check for this. Using VBA I've tried checking whether the file exists (it does) and whether the file sizes match (they do), but I can see that it's still syncing.
When I look at the file's properties/details, I can see that it's marked as "Available offline". I don't see any way to set it to be available online only. I'm running Windows 10.
-- Geoff

The only solution I've found was to map OneDrive to a network drive, as described here: https://www.youtube.com/watch?v=qm1Of4eFDDY, and then copy to the mapped drive. The copy completes only once the entire file is on the cloud.
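A minimal sketch of that approach in PowerShell rather than VBA (the drive letter, CID, and paths below are placeholders; a personal OneDrive exposes a WebDAV endpoint at d.docs.live.net, which is what the mapping relies on):
# Map OneDrive's WebDAV endpoint as a network drive (one-time setup;
# <your-CID> is a placeholder for your own OneDrive CID).
net use O: "https://d.docs.live.net/<your-CID>" /persistent:yes
# Copy-Item does not return until the file has been fully written to
# the mapped (cloud-side) drive, so it should be safe to let Access
# exit once this completes.
Copy-Item -Path "C:\Databases\backend.accdb" -Destination "O:\Backups\backend.accdb"
Once the drive is mapped, the copy can equally be done from Access VBA with FileSystemObject.CopyFile, which also blocks until the write completes.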
-- Geoff

Related

Deleting Files on a Network Drive that are opened by another User

I have an odd issue currently.
I have a build script that essentially copies some files over to a directory on a networked drive. Initially I had an issue where some people were leaving these files open on their machines, causing my build to fail.
A solution came up: simply delete the files, since they're build artifacts. But the other day the folder itself was opened and locked.
I cannot think of a way to unlock the folder and forcefully close any open files.
Various file unlockers don't seem to work on folders or on a networked drive from my machine.
I figure people have been copying files for years and have hit this issue before me, so what are some ways to get around this file-locking issue besides asking someone to close a file or folder?
General Info:
Windows Server
Windows Local Machine
Transfer via UNC Paths
Thanks!
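One server-side technique worth knowing (a sketch, not a definitive fix: it assumes the share is hosted on Windows Server 2012 or later and that you have admin rights on that server; the artifact path is a placeholder):
# Enumerate the handles clients hold open against the share and
# force-close the ones under the build output folder. Users lose any
# unsaved state in those files, which is acceptable here since they
# are build artifacts.
Get-SmbOpenFile |
    Where-Object { $_.Path -like 'D:\Shares\BuildArtifacts\*' } |
    Close-SmbOpenFile -Force
Run on the file server itself (or remotely via -CimSession), this releases the locks so the subsequent delete and copy can proceed.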

Preventing PowerShell from adopting Windows Explorer options

I don't do much with PowerShell often, but yesterday I needed to extract a file from a zip archive. In the process I was looking for a file by iterating through the contents of the zip and comparing each item's Name property to the filename (say abc.ps1).
The code was working locally and I was happy with the result I was getting.
I pushed the code to the deployment environment to be run on a server, and the code notified me that it could not find abc.ps1.
Investigating further, I logged onto the server (as my own account, not the service account that the code executes as). When I looked in the zip it was querying, I could see the file, but it was named abc.
I had xyz.txt and def.pdf, but the ps1 file was simply abc. I knew that this is down to the 'Hide extensions for known file types' option in Windows Explorer.
Logging on to the server with the service account, unchecking this option, and re-running the script fixed all problems; it was able to find the file based on name + extension.
Is there a way of forcing PS not to take Explorer options such as these into account when running?
I'm currently thinking of applying a GPO to prevent any systems from having this option on.
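If the script is reading the zip through the Shell.Application COM object (which shares Explorer's view settings), one way to sidestep the problem entirely is to read the archive with .NET instead. A sketch, assuming .NET 4.5+ is available (the zip path and target name are placeholders):
# System.IO.Compression reads entry names straight out of the
# archive's central directory, so Explorer's 'Hide extensions for
# known file types' option never comes into play.
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::OpenRead('C:\deploy\package.zip')
try {
    $entry = $zip.Entries | Where-Object { $_.Name -eq 'abc.ps1' }
    if ($entry) { "Found $($entry.FullName)" }
}
finally {
    $zip.Dispose()
}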

Find out WHO made the last change to files by PowerShell?

I have a shared network location where all users save files. All users have full access to this location.
Is it possible to find out WHO made the last change to a file or folder, with PowerShell or any other way?
Getting all the date and owner info from PowerShell is no problem, but it looks like there is no way to find out WHO made the last change.
Any ideas, please?
The only user held against a file on NTFS is the owner. There is no record of who last modified the file.
However Windows can audit file system operations.
See http://support.microsoft.com/en-us/kb/310399 (it says Windows XP in the title, but it applies to later versions as well). Auditing needs to be enabled on the system hosting the file system.
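Once auditing is enabled per that article, the Security event log records the account behind each access. A rough sketch of pulling it out with PowerShell (event ID 4663 is the object-access audit event; the file name and the property index are assumptions to verify against your own log entries):
# Find recent object-access audit events that mention the file and
# show when and by whom it was touched. Reading the Security log
# requires elevation.
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4663 } |
    Where-Object { $_.Message -match 'report\.xlsx' } |
    Select-Object TimeCreated,
        @{ Name = 'User'; Expression = { $_.Properties[1].Value } }  # SubjectUserName in the 4663 template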

Moving a .sdf file into isolated storage

How do I move a .sdf file into my isolated storage, and after I have moved it, is there a way to delete the original, as it is of no further use? I have added my .sdf file as Content in my project.
Your question is not very clear, but let me see if I've got this. You created a database and added it to your project as Content, so that all the data is present when the user installs your app. Then you copy the data from the read-only .sdf file into a database that you create on first run, so that you can read from and write to it. Correct?
If so, I do not believe there is a way to delete the read-only file that you included with the install.
If your database is large enough that you are concerned about the space taken by having two copies of it on the phone, I would suggest placing your data on a server, creating a web service, and accessing that web service on first run. Place a notice on the screen letting your users know that the app is downloading information that will only be downloaded once, and that subsequent launches will not take as long. Be sure to include code to handle the download being interrupted by a phone call, text message, Back key press, Start button, or other event, and make it able to resume the download if it was interrupted in a prior run.
To answer your question, .SDF is the file format of Microsoft SQL Server Compact (SQL CE) databases. The link you have pasted talks about SQLite databases.
This is the way to download the entire isolated storage from the device onto your machine.
Open cmd and go to the following directory:
C:\Program Files\Microsoft SDKs\Windows Phone\v7.1\Tools\IsolatedStorageExplorerTool
Then use isetool.exe to download the isolated storage, along with the .sdf file, onto your machine:
isetool.exe ts xd [Product_id_here_see_WMAppManifest.xml] "D:\Sandbox"
You should get a message like "download successful into D:\Sandbox".
You can also upload the .sdf by changing the ts argument to rs.
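For example (same placeholder product ID as above; per the note just made, rs restores the snapshot from the folder back to the device's isolated storage):
isetool.exe rs xd [Product_id_here_see_WMAppManifest.xml] "D:\Sandbox"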

Path too long error when building a Windows Azure service

I have been trying to publish my service to Windows Azure. The service consists of a single web role; however, after I added remote login functionality and published and built it a few times, all of a sudden it will not build. The reason it gives is below:
"Error 56 The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters. C:\Program Files (x86)\MSBuild\Microsoft\Cloud Service\1.0\Visual Studio 10.0\Microsoft.CloudService.targets 202 5 FileSystemCreator"
I have gone on all the forums and have used the CSPack command line for packaging the service, which is fine, but I'm having a really hard time configuring the certificate for Remote Desktop connections, and I would like to take advantage of this feature since I am creating some websites in the OnStart event and would like to peek into IIS. Some Microsoft employees do agree that this is a bug, and they have promised a fix for this issue (refer to post). I am using VS2010 and I do not know how to work around this bug.
Can anyone please help, or point me to a place where I can get help?
I ran into the same problem with a new solution.
Note that, contrary to what Eugenio Pace's response suggests, the error occurs only when deploying to Azure (and not when running the project in the Azure Compute Emulator).
Try adding the following line to the first property group of your Windows Azure Visual Studio Project file (*.ccproj):
<ServiceOutputDirectory>C:\Azure\</ServiceOutputDirectory>
The trailing slash (for whatever path you select) appears to be required. This folder will be deleted each time you create a package if it exists.
This setting seems to redirect the working folder for the package to a shorter base path, preventing the path too long error.
Credit goes to: http://govada.blogspot.com/2011/12/windows-azure-package-build-error.html
Perhaps the path of the local folder used to store temporary development fabric files is too long. See Windows Azure - Resolving "The Path is too long after being fully qualified" Error Message.
I was having this problem as well when deploying a Node.js project to Azure.
To fix it, I had to change my "TEMP" and "TMP" user environment variables to something shorter than their default values.
In my case, they were pointing by default to %USERPROFILE%\AppData\Local\Temp, changing them to C:\Temp solved it.
Make sure you restart Windows afterwards.
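A sketch of the same change made from PowerShell instead of the System Properties dialog (C:\Temp is just an example target; the folder must exist, and a restart is still needed so running processes pick up the new values):
# Persist the new values at user scope, matching the manual edit above.
[Environment]::SetEnvironmentVariable('TEMP', 'C:\Temp', 'User')
[Environment]::SetEnvironmentVariable('TMP',  'C:\Temp', 'User')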
The better solution may be to create a symbolic link to your project folder. This doesn't require moving files or changing system variables. Open up the command prompt as an administrator and run this:
mklink /D C:\Dev C:\Users\danzo\Source\Workspaces
Obviously you can change "C:\Dev" to whatever you want it to be, and you'll need to change the longer path above to the root directory of your solutions/projects folder.
The same problem happened to me when I tried packaging an Umbraco project for Azure (https://github.com/WindowsAzure-Accelerators/wa-accelerator-umbraco/wiki/Deployment). I found the solution was to copy the folder with the long path to a short one like "C:\someshortname" and rename it accordingly.
(solution was suggested by this: link)
I tried both of the approaches above:
- changing the TEMP and TMP environment variables
- the <ServiceOutputDirectory> path
and neither worked.
In my case, I had to move the whole project to a shorter path (C:\) and that worked.
I'm using W7 and VS12.
When you run a cloud service on the development fabric, the development fabric uses a temporary folder to store a number of files including local storage locations, cached binaries, configuration, diagnostics information and cached compiled web site content.
By default this location is: C:\Users\<username>\AppData\Local\dftmp
Credit goes to Jim Nakashima of Microsoft:
https://blogs.msdn.microsoft.com/jnak/2010/01/14/windows-azure-resolving-the-path-is-too-long-after-being-fully-qualified-error-message/
In order to change the temporary folder, a user environment variable has to be created:
It is named _CSRUN_STATE_DIRECTORY
Give it the value of a short-named directory, like:
c:\AzureTemp
Don't forget to restart Visual Studio so that the environment variables are read again.
It fixed many compilation problems!
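The same variable can be created from PowerShell rather than through the dialog (the value is an example path; restart Visual Studio afterwards, as noted above):
# Create the development-fabric temp-folder override at user scope.
[Environment]::SetEnvironmentVariable('_CSRUN_STATE_DIRECTORY', 'C:\AzureTemp', 'User')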
