I am trying to implement a virtual drive using the Cloud Files API, and I cannot make the Delete operation work. For some reason the CF_CALLBACK_TYPE_NOTIFY_DELETE callback is called twice.
I have built two samples, in C++ and in C#, using two different approaches, and in both cases I get the same result:
CF_CALLBACK_TYPE_NOTIFY_DELETE is called.
CF_CALLBACK_TYPE_NOTIFY_DELETE is called again.
CF_CALLBACK_TYPE_NOTIFY_DELETE_COMPLETION is called.
This behavior is somewhat confusing. Can anybody explain why this is happening and what I should do inside the second call? Can I somehow distinguish the two calls and ignore one of them?
I reproduced this issue in Windows File Manager, but when I performed the delete operation in PowerShell the problem did not occur. So this is probably a bug or a limitation of Windows File Manager.
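One possible workaround (a minimal sketch, not part of the Cloud Filter API itself): key each NOTIFY_DELETE on the file's path and ignore a repeat until the matching DELETE_COMPLETION clears it. The OnNotifyDelete/OnDeleteCompletion methods below are hypothetical hooks for whatever callback plumbing your sample already has:

using System.Collections.Concurrent;

class DeleteDeduper
{
    // Paths for which a delete is already in flight.
    private readonly ConcurrentDictionary<string, byte> inFlight =
        new ConcurrentDictionary<string, byte>();

    // Call from the CF_CALLBACK_TYPE_NOTIFY_DELETE handler; returns true
    // only for the first notification for a given file.
    public bool OnNotifyDelete(string normalizedPath)
    {
        return inFlight.TryAdd(normalizedPath, 0);
    }

    // Call from the CF_CALLBACK_TYPE_NOTIFY_DELETE_COMPLETION handler so
    // a later delete of a recreated file at the same path is seen again.
    public void OnDeleteCompletion(string normalizedPath)
    {
        inFlight.TryRemove(normalizedPath, out _);
    }
}

This does not explain the duplicate call, but it makes the handler idempotent regardless of which shell triggered the delete.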
I'm toying with API Platform and just saw that it comes with a client generator command.
When I run it a second time it has no effect because files have already been generated.
But, in the case of a custom template, if I want to re-run the client generation to update the existing code, is there anything I can do?
At the moment the only way that I have found is to delete the generated directories and run the command again, but it is not a nice solution.
The client generator is designed for scaffolding only. It is intended to bootstrap your project; after that it will never try to update the existing code.
At the moment the only way that I have found is to delete the generated directories and run the command again, but it is not a nice solution.
It's the only solution for now. We may introduce a new --overwrite flag (Pull Request welcome), but internally it will do exactly the same thing.
I have created a web performance test using Visual Studio 2017. Most of the pages are data driven (login, changing of lists, etc.).
I have added extraction rules, but when I run a load test of the same web performance test it gives me 403 errors and fails the test.
My question here is: how should I make it work?
Thanks in advance
A 403 error usually means that your VS script is failing at login. It most likely happens due to one or several dynamic values that VS plays back exactly as recorded. To fix it you need to find and manually correlate them by creating extraction rules and parameters somewhere near the beginning of your script.
It looks like you already created some extractors, but if your script is still failing, chances are that you did not create all of them. Try the manual correlation technique described in my article.
Or check our Web Test Builder for Visual Studio referenced here to automatically create the missing extractors and parameters.
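For illustration, here is what manual correlation looks like in a coded web test; the URL, token markup, and parameter names below are assumptions for the sketch:

using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

public class LoginWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // 1. GET the login page and capture the dynamic token
        //    (hypothetical URL and delimiters).
        WebTestRequest loginPage = new WebTestRequest("http://example.com/login");
        ExtractText extractToken = new ExtractText
        {
            StartsWith = "name=\"__token\" value=\"",
            EndsWith = "\"",
            ContextParameterName = "Token",
            Required = true
        };
        loginPage.ExtractValues += extractToken.Extract;
        yield return loginPage;

        // 2. POST the credentials, playing back the extracted value
        //    via {{Token}} instead of the recorded one.
        WebTestRequest doLogin = new WebTestRequest("http://example.com/login");
        doLogin.Method = "POST";
        FormPostHttpBody body = new FormPostHttpBody();
        body.FormPostParameters.Add("__token", "{{Token}}");
        body.FormPostParameters.Add("user", "testuser");
        body.FormPostParameters.Add("password", "secret");
        doLogin.Body = body;
        yield return doLogin;
    }
}

The same rule can be added declaratively in the .webtest editor; the coded form just makes the correlation step explicit.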
I am trying to develop an application for monitoring operations on a particular folder, including creating, deleting, modifying, moving, duplicating and renaming.
After referring to the FileSystemWatcher API, I found that it does not provide a Move event; instead it generates separate Delete and Create events for the same file. Of course I can wait for a Create event each time a Delete event is detected and decide whether the pair is really a move, but that feels a little strange to me, and choosing an appropriate time to wait is also a problem.
Actually, I find that Dropbox is able to detect a move correctly even when the Dropbox process is not running. Nice work by the Dropbox team, and I wonder how on earth they make it work.
So my purpose is quite clear: distinguishing a Move event from the other events, and making the detection work offline (that is, while the monitoring process is stopped).
Well, a move operation is basically a "copy & delete" from the FileSystemWatcher's point of view. Without knowing details about the implementation, I bet Dropbox stores a hash for each file. If a file is removed in one place and created with the same hash in another place, chances are it was moved. You could actually do the same thing; a sketch of the idea follows.
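A minimal sketch of that hash-matching idea, assuming a single watched folder; the match window, hash algorithm, and folder path are illustrative choices, not Dropbox's actual implementation:

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Security.Cryptography;

class MoveDetector
{
    // Content hash of every file we currently know about.
    static readonly ConcurrentDictionary<string, string> hashByPath =
        new ConcurrentDictionary<string, string>();
    // Hashes of recently deleted files, with the time of deletion.
    static readonly ConcurrentDictionary<string, DateTime> deletedAtByHash =
        new ConcurrentDictionary<string, DateTime>();
    static readonly TimeSpan matchWindow = TimeSpan.FromSeconds(2); // heuristic

    static void Main()
    {
        string root = @"C:\watched"; // hypothetical folder
        foreach (string path in Directory.GetFiles(root, "*", SearchOption.AllDirectories))
            hashByPath[path] = HashFile(path);

        var watcher = new FileSystemWatcher(root) { IncludeSubdirectories = true };
        watcher.Deleted += (s, e) =>
        {
            // The file is already gone, so we rely on the pre-computed hash.
            if (hashByPath.TryRemove(e.FullPath, out string hash))
                deletedAtByHash[hash] = DateTime.UtcNow;
        };
        watcher.Created += (s, e) =>
        {
            if (!File.Exists(e.FullPath)) return; // skip directories
            string hash = HashFile(e.FullPath);
            hashByPath[e.FullPath] = hash;
            if (deletedAtByHash.TryRemove(hash, out DateTime when)
                && DateTime.UtcNow - when < matchWindow)
                Console.WriteLine("Move detected: " + e.FullPath);
            else
                Console.WriteLine("Create: " + e.FullPath);
        };
        watcher.EnableRaisingEvents = true;
        Console.ReadLine();
    }

    static string HashFile(string path)
    {
        using (var sha = SHA256.Create())
        using (var stream = File.OpenRead(path))
            return BitConverter.ToString(sha.ComputeHash(stream));
    }
}

The same startup scan is what makes the offline case tractable: comparing the stored hashes against the folder's current state at launch reveals files that moved while the process was stopped.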
Sorry if a similar question has been posed before. There are a lot of deployment questions but none seemed to address my problem.
Anyway. I'm working with asp.net, C# and using Visual Studio.
The Organization I'm working in is changing rapidly. There are a lot of projects coming in the pipeline that will require multiple code changes and iterative deployments over the next few months. While working, these changes are always 'on the forefront', so sometimes I have to code certain parts of the same program multiple times.
Since these projects are all staggered, I can't just make one sweeping change all at once; I have to deploy and redeploy the same program multiple times, using only the changes that are required for that deployment.
If this is confusing, here's a simple example:
Application is being used on an Intranet. This application calls our Database, using Driver A.
There are two environments, test and production.
Certain Stored procedures have to be called with parameters that register 'Test' to allow certain other applications to run even with bad data (for testing purposes).
When deploying applications, these stored procedures have to be modified, removing the Test parameters.
We have an Operating System upgrade, allowing us to move to a much faster Driver B, but requires changes to be made to the code to use Driver B.
So that's two wholly different deployments where some code must be changed for Deployment 1 and other code must be changed for Deployment 2.
Currently I'm just using Notepad for an overall change list, regular debugging breakpoints, and a multitude of in-code comments, and then I manually slog through the code to make sure that everything is changed. With hundreds of thousands of lines of code over multiple files, classes, objects, etc., this gets pretty tedious, and there is a good chance of missing something (causing it to break) or pushing the wrong changes (causing it either to break or to allow bad data).
Is there a tool that could be used to help in this situation? Preferably one that lets me discern what needs to change for Deployment A and what needs to change for Deployment B. I'm also open to hearing other schools of thought (tips are definitely accepted!).
Sure, I understand your problem.
I would suggest a couple of things:
Installers: Why not consider installers? There are plenty of them, e.g. InstallShield, WiX, and MSI installers.
These installers give you the flexibility to update only the files you need to update, i.e. full control.
But you need to choose the one that fits best. I have worked with MSI and WiX a lot, so I know they can sort out your problem; however, it's your call.
Publish: I haven't played around much with this; I have only done website publishing. However, I know it does wonders, so try it as well.
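Since the question welcomes other schools of thought: another option is to capture the per-deployment differences with build configurations and conditional compilation, so each deployment is produced by rebuilding rather than by hand-editing. A minimal sketch; the TEST_ENV and DRIVER_B symbols are hypothetical, and the two ADO.NET providers merely stand in for your actual drivers:

using System.Data;
using System.Data.Odbc;
using System.Data.SqlClient;

// TEST_ENV and DRIVER_B would be defined per build configuration in the
// .csproj, e.g. <DefineConstants>TEST_ENV;DRIVER_B</DefineConstants>.
public static class DataAccess
{
    // Stored procedure call text differs between test and production.
    public static string BuildProcedureCall(string procName)
    {
#if TEST_ENV
        // Test deployments pass the extra 'Test' parameter described above.
        return "EXEC " + procName + " @Mode = 'Test'";
#else
        return "EXEC " + procName;
#endif
    }

    // Driver choice differs between the old and new operating system.
    public static IDbConnection CreateConnection(string connectionString)
    {
#if DRIVER_B
        return new OdbcConnection(connectionString);  // stand-in for "Driver B"
#else
        return new SqlConnection(connectionString);   // stand-in for "Driver A"
#endif
    }
}

Each deployment then becomes "build configuration X" rather than a hand-maintained change list, and source control diffs show exactly what differs between them.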
I have written an application that uses isolated storage to store data, which I want to clear out periodically when it gets old. I have written a function, called from Closing, that checks the isolated storage for old data and deletes it.
This routine deletes every file it is supposed to except the last two files in the directory. When I debug the code I can see it execute the DeleteFile method on those files. I even went as far as checking right after the call to DeleteFile to see if the file still exists. According to the debugger, it does not.
Yet when the application starts up again, the old data for those last files is still in isolated storage. Thinking that it might be a race condition, I put a Thread.Sleep(1000) after the delete routines.
The phone does not honor this delay and exits immediately after executing the delete code. I could not find a flush command related to DeleteFile, as I don't have a reference to a stream at that point.
Has anyone else seen this or something similar? Is there a magic flush method I am missing, or is this a defect in the phone's IsolatedStorage implementation?
I agree with Matt and Matthieu.
Though I also wish to ask: have you tried truncating the file?
using (var isfStream = new IsolatedStorageFileStream(strXMLFile, FileMode.Truncate, isf)) { } // Dispose commits the truncation to the store
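If truncation alone doesn't help, it may also be worth making sure the store itself is disposed before the app exits. A minimal sketch, assuming the list of old files has already been determined by your age check:

using System.IO.IsolatedStorage;

public static class StorageCleanup
{
    // Delete the given files and dispose the store so the changes are
    // committed before the process is torn down.
    public static void DeleteFiles(string[] oldFiles)
    {
        using (IsolatedStorageFile isf = IsolatedStorageFile.GetUserStoreForApplication())
        {
            foreach (string name in oldFiles)
            {
                if (isf.FileExists(name))
                    isf.DeleteFile(name);
            }
        }
    }
}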