Is there a way to determine the last access time of Azure Storage files, apart from Log Analytics? Has anyone come across this situation, and what would be the best way to achieve it? Or am I too concerned about this?
Thank you in advance.
In Azure File storage there is currently no way to get the last opened/viewed/accessed time, but you can get a file's last modified time. If that is what you are looking for, here's a C# way:
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse("<Your Connection string>");
CloudFileClient cloudFileClient = cloudStorageAccount.CreateCloudFileClient();
CloudFileShare cloudFileShare = cloudFileClient.GetShareReference("<Your File Share Name>");
IEnumerable<IListFileItem> fileShareItemsList = cloudFileShare.GetRootDirectoryReference().ListFilesAndDirectories();
foreach (IListFileItem listItem in fileShareItemsList)
{
    if (listItem is CloudFile) // Checking direct files under the root directory for now
    {
        CloudFile file = (CloudFile)listItem;
        file.FetchAttributes(); // this is mandatory to fetch the modified time
        DateTimeOffset? lastModifiedTime = file.Properties.LastModified; // Here's the time! (LastModified is a nullable DateTimeOffset)
        // Use it in your logic..
    }
}
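If you also need files inside subdirectories, a recursive walk over the share works. Here's a rough sketch using the same SDK types (the method name is mine, untested):
static void PrintLastModifiedTimes(CloudFileDirectory directory)
{
    foreach (IListFileItem item in directory.ListFilesAndDirectories())
    {
        if (item is CloudFile)
        {
            CloudFile file = (CloudFile)item;
            file.FetchAttributes(); // needed to populate Properties.LastModified
            Console.WriteLine("{0}: {1}", file.Name, file.Properties.LastModified);
        }
        else if (item is CloudFileDirectory)
        {
            PrintLastModifiedTimes((CloudFileDirectory)item); // recurse into subdirectories
        }
    }
}
Call it with cloudFileShare.GetRootDirectoryReference() to cover the whole share.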
What I want to do is:
Retrieve all metadata from CRM.
Serialize that metadata and store it in a file.
At a later point, deserialize that metadata and feed it to FakeXrmEasy for unit tests.
Steps 2 and 3 are done but I don't know how to accomplish step 1.
I've spent some time noodling around in the code and on Google but remain stumped.
We're using .NET, so what I need is to read ALL the entity metadata (type: Microsoft.Xrm.Sdk.Metadata.EntityMetadata).
If anyone knows how to do this or can point me in the direction of the API (I haven't been able to find one) then please let me know.
P.S. This case is for on-premises CRM.
If I understand correctly, you need to use the RetrieveAllEntitiesRequest request.
Here are more details: https://stackoverflow.com/a/29694213/2575544
For the benefit of anyone who comes across this post, here's my final solution:
public static EntityMetadata[] GetMetadata(IOrganizationService crmService)
{
    // EntityFilters.All retrieves entity, attribute, privilege and relationship metadata
    var request = new RetrieveAllEntitiesRequest
    {
        EntityFilters = EntityFilters.All
    };
    var response = (RetrieveAllEntitiesResponse)crmService.Execute(request);
    return response.EntityMetadata;
}
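In case a concrete example of step 2 helps anyone else, here is a minimal sketch of persisting the result with a DataContractSerializer (the file name is just an example):
// Serialize the retrieved metadata to a file so it can be deserialized later for unit tests.
var serializer = new DataContractSerializer(typeof(EntityMetadata[]));
using (var stream = File.Create("metadata.xml"))
{
    serializer.WriteObject(stream, GetMetadata(crmService));
}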
I want to download blobs using shared access signatures (SAS).
I also want to be able to revoke active SAS URIs and, if I understand it correctly, I must use a stored access policy for this.
What confuses me is how I can remove a policy. I also read that you can only have five stored access policies active?
My goal here is to be able to revoke an active SAS URI. The only way I can think of accomplishing this is to remove the policy that the SAS URI is linked with, right? If I have hundreds of files in my blob storage, how in the world can I make this work? I can't have one policy for each blob, right? Is five the maximum number of policies?
This code demonstrates how I add a policy and how I create a SAS URI that uses this policy, which users can download from.
static void CreateSharedAccessPolicy(CloudBlobContainer container)
{
    //Create a new stored access policy and define its constraints.
    SharedAccessBlobPolicy sharedPolicy = new SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(10),
        Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.List
    };
    //Get the container's existing permissions.
    BlobContainerPermissions permissions = container.GetPermissions();
    //Add the new policy to the container's permissions and save them back.
    permissions.SharedAccessPolicies.Add("PolicyName", sharedPolicy);
    container.SetPermissions(permissions);
}
static string GetBlobSasUriWithPolicy(CloudBlobContainer container, string policyName)
{
    //Get a reference to a blob within the container.
    CloudBlockBlob blob = container.GetBlockBlobReference("file_name");
    //Generate the shared access signature on the blob, using the named policy.
    string sasBlobToken = blob.GetSharedAccessSignature(null, policyName);
    //Return the URI string for the blob, including the SAS token.
    return blob.Uri + sasBlobToken;
}
One last question, how do I remove a policy? Is it as simple as:
permissions.SharedAccessPolicies.Remove("PolicyName");
My goal here is to be able to revoke an active SAS URI. The only way I can think of accomplishing this is to remove the policy that the SAS URI is linked with, right?
Partly correct. Removing the access policy is one way to do it. Another would be to change the name of the policy (the policy identifier). For example, if the policy identifier is mypolicy, changing it to mypolicy1 would have the same effect as removing the policy.
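A minimal sketch of that rename approach, using the same classic storage SDK (the policy identifiers are illustrative):
// Re-adding the policy under a new identifier invalidates every SAS token
// that referenced the old identifier.
BlobContainerPermissions permissions = container.GetPermissions();
SharedAccessBlobPolicy policy = permissions.SharedAccessPolicies["mypolicy"];
permissions.SharedAccessPolicies.Remove("mypolicy");
permissions.SharedAccessPolicies.Add("mypolicy1", policy);
container.SetPermissions(permissions);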
If I have hundreds of files in my blob storage, how in the world can I make this work?
As you may already know, an access policy is defined at the blob container level, not at the blob level. Removing/invalidating an access policy invalidates the SAS URLs for all blobs in that container.
I can't have one policy for each blob, right? Is five the maximum number of policies?
That is correct.
One last question, how do I remove a policy? Is it as simple as:
permissions.SharedAccessPolicies.Remove("PolicyName");
That is correct. Make sure you save it back though. You can use something like:
var cloudStorageAccount = CloudStorageAccount.DevelopmentStorageAccount;
var blobClient = cloudStorageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("container-name");
// Fetch the current permissions, remove the policy, then save the permissions back.
var containerPermissions = container.GetPermissions();
containerPermissions.SharedAccessPolicies.Remove("access-policy-id");
container.SetPermissions(containerPermissions);
I am developing a project related to honeypots. My problem: is there any way to get open-source honeypot log files? If possible, please provide a link or give any suggestions.
That would be very easy to do. Here's a PHP example:
if (empty($_POST['shouldBeEmpty'])) {
    // the field is empty, so deal with the form submission
}
else {
    // you are dealing with a spammer, therefore add to the log file
}
Of course, don't name your field "shouldBeEmpty". Name it something normal-sounding such as "contactNumber" or "message".
I'm bedeviled by this. I have a C# application in which I need to make a backup before I modify my main contact. But it seems the copy sticks around no matter what. I'm verifying this by visually checking the contents of my Contacts folder in Outlook.
I have a simple test case like so...
Application outlookApplication = new Application();
NameSpace outlookNamespace = outlookApplication.GetNamespace("mapi");
outlookNamespace.Logon("", "", true, true);
// Open the "Test1" subfolder of the default Contacts folder.
MAPIFolder Folder = outlookNamespace.GetDefaultFolder(OlDefaultFolders.olFolderContacts);
MAPIFolder Folder2 = Folder.Folders["Test1"];
Items ContactItems = Folder2.Items;
foreach (ContactItem Contact in ContactItems)
{
    // Copy the first contact, then immediately try to delete the copy.
    ContactItem Backup = (ContactItem)Contact.Copy();
    Backup.Delete();
    break;
}
outlookNamespace.Logoff();
outlookNamespace = null;
If I try to delete it twice, it causes an error.
I even tried moving it to the Deleted Items folder, but no luck. This is Outlook 2010. What is going on?
EDIT: WORKAROUND: If I create a new contact and populate from the original, I can delete it just fine.
I'm not familiar with C# syntax, but I suspect it's because you are adding to the Items collection when you create the copy. I would do this:
Before the start of the foreach loop, check the count of ContactItems:
Items ContactItems = Folder2.Items;
Console.WriteLine(ContactItems.Count); // display the count before the loop
After creating the copy, check ContactItems.Count again. If it has increased, then you need to change your loop to a "For i = ContactItems.Count To 1 Step -1" style loop instead of a foreach loop (sorry, I only know the VB syntax, not the equivalent C#). It has to be a backwards loop; a C# sketch of the equivalent follows.
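For what it's worth, a rough C# equivalent of that backwards loop (untested, reusing the variable names from the question):
// The Outlook Items collection is 1-based; iterating in reverse keeps the
// indices valid even if copies are appended to the collection.
for (int i = ContactItems.Count; i >= 1; i--)
{
    ContactItem contact = ContactItems[i] as ContactItem;
    if (contact == null) continue; // skip anything that isn't a contact

    ContactItem backup = (ContactItem)contact.Copy();
    backup.Delete();
}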
If that doesn't work, then create the copy in another Contacts folder, so it won't interfere with the Items collection of the folder you are working with. That is similar to what you are already doing; a sketch follows.
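A minimal sketch of that alternative (the "BackupContacts" folder name is an assumption):
MAPIFolder backupFolder = Folder.Folders["BackupContacts"]; // assumed backup folder
ContactItem copy = (ContactItem)Contact.Copy(); // the copy is created in the source folder...
copy.Move(backupFolder); // ...then moved out so it no longer disturbs that folder's Items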
I have written a simple WP7 application. I am using a WCF service to interact with the database. Now I want to store part of the user's info on the phone as well; this info needs to be accessible across the WP7 app.
I found multiple ways to do this, such as isolated storage, resource files, or static data in App.xaml.
Which one would be more suitable? As I may wish to edit the data in the future, I may not opt for packaged files, as they are read-only. I also do not wish to lose data by storing it in isolated storage.
Please suggest the most suitable option for me.
Thanks in advance,
Bindu
It sounds like you want to store downloaded data between uses of the app. In this case Isolated Storage is probably your best bet. It will remain in the phone's non-volatile memory and you will not lose it.
Resource files and static data in App.xaml won't work for you, since they are read-only and you want to be able to change these items at a later date.
I don't know what you are referring to when you say "lose data" by storing in IsolatedStorage. This is your best bet and is actually really easy to do. Here is an example of saving a simple boolean:
private void SaveSettings()
{
    IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;
    settings["VibrationOn"] = VibrationOn;
    settings.Save(); // persist immediately; settings are also saved automatically when the app exits
}
Then to load it later:
private void LoadSettings()
{
    IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;
    bool vo;
    if (settings.TryGetValue<bool>("VibrationOn", out vo))
        VibrationOn = vo;
    else
        VibrationOn = true;
}
You would call your LoadSettings() method in the Application_Launching and Application_Activated events and then your SaveSettings() in the Application_Deactivated and Application_Closing events within your App.xaml.cs.
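For reference, the wiring in App.xaml.cs would look roughly like this (the handler signatures are the stock ones the WP7 project template generates):
private void Application_Launching(object sender, LaunchingEventArgs e) { LoadSettings(); }
private void Application_Activated(object sender, ActivatedEventArgs e) { LoadSettings(); }
private void Application_Deactivated(object sender, DeactivatedEventArgs e) { SaveSettings(); }
private void Application_Closing(object sender, ClosingEventArgs e) { SaveSettings(); }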
You can also serialize objects or write whole files.
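For example, writing a whole file into isolated storage (the file name is just illustrative):
// Write a text file into the app's isolated storage.
using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
using (IsolatedStorageFileStream stream = store.CreateFile("data.txt"))
using (StreamWriter writer = new StreamWriter(stream))
{
    writer.WriteLine("some downloaded data"); // anything you need to persist
}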