In Windows Phone 7, is there a way to get the application build or compile date in code?
I would like to display the date, along with the version number, for support purposes for my application.
If it isn't immediately available, any hints or alternatives? (I guess one is making it an app setting, which is hokey).
You can parse the version number out of Assembly.GetExecutingAssembly().FullName.
The output is of this form
PhoneApp, Version=1.0.0.0,
Culture=neutral, PublicKeyToken=null
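For example, a minimal sketch of pulling the version out of that string (AssemblyName does the parsing for you):
using System.Reflection;

// Parse the assembly display name rather than splitting the string by hand.
string fullName = Assembly.GetExecutingAssembly().FullName;
Version version = new AssemblyName(fullName).Version;
string displayVersion = version.ToString(); // e.g. "1.0.0.0"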
The build date doesn't appear to be available, and arguably you won't need it if you increment your version number every release. Alternatively, you could store it somewhere else if it's important to your app.
The date (& time) of the build isn't included in an assembly.
If you could get to the file system on the phone, you might be able to get a date from the file itself, but that may be affected by the marketplace ingestion process (when the code is signed), so you may not be able to rely on it.
If you use a * for the build part of the version number, you can work out the build date from that (the auto-generated build number is the number of days since 2000-01-01); see the sketch below.
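As a rough sketch, assuming the project declares [assembly: AssemblyVersion("1.0.*")] so the compiler generates the build and revision numbers:
using System;
using System.Reflection;

// Sketch only: Build = days since 2000-01-01, Revision = seconds since local midnight / 2.
Version version = new AssemblyName(Assembly.GetExecutingAssembly().FullName).Version;
DateTime buildDate = new DateTime(2000, 1, 1)
    .AddDays(version.Build)
    .AddSeconds(version.Revision * 2);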
Or, you could add something to your build process to set a property or setting.
Or, if using SVN for your version control system, you could use $WCDATE$ in a template with SubWcRev.exe to set this.
Or, you could add this to the app through the use of T4.
The following in a TT file should do the trick:
<#@ template language="C#" #>
<#@ import namespace="System" #>
using System.Windows;

namespace MyNamespace
{
    public partial class App : Application
    {
        public string BuildDate { get { return "<#= DateTime.Now #>"; } }
    }
}
To get the App Version on Windows Phone 7+: https://stackoverflow.com/a/22838743/1033581
Here is the WP7 code:
var xmlReaderSettings = new XmlReaderSettings
{
XmlResolver = new XmlXapResolver()
};
using (var xmlReader = XmlReader.Create("WMAppManifest.xml", xmlReaderSettings))
{
xmlReader.ReadToDescendant("App");
return xmlReader.GetAttribute("Version");
}
To get the App Version on Windows Phone 8+: https://stackoverflow.com/a/23387825/1033581
Here is the WP8 code:
using (var stream = new FileStream("WMAppManifest.xml", FileMode.Open, FileAccess.Read))
{
var appVersion = XElement.Load(stream).Descendants("App").FirstOrDefault().Attribute("Version");
return appVersion != null ? appVersion.Value : null;
}
Add a file BuildDate.txt to the project.
In the project Properties > Build Events, set the pre-build event command line to:
echo %date% %time% > "$(ProjectDir)\BuildDate.txt"
Then add this code:
private static DateTime UpdatedAt()
{
    var streamResourceInfo = Application.GetResourceStream(new Uri("BuildDate.txt", UriKind.Relative));
    using (var reader = new StreamReader(streamResourceInfo.Stream))
    {
        // text looks like "11.05.2014 20:44:52,07 \r\n" - drop the fractional seconds and line break
        string text = reader.ReadToEnd();
        var substring = text.Substring(0, text.Length - 6);
        return DateTime.ParseExact(substring, "dd.MM.yyyy HH:mm:ss", CultureInfo.InvariantCulture);
    }
}
I am upgrading a .NET 4.5 app to .NET Core 3.1 and I have a piece of code like the one below.
private void GetContainerDirectories(IEnumerable<IListBlobItem> blobList)
{
// First list all the actual FILES within
// the current blob list. No recursion needed:
foreach (var item in blobList.Where
((blobItem, type) => blobItem is CloudBlockBlob))
{
var blobFile = item as CloudBlockBlob;
sb.Add(new Tree { Name = blobFile.Name, Id = blobFile.Name, ParentId = blobFile.Parent.Prefix, Title = Path.GetFileName(blobFile.Name), IsDirectory = false });
}
// List all additional subdirectories
// in the current directory, and call recursively:
foreach (var item in blobList.Where
((blobItem, type) => blobItem is CloudBlobDirectory))
{
var directory = item as CloudBlobDirectory;
sb.Add(new Tree { Name = directory.Prefix, Id = directory.Prefix, ParentId = directory.Parent.Prefix, Title = new DirectoryInfo(directory.Prefix).Name, IsDirectory = true });
// Call this method recursively to retrieve subdirectories within the current:
GetContainerDirectories(directory.ListBlobs()); // <-- this call is where I get the error
}
}
In the last line [ GetContainerDirectories(directory.ListBlobs()) ] I am getting an error for ListBlobs, and I have not been able to find any useful solution for it. The error is:
'CloudBlobDirectory' does not contain a definition for 'ListBlobs' and no accessible extension method 'ListBlobs' accepting a first argument of type 'CloudBlobDirectory' could be found (are you missing a using directive or an assembly reference?)
Does anyone have any idea how to fix this? Many thanks in advance :)
The WindowsAzure.Storage SDK you are using is too old: its synchronous methods are not supported on .NET Core, and ListBlobs is a synchronous method.
I suggest you use the latest SDK instead:
https://www.nuget.org/packages/Azure.Storage.Blobs/12.8.0
If you don't want to use the Azure.Storage.Blobs SDK, you can use the ListBlobsSegmentedAsync method in the WindowsAzure.Storage SDK.
Update:
You can use the code below in place of your original call:
var blobs = directory.ListBlobsSegmentedAsync(false, BlobListingDetails.Metadata, 100, null, null, null).Result.Results;
GetContainerDirectories(blobs);
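For reference, if you do move to the Azure.Storage.Blobs package, a rough sketch of the equivalent hierarchical listing might look like this (the BlobContainerClient parameter and the tree-building details are assumptions, not a drop-in replacement):
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

private void GetContainerDirectories(BlobContainerClient container, string prefix = null)
{
    // GetBlobsByHierarchy returns blobs and virtual directories ("prefixes") one level at a time.
    foreach (BlobHierarchyItem item in container.GetBlobsByHierarchy(delimiter: "/", prefix: prefix))
    {
        if (item.IsPrefix)
        {
            // Virtual directory: recurse into it, as the original code did with CloudBlobDirectory.
            GetContainerDirectories(container, item.Prefix);
        }
        else
        {
            // Actual blob: item.Blob.Name holds the full path within the container.
            var blobName = item.Blob.Name;
            // ... add the file entry to your tree structure here ...
        }
    }
}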
My specific problem is how to automate "add-migration" in a build process for Entity Framework. In researching this, it seems the most likely approach is something along the lines of automating these steps:
Open a solution in Visual Studio 2013
Execute "Add-Migration blahblah" in the Package Manager Console (most likely via an add-in vsextention)
Close the solution
This initial approach is based on my own research and this question. The PowerShell script that ultimately sits behind Add-Migration requires quite a bit of set-up to run; Visual Studio performs that setup automatically when creating the Package Manager Console and making the DTE object available, and I would prefer not to attempt to duplicate that setup outside of Visual Studio.
One possible path to a solution is this unanswered Stack Overflow question.
In researching the NuGet API, it does not appear to offer a "send this text and it will be run as if it were typed in the console" capability. I am not clear where the line between Visual Studio and NuGet falls, so I am not sure this is something that would even be there.
I am able to find the "Package Manager Console", ironically enough, via the "$dte.Windows" command in the Package Manager Console, but in a VS 2013 window that collection gives me objects which are "Microsoft.VisualStudio.Platform.WindowManagement.DTE.WindowBase". If there is a way to stuff text into it, I think I need to get it to be a "NuGetConsole.Implementation.PowerConsoleToolWindow"; from reviewing the source code I am not clear how the text would be stuffed, and I am not at all familiar with what I am seeing.
Worst case, I will fall back to trying to send keystrokes to it along the lines of this question, but I would prefer not to, since that will substantially complicate the automation surrounding the build process.
All of that being said,
Is it possible to stream commands via code to the Package Manager Console in Visual Studio which is fully initialized and able to support an Entity Framework "add-migration" command?
Thanks for any suggestions, advice, help, non-abuse in advance,
John
The approach that worked for me was to trace into the Entity Framework code, starting with AddMigrationCommand.cs in the EntityFramework.PowerShell project, find the hooks into the EntityFramework project, and then make those hooks work so there is no PowerShell dependency.
You can get something like...
public static void RunIt(EnvDTE.Project project, Type dbContext, Assembly migrationAssembly, string migrationDirectory,
string migrationsNamespace, string contextKey, string migrationName)
{
DbMigrationsConfiguration migrationsConfiguration = new DbMigrationsConfiguration();
migrationsConfiguration.AutomaticMigrationDataLossAllowed = false;
migrationsConfiguration.AutomaticMigrationsEnabled = false;
migrationsConfiguration.CodeGenerator = new CSharpMigrationCodeGenerator(); //same as default
migrationsConfiguration.ContextType = dbContext; //data
migrationsConfiguration.ContextKey = contextKey;
migrationsConfiguration.MigrationsAssembly = migrationAssembly;
migrationsConfiguration.MigrationsDirectory = migrationDirectory;
migrationsConfiguration.MigrationsNamespace = migrationsNamespace;
System.Data.Entity.Infrastructure.DbConnectionInfo dbi = new System.Data.Entity.Infrastructure.DbConnectionInfo("DataContext");
migrationsConfiguration.TargetDatabase = dbi;
MigrationScaffolder ms = new MigrationScaffolder(migrationsConfiguration);
ScaffoldedMigration sf = ms.Scaffold(migrationName, false);
}
You can use this question to get to the dte object and from there to find the project object to pass into the call.
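For illustration, one way to get hold of the DTE and the project object from an external automation process might look like the sketch below (the VS 2013 ProgId and the project name are assumptions; it requires a running Visual Studio instance with the solution already open):
using System.Linq;

// Sketch only: attach to a running Visual Studio 2013 instance via the Running Object Table.
var dte = (EnvDTE.DTE)System.Runtime.InteropServices.Marshal.GetActiveObject("VisualStudio.DTE.12.0");

// Find the project that contains the DbContext; "MyDataProject" is a placeholder name.
EnvDTE.Project project = dte.Solution.Projects
    .Cast<EnvDTE.Project>()
    .First(p => p.Name == "MyDataProject");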
This is an update to John's answer, whom I have to thank for the "hard part". Here is a complete example which creates a migration and adds that migration to the supplied project (the project must be built beforehand), the same way Add-Migration InitialBase -IgnoreChanges would:
public void ScaffoldedMigration(EnvDTE.Project project)
{
var migrationsNamespace = project.Properties.Cast<Property>()
.First(p => p.Name == "RootNamespace").Value.ToString() + ".Migrations";
var assemblyName = project.Properties.Cast<Property>()
.First(p => p.Name == "AssemblyName").Value.ToString();
var rootPath = Path.GetDirectoryName(project.FullName);
var assemblyPath = Path.Combine(rootPath, "bin", assemblyName + ".dll");
var migrationAssembly = Assembly.Load(File.ReadAllBytes(assemblyPath));
Type dbContext = null;
foreach(var type in migrationAssembly.GetTypes())
{
if(type.IsSubclassOf(typeof(DbContext)))
{
dbContext = type;
break;
}
}
var migrationsConfiguration = new DbMigrationsConfiguration()
{
AutomaticMigrationDataLossAllowed = false,
AutomaticMigrationsEnabled = false,
CodeGenerator = new CSharpMigrationCodeGenerator(),
ContextType = dbContext,
ContextKey = migrationsNamespace + ".Configuration",
MigrationsAssembly = migrationAssembly,
MigrationsDirectory = "Migrations",
MigrationsNamespace = migrationsNamespace
};
var dbi = new System.Data.Entity.Infrastructure
.DbConnectionInfo("ConnectionString", "System.Data.SqlClient");
migrationsConfiguration.TargetDatabase = dbi;
var scaffolder = new MigrationScaffolder(migrationsConfiguration);
ScaffoldedMigration migration = scaffolder.Scaffold("InitialBase", true);
var migrationFile = Path.Combine(rootPath, migration.Directory,
migration.MigrationId + ".cs");
File.WriteAllText(migrationFile, migration.UserCode);
var migrationItem = project.ProjectItems.AddFromFile(migrationFile);
var designerFile = Path.Combine(rootPath, migration.Directory,
migration.MigrationId + ".Designer.cs");
File.WriteAllText(designerFile, migration.DesignerCode);
var designerItem = project.ProjectItems.AddFromFile(designerFile);
foreach(Property prop in designerItem.Properties)
{
if (prop.Name == "DependentUpon")
prop.Value = Path.GetFileName(migrationFile);
}
var resxFile = Path.Combine(rootPath, migration.Directory,
migration.MigrationId + ".resx");
using (ResXResourceWriter resx = new ResXResourceWriter(resxFile))
{
foreach (var kvp in migration.Resources)
resx.AddResource(kvp.Key, kvp.Value);
}
var resxItem = project.ProjectItems.AddFromFile(resxFile);
foreach (Property prop in resxItem.Properties)
{
if (prop.Name == "DependentUpon")
prop.Value = Path.GetFileName(migrationFile);
}
}
I execute this in my project template's IWizard implementation, where I run a migration with IgnoreChanges because of entities shared with the base project. Change scaffolder.Scaffold("InitialBase", true) to scaffolder.Scaffold("InitialBase", false) if you want to include the changes.
Simple. I want to localize my application.
I've googled for days, and tried a million different approaches to reach my resources.
The only way I've been successful is by using the standard ASP.NET "App_LocalResource" folder, making the resource files public and giving them a Custom Tool Name. In the view I can then import the Custom Tool Name with #using.
My issue is that the language/resource items aren't changing when I change the culture.
Here is how I change it in global.asax:
protected void Application_AcquireRequestState(object sender, EventArgs e)
{
if (HttpContext.Current.Session != null)
{
CultureInfo ci = (CultureInfo)this.Session["Culture"];
if (ci == null)
{
string langName = "en";
string autoLang = "";
if (HttpContext.Current.Request.UserLanguages != null && HttpContext.Current.Request.UserLanguages.Length != 0)
{
autoLang = HttpContext.Current.Request.UserLanguages[0].Substring(0, 2);
}
if (autoLang == "da")
langName = autoLang;
ci = new CultureInfo(langName);
this.Session["Culture"] = ci;
}
Thread.CurrentThread.CurrentUICulture = ci;
Thread.CurrentThread.CurrentCulture = CultureInfo.CreateSpecificCulture(ci.Name);
}
}
So the culture is either da or en. But I noticed that the names of the resource files have to follow a specific syntax. There has to be a default (in this case English) with no country/culture code, and the others have to be named like resFile.da-DK.resx, with both the language and the culture code.
I'm afraid the resource handler can't recognize my file, because the culture is set to "da" and not "da-DK". And if I rename my da resource file to resFile.da.resx, I can't import the Custom Tool Name that exposes my resource files.
What do I do to solve this?
Use the full culture info string, ex:
var info = new CultureInfo("en-US");
Also, as a best practice, move the code into the Application_BeginRequest method; that's the standard location you'll see this type of code.
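For illustration, a minimal sketch of what that might look like (mapping "da" to "da-DK" here is an assumption based on the question; note that Session is not available in Application_BeginRequest, so this version does not cache the culture in session state):
protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Default to the full en-US culture and switch to da-DK when the browser prefers Danish.
    string cultureName = "en-US";
    var userLanguages = HttpContext.Current.Request.UserLanguages;
    if (userLanguages != null && userLanguages.Length > 0 &&
        userLanguages[0].StartsWith("da", StringComparison.OrdinalIgnoreCase))
    {
        cultureName = "da-DK";
    }

    var culture = new CultureInfo(cultureName);
    Thread.CurrentThread.CurrentUICulture = culture;
    Thread.CurrentThread.CurrentCulture = culture;
}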
I am dynamically downloading a XAP file that has an embedded resource assembly, with a single resource file (ApplicationStrings.fr-CA.resx). I am using WebClient to pull down the XAP file and using the following code to load the assembly, based on work done by Jeff Prosise in this post: http://www.wintellect.com/CS/blogs/jprosise/archive/2010/06/21/dynamic-localization-in-silverlight.aspx.
Note that I also manually create the XAP file from the fr-CA folder with the assembly and the AppManifest.xaml, as described in Guy Smith-Ferrier's steps listed in his presentation here: http://www.guysmithferrier.com/post/2010/10/Building-Localized-XAP-Resource-Files-For-Silverlight-4.aspx.
// Get the application manifest from the downloaded XAP
StreamResourceInfo sri = new StreamResourceInfo(e.Result, null);
XmlReader reader = XmlReader.Create(Application.GetResourceStream(sri, new Uri("AppManifest.xaml", UriKind.Relative)).Stream);
AssemblyPartCollection parts = new AssemblyPartCollection();
// Enumerate the assemblies in the downloaded XAP
if (reader.Read())
{
reader.ReadStartElement();
if (reader.ReadToNextSibling("Deployment.Parts"))
{
while (reader.ReadToFollowing("AssemblyPart"))
{
parts.Add(new AssemblyPart() { Source = reader.GetAttribute("Source") });
}
}
}
// Load the satellite assemblies
foreach (AssemblyPart part in parts)
{
if (part.Source.ToLower().Contains("resources"))
{
Stream assembly = Application.GetResourceStream(sri, new Uri(part.Source, UriKind.Relative)).Stream;
part.Load(assembly);
}
}
// Change the culture
Thread.CurrentThread.CurrentCulture = culture;
Thread.CurrentThread.CurrentUICulture = culture;
The assembly seems to load OK, and I have matched up the namespaces of the default resource file (ApplicationStrings.resx) and the downloaded resource file (ApplicationStrings.fr-CA.resx). As seen in the code, the culture is set for the current thread.
However, calls to ApplicationStrings.ResourceManager.GetString(...) do not return the resources for the set culture. For example, the following should return a string for the new culture (fr-CA), but always returns the default culture (en-US).
/// <summary>
/// Looks up a localized string similar to User Name:.
/// </summary>
public static string Label_UserName {
get {
return ResourceManager.GetString("Label_UserName", resourceCulture);
}
}
Any suggestions? Thanks.
UPDATE
I figured it out... I had forgotten to set the supported cultures in my satellite assembly project file:
<SupportedCultures>fr-CA</SupportedCultures>
I also made my folder structure exactly as it is for the default resources in my main Silverlight application.
I've been trying to do this all morning. Anyone have a code snippet (C#) showing how to update an "activity" within CRM via the webservice?
I can CreateReadUpdateDelete with entities, but I'm not sure how to do it with Activities.
Can't find anything on google either...
What are you specifically looking to update? Basically, updating an activity is just like updating any other entity; you just have to use the task entity.
public void CloseTask(CrmService crmsvc, Guid activityid, DateTime start, DateTime end)
{
ColumnSet cols = new ColumnSet();
cols.Attributes = new string[] { "activityid", "statecode" };
task tsk = (task)crmsvc.Retrieve(EntityName.task.ToString(), activityid, cols);
if(tsk.statecode.Value != TaskState.Open)
return;
tsk.actualstart = new CrmDateTime();
tsk.actualstart.Value = start.ToString();
tsk.actualend = new CrmDateTime();
tsk.actualend.Value = end.ToString();
crmsvc.Update(tsk);
SetStateTaskRequest state = new SetStateTaskRequest();
state.EntityId = activityid;
state.TaskState = TaskState.Completed;
state.TaskStatus = -1; // Let MS CRM decide this property;
SetStateTaskResponse stateSet = (SetStateTaskResponse)crmsvc.Execute(state);
}
Edit: added some sample code. Note that I had to strip some proprietary code from what I had, so I don't know if this will actually compile. It's close, though.
You can also update a custom workflow activity by using assembly versioning. The link below gives more information:
http://msdn.microsoft.com/en-us/library/gg328011.aspx