Is there a common way to access the local machine's csproj.user file to get the web site URL for WatiN tests, so that the tests can run on each developer's local server as well as on IIS on the build server? I guess you could parse the XML file yourself, but I am wondering if there is a cleaner/easier way.
I found this
Starting ASP.NET Development Web Server (Cassini) as part of unit test setup?
and ended up just doing something like this, using a relative path from my test project to my web project. Then I set up an appropriate csproj.user file on the build server.
public static string GetDevelopmentServerURL(string csprojFileName)
{
    XPathDocument doc = new XPathDocument(csprojFileName);
    XPathNavigator navigator = doc.CreateNavigator();
    XmlNamespaceManager manager = new XmlNamespaceManager(navigator.NameTable);
    manager.AddNamespace("msbuild",
        "http://schemas.microsoft.com/developer/msbuild/2003");

    const string xpath = "/msbuild:Project/msbuild:ProjectExtensions/"
        + "msbuild:VisualStudio/msbuild:FlavorProperties/"
        + "msbuild:WebProjectProperties";
    XPathNavigator webProjectPropertiesNode =
        navigator.SelectSingleNode(xpath, manager);

    XPathNavigator developmentServerPortNode =
        webProjectPropertiesNode.SelectSingleNode("msbuild:DevelopmentServerPort", manager);
    XPathNavigator developmentServerVPathNode =
        webProjectPropertiesNode.SelectSingleNode("msbuild:DevelopmentServerVPath", manager);
    XPathNavigator useIISNode =
        webProjectPropertiesNode.SelectSingleNode("msbuild:UseIIS", manager);
    XPathNavigator iisUrlNode =
        webProjectPropertiesNode.SelectSingleNode("msbuild:IISUrl", manager);
    XPathNavigator useCustomServerNode =
        webProjectPropertiesNode.SelectSingleNode("msbuild:UseCustomServer", manager);
    XPathNavigator customServerUrlNode =
        webProjectPropertiesNode.SelectSingleNode("msbuild:CustomServerUrl", manager);

    if (useIISNode.Value == "True")
        return iisUrlNode.Value;
    if (useCustomServerNode.Value == "True")
        return customServerUrlNode.Value;
    return "http://localhost:" + developmentServerPortNode.Value + developmentServerVPathNode.Value;
}
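For completeness, here is roughly how it can be called from a test. This is a minimal sketch; the relative path and project names are placeholders for your own layout, and the WatiN line is just an example:

// A minimal usage sketch; the relative path and file name below are
// placeholders, adjust them to your own solution layout.
string csprojUser = Path.GetFullPath(Path.Combine(
    AppDomain.CurrentDomain.BaseDirectory,
    @"..\..\..\MyWebProject\MyWebProject.csproj.user"));
string siteUrl = GetDevelopmentServerURL(csprojUser);
// e.g. hand siteUrl to WatiN:
// using (var browser = new IE(siteUrl)) { ... }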
I'm writing a .NET Core app. I need to log in with network credentials, as the service (which happens to be an on-prem TFS server) uses those to authenticate. From my (and another team member's) Windows machine, the following code works:
Console.WriteLine("Type in your DOMAIN password:");
var pass = GetPassword(); //command line secure string magic from SO
var networkCredential = new NetworkCredential("USERNAME", pass, "DOMAINNAME");
string tfsDefaultCollection = "https://TFSURL/DefaultCollection";
string testUrl = $"{tfsDefaultCollection}/_apis/tfvc/changesets/1234/changes?api-version=2.2";
var httpClientHandler = new HttpClientHandler
{
Credentials = networkCredential
};
var client = new HttpClient(httpClientHandler)
{
BaseAddress = new Uri(testUrl)
};
httpClientHandler.PreAuthenticate = true;
var test = client.GetAsync(testUrl).Result;
Console.WriteLine(test);
But it doesn't work from my Mac; I get a 401 Unauthorized. Both machines used the same wired connection. And this works on my Mac:
curl --ntlm --user "DOMAINNAME\USERNAME" "https://TFSURL/DefaultCollection/_apis/tfvc/changesets/1234/changes?api-version=2.2"
So that rules out a connectivity issue, I would think. Am I missing something I need to be doing on my Mac? Can anybody point me to some documentation, or a way to troubleshoot what both of these requests are doing at the lowest level, to see if there is a difference?
Well, some Google-fu finally got me there. There's a bug in .NET Core on Linux/macOS. This issue describes the fix:
https://github.com/dotnet/corefx/issues/25988#issuecomment-412534360
It occurs when the host machine you are connecting to supports both Kerberos and NTLM authentication methods.
Implemented below:
AppContext.SetSwitch("System.Net.Http.UseSocketsHttpHandler", false);

Console.WriteLine("Type in your DOMAIN password:");
var pass = GetPassword(); // command-line secure string magic from SO

var networkCredential = new NetworkCredential("USERNAME", pass, "DOMAINNAME");
string tfsDefaultCollection = "https://TFSURL/DefaultCollection";
string testUrl = $"{tfsDefaultCollection}/_apis/tfvc/changesets/1234/changes?api-version=2.2";

var myCache = new CredentialCache
{
    { new Uri(testUrl), "NTLM", networkCredential }
};

var httpClientHandler = new HttpClientHandler
{
    Credentials = myCache
};
var client = new HttpClient(httpClientHandler)
{
    BaseAddress = new Uri(testUrl)
};
httpClientHandler.PreAuthenticate = true;

var test = client.GetAsync(testUrl).Result;
Console.WriteLine(test);
Thanks to @dmcgill50 for getting me on the right googling track.
I have changed the ports that the Azure Storage Emulator runs on from 10000, 10001, 10002 to 10003, 10004, 10005 via the config file at "C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\WAStorageEmulator.exe.config".
Now when I try to access Development Storage from Server Explorer in Visual Studio 2013, it fails to reach the updated ports. I tried to manually add external storage and specify endpoints reflecting the updated ports, using the default storage account information:
DefaultEndpointsProtocol=http
AccountName=devstoreaccount1
AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==
BlobEndpoint=http://127.0.0.1:10003/devstoreaccount1
QueueEndpoint=http://127.0.0.1:10004/devstoreaccount1
TableEndpoint=http://127.0.0.1:10005/devstoreaccount1
but that still does not let it connect. I also tried the same endpoints without the storage account suffix. It even reverts the ports to 10000, 10001, 10002 when I refresh the External Storage node. I assume it is reading from some config somewhere, but I cannot seem to google any answer as to where this is being read from.
So how can I configure Server Explorer to reflect the updated ports?
The ports are hard-coded into the CloudStorageAccount class, so no, you can't modify them:
private static CloudStorageAccount GetDevelopmentStorageAccount(Uri proxyUri)
{
    UriBuilder uriBuilder = proxyUri != (Uri)null
        ? new UriBuilder(proxyUri.Scheme, proxyUri.Host)
        : new UriBuilder("http", "127.0.0.1");
    uriBuilder.Path = "devstoreaccount1";
    uriBuilder.Port = 10000;
    Uri uri1 = uriBuilder.Uri;
    uriBuilder.Port = 10001;
    Uri uri2 = uriBuilder.Uri;
    uriBuilder.Port = 10002;
    Uri uri3 = uriBuilder.Uri;
    uriBuilder.Path = "devstoreaccount1-secondary";
    uriBuilder.Port = 10000;
    Uri uri4 = uriBuilder.Uri;
    uriBuilder.Port = 10001;
    Uri uri5 = uriBuilder.Uri;
    uriBuilder.Port = 10002;
    Uri uri6 = uriBuilder.Uri;
    CloudStorageAccount cloudStorageAccount = new CloudStorageAccount(
        new StorageCredentials(
            "devstoreaccount1",
            "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="),
        new StorageUri(uri1, uri4),
        new StorageUri(uri2, uri5),
        new StorageUri(uri3, uri6),
        (StorageUri)null);
    cloudStorageAccount.Settings = (IDictionary<string, string>)new Dictionary<string, string>();
    cloudStorageAccount.Settings.Add("UseDevelopmentStorage", "true");
    if (proxyUri != (Uri)null)
        cloudStorageAccount.Settings.Add("DevelopmentStorageProxyUri", proxyUri.ToString());
    cloudStorageAccount.IsDevStoreAccount = true;
    return cloudStorageAccount;
}
Unfortunately, there is no support for changing the Azure Storage Emulator ports.
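For your own application code (Server Explorer won't pick this up), you can point the storage client library at the remapped ports by parsing an explicit connection string instead of the UseDevelopmentStorage=true shortcut. A minimal sketch, assuming the Microsoft.WindowsAzure.Storage client library and the remapped ports from the question:

// A sketch for application code only; Server Explorer ignores this.
// Assumes the Microsoft.WindowsAzure.Storage client library and the
// remapped emulator ports (10003/10004/10005) from the question.
var account = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=http;" +
    "AccountName=devstoreaccount1;" +
    "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;" +
    "BlobEndpoint=http://127.0.0.1:10003/devstoreaccount1;" +
    "QueueEndpoint=http://127.0.0.1:10004/devstoreaccount1;" +
    "TableEndpoint=http://127.0.0.1:10005/devstoreaccount1");
var blobClient = account.CreateCloudBlobClient();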
My specific problem is how to automate "Add-Migration" in a build process for Entity Framework. In researching this, it seems the most likely approach is to automate something along the lines of these steps:
Open a solution in Visual Studio 2013
Execute "Add-Migration blahblah" in the Package Manager Console (most likely via an add-in vsextention)
Close the solution
This initial approach is based on my own research and this question. The PowerShell script ultimately behind Add-Migration requires quite a bit of setup to run; Visual Studio performs that setup automatically when creating the Package Manager Console and making the DTE object available. I would prefer not to attempt to duplicate that setup outside of Visual Studio.
One possible path to a solution is this unanswered Stack Overflow question.
In researching the NuGet API, it does not appear to offer a "send this text and it will be run as if it had been typed in the console" capability. I am not clear on the lines between Visual Studio and NuGet, so I am not sure whether this is something that would even live there.
I am able to find the "Package Manager Console" (ironically enough, via the "$dte.Windows" command in the Package Manager Console itself), but in a VS 2013 window that collection gives me objects of type "Microsoft.VisualStudio.Platform.WindowManagement.DTE.WindowBase". If there is a way to stuff text into it, I think I need to get it to be a "NuGetConsole.Implementation.PowerConsoleToolWindow", but from reviewing the source code I am not clear how the text would be stuffed in, and I am not at all familiar with what I am seeing.
Worst case, I will fall back to trying to send keystrokes to it along the lines of this question, but I would prefer not to, since that would substantially complicate the automation surrounding the build process.
All of that being said: is it possible to stream commands via code to the Package Manager Console in Visual Studio, fully initialized and able to support an Entity Framework "Add-Migration" command?
Thanks for any suggestions, advice, help, non-abuse in advance,
John
The approach that worked for me was to trace into the Entity Framework code, starting with AddMigrationCommand.cs in the EntityFramework.PowerShell project, find the hooks into the EntityFramework project, and then make those hooks work so there is no PowerShell dependency.
You can get something like...
public static void RunIt(EnvDTE.Project project, Type dbContext, Assembly migrationAssembly,
    string migrationDirectory, string migrationsNamespace, string contextKey, string migrationName)
{
    DbMigrationsConfiguration migrationsConfiguration = new DbMigrationsConfiguration();
    migrationsConfiguration.AutomaticMigrationDataLossAllowed = false;
    migrationsConfiguration.AutomaticMigrationsEnabled = false;
    migrationsConfiguration.CodeGenerator = new CSharpMigrationCodeGenerator(); // same as default
    migrationsConfiguration.ContextType = dbContext;
    migrationsConfiguration.ContextKey = contextKey;
    migrationsConfiguration.MigrationsAssembly = migrationAssembly;
    migrationsConfiguration.MigrationsDirectory = migrationDirectory;
    migrationsConfiguration.MigrationsNamespace = migrationsNamespace;

    var dbi = new System.Data.Entity.Infrastructure.DbConnectionInfo("DataContext");
    migrationsConfiguration.TargetDatabase = dbi;

    MigrationScaffolder ms = new MigrationScaffolder(migrationsConfiguration);
    ScaffoldedMigration sf = ms.Scaffold(migrationName, false);
}
You can use this question to get to the DTE object and from there find the project object to pass into the call.
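For illustration, here is a rough sketch of getting the DTE object from a running Visual Studio instance and locating the project; the ProgID and project name are assumptions, not part of the original answer:

// A rough sketch, not from the original answer: attach to a running
// VS 2013 instance (ProgID "VisualStudio.DTE.12.0") and find a project
// by name. Requires a reference to EnvDTE and
// using System.Runtime.InteropServices;
var dte = (EnvDTE.DTE)Marshal.GetActiveObject("VisualStudio.DTE.12.0");
EnvDTE.Project project = null;
foreach (EnvDTE.Project p in dte.Solution.Projects)
{
    if (p.Name == "MyWebProject") // hypothetical project name
    {
        project = p;
        break;
    }
}
// project (plus your context type, assembly, etc.) can now be passed to RunIt(...)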
This is an update to John's answer (I have to thank him for the "hard part"). Here is a complete example that creates a migration and adds it to the supplied project (the project must be built beforehand), the same way Add-Migration InitialBase -IgnoreChanges would:
public void ScaffoldedMigration(EnvDTE.Project project)
{
    var migrationsNamespace = project.Properties.Cast<Property>()
        .First(p => p.Name == "RootNamespace").Value.ToString() + ".Migrations";
    var assemblyName = project.Properties.Cast<Property>()
        .First(p => p.Name == "AssemblyName").Value.ToString();
    var rootPath = Path.GetDirectoryName(project.FullName);
    var assemblyPath = Path.Combine(rootPath, "bin", assemblyName + ".dll");
    var migrationAssembly = Assembly.Load(File.ReadAllBytes(assemblyPath));

    Type dbContext = null;
    foreach (var type in migrationAssembly.GetTypes())
    {
        if (type.IsSubclassOf(typeof(DbContext)))
        {
            dbContext = type;
            break;
        }
    }

    var migrationsConfiguration = new DbMigrationsConfiguration()
    {
        AutomaticMigrationDataLossAllowed = false,
        AutomaticMigrationsEnabled = false,
        CodeGenerator = new CSharpMigrationCodeGenerator(),
        ContextType = dbContext,
        ContextKey = migrationsNamespace + ".Configuration",
        MigrationsAssembly = migrationAssembly,
        MigrationsDirectory = "Migrations",
        MigrationsNamespace = migrationsNamespace
    };
    var dbi = new System.Data.Entity.Infrastructure
        .DbConnectionInfo("ConnectionString", "System.Data.SqlClient");
    migrationsConfiguration.TargetDatabase = dbi;

    var scaffolder = new MigrationScaffolder(migrationsConfiguration);
    ScaffoldedMigration migration = scaffolder.Scaffold("InitialBase", true);

    var migrationFile = Path.Combine(rootPath, migration.Directory,
        migration.MigrationId + ".cs");
    File.WriteAllText(migrationFile, migration.UserCode);
    var migrationItem = project.ProjectItems.AddFromFile(migrationFile);

    var designerFile = Path.Combine(rootPath, migration.Directory,
        migration.MigrationId + ".Designer.cs");
    File.WriteAllText(designerFile, migration.DesignerCode);
    var designerItem = project.ProjectItems.AddFromFile(designerFile);
    foreach (Property prop in designerItem.Properties)
    {
        if (prop.Name == "DependentUpon")
            prop.Value = Path.GetFileName(migrationFile);
    }

    var resxFile = Path.Combine(rootPath, migration.Directory,
        migration.MigrationId + ".resx");
    using (ResXResourceWriter resx = new ResXResourceWriter(resxFile))
    {
        foreach (var kvp in migration.Resources)
            resx.AddResource(kvp.Key, kvp.Value);
    }
    var resxItem = project.ProjectItems.AddFromFile(resxFile);
    foreach (Property prop in resxItem.Properties)
    {
        if (prop.Name == "DependentUpon")
            prop.Value = Path.GetFileName(migrationFile);
    }
}
I execute this in my project template's IWizard implementation, where I run a migration with IgnoreChanges because of shared entities with the base project. Change scaffolder.Scaffold("InitialBase", true) to scaffolder.Scaffold("InitialBase", false) if you want to include the changes.
I use MEF to extend my web application, and I use the following folder structure:
> bin
> extensions
> Plugin1
> Plugin2
> Plugin3
To achieve this automatically, the plugin projects' output paths are set to these directories. My application works both with and without Azure. My problem is that it seems to be impossible to automatically include the extensions subdirectory in the Azure deployment package.
I've tried to set the build dependencies too, without success.
Is there another way?
Well,
I've struggled with the bin folder. The issue (if we may call it an "issue") is that the packaging process only packs what has "Copy to Output Directory" set to "Copy if newer"/"Copy always", and only for the web application (web role) project. Any assemblies in the bin folder that are not explicitly referenced by the web application will not get deployed.
In my case, where I have fairly "static" references, I just pack them in a ZIP, put it in a blob container, and then use the Azure Bootstrapper to download and extract those references into the bin folder. However, because I don't know the actual location of the bin folder in a startup task, I use helper wrappers around the bootstrapper to pull off the trick.
You will need to get the list of local sites, which can be accomplished by something similar to:
public IEnumerable<string> WebSiteDirectories
{
    get
    {
        string roleRootDir = Environment.GetEnvironmentVariable("RdRoleRoot");
        string appRootDir = RoleEnvironment.IsEmulated
            ? Path.GetDirectoryName(AppDomain.CurrentDomain.BaseDirectory)
            : roleRootDir;

        XDocument roleModelDoc = XDocument.Load(Path.Combine(roleRootDir, "RoleModel.xml"));
        var siteElements = roleModelDoc.Root
            .Element(_roleModelNs + "Sites")
            .Elements(_roleModelNs + "Site");

        return
            from siteElement in siteElements
            where siteElement.Attribute("name") != null
                  && siteElement.Attribute("name").Value == "Web"
                  && siteElement.Attribute("physicalDirectory") != null
            select Path.Combine(appRootDir, siteElement.Attribute("physicalDirectory").Value);
    }
}
Where the _roleModelNs variable is defined as follows:
private readonly XNamespace _roleModelNs = "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition";
Next you will need something similar to this method:
public void GetRequiredAssemblies(string pathToWebBinfolder)
{
    string args = string.Join("",
        @"-get https://your_account.blob.core.windows.net/path/to/plugin.zip -lr $lr(temp) -unzip """,
        pathToWebBinfolder,
        @""" -block");
    this._bRunner.RunBootstrapper(args);
}
And RunBootstrapper has the following signature:
public bool RunBootstrapper(string args)
{
    bool result = false;
    ProcessStartInfo psi = new ProcessStartInfo();
    psi.FileName = this._bootstrapperPath;
    psi.Arguments = args;
    Trace.WriteLine("AS: Calling " + psi.FileName + " " + psi.Arguments + " ...");
    psi.CreateNoWindow = true;
    psi.ErrorDialog = false;
    psi.UseShellExecute = false;
    psi.WindowStyle = ProcessWindowStyle.Hidden;
    psi.RedirectStandardOutput = true;
    psi.RedirectStandardInput = false;
    psi.RedirectStandardError = true;
    // run elevated
    // psi.Verb = "runas";
    try
    {
        // Start the process with the info we specified.
        // Call WaitForExit and then the using statement will close.
        using (Process exeProcess = Process.Start(psi))
        {
            exeProcess.PriorityClass = ProcessPriorityClass.High;
            string outString = string.Empty;
            // use asynchronous reading for at least one of the streams
            // to avoid deadlock
            exeProcess.OutputDataReceived += (s, e) =>
            {
                outString += e.Data;
            };
            exeProcess.BeginOutputReadLine();
            // now read the StandardError stream to the end;
            // this will cause our main thread to wait for the stream to close
            string errString = exeProcess.StandardError.ReadToEnd();
            Trace.WriteLine("Process out string: " + outString);
            Trace.TraceError("Process error string: " + errString);
            result = true;
        }
    }
    catch (Exception e)
    {
        Trace.TraceError("AS: " + e.Message + e.StackTrace);
        result = false;
    }
    return result;
}
Of course, in your case you might want something a bit more complex, where you first fetch all plugins via code (if each plugin is in its own ZIP) and then execute GetRequiredAssemblies once for each plugin. This code might run in the RoleEntryPoint's OnStart method.
Also, if you want to be more dynamic, you can override the Run() method of your RoleEntryPoint subclass and check for new plugins every minute, for example, as in the sketch below.
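A minimal sketch of such a Run() override; the one-minute interval matches the suggestion above, but the CheckForNewPlugins helper is hypothetical:

// A minimal sketch, not production code. CheckForNewPlugins() is a
// hypothetical helper that would compare blob container contents
// against the already-deployed plugin folders.
// Requires using System.Threading;
public override void Run()
{
    while (true)
    {
        try
        {
            CheckForNewPlugins(); // hypothetical helper
        }
        catch (Exception e)
        {
            Trace.TraceError("Plugin check failed: " + e.Message);
        }
        Thread.Sleep(TimeSpan.FromMinutes(1));
    }
}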
Hope this helps!
EDIT
And how can you get the plugins deployed? Well, you can either upload your plugins manually, or develop a small custom build task to upload your plugin automatically upon build; a rough sketch follows.
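For illustration only, here is what such a build task could look like, assuming the Microsoft.WindowsAzure.Storage client library and a container named "plugins"; the task and property names are made up, not from the original answer:

// A rough sketch, assuming the Microsoft.WindowsAzure.Storage client
// library; the container name and property names are illustrative only.
using System.IO;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;
using Microsoft.WindowsAzure.Storage;

public class UploadPluginTask : Task
{
    [Required]
    public string PluginZipPath { get; set; }    // e.g. bin\Release\Plugin1.zip

    [Required]
    public string ConnectionString { get; set; }

    public override bool Execute()
    {
        var account = CloudStorageAccount.Parse(ConnectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("plugins");
        container.CreateIfNotExists();
        var blob = container.GetBlockBlobReference(Path.GetFileName(PluginZipPath));
        using (var stream = File.OpenRead(PluginZipPath))
        {
            blob.UploadFromStream(stream);
        }
        Log.LogMessage("Uploaded {0}", PluginZipPath);
        return true;
    }
}

Hooked into an AfterBuild target, this would push each plugin ZIP to the blob container that the bootstrapper later pulls from.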
I'm creating a setup project for a WCF net.tcp service. One thing I came across is that I need to change "Web Site -> Manage Application -> Advanced Settings -> Enabled Protocols". It can also be done from the command line:
%windir%\system32\inetsrv\appcmd.exe set app "[Web Site Name]/[Application Name]" /enabledProtocols:http,net.tcp
The problem is that in the custom action I can get [TARGETSITE], but its value is "/LM/W3SVC/2" (I have [TARGETVDIR] too). The question is: how can I get the web site name, or how can I use [TARGETSITE] to set the application's enabled protocols?
The solution I ended up with involves converting the metabase path to the site name and then using appcmd:
private static string GetSiteName(string metabasePath)
{
    var siteIdString = metabasePath.Substring(metabasePath.LastIndexOf("/") + 1);
    long siteId;
    long.TryParse(siteIdString, out siteId);
    if (siteId != 0)
    {
        var iisManager = new ServerManager();
        var config = iisManager.GetApplicationHostConfiguration();
        var sites = config.GetSection("system.applicationHost/sites").GetCollection();
        ConfigurationElement selectedSite = null;
        foreach (var site in sites)
        {
            if ((long)site.GetAttribute("id").Value == siteId)
                selectedSite = site;
        }
        if (selectedSite != null)
        {
            return selectedSite.GetAttribute("name").Value as string;
        }
    }
    return null;
}
To use this you will have to reference:
C:\Windows\System32\inetsrv\Microsoft.Web.Administration.dll
C:\Windows\System32\inetsrv\Microsoft.Web.Management.dll
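To tie it together, here is a sketch of how the custom action might combine GetSiteName with appcmd; the application name is illustrative, and only the appcmd argument shape comes from the question:

// A sketch of wiring GetSiteName into the custom action. The metabase
// path comes from [TARGETSITE] and the application name from
// [TARGETVDIR]; the literal values below are illustrative only.
// Requires using System.IO; and using System.Diagnostics;
string siteName = GetSiteName("/LM/W3SVC/2"); // value of [TARGETSITE]
string appName = "MyService";                 // value of [TARGETVDIR], illustrative
var psi = new ProcessStartInfo
{
    FileName = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.Windows),
        @"system32\inetsrv\appcmd.exe"),
    Arguments = $"set app \"{siteName}/{appName}\" /enabledProtocols:http,net.tcp",
    UseShellExecute = false,
    CreateNoWindow = true
};
using (var p = Process.Start(psi))
{
    p.WaitForExit();
}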