Web API works locally but doesn’t work on Azure - visual-studio

I have created a Web API connected to Azure SQL Server in .NET Core, using Visual Studio for Mac 💻. Then I created a Web App in Azure and published my project to it directly from Visual Studio for Mac.
After publishing, I tried to access the API using Postman and Chrome (URL/api/menu), but I got a 500 server error, which is generic and doesn’t tell me anything.
In Visual Studio for Mac I got the green light; it said published and took me directly to the new URL.
So, what do you think the problem is?
This is my first time using Azure, so I didn’t change any settings.

Since many different problems can cause this error, I strongly recommend the following to determine the root cause quickly and easily, without struggling with Azure (or any server/platform, for that matter) to get logs.
You can enable extremely helpful error messages at startup by calling .UseSetting("detailedErrors", "true") and .CaptureStartupErrors(true) in your Program.cs file.
For ASP.NET Core 2.1:
public class Program
{
    public static void Main(string[] args)
    {
        BuildWebHost(args).Run();
    }

    public static IWebHost BuildWebHost(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
            .CaptureStartupErrors(true)
            .UseSetting("detailedErrors", "true")
            .UseStartup<Startup>()
            .Build();
}
Add these calls in your Startup.cs class:
app.UseDeveloperExceptionPage();
app.UseDatabaseErrorPage();
app.UseBrowserLink();
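For context, here is a minimal sketch of where those calls could sit in Startup.Configure (the rest of your Configure method will differ; this is an assumption based on the default 2.1 template):
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    // Enabled unconditionally here while troubleshooting; normally you would
    // guard these with if (env.IsDevelopment()).
    app.UseDeveloperExceptionPage();
    app.UseDatabaseErrorPage();
    app.UseBrowserLink();

    app.UseMvc();
}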
Also enable stdout logging in your web.config file:
stdoutLogEnabled="true" stdoutLogFile=".\logs\stdout"
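For reference, those attributes go on the aspNetCore element of the published web.config; a minimal sketch (the .dll name is a placeholder):
<system.webServer>
  <handlers>
    <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
  </handlers>
  <aspNetCore processPath="dotnet" arguments=".\YourApi.dll" stdoutLogEnabled="true" stdoutLogFile=".\logs\stdout" />
</system.webServer>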
An error code 500 from a Web API usually means a problem with the configuration in Startup.cs - the most common causes include an issue with the DB itself, an issue with migrations (if you are using the Code First approach), or problems with appsettings.json.
Please refer to the log file in .\logs\stdout.
Hope it helps.

Related

SSRS reports with .Net Core 3.1 MVC application

I am trying to display the SSRS report in a .Net Core 3.1 MVC application.
I tried to implement the approach mentioned in
https://alanjuden.com/2016/11/10/mvc-net-core-report-viewer/?unapproved=58532&moderation-hash=321d5350c96d2fcf83baa4c939bbdf53#comment-58532
public class ReportsController : AlanJuden.MvcReportViewer.ReportController
{
    protected override ICredentials NetworkCredentials
    {
        get
        {
            // Custom domain authentication (be sure to pull the info from a config file)
            return new System.Net.NetworkCredential("username", "password");

            // Default domain credentials (Windows authentication)
            //return System.Net.CredentialCache.DefaultNetworkCredentials;
        }
    }

    protected override string ReportServerUrl
    {
        get
        {
            // You don't want to put the full API path here, just the path to the report
            // server's ReportServer directory that it creates (you should be able to
            // access this path from your browser):
            return "https://YourReportServerUrl.com/ReportServer/ReportExecution2005.asmx";
        }
    }

    public IActionResult ProcessReport()
    {
        var model = this.GetReportViewerModel(Request);
        model.ReportPath = "reportPath";
        return RedirectToAction("ReportViewer", model);
    }
}
but it is not working with the latest framework.
I am getting the following error while running the project - Error screenshot
Any help is appreciated.
Thanks!
The same thing happened to me; in my case I needed to install the package it tells you to install:
Install-Package System.ServiceModel.Http -Version 4.1.0
or look for the System.ServiceModel.Http package in NuGet.
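If you're using the .NET CLI instead of the Package Manager Console, the equivalent command should be:
dotnet add package System.ServiceModel.Http --version 4.1.0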
I tried different workarounds with the latest .NET Core, including the one you mentioned from Alan Juden. However, the easiest thing that worked for me was to create a plain .NET WebForms site using the Report Viewer control from Microsoft. It was still a lot of code, but it is solid because the Report Viewer control has been around for many years.
In my case it shows the SSRS report from an Angular UI, but the same will work with MVC or any other web UI because you actually redirect/navigate to another URL (the WebForms .aspx page).
More details here.

Xamarin - .NET Standard - Use Azure DevOps Services with VssAadCredential

I want to create a Xamarin.Forms app where I have to log in to a specific Azure DevOps environment/project.
To try out the Azure DevOps Services library, I first created a Console App (.NET Framework 4.7.2) to log in to the Azure DevOps environment/project. The following code was used for the login process (plus extra code to validate that the connection actually works).
public void Login(string _userName, string _pwd)
{
    ProjectHttpClient projectClient;

    this.Credentials = new VssAadCredential(_userName, _pwd);
    this.Connection = new VssConnection(new Uri(this.DevOpsPath), this.Credentials);
    this.InitReferences(this.ProjectName);

    projectClient = this.Connection.GetClient<ProjectHttpClient>();
    this.ProjectReference = projectClient.GetProjects(null, top: 1).Result
        .Where(item => item.Name == this.ProjectName)
        .FirstOrDefault();
}
When I use the same piece of code in the Xamarin.Forms App (.NET Standard 2.1) it no longer works and I get the following error when executing the last line:
One or more errors occurred. (Could not resolve type with token 0100008d from typeref (expected class 'Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContextIntegratedAuthExtensions' in assembly 'Microsoft.IdentityModel.Clients.ActiveDirectory, Version=3.19.4.11002, Culture=neutral, PublicKeyToken=31bf3856ad364e35'))
When using VssBasicCredential with a personal access token, the code runs as expected. However, I would prefer to use VssAadCredential rather than VssBasicCredential.
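For reference, the basic-credential path that does work looks roughly like this (a sketch; devOpsPath, projectName and personalAccessToken are placeholders):
// Authenticate with a personal access token (PAT) instead of VssAadCredential.
var credentials = new VssBasicCredential(string.Empty, personalAccessToken);
var connection = new VssConnection(new Uri(devOpsPath), credentials);
var projectClient = connection.GetClient<ProjectHttpClient>();
var project = projectClient.GetProjects(null, top: 1).Result
    .FirstOrDefault(item => item.Name == projectName);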
I'm not aware of VssAadCredential being unsupported in .NET Standard, and I can find no documentation relating to the issue.
Has anyone had a similar experience that might solve this problem, or can anyone point me to documentation stating that this cannot work yet?

Deploy ASP.net core 2.1 WEB API to IIS using Visual Studio Code

I'm working on an ASP.NET Core 2.1 Web API project. I need to enable the API so that it can be accessed by client applications that we also have under development.
So far, the only way I've found to publish to IIS is by doing a manual process:
Run dotnet publish -c Release
Copy the files in bin\Release\netcoreapp2.1\publish\ to my IIS Web App folder
I wonder if there is a more straightforward way of doing this.
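For example, maybe something like publishing straight into the IIS folder in one step (the target path here is just an example):
dotnet publish -c Release -o C:\inetpub\wwwroot\MyApi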
Also, it takes quite some time to build this release, so for a development environment it's quite a slow process. The problem is that we cannot allow external access to the Web API when running with F5 on the integrated test server. How can we enable a more agile testing environment?
Another issue is that when calling, for example, fetch('MyAPIServer/api/MyItems') from a JavaScript application, I get a CORS error:
Failed to load http://localhost:86/api/shit: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:8082' is therefore not allowed access. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled
Is enabling CORS absolutely necessary when developing this type of app?
If I fetch like this:
fetch(
`http://localhost:86/api/shit`,{mode: 'no-cors'}
)
I get:
Uncaught (in promise) SyntaxError: Unexpected end of input
at eval (Pos.vue?7f37:68)
As far as the CORS issue goes, you can add the following to your Startup:
public void ConfigureServices(IServiceCollection services)
{
    // Rest of config stuff ...
    services.AddCors();
}
Then you will also need to add the following:
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    app.UseCors(builder =>
    {
        builder.WithOrigins("http://localhost:8080",
                "http://localhost:8081",
                "http://localhost:8082")
            .AllowAnyMethod()
            .AllowAnyHeader()
            .AllowCredentials();
    });

    app.UseMvc();
}
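Note that once CORS is enabled server-side, you'll also want to drop { mode: 'no-cors' } from the fetch call: a no-cors request returns an opaque response with no readable body, which is what produces the "Unexpected end of input" error when you try to parse it.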

Elmah works on localhost, but not on production?

In my ASP.NET MVC 3 app, I've configured Elmah, and then Elmah.MVC, for error logging. Both log just fine when running on localhost (Windows 7, IIS 6.1). On a production server (2008 R2, IIS 6.1), no errors are logged. I can browse to the /elmah directory in the site without a problem (I've allowed remote access for now). I've set the proper permissions on a folder for XML logging, but nothing is logged. I back-tracked to use the "in memory" logger; still no log. I've made sure modules and handlers were referenced correctly in both system.web and system.webServer.
I've browsed a lot of posts related to Elmah config issues, permissions, etc., but have not yet found the cause of this.
Are there other security/permissions issues that I'm missing on the production server related to Elmah? What else could be causing this?
Make sure you do NOT have "Read Only" checked on the folder.
Not sure if this is still current, but I was having a similar issue: it all worked perfectly on my local computer, and on the web server when customErrors mode="Off", but if you changed customErrors to RemoteOnly it would only work if you accessed the website locally.
The solution was to add a new ELMAH filter in the FilterConfig section that guarantees ELMAH will always log the errors regardless of the customErrors mode. Here is the post with details: http://forums.asp.net/t/1687875.aspx?Elmah+Error+Log+is+not+working+my+production+Server
Here is the code snippet (credit to the person on the post):
public class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        filters.Add(new ElmahHandledErrorLoggerFilter());
        filters.Add(new HandleErrorAttribute());
    }
}

public class ElmahHandledErrorLoggerFilter : IExceptionFilter
{
    public void OnException(ExceptionContext context)
    {
        //if (context.ExceptionHandled) // Log only handled exceptions, because all others will be caught by ELMAH anyway.
        // We want ELMAH to log ALL exceptions
        ErrorSignal.FromCurrentContext().Raise(context.Exception);
    }
}
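For this to take effect, RegisterGlobalFilters has to be called at application start. In a standard MVC project that typically happens in Global.asax.cs; a sketch in case your project doesn't already do this:
protected void Application_Start()
{
    // Registers the ELMAH filter above along with the default error handler.
    FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);

    // ... other startup registration (routes, areas, bundles)
}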

How to speed up Azure deployment from Visual Studio 2010

I have a Visual Studio 2010 solution with an Azure service and an ASP.NET MVC 3 project that serves as a web role for the Azure service. No other roles are attached to the service.
Every deployment to the Azure staging (or production, for that matter) environment takes up to 20 minutes to complete, from the moment I click Publish in Visual Studio until all instances (2) are started.
As you can imagine, this makes it a PITA to publish often, or to quick-fix some bugs. Is there a way to speed the process up? Would it be faster to upload the package to Blob storage and upgrade from there? How would I go about achieving that?
I feel the online docs on Azure leave a lot to be desired, particularly when it comes to troubleshooting.
Thanks.
One idea for reducing the need (and frequency) for redeploying is to move static content into blob storage, external to the package. For instance, move your css and javascript to blob storage, along with images. Once this is done, you'd only have to recompile / redeploy for .NET code changes. You can upload updated css, at any time, to blob storage. If you want to test this in staging first, you could always have a staging vs. production container name for your static content and store that container name in a config setting.
This doesn't change the deployment time when you do need to redeploy, but at least you can reduce how often you go through that process...
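As a rough sketch of that idea (the setting name, container and storage account here are only examples), the container name could be resolved from a config setting and used to build the static-content URLs:
// Example helper: resolve the blob container for static content from a config
// setting so staging and production can point at different containers.
public static class StaticContent
{
    public static string Url(string fileName)
    {
        string container = RoleEnvironment.GetConfigurationSettingValue("StaticContentContainer");
        return string.Format("https://myaccount.blob.core.windows.net/{0}/{1}", container, fileName);
    }
}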
You should enable Web Deploy in your Azure project. It works this way:
1/ Create an RDP account (don't forget, you need to upload a certificate with its private key so that Azure can decipher the password). That option is hidden in the deploy dialog box for your Azure deployment project.
2/ Enable Web Deployment - same place
Once you've published the app that way, right-click on the web application (not the Azure deployment project) and select Publish. The pop-up has everything defined except the password; enter that as well and you'll upload your changes to Azure in a matter of seconds.
CAVEAT: this is meant for single-instance web apps and is definitely not the way to go for a production upgrade strategy; the Blob storage answer already mentioned is the best option in that case.
Pierre
My solution to this problem is only to push a new package when I am changing code in the RoleEntryPoint or with the Service Definition. In Azure 1.3 you now have the ability to use Remote Desktop Connection. Using RDC, I will compile my code locally and use copy/paste to place it on the Azure server in the appropriate directory. Once the production code is running correctly, I can then push the fully tested version to staging and then do a VIP swap. This limits the number of times I actually have to deploy a package.
You actually have quite a long window in which you can keep modifying your code in Azure before you have to publish a new package. The new package is only really needed for those cases where Azure has to shutdown/restart your role instance.
It's a nice idea to try uploading your project to blob storage first, but unfortunately this is what Visual Studio is doing for you behind the scenes anyway. As has been pointed out elsewhere, most of the time spent on the deploy is not the upload itself, but the stopping and starting of all of your update domains.
If you're just running this site in a development environment, then the only way I know to speed it up is to run just one instance. If this is the live environment, then... sorry, I think you're out of luck.
So that I don't have to deploy to the cloud to test minor changes, what I've found works quite well is to engineer the site so that it works when running in local IIS just like any other MVC site.
The biggest barrier to this working is the settings that you have in the cloud config. The way we get around this is to make a copy of all of the settings in your cloud config and put them in your web.config under appSettings. Then, rather than using RoleEnvironment.GetConfigurationSettingValue(), create a wrapper class that you call instead. This wrapper class checks RoleEnvironment.IsAvailable to see if it is running in the Azure fabric; if it is, it calls the usual config function above, and if not, it calls WebConfigurationManager.AppSettings[].
There are a few other things that you'll want to do around getting the config setting change events which hopefully you can figure out from the code below:
public class SmartConfigurationManager
{
    private static bool _addConfigChangeEvents;
    private static string _configName;
    private static Func<string, bool> _configSetter;

    public static bool AddConfigChangeEvents
    {
        get { return _addConfigChangeEvents; }
        set
        {
            _addConfigChangeEvents = value;
            if (value)
            {
                RoleEnvironment.Changing += RoleEnvironmentChanging;
            }
            else
            {
                RoleEnvironment.Changing -= RoleEnvironmentChanging;
            }
        }
    }

    public static string Setting(string configName)
    {
        if (RoleEnvironment.IsAvailable)
        {
            return RoleEnvironment.GetConfigurationSettingValue(configName);
        }

        return WebConfigurationManager.AppSettings[configName];
    }

    public static Action<string, Func<string, bool>> GetConfigurationSettingPublisher()
    {
        if (RoleEnvironment.IsAvailable)
        {
            return AzureSettingsGet;
        }

        return WebAppSettingsGet;
    }

    public static void WebAppSettingsGet(string configName, Func<string, bool> configSetter)
    {
        configSetter(WebConfigurationManager.AppSettings[configName]);
    }

    public static void AzureSettingsGet(string configName, Func<string, bool> configSetter)
    {
        // We have to store these to be used in the RoleEnvironment Changed handler
        _configName = configName;
        _configSetter = configSetter;

        // Provide the configSetter with the initial value
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));

        if (AddConfigChangeEvents)
        {
            RoleEnvironment.Changed += RoleEnvironmentChanged;
        }
    }

    private static void RoleEnvironmentChanged(object anotherSender, RoleEnvironmentChangedEventArgs arg)
    {
        if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>().Any(change => change.ConfigurationSettingName == _configName))
        {
            if (_configSetter(RoleEnvironment.GetConfigurationSettingValue(_configName)))
            {
                RoleEnvironment.RequestRecycle();
            }
        }
    }

    private static void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
    {
        // If a configuration setting is changing
        if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
        {
            // Set e.Cancel to true to restart this role instance
            e.Cancel = true;
        }
    }
}
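For example, reading a setting through the wrapper then looks like this (the setting name is just an example):
// Works both in the Azure fabric and when running in local IIS.
string connectionString = SmartConfigurationManager.Setting("DatabaseConnectionString");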
The uploading itself takes a bit more than a minute most of the time. It's the starting up of the instances that takes most of the time.
What you can do is deploy your fixes to staging first (note that it costs money, so don't leave it there for too long). Swapping from staging to production only takes a couple of seconds. So while your application is still running, you can upload the patched version, let your testers test it on staging, and when they give the go, simply swap it to production.
I haven't tested your possible alternative approach of first uploading to blob storage, but I think that's overhead, as it doesn't speed up starting the instances.
