Load unmanaged DLL reference in ASP.NET Core 1.0 MVC (aka ASP.NET 5 / MVC 6) - asp.net-web-api

This question relates to an ASP.NET MVC Web API and NAudio.
What I want
I had a working prototype in a WPF application; when I use this code, the WAV file is converted to MP3:
var retMs = new MemoryStream();
using (var ms = new MemoryStream(File.ReadAllBytes("sound.wav")))
using (var rdr = new WaveFileReader(ms))
using (var wtr = new LameMP3FileWriter(retMs, rdr.WaveFormat, 128))
{
    // Transcode the WAV data to MP3 at 128 kbps via LAME
    rdr.CopyTo(wtr);
}
return retMs.ToArray();
But when using this code in the API project, I get an error like this:
Unable to load DLL 'libmp3lame.dll': The specified module could not be found.
I know libmp3lame is an unmanaged DLL. In the WPF project I just copied the DLL to the bin folder and everything worked fine, but how can I achieve this in the Web API project, i.e. an ASP.NET 5 project?
Note
The above code works as expected in the WPF project.
Also, the only framework my API currently supports is the full .NET Framework, which means Windows only; I removed the other platform dependencies.
Update:
Created an issue in the ASP.NET MVC repo.

I'm a little late to the party but I'll share this for posterity. NAudio can convert a WAV to an MP3 without the unmanaged DLL, avoiding this issue altogether. The only snag is that it has to write out to a file (which can then be read back in; see the sketch after the snippet below). Assuming you have a MemoryStream where you read all the bytes in (like you did):
using (var wfr = new WaveFileReader(ms))
{
    // Media Foundation must be initialized before encoding
    MediaFoundationApi.Startup();
    MediaFoundationEncoder.EncodeToMp3(wfr, @"C:\Temp\test.mp3", 48000);
    MediaFoundationApi.Shutdown();
}
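To get back to a byte array like the LAME version produced, you can round-trip through a temp file. A minimal sketch, assuming NAudio's MediaFoundation encoder is available (Windows Vista and later); ConvertWavToMp3 is a hypothetical helper name:
byte[] ConvertWavToMp3(byte[] wavBytes)
{
    // Encode to a temp file, then read the MP3 back into memory
    var tempFile = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".mp3");
    try
    {
        using (var ms = new MemoryStream(wavBytes))
        using (var wfr = new WaveFileReader(ms))
        {
            MediaFoundationApi.Startup();
            MediaFoundationEncoder.EncodeToMp3(wfr, tempFile, 48000);
            MediaFoundationApi.Shutdown();
        }
        return File.ReadAllBytes(tempFile);
    }
    finally
    {
        // Clean up the intermediate file
        if (File.Exists(tempFile)) File.Delete(tempFile);
    }
}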

Sounds like an issue with resolving the module in the bin library path; the following may help:
How to get bin folder in ASP.NET Core 1.0
MVC4 App "Unable to load DLL 'libmp3lame.32.dll'

How to obtain path to Razor Class Lib static assets in controller/library code?

I am migrating an existing .NET Framework application to .NET 6. Parts of the application were UI components in library projects, which I have updated to be Razor Class Libraries.
I have updated the relevant projects based on these Microsoft docs:
consume-a-razor-component-from-an-rcl
razor-pages/ui-class
However, I have found a situation where an image that exists in the RCL project(s) was being loaded in memory and modified using the Graphics/Bitmap APIs. I am trying to discover how to accomplish this in the .NET 6 world. I have such static assets loading properly in Views at runtime, but I am wondering how to leverage that runtime lookup in library code (as opposed to Razor view code).
For example, given an image path to a content file from an RCL project like:
~/_content/My.RCL.Project/Images/foo.png
I can reference such a file in an image tag, like:
<img src="~/_content/My.RCL.Project/Images/foo.png" />
And this works just fine as long as the path is correct - the image is loaded as expected. This is being done successfully for many images in the site thus far.
But how can I load that file into memory in server-side/library code, e.g.
const string filePath = "~/_content/My.RCL.Project/Images/foo.png";
using (var bitmap = new Bitmap(filePath))
using (var graphics = Graphics.FromImage(bitmap))
{
    //... omitted
}
I have tried to find the physical file via the IWebHostEnvironment APIs:
IWebHostEnvironment env; //obtained via DI
env.WebRootFileProvider.GetFileInfo(filePath);
env.ContentRootFileProvider.GetFileInfo(filePath);
However, both of these methods return a non-existent file.
Does such a mechanism/API exist for RCL content? Can this be done by calling the Razor engine directly in some form?
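For what it's worth, one thing to check (a sketch, not a confirmed answer): file providers do not understand the ~/ virtual-path prefix that Razor resolves, so stripping it and querying WebRootFileProvider with the _content path directly may work, assuming static web assets are enabled (the default in the Development environment in .NET 6; in published output the files are physically copied under wwwroot):
IWebHostEnvironment env; // obtained via DI
// The ~/ prefix is a Razor virtual-path convention; file providers expect
// a path relative to the web root, so drop the prefix.
var relativePath = "_content/My.RCL.Project/Images/foo.png";
var fileInfo = env.WebRootFileProvider.GetFileInfo(relativePath);
if (fileInfo.Exists)
{
    using (var stream = fileInfo.CreateReadStream())
    using (var bitmap = new Bitmap(stream))
    using (var graphics = Graphics.FromImage(bitmap))
    {
        // ... modify the image as before
    }
}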

Syncfusion PdfViewerControl on Azure

I am utilizing Syncfusion's PdfViewerControl and PdfLoadedDocument classes to generate thumbnail images of a PDF. However, once I moved the project to an Azure App Service, the PdfViewerControl throws an exception when being initialized. I am curious if it is attempting to use system memory and Azure is blocking this. Below is the method GenerateThumbnails I've created; the exception is thrown when creating a new PdfViewerControl. If anyone has a workaround for this or has experienced something similar when moving to Azure, any assistance would be greatly appreciated.
Along with that, if someone knows of another tool to create thumbnails from a PDF in this manner, that'd be very helpful as well. Thanks!
Exception:
System.AccessViolationException: 'Attempted to read or write protected memory. This is often an indication that other memory is corrupt.'
Method:
public static List<Byte[]> GenerateThumbnails(Byte[] file)
{
    Int32 resizedHeight;
    Int32 resizedWidth;
    List<Byte[]> thumbnails = new List<Byte[]>();
    using (PdfViewerControl pdfViewerControl = new PdfViewerControl())
    using (PdfLoadedDocument pdfLoadedDocument = new PdfLoadedDocument(file, true))
    {
        // The PDF Viewer Control must load the PDF from a PdfLoadedDocument, rather than directly from the filename,
        // because when loaded from the filename it is not disposed correctly and causes a file lock.
        pdfViewerControl.Load(pdfLoadedDocument);
        for (Int32 i = 0; i < pdfViewerControl.PageCount; ++i)
        {
            using (Bitmap originalBitmap = pdfViewerControl.ExportAsImage(i))
            {
                // Scale to the target width (landscape) or target height (portrait), preserving aspect ratio
                if (pdfViewerControl.LoadedDocument.Pages[i].Size.Width > pdfViewerControl.LoadedDocument.Pages[i].Size.Height)
                {
                    resizedHeight = (PdfUtility.TARGET_THUMBNAIL_WIDTH_LANDSCAPE * originalBitmap.Height) / originalBitmap.Width;
                    resizedWidth = PdfUtility.TARGET_THUMBNAIL_WIDTH_LANDSCAPE;
                }
                else
                {
                    resizedHeight = PdfUtility.TARGET_THUMBNAIL_HEIGHT_PORTRAIT;
                    resizedWidth = (PdfUtility.TARGET_THUMBNAIL_HEIGHT_PORTRAIT * originalBitmap.Width) / originalBitmap.Height;
                }
                using (Bitmap resizedBitmap = new Bitmap(originalBitmap, new Size(resizedWidth, resizedHeight)))
                using (MemoryStream memoryStream = new MemoryStream())
                {
                    resizedBitmap.Save(memoryStream, ImageFormat.Jpeg);
                    thumbnails.Add(memoryStream.ToArray());
                }
            }
        }
    }
    return thumbnails;
}
Update
Web App for Containers on Windows is now supported. This allows you to bring your own docker container that runs outside of the sandbox, so the restrictions described below won't affect your application.
There are restrictions in the sandbox that the app is running in that prevent certain API calls.
Here is a list of frameworks and scenarios that have been found to be not usable due to one or more of the restrictions above. It's conceivable that some will be supported in the future as the sandbox evolves.
PDF generators failing due to the restriction mentioned above:
- Syncfusion
- Siberix
- Spire.PDF
The following PDF generators are supported:
- SQL Reporting framework: requires the site to run in Basic or higher (note that this currently does not work in Functions apps in Consumption mode)
- EVOPDF: see http://www.evopdf.com/azure-html-to-pdf-converter.aspx for the vendor solution
- Telerik reporting: requires the site to run in Basic or higher. More info here
- Rotativa / wkhtmltopdf: requires the site to run in Basic or higher
- NReco PdfGenerator (wkhtmltopdf): requires subscription plan Basic or higher
Known issue for all PDF generators based on wkhtmltopdf or phantomjs: custom fonts are not rendered (a system-installed font is used instead) because of sandbox GDI API limitations that are present even in VM-based Azure App plans (Basic or higher).
Other scenarios that are not supported:
- PhantomJS/Selenium: tries to connect to a local address, and also uses GDI+.
https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox

ASP.NET Output Caching not working after upgrade from MVC2 to MVC4

We have an existing MVC2 project that we just upgraded to MVC4 following first these steps to get to MVC3, then these steps to get to MVC4.
Output caching had been working successfully for a long time in our MVC2 project, but it does not work in the MVC4 version.
I've added a simple controller to test caching:
public class TestController : Controller
{
    [OutputCache(Duration = 600, VaryByParam = "*")]
    public ActionResult CacheTest()
    {
        return Content(DateTime.Now.ToLongTimeString());
    }
}
Each time I refresh this page, the time output to the browser changes.
Creating a new MVC3 project in this same solution, then upgrading to MVC4, then copying this same code over works as expected.
So there must be something somewhere in our existing code or configuration that is breaking output caching.
I've also tried stripping out a ton of stuff from the web.config file thinking something there was causing problems - no luck.
Any suggestions on how to fix or debug this?
UPDATE:
Rendering the CacheTest action above in any view will display cached results - i.e. the date does not change on each refresh:
<% Html.RenderAction("CacheTest", "Test"); %>
Why does that work, but the action url from a browser is never cached?
Turns out this was an issue with a third-party library - 51Degrees. The issue was introduced in a recent version of that library; in the process of converting from MVC2 to MVC4 I installed the NuGet package, which was a few versions later than the one I had previously been using. That would also explain the RenderAction difference: child-action output caching is handled in-process by MVC itself, while full-page caching of a browser request goes through the ASP.NET output-cache module, which other HTTP modules (like 51Degrees) can interfere with.
False alarm - nothing to do with the ASP.NET MVC upgrade or anything ASP.NET related.

CRM 2011 external content with relative URL

In CRM 4.0 we could place dynamic content (.aspx) in the ISV folder in CRM, creating separate applications that still shared security and relative URLs with CRM. For example, a custom 360 view of an account could be linked in an iframe using a relative URL along the lines of
/ISV/CrmMvcApp/Account.aspx/Overview?id=....
In CRM 2011 usage of the ISV folder is deprecated, and Microsoft has some guidelines on how to transition to doing this in a supported manner (MSDN gg309571: Upgrade Code in the ISV folder to Microsoft Dynamics CRM 2011). They say:
For scenarios that will not be satisfied by the Web resources feature, create your Web application in its own application pool with its own web.config.
The way I am reading this (coupled with the guidelines on supported/unsupported) is that we need a separate web site in IIS with its own binding, as you are not allowed to add virtual directories etc. under the standard CRM app. This is unfortunate and does not allow relative paths/URLs in customizations and the sitemap, which is troublesome especially when exporting and importing solutions between DEV, TEST and/or PROD.
Are my assumptions wrong?
Can we somehow have relative paths?
Has anyone else found a pragmatic and easy approach to having external content without redoing the sitemap and customization changes for each environment?
EDIT: Confirmed with other sources that my understanding of the guidelines is correct, as this is also listed in the list of unsupported changes. Virtual folders and web apps are to be kept totally separate from the default CRM web site.
Creating an Internet Information Services (IIS) application inside the Microsoft Dynamics CRM website for any VDir and specifically within the ISV folder is not supported.
MSDN gg328350: Unsupported Customizations
If you primarily need to access CRM data/records, take a look at using JScript web resources. You can do "most" CRUD operations using the REST OData services. If you use jQuery to parse the JSON it's very productive.
I have found a solution much like the JavaScript redirect, without the need for client-side execution, and the environment details (server name, port) only need to be configured once. Additional logic can easily be added.
The solution creates a dependency in the customizations, but not an environment-specific one, and it can be used for unmanaged and managed solutions.
The solution was to place a file, Redirect.aspx, in the ISV folder. The code does not in any way interact with CRM and falls within the supported guidelines; however, the solution is not future-proof, as the ISV folder is deprecated by Microsoft.
Redirect.aspx will automatically pass along any parameters it receives, so it will work with or without the entity identifiers and so on.
Usage:
1. Place the file in the ISV folder on the CRM app server.
2. Change the server name and port to match the current environment (must be done for each environment).
3. In customizations, for example for an iframe, use the following as a source:
/ISV/Redirect.aspx?redirect=http://SERVERREPLACE/CustomMvcApp/SomeControllerAction
Here is the content of Redirect.aspx:
<%@ Page Language="C#" %>
<html>
<script runat="server">
    protected override void OnLoad(EventArgs e)
    {
        // must be customized for each environment
        const string ServerBaseName = "appserver1:60001";
        const string UrlParameterName = "redirect";
        const string ReplacePattern = "SERVERREPLACE";
        // Substitute the placeholder in the redirect target with this environment's server name
        var parameterUrl = Request.Params[UrlParameterName].Replace(ReplacePattern, ServerBaseName);
        // Re-append every other query string parameter to the target URL
        var queryStringBuilder = new StringBuilder();
        foreach (var key in Request.QueryString.AllKeys)
        {
            if (key == UrlParameterName)
            {
                continue;
            }
            queryStringBuilder.Append(queryStringBuilder.Length == 0 ? "?" : "&");
            queryStringBuilder.Append(key + "=" + Request.QueryString[key]);
        }
        var completeRedirectString = parameterUrl + queryStringBuilder;
        Response.Redirect(completeRedirectString);
    }
</script>
<head>
<title>Redirecting</title>
</head>
</html>
Not quite "relative URLs" as per your question, but a solution I use is to store "stub" or "root" URLs in a config entity and read those records in JScript at runtime to determine the fully qualified destination for your custom links.

using XML files in Windows Phone 7 and XNA 4.0

I'm working on a basic tiling engine for XNA 4.0 on Windows Phone 7. I have a bunch of map-data XML files with all the tile positions, power-up positions, etc.
I was wondering what the best way of using these is? I've read that if I want to use them with the Content pipeline then I have to alter the layout of the XML files.
Is there any way to load these files onto the device within the project and read the data from them?
Many thanks,
ant.
The way I did it was to make the XML file an embedded resource, not content; then I could access them in code:
// Requires System.Reflection, System.Xml.Serialization, and System.IO
Assembly app = Assembly.GetExecutingAssembly();
XmlSerializer ser = new XmlSerializer(typeof(xmlType));
string[] resources = app.GetManifestResourceNames();
foreach (string resourceName in resources)
{
    // Deserialize each embedded XML resource into the strongly typed object
    xmlObject = (xmlType)ser.Deserialize(new StreamReader(app.GetManifestResourceStream(resourceName)));
}
xmlType is a class that represents my XML format (see the hypothetical sketch below).
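For illustration, here is a minimal sketch of what such a class might look like for the map data described in the question. The MapData/TileEntry/PowerupEntry names, properties, and XML layout are hypothetical, not from the original post:
using System.Collections.Generic;
using System.Xml.Serialization;

// Hypothetical shape for a map-data file such as:
// <MapData><Tile X="0" Y="0" Id="3" /><Powerup X="5" Y="2" Type="speed" /></MapData>
public class MapData
{
    [XmlElement("Tile")]
    public List<TileEntry> Tiles { get; set; }

    [XmlElement("Powerup")]
    public List<PowerupEntry> Powerups { get; set; }
}

public class TileEntry
{
    [XmlAttribute] public int X { get; set; }
    [XmlAttribute] public int Y { get; set; }
    [XmlAttribute] public int Id { get; set; }
}

public class PowerupEntry
{
    [XmlAttribute] public int X { get; set; }
    [XmlAttribute] public int Y { get; set; }
    [XmlAttribute] public string Type { get; set; }
}
Then new XmlSerializer(typeof(MapData)) would deserialize each embedded file into tile and power-up lists.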
