Has anyone been able to get ABCpdf (HTML conversion) working on Azure Websites?

WebForms page code-behind:
XSettings.InstallRedistributionLicense("REDACTED");
var theDoc = new Doc();
theDoc.HtmlOptions.Engine = EngineType.Gecko;
theDoc.Rect.Inset(72, 144);
theDoc.Page = theDoc.AddPage();
int theID = theDoc.AddImageUrl("http://www.woot.com/");
while (true)
{
    theDoc.FrameRect(); // add a black border
    if (!theDoc.Chainable(theID))
        break;
    theDoc.Page = theDoc.AddPage();
    theID = theDoc.AddImageToChain(theID);
}
for (int i = 1; i <= theDoc.PageCount; i++)
{
    theDoc.PageNumber = i;
    theDoc.Flatten();
}
Response.Buffer = false;
Response.AddHeader("Content-Disposition", "inline; filename=\"rept.pdf\"");
Response.ContentType = "application/pdf";
theDoc.Save(Response.OutputStream);
Response.Flush();
This should work pretty well, but I get:
Failed to add HTML: RPC to Gecko engine process failed.Remote process terminated unexpectedly.
I'm running in Full Trust. The bin folder contains:
the XULRunner folder and everything from C:\Program Files (x86)\WebSupergoo\ABCpdf .NET 9.0\ABCGecko
ABCGecko.dll
ABCpdf.dll
ABCpdf9-32.dll
Package/Publish Web is set to "All files in this project folder".

You are not allowed to run external processes from within your Windows Azure Web Site, as this would pose a risk to the shared infrastructure.
See this post by an MSFT employee, or that post where the same employee talks about other restrictions concerning native APIs.
You can verify that the problem is related to the externally launched Gecko by not adding the HTML image to the document. For me the creation of the PDF progressed further but failed because of the missing license.
It looks like you would have to find a fully managed/.NET HTML rendering engine (if converting a website to PDF is your use-case) or hope that reserved-mode web sites gain the right to execute native/external processes.
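As a rough sketch of that verification (this is not from the original post; it assumes the same ABCpdf 9 API as the question and skips AddImageUrl entirely, so the external Gecko process is never launched):
// Sanity check: build a PDF without the HTML engine.
// If this succeeds (or fails only on licensing), the original error
// is specific to the external Gecko process being blocked by the sandbox.
var theDoc = new Doc();
theDoc.FontSize = 24;
theDoc.AddText("Sandbox check - no HTML engine involved");
Response.AddHeader("Content-Disposition", "inline; filename=\"check.pdf\"");
Response.ContentType = "application/pdf";
theDoc.Save(Response.OutputStream);
Response.Flush();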

There are full ABCpdf Azure deployment guides here:
http://www.websupergoo.com/support-azure-abcpdf.htm

Related

Uploading images to Windows Server and making them available publicly

I am trying to upload images to my Windows Server VM (hosted on-premises) and make these images available publicly, like www.example.com/imageFolder/cat.png.
This VM has IIS enabled (I am not sure whether IIS is relevant here) and is assigned a URL that is publicly reachable.
Aside from any programming languages or frameworks, I just want to know whether storing an image on the server and making it available via a link is possible. If so, how can I achieve it?
To clarify further:
I believe what I want is very simple, if tedious.
Consider the following workflow:
Run an API that has endpoints to receive images on the server.
Store the images on the server.
Return a link that points to the picture.
I want to know the procedure from the Windows Server side, e.g.:
Do I need to set certain properties on the folder where I will store the images?
Do I need to add a site to show the images back to the user?
I am a developer and I am new to the Windows Server & IIS so I may not have all the fundamentals.
I made a sample using ASP.NET MVC5. You can follow this article to upload an image in an MVC5 application:
Upload Images on Server Folder Using ASP.NET MVC
Then I made some changes so that it displays the URL of the uploaded image:
[HttpPost]
public ActionResult Upload(HttpPostedFileBase file)
{
    Uri requestUri = HttpContext.Request.Url;
    string baseUrl = requestUri.Scheme + Uri.SchemeDelimiter + requestUri.Host + (requestUri.IsDefaultPort ? "" : ":" + requestUri.Port);
    if (ModelState.IsValid)
    {
        try
        {
            if (file != null)
            {
                string path = Path.Combine(Server.MapPath("~/Images"), Path.GetFileName(file.FileName));
                file.SaveAs(path);
                ViewBag.FileStatus = "File uploaded successfully.";
                // Return the URL of the image (kept inside the null check so we
                // never dereference file.FileName when nothing was posted)
                ViewBag.FileLink = baseUrl + "/Images/" + Path.GetFileName(file.FileName);
            }
        }
        catch (Exception e)
        {
            ViewBag.FileStatus = "Error while uploading file. Error message is " + e.Message;
        }
    }
    return View("Index");
}
In Index.cshtml, add this to show the URL:
<div class="col-md-offset-2 col-md-10 text-success">
@ViewBag.FileLink
</div>
Don't forget to add a folder named Images to store the images. After publishing, you also need to create it in the published folder, otherwise you will get an error saying the folder cannot be found.
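To avoid that error at runtime, a small sketch (assuming the same ~/Images path as above) could create the folder on demand before saving:
// Hypothetical guard before file.SaveAs(path): create ~/Images if missing.
string imagesDir = Server.MapPath("~/Images");
Directory.CreateDirectory(imagesDir); // no-op if the folder already exists
string path = Path.Combine(imagesDir, Path.GetFileName(file.FileName));
file.SaveAs(path);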
It works well.

Syncfusion PdfViewerControl on Azure

I am utilizing Syncfusion's PdfViewerControl and PdfLoadedDocument classes to generate thumbnail images of a PDF. However, once I moved the project to an Azure App Service, the PdfViewerControl is throwing an exception when being initialized. I am curious if it is attempting to use system memory and Azure is blocking this. Below is the method GenerateThumbnails I've created and the exception is being thrown when creating a new PdfViewerControl. If anyone has a work around for this or has experienced something similar when moving to Azure, any assistance would be greatly appreciated.
Along with that, if someone knows of another tool to create thumbnails from a PDF in this manner that'd be very helpful as well. Thanks!
Exception:
System.AccessViolationException: 'Attempted to read or write protected memory. This is often an indication that other memory is corrupt.'
Method:
public static List<Byte[]> GenerateThumbnails(Byte[] file)
{
    Int32 resizedHeight;
    Int32 resizedWidth;
    List<Byte[]> thumbnails = new List<Byte[]>();
    using (PdfViewerControl pdfViewerControl = new PdfViewerControl())
    using (PdfLoadedDocument pdfLoadedDocument = new PdfLoadedDocument(file, true))
    {
        // The PDF Viewer Control must load the PDF from a PdfLoadedDocument, rather than directly
        // from the filename, because when loaded from the filename it is not disposed correctly
        // and causes a file lock.
        pdfViewerControl.Load(pdfLoadedDocument);
        for (Int32 i = 0; i < pdfViewerControl.PageCount; ++i)
        {
            using (Bitmap originalBitmap = pdfViewerControl.ExportAsImage(i))
            {
                if (pdfViewerControl.LoadedDocument.Pages[i].Size.Width > pdfViewerControl.LoadedDocument.Pages[i].Size.Height)
                {
                    resizedHeight = (PdfUtility.TARGET_THUMBNAIL_WIDTH_LANDSCAPE * originalBitmap.Height) / originalBitmap.Width;
                    resizedWidth = PdfUtility.TARGET_THUMBNAIL_WIDTH_LANDSCAPE;
                }
                else
                {
                    resizedHeight = PdfUtility.TARGET_THUMBNAIL_HEIGHT_PORTRAIT;
                    resizedWidth = (PdfUtility.TARGET_THUMBNAIL_HEIGHT_PORTRAIT * originalBitmap.Width) / originalBitmap.Height;
                }
                using (Bitmap resizedBitmap = new Bitmap(originalBitmap, new Size(resizedWidth, resizedHeight)))
                using (MemoryStream memoryStream = new MemoryStream())
                {
                    resizedBitmap.Save(memoryStream, ImageFormat.Jpeg);
                    thumbnails.Add(memoryStream.ToArray());
                }
            }
        }
    }
    return thumbnails;
}
Update
Web App for Containers on Windows is now supported. This allows you to bring your own docker container that runs outside of the sandbox, so the restrictions described below won't affect your application.
There are restrictions in the sandbox that the app is running in that prevent certain API calls.
Here is a list of frameworks and scenarios that have been found to be not usable due to one or more of the restrictions above. It's conceivable that some will be supported in the future as the sandbox evolves.
PDF generators failing due to the restriction mentioned above:
Syncfusion
Siberix
Spire.PDF
The following PDF generators are supported:
SQL Reporting framework: requires the site to run in Basic or higher (note that this currently does not work in Functions apps in Consumption mode)
EVOPDF: see http://www.evopdf.com/azure-html-to-pdf-converter.aspx for the vendor's solution
Telerik Reporting: requires the site to run in Basic or higher. More info here
Rotativa / wkhtmltopdf: requires the site to run in Basic or higher
NReco PdfGenerator (wkhtmltopdf): requires subscription plan Basic or higher
Known issue for all PDF generators based on wkhtmltopdf or phantomjs: custom fonts are not rendered (a system-installed font is used instead) because of sandbox GDI API limitations that are present even in VM-based Azure App plans (Basic or higher).
Other scenarios that are not supported:
PhantomJS/Selenium: tries to connect to a local address, and also uses GDI+.
https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox

Self Hosted ASP.NET Web API: how to embed images?

I switched my ASP.NET Web API from IIS-hosted to self-hosted. So far I had kept my images in their own folder (and accessed them with HostingEnvironment.MapPath). Obviously this folder doesn't exist in a self-hosted environment. How can I handle images instead?
OK, I figured it out. Here's what I did:
set the Build Action of each image to Embedded Resource
replaced my MapPath call with the following piece of code:
var resourcePath = "My.Namespace." + iconPath; // iconPath = subfolder.subfolder.file.ext
using (Stream imageStream = Assembly.GetExecutingAssembly()
                                    .GetManifestResourceStream(resourcePath))
{
    ...
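For completeness, here is a self-contained sketch of the same idea in a Web API controller (the namespace, folder, and file names are placeholders, not from the original answer):
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Reflection;
using System.Web.Http;

public class IconController : ApiController
{
    public HttpResponseMessage Get()
    {
        // Resource name = default namespace + subfolders + file name, dot-separated.
        var resourcePath = "My.Namespace.Images.icon.png"; // placeholder
        Stream imageStream = Assembly.GetExecutingAssembly()
                                     .GetManifestResourceStream(resourcePath);
        if (imageStream == null)
            return Request.CreateResponse(HttpStatusCode.NotFound);

        var response = Request.CreateResponse(HttpStatusCode.OK);
        // StreamContent takes ownership; the stream is disposed with the response.
        response.Content = new StreamContent(imageStream);
        response.Content.Headers.ContentType = new MediaTypeHeaderValue("image/png");
        return response;
    }
}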

Opening a PDF file in Windows Phone

I'm developing an app for Windows Phone 7 and I'm using a Phonegap template for it.
Everything looks perfect, but now I’m stuck trying to open a PDF file in the browser.
I tried the following, but it doesn't work because the URL of the PDF exceeds the 2048-character limit (it's a data URL). This code runs after the deviceReady event is fired.
var ref = window.open('http://www.google.com', '_blank', 'location=no');
ref.addEventListener('loadstart', function (event) { alert(event.url); });
Now I'm trying to save the PDF file to storage and then have the browser open it, but the browser doesn't show anything. I'm editing the InAppBrowser.cs code from cordovalib, and I added the following lines before the call to browser.Navigate(loc):
private void ShowInAppBrowser(string url)
{
    IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication();
    FileStream stream = store.OpenFile("test.pdf", FileMode.Create);
    BinaryWriter writer = new BinaryWriter(stream);
    var myvar = Base64Decode("the big data url");
    writer.Write(myvar);
    writer.Close();
    if (store.FileExists("test.pdf")) // Check if file exists
    {
        Uri loc = new Uri("test.pdf", UriKind.Relative);
        ...
    }
}
This code is returning the following error:
Log:"Error in error callback: InAppBrowser1921408518 = TypeError: Unable to get value of the property 'url': object is null or undefined"
I don't want to use ComponentOne.
Any help would be greatly appreciated!
You cannot open PDF files from isolated storage in the default PDF reader. If the file is online, i.e. there is a URI for it, you can use WebBrowserTask to open it, since that will download and open the file in Adobe Reader.
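A minimal sketch of that (assuming Windows Phone 7.1, where WebBrowserTask exposes a Uri property; the URL is a placeholder):
using Microsoft.Phone.Tasks;

// Launches the phone's browser, which downloads the PDF and hands it to Adobe Reader.
var browserTask = new WebBrowserTask
{
    Uri = new Uri("http://example.com/report.pdf") // placeholder URL
};
browserTask.Show();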
On Windows Phone 8 you actually can open your own file in the default file reader for that extension, but I am not sure how that will help you, since you target PhoneGap and Windows Phone 7.
Toni is correct. You could try to build your own viewer (which would be the same thing as using C1, but with more time involved). I worked on a port of iTextSharp and PDFsharp for WP7, but neither of them is a PDF viewer. They are good for creating PDFs and doing some parsing (but rendering them involves much more work). This has been a personal quest of mine, but honestly the best I have managed is extracting some of the images from a PDF (and none of the text).
Try this:
var installedLocation = Windows.ApplicationModel.Package.Current.InstalledLocation;
var assets = await installedLocation.GetFolderAsync("Assets");
var pdf = await assets.GetFileAsync("metro.pdf");
await Windows.System.Launcher.LaunchFileAsync(pdf);
This worked correctly on my device.

How do I upload some file into Azure blob storage without writing my own program?

I created an Azure Storage account. I have a 400-megabyte .zip file that I want to put into blob storage for later use.
How can I do that without writing code? Is there some interface for that?
Free tools:
Visual Studio 2010 -- install Azure tools and you can find the blobs in the Server Explorer
Cloud Berry Lab's CloudBerry Explorer for Azure Blob Storage
ClumsyLeaf CloudXplorer
Azure Storage Explorer from CodePlex (try version 4 beta)
There was an old program called Azure Blob Explorer or something that no longer works with the new Azure SDK.
Out of these, I personally like CloudBerry Explorer the best.
The easiest way is to use Azure Storage PowerShell. It provides many commands to manage your storage containers/blobs/tables/queues.
For your case, you could use Set-AzureStorageBlobContent, which uploads a local file into Azure Storage as a block blob or page blob.
Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname
For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx.
If you're looking for a tool to do so, may I suggest that you take a look at our tool Cloud Storage Studio (http://www.cerebrata.com/Products/CloudStorageStudio). It's a commercial tool for managing Windows Azure Storage and Hosted Service. You can also find a comprehensive list of Windows Azure Storage Management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx
Hope this helps.
The StorageClient has this built in. There's really no need to write anything yourself:
var account = new CloudStorageAccount(creds, false);
var client = account.CreateCloudBlobClient();
var blob = client.GetBlobReference("/somecontainer/hugefile.zip");
// 1 MB seems to be a pretty good all-purpose block size
client.WriteBlockSizeInBytes = 1024 * 1024;
// this sets the # of parallel uploads for blocks
client.ParallelOperationThreadCount = 4; // normally set to one per CPU core
// blobs larger than this are broken up into blocks automatically (here 4 MB)
client.SingleBlobUploadThresholdInBytes = 4 * 1024 * 1024;
blob.UploadFile("somehugefile.zip");
I use Cyberduck to manage my blob storage.
It is free and very easy to use. It works with other cloud storage solutions as well.
I recently found this one as well: CloudXplorer
Hope it helps.
There is a new open-source tool provided by Microsoft:
Project Deco - a cross-platform Microsoft Azure Storage account explorer.
Please check these links:
Download binaries: http://storageexplorer.com/
Source Code: https://github.com/Azure/deco
You can use Cloud Combine for reliable and quick file upload to Azure blob storage.
A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag-and-drop your files onto the following batch file to upload them into your blob storage container:
upload.bat
@ECHO OFF
SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>
:AGAIN
IF "%~1" == "" GOTO DONE
AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob
SHIFT
GOTO AGAIN
:DONE
PAUSE
Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.
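If you do want to push a whole folder instead, a hedged variant of the AzCopy line (same flags as above, plus AzCopy's recursive /S switch; the folder path is a placeholder) would be:
AzCopy /Source:"C:\myfolder" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /S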
You can upload files to an Azure Storage account blob using the Command Prompt.
Install the Microsoft Azure Storage tools.
Then upload the file to your account blob with this AzCopy CLI command:
AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob
Hope it helps. :)
You can upload large files directly to Azure Blob Storage using the HTTP PUT verb; the biggest file I have tried with the code below is 4.6 GB. You can do this in C# like this:
// write up to ChunkSize of data to the web request
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
    var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
    var buffer = new Byte[4096];
    int bytesRead;
    var tempTotal = 0;
    File.FileStream.Position = DataSent;
    while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
           && tempTotal + bytesRead < CHUNK_SIZE
           && !File.IsDeleted
           && File.State != Constants.FileStates.Error)
    {
        requestStream.Write(buffer, 0, bytesRead);
        requestStream.Flush();
        DataSent += bytesRead;
        tempTotal += bytesRead;
        File.UiDispatcher.BeginInvoke(OnProgressChanged);
    }
    requestStream.Close();
    if (!AbortRequested)
        webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}

void StartUpload()
{
    var uriBuilder = new UriBuilder(UploadUrl);
    if (UseBlocks)
    {
        // encode the block name and add it to the query string
        CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
        uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
    }
    // with or without using blocks, we'll make a PUT request with the data
    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}
The UploadUrl is generated by Azure itself and contains a Shared Access Signature. The SAS URL says where the blob is to be uploaded and for how long security access (write access, in your case) is granted. You can generate a SAS URL like this:
readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;

public UploadService()
{
    // Set up the connection to Windows Azure Storage
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    BlobClient = storageAccount.CreateCloudBlobClient();
    // Get and create the container
    BlobContainer = BlobClient.GetContainerReference("publicfiles");
}

string JsonSerializeData(string url)
{
    var serializer = new DataContractJsonSerializer(url.GetType());
    var memoryStream = new MemoryStream();
    serializer.WriteObject(memoryStream, url);
    return Encoding.Default.GetString(memoryStream.ToArray());
}

public string GetUploadUrl()
{
    var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
    {
        Permissions = SharedAccessPermissions.Write,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60)
    });
    return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}
I also have a thread on the subject where you can find more information: How to upload huge files to the Azure blob from a web page.
The new Azure Portal has an 'Editor' menu option (in preview) in the container view. It allows you to upload a file directly to the container from the Portal UI.
I've used all the tools mentioned in this post, and all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.
By default, BlobTransferUtility only does block blobs, but changing just 2 lines of code lets you upload page blobs as well. If, like me, you need to upload a virtual machine image, it needs to be a page blob.
(for the difference please see this MSDN article.)
To upload page blobs just change lines 53 and 62 of BlobTransferHelper.cs from
new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob
to
new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob
The only other thing to know about this app is to uncheck HELP when you first run the program to see the actual UI.
Check out this post Uploading to Azure Storage where it is explained how to easily upload any file via PowerShell to Azure Blob Storage.
You can use the AzCopy tool to upload the required files to Azure storage. The default upload type is block blob; you can change the pattern according to your requirements.
Syntax:
AzCopy /Source:<source> /Dest:<destination> /S
Try the Blob Service API
http://msdn.microsoft.com/en-us/library/dd135733.aspx
However, 400 MB is a large file and I am not sure a single API call will handle something of this size; you may need to split it into blocks and reconstruct it using custom code.
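As a sketch of that splitting approach (not from the original answer; the account names, paths, and connection string are placeholders, and it assumes the classic Microsoft.WindowsAzure.Storage client, where block blobs are uploaded chunk by chunk with PutBlock and committed with PutBlockList):
// Upload a large file as a block blob in 4 MB chunks.
var account = CloudStorageAccount.Parse(connectionString); // placeholder connection string
var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");
var blob = container.GetBlockBlobReference("hugefile.zip");

const int BlockSize = 4 * 1024 * 1024;
var blockIds = new List<string>();
var buffer = new byte[BlockSize];

using (var file = File.OpenRead(@"C:\temp\hugefile.zip"))
{
    int read, index = 0;
    while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Block IDs must be Base64 strings of equal length within one blob.
        string blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
        blob.PutBlock(blockId, new MemoryStream(buffer, 0, read), null);
        blockIds.Add(blockId);
    }
}
// Commit the blocks in order to form the final blob.
blob.PutBlockList(blockIds);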
