Syncfusion PdfViewerControl on Azure - image

I am utilizing Syncfusion's PdfViewerControl and PdfLoadedDocument classes to generate thumbnail images of a PDF. However, once I moved the project to an Azure App Service, the PdfViewerControl throws an exception when being initialized. I am curious if it is attempting to use system memory and Azure is blocking this. Below is the GenerateThumbnails method I've created; the exception is thrown when creating a new PdfViewerControl. If anyone has a workaround for this or has experienced something similar when moving to Azure, any assistance would be greatly appreciated.
Along with that, if someone knows of another tool to create thumbnails from a PDF in this manner that'd be very helpful as well. Thanks!
Exception:
System.AccessViolationException: 'Attempted to read or write protected memory. This is often an indication that other memory is corrupt.'
Method:
public static List<Byte[]> GenerateThumbnails(Byte[] file)
{
    Int32 resizedHeight;
    Int32 resizedWidth;
    List<Byte[]> thumbnails = new List<Byte[]>();

    using (PdfViewerControl pdfViewerControl = new PdfViewerControl())
    using (PdfLoadedDocument pdfLoadedDocument = new PdfLoadedDocument(file, true))
    {
        // The PDF Viewer Control must load the PDF from a PdfLoadedDocument, rather than directly from the
        // filename, because when loaded from the filename it is not disposed correctly and causes a file lock.
        pdfViewerControl.Load(pdfLoadedDocument);

        for (Int32 i = 0; i < pdfViewerControl.PageCount; ++i)
        {
            using (Bitmap originalBitmap = pdfViewerControl.ExportAsImage(i))
            {
                if (pdfViewerControl.LoadedDocument.Pages[i].Size.Width > pdfViewerControl.LoadedDocument.Pages[i].Size.Height)
                {
                    resizedHeight = (PdfUtility.TARGET_THUMBNAIL_WIDTH_LANDSCAPE * originalBitmap.Height) / originalBitmap.Width;
                    resizedWidth = PdfUtility.TARGET_THUMBNAIL_WIDTH_LANDSCAPE;
                }
                else
                {
                    resizedHeight = PdfUtility.TARGET_THUMBNAIL_HEIGHT_PORTRAIT;
                    resizedWidth = (PdfUtility.TARGET_THUMBNAIL_HEIGHT_PORTRAIT * originalBitmap.Width) / originalBitmap.Height;
                }

                using (Bitmap resizedBitmap = new Bitmap(originalBitmap, new Size(resizedWidth, resizedHeight)))
                using (MemoryStream memoryStream = new MemoryStream())
                {
                    resizedBitmap.Save(memoryStream, ImageFormat.Jpeg);
                    thumbnails.Add(memoryStream.ToArray());
                }
            }
        }
    }

    return thumbnails;
}

Update
Web App for Containers on Windows is now supported. This allows you to bring your own docker container that runs outside of the sandbox, so the restrictions described below won't affect your application.
There are restrictions in the sandbox that the app is running in that prevent certain API calls.
Here is a list of frameworks and scenarios that have been found to be unusable due to one or more of the restrictions above. It's conceivable that some will be supported in the future as the sandbox evolves.
PDF generators failing due to the restrictions mentioned above:
Syncfusion
Siberix
Spire.PDF
The following PDF generators are supported:
SQL Reporting framework: requires the site to run in Basic or higher (note that this currently does not work in Function apps in Consumption mode)
EVOPDF: see http://www.evopdf.com/azure-html-to-pdf-converter.aspx for the vendor's solution
Telerik reporting: requires the site to run in Basic or higher. More info here
Rotativa / wkhtmltopdf: requires the site to run in Basic or higher.
NReco PdfGenerator (wkhtmltopdf): requires subscription plan Basic or higher
Known issue for all PDF generators based on wkhtmltopdf or phantomjs: custom fonts are not rendered (a system-installed font is used instead) because of sandbox GDI API limitations that are present even in VM-based Azure App Service plans (Basic or higher).
Other scenarios that are not supported:
PhantomJS/Selenium: tries to connect to a local address, and also uses GDI+.
https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox

Related

How can I use an internal database from scratch

How can I use an internal database, for example SQLite, for an offline app in NativeScript without using any plugin?
I've searched everywhere for how to install or use SQLite or another internal database with NativeScript, but I haven't found an answer.
Just as you would with any code where you need to access the native APIs.
e.g. a (JavaScript) Android example:
var query = "select sqlite_version() AS sqlite_version";
var db = android.database.sqlite.SQLiteDatabase.openOrCreateDatabase(":memory:", null);
var cursor = db.rawQuery(query, null);
var sqliteVersion = "";
if (cursor.moveToNext()) {
    sqliteVersion = cursor.getString(0);
    console.log(sqliteVersion);
}
The API reference for SQLite on Android is here; that said, you can now follow a basic Android database tutorial and implement it step by step in your NativeScript application using JavaScript or TypeScript.
Still, the plugin provides all of that wrapped up as ready-to-go functionality, so unless you are missing something, it will be easier to use nativescript-sqlite and avoid writing native code for Android and then again for iOS.

How can I save some user data locally on my Xamarin Forms app?

I have a simple Xamarin Forms app. I've now got a simple POCO object (e.g. a User instance, or a list of the most recent tweets or orders or whatever).
How can I store this object locally on the device? Let's imagine I serialize it as JSON.
Also, how secure is this data? Is it part of the Keychain, etc.? Is it automatically backed up?
cheers!
You have a couple options.
SQLite. This option is cross-platform and works well if you have a lot of data. You get the added bonus of transaction support and async support as well. EDIT: In the past I suggested using SQLite.Net-PCL. Due to issues involving Android 7.0 support (and an apparent sunsetting of support) I now recommend making use of the project it was originally forked from: sqlite-net (a minimal usage sketch follows at the end of this answer).
Local storage. There's a great nuget that supports cross-platform storage. For more information see PCLStorage
There's also Application.Current.Properties implemented in Xamarin.Forms that allow simple Key-Value pairs of data.
I think you'll have to investigate and find out which route serves your needs best.
As far as security, that depends on where you put your data on each device. Android stores app data in a secure app folder by default (not all that secure if you're rooted). iOS has several different folders for data storage based on different needs. Read more here: iOS Data Storage
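For the SQLite option above, here is a minimal sqlite-net sketch; the User class, its columns, and the database path are placeholders of mine, not from the original answer:
using SQLite;

public class User
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
}

public class UserStore
{
    readonly SQLiteConnection _db;

    public UserStore(string databasePath)
    {
        // e.g. Path.Combine(Environment.GetFolderPath(
        //     Environment.SpecialFolder.LocalApplicationData), "app.db3")
        _db = new SQLiteConnection(databasePath);
        _db.CreateTable<User>();        // no-op if the table already exists
    }

    public void Save(User user) => _db.InsertOrReplace(user);

    public User Find(int id) => _db.Find<User>(id);
}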
Another option is the Xamarin Forms settings plugin.
E.g. If you need to store a user instance, just serialize it to json when storing and deserialize it when reading.
Uses the native settings management
Android: SharedPreferences
iOS: NSUserDefaults
Windows Phone: IsolatedStorageSettings
Windows RT / UWP: ApplicationDataContainer
public User CurrentUser
{
    get
    {
        User user = null;
        var serializedUser = CrossSettings.Current.GetValueOrDefault<string>(UserKey);
        if (serializedUser != null)
        {
            user = JsonConvert.DeserializeObject<User>(serializedUser);
        }
        return user;
    }
    set
    {
        CrossSettings.Current.AddOrUpdateValue(UserKey, JsonConvert.SerializeObject(value));
    }
}
EDIT:
There is a new solution for this. Just use Xamarin.Essentials.
Preferences.Set(UserKey, JsonConvert.SerializeObject(value));
var user = JsonConvert.DeserializeObject<User>(Preferences.Get(UserKey, "default_value"));
Please use Xamarin.Essentials
The Preferences class helps to store application preferences in a key/value store.
To save a value:
Preferences.Set("my_key", "my_value");
To get a value:
var myValue = Preferences.Get("my_key", "default_value");
If you want to store a simple value, such as a string, follow this example code.
Setting the value of totalSeats.Text to the "SeatNumbers" key from page 1:
Application.Current.Properties["SeatNumbers"] = totalSeats.Text;
await Application.Current.SavePropertiesAsync();
then, you can simply get the value from any other page (page2)
var value = Application.Current.Properties["SeatNumbers"].ToString();
Additionally, you can set that value to another Label or Entry etc.
SeatNumbersEntry.Text = value;
If it's key-value (single value) data storage, use the code below.
Application.Current.Properties["AppNumber"] = "123";
await Application.Current.SavePropertiesAsync();
Getting the same value
var value = Application.Current.Properties["AppNumber"];

Accessing Azure with a CloudStorageAccount

I have a console application which submits messages to an Azure queue. Now I am trying to migrate this application to mobile, but I ran into a reference problem with CloudStorageAccount: it requires the Windows version of the DLL, but mine is the mobile one.
Do you guys have any idea how I can initialize the CloudStorageAccount object in an alternative way?
public Initializator()
{
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        // for a console app, reading from App.config
        configSetter(ConfigurationManager.ConnectionStrings[configName].ConnectionString);
    });

    CloudStorageAccount storageAccount = CloudStorageAccount.FromConfigurationSetting("QueueStorage");
    CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

    queueIn = queueClient.GetQueueReference("queuein");
    queueOut = queueClient.GetQueueReference("queueout");
    queueIn.CreateIfNotExist();
    queueOut.CreateIfNotExist();
}
One of the easiest ways, in my opinion, to work with Windows Azure storage (tables, blobs, & queues) from Windows Phone is to use the Phone.Storage NuGet package (http://www.nuget.org/packages/Phone.Storage). This makes working with storage on the phone nearly identical to working with storage from a server (or console app).
Be sure to check out Wade Wegner's blog post at http://www.wadewegner.com/2011/11/nuget-packages-for-windows-azure-and-windows-phone-developers/ for some additional info on the NuGet packages.
There's also a Phone.Storage.Sample package that may be worth taking a look at.
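As an aside on initializing the account itself: if the later WindowsAzure.Storage client library is available for your target platform, you can construct the CloudStorageAccount straight from a connection string instead of going through the configuration-setting publisher. A rough sketch, with a placeholder connection string and the original queue names:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class QueueInitializer
{
    public CloudQueue QueueIn { get; private set; }
    public CloudQueue QueueOut { get; private set; }

    public QueueInitializer(string connectionString)
    {
        // e.g. "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
        CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

        QueueIn = queueClient.GetQueueReference("queuein");
        QueueOut = queueClient.GetQueueReference("queueout");
        QueueIn.CreateIfNotExists();   // CreateIfNotExist() in the older 1.x client
        QueueOut.CreateIfNotExists();
    }
}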

How do I upload some file into Azure blob storage without writing my own program?

I created an Azure Storage account. I have a 400 MB .zip file that I want to put into blob storage for later use.
How can I do that without writing code? Is there some interface for that?
Free tools:
Visual Studio 2010 -- install Azure tools and you can find the blobs in the Server Explorer
Cloud Berry Lab's CloudBerry Explorer for Azure Blob Storage
ClumpsyLeaf CloudXplorer
Azure Storage Explorer from CodePlex (try version 4 beta)
There was an old program called Azure Blob Explorer or something that no longer works with the new Azure SDK.
Out of these, I personally like CloudBerry Explorer the best.
The easiest way is to use Azure Storage PowerShell. It provides many commands to manage your storage containers/blobs/tables/queues.
For your case, you could use Set-AzureStorageBlobContent, which can upload a local file into Azure Storage as a block blob or page blob.
Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname
For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx.
If you're looking for a tool to do so, may I suggest that you take a look at our tool Cloud Storage Studio (http://www.cerebrata.com/Products/CloudStorageStudio). It's a commercial tool for managing Windows Azure Storage and Hosted Service. You can also find a comprehensive list of Windows Azure Storage Management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx
Hope this helps.
The StorageClient has this built into it. No need to write really anything:
var account = new CloudStorageAccount(creds, false);
var client = account.CreateCloudBlobClient();
var blob = client.GetBlobReference("/somecontainer/hugefile.zip");
// 1 MB seems to be a pretty good all-purpose block size
client.WriteBlockSizeInBytes = 1024 * 1024;
//this sets # of parallel uploads for blocks
client.ParallelOperationThreadCount = 4; //normally set to one per CPU core
//this will break blobs up automatically after this size
client.SingleBlobUploadThresholdInBytes = 4096;
blob.UploadFile("somehugefile.zip");
I use Cyberduck to manage my blob storage.
It is free and very easy to use. It works with other cloud storage solutions as well.
I recently found this one as well: CloudXplorer
Hope it helps.
There is a new open-source tool provided by Microsoft:
Project Deco - a cross-platform Microsoft Azure Storage Account Explorer.
Please, check those links:
Download binaries: http://storageexplorer.com/
Source Code: https://github.com/Azure/deco
You can use Cloud Combine for reliable and quick file upload to Azure blob storage.
A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag-and-drop your files on the following batch file to upload into your blob storage container:
upload.bat
@ECHO OFF
SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>
:AGAIN
IF "%~1" == "" GOTO DONE
AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob
SHIFT
GOTO AGAIN
:DONE
PAUSE
Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.
You can upload files to Azure Storage Account Blob using Command Prompt.
Install Microsoft Azure Storage tools.
Then upload it to your account's blob storage with this CLI command:
AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob
Hope it Helps.. :)
You can upload large files directly to Azure Blob Storage using the HTTP PUT verb; the biggest file I have tried with the code below is 4.6 GB. You can do this in C# like this:
// write up to CHUNK_SIZE of data to the web request
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
    var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
    var buffer = new Byte[4096];
    int bytesRead;
    var tempTotal = 0;

    File.FileStream.Position = DataSent;

    while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
        && tempTotal + bytesRead < CHUNK_SIZE
        && !File.IsDeleted
        && File.State != Constants.FileStates.Error)
    {
        requestStream.Write(buffer, 0, bytesRead);
        requestStream.Flush();

        DataSent += bytesRead;
        tempTotal += bytesRead;

        File.UiDispatcher.BeginInvoke(OnProgressChanged);
    }

    requestStream.Close();

    if (!AbortRequested) webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}

void StartUpload()
{
    var uriBuilder = new UriBuilder(UploadUrl);
    if (UseBlocks)
    {
        // encode the block name and add it to the query string
        CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
        uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
    }

    // with or without using blocks, we'll make a PUT request with the data
    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}
The UploadUrl is generated by Azure itself and contains a Shared Access Signature. This SAS URL says where the blob is to be uploaded and for how long security access (write access, in your case) is granted. You can generate a SAS URL like this:
readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;

public UploadService()
{
    // Setup the connection to Windows Azure Storage
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    BlobClient = storageAccount.CreateCloudBlobClient();

    // Get and create the container
    BlobContainer = BlobClient.GetContainerReference("publicfiles");
}

string JsonSerializeData(string url)
{
    var serializer = new DataContractJsonSerializer(url.GetType());
    var memoryStream = new MemoryStream();

    serializer.WriteObject(memoryStream, url);

    return Encoding.Default.GetString(memoryStream.ToArray());
}

public string GetUploadUrl()
{
    var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
    {
        Permissions = SharedAccessPermissions.Write,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60)
    });

    return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}
I also have a thread on the subject where you can find more information: How to upload huge files to the Azure blob from a web page.
The new Azure Portal has an 'Editor' menu option, in preview, in the container view. It allows you to upload a file directly to the container from the Portal UI.
I've used all the tools mentioned in this post, and all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.
By default BlobTransferUtility only does block blobs. However, by changing just 2 lines of code you can upload page blobs as well. If you, like me, need to upload a virtual machine image, it needs to be a page blob.
(for the difference please see this MSDN article.)
To upload page blobs just change lines 53 and 62 of BlobTransferHelper.cs from
new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob
to
new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob
The only other thing to know about this app is to uncheck HELP when you first run the program to see the actual UI.
Check out this post Uploading to Azure Storage where it is explained how to easily upload any file via PowerShell to Azure Blob Storage.
You can use the AzCopy tool to upload the required files to Azure; the default storage type is block blob. You can change the pattern according to your requirements.
Syntax:
AzCopy /Source:<source> /Dest:<destination> /S
Try the Blob Service API
http://msdn.microsoft.com/en-us/library/dd135733.aspx
However, 400 MB is a large file and I am not sure a single API call will deal with something of this size; you may need to split it up and reconstruct it using custom code (a rough sketch of that follows below).
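If you do end up splitting the file yourself, a manual block upload with the classic storage client (Microsoft.WindowsAzure.Storage) might look like the following sketch; the container name, block size, and connection string are assumptions of mine:
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlockUploader
{
    public static void UploadInBlocks(string connectionString, string filePath)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");
        container.CreateIfNotExists();
        var blob = container.GetBlockBlobReference(Path.GetFileName(filePath));

        const int blockSize = 4 * 1024 * 1024; // 4 MB per block
        var blockIds = new List<string>();
        var buffer = new byte[blockSize];

        using (var stream = File.OpenRead(filePath))
        {
            int read, index = 0;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Block ids must be base64 strings of equal length.
                var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
                blockIds.Add(blockId);
                using (var chunk = new MemoryStream(buffer, 0, read))
                {
                    blob.PutBlock(blockId, chunk, null);
                }
                index++;
            }
        }

        // Commit the block list to assemble the final blob.
        blob.PutBlockList(blockIds);
    }
}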

How to speed up Azure deployment from Visual Studio 2010

I have Visual Studio 2010 solution with an Azure Service and an ASP.NET MVC 3 solution that serves as a Web Role for the Azure service. No other roles attached to the service other than that.
Every deployment to the Azure staging (or production, for that matter) environment takes up to 20 minutes to complete, from the moment I click Publish in Visual Studio until all instances (2) are started.
As you can imagine this makes it a PITA to publish often, or to quick-fix some bugs. Is there a way to speed the process up? Would it be faster to upload the package to the Blob storage and upgrade from there? How would I go about achieving that?
I feel the online docs on Azure leave a lot to be desired, particularly when it comes to troubleshooting, by the way.
Thanks.
One idea for reducing the need (and frequency) for redeploying is to move static content into blob storage, external to the package. For instance, move your css and javascript to blob storage, along with images. Once this is done, you'd only have to recompile / redeploy for .NET code changes. You can upload updated css, at any time, to blob storage. If you want to test this in staging first, you could always have a staging vs. production container name for your static content and store that container name in a config setting.
This doesn't change the deployment time when you do need to redeploy, but at least you can reduce how often you go through that process...
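As a small illustration of the container-name-in-config idea, a helper along these lines could work; the setting name StaticContentContainer and the account URL are hypothetical, not from the original answer:
using Microsoft.WindowsAzure.ServiceRuntime;

public static class StaticContent
{
    // Builds the absolute URL of a static asset kept in blob storage.
    // The container name lives in service configuration, so staging and
    // production can point at different containers without a redeploy.
    public static string Url(string fileName)
    {
        string container = RoleEnvironment.GetConfigurationSettingValue("StaticContentContainer");
        return string.Format("https://myaccount.blob.core.windows.net/{0}/{1}", container, fileName);
    }
}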
You should enable Web Deploy in your Azure project. It works this way:
1/ Create an RDP account (don't forget, you need to upload a certificate with its private key so that Azure can decipher the password). That is hidden in the Deploy dialog box for your Azure deployment project.
2/ Enable Web Deployment - same place.
Once you've published the app that way, right-click the web application (not the Azure deployment project) and select Publish. The pop-up has everything defined except the password; enter that as well and you'll upload your changes to Azure in a matter of seconds.
CAVEAT: this is meant for single-instance web apps, and is definitely not the way to go for a production upgrade strategy; the blob storage answer already mentioned is the best option in that case.
Pierre
My solution to this problem is only to push a new package when I am changing code in the RoleEntryPoint or with the Service Definition. In Azure 1.3 you now have the ability to use Remote Desktop Connection. Using RDC, I will compile my code locally and use copy/paste to place it on the Azure server in the appropriate directory. Once the production code is running correctly, I can then push the fully tested version to staging and then do a VIP swap. This limits the number of times I actually have to deploy a package.
You actually have quite a long window in which you can keep modifying your code in Azure before you have to publish a new package. The new package is only really needed for those cases where Azure has to shutdown/restart your role instance.
It's a nice idea to try uploading your project to blob storage first, but unfortunately this is what Visual Studio is doing for you behind the scenes anyway. As has been pointed out elsewhere, most of the time taken by a deployment is not the upload itself, but the stopping and starting of all of your update domains.
If you're just running this site in a development environment, then the only way I know to speed it up is to run just one instance. If this is the live environment, then... sorry, I think you're out of luck.
So that I don't have to deploy to the cloud to test minor changes, what I've found works quite well is to engineer the site so that it works when running in local IIS just like any other MVC site.
The biggest barrier to this working is the settings that you have in the cloud config. The way we get around this is to make a copy of all of the settings in your cloud config and put them in your web.config in the appSettings. Then, rather than using RoleEnvironment.GetConfigurationSettingValue(), create a wrapper class that you call instead. This wrapper class checks RoleEnvironment.IsAvailable to see if it is running in the Azure fabric; if it is, it calls the usual config function above, and if not, it calls WebConfigurationManager.AppSettings[].
There are a few other things that you'll want to do around getting the config setting change events which hopefully you can figure out from the code below:
public class SmartConfigurationManager
{
    private static bool _addConfigChangeEvents;
    private static string _configName;
    private static Func<string, bool> _configSetter;

    public static bool AddConfigChangeEvents
    {
        get { return _addConfigChangeEvents; }
        set
        {
            _addConfigChangeEvents = value;
            if (value)
            {
                RoleEnvironment.Changing += RoleEnvironmentChanging;
            }
            else
            {
                RoleEnvironment.Changing -= RoleEnvironmentChanging;
            }
        }
    }

    public static string Setting(string configName)
    {
        if (RoleEnvironment.IsAvailable)
        {
            return RoleEnvironment.GetConfigurationSettingValue(configName);
        }
        return WebConfigurationManager.AppSettings[configName];
    }

    public static Action<string, Func<string, bool>> GetConfigurationSettingPublisher()
    {
        if (RoleEnvironment.IsAvailable)
        {
            return AzureSettingsGet;
        }
        return WebAppSettingsGet;
    }

    public static void WebAppSettingsGet(string configName, Func<string, bool> configSetter)
    {
        configSetter(WebConfigurationManager.AppSettings[configName]);
    }

    public static void AzureSettingsGet(string configName, Func<string, bool> configSetter)
    {
        // We have to store these to be used in the RoleEnvironment Changed handler
        _configName = configName;
        _configSetter = configSetter;

        // Provide the configSetter with the initial value
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));

        if (AddConfigChangeEvents)
        {
            RoleEnvironment.Changed += RoleEnvironmentChanged;
        }
    }

    private static void RoleEnvironmentChanged(object anotherSender, RoleEnvironmentChangedEventArgs arg)
    {
        if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>().Any(change => change.ConfigurationSettingName == _configName))
        {
            if (_configSetter(RoleEnvironment.GetConfigurationSettingValue(_configName)))
            {
                RoleEnvironment.RequestRecycle();
            }
        }
    }

    private static void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
    {
        // If a configuration setting is changing
        if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
        {
            // Set e.Cancel to true to restart this role instance
            e.Cancel = true;
        }
    }
}
The uploading itself takes a bit more than a minute most of the time. It's the starting up of the instances that take up most of the time.
What you can do is deploy your fixes to staging first (note that it costs money, so don't leave it there for too long). Swapping from staging to production only takes a couple of seconds. So while your application is still running you can upload the patched version, let your testers test it on staging, and when they give the go-ahead simply swap it to production.
I haven't tested your possible alternative approach of uploading to blob storage first. But I think that's just overhead, as it doesn't speed up starting the instances.
