Azure Functions and queues - Laravel

I want to create an app that supports multiple users. Every user can upload multiple files and each file uploaded needs to be processed by a pipeline (for example processing1.exe -> processing2.py -> processing3.exe). The results must be made available to the user after processing.
The backend is in Laravel.
I have the following questions:
I tried running the processing1.exe binary on Azure Functions but I got "access denied". I guess this is because the default Windows image does not have the necessary dependencies installed (Windows SDK). From what I read there are other offerings of "serverless" in Azure like Logic Apps, Custom containers etc. Is there a way to use "serverless"/Azure Functions with that binary? What are my options?
Every time a new file upload is detected (by Laravel), how should I trigger Azure to start processing it? Should I use Azure Storage Queues, Azure Service Bus, Event Hubs, or Event Grid (or something else)?
Ideally the system shouldn't process all the files from users in a FIFO manner but instead use round-robin (that way, if user1 uploads 1000 files and user2 uploads 3 files, user2 wouldn't have to wait for all of user1's files to finish).

A few workarounds for running the .exe file on Azure Functions:
Workaround 1:
Here is one of my practical workarounds for running an .exe and getting its output in Azure Functions; please refer to this SO thread.
Workaround 2:
My .exe file runs a SQL query that inserts a record into the database from the Azure Function App:
cmd.CommandText = "insert into [dbo].[debug]([Name]) values('test')";
run.csx:
using System;
using System.Diagnostics;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // Launch the uploaded .exe and capture its standard output and error.
    var process = new Process();
    process.StartInfo.FileName = @"D:\home\site\wwwroot\TimerTriggerclass1\demofunction.exe";
    process.StartInfo.Arguments = "";
    process.StartInfo.UseShellExecute = false;
    process.StartInfo.RedirectStandardOutput = true;
    process.StartInfo.RedirectStandardError = true;
    process.Start();

    string output = process.StandardOutput.ReadToEnd();
    string err = process.StandardError.ReadToEnd();
    process.WaitForExit();

    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}
Before running the Function App in Azure, we need to upload the .exe file via the Azure Function App > Functions (TimerTriggerclass1) > Upload File option.
Workaround 3:
using System;
using System.Diagnostics;
using System.Threading;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    string workingDirectory = @"D:\home\site\wwwroot\TimerTriggerClass1";
    string exeLocation = @"D:\home\site\wwwroot\TimerTriggerClass1\MyApplication.exe";

    Process proc = new Process();
    ProcessStartInfo info = new ProcessStartInfo();
    try
    {
        info.WorkingDirectory = workingDirectory;
        info.FileName = exeLocation;
        info.Arguments = "";
        info.WindowStyle = ProcessWindowStyle.Minimized;
        info.UseShellExecute = false;
        info.CreateNoWindow = true;

        proc.StartInfo = info;
        proc.Start();
        proc.WaitForExit();
    }
    catch (Exception ex)
    {
        // Log instead of silently swallowing the exception.
        log.Info($"Failed to run {exeLocation}: {ex.Message}");
    }
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}
Every time a new file upload is detected (by Laravel), how should I trigger Azure to start processing it? Should I use Azure Storage Queues, Azure Service Bus, Event Hubs, or Event Grid (or something else)? Ideally the system shouldn't process all the files from users in a FIFO manner but instead use round-robin (that way, if user1 uploads 1000 files and user2 uploads 3 files, user2 wouldn't have to wait for all of user1's files to finish).
For this process, a pure serverless approach such as Azure Functions with a Blob Storage trigger or Azure Logic Apps with a Blob Storage trigger is the better fit.
Whenever a new file is uploaded, the Blob Storage trigger starts the Azure Function and the file is processed immediately.
User 2 does not have to wait until user 1 finishes processing all of their files; each upload is picked up as it arrives, so user 2 gets the processed results as soon as their files are uploaded.
In comparison, Logic Apps offers the same trigger (it runs whenever files are added or modified) and adds built-in actions and connectors for the steps after processing, such as saving the results to other storage or sending them by mail. If you need customized logic before, during, or after processing the uploaded files, Azure Functions is the better choice, since you can write your own optimized code.
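As a rough illustration of the Blob Storage trigger approach, here is a minimal C# script (run.csx) sketch of a blob-triggered function that hands the uploaded file to the first pipeline stage. The container binding, exe path, and argument handling are assumptions for the example, not part of the original answer:
using System;
using System.IO;
using System.Diagnostics;

public static void Run(Stream myBlob, string name, TraceWriter log)
{
    log.Info($"New upload detected: {name} ({myBlob.Length} bytes)");

    // Hypothetical path to the pipeline's first stage, deployed alongside the function.
    var startInfo = new ProcessStartInfo
    {
        FileName = @"D:\home\site\wwwroot\ProcessUpload\processing1.exe",
        Arguments = name,
        UseShellExecute = false,
        RedirectStandardOutput = true
    };

    using (var process = Process.Start(startInfo))
    {
        string output = process.StandardOutput.ReadToEnd();
        process.WaitForExit();
        log.Info($"processing1.exe exited with code {process.ExitCode}: {output}");
    }
}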

Related

Cannot run cmd.exe through service. No commands appear to be working [duplicate]

Hey, I am trying to get a service to start my program but it isn't showing the GUI. The process starts but nothing is shown. I have tried enabling 'Allow service to interact with desktop' but that still isn't working.
My program is a computer locking device to stop unauthorised users from accessing the computer. I am running windows 7 with a 64 bit OS.
Here is the code for my service:
protected override void OnStart(string[] args)
{
    Process p = new Process();
    p.StartInfo.FileName = "notepad.exe";
    p.Start();

    FileStream fs = new FileStream(@"C:\Users\David\Documents\Visual Studio 2010\Projects\LockPCService\LockPCService\bin\Debug\ServiceLog.dj",
        FileMode.OpenOrCreate, FileAccess.Write);
    StreamWriter m_streamWriter = new StreamWriter(fs);
    m_streamWriter.BaseStream.Seek(0, SeekOrigin.End);
    m_streamWriter.WriteLine(" LockPCService: Service Started " + DateTime.Now + "\n" + "\n");
    m_streamWriter.Flush();
    m_streamWriter.Close();
}
protected override void OnStop()
{
    FileStream fs = new FileStream(@"C:\Users\David\Documents\Visual Studio 2010\Projects\LockPCService\LockPCService\bin\Debug\ServiceLog.dj",
        FileMode.OpenOrCreate, FileAccess.Write);
    StreamWriter m_streamWriter = new StreamWriter(fs);
    m_streamWriter.BaseStream.Seek(0, SeekOrigin.End);
    m_streamWriter.WriteLine(" LockPCService: Service Stopped " + DateTime.Now + "\n");
    m_streamWriter.Flush();
    m_streamWriter.Close();
}
To try and get the service working I am using notepad.exe. When I look at the processes notepad is running but there is no GUI. Also the ServiceLog is there and working each time I run it.
Any ideas on why this isn't working?
Thanks.
This article explains Session 0 Isolation which, among other things, disallows services from creating a UI in Windows Vista/7. If your service starts another process, that process also starts in Session 0 and will not show any UI. (By the way, the UI is created; it's just that Session 0 is never displayed.) This article on CodeProject can help you create a process from a service on the user's desktop and show its UI.
Also, please consider wrapping your stream objects in using statements so that they are properly disposed.
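For example, the OnStart method above could be written like this (a sketch only; the log path is shortened here for readability):
protected override void OnStart(string[] args)
{
    Process.Start("notepad.exe");

    // using blocks dispose the stream and writer even if an exception is thrown.
    using (var fs = new FileStream(@"C:\Logs\ServiceLog.dj",
        FileMode.OpenOrCreate, FileAccess.Write))
    using (var writer = new StreamWriter(fs))
    {
        fs.Seek(0, SeekOrigin.End);
        writer.WriteLine(" LockPCService: Service Started " + DateTime.Now + "\n");
    }
}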
Services run under a different account, so notepad is started by another user and on another desktop; that's why you cannot see it. 'Allow service to interact with desktop' is no longer supported starting with Vista.
I know this is a late post, but I found that this article was very helpful to me. I am running Windows 7 and the solution provided in this article works great.
If you download the code, there is a class called ApplicationLoader. Include that class in your project and then it's as simple as this:
// the name of the application to launch
String applicationName = "cmd.exe";
// launch the application
ApplicationLoader.PROCESS_INFORMATION procInfo;
ApplicationLoader.StartProcessAndBypassUAC(applicationName, out procInfo);
Services run in a different logon session and have a different window station from the user. That means that all GUI activity is segregated from the user's programs, not that the service can't display a GUI. Actually, this design makes it much easier to temporarily block access to the user's programs.
You'll need to call SwitchDesktop.
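For reference, here is a rough sketch of the P/Invoke plumbing around SwitchDesktop. It is only an illustration: the handle here comes from OpenInputDesktop, whereas a locking application would typically create its own desktop with CreateDesktop and switch to that.
using System;
using System.Runtime.InteropServices;

static class DesktopNative
{
    const uint DESKTOP_SWITCHDESKTOP = 0x0100;

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr OpenInputDesktop(uint dwFlags, bool fInherit, uint dwDesiredAccess);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool SwitchDesktop(IntPtr hDesktop);

    [DllImport("user32.dll", SetLastError = true)]
    static extern bool CloseDesktop(IntPtr hDesktop);

    // Switches to whichever desktop is currently receiving user input.
    public static bool SwitchToInputDesktop()
    {
        IntPtr desktop = OpenInputDesktop(0, false, DESKTOP_SWITCHDESKTOP);
        if (desktop == IntPtr.Zero) return false;
        try
        {
            return SwitchDesktop(desktop);
        }
        finally
        {
            CloseDesktop(desktop);
        }
    }
}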

Syncfusion PdfViewerControl on Azure

I am utilizing Syncfusion's PdfViewerControl and PdfLoadedDocument classes to generate thumbnail images of a PDF. However, once I moved the project to an Azure App Service, the PdfViewerControl is throwing an exception when being initialized. I am curious if it is attempting to use system memory and Azure is blocking this. Below is the method GenerateThumbnails I've created and the exception is being thrown when creating a new PdfViewerControl. If anyone has a work around for this or has experienced something similar when moving to Azure, any assistance would be greatly appreciated.
Along with that, if someone knows of another tool to create thumbnails from a PDF in this manner that'd be very helpful as well. Thanks!
Exception:
System.AccessViolationException: 'Attempted to read or write protected memory. This is often an indication that other memory is corrupt.'
Method:
public static List<Byte[]> GenerateThumbnails(Byte[] file)
{
    Int32 resizedHeight;
    Int32 resizedWidth;
    List<Byte[]> thumbnails = new List<Byte[]>();
    using (PdfViewerControl pdfViewerControl = new PdfViewerControl())
    using (PdfLoadedDocument pdfLoadedDocument = new PdfLoadedDocument(file, true))
    {
        // The PDF Viewer Control must load the PDF from a PdfLoadedDocument, rather than directly from the filename,
        // because when loaded from the filename it is not disposed correctly and causes a file lock.
        pdfViewerControl.Load(pdfLoadedDocument);
        for (Int32 i = 0; i < pdfViewerControl.PageCount; ++i)
        {
            using (Bitmap originalBitmap = pdfViewerControl.ExportAsImage(i))
            {
                if (pdfViewerControl.LoadedDocument.Pages[i].Size.Width > pdfViewerControl.LoadedDocument.Pages[i].Size.Height)
                {
                    resizedHeight = (PdfUtility.TARGET_THUMBNAIL_WIDTH_LANDSCAPE * originalBitmap.Height) / originalBitmap.Width;
                    resizedWidth = PdfUtility.TARGET_THUMBNAIL_WIDTH_LANDSCAPE;
                }
                else
                {
                    resizedHeight = PdfUtility.TARGET_THUMBNAIL_HEIGHT_PORTRAIT;
                    resizedWidth = (PdfUtility.TARGET_THUMBNAIL_HEIGHT_PORTRAIT * originalBitmap.Width) / originalBitmap.Height;
                }
                using (Bitmap resizedBitmap = new Bitmap(originalBitmap, new Size(resizedWidth, resizedHeight)))
                using (MemoryStream memoryStream = new MemoryStream())
                {
                    resizedBitmap.Save(memoryStream, ImageFormat.Jpeg);
                    thumbnails.Add(memoryStream.ToArray());
                }
            }
        }
    }
    return thumbnails;
}
Update
Web App for Containers on Windows is now supported. This allows you to bring your own Docker container that runs outside of the sandbox, so the restrictions described below won't affect your application.
There are restrictions in the sandbox that the app is running in that prevent certain API calls.
Here is a list of frameworks and scenarios that have been found to not be usable due to one or more of the restrictions above. It's conceivable that some will be supported in the future as the sandbox evolves.
PDF generators failing due to the restriction mentioned above:
- Syncfusion
- Siberix
- Spire.PDF
The following PDF generators are supported:
- SQL Reporting framework: requires the site to run in Basic or higher (note that this currently does not work in Function apps in Consumption mode)
- EVOPDF: see http://www.evopdf.com/azure-html-to-pdf-converter.aspx for the vendor's solution
- Telerik reporting: requires the site to run in Basic or higher. More info here
- Rotativa / wkhtmltopdf: requires the site to run in Basic or higher
- NReco PdfGenerator (wkhtmltopdf): requires subscription plan Basic or higher
Known issue for all PDF generators based on wkhtmltopdf or phantomjs: custom fonts are not rendered (a system-installed font is used instead) because of sandbox GDI API limitations that are present even in VM-based Azure App Service plans (Basic or higher).
Other scenarios that are not supported:
- PhantomJS/Selenium: tries to connect to a local address, and also uses GDI+.
https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox

Share File Between Applications

I am developing a reasonably complex process control background application on a Pi 3 (using c#, VC 2015). This is being developed and tested in a modular manner (display, user input, gpio extender boards, various types of sensors, relays, network comms, etc). Each module is built as a separate DLL and tested with its own background test app.
My problem is that I need to maintain a common set of data across all modules, particularly a set of application parameters. Also local storage as a cache for results and logging. So several different applications need to access this data during development - but only one at a time. Obviously in the final project, there will be a single application, so no problem.
I have been amazed to find that Windows IoT does not seem to allow a simple file to be accessible to different applications. App Services and other inter-app communication mechanisms all seem to work at the transaction level and are not appropriate here. Building an app service facility to handle all the I/O would be tedious (and not ultimately required).
Does anyone have an idea as to how this situation could be managed sensibly, please?
Due to the file access permissions of UWP apps, reading data from a file in another app's local storage is not possible.
One option is to attach a removable device (external storage) to the Raspberry Pi, store your application parameters in a file there (a text file, for example), and read that data from each of your apps.
I tested the following code (a UWP app) on the desktop and read the data successfully from multiple apps. I am sure you can read the data on the Raspberry Pi from one app, but I didn't test multiple apps; you can give it a try. If there is any concern, please feel free to let me know.
Task.Run(async () =>
{
    var removableDevices = KnownFolders.RemovableDevices;
    var externalDrives = await removableDevices.GetFoldersAsync();
    var drive0 = externalDrives[0];
    var testFolder = await drive0.GetFolderAsync("test");
    var SharedDateFile = await testFolder.GetFileAsync("data.txt");
    var data = await FileIO.ReadTextAsync(SharedDateFile);
    System.Diagnostics.Debug.WriteLine(data);
});
In the app manifest, you need to specify the Removable Storage capability and register at least one File Type Association declaration.
UPDATE:
Using the publisher cache folder is the better solution. First, add the following extension in Package.appxmanifest:
<Extensions>
  <Extension Category="windows.publisherCacheFolders">
    <PublisherCacheFolders>
      <Folder Name="Folder1"/>
    </PublisherCacheFolders>
  </Extension>
</Extensions>
Then write and read the file like this:
//Write file
var folder = Windows.Storage.ApplicationData.Current.GetPublisherCacheFolder("Folder1");
var file = await folder.CreateFileAsync("settings.txt", Windows.Storage.CreationCollisionOption.OpenIfExists);
await FileIO.WriteTextAsync(file,"Hello writen by app1");
//Read file
var folder = Windows.Storage.ApplicationData.Current.GetPublisherCacheFolder("Folder1");
var file = await folder.GetFileAsync("settings.txt");
var text = await FileIO.ReadTextAsync(file);
System.Diagnostics.Debug.WriteLine(text);

BITS, SharpBITS.NET

I am using BITS (Background Intelligent Transfer Service) in my project to send large files, with SharpBITS.NET in C# code.
I want to upload a file from the client machine to the server. I describe both sides below.
-------------client side---------------
static void Main(string[] args)
{
    string local = @"I:\a.mp3";
    string destination = "http://192.168.56.128/BitsTest/Home/FileUpload";
    string remoteFile = destination;
    string localFile = local;

    if (!string.IsNullOrEmpty(localFile) && System.IO.File.Exists(localFile))
    {
        var bitsManager = new BitsManager();
        var job = bitsManager.CreateJob("uploading file", JobType.Upload);
        job.NotificationFlags = NotificationFlags.JobErrorOccured | NotificationFlags.JobModified |
                                NotificationFlags.JobTransferred;
        job.AddFile(remoteFile, localFile);
        // Subscribe to the error event before resuming the job.
        job.OnJobError += new EventHandler<JobErrorNotificationEventArgs>(job_OnJobError);
        job.Resume();
    }
}
This is a simple console application: local is the path of the file I want to send, and destination is the URL of the remote server that receives it.
When I run the program, job.Error gives me the following: "The server's response was not valid. The server was not following the defined protocol. Resume the job, and then Background Intelligent Transfer Service (BITS) will try again. -- BG_E_HTTP_ERROR_200 (-2145845048, 0x801900C8)".
For the receiver I have the following code. It is a small MVC 3 project, and I show only the action that our destination path maps to.
public ActionResult FileUpload()
{
    try
    {
        HttpPostedFileBase file = Request.Files[0];
        file.SaveAs(System.IO.Path.Combine(Server.MapPath("/BitsTest/"), file.FileName));
    }
    catch
    {
        // Ignored for now.
    }
    /*System.IO.File.Move(Server.MapPath("/BitsTest/bin/aa.png"), Server.MapPath("/BitsTest/Content/aa.png"));*/
    return new EmptyResult();
}
But the FileUpload action does not receive the file. I don't know how I can receive the file on the receiving side.
As you can see, I used HttpPostedFileBase to receive the file, but that is not working.
My host server is Windows Server 2008 R2 and I have done the steps needed for BITS. For more information you can visit http://technet.microsoft.com/en-us/library/cc431377.aspx (How to Configure Windows Server 2008 for Configuration Manager 2007 Site Systems).
So I don't know what to do so that the host server can receive the file. Can you tell me what I should do?
With a stateless methodology like the one web applications use, there is no connection to the server once the response is completed. You can poll the server from the client side, but the client is not listening for the server to send additional bits.
In "the past" you could set up ActiveX controls, Java applets, etc. (Silverlight today?) to continue to listen, but this is not straight web-style development.
HTML5 expands your options if you are willing to use the WebSocket API. As with all parts of HTML5, you take some risk using these bits for implementation, as not all browsers have adopted the "standard" yet (adoption expected to be complete in the next 10-12 years :->).

How do I upload some file into Azure blob storage without writing my own program?

I created an Azure Storage account. I have a 400-megabyte .zip file that I want to put into blob storage for later use.
How can I do that without writing code? Is there some interface for that?
Free tools:
Visual Studio 2010 -- install Azure tools and you can find the blobs in the Server Explorer
CloudBerry Lab's CloudBerry Explorer for Azure Blob Storage
ClumsyLeaf CloudXplorer
Azure Storage Explorer from CodePlex (try version 4 beta)
There was an old program called Azure Blob Explorer or something that no longer works with the new Azure SDK.
Out of these, I personally like CloudBerry Explorer the best.
The easiest way is to use Azure Storage PowerShell. It provides many commands to manage your storage containers/blobs/tables/queues.
For your case, you could use Set-AzureStorageBlobContent, which uploads a local file into Azure Storage as a block blob or page blob.
Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname
For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx.
If you're looking for a tool to do so, may I suggest that you take a look at our tool Cloud Storage Studio (http://www.cerebrata.com/Products/CloudStorageStudio). It's a commercial tool for managing Windows Azure Storage and Hosted Service. You can also find a comprehensive list of Windows Azure Storage Management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx
Hope this helps.
The StorageClient has this built into it. No need to write really anything:
var account = new CloudStorageAccount(creds, false);
var client = account.CreateCloudBlobClient();
var blob = client.GetBlobReference("/somecontainer/hugefile.zip");
//1MB seems to be a pretty good all purpose size
client.WriteBlockSizeInBytes = 1024 * 1024;
//this sets # of parallel uploads for blocks
client.ParallelOperationThreadCount = 4; //normally set to one per CPU core
//this will break blobs up automatically after this size
client.SingleBlobUploadThresholdInBytes = 4096;
blob.UploadFile("somehugefile.zip");
I use Cyberduck to manage my blob storage.
It is free and very easy to use. It works with other cloud storage solutions as well.
I recently found this one as well: CloudXplorer
Hope it helps.
There is a new open-source tool provided by Microsoft:
Project Deco - a cross-platform Microsoft Azure Storage Account Explorer.
Please, check those links:
Download binaries: http://storageexplorer.com/
Source Code: https://github.com/Azure/deco
You can use Cloud Combine for reliable and quick file upload to Azure blob storage.
A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag-and-drop your files on the following batch file to upload into your blob storage container:
upload.bat
@ECHO OFF
SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>
:AGAIN
IF "%~1" == "" GOTO DONE
AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob
SHIFT
GOTO AGAIN
:DONE
PAUSE
Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.
You can upload files to Azure Storage Account Blob using Command Prompt.
Install Microsoft Azure Storage tools.
Then upload it to your account's blob storage with the CLI command:
AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob
Hope it Helps.. :)
You can upload large files directly to Azure Blob Storage using the HTTP PUT verb; the biggest file I have tried with the code below is 4.6 GB. You can do this in C# like this:
// write up to ChunkSize of data to the web request
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
    var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
    var buffer = new Byte[4096];
    int bytesRead;
    var tempTotal = 0;

    File.FileStream.Position = DataSent;
    while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
        && tempTotal + bytesRead < CHUNK_SIZE
        && !File.IsDeleted
        && File.State != Constants.FileStates.Error)
    {
        requestStream.Write(buffer, 0, bytesRead);
        requestStream.Flush();
        DataSent += bytesRead;
        tempTotal += bytesRead;
        File.UiDispatcher.BeginInvoke(OnProgressChanged);
    }
    requestStream.Close();
    if (!AbortRequested) webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}
void StartUpload()
{
    var uriBuilder = new UriBuilder(UploadUrl);
    if (UseBlocks)
    {
        // encode the block name and add it to the query string
        CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
        uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
    }
    // with or without using blocks, we'll make a PUT request with the data
    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}
The UploadUrl is generated by Azure itself and contains a Shared Access Signature. This SAS URL says where the blob is to be uploaded and for how long the security access (write access in your case) is granted. You can generate a SAS URL like this:
readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;

public UploadService()
{
    // Setup the connection to Windows Azure Storage
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    BlobClient = storageAccount.CreateCloudBlobClient();
    // Get and create the container
    BlobContainer = BlobClient.GetContainerReference("publicfiles");
}

string JsonSerializeData(string url)
{
    var serializer = new DataContractJsonSerializer(url.GetType());
    var memoryStream = new MemoryStream();
    serializer.WriteObject(memoryStream, url);
    return Encoding.Default.GetString(memoryStream.ToArray());
}

public string GetUploadUrl()
{
    var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
    {
        Permissions = SharedAccessPermissions.Write,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60)
    });
    return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}
I also have a thread on the subject where you can find more information: How to upload huge files to the Azure blob from a web page.
The new Azure Portal has an 'Editor' menu option (in preview) in the container view. It allows you to upload a file directly to the container from the Portal UI.
I've used all the tools mentioned in this post, and all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.
By default, BlobTransferUtility only handles block blobs. However, by changing just 2 lines of code you can upload page blobs as well. If you, like me, need to upload a virtual machine image, it needs to be a page blob.
(for the difference please see this MSDN article.)
To upload page blobs just change lines 53 and 62 of BlobTransferHelper.cs from
new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob
to
new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob
The only other thing to know about this app is to uncheck HELP when you first run the program to see the actual UI.
Check out this post Uploading to Azure Storage where it is explained how to easily upload any file via PowerShell to Azure Blob Storage.
You can use the AzCopy tool to upload the required files to Azure storage. The default is block blob; you can change the pattern according to your requirements.
Syntax:
AzCopy /Source:<source> /Dest:<destination> /S
Try the Blob Service API
http://msdn.microsoft.com/en-us/library/dd135733.aspx
However, 400 MB is a large file and I am not sure a single API call will deal with something of this size; you may need to split it and reconstruct it using custom code.
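If you do go down that road, here is a rough sketch of the splitting approach. It does not call the REST API by hand; instead it uses the older Microsoft.WindowsAzure.Storage client library, whose PutBlock/PutBlockList methods wrap the same Put Block / Put Block List REST operations. The container name and block size are arbitrary choices for the example:
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlockUploader
{
    public static void UploadInBlocks(string connectionString, string filePath)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");
        container.CreateIfNotExists();
        var blob = container.GetBlockBlobReference(Path.GetFileName(filePath));

        const int blockSize = 4 * 1024 * 1024; // 4 MB per block
        var blockIds = new List<string>();
        var buffer = new byte[blockSize];

        using (var stream = File.OpenRead(filePath))
        {
            int read, index = 0;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Block IDs must be Base64-encoded and all the same length.
                string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
                using (var blockData = new MemoryStream(buffer, 0, read))
                {
                    blob.PutBlock(blockId, blockData, null);
                }
                blockIds.Add(blockId);
                index++;
            }
        }

        // Commit the uploaded blocks as a single blob.
        blob.PutBlockList(blockIds);
    }
}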
