Azure DataLake client fails on creating a directory - azure-blob-storage

I'm creating a POC to store files in Azure, following the steps in https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-dotnet. In the snippet below, creating the directory fails with the message: No such host is known. (securedfstest02.blob.core.windows.net:443). I'd appreciate any suggestion to work around this issue.
using Azure;
using Azure.Storage;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Models;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace DataLakeHelloWorld
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
            try
            {
                CreateFileClientAsync_DirectoryAsync().Wait();
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
            }
        }

        static async Task CreateFileClientAsync_DirectoryAsync()
        {
            // Make StorageSharedKeyCredential to pass to the serviceClient
            string storageAccountName = "secureblobtest02";
            string storageAccountKey = "mykeyredacted";
            string dfsUri = "https://" + storageAccountName + ".dfs.core.windows.net";
            StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);

            // Create DataLakeServiceClient using StorageSharedKeyCredentials
            DataLakeServiceClient serviceClient = new DataLakeServiceClient(new Uri(dfsUri), sharedKeyCredential);

            // Create a DataLake filesystem
            DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("my-filesystem");
            if (!await filesystem.ExistsAsync())
                await filesystem.CreateAsync();

            // Create a DataLake directory
            DataLakeDirectoryClient directory = filesystem.CreateDirectory("my-dir");
            if (!await directory.ExistsAsync())
                await directory.CreateAsync();

            // Create a DataLake file using the DataLake directory
            DataLakeFileClient file = directory.GetFileClient("my-file");
            if (!await file.ExistsAsync())
                await file.CreateAsync();

            // Verify we created one file; advance the async enumerator
            // before reading Current, otherwise Current is null
            var response = filesystem.GetPathsAsync();
            IAsyncEnumerator<PathItem> enumerator = response.GetAsyncEnumerator();
            if (await enumerator.MoveNextAsync())
                Console.WriteLine(enumerator.Current?.Name);

            // Cleanup
            await filesystem.DeleteAsync();
        }
    }
}

--Update
In your question you mention Azure Data Lake, yet the host in the error is securedfstest02.blob.core.windows.net.
Azure Data Lake Storage uses .dfs.core.windows.net, whereas Azure Blob Storage uses .blob.core.windows.net. When using Blob-service operations against ADLS, you have to change the endpoint accordingly.
Please note the URI templates in the official MS docs; a quick sketch of the two endpoint shapes follows.
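As an illustration, the two endpoint shapes for the same storage account look like this (the account name is a placeholder):

string accountName = "mystorageaccount"; // placeholder

// Data Lake Storage Gen2 (DFS) endpoint - use with the DataLake* clients
string dfsUri = $"https://{accountName}.dfs.core.windows.net";

// Blob endpoint - use with BlobServiceClient and other Blob clients
string blobUri = $"https://{accountName}.blob.core.windows.net";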
I have used the same code and was able to create the directory; I just replaced my ADLS credentials. I have not configured any additional permissions, and my ADLS account allows access from all networks. You might want to check whether yours is configured to a specific network by default, or whether the firewall allows your client IP.
using Azure;
using Azure.Storage;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Models;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace DataLakeHelloWorld
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Starting....");
            try
            {
                Console.WriteLine("Executing...");
                CreateFileClientAsync_DirectoryAsync().Wait();
                Console.WriteLine("Done");
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
            }
        }

        static async Task CreateFileClientAsync_DirectoryAsync()
        {
            // Make StorageSharedKeyCredential to pass to the serviceClient
            string storageAccountName = "kteststarageeadls";
            string storageAccountKey = "6fAe+P8LRe8LH0Ahxxxxxxxxx5ma17Slr7SjLy4oVYSgj05m+zWZuy5X8p4/Bbxxx8efzCj/X+On/Fwmxxxo7g==";
            string dfsUri = "https://" + "kteststarageeadls" + ".dfs.core.windows.net";
            StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);

            // Create DataLakeServiceClient using StorageSharedKeyCredentials
            DataLakeServiceClient serviceClient = new DataLakeServiceClient(new Uri(dfsUri), sharedKeyCredential);

            // Create a DataLake filesystem
            DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("my-filesystem");
            if (!await filesystem.ExistsAsync())
                await filesystem.CreateAsync();

            // Create a DataLake directory
            DataLakeDirectoryClient directory = filesystem.CreateDirectory("my-dir");
            if (!await directory.ExistsAsync())
                await directory.CreateAsync();

            // Create a DataLake file using the DataLake directory
            DataLakeFileClient file = directory.GetFileClient("my-file");
            if (!await file.ExistsAsync())
                await file.CreateAsync();

            // Verify we created one file; advance the async enumerator
            // before reading Current, otherwise Current is null
            var response = filesystem.GetPathsAsync();
            IAsyncEnumerator<PathItem> enumerator = response.GetAsyncEnumerator();
            if (await enumerator.MoveNextAsync())
                Console.WriteLine(enumerator.Current?.Name);

            // Cleanup
            //await filesystem.DeleteAsync();
        }
    }
}
I've edited the storage account key; it's shown for reference only.

Related

How to create a pdf file which takes a trdp file and adds values from another JSON file

I am very new to Telerik Reporting and I am trying to create a C# console app which takes a simple trdp template file, inserts values into it from a JSON file at runtime, and converts it into a PDF as output. Any help is appreciated as I am learning it from scratch. Thanks.
You can try the following C# code for a console application; it takes a trdp file and exports it to multiple formats, including PDF. You will find the exported documents in your console application's Debug folder (if you run it in the Debug configuration).
using System;
using System.Collections;
using System.IO;
using System.Linq;
using Telerik.Reporting;
using Telerik.Reporting.Processing;

namespace ConsoleApp2101
{
    class Program
    {
        static void Main(string[] args)
        {
            var reportSource = new UriReportSource();
            var processor = new ReportProcessor();
            var deviceInfo = new Hashtable();
            reportSource.Uri = @"C:\Program Files (x86)\Progress\Telerik Reporting R1 2021\Report Designer\Examples\MyReport.trdp";
            deviceInfo.Add("DocumentTitle", "SomeOptionalTitle");
            string[] availableFormats = new string[] { "PDF", "CSV", "DOCX", "XLSX", "PPTX", "RTF" };
            foreach (var format in availableFormats)
            {
                var result = processor.RenderReport(format, reportSource, deviceInfo);
                if (result.HasErrors)
                {
                    Console.WriteLine(string.Join(",", result.Errors.Select(s => s.Message)));
                }
                else
                {
                    // Write the rendered document next to the executable
                    File.WriteAllBytes($"MyReport.{format.ToLower()}", result.DocumentBytes);
                }
            }
            Console.WriteLine("Completed!");
            Console.ReadKey();
        }
    }
}
Reference:
https://docs.telerik.com/reporting/programmatic-exporting-report
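To cover the JSON part of the question: a common approach is to define report parameters in the trdp template and fill them from the JSON file before rendering. A minimal sketch, assuming the template defines a parameter named "CustomerName" and using Newtonsoft.Json (the file path and property name are hypothetical):

using Newtonsoft.Json.Linq;
using Telerik.Reporting;

// Load values from a JSON file (path and property names are hypothetical)
var json = JObject.Parse(System.IO.File.ReadAllText(@"C:\temp\values.json"));

var reportSource = new UriReportSource { Uri = @"C:\temp\MyReport.trdp" };

// The trdp template must define a matching report parameter;
// the report's expressions can then use = Parameters.CustomerName.Value
reportSource.Parameters.Add("CustomerName", (string)json["customerName"]);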

Azure function - request image from 3rd party then send image to requestor without saving to local directory

I've found lots of questions about downloading images, and as my code shows, that is what I ended up doing. However, that is not the behavior I want. I just want it to return the image directly.
using System.Net;
using Microsoft.Extensions.Logging;
using System.IO;

public static async Task<HttpResponseMessage> Run(HttpRequest req, ILogger log, string data)
{
    log.LogInformation("start function...");
    string qrData = $"{data}"; //req.Query["id"];
    string QrGeneratorUrl = "https://api.qrserver.com/v1/create-qr-code/?size=100x100&data=" + qrData;
    log.LogInformation("QrUrl= " + QrGeneratorUrl);
    var filename = "temp.png";
    var filePath = Path.Combine(@"d:\home\site\wwwroot\QrGeneratorTest\" + filename);
    WebClient myWebClient = new WebClient();
    myWebClient.DownloadFile(QrGeneratorUrl, filePath);
    var response = new HttpResponseMessage(HttpStatusCode.OK);
    var fileStream = new FileStream(filePath, FileMode.Open);
    response.Content = new StreamContent(fileStream);
    return response;
}
I've tried converting the image to a byte stream and adding the stream to the response content, and I've tried putting the image data directly as string content... nothing seems to work - it only transmits the image if it is a local file and I add it to the response via a FileStream. Does someone know how I can put the response I get directly into the response I am returning? Or explain why it can't be done? This functionality exists in a web app that we are trying to move into a function; the web app is able to pass the content along without saving it, using a byte stream. But I can't seem to replicate that in the function.
There are 2 reasons we are not calling the QR server directly:
1) It's a 3rd-party site, so it could go down and we need to be able to swap it out for a new provider from one location.
2) We need to build the URL so it has no query parameters (?p=1&q=2&r=3...), as this is going into an email and having a bunch of parameters often tags the email as junk. With Azure (as with our web app) we can build the URL like /getImage/1/2/3, which is less likely to be tagged as spam (see the route sketch below).
Any insight would be appreciated!!
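For the parameter-free URL shape in reason 2, Azure Functions HTTP triggers support route templates. A minimal sketch with hypothetical function and parameter names (not part of the original code):

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class GetImage
{
    // Matches URLs like /api/getImage/1/2/3 with no query string
    [FunctionName("GetImage")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "getImage/{p}/{q}/{r}")] HttpRequest req,
        string p, string q, string r)
    {
        return new OkObjectResult($"{p}/{q}/{r}");
    }
}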
//*******************//
ANSWER
Here is my final code. I think the issue was Stream vs MemoryStream... In any case, here is the full code:
#r "Newtonsoft.Json"
using System.Net;
using System;
using System.Web;
using System.Threading.Tasks;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
public static async Task<HttpResponseMessage> Run(HttpRequest req, ILogger log, string data)
{
log.LogInformation("start function...");
string qrData = $"{data}";
//string qrData = DateTime.Now.Ticks.ToString();
string QrGeneratorUrl = "https://api.qrserver.com/v1/create-qr-code/?size=100x100&qzone=2&data="+ qrData;
//get the QR image from 3d party api
var httpWebRequest = WebRequest.Create(QrGeneratorUrl);
var httpResponse = await httpWebRequest.GetResponseAsync();
//put 3d party response into function response
Stream ms = httpResponse.GetResponseStream(); //new MemoryStream(bytes);
var result = new HttpResponseMessage(HttpStatusCode.OK);
result.Content = new StreamContent(ms);
result.Content.Headers.ContentType = new MediaTypeHeaderValue("image/png");
return result;
}
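As an aside, the same pass-through can be written with HttpClient rather than the older WebRequest API. A minimal sketch under that assumption (the QR URL comes from the code above; the method name is a placeholder):

using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

// Reuse a single HttpClient across invocations
private static readonly HttpClient client = new HttpClient();

public static async Task<HttpResponseMessage> PassThrough(string qrData)
{
    string url = "https://api.qrserver.com/v1/create-qr-code/?size=100x100&qzone=2&data=" + qrData;

    // Stream the 3rd-party response straight into our response
    Stream stream = await client.GetStreamAsync(url);
    var result = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StreamContent(stream)
    };
    result.Content.Headers.ContentType = new MediaTypeHeaderValue("image/png");
    return result;
}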
Suppose you want to download the image to a stream and just return it (if you send the request from a browser, the image shows in the browser; if I got your purpose wrong, please let me know). If this is your purpose, you could refer to my code below, where I download the image from blob storage and return it as a FileContentResult.
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using System.DrawingCore;

namespace FunctionApp72
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<IActionResult> RunAsync(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");
            CloudStorageAccount blobAccount = CloudStorageAccount.Parse(Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
            CloudBlobClient blobClient = blobAccount.CreateCloudBlobClient();
            CloudBlobContainer blobContainer = blobClient.GetContainerReference("test");
            CloudBlockBlob cloudBlockBlob = blobContainer.GetBlockBlobReference("test.jpg");

            MemoryStream streamIn = new MemoryStream();
            await cloudBlockBlob.DownloadToStreamAsync(streamIn);
            Image originalImage = Bitmap.FromStream(streamIn);
            return new FileContentResult(ImageToByteArray(originalImage), "image/jpeg");
        }

        private static byte[] ImageToByteArray(Image image)
        {
            ImageConverter converter = new ImageConverter();
            return (byte[])converter.ConvertTo(image, typeof(byte[]));
        }
    }
}
And when I deploy it to Azure, it still returns the image.

Webmasters API - Quota limits

We're trying to download page data for sites using the Webmasters API .NET Client Library, by calling WebmastersService.SearchAnalytics.Query(). To do this we are using batching, sending approx. 600 requests in one batch. However, most of these fail with the error "Quota Exceeded". The number that fail varies each time, but only about 10 of the 600 work (and where in the batch they fall varies too). The only way we can get it to work is to reduce the batch size to 3 and wait 1 second between each call.
According to the Developer Console our daily quota is set to 1,000,000 (and we have 99% remaining) and our per user limit is set to 10,000 requests / second / user.
The error we get back is:
Quota Exceeded [403] Errors [ Message[Quota Exceeded] Location[ - ]
Reason[quotaExceeded] Domain[usageLimits]]
Is there another quota which is enforced? What does "Domain[usageLimits]" mean - is the domain the site we are querying the page data for, or is it our user account?
We still get the problem if we run each request separately, unless we wait 1 second between each call. Due to the number of sites and the number of pages we need to download the data for, this isn't really an option.
I found this post which points out that just because the max batch size is 1000 doesn't mean the Google service you are calling supports batches of that size. But I'd really like to find out exactly what the quota limits actually are (as they don't match the Developer Console figures) and how to avoid the errors.
Update 1
Here's some sample code. It's written specifically to demonstrate the problem, so no need to comment on its quality ;o)
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Util.Store;
using Google.Apis.Webmasters.v3;
using Google.Apis.Webmasters.v3.Data;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            new Program().Run().Wait();
        }

        private async Task Run()
        {
            List<string> pageUrls = new List<string>();
            // Add your page urls to the list here
            await GetPageData("<your app name>", "2015-06-15", "2015-07-05", "web", "DESKTOP", "<your domain name>", pageUrls);
        }

        public static async Task<WebmastersService> GetService(string appName)
        {
            //if (_service != null)
            //    return _service;
            //TODO: - look at analytics code to see how to store JSON and refresh token and check runs on another PC
            UserCredential credential;
            using (var stream = new FileStream("c:\\temp\\WMT.json", FileMode.Open, FileAccess.Read))
            {
                credential = await GoogleWebAuthorizationBroker.AuthorizeAsync(
                    GoogleClientSecrets.Load(stream).Secrets,
                    new[] { Google.Apis.Webmasters.v3.WebmastersService.Scope.Webmasters },
                    "user", CancellationToken.None, new FileDataStore("WebmastersService"));
            }

            // Create the service.
            WebmastersService service = new WebmastersService(new BaseClientService.Initializer()
            {
                HttpClientInitializer = credential,
                ApplicationName = appName,
            });
            //_service = service;
            return service;
        }

        private static async Task<bool> GetPageData(string appName, string fromDate, string toDate, string searchType, string device, string siteUrl, List<string> pageUrls)
        {
            // Get the service from the initial method
            bool ret = false;
            WebmastersService service = await GetService(appName);
            Google.Apis.Requests.BatchRequest b = new Google.Apis.Requests.BatchRequest(service);
            try
            {
                foreach (string pageUrl in pageUrls)
                {
                    SearchAnalyticsQueryRequest qry = new SearchAnalyticsQueryRequest();
                    qry.StartDate = fromDate;
                    qry.EndDate = toDate;
                    qry.SearchType = searchType;
                    qry.RowLimit = 5000;
                    qry.Dimensions = new List<string>() { "query" };
                    qry.DimensionFilterGroups = new List<ApiDimensionFilterGroup>();
                    ApiDimensionFilterGroup filterGroup = new ApiDimensionFilterGroup();
                    ApiDimensionFilter filter = new ApiDimensionFilter();
                    filter.Dimension = "device";
                    filter.Expression = device;
                    filter.Operator__ = "equals";
                    ApiDimensionFilter filter2 = new ApiDimensionFilter();
                    filter2.Dimension = "page";
                    filter2.Expression = pageUrl;
                    filter2.Operator__ = "equals";
                    filterGroup.Filters = new List<ApiDimensionFilter>();
                    filterGroup.Filters.Add(filter);
                    filterGroup.Filters.Add(filter2);
                    qry.DimensionFilterGroups.Add(filterGroup);

                    var req = service.Searchanalytics.Query(qry, siteUrl);
                    b.Queue<SearchAnalyticsQueryResponse>(req, (response, error, i, message) =>
                    {
                        if (error == null)
                        {
                            // Process the results
                            ret = true;
                        }
                        else
                        {
                            Console.WriteLine(error.Message);
                        }
                    });
                }

                // Execute the whole batch once, after queuing all the requests
                await b.ExecuteAsync();
            }
            catch (Exception ex)
            {
                Console.WriteLine("Exception occurred getting page stats : " + ex.Message);
                ret = false;
            }
            return ret;
        }
    }
}
Paste this into Program.cs of a new console app and add Google.Apis.Webmasters.v3 via NuGet. It looks for the WMT.json file in c:\temp, but adjust the authentication code to suit your setup. If I add more than 5 page URLs to the pageUrls list, I get the Quota Exceeded exception.
I've found that the stated quotas don't really seem to be the quotas. I had to slow my requests down to 1/sec to avoid this same issue, even though I was always at or below the stated rate limit (20/sec). Furthermore, the docs claim you get a rateLimitExceeded error for going too fast, but it really returns a quotaExceeded error. It might have to do with how Google averages the rate of requests over time (some of our requests were simultaneous, even though the long-run average was designed to be at or below 20/sec), but I cannot be sure. A throttling sketch is below.
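A minimal sketch of that throttled approach, issuing the queries sequentially instead of in a batch (the 1-second delay is the empirical workaround described above, not a documented limit; the query objects are built as in the sample code):

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Google.Apis.Webmasters.v3;
using Google.Apis.Webmasters.v3.Data;

private static async Task QueryThrottled(WebmastersService service, string siteUrl,
    IEnumerable<SearchAnalyticsQueryRequest> queries)
{
    foreach (var qry in queries)
    {
        try
        {
            SearchAnalyticsQueryResponse response =
                await service.Searchanalytics.Query(qry, siteUrl).ExecuteAsync();
            // Process response.Rows here
        }
        catch (Google.GoogleApiException e)
        {
            Console.WriteLine(e.Message);
        }

        // Pause ~1 second between calls; empirically this avoided quotaExceeded
        await Task.Delay(TimeSpan.FromSeconds(1));
    }
}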

Error inherits module: Windows azure blobstore in gwt and not GAE Blobstore

I used the Windows Azure SDK for Java in GWT, and I get this problem in GWT:
No source code is available for type com.microsoft.windowsazure.services.core.storage.CloudStorageAccount; did you forget to inherit a required module?
Any idea? For example, the correct value for <inherits name ="....."/>?
This is the code, but the problem is not the code; it is the correct value for inherits name:
// Imports added for completeness, assuming the old Azure SDK for Java
// package layout (matching the error message) plus the GWT UI classes
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.net.URISyntaxException;

import com.google.gwt.user.client.ui.HTMLPanel;
import com.google.gwt.user.client.ui.RootPanel;
import com.microsoft.windowsazure.services.blob.client.BlobContainerPermissions;
import com.microsoft.windowsazure.services.blob.client.BlobContainerPublicAccessType;
import com.microsoft.windowsazure.services.blob.client.CloudBlobClient;
import com.microsoft.windowsazure.services.blob.client.CloudBlobContainer;
import com.microsoft.windowsazure.services.blob.client.CloudBlockBlob;
import com.microsoft.windowsazure.services.blob.client.ListBlobItem;
import com.microsoft.windowsazure.services.core.storage.CloudStorageAccount;

public class StorageSmple {

    public static final String storageConnectionString =
        "DefaultEndpointsProtocol=http;" +
        "AccountName=xxxxxx;" +
        "AccountKey=xxxxxxx";

    public void executeProgram()
    {
        try
        {
            CloudStorageAccount account;
            CloudBlobClient serviceClient;
            CloudBlobContainer container;
            CloudBlockBlob blob;

            account = CloudStorageAccount.parse(storageConnectionString);
            serviceClient = account.createCloudBlobClient();

            // Container name must be lower case.
            container = serviceClient.getContainerReference("gettingstarted");
            container.createIfNotExist();

            // Set anonymous access on the container.
            BlobContainerPermissions containerPermissions;
            containerPermissions = new BlobContainerPermissions();
            containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);
            container.uploadPermissions(containerPermissions);

            // Upload an image file.
            blob = container.getBlockBlobReference("image");
            File fileReference = new File("www.xxx/a254.png");
            blob.upload(new FileInputStream(fileReference), fileReference.length());

            // At this point the image is uploaded.
            // Next, create an HTML page that lists all of the uploaded images.
            MakeHTMLPage(container);
            System.out.println("Processing complete.");
            System.out.println("Open index.html to see the images stored in your storage account.");
        } catch (Exception e) {
            System.out.print("Exception encountered: ");
            System.out.println(e.getMessage());
        }
    }

    // Create an HTML page that can be used to display the uploaded images.
    // This example assumes all of the blobs are for images.
    public void MakeHTMLPage(CloudBlobContainer container) throws FileNotFoundException, URISyntaxException
    {
        // Enumerate the uploaded blobs.
        for (ListBlobItem blobItem : container.listBlobs()) {
            HTMLPanel b = new HTMLPanel("<img src='" + blobItem.getUri() + "'/><br/>");
            RootPanel.get().add(b);
        }
    }
}
I'm not too familiar with Azure, but I highly suspect that the Azure Java SDK is to be used on the server side. There must be code in this SDK that is not emulated by GWT.
Any code that is not already emulated by GWT (see here for a list of emulated classes) must be accompanied by GWT-translatable sources (see <super-source/> here).

Google apps Directory API (1.6 and above) DotNetOpenAuth not resolving

This code is changing fast, and it's hard to get a handle on what works and what doesn't...
I was looking at this post: Have you used Google's Directory API?
which is using the 1.4 library.
I installed the 1.6 API through NuGet. However, NativeApplicationClient and IAuthorizationState cannot be resolved. I was under the impression that I no longer needed the DotNetOpenAuth NuGet package or the Google.Apis.Authentication package (which is where I believe they are resolved).
This is the complete and modified code I am playing with (if you have a better example of creating users using the new API, I'd like to see that!):
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using System.Diagnostics;
using Google;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Auth;
using Google.Apis.Download;
using Google.Apis.Logging;
using Google.Apis.Services;
using Google.Apis.Upload;
using Google.Apis.Admin.Directory;
using Google.Apis.Admin.Directory.directory_v1.Data;

namespace GoogleAddUser
{
    class Program
    {
        static void Main(string[] args)
        {
            // Display the header and initialize the sample.
            //CommandLine.EnableExceptionHandling();
            Console.WriteLine("Create users in a google apps domain!");
            Console.WriteLine("by Jonas Bergstedt 2013");

            // Get the user data and store in user object
            Console.Write("Email: ");
            string userId = Console.ReadLine();
            Console.Write("Givenname: ");
            string GivenName = Console.ReadLine();
            Console.Write("Familyname: ");
            string FamilyName = Console.ReadLine();
            Console.Write("Password: ");
            string Password = Console.ReadLine();

            User newuserbody = new User();
            UserName newusername = new UserName();
            newuserbody.PrimaryEmail = userId;
            newusername.GivenName = GivenName;
            newusername.FamilyName = FamilyName;
            newuserbody.Name = newusername;
            newuserbody.Password = Password;

            // Register the authenticator.
            var provider = new NativeApplicationClient(GoogleAuthenticationServer.Description)
            {
                ClientIdentifier = "<your clientId from Google APIs Console>",
                ClientSecret = "<your clientsecret from Google APIs Console>",
            };
            var auth = new OAuth2Authenticator<NativeApplicationClient>(provider, GetAuthorization);

            // Create the service.
            var service = new DirectoryService(new BaseClientService.Initializer()
            {
                Authenticator = auth,
                ApplicationName = "Create User",
                ApiKey = "<your API Key from Google APIs console> (not sure if needed)"
            });

            User results = service.Users.Insert(newuserbody).Execute();
            Console.WriteLine("User :" + results.PrimaryEmail + " is created");
            Console.WriteLine("Press any key to continue!");
            Console.ReadKey();
        }

        private static IAuthorizationState GetAuthorization(NativeApplicationClient arg)
        {
            // Get the auth URL:
            IAuthorizationState state = new AuthorizationState(new[] { DirectoryService.Scopes.AdminDirectoryUser.GetStringValue() });
            state.Callback = new Uri(NativeApplicationClient.OutOfBandCallbackUrl);
            Uri authUri = arg.RequestUserAuthorization(state);

            // Request authorization from the user (by opening a browser window):
            Process.Start(authUri.ToString());
            Console.WriteLine();
            Console.Write("Authorization Code: ");
            string authCode = Console.ReadLine();

            // Retrieve the access token by using the authorization code:
            return arg.ProcessUserAuthorization(authCode, state);
        }
    }
}
From release 1.6.0-beta we presented a new Google.Apis.Auth NuGet package (Google.Apis.Authentication, which uses DNOA, is obsolete!). You already installed that package, because all the new APIs have a reference to it. Take a look at our OAuth2 wiki page for more details about how to use the new OAuth2 flows (it's not magic anymore; now the flows actually make sense!). A sketch of the new flow is below.
I recommend subscribing to our announcement blog, and optionally to my personal blog, to get more information about the client library. In our announcement blog we described the reasons for the new OAuth2 package.
Hope it is helpful.
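For reference, a minimal sketch of the question's authentication on the new Google.Apis.Auth package (the client secrets file name and data store name are placeholders; this mirrors the documented GoogleWebAuthorizationBroker flow rather than the answer's exact code):

using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Google.Apis.Admin.Directory.directory_v1;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Util.Store;

static async Task<DirectoryService> CreateServiceAsync()
{
    UserCredential credential;
    using (var stream = new FileStream("client_secrets.json", FileMode.Open, FileAccess.Read))
    {
        // Replaces NativeApplicationClient/IAuthorizationState from the old package
        credential = await GoogleWebAuthorizationBroker.AuthorizeAsync(
            GoogleClientSecrets.Load(stream).Secrets,
            new[] { DirectoryService.Scope.AdminDirectoryUser },
            "user", CancellationToken.None, new FileDataStore("DirectoryService"));
    }

    return new DirectoryService(new BaseClientService.Initializer
    {
        HttpClientInitializer = credential,
        ApplicationName = "Create User",
    });
}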
