Error inheriting module: Windows Azure Blob storage in GWT (not GAE Blobstore)

I used the Windows Azure SDK for Java in GWT, and I get this problem in GWT:
No source code is available for type com.microsoft.windowsazure.services.core.storage.CloudStorageAccount; did you forget to inherit a required module?
Any idea? For example, what is the correct value for <inherits name="..."/>?
Here is the code, but the problem is not the code itself; it is the correct value for the inherits name:
public class StorageSmple {
    public static final String storageConnectionString =
            "DefaultEndpointsProtocol=http;" +
            "AccountName=xxxxxx;" +
            "AccountKey=xxxxxxx";

    public void executeProgram() {
        try {
            CloudStorageAccount account;
            CloudBlobClient serviceClient;
            CloudBlobContainer container;
            CloudBlockBlob blob;

            account = CloudStorageAccount.parse(storageConnectionString);
            serviceClient = account.createCloudBlobClient();

            // Container name must be lower case.
            container = serviceClient.getContainerReference("gettingstarted");
            container.createIfNotExist();

            // Set anonymous access on the container.
            BlobContainerPermissions containerPermissions = new BlobContainerPermissions();
            containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);
            container.uploadPermissions(containerPermissions);

            // Upload an image file.
            blob = container.getBlockBlobReference("image");
            File fileReference = new File("www.xxx/a254.png");
            blob.upload(new FileInputStream(fileReference), fileReference.length());

            // At this point the image is uploaded.
            // Next, create an HTML page that lists all of the uploaded images.
            MakeHTMLPage(container);
            System.out.println("Processing complete.");
            System.out.println("Open index.html to see the images stored in your storage account.");
        } catch (Exception e) {
            System.out.print("Exception encountered: ");
            System.out.println(e.getMessage());
        }
    }

    // Create an HTML page that can be used to display the uploaded images.
    // This example assumes all of the blobs are for images.
    public void MakeHTMLPage(CloudBlobContainer container) throws FileNotFoundException, URISyntaxException {
        // Enumerate the uploaded blobs.
        for (ListBlobItem blobItem : container.listBlobs()) {
            HTMLPanel b = new HTMLPanel("<img src='" + blobItem.getUri() + "'/><br/>");
            RootPanel.get().add(b);
        }
    }
}

I'm not too familiar with Azure, but I strongly suspect that the Azure Java SDK is meant to be used on the server side. There must be code in this SDK that is not emulated by GWT.
Any code that is not already emulated by GWT (see here for a list of emulated classes) must be accompanied by GWT-translatable sources (see <super-source/> here).
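In other words, there is no <inherits> value that will make a server-only library like the Azure SDK compile to JavaScript. The usual fix is to keep the Azure calls on the server and expose them to the client through GWT RPC. Below is a minimal sketch of that split; the UploadService name, the uploadImage method, and the "upload" servlet path are invented for illustration, and each type would normally live in its own file:

// Shared/client side: the synchronous service contract.
import com.google.gwt.user.client.rpc.RemoteService;
import com.google.gwt.user.client.rpc.RemoteServiceRelativePath;

@RemoteServiceRelativePath("upload")
public interface UploadService extends RemoteService {
    // Returns the public URI of the uploaded blob.
    String uploadImage(String blobName);
}

// Shared/client side: the async counterpart the client code actually calls.
import com.google.gwt.user.client.rpc.AsyncCallback;

public interface UploadServiceAsync {
    void uploadImage(String blobName, AsyncCallback<String> callback);
}

// Server side: the Azure SDK is safe here, because this code runs on the JVM
// rather than being compiled to JavaScript.
import com.google.gwt.user.server.rpc.RemoteServiceServlet;
import com.microsoft.windowsazure.services.core.storage.CloudStorageAccount;
import com.microsoft.windowsazure.services.blob.client.CloudBlobContainer;

public class UploadServiceImpl extends RemoteServiceServlet implements UploadService {
    private static final String storageConnectionString = "..."; // as in the question

    @Override
    public String uploadImage(String blobName) {
        try {
            CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
            CloudBlobContainer container =
                    account.createCloudBlobClient().getContainerReference("gettingstarted");
            return container.getBlockBlobReference(blobName).getUri().toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}

The client then declares the servlet in web.xml and obtains the async proxy with GWT.create(UploadService.class).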

Related

Azure DataLake client fails on creating a directory

I'm creating a POC to store files in Azure, following the steps in https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-dotnet. In the snippet below, creating the directory fails with the message No such host is known. (securedfstest02.blob.core.windows.net:443). I'd appreciate any suggestion to work around this issue.
using Azure;
using Azure.Storage;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Models;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace DataLakeHelloWorld
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
            try
            {
                CreateFileClientAsync_DirectoryAsync().Wait();
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
            }
        }

        static async Task CreateFileClientAsync_DirectoryAsync()
        {
            // Make StorageSharedKeyCredential to pass to the serviceClient
            string storageAccountName = "secureblobtest02";
            string storageAccountKey = "mykeyredacted";
            string dfsUri = "https://" + storageAccountName + ".dfs.core.windows.net";
            StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);

            // Create DataLakeServiceClient using StorageSharedKeyCredentials
            DataLakeServiceClient serviceClient = new DataLakeServiceClient(new Uri(dfsUri), sharedKeyCredential);

            // Create a DataLake Filesystem
            DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("my-filesystem");
            if (!await filesystem.ExistsAsync())
                await filesystem.CreateAsync();

            // Create a DataLake Directory
            DataLakeDirectoryClient directory = filesystem.CreateDirectory("my-dir");
            if (!await directory.ExistsAsync())
                await directory.CreateAsync();

            // Create a DataLake File using a DataLake Directory
            DataLakeFileClient file = directory.GetFileClient("my-file");
            if (!await file.ExistsAsync())
                await file.CreateAsync();

            // Verify we created one file
            var response = filesystem.GetPathsAsync();
            IAsyncEnumerator<PathItem> enumerator = response.GetAsyncEnumerator();
            Console.WriteLine(enumerator?.Current?.Name);

            // Cleanup
            await filesystem.DeleteAsync();
        }
    }
}
--Update
In your question you mention Azure Data Lake, but you seem to have the host securedfstest02.blob.core.windows.net.
Azure Data Lake Storage uses .dfs.core.windows.net, whereas Azure Blob Storage uses .blob.core.windows.net. When using Blob-service-related operations against ADLS, you have to change the endpoint accordingly.
Please note the URI templates in the official MS docs.
I have used the same code and was able to create the directory; I just replaced my ADLS credentials. I have not configured any additional permissions, and my ADLS allows access from all networks. You might want to check whether yours is configured for a specific network by default, or whether the firewall allows your client IP.
using Azure;
using Azure.Storage;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Models;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace DataLakeHelloWorld
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Starting....");
            try
            {
                Console.WriteLine("Executing...");
                CreateFileClientAsync_DirectoryAsync().Wait();
                Console.WriteLine("Done");
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
            }
        }

        static async Task CreateFileClientAsync_DirectoryAsync()
        {
            // Make StorageSharedKeyCredential to pass to the serviceClient
            string storageAccountName = "kteststarageeadls";
            string storageAccountKey = "6fAe+P8LRe8LH0Ahxxxxxxxxx5ma17Slr7SjLy4oVYSgj05m+zWZuy5X8p4/Bbxxx8efzCj/X+On/Fwmxxxo7g==";
            string dfsUri = "https://" + storageAccountName + ".dfs.core.windows.net";
            StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);

            // Create DataLakeServiceClient using StorageSharedKeyCredentials
            DataLakeServiceClient serviceClient = new DataLakeServiceClient(new Uri(dfsUri), sharedKeyCredential);

            // Create a DataLake Filesystem
            DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("my-filesystem");
            if (!await filesystem.ExistsAsync())
                await filesystem.CreateAsync();

            // Create a DataLake Directory
            DataLakeDirectoryClient directory = filesystem.CreateDirectory("my-dir");
            if (!await directory.ExistsAsync())
                await directory.CreateAsync();

            // Create a DataLake File using a DataLake Directory
            DataLakeFileClient file = directory.GetFileClient("my-file");
            if (!await file.ExistsAsync())
                await file.CreateAsync();

            // Verify we created one file (advance the enumerator before reading Current)
            var response = filesystem.GetPathsAsync();
            IAsyncEnumerator<PathItem> enumerator = response.GetAsyncEnumerator();
            if (await enumerator.MoveNextAsync())
                Console.WriteLine(enumerator.Current?.Name);

            // Cleanup
            //await filesystem.DeleteAsync();
        }
    }
}
I've redacted the storage account key; it is shown for reference only.

CosmosDB Project Layout

Asking for advice and references.
Using Visual Studio, I have an Azure Web Apps project in my solution. Now I'm programming my stored procedures for CosmosDB. Using the CosmosDB Emulator, I can simply insert the stored procedure code directly into the browser editor window. All good and fine, and everything works beautifully.
I also have a NodeJS project sitting alongside my Web App project. This allows me to store the stored procedures as files. The associated console app is able to connect to and modify the CosmosDB Emulator as expected.
My question is: using Visual Studio, what is the best way to lay out my project, so that it's not done on napkins and prayers?
I'm wondering how I should structure my project layout and assets to align with current "best practices". Are there any articles or posts that you have found that talk about this specifically? Would I run all of these procedures against CosmosDB manually, or have people devised automated procedures? I would like to be able to test these stored procedures against the Emulator first, and then, with little to no source code change, update staging.
Thanks!
I have just recently asked myself the same question regarding stored procedure migrations. I am currently running a basic Migrate method that gets the stored procedure content from a .js file and replaces/creates the stored procedure; this runs on startup (in Startup.cs).
The main gist of the code is below; you will need to create the very basic internal methods yourself (comments welcome):
using System;
using System.IO;
using System.Threading.Tasks;
using App.Data.Access;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Azure.Documents;

namespace App.Data.StoredProcedures
{
    public class Migrations : IMigrations
    {
        private readonly IHostingEnvironment _hostingEnvironment;
        private readonly IDocumentDbContext _documentDbContext;

        public Migrations(IHostingEnvironment hostingEnvironment, IDocumentDbContext documentDbContext)
        {
            _hostingEnvironment = hostingEnvironment;
            _documentDbContext = documentDbContext;
        }

        public async Task<bool> Migrate()
        {
            try
            {
                await AddUpdateBulkDeleteStoredProcedure();
                return true;
            }
            catch (Exception exception)
            {
                throw new Exception("Error running CosmosDb stored procedure migrations, error: " + exception.Message);
            }
        }

        public string GetStoredProcedureScript(string filename)
        {
            var script = Path.Combine(_hostingEnvironment.WebRootPath, "App_Data", "CosmosDbStoredProcedures", filename);
            return File.ReadAllText(script);
        }

        public async Task<bool> AddUpdateBulkDeleteStoredProcedure()
        {
            const string storedProcedureId = "BulkDeleteStoredProcedure";
            var function = GetStoredProcedureScript($"{storedProcedureId}.js");
            if (string.IsNullOrWhiteSpace(function))
            {
                throw new Exception($"Error running DocumentDb stored procedure migrations, {storedProcedureId} content is empty");
            }
            try
            {
                // Replace the stored procedure if it already exists...
                await _documentDbContext.Client.ReplaceStoredProcedureAsync(
                    _documentDbContext.GetStoredProcedureUri(storedProcedureId),
                    new StoredProcedure { Id = storedProcedureId, Body = function });
                return true;
            }
            catch
            {
                // ...otherwise fall through and create it.
            }
            await _documentDbContext.Client.CreateStoredProcedureAsync(
                _documentDbContext.DocumentCollectionUri,
                new StoredProcedure { Id = storedProcedureId, Body = function });
            return true;
        }
    }
}

Google Drive API PDF export from Google Doc generates empty response

I'm using the Google Drive export API to retrieve a Google Doc as a PDF: https://developers.google.com/drive/v3/reference/files/export
I'm having the following problem: for documents bigger than a certain size (I don't know the exact threshold, but it happens even with relatively small files of around 1.5 MB), the API returns a 200 response code with a blank result, where normally it should contain the PDF data as a byte stream.
I can successfully export the file via the Google Drive/Google Docs UI with the "File -> Download as... -> PDF" command, although it takes a bit of time.
Here is the file used for the test (1,180 KB exported from Google Doc); I have shared it so you can try the export:
https://docs.google.com/document/d/18Cz7kHfEiDLeTWHyyoOi6U4kFQDMeg0D-CCJzILMMCk/edit?usp=sharing
Here is the (Java) code I'm using to perform the operation:
@Override
public GoogleDriveDocumentContent downloadFileContentAsPDF(String executionGoogleUser, String fileId) {
    GoogleDriveDocumentContent documentContent = new GoogleDriveDocumentContent();
    String conversionMimeType = "application/pdf";
    try {
        getLogger().info("GDrive APIs - Downloading file content in PDF format ...");
        InputStream gDriveFileData = getDriveService(executionGoogleUser).files()
                .export(fileId, conversionMimeType)
                .executeMediaAsInputStream();
        getLogger().info("GDrive APIs - File content as PDF format downloaded.");
        documentContent.setFileName(null);
        documentContent.setMimeType(conversionMimeType);
        documentContent.setData(gDriveFileData);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
    return documentContent;
}
Does anyone have the same issue and know how to solve it?
The goal is to generate a PDF from a Google Doc.
Thanks
I think you should try using the media downloader; you will have to alter it for Google Drive rather than the Storage service.
{
    // Create the service using the client credentials.
    var storageService = new StorageService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = credential,
        ApplicationName = "APP_NAME_HERE"
    });
    // Get the client request object for the bucket and desired object.
    var getRequest = storageService.Objects.Get("BUCKET_HERE", "OBJECT_HERE");
    using (var fileStream = new System.IO.FileStream(
        "FILE_PATH_HERE",
        System.IO.FileMode.Create,
        System.IO.FileAccess.Write))
    {
        // Add a handler which will be notified on progress changes.
        // It will notify on each chunk download and when the
        // download is completed or failed.
        getRequest.MediaDownloader.ProgressChanged += Download_ProgressChanged;
        getRequest.Download(fileStream);
    }
}

static void Download_ProgressChanged(IDownloadProgress progress)
{
    Console.WriteLine(progress.Status + " " + progress.BytesDownloaded);
}
Code ripped from here
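Since the question is in Java, the equivalent with the Drive v3 Java client is to use the MediaHttpDownloader that every export request exposes; it downloads in chunks and reports progress. This is only a sketch: it assumes getDriveService(...) returns an authorized Drive client as in the question, and export.pdf is an arbitrary target path.

import com.google.api.client.googleapis.media.MediaHttpDownloader;
import com.google.api.services.drive.Drive;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

public void exportAsPdf(String executionGoogleUser, String fileId) throws IOException {
    Drive.Files.Export request = getDriveService(executionGoogleUser)
            .files().export(fileId, "application/pdf");
    // Chunked download with progress reporting, instead of a single-shot stream.
    MediaHttpDownloader downloader = request.getMediaHttpDownloader();
    downloader.setProgressListener(d ->
            System.out.println(d.getDownloadState() + " " + d.getNumBytesDownloaded()));
    try (OutputStream out = Files.newOutputStream(Paths.get("export.pdf"))) {
        request.executeMediaAndDownloadTo(out);
    }
}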

Image without extension in src not loading in IE alone, and works perfectly in all other browsers

I have the HTML code below:
<img title="hotelThumbImage" id="hotelThumbImage01" width="140px" height="129px"
src="/b2c/images/?url=FixedPkgB2c/FF-252-325"/>
In IE the image fails to load, while all other browsers, like Firefox and Chrome, render it correctly.
Related question: How to make a Servlet call from the UI which returns the content itself, and place an img tag using script in the output?
My project is suffering from this too, and it's because IE refuses to download/display files whose actual content type differs from what their extension suggests. It has something to do with malicious code being able to be hidden as image files simply by changing the extension of the file.
Firefox and Chrome are smart enough to display it as an image as long as the content is that of an image, but IE takes no chances, it seems.
You'll have to add the extension that matches your image's encoding for it to display in IE.
Edit: It's also possible that your server is sending the file with a header denoting plain text. Again, Firefox and Chrome are smart enough to handle it, but IE isn't. See: https://stackoverflow.com/a/32988576/4793951
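If the image is served by a plain servlet, the fix can be as small as setting the header before streaming the bytes. A minimal sketch, assuming a hypothetical loadImage helper that fetches the PNG bytes for the requested url parameter:

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ImageServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        byte[] imageBytes = loadImage(req.getParameter("url")); // hypothetical helper
        resp.setContentType("image/png"); // tell IE what the bytes are; don't rely on sniffing
        resp.setContentLength(imageBytes.length);
        resp.getOutputStream().write(imageBytes);
    }

    private byte[] loadImage(String url) throws IOException {
        // Placeholder: fetch the image bytes from wherever they are stored.
        throw new UnsupportedOperationException("not implemented in this sketch");
    }
}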
Welcome to IE world... :(
What I would do, in order to have better control of the situation, is modify the getter method, so in Holiday.getPkgCode():
public String getPkgCode() throws IOException {
    if (!this.pkgCode.contains(".")) {
        String ext = ImgUtil.determineFormat(this.pkgCode);
        return this.pkgCode + ImgUtil.toExtension(ext);
    } else {
        return this.pkgCode;
    }
}
To use it you will need to catch exceptions, and you will need this ImgUtil class, adapted from here:
class ImgUtil {
    public static String determineFormat(String name) throws IOException {
        // get the image format of a file
        File file = new File(name);
        // create an image input stream from the specified file
        ImageInputStream iis = ImageIO.createImageInputStream(file);
        // get all currently registered readers that recognize the image format
        Iterator<ImageReader> iter = ImageIO.getImageReaders(iis);
        if (!iter.hasNext()) {
            throw new RuntimeException("No readers found!");
        }
        // get the first reader
        ImageReader reader = iter.next();
        String toReturn = reader.getFormatName();
        // close the stream
        iis.close();
        return toReturn;
    }

    public static String toExtension(String ext) {
        switch (ext) {
            case "JPEG": return ".jpg";
            case "PNG": return ".png";
        }
        return null;
    }
}
TEST IT:
NOTE: I placed a JPG image without an extension in the C:\tmp folder
public class Q37052184 {
    String pkgCode = "C:\\tmp\\yorch";

    public static void main(String[] args) throws IOException {
        Q37052184 q = new Q37052184();
        System.out.println(q.getPkgCode());
    }

    // the given getter!!!
}
OUTPUT:
C:\tmp\yorch.jpg
You have to set the Content-Type property of the response header in the servlet.
For example, in Spring 4 MVC:
@GetMapping(value = "/b2c/images/?url=FixedPkgB2c/FF-252-325")
public ResponseEntity<byte[]> getImageThumbnail() {
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.IMAGE_PNG); // use the actual media type of the image
    byte[] content = ...;
    return ResponseEntity.ok().headers(headers).body(content);
}

Polling from a network directory

I have been working on the following project; some background:
I am an intern currently developing a new search system for my organization. The current setup is Microsoft SharePoint 2013, to which the users upload files, and on the other side is the system I am developing, which indexes all uploaded data into Apache Solr.
I have been successful in mapping the SharePoint content repository to a network drive, and I can manually start my program to index the content of this network drive into Solr using the SolrJ API.
The problem I am facing, however, is that I am unable to poll events from this network drive. In my test build, which ran locally, I used a watcher service to launch code (reindex documents, delete indexes) on file create, file modify, and file delete.
Unfortunately, this does not work with a URL pointing to a network drive :(.
So the big question: is there any API / library available for polling events from network drives?
Any help would be extremely appreciated!
So I finally figured this one out. I tried looking at .NET's variant of the watcher service (System.IO.FileSystemWatcher) and I was having the same problem. I finally got it working by using Apache Commons IO's FileAlterationMonitor / FileAlterationObserver (org.apache.commons.io.monitor).
Code:
import java.io.File;
import java.io.IOException;
import org.apache.commons.io.monitor.FileAlterationListener;
import org.apache.commons.io.monitor.FileAlterationListenerAdaptor;
import org.apache.commons.io.monitor.FileAlterationMonitor;
import org.apache.commons.io.monitor.FileAlterationObserver;

public class UNCWatcher {
    // A hardcoded path to a folder you are monitoring.
    public static final String FOLDER = "A:\\Department";

    public static void main(String[] args) throws Exception {
        // The monitor will perform polling on the folder every 5 seconds.
        final long pollingInterval = 5 * 1000;
        File folder = new File(FOLDER);
        if (!folder.exists()) {
            // Test to see if the monitored folder exists.
            throw new RuntimeException("Directory not found: " + FOLDER);
        }
        FileAlterationObserver observer = new FileAlterationObserver(folder);
        FileAlterationMonitor monitor = new FileAlterationMonitor(pollingInterval);
        FileAlterationListener listener = new FileAlterationListenerAdaptor() {
            // Is triggered when a file is created in the monitored folder
            @Override
            public void onFileCreate(File file) {
                try {
                    // "file" is the reference to the newly created file
                    System.out.println("File created: " + file.getCanonicalPath());
                    if (file.getName().endsWith(".docx")) {
                        System.out.println("Uploaded resource is of type docx, preparing solr for indexing.");
                    }
                } catch (IOException e) {
                    e.printStackTrace(System.err);
                }
            }

            // Is triggered when a file is deleted from the monitored folder
            @Override
            public void onFileDelete(File file) {
                try {
                    // "file" is the reference to the removed file
                    System.out.println("File removed: " + file.getCanonicalPath());
                    // "file" no longer exists at the location
                    System.out.println("File still exists in location: " + file.exists());
                } catch (IOException e) {
                    e.printStackTrace(System.err);
                }
            }
        };
        observer.addListener(listener);
        monitor.addObserver(observer);
        System.out.println("Starting monitor service");
        monitor.start();
    }
}
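To close the loop with Solr: inside onFileCreate you could hand the new file to SolrJ roughly as below. This is a sketch only; the core URL and the id/filename field names are assumptions about your Solr setup and schema:

import java.io.File;
import java.io.IOException;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class SolrIndexer {
    // Assumed core URL; adjust to your Solr instance and core name.
    private final HttpSolrClient solr =
            new HttpSolrClient.Builder("http://localhost:8983/solr/documents").build();

    public void index(File file) throws SolrServerException, IOException {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", file.getCanonicalPath()); // canonical path as the unique key
        doc.addField("filename", file.getName());
        solr.add(doc);
        solr.commit();
    }
}

For rich documents such as .docx you would more likely post the file to Solr's extracting request handler (Tika) instead of building the document fields by hand.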
