How do I access BlockBlobClient in Azure Storage JavaScript client library for browsers? - azure-blob-storage

I'm attempting to use BlockBlobClient in a browser page to upload a file using a server-supplied SAS token / URL, similar to this C# code:
var blob = new CloudBlockBlob(new Uri(assetUploadUrl));
blob.UploadFromFile(FilePath, null, new BlobRequestOptions {RetryPolicy = new ExponentialRetry()});
Although the docs suggest BlockBlobClient is available in @azure/storage-blob and should be able to upload browser data from an input[type=file] element using uploadBrowserData, I can find no reference to BlockBlobClient in the browser library source. I looked into modifying the browserify export scripts, but I can't find any references in the main package source either. Also, the example code suggests that using @azure/storage-blob gives you a BlobServiceClient by default:
const { BlobServiceClient } = require("@azure/storage-blob");
Is BlockBlobClient actually available in the JavaScript client library?

Okay, I've figured this out: I need to use the Azure Storage client library for JavaScript, and there's even a sample that does exactly what I need. Now I just need to figure out how to bundle npm package files for use in Razor pages.
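A minimal sketch of the browser-side upload, assuming @azure/storage-blob v12 (the blobUrlWithSas variable and the input element id are placeholders, not from the original question):
const { BlockBlobClient } = require("@azure/storage-blob");

async function uploadSelectedFile(blobUrlWithSas) {
    // blobUrlWithSas is the server-supplied blob URL that already includes a SAS token
    const file = document.getElementById("file-input").files[0]; // <input type="file">
    const blockBlobClient = new BlockBlobClient(blobUrlWithSas);

    // uploadBrowserData accepts a Blob/File/ArrayBuffer when running in the browser
    await blockBlobClient.uploadBrowserData(file, {
        blobHTTPHeaders: { blobContentType: file.type }
    });
}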

Related

Calls From External Web Components in PWAs [duplicate]

We are running 2 servers. Server 1 hosts a react application. Server 2 hosts a webcomponent exposed as a single javascript bundle along with some assets such as images. We are dynamically loading the webcomponent Javascript hosted on Server 2 in our react app hosted on Server 1. The fact that it is a webcomponent might or might not affect the issue.
What's happening is that the webcomponent makes use of assets, such as images, that are located on Server 2. But when the react app loads the webcomponent, the images are not found because it's looking for them locally on Server 1.
We can fix this in many ways; I am looking for the simplest fix. Since the Server 1 and Server 2 apps are being developed by different teams, both should be able to develop in the most natural way possible without making allowances for their app potentially being loaded by other apps.
The fixes that I could think of were:
Making use of absolute URLs to load assets - needs the deployed location to be known in advance.
Adding a reverse proxy to Server 1 that redirects to Server 2 whenever a particular path is hit - needs configuration. The React app could load hundreds of webcomponents, so we would need to add a lot of reverse proxies.
Embedding all assets into the single JavaScript bundle on Server 2, e.g. embedding SVGs in the JavaScript - too limiting; huge SVGs would bloat the bundle size.
I was hoping to implement something like this:
Since the react app knows where to hit Server 2, can't we write some clever JavaScript that makes the browser go to Server 2 whenever assets are requested by JavaScript loaded from Server 2?
If you download your Web Component via a classic script (<script> with default type="text/javascript") you can retrieve the URL of the loaded file by using document.currentScript.src.
If you download the file as a module script (<script> with type="module"), you can retrieve the URL by using import.meta.url.
Then parse that property to extract the base path of the Web Component.
Example of Web Component Javascript file:
( function ( path ) {
    var base = path.slice( 0, path.lastIndexOf( '/' ) )

    customElements.define( 'my-comp', class extends HTMLElement {
        constructor() {
            super()
            this.attachShadow( { mode: 'open' } )
                .innerHTML = `<img src="${base}/image.png">`
        }
    } )
} )( document.currentScript ? document.currentScript.src : import.meta.url )
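For context, a sketch (assumed host-side code, not part of the original answer) of how the react app on Server 1 might dynamically load the bundle from Server 2 as a classic script, so that document.currentScript.src inside the bundle points at Server 2 while it executes; the URL is a placeholder:
const script = document.createElement("script");
script.src = "https://server2.example.com/components/my-comp.js"; // placeholder URL for Server 2
document.head.appendChild(script);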
How about uploading all required assets to a 3rd location, or maybe an AWS S3 bucket, Google Drive, Dropbox etc.? That way those assets always have a unique, known URL, and both teams can access them independently. As long as the names remain the same, so will the URLs.

Failed Prop Type Error in Fine Uploader

I'm trying to get Fine Uploader React to work but keep running into issues.
I'm getting the following errors:
Here's the URL: http://fineuploader.azurewebsites.net/
Here's what I've done so far:
Downloaded the source on to my computer from https://github.com/FineUploader/react-fine-uploader
I then npm installed react-fine-uploader and fine-uploader as per instructions
I ran webpack to transpile and bundle the code
Added an entry point and index.html
Finally, I simply published the app to a new Azure app/website
Any idea what's causing the issue?
P.S. My goal is to use Fine Uploader to upload files to Azure Blob Storage. At this point, I'm simply trying to get Fine Uploader going. I do realize that I'll have to enter a few pieces of information about my blob storage endpoint, etc., but I don't think this error is related to any of that.
A Gallery (and every higher-level component of that library) needs an "uploader" prop, as explained in the section https://github.com/FineUploader/react-fine-uploader#high-level-components
An uploader is one of the 3 classes available in the fine-uploader-wrappers package: https://github.com/FineUploader/fine-uploader-wrappers#wrapper-classes
Those are for uploads to:
AWS S3
Azure
or your own endpoint
The uploader class needs all the configuration: endpoint, credentials, custom settings, etc. (you can find a comprehensive list in the API section here: https://docs.fineuploader.com/branch/master/api/options.html)
An example for s3 direct upload would be something like:
const uploader = new FineUploaderS3({
    options: {
        request: {
            endpoint: "http://fineuploadertest.s3.amazonaws.com",
            accessKey: "AKIAIXVR6TANOGNBGANQ"
        },
        signature: {
            endpoint: "/vendor/fineuploader/php-s3-server/endpoint.php"
        }
    }
})
and use that uploader in a Gallery:
<Gallery uploader={ uploader } />
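Since the goal in the question is Azure Blob Storage, the equivalent with the Azure wrapper might look roughly like this (a sketch only: both endpoint values are placeholders, and the signature endpoint has to be a route on your own server that returns a SAS for each upload):
import FineUploaderAzure from 'fine-uploader-wrappers/azure'

const azureUploader = new FineUploaderAzure({
    options: {
        request: {
            // URL of your Blob Storage container (placeholder)
            endpoint: "https://myaccount.blob.core.windows.net/mycontainer"
        },
        signature: {
            // your server route that returns a SAS token for each upload (placeholder)
            endpoint: "/uploads/azure/signature"
        }
    }
})
// then, as above: <Gallery uploader={ azureUploader } />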
There are many useful options for customization: callbacks, event handlers, etc. You can find them all in the Fine Uploader docs.
Edit: if I'm not mistaken, react-transition-group is necessary even though it's not listed anywhere in the docs...

How can I use an internal database from scratch

How can I use an internal database, for example SQLite, for an offline app in NativeScript without using any plugin?
I've searched everywhere for how to install or use SQLite or another internal database in NativeScript, but I haven't found any answer.
Just as you would with any code that needs to access the native APIs.
e.g. a (JavaScript) Android example:
var query = "select sqlite_version() AS sqlite_version";
var db = android.database.sqlite.SQLiteDatabase.openOrCreateDatabase(":memory:", null);
var cursor = db.rawQuery(query, null);
var sqliteVersion = "";
if (cursor.moveToNext()) {
    sqliteVersion = cursor.getString(0);
    console.log(sqliteVersion);
}
The API reference for SQLite on Android is here; that said, you can now follow a basic Android database tutorial and implement it step by step in your NativeScript application using JavaScript or TypeScript.
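To take it one step further, here is a sketch of creating a table, inserting a row, and reading it back with the same android.database.sqlite API from NativeScript JavaScript (the "notes" table and its columns are just example names):
// ":memory:" can be replaced with a file path if the data should survive app restarts
var db = android.database.sqlite.SQLiteDatabase.openOrCreateDatabase(":memory:", null);

db.execSQL("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY AUTOINCREMENT, body TEXT)");
db.execSQL("INSERT INTO notes (body) VALUES (?)", ["first note"]);

var cursor = db.rawQuery("SELECT id, body FROM notes", null);
while (cursor.moveToNext()) {
    console.log(cursor.getLong(0) + ": " + cursor.getString(1));
}
cursor.close();
db.close();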
Still, the plugin provides all of that wrapped in ready-to-go functionality, so unless it's lacking something you need, it will be easier to use nativescript-sqlite and avoid writing native code for Android and then again for iOS.

Using Parse master key from the server and not cloud code

I recently implemented the security of my Parse app thinking that I could use the master key on my server (express not cloud code) to securely bypass my security implementations for admin/server level functions.
I'm using "parse": "^1.5.0",
in my package.json.
Right now in each of my express modules I have:
var Parse = require('parse').Parse;
Parse.initialize("Application ID", "Javascript Key", "Master Key");
Everything works fine without CLPs activated, but with CLPs enabled I can't read or write any data from the server. I understand that I could move this to Cloud Code and get it to work; however, I need to use a number of libraries that Parse does not support, and porting all of the code to Cloud Code would be very tough.
What am I doing wrong?
Here's what worked for me.
// this is the top of the JS page/module
'use strict';
var Parse = require('parse/node');
Parse.initialize('app-id', 'js-key', 'master-key');

exports.create = function(req, res) {
    Parse.Cloud.useMasterKey();
    // now when you do a Parse query or action you can bypass your security settings
};
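A side note in case it helps: in newer versions of the Parse JS SDK, Parse.Cloud.useMasterKey() is deprecated in favor of passing useMasterKey per operation. A rough sketch (the GameScore class name is just an example):
'use strict';
var Parse = require('parse/node');
Parse.initialize('app-id', 'js-key', 'master-key');

exports.list = function(req, res) {
    // pass the master key per operation so this query bypasses CLPs/ACLs
    var query = new Parse.Query('GameScore'); // example class name
    query.find({ useMasterKey: true }).then(function(results) {
        res.json(results);
    }, function(error) {
        res.status(500).send(error.message);
    });
};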

How can I authenticate with google inside of a firefox plugin

I'd like to add a calendar entry from my Firefox plugin to the user's Google calendar (with their authorization, of course). Unfortunately, I can't seem to figure out how to authenticate with Gapi within the context of the Firefox SDK.
I tried including the client.js from gapi directly as a module in my source, but this isn't effective, since it can't access the window object. My next attempt was something akin to what I do with jQuery - load it in a content script:
googleClient.js
var tabs = require("sdk/tabs");
var self = require('sdk/self');

function initAuth() {
    var worker = tabs.activeTab.attach({
        url: 'about:blank',
        contentScriptFile: [self.data.url('gapi.js'), self.data.url('authContentScript.js')]
    });
}

exports.initAuth = initAuth;
main.js:
var googleClient = require('./googleClient');
I get the following problem:
console.error: foxplugin:
Error opening input stream (invalid filename?)
In the ideal situation, it would open a new window in the browser that allows the user to log in to Google (similar to what happens when one requests access to the oauth2 endpoint from within a "real" content script).
I had the same problem, so I made an npm package for it. It's called addon-google-oauth2 and works for Google OAuth2; I tested it with the AdSense API. It's really simple: it just calls the REST APIs for OAuth2. Steps:
Create an OAuth2 client for a native application. No web or Android, just native.
If your add-on uses jpm, you're fine; if it uses cfx, please migrate to jpm.
Download and save the dependency with npm:
npm install addon-google-oauth2 --save
Follow the tutorial in the README.md file. It's easy, just two API calls:
refreshToken(options, callback);
getToken();
Insert the HTML and JS files into your data/ directory.
