Failed Prop Type Error in Fine Uploader - fine-uploader

I'm trying to get Fine Uploader React to work but keep running into issues.
I'm getting "Failed prop type" errors in the browser console.
Here's the URL: http://fineuploader.azurewebsites.net/
Here's what I've done so far:
Downloaded the source onto my computer from https://github.com/FineUploader/react-fine-uploader
I then npm installed react-fine-uploader and fine-uploader per the instructions
I ran webpack to transpile and bundle the code
Added an entry point and index.html
Finally, I simply published the app to a new Azure app/website
Any idea what's causing the issue?
P.S. My goal is to use Fine Uploader to upload files to Azure Blob Storage. At this point, I'm simply trying to get Fine Uploader going. I do realize that I'll have to enter a few pieces of information about my blob storage endpoint, etc., but I don't think this error is related to any of that.

A Gallery (and every higher-level component of that library) needs an "uploader" prop, as explained in the high-level components section: https://github.com/FineUploader/react-fine-uploader#high-level-components
An uploader is one of the 3 classes available in the fine-uploader-wrappers package: https://github.com/FineUploader/fine-uploader-wrappers#wrapper-classes
Those are for uploads to AWS S3, Azure, or your own (traditional) endpoint.
The uploader class needs all the configuration: endpoint, credentials, custom options, etc. (you can find a comprehensive list in the API options section: https://docs.fineuploader.com/branch/master/api/options.html)
An example for s3 direct upload would be something like:
const uploader = new FineUploaderS3({
    options: {
        request: {
            endpoint: "http://fineuploadertest.s3.amazonaws.com",
            accessKey: "AKIAIXVR6TANOGNBGANQ"
        },
        signature: {
            endpoint: "/vendor/fineuploader/php-s3-server/endpoint.php"
        }
    }
})
and use that uploader in a gallery:
<Gallery uploader={ uploader } />
There are many useful options for customization: callbacks, event handlers, etc. You can find them all in the Fine Uploader docs.
Edit: if I'm not mistaken, react-transition-group is required even though it's not listed anywhere in the docs...
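Since the stated goal is Azure Blob Storage, here is a minimal sketch wiring the whole thing together with the Azure wrapper instead (the storage container URL and the SAS signature endpoint are placeholders you would replace with your own values):
import React from 'react'
import FineUploaderAzure from 'fine-uploader-wrappers/azure'
import Gallery from 'react-fine-uploader'

// default gallery styles shipped with react-fine-uploader
import 'react-fine-uploader/gallery/gallery.css'

const uploader = new FineUploaderAzure({
    options: {
        request: {
            // placeholder: your storage account and container
            endpoint: 'https://myaccount.blob.core.windows.net/mycontainer'
        },
        signature: {
            // placeholder: a server route that returns a SAS for each upload
            endpoint: '/sas'
        }
    }
})

const App = () => <Gallery uploader={ uploader } />

export default App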

Related

How do I access BlockBlobClient in Azure Storage JavaScript client library for browsers?

I'm attempting to use BlockBlobClient in a browser page to upload a file using a server-supplied SAS token/URL, similar to this C# code:
var blob = new CloudBlockBlob(new Uri(assetUploadUrl));
blob.UploadFromFile(FilePath, null, new BlobRequestOptions {RetryPolicy = new ExponentialRetry()});
Although the docs suggest BlockBlobClient is available in @azure/storage-blob and should be able to upload browser data from an input[type=file] element using uploadBrowserData, I can find no reference to BlockBlobClient in the browser library source. I looked into modifying the browserify export scripts, but I can't find any references in the main package source either. Also, the example code suggests that using @azure/storage-blob gives you a BlobServiceClient by default:
const { BlobServiceClient } = require("@azure/storage-blob");
Is BlockBlobClient actually available in the JavaScript client library?
Okay, I've figured this out: I need to use the Azure Storage client library for JavaScript; there's even a sample of doing exactly what I need to do. Now I just need to figure out how to bundle npm package files for use in Razor pages.
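For reference, a minimal sketch with the v12 @azure/storage-blob package (the SAS URL and the input selector are assumptions standing in for your own setup):
const { BlockBlobClient } = require('@azure/storage-blob')

// assumption: assetUploadUrl is the server-supplied SAS URL for the target blob
async function uploadToBlob(sasUrl, file) {
    const client = new BlockBlobClient(sasUrl)
    // uploadBrowserData accepts a Blob/File in the browser bundle
    await client.uploadBrowserData(file, {
        blobHTTPHeaders: { blobContentType: file.type }
    })
}

document.querySelector('input[type=file]').addEventListener('change', (e) => {
    uploadToBlob(assetUploadUrl, e.target.files[0])
})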

Calls From External Web Components in PWAs [duplicate]

We are running 2 servers. Server 1 hosts a react application. Server 2 hosts a webcomponent exposed as a single javascript bundle along with some assets such as images. We are dynamically loading the webcomponent Javascript hosted on Server 2 in our react app hosted on Server 1. The fact that it is a webcomponent might or might not affect the issue.
What's happening is that the webcomponent makes use of assets such as images that are located on Server 2. But when the react app loads the webcomponent, the images are not found because it's looking for them locally on Server 1.
We can fix this in many ways, but I am looking for the simplest fix. Since the Server 1 and Server 2 apps are being developed by different teams, both should be able to develop in the most natural way possible without making allowances for their app being potentially loaded by other apps.
The fixes that I could think of were:
Making use of absolute URLs to load assets - need to know the deployed location in advance.
Adding a reverse proxy to Server 1 to redirect to Server 2 whenever a particular path is hit - need to configure this; the React app could load hundreds of webcomponents, so we would need to add a lot of reverse proxies.
Embedding all assets into the single JavaScript bundle on Server 2, e.g. inlining SVGs into the JavaScript - too limiting, and huge SVGs would make the bundle size bigger.
I was hoping to implement something like -
Since the react app knows where to hit Server 2, can't we write some clever JavaScript that will make the browser go to Server 2 whenever assets are requested by JavaScript loaded from Server 2?
If you download your Web Component via a classic script (<script> with default type="text/javascript") you can retrieve the URL of the loaded file by using document.currentScript.src.
If you download the file as a module script (<script> with type="module"), you can retrieve the URL by using import.meta.url.
Then parse that property to extract the base path to the Web Component.
Example of Web Component Javascript file:
( function ( path ) {
    var base = path.slice( 0, path.lastIndexOf( '/' ) )
    customElements.define( 'my-comp', class extends HTMLElement {
        constructor() {
            super()
            this.attachShadow( { mode: 'open' } )
                .innerHTML = `<img src="${base}/image.png">`
        }
    } )
} ) ( document.currentScript ? document.currentScript.src : import.meta.url )
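In the React app on Server 1, the bundle can then be pulled in dynamically, and the component will resolve its image against Server 2 (the Server 2 URL below is illustrative):
// somewhere in the Server 1 app
const script = document.createElement('script')
script.type = 'module' // the snippet above reads import.meta.url when loaded as a module
script.src = 'https://server2.example.com/components/my-comp.js'
document.head.appendChild(script)
// once loaded, <my-comp> builds its <img> src relative to Server 2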
How about uploading all required assets to a 3rd location, or maybe an AWS S3 bucket, Google Drive, Dropbox etc.? That way those assets always have a unique, known URL, and both teams can access them independently. As long as the names remain the same, so will the URLs.

ckeditor 4.5 fileUploadRequest event not firing

I have a textarea with html id "id_textarea".
editor = CKEDITOR.inline( 'id_textarea', {
    filebrowserBrowseUrl: 'browse_url',
    filebrowserUploadUrl: 'upload_url'
} );
editor.on( 'fileUploadRequest', function( evt ) {
    console.log( "This is not printing" );
} );
Whenever I try to upload a file, it doesn't print anything to the console. Am I doing something wrong?
By the way, my requirement is to add CSRF headers before sending a request, for which I need to catch some event like fileUploadRequest.
I assume that you are trying to upload files via the Upload tab in the Image Properties dialog. That upload is provided by the File Browser plugin and its fileButton, which do not support the fileUploadRequest and fileUploadResponse events (there is already a feature request with a more in-depth description of this case).
If you would like to use these events for some custom request preprocessing, you can use the Upload Image plugin. The configuration process is described in the official docs, but keep in mind that it will work only for dropping or pasting files. Upload via the Image Properties dialog will still be handled by the File Browser plugin which does not support these events.
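If you do go the Upload Image route, the event hands you the underlying XHR before it is sent, so adding a CSRF header would look roughly like this (the header name and the token getter are placeholders for whatever your backend expects):
editor.on( 'fileUploadRequest', function( evt ) {
    var xhr = evt.data.fileLoader.xhr;
    // placeholder header name and token source; match your backend
    xhr.setRequestHeader( 'X-CSRF-Token', getCsrfToken() );
} );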
The important thing here is that since CKEditor 4.5.6, the File Browser plugin uses the CSRF header, so it can probably be used on your server side without any modifications to the JavaScript code. So if you are using an older version, I suggest updating to 4.5.6 (using e.g. CKBuilder) and trying to integrate it with your backend. The CSRF header in the File Browser plugin should be sufficient for your needs.

How can I authenticate with google inside of a firefox plugin

I'd like to add a calendar entry from my Firefox plugin to the user's Google calendar (with their authorization, of course). Unfortunately, I can't seem to figure out how to authenticate with Gapi within the context of the Firefox SDK.
I tried including the client.js from gapi directly as a module in my source, but this isn't effective, since it can't access the window object. My next attempt was something akin to what I do with jQuery - load it in a content script:
googleClient.js
var tabs = require("sdk/tabs");
var self = require('sdk/self');
function initAuth() {
    var worker = tabs.activeTab.attach({
        url: 'about:blank',
        contentScriptFile: [self.data.url('gapi.js'), self.data.url('authContentScript.js')]
    });
}
exports.initAuth = initAuth;
main.js:
var googleClient = require('./googleClient');
I get the following problem:
console.error: foxplugin:
Error opening input stream (invalid filename?)
In the ideal situation, it would open a new window in the browser that allows the user to login to Google (similar to what happens when one requests access to the oauth2 endpoint from within a "real" content script).
I had the same problem, so I've made an npm plugin for that. It's called addon-google-oauth2 and works for Google OAuth2, tested with the AdSense API. It's really simple; it just calls the REST APIs for OAuth2. Steps:
Create an OAuth2 client for a native application. No web or Android, just native.
If your addon uses jpm, OK; if it uses cfx, please migrate to jpm.
Download and save the dependency with npm:
npm install addon-google-oauth2 --save
Follow the tutorial in the README.md file. It's easy, just two API calls (see the sketch after this list):
refreshToken(options,callback);
getToken();
Insert the HTML and JS files into your data/ directory.
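A rough sketch of those two calls, going only by the API names above (the option fields are assumptions; check the package README for the exact shape):
var oauth2 = require('addon-google-oauth2');

// assumption: option names follow the package README; fill in your client values
oauth2.refreshToken({
    clientId: 'YOUR_CLIENT_ID',
    clientSecret: 'YOUR_CLIENT_SECRET'
}, function() {
    // once refreshed, read the current access token
    console.log('access token:', oauth2.getToken());
});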

Fine Uploader Basic To S3

Does anyone know if Fine Uploader supports its uploaderType: 'basic' mode in conjunction with an S3 endpoint?
Their documentation is a box of Christmas lights and I can't make heads or tails of which options work with which versions of the uploader.
Using this code, and not including the #qq-template they provide, I get the error below:
var uploader = new qq.s3.FineUploader({
    uploaderType: 'basic',
    element: document.getElementById("fineUploader"),
    request: {
        endpoint: "mybucket.s3.amazonaws.com",
        accessKey: "MY_AWS_PUBLIC_ACCESS_KEY"
    },
    signature: {
        endpoint: "/s3/signtureHandler"
    },
    uploadSuccess: {
        endpoint: "success.html"
    }
});
Error: Cannot find template script at ID 'qq-template'!
However, according to their docs (Fine Uploader Getting Started) it seems as though this is the correct way to get rid of the UI and handle that myself. Except it doesn't work.
Thanks for any help.
You are confusing the jQuery plug-in workflow with the no-dependency workflow. Just like with the traditional endpoint handler, you simply need to use the FineUploaderBasic constructor. As the documentation illustrates, all S3 endpoint handler modules are appropriately namespaced:
var uploader = new qq.s3.FineUploaderBasic({...
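Filled out, a minimal core-mode (no UI) S3 setup might look like this, reusing the endpoints from the question; note that in core mode you wire up file picking yourself (the input id below is illustrative):
var uploader = new qq.s3.FineUploaderBasic({
    request: {
        endpoint: "mybucket.s3.amazonaws.com",
        accessKey: "MY_AWS_PUBLIC_ACCESS_KEY"
    },
    signature: {
        endpoint: "/s3/signtureHandler"
    },
    callbacks: {
        onComplete: function(id, name, response) {
            console.log(name, response.success ? "uploaded" : "failed");
        }
    }
});

// core mode renders no UI, so hand it files yourself, e.g. from a file input
document.getElementById("fileInput").addEventListener("change", function() {
    uploader.addFiles(this.files);
});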
Fine Uploader supports a wide variety of workflows, endpoints, and features. It's tough to fit all of that into the documentation in a way that is intuitive for 100% of our users. However, for the most part, the current setup has been well received. If you have a specific suggestion for improvement, please open an issue in the GitHub project's issue tracker. We will soon make it easier for users to edit the documentation as well.