FilePond doesn't work with video/* type in Safari

I'm using the FilePond library in React. It works fine in Chrome, but not in Safari.
My issue: when I set acceptedFileTypes={['video/*']},
in Chrome it accepts all video types;
in Safari it only accepts .mov files, not all video file types.
When I set acceptedFileTypes={['video/mp4']}, it behaves as expected in both browsers and accepts only mp4 files.
I don't want to list every possible video type explicitly; I'd like to use the video/* filter.
Safari version: 13.1.1
<FilePond
  ref={fileRef}
  files={files}
  allowMultiple={false}
  labelIdle={labelIdle}
  className="video"
  instantUpload={false}
  onupdatefiles={(items) => {
    setFiles(items)
  }}
  onremovefile={() => toggleDisplayProgress(false)}
  acceptedFileTypes={['video/*']}
  required
/>
I can see the video types that are registered here: https://trac.webkit.org/browser/trunk/Source/WebCore/platform/MIMETypeRegistry.cpp, as described in this ticket.

I ended up using quicktime and mp4, which worked:
acceptedFileTypes={['video/quicktime', 'video/mp4']}
There's no way to use video/* on Safari.

Browsers don't all detect file types in the same manner, and depending on the OS this might differ. The fileValidateTypeDetectType hook helps detect the right type. You can use it to manually set the right type based on, for example, the file extension.
FilePond.create(document.querySelector('input'), {
  acceptedFileTypes: ['image/png'],
  fileValidateTypeDetectType: (source, type) =>
    new Promise((resolve, reject) => {
      // Do custom type detection here and resolve the promise
      resolve(type);
    })
});
See: https://pqina.nl/filepond/docs/patterns/plugins/file-validate-type/#custom-type-detection
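For the video/* case in Safari, one possible workaround along those lines (a sketch, not the library's documented fix) is to derive the type from the file extension. This assumes the source passed to the hook is a File object with a name property, and the extension map below is only an illustrative subset:
// Illustrative extension-to-MIME map; extend it to the formats you expect.
const videoTypeByExtension = {
  mp4: 'video/mp4',
  mov: 'video/quicktime',
  webm: 'video/webm',
  m4v: 'video/x-m4v',
};

<FilePond
  acceptedFileTypes={['video/*']}
  fileValidateTypeDetectType={(source, type) =>
    new Promise((resolve) => {
      // Fall back to the browser-reported type when the extension is unknown.
      const extension = (source.name || '').split('.').pop().toLowerCase();
      resolve(videoTypeByExtension[extension] || type);
    })
  }
/>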

Related

Audio Recording Mac app with Visualizations unable to select input device with AudioKit

Hey, I wonder if someone can point me in the right direction.
I am building a Mac audio app that needs to perform the following actions:
Select an audio input device
Show a live audio graph of the device input (think GarageBand / Logic)
Capture the audio for replay
Output the sound or make itself selectable as an input device for another app, similar to how plugins work for GarageBand, Logic, etc.
I have looked into using the following so far:
AudioKit
AVFoundation / AVCaptureDevice
AudioKit
This framework looks amazing and has sample apps for most of the things I want to do; however, it seems that it will only accept the audio input that the Mac has selected in settings. This is a non-starter for me, as I want the user to be able to choose in-app (like GarageBand or Neural DSP plugins do).
Using AudioEngine I can get the available input devices, but everything I have found suggests they can't be changed in-app. Here's code to display them:
struct InputDevicePicker: View {
    @State var device: Device
    var engine: AudioEngine

    var body: some View {
        Picker("Input: \(device.deviceID)", selection: $device) {
            ForEach(getDevices(), id: \.self) {
                Text($0.name)
            }
        }
        .pickerStyle(MenuPickerStyle())
        .onChange(of: device, perform: setInputDevice)
    }

    func getDevices() -> [Device] {
        AudioEngine.inputDevices.compactMap { $0 }
    }

    func setInputDevice(to device: Device) {
        // set the input device on the AudioEngine
    }
}
Alternatively
AVFoundation
This has a nice API for listing devices and setting the input, but when it comes to working with the data the delegate provides, I don't have the first clue how I would handle it in terms of creating an audio graph and saving the data for replay. Here's the delegate method for reference:
extension Recorder: AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        print("Audio data received")
        // Needs to save audio here
        // Needs to play through speakers or other audio source
        // Needs to show audio graph
    }
}
It would be great if someone with experience using these could advise whether this is possible with either, and where I can look for examples / guidance.
Any help, pointers, or life savers would be appreciated.
Thanks if you got this far!

Three.JS - AudioAnalyser() not working with audio source as Stream type in Safari

I'm developing a streaming radio in 3D using Three.JS. I'm sending music over a PeerConnection to my clients and attaching a THREE.AudioAnalyser() to display 3D bars that move according to the frequencies.
Sound works great on all platforms, but THREE.AudioAnalyser() with an input source of stream type only works on Chrome; on Safari it is not working at all.
var listener = new THREE.AudioListener();
var audio = new THREE.Audio( listener );
audio.setMediaStreamSource( stream );
audioAnalyser = new THREE.AudioAnalyser( audio, 128 );
function loop(){
  console.log(audioAnalyser.getFrequencyData());
}
The console.log() in the loop() function should print an array of integers. On Chrome all is good; Safari logs [0,0,0,0,0,0,0,0].
What could be causing this issue? It seems to work everywhere but Safari, and it only seems to fail when the source is a stream.
Not 100% sure, but you might want to connect the output of the AnalyserNode to the destination node. You may want to stick a GainNode with a gain of 0 in between, just in case you don't really want the audio from the AnalyserNode to be played out.
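Not part of the original setup, but a minimal sketch of that suggestion, assuming the variables from the snippet above (THREE.AudioAnalyser exposes the underlying AnalyserNode as .analyser, and the listener exposes the shared AudioContext as .context):
// Route the analyser through a muted GainNode to the destination so the
// graph keeps pulling samples through it; gain 0 keeps the extra path silent.
var context = listener.context;
var mute = context.createGain();
mute.gain.value = 0;
audioAnalyser.analyser.connect( mute );
mute.connect( context.destination );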

SVG map does not show in Firefox

My problem is that the SVG-based map does not show in Firefox.
Chrome, Edge and Opera are OK; only Firefox isn't.
The site is: http://2018.tvep.hu/tagletszam-alakulasa.html
I appreciate any help! Thanks: Laszlo Varga
The filter inner-shadow is invalid because the first feComposite references an undefined offset-blur as its in2. Firefox is, as usual and rightfully, more strict and doesn't just ignore the problematic input. Removing the in2="offset-blur" makes it work, because the primitive will then automatically use the result of the previous filter primitive (from the SVG 1.1 spec):
If no value is provided and this is a subsequent filter primitive, then this filter primitive will use the result from the previous filter primitive as its input.
Adding result="offset-blur" to the feOffset above it would achieve the same.
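For illustration only (not the site's actual filter definition), the shape of that second fix looks like this: name the feOffset result so the feComposite's in2 reference resolves in Firefox as well:
<!-- illustrative sketch: result="offset-blur" makes the in2 reference valid -->
<feOffset in="SourceAlpha" dx="0" dy="3" result="offset-blur"/>
<feComposite operator="out" in="SourceGraphic" in2="offset-blur"/>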

Code responsible for browsing Firefox's desktop

Where is the code responsible for showing the desktop when you go to a file URL representing a folder in Firefox? Something like the relevant interface or XUL file?
The implementation is pretty arcane and ancient.
The file channel handler will create an nsDirectoryIndexStream. This stream returns a specially crafted text-only representation of the listing (try View Source to see what it looks like). The file channel handler also sets a special MIME type, APPLICATION_HTTP_INDEX_FORMAT = "application/http-index-format".
Via the nsIStreamConverterService, a stream converter implemented in nsIndexedToHTML then produces the final output stream, doing an application/http-index-format -> text/html conversion.
Finally, the output HTML links a style sheet via chrome://global/skin/dirListing/dirListing.css, which is in fact part of the platform-specific themes, to give the result a more native-looking appearance.

WP7 play many compressed (mp3, wma etc) audio files simultaneously/dynamically

For size reasons I need to bundle a WP7 app with compressed audio (mp3, wma, etc.). How do I play these freely/simultaneously?
The XNA framework only supports WAV files, so unless there is a pure C# managed-code library somewhere to decompress mp3/wma/ogg on the fly, the next option would be...
MediaElement. But I don't get good results with MediaElement. It seems that you need to add a MediaElement specifically as a tag in the XAML, and you can't use several instances (several tags). As soon as I play a certain MediaElement I can't play another MediaElement on the same page. I can't find anything about such a restriction in the reference (the reference is very sparse).
I also tried dynamically creating MediaElement objects, but that doesn't seem valid at all, or at least I cannot get it to play the files.
Use the built-in XNA content pipeline sound effect compression!
The default setting for a SoundEffect content type is "Best" compression quality (which appears to, in fact, be no compression at all). Set the compression quality to "Low" and you will get a much, much smaller file. Or "Medium" for a nice balance between size and quality.
To change this setting, select your .wav file in the solution explorer, press F4 to bring up the properties window, expand the "Content Processor" node, and change the compression quality setting that appears.
Step by step:
Create a new WP7 XNA game project (or otherwise get an XNA content project).
Add a wav file to the content project.
Press F4 with the wav file selected in the Solution Explorer to bring up the properties window.
Expand the "Content Processor" node and change the compression quality to the desired setting.
A setting of "Best" gives no compression (raw waveform); "Medium" and "Low" give a much smaller file.
In my experience there's currently no good solution for this on WP7. Either you use wavs with XNA and grow the size of the xap, or you use mp3s with the very limited MediaElement functionality and compromise on what you can implement with it.
You might be able to port some C# audio libraries to WP7; I haven't heard of any so far, so it might be a long shot.
In one of my apps I finally decided to go with the wav + XNA combination after playing around with different options for a good while.
using Microsoft.Xna.Framework.Media;

void PlaySound(string pathToMp3) {
    Song song = Song.FromUri("name", new Uri(pathToMp3, UriKind.Relative));
    Microsoft.Xna.Framework.FrameworkDispatcher.Update();
    MediaPlayer.Play(song);
}
You could use MediaElement and set the source to the mp3, but this cannot be changed from code, as in:
MediaElement me = sender as MediaElement;
me.Source = new Uri(
as you cannot load resources into the source.
You could use multiple MediaElements in your XAML and stop them and start the ones you require, as long as you predetermine at compile time which files you want to load.
You could also combine those files into one and play from a particular position, like so:
me.Position = new TimeSpan(0, 0, 0, 0, 1);
me.Play();
