Getting the duration of an mp3 on a website - Xamarin

I'm using NAudio.Wave in my Xamarin Forms app. I have an mp3 file stored on the device and can get its duration quite simply using NAudio. However, I also have an mp3 file on my website and need to find that file's duration.
Can I use NAudio.Wave to find the length of the mp3 without downloading the file itself?
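For the local file, NAudio's Mp3FileReader reports the duration directly via TotalTime. For the remote file, a rough sketch of one workaround, assuming the server supports HTTP range requests and the mp3 is constant-bitrate, is to fetch only the first few kilobytes, read one MPEG frame header with NAudio's Mp3Frame, and estimate the duration from Content-Length and the frame's bitrate. The class and method names and the 8 KB window below are illustrative, not from the question:

using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using NAudio.Wave;

static class Mp3DurationSketch
{
    // Local file: NAudio exposes the duration directly.
    public static TimeSpan LocalDuration(string path)
    {
        using (var reader = new Mp3FileReader(path))
            return reader.TotalTime;
    }

    // Remote file: fetch only the first ~8 KB, read one MPEG frame header,
    // and estimate duration as total bytes * 8 / bitrate. Only reasonable
    // for CBR files that don't start with a large ID3v2 tag.
    public static async Task<TimeSpan> EstimateRemoteDurationAsync(string url)
    {
        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Range = new RangeHeaderValue(0, 8191);
            using (var response = await http.GetAsync(url))
            {
                long totalBytes = response.Content.Headers.ContentRange?.Length
                                  ?? response.Content.Headers.ContentLength
                                  ?? 0;
                var head = await response.Content.ReadAsByteArrayAsync();
                using (var ms = new MemoryStream(head))
                {
                    var frame = Mp3Frame.LoadFromStream(ms); // first frame header
                    if (frame == null || totalBytes == 0)
                        return TimeSpan.Zero;
                    return TimeSpan.FromSeconds(totalBytes * 8.0 / frame.BitRate);
                }
            }
        }
    }
}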

Related

Reading a wav/ogg file with TarsosDSP (Android)

I searched the web for an example of reading a wav or ogg audio file with the TarsosDSP library on Android but couldn't find any.
Please provide a brief code example for Android that loads an audio file from the app's resource directory (R.raw.audioFile) into TarsosDSP.
I tried PipedAudioStream, but it requires a string input while R.raw.audioFile is an int resource ID.

NativeScript-VideoPlayer Android local videos not displaying

Using the nativescript-videoplayer plugin with Angular, I am able to view and play videos both remotely and locally on iOS. On Android, however, I am only able to play remote videos. For local videos, the video player is displayed, but the time stays at 0:00 and there is no picture, which usually means that the video source cannot be found.
I have tried specifying the source with the following paths:
src="res://videos/test.mp4"
src="~/videos/test.mp4"
And have tried placing the location of the video under:
App_Resources/videos/test.mp4
App_Resources/android/videos/test.mp4
App_Resources/android/src/main/videos/test.mp4
For a simplified example, I added the local videos to this playground: https://play.nativescript.org/?template=play-ng&id=lCu2B5&v=50
You can put your video file in any folder under your app's src. To play it, though, you need to get the full absolute path of the video and assign it to the video player's src property.
For example, say your app's src folder contains videos > test.mp4. You can get the path to the mp4 file with the NativeScript fileModule:
fileModule.path.normalize(fileModule.knownFolders.currentApp().path + "/videos/test.mp4");

Why can't I use .mp4 and .jpg files in my Xamarin Forms Android project?

In my project (under Drawables) I have .png, .jpg and .mp4 files, but when I run the project the .png files are the only ones that show up correctly in the app. If I compare, for instance, how an .mp4 renders next to the .png, the .mp4 comes out blank (white), and the .jpg looks the same as the .mp4.
When I right-click the files and check the build action, they are all set to "AndroidResource". I add the resources both through my XAML file and directly in code, with the same outcome.
I am following this guide to implement a video background in Xamarin.Forms on Android: https://www.junian.net/2017/03/fullscreen-video-background-control-for-xamarin-forms.html
Any idea why it isn't working and why the solution cannot recognize/find these files?
It is right there in the tutorial you referenced:
...remember that video file on Android need to be stored under Assets directory...
You are adding your .mp4 under Resources/Drawables; you need to add it to the Android Assets directory instead.
Xamarin Guide to Using Android Assets
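A minimal sketch of the asset side in the Xamarin.Android project, assuming the file is added as Assets/video.mp4 with its Build Action set to AndroidAsset (the file name and the MediaPlayer usage are illustrative, not taken from the linked guide):

using Android.Content;
using Android.Content.Res;
using Android.Media;

// e.g. called from an Activity, where 'context' can be 'this'
MediaPlayer PlayBundledVideo(Context context)
{
    // OpenFd works here because .mp4 assets are packaged uncompressed by default.
    AssetFileDescriptor afd = context.Assets.OpenFd("video.mp4");
    var player = new MediaPlayer();
    player.SetDataSource(afd.FileDescriptor, afd.StartOffset, afd.Length);
    player.Prepare();
    player.Start();
    return player;
}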

Copy/Paste Audio Samples in Cocoa

I am creating a little audio editor using the Cocoa frameworks for Mac OS X. I implemented "copy" by writing the selected samples to a temporary WAV file, loading that file into an NSSound via its URL, and then pushing the data to the pasteboard (writeToPasteboard:).
Now I am working on paste. I create an NSSound from the pasteboard and am now stuck. How do I get access to the samples?
You don't. NSSound doesn't give you access to raw sample data.
You'll need to use the Audio Toolbox framework instead. It provides two APIs for reading and writing audio in files: Audio File Services is the lower-level of the two, whereas Extended Audio File Services supports compression and other features.
You may want to use the callback-based APIs in Audio File Services, since these could save you from having to use a temporary file. Instead, you'd set an NSMutableData object as the “client data” of the audio file, then implement “write” by copying the bytes into the relevant range of that data (setting its length first if needed) and “read” by copying some range of the data out.

WebM-encoded file plays in Chrome but not in Firefox 6 HTML5 video

I'm trying to use the fancy new HTML5 video element and I was wondering:
I encoded a high-resolution .mov container to WebM format in VLC v1.1.9 (although an FFmpeg command line would be extremely valuable if you have one handy), and it plays just fine in Chrome, but it won't open in Firefox. Does anyone have any ideas, or a direction I should be looking in?
I had this problem, and the fix was to change the Apache server config to associate *.webm files with the video/webm MIME type. I did this by creating an .htaccess file; it could also be done in Apache's mime types config.
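For reference, the .htaccess route can be a single directive (assuming AllowOverride permits it on your server):

AddType video/webm .webm

As for the FFmpeg command line the question asks about, something along these lines is a common way to produce WebM with a reasonably recent FFmpeg build (input.mov, the bitrate, and the libvpx/libvorbis encoders are assumptions about your setup):

ffmpeg -i input.mov -c:v libvpx -b:v 1M -c:a libvorbis output.webm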
