I'm trying to deploy my Node.js app to Heroku with static videos totaling about 160 MB (all videos together). However, the upload speed is only 30 Kb/s and it fails at around 50%.
Is there a different way to deploy large files with your app? I know about cloud storage, but I don't think it should be necessary since these are static files.
EDIT
Well, one way I found was to deploy them separately. That way it's not too much to handle at once, and thankfully no single video is more than 25 MB. But it would still be good to find an alternative.
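If you do end up keeping the videos alongside the app, one option is to push them as their own minimal app that does nothing but serve static files. A rough sketch, assuming an Express server and a ./videos folder (all names and paths here are illustrative):

```ts
// server.ts -- sketch of a separate Heroku app that only hosts the videos.
// The ./videos folder and route prefix are illustrative.
import express from "express";
import path from "path";

const app = express();
const port = Number(process.env.PORT) || 3000; // Heroku injects PORT at runtime

// Long-lived cache headers so browsers don't re-download unchanged files.
app.use(
  "/videos",
  express.static(path.join(__dirname, "videos"), { maxAge: "30d" })
);

app.listen(port, () => console.log(`video host listening on ${port}`));
```

The main app then just links to the second app's /videos/... URLs, so a slow or failed push of the video host never blocks deploying the app itself.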
Related
My app's APK size is around 145 MB. It contains 9-10 GIF images totaling around 100 MB, so I can't upload the app to the Google Play Store.
So I am trying to implement an app bundle by following this link:
https://medium.com/@AndreSand/android-app-bundle-96ac16b36875
The app bundle builds successfully with the .aab extension, but it is no smaller, and when I try to upload it to the Google Play Store it gives an error that an APK file exceeds 100 MB.
Google Play currently requires that your APK file be no more than 100MB. For most applications, this is plenty of space for all the application's code and assets. However, some apps need more space for high-fidelity graphics, media files, or other large assets. Previously, if your app exceeded 100MB, you had to host and download the additional resources yourself when the user opens the app. Hosting and serving the extra files can be costly, and the user experience is often less than ideal. To make this process easier for you and more pleasant for users, Google Play allows you to attach two large expansion files that supplement your APK.
Read APK Expansion Files.
FYI
The new app publishing format, the Android App Bundle, is a more efficient way to build and release your app. The Android App Bundle lets you more easily deliver a great experience in a smaller app size.
It is better to upload a small APK; otherwise users may never download the app from the Play Store. If you have large images or GIFs and you don't have a server, you can use Firebase Storage, where you can easily upload your files and then fetch them by their download paths.
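For illustration only, fetching a hosted file with the Firebase JS SDK (v9 modular) looks roughly like this; on Android you would use the platform SDK instead, but the flow is the same. The bucket name and paths below are made up:

```ts
// Sketch: resolve a download URL for a GIF stored in Firebase Storage.
// Bucket name and file path are illustrative, not real config.
import { initializeApp } from "firebase/app";
import { getStorage, ref, getDownloadURL } from "firebase/storage";

const app = initializeApp({ storageBucket: "my-app.appspot.com" });
const storage = getStorage(app);

// The app ships without the heavy assets and fetches them on demand,
// keeping the APK itself under the Play Store size limit.
async function gifUrl(name: string): Promise<string> {
  return getDownloadURL(ref(storage, `gifs/${name}`));
}
```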
I've built a webapp to host low-res proxies of our team's video files. The webapp is primarily for tagging and searching video. Additionally, I'd like to be able to play a random playlist of clips on TVs around the office. I've implemented this by "Casting Tab" to a Chromecast, and it works fine.
However, now I'm running up against the bandwidth limitations of my host. Latency and everything else is fine, but running a single TV's 2.5 Mbps stream 8 hours a day for 23 days a month comes to about 207 GB/month, 20% of my allotted 1 TB monthly transfer.
How can I build something that will "cache" these clips client-side, so that it doesn't re-download them unnecessarily? There are about 1000 clips. I'd prefer to keep it connected to my webapp via the browser or some API endpoint so the RAND() stream of clips is constantly updated as people add to it.
Note: I asked a related question yesterday, and it seemed to fix my specific issue, but it doesn't seem to have worked at scale, so I'm broadening the approach a bit: Browser Caching of images and videos served via php query strings
Shaka Player has built-in support for offline playback, along with a pretty good API for listing offline assets, and removing them again.
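A minimal sketch of that offline API (the manifest URL is illustrative; note that store() returned a plain Promise in Shaka v2 but returns an abortable operation with a .promise field in v3+):

```ts
// Sketch: download a clip into Shaka's offline storage (IndexedDB)
// and list what's already cached. The manifest URL is illustrative.
import shaka from "shaka-player";

const storage = new shaka.offline.Storage();

async function cacheClip(manifestUri: string): Promise<void> {
  const stored = await storage.store(manifestUri).promise; // v3+ API
  console.log("stored offline as", stored.offlineUri);
}

async function listCachedClips(): Promise<void> {
  const assets = await storage.list(); // metadata for every stored asset
  for (const a of assets) {
    console.log(a.offlineUri, a.appMetadata);
  }
}
```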
This would require that you have your videos in MPEG-DASH format. Luckily, Google also has a tool available for that: Shaka Packager can take your MP4s and package them for MPEG-DASH, provided the MP4s follow some simple requirements.
You could probably build something yourself using mechanisms similar to Shaka Player's, but it seems much easier to use Shaka for this.
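If you do want to roll your own, the browser primitive is a service worker plus the Cache API. A rough sketch (the cache name and /clips/ URL prefix are made up, and note that Range requests for video seeking don't interact cleanly with the Cache API, which is part of why a player-level solution is easier):

```ts
// sw.ts -- sketch of a service worker that caches clips on first fetch.
// Cache name and URL prefix are illustrative; Range-request handling
// for seeking is deliberately ignored here.
declare const self: ServiceWorkerGlobalScope;

const CLIP_CACHE = "clips-v1";

self.addEventListener("fetch", (event) => {
  const req = event.request;
  if (!req.url.includes("/clips/")) return; // only intercept clip files

  event.respondWith(
    caches.open(CLIP_CACHE).then(async (cache) => {
      const hit = await cache.match(req);
      if (hit) return hit; // served from disk: no bandwidth used
      const res = await fetch(req);
      if (res.ok) cache.put(req, res.clone()); // keep a copy for next time
      return res;
    })
  );
});
```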
I have an installer for my Windows app and it is quite big (>100 MB).
I am also using the ClickOnce deployment framework, so each time I issue an update all my users have to download the installer. We tried using Amazon S3 to store the setup file, but it turns out that download speeds differ significantly across the globe, e.g. in the US the download speed is several Mbps, while in Europe or China it is less than 30 Kbps, which is unacceptable.
However, when I download most apps from the internet, the download speed is usually good and doesn't depend this much on server location. How is this problem usually solved?
Big companies like Microsoft use a content delivery network (CDN), which ensures that no matter where you connect from, you are assigned a download server as near as possible to your current location.
I know this has probably been answered before, but I couldn't find an answer.
We have a client that is currently selling video DVDs with dance classes. He wants to convert it to an app for obvious reasons.
The DVD has 90 minutes of video (divided into 8 chapters), so we estimated that it would be around 1.6 GB in size. I'd like to know best practices for this.
We would like to ship the 8 videos embedded in the app, to avoid the user having to download chapters once they open the app (and to avoid hosting fees).
We are targeting iOS 6 because most of his customers have the latest iOS devices. We don't want to stream the video; it should play locally, for various reasons.
Is 1.6 GB too much for an app? Any suggestions for this scenario?
Thanks in advance
(iOS only) App Size Tips
iOS apps can be as large as 2 GB, but consider download times when determining your app’s size. Minimize the file’s size as much as possible, keeping in mind that there is a 50 MB limit for over-the-air downloads.
Too big, I think. Personally I would make the chapters downloadable and perhaps offer a 'download all' option for anyone who wants to get all the videos in one hit.
Making it modular (i.e. app and content separate) should also make it easier to add content or update the app when required.
Why not include the first chapter and build a download option for the rest of them? Best of both worlds.
Don't forget you can re-encode the video to reduce the size. If it's a DVD rip, you'll be getting a 480p version at most. Play around with bit rates to find an acceptable file size.
I am trying to implement a new feature in our Delphi project that will let our users make online backups to our servers, so I've used the Indy FTP component to build a form that uploads and downloads the user's files and folders.
But I need to offer the ability to do incremental backups, where our users upload only the 'new' differences in their files and folders. So, for example, if I have a text file of 5 KB and I add another 2 KB of text to it, the backup should upload just those extra 2 KB and NOT the whole 7 KB.
So can someone please recommend any approach, algorithm to start with?
Thanks for your time
Note: we are using Delphi 7
There is a Microsoft Delta Compression API that will allow you to diff and patch; however, I have investigated this route before and found that using rsync or robocopy is a much easier path, especially as it will be quite hard to apply the patch/changes on the server side unless you build your own custom FTP server.
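For reference, the heart of the rsync approach is a cheap rolling checksum: one side checksums fixed-size blocks of the old file, the other slides a window over the new file one byte at a time looking for blocks that already exist remotely, and only unmatched bytes are transferred. A language-agnostic sketch of that checksum (TypeScript here purely for illustration; the block size is up to you, and a real implementation confirms weak matches with a strong hash such as MD5):

```ts
// Sketch of rsync's rolling weak checksum. Real delta transfer pairs
// this with a strong per-block hash (e.g. MD5) to confirm matches.
const MOD = 65521; // Adler-32 style modulus

// Checksum of one block: a = plain sum of bytes, b = position-weighted sum.
function weakChecksum(block: Uint8Array): { a: number; b: number } {
  let a = 0;
  let b = 0;
  for (const byte of block) {
    a = (a + byte) % MOD;
    b = (b + a) % MOD;
  }
  return { a, b };
}

// Slide the window one byte (drop `out`, append `inp`) in O(1), which is
// what makes scanning a whole file for already-known blocks cheap.
function roll(
  sum: { a: number; b: number },
  out: number,
  inp: number,
  blockLen: number
): { a: number; b: number } {
  const a = (sum.a - out + inp + MOD) % MOD;
  const b = (sum.b - blockLen * out + a + MOD * blockLen) % MOD;
  return { a, b };
}
```

In the question's 5 KB to 7 KB example, the first 5 KB match blocks the server already has, so only the appended 2 KB (plus a little block metadata) goes over the wire.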