Strategies for Hosting E-learning SWFs

Our training department has developed a number of interactive training videos in SWF format. Is it common to use a separate dedicated host for large numbers of SWFs (as is often done with video), or are SWFs usually lightweight enough to be hosted on the same platform as the main site without much additional impact?
Please pardon my ignorance here... thanks!

I would think the biggest impact you're going to see on your servers (aside from the disk space required to store the SWF files) is the bandwidth consumed as people download these videos.
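As a rough back-of-the-envelope (the numbers here are assumptions, not from the question): if a typical training SWF is 5 MB and it gets 2,000 views a month, that's about 10 GB of monthly transfer, which most ordinary hosting plans absorb easily; at 50,000 views it becomes roughly 250 GB, and a CDN or dedicated host starts to make sense.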

Is Electron a good choice for an app dealing with a large amount of data?

I'm working on a web app to download, decrypt, and create a report of a user's data. This report could be over 100 GB, with individual files of up to 5 GB. The initial hope was to achieve this in the browser, but memory limitations, especially with a 5 GB file, have scrapped this idea. Instead, the new plan is to provide a standalone app to compile and download the report.
A suggestion has been put forward to use Electron. I'd like to know: is this viable, or will Electron suffer from the same limitations as the browser?
I'll provide my own response based on experience.
"Is this viable, or will Electron suffer from the same limitations as the browser?"
Electron has no such limits. It is perfectly capable of streaming large files and large numbers of files. In our usage we validated it with 500 GB and 12 million individual files.
There are other limits. For example, if you add 100k+ files to a single zip, it takes a long time to extract, so other strategies may be required. However, this is not an Electron limitation.
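To make the streaming point concrete, here is a minimal sketch of the kind of pipeline Electron's Node.js side makes available; the cipher choice, key handling, and file paths are illustrative assumptions, not details from the answer above:

```typescript
// Minimal sketch: stream a large encrypted file through a decryption
// step and into an output file. pipeline() moves data in small chunks,
// so even a 5 GB file never has to fit in memory, unlike the
// Blob/ArrayBuffer approach a browser forces.
// The cipher, key, and paths below are assumptions for illustration.
import { createReadStream, createWriteStream } from "fs";
import { createDecipheriv } from "crypto";
import { pipeline } from "stream/promises";

async function decryptLargeFile(
  src: string,
  dest: string,
  key: Buffer,
  iv: Buffer
): Promise<void> {
  const decipher = createDecipheriv("aes-256-cbc", key, iv);
  await pipeline(createReadStream(src), decipher, createWriteStream(dest));
}
```

Anything that is a Node stream (hashing, decompression, HTTP downloads) can be chained the same way without buffering whole files.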
One limitation of Electron that does matter is its lack of support for FIPS, i.e. enterprise/government-level security. In our case its omission means we need to rewrite the client in C++. It was possible to get FIPS support on Linux, but only with a good amount of effort, by building against BoringSSL. As there are no BoringSSL releases for Windows, we switched to C++.

Does slicing a bigger image (2 MB to 5 MB) into tiles help it load on the web with better appeal?

Well, I am torn between best practice and a nice-to-have feature, and I need your opinion before embarking on a fruit(less/ful) endeavor.
To improve server performance, it's been suggested that we make fewer server calls. But I dislike the part where a big file takes a long time to load. I would rather load the file in chunks for better appeal (the way Google Maps loads in layers/tiles).
What is the take of community on this?
Thanks
It's better to use a CDN (Content Delivery Network) to serve the static images in your web page; host them on a separate server or cloud environment.
You'll surely see the performance improvement.
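On the tiling idea itself: a common pattern is to pre-slice the large image and load each tile only when it scrolls into view. A minimal sketch, assuming the tiles are already cut and hosted on a CDN (the host name and file naming scheme below are made up):

```typescript
// Minimal sketch: lazy-load pre-sliced image tiles from a CDN as they
// scroll into view. Assumes markup like:
//   <img class="tile" data-row="0" data-col="3">
// The CDN base URL and tile naming scheme are illustrative assumptions.
const CDN_BASE = "https://cdn.example.com/tiles";

const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = `${CDN_BASE}/${img.dataset.row}-${img.dataset.col}.jpg`;
    obs.unobserve(img); // each tile is fetched only once
  }
});

document
  .querySelectorAll<HTMLImageElement>("img.tile")
  .forEach((img) => observer.observe(img));
```

Tiling does mean more requests, so it pays off mainly for very large images where most tiles start off-screen.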

How do I know how much traffic my WordPress/BuddyPress-based social media website can handle? What to do when traffic goes up?

Right now I'm paying 5 dollars a month for hosting at godaddy.com. Although there are no users registered yet (it's closed in maintenance mode while I'm testing and building it), it's slower than e.g. Facebook. Does anyone have experience with BuddyPress? What happens if my site blows up and draws a lot of users very fast? I guess I can get more expensive, better-quality hosting, but is there a limit for BuddyPress-based sites, especially when I'm using quite a few plugins?
BuddyPress scales quite high, so the code itself won't be a problem, even with tens of thousands of users. Your problems will probably be imposed by your host (limits on database transactions or table sizes) or by specific themes taking a long time to render.
Firebug can be a great tool to use if you want to identify which component is causing a site to be slow; see the instructions on using Firebug.

Basic knowledge for a high-traffic application

Thanks for all the questions and responses posted here. This site usually shows up whenever I search for information on Google, and in many cases the answers are relevant to the issues I need solved.
I want to preface my question by stating that I've been programming (.NET, XML, T-SQL, AJAX, etc.) for less than two years, and I still have a lot to learn; so, pardon my ignorance.
Here's my situation (and question): I'm building a social web application, which I know will attract a lot of traffic in a short time. As a result, what basic information do I need so that I'm not overwhelmed? It's currently a one-man affair, and here is the hosting specification I plan to start with: 2 GB RAM, 600 GB HDD, 1000 GB bandwidth, and a 2.13 GHz dual-core processor.
I've read about web farms, but I've never had an opportunity to use one, so I'm not entirely sure how to phrase this question: how can one split the same application across multiple physical servers? How do you make all the files act as one entity? And since every .NET application requires a web.config, how is it shared among the copies on these multiple servers?
I've built smaller projects before, but this is the first big project I'm building, and to be frank, I'm a little intimidated. So, I would like to ensure I know what I'm getting into before starting.
Thank you.
Based on your background I assume you are developing in a .NET environment? If so, I highly recommend you take a look at Windows Azure. Developing your app against Azure will allow you to deploy it on Microsoft's cloud platform. Once deployed, you can shrink and grow your resources according to demand without the relative hassle of setting up multiple servers in multiple locations and managing it all. This lets you pay for a "little bit" of server up front, and if your app gets popular you can easily pay for "web farm"-like power and geographic diversity. It also gives you a decent framework for developing an app that will scale relatively well. That's an 18,000-foot overview. If you can put some more details in your question, I'm sure you will get more detailed responses. Best of luck!
Your "social web application" will not have any users if it isn't working and deployed. Don't worry about scaling much until the site actually does something useful and has a few hundred users (or at least a few dozen!). Get it working, find people around you who can help when the going gets tough, and keep at it. Otherwise your concerns about needing to scale will never be warranted.

Best way to update multi-gigabyte program (DVD fulfillment? Updater software?)

Two years ago, we shipped a multi-gigabyte Windows application, with lots of video files. Now we're looking to release a significant update, with approximately 1 gigabyte of new and changed data.
We're currently looking at DVD fulfillment houses (like these folks, for example), which claim to be able to ship DVDs to our customers for $5 and up. Does anyone have any experience with these companies?
We've also looked at a bunch of network-based "updater" software. Unfortunately, most of these tools are intended for much smaller programs. Are there any libraries or products which handle gigabyte-sized updates well?
Thank you for your advice!
BITS (Background Intelligent Transfer Service) is a Microsoft technology for downloading files piece by piece using idle bandwidth. You can basically have your clients trickle-download the new video files. The catch, however, is that you'll have to update your program to use BITS first.
Depending on who the end user is, you have a few options:
Shipping DVDs
This option tends to be rather expensive, and it may not be the best way: what if you are shipping to someone who no longer has the software installed?
HTTP hosting (using Akamai or any other CDN)
This works rather well for other companies, for example Apple, and I believe Microsoft as well.
BitTorrent
It is not just used for illegal content. It lets you offload some of the workload of sending the file, and it is a fast protocol; if you make sure that the seeding machine has the correct file, the BitTorrent protocol will ensure the end user gets the same file with the exact same hash (a minimal verification sketch follows this list).
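That hash check is easy to run regardless of delivery method. Here is a minimal sketch; the file name and expected digest are placeholders, not values from the answer:

```typescript
// Minimal sketch: verify a downloaded update against a published hash,
// streaming the file so even multi-gigabyte downloads use little memory.
// The file name and expected digest are placeholders.
import { createHash } from "crypto";
import { createReadStream } from "fs";

function sha256OfFile(path: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const hash = createHash("sha256");
    createReadStream(path)
      .on("data", (chunk) => hash.update(chunk))
      .on("end", () => resolve(hash.digest("hex")))
      .on("error", reject);
  });
}

const EXPECTED = "<digest published alongside the update>";

sha256OfFile("update-1.0.bin").then((actual) => {
  console.log(actual === EXPECTED ? "update verified" : "corrupt download");
});
```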
You can use the rsync algorithm: http://samba.anu.edu.au/rsync/
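For context, the trick that makes rsync cheap for gigabyte-sized updates is its rolling weak checksum, which lets the receiver slide a window over the old file one byte at a time and only run expensive comparisons on candidate block matches. A minimal sketch of that checksum (the modulus and the byte values in the sanity check are illustrative):

```typescript
// Minimal sketch of the Adler-32-style rolling checksum used by the
// rsync algorithm. A real delta-transfer implementation pairs this weak
// checksum with a strong hash on candidate block matches.
const MOD = 65521;

// Checksum of data[start .. start+len-1], computed from scratch.
function weakSum(data: Uint8Array, start: number, len: number) {
  let a = 0, b = 0;
  for (let i = 0; i < len; i++) {
    a = (a + data[start + i]) % MOD;
    b = (b + a) % MOD;
  }
  return { a, b };
}

// Slide the window one byte to the right in O(1): subtract the byte
// leaving the window, add the byte entering it.
function roll(a: number, b: number, outgoing: number, incoming: number, len: number) {
  const a2 = (((a - outgoing + incoming) % MOD) + MOD) % MOD;
  const b2 = (((b - len * outgoing + a2) % MOD) + MOD) % MOD;
  return { a: a2, b: b2 };
}

// Sanity check: rolling from window [0..4) to [1..5) matches a fresh sum.
const data = new Uint8Array([10, 20, 30, 40, 50]);
const s = weakSum(data, 0, 4);
const rolled = roll(s.a, s.b, data[0], data[4], 4);
const fresh = weakSum(data, 1, 4);
console.log(rolled.a === fresh.a && rolled.b === fresh.b); // true
```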
