Google Drive, OneDrive, or Box: which is the fastest API method for displaying complete folder contents? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 years ago.
We are working on a web application for photo selection/proofing. To manage the photo albums, we want to use a file-sharing service such as Dropbox, Google Drive, or OneDrive.
One of our use cases is displaying albums containing a large number of photos inside our web application. We currently use Dropbox to fetch complete folders (albums), but since the Dropbox API doesn't offer batch requests, generating individual share links for display takes a very long time (hours).
I am checking other APIs myself, but I would really appreciate a little guidance from people with experience of other APIs such as Google Drive, OneDrive, or Box.
What will be the best method/api to achieve the use case efficiently?
I suspect that with batch operations available, we could fetch an album of 3,000 photos (120 KB average size) in under 5 minutes on a normal internet connection.
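As a rough sanity check on that estimate, here is the arithmetic, assuming 3,000 photos at 120 KB each and (my assumption, not stated in the question) an effective download speed of 10 Mbit/s:

```python
# Back-of-the-envelope check of the transfer-time estimate.
# The 10 Mbit/s connection speed is an assumption for illustration.
photos = 3000
avg_size_bytes = 120 * 1024         # 120 KB per photo
bandwidth_bps = 10_000_000 / 8      # 10 Mbit/s expressed in bytes/second

total_bytes = photos * avg_size_bytes
seconds = total_bytes / bandwidth_bps
print(f"{total_bytes / 1e6:.0f} MB, ~{seconds / 60:.1f} minutes")
```

So the 5-minute target is plausible on a modest connection, provided the metadata and link generation don't dominate (which is exactly the Dropbox bottleneck described above).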

All the services you mention (Dropbox, OneDrive, Google Drive) offer a thumbnail service. My advice is to use that functionality first.
So (supported by all services):
1. Request metadata about all files in the folder (one call), including each file's thumbnail URL and id. This request is normally paged; for 3,000 images you need only three requests (if the page size is 1,000, the maximum for the Google Drive API).
2. Display the thumbnails (using the cached URLs).
3. Retrieve the actual image, based on the cached id, only when the user selects a thumbnail. That is one call.
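The paged metadata fetch in step 1 can be sketched as a small loop. `fetch_page` below stands in for a real API call (e.g. Google Drive's `files.list` with `pageSize=1000` and `thumbnailLink` in the requested fields); here it is simulated so the paging logic is visible:

```python
# Sketch of the paged metadata fetch, assuming a Drive-style API that
# returns a page of files plus an optional nextPageToken.
def list_all_files(fetch_page, page_size=1000):
    """Collect (id, thumbnail URL) metadata across all pages."""
    files, token, calls = [], None, 0
    while True:
        page = fetch_page(page_token=token, page_size=page_size)
        calls += 1
        files.extend(page["files"])
        token = page.get("nextPageToken")
        if not token:
            return files, calls

# Simulated backend holding 3,000 photo records (names are illustrative).
ALL = [{"id": f"img{i}", "thumbnailLink": f"https://example.test/t/{i}"}
       for i in range(3000)]

def fake_fetch(page_token=None, page_size=1000):
    start = int(page_token or 0)
    resp = {"files": ALL[start:start + page_size]}
    if start + page_size < len(ALL):
        resp["nextPageToken"] = str(start + page_size)
    return resp

files, calls = list_all_files(fake_fetch)
print(len(files), calls)   # all 3000 records arrive in 3 requests
```

With a real client the only change is swapping `fake_fetch` for the service's list call; the three-request figure for 3,000 images follows directly from the 1,000-item page size.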
We integrated with all three services (and S3), and in my experience the performance is more than enough for this kind of scenario.

Related

Store images in the file system, a database, or blob storage? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
I have roughly 600 static images that I need to store and use for my web app, and I was wondering what options I have for this application.
What is the typical procedure?
What are my options, and what are the pros and cons of each?
Thanks ahead of time.
You should store static resources that you want to serve from your site somewhere under the wwwroot folder. I recommend putting them in an images subfolder, but you can use whatever organization works for you. There are many reasons why it can be worthwhile to use a Content Delivery Network (CDN) for serving your static resources, including scripts, stylesheets, and images, in which case you might want to store your images there. For example, Amazon CloudFront is an inexpensive CDN service you can use for this purpose. A CDN will speed up your page load times, since the images load in parallel with your site's other assets; it reduces load on your server; and the CDN hosts the images on edge servers that are geographically close to the client, so clients on the other side of the world from your server will get the files faster than if they loaded them from your server.
Overall this isn't so much an ASP.NET Core question as a general website question. ASP.NET Core will serve static resources (as long as you have the static files middleware installed), but beyond that it doesn't play much of a role. Just put the files under wwwroot and you're good to go, unless you think it's worth using a CDN.
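The "local static folder vs. CDN" decision usually comes down to how image URLs are built. A framework-agnostic sketch (in Python rather than ASP.NET, and with a placeholder CDN hostname) of that switch:

```python
# Minimal sketch: serve images from a CDN when one is configured,
# otherwise from the site's own static folder. The CDN base URL is a
# placeholder, not a real endpoint.
def image_url(filename, cdn_base=None, static_prefix="/images"):
    """Build the public URL for a static image."""
    if cdn_base:
        return f"{cdn_base.rstrip('/')}/{filename}"
    return f"{static_prefix}/{filename}"

print(image_url("logo.png"))                                    # local
print(image_url("logo.png", cdn_base="https://cdn.example/"))   # via CDN
```

Keeping this decision in one helper (or the framework's equivalent, such as a tag helper in ASP.NET Core) means you can move 600 images to a CDN later without touching every template.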

Fastest image delivery: from a local server, a CDN, or the cloud? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 5 years ago.
My situation is as follows. I have a reasonably fast website that hosts around 200,000 images. At the moment I use these images without any problem (reasonably fast image loading) and display them on my website. But now I realise that I could download all 200,000 images onto my local server using a small Python script, or upload them to a cloud service or a CDN.
I've seen well-known websites like Google request their data from a separate domain, and most probably a separate server.
So my question is this: which is the best way to store a large number of image files for the fastest delivery on my web page? Is it a local server, an external server, or strategically placed servers like a CDN uses? I'm under the impression that data is transferred fastest within the same server, and hence that it would be best to keep the images on my local server.
"Fastest" depends on many factors, such as:
- the region where the server is located
- the bandwidth of the server and the user
- the server's RAM size and hard-disk read/write speed
Comparing cloud, CDN, and local hosting with the same specifications, I would choose the cloud, because its reliability is better than a local server's.

Are there any bandwidth limits for GDrive and Dropbox? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
I want to store my website's images, JS, CSS, and Flash files in the Google Drive and Dropbox services, and load those files from GDrive and Dropbox into my website pages.
So I have a doubt about their bandwidth limitations: do these services impose any bandwidth limits?
For Dropbox, from https://www.dropbox.com/help/4204/en:
Links are automatically banned if they generate an uncommonly large amount of traffic. For Basic accounts, the total amount of traffic that all of your links together can generate without getting banned is 20 GB per day. For Pro and Business accounts, the limit is 200 GB per day.
If your account hits our limit, we'll send a message to the email address registered to your account. Your links will be temporarily disabled, and those who try to access them will see an error page instead of your files.
Note that this is for share links, not for access via the API.
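To put the quoted 20 GB/day Basic limit in perspective, here is how many image loads it covers, assuming roughly 120 KB per image (a figure borrowed from the first question above, not from Dropbox's documentation):

```python
# How far Dropbox's quoted Basic share-link limit (20 GB/day) stretches,
# assuming ~120 KB per image.
limit_bytes = 20 * 1024**3   # 20 GB/day
img_bytes = 120 * 1024       # ~120 KB per image
print(limit_bytes // img_bytes, "image loads per day")
```

That is on the order of 175,000 image views per day, so the limit matters only once a site sees meaningful traffic; at that point a CDN is the more appropriate tool anyway.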

When not to use AJAX client-side routing? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Using client-side routing (with Angular's $routeProvider for example), it is possible to load a whole web app at once and not interact with the server anymore.
However this might imply a long load time when visitors first open the web app assuming it has a lot of views.
What are the best practices in terms of client-side routing vs. initial load time?
Well, the default behaviour of Angular is to have the whole app front-loaded, but it depends on what your app does and how big it is. If it's a small app, you could do that. If your app is huge (unlikely, given that all the app will do is show some static data), then it's not really a good idea to load everything up front. For a smaller app it would be OK, especially if you minify everything. But for larger apps, what if you have 10 MB of scripts and resources? You'd be putting a lot of strain on your server and eating your customers' bandwidth. For large apps you can dynamically load scripts as routes change; we do something similar in a pretty huge Angular app.
The best practice would be to only get the files that are required to generate the content you want to show to the user on that specific route, which goes against what you want to do. Regarding "not interact with the server anymore", well if your app relies on a server to get some data or do some authentication, you can't really stop interacting with the server.

Design Project Insight and Ideas [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 3 years ago.
First note: I'm not asking for code, more for insight into the process flow.
I have a client who is a photographer. He takes pictures of his clients.
He wants to be able to upload them to his site via a web interface.
He also wants a directory for each client, i.e. www.photographer.com/clientname.
The pictures will be retouched and resized JPGs.
What is the best method for storing those images? MySQL?
What method should I use so that my client can create new directories on the web server?
From there I would essentially list the JPGs by file name and hyperlink them to the actual image that the client would click on.
Thoughts and ideas are appreciated.
Does your client want actual directories on the server? Or does he simply want a directory-like URL? If it's the latter, I would use a CMS or mod_rewrite to abstract those URLs. You can have your own system of directories for image storage, and store only their filenames/paths in the database.
A CMS would also make your life and his life a lot easier. I know the more popular ones (drupal, joomla, wordpress) have various image galleries/storage systems you can add in, some allowing multiple simultaneous file uploads, which is not the easiest thing to implement from scratch.
Unless your client wants to upload everything via FTP, it's probably best not to ask his opinion about the directories, because he probably won't know the difference. I think your best bet is to implement the URL system he wants, along with an easy system for him to add/remove images. A well-implemented CMS would make it even easier and offer far more features than FTP, because that's what they were designed for. It would also be less error-prone, because it will have safeguards against deleting directories, etc.
...Or if you want to reinvent the wheel, you could write your own CMS. Either way, it's best to abstract the storage process and let the client access the site through the controls you set up, which will also let him attach more meaning and functionality to the information he's storing.
Here are some recommendations:
- Store the image files on disk and assign each one a unique id. These unique ids can be mapped to each client in a database.
- To create new directories, either create them manually or have someone with admin access create them; restricting this to an admin makes the site more secure. The admin can create a new directory for a client and then enable settings allowing the client to access it.
- User-based access control would let a client view only those files that are specific to him or her.
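The first recommendation, unique ids on disk mapped to clients in a database, can be sketched in a few lines. Table, column, and file names below are illustrative, not from the question; an in-memory SQLite database stands in for MySQL:

```python
# Sketch: image files get unique ids; a database row maps each id to a
# client, which also yields the /clientname/filename style URLs the
# photographer wants. Names are illustrative placeholders.
import sqlite3
import uuid

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE photos (id TEXT PRIMARY KEY, client TEXT, filename TEXT)")

def add_photo(client, filename):
    """Register a photo; the returned id would name the file on disk."""
    photo_id = uuid.uuid4().hex
    db.execute("INSERT INTO photos VALUES (?, ?, ?)", (photo_id, client, filename))
    return photo_id

def client_gallery(client):
    """Return /clientname/filename links for one client's photos."""
    rows = db.execute(
        "SELECT filename FROM photos WHERE client = ? ORDER BY filename", (client,))
    return [f"/{client}/{name}" for (name,) in rows]

add_photo("smith", "wedding_001.jpg")
add_photo("smith", "wedding_002.jpg")
add_photo("jones", "portrait_01.jpg")
print(client_gallery("smith"))
```

Because the public URL is derived from the database rather than from real directories, no per-client directories need to exist on the server at all, which is exactly the abstraction the CMS/mod_rewrite answer above suggests.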
