I have a question: I am looking to deploy my app to Laravel Forge. My app currently saves images locally on the server, but once I scale the app and add load balancers, I don't think saving images on individual servers is good practice.
For example, when one server goes down, some of the images won't load.
What is a good way to handle this?
Hopefully someone can shed some light on this topic. Here are some of my questions:
How do big e-commerce websites manage the images for their sites?
Are there any best practices to consider when deciding where to keep a website's images?
I have heard of keeping the images in a folder structure on the same server where the website is hosted, so the site can render them quickly since everything is local - is this the ideal solution?
How do professionals and big e-commerce sites handle image storage and keep their images reliable and available?
Is Azure, AWS, etc. the best place to store images for a website to render?
Thanks in advance!
Keeping the files on the same server comes with more risk: if your server crashes or the region goes down, your application stops and your files won't render either, which also breaks any other clients (such as mobile applications) that consume those files separately.
Users will also see high loading times for those media files if they are not in the same region where your application is hosted.
The best practice is to store the image/media files in cloud storage such as S3 or Azure Blob Storage and then put a CDN such as CloudFront or Azure CDN in front of it.
You can then serve your media files via the CDN, which acts as a global cache for them.
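For what it's worth, here is a minimal sketch of that pattern in Python with boto3; the bucket name, CloudFront domain and key layout are placeholders, not real values:

    import boto3

    s3 = boto3.client("s3")

    BUCKET = "my-app-media"                      # assumed bucket name
    CDN_DOMAIN = "dxxxxxxxxxxxx.cloudfront.net"  # assumed CloudFront distribution domain

    def store_image(local_path, key):
        # Push the original file to S3; the CDN caches it at the edge on first request.
        s3.upload_file(
            local_path, BUCKET, key,
            ExtraArgs={"ContentType": "image/jpeg", "CacheControl": "max-age=31536000"},
        )
        # Persist this URL in your database and render it in your pages/apps.
        return "https://{}/{}".format(CDN_DOMAIN, key)

    print(store_image("avatar.jpg", "uploads/users/42/avatar.jpg"))

The application only keeps the returned URL; the heavy lifting of serving the bytes moves to S3 and the CDN edge.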
We are currently developing a service for sharing photos of people's interests, using the technologies below (we're newbies, by the way):
For the backend:
Node.js
MongoDB
Amazon S3
For the frontend:
iOS
Android
Web (AngularJS)
Storing and serving images is a big deal for our service (it must be fast), so we are thinking about performance. We stored photos in MongoDB first, but we have since switched to AWS S3.
So,
1. Our clients can upload images from the app(s).
2. We handle these images in Node.js and send them to AWS S3 (sketched after this list).
3. S3 sends a URL back to us.
4. We save the URL on the user's related post.
5. When the user wants to see his photos, the app fetches them by their URLs.
6. Finally, the images go from S3 to the user directly.
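A rough sketch of steps 2-4, written in Python with boto3 and pymongo purely for illustration (the same flow applies in Node.js; the bucket, database and field names are assumptions):

    import boto3
    from pymongo import MongoClient

    s3 = boto3.client("s3")
    posts = MongoClient()["photoapp"]["posts"]   # assumed database/collection names

    def attach_photo(post_id, image_bytes, key):
        # Step 2: hand the uploaded bytes over to S3.
        s3.put_object(Bucket="photo-uploads", Key=key,
                      Body=image_bytes, ContentType="image/jpeg")
        # Steps 3-4: build the object URL and store it on the user's post.
        url = "https://photo-uploads.s3.amazonaws.com/{}".format(key)
        posts.update_one({"_id": post_id}, {"$set": {"photo_url": url}})
        return url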
Is this a good way to handle the situation, or is there a better way to do it?
Thanks
I am building a website where users can upload images and later view them on different devices. I am planning to store images on S3, while my webserver will be running on EC2.
Now, I am unsure whether to serve images directly from S3 to the client (browser, app, etc.) or to serve them through my web server in between.
If I serve directly from S3, the web server will be less loaded, but I need to authenticate requests going directly to S3 (as a user should only be able to view his/her own images).
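One common way to do that is to let the web server authorize the request and then hand the browser a short-lived presigned S3 URL, so the image bytes never pass through EC2. A minimal sketch with boto3 (the bucket name and key scheme are placeholders):

    import boto3

    s3 = boto3.client("s3")

    def image_url_for(user_id, image_id):
        # Called only after the app has checked that the requesting user owns this image.
        key = "users/{}/{}.jpg".format(user_id, image_id)
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": "user-images", "Key": key},
            ExpiresIn=300,  # the link stops working after 5 minutes
        )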
Similarly, should I upload images directly to S3 without bringing my web server in between?
And in which case will it be more expensive (bandwidth utilization, etc.)?
thanks!
I have a server side API running on Heroku for one of my iOS apps, implemented as a Ruby Rack app (Sinatra). One of the main things the app does is upload images, which the API then processes for meta info like size and type and then stores in S3. What's the best way to handle this scenario on Heroku since these requests can be very slow as users can be on 3G (or worse)?
Your best option is to upload the images directly to Amazon S3 and then have it ping you with the details of what was uploaded.
https://devcenter.heroku.com/articles/s3#file-uploads
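Roughly, that pattern looks like the sketch below (Python/boto3 purely for illustration; the Ruby AWS SDK offers an equivalent presigned-POST helper, and the bucket/key names are assumptions). The API only signs the upload; the phone POSTs the file straight to S3 and then calls the API back with the key, so the slow 3G transfer never ties up a dyno:

    import boto3

    s3 = boto3.client("s3")

    def signed_upload(user_id, filename):
        # Returns a URL plus form fields the client uses to POST the file directly to S3.
        return s3.generate_presigned_post(
            Bucket="app-uploads",
            Key="incoming/{}/{}".format(user_id, filename),
            Conditions=[["content-length-range", 0, 10 * 1024 * 1024]],  # cap uploads at 10 MB
            ExpiresIn=600,
        )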
I have a unique setup that I am trying to determine whether Heroku can accommodate. There is so much marketing around polyglot applications, but I can only actually find one example!
My application consists of:
A website written in Django
A separate Java application, which takes files uploaded by users, parses them, and stores the data in a database
A shared database accessible by both applications
Because these user-uploaded files can be enormous, I want the uploaded file to go directly to the Java application. My preferred architecture is:
The Django-generated webpage displays the upload form.
The form does an AJAX submit to the Java application
The browser starts polling the database to see if the Java application has inserted the data (a sketch of such a polling endpoint follows this list)
Meanwhile the Java application does its thing with the user-uploaded file and updates the database when it's done
The Django webpage AJAX-refreshes a div with the results of the user upload once the polling mechanism sees that the upload is complete
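To picture the polling step, a minimal Django view the browser could hit every few seconds might look like this (the Upload model and its processed flag are made up for the sketch):

    from django.http import JsonResponse
    from myapp.models import Upload  # hypothetical model whose rows the Java app updates

    def upload_status(request, upload_id):
        # The browser polls this endpoint and swaps in the results div once "done" is true.
        upload = Upload.objects.filter(pk=upload_id).first()
        return JsonResponse({"done": bool(upload and upload.processed)})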
The big issue I can't figure out is whether I can get both the Django and the Java apps running either on the same set of dynos or on different dynos but under the same domain, to avoid AJAX cross-domain issues. Does Heroku support URL-level routing? For example:
Django application available at http://www.myawesomewebsite.com
Java application available at http://www.myawesomewebsite.com/javaurl/
If this is not possible, does anyone have any ideas for work-arounds? I know I could have the user upload the file to Django and have Django send the request to Java from the server-side instead of the client side, but that's an awful lot of passing around of enormous files.
Thanks so much!
Heroku does not support routing based on the URL path. Polyglot components should live on their own subdomains and operate in a cross-domain fashion.
As a side note: have you considered uploading directly to S3 instead of uploading to your app on Heroku, which will then (presumably) upload to S3? If you're dealing with cross-domain file uploads, this is worth considering for its scalability.
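If you do go the direct-to-S3 route, the cross-domain part is handled by a CORS rule on the bucket that allows your site's origin; a sketch with boto3 (the bucket name is a placeholder):

    import boto3

    s3 = boto3.client("s3")
    # Allow the browser at www.myawesomewebsite.com to PUT/POST files straight to the bucket.
    s3.put_bucket_cors(
        Bucket="myawesomewebsite-uploads",
        CORSConfiguration={
            "CORSRules": [{
                "AllowedOrigins": ["http://www.myawesomewebsite.com"],
                "AllowedMethods": ["PUT", "POST"],
                "AllowedHeaders": ["*"],
                "MaxAgeSeconds": 3000,
            }]
        },
    )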