Joomla on EC2 instance: Where do the files go?

This is a very simple question, but I haven't found the answer anywhere.
I am thinking about moving one of my websites, a Joomla website, to the cloud, more specifically to an EC2 instance on Amazon. I have been watching some videos and the process seems rather simple. However, I haven't found any information about where the files are stored.
Are they in an S3 bucket? Are they saved somewhere else? How do I access the files?
Can I use CloudFront to serve images and other files?
How does the whole integration process between EC2, S3 and CloudFront work for a hosted website?
Thank you!

In my experience with EC2, you interact with the instance just as you would with any headless server in the basement. You can connect over SSH and install software, store files, etc., as normal.
It's great if you want a whole server to play with, but if it's just website hosting you're after, it might be overkill.
S3 can serve up static pages, but you can't install or run any server-side code there.
I don't know about CloudFront.
It seems to me, though, that what you need is standard website hosting.

Related

Allowing users of laravel app to connect their own amazon aws s3, or DO spaces S3 to the app to serve files - possible?

First post here, although I've been reading for about 7 years :). Please take it easy on me if anything is wrong with my post.
I am working on a social network/marketplace for online education. The business problem I am dealing with is how to delegate the cost of hosting (most importantly video content) to the vendors themselves, in order to simplify the platform's relationship with them.
As an example, Patreon.com integrates with Vimeo to upload videos from the Patreon website straight to Vimeo, so the platform does not incur the cost of hosting creator videos.
My question is whether it would be possible to do the same thing with Amazon S3 or DigitalOcean Spaces. Instead of our app (Laravel + Vue) having its own hosting on Amazon S3, each user of our app would connect their own S3 storage, access their own buckets/objects, and share them on our platform. That way they pay their own hosting bills and we don't have to deal with "re-selling" hosting.
Has anyone gone the route of letting app users enter their own S3 credentials, storing them in the database, and then, as ml59 suggested, setting those credentials on the fly when retrieving the user's S3 files?
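To make the question concrete, this is roughly what I mean by "setting credentials on the fly" (a sketch in Python with boto3 purely to illustrate the idea; our Laravel app would go through the AWS SDK for PHP instead, and the credential fields, bucket name and region below are hypothetical values we would load from our own database):

    import boto3

    def s3_client_for_user(user_creds):
        # Build an S3 client from the credentials a vendor stored with us
        # (hypothetical schema; in Laravel this would use the AWS SDK for PHP).
        return boto3.client(
            "s3",
            aws_access_key_id=user_creds["access_key"],
            aws_secret_access_key=user_creds["secret_key"],
            region_name=user_creds.get("region", "us-east-1"),
        )

    # Hypothetical record loaded from our database for the logged-in vendor.
    user_creds = {
        "access_key": "AKIA...EXAMPLE",
        "secret_key": "...stored encrypted...",
        "region": "us-east-1",
        "bucket": "vendor-video-library",
    }

    s3 = s3_client_for_user(user_creds)

    # List the vendor's objects so they can pick what to share on the platform.
    for obj in s3.list_objects_v2(Bucket=user_creds["bucket"]).get("Contents", []):
        print(obj["Key"], obj["Size"])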
Thanks,

Migrate existing Squarespace site to AWS

I have a Squarespace website I made for myself a while back. The main purpose at the time was to have something to link to from my iOS app, and I opted for something expedient rather than thinking long term, just to get the app released. Fast forward to now and I have an AWS EC2 instance where I could do more with a personal site in the future. Ultimately it would be nice to get it off Squarespace and not have to pay for another full-year billing cycle, but the renewal date is a pretty tight deadline at this point.
Nothing on this domain requires much more than frontend web code really, but a completely different page UI could take more time than I have for this. I'm wondering if there might be a way to just temporarily run the Squarespace page source as-is on EC2, so I can worry about a possible non-CMS design once I'm no longer worried about getting billed for another whole year by Squarespace.
I'm not sure if this is possible, but if not, it seems like I should just port the content to minimal, unstyled HTML files just to avoid the billing, or get billed for a shorter period. Billing seems like the limiting factor here. I would also need to add my new credit card to get billed for more time, which I have yet to do.
Basically, has anyone else dealt with this situation personally? What would you recommend I do? Does Squarespace even allow me to port to EC2 somehow, or is that more in the realm of WordPress? Thanks.
Note: Tomcat is what I'm using on the EC2 instance currently. I will also need to set up multiple sites per instance for this, but I believe that's the most relevant config info here, unless I'm forgetting something.
Not sure why you've already chosen to use Tomcat, as I don't see anything that would allow you to easily convert your Squarespace site to a Java webapp. It looks like Squarespace sites can be exported into WordPress, which you could host on an EC2 server.
Alternatively, you could use wget to create a static copy of your website, which you could then host easily on your EC2 server with Nginx, or skip EC2 and just host the static website on S3.
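If you go the wget route, something along these lines should produce a browsable static mirror (example.com stands in for your Squarespace domain, and the flags may need tweaking for your particular site):

    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.com/

You can then copy the resulting directory to the EC2 instance, or sync it to an S3 bucket set up for static website hosting.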

Amazon EC2 Drupal 6 Site

Before anything: I have never worked with the Amazon EC2 service; this is the first time I have even heard of it. I was asked to work on a Drupal 6 site and I need to upload a custom module. The client gave me a username and password to log into Amazon EC2, but told me nothing else, so I assumed their site was hosted there. I came upon the EC2 dashboard and, to my surprise (or maybe not), there were no running instances. If I understood properly, you need a running instance to act as the server; please correct me if I'm wrong. I might be misunderstanding it all and treating "instance" as if it were the virtual server itself (sort of like when you use virtual machines on your computer and instance == "virtual machine").
If there are no running instances, how is the site "up"? There must be a server somewhere answering the client's requests. Or are "instances" more like "working sessions"? The thing is, I don't want to meddle too much with the dashboard in case I mess something up, since this client has no staging site or repository. That's why I wasn't bold enough to create an instance.
Help is much appreciated.
You are correct: if the site is hosted on AWS EC2, there must be an EC2 instance running somewhere. Definitely check that you have selected the correct region in the upper right-hand corner of the console.
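If clicking through every region in the console gets tedious, a quick script can enumerate instances across regions for you. This is just a sketch using Python and boto3, and it assumes you have (or can create) API credentials for the client's account:

    import boto3

    # Any region works for the describe_regions call.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    regions = [r["RegionName"] for r in ec2.describe_regions()["Regions"]]

    for region in regions:
        client = boto3.client("ec2", region_name=region)
        reservations = client.describe_instances()["Reservations"]
        count = sum(len(r["Instances"]) for r in reservations)
        if count:
            print(f"{region}: {count} instance(s)")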
The only other possibility, and I don't think this would apply to Drupal, is that it is actually possible to host an HTML/CSS/JavaScript-only site entirely on AWS S3 (which would not require an EC2 instance), but that is not likely what you are dealing with.

No permanent filesystem for Heroku?

The app I am currently hosting on Heroku allows users to submit photos. Initially, I was thinking about storing those photos on the filesystem, as storing them in the database is apparently bad practice.
However, it seems there is no permanent filesystem on Heroku, only an ephemeral one. Is this true and, if so, what are my options with regards to storing photos and other files?
It is true. Heroku allows you to create cloud apps, but those cloud apps are not "permanent" - they are instances (or "slugs") that can be replicated multiple times on Amazon's EC2 (that's why scaling is so easy with Heroku). If you push a new version of your app, the slug will be recompiled, and any files you had saved to the filesystem in the previous instance will be lost.
Your best bet (whether on Heroku or otherwise) is to save user-submitted photos to a CDN. Since you are on Heroku, and Heroku uses AWS, I'd recommend Amazon S3, optionally with CloudFront enabled in front of it.
This is beneficial not only because it gets around Heroku's ephemeral "limitation", but also because a CDN is much faster, and will provide a better service for your webapp and experience for your users.
Depending on the technology you're using, your best bet is likely to stream the uploads to S3 (Amazon's storage service). You can interact with S3 through a client library that makes it simple to post and retrieve files. Boto is an example client library for Python - they exist for all popular languages.
Another thing to keep in mind is that Heroku filesystems are not shared either. This means you'll have to put the file on S3 from the same application that handles the upload (instead of, say, a worker process). If you can, load the upload into memory, never write it to disk, and post it directly to S3. This will increase the speed of your uploads.
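Here is a sketch of that idea using boto3 (the bucket name and key layout are made up; in a real app the bytes would come from your upload handler rather than a placeholder):

    import io
    import boto3

    s3 = boto3.client("s3")  # credentials come from the environment / Heroku config vars

    def store_photo(file_bytes, filename, content_type):
        # Stream the uploaded bytes straight from memory to S3, never touching disk.
        key = f"photos/{filename}"
        s3.upload_fileobj(
            io.BytesIO(file_bytes),
            "my-user-uploads",  # hypothetical bucket name
            key,
            ExtraArgs={"ContentType": content_type},
        )
        return key

    # In a web handler you would pass request.files["photo"].read() (or equivalent)
    # instead of this placeholder byte string.
    print(store_photo(b"...image bytes...", "avatar.jpg", "image/jpeg"))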
Because Heroku is hosted on AWS, transfers to S3 happen at very high speed. Keep that in mind when you're developing locally.

Cloud Storage of Protected Images and Fast Access

Writing an application for a client that is very photo-focused. Users will be uploading hundreds of images to their accounts, and these images are only accessible to their owners. Our first thought was to throw them all onto Amazon S3, but after doing some research we found that accessing protected files on S3 is too slow for the kind of response we need. Also, from further research, we're not sure that non-US users are going to have a good experience with S3 (still researching that one).
I'm wondering if I'm finding the right information or not. Is it possible to host images on S3 that have specific user rights and are pulled down quickly? Would we need to implement some sort of caching proxy to speed things up? Should I be looking at hosting the images on our own servers and delivering them that way? Thanks for any feedback.
Are you using the CloudFront CDN in front of S3? If not, that might be why it's a bit slow.
If you don't like Amazon, you could use Cloud Files from Rackspace Cloud. That uses Limelight as its CDN.
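On the access-control side, a common pattern is to keep the objects private and have your application hand out short-lived signed URLs after it has checked which user owns the photo. A minimal sketch with Python and boto3 (the bucket and key are made-up examples):

    import boto3

    s3 = boto3.client("s3")

    def signed_photo_url(bucket, key, expires_seconds=300):
        # Time-limited URL for a private object; issue it only after the app has
        # verified that the requesting user owns the photo.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": key},
            ExpiresIn=expires_seconds,
        )

    # Hypothetical example: user 42 requesting one of their own images.
    print(signed_photo_url("my-private-photos", "users/42/img_0001.jpg"))

CloudFront has an equivalent signed-URL mechanism if you end up putting it in front of the bucket, which combines the access control with CDN speed.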
