Loading storage files from another server - Laravel

I've put the database and files on separate servers (I've set up three servers for our web application behind a load balancer) and I use SFTP as the file storage driver.
Because I use the SFTP driver for Laravel file storage, the number of SSH connections to the storage server keeps growing; eventually the SSH port gets blocked and the files can no longer be loaded from the storage server.
What should I do? Is there any other way to load files from another server?

I recommend using cloud storage for this use case, where you can configure the CORS settings. You could, for instance, use AWS S3, or DigitalOcean Spaces, which has an API similar to S3's.
You basically post your files to that cloud storage and save the URL in your database.
Of course, you need to set up the CORS settings to say that server X may access specific files on the cloud storage.
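With Laravel's built-in S3 driver, a minimal sketch could look like this (assuming an s3 disk is configured in config/filesystems.php; the uploads path and the FileRecord model are hypothetical):

    use Illuminate\Support\Facades\Storage;

    // Store the uploaded file on the configured S3 disk ("uploads" is an example path).
    $path = Storage::disk('s3')->putFile('uploads', $request->file('document'));

    // Resolve the object's URL and save it to the database.
    $url = Storage::disk('s3')->url($path);
    FileRecord::create(['url' => $url]); // hypothetical Eloquent model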

Related

How to retrieve source code from Amazon Elastic Compute Cloud?

My application is hosted on Amazon Elastic Compute Cloud (EC2) by the developer. I need to retrieve the source code for my web application. I am a new user, so I need to know how I can download the source code to my local machine.
You need to log into the instance using SSH. If you're familiar with SSH, you can use SCP from your local machine.
If you're not familiar with it, you can use Systems Manager to transfer the data to S3, then download it from there:
https://aws.amazon.com/premiumsupport/knowledge-center/systems-manager-ssh-vpc-resources/
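If you go the SCP route, a typical command run from your local machine looks something like this (the key file, user name, host, and paths are placeholders for your own values):

    scp -i ~/.ssh/my-key.pem -r ec2-user@ec2-203-0-113-25.compute-1.amazonaws.com:/var/www/html ./app-source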

Amazon CloudFront - S3 or EC2?

I have an application hosted on an EC2 instance.
Now I want to serve all the static content used in the application from CloudFront.
I read from a source that CloudFront uses S3, EC2, or private servers to get the static files.
I can't decide what exactly to use. Can I use the same EC2 instance for this purpose? Is there a better option for this implementation?
Amazon CloudFront sits "in front" of your application on Amazon EC2 and/or your content in Amazon S3. It caches content in 50+ locations when people access your application.
For example, let's say you had a web app running on an EC2 instance serving HTML pages, and also some pictures in S3.
You would create a CloudFront distribution and configure two origins: one for your web server and one for your S3 bucket. Behaviours can be configured to tell CloudFront when to use each origin -- for example, serve *.htm URLs from EC2 and *.jpg URLs from S3.
Your users would then access your application via the supplied CloudFront URL. Content will be cached (if appropriate) at whichever of the 50+ CloudFront locations around the world is closest to each user, resulting in faster response times for your users.
You can also use your own domain name with CloudFront so that it has a more-friendly URL than the one supplied on the CloudFront distribution (which looks like d3i7tv8nzqzfbt.cloudfront.net).
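As an illustrative fragment (not a complete distribution config; the IDs and domain names are made up), the two-origin setup with a path-based behaviour maps onto CloudFront's configuration roughly like this:

    // Illustrative slice of a CloudFront DistributionConfig (hypothetical names).
    $origins = [
        ['Id' => 'ec2-web',   'DomainName' => 'app.example.com'],            // EC2 web server
        ['Id' => 's3-assets', 'DomainName' => 'my-assets.s3.amazonaws.com'], // S3 bucket
    ];

    // Serve *.jpg from S3; everything else falls through to the default
    // cache behaviour, which targets the EC2 origin.
    $cacheBehaviors = [
        ['PathPattern' => '*.jpg', 'TargetOriginId' => 's3-assets'],
    ];
    $defaultCacheBehavior = ['TargetOriginId' => 'ec2-web'];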
Put simply, Amazon S3 is used for cloud storage, whereas Amazon EC2 (Elastic Compute Cloud) is used to serve web pages (hosting), much like godaddy.com.

How to performantly connect an FTP server to a WebDAV server?

I have implemented a WebDAV directory in PHP using SabreDAV for my website (an application server web interface).
For this website I am now writing a TCP socket service in C#, which runs on another server (it is actually in the same datacenter, but for the sake of argument, assume it is on the other hemisphere).
The socket is actually a service that can start and stop applications (game servers in this case). I have also implemented an FTP service in this socket (for data transfer).
My Goal:
I want to connect my WebDAV server to the FTP server of my socket service, which means file listing, download, and upload. The use case is that a user only connects to a single service. Imagine my socket service running on more than one server.
If I implemented this with my current know-how, I would do it this way:
The user requests the WebDAV directory.
The server fetches a file listing from the FTP server.
The file listing is added dynamically to the WebDAV directory.
Now the user opens the directory and wants to download a file:
The WebDAV server requests the file from the FTP server.
The WebDAV server provides the downloaded file.
The WebDAV server deletes the provided file.
In the other direction, the WebDAV server accepts a file and then uploads it to the FTP server.
If the servers are not in the same datacenter, this costs traffic. In any case, I think it takes some time if the data is binary instead of text-based configs. Also, the client-side progress bar will not reflect the transfer between the WebDAV server and the FTP server while it is in progress (the user may think nothing is happening).
I hope I have successfully communicated where my problem is.
So how can I implement this without delegating an upload/download from one server to another? Is this even possible?
Bonus: would a solution like WebDAV-to-WebDAV or FTP-to-FTP be a better way of implementing it?
An easy way to achieve this is to use third-party software like WebDrive to map the FTP server's contents to a drive letter, then point the WebDAV server at that drive. Windows also provides an option to map a WebDAV/FTP URL as a drive letter so that the application can access it as if it were a local drive.
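If you stay with SabreDAV, another way to avoid the download-store-delete round trip is to have the DAV file node return a stream opened directly against the FTP server, so bytes are relayed to the client as they arrive. A minimal sketch (the class name, credentials, and paths are hypothetical, and allow_url_fopen must permit the ftp:// wrapper):

    use Sabre\DAV;

    // Hypothetical SabreDAV file node that streams straight from the FTP server.
    class FtpBackedFile extends DAV\File
    {
        private $name;
        private $ftpUrl;

        public function __construct($name, $ftpUrl)
        {
            $this->name   = $name;
            $this->ftpUrl = $ftpUrl;
        }

        public function getName()
        {
            return $this->name;
        }

        public function get()
        {
            // SabreDAV accepts a stream resource here, so the file is relayed
            // to the client without being written to a temporary file first.
            return fopen($this->ftpUrl, 'rb');
        }
    }

    // Usage (hypothetical credentials and path):
    $node = new FtpBackedFile('save.zip', 'ftp://user:pass@game-host/saves/save.zip');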

Can I store static files on nginx with CloudBees?

As far as I know, CloudBees uses nginx servers for routing requests.
I am wondering if there is a way to drop my static files on nginx so that it serves them and saves some load on my application server.
At runtime the file system is "ephemeral". There is more info about it here.
You should use an external service like Amazon S3. The Amazon S3 ClickStart explains how to use this service from CloudBees.
You can also use other services like Dropbox or Google Drive.

Amazon S3 database options and using EC2 as a REST API from S3

I am hosting my website on S3.
On my local host I am using Backbone.js on top of a PHP REST API that uses MySQL as a database.
So I launched an EC2 instance to host my REST API, but then realized this means cross-domain AJAX.
How can I use EC2 as a REST API if my index.html sits on S3?
What are my other DB options for S3?
Your JavaScript is being executed on web pages served from S3, and it has to access a REST API from a server you run on EC2. Unless the web pages and server are in the same domain (say, example.com), this will be a cross-origin request, prohibited by browsers by default.
Solution 1: have your S3 pages and your EC2 server in the same domain. S3 allows static website hosting that makes your S3 objects (web pages) available at the address of your choice. Put them and your EC2 server at addresses on the same domain, and it can work.
Solution 2: have your REST API server allow cross-origin requests. Since you control this EC2 server you can modify it to tell web browsers to allow pages from other domains to make such requests to your server. This involves setting several headers on your responses, and usually requires your server to respond to HTTP OPTIONS requests properly, too. See the CORS specification to get started on that.
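A minimal sketch of solution 2 in the PHP API (the allowed origin below is a placeholder for your S3 website endpoint):

    <?php
    // Allow the pages served from S3 to call this API (placeholder origin).
    header('Access-Control-Allow-Origin: http://my-bucket.s3-website-us-east-1.amazonaws.com');
    header('Access-Control-Allow-Methods: GET, POST, PUT, DELETE, OPTIONS');
    header('Access-Control-Allow-Headers: Content-Type');

    // Answer the browser's preflight request before any API logic runs.
    if ($_SERVER['REQUEST_METHOD'] === 'OPTIONS') {
        http_response_code(204);
        exit;
    }

    // ... normal REST API handling continues below ...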
You also ask about other DB options. If you keep your own EC2 server to provide the REST API, it can use pretty much any kind of database you like, either running on the same or other EC2 instances, or database-as-a-service offerings like Amazon RDS or AWS DynamoDB. But if you want to connect to the database directly from your web pages' JavaScript, you would have to use a service that provides an HTTP API directly and that supports CORS. RDS does not provide an HTTP API at all, and DynamoDB does not seem to support CORS at this time, so neither of them would work.
