Laravel Display Images in Shared Hosting

I've developed a site for a local charity whose budget is basically zero. I used Laravel, and I managed to source free hosting for life; they just need to pay for domain name renewal every year.
LCN Hosting provide the free hosting and a year's free domain name; however, it is a shared hosting package, which is less than ideal. To upload the site, I followed the usual steps: I renamed the public directory, placed everything else in a second directory, changed the paths in the index.php file and the autoload file, and deleted the cached config.php file.
The site runs, but at times it is a bit sluggish, and page load times are higher than is probably acceptable, whether viewed on a mobile or measured with Google Lighthouse or Screaming Frog.
I work between two machines, and my desktop has Laragon installed on a D drive, so when I transfer my project to my laptop, I change every D-drive reference in the config file to C. For example:
'path' => 'D:\\laragon\\www\\projectfinal\\storage\\framework/cache/data',
becomes:
'path' => 'C:\\laragon\\www\\projectfinal\\storage\\framework/cache/data',
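For reference, the stock config/cache.php avoids hardcoded drive letters entirely by deriving the path at runtime, so the same file works on both machines unchanged. (Absolute paths like the ones above are what Laravel bakes into the cached bootstrap/cache/config.php, which is why that cached file has to be deleted or rebuilt per machine.) The relevant default line:

// config/cache.php -- storage_path() resolves relative to the app root
// at runtime, so the same config works on D:, C: and the shared host.
'path' => storage_path('framework/cache/data'),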
I have no SSH access on the shared hosting, so I can't run any optimisation or cache commands once it's uploaded. Is it possible to spoof the location to match the folder structure of the shared hosting? I know a virtual or dedicated server would be much better, and if this weren't for a charity I would have used one. So any tips would be great.
Thanks in advance.
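A sketch of one common workaround for the missing SSH access (not from the original post; the route name and token check are illustrative): Artisan commands can be invoked from PHP, so a temporary, token-protected route can rebuild the caches after each upload and be deleted once it has run.

// routes/web.php -- TEMPORARY route to run cache commands without SSH.
// Protect it with a secret and remove it after the caches are built.
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Route;

Route::get('/rebuild-caches/{token}', function (string $token) {
    $secret = 'change-me'; // placeholder secret, replace before uploading
    abort_unless($token === $secret, 403);

    Artisan::call('config:cache'); // combine config/*.php into one cached file
    Artisan::call('route:cache');  // cache the route table
    Artisan::call('view:cache');   // precompile Blade templates

    return 'Done: ' . Artisan::output();
});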

Related

Is there a way to create a custom download link for a Google Drive file with my own domain

Sorry for my poor knowledge of this topic. I would appreciate it if anybody could explain to me how I can create a custom download link for a Google Drive file. I got a domain at freenom.com, and I want to change the download link of a Google Drive file to use my domain name in the URL.
What I'm trying to do is avoid the final docs.googleusercontent.com/docs/... URL by using a custom link with my domain.
Is that possible?
I know it sounds crazy, but my workplace blocks docs.google URLs with a proxy filter, along with other cloud services like multcloud.com and koofr.eu. Besides, using a web proxy or a VPN looks very suspicious to them.
So I need a way to download my files through my own URL without leaving a suspicious trace.
I know it's a nightmare, but you would never wish to live here. The only internet connection where I can download anything is at my workplace.
Any solution involving Google Colab may help me too. Maybe I could create a temporary server via Python in Google Colab and assign my domain name to it, but I just don't know how to do it. With flask-ngrok I created one in Google Colab, but the link I obtained goes through the ngrok domain, which is blocked too.
The objective is to create a custom download link to one file that doesn't leave a suspicious trace, and maybe in the future replace the file in Google Drive (renaming the new one so the link keeps the same path).
Any help will be appreciated.
Thanks for your time
Answer
I did it, finally I did it. I'm a beginner, so for me, finding a method to circumvent the stupid proxy filters at my workplace, without leaving anything suspicious, in order to download files from the internet, is great! Wonderful! I should also say that in my country the price of internet access is madness. Imagine having to pay the equivalent of 250 dollars of your salary for only 2 gigabytes of mobile data, and the equivalent of 375 dollars of your salary every month for only 30 hours of ADSL connection at a ridiculous 1 Mbit/s. Only at my workplace is the internet free and fast (at least for me). Add to that the fact that they watch every move you make in your internet traffic.
It's a communist country, so they control everything, and my workplace employs a man whose only job is to censor the internet and block any suspicious page he finds; if they catch you using a VPN or a web proxy, you're in big trouble. They block Google Drive, Dropbox and every other cloud storage service, as well as any download site that isn't related to your work.
But hey, I finally found a way. First, I got some free domains at freenom.com and gave them names related to my work, so they wouldn't leave a suspicious trace in my web traffic. Then I got a free hosting service and pointed each domain at a WordPress site dressed up to look work-related. Next, using my phone's data, I used Google Colab to download files from the internet into my Google Drive account for storage. After that, I used Google Colab again to upload the files from Google Drive to the FTP account of one of the WordPress sites, and I created download links for all my files under my own domain name. Finally, at my workplace, I prepared a curl download script that fetches my files intermittently, so it never looks like one big, traffic-consuming download. In the end I download the equivalent of 1 GB per day. Sad but true. I'm now working on a way to add a free SSL connection for my downloads, with an authentication system that won't let the censor access my links. Maybe with a WordPress plugin.
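The post doesn't include the download script itself. As a rough illustration of the idea (fetching a file in small HTTP Range chunks, resuming from wherever the local copy left off, so no single long transfer appears), here is a sketch in PHP rather than the author's curl shell script; the URL and chunk size are placeholders, and the server must honour Range requests:

<?php
// Sketch: download one chunk per run, resuming from the size of the
// partial local file. URL and chunk size are illustrative placeholders.
$url       = 'https://my-work-looking-domain.tld/files/archive.bin';
$localFile = 'archive.bin';
$chunkSize = 10 * 1024 * 1024; // 10 MB per run

$offset = file_exists($localFile) ? filesize($localFile) : 0;

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RANGE          => $offset . '-' . ($offset + $chunkSize - 1),
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
]);
$data = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

// 206 Partial Content means the server honoured the range; append it.
if ($code === 206 && $data !== false) {
    file_put_contents($localFile, $data, FILE_APPEND);
}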

How to upload huge files into webserver

I have a virtual machine on Google Cloud, and I've created a web server on it (Ubuntu 12.04). I will serve my website from this machine.
My website displays huge images in JPEG 2000 format. It also lets users upload their own images and share them with other people.
The problem is that the images are about 1-3 GB each, and I can't use standard file upload methods (a plain PHP file upload) because when the connection drops, the upload starts again from the beginning. Is there a better way?
I'm thinking about the Google Drive API. What if I create a common Google Drive account and users upload to it from my website using the API? Would that be a good approach?
Since you're uploading files to Drive, you can use the Upload API with uploadType=resumable.
Resumable upload: uploadType=resumable. For reliable transfer, especially important with larger files. With this method, you use a session initiating request, which optionally can include metadata. This is a good strategy to use for most applications, since it also works for smaller files at the cost of one additional HTTP request per upload.
However, do note that there's a storage limit on the account; if you want more capacity, you'll have to purchase it.
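With the official google/apiclient PHP library, the same resumable flow looks roughly like this (a sketch; the file name, MIME type, chunk size and credentials file are illustrative):

<?php
// Sketch: resumable upload to Drive with the google/apiclient library.
require 'vendor/autoload.php';

$client = new Google_Client();
$client->setAuthConfig('credentials.json'); // illustrative credentials file
$client->addScope(Google_Service_Drive::DRIVE_FILE);
$client->setDefer(true); // defer execution so create() returns a request we can stream

$service = new Google_Service_Drive($client);
$meta    = new Google_Service_Drive_DriveFile(['name' => 'big-image.jp2']);
$request = $service->files->create($meta);

$path      = 'big-image.jp2';
$chunkSize = 4 * 1024 * 1024; // send 4 MB per request
$media = new Google_Http_MediaFileUpload(
    $client, $request, 'image/jp2', null, true /* resumable */, $chunkSize
);
$media->setFileSize(filesize($path));

// Feed the file to Drive one chunk at a time; nextChunk() returns false
// until the final chunk is accepted, then the created file object.
$status = false;
$handle = fopen($path, 'rb');
while (!$status && !feof($handle)) {
    $status = $media->nextChunk(fread($handle, $chunkSize));
}
fclose($handle);
$client->setDefer(false);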

create google drive upload ability

I've created a little web system written in PHP. Now we want to allow our clients to access our little system. For each job, they must upload one or more files (100-200 MB max per file). Until now, I've uploaded them to the server via a PHP component, but we have a lot of trouble with our ISP, so I've decided to use a free Google Drive account. I've read the tutorials, but I can't clearly understand one thing:
Is there a way to upload a file from the client's browser directly to Google Drive, without uploading it to our server first? As far as I can see, I can use the PHP library to operate on my Google Drive and upload files, but unfortunately the files must be on our server first, which is my big problem.
Big thanks in advance to everyone who can help us.
Direct upload from JavaScript to Drive is very easy. Check out the Drive Picker at https://developers.google.com/picker/docs/, which does everything for you.

VPS and Dedicated server functionalities and resemblances?

Suppose I'm developing a web app in which I need to show the user the remaining disk space on the server, so I use the disk_free_space PHP function to get that information. This will work on my local machine (the one I'm developing on), and it will work on a dedicated server (which is effectively the same as my own local machine). I don't know whether it would work on a VPS, and I know that it WOULDN'T work on shared hosting (by "working" I mean showing the correct amount). So my question is: if I develop my app on my local machine, which acts like a dedicated server, would I have such problems if I deployed the script on a VPS?
Thanks
disk_free_space returns the amount of free space left on the filesystem that contains the given path.
On a dedicated server or a VPS, the filesystem belongs to you alone, so the correct amount is returned. On shared hosting, however, you don't have your own filesystem: PHP reports the free space on the host's shared disk, which says nothing about your own account's quota.
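A quick check you can run in each environment (a minimal sketch; the path argument determines which filesystem is measured):

<?php
// disk_free_space() reports free bytes on the filesystem containing the
// given path; on shared hosting this is the host's disk, not your quota.
$free  = disk_free_space(__DIR__);
$total = disk_total_space(__DIR__);

printf("%.1f GB free of %.1f GB\n", $free / 1e9, $total / 1e9);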

Redirect apache icons folder

A client of mine is running an e-commerce store on GoDaddy shared hosting.
They are trying to pass a PCI compliance scan, and the only issue flagged is the default Apache icons folder, which allows directory indexing.
This folder is NOT in my web root, so I don't have access to it.
I've tried .htaccess rewrites, but they're not working.
Does anyone know of any other solutions?
I'm sorry, but GoDaddy shared hosting is not PCI compliant. You can review the last part of this page to verify that: http://support.godaddy.com/help/article/4265/quick-shopping-cart-pci-compliance-faq?locale=en (As for why the rewrites fail: /icons/ is typically mapped by an Apache Alias directive to a directory outside your document root, so per-directory .htaccess rules in your web root never see those requests.)
It turns out the scan is not required at Level 4, given the amount of annual sales this client processes, so we opted to leave it as-is.
