I'm told to download a text file from another website and to put it in the root directory of my website

I want to know how to download a text file from another website and how to put it in the root directory of my website. Can you help me with this problem, please? Thanks!

As a program, or as a human action?
As a human, you should be able to follow the link, download the file, and upload it to your website using, for example, FTP or (hopefully no one does this any more) FrontPage.
If you mean programmatically, well, it's almost the same. Your script would need to open a file in the root directory, open the URL, read in the data sent, save it to the file, and close the file. However, how to do so exactly depends on the language you want. Is this a repeated event or a one-off?
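For a one-off, a minimal PHP sketch could look like this (the URL and target path are placeholders, not your real ones):
<?php
// Fetch the remote text file and write it into the site's root directory.
$data = file_get_contents('http://www.example.com/file.txt');
if ($data !== false) {
    file_put_contents('/var/www/file.txt', $data);
}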

This type of request usually happens when you are requesting a service that requires proof that you are the website owner. Being the owner of the website also means you should have at least FTP access to your site. If you are hosting the website yourself, this is an easy task: you just copy the file into the root directory (the Windows default is c:\inetpub\wwwroot, the Ubuntu default is /var/www/). If your website is hosted externally, however, you need to find your FTP username and password and use an FTP program like FileZilla. If you tell us which host you use, maybe someone can give you exact instructions. But be careful about which files you host.

If you have the URL of the text file, you can put it into your browser and then save the file to your disk. You then need to FTP it to your web server (or use whatever method you normally use to get files onto the server).

In PHP:
<?php
$resource = curl_init('http://www.someserver.com/file.txt');
// important, otherwise curl_exec will output directly
curl_setopt($resource, CURLOPT_RETURNTRANSFER, TRUE);
$data = curl_exec($resource);
curl_close($resource);
file_put_contents('/dir/localfile.txt', $data);
Or, even better, with a shell script using a simple wget and cp.

Related

How to generate download links for files that are located on another server in Laravel

I put my videos on my shared web host, and users can directly download all the files, but I want to hide my actual file paths and create time-limited download links.
If the files were on the same server, it would work fine.
When I use this code:
return response()->download('/home2/alihoss1/domains/alihossein.ir/public_html/dl/video/MySql/Sql1.mp4');
I see this error:
is_file(): open_basedir restriction in effect. File(/home2/alihoss1/domains/alihossein.ir/public_html/dl/video/MySql/Sql1.mp4) is not within the allowed path(s): (/home2/alihosse/domains/alihossein.ir/:/tmp/:/usr/local/php-7.0/lib/php/)
What solution would you recommend?
The videos and the Laravel project are not on the same host.
You could use something like file_get_contents() to fetch the file from the other server. This would lead to unnecessary traffic, though, because server 1 would download the file from server 2. The same applies to scp, etc.
You should think about encryption:
$hash = encrypt([
    'valid_to'  => strtotime('+30 minutes'),
    'file_path' => '/home2/alihoss1/domains/alihossein.ir/public_html/dl/video/MySql/Sql1.mp4'
]);
return redirect('http://server2.example/download/hash/' . urlencode($hash));
You then need to decrypt this on the second server with the same key. If you do not have Laravel installed there, you can implement your own decryption (see: Laravel OpenSSL encryption).
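For illustration, a minimal sketch of the receiving route on the second server, assuming Laravel (with the same APP_KEY) is installed there; the route path and error message are just examples:
// routes/web.php on server 2
Route::get('/download/hash/{hash}', function (string $hash) {
    $payload = decrypt($hash); // only works with the same APP_KEY as server 1

    // Refuse links that are past their expiry time.
    abort_if(time() > $payload['valid_to'], 403, 'This download link has expired.');

    return response()->download($payload['file_path']);
});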

How to decode and inspect an HTTP payload when it is a zip

So I'm pretty new at all this. I am trying to reverse engineer a web application.
When I submit a form, it sends a POST with a request payload that looks something like this:
encoding=UTF8&zip=1&size=136240&html=DwQgIg_a_whole_lot_more_gibberish_not_worth_posting
Anyway, from inspecting the captured traffic in the Chrome developer tools, I noticed it is encoded and sent as zipped-up HTML?
How would I go about reversing this to see what the content is actually being sent to the server?
What you want to do is this:
1) Get the name of the zip file
2) Get the path of the zip file (likely the root directory or the current path the form is at)
3) Generate the URL (http://site_name.com/path/to/folder/zip_file.zip)
4) Download it using a tool such as wget (typing the URL into the browser may work too)
I used this technique to download all the OTA update files for iOS devices (I used Burp Suite to intercept the zip file name; the proxy server was on my computer, which my iDevice was connected to).
Please note: the name of the zip file you have given does not end in .zip. This may mean it doesn't have an extension, in which case you may have to add .zip to the file manually, or it may have another ending such as .tar, .tar.gz, etc.
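Alternatively, if the payload is compressed inline rather than pointing at a downloadable .zip (the zip=1 flag hints at that), you could try decoding the captured value directly. This is only a hedged sketch in PHP, assuming base64 (possibly URL-safe) wrapped around one of the common DEFLATE variants:
<?php
// Paste the full captured "html" value here; the loop below tries the usual
// decompression routines until one of them succeeds.
$raw = 'DwQgIg...';
$bin = base64_decode(strtr($raw, '-_', '+/'));
foreach (['gzinflate', 'gzuncompress', 'gzdecode'] as $fn) {
    $out = @$fn($bin);
    if ($out !== false) {
        echo "Decoded with $fn:\n$out\n";
        break;
    }
}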

Can there be a batch file to block websites?

A common method to block websites is to go to this directory:
C:\Windows\System32\drivers\etc, and add the entries to the 'hosts' file.
But anybody can re-edit the file and remove those entries.
So... is there some kind of batch program to block certain websites?
You have a few options for this.
Change the admin rights: set yourself up as the administrator and everyone else as restricted users, and lock the edit permissions on the file.
Write a .bat file that opens both the browser and a second .bat file that writes the websites into the hosts file. If you do this, then every single time they start the web browser, it will add the websites back to the blocked list in the background, forcing them to exit the browser if they want to change the file; and if that happens, the websites get re-blocked when they open the browser again. Effective, and beyond some people's ability to bypass.
Example can be found here: http://www.makeuseof.com/tag/launch-multiple-programs-single-shortcut-using-batch-file/
Similar to the method above: have that .bat file launch when someone accesses the hosts file, with a timeout function that rewrites the hosts file after some amount of time.
Password-protect the System32 folder, though this could prove problematic for a plethora of reasons.

How to find out the document root or find out actual path of a URL in cPanel server?

First of all, I searched Google to the best of my ability and was not able to get even a clue about this.
I want to find out the actual path on the server for a given URL. I have root access.
For example, I want to write a script that takes a URL as input and prints the actual path on the server.
Example Input: some.com/wp-includes/js/tinymce/plugins/media/js/file.php
abcd.com/wp-includes/js/tinymce/plugins/wpgallery/img/xml.php
The output should be
/home/some/public_html/wp-includes/js/tinymce/plugins/media/js/file.php
/home/abcd/public_html/wp-includes/js/tinymce/plugins/wpgallery/img/xml.php
The domain name can be an add-on domain or a sub-domain, but it is hosted on the same server.
I want to write a shell script to achieve this. Please guide me.
I want this to be done using a Linux shell script only, not PHP.
There is a Perl-based cPanel API for getting this information. Without it, this is very tough to implement, and I have not seen any code for it on the Internet, so I dropped that approach.
From any PHP code running in the browser you can use:
# prints DocumentRoot path
echo $_SERVER["DOCUMENT_ROOT"];
# prints full filesystem path of the script
echo $_SERVER["SCRIPT_FILENAME"];

How To Write Your Own FTP Uploader with Automator

Is it possible to write your own application/command that will allow you to automatically upload your files to your FTP server?
Basically, the flow I want to achieve is this:
My app/action/whatever is scheduled to upload at a certain time.
When that time arrives, the files in my specified folder are uploaded.
Of course, to upload, some data must be set, like the username, password, FTP server, etc.
After my files have been uploaded, the local files are wiped out.
I don't exactly know where to start. Can someone help me with this? Thank you.
Take a look at http://editkid.com/upload_to_ftp/. It comes with the source code so you can modify it to fit your needs. You can combine it with an Automator action to delete the files after upload.
To schedule it, see http://smallbusiness.chron.com/schedule-automator-tasks-mac-os-x-39132.html.
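If you would rather script the upload step itself, here is a minimal sketch using PHP's built-in FTP functions; the host, credentials, and folder below are placeholders:
<?php
// Upload every file in a folder over FTP, then delete the local copies.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true); // passive mode is friendlier to firewalls/NAT

foreach (glob('/path/to/local/folder/*') as $file) {
    if (is_file($file) && ftp_put($conn, basename($file), $file, FTP_BINARY)) {
        unlink($file); // wipe the local copy only after a successful upload
    }
}
ftp_close($conn);
Deleting only after ftp_put() reports success keeps a failed transfer from wiping files you still need.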
