How should I download file from an ftp server to my local machine using php? Is curl good for this?
You can use wget or curl from PHP. Be aware that the PHP script will wait for the download to finish, so if the download takes longer than PHP's max_execution_time, your PHP script will be killed during runtime.
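For the simple synchronous case, a minimal sketch using PHP's curl extension could look like this; the host, credentials and file names are placeholders, not taken from the question:

<?php
// Download a single file over FTP and stream it straight to disk.
$remote = 'ftp://user:password@ftp.example.com/path/remote-file.zip'; // placeholder URL
$local  = '/tmp/remote-file.zip';                                     // placeholder target

$fp = fopen($local, 'w');
$ch = curl_init($remote);
curl_setopt($ch, CURLOPT_FILE, $fp);      // write the response body directly to $fp
curl_setopt($ch, CURLOPT_TIMEOUT, 600);   // abort if the transfer takes too long

if (curl_exec($ch) === false) {
    error_log('FTP download failed: ' . curl_error($ch));
}

curl_close($ch);
fclose($fp);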
The best way to implement something like this is to do it asynchronously, so that you don't slow down the execution of the PHP script, which is probably supposed to serve a page afterwards.
There are many ways to implement it asynchronously. The cleanest one is probably to use a message queue such as RabbitMQ (over AMQP) or ZeroMQ. A less clean one, which works as well, would be to write the URLs to download into a file and then implement a cronjob which checks that file for new URLs every minute and executes the downloads.
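As a rough illustration of that second idea, the cron-driven script could be as small as this; the queue file and target directory are made up for the example:

<?php
// Run every minute from cron, e.g.:  * * * * * php /path/to/download-worker.php
$queueFile = '/var/spool/downloads/queue.txt';   // one URL per line, appended by the web app
$targetDir = '/var/spool/downloads/files/';

if (!file_exists($queueFile)) {
    exit;
}

// Grab the queued URLs and empty the queue (not concurrency-safe; fine for a sketch).
$urls = array_filter(array_map('trim', file($queueFile)));
file_put_contents($queueFile, '');

foreach ($urls as $url) {
    $data = file_get_contents($url);   // handles ftp:// and http:// URLs if allow_url_fopen is on
    if ($data === false) {
        error_log('Download failed: ' . $url);
        continue;
    }
    file_put_contents($targetDir . basename(parse_url($url, PHP_URL_PATH)), $data);
}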
just some ideas...
Let me start by saying I understand that heroku's dynos are temporary and unreliable. I only need them to persist for at most 5 minutes, and from what I've read that generally won't be an issue.
I am making a tool that gathers files from websites and zips them up for download. My tool does everything and creates the zip - I'm just stuck at the last part: providing the user with a way to download the file. I've tried direct links to the file location and HTTP GET requests, and Heroku didn't like either. I really don't want to have to set up AWS just to host a file that only needs to persist for a couple of minutes. Is there another way to download files stored in /tmp?
As far as I know, you have absolutely no guarantee that a request goes to the same dyno as the previous request.
The best way to do this would probably be to either host the file somewhere else, like S3, or to send it immediately in the same request.
If you're generating the file in a background worker, then it most definitely won't work. Every process runs on a separate dyno.
See How Heroku Works for more information on their backend.
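If you go with sending it in the same request, the idea is simply to build the zip and stream it back before the response ends, so nothing has to outlive the request on the dyno. The question doesn't say what language the app is written in, so the following is only a PHP-flavoured sketch with made-up names:

<?php
// Illustration only: create the archive and return it in the same response.
$zipPath = '/tmp/bundle.zip';      // hypothetical path used by the zip-building step
build_zip($zipPath);               // placeholder for whatever code creates the archive

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="bundle.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);                // stream it out while the dyno still has the file
unlink($zipPath);                  // nothing is left behind afterwards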
I am building a Portlet using Vaadin 6. In the portlet I let the end user download the result of the searches/operations he's done.
What I am doing here is generating a zip file on the fly and serving it for download using
getMainWindow().open(resource);
where resource is a FileResource.
Since the search is quite complex, I have very little chance of being able to reuse the results, and, in order to keep things nice, I would like to delete the zip file from the server once it's been "consumed" by the download process.
Is there any way I can monitor when the download has been completed?
TIA
If your concern is just keeping the server clean, it should be enough to use the tmp dir of your machine. This way, the OS handles deletion for you.
Or you could write your own cleanup process, either with cron or with scheduler/timer services.
I'm trying to upload several hundred files to 10+ different servers. I previously accomplished this using FileZilla, but I'm trying to make it work using just common command-line tools and shell scripts so that it isn't dependent on working from a particular host.
Right now I have a shell script that takes a list of servers (in ftp://user:pass@host.com format) and spawns a new background instance of 'ftp ftp://user:pass@host.com < batch.file' for each server.
This works in principle, but as soon as the connection to a given server times out/resets/gets interrupted, it breaks. While all the other transfers keep going, I have no way of resuming whichever transfer(s) have been interrupted. The only way to know if this has happened is to check each receiving server by hand. This sucks!
Right now I'm looking at wput and lftp, but these would require installation on whichever host I want to run the upload from. Any suggestions on how to accomplish this in a simpler way?
I would recommend using rsync. It's really good at transferring only the data that has changed. Much more efficient than FTP! More info on how to resume interrupted connections, with an example, can be found here. Hope that helps!
I have a Django application on Heroku, and one thing I sometimes need to do that takes a little bit of time is sending emails.
This is a typical use case for workers. Heroku offers support for workers, but I would have to leave them running all the time (or start and stop them manually), which is annoying.
I would like to use a one-off process to send each email. One possibility I first thought of was IronWorker, since I thought that I could simply add the job to IronWorker's queue and it would be executed with at most a 15 min delay, which is OK for me.
The problem is that with IronWorker, I need to put all the modules and their dependencies into a zip file in order to run the job, so in my email use case, as I use "EmailMultiAlternatives" from "django.core.mail.message", I would need to include the whole Django framework in my zip file in order to be able to use it.
According to this link, it's possible to add/remove workers from the app. Is it possible to start one-off processes from the app?
Does anyone have a better solution?
Thanks in advance
I am using Nginx, and need to be able to generate images on the fly. When the client sends a request for an image, I need to run an external program to generate the image. The external program leaves the generated image in the filesystem.
It seems that the easiest approach would be to write a FastCGI script that runs the external program and then reads the image from the filesystem, transferring it via FastCGI to nginx.
However, this seems inefficient, since I would need to write my own file-copy routine, and the file would be copied from disk into a local buffer, then into packets for the FastCGI transfer to nginx, then into nginx's buffer, and finally into packets to send to the client. It seems that it would be more efficient to leverage nginx's ability to serve static content efficiently.
Ideally, I'd like some way to make nginx wait until the image has been generated, and then serve it from disk. Another thought is that maybe the FastCGI response could use some kind of header to indicate that nginx should actually serve a file instead of the response from the FastCGI script. Is either of these approaches possible?
X-Accel-Redirect is exactly what you are looking for.
Usage example can be found here: http://kovyrin.net/2006/11/01/nginx-x-accel-redirect-php-rails/
Nginx is asynchronous, so it will keep serving all other connections without waiting for data from your FastCGI script.
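As a rough sketch of how the backend side could look (the paths and the location name here are assumptions, not taken from the linked article): the FastCGI script runs the generator, then sends only headers and lets nginx deliver the file from an internal location.

<?php
// Matching nginx config (internal-only location mapped to the image directory):
//   location /generated-images/ {
//       internal;
//       alias /var/cache/generated-images/;
//   }

$name = basename($_GET['img']);                                        // crude sanitising, sketch only
shell_exec('/usr/local/bin/generate-image ' . escapeshellarg($name));  // placeholder for your generator

header('Content-Type: image/png');
header('X-Accel-Redirect: /generated-images/' . rawurlencode($name));
// No body: nginx picks up the redirect and serves the file itself.

Because nginx does the actual file I/O, the copy-through-FastCGI overhead the question worries about goes away.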