Can somebody please provide a link to an FTP client application with complete functionality, like FileZilla or similar?
I am looking for an open-source solution, and it should be developed in .NET (C#, VB.NET).
I went through many FTP libraries, like NetFtp and many others, but I don't have enough time to develop one from scratch. I need something pre-developed that I can then modify according to my requirements.
I want to implement restrictions on file uploads and on the number of files uploaded (based on the logged-in user).
Thank you.
There are many FTP servers that have the capability built in to restrict certain file types, set quotas, throttle bandwidth, and limit the number of files uploaded.
You may want to consider getting a server that already supports this functionality and then you can use any standard FTP client that you want.
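If you do end up enforcing the limits on the client side instead, here is a minimal, untested sketch in C# using the built-in FtpWebRequest; the quota value, the upload URL, and the way uploads are counted per logged-in user are placeholders for whatever store you actually use.

// Hedged sketch: gate uploads behind a per-user file count before pushing the
// file with the built-in FtpWebRequest. Quota, URL and tracking are placeholders.
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

class QuotaUploader
{
    const int MaxFilesPerUser = 10;  // assumed limit per logged-in user
    static readonly Dictionary<string, int> uploadsByUser = new Dictionary<string, int>();

    public static void Upload(string user, string password, string localFile)
    {
        int count;
        uploadsByUser.TryGetValue(user, out count);
        if (count >= MaxFilesPerUser)
            throw new InvalidOperationException("Upload limit reached for " + user);

        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://ftp.example.com/uploads/" + Path.GetFileName(localFile));
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential(user, password);

        using (var src = File.OpenRead(localFile))
        using (var dst = request.GetRequestStream())
            src.CopyTo(dst);

        uploadsByUser[user] = count + 1;  // only count successful uploads
    }
}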
I need to deploy an application onto some Windows machines for purposes of data collection from a group of people (i.e. the application will be used to gather responses to a series of survey questions). The process is interactive, alternating between displays of text and images with specific timing requirements. I have put together a prototype application using HTML and JavaScript that implements the survey. However, there are some unique constraints on the deployment environment that have me stuck:
While the machine is Internet-connected, the client requires that the survey application run entirely locally on the PC. Therefore, sending the survey results to a remote server is not permissible. Obviously, saving to a local file from a Web browser is typically not permitted for security reasons.
Installation of applications onto the machines that will run the survey is not permitted.
The configuration of the machines is not known specifically a priori, but I can assume some recent version of Windows with IE8+.
The "no remote access" requirement was a late comer, and has thrown a wrench into the plan of just writing a simple Web application that could post results to an HTTP server. I'm now looking for the easiest way forward. Two main approaches come to mind:
Use a GUI framework that provides a control that can display HTML/JavaScript; running a full-blown application on the PC would allow me to save the results to the filesystem. I've never done this, but it seems like in this day and age it shouldn't be too difficult. This would allow me to reuse much of my existing prototype implementation, but I would need some way of transferring the results (which would be stored in a JavaScript data structure) outside of the Web control to where the rest of the application could access it. (A rough sketch of what I have in mind is at the end of this question.)
Reimplement the entire application using some GUI framework (I've used PyQt successfully before, although not on Windows). This approach is obviously less desirable than #1 due to the lack of reuse. However, it may be necessary if #1 isn't feasible.
Any recommendations for the best way to go? Ideally, I'm looking for a solution that can be run in a "portable" manner from a USB thumbdrive or similar.
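For reference, here is a rough, untested sketch of what I have in mind for option #1, assuming .NET/WinForms is available on the target machines; survey.html and the results file name are placeholders.

// Hosts the existing HTML/JavaScript survey in a WinForms WebBrowser control and
// exposes a COM-visible bridge object so the page can hand its results back.
// The JavaScript side would call window.external.SaveResults(JSON.stringify(results)).
using System;
using System.IO;
using System.Runtime.InteropServices;
using System.Windows.Forms;

[ComVisible(true)]
public class SurveyBridge
{
    // Called from JavaScript when the survey finishes.
    public void SaveResults(string json)
    {
        File.WriteAllText(Path.Combine(Application.StartupPath, "results.json"), json);
    }
}

static class Program
{
    [STAThread]
    static void Main()
    {
        var form = new Form { Text = "Survey", WindowState = FormWindowState.Maximized };
        var browser = new WebBrowser
        {
            Dock = DockStyle.Fill,
            ObjectForScripting = new SurveyBridge()   // enables window.external in the page
        };
        browser.Url = new Uri(Path.Combine(Application.StartupPath, "survey.html"));
        form.Controls.Add(browser);
        Application.Run(form);
    }
}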
Have you looked at HTML Applications (HTA)? They work in IE5+ and can use Windows Scripting Host to write to local drives and UNC shares...
Maybe you can use a portable web server with a scripting language on the server side. With Mongoose (http://code.google.com/p/mongoose/), for example, you can run PHP, CGI, etc. scripts. Then simply create a script that saves a file to the hard drive, and leave the rest of the application as it is.
Use a script to start the web server, and perhaps a portable web browser like K-Meleon (http://kmeleon.sourceforge.net/), which is highly configurable, to open the application. Or launch the system's default browser pointed at your localhost URL.
The only problem may be that the user has to allow the server through the firewall the first time it runs.
I'm trying to move a large number of files from one CDN to another. I know that they have a very high-speed connection to each other, so what I'd like to do is connect them directly, but the only protocol I have access to on each is FTP. Is there any way to log into the current CDN and have it send the files to the other FTP server? It seems like it should be possible, I just have no idea how to do it.
FXP: see the Wikipedia article. Powerful FTP clients are capable of doing this; from a protocol point of view it's trivial.
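To illustrate, here is a rough C# sketch of the FXP command sequence, heavily simplified: it assumes plain FTP, single-line replies, and servers that permit server-to-server transfers; the hosts, credentials, and file name are placeholders.

// Simplified FXP sketch: the source server listens (PASV), the destination server
// is told to connect to it (PORT), and the file flows directly between the two.
// A real client handles multi-line replies, TLS, transfer-complete codes, etc.
using System;
using System.IO;
using System.Net.Sockets;
using System.Text.RegularExpressions;

class FxpSketch
{
    static string Cmd(StreamWriter w, StreamReader r, string cmd)
    {
        w.Write(cmd + "\r\n");
        w.Flush();
        string reply = r.ReadLine();              // assumes a single reply line
        Console.WriteLine(cmd + " -> " + reply);
        return reply;
    }

    static void Main()
    {
        var src = new TcpClient("ftp.source-cdn.example", 21);
        var dst = new TcpClient("ftp.dest-cdn.example", 21);
        var srcIn = new StreamReader(src.GetStream()); var srcOut = new StreamWriter(src.GetStream());
        var dstIn = new StreamReader(dst.GetStream()); var dstOut = new StreamWriter(dst.GetStream());
        Console.WriteLine(srcIn.ReadLine());      // welcome banners (assumed single line)
        Console.WriteLine(dstIn.ReadLine());

        Cmd(srcOut, srcIn, "USER user1"); Cmd(srcOut, srcIn, "PASS pass1");
        Cmd(dstOut, dstIn, "USER user2"); Cmd(dstOut, dstIn, "PASS pass2");
        Cmd(srcOut, srcIn, "TYPE I");     Cmd(dstOut, dstIn, "TYPE I");

        // Ask the source server to open a listening data port...
        string pasv = Cmd(srcOut, srcIn, "PASV"); // e.g. 227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)
        string hostPort = Regex.Match(pasv, @"\(([\d,]+)\)").Groups[1].Value;

        // ...and point the destination server's data connection at it.
        Cmd(dstOut, dstIn, "PORT " + hostPort);

        // Destination stores, source retrieves: the data never touches this machine.
        Cmd(dstOut, dstIn, "STOR somefile.bin");
        Cmd(srcOut, srcIn, "RETR somefile.bin");
        // A real client would now wait for the 226 "transfer complete" replies on both.
    }
}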
BTW, this question is probably off-topic here.
I'm researching solutions for a potential client. They're requesting the ability to download a large number of MP3s (1,000+) from their online catalog.
I've researched/tested building a zip containing all MP3s using ZipArchive but ran into obvious memory leak issues that have ruled that solution out.
I'm now trying to think out of the box.
One idea was to create an FTP queue or a torrent-style download link for them. Is there anything out there that can pull something like this off?
Any help or suggested direction would be greatly appreciated! Thanks!!
Edit: Here is the overall process/goal that we're trying to achieve.
The client creates music for TV/film placement. They maintain an online catalog AND a local copy they send to potential buyers. The online catalog and the offline catalog need to mirror each other. The problem is that they have multiple offices, in many different locations, that will have to update their local copies with the new files added to the online catalog.
Example: East Coast User updates catalog with 100 new files. West Coast User needs to update the offline catalog with the new files retrieved from the online catalog.
We had hoped to create custom zips of the files each user needed to update their catalog, based on the user's download history, which we'd maintain in MySQL. We were testing ZipArchive but couldn't seem to build zips over 175 MB (give or take). We're in the process of testing ZipStreaming but are having some issues.
I hope this clears up the overall goal and problems we are facing.
GNU wget?
It can download recursively. Just give wget a list of all the files on the server, e.g.
http://www.example.org/filelist.html, which contains links like file1.mp3, file2.mp3, etc. (Apache normally generates such an index page automatically when a directory without an index.html/index.php in it is requested).
http://linux.die.net/man/1/wget
Frankly speaking, I can't identify the actual problem/question from your post. If you are looking to minimize network load, then remember that MP3 files don't compress well because they are already compressed (not as well as possible, but well). If you are looking for a transport, then any file transfer protocol will do (FTP, SFTP, HTTP, WebDAV).
If you need flexibility and features, I'd recommend SFTP: this is a protocol for remote file system access, so besides the "get file" operation it has plenty of useful operations, including machine-readable directory listings (not always available in FTP and not available in standard HTTP), built-in zlib compression, built-in support for resuming file transfers, and more. HTTP also has zlib compression, but it is not always available.
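For illustration, here is roughly what a download over SFTP looks like from .NET using the open-source SSH.NET library (just one example of an SFTP client; the host, credentials, and paths are placeholders).

// Minimal SFTP fetch with SSH.NET (Renci.SshNet): connect, list a directory
// (the machine-readable listing mentioned above), and download one file.
using System;
using System.IO;
using Renci.SshNet;

class SftpFetch
{
    static void Main()
    {
        using (var sftp = new SftpClient("sftp.example.com", "user", "password"))
        {
            sftp.Connect();

            foreach (var entry in sftp.ListDirectory("/catalog"))
                Console.WriteLine(entry.Name);

            using (var local = File.Create("track0001.mp3"))
                sftp.DownloadFile("/catalog/track0001.mp3", local);
        }
    }
}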
Update: your approach doesn't take into account what is really available on the client; you are going to prepare ZIP files based on your (possibly incorrect) knowledge of what the client already has.
If the client and server are both applications that you develop, then you should use the rsync protocol or something similar to update the data online (without any ZIP files) and download only the files that are missing on the client. If direct communication between the client and the server is not possible, you can have the client send its state to the server, and the server can prepare an individual package after that. As for ZIP functionality, it's needed only when you use batch updates (no real-time communication between the client and the server). I don't know what technology you are using, but if your only problem is with the ZIP component, you can use something else for data packing: either a different ZIP component (for .NET and VCL we have a ZIP component) or some other packing solution (for example, our SolFS product doesn't have size limits). Unfortunately, I am not aware of an rsync-like implementation available as a component.
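To make the "client sends its state, the server prepares an individual package" idea concrete, here is a small hypothetical C# sketch of the server-side delta calculation; the manifest file, catalog path, and transport are all placeholders.

// The client posts a plain-text list of the files it already has; the server
// compares that against the master catalog and sends back only what is missing,
// streaming each file individually instead of building one huge ZIP in memory.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class CatalogDelta
{
    static void Main()
    {
        var clientFiles = new HashSet<string>(File.ReadAllLines("client_manifest.txt"));

        var serverFiles = Directory.EnumerateFiles(@"C:\catalog", "*.mp3")
                                   .Select(Path.GetFileName);

        foreach (var missing in serverFiles.Where(f => !clientFiles.Contains(f)))
            Console.WriteLine("send: " + missing);   // queue for transfer to this client
    }
}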
I have an FTP directory with Akamai now, and I need to upload images as fast as possible (possibly 1+ million per day).
What would be the fastest way to sync local files to an FTP site?
Thanks.
Instead of FTP, use Rsync. It has lower overhead than FTP and is well suited to synchronising a large filebase.
Rsync documentation
Akamai Netstorage supports Rsync as an upload method. It may need to be enabled in the Akamai control panel - whoever administers your Netstorage user accounts can enable it.
Rsync is included in all Linux distributions; if you are on Windows, you can get it as part of Cygwin.
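If you need to drive it from code rather than a shell script, a hedged sketch of calling rsync from C# might look like this; the flags are standard rsync options, but the NetStorage host and module name are placeholders, so check Akamai's documentation for the exact target syntax.

// Launches rsync to push a local image directory to an rsync daemon endpoint.
// -a: archive mode, -z: compress in transit, --partial: keep partially
// transferred files so interrupted uploads can be resumed.
using System.Diagnostics;

class RsyncPush
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "rsync",
            Arguments = "-az --partial /local/images/ user@netstorage.example.com::uploaddir/",
            UseShellExecute = false
        };
        using (var proc = Process.Start(psi))
            proc.WaitForExit();
    }
}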
One million a day sure is a lot; it's hard to imagine what requires such a huge number of resources. All I can suggest is solving this purely at the FTP sync level, using an off-the-shelf tool (maybe http://www.ftpsynchronizer.com/?).
Failing that, knocking up a directory-watching FTP uploader wouldn't be a hugely difficult programming job in most common languages that have FTP libraries.
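For example, a bare-bones directory-watching FTP uploader in C# could look something like this (untested sketch; the watch folder, host, and credentials are placeholders, and a production version would need queuing, retries, and parallel uploads).

// Watches a folder and pushes each newly created file over FTP with the
// built-in FtpWebRequest. Note that Created can fire before a file is fully
// written, so a real tool would wait for the file to settle and retry on error.
using System;
using System.IO;
using System.Net;

class WatchAndUpload
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\images");
        watcher.Created += (s, e) => Upload(e.FullPath);
        watcher.EnableRaisingEvents = true;
        Console.WriteLine("Watching C:\\images ... press Enter to quit.");
        Console.ReadLine();
    }

    static void Upload(string localPath)
    {
        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://netstorage.example.com/uploads/" + Path.GetFileName(localPath));
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "password");

        using (var src = File.OpenRead(localPath))
        using (var dst = request.GetRequestStream())
            src.CopyTo(dst);

        using (var response = (FtpWebResponse)request.GetResponse())
            Console.WriteLine(localPath + " -> " + response.StatusDescription);
    }
}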
The other alternative is that if you can get these files onto an Internet-facing server, you can switch to using Akamai HTTP content delivery and have Akamai pull the images rather than you having to continuously push them.
If you have such a huge number of files and you want to upload faster, then I would suggest going for the Signiant product, which improves upload times drastically. It's a third-party upload service that works very well with Akamai; many customers use it.
We have an Oracle 10g forms application running on a Solaris OAS server, with the forms displaying in IE. Part of the application involves uploading and downloading files (Word docs and PDFs, mainly) from the PC to the OAS server, using Oracle's webutil utility.
The problem is with large files (anything over 25 MB or so): it takes a long time, sometimes many minutes. Uploading seems to work, even with large files. Downloading large files, though, will cause it to error out partway through the download.
I've been testing with a 189 MB file in our development system. Using WEBUTIL_FILE_TRANSFER.Client_To_DB (or Client_To_DB_with_Progress), the download would error out after about 24 MB. I switched to WEBUTIL_FILE_TRANSFER.URL_To_Client_With_Progress and finally got the entire file to download, but it took 22 minutes. Doing it without the progress bar got it down to 18 minutes, but that's still too long.
I can display files in the browser, and my test file displayed in about 5 seconds, but many files need to be downloaded for editing and then re-uploaded.
Any thoughts on how to accomplish this uploading and downloading faster? At this point, I'm open to almost any idea, whether it uses webutil or not. Solutions that are at least somewhat native to Oracle are preferred, but I'm open to suggestions.
Thanks,
AndyDan
This may be totally out to lunch, but since you're looking for any thoughts that might help, here are mine.
First of all, I'm assuming that the actual editing of the files happens outside the browser, and that you're just looking for a better way to get the files back and forth.
In that case, one option I've used in the past is just to route around the web application using Apache, or any other vanilla web server you like. For downloading, create a unique file session token, remember it in the web application, and place a copy of the file, named with the token (e.g. <unique token>.doc), in a download directory visible to Apache. Then provide a link to the file that will be served via Apache.
For upload, you have a couple of options. One is to use the mechanism you've got; then, when a file is uploaded, you just have to match on the token in the name to patch the file back into your archive. Alternatively, you could create a very simple file upload form, separate from your application, that uploads the file to a temp directory via Apache, then routes the user back into your application and provides the token in the URL HTTP GET-style, or else in a cookie.
Before you go to all that trouble, you'll want to make sure that your vanilla web server will provide better upload and download speed and reliability than your current solution, but it should.
As an aside, I don't know whether the application server you're using provides HTTP compression, but if it does, you should make sure it's enabled and working. This is probably the best single thing you can do to increase transfer speed of large files, assuming they're fairly compressible. If your application server doesn't support it, then most any vanilla web server will.
I hope that helps.
I ended up using CLIENT_HOST to call an FTP command to download the files. My 189MB test file took 20-22 minutes to download using WEBUTIL_FILE_TRANSFER.URL_To_Client_With_Progress, and only about 20 seconds using FTP. It's not the best solution because it leaves the FTP password exposed on the PC temporarily, but only for as long as the download takes, and even then the user would have to know where to find it.
So, we're implementing this for now, and looking for a more secure but still performant long-term solution.