FTP transfer is kept on hold

I've set up a ProFTPD server on a CentOS 7 machine, and I am accessing it from other machines (running Windows Server) to send files to it.
I've created some rules that only allow storing files to a certain directory; the subdirectories will have different ownerships. At this point they are owned by user.
<Directory pathToDir>
<Limit STOR CWD>
AllowAll
</Limit>
<Limit READ RMD DELE MKD>
DenyAll
</Limit>
</Directory>
So here is what happens to me.
I log in with user from one Windows Server machine and access the first sub-directory (owner user, group user), mput several files, and the files are copied.
I log in with user from a different Windows Server machine and access the second sub-directory (owner user, group user), put a file and get a confirmation (200 PORT command successful), but the transfer doesn't start; the file is created on the server, however, and it is empty.
If I use my laptop, everything works.
Does anyone know how to fix this? Or what is wrong with my FTP server?
EDIT: FIXED. It was a Windows Firewall issue; the client couldn't get a response from the FTP server. Since my server has a static IP, I added an exception to the Windows Firewall allowing only that IP full access to FTP, rather than opening a set of ports.

These would point to a firewall issue:
If the connection times out (rather than failing instantly)
If a directory listing from the client also fails
As a workaround, you could try passive (PASV) FTP.
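If you go the passive route, ProFTPD's PassivePorts directive lets you pin the data connections to a known range that you can then open in the firewall. A minimal sketch for /etc/proftpd.conf (the range is only an example; pick any free high range and open the same range in the server's firewall):
# Restrict passive-mode data connections to a fixed port range
PassivePorts 49152 65534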

Unable to Retrieve Directory Using ProFTPD (WHM)

Well, after looking for many solutions, I came here.
I am setting up WHM/cPanel to host a website. Everything was going smoothly, but I am stuck on the FTP connection (Server sent passive reply with unroutable address. Using server address instead.).
Server Details:
CentOS Linux release 7.2.1511 (Core)
WHM/cPanel Version 11.58.0.13
FTP Server: PureFTPD
Actual error while connecting: "Server sent passive reply with unroutable address. Using server address instead."
To fix this issue and get FTP working, you need to open a range of passive ports so the FTP data connection can be established. I assume you are using CSF.
Log in to WHM, then go to CSF >> Firewall Configuration and
allow 30000:50000 in both TCP_IN and TCP_OUT.
Once you have made the changes, restart the firewall.
Now you need to edit the FTP configuration file to use these ports; you will find it at /etc/pure-ftpd.conf.
You will see the following line, which you need to uncomment:
# Port range for passive connections replies. - for firewalling.
PassivePortRange 30000 50000
Restart the FTP service and it should work.
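For reference, on a systemd-based CentOS 7 cPanel box the restarts would look roughly like this (treat it as a sketch, since service names can differ between setups):
csf -r                       # reload the CSF firewall rules
systemctl restart pure-ftpd  # restart Pure-FTPd so it picks up the new passive range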

How to set up FTP on XAMPP

I want to make a server using XAMPP. I have already installed XAMPP and set the port to 8080. PHP and MySQL work fine, but I can't access FTP from the internet. Can you please suggest how I can do this?
XAMPP comes preloaded with the FileZilla FTP server. Here is how to set up the service and create an account.
Enable the FileZilla FTP service through the XAMPP Control Panel so that it starts automatically (check the checkbox next to FileZilla to install the service). Then manually start the service.
Create an FTP account through the FileZilla Server Interface (it is essentially the FileZilla control panel). There is a link to it in the XAMPP folder in the Start Menu. Then go to Users -> Add User, fill in the details, and you are done.
Try connecting to the server (localhost, port 21).
XAMPP for Linux and Mac comes with ProFTPD. Make sure to start the service from the XAMPP Control Panel -> Manage Servers.
Complete instructions can be found at the localhost XAMPP dashboard -> How-to Guides -> Configure FTP Access. I have pasted them below:
Open a new Linux terminal and ensure you are logged in as root.
Create a new group named ftp. This group will contain those user accounts allowed to upload files via FTP.
groupadd ftp
Add your account (in this example, susan) to the new group. Add other users if needed.
usermod -a -G ftp susan
Change the ownership and permissions of the htdocs/ subdirectory of the XAMPP installation directory (typically /opt/lampp) so that it is writable by the new ftp group.
cd /opt/lampp
chown root:ftp htdocs
chmod 775 htdocs
Ensure that proFTPD is running in the XAMPP control panel.
You can now transfer files to the XAMPP server using the steps below:
Start an FTP client like WinSCP or FileZilla and enter the connection details as below.
If you’re connecting to the server from the same system, use
"127.0.0.1" as the host address. If you’re connecting from a different
system, use the network hostname or IP address of the XAMPP server.
Use "21" as the port.
Enter your Linux username and password as your FTP credentials.
Your FTP client should now connect to the server and enter the /opt/lampp/htdocs/ directory, which is the default Web server document root.
Transfer the file from your home directory to the server using normal FTP transfer conventions. If you’re using a graphical FTP client, you can usually drag and drop the file from one directory to the other. If you’re using a command-line FTP client, you can use the FTP PUT command.
Once the file is successfully transferred, you should be able to see it in action.
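For example, with a command-line client, the whole transfer might look like this (the address and file name are placeholders for your own):
# connect (192.0.2.10 stands in for your XAMPP server's hostname or IP)
ftp 192.0.2.10
# after logging in with your Linux credentials you land in the document root,
# so a plain put uploads straight into /opt/lampp/htdocs
put index.html
bye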
I launched an Ubuntu XAMPP server on AWS.
I met the same problem with FTP, even though I had added the user to the ftp group and set the permissions and owner group of the htdocs folder.
I finally found the cause in the inbound rules of the security group: after adding an All TCP, 0 - 65535 rule (0.0.0.0/0, ::/0), it worked!
On XAMPP, click "Start" and then "Admin".
Log in to localhost (127.0.0.1) without a password, using the second (admin) port, not port 21.
Add users and passwords, change your settings, and quit.

Downloading a file from FTP using kettle

I was trying to download a file from an FTP server (a remote machine) using Pentaho Kettle (Get a file with FTP). I am able to do that on my local machine, but when I deploy the app to a JBoss web server it fails to download with the error "Error getting files from FTP : Login incorrect."
But everything seems to be correct regarding the login details.
Do I have to configure anything else on the server? Please help.
There are several things you could check in this case:
Check that the security settings of the remote FTP server allow the machine that runs the job to establish an FTP connection. If you have access to that machine, try the following commands to make sure the server is reachable:
telnet <your-remote-ftp-server-host> 21
# Or try:
telnet <your-remote-ftp-server-host> 22
Verify that the remote FTP server accepts plain FTP connection requests; otherwise try SFTP (Get a file with FTP and Get a file with SFTP are two entirely different job steps in Kettle).
Wherever you are fetching your FTP credentials from (ideally a configuration file), test that the credentials are properly read by the job in the relevant scope; the Write To Log step is useful for that. A sketch of such a configuration file follows.
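For instance, if the credentials come from kettle.properties on the JBoss host, the entries might look like this (the variable names are made up for illustration; reference them in the step as ${FTP_HOST} and so on):
# kettle.properties (example variable names)
FTP_HOST=ftp.example.com
FTP_USER=deploy_user
FTP_PASSWORD=change_me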

Uploading to EC2 problems. How do you do FTP?

I have set up a new EC2 instance on AWS and I'm trying to get FTP working to upload my application. I have installed VSFTPD as standard, so I haven't changed anything in the config file (/etc/vsftpd/vsftpd.conf).
I have not opened port 21 in the security group, because I'm tunnelling it through SSH. I log into my EC2 instance through the terminal like so:
sudo ssh -L 21:localhost:21 -vi my-key-pair ec2-user@ec2-instance
I open up FileZilla and log into localhost. Everything goes fine until it comes to listing the directory structure. I can log in all right and everything seems fine, as you can see below:
Status: Resolving address of localhost
Status: Connecting to [::1]:21...
Status: Connection established, waiting for welcome message...
Response: 220 Welcome to EC2 FTP service.
Command: USER anonymous
Response: 331 Please specify the password.
Command: PASS ******
Response: 230 Login successful.
Command: OPTS UTF8 ON
Response: 200 Always in UTF8 mode.
Status: Connected
Status: Retrieving directory listing...
Command: PWD
Response: 257 "/"
Command: TYPE I
Response: 200 Switching to Binary mode.
Command: EPSV
Response: 229 Entering Extended Passive Mode (|||37302|).
Command: LIST
Error: Connection timed out
Error: Failed to retrieve directory listing
Is there something I'm missing in my config file? A setting which needs to be set or turned off? I thought it was great that it connected, but when it timed out you could picture my face. It meant it was time to start trawling the net to try and find the answer, so far with no luck.
I'm using the standard Amazon AMI, 64-bit, with a traditional LAMP setup.
Can anyone steer me in the right direction? I have read a lot about getting this working, but the guides are all incomplete, as if they got bored halfway through typing them up.
I would love to hear how you do it as well, if it makes life easier. How do you upload your apps to an EC2 instance? (Steps please; it saves a lot of time, plus it is a great resource for others.)
I figured it out, after being pointed in the right direction by Antti Haapala.
You don't even need VSFTPD set up on the instance you created. All you have to do is make sure the settings are right in FileZilla.
This is what I did (I'm on a mac so it should be similar on windows):
Open up FileZilla and go to Preferences.
Under Preferences, click SFTP and add a new key. This is the key pair for your EC2 instance. You will have to convert it to the format FileZilla uses; it will prompt you for the conversion.
Click okay and go back to site manager
In site manager enter in your EC2 public address, this can also be your elastic IP
Make sure the protocol is set to SFTP
Put in the user name of ec2-user
Remove everything from the password field - make it blank
All done! Now connect.
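If you prefer the command line, the equivalent connection is a plain SFTP session with the same key pair (the key path and hostname below are placeholders):
# replace the key path and hostname with your own
sftp -i ~/.ssh/my-key-pair.pem ec2-user@your-ec2-public-dns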
That's it; you can now traverse your EC2 system. There is a catch: because you are logged in as ec2-user and not root, you will not be able to modify anything. To get around this, change the group ownership of the directory where your application will live (/var/www/html or wherever). I would put it on an EBS volume. ;) Also make sure this group has read, write and execute permissions, and leave everyone else with nothing. The group for ec2-user is ec2-user. So these are the commands to use while logged in via SSH:
sudo chgrp ec2-user file/folder
sudo chmod 770 file/folder
Hope this helps someone.
FTP is a very troublesome protocol because it requires a secondary channel for the actual data transfer, and it definitely does not work well when tunnelled. With SSH you should use SFTP, which has nothing to do with FTP but is a completely different protocol.
Read also on Wikipedia
Adding the key to www is a recipe for disaster! Any minor issue with your app will become a security nightmare.
As an alternative to FTP, consider using rsync or a more "mature" deploy strategy based on Capistrano, for instance. There are plenty of tools for that around.
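A minimal rsync sketch, assuming the same key pair as above and that the app should land in /var/www/html (adjust the paths to your setup):
# push the local app/ directory to the instance over SSH
rsync -avz -e "ssh -i my-key-pair" ./app/ ec2-user@ec2-instance:/var/www/html/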
Antti Haapala's tips are the only way to work with EC2 over SFTP. It works just fine! Just note that you need to create the /var/www/.ssh/ folder and copy the authorized_keys file there.
After that you'll need to change the authorized_keys ownership to www-data so the SSH connection can recognize it. Amazon should let people know that. I looked for this in their forums, FAQ, etc. No clue at all... Cheers once more to Stack Overflow, the way to go, haha!

FTP file transfer access denied

I installed Windows Server 2008 on my VMware machine. In Windows Server 2008, I installed FTP and ran it. I also turned off all firewalls. However, from my main machine, I could not send a text file and got these errors:
200 PORT command successful.
550 file.txt: Access is denied.
Please help
If you got this error message when trying to upload a file to the server, it is possible that you did not enable write permissions for the folder that you are uploading to.
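As a sketch, granting modify rights on the FTP content folder with icacls might look like the line below (the path and account name are assumptions for illustration; depending on the FTP server you may also need to allow Write in its own authorization settings):
rem grant the connecting account modify rights on the FTP content folder (example names)
icacls C:\inetpub\ftproot /grant ftpuser:(OI)(CI)M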
