JMeter SSH SFTP - Permission denied when downloading a file

I am using the SSH SFTP sampler to connect to an SFTP server. I need to download files from it. I have selected Action "Get", the source path is the SFTP file location, and the destination is a local folder path. I am getting a "Permission denied" error. The same configuration works if I try Action "ls".

Most probably the user account you're using for establishing the SFTP session doesn't have read permission on the file, therefore you cannot read its contents and hence cannot copy it.
Make sure that the user has read permission; if this is not the case, grant it using the chmod command, like:
chmod +r your_file
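Before changing anything, you can double-check the current permissions on the server; the first column of the output shows whether the read (r) bit is set:
ls -l your_file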
Check out the Load Testing FTP and SFTP Servers Using JMeter article for more information.
If you still experience problems, update your question to show your SSH SFTP sampler configuration.

Related

JMeter - Transfer a file from a server to local (using SSH SFTP)

I need to copy a file from a remote server to local and am trying to do it with SSH SFTP in JMeter. I provided login credentials and, in the "File Transfer" section, selected:
Actions -- PUT
source path -- remote path (/ftp/xxx/yyy)
Destination -- local path
I received the error:
java.io.FileNotFoundException: C:\apache-jmeter-5.1.1\bin\ftp\xxx\yyy
So I changed the action to GET. In the View Results Tree, the SSH SFTP response shows the data of the file, but the file is not copied to the local directory.
Any help would be greatly appreciated.
All the parts of the "Destination path" must exist, i.e. you need to create the whole /ftp/xxx/yyy folder structure using the mkdir command, like:
mkdir -p /ftp/xxx/yyy
The user which executes JMeter must have write permissions on the destination path; grant the appropriate permissions using the chmod command.
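As a sketch, assuming the destination is the /ftp/xxx/yyy structure created above and it is owned by the user running JMeter, something like this adds write permission recursively:
chmod -R u+w /ftp/xxx/yyy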
You need to set "Print file content" to false
More information: Load Testing FTP and SFTP Servers Using JMeter
Also, according to JMeter Best Practices, you should always be using the latest version of JMeter, so consider upgrading to JMeter 5.4.1.

PSFTP - File Transfer Not Happening

I want a file to transfer automatically from one server to another server. For that I have installed PuTTY.
Batch File: Transfer.Bat
cd "C:\Program Files\PuTTY"
psftp Username@RemoteServer.com -pw password -b MovementFileTransferScript.txt -be
Text File: MovementFileTransferScript.txt
open RemoteServer.com
put D:/Backup/FULL_DB_BACKUP-DontDelete.txt R:\DB\Backup
quit
In the above text file, D:/ belongs to the source server and R:/ belongs to the destination server.
When I am running the batch file I am getting the below error.
Using username "Username".
Remote working directory is /
psftp: already connected
local: unable to open D:/Backup/FULL_DB_BACKUP-DontDelete.txt
I have checked the permissions of the folders and everything looks good. Can someone let me know what the issue is here?

Copy folder from an http source to a local directory

I am trying to copy a folder from an HTTP source using the following statement:
FileUtils.cp_r 'http://else.repository.labs/static/lit/MDMInternalTools/', 'c:\Users\Public\Desktop\'
However I get this error:
EINVAL: Invalid argument - https://else.repository.labs/static/lit/MDMInternalTools/
If that's a folder on a server that you have access to via SSH, then you can use scp to copy individual files, or a folder with all its subfolders/files using the -r option. The command will be something like:
scp -r username@else.repository.labs:/path/to/rails/public/static/lit/MDMInternalTools c:\Users\Public\Desktop
This is assuming you can use scp. It looks like you're on Windows; you'll need a decent command-line shell where you can install ssh.
https://en.wikipedia.org/wiki/Secure_copy
You can check if the server supports WebDAV, or has FTP or SSH access.
Otherwise your only choice could be to use wget to get a local mirror:
wget -mk http://else.repository.labs/static/lit/MDMInternalTools/
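If you want the mirror to land under a specific local folder, wget's -P (directory prefix) option can point at it; a rough sketch, assuming wget is installed on your Windows machine (wget still creates a subfolder named after the host under that prefix unless you add -nH):
wget -mk -P "C:\Users\Public\Desktop" http://else.repository.labs/static/lit/MDMInternalTools/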

Transferring a file to an Amazon EC2 instance using scp always gives me Permission denied (publickey,gssapi-with-mic)

I am trying to transfer a file to an EC2 instance. I followed Amazon's documentation; this is what my command looked like:
scp -i [the key's location] Documents/[the file's location] ec2-user@[public dns]:[home/[destination]]
where I replaced all the variables with the proper values. I am sure it's the correct key and it has permission 400. When I run the command, it tells me the RSA key fingerprint and asks me if I want to continue connecting. I type yes and it replies with:
Permission denied (publickey,gssapi-with-mic)
lost connection
I have looked at many of the other similar questions on stack overflow and can't find a correct way to do it.
Also ssh traffic is enabled on port 22.
The example Amazon provided is correct. It sounds like a folder permissions issue. If the folder you are trying to copy to was created by another user, chances are you don't have permission to copy to it or edit it.
If you have sudo privileges, you can try opening access for yourself. Though it is not recommended to leave it this way, you could try this command:
sudo chmod 777 /folderlocation
That gives complete read/write/execute permissions to everyone (hence why you shouldn't leave it at 777), but it will give you the chance to test your scp command and rule out permissions.
Afterwards, if you aren't familiar with permissions, I suggest you read up on them; this is an example: http://www.tuxfiles.org/linuxhelp/filepermissions.html. It is generally suggested that you lock down the folder as much as possible depending on the type of information held within.
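Once the scp transfer works, you can tighten the folder back up; for example, this restores a more typical setting where only the owner can write:
sudo chmod 755 /folderlocation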
If that was not the cause some other things you might want to check:
are you in the directory of your key when executing the 'scp -i keyname' command?
do you have permissions to use the folder you are transferring from?
Best of luck.
The problem may be the user name. I copied a file to my Amazon instance and first tried to use the command:
scp -r -i ../.ssh/Amazon_server_key_pair.pem ./empty.test ec2-user@ec2-xx-yy-zz-tt.compute-1.amazonaws.com:~
and got the error: Permission denied (publickey).
I then realized that my instance is an Ubuntu environment, so the user is "ubuntu"; the correct command that worked for me is:
scp -r -i ../.ssh/Amazon_server_key_pair.pem ./empty.test ubuntu@ec2-xx-yy-zz-tt.us-west-2.compute.amazonaws.com:~
The file "empty.test" is a text file containing the text "testing ...". Replace the address of your virtual server with the correct address to your instance's Public DNS. I have replaced the ip to my instance with xx.yy.zz.tt.
I have to use ubuntu# instead of ec2-user# because when i ssh i was seeing ubuntu# in my terminal, try changing to the name you see at your terminal
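If you're unsure which username your AMI expects, a quick check is a plain SSH login with the same key and host placeholders as above; if ubuntu is rejected, try ec2-user:
ssh -i ../.ssh/Amazon_server_key_pair.pem ubuntu@ec2-xx-yy-zz-tt.us-west-2.compute.amazonaws.com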
Also, you have to set the permission of the .pem file on your computer:
chmod 400 /path/my-key-pair.pem
The command below will copy a file from your computer to the EC2 instance:
scp -i ~/location_of_your_ec2_key_pair.pem ~/location_of_transfer_file/sample.txt ubuntu@ec2_your_ec2_instance.compute.amazonaws.com:~/folder_to_which_it_needs_to_be_copied
The command below will copy a file from the EC2 instance to your computer:
scp -i ~/location_of_your_ec2_key_pair.pem ubuntu@ec2_your_ec2_instance.compute.amazonaws.com:~/location_of_transfer_file/sample.txt ~/folder_to_which_it_needs_to_be_copied
I was facing the same problem. Hope this will work for you.
scp -rp -i yourfile.pem ~/local_directory username@instance_url:directory
The permissions also need to be correct for this to work.
It might be that you are using the wrong username. That happened to me, with the same error message:
Permission denied (publickey,gssapi-keyex,gssapi-with-mic).
lost connection

How to upload files and folders to an AWS EC2 instance?

I use SSH to connect to my Ubuntu instance. With SSH I can administer files and folders on the instance, but how do I upload files and folders from my local machine to the instance?
Is it possible to do right from SSH session, without using SFTP clients?
Just to add a bit more detail to the scp command (included in OS X and most Linux/Unix systems):
scp -i myssh.pem local_file username@200.200.200.200:/home/username
Obviously, replace the pem file with the one used for ssh access, and replace "username" and "200.200.200.200" with valid values for your setup.
You can try the kitten utility, which is a wrapper around boto3. You can easily upload/download files and run commands on an EC2 server, or on multiple servers at once for that matter.
kitten put -i ~/.ssh/key.pem cat.jpg /tmp [SERVER NAME][SERVER IP]
Where the server name is e.g. ubuntu or ec2-user, etc.
This will upload the cat.jpg file to the /tmp directory of the server.
As mentioned already, I've used WinSCP, which logs me in as "ec2-user"; then make sure to adjust that user's permissions via SSH. Example:
chown -R ec2-user /path/to/files
(Authenticate as the root user first.)
Whatever folder or files you need to edit via WinSCP, allow permissions on them (otherwise you will get a permission denied error when trying to upload/edit files in WinSCP).
You cannot copy files using ssh itself; you can use scp/sftp.
Use scp if you are on Linux, or WinSCP if you are on Windows.
You can use this:
scp -i yourkeypair.pem source destination
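For example, with hypothetical file and host values filled in (adjust the key, user, and DNS name to your instance):
scp -i yourkeypair.pem ./myfile.txt ec2-user@ec2-xx-yy-zz-tt.compute-1.amazonaws.com:/home/ec2-user/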
This works fine:
scp -r -i myssh.pem /local/directory remote_username@10.10.0.2:/remote/directory
-r for recursive
You could also install and set up an FTP server, which will allow you to set up users and directories for them to upload to. That being said, I've upvoted the above because scp/sftp is the ideal method.
The easiest way is to install Webmin and use the file manager (Java plugin) from your browser.
# Go to the home folder
cd ~
# Download the latest version
wget http://prdownloads.sourceforge.net/webadmin/webmin-1.660-1.noarch.rpm
# Install
sudo rpm -U webmin-1.660-1.noarch.rpm
# Change the default password of the root user
passwd
Finally, open port 10000 in the security groups.
Then, log in at
https://server_name:10000
with user: root and password: what_you_set_before.
