I can't download phpMyAdmin.conf from Amazon EC2 - amazon-ec2

I have to change some settings inside "etc/httpd/conf.d/phpMyAdmin.conf".
I can't download this file using FileZilla, and I also tried the sudo nano command in PuTTY, but it returns an empty file. I don't know how to change the permissions for this file.
I've spent more than an hour on this. Please guide me if you know how to resolve it.

EC2 is a computer rental service, not a web hosting service, so you won't be able to connect with FTP (FileZilla) unless you run an FTP server on your EC2 instance.
As for editing the file while you're connected through SSH (PuTTY), make sure you're referencing the file correctly. Try running "sudo nano /etc/httpd/conf.d/phpMyAdmin.conf". Note the leading "/" on the file path; it's important.
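If you would rather edit the file on your own machine, you can also fetch it over SSH instead of FTP. A minimal sketch, assuming an Amazon Linux instance where the key path, the ec2-user login and the hostname are placeholders to replace with your own:
scp -i /path/to/your-key.pem ec2-user@your-instance-public-dns:/etc/httpd/conf.d/phpMyAdmin.conf .
scp -i /path/to/your-key.pem phpMyAdmin.conf ec2-user@your-instance-public-dns:/tmp/
ssh -i /path/to/your-key.pem ec2-user@your-instance-public-dns "sudo mv /tmp/phpMyAdmin.conf /etc/httpd/conf.d/"
FileZilla can do the same job if you connect with the SFTP protocol and your .pem key instead of plain FTP.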

Related

GoCD configuration issue

I've been having an issue trying to add GitHub materials from a private repo on a Windows server.
I've seen lots of people suggesting how to add the SSH keys and where, but only on Unix-based systems. I haven't found anything related to Windows servers.
I'm using the latest Go release and have installed the Go Server & Agent on Windows Server 2008 with Git installed.
I can connect to the private repo using Git Bash.
Whenever I try to add the materials, it keeps saying "Checking Connection" and looks like it stays there forever.
If I use basic auth it works, but I would like to make it work without exposing my password in the URL.
Is there a way to do that?
If you run Go under the default Local System account, you can follow the suggestions from http://opensourcetester.co.uk/2013/06/28/jenkins-windows-ssh/ to set up the SSH keys for the Local System account.
If you run the Go Server under a domain account (and not the default Local System account), check that you have uploaded your SSH keys to the %USERPROFILE%/.ssh/ folder on the server machine, %USERPROFILE% being the HOME folder for the domain user. Once you set that up, the Go server will be able to pick up the required keys. The same holds for the agent machines. Just so you know, Go does not invoke Git Bash internally to run the git commands, so any setup done in Bash will not take effect when Go runs git.
If you are using a Windows machine to host the GoCD server and agents, they do not run under a normal user account; they run under the "Local System Account".
So even though you can access your git repo from Git Bash (logged in as the current user), GoCD cannot access the same repo.
You therefore need to add the SSH keys for the Local System Account, copied from your current user, as shown below.
1. First find the home directory for the Local System Account (it will not reside under C:/Users).
2. Use a remote administration tool to find that home directory. If you go with http://download.sysinternals.com/files/PSTools.zip:
a) unzip it and run a command line as administrator
b) run PsExec.exe -i -s cmd.exe to start the tool
c) run echo %userprofile% to get the home directory (e.g. C:\Windows\system32\config\systemprofile)
3. Now you can either copy the SSH key files from the current user or create new ones using the ssh commands.
Try "Check Connection" after creating/copying the SSH keys; it should show "Connection OK!".
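A minimal sketch of the copy step, assuming your keys already live under C:\Users\<you>\.ssh and the Local System profile is at the path reported above (both paths are placeholders to adjust for your machine); run it inside the PsExec cmd.exe session so it executes as Local System:
xcopy /E /I "C:\Users\<you>\.ssh" "C:\Windows\system32\config\systemprofile\.ssh"
After copying, restart the Go server/agent services so they pick up the keys.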

FTP on LAMP stack - Google Cloud Platform

So I installed a LAMP stack on a Google Cloud instance with Debian Wheezy 7. Everything is working fine but I am not able to get FTP working. I am following this tutorial by DigitalOcean.
I am stuck at the last step, where I need to make vsftpd allow the user to write outside the chroot directory.
The error I get is:
hetunandu_gmail_com@lamp:~$ mkdir /root/hetunandu/files
mkdir: cannot create directory '/root/hetunandu/files': Permission denied
Then when I use sudo with it I get this error:
hetunandu_gmail_com@lamp:~$ sudo mkdir /root/hetunandu/files
mkdir: cannot create directory '/root/hetunandu/files': No such file or directory
Where do I go from here?
Also, I don't know how to set up my username and password for FTP.
I followed the tutorial and could not replicate your issue. I initially got "Permission denied" but you can circumvent this by running:
$ sudo su
and then
$ mkdir -p /root/$USER/files
Why not use /home/$USER? I'm not sure why you want to create the folders under /root.
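If you keep the files under your home directory instead, no root shell is needed at all; a minimal sketch (the "files" directory name is just an example):
$ mkdir -p /home/$USER/files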
As for your second question, regarding the username and password, I am not sure I understand. From the Developers Console > Compute Engine > VM Instances, click SSH and that should log you in with root privileges. Then you can create all the users you want:
$ sudo adduser test_user
Please don't use FTP, as it's an insecure clear-text protocol which will let others see your password, easily get access to your instance, read/modify/delete your files, etc.
Instead, you should use secure protocols such as SCP or SFTP with public key authentication.
Here are some options to transfer files to/from your GCE VM instance (see the example after this list):
the sftp CLI tool, as described in this answer
gcloud compute copy-files, as described in this answer
WinSCP with SFTP
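A minimal sketch of the gcloud option, assuming the gcloud SDK is installed and that "my-instance", the zone and the file paths are placeholders for your own values:
$ gcloud compute copy-files ./local-file.txt my-instance:~/ --zone us-central1-a
$ gcloud compute copy-files my-instance:~/remote-file.txt ./ --zone us-central1-a
The sftp route works the same way as with any SSH server once your public key is on the instance, e.g. sftp username@instance-external-ip.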

Failed to upload a file to EC2 instance

I'm a newbie and all I want is to set up an EC2 instance for Fever RSS.
Here is my info: OS X 10.9.2, AWS with an AMI of Ubuntu 12.04 LTS. I set up LAMP on EC2 following this guide: http://www.robotmedia.net/2011/04/how-to-create-an-amazon-ec2-instance-with-apache-php-and-mysql-lamp/
Now I can SSH to my server's public IP using Terminal. After connecting to the server, I typed
scp -i /path/to/keypair.pem /path/to/test.txt ubuntu@theServerPublicIP:~/
and got the error as follows:
Warning: Identity file keypair.pem not accessible: No such file or directory.
I have tried to resolve the problem by:
1. Changing the permission of the .pem file to 600 on my OS X machine:
chmod 600 keypair.pem
then running ssh and scp again, and got the same error. Then I changed its permission to 400 on my OS X machine:
chmod 400 keypair.pem
and redid ssh and scp, and got the same error.
2. Rewriting the file path as ~/path/to/file for both keypair.pem and test.txt, then redoing ssh and scp; got the same error.
3. Rewriting the file path as /Users/myUserName/path/to/file for both files and redoing ssh and scp; got the same error.
4. cd'ing to the folder containing keypair.pem and test.txt (I put them in the same folder) and trying both of the paths above; got the same error for each.
5. Changing the path on the server. I have tried "~", "~/", "/", "/var/www/"; for all of them I still got the same error.
I also tried ForkLift because I saw the developer of Fever using it in the demo video. I tried all options for connection: SFTP... but couldn't connect to the server.
Please help me get test.txt uploaded... then I will be able to upload the Fever folder.
Thanks!
If you have to do this frequently, I advise you to create an alias.
For example: I was running a webserver on an EC2 instance and I had HTML content in a local dir awsplaywww:
$ alias syncaws="rsync -avrz --delete /home/sanket/workspace/awsplaywww/ -e ssh sanket@awsplay1.ddns.net:/var/www/html/"
Now every time I update an HTML file or something and need to send it back to the server, I just open a terminal, type syncaws, and the job is done!
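To make the alias survive new terminal sessions, you would typically append it to your shell startup file; a minimal sketch using the same placeholder paths, user and host as the alias above:
$ echo 'alias syncaws="rsync -avrz --delete /home/sanket/workspace/awsplaywww/ -e ssh sanket@awsplay1.ddns.net:/var/www/html/"' >> ~/.bashrc
$ source ~/.bashrc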

Changing permissions on Windows for FTP via XAMPP

I am developing a rather basic CMS locally and cannot FTP due to permissions problems. I have the FileZilla FTP server running in XAMPP and can connect without problems, but I receive the error "550 Permission Denied" when my code tries to upload via FTP.
I tried changing permissions via my FileZilla FTP client but without success (presumably because it is Windows; error: "504 Command not implemented for that parameter"). I also tried going to the target folder's properties --> security --> permissions and checking "full" for the user XAMPP is running under.
So how do I set up the permissions to allow my CMS to use FTP in XAMPP?
Finally figured it out. For anyone else who has this problem: there is no need to set permissions in Windows. Instead, in the FileZilla FTP server admin interface, select Edit --> Users --> Shared folders and check "Write".
To fix this, make sure that the root folder allows writing. I had this same issue with a script running on a local XAMPP web server on Windows XP. Just change the root folder's attributes to 755 and try it; if that doesn't work, try 777. It should work then... hope this works for you, as it did for me.
If that does not work, go to the htdocs folder in your XAMPP directory, right-click it and open Properties. If "read only" is ticked, untick it and apply the change to all subfolders and files.
Are you behind a firewall? Try allowing the FileZilla Server program through the firewall.

Does anyone know how to download a project from nitrous.io?

I made a Ruby web application on nitrous.io. The tool is very nice and it helped a lot, but now I want to download the project to my computer and I didn't find any option to do that...
You can download and upload projects by any of the following options:
Utilize Nitrous Desktop to sync your files locally.
Upload your project to GitHub, and pull the project from there. Here is a guide on adding the SSH key to GitHub if needed.
Upload the content via SCP. To do this, you will need to add an SSH key to your account.
Next, run this command on your local machine, replacing {PORT} with the port # assigned to your Nitrous.IO box, and also replacing usw1 with the proper region found in the SSH URI on your boxes page.
To upload:
scp -P{PORT} -r path/to/yourFolder action@usw1-2.nitrousbox.com:~/workspace
To download:
scp -P{PORT} -r action@usw1-2.nitrousbox.com:~/workspace path/to/yourLocalFolder
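For example, if your box's SSH URI showed port 12345 in region usw1 (both values here are purely illustrative; use the ones from your own boxes page), the download command would look like:
scp -P12345 -r action@usw1-2.nitrousbox.com:~/workspace ~/Downloads/my-nitrous-project  # 12345 is an example port only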
I do not know the service, but apparently they offer ssh access. Then you can use scp to copy the files to your machine. Anyway, probably you should ask their support...
...post a summary of their answer here and close the question :)
The easiest way is to store your project in a Git repository and then push this repository to an external host. You will then be able to clone your project from the external repository to any machine you want, as sketched below.
Personally, I use Bitbucket, as it is free and very easy to set up. Have a look at the tutorials there.
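A minimal sketch of that flow from the Nitrous console, assuming you have already created an empty repository on Bitbucket and added your SSH key there (the repository URL is a placeholder):
git init
git add -A
git commit -m "Snapshot of my Nitrous project"
git remote add origin git@bitbucket.org:your-user/your-project.git
git push -u origin master
Then, on your own computer: git clone git@bitbucket.org:your-user/your-project.git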
OK, replying really late, but I hope this will help anyone still looking for this. Here is how I download stuff from Nitrous: no desktop utility needed, and no ssh/scp or adding keys.
What you do is simply make an archive of the folder you want to download:
tar -zcvf myarchive.tar.gz mydir/
Now you have a *.gz file. cd to whichever folder your .gz file is in and type:
python3.3 -m http.server 8080
You just started a cute little HTTP server ready to serve you your download. Now, from the Preview menu, click "Port 8080"; this opens a new browser tab showing your .gz file in the file listing (sample URL: http://yourboxes.apse1.nitrousbox.com:8080/). Click your .gz file and it will start downloading. Once the download is done, press Ctrl+C in the terminal to terminate the HTTP server.
This is not limited to Nitrous; you can make this work on many online VMs like Cloud9 etc.
