Laravel dependency installation on live server - laravel

I have uploaded a Laravel site to a live server and now I want to install a library such as MailChimp. On my local machine I do this with Composer, but I want to do it on the live server where the site is hosted and running. How do I install a dependency and connect to the hosting server from the command line? Is it possible to access the hosting server's files from a terminal (DOS prompt etc.) and run Composer there?

Well, you can connect to the server through SSH and get access to the command line, but it depends on the server: you may or may not have the rights to install other programs, and to get your hands on Composer and install new dependencies you will probably need those rights.
It also depends on what type of server you have, a VPS or shared hosting. On a VPS you can usually set up an SSH connection; on shared hosting it depends on what the provider has allowed you to access.
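As a rough sketch, assuming you do have SSH access and can run PHP on the host (the hostname, project path, and package name below are placeholders, not anything specific to your setup):

# Hypothetical sketch: hostname, project path, and package name are placeholders.
ssh user@your-live-server.example.com

# On the server, move to the Laravel project root (where composer.json lives).
cd /var/www/your-laravel-site

# If Composer is not installed globally, pull the phar down into the project.
curl -sS https://getcomposer.org/installer | php

# Install the library; --no-dev skips development-only packages in production.
php composer.phar require mailchimp/marketing --no-dev

On a very restrictive shared host even this may be blocked, in which case the usual fallback is to run Composer locally and upload the vendor directory.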

Related

Execute actions before specific commands in Apache Guacamole

I set up Apache Guacamole 0.9.14 on CentOS 7, with nginx as a reverse proxy in front of it.
I want to give some of my employees limited SSH access to some of my servers.
Some of the connections are SFTP-enabled, and to guard against sabotage (intentional or not) I edited Guacamole's upload function so that any file uploaded through Guacamole is also copied onto the Guacamole server itself, alongside the destination server.
I was wondering if I could also keep a copy of files that reach the destination servers via wget, curl, etc.
If I could intercept specific commands on the destination servers and perform some actions before they execute (for example, backing up files to the Guacamole server before any rm -rf command runs, or keeping a copy of any wget'ed file on the Guacamole server), that would be great.
There are more than a thousand servers running different Linux distributions, so modifying anything other than the Guacamole server itself is not feasible.
Any idea how to intercept commands on the Guacamole server before they execute, especially over SSH?

Opscode Chef Server / Workstation force commands from server

Background: Chef Server version 12 and a Windows workstation (SDK 0.10), targeting Windows nodes.
I've created recipes, bootstrapped local Windows servers into the Chef manager, and applied recipes, so the very basics are all working.
Question: when running the bootstrap commands for a hosted server (e.g. Azure/AWS), I need the command to come from the Chef server, not the workstation.
I had hoped that setting chef_server_url in knife.rb would force all commands to come from there.
Wireshark shows the WinRM connections originating from my workstation.
Is there any setting, in knife.rb or elsewhere, that forces this?
Based on my searches I tried adding the following, but without success:
chef_zero.enabled false
local_mode false
Is this resolved through Chef Provisioning rather than Chef knife commands?
Many thanks in advance for any assistance you can give.
"when running the bootstrap commands for a hosted server (e.g azure / aws) I need the command to come from the Chef Server not the workstation." is not correct. Knife commands that manipulate servers go directly from your workstation, and this is how it is supposed to work. The way the bootstrap functions is it starts the cloud machine using the relevant provider API, then connects to the new VM via SSH or WinRM and installs Chef, and then launches chef-client using a configuration file based on your knife settings (this is where chef_server_url comes in).

CentOS 6.4 Minimal + how to configure jenkins jobs via xml?

I need to create a build server on CentOS 6.4 Minimal. I successfully installed:
Java compiler (OpenJDK 1.7.0)
Git or Mercurial
Maven
Jenkins
Now I need to do the following:
At given intervals (e.g. daily at midnight), compile the latest revision in the version control system (tip, HEAD, ...) with Maven. In addition, Javadocs and packages (jar, war) need to be created.
Then have Jenkins run all the tests and report the results.
Make sure there is a report of previous builds.
Ensure that the Javadocs and packages (jars, wars, ...) of the latest build can be downloaded.
I can't use a GUI on CentOS Minimal, so do I need to configure the jobs in XML files? Could someone please show me the way? I'm not a Linux server guru.
It's a bit impractical to configure Jenkins via XML by hand, because Jenkins' configuration is spread over multiple files, and the format of the configuration files changes between releases.
Given that Jenkins is a web application, you should be able to visit port 8080 (Jenkins' default port, assuming you didn't change it) on the server where you installed Jenkins (e.g. http://mycentosserver.example.com:8080), and configure it via the web interface.
If you're unable to access the web interface because of a firewall or similar, but you are able to SSH to the server (presumably you can, given that you were able to install everything on it), you could set up an SSH tunnel to forward a port on your local machine to port 8080 on the server. For example, from your local machine, run:
ssh -L 28080:127.0.0.1:8080 mycentosserver.example.com
You will then be able to access Jenkins on your local machine at http://localhost:28080. If you're on Windows, you can use PuTTY to do the same thing.
If you can't access the web app directly, and you can't SSH tunnel, I'd recommend setting up Jenkins on a server where you can access the web app, configuring it there, and copying the XML config files from /var/lib/jenkins on that server across to your CentOS server.
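The copy step could be sketched roughly as follows; the hostnames are placeholders, and /var/lib/jenkins plus the SysV service name are the defaults from the Jenkins RPM on CentOS 6:

# Hypothetical sketch: copy job definitions from an already-configured Jenkins
# box to the CentOS server, then restart Jenkins there to pick them up.
rsync -av /var/lib/jenkins/jobs/ root@mycentosserver.example.com:/var/lib/jenkins/jobs/
ssh root@mycentosserver.example.com 'chown -R jenkins:jenkins /var/lib/jenkins/jobs && service jenkins restart'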

Confusion about FTP

I am learning web development and I'm a bit stuck with FTP. I know it's used for file transfer, but how do I actually use it? I found some PHP functions to connect to an FTP server and log in, but what do I log in with? How do I create a username? Is FTP something like MySQL, with its own command line? Or is it something like Apache?
I am using Ubuntu 12.04 and I have LAMP installed. I read somewhere that I need to install a program to use FTP, but somewhere else that I need to install FTP while installing PHP. This is really confusing.
Thanks.
FTP is the File Transfer Protocol. It is not a programming language. FTP is used to connect to a computer and access its file system, to upload or download files; imagine opening a folder on a friend's PC from your own computer. On most Linux desktops you can type an FTP address into the location bar of whatever file browser you're using and access the FTP server as if it were a local folder. You can also use dedicated software for that, such as gFTP or FileZilla.
An FTP daemon does not come with LAMP. Please refer to https://help.ubuntu.com/12.04/serverguide/ftp-server.html for details on how to install and configure an FTP daemon on Ubuntu.
What you will use FTP for is putting your .php files on a remote machine. If you are working only on your own computer, you most likely do not need FTP.
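As a rough sketch, assuming vsftpd as the daemon on Ubuntu 12.04 (package and service names may differ on other setups):

# Hypothetical sketch on Ubuntu 12.04, using vsftpd as the FTP daemon.
sudo apt-get install vsftpd       # install an FTP server
sudo service vsftpd restart       # make sure the daemon is running

# Connect with the command-line client and log in with an existing system
# user account - by default FTP has no separate user database of its own.
ftp localhost

The same system username and password are what you would pass to PHP's ftp_login() if you connect from code.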

How do I use MAMP, SVN Server, and the SVN client all on the same machine?

I have a dev box (a Mac mini) that hosts all of my development sites. I have an SVN server configured on it and have created two repositories so that I can push my work from my laptop to the server. Now I want to "update" the webdev directory (where I have my files set up for MAMP) so that I can see my changes on my dev server. Is there any way to do this? I've tried connecting to localhost using the SVN client and I get a 200 OK error, plus another error I can't remember how to recreate.
Can the SVN server, client, and web server all co-exist on the same machine, and if so, how do I update/commit/checkout using the client on the same machine?
Can SVN Server, Client, and Webserver all co-exist on the same machine
Yes
how do I update/commit/checkout using the client on the same machine
The same way as for any remote host: connect to the repository and use the svn commands.
BTW, serving the site straight from a working copy is "worst practice".
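A rough sketch of what that looks like on the Mac mini itself; the repository URL and the MAMP htdocs path are placeholders, and note this is exactly the working-copy-as-site pattern warned about above:

# Hypothetical sketch: repository URL and htdocs path are placeholders.
# Check the repository out under MAMP's document root...
svn checkout http://localhost/svn/webdev /Applications/MAMP/htdocs/webdev

# ...then pull in new changes after committing from the laptop.
cd /Applications/MAMP/htdocs/webdev
svn update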

Resources