How to run a shell command from a Joomla component

I'm writing a component for Joomla 3 and want to back up the database periodically (e.g. after a user updates something). I'd therefore like to run mysqldump using shell_exec() (or similar), but I can't get this to work. I suspect it's a permissions issue, but I'm not sure how to resolve it...
Any ideas appreciated.

Your little question inspired us to write a post on how to run SSH commands from Joomla. You can find it here: http://www.itoctopus.com/how-we-ran-an-ssh-command-from-joomla
The post describes how we created a secure script that unblocks blocked IPs in CSF - but the nice thing about it is that it provides very clear instructions on how to run SSH commands from a Joomla extension (which is essentially what you need).
I really hope you enjoy this post and that it works for you. If it doesn't, please provide feedback and we can help!
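In the meantime, here is a minimal sketch of the mysqldump approach in PHP, assuming shell_exec() is not blocked by PHP's disable_functions and the web server user is allowed to run mysqldump (the backups path is hypothetical):

// Build the dump command from Joomla's own DB settings (configuration.php).
$config = JFactory::getConfig();
$cmd = sprintf(
    'mysqldump --host=%s --user=%s --password=%s %s > %s 2>&1',
    escapeshellarg($config->get('host')),
    escapeshellarg($config->get('user')),
    escapeshellarg($config->get('password')),
    escapeshellarg($config->get('db')),
    escapeshellarg(JPATH_SITE . '/backups/site.sql') // hypothetical path
);
// shell_exec() returns the command's output, or null if there was none;
// the 2>&1 above makes mysqldump's error messages visible in $result.
$result = shell_exec($cmd);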

Related

Bash script invoked in FreeRADIUS

Can you please help me hook my bash script into FreeRADIUS? I would like my script to run each time a user is allowed access to my network via FreeRADIUS.
I tried to insert my script into the SQL queries file (/etc/freeradius/3.0/mods-config/sql/main/mysql/queries.conf), but the script is never invoked.
If you have any idea on how to do this please let me know.
Thank you in advance!
Adding random things to the SQL configuration isn't going to help here.
You need to configure the exec module; the best example is in mods-enabled/echo (though also see mods-enabled/exec). There are examples in that file of how to point to the script you want to run, and what it should return.
Then, to ensure it runs after a successful authentication, make sure that echo (or whatever instance name you gave the module configuration) is listed in the post-auth{} section of the correct virtual server, most likely sites-enabled/default.
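For reference, a minimal sketch of what such a module instance might look like, modelled on the shipped echo example (the script path and instance name are assumptions):

exec run_my_script {
    wait = yes
    program = "/usr/local/bin/on-auth.sh %{User-Name}"
    input_pairs = request
    shell_escape = yes
}

and then list it in the virtual server:

post-auth {
    run_my_script
}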
Note that calling out to external scripts is nearly always a bad idea; it will cause performance to drop significantly. There is usually a better way to solve the problem.

Executing gcloud commands in bash

I've spent 3 days beating my head against this before coming here in desperation.
So, long story short, I thought I'd fire up a simple PHP site to give moderators of a gaming group I'm in the ability to start GCP servers on demand. I'm no developer, so I'm looking at this from a systems perspective to find the simplest solution that does the job.
I fired up an Ubuntu 18.04 machine on GCP, set it up with the Google SDK, authorised it for access to the project, and was able to run gcloud commands, which worked fine. I had some issues with the PHP file calling the shell script to run the same commands, but with some testing I can see it's now calling the shell script no worries (it broadcasts wall "test" to the console every time I click the button on the PHP page).
However, what does not happen is the execution of the gcloud command. If I run this shell script manually it starts up the instance no worries and broadcasts wall; if I click the button it broadcasts, but that's it. I've given the files execute permissions and even given the user nginx runs as sudo rights; putting sudo sh in front of the command in the PHP file also made no difference. Please find the bash script below:
#!/bin/bash
/usr/lib/google-cloud-sdk/bin/gcloud compute instances start arma3s1-prod --zone=australia-southeast1-b
wall "test"
Any help would be greatly appreciated; this, coupled with an automated shutdown, would allow our gaming group to save money by only running the servers people want to play on.
If you need any more detail about the underlying system, please let me know.
So I asked a PHP dev at work about this, and in two seconds flat she pointed out the issue; now I feel stupid. In /etc/passwd the www-data user had /usr/sbin/nologin as its shell; after I fixed that and ran the script, gcloud wanted permission to write a log file to /var/www. I fixed that too and it works fine. I'm not terribly worried about the page or even the server being hacked and destroyed; I can recreate them pretty easily.
Thanks for the help though! Sometimes I think I just need to take a step back and get a fresh set of eyes on the problem.
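For anyone else chasing the same symptom: folding the script's stderr back into PHP would have surfaced both errors immediately. A minimal sketch (the script path is hypothetical):

$output = array();
$status = 1;
// 2>&1 merges stderr into stdout, so gcloud's complaints get captured too.
exec('/var/www/scripts/start-server.sh 2>&1', $output, $status);
echo 'exit code ' . $status . "\n" . implode("\n", $output);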
When you launch a command while logged in, you have your account's access rights to the Google Cloud API, but the PHP account doesn't have those.
Even if you add the www-data user to root, that won't fix the problem; it may create some security issues, but nothing more.
If you really want to do this, you should create a service account that only has rights on the Compute instances inside your project, and supply its JSON key via the GOOGLE_APPLICATION_CREDENTIALS environment variable; this way your PHP has just enough rights to do what you are asking of it.
Note that the issue with this method is that if you are hacked, there is a chance the instance hosting your PHP could be deleted too.
You could also make a call to a prepared Cloud Function which creates the instance; this way, even if your instance is deleted, the Cloud Function would still be there.
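A sketch of the service-account route from PHP (the key path is hypothetical). One caveat: the gcloud CLI itself does not read GOOGLE_APPLICATION_CREDENTIALS - that variable is for the Google client libraries - so for the CLI you activate the key once instead:

// One-time setup on the server, run as the web user (www-data):
//   gcloud auth activate-service-account --key-file=/etc/myapp/sa-compute.json
// If you later switch to the PHP client libraries, point them at the same key:
//   putenv('GOOGLE_APPLICATION_CREDENTIALS=/etc/myapp/sa-compute.json');
$output = array();
exec('/usr/lib/google-cloud-sdk/bin/gcloud compute instances start arma3s1-prod'
    . ' --zone=australia-southeast1-b 2>&1', $output, $status);
echo $status === 0 ? 'started' : implode("\n", $output);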

CodeIgniter - Using environments with different hosts

I was wondering if someone could help me.
I have started using version control (git) for my website, which uses CodeIgniter.
Every time I transfer files from my localhost to my live server, I have to go through all my files and change the config details.
I came across a post saying I could do all this automatically with the ENVIRONMENT setting in the index.php file, based on SERVER_NAME.
Has anybody done this before? If so, would it be possible to let me know how it's done properly?
Cheers,
Try this for a start (index.php):
if ($_SERVER["HTTP_HOST"] == 'devserver1' || $_SERVER["HTTP_HOST"] == 'devserver2')
define('ENVIRONMENT', 'development');
else
define('ENVIRONMENT', 'production');
Then, whenever you need it, check the ENVIRONMENT constant (for example, to load different database settings). For localhost, simply check whether the host is 'localhost' ($_SERVER["HTTP_HOST"] == 'localhost'), or whatever virtual host name you might be using.
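Since CodeIgniter 2.0, once ENVIRONMENT is defined, a config file placed in a matching subfolder is picked up instead of the default automatically, so per-host database settings can stay out of your code entirely (the credentials below are hypothetical):

// application/config/development/database.php - loaded in place of the
// default application/config/database.php when ENVIRONMENT is 'development'.
$db['default']['hostname'] = 'localhost';
$db['default']['username'] = 'dev_user';   // hypothetical
$db['default']['password'] = 'dev_pass';   // hypothetical
$db['default']['database'] = 'mysite_dev';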
You could always use environment variables:
http://httpd.apache.org/docs/2.2/env.html
This will let you read the environment instead of hard-coding the information in your code.
This may also help you out
http://docstore.mik.ua/orelly/linux/apache/ch04_06.htm
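As a concrete sketch of the environment-variable approach: set a variable in the Apache vhost (or .htaccess) with SetEnv CI_ENV development, and read it in index.php (CI_ENV is just a conventional name here; SetEnv values show up in $_SERVER):

// index.php - fall back to production when the variable isn't set.
define('ENVIRONMENT', isset($_SERVER['CI_ENV']) ? $_SERVER['CI_ENV'] : 'production');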
Not sure if you still need help with this, but I had this issue a while ago and released a CodeIgniter module designed to automatically handle multiple environments.
I'm ashamed to be plugging myself, but it's saved me lots of editing and it might be of use to anyone else who reads this post in the future.
Here's the link to the Git repo: https://github.com/jedkirby/ci-multi-environments and this is a brief explanation of why and how I made the module: http://jedkirby.com/blog/2012/11/codeigniter-multiple-development-environments

Twilio script runs locally but not on server?

I've been using Twilio's library just fine locally (Mac OS X with XAMPP), but when I upload it to an Amazon EC2 instance, the ability to send SMS messages breaks.
$sms = $client->account->sms_messages->create(
"xxx-xxx-xxxx", $users[pnumber], "Testing!");
(the x's are numbers)
The above code seems to be what breaks it. I have uploaded the Twilio library to the correct directory, and I have also tried enabling all permissions to rule out a permissions issue.
I'm rather inexperienced at running things on my own server. Any guidance, guesses, and tips would be appreciated!
edit: Clarification - by "breaks", I mean the rest of the page does not load. If I add echo "Hi"; after the code above, it is not printed. However, echo-ing before the code works.
The problem was that I had not installed cURL on my server; it was not included in my PHP installation. Thanks to Kevin Burke's advice, I ran the script on the command line and realized it was calling a non-existent function. Some googling led to me installing cURL, which fixed the problem. Thanks Kevin!
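For anyone hitting the same wall: the old twilio-php library talks to the API via the cURL extension, so a quick guard at the top of the page turns the silent blank screen into an obvious error (the package name varies by distro):

// Fail loudly instead of silently when the extension is missing.
if (!function_exists('curl_init')) {
    die('PHP cURL extension missing - e.g. sudo apt-get install php5-curl, then restart the web server');
}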

Invoke a CakePHP console shell on server without command line access

Is there a way to invoke a CakePHP console shell on a server without shell access? I have written a shell for performing some one-off (and hence not a cron task) post-DB-upgrade tasks.
I could always just copy the logic into a temporary controller, call its actions via HTTP and then delete it, but I was wondering if there is a better way to go about it.
It seems this is a one-off script that you would typically want to run after DB updates, right?
If that's the case, you can make it part of your "DB update script".
If you use anything like Capistrano, you can include it there too.
In all cases, if you don't want to touch the shell, I agree that having a controller call the console code (or any PHP file running exec(), as mentioned previously) would do the trick.
Also, if you want to run it just once on a schedule, don't forget that you have the at command (instead of cron), which will run it at the scheduled date (see http://linux.about.com/library/cmd/blcmdl1_at.htm).
Hope it helps,
Cheers,
p.s.: if it's a console shell and you don't want to run it from the console, then just don't make it a console shell.
I have to agree with elvy. Since this is something you need to do once in a while after other events have happened, why not just create an 'admin' area for your application and put the code for that update in there?
You may be able to use PHP's exec() function to call it from any old PHP script:
http://www.php.net/exec
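A minimal sketch of that idea, assuming CakePHP 2's directory layout (the shell name and path are hypothetical; delete the file once the upgrade has run):

// one-off-upgrade.php - drop into the webroot, request it once over HTTP, then remove it.
$output = array();
exec('/path/to/app/Console/cake post_upgrade 2>&1', $output, $status);
header('Content-Type: text/plain');
echo 'exit code ' . $status . "\n" . implode("\n", $output);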
