PostgreSQL 9.2 streaming replication recovery.conf - Windows

I am working on PostgreSQL 9.2 streaming replication. I have finished the setup on the master, and on the standby I want to set the parameters in the recovery.conf file.
But I cannot find that file, so I created a new file named recovery.conf, copied the entire contents of the recovery.conf.sample file into it, and edited the parameters.
I saved it, but when I start the PostgreSQL service it fails with the error
"service on local computer started and stopped....."
But when I remove the recovery.conf file, the service starts.
I need help.
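For reference, a minimal recovery.conf for a 9.2 streaming standby, placed in the standby's data directory, usually needs only a couple of parameters (the host, user, and password below are placeholders for your own setup):

standby_mode = 'on'
primary_conninfo = 'host=192.168.0.10 port=5432 user=replicator password=secret'

Also make sure the file is saved as plain text named exactly recovery.conf (Notepad can silently append .txt), and check the server log in the data directory's pg_log folder for the actual reason the service stopped.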

Related

Error at prompt: Setting up Greenplum Command Center web application with CentOS 6.5

We have a small GPDB cluster in which we are trying to set up the Greenplum Command Center web portal.
Environment:
Product | Version
Pivotal Greenplum (GPDB) | 4.3.x
Pivotal Greenplum Command Center (GPCC) | 2.2
The error occurs at the stage "Set up the Greenplum Command Center Console". We launched the following setup utility:
$ gpcmdr --setup
and get the following error at the prompt:
What is the hostname of the standby master host? [smdw]:sbhostname
(our standby hostname is sbhostname)
Done writing lighttpd configuration to /usr/local/greenplum-cc-web/./instances/gpcc/conf/lighttpd.conf
Done writing web UI configuration to /usr/local/greenplum-cc-web/./instances/gpcc/conf/gpperfmonui.conf
Done writing web UI clustrs configuration to /usr/local/greenplum-cc-web/./instances/gpcc/conf/clusters.conf
Copying instance 'gpcc' to host 'sbhostname'...
ERROR: the instance directory was not successfuly copied to the remote host: '/usr/local/greenplum-cc-web/./instances/gpcc'
You have to reload the configuration with gpstop -u, or restart the database, after the GPCC setup, because the setup adds some entries to pg_hba.conf for gpperfmon.
Also check that you have the correct entries in the .pgpass file in /home/gpadmin.
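As a rough sketch of those two checks (the port and password are placeholders; gpmon and gpperfmon are the usual Command Center role and database):

$ gpstop -u
$ cat /home/gpadmin/.pgpass
# format is hostname:port:database:username:password
*:5432:gpperfmon:gpmon:changeme
$ chmod 600 /home/gpadmin/.pgpass

gpstop -u reloads pg_hba.conf without a full restart, and .pgpass must be readable only by gpadmin for libpq to use it.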

Run post-processing commands on a remote server from Informatica Cloud

I am running a job on Informatica Cloud. It picks up a file from a remote server and dumps the data into Salesforce. After the Informatica job finishes, I want to run post-processing commands from Informatica Cloud on the source file, which sits on the remote server. Is this possible?
The files need to be present on the machine where the Agent is installed.
The post-processing command file cannot be in a remote location.

Laravel Beanstalkd job cannot connect to remote server via SSH

I've got a workflow in my web application (built in Laravel 4) that looks like this:
1) User uploads a file (up to 50 MB or so)
2) File is moved to a temp directory
3) A queued job is created that does the following:
- Uploads the file to Amazon S3
- SSHes into another file-processing server and transfers the file to a folder there
- Deletes the temporary file
To connect to the remote server and upload the file within the queued job, I'm using:
SSH::into('processing')->put($localPath, $remotePath);
Everything works fine when I queue this job using the 'sync' driver, so I know the environment and permissions are correct. The problem is, when I switch over to beanstalkd as my queue driver, the job fails with the following:
[2015-01-09 14:15:40] production.ERROR: exception 'RuntimeException' with message 'Unable to connect to remote server.'
Beanstalkd jobs run fine elsewhere in the application (none of the others have ssh commands).
I'm using a username and password for the connection, so it's not a key permissions or passphrase issue. Any ideas?
If you know the file has uploaded OK to S3, why not generate a new job to be run on the other file-processing server (a step 3b) that downloads the file from S3 if it needs it?
Other than that, you would need to do more debugging on the SSH upload.
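One thing worth ruling out while debugging: a daemonized worker (php artisan queue:work --daemon) keeps the config it loaded at startup, so it should be restarted after any change to the remote settings. As a sketch, the 'processing' connection in Laravel 4 lives in app/config/remote.php and, for password auth, would look roughly like this (host, credentials, and path are placeholders):

'connections' => array(
    'processing' => array(
        'host'      => '192.168.1.50',
        'username'  => 'deployer',
        'password'  => 'secret',
        'key'       => '',
        'keyphrase' => '',
        'root'      => '/var/www',
    ),
),

If the worker runs on a different box than the web server, also confirm that box can actually reach the processing server on port 22.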

Bash script to write log-processing output to a remote DB2

I have a bash script running on a server to capture and process the logs. The script is invoked by a utility for processing the logs. After processing, the state of the logs should be stored in a DB2 table located on another server.
Say I have the shell script on 110.88.99.10 and DB2 on 110.88.99.11; I need to save the processed result to DB2. Any suggestions?
It seems that you need to install the DB2 client (IBM Data Server Client). Once it is installed, you configure the remote instance and database (catalog the TCP/IP node and catalog the database), and then you can integrate db2 commands (db2 insert ... with the log result) into your script.
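A rough sketch of that flow from the script host, with placeholder node name, port, database, credentials, and table:

# one-time setup on 110.88.99.10 after installing the DB2 client
db2 catalog tcpip node lognode remote 110.88.99.11 server 50000
db2 catalog database logdb at node lognode

# inside the log-processing script
db2 connect to logdb user db2user using db2password
db2 "INSERT INTO log_status (log_name, state) VALUES ('app.log', 'PROCESSED')"
db2 connect reset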

Configuring MySQL fails with service not starting (Windows 7)

I am trying to use MySQL 5.5.30-win32. I installed it and ran the configuration wizard, and on the final page the wizard hangs while trying to start the mysqld service.
I tried starting mysqld manually and it says:
InnoDB: Error: log file .\ib_logfile0 is of different size 0 54525952 bytes
InnoDB: than specified in the .cnf file 0 115343360 bytes!
Then I tried changing innodb_log_file_size in my.ini to 52M (= 54525952 bytes) and starting it manually again.
Now it starts, but if I run the configuration wizard, the same problem occurs because the wizard overwrites my my.ini file and changes the value of innodb_log_file_size.
How do I solve this? I tried uninstalling MySQL and completely deleting the MySQL folder, but the problem still persists.
Moreover, I am not able to locate any ib_logfile0 anywhere in my MySQL 5.5 folder, so where is this 54525952 value taken from?
Found it. The ib_logfile0 was actually in the C:\ProgramData\MySQL folder, and C:\ProgramData is a hidden folder. Removing the ib_logfile* files and configuring the MySQL service again works fine.
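For reference, the matching setting in my.ini looks like this (52M is 52 x 1024 x 1024 = 54525952 bytes, which is where that number comes from):

[mysqld]
innodb_log_file_size=52M

After a clean shutdown, deleting ib_logfile0 and ib_logfile1 from the data directory (here under C:\ProgramData\MySQL) lets InnoDB recreate them at whatever size the config specifies on the next start.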
