How to connect Sync Gateway using the config file? - couchbase-lite

I have referred to the official Couchbase Server documentation, where it is stated that to connect Sync Gateway using a config.json file you should use the command
$ sync_gateway config.json
But my question is: is this command for Windows? And if not, how should I run Sync Gateway using the config.json file?

For Windows:
1) Open a command prompt.
2) Change into the Sync Gateway folder (where the sync_gateway .exe is located).
3) Run the above command as it is.


Azure DevOps ThirdParty Tools for build / Deployment

pipelines:
  default:
    - step:
        name: Push changes to Commerce Cloud
        script:
          - dcu --putAll $OCCS_CODE_LOCATION --node $OCCS_ADMIN_URL --applicationKey $OCCS_APPLICATION_KEY
    - step:
        name: Publish changes Live Storefront
        image: Python 3.5.1
        script:
          - python publishDCUAuthoredChanges.py -u $OCCS_ADMIN_URL -k $OCCS_APPLICATION_KEY
Environment variables:
$OCCS_CODE_LOCATION: path to the location of all OCCS code
$OCCS_ADMIN_URL: URL for the administration interface on the target Commerce Cloud instance
$OCCS_APPLICATION_KEY: application key to use to log into the target Commerce Cloud administration interface
So I want to use an Azure DevOps repository for CI / CD.
In the above code block you can see that I have specified the dcu and Python code in two tasks.
dcu is a third-party Node.js tool from Oracle that needs to be used to migrate code to the cloud system. I want to know how to use that tool in Azure DevOps.
Second, there is the Python (or Node.js) script that I want to use to invoke the REST API to publish the changes.
So where do I place those files, and how do I invoke them?
Update:
I hosted the self-hosted agent pool and am able to access the system.
I just started executing basic bash code, but ended up with two issues:
1) Git extracts the files from the repository to _work/1/s, and I am not sure how that path is decided. How can I change that location?
2) I ran 'pwd' and I am in the correct path, but the 'dcu' command fails. I tried npm and a few other commands and they fail too. But things like mkdir and rmdir create and remove folders correctly in the desired path. When I run the 'dcu' command manually from a terminal on the same machine, it works fine as expected.
You can follow the steps below to use the DCU tool and Python in Azure Pipelines.
1. Create an Azure git repo containing the dcu zip file and your .py files. You can follow the steps in this thread to create an Azure git repo and push local files to it.
2. Create an Azure build pipeline. Please check here to create a YAML pipeline; here is a good tutorial for getting started.
To create a classic UI pipeline, please choose Use the classic editor in the pipeline setup wizard, and choose Start with an Empty job to start with an empty pipeline and add your own steps. (I will use a classic UI pipeline in the example below.)
3. Click "+" and search for the Extract files task to unzip the DCU zip file. Click the three dots on the Destination folder field to select a destination folder for the extracted dcu files, e.g. $(agent.builddirectory). Please check my answer in this thread for more information about predefined variables.
4. Click "+" to add a PowerShell task, and run the script below to install dcu and run the dcu command. For environment variables (like $OCCS_CODE_LOCATION), please define them on the Variables tab of the pipeline.
cd $(agent.builddirectory)   # the folder where the unzipped dcu files reside
npm install -g               # install the extracted dcu package globally so the dcu command is available
.\dcu.cmd --putAll $(OCCS_CODE_LOCATION) --node $(OCCS_ADMIN_URL) --applicationKey $(OCCS_APPLICATION_KEY)
5. Add a Use Python version task to define a Python version to execute your .py file.
6. Add a Python script task to run your .py file. Click the three dots on the Script path field to locate your publishDCUAuthoredChanges.py file (this .py file and the dcu zip file were pushed to the Azure git repo in step 1 above).
You should then be able to run the script from the question above in the Azure DevOps pipeline; a sketch of what that publish script might look like is shown below.
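For reference, publishDCUAuthoredChanges.py could look something like the sketch below, matching the way it is invoked in the question (-u for the admin URL, -k for the application key). This is only an illustrative sketch: the /ccadmin/v1/login and /ccadmin/v1/publish endpoints and their payloads are assumptions on my part, so verify them against the Commerce Cloud Admin REST API documentation before relying on them.
# Hypothetical sketch of publishDCUAuthoredChanges.py -- the endpoint paths and
# payloads below are assumptions; check the Commerce Cloud Admin REST API docs.
import argparse
import requests

parser = argparse.ArgumentParser()
parser.add_argument("-u", "--url", required=True, help="Commerce Cloud admin URL")
parser.add_argument("-k", "--key", required=True, help="application key")
args = parser.parse_args()

# Exchange the application key for a short-lived access token (assumed endpoint).
login = requests.post(
    args.url.rstrip("/") + "/ccadmin/v1/login",
    headers={"Authorization": "Bearer " + args.key},
    data={"grant_type": "client_credentials"},
)
login.raise_for_status()
token = login.json()["access_token"]

# Ask the instance to publish the authored changes (assumed endpoint).
publish = requests.post(
    args.url.rstrip("/") + "/ccadmin/v1/publish",
    headers={"Authorization": "Bearer " + token},
    json={},
)
publish.raise_for_status()
print("Publish request accepted with status", publish.status_code)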
Update:
_work/1/s is the default working folder for the agent. You cannot change it. Though there are ways to change the location where the source code is cloned from git, the tasks' working directory still defaults to that folder.
However, you can change the working directory inside the tasks, and there are predefined variables you can use to refer to locations on the agent. For example:
$(Agent.BuildDirectory) is mapped to c:\agent\_work\1
$(Build.ArtifactStagingDirectory) is mapped to c:\agent\_work\1\a
$(Build.BinariesDirectory) is mapped to c:\agent\_work\1\b
$(Build.SourcesDirectory) is mapped to c:\agent\_work\1\s
The .sh scripts in the _temp folder are generated automatically by the agent; they contain the scripts from the bash task.
For the above 'dcu command not found' error, you can try adding the dcu command's path to the system Path variable in your local machine's environment variables. (A path set in the user variables cannot be seen by the agent jobs, because the agent uses a different user account to connect to the local machine.)
Or you can use the physical path to the dcu command in the bash task. For example, let's say dcu.cmd is at c:\dcu\dcu.cmd on the local machine. Then in the bash task, use the script below to run the dcu command.
c:/dcu/dcu.cmd --putAll ...

aws s3 cli not working in Windows Task Scheduler

I tried running the following aws cli command in the console and it works correctly.
I have aws access key and secret configured.
aws s3 sync "C:\uploadfolder" s3://uploadfolder
However, when I run it inside Windows Task Scheduler on Windows 10 as well as Windows Server 2012, I get the following error:
cannot find the file specified 0x80070002
It does not seem to be a corrupted profile, because it fails on both Windows versions and other commands run as expected.
Is there any step that I missed, or is any other special command needed when running the aws cli in Windows Task Scheduler?
Your cli command is attempting to sync a FILE called "uploadfolder". You need to change to the directory first, then run the command. Your command should instead be:
cd C:\uploadfolder
aws s3 sync . s3://uploadfolder/
This will recursively copy all files in your local directory that are not in your s3 bucket. If you would also like the sync command to delete files that are no longer in the local directory, you also need to add the --delete flag.
aws s3 sync . s3://uploadfolder/ --delete
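If you prefer to point Task Scheduler at a single script, a small Python wrapper like the sketch below avoids depending on the scheduler's working directory or PATH. The aws.exe location is an assumption here; adjust it to wherever the CLI is installed on your machine.
import subprocess

# Absolute path to the AWS CLI; Task Scheduler does not always inherit your PATH,
# so spelling it out avoids "cannot find the file specified" style failures.
# This location is an assumption -- adjust it for your install.
AWS_EXE = r"C:\Program Files\Amazon\AWSCLIV2\aws.exe"

subprocess.run(
    [AWS_EXE, "s3", "sync", ".", "s3://uploadfolder/", "--delete"],
    cwd=r"C:\uploadfolder",  # run the sync from the local folder, as in the answer above
    check=True,              # raise if the CLI exits with a non-zero status
)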

How to see Parse Server cloud code logs?

I have Bitnami's Parse Server set up on Azure.
I'm logging some info from cloud code using console.log and console.error. When using hosted Parse these logs were displayed in the Info & Error Logs section of the Dashboard. Any idea where the logs go now?
The issue is not specific to Bitnami's distribution. I also tested on a local machine with parse-server-example & Parse Dashboard and got the same result (no logs).
I use AWS, but you can see the logs by downloading them or by running the server on localhost: just cd into your folder, run npm start in the terminal, and switch your Parse Server URL to http://localhost:1337/parse.
You can manually download them through the Azure CLI.
Take a look here for installation: https://azure.microsoft.com/en-us/documentation/articles/xplat-cli-install/
I used npm: npm install azure-cli -g
Open up a terminal and type in: azure site log download webappname
This will save the logs for the web app named 'webappname' to a file named diagnostics.zip in the current directory.
Unzip and open the folder diagnostics -> LogFiles -> Application
The text file with -stderr- in its name contains the logs you write with console.error() in your cloud code.
The text file with -stdout- in its name contains the logs you write with console.log() in your cloud code.
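If you end up downloading the logs often, a small Python helper like the sketch below (standard library only, assuming diagnostics.zip sits in the current directory) extracts the archive and prints the application log files described above.
import zipfile
from pathlib import Path

# Extract the diagnostics.zip produced by `azure site log download webappname`
# and print the application logs. Assumes the zip is in the current directory.
out_dir = Path("diagnostics")
with zipfile.ZipFile("diagnostics.zip") as zf:
    zf.extractall(out_dir)

for log_file in sorted((out_dir / "LogFiles" / "Application").glob("*.txt")):
    # Files with -stderr- hold console.error() output; files with -stdout- hold console.log() output.
    kind = "console.error()" if "-stderr-" in log_file.name else "console.log()"
    print("--- " + log_file.name + " (" + kind + " output) ---")
    print(log_file.read_text(errors="replace"))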
This is a known issue on Bitnami Parse. We are working on fixing it for the next release.
You have to log in to your server via SSH and modify the line below in the /opt/bitnami/apps/parse/htdocs/server.js file:
From:
cloud: "./node_modules/parse-server/lib/cloud-code/Parse.Cloud.js",
To:
cloud: "./cloud/main.js",
You have to include the path to the ./cloud/main.js you previously created (assuming you created it in /opt/bitnami/apps/parse/htdocs/).
Remember to restart the server after applying those changes by running:
sudo /opt/bitnami/ctlscript.sh restart

Keeping PHPStorm files in sync with the ones generated on the server via php artisan

I am using Laravel with PHPStorm and a custom server that I connect to via SFTP. The problem is that, being SFTP, it's not in sync, so every time I generate files via a php artisan command I have to download the file(s) with PHPStorm. I know that I can get around that by using Homestead and shared folders, but this project requires a custom VPS.
I know that no SFTP "drive" currently works reliably on Windows. Also, the server is remote, not on the same network, so Samba can't do the job.
Thank you!
This is a workflow I use; you may simply need to do the following, assuming you have already set up a default deployment server.
Editing remote files
If you are editing the remote files instead of a local copy, don't; instead:
create a local copy/git clone/etc. of your project files.
create a new phpstorm project with the local copy.
Setting up a sync
If you are already working off a local copy but just need sync set up:
ctrl+shift+a
type deployment
select options
change the option: Upload changed files automatically [..] to always
enable upload external changes
As an added bonus, this also automatically syncs assets generated by, say, gulp watch.
If you haven't setup a deployment server
ctrl+shift+a
type deployment
select configuration
create a new server with your method of connecting to it.
enable as default server (last icon on the top left column)
Important: if you don't select the server as the default, it will not be able to auto upload changes.
Also don't forget to set up the excludes in the configuration menu; I usually exclude bower_components and node_modules from deploying to my servers, and only send the built assets. (But it's up to you.)
EDIT: Don't run commands remotely, run them locally and let them sync back to the server.
I execute the artisan commands on both sides... I do it this way on my Linux machine:
<?php
// Drop the script name and forward the remaining arguments to artisan.
unset($argv[0]);
$params = implode(' ', $argv);

// Run artisan on the remote machine first.
$remoteOutput = shell_exec("sshpass -p password ssh -o StrictHostKeyChecking=no user@1.1.1.1 'php /path/to/artisan $params'");

// If the remote call produced output, run the same artisan command locally too.
if (!empty($remoteOutput)) {
    shell_exec("php artisan $params");
}
Save it and add it as a command line tool in PHPStorm... On Windows, I think you can use the PHP SSH library or something else.
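On Windows (or anywhere sshpass is not available), a Python version of the same wrapper could look like the sketch below. The user, host and remote artisan path are placeholders, and it assumes an ssh client with key-based authentication is on the PATH.
import subprocess
import sys

# Forward all arguments to artisan on the remote host first, mirroring the PHP wrapper above.
params = sys.argv[1:]
remote = subprocess.run(
    ["ssh", "-o", "StrictHostKeyChecking=no", "user@example.com",
     "php /path/to/artisan " + " ".join(params)],
    capture_output=True,
    text=True,
)

# Like the PHP version, only run artisan locally if the remote call produced output.
if remote.stdout.strip():
    subprocess.run(["php", "artisan"] + params)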

How to setup Pydevd remote debugging with Heroku

According to this answer I am required to copy the pycharm-debug.egg file to my server. How do I accomplish this with a Heroku app so that I can remotely debug it using PyCharm?
Heroku doesn't expose the file system it uses for running web dynos to users, which means you can't copy the file to the server via SSH.
So you can do this in one of two ways:
The best way is to add the egg file to your requirements, so that during deployment it gets installed into the environment and hence automatically added to the Python path. But this would require the package to be available on the pip index.
Or, commit the file into your code base, so that when you deploy, the file reaches the server.
Also, if you are using Django, add the file to the Python path in your project's settings file:
import sys
sys.path.append("relative/path/to/file")  # path to pycharm-debug.egg (or the folder containing it)
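Once the egg is importable on the dyno, a guarded settrace call is the usual way to attach to a PyCharm debug server. The sketch below is only illustrative: the environment variable names are placeholders, and the host must be something the dyno can actually reach (Heroku cannot connect back to your local machine directly, so you typically need a publicly reachable host or a TCP tunnel).
import os

# Only attach the debugger when explicitly asked to (placeholder variable names).
if os.environ.get("REMOTE_DEBUG") == "1":
    import pydevd  # provided by pycharm-debug.egg
    pydevd.settrace(
        os.environ["DEBUG_HOST"],                        # host PyCharm's debug server listens on
        port=int(os.environ.get("DEBUG_PORT", "5678")),
        stdoutToServer=True,                             # mirror stdout/stderr into PyCharm
        stderrToServer=True,
        suspend=False,                                   # don't pause the app at startup
    )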
