ImageMagick - change policy.xml on Heroku

I'm trying to access images via https on Heroku with Imagemagick. How can I change the policies (in policy.xml) on Heroku?
Heroku made an "ImageMagick security update" in May, 2016: https://devcenter.heroku.com/changelog-items/891
I can see the policy list, after typing heroku run bash and convert -list policy:
Path: [built-in]
  Policy: Undefined
    rights: None
Path: /etc/ImageMagick/policy.xml
[...]
  Policy: Coder
    rights: None
    pattern: HTTPS
[...]
How can I change the policy?
update 1: this is the error in the log file:
Command failed: convert.im6: not authorized `//scontent-fra3-1.xx.fbcdn.net/v/t1.0-9/13962741_132344500547278_4974691444630710043_n.jpg?oh=c169b4ffce9e5ce330ee99214cc6b8d5&oe=5880F245'

I’ve found a relatively simple solution.
Create a .magick directory in your app’s source, and add your policy.xml there. Then, you’ll have to set the environment variable MAGICK_CONFIGURE_PATH to /app/.magick in order to load your file with higher precedence than the default one.
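A minimal sketch of that setup, assuming the only thing you need is read access for the HTTPS coder (extend the policy for whatever else your app requires):
mkdir -p .magick
cat > .magick/policy.xml <<'EOF'
<policymap>
  <!-- allow ImageMagick to read images fetched over HTTPS -->
  <policy domain="coder" rights="read" pattern="HTTPS"/>
</policymap>
EOF
# make ImageMagick pick this directory up before the default policy
heroku config:set MAGICK_CONFIGURE_PATH=/app/.magick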

We need to install the third-party software ImageMagick on Heroku. I used this buildpack for installing ImageMagick: https://github.com/ello/heroku-buildpack-imagemagick
Inside bin/compile there is a policy file that restricts reading images over HTTPS; set the rights attribute to read, which allows reading over HTTPS.
Fork the repo, make your changes, commit, and add that repository URL to your Heroku buildpacks.

Read the warnings at ImageTragick, then make a backup and delete the line that restricts you.
You can find the file to edit in the same directory as the other XML config files by doing the following - the file is called policy.xml:
convert -debug configure -list font 2>&1 | grep -E "Searching|Loading"

Related

How to connect NPM to azure artifacts feed on Mac?

I'm trying to connect to my private npm feed from a Mac. I generated credentials from the 'Connect to feed' menu and they looked like this:
; begin auth token
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/:username=ANYTHING-BUT-EMPTY
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/:_password=BASE64-ENCODED-PAT-GOES-HERE
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/:email=npm requires email to be set but doesn't use the value
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/:username=ANYTHING-BUT-EMPTY
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/:_password=BASE64-ENCODED-PAT-GOES-HERE
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/:email=npm requires email to be set but doesn't use the value
; end auth token
I placed that in the .npmrc file in my project and it didn't work. When I try to do npm install I get this error:
code E401
npm ERR! Unable to authenticate, need: Bearer authorization_uri=https://login.windows.net/...,
Basic realm="https://pkgsprodsu3weu.app.pkgs.visualstudio.com/", TFS-Federated
I also placed these credentials in $HOME directory which also didn't solve the issue. What am I doing wrong? In which .npmrc file should they be? Should I run additional commands to use them?
The .npmrc file that includes the credentials should be placed in the $HOME directory.
Check the document Use npm to store JavaScript packages in Azure DevOps Services or TFS:
On your development machine, you also have an .npmrc file in $home for
Linux or Mac systems, or $env.HOME for Windows systems. This .npmrc
file should contain credentials for all of the registries that you
need to connect to. The npm client will look at your project's .npmrc
file, discover the registry, and fetch matching credentials from
$home/.npmrc or $env.HOME/.npmrc. The next section will discuss
credential acquisition.
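In other words, the project-level .npmrc only needs to name the registry, while the credential lines (shown below) live in $HOME/.npmrc. A minimal sketch of the project file:
; <project>/.npmrc - safe to commit, no credentials here
registry=https://pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/
always-auth=true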
Since it still does not work for you, you could check whether your npmrc token has expired. In your .npmrc I found you are using BASE64-ENCODED-PAT-GOES-HERE; it seems you are using a PAT, but of the 90-day token type. The .npmrc file should look like:
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/:username=ANYTHING-BUT-EMPTY
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/:_password=BASE64-ENCODED-PAT-GOES-HERE
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/:email=YOUREMAIL@EXAMPLE.COM
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/:always-auth=true
Check Create a token that lasts longer than 90 days.
If you still get the 401 error, please check whether your PAT has expired, whether it was converted to a Base64 string, and whether it has enough permissions.
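If you need to (re-)create the Base64 string for the _password field, a terminal one-liner like the following works (the PAT value is a placeholder; keep the real one out of source control):
echo -n "yourpersonalaccesstoken" | base64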
Hope this helps.
In my case, for some reason, I needed to surround the Base64-encoded token with double quotes and square brackets to make it work.
//pkgs.dev.azure.com/<yourorganization>/_packaging/<yourfeed>/npm/registry/:_password="[BASE64-ENCODED-PAT-GOES-HERE]"
Note the "[ right after the :, and the ]" at the end.
After adding that, everything worked just fine.
I'm running MacOS Big Sur 11.6
To be able to connect to the ADO npm feed without saving the credentials, you can get an access token in ADO, and pass that as a parameter to the following script:
setup-npmrc-feed-auth.bash
#!/bin/bash
# Usage: ./setup-npmrc-feed-auth.bash <raw-ADO-access-token>
DecodedPat=$1
# Base64-encode the token, as the registry expects for the _password field
NPMAuthIdent=$(echo -n "$DecodedPat" | base64)
pnpm config set registry https://pkgs.dev.azure.com/{orgName}/{projectName}/_packaging/{feedName}/npm/registry/ --location=global
pnpm config set //pkgs.dev.azure.com/{orgName}/{projectName}/_packaging/{feedName}/npm/registry/:username {orgName} --location=global
pnpm config set //pkgs.dev.azure.com/{orgName}/{projectName}/_packaging/{feedName}/npm/registry/:_password "$NPMAuthIdent" --location=global
pnpm config set //pkgs.dev.azure.com/{orgName}/{projectName}/_packaging/{feedName}/npm/registry/:email some@email.com --location=global
pnpm config set //pkgs.dev.azure.com/{orgName}/{projectName}/_packaging/{feedName}/npm/:username {orgName} --location=global
pnpm config set //pkgs.dev.azure.com/{orgName}/{projectName}/_packaging/{feedName}/npm/:_password "$NPMAuthIdent" --location=global
pnpm config set //pkgs.dev.azure.com/{orgName}/{projectName}/_packaging/{feedName}/npm/:email some@email.com --location=global
In your workspace, have the following:
.npmrc
registry=https://pkgs.dev.azure.com/{orgName}/{projectName}/_packaging/{feedName}/npm/registry/
auto-install-peers=true
strict-peer-dependencies=false
always-auth=true
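To run the script above, pass the raw (not yet Base64-encoded) access token as its only argument; the script encodes it itself. The token value here is a placeholder:
bash setup-npmrc-feed-auth.bash "your-ado-access-token"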

GitHub - Using multiple deploy keys on a single server

Background
I have a system where, when I push changes to my repository, a webhook sends a request to my site, which runs a bash script to pull the changes and copy any updated files.
I added a second repository with its own deploy key, but after doing so I was getting a permission-denied error when trying to pull changes.
Question
Is there a way to use two deploy keys on the same server?
Environment Details
Site uses Laravel 5.6, Symfony used to run shell script
Git 1.7
Go Daddy web hosting (Basic Linux one)
Notes
Script just runs git pull command
Error given is " Permission denied (publickey) "
SSH is used with a deploy key, so there is only read access; there is one other project also using a deploy key on the same server.
Thank you in advance for your help! Any other suggestions are welcome!
Edit #1
Edited the post to reflect the true problem, as it was different from what I thought (feel free to revert if this is bad practice); please see the answer below for details and the solution.
What I thought was an issue with authentication was actually an issue with Git not knowing which SSH key to use, as I had multiple keys on the server.
The solution was to use a config file in the .ssh folder and assign aliases to specify which SSH key to use for Git operations in separate repositories.
Solution is here: Gist with solution
The gist explains the general idea; it suggests using sub-domains, but a comment further down uses host aliases, which seems neater (a sketch of such a config follows below).
I have now resolved the issue and the system is working fine with a read-only, passphrase-less deploy key.
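A sketch of that host-alias setup, assuming two hypothetical deploy keys stored under ~/.ssh (the alias and file names are illustrative):
# ~/.ssh/config
Host github-site
    HostName github.com
    User git
    IdentityFile ~/.ssh/deploy_key_site
    IdentitiesOnly yes
Host github-api
    HostName github.com
    User git
    IdentityFile ~/.ssh/deploy_key_api
    IdentitiesOnly yes
Each repository's remote then uses its alias, e.g. git@github-site:owner/site.git, so ssh offers only the matching key instead of every key on the server.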
This can be done by customizing the GIT_SSH_COMMAND. As the ssh config only sees the host, you have to create aliases to handle different paths. Alternatively, since the git CLI sends the path of the repo to the GIT_SSH_COMMAND, you can intercept the request in a custom script inserted between git and ssh.
You can create a solution where you extract the path and add in the related identity file, if available on the server.
One approach to do this can be found here.
Usage:
cp deploy_key_file ~/.ssh/git-keys/github-practice
GIT_SSH_COMMAND=custom_keys_git_ssh git clone git@github.com:github/practice.git
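A rough sketch of what such a wrapper could look like (an illustration of the idea rather than the exact script from the linked approach; it assumes git calls the wrapper with just the host followed by the remote command, and that keys live under ~/.ssh/git-keys as in the usage above):
#!/bin/bash
# custom_keys_git_ssh (hypothetical): pick an identity file based on the repo path
host="$1"; shift
cmd="$*"   # e.g. git-upload-pack 'github/practice.git'
# turn "github/practice.git" into a key name like "github-practice"
repo=$(printf '%s' "$cmd" | sed -E "s|.*'/?([^']+)\.git'.*|\1|")
keyfile="$HOME/.ssh/git-keys/${repo//\//-}"
if [ -f "$keyfile" ]; then
    exec ssh -i "$keyfile" -o IdentitiesOnly=yes "$host" "$cmd"
fi
exec ssh "$host" "$cmd"   # fall back to normal key selection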

Unable to resolve "unable to get local issuer certificate" using git on Windows with self-signed certificate

I am using Git on Windows. I installed the msysGit package. My test repository has a self signed certificate at the server. I can access and use the repository using HTTP without problems. Moving to HTTPS gives the error:
SSL Certificate problem: unable to get local issuer certificate.
I have the self signed certificate installed in the Trusted Root Certification Authorities of my Windows 7 - client machine. I can browse to the HTTPS repository URL in Internet Explorer with no error messages.
This blog post by Philip Kelley explained that cURL does not use the client machine's certificate store. I followed the blog post's advice to create a private copy of curl-ca-bundle.crt and configure Git to use it. I am sure Git is using my copy; if I rename the copy, Git complains the file is missing.
I pasted in my certificate as mentioned in the blog post, but I still get the message "unable to get local issuer certificate".
I verified that Git was still working by cloning a GitHub Repository via HTTPS.
The only thing I see that's different to the blog post is that my certificate is the root - there is no chain to reach it. My certificate originally came from clicking the IIS8 IIS Manager link 'Create Self Signed Certificate'. Maybe that makes a certificate different in some way to what cURL expects.
How can I get Git/cURL to accept the self signed certificate?
The problem is that Git by default uses the "Linux" (OpenSSL) crypto backend.
Beginning with Git for Windows 2.14, you can now configure Git to use SChannel, the built-in Windows networking layer as the crypto backend. This means that it will use the Windows certificate storage mechanism and you do not need to explicitly configure the curl CA storage mechanism: https://msdn.microsoft.com/en-us/library/windows/desktop/aa380123(v=vs.85).aspx
Just execute:
git config --global http.sslbackend schannel
That should help.
Using schannel is by now the standard setting when installing Git for Windows. It is also recommended not to check out repositories over SSH anymore if possible, as HTTPS is easier to configure and less likely to be blocked by a firewall, which means less chance of failure.
Open Git Bash and run the command if you want to completely disable SSL verification.
git config --global http.sslVerify false
Note: This solution opens you to attacks like man-in-the-middle attacks.
Therefore turn on verification again as soon as possible:
git config --global http.sslVerify true
I had this issue as well. In my case, I was trying to get a post-receive Git hook to update a working copy on a server with each push. I tried to follow the instructions in the blog you linked to. That didn't work for me, and overriding the settings on a per-user basis didn't seem to work either.
What I ended up having to do was disable SSL verification (as the article mentions) for Git as a whole. Not the perfect solution, but it'll work until I can figure out a better one.
I edited the Git config text file (with my favorite line-ending neutral app like Notepad++) located at:
C:\Program Files (x86)\Git\etc\gitconfig
In the [http] block, I added an option to disable sslVerify. It looked like this when I was done:
[http]
sslVerify = false
sslCAinfo = /bin/curl-ca-bundle.crt
That did the trick.
NOTE:
This disables SSL verification and is not recommended as a long term solution.
You can disable this per-repository which still isn't great, but localizes the setting.
With the advent of LetsEncrypt.org, it is now fairly simple, automated and free to set up SSL as an alternative to self-signed certs and negates the need to turn off sslVerify.
I think kiddailey was pretty close; however, I would not disable SSL verification but rather just supply the local certificate:
In the Git config file
[http]
sslCAinfo = /bin/curl-ca-bundle.crt
Or via command line:
git config --global http.sslCAinfo /bin/curl-ca-bundle.crt
I faced this issue as well. It finally got resolved with guidance from this MSDN blog.
Update
Actually, you need to add the certificate to Git's certificate file curl-ca-bundle.crt, which resides in the Git\bin directory.
Steps
Open your GitHub page in the browser and click the lock icon in the address bar.
In the little popup that opens, click the 'view certificate' link; it will open a popup window.
In that window, go to the certificates tab (3rd in my case), select the top node (the root certificate), press the copy-certificate button at the bottom, and save the file.
In File Explorer, navigate to the Git\bin directory and open curl-ca-bundle.crt in a text editor.
Open the exported certificate file (from step 3) in a text editor as well.
Copy all of the content from the exported certificate to the end of curl-ca-bundle.crt, and save.
Finally, check the status. Please back up the curl-ca-bundle.crt file before editing it, to stay on the safe side.
An answer to Using makecert for Development SSL fixed this for me.
I do not know why, but the certificate created by the simple 'Create Self Signed Certificate' link in IIS Manager does not do the trick. I followed the approach in the linked question of creating and installing a self-signed CA Root; then using that to issue a Server Authentication Certificate for my server. I installed both of them in IIS.
That gets my situation the same as the blog post referenced in the original question. Once the root certificate was copy/pasted into curl-ca-bundle.crt the git/curl combo were satisfied.
To avoid disabling ssl verification entirely or duplicating / hacking the bundled CA certificate file used by git, you can export the host's certificate chain into a file, and make git use it:
git config --global http.https://the.host.com/.sslCAInfo c:/users/me/the.host.com.cer
If that does not work, you can disable ssl verification only for the host:
git config --global http.https://the.host.com/.sslVerify false
Note: you are subject to possible man-in-the-middle attacks when SSL verification is turned off.
In the case of GitHub repositories (or any non-self-signed certs), choosing the appropriate HTTPS transport backend option while installing Git for Windows resolved the issue.
To summarize all of the above answers in full detail:
Reason
This problem occurs because Git cannot complete the HTTPS handshake with the Git server where the repository you are trying to access is located.
Solution
Steps to get the certificate from the github server
Open the GitHub page you are trying to access in the browser
Click the lock icon in the address bar > click 'Certificate'
Go to the 'Certification Path' tab > select the topmost node in the certificate hierarchy > click 'View Certificate'
Now click on 'Details' and click on 'Copy to File..' > Click 'Next' > Select 'Base 64 encoded X509 (.CER)' > save it to any of your desired path.
Steps to add the certificate to local git certificate store
Now open the certificate you saved in Notepad and copy the content, including --Begin Certificate-- and --End Certificate--.
To find the path where all the certificates are stored for your Git, execute the following command in cmd:
git config --list
Check for the key 'http.sslcainfo'; the corresponding value will be the path.
Note: If you can't find the key http.sslcainfo, check Git's default path: C:\Program Files\Git\mingw64\ssl\certs
Now open 'ca-bundle.crt' present in that path.
Note 1: open this file in administrator mode, otherwise you will not be able to save it after the update. (Tip: you can use Notepad++ for this purpose.)
Note 2 : Before modifying this file please keep a backup elsewhere.
Now copy the contents of the file mentioned in step 1 to the end of the file from step 4, just like the other certificates are placed in ca-bundle.crt.
Now open a new terminal and you should be able to perform operations against the Git server using HTTPS.
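If you prefer the command line to the browser export, an openssl one-liner can dump the server's full certificate chain and append it to the bundle (the host name is an example, the path is Git's default mentioned above, and you may need an elevated Git Bash to write to it):
openssl s_client -connect git.example.com:443 -showcerts </dev/null 2>/dev/null | sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' >> "/c/Program Files/Git/mingw64/ssl/certs/ca-bundle.crt"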
I've just had the same issue, but using Sourcetree on Windows; the same steps apply for normal Git on Windows as well. By following the steps below I was able to solve this issue.
Obtain the server certificate tree
This can be done using chrome.
Navigate to the server address.
Click on the padlock icon and view the certificates.
Export all of the certificate chain as base64 encoded files (PEM) format.
Add the certificates to the trust chain of your GIT trust config file
Run "git config --list".
Find the "http.sslcainfo" configuration; this shows where the certificate trust file is located.
Copy all the certificates into the trust chain file including the "- -BEGIN- -" and the "- -END- -".
Make sure you add the entire certificate Chain to the certificates file
This should solve your issue with the self-signed certificates and using GIT.
I tried using the "http.sslcapath" configuration, but this did not work. Also, if I did not include the whole chain in the certificates file, this would also fail. If anyone has pointers on these, please let me know, as the above has to be repeated for each new install.
If this is the system Git, then you can use the option in Tools -> Options -> Git tab to use the system Git, and this then solves the issue in Sourcetree as well.
I have had this issue before, and solved it using the following config.
[http "https://your.domain"]
sslCAInfo=/path/to/your/domain/private-certificate
Since git 2.3.1, you can put https://your.domain after http to indicate the following certificate is only for it.
Jan 2021 - Got around this in VS2019 by setting Menu > Git > Settings > Git Global Settings > Cryptographic Network Provider > [Secure Channel] instead of [OpenSSL]
Git SSL certificate problem unable to get local issuer certificate (fix)
PS: Didn't need to set --global or --local http.sslVerify false. I was cloning an Azure DevOps repo which wasn't using any self-signed certs. This seems like an issue with either VS2019 or Git for Windows; they need to fix it!
In my case, as I had installed the ConEmu terminal for Windows 7, it created the ca-bundle during installation at C:\Program Files\Git\mingw64\ssl\certs.
Thus, I have to run the following commands on terminal to make it work:
$ git config --global http.sslbackend schannel
$ git config --global http.sslcainfo /mingw64/ssl/certs/ca-bundle.crt
Hence, my C:\Program Files\Git\etc\gitconfig contains the following:
[http]
sslBackend = schannel
sslCAinfo = /mingw64/ssl/certs/ca-bundle.crt
Also, I chose same option as mentioned here when installing the Git.
Hope that helps!
When using Windows, the problem is that Git by default uses the "Linux" crypto backend. Starting with Git for Windows 2.14, you can configure Git to use SChannel, the built-in Windows networking layer, as the crypto backend. To do that, just run the following command in the Git client:
git config --global http.sslbackend schannel
This means that it will use the Windows certificate storage mechanism and you don't need to explicitly configure the curl CA storage (http.sslCAInfo) mechanism.
One thing that messed me up was the format of the path (on my Windows PC). I originally had this:
git config --global http.sslCAInfo C:\certs\cacert.pem
But that failed with the "unable to get local issuer certificate" error.
What finally worked was this:
git config --global http.sslCAInfo "C:\\certs\\cacert.pem"
This solved my problem:
git config --global http.sslBackend schannel
Download certificate from this link:
https://github.com/bagder/ca-bundle
Add it to C:\Program Files\Git\bin and C:\Program Files\Git\mingw64\bin
Then try something like: git clone https://github.com/heroku/node-js-getting-started.git
git config --global http.sslVerify false
To fix the specific error "SSL certificate problem: unable to get local issuer certificate" in Git:
I had the same issue with Let's Encrypt certificates.
For a website with HTTPS we just need:
SSLEngine On
SSLCertificateFile /etc/letsencrypt/live/example.com/cert.pem
SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
Include /etc/letsencrypt/options-ssl-apache.conf
but git pull says :
fatal: unable to access 'https://example.com/git/demo.git/': SSL certificate problem: unable to get local issuer certificate
To fix it, we also need to add:
SSLCertificateChainFile /etc/letsencrypt/live/example.com/chain.pem
In my case, I had to use different certificates for different git repositories.
Follow the steps below (if you already have your repository's certificate, you can start from step 5):
Go to the remote repository's site, e.g. github.com, bitbucket.org, tfs.example...
Click the lock icon on the upper left side and click Certificate.
Go to the Certification Path tab and double-click the root certificate.
Go to the Details tab and click Copy to File.
Export/copy the certificate to wherever you want, e.g. C:\certs\example.cer.
Open Git Bash in your local repository folder and type:
$ git config http.sslCAInfo "C:\certs\example.cer"
Now you can use different certificates for each repository.
Remember, calling with the --global parameter will also change the certificates of git repositories in other folders, so you should not use the --global parameter when executing this command.
git config --global http.sslbackend secure-transport
(had to do that after updating to Big Sur)
This works for me. I opened the command line, ran the following command, and pulled again.
git config --global http.sslVerify false
I've had the same problem with Azure DevOps (Visual Studio). Finally I decided to clone my repo using the SSH protocol, because I preferred that over disabling SSL verification.
You only need to generate an SSH key; you can do it like so (see the SSH documentation):
ssh-keygen
And then import your public key on your Git host (Azure DevOps, GitHub, Bitbucket, GitLab, etc.).
I had this error occur when using Visual Studio. It occurs when you have the Cryptographic Network Provider setting set to OpenSSL in the Visual Studio Options window. When I changed the setting to Secure Channel, it was solved for me. This setting must have been changed for me when I upgraded my VS.
Error
push failed
fatal: unable to access
SSL certificate problem: unable to get local issuer certificate
Reason
After committing files on a local machine, the "push fail" error can occur when the local Git connection parameters are outdated (e.g. HTTP changed to HTTPS).
Solution
Open the .git folder in the root of the local directory
Open the config file in a code editor or text editor (VS Code, Notepad, Textpad)
Replace HTTP links inside the file with the latest HTTPS or SSH link available from the web page of the appropriate Git repo (clone button)
Examples:
url = http://git.[host]/[group/project/repo_name] (actual path)
replace it with either
url = ssh://git@git.[host]:/[group/project/repo_name] (new path SSH)
url = https://git.[host]/[group/project/repo_name] (new path HTTPS)
I resolved the issue by adding the entry below to the ${HOME}/.gitconfig file:
[remote "origin"]
proxy=
In most cases this happens when a proxy is enabled on your machine, so the entry mentioned above will fix the problem.
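The same override can also be written from the command line; this is just the CLI form of the snippet above and assumes your remote is named origin:
# writes an empty proxy override for the origin remote into ~/.gitconfig
git config --global remote.origin.proxy ""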
You might have a DNS issue and not a certificate issue, so before you disable SSL verification in your Git shell you should rule out a DNS problem. Cases such as these have been mentioned in Q&A forums such as https-issues-possibly-related-to-dns. If you are using WSL on Windows as your terminal, you can try running echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf and then issue the git commands to see if that makes a difference. This does not seem to be a permanent DNS fix (it lasts only for the lifetime of your terminal session), but it could help you determine whether it is a DNS issue rather than a certificate issue. You could also check this document on configuring your network to use a public DNS. Again, this is only to help you determine whether your DNS settings might need adjusting in order to resolve the certificate issues.
Download and install the local certificate. It is probably published on your company site, for instance as a *.cer file.
Right-click it and select Install Certificate. The 'Certificate Import Wizard' will appear. Select Local Machine, press Next, and confirm.
Select Place all certificates in the following store, press Browse and select Trusted Root Certification Authorities, then OK and Finish.
You can also check whether other applications can fetch, pull, or push data. For instance, in Android Studio (or probably IDEA) you should select this checkbox in Settings: Use credential helper.
I got this error when trying to "clone" the project. One work-around is to just use the "download as zip" on the webpage, which, for me, achieved what I wanted to do.
This might help some who come across this error. If you are working across a VPN and it becomes disconnected, you can also get this error. The simple fix is to reconnect your VPN.

How to setup Pydevd remote debugging with Heroku

According to this answer I am required to copy the pycharm-debug.egg file to my server, how do I accomplish this with a Heroku app so that I can remotely debug it using Pycharm?
Heroku doesn't expose the file system its web dynos run on to users, which means you can't copy the file to the server via SSH.
So, you can do this in one of the following two ways:
The best possible way to do this is by adding the egg file to your requirements, so that during deployment it gets installed into the environment and hence automatically added to the Python path. But this would require the package to be indexed on pip.
Or, commit this file into your code base, so that when you deploy, the file reaches the server.
Also, in your project's settings file (if using Django), add this file to the Python path:
import sys
sys.path.append('relative/path/to/file')
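Once the egg is importable, the usual pydevd pattern is to call settrace from your code. A minimal sketch; the host, port, and tunnelling setup are assumptions, since a Heroku dyno cannot reach your workstation directly:
import pydevd

# connect back to the PyCharm debug server; replace the host/port with an
# address the dyno can actually reach (e.g. an ngrok-style tunnel to your machine)
pydevd.settrace('your-tunnel-host.example.com', port=5678,
                stdoutToServer=True, stderrToServer=True, suspend=False)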

How do I set up Mercurial and hgweb on IIS?

I've been looking all over for decent instructions on how to get hgweb working on IIS but I haven't found much of worth.
There's this "step by step" on the Mercurial wiki, but it's not very good.
There's also this and this, but again, I can't find good steps to lead up to where those get started.
I just had to install a fresh Mercurial instance yesterday; here are updated instructions for 1.7:
Install Mercurial (these instructions were tested with 1.7)
Install Python (for Mercurial 1.7, you must use the x86 version of Python 2.6.6)
You will need to download the hgweb.cgi file from the Mercurial source. You can download the source by running: hg clone https://www.mercurial-scm.org/repo/hg/
Create a folder that will be your web application folder. You will need to copy three things into this folder:
The hgweb.cgi file
The contents of the Library.zip from your "C:\Program Files\Mercurial" folder
The Templates folder from your "C:\Program Files\Mercurial"
You will need to make sure you have Python set up in IIS.
Enable CGI via the following: Control Panel -> Turn Windows Features On or Off -> Roles -> Web Server (IIS) -> Add Role Services -> Check CGI
Create a new Web Site in IIS and make sure the physical path is the folder you created above
In the Handler Mappings for the new website, select "Add Script Map". Enter *.cgi for the request path, c:\Python26\python.exe -u "%s" for the Executable, and Python for the Name.
You will also need to create a file named "hgweb.config" with contents similar to below. The path within the file needs to be the location on your drive where you want to store the Mercurial repositories:
[collections]
c:\Mercurial\repos = c:\Mercurial\repos
Edit the hgweb.cgi file and change the line where it sets the path to your hgweb.config to something like the following (wherever the hgweb.config file is):
config = "C:\Mercurial\hgweb.config"
Now, open a browser and navigate to http://localhost/mercurial/hgweb.cgi (or whatever is the appropriate URL path you set up in IIS) and you should see the Mercurial Repositories page.
Also, check out Jeremy Skinner's blog post. It's a little outdated, but has some extra nice steps, like setting up URL rewriting for cleaner URLs.
It seems since Mercurial 1.5.2 was released, these tutorials don't work exactly right. For one thing, hgwebdir.cgi has been removed, and is now replaced with hgweb.cgi.
The instructions that worked best for me are at eworldui.net:
http://www.eworldui.net/blog/post/2010/04/08/Setting-up-Mercurial-server-in-IIS7-using-a-ISAPI-module.aspx
Those instructions are meant for IIS 7 or greater. If you're setting this up on IIS 6, I wrote up similar instructions geared toward Win2k3 and IIS 6.0:
http://partialclass.blogspot.com/2010/05/setting-up-mercurial-server-on-win2k3.html
UPDATE: Shortly after getting this working I learned that BitBucket changed their pricing scheme to offer free, unlimited, private hosting: https://bitbucket.org/. I would've opted for that in a heartbeat when I was originally working on this project.
Below is what I did, after a fair amount of research, to get hgwebdir.cgi set up on IIS 6. It is based on the following sites:
http://python.markrowsoft.com/iiswse.asp
http://www.jeremyskinner.co.uk/mercurial-on-iis7/
You'll need to install the following on the server:
Mercurial (I used version 1.5)
Python 2.6. The version of Python depends on the version of Mercurial installed.
Mercurial 1.5 uses Python 2.6. Install x86 even if you are running x64.
The steps for me were:
Create a directory for the website. I used c:\inetpub\wwwroot\hg.
In IIS, right click on the folder for hg, select properties, select the Home Directory tab.
Click on the Create application button. Set the execute permissions to "scripts".
Still in the Home Directory tab, click on the Configuration button. In the "Application Configuration" popup, click the Add button to add an application extension. The Executable is c:\Python26\python.exe -u "%s" "%s". The extension is .cgi. Set the "verbs" to "limit to: GET,HEAD,POST". Check both Script engine and Verify that file exists.
In the Directory Security tab, click on the Edit button in the Authentication and access control section. Uncheck all authentication methods, and check the "Basic authentication" method. Set the Default domain to your Active Directory domain if you like.
In IIS, click on the Web Service Extensions folder on the left panel. Click on "Add a new Web service extension" link. Extension name should be Python, the required file is c:\Python26\python.exe -u "%s" "%s". Make sure the new extension is "Allowed".
Now is a good time to test that Python is working. Create a file in your new Hg folder called test.cgi. Paste the following python code:
print 'Status: 200 OK'
print 'Content-type: text/html'
print
print '<html><head></head><body>'
print '<h1>It works!</h1>'
print '</body></html>'
Open the browser to your site, for instance, http://localhost/hg/test.cgi
You should see "It works!" in the browser.
Next let's get the hgwebdir working.
Delete test.cgi
clone the hg repo to a new directory: https://www.mercurial-scm.org/repo/hg/
copy hgwebdir.cgi to your web directory: c:\inetpub\wwwroot\hg\ from the cloned hg repo
Edit the file and change
application = hgwebdir('hgweb.config')
wsgicgi.launch(application)
to
application = hgwebdir('c:\inetpub\wwwroot\hg\hgweb.config')
wsgicgi.launch(application)
Unzip the Library.zip file in the Mercurial directory, c:\Program Files\Mercurial\, to your web directory, c:\inetpub\wwwroot\hg\
Copy the templates directory from c:\Program Files\Mercurial\templates\ to c:\inetpub\wwwroot\hg\templates\
Create a file called hgweb.config in your web directory.
Now is a good time to test it out. Go to the following URL in the browser, http://localhost/hg/hgwebdir.cgi
Edit hgweb.config, and paste the following:
[collections]
\\server\share$\Hg\ = \\server\share$\Hg\
[web]
allow_push = *
push_ssl = false
These are all my preferences, for instance we have our repos in subdirectories at \\server\share$\Hg. The web app will run under the permissions of the logged in user via the browser, so they'll need read/write permissions to the share.
The last step is to allow for long connections which can happen when you first clone a repo. Run the following command to increase the timeout to 50 minutes:
cd \inetpub\AdminScripts\
cscript adsutil.vbs GET /W3SVC/CGITimeout
cscript adsutil.vbs SET /W3SVC/CGITimeout 3000
Use mercurial to clone the mercurial repository:
hg clone https://www.mercurial-scm.org/repo/hg/
You will find hgwebdir.cgi at the top level. It should install like any other CGI script.
I've been fighting with this setup for Mercurial 1.7.2 for the past week or so; I had to do things slightly differently from the above articles in order to get it working.
Posting here because Google kept bringing me back here.
Full instructions posted here
I followed a combination of these instructions and these (in the source)
The main differences are that I had to do the "pure python" install of Mercurial, otherwise it would complain about missing DLLs, and I found it was important to use the "Python installers" for pywin and isapi-wsgi (maybe this is obvious to experienced Python developers, but I'm a Python newbie, so it was news to me).
Hope this helps somebody and I'm not just making stuff up (I might be; like I said, Python newbie).
The Hg red book contains some much better general instructions than I've seen elsewhere. They are not IIS-specific, but they are quite good:
http://hgbook.red-bean.com/read/collaborating-with-other-people.html#sec:collab:cgi
I was running into a "...can not load module..." type error and after some reading, the key for me was to ignore the Library.zip file in the Mercurial folder, and instead use the one from C:\Program Files (x86)\TortoiseHg folder.
That tip I found as #6 in this guide:
http://www.endswithsaurus.com/2010/05/setting-up-and-configuring-mercurial-in.html
Hope this helps someone...
I know this is an old question, but I really struggled getting Hg installed on Server 2019 and IIS 10.
Here is what I did to get it working:
Install Python 2.7 which in my case was python-2.7.18.amd64.msi. I will assume it's installed in C:\Python27. Make sure python is added to your path and that pip is installed.
Install Mercurial as a module using pip at the command line:
pip install mercurial
Under Default Web Site add a new application called hg and point it to the directory you want to use.
Configure Python as a CGI handler in IIS 10.0 for this new application (or the entire web server if you wish). You can do this manually or create/add the following to your web.config file:
<system.webServer>
  <handlers accessPolicy="Read, Script">
    <add name="Python 2.7" path="*.cgi" verb="*" modules="CgiModule" scriptProcessor="C:\Python27\python.exe -u &quot;%s&quot;" resourceType="File" />
  </handlers>
</system.webServer>
In the 'hg' application folder create a hgweb.cgi that looks similar to the following:
#!/usr/bin/env python3
#
# An example hgweb CGI script, edit as necessary
# See also https://mercurial-scm.org/wiki/PublishingRepositories
# Path to repo or hgweb config to serve (see 'hg help hgweb')
config = "hgweb.config"
# Uncomment and adjust if Mercurial is not installed system-wide
# (consult "installed modules" path from 'hg debuginstall'):
# import sys; sys.path.insert(0, "/path/to/python/lib")
# Uncomment to send python tracebacks to the browser if an error occurs:
#import cgitb; cgitb.enable()
from mercurial import demandimport
demandimport.enable()
from mercurial.hgweb import hgweb, wsgicgi
application = hgweb(config)
wsgicgi.launch(application)
In the 'hg' application folder create the hgweb.config file and point it at your repos like the following:
[collections]
C:\Web\www\hg\repos\ = C:\Web\www\hg\repos\
Navigate to http://localhost/hg/hgweb.cgi and enjoy!
You can try HgLab. This isn't exactly hgwebdir; rather it is a purely managed Mercurial implementation with push and pull server and repository browser.
