I am new to GitLab and Git, so apologies if this is a naive question. I am planning to install GitLab Omnibus on my Amazon EC2 server. Assuming the server has an Elastic IP, will it be possible to browse the code/docs committed there through a desktop or mobile web browser?
The idea is to have a private repository which is accessible (using a user ID/password) from a mobile web browser.
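For reference, exposing a GitLab Omnibus install on a public address usually comes down to setting its external_url and reconfiguring, provided the instance's security group allows inbound HTTP/HTTPS. A minimal sketch, assuming the Omnibus package is already installed; the IP below is only a placeholder for your Elastic IP:
# In /etc/gitlab/gitlab.rb, set the address GitLab should be served from,
# e.g.: external_url 'http://203.0.113.10'
sudo gitlab-ctl reconfigure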
I've started up a Google Cloud VM with the external IP address 35.225.45.169.
Just to check that I can serve a website from there, I've cloned a Hugo starter project and run hugo server --bind=0.0.0.0 --baseURL=http://0.0.0.0:1313:
kurt_peek@mdm:~/synamdm$ hugo server --bind=0.0.0.0 --baseURL=http://0.0.0.0:1313
Building sites … WARN 2020/01/02 04:36:44 .File.Dir on zero object. Wrap it in if or with: {{ with .File }}{{ .Dir }}{{ end }}

                   | EN
+------------------+----+
  Pages            | 16
  Paginator pages  |  0
  Non-page files   |  0
  Static files     | 20
  Processed images |  0
  Aliases          |  0
  Sitemaps         |  1
  Cleaned          |  0
Built in 112 ms
Watching for changes in /home/kurt_peek/synamdm/{content,layouts,static,themes}
Watching for config changes in /home/kurt_peek/synamdm/config.toml
Environment: "development"
Serving pages from memory
Running in Fast Render Mode. For full rebuilds on change: hugo server --disableFastRender
Web Server is available at http://0.0.0.0:1313/ (bind address 0.0.0.0)
Press Ctrl+C to stop
Now I would expect to be able to go to http://35.225.45.169:1313/ in my browser and see the website, but I find that I can't; instead, the operation times out (as shown below with a curl command):
> curl http://35.225.45.169:1313
curl: (7) Failed to connect to 35.225.45.169 port 1313: Operation timed out
Am I missing something here? How should I deploy this static website from the Google Cloud Compute instance to the internet?
Update
Following Ahmet's comment, I edited the VM to allow HTTP and HTTPS traffic. This appears to have created several Firewall Rules in the VPC Network tab (see below).
However, I'm still not able to access http://35.225.45.169:1313/ after this; are there specific rules that I must define?
You have to create a new firewall rule which allows the tcp:1313 port.
But why do you want to host a Hugo website on a GCP VM?
Have you checked out hosting a Hugo website on GCS or using Firebase?
https://gohugo.io/hosting-and-deployment/hosting-on-firebase/
As pradeep mentioned, you will need to create a new firewall rule that allows ingress traffic on port tcp:1313.
Here you will find more details on how to create firewall rules in Google Cloud Platform.
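For example, a minimal sketch with the gcloud CLI; the rule name is hypothetical, the VM is assumed to sit on the default network, and opening the port to 0.0.0.0/0 is only sensible for a throw-away dev server:
# Create an ingress rule allowing Hugo's dev-server port from anywhere.
gcloud compute firewall-rules create allow-hugo-dev \
    --network=default \
    --allow=tcp:1313 \
    --source-ranges=0.0.0.0/0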
Nonetheless, I think there are better approaches depending on the website that you would like to serve. Here you will find the different options available for serving websites in Google Cloud Platform, but mainly there are three:
Google Cloud Storage.
Google App Engine.
Firebase Hosting.
Google Cloud Storage
If you are serving a static website, I highly recommend you go with Google Cloud Storage or Firebase Hosting. It is true that they offer neither load-balancing capabilities nor logging, but they are an easy way to get started if you are new to Google Cloud Platform.
As shown here, if you would like to host a static site you can do it within Cloud Storage, but you will need to create a Cloud Storage bucket and upload the content to it.
Here you will find more information and a tutorial on how to host static websites within Google Cloud Platform using Google Cloud Storage.
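A minimal sketch of that workflow with the gsutil CLI; the bucket name www.example.com is a placeholder (a bucket served under your own domain has to be named after a domain you have verified):
# Build the site; Hugo writes its output to ./public by default.
hugo
# Create the bucket, sync the generated files, set the index/error pages,
# and make the objects publicly readable.
gsutil mb gs://www.example.com
gsutil -m rsync -r public gs://www.example.com
gsutil web set -m index.html -e 404.html gs://www.example.com
gsutil iam ch allUsers:objectViewer gs://www.example.com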
Google App Engine
Another option would be to use App Engine: not only is it fully managed on Google's infrastructure, it is also simpler than spinning up a VM and making sure the right ports are open; Google does that for you.
I have attached a tutorial on how to host Hugo on Google App Engine.
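Roughly, deploying the generated site comes down to an app.yaml that serves Hugo's ./public directory as static files and a single deploy command; the runtime and handler patterns below are my assumptions and may need adjusting:
# A minimal sketch, assuming a project is already selected with gcloud.
cat > app.yaml <<'EOF'
runtime: python39
handlers:
- url: /
  static_files: public/index.html
  upload: public/index.html
- url: /(.*)
  static_files: public/\1
  upload: public/(.*)
EOF
hugo && gcloud app deploy app.yaml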
Firebase Hosting
Finally, you could also use Firebase Hosting in order to serve your Hugo website. I have attached documentation with more detailed information about Firebase Hosting here.
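The usual flow with the Firebase CLI looks roughly like this; it assumes you already have a Firebase project and that you point the hosting "public" directory at Hugo's output folder:
# Generate the site into ./public, then initialise and deploy hosting.
hugo
firebase login
firebase init hosting     # when prompted, use "public" as the public directory
firebase deploy --only hosting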
I hope it helps.
I have installed Jenkins on my Mac and it can be accessed at http://localhost:8080. When I try to add a Git webhook, it says 'Couldn't connect to server'. How can GitHub connect to Jenkins installed on a Mac?
I have installed all the Git plugins. I have tried with a GitHub personal access token and with my GitHub password too.
No code involved here
I expect the GitHub webhook to connect to the Jenkins server.
You need to configure the webhook in GitHub first:
Set the URL in the webhook configuration as follows:
http://YourIpAddress:8080/github-webhook/
Now, configure Jenkins as mentioned below:
Set up your deploy keys (SSH keys) first in GitHub and in Jenkins Credentials. Then add the GitHub project info to your project, put in the SSH URL of the Git repository (if the repository is private), select the GitHub hook trigger for SCM polling in the Build Triggers section of the respective project, and add your other settings.
Now try to run it; it should work fine if all these steps are followed.
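A quick way to sanity-check the endpoint, assuming YourIpAddress is genuinely reachable from the internet and port 8080 is forwarded to the machine running Jenkins (GitHub has to be able to make exactly this kind of request):
# Expect an HTTP response from Jenkins rather than a timeout; a timeout means
# GitHub will not be able to deliver the webhook either.
curl -i http://YourIpAddress:8080/github-webhook/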
GitHub can't connect because your localhost is not accessible from the internet.
What are you trying to use Jenkins for?
Usually, you would want it installed on a server, not your local machine. That way it can be accessed by other team members and by other tools/services such as GitHub.
GitHub and Bitbucket are internet-based services.
As they are hosted outside your network, these services cannot reach your private IP (e.g. 192.168.0.120).
Depending on your internet access provider, you might have a "public" IP visible from the internet.
Once you have it, you then need to expose port 8080 (the Jenkins port) through your firewall or router.
Please note that opening your firewall on a well-known service port is a risk to consider.
I read that Heroku uses what they call Cedar containers in their infrastructure, which allows developers to use containerisation in their apps hosted on Heroku. If I'm not mistaken, that is; I'm new to all this.
Is it possible to run Docker containers on web servers and integrate them as part of your website? Or, at least, to come up with a method of converting Docker containers into Cedar containers, or something similar, that is compatible with the web server?
On your own private server I see no reason why you couldn't do this, but when it comes to commercial web hosting services, where does this stand?
You are not running "Docker on a web server"; you are running "Docker with a web server".
I mean, you are supposed to package your app into a Docker image together with some kind of web server.
After that, you can reach the app in this container like a regular web site. You can also host the container on some Docker host (for example, Docker Cloud, sloppy.io, ...).
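For instance, a minimal sketch serving a static site with the official nginx image; the directory name, container name, and port mapping are only illustrative:
# Serve the files in ./site on port 80 of the host, mounted read-only in the container.
docker run -d --name mysite -p 80:80 \
  -v "$PWD/site:/usr/share/nginx/html:ro" nginx:alpine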
As for Heroku, maybe you'll find this helpful.
I wish to create a private cloud using OpenStack for our college (which has to be hosted in our college). It will mostly be used for file sharing (right now this is one of our priorities), so that students can access the files hosted on it from home.
Right now I have installed the latest version of OpenStack on my local Ubuntu machine (an all-in-one machine) using DevStack for testing purposes, and it's working fine. Now I am planning to install multi-node OpenStack with the help of my friends' laptops (3-4 laptops) to host files which can be accessed through FTP or HTTP. Once it is configured on the local machines, we will replicate the same setup in our college environment. At this moment I am not able to understand which components need to be installed and what the OpenStack structure should be to achieve our requirement.
Please let me know how I should proceed.
I have started using Heroku's add-on for Elasticsearch, Bonsai. I want to create a backend search for several categories on my website. Since this is a backend-only service and may contain sensitive information, how do I restrict connections to the Bonsai server that Heroku has provided me to only the IP address/range of my web servers?
Note that my web servers are running on private hardware and are not hosted on a cloud service. I am also not using any other web service on Heroku, so I would prefer a non-Ruby answer.
I ended up working with the Bonsai team to set up a custom solution on their end that required a username and password combination to access any data on my hosted search.
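For what it's worth, with that in place every request has to carry the credentials. A quick check with curl might look like the following; the cluster hostname and the KEY:SECRET pair are placeholders (Bonsai normally hands them to you embedded in the cluster URL):
# Requests without valid credentials should be rejected; with them, the
# cluster health endpoint should answer.
curl -i "https://KEY:SECRET@my-cluster-1234.us-east-1.bonsaisearch.net/_cluster/health"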