I just started using Subclipse for my class projects after re-writing half of a project due to going off on the wrong logical branch. Since I'm using it on my home systems, I have a couple of questions whose answers I haven't found in searching.
Say I have separate local repositories on my computers. First, can I use a file sync to keep them synced until I figure out what I need to do to access the university's network? (Aside: the only information available about accessing the network remotely involves PuTTY and FTP, which are not valid URL schemes.)
Second, on the same local system, am I correct that once I check out a project and commit regularly, I don't have to keep checking out the project?
I ended up using SlikSVN for hosting the repository, since it is free for under 100 MB of storage, which is perfect for my classwork. Also, I don't have to bother with tunneling through the school's network.
You can access SVN repositories via SSH, using URLs of the form:
svn+ssh://me@myhost.com/mypath/to/repo
Since you say you have PuTTY access, this should be very possible.
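To make that concrete, here's a minimal sketch using the placeholder host and path from the URL form above. Note that checkout happens only once per working copy; after that, commit and update operate on the same checkout, which also answers your second question:

# Initial checkout: done once per working copy.
svn checkout svn+ssh://me@myhost.com/mypath/to/repo myproject
cd myproject
# ...edit files...
svn commit -m "describe the change"   # send your changes to the repository
svn update                            # pull in changes committed elsewhere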
I would avoid the "separate local repositories" route as much as possible, since keeping them in sync introduces problems that having one definitive location for the whole repository avoids.
This is sort of a top-level, full-stack question for someone who has some top-tier git knowledge.
In our current project environment we have a single dev server with a typical LAMP setup, building with Laravel. To handle 4 devs working off a single server, I set up multi-site serving with Apache, giving each dev their own port. Each port directs to a folder that each dev works from, giving them all their own code base but one URL with port suffixes.
Folder structure
/var/www/master_dev
/var/www/dev_1
/var/www/dev_2
etc.
The scenario basically goes: each dev has a port on which they do their assigned work; when completed, they create a branch and push, we check it, merge it to the standard port 80, and test for bugs.
We're currently on git version 2.35.3, but for some unknown reason, sometimes when we merge there is duplicated data, and sometimes old versions find their way in too.
Now, some devs auto-format their HTML. Some space differently. Some don't format anything, and it's horrible. But does that affect merging in any way, shape or form?
Is it the Apache serving on different ports when pulling/pushing? Each dev does a fresh pull from master every morning (or should).
When git pulls, is it intelligent enough to stay within the working directory it is run from?
Is it possible that this is just human error?
A lot of questions, I know, but I'm starting to lose the will to live.
Disclaimer: yes, I know of other approaches, e.g. local environments, containers, etc. I'm working on it; I'm coming to this party late.
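For concreteness, coming back to the pull and whitespace questions above, the daily routine looks roughly like this in shell, using the folder layout from the question (the branch name is illustrative, and -Xignore-all-space is an option under consideration, not something we run yet):

# Each dev's morning pull, run inside (or pointed at) their own folder;
# git scopes every operation to the repository containing that directory,
# so a pull in dev_1 doesn't touch dev_2's tree.
git -C /var/www/dev_1 pull origin master

# The merge to the port-80 code base; whitespace-only reformatting shows up
# as changed lines and can cause conflicts, and this option is one way to
# keep it out of the conflict picture:
git -C /var/www/master_dev merge -Xignore-all-space some-feature-branch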
Why do some companies or projects host Nexus on their own domain instead of using the Maven Central Repository? Is it related to security? Is this good practice?
Several reasons:
Have a place for the artifacts you build in your company.
Proxy several external repositories, so that settings.xml only needs to have an entry for the mirror (see the sketch after this list).
Circumvent proxy/firewall problems that stop developers from using outside repositories directly.
Actually, if you have more than one or two developers, it is the way to go.
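As a minimal sketch of that mirror entry, assuming a hypothetical Nexus 3 instance at nexus.example.com with its default maven-public group (both names are assumptions, not a real installation):

# Route all of Maven's repository traffic through the repository manager.
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>company-nexus</id>
      <mirrorOf>*</mirrorOf>
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
EOF

With that in place, developers never need per-repository entries in their own configuration.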
In addition to proxying several external repositories already mentioned, repository manager groups allow you to combine hosted repositories as well into a single source.
Said group can then be organized to return components in an ordered manner (central first, then others, for example).
This makes configuration simpler and allows you to access your internal and external artifacts from one place.
Additionally, if you wish to restrict who can access what, you can set up security policies to that effect. Usually that's not just related to Central: imagine you had 3 teams and didn't want them to share each other's artifacts. Then you'd have 3 repositories, with security restricted per team but otherwise the same configuration.
Another benefit is caching. If you download something, I download something and JF downloads something, that's 3 hits to the internet. If NXRM downloads it, that's 1 hit, and then you have it on your intranet.
Note, pretty much everything I just said is not specific to Maven; these are general repository-manager perks.
I am running a local ElasticSearch server from my own home, but would like access to the content from outside. Since I am on a dynamic IP and besides that do not feel comfortable opening up ports to the outside, I would like to rent a VPS somewhere, setup ElasticSearch and let this server be a read only copy of the one I have at home.
As I understand it, this should be possible - however I have been unsuccessful at creating any usable version that lets another server be a read-only version of my home ES-server.
Can anyone point me to a piece of information, or write a guide, that would help me set this up? I am rather familiar with ES usage, but my setup skills are still shaky.
As I understand it, this should be possible
It might be possible with some workarounds, but it's definitely not built for that:
One cluster needs to be in one physical region, mainly because of latency and the stability of the network connection.
There are no read-only versions. You could only allow read access to a node (via a reverse proxy or the security plugin), but that's only a workaround (sketched below).
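For the reverse-proxy workaround, here is a rough sketch, assuming nginx on the VPS in front of an Elasticsearch node on the default port 9200 (the config path and listen port are arbitrary):

# Pass only read requests (GET/HEAD) through to Elasticsearch; everything
# else is rejected, which approximates read-only access.
cat > /etc/nginx/conf.d/es-readonly.conf <<'EOF'
server {
    listen 8080;
    location / {
        if ($request_method !~ ^(GET|HEAD)$) {
            return 403;
        }
        proxy_pass http://127.0.0.1:9200;
    }
}
EOF
nginx -s reload

Keep in mind this only restricts outside callers; it doesn't turn the node behind it into a replica of your home cluster.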
I am reading Repository Management with Nexus, and its focus seems to be on acting as a local proxy. Instead, I would like to use it to distribute custom artifacts (very few of them, fewer than 10). Some of them might be open source and some private to one company or another (I'm a consultant).
Before I read the whole book and find out that Nexus is not for me, do you think this is a reasonable use case? I'm only at chapter 2, so I don't know what kind of authorization Nexus can provide for a single artifact. One option would be to install multiple copies of Nexus at different paths, each with an HTTP password I guess, albeit probably not the smartest approach.
The purpose of this question is to find out whether Nexus is suitable for distributing private artifacts to different companies with different privileges, working over the internet rather than an intranet, or whether I should look for other options. Thanks!
I think this is a very reasonable thing to ask of a Nexus installation. I've used both Artifactory and Nexus (two of the most popular Maven repository managers) and found Nexus to be much more flexible and full-featured. Sonatype has a similar setup to what you are asking about for their open-source artifact hosting. In that case, I believe the security only restricts uploads, not downloads, but I'm 99% certain that downloads can also be secured. JBoss also has a large public Nexus installation.
Do keep in mind, though, that Sonatype has both an OSS edition and a commercial edition of Nexus. The segregation you're looking for may be a commercial feature only. But I would still recommend Nexus for the purpose you described, as long as the cost isn't prohibitive. Hint: I think you'll start to really find what you're looking for when you get to chapter 6.
Nexus allows you to have both hosted and proxy repositories. Apart from this, it allows you to have virtual repositories and groups. Groups can be used for grouping your repositories under one name. So... you can set up a repository containing artifacts that should be visible only to one client, another one for your OSS artifacts, and then group them for a given client (see the sketch below).
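On the client side, a restricted hosted repository or group just needs matching credentials in settings.xml. A minimal sketch with hypothetical names (the <id> must match the repository id the client's POM references):

# Credentials for one client's restricted repository; the id, username,
# and password here are all placeholders.
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <servers>
    <server>
      <id>client-a-releases</id>
      <username>client-a</username>
      <password>their-password</password>
    </server>
  </servers>
</settings>
EOF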
You can also use the Pro version, which is paid and, as far as I recall, has this sort of feature.
Recently I stumbled across MongoDB, CouchDB, etc.
I am hoping to have a play with this type of database and was wondering how much access to the hosting server one needs to get it running.
If anyone has any knowledge of this, I would love to know whether it can be set up to work when your app is hosted via a 'normal' hosting company.
I use Mongo, and so I'm really only speaking for Mongo, but your typical web hosting environment wouldn't allow you to set up your own database. You'd want root-level (admin) access to the server to set up Mongo. To get that, you'd want something like a VPS or a dedicated server.
However, to just play around with Mongo, I'd recommend downloading the binary for your OS and giving it a run. Their JavaScript shell interface is very easy to use.
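For example, getting a throwaway instance running locally is roughly this (the data directory is arbitrary):

mkdir -p ./data            # a scratch data directory for this experiment
mongod --dbpath ./data &   # start the server as a background process
mongo                      # connect with the interactive JavaScript shell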
Hope that helps!
Tim
Various ways:
1) There are many free MongoDB hosting services available. Try DotCloud.com; many others are listed here: http://www.cloudhostingguru.com/mongoDB-server-hosting.php
2) If you are asking specifically about shared hosting, the answer is mostly no. But if you could run MongoDB somewhere else (like from the above link) and want to connect from your website, it is probably possible if your host allows your own extensions (for PHP).
3) VPS
How about virtual private server hosting? The host gives you what looks like an entire machine... hard drive, CPU, memory. You get to install whatever you want, since it's your (virtual) machine.
In terms of MongoDB, as others have said, you need the ability to install the MongoDB software and run it (normally as a daemon). However, hosted services are just beginning to appear, such as MongoHQ. Perhaps something like this might be appropriate once it's out of beta (or if you request an invite).
It appears hosted CouchDB services are also popping up, such as couch.io or Cloudant. I personally have no experience with Couch, so I'm less certain than with Mongo, but I'd imagine that, again, to run it yourself you'd need to install the software (and thus require root access).
If you don't currently have a VPS or dedicated server (or a cloud-based version of the aforementioned), moving your data out to a dedicated hosted service might be an ideal way to avoid the pain and expense of changing your hosting setup.
You can host your application and your database on different hosting servers.
For MongoDB, you can use MongoHQ or MongoLab, which offer 0.5 GB of space for free.