I'm setting up our company's first Mercurial web interface, and I've hit a bit of a roadblock. We will have multiple teams using this server, and I don't want team A to have to deal with team B's repositories, and vice-versa.
If hgweb served the repositories as they were held in the folder tree, that would be perfect. Unfortunately, all of the templates I've seen "flatten the tree" into a simple repository list. So I've been trying to set up multiple sites, so that https://hg.server/teamA lists only team A's repos, and https://hg.server/teamB shows only team B's repos. Sounds simple enough!
I'm using ISAPI rather than CGI. Unfortunately, the ISAPI handler seems incapable of pointing to more than one hgweb.config file.
I'm impressed with RhodeCode's look, but I have been unable to get it installed properly. I'm extremely new to Python, so some of the installation instructions are, to say the least, confusing.
So... any suggestions on how to successfully install RhodeCode, or otherwise accomplish what I need to do?
You're correct that hgweb will descend into subdirectories by default and show everything in a big flat list. But try setting
[web]
descend = False
to prevent this. You can still browse the subdirectories directly, as shown in this example on my server:
http://hg.lazybytes.net/team-a/
http://hg.lazybytes.net/team-b/
The top-level directory will then look pretty empty, though. Leaving web.descend at its default value might then be better: you'll see a big flat list at the top level, but can still browse the subdirectories to see only the repos specific to each team. Seems like the best of both worlds.
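For reference, here is a minimal hgweb.config sketch for this layout; the /srv/hg path is an assumption, so adjust it to wherever your team directories actually live (e.g. a Windows path on the ISAPI setup):

[paths]
# serve every repository found under /srv/hg, at any depth
/ = /srv/hg/**

[web]
# index pages list one directory level at a time instead of a flat list
descend = False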
In my scenario, I have two people who work on the same code base. Their only available workspace is a shared dev environment (whose built files also host the dev version of the site). As such, they perform their work directly in that location. I've recently introduced source control to the project and turned that location into a Git repository.
Let me preface by saying: Yes, I would love it if the dev host spot was a deploy-to spot, and these people had their own local copies of the source code. But that isn't feasible right now.
My question: Is it possible for two different Windows users/Git users (they have separate accounts they can use to interact with GitHub and the like) to share the same folder? My hope would be that SourceTree (our weapon of choice) or Git, at least, wouldn't have a problem with this: just show diffs of what's changed, and use the currently-logged-in user's information when making commits and other actions.
It looks like, while SourceTree has separate installation directories, it still embeds some account information in the .git folder itself. When I try to interact with Git (via a pull, for example), it first prompts for new credentials, but shortly thereafter it says "please enter password for {other-user}" without an option to switch usernames.
It looks like we'll just have to do things the right way after all. Painful (for them) but no choice.
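For reference, the pieces that genuinely have to differ per Windows account are small. A minimal sketch, assuming HTTPS remotes and the Git Credential Manager helper (the name and email are placeholders):

# Run once per Windows account; Git reads identity from the
# logged-in user's global config, not from the shared folder.
git config --global user.name "Alice Example"
git config --global user.email "alice@example.com"

# Keep HTTPS credentials in that account's Windows Credential
# Manager rather than anywhere in the shared repository.
git config --global credential.helper manager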
I'm curious how other people have approached this. Our group has been given the directive of implementing an internal website utilizing Joomla. We've set up a dev server for the person who is responsible for maintaining the site, and a production server. We're using IIS and the current version of Joomla.
I can sync the two with Akeeba Backup Core and Kickstart, but that seems to be an all-or-nothing choice. It works, but if she's working on, for example, the look and feel of the site and just wants to sync content, that doesn't appear to be doable.
I feel that someone out there must have tackled this goal before, but web searches seem to turn up people running dev/prod on the same server but in different subdirectories, or people who ignore the all-or-nothing nature of the issue and go for the "do it all at once" approach, which doesn't seem practical. Content changes frequently, but the look and feel does not.
We've been doing this for several years now. We use a dev server and a prod server. When we make content changes on dev, we use phpMyAdmin to copy the content table from the dev db to the prod db. In some respects, it's still an all-or-nothing approach, because we have to copy the entire content table at once. This means you can't have some pages still in development when you do the copy. In other respects, it's still a piecemeal approach, because we can copy individual tables such as modules, menus, etc. But again, it's ALL modules at once, ALL menus at once, etc. There is a way in phpMyAdmin to copy an individual page or item from a table in dev and put it in the corresponding table in prod, but it's a little cumbersome. It works, though.
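For concreteness, the table copy boils down to something like the following, assuming both databases sit on the same MySQL server and use the default jos_ prefix (all names here are placeholders):

-- Overwrite the prod content table with the dev one.
-- This really is all-or-nothing: every row moves at once.
DROP TABLE IF EXISTS joomla_prod.jos_content;
CREATE TABLE joomla_prod.jos_content LIKE joomla_dev.jos_content;
INSERT INTO joomla_prod.jos_content
  SELECT * FROM joomla_dev.jos_content;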
As for design elements (images, css, template changes, etc.), we do the same thing, but the copying is done manually by FTP from one server to the other. Obviously the same method applies to things like PDF files on dev that need to be moved to prod.
In summary, this method has worked fairly well for us for a long time. But its limitation is that you must realize you're copying an entire table at once.
The positive of all of this is that when we have pages that are in development, I have leverage over the content people to hurry and finish their work because one unfinished page can hold up the entire site!
This workflow dilemma has come up a few times for me.
You mention changes to look and feel, and that is really the simpler case, if it is just template changes. It is quite simple to pull down an Akeeba Backup of the live server, kickstart it onto a local server, work on the template files, and then upload the updated template files to the live server.
That said, if it is more than CSS and HTML tweaks to existing files, it can be a more involved process.
Personally I've not found a silver bullet for this sort of thing, but with some forethought and planning it is not too bad.
To start with some background, I am a member of a small team developing an ASP.NET application. In addition to us, there are 2 other teams working on it, all from different countries. Source code is hosted on a shared SVN server but there is no central testing environment. Each developer runs the app on their own machine and data services are set up per team.
Unfortunately our SVN workflow has some gaps in it: annoyances arise whenever it's time for an SVN update.
This is mainly because each developer and team has a slightly different environment in terms of disk directory structure and configuration (both IIS and the app itself). Hence we get conflicts in configuration files and elsewhere that in essence are not conflicts at all: in runtime configuration (XML) and in *.suo files.
How should we handle this if our objective is to keep checkout, app setup and update as painless as possible?
One option would obviously be master copies. Another would be establishing uniformity in developer environments and keeping it. But what about a third alternative?
One thing to do is to not put the .suo files into SVN, there's no reason to do that.
For IIS configuration there should be no argument - uniform environment across the build team.
For app.config files and the like, I tend to keep them in a separate "cfg" directory in the root of the project and use pre-build events to copy in the relevant ones I need depending on the project and environment I'm working on.
You could have a separate build task to copy user-specific config into your output directory. Add a new directory in your project root called "user.config" or something similar, and leave it empty. Then configure your project build to check this folder for entries and copy them to the output directory. This is easy to do, and then each dev can have their own config without affecting the master copies. Just make sure you have an ignore pattern on that folder so you don't commit user-specific configuration. If you have svnadmin access to your source code repo, you could set a hook to prevent it from ever happening.
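A sketch of that copy step as an MSBuild target, assuming the empty "user.config" folder described above (the target name is made up):

<!-- Copies any developer-specific *.config files from user.config
     into the build output; does nothing if the folder is empty. -->
<Target Name="CopyUserConfig" BeforeTargets="Build">
  <ItemGroup>
    <UserConfigFiles Include="$(ProjectDir)user.config\**\*.config" />
  </ItemGroup>
  <Copy SourceFiles="@(UserConfigFiles)"
        DestinationFolder="$(OutputPath)"
        Condition="'@(UserConfigFiles)' != ''" />
</Target>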
Also set ignore patterns on your root directory (recursively) for .suo, .user, _ReSharper or any other extensions you think are pertinent. There are some SO questions already on exactly this topic:
Best general SVN Ignore Pattern?
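One way to apply those patterns from the command line, run at the working-copy root (the pattern file and its contents are just a starting point, not an exhaustive list):

rem ignore.txt contains one pattern per line, e.g.:
rem   *.suo
rem   *.user
rem   _ReSharper*
svn propset svn:ignore -R -F ignore.txt .
svn commit -m "Set ignore patterns" .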
Ignore *.suo and *.user files in SVN; that part is easy. After that, create two types of config files in Subversion: Development and Server (add Test as well if you use one). See the example below.
ConnectionStringDevelopment.config
ConnectionStringServer.config
AppSettingsDevelopment.config
AppSettingsServer.config
The Server files contain the server settings. The Development files are not kept in SVN and are ignored there. Every new developer starts by copying the Server files and changing them to match his environment.
Look at the following example site:
http://code.google.com/p/karkas/source/browse/trunk/Karkas.Ornek/WebSite/web.config
The following lines are of interest:
<appSettings configSource="appSettingsDevelopment.config"/>
<connectionStrings configSource="ConnectionStringsDevelopment.config" />
configSource can be used almost everywhere in web.config, so you can vary every piece of configuration per developer. Just stick to the naming convention above and ignore *Development.config in Subversion. This way no developer-specific config will ever be added to Subversion.
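For illustration, a developer's local ConnectionStringDevelopment.config might look like this; the connection details are placeholders:

<connectionStrings>
  <add name="Default"
       connectionString="Data Source=localhost;Initial Catalog=MyAppDev;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>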
It's not a perfect solution (and should only be used if there are not many of those special files), but what I do is add fake files for each case and switch the real file locally to one of them.
In detail: I have a file foo that creates the problem. I also create foo_1 and foo_2 and then locally switch foo to foo_1 (I use TortoiseSVN, so I can't really give you the command line to do that, but see the sketch below). Then I am working on foo on my machine, but actually commit to foo_1. Other parties could then switch to foo_2...
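The command-line equivalent of that switch would be roughly the following; the repository URL is a placeholder:

rem point the working-copy file foo at the foo_1 sibling in the repo
svn switch https://svn.example.com/repo/trunk/foo_1 foo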
(I admit this is basically a variant of the master-file approach you suggested yourself, but if there are not many actual changes to those files, this at least reduces the number of conflicts you have to think about.)
Can I create a symlink to a local extension from another project folder? I have a common local server and I need to use the same extension on all local project installations. I tried putting in the symlink, but sometimes I do not get the expected output. I get it only after clearing the cache of that particular project.
Your scenario is a common one, I guess. But as Omar said, linking to the same code base of the extension from several TYPO3 instances is not good practice.
We have the same structure as yours, though, and we handle it through SVN. All of our projects have an SVN repository, and common extensions have their own repository. Through svn:externals the extensions are linked into the concrete project. This has the advantage that you can change the extension in the concrete project, and after committing, all other projects (which do have to update from SVN, though) benefit from it. I think this would fit your needs, too.
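A sketch of that svn:externals setup; the URL and the extension name are placeholders for your own layout:

# link the shared extension into this project's extension directory
svn propset svn:externals \
    "my_extension https://svn.example.com/extensions/my_extension/trunk" \
    typo3conf/ext
svn commit -m "Pull in shared extension via svn:externals" typo3conf/ext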
If I understand your question correctly, you have several TYPO3 sites on the same server and want to share an extension between them using a symlink. I don't think that is a very good idea, because many extensions use tables, and every site normally has its own database, so you would have to do a lot of tinkering to get that to work.
Instead, you could make all the modifications to the extension files in the typo3conf/ext/extension_name folder and then export the extension to a t3x file (Ext Manager in the Backend). This t3x file can be installed as an extension (Import extension) on all your other sites.
If your extension does not use a database and you are planning to make frequent changes, then I guess you should be able to make that work (the symlink). Otherwise I recommend you use the first approach I described.
I have not tried this, but you should be able to install extensions globally in Typo3. What this means is that the given extension is placed inside '(typo3_src/)typo3/ext/' instead of 'typo3conf/ext/', presuming both sites use the same Typo3 Core/Source (and thus typo3_src is a symlink to the location of the core).
You can enable installing global extensions via the Install Tool. Once inside the tool, click on 'All Configuration', then search for allowGlobalInstall. Or put the following line into your localconf.php:
$TYPO3_CONF_VARS['EXT']['allowGlobalInstall'] = '1';
Last, but not least, you need to make sure the 'typo3/ext/' directory is writeable.
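For example, on a Unix-style setup that might look like the following; the www-data user/group is an assumption (Debian-style Apache):

# let the web server write to the global extension directory
chown -R www-data:www-data typo3/ext/
chmod -R ug+rwX typo3/ext/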
Hope this will be of some help. If you have any further questions, let me know :)
As Björn mentioned, I'd suggest installing them globally. Mind you, updating the source will require moving the extensions accordingly...
As for "expected output": be aware that the code in these folders is cached in various ways (mainly page content and config settings), and hence not always run. This is the reason a change done from "outside" the current installation is likely not to propagate to your output without clearing these caches (as you have observed).
When you actually install an extension via the extension manager, the cache should (if correctly configured) be cleared (interested parties may search for clearCacheOnLoad in class.em_index.php to reveal a clear_cacheCmd('all')). There is a small checkbox, which is normally checked, during the installation process to accomplish this.
Omar's first approach is therefore, as I see it, the easier way to get the expected output, with less juggling of global extensions.
We're having a spirited discussion about this at my workplace. We're talking about user uploaded images for a bunch of products, not images needed to display the basic site. I say "no way" but I'm curious what others think.
Update: Just to clarify. These are customer supplied images for products that they are entering/modifying.
I agree with 'no way'.
Anything that may change on the site through day-to-day use, or is editable by whoever administers the website I consider to be 'content'. This includes uploaded files and database content, both of which are backed up separately. Nothing on the website that is in version control changes once it's been deployed. Easier that way.
Other ways of asking if something should be in version control:
Do the images change?
Are the changes related to anything else?
Can mistakes be made?
Is traceability wanted/needed?
If the rest of the site is version controlled, version control the images.
If the images are generated, version control the generator.
Presumably, what you are talking about is content that would be classified as user data, as opposed to project files. That stuff, while important, does not need versioning - that needs a plain old backup mechanism.
I recently added a new project into a fresh SVN repository, and every time I look at the 'uploads' folder I realise how stupid I was to include that in the initial commit.
It seems like what you're talking about is content that is (or perhaps will be) in a database. If a customer is supplying you a list of products as well as the pictures of those products, then that should all come from a database. In this case, I wouldn't version the images, because your database should be backed up, but not kept in the VCS.
If it is not, and your web site is static, then I would, but only because the images are "part of the site."
If you feel you must revision it, put these resources out of the path of the main repository somehow, and then give them a dedicated repository just for that content.
You don't want everyone who has to check out code getting a copy of every image on checkout or update; it's slow and pointless, and having the images in your primary tree will cause more headaches than you can imagine.
/common_ancestor
/project_code/ # repository a
/resources_dir/ # repository b
If you have to use symlinks or web-server magic to make this happen, then do that, but whatever you do, DON'T put content like that in your main repository.
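For example, with Apache (2.4 syntax) the "web-server magic" could be as simple as an alias; both paths are placeholders for your checkout locations:

# serve the separately checked-out resources under /resources
Alias /resources /srv/checkouts/resources_dir
<Directory /srv/checkouts/resources_dir>
    Require all granted
</Directory>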
As far as backups vs. revisioning go, revisioning it like this does give you a slight edge if you're using SVN as your distribution method as well: if a developer needs a copy of the images for testing purposes, it's relatively easy to get a reasonably up-to-date set of them.
If you aren't going to expose the versioning to the customers, then what would be the point?
The customers are already free to use version control on their own end, before they submit the files. You may want to encourage them to do so.