Alfresco deployment doesn't work - Maven

I've created the simple example from here, packaged it with mvn package, and tried to deploy the resulting AMP file as described here. But after restarting Alfresco I didn't see any changes: there were no new workflow definitions to choose from. Running java -jar alfresco-mmt.jar list <WARFileLocation> showed that the modules org.alfresco.integrations.google.docs and org.alfresco.vti are installed in the chosen WAR, but there was not a word about my helloworld workflow.
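For reference, the packaging and deployment steps I followed looked roughly like this (the AMP file name and the Tomcat path are placeholders for my local setup):

    # package the module into an AMP
    mvn package

    # install the AMP into the Alfresco WAR with the Module Management Tool
    java -jar alfresco-mmt.jar install target/helloworld-workflow.amp tomcat/webapps/alfresco.war

    # list the modules registered in the WAR afterwards
    java -jar alfresco-mmt.jar list tomcat/webapps/alfresco.war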
UPD: I looked deeper through the tutorial and found that to add an Activiti workflow to Share you first need to add another one to Alfresco itself. The tutorial says:
Open a command-line window and switch to $TUTORIAL_HOME/workflow-tutorial-repo. Run mvn integration-test -Pamp-to-war -Dmodule.log.level=debug. Your repo tier project will be installed and started on Tomcat running on port 8080.
Open a new command-line window and switch to $TUTORIAL_HOME/workflow-tutorial-share. Run mvn integration-test -Pamp-to-war -Dmaven.tomcat.port=8081
Why is that? Can't I just deploy the single Share project on its own? Do I really need to make a project for Alfresco (the repo) first?

With the command you quoted you start the repo. Since Share and the repo run on the same server, they have to use different ports, so for Share you need to provide another port, which is done with the additional parameter -Dmaven.tomcat.port=8081.
UPD
@NikitinMikhail The quote you've added describes how to start Share.
Alfresco consists of two projects (according to the Maven SDK you use), which are the repo and Share.
Alfresco Share provides a rich web-based collaboration environment for managing documents, wiki content, blogs and more. Share leverages the Alfresco repository to provide content services and utilises the Alfresco Surf Platform to provide the underlying presentation framework.
In other words, Share is just a separate project which communicates with the repo and provides a better user interface than the repo does.
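To recap the two-terminal setup from the tutorial quote above, each tier ends up on its own embedded Tomcat:

    # terminal 1 - repo tier, embedded Tomcat on the default port 8080
    cd $TUTORIAL_HOME/workflow-tutorial-repo
    mvn integration-test -Pamp-to-war -Dmodule.log.level=debug

    # terminal 2 - Share tier, second embedded Tomcat moved to port 8081
    cd $TUTORIAL_HOME/workflow-tutorial-share
    mvn integration-test -Pamp-to-war -Dmaven.tomcat.port=8081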

Related

How can I create a script to get code, publish and run it on an empty machine (.NET Core WebApi)

I have a question.
How can I create scripts to:
Get my code from a repository (GitHub, GitLab...)
Build
Publish
Test
Run in IIS
The script should run on Windows or Linux, and assume that I start with an empty VM.
The application is a .NET Core WebApi.
I searched the web but couldn't find a template for getting code from a repository.
This is doable with scripts like @Scott said, but you should also consider using an existing solution, because there are some great free ones out there, like TeamCity with Octopus integration. Here is what you need to consider if you decide on making the scripts yourself (a rough command sketch follows the list below):
The VM you have is empty, so the runtimes need to be installed, and you need to check that they are compatible with the code you are trying to deploy.
The scripts for some parts of the deployment will need to run under a user with sufficient privileges.
You will also need to handle the web server configuration with the scripts.
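As a rough sketch, and assuming the .NET Core SDK and git are already installed on the VM, the get/build/test/publish part of such a script boils down to the standard CLI calls (the repository URL and folder names are placeholders):

    # get the code from the repository (URL is a placeholder)
    git clone https://github.com/yourorg/yourwebapi.git
    cd yourwebapi

    # build and test
    dotnet restore
    dotnet build --configuration Release
    dotnet test

    # produce the output that IIS (or Kestrel behind a reverse proxy) will serve
    dotnet publish --configuration Release --output ./publish

Wiring the publish output into IIS (or a reverse proxy on Linux) would then be the web server configuration step mentioned above.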
And those are only a few of the things on the list for that path. Having said that, there is also the path of containers, which handle most of this through code and can be deployed to all of the environments you mentioned. With containers you only need to make sure there is a container runtime on the VMs you want to deploy to, and it is much easier to handle since, as I mentioned, it is all in code and easily changed, unlike ad-hoc scripts.
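For comparison, a container-based rollout on an otherwise empty VM is roughly this (the image name is a placeholder, and the Dockerfile itself is not shown):

    # build the image once, wherever your build runs
    docker build -t yourorg/yourwebapi .

    # on each target VM only a container runtime is needed
    docker run -d -p 80:80 --name yourwebapi yourorg/yourwebapi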

How to get Jenkins repository server to host only stable builds?

I have Jenkins version 2.7.1 running on a Windows 7 machine. It is successfully pulling code from a subversion repository and running tests. I have the test jobs set up for the development branch of each project only.
We periodically make stable releases of the projects as jar files with version numbers. I would like to have Jenkins be the repository manager for those stable releases. These are made by hand; there is no Jenkins job making or testing stable releases. The projects do use Maven.
Each stable build is tagged in the subversion repository, so it could be made again on demand if needed.
I downloaded the Maven repository server plugin hoping it would fit this purpose. I read the documentation that's provided, but it's pretty terse. As I understand it and have it configured now, it appears to have a couple of issues:
If I go to jenkins-ip/plugin/repository/project, it has made directories there that expose the names of all of my projects, which seems undesirable. (Here jenkins-ip is the IP where I access Jenkins on my local network.)
On the other hand, there's nothing but empty directories under these projects, so they're currently useless.
These projects all correspond to the continuous testing of the development branch. There's no apparent way to get the stable builds into the hierarchy. (It doesn't seem efficient to create a job for each stable release...)
Is there any way to get Jenkins (with this plugin or through another method) to be the repository manager just for the stable builds? I know that I can start a different repository manager like Archiva, but it would be ideal to use Jenkins since it's already running and it seems to claim capability for this function now.
To use the Maven repository server plugin you have to build the project on Jenkins.
Then the plugin will expose all archived artifacts as a Maven repo.
Note that you need to use the "Maven project" job type for it to work (freestyle is not supported).
There are several plugins that will help you manage building from multiple tags, however not all of them work with "Maven project" type.
You could also try Jenkins pipeline (previously "Workflow") or the Job-DSL plugin.
The simplest solution would be to have a build parameter specify the tag name (then check out e.g. ^/tags/projectname/${tagParam}), but then you have to figure out how to trigger the job.
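As a sketch, such a parameterized job would effectively run something like this (the repository URL and the tagParam parameter name are placeholders):

    # check out the stable tag selected by the build parameter
    svn checkout "http://svn.example.com/repo/tags/projectname/${tagParam}" .

    # build it so the archived artifacts can be exposed by the repository plugin
    mvn clean package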

Integrating SilkCentral with Nexus

We currently use SilkCentral Test Manager (SCTM) integrated with our source control system via SCTM source control profiles. However, we would like to explore integrating with build artifacts checked into Maven's remote Nexus repository instead.
The idea being that the application-under-test is built and checked into Nexus along with the automated tests only if the build and the tests pass. Therefore, when QA is ready to run tests from SCTM (manual or automated), there is a well-defined combination of application build artifacts and test build artifacts in Nexus that present a more reliable target for SCTM as compared to getting the latest code from the source control system.
All of this is more relevant during active development, when the code and the tests are changing daily and the builds are snapshot builds rather than formal builds with tags in the source control system that SCTM could use.
SCTM apparently has support for both universal naming convention (UNC) and Apache virtual file system (VFS), and either of these should potentially be usable to point the SCTM source control profiles at Nexus artifacts rather than raw source code. However, I wanted to check with the community to see if there's a simpler approach. (For example, I noted the existence of a Hudson SCTM plugin.) Also, I welcome alternative thoughts and ideas.
There are probably many solutions for this; I'd try the following:
Manage the build/first test/publishing steps in Hudson/Jenkins.
For example, by modelling it with dependent jobs, the publish job is only triggered if the tests pass. There are also more advanced gatekeeper plugins available (for example the Downstream-Ext plugin) which might solve this even more comfortably.
Once the publishing is done, use the Hudson/Jenkins Silk Central plugin to trigger the executions on Silk Central. There, instead of using UNC or VFS, I'd rather use a setup script which pulls the artifacts from the repository and prepares everything for the tests. This would allow you to use something Maven/Nexus-aware to pull the correct artifacts from the repository, instead of somehow trying to make them accessible via UNC or VFS.
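As a sketch of that setup script, something Maven-aware can be as small as two dependency-plugin calls (the coordinates and the Nexus URL are placeholders):

    # pull the application-under-test and the matching test artifacts from Nexus
    mvn org.apache.maven.plugins:maven-dependency-plugin:get \
        -DremoteRepositories=http://nexus.example.com/repository/releases \
        -Dartifact=com.example:app-under-test:1.2.3
    mvn org.apache.maven.plugins:maven-dependency-plugin:get \
        -DremoteRepositories=http://nexus.example.com/repository/releases \
        -Dartifact=com.example:automated-tests:1.2.3

After that the script can unpack or copy the downloaded artifacts into whatever layout the Silk Central execution expects.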

Deploying a pre-built package to AppHarbor

I couldn't find any information on AppHarbor's website about the possibility of deploying pre-built ASP.NET (MVC) applications. Does anyone know if that's doable?
Another question I have is whether AppHarbor's build process supports projects that launch an executable (node.exe in this case) that's included in a solution folder as part of a custom build step.
If you're worried about precompilation, that's something AppHarbor does out of the box. If you push a repository without a solution file, we won't build it, but just deploy the contents (see part with no solution file).
You should also be able to run node.exe as part of the build, as long as all dependencies (incl. node.exe) are in the repository.
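So a pre-built deployment is really just pushing a repository that contains only the build output and no solution file. As a sketch, assuming $APPHARBOR_REPO_URL is the repository URL shown for your application in the AppHarbor dashboard:

    # build/publish the application locally by whatever means you prefer, then:
    cd path/to/publish-output
    git init
    git add -A
    git commit -m "pre-built deployment"
    git remote add appharbor "$APPHARBOR_REPO_URL"
    git push appharbor master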

Is there something similar to Maven Cargo but for AppAssembler?

I want to deploy a generated Maven AppAssembler assembler/ directory to somewhere in a file system, SSH, or whatnot. Can Cargo do that for me, or is there an equivalent deployment tool that will let me glob a bunch of files (in this case the target/appassembler/ directory) and deploy them to a destination?
I have a couple of command-line applications that run as scheduled tasks (via cron or Windows Scheduler), and I want to deploy them out to these remote locations (in one case via SSH, and in another a network share \\servername\C$\whatever\). I don't know how I can accomplish that, since all of the deployment plugins I have been looking at cater to web applications and app containers, or remote repos like Nexus.
Try the maven copy plugin - it has excellent networking support (SCP, FTP, HTTP).
You might also find the maven sshexec plugin useful.
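If you just want to see the shape of it before wiring up one of those plugins, a plain scp of the generated directory does the same job (host, user and paths are placeholders):

    # build the project so target/appassembler/ is (re)generated
    mvn clean package

    # copy the generated directory to the target machine over SSH
    scp -r target/appassembler/ user@servername:/opt/myapp/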
I know this question is quite old, but since someone else might also be interested in this:
I don't have a complete/concrete example for this, since I never tried it, but maybe the maven assembly plugin could be used for this, with the dir assembly format?

Resources