NestJS Monorepo Microservices Deployment to Heroku with Shared Files - heroku

I am building monorepo microservices with an API gateway, using Redis to communicate between them. I am planning to deploy each service to a separate dyno on Heroku. Currently it's pretty straightforward, since every service has its own package.json, tsconfig.json, and Procfile, and I can deploy them using git subtree. The problem occurs when I want to have shared files. For example, I want to share the same DTOs across microservices to minimize bugs and errors. Does anyone have any idea how to approach this?
Current File Structure:
Project
| .git
|
|____Apigateway
| | Procfile
| | package.json
|
|____Microservice 1
| | Procfile
| | package.json
|
|____Microservice 2
| | Procfile
| | package.json
Desired File Structure:
Project
| .git
| create.dto.ts
| delete.dto.ts
|
|____Apigateway
| | Procfile
| | package.json
|
|____Microservice 1
| | Procfile
| | package.json
|
|____Microservice 2
| | Procfile
| | package.json

A possible way to do this is to create a NestJS library and then add it as a dependency to each individual service, referencing it by a local path in that service's package.json.
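A minimal sketch, assuming the shared DTOs are collected into a library folder such as libs/shared at the repository root and given a package name like @project/shared (both names are hypothetical); each service then points at the library with a file: dependency in its own package.json:
{
  "dependencies": {
    "@project/shared": "file:../libs/shared"
  }
}
Keep in mind that git subtree only pushes the contents of the service folder, so the shared library either needs to be built and packed into each service before deployment, or the whole repository deployed to each Heroku app with a per-app Procfile.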

Related

REST access to Spring Config Server from non-spring application

Does anyone know if it's possible to use configuration values from the Spring Config Server via a REST interface? If so, is there any documentation on the interface? TIA.
The official API doc is hosted on GitHub.
I have used the REST API manually for testing purposes. I found this sample app to be useful.
API Resources
| Path | Description |
| ----------------------------------- | ------------------------------------------------------------------- |
| /{app}/{profile} | Configuration data for the app in the given Spring profile (comma-separated profiles accepted). |
| /{app}/{profile}/{label} | Same configuration data, but taken from a specific Git label. |
| /{app}/{profile}/{label}/{path} | An environment-specific plain-text config file (at "path"). |
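For a quick manual check, any HTTP client works; a sketch assuming a config server running locally on port 8888 and an application named myapp (hypothetical values):
curl http://localhost:8888/myapp/default
curl http://localhost:8888/myapp/production/main
The response is a JSON document listing the property sources that apply to that application/profile/label combination.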

Where to put Hunspell dictionary for Elasticsearch

The Elasticsearch Hunspell docs say to put the dictionaries in config/hunspell.
Is it
/usr/share/elasticsearch/config/hunspell/
or
/etc/elasticsearch/config/hunspell/
or
/etc/elasticsearch/hunspell/
or something else?
So far, I've tried all of those with no success.
There is some talk about a similar issue in this bug report, but I don't see an answer.
Here is an example of the directory structure for Elasticsearch 5 installed using the .deb installer:
# Elasticsearch home directory
#ES_HOME=/usr/share/elasticsearch
# Elasticsearch configuration directory
#CONF_DIR=/etc/elasticsearch
# Elasticsearch data directory
#DATA_DIR=/var/lib/elasticsearch
# Elasticsearch logs directory
#LOG_DIR=/var/log/elasticsearch
# Elasticsearch PID directory
#PID_DIR=/var/run/elasticsearch
In this case, the Hunspell dictionaries should go in a folder called hunspell inside the config directory, which here would be:
/etc/elasticsearch/hunspell
For version 6.5.1 there is no need to create a config directory; place all the language folders under
/usr/local/etc/elasticsearch/hunspell
-- hunspell
| |-- en_US
| | |-- en_US.dic
| | |-- en_US.aff
| |-- ru_RU
| | |-- ru_RU.dic
| | |-- ru_RU.aff
After that, just restart the Elasticsearch service.
After the installation you will see this:
Data: /usr/local/var/lib/elasticsearch/elasticsearch_bira/
Logs: /usr/local/var/log/elasticsearch/elasticsearch_bira.log
Plugins: /usr/local/var/elasticsearch/plugins/
Config: /usr/local/etc/elasticsearch/
The location of the Hunspell directory can be changed by setting
indices.analysis.hunspell.dictionary.location in the
config/elasticsearch.yml file.
https://www.elastic.co/guide/en/elasticsearch/guide/current/hunspell.html
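Once the dictionaries are in place, they are referenced by locale rather than by path in the index analysis settings; a minimal sketch (index and analyzer names are arbitrary):
PUT /my_index
{
  "settings": {
    "analysis": {
      "filter": {
        "en_US_hunspell": {
          "type": "hunspell",
          "locale": "en_US"
        }
      },
      "analyzer": {
        "en_US_analyzer": {
          "tokenizer": "standard",
          "filter": [ "lowercase", "en_US_hunspell" ]
        }
      }
    }
  }
}
The locale value must match the name of the folder under hunspell/ that holds the .dic and .aff files.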
The structure of elasticsearch/config must be:
- conf
|-- hunspell
| |-- en_US
| | |-- en_US.dic
| | |-- en_US.aff
| |-- ru_RU
| | |-- ru_RU.dic
| | |-- ru_RU.aff
It turns out that symbolic links are not followed by Elasticsearch (see here), so the ACTUAL files need to be at the location specified.
Also, the file permissions for the hunspell files need to allow the elasticsearch user to access them.
Ex (in /etc/elasticsearch/hunspell/):
drwxr-xr-x 2 root elasticsearch 4.0K Sep 9 09:24 nl_NL
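If the permissions are the problem, something along these lines (assuming a Debian-style install where the service runs as the elasticsearch user) is usually enough:
sudo chown -R root:elasticsearch /etc/elasticsearch/hunspell
sudo chmod -R g+rX /etc/elasticsearch/hunspell
sudo service elasticsearch restart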

Update to Joomla 3.3.3 from 3.3.1 failed, could not open CONTRIBUTING.md

I tried to update my Joomla 3.3.1 installation to 3.3.3 just now, but my browser immediately gave an alert that CONTRIBUTING.md could not be opened. It should be in the root folder of my installation, but it is not there. Moreover, when I download a new full package from Joomla there isn't a CONTRIBUTING.md either.
How can I solve this? I have never deleted CONTRIBUTING.md, so I don't know why it is gone.
Thank you
Not sure why it would require that file, but it's pretty easy to recreate. Do you have shell access to your server? If not, create a file called CONTRIBUTING.md and upload it to the server via FTP, or vim into the file over a shell, and insert the following content:
Contributing to the Joomla! CMS
===============
All contributions are welcome to be submitted for review for inclusion in the Joomla! CMS, but before they will be accepted, we ask that you follow these simple steps:
1) Open an item on the Joomlacode tracker in the appropriate area.
* CMS Bug Reports: http://joomlacode.org/gf/project/joomla/tracker/?action=TrackerItemBrowse&tracker_id=8103
* CMS Feature Requests: http://joomlacode.org/gf/project/joomla/tracker/?action=TrackerItemBrowse&tracker_id=8549
2) Follow the [Joomla! Coding Standards](http://joomla.github.io/coding-standards)!
3) After submitting the item to the Joomlacode tracker, add a link to the Joomlacode tracker item and the GitHub issue or pull request.
Please be patient as not all items will be tested immediately (remember, all bug testing for the Joomla! CMS is done by volunteers) and be receptive to feedback about your code.
#### Branches
Pull Requests should usually be made for the `staging` branch as this contains the most recent version of the code.
There are other branches available which serve specific purposes.
| Branch | Purpose |
| ------ | ------- |
| staging | Current codebase. |
| master | Each commit made to staging gets tested if it passes unit tests and codestyle rules and then merged into master. This is done automatically. |
| 2.5.x | Branch for the Joomla 2.5.x series. Currently in maintenance mode with EOL end of 2014. No new features are accepted here. |
| 3.2.x | Branch for the Joomla 3.2.x series. Currently in security mode with EOL Oct 2014. Only security issues are fixed. |
| 3.4-dev | Branch for the next minor Joomla version. New backward compatible features go into this branch. Commits to staging will be applied to this branch as well. |
Save that, make sure the file has the normal permissions, and you should be all set. Note that this needs to be in the root or base directory where Joomla! is installed.
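If you do have shell access, recreating the file is quick; a sketch assuming the Joomla root is /var/www/html (adjust to your installation):
cd /var/www/html
vim CONTRIBUTING.md      # paste the content above, then save
chmod 644 CONTRIBUTING.md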

Could not retrieve information from environment production source

I'm using Puppet as my provisioner in one of my Vagrant projects. I'm trying to add a module for a custom bash_profile.
The module_path for puppet is set to:
puppet.module_path = "puppet/modules"
The class for my bash_profile module looks like this:
class bash_profile
{
  file
  {
    "/home/vagrant/bash_profile":
      ensure => present,
      source => "puppet:///modules/bash_profile/files/bash_profile"
  }
}
Here's the file structure for my puppet structure:
puppet
| manifests
| | phpbase.pp // my main manifest file that has includes for modules
| modules
| | bash_profile
| | | files
| | | | bash_profile // the actual bash_profile file I want to ensure is present on my VM
| | | manifests
| | | | init.pp // the init file included for the bash_profile class
When I run the provisioning for vagrant, I get the error
err: /Stage[main]/Bash_profile/File[/home/vagrant/bash_profile]: Could not evaluate: Could not retrieve information from environment production source(s) puppet:///modules/bash_profile/files/bash_profile at /tmp/vagrant-puppet-1/modules-0/bash_profile/manifests/init.pp:8
I'm not sure why it can't retrieve the information. The path seems to be correct. Can anyone see what I'm missing?
Yes, you are not supposed to include the literal files/ in the URL. Instead, it should just be
puppet:///modules/bash_profile/bash_profile
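With that change applied, the class from the question would look like this (only the source URL differs):
class bash_profile
{
  file
  {
    "/home/vagrant/bash_profile":
      ensure => present,
      source => "puppet:///modules/bash_profile/bash_profile"
  }
}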
You may also receive this error with recurse => true if your module name is invalid. For instance, if you have this module structure:
modules
└── my-example
    └── files
        └── example
            └── test.txt
and this resource:
file { "/tmp/example":
ensure => directory,
recurse => true,
source => "puppet:///modules/my-example/example",
}
you'll get this error:
==> default: Info: Could not find filesystem info for file 'my-example/example' in environment production
==> default: Error: /Stage[main]/Main/Node[default]/File[/tmp/example]: Could not evaluate: Could not retrieve information from environment production source(s) puppet:///my-example/example
The fix is to rename the module; for instance, naming it my_example (module names may only contain lowercase letters, digits, and underscores) fixes it. The rules for module names are documented but easy to miss.
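After renaming the module directory to my_example, the source URL follows suit:
file { "/tmp/example":
  ensure  => directory,
  recurse => true,
  source  => "puppet:///modules/my_example/example",
}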
Things to care about:
* The Puppet URI format: puppet:///modules/name_of_module/filename
* The files directory (which the fileserver serves) must be present in the module directory
This video shows a step-by-step guide to resolving the error.

Chef-Repo Berkshelf Confusion Setup

I am soooo confused when it comes to Chef / Berkshelf and need help and advice.
From what I've found and read, there's an underlying assumption with some things in Berkshelf, and for the newbie there is a bit of a grey area that needs filling.
Let me try to explain:
I followed the typical Chef path:
Create a chef-repo in my user directory
C:\Users\itsmeofcourse\chef-repo
then hooked that into an internal git repo,
and have been happily writing basic cookbooks for Windows and uploading everything into that git repo.
As it stands, every cookbook exists under the "cookbook" folder in my chef-repo.
C:\Users\itsmeofcourse\chef-repo
/cookbook
I've then followed the path of writing wrapper cookbooks around community cookbooks, so it would look like:
client_iis - depends upon
department_iis - depends upon
global_iis - depends upon
iis - community cookbook
This allows us to make IIS changes at different levels within our infrastructure.
Now, where I feel the documentation falls down is that everyone says to move your cookbooks out of the "cookbook" folder.
So, as I understand it, "your" chef-repo will exist in a git repo but just for changes to sub-folders like environments / data bags / roles / certificates etc., and the cookbooks are then separate projects. Is that correct or not?
Where do you move your cookbooks to? Anywhere on your machine / within your user %home%?
How does Chef know where these are stored, or do you have to amend your knife.rb and point it to a certain directory?
So it would look like:
knife.rb
cookbook_path ["c:/cookbooks"]
C:\Users\itsmeofcourse\chef-repo :github => repo_1
c:/cooksbooks
/base :github => repo_2
/iis :github => repo_3
/sql :github => repo_4
/client_iis :github => repo_
/department_iis :github => repo_3
Can I ask what I am missing?
Or do you place a Berksfile in the root of the chef-repo, and then do what, to manage everything in the cookbook folder?
I have read through https://github.com/berkshelf/berkshelf/issues/535
Please can someone help?
Yes, it's correct and you would usually move your cookbooks to a separate repository.
One gotcha I had was while reading this article: http://www.prashantrajan.com/posts/2013/06/leveling-up-chef-best-practices/ and the "Single Repo Per Cookbook" part. Read the whole thing, it's good!
It seems like you are not missing anything. Moving your cookbooks out of the cookbooks directory means creating a separate repo per cookbook and depending on them using a top-level Berksfile (in the root of your chef-repo).
For a typical vagrant+chef repo for a web app (called coolwebapp), I would usually have:
.
+-- cmp-cookbooks
| +-- cmp-coolwebapp (this is only cookbook stored in this repo, and this repo exists because of this cookbook)
+-- data_bags
| +-- users
| | +-- mysql.json
| | +-- os.json
| | +-- admins.json
| +-- private_keys
| +-- deployment.json
+-- environments
| +-- production.rb
| +-- staging.rb
| +-- qa.rb
| +-- integration.rb
| +-- local.rb
+-- nodes (but this should not be stored in your repo I guess)
| +-- ip_here.json
| +-- other_ip_here.json
+-- Berksfile
+-- Vagrantfile
Berksfile would contain:
cookbook "cmp-coolwebapp", "~>0.3.0", path: "./cmp-cookbooks/cmp-coolwebapp"
cookbook "cmp-provisioning", "~>0.7.0", git: _priv_provisioning_cookbook_repo_
cookbook "cmp-role-db", "~>0.7.0", git: _priv_role1_cookbook_repo_
cookbook "cmp-role-www", "~>0.8.0", git: _priv_role2_cookbook_repo_
cookbook "cmp-role-devops", "~>0.7.0", git: _priv_role3_cookbook_repo_
"cmp" stands for our company name. Our cookbooks are stored in private repos, and are being maintained individually.
Cookbook cmp-role-www for example, would have mostly community cookbooks as dependencies in Berksfile, and our own cmp-apache2, cmp-nginx, cmp-varnish wrapper cookbooks stored in its repo.
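For illustration, the Berksfile inside a role cookbook like cmp-role-www might look like this (hypothetical contents, reusing the same placeholder convention for private repo URLs):
source "https://supermarket.chef.io"
metadata
cookbook "cmp-apache2", git: _priv_apache2_cookbook_repo_
cookbook "cmp-nginx", git: _priv_nginx_cookbook_repo_
The metadata line tells Berkshelf to also resolve the dependencies declared in the cookbook's own metadata.rb, which is where the community cookbooks would be listed.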
Answering your last question: "How does Chef know where these are stored or do you have to amend your knife.rb and point to a certain directory?"
If you manage your cookbook dependencies with Berkshelf, you can include cookbook from any location you prefer:
cookbook "artifact", path: "/Users/reset/code/artifact-cookbook"
cookbook "mysql", git: "https://github.com/opscode-cookbooks/mysql.git", branch: "foodcritic"
cookbook "rightscale", git: "https://github.com/rightscale/rightscale_cookbooks.git", rel: "cookbooks/rightscale"
The last one is useful when you store several company cookbooks in one repository.
See the "Source options" section at http://berkshelf.com/.
