I'm just playing around deploying a monolithic JHipster application to Heroku.
I followed these instructions. It seems to deploy OK: tables are created and the static data is loaded. I can log in and navigate the site.
However, when I try to update or create anything I get an internal server error. After running heroku logs as per here, I see the following error:
ERROR 4 --- [ XNIO-2 task-32] o.h.m.w.rest.errors.ExceptionTranslator : An unexpected error occurred: None of the configured nodes are available: [{#transport#-1}{localhost}{127.0.0.1:9300}]
2017-08-09T22:25:52.182502+00:00 heroku[router]: at=info method=DELETE
When running the app locally with mvn (dev profile) and the H2 database, everything works fine. Can anyone give me a pointer on how to proceed?
JHipster v4.6.2 / Angular 4 / PostgreSQL
Many Thanks,
Turns out, when I generated my app I selected Elasticsearch. As per here:
JHipster expects an external Elasticsearch instance
... and as mentioned by Gael in the comments here:
the usual error with elasticsearch jhipster apps in prod profile is that user forgot to instantiate an elasticsearch instance
... and indeed, in my application-prod.yml, I had this:
data:
    elasticsearch:
        cluster-name:
        cluster-nodes: localhost:9300
For the time being, I have just commented this out and redeployed the app to Heroku as per here.
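For reference, the commented-out section in application-prod.yml now looks roughly like this (a sketch; the surrounding spring: parent key is assumed from the standard JHipster layout):

spring:
    # Elasticsearch disabled until an external instance is provisioned
    # data:
    #     elasticsearch:
    #         cluster-name:
    #         cluster-nodes: localhost:9300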
Thanks,
Related
I have a mern app that I am using to fetch packages from my backend.
It works in development but not in production when I host it on AWS EC2.
I figured out that the reason it's not working is that I am getting an incomplete response back in production. How is that even possible? I am frustrated right now.
Attaching some images and console logs in development and production.
The field packageImages is missing in the production response!
Because of this I am getting an error about trying to read properties of undefined.
Please someone guide me with this.
I am looking for help to containerize a Laravel application with Docker, run it locally, and make it deployable to Cloud Run, connected to a database in gcloud.
My application is an API built with Laravel, and so far I have just used the docker-compose/Sail package that comes with Laravel 8 during development.
Here is what I want to achieve:
Laravel app running on Cloud Run.
Database in gcloud: MySQL, PostgreSQL or SQL Server (MySQL preferred).
Environment stored in gcloud.
My problem is that I can't find any info on whether or how to use/rewrite the docker-compose file in Laravel 8, create a Dockerfile or cloudbuild file, and build it for gcloud.
Maybe I could add something like this in a cloudbuild.yml file:
# cloudbuild.yml
steps:
    # running docker-compose
    - name: 'docker/compose:1.26.2'
      args: ['up', '-d']
Any help/guidance is appreciated.
As mentioned in the comments to this question, you can check this video, which explains how to use docker-compose and Laravel to deploy an app to Cloud Run with a step-by-step tutorial.
As for the database connection of said app, the Connecting from Cloud Run (fully managed) to Cloud SQL documentation is quite complete on that matter, and for secret management I found this article that explains how to implement Secret Manager with Cloud Run.
I know this answer is basically just links to the documentation and articles, but I believe all the information you need to get your app onto Cloud Run is in those.
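To give a rough idea of the flow those resources cover, here is a minimal sketch of the build and deploy commands (PROJECT_ID, REGION, INSTANCE_NAME and the image name are placeholders, and it assumes a plain Dockerfile at the project root rather than docker-compose):

# Build the container image with Cloud Build
gcloud builds submit --tag gcr.io/PROJECT_ID/laravel-api

# Deploy to Cloud Run and attach a Cloud SQL (MySQL) instance
gcloud run deploy laravel-api \
    --image gcr.io/PROJECT_ID/laravel-api \
    --add-cloudsql-instances PROJECT_ID:REGION:INSTANCE_NAME \
    --set-env-vars DB_SOCKET=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME

The DB_SOCKET variable maps to the unix_socket option in Laravel's default config/database.php, which is one way to reach Cloud SQL from Cloud Run; adjust it to your own connection setup.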
After my app is successfully pushed via cf, I usually need to manually SSH into the container and execute a couple of PHP scripts to clear and warm up my cache, and potentially execute some DB schema updates, etc.
Today I found out about Cloud Foundry Tasks, which seems to offer a neat way to do exactly this kind of thing, and I wanted to test whether I can integrate it into my build & deploy script.
So I used cf login, got successfully connected to the right org and space, the app has been pushed and is running, and I tried this command:
cf run-task MYAPP "bin/console doctrine:schema:update --dump-sql --env=prod" --name dumpsql
(tried it with a couple of folder changes like app/bin/console etc.)
and this was the output:
Creating task for app MYAPP in org MYORG / space MYSPACE as me#myemail...
Unexpected Response
Response Code: 404
FAILED
Uses CF CLI: 6.32.0
cf logs ArcticTenTestBackend --recent does not output anything (this might be because I have enabled an ELK instance for logging; when I wanted to service-connect to ELK to look up the logs, I found out that the service-connector cf plugin is gone, for which I will open a new ticket).
Created new Issue for that: https://github.com/cloudfoundry/cli/issues/1242
This is not a CF CLI issue. Swisscom Application Cloud does not yet support the Cloud Foundry tasks. This explains the 404 you are currently receiving. We will expose this feature of Cloud Foundry in an upcoming release of Swisscom Application Cloud.
In the meantime, maybe you can find a way to execute your one-off tasks (cache warming, DB migrations) at application startup.
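For example, one way to do this at startup (a sketch, assuming a Symfony-style bin/console as in your command) is a .profile script in the application root, which Cloud Foundry executes before the app starts:

# .profile - run by the buildpack before the application starts
bin/console cache:clear --env=prod --no-warmup
bin/console cache:warmup --env=prod
# --force actually applies the schema changes; --dump-sql only prints them
bin/console doctrine:schema:update --force --env=prod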
As mentioned by @Mathis Kretz, Swisscom has gotten around to enabling cf run-task since this question was posted. They sent out e-mails on 22 November 2018 to announce the feature.
As described in your linked documentation, you use the following commands to manage tasks:
cf tasks [APP_NAME]
cf run-task [APP_NAME] [COMMAND]
cf terminate-task [APP_NAME] [TASK_ID]
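For example, the schema update from the question could now be run as a one-off task and its state checked afterwards (note that --dump-sql only prints the SQL; --force would actually apply it):

cf run-task MYAPP "bin/console doctrine:schema:update --force --env=prod" --name schema-update
cf tasks MYAPP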
I am running a Play application written in Java and running 2.11.1. It works fine locally and I have successfully pushed it to Heroku a couple of times. Since then, I have added a new controller and updated the routes with a couple of new actions.
It works fine locally but when deploying to Heroku, I get the following error (one for all the actions for this controller):
[error] /tmp/scala_buildpack_build_dir/conf/routes:12: object Dataset is not a member of package controllers
[error] GET /data controllers.Dataset.list()
I am using another controller that has been defined in the same fashion (only before) and it works fine. Extract from routes:
# Models page
GET /models controllers.PredictionModels.list()
# Data page
GET /data controllers.Dataset.list()
Is there a known problem on Heroku? Is it because I run Dev locally and Production on Heroku?
I started by defining a framework ID as specified here
http://www.playframework.org/documentation/1.2/guide11
I called my server appnameheroku
Then I retrieved the database URL using
heroku config
from the console
I then added the following two lines to application.conf
%appnameheroku.jpa.ddl=validate
appnameheroku.db=postgres://....compute-1.amazonaws.com/etc
I then deployed the app and got the following error:
Oops, an error occured
This exception has been logged with id 6963iilc8.
I'm using the free version of Heroku.
Two things here. First, storing config in the application code is a bad idea, as it prevents Heroku from carrying out a lot of administrative tasks on your behalf.
Therefore I would configure my application.conf as:
db=${DATABASE_URL}
jpa.dialect=org.hibernate.dialect.PostgreSQLDialect
jpa.ddl=update
Second, Heroku doesn't recommend setting jpa.ddl to update for a real-world production app. Use Play!'s database evolutions instead.
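For reference, a minimal evolution script sketch (in Play 1.x these live under db/evolutions, e.g. db/evolutions/1.sql; the table here is just an illustrative placeholder):

# --- !Ups
CREATE TABLE item (
    id bigint NOT NULL,
    name varchar(255),
    PRIMARY KEY (id)
);

# --- !Downs
DROP TABLE item;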