My Play application builds and runs fine locally; however, pushing to Heroku (after migrating to cedar-14) fails with the following error:
[error] (*:update) sbt.ResolveException: download failed: org.apache.httpcomponents#httpclient;4.0.1!httpclient.jar
After searching around, I have tried the following to resolve the issue, without any luck:
Setting sbt.version to 0.13.5
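For reference, that setting lives in project/build.properties; a minimal version of the file would be:
sbt.version=0.13.5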
The issue seems to lie in Heroku's own build process. Any help would be greatly appreciated.
DUMP:
remote: [info] [SUCCESSFUL ] org.scala-lang#jline;2.10.3!jline.jar (11ms)
remote: [warn] ::::::::::::::::::::::::::::::::::::::::::::::
remote: [warn] :: FAILED DOWNLOADS ::
remote: [warn] :: ^ see resolution messages for details ^ ::
remote: [warn] ::::::::::::::::::::::::::::::::::::::::::::::
remote: [warn] :: org.apache.httpcomponents#httpclient;4.0.1!httpclient.jar
remote: [warn] ::::::::::::::::::::::::::::::::::::::::::::::
remote: sbt.ResolveException: download failed: org.apache.httpcomponents#httpclient;4.0.1!httpclient.jar
build.sbt
name := "jmpdb"
version := "1.0-SNAPSHOT"
libraryDependencies ++= Seq(
javaJdbc,
javaEbean,
cache,
"mysql" % "mysql-connector-java" % "5.1.18",
"org.mindrot" % "jbcrypt" % "0.3m",
"com.thoughtworks.xstream" % "xstream" % "1.4.7",
"org.apache.velocity" % "velocity" % "1.7",
"commons-lang" % "commons-lang" % "2.6",
"com.google.api-client" % "google-api-client" % "1.12.0-beta",
"com.google.http-client" % "google-http-client-jackson" % "1.12.0-beta",
"com.google.oauth-client" % "google-oauth-client" % "1.12.0-beta",
"com.google.apis" % "google-api-services-drive" % "v2-rev30-1.12.0-beta",
"com.google.apis" % "google-api-services-oauth2" % "v2-rev25-1.12.0-beta",
"org.json" % "json" % "20080701",
"org.reflections" % "reflections" % "0.9.8",
"com.ecwid" % "ecwid-mailchimp" % "1.3.0.7",
"com.typesafe" %% "play-plugins-mailer" % "2.1.0"
)
resolvers += "jBCrypt Repository" at "http://repo1.maven.org/maven2/org/"
resolvers += "google-api-services" at "http://google-api-client-libraries.appspot.com/mavenrepo"
resolvers := Seq("typesafe" at "http://repo.typesafe.com/typesafe/repo") // note: `:=` replaces the two resolvers added above; use += to append
// Only needed in development
javaOptions ++= Seq("-Xmx512M", "-Xmx2048M", "-XX:MaxPermSize=2048M") // note: the second -Xmx overrides the first
play.Project.playJavaSettings
project/plugins.sbt
// Comment to get more information during initialization
logLevel := Level.Warn
// The Typesafe repository
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// Use the Play sbt plugin for Play projects
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.2.2")
After consulting Heroku technical support, the issue turned out to be corruption in the sbt cache on Heroku. The steps below resolved it (in my case the purge_cache, as I had already been building with SBT_CLEAN=true):
$ heroku config:set SBT_CLEAN=true
$ git push heroku master
If that still does not work, try purging the ivy2 and m2 caches by running these commands (substituting your own app name for peaceful-mountain-6737):
$ heroku plugins:install https://github.com/heroku/heroku-repo.git
$ heroku repo:purge_cache -a peaceful-mountain-6737
$ git push heroku master
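Once a clean build goes through, you can stop forcing a clean on every deploy, since it slows builds down:
$ heroku config:unset SBT_CLEAN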
Hopefully, this will help someone else going in circles.
Related
Pushes of updates to my app to Heroku are now rejected because of a TeX Live incompatibility. See the error message below.
Heroku error message
remote: tlmgr: Remote repository is newer than local (2018 < 2019)
remote: Cross release updates are only supported with
remote: update-tlmgr-latest(.sh/.exe) --update
remote: Please see https://tug.org/texlive/upgrade.html for details.
remote: ! Push rejected, failed to compile TeX Live app.
I have tried the following:
Adding a file texlive.repository with contents ftp://tug.org/historic/systems/texlive/2018/tlnet-final. This does not work because that TeX Live repository lacks a required cryptographic certificate.
Deleting the texlive buildpack and starting over. My buildpack config is shown below; however, when I try to remove the buildpack, I get this message:
$ heroku buildpacks:remove syphar/heroku-buildpack-tex
› Error: invalid json response body at https://buildpack-registry.heroku.com/buildpacks/syphar%2Fheroku-buildpack-tex reason: Unexpected end of JSON input
At this point I am stuck!
Heroku buildpacks
$ heroku buildpacks
=== nshost Buildpack URLs
1. https://github.com/HashNuke/heroku-buildpack-elixir.git
2. https://github.com/syphar/heroku-buildpack-tex.git
This works: in the file texlive.repository, replace
ftp://tug.org/historic/systems/texlive/2018/tlnet-final
with
https://www.math.utah.edu/pub/texlive/historic/systems/texlive/2018/tlnet-final
Solution courtesy of Nelson Beebe, University of Utah Mathematics Department
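Concretely, a sketch of the full fix, assuming (as in the question) that the buildpack reads the repository URL from texlive.repository:
$ echo "https://www.math.utah.edu/pub/texlive/historic/systems/texlive/2018/tlnet-final" > texlive.repository
$ git add texlive.repository
$ git commit -m "Pin TeX Live 2018 via https mirror"
$ git push heroku master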
Related
I am trying to run spark-submit on an Amazon EC2 cluster with the following:
spark-submit --packages org.apache.hadoop:hadoop-aws:2.7.1 --master spark://amazonaws.com SimpleApp.py
and I end up with the following error. It seems to be looking for the hadoop-aws package. My EC2 cluster was created using the spark-ec2 script.
Ivy Default Cache set to: /home/adas/.ivy2/cache
The jars for the packages stored in: /home/adas/.ivy2/jars
:: loading settings :: url = jar:file:/home/adas/spark/spark-2.1.0-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.hadoop#hadoop-aws added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
:: resolution report :: resolve 66439ms :: artifacts dl 0ms
:: modules in use:
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 1 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
:: problems summary ::
:::: WARNINGS
module not found: org.apache.hadoop#hadoop-aws;2.7.1
==== local-m2-cache: tried
file:/home/adas/.m2/repository/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.pom
-- artifact org.apache.hadoop#hadoop-aws;2.7.1!hadoop-aws.jar:
file:/home/adas/.m2/repository/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.jar
==== local-ivy-cache: tried
/home/adas/.ivy2/local/org.apache.hadoop/hadoop-aws/2.7.1/ivys/ivy.xml
-- artifact org.apache.hadoop#hadoop-aws;2.7.1!hadoop-aws.jar:
/home/adas/.ivy2/local/org.apache.hadoop/hadoop-aws/2.7.1/jars/hadoop-aws.jar
==== central: tried
https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.pom
-- artifact org.apache.hadoop#hadoop-aws;2.7.1!hadoop-aws.jar:
https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.jar
==== spark-packages: tried
http://dl.bintray.com/spark-packages/maven/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.pom
-- artifact org.apache.hadoop#hadoop-aws;2.7.1!hadoop-aws.jar:
http://dl.bintray.com/spark-packages/maven/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.apache.hadoop#hadoop-aws;2.7.1: not found
::::::::::::::::::::::::::::::::::::::::::::::
:::: ERRORS
Server access error at url https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.pom (java.net.NoRouteToHostException: No route to host (Host unreachable))
Server access error at url https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.jar (java.net.NoRouteToHostException: No route to host (Host unreachable))
Server access error at url http://dl.bintray.com/spark-packages/maven/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.pom (java.net.NoRouteToHostException: No route to host (Host unreachable))
Server access error at url http://dl.bintray.com/spark-packages/maven/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.jar (java.net.NoRouteToHostException: No route to host (Host unreachable))
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;2.7.1: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1078)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:296)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:160)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
You are submitting the job with the --packages org.apache.hadoop:hadoop-aws:2.7.1 option, so spark-submit attempts to resolve that dependency by downloading the package from the public Maven repositories. However, this error indicates it is unable to reach them:
Server access error at url https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.pom (java.net.NoRouteToHostException: No route to host (Host unreachable))
You might want to check whether the Spark master node has internet access.
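As a quick sanity check (a sketch, assuming curl is available on the node), you could try fetching the POM directly from the master:
$ curl -sI https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/2.7.1/hadoop-aws-2.7.1.pom
If that also fails with "no route to host", the problem is the node's network or proxy configuration rather than Spark itself.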
Sorry everyone, it was just some local proxy issues.
Related
I'm trying to access an S3 file from a Spark SQL job. I have already tried solutions from several posts, but nothing seems to work. Maybe that's because my EC2 cluster runs the new Spark 2.0 built for Hadoop 2.7.
I set up the Hadoop configuration this way:
sc.hadoopConfiguration.set("fs.s3a.impl","org.apache.hadoop.fs.s3a.S3AFileSystem")
sc.hadoopConfiguration.set("fs.s3a.awsAccessKeyId", accessKey)
sc.hadoopConfiguration.set("fs.s3a.awsSecretAccessKey", secretKey)
I build an uber-jar with sbt assembly, using:
name := "test"
version := "0.2.0"
scalaVersion := "2.11.8"
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.4"
libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.3" excludeAll(
ExclusionRule("com.amazonaws", "aws-java-sdk"),
ExclusionRule("commons-beanutils")
)
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0" % "provided"
When I submit my job to the cluster, I always get the following error:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6, 172.31.7.246): java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.S3AFileSystem not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2638)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1726)
at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:662)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:446)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$3.apply(Executor.scala:476)
It seems that the driver can read from S3 without a problem, but the workers/executors cannot... I do not understand why my uber-jar is not sufficient.
I also tried, without success, configuring spark-submit with:
--packages com.amazonaws:aws-java-sdk:1.7.4,org.apache.hadoop:hadoop-aws:2.7.3
PS: If I switch to the s3n protocol, I get the following exception:
java.io.IOException: No FileSystem for scheme: s3n
If you want to use s3n:
sc.hadoopConfiguration.set("fs.s3n.impl","org.apache.hadoop.fs.s3native.NativeS3FileSystem")
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", accessKey)
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", secretKey)
Now, regarding the exception: you need to make sure both JARs are on the driver and executor classpaths, and distribute them to the worker nodes via the --jars flag if you're running in client mode. Note that passing --conf for the same key twice would just overwrite the first value, so the two driver classpath entries are joined with a colon:
spark-submit \
--conf "spark.driver.extraClassPath=/location/to/aws-java-sdk.jar:/location/to/hadoop-aws.jar" \
--jars /location/to/aws-java-sdk.jar,/location/to/hadoop-aws.jar \
Also, if you're building your uber JAR and including aws-java-sdk and hadoop-aws in it, there is no reason to use the --packages flag.
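For instance, a submission using only the assembly JAR might look like this (the main class and master URL are placeholders; the JAR name follows from the build.sbt above):
spark-submit \
--class com.example.Main \
--master spark://your-master:7077 \
target/scala-2.11/test-assembly-0.2.0.jar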
Actually, all Spark operations run on the workers, while you are setting this configuration on the master. You could try applying the S3 configuration inside a mapPartitions block, so that it is set on the executors themselves.
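A rough sketch of that idea (the rdd, accessKey, and secretKey names are assumed to be in scope; note that the canonical s3a credential keys are fs.s3a.access.key and fs.s3a.secret.key):
import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem

rdd.mapPartitions { iter =>
  // This closure runs on the executors, so the configuration
  // is created inside each worker JVM rather than on the master.
  val conf = new Configuration()
  conf.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
  conf.set("fs.s3a.access.key", accessKey)
  conf.set("fs.s3a.secret.key", secretKey)
  val fs = FileSystem.get(new URI("s3a://some-bucket"), conf)
  // ... read per-partition data through fs ...
  iter
}
Bear in mind this only addresses configuration: the ClassNotFoundException above means the hadoop-aws classes are missing from the executor classpath, which the --jars approach in the previous answer is meant to fix.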
Related
I've tried this on Windows 7 and Ubuntu 12.04. In both cases, I get errors about being unable to fetch dependencies. I'm following the instructions from http://www.playframework.org/documentation/2.0.1/ScalaTodoList and http://www.playframework.org/documentation/2.0.1/JavaTodoList
Getting net.java.dev.jna jna 3.2.3 ...
downloading http://s3pository.heroku.com/maven-central/net/java/dev/jna/jna/3.2.3/jna-3.2.3.jar ...
:: problems summary ::
:::: WARNINGS
[FAILED ] net.java.dev.jna#jna;3.2.3!jna.jar: The HTTP response code for http://s3pository.heroku.com/maven-central/net/java/dev/jna/jna/3.2.3/jna-3.2.3.jar did not indicate a success. See log for more detail. (40ms)
[FAILED ] net.java.dev.jna#jna;3.2.3!jna.jar: The HTTP response code for http://s3pository.heroku.com/maven-central/net/java/dev/jna/jna/3.2.3/jna-3.2.3.jar did not indicate a success. See log for more detail. (40ms)
==== heroku-central: tried
http://s3pository.heroku.com/maven-central/net/java/dev/jna/jna/3.2.3/jna-3.2.3.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: FAILED DOWNLOADS ::
:: ^ see resolution messages for details ^ ::
::::::::::::::::::::::::::::::::::::::::::::::
:: net.java.dev.jna#jna;3.2.3!jna.jar
::::::::::::::::::::::::::::::::::::::::::::::
:::: ERRORS
SERVER ERROR: Gateway Timeout url=http://s3pository.heroku.com/maven-central/net/java/dev/jna/jna/3.2.3/jna-3.2.3.jar
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
download failed: net.java.dev.jna#jna;3.2.3!jna.jar
Error during sbt execution: Error retrieving required libraries
(see /tmp/build_2alkkctoiqzvf/project/boot/update.log for complete log)
Error: Could not retrieve jna 3.2.3
! Failed to build app with sbt
! Heroku push rejected, failed to compile Play 2.0 app
To git@heroku.com:falling-stone-8800.git
! [remote rejected] master -> master (pre-receive hook declined)
error: failed to push some refs to 'git@heroku.com:falling-stone-8800.git'
This seems to be a system-wide error today (June 10, 2012). There is most likely nothing wrong with your code or deployment method; it may just be a Heroku hiccup. I have been unable to push changes to my (working) Java Play 2.0 app all morning due to the same error.
I have filed a support ticket and would urge others to do the same.
Related
I'm pretty new to Heroku, and I have tried three times to deploy my code to it without success. I'm following this tutorial and get stuck when it asks me to run heroku ps:scale web=1, which responds with Scaling web processes... ! Resource not found. That does not seem right; all the previous steps appear to work correctly. Here is the log from git push heroku master:
Counting objects: 5, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 279 bytes, done.
Total 3 (delta 2), reused 0 (delta 0)
-----> Heroku receiving push
-----> Removing .DS_Store files
-----> Node.js app detected
-----> Resolving engine versions
Using Node.js version: 0.4.7
Using npm version: 1.0.106
-----> Fetching Node.js binaries
-----> Vendoring node into slug
-----> Installing dependencies with npm
npm WARN nodeunit#0.5.1 package.json: bugs['web'] should probably be bugs['url']
> mongodb#0.9.7-3-5 install /tmp/build_2xv21ycwhho11/node_modules/mongodb
> node install.js
================================================================================
= =
= To install with C++ bson parser do <npm install mongodb --mongodb:native> =
= =
================================================================================
mongodb#0.9.7-3-5 ./node_modules/mongodb
> mongodb#0.9.7-3-5 install /tmp/build_2xv21ycwhho11/node_modules/mongodb
> node install.js
================================================================================
= =
= To install with C++ bson parser do <npm install mongodb --mongodb:native> =
= =
================================================================================
xmlhttprequest#1.3.0 /tmp/build_2xv21ycwhho11/node_modules/xmlhttprequest
redis#0.6.0 /tmp/build_2xv21ycwhho11/node_modules/redis
mongoose#2.5.5 /tmp/build_2xv21ycwhho11/node_modules/mongoose
mongodb#0.9.7-3-5 /tmp/build_2xv21ycwhho11/node_modules/mongodb
juggernaut#2.0.5 /tmp/build_2xv21ycwhho11/node_modules/juggernaut
socket.io#0.6.18 /tmp/build_2xv21ycwhho11/node_modules/juggernaut/node_modules/socket.io
redis#0.5.11 /tmp/build_2xv21ycwhho11/node_modules/juggernaut/node_modules/redis
node-static-maccman#0.5.3 /tmp/build_2xv21ycwhho11/node_modules/juggernaut/node_modules/node-static-maccman
optimist#0.1.9 /tmp/build_2xv21ycwhho11/node_modules/juggernaut/node_modules/optimist
jquery#1.6.3 /tmp/build_2xv21ycwhho11/node_modules/jquery
jsdom#0.2.13 /tmp/build_2xv21ycwhho11/node_modules/jquery/node_modules/jsdom
htmlparser#1.7.4 /tmp/build_2xv21ycwhho11/node_modules/jquery/node_modules/htmlparser
jade#0.20.1 /tmp/build_2xv21ycwhho11/node_modules/jade
html2jade#0.1.16 /tmp/build_2xv21ycwhho11/node_modules/html2jade
express#2.3.4 /tmp/build_2xv21ycwhho11/node_modules/express
connect#1.4.1 /tmp/build_2xv21ycwhho11/node_modules/express/node_modules/connect
qs#0.1.0 /tmp/build_2xv21ycwhho11/node_modules/express/node_modules/qs
mime#1.2.2 /tmp/build_2xv21ycwhho11/node_modules/express/node_modules/mime
ejs#0.4.2 /tmp/build_2xv21ycwhho11/node_modules/ejs
connect#1.8.5 /tmp/build_2xv21ycwhho11/node_modules/connect
qs#0.4.1 /tmp/build_2xv21ycwhho11/node_modules/connect/node_modules/qs
mime#1.2.4 /tmp/build_2xv21ycwhho11/node_modules/connect/node_modules/mime
formidable#1.0.8 /tmp/build_2xv21ycwhho11/node_modules/connect/node_modules/formidable
Dependencies installed
-----> Discovering process types
Procfile declares types -> (none)
-----> Compiled slug size is 7.0MB
-----> Launching... done, v4
http://stark-winter-4562.herokuapp.com deployed to Heroku
To git@heroku.com:stark-winter-4562.git
43e3a5e..84f9cd8 master -> master
I don't know if this is of any help, but this output looks quite different from the one shown in the tutorial.
Can anyone point out what's wrong, or what I am doing wrong? Thanks.
Your procfile needs to be called Procfile, with a capital P. The line Procfile declares types -> (none) in your build log shows that Heroku did not detect any process types, which is why heroku ps:scale web=1 reports Resource not found.
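Once it is renamed, a one-line Procfile along these lines should work (web.js is a placeholder for your app's actual entry point):
web: node web.js
After the next push, the build output should list the web process type instead of (none).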