I am currently working on a monorepo of Lambda functions.
There are currently four independent Gradle modules: one contains all shared code, and each of the other three represents a Lambda function and depends on the shared module.
The subproject dependency is imported via settings.gradle:
include ':commons'
project(":commons").projectDir = file("../../commons")
and build.gradle:
implementation project(":commons")
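For context, this is roughly the layout implied by that settings.gradle entry and by the paths in the SAM debug output below (the gradlew invocation is only illustrative):

cd fantasy_bball_slack_integration/components/daily_update   # Lambda module; SAM template with CodeUri '.' lives here
ls ../../commons                                             # the shared ':commons' module resolved via projectDir
./gradlew build                                              # building from here with plain Gradle succeeds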
Building the dependent modules directly through Gradle works fine; the problem arises when I try to build/deploy with AWS SAM.
After running sam build --debug, I get the following:
2021-01-03 18:50:45,589 | Telemetry endpoint configured to be https://aws-serverless-tools-telemetry.us-west-2.amazonaws.com/metrics
2021-01-03 18:50:45,716 | 'build' command is called
2021-01-03 18:50:45,720 | No Parameters detected in the template
2021-01-03 18:50:45,743 | 1 resources found in the template
2021-01-03 18:50:45,743 | Found Serverless function with name='DailyUpdateFunction' and CodeUri='.'
2021-01-03 18:50:45,744 | No Parameters detected in the template
2021-01-03 18:50:45,770 | Instantiating build definitions
2021-01-03 18:50:45,773 | Same function build definition found, adding function (Previous: BuildDefinition(java11, ., Zip, , a50fb9b9-79f6-48f1-a4dc-51818ab36e50, {}, []), Current: BuildDefinition(java11, ., Zip, , e9cb278a-418b-4322-b626-eaca6512385c, {}, []), Function: Function(name='DailyUpdateFunction', functionname='DailyUpdateFunction', runtime='java11', memory=512, timeout=20, handler='lechads.fantasybball.LambdaHandler::handleRequest', imageuri=None, packagetype='Zip', imageconfig=None, codeuri='.', environment={'Variables': {'ENV': 'PROD'}}, rolearn=None, layers=[], events={'NoonDailyUpdate': {'Type': 'Schedule', 'Properties': {'Schedule': 'cron(0, 17, *, *, ?, *)', 'Name': 'noon-daily-update', 'Enabled': True}}}, metadata=None, codesign_config_arn=None))
2021-01-03 18:50:45,774 | Building codeuri: . runtime: java11 metadata: {} functions: ['DailyUpdateFunction']
2021-01-03 18:50:45,774 | Building to following folder /Users/cschafer/Desktop/fantasy_bball_slack_integration/components/daily_update/.aws-sam/build/DailyUpdateFunction
2021-01-03 18:50:45,774 | Looking for a supported build workflow in following directories: ['/Users/cschafer/Desktop/fantasy_bball_slack_integration/components/daily_update', '/Users/cschafer/Desktop/fantasy_bball_slack_integration/components/daily_update']
2021-01-03 18:50:45,775 | Loading workflow module 'aws_lambda_builders.workflows'
2021-01-03 18:50:45,779 | Registering workflow 'PythonPipBuilder' with capability 'Capability(language='python', dependency_manager='pip', application_framework=None)'
2021-01-03 18:50:45,780 | Registering workflow 'NodejsNpmBuilder' with capability 'Capability(language='nodejs', dependency_manager='npm', application_framework=None)'
2021-01-03 18:50:45,782 | Registering workflow 'RubyBundlerBuilder' with capability 'Capability(language='ruby', dependency_manager='bundler', application_framework=None)'
2021-01-03 18:50:45,783 | Registering workflow 'GoDepBuilder' with capability 'Capability(language='go', dependency_manager='dep', application_framework=None)'
2021-01-03 18:50:45,784 | Registering workflow 'GoModulesBuilder' with capability 'Capability(language='go', dependency_manager='modules', application_framework=None)'
2021-01-03 18:50:45,787 | Registering workflow 'JavaGradleWorkflow' with capability 'Capability(language='java', dependency_manager='gradle', application_framework=None)'
2021-01-03 18:50:45,790 | Registering workflow 'JavaMavenWorkflow' with capability 'Capability(language='java', dependency_manager='maven', application_framework=None)'
2021-01-03 18:50:45,792 | Registering workflow 'DotnetCliPackageBuilder' with capability 'Capability(language='dotnet', dependency_manager='cli-package', application_framework=None)'
2021-01-03 18:50:45,794 | Registering workflow 'CustomMakeBuilder' with capability 'Capability(language='provided', dependency_manager=None, application_framework=None)'
2021-01-03 18:50:45,794 | Found workflow 'JavaGradleWorkflow' to support capabilities 'Capability(language='java', dependency_manager='gradle', application_framework=None)'
2021-01-03 18:50:46,549 | Running workflow 'JavaGradleWorkflow'
2021-01-03 18:50:46,550 | Running JavaGradleWorkflow:GradleBuild
2021-01-03 18:50:56,643 | JavaGradleWorkflow:GradleBuild failed
Traceback (most recent call last):
File "/usr/local/Cellar/aws-sam-cli/1.15.0/libexec/lib/python3.8/site-packages/aws_lambda_builders/workflows/java_gradle/actions.py", line 47, in _build_project
self.subprocess_gradle.build(
File "/usr/local/Cellar/aws-sam-cli/1.15.0/libexec/lib/python3.8/site-packages/aws_lambda_builders/workflows/java_gradle/gradle.py", line 45, in build
raise GradleExecutionError(message=stderr.decode("utf8").strip())
aws_lambda_builders.workflows.java_gradle.gradle.GradleExecutionError: Gradle Failed: FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':jar'.
> Cannot expand ZIP '/var/folders/90/ytzb7jbs72q6x6_zklk09tlm0000gr/T/tmpa7zgkyzw/7c0bdfb5795645b1a06a4b13758f35c5eca150e7/build/libs/commons.jar' as it does not exist.
It seems that SAM copies the source code to a specific temporary location in the file system but doesn't know about the subproject dependency. Does anyone have advice/suggestions on how to accommodate this code structure with AWS SAM? FWIW, everything worked via AWS SAM before I pulled the shared code into its own Gradle module.
As mentioned in the title, I'm trying to run a shell action that kicks off a Spark job, but unfortunately I'm consistently getting the following error:
19/05/10 14:03:39 ERROR AbstractRpcClient: SASL authentication failed.
The most likely cause is missing or invalid credentials. Consider
'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by
GSSException: No valid credentials provided (Mechanism level: Failed
to find any Kerberos tgt)]
java.io.IOException: Could not set up IO Streams to
<hbaseregionserver>
Fri May 10 14:03:39 BST 2019,
RpcRetryingCaller{globalStartTime=1557493419339, pause=100,
retries=2}, org.apache.hadoop.hbase.ipc.FailedServerException: This
server is in the failed servers list: <hbaseregionserver>
I've been playing around trying to get the script to pick up the Kerberos ticket, but with no luck. As far as I can tell, it's related to the Oozie job not being able to pass the Kerberos ticket on. Any ideas why it isn't being picked up? I'm at a loss. The related code is below.
Oozie workflow action
<action name="sparkJ" cred="hive2Cred">
    <shell xmlns="uri:oozie:shell-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${oozieQueueName}</value>
            </property>
        </configuration>
        <exec>run.sh</exec>
        <file>/thePathToTheScript/run.sh#run.sh</file>
        <file>/thePathToTheProperties/myp.properties#myp.properties</file>
        <capture-output />
    </shell>
    <ok to="end" />
    <error to="fail" />
</action>
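For context, the workflow itself is submitted in the usual way; a sketch with a placeholder Oozie URL and properties file name (not my real values):

oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run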
Shell script
#!/bin/sh
export job_name=SPARK_JOB
export configuration=myp.properties
export num_executors=10
export executor_memory=1G
export queue=YARNQ
export max_executors=50
kinit -kt KEYTAB KPRINCIPAL
echo "[[[[[[[[[[[[[ Starting Job - name:${job_name}, configuration:${configuration} ]]]]]]]]]]]]]]"
/usr/hdp/current/spark2-client/bin/spark-submit \
--name ${job_name} \
--driver-java-options "-Dlog4j.configuration=file:./log4j.properties" \
--num-executors ${num_executors} \
--executor-memory ${executor_memory} \
--master yarn \
--keytab KEYTAB \
--principal KPRINCIPAL \
--supervise \
--deploy-mode cluster \
--queue ${queue} \
--files "./${configuration},./hbase-site.xml,./log4j.properties" \
--conf spark.driver.extraClassPath="/usr/hdp/current/hive-client/lib/datanucleus-*.jar:/usr/hdp/current/tez-client/*.jar" \
--conf spark.executor.extraJavaOptions="-Djava.security.auth.login.config=./jaas.conf -Dlog4j.configuration=file:./log4j.properties" \
--conf spark.executor.extraClassPath="/usr/hdp/current/hive-client/lib/datanucleus-*.jar:/usr/hdp/current/tez-client/*.jar" \
--conf spark.streaming.stopGracefullyOnShutdown=true \
--conf spark.dynamicAllocation.enabled=true \
--conf spark.shuffle.service.enabled=true \
--conf spark.dynamicAllocation.maxExecutors=${max_executors} \
--conf spark.streaming.concurrentJobs=2 \
--conf spark.streaming.backpressure.enabled=true \
--conf spark.yarn.security.tokens.hive.enabled=true \
--conf spark.yarn.security.tokens.hbase.enabled=true \
--conf spark.streaming.kafka.maxRatePerPartition=5000 \
--conf spark.streaming.backpressure.pid.maxRate=3000 \
--conf spark.streaming.backpressure.pid.minRate=200 \
--conf spark.streaming.backpressure.initialRate=5000 \
--jars /usr/hdp/current/hbase-client/lib/guava-12.0.1.jar,/usr/hdp/current/hbase-client/lib/hbase-common.jar,/usr/hdp/current/hbase-client/lib/hbase-client.jar,/usr/hdp/current/hbase-client/lib/hbase-protocol.jar,/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar \
--class myclass myjar.jar ./${configuration}
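For reference, one diagnostic I could add right after the kinit line is klist (part of the standard Kerberos client tools), just to confirm whether a ticket is actually obtained when the script runs under Oozie:

kinit -kt KEYTAB KPRINCIPAL
klist   # should show a TGT for KPRINCIPAL; if it doesn't, the keytab/principal pair is the problem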
Many thanks for any help you can provide.
I have downloaded the Hadoop source code from GitHub and compiled it with the native option:
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
I then copied the .dylib files to $HADOOP_HOME/lib:
cp -p hadoop-common-project/hadoop-common/target/hadoop-common-2.7.1/lib/native/*.dylib /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib
The LD_LIBRARY_PATH was updated and HDFS restarted:
echo $LD_LIBRARY_PATH
/usr/local/Cellar/hadoop/2.7.2/libexec/lib:/usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/lib:/Library/Java/JavaVirtualMachines/jdk1.8.0_92.jdk/Contents/Home//jre/lib
(Note: this also means that the answer to Hadoop "Unable to load native-hadoop library for your platform" error on docker-spark? does not work for me.)
But checknative still returns uniformly false:
$ stop-dfs.sh && start-dfs.sh && hadoop checknative
16/06/13 16:12:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [sparkbook]
sparkbook: stopping namenode
localhost: stopping datanode
Stopping secondary namenodes [0.0.0.0]
0.0.0.0: stopping secondarynamenode
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [sparkbook]
sparkbook: starting namenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-namenode-sparkbook.out
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-datanode-sparkbook.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-secondarynamenode-sparkbook.out
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false
To get this working on a fresh install of macOS 10.12, I had to do the following:
1) Install build dependencies using homebrew:
brew install cmake maven openssl protobuf@2.5 snappy
2) Check out hadoop source code:
git clone https://github.com/apache/hadoop.git
cd hadoop
git checkout rel/release-2.7.3
3) Apply the below patch to the build:
diff --git a/hadoop-common-project/hadoop-common/src/CMakeLists.txt b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
index 942b19c..8b34881 100644
--- a/hadoop-common-project/hadoop-common/src/CMakeLists.txt
+++ b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
@@ -16,6 +16,8 @@
# limitations under the License.
#
+SET(CUSTOM_OPENSSL_PREFIX /usr/local/opt/openssl)
+
cmake_minimum_required(VERSION 2.6 FATAL_ERROR)
# Default to release builds
@@ -116,8 +118,8 @@ set(T main/native/src/test/org/apache/hadoop)
GET_FILENAME_COMPONENT(HADOOP_ZLIB_LIBRARY ${ZLIB_LIBRARIES} NAME)
SET(STORED_CMAKE_FIND_LIBRARY_SUFFIXES ${CMAKE_FIND_LIBRARY_SUFFIXES})
-set_find_shared_library_version("1")
-find_package(BZip2 QUIET)
+set_find_shared_library_version("1.0")
+find_package(BZip2 REQUIRED)
if (BZIP2_INCLUDE_DIR AND BZIP2_LIBRARIES)
GET_FILENAME_COMPONENT(HADOOP_BZIP2_LIBRARY ${BZIP2_LIBRARIES} NAME)
set(BZIP2_SOURCE_FILES
diff --git a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
index d2ddf89..ac8e351 100644
--- a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
+++ b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
@@ -17,4 +17,8 @@
<!-- Put site-specific property overrides in this file. -->
<configuration>
+<property>
+<name>io.compression.codec.bzip2.library</name>
+<value>libbz2.dylib</value>
+</property>
</configuration>
diff --git a/hadoop-tools/hadoop-pipes/pom.xml b/hadoop-tools/hadoop-pipes/pom.xml
index 34c0110..70f23a4 100644
--- a/hadoop-tools/hadoop-pipes/pom.xml
+++ b/hadoop-tools/hadoop-pipes/pom.xml
@@ -52,7 +52,7 @@
<mkdir dir="${project.build.directory}/native"/>
<exec executable="cmake" dir="${project.build.directory}/native"
failonerror="true">
- <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model}"/>
+ <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model} -DOPENSSL_ROOT_DIR=/usr/local/opt/openssl"/>
</exec>
<exec executable="make" dir="${project.build.directory}/native" failonerror="true">
<arg line="VERBOSE=1"/>
4) Build hadoop from source:
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
5) Specify JAVA_LIBRARY_PATH when running hadoop:
$ JAVA_LIBRARY_PATH=/usr/local/opt/openssl/lib:/opt/local/lib:/usr/lib hadoop-dist/target/hadoop-2.7.3/bin/hadoop checknative -a
16/10/14 20:16:32 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library libbz2.dylib
16/10/14 20:16:32 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /Users/admin/Desktop/hadoop/hadoop-dist/target/hadoop-2.7.3/lib/native/libhadoop.dylib
zlib: true /usr/lib/libz.1.dylib
snappy: true /usr/local/lib/libsnappy.1.dylib
lz4: true revision:99
bzip2: true /usr/lib/libbz2.1.0.dylib
openssl: true /usr/local/opt/openssl/lib/libcrypto.dylib
There are some missing steps in @andrewdotn's response above:
1) For step 3), create the patch by saving the posted text to a text file, e.g. "patch.txt", and then execute "git apply patch.txt" (see the sketch after this list).
2) In addition to copying the files as directed by javadba, certain applications also require that you set:
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${HADOOP_HOME}/lib/native
export JAVA_LIBRARY_PATH=$JAVA_LIBRARY_PATH:${HADOOP_HOME}/lib/native
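A minimal sketch of step 1), assuming the patch text was saved as patch.txt in the repository root:

cd hadoop
git apply --check patch.txt   # dry run: reports problems without changing any files
git apply patch.txt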
The needed step is to copy the *.dylib files from the git sources build dir into the $HADOOP_HOME/<common dir>/lib dir for your platform. For OS X installed via brew it is:
cp /git/hadoop/hadoop-dist/target/hadoop-2.7.1/lib/native/*.dylib /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/
We can see the required libs there now:
$ ll /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/*.dylib
-rwxr-xr-x 1 macuser staff 149100 Jun 13 13:44 /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/libhadoop.dylib
-rwxr-xr-x 1 macuser staff 149100 Jun 13 13:44 /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/libhadoop.1.0.0.dylib
And now the hadoop checknative command works:
$ hadoop checknative
16/06/15 09:10:59 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/libhadoop.dylib
zlib: true /usr/lib/libz.1.dylib
snappy: false
lz4: true revision:99
bzip2: false
openssl: false build does not support openssl.
As an update to @andrewdotn's answer, here is the patch.txt file to be used with Hadoop 2.8.1:
diff --git a/hadoop-common-project/hadoop-common/src/CMakeLists.txt b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
index c93bfe78546..e8918f9ca29 100644
--- a/hadoop-common-project/hadoop-common/src/CMakeLists.txt
+++ b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
@@ -20,6 +20,8 @@
# CMake configuration.
#
+SET(CUSTOM_OPENSSL_PREFIX /usr/local/opt/openssl)
+
cmake_minimum_required(VERSION 2.6 FATAL_ERROR)
list(APPEND CMAKE_MODULE_PATH ${CMAKE_SOURCE_DIR}/..)
@@ -50,8 +52,8 @@ get_filename_component(HADOOP_ZLIB_LIBRARY ${ZLIB_LIBRARIES} NAME)
# Look for bzip2.
set(STORED_CMAKE_FIND_LIBRARY_SUFFIXES ${CMAKE_FIND_LIBRARY_SUFFIXES})
-hadoop_set_find_shared_library_version("1")
-find_package(BZip2 QUIET)
+hadoop_set_find_shared_library_version("1.0")
+find_package(BZip2 REQUIRED)
if(BZIP2_INCLUDE_DIR AND BZIP2_LIBRARIES)
get_filename_component(HADOOP_BZIP2_LIBRARY ${BZIP2_LIBRARIES} NAME)
set(BZIP2_SOURCE_FILES
diff --git a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
index d2ddf893e49..ac8e351f1c8 100644
--- a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
+++ b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
@@ -17,4 +17,8 @@
<!-- Put site-specific property overrides in this file. -->
<configuration>
+<property>
+<name>io.compression.codec.bzip2.library</name>
+<value>libbz2.dylib</value>
+</property>
</configuration>
diff --git a/hadoop-tools/hadoop-pipes/pom.xml b/hadoop-tools/hadoop-pipes/pom.xml
index 8aafad0f7eb..d4832542265 100644
--- a/hadoop-tools/hadoop-pipes/pom.xml
+++ b/hadoop-tools/hadoop-pipes/pom.xml
@@ -55,7 +55,7 @@
<mkdir dir="${project.build.directory}/native"/>
<exec executable="cmake" dir="${project.build.directory}/native"
failonerror="true">
- <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model}"/>
+ <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model} -DOPENSSL_ROOT_DIR=/usr/local/opt/openssl"/>
</exec>
<exec executable="make" dir="${project.build.directory}/native" failonerror="true">
<arg line="VERBOSE=1"/>
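For completeness, a rough end-to-end sketch using this patch (the rel/release-2.8.1 tag name is assumed to follow the same convention as the 2.7.3 tag above):

git clone https://github.com/apache/hadoop.git
cd hadoop
git checkout rel/release-2.8.1
# save the diff above as patch.txt, then:
git apply patch.txt
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true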
I'm trying to deploy Equinox v3.9.1, but I get the following exception:
!ENTRY org.eclipse.osgi 4 0 2013-11-11 12:19:29.027
!MESSAGE Could not find bundle: org.eclipse.equinox.console
!STACK 0
org.osgi.framework.BundleException: Could not find bundle: org.eclipse.equinox.console
at org.eclipse.osgi.framework.internal.core.ConsoleManager.checkForConsoleBundle(ConsoleManager.java:211)
at org.eclipse.core.runtime.adaptor.EclipseStarter.startup(EclipseStarter.java:298)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:177)
at org.eclipse.core.runtime.adaptor.EclipseStarter.main(EclipseStarter.java:152)
I am using the config.ini file as follows:
osgi.bundles=org.eclipse.osgi@-1:start,\
org.apache.felix.gogo.command@:start, \
org.apache.felix.gogo.runtime@:start, \
org.apache.felix.gogo.shell@:start, \
org.eclipse.equinox.console@:start, \
org.eclipse.equinox.cm@:start, \
org.eclipse.equinox.common@:start, \
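For what it's worth, this is how I double-check that the console bundle jar is actually in the bundles folder (assuming a plugins/ directory next to the configuration folder; the directory name is an assumption on my part):

ls plugins | grep org.eclipse.equinox.console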
I am sure the jar file is in the folder. Could anyone help me solve this error?
Thank you very much.
I tried to use MvvmCross following the TipCalc example, but my View doesn't load. I can see the Activity label, but the layout doesn't show up, with the following application output:
Forwarding debugger port 8973
Forwarding console port 8974
Detecting existing process
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/MVX.DroidWUL.dll
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Cirrious.MvvmCross.Droid.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Cirrious.CrossCore.Droid.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Cirrious.CrossCore.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/System.Windows.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Cirrious.MvvmCross.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Cirrious.MvvmCross.Binding.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Cirrious.MvvmCross.Binding.Droid.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/PCLCoreMVX.dll
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Cirrious.MvvmCross.Plugins.Json.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Newtonsoft.Json.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/System.Net.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/System.Xml.Serialization.dll [External]
Loaded assembly: /data/data/MVX.DroidWUL/files/.__override__/Cirrious.MvvmCross.Localization.dll [External]
Loaded assembly: Mono.Android.dll [External]
Loaded assembly: System.Core.dll [External]
[ApplicationPackageManager] cscCountry is not German : LUX
[mono] WARNING: The runtime version supported by this application is unavailable.
[mono] Using default runtime: v2.0.50727
[monodroid-gc] GREF GC Threshold: 46800
[MonoDroid] Xamarin/Android Trial Mode Active
[ApplicationPackageManager] cscCountry is not German : LUX
Loaded assembly: MonoDroidConstructors [External]
[ApplicationPackageManager] cscCountry is not German : LUX
Loaded assembly: System.dll [External]
Loaded assembly: System.Xml.dll [External]
[mvx] 0.05 Setup: PlatformServices start
mvx:Diagnostic: 0.05 Setup: PlatformServices start
[mvx] 0.70 Setup: Bootstrap actions
mvx:Diagnostic: 0.70 Setup: Bootstrap actions
[mvx] 0.82 Setup: StringToTypeParser start
mvx:Diagnostic: 0.82 Setup: StringToTypeParser start
[mvx] 0.84 Setup: ViewModelFramework start
mvx:Diagnostic: 0.84 Setup: ViewModelFramework start
[mvx] 0.85 Setup: PluginManagerFramework start
mvx:Diagnostic: 0.85 Setup: PluginManagerFramework start
[mvx] 0.89 Configuring Plugin Loader for Cirrious.MvvmCross.Plugins.Json.PluginLoader
mvx:Diagnostic: 0.89 Configuring Plugin Loader for Cirrious.MvvmCross.Plugins.Json.PluginLoader
[mvx] 0.89 Ensuring Plugin is loaded for Cirrious.MvvmCross.Plugins.Json.PluginLoader
mvx:Diagnostic: 0.89 Ensuring Plugin is loaded for Cirrious.MvvmCross.Plugins.Json.PluginLoader
[mvx] 0.90 Setup: App start
mvx:Diagnostic: 0.90 Setup: App start
[mvx] 0.91 Setup: ViewModelTypeFinder start
mvx:Diagnostic: 0.91 Setup: ViewModelTypeFinder start
[mvx] 0.92 Setup: ViewsContainer start
mvx:Diagnostic: 0.92 Setup: ViewsContainer start
[mvx] 0.93 Setup: ViewDispatcher start
mvx:Diagnostic: 0.93 Setup: ViewDispatcher start
[mvx] 0.95 Setup: Views start
mvx:Diagnostic: 0.95 Setup: Views start
[mvx] 1.15 Setup: CommandCollectionBuilder start
mvx:Diagnostic: 1.15 Setup: CommandCollectionBuilder start
[mvx] 1.16 Setup: NavigationSerializer start
mvx:Diagnostic: 1.16 Setup: NavigationSerializer start
[mvx] 1.21 Setup: LastChance start
mvx:Diagnostic: 1.21 Setup: LastChance start
[mvx] 1.53 Setup: Secondary end
mvx:Diagnostic: 1.53 Setup: Secondary end
[mvx] 1.62 Null Extras seen on Intent when creating ViewModel - this should not happen - have you tried to navigate to an MvvmCross View directly?
mvx:Error: 1.62 Null Extras seen on Intent when creating ViewModel - this should not happen - have you tried to navigate to an MvvmCross View directly?
[mvx] 1.63 ViewModel not loaded for view CalenderWUL
mvx:Warning: 1.63 ViewModel not loaded for view CalenderWUL
What am I doing wrong?
Thanks
"What am I doing wrong?"
From that level of detail, no idea.
The problem is clearly somewhere in the initialisation of CalendarWUL - whatever that is. But I can't diagnose it from that.
Some things you could try:
does the completed sample (using a more up-to-date version of the code) run from: https://github.com/slodge/MvvmCross-Tutorials/tree/master/TipCalc
do the latest released binaries work - https://github.com/slodge/MvvmCross-Binaries/
or do older versions work (from that same folder)
does the complete video walkthrough help - https://www.youtube.com/watch?v=qGup08cz7LM&list=PLR6WI6W1JdeYSXLbm58jwAKYT7RQR31-W (there are several other walkthroughs on that same playlist)
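If it helps, the first suggestion boils down to roughly this (a sketch; the local path and IDE are whatever you prefer):

git clone https://github.com/slodge/MvvmCross-Tutorials.git
cd MvvmCross-Tutorials/TipCalc
# open the solution here in Visual Studio / Xamarin Studio and run the Android project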