sparkR Installation Issue with spark-1.6.3-bin-hadoop2.6

Spark installed successfully, but while running sparkR I get the error env: R: Permission denied.
My Spark shell works fine. Can someone help me with this issue?
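env: R: Permission denied usually means env resolved a file named R that exists but is not executable. A minimal diagnostic sketch, assuming a POSIX shell (the helper function and the java check are illustrative additions, not from the question):

```shell
#!/bin/sh
# Report whether a command resolves to an executable file on PATH;
# "env: R: Permission denied" typically means R resolves to a file
# without the execute bit set.
check_exec() {
    path=$(command -v "$1" 2>/dev/null)
    if [ -z "$path" ]; then
        echo "$1: not on PATH"
    elif [ -x "$path" ]; then
        echo "$1: ok ($path)"
    else
        echo "$1: $path is not executable; try: chmod +x $path"
    fi
}

check_exec R      # sparkR launches R via env, so this must report ok
check_exec java   # Spark itself needs Java
```

If R resolves but is not executable, restoring the execute bit (possibly with sudo, depending on where R is installed) and rerunning sparkR should clear the error.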

Related

pod install throws env: ruby_executable_hooks: No such file or directory on Jenkins only

I have tried the other solutions provided; none of them helped in this exact scenario.
We recently updated the Jenkins machine to Catalina, then Jenkins from 2.5 to 2.222.1, then Ruby from 2.3 to 2.6, and reinstalled CocoaPods.
The /usr/local/bin/pod install command fails in Jenkins CI with the error
env: ruby_executable_hooks: No such file or directory
When I log in to the machine remotely and run the same command from the same job folder in Terminal, everything runs fine with no issues.
Does anyone know what the issue might be?
The answer to this question gave me an idea:
Error env: ruby_executable_hooks: No such file or directory while running Slather on Jenkins
This works fine
$HOME/.rvm/gems/ruby-2.6.5/wrappers/pod install
The same fix doesn't seem to work for xcpretty, but that is a problem for another time.
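A sketch of why the wrapper path helps: Jenkins runs build steps in a non-login shell, so rvm's environment (which provides ruby_executable_hooks) is never loaded, whereas an interactive Terminal session does load it. This Jenkins "Execute shell" build-step fragment assumes rvm and ruby-2.6.5 as in the fix above; treat it as a sketch, not a definitive configuration:

```shell
#!/bin/sh
# Jenkins build-step fragment. Load rvm explicitly, since Jenkins
# does not start a login shell; after this, plain `pod install`
# (and xcpretty) can find ruby_executable_hooks.
[ -s "$HOME/.rvm/scripts/rvm" ] && . "$HOME/.rvm/scripts/rvm"
rvm use ruby-2.6.5

pod install

# Alternative (the fix above): skip loading rvm and call the wrapper,
# which sets up the gem environment itself:
#   $HOME/.rvm/gems/ruby-2.6.5/wrappers/pod install
```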

Unable to Launch Hive prompt in Windows 10

I have downloaded Hadoop 3.1.0 and Hive 2.1.0 on my Windows 10 machine. Hadoop runs properly using the start-all.cmd command from the terminal. When I try to run hive from the command prompt, it gives the messages and errors shown in the attached screenshot. I am using Derby 10.12.1.1 with Hive, following this tutorial from YouTube.
I have also tried reinstalling Hive, but it still does not work. I have already spent a lot of time on this problem without success.
Any sort of help will be appreciated. Thank you.
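Since the screenshot is not included, this is only a frequent cause to rule out, not a confirmed diagnosis: on Hive 2.x the metastore schema must be initialized once before hive can start. A hedged sketch, assuming Hive's bin directory is on the path and Derby is the metastore as in the question:

```shell
# Check whether the metastore schema exists, and initialize it once
# if it does not; an uninitialized Derby metastore is a common reason
# the hive prompt fails to launch on Hive 2.x.
schematool -dbType derby -info
schematool -dbType derby -initSchema

# Then retry launching the Hive prompt:
hive
```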

error while installing kylo specific services for nifi

I am trying to install Kylo 0.8.4.
After installing NiFi, there is a step to install Kylo-specific components using the command
sudo ./install-kylo-components.sh /opt /opt/kylo kylo kylo
but I am getting the following error:
Creating symlinks for NiFi version 1.4.0.jar compatible nars
ERROR: spark-submit not on path. Has spark been installed?
I have Spark installed.
Need help.
The script calls which spark-submit to check if Spark is available. If available, it uses spark-submit --version to determine the version of Spark that is installed.
The error indicates that spark-submit is not available on the system path. Can you please execute which spark-submit on the command line and check the result? Please refer to the screenshot below for the expected result on the Kylo sandbox.
If spark-submit is not available on the system path, you can fix it by updating the PATH variable in the .bash_profile file to include the location of your Spark installation.
As a next step, you can also verify the installed version of Spark by running spark-submit --version. Please refer to screenshot below for an example result.
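The PATH fix described above can be sketched as follows; /opt/spark is an assumed install location, not taken from the question:

```shell
# Lines to append to ~/.bash_profile; adjust SPARK_HOME to the real
# install location (/opt/spark here is an assumption).
export SPARK_HOME=/opt/spark
export PATH="$SPARK_HOME/bin:$PATH"

# Reload the profile and verify, as the answer suggests:
#   source ~/.bash_profile
#   which spark-submit
#   spark-submit --version
```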

pyspark is not running - mac

I am trying to run pyspark on my local machine and somehow I cannot. I have downloaded the latest version of Spark and set the following path in .bash_profile:
export PATH="/Users/user/Downloads/spark-1.3.1-bin-hadoop2.6/bin:$PATH"
When I run this command:
IPYTHON_OPTS="notebook” pyspark
Nothing happens. I have done everything according to the instructions, and all I see as the output of that command is this symbol, followed by nothing:
>
Can anyone please help me?
Thanks
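A bare > after a command is the shell's continuation prompt: the shell is still waiting for input, here because the command above closes the string with a typographic quote (”) instead of a straight double quote, so the shell never sees the assignment as finished. A small demonstration with straight quotes (the pyspark invocation itself is left commented since it needs Spark's bin directory on PATH, as set above):

```shell
# With straight quotes the assignment parses and the shell does not
# hang at the > continuation prompt.
IPYTHON_OPTS="notebook"
echo "IPYTHON_OPTS is $IPYTHON_OPTS"

# Actual invocation once the quoting is fixed:
#   IPYTHON_OPTS="notebook" pyspark
```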

Issues with manual installation of predictionIO dependencies

I am installing PredictionIO from source. I have downloaded and installed PredictionIO successfully. I am now trying to install the dependencies (Spark, Elasticsearch, HBase), but I am running into errors for each of them. Below are the issues I see when I execute pio status:
1 - Unable to locate a proper Apache Spark installation
2 - It is also unable to find the metadata files.
I have not changed any default settings. I'm using Windows 8.1. IIS is running on localhost, and I run an IPython notebook on 127.0.0.1:8888.
Please help me get PredictionIO up and running on my machine.
Thanks
If you are on Windows, you can install with Vagrant.
http://docs.prediction.io/community/projects/#vagrant-installation-for-predictionio
I believe the discussion has moved to the Google group.
https://groups.google.com/forum/#!searchin/predictionio-user/SAS/predictionio-user/ZamBr1ZaQ3o/fyNkXh3zsv0J
This is the relevant thread:
https://groups.google.com/forum/#!searchin/predictionio-user/SAS/predictionio-user/0awaASUR8lE/JkLtPeRrNt4J
Moreover, the PredictionIO docs had a few errors. Below are some of them with their corrected versions.
1 - Actual line: PATH=$PATH:/home/yourname/predictionio/bin; export PATH
Corrected version: PATH=$PATH:/home/yourname/PredictionIO/bin; export PATH
2 - Actual line: $ pio eventserver
Corrected version: $ pio eventserver --ip 0.0.0.0
3 - Actual line: pio template get PredictionIO/templates-scala-parallel-recommendation MyRecommendation
Corrected version: pio template get PredictionIO/template-scala-parallel-recommendation MyRecommendation
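The corrections above, collected into one sketch (the home-directory path is the docs' placeholder; commands that start servers or download templates are left commented):

```shell
# 1 - The PATH entry uses the capitalized PredictionIO directory name:
PATH=$PATH:/home/yourname/PredictionIO/bin; export PATH

# 2 - Bind the event server to all interfaces:
#   pio eventserver --ip 0.0.0.0

# 3 - The template repository name is singular "template", not "templates":
#   pio template get PredictionIO/template-scala-parallel-recommendation MyRecommendation
```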
