Regarding installation of HtmlUnit Driver

I am trying to install selenium-server-standalone-2.0b1.jar. For that I have to add all the jar files to the classpath of a Java project, but I don't know where all the jars inside selenium-server-standalone-2.0b1.jar are, or how to install selenium-server-standalone-2.0b1.jar in the first place. Please help me out.

I didn't quite understand your question. If you are looking for the jars, you can download them from the sites below. Try to use the newest jars, as they come with a lot of fixes.
Just add them to the classpath and you're ready to go.
http://seleniumhq.org/download/
Archive releases: http://release.seleniumhq.org/
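Once the standalone jar is on the project classpath, a minimal HtmlUnitDriver smoke test looks roughly like this (a sketch only; the class name and URL below are placeholders, not anything from the original question):

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class HtmlUnitSmokeTest {          // example class name
    public static void main(String[] args) {
        // HtmlUnitDriver ships inside selenium-server-standalone, so no browser install is needed
        WebDriver driver = new HtmlUnitDriver();
        driver.get("http://seleniumhq.org/");   // placeholder URL; any reachable page works
        System.out.println("Page title: " + driver.getTitle());
        driver.quit();
    }
}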

Related

How to resolve "test-lifecycle-and-artifactHandler-1.pom.sha1 cannot be created" on Maven installation

I need your help.
I downloaded the apache-maven-3.5.0-src.zip file from the Maven site. I started the unzip process and after a few seconds I got this error:
Error: The system cannot find the path specified.
Cannot create
C:\Users\AppData\Local\Temp\wz85b2\apache-maven-3.5.0\maven-core\src\test\resources\org\apache\maven\extension\test-extension-repo\org\apache\maven\core\test\test-lifecycle-and-artifactHandler\1\test-lifecycle-and-artifactHandler-1.pom.sha1.
I can't install Maven on my computer. How can I solve this?
The ...-src.zip file only contains the source files for your reference.
You need to download and install the binary zip file, apache-maven-3.5.0-bin.zip, if you're planning to use Maven to build stuff.
Otherwise, the above error looks suspiciously like a Windows path length problem.

Spark Cassandra NoClassDefFoundError guava/cache/CacheLoader

Running Cassandra 2.2.8, Win7, JDK 8, Spark 2. Have these in the classpath: Cassandra core 3.12, spark-cassandra-2.11, spark-cassandra-java-2.11, Spark 2.11, spark-network-common_2.11, guava-16.0.jar, the Scala 2.11 jar, etc.
Trying to run a basic example. It compiles fine, but when I try to run it, I get an error at the very first line:
SparkConf conf = new SparkConf();
java.lang.NoClassDefFoundError: org/spark_project/guava/cache/CacheLoader
A missing spark-network-common jar is supposed to cause this error, but I do have it. Any conflicting jars?
Thanks
So the answer is: I don't know the exact cause, but the problem was solved. I used the pom and created a Maven project in Eclipse. It brought in several dozen jars and it finally worked. So it was likely some conflicting/missing jar; I tried to look into it, but it was hard to figure out.
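For reference, a minimal sketch of the kind of basic example that finally ran, assuming the Maven pom pulled in Spark and the spark-cassandra-connector jars; the app name, master, and Cassandra host below are illustrative assumptions, not values from the original question:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class CassandraSparkSmokeTest {    // example class name
    public static void main(String[] args) {
        // This first line is where NoClassDefFoundError appeared when the classpath was assembled by hand;
        // spark.cassandra.connection.host tells the connector where Cassandra is listening
        SparkConf conf = new SparkConf()
                .setAppName("cassandra-smoke-test")                 // illustrative app name
                .setMaster("local[*]")                              // in-process Spark for a quick check
                .set("spark.cassandra.connection.host", "127.0.0.1"); // assumed local Cassandra node
        JavaSparkContext sc = new JavaSparkContext(conf);
        System.out.println("Spark context started, version " + sc.version());
        sc.stop();
    }
}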
Maybe you should check the local repository, to see whether the jar has a .lastUpdated file next to it. If it does, delete those files and download again.

Eclipse plugin error for Hadoop on Ubuntu

I installed Hadoop version 1.0.3 and its related Eclipse plugin successfully. All the Hadoop functionality and examples are working pretty well, but when I try to use the plugin in Eclipse, it cannot connect to HDFS and I get the error:
An internal error occurred during: "Connecting to DFS localhost".
org/apache/commons/configuration/Configuration.
Could anybody help me solve this problem?
Thanks
You are facing this problem because the plugin is missing some necessary jars. To solve it, you need to rebuild the plugin after including the necessary jars. I have seen this kind of question a lot on SO, and they all point to the same thing. Please see these links:
Eclipse Hadoop plugin issue(Call to localhost/127.0.0.1:50070 )Can any body give me the solution for this?
Hadoop eclipse mapreduce is not working?
Installing Hadoop's Eclipse Plugin
I followed the instructions in this blog to build the Hadoop Eclipse plugin 1.0.4:
http://iredlof.com/part-4-compile-hadoop-v1-0-4-eclipse-plugin-on-ubuntu-12-10/
but it seems to have some missing parts, like:
in MANIFEST.MF you should add:
/lib/commons-cli-1.2.jar
and in build-contrib.xml you should also add:
<property name="commons-cli.version" value="1.2"/>
I hope these are useful!
You must start Hadoop from the command line first:
./[hadoop-path]/bin/start-all.sh

Jar bundler ANT task and working directory

There is a checkbox in Jar Bundler which allows you to set the working folder to be inside the package. Does anybody know how to do the same thing using the JarBundler ANT task? Thanks in advance!
I figured it out myself. Here's how:
<jarbundler dir="${deployment.dir}/bundled"
name="qqq"
mainclass="main.Main"
version="1.5"
jvmversion="1.6+"
icon="${deployment.dir}/Equals.icns"
workingdirectory="$APP_PACKAGE/Contents/Resources/Java"/>
I made a bundle and looked inside. There was the value "$APP_PACKAGE/Contents/Resources/Java", which Jar Bundler uses to set what I asked for. Thanks to those 9 people who at least read my question.
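To double-check the setting at runtime, a small sketch like this (the class name is just an example) can be dropped into the bundled app; with the workingdirectory attribute above, both lines should print a path ending in Contents/Resources/Java:

import java.io.File;

public class WorkingDirCheck {             // example class name
    public static void main(String[] args) {
        // Both of these reflect the working directory the bundle was launched with
        System.out.println("user.dir = " + System.getProperty("user.dir"));
        System.out.println("resolved = " + new File(".").getAbsolutePath());
    }
}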

How do I set where my grails plugins should be installed?

I saw the light and installed the joda-time plugin for Grails.
However, when I tried to commit my changes to source control I realised that Grails had put the files in:
C:\Users\Steve\.grails\1.1.1\plugins
instead of somewhere under the project directory of:
f:\grails\projects\myproject
Yeah, I'm using Windows :-\
So now when someone pulls down my changes from source control they are missing all the joda-time plugin loveliness, and they want to spank me :)
What should I be setting so that grails doesn't put anything under my user directory?
(It isn't installed as a global plugin - just as a project one - at least I think so; I ran "grails install-plugin joda-time".)
Many thanks in advance.
P.S. Currently listening to Plug In Baby by Muse....how coincidental :D
The plugin is listed in application.properties, so when someone gets your code Grails will install missing plugins the first time they run 'grails run-app' or other commands.
If you want to revert to 1.0.x behavior just create grails-app/conf/BuildConfig.groovy with the line
grails.project.plugins.dir='plugins'
and your plugins will be in with the rest of the project files.
