I am on HDP 2.1 (Hortonworks) with Hive 0.13.
I want to use an alias containing a space in a query, but it gives an error. Is it possible to have a space in an alias name? (It works fine when the alias has no space.)
I tried single quotes and no quotes; both still fail.
hive> select ot_vdot "alias test" from test4 where ot_vdot<>'null';
NoViableAltException(301@[146:1: selectExpression : ( expression | tableAllColumns );])
at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
at org.antlr.runtime.DFA.predict(DFA.java:144)
at org.apache.hadoop.hive.ql.parse.HiveParser_SelectClauseParser.selectExpression(HiveParser_SelectClauseParser.java:4142)
at org.apache.hadoop.hive.ql.parse.HiveParser_SelectClauseParser.selectItem(HiveParser_SelectClauseParser.java:3038)
at org.apache.hadoop.hive.ql.parse.HiveParser_SelectClauseParser.selectList(HiveParser_SelectClauseParser.java:1307)
at org.apache.hadoop.hive.ql.parse.HiveParser_SelectClauseParser.selectClause(HiveParser_SelectClauseParser.java:1070)
at org.apache.hadoop.hive.ql.parse.HiveParser.selectClause(HiveParser.java:40193)
at org.apache.hadoop.hive.ql.parse.HiveParser.singleSelectStatement(HiveParser.java:38048)
at org.apache.hadoop.hive.ql.parse.HiveParser.selectStatement(HiveParser.java:37754)
at org.apache.hadoop.hive.ql.parse.HiveParser.regularBody(HiveParser.java:37691)
at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpressionBody(HiveParser.java:36898)
at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpression(HiveParser.java:36774)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1338)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1036)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:199)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:408)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:322)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:976)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1041)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:912)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
FAILED: ParseException line 1:15 cannot recognize input near 'ot_vdot' '"alias test"' 'from' in select expression
hive>
Backticks are Hive's way to give things names containing special characters or reserved words. For example:
select `my field with spaces`, `select`, `where` from `_my_table`
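Applied to the query from the question (a sketch, assuming the same table and column names), the alias would be written as:
-- backticks let the alias contain a space
select ot_vdot `alias test` from test4 where ot_vdot <> 'null';
This should parse on Hive 0.13, where backtick-quoted identifiers are supported.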
================================================================================
I am using EMR-4.3.0, Spark 1.6.0, Hive 1.0.0.
I write a table like so (pseudocode) -
val df = <a dataframe>
df.registerTempTable("temptable")
sqlContext.setConf("hive.exec.dynamic.partition.mode","true")
sqlContext.sql("create external table exttable ( some columns ... )
partitioned by (partkey int) stored as parquet location 's3://some.bucket'")
sqlContext.sql("insert overwrite exttable partition(partkey) select columns from
temptable")
The write works fine and I can read the table back using -
sqlContext.sql("select * from exttable")
However, when I try to read the table using Hive as -
hive -e 'select * from exttable'
Hive throws a NullPointerException with the stack trace below. Any help appreciated! -
2016-05-19 03:08:02,537 ERROR [main()]: CliDriver (SessionState.java:printError(833)) - Failed with exception java.io.IOException:java.lang.NullPointerException
java.io.IOException: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:663)
at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:561)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:138)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1619)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:221)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:153)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:364)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:712)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:631)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:570)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.NullPointerException
at parquet.format.converter.ParquetMetadataConverter.fromParquetStatistics(ParquetMetadataConverter.java:247)
at parquet.format.converter.ParquetMetadataConverter.fromParquetMetadata(ParquetMetadataConverter.java:368)
at parquet.format.converter.ParquetMetadataConverter.readParquetMetadata(ParquetMetadataConverter.java:346)
at parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:296)
at parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:254)
at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.getSplit(ParquetRecordReaderWrapper.java:200)
at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:79)
at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:66)
at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:72)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:498)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:588)
... 15 more
UPDATE - After messing around for a bit, it seems like null values in the data mess Hive up. How do I avoid this?
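One way to sidestep this at write time (a sketch, not a confirmed fix; the column name is hypothetical) is to replace nulls before registering the table, e.g. with DataFrame.na.fill:

// replace nulls in the given string columns with empty strings
// ("some_string_col" is a placeholder for your actual column names)
val cleaned = df.na.fill("", Seq("some_string_col"))
cleaned.registerTempTable("temptable")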
I am trying to execute the code below in Hive:
create table xyz (name string, div int);
It shows an error. Can't we use a column named div in Hive? I have a large table that has a column div, and executing that HQL threw the error below. That is why I tried the smaller HQL above, which shows the same error. I am using Hive 0.13.
NoViableAltException(14@[])
at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.identifier(HiveParser_IdentifiersParser.java:11627)
at org.apache.hadoop.hive.ql.parse.HiveParser.identifier(HiveParser.java:40134)
at org.apache.hadoop.hive.ql.parse.HiveParser.columnNameType(HiveParser.java:34747)
at org.apache.hadoop.hive.ql.parse.HiveParser.columnNameTypeList(HiveParser.java:32979)
at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:4544)
at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2144)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1398)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1036)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:199)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:408)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:322)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:976)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1041)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:912)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
FAILED: ParseException line 2:15 cannot recognize input near 'div' 'int' ')' in column specification
Well the answer is found!
create table xyz (name string, `div` int);
This works! Surround div with backticks and it parses.
I suppose div must be a keyword in Hive (though I have not found it in any document).
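For what it's worth, div appears to be Hive's integer-division operator, which would explain why the parser rejects it as a bare column name:
select 7 div 2;           -- integer division, returns 3
select `div` from xyz;    -- backticks make it usable as a column name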
Thanks,
Neethu
While running Pig in MapReduce mode I am getting a really strange error.
The pigscript.pig contains:
x = load 'hdfs://file.avro' USING AvroStorage();
some transformations...
fs mv src/file dest/file;
Up to this point all works fine, but the script continues with
y = load 'hdfs://file2.avro' USING AvroStorage();
While executing the previous command I got the error below. I double-checked, and file2.avro is there, stored in HDFS.
When I quit Pig and re-run the script from the line
y = load 'hdfs://file2.avro' USING AvroStorage();
all works fine.
Any idea?
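A guess based on the symptom (re-running from that line works): in batch mode Pig's multi-query optimization can defer the stores above while fs commands run immediately, so file2.avro may not be in place yet when the parser asks AvroStorage for its schema. Turning multi-query optimization off makes statements execute in order, which may help:

# disable multi-query optimization so statements run in script order
pig -no_multiquery pigscript.pig

Alternatively, split the script in two at the fs mv, so the first part finishes before the second is parsed.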
Pig Stack Trace
---------------
ERROR 1200: null
Failed to parse: null
at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:201)
at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1707)
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1680)
at org.apache.pig.PigServer.registerQuery(PigServer.java:623)
at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1063)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:501)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:558)
at org.apache.pig.Main.main(Main.java:170)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.NullPointerException
at org.apache.pig.builtin.AvroStorage.getAvroSchema(AvroStorage.java:298)
at org.apache.pig.builtin.AvroStorage.getAvroSchema(AvroStorage.java:282)
at org.apache.pig.builtin.AvroStorage.getSchema(AvroStorage.java:256)
at org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
at org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
at org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:901)
at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3568)
at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1625)
at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:191)
... 16 more
================================================================================
I want to perform a simple UPDATE on table fleet_online_orc using some data from table skytrack_data_packet_orc. Both tables are shown below, and the queries are also given. Please help.
fleet_online_orc table
----------------------
latitude  longitude  gid    fleet_sim_no  location      ast_updated_time
26.8708   80.9568    21444  9828221444    Budshahnagar  2015-03-27 11:56:08.097
22.6958   75.8868    27534  9828027534    Samwad Nagar  2015-03-27 11:56:08.097
19.0918   72.8822    13458  9259113458    Jari Mari     2015-03-27 11:56:08.097

skytrack_data_packet_orc table
------------------------------
sim_no      latitude  longitude  utc_date  ist_time
9611170088  12.98304  77.59880   200415    2015-04-20 16:35:35.0
9611170088  12.98304  77.59880   200415    2015-04-20 16:35:19.0
9611170088  12.98304  77.59881   200415    2015-04-20 16:35:03.0
This is the query I am trying:
UPDATE fleet_online_orc tar
SET
tar.al_latitude = src.latitude,
tar.al_longitude = src.longitude,
tar.al_location ='Bangalore'
FROM skytrack_data_packet_orc src
WHERE tar.fleet_sim_no = src.sim_no
AND tar.gid = 21444;
And this is the error I am getting:
UPDATE bd_fleet.fleet_online_orc tar
SET tar.al_latitude = sor.latitude, tar.al_longitude = sor.longitude, tar.al_location = 'Bangalore'
FROM bd_fleet.skytrack_data_packet_orc sor
WHERE tar.fleet_sim_no = sor.sim_no
AND tar.gid = 2089;
MismatchedTokenException(17!=20)
at org.antlr.runtime.BaseRecognizer.recoverFromMismatchedToken(BaseRecognizer.java:617)
at org.antlr.runtime.BaseRecognizer.match(BaseRecognizer.java:115)
at org.apache.hadoop.hive.ql.parse.HiveParser.columnAssignmentClause(HiveParser.java:42463)
at org.apache.hadoop.hive.ql.parse.HiveParser.setColumnsClause(HiveParser.java:42536)
at org.apache.hadoop.hive.ql.parse.HiveParser.updateStatement(HiveParser.java:42682)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1604)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1052)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:199)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:389)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:303)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1067)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1129)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 1:34 extraneous input 'tar' expecting SET near 'tar' in update statement
line 2:7 mismatched input '.' expecting = near 'tar' in update statement
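For what it's worth, Hive's UPDATE statement accepts only UPDATE <table> SET <col> = <value> [WHERE ...]: no FROM clause, no second table, and no alias prefixes on the SET columns, which is what the parser is rejecting here. A common workaround (a sketch, assuming the sample columns above) is to rebuild the table with INSERT OVERWRITE plus a join:

-- rows matching the condition take their values from src, others are kept as-is
INSERT OVERWRITE TABLE fleet_online_orc
SELECT
  IF(src.sim_no IS NOT NULL AND tar.gid = 21444, src.latitude,  tar.latitude),
  IF(src.sim_no IS NOT NULL AND tar.gid = 21444, src.longitude, tar.longitude),
  tar.gid,
  tar.fleet_sim_no,
  IF(src.sim_no IS NOT NULL AND tar.gid = 21444, 'Bangalore', tar.location),
  tar.ast_updated_time
FROM fleet_online_orc tar
LEFT OUTER JOIN skytrack_data_packet_orc src
  ON tar.fleet_sim_no = src.sim_no;

Note that skytrack_data_packet_orc has several rows per sim_no in the sample, so the source should first be reduced to one row per sim_no (e.g. the latest ist_time) or the overwrite will multiply rows.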
I have a huge log file with a lot of exceptions, and I need to extract the full stack traces and a few lines before and after each one. It would be perfect if the tool for this were a bash script.
example:
16.02.2012 16:04:34 *INFO * [main] InitialContextInitializer: Reference bound: rmirepository (InitialContextInitializer.java, line 203)
16.02.2012 16:04:34 *ERROR* [main] StandaloneContainerInitializedListener: Error of StandaloneContainer initialization (StandaloneContainerInitializedListener.java, line 109)
java.lang.RuntimeException: Cannot instantiate component key=org.exoplatform.services.jcr.ext.script.groovy.GroovyScript2RestLoader type=org.exoplatform.services.jcr.ext.script.groovy.GroovyScript2RestLoader found at file:/home/roman/reports/backup/1.14.7-5636/rdbms/single/exo-tomcat_1.14.7-5636/exo-configuration.xml
at org.exoplatform.container.jmx.MX4JComponentAdapter.getComponentInstance(MX4JComponentAdapter.java:134)
at org.exoplatform.container.management.ManageableComponentAdapter.getComponentInstance(ManageableComponentAdapter.java:68)
at org.exoplatform.container.ConcurrentPicoContainer.getInstance(ConcurrentPicoContainer.java:468)
at org.exoplatform.container.ConcurrentPicoContainer.getComponentInstancesOfType(ConcurrentPicoContainer.java:366)
at org.exoplatform.container.CachingContainer.getComponentInstancesOfType(CachingContainer.java:111)
at org.exoplatform.container.LifecycleVisitor.visitContainer(LifecycleVisitor.java:151)
at org.exoplatform.container.ConcurrentPicoContainer.accept(ConcurrentPicoContainer.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.picocontainer.defaults.AbstractPicoVisitor.traverse(AbstractPicoVisitor.java:32)
at org.exoplatform.container.LifecycleVisitor.traverse(LifecycleVisitor.java:90)
at org.exoplatform.container.LifecycleVisitor.start(LifecycleVisitor.java:170)
at org.exoplatform.container.ConcurrentPicoContainer.start(ConcurrentPicoContainer.java:554)
at org.exoplatform.container.ExoContainer.start(ExoContainer.java:266)
at org.exoplatform.container.StandaloneContainer$3.run(StandaloneContainer.java:178)
at org.exoplatform.container.StandaloneContainer$3.run(StandaloneContainer.java:175)
at org.exoplatform.commons.utils.SecurityHelper.doPrivilegedAction(SecurityHelper.java:291)
at org.exoplatform.container.StandaloneContainer.getInstance(StandaloneContainer.java:174)
at org.exoplatform.container.StandaloneContainer.getInstance(StandaloneContainer.java:129)
at org.exoplatform.ws.frameworks.servlet.StandaloneContainerInitializedListener.contextInitialized(StandaloneContainerInitializedListener.java:104)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4205)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4704)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1315)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1061)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.RuntimeException: Cannot instantiate component key=org.exoplatform.services.jcr.RepositoryService type=org.exoplatform.services.jcr.impl.RepositoryServiceImpl found at file:/home/roman/reports/backup/1.14.7-5636/rdbms/single/exo-tomcat_1.14.7-5636/exo-configuration.xml
at org.exoplatform.container.jmx.MX4JComponentAdapter.getComponentInstance(MX4JComponentAdapter.java:134)
at org.exoplatform.container.management.ManageableComponentAdapter.getComponentInstance(ManageableComponentAdapter.java:68)
at org.exoplatform.container.ConcurrentPicoContainer.getInstance(ConcurrentPicoContainer.java:468)
at org.exoplatform.container.ConcurrentPicoContainer.getComponentInstanceOfType(ConcurrentPicoContainer.java:422)
at org.exoplatform.container.CachingContainer.getComponentInstanceOfType(CachingContainer.java:139)
at org.exoplatform.container.ExoContainer.createComponent(ExoContainer.java:407)
at org.exoplatform.container.jmx.MX4JComponentAdapter.getComponentInstance(MX4JComponentAdapter.java:96)
... 45 more
Use awk:
# Print each stack-frame line ("\tat ...") together with the line
# that immediately precedes the trace (usually the log message).
BEGIN {
    previous = "";
}
/^\tat / {
    # first frame of a trace: flush the line that preceded it
    if (previous != "") {
        print previous;
        previous = "";
    }
    print;
    next;
}
{ previous = $0; }
This should do the trick. In a nutshell, it looks for the pattern \tat (tab, "at", space), which is almost always present in a stack trace. Save the program as, say, extract.awk and run it with awk -f extract.awk yourlogfile.
If you have many exceptions, you can use maps (associative arrays, in awk lingo) to save part of each exception message and then do statistics (like which exception happens the most).
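As a sketch of that idea (the pattern is an assumption about the log format shown above; file names are mine), the following tallies the exception class that starts each trace; run it with awk -f count.awk server.log:

# match lines that begin with a qualified exception class,
# e.g. "java.lang.RuntimeException: ..." or "Caused by: java.io.IOException"
/^(Caused by: )?[A-Za-z_$][A-Za-z0-9_$.]*(Exception|Error)/ {
    cls = $0;
    sub(/^Caused by: /, "", cls);   # drop the "Caused by: " prefix
    sub(/:.*$/, "", cls);           # keep only the class name
    count[cls]++;
}
END {
    for (c in count)
        printf("%6d %s\n", count[c], c);
}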
Most lines in a Java exception block start with a TAB character, so to get all the exceptions from a huge log file you can grep for that TAB.
Use the following command:
$ grep -P -A 15 -B 15 "^\t" inputLogFile
(-P makes grep interpret \t as a tab; -B 15/-A 15 print 15 lines of context before/after each match.)
You can use the command below:
grep -B 10 -A 100 "Caused by:" app.log
This will show the Java exceptions logged in the application log file (note that stack traces spell it "Caused by:", and the match is case-sensitive). You can change the string if you want to find specific exceptions.
https://www.techiedelight.com/java-program-search-exceptions-huge-log-file-on-server/
You can try something like this,
=======
grep -B20 -A20 '^16.02.2012 16:04:34' <path/to/log/file>
=======
Where
-B20 -- will display 20 lines before the line where the match was found.
-A20 -- will display 20 lines after the line where the match was found.