How to raise the log level for logs stored in the ELK stack - elasticsearch

Is it possible to raise the log level for logs stored in the ELK stack? Currently all log levels end up in my ELK stack, but I only want warning and error logs to be stored. How can I do that?

I think you're looking for the Logstash drop filter, which lets you discard events based on some criteria, in your case debug, info, and the like. From the docs, a filter might look like:
filter {
  if [loglevel] == "debug" {
    drop { }
  }
}
https://www.elastic.co/guide/en/logstash/current/plugins-filters-drop.html
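Since you only want to keep warning and error events, a minimal sketch along the same lines (assuming your events carry a loglevel field with values like debug, info, warn, and error) would drop everything below warning:
filter {
  if [loglevel] in ["debug", "info"] {
    drop { }
  }
}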
Also, your question looks similar to this one:
Logstash drop filter for event

If you have a log file test.log like the one below:
DEBUG | 2008-09-06 10:51:44,817 | DefaultBeanDefinitionDocumentReader.java | 86 | Loading bean definitions
WARN | 2008-09-06 10:51:44,848 | AbstractBeanDefinitionReader.java | 185 | Loaded 5 bean definitions from location pattern [samContext.xml]
INFO | 2008-09-06 10:51:44,848 | XmlBeanDefinitionReader.java | 323 | Loading XML bean definitions from class path resource [tmfContext.xml]
DEBUG | 2008-09-06 10:51:44,848 | DefaultDocumentLoader.java | 72 | Using JAXP provider [com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl]
ERROR | 2008-09-06 10:51:44,848 | BeansDtdResolver.java | 72 | Found beans DTD [http://www.springframework.org/dtd/spring-beans.dtd] in classpath: spring-beans.dtd
ERROR | 2008-09-06 10:51:44,864 | DefaultBeanDefinitionDocumentReader.java | 86 | Loading bean definitions
DEBUG | 2008-09-06 10:51:45,458 | AbstractAutowireCapableBeanFactory.java | 411 | Finished creating instance of bean 'MS-SQL'
you can define an if conditional that keeps the messages you want and drops the others:
input {
  file {
    path => "/your/path/test.log"
    sincedb_path => "/your/path/test.idx"
    start_position => "beginning"
  }
}
filter {
  if [message] =~ "WARN" or [message] =~ "ERROR" {
  } else {
    drop {}
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
Then, your result will look like:
{
    "message" => "WARN | 2008-09-06 10:51:44,848 | AbstractBeanDefinitionReader.java | 185 | Loaded 5 bean definitions from location pattern [samContext.xml]",
    "@version" => "1",
    "@timestamp" => "2015-09-17T18:30:24.897Z",
    "host" => "MacBook-Pro-de-Alain.local",
    "path" => "/Users/Alain/Workspace/elk/logstash-1.5.4/config/filter/test.log"
}
{
    "message" => "ERROR | 2008-09-06 10:51:44,848 | BeansDtdResolver.java | 72 | Found beans DTD [http://www.springframework.org/dtd/spring-beans.dtd] in classpath: spring-beans.dtd",
    "@version" => "1",
    "@timestamp" => "2015-09-17T18:30:24.898Z",
    "host" => "MacBook-Pro-de-Alain.local",
    "path" => "/Users/Alain/Workspace/elk/logstash-1.5.4/config/filter/test.log"
}
{
    "message" => "ERROR | 2008-09-06 10:51:44,864 | DefaultBeanDefinitionDocumentReader.java | 86 | Loading bean definitions",
    "@version" => "1",
    "@timestamp" => "2015-09-17T18:30:24.899Z",
    "host" => "MacBook-Pro-de-Alain.local",
    "path" => "/Users/Alain/Workspace/elk/logstash-1.5.4/config/filter/test.log"
}
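The stdout output above is only for verification. If you want the surviving WARN/ERROR events stored in Elasticsearch, you would typically swap it for an elasticsearch output; a rough sketch (the host and index name are assumptions about your setup):
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}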
Regards,
Alain

Related

How to iterate through a nested dict in Puppet?

Here is a Python example of a data set similar to the one I have in my Puppet code:
dict = {'account1': {'uid': ['123456'], 'user': ['appuser1'], 'appname': ['myapp1']},
        'account2': {'uid': ['567878'], 'user': ['appuser2'], 'appname': ['myapp2']}}
for i in dict.keys():
    print dict[i]['user'], dict[i]['uid']
How do I achieve the same in Puppet/Ruby? TIA.
In Puppet manifests, you can iterate over a hash using the each function:
$ cat foo.pp
$dict = {
  'account1' => {
    'uid'     => ['123456'],
    'user'    => ['appuser1'],
    'appname' => ['myapp1']
  },
  'account2' => {
    'uid'     => ['567878'],
    'user'    => ['appuser2'],
    'appname' => ['myapp2']
  }
}
$dict.each | $account_key, $account | {
  notice("${account['user'][0]}, ${account['uid'][0]}")
}
$ puppet apply foo.pp
Notice: Scope(Class[main]): appuser1, 123456
Notice: Scope(Class[main]): appuser2, 567878
Notice: Compiled catalog for it070137 in environment production in 0.04 seconds
Notice: Applied catalog in 0.03 seconds
If you like, you can use types to check that the key and value in the hash are what you expect:
$dict.each | String $account_key, Hash $account | {
  notice("${account['user'][0]}, ${account['uid'][0]}")
}
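If the goal is to manage something from this data rather than just print it, the same each loop can declare resources. A minimal sketch, assuming you actually want user resources with those UIDs (the resource type and its parameters are my assumption, not part of the question):
$dict.each | $account_key, $account | {
  # One user resource per account entry in the hash
  user { $account['user'][0]:
    ensure  => present,
    uid     => $account['uid'][0],
    comment => "Account for ${account['appname'][0]}",
  }
}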

JpaRepository deleteAllInBatch() not working as expected

I am trying to delete some rows from my table using Spring Data JPA's deleteAllInBatch(), but when the number of rows to be deleted exceeds some threshold, JPA throws an error. I'm not certain of the cause of this error, but I found a Jira ticket: https://jira.spring.io/browse/DATAJPA-137.
I don't want to use deleteAll() as it deletes the data one by one and will lead to performance issues. Is this a drawback of JPA, or is there some solution to it? I looked for a workaround but didn't find anything very useful. Please help me find an efficient solution for this operation, or some useful references. Thanks in advance...
DbIssueApplication.java
@SpringBootApplication
public class DbIssueApplication
{
    public static void main(String[] args)
    {
        ApplicationContext context = SpringApplication.run(DbIssueApplication.class, args);
        TestService service = context.getBean(TestService.class);
        long st = System.currentTimeMillis();
        List<Test> testList = new ArrayList<>();
        for (int i = 0; i < 5000; i++)
        {
            testList.add(new Test(i, (i % 2 == 0) ? "field1" : "field2"));
        }
        service.insert(testList);
        service.deleteByName("field2");
        System.err.println("The processing took = " + (System.currentTimeMillis() - st) + " ms");
    }
}
Test.java
@Entity
@Table(name="test")
public class Test implements Serializable
{
    private static final long serialVersionUID = -9182756617906316269L;
    @Id
    private Integer id;
    private String name;
    ... getter, setter and constructors
}
TestRepository.java
public interface TestRepository extends JpaRepository<Test, Integer>
{
    List<Test> findByName(String name);
}
TestService.java
public interface TestService
{
    public void insert(List<Test> testList);
    public void deleteByName(String name);
}
TestServiceImpl.java
@Service
public class TestServiceImpl implements TestService
{
    @Autowired
    TestRepository testRepository;

    @Override
    public void insert(List<Test> testList)
    {
        testRepository.deleteAllInBatch();
        testRepository.saveAll(testList);
    }

    @Override
    public void deleteByName(String name)
    {
        System.err.println("The number of rows to be deleted = " + testRepository.findByName(name).size());
        testRepository.deleteInBatch(testRepository.findByName(name));
    }
}
dbSchema
create table test
(
    id int,
    name varchar(40)
);
ErrorLog
[ main] o.h.e.t.i.TransactionImpl : begin
[ main] o.h.h.i.a.QueryTranslatorImpl : parse() - HQL: delete from com.example.demo.entity.Test x where x = ?1 or x = ?2 or x = ?3 or x = ?4 or ... x = ?2500
[ main] o.h.h.i.a.ErrorTracker : throwQueryException() : no errors
[ main] o.h.e.t.i.TransactionImpl : rolling back
[ Thread-14] o.h.i.SessionFactoryImpl : HHH000031: Closing
[ Thread-14] o.h.t.s.TypeConfiguration$Scope : Un-scoping TypeConfiguration [org.hibernate.type.spi.TypeConfiguration$Scope#6cf001] from SessionFactory [org.hibernate.internal.SessionFactoryImpl#1ad3d8a]
[ Thread-14] o.h.s.i.AbstractServiceRegistryImpl : Implicitly destroying ServiceRegistry on de-registration of all child ServiceRegistries
[ Thread-14] o.h.b.r.i.BootstrapServiceRegistryImpl : Implicitly destroying Boot-strap registry on de-registration of all child ServiceRegistries
=======================================================================================================================================================================================================================================================================================================
[ main] o.h.h.i.QueryTranslatorFactoryInitiator : QueryTranslatorFactory : org.hibernate.hql.internal.ast.ASTQueryTranslatorFactory#5e167a
[ main] o.h.h.i.QueryTranslatorFactoryInitiator : HHH000397: Using ASTQueryTranslatorFactory
[ main] o.h.h.i.a.QueryTranslatorImpl : parse() - HQL: select generatedAlias0 from com.example.demo.entity.Test as generatedAlias0 where generatedAlias0.name=:param0
[ main] o.h.h.i.a.ErrorTracker : throwQueryException() : no errors
[ main] o.h.h.i.a.QueryTranslatorImpl : --- HQL AST ---
\-[QUERY] Node: 'query'
+-[SELECT_FROM] Node: 'SELECT_FROM'
| +-[FROM] Node: 'from'
| | \-[RANGE] Node: 'RANGE'
| | +-[DOT] Node: '.'
| | | +-[DOT] Node: '.'
| | | | +-[DOT] Node: '.'
| | | | | +-[DOT] Node: '.'
| | | | | | +-[IDENT] Node: 'com'
| | | | | | \-[IDENT] Node: 'example'
| | | | | \-[IDENT] Node: 'demo'
| | | | \-[IDENT] Node: 'entity'
| | | \-[IDENT] Node: 'Test'
| | \-[ALIAS] Node: 'generatedAlias0'
| \-[SELECT] Node: 'select'
| \-[IDENT] Node: 'generatedAlias0'
\-[WHERE] Node: 'where'
\-[EQ] Node: '='
+-[DOT] Node: '.'
| +-[IDENT] Node: 'generatedAlias0'
| \-[IDENT] Node: 'name'
\-[COLON] Node: ':'
\-[IDENT] Node: 'param0'
[ main] o.h.h.i.a.HqlSqlBaseWalker : select << begin [level=1, statement=select]
[ main] o.h.h.i.a.t.FromElement : FromClause{level=1} : com.example.demo.entity.Test (generatedAlias0) -> test0_
[ main] o.h.h.i.a.t.FromReferenceNode : Resolved : generatedAlias0 -> test0_.id
[ main] o.h.h.i.a.t.FromReferenceNode : Resolved : generatedAlias0 -> test0_.id
[ main] o.h.h.i.a.t.DotNode : getDataType() : name -> org.hibernate.type.StringType#d003cd
[ main] o.h.h.i.a.t.FromReferenceNode : Resolved : generatedAlias0.name -> test0_.name
[ main] o.h.h.i.a.HqlSqlBaseWalker : select : finishing up [level=1, statement=select]
[ main] o.h.h.i.a.HqlSqlWalker : processQuery() : ( SELECT ( {select clause} test0_.id ) ( FromClause{level=1} test test0_ ) ( where ( = ( test0_.name test0_.id name ) ? ) ) )
[ main] o.h.h.i.a.u.JoinProcessor : Using FROM fragment [test test0_]
[ main] o.h.h.i.a.HqlSqlBaseWalker : select >> end [level=1, statement=select]
[ main] o.h.h.i.a.QueryTranslatorImpl : --- SQL AST ---
\-[SELECT] QueryNode: 'SELECT' querySpaces (test)
+-[SELECT_CLAUSE] SelectClause: '{select clause}'
| +-[ALIAS_REF] IdentNode: 'test0_.id as id1_0_' {alias=generatedAlias0, className=com.example.demo.entity.Test, tableAlias=test0_}
| \-[SQL_TOKEN] SqlFragment: 'test0_.name as name2_0_'
+-[FROM] FromClause: 'from' FromClause{level=1, fromElementCounter=1, fromElements=1, fromElementByClassAlias=[generatedAlias0], fromElementByTableAlias=[test0_], fromElementsByPath=[], collectionJoinFromElementsByPath=[], impliedElements=[]}
| \-[FROM_FRAGMENT] FromElement: 'test test0_' FromElement{explicit,not a collection join,not a fetch join,fetch non-lazy properties,classAlias=generatedAlias0,role=null,tableName=test,tableAlias=test0_,origin=null,columns={,className=com.example.demo.entity.Test}}
\-[WHERE] SqlNode: 'where'
\-[EQ] BinaryLogicOperatorNode: '='
+-[DOT] DotNode: 'test0_.name' {propertyName=name,dereferenceType=PRIMITIVE,getPropertyPath=name,path=generatedAlias0.name,tableAlias=test0_,className=com.example.demo.entity.Test,classAlias=generatedAlias0}
| +-[ALIAS_REF] IdentNode: 'test0_.id' {alias=generatedAlias0, className=com.example.demo.entity.Test, tableAlias=test0_}
| \-[IDENT] IdentNode: 'name' {originalText=name}
\-[NAMED_PARAM] ParameterNode: '?' {name=param0, expectedType=org.hibernate.type.StringType#d003cd}
[ main] o.h.h.i.a.ErrorTracker : throwQueryException() : no errors
[ main] o.h.h.i.a.QueryTranslatorImpl : HQL: select generatedAlias0 from com.example.demo.entity.Test as generatedAlias0 where generatedAlias0.name=:param0
[ main] o.h.h.i.a.QueryTranslatorImpl : SQL: select test0_.id as id1_0_, test0_.name as name2_0_ from test test0_ where test0_.name=?
[ main] o.h.h.i.a.ErrorTracker : throwQueryException() : no errors
[ main] o.h.h.i.a.QueryTranslatorImpl : parse() - HQL: delete from com.example.demo.entity.Test x where x = ?1 or x = ?2 ... or x = ?2500
[ main] o.h.h.i.a.ErrorTracker : throwQueryException() : no errors
The sample code is uploaded on GitHub: https://github.com/Anand450623/Stackoverflow
You could try using a JPQL query to make deleteAll() delete in batch rather than one by one.
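For example, a bulk JPQL delete declared on the repository might look roughly like this (the method name and annotation placement are my sketch, not code from the question):
@Modifying
@Transactional
@Query("delete from Test t where t.name = :name")
void deleteAllByNameInBulk(@Param("name") String name);
A bulk JPQL delete like this issues a single DELETE statement instead of enumerating every entity id in the WHERE clause.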
However, you might actually want to drop the ORM framework entirely. A common experience is that even though it looks like a good idea in the beginning, it often ends up with issues like the one you describe here. You could read https://www.toptal.com/java/how-hibernate-ruined-my-career. The gist of it is: it's hard to debug, you can't avoid writing native SQL in most cases anyway, JPQL limits your expressiveness, and it's extremely invasive in how you model (e.g. you can't do immutability in a lot of cases).
Spring has an excellent JdbcTemplate, but keep in mind that it also has drawbacks, mainly that you have to write the mapping yourself; that said, it's not much code, and the benefits are huge. So if a JPQL query doesn't work, consider whether using JPA (Hibernate?) is the right choice to begin with.
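If you do go the JdbcTemplate route, a minimal sketch for this particular delete could look like the following (table and column names taken from the schema in the question; the service class and bean wiring are assumptions):
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;

@Service
public class TestJdbcService {

    private final JdbcTemplate jdbcTemplate;

    public TestJdbcService(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Issues a single DELETE statement; returns the number of rows removed
    public int deleteByName(String name) {
        return jdbcTemplate.update("delete from test where name = ?", name);
    }
}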
The way I got around this was to implement my own deleteInBatch method with native SQL, something like:
@Modifying
@Query(value="delete from table_x where id in (:ids)", nativeQuery=true)
void deleteBatch(List<String> ids);
The problem is this piece of code: https://github.com/spring-projects/spring-data-jpa/blame/a31c39db7a12113b5adcb6fbaa2a92d97f1b3a02/src/main/java/org/springframework/data/jpa/repository/query/QueryUtils.java#L409, which generates awful SQL that is not suited to a large number of elements.
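If the id list can still grow very large, one possible workaround (my own sketch, not part of the answer above) is to split it into chunks before calling the native delete, so each IN clause stays small:
// Assumes 'ids' holds the full list and 'testRepository' exposes the deleteBatch method shown above;
// the chunk size of 1000 is an arbitrary choice
int chunkSize = 1000;
for (int i = 0; i < ids.size(); i += chunkSize) {
    List<String> chunk = ids.subList(i, Math.min(i + chunkSize, ids.size()));
    testRepository.deleteBatch(chunk);
}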

Import a class from a PowerShell 5 module

I've created some classes, which I would like to split into several files. Unfortunately, the inheritance does not work. My only working solution is:
Using Module "<Path>"
That is not flexible and makes adjustments a nightmare.
My test folder structure looks like this:
Project
|
+-- classes
| |
| +-- domain
| | |
| | +---Domain.psm1
| |
| +-- dns
| |
| +--DNS.psm1
|
|
+-- functions
|
+-- test
|
+-- test.ps1
My current approach including classes is mentioned here:
https://info.sapien.com/index.php/scripting/scripting-classes/import-powershell-classes-from-modules
Domain.psm1:
class Domain
{
    [string] fullname () {
        return "test.local";
    }

    static [Domain] $instance
    static [Domain] GetInstance() {
        if ($null -eq [Domain]::instance) {
            [Domain]::instance = [Domain]::new()
        }
        return [Domain]::instance
    }
}
DNS.psm1:
# Working:
# using module "C:\Users\test\Desktop\Project\classes\domain\Domain.psm1"

# Not working:
$scriptDir = Join-Path -Path $PSScriptRoot -ChildPath ..\.. -Resolve
$classFile = -join($scriptDir, "\classes\domain\Domain.psm1")
$scriptBody = "using module $classFile"
$script = [ScriptBlock]::Create($scriptBody)
. $script

class DNS : Domain
{
    [int] testRecord() {
        return 9000;
    }

    # Init Singleton
    static [DNS] $instance
    static [DNS] GetInstance() {
        if ($null -eq [DNS]::instance)
        {
            [DNS]::instance = [DNS]::new()
        }
        return [DNS]::instance
    }
}
test.ps1:
$scriptDir = Join-Path -Path $PSScriptRoot -ChildPath ..\.. -Resolve
$classFile = -join($scriptDir, "\classes\dns\DNS.psm1")
# Working:
# Include DNS.psm1
$scriptBody = "using module $classFile"
$script = [ScriptBlock]::Create($scriptBody)
. $script
$task = [DNS]::GetInstance()
$task.testRecord()
$task.fullname()
In my test.ps1 I create an instance of the DNS class which extends the Domain class. Extending Domain fails with:
+ . $script
+ ~~~~~~~~~
+ CategoryInfo : InvalidOperation: (C:\Users\test...\dns\DNS.psm1:String) [], RuntimeException
+ FullyQualifiedErrorId : TypeNotFound
Is there an easy way to get this working?

How to code the URL for an Oracle database in the Grails dataSource config

I'm trying to connect from Grails to Oracle 10g XE.
All over the net I only find a URL for localhost:
url = "jdbc:oracle:thin:@localhost:1521:XE"
But I need to connect to a host on my local network.
I can ping it successfully with ping 192.168.2.128,
but
url = "jdbc:oracle:thin:@192.168.2.128:1521:XE"
ends up with:
Error 2014-12-22 14:26:23,612 [localhost-startStop-1] ERROR pool.ConnectionPool - Unable to create initial connections of pool.
Message: E/A-Exception: The Network Adapter could not establish the connection
Line | Method
->> 112 | throwSqlException in oracle.jdbc.driver.DatabaseError
| 146 | throwSqlException in ''
| 255 | throwSqlException in ''
| 387 | logon in oracle.jdbc.driver.T4CConnection
| 414 | <init> . . . . . in oracle.jdbc.driver.PhysicalConnection
| 165 | <init> in oracle.jdbc.driver.T4CConnection
| 35 | getConnection . . in oracle.jdbc.driver.T4CDriverExtension
| 801 | connect in oracle.jdbc.driver.OracleDriver
| 262 | run . . . . . . . in java.util.concurrent.FutureTask
| 1145 | runWorker in java.util.concurrent.ThreadPoolExecutor
| 615 | run . . . . . . . in java.util.concurrent.ThreadPoolExecutor$Worker
^ 745 | run in java.lang.Thread
peter
Dortmund Germany
PS:
my complete dataSource config:
dataSource {
    pooled = true
    driverClassName = "oracle.jdbc.OracleDriver"
    dialect = "org.hibernate.dialect.Oracle10gDialect"
    username = "peter"
    password = "wuffwuff"
}
environments {
    development {
        dataSource {
            dbCreate = "update" // one of 'create', 'create-drop', 'update'
            url = "jdbc:oracle:thin:@192.168.2.128:1521:XE"
        }
    }
    test {
        dataSource {
            dbCreate = "update"
            url = "jdbc:oracle:thin:@192.168.2.128:1521:XE"
        }
    }
    production {
        dataSource {
            dbCreate = "update"
            url = "jdbc:oracle:thin:@192.168.2.128:1521:XE"
            username = "peter"
            password = "wuffwuff"
        }
    }
}

Grails plugin returning invalid stream header error

I have a Grails app that uses a plugin I have created. Both use the spring-security-core plugin.
My plugin has a domain class, a controller, and views. The data to display is selected from a line drawn on a map and is shown in a list at the bottom of the browser window. I have created a link on one of the fields that is supposed to show the selected record.
When I click this link, however, I get the following error:
| Error 2014-05-27 16:14:11,415 [http-bio-8082-exec-6] ERROR errors.GrailsExceptionResolver - StreamCorruptedException occurred when processing request: [GET] /test/bridge/show/661
invalid stream header: 8401FE00. Stacktrace follows:
Message: invalid stream header: 8401FE00
Line | Method
->> 802 | readStreamHeader in java.io.ObjectInputStream
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 299 | <init> in ''
| 55 | show . . . . . . in RIMS.bridge.BridgeController
| 195 | doFilter in grails.plugin.cache.web.filter.PageFragmentCachingFilter
| 63 | doFilter . . . . in grails.plugin.cache.web.filter.AbstractFilter
| 53 | doFilter in grails.plugin.springsecurity.web.filter.GrailsAnonymousAuthenticationFilter
| 49 | doFilter . . . . in grails.plugin.springsecurity.web.authentication.RequestHolderAuthenticationFilter
| 82 | doFilter in grails.plugin.springsecurity.web.authentication.logout.MutableLogoutFilter
| 1145 | runWorker . . . in java.util.concurrent.ThreadPoolExecutor
| 615 | run in java.util.concurrent.ThreadPoolExecutor$Worker
^ 724 | run . . . . . . in java.lang.Thread
The code that fails is where I attempt to get the data with the ID in the following action:
@Secured(["hasRole('ROLE_MAP_USER')"])
def show(Long id) {
    println "ID: ${id}"
    def princObj = springSecurityService?.principal
    log.debug "User: ${princObj}"
    def bridgeInstance = RIMS.bridge.Bridge.get(id)
    if (!bridgeInstance) {
        flash.message = message(code: 'default.not.found.message', args: [message(code: 'bridge.label', default: 'Bridge'), id])
        redirect(action: "list")
        return
    }
    render(plugin: "nhvr", view: "show", model: [bridgeInstance: bridgeInstance])
}
The user returned at log.debug "User: ${princObj}" is correct and has the required role.
I am not sure how to proceed. Can anyone suggest what to try?
Here is the code for the Bridge class:
package RIMS.bridge

class Bridge {

    String bridgeNo
    String roadNo
    BigDecimal linkNo
    BigDecimal chng
    String travelDirectionCode
    BigDecimal northing
    BigDecimal easting
    String overUnderCode
    Date dateActive
    Date dateArchived
    String tranUser
    Date tranDateTime
    String tranType
    Serializable obj
    String cwayCode
    String constructed

    static hasMany = [
        bridgeClearances: BridgeClearance,
        bridgeCulverts: BridgeCulvert,
        bridgeDimensions: BridgeDimension,
        bridgeDrawings: BridgeDrawing,
        bridgeDucts: BridgeDuct,
        bridgeHydraulics: BridgeHydraulic,
        bridgeMaintenances: BridgeMaintenance,
        bridgeMiscellaneouses: BridgeMiscellaneous,
        bridgeParts: BridgePart,
        bridgeServices: BridgeService,
        bridgeSpans: BridgeSpan,
        bridgeSubstructures: BridgeSubstructure,
        bridgeSuperstructures: BridgeSuperstructure]

    static mapping = {
        id generator: "assigned"
        version false
    }

    static constraints = {
        bridgeNo maxSize: 10
        roadNo nullable: true, maxSize: 5
        linkNo nullable: true
        chng nullable: true
        travelDirectionCode maxSize: 10
        northing nullable: true
        easting nullable: true
        overUnderCode maxSize: 10
        dateArchived nullable: true
        tranUser maxSize: 30
        tranType maxSize: 6
        obj nullable: true
        cwayCode nullable: true, maxSize: 4
        constructed nullable: true, maxSize: 3
    }

    def locateByBridgeNo(String bridgeNo) {
        println "locating by no: ${bridgeNo}"
        withCriteria {
            eq('bridgeNo', bridgeNo)
        }
    }

    static listActiveBridges() {
        withCriteria() {
            isNull('dateArchived')
        }
    }

    static listBridgesByBridgeNo(String bridgeNo) {
        withCriteria() {
            eq('bridgeNo', bridgeNo)
        }
    }

    String toString() {
        return "${bridgeNo} Road: ${roadNo}, Link: ${linkNo}, chainage: ${chng}"
    }
}
