Write log messages to the log file in OpenScript

I need to write log messages to a log file instead of the console in my script.
By default, OpenScript provides a few methods such as info() and warn().
Where are they configured to write messages to the console?
When I write info(message), it writes the message to the console.
Where is log4j.properties configured? Do I need to override it to write to a log file?

All log messages are stored in your OATS location, by default in C:\OracleATS\logs. The files are named "process_console_[timestamp].[hash].log".
However, if you want to create your own log file, I suggest creating a dedicated method that wraps the built-in ones. In my code I did something like this:
// Imports needed at the top of the script: java.nio.file.Files, java.nio.file.Paths,
// java.nio.file.StandardOpenOption, java.text.DateFormat, java.text.SimpleDateFormat, java.util.Date
private String filePath = "c:/warnings.log";

public void saveLog(String message) throws Exception {
    DateFormat df = new SimpleDateFormat("[yyyy/MM/dd HH:mm:ss]");
    Date sysdate = new Date();
    String modifiedText = df.format(sysdate) + " " + message + "\n";
    // CREATE makes the file on the first call; APPEND adds to it afterwards
    Files.write(Paths.get(filePath), modifiedText.getBytes(),
            StandardOpenOption.CREATE, StandardOpenOption.APPEND);
}

public void warning(String message) throws Exception {
    saveLog(message);   // write to the custom log file
    warn(message);      // still report via the built-in OpenScript method
}

public void run() throws Exception {
    warning("Test");
}

Related

I want to connect a Spring application to an external Bukkit server through a REST API

I want to control the Bukkit server through the Spring web application: for example, send a command to the console, receive its response, and so on.
I'm trying to figure out a way, but I can't find a good one. How should I do it?
Even if it takes third-party plugins or a database, I want to find a way to do basic Bukkit control.
First, you need to decide how to send requests to the server. It seems to me that in your case the easiest option is to run the built-in Java web server (HttpServer) to receive commands, and then process them.
If you need synchronous actions, you can always use callSyncMethod.
To receive command output, simply create your own implementation of CommandSender with overridden sendMessage methods.
For example, here is how a command execution endpoint might look:
// Requires, among others: com.sun.net.httpserver.HttpServer, java.net.InetSocketAddress,
// java.nio.charset.StandardCharsets, java.util.concurrent.ExecutionException,
// org.bukkit.Bukkit, org.bukkit.command.CommandSender, org.bukkit.plugin.java.JavaPlugin
JavaPlugin plugin = /** get plugin **/;
HttpServer server = HttpServer.create(new InetSocketAddress("localhost", 8001), 0);
server.createContext("/executeCommand", exchange -> {
    if (!exchange.getRequestMethod().equals("POST")) {
        byte[] error = "Method not supported".getBytes(StandardCharsets.UTF_8);
        exchange.sendResponseHeaders(405, error.length);
        exchange.getResponseBody().write(error);
        exchange.close();
        return;
    }
    // In this example the request body is the command to execute
    String body = new String(exchange.getRequestBody().readAllBytes(), StandardCharsets.UTF_8);
    StringBuilder builder = new StringBuilder();
    // You also need to override many other methods for this to compile; just leave them empty
    CommandSender sender = new CommandSender() {
        @Override
        public void sendMessage(@NotNull String message) {
            builder.append(message);
        }

        @Override
        public void sendMessage(@NotNull String... messages) {
            for (String message : messages) {
                builder.append(message).append("\n");
            }
        }

        @Override
        public boolean isOp() {
            return true;
        }

        @Override
        public boolean hasPermission(@NotNull String name) {
            return true;
        }

        @Override
        public @NotNull String getName() {
            return "WebServerExecutor";
        }
    };
    // Wait for the command to finish executing on the main server thread
    try {
        Bukkit.getScheduler().callSyncMethod(plugin, () -> Bukkit.dispatchCommand(sender, body)).get();
    } catch (InterruptedException | ExecutionException e) {
        throw new IOException(e);
    }
    byte[] response = builder.toString().getBytes(StandardCharsets.UTF_8);
    exchange.sendResponseHeaders(200, response.length);
    exchange.getResponseBody().write(response);
    exchange.close();
});
server.start();
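To exercise the endpoint from the Spring side (or any JVM client), here is a minimal sketch using java.net.http; it assumes the plugin's server above is running on localhost:8001 and that "list" is a command you actually want to dispatch:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CommandClient {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // POST the command text as the request body, matching the handler above
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8001/executeCommand"))
                .POST(HttpRequest.BodyPublishers.ofString("list"))
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        // Prints whatever the custom CommandSender collected from the command's output
        System.out.println(response.body());
    }
}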

How to enable Spring Boot to display a list of files under a directory

I have a folder structure /data/reports on a file system, which contains all the reports.
How can I configure a Spring Boot application to serve the contents of this file system?
Currently I have tried a few options, but none of them work:
@Configuration
@EnableWebMvc
public class AppConfig implements WebMvcConfigurer {

    @Value(value = "${spring.resources.static-locations:#{null}}")
    private String fileSystem;

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry
            .addResourceHandler("/data/reports/**")
            .addResourceLocations(fileSystem)
            .setCachePeriod(3600)
            .resourceChain(true)
            .addResolver(new PathResourceResolver());
    }
}
and in application.properties I have defined
spring.resources.static-locations=file:///data/reports
server.servlet.jsp.init-parameters.listings=true
But in both cases, when I try
http://host:port/application/data/reports
I'm getting a 404.
What am I missing?
Based on the suggestions given, I realized that one mistake I was making was accessing the reports via
http://host:port/application/data/reports
instead of
http://host:port/data/reports
If I include the application context in the request path, the call goes through the RequestDispatcher and tries to find a matching RequestMapping, which does not exist. I'm convinced of that much so far.
But the problem I'm seeing now is that I'm getting a SocketTimeoutException while trying to read the resource listed in the URL. I put some breakpoints in the Spring source "ResourceHttpMessageConverter.java":
protected void writeContent(Resource resource, HttpOutputMessage outputMessage)
        throws IOException, HttpMessageNotWritableException {
    try {
        InputStream in = resource.getInputStream(); // It is timing out here
        try {
            StreamUtils.copy(in, outputMessage.getBody());
        }
        catch (NullPointerException ex) {
            // ignore, see SPR-13620
        }
The resource is a small text file with 1 line "Hello World". Yet it is timing out.
The resource in the above class is a FileUrlResource opened on file:///c:/data/reports/sample.txt
On the other hand, I tried to read that resource as
File file = new File("c:/data/reports/sample.txt");
System.out.println(file.exists());
URL url = file.toURI().toURL();
URLConnection con = url.openConnection();
InputStream is = con.getInputStream(); //This works
Thanks
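As an aside, if the static-resource route keeps fighting back, a directory listing can also be produced explicitly with a small controller. The sketch below is only illustrative: the /data/reports path is taken from the question, while the controller name and URL mapping are my own assumptions.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller that returns the names of the files under /data/reports
@RestController
public class ReportListingController {

    @GetMapping("/data/reports")
    public List<String> listReports() throws IOException {
        try (Stream<Path> paths = Files.list(Paths.get("/data/reports"))) {
            return paths.map(p -> p.getFileName().toString())
                        .sorted()
                        .collect(Collectors.toList());
        }
    }
}
With Jackson on the classpath (the Spring Boot web default), the returned list is rendered as JSON, which is usually enough for a quick report index.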

Camel log doesn't work

Camel route:
from("seda:rest_upload")
.log(LoggingLevel.WARN, "Got new file from sftp with name ${header.filename}")
.to("file://rest_files?fileName=${header.filename}");
I see new files in rest_files, but I don't see the log message about them.
What am I doing wrong?
P.S.
By the way, I also tried:
from("seda:rest_upload")
.process(new Processor() {
#Override
public void process(Exchange exchange) throws Exception {
logger.info("Got new file from sftp with name {}", exchange.getIn().getHeader("filename"));
}
})
.to("file://rest_files?fileName=${header.filename}");
The behaviour is really strange: the file is saved successfully, but the process method is never invoked.

set a conf value in mapper - get it in run method

In the run method of the Driver class, I want to fetch a String value (set in the mapper function) and write it to a file. I used the following code, but null was returned. Please help.
Mapper
public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
    context.getConfiguration().set("feedName", feedName);
}
Driver Class
@Override
public int run(String[] args) throws Exception {
    String lineVal = conf.get("feedName");
}
Configuration only flows one way: from the driver to the tasks.
If you want to pass non-counter types of values back to the driver, you can utilize HDFS for that.
Either write to your main output context (key and values) that you emit from your job.
Or alternatively use MultipleOutputs, if you do not want to mess with your standard job output.
For example, you can write any kind of properties as Text keys and Text values from your mappers or reducers.
Once control is back to your driver, simply read from HDFS. For example you can store your name/values to the Configuration object to be used by the next job in your sequence:
public void load(Configuration targetConf, Path src, FileSystem fs) throws IOException {
    InputStream is = fs.open(src);
    try {
        Properties props = new Properties();
        props.load(new InputStreamReader(is, "UTF8"));
        for (Map.Entry<Object, Object> prop : props.entrySet()) {
            String name = (String) prop.getKey();
            String value = (String) prop.getValue();
            targetConf.set(name, value);
        }
    } finally {
        is.close();
    }
}
Note that if you have multiple mappers or reducers where you write to MultipleOutputs, you will end up with multiple {name}-m-##### or {name}-r-##### files.
In that case, you will need to either read from every output file or run a single reducer job to combine your outputs into one and then just read from one file as shown above.
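To make the MultipleOutputs side of this concrete, here is a rough mapper sketch; the named output "props", the feed name value, and the class names are illustrative assumptions, not part of the original job.
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

// Hypothetical mapper that emits a name/value pair as a side output via MultipleOutputs
public class FeedMapper extends Mapper<LongWritable, Text, Text, Text> {

    private MultipleOutputs<Text, Text> mos;

    @Override
    protected void setup(Context context) {
        mos = new MultipleOutputs<>(context);
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // "props" must be registered in the driver with
        // MultipleOutputs.addNamedOutput(job, "props", TextOutputFormat.class, Text.class, Text.class)
        mos.write("props", new Text("feedName"), new Text("myFeed"));
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        mos.close();
    }
}
The driver can then point the load() method above at the resulting props-m-##### file to pull those pairs back into its Configuration.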
Using the Configuration you can only do the reverse: you can set values in the Driver class
public int run(String[] args) throws Exception {
    conf.set("feedName", value);
}
and then get them in the Mapper class:
public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
    Configuration conf = context.getConfiguration();
    String lineVal = conf.get("feedName");
}
UPDATE
One option for your question is to write the data to a file, store it in HDFS, and then access it in the Driver class. These files can be treated as "intermediate files".
Just try it and see.

How do I redirect log4j output to my HttpServletResponse output stream?

I'm using log4j 1.2.15 in a Spring 3.1.1.RELEASE application deployed on JBoss AS 7.1.1.Final. I'm trying to route output written via log4j to my response output stream. I have output written like this:
private static final Logger LOG = Logger.getLogger(TrainingSessionServiceImpl.class);
…
LOG.info("Creating/updating training session associated with order #:" + order.getId());
and I'm trying to route it to my output stream like so …
@RequestMapping(value = "/refreshPd", method = RequestMethod.GET)
public void refreshPD(final HttpServletResponse response) throws IOException
{
    ...
    final WriterAppender appender = new WriterAppender(new PatternLayout("%d{ISO8601} %p - %m%n"), response.getWriter());
    appender.setName("CONSOLE_APPENDER");
    appender.setThreshold(org.apache.log4j.Level.DEBUG);
    Logger.getRootLogger().addAppender(appender);
    worker.work();
    Logger.getRootLogger().removeAppender("CONSOLE_APPENDER");
}
but sadly, nothing is getting output to my browser, even though I know (through debugging) that the logging statements are getting called. Does anyone know how I can adjust my setup to make it work? Below is my log4j.properties file, deployed to my WAR's WEB-INF/classes directory.
log4j.rootLogger=DEBUG, CA, FA
#Console Appender
log4j.appender.CA=org.apache.log4j.ConsoleAppender
log4j.appender.CA.layout=org.apache.log4j.PatternLayout
log4j.appender.CA.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n
#File Appender
log4j.appender.FA=org.apache.log4j.FileAppender
log4j.appender.FA.File=/usr/java/jboss/server/default/log/log4j.log
log4j.appender.FA.layout=org.apache.log4j.PatternLayout
log4j.appender.FA.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n
# Set the logger level of File Appender to WARN
log4j.appender.FA.Threshold = DEBUG
Thanks, - Dave
This was an interesting problem. The key thing is to write your own appender. I looked up the built-in org.apache.log4j.ConsoleAppender code for inspiration. I have tested this in my Tomcat and verified that it works. I used log4j-1.2.17 (hopefully that shouldn't matter).
1) First, implement your own appender. This appender will write all log events to the current thread's output stream.
package com.tstwbprj.log;
import org.apache.log4j.Layout;
import org.apache.log4j.WriterAppender;
import java.io.IOException;
import java.io.OutputStream;
public class HttpLogAppender extends WriterAppender {

    static ThreadLocal<OutputStream> streamPerHttpThread = new ThreadLocal<OutputStream>();

    public HttpLogAppender() {
    }

    public HttpLogAppender(Layout layout) {
        setLayout(layout); // super-class method
        activateOptions();
    }

    public void setCurrentHttpStream(OutputStream stream) {
        streamPerHttpThread.set(stream);
    }

    public void activateOptions() {
        setWriter(createWriter(new CurrentHttpThreadOutStream()));
    }

    /**
     * An implementation of OutputStream that redirects to the
     * current http thread's servlet output stream
     */
    private static class CurrentHttpThreadOutStream extends OutputStream {

        public CurrentHttpThreadOutStream() {
        }

        public void close() {
        }

        public void flush() throws IOException {
            OutputStream stream = streamPerHttpThread.get();
            if (stream != null) {
                stream.flush();
            }
        }

        public void write(final byte[] b) throws IOException {
            OutputStream stream = streamPerHttpThread.get();
            if (stream != null) {
                stream.write(b);
            }
        }

        public void write(final byte[] b, final int off, final int len)
                throws IOException {
            OutputStream stream = streamPerHttpThread.get();
            if (stream != null) {
                stream.write(b, off, len);
            }
        }

        public void write(final int b) throws IOException {
            OutputStream stream = streamPerHttpThread.get();
            if (stream != null) {
                stream.write(b);
            }
        }
    }
}
2) Add this appender in your log4j configuration file just like the other settings
log4j.rootLogger=DEBUG, CA, FA , HA
..
log4j.appender.HA=com.tstwbprj.log.HttpLogAppender
log4j.appender.HA.layout=org.apache.log4j.PatternLayout
log4j.appender.HA.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n
3) Add a small piece of code in your servlet so that this appender works correctly. Here's my servlet.
import org.apache.log4j.Category;
import org.apache.log4j.Logger;
import javax.servlet.ServletOutputStream;
import java.io.IOException;
public class LogServlet extends javax.servlet.http.HttpServlet {

    private static final Logger LOG = Logger.getLogger(LogServlet.class);

    protected void doPost(javax.servlet.http.HttpServletRequest request, javax.servlet.http.HttpServletResponse response) throws javax.servlet.ServletException, IOException {
    }

    protected void doGet(javax.servlet.http.HttpServletRequest request, javax.servlet.http.HttpServletResponse response) throws javax.servlet.ServletException, IOException {
        ServletOutputStream outstream = response.getOutputStream();
        configureLogForCurrentRequest(outstream);
        LOG.info("Got request"); // this is now sent to the servlet output stream!
        LOG.info("Hello!!");
        LOG.info("Done!!");
    }

    private void configureLogForCurrentRequest(ServletOutputStream outstream) {
        // Walk up the logger hierarchy until the "HA" appender is found
        HttpLogAppender appender = (HttpLogAppender) LOG.getAppender("HA");
        Category current = LOG;
        while (appender == null) {
            current = current.getParent();
            if (current == null) {
                // Reached past the root logger without finding the appender - something is
                // wrong with the log4j configuration setup
                break;
            }
            appender = (HttpLogAppender) current.getAppender("HA");
        }
        appender.setCurrentHttpStream(outstream);
    }
}
Caution: this is not thoroughly tested, especially with multiple concurrent servlet requests. Also, I'm not sure why you want to do this; it's not typical to pipe log messages to the browser. Proceed with caution. :)
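One detail worth adding in real use, since the appender parks the stream in a ThreadLocal: clear it when the request finishes, so a pooled worker thread does not keep a reference to a recycled response stream. A small, hedged addition to HttpLogAppender (the method name is mine, not part of the code above):
// Forget the stream associated with the current thread
public void clearCurrentHttpStream() {
    streamPerHttpThread.remove();
}
In the servlet's doGet, wrapping the logging calls in try/finally and calling appender.clearCurrentHttpStream() in the finally block keeps things tidy.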
Try with something like this:
Logger logger = Logger.getRootLogger();
String appenderName = "myAppender";
Appender servletAppender = logger.getAppender(appenderName);
OutputStream out = response.getOutputStream();
if (servletAppender == null) {
    WriterAppender writerAppender = new WriterAppender(new PatternLayout("%d{ISO8601} %p - %m%n"), out);
    writerAppender.setName(appenderName);
    writerAppender.setThreshold(org.apache.log4j.Level.DEBUG);
    logger.addAppender(writerAppender);
}
try {
    // Your work
    worker.work();
} finally {
    logger.removeAppender(appenderName);
    out.flush();
}
I suggest taking an alternative approach and fetching the log file contents in a separate browser tab.
This would not require modifying the main code and would not destroy the original page's formatting.
Some web-based log file viewer links:
http://logio.org/
http://www.log-viewer.com/net-java-log4j-log-viewer/
https://github.com/aroneous/Log4j-Log-Viewer
http://log2web.sourceforge.net/
Not a precise answer as such, but a better way that I have seen this handled is to write your own Appender that collects logs in a ThreadLocal. When your servlet request completes, you can drain the contents of the ThreadLocal and write them to the response stream however you wish.
This satisfies the (unstated) requirement of thread safety, and can fairly cleanly isolate the log4j (or other logging framework) implementation code (which should be small using this technique) from the manipulation of the ThreadLocal, which could in theory be reused in other areas of your code.
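A minimal sketch of that idea against log4j 1.x, as used in the question; the class name and the drain() helper are illustrative, not an existing API:
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.spi.LoggingEvent;

// Collects formatted log lines per thread; the servlet drains them at the end of the request
public class ThreadLocalCollectingAppender extends AppenderSkeleton {

    private static final ThreadLocal<StringBuilder> BUFFER = new ThreadLocal<StringBuilder>() {
        @Override
        protected StringBuilder initialValue() {
            return new StringBuilder();
        }
    };

    @Override
    protected void append(LoggingEvent event) {
        BUFFER.get().append(getLayout().format(event));
    }

    /** Returns everything logged on the current thread and clears the buffer. */
    public static String drain() {
        String contents = BUFFER.get().toString();
        BUFFER.remove();
        return contents;
    }

    @Override
    public void close() {
    }

    @Override
    public boolean requiresLayout() {
        return true;
    }
}
After worker.work() returns, the servlet or controller can write ThreadLocalCollectingAppender.drain() to the response, so nothing touches the response stream while logging is still in progress.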
This type of technique is used by many server-side scripting languages such as ColdFusion and others.
I won't go into the potential bugs you could cause with inappropriate use of ThreadLocal in an app server; there are techniques to manage this, along with relevant answers on SO and other sites.
Hope this answer might redirect your thinking in a slightly different direction!
