How to convert .yml to .properties with a Gradle task? - spring

The only i18n formats Spring supports are .properties and .xml, which is not really optimal.
What I'd like is to have complex YAML files (messages.yml and messages_xx.yml) that get converted to .properties by a Gradle task, so I can queue it before the Build task.
For example, a messages.yml would look like:
group1:
  group2:
    group3:
      message1: hello
      message2: how are you?
    group4:
      message3: good
  group5:
    group6:
      message4: let's party
And the output .properties would be:
group1.group2.group3.message1: hello
group1.group2.group3.message2: how are you?
group1.group2.group4.message3: good
group1.group5.group6.message4: let's party
Is there a way to achieve this?
I didn't find any existing converters.

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.TreeMap;
import org.yaml.snakeyaml.Yaml;

public class YamlBackToProperties {

    public static void main(String[] args) throws IOException {
        Yaml yaml = new Yaml();
        try (InputStream in = Files.newInputStream(Paths.get("test.yml"))) {
            TreeMap<String, Map<String, Object>> config = yaml.loadAs(in, TreeMap.class);
            System.out.println(String.format("%s%n\nConverts to Properties:%n%n%s", config.toString(), toProperties(config)));
        }
    }

    private static String toProperties(TreeMap<String, Map<String, Object>> config) {
        StringBuilder sb = new StringBuilder();
        for (String key : config.keySet()) {
            sb.append(toString(key, config.get(key)));
        }
        return sb.toString();
    }

    private static String toString(String key, Map<String, Object> map) {
        StringBuilder sb = new StringBuilder();
        for (String mapKey : map.keySet()) {
            if (map.get(mapKey) instanceof Map) {
                sb.append(toString(String.format("%s.%s", key, mapKey), (Map<String, Object>) map.get(mapKey)));
            } else {
                sb.append(String.format("%s.%s=%s%n", key, mapKey, map.get(mapKey).toString()));
            }
        }
        return sb.toString();
    }
}

Made some changes to the first answer. Now it works for me in all cases:
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.TreeMap;
import org.yaml.snakeyaml.Yaml;

public class YamlConverter {

    public static void main(String[] args) throws IOException {
        Yaml yaml = new Yaml();
        try (InputStream in = Files.newInputStream(Paths.get("yourpath/application.yml"))) {
            TreeMap<String, Map<String, Object>> config = yaml.loadAs(in, TreeMap.class);
            System.out.println(String.format("%s%n\nConverts to Properties:%n%n%s", config.toString(), toProperties(config)));
        }
    }

    private static String toProperties(TreeMap<String, Map<String, Object>> config) {
        StringBuilder sb = new StringBuilder();
        for (String key : config.keySet()) {
            sb.append(toString(key, config.get(key)));
        }
        return sb.toString();
    }

    private static String toString(String key, Object mapr) {
        StringBuilder sb = new StringBuilder();
        if (!(mapr instanceof Map)) {
            sb.append(key + "=" + mapr + "\n");
            return sb.toString();
        }
        Map<String, Object> map = (Map<String, Object>) mapr;
        for (String mapKey : map.keySet()) {
            if (map.get(mapKey) instanceof Map) {
                sb.append(toString(key + "." + mapKey, map.get(mapKey)));
            } else {
                sb.append(String.format("%s.%s=%s%n", key, mapKey, map.get(mapKey).toString()));
            }
        }
        return sb.toString();
    }
}
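
For completeness, here is a minimal sketch (not from the original answers; the file names messages.yml and messages.properties are assumptions) that collects the flattened pairs into java.util.Properties, which takes care of escaping when writing the file. A class like this could then be run before the build, e.g. from a Gradle JavaExec task:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.Properties;
import org.yaml.snakeyaml.Yaml;

public class YamlToPropertiesFile {

    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(Paths.get("messages.yml"))) {
            @SuppressWarnings("unchecked")
            Map<String, Object> config = (Map<String, Object>) new Yaml().load(in);
            flatten("", config, props);
        }
        try (OutputStream out = Files.newOutputStream(Paths.get("messages.properties"))) {
            // store() escapes special characters for us
            props.store(out, "generated from messages.yml");
        }
    }

    @SuppressWarnings("unchecked")
    private static void flatten(String prefix, Map<String, Object> map, Properties props) {
        for (Map.Entry<String, Object> entry : map.entrySet()) {
            String key = prefix.isEmpty() ? entry.getKey() : prefix + "." + entry.getKey();
            if (entry.getValue() instanceof Map) {
                flatten(key, (Map<String, Object>) entry.getValue(), props);
            } else {
                props.setProperty(key, String.valueOf(entry.getValue()));
            }
        }
    }
}

Note that java.util.Properties writes entries in no particular order; if ordering matters, write the lines manually as the answers above do.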

Here's a straightforward implementation in Kotlin.
Once you have a Map with the parsed YAML, just call flatten():
fun flatten(map: Map<String, *>): MutableMap<String, Any> {
    val processed = mutableMapOf<String, Any>()
    map.forEach { key, value ->
        doFlatten(key, value as Any, processed)
    }
    return processed
}

fun doFlatten(parentKey: String, value: Any, processed: MutableMap<String, Any>) {
    if (value is Map<*, *>) {
        value.forEach {
            doFlatten("$parentKey.${it.key}", it.value as Any, processed)
        }
    } else {
        processed[parentKey] = value
    }
}

You can try it like this:
package com.example.yaml;

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import org.yaml.snakeyaml.Yaml;

public class YamlConfigRunner {
    public static void main(String[] args) throws IOException {
        if (args.length != 1) {
            System.out.println("Usage: <file.yml>");
            return;
        }
        Yaml yaml = new Yaml();
        try (InputStream in = Files.newInputStream(Paths.get(args[0]))) {
            // Configuration is a POJO mirroring the YAML structure; see the referenced article.
            Configuration config = yaml.loadAs(in, Configuration.class);
            System.out.println(config.toString());
        }
    }
}
reference: https://dzone.com/articles/using-yaml-java-application

Related

How to set the default JDBC URL value in H2 console using spring boot

I have enabled the H2 console in Spring Boot. However, when I open the console connection page, the default URL is the one saved in the H2 console history. How can I configure the project to populate the URL to be the same as spring.datasource.url on project start? Currently I set the URL in the console manually, but I would like to have it set up automatically by the project itself.
yaml:
spring:
  h2:
    console:
      enabled: true
      path: /admin/h2
  datasource:
    url: jdbc:h2:mem:foobar
Update:
I know that the last connection settings are saved to ~/.h2.server.properties, but what I need is to set the properties from the starting application, potentially switching between several of them.
There is no hook provided to fill in settings.
The good news is that we can change that with a bit of code.
Current state
The login screen is created in WebApp.index():
String[] settingNames = server.getSettingNames();
String setting = attributes.getProperty("setting");
if (setting == null && settingNames.length > 0) {
    setting = settingNames[0];
}
String combobox = getComboBox(settingNames, setting);
session.put("settingsList", combobox);
ConnectionInfo info = server.getSetting(setting);
if (info == null) {
    info = new ConnectionInfo();
}
session.put("setting", PageParser.escapeHtmlData(setting));
session.put("name", PageParser.escapeHtmlData(setting));
session.put("driver", PageParser.escapeHtmlData(info.driver));
session.put("url", PageParser.escapeHtmlData(info.url));
session.put("user", PageParser.escapeHtmlData(info.user));
return "index.jsp";
We want to tap into server.getSettingNames(), and more precisely into server.getSettings(), which is used underneath.
synchronized ArrayList<ConnectionInfo> getSettings() {
    ArrayList<ConnectionInfo> settings = new ArrayList<>();
    if (connInfoMap.size() == 0) {
        Properties prop = loadProperties();
        if (prop.size() == 0) {
            for (String gen : GENERIC) {
                ConnectionInfo info = new ConnectionInfo(gen);
                settings.add(info);
                updateSetting(info);
            }
        } else {
            for (int i = 0;; i++) {
                String data = prop.getProperty(Integer.toString(i));
                if (data == null) {
                    break;
                }
                ConnectionInfo info = new ConnectionInfo(data);
                settings.add(info);
                updateSetting(info);
            }
        }
    } else {
        settings.addAll(connInfoMap.values());
    }
    Collections.sort(settings);
    return settings;
}
The plan
disable ServletRegistrationBean<WebServlet> created by H2ConsoleAutoConfiguration
replace it with our config class with a subclass of WebServlet
our CustomH2WebServlet will override init and register CustomH2WebServer (subclass of WebServer)
in CustomH2WebServer we override getSettings() and we are done
The Code
@EnableConfigurationProperties({H2ConsoleProperties.class, DataSourceProperties.class})
@Configuration
public class H2Config {

    private final H2ConsoleProperties h2ConsoleProperties;
    private final DataSourceProperties dataSourceProperties;

    public H2Config(H2ConsoleProperties h2ConsoleProperties, DataSourceProperties dataSourceProperties) {
        this.h2ConsoleProperties = h2ConsoleProperties;
        this.dataSourceProperties = dataSourceProperties;
    }

    @Bean
    public ServletRegistrationBean<WebServlet> h2Console() {
        String path = this.h2ConsoleProperties.getPath();
        String urlMapping = path + (path.endsWith("/") ? "*" : "/*");
        ServletRegistrationBean<WebServlet> registration = new ServletRegistrationBean<>(
                new CustomH2WebServlet(this.dataSourceProperties.getUrl()), urlMapping);
        H2ConsoleProperties.Settings settings = this.h2ConsoleProperties.getSettings();
        if (settings.isTrace()) {
            registration.addInitParameter("trace", "");
        }
        if (settings.isWebAllowOthers()) {
            registration.addInitParameter("webAllowOthers", "");
        }
        return registration;
    }
}
package org.h2.server.web;

import javax.servlet.ServletConfig;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Enumeration;

public class CustomH2WebServlet extends WebServlet {

    private final String dbUrl;

    public CustomH2WebServlet(String dbUrl) {
        this.dbUrl = dbUrl;
    }

    @Override
    public void init() {
        ServletConfig config = getServletConfig();
        Enumeration<?> en = config.getInitParameterNames();
        ArrayList<String> list = new ArrayList<>();
        while (en.hasMoreElements()) {
            String name = en.nextElement().toString();
            String value = config.getInitParameter(name);
            if (!name.startsWith("-")) {
                name = "-" + name;
            }
            list.add(name);
            if (value.length() > 0) {
                list.add(value);
            }
        }
        String[] args = list.toArray(new String[0]);
        WebServer server = new CustomH2WebServer(dbUrl);
        server.setAllowChunked(false);
        server.init(args);
        setServerWithReflection(this, server);
    }

    private static void setServerWithReflection(final WebServlet classInstance, final WebServer newValue) {
        try {
            final Field field = WebServlet.class.getDeclaredField("server");
            field.setAccessible(true);
            field.set(classInstance, newValue);
        } catch (SecurityException | NoSuchFieldException | IllegalArgumentException | IllegalAccessException ex) {
            throw new RuntimeException(ex);
        }
    }
}
package org.h2.server.web;

import java.util.ArrayList;
import java.util.Collections;

class CustomH2WebServer extends WebServer {

    private final String connectionInfo;

    CustomH2WebServer(String dbUrl) {
        this.connectionInfo = "Test H2 (Embedded)|org.h2.Driver|" + dbUrl + "|sa";
    }

    synchronized ArrayList<ConnectionInfo> getSettings() {
        ArrayList<ConnectionInfo> settings = new ArrayList<>();
        ConnectionInfo info = new ConnectionInfo(connectionInfo);
        settings.add(info);
        updateSetting(info);
        Collections.sort(settings);
        return settings;
    }
}
And in application.properties:
spring.h2.console.enabled=false
spring.datasource.url=jdbc:h2:mem:foobar
Everything went smoothly, except for one private field that needed to be set via reflection.
The code provided works with H2 1.4.199
Inspired by @Lesiak's answer, using a simpler and easier configuration class:
import java.lang.reflect.Field;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import lombok.extern.apachecommons.CommonsLog;
import org.h2.server.web.ConnectionInfo;
import org.h2.server.web.WebServer;
import org.h2.server.web.WebServlet;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.web.servlet.ServletRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@CommonsLog
@Configuration
@ConditionalOnProperty(prefix = "spring.h2.console", name = "enabled", havingValue = "true", matchIfMissing = false)
public class H2ConsoleConfiguration {

    @Bean
    ServletRegistrationBean<WebServlet> h2ConsoleRegistrationBean(final ServletRegistrationBean<WebServlet> h2Console, final DataSourceProperties dataSourceProperties) {
        h2Console.setServlet(new WebServlet() {
            @Override
            public void init() {
                super.init();
                updateWebServlet(this, dataSourceProperties);
            }
        });
        return h2Console;
    }

    public static void updateWebServlet(final WebServlet webServlet, DataSourceProperties dataSourceProperties) {
        try {
            updateWebServer(getWebServer(webServlet), dataSourceProperties);
        } catch (NoSuchFieldException | IllegalArgumentException | IllegalAccessException | NoSuchMethodException | InvocationTargetException | NullPointerException ex) {
            log.error("Unable to set a custom ConnectionInfo for H2 console", ex);
        }
    }

    public static WebServer getWebServer(final WebServlet webServlet) throws NoSuchFieldException, IllegalArgumentException, IllegalAccessException {
        final Field field = WebServlet.class.getDeclaredField("server");
        field.setAccessible(true);
        return (WebServer) field.get(webServlet);
    }

    public static void updateWebServer(final WebServer webServer, final DataSourceProperties dataSourceProperties) throws NoSuchMethodException, IllegalAccessException, IllegalArgumentException, InvocationTargetException {
        final ConnectionInfo connectionInfo = new ConnectionInfo(String.format("Generic Spring Datasource|%s|%s|%s", dataSourceProperties.determineDriverClassName(), dataSourceProperties.determineUrl(), dataSourceProperties.determineUsername()));
        final Method method = WebServer.class.getDeclaredMethod("updateSetting", ConnectionInfo.class);
        method.setAccessible(true);
        method.invoke(webServer, connectionInfo);
    }
}
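With this variant the auto-configured console stays enabled (the @ConditionalOnProperty above requires it), so the matching application.properties would look something like this, reusing the values from the question:
spring.h2.console.enabled=true
spring.h2.console.path=/admin/h2
spring.datasource.url=jdbc:h2:mem:foobar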

NiFI "unable to find flowfile content"

I am using NiFi 1.6 and get the following errors when trying to modify a clone of an incoming flowFile:
[1] "unable to find content for FlowFile: ... MissingFlowFileException
...
Caused by ContentNotFoundException: Could not find content for StandardClaim
...
Caused by java.io.EOFException: null"
[2] "FlowFileHandlingException: StandardFlowFileRecord... is not known in this session"
The first error occurs when trying to access the contents of the flow file, the second when removing the flow file from the session (within a catch of the first). This process is known to have worked under NiFi 0.7.
The basic process is:
Clone the incoming flow file
Write to the clone
Write to the clone again (some additional formatting)
Repeat 1-3
The error occurs on the second iteration, at step 3.
An interesting point is that if, immediately after the clone is performed, a session.read of the clone is done, everything works fine. The read seems to reset some pointer.
I have created unit tests for this processor, but they do not fail in either case.
Below is code simplified from the actual version in use that demonstrates the issue. (The development system is not connected, so I had to copy the code; please forgive any typos, it should be close. This is also why a full stack trace is not provided.) The processor doing the work has a property that determines whether an immediate read is done or not, so both scenarios can be performed easily. To set it up, all that is needed is a GetFile processor to supply the input and terminators for the output from the SampleCloningProcessor. A sample input file is included as well. The meat of the code is in the onTrigger and manipulate methods. The manipulation in this simplified version really doesn't do anything but copy the input to the output.
Any insights into why this is happening and suggestions for corrections will be appreciated - thanks.
SampleCloningProcessor.java
package sample.processors.cloning;

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.Reader;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Scanner;
import java.util.Set;
import org.apache.commons.compress.utils.IOUtils;
import org.apache.nifi.annotation.documentation.CapabilityDescription;
import org.apache.nifi.annotation.documentation.Tags;
import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.ProcessorInitializationContext;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;
import org.apache.nifi.processor.io.InputStreamCallback;
import org.apache.nifi.processor.io.OutputStreamCallback;
import org.apache.nifi.processor.io.StreamCallback;
import org.apache.nifi.processor.util.StandardValidators;
import com.google.gson.Gson;

@Tags({"example", "clone"})
@CapabilityDescription("Demonstrates cloning of flowfile failure.")
public class SampleCloningProcessor extends AbstractProcessor {

    /* Determines if an immediate read is performed after cloning of the incoming flowfile. */
    public static final PropertyDescriptor IMMEDIATE_READ = new PropertyDescriptor.Builder()
            .name("immediateRead")
            .description("Determines if the processor runs successfully. If a read is done immediately "
                    + "after the clone of the incoming flowFile, then the processor should run successfully.")
            .required(true)
            .allowableValues("true", "false")
            .defaultValue("true")
            .addValidator(StandardValidators.BOOLEAN_VALIDATOR)
            .build();

    public static final Relationship SUCCESS = new Relationship.Builder().name("success")
            .description("No unexpected errors.").build();

    public static final Relationship FAILURE = new Relationship.Builder().name("failure")
            .description("Errors were thrown.").build();

    private Set<Relationship> relationships;
    private List<PropertyDescriptor> properties;

    @Override
    public void init(final ProcessorInitializationContext context) {
        relationships = new HashSet<>(Arrays.asList(SUCCESS, FAILURE));
        properties = Arrays.asList(IMMEDIATE_READ);
    }

    @Override
    public Set<Relationship> getRelationships() {
        return this.relationships;
    }

    @Override
    public List<PropertyDescriptor> getSupportedPropertyDescriptors() {
        return this.properties;
    }

    @Override
    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
        FlowFile incomingFlowFile = session.get();
        if (incomingFlowFile == null) {
            return;
        }
        try {
            final InfileReader inFileReader = new InfileReader();
            session.read(incomingFlowFile, inFileReader);
            Product product = inFileReader.getProduct();
            boolean transfer = false;
            getLogger().info("\tSession :\n" + session);
            getLogger().info("\toriginal :\n" + incomingFlowFile);
            for (int i = 0; i < 2; i++) {
                transfer = manipulate(context, session, incomingFlowFile, product);
            }
        } catch (Exception e) {
            getLogger().error(e.getMessage(), e);
            session.rollback(true);
        }
    }

    private boolean manipulate(final ProcessContext context, final ProcessSession session,
            final FlowFile incomingFlowFile, final Product product) {
        boolean transfer = false;
        FlowFile outgoingFlowFile = null;
        boolean immediateRead = context.getProperty(IMMEDIATE_READ).asBoolean();
        try {
            // Clone incoming flowFile
            outgoingFlowFile = session.clone(incomingFlowFile);
            getLogger().info("\tclone outgoing :\n" + outgoingFlowFile);
            if (immediateRead) {
                readFlowFile(session, outgoingFlowFile);
            }
            // First write into clone
            StageOneWriter stage1Writer = new StageOneWriter(product);
            outgoingFlowFile = session.write(outgoingFlowFile, stage1Writer);
            getLogger().info("\twrite outgoing :\n" + outgoingFlowFile);
            // Format the cloned file with another write
            outgoingFlowFile = formatFlowFile(session, outgoingFlowFile);
            getLogger().info("\tformat outgoing :\n" + outgoingFlowFile);
            session.transfer(outgoingFlowFile, SUCCESS);
            transfer = true;
        } catch (Exception e) {
            getLogger().error(e.getMessage(), e);
            if (outgoingFlowFile != null) {
                session.remove(outgoingFlowFile);
            }
        }
        return transfer;
    }

    private void readFlowFile(final ProcessSession session, final FlowFile flowFile) {
        session.read(flowFile, new InputStreamCallback() {
            @Override
            public void process(final InputStream in) throws IOException {
                try (Scanner scanner = new Scanner(in)) {
                    scanner.useDelimiter("\\A").next();
                }
            }
        });
    }

    private FlowFile formatFlowFile(final ProcessSession session, FlowFile flowFile) {
        OutputFormatWriter formatWriter = new OutputFormatWriter();
        flowFile = session.write(flowFile, formatWriter);
        return flowFile;
    }

    private static class OutputFormatWriter implements StreamCallback {
        @Override
        public void process(final InputStream in, final OutputStream out) throws IOException {
            try {
                IOUtils.copy(in, out);
                out.flush();
            } finally {
                IOUtils.closeQuietly(in);
                IOUtils.closeQuietly(out);
            }
        }
    }

    private static class StageOneWriter implements OutputStreamCallback {
        private Product product = null;

        public StageOneWriter(Product product) {
            this.product = product;
        }

        @Override
        public void process(final OutputStream out) throws IOException {
            final Gson gson = new Gson();
            final String json = gson.toJson(product);
            out.write(json.getBytes());
        }
    }

    private static class InfileReader implements InputStreamCallback {
        private Product product = null;

        @Override
        public void process(final InputStream in) throws IOException {
            product = null;
            final Gson gson = new Gson();
            Reader inReader = new InputStreamReader(in, "UTF-8");
            product = gson.fromJson(inReader, Product.class);
        }

        public Product getProduct() {
            return product;
        }
    }
}
SampleCloningProcessorTest.java
package sample.processors.cloning;

import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;
import org.junit.Before;
import org.junit.Test;

public class SampleCloningProcessorTest {

    static final String flowFileContent = "{"
            + "\"cost\": \"cost 1\","
            + "\"description\": \"description 1\","
            + "\"markup\": 1.2,"
            + "\"name\": \"name 1\","
            + "\"supplier\": \"supplier 1\""
            + "}";

    private TestRunner testRunner;

    @Before
    public void init() {
        testRunner = TestRunners.newTestRunner(SampleCloningProcessor.class);
        testRunner.enqueue(flowFileContent);
    }

    @Test
    public void testProcessorImmediateRead() {
        testRunner.setProperty(SampleCloningProcessor.IMMEDIATE_READ, "true");
        testRunner.run();
        testRunner.assertTransferCount("success", 2);
    }

    @Test
    public void testProcessorImmediateRead_false() {
        testRunner.setProperty(SampleCloningProcessor.IMMEDIATE_READ, "false");
        testRunner.run();
        testRunner.assertTransferCount("success", 2);
    }
}
Product.java
package sample.processors.cloning;

public class Product {
    private String name;
    private String description;
    private String supplier;
    private String cost;
    private float markup;

    public String getName() {
        return name;
    }

    public void setName(final String name) {
        this.name = name;
    }

    public String getDescription() {
        return description;
    }

    public void setDescription(final String description) {
        this.description = description;
    }

    public String getSupplier() {
        return supplier;
    }

    public void setSupplier(final String supplier) {
        this.supplier = supplier;
    }

    public String getCost() {
        return cost;
    }

    public void setCost(final String cost) {
        this.cost = cost;
    }

    public float getMarkup() {
        return markup;
    }

    public void setMarkup(final float markup) {
        this.markup = markup;
    }
}
product.json, a sample input file:
{
  "cost": "cost 1",
  "description": "description 1",
  "markup": 1.2,
  "name": "name 1",
  "supplier": "supplier 1"
}
Reported as a bug in NiFi; it is being addressed by https://issues.apache.org/jira/browse/NIFI-5879

Apache CXF Interceptors: Unable to modify the response Stream in an Out Interceptor [duplicate]

I would like to modify an outgoing SOAP request.
I would like to remove 2 XML nodes from the Envelope's body.
I managed to set up an interceptor and get the generated String value of the message sent to the endpoint.
However, the following code does not seem to work, as the outgoing message is not edited as expected. Does anyone have some code or ideas on how to do this?
public class MyOutInterceptor extends AbstractSoapInterceptor {

    public MyOutInterceptor() {
        super(Phase.SEND);
    }

    public void handleMessage(SoapMessage message) throws Fault {
        try {
            // Get message content for dirty editing...
            StringWriter writer = new StringWriter();
            CachedOutputStream cos = (CachedOutputStream) message.getContent(OutputStream.class);
            InputStream inputStream = cos.getInputStream();
            IOUtils.copy(inputStream, writer, "UTF-8");
            String content = writer.toString();
            // remove the substrings from the envelope...
            content = content.replace("<idJustification>0</idJustification>", "");
            content = content.replace("<indicRdv>false</indicRdv>", "");
            ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
            outputStream.write(content.getBytes(Charset.forName("UTF-8")));
            message.setContent(OutputStream.class, outputStream);
        } catch (IOException e) {
            throw new Fault(e);
        }
    }
}
Based on the first comment, I created an abstract class which can easily be used to change the whole SOAP envelope.
Just in case someone wants a ready-to-use code part:
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.apache.commons.io.IOUtils;
import org.apache.cxf.binding.soap.interceptor.SoapPreProtocolOutInterceptor;
import org.apache.cxf.io.CachedOutputStream;
import org.apache.cxf.message.Message;
import org.apache.cxf.phase.AbstractPhaseInterceptor;
import org.apache.cxf.phase.Phase;
import org.apache.log4j.Logger;

/**
 * http://www.mastertheboss.com/jboss-web-services/apache-cxf-interceptors
 * http://stackoverflow.com/questions/6915428/how-to-modify-the-raw-xml-message-of-an-outbound-cxf-request
 */
public abstract class MessageChangeInterceptor extends AbstractPhaseInterceptor<Message> {

    public MessageChangeInterceptor() {
        super(Phase.PRE_STREAM);
        addBefore(SoapPreProtocolOutInterceptor.class.getName());
    }

    protected abstract Logger getLogger();

    protected abstract String changeOutboundMessage(String currentEnvelope);

    protected abstract String changeInboundMessage(String currentEnvelope);

    public void handleMessage(Message message) {
        boolean isOutbound = message == message.getExchange().getOutMessage()
                || message == message.getExchange().getOutFaultMessage();
        if (isOutbound) {
            OutputStream os = message.getContent(OutputStream.class);
            CachedStream cs = new CachedStream();
            message.setContent(OutputStream.class, cs);
            message.getInterceptorChain().doIntercept(message);
            try {
                cs.flush();
                IOUtils.closeQuietly(cs);
                CachedOutputStream csnew = (CachedOutputStream) message.getContent(OutputStream.class);
                String currentEnvelopeMessage = IOUtils.toString(csnew.getInputStream(), "UTF-8");
                csnew.flush();
                IOUtils.closeQuietly(csnew);
                if (getLogger().isDebugEnabled()) {
                    getLogger().debug("Outbound message: " + currentEnvelopeMessage);
                }
                String res = changeOutboundMessage(currentEnvelopeMessage);
                if (res != null && getLogger().isDebugEnabled()) {
                    getLogger().debug("Outbound message has been changed: " + res);
                }
                res = res != null ? res : currentEnvelopeMessage;
                InputStream replaceInStream = IOUtils.toInputStream(res, "UTF-8");
                IOUtils.copy(replaceInStream, os);
                replaceInStream.close();
                IOUtils.closeQuietly(replaceInStream);
                os.flush();
                message.setContent(OutputStream.class, os);
                IOUtils.closeQuietly(os);
            } catch (IOException ioe) {
                getLogger().warn("Unable to perform change.", ioe);
                throw new RuntimeException(ioe);
            }
        } else {
            try {
                InputStream is = message.getContent(InputStream.class);
                String currentEnvelopeMessage = IOUtils.toString(is, "UTF-8");
                IOUtils.closeQuietly(is);
                if (getLogger().isDebugEnabled()) {
                    getLogger().debug("Inbound message: " + currentEnvelopeMessage);
                }
                String res = changeInboundMessage(currentEnvelopeMessage);
                if (res != null && getLogger().isDebugEnabled()) {
                    getLogger().debug("Inbound message has been changed: " + res);
                }
                res = res != null ? res : currentEnvelopeMessage;
                is = IOUtils.toInputStream(res, "UTF-8");
                message.setContent(InputStream.class, is);
                IOUtils.closeQuietly(is);
            } catch (IOException ioe) {
                getLogger().warn("Unable to perform change.", ioe);
                throw new RuntimeException(ioe);
            }
        }
    }

    public void handleFault(Message message) {
    }

    private class CachedStream extends CachedOutputStream {
        public CachedStream() {
            super();
        }

        protected void doFlush() throws IOException {
            currentStream.flush();
        }

        protected void doClose() throws IOException {
        }

        protected void onWrite() throws IOException {
        }
    }
}
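As an illustration, a hypothetical concrete subclass (the class name and log4j logger are mine, not from the answer) that strips the two nodes from the question's envelope might look like:

import org.apache.log4j.Logger;

public class JustificationStrippingInterceptor extends MessageChangeInterceptor {

    private static final Logger LOG = Logger.getLogger(JustificationStrippingInterceptor.class);

    @Override
    protected Logger getLogger() {
        return LOG;
    }

    @Override
    protected String changeOutboundMessage(String currentEnvelope) {
        // remove the two unwanted nodes from the outgoing envelope
        return currentEnvelope
                .replace("<idJustification>0</idJustification>", "")
                .replace("<indicRdv>false</indicRdv>", "");
    }

    @Override
    protected String changeInboundMessage(String currentEnvelope) {
        return null; // leave inbound messages untouched
    }
}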
I had this problem as well today. After much weeping and gnashing of teeth, I was able to alter the StreamInterceptor class in the configuration_interceptor demo that comes with the CXF source:
OutputStream os = message.getContent(OutputStream.class);
CachedStream cs = new CachedStream();
message.setContent(OutputStream.class, cs);
message.getInterceptorChain().doIntercept(message);
try {
    cs.flush();
    CachedOutputStream csnew = (CachedOutputStream) message.getContent(OutputStream.class);
    String soapMessage = IOUtils.toString(csnew.getInputStream());
    ...
The soapMessage variable will contain the complete SOAP message. You should be able to manipulate the SOAP message, flush it to an output stream, and do a message.setContent(OutputStream.class...) call to put your modifications on the message. This comes with no warranty, since I'm pretty new to CXF myself!
Note: CachedStream is a private class in the StreamInterceptor class. Don't forget to configure your interceptor to run in the PRE_STREAM phase so that the SOAP interceptors have a chance to write the SOAP message.
The following is able to bubble up server-side exceptions. Using os.close() instead of IOUtils.closeQuietly(os) in the previous solution also bubbles up exceptions.
public class OutInterceptor extends AbstractPhaseInterceptor<Message> {

    public OutInterceptor() {
        super(Phase.PRE_STREAM);
        addBefore(StaxOutInterceptor.class.getName());
    }

    public void handleMessage(Message message) {
        OutputStream os = message.getContent(OutputStream.class);
        CachedOutputStream cos = new CachedOutputStream();
        message.setContent(OutputStream.class, cos);
        message.getInterceptorChain().add(new OutMessageChangingInterceptor(os));
    }
}

public class OutMessageChangingInterceptor extends AbstractPhaseInterceptor<Message> {

    private OutputStream os;

    public OutMessageChangingInterceptor(OutputStream os) {
        super(Phase.PRE_STREAM_ENDING);
        addAfter(StaxOutEndingInterceptor.class.getName());
        this.os = os;
    }

    public void handleMessage(Message message) {
        try {
            CachedOutputStream csnew = (CachedOutputStream) message.getContent(OutputStream.class);
            String currentEnvelopeMessage = IOUtils.toString(csnew.getInputStream(), (String) message.get(Message.ENCODING));
            csnew.flush();
            IOUtils.closeQuietly(csnew);
            // changeOutboundMessage(...) is where the envelope gets modified
            String res = changeOutboundMessage(currentEnvelopeMessage);
            res = res != null ? res : currentEnvelopeMessage;
            InputStream replaceInStream = IOUtils.toInputStream(res, (String) message.get(Message.ENCODING));
            IOUtils.copy(replaceInStream, os);
            replaceInStream.close();
            IOUtils.closeQuietly(replaceInStream);
            message.setContent(OutputStream.class, os);
        } catch (IOException ioe) {
            throw new RuntimeException(ioe);
        }
    }
}
Here is a good example of replacing outbound SOAP content, based on the above:
package kz.bee.bip;

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.apache.commons.io.IOUtils;
import org.apache.cxf.binding.soap.interceptor.SoapPreProtocolOutInterceptor;
import org.apache.cxf.io.CachedOutputStream;
import org.apache.cxf.message.Message;
import org.apache.cxf.phase.AbstractPhaseInterceptor;
import org.apache.cxf.phase.Phase;

public class SOAPOutboundInterceptor extends AbstractPhaseInterceptor<Message> {

    public SOAPOutboundInterceptor() {
        super(Phase.PRE_STREAM);
        addBefore(SoapPreProtocolOutInterceptor.class.getName());
    }

    public void handleMessage(Message message) {
        boolean isOutbound = message == message.getExchange().getOutMessage()
                || message == message.getExchange().getOutFaultMessage();
        if (isOutbound) {
            OutputStream os = message.getContent(OutputStream.class);
            CachedStream cs = new CachedStream();
            message.setContent(OutputStream.class, cs);
            message.getInterceptorChain().doIntercept(message);
            try {
                cs.flush();
                IOUtils.closeQuietly(cs);
                CachedOutputStream csnew = (CachedOutputStream) message.getContent(OutputStream.class);
                String currentEnvelopeMessage = IOUtils.toString(csnew.getInputStream(), "UTF-8");
                csnew.flush();
                IOUtils.closeQuietly(csnew);
                /* here we can set new data instead of currentEnvelopeMessage */
                InputStream replaceInStream = IOUtils.toInputStream(currentEnvelopeMessage, "UTF-8");
                IOUtils.copy(replaceInStream, os);
                replaceInStream.close();
                IOUtils.closeQuietly(replaceInStream);
                os.flush();
                message.setContent(OutputStream.class, os);
                IOUtils.closeQuietly(os);
            } catch (IOException ioe) {
                ioe.printStackTrace();
            }
        }
    }

    public void handleFault(Message message) {
    }

    private static class CachedStream extends CachedOutputStream {
        public CachedStream() {
            super();
        }

        protected void doFlush() throws IOException {
            currentStream.flush();
        }

        protected void doClose() throws IOException {
        }

        protected void onWrite() throws IOException {
        }
    }
}
A better way would be to modify the message using the DOM interface. You need to add the SAAJOutInterceptor first (this might have a performance hit for big requests) and then your custom interceptor, which is executed in the USER_PROTOCOL phase:
import org.apache.cxf.binding.soap.SoapMessage;
import org.apache.cxf.binding.soap.interceptor.AbstractSoapInterceptor;
import org.apache.cxf.interceptor.Fault;
import org.apache.cxf.phase.Phase;
import org.w3c.dom.Node;
import javax.xml.soap.SOAPException;
import javax.xml.soap.SOAPMessage;

public abstract class SoapNodeModifierInterceptor extends AbstractSoapInterceptor {

    SoapNodeModifierInterceptor() {
        super(Phase.USER_PROTOCOL);
    }

    @Override
    public void handleMessage(SoapMessage message) throws Fault {
        try {
            if (message == null) {
                return;
            }
            SOAPMessage sm = message.getContent(SOAPMessage.class);
            if (sm == null) {
                throw new RuntimeException("You must add the SAAJOutInterceptor to the chain");
            }
            modifyNodes(sm.getSOAPBody());
        } catch (SOAPException e) {
            throw new RuntimeException(e);
        }
    }

    abstract void modifyNodes(Node node);
}
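For reference, a minimal wiring sketch for this DOM approach (the class name and helper method are illustrative, not from the answer; ClientProxy.getClient works on standard JAX-WS client proxies):

import org.apache.cxf.binding.soap.interceptor.AbstractSoapInterceptor;
import org.apache.cxf.binding.soap.saaj.SAAJOutInterceptor;
import org.apache.cxf.endpoint.Client;
import org.apache.cxf.frontend.ClientProxy;

public final class InterceptorWiring {

    /** Attaches the SAAJ out interceptor plus a node-modifying interceptor to a JAX-WS proxy. */
    public static void wire(Object jaxWsProxy, AbstractSoapInterceptor modifier) {
        Client client = ClientProxy.getClient(jaxWsProxy);
        // SAAJOutInterceptor materializes the outgoing message as a SOAPMessage,
        // which is what the USER_PROTOCOL-phase interceptor above reads.
        client.getOutInterceptors().add(new SAAJOutInterceptor());
        client.getOutInterceptors().add(modifier);
    }
}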
This one's working for me. It's based on the StreamInterceptor class from the configuration_interceptor example in the Apache CXF samples.
It's in Scala instead of Java, but the conversion is straightforward.
I tried to add comments to explain what's happening (as far as I understand).
import java.io.OutputStream
import org.apache.cxf.binding.soap.interceptor.SoapPreProtocolOutInterceptor
import org.apache.cxf.helpers.IOUtils
import org.apache.cxf.io.CachedOutputStream
import org.apache.cxf.message.Message
import org.apache.cxf.phase.AbstractPhaseInterceptor
import org.apache.cxf.phase.Phase

// java note: base constructor call is hidden at the end of class declaration
class StreamInterceptor() extends AbstractPhaseInterceptor[Message](Phase.PRE_STREAM) {

  // java note: put this into the constructor after calling super(Phase.PRE_STREAM);
  addBefore(classOf[SoapPreProtocolOutInterceptor].getName)

  override def handleMessage(message: Message) = {
    // get original output stream
    val osOrig = message.getContent(classOf[OutputStream])
    // our output stream
    val osNew = new CachedOutputStream
    // replace it with ours
    message.setContent(classOf[OutputStream], osNew)
    // fills osNew instead of osOrig
    message.getInterceptorChain.doIntercept(message)
    // flush before getting content
    osNew.flush()
    // get filled content
    val content = IOUtils.toString(osNew.getInputStream, "UTF-8")
    // we got the content, we may close our output stream now
    osNew.close()
    // modified content
    val modifiedContent = content.replace("a-string", "another-string")
    // fill original output stream
    osOrig.write(modifiedContent.getBytes("UTF-8"))
    // flush before set
    osOrig.flush()
    // replace with original output stream filled with our modified content
    message.setContent(classOf[OutputStream], osOrig)
  }
}

HBase coprocessor failing to load from shell

I am trying to add a coprocessor to an HBase table, and it is failing with this error:
2016-03-15 14:40:14,130 INFO org.apache.hadoop.hbase.regionserver.RSRpcServices: Open PRODUCT_DETAILS,,1457953190424.f687dd250bfd1f18ffbb8075fd625145.
2016-03-15 14:40:14,173 ERROR org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost: Failed to load coprocessor com.optymyze.coprocessors.ProductObserver
java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "mylocalhost/mylocalhostip"; destination host is: "mydestinationhost":9000;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
To add the coprocessor I did the following:
hbase> disable 'PRODUCT_DETAILS'
hbase> alter 'PRODUCT_DETAILS', METHOD => 'table_att', 'coprocessor'=>'hdfs://mydestinationhost:9000/hbase-coprocessors-0.0.3-SNAPSHOT.jar|com.optymyze.coprocessors.ProductObserver|1001|arg1=1,arg2=2'
Now enable 'PRODUCT_DETAILS' won't work.
The coprocessor code is as follows:
package com.optymyze.coprocessors;

import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.HTableInterface;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.coprocessor.BaseRegionObserver;
import org.apache.hadoop.hbase.coprocessor.ObserverContext;
import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;
import org.apache.hadoop.hbase.regionserver.wal.WALEdit;
import org.apache.hadoop.hbase.util.Bytes;
import org.slf4j.Logger;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import static org.slf4j.LoggerFactory.*;

/**
 * Created by adnan on 14-03-2016.
 */
public class ProductObserver extends BaseRegionObserver {
    private static final Logger LOGGER = getLogger(ProductObserver.class);
    private static final String PRODUCT_DETAILS_TABLE = "PRODUCT_DETAILS";
    public static final String COLUMN_FAMILY = "CF";

    @Override
    public void postPut(ObserverContext<RegionCoprocessorEnvironment> e, Put put, WALEdit edit, boolean writeToWAL) throws IOException {
        List<KeyValue> kvs = put.getFamilyMap().get(Bytes.toBytes(COLUMN_FAMILY));
        LOGGER.info("key values {}", kvs);
        Map<String, Integer> qualifierVsValue = getMapForQualifierVsValuesForRequiredOnes(kvs);
        LOGGER.info("qualifier values {}", qualifierVsValue);
        List<Put> puts = createPuts(kvs, qualifierVsValue);
        LOGGER.info("puts values {}", puts);
        updateProductTable(e, puts);
        LOGGER.info("puts done");
    }

    private void updateProductTable(ObserverContext<RegionCoprocessorEnvironment> e, List<Put> puts) throws IOException {
        HTableInterface productTable = e.getEnvironment().getTable(Bytes.toBytes(PRODUCT_DETAILS_TABLE));
        try {
            productTable.put(puts);
        } finally {
            productTable.close();
        }
    }

    private List<Put> createPuts(List<KeyValue> kvs, Map<String, Integer> qualifierVsValue) {
        int salePrice, baseline = 0, finalPrice = 0;
        List<Put> puts = new ArrayList<Put>(kvs.size());
        for (KeyValue kv : kvs) {
            if (kv.matchingQualifier(Bytes.toBytes("BASELINE"))) {
                baseline = convertToZeroIfNull(qualifierVsValue, "PRICE")
                        - convertToZeroIfNull(qualifierVsValue, "PRICE")
                        * convertToZeroIfNull(qualifierVsValue, "DISCOUNT") / 100;
                puts.add(newPut(kv, baseline));
            }
            if (kv.matchingQualifier(Bytes.toBytes("FINALPRICE"))) {
                finalPrice = baseline + baseline * convertToZeroIfNull(qualifierVsValue, "UPLIFT") / 100;
                puts.add(newPut(kv, finalPrice));
            }
            if (kv.matchingQualifier(Bytes.toBytes("SALEPRICE"))) {
                salePrice = finalPrice * convertToZeroIfNull(qualifierVsValue, "VOLUME");
                puts.add(newPut(kv, salePrice));
            }
        }
        return puts;
    }

    private Map<String, Integer> getMapForQualifierVsValuesForRequiredOnes(List<KeyValue> kvs) {
        Map<String, Integer> qualifierVsValue = new HashMap<String, Integer>();
        for (KeyValue kv : kvs) {
            getValueFromQualifier(kv, "PRICE", qualifierVsValue);
            getValueFromQualifier(kv, "DISCOUNT", qualifierVsValue);
            getValueFromQualifier(kv, "UPLIFT", qualifierVsValue);
            getValueFromQualifier(kv, "VOLUME", qualifierVsValue);
        }
        return qualifierVsValue;
    }

    private Integer convertToZeroIfNull(Map<String, Integer> qualifierVsValue, String qualifier) {
        Integer v = qualifierVsValue.get(qualifier);
        return v == null ? 0 : v;
    }

    private void getValueFromQualifier(KeyValue kv, String qualifier, Map<String, Integer> qualifierVsValue) {
        if (kv.matchingQualifier(Bytes.toBytes(qualifier))) {
            qualifierVsValue.put(qualifier, Bytes.toInt(convertToByteZeroIfNull(kv)));
        }
    }

    private Put newPut(KeyValue kv, int newVal) {
        Put put = new Put(kv.getValue(), kv.getTimestamp());
        put.add(kv.getFamily(), kv.getQualifier(), Bytes.toBytes(newVal));
        return put;
    }

    private byte[] convertToByteZeroIfNull(KeyValue kv) {
        return kv.getValue() == null ? Bytes.toBytes(0) : kv.getValue();
    }
}
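A recovery hint (an assumption on my part, not from the original post): as long as the failing coprocessor attribute is attached, the table cannot be re-enabled, so unset it from the shell first; the attribute name coprocessor$1 assumes it was the first coprocessor added to the table:
hbase> alter 'PRODUCT_DETAILS', METHOD => 'table_att_unset', NAME => 'coprocessor$1'
hbase> enable 'PRODUCT_DETAILS'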

Using Multiple Mappers for multiple output directories in Hadoop MapReduce

I want to run two mappers that produce two different outputs in different directories. The output of the first mapper (sent as an argument) should be fed to the input of the second mapper. I have this code in the driver class:
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class Export_Column_Mapping {
    private static String[] Detail_output_column_array = new String[27];
    private static String[] Shop_output_column_array = new String[8];
    private static String details_output = null;
    private static String Shop_output = null;

    public static void main(String[] args) throws Exception {
        String Output_filetype = args[3];
        String Input_column_number = args[4];
        String Output_column_number = args[5];

        Configuration Detailsconf = new Configuration(false);
        Detailsconf.setStrings("output_filetype", Output_filetype);
        Detailsconf.setStrings("Input_column_number", Input_column_number);
        Detailsconf.setStrings("Output_column_number", Output_column_number);

        Job Details = new Job(Detailsconf, " Export_Column_Mapping");
        Details.setJarByClass(Export_Column_Mapping.class);
        Details.setJobName("DetailsFile_Job");
        Details.setMapperClass(DetailFile_Mapper.class);
        Details.setNumReduceTasks(0);
        Details.setInputFormatClass(TextInputFormat.class);
        Details.setOutputFormatClass(TextOutputFormat.class);
        FileInputFormat.setInputPaths(Details, new Path(args[0]));
        FileOutputFormat.setOutputPath(Details, new Path(args[1]));

        if (Details.waitForCompletion(true)) {
            Configuration Shopconf = new Configuration();
            Job Shop = new Job(Shopconf, "Export_Column_Mapping");
            Shop.setJarByClass(Export_Column_Mapping.class);
            Shop.setJobName("ShopFile_Job");
            Shop.setMapperClass(ShopFile_Mapper.class);
            Shop.setNumReduceTasks(0);
            Shop.setInputFormatClass(TextInputFormat.class);
            Shop.setOutputFormatClass(TextOutputFormat.class);
            FileInputFormat.setInputPaths(Shop, new Path(args[1]));
            FileOutputFormat.setOutputPath(Shop, new Path(args[2]));
            MultipleOutputs.addNamedOutput(Shop, "text", TextOutputFormat.class, LongWritable.class, Text.class);
            System.exit(Shop.waitForCompletion(true) ? 0 : 1);
        }
    }

    public static class DetailFile_Mapper extends Mapper<LongWritable, Text, Text, Text> {
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            String str_Output_filetype = context.getConfiguration().get("output_filetype");
            String str_Input_column_number = context.getConfiguration().get("Input_column_number");
            String[] input_columns_number = str_Input_column_number.split(",");
            String str_Output_column_number = context.getConfiguration().get("Output_column_number");
            String[] output_columns_number = str_Output_column_number.split(",");
            String str_line = value.toString();
            String[] input_column_array = str_line.split(",");
            try {
                for (int i = 0; i <= input_column_array.length + 1; i++) {
                    int int_outputcolumn = Integer.parseInt(output_columns_number[i]);
                    int int_inputcolumn = Integer.parseInt(input_columns_number[i]);
                    if ((int_inputcolumn != 0) && (int_outputcolumn != 0) && output_columns_number.length == input_columns_number.length) {
                        Detail_output_column_array[int_outputcolumn - 1] = input_column_array[int_inputcolumn - 1];
                        if (details_output != null) {
                            details_output = details_output + " " + Detail_output_column_array[int_outputcolumn - 1];
                            Shop_output = Shop_output + " " + Shop_output_column_array[int_outputcolumn - 1];
                        } else {
                            details_output = Detail_output_column_array[int_outputcolumn - 1];
                            Shop_output = Shop_output_column_array[int_outputcolumn - 1];
                        }
                    }
                }
            } catch (Exception e) {
            }
            context.write(null, new Text(details_output));
        }
    }

    public static class ShopFile_Mapper extends Mapper<LongWritable, Text, Text, Text> {
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            try {
                for (int i = 0; i <= Shop_output_column_array.length; i++) {
                    Shop_output_column_array[0] = Detail_output_column_array[0];
                    Shop_output_column_array[1] = Detail_output_column_array[1];
                    Shop_output_column_array[2] = Detail_output_column_array[2];
                    Shop_output_column_array[3] = Detail_output_column_array[3];
                    Shop_output_column_array[4] = Detail_output_column_array[14];
                    if (details_output != null) {
                        Shop_output = Shop_output + " " + Shop_output_column_array[i];
                    } else {
                        Shop_output = Shop_output_column_array[i - 1];
                    }
                }
            } catch (Exception e) {
            }
            context.write(null, new Text(Shop_output));
        }
    }
}
I get the error:
Error: org.apache.hadoop.mapreduce.lib.input.InvalidInputException:
Input path does not exist:
file:/home/Barath.B.Natarajan.ap/rules/text.txt
I want to run the jobs one by one. Can anyone help me with this?
There is something called JobControl with which you will be able to achieve this.
Suppose there are two jobs, A and B:
ControlledJob a = new ControlledJob(confA); // JobConf/Configuration for job A
ControlledJob b = new ControlledJob(confB); // JobConf/Configuration for job B
b.addDependingJob(a); // B starts only after A completes successfully

JobControl jControl = new JobControl("Name");
jControl.addJob(a);
jControl.addJob(b);

Thread runJControl = new Thread(jControl);
runJControl.start();
while (!jControl.allFinished()) {
    code = jControl.getFailedJobList().size() == 0 ? 0 : 1;
    Thread.sleep(1000);
}
System.exit(code);
Initialize code at the beginning like this:
int code = 1;
Let the first job in your case be the first mapper with zero reducers, and the second job be the second mapper with zero reducers. The configuration should be such that the input path of B and the output path of A are the same.
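For the asker's setup, a self-contained sketch of that pattern (the helper class and method name are mine, not from the answer; ControlledJob and JobControl come from org.apache.hadoop.mapreduce.lib.jobcontrol) could look like this, taking the two already-configured jobs from the driver:

import java.io.IOException;
import java.util.Collections;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob;
import org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl;

public class ChainRunner {

    /** Runs 'second' only after 'first' succeeds; returns 0 if both succeed, 1 otherwise. */
    public static int runChained(Job first, Job second) throws IOException, InterruptedException {
        ControlledJob firstJob = new ControlledJob(first, Collections.<ControlledJob>emptyList());
        ControlledJob secondJob = new ControlledJob(second, Collections.singletonList(firstJob));

        JobControl control = new JobControl("chained-jobs");
        control.addJob(firstJob);
        control.addJob(secondJob);

        Thread runner = new Thread(control); // JobControl implements Runnable
        runner.setDaemon(true);
        runner.start();
        while (!control.allFinished()) {
            Thread.sleep(1000); // poll until both jobs are done
        }
        control.stop();
        return control.getFailedJobList().isEmpty() ? 0 : 1;
    }
}

In the question's main method this would replace the two waitForCompletion calls with something like System.exit(ChainRunner.runChained(Details, Shop));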
