No procedure/function/signature - oracle

I am relatively new to using stored procedures and I have really run up against a wall. I am receiving the following error message using the Spring JdbcTemplate. My dev environment is Xubuntu, JDK 1.8.
The stack trace is:
Exception in thread "main" org.springframework.dao.InvalidDataAccessApiUsageException: Unable to determine the correct call signature - no procedure/function/signature for 'PROCONEINPARAMETER'
at org.springframework.jdbc.core.metadata.GenericCallMetaDataProvider.processProcedureColumns(GenericCallMetaDataProvider.java:347)
at org.springframework.jdbc.core.metadata.GenericCallMetaDataProvider.initializeWithProcedureColumnMetaData(GenericCallMetaDataProvider.java:112)
at org.springframework.jdbc.core.metadata.CallMetaDataProviderFactory$1.processMetaData(CallMetaDataProviderFactory.java:133)
at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:299)
at org.springframework.jdbc.core.metadata.CallMetaDataProviderFactory.createMetaDataProvider(CallMetaDataProviderFactory.java:73)
at org.springframework.jdbc.core.metadata.CallMetaDataContext.initializeMetaData(CallMetaDataContext.java:286)
at org.springframework.jdbc.core.simple.AbstractJdbcCall.compileInternal(AbstractJdbcCall.java:303)
at org.springframework.jdbc.core.simple.AbstractJdbcCall.compile(AbstractJdbcCall.java:288)
at org.springframework.jdbc.core.simple.AbstractJdbcCall.checkCompiled(AbstractJdbcCall.java:348)
at org.springframework.jdbc.core.simple.AbstractJdbcCall.doExecute(AbstractJdbcCall.java:375)
at org.springframework.jdbc.core.simple.SimpleJdbcCall.executeFunction(SimpleJdbcCall.java:153)
at test.jdbc.StringDao.executeProcOneINParameter(StringDao.java:21)
at test.jdbc.SimpleJdbcTest.main(SimpleJdbcTest.java:15)
Code:
SimpleJdbcTest.java
package test.jdbc;

import java.util.Map;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class SimpleJdbcTest {

    public static void main(String[] args) {
        ApplicationContext ctx = new ClassPathXmlApplicationContext("applicationContext.xml");
        StringDao dao = (StringDao) ctx.getBean("edao");
        String request = new String(" Wow, this works!");
        String response = dao.executeProcOneINParameter(request);
        if (response != null && !response.equals(new String())) {
            System.out.println("stored proc worked: " + response);
        } else {
            System.err.println("stored proc did not work.");
        }
    }
}
StringDao.java
package test.jdbc;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

public class StringDao {

    private static final String PROC_NAME = "PROCONEINPARAMETER";
    private static final String CAT_NAME = "LISTENER";

    private JdbcTemplate jdbcTemplate;

    public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public String executeProcOneINParameter(String callParam) {
        SimpleJdbcCall jdbcCall = new SimpleJdbcCall(jdbcTemplate)
                .withCatalogName(CAT_NAME)
                .withProcedureName(PROC_NAME);
        return jdbcCall.executeFunction(String.class, callParam);
    }
}
Stored Proc:
PROCONEINPARAMETER
CREATE OR REPLACE PROCEDURE procOneINParameter(param1 IN VARCHAR2)
IS
BEGIN
DBMS_OUTPUT.PUT_LINE('Hello World IN parameter ' || param1);
END;

Aside from the problems that @Alex pointed out, which I corrected, the final problem was the following:
static final String PROC_NAME = "PROCONEINPARAMETER";
private static final String CAT_NAME = "LISTENER";
…..
SimpleJdbcCall jdbcCall = new SimpleJdbcCall(jdbcTemplate)
.withCatalogName(CAT_NAME)
.withProcedureName(PROC_NAME);
instead of:
SimpleJdbcCall jdbcCall = new SimpleJdbcCall(jdbcTemplate)
.withSchemaName(CAT_NAME)
.withProcedureName(PROC_NAME);
Obviously, there was no way for anyone to know that I was using catalog and schema name interchangeably.

You are trying to call a procedure, not a function. But you are calling it via the executeFunction() method, and specifying a return type of String.
You need to use execute() instead, still passing the procedure argument, but without the return type (since there isn't one from a procedure):
Map<String,Object> out = jdbcCall.execute(callParam);
Your procedure doesn't have any OUT parameters either, so out will be empty.
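For completeness, a minimal sketch of what the corrected DAO method might look like, assuming the withSchemaName() correction shown earlier (names are the ones used in the question):

public void executeProcOneINParameter(String callParam) {
    SimpleJdbcCall jdbcCall = new SimpleJdbcCall(jdbcTemplate)
            .withSchemaName(CAT_NAME)        // schema, not catalog, per the correction above
            .withProcedureName(PROC_NAME);
    // execute() passes the IN parameter positionally; the returned map will be
    // empty here because the procedure declares no OUT parameters
    Map<String, Object> out = jdbcCall.execute(callParam);
}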

Had a similar problem.
Issue: the schema name was in uppercase, so it was being treated as a quoted identifier.
E.g. the schema name was ABCD; while executing, Postgres was expecting "ABCD".
Resolution: converted the schema name to lowercase.
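As a sketch of the fix (the procedure and parameter names here are hypothetical; the point is only that the schema is passed in lowercase so Postgres does not require a quoted identifier):

SimpleJdbcCall jdbcCall = new SimpleJdbcCall(jdbcTemplate)
        .withSchemaName("abcd")              // lowercase instead of "ABCD"
        .withProcedureName("my_procedure");  // hypothetical procedure name
Map<String, Object> out = jdbcCall.execute(inputValue);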

Related

Mockito mock is not properly matching arguments (?)

I'm trying to stub one method in the service layer in order to test another object:
@SpringBootTest
@RunWith(JUnitPlatform.class)
class WorkreportCrudFacadeTest {

    private static Logger LOGGER = LogManager.getLogger(WorkreportCrudFacadeTest.class);

    @Test
    public void detailTest() {
        final AccessRightsService ars = Mockito.mock(AccessRightsService.class);
        final SystemPriceSettingService spss = Mockito.mock(SystemPriceSettingService.class);
        final WorkreportActivityRepository wrar = Mockito.mock(WorkreportActivityRepository.class);
        final WorkreportRepository wrr = Mockito.mock(WorkreportRepository.class);
        final DomainObjectTools dot = Mockito.mock(DomainObjectTools.class);
        final ApplicationEventPublisher aep = Mockito.mock(ApplicationEventPublisher.class);

        Mockito.when(ars.hasEmployeeRightsToWorkReport(
                ArgumentMatchers.any(Employee.class), ArgumentMatchers.any(Workreport.class)
        )).thenReturn(true);

        final WorkreportCrudFacade s = new WorkreportCrudFacade(ars, spss, wrar, wrr, dot, aep);
        final EmployeeId employeeId = new EmployeeId(154149756298300L);
        final WorkreportId workreportId = new WorkreportId(154149757395700L);
        final Workreport detail = s.detail(workreportId, employeeId);
        LOGGER.debug("Detail: {}", detail);
    }
}
and the method under test, which invokes the stubbed method:
public Workreport detail(final WorkreportId workreportId, final EmployeeId employeeId) {
    final Workreport workreport = domainObjectTools.getWorkreportOrThrowNotFoundException(workreportId);
    final Employee viewer = domainObjectTools.getEmployeeOrThrowNotFoundException(employeeId);
    boolean hasRights = accessRightsService.hasEmployeeRightsToWorkReport(viewer, workreport);
    LOGGER.debug("Has rights: {}", hasRights);
    if (!hasRights) {
        throw new ForbiddenException();
    }
    return workreport;
}
but when I call the tested method (detail) on the WorkreportCrudFacade instance, hasEmployeeRightsToWorkReport is not properly stubbed (it should return true, but returns false).
I'm sure it will be some small detail, but I'm not able to find out what is wrong; probably something in the argument matchers, but I'm not sure.
I'm using Mockito 2.22.0.
Citing from ArgumentMatchers javadoc:
Since Mockito any(Class) and anyInt family matchers perform a type check, thus they won't match null arguments. Instead use the isNull matcher.
I think that the following happens here: Your DomainObjectTools is an empty mock (not stubbed) and thus it returns null Workreport and null Employee. It results in calling accessRightsService.hasEmployeeRightsToWorkReport(null,null). The null values are not matched by ArgumentMatchers.any(Class).
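A sketch of how the stubbing could be made to match, by also stubbing the DomainObjectTools mock so it returns non-null values (the no-arg constructors here are hypothetical; use whatever builds your domain objects):

final Workreport workreport = new Workreport();   // hypothetical way to build a Workreport
final Employee viewer = new Employee();           // hypothetical way to build an Employee
Mockito.when(dot.getWorkreportOrThrowNotFoundException(ArgumentMatchers.any(WorkreportId.class)))
        .thenReturn(workreport);
Mockito.when(dot.getEmployeeOrThrowNotFoundException(ArgumentMatchers.any(EmployeeId.class)))
        .thenReturn(viewer);
// With non-null arguments, hasEmployeeRightsToWorkReport(viewer, workreport) is now
// matched by the any(Class) matchers and returns true as stubbed.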

How to edit or alter Spring parameter expansion?

In application.yml:
flyway:
  locations: classpath:/db/migration, classpath:/db/local
  placeholders:
    csvpath: ${user.dir}/src/main/resources/db/local
In flyway init script:
LOAD DATA LOCAL INFILE '${csvpath}/data.csv'
INTO TABLE myscheama.t1 FIELDS TERMINATED BY ','
ENCLOSED BY '"' LINES TERMINATED BY '\n'
IGNORE 1 LINES;
The ${user.dir} expands to the path of the current project (which is what I want), but on a Windows box it expands with '\' path separators. When Spring Boot spins up, Flyway executes right away and passes the expanded path to my DB (MySQL), which will not accept '\' in the path name. How can I force a '/' instead, or otherwise get the proper slash into the path, to make this work?
I was able to resolve this by implementing an EnvironmentPostProcessor:
Add a META-INF/spring.factories properties file to src/main/resources
# Environment Post Processor
org.springframework.boot.env.EnvironmentPostProcessor = com.gitlab.ttubbs.myapp.PropertiesEnvironmentPostProcessor
Add implementation:
package com.gitlab.ttubbs.myapp;
import java.util.Map;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.env.EnvironmentPostProcessor;
import org.springframework.core.env.ConfigurableEnvironment;
import org.springframework.core.env.MapPropertySource;
import org.springframework.core.env.MutablePropertySources;
import org.springframework.core.env.PropertySource;
public class PropertiesEnvironmentPostProcessor implements EnvironmentPostProcessor {

    private static final String SYSTEM_PROPERTIES = "systemProperties";
    private static final String USER_DIR = "user.dir";
    private static final char BACKSLASH = '\\';
    private static final char FORWARD_SLASH = '/';

    @Override
    public void postProcessEnvironment(ConfigurableEnvironment environment, SpringApplication application) {
        MutablePropertySources propertySources = environment.getPropertySources();
        Map<String, Object> systemProperties = getMapPropertySource(propertySources, SYSTEM_PROPERTIES);
        // Update ${user.dir} to use FORWARD_SLASH instead of BACKSLASH
        updateMapProperty(systemProperties, USER_DIR, environment.getProperty(USER_DIR).replace(BACKSLASH, FORWARD_SLASH));
    }

    private Map<String, Object> getMapPropertySource(MutablePropertySources propertySources, String sourceName) {
        if (propertySources.contains(sourceName)) {
            PropertySource<?> source = propertySources.get(sourceName);
            if (source instanceof MapPropertySource) {
                MapPropertySource target = (MapPropertySource) source;
                return target.getSource();
            }
        }
        return null;
    }

    private void updateMapProperty(Map<String, Object> target, String key, String value) {
        if (target != null) {
            target.put(key, value);
        }
    }
}

Getting reasonable performance from a parameterized query in Spring JDBC template

I am trying to execute a very simple query via the Spring JdbcTemplate. I am retrieving one attribute from a record that is identified by its primary key. The entirety of the code is shown below. When I do this with a query built by concatenation (dangerous and ugly, and currently uncommented) it executes in 0.1 seconds. When I switch the comments and use the parameterized query, it executes in 50 seconds. I would much prefer to get the protection that comes with the parameterized query, but 50 seconds seems like a steep price to pay. Any hints on how this could be made more reasonable?
public class JdbcEventDaoImpl {

    private static JdbcTemplate jtemp;
    private static PreparedStatement getJsonStatement;
    private static final Logger logger = LoggerFactory.getLogger(JdbcEventDaoImpl.class);

    @Autowired
    public void setDataSource(DataSource dataSource) {
        JdbcEventDaoImpl.jtemp = new JdbcTemplate(dataSource);
    }

    public String getJdbcForPosting(String aggregationId) {
        try {
            return (String) JdbcEventDaoImpl.jtemp.queryForObject("select PostingJson from PostingCollection where AggregationId = '" + aggregationId + "'", String.class);
            //return (String) JdbcEventDaoImpl.jtemp.queryForObject("select PostingJson from PostingCollection where AggregationId = ?", aggregationId, String.class);
        } catch (EmptyResultDataAccessException e) {
            return "Not Available";
        }
    }
}

Calling through Spring a procedure with collection of object (oracle ARRAY STRUCT)

I'm trying to execute a procedure which contains, among other things, a parameter that is a collection of objects (an Oracle ARRAY of STRUCTs). I have managed this many times without Spring, but I'm a bit lost trying to do it with Spring. Although there is some information on the internet, I can't find a full example to compare my code against; the Spring docs have just fragments. Probably my code is wrong, but I don't know why. Could you help me? I'm running simpler procedures without problems. My DAO looks like this:
//[EDITED]
private SimpleJdbcCall pActualizaDia;
....

@Autowired
public void setDataSource(DataSource dataSource) {
    pActualizaDia = new SimpleJdbcCall(dataSource).withCatalogName("PTR_GRUPOS_TRABAJO").withProcedureName("UPDATE_DIA");
    pActualizaDia.getJdbcTemplate().setNativeJdbcExtractor(new OracleJdbc4NativeJdbcExtractor());
}
...

public Calendario updateSingle(final Calendario calendario) {
    SqlTypeValue cambiosEmpresa = new AbstractSqlTypeValue() {
        protected Object createTypeValue(Connection conn, int sqlType, String typeName) throws SQLException {
            ArrayDescriptor arrayDescriptor = new ArrayDescriptor("TTPTR_CAMBIO_EMPRESA", conn);
            Object[] collection = new Object[calendario.getCambiosEmpresa().size()];
            int i = 0;
            for (CeAnoEmp ce : calendario.getCambiosEmpresa()) {
                collection[i++] = new STRUCT(new StructDescriptor("TPTR_CAMBIO_EMPRESA", conn), conn, new Object[] {
                        ce.getSQLParam1(),
                        //...more parameters here in order to fit your type.
                        ce.getSQLparamn() });
            }
            ARRAY idArray = new ARRAY(arrayDescriptor, conn, collection);
            return idArray;
        }
    };

    MapSqlParameterSource mapIn = new MapSqlParameterSource();
    mapIn.addValue("P_ID_ESCALA", calendario.getEscala().getIdEscala());
    //more simple params here

    //Here it is the Oracle ARRAY working properly
    pActualizaDia.declareParameters(new SqlParameter("P_CAMBIOS_EMPRESA",
            OracleTypes.STRUCT, "TTPR_CAMBIO_EMPRESA"));
    mapIn.addValue("P_CAMBIOS_EMPRESA", cambiosEmpresa);

    //When executing the procedure it just work :)
    pActualizaDia.execute(mapIn);
    return null;
}
The exception I get says:
java.lang.ClassCastException: $Proxy91 cannot be cast to oracle.jdbc.OracleConnection
I've been reading more about this topic and found that when using Oracle arrays it seems you also have to cast the connection to an Oracle connection.
However, most Spring JDBC framework classes like SimpleJdbcTemplate and StoredProcedure hide the connection access from you. Do I need to subclass one of those and override a method somewhere to get the DBCP connection and then cast it to an Oracle connection?
Thank you very much.
I've finally solved it, and I've edited the post so there is a full example for anyone looking for a piece of code to solve this issue.
There are two important things to keep in mind:
1) It's mandatory to set the Oracle extractor on the JdbcTemplate so the connection can be properly cast to get the Oracle functionality.
2) When using this extractor, the ojdbc driver and JRE versions must match; otherwise you'll get an abstract method invocation exception.
Thanks to anyone who tried to solve it; I hope this helps.
You can use Spring to call a procedure with an array (collection) of Oracle structures; below is a simple example of how to do this:
import java.util.Collection;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import javax.sql.DataSource;
import oracle.jdbc.driver.OracleConnection;
import oracle.jdbc.driver.OracleTypes;
import oracle.sql.ARRAY;
import oracle.sql.ArrayDescriptor;
import oracle.sql.STRUCT;
import oracle.sql.StructDescriptor;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.object.StoredProcedure;

public class SpringObjectMapper {

    public static class SaveObjectFunction extends StoredProcedure {

        final static Logger logger = LoggerFactory.getLogger(SaveObjectFunction.class);

        private static final String PROC_NAME = "schema.proc_name";
        private final static String ARRAY_OF_VALUE_PARAM_NAME = "ARRAY_OF_VALUE";
        private final static String OUT_PARAM_NAME = "out";

        public SaveObjectFunction(DataSource dataSource) {
            super(dataSource, PROC_NAME);
            declareParameter(new SqlParameter(ARRAY_OF_VALUE_PARAM_NAME, OracleTypes.ARRAY, "schema.array_object_type"));
            // the OUT parameter must also be declared so it appears in the result map below
            declareParameter(new SqlOutParameter(OUT_PARAM_NAME, OracleTypes.VARCHAR));
            compile();
        }

        public String execute(Collection<Model> values) {
            logger.info("------------------------SaveObjectFunction::execute : begin----------------------------");
            String message = null;
            try {
                OracleConnection connection = getJdbcTemplate().getDataSource().getConnection().unwrap(OracleConnection.class);
                ArrayDescriptor arrayValueDescriptor = new ArrayDescriptor("schema.array_object_type", connection);
                StructDescriptor typeObjeDescriptor = new StructDescriptor("schema.object_type", connection);
                Object[] valueStructArray = new Object[values.size()];
                int i = 0;
                for (Iterator<Model> iterator = values.iterator(); iterator.hasNext();) {
                    Model model = (Model) iterator.next();
                    STRUCT s = new STRUCT(typeObjeDescriptor, connection, new Object[] {
                            model.getAttribute1(), model.getAttribute2(), model.getAttribute3(),
                            model.getAttribute4(), model.getAttribute5(), model.getAttribute6(), model.getAttribute7()});
                    valueStructArray[i++] = s;
                }
                ARRAY inZoneStructArray = new ARRAY(arrayValueDescriptor, connection, valueStructArray);
                Map<String, Object> inputs = new HashMap<String, Object>();
                inputs.put(ARRAY_OF_VALUE_PARAM_NAME, inZoneStructArray);
                Map<String, Object> out = super.execute(inputs);
                message = (String) out.get(OUT_PARAM_NAME);
            } catch (Exception e) {
                e.printStackTrace();
            }
            return message;
        }
    }
}
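A hedged usage sketch (assuming a Spring-managed DataSource and a Collection<Model> called models; these names are illustrative, not part of the example above):

// the DataSource would typically be injected by Spring
SpringObjectMapper.SaveObjectFunction saveObject = new SpringObjectMapper.SaveObjectFunction(dataSource);
String message = saveObject.execute(models);  // sends the collection as an Oracle ARRAY of STRUCTs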

Can you think of a better way to only load DBUnit once per test class with Spring?

I realise that best practice may advise loading test data in every @Test method; however, this can be painfully slow with DBUnit, so I have come up with the following solution to load it only once per class:
Only load a data set once per test class
Support multiple data sources and those not named "dataSource" from the ApplicationContext
Roll back of the inserted DBUnit data set not strictly required
While the code below works, what is bugging me is that my test class has the static method beforeClassWithApplicationContext(), but it cannot belong to an interface because it's static. Therefore my use of reflection is not type safe. Is there a more elegant solution?
/**
 * My Test class
 */
@RunWith(SpringJUnit4ClassRunner.class)
@TestExecutionListeners({DependencyInjectionTestExecutionListener.class, DirtiesContextTestExecutionListener.class, DbunitLoadOnceTestExecutionListener.class})
@ContextConfiguration(locations={"classpath:resources/spring/applicationContext.xml"})
public class TestClass {

    public static final String TEST_DATA_FILENAME = "Scenario-1.xml";

    public static void beforeClassWithApplicationContext(ApplicationContext ctx) throws Exception {
        DataSource ds = (DataSource) ctx.getBean("dataSourceXyz");
        IDatabaseConnection conn = new DatabaseConnection(ds.getConnection());
        IDataSet dataSet = DbUnitHelper.getDataSetFromFile(conn, TEST_DATA_FILENAME);
        InsertIdentityOperation.CLEAN_INSERT.execute(conn, dataSet);
    }

    @Test
    public void somethingToTest() {
        // do stuff...
    }
}

/**
 * My new custom TestExecutionListener
 */
public class DbunitLoadOnceTestExecutionListener extends AbstractTestExecutionListener {

    final String methodName = "beforeClassWithApplicationContext";

    @Override
    public void beforeTestClass(TestContext testContext) throws Exception {
        super.beforeTestClass(testContext);
        Class<?> clazz = testContext.getTestClass();
        Method m = null;
        try {
            m = clazz.getDeclaredMethod(methodName, ApplicationContext.class);
        }
        catch (Exception e) {
            throw new Exception("Test class must implement " + methodName + "()", e);
        }
        m.invoke(null, testContext.getApplicationContext());
    }
}
One other thought I had was possibly creating a static singleton class for holding a reference to the ApplicationContext and populating it from DbunitLoadOnceTestExecutionListener.beforeTestClass(). I could then retrieve that singleton reference from a standard @BeforeClass method defined on TestClass. My code above calling back into each TestClass just seems a little messy.
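A minimal sketch of that singleton-holder idea (entirely hypothetical names; not the approach I ended up using):

// hypothetical holder populated by the listener, read back from @BeforeClass
public final class ApplicationContextHolder {

    private static ApplicationContext context;

    private ApplicationContextHolder() {
    }

    // called from DbunitLoadOnceTestExecutionListener.beforeTestClass()
    public static void set(ApplicationContext ctx) {
        context = ctx;
    }

    // retrieved from a standard @BeforeClass method on the test class
    public static ApplicationContext get() {
        return context;
    }
}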
After the helpful feedback from Matt and JB, here is a much simpler solution that achieves the desired result:
/**
 * My Test class
 */
@RunWith(SpringJUnit4ClassRunner.class)
@TestExecutionListeners({DependencyInjectionTestExecutionListener.class, DirtiesContextTestExecutionListener.class, DbunitLoadOnceTestExecutionListener.class})
@ContextConfiguration(locations={"classpath:resources/spring/applicationContext.xml"})
public class TestClass {

    private static final String TEST_DATA_FILENAME = "Scenario-1.xml";

    // must be static
    private static volatile boolean isDataSetLoaded = false;

    // use the Qualifier to select a specific dataSource
    @Autowired
    @Qualifier("dataSourceXyz")
    private DataSource dataSource;

    /**
     * For performance reasons, we only want to load the DBUnit data set once per test class
     * rather than before every test method.
     *
     * @throws Exception
     */
    @Before
    public void before() throws Exception {
        if (!isDataSetLoaded) {
            isDataSetLoaded = true;
            IDatabaseConnection conn = new DatabaseConnection(dataSource.getConnection());
            IDataSet dataSet = DbUnitHelper.getDataSetFromFile(conn, TEST_DATA_FILENAME);
            InsertIdentityOperation.CLEAN_INSERT.execute(conn, dataSet);
        }
    }

    @Test
    public void somethingToTest() {
        // do stuff...
    }
}
The class DbunitLoadOnceTestExecutionListener is no longer required and has been removed. It just goes to show that reading up on all the fancy techniques can sometimes cloud your own judgement :o)
Not a specialist, but couldn't you call an instance method of your test object in prepareTestInstance() after having verified it implements the appropriate interface, and call this method only if it's the first time prepareTestInstance() is invoked with a test instance of this class? You would just have to keep a set of already seen classes:
@Override
public void prepareTestInstance(TestContext testContext) throws Exception {
    MyDbUnitTest instance = (MyDbUnitTest) testContext.getTestInstance();
    if (!this.alreadySeenClasses.contains(instance.getClass())) {
        instance.beforeClassWithApplicationContext(testContext.getApplicationContext());
        this.alreadySeenClasses.add(instance.getClass());
    }
}
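For completeness, a sketch of the surrounding pieces this fragment assumes; the MyDbUnitTest interface and the alreadySeenClasses set are not shown above, so this is a hypothetical filling-in:

// hypothetical interface each DBUnit test class would implement
public interface MyDbUnitTest {
    void beforeClassWithApplicationContext(ApplicationContext ctx) throws Exception;
}

// inside the custom listener
public class DbunitLoadOnceTestExecutionListener extends AbstractTestExecutionListener {

    // remembers which test classes have already had their data set loaded
    private final Set<Class<?>> alreadySeenClasses = new HashSet<Class<?>>();

    // the prepareTestInstance() override shown above goes here
}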
