Using BulkInsert with OracleDB

I have a problem with BulkInsert on my OracleDB. I need to insert a couple of thousand objects, so I decided to use EF.BulkInsert.Oracle (added via NuGet), which is an extension of EF6.BulkInsert for Oracle.
private IOracleDbContext _context; // class property

// method body:
EF6.BulkInsert.ProviderFactory.Register<EF6.BulkInsert.Providers.OracleBulkInsertProvider>("BulkInsertProvider");
using (var context = (OracleDbContext)_context)
{
    using (var dbContextTransaction = context.Database.BeginTransaction())
    {
        try
        {
            // Preparing list of objects
            var opt = new EF6.BulkInsert.BulkInsertOptions();
            opt.Connection = context.Database.Connection;
            await context.BulkInsertAsync<ObjectType>(ObjectList, opt);
            await context.SaveChangesAsync();
            dbContextTransaction.Commit();
            stopwatch.Stop();
        }
        catch (Exception ex)
        {
            dbContextTransaction.Rollback();
            throw ex;
        }
    }
}
Without passing opt (the BulkInsertOptions object) as a parameter to BulkInsert, it tries to connect to SQL Server (which doesn't exist, so I get a connection failure). After adding the options with the connection, I get an exception saying the connection is already part of a transaction :/
The traditional way (_context.TableName.Add()) works, of course, but it takes an unacceptable amount of time.
Any idea what I did wrong here?

I found a better way (BulkInsert still does not cooperate). I used array binding,
mentioned here.
It reduced the insert time from ~6 minutes to ~1-1.5 seconds :D (7,770 records)

Related

How to transmit data when it is ready through a REST call with Spring Boot?

I have an SSH manager to execute (bash) scripts on a server. It contains a commandWithContinousRead(String command, Consumer<String> consumer) method. Whenever echo is called in the bash script, the output is consumed by the consumer. I want to extend this with Spring Boot and an HTTP call: when a client sends a request, the server streams the data from the bash script as it becomes ready, and the client can print it out.
I know about Server-Sent Events; however, I feel that is mostly for events and usually uses multiple resources on an API.
Additionally, I tried searching for streaming topics, but had no success. I did find StreamingResponseBody from Spring, but it collects all the data and then sends it all at once.
I used Postman for testing; maybe it cannot handle streaming? If so, how do I test this?
Example:
#!/bin/bash
# Scriptname: stream-this.sh
echo "Starting line"
sleep 4
echo "Middle line"
sleep 4
echo "End line"
Request handler using commandWithContinousRead; it prints everything at once after eight seconds.
@RequestMapping(value = "/stream-this", method = RequestMethod.POST,
        produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public ???? streamScript() {
    StreamingResponseBody stream = out -> {
        sshManager.commandWithContinousRead("bash /scripts/stream-this.sh", echo -> {
            try {
                byte[] bytes = echo.getBytes(StandardCharsets.UTF_8);
                out.write(bytes);
                System.out.println(echo);
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
    };
    return new ResponseEntity<>(stream, HttpStatus.OK);
}
Implementation of the commandWithContinousRead function:
public void commandWithContinousRead(String command, Consumer<String> consumer) {
    SSHClient client = buildClient();
    try (Session session = client.startSession()) {
        Session.Command cmd = session.exec(command);
        BufferedReader br = new BufferedReader(
                new InputStreamReader(cmd.getInputStream(), StandardCharsets.UTF_8));
        String line;
        while ((line = br.readLine()) != null) {
            consumer.accept(line);
        }
        br.close();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            client.disconnect();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Now that you have posted the commandWithContinousRead method, everything looks correct. Also, you've just now stated that you're testing with Postman, and that's definitely a problem: Postman doesn't support streaming responses.
https://github.com/postmanlabs/postman-app-support/issues/5040
It's always a good idea to programmatically unit and integration test your code. A simple unit test doesn't even need to use Spring or a real SSH connection (run the bash script locally in the test). The unit test would just be testing the logic of your Consumer, and would let you know that the reading of the output and the bash script itself aren't blocking. Ideally, you would use JUnit, but here's a simple test class that I put together that shows what I mean.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

public class Test {

    // This would be an @Test method instead of a main
    public static void main(String... args) {
        commandWithContinousRead("bash stream-this.sh", echo -> {
            byte[] bytes = echo.getBytes(StandardCharsets.UTF_8);
            // assert statements go here
            System.out.println("In main -- " + echo);
        });
    }

    public static void commandWithContinousRead(String command, Consumer<String> consumer) {
        try {
            Process process = Runtime.getRuntime().exec(command);
            BufferedReader br = new BufferedReader(new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = br.readLine()) != null) {
                consumer.accept(line);
            }
            br.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
An integration test would actually set up Spring and would go through the endpoint, thereby testing in the same manner that the client/browser would. Commonly, this is done using @WebMvcTest and MockMvc's async support. You could choose either to mock the SSH client, or to have a server set up explicitly so your actual SSH client can connect to it (the second option would expose/eliminate issues related to the SSH connection). This kind of test would expose issues with the Spring setup/streaming response. You would need to set an artificial timeout on your MockMvc after, say, 5 seconds, and use a new MockMvc after 9 seconds. That would allow you to see that after 5 seconds you've received the first echo, and after 9 you have the whole expected response. A good starting point for you would be to look at https://www.tabnine.com/code/java/methods/org.springframework.test.web.servlet.result.RequestResultMatchers/asyncStarted
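To make that concrete, here is a rough sketch of what such a test could look like. It is only an illustration under assumptions: the controller class (StreamController), the SshManager bean name, and the way the SSH call is stubbed are all hypothetical; only the /stream-this endpoint is taken from the question.
import static org.hamcrest.Matchers.containsString;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.doAnswer;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.asyncDispatch;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.request;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import java.util.function.Consumer;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.MvcResult;

// Hypothetical test slice; StreamController and SshManager are assumed names.
@WebMvcTest(StreamController.class)
class StreamControllerIT {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private SshManager sshManager;

    @Test
    void streamsScriptOutput() throws Exception {
        // Stub the SSH call so the Consumer receives two echoes immediately.
        doAnswer(invocation -> {
            Consumer<String> consumer = invocation.getArgument(1);
            consumer.accept("Starting line");
            consumer.accept("End line");
            return null;
        }).when(sshManager).commandWithContinousRead(anyString(), any());

        // The request should switch to async processing as soon as the
        // StreamingResponseBody is returned.
        MvcResult result = mockMvc.perform(post("/stream-this"))
                .andExpect(request().asyncStarted())
                .andReturn();

        // Dispatching the async result lets us assert on the streamed content.
        mockMvc.perform(asyncDispatch(result))
                .andExpect(status().isOk())
                .andExpect(content().string(containsString("Starting line")));
    }
}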
Having passed those two levels of tests, you would then begin to suspect the client, which in this case is Postman. If possible, try to use the actual browser(s) or clients that will be running your code. It may turn out that streaming is not an option for you.
Please post the implementation of commandWithContinousRead
It could be a fundamental problem where the script that is echoing and sleeping is running on the same thread as the code that is supposed to read the echoes and print them out. I.e., you're blocking while you wait for the bash script itself to run, which would explain the 8-second delay before getting any output. Also, what type does commandWithContinousRead return? Depending on how you're "reading" the echoes in that method, you could be blocking there too. It's hard to say with 100% certainty without seeing the code for commandWithContinousRead.
Your return type will be a ResponseEntity<StreamingResponseBody> (to fill in the ????)
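For clarity, a sketch of the question's handler with that return type filled in (the body is elided; everything else is as in the question):
@RequestMapping(value = "/stream-this", method = RequestMethod.POST,
        produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public ResponseEntity<StreamingResponseBody> streamScript() {
    StreamingResponseBody stream = out -> {
        // write the echoes to out, as in the question
    };
    return new ResponseEntity<>(stream, HttpStatus.OK);
}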
Okay, I came up with a solution that worked. As Pickled Brain mentioned, the main problem was Postman not working with streaming. Also, I went back to trying SSE in a single call, and I did so by running the bash script in another thread. Additionally, I created an SSE client in Node.js for testing purposes, and it worked flawlessly.
Function to run the script and place it in another thread:
private SseEmitter runScript() {
    SseEmitter emitter = new SseEmitter(-1L); // -1L = no timeout
    ExecutorService sseMvcExecutor = Executors.newSingleThreadExecutor();
    sseMvcExecutor.execute(() -> {
        try {
            shellManager.commandWithContinousRead("bash scriptname", s -> {
                SseEmitter.SseEventBuilder event = SseEmitter.event().name("message").data(s);
                try {
                    emitter.send(event);
                    System.out.println(s);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            });
            emitter.send(SseEmitter.event().name("close").data(""));
        } catch (IOException e) {
            e.printStackTrace();
        }
        emitter.complete();
    });
    return emitter;
}
SSE Client:
const EventSource = require('eventsource'); // npm install eventsource

const url = 'yoururl';
var es = new EventSource(url);

es.onopen = function (ev) {
    console.log("OPEN");
    console.log(ev);
};

es.onmessage = function (ev) {
    console.log("MESSAGE");
    console.log(ev.data);
};

es.addEventListener('close', function () {
    es.close();
    console.log('closing!');
});

es.onerror = function (ev) {
    console.log("ERROR");
    console.log(ev);
    es.close();
};

process.on('SIGINT', () => {
    es.close();
    console.log(es.CLOSED);
});

Replay a particular type of event from eventstore

I am currently using Event Store to handle my events. I now need to replay a particular type of event, as I have made changes in the way those events are subscribed to and written to the DB.
Is this possible? If so, how can it be done? Thanks.
You cannot tell EventStore to replay a specific event onto a persistent subscription, because the point of the persistent subscription is to keep state for the subscribers.
To achieve this kind of fix you would really need a catch-up application to do the work.
And really, if you think about it: if you replayed ALL the events into a new database, then you would have the correct data in there.
So I have a console application that reuses the same logic as the persistent connection, but the only differences are:
I change the target database connection string, so this would be a new database or collection (not the broken one)
It connects to EventStore and replays all the events from the start
It rebuilds the entire database to the correct state
Switch the business over to the new database
This is the point of EventStore: you just replay all the events to build any database at any time and it will be correct.
Your persistent connections deal with new, incoming events and apply updates.
If you enable the $by_event_type projection, then you can access that projection stream under
/streams/$et-{event-type}
https://eventstore.org/docs/projections/system-projections/index.html
Then you can read it using the .NET API if you wish.
Here is some code to get you started:
private static T GetInstanceOfEvent<T>(ResolvedEvent resolvedEvent) where T : BaseEvent
{
    var metadataString = Encoding.UTF8.GetString(resolvedEvent.Event.Metadata);
    var eventClrTypeName = JObject.Parse(metadataString).Property(EventClrTypeHeader).Value;
    var @event = JsonConvert.DeserializeObject(Encoding.UTF8.GetString(resolvedEvent.Event.Data), Type.GetType((string) eventClrTypeName));
    if (!(@event is BaseEvent))
    {
        throw new MessageDeserializationException((string) eventClrTypeName, metadataString);
    }
    return @event as T;
}

private static IEventStoreConnection GetEventStoreConnection()
{
    var connectionString = System.Configuration.ConfigurationManager.ConnectionStrings["EventStore"].ConnectionString;
    var connection = EventStoreConnection.Create(connectionString);
    connection.ConnectAsync().Wait();
    return connection;
}

private static string GetStreamName<T>() where T : BaseEvent
{
    return "$et-" + typeof(T).Name;
}
And to read the events you can use this code snippet:
StreamEventsSlice currentSlice;
long nextSliceStart = StreamPosition.Start;
const int sliceCount = 200;

do
{
    currentSlice = await esConnection.ReadStreamEventsForwardAsync(streamName, nextSliceStart, sliceCount, true);
    foreach (var @event in currentSlice.Events)
    {
        var myEvent = GetInstanceOfEvent<OrderMerchantFeesCalculatedEvent>(@event);
        TransformEvent(myEvent);
    }
    nextSliceStart = currentSlice.NextEventNumber;
} while (currentSlice.IsEndOfStream == false);

Using JAX-RS and trying to DELETE an item

I am currently working in enterprise Java and I'm a newbie. I am trying to create a method that should delete a selected item from a data table. My project uses graphical user interface elements from PrimeFaces ("http://www.primefaces.org/showcase/").
The deletion is made through a web service.
This is the method I have created so far:
public boolean delete(String articleId) {
    Client client = ClientBuilder.newClient();
    WebTarget target = client.target(DELETE_URL); // this is a String
    // TODO call ws method delete
    try {
        target.request()....;
    } catch (Exception ex) {
        LOGGER.error("Delete Article Error ", ex);
    }
    return true;
}
Could you tell me how I can handle the deletion in an appropriate way?
All the best!
In your case, the following should do the trick:
target.request().delete(Response.class)
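If it helps, here is a rough sketch of how that call could fit into the delete method from the question. It is only an illustration: DELETE_URL and LOGGER come from the question, while the idea that the article id is appended as a path segment is an assumption, so adjust it to match your web service.
// Needs: javax.ws.rs.client.Client, javax.ws.rs.client.ClientBuilder, javax.ws.rs.core.Response
public boolean delete(String articleId) {
    Client client = ClientBuilder.newClient();
    try {
        Response response = client.target(DELETE_URL)
                .path(articleId)   // assumption: the id is a path segment of the resource
                .request()
                .delete();         // issues the HTTP DELETE
        // Treat any 2xx status (200, 202, 204, ...) as success
        return response.getStatusInfo().getFamily() == Response.Status.Family.SUCCESSFUL;
    } catch (Exception ex) {
        LOGGER.error("Delete Article Error ", ex);
        return false;
    } finally {
        client.close();
    }
}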

How to close refcursor in spring when using simplejdbccall

I am using Spring's SimpleJdbcCall to call Oracle stored procedures, and I am using Oracle 11g.
I stumbled upon a couple of posts which suggest there might be a memory leak, as the ref cursors are not properly closed by Spring.
Is there any way to explicitly close the cursor while using Spring SimpleJdbcCall, or is increasing the Oracle OPEN_CURSORS limit the only way out?
I am planning to scale up my application to handle around one million transactions every hour. Any suggestions would be helpful.
Actually, there is no such issue with Spring JDBC. It closes all resources in a finally block after each execute. SimpleJdbcCall uses JdbcTemplate:
public <T> T execute(CallableStatementCreator csc, CallableStatementCallback<T> action)
        throws DataAccessException {
    try {
        ...
    }
    catch (SQLException ex) {
        ...
    }
    finally {
        if (csc instanceof ParameterDisposer) {
            ((ParameterDisposer) csc).cleanupParameters();
        }
        JdbcUtils.closeStatement(cs);
        DataSourceUtils.releaseConnection(con, getDataSource());
    }
}
The same for ResultSet OUT parameters:
protected Map<String, Object> processResultSet(ResultSet rs, ResultSetSupportingSqlParameter param) throws SQLException {
    ....
    finally {
        JdbcUtils.closeResultSet(rs);
    }
    return returnedResults;
}
On the other hand, I have extensive experience with Spring JDBC and Oracle in high-load systems, and I can say that we noticed quite a few open resources on Oracle at peak loads, but they were released properly afterwards.
Although we used a JBoss pooled DataSource and its TransactionManager.
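For reference, here is a minimal sketch of how a REF CURSOR OUT parameter can be declared with SimpleJdbcCall so that the cursor is mapped and then closed by Spring in the finally blocks shown above. The procedure name, parameter name, and column names are hypothetical.
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import oracle.jdbc.OracleTypes;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;

public class ProductDao {

    private final SimpleJdbcCall getProductsCall;

    public ProductDao(JdbcTemplate jdbcTemplate) {
        // Hypothetical procedure with a single REF CURSOR OUT parameter.
        RowMapper<Map<String, Object>> rowMapper = (rs, rowNum) -> {
            Map<String, Object> row = new HashMap<>();
            row.put("id", rs.getInt("id"));
            row.put("name", rs.getString("name"));
            return row;
        };
        this.getProductsCall = new SimpleJdbcCall(jdbcTemplate)
                .withProcedureName("get_all_products")
                .declareParameters(new SqlOutParameter("p_cursor", OracleTypes.CURSOR, rowMapper));
    }

    @SuppressWarnings("unchecked")
    public List<Map<String, Object>> findAll() {
        // JdbcTemplate maps the cursor rows, then closes the ResultSet, the
        // CallableStatement and the Connection in its finally blocks.
        Map<String, Object> out = getProductsCall.execute();
        return (List<Map<String, Object>>) out.get("p_cursor");
    }
}
Nothing needs to be closed manually here, which matches the point above that Spring releases these resources itself.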
I use CallableStatement directly, and I can release statements and connections quickly and safely. Try both methods and measure memory consumption; this worked perfectly for me to solve memory consumption and connection retention problems that were causing a lot of waiting and rejected connections in the applications.
try {
    log.info("**** RepositoryPSostgres.getAllProducts ******** ");
    Connection conn = jdbcTemplate.getDataSource().getConnection();
    conn.setAutoCommit(false);

    // Procedure call.
    CallableStatement proc = conn.prepareCall("{? = call get_all_products() }");
    proc.registerOutParameter(1, Types.OTHER);
    proc.execute();
    ResultSet results = (ResultSet) proc.getObject(1);

    proc.close();
    proc.isClosed();
    conn.close();

    ArrayList<Products> resp = new ArrayList<Products>();
    while (results.next()) {
        Products resp1 = new Products();
        resp1.setId(results.getInt("id"));
        resp1.setName((String) results.getString("name"));
        resp1.setPrice((BigDecimal) results.getBigDecimal("price"));
        resp.add(resp1);
        log.info("***" + results.getInt("id") + "***** ");
        log.info("***" + results.getString("name") + "***** ");
        log.info("***" + results.getBigDecimal("price") + "***** ");
    }
    results.close();
    return resp;
} catch (Exception e) {
    e.printStackTrace();
    log.error(new StringBuffer("Error en transaccion en saldo CashPooling : ").append(e.getLocalizedMessage()).toString());
    return null;
}

Oracle sessions stay open after closing connection

While testing a new application, we came across an issue where a stored proc sometimes takes over 1 minute to execute and causes a timeout. It was not one stored proc in particular; it could be any of them.
Trying to reproduce the issue, I've created a small (local) test app that calls the same stored proc in different threads (code below).
Now it seems that the Oracle sessions are still there, inactive, and the CPU of the Oracle server hits 100%.
I use System.Data.OracleClient.
I'm not sure if one is related to the other, but it slows down the time needed to get an answer from the database.
for (int index = 0; index < 1000; ++index)
{
    ThreadPool.QueueUserWorkItem(GetStreet, index);
    _runningThreads++;
    WriteThreadnumber(_runningThreads);
}

private void GetStreet(object nr)
{
    const string procName = "SPCK_ISU.GETPREMISESBYSTREET";
    DataTable dataTable = null;
    var connectionstring = ConfigurationManager.ConnectionStrings["CupolaDB"].ToString();
    try
    {
        using (var connection = new OracleConnection(connectionstring))
        {
            connection.Open();
            using (var command = new OracleCommand(procName, connection))
            {
                // Fill parameters
                using (var oracleDataAdapter = new OracleDataAdapter(command))
                {
                    // Fill datatable
                }
            }
        }
    }
    finally
    {
        if (dataTable != null)
            dataTable.Dispose();
    }
}
EDIT:
I just had the DBA count the open sessions, and there are 105 sessions that stay open and inactive. After closing my application, the sessions are removed.
The problem is solved.
We hired an Oracle expert to take a look at this, and the problem was caused by some underlying stored procedures that took a while to execute and consumed a lot of CPU.
After the necessary tuning, everything runs smoothly.

Resources