I have two JDBC prepared statements in my program that UPDATE tables (in different areas of the program). When I run either statement, I can see the updates in MySQL when I query the table directly, so I know the update is happening. However, a table in the program that is connected to the DB does not show the updated results unless I stop the program and restart it; then the updates appear. I have other prepared statements that use INSERT INTO, and those changes show up in the tables right away (no stop/restart needed). I don't know if the problem lies in the UPDATE, in my code (the usual culprit), or in some setting I have wrong. I think I might be passing things wrong between method calls.
This is part of what it does: I have an income table, the user creates a new income listing, part of that listing gets sent to a FUNDS table (with the UPDATE statement), and another part gets sent to an INCOME table (with the INSERT INTO statement). Here is the code; thanks for any help and suggestions!
//Get user info
Income income = new Income();
Funds fund = new Funds();
income.setIncomeName(jTextField10.getText());
String budgetAmt = jTextField11.getText();
BigDecimal bAmt = new BigDecimal(budgetAmt);
income.setAmount(bAmt);
income.setDescription(jTextArea3.getText());
String iDate = jTextField12.getText();
java.util.Date utilDate = new SimpleDateFormat("MM-dd-yy", Locale.ENGLISH).parse(iDate);
java.sql.Date sqlDate = new java.sql.Date(utilDate.getTime());
income.setDateReceived(sqlDate);
s = jList10.getSelectedValue().toString();
income.setFundAllocation(s);
// Send info to be processed in FUND table with UPDATE - start of problem area
FundsSvcJDBCImpl fundImpl = new FundsSvcJDBCImpl();
Funds calculateFundAmt = fundImpl.calculateFundAmt(fund, income);
// Send info to be process in INCOME table - this part works
IncomeSvcJDBCImpl budgetImpl = new IncomeSvcJDBCImpl();
Income addIncome = budgetImpl.addIncome(income);
//This is the part of the FundsSvcJDBCImpl that uses the UPDATE statement
public Funds calculateFundAmt(Funds fund, Income income) throws Exception {
    BigDecimal fundTotal = BigDecimal.ZERO;
    Connection conn = getConnection();
    PreparedStatement pstmt1 = null;
    PreparedStatement pstmt2 = null;
    ResultSet rs2 = null;
    try {
        // Total the current amount for the selected fund
        String sql1 = "SELECT SUM(Amount) AS fundTotal FROM funds WHERE FundsName = ?";
        pstmt1 = conn.prepareStatement(sql1);
        pstmt1.setString(1, income.getFundAllocation());
        rs2 = pstmt1.executeQuery();
        while (rs2.next()) {
            BigDecimal current = rs2.getBigDecimal("fundTotal");
            if (current != null) {   // SUM() returns NULL when no rows match
                fundTotal = fundTotal.add(current);
            }
        }
        // Add the new income amount and write the new total back
        fundTotal = fundTotal.add(income.getAmount());
        String sql2 = "UPDATE funds SET Amount = ? WHERE FundsName = ?";
        pstmt2 = conn.prepareStatement(sql2);
        pstmt2.setBigDecimal(1, fundTotal);
        pstmt2.setString(2, income.getFundAllocation());
        pstmt2.executeUpdate();
    } finally {
        // Close resources in reverse order: result set and statements before the connection
        if (rs2 != null) rs2.close();
        if (pstmt1 != null) pstmt1.close();
        if (pstmt2 != null) pstmt2.close();
        if (conn != null) conn.close();
    }
    return fund;
}
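As an aside, if the goal is just to add the income amount to the stored fund total, the SELECT-then-UPDATE round trip can be collapsed into one statement. This is only a minimal sketch, not the original code; it assumes funds.Amount holds the running total, as the method above implies:

// Sketch only: apply the income amount as a delta in a single UPDATE
private void addToFundTotal(Connection conn, BigDecimal amount, String fundName) throws SQLException {
    String sql = "UPDATE funds SET Amount = Amount + ? WHERE FundsName = ?";
    try (PreparedStatement ps = conn.prepareStatement(sql)) {
        ps.setBigDecimal(1, amount);
        ps.setString(2, fundName);
        ps.executeUpdate();
    }
}

(The on-screen table will still only show the new value after its model is refreshed or re-queried; the database itself is already up to date, as noted in the question.)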
All,
I am sending a list of IDs (more than the max of 1000 allowed by Oracle) in the SQL WHERE clause using Dapper 1.50 and Oracle 18. Since there is a limit on the number of items in the IN clause, I decided to do a sub-query as below, but I can't get this to work. Can someone shed some light on this? I will always be sending more than 1000 items as IDs. The second SQL statement is not working (it says invalid SQL).
public static List<Notes> GetNotes()
{
List<Notes> notes = new List<Notes>();
try
{
using (var connection = new OracleConnection(OracleConnectionString))
{
connection.Open();
string idCommand = @"select *
                     from (select id
                           from note_text
                           order by 1 desc)
                     where rownum <= 2000";
var notesList = connection.Query<Notes>(idCommand);
var noteIds1 = notesList.Select(i => i.id).ToList();
string command = @"select
tnt.id as id,
tnt.NOTE_TXT
from
note_text tnt
(
select
note_id
from
dual
where
note_id in :noteIds1
)x where x.note_id = tnt.id";
var info = connection.Query<Notes>(command, new
{
noteIds1
});
notes = info.ToList();
}
}
catch (Exception ex)
{
}
return notes;
}
Where does this list of IDs originate? I'd get them into a file that can be accessed as an external table. Then, instead of
select ...
from ...
where id in (<list that is too long>)
you would
select ...
from ...
where id in (select id from id_external_table)
Then you are not limited by the size of the IN list.
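For what it's worth, here is a rough sketch of that idea. It is shown with plain JDBC and Oracle SQL rather than Dapper, and the directory object, file name, and column type are all assumptions for illustration:

// Sketch only: expose the IDs (written one per line to ids.csv in the EXT_DIR directory)
// as an external table, then join against it instead of passing a huge IN list.
try (Statement stmt = conn.createStatement()) {
    stmt.execute(
        "CREATE TABLE id_external_table (id NUMBER) " +
        "ORGANIZATION EXTERNAL (" +
        "  TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_dir" +
        "  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE FIELDS TERMINATED BY ',')" +
        "  LOCATION ('ids.csv'))");
    try (ResultSet rs = stmt.executeQuery(
            "select tnt.id, tnt.NOTE_TXT from note_text tnt " +
            "where tnt.id in (select id from id_external_table)")) {
        while (rs.next()) {
            System.out.println(rs.getLong("id") + ": " + rs.getString("NOTE_TXT"));
        }
    }
}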
This issue occurred during a JDBC batch insert. I queried from one Oracle datasource, parsed the result set, and then inserted into another Oracle datasource. I got the connection metadata and printed the current username along with the URL, and both are valid.
But when it got to the batch update, I got an ORA-00942 exception. I'm pretty sure all of the above works fine in the database. Has anyone encountered this exception, and can you give me some advice?
EDIT:
OK, for example: I have a table named photos under REMOTE_USER, and I query from it. That gives me a result set, which I parse and then INSERT into LOCAL_USER.photos. I did query LOCAL_USER.photos from the account I log on with in PL/SQL Developer. The interesting thing was that I could run the SELECT but not the INSERT. Below is part of the code.
conn = datasource.getConnection(); // notice that it was target datasource
DatabaseMetaData connMetaData = conn.getMetaData();
String userName = connMetaData.getUserName();
resultSet = ds.getResultSet();
ResultSetMetaData metaData = resultSet.getMetaData();
int count = metaData.getColumnCount();
String insertSql = generateInsertSql(count, metaData, userName);
// this was generated through metaData, the output should be
// "insert into LOCAL_USER.photos(col1,col2) values(?,...)"
logger.error("insert clause is {}", insertSql);
ps = conn.prepareStatement(insertSql);
conn.setAutoCommit(false);
while (resultSet.next()) { // this was the original datasource
    stageTotalNum++;
    for (int i = 1; i <= count; i++) {
        Object object = resultSet.getObject(i);
        dealClobColumn(ps, i, object);
    }
    ps.addBatch();
    if (stageTotalNum % 500L == 0L) {
        ps.executeBatch(); // throws batchupdateexception.
        ps.clearBatch();
        conn.commit();
    }
}
ps.executeBatch();
conn.commit();
It turned out to be the BLOB-type column, which I didn't handle the right way.
First I queried the original datasource and got the BLOB column from the result set with
rs.getObject(index). Then I inserted that value into the target datasource with ps.setObject. Of course that didn't work at all, so I changed it to the following:
ps.setBlob(index, rs.getBlob(index));
Although that worked fine in my own environment, when the application ran on the remote server it kept complaining about 'table or view does not exist'. The third version is:
ps.setBinaryStream(index, rs.getBlob(index).getBinaryStream());
OK, this time it worked on both my PC and the remote server. Thanks to @codeLover's advice and link, it really helped me and saved my time. Appreciate it!
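In case it helps anyone following along, here is a minimal sketch of the column-copy step described above; the helper name and the instanceof check are my own assumptions, not the original dealClobColumn:

// Sketch: copy one column from the source ResultSet to the target PreparedStatement,
// streaming BLOBs instead of passing them across datasources as generic objects.
private static void copyColumn(ResultSet rs, PreparedStatement ps, int i) throws SQLException {
    Object value = rs.getObject(i);
    if (value instanceof Blob) {
        Blob blob = rs.getBlob(i);
        ps.setBinaryStream(i, blob.getBinaryStream(), blob.length()); // the approach that worked above
    } else {
        ps.setObject(i, value);
    }
}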
This is my first time playing with WCF in Visual Studio (2015), and I am running into some problems.
I have my database hosted on AppHarbor. I managed to insert new items into the database with:
public Test Insert(Test testTable)
{
    SqlConnection conGet = new SqlConnection(connectionString);
    SqlCommand cmd = new SqlCommand("INSERT INTO TestTable (testID, testName) VALUES (@testID, @testName)", conGet);
    cmd.Parameters.AddWithValue("@testID", testTable.testID);
    cmd.Parameters.AddWithValue("@testName", testTable.testName);
    conGet.Open();
    cmd.ExecuteNonQuery();
    conGet.Close();
    return testTable;
}
PROBLEM: when I tried to select from it:
public Test GetData(Test test)
{
SqlConnection conPut = new SqlConnection(connectionString);
SqlCommand cmd = new SqlCommand("SELECT * FROM TestTable", conPut);
conPut.Open();
SqlDataReader rd = cmd.ExecuteReader();
if (rd.Depth>0)
{
while (rd.Read())
{
test.testID = (int)rd["testID"];
test.testName = rd["testName"].ToString();
}
}
else
{
Console.WriteLine("No rows found.");
}
conPut.Close();
return test;
}
rd (SqlDataReader) is not reading anything and just returns a null value, even though the test table has rows in it.
Can you give me some pointers? I have tried several methods found online with no luck. Thank you!
Remove that test for Depth. The SQL Server data provider doesn't support that property; it always returns zero.
Source: MSDN, SqlDataReader.Depth:
The outermost table has a depth of zero. The .NET Framework Data Provider for SQL Server does not support nesting and always returns zero.
If you want to test for the condition of no rows, then use the HasRows property:
SqlDataReader rd = cmd.ExecuteReader();
if (rd.HasRows)
{
while (rd.Read())
{
test.testID = (int)rd["testID"];
test.testName = rd["testName"].ToString();
}
}
else
{
Console.WriteLine("No rows found.");
}
I have code that checks an inquery database for a user.
If the user does not exist, then the code will create a new user in Contact.
Here is only part of the code:
newcontact = [SELECT Id, FirstName FROM Contact WHERE Contact.Email = :inquery.email__c];
if (newcontact.size() == 0) {
    Account[] aa = [SELECT Id FROM Account WHERE Name = :inquery.Institution__c];
    contact = new Contact();
    contact.FirstName = inquery.First_Name__c;
    contact.LastName = inquery.Last_Name__c;
    contact.Email = inquery.email__c;
    contact.AccountId = aa.Id;
    try {
        insert contact; // inserts the new record into the database
    } catch (DMLException e) {
        ApexPages.addMessage(new ApexPages.message(ApexPages.severity.ERROR, 'Error creating new contact'));
        return null;
    }
I am trying to associate that user with an existing Account, but the following line gives me an error:
contact.AccountId = aa.Id;
The error is:
Initial term of field expression must be a concrete SObject: LIST<Account> at line
And aa.size() returns 1, as it should, because the account exists.
Can someone please tell me what's wrong?
Thanks
The line contact.AccountId = aa.get(0).Id; will fail if your query returns 0 rows. Make sure to wrap your code in an if (aa.size() > 0) check to ensure proper execution in all cases.
OK, I fixed it as follows:
contact.AccountId = aa.get(0).Id;
Best
This query returns the record with the minimum create timestamp (CRTE_TSTP) for the person PERS_ID when I run it in SQL Developer, but the same query does not return any rows through a Java JDBC connection.
Can you please help?
select PERS_ID,CODE,BEG_DTE
from PRD_HIST H
where PERS_ID='12345'
and CODE='ABC'
and CRTE_TSTP=(
select MIN(CRTE_TSTP)
from PRD_HIST S
where H.PERS_ID=S.PERS_ID
and PERS_ID='12345'
and EFCT_END_DTE is null
)
Java Code
public static List<String[]> getPersonwithMinCreateTSTP(final String PERS_ID, final String Category, final Connection connection) {
    final List<String[]> personRecords = new ArrayList<String[]>();
    ResultSet resultSet = null;
    Statement statement = null;
    String PersID = null;
    String ReportCode = null;
    String effBegDate = null;
    try {
        statement = connection.createStatement();
        final String query = "select PERS_ID,CODE,EFCT_BEG_DTE from PRD_HIST H where PERS_ID='"+PERS_ID+"'and CODE='"+Category+"'and CRTE_TSTP=(select MIN(CRTE_TSTP) from PRD_HIST S where H.PERS_ID=S.PERS_ID and PERS_ID='"+PERS_ID+"' and EFCT_END_DTE is null)";
        if (!statement.execute(query)) {
            // print error
        }
        resultSet = statement.getResultSet();
        while (resultSet.next()) {
            PersID = resultSet.getString("PERS_ID");
            ReportCode = resultSet.getString("CODE");
            effBegDate = resultSet.getString("EFCT_BEG_DTE");
            final String[] personDetails = {PersID, ReportCode, effBegDate};
            personRecords.add(personDetails);
        }
    } catch (SQLException sqle) {
        CTLoggerUtil.logError(sqle.getMessage());
    } finally { // finally is added to close the result set and statement
        try {
            if (resultSet != null) {
                resultSet.close();
            }
            if (statement != null) {
                statement.close();
            }
        } catch (SQLException e) {
            // print error
        }
    }
    return personRecords;
}
Print out your SQL SELECT statement from your Java program and paste it into SQL*Plus to see what is happening. It's likely you're not getting your variables set to what you think you are. In fact, you're likely to see the error when you print out the SELECT statement without even running it - lower-case values when upper case is needed, etc.
If you still can't see it, post the actual query from your Java code here.
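Along those lines, a quick sketch of that debugging step, with the same query moved to bind variables via PreparedStatement (PERS_ID, Category, and connection are the names from the question; the rest is illustrative):

// Sketch: log the exact SQL, then run it with bind variables instead of string concatenation.
final String query =
        "select PERS_ID, CODE, EFCT_BEG_DTE from PRD_HIST H "
      + "where PERS_ID = ? and CODE = ? and CRTE_TSTP = "
      + "(select MIN(CRTE_TSTP) from PRD_HIST S "
      + " where H.PERS_ID = S.PERS_ID and PERS_ID = ? and EFCT_END_DTE is null)";
System.out.println(query); // compare against the version that works in SQL Developer

try (PreparedStatement ps = connection.prepareStatement(query)) {
    ps.setString(1, PERS_ID);
    ps.setString(2, Category);
    ps.setString(3, PERS_ID);
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            System.out.println(rs.getString("PERS_ID") + " " + rs.getString("CODE")
                    + " " + rs.getString("EFCT_BEG_DTE"));
        }
    }
}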
I came here with a similar problem - just thought I'd post my solution for others following - I hadn't run COMMIT after the inserts I'd made (via SQL*Plus) - doh!
If the database table has records but the JDBC client can't retrieve them, it usually means the JDBC user doesn't have SELECT privileges on the table. Run a grant like the one below from the table owner (adjust the table and user names):
grant all on emp to hr;