Creating event-driven SQL scripts [closed] - oracle

I am creating a database that stores GPS data. As soon as the database is updated with a data point, I want the server to check whether that point is within a certain area and send a message or update another database (I haven't decided what action it should take yet). Is this kind of event-driven operation possible in PL/SQL? I am only familiar with passive querying and running scheduled scripts.

Yes, there is such a feature: database triggers. On insert or update of the data (actually there are many more event types) you can check whether some conditions are met and call a PL/SQL procedure to handle the event.
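For example, a row-level trigger along these lines could react to each new point (the table, columns, bounding box and the notification procedure are all assumptions, just to show the shape of it):

    CREATE OR REPLACE TRIGGER trg_gps_point_check
      AFTER INSERT OR UPDATE ON gps_points      -- hypothetical GPS table
      FOR EACH ROW
    BEGIN
      -- crude bounding-box check; replace with your real geofence logic
      IF :NEW.latitude  BETWEEN 40.0 AND 41.0
         AND :NEW.longitude BETWEEN -74.5 AND -73.5 THEN
        -- hypothetical procedure that sends the message or updates the other database
        handle_point_in_area(:NEW.point_id, :NEW.latitude, :NEW.longitude);
      END IF;
    END;
    /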

Related

running an oracle sql command without waiting for result [closed]

I have an Oracle database which I am accessing from Delphi with an ODAC component.
I would like to populate a table using a SELECT statement and don't want to wait for the SQL to complete before moving on to the next Delphi command.
I have tried using TOraSQL with non-blocking set to true, but although the program moves on without any delay, the SQL doesn't populate the table. Any ideas?
I don't have any Delphi-related ideas (as I don't know it), but - as far as Oracle is concerned - you could:
put that code into a stored procedure
schedule a job (using DBMS_SCHEDULER, or the older but simpler DBMS_JOB) from Delphi to run right now (see the sketch below)
the job (i.e. the procedure) would run in the background, while ...
... your Delphi code goes on
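A minimal sketch of that idea (procedure, table and column names are invented for illustration):

    -- wrap the slow statement in a stored procedure
    CREATE OR REPLACE PROCEDURE populate_target AS
    BEGIN
      INSERT INTO target_table (col1, col2)
        SELECT col1, col2
          FROM source_table;
      COMMIT;
    END;
    /

    -- from Delphi, submit a one-off background job that starts immediately
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name   => 'POPULATE_TARGET_JOB',
        job_type   => 'STORED_PROCEDURE',
        job_action => 'POPULATE_TARGET',
        enabled    => TRUE,      -- starts right away since no schedule is given
        auto_drop  => TRUE);     -- the job removes itself once it completes
    END;
    /

The CREATE_JOB call returns as soon as the job is submitted, so the Delphi session is free to continue while the insert runs in the scheduler's background process.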

How to store constant data in DB [closed]

We have some queries that always return the same result.
For example, the query that retrieves all user roles of our system. The result is constant for as long as the server is up.
I'm looking for the best way to store such data.
I'm thinking about calling a table-creation script on server start-up.
Or writing a stored function which creates and fills a table if it doesn't exist and retrieves the data if it does.
Maybe there are better alternatives?
Sounds like you should look into Materialised views.
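For example, a materialized view over a (hypothetical) roles query, refreshed on demand since the data only changes when the system configuration does:

    CREATE MATERIALIZED VIEW mv_user_roles
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
      SELECT u.user_id, r.role_name             -- hypothetical tables and columns
        FROM users u
        JOIN user_roles ur ON ur.user_id = u.user_id
        JOIN roles r       ON r.role_id  = ur.role_id;

    -- refresh manually whenever the underlying data actually changes
    BEGIN
      DBMS_MVIEW.REFRESH('MV_USER_ROLES', 'C');
    END;
    /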

Lotus Notes to Oracle database migration [closed]

I have an NSF file of a Lotus database. The objective is to give up the legacy Lotus Notes database and migrate it to a relational Oracle database. Does anyone have expertise in this area who can give a step-wise process to carry out the migration from Lotus Notes to an Oracle DB?
Ten years ago I integrated Domino and Oracle and, well, it was pretty impressive.
I googled "migrate from Domino to Oracle" and didn't find much more than LEI (or DECS), which allow a connection of data between the two systems.
Some steps:
1. Analyze the NSF: size (MB or GB?), number of forms/views, logic in code (my 5 cents: find someone who really uses the DB and can explain what they use in it!)
2. Forms/views will become tables and queries in Oracle (see the sketch below)
3. Data migration: all text, dates, etc. will be straightforward, BUT attachments, rich text and inlined images will be painful
4. Logic: well, you will have to rewrite all the @Formula/LotusScript/XPages code in J2EE or something else
Read also http://searchdomino.techtarget.com/answer/Migrating-from-Domino-to-Java-and-Oracle
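As an illustration of step 2, a single Notes form might end up as an Oracle table roughly like this (form and field names are invented; rich text and attachments typically land in CLOB/BLOB columns, which is where the pain in step 3 comes from):

    -- hypothetical mapping of a Notes "Contact" form to an Oracle table
    CREATE TABLE contact (
      doc_unid    VARCHAR2(32) PRIMARY KEY,  -- the Notes document UNID
      full_name   VARCHAR2(200),
      created_on  DATE,
      body_text   CLOB,                      -- flattened rich-text body
      attachment  BLOB                       -- or a child table for multiple attachments
    );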

Insert into not working but query does [closed]

My issue is that I have an INSERT INTO inside a stored procedure, and sometimes it says "No data found". The problem is that I know there is data that can be selected with those criteria.
Also, the weird part is that I'm sure the data is there, because I use dbms_output.put_line to print the query to the console with the values of the variables used, so I know it's exactly the same query executed inside the stored procedure, and if I execute the printed query it does return data.
Any idea of what's happening?
Thank you.
I managed to find what was happening. It seems that Oracle had an issue when working with dates: the query inside the procedure was receiving some dates, and even though it was Oracle itself generating those dates, there were some interpretation issues, so I just used
TO_CHAR(b, 'YYYY/MM/DD') on the dates in the comparison, and problem solved.
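A sketch of that workaround (table and column names are invented; :b stands in for the procedure's date variable). TRUNC is shown as a common alternative when the real culprit is a hidden time component in the DATE values:

    -- compare the dates as fixed-format strings, as in the answer above
    SELECT o.*
      FROM orders o
     WHERE TO_CHAR(o.order_date, 'YYYY/MM/DD') = TO_CHAR(:b, 'YYYY/MM/DD');

    -- a common alternative: strip the time portion on both sides
    SELECT o.*
      FROM orders o
     WHERE TRUNC(o.order_date) = TRUNC(:b);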

SQLite-like alternative for MongoDB? [closed]

I'm looking for a document-oriented db with a Ruby API that has SQLite-like properties:
self-contained,
serverless,
zero-configuration.
Are there light alternatives to MongoDB or CouchDB?
Is RDDB a possibility?
If not, what are the best paths to walk then?
I know the question was asked 5 years ago, but just for completeness' sake, an embedded MongoDB has happened since:
https://github.com/hamiltop/MongoLiteDB
It's not ready yet, but an embeddable version of CouchDB is on the long-term roadmap.
Replication is intended to enable offline applications with CouchDB. If you end up with very specific needs, you could replicate data from CouchDB to a local data structure, store it locally, update it, and push the data back via replication, but it would take some code.
If you were using Perl, I'd recommend DBM::Deep, which stores arbitrary data structures on disk, including transactions with commit/rollback, and it's a non-C one-Perl-module install. Doesn't get much lighter than that.
I almost feel you could do some sort of hack to achieve this (see the sketch below).
Have a table using SQLite's row IDs along with a field for the collection name and a text blob that would be JSON code.
Have another table for indexing the fields in a collection (collection name, field name, field value, document row id).
You could write a wrapper class to handle things like updates and lookups. Would be interesting.
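A rough sketch of those two tables in SQLite (all names are invented):

    -- one JSON document per row, keyed by SQLite's rowid
    CREATE TABLE documents (
      id          INTEGER PRIMARY KEY,    -- alias for rowid
      collection  TEXT NOT NULL,
      body        TEXT NOT NULL           -- the document as a JSON string
    );

    -- one row per indexed field value, pointing back at the document
    CREATE TABLE doc_index (
      collection  TEXT NOT NULL,
      field_name  TEXT NOT NULL,
      field_value TEXT,
      doc_id      INTEGER NOT NULL REFERENCES documents(id)
    );

    CREATE INDEX idx_doc_lookup
      ON doc_index (collection, field_name, field_value);

A wrapper class would then insert into documents, keep doc_index in sync on every update, and answer lookups by querying doc_index first and fetching the matching JSON blobs.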
