I know I can generate a SQL script from a logical model in Data Modeler; can I do the opposite?
I created the database from Django and migrated it to Oracle, so generating a logical model would be really useful.
Yes.
In Data Modeler, go to File - Import.
Choose either your SQL file(s) or use a database connection to import a data dictionary from the database.
This will give you a relational model.
Then you can use the 'Engineer to Logical Model' button to get your logical design and ERD.
I have little experience with Power BI. I am building a report and we are using an SSAS model as the data source. I would like to add two columns that should call Oracle functions. When I try to add new columns to the report, it seems that when I choose SSAS as the source, I cannot use the Modeling options in the main menu (New column, New table, New parameter, etc.).
I tried to add a measure as a new field (it is the only option enabled in the Modeling menu), but I don't know how I can call my Oracle function from a measure. Can I do that?
Please note I am using an SSAS model.
Can I use different data sources when I am using SSAS to get data? We don't want to modify the model; we want to add a new column directly in Power BI that calls an Oracle function.
Any suggestions for adding new columns that should call Oracle functions?
Thanks in advance for your help.
Use Import mode instead of DirectQuery.
DirectQuery is not recommended unless you need live updates.
You are connecting Power BI to SSAS using Live Connection mode, which does not expose the modeling options. You can try the preview of DirectQuery for Power BI datasets and Analysis Services, which will give you some options to add items.
You cannot call an Oracle function from a measure or a column. That would have to be done at the database level, before the data gets to Power BI and the SSAS model.
You can connect to the SSAS model in Import mode, but you'll have to recreate the SSAS model in Power BI, and you still won't be able to use the Oracle function in your import process directly. If you can connect to the database, you may be able to write a SQL query for the import that calls the function. But again, you'll have to recreate the SSAS model in Power BI.
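To illustrate the last point, here is a sketch of an import-mode source query that calls an Oracle function per row; you would paste something like this into the SQL statement box of the Oracle connector. All schema, table, and function names are hypothetical:

```sql
-- Import-mode source query: compute the extra columns in Oracle,
-- where the function can be called, instead of in Power BI.
SELECT t.order_id,
       t.amount,
       my_schema.calc_discount(t.order_id) AS discount  -- hypothetical Oracle function
FROM   my_schema.orders t
```

The function runs inside Oracle during the refresh, so the imported table already contains the computed column; nothing in the Power BI model needs to call Oracle at query time.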
I would like to keep the Neo4j database as the master database, and keep the Oracle relational tables synchronized with the Neo4j data as a read-only materialized view.
I can only find links and articles explaining how to import relational data into Neo4j, not the other way around.
Conceptually, I am looking for a kind of materialized view in Oracle that uses a Cypher query as its source. Maybe I could write a custom merge program that maps Oracle tables to Cypher queries. Ideally I would like to run this program inside Oracle (PL/SQL).
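A sketch of the kind of statement such a merge program might run, assuming a separate process has already fetched the Cypher query results into an Oracle staging table (all table and column names are hypothetical):

```sql
-- Upsert the staged Neo4j rows into the read-only snapshot table.
MERGE INTO person_snapshot t
USING person_staging s
   ON (t.node_id = s.node_id)
WHEN MATCHED THEN
  UPDATE SET t.name = s.name
WHEN NOT MATCHED THEN
  INSERT (node_id, name) VALUES (s.node_id, s.name);
```

Run on a schedule, this gives behavior similar to a periodically refreshed materialized view, with the staging step standing in for the Cypher source.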
Thanks in advance,
Suppose I want to use an Oracle database, and I have some flat binary file containing structured data. Suppose I have a relational model that fits this data structure.
Does Oracle provide an API to implement some adapter to be able to relationally query this sequence of bytes as a set of views?
If so:
where should the data reside?
what version offers this feature?
If not:
is there any other RDBMS that offers such an API?
You can use an external table. Normally, external tables expect a text source file, but you can use the PREPROCESSOR directive to specify a script that transforms the source file before it is read.
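A sketch of such an external table; the PREPROCESSOR clause appeared in Oracle 11g Release 2, and the directory, script, and file names here are hypothetical:

```sql
-- External table whose binary source file is converted to CSV
-- on the fly by a preprocessor script.
CREATE TABLE bin_data_ext (
  id   NUMBER,
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    PREPROCESSOR exec_dir:'bin_to_csv.sh'  -- script that emits CSV from the binary file
    FIELDS TERMINATED BY ','
  )
  LOCATION ('data.bin')
);
```

The file lives on the database server's file system (under the Oracle directory objects `data_dir` and `exec_dir`), which also answers "where should the data reside" for this approach.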
You could also use UTL_FILE to read the file from disk and do whatever you want with it in the database. This could include a pipelined table function that you query with the TABLE operator.
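A minimal sketch of the pipelined-function approach, assuming a fixed-length record layout of a 4-byte integer followed by a 100-byte name (the layout, directory, and file names are all hypothetical):

```sql
CREATE TYPE bin_row AS OBJECT (id NUMBER, name VARCHAR2(100));
/
CREATE TYPE bin_row_tab AS TABLE OF bin_row;
/
CREATE OR REPLACE FUNCTION read_bin_file RETURN bin_row_tab PIPELINED IS
  f   UTL_FILE.FILE_TYPE;
  buf RAW(104);
BEGIN
  f := UTL_FILE.FOPEN('DATA_DIR', 'data.bin', 'rb');
  LOOP
    BEGIN
      UTL_FILE.GET_RAW(f, buf, 104);  -- read one fixed-length record
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;   -- end of file
    END;
    PIPE ROW (bin_row(
      UTL_RAW.CAST_TO_BINARY_INTEGER(UTL_RAW.SUBSTR(buf, 1, 4)),
      UTL_RAW.CAST_TO_VARCHAR2(UTL_RAW.SUBSTR(buf, 5))));
  END LOOP;
  UTL_FILE.FCLOSE(f);
  RETURN;
END;
/
-- Query the file relationally:
SELECT * FROM TABLE(read_bin_file);
```

You could then wrap the `SELECT` in a view to present the file as part of your relational model.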
I have 5 tables: 3 for nodes and 2 for relationships between them (relationship = child). How can I transfer the data from Oracle to Neo4j?
The Neo4j site has an entire documentation section on moving data from relational databases to Neo4j. There are a bunch of different possibilities.
The simplest way though is to use your chosen database tools to export your tables to a CSV format, and then to use Cypher's LOAD CSV command to pull the data in.
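For example, with one CSV exported per table, the import might look like this (file names, labels, and column headers are placeholders; adapt them to your actual export):

```cypher
// Load node rows, keyed on the exported primary key.
LOAD CSV WITH HEADERS FROM 'file:///persons.csv' AS row
MERGE (p:Person {id: toInteger(row.ID)})
SET p.name = row.NAME;

// Load the relationship table as CHILD edges between existing nodes.
LOAD CSV WITH HEADERS FROM 'file:///child_rels.csv' AS row
MATCH (parent:Person {id: toInteger(row.PARENT_ID)})
MATCH (child:Person  {id: toInteger(row.CHILD_ID)})
MERGE (parent)-[:CHILD]->(child);
```

Note how the two relationship tables become edges rather than nodes, which is the modeling decision discussed below.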
The data can't be transferred directly, though, in the sense that your tables represent entities and relationships between them; moving the data to Neo4j requires that you consider what you want your graph data model to look like.
Because the flexibility and power that Neo4j gives you will ultimately have a lot to do with how you model your data, give this careful consideration before you dump the CSV and try to import it into Neo4j.
I happen to find myself in a situation where I am using Oracle SQL Developer version 1.5.5, and there's this huge database for which the documentation is very poor. I'd like to create a star or snowflake schema for a better understanding of the data. Is there a simple way to do it?
You can reverse engineer the physical data model using SQL Developer Data Modeler. This is actually a separate tool from SQL Developer but shares some branding. It is also free.
The quality of the resultant diagram will depend heavily on how well the physical data structures have been implemented. You will only get relationships if the database has defined foreign key constraints (disabled is good enough). Likewise UIDs require defined primary key constraints. If your database lacks constraints you'll have to rely on column naming conventions, data analysis and your business knowledge.
Star or Snowflake schemas are for data warehouses. Is that the sort of database you're dealing with?