Loading huge data from Oracle to Teradata [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 years ago.
What is the best way to export huge data from an Oracle database and load it into a Teradata database?

I would suggest using Teradata Parallel Transporter (TPT). It offers different operators to move data depending on your needs and volume of data. It may be possible to accomplish this via named pipes so that you don't have to physically land the data on disk before loading it into Teradata.
I would recommend reading the available information about TPT and rephrasing your question accordingly once you have a general direction in which you would like to proceed. This article on Teradata's Developer Exchange is a few years old but provides a foundation from which you can move forward.
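To make the named-pipe idea concrete, here is a minimal Python sketch of the pattern. In a real pipeline the writer would be an Oracle unload (e.g. a SQL*Plus spool) and the reader a TPT load job reading from the pipe; both ends are simulated in-process here, and the function name is my own.

```python
import os
import threading

def stream_through_fifo(rows, fifo_path, consume):
    """Stream rows through a named pipe: the producer (standing in for
    an Oracle unload) writes while the consumer (standing in for a TPT
    load job) reads, so data never lands on disk as a regular file."""
    os.mkfifo(fifo_path)
    loaded = []

    def consumer():
        with open(fifo_path) as pipe:      # blocks until producer opens
            for line in pipe:
                loaded.append(consume(line.rstrip("\n")))

    t = threading.Thread(target=consumer)
    t.start()
    with open(fifo_path, "w") as pipe:     # blocks until consumer opens
        for row in rows:
            pipe.write(row + "\n")
    t.join()
    os.remove(fifo_path)
    return loaded
```

The point of the pattern is that writer and reader run concurrently; with real tools you would start the TPT job against the pipe first, then start the unload writing into it.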

Related

Is it necessary to memorize the codes of data structures? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 2 years ago.
Is it necessary to memorize the code of data structures like linked lists, dynamic arrays, circular linked lists, queues, stacks, graphs, etc., or is basic knowledge of the code enough? What kinds of questions can be asked in a job interview regarding data structures?
I don't know what your (future) employer may ask, but generally, I'd say no. You have to know how they work and what they're used for, especially which data structure serves which purpose along with its advantages and disadvantages. If you know that, you'll be able to write the code for such a structure without having it memorized, because you understand how it works.
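As a concrete illustration, a singly linked list can be reconstructed from nothing more than the idea "each node points to the next"; a minimal Python sketch:

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next    # link to the following node, or None at the end

class LinkedList:
    """Minimal singly linked list, derivable from first principles."""
    def __init__(self):
        self.head = None

    def push_front(self, value):
        # O(1): the new node points at the old head and becomes the head.
        self.head = Node(value, self.head)

    def to_list(self):
        # Walk the chain of next-pointers to inspect the contents.
        out, node = [], self.head
        while node:
            out.append(node.value)
            node = node.next
        return out
```

Knowing *why* push-front is O(1) here (no shifting, just a pointer swap) is exactly the kind of understanding interviews tend to probe, more than memorized code.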

SQL to MapReduce - How to? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
I have a complex query used in an ETL process (SQL-based). It is too big to fit here, but in general it consists of a few inner joins between several tables and some business logic using window functions and other 'goodies'.
I need to port it to Hadoop MapReduce. My plan is to dump all the tables in the FROM clauses to CSV format and bring the files into HDFS.
Then I will write MapReduce jobs that copy the logic implemented in SQL.
I wonder: are there any best-practices/recommendations/pitfalls I should be aware of while porting SQL to MapReduce?
Googling in my case was no good as the results were either too specific or some scientific papers with no practical tips.
You can look at Sqoop as one option for transferring data between Hadoop and structured datastores.
Also, this link could be helpful: http://www.ibm.com/developerworks/library/bd-sqltohadoop1/
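The standard translation of an SQL inner join into MapReduce is the reduce-side join: tag each row with its source table in the map phase, then pair the two sides per key in the reduce phase. A minimal in-process Python sketch (function names and the "L"/"R" tags are illustrative, not from any framework):

```python
from collections import defaultdict

def map_phase(tagged_rows):
    """Mapper: emit (join_key, (table_tag, row)) so rows from both
    tables with the same key meet at the same reducer."""
    for tag, key, row in tagged_rows:
        yield key, (tag, row)

def reduce_phase(pairs):
    """Reducer: group by key, then cross the two sides per key --
    the MapReduce equivalent of an SQL inner join."""
    groups = defaultdict(list)
    for key, tagged in pairs:
        groups[key].append(tagged)
    joined = []
    for key in sorted(groups):
        left = [row for tag, row in groups[key] if tag == "L"]
        right = [row for tag, row in groups[key] if tag == "R"]
        joined.extend((key, l, r) for l in left for r in right)
    return joined
```

The classic pitfall this illustrates: a key present in only one table produces no output (inner-join semantics), and a hot key with many rows on both sides produces a quadratic blow-up inside one reducer.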

what strategy to consider when moving from oracle to cassandra [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
We are thinking of moving from an Oracle database to Cassandra. Are there step-by-step techniques for designing Cassandra tables based on what we have in Oracle?
You should look at some slides by Patrick McFadin about modeling techniques; they should definitely help:
The Datamodel is dead, long live the data model
Become a super modeler
The world's next top data model
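The core idea in those talks is query-first modeling: Cassandra has no joins, so you build one denormalized table per query pattern and duplicate data into it. A small Python sketch of denormalizing a hypothetical Oracle-style customers/orders pair for the query "all orders for a customer" (the table and column names are made up for illustration):

```python
def denormalize_for_query(customers, orders):
    """Build the Cassandra-style table for one query pattern:
    partition by customer id, duplicating the customer name into
    every order row, since CQL cannot join it in at read time."""
    orders_by_customer = {}
    for order_id, cust_id, item in orders:
        orders_by_customer.setdefault(cust_id, []).append(
            {"customer_name": customers[cust_id],   # duplicated on purpose
             "order_id": order_id,
             "item": item})
    return orders_by_customer
```

Each additional query pattern ("orders by item", "orders by date") would get its own materialized table the same way, which is the step-by-step discipline the slides walk through.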

What was this live filtered table implemented with? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
I was just browsing http://plugins.netbeans.org/PluginPortal/
I like the way the table at the bottom of the page works, with its various live filtering and sort options, particularly its speed, fluidity, and function.
Does anyone know how this would have been implemented? I'm asking specifically: I have a good understanding of the generalized process, and am interested in the specific technology, if it already exists as a particular type of control in/on a particular platform.
Your help would be greatly appreciated.
Looking at the page source, it looks like the DataTables plug-in for jQuery; see here.
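For reference, DataTables' default search box does a case-insensitive substring match across every column of every row; the core of that behaviour (minus all the DOM work) is roughly the following, sketched here in Python rather than JavaScript:

```python
def filter_rows(rows, query):
    """Keep rows where any cell contains the query, case-insensitively,
    mimicking the default live-filter behaviour of a DataTables search box."""
    q = query.lower()
    return [row for row in rows
            if any(q in str(cell).lower() for cell in row)]
```

The perceived speed on the plugin-portal page comes from doing exactly this in memory on the client, re-rendering only the visible rows on each keystroke.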

Websolr synonyms for multiple languages [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
How do I use separate synonym dictionaries for websolr?
(There is only one synonym file in configuration panel. I probably need separate synonym set for each language)
Right now we only support the one synonyms file. We've got some ideas for more flexible configuration that may address this in the future. It's best to ask these kinds of questions through our official support channel: http://help.websolr.com/
