I am trying to export data from Oracle, but I can't get the latest updates. I am a newbie with Oracle, so please bear with me.
Does export default to some specific point in history (a week before, perhaps)?
I created a table last week, but it does not appear in the export file.
There are also new values in the existing tables, but I can't get those either.
I am using this command:
exp SYSTEM/password FILE=tables.dmp TABLES=(sometable) GRANTS=y INDEXES=y log=exp.log
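A hedged guess at one common cause: when connecting as SYSTEM, a table created by another user has to be owner-qualified, or exp will look for it in the SYSTEM schema. A minimal sketch (SOMEUSER is a placeholder for the schema that actually owns the table):

```shell
# Sketch: TABLES takes an equals sign, and a table owned by another
# schema must be given as OWNER.TABLE when exporting as SYSTEM.
exp SYSTEM/password FILE=tables.dmp LOG=exp.log \
    TABLES=(SOMEUSER.sometable) GRANTS=y INDEXES=y
```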
I'm trying to import a huge table from Oracle 10g into HDFS (GCS, since I'm using Sqoop with Google Cloud Dataproc) as Avro. Everything works fine when the table doesn't have any date columns, but when it does, some dates are imported very wrong.
For example: Oracle data -> 30/07/76 and HDFS data -> 14976-07-30 20:02:00.0
For example: Oracle data -> 26/03/84 and HDFS data -> 10384-03-26 20:32:34.0
I'm already mapping the date fields as String to bring them over like that. I originally imported using the default Sqoop behaviour, which brings the date fields over as epoch ints, but that conversion was incorrect too.
For example: Oracle data -> 01/01/01 and HDFS data -> -62135769600000 when it should be 978314400000
I hope someone can help me fix this issue.
Thanks
Additional information:
The Sqoop command I'm running:
import -Dmapreduce.job.user.classpath.first=true \
  -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
  --connect=$JDBC_STR --username=$USER --password=$PASS \
  --target-dir=gs://sqoop-dev-out-files/new/$TABLE \
  --num-mappers=10 \
  --fields-terminated-by="\t" --lines-terminated-by="\n" \
  --null-string='null' --null-non-string='null' \
  --table=$SCHEMA.$TABLE \
  --as-avrodatafile \
  --map-column-java="DATACADASTRO=String,DATAINICIAL=String,DATAFINAL=String"
Sqoop version: 1.4.7
JDBC version: 6
I think your date in Oracle is 01/01/0001; try to_char(COLUMN,'DD/MM/YYYY').
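A minimal sketch of that suggestion as a free-form query import (the ID split column and the single date column are hypothetical; adjust to your table). Formatting with TO_CHAR on the Oracle side means the JDBC driver never parses the values as timestamps at all:

```shell
# Sketch: push the date formatting into Oracle so Sqoop receives plain
# strings. $CONDITIONS is required verbatim by Sqoop for --query imports.
sqoop import \
  --connect "$JDBC_STR" --username "$USER" --password "$PASS" \
  --query "SELECT ID, TO_CHAR(DATACADASTRO, 'YYYY-MM-DD HH24:MI:SS') AS DATACADASTRO FROM $SCHEMA.$TABLE WHERE \$CONDITIONS" \
  --split-by ID \
  --target-dir "gs://sqoop-dev-out-files/new/$TABLE" \
  --as-avrodatafile
```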
My issue is that my date really is 01/01/0001, because of user mistyping, and I can't update the column in the origin Oracle database.
The problem is that converting to Unix time should give -62135596800000, but instead it comes out as -62135769600000 (30/12/0000).
At first I thought it was a timezone issue, but it is a two-day difference.
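For reference, the expected epoch value for 0001-01-01T00:00:00Z can be checked independently (a quick sketch using python3 from the shell):

```shell
# Epoch milliseconds for 0001-01-01T00:00:00Z
python3 -c 'from datetime import datetime, timezone; print(int(datetime(1, 1, 1, tzinfo=timezone.utc).timestamp() * 1000))'
# prints -62135596800000
```

The observed -62135769600000 differs from this by exactly 172800000 ms, i.e. two days, which matches the observation that this is not a plain timezone offset.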
I am trying to use the Postgres import functionality, but it gives me an error on the date datatype.
I have dates in the format 17-MAY-90 12.00.00.000000000 AM in my Oracle db, but I need to import the data into my Postgres db. I tried timestamp with and without time zone, and it still gives me an error message.
What datatype can I use?
PS: Error I am getting
Thanks.
I was able to fix the problem by going to SQL Developer, under
Tools-Preferences-Database-Advanced-NLS
and changing the date format to the format I wanted. That way, when I exported the file, it used the format I wanted for the import into my Postgres db.
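For scripted exports, a sketch of the same idea done per-session instead of through SQL Developer preferences (credentials, connect string, and the exact format mask are placeholders here):

```shell
# Sketch: set the session NLS timestamp format before spooling/exporting,
# so the output matches what Postgres expects. Credentials are placeholders.
sqlplus -s user/password@db <<'EOF'
ALTER SESSION SET NLS_TIMESTAMP_FORMAT = 'YYYY-MM-DD HH24:MI:SS.FF';
-- spool / export statements would follow here
EOF
```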
The company I work for is in the process of switching from Oracle to EnterpriseDB, and I'm trying to update a query that uses a timestamp from a table, but whenever I try to pull that timestamp it gives me:
[Devart][ODBC][PostgreSQL]Invalid TIMESTAMP string {HY000}
I've tried casting it as varchar2, date, and timestamp, and using to_date, and nothing has worked.
The query is:
select "ship_date" from "promotion"@pgProd
In Postgres, ship_date is just a timestamp.
Any information about how this can be accomplished would be appreciated.
EDIT: To clarify, this query is being run in Oracle, pulling data from Postgres.
The Oracle version is 11g.
The relevant line of the creation script is:
ship_date timestamp without time zone NOT NULL
I am new to HBase and am using the Phoenix driver to connect to HBase through the SQuirreL client. The query below describes my table structure; it has a composite primary key of ALERTID (varchar) and ALERTSTARTTIME (row timestamp).
CREATE TABLE ALERT_DETAILS (ALERTID VARCHAR,MACHINENAME VARCHAR(100),PLACE VARCHAR(100),ALERTTYPE VARCHAR(32),ALERTSTARTTIME TIMESTAMP NOT NULL CONSTRAINT CTKEY PRIMARY KEY (ALERTID, ALERTSTARTTIME ROW_TIMESTAMP));
When I insert data using the query below, the timestamp value I supplied in the query is not what gets stored; it is changed to a value five hours earlier.
upsert into ALERT_DETAILS values('956dbd63fc586e35bccb0cac18d2cef0','machineone','AUS','CRITICAL ALERT','2016-12-22 11:30:23.0')
After executing the query, the timestamp value changes from '2016-12-22 11:30:23.0' to '2016-12-22 06:30:23.0'.
My system time zone is EST. Please help me change the configuration of Phoenix and HBase.
Phoenix uses the system time zone.
Use tzselect and follow the prompts. It will output an environment variable that you can set in your .bash_profile or on system startup.
e.g. TZ='America/New_York'; export TZ
I want to export data with something like this:
exp xxx/xxx file=d:\xxx.dmp owner=xxx query=\"where rownum < 1000\"
But I get the error "QUERY parameter is only used in table mode".
Oracle version is 10g.
As @Thilo says, with exp you can only use the query parameter in table mode. If you're able to use the newer Data Pump functionality, via the expdp command, you can apply a similar query parameter to the whole export.
@Thilo is right: you can export a single table or a subset of a single table.
I also recommend reading Tom's advice regarding using a parfile.
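A hedged sketch combining the two suggestions above: expdp with the QUERY clause in a parfile, which sidesteps the shell-quoting headaches. Credentials and schema are the placeholders from the question; DATA_PUMP_DIR is assumed to be an existing directory object (Oracle creates one by that name by default):

```shell
# Sketch: in a parfile, QUERY with no table prefix applies to every
# table in the export; no shell escaping of quotes or < is needed.
cat > exp.par <<'EOF'
schemas=xxx
directory=DATA_PUMP_DIR
dumpfile=xxx.dmp
logfile=exp.log
query="where rownum < 1000"
EOF
expdp xxx/xxx parfile=exp.par
```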