Script to find multi-level dependencies of a package - Oracle

I have a package that references many objects in the same schema and in other schemas, and I want to find all of the package's dependencies. From user_dependencies I can get only the first-level dependencies. utldtree, on the other hand, gives me the objects that are dependent on my current object, and it only covers referenced objects in the same schema.
While searching the net for a solution, I came across the following link
http://rodgersnotes.wordpress.com/2012/01/05/notes-on-deptree/
where the author mentions that he uses his own script to find the multi-level dependencies of an object.
Could you please help me with how to write such a script that returns the multi-level dependencies of an object? For example, if the package references views, the script should list those views and the tables/views on which each view is built, as deptree does.

You can use a connect by on user_dependencies for most cases.
Determining dependencies
Sample which works for any Oracle user since PUBLIC has been granted select access on user_dependencies:
select name
, type
, prior name
, prior type
from user_dependencies
start
with name='BUBS#MUNT_EENHEDEN'
and type='PACKAGE'
connect
by nocycle
name = prior referenced_name
and type = prior referenced_type
Sample output
Level 1: BUBS#MUNT_EENHEDEN PACKAGE
Level 2: BUBS_MUNT_EENHEDEN_V VIEW BUBS#MUNT_EENHEDEN PACKAGE
Level 3: BUBS#VERTALINGEN PACKAGE BUBS_MUNT_EENHEDEN_V VIEW
Level 4: ITGEN_LANGUAGES_V VIEW BUBS#VERTALINGEN PACKAGE
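Since user_dependencies only covers objects owned by the current schema, a variant on all_dependencies (or dba_dependencies, if you have access to it) can follow references into other schemas as well. The following is a minimal sketch along the same lines, not part of the original answer; the package name and owner are placeholders:
select level
,      owner
,      name
,      type
,      referenced_owner
,      referenced_name
,      referenced_type
from   all_dependencies
start with name  = 'MY_PACKAGE'    -- placeholder: your package
       and owner = 'MY_SCHEMA'     -- placeholder: its owner
       and type  = 'PACKAGE'
connect by nocycle prior referenced_name  = name
       and         prior referenced_type  = type
       and         prior referenced_owner = owner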
Complex scenarios
For complex scenarios I've found it necessary to use a view of my own directly on the data dictionary. Do this only when you know what you are doing and which RDBMS versions you want to support! For instance, new database versions have introduced major changes in the data dictionary.
Sample:
create or replace force view itgen_object_tree_changes_r
as
select o_master.obj# ojt#
, o_master.name ojt_name
, o.mtime ojt_ref_mtime
, o.name ojt_ref_name
, o.owner# ojt_ref_owner#
, decode
( o.type#
, 0, 'NEXT OBJECT'
, 1, 'INDEX'
, 2, 'TABLE'
, 3, 'CLUSTER'
, 4, 'VIEW'
, 5, 'SYNONYM'
, 6, 'SEQUENCE'
, 7, 'PROCEDURE'
, 8, 'FUNCTION'
, 9, 'PACKAGE'
, 11, 'PACKAGE BODY'
, 12, 'TRIGGER'
, 13, 'TYPE'
, 14, 'TYPE BODY'
, 19, 'TABLE PARTITION'
, 20, 'INDEX PARTITION'
, 21, 'LOB'
, 22, 'LIBRARY'
, 23, 'DIRECTORY'
, 24, 'QUEUE'
, 28, 'JAVA SOURCE'
, 29, 'JAVA CLASS'
, 30, 'JAVA RESOURCE'
, 32, 'INDEXTYPE'
, 33, 'OPERATOR'
, 34, 'TABLE SUBPARTITION'
, 35, 'INDEX SUBPARTITION'
, 40, 'LOB PARTITION'
, 41, 'LOB SUBPARTITION'
, 42, nvl
( ( select 'REWRITE EQUIVALENCE'
from sys.sum$ s
where s.obj# = o.obj#
and bitand ( s.xpflags, 8388608 ) = 8388608 ), 'MATERIALIZED VIEW'
)
, 43, 'DIMENSION'
, 44, 'CONTEXT'
, 46, 'RULE SET'
, 47, 'RESOURCE PLAN'
, 48, 'CONSUMER GROUP'
, 51, 'SUBSCRIPTION'
, 52, 'LOCATION'
, 55, 'XML SCHEMA'
, 56, 'JAVA DATA'
, 57, 'EDITION'
, 59, 'RULE'
, 60, 'CAPTURE'
, 61, 'APPLY'
, 62, 'EVALUATION CONTEXT'
, 66, 'JOB'
, 67, 'PROGRAM'
, 68, 'JOB CLASS'
, 69, 'WINDOW'
, 72, 'WINDOW GROUP'
, 74, 'SCHEDULE'
, 79, 'CHAIN'
, 81, 'FILE GROUP'
, 82, 'MINING MODEL'
, 87, 'ASSEMBLY'
, 90, 'CREDENTIAL'
, 92, 'CUBE DIMENSION'
, 93, 'CUBE'
, 94, 'MEASURE FOLDER'
, 95, 'CUBE BUILD PROCESS'
, 'UNDEFINED'
)
ojt_ref_type
from sys.obj$ o
, ( /* All dependencies from the object if there are any. */
select distinct connect_by_root d_obj# obj#, dep.p_obj# obj_ref#
from sys.dependency$ dep
connect
by nocycle dep.d_obj# = prior dep.p_obj#
start
with dep.d_obj# in ( select obj.obj# from itgen_schemas_r sma, sys.obj$ obj where obj.owner# = sma.owner# )
union all /* Union all allowed, 'in' ignores duplicates. */
/* The object itself. */
select obj.obj#
, obj.obj#
from itgen_schemas_r sma
, sys.obj$ obj
where obj.owner# = sma.owner#
) deps
, sys.obj$ o_master
where o_master.obj# = deps.obj#
and o.obj# = deps.obj_ref#
--
-- View: itgen_object_tree_changes_r
--
-- Overview of dependencies between a master object and all objects used by it. It can be used to analyze why the views of a project version must be recalculated.
--
-- Code (alias): ote_r
--
-- Category: Hardcoded.
--
-- Example:
--
-- The object 'X' is invalid, since 'Y' is invalid.
--

Related

DECODE In Oracle PL/SQL

I am trying to use DECODE in a PL/SQL statement (against the sample HR schema),
but I got this error:
''The number specified in exact fetch is less than the rows returned''
The statement gets a DEPARTMENT_ID from the user, compares it in a DECODE, and shows where that department is located.
declare
v_dep varchar2(30);
v_User_Input number(4):=&EnterLocID;
begin
select decode(v_User_Input,10,'Seattle',
20,'Toronto',
30,'Toronto',
40,'London',
50,'South San Francisco',
60,'Southlake',
70,'Munich',
80,'Oxford',
90,'Toronto',
'else' )into v_dep from locations l,departments d
where v_User_Input = d.department_id;
DBMS_OUTPUT.PUT_LINE(v_dep);
end;
/
The way I see it, you forgot to join the LOCATIONS and DEPARTMENTS tables, so they are cross-joined, and then you applied a filter on D.DEPARTMENT_ID. However, there's a possibility that the query still returns two or more rows, which can't fit into the scalar V_DEP variable.
Therefore, this is what you might need to do (I don't have your tables so I can't test it, but I hope you'll get the idea):
FROM locations l join departments d on d.location_id = l.location_id
WHERE l.location_id = v_User_Input
Also, as you can see, I modified the WHERE clause and used l.location_id instead of your d.department_id. Why?
- because the substitution variable suggests so (&EnterLocID) - a location ID, not a department ID
- if it really were d.department_id, then why are you joining locations and departments in the first place? Omit the locations table if you don't need it.
Finally, why decode? If you joined these two tables, then you have everything you need, i.e. something like this should do it:
select l.location_name
into v_dep
from locations l join departments d on d.location_id = l.location_id
where l.location_id = v_user_input
Or, if it has to be decode for some reason (educational?), then
SELECT DECODE (v_User_Input,
10, 'Seattle',
20, 'Toronto',
30, 'Toronto',
40, 'London',
50, 'South San Francisco',
60, 'Southlake',
70, 'Munich',
80, 'Oxford',
90, 'Toronto',
'else')
INTO v_dep
FROM departments d
WHERE d.department_id = v_user_input
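Putting the pieces together, the whole corrected block might look like this - a sketch assembled from the answer above, not separately tested, keeping the DECODE variant and the question's variable declarations:
DECLARE
   v_dep        VARCHAR2(30);
   v_User_Input NUMBER(4) := &EnterLocID;
BEGIN
   -- DEPARTMENT_ID is the primary key of DEPARTMENTS, so this SELECT ... INTO
   -- returns at most one row and the 'exact fetch' error goes away.
   -- (An ID with no matching department still raises NO_DATA_FOUND, as before.)
   SELECT DECODE(v_User_Input,
                 10, 'Seattle',
                 20, 'Toronto',
                 30, 'Toronto',
                 40, 'London',
                 50, 'South San Francisco',
                 60, 'Southlake',
                 70, 'Munich',
                 80, 'Oxford',
                 90, 'Toronto',
                 'else')
     INTO v_dep
     FROM departments d
    WHERE d.department_id = v_User_Input;
   DBMS_OUTPUT.PUT_LINE(v_dep);
END;
/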

Oracle Trigger with UPDATE OF (column name identical to a table name)

I have a problem with a column that has the same name as one of my tables.
I cannot use table.column in this situation.
REPRES is a table in the WINDEV schema and also a column of the DEVENT table in the COMMERC schema, on which I am creating this trigger.
When I write: AFTER UPDATE OF etat, total, devent.repres
I get:
ERROR line 2, col 35, ending_line 2, ending_col 35, Found '.', Expecting: , or ON OR
This is my code:
CREATE OR REPLACE TRIGGER WINDEV_AES_UPDATE
AFTER UPDATE OF etat,total, repres
ON COMMERC.DEVENT
FOR EACH ROW
DECLARE
BEGIN
IF UPDATING AND :OLD.etat != :NEW.etat THEN
INSERT INTO windev.aes (code)
SELECT (:NEW.CODE)
FROM DUAL
WHERE NOT EXISTS (SELECT * FROM windev.aes WHERE code = :NEW.CODE);
UPDATE windev.aes SET aes.update_flag = 1 WHERE aes.code = :NEW.CODE;
END IF;
IF UPDATING AND :OLD.total != :NEW.total THEN
INSERT INTO windev.aes (code)
SELECT (:NEW.CODE)
FROM DUAL
WHERE NOT EXISTS (SELECT * FROM windev.aes WHERE code = :NEW.CODE);
UPDATE windev.aes SET aes.update_flag = 1 WHERE aes.code = :NEW.CODE;
END IF;
END;
/
I hope you can tell me how to resolve this epic problem ;-)

Different result sets for same native query when executed by SQL client and application

Hello all. I have a weird situation when executing a native SQL query against an Oracle database. If the query is executed via SQL client software (in my case, DbVisualizer) I get one result set; if my application (Java, Spring-based) executes it, the result is different.
select
c.id
, c.parentId
, c.name
, c.sequence
, c.isSuppressed
, c.isGotoCategory
, c.hasChildren
, c.startDate
, c.endDate
from (
select
category.category_id as id
, category.parent_category_id as parentId
, nvl(context.category_name, category.category_name) as name
, nvl(context.sequence_num, category.sequence) as sequence
, nvl(context.is_suppressed, 'N') as isSuppressed
, decode(category.syndicate_url, null, 'N', 'Y') as isGotoCategory
, decode(category.is_leaf, 1, 'N', 'Y') as hasChildren
, category.start_date as startDate
, category.end_date as endDate
from (
select
category.category_id
, category.parent_category_id
, category.category_name
, category.start_date
, category.end_date
, category.syndicate_url
, category.sequence
, connect_by_isleaf as is_leaf
, level as category_level
from
category
start with
category.category_id = (
select
category_id
from
category
where
parent_category_id is null
start with
category_id = 3485
connect by prior parent_category_id = category_id
)
connect by category.parent_category_id = prior category.category_id and level <= (4 + 1)
) category
inner join category_destination_channel channel on channel.category_id = category.category_id
and channel.publish_flag = 'Y'
and channel.destination_channel_id = 1
left join contextual_category context on context.category_id = category.category_id
and context.context_type = 'DESKTOP'
where
category.category_level <= 4
and category.start_date <= sysdate
and category.end_date >= sysdate
) c
where
c.isSuppressed <> 'Y'
The query above is the problematic one. When it is executed via the SQL client, the outer restriction (c.isSuppressed <> 'Y') is applied and the records are filtered out. When the query is executed by the application, the outer restriction doesn't seem to be applied at all, and my result set contains records that should not be there.
Has anyone faced this kind of problem before?
My application is built with Java 7, Spring 4.x, and Oracle 11 (OJDBC driver version 11.2.0.3). The application server is JBoss EAP 6.3, but my tests are made with Jetty (maven-jetty-plugin 6.1.26).
I have already considered some possible causes of the problem - the application accessing the wrong database, an unusual issue while using @SqlResultSetMapping - but ruled them out with some tests. I don't know what else to consider.
Any help is appreciated. Thanks in advance.

ERROR 1128: Cannot find field dryTemp

My Pig script that processes temperature data gave me an error. I've put the code and the error below to make my problem easier to understand.
The error is at line 38, column 15. I tried deleting dryTemp, but that also gave another error.
Code:
--Load files into relations
month1 = LOAD 'hdfs:/data/big/data/weather/weather/201201hourly.txt' USING PigStorage(',');
month2 = LOAD 'hdfs:/data/big/data/weather/weather/201202hourly.txt' USING PigStorage(',');
month3 = LOAD 'hdfs:/data/big/data/weather/weather/201203hourly.txt' USING PigStorage(',');
month4 = LOAD 'hdfs:/data/big/data/weather/weather/201204hourly.txt' USING PigStorage(',');
month5 = LOAD 'hdfs:/data/big/data/weather/weather/201205hourly.txt' USING PigStorage(',');
month6 = LOAD 'hdfs:/data/big/data/weather/weather/201206hourly.txt' USING PigStorage(',');
--Combine relations
months = UNION month1, month2, month3, month4, month5, month6;
/* Splitting relations
SPLIT months INTO
splitMonth1 IF SUBSTRING(date, 4, 6) == '01',
splitMonth2 IF SUBSTRING(date, 4, 6) == '02',
splitMonth3 IF SUBSTRING(date, 4, 6) == '03',
splitRest IF (SUBSTRING(date, 4, 6) == '04' OR SUBSTRING(date, 4, 6) == '04');
*/
/* Joining relations
stations = LOAD 'hdfs:/data/big/data/QCLCD201211/stations.txt' USING PigStorage() AS (id:int, name:chararray)
JOIN months BY wban, stations by id;
*/
--filter out unwanted data
clearWeather = FILTER months BY skyCondition == 'CLR';
--Transform and shape relation
shapedWeather = FOREACH clearWeather GENERATE date, SUBSTRING(date, 0, 4) as year, SUBSTRING(date, 4, 6) as month, SUBSTRING(date, 6, 8) as day, skyCondition, dryTemp;
--Group relation specifying number of reducers
groupedByMonthDay = GROUP shapedWeather BY (month, day) PARALLEL 10;
--Aggregate relation
aggedResults = FOREACH groupedByMonthDay GENERATE group as MonthDay, AVG(shapedWeather.dryTemp), MIN(shapedWeather.dryTemp), MAX(shapedWeather.dryTemp), COUNT(shapedWeather.dryTemp) PARALLEL 10;
--Sort relation
sortedResults = ORDER aggedResults BY $1 DESC;
--Store results in HDFS
STORE sortedResults INTO 'hdfs:/data/big/data/weather/pigresults' USING PigStorage(':');
I've put the error below; it's quite long. I still don't know much about Pig, I'm still studying it. I believe the error has to do with a field type that is not recognized, but I don't know how to fix it. I hope you can help me.
Error:
ERROR 1128: Cannot find field dryTemp in :bytearray,year:chararray,month:chararray,day:chararray,:bytearray,:bytearray
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1000: Error during parsing. Cannot find field dryTemp in :bytearray,year:chararray,month:chararray,day:chararray,:bytearray,:bytearray
at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1691)
at org.apache.pig.PigServer$Graph.access$000(PigServer.java:1411)
at org.apache.pig.PigServer.parseAndBuild(PigServer.java:344)
at org.apache.pig.PigServer.executeBatch(PigServer.java:369)
at org.apache.pig.PigServer.executeBatch(PigServer.java:355)
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:140)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:202)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
at org.apache.pig.Main.run(Main.java:607)
at org.apache.pig.Main.main(Main.java:156)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: Failed to parse: Pig script failed to parse:
<file Documentos/pig/weather.pig, line 38, column 15> pig script failed to validate: org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1128: Cannot find field dryTemp in :bytearray,year:chararray,month:chararray,day:chararray,:bytearray,:bytearray
at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:196)
at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1678)
... 15 more
Caused by:
<file Documentos/pig/weather.pig, line 38, column 15> pig script failed to validate: org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1128: Cannot find field dryTemp in :bytearray,year:chararray,month:chararray,day:chararray,:bytearray,:bytearray
at org.apache.pig.parser.LogicalPlanBuilder.buildForeachOp(LogicalPlanBuilder.java:1017)
at org.apache.pig.parser.LogicalPlanGenerator.foreach_clause(LogicalPlanGenerator.java:15870)
at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1933)
at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:188)
... 16 more
Caused by: org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1128: Cannot find field dryTemp in :bytearray,year:chararray,month:chararray,day:chararray,:bytearray,:bytearray
at org.apache.pig.newplan.logical.expression.DereferenceExpression.translateAliasToPos(DereferenceExpression.java:215)
at org.apache.pig.newplan.logical.expression.DereferenceExpression.getFieldSchema(DereferenceExpression.java:149)
at org.apache.pig.newplan.logical.optimizer.FieldSchemaResetter.execute(SchemaResetter.java:264)
at org.apache.pig.newplan.logical.expression.AllSameExpressionVisitor.visit(AllSameExpressionVisitor.java:148)
at org.apache.pig.newplan.logical.expression.DereferenceExpression.accept(DereferenceExpression.java:84)
at org.apache.pig.newplan.ReverseDependencyOrderWalker.walk(ReverseDependencyOrderWalker.java:70)
at org.apache.pig.newplan.PlanVisitor.visit(PlanVisitor.java:52)
at org.apache.pig.newplan.logical.optimizer.SchemaResetter.visitAll(SchemaResetter.java:67)
at org.apache.pig.newplan.logical.optimizer.SchemaResetter.visit(SchemaResetter.java:122)
at org.apache.pig.newplan.logical.relational.LOGenerate.accept(LOGenerate.java:245)
at org.apache.pig.newplan.DependencyOrderWalker.walk(DependencyOrderWalker.java:75)
at org.apache.pig.newplan.logical.optimizer.SchemaResetter.visit(SchemaResetter.java:114)
at org.apache.pig.parser.LogicalPlanBuilder.buildForeachOp(LogicalPlanBuilder.java:1015)
... 22 more
Here are a few lines of the file 201211hourly.txt:
WBAN,Date,Time,StationType,SkyCondition,SkyConditionFlag,Visibility,VisibilityFlag,WeatherType,WeatherTypeFlag,DryBulbFarenheit,DryBulbFarenheitFlag,DryBulbCelsius,DryBulbCelsiusFlag,WetBulbFarenheit,WetBulbFarenheitFlag,WetBulbCelsius,WetBulbCelsiusFlag,DewPointFarenheit,DewPointFarenheitFlag,DewPointCelsius,DewPointCelsiusFlag,RelativeHumidity,RelativeHumidityFlag,WindSpeed,WindSpeedFlag,WindDirection,WindDirectionFlag,ValueForWindCharacter,ValueForWindCharacterFlag,StationPressure,StationPressureFlag,PressureTendency,PressureTendencyFlag,PressureChange,PressureChangeFlag,SeaLevelPressure,SeaLevelPressureFlag,RecordType,RecordTypeFlag,HourlyPrecip,HourlyPrecipFlag,Altimeter,AltimeterFlag
03011,20120101,0015,0,CLR, ,10.00, , , ,23, ,-5.0, ,15, ,-9.5, ,-9, ,-23.0, , 24, , 5, ,120, , , ,21.70, , , , , ,M, ,AA, , , ,30.43,
03011,20120101,0035,0,CLR, ,10.00, , , ,21, ,-6.0, ,14, ,-10.2, ,-9, ,-23.0, , 26, , 6, ,130, , , ,21.70, , , , , ,M, ,AA, , , ,30.43,
03011,20120101,0055,0,CLR, ,10.00, , , ,21, ,-6.0, ,13, ,-10.5, , -13, ,-25.0, , 21, , 0, ,000, , , ,21.71, , , , , ,M, ,AA, , , ,30.44,
03011,20120101,0115,0,CLR, ,10.00, , , ,21, ,-6.0, ,14, ,-10.1, ,-8, ,-22.0, , 27, , 0, ,000, , , ,21.71, , , , , ,M, ,AA, , , ,30.44,
03011,20120101,0135,0,CLR, ,10.00, , , ,21, ,-6.0, ,13, ,-10.4, , -11, ,-24.0, , 23, , 0, ,000, , , ,21.72, , , , , ,M, ,AA, , , ,30.45,
03011,20120101,0155,0,CLR, ,10.00, , , ,21, ,-6.0, ,13, ,-10.5, , -13, ,-25.0, , 21, , 6, ,130, , , ,21.72, , , , , ,M, ,AA, , , ,30.45,
03011,20120101,0215,0,CLR, ,10.00, , , ,21, ,-6.0, ,14, ,-10.2, ,-9, ,-23.0, , 26, , 5, ,090, , , ,21.73, , , , , ,M, ,AA, , , ,30.46,
03011,20120101,0235,0,CLR, ,10.00, , , ,21, ,-6.0, ,14, ,-10.2, ,-9, ,-23.0, , 26, , 6, ,120, , , ,21.74, , , , , ,M, ,AA, , , ,30.47,
03011,20120101,0255,0,CLR, ,10.00, , , ,21, ,-6.0, ,13, ,-10.4, , -11, ,-24.0, , 23, , 7, ,130, , , ,21.74, , , , , ,M, ,AA, , , ,30.48,
03011,20120101,0315,0,CLR, ,10.00, , , ,23, ,-5.0, ,15, ,-9.4, ,-8, ,-22.0, , 25, , 9, ,120, , , ,21.74, , , , , ,M, ,AA, , , ,30.47,
03011,20120101,0335,0,CLR, ,10.00, , , ,23, ,-5.0, ,15, ,-9.4, ,-8, ,-22.0, , 25, , 8, ,120, , , ,21.74, , , , , ,M, ,AA, , , ,30.47,
03011,20120101,0355,0,CLR, ,10.00, , , ,21, ,-6.0, ,14, ,-10.2, ,-9, ,-23.0, , 26, , 7, ,120, , , ,21.73, , , , , ,M, ,AA, , , ,30.46,
03011,20120101,0415,0,CLR, ,10.00, , , ,23, ,-5.0, ,14, ,-9.7, , -13, ,-25.0, , 19, , 7, ,130, , , ,21.73, , , , , ,M, ,AA, , , ,30.46,
I have made a few modifications to your script:
1. Load the data with a proper schema (you can change the datatype of each field according to your needs).
2. Combined all six LOADs into one LOAD.
3. Removed the commented-out code.
I have tested the Pig script below with your input and it works fine; the output is pasted as well.
PigScript:
--Load all the files into relations
months = LOAD 'hdfs:/data/big/data/weather/weather/20120[1-6]hourly.txt' USING PigStorage(',') AS (WBAN:int,Date:chararray,Time:chararray,StationType:int,SkyCondition:chararray,SkyConditionFlag,Visibility,VisibilityFlag,WeatherType,WeatherTypeFlag,DryBulbFarenheit:int,DryBulbFarenheitFlag,DryBulbCelsius:double,DryBulbCelsiusFlag,WetBulbFarenheit:int,WetBulbFarenheitFlag,WetBulbCelsius:double,WetBulbCelsiusFlag,DewPointFarenheit,DewPointFarenheitFlag,DewPointCelsius,DewPointCelsiusFlag,RelativeHumidity,RelativeHumidityFlag,WindSpeed,WindSpeedFlag,WindDirection,WindDirectionFlag,ValueForWindCharacter,ValueForWindCharacterFlag,StationPressure,StationPressureFlag,PressureTendency,PressureTendencyFlag,PressureChange,PressureChangeFlag,SeaLevelPressure,SeaLevelPressureFlag,RecordType,RecordTypeFlag,HourlyPrecip,HourlyPrecipFlag,Altimeter,AltimeterFlag);
--filter out unwanted data
clearWeather = FILTER months BY SkyCondition == 'CLR';
--Transform and shape relation
shapedWeather = FOREACH clearWeather GENERATE Date,
SUBSTRING(Date,0,4) AS year,
SUBSTRING(Date,4,6) AS month,
SUBSTRING(Date,6,8) AS day,
SkyCondition,
DryBulbFarenheit AS dryTemp;
--Group relation specifying number of reducers
groupedByMonthDay = GROUP shapedWeather BY (month, day) PARALLEL 10;
--Aggregate relation
aggedResults = FOREACH groupedByMonthDay GENERATE group as MonthDay, AVG(shapedWeather.dryTemp), MIN(shapedWeather.dryTemp), MAX(shapedWeather.dryTemp), COUNT(shapedWeather.dryTemp) PARALLEL 10;
--Sort relation
sortedResults = ORDER aggedResults BY $1 DESC;
--Store results in HDFS
STORE sortedResults INTO 'hdfs:/data/big/data/weather/pigresults' USING PigStorage(':');
Output: (based on your above input samples)
(01,01):21.615384615384617:21:23:13
MonthDay:(01,01)
Avg:21.615384615384617
Min:21
Max:23
Count:13
It looks like you are loading 'month1', 'month2', etc. without specifying the schema (which is where you should declare 'dryTemp'). You may try something like:
month1 = LOAD 'hdfs:/data/big/data/weather/201201hourly.txt' USING PigStorage(',')
AS (wban,year_month_day,time,station_type,maint_indic,
sky_cond,visibility,weather_type,dryTemp);
Similarly for all the other months.
Thanks

Show the last record if it meets certain conditions

I have two tables for example USER and AUDIT, where the join is user.id = audit.record:
USER:
ID, USER, NAME, LAST_NAME
205, USER1, PEDRO, PEREZ
206, USER2, JUAN, PEREZ
AUDIT:
ID, ACTION, RECORD, FIELD, DATE
1, MODIFY, 205, LAST_NAME, 08/11/12
2, MODIFY, 205, NAME, 25/09/12
3, MODIFY, 206, NAME, 08/11/12
I need a query that shows all users whose last audit entry is a modification of their name. In the example the result would be just USER2.
I'm using Oracle 8; can I use a simple query or do I need to create a function?
Thanks in advance.
Maybe this will do it:
select u.id
from   user u,
       audit a
where  u.id = a.record
and    a.action = 'MODIFY'
and    a.field = 'NAME'
and    a.id = (select max(a2.id)
               from   audit a2
               where  a2.record = u.id)
Here is my solution... MAX (FIELD_NAME) KEEP (DENSE_RANK FIRST ORDER BY OPERATION_DT)
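Expanding that hint into a full query against the posted tables might look like the sketch below. It is untested; table and column names follow the question except for the audit date column, written here as audit_date because DATE is a reserved word, and both the KEEP (DENSE_RANK ...) clause and the ANSI join need a release newer than Oracle 8:
select u.id, u.name, u.last_name
from   user u
join   (select record
        ,      max(field)  keep (dense_rank last order by audit_date, id) as last_field
        ,      max(action) keep (dense_rank last order by audit_date, id) as last_action
        from   audit        -- audit_date stands in for the question's DATE column
        group by record) a  -- one row per user: the values of its most recent audit entry
on     a.record = u.id
where  a.last_action = 'MODIFY'
and    a.last_field  = 'NAME'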
