Reg: database is not starting up an error - oracle

Getting the below error while starting the database:
startup
ORA-01078: failure in processing system parameters
ORA-01565: error in identifying file '+DATA/mis/PARAMETERFILE/spfile.276.967375255'
ORA-17503: ksfdopn:10 Failed to open file +DATA/mis/PARAMETERFILE/spfile.276.967375255
ORA-04031: unable to allocate 56 bytes of shared memory ("shared pool","unknown object","KKSSP^24","kglseshtSegs")

Your database cannot find the SPFILE (the newer-style init.ora) in ASM that holds the actual system parameters, or it has no permission to access it.
Either your Grid Infrastructure stack or the dbs/spfile.ora is pointing to the wrong file.
To find out what the Grid Infrastructure stack is using, run srvctl, which displays the parameter file the database is registered with:
srvctl config database -d <dbname>
...
Spfile: +DATA/<dbname>/PARAMETERFILE/spfile.269.1066152225
...
Then check (as the grid user) whether the file is indeed visible, using asmcmd:
asmcmd
ASMCMD> ls +DATA/<dbname>/PARAMETERFILE/
spfile.269.1066152225
If the name is different, you have found the issue, and you have to point the database to the correct file (a sketch follows the MOS note below).
If the name is correct, it could be wrong permissions on the oracle executable(s) (check My Oracle Support):
RAC Database Can't Start: ORA-01565, ORA-17503: ksfdopn:10 Failed to open file +DATA/BPBL/spfileBPBL.ora (Doc ID 2316088.1)
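If you do need to repoint the registration, here is a hedged sketch (the SPFILE path and database name are placeholders; the long-form srvctl options below are 12c-style, older releases use -d and -p instead):
srvctl modify database -db <dbname> -spfile +DATA/<dbname>/PARAMETERFILE/spfile.269.1066152225
srvctl config database -db <dbname>
The second command simply re-displays the configuration so you can confirm the Spfile entry now matches what asmcmd shows.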

Related

NiFi PutFile processor doesn't save file to a directory

In my NiFi workflow I need to download a .zip file from a SOAP web server, save it on the machine (optional) and unpack its content to a sub-folder. Everything works on my local Win 10 machine, but issues occur when I try to move to a remote Linux server. Here is the part of my flow where the error happens:
So we have a FlowFile entering UpdateAttribute, where the filename attribute is set with the required name and .zip extension. The file is correct, as can be seen in the queue after starting the processor.
Problems start to happen when I pass the FlowFile to the PutFile processors. I tried different scenarios based on the selected directory:
Relative to the NiFi main folder, ./out:
12:30:01 MSK ERROR
PutFile[id=05788ae5-64e5-32af-bb40-88d50d4c886c] Penalizing StandardFlowFileRecord[uuid=3e0c5e38-76f8-4ce3-b911-90f6901c35a4,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1586337594333-49, container=default, section=49], offset=2335, length=375628606],offset=0,name=,size=375628606] and transferring to failure due to /opt/nifi/nifi-1.11.4/./out: java.nio.file.DirectoryNotEmptyException: /opt/nifi/nifi-1.11.4/./out
12:30:01 MSK ERROR
PutFile[id=05788ae5-64e5-32af-bb40-88d50d4c886c] Unable to remove temporary file /opt/nifi/nifi-1.11.4/./out/. due to /opt/nifi/nifi-1.11.4/./out/.: Invalid argument: java.nio.file.FileSystemException: /opt/nifi/nifi-1.11.4/./out/.: Invalid argument
Full path /opt/nifi/nifi-1.11.4/out/file/ :
12:32:45 MSK ERROR
PutFile[id=0171102b-c82d-149d-c9ae-ea4da99b1750] Penalizing StandardFlowFileRecord[uuid=0573803f-8407-46e4-93f0-e52a5fc35a07,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1586337594333-49, container=default, section=49], offset=2335, length=375628606],offset=0,name=,size=375628606] and transferring to failure due to Failed to export StandardFlowFileRecord[uuid=0573803f-8407-46e4-93f0-e52a5fc35a07,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1586337594333-49, container=default, section=49], offset=2335, length=375628606],offset=0,name=,size=375628606] to /opt/nifi/nifi-1.11.4/out/file/. due to java.io.FileNotFoundException: /opt/nifi/nifi-1.11.4/out/file/. (No such file or directory): org.apache.nifi.processor.exception.FlowFileAccessException: Failed to export StandardFlowFileRecord[uuid=0573803f-8407-46e4-93f0-e52a5fc35a07,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1586337594333-49, container=default, section=49], offset=2335, length=375628606],offset=0,name=,size=375628606] to /opt/nifi/nifi-1.11.4/out/file/. due to java.io.FileNotFoundException: /opt/nifi/nifi-1.11.4/out/file/. (No such file or directory)
So it adds a dot ('.') to the path, which causes the exception. All the folders are created and permissions are granted. I tried to run a simple test flow with a 42 B file and the same path (GenerateFlowFile -> PutFile) and everything is OK.
What am I doing wrong?
The problem was in the Linux filesystem permissions: there was a 'nifi' user for flow execution and another user for accessing the Linux filesystem.
Assigning rwxrwxrwx to the folders used solved the issue; see the sketch below.
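A minimal sketch of granting those permissions (the path is the output directory from the question; rwxrwxrwx is deliberately broad, and a shared group with tighter permissions is usually preferable):
chmod -R 777 /opt/nifi/nifi-1.11.4/out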

Unable to connect to Oracle with SchemaSpy

I've installed the 64-bit Oracle Instant Client; when connecting with SchemaSpy I get the error message below.
PLEASE NOTE: both of these files exist:
C:\app\instantclient_12_1\ojdbc6.jar
C:\app\instantclient_12_1\ocijdbc12.dll
And "C:\app\instantclient_12_1\" is in the PATH.
I've tried C:\app\instantclient_12_1\ojdbc7.jar as well, with the same result.
Windows 7, 64-bit.
Would greatly appreciate any help from anyone who got this to work correctly.
Error message:
Failed to load driver [oracle.jdbc.driver.OracleDriver] from classpath [file:/C:/app/instantclient_12_1/ojdbc6.jar]
Make sure the reported library (.dll/.lib/.so) from the following line can be
found by your PATH (or LIB*PATH) environment variable
java.lang.UnsatisfiedLinkError: C:\app\instantclient_12_1\ocijdbc12.dll: Specified process not found
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(Unknown Source)
at java.lang.ClassLoader.loadLibrary(Unknown Source)
at java.lang.Runtime.loadLibrary0(Unknown Source)
at java.lang.System.loadLibrary(Unknown Source)
at oracle.jdbc.driver.T2CConnection$1.run(T2CConnection.java:4115)
at java.security.AccessController.doPrivileged(Native Method)
at oracle.jdbc.driver.T2CConnection.loadNativeLibrary(T2CConnection.java:4111)
at oracle.jdbc.driver.T2CConnection.logon(T2CConnection.java:308)
at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:662)
at oracle.jdbc.driver.T2CDriverExtension.getConnection(T2CDriverExtension.java:54)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:560)
at net.sourceforge.schemaspy.SchemaAnalyzer.getConnection(SchemaAnalyzer.java:582)
at net.sourceforge.schemaspy.SchemaAnalyzer.analyze(SchemaAnalyzer.java:157)
at net.sourceforge.schemaspy.Main.main(Main.java:42)
Here's how to run SchemaSpy 6 against an Oracle database:
Dependencies
Make sure you have the following available on your machine:
The latest version from schemaspy.org; the following describes the process for schemaspy-6.0.0-rc1.
The Oracle JDBC thin driver, otherwise you'll have to mess around with Oracle OCI. You can get it from Oracle Database 12.1.0.2 JDBC Driver & UCP Downloads.
SchemaSpy uses GraphViz to generate the diagrams; get it from graphviz.org. You'll need to update your PATH variable: add C:\Program Files (x86)\Graphviz2.38\bin to it (make sure the version matches the one you downloaded).
Database Type
Note: SchemaSpy supports Oracle OCI (-t ora) and Oracle Thin (-t orathin) as database types. To get the list of available database types:
java -jar schemaspy-6.0.0-rc1.jar -dbhelp
Configuration
You can put most configuration parameters into a file called schemaspy.properties; put this file into the same directory as schemaspy-6.0.0-rc1.jar.
Example schemaspy.properties:
# type of database. Run with -dbhelp for details
schemaspy.t=orathin
# path to the downloaded Oracle JDBC driver, for example
schemaspy.dp=C:\tools\dbdoc\drivers\ojdbc7.jar
# database properties: host, port number, name, user, password
schemaspy.host=[oracle database host]
schemaspy.port=[oracle database port, usually 1521]
schemaspy.db=[database name or SID]
schemaspy.u=[username]
schemaspy.p=[password; for more complex ones, put it in quotation marks]
# output dir to save generated files
schemaspy.o=C:\tools\dbdoc\output
# db schema for which to generate diagrams
schemaspy.s=[schema name]
Generate documentation
With the configuration in place, now all you have to do is run:
java -jar schemaspy-6.0.0-rc1.jar
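If you prefer not to use schemaspy.properties, the same settings can also be passed as command-line switches (a sketch with placeholder host, credentials and schema; run -dbhelp to confirm the switches for your SchemaSpy version):
java -jar schemaspy-6.0.0-rc1.jar -t orathin -dp C:\tools\dbdoc\drivers\ojdbc7.jar -host dbhost -port 1521 -db ORCL -u scott -p secret -o C:\tools\dbdoc\output -s SCOTT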

Oracle Data Integrator SQL to HDFS IKM returns error

I am using ODI (12.1.3.0.0). I created a topology for the Oracle DB, which is OK, and I created a topology for HDFS using the File technology, which is where I think the problem is.
For the HDFS DataServer, I left the JDBC driver empty and filled the JDBC URL with hdfs://remotehostname:port.
For the HDFS Physical Schema, I filled both Schema and Work Schema with /my/path.
Then I created a Logical Schema and a Model. After that, I created a Datastore under the model with these definitions:
Name: TestName
Resource Name: TESTFILE.txt
File Format: Fixed
After all this, I created a project and a mapping under the project.
Finally, when I run the mapping, I see these errors:
ODI-1217: Session Oracle2HDFSMapping_Physical_SESS (15) fails with return code ODI-1298.
ODI-1226: Step Physical_STEP fails after 1 attempt(s).
ODI-1240: Flow Physical_STEP fails while performing a Add execute to Sqoop script-IKM SQL to HDFS File (Sqoop)- operation. This flow loads target table null.
ODI-1298: Serial task "SERIAL-MAP_MAIN- (10)" failed because child task "SERIAL-EU-GGUSER_UNIT (20)" is in error.
ODI-1298: Serial task "SERIAL-EU-GGUSER_UNIT (20)" failed because child task "Add execute to Sqoop script-IKM SQL to HDFS File (Sqoop)- (40)" is in error.
Caused By: java.io.IOException: Cannot run program "chmod": CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at java.lang.Runtime.exec(Runtime.java:617)
at java.lang.Runtime.exec(Runtime.java:450)
at java.lang.Runtime.exec(Runtime.java:347)
at oracle.odi.runtime.agent.execution.cmd.OSCommandExecutor.execute(OSCommandExecutor.java:54)
at oracle.odi.runtime.agent.execution.cmd.OSCommandExecutor.execute(OSCommandExecutor.java:29)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:52)
at oracle.odi.runtime.agent.execution.SessionTask.processTask(SessionTask.java:203)
at oracle.odi.runtime.agent.execution.SessionTask.doExecuteTask(SessionTask.java:114)
at oracle.odi.runtime.agent.execution.AbstractSessionTask.execute(AbstractSessionTask.java:886)
at oracle.odi.runtime.agent.execution.SessionExecutor$SerialTrain.runTasks(SessionExecutor.java:2198)
at oracle.odi.runtime.agent.execution.SessionExecutor.executeSession(SessionExecutor.java:591)
at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor$1.doAction(TaskExecutorAgentRequestProcessor.java:718)
at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor$1.doAction(TaskExecutorAgentRequestProcessor.java:611)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:203)
at oracle.odi.runtime.agent.processor.TaskExecutorAgentRequestProcessor.doProcessStartAgentTask(TaskExecutorAgentRequestProcessor.java:800)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$1400(StartSessRequestProcessor.java:74)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:702)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:180)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessImpl.create(Native Method)
at java.lang.ProcessImpl.<init>(ProcessImpl.java:385)
at java.lang.ProcessImpl.start(ProcessImpl.java:136)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
... 20 more
I wonder where I went wrong?
For a file Datastore, you need to define the attributes (columns) by opening the Datastore and going to the Attributes tab. If the file already exists, you can reverse-engineer the attributes, then rename them and change the datatypes if needed.
The error message you received for the second task mentions that the file (generated in the first task) does not exist. So there might be a problem with the first task, probably due to the missing attributes in your datastore.
Here is a detailed article about the SQL To HDFS File (Sqoop) KM written by the ODI A-Team: http://www.ateam-oracle.com/importing-data-from-sql-databases-into-hadoop-with-sqoop-and-oracle-data-integrator-odi/

java.sql.SQLException: ORA-01157: cannot identify/lock data file

I was getting the below error when I ran the application:
Caused by: org.hibernate.exception.GenericJDBCException: could not execute native bulk manipulation query
.
.
Caused by: java.sql.SQLException: ORA-01157: cannot identify/lock data file - see DBWR trace file
ORA-01110: data file : '/fld1/fld2/mytemp_tablespace.dbf'
I tried to find these files and came to know that the folders do not exist. I then created the respective folders and a new empty mytemptemp_tablespace.dbf file, but I am still getting the same error.
Any idea why this error is happening? If it were a plain SQL exception, it should have happened right at the beginning.
What I have done is: I created a new schema and exported the db from the old one into this new one.
Also, how can I see or get the DBWR trace file?
This could be the result of a restored database: during the restore, RMAN was not able to create the tempfiles because of a missing directory.
The solution is quite simple. Once the directories are created, just add one or more tempfiles:
alter tablespace mytemp_tablespace add tempfile '/fld1/fld2/mytemp_tablespace01.dbf';
Once the temp tablespace has its storage, your actions can succeed.
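To confirm the tempfile is now in place, you can query the data dictionary (a quick check; the view is a standard one, the rows returned depend on your database):
select file_name, tablespace_name, bytes, status from dba_temp_files;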

DefaultDataPath is empty in VS2010 SQL2008 database deployment script

I have a VS2010 database project pointing to a SQL2005 database. When I deploy it, it correctly picks up the DefaultDataPath from the SQL instance and everything works.
Today, I changed the project type from SQL2005 to SQL2008 and changed the deploy properties to point to my SQL2008 server. However, now when I try to deploy, I get this error:
Error SQL01268: .Net SqlClient Data Provider: Msg 5105, Level 16, State 2, Line 1 A file activation error occurred. The physical file name '\AutoDeployedTRS.mdf' may be incorrect. Diagnose and correct additional errors, and retry the operation.
Error SQL01268: .Net SqlClient Data Provider: Msg 1802, Level 16, State 1, Line 1 CREATE DATABASE failed. Some file names listed could not be created. Check related errors.
An error occurred while the batch was being executed.
The reason for this error is that the SQL script created by VS contains these three lines:
:setvar DatabaseName "AutoDeployedTRS"
:setvar DefaultDataPath "\"
:setvar DefaultLogPath "\"
If I check the SQL instance properties (through the UI or by reading the registry), they are set correctly, so it seems like VS2010 can't pick them up for some reason.
Any ideas?
Try going to Schema Objects\Database level objects\Storage\Files.
Two files can be found there.
Open the file [your_database_name].sql and set the parameter
FILENAME = '$(DefaultDataPath)$(DatabaseName).mdf'.
Then open the file [your_database_name]_log.sql and set the parameter
FILENAME = '$(DefaultDataPath)$(DatabaseName)_log.ldf'.
After that, try to deploy your project. These parameters are now resolved during deployment according to the target instance's default paths. Hope it will help you.
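For illustration, the kind of statement such a file script boils down to might look roughly like this (a sketch only, not the exact script VS generates; the logical file name and size values are made up):
ALTER DATABASE [$(DatabaseName)]
    ADD FILE (NAME = [AutoDeployedTRS_Data], FILENAME = '$(DefaultDataPath)$(DatabaseName).mdf', SIZE = 3072KB, FILEGROWTH = 1024KB);
With $(DefaultDataPath) resolved to the instance's real data directory, the physical file name no longer collapses to '\AutoDeployedTRS.mdf'.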
