Writer initialization failed. Error opening output file. The system cannot find the path specified - informatica-powercenter

In Informatica PowerCenter I got an error like: Writer initialization failed. Error opening output file. The system cannot find the path specified.
I checked the directories and file names, but I'm confused about what exactly is wrong.

It's exactly as it says: the Writer failed to initialize because it could not locate the path and file specified.
Note that PowerCenter Workflows and Mappings are executed on the server. So while you develop on your local laptop (for example) and place a file in the C:\Temp folder, where you can see it, once you run the process it will be executed on the server. The server will not refer to your laptop; it will look for the C:\Temp location on its own local disk. And if that's a Unix box, there won't even be a C: path!
Hence the process fails with exactly the message you've seen: initialization failed, error opening output file. You need to place the file in a location accessible to the server.
In the case of a Writer, you name the target location where the file will be created - make sure the user PowerCenter runs as has write access there.
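Assuming you can run a small script on the Integration Service host, a quick sanity check of the server-side view of a directory might look like this (the C:\Temp path is just an example):

```python
import os
import tempfile

def check_target(path):
    """Return (exists, writable) for a target directory."""
    exists = os.path.isdir(path)
    writable = exists and os.access(path, os.W_OK)
    return exists, writable

# A directory that certainly exists and is writable on this machine:
print(check_target(tempfile.gettempdir()))  # -> (True, True)
# A path that may exist on a developer laptop but not on the server:
print(check_target(r"C:\Temp"))
```

If the second check comes back (False, False) when run on the server, the session will fail with exactly this writer initialization error.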

Related

Unable to save output from R scripts in system directory using DevOps pipeline

I am running R scripts on a self-hosted DevOps agent. My Windows agent is able to access the directory of the system it's hosted on. Below is the directory structure for my code:
Agent location: F:/agent
Source code: F:/agent/deployment/projects/project1/sourcecode
DWH dump: F:/agent/deployment/DWH_dump/2021/
Output location: F:/agent/deployment/projects/project1/output_data/2021
The agent uses CMD in the DevOps pipeline to trigger R on the system and use the libraries from the system directory.
Problem statement: I am unable to save the output from my R script into the output location directory. It gives a permission denied error pointing at that directory.
Output file format: file_name.rds, but the same issue happens even for a CSV file.
Command leading to failure: saveRDS(paste0(Output loca.,"/",file_name.rds))
Workaround: I did find a workaround: save the files to the source code directory first, then copy the same files to the output location directory. This works perfectly fine but costs me 2 extra hours of run time, because I have to save all the intermediary files and delete them at the end. Keeping the intermediary files in memory eats up my RAM.
I have not opened that directory anywhere on the machine; the only open application is the browser where the pipeline is running. I spent hours trying to figure out the reason, with no success. I even checked the system PATH to see whether I had mentioned that directory there, and it is not present.
When I run the same script directly on the machine using RStudio, I have no issues saving the file to any directory.
I've spent 2 full days already. Any pointers to the root cause could save me a few hours of runtime.
The solution was to set the Azure Pipelines agent service in Windows to run with admin credentials. The agent was not configured as an admin during creation, so after reconfiguring the service to run under my user ID, which has admin access on the VM, the pipelines were able to save files without any trouble.
Feels great, saved a few hours of run time!
I was able to achieve this by following this post.

Set a custom homepath for Mongo in Win 10

I installed Mongo on my machine (installed in the Program Files folder), setting up the data and log destinations on another drive.
Upon exit from the Mongo shell, it wants to write the .dbshell file to my user profile folder (C:\Users\Name\.dbshell). This fails because my username contains diacritic characters, which the Mongo shell parses incorrectly. This throws the following error:
Error saving history file: FileOpenFailed Unable to fopen() file C:\Users\Name and Surname\.dbshell: The system cannot find the path specified.
Can I change the environment variables of the Mongo application? I've looked in the configuration file, and the appropriate option does not seem to be listed in the comment lines. I haven't been able to find the answer on the configuration page so far. I also looked at the mongo reference page; while it does mention the environment variables and associated commands, using these does not seem to do anything, and the Mongo shell still throws an error on exit because of the incorrectly parsed path.
My question, then, is, how can I change the default path to which Mongo will write its .dbshell file?

wsadmin command gives warning *sys-package-mgr*: can't write cache file

When I run a Jython script using the wsadmin command on WAS 8.5 ND, it shows me the messages below:
[wsadmin] *sys-package-mgr*: processing modified jar, '/opt/IBM/WebSphere/AppServer/java_1.7_64/lib/ibmorbtools.jar'
[wsadmin] *sys-package-mgr*: can't write cache file for '/opt/IBM/WebSphere/AppServer/java_1.7_64/lib/ibmorbtools.jar'
I have provided 777 permissions on the specified directory. How can I get rid of these messages?
These warnings usually look like this:
...
*sys-package-mgr*: processing modified jar, '/usr/IBM/WebSphere/AppServer/plugins/org.eclipse.core.runtime.compatibility.jar'
*sys-package-mgr*: can't write cache file for '/usr/IBM/WebSphere/AppServer/plugins/org.eclipse.core.runtime.compatibility.jar'
*sys-package-mgr*: can't write index file
These are confusing. The source of the warnings is Jython, which underlies wsadmin. Jython scans the WebSphere jars, tries to create a .pkc file for each scanned jar, and finally creates or updates the package.idx file. The warnings are raised when the user running jython/wsadmin lacks the proper file permissions.
The user running wsadmin must have read and write permission on the Jython cache folder and all files it contains. The default cache folder is temp/cachedir within the WAS profile. For example, my Deployment Manager named Dmgr01 has its Jython cache folder at:
${WAS_INSTALL_ROOT}/profiles/Dmgr01/temp/cachedir
Once the proper rights are granted, the "can't write cache file for" and "can't write index file" warnings will be gone. "processing modified jar" will still appear whenever jars are updated or the temp folder is cleaned, which is a rare event for a WAS installation.
If wlst worked before and doesn't work now, this error means the temp directory "/tmp/wlstTemporacle/packages" is corrupted.
If you go into "/tmp/wlstTemporacle/packages", you will see that lots of *.pkc files are duplicated with a "$1" in their name. In your case, the file "ibmorbtools.pkc" will have a bogus sibling "ibmorbtools$1.pkc" there. This is not correct, and it means the directory is corrupted.
The solution is to remove the directory "/tmp/wlstTemporacle/packages" entirely, or even the whole "/tmp/wlstTemporacle". Then rerun wlst.sh; you will see a new directory structure created and all *.pkc files copied over correctly, and the error message above will go away.
Notes: in my case this is Oracle WebLogic 10.x.x.x running on Linux, whose WLST is a Jython application, and Java's temp dir points to "/tmp/wlstTemporacle". If your setup is different, the easiest way to find the temp directory for your environment is a full search for the file "ibmorbtools.pkc"; its name comes from the "ibmorbtools.jar" in your error message, only the suffix differs.
Hopefully, this will help!
Yubo

Windows Bat file run from Autosys - Failed to load the sqljdbc_auth.dll

I have a question and am wondering if you could help. Here are the details -
Program Flow ->
Autosys Job -> Windows .bat job on network path-> Internally calls java program along with few other components
When the .bat runs through the Autosys job, it gives the warning below, does partial processing, and exits without any failure
(I guess the internal code might not have good error handling, but sadly I do not have access to view or modify it). However, when I log on to that
Windows box and run the .bat file from a cmd prompt, it works like a charm and throws no error.
Warning in Logs - "WARNING: Failed to load the sqljdbc_auth.dll"
Things I tried-
I tried creating a wrapper bat file on the Windows C: drive (not the network path where the actual .bat is) and placed sqljdbc_auth.dll in that custom folder.
The job went to SU after partial processing.
Appreciate your help.
Many Thanks,
Raj
I'd suspect a path error. Try echo %path% just before calling your batch, or simply run
echo %path%>alogfile
call yourbatch
and if the displayed path doesn't include the directory in which your .dll resides, add it to the path before calling yourbatch.
It may very well be that your interactive logon includes the required directory as part of your user-defined path, while the username under which the job runs does not include that directory.
Or you could try moving the dll somewhere like System32, which should be common to everyone.
(Windows should use the standard path-scanning algorithm to locate any required dll that isn't invoked with a full pathname.)
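The check the answer suggests, whether the dll's folder is actually an entry on the PATH the job sees, can be sketched like this (the directory names are made up):

```python
import os

def dir_on_path(dll_dir, path_value):
    """True if dll_dir appears as an entry in a Windows-style PATH string."""
    wanted = os.path.normcase(dll_dir.rstrip("\\"))
    return any(os.path.normcase(p.rstrip("\\")) == wanted
               for p in path_value.split(";") if p)

# PATH as captured from an interactive logon vs. from the Autosys job:
interactive = r"C:\Windows\system32;C:\tools\sqljdbc;C:\Java\bin"
job = r"C:\Windows\system32;C:\Java\bin"
print(dir_on_path(r"C:\tools\sqljdbc", interactive))  # True
print(dir_on_path(r"C:\tools\sqljdbc", job))          # False
```

Comparing the two captured PATH strings this way points directly at the missing entry to add before calling the batch.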

FileSystemObject - default location

When I debug my application (in the VB6 IDE), I have to specify the absolute path (e.g. c:\logfile.log) to the log file, otherwise nothing is written to it. However, when the application is live I do not have to specify the absolute path, i.e. I can specify logfile.log. Why is this?
The log file is always in the same directory as the .exe and .dll.
Your file is being written to the current working directory.
When your exe is running, this is the folder the exe sits in; however, in debug mode your exe is actually running from the temporary build location (I can't remember exactly where this is in VB6).
You can test this simply by doing MsgBox(App.Path) in your program and seeing what location appears.
You'll probably find that there is a logfile.log in the location that appears when you run the above command while debugging.
