Indirect file loading in Informatica PowerCenter - informatica-powercenter

An application server directory is mounted on the ETL server for various project reasons. Informatica processes indirect files and reads data from all the listed files when the session points to the Informatica server path, but it randomly fails with the error "couldn't read indirect file" when pointing to the newly mounted server directory, even though that directory is present on the same host. Permissions and group ownership are the same for both directories. Note: there is no issue with direct files in either directory. Kindly help.
Is there any catch to indirect file processing in Informatica? Must indirect files be placed only in the ETL server's srcfiles directory?
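For reference, a PowerCenter indirect (file list) source is just a plain text file whose lines are the paths of the actual data files, and the Integration Service must be able to read both the list file and every file it names at run time. A minimal sketch, with hypothetical paths:

$ cat /etlserver/srcfiles/customers_list.txt
/etlserver/srcfiles/customers_day1.dat
/mnt/appserver/feeds/customers_day2.dat

If the list file itself, or any listed path, stops resolving on the Integration Service host mid-session (an intermittently dropping mount, for instance), the session can fail even though direct-file sessions against the same directory succeed, so one thing worth verifying is that the mounted paths resolve from the ETL server itself while a load is running.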

Related

Writer initialization failed. Error opening output file. The system cannot find the path specified

In Informatica PowerCenter I got an error like: Writer initialization failed. Error opening output file. The system cannot find the path specified.
I checked the directories and file names, but I'm confused about what exactly is wrong.
It's exactly as it says: the Writer failed to initialize because it was not able to locate the path and file specified.
Note that PowerCenter Workflows and Mappings are executed on the Server. So while you develop on your local laptop (for example) and place a file in the C:\Temp folder, where you can see it, once you run the process it is executed on the Server. The Server will not refer to your laptop; it will look for the C:\Temp location on its own local disk. And if it's a Unix box, there won't even be a C: path!
Hence the process fails with exactly the message you've seen: initialization failed, error opening output file. You need to place the file in a location accessible to the Server.
In the case of the Writer, you name the target location where the file will be created - make sure the user PowerCenter runs as has write access to it.
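To rule out the path and permission side quickly, a check on the Server host helps; the directory below is a hypothetical stand-in for your target file directory, and the commands should be run as the OS user the Integration Service runs as:

$ ls -ld /infa/tgtfiles
$ touch /infa/tgtfiles/_write_test && rm /infa/tgtfiles/_write_test

If the touch fails, the Writer will fail for the same reason.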

Duplicity Restore Throwing "IsADirectoryError: Is a directory" Error

My Linux machine recently failed and I am trying to restore my files onto a Windows 11 machine. The files were created using Duplicity (the external HD containing them has hundreds of .difftar.gz and .sigtar.gz files as well as a .manifest). Having installed Cygwin and the duplicity package, I traverse to my external HD in Cygwin...
$ pwd
/cygdrive/e
... and attempt to restore the latest snapshot of my lost directories/files to a temp folder on my Windows 11 machine by running:
duplicity restore file:/// /cygdrive/c/Users/john/OneDrive/Documents/temp
At this juncture, the restoration fails with an "IsADirectoryError":
Warning, found the following remote orphaned signature file:
duplicity-new-signatures.20211221T070230Z.to.20211224T103806Z.sigtar.gz
Warning, found signatures but no corresponding backup files
Warning, found incomplete backup sets, probably left from aborted session
Synchronizing remote metadata to local cache...
Copying duplicity-full-signatures.20211118T103831Z.sigtar to local cache.
Attempt of get Nr. 1 failed. IsADirectoryError: Is a directory
Attempt of get Nr. 2 failed. IsADirectoryError: Is a directory
Attempt of get Nr. 3 failed. IsADirectoryError: Is a directory
Attempt of get Nr. 4 failed. IsADirectoryError: Is a directory
Giving up after 5 attempts. IsADirectoryError: Is a directory
Is there an error in my duplicity command? Do I have corrupted backups? Any assistance in troubleshooting this would be greatly appreciated!
Let's assume that duplicity works under Cygwin (it's not officially supported on Windows in any way; never tried it).
Say your backup data exists in the root of your external hard drive mounted as E:, and you want to restore the complete last backup into the folder C:\Users\john\OneDrive\Documents\temp\.
Two points:
Point duplicity to your backup location properly. In your case that would be /cygdrive/e/ or, as a URL, file:///cygdrive/e/.
Point to your target folder as a folder, ending with a slash /, to signal that the backup is to be restored in there.
Taking these points into account, a command like
duplicity file:///cygdrive/e/ /cygdrive/c/Users/john/OneDrive/Documents/temp/
should work as expected.
NOTE: you don't need the action command restore, as the order of the arguments (URL before local file system location) already tells duplicity that you want to restore.
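As a side note, if you only need part of the backup rather than the whole tree, duplicity's --file-to-restore option restores a single file or directory given as a path relative to the backup root; the subpath here is hypothetical:

duplicity --file-to-restore home/john/Documents file:///cygdrive/e/ /cygdrive/c/Users/john/OneDrive/Documents/temp/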

Nifi: Failed to retrieve directory listing when connecting to FTP server

I have a ListenFTP processor opened on a port, and when I try to connect to it via FileZilla I get the error "Failed to retrieve directory listing".
The connection seems to be established first, but then this error occurs.
NiFi is hosted on an Ubuntu server running in a Docker image.
The ListenFTP processor is opened on port 2221.
I tried to change some configuration in FileZilla based on this issue, but nothing worked.
The connection works well on localhost: I can connect to the FTP server and transfer files.
Does someone have an idea how to solve this?
If you look at the documentation of the processor, it states that:
"After starting the processor and connecting to the FTP server, an empty root directory is visible in the client application. Folders can be created in and deleted from the root directory and any of its subdirectories. Files can be uploaded to any directory. Uploaded files do not show in the content list of directories, since files are not actually stored on this FTP server, but converted into FlowFiles and transferred to the next processor via the 'success' relationship. It is not possible to download or delete files like on a regular FTP server. All the folders (including the root directory) are virtual directories, meaning that they only exist in memory and do not get created in the file system of the host machine. Also, these directories are not persisted: by restarting the processor all the directories (except for the root directory) get removed. Uploaded files do not get removed by restarting the processor, since they are not stored on the FTP server, but transferred to the next processor as FlowFiles."

Unable to save output from R scripts in system directory using DevOps pipeline

I am running R scripts on a self-hosted DevOps agent. My Windows agent is able to access the directories of the system where it's hosted. Below is the directory structure for my code:
Agent location: F:/agent
Source code: F:/agent/deployment/projects/project1/sourcecode
DWH dump: F:/agent/deployment/DWH_dump/2021/
Output location: F:/agent/deployment/projects/project1/output_data/2021
The agent uses CMD in the DevOps pipeline to trigger R on the system and use the libraries from the system directory.
Problem statement: I am unable to save the output from my R script into the output location directory. It fails with a "permission denied" error pointing to that directory.
Output file format: file_name.rds, but the same issue happens even for a CSV file.
Command leading to failure: saveRDS(object, paste0(output_loc, "/", "file_name.rds"))
Workaround: I did find one - I save the files to the source code directory first and then copy them to the output location. This works perfectly fine but costs me 2 extra hours of run time, because I have to write out all the intermediary files and delete them at the end; keeping them in memory instead eats up my RAM.
I have not opened that directory anywhere on the machine. The only application open in my explorer is the browser where the pipeline is running. I spent hours trying to figure out the reason, with no success. I even checked the system PATH to see whether I had mentioned that directory there; it's not present.
When I run the same script directly on the machine using RStudio, I have no issues saving the file to any directory.
I've spent 2 full days on this already. Any pointers to the root cause could save me a few hours of runtime.
The solution was to set the Azure Pipelines agent service in Windows to run with admin credentials. The agent was not configured as an admin during creation, so after reconfiguring it to run under my user ID, which has admin access on the VM, the pipelines were able to save files without any trouble.
Feels great - saved a few hours of run time!
I was able to achieve this by following this post.
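As a narrower alternative to running the whole agent service as admin, the service account can be granted modify rights on just the output folder. A sketch using Windows' icacls, where the service account name is an assumption (check which account the agent service runs under in services.msc):

icacls "F:\agent\deployment\projects\project1\output_data" /grant "NT SERVICE\vstsagent.myvm.agent1":(OI)(CI)M

(OI)(CI)M grants modify access that is inherited by files and subfolders, so new run folders under output_data stay writable.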

Unable to find TeamCity 9.1.x data directory

This is really weird.
I am trying a clean TeamCity 9.1.1 install, but the Data Directory is nowhere to be found.
If I access the Global Settings tab under Administration, it lists "C:\Windows\System32\config\systemprofile\.BuildServer" - a folder that doesn't exist.
If I try to browse to that folder, it shows me a range of files; uploading a specific file there instead uploads it to C:\Windows\SysWOW64\config\systemprofile\.BuildServer.
There is no teamcity-startup.properties file anywhere - I am unable to customize the location of the data directory.
When I restore a backup, the backup files are instead restored to C:\Users\[user name]\.BuildServer rather than to the correct data directory.
Does anyone have any suggestions on how to regain control of the situation? How can I tell TeamCity which data folder to use?
I resolved the situation by:
stopping TC services;
creating a teamcity-startup.properties file in [install folder]\conf with the following content:
teamcity.data.path=D:\\[install folder]\\config
restarting TC services;
restoring my backup.
This restored the 9.1.1 install and stabilized the location of the data directory. After this was done, the subsequent installation of 9.1.7 prompted me to uninstall 9.1.1 first (which it hadn't done the first time around) and the upgrade succeeded.
I believe the installation was already compromised from the beginning, unknown to me, with the data folder scattered all over the place. Once that was resolved, everything else fell into place.
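As an aside, the data directory can also be pinned with the TEAMCITY_DATA_PATH environment variable instead of teamcity-startup.properties; a sketch for setting it machine-wide on Windows, with a hypothetical path (the TeamCity services need a restart to pick it up):

setx TEAMCITY_DATA_PATH "D:\TeamCity\DataDir" /M

Either mechanism stops the server from guessing a per-profile .BuildServer location under whichever account the service happens to run as.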
