Is there any SOAP/REST API available for BI Publisher (BIP) administration? I'm mostly interested in the possibility of defining a JDBC data source for reports.
Full description of the situation:
For development purposes, BI Publisher runs as a Docker container. We have multiple environments (with separate databases), some of which will have a BI Publisher instance. We would like to automate BI Publisher environment creation, so we need to define the data source for BI Publisher reports dynamically, so that it corresponds to the appropriate environment.
No, not for data sources. But the createReport service takes a dynamic data model.
The server settings are actually stored in config files. You could write your own external web service to modify these config files and restart the server.
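If you go the config-file route: in a standalone BIP install, the data source definitions typically live in an XML file under the repository (often repository/Admin/DataSource/datasources.xml, but verify the path against your own container). Below is a minimal sketch of a script the container entrypoint could run to retarget a JDBC data source per environment; the path, element, and attribute names are all assumptions to check against your install:

```python
# Hedged sketch: patch BI Publisher's data source definition at container
# build/startup time. The repository path and element names below are
# assumptions based on a standalone BIP install; verify them against your
# own datasources.xml before relying on this.
import sys
import xml.etree.ElementTree as ET

DATASOURCES = "/u01/bip/repository/Admin/DataSource/datasources.xml"  # assumed path

def set_jdbc_source(name: str, url: str, user: str, password: str) -> None:
    tree = ET.parse(DATASOURCES)
    for ds in tree.getroot().iter("dataSource"):       # assumed element name
        if ds.get("name") == name:
            conn = ds.find("connection")               # assumed structure
            conn.find("url").text = url
            conn.find("username").text = user
            conn.find("password").text = password
            break
    else:
        sys.exit(f"data source {name!r} not found")
    tree.write(DATASOURCES, xml_declaration=True, encoding="UTF-8")

# e.g. called from the container entrypoint with per-environment values:
set_jdbc_source("reports_ds",
                "jdbc:oracle:thin:@dev-db:1521/DEVPDB", "app_user", "secret")
```

BIP only rereads this file on startup, so restart the container (or the server) afterwards, as noted above.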
I am creating a service that uses a few ADF pipelines, which will be responsible for processing a large number of big files.
The main goal is to avoid creating Azure services like the database, the Azure storage account, and the pipelines on a local developer account with a 50 EUR credit. The biggest drawback of such a solution is the size of the processed files. Files that big could burn through the credit and block such an account.
The structure of the project looks like this:
web-ui - API server - Azure Data Factory with pipelines (the unit used for processing files and running calculations). In order to develop and debug such a project, everything should be configurable on a local machine. The data factory pipelines will use database information in order to process files and create calculations on them.
I looked for different approaches to deploying such projects with ADF pipelines, but there is no general solution for this structure. Is it possible to simulate a pipeline and create a local instance of it on a developer's machine?
All environments are in the same tenant, same Azure Active Directory.
I need to push data from one environment's Common Data Service (Line of Business) to another environment's Common Data Service (the central enterprise CDS) that reporting runs from.
I've looked into using OData dataflows; however, this seems to be more of a manually triggered option.
OData dataflows are meant for, and designed to support, migration and synchronization of large datasets in Common Data Service in scenarios such as these:
A one-time cross-environment or cross-tenant migration is needed (for example, geo-migration).
A developer needs to update an app that is being used in production.
Test data is needed in their development environment to easily build out changes.
Reference: Migrate data between Common Data Service environments using the dataflows OData connector
For continuous data synchronization, use the CDS connector in Power Automate with attribute filters to push source CDS record updates to the target CDS entities.
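The flow itself is configured in the Power Automate designer rather than in code, but for clarity, here is a rough Python sketch of the same movement expressed directly against the CDS Web API. The entity, the attribute filter, the app registration, and the environment URLs are all illustrative assumptions:

```python
# Hedged sketch: the record movement the Power Automate CDS connector
# performs, expressed against the CDS Web API so the mechanics are visible.
# Entity/attribute names, URLs, and app registration details are
# illustrative assumptions, not a prescription.
import msal
import requests

TENANT = "contoso.onmicrosoft.com"
SOURCE = "https://lob-env.crm.dynamics.com"        # Line of Business env
TARGET = "https://central-env.crm.dynamics.com"    # central enterprise env

def token_for(env_url: str) -> str:
    app = msal.ConfidentialClientApplication(
        "app-client-id",
        authority=f"https://login.microsoftonline.com/{TENANT}",
        client_credential="app-secret")
    return app.acquire_token_for_client(
        scopes=[f"{env_url}/.default"])["access_token"]

# Read recently modified accounts from the source environment
# (the $filter plays the role of the attribute filter in the flow).
rows = requests.get(
    f"{SOURCE}/api/data/v9.1/accounts",
    params={"$select": "accountid,name",
            "$filter": "modifiedon gt 2020-01-01T00:00:00Z"},
    headers={"Authorization": f"Bearer {token_for(SOURCE)}"},
).json()["value"]

# Upsert each record into the target environment by its id.
for row in rows:
    requests.patch(
        f"{TARGET}/api/data/v9.1/accounts({row['accountid']})",
        json={"name": row["name"]},
        headers={"Authorization": f"Bearer {token_for(TARGET)}",
                 "Content-Type": "application/json"},
    ).raise_for_status()
```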
I currently have a desktop PBIX file that I manually publish to Power BI Web.
I have to keep different versions of the same PBIX file just to keep track of the different sources for each environment (Dev/QA/UAT/Prod, etc.).
I have more than one data source per environment, i.e. in the same PBIX file I have data coming from, say, Application Insights and a REST API.
I scanned through the Power BI community to see how to do this but couldn't find relevant information. All pointers are either about refreshing the local PBIX or about using the Scheduled Refresh option in Power BI Web.
Someone even wrote code to trigger Publish via OLE automation, but that's not an acceptable solution.
https://community.powerbi.com
I would like to automate this process such that:
A. I can provide the data source connection string/credentials externally, based on the environment I want to publish to.
B. The report is published to Power BI Web using a service account instead of my own.
Our current build and deployment toolset allows the use of PowerShell, Azure CLI, etc., so it would be helpful if the solution used those.
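For what it's worth, both A and B map onto documented Power BI REST API calls, which you can equally drive from PowerShell via Invoke-RestMethod. A hedged Python sketch, assuming the PBIX defines an Environment parameter in Power Query and that a service principal has been granted access to the workspace; all names and GUIDs are placeholders:

```python
# Hedged sketch of A + B: publish a PBIX as a service principal, then point
# the dataset at the right environment via dataset parameters. The dataset
# must actually define an "Environment" (or connection-string) Power Query
# parameter for UpdateParameters to have anything to update.
import msal
import requests

API = "https://api.powerbi.com/v1.0/myorg"
token = msal.ConfidentialClientApplication(
    "app-client-id",
    authority="https://login.microsoftonline.com/your-tenant-id",
    client_credential="app-secret",
).acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)["access_token"]
headers = {"Authorization": f"Bearer {token}"}

workspace = "workspace-guid"

# B: publish (import) the PBIX as the service principal, overwriting by name.
with open("report.pbix", "rb") as f:
    imp = requests.post(
        f"{API}/groups/{workspace}/imports",
        params={"datasetDisplayName": "MyReport",
                "nameConflict": "CreateOrOverwrite"},
        headers=headers,
        files={"file": f},
    )
imp.raise_for_status()

# A: retarget the freshly imported dataset to the chosen environment.
dataset = "dataset-guid"  # look this up via GET /groups/{id}/datasets after import
requests.post(
    f"{API}/groups/{workspace}/datasets/{dataset}/Default.UpdateParameters",
    json={"updateDetails": [{"name": "Environment", "newValue": "QA"}]},
    headers=headers,
).raise_for_status()
```

Note that the import runs asynchronously, so in a real pipeline you would poll the import status before updating parameters.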
Fetching data from SQL Azure won't need a refresh, but it's expensive.
In one of the organizations I worked for, they used views on SQL Azure to accomplish this task.
I need to configure service accounts for connecting to some of the services, and for that we are required to configure the details in a template file.
So basically, that means I want to configure the service accounts at run time.
We are using Oracle Service Bus 11g.
Since I've never worked on service accounts before, any suggestions will be helpful.
I found that we can do this at run time with the fn-bea:lookupBasicCredentials XQuery function, but this is not what we want. We want the service accounts generated dynamically through the template files.
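One way to read "generated dynamically through the template files" is plain text templating of the service account resource before it is imported into OSB. Service accounts are catalog resources (.sa XML files), so the reliable move is to export one from your own sbconsole and template that. The XML below is only schematic, not the real OSB schema; the sketch shows just the templating mechanics:

```python
# Hedged sketch: render an environment's credentials into a service-account
# template before importing it into OSB. The XML body here is schematic:
# take the real structure from a .sa file exported from your own console.
from string import Template

SA_TEMPLATE = Template("""\
<serviceAccount>              <!-- schematic, not the real OSB schema -->
  <description>$env service account</description>
  <staticAccount>
    <username>$username</username>
    <password>$password</password>
  </staticAccount>
</serviceAccount>
""")

ENVIRONMENTS = {
    "dev":  {"username": "svc_dev",  "password": "dev-secret"},
    "prod": {"username": "svc_prod", "password": "prod-secret"},
}

def render(env: str) -> str:
    creds = ENVIRONMENTS[env]
    return SA_TEMPLATE.substitute(env=env, **creds)

# The rendered file would then be packaged into the sbconfig jar and
# imported (e.g. via WLST) as part of the deployment.
with open("MyServiceAccount.sa", "w") as f:
    f.write(render("dev"))
```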
I have a project where I have to migrate data from one database to another. We have a copy of the database on our development server and I am planning to use SSIS with Visual Studio to do this. I have set up Connection Managers with data flows and control flows.
How do I configure it so that, in the future, all I have to do is change a setting to run the package against my production database instead of my DEV one? Will it be as easy as reconfiguring the connection managers? If I try to add a new Connection Manager and point the package at it, all my mappings seem to disappear.
You should use SSIS configurations to make the package portable between environments.
Best Practices for Integration Services Configurations
There are several options, including XML files, command-line options, etc.:
Command line options (/SET)
XML configuration file
Environment variable
Registry entry
Parent package variable
SQL Server
See Package Configurations
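As a concrete illustration of the command-line option above: a deployment script can override the connection manager's ConnectionString with /SET and run the very same package against either environment. A sketch, assuming dtexec is on the PATH and that the package's connection manager is named TargetDB (both placeholders):

```python
# Hedged sketch of the /SET option driven from a deployment script: the same
# package runs against DEV or Production by overriding the connection
# manager's ConnectionString at execution time. "TargetDB" and the paths are
# placeholders for your own package's names.
import subprocess

CONNECTION_STRINGS = {
    "DEV":  "Data Source=devsql;Initial Catalog=Staging;Integrated Security=SSPI;",
    "PROD": "Data Source=prodsql;Initial Catalog=Staging;Integrated Security=SSPI;",
}

def run_package(env: str) -> None:
    subprocess.run(
        [
            "dtexec",
            "/F", r"C:\packages\Migrate.dtsx",
            "/SET",
            r"\Package.Connections[TargetDB].Properties[ConnectionString];"
            + CONNECTION_STRINGS[env],
        ],
        check=True,
    )

run_package("DEV")  # flip to "PROD" when promoting
```

An XML configuration file (/CONF) achieves the same thing declaratively; the property path is the same in both cases.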