Prerequisites
A Power BI account with a Power BI Premium Capacity, Premium Per User, or Power BI Pro license
BI Connector v6.2.8 or above.
The BI Connector Server Edition is installed on your Gateway Server
Note - If you are using a clustered Gateway setup, BI Connector must be installed and configured on all the Gateways in the cluster, because Power BI can route the refresh request to any Gateway in the cluster. The user configuring BI Connector on the Gateway Server must have Admin access to the server.
A Data Model in your Oracle Fusion or Analytics application with at least one dataset (the one you want to refresh incrementally in Power BI) that has a Date/Time column
A Data Source (for the Data Model that you want to incrementally refresh) created in the BI Connector application on the Gateway Server, using only step 1 of this article.
Step 1: Configure a Table With Incremental Refresh in BI Connector on Gateway Server
Open the BI Connector application on the Gateway Server
Select the Tables tab on the left pane, and click on the Add Table button.
Enter a table name that’s easy for you to identify, and select the Data Source (that you created as a prerequisite for the incremental refresh) from the drop-down. Then click Next: Select Dataset.
Navigate through the folder paths, select the data model that you created as a prerequisite, select the Dataset with the Date/Time column, and click Next: Configure Parameters.
Under the Configuration tab, enable the Incremental Refresh option. Once enabled, you will see the following options:
Date Time Column: Select the preferred column from the drop-down menu.
Format: Select the date format.
Fill in the parameter values as per your required data conditions, and click Save.
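Conceptually, the incremental refresh configured above restricts each extract to rows whose Date/Time column falls inside the refresh window. The sketch below is illustrative only (the column name ORDER_DATE, the sample rows, and the filter function are hypothetical, not BI Connector internals):

```python
from datetime import datetime

# Hypothetical sample rows; "ORDER_DATE" stands in for the
# Date/Time column selected in step 1.
rows = [
    {"id": 1, "ORDER_DATE": "2024-01-05 10:00:00"},
    {"id": 2, "ORDER_DATE": "2024-06-20 08:30:00"},
    {"id": 3, "ORDER_DATE": "2024-07-01 14:45:00"},
]

# The Format selected in step 1 tells the connector how to parse the column.
DATE_FORMAT = "%Y-%m-%d %H:%M:%S"

def incremental_window(rows, since, fmt=DATE_FORMAT):
    """Keep only rows whose Date/Time value falls on or after `since`."""
    return [r for r in rows if datetime.strptime(r["ORDER_DATE"], fmt) >= since]

# Only rows newer than the window boundary are re-extracted on a refresh.
recent = incremental_window(rows, since=datetime(2024, 6, 1))
print([r["id"] for r in recent])  # [2, 3]
```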
Step 2: Create a New Connection in Power BI Service
- Log in to Power BI Service
- Click on the Gear icon on the top right and click Manage Connections and Gateways
- Click on the + New at the top left to create a new connection
- Select the exact Gateway or the cluster (on which you configured the BI Connector) from the Gateway Cluster name drop-down field, and give a name to the connection on the Connection name field
- Select the Connection type as ODBC
- Enter the Data Source name (that you created on the BI Connector app on the Gateway Server as a prerequisite) in the Connection string field, with dsn= prefixed to it.
For example, if you named the Data Source as Oracle Fusion, the following will go into the connection string field:
dsn=Oracle Fusion
- Select the Authentication Method as Basic, and enter the Oracle credentials in the Username and Password fields.
- Click Create.
The connection will be added at this point, provided the credentials and connection string are valid.
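If the connection fails to validate, it can help to confirm the DSN directly on the Gateway Server. A minimal Python sketch (the Data Source name "Oracle Fusion" is just the example from above; the pyodbc check is an optional, illustrative assumption requiring the BI Connector ODBC driver and valid Oracle credentials):

```python
def build_connection_string(dsn_name: str) -> str:
    """Compose the value Power BI expects in the Connection string field:
    the BI Connector Data Source name with the dsn= prefix."""
    return f"dsn={dsn_name}"

print(build_connection_string("Oracle Fusion"))  # dsn=Oracle Fusion

# Optional: verify the DSN resolves on the Gateway Server itself,
# e.g. with pyodbc (shown for illustration only):
# import pyodbc
# conn = pyodbc.connect("DSN=Oracle Fusion;UID=<username>;PWD=<password>")
```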
Step 3: Set Up Your Dataflow
Log in to the Power BI Service:
Navigate to your workspace in Power BI Service where you want to create the dataflow.
Create a New Dataflow:
In your workspace, click + New item at the top left, and search for Dataflow Gen1. In the side pane that opens, select Dataflow Gen1.
Note - If a pop-up asks whether to create a Gen2 dataflow, click the No, create a Dataflow Gen1 button.
Under the Define new tables section, click the Add new tables button.
Connect to Your Data Source:
Choose the data source; in our case, it's ODBC.
Enter the Data Source name (that you created on the BI Connector app on the Gateway Server as a prerequisite) in the ODBC Connection string field, with dsn= prefixed to it.
For example, if you named the Data Source as Oracle Fusion, the following will go into the connection string field:
dsn=Oracle Fusion
Power BI will automatically map the connection that you created in step 2 to the Connection field. Alternatively, you can select one of the connections from that field's drop-down.
Click on Next
On the left, open the appropriate folder paths and select the table that you created in BI Connector
Click Transform Data
On the right side, under the Name field, give the table a name that’s easy for you to identify
Make any transformations to the data if required, and click Save & Close
Give a name to the Dataflow in the Name field on the Save your dataflow pop-up that opens
DO NOT CLICK the Refresh now button even if it shows up at this point
Step 4: Configure Incremental Refresh Settings
Go to the workspace, hover over the particular dataflow, click the three dots, and select the Edit option
Click on the Incremental Refresh icon under the Actions section
In the Incremental Refresh Settings panel that opens up, enable the toggle button
Under the Choose a DateTime column to filter by field, select the date time column based on which you want to incrementally refresh the data
Under the Store rows from the past section, provide the timeframe for which the dataset must retain data
Under the Refresh rows from the past section, provide the timeframe for which you want rows to be incrementally updated/added to the dataset on each refresh
Note - You can also use the Detect data changes and Only refresh complete periods options based on your requirements.
Click Save on the Incremental Refresh Settings panel
Now click on the Refresh Now button
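The Store rows from the past and Refresh rows from the past settings define two rolling windows. A rough Python sketch of how they relate (an approximation: the exact boundaries Power BI uses depend on the period granularity and options such as Only refresh complete periods):

```python
from datetime import date, timedelta

def refresh_windows(today, store_days, refresh_days):
    """Return (store_start, refresh_start): rows older than store_start are
    dropped from the dataflow; rows on or after refresh_start are re-queried
    on each incremental refresh."""
    return today - timedelta(days=store_days), today - timedelta(days=refresh_days)

# Example: store roughly 5 years (1825 days) of history, refresh the last 10 days.
store_start, refresh_start = refresh_windows(date(2024, 7, 1), 1825, 10)
print(store_start, refresh_start)
```

Only the small refresh window is re-extracted on each run, which is what makes incremental refresh cheaper than a full reload of the stored history.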
Step 5: Configure Refresh Schedules
Go to the workspace, hover over the particular dataflow, click the Schedule Refresh icon, and click the Refresh option
Select the appropriate timezone from the drop-down
Toggle the button to ON under the Configure a refresh schedule section, and schedule the incremental refresh based on your requirements
Configure the time slots of refresh, and refresh failure notifications based on your need.
Click Apply.
Quick Tips to Verify Incremental Refresh
Incremental refresh needs vary across use cases; here are some quick tips to help you verify the correctness of the refreshed data.
Check Refresh Logs:
Monitor the refresh history for the dataflow to ensure incremental refresh is working correctly.
Verify that only the incremental data (not the entire dataset) is being refreshed.
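Refresh history can be reviewed in the Power BI Service UI, and the Power BI REST API also exposes dataflow refresh transactions (the "Get Dataflow Transactions" endpoint). The payload below is a hypothetical sample shaped loosely like that response, and the summarizer is only a sketch of how a failing schedule could be spotted programmatically:

```python
# Hypothetical sample of a dataflow refresh-history payload.
transactions = [
    {"refreshType": "Scheduled", "status": "Success"},
    {"refreshType": "OnDemand",  "status": "Success"},
    {"refreshType": "Scheduled", "status": "Failed"},
]

def summarize(transactions):
    """Count refresh outcomes so a failing schedule stands out quickly."""
    summary = {}
    for t in transactions:
        summary[t["status"]] = summary.get(t["status"], 0) + 1
    return summary

print(summarize(transactions))  # {'Success': 2, 'Failed': 1}
```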
Validate Data:
Use Power BI Desktop to connect to the dataflow and verify the refreshed data.
Optimize Performance (Optional):
Avoid complex transformations directly in the dataflow to improve performance.