A Data Flow combines a set of activities (steps) with commands to fill, process, and manage the data warehouse content. The steps are connected to each other to form a sequence.
Data Flows can be scheduled to run automatically by the Profitbase Server. This way, your business data stays up to date without any manual work.
To add a new Data Flow, right-click the Data Flows folder in the navigator and select what to add from the menu.
The Data Flow Builder is a graphical designer that lets you easily edit and manage your Data Flows.
The following table lists the types of steps/activities supported in Profitbase Studio 6. Each step has its own settings.
Note: Upgraded data flows may contain obsolete steps (marked in red/pink). These should be replaced to ensure correct operation.
| Task group | Step | Description |
| --- | --- | --- |
| Generic Tasks | SQL Expression | This activity lets you execute a custom SQL statement (see the example below the table). |
| | SSIS Package Exec | This activity lets you execute an SSIS Package. The package is executed using dtexec.exe on your local machine (see the example below the table). |
| | Parallel Task | This activity lets you execute two or more Data Flow activities in parallel by dragging and dropping them onto a Parallel Task activity. |
| | Execute Data Flows | This activity lets you execute Profitbase Studio Data Flows. |
| | Execute Script(s) | This activity lets you execute Script Extension(s). |
| | Web API Call | This activity lets you send a command to a web API (see the example below the table). |
| | MS Azure App. Insights Trace | This activity lets you send messages to MS Azure Application Insights. |
| | Log Maintenance | This activity deletes older items from the Operation and/or Message logs. Older data can be copied to a historic table. |
| | Override Content Status | This activity lets you override the status of selected solution content. |
| | Override Dataflow Status | This activity lets you override the status of the current data flow. |
| Data Source | Reload Source Data | This activity lets you load or reload data from your Data Sources. |
| Dimension Task | Generate Dimensions | This activity lets you load or update your Dimensions. |
| Module Tasks | Module Command(s) | This activity lets you run commands for processing modules in the data warehouse database. Dimensions and module definition fact results are processed according to the commands defined here. |
| | SSAS Command(s) | This activity lets you run commands against an SSAS database. |
| | SSAS XMLA Expression | This activity lets you execute a custom XMLA statement against a given SSAS database (see the example below the table). |
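The SQL Expression step accepts any statement your data warehouse database can run. A minimal sketch, assuming hypothetical table and column names that are not part of the product:

```sql
-- Hypothetical example: rebuild a summary table before later steps use it.
-- Replace dbo.SalesSummary and dbo.FactSales with objects from your own data warehouse.
TRUNCATE TABLE dbo.SalesSummary;

INSERT INTO dbo.SalesSummary (ProductId, FiscalYear, TotalAmount)
SELECT ProductId, FiscalYear, SUM(Amount)
FROM dbo.FactSales
GROUP BY ProductId, FiscalYear;
```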
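The SSIS Package Exec step runs the package through dtexec.exe, with the actual arguments built from the step settings. As an illustration only, a dtexec call for a file-based package (the path is a hypothetical placeholder) looks like this:

```
dtexec /F "C:\SSIS\LoadSalesData.dtsx"
```

Packages stored in SQL Server can instead be referenced with the /SQL and /SERVER switches of dtexec.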
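The Web API Call step issues an HTTP request to the endpoint configured in its settings. As a hypothetical illustration (the URL, method, and body are placeholders, not a Profitbase API), the request could look like:

```http
POST https://example.com/api/jobs/refresh HTTP/1.1
Content-Type: application/json

{ "scope": "sales", "action": "refresh" }
```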
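The SSAS XMLA Expression step sends the statement you provide to the selected SSAS database. A minimal sketch of such a statement, assuming a multidimensional database with the hypothetical ID MyOlapDatabase, is a standard XMLA Process command:

```xml
<!-- Hypothetical example: fully process the SSAS database named in DatabaseID. -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyOlapDatabase</DatabaseID>
  </Object>
  <Type>ProcessFull</Type>
</Process>
```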
Generic settings:
All steps must be connected. Right-click a step and select [Create Connection]; a connecting line is shown. Click the connector box on the next step to complete the connection.
Right-click the connector line to open a menu with options for that connection.
A Data Flow can be scheduled or run manually on the Profitbase Studio server from Schedule Management.
Otherwise, click [Execute], or right-click a step and select [Execute Current Step] or [Execute Current and Subsequent Steps], to run or test it from the user interface.
When executing, the message log is shown at the bottom.
Note: When run via the Profitbase Studio Server, the server service's user account is used unless otherwise set in each step (Run As) or in the specific data warehouse connection settings. This service user may differ from the Profitbase Studio user.