Building a Data Mart with Pentaho Data Integration
Getting Started
The Second-hand Lens Store (6:49)
The Derived Star Schema (4:29)
Setting up Our Development Environment (7:07)
Agile BI – Creating ETLs to Prepare Joined Data Set
Importing Raw Data (3:22)
Exporting Data Using the Standard Table Output (4:33)
Exporting Data Using the Dedicated Bulk Loading (4:32)
Agile BI – Building OLAP Schema, Analyzing Data, and Implementing Required ETL Improvements
Creating a Pentaho Analysis Model (3:25)
Analyzing Data Using Pentaho Analyzer (3:49)
Improving Your ETL for Better Data Quality (4:15)
Slowly Changing Dimensions
Creating a Slowly Changing Dimension of Type 1 Using Insert/Update (6:47)
Creating a Slowly Changing Dimension of Type 1 Using Dimension Lookup Update (4:58)
Creating a Slowly Changing Dimension Type 2 (5:18)
Populating the Date Dimension
Defining Start and End Date Parameters (5:17)
Auto-generating Daily Rows for a Given Period (4:26)
Auto-generating Year, Month, and Day (6:27)
Creating the Fact Transformation
Sourcing Raw Data for Fact Table (3:52)
Lookup Slowly Changing Dimension of the Type 1 Key (4:28)
Lookup Slowly Changing Dimension of the Type 2 Key (6:08)
Orchestration
Loading Dimensions in Parallel (6:20)
Creating Master Jobs (4:09)
ID-based Change Data Capture
Implementing Change Data Capture (CDC) (4:58)
Creating a CDC Job Flow (4:48)
Final Touches: Logging and Scheduling
Setting up a Dedicated DB Schema (1:22)
Setting up Built-in Logging (4:22)
Scheduling on the Command Line (5:30)