Data Warehouse Big Data Integration - Proof of Concept

The objective of this proof of concept project is to evaluate the feasibility of converting a traditional ETL architecture for data warehouse loads into a hybrid approach with big data integration.
 
Refer to the following post for architectural details.
 
  • Proof of Concept - Project Plan

The POC project has a timeline of 4 weeks. The following activities are planned during this period.
  1. Define Business goals and corresponding use cases.
  2. Setup and Configuration.
  3. Architecture and Design.
  4. Development.
  5. Evaluation and recommendations.
 
 
  • System Hardware Architecture

Minimal hardware investment is planned for the POC project. The cluster is configured with 1 master node and 3 slave nodes.
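As a sketch of how this cluster shape maps onto Hadoop configuration (hostnames are placeholders, and this assumes a standard Apache Hadoop setup): the three slave nodes are listed in the workers file, and the HDFS replication factor of 3 fits the slave count exactly.

```xml
<!-- hdfs-site.xml fragment: replication factor matches the 3 slave nodes.
     The workers file ($HADOOP_HOME/etc/hadoop/workers) would list the
     placeholder hostnames slave1, slave2, slave3, one per line. -->
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
```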
 

 

  • System Software Architecture

 
  • Application Use Cases

Application use cases include the following.
  1. Data ingestion from OLTP to HDFS using Sqoop.
  2. Load Facts using MapReduce jobs.
  3. Create Aggregates using Hive QL.
  4. Export processed Aggregates from HDFS to DW using Sqoop.
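The four use cases above can be sketched as a command pipeline. This is a minimal illustration, not the POC's actual jobs: connection strings, credentials, table names, and HDFS paths are all hypothetical, and the fact/aggregate steps are shown with Hive QL in place of a hand-written MapReduce job.

```shell
# Step 1: Ingest a source table from the OLTP database into HDFS with Sqoop.
# Host, database, user, and table names are placeholders.
sqoop import \
  --connect jdbc:mysql://oltp-host:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /dw/staging/orders \
  --num-mappers 4

# Steps 2-3: Load the fact table and build a daily aggregate with Hive QL.
hive -e "
  INSERT OVERWRITE TABLE fact_orders
  SELECT order_id, customer_id, amount, order_date
  FROM staging_orders;

  INSERT OVERWRITE TABLE agg_daily_sales
  SELECT order_date, SUM(amount) AS total_amount
  FROM fact_orders
  GROUP BY order_date;
"

# Step 4: Export the processed aggregate from HDFS back to the DW with Sqoop.
sqoop export \
  --connect jdbc:oracle:thin:@dw-host:1521/DWH \
  --username dw_user -P \
  --table AGG_DAILY_SALES \
  --export-dir /user/hive/warehouse/agg_daily_sales \
  --num-mappers 4
```

The same flow could be orchestrated with Oozie or cron once the individual steps are validated.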

  • Performance Use Cases

 

 
