Course Duration

  2 – 3 months

Course Hours

  45 – 55 hours

Project Format

  Theoretical knowledge + team collaboration

Project Overview

  The project team will help a U.S. consumer financial services company find a definitive solution for integrating discrete data scattered across multiple systems. The team must meet the needs of the company's finance, accounting, risk analysis, and marketing departments to resolve their data problems in a single effort, while also delivering an end-to-end solution that supports both standardized and ad hoc analytical reporting.

Who Should Enroll

  Students pursuing data-focused roles. Successful alumni have gone on to big-data-related positions in corporate data analytics departments, fintech company data analytics teams, consulting firms, and similar employers.

Selected Student Success Stories
Program Outline
Week 1: Data Technology's Evolution
  Core: RDBMS fundamentals; normalization and denormalization design
  Hands-on: Design a relational table
Week 2: Table and Schema Design – Keys and Indexes
  Core: Basic DML statements (INSERT, UPDATE, DELETE)
  Hands-on: Design a relational schema for a real business problem
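  The kind of schema and basic DML the Week 2 lab calls for could look like the sketch below; the customers/orders tables, columns, and values are hypothetical and purely illustrative.

    -- Parent table with a primary key
    CREATE TABLE customers (
        customer_id   INT          PRIMARY KEY,
        customer_name VARCHAR(100) NOT NULL,
        state_code    CHAR(2)      NOT NULL
    );

    -- Child table with a foreign key back to customers
    CREATE TABLE orders (
        order_id     INT           PRIMARY KEY,
        customer_id  INT           NOT NULL REFERENCES customers(customer_id),
        order_date   DATE          NOT NULL,
        order_amount DECIMAL(10,2) NOT NULL
    );

    -- Index to speed up lookups by customer
    CREATE INDEX ix_orders_customer ON orders (customer_id);

    -- Basic DML: INSERT / UPDATE / DELETE
    INSERT INTO customers (customer_id, customer_name, state_code) VALUES (1, 'Acme Corp', 'CA');
    INSERT INTO orders (order_id, customer_id, order_date, order_amount) VALUES (100, 1, '2020-01-15', 250.00);
    UPDATE orders SET order_amount = 275.00 WHERE order_id = 100;
    DELETE FROM orders WHERE order_id = 100;
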
Week 3: DML Language – SELECT
  Core: Basic SELECT form and JOIN
  Hands-on: Answer business questions with basic SELECT statements
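  A minimal example in the style of the Week 3 hands-on, run against the hypothetical orders table sketched under Week 2:

    -- Illustrative business question: which orders placed in 2020 exceeded $200?
    SELECT order_id,
           order_date,
           order_amount
    FROM   orders
    WHERE  order_amount > 200
      AND  order_date >= '2020-01-01'
    ORDER BY order_amount DESC;
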
Week 4: DML Language – Deep Dive into JOIN
  Hands-on: Use JOINs to solve business questions; practice and lab
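  One way the Week 4 exercises might combine the two hypothetical tables above; the INNER JOIN keeps only customers that have orders, while the LEFT JOIN keeps every customer (a sketch, not the course's official solution):

    -- Revenue per customer: INNER JOIN drops customers with no orders
    SELECT c.customer_name,
           SUM(o.order_amount) AS total_revenue
    FROM   customers c
    JOIN   orders    o ON o.customer_id = c.customer_id
    GROUP BY c.customer_name;

    -- LEFT JOIN keeps every customer, reporting 0 for those with no orders
    SELECT c.customer_name,
           COALESCE(SUM(o.order_amount), 0) AS total_revenue
    FROM   customers c
    LEFT JOIN orders o ON o.customer_id = c.customer_id
    GROUP BY c.customer_name;
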
Week 5: DML Language – Advanced Topics
  Core: Subqueries, CTEs, and recursive CTEs
  Hands-on: Code challenge; use a recursive CTE to solve a business problem
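  A self-contained recursive CTE sketch in the spirit of the Week 5 code challenge; the employee hierarchy is invented, and engines such as PostgreSQL or MySQL spell the opening clause WITH RECURSIVE:

    -- Hypothetical employee data inlined as a CTE so the example is runnable
    WITH employees (employee_id, manager_id, employee_name) AS (
        SELECT 1, CAST(NULL AS INT), 'CEO'        UNION ALL
        SELECT 2, 1,                 'VP Finance' UNION ALL
        SELECT 3, 2,                 'Analyst'
    ),
    org_chart (employee_id, employee_name, depth) AS (
        -- Anchor member: top of the hierarchy
        SELECT employee_id, employee_name, 0
        FROM   employees
        WHERE  manager_id IS NULL
        UNION ALL
        -- Recursive member: attach direct reports, one level at a time
        SELECT e.employee_id, e.employee_name, oc.depth + 1
        FROM   employees e
        JOIN   org_chart oc ON e.manager_id = oc.employee_id
    )
    SELECT * FROM org_chart;
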
Week 6: DML Language – Advanced Topics Practice
  Core: ERDs, data exploration, SQL scalar and aggregate functions
  Lab: Use an ERD to guide data exploration
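  A small sketch of the Week 6 combination of scalar and aggregate functions on the hypothetical orders table; EXTRACT is the ANSI scalar function used here, whereas SQL Server would use YEAR() and MONTH():

    -- Monthly order counts and revenue
    SELECT EXTRACT(YEAR  FROM order_date) AS order_year,
           EXTRACT(MONTH FROM order_date) AS order_month,
           COUNT(*)                       AS order_count,
           SUM(order_amount)              AS total_amount,
           AVG(order_amount)              AS avg_amount
    FROM   orders
    GROUP BY EXTRACT(YEAR FROM order_date), EXTRACT(MONTH FROM order_date)
    ORDER BY order_year, order_month;
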
Week 7: DML Language – Advanced Topics
  Core: Window functions
  Hands-on: Code challenge; OLAP-style business problems
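  A self-contained window-function sketch in the OLAP spirit of Week 7; the sales rows are invented for illustration:

    WITH sales (region, sale_month, amount) AS (
        SELECT 'East', 1, 100 UNION ALL
        SELECT 'East', 2, 150 UNION ALL
        SELECT 'West', 1, 200 UNION ALL
        SELECT 'West', 2, 120
    )
    SELECT region,
           sale_month,
           amount,
           -- Running total within each region, ordered by month
           SUM(amount) OVER (PARTITION BY region ORDER BY sale_month
                             ROWS UNBOUNDED PRECEDING) AS running_total,
           -- Rank of each row across all regions by amount
           RANK() OVER (ORDER BY amount DESC)          AS overall_rank
    FROM sales;
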
Week 8: SQL Programming
  Core: T-SQL, control flow, loops, cursors, stored procedures
  Hands-on: SQL Server-specific features and tools
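  A minimal T-SQL sketch of the Week 8 topics: a WHILE loop with IF/ELSE, then a stored procedure over the hypothetical orders table; the procedure name and threshold are made up:

    -- Control flow: a WHILE loop with IF/ELSE
    DECLARE @i INT = 1;
    WHILE @i <= 3
    BEGIN
        IF @i % 2 = 0
            PRINT 'even: ' + CAST(@i AS VARCHAR(10));
        ELSE
            PRINT 'odd: '  + CAST(@i AS VARCHAR(10));
        SET @i = @i + 1;
    END;
    GO

    -- A stored procedure wrapping a parameterized query
    CREATE PROCEDURE dbo.usp_orders_over_amount
        @min_amount DECIMAL(10,2)
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT order_id, order_date, order_amount
        FROM   orders
        WHERE  order_amount >= @min_amount;
    END;
    GO

    EXEC dbo.usp_orders_over_amount @min_amount = 200.00;
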
Week 9: Data Warehouse Technology – MPP RDBMS
  Core: MPP-style computation, Teradata
  Lab: Set up Teradata and execute SQL against it
Week 10: Teradata Deep Dive – Concepts and Features
  Core: Deep dive into EXPLAIN; Teradata Tools and Utilities (TTU)
  Lab: TTU practice 1
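  In Teradata, the EXPLAIN modifier that Week 10 examines can be placed in front of a request to show the optimizer's plan; a one-query sketch against the hypothetical orders table:

    EXPLAIN
    SELECT customer_id,
           SUM(order_amount) AS total_amount
    FROM   orders
    GROUP BY customer_id;
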
Week 11: Teradata Deep Dive – Features, Part 2
  Core: TTU, part 2
  Lab: TTU practice 2
Week 12: Data Pipeline I – Basic Data Pipelines
  Core: Data pipelines and data flows; introduction to SSIS
  Lab: Data flow labs 1–2
Week 13: Data Pipeline II – Complex Data Pipelines
  Core: Complex data flows
  Lab: Data flow labs 3–4
Week 14: Data Pipeline III – Advanced Topics
  Core: Control flow, variables, parameters, job scheduling
  Lab: Control flow labs 1–2
Week 15: Big Data – Introduction
  Core: Linux cheat sheet, Hadoop history and architecture, HDFS
  Lab: Setup and data preparation
Week 16: Big Data – Build Data Pipelines Using Hadoop Tools I
  Core: MapReduce (concepts and framework), Pig Latin
  Lab: Build a simple MapReduce program; Pig hands-on
Week 17: Big Data – Build Data Pipelines Using Hadoop Tools II
  Core: Spark, Sqoop, and Hive; architecture and concepts
  Lab: Data manipulation using Spark, Sqoop, and Hive
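  A HiveQL sketch of the Week 17 lab pattern: declare an external table over files already landed in HDFS, then run a familiar SQL aggregation on it; the path, table, and columns are hypothetical:

    -- External table over delimited files in HDFS
    CREATE EXTERNAL TABLE IF NOT EXISTS orders_raw (
        order_id     INT,
        customer_id  INT,
        order_date   STRING,
        order_amount DOUBLE
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/orders';

    -- The same aggregation pattern as the RDBMS weeks, now running on Hadoop
    SELECT customer_id,
           COUNT(*)          AS order_count,
           SUM(order_amount) AS total_amount
    FROM   orders_raw
    GROUP BY customer_id;
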
Week 18: Conclusion and Q&A