Professional Diploma in Digital Transformation – Big Data

Introduction

Big data usually refers to data sets whose size exceeds the ability of commonly used software tools to capture, curate, manage, and process within a tolerable elapsed time. It encompasses a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex, and of massive scale. In 2012, Gartner defined it as: “Big data is high volume, high velocity, and/or high variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimization.”

The objective of this program is to familiarize students with the knowledge and skills required to become competent and employable Big Data professionals. Programming knowledge in Java is important for a Big Data professional, as Hadoop and Hive are themselves written in Java (see the mapper sketch below).
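
To illustrate why Java matters here, the following is a minimal sketch of a Hadoop MapReduce mapper that emits (word, 1) pairs for a word count. It assumes the standard org.apache.hadoop.mapreduce API is on the classpath; the class name WordCountMapper is illustrative only.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Mapper that emits (word, 1) for every word in each input line.
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        StringTokenizer tokens = new StringTokenizer(value.toString());
        while (tokens.hasMoreTokens()) {
            word.set(tokens.nextToken());
            context.write(word, ONE); // key = word, value = count of 1
        }
    }
}

A matching reducer would sum the counts for each word; together they form the classic word-count job that most Hadoop courses begin with.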

This program consists of the following courses:

S/N  Course
1    Hadoop (MapReduce/HDFS)
2    ZooKeeper
3    HBase
4    Hive
5    Storm (distributed real-time computation)
6    Sqoop

Job Opportunities: Big Data Analyst.