Big Data Training

Hadoop/Big Data Training at MetaScale

Enterprise Focused Big Data Training

Our big data training courses provide an enterprise-focused approach and a vendor-neutral view of working with big data tools.

Whether you are beginning your big data journey or you are looking to delve deeper into big data analytics, the next step is to jump into hands-on exercises and develop skills for coding in MapReduce, Pig, Hive, Storm, YARN, Spark and more. MetaScale offers courses in the latest Hadoop tools to help you solve the puzzle of implementing big data technologies in the enterprise.

MetaScale Tech Academy

MetaScale grew out of a Fortune 100 enterprise that leverages the Hadoop ecosystem to manage several petabytes of data and deliver new business intelligence capabilities. We established MetaScale Tech Academy to offer big data courses based on our heritage of implementing Hadoop technology in the enterprise.

MetaScale Tech Academy packages Hadoop training to help organizations quickly address the scarcity of talent and experience available for today’s big data projects. 

Course Itinerary
Hadoop 101: Begin Your Big Data Journey
A good crash course to actually see the tools working rather than just hearing about them

Course Description

This full-day, instructor-led workshop gives both seasoned technical professionals and beginners an opportunity to understand Big Data and Hadoop. MetaScale offers a vendor-agnostic view of the Hadoop ecosystem and introduces you to the key technologies that make up a Hadoop environment.

By the end of the Workshop you will

  • Understand what Hadoop is and the history of Hadoop
  • Understand when to use Hadoop and when not to use Hadoop
  • Develop a high-level understanding of the following Hadoop components: Hive with MySQL, Pig, MapReduce, HDFS, Sqoop, Flume, HBase, Oozie and Zookeeper
  • Perform basic hands-on tasks using Pig, Hive, HBase and HDFS in a customized VMware cluster (see the HDFS sketch after this list)
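
To give a feel for the kind of basic HDFS tasks covered in the lab, here is a minimal sketch using the standard Hadoop FileSystem Java API. The NameNode address and file paths are illustrative assumptions, not part of the course materials.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Minimal sketch: create a directory, upload a local file, and list its contents
// using the HDFS Java API. The host name and paths are illustrative assumptions.
public class HdfsBasics {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumes an HDFS NameNode is reachable at this (hypothetical) address.
        conf.set("fs.defaultFS", "hdfs://localhost:8020");

        try (FileSystem fs = FileSystem.get(conf)) {
            Path labDir = new Path("/user/student/lab1");
            fs.mkdirs(labDir);                                         // like: hdfs dfs -mkdir -p
            fs.copyFromLocalFile(new Path("data/sample.txt"), labDir); // like: hdfs dfs -put

            for (FileStatus status : fs.listStatus(labDir)) {          // like: hdfs dfs -ls
                System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
            }
        }
    }
}
```

The comments note the equivalent hdfs dfs shell commands, which is how the same tasks are typically performed from the command line in the lab VM.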

All students receive a USB drive with access to course materials including instructor deck, lab deck, Linux/Unix cheat sheet and a copy of the VM software. If you are ready to take a practical look at Hadoop, we strongly encourage you to attend this comprehensive one-day course and begin your Big Data journey.

Audience

  • IT Professionals: Managers, System Engineers, Developers, Statisticians, Data Analysts, Data Scientists
  • C-level executives
  • Business Managers
  • Educators

Prerequisites

  • None. This course is designed for anyone looking to understand Hadoop, regardless of prior programming experience

Registration Information

Email:
Phone: 1.800.234.8769

Course Itinerary
Hadoop Fundamentals for Developers

Course Description

MetaScale Tech Academy presents Hadoop Fundamentals for Developers. This fast-paced, one-day, instructor-led workshop takes you on a deep dive into scripting, programming, and coding in Hadoop. MetaScale provides a vendor-agnostic view when training on Pig, Hive, HBase, MapReduce, and Flume.

By the end of the Workshop you will

  • Have a good understanding of MapReduce, HBase, Hive, Pig, and Flume in a Hadoop Ecosystem
  • Complete a half-day of hands-on training in a Hadoop ecosystem
  • Build a working Hadoop script from start to finish, using raw files in a VM cluster
  • Write a query using Hive DDL
  • Write Pig scripts and convert them into MapReduce jobs to aggregate data and produce outputs (see the MapReduce sketch after this list)
  • View and analyze new outputs in both Hive and HBase through a UI analytics tool
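
To illustrate the kind of MapReduce aggregation job the Pig exercises compile down to, here is a minimal word-count sketch using the standard Hadoop MapReduce Java API. The input and output paths are placeholder assumptions; the course labs use their own datasets.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal word-count aggregation: the mapper emits (word, 1) pairs and the
// reducer sums the counts per word. The HDFS paths below are assumptions.
public class WordCount {

    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/user/student/input"));
        FileOutputFormat.setOutputPath(job, new Path("/user/student/output"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

A job like this is packaged into a jar and submitted to the cluster with the standard hadoop jar command; a one-line Pig GROUP/COUNT script expresses the same aggregation.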

All students receive a USB drive with access to course materials including instructor deck, lab deck, Hadoop cheat sheet and a copy of the VM software. If you are ready to take a practical look at Hadoop, we strongly encourage you to attend this comprehensive one-day course and begin your Big Data journey.

Audience

  • IT Professionals: System Engineers, Developers, Statisticians, Data Analysts, Data Scientists, Computer Programmers

Prerequisites

  • Programming experience required (Java or C++ recommended)
  • Basic knowledge of Linux/Unix

Registration Information

Email:
Phone: 1.800.234.8769

Course Itinerary
Principles of Data Mining in Hadoop

Course Description

This high-octane, one-day, instructor-led workshop teaches you the principles needed to drill down into your data through data mining. Explore different techniques that will allow you to gain insight into your data and turn that insight into actionable decisions. MetaScale provides a vendor-agnostic view when training on Hadoop technologies.

By the end of the Workshop you will

  • Understand how Big Data and Analytics work together
  • Understand the concepts of machine learning (Apache Mahout)
  • Learn multiple techniques associated with data mining in Hadoop, such as grouping/clustering, recommendation systems, matrix association, predictive modeling using decision trees and more (see the clustering sketch after this list)
  • Build hands-on use cases in a VM cluster
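
As a taste of the clustering technique covered in the course, here is a minimal, single-machine k-means sketch in plain Java. It illustrates the assign-and-update loop only; it is not Mahout's API, which runs the same idea at Hadoop scale. The sample points and the choice of two clusters are made up for the demo.

```java
import java.util.Arrays;

// Single-machine k-means on 2-D points, purely to illustrate the clustering idea.
public class KMeansSketch {
    public static void main(String[] args) {
        double[][] points = {
            {1.0, 1.1}, {0.9, 1.0}, {1.2, 0.8},   // group roughly around (1, 1)
            {8.0, 8.2}, {7.9, 8.1}, {8.3, 7.8}    // group roughly around (8, 8)
        };
        int k = 2;

        // Deterministic initialization for the demo: first and last points.
        double[][] centroids = { points[0].clone(), points[points.length - 1].clone() };
        int[] assignment = new int[points.length];

        for (int iter = 0; iter < 10; iter++) {
            // Assignment step: attach each point to its nearest centroid.
            for (int i = 0; i < points.length; i++) {
                double best = Double.MAX_VALUE;
                for (int c = 0; c < k; c++) {
                    double dx = points[i][0] - centroids[c][0];
                    double dy = points[i][1] - centroids[c][1];
                    double dist = dx * dx + dy * dy;
                    if (dist < best) { best = dist; assignment[i] = c; }
                }
            }
            // Update step: move each centroid to the mean of its assigned points.
            double[][] sums = new double[k][2];
            int[] counts = new int[k];
            for (int i = 0; i < points.length; i++) {
                sums[assignment[i]][0] += points[i][0];
                sums[assignment[i]][1] += points[i][1];
                counts[assignment[i]]++;
            }
            for (int c = 0; c < k; c++) {
                if (counts[c] > 0) {
                    centroids[c][0] = sums[c][0] / counts[c];
                    centroids[c][1] = sums[c][1] / counts[c];
                }
            }
        }
        System.out.println("Centroids: " + Arrays.deepToString(centroids));
        System.out.println("Assignments: " + Arrays.toString(assignment));
    }
}
```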

Who Should Attend

  • System Engineers, Developers, Statisticians, Data Analysts, Data Scientists, Computer Programmers
  • Anyone looking to apply data science techniques to large datasets

Prerequisites

  • Students must have basic computer and programming skills
  • Basic knowledge of statistics (recommended, not required)

Additional Information

Email:
Phone: 1.800.234.8769

Course Itinerary
Exploring NoSQL in Hadoop – Intro to MongoDB, Cassandra, HBase

Course Description

This two-day course takes you on a journey into using NoSQL (also known as “Not Only SQL”). Participants will gain an in-depth understanding of the innate capabilities of MongoDB, Cassandra and HBase, and learn to choose the right technology for the right job. Participants will set up a cluster in each of the three databases and perform hands-on tasks.

By the end of the Workshop you will

  • Understand how big data is being used and how Hadoop technology works
  • Develop a detailed understanding of the inner workings of MongoDB, Cassandra and HBase
  • Understand when and how to apply the right NoSQL tool (MongoDB, Cassandra or HBase) for the job
  • Set up a cluster in MongoDB, Cassandra and HBase, and apply use cases that draw on the innate power of each tool (see the HBase client sketch after this list)
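
As a preview of the hands-on work, here is a minimal sketch of writing and reading one row with the standard HBase Java client. The 'users' table, the 'profile' column family and the running cluster it relies on are all hypothetical for this example; the MongoDB and Cassandra exercises follow a similar pattern in their own client APIs.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Minimal HBase client sketch: write one row, then read it back.
// Assumes a reachable HBase cluster and an existing 'users' table
// with a 'profile' column family (both hypothetical).
public class HBaseBasics {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("users"))) {

            // Write: row key "user1", column profile:name = "Ada"
            Put put = new Put(Bytes.toBytes("user1"));
            put.addColumn(Bytes.toBytes("profile"), Bytes.toBytes("name"), Bytes.toBytes("Ada"));
            table.put(put);

            // Read the row back and print the stored name.
            Result result = table.get(new Get(Bytes.toBytes("user1")));
            byte[] name = result.getValue(Bytes.toBytes("profile"), Bytes.toBytes("name"));
            System.out.println("profile:name = " + Bytes.toString(name));
        }
    }
}
```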

Who Should Attend

  • Solution Architects, System Engineers, Developers, Database Managers, Data Analysts, Computer Programmers
  • IT professionals with SQL background

Prerequisites

  • Students must have basic computer and programming skills
  • Basic knowledge of databases (recommended, not required)

Additional Information

Email:
Phone: 1.800.234.8769

Begin Your Big Data Journey

Download the whitepaper “Making a Case for Hadoop in your Organization” and get the checklist for getting started.

Get Started

Modern Enterprise Data Hub

Establish your Enterprise Data Hub (EDH) with Hadoop to ensure your data is available, accurate, complete and secure.

See How

Your One-Stop Big Data Helpline

Do you have all the answers you need to decide whether Hadoop is the right solution for your data size, sources and analytics needs?

Request a Call from Sales