Talend Big Data Certified Developer Exam Training

Duration: Hours


    Training Mode: Online

    Description

    Talend Big Data Certified Developer Overview:

    The Talend Big Data Certified Developer Exam Training program is designed for individuals aiming to validate their expertise in big data technologies, particularly within Talend’s data integration and management solutions. It encompasses essential topics such as data integration techniques, big data technologies, data quality, performance optimization, and best practices. Completion of the training and passing the certification exam demonstrates proficiency in leveraging Talend’s platform for real-world data challenges, thereby enhancing career prospects in big data analytics.

    Moreover, this certification exam covers the Talend Big Data Basics, Talend Big Data Advanced – Spark Batch, and Talend Big Data Advanced – Spark Streaming learning plans. The emphasis lies in understanding the Talend and Big Data architectures, Hadoop ecosystems, Spark, Spark on YARN, Kafka, and Kerberos.

    Exam details:

    Exam content is updated periodically; the number and difficulty of questions may change, and the passing score is adjusted to maintain a consistent standard.

    1. Duration: 90 minutes
    2. Number of questions: 55
    3. Passing score: 70%

    Recommended experience:

    To be eligible for the exam, candidates should have at least six months of experience using Talend products. Additionally, they should possess general knowledge of Hadoop (HDFS, Hive, HBase, YARN), Spark, Kafka, Talend Big Data, and cloud storage architectures, as well as Spark Universal.

    Furthermore, candidates should have hands-on experience with Talend Big Data solutions and Talend Studio. This includes proficiency in metadata creation, configuration, and troubleshooting.

    Preparation:

    To prepare for the Talend Big Data Certified Developer certification exam, Talend recommends:

    1. Taking the Big Data Basics, Big Data – Spark Batch, and Big Data – Spark Streaming learning plans.
    2. Studying the training material in the Talend Big Data Certified Developer preparation training module.
    3. Reading the product documentation and Community Knowledge Base articles.

    Badge:

    After passing this certification exam, you are awarded the Talend Big Data Certified Developer badge. To learn more about the criteria for earning this badge, refer to the Talend Academy Badging program page.

    Certification exam topics:

    Defining Big Data

    1. Defining Big Data
    2. Describing the Hadoop ecosystem
    3. Differentiating between Talend architecture and Big Data architecture
    4. Describing cloud storage architecture in a Big Data context

    Overseeing metadata in a Big Data environment

    1. Supervising Talend metadata stored in the repository
    2. Outlining the main elements of Hadoop cluster metadata
    3. Constructing Hadoop cluster metadata
    4. Establishing metadata connections to HBase, HDFS, YARN, and Hive (see the sketch after this list)
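
    In Talend Studio these connections are defined through repository metadata wizards rather than code. Purely to illustrate the same connection details, here is a minimal PySpark sketch; it is not Talend-generated code, and the NameNode, ResourceManager, and metastore host names are hypothetical.

        from pyspark.sql import SparkSession

        # Hypothetical endpoints; in Talend Studio these values live in the
        # repository metadata (Hadoop cluster, HDFS, and Hive connections).
        spark = (
            SparkSession.builder
            .appName("cluster_metadata_sketch")
            # HDFS NameNode, as in the HDFS connection metadata
            .config("spark.hadoop.fs.defaultFS", "hdfs://namenode.example.com:8020")
            # YARN ResourceManager, as in the cluster metadata
            .config("spark.hadoop.yarn.resourcemanager.address", "rm.example.com:8032")
            # Hive metastore, as in the Hive connection metadata
            .config("hive.metastore.uris", "thrift://metastore.example.com:9083")
            .enableHiveSupport()
            .getOrCreate()
        )

        print(spark.sparkContext.getConf().get("spark.hadoop.fs.defaultFS"))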

    Managing data using Hive

    1. Importing data into a Hive table
    2. Processing data stored in a Hive table
    3. Analyzing Hive tables in the Profiling perspective
    4. Supervising Hive tables with the Hive Warehouse Connector on CDP Public Cloud (see the sketch after this list)
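
    Talend's Hive components ultimately issue HiveQL. As a rough, non-Talend illustration of "import then process", the following PySpark sketch loads a file into a Hive table and aggregates it, assuming a Hive-enabled cluster and a hypothetical orders.csv file and sales database.

        from pyspark.sql import SparkSession

        # Assumes a Hive-enabled cluster; database, table, and file names are
        # hypothetical.
        spark = SparkSession.builder.enableHiveSupport().getOrCreate()
        spark.sql("CREATE DATABASE IF NOT EXISTS sales")

        # Import: load a delimited file and persist it as a Hive table
        orders = (
            spark.read
            .option("header", True)
            .option("inferSchema", True)
            .csv("hdfs:///data/orders.csv")
        )
        orders.write.mode("overwrite").saveAsTable("sales.orders")

        # Process: aggregate the Hive table with HiveQL
        revenue = spark.sql("""
            SELECT customer_id, SUM(amount) AS total_amount
            FROM sales.orders
            GROUP BY customer_id
        """)
        revenue.show()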

    Managing Spark in a Big Data Environment

    1. Illustrating the principal usage of Spark
    2. Administering Spark Universal, including modes, environments, and distributions
    3. Configuring Spark Batch and Streaming Jobs
    4. Resolving issues with Spark Jobs
    5. Optimizing Spark Jobs at runtime
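
    Big Data Batch Jobs designed in Talend Studio run as Spark applications, so the topics above assume familiarity with the underlying Spark settings. As a minimal hand-written sketch (not Studio output), here is a small batch job with a few illustrative tuning options such as executor sizing and shuffle partitions; paths and values are placeholders.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        # Minimal Spark Batch job; resource values and paths are illustrative only.
        spark = (
            SparkSession.builder
            .appName("batch_job_sketch")
            .config("spark.executor.memory", "4g")          # heap per executor
            .config("spark.executor.cores", "2")            # cores per executor
            .config("spark.sql.shuffle.partitions", "200")  # shuffle parallelism
            .getOrCreate()
        )

        # Read raw web logs (hypothetical path) and aggregate hits per day
        logs = spark.read.json("hdfs:///data/weblogs/")
        daily = (
            logs.withColumn("day", F.to_date("timestamp"))
                .groupBy("day")
                .count()
        )

        # Cache only if the result is reused downstream; persisting judiciously
        # is one of the runtime optimizations referred to above
        daily.cache()
        daily.write.mode("overwrite").parquet("hdfs:///out/daily_hits/")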

    Streaming with Talend Big Data

    1. Illustrating the principal usage of Spark
    2. Administering Spark Universal, including modes, environments, and distributions
    3. Configuring Spark Batch and Streaming Jobs (a streaming sketch follows this list)
    4. Resolving issues with Spark Jobs
    5. Optimizing Spark Jobs at runtime
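
    On the streaming side, Talend Big Data Streaming Jobs run on Spark Streaming and are typically fed by Kafka. The sketch below is a generic Spark Structured Streaming example, not Talend output, that counts events per one-minute window from a hypothetical broker and topic.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        # Requires the spark-sql-kafka connector on the classpath; broker and
        # topic names below are hypothetical.
        spark = SparkSession.builder.appName("streaming_sketch").getOrCreate()

        clicks = (
            spark.readStream
            .format("kafka")
            .option("kafka.bootstrap.servers", "broker1.example.com:9092")
            .option("subscribe", "clickstream")
            .load()
        )

        # Kafka delivers raw bytes; decode the value and count events per
        # one-minute window
        counts = (
            clicks.selectExpr("CAST(value AS STRING) AS event", "timestamp")
                  .groupBy(F.window("timestamp", "1 minute"))
                  .count()
        )

        query = (
            counts.writeStream
            .outputMode("complete")
            .format("console")
            .start()
        )
        query.awaitTermination()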

    Configuring a Big Data environment

    1. Administering Kerberos and security (see the sketch after this list)
    2. Managing Apache Knox security with Cloudera Data Platform (CDP)
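
    Kerberos is configured mostly at the cluster and Studio level rather than in Job code, but a kerberized Spark submission generally carries a principal and keytab. A minimal sketch with placeholder values, assuming Spark 3.x property names:

        from pyspark.sql import SparkSession

        # Placeholder principal and keytab; on Spark 2.x the equivalent keys
        # were spark.yarn.principal and spark.yarn.keytab.
        spark = (
            SparkSession.builder
            .appName("kerberized_job_sketch")
            .master("yarn")
            .config("spark.kerberos.principal", "talend_user@EXAMPLE.COM")
            .config("spark.kerberos.keytab", "/etc/security/keytabs/talend_user.keytab")
            .config("spark.hadoop.hadoop.security.authentication", "kerberos")
            .getOrCreate()
        )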

    Overseeing data on Hadoop and cloud

    1. Describing the principal usage of Hadoop (HDFS, HBase, and Hive) and cloud technologies
    2. Transferring and retrieving big data files to/from HDFS
    3. Transferring and retrieving big data files to/from the cloud (see the sketch after this list)
    4. Transferring data to an HBase table
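
    To give a feel for the data-movement topics above (again, not the Talend components themselves), here is a short PySpark sketch that reads a dataset from HDFS, copies it to cloud object storage over the s3a connector, and reads it back; paths and bucket names are hypothetical.

        from pyspark.sql import SparkSession

        # Assumes the s3a connector (hadoop-aws) and credentials are configured;
        # bucket and path names are hypothetical.
        spark = SparkSession.builder.appName("hdfs_to_cloud_sketch").getOrCreate()

        # Retrieve from HDFS
        events = spark.read.parquet("hdfs:///data/events/")

        # Transfer to cloud object storage (ADLS or GCS work the same way with
        # their own URI schemes)
        events.write.mode("overwrite").parquet("s3a://example-bucket/events/")

        # Read the cloud copy back to verify the round trip
        print(spark.read.parquet("s3a://example-bucket/events/").count())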

    Administering Big Data Jobs

    1. Distinguishing between Big Data Batch and Big Data Streaming Jobs
    2. Migrating and transforming Jobs in a Big Data environment

    Overseeing a Spark cluster

    1. Defining Spark on YARN
    2. Detailing the principal usage of YARN
    3. Administering YARN, including client and cluster
    4. Monitoring Big Data Job executions
    5. Configuring Studio to allocate resource requests to YARN (see the sketch after this list)
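
    The resource requests that Studio sends to YARN correspond to standard Spark-on-YARN properties. A minimal sketch with illustrative values for deploy mode, queue, and executor sizing, assuming a client machine with the YARN configuration available:

        from pyspark.sql import SparkSession

        # Queue name and resource sizes are placeholders.
        spark = (
            SparkSession.builder
            .appName("spark_on_yarn_sketch")
            .master("yarn")                               # submit to the YARN ResourceManager
            .config("spark.submit.deployMode", "client")  # client vs cluster mode
            .config("spark.yarn.queue", "bigdata")        # scheduler queue to target
            .config("spark.executor.instances", "4")      # executors requested from YARN
            .config("spark.executor.memory", "4g")
            .config("spark.executor.cores", "2")
            .getOrCreate()
        )

        # The YARN ResourceManager UI (or `yarn application -list`) shows the
        # resulting application and its container allocation
        print(spark.sparkContext.applicationId)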

    TABLE OF CONTENTS

    1 Overview
        1.1 Setup Talend Big Data Sandbox
            1.1.1 Pre-requisites to Running Sandbox
            1.1.2 Setup and Configuration of Sandbox
    2 Talend License and Services Status
        2.1 Talend License Setup
        2.2 Hortonworks Services Status
    3 Scenario: Clickstream Insights
        3.1 Overview
        3.2 Clickstream Dataset
        3.3 Using Talend Studio
            3.3.1 Talend HDFS Puts
            3.3.2 Talend MapReduce Review
            3.3.3 Talend to Google Charts and Hive
    4 Scenario: Twitter Sentiment Insights
        4.1 Twitter Sentiment Analysis Overview
        4.2 Twitter Data
        4.3 Talend Processes
            4.3.1 Retrieve the Data
            4.3.2 Process and Aggregate Results
            4.3.3 Analysis and Sentiment
    5 Scenario: Apache Weblog Insights
        5.1 Apache Weblog Overview
        5.2 Apache Weblog Data
        5.3 Scenario: Talend Processing
            5.3.1 Talend Filter and Load Data
            5.3.2 Talend PIG Scripts to Process
            5.3.3 Talend MapReduce to Process
    6 Scenario: ETL Off-Loading
        6.1 Overview
        6.2 Data
        6.3 Talend Process
            6.3.1 Single-Click Execution
            6.3.2 Step-by-Step Execution
            6.3.3 Extended Demo Functionality
    7 Demo: NoSQL Databases
        7.1 Hadoop Core – Hive and HBase
            7.1.1 Hive ELT
            7.1.2 HBase
        7.2 Cassandra
        7.3 MongoDB
    8 Conclusion
    9 Next Steps

    For additional information regarding this training, please visit here.

    Contact Locus IT support team for further training details.
