Call : (+91) 968636 4243
Mail : info@EncartaLabs.com
EncartaLabs

Informatica Data Engineering Integration - Administration

( Duration: 3 Days )

In the Informatica Data Engineering Integration - Administration training course, you will learn to set up a live DEI environment by performing administrative tasks such as Hadoop integration, Databricks integration, security configuration, monitoring, and performance tuning. You will also learn to integrate the Informatica domain with the Hadoop and Databricks ecosystems, leveraging Hadoop's fast distributed processing and the Databricks cloud analytics platform to process huge data sets.

By attending Informatica Data Engineering Integration - Administration workshop, delegates will learn to:

  • Prepare and list the steps for installation and configuration of DEI
  • List the steps to enable Kerberos on the Domain
  • List the steps to upgrade DEI from previous versions
  • Create Cluster Configuration Object for Hadoop integration
  • Set up Informatica Security that includes different Authentication and Authorization mechanisms
  • Tune the performance of the system
  • Monitor, view and troubleshoot DEI logs
  • Monitor using REST APIs and log aggregator

The Informatica Data Engineering Integration - Administration class is ideal for:

  • Administrators

COURSE AGENDA

1. Introduction to Data Engineering Integration Administration

  • Data Engineering and the role of DEI in the big data ecosystem
  • DEI Components
  • DEI architecture
  • Roles and responsibilities of Informatica DEI Administrator
  • DEI engines: Blaze, Spark, and Databricks
  • DEI features

2. Data Engineering Integration Installation and Configuration

  • Basic setup for installation
  • Plan the Installation Components
  • Steps to install the DEI product
  • Steps to create and configure Application Services
  • Steps to install the Developer client
  • Steps to uninstall Informatica Server

3. Enable Kerberos Authentication on the Domain

  • Kerberos concepts
  • Kerberos protocol authentication steps
  • Single and Cross realm Kerberos authentication
  • Prepare to enable Kerberos Authentication on the Domain

4. Upgrade Data Engineering Integration

  • Informatica upgrade overview
  • Informatica upgrade support
  • Steps involved in the upgrade process
  • Steps to upgrade DEI
  • Steps to upgrade DEI Developer client

5. Hadoop Integration

  • Cluster Integration overview
  • Data Engineering Integration Component Architecture
  • Prerequisites for Hadoop integration
  • Metadata Access Service (MAS)
  • HDP integration tasks
  • Create a Cluster Configuration (see the sketch after this list)
  • Integration with Hadoop
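
For illustration, here is a minimal sketch of scripting the cluster configuration import with infacmd, wrapped in Python. The plugin and command names (cluster createConfiguration) and the option flags shown are assumptions to verify against the infacmd help for your DEI version; the domain name, credentials, and file path are placeholders.

    import subprocess

    # Assumed install location of the infacmd shell wrapper.
    INFACMD = "/opt/informatica/isp/bin/infacmd.sh"

    # Create a cluster configuration object from the *-site.xml files exported
    # from the Hadoop cluster. All values below are placeholders.
    subprocess.run([
        INFACMD, "cluster", "createConfiguration",
        "-dn", "Domain_DEI",                   # Informatica domain name
        "-un", "Administrator",                # domain administrator user
        "-pd", "********",                     # password
        "-cn", "cco_hadoop",                   # name of the new cluster configuration object
        "-dt", "HORTONWORKS",                  # distribution type of the cluster
        "-rfp", "/tmp/cluster_site_xmls.zip",  # archive of the cluster's *-site.xml files
    ], check=True)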

6. Data Engineering Integration Security - Authentication

  • DEI security
  • Security aspects
  • Authentication overview
  • Operating System profiles
  • Kerberos authentication
  • Apache Knox Gateway authentication

7. Data Engineering Integration Security - Authorization

  • Authorization overview
  • HDFS permissions
  • Configure access to an SSL-Enabled Cluster
  • Security using Apache Ranger authorization
  • Fine Grained authorization

8. Data Engineering Recovery

  • Data Integration Service (DIS) processing overview
  • DIS Queuing
  • Execution Pools
  • Data Engineering recovery
  • Monitor recovered jobs
  • Tune for Data Engineering Job Processing

9. DEI Performance Tuning

  • DEI Deployment types
  • Sizing recommendations
  • Hadoop cluster Hardware tuning
  • Tune Blaze performance
  • Tune Spark performance
  • Tune Databricks performance
  • Tune Data Integration Service
  • Tune Model Repository Service
  • Tune Sqoop performance
  • infacmd autotune command (see the sketch below)
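
As a sketch of the autotune item above, the command can be scripted to size engine and service settings for a chosen deployment type. The exact plugin, command, and flag names below are assumptions; confirm them with the infacmd autotune help for your DEI release.

    import subprocess

    # Assumed install location of the infacmd shell wrapper.
    INFACMD = "/opt/informatica/isp/bin/infacmd.sh"

    # Tune domain services and run-time engines for a "Standard" deployment size.
    # Domain name, credentials, and size value are placeholders.
    subprocess.run([
        INFACMD, "autotune", "Autotune",
        "-dn", "Domain_DEI",       # Informatica domain name
        "-un", "Administrator",    # domain administrator user
        "-pd", "********",         # password
        "-Size", "Standard",       # deployment size, e.g. Sandbox, Basic, Standard, Advanced
    ], check=True)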

10. Monitoring Mappings

  • View Data Integration Service generated logs
  • View logs for the Blaze or Spark engine
  • Monitor Spark engine
  • View Spark logs
  • Log Aggregation
  • REST Operations Hub overview
  • Monitoring Metadata document
  • REST APIs (see the sample call below)
  • Display Nodes Used in Mapping
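
To make the REST monitoring items above concrete, here is a minimal sketch that pulls mapping execution statistics from the REST Operations Hub with Python's requests library. The host, port, resource path, query parameter, and response key are illustrative assumptions; take the real resource URLs from the REST Operations Hub documentation for your release.

    import requests

    # Assumed base URL of the REST Operations Hub service in the domain.
    BASE_URL = "https://dei-gateway.example.com:8085/restopshub/services/v1"

    # Assumed resource for per-step statistics of a mapping run.
    response = requests.get(
        f"{BASE_URL}/MappingService/MappingExecutionSteps",
        auth=("Administrator", "********"),       # domain credentials (placeholders)
        params={"jobId": "JOB_ID_FROM_MONITOR"},  # filter parameter name is assumed
        verify="/path/to/domain_truststore.pem",  # trust store for the service's TLS certificate
        timeout=30,
    )
    response.raise_for_status()

    # Print each execution step; the response key below is assumed.
    for step in response.json().get("mappingExecutionSteps", []):
        print(step)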

11. Troubleshooting

  • Troubleshooting tips for common administration problems

12. Databricks Integration

  • Databricks Integration overview
  • Informatica and Databricks environment components
  • Run-time Process on the Databricks Spark Engine
  • Databricks Integration Task Flow
  • Prerequisites for Databricks integration
  • Steps to integrate Databricks and run a mapping within the Databricks environment

13. Configuring Data Engineering Integration on Kubernetes

  • How Data Engineering Integration Works with Kubernetes
  • Kubernetes Architecture
  • Prerequisites to install Kubernetes
  • Create a Kubernetes cluster (see the sketch after this list)
  • Configure Data Engineering Integration on Kubernetes
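
As a small illustration of the cluster-creation prerequisite above, the sketch below uses the official kubernetes Python client to confirm that the cluster named in your kubeconfig is reachable and to list its worker nodes before DEI components are configured on it. Only standard client calls are used; node names and contexts come from your own environment.

    from kubernetes import client, config

    # Load credentials and the current context from the default kubeconfig.
    config.load_kube_config()

    v1 = client.CoreV1Api()

    # List the nodes the cluster exposes; DEI pods will be scheduled onto these.
    for node in v1.list_node().items:
        ready = any(
            cond.type == "Ready" and cond.status == "True"
            for cond in node.status.conditions
        )
        print(f"{node.metadata.name}: {'Ready' if ready else 'NotReady'}")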

Encarta Labs Advantage

  • One-stop corporate training solution provider for over 6,000 courses on a variety of subjects
  • All courses are delivered by Industry Veterans
  • Get jumpstarted from newbie to production-ready in a matter of a few days
  • Trained more than 50,000 Corporate executives across the Globe
  • All our trainings are conducted in workshop mode with a strong focus on hands-on sessions

View our other course offerings by visiting https://www.encartalabs.com/course-catalogue-all.php

Contact us to deliver this course as a public/open-house workshop or online training for a group of 10+ candidates.
