Call : (+91) 968636 4243
Mail :

Informatica Data Engineering Integration - Administration

( Duration: 3 Days )

In the Informatica Data Engineering Integration - Administration training course, you will learn to set up a live DEI environment by performing administrative tasks such as Hadoop integration, Databricks integration, security setup, monitoring, and performance tuning. You will also learn to integrate the Informatica domain with the Hadoop and Databricks ecosystems, leveraging Hadoop's fast distributed processing and the Databricks cloud analytics platform to process huge data sets.

By attending Informatica Data Engineering Integration - Administration workshop, delegates will learn to:

  • Prepare and list the steps for installation and configuration of DEI
  • List the steps to enable Kerberos on the Domain
  • List the steps to upgrade DEI from previous versions
  • Create Cluster Configuration Object for Hadoop integration
  • Set up Informatica Security that includes different Authentication and Authorization mechanisms
  • Tune the performance of the system
  • Monitor, view and troubleshoot DEI logs
  • Monitor using REST APIs and log aggregator

The Informatica Data Engineering Integration - Administration class is ideal for:

  • Administrators



Introduction to Data Engineering Integration Administration

  • Data Engineering and the role of DEI in the big data ecosystem
  • DEI Components
  • DEI architecture
  • Roles and responsibilities of Informatica DEI Administrator
  • DEI engines: Blaze, Spark, and Databricks
  • DEI features

Data Engineering Integration Installation and Configuration

  • Basic setup for installation
  • Plan the Installation Components
  • Steps to install the DEI product
  • Steps to create and configure Application Services
  • Steps to install the Developer client
  • Steps to uninstall Informatica Server

Enable Kerberos Authentication on the Domain

  • Kerberos concepts
  • Kerberos protocol authentication steps
  • Single and Cross realm Kerberos authentication
  • Prepare to enable Kerberos Authentication on the Domain
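Single- and cross-realm authentication as covered above typically rest on a krb5.conf. The following is a minimal sketch; all realm, KDC, and host names are illustrative placeholders, not values from the course material:

```ini
# /etc/krb5.conf -- minimal sketch; realm and KDC names are placeholders
[libdefaults]
    default_realm = INFA.EXAMPLE.COM
    forwardable = true

[realms]
    # Realm hosting the Informatica domain
    INFA.EXAMPLE.COM = {
        kdc = kdc.infa.example.com
        admin_server = kdc.infa.example.com
    }
    # Second realm (e.g. the Hadoop cluster) for cross-realm authentication
    HADOOP.EXAMPLE.COM = {
        kdc = kdc.hadoop.example.com
    }

[domain_realm]
    .infa.example.com = INFA.EXAMPLE.COM
    .hadoop.example.com = HADOOP.EXAMPLE.COM
```

Note that cross-realm authentication additionally requires matching cross-realm krbtgt principals (e.g. krbtgt/HADOOP.EXAMPLE.COM@INFA.EXAMPLE.COM) to be created in both KDCs.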

Upgrade Data Engineering Integration

  • Informatica upgrade overview
  • Informatica upgrade support
  • Steps involved in the upgrade process
  • Steps to upgrade DEI
  • Steps to upgrade DEI Developer client

Hadoop Integration

  • Cluster Integration overview
  • Data Engineering Integration Component Architecture
  • Prerequisites for Hadoop integration
  • Metadata Access Service (MAS)
  • HDP integration tasks
  • Create a Cluster Configuration
  • Integration with Hadoop

Data Engineering Integration Security - Authentication

  • DEI security
  • Security aspects
  • Authentication overview
  • Operating System profiles
  • Kerberos authentication
  • Apache Knox Gateway authentication

Data Engineering Integration Security - Authorization

  • Authorization overview
  • HDFS permissions
  • Configure access to an SSL-Enabled Cluster
  • Security using Apache Ranger authorization
  • Fine Grained authorization

Data Engineering Recovery

  • DIS processing overview
  • DIS Queuing
  • Execution Pools
  • Data Engineering recovery
  • Monitor recovered jobs
  • Tune for Data Engineering Job Processing

DEI Performance Tuning

  • DEI Deployment types
  • Sizing recommendations
  • Hadoop cluster Hardware tuning
  • Tune Blaze performance
  • Tune Spark performance
  • Tune Databricks performance
  • Tune Data Integration Service
  • Tune Model Repository Service
  • Tune Sqoop performance
  • infacmd autotune command
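The tuning topics above culminate in the infacmd autotune command, which applies Informatica's recommended settings for a chosen deployment type. A hedged sketch of an invocation follows; the domain, user, and size values are placeholders, and the exact option names should be verified against the command reference for your DEI version:

```shell
# Sketch only: autotune services for a given deployment type.
# Domain name, user, and size value below are illustrative placeholders.
infacmd.sh autotune Autotune \
    -DomainName INFA_DOM \
    -UserName Administrator \
    -Password '<password>' \
    -Size Standard        # deployment type, per the sizing recommendations
```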

Monitoring Mappings

  • View Data Integration Service generated logs
  • View logs for the Blaze or Spark engine
  • Monitor Spark engine
  • View Spark logs
  • Log Aggregation
  • REST Operations Hub overview
  • Monitoring Metadata document
  • Display Nodes Used in Mapping
  • Troubleshooting tips for common admin problems
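The monitoring topics above include pulling job statistics over REST. The following is a minimal Python sketch assuming a hypothetical Operations Hub endpoint and JSON shape; the URL path and field names are illustrative assumptions, not taken from the course material, and should be checked against your DEI version:

```python
# Sketch: summarize mapping-job statistics from a REST monitoring endpoint.
# Endpoint path and JSON field names are illustrative assumptions.
import json
from urllib.request import urlopen

def summarize_jobs(payload: dict) -> dict:
    """Count mapping jobs by state from a stats payload."""
    counts = {}
    for job in payload.get("jobs", []):
        state = job.get("state", "UNKNOWN")
        counts[state] = counts.get(state, 0) + 1
    return counts

def fetch_job_stats(host: str, port: int = 8085) -> dict:
    # Hypothetical URL; actual REST Operations Hub paths vary by version.
    url = f"http://{host}:{port}/RestOperationsHub/services/v1/MappingService/MappingStats"
    with urlopen(url) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Offline demonstration with a sample payload.
    sample = {"jobs": [{"state": "COMPLETED"}, {"state": "FAILED"},
                       {"state": "COMPLETED"}]}
    print(summarize_jobs(sample))  # counts of jobs per state
```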

Databricks Integration

  • Databricks Integration overview
  • Components of the Informatica and Databricks environments
  • Run-time Process on the Databricks Spark Engine
  • Databricks Integration Task Flow
  • Prerequisites for Databricks integration
  • Steps to integrate Databricks and run a mapping within the Databricks environment

Configuring Data Engineering Integration on Kubernetes

  • How Data Engineering Integration Works with Kubernetes
  • Kubernetes Architecture
  • Prerequisites to install Kubernetes
  • Create a Kubernetes cluster
  • Configure Data Engineering Integration on Kubernetes

Encarta Labs Advantage

  • One-stop corporate training solution provider for over 6,000 courses on a variety of subjects
  • All courses are delivered by industry veterans
  • Get jumpstarted from newbie to production-ready in a matter of days
  • Trained more than 50,000 corporate executives across the globe
  • All our trainings are conducted in workshop mode with a focus on hands-on sessions

View our other course offerings by visiting

Contact us for delivering this course as a public/open-house workshop/online training for a group of 10+ candidates.