Pentaho Data Integration Cookbook Second Edition

Alex Meadows, María Carina Roldán

Language: English

Pages: 462

ISBN: 1783280670

Format: PDF / Kindle (mobi) / ePub

The premier open source ETL tool is at your command with this recipe-packed cookbook. Learn to use data sources in Kettle, avoid pitfalls, and dig out the advanced features of Pentaho Data Integration the easy way.


  • Integrate Kettle with other components of the Pentaho Business Intelligence Suite to build and publish Mondrian schemas, create reports, and populate dashboards
  • This book contains an organized sequence of recipes packed with screenshots, tables, and tips so you can complete the tasks as efficiently as possible
  • Manipulate your data by exploring, transforming, validating, integrating, and performing data analysis

In Detail

Pentaho Data Integration is the premier open source ETL tool, providing easy, fast, and effective ways to move and transform data. While PDI is relatively easy to pick up, it can take time to learn the best practices so you can design your transformations to process data faster and more efficiently. If you are looking for clear and practical recipes that will advance your skills in Kettle, then this is the book for you.

Pentaho Data Integration Cookbook Second Edition explains the Kettle features in detail and provides easy-to-follow recipes on file management and databases that can throw a curveball to even the most experienced developers.

Pentaho Data Integration Cookbook Second Edition provides updates to the material covered in the first edition as well as new recipes that show you how to use some of the key features of PDI that have been released since the publication of the first edition. You will learn how to work with various data sources: relational and NoSQL databases, flat files, XML files, and more. The book also covers best practices that you can take advantage of immediately within your own solutions, such as building reusable code, ensuring data quality, and using plugins that can add even more functionality.

Pentaho Data Integration Cookbook Second Edition provides recipes that cover the common pitfalls even seasoned developers can find themselves facing. You will also learn how to use various data sources in Kettle, as well as its advanced features.

What you will learn from this book

  • Configure Kettle to connect to relational and NoSQL databases and web applications like Salesforce, explore them, and perform CRUD operations
  • Utilize plugins to get even more functionality into your Kettle jobs
  • Embed Java code in your transformations to gain performance and flexibility
  • Execute and reuse transformations and jobs in different ways
  • Integrate Kettle with Pentaho Reporting, Pentaho Dashboards, Community Data Access, and the Pentaho BI Platform
  • Interface Kettle with cloud-based applications
  • Learn how to control and manipulate data flows
  • Utilize Kettle to create datasets for analytics


Pentaho Data Integration Cookbook Second Edition is written in a cookbook format, presenting examples in the style of recipes. This allows you to go directly to your topic of interest, or follow topics throughout a chapter to gain thorough, in-depth knowledge.

Who this book is written for

Pentaho Data Integration Cookbook Second Edition is designed for developers who are familiar with the basics of Kettle but who wish to move up to the next level. It is also aimed at advanced users who want to learn how to use the new features of PDI, as well as best practices for working with Kettle.



Download sample