Apache Sqoop Cookbook: Unlocking Hadoop for Your Relational Database
Kathleen Ting; Jarek Jarcec Cecho
Sold by AHA-BUCH GmbH, Einbeck, Germany
AbeBooks Seller since August 14, 2006
New - Soft cover
Condition: New
Quantity: 1 available
Seller Inventory # 9781449364625
Integrating data from multiple sources is essential in the age of big data, but it can be a challenging and time-consuming task. This handy cookbook provides dozens of ready-to-use recipes for using Apache Sqoop, the command-line interface application that optimizes data transfers between relational databases and Hadoop.
Sqoop is both powerful and bewildering, but with this cookbook’s problem-solution-discussion format, you’ll quickly learn how to deploy and then apply Sqoop in your environment. The authors provide MySQL, Oracle, and PostgreSQL database examples on GitHub that you can easily adapt for SQL Server, Netezza, Teradata, or other relational systems.
This cookbook shows you how to:
- Transfer data from a single database table into your Hadoop ecosystem
- Keep table data and Hadoop in sync by importing data incrementally
- Import data from more than one database table
- Customize transferred data by calling various database functions
- Export generated, processed, or backed-up data from Hadoop to your database
- Run Sqoop within Oozie, Hadoop's specialized workflow scheduler
- Load data into Hadoop's data warehouse (Hive) or database (HBase)
- Handle installation, connection, and syntax issues common to specific database vendors
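For a flavor of the recipes the list above describes, here is a minimal sketch of the kinds of Sqoop commands the book covers. The JDBC URL, credentials, table names, and HDFS paths below are hypothetical placeholders, not examples taken from the book.

    # Import a single MySQL table into HDFS (connection details are made up).
    sqoop import \
      --connect jdbc:mysql://db.example.com/corp \
      --username sqoop_user --password-file /user/sqoop/.password \
      --table cities \
      --target-dir /etl/input/cities

    # Keep Hadoop in sync by importing only rows added since the last run.
    sqoop import \
      --connect jdbc:mysql://db.example.com/corp \
      --username sqoop_user --password-file /user/sqoop/.password \
      --table cities \
      --incremental append --check-column id --last-value 100

    # Export generated or processed data from HDFS back into the database.
    sqoop export \
      --connect jdbc:mysql://db.example.com/corp \
      --username sqoop_user --password-file /user/sqoop/.password \
      --table cities_export \
      --export-dir /etl/output/cities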
General Terms and Conditions and Customer Information / Privacy Policy
I. General Terms and Conditions
§ 1 Basic provisions
(1) The following terms and conditions apply to all contracts that you conclude with us as the provider (AHA-BUCH GmbH) via the Internet platforms AbeBooks and/or ZVAB. Unless otherwise agreed, we object to the inclusion of any terms and conditions of your own.
(2) A consumer within the meaning of the following regulations is any natural person who concludes...
More Information
We ship your order after we receive it:
- for articles in stock, within 24 hours at the latest;
- for articles with overnight supply, within 48 hours at the latest.
If we need to order an article from our supplier, our dispatch time depends on when we receive the article from them, but the article will be shipped the same day it arrives.
Our goal is to send ordered articles to our customers in the fastest, most efficient, and most secure way.