
Sqoop Import All Tables – A Complete Guide

Sqoop import-all-tables is a tool which imports a set of tables from an RDBMS to HDFS. There is more to this tool than that, so this article covers the whole concept of Sqoop import all tables. After an introduction, we will cover its syntax, and then walk through examples to understand it better.

So, let’s start the Sqoop Import All Tables tutorial.

Introduction to Sqoop Import All Tables

The import-all-tables tool imports a set of tables from an RDBMS to HDFS. In HDFS, data from each table is stored in a separate directory.
However, there are several conditions that must be met for the import-all-tables tool to be useful:

Each table must have a single-column primary key, or the --autoreset-to-one-mapper option must be used.
You must intend to import all columns of each table.
You must not intend to use a non-default splitting column, nor impose any conditions via a WHERE clause.

Syntax of Sqoop Import All Tables

$ sqoop import-all-tables (generic-args) (import-args)
$ sqoop-import-all-tables (generic-args) (import-args)
The import arguments can be entered in any order with respect to one another; however, the Hadoop generic arguments must precede any import arguments.
Table 1. Common arguments

Argument Description
--connect <jdbc-uri> Specify JDBC connect string
--connection-manager <class-name> Specify connection manager class to use
--driver <class-name> Manually specify JDBC driver class to use
--hadoop-mapred-home <dir> Override $HADOOP_MAPRED_HOME
--help Print usage instructions
--password-file Set path for a file containing the authentication password
-P Read password from console
--password <password> Set authentication password
--username <username> Set authentication username
--verbose Print more information while working
--connection-param-file <filename> Optional properties file that provides connection parameters
--relaxed-isolation Set connection transaction isolation to read uncommitted for the mappers
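As a sketch of how the common connection arguments combine, the invocation below reads the password from a protected file rather than the command line. The hostname, database name, and file paths are hypothetical, and running it requires a live MySQL database and a Hadoop cluster:

```shell
# Keep the password out of the shell history and process list
# (hypothetical path; restrict it so only the owner can read it).
echo -n "secret" > /home/someuser/.db-password
chmod 400 /home/someuser/.db-password

sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser \
    --password-file file:///home/someuser/.db-password \
    --verbose
```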

Table 2. Import control arguments

Argument Description
--as-avrodatafile Imports data to Avro Data Files
--as-sequencefile Imports data to SequenceFiles
--as-textfile Imports data as plain text (default)
--as-parquetfile Imports data to Parquet Files
--direct Use direct import fast path
--inline-lob-limit <n> Set the maximum size for an inline LOB
-m,--num-mappers <n> Use n map tasks to import in parallel
--warehouse-dir <dir> HDFS parent for table destination
-z,--compress Enable compression
--compression-codec <c> Use Hadoop codec (default gzip)
--exclude-tables <tables> Comma separated list of tables to exclude from import process
--autoreset-to-one-mapper Import should use one mapper if a table with no primary key is encountered
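The import control arguments can be combined in one invocation, for example to skip some tables, set the HDFS destination, and enable compression. This is a sketch only; the connect string and table names are hypothetical:

```shell
sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --warehouse-dir /user/someuser/corp \
    --exclude-tables AUDIT_LOG,TEMP_STAGING \
    --num-mappers 4 \
    --compress \
    --as-avrodatafile
```

Here each imported table lands under /user/someuser/corp/<TABLE_NAME>, the two excluded tables are skipped, and the Avro output is gzip-compressed by default.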

These arguments behave in the same manner as they do when used with the sqoop-import tool. However, for sqoop-import-all-tables the --table, --split-by, --columns, and --where arguments are invalid. The --exclude-tables argument, on the other hand, is only for sqoop-import-all-tables.
Table 3. Output line formatting arguments

Argument Description
--enclosed-by <char> Sets a required field enclosing character
--escaped-by <char> Sets the escape character
--fields-terminated-by <char> Sets the field separator character
--lines-terminated-by <char> Sets the end-of-line character
--mysql-delimiters Uses MySQL's default delimiter set: fields: , lines: \n escaped-by: \ optionally-enclosed-by: '
--optionally-enclosed-by <char> Sets a field enclosing character
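For instance, the output line formatting arguments can be used to import all tables as tab-separated text. A sketch, with a hypothetical connect string:

```shell
sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --as-textfile \
    --fields-terminated-by '\t' \
    --lines-terminated-by '\n' \
    --optionally-enclosed-by '"'
```

Choosing an explicit field separator matters when column values may contain the default comma delimiter.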

Table 4. Input parsing arguments

Argument Description
--input-enclosed-by <char> Sets a required field encloser
--input-escaped-by <char> Sets the input escape character
--input-fields-terminated-by <char> Sets the input field separator
--input-lines-terminated-by <char> Sets the input end-of-line character
--input-optionally-enclosed-by <char> Sets a field enclosing character

Table 5. Hive arguments

Argument Description
--hive-home <dir> Override $HIVE_HOME
--hive-import Import tables into Hive (uses Hive's default delimiters if none are set)
--hive-overwrite Overwrite existing data in the Hive table
--create-hive-table If set, the job will fail if the target Hive table exists; by default this property is false
--hive-table <table-name> Sets the table name to use when importing to Hive
--hive-drop-import-delims Drops \n, \r, and \01 from string fields when importing to Hive
--hive-delims-replacement Replaces \n, \r, and \01 in string fields with a user-defined string when importing to Hive
--hive-partition-key Name of the Hive field on which partitions are sharded
--hive-partition-value <v> String value that serves as the partition key for data imported into Hive in this job
--map-column-hive <map> Override default mapping from SQL type to Hive type for configured columns
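Combining these, all tables can be imported straight into Hive. A sketch, assuming a hypothetical connect string and a working Hive installation on the cluster:

```shell
sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --hive-import \
    --hive-overwrite \
    --hive-drop-import-delims
```

Dropping the \n, \r, and \01 characters from string fields avoids rows being split incorrectly, since Hive's default delimiters use those characters.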

Table 6. Code generation arguments

Argument Description
--bindir <dir> Output directory for compiled objects
--jar-file <file> Disable code generation; use specified jar
--outdir <dir> Output directory for generated code
--package-name <name> Put auto-generated classes in this package


The import-all-tables tool does not support the --class-name argument. However, with --package-name we can specify a package in which all generated classes will be placed.
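As a sketch of the code generation arguments (the package name, directories, and connect string below are hypothetical):

```shell
sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --package-name com.example.corp.model \
    --outdir /tmp/sqoop-src \
    --bindir /tmp/sqoop-classes
```

Sqoop generates one record class per table; with --package-name, each class is named after its table and placed under the given package instead of the default.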

Sqoop Import Example

To import all tables from the corp database:
$ sqoop import-all-tables --connect jdbc:mysql://db.foo.com/corp
Moreover, here we are verifying that it worked:
$ hadoop fs -ls
Found 4 items
drwxr-xr-x   - someuser somegrp       0 2010-04-27 17:15 /user/someuser/EMPLOYEES
drwxr-xr-x   - someuser somegrp       0 2010-04-27 17:15 /user/someuser/PAYCHECKS
drwxr-xr-x   - someuser somegrp       0 2010-04-27 17:15 /user/someuser/DEPARTMENTS
drwxr-xr-x   - someuser somegrp       0 2010-04-27 17:15 /user/someuser/OFFICE_SUPPLIES

Conclusion

In this article, we have covered the Sqoop import-all-tables tool: its purpose, its syntax, and its arguments. We have also walked through an example invocation to understand it better. Still, if you have any query regarding Sqoop import all tables, please ask through the comment section, and we will get back to you.

Learn about Sqoop Export & Sqoop List Databases
