Sqoop Import All Tables – A Complete Guide


Sqoop's import-all-tables tool imports a set of tables from an RDBMS to HDFS. This tutorial covers the whole concept of Sqoop import all tables: after an introduction, we cover its syntax and arguments, and then walk through examples to understand it better.

So, let’s start the Sqoop Import All Tables tutorial.

Introduction to Sqoop Import All Tables

The import-all-tables tool imports a set of tables from an RDBMS to HDFS. In HDFS, the data from each table is stored in a separate directory.
However, there are several conditions that must be met for the import-all-tables tool to be useful:

  • Each table must have a single-column primary key, or the --autoreset-to-one-mapper option must be used.
  • We must intend to import all columns of each table.
  • We must not intend to use a non-default splitting column, nor impose any conditions via a WHERE clause (a minimal invocation that satisfies these conditions is sketched below).
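
For instance, a minimal sketch of an invocation that handles tables without a single-column primary key; the host, database, and username here are hypothetical:

$ sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --autoreset-to-one-mapper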

Syntax of Sqoop Import All Tables

$ sqoop import-all-tables (generic-args) (import-args)
$ sqoop-import-all-tables (generic-args) (import-args)
The import arguments can be entered in any order with respect to one another, but the Hadoop generic arguments must precede any import arguments.
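
For example, a Hadoop generic argument such as -D must appear before the import arguments; the job name and connection details below are illustrative:

$ sqoop import-all-tables \
    -D mapreduce.job.name=corp-import \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P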
Table 1. Common arguments

Argument | Description
--connect <jdbc-uri> | Specify JDBC connect string
--connection-manager <class-name> | Specify connection manager class to use
--driver <class-name> | Manually specify JDBC driver class to use
--hadoop-mapred-home <dir> | Override $HADOOP_MAPRED_HOME
--help | Print usage instructions
--password-file | Set path for a file containing the authentication password
-P | Read password from console
--password <password> | Set authentication password
--username <username> | Set authentication username
--verbose | Print more information while working
--connection-param-file <filename> | Optional properties file that provides connection parameters
--relaxed-isolation | Set connection transaction isolation to read uncommitted for the mappers
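
For illustration, several of these common arguments might be combined as follows; the host, database, user, and password-file path are hypothetical:

$ sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser \
    --password-file /user/someuser/.db-password \
    --verbose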

Table 2. Import control arguments

Argument | Description
--as-avrodatafile | Imports data to Avro Data Files
--as-sequencefile | Imports data to SequenceFiles
--as-textfile | Imports data as plain text (default)
--as-parquetfile | Imports data to Parquet Files
--direct | Use direct import fast path
--inline-lob-limit <n> | Set the maximum size for an inline LOB
-m,--num-mappers <n> | Use n map tasks to import in parallel
--warehouse-dir <dir> | HDFS parent for table destination
-z,--compress | Enable compression
--compression-codec <c> | Use Hadoop codec (default gzip)
--exclude-tables <tables> | Comma separated list of tables to exclude from the import process
--autoreset-to-one-mapper | Import should use one mapper if a table with no primary key is encountered
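
For instance, a sketch that imports everything except two hypothetical staging tables, compressed and under a common HDFS parent directory:

$ sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --warehouse-dir /user/someuser/corp \
    --exclude-tables AUDIT_LOG,TEMP_STAGING \
    --num-mappers 4 \
    --compress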

These arguments behave in the same manner as they do when used with the sqoop-import tool. However, the --table, --split-by, --columns, and --where arguments are invalid for sqoop-import-all-tables, and the --exclude-tables argument is for sqoop-import-all-tables only.
Table 3. Output line formatting arguments

Argument | Description
--enclosed-by <char> | Sets a required field enclosing character
--escaped-by <char> | Sets the escape character
--fields-terminated-by <char> | Sets the field separator character
--lines-terminated-by <char> | Sets the end-of-line character
--mysql-delimiters | Uses MySQL's default delimiter set: fields: ,  lines: \n  escaped-by: \  optionally-enclosed-by: '
--optionally-enclosed-by <char> | Sets a field enclosing character
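
For example, to write tab-separated text files, the output formatting arguments could be used as below; the delimiters are just one reasonable choice:

$ sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --fields-terminated-by '\t' \
    --lines-terminated-by '\n'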

Table 4. Input parsing arguments

Argument | Description
--input-enclosed-by <char> | Sets a required field encloser
--input-escaped-by <char> | Sets the input escape character
--input-fields-terminated-by <char> | Sets the input field separator
--input-lines-terminated-by <char> | Sets the input end-of-line character
--input-optionally-enclosed-by <char> | Sets a field enclosing character
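
These arguments control how the generated record classes parse input records, so they are typically set to match the output delimiters; a sketch under that assumption:

$ sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --fields-terminated-by ',' \
    --input-fields-terminated-by ','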

Table 5. Hive arguments

Argument | Description
--hive-home <dir> | Override $HIVE_HOME
--hive-import | Import tables into Hive (uses Hive's default delimiters if none are set)
--hive-overwrite | Overwrite existing data in the Hive table
--create-hive-table | If set, the job will fail if the target Hive table exists; by default this property is false
--hive-table <table-name> | Sets the table name to use when importing to Hive
--hive-drop-import-delims | Drops \n, \r, and \01 from string fields when importing to Hive
--hive-delims-replacement | Replaces \n, \r, and \01 in string fields with a user-defined string when importing to Hive
--hive-partition-key | Name of the Hive field on which partitions are sharded
--hive-partition-value <v> | String value that serves as the partition key for data imported into Hive in this job
--map-column-hive <map> | Override default mapping from SQL type to Hive type for configured columns
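
As a sketch, every table could be imported straight into Hive like this; the connection details are hypothetical, and --hive-overwrite replaces any existing data in the target tables:

$ sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --hive-import \
    --hive-overwrite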

Table 6. Code generation arguments

Argument | Description
--bindir <dir> | Output directory for compiled objects
--jar-file <file> | Disable code generation; use specified jar
--outdir <dir> | Output directory for generated code
--package-name <name> | Put auto-generated classes in this package

The import-all-tables tool does not support the --class-name argument. However, with --package-name we can specify a package in which all generated classes will be placed.
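
For example, the generated record classes could be placed under a hypothetical package like this:

$ sqoop import-all-tables \
    --connect jdbc:mysql://db.example.com/corp \
    --username someuser -P \
    --package-name com.example.corp.records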

Sqoop Import All Tables Example

For example, to import all tables from the corp database:
$ sqoop import-all-tables --connect jdbc:mysql://db.foo.com/corp
Then we verify that it worked:
$ hadoop fs -ls
Found 4 items
drwxr-xr-x   - someuser somegrp       0 2010-04-27 17:15 /user/someuser/EMPLOYEES
drwxr-xr-x   - someuser somegrp       0 2010-04-27 17:15 /user/someuser/PAYCHECKS
drwxr-xr-x   - someuser somegrp       0 2010-04-27 17:15 /user/someuser/DEPARTMENTS
drwxr-xr-x   - someuser somegrp       0 2010-04-27 17:15 /user/someuser/OFFICE_SUPPLIES

Conclusion

As a result, we have seen the whole concept of Sqoop import-all-tables: its purpose, its syntax, and all of its arguments. We have also covered example invocations to understand it well. However, if you still have any query regarding Sqoop import all tables, please ask through the comment section, and we will definitely get back to you.

