13. // Demo continues: read, scan, and delete rows via the HBaseTest helpers
    System.out.println("===========get one record========");
    HBaseTest.getOneRecord(tablename, "zkb");      // fetch the row with key "zkb"
    System.out.println("===========show all record========");
    HBaseTest.getAllRecord(tablename);             // scan and print every row
    System.out.println("===========del one record========");
    HBaseTest.delRecord(tablename, "baoniu");      // delete the row with key "baoniu"
    HBaseTest.getAllRecord(tablename);
    System.out.println("===========show all record========");
    HBaseTest.getAllRecord(tablename);
  } catch (Exception e) {
    e.printStackTrace();
  }
}
}
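The helper methods above wrap basic HBase operations (a single-row Get, a full-table Scan, and a Delete). As a rough illustration of what the demo expects those helpers to do, here is a minimal in-memory stand-in; `TinyTable` and its values are invented for illustration, and real code would go through the HBase client API instead of a Map.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative stand-in: a Map plays the role of the HBase table so the
// get / scan / delete sequence from the demo can be followed end to end.
public class TinyTable {
    private final Map<String, String> rows = new LinkedHashMap<>();

    public void putRecord(String rowKey, String value) { rows.put(rowKey, value); }

    // analogous to HBaseTest.getOneRecord: fetch a single row by key
    public String getOneRecord(String rowKey) { return rows.get(rowKey); }

    // analogous to HBaseTest.getAllRecord: scan every row
    public Map<String, String> getAllRecord() { return new LinkedHashMap<>(rows); }

    // analogous to HBaseTest.delRecord: delete a row by key
    public void delRecord(String rowKey) { rows.remove(rowKey); }

    public static void main(String[] args) {
        TinyTable t = new TinyTable();
        t.putRecord("zkb", "value1");
        t.putRecord("baoniu", "value2");
        System.out.println(t.getOneRecord("zkb"));    // get one record
        t.delRecord("baoniu");                        // del one record
        System.out.println(t.getAllRecord().size());  // show all record: 1 row left
    }
}
```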
14. Sqoop (“SQL-to-Hadoop”) is a straightforward command-line tool with the following capabilities:
- Imports individual tables or entire databases to files in HDFS
- Generates Java classes to allow you to interact with your imported data
- Provides the ability to import from SQL databases straight into your Hive data warehouse
15. sqoop --connect jdbc:mysql://db.example.com/website --table USERS --local --hive-import
This would connect to the MySQL database on this server and import the USERS table into HDFS. The --local option instructs Sqoop to take advantage of a local MySQL connection, which performs very well. The --hive-import option means that after reading the data into HDFS, Sqoop will connect to the Hive metastore, create a table named USERS with the same columns and types (translated into their closest analogues in Hive), and load the data into the Hive warehouse directory on HDFS.
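“Translated into their closest analogues in Hive” refers to a type-mapping step. The sketch below shows a few common MySQL-to-Hive correspondences as a plain lookup table; the exact mapping Sqoop applies depends on its version, so treat these pairs as illustrative rather than authoritative.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative MySQL -> Hive type mapping (not Sqoop's exact table).
public class HiveTypeMap {
    private static final Map<String, String> MAPPING = new HashMap<>();
    static {
        MAPPING.put("VARCHAR", "STRING");   // character data becomes STRING
        MAPPING.put("CHAR", "STRING");
        MAPPING.put("INT", "INT");
        MAPPING.put("BIGINT", "BIGINT");
        MAPPING.put("DATETIME", "STRING"); // early Hive had no native datetime type
        MAPPING.put("DECIMAL", "DOUBLE");  // lossy: early Hive had no DECIMAL type
    }

    public static String toHiveType(String mysqlType) {
        // fall back to STRING for anything unrecognized
        return MAPPING.getOrDefault(mysqlType.toUpperCase(), "STRING");
    }

    public static void main(String[] args) {
        System.out.println("VARCHAR  -> " + toHiveType("VARCHAR"));
        System.out.println("DATETIME -> " + toHiveType("DATETIME"));
    }
}
```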
16. Suppose you wanted to work with this data in MapReduce and weren’t concerned with Hive. When storing this table in HDFS, you might want to take advantage of compression, so you’d like to be able to store the data in SequenceFiles:
sqoop --connect jdbc:mysql://db.example.com/website --table USERS --as-sequencefile
Sqoop includes some other commands which allow you to inspect the database you are working with. For example, you can list the available database schemas (with the sqoop-list-databases tool) and tables within a schema (with the sqoop-list-tables tool). Sqoop also includes a primitive SQL execution shell (the sqoop-eval tool).
17. sqoop help
usage: sqoop COMMAND [ARGS]
Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  version            Display version information
See 'sqoop help COMMAND' for information on a specific command.
18. sqoop help import
usage: sqoop import [GENERIC-ARGS] [TOOL-ARGS]
Common arguments:
  --connect <jdbc-uri>           Specify JDBC connect string
  --connect-manager <class-name> Specify connection manager class to use
  --driver <class-name>          Manually specify JDBC driver class to use
  --hadoop-home <dir>            Override $HADOOP_HOME
  --help                         Print usage instructions
  -P                             Read password from console
  --password <password>          Set authentication password
  --username <username>          Set authentication username
  --verbose                      Print more information while working
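The --connect string shown in these arguments is a standard JDBC URI. To make its structure concrete, the sketch below pulls the host and database name out of a URI shaped like the one in the earlier examples; this is only a parsing illustration, not an API that Sqoop exposes.

```java
import java.net.URI;

// Illustration only: dissect a JDBC connect string such as the one passed
// to --connect. A JDBC URI is "jdbc:" followed by a driver-specific URI,
// so we strip the "jdbc:" prefix and parse the remainder with java.net.URI.
public class ConnectStringDemo {
    public static String host(String jdbcUri) {
        return URI.create(jdbcUri.substring("jdbc:".length())).getHost();
    }

    public static String database(String jdbcUri) {
        String path = URI.create(jdbcUri.substring("jdbc:".length())).getPath();
        return path.startsWith("/") ? path.substring(1) : path;
    }

    public static void main(String[] args) {
        String uri = "jdbc:mysql://db.example.com/website";
        System.out.println(host(uri));      // db.example.com
        System.out.println(database(uri));  // website
    }
}
```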