
Password:

Last login: Sun Apr 30 09:10:13 on pts/27


[manikantan20505061@ip-172-31-60-179 ~]$ mysql
ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)
[manikantan20505061@ip-172-31-60-179 ~]$ mysql -u sqoopuser -h ip-172-31-13-154 -p NHkkP876rp
Enter password:
ERROR 1044 (42000): Access denied for user 'sqoopuser'@'%' to database 'NHkkP876rp'
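The `ERROR 1044` above is an argument-parsing mistake, not a privilege problem: with the `mysql` client, `-p` takes no separate value, so the password typed after it was parsed as the positional database argument (hence "to database 'NHkkP876rp'"). A sketch of the intended invocation, reusing the host and database from this session — pass `-p` alone and type the password only at the prompt:

```shell
# -p with no attached value: mysql prompts for the password interactively,
# and the positional argument (if any) names the database to open.
mysql -u sqoopuser -h ip-172-31-13-154 -p sqoopex
```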
[manikantan20505061@ip-172-31-60-179 ~]$ mysql -u sqoopuser -h 172.31.13.154 -p
Enter password:
Welcome to the MariaDB monitor. Commands end with ; or \g.
Your MySQL connection id is 62594
Server version: 5.6.30 MySQL Community Server (GPL)
Copyright (c) 2000, 2015, Oracle, MariaDB Corporation Ab and others.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
MySQL [(none)]> use sqoopex
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A
Database changed
MySQL [sqoopex]>

MySQL [sqoopex]> create table sqoop_testgg1
-> (id INT, name VARCHAR(20));
Query OK, 0 rows affected (0.04 sec)
MySQL [sqoopex]> desc sqoop_testgg1
-> ;
+-------+-------------+------+-----+---------+-------+
| Field | Type        | Null | Key | Default | Extra |
+-------+-------------+------+-----+---------+-------+
| id    | int(11)     | YES  |     | NULL    |       |
| name  | varchar(20) | YES  |     | NULL    |       |
+-------+-------------+------+-----+---------+-------+
2 rows in set (0.00 sec)

MySQL [sqoopex]> insert into sqoop_testgg1 values(1,'one'),(2,'two'),(3,'three'),
    -> (4,'four'),(5,'five'),(6,'six'),(7,'seven'),(8,'eight'),(9,'nine');
Query OK, 9 rows affected (0.02 sec)
Records: 9 Duplicates: 0 Warnings: 0

MySQL [sqoopex]> sqoop import --connect jdbc:mysql://172.31.13.154/sqoopex
    -> --username sqoopuser --password NHkkp876rp --table sqoop_test_gg2
    -> --target-dir vishal/ -m 1
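`sqoop` is a Linux command, not SQL, so typing it at the `MySQL [sqoopex]>` prompt above only leaves the client waiting for a terminating `;`. The import has to be issued from the shell; a sketch using the connection details from this session, with `-P` to prompt for the password as the Sqoop warning in the log recommends:

```shell
# Run from the Linux shell, not inside the mysql client.
sqoop import \
  --connect jdbc:mysql://172.31.13.154/sqoopex \
  --username sqoopuser -P \
  --table sqoop_testgg1 \
  --target-dir girilinux/ \
  -m 1
```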

[manikantan20505061@ip-172-31-60-179 ~]$ sqoop import --connect jdbc:mysql://172.31.13.154/sqoopex --username sqoopuser --password NHkkP876rp --table sqoop_testgg1 --target-dir girilinux/ -m 1;
17/04/30 09:47:15 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.4.0-3485
17/04/30 09:47:15 WARN tool.BaseSqoopTool: Setting your password on the command-
line is insecure. Consider using -P instead.
17/04/30 09:47:15 INFO manager.MySQLManager: Preparing to use a MySQL streaming
resultset.
17/04/30 09:47:15 INFO tool.CodeGenTool: Beginning code generation
17/04/30 09:47:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM
`sqoop_testgg1` AS t LIMIT 1
17/04/30 09:47:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM
`sqoop_testgg1` AS t LIMIT 1
17/04/30 09:47:16 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
/usr/hdp/2.3.4.0-3485/hadoop-mapreduce
Note: /tmp/sqoop-
manikantan20505061/compile/dbeb39aa7a4306bb1bf05cdd6224986f/sqoop_testgg1.java uses
or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/04/30 09:47:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-
manikantan20505061/compile/dbeb39aa7a4306bb1bf05cdd6224986f/sqoop_testgg1.jar
17/04/30 09:47:18 WARN manager.MySQLManager: It looks like you are importing from
mysql.
17/04/30 09:47:18 WARN manager.MySQLManager: This transfer can be faster! Use the
--direct
17/04/30 09:47:18 WARN manager.MySQLManager: option to exercise a MySQL-specific
fast path.
17/04/30 09:47:18 INFO manager.MySQLManager: Setting zero DATETIME behavior to
convertToNull (mysql)
17/04/30 09:47:18 INFO mapreduce.ImportJobBase: Beginning import of sqoop_testgg1
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/hadoop/lib/slf4j-log4j12-
1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/zookeeper/lib/slf4j-
log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/accumulo/lib/slf4j-
log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/04/30 09:47:19 INFO impl.TimelineClientImpl: Timeline service address:
http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
17/04/30 09:47:19 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-
53-48.ec2.internal/172.31.53.48:8050
17/04/30 09:47:22 INFO db.DBInputFormat: Using read commited transaction isolation
17/04/30 09:47:22 INFO mapreduce.JobSubmitter: number of splits:1
17/04/30 09:47:22 INFO mapreduce.JobSubmitter: Submitting tokens for job:
job_1493302944662_5275
17/04/30 09:47:23 INFO impl.YarnClientImpl: Submitted application
application_1493302944662_5275
17/04/30 09:47:23 INFO mapreduce.Job: The url to track the job:
http://a.cloudxlab.com:8088/proxy/application_1493302944662_5275/
17/04/30 09:47:23 INFO mapreduce.Job: Running job: job_1493302944662_5275

sqoop list-tables --connect jdbc:mysql://172.31.13.154/sqoopex --username sqoopuser --password NHkkP876rp

[manikantan20505061@ip-172-31-60-179 ~]$ hadoop fs -ls girilinux
Found 2 items
-rw-r--r--   3 manikantan20505061 manikantan20505061  0 2017-04-30 09:48 girilinux/_SUCCESS
-rw-r--r--   3 manikantan20505061 manikantan20505061 63 2017-04-30 09:47 girilinux/part-m-00000
[manikantan20505061@ip-172-31-60-179 ~]$ hadoop fs -ls
Found 51 items
drwx------   - manikantan20505061 manikantan20505061  0 2017-04-30 06:00 .Trash
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-10 17:53 .hiveJars
drwx------   - manikantan20505061 manikantan20505061  0 2017-04-30 09:53 .staging
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-29 08:02 AlbertDir
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:36 Prabakaran_sqoop
-rw-r--r--   3 manikantan20505061 manikantan20505061  0 2017-04-29 07:23 abc
-rw-r--r--   3 manikantan20505061 manikantan20505061 28 2017-04-29 08:01 abc.txt
-rw-r--r--   3 manikantan20505061 manikantan20505061  0 2017-04-29 07:16 albert
-rw-r--r--   3 manikantan20505061 hadoop              0 2017-04-29 07:18 albert.txt
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:39 albert1
-rw-r--r--   3 manikantan20505061 manikantan20505061 19 2017-04-29 07:55 albertabc.txt
-rw-r--r--   3 manikantan20505061 manikantan20505061  0 2017-04-29 07:32 file1
-rw-r--r--   3 manikantan20505061 manikantan20505061  0 2017-04-29 07:14 file2
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:48 girilinux

hadoop fs -cat girilinux/part-m-00000
1,one
2,two
3,three
4,four
5,five
6,six
7,seven
8,eight
9,nine

[manikantan20505061@ip-172-31-60-179 ~]$ sqoop import-all-tables --connect jdbc:mysql://172.31.13.154/sqoopex --username sqoopuser --password NHkkp876rp --table sqoop_test_gg2 --warehouse-dir girilinuxWH -m 1;

17/04/30 09:58:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.4.0-3485
17/04/30 09:58:47 WARN tool.BaseSqoopTool: Setting your password on the command-
line is insecure. Consider using -P instead.
17/04/30 09:58:47 ERROR tool.BaseSqoopTool: Error parsing arguments for import-all-
tables:
17/04/30 09:58:47 ERROR tool.BaseSqoopTool: Unrecognized argument: --table
17/04/30 09:58:47 ERROR tool.BaseSqoopTool: Unrecognized argument: sqoop_test_gg2
17/04/30 09:58:47 ERROR tool.BaseSqoopTool: Unrecognized argument: --warehouse-dir
17/04/30 09:58:47 ERROR tool.BaseSqoopTool: Unrecognized argument: girilinuxWH
17/04/30 09:58:47 ERROR tool.BaseSqoopTool: Unrecognized argument: -m
17/04/30 09:58:47 ERROR tool.BaseSqoopTool: Unrecognized argument: 1
Try --help for usage instructions.
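`import-all-tables` copies every table in the database, so it has no table-selection option: `--table` and everything after it are reported as unrecognized, since argument parsing stops at the first unknown flag. Either drop `--table`, or switch to the single-table `import` tool; both sketches reuse the connection details from this session (with `-P` to prompt for the password):

```shell
# Whole database into one warehouse directory (one subdirectory per table):
sqoop import-all-tables \
  --connect jdbc:mysql://172.31.13.154/sqoopex \
  --username sqoopuser -P \
  --warehouse-dir girilinuxWH -m 1

# Single table: use the import tool, where --table is valid:
sqoop import \
  --connect jdbc:mysql://172.31.13.154/sqoopex \
  --username sqoopuser -P \
  --table sqoop_test_gg2 \
  --warehouse-dir girilinuxWH -m 1
```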
[manikantan20505061@ip-172-31-60-179 ~]$

[manikantan20505061@ip-172-31-60-179 ~]$ sqoop import-all-tables --connect jdbc:mysql://172.31.13.154/sqoopex --username sqoopuser --password NHkkP876rp --warehouse-dir giriWH -m 1;
17/04/30 10:04:49 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.4.0-3485
17/04/30 10:04:49 WARN tool.BaseSqoopTool: Setting your password on the command-
line is insecure. Consider using -P instead.
17/04/30 10:04:49 INFO manager.MySQLManager: Preparing to use a MySQL streaming
resultset.
17/04/30 10:04:50 INFO tool.CodeGenTool: Beginning code generation
17/04/30 10:04:50 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM
`AkhilEnergy` AS t LIMIT 1
17/04/30 10:04:50 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM
`AkhilEnergy` AS t LIMIT 1
17/04/30 10:04:50 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
/usr/hdp/2.3.4.0-3485/hadoop-mapreduce
Note: /tmp/sqoop-
manikantan20505061/compile/5b1e3708505b1208db0eaf36e971590f/AkhilEnergy.java uses
or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/04/30 10:04:53 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-
manikantan20505061/compile/5b1e3708505b1208db0eaf36e971590f/AkhilEnergy.jar
17/04/30 10:04:53 WARN manager.MySQLManager: It looks like you are importing from
mysql.
17/04/30 10:04:53 WARN manager.MySQLManager: This transfer can be faster! Use the
--direct
17/04/30 10:04:53 WARN manager.MySQLManager: option to exercise a MySQL-specific
fast path.
17/04/30 10:04:53 INFO manager.MySQLManager: Setting zero DATETIME behavior to
convertToNull (mysql)
17/04/30 10:04:53 INFO mapreduce.ImportJobBase: Beginning import of AkhilEnergy
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/hadoop/lib/slf4j-log4j12-
1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/zookeeper/lib/slf4j-
log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/accumulo/lib/slf4j-
log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/04/30 10:04:55 INFO impl.TimelineClientImpl: Timeline service address:
http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
17/04/30 10:04:56 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-
53-48.ec2.internal/172.31.53.48:8050
17/04/30 10:05:01 INFO db.DBInputFormat: Using read commited transaction isolation
17/04/30 10:05:01 INFO mapreduce.JobSubmitter: number of splits:1
17/04/30 10:05:01 INFO mapreduce.JobSubmitter: Submitting tokens for job:
job_1493302944662_5358
17/04/30 10:05:02 INFO impl.YarnClientImpl: Submitted application
application_1493302944662_5358
17/04/30 10:05:02 INFO mapreduce.Job: The url to track the job:
http://a.cloudxlab.com:8088/proxy/application_1493302944662_5358/
17/04/30 10:05:02 INFO mapreduce.Job: Running job: job_1493302944662_5358
17/04/30 10:05:10 INFO mapreduce.Job: Job job_1493302944662_5358 running in uber
mode : false
17/04/30 10:05:10 INFO mapreduce.Job: map 0% reduce 0%

[manikantan20505061@ip-172-31-60-179 ~]$ hadoop fs -ls giriWH
Found 3 items
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 10:05 giriWH/AkhilEnergy
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 10:05 giriWH/BK_EMP
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 10:06 giriWH/BUILDING_LOOKUP

[manikantan20505061@ip-172-31-60-179 ~]$ sqoop import --connect jdbc:mysql://172.31.13.154/sqoopex --username sqoopuser --password NHkkP876rp --table sqoop_testgg1 --hive-import -m 1;
17/04/30 10:16:22 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.4.0-3485
17/04/30 10:16:22 WARN tool.BaseSqoopTool: Setting your password on the command-
line is insecure. Consider using -P instead.
17/04/30 10:16:22 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for
output. You can override
17/04/30 10:16:22 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by,
etc.
17/04/30 10:16:23 INFO manager.MySQLManager: Preparing to use a MySQL streaming
resultset.
17/04/30 10:16:23 INFO tool.CodeGenTool: Beginning code generation
17/04/30 10:16:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM
`sqoop_testgg1` AS t LIMIT 1
17/04/30 10:16:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM
`sqoop_testgg1` AS t LIMIT 1
17/04/30 10:16:24 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
/usr/hdp/2.3.4.0-3485/hadoop-mapreduce
Note: /tmp/sqoop-
manikantan20505061/compile/0ee34c9b3b70e40f0691fe70388bc810/sqoop_testgg1.java uses
or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/04/30 10:16:26 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-
manikantan20505061/compile/0ee34c9b3b70e40f0691fe70388bc810/sqoop_testgg1.jar
17/04/30 10:16:26 WARN manager.MySQLManager: It looks like you are importing from
mysql.
17/04/30 10:16:26 WARN manager.MySQLManager: This transfer can be faster! Use the
--direct
17/04/30 10:16:26 WARN manager.MySQLManager: option to exercise a MySQL-specific
fast path.
17/04/30 10:16:26 INFO manager.MySQLManager: Setting zero DATETIME behavior to
convertToNull (mysql)
17/04/30 10:16:26 INFO mapreduce.ImportJobBase: Beginning import of sqoop_testgg1
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/hadoop/lib/slf4j-log4j12-
1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/zookeeper/lib/slf4j-
log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/accumulo/lib/slf4j-
log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/04/30 10:16:27 INFO impl.TimelineClientImpl: Timeline service address:
http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
17/04/30 10:16:28 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-
53-48.ec2.internal/172.31.53.48:8050
17/04/30 10:16:34 INFO db.DBInputFormat: Using read commited transaction isolation
17/04/30 10:16:34 INFO mapreduce.JobSubmitter: number of splits:1
17/04/30 10:16:34 INFO mapreduce.JobSubmitter: Submitting tokens for job:
job_1493302944662_5417
17/04/30 10:16:34 INFO impl.YarnClientImpl: Submitted application
application_1493302944662_5417
17/04/30 10:16:35 INFO mapreduce.Job: The url to track the job:
http://a.cloudxlab.com:8088/proxy/application_1493302944662_5417/
17/04/30 10:16:35 INFO mapreduce.Job: Running job: job_1493302944662_5417
17/04/30 10:16:41 INFO mapreduce.Job: Job job_1493302944662_5417 running in uber
mode : false
17/04/30 10:16:41 INFO mapreduce.Job: map 0% reduce 0%
17/04/30 10:16:47 INFO mapreduce.Job: map 100% reduce 0%
17/04/30 10:16:47 INFO mapreduce.Job: Job job_1493302944662_5417 completed
successfully
17/04/30 10:16:47 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=149199
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=87
                HDFS: Number of bytes written=63
                HDFS: Number of read operations=4
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Other local map tasks=1
                Total time spent by all maps in occupied slots (ms)=9510
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=3170
                Total vcore-seconds taken by all map tasks=3170
                Total megabyte-seconds taken by all map tasks=4869120
        Map-Reduce Framework
                Map input records=9
                Map output records=9
                Input split bytes=87
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=48
                CPU time spent (ms)=1310
                Physical memory (bytes) snapshot=219414528
                Virtual memory (bytes) snapshot=3239182336
                Total committed heap usage (bytes)=179306496
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=63
17/04/30 10:16:47 INFO mapreduce.ImportJobBase: Transferred 63 bytes in 19.9532
seconds (3.1574 bytes/sec)
17/04/30 10:16:47 INFO mapreduce.ImportJobBase: Retrieved 9 records.
17/04/30 10:16:47 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM
`sqoop_testgg1` AS t LIMIT 1
17/04/30 10:16:47 INFO hive.HiveImport: Loading uploaded data into Hive
Logging initialized using configuration in jar:file:/usr/hdp/2.3.4.0-
3485/hive/lib/hive-common-1.2.1.2.3.4.0-3485.jar!/hive-log4j.properties
OK
Time taken: 6.968 seconds
Loading data to table default.sqoop_testgg1
Table default.sqoop_testgg1 stats: [numFiles=1, totalSize=63]
OK
Time taken: 0.592 seconds
[manikantan20505061@ip-172-31-60-179 ~]$ hive
WARNING: Use "yarn jar" to launch YARN applications.
Logging initialized using configuration in file:/etc/hive/2.3.4.0-3485/0/hive-
log4j.properties
hive (default)> desc sqoop_testgg1
> ;
OK
id int
name string
Time taken: 1.409 seconds, Fetched: 2 row(s)
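`--hive-import` first ran the usual MapReduce import and then loaded the file into a new `default.sqoop_testgg1` table, mapping MySQL `INT` to Hive `int` and `VARCHAR(20)` to `string` as `desc` shows. A quick way to confirm the rows arrived — a hypothetical check, not part of the captured session:

```shell
# Non-interactive Hive query against the table created by --hive-import.
hive -e 'SELECT * FROM default.sqoop_testgg1 LIMIT 3;'
```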

pig -x mapreduce
WARNING: Use "yarn jar" to launch YARN applications.
17/04/30 11:16:48 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
17/04/30 11:16:48 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
17/04/30 11:16:48 INFO pig.ExecTypeProvider: Picked MAPREDUCE as the ExecType
2017-04-30 11:16:48,440 [main] INFO org.apache.pig.Main - Apache Pig version
0.15.0.2.3.4.0-3485 (rexported) compiled Dec 16 2015, 04:30:33
2017-04-30 11:16:48,440 [main] INFO org.apache.pig.Main - Logging error messages
to: /home/manikantan20505061/pig_1493551008437.log
2017-04-30 11:16:48,458 [main] INFO org.apache.pig.impl.util.Utils - Default
bootup file /home/manikantan20505061/.pigbootup not found
2017-04-30 11:16:48,918 [main] INFO
org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to
hadoop file system at: hdfs://ip-172-31-53-48.ec2.internal:8020
2017-04-30 11:16:49,743 [main] INFO org.apache.pig.PigServer - Pig Script ID for
the session: PIG-default-398d1adb-4df0-4b71-9b44-6b340ae4044a
2017-04-30 11:16:50,278 [main] INFO
org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service
address: http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
2017-04-30 11:16:50,278 [main] INFO org.apache.pig.backend.hadoop.ATSService -
Created ATS Hook
grunt> studentggrel = LOAD 'giripig/sudentgg' USING PigStorage(',') as ( id:int,
firstname:chararray,lastname:chararray,phone:chararray,city:chararray);
grunt> studentggrel = LOAD 'giri_pig/studentgg' USING PigStorage(',') as ( id:int,
firstname:chararray,lastname:chararray,phone:chararray,city:chararray);

Counters:
Total records written : 3
Total bytes written : 144
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_1493302944662_5604
2017-04-30 12:04:00,568 [main] INFO
org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service
address: http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
2017-04-30 12:04:00,569 [main] INFO org.apache.hadoop.yarn.client.RMProxy -
Connecting to ResourceManager at ip-172-31-53-48.ec2.internal/172.31.53.48:8050
2017-04-30 12:04:00,571 [main] INFO org.apache.hadoop.mapred.ClientServiceDelegate
- Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting
to job history server
2017-04-30 12:04:00,657 [main] INFO
org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service
address: http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
2017-04-30 12:04:00,657 [main] INFO org.apache.hadoop.yarn.client.RMProxy -
Connecting to ResourceManager at ip-172-31-53-48.ec2.internal/172.31.53.48:8050
2017-04-30 12:04:00,659 [main] INFO org.apache.hadoop.mapred.ClientServiceDelegate
- Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting
to job history server
2017-04-30 12:04:00,737 [main] INFO
org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service
address: http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
2017-04-30 12:04:00,737 [main] INFO org.apache.hadoop.yarn.client.RMProxy -
Connecting to ResourceManager at ip-172-31-53-48.ec2.internal/172.31.53.48:8050
2017-04-30 12:04:00,739 [main] INFO org.apache.hadoop.mapred.ClientServiceDelegate
- Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting
to job history server
2017-04-30 12:04:00,787 [main] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher -
Success!
2017-04-30 12:04:00,789 [main] INFO org.apache.pig.data.SchemaTupleBackend - Key
[pig.schematuple] was not set... will not generate code.
2017-04-30 12:04:00,799 [main] INFO
org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to
process : 1
2017-04-30 12:04:00,799 [main] INFO
org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths
to process : 1
(1,Rajiv,Reddy,9840447884,hyderabad)
(2,sudha,Ramesh,9600055807,chennai)
(3,giriprasad,gunalan,9566222358,chennai)
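The grunt capture above shows only the `LOAD` statement and the tail of the job log; the statement that actually triggered the MapReduce job (presumably a `DUMP`) is missing from the capture. A consolidated sketch of the apparent session, with the path and schema copied from the `LOAD` line (the script name is hypothetical):

```shell
# Run the Pig Latin below in one batch: pig -x mapreduce student.pig
cat > student.pig <<'EOF'
-- Load the comma-separated student file from HDFS
studentggrel = LOAD 'giri_pig/studentgg' USING PigStorage(',')
    AS (id:int, firstname:chararray, lastname:chararray,
        phone:chararray, city:chararray);
-- DUMP forces execution and prints the tuples
DUMP studentggrel;
EOF
pig -x mapreduce student.pig
```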

drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 10:07 shyne
-rw-r--r--   3 manikantan20505061 manikantan20505061  0 2017-04-29 07:36 shyney.txt
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:44 sqoop_new
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:59 sqoop_new1
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 10:06 sqoop_new2
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:45 sqoop_sunil_dir
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 10:01 sqoop_sunil_dir1
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:42 srini1
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 10:00 srinialltables
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 11:10 sunil_hadoop
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-29 10:57 syresh
-rw-r--r--   3 manikantan20505061 manikantan20505061  0 2017-04-29 07:24 syresh.txt
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 11:12 syresh_PIG
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 10:00 syresh_all
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:43 syreshkum
-rw-r--r--   3 manikantan20505061 manikantan20505061  0 2017-04-29 07:25 test.txt
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-29 07:44 testgiri
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-29 10:25 training
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 11:10 tutosrini
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-29 07:35 varn
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:59 varun
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 11:12 varunnow
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-29 07:52 venkHadoop
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 11:13 venkHadoop2
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-29 08:03 venky
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-29 08:03 vishal
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:58 vishal_all1
drwxr-xr-x   - manikantan20505061 manikantan20505061  0 2017-04-30 09:34 vishal_new
[manikantan20505061@ip-172-31-60-179 ~]$ pwd
/home/manikantan20505061
[manikantan20505061@ip-172-31-60-179 ~]$ hadoop fs -ls giri_pig
Found 1 items
-rw-r--r--   3 manikantan20505061 manikantan20505061 115 2017-04-30 11:14 giri_pig/studentgg
[manikantan20505061@ip-172-31-60-179 ~]$
