WORK EXPERIENCE
Hadoop Developer
Sears Holding - Hoffman Estates, IL - February 2013 to Present
Sears Holdings Corporation is a leading integrated retailer with almost 2,500 full-line and specialty retail stores
in the United States and Canada. Sears Holdings is the leading home appliance retailer as well as a leader in
tools, lawn and garden, fitness equipment and automotive repair and maintenance. I was part of the Big
Data Processing Team, which used available customer data to make better decisions that significantly
enhanced organizational success. I was involved in setting up the Cloudera Hadoop cluster and wrote
MapReduce jobs, Hive queries and Pig Latin scripts to explore customer sales data for trend analysis.
Responsibilities:
Involved in the end-to-end process of Hadoop cluster installation, configuration and monitoring.
Responsible for building scalable distributed data solutions using Hadoop
Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster
Setup and benchmarked Hadoop/HBase clusters for internal use
Developed simple to complex MapReduce jobs using Hive and Pig
Optimized Map/Reduce Jobs to use HDFS efficiently by using various compression mechanisms
Handled importing of data from various data sources, performed transformations using Hive and MapReduce,
loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop
Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
Implemented business logic in Hadoop by writing UDFs in Java, and reused UDFs from Piggybank and other
sources
Continuously monitored and managed the Hadoop cluster using Cloudera Manager
Worked with application teams to install operating system and Hadoop updates, patches, and version
upgrades as required
Installed the Oozie workflow engine to run multiple Hive and Pig jobs
Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports
for the BI team
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, Java (JDK 1.6), SQL, Cloudera Manager, Sqoop, Flume,
Oozie, Eclipse
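The MapReduce aggregation work described above can be sketched, in simplified in-memory form, as plain Java. Class, field and record formats here are illustrative, not from the actual Sears codebase; a real job would extend Hadoop's Mapper and Reducer classes and run against HDFS.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Simplified sketch of the map/reduce aggregation pattern used for
// sales trend analysis. The "storeId,amount" record layout is assumed.
public class SalesAggregator {

    // "Map" phase: emit a (storeId, amount) pair from one raw record.
    static Map.Entry<String, Double> map(String record) {
        String[] fields = record.split(",");
        return Map.entry(fields[0], Double.parseDouble(fields[1]));
    }

    // "Reduce" phase: sum the amounts per key, like a summing reducer.
    public static Map<String, Double> aggregate(List<String> records) {
        return records.stream()
                .map(SalesAggregator::map)
                .collect(Collectors.groupingBy(Map.Entry::getKey,
                        Collectors.summingDouble(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> records = List.of("store1,10.5", "store2,3.0", "store1,4.5");
        System.out.println(aggregate(records)); // per-store totals
    }
}
```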
Hadoop Developer
American Express - Phoenix, AZ - February 2012 to January 2013
American Express provides innovative payment, travel and expense management solutions for individuals
and businesses of all sizes. It helps customers realize their dreams and aspirations through industry-leading
benefits, access to unique experiences, business-building insights, and global customer care. The purpose
of the project was to create an Enterprise Data Hub so that various business units could use the data in
Hadoop for data analytics. The solution is based on Cloudera Hadoop; data is stored in the Hadoop file
system and processed using MapReduce jobs.
Responsibilities:
Installed and configured Hadoop MapReduce, HDFS and developed multiple MapReduce jobs in Java for
data cleansing and preprocessing.
Involved in loading data from UNIX file system to HDFS.
Installed and configured Hive, and wrote Hive UDFs.
Evaluated business requirements and prepared detailed specifications that follow project guidelines required
to develop written programs.
Devised procedures that solve complex business problems with due considerations for hardware/software
capacity and limitations, operating times and desired results.
Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
Provided quick response to ad hoc internal and external client requests for data and experienced in creating
ad hoc reports.
Responsible for building scalable distributed data solutions using Hadoop.
Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and
troubleshooting, manage and review data backups, manage and review Hadoop log files.
Worked hands on with ETL process.
Handled importing of data from various data sources, performed transformations using Hive, MapReduce,
and loaded data into HDFS.
Extracted the data from Teradata into HDFS using Sqoop.
Analyzed the data by performing Hive queries and running Pig scripts to understand user behavior
segments such as shopping enthusiasts, travelers and music lovers.
Exported the patterns analyzed back into Teradata using Sqoop.
Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
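The data cleansing and preprocessing done in the MapReduce jobs above can be sketched as a standalone routine. The pipe-delimited format and three-column layout are assumptions for illustration, not the actual American Express schema.

```java
import java.util.List;
import java.util.stream.Collectors;

// Minimal sketch of record cleansing before an HDFS load: trim whitespace,
// drop blank or malformed rows, and normalize the delimiter.
public class RecordCleanser {

    static final int EXPECTED_FIELDS = 3; // assumed column count

    public static List<String> cleanse(List<String> rawRecords) {
        return rawRecords.stream()
                .map(String::trim)
                .filter(r -> !r.isEmpty())
                // keep only rows with the expected number of columns
                .filter(r -> r.split("\\|", -1).length == EXPECTED_FIELDS)
                // normalize the delimiter for the downstream load
                .map(r -> r.replace('|', ','))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = List.of(" a|b|c ", "", "bad|row", "1|2|3");
        System.out.println(cleanse(raw)); // keeps "a|b|c" and "1|2|3", comma-normalized
    }
}
```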
Java/J2EE Developer
Capital One Bank - McLean, VA - October 2009 to June 2010
Capital One Auto Finance (COAF) is a project in which we designed an application used by Capital One
Bank to handle different types of auto loans depending on customer eligibility. COAF also included
maintenance of the existing application.
Responsibilities:
Played the role of Java developer in the project called "Coverage Selection Tool".
Technologies involved are EJB 3.0, Web services, Dojo (UI Framework) and other J2EE server components.
Analyzed and prepared technical specifications with UML diagrams (use case, class, and sequence diagrams).
Used Rational Rose to develop the components required by the client.
Wrote complex logic for forecasting the price of products and subparts in upcoming quarters.
Developed business components applying OOAD and using design patterns such as DAO, Value Object, DTO,
Factory and Singleton.
Implemented DOM parsing module and created XSD and XSLT components.
Used stored procedures and Triggers extensively to develop the Backend business logic in Oracle database.
Involved in performance improving and bug fixing.
Analyzed old database table fields and mapped them to new schema tables using complex SQL queries and
PL/SQL procedures.
Developed ANT scripts for deploying the application using Apache ANT.
Coordinated with functional users and testing teams in testing the application in the test environment.
Provided production support after deployment to the production server.
Involved in database migration testing activities.
Environment: Java, JSP, Servlets, XML, JDBC, JavaScript, PL/SQL, ANT build, CSS, HTML, Eclipse IDE.
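The DOM parsing module mentioned above can be sketched with the JDK's built-in parser. The <coverage> element and tag names are invented for the example; the real document structure was defined by the project's XSDs.

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import org.w3c.dom.Document;

// Illustrative DOM parsing sketch: load an XML document from a string
// and read the text content of the first element with a given tag name.
public class CoverageDomParser {

    public static String firstValue(String xml, String tag) {
        try {
            DocumentBuilder builder =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse(
                    new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            // getElementsByTagName returns matches in document order
            return doc.getElementsByTagName(tag).item(0).getTextContent();
        } catch (Exception e) {
            throw new IllegalStateException("failed to parse XML", e);
        }
    }

    public static void main(String[] args) {
        String xml = "<coverage><type>AutoLoan</type></coverage>";
        System.out.println(firstValue(xml, "type")); // prints AutoLoan
    }
}
```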
Java Developer
Telesis Global Solutions - March 2008 to June 2009
Government Business Module [GBM] is a complete web-enabled product developed by Telesis, catering to
all nationalized and private sector banks in India. It supports various government transactions like Taxes,
Bonds, Public Provident Fund and Pension. It enables Fund transfer, Consolidation, Commission, Calculation,
Pension Processing, Reports and Scrolls in a format regulated by the Government of India. The functionality
of each module is customized to suit different target audiences in the banking sector.
Responsibilities:
Extensively involved in the design of JSP screens for the Public Provident Fund and Bond modules.
Developed the user interface screens for the above modules.
Worked with the front-end applications using HTML and XML.
Developed the business components (in core Java) used in the JSP screens.
Implemented Delegate, Façade, and DAO patterns for building the application.
Written Ant scripts for build, unit testing, deployment, check styles etc.
Used JUnit for unit testing.
I was part of all testing phases. Provided UAT support.
Created WAR files and deployed them on WebLogic and WebSphere Application Server.
Created tables and stored procedures to fulfill the requirements and accommodate the business rules in
the Oracle 8i database.
Delivered Zero defects in UAT.
Environment: Java, JSP, XML, HTML, Servlets, SQL, PL/SQL, JDK, JDBC, WebLogic 6.1, WebSphere, EJB,
JNDI, Eclipse, Ant.
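The DAO pattern mentioned above, paired with a coarse-grained entry point for the JSP layer, can be sketched as follows. Entity and method names are hypothetical; the real GBM modules persisted to Oracle 8i rather than an in-memory map.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a DAO behind a facade-style entry point. The DAO interface
// hides the persistence mechanism; the JSP layer only sees the facade.
public class BondModuleFacade {

    // DAO: a narrow interface over persistence.
    interface BondDao {
        void save(String bondId, double amount);
        Double find(String bondId);
    }

    // In-memory stand-in for the Oracle-backed implementation.
    static class InMemoryBondDao implements BondDao {
        private final Map<String, Double> store = new HashMap<>();
        public void save(String bondId, double amount) { store.put(bondId, amount); }
        public Double find(String bondId) { return store.get(bondId); }
    }

    private final BondDao dao = new InMemoryBondDao();

    // Facade method: one coarse-grained call for the UI layer.
    public double registerAndReport(String bondId, double amount) {
        dao.save(bondId, amount);
        return dao.find(bondId);
    }

    public static void main(String[] args) {
        BondModuleFacade facade = new BondModuleFacade();
        System.out.println(facade.registerAndReport("IN-001", 5000.0)); // prints 5000.0
    }
}
```

Swapping InMemoryBondDao for a JDBC-backed implementation changes nothing above the DAO interface, which is the point of the pattern.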
Java Developer
Satyam Computers - May 2007 to February 2008
Catalyst Data Warehouse Application is a web application that provides a web interface through which the
user can request activation or deactivation of an originating open digit (telephone number). The
application also provides functionality such as search, advanced search, and move and change of open digits.
Responsibilities:
Capturing project requirements and analyzing the requirements.
Involved in analysis, design and developing front end/UI using JSP, HTML, DHTML and JavaScript.
Built the whole application, which was based entirely on MVC architecture using internal custom
frameworks.
Developed Adjustment screens using JAVA and Servlets.
Prepared workflow diagrams using MS Visio and modeled the methods based on OOP methodology.
Developed the Host modules using C++, DB2 and SQL.
Responsible for creating the front-end code and java code to suit the business requirement
Written Ant scripts for build, unit testing, deployment, check styles etc.
Created tables and stored procedures to fulfill the requirements and accommodate the business rules in
the Oracle 8i database.
Environment: Java, HTML, WebLogic 6.1, JSP, Servlets, SQL, DB2, PL/SQL, JDK, JDBC, EJB, JNDI,
Eclipse, Ant.
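The MVC split used in the Catalyst application can be sketched in plain Java. The open-digit domain terms are simplified for illustration; the real view layer was JSP-based.

```java
// Minimal MVC sketch: a model holding state, a view rendering it,
// and a controller mediating user requests between the two.
public class OpenDigitMvc {

    // Model: the state of one open digit (telephone number prefix).
    static class OpenDigitModel {
        private boolean active;
        void setActive(boolean active) { this.active = active; }
        boolean isActive() { return active; }
    }

    // View: renders model state as display text (a JSP in the real app).
    static class OpenDigitView {
        String render(OpenDigitModel model) {
            return model.isActive() ? "ACTIVE" : "INACTIVE";
        }
    }

    // Controller: applies a user request to the model, then asks the view.
    static class OpenDigitController {
        private final OpenDigitModel model = new OpenDigitModel();
        private final OpenDigitView view = new OpenDigitView();

        public String handle(String action) {
            model.setActive("activate".equals(action));
            return view.render(model);
        }
    }

    public static void main(String[] args) {
        OpenDigitController controller = new OpenDigitController();
        System.out.println(controller.handle("activate"));   // prints ACTIVE
        System.out.println(controller.handle("deactivate")); // prints INACTIVE
    }
}
```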
ADDITIONAL INFORMATION
TECHNICAL SKILLS
Hadoop/Big Data: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie, ZooKeeper
NoSQL Databases: HBase, Cassandra, MongoDB
Languages: C, C++, Java, J2EE, PL/SQL, Pig Latin, HiveQL, UNIX shell scripts
Java/J2EE Technologies: Applets, Swing, JDBC, JNDI, JSON, JSTL, RMI, JMS, JavaScript, JSP, Servlets,
EJB, JSF, jQuery
Frameworks: MVC, Struts, Spring, Hibernate
Operating Systems: Sun Solaris, HP-UX, RedHat Linux, Ubuntu Linux, Windows XP/Vista/7/8
Web Technologies: HTML, DHTML, XML, AJAX, WSDL, SOAP
Web/Application Servers: Apache Tomcat, WebLogic, JBoss
Databases: Oracle 9i/10g/11g, DB2, SQL Server, MySQL, Teradata
Tools and IDEs: Eclipse, NetBeans, Toad, Maven, ANT, Hudson, Sonar, JDeveloper, Assent PMD, DB Visualizer
Version Control: SVN, CVS
Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP