
Aditya Dhavala

Hadoop Developer - Sears Holding


Email me on Indeed: indeed.com/r/Aditya-Dhavala/4a1ec64204eb4bd3
Around 7 years of IT experience as a developer, designer and quality reviewer, with cross-platform integration
experience using Hadoop, Java, J2EE and SOA.
Hands-on experience in installing, configuring and using Apache Hadoop ecosystem components such as
MapReduce, Hive, Pig, Sqoop, Flume and Oozie.
Hands-on experience with Hortonworks and Cloudera Hadoop environments.
Strong understanding of Hadoop daemons and MapReduce concepts.
Experienced in importing and exporting data to and from HDFS.
Experienced in analyzing big data in a Hadoop environment.
Experienced in handling Hadoop ecosystem projects such as Hive, Pig and Sqoop.
Experienced in developing UDFs for Hive using Java.
Strong understanding of NoSQL databases like HBase, MongoDB & Cassandra.
Familiar with handling complex data processing jobs using Cascading.
Hands-on experience with Hadoop, HDFS, MapReduce and the Hadoop ecosystem (Pig, Hive, Oozie, Flume
and HBase).
Extensive experience in the design, development and support of Model-View-Controller applications using the
Struts and Spring frameworks.
Developed reusable solutions to maintain proper coding standards across different Java projects.
Proficient with application servers such as WebSphere, WebLogic, JBoss and Tomcat.
Developed core modules in large cross-platform applications using Java, J2EE, Spring, Struts, Hibernate,
JAX-WS Web Services, and JMS.
Expertise in debugging and performance tuning of Oracle and Java, with strong knowledge of Oracle
11g and SQL.
Ability to work effectively in cross-functional team environments and experience of providing training to
business users.

WORK EXPERIENCE

Hadoop Developer
Sears Holding - Hoffman Estates, IL - February 2013 to Present
Sears Holdings Corporation is a leading integrated retailer with almost 2,500 full-line and specialty retail stores
in the United States and Canada. Sears Holdings is the leading home appliance retailer as well as a leader in
tools, lawn and garden, fitness equipment and automotive repair and maintenance. I was part of the Big Data
Processing Team, which took advantage of available user data to make better decisions that significantly
enhanced organizational success. I was involved in setting up the Cloudera Hadoop cluster and wrote MapReduce
jobs, Hive queries and Pig Latin scripts to explore customer sales data and find significant information
for trend analysis.
Responsibilities:
Involved in the end-to-end process of Hadoop cluster installation, configuration and monitoring
Responsible for building scalable distributed data solutions using Hadoop
Installed and configured Hive, Pig, Sqoop, Flume and Oozie on the Hadoop cluster
Setup and benchmarked Hadoop/HBase clusters for internal use
Developed simple to complex MapReduce jobs using Hive and Pig

Optimized MapReduce jobs to use HDFS efficiently by using various compression mechanisms
Handled importing of data from various data sources, performed transformations using Hive and MapReduce,
loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop
Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
Used UDFs to implement business logic in Hadoop
Implemented business logic by writing UDFs in Java and used various UDFs from Piggybank and other
sources (see the UDF sketch below)
Continuously monitored and managed the Hadoop cluster using Cloudera Manager
Worked with application teams to install operating system and Hadoop updates, patches and version upgrades
as required
Installed the Oozie workflow engine to run multiple Hive and Pig jobs
Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports
for the BI team
Environment: Hadoop, MapReduce, HDFS, Hive, Pig, SQL, Cloudera Manager, Sqoop, Flume, Oozie,
Java (JDK 1.6), Eclipse
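
The kind of Hive UDF described above can be sketched as follows. This is an illustrative example only; the business rule (normalizing a store code) and the class name are hypothetical rather than taken from the Sears project.

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Illustrative sketch of a Hive UDF; the store-code rule is hypothetical.
public final class NormalizeStoreCode extends UDF {
    // Hive resolves and calls evaluate() once per row; returning null yields SQL NULL.
    public Text evaluate(Text storeCode) {
        if (storeCode == null) {
            return null;
        }
        // Trim whitespace and upper-case so values like " il042 " and "IL042" compare equal.
        return new Text(storeCode.toString().trim().toUpperCase());
    }
}

In Hive, such a class would be packaged into a jar, added to the session with ADD JAR, and registered with CREATE TEMPORARY FUNCTION before being called from a query.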

Hadoop Developer
American Express - Phoenix, AZ - February 2012 to January 2013
American Express provides innovative payment, travel and expense management solutions for individuals
and businesses of all sizes. It helps customers realize their dreams and aspirations through industry-leading
benefits, access to unique experiences, business-building insights, and global customer care. The purpose of
the project is to create an Enterprise Data Hub so that various business units can use the data from Hadoop
for data analytics. The solution is based on Cloudera Hadoop. The data is stored in the Hadoop file system
and processed using MapReduce jobs.
Responsibilities:
Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for
data cleansing and preprocessing (see the MapReduce sketch below).
Involved in loading data from UNIX file system to HDFS.
Installed and configured Hive and also wrote Hive UDFs.
Evaluated business requirements and prepared detailed specifications that follow project guidelines required
to develop written programs.
Devised procedures that solve complex business problems with due considerations for hardware/software
capacity and limitations, operating times and desired results.
Analyzed large data sets to determine the optimal way to aggregate and report on them.
Provided quick responses to ad hoc internal and external client requests for data and created
ad hoc reports.
Responsible for building scalable distributed data solutions using Hadoop.
Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and
troubleshooting, managing and reviewing data backups, and managing and reviewing Hadoop log files.
Worked hands-on with the ETL process.
Handled importing of data from various data sources, performed transformations using Hive, MapReduce,
and loaded data into HDFS.
Extracted the data from Teradata into HDFS using Sqoop.
Analyzed the data by performing Hive queries and running Pig scripts to identify user behavior segments such
as shopping enthusiasts, travelers and music lovers.
Exported the patterns analyzed back into Teradata using Sqoop.
Continuously monitored and managed the Hadoop cluster through Cloudera Manager.

Installed the Oozie workflow engine to run multiple Hive jobs.


Developed Hive queries to process the data and generate data cubes for visualization.
Environment: Hadoop, MapReduce, HDFS, Hive, Oozie, Java (JDK 1.6), Cloudera, NoSQL, Oracle 11g/10g,
PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX Shell Scripting.
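
A data-cleansing MapReduce job of the sort described above could look like the following minimal sketch (JDK 1.6-era API). The pipe-delimited record layout and the 12-field validation rule are assumptions for illustration, not details of the American Express project.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative map-only cleansing job: malformed records are dropped, valid ones pass through.
public class CleanseRecordsJob {

    public static class CleanseMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\\|", -1);
            // Hypothetical rule: keep only records with exactly 12 fields and a non-empty key field.
            if (fields.length == 12 && fields[0].length() > 0) {
                context.write(NullWritable.get(), value);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "cleanse-records");
        job.setJarByClass(CleanseRecordsJob.class);
        job.setMapperClass(CleanseMapper.class);
        job.setNumReduceTasks(0); // map-only: cleansed records are written straight back to HDFS
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

A real cleansing job would encode the project's actual record layout and validation rules; this only shows the shape of a map-only job whose output lands directly in HDFS.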

Java/J2EE Application Developer


Columbia Bank - Lakewood, WA - August 2010 to January 2012
The Columbia Bank Card Verification system is an assessment system for credit card applications. It processes
a customer's credit card application until the application is either accepted or rejected.
Responsibilities:
Worked with Java, J2EE, Struts, web services and Hibernate in a fast-paced development environment.
Followed agile methodology, interacted directly with the client to provide and take feedback on features,
suggest and implement optimal solutions, and tailor the application to customer needs.
Applied database design experience and hands-on experience with large database systems: Oracle 8i and
Oracle 9i.
Involved in design and implementation of web tier using Servlets and JSP.
Used Apache POI for reading Excel files (see the sketch below).
Wrote build scripts with Ant for deploying WAR and EAR applications.
Developed the user interface using JSP and JavaScript to view all online trading transactions.
Designed and developed Data Access Objects (DAO) to access the database.
Used the DAO, Factory and Value Object design patterns to organize and integrate the Java objects.
Coded Java Server Pages for dynamic front-end content that use Servlets and EJBs.
Coded HTML pages using CSS for static content generation with JavaScript for validations.
Used JDBC API to connect to the database and carry out database operations.
Used JSP and JSTL Tag Libraries for developing User Interface components.
Performed code reviews.
Performed unit testing, system testing and integration testing.
Involved in building and deployment of application in Linux environment.
Deployed the application to development and production servers.
Environment: Java, J2EE, JDBC, Struts, SQL, Hibernate, Eclipse, Apache POI, CSS.
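
Reading Excel files with Apache POI, as mentioned above, follows the general shape of the sketch below. The file name, sheet layout and column are hypothetical, and a real module would check cell types and map each row to an application object.

import java.io.FileInputStream;
import java.io.InputStream;

import org.apache.poi.hssf.usermodel.HSSFWorkbook;
import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;

// Illustrative sketch: iterate the first sheet of a .xls workbook and read its first column.
public class ExcelReaderSketch {
    public static void main(String[] args) throws Exception {
        InputStream in = new FileInputStream("applications.xls"); // hypothetical file name
        try {
            Workbook workbook = new HSSFWorkbook(in);             // HSSF handles the .xls format
            Sheet sheet = workbook.getSheetAt(0);
            for (Row row : sheet) {
                Cell cell = row.getCell(0);
                // Real code would check the cell type before reading it as a string.
                if (cell != null) {
                    System.out.println(cell.getStringCellValue());
                }
            }
        } finally {
            in.close();
        }
    }
}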

Java/J2EE Developer
Capital One Bank - McLean, VA - October 2009 to June 2010
Capital One Auto Finance (COAF) is a project to design an application used by Capital One Bank to handle
different types of auto loans depending upon the customer's eligibility. COAF also includes maintenance
of the existing application.
Responsibilities:
Played the role of Java developer in the project called "Coverage Selection Tool".
Technologies involved are EJB 3.0, Web services, Dojo (UI Framework) and other J2EE server components.
Analyzed and prepared technical specifications with UML diagrams (use case, class, and sequence diagrams).
Used Rational Rose to develop the components required by the client.
Wrote complex logic for forecasting the price of products and subparts in future quarters.
Developed business components applying OOAD and using design patterns such as DAO, Value
Object, DTO, Factory and Singleton.
Implemented a DOM parsing module and created XSD and XSLT components (see the sketch below).

Used stored procedures and Triggers extensively to develop the Backend business logic in Oracle database.
Involved in performance improving and bug fixing.
Analyzed old database table fields and mapped them to new schema tables using complex SQL queries and
PL/SQL procedures.
Developed Apache Ant scripts for deploying the application.
Coordinated with functional users and testing teams in testing the application in the test environment.
Provided production support after the application was deployed to the production server.
Involved in database migration testing activities.
Environment: Java, JSP, Servlets, XML, JDBC, JavaScript, PL/SQL, Ant, CSS, HTML, Eclipse IDE.
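
The DOM parsing module mentioned above can be sketched with the standard JAXP API as follows. The input file name, element name and attributes are hypothetical placeholders, not the project's actual XML schema.

import java.io.File;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Illustrative sketch: parse an XML document into a DOM tree and walk selected elements.
public class CoverageXmlParser {
    public static void main(String[] args) throws Exception {
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(new File("coverage.xml")); // hypothetical input file

        // Read every <coverage> element and its "name" and "price" attributes.
        NodeList nodes = doc.getElementsByTagName("coverage");
        for (int i = 0; i < nodes.getLength(); i++) {
            Element coverage = (Element) nodes.item(i);
            String name = coverage.getAttribute("name");
            double price = Double.parseDouble(coverage.getAttribute("price"));
            System.out.println(name + " -> " + price);
        }
    }
}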

Java Developer
Telesis Global Solutions - March 2008 to June 2009
Government Business Module [GBM] is a complete web-enabled product developed by Telesis, catering to
all nationalized and private sector banks in India. It supports various government transactions like Taxes,
Bonds, Public Provident Fund and Pension. It enables Fund Transfer, Consolidation, Commission Calculation,
Pension Processing, Reports and Scrolls in a format regulated by the Government of India. The functionality
of each module is customized to suit different target audiences in the banking sector.
Responsibilities:
Extensively involved in the design of JSP screens for the Public Provident Fund and Bond modules.
Developed the user interface screens for the above modules.
Worked with the front-end applications using HTML and XML.
Developed the business components (in core Java) used in the JSP screens.
Implemented Delegate, Façade and DAO patterns for building the application.
Wrote Ant scripts for build, unit testing, deployment, check styles, etc.
Used JUnit for unit testing (see the sketch below).
I was part of all testing phases. Provided UAT support.
Created WAR files and deployed them on WebLogic and WebSphere Application Server.
Created tables and stored procedures to fulfill the requirements and accommodate the business rules in the
Oracle 8i database.
Delivered Zero defects in UAT.
Environment: Java, JSP, XML, HTML, Servlets, SQL, PL/SQL, JDK, JDBC, WebLogic 6.1, WebSphere, EJB,
JNDI, Eclipse, Ant.
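
A JUnit unit test of the kind mentioned above might look like the following JUnit 4 sketch. The commission calculation is a tiny stand-in defined inside the test so the example is self-contained; it is not the GBM module's real logic.

import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Illustrative JUnit 4 sketch; the commission rule is hypothetical.
public class CommissionCalculatorTest {

    // Stand-in for the real calculator so the sketch compiles on its own.
    static double commission(double transactionAmount, double ratePercent) {
        return transactionAmount * ratePercent / 100.0;
    }

    @Test
    public void commissionIsPercentageOfTransactionAmount() {
        assertEquals(12.5, commission(1000.0, 1.25), 0.0001);
    }

    @Test
    public void zeroAmountYieldsZeroCommission() {
        assertEquals(0.0, commission(0.0, 1.25), 0.0001);
    }
}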

Java Developer
Satyam Computers - May 2007 to February 2008
The Catalyst Data Warehouse Application is a web application that provides a web interface through which the
user can request activation or deactivation of an originating open digit (telephone number). The application
also provides functionality such as search, advanced search, and move and change of open digits.
Responsibilities:
Capturing project requirements and analyzing the requirements.
Involved in analysis, design and developing front end/UI using JSP, HTML, DHTML and JavaScript.
Built the whole application, which was based entirely on MVC architecture using internal custom
frameworks.
Developed adjustment screens using Java and Servlets (see the sketch below).

Prepared workflow diagrams using MS Visio and modeled the methods based on OOP methodology
Developed the Host modules using C++, DB2 and SQL.
Responsible for creating the front-end code and Java code to suit the business requirements
Wrote Ant scripts for build, unit testing, deployment, check styles, etc.
Created tables and stored procedures to fulfill the requirements and accommodate the business rules in the
Oracle 8i database.
Environment: Java, HTML, WebLogic 6.1, JSP, Servlets, SQL, DB2, PL/SQL, JDK, JDBC, EJB, JNDI,
Eclipse, Ant.
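
A controller-style servlet of the kind used for the adjustment and search screens above could be sketched as follows. The servlet name, request parameter and JSP path are hypothetical; the real application used internal custom MVC frameworks.

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Illustrative sketch: the servlet acts as the MVC controller and forwards to a JSP view.
public class OpenDigitSearchServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String openDigit = request.getParameter("openDigit"); // hypothetical parameter name

        // A real controller would call a business component here and expose its result to the view.
        request.setAttribute("searchTerm", openDigit);
        request.getRequestDispatcher("/searchResults.jsp").forward(request, response);
    }
}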

ADDITIONAL INFORMATION
TECHNICAL SKILLS
Hadoop/Big Data: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie and ZooKeeper
NoSQL Databases: HBase, Cassandra, MongoDB
Languages: C, C++, Java, J2EE, PL/SQL, Pig Latin, HiveQL, Unix shell scripts
Java/J2EE Technologies: Applets, Swing, JDBC, JNDI, JSON, JSTL, RMI, JMS, JavaScript, JSP, Servlets,
EJB, JSF, jQuery
Frameworks: MVC, Struts, Spring, Hibernate
Operating Systems: Sun Solaris, HP-UX, Red Hat Linux, Ubuntu Linux and Windows XP/Vista/7/8
Web Technologies: HTML, DHTML, XML, AJAX, WSDL, SOAP
Web/Application Servers: Apache Tomcat, WebLogic, JBoss
Databases: Oracle 9i/10g/11g, DB2, SQL Server, MySQL, Teradata
Tools and IDEs: Eclipse, NetBeans, Toad, Maven, Ant, Hudson, Sonar, JDeveloper, Assent PMD, DB Visualizer
Version Control: SVN, CVS
Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP
