
G Ravi Kumar Total IT Exp: 10+ years

Email: badprince.32@gmail.com Mobile: +91-9948591391

Experience Summary:

 Has over 10 years of experience in IT. Holds a Bachelor's degree in Computer Science and Information Technology.
 Experience in developing applications using Java, Scala, Python and Unix Shell
Scripting.
 Experience in Big Data Ecosystem: Hadoop, HBase, Hive, Kafka, Apache Spark,
Apache Nifi, Fluentd, DataTorrent (Apex).
 Familiar with Apache Flink and Apache Beam.
 Experience in Enterprise Search tools: Apache Solr and Elasticsearch.
 Experience in Data Reporting and Visualization tools: Tableau, MicroStrategy, Kibana, and Grafana.
 Experience in relational databases (RDBMS): Oracle, SQL Server, DB2.
 Experience in NoSQL databases: MongoDB, CouchDB, HBase, MemSQL, Neo4j, Titan.
 Experience in the AWS Data Analytics platform: Kinesis, S3, EMR, Redshift, EC2, DynamoDB, and Lambda.
 Experience in Data Governance tools: Waterline Data and Apache Atlas
 Familiar with the Google Cloud and Azure data analytics platforms.
 Familiar with Machine Learning, IoT, Chatbot and Data Marketplace platforms.
 Hands-on experience as an individual contributor in requirements understanding, solution development, preparation of work specifications, effort estimation, analysis, design, build, testing and documentation.
 Hands-on experience as a Data Architect in data modelling, design patterns, batch and real-time processing, cataloguing, profiling, lineage, wrangling/blending, model building, visualization, consumption, enterprise search and graph processing.
 Key participation as a Solution Architect in the pre-sales process, including responses to RFPs/RFIs, technology evaluation, architectural design and estimation (staffing).
 Major contributions as a Technology Consultant in the niche areas of Cloud, Big Data and AI, defining end-to-end solutions and driving full-stack, architecture-based engagements.
 Highly effective team player, consistently helping realize overall project goals and bringing out the best in other team members.
 Demonstrated ability to quickly master new technologies, with comprehensive problem-solving abilities.
 Good interpersonal, communication and presentation skills.
 Worked on significant projects and gained well-rounded knowledge of different software methodologies.
 Very good knowledge of client business processes, with domain expertise in Banking & Financial Services, Media & Entertainment, Telecom and Healthcare.
 Agile (Scrum, Kanban and Lean) and DevOps practitioner.

 +91-9948591391
 badprince. 32@gmail.com Page 1 of 8
Experience Details:
 Currently working at Infosys, Hyderabad.

Name of the Company | Designation | Address of the Employer | From | To | Duration (Years)
Infosys | Technology Lead | Infosys-SEZ, Pocharam | April 2015 | Till date | Currently working
Cognizant Technology Solutions India Pvt. Ltd. | Associate - Projects | Vanenburg IT Park, #17, Software Units Layout, Madhapur | Jan 2009 | April 2015 | 6 Years 2 Months

Skill Set:

Programming Languages | Core Java, Scala, Python and UNIX Shell Scripting
Big Data Ecosystem | Hadoop, HBase, Hive, Pig, Sqoop, Kafka, DataTorrent (Apache Apex), Fluentd
Databases | Oracle, MySQL, DB2
Data Warehouses | Vertica, Redshift
NoSQL Databases | HBase, MongoDB, CouchDB, MemSQL, Neo4j, Titan
Data Reporting & Visualization | Tableau, MicroStrategy, Kibana, Grafana
Enterprise Search | Apache Solr, Elasticsearch
Cloud Platforms | AWS, Azure, Google Cloud, OpenStack

Education:

Title of the Degree with Branch | College/University | Year of Passing | Percentage/GPA
Bachelor of Technology – Computer Science and Information Technology | Jawaharlal Nehru Technological University | 2008 | 82.1%
XII | Board of Intermediate Education, Andhra Pradesh | 2004 | 92.8%
X | Board of Secondary Education, Andhra Pradesh | 2002 | 86.1%

 +91-9948591391
 badprince. 32@gmail.com Page 2 of 8
Awards & Achievements:
 Sun certified Java Programmer.
 IBM certified - AIX 6.1 Basic Operations.
 Cognizant certified Professional in Oracle 10g.
 Infosys certified Hadoop and Spark Developer.
 Infosys certified Data architect.
 Recipient of multiple awards for outstanding contribution from Cognizant.
 Recipient of the Most Valuable Performer award from Infosys in 2016, 2017 and 2018.

Project Profiles:

#1 Supplier Management System – Spark Migration Feb 2018 – Till now


Client: World’s Largest Telecommunications Company
Technologies: Cloudera Hadoop, Hive, Sqoop, Java 1.8, Unix, Apache Spark, Oracle SQL,
DataTorrent, GIT, Jenkins
Team size: 20
Role: Technology Architect
Description:
Supplier Management System (SMS) is a complex financial system used to make monthly payments of more than one billion USD to the client's entertainment content providers. The existing system was built on DataTorrent (Apache Apex), which was not scaling to the client's current workload. The client's vision was to move to Spark to enable faster decisions and actions and to cater to an increasing customer base.

Responsibilities:
 Key role as Architect in designing the application architecture, migration planning and
technology evaluation.
 Preparation of Self-help guides and Ready-reckoners on Spark usage for the team.
 Participation in Design discussions with Client and preparation of design documents.
 Setting up the basic Skeleton of Spark Java project.
 Development of critical modules using the Spark Dataset API in Java (an illustrative sketch follows this list).
 Integration of multiple layers from Source to Destination and configuration.
 Evaluation of various tools for Logging, Monitoring and Alerting.
 Setup Logging, Monitoring and Alerting framework for the application at all layers.
 Perform benchmarking (Spark Core vs Dataset vs DataFrame) on multiple environments.
 Setting up CI/CD pipeline using Git and Jenkins.
 Manage Git repositories and perform code reviews.
 Performance tuning the Spark application.
 Key role in automation of data validation and testing utilities post-migration.
 Team management.
 Guiding the team in development, Testing and Code reviews.
 Adhering to the agile methodology.
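For illustration, the kind of Spark Dataset module referenced in the list above could look like the minimal sketch below. This is not project code: the table names (sms.supplier_payments, sms.monthly_provider_totals) and column names are hypothetical assumptions used only to show the shape of a Spark Dataset job in Java.

Illustrative sketch (Java, Spark Dataset API):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class PaymentAggregationJob {
    public static void main(String[] args) {
        // Hive-enabled session so the job can read tables prepared by the ingestion layer.
        SparkSession spark = SparkSession.builder()
                .appName("sms-payment-aggregation")
                .enableHiveSupport()
                .getOrCreate();

        // Load pre-processed supplier transactions (hypothetical table name).
        Dataset<Row> payments = spark.table("sms.supplier_payments");

        // Aggregate the approved payable amount per provider and billing month.
        Dataset<Row> monthlyTotals = payments
                .filter(col("status").equalTo("APPROVED"))
                .groupBy(col("provider_id"), col("billing_month"))
                .agg(sum(col("amount")).alias("total_amount"));

        // Persist the result for downstream consumption (hypothetical target table).
        monthlyTotals.write().mode(SaveMode.Overwrite)
                .saveAsTable("sms.monthly_provider_totals");

        spark.stop();
    }
}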

 +91-9948591391
 badprince. 32@gmail.com Page 3 of 8
#2 Supplier Management System – DataTorrent March 2017 – Feb 2018

Client: World’s Largest Telecommunications Company


Technologies: Cloudera Hadoop, Hive, Sqoop, Java 1.6, Unix, Oracle SQL, DataTorrent, SVN
Team size: 15
Role: Technology Lead
Description:
SMS Pre-processing is a Big Data solution based on the Cloudera Hadoop distribution as the back end and DataTorrent (Apache Apex) as the processing layer. It acts as an enterprise data warehouse that consolidates data from a large number of external sources and shapes it into the format needed for processing, thereby isolating the processing layer from the data layer. The processing layer is handled by DataTorrent, whose in-memory processing capabilities make processing very fast and efficient.

Responsibilities:
 Key role as Architect in designing the application architecture, Data model for Hive and Oracle
and technology evaluation.
 Scaling up on the DataTorrent tool, preparing cookbooks and self-help guides on DataTorrent usage, and training the team.
 Participation in Requirements, Analysis and Design phases with Client partners.
 Development of core modules using the DataTorrent API in Java.
 Integration of multiple layers from Source to Destination and configuration.
 Setup Logging, Monitoring and Alerting framework for the application at all layers.
 Perform benchmarking and Performance tuning on the DataTorrent tool.
 Team management.
 Roll out releases of the application/builds as per the Sprint schedule.
 Guiding the team in development, Testing and Code reviews.
 Adhering to the agile methodology.

#3 Customer Experience May 2016 – March 2017

Client: A large Cables & Communications Company in US


Technologies : ElasticSearch, Fluentd, HBase, Kafka, Apache Nifi, Vertica, Openstack, Grafana, Kibana,
Wily, OP5, Ansible
Team size: 15
Role: Technical Lead
Description: This Cables & Communications company provides video, high-speed Internet and phone services to residential customers. The Customer Experience application utilizes a large Big Data platform operating in multiple data centers on the client's private cloud. Customer Experience deals with collecting, visualizing, and acting on customer data to support customers' journeys and their experience with the client's services. All the applications, tools, and the big data and cloud platforms that are part of Customer Experience are supported and monitored in real time in the DevOps model.

Responsibilities:
 Key role in setting up the team and training the team for required technologies.
 Setup Monitoring and Alerting tools for all the applications and Infrastructure under the scope
of the project.
 +91-9948591391
 badprince. 32@gmail.com Page 4 of 8
 Team management
 Develop automation cookbooks for administration and Configuration management at both
application and infrastructure level.
 Installation of systems on cloud infrastructure
 Roll out new releases of the application/builds
 Automation / orchestration on cloud environment.
 Root Cause Analysis of platform issues.
 Guiding the team in developing and configuring automated scripts leveraging Python and
Shell scripting for repetitive tasks.
 Interaction with client teams in the Devops-Agile model.
 Weekly and Monthly deliverables like Operational reports and KPI trackers.

#4 Global Meta-data project April 2015 – May 2016

Client : Leading Cables and Communications, U.S.A


Technologies : Hadoop, MapReduce, Java, HBase, Apache Solr, Sqoop, Unix, JBoss, SQL, Spark.
Team Size: 4
Role: Team member
Description : The Global Metadata (TV or movie data) system was developed to serve as a central metadata ingest, consolidation, and storage system capable of accommodating multiple metadata sources with varying metadata schemas. The consolidation requires metadata to be compared (matched) and assigned an ID that represents the same program coming from different vendors. The matching and ID assignment should happen in near real time at the time of ingestion.

Responsibilities:
 Key role in setting up the Cloudera (CDH5) cluster in Dev environment.
 Key role in setting up the Apache Solr cloud in Dev environment.
 Writing Sqoop and SQL scripts for data ingestion and indexing into Solr.
 Developing a Java API for metadata matching and ID assignment using the Solr and HBase APIs (an illustrative sketch follows this list).
 Involved in build and testing of MapReduce/Spark programs
 Coordination with various vendor data teams.
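For illustration, the metadata matching and ID assignment referenced in the list above could follow the shape of the minimal SolrJ sketch below. This is not project code: the Solr core URL, field names (title, release_year, global_id) and the single-query matching rule are hypothetical assumptions; the real matching logic and the HBase write path are not shown.

Illustrative sketch (Java, SolrJ):

import java.io.IOException;
import java.util.UUID;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocumentList;

public class ProgramMatcher {
    // Hypothetical Solr core holding already-ingested program metadata.
    private final SolrClient solr =
            new HttpSolrClient.Builder("http://localhost:8983/solr/programs").build();

    /** Returns an existing global ID when a likely match is found, otherwise a new one. */
    public String resolveGlobalId(String title, String releaseYear)
            throws SolrServerException, IOException {
        SolrQuery query = new SolrQuery();
        // Naive match on title and release year; a real matcher would use fuzzier rules.
        query.setQuery("title:\"" + title + "\" AND release_year:" + releaseYear);
        query.setRows(1);

        QueryResponse response = solr.query(query);
        SolrDocumentList results = response.getResults();
        if (!results.isEmpty()) {
            return (String) results.get(0).getFieldValue("global_id");
        }
        return UUID.randomUUID().toString();  // no match found: assign a fresh global ID
    }
}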

#5 Compliance & Risk Mitigation Jul 2009 – April 2015

Client : Leading Banking and Financial Services Company, U.S.A


Technologies : Java/J2EE, Web Services, UNIX – Shell Scripting, SQL, Informatica
Description : The client is one of the leading financial institutions and adheres to compliance and security norms that the legal and regulatory department lays down from time to time. The applications execute in a distributed environment built on various technologies, interfacing with various third-party tools.
Part of the "Compliance and Risk Mitigation" portfolio, which essentially focuses on development, major enhancements and maintenance. The major applications in the Compliance and Risk Mitigation portfolio include solutions for Anti-Money Laundering, Fraud Detection, Personal Trading, Wire Transfer Monitoring, Enterprise Risk Case Management, etc.

Responsibilities:
 Tech lead and SME for the AML, Wire Transfer Monitoring, Personal Trading and Risk Case Management applications at offshore.
 +91-9948591391
 badprince. 32@gmail.com Page 5 of 8
 Involved in development of Personal Trading system, AML monitoring application and Enterprise
Risk case Management solutions applications.
 Involved in all the major releases since joining the project.
 Involved in designing and developing AML application upgrade which was critical to the
Compliance Business.
 Involved in major Infrastructure upgrade project at Organizational level.
 Completed Cognizant Hadoop training.
 Major responsibilities in this project involved Effort estimation, requirement analysis, design,
development and testing.
 Mentored new team members on the compliance applications and the process that is being
followed in Cognizant.
 Played role of Auditee for internal & external audits of the project.
 Co-ordination with Business users and other vendor teams.

#6 a. Anti-Money Laundering Analytics Jan 2014 – April 2015

Team Size : 5
Technologies : Hadoop, MapReduce, HDFS, Hive, Java, SQL, Pig, Sqoop, Oozie
Project Description: AML Analytics is a solution to generate Anti-Money Laundering (AML) reports for the Compliance department. It provides insightful daily analyses of the huge volume of transaction data received from multiple sources across the organization. In addition to Fraud Detection and Personal Trading, the solution provided the reports required for auditing and regulatory purposes.

Responsibilities: As an Offshore Tech Lead, I was responsible for:


 Key role in requirements gathering from client Business.
 Interacting with AML Analysts on daily basis.
 Importing and exporting data into HDFS and Hive using Sqoop.
 Supported design analysis, strategy development and project planning.
 Involved in build and testing of MapReduce programs.
 Involved in writing Pig and Hive queries.
 Coordination with various data source teams.

#6 b. Enterprise Risk Case Management May 2012 – Aug 2013

Team Size : 12
Technologies : Java/J2EE, Web Services, UNIX, Oracle, Apache Tomcat, Actimize RCM, Informatica, JavaScript
Project Description: The Enterprise Risk Case Management (ERCM) system is a web application to track financial cases involving suspected money fraud and money laundering and to alert on artifacts for possible anomalies. The system also produces SARs (Suspicious Activity Reports) for the government.

Responsibilities: As an Offshore Tech Lead, I was responsible for:


 Played a key role in building the team.
 Key role in requirements gathering from client Business.
 Managing Team in assigning the tasks to the team members.
 +91-9948591391
 badprince. 32@gmail.com Page 6 of 8
 Interacting with Program Managers and Business Development Managers on a regular basis
 Involved in initial project set-up and Server configurations
 Involved in Analysis, Design and Coding.
 Involved in design reviews and key technical issues, ensuring the deliverables are as per timelines.
 Involved in Code reviews and testing co-ordination.
 Played role of Auditee for in internal and external audits of the project.
 Coordination with various cross-commit teams to make sure the deliverables are delivered on time.
 Maintenance of the application in the Warranty period.

#6 c. Anti-Money Laundering monitoring Application Jan 2011 – Feb 2012

Team Size : 10
Technologies : Java/J2EE, UNIX, DB2, Apache Tomcat, IHS server, Actimize – Monitor product
Project Description: The AML application provides a monitoring solution for the Compliance department to detect possible fraud or money laundering, as required by the USA PATRIOT Act and the Bank Secrecy Act, using a software package by Actimize. The application monitors client/account/advisor/transaction data and identifies suspicious activity based upon known fraud/money-laundering scenarios and unusual behavior.

Responsibilities: As an Offshore Tech Lead, I was responsible for:


 Played a key role in requirements gathering from client Business.
 Coordinating with onshore team and managing work allocation and delivery activities.
 Interacting with Program Managers and Business Development Managers on a regular basis
 Involved in initial project set-up and Server configurations
 Involved in Analysis, Design and Coding.
 Handled critical technical issues during SIT and UAT
 Coordination with various cross-commit teams to make sure the deliverables are delivered on time.
 Maintenance of the application in the Warranty period.

#6 d. Personal Trading System Oct 2009 – Nov 2010

Team Size : 8
Technologies : Java/J2EE, UNIX & Windows Batch scripting, Java Script, SQL server, Apache
Tomcat
Project Description: Personal Trading is a web-based employee trading interface used in conjunction with the Examiner vendor product. Through the Personal Trading system, employees place personal trade requests requiring pre-clearance by the Compliance department and submit certification statements.

Responsibilities: As an Offshore Team member, I was responsible for:


 Played a key role in requirements gathering from client Business.
 Involved in Server configurations and setup
 Involved in Analysis, Design and Coding.
 +91-9948591391
 badprince. 32@gmail.com Page 7 of 8
 Coordinating with Onshore team and managing delivery activities.
 Handled critical technical issues with ease.
 Coordination with various cross-commit teams to make sure the deliverables are delivered on time.
 Maintenance of the application in the Warranty period.

#7 Communication Integration May 2009 – June 2009

Client : Leading Risk Management Company, US


Team Size : 15
Technologies : Java, J2EE
Description : As part of a communication system migration, a mail component was developed to trigger mail notifications to the required users in the corresponding environments; the JavaMail API was used to develop it (an illustrative sketch follows this description). In addition to the mail component, a few other enhancements were also made as part of the project.
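For illustration, a mail component of the kind described above could be sketched with the JavaMail API as below. This is not project code: the SMTP host, sender address and class name are hypothetical assumptions.

Illustrative sketch (Java, JavaMail API):

import java.util.Properties;
import javax.mail.Message;
import javax.mail.MessagingException;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

public class NotificationMailer {
    /** Sends a plain-text notification mail through the environment's SMTP relay. */
    public static void send(String to, String subject, String body) throws MessagingException {
        Properties props = new Properties();
        props.put("mail.smtp.host", "smtp.example.com");   // hypothetical SMTP host

        Session session = Session.getInstance(props);
        MimeMessage message = new MimeMessage(session);
        message.setFrom(new InternetAddress("noreply@example.com"));   // hypothetical sender
        message.addRecipient(Message.RecipientType.TO, new InternetAddress(to));
        message.setSubject(subject);
        message.setText(body);

        Transport.send(message);
    }
}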

Responsibilities:
 Involved in design, coding and testing of the mail component.
 Involved in bug fixing during UAT and Production support.

PERSONAL PROFILE

Father’s Name : G Naga Bhushanam


Date of Birth : 13th May 1987
Address : H.no 8-46, Allwyn Colony
Miyapur,
Hyderabad.
Email : badprince.32@gmail.com

 +91-9948591391
 badprince. 32@gmail.com Page 8 of 8
