Has over 10 years of experience in IT. Holds a Bachelor's degree in Computer Science and
Information Technology.
Experience in developing applications using Java, Scala, Python and Unix Shell
Scripting.
Experience in the Big Data ecosystem: Hadoop, HBase, Hive, Kafka, Apache Spark,
Apache NiFi, Fluentd, DataTorrent (Apex).
Familiar with Apache Flink and Apache Beam.
Experience in enterprise search tools: Apache Solr and Elasticsearch.
Experience in data reporting and visualization tools: Tableau, MicroStrategy, Kibana,
and Grafana.
Experience in RDBMS databases (Oracle, SQL Server, DB2).
Experience in NoSQL databases: MongoDB, CouchDB, HBase, MemSQL, Neo4j, Titan.
Experience in the AWS data analytics platform: Kinesis, S3, EMR, Redshift, EC2, DynamoDB,
and Lambda.
Experience in data governance tools: Waterline Data and Apache Atlas.
Familiar with the Google Cloud and Azure data analytics platforms.
Familiar with machine learning, IoT, chatbot, and data marketplace platforms.
Hands-on experience as an individual contributor in requirements analysis, solution
development, preparing work specifications, effort estimation, design, build, testing, and
documentation.
Hands-on experience as a Data Architect in data modelling, design patterns, batch and
real-time processing, cataloguing, profiling, lineage, wrangling/blending, model building,
visualization, consumption, enterprise search, and graph processing.
Key participation as a Solution Architect in the pre-sales process, including responses to
RFPs/RFIs, technology evaluation, architectural design, and estimation (staffing).
Major contribution as a Technology Consultant in the niche areas of Cloud, Big Data, and AI,
defining end-to-end solutions and driving full-stack, architecture-based engagements.
Highly effective team player, consistently helping realize overall project goals and
bringing out the best in other team members.
Demonstrated ability to quickly master new technologies, combined with comprehensive
problem-solving skills.
Good interpersonal, communication, and presentation skills.
Has worked on significant projects and gained in-depth knowledge of different software
methodologies.
Has very good knowledge of client business processes, with domain expertise in Banking &
Financial Services, Media & Entertainment, Telecom, and Healthcare.
Agile (Scrum, Kanban and Lean) and DevOps practitioner.
+91-9948591391
badprince.32@gmail.com Page 1 of 8
Experience Details:
Currently working at Infosys, Hyderabad.
Skill set:
Education:
Awards & Achievements:
Sun Certified Java Programmer.
IBM certified - AIX 6.1 Basic Operations.
Cognizant certified Professional in Oracle 10g.
Infosys certified Hadoop and Spark Developer.
Infosys certified Data architect.
Recipient of multiple awards for outstanding contribution from Cognizant.
Recipient of the Most Valuable Performer award from Infosys in 2016, 2017, and
2018.
Project Profiles:
Responsibilities:
Key role as Architect in designing the application architecture, migration planning and
technology evaluation.
Preparation of Self-help guides and Ready-reckoners on Spark usage for the team.
Participation in Design discussions with Client and preparation of design documents.
Setting up the basic Skeleton of Spark Java project.
Development of critical modules using Spark Dataset API in Java.
Integration of multiple layers from Source to Destination and configuration.
Evaluation of various tools for Logging, Monitoring and Alerting.
Setup Logging, Monitoring and Alerting framework for the application at all layers.
Perform benchmarking (Spark Core vs Dataset vs DataFrame) on multiple environments.
Setting up CI/CD pipeline using Git and Jenkins.
Manage Git repositories and perform code reviews.
Performance tuning the Spark application.
Key role in automation of data validation and testing utilities post-migration.
Team management.
Guiding the team in development, testing, and code reviews.
Adhering to the agile methodology.
#2 Supplier Management System – DataTorrent March 2017 – Feb 2018
Responsibilities:
Key role as Architect in designing the application architecture, Data model for Hive and Oracle
and technology evaluation.
Scaling up on the DataTorrent tool, preparing cookbooks and self-help guides on DataTorrent
usage, and training the team.
Participation in Requirements, Analysis and Design phases with Client partners.
Development of core modules using the DataTorrent API in Java.
Integration of multiple layers from Source to Destination and configuration.
Setup Logging, Monitoring and Alerting framework for the application at all layers.
Perform benchmarking and Performance tuning on the DataTorrent tool.
Team management.
Roll out releases of the application/builds as per the Sprint schedule.
Guiding the team in development, testing, and code reviews.
Adhering to the agile methodology.
Responsibilities:
Key role in setting up the team and training the team for required technologies.
Setup Monitoring and Alerting tools for all the applications and Infrastructure under the scope
of the project.
Team management.
Develop automation cookbooks for administration and configuration management at both
the application and infrastructure levels.
Installation of systems on cloud infrastructure.
Roll out new releases of the application/builds.
Automation / orchestration on cloud environment.
Root cause analysis of platform issues.
Guiding the team in developing and configuring automated scripts leveraging Python and
Shell scripting for repetitive tasks.
Interaction with client teams in the DevOps-Agile model.
Weekly and Monthly deliverables like Operational reports and KPI trackers.
Responsibilities:
Key role in setting up the Cloudera (CDH5) cluster in Dev environment.
Key role in setting up the Apache Solr cloud in Dev environment.
Writing Sqoop and SQL scripts for data ingestion and indexing into Solr.
Developing a Java API for metadata matching and ID assignment using the Solr and HBase APIs.
Involved in build and testing of MapReduce/Spark programs.
Coordination with various vendor data teams.
Responsibilities:
Tech lead and SME for the AML, Wire Transfer Monitoring, Personal Trading, and Risk
Case Management applications at offshore.
Involved in development of the Personal Trading system, AML monitoring application, and
Enterprise Risk Case Management solutions.
Involved in all the major releases since joining the project.
Involved in designing and developing an AML application upgrade that was critical to the
Compliance business.
Involved in major Infrastructure upgrade project at Organizational level.
Completed Cognizant Hadoop training.
Major responsibilities in this project involved Effort estimation, requirement analysis, design,
development and testing.
Mentored new team members on the compliance applications and the processes followed
at Cognizant.
Played the role of auditee for internal & external audits of the project.
Coordination with business users and other vendor teams.
Team Size: 5
Technologies: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Pig, Sqoop, Oozie
Project Description: AML Analytics is a solution that generates Anti-Money Laundering (AML) reports for the
Compliance department. It provides insightful daily analyses of large volumes of transaction data received from
multiple sources across the organization. In addition to fraud detection and Personal Trading, the solution
provides reports required for auditing and regulatory purposes.
Team Size: 12
Technologies: Java/J2EE, Web Services, UNIX, Oracle, Apache Tomcat, Actimize RCM,
Informatica, JavaScript
Project Description: The Enterprise Risk Case Management (ERCM) system is a web application that tracks
financial cases involving suspected money fraud and money laundering, and alerts on artifacts for possible
anomalies. The system also produces SARs (Suspicious Activity Reports) for the government.
Team Size: 10
Technologies: Java/J2EE, UNIX, DB2, Apache Tomcat, IHS server, Actimize Monitor product
Project Description: The AML application provides a monitoring solution for the Compliance department to
detect possible fraud or money laundering, as required by the USA PATRIOT Act and the Bank Secrecy Act,
using a software package by Actimize. The application monitors client/account/advisor/transaction data
and identifies suspicious activity based on known fraud/money-laundering scenarios and unusual
behavior.
Team Size: 8
Technologies: Java/J2EE, UNIX & Windows batch scripting, JavaScript, SQL Server, Apache
Tomcat
Project Description: Personal Trading is a web-based employee trading interface used in conjunction
with the Examiner vendor product. Through the Personal Trading system, employees place personal trade
requests requiring preclearance by the Compliance department, as well as submit certification statements.
Responsibilities:
Involved in design, coding and testing of the mail component.
Involved in bug fixing during UAT and Production support.
PERSONAL PROFILE