Seeking a challenging position developing and architecting BI applications on IBM DB2 MPP databases.
IBM certified in DB2 database administration, database development, and application design, with 10+ years of total experience.
(Data warehouse with high-TPS/batch/analytical processing on a Hadoop/DB2 (multi-node) data lake)
INVESTMENT MANAGEMENT BANKING Domain Knowledge (Morgan Stanley, MSIM division) (3.5 years' experience)
ü Feeding SOD and EOD position and valuation data for investors, traders, and risk into the warehouse as fact tables with historical data.
ü Various investment products: stocks (equity), bonds (FI), mutual funds, options, multi-asset, alternative investments, ETFs, futures, OTC derivatives, money markets, trade life cycle, and real estate / mortgages / loan processing.
ü Different kinds of accounts (investor, dealer, broker, SMA), securities processing, and portfolio accounting systems.
ü Different kinds of client data (issuer), shareholder, geographic location, currency, and NAV data.
CI InfoTech Pvt. Ltd. — Data Centre Engineer — Delhi (ONGC Data Centre) — May ’07 to Dec ’07
TEAM Computers Pvt. Ltd. — IT Support Engineer — Delhi (AT&T Pvt. Ltd.) — July ’05 to Dec ’06
Professional Certifications
Ø IBM DB2 CERTIFIED DATABASE ASSOCIATE
Ø IBM DB2 CERTIFIED DATABASE ADMINISTRATOR
Ø MICROSOFT WINDOWS SERVER 2003
Ø MICROSOFT WINDOWS SERVER 2003 NETWORK INFRASTRUCTURE
Ø MICROSOFT WINDOWS SERVER 2003 ACTIVE DIRECTORY INFRASTRUCTURE
MORGAN STANLEY PVT. LTD.
Position: Sr. Manager for the last 3.5 years, working as a BI data warehouse solution architect on the IBM DB2 platform.
Team Size: Sr. Team Lead for a team of 8 DB2 developers.
ü Understanding business requirements and preparing conceptual high-level diagrams to depict component integrations and information flow.
ü Developing POCs on new technologies and features, helping management adopt new solutions and technologies, and providing training to teams.
ü Supporting developers in various situations when developing complex logic and business transformations.
ü Reporting to executive directors and helping them leverage existing systems, solutions, and new technologies.
ü Good understanding of the different systems and businesses in the ecosystem, acting as a bridge to build integrated solutions.
ü Understanding of IBM WebSphere, WebLogic, Tomcat, web services, APIs, REST APIs, MQ, and the Hadoop YARN ecosystem.
ü Good hands-on experience with the Spark API, Hive, Impala, HDFS, and IBM DB2 distributed environments.
ü Creating/altering HLD/LLD documents depicting business transformations.
ü Discussing requirements with the business to understand their complexity and build various integration applications around the DW and data lake.
ü Breaking business requirements into execution tasks and assigning them to the right people with the required technical skills.
ü Making the best use of MDC, range partitioning, materialized datasets, in-memory computing, and temporary-table techniques to transform raw data into an easy, fast, ready-to-use form for the business, with a clear understanding of the pros and cons of each technique.
ü Developing BI queries to address complex business use cases; designing and scheduling algorithms.
ü Estimating sizing for hardware, database software, and other system resources.
ü Designing ACL-based access to database objects, masking, and permissions using Role-Based Access Control and CBAC.
ü Designing data migration strategies from RDBMS to HDFS and vice versa.
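As an aside, the range-partitioning technique listed above can be sketched in plain Python. This is a hypothetical illustration of how rows are routed to partitions and how a date predicate prunes the partitions a query must scan; it is not DB2 syntax, and the partition names and date ranges are made up.

```python
from datetime import date

# Illustrative partition layout: (name, start inclusive, end exclusive).
PARTITIONS = [
    ("p2023q1", date(2023, 1, 1), date(2023, 4, 1)),
    ("p2023q2", date(2023, 4, 1), date(2023, 7, 1)),
]

def route(row_date):
    """Return the partition whose [start, end) range holds row_date."""
    for name, start, end in PARTITIONS:
        if start <= row_date < end:
            return name
    raise ValueError(f"no partition covers {row_date}")

def pruned(query_start, query_end):
    """Partitions a [query_start, query_end) predicate actually needs to scan."""
    return [name for name, start, end in PARTITIONS
            if start < query_end and query_start < end]

print(route(date(2023, 2, 15)))                     # p2023q1
print(pruned(date(2023, 5, 1), date(2023, 6, 1)))   # ['p2023q2']
```

The pruning step is the point of the technique: a query touching only May 2023 never reads the Q1 partition's data at all.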
DB2 SME:
Ø Working as a DB2 SME with extensive experience in tuning, configuring, and monitoring DB2 databases, applications, and queries in MPP environments with multi-TB databases.
Ø Supporting DB2 developers in various situations when developing and reviewing complex logic in an optimal manner.
Ø Creating/altering HLD and LLD design documents, proposing ideas, and encouraging brainstorming and innovation within the team.
Ø Leading the team and driving collaboration among team members in their respective areas.
Ø Estimating sizing for hardware, database software, and other system resources.
Ø Handling database performance and severity issues and providing RCA.
Ø Providing app-DBA support during DB outages and application slowness on production and non-production systems.
Ø Identifying uses of MDC, range partitioning, MQTs, and temporary tables in the application and helping developers implement logic accordingly, with a clear understanding of the pros and cons of these techniques.
Ø Identifying and solving problems related to slow applications/SQL and locking issues (deadlocks, rollbacks, timeouts).
Ø Designing ACL-based access to database objects, masking, and permissions using Role-Based Access Control and CBAC.
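The deadlock troubleshooting mentioned above boils down to finding a cycle in a wait-for graph (each blocked transaction pointing at the transaction holding the lock it wants). The sketch below is a minimal, hypothetical illustration in Python of that idea, not how DB2's lock manager is actually implemented; transaction names are made up.

```python
def find_cycle(wait_for):
    """wait_for maps a blocked transaction to the one it waits on.
    Returns the list of transactions forming a cycle, or None."""
    for start in wait_for:
        path = [start]
        node = start
        while node in wait_for:
            node = wait_for[node]
            if node in path:                 # revisited a node: deadlock
                return path[path.index(node):]
            path.append(node)
    return None

# T1 waits on T2 and T2 waits on T1 -> the classic two-party deadlock.
print(find_cycle({"T1": "T2", "T2": "T1"}))  # ['T1', 'T2']
# A simple wait chain with no cycle is just lock contention, not deadlock.
print(find_cycle({"T1": "T2"}))              # None
```

Once a cycle is found, a real lock manager picks a victim in the cycle and rolls it back, which matches the rollback/timeout symptoms listed above.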
Developer / BI Warehouse Dimensional Modeling based on Ralph Kimball
Ø Dimensional modeling (Type 1, Type 2, Type 3, and Type 6 dimensions)
Ø Designing snowflake and star schema.
Ø Implemented system-temporal and bitemporal tables and history tables to support time-travel queries.
Ø Implemented replicated MQTs and designed and implemented optimal fact data processing stored procedures.
Ø Developed reconciliation logic for SOD and EOD data.
Ø Developed handling for early-arriving facts and late-arriving dimensions.
Ø Automation of RFB checks and Reconnects.
Ø Understanding requirements from business analysts and designing solutions and data models for OLTP/BI databases (1NF, 2NF, 3NF).
Ø Designing and writing complex logic in BI queries, stored procedures, functions, and triggers, e.g., to find the AUM of banking products.
Ø Using PowerDesigner for database modeling.
Ø Using Rapid SQL for ad hoc query execution.
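The Type 2 dimension handling listed above can be sketched as follows: when a tracked attribute changes, expire the current row and insert a new version, preserving history. This is a minimal illustration of the Kimball pattern with illustrative column names, not an actual warehouse schema.

```python
from datetime import date

def apply_type2(dim_rows, key, new_attrs, today):
    """Apply a Type 2 change to an in-memory dimension.
    dim_rows: list of dicts with 'key', 'attrs', 'valid_from',
    'valid_to', 'is_current'. Mutates and returns the list."""
    for row in dim_rows:
        if row["key"] == key and row["is_current"]:
            if row["attrs"] == new_attrs:
                return dim_rows            # no change: nothing to do
            row["valid_to"] = today        # expire the old version
            row["is_current"] = False
            break
    dim_rows.append({"key": key, "attrs": new_attrs,
                     "valid_from": today, "valid_to": None,
                     "is_current": True})  # insert the new version
    return dim_rows

# An account dimension row whose broker attribute changes.
dim = [{"key": "ACC1", "attrs": {"broker": "B1"},
        "valid_from": date(2020, 1, 1), "valid_to": None,
        "is_current": True}]
apply_type2(dim, "ACC1", {"broker": "B2"}, date(2021, 6, 1))
print(len(dim))  # 2: the expired old row plus the new current version
```

Facts loaded before the change keep joining to the old surrogate row, which is what makes point-in-time reporting possible.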
HADOOP development:
Ø Set up Cloudera Hadoop 2.6 on 2 nodes with Spark/Hive/Beeline.
Ø Good hold on PySpark, including creating and joining RDDs.
Ø Experience developing and tuning Hive data warehouses.
Ø Data serialization using Sqoop (with Avro and text formats) for ingestion from different relational databases: DB2, Sybase, MySQL.
Ø Basic knowledge of YARN resource tuning for Hive and Spark (container and memory usage and optimization).
Ø Good hold on the core concepts of Hadoop.
Ø Beeline/Hive scripting in shell, along with Sqoop, to move data from external tables into Hive internal tables.
Ø Contributing to Hive warehouse schema design.
Ø Creating Spark RDDs and temp tables and joining them to address complex requirements.
Ø Creating/maintaining Hive external and internal tables.
Ø Developing ETL in Python to load/reload/append data into Hadoop via Sqoop.
Ø Monitoring the resource usage of data nodes using basic Red Hat monitoring commands (vmstat, top, lscpu, ps, iostat, netstat).
Ø Introducing Hive bucketing, partitioning, node-level caching, skew joins, and map-side joins.
Ø Analyzing the DAG and controlling the execution of mapper and reducer tasks.
Ø Optimizing YARN containers and monitoring the execution of Spark and Hive jobs.
Ø Also applying this experience to optimizing IBM DB2 database/registry configuration.
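The Hive bucketing mentioned above can be illustrated with a small sketch: rows are hashed on the bucketing column into a fixed number of buckets, so two tables bucketed the same way on the same column can be joined bucket-by-bucket without a full shuffle. The hash function and row layout here are illustrative, not Hive's actual implementation.

```python
NUM_BUCKETS = 4

def bucket_of(key):
    """Stable bucket id for a string key (illustrative hash, not Hive's)."""
    return sum(key.encode()) % NUM_BUCKETS

def bucketize(rows, key_col):
    """Split rows into NUM_BUCKETS lists keyed by the bucketing column."""
    buckets = {b: [] for b in range(NUM_BUCKETS)}
    for row in rows:
        buckets[bucket_of(row[key_col])].append(row)
    return buckets

rows = [{"cust": "A1", "amt": 10}, {"cust": "B2", "amt": 20}]
buckets = bucketize(rows, "cust")
# Because the hash is deterministic, a matching key in a second table
# always lands in the same-numbered bucket, enabling bucket-wise joins.
print(sum(len(v) for v in buckets.values()))  # 2
```

The same property is what Hive's bucket map join exploits: only same-numbered bucket files need to be paired up.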
Here we support all telecom applications in the prepaid stack of Airtel, such as charging, subscription, CDRs, recharge plans, offers, activation/deactivation, MNP, Airtel Money, dedicated accounts, and DTH.
Profile Description: Providing support for live production database issues and severities for all applications in the prepaid stack of the Airtel client for India, Bangladesh, Sri Lanka, and South Africa.
I also provide design and solutions, data modeling and data mining, and technical support for medium- to large-scale databases (OLTP and data warehouse, terabyte-scale) for new incoming applications and requirements.
2. Database architect
20. Using AIX tools (topas, vmstat, iostat, ps, sar, netstat, nmon) to monitor CPU/memory/disk/network bottlenecks.
21. Monitoring databases using monitor snapshots, db2top, db2pd, monitor functions, SYSIBMADM views, and the db2diag log.
22. Providing support to application development teams (Java teams) and other stakeholders on severity issues.
23. Assisting teams in application production planning and implementation.
24. Designing storage layouts for non-partitioned and partitioned (DPF) databases with the required striping, RAID levels, extent sizes, and file systems (LV, VG, PV).
25. Designing high-availability solutions for the DB: HADR, HACMP, SQL replication.
26. Tuning buffer pool allocations, catalog cache, package cache, sort memory, lock list, DB cfg, DBM cfg, and registry parameters.
27. Recommending AIX parameters and tuning settings: no, vmo, schedo, ioo, aio.
28. Using DB utilities such as REORG, RUNSTATS, REBIND, EXPORT, IMPORT, LOAD, BACKUP, and RESTORE.
29. Automating maintenance activities by writing shell scripts (partition add/remove, RUNSTATS, REORG, monitoring, etc.).
30. Configuring and supporting DB2 federation features: wrappers, server definitions, user mappings, and nicknames to access remote DB servers and data.
31. Generating, creating, and altering DDL, DML, and DCL.
32. Supporting data migration from one system to another and creating scripts to automate monitoring metrics.
33. Experience designing the logic for, configuring, and maintaining MDC, MQT, and range-partitioned tables and index features.
34. Experience designing databases on symmetric multiprocessing (SMP) and massively parallel processing (MPP) systems with enterprise storage area networks (SAN) and AIX P-series servers.
35. Experience with DB2 system, instance, and database security and auditing using table definitions, triggers, data replication, and automated monitoring.
SUPPORTS: Providing DB2 application support for multiple large databases in development, test, and production environments, covering both OLTP and data warehouse databases running on AIX and supporting various telecom software products in terms of tuning, installation, and testing. We manage 10 databases, including performance tuning and database recovery; database sizes vary from 5 GB to 40 GB. The projects using these databases run on WebSphere Application Server for web development/web portals and for maintenance of client-server OLTP systems for telecom business applications such as iECCM, DCM, EBPP, DUPBILL, RSWEB, and CRM Bill View. Following are some of the typical tasks carried out as part of database administration.
Training Programs
1) Red Hat Linux basic administration at the ONGC Data Centre (Delhi), from HP
2) iECCM/DCM telecom billing software installation/integration on AIX, integration with the DB2 database and IBM WebSphere Application Server, and application operations, at Intense Technologies
Achievements
Ø Awarded the 2014 Orion Award, recognizing eminence and excellence, by IBM India.
Ø Recognized by IBM for great work in the launch of the subscription engine project at Airtel.
Ø Recognized by Morgan Stanley for a great contribution in building the data warehouse for investment banking.
Ø Three-year Diploma in Computer Engineering from the Delhi Board of Technical Education
Ø 10th and 12th from CBSE
Personal Details