
1. UPDATE REGARDING PROJECT
2. LOOKUP: UNCONNECTED, DYNAMIC, UNCACHED, STATIC CACHE
3. XML
4. TRANSACTION CONTROL
5. SQL TRANSFORMATION (SQLT)
6. STORED PROCEDURE TRANSFORMATION (SP TRANS)
7. SESSION PROPERTIES


DOMAIN
1. Get the business requirement (BRD): analyze policy sales.
2. Get the technical requirement: a report on policy performance (P/L amount) broken down by customer type, region, season, branch, broker, and employee.
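A minimal sketch of the report query this requirement implies, assuming an Oracle-style star schema; every table and column name here is hypothetical:

    -- Policy profit/loss by customer type, region, and season
    SELECT c.customer_type,
           r.region_name,
           d.season,
           SUM(f.pl_amount) AS total_pl_amount
    FROM   fact_policy_sales f
    JOIN   dim_customer c ON c.customer_sk = f.customer_sk
    JOIN   dim_region   r ON r.region_sk   = f.region_sk
    JOIN   dim_date     d ON d.date_sk     = f.date_sk
    GROUP  BY c.customer_type, r.region_name, d.season;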
3. Identify the dimensions and facts in the requirement.
4. Identify the OLTP tables/columns related to each fact and dimension.
5. Bucket the dimensions and combine related dimensions to form dimension tables.
6. Example customer bucket: customer type, customer city, customer salary range, customer dependents.
7. Design the dimension tables and establish their relationships with the OLTP tables and columns.
8. Capture this in a data-mapping sheet holding the logic to load data from the source (OLTP) to the target (OLAP: dimension and fact tables).
9. Decide each dimension table's type (SCD1 or SCD2) and add the extra metadata columns (sk, eff_dt, exp_dt, update_dt), as in the sketch below.
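A minimal sketch of an SCD2 dimension table carrying those metadata columns, assuming Oracle; the business columns are illustrative:

    CREATE TABLE dim_customer (
        customer_sk    NUMBER(10)   NOT NULL,   -- surrogate key
        customer_id    NUMBER(10)   NOT NULL,   -- natural key from OLTP
        customer_type  VARCHAR2(30),
        customer_city  VARCHAR2(50),
        salary_range   VARCHAR2(20),
        dependent_cnt  NUMBER(3),
        eff_dt         DATE         NOT NULL,   -- row effective date
        exp_dt         DATE,                    -- open (or 9999-12-31) for the current row
        update_dt      DATE,                    -- last SCD update timestamp
        CONSTRAINT pk_dim_customer PRIMARY KEY (customer_sk)
    );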
10. Create a high-level design (HLD) document with the following sections:
10.1. Introduction
10.2. Overview
10.3. Dataflow diagram
10.4. ETL framework for pre- and post-process
10.5. Audit logging
10.6. Error logging and data-cleansing process
10.7. Naming conventions
10.8. Load frequency
10.9. Dependencies
10.10. Source information
10.11. Target information
10.12. Sample reports
10.13. Load failure and recovery steps
11. Create and design the ETL framework for the pre- and post-process.
12. Design and create the error-logging and cleansing process; one possible shape for the error table is sketched below.
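A sketch of a user-defined error table for the logging and cleansing process; the columns are an assumption about what the framework would need:

    CREATE TABLE etl_error_log (
        error_id     NUMBER(12)  NOT NULL,
        load_id      NUMBER(12),                -- batch/run identifier from the ETL framework
        src_table    VARCHAR2(60),
        src_row_key  VARCHAR2(200),             -- natural key of the rejected row
        error_col    VARCHAR2(60),
        error_msg    VARCHAR2(4000),
        error_dt     DATE DEFAULT SYSDATE,
        CONSTRAINT pk_etl_error_log PRIMARY KEY (error_id)
    );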
13. Create the DDL script for the OLAP system (dimension and fact tables); a fact-table sketch follows.
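A matching sketch for the fact table, with foreign keys pointing at the dimension surrogate keys (names illustrative; only one FK constraint shown to keep it short):

    CREATE TABLE fact_policy_sales (
        policy_sk    NUMBER(12)  NOT NULL,
        customer_sk  NUMBER(10)  NOT NULL REFERENCES dim_customer (customer_sk),
        region_sk    NUMBER(10)  NOT NULL,
        branch_sk    NUMBER(10)  NOT NULL,
        broker_sk    NUMBER(10)  NOT NULL,
        employee_sk  NUMBER(10)  NOT NULL,
        date_sk      NUMBER(8)   NOT NULL,
        premium_amt  NUMBER(12,2),
        claim_amt    NUMBER(12,2),
        pl_amount    NUMBER(12,2)              -- profit/loss measure for the report
    );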
14. Create the DDL script for the OLTP system too, since you are taking care of the source as well.
15. Create sample data for the OLTP system in flat files.
16. Create the three-level architecture of the project (Landing Zone, Staging Area, Data Mart) and prepare the DDL for all of these tables.
17. Create ETL design documents for every table load.
18. Create test cases for all mappings.
19. Create schemas for LNDG, STG, and DM and create the tables; see the sketch below.
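A minimal sketch of the three schemas, assuming Oracle, where a schema is created as a user (grants kept to the minimum; adjust passwords, quotas, and privileges to your environment):

    -- One schema per layer: landing, staging, data mart
    CREATE USER lndg IDENTIFIED BY lndg_pwd;
    CREATE USER stg  IDENTIFIED BY stg_pwd;
    CREATE USER dm   IDENTIFIED BY dm_pwd;

    GRANT CREATE SESSION, CREATE TABLE TO lndg, stg, dm;

    -- The same entity then flows through the layers, e.g.:
    --   LNDG.CUSTOMER   -> raw copy of the flat file
    --   STG.CUSTOMER    -> typed, cleansed rows
    --   DM.DIM_CUSTOMER -> conformed SCD dimension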
20. Create an Informatica folder for each of the three stages and create a shared folder.
21. Create reusable objects (UDFs and mapplets in the main folders; sources and targets in the shared folder).
22. Create the mappings for all three stages.
23. Run the tests and validate the unit test cases (UTC) for all three stages; a typical source-to-target check is sketched below.
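One common unit-test check is a source-to-target reconciliation between layers; a sketch reusing the hypothetical names from above:

    -- Row counts must reconcile between landing, staging, and the error table
    SELECT (SELECT COUNT(*) FROM lndg.customer)  AS lndg_cnt,
           (SELECT COUNT(*) FROM stg.customer)   AS stg_cnt,
           (SELECT COUNT(*) FROM etl_error_log
             WHERE src_table = 'CUSTOMER')       AS error_cnt
    FROM   dual;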

24. Parameterize all sessions and create a config object.
25. Create worklets for LNDG, STG, and DM and include the sessions in the corresponding worklet.
26. Add link dependencies so that if only error records were loaded, or the business data is wrong, the error table is loaded and the next sessions do not run.
27. Configure an Email task to notify on any failure.
28. Configure a command task to clean and archive the log files.
29. Schedule the job based on a file watcher, and clean up the watcher file in the post-process.
30. Create a partition-creating stored procedure and run it as a pre-process; a sketch follows.
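A minimal PL/SQL sketch of the pre-process partition procedure, assuming the fact table is range-partitioned by load date (Oracle syntax; names illustrative):

    CREATE OR REPLACE PROCEDURE add_fact_partition (p_load_dt IN DATE) AS
    BEGIN
        -- DDL inside PL/SQL requires dynamic SQL
        EXECUTE IMMEDIATE
            'ALTER TABLE fact_policy_sales ADD PARTITION p_'
            || TO_CHAR(p_load_dt, 'YYYYMMDD')
            || ' VALUES LESS THAN (DATE '''
            || TO_CHAR(p_load_dt + 1, 'YYYY-MM-DD') || ''')';
    END;
    /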
31. Create a lookup-and-expression method to generate the surrogate keys (sk); the sketch below shows the idea.
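One way to build this is an unconnected lookup whose SQL override fetches the current maximum key once, with an expression variable port incrementing it per row. A sketch (the lookup name and ports are hypothetical; the expression logic is shown as comments):

    -- Lookup SQL override for an unconnected lookup (e.g. LKP_MAX_SK):
    SELECT NVL(MAX(customer_sk), 0) AS max_sk
    FROM   dim_customer;

    -- Expression transformation pseudologic, evaluated per row:
    --   v_sk          = IIF(v_sk = 0, :LKP.LKP_MAX_SK(1), v_sk) + 1
    --   o_customer_sk = v_sk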
32. Enable recovery for the fact load process.
33. Enable the Informatica error-logging process to capture all error data, in addition to the user-defined error logging.
34. Create a post-process that captures all error records from the user-defined error table and the Informatica error tables, generates a report in a flat file, and sends it to the user with the mailx command; a sketch follows.
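Informatica's relational error logging writes to the PMERR_* tables (for example PMERR_MSG); the post-process can union those rows with the user-defined error table, spool the result to a flat file, and send it with mailx. A sketch reusing the hypothetical etl_error_log table from above:

    -- Combined error report for the load
    -- (PMERR_* column names can vary by PowerCenter version; verify in your repository)
    SELECT 'USER' AS error_source, e.error_msg, e.error_dt AS error_ts
    FROM   etl_error_log e
    UNION ALL
    SELECT 'INFA', m.error_msg, m.error_timestamp
    FROM   pmerr_msg m
    ORDER  BY error_ts;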
35. Perform SIT (system integration testing).
36. Parameterize the session links so that the workflow can be recovered from the failed session.
37. Create a deployment group and prepare the migration document.
38. Migrate all code to the UAT repository for user acceptance testing.
39. Create a Unix script to run the job through cron or any other scheduler tool.
40. Need to learn more, but be happy.
