7/30/2019 Info Resume
A Kiran
Cell: +91-8105056816.
Professional Summary:
A competent professional with 3.5 years of experience as a Developer using Informatica. Knowledge of Data Warehousing concepts, Business Intelligence, UNIX, SQL, and data modeling.
Effective communicator with excellent relationship-building and interpersonal skills. Solid data warehousing experience using Informatica PowerCenter (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager).
Extensive experience with data extraction, transformation, and loading (ETL) from disparate data sources such as Oracle and flat files. Good grounding in OLAP tools such as COGNOS and Business Objects.
Strong experience with SQL statements. Solid in Data Warehousing concepts and dimensional star schema and snowflake schema methodologies.
Good at working with on-shore teams for on-time delivery of solutions. Good at coordination and communication between on-shore and offshore teams.
Ready to take up new challenges and easily adaptable. Able to take on multiple assignments. Very good problem-solving skills.
Professional Experience:
Working with Emsys Software Technologies, Bangalore, from Nov 2009 till date.

Education:
B.Tech in Information Technology from JNTU.

Technical Skills:
Operating System : Windows XP/7, Windows 2000, Windows 2003, UNIX
ETL Tools : Informatica Power Center 8.6
RDBMS : SQL Server 2005, Oracle 8i/9i/10g.
Languages : C/C++, SQL and PL/SQL
Reporting tools : Cognos 8.4
EXPERIENCE SUMMARY:
Project 1:
Project Name : Data marts for Merrill Lynch & Co., Inc
Client : Merrill Lynch & Co., Inc, U.S.A
Duration : Apr 2012 till date
Designation : Software Engineer
Role : ETL Developer
Technology : Informatica 8.6.0, Oracle 10g, Windows, UNIX
Description:
This project involved the development of a data warehouse for Merrill Lynch, based on four data marts: Accounts, Loans, Credit Cards, and Insurance. Each data mart represents a collection of data related to a single business process. In the Loans data mart, Merrill Lynch disburses loans for various purposes such as personal loans, educational loans, vehicle loans, housing loans, consumer durable loans, etc. The company requires different levels of analysis regarding loan amount, type of loan, type of customer, type of payment schedule, interest rate (variable or fixed), defaulters list, penal interest calculations, etc.
The purpose of the data warehouse is to maintain historical data and provide a central location for integrating the different source data, analyzing the business in different locations by profit area; it thus serves as a decision support system (DSS) for decision makers.
Roles & Responsibilities:
Good understanding of Technical Design Documents.
Worked on Informatica PowerCenter tools such as Source Analyzer and Warehouse Designer to import the source and target database schemas, as well as Mapping Designer, Workflow Manager, and Workflow Monitor.
Used transformations such as Aggregator, Sorter, Lookup, Filter, Expression, Router, Joiner, Source Qualifier, Update Strategy, and Sequence Generator.
Used mapping parameters and variables.
Worked extensively on slowly changing dimensions.
Identified bottlenecks in mappings and was involved in performance tuning.
Created sessions and workflows to ensure the data is loaded into the specific tables of the DWH.
Configured various tasks in the workflows for dependency handling using Command, Event Wait, Timer, and Email tasks.
Involved in unit testing.
Optimized mappings by changing the logic to reduce run time.
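Sessions and workflows like the ones above are commonly launched through PowerCenter's pmcmd utility. As a rough illustrative sketch only (the service, domain, user, folder, and workflow names are placeholders, not taken from any actual project), a small wrapper can assemble the command line, here in dry-run form so it can be reviewed before execution:

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper for starting a PowerCenter workflow.
# All connection values below are illustrative placeholders;
# the password option is omitted since this only prints the command.
INFA_SERVICE="IS_DEV"        # Integration Service name (assumed)
INFA_DOMAIN="Domain_Dev"     # PowerCenter domain (assumed)
INFA_USER="etl_user"         # repository user (assumed)
INFA_FOLDER="LOANS_DM"       # repository folder (assumed)

# build_start_cmd WORKFLOW -> echoes the pmcmd command line
build_start_cmd() {
    wf="$1"
    echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN" \
         "-u $INFA_USER -f $INFA_FOLDER -wait $wf"
}

# Dry run: print the command instead of executing it.
build_start_cmd "wf_load_loans_dm"
```

In practice such a wrapper would be called from a scheduler, with the workflow name passed as an argument.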
Project 2:
Project Name : RBI Production Support and enhancement
Client : Retail Business Intelligence, Mexico
Duration : Feb 2011 to Mar 2012
Role : ETL Developer
Environment : Power Center 8.6.0, Oracle 10g, Windows XP, UNIX
Description:
The aim of the project was to support the BI applications of the client. The BI applications therein rely on Informatica PowerCenter for data integration. Oracle 9i serves as the OLTP source, whereas Teradata V2R6 serves as the EDW. Some reporting applications were also based on SQL Server 2005. The reporting tools used were MicroStrategy and a few other in-house tools.
Responsibilities:
Monitored and fixed workflow failures.
Automated regular tasks using Informatica / UNIX shell scripts.
Solved data issues in reports.
Coordinated planned / unplanned system shutdowns.
Used most of the transformations, such as Source Qualifier, Expression, Aggregator, connected and unconnected Lookups, Filter, Router, Sequence Generator, Sorter, Joiner, and Update Strategy.
Performance tuning of Informatica jobs and stored procedures already in production.
Involved in enhancements of the mappings according to the design documents.
Imported data from various sources, transformed it, and loaded it into data warehouse targets using Informatica.
Used shortcuts to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.
Developed Informatica mappings and tuned them when necessary.
Optimized query performance and session performance.
Unit and system testing of developed mappings.
Documentation describing program development, logic, coding, testing, changes, and corrections.
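Workflow-failure monitoring of the kind described above is often a small shell script that scans session logs for failure markers. The following is only an illustrative sketch, assuming a generic log format; the marker strings and the temporary log used to exercise it are invented for the example:

```shell
#!/bin/sh
# Sketch: scan a session log for failure markers and report status.
# The marker strings below are illustrative assumptions, not a
# documented PowerCenter log format.

# check_log FILE -> prints FAILED or OK; exit status mirrors the result
check_log() {
    if grep -E -q "ERROR|FATAL|Session run failed" "$1"; then
        echo "FAILED"
        return 1
    fi
    echo "OK"
    return 0
}

# Exercise the function against a throwaway log file.
log=$(mktemp)
printf '%s\n' "Session run completed" "FATAL: writer error" > "$log"
check_log "$log"     # prints FAILED
rm -f "$log"
```

A cron entry would typically run such a check after each load window and mail the result, which matches the Email-task style of alerting used in the workflows.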
Project 3:
Project Name : POS Rewrite for retail sales system
Client : Retail Business Intelligence (RBI), Mexico and Texas.
Duration : Nov 2009 to Jan 2011
Role : Team member
Environment : Informatica 8.6.0, Oracle 10g, Windows XP, UNIX
Description:
POS data flow, viz. de-duplication, collection, and publication, is the backbone of enterprise data warehouse integration in the retail domain. The data from here gets integrated into the enterprise data warehouse through a staging area. The aim of the project was the migration of the POS flow from Oracle 10g to Teradata V2R6. Subsequently, all the dependent integrations were also modified to depend on Teradata rather than Oracle.
Responsibilities:
Imported data from various sources, transformed it, and loaded it into data warehouse targets using Informatica.
Analyzed the sources and targets and re-imported them to keep the dimensional model in sync.
Created stored procedures and triggers, and edited mappings according to the detailed design document.
Identified dependent jobs in the system.
Modified and ran sessions.
Performance tuning of Data mart, Data Validation.
There were two kinds of data loading processes (daily and weekly), depending on the frequency of the source data.
Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression, and Sequence Generator.
Knowledge of slowly changing dimension tables and fact tables.
Worked with different sources such as Oracle, MS SQL Server and flat files.
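The daily/weekly split in the load processes mentioned above is usually driven by the calendar in the scheduling script. Purely as a hedged sketch (the parameter-file names and the Sunday-for-weekly schedule are invented for illustration), a wrapper might pick the load type from the day of the week:

```shell
#!/bin/sh
# Sketch: choose daily vs weekly load based on day of week (1=Mon .. 7=Sun).
# Parameter-file names and the weekly-on-Sunday rule are illustrative
# placeholders, not from an actual project schedule.

# pick_load DAYNUM -> echoes which parameter file to use
pick_load() {
    if [ "$1" -eq 7 ]; then
        echo "param_weekly.prm"   # Sunday: full weekly load (assumed)
    else
        echo "param_daily.prm"    # Mon-Sat: incremental daily load
    fi
}

# In production the day number would come from: day=$(date +%u)
pick_load 3      # prints param_daily.prm
pick_load 7      # prints param_weekly.prm
```

The chosen parameter file would then be passed to the workflow launcher so the same mappings serve both load frequencies.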