ETL Developer Resume Samples
0-5 years of experience
Built a decision support model for the insurance policies of two lines of business: workers' compensation and business owners' policy.
- Created variables using Informatica PowerCenter to score the policies, a process that increased policy renewals by 35%.
- Worked with users to design the underlying data warehouse and marts.
- Analyzed and executed test cases for various phases of testing: integration, regression, and user acceptance
0-5 years of experience
Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL design specifications.
- Used Erwin for logical and physical database modeling of the warehouse; responsible for creating database schemas based on the logical models.
- Involved in performance tuning of targets, sources, mappings, and sessions.
- Wrote complex SQL scripts in place of Informatica lookups to improve performance, as data volumes were heavy.
- Created and monitored sessions using workflow manager and workflow monitor.
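The "avoid Informatica lookups" bullet above describes a common optimization: resolving reference data with one set-based join in the source query instead of a per-row lookup cache. A minimal sketch of the idea using sqlite3 (table and column names are hypothetical):

```python
import sqlite3

# Illustrative schema: a heavy staging feed plus a small reference table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, cust_id INTEGER, amount REAL);
    CREATE TABLE dim_customer (cust_id INTEGER, cust_name TEXT);
    INSERT INTO stg_orders VALUES (1, 10, 99.5), (2, 20, 15.0), (3, 10, 7.25);
    INSERT INTO dim_customer VALUES (10, 'Acme'), (20, 'Globex');
""")

# Instead of probing a lookup once per source row, the join runs once,
# set-based, inside the database -- no lookup cache to build.
rows = conn.execute("""
    SELECT o.order_id, c.cust_name, o.amount
    FROM stg_orders o
    JOIN dim_customer c ON c.cust_id = o.cust_id
    ORDER BY o.order_id
""").fetchall()

print(rows)
```

The same shape applies to an Informatica source-qualifier override: the join lands in the SELECT so the mapping receives already-resolved rows.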
0-5 years of experience
Worked on the maintenance and enhancements for VMware Entitlements related data mart.
- Coordinated monthly roadmap releases to push enhanced/new Informatica code to production.
- Developed and maintained ETL (extract, transform, load) mappings using Informatica Designer 8.6 to extract data from multiple source systems, comprising databases such as Oracle 10g and SQL Server 7.2 as well as flat files, into the staging area, the EDW, and then the data marts.
- Created mappings using connected, unconnected, and dynamic lookups with different caches, such as persistent cache.
- Performed impact analysis of changes made to existing mappings and provided feedback.
- Created mappings using reusable components such as worklets and mapplets built from other reusable transformations.
- Participated in providing project estimates for offshore and on-site development team efforts.
- Coordinated and monitored the project progress to ensure the timely flow and complete delivery of the project
- Worked on Informatica Source Analyzer, Mapping Designer & Mapplet, and Transformations.
0-5 years of experience
Technical environment/Tools used: MS SQL 2008, SSAS, SSRS, SSIS, MS Office, MS Project, Visio, Acorn EPS tools.
- Provided technical support and development to the Profitability Systems and Reporting group. The group leveraged Microsoft’s BI tools and Acorn’s modeling software to implement accurate models, cubes, and reporting on customer and product profitability for the company’s global business operations.
- Designed and developed Data Warehouse for the Profitability Systems and Reporting group.
- Implemented technical projects into production.
- Wrote various technical documents, including Business Requirements, Functional and Technical Specifications, Data Flow and Process diagrams using MS Visio tools.
- Implemented and managed the project change control system and processes and tracked project issue resolution.
- Managed team SharePoint site.
- Coordinated and documented the project's lessons learned.
- Prioritized and handled multiple tasks in high-pressure environment.
0-5 years of experience
Developed ETL procedures to migrate data from an existing data warehouse to an integrated enterprise level data warehouse.
- Created and executed mappings using Slowly Changing Dimension Type 2, Slowly Growing Target, and Simple Pass Through.
- Prepared technical designs for Informatica mappings.
- Developed mapping for fact loading from various dimension tables.
- Contributed to logical and physical data modeling for the new data warehouse.
- Created BTEQ, MLOAD and FLOAD scripts for loading and processing input data.
- Created and executed macros using Teradata SQL Query Assistant (Queryman).
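The Slowly Changing Dimension Type 2 technique named above keeps history by expiring the current row and inserting a new version. A simplified in-memory sketch of the logic (field names are hypothetical):

```python
from datetime import date

def scd2_apply(dim_rows, incoming, today):
    """Apply one incoming record to a Type 2 dimension (list of dicts).

    Each row carries eff_from/eff_to dates and a current flag; when an
    attribute changes we close out the current row and append a new version.
    """
    for row in dim_rows:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attr"] == incoming["attr"]:
                return dim_rows          # no change: nothing to do
            row["current"] = False       # expire the old version
            row["eff_to"] = today
    dim_rows.append({"key": incoming["key"], "attr": incoming["attr"],
                     "eff_from": today, "eff_to": None, "current": True})
    return dim_rows

dim = [{"key": 1, "attr": "NY", "eff_from": date(2020, 1, 1),
        "eff_to": None, "current": True}]
dim = scd2_apply(dim, {"key": 1, "attr": "CA"}, date(2021, 6, 1))
print(len(dim), dim[-1]["attr"])
```

In a real mapping the same branch logic is expressed with a lookup on the dimension plus an Update Strategy deciding insert vs. update.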
0-5 years of experience
[company name] Resource Planning Database
- Gathered requirements, designed, and developed DB2 database solution for an [company name] resource planning application.
- The solution included DB2 configuration, data modeling, and identification of trusted data sources, database creation, and ETL processes.
- Served as the data expert and created DB2 views for Cognos developers to complete reporting requirements.
- Provided training and mentoring for a junior database developer to take over steady-state operations.
0-5 years of experience
Established consulting firm specializing in data integration, Microsoft business intelligence, data warehousing, SSAS, SSIS, ETL solutions, and more. Planned and led all projects and oversaw requirements gathering. Supervised project teams. Carried out ETL development, process maintenance, flow documentation, and management, plus T-SQL tuning and SSIS work, for clients of Analyst Int'l, including Bank of America, the Commonwealth of Pennsylvania, and Nordstrom.
- Led successful conversion of SQL Server 2008 data integration process to SQL Server 2012.
- Developed ETL framework for processing and monitoring import of flat files in SSIS for key client.
- Enabled clients to align BI initiatives with business goals to facilitate competitiveness and productivity.
- Improved efficiency for clients by training new programmers in organizational ETL processes.
0-5 years of experience
Interacted with end users and line managers to gather business requirements and conducted user acceptance testing.
- Interacted with third-party vendors; identified external and internal homogeneous and heterogeneous sources; and extracted, integrated, and loaded data from flat files, Oracle, and SQL Server sources into staging area and database/data mart tables.
- Created Informatica Data quality plans according to the business requirements to standardize, cleanse data and validate addresses.
- Integrated data quality plans as a part of ETL processes.
- Debugged existing ETL processes and did performance tuning to fix bugs.
- Environment: Informatica Power Center 8.6.1, Informatica IDQ 8.6.2, Oracle 10g, SQL, PL/SQL, TOAD, SQL Server, Autosys
0-5 years of experience
Utilized ETL background to create and execute Informatica workflow processes and complex SQL code.
- Developed and tested extraction, transformation, and load (ETL) processes.
- Designed and developed Informatica mappings and sessions, based on business user requirements and business rules, to load data from source flat files and Oracle tables into target tables.
- Designed and implemented stored procedures, views and other application database code objects to aid complex mappings.
- Maintained SQL scripts and complex queries for analysis and extraction.
- Designed Tidal jobs that automated the FTP process of loading flat files posted on the FTP server.
- Provided guidance to campaign analysts on complex technical projects that required advanced SQL coding
0-5 years of experience
[company name] & Company is a diversified financial services company providing banking, insurance, investments, mortgage, and consumer and commercial finance through more than 9,000 stores. The [company name] enterprise data warehouse is built for shipment planning and Order to Remittance to facilitate end-to-end visibility through the order fulfillment process. Data is extracted from various source systems such as EOM (external order management), Oracle Apps OM (Order Management), and Excel files. The reports are built using Cognos.
- Involved in analysis of source systems, business requirements and identification of business rules and creating low-level specifications.
- Responsible for developing, unit testing and supporting various testing phases for the ETL (Extract, Transform and Load) processes using Informatica Power Center.
- Extracted data from various source systems like XML, flat-files, CSV, Oracle
- Implemented Slowly Changing Dimensions (Types 1, 2, and 4).
- Created mappings using transformations such as Source Qualifier, SQL, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, and Stored Procedure.
- Instrumental in performance tuning of mappings/session at database and Informatica level to improve ETL load timings.
0-5 years of experience
Served as enterprise primary subject matter expert for Tealeaf. Maintained Tealeaf instance through event creation, capture configuration, software installation/upgrades, and troubleshooting.
- Contributed expertise for planning and deployment of an innovative Tealeaf capture of Capital One’s card servicing platform during a period of significant corporate transition.
- Provided subject matter expertise to a project team that planned, designed and implemented a refreshed taxonomy that reduced Tealeaf events by 30%, leading to improved end user accessibility and efficiency.
- Collaborated with developers on implementing standardized HTML tagging for sensitive customer information and configured privacy rules within Tealeaf Pipeline to block this information for Payment Card Industry Data Security Standards compliance purposes.
- Utilized data extracts created with Tealeaf cxConnect and designed a business solution that increased the knowledge of customer behavior by integrating mobile customer activities tracked in Tealeaf with Data Warehouse.
- Performed troubleshooting of traffic capture issues on Tealeaf Passive Capture Appliance using Wireshark analysis.
- Integrated data from Data Warehouse into Merced through creation of warehouse data extracts using Informatica PowerCenter that provided sales support leaders access to associate metrics that served for performance development and increased productivity.
0-5 years of experience
Designed complex job modules to generate customer Healthcare reports for all insured members in the states of Wisconsin and New Hampshire with populations of 5 million and 1 million, respectively
- Served as a contractor with United Health Care, the largest Healthcare provider in the United States and developed many modules in Datastage to provide insurance reports according to requirements; delivered insurance reports for 5 different states, which allowed the client to maintain a list of all eligible insured customers and serve their customers more effectively
- Developed a tool in Datastage to reduce the amount of time spent manually on testing and analysis; resulted in saving of more than 50 work hours on each module for each team member
- Spearheaded a team of four members in assessing the accuracy and functionality of a complex data delivery system with a huge quantity of data (5 million subscribers)
0-5 years of experience
Enhanced and developed rapidly growing internet marketing Data Warehouse.
- Designed, developed, tested and implemented custom ETL solutions, with primary focus on Data Warehouse design, administration and performance tuning.
- Provided ongoing assessment and improvement of ETL processes, including ETL error logging, data purge and load monitor.
- Developed Apex reports.
0-5 years of experience
Latin America Data Warehouse. Analyzed, developed, and created stored procedures in PL/SQL to maintain the warehouse's catalogs.
- Supported users in Mexico, Latin America, and the USA
- Created database objects like tables, indexes, stored procedures, database triggers, and views.
- Set up users, configured folders and granted user access
- Developed and created new database objects, including tables, views, indexes, stored procedures and functions, and advanced queries, and updated statistics using Oracle Enterprise Manager on the existing servers
- Created PL/SQL packages to read data from a flat file and load it into tables using UTL_FILE.
- Wrote functional specification and detailed design document for the projects.
- Wrote insert triggers that update the same row being inserted (mutating trigger).
- Optimized queries through the use of diagnostic tools such as EXPLAIN PLAN.
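The UTL_FILE bullet above describes the classic pattern of reading a delimited flat file line by line and inserting each parsed record into a table. A rough Python equivalent of that pattern, using sqlite3 in place of Oracle (the file layout and table are hypothetical):

```python
import csv
import io
import sqlite3

# Stand-in for the flat file UTL_FILE.GET_LINE would read line by line.
flat_file = io.StringIO("emp_id|name|salary\n1|Ana|50000\n2|Bo|62000\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (emp_id INTEGER, name TEXT, salary REAL)")

reader = csv.DictReader(flat_file, delimiter="|")
for rec in reader:
    # Parse each line and insert, mirroring the GET_LINE + INSERT loop
    # a PL/SQL package would run.
    conn.execute("INSERT INTO emp VALUES (?, ?, ?)",
                 (int(rec["emp_id"]), rec["name"], float(rec["salary"])))
conn.commit()

total = conn.execute("SELECT COUNT(*), SUM(salary) FROM emp").fetchone()
print(total)
```

In PL/SQL the loop body would also trap bad lines (e.g. VALUE_ERROR) into a reject log, which is the part that usually justifies hand-written loaders over SQL*Loader.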
0-5 years of experience
[company name] is a total health care management system that manages patient information across three modules: Patient Information, Accounts Information, and Monitoring. The Patient Information module contains patient details and up-to-date health information. In the Accounts Information module, day-to-day bill settlements are entered into the online system. The Monitoring module tracks the patient from day one to the discharge date, and a confidential case sheet is maintained.
- ETL design and development using Informatica transformations such as Sorter, Aggregator, Router, Normalizer, Sequence Generator, Union, Rank, Update Strategy, Lookup, Stored Procedure, Source, Target, and Transaction Control.
- Responsible for enhancing existing mappings and creating new mappings.
- Developed mappings to implement Type 1, Type 2, and Type 3 slowly changing dimensions.
- Developed ETL code for Incremental/delta loads.
- Involved in working with project Managers and analysts on estimates, timelines, and resource plans.
- Experience in managing offshore teams for developing and testing various projects.
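The incremental/delta load bullet above typically means filtering the source on a persisted high-water mark instead of re-reading everything. A minimal sketch of the pattern (column names are hypothetical):

```python
def delta_rows(source_rows, last_watermark):
    """Return only rows changed since the last run, plus the new watermark."""
    fresh = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_mark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_mark

source = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 205},
    {"id": 3, "updated_at": 310},
]

# The first run picks up everything after watermark 0; the saved
# watermark then limits the next run to genuinely new or changed rows.
batch1, mark = delta_rows(source, 0)
source.append({"id": 4, "updated_at": 420})
batch2, mark = delta_rows(source, mark)
print(len(batch1), len(batch2), mark)
```

In an Informatica workflow the watermark usually lives in a mapping variable or a control table, read pre-session and saved post-session.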
0-5 years of experience
Worked on providing ETL extracts from various source systems to the Master Data Management application. Also worked on extracting data from MDM to external vendors and other applications within the enterprise in the form of XML.
- Involved in requirements gathering and analysis
- Designed and developed interfaces for feeding customer data into MDM from internal and external sources
- Involved in the development of enterprise-wide XSDs for extracting data from MDM and feeding other systems within the enterprise
- Worked extensively on almost all transformations, including Aggregator, Joiner, cached and uncached Lookups, connected and unconnected Lookups, SQL, Java, XML, Transaction Control, Normalizer, and Sequence Generator
- Used partitioning concepts extensively to improve session performance in Informatica
- Wrote UNIX scripts to invoke Informatica workflows and sessions
0-5 years of experience
End-to-end ETL design and implementation for Master Data Management (MDM) batch processing using Datastage from the ODS (Teradata) and various sources, including XML, flat files, and Oracle DB
- Performed Bulk data load during historical data push using PL/SQL
- Implemented Real Time Address Standardization using Quality Stage
- Designed and implemented end to end process scheduling through Autosys by creating all the dependencies and file watchers for the entire MDM Batch Services ETL
- Production Support for the MDM Batch Services.
- Oversaw the complete migration effort of ETL/data from MDM v8 to MDM v10
0-5 years of experience
Designed, developed, and tested mappings, sessions and workflows using Informatica PowerCenter in both Relational and Data Warehouse databases
- Collaborated on the development of 2 new Data Marts
- Provided production Level 1 and Level 2 on-call support for ETL processing, requiring time-critical resolutions to issues
- Created procedures and packages using PL/SQL
- Maintained code integrity using version control software (PVCS)
- Created scripts to automate in-house metadata explorer project using Perl, Shell Script and SQL
- Created Perl and SQL scripts to automate project in order to provide monthly data updates for data segmentation effort
- Developed ETL solutions for numerous projects, supporting company-wide initiatives
- Collaborated on ETL and Brio test plans, ensuring report integrity was maintained as complex updates were made to upstream systems
- Designed, developed, and tested Brio Reports
0-5 years of experience
Developed and implemented ETL jobs that facilitated OLTP processing for systems integration.
- Re-designed and coded existing C# integration into ETL processes.
- Provided production support for all ETL processes.
- Database Administration for SQL Server and Oracle databases.
- Created naming standards for database metadata.
- Developed documentation for all ETL processes.
- Assisted in the gathering of functional and technical requirements for systems integrations.
- Developed and executed SQL queries for various ad-hoc reports requested from Executive Management.
- Led the effort of evaluating prospective ETL tools for procurement purposes.
0-5 years of experience
ETL Developer, Cigna Information Management
- Migrated existing data flows to the third-party scheduling tool CA ESP for a large operational data store application and improved monitoring capabilities through development of new workflows and recommendation of new best practices
- Improved data quality of the ODS application through small-enhancement development and quality assurance
- Led blog team through redesign of intake process, content, and web design; organized recruiting and interview events
0-5 years of experience
Responsible for setting up customer reporting system as a part of new report engine to allow customers self service report development and analytics in Jasper Server.
- Designed and developed the database in Vertica for reporting. Detailed reporting requirements were not available, so tables were simply brought over "as is"; this will act as a future staging database. The main schema will be utilized for a multi-tenant data architecture, again using Talend to move data into a separate schema for each company as their needs dictate.
- Developed ETL jobs in Talend to integrate data from Oracle and MongoDB into a Vertica system in the cloud. The process included a bulk import followed by an update/insert. Fact tables update every 5 minutes to provide near-real-time data for reports.
- Designed multi-tenant data architecture in Vertica. The purpose of a customized data structure is to ensure speed and customization in order to meet various client needs. Again used Talend for the ETL jobs, ensuring proper processing control and error handling.
- Developed Jasper interactive reports using the Vertica RDBMS as a data source. Developed ETL to integrate user information into the JasperServer PostgreSQL database to allow single sign-on.
- Set up multi-tenant reporting architecture in Jasper Server 5.5. Customers can access a standard set of interactive reports and access their own sandbox to create ad hoc views and reports using domains created with the multi-tenant data architecture in Vertica.
0-5 years of experience
This highly visible data warehousing project represented an effort to provide a view into and interact with wealth data using a single OBI (Oracle Business Intelligence) Dashboard utilizing agile methodology. This dashboard provided flexible investment reporting functionality, views and client-facing reports, as well as integration of Risk Monitoring functionality and Risk Optics product information.
- Analyzed data gaps and requirements analysis from business users, other data marts and down-stream technology partners
- Created ER and dimensional models using Embarcadero ERStudio Data Architect 8.5
- Structured data objects (via Embarcadero Rapid SQL 7.7.2) utilizing star schema and snowflake schema modeling in addition to data denormalization techniques
- Developed complex ETLs for wealth management data warehouse using Informatica PowerCenter 8.6.1 while following department coding standards and staying compliant with company information security policies
- Performed data profiling/gap analysis and impact analysis related to switching data sources
- Troubleshot data issues
- Created and loaded multiple Metadata resources from the data warehouse (key focus on data-accurate lineage) using Informatica Metadata Manager for ERStudio data models, Informatica ETLs, Oracle BI and Custom Data Source using templates created via the Custom Metadata Configurator
0-5 years of experience
Implemented procedures and methods to minimize the gap between business users and development team and improve communication
- Implemented the methodology for ETL Requirements and Execution Plan and Mapping documents
- Managed migration of multiple systems for an enterprise-wide system upgrade
- Involved in CRM upgrades and data migration across the platform including SQL server and Oracle.
- Involved in data authentication process and Error Reporting
- Implemented ETL framework to facilitate the ETL development process
0-5 years of experience
Created database to track project management, helping manage contractors' projects, hours, and billing.
- Designed ETL packages to load data from different data warehouses and migrate it to text files.
- Used various data flow and control flow items for the ETL
- Wrote highly complicated and optimized stored procedures which were used for main extraction.
- Extensively used Joins and sub-queries for complex queries involving multiple tables from different databases.
- Analyzed the Extracted data according to the requirement.
- Wrote and discussed Analysis reports with Clients
- Worked on client/server tools such as SQL Server Enterprise Manager and Query Analyzer to administer SQL Server.
- Responsible for configuring, tuning and optimizing SQL Server 2005
- Created SSRS inventory management reports whose findings saved the company millions of dollars in client-member Performance Health Management; provided incentive, claims, and biometrics file feeds and identified high-, moderate-, and low-risk members using SSRS dashboards.
0-5 years of experience
- Completed an upgrade from PowerCenter 8.1.1 SP5 to Informatica 9.0.1 in current development, test and production environments
- Served as the subject-matter expert for all Informatica development and admin related project tasks
- Maintained Informatica ETL processes used to load the management reporting data warehouse (front end reporting done via SAP Business Objects and dashboards via Xcelsius)
- Converted complex SQL code into ETL mappings using Informatica
- Made updates to current ETL processes based on new requirements or bug fixes
- Followed SDLC to deploy new and updated ETL and PL/SQL code via monthly development cycle
- Supported off hours by monitoring critical data loads in order to remedy or escalate issues as they occur and avoid production environment down time
0-5 years of experience
Worked on several data integration projects for the extraction, transformation, and loading of data from various database source systems into the data warehouse using Pentaho. Involved in data profiling, data modeling, design, implementation, and monitoring of several ETL projects.
- Designed and developed complex ETL workflows involving star schemas, snowflaked dimensions, and Type 1 and Type 2 slowly changing dimensions
- Optimized several Pentaho jobs and transformations to improve the scalability of complex ETL processes.
- Worked extensively with several heterogeneous data sources: Oracle, SQL Server, flat files, JSON, Splunk, XML
- Developed several Pig scripts, shell scripts, PL/SQL procedures, functions, and materialized views for various ETL processes
- Developed a Java MapReduce job to sessionize log files in HDFS
- Developed Pig UDFs in Java for custom data processing
- Designed and implemented the QA framework for the data warehouse; developed a customized plugin for Pentaho
- Customized the sstable2json export utility by extending the Cassandra source code for SSTableExport
- Designed and developed Java applications, including "Cubscout", using the SNMP Manager API of the Java DMK to retrieve SNMP data
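Sessionizing log files, as the MapReduce bullet above describes, means grouping one user's events into sessions split on an inactivity gap. The reduce-side logic amounts to something like this sketch (the 30-minute timeout is an assumption, not from the original):

```python
def sessionize(timestamps, timeout=1800):
    """Split one user's event timestamps (seconds) into sessions:
    a gap longer than `timeout` starts a new session."""
    sessions = []
    for ts in sorted(timestamps):
        if sessions and ts - sessions[-1][-1] <= timeout:
            sessions[-1].append(ts)      # within timeout: same session
        else:
            sessions.append([ts])        # inactivity gap: new session
    return sessions

events = [0, 600, 1200, 9000, 9300]      # two bursts of activity
result = sessionize(events)
print(len(result))
```

In the MapReduce version the map phase keys events by user and the shuffle delivers each user's timestamps sorted, so the reducer runs exactly this loop.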
0-5 years of experience
The RDW (retail sales data warehouse) serviced financial and marketing groups that needed it to better understand customer purchases; the system allowed the company to analyze which products were selling in which stores, on what days, and under what promotional conditions. It was also used for calculating gross profit, gross margin, product-revenue ratios, promotions, trend analysis, and information from various dimensions.
- Involved in source system analysis to understand the incoming data into data warehouse and its sources
- Extracted data from various sources like SQL Server, Oracle, and Flat Files into the staging area for data warehouse; performed de-duping of data in the staging area
- Involved in design and development of logical and physical Erwin data models (Star Schemas) as part of the design team
- Developed ETLs using Informatica PowerCenter 5.2; used various transformations such as Expression, Filter, Lookup, Joiner, Aggregator, and Update Strategy
- Used Informatica Workflow Manager to create and run sessions to load data using mappings created
- Involved in performance tuning of transformations, mappings, and sessions to optimize session performance
- Experienced in migrating SQL Server databases to Oracle databases
- Involved in maintenance and enhancements for existing ETLs and related processes; provided production support 24×7
- Involved in testing upgrade of Informatica from version 5.1 to 6.1.1
- Assisted in creating physical layer /business model and mapping layer/presentation layer of a repository using Star Schemas for reporting
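De-duping in the staging area, mentioned above, usually means keeping one record per business key, typically the most recently loaded. A small sketch of that rule (the key and timestamp fields are hypothetical):

```python
def dedupe(rows, key="sku", order_by="loaded_at"):
    """Keep only the latest record per business key."""
    best = {}
    for r in rows:
        k = r[key]
        # Later load timestamp wins; first occurrence seeds the entry.
        if k not in best or r[order_by] > best[k][order_by]:
            best[k] = r
    return sorted(best.values(), key=lambda r: r[key])

staged = [
    {"sku": "A", "price": 9.99, "loaded_at": 1},
    {"sku": "A", "price": 8.99, "loaded_at": 2},   # later duplicate wins
    {"sku": "B", "price": 4.50, "loaded_at": 1},
]
clean = dedupe(staged)
print(clean)
```

In SQL the same rule is commonly written as a ROW_NUMBER() OVER (PARTITION BY key ORDER BY loaded_at DESC) filter keeping row number 1.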
0-5 years of experience
Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Understood business needs and implemented them in a functional database design.
- Data Quality Analysis to determine cleansing requirements.
- Designed and developed Informatica mappings for data loads.
- Extracted data from multiple sources to flat files and loaded the data into the target database.
- Hands on experience with developing Oracle PL/SQL stored procedures.
- Worked with ETL leads and contributed to decisions concluding development of the project.
- Performance Tuning Informatica Mappings and Sessions.
- Helped quality analysts understand the design and development of the ETL logic
0-5 years of experience
- Utilized database performance tools (SQL Profiler)
- Debugged current procedures based on data inconsistencies
- Created, Modified stored procedures and Functions in SSMS
- Constructed, modified and tested ETL packages using SSIS
- Programmed in SSIS daily, processed millions of records
- Relied upon in the office for exceptional scripting abilities
0-5 years of experience
The Enterprise Data Warehouse sourcing factory is capable of fulfilling mandatory FR Y-14 reporting needs for M&T. FR Y reports are a stress-testing model for large BHCs to check for liquidity and capital adequacy. Various reports are generated. The EDW is used by the organization for strategic thinking and planning, overall development, fraud detection, and envisioning future prospects.
- Involved in all the stages of the SDLC and completed the ETL deliverables for Data warehouse load.
- Core functions included ETL development and leading development project activities: task assignment, monitoring the pace of work, and identifying and resolving blocks in development activities.
- Discussed with Data Modelers, Data Profilers and business to identify, sort and resolve issues.
- Created primary objects (tables, views, indexes) required for the application from the logical data model created by data modelers.
- Sourced data from COBOL copybooks, Teradata databases, Oracle databases, fixed-length flat files, etc.
- Distributed data residing in heterogeneous data sources is consolidated onto target Enterprise Data Warehouse database.
- Interacted with the data profiling team and SMEs to identify business requirement changes.
- ETL deliverables are documented and provided as per the client requirement.
- Performed performance tuning on scripts.
0-5 years of experience
[company name] has been the leading provider of group disability benefits in the U.S., providing crucial income when people can’t work because of injury or illness. [company name] is also a leading provider of voluntary benefits in the country, offering a variety of valuable, affordable benefits that help protect the financial foundations of millions of U.S. workers.
- Loaded data from source systems and sent it to a JMS queue for loading into target systems, using XML Generator and XML Parser transformations.
- Worked with Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager, and Workflow Monitor.
- Reviewed the design and requirements documents with architects and business analysts to finalize the design.
- Used mapplets in mappings, thereby saving valuable design time and effort.
- Created Logical objects in Informatica Developer tool and exported them to Power Center 9.5 and used them in PC mappings.
- Built common rules in analyst tools for analyst to use in mapping specifications and profiling on tables.
- Created pre-/post-session tasks to save the last generated numbers for surrogate keys (SKs).
- Used Informatica workflow manager, monitor and log files to detect errors.
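Saving the last generated surrogate key between runs, as in the pre-/post-session bullet above, boils down to persisting a counter so each load continues numbering where the previous one stopped. A hedged sketch of the pattern (the JSON state file is an illustrative stand-in for a parameter file or control table):

```python
import json
import os
import tempfile

def load_last_sk(path):
    """Pre-session step: read the last surrogate key issued (0 on first run)."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)["last_sk"]
    return 0

def save_last_sk(path, value):
    """Post-session step: persist the highest key issued by this run."""
    with open(path, "w") as f:
        json.dump({"last_sk": value}, f)

state = os.path.join(tempfile.mkdtemp(), "sk_state.json")
start = load_last_sk(state)                 # 0 on the first run
keys = [start + i for i in range(1, 4)]     # assign SKs 1..3 this run
save_last_sk(state, keys[-1])
second = load_last_sk(state)                # next run resumes at 3
print(keys, second)
```

Persisting the counter outside the session is what keeps key assignment stable across restarts and re-runs.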
0-5 years of experience
Data warehousing solutions is to build a consolidated repository of client, portfolio, position and transaction data (Data which was originally residing on local, disparate databases across geographies). It is a centralized repository for cross regional data like Client, Portfolio, Positions, Transactions, Performance and Attribution, with all the supporting reference data to support global/local products, e- applications and processes.
- Developed the required 1:1 mappings in Informatica according to technical specifications.
- Imported source and target tables from their respective databases.
- Developed Informatica mappings and workflows for migration of data from existing systems to the new system using Informatica Designer.
- Prepared documentation for each mapping according to the logic designed into it.
- Ran jobs in UNIX.
- Customized new jobs from existing shell scripts and ran workflows in the UNIX environment.
- Performed performance tuning at the mapping and transformation levels.
- Prepared and executed test cases.
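Wrapper scripts of this kind typically assemble a `pmcmd startworkflow` call; a minimal sketch (the service, domain, folder, and workflow names are placeholders, and the password is resolved from the environment rather than hard-coded):

```python
def build_pmcmd_cmd(int_service, domain, user, folder, workflow):
    """Assemble a `pmcmd startworkflow` command line like the ones
    issued by the UNIX wrapper scripts (connection details here are
    placeholders, not values from the project)."""
    return ["pmcmd", "startworkflow",
            "-sv", int_service,   # Integration Service name
            "-d", domain,         # Informatica domain
            "-u", user,
            "-p", "$PMPASS",      # password taken from the environment
            "-f", folder,         # repository folder
            "-wait",              # block until the workflow finishes
            workflow]

print(" ".join(build_pmcmd_cmd("IS_DEV", "Dom_DEV", "etl_user",
                               "FIN_DW", "wf_load_positions")))
```

A shell wrapper would then check `pmcmd`'s exit code to decide whether downstream jobs may run.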
0-5 years of experience
Worked on a project to provide the State of Minnesota with Minnesota Basic Skills Test reports for all schools within the state.
- Wrote COBOL programs using MF Net Express 3.1. Analyzed, designed, and coded complex programs producing high-level presentation reports that controlled fonts and spacing using Xerox Dynamic Job Descriptor Entries ("DJDE"). Created "OMRGEN" form controls for the imaging system.
- Developed unit test plans.
- Analyzed and documented the current process flow.
- Converted Net Express programs from 16-bit code to 32-bit code.
0-5 years of experience
Built a four-level Risk Data Warehouse for the Risk Weighted Asset (RWA) application.
- Develop complex ETL mappings and workflows to do data integration, data modeling for data mart
- Implement Slowly Changing Dimension Type 1 and Type 2 mappings
- Use SQL and PL/SQL scripts to support RDBMS systems such as Oracle 11g
- Create complex, multipage reports using most of the IBM Cognos functionality
- Write UNIX Shell Scripts to do pre session extraction
- Schedule workflows using the Informatica scheduler and Control-M
- Tools: Informatica Power Center 9.1, Oracle 11g, Toad and PL/SQL developer, IBM Cognos
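The Type 2 logic above (expire the current row and insert a new version whenever a tracked attribute changes) can be sketched in plain SQL; a minimal illustration using SQLite, with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_account (
    account_sk   INTEGER PRIMARY KEY AUTOINCREMENT,
    account_id   TEXT,     -- natural key
    risk_rating  TEXT,     -- tracked attribute
    eff_date     TEXT,
    end_date     TEXT,     -- NULL marks the current version
    is_current   INTEGER)""")

def scd2_upsert(conn, account_id, risk_rating, load_date):
    """Type 2: if the tracked attribute changed, expire the current
    row and insert a new version; otherwise do nothing."""
    cur = conn.execute(
        "SELECT account_sk, risk_rating FROM dim_account "
        "WHERE account_id = ? AND is_current = 1", (account_id,)).fetchone()
    if cur and cur[1] == risk_rating:
        return                       # no change; keep the current row
    if cur:                          # expire the old version
        conn.execute("UPDATE dim_account SET end_date = ?, is_current = 0 "
                     "WHERE account_sk = ?", (load_date, cur[0]))
    conn.execute("INSERT INTO dim_account "
                 "(account_id, risk_rating, eff_date, end_date, is_current) "
                 "VALUES (?, ?, ?, NULL, 1)", (account_id, risk_rating, load_date))

scd2_upsert(conn, "A-100", "LOW", "2014-01-01")
scd2_upsert(conn, "A-100", "HIGH", "2014-06-01")   # rating changed: two versions
```

A Type 1 mapping would instead overwrite `risk_rating` in place, keeping no history.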
0-5 years of experience
Understand the existing business processes and the required solution using DataStage
- Define the total number of interface developments/enhancements required for the business request
- Create a detailed proposal for the ETL process
- Design the solution in DataStage to meet the requirements gathered from business users
- Plan and develop the DataStage solution and automation of the business process, in line with the design and meeting the business requirements
- Plan the cutover activities for the pilot and full roll out by discussing with business users and other functional teams
- Create plans, test files, and scripts for data warehouse testing, ranging from unit to integration testing
- Create supporting documentation, such as process flow diagrams and design documents
- Design job sequences to automate the process and document all job dependencies, predecessor jobs, and frequencies to help support staff better understand the job runs
0-5 years of experience
Obtaining customer and account information directly from mainframe operational systems affects mainframe utilization, operational system performance, the size of the mainframe environment, and mainframe availability. The objective of this project is to improve the availability of key business data and processes by replicating static or near-static data into a non-mainframe database. A successful implementation will reduce mainframe load and, in the long run, save money by avoiding constant investment in mainframe capacity, while ensuring the best available data quality, controlling it, and populating the Data Warehouse database going forward.
- Interact with programming team on overall methodology, practices, and procedures for support and development to ensure understanding of standards, profiling and cleansing process and methodology.
- Managed ETL development using Informatica PowerCenter.
- Worked with Source Analyzer, Data mappings, Repository Manager, Workflow Manager and Monitor.
- Used most of the transformations such as Source Qualifier, Router, Filter, Sequence Generator, Stored Procedure and Expression as per the business requirement.
- Prepare logical and physical process flows/data models for the new processes in collaboration with the Architecture, Business Analysis and Operations teams.
- Perform data quality analysis, standardization and validation, and develop data quality metrics.
- Ensured data quality at the source and target levels to generate accurate data reports and profiling.
- Created and used reusable Mapplets and transformations using Informatica Power Center.
- Performed extensive testing and wrote SQL queries to verify data loads.
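Post-load verification usually comes down to simple reconciliation queries, comparing counts and hunting for keys that failed to land in the target. A minimal sketch using SQLite (the table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_customer (id INTEGER, name TEXT);
CREATE TABLE tgt_customer (id INTEGER, name TEXT);
INSERT INTO src_customer VALUES (1,'A'),(2,'B'),(3,'C');
INSERT INTO tgt_customer VALUES (1,'A'),(2,'B'),(3,'C');
""")

def reconcile(conn, src, tgt, key):
    """Compare row counts and list keys present in the source but
    missing from the target (a basic post-load quality check)."""
    src_n = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_n = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    missing = conn.execute(
        f"SELECT {key} FROM {src} EXCEPT SELECT {key} FROM {tgt}").fetchall()
    return src_n, tgt_n, [m[0] for m in missing]

print(reconcile(conn, "src_customer", "tgt_customer", "id"))  # (3, 3, [])
```

The same `EXCEPT`-based check works against Oracle or SQL Server with `MINUS`/`EXCEPT` respectively.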
0-5 years of experience
I have mostly been working on report design, ETL, and some cube modifications. Designed several SSRS/SharePoint reports for clients such as AAA, Macy’s, Barclaycards, Chase, etc. Other ongoing work includes designing reports, troubleshooting report failures, creating SSIS packages for ETL jobs, building Analysis Services cubes, and some minor DBA work. The minor DBA work includes setting up users in Active Directory, moving data from one environment to another, working on SQL Agent job information, categorizing SQL Agent jobs, and minor troubleshooting of user access to different domains.
- Provided application support, design, development and implementation along with testing, enhancements, support and training.
- Provided business intelligence support for the applications and systems.
- Designed various email performance and trend reports for customers like AAA, Barclaycards, HSN, Verizon etc.
- Wrote stored procedures, SQL queries, MDX queries, and some VB code for reports. Designed complex reports such as dynamically driven cascading parameterized reports, reports with subreports, and drill-through reports using SQL Server Reporting Services 2005/2008. Created scorecards, KPIs, dashboards, and analytical charts and reports using SharePoint 2010. Used report features such as grouping, sorting, and report parameters to build sophisticated reports. Built ad hoc reports and added drill-through functionality using Report Builder, and created tabular, matrix, and graphical reports from SQL/OLAP data sources using SQL Server 2005/2008 Reporting Services. Installed Reporting Services on the production server and deployed reports from development machines to production machines.
- Created complex Stored Procedures, Triggers, Functions, Indexes, Tables, Views, SQL joins and other T-SQL code to implement business rules. Implemented different types of constraints on tables for consistency.
- Created SSIS packages to extract data from databases such as SQL Server and Oracle and load it into SQL Server tables. Created a package to handle reject data in customer tables: it extracts reject records older than a certain number of days, emails them in different formats if the file size is greater than 20 MB, and then deletes files in the source folder after a certain number of days. Created SSIS packages to move data between different domains, to synchronize SQL databases and Analysis Services cubes, and to process the cubes. Used different data transformation methods, used variables and parameters in packages for dynamically driven extracting and loading, and used complex SSIS expressions. Created various SSIS configuration settings, including environment variables and SQL Server and XML configuration files, and used event handlers to handle errors.
- Performed various database tasks such as browsing database objects, creating new objects, viewing dependencies between objects, generating DDL, executing SQL queries, security management, monitoring sessions, and viewing/modifying database parameters. Created new users in Active Directory and handled security issues for users in SQL databases and the SharePoint site.
- Modified some existing OLAP cubes, adding attributes, new dimensions, etc. Created MDX reports using Analysis Services cubes as data sources.
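The housekeeping half of the reject-handling package described above (flag oversized files, delete files past a retention window) can be sketched as a small script; this is a hypothetical stand-in for the SSIS file tasks, with the thresholds as parameters:

```python
import os
import time

def purge_old_files(folder, max_age_days, size_alert_mb=20):
    """Flag files larger than the alert threshold (the package emails
    these) and delete files older than max_age_days, returning both
    lists so the caller can log or act on them."""
    now = time.time()
    oversized, deleted = [], []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            continue
        if os.path.getsize(path) > size_alert_mb * 1024 * 1024:
            oversized.append(name)          # would trigger the email step
        age_days = (now - os.path.getmtime(path)) / 86400
        if age_days > max_age_days:
            os.remove(path)
            deleted.append(name)
    return oversized, deleted
```

In the actual package this logic lives in File System and Send Mail tasks driven by package variables.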
0-5 years of experience
Led the full Oracle/ETL development cycle and code promotion. Allowed only thoroughly tested and reviewed code to be checked into Subversion, ensuring the quality of our programs. Followed [company name] customers’ needs closely and promptly delivered high-quality products.
- Restructured existing Oracle data structure to a more efficient, more scalable, and more maintainable system.
- Set the programming and error-handling standards for Oracle application development. Supported production issues efficiently and accurately on a 24×7 basis to satisfy [company name] customers.
- Strictly followed company and Postal rules and regulations to protect company assets, worked closely with Agile team.
- Used Agile methodology to guide product development and testing. Showed development progress daily through Agile planning in VersionOne so the Postal customer knew exactly where we stood.
0-5 years of experience
Develop & manage the design, coding, testing, implementation & debugging of Ab Initio Graphs using the GDE environment
- Work alongside with business analysts, DBA team & QA team to design and implement applications
- Create technical specification documents such as deployment guides, test cases, and ETL design
- Lead/participate in design/code reviews to ensure proper execution and complete unit testing
- Provide technical support to QA & Production teams by troubleshooting application code-related issues
0-5 years of experience
- Designed, Developed, Tested & Documented ETL processes and packages.
- Created reports on various tables as per requirements using SSRS 2008.
- Generated reports utilizing chart controls, filters, parameters, sorting etc.
- Created tabular, list, and matrix sample reports.
- Developed SSIS packages using Data Flow transformations to perform data profiling, data cleansing, and data transformation.
- Extracted data from heterogeneous sources such as text files, Excel sheets, and SQL tables.
- Analyzed data and provided inputs for creating data mappings.
- Analyzed data and resolved data issues, performance problems, and production problems.
- Created stored procedures, designed tables, and tuned SQL queries for better performance using T-SQL.
- Extensively used joins and subqueries for complex queries involving multiple tables.
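A join combined with a subquery of the kind these bullets describe can be illustrated briefly; the schema and data here are made up for the example (SQLite stands in for SQL Server):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, amount REAL);
CREATE TABLE customers (cust_id INTEGER, name TEXT);
INSERT INTO customers VALUES (1,'Acme'),(2,'Globex');
INSERT INTO orders VALUES (10,1,500),(11,1,300),(12,2,50);
""")

# Join plus a nested subquery: customers whose total order amount
# exceeds the average total across all customers.
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.cust_id = c.cust_id
    GROUP BY c.name
    HAVING SUM(o.amount) > (
        SELECT AVG(t) FROM (
            SELECT SUM(amount) AS t FROM orders GROUP BY cust_id))
""").fetchall()
print(rows)   # [('Acme', 800.0)]
```

The same query runs unchanged in T-SQL apart from minor dialect differences.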
0-5 years of experience
Provided development to integrate enterprise systems, helping the Finance Directors and the Marketing and Sales teams make vital decisions. Business decisions were based on reports produced from this data warehouse.
- Used DataStage Manager to define Table definitions, Custom Routines and Custom Transformations.
- Wrote Custom Routines and Transformations as per the business requirements.
- Used DataStage Designer to develop Parallel Jobs.
- Extensively worked with DataStage Designer to pull data from sequential files, Data Sets, and Oracle into Oracle target databases.
- Created shell scripts which in turn call the DataStage jobs with all paths to sources and targets and the connection information.
- Used shell scripts to run and schedule DataStage jobs on the UNIX server; scheduling was handled by Control-M.
- Used Built-in, Plug-in and Custom Stages for extraction, transformation and loading of the data, provided derivations over DS Links.
- Developed various jobs using ODBC, Lookup, Funnel, Sort, and Sequential File stages.
- Extensively used Sort, Funnel, Aggregator, Transformer, and Oracle stages.
- Created shared containers to use in multiple jobs.
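Shell wrappers around DataStage jobs usually boil down to assembling a `dsjob -run` invocation; a small sketch (`dsjob` is the DataStage CLI, and the project, job, and parameter names below are illustrative):

```python
def build_dsjob_cmd(project, job, params=None, wait=True):
    """Assemble a `dsjob -run` command line like the ones the shell
    scripts wrapped (parameter names here are placeholders)."""
    cmd = ["dsjob", "-run"]
    if wait:
        cmd.append("-wait")          # block until the job finishes
    for name, value in (params or {}).items():
        cmd += ["-param", f"{name}={value}"]
    cmd += [project, job]
    return cmd

print(" ".join(build_dsjob_cmd(
    "FIN_DW", "j_load_positions",
    params={"SRC_DIR": "/data/src", "DB_CONN": "ORA_FIN"})))
```

The wrapper would typically follow the run with `dsjob -jobinfo` to pick up the job's finishing status for Control-M.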
0-5 years of experience
Involved in creating specification document and requirement gathering for What-If analysis, Scheduling, Accounting, and Agreement.
- Involved in migrating SAP Business Objects XI 3.1 legacy reports to BI 4.1 using the Upgrade Management Tool (UMT).
- Involved in creating the BIAR files and moving them to support the migration.
- Used Crystal Report 2011 to develop reports for different clients by using Highlights experts, Select Expert, Record Select, sub reports, static and dynamic parameters.
- Used Business View Manager (BVM) to fix issues with Dynamic Cascading prompts.
- Created the Agreement universe for the Accounting and Scheduling projects; resolved chasm traps and fan traps in the universe by defining contexts and aliases, and created complex objects using CASE and DECODE scripts.
- Modified the price matching and contract universes by defining new contexts and setting cardinalities between tables.
- Used Webi Rich Client 4.1 and BI LaunchPad to create reports using alerts, groups, element linking with ForEach and ForAll contexts, and complex logic.
- Used Informatica 9.1.5 to create mappings to load the Mart data warehouse from the TPKPN and PPKPN (application) schemas using Lookup, Expression, Rank, Sorter, Aggregator, and Update Strategy transformations.
- Used custom SQL in Webi and the Add Command feature with SQL in Crystal Reports to fulfill requirements.
- Used SQL Developer/SQL Navigator to run queries against the database to find the root cause of data discrepancies and validate results against the database.
0-5 years of experience
Involved in requirement gathering and Business Analysis.
- Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
- Implemented Slowly Changing Dimensions – Type I and Type II in different mappings as per the requirements.
- Designed and developed many simple as well as complex mappings with varied transformation logic, using Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, and more.
- Worked on performance tuning the transformations and mappings.
- Performed Unit Testing and fixed bugs in existing mappings.
- Loaded data from the tables into the OLAP application and aggregated it further to higher levels for analysis.
- Maintained warehouse metadata, naming standards, and warehouse standards for future application development; parsed high-level design specs into simple ETL coding and mapping standards.
- Created the dimensional logical model using ER Studio
0-5 years of experience
Develop ETLs for the conversion of Legacy data to the new CMIPS II system.
- Improve performance of existing ETLs.
- Participate in the execution of ETLs (Live Data) to bring legacy counties online.
- Build and deploy data fixes as needed for conversion defects.
0-5 years of experience
- Gather Customer Requirements
- Develop ETL solutions using PowerShell, SQL Server, and SSIS
- Optimize processes, reducing load time from over 48 hours to 2.5 hours
0-5 years of experience
Develop and maintain data marts on an enterprise data warehouse to support various UHC defined dashboards such as Imperative for Quality (IQ) program.
- Serve as the designated owner accountable for major tasks, taking responsibility for actions and outcomes to ensure timely and cost-effective results for the team.
- Coach new team members on technical tools such as Informatica Designer, Workflow Manager, Workflow Monitor and UHC Data Models.
- Analyze data and build reports using Informatica data profiling tool & Toad for Data Analyst tool so that UHC members can make informed decisions.
- Set and follow Informatica best practices, such as creating shared objects in shared folders for reusability and using standard naming conventions for ETL objects; design complex Informatica transformations, mapplets, mappings, reusable sessions, worklets, and workflows.
- Evaluate business requirements to come up with Informatica mapping design that adheres to Informatica standards.
- Implement performance tuning on a variety of Informatica maps to improve their throughput.
- Work with peers from various functional areas to define IT wide processes like code reviews, unit test documentation and knowledge sharing sessions.
- Collaborate and work with business analysts and data analysts to support their data warehousing and data analysis needs.
0-5 years of experience
- Create SSIS packages to load electronic health records (EHR) from various health providers to submit to Centers for Medicare & Medicaid Services (CMS)
- Extract, transform, and load data from DB2, SQL Server 2005/2008/2012, Teradata, Oracle, ERwin, and flat files for various client/server applications
- Develop data manipulation language (DML), data definition language (DDL), T-SQL scripts and SSIS packages through IBC’s SDLC
- Automate manual data processes and transformations using VBA and VBS
- Conduct statistical analyses including headcounts, membership trends, membership satisfaction, and department forecast/budget dollar amounts
- Identify suspects of undocumented diagnosis for risk adjustment initiatives
- Re-construct manual Excel and SAS reports into automated Tableau reports
- Develop ETL processes to sync simple code tables from IBC’s data warehouse using Collibra