Data Warehouse Developer Resume Samples
0-5 years of experience
Assisted in data analysis, star schema data modeling, and design specific to data warehousing and business intelligence environments. Concurrently responsible for supporting and administering large (2 TB) enterprise-level data warehouses holding weekly credit card and real estate transactions, which were aggregated overnight for Monday-morning reporting to the banking customers.
- Led efforts to analyze data for source/target mappings and created T-SQL scripts for data processing and automation.
- Led database administration and performance tuning efforts to provide scalability, 24/7 data availability, and timely resolution of end-user reporting and access problems.
- Proactively identified, managed, and corrected data quality issues with reports provided to CR managers.
- Developed ETL scripts to enable weekly processing of 2-4 GB data sets into data warehouse.
- Optimized ETL performance to provide the fastest possible response times.
- Automated and executed operational processes and procedures to ensure uptime and performance in a high-availability web-based environment.
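The star schema load described above can be sketched in a few lines. This is a hedged illustration, not the original T-SQL: table and column names are invented, and an in-memory dict stands in for the warehouse.

```python
# Minimal star-schema load sketch: upsert a dimension row to get its
# surrogate key, then write fact rows that reference it.
# All names (customer_id, region, amount) are illustrative assumptions.

def upsert_dimension(dim, natural_key, attributes):
    """Return the surrogate key for a dimension row, inserting it if new."""
    if natural_key not in dim:
        dim[natural_key] = {"sk": len(dim) + 1, **attributes}
    return dim[natural_key]["sk"]

def load_facts(raw_rows, customer_dim, fact_table):
    """Transform raw transaction rows into fact rows keyed by surrogate key."""
    for row in raw_rows:
        sk = upsert_dimension(customer_dim, row["customer_id"],
                              {"region": row["region"]})
        fact_table.append({"customer_sk": sk, "amount": row["amount"]})

customer_dim, facts = {}, []
raw = [
    {"customer_id": "C1", "region": "East", "amount": 120.0},
    {"customer_id": "C2", "region": "West", "amount": 75.5},
    {"customer_id": "C1", "region": "East", "amount": 30.0},  # repeat customer
]
load_facts(raw, customer_dim, facts)
```

In a real warehouse the dimension upsert and fact insert would be set-based SQL, but the key idea is the same: facts never store the natural key directly, only the surrogate key.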
6-10 years of experience
Senior Data Warehouse Developer
- Managed assignments and workload for a team of 6 developers to meet project milestones and delivery goals.
- Designed and built (using Oracle, DataStage, and Brio) the Health and Welfare benefits reporting portal application used by over 50 clients to warehouse and report on approximately 3 million participants per week.
- Implemented and managed client health and welfare databases and customized application parameters.
- Cut 6 weeks from the initial data analysis phase of new client implementations by designing and building a transformation data warehouse.
0-5 years of experience
Designed, developed, and implemented a big data warehouse from scratch using SQL Server 2012
- Created and configured a SQL Server Analysis Services database that introduced the company to multidimensional tracking of subscribers using specialized statistical techniques in SQL and Excel
- Developed and implemented custom data validation stored procedures for metadata summarization of the data warehouse tables, for aggregating telephone subscriber switching data, for identifying winning and losing carriers, and for identifying high-value subscribers
- Identified a data quality issue and developed a corrective procedure that improved the quality of critical tables by eliminating the possibility of duplicate data entering the data warehouse.
- Spearheaded a project to implement company standards and establish procedures to ensure a unified data management approach
- Analyzed and compared performance of Redshift, Hadoop, MySQL and SQL Server databases using TPC-H benchmarking and made recommendations to management
- Implemented partitions on a large dataset as well as index functions using SQL Server 2012, resulting in a 90% performance improvement
- Designed and implemented SQL based tools, stored procedures and functions for daily data volume and aggregation status
- Completed a data warehouse dictionary
- Introduced the company to geographic distance calculation, chi-square tests for identifying subscriber affinity, and survival analysis for quantifying subscriber events over the customer life cycle using analytic SQL
- Created management reports with SSRS as well as SQL and MDX queries
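The duplicate-prevention idea above can be illustrated with a small sketch. This is a generic load guard, not the resume's actual stored procedure; the key columns and row shapes are assumptions.

```python
# Hedged sketch of duplicate elimination at load time: reject any incoming
# row whose business key already exists in the warehouse table.
# Key columns (subscriber, date) are invented for illustration.

def load_without_duplicates(existing_rows, incoming_rows, key_cols):
    """Append only rows whose business key is not already present."""
    seen = {tuple(r[c] for c in key_cols) for r in existing_rows}
    loaded, rejected = [], []
    for row in incoming_rows:
        key = tuple(row[c] for c in key_cols)
        if key in seen:
            rejected.append(row)           # duplicate: quarantine, don't load
        else:
            seen.add(key)
            existing_rows.append(row)
            loaded.append(row)
    return loaded, rejected

warehouse = [{"subscriber": "S1", "date": "2024-01-01", "carrier": "A"}]
batch = [
    {"subscriber": "S1", "date": "2024-01-01", "carrier": "A"},  # duplicate
    {"subscriber": "S2", "date": "2024-01-01", "carrier": "B"},
]
loaded, rejected = load_without_duplicates(warehouse, batch,
                                           ["subscriber", "date"])
```

In SQL Server this would typically be enforced with a unique constraint plus a staging-table anti-join, but the classification logic is the same.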
0-5 years of experience
Led data warehouse integration efforts of Encounter data for multiple health plans.
- Collaborated with multiple health plans to understand and implement detailed requirements for HEDIS and other reporting needs.
- Created detailed functional and technical design documents.
- Planned and coordinated the analysis, design, and extraction of encounter data from multiple source systems into the data warehouse relational database (Oracle) while ensuring data integrity.
- Developed, documented and validated complex business rules vital for data transformation.
- Enhanced and expanded the encounter data warehouse model through sound detailed analysis of business requirements.
- Integrated a wide variety of source file layouts, such as ANSI X12 837, NSF, and UB92 files, into the data warehouse using the SeeBeyond ETL tool.
- Managed the development and execution of test plans, and effected corrective action on data quality issues.
- Instituted and conducted regular reviews of audit reports with encounter vendors to improve the quality and integrity of encounter data
- Improved encounter data quality by up to 90% on most measures as defined by the Business Reporting unit.
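An audit measure like the ones reviewed with encounter vendors can be sketched as a per-field completeness report. The field names below are hypothetical, not from the actual HEDIS layouts.

```python
# Illustrative data-quality audit: for each required field, report the
# fraction of encounter records carrying a non-empty value.
# Field names (member_id, provider_id, diagnosis) are assumptions.

def completeness_report(records, required_fields):
    """Return the fraction of records with a non-empty value per field."""
    total = len(records)
    report = {}
    for field in required_fields:
        filled = sum(1 for r in records if r.get(field))
        report[field] = filled / total if total else 0.0
    return report

encounters = [
    {"member_id": "M1", "provider_id": "P1", "diagnosis": "J45"},
    {"member_id": "M2", "provider_id": "",   "diagnosis": "E11"},
    {"member_id": "M3", "provider_id": "P2", "diagnosis": ""},
]
report = completeness_report(encounters,
                             ["member_id", "provider_id", "diagnosis"])
```

Tracking such rates batch over batch is one simple way to make the "up to 90% improvement" claim measurable.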
0-5 years of experience
The [company name] provides a flexible reporting environment for performing traffic and performance analysis of the SCP nodes of AT&T. The raw data for the data mart is collected as flat files, which are pulled from each remote Tandem SCP through FTP and placed on the UNIX box of the SCP Data Mart. ETL processing is performed through Informatica on Sun Solaris, scheduling of the Informatica sessions is automated through Perl scripts, and reporting is done through Business Objects.
- Monitored and troubleshot Informatica mappings in the production environment.
- Prepared operation and installation manuals
- Developed ETL procedures to ensure conformity and compliance with minimal redundancy, and translated business rules and functional requirements into ETL procedures.
- Developed ETL mappings and shell scripts to clean and load data into the data mart
- Involved in developing reports using Business Objects to meet the reporting requirements
- Developed, executed and documented test plans for Unit and System Integration Testing.
0-5 years of experience
Analyze business requirements and implement data mart solutions for data mining and reporting.
- Translated business requirements into technical design specifications using Visio to design ERD schema.
- Normalized data fields to meet business requirements and specifications, building data marts for various departments into a metadata layer.
- Designed and implemented cubes across multiple data marts using OLAP methods in Business Objects so users could access reports as needed.
- Designed ETL flows to extract data from the CRM application and store it in a MS SQL Server database.
- Maintained ETL jobs for OLAP cubes; executed and supported ETL using MS SQL Server and Informatica.
0-5 years of experience
Created reporting solution to reduce the number of custom report sets needed from 11 to 3 – dramatically reducing report maintenance needs.
- Devised custom Cognos OLAP cube and associated Cognos reports used by all U.S. branch offices to view up-to-date workload and employee performance statistics. Managed all aspects of requirements definition, documentation, and implementation.
- Served as team leader and project manager during successful migration to new version of Cognos software.
- Maintained Cognos business layers and self-service query environment for production Data Warehouse.
0-5 years of experience
Created a prototype Enterprise Information System front-end to allow end users to easily access reports.
- Implemented an algorithm in C to convert hierarchy information into a usable database table in Teradata.
- Created a C++ application to read messages from an MQSeries queue and perform filtering and writing operations.
- Developed ETL routines in Informatica to replace legacy transformation engine.
- Implemented an Access frontend to Teradata allowing connections to multiple machines facilitating seamless transfer of data.
- Created, tested, and documented Teradata Multiloads for moving data into Teradata.
- Converted reports running in MVS to web-enabled versions with Brio.
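The hierarchy-conversion step above was implemented in C against Teradata; as a hedged, language-agnostic sketch of the same idea, the snippet below expands parent-child pairs into flat table rows carrying each node's depth and full path. The node names are invented.

```python
# Flatten a parent->child hierarchy into rows usable as a database table.
# Each output row records the node, its level, and its materialized path.

def flatten_hierarchy(parent_of, root):
    """Walk a child->parent map and emit (node, level, path) rows."""
    children = {}
    for child, parent in parent_of.items():
        children.setdefault(parent, []).append(child)
    rows, stack = [], [(root, 1, root)]
    while stack:
        node, level, path = stack.pop()
        rows.append({"node": node, "level": level, "path": path})
        # push in reverse-sorted order so siblings come out alphabetically
        for child in sorted(children.get(node, []), reverse=True):
            stack.append((child, level + 1, path + "/" + child))
    return rows

parents = {"US": "World", "Europe": "World", "NY": "US", "CA": "US"}
rows = flatten_hierarchy(parents, "World")
```

The materialized-path column is what makes the flattened table queryable without recursive joins.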
0-5 years of experience
The project objective was to create a solution for interest rate risk management for Citigroup Treasury. QRM product APIs were used to load banking and credit card transaction feeds into a master database and then run various statistical calculations. The resulting data was consolidated using Analysis Services and reported via Excel.
- Designed and developed DTS packages for ETL and workflow management
- Wrote .NET applications to execute the controller DTS package and developed a GUI for checking job status and viewing logs.
- Implemented seamless code migration between Dev/UAT/Prod environment using INI files.
- Developed reconciliation reports in Excel to effectively monitor the data movement.
0-5 years of experience
The Data Warehousing project
- Designed, modeled and implemented Data Warehouse and Data Mart: tables, indexes, procedures, packages and UNIX configuration
- Developed PL/SQL packages to load data warehouse and data mart tables to satisfy subscribers' needs
- Created generic PL/SQL and Unix shell script procedures used by four different applications
- Designed staging tables and loaded them with fixed width and delimited source system data using SQL*Loader
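SQL*Loader handled the actual staging loads above; as an illustration of what a fixed-width layout involves, the sketch below slices one record into staging-table fields. The column names and widths are invented for the example.

```python
# Parse a fixed-width record into staging fields, mirroring what a
# SQL*Loader control file's POSITION(start:end) clauses describe.
# Layout entries are (field_name, start_offset, end_offset) -- assumptions.

LAYOUT = [("account", 0, 8), ("tx_date", 8, 16), ("amount", 16, 26)]

def parse_fixed_width(line, layout=LAYOUT):
    """Slice one fixed-width record into a dict of stripped field values."""
    return {name: line[start:end].strip() for name, start, end in layout}

record = "ACCT000120240115   125.50 "   # 8 + 8 + 10 characters
row = parse_fixed_width(record)
```

Delimited files would instead be split on the delimiter, but the staging-table target is the same either way.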
0-5 years of experience
Developed PL/SQL procedures and packages which extracted, transformed and loaded data to/from Oracle 10g Databases.
- Downloaded data into delimited files and logs using IBM DB2 CLP (Command Line Processor), loaded it into Oracle 10g databases using SQLLDR (SQL Loader) and external tables.
- Performed data transformations using custom mappings. Processed and validated data as per business rules to create output files using UTL_FILE package.
- Developed scripts to generate scheduled quarterly reports from processed data.
- Performed scrubbing and validation on existing data.
- Created technical documentation and documented processes on several projects.
- Created and managed physical/logical standby databases using Recovery Manager (RMAN), Oracle Data Guard Broker, and Oracle Enterprise Manager (OEM).
- Created reports on existing data using PL/SQL.
0-5 years of experience
Design and develop data warehouse and reporting solutions for sales reporting and the analysis of web clickstream data. Project scope was a complete end-to-end solution using SQL Server 2000 as the database platform and Crystal Enterprise/Reports as the reporting solution for the Disney Direct Marketing group. The warehouse receives over 4 million rows of raw data per day via a SQLXML/Updategram interface. ETL is then performed at 3-hour intervals using DTS and stored procedures to take raw data and process it through a set of ETL staging tables, tasks, and processes to the final fact or dimension tables.
- Used ER/Studio for logical and physical dimensional data modeling to support a data architecture designed to accommodate a minimum of 1 terabyte of data per year.
- Designed and developed Star Schema dimension and fact tables to support data warehouse requirements for the Sales and Marketing Click Stream DW
- Developed distributed partitioned views so that data and volumes could be managed more efficiently for future growth.
- The warehouse design incorporates full referential integrity via check and foreign key constraints.
- Developed the Crystal Enterprise reporting system for report distribution and management of scheduled and ad hoc reporting for approximately 65 Crystal Reports v9 / 10 and the associated stored procedure data sources. Integrated all Crystal Reports in Crystal Enterprise for ad hoc and scheduled processing.
0-5 years of experience
- Designed and developed SSIS packages (Incremental and Bulk) for Extracting, Transforming and Loading Monster’s Site data and Siebel CRM data which includes sources like database tables, spreadsheets and files from FTP.
- Implemented optimized SQL queries, stored procedures, and methods to reduce the completion time of SSIS packages used for loading the Global Products Mart and Global Sales Mart, helping to load all data on time to meet the service contract.
- Analyzed, simplified, and standardized the complex delta check process used for Siebel CRM sourcing in many packages, improving data load efficiency by at least 40%, eliminating all maintenance and bug tickets related to this process, and reducing development and testing time for these packages by at least 30%.
- Analyzed, improved, and simplified critical sales-related legacy packages and code to provide an efficient solution
- Assigned as the primary/lead data warehouse developer for Monster's Siebel CRM sourcing
- Worked with different groups and business analysts worldwide to determine business and functional requirements.
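A delta check like the one standardized above can be sketched generically: compare incoming source rows against the last-loaded snapshot by key and classify each row. This is a simplified illustration, not the SSIS implementation; the column names are assumptions.

```python
# Generic delta check for incremental loads: classify each incoming row as
# an insert (new key), update (key exists, values changed), or unchanged.
# Key and compare columns (id, status) are invented for the example.

def delta_check(snapshot, incoming, key, compare_cols):
    """Split incoming rows into inserts, updates, and unchanged rows."""
    inserts, updates, unchanged = [], [], []
    snap_by_key = {r[key]: r for r in snapshot}
    for row in incoming:
        old = snap_by_key.get(row[key])
        if old is None:
            inserts.append(row)
        elif any(row[c] != old[c] for c in compare_cols):
            updates.append(row)
        else:
            unchanged.append(row)
    return inserts, updates, unchanged

snapshot = [{"id": 1, "status": "open"}, {"id": 2, "status": "closed"}]
incoming = [{"id": 1, "status": "open"},
            {"id": 2, "status": "reopened"},
            {"id": 3, "status": "open"}]
ins, upd, same = delta_check(snapshot, incoming, "id", ["status"])
```

Skipping the unchanged bucket is where the load-efficiency gain comes from: only inserts and updates touch the warehouse.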
0-5 years of experience
Designed and developed several Essbase cubes integrating load rules, calc scripts and report scripts for Finance.
- Developed an automated process to refresh Essbase nightly processing using UNIX scripts, ESSCMD and MAXL.
- Programmer/Project Lead for several budget cycles utilizing Essbase uploading to Lawson and PeopleSoft.
- Developed and maintained ETL jobs utilizing Ascential DataStage for all Essbase cubes.
- Lead developer on a highly visible project to create Business Objects universes and reports that produced data to support branch-level financial decisions, using VBA for automated distribution.
- Developed .NET application in C#/VB to refresh longitude and latitude information for “Print to Kinko’s” add-in.
- In 2001, redesigned existing Financial Reporting Applications using C/C++ with embedded ESQL.
6-10 years of experience
Data Warehouse Developer with responsibility for retrieving merchandise and inventory files from the mainframe system, creating procedures, and updating Oracle tables.
- Supported the efficiency, quality, and speed of a 9-terabyte data warehouse.
- Produced aggregate reports for executives and the research analytics team providing daily, weekly, and monthly sales summaries and other important statistics, which management used to evaluate profitability and identify trends, aiding decision making.
- Ensured all SLA requirements were met on a daily basis
- Supported a 24/7 on-call rotation.
- Used SQL Loader to load data from external files into Oracle Database.
- Designed PL/SQL code for client’s business process functionality.
- Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
- Strong experience in Data warehouse concepts, ETL and UNIX shell scripts.
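The day/week/month roll-up described in the reporting bullet above can be sketched with a single grouping helper. The sales rows and grouping keys are invented for illustration; the production version was SQL aggregation against Oracle.

```python
# Roll up (date, amount) sales rows by an arbitrary period key, the core
# operation behind daily/weekly/monthly summary reports.
from datetime import date

def summarize(sales, key_fn):
    """Sum sale amounts grouped by the period key returned by key_fn."""
    totals = {}
    for day, amount in sales:
        k = key_fn(day)
        totals[k] = totals.get(k, 0.0) + amount
    return totals

sales = [(date(2024, 1, 1), 100.0), (date(2024, 1, 2), 50.0),
         (date(2024, 2, 1), 25.0)]
by_month = summarize(sales, lambda d: (d.year, d.month))
by_week = summarize(sales, lambda d: tuple(d.isocalendar())[:2])
```

Swapping the key function is all it takes to move between grains, which is why warehouse reports usually share one aggregation path.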
6-10 years of experience
- Served as project manager overseeing multiple small projects simultaneously i.e. Professional Billing System, Blue Cross billing interface, hospital laundry system, and many other apps—financial, clinical, compliance, research, etc.
- Developed data warehouse ETL routines with Oracle, PL/SQL and IBM Websphere/DataStage, Crystal Reports, to integrate data from OLTP source systems into ODS and OLAP schemas to facilitate BI.
- Enhanced the Professional Billing System in Oracle, PL/SQL, VB6, and MS Access to increase accuracy, throughput, and daily billing. Used an MS Access GUI tool featuring parameter-driven reports to stratify billing data across dimensions such as time, department, and billing status. Increased daily billing by 11% to $450,000 and reduced charge lag by two days.
- Assisted clinical and financial units in meeting business needs with third party apps.
- Interviewed clients to gather requirements and vendors to evaluate their products.
- Wrote and presented technical product evaluations to management.
- Developed data warehouse algorithm reducing load time by 20%.
- Developed a customized extract routine from the data warehouse to a downstream adjudication system.
- Mentored and coached new data warehouse associates in development of DataStage ETL routines.
6-10 years of experience
Lead data warehouse developer responsible for the development of SQL Server objects, SSAS cubes, IBM Cognos cubes, SSRS and IBM Cognos reporting models, reports, and dashboards.
- Designed and built relational reporting models based on standard data warehouse using SSRS and converted those over to IBM Cognos Business Intelligence using Framework Manager.
- Developed a packet of out of the box reports using SSRS and converted them to IBM Cognos Business Intelligence using Cognos Report Studio.
- Created a dashboard starter kit for client KPIs using IBM Cognos Report Studio, Business Insight, and Business Insight Advanced.
- Worked with clients to document requirements and develop custom implementations of the data warehouse, including custom ETL processes, SSAS cubes, reports and dashboards.
- Worked with the sales team to create proof of concept environments for prospective clients to allow them to interact with our applications using their own data prior to purchase.
- Played an integral role in creating a universal data warehouse which streamlined the SQL code of 6 billing systems into 1 standard code base.
- Successfully designed, built and tested an automated testing tool that cut testing time for new client installs and upgrades by 75%
0-5 years of experience
Designed the data model based on a star schema, created the tables and indexes, and developed the scripts to load the data mart tables
- Performed unit testing and Integration testing of all the processes
- Trained the Business Analysts on analysis and development of the reports
- Analyzed, designed and developed critical reports in the local division of MCI
0-5 years of experience
[company name] (CFL), LA, USA, is in the business of manufacturing goods and beverages, with factories all over the globe. It offers many products in different flavors, sold through channels such as CSD, institutional, and dealership across North America, Africa, Asia-Pacific, and Europe. Informatica is used as the ETL tool for loading data from heterogeneous source systems in different locations into target tables.
- Worked on the project that involved development and implementation of a data warehouse for CFL
- Responsible for extracting, transforming, and loading data from Oracle and flat files into the data warehouse
- Designed, developed, and tested the mappings using Informatica 8.1.0
- Interacted with management to gather sources and their requirements
- Participated in performance tuning.
- Developed simple and complex mappings using Informatica to load dimension and fact tables per star schema techniques.
0-5 years of experience
Provided technical expertise to clients on implementing PeopleSoft EPM 9.1 Data Warehouse with Websphere DataStage 8.5 ETL tool.
- Set up and configured EPM 9.1 Foundation (business units, SetIDs, calendars, and currency codes) for Financials and HCM warehouse and analytic reporting.
- Gathered requirements and performed impact analysis for data model changes.
- Designed custom Dimensions and Facts Tables to extend data warehouse using Application Designer.
- Created Extract, Transform, Load (ETL) jobs for custom tables in the data warehouse using WebSphere DataStage 8.5, modified delivered ETL jobs based on functional requirements, and troubleshot performance issues with ETL jobs.
- Loaded QA and production environments using WebSphere DataStage within a minimal timeframe.
- Worked with the Test and OBIEE Reporting teams to resolve data and reporting issues.
- Set up the nightly batch process for EPM Financials and HCM ETL jobs. Tasks included creating master sequencers in DataStage, writing UNIX shell scripts, and working with the T.O.C to set up jobs via Control-M.
- Provided on-call support for DataStage ETL job failures.
- Documented software and operational procedures.
0-5 years of experience
Developed XML based solutions using SQL Server 2005, Tibco and .NET technologies.
- Developed SSIS packages programmatically, allowing code reusability by other developers.
- Created web service applications to communicate with Microsoft CRM-AIF application.
- Architected advanced re-usable ETL Web Services components to import client data.
- Trained DBAs and Architects on SQL Server Analysis Services disaster recovery strategies and security.
- Key liaison between Middle tier team and ETL Team providing mentoring among application developers.
0-5 years of experience
Worked on a cross functional team with various software vendors, internal customers and consultants on a daily basis for development and production issues.
- Documented the technical architecture of the data warehouse, including the physical components, their functionality, source data and ability to integrate with existing system architecture.
- Attended TDWI conferences and training in relevant Enterprise Warehouse subjects including Netezza, Informatica, data governance, master data management and integration technologies.
- Developed, tested and implemented various Informatica ETL workflows to load data into the data warehouse objects including Oracle (Staging) and Netezza (EDW) using Informatica Power Center.
- Served as technical lead for subject area implementation.
- Responsible for overseeing ETL architecture and helping to define best practices and process standardization. Worked to enforce consistency in data warehouse for naming conventions and data types.
- Experienced using Kimball Data Warehouse Methodology.
0-5 years of experience
- Led a 5-member group to create a data warehouse for Chicago law enforcement to support better security monitoring and crime prediction.
- Collected Chicago crime data for the most recent two years, designed the fact table and dimension tables, cleaned useless attributes and bad data in Excel, and limited the data set to 100,000 records.
- Used SSIS to load the data into the database and SSAS to generate pivot tables exposing OLAP components for analysis.
- Reached important conclusions such as the most frequent crimes for a certain population, or the trends of assaults among teenagers.
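The pivot-table style analysis the SSAS cubes provided can be illustrated with a small cross-tabulation sketch. The crime records and field values below are invented examples, not actual Chicago data.

```python
# Sketch of a pivot/cross-tab aggregation: count records by a row dimension
# and a column dimension, the basic operation behind an OLAP pivot table.
# Field names and values here are assumptions for illustration.

def pivot_count(records, row_field, col_field):
    """Build a nested dict: row value -> column value -> record count."""
    table = {}
    for r in records:
        row = table.setdefault(r[row_field], {})
        row[r[col_field]] = row.get(r[col_field], 0) + 1
    return table

crimes = [
    {"type": "ASSAULT", "year": 2013},
    {"type": "ASSAULT", "year": 2014},
    {"type": "THEFT",   "year": 2014},
    {"type": "ASSAULT", "year": 2014},
]
pivot = pivot_count(crimes, "type", "year")
```

Trend observations like those in the last bullet fall out of comparing a row's counts across its column values.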