Data Modeler Resume Samples
0-5 years of experience
Led data modeling for multiple divisions (Bank, Card, Digital, Basel)
- Reviewed and approved hundreds of data models
- Recruited and managed team of up to 20 data modelers, including contractors and associates
- Established standard process for data model reviews
- Led effort to establish curriculum for data modeling at [company name]
- Developed automated tool for updating and running quality checks on ERwin data models
- Instrumental in establishing an MSA with a single service provider for data modeling and DBA services
0-5 years of experience
Oracle database administrator and data architect responsible for the creation of a new registration database to inventory all banks covered by FDIC insurance.
- Built the physical data model for customer review and approval and constructed the registration database using Oracle 9i on a Windows platform.
- Provided scripts for load strategies, backup and recovery procedures, and performance tuning to ensure the system was available 24/7.
- Produced the data dictionary and database support documentation for the customer so the system could be turned over to them for future support.
- Led and mentored the development staff on efficient coding and design techniques and performed regular diagnostic checks on the production system to ensure peak performance.
0-5 years of experience
Designed and implemented a worldwide distributed database to assist the order processing, sales commissions, and customer support functions.
- Developed data modeling project plans
- Modeled data requirements for a multilingual database project to support multiple countries
- Integrated multiple logical data models into a single data model
- Designed the database to support a worldwide application
- Implemented data-centric project methodology
- Coached and supervised other data modelers
0-5 years of experience
Analyzed and developed banking business processes such as loan processing and account management using Oracle packages.
- Designed the conceptual data model based on the requirements; interacted with non-technical end users to understand the business logic.
- Modeled the logical and physical diagrams of the future state; delivered the BRD and the low-level design document.
- Discussed the data model, data flow, and data mapping with the application development team.
- Helped develop PL/SQL packages, conducted unit testing, coordinated production deployment, and resolved post-implementation issues. Provided maintenance and support of existing applications.
0-5 years of experience
Directed development of an enterprise master data model, implementation support for enterprise metadata management software, and management of the enterprise XML schema.
- Achieved super-user level of expertise with Adaptive Metadata Manager, mastering the metamodel(s) and developing load templates for business and technical metadata.
- Produced initial iterations of an Enterprise Logical Data Model for the Consumer domain, enabling gap analysis for acquisition of Master Data Management software.
- Facilitated bi-weekly Consumer Data Stewardship Council meetings, producing seven enterprise business object definitions and a producer/consumer matrix for the Consumer domain.
- Developed several releases of the Enterprise Canonical XML Schema (ECXS), enabling timely implementation of Exchange-related projects for the Affordable Care Act.
0-5 years of experience
Developed SAS software modules of medium to high complexity to meet customer specifications
- Prepared, processed numerous customer input files; parsed and reformatted the data to meet product requirements
- Processed and analyzed product return files; parsed and reformatted to meet customer specifications
- Generated statistical reports for quality checks and other reports for reporting and billing departments
- Actively participated in team meetings and inter-departmental workshops to discuss best practices and process improvements
- Actively participated in technical/professional training
0-5 years of experience
Monitor and manage systems in a SQL Server 2000 clustered, replicated, and distributed environment to provide high availability for databases over 500 GB in size that run 24×7.
- Implemented conversion to the LiteSpeed backup system and am working on an across-the-board upgrade to SQL Server 2005.
- Wrote SQL and DTS packages to deal with weekly data extracts to a data repository.
- Worked to convert the DTS packages over to SSIS in SQL Server 2005.
- Assisted Customer Support Representatives with ad hoc queries and data analysis as well as with generation of reports for executive staff.
0-5 years of experience
Migration involved validating the accuracy of the existing data, designing the domain models required for understanding the existing structure, and integrating new functionality. Designed LDM and PDM for the Enterprise Reference Data.
- Translated and assembled business requirements into detailed, production-level technical specifications, detailing new features and enhancements to existing business functionality.
- Executed data modeling using ERwin data modeling tool.
- Created the data models for OLTP and analytical systems.
- Engaged in data profiling to integrate data from different sources.
- Performed the gap analysis and impact analysis.
- Assisted DBAs on support needs and provided guidance on architectural issues.
- Worked with Development teams with loading and extracting data (ETL).
- Supported QA in developing test plans and test cases for unit, system, and enterprise testing.
- Collaborated on the data mapping document from source-to-target and the data quality assessments for the source data.
0-5 years of experience
- Conducted JAD sessions to determine the requirements for the Merchant Conceptual Model
- Designed the Merchant conceptual, logical, and physical data models.
- Analyzed source data to determine what metadata was to be included in the logical data model
- Performed data mapping to determine where identified source data belonged in the database.
- Created forward-engineered SQL from the CASE tool, which was used to build and maintain the database.
- Used SQL to load source data into the database.
0-5 years of experience
Interacted with the Business Users to analyze the business process and gather the requirements, making necessary changes to schema objects to fulfill their reporting and business needs.
- Modeled new data models based on user requirements and updated the existing data model. Created new metadata and updated existing metadata.
- Discussed tactical solutions with business clients and designed logical data models accordingly. Worked with clients to generate use case scenarios based on user requirements.
- Verified that the data model supported retrieval of the required data by creating data access paths in the model.
0-5 years of experience
Applied Oracle 10g theory of database modeling using Oracle designer.
- Assisted with development and ongoing maintenance of data modeling environment.
- Created logical business models and transformed them into physical data model instances.
- Maintained all Oracle corporate databases, data dictionaries and schemas.
6-10 years of experience
Coordinated with the logical review team to process all database structure changes. Disseminated information to users on naming standards, data governance, and normalization rules within CBP.
- Developed and maintained logical and physical models for databases in MVS, DB2, Oracle, and SQL Server using ERwin Data Modeler.
- Performed forward and reverse engineering to create DDL scripts and physical data models. Created ERwin reports in HTML and RTF.
- Provided technical assistance and guidance to the program office as the physical data architect for analysis of data requirements to third normal form.
- Maintained the contents of the CBP Data Dictionary and provided DBA support including backups, downloads, recoveries, upgrades to SQL server.
- Acted as ERwin administrator; installed ERwin client software on users’ desktops and provided technical support, testing, and training.
- Developed SQL scripts to produce reports within ERWin’s Data Browser to enhance validation and identification of entities and tables components across Model Mart.
- Developed “Procedures” and “Process” documentation as needed.
- Reviewed and mapped data components to NIEM domains as input to creating IEPDs.
0-5 years of experience
- Designed the physical data model using ERwin 4.1 for the projection and actual databases; worked with object modelers and business analysts
- Created DDL and DML scripts. Converted text data into Oracle data.
- Created PL/SQL procedures and triggers, generated application data, created users and privileges, and used Oracle import/export utilities.
6-10 years of experience
Experience with custom and industry data models. IBM IFW: Financial Services Data Model, Banking Data Warehouse (BDW).
- Developed strategy, principles and standards for Enterprise wide model/metadata efforts.
- Applied continuous improvement to keep process structured, manageable and repeatable for modeling and metadata as a member of the Data Governance focus group.
- Worked on a DW with high-volume transaction data of 100 million transactions per month and multiple TB of storage
- Created an analytical checklist of questions for business analysts and SMEs to gather detailed data requirements.
- Involved in designing a web-based GUI for ASG Rochade. The browser allows users to query data items, view and download models (ERwin and PDF formats), and generate reports.
- Established and utilized a checklist for metadata impact analysis, ensuring accuracy and reusability.
- Audited compliance of emerging data model with metadata against Enterprise standards.
- Developed process to track data lineage in Metadata repository.
- Trained associates in multiple countries to establish Enterprise data methodology and standards.
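The data-lineage tracking described above can be sketched as a walk over source-to-target edges recorded in a metadata repository. A minimal Python illustration with hypothetical element names (the actual repository schema is not described in the source):

```python
from collections import defaultdict

# Hypothetical lineage edges (source_element, target_element), as a
# metadata repository might record them for each ETL mapping.
edges = [
    ("src.orders.amount", "stg.orders.amount"),
    ("stg.orders.amount", "dw.fact_sales.sales_amt"),
    ("src.fx.rate",       "dw.fact_sales.sales_amt"),
]

# Index edges by target so we can walk upstream.
upstream = defaultdict(set)
for src, tgt in edges:
    upstream[tgt].add(src)

def lineage(element, seen=None):
    """Transitively collect every source element feeding `element`."""
    seen = set() if seen is None else seen
    for src in upstream.get(element, ()):
        if src not in seen:
            seen.add(src)
            lineage(src, seen)
    return seen

print(sorted(lineage("dw.fact_sales.sales_amt")))
# → ['src.fx.rate', 'src.orders.amount', 'stg.orders.amount']
```

In a real repository the edges would come from the tool's mapping metadata rather than a hard-coded list.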
0-5 years of experience
Designed and developed digital cockpits (an evolutionary concept) for [company name].
- Digital cockpits were developed for following areas: [company name] CEO Cockpit (includes revenue, expense, head count, etc.), Six Sigma, GE Women’s Network, and GE Public Relations.
- Requirement analysis.
- Design & development (PL/SQL programming) and Informatica mappings.
- Reports development.
- Performance tuning & production deployment.
- Tools Used: Oracle, PL/SQL, Informatica, Business Objects.
10+ years of experience
Responsible for collaborating with project teams and business partners to collect and analyze data requirements and produce logical and physical data models in the Erwin data modeling tool.
- Responsible for correcting online procedures, created by outsourced developers, that took 20+ minutes to run. Analyzed the offshore code and identified its nesting pattern and fixed relationships; modified the table structure and combined three procedures into one. Result: reduced query response time to seconds; demonstrated that the problem resided with the outsourcing company; received a CSC award nomination.
- Led global project teams in off-shoring / outsourcing large, multi-million dollar projects.
- Instrumental in creating a new multi-million dollar line-of-business by developing company’s first counting application designed for identifying companies that meet geographic, financial, and general business criteria. Result: received Presidential Citation Award.
- Coordinated project implementation schedules, trained and assisted associates and business partners in the use of SQL, QMF, and DB2.
- Reduced DASD and CPU expense by analyzing data volumes, transaction rates, and application access paths to determine the optimal physical database structure, indexes and normalization strategies.
- Additional tasks included data modeling of data warehouse and data marts, creation of all database objects, security structures, backup and recovery procedures, as well as the creation of disaster recovery facilities using DB2 utilities and third party products.
- Forecasted database growth, performance monitoring and tuning, evaluated DB2 software products, provided technical support for the application programming staff, and managed 24×7 production support effort.
- Specified and ran benchmarks for new releases of DB2 at IBM’s testing center.
- Developed the mathematical model to expand the company’s trademarked D-U-N-S account numbering system without altering the physical size or losing the embedded integrity checks, thereby eliminating the expense of substantial system rewrites.
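The source does not disclose how the D-U-N-S embedded integrity checks actually work, but fixed-size numbering systems with built-in validation commonly use a mod-10 (Luhn-style) check digit, which lets the keyspace expand into unused prefixes without changing the number's length. A hedged sketch of that general technique, purely for illustration:

```python
def luhn_check_digit(digits: str) -> int:
    """Compute a Luhn (mod-10) check digit for a numeric string."""
    total = 0
    # From the right, double every second digit (as positioned once the
    # check digit is appended), summing the digits of each product.
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 0:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

def is_valid(account: str) -> bool:
    """Validate an account number whose last digit is the check digit."""
    return luhn_check_digit(account[:-1]) == int(account[-1])

print(luhn_check_digit("7992739871"))  # → 3, so "79927398713" is valid
```

Any nine-digit body plus one check digit stays the same physical size, so new number ranges can be issued while existing validation logic keeps working; the real D-U-N-S scheme is proprietary and may differ.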
0-5 years of experience
Facilitate meetings with the client for requirements gathering, clarifications, and documentation.
- Data modeling, source mapping, and system analysis.
- Develop universes and reports, clarify requirements with users, and work with other teams (BAs, developers, testers) to understand Curam and determine whether report requirements can be met; if not, determine a course of action.
- Implemented report data security.
- Created tool for uploading initial set of users.
- Created an ad hoc universe and its implementation plan.
- Helped define tasks, provide estimates and assign for implementation.
- Helped in the definition of action plans, roles and responsibilities to ensure data quality.
0-5 years of experience
Worked with Business Analysts in gathering reporting requirements representing Data Design team.
- Designed Teradata data warehouse tables per the reporting requirements.
- Modeled the tables per corporate standards using ERwin, generated DDL, and coordinated with the DBA on the creation of tables.
- Prepared report mapping documents and source-to-target mapping (STM) documents for loading data to target tables.
- Developed ETL load SQL queries using complex joins involving tables spanning the entire data model of the Teradata data warehouse.
- Developed SQL stored procedures as per the business requirements of the ETL load processes.
0-5 years of experience
Built a data model for the internal research
- Created a new flexible model for the internal research data (company and industry research and ratings created by Fidelity analysts)
- Along with business analysts, created an adaptable, meta-data based way of implementing internal Fidelity rules for research publication
- Gathered business requirements from one domestic and three international sites
- Created a nightly cycle to process and load the data
- Created stored procedures to handle challenging processing such as applying Corporate Actions to the model portfolios
- Supported a third-party performance calculation tool (Eagle PACE)
- Created multiple prescheduled reports for business users
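Applying a corporate action such as a stock split is one example of the "challenging processing" those stored procedures handle: share counts scale by the split ratio while total cost basis is unchanged. A minimal Python sketch with hypothetical field names (the original logic lived in database stored procedures, not Python):

```python
from dataclasses import dataclass

@dataclass
class Position:
    symbol: str
    shares: float
    cost_basis: float  # total cost of the lot; a split does not change it

def apply_split(positions, symbol, ratio_new, ratio_old):
    """Apply a stock split (e.g. 2-for-1: ratio_new=2, ratio_old=1)
    to every matching position in a model portfolio."""
    factor = ratio_new / ratio_old
    for p in positions:
        if p.symbol == symbol:
            p.shares *= factor  # more shares, same total cost basis
    return positions

portfolio = [Position("ABC", 100.0, 5000.0), Position("XYZ", 50.0, 2500.0)]
apply_split(portfolio, "ABC", 2, 1)
print(portfolio[0].shares)  # → 200.0
```

A production version would also adjust per-share prices and handle fractional-share rounding rules, which are omitted here.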
6-10 years of experience
Led a major effort to model inter-database interfaces using metadata; documented extract-transform-load (ETL) stored procedures; and organized and coordinated tasks of DBAs, developers, and other analysts. Used MetaCenter, SQL Server, MS Excel, and ERwin as primary tools. Brought order-of-magnitude improvements to IT modeling of business processes; attained a 50% reduction in unplanned “patches.”
- Integrated upwards of 5,000 disparate and inconsistent data elements into a uniform metadata catalog.
- Developed ad hoc SQL queries against a large group of databases, in aggregate 800 tables and 5,000 columns plus derived views, bridges, etc. Used Informix SQL, SQL Server (T-SQL), and Crystal Reports as primary tools.
- Authored a comprehensive Analysis of Alternatives, examined development/deployment of a large-scale case management system through a 10-year life cycle, including extensive cost-benefit and cost-risk analyses; used verifiable metrics.
0-5 years of experience
The Fraud Investigation Database (FID) is a nationwide data entry and reporting system run out of CMS Data Center that allows CMS to monitor fraudulent activities and payment suspensions related to Medicare and Medicaid. The main subject areas of FID are Investigation and Case, Payment Suspension, Request for Information, Data Analysis Project, and CMS Projects.
- Prototyped the MicroStrategy logical layer in the local environment for analysis of FID reports.
- Analyzed the 22 reports to determine whether to convert each using FID tables and views or using Free Form SQL.
- Worked with the MicroStrategy admin team to fix issues with installation and configuration of the MSTR project in the CMS environment
- Converted several reports from PL/SQL into MicroStrategy Report Services documents and created the relevant MicroStrategy objects for these reports, including logical tables, facts, attributes, metrics, filters, prompts, custom groups, and grid reports that can also be used for ad hoc reporting by end users.
- Created several Free Form SQL reports in MicroStrategy and created the PL/SQL procedures used by those reports.
- Documented project risks and communicated them in the weekly status meeting.
- Tested the reports for appropriate summary and detail counts and performed data validations.
- Worked on data modeling tasks for the FID database.
0-5 years of experience
- Designed MBS pooling and pricing models through simulations and back-testing
- Built machine learning models for mortgage banker retention, closing rate prediction and market data monitoring
- Designed the system to analyze client sentiment and topics from millions of client messages
- Designed the large scale job scheduling mechanism for mortgage underwriting operation teams
- Nominated for the 2013 Dan Gilbert Award for excellent work in predictive modeling and optimization
0-5 years of experience
- Environment: Access, SQL Server
- Provided database design consulting and development services to the Director of Information Systems
- Designed a 60-table client/server application against SQL Server, including 24 stored procedures
- Supported management of in-production systems
- Rendered reports in Access
- Deployed the system to production
0-5 years of experience
Developed Cross Channels Data Models for Integrated Data Warehouse.
- Analyzed the business and data requirements in detail with project stakeholders.
- Performed detailed attribute analysis.
- Developed 3NF logical model for ATM & Mobile subject areas.
- Used Forward Engineering to generate DDL for creation of physical data model.
- Used CA Erwin to develop data models.
- Performed quality checks on data models.
- Enforced standards and best practices around data modeling effort.
- Conducted data model reviews with project team members.
- Managed loading of reference data into the data warehouse.
0-5 years of experience
Subject Matter Expert in scripting CA ERWIN CASE tool.
- Analyzed business requirement documents and translated them into a canonical data structure.
- Used Levenshtein distance to determine duplicates across ERwin models.
- Provided technical assistance and mentoring to staff on understanding the canonical architecture.
- Leveraged ERWIN scripting APIs to automate canonical entity and attribute creation.
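Levenshtein distance, as used above to flag duplicate entities and attributes across ERwin models, can be computed with the classic dynamic-programming recurrence. A small Python sketch; the attribute names and the matching threshold are assumptions, not details from the source:

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance via the standard dynamic-programming recurrence,
    keeping only the previous row to save memory."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def likely_duplicates(names_a, names_b, max_dist=2):
    """Pair up names from two models whose edit distance is small,
    ignoring case — a crude duplicate-candidate filter."""
    return [(x, y) for x in names_a for y in names_b
            if x != y and levenshtein(x.lower(), y.lower()) <= max_dist]

print(likely_duplicates(["CUST_NAME", "ORDER_DT"], ["CUSTNAME"]))
# → [('CUST_NAME', 'CUSTNAME')]
```

In practice the name lists would be exported from the ERwin models via its scripting API, and candidate pairs would still be reviewed by a modeler before merging.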
0-5 years of experience
Tasked to design and implement a data mart to cater to all Securities Accounting reporting business needs. Design involves creating logical/physical data models, extensive source system data analysis as well as data extraction design from transactional systems, Securities operational data store (ODS) and the Enterprise data warehouse.
- Designing of conceptual, logical and physical data models.
- Translation of business requirements into design and technical specifications for implementation.
- Working with various business stakeholders to identify business reporting needs (current/future).
- Coordination with business analysts, SMEs, ETL developers, and the QA team for the implementation of the data mart.
0-5 years of experience
Environment: IBM DB2 9.5, TOAD, Erwin 8, IBM DataStage, Business Objects
- Perform business analysis and requirements gathering to define database and reporting requirements
- Build and maintain enterprise data models and metadata
- Create and maintain IBM DB2 DDL for enterprise data warehouse involving tables, views, and materialized query tables
- Create and maintain data transformation logic for IBM DataStage
0-5 years of experience
- Worked with the business analyst team on requirements gathering and, based on the provided business requirements, defined detailed technical specification documents.
- Analyzed the Business information requirements and examined the OLAP source systems to identify the measures, dimensions and facts required for the reports.
- Performed the data source mapping.
- Utilized Power Designer’s forward/reverse engineering tools and target database schema conversion process.
- Documented logical, physical, relational and dimensional data models. Designed the data marts in dimensional data modeling using star and snowflake schemas.
- Redefined attributes and relationships in the model and cleansed unwanted tables/columns as part of data analysis responsibilities.
- Reviewed the data model with the functional and technical teams.
- Worked on the reporting requirements and involved in generating the reports for the Data Model.
- Assisted developers, the ETL and BI teams, and end users in understanding the data model.
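A star schema like the data marts described above keeps numeric measures in a central fact table with foreign keys to descriptive dimension tables. A self-contained SQLite sketch with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Dimensions carry descriptive attributes; the fact table holds measures
# plus a foreign key to each dimension (names are illustrative only).
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, cal_date TEXT, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales  (
    date_key     INTEGER REFERENCES dim_date(date_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    sales_amount REAL
);
INSERT INTO dim_date    VALUES (1, '2024-01-15', '2024-01');
INSERT INTO dim_product VALUES (10, 'Widget');
INSERT INTO fact_sales  VALUES (1, 10, 250.0), (1, 10, 150.0);
""")
# Typical report query: join the fact to its dimensions and aggregate.
cur.execute("""
SELECT d.month, p.product_name, SUM(f.sales_amount)
FROM fact_sales f
JOIN dim_date d    ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.month, p.product_name
""")
print(cur.fetchall())  # → [('2024-01', 'Widget', 400.0)]
```

A snowflake variant would further normalize the dimensions (e.g. splitting product category into its own table) at the cost of extra joins.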
0-5 years of experience
As a consultant, provide design and development support for enterprise applications for a large federal client. Responsibilities include:
- Developing conceptual, logical, and physical database models using Infosphere Data Architect for an Oracle DBMS
- Designing and developing a data mart to support complex Business Objects analytical reporting requirements using dimensional modeling
- Utilizing the SCRUM agile methodology for design and development
- Analyzing legacy systems in support of modernization efforts
- Ensuring adherence to enterprise data architecture standards and guidelines
- Designing ETL routines
- Performance tuning via query optimization and database tuning
0-5 years of experience
- Used R, SAS, Minitab, SQL Server, and Microsoft Office software to condense large, complex data and theories into easy-to-understand solutions
- Used R, SQL Server, and Excel to analyze data to gauge performance, quality and trending of business KPIs
- Developed statistical models to solve business bottlenecks
- Performed root-cause analysis on internal and external data to identify points of interest and to improve
- Used SQL Server and Excel to perform ad-hoc data reporting for multiple business areas
- Liaised between teams to gather and communicate information for process improvement
0-5 years of experience
Working in the Manufacturing Information Systems & Validations group, responsible for delivery of systems, analytics, and operational reporting dashboards for clinical products, lab trials, disposition of manufacturing lots, vendors, test results, validation, and vendor test/result document management. Responsible for providing solution design and data architecture for multiple projects across Java-based applications and business intelligence applications. Working on projects such as PQS, Gtrack, and GVMS.
- Provided the high-level design and data architecture for the GVMS Phase 2 project
- Provided the high-level design and data architecture for the Gtrack project
- Working on the IQS-to-PQS data migration project, providing data architecture and data mapping between IQS and PQS.
- Driving initiatives with the IT infrastructure and DBA teams to build the servers and databases for the PQS UAT, VAL, and PROD instances.
- Provided PQS migration data flow diagrams and data migration specs.
- Worked on the GVMS Phase 2 redesign and data architecture.
- Worked on Veeva integration with BI and provided the high-level design document
- Worked on data flow mapping templates and documents in the Lab Information Management (LIMS) group
6-10 years of experience
- Liaised with the senior leadership team to understand strategic BI initiatives with regard to pricing; worked with process owners to understand goals/tactics, KPIs, and metrics; modeled the business presentation layer using Cognos Framework Manager; and handled creation and distribution of cubes, development of dashboard reports and active reports, and ad hoc analysis reports.
- Designed and developed an executive dashboard solution that provided cross-subject-area analysis for the senior leadership team. Developed and maintained balanced scorecard reports based on multi-dimensional cubes built against the sales, production, and deliveries data marts.
- Designed, developed, and maintained a BI solution built against the sales data mart. Authored monthly sales summarization reports to help financial analysts in monthly close activities and to help BU leads in evaluating business performance, budgeting, and forecasting.
- Designed, developed, and maintained a BI solution that supported salesperson performance evaluation and compensation; maintained a commission calculation solution that enabled the business to define commission plans for salespeople, district managers, and regional managers; developed automated ETL processes for master data needed by the application; maintained a complex commission calculation engine built entirely in MS SQL; and modeled a Cognos solution to create Cognos pay statement reports for salespeople, district managers, and regional managers.
0-5 years of experience
- Design, build and deliver fact-based models and analytics by writing complex SQL queries and using a variety of statistical and data mining tools
- Employ and develop proprietary models and algorithms that leverage existing client data to identify bottlenecks to profitable sales and margin improvement
- Design and present model results in a straightforward, clean and impactful manner using data visualization tools
- Transform market-based research and transactional data into sales and marketing related insights that maximize customer ROI
- Manage multiple deliverables, projects and associated team communication in a dynamic environment
0-5 years of experience
Created data marts for Healthcare Physician application and billing system.
- Prepared OLAP tool MS Analysis Server and COGNOS comparisons and competency documents.
- Gathered business requirements and performed data analysis and data profiling of the key physician data management system maintained in PROGRESS and of web application data in a MySQL DB.
- Designed ETL programs using Microsoft Data Transformation Server and Transact-SQL to feed the data from the upstream systems.
- Built data cubes using COGNOS Power Play transformer and key decision reports using Power Play Web Reports.
- Designed, developed, and deployed the COGNOS Enterprise Performance Manager setup and administration across the company; actively involved in end-user training, with the help of the COGNOS team, on using the tool and developing ad hoc reports in COGNOS Impromptu.
- Architected a healthcare data warehouse comprising physician, customer, and network providers. Designed and reviewed complex clinical and physician data models; performed requirements gathering, data analysis, and logical/physical data modeling using ERwin. Migrated 600K physicians' information from Informix to Oracle.
- Prepared the high-level design and reviewed the low-level design documents and Visio diagrams to build dimension and fact tables. Set up the ETL design and architecture. Prepared the master test plan and test scenarios for SIT and UAT.
- Built the EDW using snowflake modeling with Inmon ODS/EDW methodologies. Interviewed key stakeholders and end business users for requirements gathering, data analysis, and data profiling. Designed and developed ETL programs using Ascential Prism Data Warehouse to extract data from COBOL files from mainframe transaction systems. Designed the Control-M jobs to schedule the batch programs in the stage and production environments.
0-5 years of experience
- Participate in the design of data mart fact and dimension tables within a product distribution EDW supporting Sales and Marketing.
- Perform data profiling / auditing in an AS/400 / DB2 environment.
- Define mapping documentation for the ETL process.
- Define business requirement documents.
- Utilize an Access database to maintain the metadata repository.
- Extrapolate business rules from COBOL source code.
- Participate in Functional Testing for the ETL process.
- Design Functional Testing Standard Operating Procedures.
0-5 years of experience
- Analyze business requirements and reports.
- Complete data profiling using IBM Data Analyser.
- Develop relational and/or dimensional conceptual, logical, and physical data models.
- Develop data ETL (Extract, transformation and load) transformation logic for IBM data stage.
- Create volumetric information to guide the DBA in deciding between 4K and 16K tablespaces.
- Plan, design, and implement index, partitioning key, and range partitioning strategies on a multi-partitioned database to minimize data shipping between partitions and improve performance.
- Create Data Definition Language (DDL) and Data Manipulation Language (DML) pre, post, and backout scripts. Also create permission files to help the DBA provide grants.
- Create Functional test cases to test business rules in DIT, SIT, UAT and PRD environment.
- Provide scripts to Technical Analyst to access data from the source DB.
- Maintain IBM DB2 DDL for enterprise data warehouse.
0-5 years of experience
Gather requirements from SMEs
- Reverse engineer databases and create data models using ERwin
- Regroup the tables in diagrams as required by SMEs
- Compare the database across different servers using ERwin
- Create production data models
- Perform complete compares of different environments.
- Generate PDFs of the data models and send them to the project teams
- Conduct review sessions to explain the data models.
- Generate DDL for the differences by forward engineering in ERwin and send it to the DBAs
- Create a change ticket for the DDL to be deployed on different servers
- Attend weekly meetings to update the project status
0-5 years of experience
Design/DBA/development/production support for Oracle 8i databases and the Oracle Financials ERP application; monitor interface programs that move data from the back office to financial applications. Process vendor invoice data files into financial interface tables, then into application tables, and finally match invoices with purchase orders. Manage fallout from inventory item loading and from the purchase order, vendor, and EDI interfaces.
- Data model design and extensions to the data schema to support functional enhancements. Provide technical guidance to developers in understanding the data model.
- Design and enhancements to the data warehouse star schema and ETL scripts to incorporate changes to the production data model and reflect new business rules and partner-specific logic. Changes were kept transparent to existing applications, universes, and reports.
- Environment: Oracle Applications 11.5.7 & 11.5.9 (PO, Inventory, HR, AOL, System Administrator), Oracle 8i, SQL*Plus 8.0, PL/SQL, TOAD, SQL Loader, Data Loader