Chip's Resume


Data Warehouse Architect

e-mail: chip@chiplynch.com

I am a data warehouse architect with expertise in the design, implementation, and maintenance of enterprise-scale business intelligence and data migration projects. I have led teams through every phase of data warehouse creation, and I bring experience from high-profile, large-scale clients such as NASA, the City and County of San Francisco, the US Air Force, and the US Postal Service. I have experience with the Capability Maturity Model (CMM) development lifecycle standards, through which I have learned processes for running organized, repeatable, and successful data warehousing projects. I have personally designed data models, built ETL and reporting architecture, and administered databases and application servers.


DoD Clearance: Top Secret


Experience:


Wright-Patterson Air Force Base, Dayton, OH

Data Warehouse Contractor (2003-Present)
US Air Force, GLSC Project, December 2010 – Present

  • Performed systems and network analysis, visualizations, and requirements planning for an IT assessment and roadmap of GLSC-owned and -managed systems

US Air Force, ECSS Project, August 2008 - December 2010

  • Technical lead for several of the Data Management Organization's initiatives, including Master Data Maintenance, Metrics and Monitoring, Business Rule Consolidation, Data Acquisition and Archival, and other data-quality-focused products
  • Team lead for a Business Intelligence study focused on architecting methods to bridge data from hundreds of legacy systems and existing reporting with the new policies, procedures, and data introduced by the ERP; the design stretched from high-level governance and policy issues to technical BI architecture and enterprise warehouse design
  • Performed in-depth data analysis as team lead for a Master Data pilot, resulting in methodologies for ongoing analysis and quality monitoring and in specific recommendations on trouble areas for data migration to the Oracle ERP

US Air Force, Data Warehouse, June 2003 – August 2008

  • Served as the technical team lead for a development team that integrated over 50 maintenance, supply, logistics, and asset systems into the data warehouse, including D043, D200A, D035A, D035K, and G100
  • Built standards and repeatable CMMI processes for every aspect of Data Warehouse management, including modeling and loading new source data, data mart processing, report specification and building, and ongoing maintenance
  • Designed and supported deployments of various versions of Oracle, WebSphere MQ, Business Objects, Cognos, Informatica, and Teradata in a closely integrated environment


FirstEnergy Corp., Akron, OH, Data Migration Contractor (2002-2003)

  • Supplied technical expertise and programming support for a large-scale SAP R/3 migration using Informatica, Oracle, and SAP ABAP tools.
  • Worked closely with both functional and technical teams to design, develop, test, and deploy a migration mechanism for Human Resources data.

US Postal Service, Raleigh, NC, Data Warehouse Contractor (2002)

  • Provided Informatica and data warehousing expertise for the Postal Service's very large data warehouse initiative.
  • Focused primarily on improving the performance of very large data loads to the warehouse; for example, reduced the running time of some loads from 80+ hours to 4 hours.

Clients with KPMG Consulting, Washington, DC, Senior Consultant (1998-2002)

City and County of San Francisco (2000-2001)

  • Designed and implemented a terabyte-scale data warehouse for the City.
  • Responsible for day-to-day Oracle tuning and DBA work as well as Informatica and Cognos design, development, and performance tuning; also administered large NT and HP-UX servers.
  • Marketed the warehouse to senior department heads, eventually creating departmental data marts and a distributed Oracle environment.
  • Developed end user training, documentation, and robust testing/validation procedures.


National Aeronautics and Space Administration (NASA) (1999-2000)

  • I was part of the team that designed and developed an enterprise-wide data warehouse (EDW) for NASA. As the main technical resource on the project, I researched, specified, purchased, and administered $1.5 million of Sun hardware running the Solaris operating system.
  • I was involved in the entire lifecycle of this EDW: I met with scores of functional staff from across the country to develop logical and physical data models, and I led a small team in the Informatica development for data transformation and cleansing from disparate source systems to the Oracle EDW. We worked with the Holos OLAP package, Crystal Reports, and Brio ad-hoc query tools to deliver a web-based reporting solution to the client's specifications.

Performance Executive (various clients)

  • Performance Executive was a packaged data warehouse that we marketed to clients of our proprietary accounting systems. I was one of several technical resources on the project throughout my tenure at KPMG and was involved in every aspect of the product's creation, testing, and marketing. The product was primarily back-ended in Oracle 7, 8, or 8i and Microsoft SQL Server, with an Informatica engine for data cleansing and transformation. The front-end tools of choice were Cognos Impromptu and PowerPlay, although some clients used Business Objects, Brio, MicroStrategy, and Crystal Reports.

US Senate, Washington, DC (Fall 1999)

City of Ottawa, Canada (as needed 2000)

Dahlgren Naval Base, Dahlgren, VA (as needed, 1999-2000)

Wright-Patterson Air Force Base, Dayton, OH (Summer 1998)

Oakland County, Michigan (Sept. 2001-Feb. 2002)


Other Work Experience:

Xavier University, Cincinnati, Ohio, Webmaster/Programmer/Analyst (Fall 1993 - March 1998)

Farwell and Hendricks, Cincinnati, Ohio, Senior Thesis Project

Worked with Farwell and Hendricks on a senior thesis project to develop a Windows 95 program to collect and analyze data for electronic devices. The program collected 5,000-10,000 samples/second from hardware placed on an earthquake simulator, displayed graphs, and analyzed the results to detect instabilities in electric current. Neat stuff!


Education

Xavier University Cincinnati, Ohio

  • Double Bachelor of Science in Mathematics and Computer Science


Technical Experience

Extensive/Up-To-Date Knowledge of

  • Data Warehousing best practices
  • SEI/CMM project lifecycle management standards
  • Informatica PowerCenter (w/PowerConnect for SAP/DB2)
  • Cognos and Business Objects Business Intelligence Products
  • MySQL, Oracle and Teradata Database Products and SQL
  • ERwin Data Modeling Tool
  • Windows Server, Linux and Unix (Various) Operating Systems
  • Apache, PHP, HTML, TCP/IP, Other Internet Technologies
  • Desktop Publishing

Working Knowledge of

  • MicroStrategy, Brio, Holos, Crystal Reports
  • SAP R/3: ABAP and HR InfoTypes
  • Microsoft SQL Server
  • Apple Macintosh OS, OpenVMS
  • Sun, Hewlett-Packard, IBM, and Digital hardware
  • DCL, Pascal, C, C++
  • PGP, Numerical Cryptographic Theory, System Security
  • Numerous other software packages; feel free to ask