
Presentation Transcript


  1. xxxxxxxxxxxxx Name : xxxxxxxxxxxx ID : 2005HZxxxxxxx Organization : xxxxxxxxxxx Company Location : Bangalore

  2. Introduction The LaQuinta Enterprise Reporting Architecture is a strategic and tactical BI platform for the analysis and reporting of LaQuinta enterprise data. The following business groups are the major users:
  • Operations
  • Marketing
  • Sales
  • Finance
  • Revenue Management
  • Call Center
  • Information Technology

  3. Architecture The Enterprise Architecture is based on a 4-tier structure with "LaQuinta Data Warehouse 2.0" as the focal point. The four tiers are:
  • Legacy and transaction source systems
  • Data staging
  • LaQuinta Data Warehouse
  • Cognos presentation

  4. Fact Tables LAQ DW 2.0 adds five new fact tables and also surfaces more dimension data in the existing fact table. The following fact tables are used in the LaQuinta DW 2.0 project:
  • Reservation Fact (exists in DW 1.7)
  • Group and Tour Fact
  • Central Reservation Office (CRO) Fact
  • Denials Fact
  • Call Center Fact
  • Budget Fact

  5. Key Features of Version 2.0
  • Subject orientation - multiple subjects such as Folio, Call Center, Budgets, Sales, HR, etc.
  • Conformed dimensions - dimensions that can be shared across subject areas
  • Multiple aggregations - need-based aggregation for specific summary-level reporting
  • Standard naming conventions across the board
  • Optimized data structures using standard data warehousing principles

  6. Architecture Goals
  • Improved visibility of Enterprise BI
  • Subject orientation - add new areas for analysis and reporting
  • Enable user teams to create strategic and tactical dashboards
  • Make available ONE enterprise metadata dictionary (business/functional/operational)
  • Consistency in handling time, booking, stay, etc. across the board
  • Fix existing problems (data gaps, Cognos issues)
  • Remove inefficiencies in data structures, ETL, reports, and cubes
  • Standardize DW practices (features, design, naming, reference data…)
  • Lighter, simpler Cognos presentation layer
  • Scalable, flexible structure and processes
  • Ease of configuration and maintenance
  • Build upon the existing foundation

  7. Why Version 2.0? LaQuinta DW was the existing data warehouse application, built by another vendor. As the business evolved, many new features were needed; the existing system could not scale to accommodate the changes and also contained a number of issues/bugs. To address this, we planned and developed a new release, LaQuinta DW 2.0. LaQuinta uses an application called MQC (Mercury Quality Center) to log issues, bugs, and suggestions for all its projects.

  8. Project Schedule from MS Project Plan

  9. ETL Tool Used in the Warehouse The ETL tool used in this application is SQL Server 2000 DTS packages. The following slides describe the packages used in the LaQuinta Data Warehouse 2.0 project with respect to the Reservation Fact (a sketch of invoking such a package follows).
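For orientation, SQL Server 2000 DTS packages are commonly invoked from T-SQL via xp_cmdshell and the DTSRun utility, or wrapped in a SQL Server Agent job. This is a minimal sketch, assuming the package name from this deck (LOAD DATA WAREHOUSE) runs on the local server with Windows authentication; it is not taken from the actual project scripts.

    -- Minimal sketch: invoke the master DTS package from T-SQL.
    -- "(local)" and Windows authentication (/E) are assumptions.
    EXEC master.dbo.xp_cmdshell 'DTSRun /S "(local)" /E /N "LOAD DATA WAREHOUSE"'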

  10. Reservation Fact The transactional data from Night Vision Corporate (NVCorp) is extracted by the master DTS script on the Data Warehouse server at 7:00 AM EST every day and loaded into the Reservation Fact. The data flows into NVCorp from the Property Management System (PMS) at night; this process is called the Night Audit. The Night Audit captures every transaction that happens in a PMS and consolidates the data in NVCorp. The following slides detail the sub packages executed by the master DTS script. The DTS package used for loading the data into the Reservation Fact is: LOAD DATA WAREHOUSE

  11. Use Case Diagram for Reservation Fact

  12. Reservation Fact The following packages are used with respect to the Reservation Fact:
  • Daily Data Warehouse Dimension Load
  • Daily Data Warehouse Staging Load
  • Transactional Staging Feed
  • Transactional Fact Feed

  13. Reservation Fact Daily Data Warehouse Dimension Load This is a sub package of the master DTS package that runs on a daily basis ahead of the LAQ Data Warehouse fact load. This process loads the following dimensions (an update-then-insert sketch follows):
  • ROOMTYPE
  • COMPANY
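A daily dimension load of this kind is typically an update-then-insert, since SQL Server 2000 has no MERGE. The sketch below is a hypothetical ROOMTYPE load; the table and column names (DIM_ROOMTYPE, STG_ROOMTYPE, etc.) are illustrative, not taken from the actual package.

    -- Update descriptions of room types already present in the dimension.
    UPDATE d
    SET    d.ROOM_TYPE_DESC = s.ROOM_TYPE_DESC
    FROM   DIM_ROOMTYPE d
           JOIN STG_ROOMTYPE s ON s.ROOM_TYPE_CODE = d.ROOM_TYPE_CODE
    WHERE  d.ROOM_TYPE_DESC <> s.ROOM_TYPE_DESC

    -- Insert room types that arrived in staging but are not yet in the dimension.
    INSERT INTO DIM_ROOMTYPE (ROOM_TYPE_CODE, ROOM_TYPE_DESC)
    SELECT s.ROOM_TYPE_CODE, s.ROOM_TYPE_DESC
    FROM   STG_ROOMTYPE s
    WHERE  NOT EXISTS (SELECT 1 FROM DIM_ROOMTYPE d
                       WHERE d.ROOM_TYPE_CODE = s.ROOM_TYPE_CODE)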

  14. Reservation Fact Daily Data Warehouse Staging Load Each day the master Data Warehouse script calls this sub package to load data into the staging environment. The name of this DTS package is Load Static Data to STAGING and it is located on the DW server. Generally the following items are moved into the staging tables (a truncate-and-reload sketch follows):
  • Room Type
  • CRS Numbers
  • Company Data
  • Aptech Properties
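Static reference data like this is often staged with a simple truncate-and-reload each day. A minimal sketch, assuming hypothetical staging and source table names (STG_COMPANY, NVCORP.dbo.COMPANY):

    -- Refresh the company staging table from the source each day.
    TRUNCATE TABLE STG_COMPANY

    INSERT INTO STG_COMPANY (COMPANY_CODE, COMPANY_NAME)
    SELECT COMPANY_CODE, COMPANY_NAME
    FROM   NVCORP.dbo.COMPANY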

  15. Reservation Fact Daily Data Warehouse Staging Feeds This process loads the following items into the staging tables (an incremental feed sketch follows):
  • Daily Folio Charges
  • Cash Sale
  • Foliolink Reservation Rate
  • Zero Fact
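Transactional feeds such as the daily folio charges are usually pulled incrementally, restricted to the business day just posted by the Night Audit. A sketch with hypothetical names (STG_FOLIO_CHARGES, NVCORP.dbo.FOLIO_CHARGES, POST_DATE):

    -- Stage only yesterday's posted folio charges.
    INSERT INTO STG_FOLIO_CHARGES (FOLIO_NO, STAY_DATE, CHARGE_AMT)
    SELECT FOLIO_NO, STAY_DATE, CHARGE_AMT
    FROM   NVCORP.dbo.FOLIO_CHARGES
    WHERE  POST_DATE >= DATEADD(dd, DATEDIFF(dd, 0, GETDATE()) - 1, 0)  -- midnight yesterday
      AND  POST_DATE <  DATEADD(dd, DATEDIFF(dd, 0, GETDATE()), 0)      -- midnight today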

  16. Reservation Fact Daily Transactional Fact Feeds This process loads all the reservation details from the staging tables into the Reservation Fact table. The following items are loaded into the fact table (a sketch follows):
  • Daily Folio Charges
  • Daily Consumed Room Nights
  • Daily Consumed Revenue
  • Daily Cancels
  • Booking Fact
  • Booking Horizon
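The staging-to-fact step resolves the business codes carried in staging to the warehouse dimension keys. A sketch of the consumed room nights/revenue load, with all table and column names hypothetical:

    -- Load consumed room nights and revenue, resolving dimension keys.
    INSERT INTO RESERVATION_FACT
           (PROPERTY_KEY, ROOM_TYPE_KEY, STAY_DATE, ROOM_NIGHTS, ROOM_REVENUE)
    SELECT p.PROPERTY_KEY, rt.ROOM_TYPE_KEY, s.STAY_DATE,
           s.ROOM_NIGHTS, s.ROOM_REVENUE
    FROM   STG_CONSUMED s
           JOIN DIM_PROPERTY p  ON p.PROPERTY_CODE   = s.PROPERTY_CODE
           JOIN DIM_ROOMTYPE rt ON rt.ROOM_TYPE_CODE = s.ROOM_TYPE_CODE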

  17. Reservation Fact Load Data Warehouse Master Script
  • This is the MASTER DTS package that is executed once a day to load the Reservation Fact. It also loads the Booking Horizon and Booking Pace fact tables on a daily basis.
  • It is a scheduled job on the DW server, set to run each day starting at 7:00 AM Eastern Standard Time. It starts processing only if the Night Audit posting is 99% complete for that day. The job it polls for completion is a DTS package called "Load Booking Data"; if that job has not completed by 7:00 AM, the master script will not execute. (A sketch of such a gate follows.)
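The 99% gate can be expressed as a completion check before the load is kicked off. This sketch assumes a hypothetical NIGHT_AUDIT_STATUS control table and an Agent job wrapping the master package; neither name is from the actual project.

    DECLARE @pct_complete DECIMAL(5, 2)

    -- Share of properties whose Night Audit posting is done for yesterday.
    SELECT @pct_complete = 100.0 * SUM(CASE WHEN POSTED = 1 THEN 1 ELSE 0 END) / COUNT(*)
    FROM   NIGHT_AUDIT_STATUS
    WHERE  AUDIT_DATE = DATEADD(dd, DATEDIFF(dd, 0, GETDATE()) - 1, 0)

    IF @pct_complete >= 99.0
        EXEC msdb.dbo.sp_start_job @job_name = 'LOAD DATA WAREHOUSE'  -- assumed job name
    ELSE
        RAISERROR ('Night Audit posting below 99 percent; skipping DW load.', 16, 1)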

  18. Reservation Fact Data Audit Tool This is a daily process that collects occupancy and revenue numbers from the NVCorp, APTECH, and Data Warehouse tables and compares them by property and by stay date. The occupancy and revenue differences between NVCorp and APTECH, and between NVCorp and the Data Warehouse, are calculated by stay date. The comparison is stored in a table called DATA_AUDIT_TOOL in the staging database on the DW box (a sketch follows).
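The comparison itself amounts to joining the two occupancy/revenue summaries on property and stay date. A sketch; DATA_AUDIT_TOOL is named in this deck, but the summary tables and columns are hypothetical:

    -- Store the NVCorp vs. Data Warehouse deltas by property and stay date.
    INSERT INTO DATA_AUDIT_TOOL (PROPERTY_CODE, STAY_DATE, OCC_DIFF, REV_DIFF)
    SELECT nv.PROPERTY_CODE, nv.STAY_DATE,
           nv.ROOMS_OCCUPIED - dw.ROOMS_OCCUPIED,
           nv.ROOM_REVENUE   - dw.ROOM_REVENUE
    FROM   NVCORP_OCC_REV nv
           JOIN DW_OCC_REV dw ON dw.PROPERTY_CODE = nv.PROPERTY_CODE
                             AND dw.STAY_DATE     = nv.STAY_DATE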

  19. Group and Tour Fact The Group and Tour Booking fact is the fact table used to store only the group bookings made through the call center by the VAS agents.

  20. Use Case Diagram for GNT Fact

  21. Group and Tour Fact Extract Group Link Data In NV, there are three files of importance: Group Master, Group Header, and Group Link. Every time a property creates a group, it uses the Group Master first. Group information is available only on a daily basis, not by the hour. The hierarchy is Group Master --> Group Header --> Group Link. A group link is similar to a folio link and contains information such as: GROUPLINK NUMBER, SALES PERSON. To reduce the authorized count on a group booking, the associated folio link records have to be cancelled first. The folio number of the group reservation is designated as the group master. To identify the group booking details, we use the foliolink table in the NVCORP database, joining on the master number and group link number (a sketch follows).
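A sketch of that lookup; the foliolink table is named in this deck, but the column names are hypothetical:

    -- Group reservations are folios whose master and group link numbers are set.
    SELECT fl.FOLIO_NO, fl.MASTER_NO, fl.GROUPLINK_NO, fl.SALES_PERSON
    FROM   NVCORP.dbo.FOLIOLINK fl
    WHERE  fl.MASTER_NO    IS NOT NULL   -- folio designated as the group master
      AND  fl.GROUPLINK_NO IS NOT NULL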

  22. CRO Booking and Denials Fact CRO Booking The Pegasus BDE file is used to pull daily gross, net, and cancellation figures from the daily extract. The extract is filtered on Reservation Center ID. There are currently five reservation center IDs used to filter call center reservation data: 'RETRC', 'NBCRC', 'LQCRC', 'VAOPS', 'NETRZ' (a filter sketch follows). Denials Each day, as part of the Pegasus feeds, we receive the yield records, better known as denials data. LaQuinta receives both system-generated and user-generated denials; the following are typical denial reasons:
  • Closed To Arrival
  • Location Does Not Meet Needs
  • Minimum Length Of Stay Restriction
  • No Availability - Hotel Is Sold Out
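The reservation-center filter translates directly into a predicate over the staged BDE extract. A sketch; the five IDs are from this deck, while the table and column names are hypothetical:

    -- Keep only call-center reservations from the daily BDE extract.
    SELECT RESERVATION_CENTER_ID, CONFIRMATION_NO, GROSS_AMT, NET_AMT, CANCEL_FLAG
    FROM   STG_BDE_DAILY
    WHERE  RESERVATION_CENTER_ID IN ('RETRC', 'NBCRC', 'LQCRC', 'VAOPS', 'NETRZ')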

  23. Use Case Diagram for CRO Booking Fact

  24. Call Center Fact The Call Center Fact loads call center details such as the agent/operator who made the reservation, their skill set, denials information, and the call center location. The daily VAS extract files include all CRO feeds, such as:
  • DNIS Reference
  • DNIS Activity
  • Skillset Reference
  • Skillset Activity
  • Agent/Operator - VAS Agent Mapping

  25. Budget Fact This ETL process is normally meant to run on a yearly basis; however, it has been designed to run on demand at any time and is capable of detecting and applying changes during the year. The data comprises the budget values (room revenue, rooms-occupied percentage, and rooms available) for all properties for all 365 days of the year. The process extracts budget data from the Aptech datamart database and loads it into the LAQUINTA_DW data warehouse. The budget data is available only at the property level, in the TransBudget table of the Aptech datamart. (A sketch of the change-detection update follows.)
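Change detection on an on-demand refresh can be a guarded update against the source. A sketch; TransBudget is named in this deck, while BUDGET_FACT and the column names are hypothetical:

    -- Apply only the budget rows whose values actually changed.
    UPDATE bf
    SET    bf.ROOM_REVENUE    = tb.ROOM_REVENUE,
           bf.ROOMS_OCC_PCT   = tb.ROOMS_OCC_PCT,
           bf.ROOMS_AVAILABLE = tb.ROOMS_AVAILABLE
    FROM   BUDGET_FACT bf
           JOIN TransBudget tb ON tb.PROPERTY_CODE = bf.PROPERTY_CODE
                              AND tb.BUDGET_DATE   = bf.BUDGET_DATE
    WHERE  bf.ROOM_REVENUE    <> tb.ROOM_REVENUE
       OR  bf.ROOMS_OCC_PCT   <> tb.ROOMS_OCC_PCT
       OR  bf.ROOMS_AVAILABLE <> tb.ROOMS_AVAILABLE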

  26. Use Case Diagram for Budget Fact

  27. UAT Environment How was data initialization done at the start of the project? The following steps were used to initialize all the dimension tables used in the LaQuinta DW 2.0 project (a sketch follows):
  • Truncate all new tables.
  • Populate dimension data for each new dimension table from the existing dimension tables, after applying the required transformation logic.
  • Run the fact load process to load the new fact tables using data extracted from the source.
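A sketch of those three steps for one hypothetical dimension (all object names are illustrative, not from the actual project):

    -- Step 1: start the new table empty.
    TRUNCATE TABLE DIM_CALL_CENTER

    -- Step 2: seed it from an existing dimension, applying transformation logic.
    INSERT INTO DIM_CALL_CENTER (CALL_CENTER_CODE, CALL_CENTER_NAME)
    SELECT DISTINCT LOC_CODE, UPPER(LOC_NAME)   -- example transformation
    FROM   DIM_LOCATION

    -- Step 3: run the fact load against the freshly seeded dimensions.
    EXEC msdb.dbo.sp_start_job @job_name = 'LOAD DATA WAREHOUSE'  -- assumed job name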

  28. UAT Environment Method Used for Data Validation The following methods are used for data validation on both the ETL and Cognos sides.
  • Cube/Report validation: run queries against the data warehouse; the values from cubes/reports should match the query results.
  • ETL validation: run the same queries against the source data and against the data warehouse; the two query results must match. (A sketch follows.)
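A sketch of one such ETL check, with hypothetical table names and an arbitrary example date: the same aggregate is computed against source and warehouse, and the two totals must be equal.

    -- Total revenue for one stay date, from the source...
    SELECT SUM(CHARGE_AMT) AS src_revenue
    FROM   NVCORP.dbo.FOLIO_CHARGES
    WHERE  STAY_DATE = '20070101'

    -- ...and from the warehouse; any difference points to an ETL gap.
    SELECT SUM(ROOM_REVENUE) AS dw_revenue
    FROM   RESERVATION_FACT
    WHERE  STAY_DATE = '20070101'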

  29. UAT Deployment Diagram [Deployment diagram; recoverable details follow.] CURRENT: DW 1.7 on 10.240.200.105 runs its daily chain (Staging and Dimension Delta Load, DW Load, Cube spin) between 7 AM and 3 PM against Staging_DW and LaQuinta_DW, sourced from NVCorp, BDE, and VAS, producing 72 reports (47 DW, 25 Aptech) and 6 cubes. DW 2.0 on server 10.240.200.21 runs the same chain between 9 PM and 5 AM against Staging_DWH and LaQuinta_DWH, producing 47 reports and 10 cubes. UAT notes:
  • Run two ETL processes from NVCorp.
  • Keep two sets of cubes and reports for UAT comparison the next day at 8 AM CST.
  • Copy production cubes to the TEST portal and rename them.

  30. ETL Test Results from MQC LaQuinta uses an application called MQC (Mercury Quality Center) to log issues, bugs, and suggestions for all its projects. During UAT, we ran the test cases and logged the results in this issue tracking tool; the test results graph is shown below.

  31. Target Deployment Diagram

  32. Summary With the implementation of DW 2.0, we have been able to provide LaQuinta business users with the benefits of in-depth data analysis and projections, while bringing additional business components into the data warehouse model. We were also able to streamline the existing version to meet data warehousing standards. Along with these inclusions, we delivered performance enhancements across the complete data warehouse model as a value-add and fixed a few known issues/bugs.

  33. Future Work Once DW 2.0 is released into production, we will move into the Production Support phase, where the entire data warehouse environment will be monitored closely until the system stabilizes and any issues that crop up during the initial stages are fixed. Once the data warehouse system is stable, we will start on the next level of enhancements as part of DW 2.1.

  34. Thank You
