TIMS – ODOT’s Mapping Portal Garrett Staats, Solutions Architect, ODOT Melissa Williams, Senior Project Engineer, Stantec Ian Kidner, GIMS Systems Admin, ODOT 2013 Ohio GIS Conference September 11 - 13, 2013 | Columbus Marriott Northwest | Dublin, Ohio
TIMS Transportation Information Mapping System A tool providing centralized access to ODOT data while minimizing the level of technical expertise required. TIMS also provides sophisticated tools for viewing, analyzing, and communicating ODOT information, allowing for greater collaboration between ODOT, its customers, and its business partners
Geodatabase Schema • Primary criteria: data efficiency • Create feature classes from event tables • Standardize domains, create new domains • Create indexes • Customized to ODOT source data vs. using ESRI’s Roads & Highways Data Model • Allow transfer of data without interrupting web services
Geodatabase Schema • Final data structure: • 40 feature classes, no feature datasets • PRJCT – ODOT project data • PVMT – Pavement Management System (PMS) data • RD_INV – state and county routes, roadway characteristics, ownership, rail crossings, scenic byways, maintenance authority, truck routes, etc. • REFER – cities, counties, ODOT districts, railroads, urban areas, etc. • STRUCT – bridges, culverts, outfalls • TRAFFIC – segments, stations • 4 data tables • 1 relationship class
Geodatabase Schema • Domains: • Standardized across feature classes • County names to standard ODOT 3-letter abbreviation • Direction to 1 or 2-letter abbreviation • Yes/No/Unknown/NA to 1 or 2-letter abbreviation • Month to short integer • Created new coded value domains • Railroad operators • “Grand Trunk Western Railroad Incorporated” vs. “GTW” • Culvert channel protection • “CONCRETE RIP RAP” vs. “RP”
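For illustration, a minimal arcpy sketch of creating and binding one of these coded value domains; the geodatabase path and feature class name are placeholders, not ODOT's actual names:

```python
import arcpy

# Placeholder paths/names for illustration only.
gdb = r"C:\data\TIMS.gdb"

# Create a coded value domain for railroad operators and add one code;
# the short code is what gets stored, the long name is its description.
arcpy.CreateDomain_management(gdb, "RR_OPERATOR",
                              "Railroad operator abbreviations",
                              "TEXT", "CODED")
arcpy.AddCodedValueToDomain_management(
    gdb, "RR_OPERATOR", "GTW",
    "Grand Trunk Western Railroad Incorporated")

# Bind the domain to a field so edits are constrained to the coded values.
arcpy.AssignDomainToField_management(gdb + r"\RD_INV_RailCrossings",
                                     "OPERATOR", "RR_OPERATOR")
```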
Geodatabase Schema • Edited schema using the X-Ray for ArcCatalog add-in • Provides a spreadsheet interface to the geodatabase, with separate worksheets for each dataset and domain, and exports an XML workspace file
Geodatabase Schema • Used ArcGIS Diagrammer to create .htm schema report
Develop ETLs • ETL = Extract, Transform, Load • Used ModelBuilder to create 10 ETLs • Transfer source data from Oracle into the SDE geodatabase • Requires the Data Interoperability extension • Transfers run on a routine basis, so the models were converted to geoprocessing services that can be scripted • Need to create routes first, then create feature classes from the event tables • Need to use versioned editing to minimize interruption to the data
Develop ETLs • Basic outline of an ETL (sketched in arcpy below): • 1) Create new version in target dataset based on timestamp • 2) Delete all features from target dataset • 3) Copy source data to scratch.gdb • 4) Perform calculations on fields in scratch.gdb • Lower to upper case • Strip blanks • Parse field (e.g. address) • Convert to coded domain values • Replace values (e.g. 0/1 to N/Y) • 5) Make route event layer • 6) Append into target dataset • 7) Reconcile and post to default version • 8) Change version to default, delete temporary version
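A minimal arcpy sketch of that outline, using 10.1-era tool names; the connection file, dataset names, and event-table fields below are placeholders, not ODOT's actual schema:

```python
import arcpy
import datetime

sde = r"C:\connections\tims@sqlserver.sde"          # placeholder connection file
target = sde + r"\TIMS.DBO.RD_INV_TruckRoute"        # placeholder target feature class
routes = sde + r"\TIMS.DBO.RD_INV_Routes"            # placeholder route feature class
src = r"C:\connections\odot_oracle.sde\SRC.TRUCK_ROUTE_EVENTS"  # placeholder source
scratch = r"C:\temp\scratch.gdb"

# 1) New version off DEFAULT, named by timestamp
version = "ETL_" + datetime.datetime.now().strftime("%Y%m%d%H%M%S")
arcpy.CreateVersion_management(sde, "dbo.DEFAULT", version, "PRIVATE")

# 2) Point a layer at the new version and clear it
#    (version owner prefix depends on the connected user)
arcpy.MakeFeatureLayer_management(target, "target_lyr")
arcpy.ChangeVersion_management("target_lyr", "TRANSACTIONAL", "dbo." + version)
arcpy.DeleteFeatures_management("target_lyr")

# 3-4) Stage the source event table in scratch.gdb and clean fields there
arcpy.TableToTable_conversion(src, scratch, "events_tmp")
tmp = scratch + r"\events_tmp"
arcpy.CalculateField_management(tmp, "COUNTY", "!COUNTY!.strip().upper()", "PYTHON")

# 5) Locate the events along the routes
arcpy.MakeRouteEventLayer_lr(routes, "NLFID", tmp,
                             "NLFID LINE BEGIN_LOGPT END_LOGPT", "events_lyr")

# 6) Load the located events into the versioned target
arcpy.Append_management("events_lyr", "target_lyr", "NO_TEST")

# 7-8) Reconcile/post to DEFAULT, then drop the temporary version
arcpy.ReconcileVersions_management(sde, "ALL_VERSIONS", "dbo.DEFAULT",
                                   "dbo." + version, with_post="POST",
                                   with_delete="DELETE_VERSION")
```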
Develop ETLs • Problem: • Could not view attributes for table views on Oracle spatial data • Solution: • Perform Feature Class to Feature Class into scratch.gdb even if no field calculations are required, then define the projection before appending to the final dataset.
Develop ETLs • Problem: • Logpoint values appeared incorrectly • ODOT source data - 1.27 (float) • SQL Server SDE target geodatabase - 1.26999998 (field defined as float with precision/scale of 0/0 in the xml schema) • Explanation: • SDE does not recognize the ArcGIS float data type, so it defaults to the double type with a maximum precision/scale of 38/8. • Conversion of float to double yielded extra unwanted precision. • Solution: • Define the field with a specific precision/scale of 8/3 in the xml schema. • Convert float to string in the initial Feature Class to Feature Class step; then append the string field to the double target field.
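The stray digits are ordinary single-precision rounding; a quick Python illustration of the same conversion:

```python
import struct

# 1.27 has no exact binary representation; storing it as a 32-bit float
# and widening it to a 64-bit double exposes the rounding error.
as_float = struct.unpack('f', struct.pack('f', 1.27))[0]
print(as_float)            # 1.2699999809265137 -> displayed as 1.26999998
print(round(as_float, 3))  # 1.27 -- why a precision/scale of 8/3 fixes the display
```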
Develop ETLs • Problem: • ChangeVersion tool doesn’t work with table views as input (documented bug NIM079735 in ArcGIS 10.0). • Solution: • Created custom Python script to create connection file on the fly to change versions.
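A sketch of that workaround pattern: generate a version-specific .sde connection file with the Create ArcSDE Connection File tool, then reopen the data through it instead of calling ChangeVersion on the table view. Server, port, credentials, and dataset names below are placeholders:

```python
import arcpy
import os

def connect_to_version(version_name, out_dir=r"C:\temp"):
    """Create a throwaway .sde connection file pointed at the given
    version, so data can be re-opened there instead of calling
    ChangeVersion on a table view (bug NIM079735 at 10.0)."""
    name = "tims_" + version_name.replace(".", "_") + ".sde"
    arcpy.CreateArcSDEConnectionFile_management(
        out_dir, name, "gisdb01", "5151", "TIMS",
        "DATABASE_AUTH", "tims_etl", "********", "SAVE_USERNAME",
        version_name, "SAVE_VERSION")
    return os.path.join(out_dir, name)

# Usage: open the table through the version-specific connection
conn = connect_to_version("DBO.ETL_20130911")
table = conn + r"\TIMS.DBO.RD_INV_TruckRoute"
```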
Develop ETLs • Problem: • Error when deleting points in an origin feature class participating in a relationship. • Solution: • Changed foreign key to be nullable to allow for deletion of records.
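Assuming the foreign key is a plain attribute column, its nullability can be relaxed with SQL sent through an arcpy connection; a sketch only (table and column names are placeholders; test against a copy first):

```python
import arcpy

# Assumption: the FK is a regular attribute column, so restating its type
# with NULL allowed is a safe schema change. Names are placeholders.
conn = arcpy.ArcSDESQLExecute(r"C:\connections\tims@sqlserver.sde")
conn.execute("ALTER TABLE TIMS.DBO.STRUCT_Outfall "
             "ALTER COLUMN CULVERT_ID int NULL")
```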
System Architecture • System Design Goals
System Architecture • ETLs deployed to the SDE server as GP tools • Executed as scheduled Python script tasks • The Python script emails users on failure or completion • Administrative tasks (compress, update statistics, etc.) have been scripted as well
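A sketch of what such a scheduled wrapper might look like; the toolbox path, tool alias, SMTP host, and addresses are placeholders:

```python
import arcpy
import smtplib
import traceback
from email.mime.text import MIMEText

TOOLBOX = r"C:\etl\TIMS_ETLs.tbx"     # placeholder toolbox
NOTIFY = "gis-admin@example.com"      # placeholder address

def notify(subject, body):
    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = "etl@example.com"
    msg["To"] = NOTIFY
    smtplib.SMTP("mailhost").sendmail(msg["From"], [NOTIFY], msg.as_string())

try:
    arcpy.ImportToolbox(TOOLBOX)
    arcpy.TruckRouteETL_tims()        # hypothetical tool name and alias
    notify("TIMS ETL completed", arcpy.GetMessages())
except Exception:
    notify("TIMS ETL FAILED", traceback.format_exc())
    raise
```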
System Architecture • Problem • Geoprocessing tools "timing out" • Explanation • TCP sessions are terminated after prolonged idle time • Solution • Ensure the GP tool is never idle for more than 30 minutes
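One plausible way to satisfy that constraint (an assumption, not necessarily ODOT's exact fix) is to emit a progress message from inside the long-running loop so the session never sits silent:

```python
import arcpy
import time

table = r"C:\connections\tims@sqlserver.sde\TIMS.DBO.RD_INV_Routes"  # placeholder
HEARTBEAT_SECS = 600   # stay well under the ~30-minute idle cutoff
last = time.time()

with arcpy.da.SearchCursor(table, ["OID@"]) as cursor:  # placeholder workload
    for i, row in enumerate(cursor):
        # ...long-running work per record...
        if time.time() - last > HEARTBEAT_SECS:
            arcpy.AddMessage("still working, %d records done" % i)
            last = time.time()
```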
System Architecture • Problem • Slow performance from the initial Oracle geodatabase • Explanation • ESRI was able to reproduce similar performance • The Oracle interface was being overhauled in subsequent releases of ArcGIS, so the issue would not be addressed in 10.0 • Solution • Migrated to SQL Server; initial testing yielded substantially better performance
System Architecture • Problem • Several fields from Oracle tables were not being read properly in ETLs or ArcCatalog • Solution • Changed the OLE DB provider from the Oracle Provider for OLE DB to the Microsoft OLE DB Provider for Oracle
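For reference, the two providers differ in the ProgID used in the connection string; a small illustration using the third-party adodbapi module (data source and credentials are placeholders):

```python
import adodbapi  # third-party ADO wrapper, used here only for illustration

# Oracle's own provider (the one that misread fields in this setup):
#   "Provider=OraOLEDB.Oracle;Data Source=ODOTPROD;..."
# Microsoft's OLE DB Provider for Oracle (the one that worked):
conn = adodbapi.connect(
    "Provider=MSDAORA;Data Source=ODOTPROD;"
    "User ID=tims_ro;Password=********;")
```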
System Architecture • Problem • Geodatabase field definitions differed from the actual data • NLFID fields in the geodatabase defined as 14-character text (NLFID is a common 14-character text value used in the department) • The ETL was failing due to bad data - NLFID values in one dataset contained 15- and 16-character values, causing the failure • Solution • Alter the geodatabase schema to reflect the actual data (even if technically incorrect)
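A quick way to find the offending rows before loading, sketched with a 10.1-style arcpy.da cursor (table and field names are placeholders):

```python
import arcpy

# Flag over-length NLFID values before the ETL runs.
table = r"C:\connections\odot_oracle.sde\SRC.ROAD_EVENTS"  # placeholder
with arcpy.da.SearchCursor(table, ["OID@", "NLFID"]) as cursor:
    for oid, nlfid in cursor:
        if nlfid and len(nlfid) > 14:
            print("OID %d: NLFID '%s' is %d characters" % (oid, nlfid, len(nlfid)))
```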
System Architecture • Problem • ETLs with ODPS crash data were failing or taking a long time to complete • Memory usage would increase, then the service would crash • Explanation • The 'delete features' operation fails with large datasets • Would consume close to 3.4 GB of RAM, then crash • Solution • The ETL originally deleted features from the versioned table (10.0 has no truncate function) • Migrated the ETL to 10.1 and used a manual process to truncate and update the data
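A sketch of the 10.1 approach: the Truncate Table tool works against the base (unversioned) table, which is why this became a manual step outside the versioned ETL; dataset names are placeholders:

```python
import arcpy

# Placeholders: the live crash feature class and the staged replacement data.
crash = r"C:\connections\tims@sqlserver.sde\TIMS.DBO.ODPS_Crash"
staged = r"C:\temp\scratch.gdb\crash_tmp"

arcpy.TruncateTable_management(crash)             # fast delete, minimal memory
arcpy.Append_management(staged, crash, "NO_TEST") # reload the refreshed data
```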