HST Proposal Processing, Planning and Scheduling
• Solaris desktops talking to a Solaris disk server via NFS
• Sybase database engine running on Solaris
• Software (Trans, Spike, SPSS, PASS, etc.) binaries are Solaris-only
Mac Integration Goals
• Provide additional data-analysis support capabilities not directly obtainable on Solaris (e.g., MS Office)
• Provide superior single-thread performance compared to the current Sun workstations
  • Allows data analyses and studies to be done faster and more efficiently
• Integrate in a way that 'looks' like the Solaris environment
  • Personnel familiarity
  • Cross-platform tools/scripts likely to work out of the box
• Must be careful not to remove Solaris compute capability required for the process
NFS
• Mount the data partitions from the Solaris server onto the Mac
• Mount point set up to be /data, which means the paths on the Mac are identical to those on Solaris
• UIDs/GIDs on the Mac synchronized with Solaris, so the same file-protection schemes work between Solaris and Mac OS X
  • Note: Solaris ACLs and Mac OS X ACLs are different and not mutually recognizable
• OK to mount from Solaris to Mac OS X (read/write); likely problems exporting from Mac OS X to Solaris (HFS+ [case-insensitive/case-preserving] vs. UFS [case-sensitive])

Mac OS X:
[bifrost:~] reinhart% ls -Hlg /data/scheduling
total 19
drwx------   2 root     wheel  8192 Oct 19  2004 lost+found/
drwxrwsr-x  14 planinst spb     512 Jul 27 16:34 spss_flight_data/
drwxr-sr-x  32 spssoper spb    1024 Sep  5 14:13 spss_tools/

Solaris:
-4> ls -Hlg /data/scheduling
total 10
drwx------   2 root     root   8192 Oct 19  2004 lost+found/
drwxrwsr-x  14 planinst spb     512 Jul 27 16:34 spss_flight_data/
drwxr-sr-x  32 spssoper spb    1024 Sep  5 14:13 spss_tools/
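The mount described above can be sketched as a couple of shell commands on the Mac; the server hostname and export path are illustrative, and a persistent setup would use the automounter or an fstab entry rather than a manual mount:

```shell
# One-time manual NFS mount, run as root on the Mac.
# "solaris-server" is a hypothetical hostname; /data is the export.
mkdir -p /data
mount -t nfs solaris-server:/data /data

# UIDs/GIDs must already agree between the two systems for the usual
# Unix permission bits (owner/group/other) to line up as shown above.
```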
The software setup
• The majority of the operations software is Solaris-only
• Tools written in a combination of Python, Tcl and shell scripting languages
• Given that the data paths are the same, most of the Tools and Tool libraries work with little to no change
• Use the same tool login setup scripts with minor changes so that environment variables/paths are as identical as possible (OS-specific requirements excluded)
• Sybase client software installed on the Mac, which allows database queries from within the Tools just as on Solaris
• Similar supporting software suite as on Solaris: nedit, emacs, the Python matplotlib package, etc. (installed via Fink)
• Additional analysis software available on the Mac: MS Office, sqlite, the Python matplotlib basemap toolkit, the Python pyephem package, pysqlite, etc.
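The shared login setup can be sketched as a single script that branches only where the platforms differ; the specific paths here (Fink's /sw, the Sybase install locations) are assumptions for illustration, not the actual STScI configuration:

```shell
#!/bin/sh
# Hypothetical shared tool login script: one file for both platforms,
# with OS-specific settings isolated in one case statement.
OS=`uname -s`
case "$OS" in
  Darwin)
    PATH="/sw/bin:$PATH"            # Fink-installed tools on the Mac
    SYBASE="/Applications/Sybase"   # assumed client install location
    ;;
  SunOS)
    PATH="/usr/local/bin:$PATH"
    SYBASE="/opt/sybase"            # assumed client install location
    ;;
esac
# Everything else is identical on both platforms, including data paths
DATA_ROOT=/data/scheduling
export PATH SYBASE DATA_ROOT
```

Keeping the divergence inside one branch is what lets the cross-platform Tools and their libraries run unchanged on either OS.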
An Example Workflow: 2-FGS Operations
• 23 scenarios run serially, each of which uses SPSS on Solaris but does the data analysis and results storage on the Mac
• Start on the Mac
• Set up the Sybase SPSS database for the scenario
• SSH to Solaris to run the script that runs all the guide star requests for that scenario; return to the Mac when complete
• Extract the results from the Sybase database and store them in a local sqlite database keyed by the scenario
• Once all the scenarios are complete, pull certain data from the sqlite database, process it and store it back in that database
• Pull results and make plots via matplotlib (equal-area sky plots can be done via the basemap toolkit)
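The scenario loop above can be sketched in Python; the function and host names (run_scenario, solaris-host, the remote script) and the table layout are illustrative placeholders, not the actual tools:

```python
# Sketch of the per-scenario loop: run the Solaris-side work over SSH,
# then store the extracted results in a local sqlite database keyed by
# scenario. All names here are hypothetical.
import sqlite3
import subprocess

SCENARIOS = ["gsc2_nofgs2_def_acs_3s", "gsc2_nofgs2_def_ota_3s"]  # 23 in practice

def run_scenario(db, scenario, remote=False):
    if remote:
        # Run the guide-star requests for this scenario on the Solaris
        # host; control returns here when the remote script completes.
        subprocess.check_call(["ssh", "solaris-host", "run_gs_requests", scenario])
    # Results extracted from Sybase would be inserted here, keyed by the
    # scenario name (placeholder value shown).
    db.execute("INSERT INTO gs_results (study_id, num_gs_pairs) VALUES (?, ?)",
               (scenario, 0))

db = sqlite3.connect(":memory:")  # a file such as gs_study.db in practice
db.execute("CREATE TABLE gs_results (study_id TEXT, num_gs_pairs INTEGER)")
for scenario in SCENARIOS:
    run_scenario(db, scenario)
db.commit()
```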
sqlite
• Included in the standard Tiger install (sqlite3)
• ANSI SQL compliant database
  • See http://www.sqlite.org for all the information
• No complex setup necessary
• Supports several simultaneous readers but only one writer (locking is at the database level)
• Databases up to tens of gigabytes are supported (the FGS Study database was 9 GB)
• Writing is rather slow but reading is reasonably fast; some queries can be very memory-intensive
• Convenient in many situations compared to managing large numbers of individual files; however, populating the database can take a while
• Allows interactive query of a dataset without writing/modifying a tool
• pysqlite is a Python-to-sqlite connection package allowing queries of an sqlite database from within Python
  • See http://www.pysqlite.org for more information
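A minimal sketch of querying sqlite from Python follows. The deck uses pysqlite; the same DB-API interface later shipped in the Python standard library as the sqlite3 module, which is what this example imports. The table name and columns mirror the gs_summary table shown in the next slide, with made-up rows:

```python
# Querying an sqlite database from Python via the DB-API interface
# (pysqlite in the deck's era; the stdlib sqlite3 module here).
import sqlite3

conn = sqlite3.connect(":memory:")  # a file path works the same way
conn.execute("CREATE TABLE gs_summary (study_id TEXT, num_gs_pairs INTEGER)")
conn.executemany("INSERT INTO gs_summary VALUES (?, ?)",
                 [("gsc1_3fgs_def_acs_3s", 51),
                  ("gsc2_3fgs_def_acs_3s", 223)])  # illustrative values
for study_id, n in conn.execute(
        "SELECT study_id, num_gs_pairs FROM gs_summary ORDER BY study_id"):
    print(study_id, n)
```

Because pysqlite follows the Python DB-API, the same cursor/execute idioms used against Sybase in the Tools carry over to the local sqlite results database.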
sqlite example

[bifrost:~/gs_study] reinhart% sqlite3 gs_study.db
SQLite version 3.2.8
Enter ".help" for instructions
sqlite> .tables
gs_summary              su_target_xref          wgreswnd
percent_change          tgm_nogs_sched_windows  wguide_stars
qbwindows               wgacquis
study_description       wggs_pairs
sqlite> .schema gs_summary
CREATE TABLE gs_summary (study_id text, proposal_id text, obset_id text,
  version_num text, total_angular_coverage float, num_angle_gaps integer,
  total_time_coverage_3g integer, num_time_gaps_3g integer,
  total_time_coverage_2g integer, num_time_gaps_2g integer,
  num_gs_pairs integer, norm_num_gs_pairs integer,
  total_day_coverage_3g float, total_day_coverage_2g float);
CREATE UNIQUE INDEX gs_summary_1 on gs_summary
  (study_id, proposal_id, obset_id, version_num);
sqlite> select study_id, count(gs_pair_id) from wggs_pairs
   ...> where proposal_id = "00658"
   ...> and obset_id = "A2"
   ...> group by study_id;
gsc1_3fgs_def_acs_3ps|231
gsc1_3fgs_def_acs_3s|51
gsc1_3fgs_def_ota_3ps|65
gsc1_3fgs_def_ota_3s|51
gsc2_3fgs_def_acs_3ps|231
gsc2_3fgs_def_acs_3s|223
gsc2_3fgs_def_ota_3ps|75
gsc2_3fgs_def_ota_3s|57
gsc2_nofgs2_145_acs_3ps|240
gsc2_nofgs2_145_acs_3s|203
gsc2_nofgs2_145_ota_3ps|44
gsc2_nofgs2_145_ota_3s|25
gsc2_nofgs2_150_acs_3ps|264
gsc2_nofgs2_150_acs_3s|238
gsc2_nofgs2_150_ota_3ps|59
gsc2_nofgs2_150_ota_3s|44
gsc2_nofgs2_def_acs_3ps|203
gsc2_nofgs2_def_acs_3s|150
gsc2_nofgs2_def_ota_3ps|43
gsc2_nofgs2_def_ota_3s|24
gsc2_nofgs3_145_acs_3s|161
gsc2_nofgs3_145_ota_3s|34
gsc2_nofgs3_150_acs_3s|172
gsc2_nofgs3_150_ota_3s|44
gsc2_nofgs3_def_acs_3s|124
gsc2_nofgs3_def_ota_3s|33
sqlite>