Directory of Directories for Higher Education (DoDHE)
Michael R. Gettes, Lead Application Systems Integrator, Georgetown University, gettes@Georgetown.EDU
Technologist, University of Colorado at Boulder
Is this anything new? No.
• Outgrowth of the Georgetown WhitePages problem (Hospital)
• A Web of People?
• Exposes common schema issues; eduPerson applicability.
• Performance issues for massively parallel searches.
• Interesting lessons learned about the LDAP API.
• Worked with iPlanet/Netscape to use DSGW for this project. (Mark Smith)
• Prototype from April 2000: a search of 500 simulated directories returned about 15,000 responses in approximately 30 seconds. Is this viable?
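The prototype's massively parallel search can be sketched as a thread-pool fan-out over the site directories. This is a sketch, not the prototype's code: `search_one` is a hypothetical stand-in for a real per-site LDAP query, and the site names are invented.

```python
# Sketch of the prototype's fan-out search across many site directories.
# search_one is a hypothetical stand-in; the real prototype queried each
# site over LDAP (via DSGW) instead of returning canned results.
from concurrent.futures import ThreadPoolExecutor

def search_one(site, query):
    # Placeholder: pretend each site returns a list of matching entries.
    return [f"{query}@{site}"]

def parallel_search(sites, query, timeout=30):
    results = []
    with ThreadPoolExecutor(max_workers=50) as pool:
        futures = [pool.submit(search_one, s, query) for s in sites]
        for f in futures:
            try:
                results.extend(f.result(timeout=timeout))
            except Exception:
                pass  # a slow or down site must not sink the whole search
    return results

sites = [f"ldap.site{i}.edu" for i in range(500)]
hits = parallel_search(sites, "gettes")
print(len(hits))  # one simulated hit per site → 500
```

The overall timeout is the key design point: with 500 sites, some will always be slow or unreachable, so the aggregate search must return whatever arrived in time rather than wait for stragglers.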
Where Are We Now?
• Michael Gettes working 50% time on this project
  • funded by Internet2
  • for 3 months (renewable)
  • does this model work?
• Starting November 1, 2000
• MACE-DIR to provide oversight of the project
• Sun Microsystems to contribute hardware and assist with software procurement (iPlanet). Also, access to intellectual capital.
Where Are We Now? (cont.)
• eduPerson coagulation
• LDAP-Recipe for similar configuration and operations
• Alignment of the above makes a pure LDAP implementation intriguing. Maybe DoDHE could be an example for application deployment? (probably too far down the road)
• MACE-DIR, Shibboleth, eduPerson, DoDHE, PKI
  • Incest is the tie that binds?
  • In other words – it's all related. By Design!!
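For context, an eduPerson-aligned entry looks like the following LDIF sketch. The `eduPerson` auxiliary object class and the `eduPersonAffiliation` / `eduPersonPrincipalName` attributes come from the eduPerson schema; the DN and attribute values here are invented for illustration.

```
dn: uid=gettes,ou=People,dc=georgetown,dc=edu
objectClass: person
objectClass: organizationalPerson
objectClass: inetOrgPerson
objectClass: eduPerson
cn: Michael R. Gettes
sn: Gettes
uid: gettes
mail: gettes@Georgetown.EDU
eduPersonAffiliation: staff
eduPersonPrincipalName: gettes@georgetown.edu
```

Agreement on this small common schema across sites is what makes a cross-institution search meaningful at all.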
Proposed Architecture
[Gratuitous Architectural Graphic: a Web Page front end drives a Parallel Search Engine, which consults Index Directories; many Site Directories (Site Dir …) feed a Central Deposit Service.]
Next Steps… Indexing & Searching
• Collaboration with Roland Hedberg, Catalogix, to understand indexing characteristics, referrals, and differential updates. He has experience providing a similar service: 500 sites, ~1.5M entries. Can this apply to our perceived desires?
• Need to manufacture data or get extracts from a set of institutions to perform reasonable analysis.
• Or, make use of vendor products (iPlanet/Sun) for indexing and searching for the central service.
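The index/referral idea can be caricatured in a few lines: the central service holds only an index from searchable names to sites, and answers queries with LDAP-style referral URLs rather than entries. This is a toy sketch of the concept, not Catalogix's actual design; names and URL details are illustrative.

```python
# Toy central index: map a search key to the site directories that can
# answer for it, and return referral URLs instead of the entries themselves.
from collections import defaultdict

index = defaultdict(set)

def register(site, names):
    """A site deposits (or an indexer extracts) its searchable names."""
    for name in names:
        index[name.lower()].add(site)

def referrals(query):
    """Answer a query with referrals to the sites that may hold matches."""
    return sorted(f"ldap://{site}/??sub?(cn={query})"
                  for site in index.get(query.lower(), set()))

register("ldap.georgetown.edu", ["Gettes", "Smith"])
register("ldap.colorado.edu", ["Smith"])
print(referrals("smith"))
# ['ldap://ldap.colorado.edu/??sub?(cn=smith)',
#  'ldap://ldap.georgetown.edu/??sub?(cn=smith)']
```

Differential updates then reduce to each site re-registering only the names that changed, which is far cheaper than re-crawling 500 sites.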
Heuristics and Capabilities?
• What should be available to search?
  • By Affiliation
  • Carnegie Classification
  • Geography
  • Institution
  • Job Classification
  • Area of Research or Specialty
  • yada yada yada …
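The facet list above maps naturally onto an LDAP search filter. A minimal sketch of composing one follows; the attribute names are assumptions (`eduPersonAffiliation` is a real eduPerson attribute, `st` is standard LDAP, but a Carnegie-classification attribute would need to be defined), and real code would also escape filter metacharacters.

```python
def facet_filter(**facets):
    """Compose an AND-ed LDAP filter string from facet=value pairs.
    Note: values are not escaped here; production code must escape
    LDAP filter metacharacters such as *, (, ), and backslash."""
    clauses = "".join(f"({attr}={value})"
                      for attr, value in sorted(facets.items()))
    return f"(&{clauses})" if len(facets) > 1 else clauses

print(facet_filter(eduPersonAffiliation="faculty", st="CO"))
# (&(eduPersonAffiliation=faculty)(st=CO))
```

Each checkbox or dropdown in the web interface would contribute one clause, so the search heuristics question becomes a question of which attributes all sites agree to populate.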
Human Interface
• What should the web interface look like?
• How do we present the capabilities just described?
• Response analysis? What to do with 10,000 hits?
• Human Interface Lab participation? Any takers?
The Mundane
• Server Configuration
  • Scalability
  • Statistical Analysis
• Security
  • Monitoring
  • Threat response (anti-slurpers)
• Management & Maintenance
  • Self-Registration and Configuration?
  • Participation Requirements – DIT root suffix, etc.
• Meta Directory functionality for central deposit?
  • iPlanet DS 5 filtered replication?
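"Threat response (anti-slurpers)" usually reduces to rate-limiting clients that try to harvest the whole directory. A minimal per-client token-bucket sketch follows; the rate and burst thresholds are invented for illustration.

```python
import time

class TokenBucket:
    """Per-client token bucket: allow normal bursts of lookups while
    refusing the sustained query stream of a directory slurper."""
    def __init__(self, rate=5.0, burst=20):
        self.rate, self.burst = rate, burst   # tokens/sec, max tokens
        self.tokens, self.last = float(burst), time.monotonic()

    def allow(self):
        now = time.monotonic()
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=5.0, burst=20)
allowed = sum(bucket.allow() for _ in range(100))
print(allowed)  # roughly the burst size: the rest are refused
```

One bucket would be kept per client IP (or per authenticated identity), which also feeds the monitoring and statistical-analysis bullets above.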