This document presents the activities, objectives, and status updates of the SEE-GRID-2 project discussed at the PSC03 Meeting in Podgorica, Montenegro, from October 30 to November 1, 2006. It covers network resource provision, Bandwidth-on-Demand requirements, CA and RA guidelines deployment, and the current status of certification authorities in the SEE region. Topics such as stable connectivity, grid CAs, and user community services are highlighted.
WP3 Status / Hot Topics
SEE-GRID-2 PSC03 Meeting, Podgorica, Montenegro, October 30 – November 1, 2006
Antun Balaz, WP3 Leader, Institute of Physics, Belgrade, antun@phy.bg.ac.yu
The SEE-GRID-2 initiative is co-funded by the European Commission under the FP6 Research Infrastructures contract no. 031775
Overview • Activities status • A3.2 • A3.3 • A3.4 • A3.1 • WP3 deliverables & milestones • T-infrastructure deployment • WP3 Hot topics & Development areas • WP3 Issues & Action points
A3.2 Objectives • Objective: Network resource provision and assurance • WP3 will also deal with network resource provision, in close cooperation with the SEEREN2 project, thus ensuring stable connectivity for the RCs in the region. • Attention will be paid to Bandwidth-on-Demand requirements, to cater for bandwidth-intensive applications in case they need dedicated resources for particular experiments. • A3.2 - Network Resource Provision and BoD requirements (IPP) • Support liaison actions to ensure adequate network provision, including the requirements for Bandwidth-on-Demand, if and where necessary depending on the application.
A3.2 Status • A Bandwidth-on-Demand (BoD) service requirements questionnaire has been prepared and distributed. • Two applications with such requirements have been identified: • EMMIL (developed by International Business School, Hungary) • VIVE (developed by Belgrade University Computer Centre, Serbia). • Application requirements have been analyzed, and the EMMIL developers were contacted for further input and clarifications. • The data gathered so far was provided to Afrodite Sevasti (GRNET), leader of the GEANT2 Joint Research Action 3 "Bandwidth Allocation and Reservation", for comments.
A3.3 Objectives • Objective: CA and RA guidelines and deployment • The regional SEE-GRID catch-all Certification Authority (CA) will continue to operate, providing certificates for countries without a CA. • The experienced CA team will provide support for per-country CA deployment and accreditation. The cycle to establish a National Grid CA will comply with the procedures and accreditation process of the EU Grid Policy Management Authority (EUGridPMA). • Operations will be strengthened so as to support per-country CA operations. • A3.3 - Deploy and operate Grid CAs (GRNET) • Should provide CA and RA guidelines and help establish per-country CAs to cover the authentication issues of accessing the grid.
A3.3: Status of CAs in SEE • Status of CAs at the end of SEE-GRID (1) project • SEE-GRID CA (SEE catch-all) • HellasGrid CA (Greece) • NIIF CA (Hungary) • TrGrid CA (Turkey) • Current status (SEE-GRID-2) • Newly accredited CAs • SRCE CA (Croatia) • In progress of accreditation • Romania • Serbia • Bulgaria
A3.3: SEE-GRID CA (1) • The last SEE-GRID CA training was held in June 2005 in Budapest • Current RAs: • Bulgaria • Romania • Albania • Bosnia and Herzegovina • University of Banja Luka • University of Sarajevo • F.Y.R.O.M. • Faculty of Natural Sciences and Mathematics, Skopje • Ss Cyril and Methodius University, Skopje, Faculty of Electrical Engineering • Serbia • Montenegro • Moldova
A3.3: SEE-GRID CA (2) SEE-GRID CA in numbers (at the beginning of October 2006)
BG.ACAD CA – Bulgarian Academic Certification Authority
Stanislav Spasov spasov@acad.bg
Luchesar Iliev iliev@acad.bg
BG.ACAD CA - Introduction • Established by the Institute for Parallel Processing, Bulgarian Academy of Sciences (IPP-BAS) in September 2006 • Single CA in the Bulgarian academic community (no subordinate certification authorities): • provides stable services; • long-term commitment; • largest possible user community - serves the needs of the research and education community in the country for PKI services. • Hosted by IPP-BAS and operated by the Distributed Computing Systems and Networking department, which is also responsible for maintaining the NREN and the other related services (BG.ACAD NOC, BG.ACAD CSIRT, etc.). • Supported in its operation by the government institutions related to the IT sector (e.g. the State Agency for Information Technology and Communications).
BG.ACAD CA: CP/CPS overview (1) In accordance with: • IETF RFC 3647; • IGTF guidelines (v4.0) • Inspired by the CP/CPSs of other EUGridPMA-approved CAs, e.g. • SEE-GRID CA Certificate Policy and Certificate Practice Statement • DutchGrid and NIKHEF Medium-security X.509 Certification Authority Certificate Policy and Certificate Practice Statement • AEGIS Certificate Policy and Certificate Practice Statement • CERN Certification Authority Certificate Policy and Certification Practice Statement • GridKa-CA Certificate Policy and Certification Practice Statement • SWITCH Certificate Policy and Certification Practice Statement; • Revision: 1.0 (current status: draft) • OID: 1.3.6.1.4.1.26646.1.3.1.1
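The CP/CPS OID above sits under the IANA Private Enterprise Numbers arc (1.3.6.1.4.1, i.e. iso.org.dod.internet.private.enterprise), with 26646 as the enterprise number assigned to the organization. A minimal sketch of checking this; the helper names are illustrative, not part of any grid tooling:

```python
# Sketch: check whether a dotted-decimal OID lies under the IANA
# Private Enterprise Numbers arc and extract the enterprise number.
IANA_PEN_ARC = (1, 3, 6, 1, 4, 1)

def parse_oid(oid: str) -> tuple:
    """Parse a dotted-decimal OID string into a tuple of ints."""
    parts = tuple(int(p) for p in oid.split("."))
    if len(parts) < 2 or parts[0] not in (0, 1, 2):
        raise ValueError(f"not a valid OID: {oid}")
    return parts

def enterprise_number(oid: str):
    """Return the IANA enterprise number if the OID is under the PEN arc, else None."""
    parts = parse_oid(oid)
    if parts[:len(IANA_PEN_ARC)] == IANA_PEN_ARC and len(parts) > len(IANA_PEN_ARC):
        return parts[len(IANA_PEN_ARC)]
    return None

# The BG.ACAD CA CP/CPS OID from the slide:
print(enterprise_number("1.3.6.1.4.1.26646.1.3.1.1"))  # → 26646
```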
BG.ACAD CA: CP/CPS overview (2) • CP/CPS – delivered and published at http://www.eugridpma.org/review • Presented at the 8th EUGridPMA Meeting, 5-6 Oct '06, Karlsruhe • Minor technicalities to be reconsidered and amended • Meets the minimum requirements • Current status and plans: • Infrastructure is in place • Web site and repository – under construction: http://www.ca.acad.bg • Signing machine (laptop) – under construction • The aim is to test the services successfully by the end of 2006 and obtain accreditation in January 2007
BG.ACAD CA - Infrastructure and physical security • The building has an admission desk • The server room has RFID card access control • Constant air conditioning, powerful UPS • Automatic fire-extinguishing system and flood protection in the room
A3.4 Objectives • User portal deployment and operations • A user-friendly multi-Grid access portal will be deployed, enabling universal and more flexible user access to the regional infrastructure. • The work on the SEE-GRID-2 portal should make it easier to select a grid and execute a workflow on the selected grid, so that interoperability of different grids is solved seamlessly and transparently at the application (workflow) level. • A3.4 - Provide a user portal (SZTAKI) • This activity will support the deployment of a user-friendly, multi-grid interoperable portal for convenient Grid access and usage.
A3.4: P-GRADE portal development status • New release of P-GRADE Portal: v2.4 • Documentation • SEEGRID Portal migration v2.3 → v2.4 done • User migration • Contact with external portals at SEEGRID partners • Two different development paths • PS: • workflow-level Parameter Study for EGEE and GT-2 grids • special generator and collector jobs • Under testing • GTbroker • Workflow Editor enhancement • Faster broker than the EGEE broker • Based on GT2 • Tested and fully compatible with SEEGRID/VOCE
A3.4: New features / Ongoing work • New features in P-GRADE Portal 2.4 • Generic MPI support • Direct MPI job submission • MPI job submission using the LCG broker • Generic workflow-level checkpointing • Ongoing and future work • Parameter study support • File Manager Portlet for Remote Storage Elements • Credential Manager for MyProxy
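The idea behind workflow-level checkpointing can be sketched simply: persist the set of completed jobs so a restarted workflow skips them. This is an illustrative sketch, not the P-GRADE Portal's actual (Java-based) mechanism; the file format and function names are assumptions:

```python
# Sketch of workflow-level checkpointing: completed job names are
# persisted to a file after each job, so a restarted workflow resumes
# where the previous run stopped instead of re-running everything.
import json, os

def run_workflow(jobs, run_job, ckpt="workflow.ckpt"):
    """Run jobs in order, skipping any recorded in the checkpoint file."""
    done = set()
    if os.path.exists(ckpt):
        with open(ckpt) as f:
            done = set(json.load(f))
    for name in jobs:
        if name in done:
            continue                    # already finished in a previous run
        run_job(name)                   # may raise; checkpoint stays valid
        done.add(name)
        with open(ckpt, "w") as f:      # persist after every completed job
            json.dump(sorted(done), f)
    return done
```

If a job fails mid-workflow, the checkpoint still records everything completed before it, so only the remaining jobs run on resubmission.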
A3.4: Training / user activities • Portal user activities • Training and dissemination activities (SZTAKI) • EGEE Tutorial at ISGC 2006 - Taipei, Taiwan: 1 May, 2006 • Induction to Grid Computing and EGEE - Budapest, Hungary: 11 May, 2006 • Grid application development on EGEE Grids by the P-GRADE Portal - EGEE Earth science community, Paris, France: 22-23 June, 2006 • Joint Regional CE EGEE and SEEGRID-2 Summer School on Grid Application Support - Budapest, Hungary: 3-8 July, 2006 • P-GRADE Portal tutorial at the Baltic Grid Summer School - Tartu, Estonia: 4-8 July, 2006 • Grid Computing School in conjunction with VECPAR ’06 - Rio de Janeiro, Brazil: 10-12 July, 2006 • Adding workflow capabilities to ARC via the P-GRADE Portal - ARC Design Week - Budapest, Sárospatak, Hungary: 4-8 September, 2006 • P-GRADE Portal and ARC lectures and demonstrations at GridKa School 2006 - Karlsruhe, Germany: 13-14 September, 2006 • P-GRADE Portal lecture at the 4th Annual School of Young Scientists - Yevpatoria, Ukraine: 10 September, 2006 • Training and dissemination activities by TUBITAK ULAKBIM • Workflow application development • VO infrastructure test • Equation solver • Discovering Functional Modules in Large Scale Protein Networks • Evolutionary Boosting • Workflow Application repository
A3.4: P-GRADE portal 2.4 introduction – new features of v2.4 • Revision of remote file handling: user option for non-automatic copy to the worker node • Revision of rescue handling: the new functionality covers all types of resources, including submissions to a Broker • Enhanced verbosity level, localization and accuracy in the forwarding of errors occurring in the grid infrastructure • Protecting the Portal server by introducing a changeable limit on the number of jobs being submitted and observed at one time • Revision of MPI job handling: a totally new middleware ensures - and, in defined circumstances, guarantees - successful submission of MPI jobs • The new SEEGRID portal is up and running: http://portal.p-grade.hu/seegrid
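The "changeable limit of jobs being submitted and observed at one time" is essentially a concurrency cap protecting the portal server. A minimal sketch of the pattern using a semaphore; the class and method names are illustrative assumptions, not the portal's actual implementation:

```python
# Sketch: cap the number of concurrent job submissions so a burst of
# portal users cannot overload the server. A bounded semaphore holds
# the changeable limit; callers beyond it either wait or are refused.
import threading

class SubmissionGate:
    def __init__(self, limit: int):
        self._sem = threading.BoundedSemaphore(limit)

    def submit(self, job, timeout=None):
        """Run one submission callable, blocking while the limit is reached."""
        if not self._sem.acquire(timeout=timeout):
            raise RuntimeError("portal busy: job limit reached")
        try:
            return job()
        finally:
            self._sem.release()
```

With `timeout=None` a caller waits for a free slot; with a finite timeout the portal can reject the request instead of queueing it indefinitely.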
A3.1 Objectives (1) • Develop the next-generation SEE-GRID infrastructure • The next generation of EGEE MW (gLite), VOMS, WMS, information services and LFC will be assessed with the project and WP3 objectives in mind • SEE-GRID infrastructure MW deployment will follow and adapt its services according to the results of the above assessment. • Support in deployment and operations of RCs • Next-generation monitoring services will be deployed so as to support across-the-board infrastructure monitoring. • The current SEE-GRID helpdesk will be expanded in SEE-GRID-2, with the main goal of full EGEE interoperability. • Support the expansion and overall upgrade of the current infrastructure by proliferation of the participating RCs in each SEE country in order to increase: • the total available regional resources (CPUs, storage, etc.), thus boosting the capacity and reliability of the provision of Grid services at the regional level, and • the diversity and distribution of participating teams per country, thus strengthening cooperation and collaboration at the national level.
A3.1 Objectives (2) • A3.1 - Implementation of the advanced SEE-GRID infrastructure (UOB/IPP) • Deals with support for configuration, deployment and operations of the Resource Centres within the SEE-GRID pilot infrastructure, as well as transition of mature centres into EGEE. • Sub-activities: • A3.1.1 - Expand the existing SEE-GRID topology by inclusion of new sites per SEE country • A3.1.2 - Deploy M/W components and OS in SEE Resource Centres • A3.1.3 - Test the site installations in local and Grid mode • A3.1.4 - Operate the SEE-GRID infrastructure • A3.1.5 - Monitor the infrastructure performance and assess its usage • A3.1.6 - Certify and migrate SEE-GRID sites from Regional Pilot to Global production-level eInfrastructures
A3.1 Overview • Infrastructure status and expansion plans per country • Operations • SLA conformance monitoring per site for the next PSC meeting • gLite assessment & deployment status • Helpdesk tickets procedures and statistics analysis • Operational & monitoring tools deployment & integration • SEE-GRID reorganized Wiki status • HGSM • SAM (+ porting to MySQL) • BDII • GridICE • SEE-GRID GoogleEarth • SEE-GRID GoogleMaps • GStat • R-GMA • RTM • MonALISA
A3.1: Overview per country • Greece • Switzerland • Bulgaria • Romania • Turkey • Hungary • Albania • Bosnia and Herzegovina • FYR of Macedonia • Serbia • Montenegro • Moldova • Croatia • Overall infrastructure status • Expansion plans • Overview of A3.1 work • Resources committed to the SEEGRID VO • Other WP3 related effort
A3.1: Greece (1) • Number of sites: 1 • HG-01-GRNET • Number of CPUs: 64 • Storage: 4.78 TB • MW version: GLITE-3_0_2 • SEEGRID VO resources • Maui fair share: 20% • Max usage: 60% • Min CPUs: 64 × 20% ~ 13 CPUs • Max CPUs: 64 × 60% ~ 38 CPUs • Expected (based on usage by other VOs): 64 × 30+% ~ 20 CPUs • GlueCEPolicyMaxRunningJobs: 38 • Expansion?
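The min/max figures above are just the fair-share fractions applied to the 64-CPU pool. A tiny sketch of the arithmetic; rounding to the nearest CPU is an assumption (the slide's "~" figures come from the site admins, and the "30+%" expected case lands near 20 only for fractions slightly above 30%):

```python
# Sketch: Maui-style fair-share fraction applied to a site's CPU pool.
def share_cpus(total_cpus: int, fraction: float) -> int:
    """CPUs corresponding to a fair-share fraction of the pool."""
    return round(total_cpus * fraction)

print(share_cpus(64, 0.20))  # minimum share → 13
print(share_cpus(64, 0.60))  # maximum usage → 38
```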
A3.1: Greece (2) • Operation of the SEEGRID training CA • Operation of RB/BDII/VOMS for the SGDEMO VO (training SEEGRID VO)
A3.1: Switzerland/CERN • Mainly development and support • Contributions to the SEE-GRID Wiki • Resources – should we ask for some?
A3.1: Bulgaria (1) • Status of the infrastructure and plan for expansion • 4-site (EGEE/SEEGRID) infrastructure – significantly expanded now • Myrinet interconnection for low-latency MPI – bridging HPC with Grid • New SEE-GRID-only site at a third party (FMI-SU) in preparation, with 20 WNs. • Core services, monitoring tools: • R-GMA (IPP responsible) - working • FTS (IPP responsible): first successful local testing. The lack of SRM-enabled SEs in SEE-GRID was a problem, but there has been good progress lately • Nagios monitoring proposal – can save manpower • Sites operation and ticket handling status/problems • Replace e-mails with tickets when R-GMA access is requested • MPI problems at some sites – tickets created. • High level of job submission problems – tickets created, but Nagios recommended as a solution • Slow response on tickets – apparently a communication problem • HGSM access not working • Contributions to the SEE-GRID Wiki
A3.1: Bulgaria (2) – Nagios monitoring
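The Nagios proposal above rests on Nagios's plugin convention: a probe is any executable that prints one status line and exits with 0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN. A sketch of such a probe; the metric (waiting-job count) and the thresholds are illustrative assumptions, not an actual SEE-GRID check:

```python
# Sketch of a Nagios-plugin-style check: classify a batch queue's
# waiting-job count against warning/critical thresholds and return the
# standard Nagios status code plus a one-line message.
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

def check_waiting_jobs(count, warn=50, crit=200):
    """Classify a waiting-job count, Nagios style: (exit_code, message)."""
    if count is None:
        return UNKNOWN, "UNKNOWN - could not query the batch system"
    if count >= crit:
        return CRITICAL, f"CRITICAL - {count} jobs waiting"
    if count >= warn:
        return WARNING, f"WARNING - {count} jobs waiting"
    return OK, f"OK - {count} jobs waiting"

code, msg = check_waiting_jobs(12)
print(code, msg)  # 0 OK - 12 jobs waiting
```

In a real deployment the count would come from the batch system (e.g. the Torque queue), and Nagios would schedule the check and alert on state changes, which is where the manpower saving comes from.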
A3.1: Bulgaria (3) – New SEE-GRID-2 site under construction at FMI-SU • The site will be available before the end of 2006. • User interface node: running Debian GNU/Linux. • CE: running PBS (Torque) and BDII, using the Maui scheduler. • Combines education and application porting activities
A3.1: Romania (1) – Infrastructure and Services Status • Sites: • 8 registered in HGSM: RO-01-ICI, RO-03-UPB, RO-05-INCAS, RO-06-UNIBUC, RO-07-NIPNE, RO-08-UVT, RO-09-UTCN, RO-10-TUIASI. • RO-08-UVT & RO-09-UTCN are supported by the GridMOSI national CEEX project • RO-05-INCAS not yet certified (site admin issue). • RO-10-TUIASI is experimenting with grid testbeds • gLite 3.0 deployed, apart from 1 certified site
A3.1: Romania (2) – Future Plans • RO-07-NIPNE migrated to EGEE • hardware upgrade: currently 40 CPUs dedicated to the SEEGRID VO • 70 dual-core 3 GHz WNs (planned) • 10 Gbps link to RoEduNet • 3 TB DPM Storage Element • RO-09-UTCN • volunteered for FTS experimenting • RO-08-UVT • Myrinet upgrade • 16 CPUs for the SEEGRID VO • RO-01-ICI • storage upgrade: 1 TB
A3.1: Romania (3) – Sites operation • RO-01-ICI maintains the SEE-GRID Helpdesk • Mixed reaction to tickets by RoGrid sites • RO has the most total and open tickets in SEE-GRID-2 • tickets not closed within 1 month • Not enough operational resources • Number of sites increased -> increased support effort and operations • a solution is being sought
A3.1: Romania (4) – Core services, Monitoring, CA • RO-01-ICI deployed an RB/BDII for RO sites • RO-03-UPB deployed MonALISA with a SEEGRID repository • very good cooperation with OSG and other projects that use MonALISA • Willingness to contribute to the SEEGRID Wiki • how is to be decided • The Romanian CA is hosted at ROSA (Romanian Space Agency) • currently in the CP/CPS document writing phase
A3.1: Turkey (1) – Current TR-Grid Infrastructure
A3.1: Turkey (2) – Current TR-Grid Infrastructure
A3.1: Turkey (3) – Forthcoming TR-Grid Infrastructure
A3.1: Turkey (4) – Status of the TR-Grid Infrastructure Project • Our major project to form a National eInfrastructure is supported by the TUBITAK research budget (~500,000 €) • Objectives: • To build an eInfrastructure distributed among universities, giving computational resources to researchers all around the country. • To make this huge computing power a part of the EGEE infrastructure. • To extend this infrastructure nationwide. • The tender for the National e-Infrastructure was finalized in July 2006. • WAN interconnection of all sites has been upgraded and is ready for site deployment. • LAN equipment for all sites has been received. • Servers have been received; site installations will be completed by the end of 2006.
A3.1: Turkey (5) – Sites Operation / Ticket Handling • All TR-* SEE-GRID sites except TR-06-SELCUK have been upgraded to GLITE-3.0 • Announced security updates and other patches have been applied. • Temporary network problems with TR-06-SELCUK • Typical problems encountered during routine site operations: • job submission failures at the gLite-CE due to • failure of the BLParserPBS daemon • failure of the fetch-crl cron job • SSH problems from WNs to the CE • Site/user support is provided via national and SEE-GRID helpdesks.
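The fetch-crl failure mode above is worth a note: when the cron job stops running, the CRL files in the CA certificates directory go stale and the CE starts rejecting otherwise valid proxies. A hedged sketch of a quick staleness check based on file modification time; the directory layout (openssl hash-style `.r0` CRL names, as under `/etc/grid-security/certificates`) is the standard convention, but the 24-hour threshold is an illustrative assumption:

```python
# Sketch: flag CRL files that fetch-crl has not refreshed recently.
# Stale CRLs are a common cause of the job submission failures named
# above; checking mtime age is a cheap first-line diagnostic.
import os, time

def stale_crls(ca_dir, max_age_h=24.0, now=None):
    """Return the names of .r0 CRL files older than max_age_h hours."""
    now = time.time() if now is None else now
    stale = []
    for name in sorted(os.listdir(ca_dir)):
        if not name.endswith(".r0"):    # openssl-hash-style CRL names
            continue
        age_h = (now - os.path.getmtime(os.path.join(ca_dir, name))) / 3600
        if age_h > max_age_h:
            stale.append(name)
    return stale
```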
A3.1: Turkey (6) – Wiki Pages • Basic user and site administration documents are provided in the native language on the TR-Grid wiki pages • http://wiki.grid.org.tr • So far, there has been no contribution to the re-organized SEE-GRID wiki pages. • WP3 personnel of ULAKBIM will soon contribute to the SEE-GRID wiki. • Possible future contributions: • experiences from site/service installations and upgrades • installation of specific services and tools – VOMS • policy documents
A3.1: Turkey (7) – Core Services • We have been operating the secondary SEE-GRID RB/BDII service – ui.ulakbim.gov.tr • LFC – lfc.ulakbim.gov.tr – currently half a terabyte of data is catalogued. The LFC server is now serving national VOs and can support possible forthcoming SEEGRID VOs. • MyProxy – myproxy.ulakbim.gov.tr • VOMS service for national VOs – voms.ulakbim.gov.tr • WMS server – wms.ulakbim.gov.tr • SFT server (planned to be installed for custom tests of TR sites and national VOs)
A3.1: Turkey (8) – Monitoring Tools, P-GRADE Portal • The HGSM software has been adapted to export data for monitoring tools like GridICE, RTM and GStat. • ULAKBIM has re-developed the HGSM tool to support multi-Grid projects. • Cross-project collaboration: HGSM is currently also used by EUMEDGRID and is planned to be used by EELA. The required operational/deployment support is provided by ULAKBIM. • A Google Maps tool has been developed and is being used to retrieve the information given by the GIIS monitor for a selected site. • P-GRADE: a localized P-GRADE portal for national users will be in operation in the first quarter of 2007.
A3.1: Turkey (9) – TR-Grid VOs • Current status of national VOs: • trgrida: 28 users (computational chemistry) • trgridb: 24 users (molecular dynamics) • trgridc: 3 users (earth science) • trgridd: 11 users (training) • trgride: 7 users (computational physics) • trgridf: 7 users (UlakNET network statistics) • trgridg: 3 users (metallurgical science)
WP3: Turkey – CA Status • Since its accreditation at the 5th EUGridPMA meeting in September 2005, TR-Grid CA has been operating and providing certification services for TR-Grid. • To raise grid security awareness among users, related wiki pages are available and kept updated in the native language. • So far 28 host, 1 service and 68 user certificates have been issued by TR-Grid CA; at present there are 90 valid certificates in total.
A3.1: Hungary (1) – Infrastructure summary • Infrastructure (now) • n27: • Dell Precision 410 M; 2x PIII 500 MHz; 2x 128 MB DIMM; • Quantum Atlas IV 9 WLS SCSI 9 GB; 3Com 3c905B • n28: • Dell Precision 410 M; 2x PIII 500 MHz; 2x 128 MB DIMM; • Quantum Atlas IV 9 WLS SCSI 9 GB; 3Com 3c905B • Site hardware upgrade (on-going task) • 4 new, dedicated P4 machines • CPU: 2.4 GHz • 512 MB memory + additional storage • Software upgrades • Security patches (2.1.5 Torque update)
A3.1: Hungary (2) – Contributions to the SEE-GRID Wiki User- and developer-oriented material. New entries on the Wiki: • For developers: case studies on how to use the P-GRADE Portal for Grid application development • For users: User Induction Manual for the P-GRADE Portal
A3.1: Albania (1) – Migration to gLite 3.0 • FIE (Faculty of Electrical Engineering) • the first to migrate to gLite 3.0 • INIMA (Institute of Informatics and Applied Mathematics) • followed a month later • FSN (Faculty of Natural Sciences) • has prepared the cluster for integration in SEEGRID • directly installed gLite 3.0 • some delays related to certificates
A3.1: Albania (2) – Further development plans • Further development plans • Improve the hardware by creating clusters dedicated to SEEGRID • currently the existing clusters are used for other purposes • critical for INIMA and FSN • Inclusion of new sites considered [temporarily] problematic • difficult to find sites with enough experience with Linux systems • It is necessary to create a “friendly” environment for grid applications and involve researchers from other domains to experiment with grids, as a prerequisite to justify involvement in the project and improve sustainability • Improve access to grid structures for non-ICT researchers through simple web pages that permit easy upload, compilation and execution of jobs on remote Linux sites
A3.1: Albania (3) – Sites operations • Problems during system upgrades to gLite 3.0.2 • occasional changes in SSH packages • problems while applying security patches • Ticketing • tickets related to minor coordinate problems resolved and closed • Other issues • difficult to access testing and monitoring sites in MK
A3.1: Albania (4) – Contributions to the SEEGRID Wiki • Experience of the migration to gLite at FIE and INIMA included in the SG GLITE-3_0 Assessment Notes: • AL-02-FIE notes • AL-01-INIMA notes
WP3: Albania (1) – Network provision • Connectivity for each of the sites (INIMA, FIE, FSN) is provided by a different ISP, with bandwidth between 1 and 2 Mbps • QoS and BoD are not implemented • SEEREN2 connectivity is under consideration for 2007
WP3: Albania (2) – Status of CA • CA status remains delegated to a single contact point [person] for SEEGRID activities only • Creation of a formal CA organization remains a task tied to the establishment of an NGI in Albania