
DCS Software Installation



  1. DCS Software Installation
  computing, network, software guidelines, procedures
  Peter Rosinsky, Peter Chochula, ACC team
  ALICE DCS Workshop, CERN, 5-6 March 2007

  2. DCS Computing
  • CR3 Central services
    • Application gateways (GW)
    • Fileservers (FS), databases (DB), admin nodes (AN), DNS and Time srv., Web srv., etc.
  • Detector computers
    • Operator Nodes (ON) – WS2003, Worker Nodes (WN) – XP, PVSS, Frontend Servers – SLC4
  • CRx, SG, Cavern
    • Computers, DCS boards, power supplies, PLCs
  • CNIC – ALICE network domain
    • Trusting, Exposing – IT central services (GPN), ALICE-TN communication
    • Control sets – protect PLCs
  • Website on computing/network/SW/installation: www.cern.ch/alice-dcs/public

  3. Status
  • Network
    • Fully operational (finally) – CR3, CRx, UX (cavern), SG (gas building)
    • DCSNet, GPN + wireless, TechNet
  • Computers – CR3
    • Fully installed: 3x GW, 4x AN, 10x BCK, 20x ON, 60x WN, 1x FS, 2x Oracle FS
    • DCS SW (PVSS) – TOF, TRD, HMP, services (as requested)
    • Other – TOF Linux servers
  • Devices (CRx, UX)
    • Power supplies – Wiener, CAEN
    • NETGEAR switches, DCS boards
  • Software
    • Transition towards PVSS 3.6 – please indicate if you still need 3.1

  4. Software Installation
  Detector expert:
  • Uploads to the DCS LAB fileserver (“upload” server aldcsfs001)
  • Installs from the CR3 fileserver (“installation” server alidcsfs001)
    • on D-day, together with the ACC team
  ACC team:
  • Checks the SW uploaded to the LAB server
    • structure, paths, security scans (obviously, not functionality)
  • Transfers it from the DCS LAB “upload” server to the CR3 “installation” server
  Prerequisites:
  • Permissions on ON/WN and the “upload” server (ask the DCS admin)
    • send NICE usernames to alice-dcs-admin@cern.ch
  • Detector users/experts groups
    • e.g. “ALICE DCS TOF”, “ALICE DCS TOF EXP”
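As a rough illustration of the ACC transfer step (the slides do not specify the actual ACC tooling, so this is a sketch under assumptions, using the share paths quoted on slides 4 and 6), a minimal Python mirror of one detector's upload site onto the installation server could look like this:

```python
# Illustrative only: the slides do not specify the ACC transfer tooling.
# UNC share paths are those quoted on slides 4 and 6.
import shutil
from pathlib import Path

UPLOAD  = Path(r"\\aldcsfs001\DCS")   # DCS LAB "upload" server (visible from GPN)
INSTALL = Path(r"\\alidcsfs001\DCS")  # CR3 "installation" server (visible from CR3)

def transfer(detector: str) -> None:
    """Mirror one detector's upload site onto the installation fileserver."""
    src = UPLOAD / detector
    dst = INSTALL / detector
    shutil.copytree(src, dst, dirs_exist_ok=True)  # Python 3.8+
    print(f"transferred {src} -> {dst}")

if __name__ == "__main__":
    transfer("agd")   # example site name used on the slides
```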

  5. Installation Procedure
  Extract from: alice-dcs web → “Software Installation Procedure”
  • Check your D date so as not to miss the D-4 weeks deadline.
  • Contact the DCS admin - Peter Rosinsky - (alice-dcs-admin@cern.ch) prior to D-4 if you need to clarify the procedure.
  • Send the DCS admin a list of the common software you are going to use.
  • Ask the DCS admin for permissions on the installation fileserver.
  • Go to your installation site – e.g. \\aldcsfs001\DCS\agd
  • Create a subdirectory named after the current date in the format YYYY-MM-DD every time you bring a new version.
  • Make a link (shortcut) named "production" pointing to the directory containing the current production version of your installation.
  • Unpack your archives into a simple and convenient logical structure, e.g. one directory per package.
  • Describe this structure (what's where) in a text file named "readme.txt", together with version info and a short description.
  • Set the date (based on D) and time for the installation session in CR3 with the ACC team (DCS admin). In the meantime, the ACC team will perform basic checks and will transfer your installation site onto the production fileserver.
  • For the installation in CR3 you will use the copy of your installation site residing on the production fileserver, e.g. \\alidcsfs001\DCS\agd.
  • Computers in CR3 will be pre-installed with several commonly used software packages, including the current version of PVSS.
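To make the upload steps concrete, here is a small Python sketch (an illustration, not an official script: the share path and the "agd" example come from the slides, the archive name and description are invented, and the "production" shortcut is left as a manual step):

```python
# A sketch under assumptions: the share path and the "agd" example come from
# the slides; archive name and description are made up. The "production"
# shortcut (a Windows shortcut) is left to be created by hand.
import datetime
import shutil
from pathlib import Path

UPLOAD_SHARE = Path(r"\\aldcsfs001\DCS")  # DCS LAB "upload" server
DETECTOR = "agd"                          # example installation site from the slides

def upload_new_version(archive: str, description: str) -> Path:
    """Create the dated subdirectory, unpack the archive and write readme.txt."""
    dated = UPLOAD_SHARE / DETECTOR / datetime.date.today().strftime("%Y-%m-%d")
    dated.mkdir(parents=True, exist_ok=True)

    # One directory per package keeps the logical structure simple.
    shutil.unpack_archive(archive, dated / Path(archive).stem)

    # readme.txt: what's where, version info, short description.
    (dated / "readme.txt").write_text(description)

    print(f"uploaded to {dated}; now point the 'production' shortcut at it")
    return dated

if __name__ == "__main__":
    upload_new_version("agd_hvControl.zip",
                       "agd_hvControl: added databases, improved ramp-up\n")
```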

  6. Fileserver Directory Structure
  • Fileserver shares:
    • DCS LAB (upload): \\aldcsfs001\DCS - visible only from GPN
    • CR3: \\alidcsfs001\DCS - visible only from CR3
  • Directories for:
    • Subdetectors
      • three-letter code, lower case, e.g. \tof, \trd, \hmp
    • Projects
      • rack control (\rck), central DCS (\dcs)
    • Computers (within the subdetector/project directory)
      • using machine names - \alidcscom123
    • Common software
      • \software
  • Scratch – to upload changes done on the production machines back to the “upload” server
    • \\alidcsfs001\Scratch
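The naming conventions above are simple enough to check mechanically. A minimal Python sketch (a hypothetical helper, not an ACC tool) that classifies a single directory name against the rules on this slide:

```python
# Hypothetical helper: classify a path component against the slide-6 naming
# conventions. Not an official ACC tool, just an illustration of the rules.
import re

DETECTOR_RE = re.compile(r"^[a-z]{3}$")          # e.g. tof, trd, hmp, rck, dcs
MACHINE_RE  = re.compile(r"^alidcscom\d{3}$")    # e.g. alidcscom123
VERSION_RE  = re.compile(r"^\d{4}-\d{2}-\d{2}$") # e.g. 2006-06-21

def classify(component: str) -> str:
    """Return which naming convention a single directory name follows, if any."""
    if DETECTOR_RE.match(component):
        return "subdetector/project code"
    if MACHINE_RE.match(component):
        return "machine directory"
    if VERSION_RE.match(component):
        return "dated version directory"
    return "unrecognised"

# Example: every component of "tof\alidcscom123\2006-06-21" is recognised.
for part in r"tof\alidcscom123\2006-06-21".split("\\"):
    print(part, "->", classify(part))
```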

  7. Detector Subdirectory
  • Detector PVSS projects, databases, scripts
  • Versioning – by date
    • Subdirectory: \YYYY-MM-DD
    • e.g. \\aldcsfs001\DCS\rck\2006-06-21
  • Shortcuts: production (mandatory), other (detector-specific), common SW pointers
  • readme.txt
    • description field: additions, improvements, bug fixes, comments
  Example layout:
  \agd
    \alidcscomNNN
      \production - shortcut to the production version, \2006-08-24
      \2006-08-24 - production
        description: added databases, improved ramp-up
        \agd_hvControl - main PVSS project
        \agd_hvControl_fw - framework for agd_hvControl
        \databases

  8. Common Software Subdirectory – \software
  • OPC servers
    • ELMB, CAEN, ISEG, Schneider, Wiener, …
  • Drivers
    • DIGI, USB CANbus, MXI, FalconFG, …
  • Tools
    • DB tools, OPC explorer, CAN explorer, …
  • Libraries, ActiveX controls, …
  • Ask the DCS admin to include software packages you need

  9. Networked Devices
  • Individual devices, small groups
    • Provide information about the device: manufacturer, model, MAC address, …
  • Large groups of devices – contact the DCS admin
    • several tens of units
    • power supplies, DCS boards, …
    • name-space reservation
    • MAC address pre-registration
  • Network services for the DCS
    • final assignment: www.cern.ch/alice-dcs/public/networkP2user.txt
  • See www.cern.ch/alice-dcs/public → “Networked Device Registration”
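The slide asks for the manufacturer, model and MAC address of each device; a minimal Python sketch of such a registration record (the layout and the MAC check are illustrative assumptions, not the official registration format from the web page above) might be:

```python
# Hypothetical registration record: the fields follow the slide (manufacturer,
# model, MAC address); the layout and MAC check are illustrative assumptions,
# not the official registration format.
import re
from dataclasses import dataclass

MAC_RE = re.compile(r"^([0-9A-Fa-f]{2}[:-]){5}[0-9A-Fa-f]{2}$")

@dataclass
class DeviceRegistration:
    manufacturer: str   # e.g. "Wiener"
    model: str          # hypothetical model string
    mac_address: str    # pre-registered MAC, e.g. "00:11:22:33:44:55"

    def __post_init__(self) -> None:
        if not MAC_RE.match(self.mac_address):
            raise ValueError(f"not a valid MAC address: {self.mac_address}")

# One entry of a larger batch sent to the DCS admin.
request = DeviceRegistration("Wiener", "example-model", "00:11:22:33:44:55")
```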

  10. Working In Point 2
  • “User” PCs
    • CR3, CRx, ACR (once it’s ready) – standard NICE environment
  • Thin clients (alice-dcs web → “MYLO Thin Client Guide”)
    • Transtec MYLO – provides an RDP client (Terminal Server Connection)
    • Registered as portable – easy to bring to a convenient location
    • Intended use – UX, CRx (five units available)
  • Laptops
    • Wireless available in CRx and UX (except underground C-side)
    • Portable outlets – I and O racks (“gaps”), C-side rack area (walls)
  • Note: all of these are GPN machines => use an RDP connection to our application gateway (alidcscom001), the same way as from your office
  • KVM consoles
    • Reserved for administrative access or troubleshooting, in particular if the machine is not accessible over the network
    • Ask the DCS admin if you need access (special user, rules, restrictions)

  11. Accessing DCS Cluster
  • DCS resides on a protected network (CNIC)
    • No direct access from GPN to DCS computers (no RDP, no file transfer)
  • User login
    • Via an application gateway – alidcscom001 (secondary GW – 002, backup GW – 003)
  • File transfer
    • Via the DCS file servers, handled by the DCS admin – the same path used for installation
    • Notify the DCS admin and indicate the directory that has been added/changed, starting from \DCS – e.g. “hmp\alidcscom123\2006-03-05” – to avoid transferring the whole site (gigabytes!) each time
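A small Python sketch of how one might collect the changed directories to quote in that notification (a naive illustration: the share path is the one from the slides, and the mtime-based "changed" test is an assumption made for this example):

```python
# Naive sketch: find dated version directories modified after a given time,
# so only those paths (relative to \DCS) need to be quoted to the DCS admin.
# The mtime-based "changed" test is an assumption made for illustration.
import datetime
from pathlib import Path

SHARE = Path(r"\\aldcsfs001\DCS")   # "upload" server share from the slides

def changed_since(detector: str, since: datetime.datetime) -> list[str]:
    """List <detector>\\<machine>\\<YYYY-MM-DD> directories newer than 'since'."""
    changed = []
    for version_dir in (SHARE / detector).glob("*/*"):
        if not version_dir.is_dir():
            continue
        mtime = datetime.datetime.fromtimestamp(version_dir.stat().st_mtime)
        if mtime > since:
            changed.append(str(version_dir.relative_to(SHARE)))
    return changed

if __name__ == "__main__":
    # Prints entries such as "hmp\alidcscom123\2006-03-05" for the notification mail.
    print(changed_since("hmp", datetime.datetime(2006, 3, 1)))
```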

  12. • … • … • …
