Computing and Network Infrastructure for Controls (CNIC)
Dr. Stefan Lüders, IT/CO, November 21st, 2005
• Cyber Threats on the Horizon
• The CNIC Mandate
• CNIC Tools for Control Systems & Networks
• The Impact on You
Incidents at CERN
• "A major worm (similar to Blaster) is spreading on the Internet" (2004/05/03: Sasser worm)
• "New Virus / Nouveau Virus" (2005/05/30: MyDoom derivatives)
• "It has been confirmed that the network problems during the week-end were due to a security break-in" (2004/06/07: general network problem)
• "Vulnerable systems on the CERN site have been detected…" (2005/08/16: Zotob worm targeting Windows 2000)
• "Do not open the message from Ab-dep-tech" (2005/11/02: Trojan.Lodear.A virus infected 40 machines)
Insecure computers place the site at risk DAILY!
Change in Trend
[Chart annotations: non-centrally managed PCs & downloaded code; systems directly exposed to the Internet; IRC-based hacker networks; Blaster worm variants (Windows); SucKIT rootkits (Linux); Code Red worm (web servers)]
• Incidents per year: 2002: 123 · 2003: 643 · 2004: 1179
• Oct. 2005: 80 incidents
  • 14 systems compromised (13 Windows, 1 Linux)
  • 1 user application compromised
  • 2 accounts compromised
  • 63 PCs with unauthorized P2P activity (7 via VPN)
Cyber Threats ─ Today's Peril
[Figure: intruder knowledge / attack sophistication vs. time, 1980–2010 — attacks evolve from password guessing and cracking, exploiting known vulnerabilities, sniffers, hijacked sessions and war dialing through automated probes/scans, back doors, disabled audits, packet spoofing and denial of service to worms, zero-day exploits, BOT nets, root kits, zombies and IRC-based attacks targeting controls]
• Control systems: era of legacy technology ("security through obscurity")
• Transition phase ("Controls goes IT"): common standards / interconnectivity
• Era of modern information technology ("from top-floor to shop-floor")
Controls Goes IT
• Controls networks merge with business networks
  • Proprietary fieldbuses replaced by Ethernet & TCP/IP
  • Field devices connect to Ethernet & TCP/IP
  • Real-time applications based on TCP/IP
  • VPN connections from the outside onto the Controls Network
• Use of IT protocols & gadgets:
  • SNMP, SMTP, FTP, Telnet, HTTP (WWW), …
  • Wireless LAN, notebooks, USB sticks, …
• Migration to the Microsoft Windows platform
  • Windows not designed for industrial / control systems
  • OPC/DCOM runs on port 135 (heavily used for RPC); a simple port probe is sketched below
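Because controls equipment now speaks standard IT protocols, its exposure can be checked with nothing more than a TCP connect test. Below is a minimal sketch of such a probe; the host name and port list are illustrative assumptions, not part of the CNIC tools, and UDP services such as SNMP (161/UDP) would need a separate UDP probe.

```python
#!/usr/bin/env python
"""Minimal sketch: probe a controls host for commonly exposed IT ports.
The host name 'plc-cooling-01' and the port list are hypothetical examples."""
import socket

HOST = "plc-cooling-01"   # hypothetical controls device, replace as needed
PORTS = {
    21: "FTP",
    23: "Telnet",
    25: "SMTP",
    80: "HTTP",
    135: "RPC/DCOM (used by OPC)",
}

for port, name in sorted(PORTS.items()):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(1.0)                  # short timeout: a quick check, not a scan
    try:
        sock.connect((HOST, port))
        print("%5d  %-25s OPEN" % (port, name))
    except (socket.timeout, socket.error):
        print("%5d  %-25s closed/filtered" % (port, name))
    finally:
        sock.close()
```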
Threats due to Technology
• Poorly secured systems are being targeted
  • Worms spread within seconds
  • Unpatched systems, O/S & applications
  • Missing firewalls, missing anti-virus software, or outdated virus signature files
  • Zero-day exploits: security holes without patches; break-ins occur before a patch and/or anti-virus signature is available
  • …but how to patch/update control PCs?
• Automation systems lack security protections
  • Programmable Logic Controllers (PLCs), SCADA systems, field devices, power supplies, …
  • Security not integrated into their designs
  • …but how to secure e.g. PLCs?
Threats due to People
• Passwords are known to several (many?) people
  • No traceability, ergo no responsibility
• People are increasingly the weakest link
  • Use of weak passwords
  • Infected notebooks are physically carried on site
  • Users download malware and open "tricked" attachments
  • Missing/default/weak passwords in applications
• …but how to handle operator accounts?
• …what about password rules?
Aware or Paranoid?
• 2000: Ex-employee hacks "wirelessly" 46 times into a sewage plant and spills sewage into the basement of a Hyatt Regency hotel
• 2003: The "Slammer" worm disables the safety monitoring system of the Davis-Besse nuclear power plant for 5 hours
• 2003/08/11: W32.Blaster.Worm
• 2004: IT intervention, hardware failure and use of the ISO protocol stopped the SM18 magnet test stand for 24 hours
• 2005: Denial of service (70 s) stopped manual control
CERN Assets at Risk
• Equipment being affected or even destroyed
  • Partly very expensive, esp. in experiments & accelerators
  • Sometimes impossible to repair / replace
• Processes being disturbed
  • High interconnectivity, thus very sensitive to disturbances
  • A cooling-process PLC failure can stop the accelerator
  • A power-controller failure can stop a (sub-)detector
  • Difficult to configure
• Time being wasted
  • Downtime reduces efficiency (esp. data loss in experiments)
  • Time needed to re-install, re-configure, test and/or re-start
  • Requires many people working, possibly outside working hours
Risks and costs ARE significant!
CNIC Working Group
• Created by the CERN Executive Board
• Delegated by the CERN Controls Board
• "…with a mandate to propose and enforce that the computing and network support provided for controls applications is appropriate" … "to deal with security issues."
• Members from all CERN controls domains and activities
  • Service users (LHC experiments, AB, TS, AT, SC)
  • Service providers (Network, NICE, Linux, Computer Security)
Phase I: Specification
[Timeline 09/2004 – 07/2006: CNIC policy approval; specifications for Networking, NiceFC and LinuxFC; awareness campaign]
CNIC policy approved: https://edms.cern.ch/document/584092/1
• Define rules, policies, and management structures
• Define tools for
  • Controls network configuration, management & maintenance
  • Controls PC configuration, management & maintenance
• Investigate technical means and propose implementation
• Stimulate general security awareness
Security Policy
• Network Domains
  • Physical network segregation & logical sub-domains
• Hardware Devices
  • Rules for use of USB, VPN, CD-ROMs, wireless access, …
• Operating Systems & Applications
  • Centrally managed installations
  • Strategy for security patches
• Logins & Passwords
  • Ensure traceability
  • Restriction of generic accounts
  • Following IT recommendations
• Security Incident Reporting
  • Reporting and follow-up
  • Disconnection if a risk for others
Phase II: Implementation
[Timeline 09/2004 – 07/2006: Networking, NiceFC and LinuxFC move from specification through development to pilot; WTS installation and pilot; deployment; awareness campaign; training on policy and tools]
• Deployment of the CNIC policy
  • Tools for configuration, management & maintenance
  • User training (e.g. CNIC Users Group)
  • Installation of Windows Terminal Servers
  • Full separation of TN and GPN (Jan. 9th, 2006)
Network Segregation
• Technical Network (TN) and Experiment Networks (EN)
  • Only operational devices
  • Domain Manager with technical responsibility
  • Authorization procedure for new connections
• Desktop computing (GPN)
  • Development & testing should be done on the GPN
• Network monitoring
  • Statistics & intrusion detection
Separation of the TN
• Essential services are "trusted" (see the sketch after this list)
  • DNS, NTP, Oracle, DFS, AFS, TSM, …
  • Dedicated AB & TS servers
• Separation planned for Jan. 9th, 2006
  • Temporary "trusted" groups available
  • Solutions to reduce this set must follow
• What about YOUR system?
  • Check & update the LAN DB (http://net/register)
  • List all vital devices connected to the GPN
  • List all vital central services
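As a rough illustration of the separation rule described above, here is a minimal sketch of the assumed logic (not the actual network configuration): traffic stays inside its own domain, and only the explicitly "trusted" central services are reachable across the TN/GPN boundary. Domain and service names are placeholders.

```python
"""Minimal sketch of the TN/GPN segregation rule.  The domain names and
the set of trusted services below are illustrative assumptions only."""

TRUSTED_SERVICES = {"DNS", "NTP", "Oracle", "DFS", "AFS", "TSM"}

def connection_allowed(src_domain, dst_domain, dst_service=None):
    """Return True if a connection from src_domain to dst_domain is permitted."""
    if src_domain == dst_domain:
        return True                        # traffic inside one domain is fine
    if dst_service in TRUSTED_SERVICES:
        return True                        # exception for trusted central services
    return False                           # everything else (e.g. GPN -> TN) is blocked

if __name__ == "__main__":
    print(connection_allowed("TN", "TN"))                       # True
    print(connection_allowed("GPN", "TN"))                      # False: full separation
    print(connection_allowed("TN", "GPN", dst_service="NTP"))   # True: trusted service
```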
Long-Term Adaptations
• Office or wireless connection to a control system:
  • Connect to an application gateway
  • Open a session there to connect to front-ends, controls machines and/or PLCs
• Vulnerable devices (e.g. PLCs):
  • Protected against security risks
  • Grouped into "functional sub-domains"
  • Access only possible from their host system
  • Disconnection at "breakpoints"
NiceFC and LinuxFC
• User responsibility
  • Groups of computers
  • Responsible manager(s)
  • User-defined configuration
• Centrally managed OS & software
  • Add your own packages (soon also for office PCs)
• Transparent procedures
  • Installation scenarios
  • Responsibles contacted if patches need to be applied
Central Services
• Operation, support and maintenance
  • NiceFC & LinuxFC
  • Centralized servers (in bldg. 513): DNS, NTP, DB, DFS, AFS, …
  • Standard Windows Application Gateways (Windows Server 2003)
• Hardware support
  • Standard network equipment
  • Network connections (24 h/day, 365 d/year)
  • Standard ("office") PCs
• …do YOU have specific needs?
Phase III: Operation
[Timeline 09/2004 – 07/2006: Networking, NiceFC and LinuxFC move from specification through development and pilot into operation; WTS installed and piloted; deployment; awareness campaign; training on policy and tools]
• Review of the effectiveness of policy & methods
  • Review improvements & suggestions
  • Incorporate user feedback
• Reduce the set of "trusted" machines
  • Optimize security on the TN
  • Adapt solutions for piquet & development
What do YOU have to do?
• As Hierarchical Supervisor
  • Make security a work objective
  • Ensure follow-up of awareness training
• As Budget Responsible
  • Collect requirements for security
  • Assure funding
• As Technical Responsible
  • Take responsibility in your domain
  • Delegate implementation to system managers
  • Provide a list of devices and needed services
What can YOU do NOW?
• Use managed systems wherever possible
  • Ensure prompt security updates: applications, patches, anti-virus, password rules, logging configured and monitored, …
• Ensure security protections before connecting to a network
  • E.g. firewall protection, automated patch and anti-virus updates
• Use strong passwords and sufficient logging (a password-check sketch follows below)
  • Check that default passwords are changed on all applications
  • Passwords must be kept secret: beware of "Google hacking"
  • Ensure traceability of access (who and from where)
  • Password recommendations: http://cern.ch/security/passwords
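As a small illustration of the password advice above, the sketch below flags obviously weak or default passwords. The rule set and the default-password list are assumptions chosen for illustration only; the authoritative guidance remains http://cern.ch/security/passwords.

```python
"""Minimal sketch: flag weak or default passwords.  The rules and the
default-password list are illustrative assumptions, not the official CERN rules."""
import re

COMMON_DEFAULTS = {"password", "admin", "123456", "changeme", "operator"}

def password_ok(password):
    """Return (True, 'ok') or (False, reason) for a candidate password."""
    if password.lower() in COMMON_DEFAULTS:
        return False, "default or very common password"
    if len(password) < 8:
        return False, "shorter than 8 characters"
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    if sum(1 for c in classes if re.search(c, password)) < 3:
        return False, "use at least three character classes (upper, lower, digits, symbols)"
    return True, "ok"

if __name__ == "__main__":
    for candidate in ["operator", "Summer05", "N0t-s0-Obv1ous!"]:
        print(candidate, "->", password_ok(candidate))
```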
Summary
• Adoption of open standards exposes CERN assets to security risks.
• CNIC provides methods for mitigation. CNIC tools are ready to be applied.
• Join the CNIC Users Exchange.
Do you want to act BEFORE or AFTER the incident?
Questions?
• Domain Managers:
  • GPN: IT/CS
  • TN: Uwe Epting & Søren Poulsen (TS); Pierre Charrue, Alastair Bland & Nicolas de Metz-Noblat (AB & AT)
  • ALICE EN: Peter Chochula
  • ATLAS EN: Giuseppe Mornacchi
  • CMS EN: Martti Pimia
  • LHCb EN: Beat Jost
• CNIC home page: http://cern.ch/wg-cnic
• CNIC TWiki: https://uimon.cern.ch/twiki/bin/viewauth/CNIC/WebHome
• Security incidents: Computer.Security@cern.ch
• Computer security info: http://cern.ch/security