An Instantaneous Introduction to the Alliance Access Grid
Michael Grobe
Assistant Director, Academic Computing Services
The University of Kansas
September 2000
The Access Grid is an Internet-based model for video conferencing developed by the Futures Lab (FL) within the Mathematics and Computer Science (MCS) division of Argonne National Laboratory (ANL). The Access Grid is an extension of the Alliance Computational Grid, a distributed computing environment designed to give any network user convenient access to high-performance computer systems.
As described on the Access Grid web site: • "The Access Grid is the ensemble of resources that can be used to support human interaction across the grid. It consists of: • multimedia display, presentation and interaction environments, • interfaces to grid middleware, • interfaces to visualization environments
The Access Grid will support large-scale distributed meetings, collaborative work sessions, seminars, lectures, tutorials and training. The Access Grid design point is group-to-group communication (thus differentiating it from desktop-to-desktop tools that are focused on individual communication)." The Access Grid includes the notion of a "persistent" video conferencing venue, a conferencing site operating continuously and accessible to a wide audience of users on an ad hoc basis.
Basic functionality • An Access Grid "node" will be a small conference room or auditorium, provisioned with the equipment to participate in a multipoint video conference. The basic functionality provided within the node is: • Audio encoding using one or more microphones • Video encoding or "capture" using one or more cameras • Audio presentation using one or more speakers • Video display via one or more computer monitors and/or video projection techniques • Display of PowerPoint "slides" under the control of a presenter located either on-site or at a remote site.
To achieve this functionality the Access Grid model relies upon the ability to send and receive Internet Multicast traffic to and from all conference nodes. The Access Grid is based on software (vic and rat) developed as part of the Internet Multicast backbone, or MBONE, which provided multicast services over the unicast Internet backbone (using "tunnels", or "bridges", between multicast nexus sites).
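The underlying multicast model can be sketched in a few lines of Python. This is only an illustration of the send/receive mechanics that vic and rat build upon (they actually exchange RTP streams); the group address and port below are arbitrary placeholders, not real Access Grid venue addresses.

    import socket
    import struct

    GROUP = "224.2.128.9"   # arbitrary example multicast group
    PORT = 54321            # arbitrary example port

    # Receiver: join the multicast group so the network delivers group traffic here.
    recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    recv_sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    recv_sock.bind(("", PORT))
    membership = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    recv_sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

    # Sender: any host on a multicast-enabled network can transmit to the same
    # group/port, and every member of the group receives the datagram.
    send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    send_sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 32)  # limit scope
    send_sock.sendto(b"hello, Access Grid", (GROUP, PORT))

    print(recv_sock.recvfrom(1500))  # -> (b'hello, Access Grid', (sender_ip, port))

Where native multicast is unavailable, the MBONE approach tunnels this same traffic between multicast-capable islands over ordinary unicast links.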
Software components • The Access Grid model revolves around two pieces of software: • vic: the video conferencing tool, and • rat: the robust audio tool • and involves several other applications: • Distributed PowerPoint • a MUD • the Multicast Beacon • Virtual Venue • Virtual Network Computing
Video Conference (vic) • vic was developed by Steve McCanne and Van Jacobson at Lawrence Berkeley Laboratory. It is intended to link multiple sites with multiple simultaneous video streams over a multicast infrastructure. • vic can perform two basic functions: • take data from video capture cards in the PC to which cameras (or other video devices) are attached and send it over the network. • receive data from the network and display it on a video monitor or on some other attached video device, such as a video projector.
Note that vic may be run in such a way that it only receives video transmissions or only sends transmissions; it is not required to do both at the same time. For more information about vic see: http://www-mice.cs.ucl.ac.uk/multimedia/software/vic
Robust Audio Tool (rat) • rat is a more recent successor to the Visual Audio Tool (vat), which was developed by Steve McCanne and Van Jacobson at Lawrence Berkeley Laboratory. rat allows multiple users to engage in an audio conference over the Internet in multicast mode. rat can perform two basic functions: • take data from the sound card in the PC to which microphones, headphones, or other audio devices are attached and send it over the network. • receive data from the network and send it to speakers, headphones, or another attached sound processing device, such as a tape recorder.
rat displays a list of connected participants and identifies who is speaking and who is listening at any given time. For more information about rat see http://www-mice.cs.ucl.ac.uk/multimedia/software/rat and the Access Grid web site.
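In practice, both tools are started against the multicast address/port pairs that define a particular session. The sketch below shows one way a node might launch them from Python; the addresses are placeholders, and additional command-line options (TTL, codec, device selection) vary between vic/rat versions, so they are omitted here.

    import subprocess

    # Placeholder session description for an imaginary venue; the real values
    # come from the conference organizers or the Virtual Venue pages.
    VIDEO_SESSION = "224.2.211.5/55524"   # multicast address/port for vic
    AUDIO_SESSION = "224.2.211.6/16964"   # multicast address/port for rat

    # Both MBONE tools accept an "address/port" argument naming the session.
    vic = subprocess.Popen(["vic", VIDEO_SESSION])
    rat = subprocess.Popen(["rat", AUDIO_SESSION])

    vic.wait()
    rat.wait()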
The Gentner AP400 Echo Canceller Within the Access Grid model, signals from and to attached audio equipment are funneled through an "echo canceller" made by the Gentner Communications Corporation to eliminate certain kinds of echoes produced during networked conferencing. It is probably fair to say that the Gentner echo canceller is the major component of the audio conferencing system. Networks of Gentners work together to provide useful audio signal exchanges.
The Gentners can use four different connectivity infrastructures: • a point-to-point telephone connection, • a telephone connection to a telephone bridge, • a computer network, or • Gentner's own local area network, called G-link. • When a Gentner uses a computer network to connect to other Gentners, it connects to the computer just as it would to a simple codec (compression/decompression device).
The Distributed PowerPoint software The Argonne Distributed PowerPoint software allows a single presenter at one node to control PowerPoint applications running on computer systems located at other Access Grid nodes. For example, a conference speaker can run PowerPoint along with the Distributed PowerPoint master software on her laptop computer at the podium of one of the AG sites. When the speaker changes slides, the master will notify the DPPT server, which will notify DPPT clients running on systems at other nodes which will, in turn, direct their local PowerPoint programs to change slides.
Note that this approach requires that some PowerPoint features be removed or disabled prior to presentation, because Distributed PowerPoint cannot deal with them. (See later discussions of VNC and "scan conversion" for alternatives.) The DPPT clients can operate on PowerPoint slidesets published on a Web server, or on local copies of the slidesets.
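The overall flow can be pictured with a small sketch. This is not Argonne's actual DPPT protocol or code; it assumes a hypothetical server that pushes newline-terminated slide numbers over TCP, and shows a client applying each one to a local PowerPoint slide show through the standard COM automation interface (Windows with the pywin32 package).

    import socket

    # Hypothetical DPPT server location; the real server address is site-specific.
    HOST, PORT = "dppt-server.example.edu", 9999

    def goto_slide(number: int) -> None:
        """Tell the locally running PowerPoint slide show to jump to a slide."""
        import win32com.client
        powerpoint = win32com.client.Dispatch("PowerPoint.Application")
        powerpoint.SlideShowWindows(1).View.GotoSlide(number)

    # Minimal client: read slide numbers pushed by the server and apply each one.
    with socket.create_connection((HOST, PORT)) as conn:
        for line in conn.makefile("r"):
            goto_slide(int(line.strip()))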
The MUD software Operators at each site involved in an Access Grid conference typically keep in touch by using software originally developed for online "role-playing" games, generically called "Multi-User Dungeons", or "MUDs". (MUD functionality is similar to that of Internet Relay Chat operating with access control.) Argonne runs a MUD server for use by Access Grid operators, who run MUD clients on their desktop systems. tkMOO-lite is currently the recommended MUD client for this purpose, but others, such as TinyFugue in the Unix environment, can be used as well. tkMOO will run on both Windows and Linux systems, so it may be run on any of the AG component systems described below.
The Multicast Beacon • To help diagnose multicast network problems during conferences, Argonne promotes the use of the NLANR multicast "Beacon" monitoring system, which includes three pieces of software: • a Beacon to be run at each AG node, • a server to collect transmission statistics from a collection of Beacons, and • a Beacon viewer that displays data collected by the server.
The Beacon at each node connects to a Multicast group and collects latency, loss, and packet misordering statistics from all other beacons connected to that Multicast group and sends them to the Beacon server. The Beacon viewer displays these traffic statistics as a matrix showing traffic to and from each Beacon attached to the server. (There is also a web-based Beacon.) At KU the Beacon is running on the AG node's video capture system.
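The essence of what each Beacon measures can be shown with a short sketch. This is not the NLANR code; it just illustrates how loss and misordering statistics fall out of sequence numbers carried in each probe packet, using a placeholder group and port.

    import socket
    import struct

    GROUP, PORT = "224.2.200.1", 10002   # placeholder beacon group/port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY))

    last_seq = {}            # highest sequence number seen from each sender
    lost = misordered = 0

    while True:
        data, (sender, _) = sock.recvfrom(1500)
        seq = struct.unpack("!I", data[:4])[0]   # assume each probe starts with a sequence number
        previous = last_seq.get(sender)
        if previous is not None:
            if seq > previous + 1:
                lost += seq - previous - 1       # a gap means packets went missing
            elif seq <= previous:
                misordered += 1                  # an old number means reordering (or a duplicate)
        last_seq[sender] = max(previous or 0, seq)
        print(f"{sender}: lost={lost} misordered={misordered}")

A real Beacon also timestamps its probes to estimate latency, and reports the results to the central Beacon server rather than printing them locally.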
The Virtual Venue software • Coordinating multiple group conferences can be complicated. Argonne has developed a collection of web pages and Java applications that can simplify the process. • The Virtual Venue is basically a web page that lets users select a "conference" to attend. In this context a "conference" is composed of • a vic multicast address, • a rat multicast address, and • a MUD identifier.
If your systems are Virtual Venue-enabled, the display system operator can click on a conference room name and the vic, rat and MUD applications running on the video display, video capture and audio processing systems will all be started with target addresses and settings appropriate to the selected conference room. This coordination is accomplished by running an "event server" and an event controller on the display system, along with "event listeners" on the video capture and audio processing systems.
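One way to picture a venue definition and the event fan-out is the sketch below. The field names, the example values, and the notification scheme are illustrative assumptions, not the Argonne implementation; a real event server pushes its instructions over the network to the listeners on the capture and audio machines.

    from dataclasses import dataclass

    @dataclass
    class Venue:
        """A 'conference room': everything a node needs in order to join it."""
        name: str
        vic_session: str      # multicast address/port for video
        rat_session: str      # multicast address/port for audio
        mud_room: str         # MUD room used by the node operators

    # Hypothetical venue entry, in the spirit of the Virtual Venue pages.
    lobby = Venue(
        name="Example Lobby",
        vic_session="224.2.211.5/55524",
        rat_session="224.2.211.6/16964",
        mud_room="lobby",
    )

    def announce(venue: Venue, listeners: list[str]) -> None:
        """Stand-in for the event server: tell each listener which venue to join."""
        for host in listeners:
            print(f"{host}: start tools for {venue.name} "
                  f"(vic {venue.vic_session}, rat {venue.rat_session}, MUD {venue.mud_room})")

    announce(lobby, ["video-capture", "audio-processing"])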
Virtual Network Computing (VNC) VNC allows users to share monitor screens over the Internet in a variety of modes. In the Access Grid environment, VNC allows a speaker to share his/her podium laptop with Access Grid display systems, which can then project it at remote nodes. This is useful when a speaker wishes to give real-time demonstrations or present PowerPoint slides that include "fancy" features, such as animations, that cannot be displayed using Distributed PowerPoint. VNC employs a client-server architecture, and there are clients and servers available for Windows 98/NT/2000 and Unix operating systems.
Although not part of the original Access Grid canon, VNC has been employed during several Access Grid conferences and shows promise for future applications. VNC eliminates the coordination effort required to display Distributed PowerPoint slide sets. (No files need to be downloaded ahead of time and no slide synchronization is required.) In general, update times are a function of the number of pixels changed and the number of remote viewers (as well as available bandwidth), so VNC will not be appropriate for all applications. Instructions for setting up a VNC relay are presented in Using Unix-based VNC to relay other VNC traffic.
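At its simplest, a VNC relay is a TCP forwarder sitting between the podium VNC server and the remote viewers. The sketch below accepts viewer connections and shuttles bytes in both directions; the hostname and ports are placeholders, and a production relay would typically multiplex a single connection to the podium server out to many viewers rather than opening one upstream connection per viewer.

    import socket
    import threading

    VNC_SERVER = ("podium-laptop.example.edu", 5900)   # placeholder podium VNC server
    LISTEN_PORT = 5901                                 # where remote viewers connect

    def pump(src: socket.socket, dst: socket.socket) -> None:
        """Copy bytes one way until either side closes."""
        while chunk := src.recv(4096):
            dst.sendall(chunk)
        dst.close()

    listener = socket.socket()
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("", LISTEN_PORT))
    listener.listen()

    while True:
        viewer, _ = listener.accept()
        upstream = socket.create_connection(VNC_SERVER)
        # Relay traffic in both directions between this viewer and the podium server.
        threading.Thread(target=pump, args=(viewer, upstream), daemon=True).start()
        threading.Thread(target=pump, args=(upstream, viewer), daemon=True).start()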
Basic system configurations The AG model uses a collection of commodity components to provide various services. To assure optimal responsiveness, individual functions (video capture, video display, audio capture and presentation) are placed on separate computer systems. A variety of hardware and software configurations can provide the required video conferencing functionality:
This section shows one such configuration: • 1 computer system running Linux for audio capture and presentation • 1 computer system running Linux for video capture • 1 computer system running Windows 2000 for video display through one or more video cards controlling one or more video projectors illuminating one or more screens • 1 computer system running Windows 98 for controlling the audio echo-canceller/mixer • 1 speaker's podium computer running Windows 2000 or NT to control remote PowerPoint displays and/or give real-time demonstrations using some Windows application
Audio capture and presentation computer • The audio capture computer: • Converts analog audio from mixers and mics to digital form for transmission by rat over the multicast network • Converts digital audio to analog audio for distribution to room speakers and/or headsets • Software • RedHat Linux version 6.1 • rat • AudioResourceManager from the Virtual Venue suite
Video capture computer • The video capture computer system converts analog video from cameras, VCRs, and other sources to digital form for transmission by vic over the multicast network. • Software • RedHat Linux version 6.1 • Stock kernel with the BTTV drivers • vic • VideoResourceMonitor from the Virtual Venue suite
Video display computer • Receives video content over the network and displays it on the PC monitor as well as one or more other monitors and/or video projectors if desired (using the ability of Win2K to display its console screen across multiple video cards) • Decodes video streams • Runs collaboration applications such as Distributed PowerPoint and the VNC viewer
Software • Windows 2000 • vic • PowerPoint • Argonne Distributed PowerPoint client • EventServerMonitor and DisplayResourceManager from the Virtual Venue suite • VNC viewer
Echo canceller control computer The audio control computer runs Windows 98 and uses custom Gentner Control Software to control the Gentner mixer/echo canceller. See http://www.gentner.com for more details. Within the KU ACS node, this function is provided by a 133 MHz PC.
Speaker's podium computer • The speaker's podium computer runs: • Windows 98/NT/2000 • PowerPoint • the Argonne Distributed PowerPoint master software, and the • VNC server • Configuration suggested by Argonne: any laptop powerful enough to run PowerPoint
The KU ACS podium laptop is connected to a "scan converter" that converts the VGA/SVGA signal generated by the laptop to the NTSC video expected by video capture cards. The CORIOscan Select from TVONE lists for around $495 and can be used to produce a reasonably high-resolution image (1280x860).
Alternatives for displaying speaker slidesets • As mentioned earlier, the Access Grid provides several methods for displaying speaker slidesets. • use Distributed PowerPoint. • This is the "standard" method and provides high quality representation at every site with very little network traffic. Using DPPT means getting each slide set prior to use, stripping it of special PPT features and publishing it on a Web server for distribution to each remote site. This approach may not work well if the speaker relies on special features (such as fancy animations) or launches other applications during the talk.
use a VNC server running on the podium laptop and a VNC relay (as discussed earlier). This approach provides high-quality video, including simple animations and all PowerPoint features, but introduces some update delay and generates much more network traffic than the other alternatives. (If a version of VNC were produced to employ Multicast for image distribution, network traffic would be significantly reduced.)
split the podium laptop video output, send one channel to a local projector for the local audience, and one to a scan converter and then to a video capture card for distribution over vic from the video capture machine. • This will give excellent update speed both locally and remotely, but relatively poor image quality at remote sites. Text smaller than 20 points is usually not legible, but animations and videos present well (as long as high resolution is not necessary). This approach could be a very effective, general solution if vic could be used with a higher-quality codec than the usual H.261. An MPEG-1 codec is apparently under development and should provide a significant improvement.
use a commercial streaming video package. • For example, during the Kansas issue of Alliance Chautauqua 2000, Cisco IPTV was employed to present full-motion animations at high resolution. However, setting up for IPTV broadcasts is complex and requires access to an IPTV server, so this alternative will not be available to all.
Ancillary Servers • You may need to run some of the ancillary servers mentioned earlier on separate computer systems. For example, you may need boxes to run a • Distributed PowerPoint server, • VNC relay server, • MUD, and/or a • Virtual Venue server (should you wish to define your own Virtual Venues).
Operators • You will need from 1 to 4 operators, depending on how you apportion duties, to run an Access Grid node. With one operator per basic function you will need an operator for: • video display, • video capture, • audio control, and • network monitoring. • To some degree there is a trade-off between system costs and operator costs, and the staffing requirements will vary with the complexity of the presentations being offered at a site.
Additional Info The Access Grid web site: http://www.fp.mcs.anl.gov/fl/accessgrid/ For a more detailed version of this talk see: http://www.cc.ukans.edu/~acs/docs/access-grid-node/ Acknowledgments Some of the material for this web page has been taken from the Argonne Labs web site listed above, or from documents provided via that site.