Tutorials

Topics In Architecture

For Those Who Do Not Work In Computer Architecture: Fundamental Principles Of Current Issues In Computer Architectures
Yale N. Patt, University of Michigan, Ann Arbor

We will address today's hot issues and key ideas in computer architecture. The exact topics we cover will depend on the questions the attendees want us to address. This tutorial is absolutely NOT intended for those working in the field of Computer Architecture. Rather, it is for the computer professional who knows little or nothing about RISC, Data Flow, RAID, Neural Nets, Massively Parallel Supercomputers, Superscalar, Speculative Execution, Snoopy Caches, VLIW, DAE, HPS, Alpha, Pentium, PowerPC, DASH, NUMA, COMA, etc., but is willing to spend a day changing that. We will cover, in as much time as one day allows, a fair piece of the above plus a lot of other material that is interesting but not directly vital to your current job. My intent is not to race through 400 slides in 7 hours. In fact, we will start with only blank slides; what goes on them depends on what questions get asked. We will attempt to examine the fundamental concepts, answer as many questions as people want to ask, and, in whatever ways I can, try to explain the crux of many of the current market offerings in computer architecture. Work in information science, decision science, computer science, and computer engineering has become so specialized that it is pretty tough to keep up with anything tangential to one's own focused research area. This tutorial is one attempt to do something about that in a relaxed but not superficial way.

Yale N. Patt is Professor of Electrical Engineering and Computer Science at the University of Michigan, Ann Arbor, where he teaches computer architecture and directs the research of ten Ph.D. students in high performance computer implementation. He is also developing, with support from NCR, Digital Equipment Corporation, Hewlett-Packard, Motorola, and Tektronix, an experimental research laboratory for students to develop and measure the performance of real code executing on real systems. In addition, he consults extensively in industry on problems related to implementing high performance computer systems. His clients include Digital Equipment Corporation (since 1977) and AT&T/GIS Division (since 1986).

Topics In Biotechnology I

The Problem Of Computational Biology: An Introduction For The Computer Scientist
Lawrence, National Library of Medicine

This tutorial, aimed at the non-biologist, will survey modern computational biology at a high level. Topics addressed by this introduction include protein folding, genetic sequence analysis, molecular visualization, and rational drug design. The computational challenges of each problem will be described, the necessary biological background provided, and some current methods presented. Technologies discussed will include scientific visualization, supercomputing, statistical inference, artificial intelligence, and physical simulation.

Topics In Biotechnology II

Statistical Algorithms For Multiple Sequence Alignment And Structure Prediction
Chip Lawrence, Wadsworth Center for Laboratories and Research

Recently, statistical algorithms have been found useful for several problems in computational molecular biology. These include: an EM algorithm for identification and characterization of DNA regulatory binding sites; a Gibbs sampling algorithm for local multiple alignment of subtle sequence signals in protein sequences; alignment of large families of protein sequences using hidden Markov models; and the prediction of common RNA secondary structures using context-free grammars. In the 1970s it became widely recognized that many statistical problems are most easily addressed by pretending that critical missing data are available. In fact, for some problems, statistical inference is facilitated by creating a set of latent variables, none of whose values are observed. The key observation was that conditional probabilities for the values of the missing data could be inferred by applying Bayes' theorem to the observed data. This common statistical framework forms the basis of all of the above algorithms. This tutorial will review this framework, illustrate its application to each of the above algorithms, and describe the biological underpinnings of the models they use.
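The missing-data framework described above can be illustrated with a minimal EM sketch: here the unobserved component label of each data point plays the role of the missing datum, and the E-step infers its conditional probability by Bayes' theorem. The two-Gaussian mixture, the data, and the starting values are illustrative assumptions, not an example from the tutorial itself.

```python
# Minimal EM sketch for a 50/50 mixture of two unit-variance Gaussians.
# The component label of each point is the latent ("missing") variable.
import math

def em_two_gaussians(data, mu1, mu2, iters=50):
    """Estimate the two component means by expectation-maximization."""
    for _ in range(iters):
        # E-step: posterior probability that each point came from component 1,
        # via Bayes' theorem (equal priors, unit variance, normalizer cancels).
        resp = []
        for x in data:
            p1 = math.exp(-0.5 * (x - mu1) ** 2)
            p2 = math.exp(-0.5 * (x - mu2) ** 2)
            resp.append(p1 / (p1 + p2))
        # M-step: re-estimate each mean as a responsibility-weighted average.
        w1 = sum(resp)
        w2 = len(data) - w1
        mu1 = sum(r * x for r, x in zip(resp, data)) / w1
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / w2
    return mu1, mu2

# Two visibly separated clusters around -2 and +2 (hypothetical data).
data = [-2.2, -1.9, -2.1, -1.8, 2.0, 2.3, 1.7, 2.1]
mu1, mu2 = em_two_gaussians(data, mu1=-1.0, mu2=1.0)
```

The same alternation between inferring the missing data and re-estimating parameters underlies the binding-site EM algorithm, the Gibbs sampler, and the hidden Markov model training mentioned above, though the models differ greatly in detail.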

Dr. Charles Lawrence received his B.S. in physics from Rensselaer Polytechnic Institute and his Ph.D. in applied operations research and statistics from Cornell. He was an assistant professor at RPI from 1971-1976. He then moved to the NYS Health Department, where he conducted research in statistics and epidemiology. In 1985, he joined the Wadsworth Center for Laboratories and Research, where his research has focused on computational molecular biology. He has also been a statistical consultant to private corporations and agencies.

Topics In Biotechnology III

Introduction To Computer Tools In Molecular Modeling
Mark C. Surles, San Diego Supercomputer Center
Teri E. Klein, University of California, San Francisco

Improvements in computer algorithms, hardware performance, and display technology are increasing the use of computers for drug discovery and molecular analysis. This tutorial surveys disparate tools used in computer-aided drug design. We emphasize the benefits of interactive computer graphics for understanding molecular structures and properties, analyzing simulations, designing new compounds, and communicating results. We also discuss the requirements of massive database searches for known compounds, uses of interactive simulation of molecular mechanics, automated rule-based generation of new compounds, and docking of molecules (drug-receptor and macromolecule-macromolecule).

Mark Surles is a researcher at the San Diego Supercomputer Center and the University of California, San Diego, Department of Chemistry. He received an M.S. and Ph.D. in Computer Science under Fred Brooks at the University of North Carolina at Chapel Hill. He is the author of SCULPT, an interactive graphical modeling system for proteins. His research interests include molecular graphics, parallel processing, physically based modeling, numerical analysis, and programming languages.

Teri E. Klein is an associate adjunct professor at the University of California, San Francisco. She serves as a scientific consultant for collaborative research with scientists from both academia and industry in the world-renowned NIH Research Resource Computer Graphics Laboratory. Her areas of research include the development of computational methods for the analysis and visualization of molecular structures. She has published more than 25 refereed papers.

Topics In Software

Introduction To Parallel Computing
Vipin Kumar, University of Minnesota

This tutorial will provide an overview of parallel architectures (SIMD/MIMD, shared versus distributed memory, interconnection networks, embeddings), routing (store-and-forward vs. wormhole routing), examples of some currently available parallel computers, basic communication operations (such as one-to-all broadcast, scan, synchronization), basic metrics for performance evaluation (speedup, efficiency, isoefficiency), matrix algorithms (matrix-vector products, matrix-matrix multiplication), sorting, graph problems, and parallel programming paradigms. This tutorial will be useful for anyone interested in solving problems on parallel computers and is intended both for people from the computer science community and for practicing engineers and scientists. Level of presentation: 50% beginner, 40% intermediate, 10% advanced.
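The first two performance metrics named above have simple standard definitions: speedup is the serial run time divided by the parallel run time, and efficiency is speedup per processor. A minimal sketch, with illustrative timing numbers chosen here rather than taken from the tutorial:

```python
# Basic parallel performance metrics: speedup and efficiency.

def speedup(t_serial, t_parallel):
    """Speedup S = T_serial / T_parallel."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, num_procs):
    """Efficiency E = S / p; 1.0 would mean perfect linear speedup."""
    return speedup(t_serial, t_parallel) / num_procs

# Hypothetical example: a job taking 100 s serially and 16 s on 8 processors.
s = speedup(100.0, 16.0)        # 6.25
e = efficiency(100.0, 16.0, 8)  # 0.78125
```

Isoefficiency, the third metric, goes a step further: it asks how fast the problem size must grow with the number of processors to keep this efficiency value constant.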

Dr. Vipin Kumar has done extensive research and teaching in the area of parallel computing over the last 10 years. He has published over 70 research articles and co-authored the textbook Introduction to Parallel Computing. He has also taught many seminars on parallel computing for NASA, the U.S. Army, and the industrial community.

Topics In Information Systems I

Strategies For Successful Implementation Of Re-Engineering And Other I.T. Projects
Don Rossmoore, Rossmoore Associates

The biggest threat to successful implementation of re-engineering and other I.T. projects is the defensive and political behavior of the individuals in the organization. Typical approaches to implementation ignore human factors and assume that the individuals in the organization will act rationally and in the best interests of the organization. Rationality is seldom found. The normal organizational and individual response to change appears to be defensive and political, undermining the communication and coordination necessary for successful implementation.

In this workshop, we will provide participants with: 1) a description of normal human defensiveness and politics, and the barriers and problems they trigger; 2) a robust, reliable diagnostic method and steps; and 3) a reliable implementation strategy, which consists of a) the political role and responsibilities of top management; b) whom to empower, how to empower, and how power is used; c) what competent communication looks like and how to get it started; and d) what structure and procedures are needed to ensure acceptable coordination.

Don Rossmoore received his Ed.M. from Harvard University and his M.S. and Ph.D. from the Anderson School of Management at UCLA. For the past twenty years, as managing partner of Rossmoore Associates, he has facilitated the implementation of strategy and strategic change projects for clients such as Apple Computer, Hughes Aircraft Company, MasterCard International, the New York Stock Exchange, and the Prudential Insurance Company.

Topics In Information Systems II

Unifying Expert Systems And Decision Analysis: Recent Developments
Barry G. Silverman, The George Washington University

This tutorial concerns the unification of the expert systems (broadly defined) and decision science fields. More precisely, the topic concerns the unification of qualitative with quantitative reasoning.

This tutorial is intended for decision science practitioners or knowledge engineers interested in a how-to lecture on integrating expert systems and decision analysis, and on unified expert advisory systems approaches they can embed within environments of interest. There will be hands-on interaction with several working decision-analytic critics and with a critic programming shell. Researchers interested in improving the knowledge-based decision support approach should also find this tutorial beneficial.

Dr. Barry G. Silverman is the recipient of a 1991 Innovative Application of A.I. Award from AAAI for a 5,000-chunk knowledge-based, decision-analytic critic he built and deployed for decision paper authoring at 17 sites of the U.S. Army. He is also the recipient of two Phase II SBIR Awards.

HICSS-28 HOMEPAGE
