Researchers at the University of Arizona have worked for the past several years to create collaborative groupware tools, procedures, and facilitation techniques to enable groups of end-users to participate effectively in different phases of the software engineering process. The research has resulted in a new methodology - Collaborative Software Engineering Methodology - that combines advanced group collaboration techniques with the best elements of systematic reuse, data integration, and rapid prototyping methods in order to produce integrated and interoperable systems.
Collaborative Software Engineering Methodology has been taught to multiple systems development teams within the U. S. Department of Defense, and is being used by these teams to guide the development of integrated information systems for the Army, Air Force, Navy, Marines, and Defense Logistics Agency.
Collaborative Software Engineering Methodology provides a framework for effective and efficient user involvement throughout the systems development process. This new methodology includes mechanisms to support three layers of user involvement: selected user representatives, user groups of subject matter experts (SMEs), and the entire user community. Specifically, it includes intimate involvement of individual user representatives for development of preliminary models and prototypes; of groups of SMEs to refine, validate, and prioritize requirements; and of the broader user community during initial needs surveys and wide-scale beta testing.
This tutorial includes an overview of the methodology and a demonstration of the specialized collaborative tools developed for use within it: GroupSystems Activity Modeler, Group Data Modeler, and Group Scenario/Process Modeler.
Douglas L. Dean is a research scientist, facilitator, and consultant at the Center for Management Information (CMI) at the University of Arizona. He received his Ph.D. in MIS from the University of Arizona in 1995. His areas of expertise include electronic meeting system support of modeling, business process improvement, and information systems requirements collection. He also consults and facilitates meetings in these areas.
James D. Lee is a research scientist, consultant, and facilitator for CMI at the University of Arizona. He received his Ph.D. in MIS from the University of Arizona in 1995. Dr. Lee's areas of expertise include electronic meeting support for activity/data modeling, BPR, systems analysis and design, and activity-based costing. He has developed software prototypes to support collaborative activity and data modeling.
Ann M. Hickey is an MIS doctoral candidate at the University of Arizona. She received her M.S. in MIS from the University of Arizona in 1990, and has worked as a program manager and senior systems analyst for the Department of Defense. Her interests include collaborative requirements elicitation, systems analysis and design, and electronic meeting support for activity and scenario modeling.
Jay F. Nunamaker is Regents and Soldwedel Professor of MIS at the University of Arizona. Dr. Nunamaker's research interests include Group Support Systems and computer-aided support of systems analysis and design.
People are now learning in ways that were impossible before the advent of information technology. With new technology come new opportunities and new challenges, and not all of these challenges are technical. Some technologies allow fundamental shifts in the roles of teachers and learners, but such shifts can be difficult. How does a teacher move from being the "Sage on the Stage" to being the "Guide on the Side"?
In this workshop you will see and use new technologies for synchronous learning, ranging from real-time video for synchronous distance learning, to web-based tools for worldwide network interactions, to technology for co-located cooperative learning. Participants and presenters will discuss successes and failures of roll-out efforts, and will explore the process of moving instructors through the paradigm shift from information-deliverer to mentor or coach. In this workshop you will hear and contribute to discussion of technology, research and practice, and the lessons learned on the front lines of technology-supported synchronous learning.
Robert O. Briggs investigates the use of technology to improve group productivity as a Research Fellow in the Center for the Management of Information at the University of Arizona. His work includes theoretical modeling of group productivity to support the design, development, use, and evaluation of new technologies developed at the Center. Recent work examines the use of Group Support Systems in the classroom to support cross-disciplinary problem-based learning. Dr. Briggs holds a Ph.D. in information systems from the University of Arizona, and received his MBA and B.S. in information systems, and a B.S. in art history from San Diego State University.
Jay F. Nunamaker is Regents and Soldwedel Professor of MIS at the University of Arizona. Dr. Nunamaker's research interests include Group Support Systems and computer-aided support of systems analysis and design.
Asynchronous Learning Networks (ALNs) have been used for on-line courses, which typically are limited to a small number of students. Over the last few years, several networked tools have been developed to use ALNs as supplements in large traditional lecture classes (>250 students).
The tutorial will focus on the system CAPA (Computer-Assisted Personalized Approach) developed at Michigan State University. CAPA can be used for assignments, supplementary questions, quizzes, and examinations. Emphasis is placed on conceptual exercises to stimulate discussion among students during and outside lecture sessions. Specific examples of problems, questions and simulations that are applicable in many different fields will be presented. In addition to an introduction of the features and main components of CAPA and similar systems, the tutorial will present and discuss results of student performance data.
Michael Thoennessen is Associate Professor at Michigan State University in the Department of Physics and Astronomy with an appointment at the National Superconducting Cyclotron Laboratory. He received his Ph.D. in experimental nuclear physics from the State University of New York at Stony Brook. His main research interest is the study of exotic nuclei and the application of networked technology in large traditional lecture classes.
Edwin Kashy is University Distinguished Professor at Michigan State University in the Department of Physics and Astronomy. He received a B.A. and Ph.D. from Rice University. His recent research interest has been the impact of networked technology in his classes. He has led the teams at MSU that developed CAPA.
The goal of this tutorial is to present a succinct, focused discussion of the best approaches to the conception, design, execution, and write-up of experimental research on collaboration technology. Our goal is to help researchers establish a program of experimental research that will be publishable in leading MIS, management, and psychology journals.
We will address five topics. Each will be considered separately, but all five are related and all must be done well to produce top-quality research. (In our experience, authors tend to emphasize some topics at the expense of others.) For each topic, we will explain why it is important, and discuss how to approach it. We will present a set of issues to consider, and provide a checklist of key factors.
1. Developing research question(s). Our intent is to focus on the process of how to identify and develop relevant questions.
2. Developing theory. Every experimental research study should extend current theory. We will discuss what is and is not "theory," the detail needed to define and refine constructs, and how to tell the theory "story" that underlies a research study.
3. Designing a study. Most researchers at some point in their career have read books and papers on experimental design. In this section, we will very briefly review these concepts, and quickly turn to their practical implementation in a collaboration technology research study.
4. Drawing conclusions and implications. Every study must present conclusions that generalize empirical results to theory. Good studies go beyond their results to develop insight and draw implications for future research and for practitioners (e.g., managers, system developers).
5. Top Ten list. As journal reviewers and editors, we've seen many good and bad examples of experimental research. We will discuss the leading Do's and Don'ts.
Alan Dennis is an associate professor of MIS at The University of Georgia. His research focuses on groupware and led to the development of Consensus @nyWARE, a Web-based groupware system by Soft Bicycle Corporation. His research has appeared in MIS Quarterly, Information Systems Research, Management Science, Journal of Applied Psychology, Communications of the ACM, and Academy of Management Journal. Dr. Dennis is department editor for groupware at DataBase.
J. S. Valacich is The Hubman Distinguished Professor in Information Systems at Washington State University. His primary research area is technology mediated group decision making and his past research has appeared in publications such as MIS Quarterly, Information Systems Research, Management Science, Academy of Management Journal, Communications of the ACM, Decision Science, Organizational Behavior and Human Decision Processes, and Journal of Applied Psychology. Dr. Valacich is associate editor at MIS Quarterly.
Environmental negotiation has recently resurfaced as an important issue at both local and global scales, a perception confirmed by an informal survey of participants in HICSS-31. Recent controversial topics include waste management, natural resource issues, legislative proposals, and historic preservation. Too often in an environmental dispute, the original disagreement escalates, and resolution is delayed for months or years. Amplified by media blitz, many disputes result in needless litigation as legal costs steadily climb. The demand for a quicker, less costly, and more satisfying process has created the need for a more effective approach to dispute resolution.
We propose an integrative framework that encourages negotiation, facilitation, mediation and arbitration as effective tools for problem framing and re-framing, problem solving and decision-making. Further, we focus on the use of information technology - e.g., group decision support systems, negotiation support systems, argumentation tools and the like - to enhance the effectiveness of environmental negotiation.
Using a series of lectures, case studies, and software demonstrations on environmental negotiation, mediation and conflict resolution using information technology, the tutorial is divided into four modules:
The four tutorial leaders are academics from three countries, all experts in negotiation theory, cross-cultural dispute settlement, and environmental negotiation, together representing almost a hundred years of research and consulting experience in the negotiation field and the design of information systems to support conflict resolution. Preparing this tutorial involved working with the Negotiation Support System minitrack (in its 7th year) and communicating by email with various research institutions in France, Finland, Japan, and the United States.
Information Retrieval algorithms support the computerized search of large document collections (millions of documents) to retrieve small subsets of documents relevant to a user's information need. Such algorithms are the basis for Internet search engines and digital library catalogues.
The three major theoretical models in information retrieval are Boolean/logic, vector space, and probabilistic. This tutorial will explain the unique characteristics and problems of each model. Application areas include foreign-language retrieval (for example, Chinese, where there are no word boundaries) and cross-language retrieval (where queries in one language are used to search document collections in another).
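To make the vector space model concrete, the sketch below ranks documents by the cosine of the angle between term-frequency vectors. It is a minimal illustration only (the class and method names are our own, and real systems add weighting schemes such as tf-idf):

```java
import java.util.*;

// Toy vector-space retrieval: documents and the query are represented as
// term-frequency vectors; relevance is the cosine of the angle between them.
public class VectorSpaceDemo {

    // Build a term-frequency map from whitespace-separated text.
    static Map<String, Integer> termFreq(String text) {
        Map<String, Integer> tf = new HashMap<>();
        for (String t : text.toLowerCase().split("\\s+"))
            tf.merge(t, 1, Integer::sum);
        return tf;
    }

    // Cosine similarity between two term-frequency vectors.
    static double cosine(Map<String, Integer> a, Map<String, Integer> b) {
        double dot = 0, normA = 0, normB = 0;
        for (Map.Entry<String, Integer> e : a.entrySet()) {
            Integer bv = b.get(e.getKey());
            if (bv != null) dot += e.getValue() * bv;   // shared terms only
            normA += e.getValue() * e.getValue();
        }
        for (int v : b.values()) normB += v * v;
        if (normA == 0 || normB == 0) return 0;
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        Map<String, Integer> q  = termFreq("information retrieval");
        Map<String, Integer> d1 = termFreq("models of information retrieval");
        Map<String, Integer> d2 = termFreq("java network programming");
        // d1 shares terms with the query, so it ranks higher than d2.
        System.out.println(cosine(q, d1) > cosine(q, d2)); // prints true
    }
}
```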
Search engine performance can be subject to unbiased, objective, large-scale testing (hundreds of queries against millions of documents) as done in the Text Retrieval Conferences (TREC) sponsored by the Defense Advanced Research Projects Agency (DARPA) and the National Institute of Standards and Technology (NIST).
This course is designed to provide a fast-paced yet rigorous introduction for academic and industrial researchers whose background lies outside the Information Retrieval area.
Attendees will obtain a basic understanding of the models upon which modern text retrieval software is based and how search engine performance is measured. The tutorial should provide each participant with a starting point for further self-education.
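As a taste of how retrieval effectiveness is measured, the sketch below computes precision and recall, the two basic measures underlying TREC-style evaluation (the class name and document sets are illustrative):

```java
import java.util.*;

// Set-based precision and recall over retrieved vs. relevant document IDs.
public class EvalDemo {

    // Precision: fraction of retrieved documents that are relevant.
    static double precision(Set<String> retrieved, Set<String> relevant) {
        if (retrieved.isEmpty()) return 0;
        Set<String> hits = new HashSet<>(retrieved);
        hits.retainAll(relevant);
        return (double) hits.size() / retrieved.size();
    }

    // Recall: fraction of relevant documents that were retrieved.
    static double recall(Set<String> retrieved, Set<String> relevant) {
        if (relevant.isEmpty()) return 0;
        Set<String> hits = new HashSet<>(retrieved);
        hits.retainAll(relevant);
        return (double) hits.size() / relevant.size();
    }

    public static void main(String[] args) {
        Set<String> retrieved = new HashSet<>(Arrays.asList("d1", "d2", "d3", "d4"));
        Set<String> relevant  = new HashSet<>(Arrays.asList("d1", "d3", "d5"));
        // 2 of the 4 retrieved documents are relevant; 2 of 3 relevant found.
        System.out.println(precision(retrieved, relevant)); // prints 0.5
    }
}
```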
Frederick Gey directs the UC Berkeley entries to the TREC conferences, and is the General Chairman for ACM SIGIR99, the 22nd Annual Conference of Research and Development in Information Retrieval to be held at the University of California, Berkeley in August 1999. His research specializes in probabilistic techniques. He holds a Ph.D. in Information Science from UC Berkeley. He is principal investigator of NSF Grant IRI-9630765 ("Probabilistic Retrieval of Full-Text Document Collections Using Logistic Regression") and co-principal investigator of the DARPA research contract "Search Support for Unfamiliar Metadata" (1997-2000).
Skillful graphic design for user interfaces of all kinds (for GUIs of productivity tools on client-server networks, multimedia CD-ROMs, and the Web) is crucial to the success of innovative computer-based products, especially as computers become absorbed into consumer products intended for diverse, international user communities.
Presented by a pioneer of graphic design for computer graphics and a leader in the field of user interface design, electronic document design, and knowledge visualization, this tutorial will give researchers and developers valuable insight into key design issues and show how to achieve effective visual communication, providing practical guidance for research and commercial product development. It will introduce terminology, principles, guidelines, and heuristics for using information-oriented, systematic graphic design in user interfaces, especially for the design of metaphors, mental models, navigation schema, icons, and dialogue boxes. Many current window managers and user-interface design tools do not provide sufficient functions or guidance for these topics.
Aaron Marcus is an internationally recognized authority on user interface, multimedia, and document design, and the founder of AM+A, cited in 1995 as one of the fastest growing companies in Northern California. In 1992, the year he founded AM+A, he received the National Computer Graphics Association Industry Achievement award. He also was cited by Multimedia Producer Magazine as one of the top 100 Multimedia Producers in 1995 and 1997. Mr. Marcus was one of the world's first professional designers of computer graphic displays at AT&T Bell Labs in 1967. He has taught computer graphics since 1970, and has published extensively on how to achieve effective visual communication for technical and professional journals. He received his B.A. in physics from Princeton University, and a B.F.A. and M.F.A. in graphic design from Yale University Art School. Earlier versions of this tutorial have been presented at HICSS, 1988-1992 and at SIGGRAPH conferences, 1992-1997.
This highly interactive tutorial will present virtually, "Everything You Wanted To Know About the Future of Health Care in the U. S., But Were Afraid To Ask." The U. S. health care system is a constantly evolving complex web of institutions, contracts, funding streams, public policies, and private players. The Institute for the Future, a California think tank, has recently completed a ten-year forecast of the health care system, parts of which are highly relevant to information systems developers. The forecast includes:
Robert Mittman brings a multidisciplinary perspective to health care forecasting and planning. He assesses the impacts of managed care, outcomes research and technology assessment, health policy, and advanced technology on the health care system. His research work combines quantitative and qualitative methodologies to look at how the health system will evolve, most recently looking at the future of cost and utilization controls across different health care practice settings; and creating an index of the penetration of managed care in different regions of the United States. He holds graduate degrees in computer science and public policy analysis, and a Bachelor of Science degree in electrical engineering, all from the University of California at Berkeley.
Information technology has long been viewed as a powerful tool for improving quality, increasing efficiency, and expanding access in the health care system. Yet the health care system has been slow to realize the full potential of technological advances: estimates place the health care industry five to ten years behind other industries in effectively using available information technologies. A concerted effort involving medical and health care professionals, computer scientists, and information systems researchers and practitioners is now underway to address this problem. The goal of this tutorial is to provide participants with a starting point from which to get involved in this effort.
Medical informatics encompasses the areas of research and practice at the intersection of information technology and medicine and health care. While the field covers a wide variety of topics from telemedicine to electronic patient record systems, this tutorial will focus on the use of patient record and other database systems for clinical performance improvement, the design and use of clinical decision support systems, and clinical use of the Internet.
In focusing on technology aspects in health care, this tutorial complements the "US Health Care Labyrinth" tutorial, which focuses on the health care system.
More specifically, this tutorial will provide participants:
Daniel C. Davis, Jr., MD, is Medical Director, Clinical Informatics, at the Queen's Medical Center in Honolulu and Chief of the Division of Medical Informatics at the John A. Burns School of Medicine at the University of Hawaii. His primary interests are the development and implementation of information technologies for clinicians. He is a member of the Board of Directors of the Medical Information System Physician Association, and speaks at international informatics meetings such as the scientific sessions of the American Medical Informatics Association and annual sessions of the American College of Physicians.
William G. Chismar is an associate professor and chairman of the Decision Sciences Department at the University of Hawaii. His teaching and research interests include information technology in Asia, information technology and public policy, and the use of information technology in restructuring the delivery of health care. He is cochair of the new HICSS track "Information Technology in Health Care."
This full-day tutorial will be presented in two parts.
Java Basics and Programming (morning) Today, the key areas of computing are the Internet, intranets, and the World Wide Web (WWW). Java is a new programming language well suited to this network environment; in a short time, it has risen from an experimental language to become the Internet programming language. Developed by Sun Microsystems, Java is an object-oriented language that supports both stand-alone programs and WWW applications. Its most visible feature is that it makes a Web page come alive, making it more interactive.
How is Java different from traditional programming languages? Is it a completely different programming paradigm? What are its implications? This tutorial will answer these questions and go beyond the hype. We will introduce the Java language and programming model, and explain the concepts of high performance (computing), networking (communication), and imaging integrated into Java. We will cover the strengths and weaknesses of Java, show with simple examples how it can be used on the WWW, and demonstrate its greater potential by discussing selected non-WWW applications (such as multithreaded servers) for which Java provides unique capabilities.
Parallel and Distributed Computing Using Java (afternoon) The popularity of the Java language has exploded in the past few years, with representatives from various disciplines actively participating in its rapid growth. Java takes the promise of parallel processing and distributed computing a step further by providing a portable programming language that can be used on any machine, or combination of different machines, that supports Java. Additionally, it can be used seamlessly on the web. Java is itself multithreaded, and offers several models of parallel programming and distributed computing at different levels of abstraction, allowing users to choose the model suited to the level of detail they desire and the effort they are willing to spend to learn it. It offers these layers through well-defined object-oriented classes that are fairly straightforward to use.
The tutorial will cover the entire spectrum of this topic, from using core Java classes such as the Socket and URL classes, to the Remote Method Invocation facility. Programming examples will be provided for each major topic. The tutorial will also touch upon the latest Java facilities such as JavaBeans, JDBC and JNI.
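The combination of the core Socket classes with Java's built-in threading can be sketched as a minimal echo server: a background thread accepts a connection and echoes a line back, the pattern that a full multithreaded server extends with one handler thread per client. The class and method names here are our own illustration, not code from the tutorial:

```java
import java.io.*;
import java.net.*;

// Minimal socket round trip: a server thread echoes one line back to a client.
public class EchoDemo {

    // Start a one-shot echo server, send msg to it, and return the echo.
    static String roundTrip(String msg) throws Exception {
        ServerSocket server = new ServerSocket(0); // bind to any free port
        int port = server.getLocalPort();

        // Server side runs on its own thread; a production server would loop
        // on accept() and spawn a handler thread per connection.
        new Thread(() -> {
            try (ServerSocket s = server; Socket client = s.accept()) {
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(client.getInputStream()));
                PrintWriter out = new PrintWriter(client.getOutputStream(), true);
                out.println(in.readLine()); // echo one line back
            } catch (IOException ignored) { }
        }).start();

        // Client side: connect, send a line, read the echo.
        try (Socket sock = new Socket("localhost", port);
             PrintWriter out = new PrintWriter(sock.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(sock.getInputStream()))) {
            out.println(msg);
            return in.readLine();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("hello")); // prints "hello"
    }
}
```

The Remote Method Invocation facility covered in the afternoon builds a higher level of abstraction on top of exactly this kind of socket plumbing.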
Rajkumar Buyya is a Research Scholar in the School of Computing Science, Queensland University of Technology (QUT), Brisbane, Australia. Prior to joining QUT, he worked at the Centre for Development of Advanced Computing, Bangalore, which built India's first supercomputer, from 1995 to 1998, and at Applied Computer Technologists, Bangalore, India, from 1992 to 1994. He also served as a visiting faculty member at Bangalore University from 1995 to 1998, and writes as a reporter for the Asian Technology Information Program, Japan/USA. He has coauthored the books Microprocessor x86 Programming (BPB Publications, '95) and Mastering C++ (Tata McGraw Hill Publications, '97). He is an associate editor of the PDPTA '97 Conference Proceedings, USA. Currently, he is editing a book entitled "Cluster Computing: HPC on Clusters," with contributing authors from more than 15 countries around the globe, expected to be published by a leading USA-based publisher as an international edition in multiple volumes. He is a guest editor for a special issue of the Parallel and Distributed Computing Practices journal, and a regular columnist for Techno-World Magazine.
Rajkumar has delivered tutorials on advanced technologies such as parallel, distributed, and multithreaded computing; client/server computing; the Internet and Java; and cluster computing at the HPC ASIA '97 (Korea), HiPC '97 (India), NCS '98 (USA), CATE '98 (Mexico), ASC '98 (Mexico), HPC ASIA '98 (Singapore), PDCN '98 (Australia), and HICSS-32 (Hawaii, USA) international conferences. He is a member of the program committees of the PDPTA '98 and CISST '98 conferences, and organized a special session on HPC on Clusters at the PDPTA '98 conference. His research papers have appeared in national and international conferences. His research interests include programming paradigms and operating environments for parallel and distributed computing.
Ira Pramanick received her B. Tech. (Hons.) in Electrical Engineering from IIT Kharagpur, India in 1985, and her Ph.D. in Electrical & Computer Engineering from the University of Iowa in 1991. She was an Assistant Professor in the ECE Department at the University of Alabama in Huntsville from 1991 to 1992. She worked for IBM Corporation as an Advisory Engineer from 1992 to 1995, and joined Silicon Graphics in 1995, where she currently works in the Strategic Software Division. Dr. Pramanick's research interests include parallel processing and distributed computing, highly available systems, algorithms, and applications. She has served on the program committee for the IEEE International Symposium on High Performance Distributed Computing for the past four years, and has chaired several technical sessions at HPDC, ICPP, and SECON. She holds two US patents.
With the increasing use of small portable computers, wireless networks, and satellites, a trend toward supporting mobile computing, or "computing on the move," has emerged, meaning that a user may not maintain a fixed position in the network. The mobile user still expects uninterrupted network access and the ability to run networked applications. To support such mobility, a mobile user typically is provided a wireless interface to communicate with other fixed and mobile users. This computing environment raises several issues: how should packets be routed as the mobile user (host) moves from place to place? If an application running on a mobile host needs a certain quality of service (QoS), how can it be guaranteed? Which transport protocol should be used on a mobile host? And how should the poor error rates of wireless links be handled?
The tutorial will focus on solving problems in designing high-performance mobile computing systems. We will discuss both Mobile IP and Wireless ATM, and cover important issues such as optimum error control for different wireless links (e.g., ground vs. satellite), admission control techniques to support quality of service for mobile multimedia applications, and the design of efficient MAC and transport layer protocols for mobile computing. We will also present techniques to evaluate the performance of mobile computing systems.
Upkar Varshney is on the CIS faculty of Georgia State University. His research interests include the design of mobile computing systems, transport protocols and Wireless ATM. Prior to coming to Georgia State, he was at Washburn University in Topeka, Kansas, and earlier was a research associate at the Center for Telecommuting Research, University of Missouri-Kansas City. He holds a B.S. degree in electrical engineering from University of Roorkee, and an M. S. in computer science and Ph.D. in Telecommunications and Computer Networking from the University of Missouri-Kansas City.
The objective of this half-day tutorial is to present the basic elements of the formal specification language Z. Z has become one of the most popular formal methods languages in recent years and has been used in a number of relatively large industrial software projects. Basic elements of Z (e.g., Z schemas and set-theoretic types such as sets, relations, functions, and sequences) and the Z schema calculus (e.g., schema linking and inclusion, and the Δ and Ξ conventions) will be presented by means of examples and easy-to-follow case studies. This tutorial is designed for those who have no prior knowledge of Z (or formal methods) and would like to become familiar with a formal methods notation such as Z. At the end of the tutorial, participants should be able to read and write specifications in Z.
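As a taste of the notation, here is a small Z specification in the style of the classic birthday-book example, typeset with the fuzz/zed LaTeX macros. The schema names and given sets are illustrative; the Δ convention marks an operation that changes state, importing both the before-state (known, phone) and the after-state (known', phone'):

```latex
% Given sets, and a state schema for a small phone directory.
\begin{zed}
  [NAME, NUMBER]
\end{zed}

\begin{schema}{Directory}
  known : \power NAME \\
  phone : NAME \pfun NUMBER
\where
  known = \dom phone
\end{schema}

% An operation schema: \Delta Directory imports the state before and after;
% n? and num? are inputs by the ? convention.
\begin{schema}{AddEntry}
  \Delta Directory \\
  n? : NAME \\
  num? : NUMBER
\where
  n? \notin known \\
  phone' = phone \cup \{ n? \mapsto num? \}
\end{schema}
```

A query operation that leaves the directory unchanged would instead be written with Ξ Directory, which additionally asserts phone' = phone and known' = known.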
Hossein Saiedian is well known in the software engineering community. He has published over 40 technical articles in software engineering, organized a roundtable on formal methods for IEEE Computer (April 1996) in which a number of distinguished computer scientists participated, and edited a special issue of the Journal of Systems and Software (March 1998) on formal methods technology transfer. He will chair the 12th Conference on Software Engineering Education and Training in New Orleans in March 1999.