Copyright 1997, JD Eveland. All rights reserved.



Presented to the Invited Workshop on CSCW Evaluation, Third European Conference on Computer-Supported Cooperative Work, Milan, Italy, September 1993




The systematic analysis of communication patterns in groups and organizations is a relatively recent development. Although the field draws on traditions as old as the "sociometry" of Moreno (1934), the development of replicable and widely accepted metrics and analytical strategies is barely more than 20 years old (Rogers and Kincaid, 1981). Due to the computational complexity of many of the algorithms employed, the field had to await the emergence of accessible computing power; as recently as 10 years ago, for example, the popular clique-analysis program NEGOPY would only run on CDC supercomputers (Richards, 1989). At present, however, there are few if any technical barriers to the application of network analysis techniques. Yet for a variety of reasons, network analysis has seen little application in the assessment of computer-supported cooperative work (CSCW) applications. This paper will review some of the ways in which it has been applied, and suggest some directions in which it may be effectively used in the future. The overall aim is to do a bit of consciousness-raising among CSCW analysts as to the potential advantages -- and limitations -- of network methods and techniques.

Network analysis is generally defined as the systematic analysis of interactions (Knoke and Kuklinski, 1982). As Rice and Richards (1985, p. 106) phrase it,

"The goal of network analysis is to obtain from low-level or raw relational data higher level descriptions of a system. The higher-level descriptions identify various kinds of patterns, or test hypotheses about those patterns, in a set of relationships. These patterns will be based upon the way individuals and objects interrelate in a network, and, to some extent, upon the measurement tools and methods used."

A distinction is often made between "positional analysis" -- the assessment of the positions that individuals or units hold in a social structure -- and "relational analysis" -- the assessment of interactions among those units (Burt, 1982). Positional analysis assesses roles in social structures and the equivalences among roles held by individual nodes. Relational analysis examines the connections between nodes, and defines certain properties of individuals, such as centrality, and certain properties of the network as a whole, such as centralization and clique/subgroup membership and structure. The two modes of analysis are complementary but not overlapping; that is, both are of help in gaining an appreciation of network structure, but they describe rather different things about how the network is put together.
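The positional notion of role equivalence can be made concrete with a small sketch; the three-node matrix and the "supervisor/subordinate" reading below are invented purely for illustration:

```python
# Positional analysis sketch: two nodes are structurally equivalent when
# they have identical ties to every other node in the network.
# The three-node network below is a hypothetical example.

def structurally_equivalent(adj, i, j):
    """True if nodes i and j relate identically to each third node."""
    others = [k for k in range(len(adj)) if k not in (i, j)]
    return all(adj[i][k] == adj[j][k] and adj[k][i] == adj[k][j]
               for k in others)

# A supervisor (node 0) tied to two subordinates (nodes 1 and 2).
adj = [[0, 1, 1],
       [1, 0, 0],
       [1, 0, 0]]

print(structurally_equivalent(adj, 1, 2))  # True: the same structural role
print(structurally_equivalent(adj, 0, 1))  # False: different roles
```

Note that nodes 1 and 2 occupy the same position without ever interacting with each other, which is exactly the sense in which positional and relational descriptions diverge.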

The raw material for application of network analysis is a pattern or matrix of interactions among units. The significant point is that it does not matter what the units are, or what form the interactions take. The analytical techniques are entirely independent of the subject matter or the context to be assessed. "Nodes" can be individuals, offices, documents, machines, or any other point capable of participating in a relationship to other points or combination of points. Connections can likewise take virtually any form that the analyst can define to be meaningful. This provides, of course, enormous flexibility and power; it also imposes some significant analytical burdens. More than most analytical procedures, network analysis can be understood only in context. While the specific meaning and interpretation of, say, a Pearson's r of .85 between two variables may vary, most analysts would conclude that it is "big", and it is possible to say that one variable accounts for roughly 70 percent of the variance in the other. On the other hand, the interpretation of a network index such as a Freeman centralization coefficient of .5 is impossible without knowledge of the data, their collection, and the relevant context. Whenever relationships can be reflected in a matrix structure, network analysis is possible. However, interpreting the results of that analysis remains more of an art form than most social "scientists" would like to admit.
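To make such an index concrete, here is a minimal sketch of a Freeman-style degree centralization computed from an adjacency matrix; the four-person "star" network is invented for illustration:

```python
# Relational analysis on a toy adjacency matrix. The network (a four-
# person "star" in which node 0 talks to everyone) is hypothetical.

def degree_centrality(adj):
    """Normalized degree centrality of each node in a 0/1 matrix."""
    n = len(adj)
    return [sum(row) / (n - 1) for row in adj]

def freeman_degree_centralization(adj):
    """Freeman's degree centralization: 1.0 for a star, 0.0 for a ring."""
    n = len(adj)
    degrees = [sum(row) for row in adj]
    c_max = max(degrees)
    # The denominator is the maximum attainable sum of differences,
    # which occurs in a perfect star network.
    return sum(c_max - d for d in degrees) / ((n - 1) * (n - 2))

star = [[0, 1, 1, 1],
        [1, 0, 0, 0],
        [1, 0, 0, 0],
        [1, 0, 0, 0]]

print(degree_centrality(star))              # node 0 is maximally central
print(freeman_degree_centralization(star))  # 1.0
```

A coefficient of .5 on a real network falls between the star and ring extremes; what that signifies still depends entirely on the data, their collection, and the context.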

The use of network analysis techniques in the assessment of CSCW projects has been rather limited. Network analysis would seem to be an obvious tool for understanding systems in which the communication/cooperation dimension is by definition a major part of the emphasis. Unfortunately, network analysis remains an arcane specialty rather than part of the general tool repertoire of social analysts. [FOOTNOTE: The annual International Social Networks Conference regularly attracts only about 300 participants; even CSCW conferences attract more than that.] Clearly, there is room for improvement.

Network analysis has been applied to CSCW in both laboratory and field contexts. In the lab, the predominant contributions have been made by the Carnegie Mellon University team. Finholt, Sproull, and Kiesler (1990) have studied the relationship between communication and performance in student teams. Markus (1991) reported a similar experiment. Lab studies have been useful in identifying some of the emergent dynamics in electronically augmented networks, particularly those related to the formation and interaction of subgroups, but have generally been difficult to generalize to broader contexts.

Field studies of electronic networks include the work of Rice (1980) on the use of the pioneering EIES electronic conferencing system among scientists, Rice, Hughes and Love's (1990) study of e-mail in an R&D organization, Rice and Case's (1983) study of university messaging systems, Kraut, Galegher and Egido's (1987) work on scientific collaboration, and several studies by Eveland and Bikson (1988, 1990) on the use of electronic mail and other information tools in several contexts. Findings in general support the idea that electronic augmentation of communication networks changes how organizations work together, but in ways that are not necessarily predictable or even sometimes explainable. In general, where network analysis methodologies are explicitly employed, the overall picture is easier to grasp, if not necessarily easier to account for.

At present, our research group at the Claremont Graduate School is undertaking a study of the emergence of "help networks" to provide and support learning about computers, as electronic support for cooperative work within the School becomes more intensive and more linked. While findings are quite preliminary at this point, it is clear that:

- help networks exist, and almost everyone participates in one or more
- they tend to be directed rather than reciprocal
- they often extend outside the bounds of the formal organization, and usually have little to do with formally assigned roles and hierarchies
- they tend to change rapidly and unpredictably as people acquire new kinds of expertise and abilities

Without the formal techniques of network analysis, it would be difficult to understand many of these dynamics.
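The "directed rather than reciprocal" point, for example, reduces to a simple count of returned ties once the help relations are in hand; the names and ties below are invented for illustration:

```python
# Directed "who helps whom" ties; a tie is reciprocated only if the
# reverse tie also appears. The four ties below are hypothetical.

helps = {("ann", "bob"), ("bob", "carl"), ("ann", "carl"), ("carl", "ann")}

reciprocated = sum(1 for (u, v) in helps if (v, u) in helps)
print(reciprocated / len(helps))  # 0.5: only ann and carl help each other
```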

While it is clear that network analysis can provide a variety of interesting and useful insights into the processes and effects of CSCW implementations, it is also clear that a variety of issues and problems need to be considered in its application, including the following:

Defining data elements: As noted, network analysis makes no a priori assumptions about what nodes and relationships might be. The analyst must define the nature of the behavior that is of interest, the level of aggregation of that behavior (across both time and individual units) that is meaningful, the level of interaction necessary to constitute a significant relationship, and the bounds to be placed on the system under investigation. Each of these is capable of being defined in myriad ways, with major consequences for the kinds of results found. At present, there is little consistency in how different analysts approach these issues -- with, of course, the consequence of limited cumulativity in research findings even where nominally similar phenomena are being studied.
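As one illustration of how the "level of interaction" decision plays out, the same contact counts yield quite different networks under different cutoffs; the names and weekly counts below are invented:

```python
# Weekly contact counts between pairs; the choice of threshold decides
# which pairs count as having a "significant relationship" at all.

contacts = {("ann", "bob"): 9, ("ann", "carl"): 2, ("bob", "carl"): 5}

def binarize(weighted, threshold):
    """Keep only the pairs whose interaction count meets the threshold."""
    return {pair for pair, count in weighted.items() if count >= threshold}

print(binarize(contacts, 1))   # all three ties survive
print(binarize(contacts, 5))   # the weak ann-carl tie disappears
print(binarize(contacts, 10))  # no network left at all
```

Every downstream result, from centrality scores to clique membership, inherits whichever of these networks the analyst chose to construct.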

Data sources: Network data can be and have been collected from a very wide range of sources, including self-report questionnaires asking about interactions, electronic message logs, unobtrusive or obtrusive observation, document tracing, and other methods. Not unexpectedly, each of these sources tends to produce different patterns, which can be interpreted either as evidence of error (Bernard and Killworth, 1980) or as meaningful reflections of alternative realities (Eveland, 1990) -- or perhaps both. The problem of confidentiality is particularly critical, since informed consent is not usually sought in normal office situations. Given the sensitivity of communication issues in most organizational situations, the more obtrusive the data collection arrangements, the more the analyst risks introducing systematic biases into the data. Since the analysis is no better than the quality of the data going into it, understanding and resolving these issues ahead of time is absolutely essential.
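One common source, the electronic message log, reduces directly to directed tie counts; the log entries below are invented for illustration:

```python
# Sketch of deriving directed tie strengths from a message log.
# Each entry is a (sender, receiver) pair; the data are hypothetical.
from collections import Counter

log = [("alice", "bob"), ("alice", "bob"), ("bob", "carol"),
       ("carol", "alice"), ("alice", "bob")]

ties = Counter(log)            # who sends to whom, and how often
print(ties[("alice", "bob")])  # 3 messages in this direction
print(ties[("bob", "alice")])  # 0: the log records an asymmetry that a
                               # self-report questionnaire might smooth over
```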

Analysis procedures: Conducting effective network analyses has been vastly eased in the last few years by the introduction of standardized analysis packages such as UCINET IV (Borgatti, Everett, and Freeman, 1991) and STRUCTURE (Burt, 1991). In addition, the translation of packages such as NEGOPY to run on personal computers means that anyone with access to a reasonable PC can do all the network analysis anyone could ask for in minutes. The problem is that there are few if any generally accepted metrics in the field (or rather, a wide range of competing metrics, each applicable to certain circumstances), and almost no well-developed procedures for statistical analysis of the data. Furthermore, even small variations in initial parameter settings for clique-detection procedures such as NEGOPY and STRUCTURE can produce widely differing results, and there are relatively few guidelines for determining what is most appropriate. Thus, hypothesis testing in networks tends to be more qualitative than quantitative. While this gives the analyst a good deal of freedom to experiment with alternative metrics, parameters, and relational techniques, it often makes it hard to judge others' work.
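The sensitivity to parameter settings is easy to demonstrate. Below, a simple connected-components grouping stands in for a full clique-detection routine (the actual NEGOPY or STRUCTURE algorithms are not reproduced here), applied to an invented weighted network at two different tie-strength cutoffs:

```python
# The same weighted network splits into different subgroups depending
# on the tie-strength cutoff. Node names and weights are hypothetical.

weights = {("a", "b"): 8, ("b", "c"): 3, ("c", "d"): 8}

def components(weights, threshold):
    """Group nodes connected by ties at or above the threshold."""
    nodes = {n for pair in weights for n in pair}
    parent = {n: n for n in nodes}

    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x

    for (u, v), w in weights.items():
        if w >= threshold:
            parent[find(u)] = find(v)
    groups = {}
    for n in nodes:
        groups.setdefault(find(n), set()).add(n)
    return sorted(map(sorted, groups.values()))

print(components(weights, 2))  # one group: a, b, c, d all connected
print(components(weights, 5))  # the b-c tie drops out: two subgroups
```

A one-line change in the threshold parameter turns one cohesive group into two, which is exactly the kind of instability an analyst must report along with the results.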

Presentation of findings: As in analysis, there are no particular standards governing how findings are presented. Each analysis package has its own peculiar format of presentation, and none of them lends itself well to ready interpretation. The longer one works with a particular package, the easier it becomes to interpret output -- but not necessarily to turn that output into material accessible to others. There is some tension, for example, between those who prefer numerical analysis and description and those who prefer graphical presentation. Each mode can do things the other cannot do -- but tying them together is often difficult.

Interpretation of findings: A major consequence of the lack of consistency in analytical approaches is a general lack of consistency in understanding what different metrics identified in analysis really mean in either theoretical or practical terms. With a choice of different measures -- Freeman (1979), for example, identifies at least five conceptually different interpretations of the idea of "centrality", and more have been suggested since then -- the chances of the analyst selecting a measure that conveys the same meaning to a wide audience are almost nil. In general, systems are most meaningfully compared only to themselves rather than across settings. This in turn makes understanding of single-shot network data rather difficult -- and that is all the data we frequently have at our disposal. With such difficulties in interpreting the dynamics of even single studies, it is no wonder that the field has been slow to develop a body of widely accepted generalizations.
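Even a tiny invented network shows how easily two standard centrality measures disagree: in two three-person cliques joined through a broker, degree singles out a clique member while closeness singles out the broker. The network below is hypothetical:

```python
from collections import deque

# Two three-person cliques {a,b,c} and {d,e,f} joined through broker "g".
edges = [("a", "b"), ("a", "c"), ("b", "c"),
         ("d", "e"), ("d", "f"), ("e", "f"),
         ("c", "g"), ("g", "d")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def closeness(adj, node):
    """Closeness centrality: (n - 1) / sum of shortest-path distances."""
    dist, queue = {node: 0}, deque([node])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return (len(adj) - 1) / sum(dist.values())

degree_pick = max(adj, key=lambda n: len(adj[n]))
closeness_pick = max(adj, key=lambda n: closeness(adj, n))
# Degree favors a clique member (c or d, each with three ties);
# closeness favors the broker g, who is nearest to everyone on average.
print(degree_pick, closeness_pick)
```

Two analysts reporting "the most central actor" in this network could thus name different people while both being technically correct, which is the interpretive problem in miniature.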

Instability of results: The corollary of needing data over time is that we are frequently left with the problem of accounting for change. Social networks are notoriously subject to change, due partly to reporting artifacts and partly to the inherent dynamism of social relations generally; when one adds the element of rapidly changing technologies, one has a mixture of great volatility and complexity. The aim of research on computing and networks is, explicitly or implicitly, to try to make causal inferences about what makes them work "better" (however one defines that -- a topic outside our present task). But how can one shoot for such inferences if every time one looks at a system, it behaves differently?

Given all these difficulties, what future might there be for the application of network analysis in the understanding of CSCW processes and effects? There seem to be at the least several things that we could do to use this potential tool more effectively:

Use network analysis more. One of the main problems is that not enough analysts feel comfortable with the field's concepts and vocabulary, and consequently fail to take advantage of its insights. We need to get more people who do not mind looking ridiculous out in front of CSCW audiences waving sociograms and centrality indices. This is partly a matter of training and partly a matter of general exposure.

Try to be consistent. At the least, we need to try to use the same kinds of measures of key concepts such as centrality that others have used and presumably interpret at least somewhat similarly. We must resist the temptation to make up new metrics at the drop of a hat. Further, we have an obligation to specify clearly what analytical assumptions and parameters we have specified, and why we believe them to be justified. This will allow others to achieve the same consistency that we seek for ourselves.

Design CSCW applications with networks in mind. Network analysis can be used prospectively as well as retrospectively. Insights from creative looks at the network properties of potential CSCW users can inform system design and implementation significantly. In short, we need to think of network analysis less as an arcane analytical methodology and more as one formative tool among many others.

Be willing to be speculative. The exploration of social networks in the CSCW context is not necessarily a matter of right or wrong insights, of true or false models. Many different interpretations of the data are usually possible, and we need to be aware of them and not afraid of offering alternatives to be interpreted in terms of their relative utility rather than their "truth". In a field with as few methodological truisms as this, we need to embrace, enjoy, and use our flexibility, not be intimidated by it.

To the extent that the workshop can become engaged with some of these issues and questions, I will have accomplished my goal.



Bernard and Killworth, 1980

Borgatti, Everett, and Freeman, 1991

Burt, 1982

Burt, 1991

Eveland, 1990

Eveland and Bikson, 1988

Eveland and Bikson, 1990

Finholt, Sproull, and Kiesler, 1990

Freeman, 1979

Knoke and Kuklinski, 1982

Kraut, Galegher, and Egido, 1987

Markus, 1991

Moreno, 1934

Rice, 1980

Rice and Case, 1983

Rice, Hughes, and Love, 1990

Rice and Richards, 1985

Richards, 1989

Rogers and Kincaid, 1981