Computational Models of Social Forms:
Advancing Generative Macro Theory
Weatherhead Center for International Affairs
1033 Massachusetts Avenue
Cambridge, Mass. 02138
April 28, 2003
Building on Simmel’s theoretical foundations, sociological process theorists continue to challenge mainstream social theory. So far, however, they have rarely relied on formal modeling. I argue that recent advances in computational modeling offer tools to explore the emergence of social form in the Simmelian tradition. Thanks to strong epistemological and ontological resemblances, these two fields can benefit from drawing more explicitly on each other. The process-theoretic tradition in social theory and contemporary agent-based models shift theorizing from nomothetic to generative explanations of social forms, and from variable-based to configurative ontologies. In order to formalize sociational theory, I focus on how to model dynamic social networks and emergent actor structures.
Thanks to theoretical advances in the natural sciences and the decreased cost of computer technology, computational modeling is becoming an increasingly popular tool in the social sciences. Due to its relative novelty and somewhat marginal position in most disciplines, however, most modelers have focused on methodological challenges posed by its application to social phenomena. By contrast, the method’s theoretical foundations are still relatively poorly understood and many theoretical possibilities remain unexplored by computational scholars. At the same time, social theorists, following in the footsteps of Georg Simmel’s pioneering contributions a century ago, have developed a process-based research tradition that anticipates the scientific practices of today’s computer-based research. In short, if the sociological process theorists have been computational modelers avant la lettre, the latter can be seen as process theorists “après la lettre”.
In view of this apparent, but relatively unknown, conceptual convergence, it would seem useful to bring the two intellectual traditions together. Agreeing with recent efforts to bridge the two fields (e.g. Hanneman, Collins, and Mordt 1995; Sawyer 1998; 2003; Müller, Malsch, and Schulz-Schaeffer 1998; Sallach 2000; Macy 2002), I argue that they both stand to benefit from such cross-fertilization. Scientists relying on computational techniques would gain a better understanding of what they are actually doing by anchoring their research more firmly in social theory, both in terms of epistemology and ontology. This research strategy would enable them to respond more effectively to criticism that assumes other philosophical positions. Even more importantly, such a widening of the ontological spectrum would help them realize new opportunities that may not be obvious by focusing on a more narrow, method-driven agenda.
Conversely, I assert that the intellectual bridge-building exercise could assist process theorists working in the Simmelian mold as well. Customarily relying on rather loose metaphors and analogies, these theorists have struggled to grasp the consequences of their uncompromising assumptions. Although some of them have devised simple models, the analysis has generally remained qualitative and intuitive. Due to their compatibility with the basic principles of the process perspective, computer-based techniques provide unique conceptual resources in the quest to capture the evolution of complex social forms in time and space.
In this paper, I limit the scope to macro-theoretic problems. It is in this domain that formal modeling is most needed, but also where its application causes the most severe difficulties due to the overwhelming complexity of social systems (Hanneman, Collins, and Mordt 1995; Cederman 1997). Computer-based research promises to overcome some of the conceptual hurdles that have provoked a powerful trend away from systemic thinking to micro-level theorizing, especially in the rational-choice tradition (Collins 1999).
I proceed by highlighting the main epistemological and ontological principles characterizing sociological process theory before turning to the corresponding dimensions in contemporary computational research. Then follows a section exemplifying different types of social forms in agent-based modeling. A concluding section assesses what each literature has to offer and what theoretical gaps remain to be filled.
Sociological process theory
Reacting to positivist currents in the late 19th century, Georg Simmel pioneered a distinctive tradition of process-driven theorizing in sociology. Rather than assuming social reality to consist of fixed and given entities, whether individuals or collectives, Simmel conceived of it as an ongoing interactive process giving rise to social forms in a spatio-temporal continuum:
The large systems and the super-individual organizations that customarily come to mind when we think of society, are nothing but immediate interactions that occur among men constantly every minute, but that have become crystallized as permanent fields, as autonomous phenomena (Simmel in Wolff 1950, p. 10).
Stressing the emergent quality of such social forms, Simmel refers to them under the heading of sociation (“Vergesellschaftung”, see Wolff 1950, lxiii). Viewed in sociational terms, society is thus not a “‘substance’, nothing concrete, but an event” (Simmel in Wolff 1950, p. 11). Languages, social structures, norms and conventions are created through “societal production, according to which all these phenomena emerge in interactions among men” (p. 13). According to Simmel, the primary goal of sociology is to analyze social forms: “Sociologists are directed to identify and classify different forms of social interaction: to analyze their subtypes; to study the conditions under which they emerge, develop, flourish, and dissolve; and to investigate their structural properties” (Levine 1971, pp. xxvii-xxviii).
This “process worldview,” which parallels the strictures of Whitehead’s philosophy (Fararo 1989), has also given rise to a rich tradition of scholarship that explicitly, or sometimes implicitly, follows Simmel’s theoretical lead. In the US, primarily thanks to the efforts of Albion Small, Robert Park, and George Herbert Mead, the Chicago School became an influential bastion of sociological process theory. Its main insight tells us that “every social fact is situated, surrounded by other contextual facts and brought into being by a process relating it to past contexts” (Abbott 1997, p. 1152).
Yet, despite the influence of Simmel’s work and that of his American successors during the first decades of the 20th century, mainstream sociology went off in a very different direction, mostly due to the dominance of Parsons’s relatively static theorizing (Levine 1991) and the trend favoring decontextualized quantitative research methods (Abbott 1997). However, the process perspective continued to thrive in other scholarly contexts. Outside the mainstream, for example, Norbert Elias (1978; 1982) developed his own somewhat idiosyncratic macro-historical process theory based on the notion of “figurations.” In anthropology, Fredrik Barth (1981) stands out as one of the greatest process theorists. Like Simmel’s original research agenda, Barth’s theoretical approach accounts for social forms by showing how they are generated by social processes.
More recently, social theorists have attempted to revive and synthesize these and related streams of thinking by promoting “relationism” (Emirbayer 1997), “structuration” (Giddens 1978), a “morphogenetic” perspective (Archer 1995), or the continued relevance of the Chicago School (Abbott 1997). It goes without saying that this is a somewhat sprawling literature with considerable internal tensions.1 Nevertheless, it is possible to discern a common meta-theoretical pattern of inquiry: all these process approaches advance generative explanations of social forms conceptualized as configurations. To articulate these principles more clearly, let us now consider the epistemological notion of generative theory (how do we explain?) before turning to the ontological issue of social forms (what do we explain?).
From nomothetic to generative epistemology
Generative theory differs radically from the positivist focus on explanation in terms of laws and regularities. Most of today’s social science subscribes at least loosely to a Humean regularity notion of causation, requiring a constant conjunction of factors, typically across a large number of cases. Such a conception of what it means to explain has been reinforced by the wider use of statistics across the disciplines. For example, the political scientists King, Keohane, and Verba (1994) argue that such a logic of social inquiry applies regardless of whether the analysis proceeds qualitatively or quantitatively. These, and many other authors (e.g. Goldthorpe 1997), emphasize the search for, and explanation in terms of, causal regularities through comparative analysis or statistical control as the primary goal of social science. Within philosophy of science, Hempel’s (1965) idea of covering laws provides the clearest and most sophisticated expression of this nomological epistemology.
How, then, can explanation be achieved, if not by reference to regularities? The sociological process approach starts with an observed social phenomenon, whether unique or ubiquitous, and then postulates a process constituted by the operation of mechanisms that together generate the phenomenon in question. Ultimately, explanatory value thus resides in the specification of (often unobservable) mechanisms and the reconstruction of a process within which they are embedded.2
The process theorists’ preference for generative explanations stems from the difficulties of finding and isolating regularities in complex social systems. Although Simmel (1989) was especially pessimistic on this point, he did not exclude the possibility of identifying ideal-typical forms and mechanisms under a diverse set of conditions. Ultimately, however, the question of whether there are macro-level regularities is not essential to the generative approach, because even in those cases where they can be said to exist, process theorists would regard them as insufficient and superficial substitutes for the deeper understanding yielded by a generative explanation. Simmel believed that it is impossible to account for social forms without describing how they are generated through a “genetic method.” Instead of resorting to divine intervention or individual will, such entities should be explained “by the notion of societal production, according to which all these phenomena emerge in interaction among men, or sometimes, indeed, are such interactions” (Wolff 1950, p. 13).
Using various labels, such as “abductive” or “retroductive” inference or “inference to the best explanation,” philosophers since Peirce distinguish this pattern of inquiry from both induction and deduction (Lipton 1991). Intimately linked to scientific realism, this notion of causal reasoning has gradually gained ground at the expense of Hempel’s nomothetic ideal. Examples abound in the natural sciences, especially in historically oriented fields such as geology and evolutionary biology (McMullin 1964). As we have seen, sociology is no exception from this trend. Although Simmel did not explicitly use these terms, his method has been characterized as “abductive” (Krähnke 1999).3 The same goes for much of social theory, including Giddens’ structuration theory (e.g. Wendt 1987).
The construction of generative explanations based on abductive inference is an inherently theoretical endeavor (McMullin 1964). Instead of subsuming observations under laws, the main explanatory goal is to make a puzzling phenomenon less puzzling, something that inevitably requires the introduction of new knowledge through theoretical innovation. Simmel’s response to this challenge was to rely heavily on metaphors and analogies, thus shedding new and unexpected light on his subject of study (Simmel 1989; see also Krähnke 1999, p. 94). Likewise, Elias (1978) resorts to metaphoric constructs, such as social dance and the use of personal pronouns, to clarify the meaning of his social figurations.
Though regarding metaphors and analogies as integral parts of scientific practice, realist philosophers of science also stress the pivotal role of models in the construction of generative theories. Like metaphors and analogies, models mediate knowledge from a known “source domain” to a “target domain”, but they do so in a more consistent and precise fashion, especially in complex settings (Harré 1970). The main function of models derives from their capacity to trace and reconstruct causal processes in a coherent way.
It thus comes as no surprise that more recent efforts to spell out a process-theoretic vision often involve explicit model building. Fredrik Barth (1981, p. 32) offers an especially clear justification of such a research program. In his view, generative models
are not designed to be homologous with observed social regularities; instead they are designed so that they by specified operations, can generate such regularities or forms. They should be constituted of a limited number of clearly abstracted parts, the magnitude or constellation of which can be varied, so that one model can be made to produce a number of different forms. Thus by a series of logical operations, forms can be generated; these forms may be compared to empirical forms of social systems, and where there is correspondence in formal features between the two, the empirical form may then be characterized as a particular constellation of the variables in the model.
Note that Barth distinguishes generative explanations from the mere discovery of emergent social forms. Clearly, it is not sufficient to identify the associations. Rather, what is needed is a deeper explanatory reconstruction of how the social forms in question were generated:
Explanation is not achieved by a description of the patterns of regularity, no matter how meticulous and adequate, nor by replacing this description by other abstractions congruent with it, but by exhibiting what makes the pattern, i.e. certain processes. To study social forms, it is certainly necessary but hardly sufficient to be able to describe them. To give an explanation of social forms, it is sufficient to describe the processes that generate the form (Barth 1981, pp. 35-36).
Reasoning along similar lines, Fararo (1989) elaborates a general conceptual framework for generative theory. In agreement with other modern process theorists, Fararo subscribes to realist philosophy of science. To explain an empirical macro phenomenon, he argues, a social scientist should “construct a generating process” that produces it (p. 43). Processes are constituted by recursively iterated operations of generative mechanisms and rules, very much like a computer program. At the same time, Fararo’s realist approach to explanation raises the difficult question of how to model process:
In the context of present-day empirical research, a great deal of stress is placed on the specification of variables and “explaining” variation in one variable by appeal to variation in a battery of other variables. This conception of explanation is based on a legitimate aspect of the dynamic or process-oriented frame of reference of general theoretical sociology. Namely ... as the generative mechanism produces changes of state, it does so under parametric conditions. Variations in parametric conditions, either dynamically or comparatively, thereby produce variations in states via the generative mechanism. What is omitted in the usual accounts of “causal models” based on “explaining variation” is the explicit formal representation of process (Fararo 1989, p. 42).
This representational challenge makes it impossible to postpone further a discussion of ontology. As argued in the next section, sociological process theory differs from mainstream approaches in this respect as well.
From variable-based to configurative ontology
True to its scientific realist principles, sociological process theory requires explanations to specify theoretical entities, relations, and mechanisms that together generate the social forms to be explained. Often these components have to be postulated because they are unobserved, or in some cases, even unobservable (Miller 1987). By contrast, positivist research tends to express causation in terms of variables measuring properties of social entities, relations, and mechanisms without specifying these components directly. To the extent that social forms and their components are postulated at all, this is done primarily for instrumental reasons, without attributing actual existence to the theoretical building blocks. After all, prediction of actual outcomes is often more important than explanation in these non-realist perspectives (see Friedman 1953).
The difference between variable-based ontologies and process-theoretic alternatives can be traced back to Simmel’s general distinction between “form” and “content.” In his formal sociology, Simmel stressed the former at the expense of the latter (Levine 1971).4 By virtue of its ability to capture form, Simmelian sociology thus differs from conventional approaches, which “do not describe social processes, but merely correlations between elements. Simmel studies the concrete sociations between persons and groups, not indicators of their outcomes (number of killed, riot frequencies etc), and explains social structures as the synthesis of these interactions” (Sylvan and Glassner 1985, p. 45; see also p. 92).
Despite the importance of social form in Simmel’s theory, however, it is hard to find a clear definition of this key concept in his work (Wolff 1950, p. xxxix). Levine defines social forms as “the synthesizing principles which select elements from the raw stuff of experience and shape them into determinate units” (Levine 1971, p. xv), but that definition seems too broad for the purposes of the present paper. On the other hand, Barth’s (1981, p. 32) equation of social forms with “series of regularities in a large body of individual items of behaviour” is too narrowly behaviorist. For the present purposes, then, it seems reasonable to define social forms as configurations of social interactions and actors that together constitute the structures in which they are embedded.5 Furthermore, configurative theories, as opposed to variable-based ones, provide an explicit representation of configurations, whether exogenous or endogenous.
From the assumptions of generative theory, it follows that all social forms are dynamically produced through social processes. Or to put it even more starkly: social forms are by definition social processes. The extent to which actors and structures vary over time depends on the particular case, however. Some processes generate configurations of behavior or attitudes, without any change in their component entities and their interaction structures. Other processes are more complex in that they feature change in actors’ boundaries and their structural arrangement. Configurative theories that endogenize such phenomena can be characterized as sociational (Vergesellschaftungstheorien) to use Simmel’s terminology. The longer the time span, and the more profound the societal transformation, the more likely it is that sociational processes operate. Considering primarily macro-processes in the Simmelian tradition, I will focus on such complex configurative transformations of actors and structures.
It is in these cases that the weaknesses of the variable-based paradigm become most evident, for, as has already been stated, variables merely measure dimensions of social forms, but they cannot represent the forms themselves except in very simple cases. Such variable-based analysis “detaches elements (substances with variable attributes) from their spatiotemporal contexts analyzing them apart from their relations with other elements within fields of mutual determination and flux” (Emirbayer 1997, p. 288). By contrast, social forms always possess a duration in time and an extension in physical or some abstract, conceptual space. Thus they can be said to have their own “bodies”, or “corporate identities” in a literal or figurative sense. The latter case applies to collections of individuals, such as social groups or organizations (Wendt 1999).
Theories that “hard-wire” their actors and structures into their ontologies can be characterized as “essentialist” or “substantialist”. In these cases, “analysis is to begin with these self-subsistent entities, which come ‘preformed,’ and only then consider the dynamic flows in which they subsequently involve themselves” (Emirbayer 1997, p. 282). In contrast, sociational theories, which Emirbayer labels “relationist”, “reject the notion that one can posit discrete, pre-given units such as the individual or society as ultimate starting points of sociological analysis” (p. 287). Such a perspective “sees relations between terms or units as preeminently dynamic in nature, as unfolding, ongoing processes rather than as static ties among inert substances” (p. 289).
The overwhelming majority of all social science research falls into the variable-based category. In a lucid article on explanations of macro-political processes such as democratization and state-formation, Charles Tilly (1995, p. 1595) provides examples from historical sociology showing how almost all analysts in that field
(1) assume a coherent, durable, self-propelling social unit; (2) attribute a general condition or process to that unit; (3) invoke or invent an invariant model of that condition or process; (4) explain the behavior of the unit on the basis of its conformity to that invariant model.
But this practice makes very little sense, because “coherent, durable, self-propelling social units—monads—occupy a great deal of political theory but none of political reality” (p. 1596).
Likewise suggesting that mainstream sociology postulates “that the social world consists of fixed entities (the units of analysis) that have attributes (the variables)”, Andrew Abbott (1988) criticizes the substantialist mainstream position for making strong methodological assumptions about social reality that unduly limit or distort social ontology. Most seriously, the conventional, variable-based paradigm “ignores entity change through birth, death, amalgamation, and division” (pp. 171-172). These are all themes that have recurred in the process-theoretic literature since Simmel (1992) asked the fundamental question about the intersubjectivity, duration, and spatial extension of social forms.
Agent-based modeling
Sociological process theory offers suggestive insights into, and conceptual tools for studying, the emergence of social forms. Yet its potential is partly unrealized because, as the previous section showed, it remains a somewhat underspecified research program in terms of modeling. Most process theorists have had to content themselves with employing metaphors and analogies, although some have developed simple models (e.g. Elias 1978; Barth 1981). In this section, I therefore turn to formal modeling as a way to attain a higher level of conceptual precision and theoretical consistency. Given the complexity of macro-sociological research problems, this paper focuses on computational modeling rather than on rational-choice models, and within the computational category, more specifically on agent-based modeling.
Agent-based modeling is a computational methodology that allows scientists to create, analyze, and experiment with artificial worlds populated by agents that interact in non-trivial ways and that constitute their own environment (for introductions, see Axelrod 1997b; Casti 1997; Epstein and Axtell 1996; Epstein 1999; Axtell 2000; Macy and Willer 2002). In these “complex adaptive systems,” computation is used to simulate agents’ cognitive processes and behavior in order to explore emergent macro phenomena, i.e. structural patterns that are not reducible to, or even understandable in terms of, properties of the micro-level agents. Such models typically feature local and dispersed interaction rather than centralized control (Resnick 1994). Moreover, as opposed to conventional rational-choice models that assume either a small number of dissimilar or numerous identical actors, agent-based models normally include large numbers of heterogeneous agents. Rather than studying equilibrium behavior, the focus is often on dynamics and transient trajectories far from equilibrium. Finally, instead of assuming the environment to be fixed, many agent-based models let the agents constitute their own endogenous environment.
Agent-based approaches should also be contrasted to earlier uses of simulation in the social sciences, including the tradition of global modeling (see references in Gilbert and Troitzsch 1999; Taber and Timpone 1996, pp. 48-49).6 Such macro models, which were especially popular in the 1970s following the pessimistic report Limits to Growth, typically attempt to predict population and economic trends many years, sometimes even decades, into the future (e.g. Forrester 1971; Meadows et al. 1972). By the early 1990s, the method had fallen into disrepute, mostly due to overly ambitious predictive claims. Yet, it continues to inspire theory-building among some sociologists (e.g. Hanneman, Collins, and Mordt 1995; see also the discussion in Sawyer 2003).
To further distinguish agent-based modeling from the dynamic systems tradition, and to underline the congruence between this methodology and sociological process theory, let us now return to the two meta-theoretical themes of the previous section. By doing so, it will become clear that agent-based tools enable the social theorist to pursue generative and configurative theorizing without abandoning the support of formal models.
From nomothetic to generative models
Whereas most traditional uses of simulation modeling center on prediction, agent-based modelers usually rely on generative explanation. This epistemological orientation has been evident since Schelling’s (1978) classic study Micromotives and Macrobehavior. Using an example of aggregate behavior, such as people choosing seats in an auditorium, Schelling invites the reader “to try to figure out what intentions, or modes of behavior, of separate individuals could lead to the pattern we observed” (p. 13). Obviously, it may be quite easy to figure out what the social pattern is going to be, but often the aggregate behavior surprises us. Schelling points out that the latter typically applies when there are strategic connections among the individuals. Such settings do not lend themselves to easy summation of the individual motives, which implies that
we usually have to look at the system of interaction between individuals and other individuals or between individuals and the collectivity. And sometimes the results are surprising. Sometimes they are not easily guessed. Sometimes the analysis is difficult. Sometimes it is inconclusive. But even inconclusive analysis can warn against jumping to conclusions about individual intentions from observations of aggregates, or jumping to conclusions about the behavior of aggregates from what one knows or can guess about individual intentions (p. 14).
It is evident that this pattern of explanation conforms closely to the abductive logic discussed above. Adopting a realist as opposed to an instrumentalist position, Schelling (1978) is primarily interested in uncovering the causal mechanisms that generate surprising macro-level patterns (p. 18). This explanatory strategy is most clearly exemplified in his famous segregation model, which is one of the world’s first agent-based models. Puzzled by the stark segregation patterns that appear, for example, among ethnic neighborhoods in American cities, Schelling postulates a simple, and quite tolerant, micro-level rule that produces two perfectly segregated cultural clusters: it is sufficient to assume that the residents of each group are unwilling to live in neighborhoods almost totally dominated by the other group.
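Schelling’s micro-rule is simple enough to sketch in a few lines of code. The following is a minimal illustration in the spirit of his model, not a reconstruction of the original: the grid size, the tolerance threshold, and the random-relocation update scheme are all illustrative assumptions.

```python
import random


def neighbors(grid, size, r, c):
    """Types of the occupied Moore neighbors of cell (r, c)."""
    out = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < size and 0 <= cc < size and grid[rr][cc] is not None:
                out.append(grid[rr][cc])
    return out


def unhappy(grid, size, r, c, threshold):
    """Schelling's tolerant micro-rule: an agent wants to move only if
    fewer than `threshold` of its neighbors share its type."""
    nbrs = neighbors(grid, size, r, c)
    if not nbrs:
        return False
    like = sum(1 for t in nbrs if t == grid[r][c])
    return like / len(nbrs) < threshold


def mean_like_share(grid, size):
    """Average share of like-typed neighbors: a crude segregation index."""
    shares = []
    for r in range(size):
        for c in range(size):
            if grid[r][c] is None:
                continue
            nbrs = neighbors(grid, size, r, c)
            if nbrs:
                shares.append(sum(1 for t in nbrs if t == grid[r][c]) / len(nbrs))
    return sum(shares) / len(shares)


def simulate(size=20, empty_frac=0.1, threshold=0.34, steps=60, seed=1):
    """Scatter two equal-sized groups on a grid, then let unhappy agents
    relocate to random empty cells until no one wants to move."""
    rng = random.Random(seed)
    n = size * size
    agents = n - int(n * empty_frac)
    pool = [0] * (agents // 2) + [1] * (agents - agents // 2) + [None] * (n - agents)
    rng.shuffle(pool)
    grid = [pool[i * size:(i + 1) * size] for i in range(size)]
    for _ in range(steps):
        movers = [(r, c) for r in range(size) for c in range(size)
                  if grid[r][c] is not None and unhappy(grid, size, r, c, threshold)]
        if not movers:
            break
        empties = [(r, c) for r in range(size) for c in range(size)
                   if grid[r][c] is None]
        rng.shuffle(movers)
        for r, c in movers:
            dest = empties.pop(rng.randrange(len(empties)))
            grid[dest[0]][dest[1]] = grid[r][c]
            grid[r][c] = None
            empties.append((r, c))
    return grid
```

Running `simulate()` and comparing `mean_like_share` before and after illustrates the generative point: although each agent tolerates being in a local minority of up to two thirds, the average share of like-typed neighbors climbs well above the roughly 0.5 of the initial random scatter.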
In another influential statement, Axelrod (1997a) articulates similar principles for agent-based research. Distinct from both induction and deduction, the agent-based method “is a third way of doing science” (p. 3). Rather than creating accurate predictions, “agent-based modeling is a way of doing thought experiments. Although the assumptions may be simple, the consequences may not be at all obvious” (p. 4; see also Gilbert 2000). Ultimately, Axelrod’s primary aim is to advance a better theoretical understanding by uncovering the mechanisms that generate “emergent properties” (cf. Holland 1995, p. 156). Like Schelling, Axelrod therefore stresses simplicity rather than realism.
Explicitly using the label “generative social science,” Epstein (1999) further articulates the link to the theme of generative explanation in sociological process theory. In his view, agent-based modelers working in this mold have to answer a general question: “How could the decentralized local interactions of heterogeneous autonomous agents generate the given regularity” (p. 41). The preliminary answer is to specify a process that generates the phenomenon under scrutiny. It should be stressed that this approach is by necessity tentative, because abductive inference can only produce “candidate explanations.” Or to put it more tersely: “If you can’t grow it, you haven’t explained it.” This formula obviously raises the question of how we are to select among the candidate explanations. One way to do this is to test the micro-level implications, as suggested by Epstein (p. 43).
Here the scientific-realist foundations of the computational paradigm become particularly evident, for in their search for causal mechanisms, agent-based modelers such as Axelrod, Schelling, and Epstein reject instrumentalist reasoning.7 Gilbert and Chattoe (2001) put it well:
The realist epistemology is also a natural progenitor of computational simulations. It is a short step from developing theoretical models of mechanisms to developing computational models of mechanisms. It is also the case that simulations provide an appealing way of both formalising and exploring realist theoretical models.
Frequently, computational researchers introduce generative explanations under the label of “emergence,” although the latter is really a subset of the former because not all generative explanations exhibit emergence. This term is notoriously tricky to define due to its varying uses in both the computational (Holland 1998) and the sociological and philosophical literatures (Sawyer 2002a), but it could be said that emergence usually refers to the fundamental irreducibility of complex systems to their constitutive parts. The failure of reductionism in such settings often relates to the non-linearity of the systemic interconnections, something that implies that the whole is literally more (or less) than the sum of the parts (Simon 1981; Smith and Stevens 1996).
Simple forms of emergence in agent-based models rely on a straightforward ascending logic featuring self-organized patterns that are compatible with methodological individualism (Macy and Willer 2002), which is why Epstein and Axtell (1996) refer to agent-based modeling as a “bottom-up” approach. Schelling’s model of residential segregation exemplifies such an ascending notion of emergence. But, as illustrated by Durkheim’s non-reductionist social theory, there is also a deeper sense of emergence that relies on “downward causation” (Sawyer 2002a). In this case, emergent social phenomena acquire a causal impact and cannot even in principle be reduced to laws and mechanisms operating at the micro-level. In a careful conceptual analysis, Crutchfield (1994) calls this type of emergence “intrinsic” because it presupposes that novelty be detected and acted upon by the systems’ agents: “The observer in this view is a subprocess of the entire system. In particular, it is one that has the requisite information processing capability with which to take advantage of the emergent patterns” (pp. 3-4).
Given that most existing agent-based models assume that agents are equipped with very simple rules and possess extremely primitive cognitive capacity (cf. Axelrod 1997a; Schelling 1978), it is not surprising that Sawyer (1998; 2002a; 2003) argues that such frameworks fail to embrace intrinsic emergence. Limiting their schemes to self-organization, computational modelers conceive of emergent properties as epiphenomenal configurations that emerge from dynamic webs of interactions, but that do not feed back down to the micro-level through the agents’ internal models. Yet, as Sawyer (2003) admits, the macro-level may affect the micro-level through ecological effects in agent-based models, and to that extent they are less reductionist than rational-choice models that restrict the systemic view. All the same, by limiting top-down emergence to such an external logic, current computational modeling still does not fully capture the sociological process tradition, which subscribes to a non-individualist ontology featuring downward causation in both the intrinsic and external sense. Below, in the context of viewing collective actors as social forms, I will return to this important issue.
From variable-based to agent-based modeling
By now, it should be clear that sociological process theory distinguishes itself from conventional “variable-based” theory-building by featuring explicit and endogenous representations of social forms. Again, the classical process perspective in sociology anticipates more recent developments in computational methodology. In fact, agent-based modeling has often been contrasted to the variable-based approach to simulation. In the latter approach, computer-based modeling usually amounts to the construction of dynamic models linking variables together in systems of discretized differential equations (i.e. difference equations), as exemplified by global modeling. In ecology and population biology, this contrast pits “individual-based” against “equation-based” modeling (Parunak et al. 1998; see also McCauley et al. 1993; Fahse et al. 1998).
According to Parunak et al. (1998), there is a fundamental difference between these two types of approaches. Whereas equation-based modeling attempts to express causal relations among variables, individual-based modeling represents interactions among the agents directly. This does not mean that the individual-based paradigm is incapable of tracing the evolution of macro-level variables. In fact, “the modeler pays close attention to the observables as the model runs, and may value a parsimonious account of the relations among those observables, [but] such an account is the result of the modeling and simulation activity, not its starting point” (p. 10). The opposite does not hold, however, because equation-based modeling makes it difficult to represent mechanisms operating at the agent-level, especially if they do not add up in a linear fashion. In systems featuring heterogeneous and spatially distributed agent populations, agent-based modeling has an edge over equation-based techniques thanks to its superior tractability and increased opportunities for micro-level validation.
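The contrast can be made concrete with a toy population model. The following Python sketch is my illustrative choice, not an example drawn from Parunak et al.; logistic growth and all parameter values are assumptions. It implements the same growth process twice, once as a difference equation over an aggregate variable, and once as probabilistic birth and death events at the level of individuals:

```python
import random

def equation_based(x0=10.0, r=0.1, K=100.0, steps=50):
    """Equation-based model: one aggregate variable evolved by a difference
    equation, x(t+1) = x(t) + r*x(t)*(1 - x(t)/K). No individuals exist."""
    x = x0
    for _ in range(steps):
        x = x + r * x * (1 - x / K)
    return x

def individual_based(n0=10, r=0.1, K=100, steps=50, seed=1):
    """Individual-based analog: each member of the population reproduces or
    dies probabilistically; the logistic-looking aggregate curve is an
    outcome of the run, not a premise of the model."""
    rng = random.Random(seed)
    population = n0
    for _ in range(steps):
        births = sum(1 for _ in range(population) if rng.random() < r)
        deaths = sum(1 for _ in range(population)
                     if rng.random() < r * population / K)
        population += births - deaths
    return population
```

The point of the juxtaposition is ontological rather than numerical: in the second function the aggregate trajectory is observed as the model runs, whereas in the first it is the model. Making agents heterogeneous (e.g. individual birth rates) would be trivial in the second version and awkward in the first.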
Modern software technology has paved the way for agent-based model architectures. In particular, object-oriented programming (OOP) facilitates the representation of agents as autonomous and discrete entities in simulation models (Hill 1996, Chap. 2). In contrast to traditional procedural approaches, which separate data from algorithms, the principle of encapsulation introduces objects as self-contained bundles holding both data and algorithms. Objects communicate with each other through message passing that respects their boundaries. Furthermore, as opposed to information held in conventional programming variables, each such entity has its own inherent identity, implying that “two objects are distinct even if all their attribute values ... are identical” (Rumbaugh et al. 1991, p. 2; see also Khoshafian and Copeland 1986). Thus, an object has its own dynamic existence in that it can be created and destroyed. This is an aspect of identity that is sometimes referred to as persistence (Booch 1991; see also Hill 1996, p. 55). It is hardly a coincidence that the world’s first object-oriented programming language, Simula, was developed to conceptualize the semantics of simulation models in a natural way (Hill 1996, p. 25).
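A minimal Python sketch may make encapsulation and object identity tangible; the class and member names (Agent, wealth, transfer) are hypothetical illustrations, not drawn from any of the cited works:

```python
class Agent:
    """Encapsulation: state (wealth) and behavior (transfer) are bundled
    together, and other objects reach the state only via the interface."""

    def __init__(self, wealth):
        self._wealth = wealth  # internal state, not manipulated from outside

    def transfer(self, other, amount):
        """Message passing: agents interact through method calls that
        respect each object's boundary."""
        if amount <= self._wealth:
            self._wealth -= amount
            other._wealth += amount

    @property
    def wealth(self):
        return self._wealth


a = Agent(10)
b = Agent(10)
# Identity: identical attribute values, yet two distinct objects.
assert a.wealth == b.wealth
assert a is not b
a.transfer(b, 3)
```

The two instances illustrate the Rumbaugh et al. point directly: `a` and `b` start with equal attribute values but remain distinct entities with separate histories, and each can be created and (via garbage collection) destroyed independently.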
More recently, multi-agent systems (MAS) have added the notion of an agent to the repertoire of computer science. As the term is used in distributed artificial intelligence (DAI), agents are more autonomous than objects by virtue of their ability to refuse to act on incoming messages and to effect flexible behavior without central program control (i.e. multi-threading) (Wooldridge 1999; Ferber 1999, p. 57). Depending on the complexity of their internal models, MAS feature either reactive or cognitive agents. So far, agent-based applications in the social sciences have tended to be of the former type (Sawyer 2003).
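The difference between an object and an agent can be sketched in code. In the following minimal Python illustration (the names ReactiveAgent and request, and the toy compliance rule, are my assumptions), the agent receives a message but decides for itself whether to comply:

```python
class ReactiveAgent:
    """A reactive agent: stimuli map directly to responses, with no internal
    model of the world, yet the agent retains control over its own actions."""

    def __init__(self, name, energy):
        self.name = name
        self.energy = energy

    def request(self, action):
        """Unlike an ordinary method call, a request may be declined:
        the agent, not the caller, decides whether the action is performed."""
        if action == "work" and self.energy >= 1:
            self.energy -= 1
            return "done"
        return "refused"  # autonomy: message received but not executed
```

A cognitive agent would differ by interposing an explicit internal model (beliefs, goals, plans) between the incoming message and the response, rather than a fixed stimulus-response rule.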
Although OOP and MAS may appear to be merely technical details, these advances reflect an important semantic shift away from procedural software engineering to more decentralized paradigms, a move that facilitates the parallel trend from equation-based to agent-based modeling. After all, social agents are self-propelled, autonomous entities with distinct, time-dependent properties. In complex systems, it is especially useful that encapsulation forces the analyst to declare explicitly what information each agent has access to, and which of its own properties are visible to other agents. Identity is important because each agent can be represented as an independent and unique object instance of a more general template, usually referred to as a class in object-oriented programming languages. As opposed to conventional variables, which in principle have infinite life spans, agents implemented as dynamically allocated objects possess an explicit existence in time, and can therefore both be born and die. Beyond object-orientation, MAS techniques emanating from DAI are likely to inspire conceptual breakthroughs in theory-building and further strengthen the trend toward agent-based perspectives in social-science modeling.