The Journal of Peer Production - New perspectives on the implications of peer production for social change
Caring about the plumbing: On the importance of architectures in social studies of (peer-to-peer) technology
JoPP Signal:
9.5/15


This article discusses the relevance, for scholars working on social studies of network media, of “caring about the plumbing” (to paraphrase Bricklin, 2001), i.e., addressing elements of application architecture and design as an integral part of their subject of study. In particular, by discussing peer-to-peer (P2P) systems as a technical networking model and a dynamic of social interaction that are inextricably intertwined, the article introduces how the perspective outlined above is particularly useful to adopt when studying a promising area of innovation: that of “alternative” or “legitimate” (Verma, 2004) applications of P2P networks to search engines, social networks, video streaming and other Internet-based services. The article seeks to show how the Internet's current trajectories of innovation increasingly suggest that particular forms of architectural distribution and decentralization (or their lack) impact specific procedures, practices and uses. Architectures should be understood as an “alternative way of influencing economic systems” (van Schewick, 2010), indeed, the very fabric of user behavior and interaction. Most notably, the P2P “alternative” to Internet-based services shows how the status of every Internet user as a consumer, a sharer, a producer and possibly a manager of digital content is informed by, and shapes in return, the technical structure and organization of the services (s)he has access to: their mandatory passage points, places of storage and trade, required intersections. In conclusion, this article is a call to study the technical architecture of networking applications as a “relational property” (Star & Ruhleder, 1996) and an integral part of human organization. It suggests that such an approach provides an added value to the study of those communities, groups and practices that, by leveraging socio-technical dynamics of distribution, decentralization, collaboration and peer production, are currently questioning more traditional or institutionalized models of content creation, search and sharing.

Keywords:
Peer-to-peer, architecture, innovation, Internet-based services, distribution

Francesca Musiani

Study an information system and neglect its standards, wires, and settings,
and you miss equally essential aspects of aesthetics, justice, and change.

– Susan Leigh Star (1999, p. 379)

1. Introduction

“Peer-to-peer is plumbing, and most people don’t care about plumbing,” Dan Bricklin, creator of the first spreadsheet, VisiCalc, pointed out some years ago in a seminal book about the “disruptive” potential of peer-to-peer (P2P) technology (Bricklin, 2001 in Oram, 2001, p. 59). The “most people” Bricklin refers to in this citation are, of course, end users of the popular first-generation P2P file-sharing applications, like Napster, that were experiencing their hour of glory at the dawn of the 21st century.

Indeed, Bricklin may have been right in his assessment of the first P2P file-sharing applications’ success: likely, it owes more to the suitability of such tools for rapidly finding a song and obtaining it than to their underlying peer-to-peer architecture in itself. Yet, this argument raises new and interesting methodological questions for scholars of social studies of networking technologies, be they used for communication, sharing, or production purposes. To what extent may Bricklin’s perception of indifference towards architecture apply not only to a majority of users of Internet-based services, but to these scholars as well – and why, instead, is it important for them to “care”?

This article discusses the relevance, for scholars working on social studies of network media, of addressing elements of application architecture and design as an integral part of their subject of study. By discussing ongoing research on “alternative” or “legitimate” (Verma, 2004) applications of P2P networking models, the article argues that social studies of network media need to “care about the plumbing,” or as Susan Leigh Star has effectively put it, “surface invisible work” (1999, p. 385) underlying networked practices, uses and exchanges – as an integral part of the “processes of constitution, organization, and change of […] the network society” (Castells, 2000, p. 693).

In doing so, the article proposes to acknowledge how Internet-based services’ current trajectories of innovation increasingly suggest that particular forms of distribution and decentralization (or their lack) impact specific procedures, practices and uses. As Barbara van Schewick has recently suggested, architectures should be understood as an “alternative way of influencing economic systems” (2010, p. 3), indeed, the very fabric of user behavior and interaction. Most notably, the status of every Internet user as a consumer, a sharer, a producer and possibly a manager of digital content is informed by, and shapes in return, the technical structure and organization of the services (s)he has access to: their mandatory passage points, places of storage and trade, required intersections. This article is a call to study the architecture of networking applications as a “relational property, not as a thing stripped of use” (Star & Ruhleder, 1996, p. 113), “as part of human organization, and as problematic as any other” (Star, 2002, p. 116). It suggests that such an approach provides an added value to the study of those communities, groups and practices that, by leveraging socio-technical dynamics of distribution, decentralization, collaboration and peer production, are currently questioning more traditional or institutionalized models of content creation, search and sharing.

2. Architectures, fieldwork and methods: fleshing out the invisible

The architecture of a network or an application is its underlying technical structure (van Schewick, 2010), designed according to a “matrix of concepts” (Agre, 2003): its logical and structural layout, consisting of transmission equipment, communication protocols, infrastructure, and connectivity between its components or nodes.[1] The choice of taking architectures, artifacts transparent to end users by fiat of their creators, as the starting point – or at least as an important and integral part – of a study of practices and uses of network media raises a number of challenges, but also holds great promise.

As Barbara van Schewick points out, the compartmentalization of disciplines may have led in the past to a general understanding of architectures as artifacts that are “relevant only to engineers”, and as such, should be exclusively left to their purview (2010, p. 2). However, in relation to network media, software, code and cyberinfrastructure studies have recently taken up the challenge of interdisciplinarity (e.g. Fuller, 2008), drawing on past endeavours in the field of sociology of technology and science, exploring the social and political qualities of infrastructures (e.g. Star, 1999). In addition, some authors experimenting at the intersection of computer science, sociology, law and science & technology studies (STS) explore innovative methodological approaches to architectures, working on the integration of architectures and practices in their analyses. These bodies of work will now be addressed in some more detail.

2.1. Disciplines and layers
Literature in computer science and computer engineering has, perhaps quite obviously, paid a great deal of attention to architectures of Internet-based applications and networks: their definition (Schollmeier, 2002; Schoder & Fischbach, 2003; Shirky et al., 2001), their technical advantages and disadvantages in a comparative perspective (e.g. client/server vs. peer-to-peer architectures, Verma, 2004, p. 11-16) and their application to specific projects serving a variety of uses (Oram, 2001, p. 67-159); these “purely” technical aspects of such systems are seldom addressed in relation to their societal, relational and organizational properties (Taylor & Harrison, 2009, p. 113-115). In some cases of highly publicized, debated applications – as has been the case for some P2P systems – engineers have at times sought to present a technical perspective on the limits and advantages of specific architectures within broader political and public debates (Auber, 2007; Le Fessant, 2006, 2009). Other scholars, interested in the metrology of networks, seek to model interactions by means of large-scale graphs, so as to study patterns of information propagation, the robustness of networks, and the forms of exchange and sharing (e.g. Aidouni et al., 2009). Their aim is to build measuring tools that are better adapted to the ever-increasing size and complexity of networks, and better able to face the increasing inadequacy of traditional statistical and sampling methods to account for the magnitude of this scaling process (Baccelli, 2005).

On the other hand, as of today, a substantial body of work in the economic and social sciences has sought to explore the practices of sharing, cooperation and interaction facilitated or enabled by online environments: this is the case for many contributions exploring new forms of organization, contribution and collaboration, such as social networks (e.g. Boyd, 2004; Cardon, 2008) or online communities (Auray, 2011), be they composed of fans (Hellekson & Busse, 2006), contributors to wiki projects (Reagle, 2010), or specialized professionals (Lock, 2006).

The body of work on the law of network technologies has, for its part – again, perhaps unsurprisingly – extensively addressed the dynamics of file-sharing practices by means of direct-exchange networking technologies, and has focused the debate on the ways in which innovative networking practices may be assimilated, by analogy, to mechanisms of remuneration and compensation similar to those in place for material, private copies (e.g. Gasser & Ernst, 2006). As pointed out by Mélanie Dulong de Rosnay (2005, 2007), as of now, only a comparatively small number of works has been devoted to the ways in which law can take into account the objects and sources of value (such as metadata and personal data) produced by new technical configurations.

2.2. Towards an integration of architectures and practices: the STS legacy
Some examples in recent literature open very interesting paths by undertaking the next step in the experimentation with interdisciplinarity. These authors, coming from a variety of different backgrounds, approach architectures in innovative ways by integrating the link between architectures and practices in their analyses.

Perhaps the most notable attempt in this direction is constituted by the work, carried out during the last fifteen years by Susan Leigh Star and colleagues within the field of STS, on infrastructures as constantly evolving socio-technical systems, informed not only by physical elements invisible to the end user, but also by factors such as social organization and knowledge sharing (Star & Ruhleder, 1996; Neumann & Star, 1996; Star, 1999; Star, 2002; Star & Bowker, 2006). Through her “call to study boring things,” Star effectively conveys the idea that architectural design choices, technical specifications, standards and number sequences are no less important to the study of information systems because they are “hidden mechanisms subtending those processes more familiar to social scientists” (Star, 1999, p. 377). As she writes in a seminal article on the ethnography of infrastructure:

It takes some digging to unearth the dramas inherent in system design creating, to restore narrative to what appears to be dead lists. […] Much of the ethnographic study of information systems implicitly involves the study of infrastructure. Struggles with infrastructure are built into the very fabric of technical work […]. However, it is easy to stay within the traditional purview of field studies: talk, community, identity, and group processes, as now mediated by information technology. […] Study an information system and neglect its standards, wires, and settings, and you miss equally essential aspects of aesthetics, justice, and change (Star, 1999, p. 377-379).

This “relational” approach brings about considerable changes in methods, as the scope of the fieldwork enlarges to include arenas where the shapes of architecture and infrastructure are observed, deconstructed, reconstructed, and decisions are made about codes, standards, bricolages, reconfigurations (Star & Bowker, 2006, p. 151-152), where the scholar undertakes a combination of “historical and literary analysis, traditional tools like interviews and observations, systems analysis, and usability studies” (Star, 1999, p. 382).

Emergent bodies of work such as software studies, critical code studies and cyberinfrastructure studies (Manovich, 2001; Fuller, 2008; Marino, 2006; Ribes & Lee, 2010) owe a lot to the STS approach, seeking, as Matt Kirschenbaum (2003) puts it, to balance “the deployment of critical terms like ‘virtuality’ […with] a commitment to meticulous documentary research to recover and stabilize the material traces of new media”. The materiality of software, code, and so-called virtual elements of the Internet user’s experience is reaffirmed, and the relationship between these layers (or “levels”, as defined by Mark Marino) explored:

Meaning grows out of the functioning of the code but is not limited to the literal processes the code enacts. Through CCS, practitioners may critique the larger human and computer systems, from the level of the computer to the level of the society in which these code objects circulate and exert influence (Marino, 2006).

2.3. Architectures: social, legal, political
On the side of computational and quantitative sociology, David Hales and colleagues seek to explore features of particular groupings that Hales calls “virtual tribes”, such as dynamic formation and dissolution over time, cooperation, specialization, reputation systems, and occasional antagonistic behavior; they consider that a thorough understanding of such phenomena is a necessary precondition for the construction of robust and resilient software systems, both today and in the future (Hales, 2006; Marcozzi & Hales, 2008; Hales, Arteconi, Marcozzi & Chao, 2008).

Information studies scholar and Internet pioneer Philip Agre explores, for his part, the relationship between technical architecture and institutions, notably the difference between “architecture as politics” and “architecture as a substitute for politics” (Agre, 2003). He argues that technologies “often come wrapped in stories about politics”, and while these stories may not explain the motives of the technologists, they are indeed useful to account for the energy that makes a technology an inherently social one, and projects it into the larger world (p. 39). Defining architectures as the matrixes of concepts (e.g. the distinction between clients and servers) designed into technology, and institutions as the matrixes of concepts that organize language, rules, job titles, and other social categories in particular societal sectors, Agre suggests that the engineering story of rationally distributed computation and the political story of institutional change through decentralized architecture are not naturally related. They reconfigure and evolve constantly, and for these reconfigurations and evolutions to share a common direction, they need work:

Decentralized institutions do not imply decentralized architectures, or vice versa. The drive toward decentralized architectures need not serve the political purpose of decentralizing society. Architectures and institutions inevitably coevolve, and to the extent they can be designed, they should be designed together. […] Radically improved information and communication technologies do open new possibilities for institutional change. To explore those possibilities, though, technologists will need better ideas about institutions (Agre, 2003, p. 42).

At the crossroads of informatics, economics and law, Barbara van Schewick has recently put forward the idea that the architecture of the Internet, and of the applications running on it, is relevant to economics. Her work seeks to examine how changes, notably design choices, in the Internet’s architecture (that she defines operationally as the “underlying technical structure” of the network of networks) affect the economic environment for innovation, and evaluates the impact of these changes from the perspective of public policy (2010, p. 2). According to van Schewick, this is a first step towards filling a gap in how scholarship understands innovators’ decisions to innovate and the economic environment for innovation: after many years of research on innovation processes, we understand how these are affected by changes in laws, norms, and prices; yet, we lack a similar understanding of how architecture and innovation impact each other (p. 2-3). Perhaps, van Schewick suggests, this is due to the intrinsic appeal of architectures as purely technical systems:

Just as the architecture of a house describes its basic inner structure, the architecture of a complex system describes the basic inner structure of the system — its components, what they do, and how they interact to provide the system’s functionality. That such a technical structure may have economic consequences at all is a relatively recent insight. Most people still think of architectures as technical artifacts that are relevant only to engineers. Thus, understanding how the Internet’s architecture affects innovation requires us to think more generally about how architectures affect innovation (van Schewick, 2010, p. 4).

Traditionally, she concludes, policy makers have used the law to bring about desired economic effects. Architecture de facto constitutes an alternative way of influencing economic systems, and as such, it is becoming another tool that actors can use to further their interests (p. 389).

Along the same lines, within a large-scale project investigating how the corpus of Requests for Comments (RFCs) of the Internet Engineering Task Force provides indications on the ways in which the Internet’s technical designers understood and engaged with law and policy issues, Sandra Braman has most recently (2011) explored how the core problem in the Internet’s technical design was to build structures that not only tolerated, but actually facilitated change. By addressing the ways in which change and stability themselves were conceptualized by Internet designers, Braman argues that undertaking research on architectural “design for instability” as applied to the Internet provides insight not only into the Internet itself, but into its social, legal and technical relations with other information and communication technologies (ICTs).

Drawing on pioneering works such as those of Yochai Benkler on sharing as a paradigm of economic production in its own right (2004) and of Lawrence Lessig on “code as law” (2002), the relationship between architecture and law is further explored by Niva Elkin-Koren (2002, 2006); a common trait of her works is their underlying perspective on architecture as a dynamic parameter, and she treats it as such while studying the reciprocal influences of law and technology design in information and communication systems. Elkin-Koren argues that analyses of the interrelationship between law and technology often focus on a single aspect, the challenges that emerging technologies pose to the existing legal regime, thereby creating a need for further legal reform; thus, she notes how juridical measures involving technology both as a target of regulation and as a means of enforcement should take into account that the law does not merely respond to new technologies, but also shapes them and may affect their design (Elkin-Koren, 2006).

3. What architecture for the future Internet (-based services)?

The Internet’s current trajectories of innovation are making it increasingly evident by the day: the evolutions (and in-volutions) of the “network of networks”, and at a broader level of electronic communications, are likely to depend in the medium-to-long term on the topology and the organizational/technical model of Internet-based applications, as well as on the infrastructure underlying them (Aigrain, 2011).

The development of services based on distributed architectures is currently affirming itself as one of the Internet’s most important axes of transformation. The concept of distribution is somehow shaped and inscribed into the very beginnings of the Internet – notably in the organization and circulation of information fluxes – but its current topology integrates this structuring principle only in very limited ways (Minar & Hedlund, 2001). The limits of the “classic” urbanism of the Internet, which has been predominant since the beginning of its commercial era and its appropriation by the masses, are becoming evident with regard to phenomena such as the widespread success of social media (Schafer, Le Crosnier & Musiani, 2011). While Internet users have become, at least potentially, not only consumers but also distributors, sharers and producers of digital content, the network of networks is structured in such a way that large quantities of data are centralized and compressed within specific regions of the Internet, at the very moment when they would be most suited to rapid re-diffusion and re-sharing in multiple locations of a network that has now reached its full globalization.

3.1. Architectures and the Internet’s “social value”
The current organization of Internet-based services and the structure of the network that enables their functioning, with its mandatory passage points, places of storage and trade, and required intersections, raise many questions, both in terms of the optimized utilization of storage resources, and of the fluidity, rapidity and effectiveness of electronic exchanges. Other interrogations, on the security of exchanges and on the stability of the network, must also be added to these issues: a series of malfunctions and breakdowns with important consequences at the global level [2] draws our attention to questions of security and data protection, inherent to the Internet’s current structure.

These questions largely impact the balance of power between users and network providers, and touch on issues of net neutrality. To what extent can network providers interfere with specific uses? Can the network be optimized for specific uses? As Barbara van Schewick points out, by enabling users to use the Internet in the way that creates the most value for them, changes in architecture are not only likely to impact the value of the Internet for users, but also to increase or diminish the Internet’s overall value to society:

But the social value of architectures […] goes beyond that. The Internet has the potential to enhance individual freedom, provide a platform for better democratic participation, foster a more critical and self-reflective culture, and potentially improve human development everywhere. The Internet’s ability to realize this potential, however, is tightly linked to features — user choice, non-discrimination, non-optimization (van Schewick, 2010, p. 387),

that may be achieved by designing its underlying architecture in different ways. Resorting to decentralized architectures and distributed organizational forms, then, constitutes a different way to address some issues of management of the network, in a perspective of effectiveness, security and digital “sustainable development” (better resource management), and of maximization of its value to society.

This idea is further explored by Michel Bauwens (2005) who, proposing a vision of the P2P model that is based on but goes beyond computer technology, puts forward P2P theory as a “general theory” of collaborative and direct human interaction, an emerging, pervasive and inherently social phenomenon that may be profoundly transforming the way in which society and human civilization are organized.

3.2. The peer-to-peer model: a return to the past, a promise for the future
Since the inception of the Internet, the principle of decentralization has governed the circulation of transmissions and communications on the “network of networks” (Aigrain, 2011). However, the introduction of the World Wide Web in 1990 has progressively led to the wide diffusion of “client-server” architecture models; the most widespread and diffused Internet-based services (social networks, instant messaging tools, digital content storage services…) are based upon technical and economic models in which end users request information, data and services from “farms” of powerful servers, storing information and/or managing network traffic (van Schewick, 2010, p. 70). Even if traffic on the Internet functions on the generalized distribution principle, it has now taken the form of concentration around servers delivering access to content. Yet, this modality of organization for structure and services, in and on the network, is not the only possible one – and while it is the most widespread, it is perhaps not the most effective. Thus, the search for alternatives is currently in progress (Aigrain, 2010, 2011; Moglen, 2010).

Peer-to-peer (P2P) architecture is reclaiming its place among these alternatives. It is a computer network model structured in such a way that communications and/or exchanges take place between nodes having the same responsibility within the system. The dichotomy between server (provider of the service) and client(s) (requesters of the service), typical of the client-server model, is replaced by a situation where every client becomes a server as well, where all peers have a resource and all peers request it (Schollmeier, 2002).
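
A minimal sketch may help fix the idea of this symmetry. The following Python fragment is purely illustrative and not drawn from any of the systems discussed in this article; the addresses, ports and one-line request format are hypothetical choices. It runs two toy peers on a single machine, each of which both serves a local resource and requests the other's:

import socket
import threading
import time

def serve(port: int, shared_resource: str) -> None:
    """Listen for one request from another peer and answer with a local resource."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen()
        conn, _ = srv.accept()
        with conn:
            conn.recv(1024)                          # read the peer's request
            conn.sendall(shared_resource.encode())   # answer, acting as a 'server'

def request(port: int) -> str:
    """Ask another peer for its resource, acting as a 'client'."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        cli.sendall(b"GET resource")
        return cli.recv(1024).decode()

if __name__ == "__main__":
    # Two peers: each serves its own resource and requests the other's.
    threading.Thread(target=serve, args=(9001, "resource held by peer A")).start()
    threading.Thread(target=serve, args=(9002, "resource held by peer B")).start()
    time.sleep(0.5)          # give both listeners time to start
    print(request(9002))     # peer A acting as a client of peer B
    print(request(9001))     # peer B acting as a client of peer A

Each node plays both roles at once: the dichotomy between provider and requester of the service disappears, which is precisely what distinguishes the P2P model described above from the client-server one.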

The P2P model is not per se innovative in the history of the Internet. Indeed, the original Internet was fundamentally designed as a peer-to-peer system, before the network started being populated by an ever-increasing number of end users, and became the device through which millions of consumer clients communicated with a “relatively privileged” set of servers (Minar & Hedlund, 2001, p. 4). Yet, as the quantity and quality of bandwidth increased, home computers became more powerful, and domestic users progressively diversified their activities beyond browsing the Web and trading emails, the conditions were set for another change – or, perhaps, a reversion, in which “machines in the home and on the desktop are connecting to each other directly, forming groups and collaborating to become user-created search engines, virtual supercomputers, and file systems”. So, while noticing the “many specific problems where the Internet architecture has been strained”, application developers often find themselves looking back to the Internet of twenty years ago when considering how best to solve a problem (Minar & Hedlund, 2001, p. 3; Figueiredo et al., 2008).

P2P architecture embraces the decentralization principle by harnessing the network in a different way than client-server applications do. In the client-server model, users request services from a cluster of servers of limited capacity; unless further servers can be added at any time, a critical point in data transmission for and to all users may eventually be reached as additional clients join the network – a point which, in extreme conditions, can turn into a denial-of-service situation. In a P2P architecture, users are not only exploiting a resource (be it bandwidth, storage space or computing power) but are providing it as well – so that, if the demand to which the system must respond increases, the total capacity of the system increases, too. P2P systems may also present advantages in terms of stability and endurance, as the distributed nature of the system improves its overall robustness and avoids its complete invalidation in case one of the nodes fails to perform as expected or disconnects from the system. Indeed, the effectiveness of P2P as a distribution model is strictly linked to its “plumbing”: the repartition of computing power and bandwidth among all components of the system, which changes the distributive structure and the allotment of costs by increasing bandwidth use at the level of the network, not of the server(s) (Elkin-Koren, 2006, p. 21-23).
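
The scaling argument can be made concrete with a back-of-the-envelope comparison. In the short Python sketch below, the figures are hypothetical and only serve to illustrate the reasoning above, not to measure any actual system: the aggregate capacity available under a client-server model stays fixed while the per-user share shrinks, whereas under a P2P model the total capacity grows with every peer that joins.

# Hypothetical figures, for illustration only.
SERVER_CAPACITY_MBPS = 1_000   # fixed capacity of a server cluster
PEER_UPLOAD_MBPS = 5           # capacity each peer contributes in the P2P case

for n_users in (100, 1_000, 10_000):
    per_user_share = SERVER_CAPACITY_MBPS / n_users   # shrinks as clients join
    p2p_total = n_users * PEER_UPLOAD_MBPS            # grows as peers join
    print(f"{n_users:>6} users | client-server share per user: "
          f"{per_user_share:7.2f} Mbps | total P2P capacity: {p2p_total} Mbps")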

In the course of their relatively short history, P2P systems have often been considered a threat to the interests of the digital content industries, as their main use by the public has been the unauthorized sharing of materials covered by intellectual property rights, notably copyright. More specifically, this reputation was forged in the early 2000s, with the advent of exchange and sharing practices at the global scale, concerning millions of users – the most emblematic case being that of Napster and its sixty million sharers, a service functioning on a centralized P2P architectural model, which was shortly followed by hybrid and purely decentralized versions. Shortly after the explosion of these “renewed” P2P technologies, attempts were also made to find economic models promoting this means of exchange within the current legal framework, but they have generally proven unsatisfactory [3].

The crucial role that such considerations have had in shaping the controversial status of P2P technologies vis-à-vis the media and the public may have led researchers into some pitfalls, as well. A reductionist interpretation of the “P2P effect”, which is often downplayed as a mere proxy for illegality, should be avoided – a perspective that is particularly evident, Niva Elkin-Koren remarks (2006), in the juridical literature on P2P and law. Also, social scientists should watch out for the traps that P2P, a model with strong a priori connotations of equality and decentralization, may set up. As noted by Philip Agre, it is particularly easy in the case of P2P to juxtapose architecture to the stories of institutions, individuals and groups, assuming that one determines the other – but this is a misleading shortcut:

In the case of P2P technologies, the official engineering story is that computational effort should be distributed to reflect the structure of the problem. But the engineering story does not explain the strong feelings P2P computing often evokes. The strong feelings derive from a political story, often heatedly disavowed by technologists but widespread in the culture: P2P delivers on the Internet’s promise of decentralization. By minimizing the role of centralized computing elements, the story goes, P2P systems will be immune to censorship, monopoly, regulation, and other exercises of centralized authority. This juxtaposition of engineering and politics is common enough, and for an obvious reason: engineered artifacts such as the Internet are embedded in society in complicated ways […] the case of P2P computing (is good) to analyze the relationship between engineering and politics—or, as I want to say, between architectures and institutions. […] The P2P movement understands that architecture is politics, but it should not assume that architecture is a substitute for politics (Agre, 2003, p. 39-42).

P2P-based socio-technical systems may be better analyzed and understood with an approach that addresses, studies, explores architecture as the very fabric of those interactions and examines how these shape, in return, subsequent negotiations and redesigns of the system. Scholars interested in networking technologies of communication and exchange need to “learn to read these invisible layers of control and access. In order to understand how this operates, however, it is necessary to ‘deconstruct’ the boring, backstage parts […], to disembed the narratives it contains and the behind-the-scenes decisions […], as part of material information science culture” (Star, 2002, p.110).

4. When architectures matter: the many faces of P2P systems

This article has sought to discuss the relevance, for social scientists interested in network media and systems, of paying analytical attention to elements of application architecture and design, as a core feature of their subject of study. In particular, by discussing P2P technology as a technical networking model and a dynamic of social interaction that are inextricably intertwined, it has endeavored to illustrate the potential and challenges of this approach when addressing issues of transformation and sustainability of the current Internet model. While the primary purpose of the article has been to discuss the foundations of a methodological perspective, and not to delve into the field by its means [4], this last section introduces – as both a conclusion and an overture – some elements on how I have actually taken architectures into account in my methodology when addressing an often underplayed, yet promising area of innovation within the field of Internet-based services: that of “alternative” or “legitimate” applications of peer-to-peer networks.

4.1. “Alternative” P2P and Internet-based services
A critical examination of different models of technical architectures, in terms of their impact on Internet-based cooperation and production practices – a better understanding of what the “plumbing” is about – makes it possible to single out a growing number of P2P applications, under-represented and somewhat hidden by the media buzz and the trials engendered by the illegal sharing of musical files (Laflaquière, 2005).

In recent years, mostly since 2004, many projects and applications have seen the light of day that propose alternatives – built on decentralized or P2P-based architectures – to Web-based online services occupying an important place in the daily life of Internet users. The uses entailed by such tools include information search and retrieval, sharing, and communication. Thus, these projects are positioning themselves with respect to services proposed by actors every Internet user is well acquainted with, such as Google, Facebook and Picasa. By harnessing the potential of P2P and of decentralization, the developers of such projects aim at satisfying the same needs from the point of view of the end user (who continues to search keywords, network with friends, and share pictures with them), but building the application on a different architectural model or technical platform. This is a move with potentially far-reaching implications for the service provider’s status, its access to information, and the material locations in which storage and sharing operations of user-created content are conducted.

The analysis of how the integration of architecture and practices is enacted in “alternative” P2P applications appears especially useful when studying up-and-coming experiments with the decentralization of storage and search services with a social networking component. This investigation has been at the core of my PhD dissertation, currently in the writing phase, parts of which have been published in previous papers (Musiani, 2010a, 2010b, 2011). These applications reveal their specificities with respect to both their centralized counterparts (serving the same purpose, but built on a different architecture) and file-sharing P2P networks. The attention to the “plumbing” makes it possible to delve into dynamics of articulation between local and global dimensions in a distributed application; of sharing of disk space and bandwidth as the cornerstone of a socio-economic model for P2P; of deployment of technical uncertainty and social opportunity at the “edges” of the network, where under-utilized resources, both human and material, can be leveraged.

4.2. A pragmatic approach to P2P architectures
Thus, the elaboration of case studies on “alternative” P2P applications – when it becomes an exploration of the ways and means in which the opportunity of change is constituted with P2P – entails a plural approach, one that follows, on the one hand, the innovators, trying to identify their strategies in the construction of the technologies, as well as their values, cultures and imaginaires of reference, and, on the other hand, the role played, where possible, by the first users of the systems. The objective is threefold: retracing and breaking down, in developers’ and users’ narratives, the actions and dynamics that represent at once P2P technology and the changes it promises; following, by means of onsite and online ethnography, how P2P innovators manage the economic, political and social repercussions of technical change throughout development processes; and tracing how discussions and controversies that take place on technical forums between developers and users, and among users themselves, progressively shape directions of mobilization for and by means of P2P.

For all these reasons, it proves useful to avoid considering “P2P” as a pre-defined object. Adopting a pragmatic approach, the starting point for the fieldwork becomes the observation that, in the ICT domain, there currently exists a variety of research projects and applications that, in different manners and for different purposes, take up a “P2P technology” defined in a transversal way as a decentralized, legal, private, social and user-centered alternative. This name and these five adjectives become the entry points into the fieldwork, through which to observe the (re)configurations and (re)compositions of P2P in the hands of the actors, and the shaping of the systems.

An empirical inquiry carried out by means of this approach helps identify, “live” and in a manner transversal to the different cases, uses and technologies “in the making” (Callon, 1987; Callon & Latour, 1990), trying to obtain a common vision of the directions of appropriation of P2P technologies. What I have called a “real-time sociology of innovation”, with which I have experimented during my PhD, proves a viable method to apprehend variable, multi-dimensional situations, and to attempt to draw some conclusions on their possible developments and applications. At the same time, there is a need to address the more ideological and utopian dimension of these “alternatives” – that which speaks of an Internet ideal of decentralization and autonomy – which is taken as a subject of inquiry, to try and show how it leads to ways of doing things, explains choices, validates assumptions. Along these lines, and once again following an STS-based tradition, the observation of transformations, passages, negotiations, modifications of objects, and of the moments where these are put on “trial” beyond the scheduled phases of development, is of special importance.

A particularly stimulating aspect of this approach is the consideration of how law and rights take shape with the P2P alternative, in the pursuit of three objectives. Firstly, in order to successfully define the “legality” of such services, which is strictly linked to their constantly evolving architecture and is often only partially accounted for in written juridical documents. Secondly, to try and provide instruments of analysis able to rise above a conception of the relationship between law and technology that all too often focuses on one aspect: the fact that emerging technologies pose challenges to existing legal regimes, creating a need for reform of these regimes. Thirdly, so that the objects and the resources enabling P2P, and being produced by P2P, may be fully conceived and treated as means of definition and protection of the rights of users of Internet-based services.

In short, the acknowledgment of the importance of architectures calls, in the specific case of the study of “alternative” P2P for Internet-based services, for a process of methodological readjustment. It implies delving into the technical functioning of direct transmission of data between machines of a decentralized network, perhaps including mechanisms of file fragmentation, encryption and maintenance, and taking it as a core feature (even if not necessarily the cause) of the types of exchanges taking place within a service, of their effectiveness, of their directness. It implies addressing the total or partial removal of technical “intermediaries” (Elkin-Koren, 2006) in online networking and sharing activities, as a structuring dynamic in new-generation participative instruments. It means understanding where in the “fringes and materialities of infrastructures” (Star, 2002, p. 107) a password is stored, a file is indexed and encrypted, a download starts and ends, so as to understand how new dynamics for the protection of personal liberties and rights are taking hold – or are endangered. Ultimately, learning to read the “invisible layers” of P2P-based socio-technical systems is as much a challenge as it is an opportunity to explore collaborative practices carried out in, on and through them, and to observe how these practices in-form the architecture in return, the sharing of resources it entails, and its medium- and long-term socio-technical sustainability.
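
To give a concrete sense of what such “invisible layers” may look like, the following Python sketch illustrates, in a deliberately simplified and hypothetical way (it does not reproduce any of the applications studied in my fieldwork, and leaves out encryption and redundancy), how a file can be fragmented into chunks, indexed by content hashes, and assigned to peers for storage:

import hashlib
import os

def fragment(data: bytes, chunk_size: int = 256 * 1024) -> list[bytes]:
    """Split a byte string into fixed-size chunks."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def index_and_place(chunks: list[bytes], peers: list[str]) -> dict[str, str]:
    """Map each chunk's content hash (its index entry) to the peer that stores it."""
    placement = {}
    for chunk in chunks:
        chunk_id = hashlib.sha256(chunk).hexdigest()
        # Deterministic placement: the hash itself decides which peer holds the chunk.
        placement[chunk_id] = peers[int(chunk_id, 16) % len(peers)]
    return placement

if __name__ == "__main__":
    data = os.urandom(2_000_000)             # roughly 2 MB of sample content
    peers = ["peer-A", "peer-B", "peer-C"]
    placement = index_and_place(fragment(data), peers)
    print(f"{len(placement)} chunks indexed and distributed over {len(peers)} peers")

Even such a toy example makes visible where, materially, a file “is” once it enters a decentralized service: not in a single server farm, but scattered, indexed and recomposable across machines at the edges of the network.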

However, in a connected world where more applications than ever want to use the network, send packets, consume bandwidth – thereby placing new strains and tensions on the Internet’s architecture – social scientists need to accept the challenge just as much as the technical people who are working on the future topology of the “network of networks”. It is, likely, one of the most promising ways to shed new light on dynamics of content creation, sharing, publishing and management, that are shaping, and being shaped by, the future Internet – one of the best ways to contribute to its future sustainability.

5. Conclusions. The “lower layers”, a key for the sociology of networks

“Caring about the plumbing”; “[f]inding the invisible work […] in the traces left behind by coders, designers, and users of systems” (Star, 1999); the inclusion of the lower layers in the analysis – this article has sought to suggest – means doing a sociology of networks that is not afraid of its subject of study.

A consequence of this approach is a specific attention to an aspect of networks that is not only very discreet, but even invisible to the eyes of the users: their architecture. Of course, we remain social scientists: this interest in architectures derives from the hypothesis that particular forms of distribution call for specific procedures, particular uses, peculiar “user portraits”. In doing so, one is able to flesh out how some attributes of technology, of which users often lack a direct knowledge or awareness, are bound to fully influence and inform issues that are often crucial for uses and practices, such as the treatment and physical location of data, the management of computing resources, the shape and results of their queries to search engines.

In the specific context of P2P, this article is also an invitation to further pursue the renovation of academic (and political) debates on what are currently very lively, but “alternative”, processes of content creation, search and sharing. Considering the architectural dimension helps to overcome today’s prevailing paradigm when taking P2P as a subject of study: the paradigm that, even when it focuses on forms of organization in or by means of P2P dynamics, reduces P2P to the uses it entails and makes possible – and to one of those uses, unauthorized file sharing, in particular.

The link between the ways in which P2P applications take shape – notably the evolutions of their technical architecture – and their possible influences on practices, relations and rights still remains quite under-explored. Yet, the shaping of links, nodes, mandatory transit points, and information propagation protocols – in one word, their architecture – tells us social scientists many things about the specificities and promises of P2P-based applications, the challenges they face, and the opportunities they may present for the medium-term evolution of the Internet model.

[1] The IEEE Standard for Architectural Description of Software-Intensive Systems (IEEE P1471/D5.3) defines [technical] architecture as ‘the fundamental organization of a system embodied by its components, their relationships to each other and to the environment and the principles guiding its design and evolution’ (Bredemeyer & Malan, 2001).

[2] E.g., respectively, Twitter’s repeated outages and the controversy over the service’s long-term sustainability (see Pingdom, 2007: Twitter had about six full days of downtime in 2007, due to server overload and the service’s failure to scale according to user demand), and the 2008 worldwide YouTube paralysis (see Bortzmeyer, 2008: the lack of access to the popular video streaming website was due to an erroneous BGP route announcement by Pakistan Telecom, aimed at blocking the diffusion of some contents in the country).

[3] As is the case for “Peer Impact”, a pay-for-download file sharing service launched in 2005, running on a BitTorrent-like peer-to-peer distribution system while maintaining centralized control of verification and authorization of downloads.

[4] Something I attempt to do in other venues: see Musiani 2010 and 2011.

Francesca Musiani is based at the Centre de Sociologie de l’Innovation, MINES ParisTech, Paris, France. This work is supported by a grant of the French National Agency for Research (ANR), Programme CONTINT-Contenus et Interactions, Project ADAM-Architectures distribuées et applications multimédias.

Works cited

Aigrain, P. (2010). Declouding Freedom: Reclaiming Servers, Services and Data. In 2020 FLOSS Roadmap (2010 Version/3rd Edition). Retrieved November 28, 2011, from https://flossroadmap.co-ment.com/text/NUFVxf6wwK2/view/

Aigrain, P. (2011). Another Narrative. Addressing Research Challenges and Other Open Issues session, PARADISO Conference, Brussels, 7–9 Sept. 2011.

Agre, P. (2003). Peer-to-Peer and the Promise of Internet Equality. Communications of the ACM, 46 (2), 39-42.

Aidouni, F., Latapy, M. & Magnien, C. (2009). Ten Weeks in the Life of an eDonkey Server. IEEE International Symposium on Parallel & Distributed Processing (IPDPS), 2009, 1-5.

Auber, O. (2007, March 19). Le Net, un bien commun: Quel projet politique pour le réseau? Club de l’Hyper-République. Retrieved March 29, 2011, from http://hyperrepublique.blogs.com/public/2007/03/quel_projet_pol.html

Auray, N. (2011, forthcoming). Information Communities and Open Governance: Boundaries, Statuses and Conflicts. In E. Brousseau, M. Marzouki, C. Méadel (Eds.), Governance, Regulations and Powers on the Internet. Cambridge: Cambridge University Press.

Baccelli, F. (2005, December 16). Internet : modéliser le trafic pour mieux le gérer. Interstices. Retrieved March 29, 2011, from http://interstices.info/jcms/c_12842/internet-modeliser-le-trafic-pour-mieux-le-gerer

Bauwens, M. (2005). P2P and Human Evolution: Placing Peer to Peer Theory in an Integral Framework. Integral Visioning. Retrieved March 29, 2011, from http://www.integralworld.net/bauwens2.html.

Benkler, Y. (2004). Sharing Nicely: On Shareable Goods and the Emergence of Sharing as a Modality of Economic Production. The Yale Law Journal, 114 (2), 273-358.

Boyd, D. (2004). Friendster and Publicly Articulated Social Networks. Conference on Human Factors and Computing Systems. Vienna, ACM, April 24-29, 2004.

Bortzmeyer, S. (2008, February 25). Le Pakistan pirate YouTube. Blog de Stéphane Bortzmeyer. Retrieved March 31, 2011, from http://www.bortzmeyer.org/pakistan-pirate-youtube.html.

Braman, S. (2011). Designing for Instability: Internet Architecture and Constant Change. Media In Transition 7 (MIT7) Unstable Platforms: the Promise and Peril of Transition, Cambridge, MA, May 13-15, 2011.

Bredemeyer, D. & Malan, R. (2001). Architecture Definitions. Retrieved September 5, 2011, from http://www.bredemeyer.com/pdf_files/Definitions.pdf.

Bricklin, D. (2001). The Cornucopia of the Commons. In A. Oram (Ed.), Peer-to-Peer: Harnessing the Power of Disruptive Technologies (pp. 59-63). Sebastopol, CA: O’Reilly.

Callon, M. (1987). Society in the Making: The Study of Technology as a Tool for Sociological Analysis. In W. Bijker, T.P. Hughes and T. Pinch (Eds.), The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology (pp. 83-103). Cambridge, MA and London: The MIT Press.

Callon, M. & Latour, B. (Eds., 1990). La science telle qu’elle se fait. Paris: La Découverte.

Cardon, D. (2008). Le design de la visibilité. Un essai de cartographie du web 2.0. Réseaux, 2008/6 (152), 93-137.

Castells, M. (2000). Toward a Sociology of the Networked Society. Contemporary Sociology, 29 (5), 693-699.

Dulong de Rosnay, M. (2005). Image et droit, là où la technique s’en mêle… Documentaliste – Sciences de l’information, 42 (6), 405-411.

Dulong de Rosnay, M. (2007). La mise à disposition des œuvres et des informations sur les réseaux: régulation juridique et régulation technique, Unpublished PhD dissertation, Université Panthéon-Assas, Paris, France.

Elkin-Koren, N. (2002). It’s All About Control: Rethinking Copyright in the New Information Landscape. In N. Elkin-Koren & N. W. Netanel (Eds.), The Commodification of Information (pp. 415-431). The Hague, Netherlands: Kluwer Law International.

Elkin-Koren, N. (2006). Making Technology Visible: Liability of Internet Service Providers for Peer-to-Peer Traffic. New York University Journal of Legislation & Public Policy, 9 (15), 15-76.

Figueiredo, R. J., Boykin, P. O., St. Juste, P. and Wolinsky, D. (2008). Social VPNs: Integrating Overlay and Social Networks for Seamless P2P Networking. Proceedings of the 2008 IEEE 17th Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises. Washington, DC: IEEE Computer Society.

Fuller, M. (Ed., 2008). Software Studies: A Lexicon. Cambridge, MA: The MIT Press.

Gasser, U. & Ernst, S. (2006, December). European Union Copyright Directive Best Practice Guide: Implementing the EU Copyright Directive in the Digital Age. University of St. Gallen Law & Economics Working Paper No. 2007-01; Berkman Center Research Publication No. 2006-10. Retrieved March 29, 2011, from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=952561.

Hales, D. (2006). Emergent Group-Level Selection in a Peer-to-Peer Network. Complexus, 2006 (3): 108-118.

Hales, D., Arteconi, S., Marcozzi, A. & Chao, I. (2008). Towards a Group Selection Design Pattern. In F. Meyer (Ed.), The European Integrated Project “Dynamically Evolving, Large Scale Information Systems (DELIS)”: Proceedings of the Final Workshop. Barcelona, February 27-28, 2008.

Hellekson, K. & Busse, K. (Eds., 2006). Fan Fiction and Fan Communities in the Age of the Internet. Jefferson, NC: McFarland.

Kirschenbaum, M. (2003). Virtuality and VRML: Software Studies after Manovich. Electronic Book Review. Retrieved November 28, 2011, from http://www.electronicbookreview.com/thread/technocapitalism/morememory

Laflaquière, J. (2005). Les “autres” applications des technologies Peer-to-Peer. Multitudes, 2 (21), 59-68.

Le Fessant, F. (2006). Peer-to-peer: comprendre et utiliser. Paris: Eyrolles.

Le Fessant, F. (2009). Un point de vue technique sur la loi Internet et Création. Retrieved March 29, 2011, from http://fabrice.lefessant.net.

Lessig, L. (2002). The Future of Ideas. New York: Vintage Books.

Lock, J. V. (2006). A New Image: Online Communities to Facilitate Teacher Professional Development. Journal of Technology and Teacher Education, 14 (4), 663-678.

Manovich, L. (2001). The Language of New Media. Cambridge, MA: The MIT Press.

Marcozzi, A. & Hales, D. (2008). Emergent Social Rationality in a Peer-to-Peer System. Advances in Complex Systems (ACS), 11 (4), 581-595.

Marino, M. C. (2006). Critical Code Studies. Electronic Book Review. Retrieved November 28, 2011, from http://www.electronicbookreview.com/thread/electropoetics/codology

Minar, N. and Hedlund, P. (2001). A Network of Peers – Peer-to-Peer Models Through the History of the Internet. In A. Oram (Ed.), Peer-to-Peer: Harnessing the Power of Disruptive Technologies (pp. 9-20). Sebastopol, CA: O’Reilly.

Moglen, E. (2010). Freedom In The Cloud: Software Freedom, Privacy and Security for Web 2.0 and Cloud Computing. ISOC Meeting, New York Branch, 5 February 2010.

Musiani, F. (2011). Privacy as Invisibility: Pervasive Surveillance and the Privatization of Peer-to-Peer Systems. tripleC, 9(2): 126-140.

Musiani, F. (2010a). When Social Links Are Network Links: the Dawn of Peer-to-Peer Social Networks and Its Implications for Privacy. Observatorio, 4 (3), 185-207.

Musiani, F. (2010b). Ménager le droit à la vie privée, entre anonymat et connaissance de l’identité: les débuts des réseaux sociaux en pair-à-pair, Terminal, 105: 107-116.

Neumann, L. & Star, S. L. (1996). Making Infrastructure: the Dream of a Common Language. In J. Blomberg, F. Kensing, & E. Dykstra-Erickson (Eds.), Proceedings of the PDC ’96 (pp. 231-240). Palo Alto, CA: Computer Professionals for Social Responsibility.

Oram, A. (Ed., 2001). Peer-to-Peer: Harnessing the Power of Disruptive Technologies. Sebastopol, CA: O’Reilly.

Pingdom staff writer (2007, December 19). Twitter Growing Pains Cause Lots of Downtime in 2007. Royal Pingdom (blog of Pingdom). Retrieved March 31, 2011, from http://royal.pingdom.com/2007/12/19/twitter-growing-pains-cause-lots-of-downtime-in-2007/

Reagle, J. (2010). Good Faith Collaboration: The Culture of Wikipedia. Cambridge, MA: The MIT Press.

Ribes, D. & Lee, C. P. (2010). Sociotechnical Studies of Cyberinfrastructure and e-Research: Current Themes and Future Trajectories. Computer Supported Cooperative Work, 19, 231-244.

Schafer, V., Le Crosnier, H., & Musiani, F. (2011). La neutralité de l’Internet, un enjeu de communication. Paris: CNRS Editions/Les Essentiels d’Hermès.

Shirky, C., Truelove, K., Dornfest, R., Gonze, L., & Dougherty, D. (Eds., 2001). 2001 P2P networking overview. Sebastopol, CA: O’Reilly.

Schoder, D. & Fischbach, K. (2003). Peer-to-peer prospects. Communications of the ACM, 46 (2), 27–29.

Schollmeier, R. (2002). A Definition of Peer-to-Peer Networking for the Classification of Peer-to-Peer Architectures and Applications. Proceedings of the IEEE 2001 International Conference on Peer-to-Peer Computing (P2P2001) (pp. 101-102), Linköping, Sweden, August 27-29, 2001.

Star, S. L. (1999). The Ethnography of Infrastructure. American Behavioral Scientist, 43 (3), 377-391.

Star, S. L. (2002). Infrastructure and ethnographic practice: Working on the Fringes. Scandinavian Journal of Information Systems, 14 (2), 107-122.

Star, S. L. & Bowker, G. (2002). How To Infrastructure. In Lievrouw, L. A. (Ed.), Handbook of New Media (pp. 151-162), London: Sage.

Star, S. L., and Ruhleder, K. (1996). Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces. Information Systems Research, 7, 111-133.

Taylor, I. & Harrison, A. (2009). From P2P to Web Services and Grids: Evolving Distributed Communities. Second and Expanded Edition. London: Springer-Verlag.

van Schewick, B. (2010). Internet Architecture and Innovation. Cambridge, MA: The MIT Press.

Verma, D. (2004). Legitimate Applications of Peer-to-Peer Networks. Hoboken, NJ: John Wiley & Sons.