Linux - Governance

From P2P Foundation
Revision as of 12:34, 20 February 2007 by Dafermos (talk | contribs)

Linux is a Free Software-based alternative operating system.

It uses Peer Production methods and can be considered a Virtual Network Organization.

The material below focuses on its Peer Governance structure.



History of Linux

By George Dafermos at http://www.firstmonday.org/issues/issue6_11/dafermos/


"In 1971, Richard Matthew Stallman (RMS) started working at MIT's AI Lab. At the time, the AI (artificial intelligence) Lab was regarded as heaven by most computer programmers, as it gave rise to many innovations and contributed to the development of ARPAnet in 1969 (the ancestor of the Internet). Around MIT, an entire culture was built that defined programmers as scientists and members of a networked tribe, because ARPAnet enabled researchers everywhere to exchange information, collaborate and hence accelerate technological innovation (Raymond, 1999a). After all, one of the motivations for launching ARPAnet was to connect communities of computer programmers so that they could share their programs and knowledge.[1]

Programming was the ultimate pleasure for 'hackers', and nothing but programming mattered. The AI Lab had become their home. They defined themselves through programming, and the hacker culture was their religion (Levy, 1984). They were scientists, and as such their creations and discoveries (software) should be available to everyone to test, verify, replicate and build upon to boost further scientific innovation. Software is code (or source code), and the code they developed was available to everyone. They could not conceive that software would one day be copyrighted.

RMS was fascinated by this culture and spent nearly ten years at the AI Lab, until in 1981 a company called Symbolics hired away all the AI Lab programmers apart from two: one of them was RMS. The era of the commodification of software had begun, and increasingly hackers joined company payrolls to develop proprietary software whose source code was guarded as a trade secret. RMS was deeply frustrated that his community had been destroyed by Symbolics and proprietary software. He decided to embark on a crusade that has not been matched before or since: to rebuild the hacker community by developing an entire free operating system. For him, "free" means that the user has the freedom to run the program, modify it (and must thus be provided with the source code), redistribute it (with or without a fee, as long as the source code is provided with the copy) and redistribute modified versions of it. The term "free software" has nothing to do with price. It is about freedom.[2]

In 1984, he started his GNU project (the name stands for "GNU's Not Unix"), which was meant to become a free alternative to the Unix operating system, and in 1985 he founded the Free Software Foundation (FSF). To ensure that GNU software would never be turned into proprietary software, he created the GNU General Public License (GNU GPL).[3]

The GNU GPL sets out the distribution terms for source code that has been "copylefted" following the FSF's guidelines. Copylefting involves copyrighting a program and then adding specific distribution terms that give everyone the right to use, modify and redistribute the code. The distribution terms of the GPL are 'viral' in the sense that derivative works based on GPL'd source code must also be distributed under the GPL. As a result, any program that incorporates GPL'd source code must also publicly release its own source code. GPL'd code therefore remains free and cannot be co-opted by proprietary development.[4] In fact, the Linux OS is licensed under the GNU GPL and uses many of the GNU programs; that is why it is often referred to as GNU/Linux.

During his crusade, he wrote powerful, elegant software that programmers could both use as programming tools and combine as the pieces that would eventually make up his dream: the GNU operating system. His software appeals to many programmers, and so the pool of GPL'd software grows constantly. He remains a strong advocate of all aspects of freedom, and he sees free software as the area where he can contribute the most.[5]

However, at a historic meeting in 1998, a group of leaders of the free software movement came together to find a way to promote the ideas surrounding free software to large enterprises, having concluded that large companies with large budgets were the key drivers of the software industry. "Free" was felt to carry too many negative connotations for a corporate audience, and so they coined a new term to describe the software they were promoting: Open Source.[6] Stallman was excluded from this meeting, because "corporate friendly" is no compliment in his book.[7]

They concluded that an Open Source Definition and license were necessary, as well as a large marketing and PR campaign. The Open Source Definition and license[8] "adhere to the spirit of GNU (GNU GPL), but they allow greater promiscuity when mixing proprietary and open source software".[9]

This strategy has worked wonders since then: key players in the industry (IBM, Netscape, Compaq, HP, Oracle, Dell, Intel, RealNetworks, Sony, Novell and others) have shown great interest in the Open Source movement, its development and business models, and open source products. These days an entire economic web exists around Open Source, in which, for example, many companies offer support services and complementary products for Open Source products.[10] In addition, there is plenty of favourable press coverage, and even Microsoft regards Open Source as a significant competitive threat (Valloppillil, 1998).


In 1991, Linus Torvalds made a free Unix-like kernel (the core part of an operating system) available on the Internet and invited all interested hackers to participate. Within the next two months, the first official version of Linux was released. From that point, tens of thousands of developers, dispersed globally and communicating via the Internet, contributed code, so that as early as 1993 Linux had grown into a stable, reliable and very powerful operating system." (http://www.firstmonday.org/issues/issue6_11/dafermos/)

Linux Organization

From George Dafermos at http://www.firstmonday.org/issues/issue6_11/dafermos/


"Linus cultivated his base of co-developers and leveraged the Internet for collaboration. The Open Source movement's self-appointed anthropologist, E. Raymond, explains that the leader-coordinator of a 'bazaar style of effort' does not need to possess exceptional design talent so much as the ability to leverage the design talent of others (Raymond, 1998a).

However, "the Linux movement did not and still does not have a formal hierarchy whereby important tasks can be handed out ... a kind of self-selection takes place instead: anyone who cares enough about a particular program is welcome to try" [11]. If a contributor's work is not good enough, another hacker will immediately fill the gap. In this way, self-selection ensures that the work done is of superb quality. Moreover, this "decentralisation leads to more efficient allocation of resources (programmers' time and work) because each developer is free to work on any particular program of his choice as his skills, experience and interest best dictate" (Kuwabara, 2000). In contrast, "under a centralised mode of software development, people are assigned to tasks out of economic considerations and might end up spending time on a feature that the marketing department has decided is vital to their ad campaign, but that no actual users care about".[12]

Linus Torvalds' authority is restricted to having the final word on implementing any changes (code that has been sent to him). He cannot, however, act as an autocrat, since everything is done in full transparency. Most communication takes place on the Linux mailing list (which serves as a central discussion forum for the community and is open to the public), and Linus has to justify all his decisions with solid technical arguments. The leadership's accountability is essential: only by earning the community's respect can leadership be maintained.[13]

There is only one layer between the community of Linux developers and Linus: the "trusted lieutenants". They are a dozen or so hackers who have done considerable work on a particular part of the kernel and have thereby gained Linus' trust. Each "trusted lieutenant" is responsible for maintaining a part of the Linux kernel, and many developers send their patches (their code) directly to them instead of to Linus. Beyond the fact that Linus has encouraged this to happen, this informal mechanism represents a natural selection by the community, since the "trusted lieutenants" are recognised [by the community] not as owners but simply as experts in particular areas [14], and thus their 'authority' can always be openly challenged. Nor is Linus' own influence beyond challenge. Recently, "Alan Cox (one of the 'trusted' ones) disagreed with Linus over some obscure technical issue and it looks like the community really does get to judge by backing Alan and making Linus acknowledge that he made a bad choice".[15]

What made this parallel, decentralised development feasible is the highly modular design of the Linux kernel. "It meant that a Unix-like operating system could be built piecemeal, and others could help by working independently on some of the various components".[16] Modularity means that changes can be implemented without risk of negatively affecting any other part of the kernel. It makes Linux an extremely flexible system, enables massive parallelism in development [17] and decreases the total need for coordination.[18]

Open Source programmers write software to solve a specific problem they are facing, 'scratching their personal itch' as E. Raymond puts it (Raymond, 1998a). Once they have done so, they can choose either to sit on the patch or to give it away for free, hoping to encourage reciprocal giving from others (Raymond, 1999c). But it is the 'hacker ethic' and the community dynamics that provide strong incentives to contribute. First of all, the process of developing such software is highly pleasurable to them. "They do not code for money, they code for love",[19] and they tend to work with others who share their interests and contribute code. They enjoy a multiplier effect from such cooperation (Prasad, 2001), and "when they get the opportunity to build things that are useful to millions, they respond to it".[20]

In addition, the Linux community has dealt with the lack of centralised organisation through the implicit reputation system that characterises the Open Source (OS) community (Kuwabara, 2000). OS programmers have always been keen, even unconsciously, on gaining the community's respect ("seeking peer repute"), and the best way to do so is by contributing to an already existing project that is interesting enough or has created enough momentum within the community (Raymond, 1998b).

Massive parallel development is evident in the case of Linux. "In a conventional software development process, because of economic and bureaucratic constraints, parallel efforts are minimized by specifying the course of development beforehand, mandated by the top. In Linux, such constraints are absent since Linux is sustained by the efforts of volunteers. In the 'Cathedral', parallelism translates to redundancy and waste, whereas in the 'Bazaar', parallelism allows a much greater exploration of a problem-scape for the global summit and the OSS process benefits from the ability to pick the best potential implementation out of the many produced". Thus, apart from quality and innovation, social learning is also maximised in a 'bazaar style' of development (Figure 10) [21].

Linux has a parallel release structure: one version (the 1.2.x and 2.0 series) is stable and satisfies users who need a stable and secure product, while another (the 2.1.x series) is experimental, 'cutting-edge' and targeted at advanced developers [22].
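The stable/experimental split was encoded in the version numbers themselves: in the kernels of that era, an even second number (1.2.x, 2.0.x) marked a stable series and an odd one (1.3.x, 2.1.x) a development series, a convention later abandoned after the 2.6 kernels. A minimal sketch of that convention (the function name is ours, not a kernel API):

```python
def release_series(version: str) -> str:
    """Classify a 2.x-era kernel version string by the even/odd convention:
    an even second (minor) number meant a stable series, an odd one a
    development series. Applies only to kernels of that era."""
    minor = int(version.split(".")[1])
    return "stable" if minor % 2 == 0 else "development"
```

For example, `release_series("2.0.36")` yields `"stable"` while `release_series("2.1.44")` yields `"development"`, matching the two series the paragraph above describes.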

The significance of the parallel release structure should not be underestimated. All decision-making situations face a trade-off between exploration and exploitation: testing new opportunities comes at the expense of realising the benefits of those already available, and vice versa (Holland, 1975; March, 1991; Axelrod and Cohen, 1999). The obvious organisational implication is that whenever resources are allocated to the search for future innovation, resources dedicated to the exploitation of existing capabilities are sacrificed, and vice versa. This trade-off is fundamental to all systems and vital to their ability to evolve and survive; a favourable balance is therefore critical (Kauffman, 1993).
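The exploration-exploitation trade-off invoked here can be made concrete with a toy example. The sketch below implements a generic epsilon-greedy rule (nothing Linux-specific; the names and parameters are illustrative): a fixed fraction of trials is spent exploring options at random, and the remainder exploiting the best option found so far, so every unit of exploration is a unit withheld from exploitation.

```python
import random

def epsilon_greedy(rewards, trials=1000, epsilon=0.1, seed=0):
    """Balance exploration vs. exploitation over a fixed reward table.
    rewards[i] is the (deterministic, for simplicity) payoff of option i.
    Returns the running estimate of each option's payoff."""
    rng = random.Random(seed)
    estimates = [0.0] * len(rewards)
    counts = [0] * len(rewards)
    for _ in range(trials):
        if rng.random() < epsilon:
            # explore: try a random option, sacrificing known payoff
            i = rng.randrange(len(rewards))
        else:
            # exploit: pick the best option found so far
            i = max(range(len(rewards)), key=lambda j: estimates[j])
        counts[i] += 1
        # incremental running mean of observed payoffs
        estimates[i] += (rewards[i] - estimates[i]) / counts[i]
    return estimates
```

With epsilon at 0, the rule never discovers options better than its first guess; with epsilon at 1, it never cashes in on what it has learned. The parallel release structure is, in these terms, a way of running both policies at once: the stable series exploits, the experimental series explores.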

Surprisingly, this trade-off barely constrains Linux: the parallel release structure ensures that both present and future (potential) capabilities are harnessed at once, which renders Linux extremely adaptive to its environment and sustains a high level of innovation and organisational learning." (http://www.firstmonday.org/issues/issue6_11/dafermos/)


Linux Governance

George Dafermos:

"Linux relies on four structural layers: the leader (Torvalds), the "trusted lieutenants", the "pool of developers" and the Open Source community. Even though the community layer does not appear to be part of the Linux structure at first glance, it is in fact incorporated in it and has significant influence over the project. If a hierarchical classification must be made, the layering is horizontal rather than vertical.

This structure is not a paradox if we consider the technical objectives of Linux as a project: quality and flexibility. For these to be achieved, a system ensuring the highest quality and flexibility had to be in place. This mechanism takes the form of emergent leadership and emergent task ownership, which implies that all decisions and hierarchical positions can be openly challenged, so that the best decision is reached and the most capable individual is chosen. Naturally, the prerequisites for such a style of managing are product and decision-making transparency, and Linux has embraced both from its inception.

Consequently, this transparent, 'collaborative, community-centred' management is by far the greatest innovation of Linux and the key enabler of value extraction from its 'knowledge functions', since all organisational members, including the surrounding community, are encouraged to share information freely.

The "trusted lieutenants" form a support value stream/function rather than an additional layer, since developers can go directly to Torvalds, and the developers are the primary value stream. This is a pure network structure, as all participants have access to all the nodes of the system. Accordingly, the flow of information is 'networked', since all stakeholders have access to every communication through the mailing list.

The Linux project is characterised by a flow of information across the entire organisation, meaning among the thousands of developers, Linus Torvalds and the "trusted lieutenants". All of them have access to, and can engage in, all occurring conversations, and can communicate directly with every other member-implementer through discussion forums, mailing lists and newsgroups. But the flow of information is not restricted to implementers: it extends to the global community, reaching virtually everyone interested, including commercial companies (e.g. Cygnus Solutions, SuSE, Red Hat, VA Linux) that provide support services and packaged distributions, computer scientists who may not have been involved directly (as implementers), companies considering adopting open source software for their own internal use, users who need help from experts, and anyone interested or curious enough to observe or even participate in any given conversation. Access to the various open source-related Web sites, discussion forums, etc. is open to the public and all interested parties." (http://www.firstmonday.org/issues/issue6_11/dafermos/)


Virtual Roofs

A concept used to characterise the management structure of Peer Production projects such as Linux.

George Dafermos:


"The virtual roof is the common denominator of the users, the implementers and the surrounding community. It acts as an intermediary (a trusted third party) that ensures trust in a largely anonymous virtual marketplace by providing an electronic platform on which networked communication is nurtured.

However, the virtual roof has no power to delegate authority, nor to abuse its position in favour of the implementers, the users or the surrounding community by establishing asymmetries of information.[23]

The main reason is its transparent nature: it is open to the public and all "transactions" are visible. It rests on the mutual realisation, among the various members, that abuse of power and asymmetries of information would impede the long-term benefits for all. This is the organisational 'copyleft'. Deep down, it is the system of 'lean production' modified to create trust in the public sphere, hence extending its reach to include the users and the global community as well.

All the virtual roof can do is enforce common standards and rules of practice (which facilitate collaboration and decentralised development) that are approved by the community.

Under a virtual roof, people can gather, talk, share ideas, spread information, announce projects and get community support.[24] These online communities, once formed, make new relationships possible: direct user-implementer interaction. The 'implementers talking to the users' mechanism is possible in any project, depending on one thing: the level of dedication of the surrounding community to the project [25]. This is why virtual roofs exist: to establish trust, enable critical relationships that would otherwise be impossible on any significant scale, and orchestrate decentralised efforts towards a centralised goal.

Virtual roofs dedicated to global (user-oriented) communities are essential to the life of projects whose success depends on the dedication of the surrounding global community and on utilising geographically dispersed resources under no central planning. In such places, leadership is emergent and has a dual objective: to give direction and momentum to the project." (http://www.firstmonday.org/issues/issue6_11/dafermos/#note21)


Notes

[1] E.E. David, Jr. and R.M. Fano, 1965. "Some thoughts about the social implications of the accessible computing," excerpts reprinted in IEEE Annals of the History of Computing, volume 14, number 2, pp. 36-39, and at http://www.multicians.org/fjcc6.html, accessed 27 October 2001; J. Abbate, 1999. Inventing the Internet. Cambridge, Mass.: MIT Press; and, J.J. Naughton, 2000. A Brief History of the Future: From Radio Days to Internet Years in a Lifetime. Woodstock, N.Y.: Overlook Press.

[2] R.M. Stallman, 1999. "The GNU operating system and the free software movement," In: C. DiBona, S. Ockman, and M. Stone (editors). Open Sources: Voices from the Open Source Revolution. Sebastopol, Calif.: O'Reilly.

[3] Accessible at www.gnu.org/licenses/gpl.html

[4] P. Hood and D. Hall, 1999. "Open source software: Lighthouse case study," Toronto: Alliance for Converging Technologies, p. 9.

[5] G. Moody, 2001. Rebel Code: the Inside Story of Linux and the Open Source Revolution. Cambridge, Mass.: Perseus, p. 29.

[6] DiBona, Ockman, and Stone (editors), 1999, p. 4; Hood and Hall, 1999 p. 24.

[7] M. Lewis, 2001. "Free spirit in a capitalist world: Interview with Richard Stallman," Computer Weekly (20 April), at http://www.cw360.com/, accessed 27 October 2001.

[8] Accessible at www.opensource.org/docs/definition.html.

[9] DiBona, Ockman, and Stone (editors), 1999, p. 4.

[10] For a discussion of the business models based on Open Source, see Hood and Hall, 1999; Raymond, 1999c; and, DiBona, Ockman, and Stone (editors), 1999.

[11] Moody, 2001, p. 62.

[12] Interview with Philip Charles.

[13] Interview with Chris Dibona.

[14] Moody, 2001, pp. 81, 84.

[15] Interview with Glyn Moody.

[16] Moody, 2001, pp. 14, 82.

[17] L. Torvalds, 1999a. "The Linux edge," Communications of the ACM, volume 42, number 4, pp. 38-39; L. Torvalds, 1999b. "The Linux edge," in C. DiBona, S. Ockman, and M. Stone (editors). Open Sources: Voices from the Open Source Revolution. Sebastopol, Calif.: O'Reilly, pp. 101-119.

[18] J.Y. Moon and L. Sproull, 2000. "Essence of distributed work: The Case of the Linux kernel," First Monday, volume 5, number 11 (November), at http://www.firstmonday.org/issues/issue5_11/moon/, accessed 28 October 2001.

[19] Interview with Glyn Moody.

[20] Interview with Richard M. Stallman.

[21] K. Kuwabara, 2000. Linux: A Bazaar at the edge of chaos, First Monday, volume 5, number 3 (March), at http://www.firstmonday.org/issues/issue5_3/kuwabara/, accessed 28 October 2001; T. Nadeau, 1999a. "Learning from Linux: OS/2 and the Halloween Memos," Part 1 - Halloween I, at http://www.os2hq.com/archives/linmemo1.htm, accessed 28 October 2001.

[22] C.B. Browne, 1998. "Linux and decentralized development," at http://www.firstmonday.org/issues/issue3_3/browne/, First Monday, volume 3, number 3 (March), accessed 27 October 2001; J.Y. Moon and L. Sproull, 2000. "Essence of distributed work: The Case of the Linux kernel," First Monday, volume 5, number 11 (November), at http://www.firstmonday.org/issues/issue5_11/moon/, accessed 28 October 2001.

[23] Interview with Chris Dibona.

[24] Interview with Dan Barber.

[25] Interview with Dan Barber.



References

R.M. Axelrod and M.D. Cohen, 1999. Harnessing Complexity: Organizational Implications of a Scientific Frontier. New York: Free Press.

J.H. Holland, 1975. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Implications to Biology, Control, and Artificial Intelligence. Cambridge, Mass.: MIT Press.

S.A. Kauffman, 1993. The Origins of Order: Self-organization and Selection in Evolution. New York: Oxford University Press.

K. Kuwabara, 2000. "Linux: A Bazaar at the edge of chaos," First Monday, volume 5, number 3 (March), at http://www.firstmonday.org/issues/issue5_3/kuwabara/, accessed 28 October 2001.

S. Levy, 1984. Hackers. London: Penguin.

J.G. March, 1991. "Exploration and exploitation in organizational learning," Organization Science, volume 2, number 1, pp. 71-87.

G. Prasad, 2001. "Open Source-onomics: Examining some pseudo-economic arguments about open source," at http://www.freeos.com/printer.php?entryID=4087, accessed 28 October 2001.

E.S. Raymond, 1998a. "The Cathedral and the bazaar," First Monday, volume 3, number 3 (March), at http://www.firstmonday.org/issues/issue3_3/raymond/, accessed 28 October 2001.

E.S. Raymond, 1999a. "A Brief history of hackerdom," at http://www.tuxedo.org/~esr/writings/hacker-history/, accessed 28 October 2001.

E.S. Raymond, 1999b. "A response to Nikolai Bezroukov," First Monday, volume 4, number 11 (November), at http://www.firstmonday.org/issues/issue4_11/raymond/, accessed 28 October 2001.

E.S. Raymond, 1999c. "The Magic cauldron," at http://www.tuxedo.org/~esr/writings/magic-cauldron/, accessed 28 October 2001.

V. Valloppillil, 1998. "Open source software: A (new?) development methodology," also referred as the Halloween document, unpublished working paper, Microsoft Corporation.
