
ARQ (Santiago)

On-line version ISSN 0717-6996

ARQ (Santiago)  no.97 Santiago Dec. 2017 


Securitizing the Demos: Constructing the First U.S. Real Estate Financial Index, 1975-1983

Manuel Shvartzberg Carrió1 

1 PhD candidate in Architecture, Columbia University, USA.


By exploring the making of the first public real estate financial index in the United States - an effort that began in 1975 and was completed in 1983 - this text seeks to understand the relation between finance and housing from the perspective of the sociotechnical tools that made these distinct realms commensurable, calculable and, therefore, governable.

Keywords: housing; property; Lyotard; databases; capital markets

Right in the middle of this history, in 1979, Jean-François Lyotard published his seminal “report on knowledge” - The Postmodern Condition - where he famously declared the death of the “grand metanarratives” in favor of what he termed “paralogy,” or non-zero-sum innovation, ambiguously defining the possibility of a radically contingent, democratic sociality based on creativity (Lyotard, 1984 (1979)). Lyotard’s theorization of the end of great metanarratives was no doubt a reflection upon the “legitimation crisis” (Habermas, 1976)22 wrought by the socio-political and economic changes in Europe and the US in the late 1960s and 1970s, which he interpreted as a function of the politics of knowledge at that particular point in time. Such a politics, he argued, had thus far been posed as a false dichotomy between an impersonal “life of the spirit” incarnated in the systems-theoretical, functionalist methods of capitalist regulation in liberal democracy (a genealogy he traces back to Parsons and which, via Luhmann, influences Habermas’s theorizations of “consensus”), and the projects of ‘negative’ critique of the Frankfurt School that sought to create clearances, however slight, for a potential “emancipation of humanity” at large. Instead, Lyotard identified the “postmodern condition” immanent to post-Fordist capitalism as one in which “the relation to knowledge is articulated (…) in terms of the users of a complex conceptual and material machinery and those who benefit from its performance capabilities” (Lyotard, 1984 (1979):52). While much has been written on Lyotard’s “report” in terms of its acute insights on cultural discourse and postmodernist reflections on science and creativity - often in the guise of a superficial celebration of “surface play” escaping the depth of Lyotard’s thought - what I wish to focus on here is his description of an emergent mode of political organization predicated on the technical accumulation of data. 
At stake in this politics is not just the destruction of master narratives, but the creation of a new idea of ‘justice’ through the socialization of the means of computational production, and of ‘knowledge’ itself (Lyotard, 1984 (1979):13-14).

In his foreword to the book, Fredric Jameson ponders the difficulties of Lyotard’s inquiry by emphasizing the very complexities of calculating value from ostensibly ‘immaterial’ practices of knowledge (Lyotard, 1984 (1979):XV). However, capitalists had long begun the process of technically defining ways of capturing what Jameson called “nonmeasurable commodities” as surplus value, and one of the key sites for this capture would be the housing market. Beyond the simple transformation of housing’s use values to exchange values, this occurred through the general articulation of ‘dwelling’ itself - a thoroughly qualitative and “substantive” category23 - into a formal, calculable function of capital accumulation and speculation. As we will see, despite Jameson’s methodological cautions - and vindicating Lyotard - the real estate industry was by 1979 well on the way to capturing the value from so-called “biopolitical labor”24 (Hardt & Negri, 2009) via the deployment of a sociotechnical mechanism for generating and harvesting information on the connections between the housing occupant, private enterprise, and the state. My general hypothesis is that analyzing this particular techno-political arrangement in some detail might serve to qualify and specify that otherwise unwieldy term, ‘neoliberalism,’ as related to the housing question since the late 1970s. Lyotard’s account of late capitalism as a highly regulated and technically-managed mode of power predicated on data would thus appear to serve as a good starting point (Lyotard, 1984 (1979):13-14).

The emergence of a technical apparatus enabling mass data accumulation around housing is historically twinned with the process of securitization of real estate - both histories come together in the making of a real estate index for the capital markets25 (Figure 1) (Figure 2). This development in the late 1970s and early 1980s radically changed the nature of real estate by making it possible to constitute it as a financial market security - a re-framing of the conventional metrics of real estate, changing its very grounds and scale of operability from a predominantly local or regional asset to a global, ever more fungible one.

Figure 1 (a to d) FTSE EPRA/NAREIT Global Indices (L) & NCREIF/NAREIT Q4 summary 2014. 

Figure 2 Wilshire US REIT Index, Sector Classifications, 2015. 

The indexing of real estate is thus a chapter in the history of markets through the expansion of networked capitalist pricing systems over the past two centuries (Nitzan & Bichler, 2009:152). With the rise of computerization and information technologies in the 1970s, real estate comes to be evaluated as price more quickly and over broader territories than ever before, radically compressing the space-time relation. The system that allows this compressed equalization to take place is the development of geographically comprehensive, and informationally rich, networked real estate indices. Prices can now be compared and contrasted over vast territories, in seconds - a crucial condition of possibility for the subsequent globalization of real estate as a security26.

Real estate indices are used by investors to analyze potential returns, risks, and market conditions as they invest. They provide quasi-universal benchmarks that active investors attempt to ‘beat’ in order to gain higher returns than the market’s average - they are thus absolutely central for structuring competitive capitalist markets. But as we shall see, market indices don’t just obey ideal-external rules of competition; they actively create them in particular ways - thus challenging facile (i.e. unmediated) equivalences between capitalism and democracy.27 The political aspects of this technical development were thus far from smooth - indexing was an ambiguous and conflicted process whereby important shifts of power took place at the level of populations, institutions, and the politics of knowledge.

In what follows I will analyze the making of the first public real estate equity index, the still highly-influential NPI,28 focusing on just two techno-political aspects of this history: first, the role of capitalization - a specific mode of financial calculation to determine value; and second, the role of aggregation - technical and institutional methods for collecting, handling and interpreting real estate at the level of data, which recursively construct a managerial-financial class at the organizational level. Together, these dimensions constitute two crucial, codependent, elementary variables of a purportedly closed, self-referential functional system - the securitized real estate market.


The first conversations toward formulating an industry-wide, public real estate index in the United States began in the mid-1970s, when a number of national real estate organizations came tentatively together to address the possibility. Their first strategic move was to attempt to find a stable calculation model, a shared baseline, for the valuation of properties. In 1975, the American Society of Real Estate Counselors and the American Institute of Real Estate Appraisers commissioned an expert’s report on capitalization theory. The report suggested that real estate capitalization could be divided into ‘internal’ and ‘external’ components - or ‘primary’ and ‘secondary’ levels (Akerson, 1975).29

Real estate investment, in both specific development projects and at the level of the real estate securities market, involves two types of capitalization: capitalization rate (‘cap rate’) and market capitalization (‘market cap’), respectively. In principle, the ‘cap rate’ establishes a logic for the proper valuation of specific real estate securities (claims over properties, sectors, or companies), and is thus described as ‘primary,’ while ‘market capitalization’ involves aggregate figures of these values, and is thus considered ‘secondary.’ Capitalization “represents the present value of a future stream of earnings: it tells us how much a capitalist would be prepared to pay now to receive a flow of money later” - the distribution of which will depend upon an estimate of future income and a particular rate of return, or risk premium (Nitzan and Bichler, 2009:153). In real estate, the formula that determines the present value of a property is the following (1):

V = NOI / R

where V is the present value of the property, NOI its projected net operating income, and R the capitalization rate.

The capitalization rate is thus a percentage that ‘discounts’ the projected income from the property according to the projected risk of the undertaking, or alternatively, according to the internal rate of return deemed appropriate for the investment. A proper assessment and computation of these variables would provide ‘fundamental’ values - prices for current properties that are ‘in sync’ with the current market in terms of supply and demand, expected risks and returns, and competitive operating costs. The cap rate determines the perceived value of the investment as a function of the overall market. Thus, determining the cap rate involves interpreting the aggregated underlying values of the properties to account for more generally qualitative aspects, such as perceived market-sector risk, the perceived managerial capacity of the managing agents, or property-type performance in historical perspective (Block, 1997:166).
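The mechanics of direct capitalization can be sketched in a few lines of Python; the figures and the function name are hypothetical, chosen only to illustrate how the cap rate ‘discounts’ projected income into a present value:

```python
def present_value(net_operating_income: float, cap_rate: float) -> float:
    """Direct capitalization: a property's value as the present value of
    its projected income stream, 'discounted' by the capitalization rate."""
    if cap_rate <= 0:
        raise ValueError("cap rate must be positive")
    return net_operating_income / cap_rate

# A property projected to yield $120,000 a year, valued at an 8% cap rate,
# capitalizes to $1.5 million; pricing in more risk (a 12% cap rate)
# lowers the valuation of the very same income stream.
print(present_value(120_000, 0.08))
print(present_value(120_000, 0.12))
```

Note how the qualitative judgments described above (sector risk, managerial capacity, historical performance) enter the calculation only as adjustments to the single parameter `cap_rate`.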

However, financial economists found a way to quantify these more qualitative aspects by assuming that the price of a security already accounts for them. This idea corresponds to the theorization, in the 1960s, of what came to be known as the Efficient Market Hypothesis (Nitzan and Bichler, 2009:192). In this complementary theory to the Capital Asset Pricing Model, the individual asset and the market as a whole are apprehended as intrinsically co-constituted - the only external variable that may affect the asset’s valuation, and thus the overall performance of the market, is new information pertaining to the asset’s or market’s risk/return ratio as expressed in their prices (Nitzan and Bichler, 2009:145-214). Information-rich property indices, then, are one part of a larger system of information (including the financial media, news outlets, etc.) used to deduce the relation between risk and return of a given investment. But because capitalization rates are themselves the dynamic product of a ratio between the evolving primary and secondary markets (a speculation on the relation between the commodity and the market), and not tied to an external, stable source of value-evaluation, they constantly demand newer and more accurate information, as well as forms of cross-referencing, in order to ‘be assessed’ more accurately. This process of pervasive informational enclosure via the “intellectual technology”30 of capitalization, the definition of a dynamic part-to-whole system, is precisely one of the main functions of financial indices themselves as they seek to produce and make use of further and further sets of data - financial, geographical, sociological, demographic, environmental, etc. - to determine appropriate valuations.


Having established capitalization as the shared logic by which properties could be evaluated, at the end of the 1970s the alliance of national real estate organizations turned to the question of precisely ‘what’ ought to be capitalized and how it might come together as a single index. In 1982 they commissioned Frank Russell Company, a financial services firm, to implement the making of the index, both technically and organizationally. The commissioning party represented a broad cross-section of the US national real estate services industry, from academics to private professionals and over a dozen large financial firms including banks, trusts, and insurance companies.31

This association - a veritable power-diagram of the US real estate industry - was eventually formalized as the National Council of Real Estate Investment Fiduciaries (NCREIF), with the stated purpose “to increase the understanding of real estate as an asset… through the use of advanced analytical techniques” (Eagle, 1983:3). The index would provide evidence of the “historical real estate rates of return that would allow for comparison of property as a distinct asset class with other investment alternatives” (Eagle, 1983:5). The project thus entailed the careful selection, compilation, and processing of NCREIF’s participating members’ ‘private’ databases to make a single index. This effort, therefore, strained industry boundaries by asking competing companies to share their privately harvested and proprietarily developed modes of processing real estate transaction information.32 After much negotiation and institutional diplomacy, the tensions implicit in this uneasy alliance were effectively subsumed in strict confidentiality protocols. But a vast number of technical issues still had to be resolved on how exactly to capture, calculate and articulate a singular rate-of-return real estate index out of a multifarious collection of databases.33 Thus, in 1983 NCREIF commissioned another research report by a real estate appraisal-industry pioneer, James A. Graaskamp, to provide a possible template for resolving the questions on data, calculation, and commensurability opened up by the fabrication of the index.34 Head of one of the country’s most prestigious real estate programs at the University of Wisconsin and with access to a range of datasets from the university, NCREIF’s members, and government departments, Graaskamp sought to achieve a harmonization between the different existing quantitative databases, their calculation modes, and their other potentially useful qualitative information. 
Tellingly, the actual intricacies of the final Frank Russell Company index design remain proprietary and confidential, but we may explore Graaskamp’s disclosed experimental report published in 1983 as an example of how the industry imagined and constructed the potentiality field of housing securitization (Graaskamp et al., 1983).

Graaskamp began by collecting, analyzing, aggregating and attempting to harmonize the data from existing real estate databases of the different NCREIF contributing member companies (Table 1), toward the making of a theoretical real estate index.35 The report’s method was to disaggregate these databases in order to establish the equivalent, “common building blocks” out of which a ‘universal’ index would then be constructed. The disaggregation would develop by determining regional definitions; property classifications; modes of inter-database compatibility; and finally, by finding relevant geographic socioeconomic variables. If constructed appropriately, the report suggested, this georationalization would become the granulated ‘universe’ out of which the correct weighting of part-to-whole relationships could be ascertained (Figure 3) (Figure 4).

Table 1 Graaskamp report, Participating organizations and their databases. 

Figure 3 Graaskamp report, Defining geographic elements as per HUD regions. 

Figure 4 Graaskamp report, Testing sub-county disaggregation units. 

The most ambitious step in the process was to find the relevant geographic socioeconomic variables - available from computer tape sources - that would be, or could be made to be, compatible with the study’s identified “common building blocks.” The motivation for this step was argued both from an experimental research perspective and on commercial grounds: incorporating socioeconomic data would enable managers to have a competitive edge in their evaluations, but would also create a competitive advantage for the index itself as a market securitization device.36 In other words, Graaskamp understood the tremendous potential involved in wielding control over the privatized means of securitization. The index itself could, recursively, become a commodity in the broader, emergent market for financial services.

Four different resources from the city and the university were identified for gathering the relevant computer tape data for the index: the Data Program Library Service, the Center for Demography and Ecology, the Applied Population Lab, and the Wisconsin Department of Administration.37 The decision to use these particular sources stemmed from a mix of pragmatic opportunism and cultural-institutional judgments and preferences.38 Yet, the index’s utility as a ‘secondary market’ device was rationalized exclusively in terms of its use of ‘primary’ level property-specific data: “The listings were reviewed and any tapes having a reasonable possibility of being useful for an investment property database were extracted by subject, year available, disaggregation level, and source” (Graaskamp et al., 1983:34). Each available data category was then evaluated and correlated, as appropriate, to the newly created georationalized grid. The datasets that were deemed “useful” included “(p)opulation items” such as “schooling” or “ethnicity,” and many others:

Housing items include air/condition, value, age, water, sewage and heating, monthly owner costs (…) survey of kitchens, heating units, electrical systems. Costs of mortgage payments, real estate taxes, property insurance, utilities, garbage collection. Family and individual characteristics including income, age, race, household structure, education, family relationships, occupation and employment history (Graaskamp et al., 1983:43).

Although too expensive for the study, the authors noted that, eventually, other computer tape resources could be examined toward finding and aggregating more information of this kind, especially from US government sources such as the Bureau of the Census, Bureau of Economic Analysis, Department of Commerce, Department of Agriculture, Department of Labor, and the Environmental Protection Agency. According to the study, substantial political variables such as the “change in rate of employee compensation for private white collar, blue collar, and service workers,” or environmental factors like “ambient air quality data and trends,” (Graaskamp et al., 1983:50) could all be fed into the index - thus formalizing a whole series of politically-charged epistemologies in terms profitable to the real estate industry.

All this disaggregated socioeconomic data could be correlated with the fabricated geo-rationalized grid, the study concluded, by re-aggregating it in the form of two codes: one according to property location and another according to property type. These codes would then form the basis for the index values that could be monitored and re-calculated throughout regular periods, providing a certain rate of return in each case.
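The two-code re-aggregation described above can be sketched schematically in Python. This is a minimal reconstruction, not the proprietary Frank Russell/NCREIF design: the location and property-type codes, the sample records, and the value-weighted return calculation are all invented for illustration.

```python
from collections import defaultdict

# Hypothetical property records: each carries the two codes described
# above (location and property type) plus start/end appraised values
# and net income for one reporting period.
records = [
    {"location": "R05", "type": "office", "start": 1_000_000, "end": 1_030_000, "income": 20_000},
    {"location": "R05", "type": "office", "start": 500_000, "end": 510_000, "income": 12_000},
    {"location": "R09", "type": "retail", "start": 800_000, "end": 792_000, "income": 18_000},
]

def period_returns(records):
    """Aggregate properties by (location, type) code and compute a
    value-weighted rate of return for each cell of the grid."""
    buckets = defaultdict(lambda: {"start": 0.0, "gain": 0.0})
    for r in records:
        b = buckets[(r["location"], r["type"])]
        b["start"] += r["start"]
        # total gain = appreciation plus income for the period
        b["gain"] += (r["end"] - r["start"]) + r["income"]
    return {code: b["gain"] / b["start"] for code, b in buckets.items()}

print(period_returns(records))
```

Recomputing these bucketed returns at regular intervals yields the per-code index values; weighting each cell by its aggregate capital base is what makes larger properties dominate the index, a design choice with the distributional consequences discussed below.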


The making of the first public US real estate financial index illustrates both the meteoric rise of financial services as an industry and the techniques for the self-replication of the city, and its occupants, in the image of capital. Both were mediated through securitization as a technological and institutional-political process. As others have shown, this shift necessitated various ideological and political maneuvers to take place in the 1960s and 70s - among them, the systematic unthinking of ‘utopian’ possibilities for public housing (Martin, 2010). In more technical terms, we might say this ‘unthinking’ took the form of an apparatus to make the multifarious dimensions of ‘dwelling’ commensurable with new modes of capitalist exploitation.

The 1970s, therefore, mark a decisive historical acceleration in the dynamics of capital accumulation via housing. For at this time it becomes possible to correlate the conventional accumulation of ground rent in purely spatial-geographic terms with the systematic analytical exploitation of data on housing populations. In other words, at this time, real estate evolves beyond a chiefly geographic (extensive) enterprise to become a full-fledged biopolitical (intensive) technical project.39

This ‘intensive’ mode of exploitation does not just mirror forms of life, but actively produces them by creating incentives (for both housing users and managers) to orient important dimensions of their own lifeworlds (i.e. their ethnicity, education, employment or familial structure), around metrics formalized purely in terms of income or risk as deemed profitable by the real estate industry, and more broadly, the capital markets. The index thus turns human dwelling into human capital - substantive aspects of subjectivity itself now clearly become the real price of access to housing. This involves the enclosure of the internal dimensions of the subject, parameters often ascribed from without, by environmental datasets, casting subjectivity as formally empty (or, in fact, qualitatively full but only as “human capital”), rather than in more particular cultural, social or philosophical terms.40 The dwelling is thus denied its political dimensions in favor of an epistemology - that of markets - which cannot supply a sophisticated enough arena for the full complexity of its contentions.

Beyond biopolitical inquiries over subjectivity, the techno-political mechanisms of real estate securitization also allow us to see how particular modes of neoliberal rationalization program and re-produce groups, classes, and thus urban space, according to their own interests, logics, and beliefs. As Lyotard so presciently foresaw, this occurs through the constitution of privately-shared and thus quasi-centralized data systems that aggregate socio-economic functions, thereby constructing a self-replicating city in the techno-ideological image of capital. ‘Capital’ here signifies not only the imperative dynamic of accumulation, but the creation of strategic alliances between corporations, interest groups, lobbies, and their technical representatives (i.e. financial services providers), to constitute heterogeneous groups with outsize influence and power over the market’s very foundational-operational structure, thus attenuating competition around minimal baselines of consensus and harnessing the overall maximization of their profits, while strategically guarding a privileged position.41

In this sense, the index was a complex technical and institutional exercise in balancing accuracy with opacity. ‘Accuracy’ in terms of the voracious aggregation of datasets, including the intellectual technologies required to make them calculable and commensurable; ‘opacity’ in terms of the contradictory alliance between competing institutional actors required for aggregation. As Leyshon and Thrift observe, it is “the ‘system’ for aggregating ground rents into a mass” and turning them into different forms of assets for the capital markets that gives these arrangements their profit-making capability (Leyshon & Thrift, 2007:104). This ‘system,’ as demonstrated by the index’s techno-intellectual history, comprises both open sources of data and proprietary calculation techniques; government agencies and industry alliances - ‘public media’ and ‘private appropriation’ - thus entangling dynamics of informational enclosure and disclosure in highly torqued configurations which recursively affect the system as a whole. The index ‘creates’ as much as it ‘reflects’ the reality it seeks to report on.

Despite these apparent indeterminacies, there is a real material consequence to this system in that “the benefits (…) are mainly reaped by financial intermediaries, especially those with access to computing power and software which can remake assets so that they are tradeable” (Leyshon & Thrift, 2007:109). The real risk of this lopsided techno-institutional arrangement, to reprise Lyotard’s warning, is the massive separation between “the users of a complex conceptual and material machinery and those who benefit from its performance capabilities” (Lyotard, 1984 (1979):52); between those who use the index and those who are used - or foreclosed - by it. In the case of securitized housing, this machinery results in the literal securitization of the demos: based on the systematic extraction of information from dwelling practices and conditions, a second-order mode of governance emerges whereby - as seen at the level of the city’s development - sovereignty is appropriated from democratic politics and handed to technical systems guarded by financial ‘experts’ whose chief constituency is not a population to be housed, but a population to be capitalized.


AKERSON, Charles B. «The Internal Rate of Return in Real Estate Investments: A Research Monograph». ASREC/AIREA: Chicago, IL, 1975. [ Links ]

BELL, Daniel. The Coming of Post-Industrial Society: A Venture in Social Forecasting. New York, Basic Books, 1973. [ Links ]

BLOCK, Ralph, The Essential REIT: A Guide to Profitable Investing in Real Estate Investment Trusts. Brunston Press, 1997. [ Links ]

BROWN, Wendy. Undoing the Demos: Neoliberalism’s Stealth Revolution. Brooklyn, New York: Zone Books, 2015. [ Links ]

BURCHELL, Graham; Gordon, Colin; Miller, Peter (Eds.) The Foucault Effect: Studies in Governmentality. London: Harvester Wheatsheaf, 1991. [ Links ]

EAGLE, Blake. «Developing a real estate performance index», Pension World, June 1983. [ Links ]

ELSON, Diane. «Market Socialism or Socialization of the Market?», New Left Review, I/172 (Nov-Dec 1988). [ Links ]

FOUCAULT, Michel. The Birth of Biopolitics: Lectures at the Collège De France, 1978-79. Basingstoke: Palgrave Macmillan, 2008. [ Links ]

GRAASKAMP, James A.; Tossey, Thomas P.; Hungerford, Craig. Preliminary Report for National Council of Real Estate Fiduciaries: Coding Factors and Data Base Formats for Real Estate Investment Properties Performance, Sept. 1983. [ Links ]

HABERMAS, Juergen. Legitimation Crisis. London: Heinemann, 1976. [ Links ]

HARDT, Michael; Negri, Antonio. Commonwealth. Cambridge, Mass.: Belknap Press of Harvard University Press, 2009. [ Links ]

JACKSON, Kenneth T. «Race, Ethnicity, and Real Estate Appraisal: The Home Owners Loan Corporation and the Federal Housing Administration», Journal of Urban History, 6 (August 1980): 419-452. [ Links ]

KRIPPNER, Greta R. Capitalizing On Crisis: The Political Origins of the Rise of Finance. Cambridge, Mass.: Harvard University Press, 2010. [ Links ]

LAZZARATO, Maurizio. Signs and Machines: Capitalism and the Production of Subjectivity. Los Angeles, CA: Semiotext(e), 2014. [ Links ]

LEYSHON, Andrew; Thrift, Nigel. «The Capitalization of Almost Everything: The Future of Finance and Capitalism», Theory, Culture & Society, 24, 7-8 (Dec. 2007). [ Links ]

LYOTARD, Jean-François. The Postmodern Condition: A Report on Knowledge. Manchester: Manchester University Press, 1984 (First French edition, 1979). [ Links ]

LUCAS, Chris. Interview. [ Links ]

MARTIN, Reinhold. Utopia’s Ghost: Architecture and Postmodernism, Again. Minneapolis: University of Minnesota Press, 2010. [ Links ]

MILLER, Norman C. Jr.; Markosyan, Sergey. «The Academic Roots and Evolution of Real Estate Appraisal», The Appraisal Journal (April 2003): 172-184. [ Links ]

MILLER, Peter; Rose, Nikolas. «Governing economic life», Economy and Society 19 (1990). [ Links ]

NITZAN, Jonathan; BICHLER, Shimshon. Capital as Power. A Study of Order and Creorder. Routledge, 2009. [ Links ]

SCHILL, Michael H. «The Impact of the Capital Markets on Real Estate Law and Practice». John Marshall Law Review. 269 (1998-1999). [ Links ]

SHVARTZBERG, Manuel. «Play Turtle, Do It Yourself: Flocks, Swarms, Schools, and the Political-Architectural Imaginary». In: The Politics of Parametricism: Digital Technologies in Architecture, Matthew Poole and Manuel Shvartzberg, Eds. London; New York: Bloomsbury Academic, 2015. [ Links ]

SHVARTZBERG, Manuel. «Foucault’s ‘Environmental’ Power: Architecture and Neoliberal Subjectivization». In: Peggy Deamer, Ed. The Architect as Worker: Immaterial Labor, The Creative Class, and the Politics of Design. London; New York: Bloomsbury Academic, 2015. [ Links ]


Manuel Shvartzberg Carrió Architect, Bartlett School of Architecture, London. MA in Aesthetics and Politics, CalArts. PhD candidate in Architecture, GSAPP, Columbia University. Currently a researcher at The Temple Hoyne Buell Center for the Study of American Architecture and a Graduate Fellow of the Institute for Comparative Literature and Society, both at Columbia University. His dissertation work, “Designing ‘Post-Industrial Society’: Settler Colonialism and Modern Architecture in Palm Springs, California, 1876-1973”, examines the intersection between architecture, technology, geopolitics and economic discourses on automation and the environment under US hegemony.

Creative Commons License This is an open-access article published under a Creative Commons license.