12 May 2026

Roy, A. (2005) ‘Urban informality: Toward an epistemology of planning’, Journal of the American Planning Association, 71(2), pp. 147–158.

Ananya Roy defines urban informality as a mode of metropolitan urbanisation rather than a marginal sector separate from the formal city. Informality organises housing, land, labour, infrastructure, and property through flexible relations of legality and illegality, making it central to contemporary urban growth. Roy’s argument redirects planning theory towards cities of the Global South, where informal settlements, elite subdivisions, peri-urban expansion, and irregular land markets reveal how the state actively produces the state of exception. Planning power determines which illegalities are tolerated, upgraded, regularised, demolished, or protected, thereby transforming informality into an instrument of urban governance. A specific case emerges in land titling and slum upgrading policies: formalisation promises market access and security, yet it can intensify displacement, debt, gendered hierarchy, and unequal property ownership. Roy’s discussion of the “politics of shit” offers a sharper alternative, since infrastructure becomes a political process shaped by residents’ knowledge rather than a merely technical improvement imposed from above. Her comparison of Third World informality policy with American poverty policy also shows that planning repeatedly treats poverty as a spatial disorder to be corrected, while deeper questions of wealth distribution remain unresolved. The conclusion is decisive: urban informality teaches planners to move from land-use order towards distributive justice, from best-practice models towards critique, and from property rights towards the right to the city.

Lewis, D.W. (2020) A Bibliographic Scan of Digital Scholarly Communication Infrastructure. Atlanta, GA: Educopia Institute.

Lewis’s A Bibliographic Scan of Digital Scholarly Communication Infrastructure maps the fragmented ecosystem of tools, services, projects and organisations that sustain digital scholarship, presenting scholarly communication not as a single publishing pipeline but as a distributed infrastructure spanning research workflows, repositories, data, publishing, discovery, assessment and preservation. The report identifies 206 projects, distinguishing nonprofit and commercial actors, and situates them within a broader concern: whether the future of scholarly communication will be governed by market consolidation or by community-controlled open systems. Its central argument is that digital scholarship depends upon infrastructure whose sustainability, ownership and interoperability are as consequential as the publications and datasets it circulates. Lewis shows that commercial providers increasingly seek end-to-end integration across the research workflow, raising the risk that universities may lose control over core academic functions, while open and nonprofit initiatives face collective-action problems, unstable funding and the difficulty of long-term governance. The workflow map near the beginning of the report visually synthesises this ecology by placing researcher tools, repositories, publishing systems, discovery services, assessment mechanisms, preservation projects and general services within the scholarly process. As a case-study synthesis, the report’s treatment of repositories, open access publishing, research data management and preservation reveals that openness alone is insufficient unless supported by durable business models, shared standards, community investment and institutional commitment. 
Ultimately, Lewis concludes that scholarly communication must be understood as a collective infrastructural problem: the academy’s ability to preserve autonomy, equity and openness depends on coordinated funding, bibliodiversity and governance structures that protect knowledge production from commercial capture while sustaining the tools on which research now relies.


Starosielski, N. (2015) ‘Against Flow’, in The Undersea Network. Durham, NC: Duke University Press, pp. 1–25.

Starosielski’s “Against Flow” argues that the apparently immaterial, wireless and deterritorialised Internet is in fact sustained by undersea fibre-optic cables whose routes are material, political, ecological and historically sedimented. Rather than treating global communication as frictionless circulation, the chapter insists that signals move through grounded infrastructures shaped by coastal politics, cable stations, fishing practices, military interests, colonial histories, environmental risk and corporate secrecy. Its central intervention is to replace the fantasy of digital flow with an account of networked transmission as precarious, territorial and dependent on extensive labour. The maps of transpacific cable routes from 1922, 1982 and 2012 visually reinforce this argument, showing that contemporary systems do not simply dissolve geography, but often repeat older telegraph, telephone, trade and military pathways. Starosielski’s case of Arctic Fibre synthesises the chapter’s method: the proposed Arctic route appears innovative, yet its feasibility depends on climate change, indigenous and governmental relations, oil interests, ice movements, fishing activity, cable protection, interconnection points and maintenance logistics. Through this example she develops key concepts such as turbulent ecologies, pressure points, strategies of insulation, strategies of interconnection and traction, all of which show how networks must both separate themselves from and attach themselves to their environments. Ultimately, the chapter concludes that digital media systems are not abstract diagrams of nodes and vectors, but ecological infrastructures embedded in rural, aquatic and coastal worlds. Its decisive claim is that the experience of global fluidity depends on fixed, vulnerable and unevenly distributed routes whose histories and material conditions must be made visible.


Bryson, J.J. (2016) ‘Patiency Is Not a Virtue: AI and the Design of Ethical Systems’, Proceedings of the AAAI Spring Symposium Series, pp. 202–207.

Bryson’s “Patiency Is Not a Virtue” argues that the ethical status of artificial intelligence is not a discoverable fact about machines, but a normative design decision for which humans remain responsible. The paper rejects the assumption that advanced intelligence, linguistic capacity or social responsiveness automatically entitles AI to moral agency or patiency, insisting instead that ethical systems and artefacts are co-constructed by societies. Bryson’s central claim is that because machines are designed objects, the question is not merely what moral status they deserve, but what kinds of entities we ought to build in the first place. Her account of morality centres on socially recognised choice, sanctioned action and responsibility, yet she argues that attributing such responsibility to AI would often function as an evasion of human accountability. The case of robotics is therefore not analogous to natural moral subjects: unlike children, animals or other humans, artificial agents can be specified so that they do not suffer, compete for status, fear death or require moral protection. The discussion of the EPSRC Principles of Robotics synthesises this position, especially the claims that robots are tools, humans are responsible agents, robots are products, machine nature should remain transparent, and legal responsibility should remain attributable. Ultimately, Bryson concludes that AI should be treated as a manufactured extension of human agency rather than as an autonomous moral subject. Her decisive ethical proposition is that creating machines to which we owe patiency would be avoidable, disruptive and morally incoherent, because it would shift responsibility away from the humans and institutions that design, deploy and profit from them.


Rheinberger, H.-J. (2018) ‘On Science and Philosophy’, Crisis & Critique, 5(1), pp. 341–347.

Rheinberger’s “On Science and Philosophy” advances a historical epistemology in which philosophy of science can no longer claim to organise knowledge from above, but must instead follow the concrete, changing and experimental practices through which scientific objects come into being. Beginning with Cassirer and Bachelard, the essay argues that the age of grand philosophical systems has passed, since the sciences themselves have diversified so radically that only historically situated reflection can grasp their development. Cassirer’s importance lies in replacing metaphysical system-building with an account of knowledge as a problem-oriented process, where objects are not simply given but mediated through specific instruments, practices and conceptual forms. Bachelard radicalises this position by insisting that every hypothesis, problem, experiment and equation demands its own philosophy, because scientific reason is not fixed in advance but transformed by the very activity of research. Rheinberger’s own contribution emerges from this lineage: modern science must be understood through experimentation, not as a subordinate test of theory, but as the generative space where epistemic things take shape within experimental systems. These systems stabilise objects enough to make them researchable while preserving the ambiguity that drives inquiry beyond its existing limits. The specific case of experimental molecular biology underpins this view, demonstrating that scientific knowledge advances through apparatuses, procedures, materials and uncertainties rather than through speculation alone. Ultimately, Rheinberger concludes that philosophy remains vital only when it accepts its entanglement with scientific practice and becomes a historical reflection on how research risks, reorganises and renews reason itself.


Mattern, S. (2015) ‘Deep Time of Media Infrastructure’, in Parks, L. and Starosielski, N. (eds.) Signal Traffic: Critical Studies of Media Infrastructures. Urbana: University of Illinois Press, pp. 94–112.

Mattern’s “Deep Time of Media Infrastructure” argues that media infrastructure did not begin with telecommunications, electronic networks or contemporary smart cities, but with the earliest urban forms that organised communication, ceremony, inscription and public address. Her central intervention is to stretch media history backwards into archaeology, urban history and architectural history, showing that cities have always been communicative environments whose streets, walls, plazas, facades, voids and acoustic volumes function as media systems. Rather than treating infrastructure as a modern technical layer, Mattern presents it as deep time: a long historical accumulation in which oral, written, architectural, graphic, sonic and digital forms coexist, overlap and reshape one another. Her examples range from the agora and Roman Forum as acoustic infrastructures for governance, to public inscriptions on ancient buildings, Fatimid Cairo’s exterior texts, Chinese stone writings, Yemeni spiral urban forms and New York’s Union Square as a democratic space of assembly. The illustrated plans and urban images in the chapter reinforce this argument visually, showing how public space itself becomes a communicative apparatus rather than a neutral container. As a case-study synthesis, the smart city becomes the negative example: when contemporary developments privilege seamless digital systems while suppressing informal, residual and embodied communication, they risk becoming over-rationalised machines rather than living cities. Ultimately, Mattern concludes that media infrastructures must be studied as layered techno-socio-spatio-material entanglements, shaped by path dependency, informal practices, human labour, scale and historical residue. Her decisive claim is that to understand media cities adequately, one must excavate not only cables and screens, but also voices, walls, streets, inscriptions and the longue durée of urban mediation.


Brenner, N. and Schmid, C. (2017) ‘Elements for a New Epistemology of the Urban’, in Hall, S. and Burdett, R. (eds.) The SAGE Handbook of the 21st Century City. London: SAGE, pp. 47–66.

Neil Brenner and Christian Schmid’s ‘Elements for a New Epistemology of the Urban’ argues that contemporary urbanisation can no longer be understood through the inherited image of the city as a bounded settlement opposed to countryside, wilderness or hinterland. Their central claim is that the urban is not an empirical object but a theoretical category, meaning that it must be constructed through concepts rather than simply observed as a visible form. Against city-centred approaches, they propose planetary urbanisation as a framework for understanding how capitalism reorganises territories far beyond dense metropolitan cores. Mines, logistics corridors, data centres, agro-industrial zones, energy grids, waste sites, oceans and former wilderness areas are all incorporated into urban processes because they sustain the metabolism of distant agglomerations. The chapter distinguishes concentrated urbanisation, where people, infrastructure and capital cluster in cities; extended urbanisation, where remote landscapes are operationalised to support urban life; and differential urbanisation, where inherited spatial arrangements are repeatedly destroyed and remade. This triad is crucial because it shows that the urban is a process, not a fixed form. The authors also insist that urbanisation is multidimensional, involving spatial practices, territorial regulation and everyday life. This means that the urban is produced not only through buildings and infrastructures, but also through governance, labour, displacement, routine and struggle. Ultimately, Brenner and Schmid make an epistemological and political intervention: if urbanisation has become planetary, then urban theory must abandon the rural/urban binary and analyse the uneven, contested networks through which contemporary life is organised. The urban is therefore not merely where people live; it is the planetary fabric through which capitalism extracts, connects, regulates and transforms the world.


11 May 2026

Rossi, A. (1982) The Architecture of the City. Cambridge, MA and London: MIT Press.



Aldo Rossi frames the city as a collective artefact whose architecture accumulates historical duration, civic meaning and psychological resonance. In the uploaded excerpt, Peter Eisenman’s introduction clarifies that Rossi’s project treats the city as an autonomous object of knowledge, composed of urban artefacts that persist through time while absorbing changing uses, memories and symbolic values. The visual material on page 2, juxtaposing the amphitheatre at Nîmes with Daedalus’ labyrinth, condenses Rossi’s analogical method: architecture becomes intelligible through correspondences between form, myth, permanence and urban destiny. Rossi’s central proposition turns on permanence, understood as the capacity of certain monuments, plans and urban fragments to survive functional change and become repositories of collective consciousness. The case study of the locus is especially decisive: Eisenman presents it as a component of the individual artefact, determined by space, time, topography, form and memory, through which the city transforms from physical settlement into a legible structure of human experience. In this sense, the city appears as a theatre of accumulated lives, where monuments, streets and districts operate as mnemonic instruments, preserving traces of civic identity while enabling future transformations. Rossi’s urban theory therefore gives architectural form a disciplinary dignity beyond immediate utility: type, monument and analogy become instruments for reading the city as both material fabric and historical mind. Ultimately, The Architecture of the City establishes urban architecture as a house of memory, where the endurance of form sustains the collective will of history.

Mumford, L. (1961) The City in History: Its Origins, Its Transformations, and Its Prospects. New York: Harcourt, Brace & World (Spanish translation consulted).

Lewis Mumford defines the city as a historical organism of human association, shaped by symbolic, domestic, economic and political energies. From the preface onwards, his inquiry moves from the archaic city conceived as “symbolically, a world” towards the modern world converted, in practical terms, into a city, thereby establishing a comprehensive genealogy of urban forms and their civilisational possibilities. His central proposition locates the origin of urbanity in the sanctuary, the cemetery, the cave, the village and the ritual place of assembly, where humanity first rehearsed civic institutions through memory, pilgrimage, art and reverence for the dead. The Neolithic village becomes the decisive case study: Mumford presents it as the matrix of permanence, nurture, storage, neighbourliness and cooperation, from which the house, granary, cistern, altar and public way gradually emerged. The city thus appears as a container of containers: a condensation of material techniques, ecological adaptations and moral bonds. Its historical development culminates in the tension between Megalopolis, with its vast administrative and technological apparatus, and the humanly scaled city capable of sustaining organic community. In conclusion, Mumford calls for an urban image that integrates technology, memory and personhood, so that the city may serve as an instrument of human fulfilment, civic continuity and cultural renewal.

Hayles, N.K. (2017) Unthought: The Power of the Cognitive Nonconscious. Chicago: University of Chicago Press.


N. Katherine Hayles’s Unthought: The Power of the Cognitive Nonconscious radically revises inherited assumptions about cognition by arguing that thought is not confined to conscious human reflection but distributed across nonconscious, biological, and technical processes. In the opening chapter, Hayles distinguishes thinking from cognition: thinking refers to higher mental operations such as reasoning, abstraction, and verbal formulation, whereas cognition is a broader faculty of interpreting information in ways that enable adaptive action. This distinction permits her to include humans, animals, plants, and technical systems within a wider ecology of cognitive activity, without reducing all cognition to human self-awareness. Her concept of the cognitive nonconscious is especially significant because it names the rapid, embodied, interpretive processing that precedes consciousness and makes conscious thought possible. For instance, she notes that nonconscious cognition processes sensory and environmental information faster than consciousness, enabling action before reflective awareness intervenes. A specific case study emerges in her discussion of technical cognition, where automated systems parse information, make selections, and generate outcomes through programmed yet adaptive processes. Such systems do not “think” like humans, but they participate in cognitive assemblages that bind people, machines, environments, and infrastructures into dynamic interpretive networks. Hayles therefore compels the humanities to abandon anthropocentric models of mind and to recognise cognition as relational, material, and distributed. Her conclusion is not that consciousness is obsolete, but that it is only one layer within a larger architecture of meaning-making.


 


10 May 2026

Caldeira, T.P.R. (1996) ‘Fortified enclaves: The new urban segregation’, Public Culture, 8(2), pp. 303–328.

Teresa P. R. Caldeira defines fortified enclaves as privatised, enclosed, and monitored spaces for residence, consumption, leisure, and work, produced through walls, surveillance, guards, controlled access, and the rhetoric of security. These spaces transform urban segregation by replacing older centre–periphery divisions with a fragmented landscape in which rich and poor may live physically near one another while remaining socially separated by visible barriers. In São Paulo, Caldeira shows how economic crisis, democratic transition, urban restructuring, fear of crime, and rising police violence generated a city of walls where the affluent retreat into protected condominiums, malls, and office complexes. The case of closed residential condominiums is especially revealing: real-estate advertisements sell isolation, homogeneity, services, leisure, nature, and “total security” as markers of prestige, turning separation itself into a status symbol. These enclaves do more than protect; they reorganise public life by withdrawing elite sociability from streets and squares, leaving public space to those excluded from private worlds. Caldeira’s comparison with Los Angeles demonstrates that this is a global urban form, though São Paulo makes its inequalities unusually explicit through armed guards, fences, and stark proximity between luxury and poverty. The political consequence is profound: public space, once associated with openness, circulation, encounter, and democratic citizenship, becomes fractured by suspicion and exclusion. Caldeira’s contribution lies in showing that fortified urbanism corrodes citizenship by teaching social groups to inhabit separate worlds rather than recognise one another as co-citizens. 

Wilson, M.O. (n.d.) ‘Future Memory: Space, Monumentality and Images of Blackness’ [draft].

Mabel O. Wilson defines future memory as a political relation between black historical consciousness, spatial exclusion, and forms of monumentality that anticipate action still to come. African American memory, shaped by slavery and Jim Crow segregation, emerges through oratory, music, photography, temporary exhibition halls, and counter-public spaces as much as through permanent stone or bronze. Wilson’s argument develops through the figure of Frederick Douglass, whose monument in Rochester and repeated appearance in Negro Buildings and international exhibitions transformed black achievement into public evidence against racist narratives of incapacity. Douglass becomes a case of black monumentality: his image, speeches, and photographic circulation produce memory as a demand for emancipation, equal treatment, and future justice. Wilson then turns to Carrie Mae Weems’s Roaming series, where the black female figure stands before Roman monuments, pyramids, gates, fascist architecture, and imperial urban space. In these images, photography itself becomes a monument: it marks time, witnesses power, and inserts black subjectivity into histories from which it has been excluded. The monument therefore operates as a time machine, synchronising past and present while asking who is authorised to inhabit history. Wilson’s contribution lies in showing that black monumentality is not merely commemorative; it is a critical practice that contests historical exclusion and imagines political futures. 

The Quiet Performance of Structure

A field does not become real when it is declared. It becomes real when its structures begin to act with enough consistency that they no longer appear incidental. This is the point at which support stops looking secondary and starts behaving like form. The quiet performance of structure names that threshold. It describes a condition in which a project does not only produce texts, concepts, images, or archives, but also constructs the channels through which those elements persist, recur, and acquire public consequence. In such a condition, infrastructure is not merely what holds the work after the fact. It becomes part of the work’s operative surface. A scattered set of posts becomes a series. A series becomes a corpus. A corpus becomes a navigable field. A field becomes a semantic object that can be found, traversed, cited, and returned to. At each stage, the project does not simply expand; it is reconstituted through support. Structure begins to perform.

Porter, T.M. (1995) Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton, NJ: Princeton University Press.

Theodore M. Porter defines quantification as a technology of trust that allows science, administration, and public life to operate across distance. Numbers, graphs, formulas, standards, and explicit rules create an impersonal language through which decisions appear fair, transportable, and independent of personal judgement. Porter’s central concept, mechanical objectivity, describes this reliance on procedures that restrain discretion and reduce suspicion: the authority of numbers grows precisely where trust in individuals, experts, or institutions becomes fragile. His argument develops through the relation between science, bureaucracy, and public legitimacy. In contexts of conflict or institutional vulnerability, numerical methods make decisions defensible because they seem to follow rules rather than interests. A clear case appears in cost-benefit analysis, accounting, actuarial calculation, and public administration, where officials use quantitative criteria to justify choices before distant publics and critical outsiders. The table of contents reinforces this architecture: Porter moves from “Power in Numbers” to “Technologies of Trust” and finally to “Political and Scientific Communities”, showing that objectivity is social, moral, and institutional as well as epistemic. Quantification therefore functions as a disciplined form of communication: it standardises judgement, travels beyond local knowledge, and makes authority appear neutral. Porter’s contribution lies in showing that modern societies trust numbers because numbers promise impartiality where personal authority has become contestable. 

Bush, V. (1945) ‘As We May Think’, The Atlantic Monthly, July, pp. 101–108.

Vannevar Bush’s seminal essay As We May Think constitutes one of the foundational conceptual documents of contemporary information culture, anticipating hypertext, networked databases, and digital knowledge systems decades before the emergence of personal computing or the Internet. Written in 1945 at the conclusion of the Second World War, the essay begins from a profound anxiety regarding the accelerating expansion of scientific knowledge and humanity’s growing incapacity to navigate its own informational abundance. Bush argues that the central crisis of modern science is no longer the production of data but the inability to organise, retrieve, and meaningfully connect the proliferating “mountain of research” generated by specialised disciplines. Traditional indexing systems, grounded in alphabetical or hierarchical ordering, appear increasingly inadequate because they fail to mirror the actual operations of human cognition. Against this rigidity, Bush proposes the revolutionary principle of associative indexing, insisting that the human mind operates through dynamic relational pathways rather than linear taxonomies. His speculative solution, the celebrated memex, is envisioned as a mechanised archival desk capable of storing vast microfilmed repositories while enabling users to create personalised associative “trails” linking disparate texts, images, annotations, and records. These trails effectively prefigure hyperlinks, digital annotation systems, and nonlinear navigation structures characteristic of contemporary cyberspace. Particularly remarkable is Bush’s insistence that knowledge should become collaborative, cumulative, and infrastructural, allowing scholars to inherit not merely isolated texts but entire architectures of thought constructed by previous researchers. 
The essay simultaneously anticipates speech recognition, wearable cameras, automated information retrieval, and machine-assisted cognition, all conceived as extensions of human memory rather than replacements for intellectual creativity. Yet Bush’s ultimate concern remains deeply humanistic: technological systems should liberate thought from repetitive informational labour so that creative and analytical capacities may flourish. Consequently, As We May Think stands not merely as a technological prophecy but as an epistemological manifesto arguing that civilisation’s survival depends upon constructing infrastructures capable of transforming information overload into meaningful associative knowledge.


Latour, B. (1990) ‘Drawing Things Together’, in Lynch, M. and Woolgar, S. (eds.) Representation in Scientific Practice. Cambridge, MA: MIT Press, pp. 19–68.

Bruno Latour’s Drawing Things Together radically reconceptualises scientific knowledge by arguing that the authority of science does not arise primarily from abstract rationality or superior cognition, but from the capacity to produce, stabilise, transport, and accumulate inscriptions. Rejecting both cognitive idealism and naïve realism, Latour proposes that scientific practice depends upon material operations through which complex phenomena are transformed into combinable visual traces—maps, graphs, diagrams, charts, tables, specimens, and texts—that may circulate across vast distances without losing coherence. Central to his argument is the concept of the immutable mobile: an inscription that remains sufficiently stable to preserve information while remaining sufficiently mobile to travel through institutional networks. Through examples ranging from Renaissance perspective drawing and La Pérouse’s navigational charts to geological maps and laboratory instrumentation, Latour demonstrates how power accrues to those capable of gathering dispersed phenomena onto flat surfaces where they can be compared, superimposed, recombined, and manipulated. Particularly influential is his discussion of optical consistency, whereby perspective techniques, printing technologies, cartographic systems, and scientific diagrams create standardised visual spaces enabling objects from radically different contexts to become commensurable. The essay repeatedly insists that scientific “objectivity” emerges not from disembodied observation but from infrastructures of visualisation that permit phenomena to be rendered simultaneously visible and calculable. On pages 36–37, Latour’s discussion of pathology and clinical medicine demonstrates how diseases become intelligible only after inscriptions enable physicians to aggregate and compare cases within centralised institutions. 
Equally significant is his claim that laboratories function as centres of accumulation where inscriptions from diverse locations are assembled into strategic archives capable of mobilising political, economic, and epistemic authority. Scientific facts therefore succeed not because they transcend mediation, but because they are embedded within durable representational systems that amplify credibility and facilitate circulation. Ultimately, Latour transforms representation itself into a material and logistical achievement, revealing that the history of science is inseparable from the evolution of visual technologies, paperwork, and inscriptional infrastructures through which the world is progressively rendered legible, transportable, and governable.




