Of standards… and technological sovereignty in the city
How this seemingly dry concept can become a tool for tech sovereignty
Technological changes are shaping politics, from the local to the global level, as recent commentary about the geopolitics of AI or semiconductors shows. At the global level, the debate is often dominated by a narrative that pivots around the competition between the US and China, each representing a different paradigm: the US corporate-led model dominated by a few players, and the Chinese model led and tightly controlled by the Government. This is, of course, a simplistic rendering of a more nuanced picture, in which the US Government plays a substantial role, both in financing and developing the technology and in setting up surveillance structures. The Chinese side of the story, although more opaque, is far from that simple either. If and when the EU is mentioned in these accounts of global technology politics, it tends to be characterized as running behind in an already-lost race.
An emerging concept in some of these debates is the idea of technological sovereignty or, when the discussion turns to data specifically, data sovereignty. As with similarly broad and charged concepts, sovereignty is often used ambiguously and to mean different things. Some focus on the sovereignty of a State or a political entity (the EU, a city or an indigenous community), while others apply it to the individual sovereignty of citizens.
In the geopolitics debate, the flag of sovereignty is often hoisted to defend two types of actions. The first one is the need for Governments to control the technological infrastructure (such as 5G or cloud) through ownership, close oversight or entry barriers to foreign actors. The second one relates to the need to promote companies that can compete with foreign technological corporations. Whether these actions will ensure the technological sovereignty of their sponsors is questionable.
Most importantly, if the ultimate goal of protecting technological sovereignty is to ensure that technological transformations incorporate the values of that political community (for example the EU), those measures may not be the most effective ones. Public ownership of the infrastructure or having domestic tech giants will not guarantee that technological transformations do not undermine democratic accountability, individual freedoms or social justice. Rules, processes, institutions and tools that are democratic, transparent, and that specifically protect those values and rights may be more fit for purpose.
An often-neglected tool that may play a fundamental role in how those values and rights are integrated into technologies is data standards.
Data standards are the guidelines by which data are described and recorded. There is a far-reaching literature on quantification exploring how decisions about what data to collect, and how to collect them, shape reality and reflect structures of power. Scott (1998) may be a good place to start for those interested in how efforts to make reality “legible” to political authorities have ended in disaster. What is important, however, is to be aware that data standards can determine how reality is structured and who is left in or left out, and that the definition of data standards is therefore a highly political process.
In his fascinating Master’s Thesis, titled Code Shift: Data, Governance, and Equity in Los Angeles’s Shared Mobility Pilots, Emmett Z. McKinney studies how data standards such as the Mobility Data Specification launched in Los Angeles can impact mobility equity:
Like human languages, digital languages simplify our knowledge of the world into a format that is standardized, legible and scalable. Similarly, digital languages are not infallible measures of reality; rather they are the products of interpretation. Programmers must decide how best to represent the complexity of the world in a simplified format, which means that some information is left out, some related concepts are grouped together, and some are emphasized over others. (Page 22)
McKinney uses the example of the street to beautifully illustrate what this means for data standards in urban settings:
A standard set of information to include in that dataset may be physical attributes of a street: its width, the presence of a bike lane, how many lanes it contains, and in which directions. This construction of a street depicts it primarily as a link in a network; holding little value apart from conveying travelers and goods. In reality, a street serves many functions – a social space in and of itself. Embedded in each street are different sensory experiences, as well as different histories. Individuals encounter this street in different ways. For example, the width of a sidewalk or height of a curb take on special importance for a person with disabilities. Considering a more complete set of attributes to be part of the same entity within a database, could inform the design of transportation networks that meet the specialized needs of groups that have been underserved in the past. (Page 24)
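To make this more concrete, here is a minimal sketch in Python of how two different schemas could describe the same street. The class and field names are purely illustrative assumptions (they are not drawn from the Mobility Data Specification or any other real standard); the point is simply that whatever a schema omits effectively ceases to exist for the tools built on top of it.

```python
from dataclasses import dataclass
from typing import List, Optional


# A minimal, network-centric record: the kind of schema a standard might
# prescribe when it treats a street purely as a link that conveys
# travelers and goods.
@dataclass
class StreetAsLink:
    street_id: str
    width_m: float            # carriageway width
    lane_count: int
    directions: List[str]     # e.g. ["eastbound", "westbound"]
    has_bike_lane: bool


# An expanded record for the same entity, adding attributes that matter to
# people who experience the street in different ways. All field names here
# are hypothetical, chosen only to illustrate the idea.
@dataclass
class StreetAsPlace(StreetAsLink):
    sidewalk_width_m: Optional[float] = None    # relevant for wheelchair users
    curb_height_cm: Optional[float] = None      # relevant for accessibility
    shade_coverage_pct: Optional[float] = None  # part of the sensory experience
    seating_present: Optional[bool] = None      # the street as a social space


if __name__ == "__main__":
    # The same street, described under each schema.
    link_view = StreetAsLink(
        "main-st-001", 12.0, 2, ["eastbound", "westbound"], True
    )
    place_view = StreetAsPlace(
        "main-st-001", 12.0, 2, ["eastbound", "westbound"], True,
        sidewalk_width_m=1.2, curb_height_cm=15.0,
        shade_coverage_pct=30.0, seating_present=False,
    )

    # A planning tool fed only the link view cannot even ask whether the
    # sidewalk is wide enough for a wheelchair: the question is unaskable
    # because the standard never defined the field.
    print(link_view)
    print(place_view)
```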
As this example shows, the process of defining standards needs to be participatory and inclusive if the technology is to reflect and incorporate the values of the community it will impact. In other words, data standards can be a key tool to protect or undermine the democratic sovereignty of a political community, and the process by which those standards are defined is therefore highly political.