What is a burning question? One that needs to be answered! Taking a prompt from Rebecca Solnit's "To Break the Story, You Must Break the Status Quo", we set out to hold a virtual panel discussion that thinks about and beyond the data status quo. What are the dominant narratives? Where are the gaps in existing data practice? What's worrying and what's hopeful?
So far we've discussed two burning questions:
Question 1: What does a bright / dark data future look like?
[Alex] Looking back to how we started this with the SWCTN fellowship call, we’re trying to think critically about the role of data in our lives, to help explore present and future challenges and open a discussion on ways forward. Because data is so pervasive it can be really tough to know where to start - we could be looking at data as it relates to many different industries, about the nature of big data, about bias, privacy, misinformation, sustainability… But it’s hard to step back from those specifics and ask ourselves, well, what’s the actual point in talking about this? What’s motivating us to be involved in these discussions?
So I like this question as a place to start, to begin by thinking about our hopes and fears. How much are we motivated by positive futures to be won? Or by a fear of certain downward spirals? When we think about these futures, how much are we focusing on the ‘new’, on the fundamental properties of digital data? Or are we looking more at how aspects of our society have been translated into digital environments?
[Pete] I will just start with two quotes, both directly related to the question, in my mind anyway, and focused on our belief that we have a future with data and data with us. After all, we contribute to it and consume it; the big difference is that we now collect and distribute data differently than we once did, and we use it differently too.
‘Progress means humanity emerges from its spellbound state, no longer under the spell of progress as well, itself nature, by becoming aware of its own indigenousness to nature and by halting the mastery over nature through which nature continues its mastery’ --- Theodor Adorno, ‘Progress’ (p. 62)
‘This future is unthinkable. Yet here we are, thinking it’ --- Timothy Morton, Dark Ecology (p. 1)
If we consider data as a material - malleable, expandable, recyclable, connectable and delivered through an anthropocentric system, one that identifies an age marked by global capitalism, nuclear development, rapid industrialization, urban sprawl and the digital revolution - we could start to treat data as we (in an ideal world) would treat plastics. Should it be of a particular sort? Should it have a limited life? Should it have governance and ethical considerations around use? Should governments legislate against particular forms of data? Most of all, should humans play a bigger part in understanding it and the uses it has? These are, for me, both the light and the dark of data politics. We certainly need a viable and arguably urgent framework for thinking through the imperatives of the long-lasting effects of our current views around data; and these need discussing in a social, human-political, and holistic-ecological way.
[Alex] I think this framing around material data is really interesting. I'm curious what draws you to plastics as an analogue over other materials - is it to do with being simultaneously useful and dangerous?
[Pete] The idea of using plastic is interesting, because like DATA it is both micro/macro/nano/useful and terrible. And as with plastic use, it is behaviour change that will twist/change our mindsets. One of the issues with DATA is that we feel it is not in our control, or that we can't regulate, dispose of, stop or question its use. For me the metaphor of a real material allows us, maybe, to see it differently, as we are starting to with our use of plastics, fuel and other toxic/useful materials.
[Natasha] Right now, with what’s going on in the US it’s hard not to think of dystopian data futures. Trump’s use of social media and propagating distorted truths and outright lies in handy soundbites has been frighteningly effective, and it’s a strategy that is difficult to fight because social media allows the messaging to spread like wildfire, and replicate like a virus, among the population. If we’re lucky, these recent events will be a dire warning that opens our eyes to the dangers of unchecked information and unlimited media, and changes things. If we’re not so lucky, a darker and scarier future threatens.
What’s this got to do with data sensemaking? I think it’s got a lot to do with it, because increasingly we live ‘in data’ and this is a new condition for our species, to which we have to adapt. You could think of this as a new environmental condition, like light, air or heat from the sun. We have learnt to use, and modify, these conditions in order to survive. And so we need to understand how data works, its properties, limitations, opportunities, dangers. To make or craft good things from it, we need to consider it as a material with unique properties, to be worked with appropriate tools. As events in the US show, data is not benign, though it may look it. It has explosive potential. Like sodium, which looks innocent enough, until combined with water.
So in data sensemaking the selection of data sets (what is selected and what is omitted) and the framings applied to that data are loaded with the values and biases of their creators, and the outcomes of a project or process may look bright for some, but dark for others, despite best intentions. A positive data future will only emerge by getting under the skin of these difficult issues, making the sourcing, construction and sharing of data transparent and accessible, and honing our skills in working with it.
Annie, as you have practical experience of working with data sensemaking, does this resonate with you? In your projects have you encountered conflicts or difficulties, e.g. in the selection or framing of data, that illuminated something about what we need to do better to build bright futures with data?
[Annie] Yes, this makes me reflect on a recent experience on a project which was seeking to explore how to reach people who wouldn’t previously have had a way of entering the world of creative technology innovation. We were mapping a regional ecosystem that could support the strengthening of relationships, and the pathways to inviting new communities into regional programmes and networks. In fact the reality of this work was surfacing the invisible - where underprivileged and excluded communities were, those without the privilege of ever knowing creative technology could be a future for them.
Sensemaking work is by its very nature an organic process. I see it as creating an architecture of knowledge with data at its core. Sense is made from the data, but fundamentally from the experience of collecting that data. It was this project that made me really see how the data we were surfacing and sensemaking is a key step in systems change, change-making for our future. A brighter future that in this case is orientated and guided by a data map of relationships that can enable the strengthening and creation of inclusive networks.
As researchers, what was the unintentional bias we were bringing to this work, and what were the unintended consequences? Who and what have we not included? How do we define exclusion? Most fundamentally, who and what were we putting at risk by not acknowledging them within our data?
[Alex] Annie, I wonder how this question links to your purpose with founding DOT PROJECT. You're working with organisations in the social and public sector, how much is data an antagonistic force there vs. an empowering one?
[Annie] Indeed DOT PROJECT is about systems change, but in the particular context of enabling civil society to thrive with digital, with data. We see DOT INSIGHT and our sensemaking practice as a fundamental building block on our journey where data becomes an enabler for change. Yes, I agree that in the context of civil society, our bright future is one where organisations are enabled by technology, to exist in a digital world that allows them to achieve the change and the societal impact they seek. Particularly during this time of turbulence and change we exist more than ever in a digitised space where physical contact and relationships have been removed, and where the most marginalised and excluded are even more so. Data from sensemaking the systems in which we exist can help us see where gaps and opportunities lie, and strengthen the case for investment across civil society.
[Alex] Personally I like the book ‘Four Futures’ by Peter Frase as a way of answering this question. It describes the ways abundance or scarcity could lead to certain forms of future, specifically around resource scarcity through climate catastrophe and technology scarcity through licensing and rentierism. So if we live in a future where data and technology are owned by all communities, we might find ways of thriving sustainably even when facing extreme resource scarcity. Or if some moonshot provides for extreme resource abundance, our lives can still be crappy if access to these resources is gated behind access to exclusive technology. I find this framing helps motivate some of my burning questions - how does data affect the entrenchment of power in the hands of a few? What ‘common sense’ systems in our society lead to that entrenchment? What does it mean for data to be controlled by a community, and used for the good of that community?
I really like the framing from Pete and Natasha particularly, around whether we can learn to treat data as a sustainable material, and/or as part of our environment. I think the negatives kind of spring to mind here - that there are these forms of data pollution which wreak havoc on our physical and digital ecosystems. And because we’re thinking in terms of negative externalities, it’s natural to talk about regulation and harm reduction. Is that where the ‘positive’ lies in this framing, as a way of countering the damage from data misuse? Are we fighting a defensive battle against a slide into data darkness? Or are there developments here that you see as being positive in their own right?
[Natasha] Briefly, yes! I absolutely think particular kinds of data practice can punch through the status quo and potentially lead to ‘bright’ outcomes, and we have some good overlaps in our interests in, and positive feelings about, community data structures, data ownership and, for me, democratic practices, placemaking etc. But ‘bright’ and ‘dark’ data futures are just two sides of the same coin – we need to understand the mechanics of both constructions to be able to realise our intentions when designing with data. Data is a slippery material, and easily appropriated for different ends than originally intended.
[Alex] Ah, right, when you say two sides of the same coin that reminds me of one of the conclusions in 'Four Futures' - that these 'futures' are really our present, it's just that different people are born into different situations and relationships to abundance and scarcity, both in access to physical resources as well as technology. That's always something valuable to keep in mind as we discuss things here. I find it easy to slip into assumptions about data because I have a certain experience of data that is 'globalised', English-language, etc.
[Annie] Alex, I'm thinking a lot about community ownership of data - not least for the obvious need for a sustainable and responsible approach to maintaining a data asset. The mapping, or sensemaking, practices I've been focused on are where the data can create a shared, actionable understanding of the world in which a client is operating - the organisations, people, initiatives.
Clients in my context are of the social economy, so the data surfaced will undoubtedly have value to others across the same system. There is significant potential for social capital where the dataset itself can become the catalyst for cooperation, for shared value and understanding. Convening around the data can bring about new communities of practice who are interested in sharing, managing and maintaining datasets that bring value to their work and social impact.
DOT PROJECT is a cooperative, and the mapping approach is rooted in the very heart of our principles of autonomy, equality and democratic control. Integral to the mapping is considering how to embed transparency, accountability and equality throughout the process. In particular, we are exploring the question of how we might create an approach which supports the long-term ambition of community-focused collectives to collect, interpret and maintain data. A collective approach is desirable because, from a practical perspective, it reduces the risk of duplicating data collection and system analysis activities; furthermore, it starts to model a way in which key community stakeholders can take decisions together and use the information to create aligned community responses.
[Pete] There are some positives ... DATA unions being one, and the work being done on AI bias. With all our doom-scrolling lately it's hard not to fall over on the negative side though. As our futures with data become more apparent/visible I expect we will be more discerning about how we use data, but as always it remains crucial for big data operators to develop ethical practices for real-time analysis that protect our user information. One of the important things about this platform is just that: making us aware of how to choose, how to think through data, the good stuff as opposed to the bad stuff. I think it's also really important to remember the different scales of data, from big to small, local to global. A quick search pulls up these 'positives' of data capture:
'Using big data cuts your costs ... increases your efficiency ... improves your pricing ... You can compete with big businesses ... Allows you to focus on local preferences ... Using big data helps you increase sales and loyalty ... Using big data ensures you hire the right employees.'
There seems a long way to go before we see a nuanced focus change. Shoshana Zuboff describes big data as a massive supporter of 'surveillance capitalism' (just think of your last Netflix suggestions). But returning to the point more positively, while a lot of big data is put to questionable use, we need to be thinking more about reclaiming the value that big data offers and applying aspects of it for social good.
[Natasha] Alex, in your question/ provocation you talk about an axis between bright and dark futures. I think my axis, or pivot point, is around our capacity as citizens (as opposed to big data players) to make use of data, to decipher it, to frame it effectively, to see through it, to store or share data on our own terms, to build collective structures, to make change with it, to call out abuses (or to fail to do this). This relates to tools or a data toolkit because to build this capacity we, as citizens, need new tools and the knowledge to use them. Are we heading on one side of the axis to dark data futures forced on us by bad actors (corporations, politicians etc.)? Although this is a possibility, in my research through the Fellowship (conferences, reading, talking etc) I’ve come across enough brilliant people and projects to convince me otherwise. Based on understanding the inherent difficulties and threats in this new material called data, they are finding new ways to operate with it in the real world and facilitate individual and collective action. The way to the bright data uplands has citizens, active and empowered in data, at the heart of it.
[Natasha] A framing of bright data futures that interests me concerns the critical position, and methods or devices that facilitate a critical process around datasets among citizens, and for some a first way in to working with data. I’m interested in deliberative processes (or deep thinking), such as those used in citizens' assemblies, where a representative group of citizens of all ages, backgrounds, ethnicities and socio-economic groups spend time considering a difficult issue, like climate change, from all angles. Data is naturally slippery and hard to get hold of. By nature data is live, it moves fast, is constantly updating and is vast in scale, potentially overwhelming. (Watery metaphors - river, ocean - fit it well.) These characteristics militate against a critical understanding of the data collected, including establishing its value, use and ethical aspects. In my research I’ve looked into the use of signal-blocking materials to design ‘analogue’ spaces (where mobile digital comms signals are blocked), for example, a design for a citizens' assembly space for deliberative democracy. This is one example of a mechanism to stop or catch the flow of data, to stop the rush of digital pings and nudges, to make space and time to think and reflect, either collectively or individually, before returning to the immersive world of ‘in data’ with new insights and fresh eyes. In this context I think analogue devices and objects, as well as spaces, are interesting mode-changers.
[Pete] I like the way you are thinking through this, Natasha, and I agree about the toolkit; we have mentioned this on quite a few occasions. With little available except mash-ups of field research from information design, or slightly out-of-touch design/thinking methodologies that negate technology or media theory, or simply ignore it altogether, where is the sensemaking with data? I like the optimistic tone directed towards the future, and this flags up for me how the concept of mediated spaces becomes super interesting, from both a spatial perspective and a material one, when related or converted by, or through, data. In my mind this could be communities, groups, unions, visualised experiences and analogue experiments.
[Annie] Natasha this is fascinating, and the notion of deliberative democracy. To stop or catch the flow of data. I've been exploring ways in which to surface the invisible, a responsible sensemaking approach if you like! In particular bringing the art of conversation to the research process to determine what kind of signals to pay attention to. A mixed method approach that must include diverse stakeholders that reflect the very conditions in which the data is collected. Creating and holding the space for diverse voices to converse around the system in which they operate, the relationships they see, the gaps they experience. In a way creating the space to allow us to pay attention to signals that are otherwise lost in the immersive world you refer to.
[Annie] Systems mapping for the purpose of our conversation is referring to a way in which to better understand and identify opportunities. A system being a set of ‘things’ working together as parts of an interconnecting network to accomplish an overall goal.
My own use of the term comes from the field of social network analysis, the aim of which is to understand a "community" (or "system") by mapping the individuals, groups, and relationships that connect them as a network. The system could be simple, or indeed highly complex. Systems mapping helps us to understand all these ‘things’ — the entities of a system — and how they relate to each other. It is a tool, or artefact, that can support a systems thinking approach by communicating an understanding of the system.
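To make the network idea concrete, here is a minimal sketch of a systems map as a graph of entities and relationships. All the entity names are invented examples, not drawn from any real mapping project; the point is just that once relationships are recorded as data, simple measures like an entity's number of connections start to reveal who the connectors are (and, by absence, who is invisible).

```python
from collections import defaultdict

# A toy systems map: entities (organisations, people, initiatives)
# linked by relationships. All names here are invented examples.
relationships = [
    ("Community Hub", "Arts Collective"),
    ("Community Hub", "Local Council"),
    ("Community Hub", "Youth Group"),
    ("Arts Collective", "Youth Group"),
    ("Local Council", "Youth Group"),
]

# Build an undirected network as an adjacency map.
network = defaultdict(set)
for a, b in relationships:
    network[a].add(b)
    network[b].add(a)

# Degree (number of connections) is one simple signal of which
# entities act as connectors in the system.
degree = {entity: len(links) for entity, links in network.items()}
for entity, d in sorted(degree.items(), key=lambda kv: -kv[1]):
    print(f"{entity}: {d} connection(s)")
```

An entity that appears in no relationship at all simply never enters the map, which is exactly the "who have we not included?" problem raised earlier.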
My particular interest is in the use of system mapping to help tackle strategic problems that require working together and a partnership approach for social development - public sector, civil society organisations, social enterprise - collective action.
So the question is really seeking to explore how the information, knowledge and wisdom enabled through a system mapping approach can be democratised? How can we ensure the outputs of this tool are accessible to all, trusted by all, usable by all? Where can the mapping data itself become a catalyst for change?
[Natasha] This is interesting. I know you have an established practice in data sensemaking for specific clients and with specific partners. It sounds like you’re interested in looking beyond the discrete project to the idea of connecting the data collected to a wider audience. In your current practice, if I understand correctly, the data is collected within an initial closed group of partners working to a defined purpose. This produces a repository of data, drawn from the ‘system’ or community, but also potentially useful to others beyond the initial group. So I think your interest is in how to democratise access to this data, make sure that the quality of the data can be scrutinised and verified, and potentially bring in more active partners to the network in order to bring about meaningful change.
[Annie] Yes exactly. The process starts with a closed group of stakeholders until there is, as you say, an initial repository of data. In a way, the projects themselves aim to work along the data spectrum (as defined by the ODI), ideally to a point of public access, and even open where there is value to the wider system in doing so. There are some really interesting developments in this space including Open Data Manchester, a not-for-profit community interest company driving responsible and open data across the city. An example project is leading the development of a data cooperative model for small energy cooperatives: "The aim of the project is to explore if a cooperative model of data custodianship could allow members' energy data to be collected, used and shared more effectively both within and outside the cooperative, returning value to the individual member, the cooperative and wider society".
[Natasha] This is in the realm of data institutions, such as data commons, data trusts and data co-operatives, which are organisational and legal structures for managing data, including data rights, consents, personal data and privacy. Most data is personal, in one way or another, so these kinds of protocols are fundamental if data is to be collected and used in an open way outside of its usual silos. There is interesting work going on in this area, including the mechanisms by which members of a data institution make decisions about what the data held can be used for. And these institutions can be used for stewardship of data collection as an ongoing activity, which may be a route to a financially sustainable entity.
[Annie] I think the concept of data cooperatives is very interesting and the power shift that comes with shared ownership. Mozilla Foundation has some interesting explanations of this, and a very useful list of data stewardship literature.
[Natasha] Your question raises lots of lines of inquiry. To democratise access to data, some basic requirements must be met, i.e. citizens must have access to the technology, appropriate interfaces that make the data accessible, and some data literacy (not insignificant asks). But beyond this, democracy fails if people don’t show up. Democratic systems depend on a level of engagement from a diverse range of citizens. If the project is to be built on a genuinely broad base of engagement, the people (the demos) have to see the value in engaging with data, i.e. that data can make a case for change, that data can make things happen and that they can play a part in deciding what that change might be. I think that sharing practical examples to demonstrate the power of data to effect change is useful here, and there’s more to say about deliberative processes in democracy, but I’ll leave it there for now. Can you see a place for any of this in your sensemaking practice, Annie?
[Annie] Absolutely. Accessibility and inclusion are key principles, both from a collective ownership perspective and in access to the data itself - what tools for adding, editing and sharing the dataset are appropriate, open and relevant for the mapped system, so that it can evolve in an open and democratic way. For example, when mapping organisations, are we thinking about access for organisations that have limited resources, barriers to digital or to English, and are unable to find the time to engage and so miss opportunities? Are there power structures in place that make organisations feel like they don’t have control? We have built into our process a phase of facilitating guiding principles and addressing ethics at the outset of each research project.
[Pete] As far as I understand successful system mapping approaches, in many cases they are human systems: they rely upon clear, fair, and discrete exchanges of value between people and a product or service. Other human systems might rely on contributions from participants and highlight in-kind benefits of collaboration. But exchanges of value can be distorted when a system is tracking behaviour, monitoring clicks, and stockpiling detailed personal information, because people may never learn the true value of the information they have contributed, and so are not able to truly benefit from the exchange. To maintain trust and integrity over the long term, perhaps we should always examine the detail and degree of the benefits being exchanged and explore what is valuable to those whose information is being used.
So the latter part of the question - ‘How can we ensure the outputs of this tool are accessible to all, trusted by all, usable by all? Where can the mapping data itself become a catalyst for change?’ - is relevant. The process of data gathering, pushing, using, storing and promoting over a system has developed quickly over the last five years. One of the recognised gaps is that data governance, which should enable us as humans, communities, collectives and organisations to answer critical questions about the results and decision-making of data applications, is not what it should be. This includes: who is accountable? How does data align with the community, collective, human or business strategy? What processes could be modified to improve the safe collection and ethical use of outputs? What controls need to be in place to track data for bias and pinpoint problems?
Are these results consistent and reproducible? I think the ability to answer these questions and respond to the outcomes of data systems requires a more flexible and adaptable approach, when you consider that traditionally these functions would have been applied to static processes. But data processes are iterative (they change), so you would imagine the governance/care must be as well. I think we can improve this in exactly the way the question states: the data itself becomes a catalyst for change, because, although the map is just a map, driving successful change means creating successful co-operative execution of the mapping, which depends on the data being part of something greater than just the collection process.
[Alex] This kind of sensemaking is central to the identity of Google, Facebook, and many other companies: finding ways to capture system information, to catalogue and organise it, and then use this data to drive changes. They are "data driven", and this is both high praise and the source of many issues. Is the answer to these issues to create "the people's Facebook"? Is the primary issue the governance of the machine, or the machine itself? I think the question of dual-direction engagement with data, of both contributing to a system map and benefitting from that system map, is interesting - but I think it's necessary to strip back the network question and ask how a 'node' can exist in that system in a way that treats their agency and control as sacrosanct. Consent for wider connections and communities can only be built on that foundation of trust.
In other words, if I represent a collective, a small group, what data do I own? Well, an infinite variety, but on a basic level there's the fact that I exist in some form. That I exist with some purpose. And, perhaps, that I'm not insular - I'm looking to connect in some way to other people. This is information that I (or we) hold, regardless of whether I'm conscious of it, or whether I choose to write it down or digitise it or share it publicly. For some groups, sharing this kind of information openly might be natural, but these groups are often better served by existing network structures. For other groups this isn't the case, and it's particularly for those groups who are failed or put in danger by existing practices that this question has the most urgency. The only way to build a network that everyone can participate in is to ensure that everyone can trust the network, and the only way to do that is to remove the concept of trust entirely: to guarantee that each group is 100% in control of all aspects of their data at all times - what data is in the system, who gets to see it, who gets to even know about it.
As a baseline we can think about the advantages of real life, physical relationships. We can encounter each other as one person to another. We can mutually agree to meet. We can walk away and forget each other. Our relationship isn't mediated through a third party. The question for me is how do we maintain these advantages of physical communities while expanding into online spaces - using the advantages of digital connection to build networks that span physical and virtual without sacrificing control or privacy?
There's a ton of interesting development (and a ton of less interesting development) going on in the distributed peer-to-peer space that might help - for example, Solid, Sandstorm, or Urbit - technologies built around complete private ownership of data, and direct connection between peers without intermediary mediation. For a long time I felt quite pessimistic about these developments, as the language of privacy and personal control feels inadequate - we so often see that privacy concerns alone are not enough to motivate individual action. It's only recently that something has clicked, that I've realised that part of that pessimism comes from being trapped in a 'digital realism', and that these software developments can be a first step in reclaiming technology as a tool for personal + collective empowerment (even if some of the first developments look a lot like recreations of things that already exist).
Well, that's a bit of a tangent from where this question started - but my point is to question what it means to have a democratic network, or to ask where the limitations are of open data.
[Annie] Yes, and thank you Alex, as these are also hugely relevant points - referring back to the Data Spectrum, truly being open is hard and will inevitably take time and patience. I often think about the concept of data stewardship, and how this is one way in which to try and engender trust, but it also comes with the challenges you identify quite clearly here - is there trust in the who / what / how of the data stewardship? Are there hidden power structures at play that will influence the desired state of openness?
[Alex] Yeah, I think the Data Spectrum is an interesting model for this in a few ways. First in highlighting that natural gradation between something which is completely private vs. something I share with a specific person or group. But there's some stuff left unsaid here (to an extent deliberately - it's a simple and useful model). They offer a spectrum between personal-commercial, and this is what I was trying to get at earlier when saying that the language of personal privacy feels limited, or like it's obscuring something. For a lot of systems data it makes sense to use the 'commercial' label - even if non-commercial, you might be dealing with a formalised organisation with a legal status - but when we talk about democratising data or thinking about agency, what we're adding to the mix is often an informal group of people. I.e. there may be a system or a collective, but one that exists only as a gathering of people rather than as an abstract, formal entity. So for me that's a key part of this question, how to link systems mapping with existing collectives without forcing engagement in some formal or mediated way with a third party.
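As a minimal sketch of that gradation: the ODI Data Spectrum runs from closed, through shared, to open data. The three broad bands below follow that model; the example datasets and their classifications are invented for illustration.

```python
from enum import Enum

# The three broad bands of the ODI Data Spectrum.
# (The full model includes finer gradations within 'shared'.)
class DataAccess(Enum):
    CLOSED = "internal access only"
    SHARED = "available to named people or groups"
    OPEN = "anyone can access, use and share"

# Invented example datasets placed along the spectrum.
datasets = {
    "staff payroll": DataAccess.CLOSED,
    "partner ecosystem map": DataAccess.SHARED,
    "bus timetables": DataAccess.OPEN,
}

for name, level in datasets.items():
    print(f"{name}: {level.name.lower()} ({level.value})")
```

The point Annie raises maps onto moving a dataset like the ecosystem map rightwards along this spectrum, from shared towards open, as and when there is value to the wider system in doing so.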
The other interesting thing to touch on with regards to the ODI Data Spectrum is that it covers the licensing of data itself, but not the way that data is often controlled by a platform owner. I.e. we might have a closed conversation, but the connection between us is undoubtedly being modelled on some social graph by various companies. And I'm not a purist about that, but this comes back to wanting to break out of this 'digital realism' that suggests it's natural or inevitable for our interactions to be tracked. The alternative is increasing conspiracy, skepticism, and unwillingness to engage - i.e. is the NHS COVID app spying on you? Well, no, but your phone definitely is in general, so it's not unreasonable to be wary.