3 Data Types Every ICT4D Organization Needs – Your Weekend Long Reads

After five years researching the effectiveness of non-profit organizations (NPOs) in the USA, Stanford University lecturer Kathleen Kelly Janus found that while 75% of NPOs collect data, only 6% feel they are using it effectively. (Just to be clear, these were not all tech organizations.)

She suggests the reason is that they lack a data culture. In other words, they need to cultivate “a deep, organization-wide comfort level with using metrics to maximize social impact.” Or, in ICT4D speak, they need to be data-driven.

Perhaps NPOs feel that if they start collecting, analysing and using big data, that need will be satisfied. But one cloud server of big data does not a data culture make. While big data can be a powerful tool for development, there are three other data types that could significantly improve the impact of any ICT4D intervention.

Thick data

Technology ethnographer Tricia Wang warns us about the dangers of looking only to big data for answers – of trusting large sets of quantitative data without a human perspective. She proposes that big data must be supplemented with “thick data”: qualitative data gathered by spending time with people.

Big data excels at quantifying very specific environments – like delivery logistics or genetic code – and doing so at scale. But humans are complex and so are the changing contexts in which they live (especially true for ICT4D constituents). Big data can miss the nuances of the human factor and portray an incomplete picture.

As a real-life example, in 2009 Wang joined Nokia to try to understand the mobile phone market in China. She observed, talked to, and lived amongst low-income people and quickly realised that – despite their financial constraints – they were aspiring to own a smartphone. Some of them would spend half of their monthly income to buy one.

But the sample was small, the data not big, and Nokia was not convinced. Nokia’s own big data was not telling the full story – it was missing thick data, which led to catastrophic consequences for the company.

Adjacent data

Sometimes there is value in overlaying data from other sources onto your own to provide new insights. Let’s call this “adjacent data”. Janus provides the case of Row New York, an organization that pairs rigorous athletic training with tutoring and other academic support to empower youth from under-resourced communities.

To measure success, Row started by tracking metrics like the number of participants, growth, and fitness levels. But how could they track determination or “grit” – attributes of resilient people?

They started recording both attendance and daily weather conditions to show which students were still showing up to row even when it was 4°C and raining. “Those indicators of grit tracked with students who were demonstrating academic and life success, proving that [Row’s] intervention was improving those students’ outcomes.”
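As a rough illustration of how such an overlay might work in practice, here is a minimal sketch that joins attendance records to daily weather and computes each student’s bad-weather attendance rate. The field names, threshold, and data are hypothetical, not Row New York’s actual system.

```python
# A minimal sketch of overlaying adjacent (weather) data onto attendance
# records. All field names, thresholds, and values are hypothetical.
import pandas as pd

attendance = pd.DataFrame({
    "student_id": ["s1", "s2", "s1", "s2"],
    "date":       ["2018-03-01", "2018-03-01", "2018-03-02", "2018-03-02"],
    "attended":   [True, False, True, True],
})

weather = pd.DataFrame({
    "date":    ["2018-03-01", "2018-03-02"],
    "temp_c":  [4, 15],
    "raining": [True, False],
})

# Overlay the adjacent dataset by joining on the shared date column.
merged = attendance.merge(weather, on="date")

# Flag sessions held in rough conditions: cold and wet.
merged["bad_weather"] = (merged["temp_c"] <= 5) & merged["raining"]

# A crude "grit" proxy: attendance rate on bad-weather days only.
grit = (
    merged[merged["bad_weather"]]
    .groupby("student_id")["attended"]
    .mean()
    .rename("bad_weather_attendance_rate")
)
print(grit)
```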

Pinpointing adjacent data requires thinking outside of the box. Maybe reading Malcolm Gladwell or Freakonomics will provide creative inspiration for finding those hidden data connectors.

Lean data

Lastly, there is a real risk in hoovering up every possible data point in the hope that answers to increased impact and operational efficiency will emerge. The problem is not only the data security and privacy risks that come with the sponge approach; it’s that it is easy to drown in data.

Most ICT4D initiatives don’t have the tech or the people to meaningfully process it all. Too much data can overwhelm rather than reveal insights. The challenge is gathering just enough data – just the data we need. Let’s call this “lean data”. When it comes to data, more is not better; just right is better. In fact, big data can be lean: it’s not about quantity but selectivity.

Lean data is defined by the goals of the initiative and its success metrics. Measure enough to meet those needs. When I was head of mobile at Pearson South Africa’s Innovation Lab, we were developing an assessment app for high school learners called X-kit Achieve Mobile.

With the team we brainstormed the data we needed to serve our goals and those of the student and teacher users. We threw in quite a lot of extra bits based on “Hmm, that would be cool to know, let’s put it in a dashboard.”

The company was also preparing to report publicly on its educational impact, so certain data points were being collected by all digital products. Having a common data dictionary and reporting matrix is something worth considering if you’re implementing more than one product.
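For illustration only, here is one way a shared data dictionary might look in code: a single source of truth that every product logs against, which also keeps collection lean by rejecting stray fields. The event and field names are hypothetical, not Pearson’s actual schema.

```python
# A minimal sketch of a common data dictionary shared across products.
# Event and field names are hypothetical illustrations.
DATA_DICTIONARY = {
    "assessment_completed": {
        "description": "A learner submitted a completed assessment.",
        "fields": {
            "learner_id":   "string, pseudonymous identifier",
            "product":      "string, e.g. 'xkit-achieve-mobile'",
            "subject":      "string, curriculum subject code",
            "score_pct":    "float, 0-100",
            "completed_at": "string, ISO 8601 timestamp",
        },
        "used_in": ["impact_report", "teacher_dashboard"],
    },
}

def validate_event(event_name: str, payload: dict) -> bool:
    """Accept only events defined in the dictionary, with exactly the
    agreed fields: lean data by construction."""
    spec = DATA_DICTIONARY.get(event_name)
    return spec is not None and set(payload) == set(spec["fields"])

# Example: an event matching the agreed schema is accepted.
print(validate_event("assessment_completed", {
    "learner_id": "a1", "product": "xkit-achieve-mobile",
    "subject": "MATH", "score_pct": 72.0,
    "completed_at": "2018-05-03T10:15:00Z",
}))  # True; any undocumented extra field would make this False
```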

After building the app, we only really used about 20% of all the reports and dashboards. Only as we iterated did we discover new reports that we actually needed. The fact is that data is seductive; it brings out the hoarder in all of us. We should resist and take only what we need.

So, perhaps the path to building a data culture is to always have thick data, be creative about using adjacent data, and keep all data lean.

Image: CC by janholmquist

Every Big Data Algorithm Needs a Storyteller – Your Weekend Long Reads

The use of big data by public institutions is increasingly shaping people’s lives. In the USA, algorithms influence the criminal justice system through risk assessment and predictive policing systems, drive energy allocation, and change the educational system through new teacher evaluation tools.

The belief is that the data knows best, that you can’t argue with the math, and that algorithms make the work of public agencies more efficient and effective. And, often, we simply have to take this on trust, because nobody can examine the algorithms.

But what happens when – not if – the data works against us? What is the consequence of algorithms being “black boxed” and kept outside of public scrutiny? There are two implications here for ICT4D.

The Data Don’t Lie, Right?

Data scientist Cathy O’Neil, who holds a Harvard PhD in mathematics, says that clever marketing has tricked us into being intimidated by algorithms – into trusting and fearing them simply because, in general, we trust and fear math.

O’Neil’s 2016 book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, shows how, when big data goes wrong, teachers lose jobs, women don’t get promoted, and global financial systems crash. Her key message: the era of blind faith in big data must end, and the black boxes must be opened.

Demand Algorithmic Accountability

It is very interesting, then, that New York City has a new law on the books to do just that and demand “algorithmic accountability” (presumably drawing on the Web Foundation’s report of the same name). According to MIT Technology Review, the city council passed America’s first bill to ban algorithmic discrimination in city government. The bill establishes a task force to study how city agencies use algorithms and to report on how to make them more easily understandable to the public.

AI Now, a research institute at New York University focused on the social impact of AI, has offered a framework centered on what it calls Algorithmic Impact Assessments. Essentially, this calls for greater openness around algorithms, strengthening of agencies’ capacities to evaluate the systems they procure, and increased public opportunity to dispute the numbers and the math behind them.

Data Storytellers

So, what does this mean for ICT4D? Two things, based on our commitment to being transparent and accountable for the data we collect. Firstly, organisations that mine big data need to become interpreters of their algorithms. Someone on the data science team needs to be able to explain the math to the public.

Back in 2014 the UN Secretary General proposed that “communities of ‘information intermediaries’ should be fostered to develop new tools that can translate raw data into information for a broader constituency of non-technical potential users and enable citizens and other data users to provide feedback.” You’ve noticed the increase in jobs for data scientists and data visualisation designers, right?

But it goes beyond that. Every report and outcome that draws on big data needs a “how we got here” explanation – not just making the data understandable, but telling the story behind it. Maybe the data visualiser does this, but maybe there’s a new role of data storyteller in the making.
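To make the idea concrete, here is a hypothetical sketch of what a “how we got here” explanation could look like in code: a toy linear risk model that returns, alongside each score, the plain-language drivers behind it. The model, weights, and feature names are invented for illustration, not any agency’s actual system.

```python
# A toy sketch of pairing an algorithmic score with its "how we got
# here" story. Weights and feature names are hypothetical.
import math

WEIGHTS = {"missed_payments": 1.2, "years_at_address": -0.4, "prior_defaults": 2.1}
BIAS = -1.0

def score_with_story(features):
    # How much each feature pushed the score up or down.
    contributions = {name: WEIGHTS[name] * features[name] for name in WEIGHTS}
    logit = BIAS + sum(contributions.values())
    risk = 1 / (1 + math.exp(-logit))  # squash to a 0-1 risk score
    # Rank drivers by the size of their push, biggest first.
    drivers = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    story = [f"{name}: {value:+.2f} contribution to the risk score"
             for name, value in drivers]
    return risk, story

risk, story = score_with_story(
    {"missed_payments": 2, "years_at_address": 5, "prior_defaults": 0})
print(f"risk = {risk:.2f}")
print("\n".join(story))
```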

The UN Global Pulse principle says we should “design, carry out, report and document our activities with adequate accuracy and openness.” At the same time, Forbes says data storytelling is an essential skill. There is clearly a connection here. Design and UI thinking will be needed to make sure the heavy lifting behind the data scenes can be explained as simply as you would explain it to your grandmother. Is this an impossible ask? Well, the alternative is simply not an option anymore.

Data Activists

Secondly, organisations that use someone else’s big data analysis – like many ICT4D orgs these days – need to take an activist approach. They need to ask where the data comes from, what steps were taken to audit it for inherent bias, and what “secret sauce” went into the analysis. We need to demand algorithmic accountability. We are creators and arbiters of big data.
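One concrete question an activist can script is whether positive outcomes differ across groups. Here is a minimal sketch of such a disparate-impact check; the data and group labels are hypothetical, and the 0.8 cut-off follows the common “four-fifths” rule of thumb rather than any single standard.

```python
# A minimal sketch of a disparate-impact check on someone else's
# decisions. Data, groups, and the 0.8 threshold are illustrative.
from collections import defaultdict

decisions = [  # (group, positive outcome?)
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
for group, positive in decisions:
    counts[group][0] += int(positive)
    counts[group][1] += 1

rates = {group: pos / total for group, (pos, total) in counts.items()}
ratio = min(rates.values()) / max(rates.values())

print(f"selection rates: {rates}")
print(f"disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Rates diverge notably; ask the provider to explain why.")
```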

The issue extends beyond protecting user data and privacy, important as this is. It relates to transparency and comprehension. Now is the time, before it’s too late, to lay down the practices that ensure we all know how big data gets cooked up.

Image: CC by kris krüg

In ICT4D We’re Principled, But Are We Practiced Enough? – Your Weekend Long Reads

Last month CO.DESIGN published the 10 New Principles of Good Design (thanks, Airbel Center, for the link). The article, which builds on a set of industrial design principles from the 1970s, makes for important reading.

According to the author, and concerning commercial digital solutions, 2017 was “a year of reckoning for the design community. UX became a weapon, AI posed new challenges, and debate erupted over once rock-solid design paradigms.” What is most interesting – and wonderful to boot – is that many of these “new” principles are ones we, the ICT4D community, have endorsed for years.

Good Design is Transparent

For example, the article calls for transparency in design. Apparently today, “amid a string of high-profile data breaches and opaque algorithms that threaten the very bedrock of democracy, consumers have grown wary of slick interfaces that hide their inner workings.”

We know that user-centered design is participatory and that we should expose the important parts of digital solutions to our users. We believe in telling our users what we’ll do with their data.

Good Design Considers Broad Consequences and is Mindful of Systems

The article warns that in focusing on the immediate needs of users, user-friendly design often fails to consider long-term consequences. “Take Facebook’s echo chamber, Airbnb’s deleterious impact on affordable housing,” as examples. Not for us: we understand the existing ecosystem, are conscious of long-term consequences and design for sustainability.

A Little History Lesson

Today we have principles for sectors – such as refugees, health and government (US or UK version?); for cross-cutting themes – such as identity, gender and mobile money; for research; and the granddaddy of them all, for digital development.

These principles have been developed over a long time. Fifteen years ago I wrote a literature survey on the best practices of ICT4D projects. It was based on the work of Bridges.org, a research pioneer at the time, drawing on a range of projects from the early 2000s.

In my paper I summarised Bridges.org’s seven habits of highly effective ICT-enabled development initiatives. By 2007 the list had grown to 12 habits – many of which don’t look that different from today’s principles.

Do We Practice What We Preach?

But if these principles are not new to us, are we practicing them enough? Don’t get me wrong, the ICT4D community has come a long way in enlisting tech for social good, and the lessons — many learned the hard way — have matured our various guidelines and recommendations. But should we be further down the line by now?

The principles mostly outline what we should do, and some work has been done on the how side, helping us move from principles to practice. But I think we need to do more to unpack the “why don’t we” aspect.

Consider this data point from a recent Brookings Institution report, Can We Leapfrog? The Potential of Education Innovations to Rapidly Accelerate Progress (more on this report in a future post). Brookings analysed almost 3,000 education innovations around the world (not all tech-based, just so you know) and found that:

… only 16 percent of cataloged interventions regularly use data to drive learning and program outcomes. In fact, most innovations share no information about their data practices.

We know that we should be data-driven and share our practices. So what is going on here? Do the project managers behind these interventions not know that they should do these things? Do they not have the capacity in their teams? Do they not want to because they believe it exposes their non-compliance with such principles? Or perhaps they feel data is their competitive edge and they should hide their practices?

Time for ‘Fess Faires?

Fail faires are an excellent way to share what we tried and what didn’t work. But what about ‘Fess Faires, where we confess why we can’t – or, shock horror, won’t – follow certain principles? Maybe it’s not our fault: think of funding cycles that ICT4D startups can’t survive. But maybe we should be honest and say we won’t collaborate because the funding pie is too small.

If fail faires are more concerned with operational issues, then ‘fess faires look at structural barriers. We need to ask these big questions in safe spaces. Many ICT4D interventions are concerned with behavior change. If we’re to change our own behavior we need to be open about why we do or don’t do things.

Good Design is Honest

So, on the one hand we really can pat ourselves on the back. We’ve had good design principles for almost twenty years. The level of adherence to them has increased, and they have matured over time.

On the other hand, there is still much work to be done. We need to deeply interrogate why we don’t always practice our principles, honestly and openly. Only in this way will we really pursue a key new principle: good design is honest.