Convivium was a project of Cardus 2011‑2022, and is preserved here for archival purposes.

Tools of Science: Trusted or Busted?

Putting algorithms to an accountability test doesn’t require junking them entirely, but it can help us catch out powerful interests more intent on abusing than using our data, argues Cardus Social Cities Director Milton Friesen.

Milton Friesen
3 minute read

The big success stories of our time have scaled at exponential rates: Facebook, Amazon, Apple, and the other ‘super bigs’ that relentlessly enlarge themselves. When something scales, however, the mix of characteristics, benefits, and costs of a business or organization may scale with it.

In contemporary life, the silicon-based tools that business, research, and social interaction depend on carry such mixes with them. Their complex and intricate nature means the actual functioning of algorithms, calculations, and processing devices is opaque to nearly all of us. Yet we also depend on them in hundreds of ways, as Pedro Domingos (The Master Algorithm) so eloquently points out. That combination of dependence and invisibility means a high degree of trust is required. Not abstractly, but directly, personally, by each of us.

Consider the recent news about Charlsie Agro and her sister Carly, identical twins who submitted their DNA to five different testing companies and received five different profiles. The magic of sending in a bit of your DNA and having a scientific wizard read your crystal ball has been jostled just a little by this smart and practical experiment. While the results do not undermine the science of DNA testing per se, the conflicting answers suggest that we should temper our confidence in consumer-level DNA testing.

In that case, the answer is that the science requires interpretation: the algorithms are tuned differently by each company that owns and maintains them, and variability enters the equation. We don’t trust only what is perfect or we would trust nothing. Trust does, however, require consistency and transparency, and, by extension, something we can understand.
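For readers who like to see the idea in miniature, here is a toy sketch in Python. It is not any company’s actual method; the markers, reference panels, and weightings are all invented for illustration. The only point is that the same genotype, scored by two differently tuned estimators, comes back as two different profiles.

# Toy illustration only: two hypothetical estimators score the same
# genotype against different, invented reference panels, so the same
# input produces different "ancestry" percentages.

def estimate_ancestry(genotype, reference_panel):
    """Score a set of markers against each population's invented
    marker weights, then normalize the scores into percentages."""
    scores = {}
    for population, marker_weights in reference_panel.items():
        # Sum how strongly the observed markers match this population.
        scores[population] = sum(marker_weights.get(m, 0.0) for m in genotype)
    total = sum(scores.values()) or 1.0
    return {pop: round(100 * s / total, 1) for pop, s in scores.items()}

# The same person's (made-up) marker set...
genotype = {"m1", "m2", "m3", "m4"}

# ...scored against two differently tuned, hypothetical reference panels.
company_a_panel = {
    "British & Irish": {"m1": 0.9, "m2": 0.7, "m3": 0.2},
    "Scandinavian":    {"m2": 0.4, "m3": 0.8, "m4": 0.5},
}
company_b_panel = {
    "Northwestern European": {"m1": 0.8, "m2": 0.8, "m3": 0.7, "m4": 0.3},
    "Eastern European":      {"m3": 0.5, "m4": 0.9},
}

print(estimate_ancestry(genotype, company_a_panel))  # roughly 51% / 49%
print(estimate_ancestry(genotype, company_b_panel))  # roughly 65% / 35%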

A recent Time article by Roger McNamee, one of Facebook founder Mark Zuckerberg’s mentors, complements these considerations. Where we cannot all understand or engage with the details of a function, some form of regulation that serves the public interest is needed. We cannot all test the food that enters our home for heavy metals or poisons, so we have a food regulation system that does it for us. The same is true of the water that comes out of our faucets. I took a drink this morning without a second thought because I trust the water supply regulations that are operative in my city.

McNamee calls for, among other things, regulation of algorithms that have significant public impact. We need to know which applications of scientific and computational capability are toxic to us personally or collectively, and which ones are good for us. For now, our trust in the machines is premised on the traditional regulation of the organizations that develop, own, and control them. They have provided so much utility that their bigger systemic effects get less attention.

We will doubtless see more cases of abuses and distortions that are costly to real people. These stories are essential. There are important benefits we all realize through tools like DNA testing and the optimizing algorithms that help us navigate around emerging traffic jams. But the potential of these tools to truly solve problems can be tainted when they are applied to less noble ends: giving powerful people a means to shrewdly take more from the weak, harvesting our curiosity about family history by coaxing DNA from us that in turn becomes a data asset they can profit from without us, and using our networks of family and friends like a massive Amway promotional channel where we don’t even get the lousy cleaning spray.

The unwitting lesson we are learning is that scaling by means of the powerful tools of science gives the perennial dilemma of sorting the beneficial from the harmful a new potency. It’s as if we have, through these powerful tools, added an exponent to our organizing: helping² or harming². We will need to deliberate together about what that potency means socially, and even about how it may change our conception of what it means to be human. Evaluating harm and benefit will hinge on our collective sense of human value, in debates sparked by people like Gifford lecturer J. Wentzel van Huyssteen and carried on in places like Wycliffe College.

It will be up to us to be alert, creative, and inventive in curling that power back on itself to see what happens – conjuring up new versions of the identical twin DNA test just to keep the game honest.

You'll also enjoy...

Figuring Out Social Isolation

Twenty-three per cent of Canadians suffer from extreme social isolation and loneliness, according to a recent Angus Reid Institute survey in partnership with Cardus. Convivium sits down with executive director Ray Pennings to discuss this and other results from the survey.

Follow the Political Science

In the second of two parts, Travis Smith argues that our responses to the pandemic reveal a Canada progressively squeezing out its commitment to liberty.

Pluralism and the Blue Plate Special

Weekly media teeth-gnashing over deepening political polarization is finally turning up good news, writes Josh Nadeau. A path back to true pluralism leads through small local institutions such as places called Judy's Diner.