Creative Habitable Cities: The Trouble with Technology

In the search for ideas to create habitable cities, technology is often the easy option. Against the backdrop of the Creative Bureaucracy Festival 2020, we look at the importance of assigning technology the right purpose and knowing when to use it.


Photo by Shoeib Abolhassani on Unsplash
Creative Nudge

The Singapore government recently announced that it was partnering with Apple on a two-year project to promote the health of its citizens. Called LumiHealth, the programme pairs the Apple Watch with an app-driven incentive scheme that will pay residents up to S$380 in cash and vouchers over the two years if they hit certain health and exercise-related goals.

It’s the latest in a slew of global initiatives that aim to nudge our social behaviour and find creative responses to health and other public crises. But there’s a thin line between nudge and control, the point where good intention becomes intrusion, and it’s symptomatic of our problematic relationship with technology in general. As the Creative Bureaucracy Festival 2020, an international forum for government innovation, launches with a virtual programme of around 175 talks, workshops and keynote speeches, it’s worth looking at some of the ways that barriers and biases may be hard-wired into the technology itself.

Photo by Parker Coffman on Unsplash
Surveillance Capitalism

As documentaries like The Social Dilemma and The Great Hack, as well as books like The Age of Surveillance Capitalism by Shoshana Zuboff, make abundantly clear, the technology we use isn’t always created with our needs uppermost. Partly because of its ubiquity and the amount of time we spend using it, social media is an obvious example, but it’s far from the only one. One way to think of our relationship with social media is to imagine the companies as dairy farms.

As consumers, we think we’ve negotiated an incredible deal: these farms want to give us a never-ending supply of free milk. But we’ve misconstrued the relationship. We’re not their customers. We’re the herd they monetise; the cows that produce the milk, or rather the data, that they sell to their real customers, the advertisers and marketers who pay them billions.

Once we get our heads around the fact that we’re resources rather than customers, the way those companies act is easier to understand. Like any good farmer, their motivation is to do the bare minimum required to keep their herd passive and healthy enough to turn a profit. It’s no wonder we feel discomfited. It’s been such a breathless journey that it’s easy to forget that Facebook only opened to the public in 2006, that Twitter launched the same year, and that TikTok was only made available outside China in 2017.

In just over a decade we’ve transformed from dizzy optimists convinced that digital devices, data and citizen journalism would usher in a new era of freedom and democracy to jaded, bloodshot-eyed device addicts, who, ironically, rely on smartphone apps to tell us how many hours we’ve spent staring at our smartphone screens this week.

With Big Tech, at least we have someone to rail against. Firms like Apple, Facebook, Google, Twitter and Microsoft have global figureheads that governments and the media can question and scrutinise.

Photo by Nicola Fioravanti on Unsplash
False Positives

That might not be the case in the future. We’re increasingly handing control of our world over to algorithms and automated systems that we’re told we can trust, but whose logic and reasoning processes are, at best, opaque and, at worst, unknowable. Only this month, Twitter apologised for a racist quirk in an image-cropping algorithm that caused it to crop out dark-skinned faces and home in on white ones. Beyond the blatant inequality it demonstrates, the episode illustrates the danger of letting a machine decide what is ‘most important’ in an image.
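Saliency cropping itself is mechanically simple, and the simplicity is part of the danger: a model scores every pixel for ‘importance’, and the crop window with the highest total score wins. A minimal sketch of that search in Python (the function names and stride are illustrative; Twitter has not published its pipeline):

```python
import numpy as np

def crop_by_saliency(image, saliency, crop_h, crop_w, stride=8):
    """Pick the crop window whose summed saliency score is highest.

    image:    (H, W, 3) array of pixels.
    saliency: (H, W) map of per-pixel 'importance' produced by some
              upstream model -- this is where any bias lives.
    """
    H, W = saliency.shape
    best_score, best_yx = -np.inf, (0, 0)
    for y in range(0, H - crop_h + 1, stride):
        for x in range(0, W - crop_w + 1, stride):
            score = saliency[y:y + crop_h, x:x + crop_w].sum()
            if score > best_score:
                best_score, best_yx = score, (y, x)
    y, x = best_yx
    return image[y:y + crop_h, x:x + crop_w]
```

The search itself is scrupulously neutral. If the upstream model systematically scores lighter faces as more ‘salient’, this loop will faithfully crop towards them every time.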

Similar issues plague products and services that rely on algorithms. Facial recognition software that creates more false positives for women than men and for dark skin than white skin. Criminal recidivism predictors reinforcing the racial biases they were designed to eliminate. Human resources tools that perpetuate gender and ethnic imbalances. 
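The first of those failures has a precise, measurable shape: a system can look accurate overall while its error rates diverge sharply between groups, which is why audits report false positive rates per group rather than a single headline number. A toy illustration with invented data:

```python
import numpy as np

def false_positive_rate(y_true, y_pred):
    """Share of true negatives that the system wrongly flagged."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    negatives = y_true == 0
    return (y_pred[negatives] == 1).mean()

# Hypothetical face-match decisions: 8 non-matches and 2 matches per group.
group_a = false_positive_rate([0]*8 + [1]*2, [0]*7 + [1]*1 + [1]*2)
group_b = false_positive_rate([0]*8 + [1]*2, [0]*5 + [1]*3 + [1]*2)
print(group_a, group_b)  # 0.125 vs 0.375 -- same system, triple the error rate
```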

In the majority of cases, we only become aware of these issues when they affect us or someone close to us. Though systemic in nature, their impact tends to be felt by a single individual or group at a time. A rare example of algorithms creating collective injury and outcry came in England this summer, after the government opted to use an algorithm to grade school leavers whose exams had been cancelled due to COVID-19, rather than relying on the predictions made by their teachers.

The subsequent AI-assisted rounding down of grades disproportionately affected students from state schools, sparking massive street protests and a policy U-turn on the part of the government. The sight of people protesting against machines is only set to become more common.
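Ofqual’s full model had many moving parts, but the standardisation step at its heart can be caricatured in a few lines: teachers supplied a rank order of their students, and the algorithm mapped that ranking onto the school’s historical grade distribution. A simplified sketch (the data and function are hypothetical, not Ofqual’s code):

```python
import numpy as np

def standardise(ranked_students, school_history):
    """Map a cohort's rank order onto the school's past grade distribution.

    ranked_students: student names, best first (supplied by teachers).
    school_history:  fraction of previous cohorts awarded each grade.
    """
    grades = list(school_history)                       # e.g. A, B, C, D
    n = len(ranked_students)
    # Cumulative cut-offs: if 10% historically got an A, so do the top 10%.
    cutoffs = np.cumsum([school_history[g] for g in grades]) * n
    results, g = {}, 0
    for i, student in enumerate(ranked_students):
        while i + 1 > cutoffs[g] and g < len(grades) - 1:
            g += 1
        results[student] = grades[g]
    return results

cohort = [f"student_{i}" for i in range(10)]            # invented cohort
history = {"A": 0.0, "B": 0.3, "C": 0.5, "D": 0.2}      # school never had an A
print(standardise(cohort, history)["student_0"])        # 'B' -- capped by history
```

The unfairness is baked into the structure: however strong their teacher’s prediction, the top student at a school that historically produced no A grades cannot be awarded one.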

Photo by Marcus Kauffman on Unsplash
White Balance

Other problems arise when we mistake our tech for something it isn’t. As wildfires raged across California and Oregon, smartphone and digital photographers were surprised to find that their cameras replaced the apocalyptic images of devastation their eyes perceived with washed-out oranges and greys.

While a traditional film camera focuses light through a lens onto the film, exciting the light-sensitive crystals that coat it and essentially ‘burning’ a copy of the framed shot into place, a digital camera simply records data – ones and zeroes – that an algorithm processes into the image we see. More importantly, unlike a traditional camera, which follows the colour profile of its film, a digital camera is colour-blind.

Software, coupled with tools like automatic white balance, helps to create the realistic – or unrealistic – tones that make selfies so addictive, but it wasn’t programmed to cope with the reds and oranges of a fire-tinged world. With little or no white to balance against, the software assumes that the scene is neutral and adjusts the colours accordingly. It’s not a glitch or a fault; we’ve simply misunderstood what our smartphone and digital cameras are.
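One of the most common auto-white-balance heuristics, the ‘grey-world’ assumption, shows exactly how this happens: the algorithm treats the scene’s average colour as neutral grey and scales each channel until it is, so a frame that is genuinely orange gets its orange corrected away. A toy version (real camera pipelines are far more sophisticated):

```python
import numpy as np

def grey_world(img):
    """Grey-world white balance: assume the average scene colour is
    neutral grey and scale each channel so the averages match.
    img: (H, W, 3) float array with values in 0-1."""
    means = img.reshape(-1, 3).mean(axis=0)   # average R, G, B
    gain = means.mean() / means               # per-channel correction factor
    return np.clip(img * gain, 0, 1)

# A wildfire sky: nearly every pixel is deep orange.
sky = np.tile([0.9, 0.45, 0.1], (100, 100, 1))
print(grey_world(sky)[0, 0])  # ~[0.48 0.48 0.48]: the orange becomes grey
```

On an ordinary scene the trick works because whites, greys and mixed colours really do average out to something neutral; under a smoke-filled sky, the one colour the algorithm is built to remove is the one that matters.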

Human Solutions

That’s not to say that these problems with AI and other forms of technology won’t be resolved. Despite its ubiquity, most of our digital technology is less than two decades old. The companies acting as gatekeepers to that technology are even younger. The data they produce is only as good as our ability to access it and understand it. As Charles Landry, the founder of the Creative Bureaucracy Festival, noted in a recent interview with The Citymaker: “Our data is public data. Decision makers and politicians should safeguard that. If they don’t…then you’ve given one of the main sources of information, knowledge, and potential to private companies.”

It’s part of what makes events like the Creative Bureaucracy Festival so important. Looking to technology to help us solve the world’s most pressing issues is a natural response. At the same time, we have to ensure that those solutions serve our needs. That requires human involvement, collaboration and ingenuity. It involves giving the machines the right purpose and setting the proper parameters. And knowing, above all, when not to use them.

Creative Bureaucracy Festival 2020: Programme of Events

Follow Think City on Facebook and Twitter for more information on our involvement with the Festival.