Folk Tradecrafts, Black Boxes and Caches: A Request for Algorithmic Legibility in Gig Work

Neil, a private hire platform driver, shared this ‘secret’ to getting more jobs during an interview with me: clearing his phone’s cache daily. Another ‘secret’, shared by another driver, was to carry multiple phones, each running a single gig work app, instead of having all the apps on the same smartphone. As he put it, “If one phone, they will compete with each other!”.

These ‘secrets’ are two of many folk theories, or folk tradecrafts, that have surfaced in an ongoing four-year ethnographic study of gig workers by myself and other researchers at the Institute of Policy Studies. The study is part of a more extensive project, supported by a Social Science Research Council Thematic Grant, examining young workers in precarious jobs in Singapore. Folk tradecrafts in gig work are ground-up ideas and strategies that emerge as workers attempt to figure out, and thrive within, the platforms’ competitive and often opaque algorithmic management systems.

Folk tradecrafts are not peculiar to platform workers in Singapore. There have been many reported examples from around the world of platform workers trying to figure out and game the system. From relatively simple ideas like Neil’s to more sophisticated methods – delivery riders in the US hanging smartphones in trees to get more work, and Indonesian workers using multiple accounts, known as joki accounts, on the same platform – workers worldwide face similar challenges in making sense of seemingly inaccessible systems, and respond accordingly.

In Singapore, gig workers are an undeniable part of our everyday urban experience. Platform drivers alone “outnumber taxis by a mile” on the island. Our study also suggests that platform workers come from varied backgrounds, from those with lower educational qualifications to those with diplomas and higher. Whatever their backgrounds, many – especially those who depend mainly on such work (also referred to as full-timers) – face similar challenges related to the lack of worker benefits and protections.

Our reports highlighted aspects of the job’s precarious nature and were cited in the Advisory Committee on Platform Workers’ recommendation report to the Singapore government. In November 2022, the government accepted the committee’s recommendations to provide better job protections for gig workers. As a result, workers will get better insurance protection, representation, and even CPF contributions. While these are substantive policies intended to safeguard the interests of workers, we should also continue to examine other unique aspects of the gig economy, like the impact of opaque algorithmic management systems, to protect these digital workers.

THE NEED TO ADDRESS ALGORITHMIC MANAGEMENT

Gig workers are also digital workers, strongly influenced by platforms and their sophisticated algorithms, which shape their earnings and aspects of their behaviour. Academics refer to this as algorithmic management: software algorithms performing the functions of a manager.

However, crucial matters like fare prices, trip allocation, and the factors influencing them are often unclear to workers. Getting basic answers is harder still “when your boss is an algorithm”, as it is for platform workers.

Algorithms are commonly associated with terms like Artificial Intelligence (AI), Big Data and Machine Learning, which underpin technologies we use daily. From powerful applications like the AI resume screening tools used by human resource (HR) departments of major organisations, to ChatGPT, which is available to practically anyone, their usage is becoming ubiquitous. Yet many – myself included – do not really know how they function.

There has been much enthusiasm about how algorithms are changing our lives for the better, and the positive changes they have brought are undeniable. Nevertheless, hearing about algorithms’ adverse effects is also common. The existence of groups like the Algorithmic Justice League, set up to “illuminate the social implications and harms of artificial intelligence”, points to algorithms’ adverse effects – actual or potential – and the need for oversight.

However, these algorithms are not legible to everyday stakeholders like platform workers, who depend on the platforms to earn a living: workers do not know how trips are assigned or how earnings are calculated.

Moreover, the absence of qualities that make these algorithms understandable, or legible, makes it very difficult to govern and audit them to determine fairness and adherence to ethical practices. Without transparency, each is an inscrutable black box affecting, in this case, gig workers – a sizeable swathe of Singapore’s labour market.

REDESIGNING FOR TRANSPARENCY AND LEGIBILITY VIA XAI

Employing frameworks and principles from the emerging field of explainable artificial intelligence (XAI) might help organisations make their AI systems more legible and accountable. XAI-engineered systems allow relevant stakeholders to understand AI outputs and verify that they meet regulatory standards. This would help build trust between humans and machines – or, in this case, between workers and platform providers. There is also evidence to suggest that digital trust can increase revenues for businesses.

By disclosing the decision-making processes of AI systems, XAI promotes transparency. Gig workers could potentially comprehend why they were given particular jobs, how their performance is assessed, and how their rates are set.
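To make this concrete, here is a minimal sketch of what an explainable fare decision could look like. It is purely illustrative: the component names, rates, and structure are my own assumptions, not any platform’s actual pricing model. The point is that the worker sees an itemised breakdown rather than a single opaque number.

```python
# Hypothetical sketch of an itemised, explainable fare calculation.
# All component names and rates are illustrative assumptions,
# not any real platform's pricing model.

def explain_fare(distance_km, duration_min, surge_multiplier=1.0):
    """Return the total fare and a breakdown a worker could inspect."""
    components = {
        "base fare": 2.50,
        "distance": round(0.60 * distance_km, 2),  # assumed $0.60 per km
        "time": round(0.25 * duration_min, 2),     # assumed $0.25 per min
    }
    subtotal = sum(components.values())
    components["surge adjustment"] = round(subtotal * (surge_multiplier - 1), 2)
    total = round(subtotal + components["surge adjustment"], 2)
    return total, components

total, breakdown = explain_fare(distance_km=8.0, duration_min=20, surge_multiplier=1.2)
for item, amount in breakdown.items():
    print(f"{item}: ${amount:.2f}")
print(f"total: ${total:.2f}")
```

A breakdown like this would let a driver check each component against the trip they actually drove, which is exactly the kind of legibility the opaque systems described above lack.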

XAI can also assist with accountability in disputes. If a gig worker feels they have been treated unfairly or received an unjustified rating, the system can provide a clear explanation and the relevant data. This openness makes it simpler for workers, or their representatives from relevant associations, to address complaints and obtain just compensation.

Also, by understanding clearly how fares and ratings are calculated, a worker would better understand what has worked and what has not. This would help them hone their skills and improve their overall performance.

DATA COLLECTIVES AND ASSOCIATIONS

Beyond pushing for algorithmic legibility, some labour scholars have argued that data cooperatives would empower platform workers by helping them compare fares for similar routes and distances, letting workers know whether they are being fairly paid. One example of such a data collective is Driver’s Seat Coop, started in the US. Such initiatives collect data through a separate app and help workers better understand their working conditions: their actual hourly rate, how much they make after expenses, and the best times to log in to particular platforms to maximise pay. Other organisations, like the Worker Info Exchange (WIE), a platform started by a British driver, respond to the information asymmetry said to exist in the gig economy by helping workers retrieve their personal data from various platforms. Associations in Singapore set up to represent the interests of private hire drivers and delivery riders might want to take a leaf from such collectives, so that data is available in times of dispute and can be used to give drivers personalised tips on bettering their earnings.
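As a rough illustration of how pooled data could answer the “am I being fairly paid?” question, here is a hypothetical sketch: a worker’s fare is compared against the median of fares that other members reported for trips of similar distance. The sample fares, the function name, and the 10% tolerance threshold are all illustrative assumptions, not taken from any actual collective.

```python
# Hypothetical sketch of a data-collective fairness check: compare one
# worker's fare against pooled fares for trips of similar distance.
# The sample data and the tolerance threshold are illustrative assumptions.
from statistics import median

# Fares reported by other members for comparable ~8 km trips (assumed values)
pooled_fares_8km = [13.80, 14.20, 14.76, 15.10, 13.50, 14.90]

def fare_vs_pool(my_fare, pooled_fares, tolerance=0.10):
    """Flag a fare that falls more than `tolerance` below the pooled median."""
    benchmark = median(pooled_fares)
    shortfall = (benchmark - my_fare) / benchmark
    return benchmark, shortfall > tolerance

benchmark, underpaid = fare_vs_pool(12.00, pooled_fares_8km)
print(f"pooled median: ${benchmark:.2f}, flagged as underpaid: {underpaid}")
```

Even a simple benchmark like this gives a worker, or an association acting on their behalf, something concrete to bring to a dispute.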

In summary, significant progress has been made in giving gig workers in Singapore better protections, benefits, and representation. We do not yet know how these rules will be implemented or how increased costs will affect things like fares, but they seem consistent with making the workers’ jobs safer and more viable in the long term. On top of this, I believe we should also address issues related to opaque algorithmic management systems and work towards making them more accountable. This is so that a gig worker like Neil will finally know why he earns the amount he does, whether what he earns is fair, and whether the daily clearing of his phone’s cache actually works.


Shamil Zainuddin is Research Associate at the Institute of Policy Studies (IPS), National University of Singapore. Before IPS, he was Senior Design Ethnographer at a tech multi-national company. He believes in doing the work to make every day easier, especially for disadvantaged communities.

This piece was first published in The Karyawan on 20 July 2023.

