AI Meets Informality: Why Most Systems Break Outside the Lab
14 February 2026
AI in Everyday Contexts
margaret-hammond
Data Infrastructure
Ghana
AI Strategy
Big Data
Digital Sovereignty
When a map demands a street name but the driver knows only the 'Blue Kiosk', the system fails. We explore 'algorithmic cruelty' and why AI must learn to navigate the relational space of African cities.
Picture a ride-hailing driver navigating downtown Accra. On his dashboard, a bold blue dot insists he has reached the pickup point. The passenger texts: "Where are you?" After three minutes of frustrating geographic hide-and-seek, the passenger texts again: "Chairman, forget the map and turn right at the blue kiosk at the junction."
This is not just a one-off navigation glitch; it is the daily reality for many across Africa caught in the rigidity trap. It is an ongoing systemic collision in which Western-designed AI systems keep crashing because they assume fixed addresses and stable data, ignoring the fluid, informal, and adaptive nature of everyday African life.
Unfortunately, by failing to understand the real context, the system punishes the driver who smartly relies on the blue kiosk to reach the pickup point. This is what Nithya Sambasivan terms algorithmic cruelty: the harm done when automated systems operate without context or empathy. The system interprets the driver's reliance on visible landmarks as going off-route or wasting time. The cruelty then shows up as lowered driver ratings, difficulty getting rides, and, at worst, account suspension.
The confusion lies in the gap between absolute space and relational space. GPS navigation, largely Western-designed, relies on absolute space: standardized maps and precise coordinates. It struggles in African neighbourhoods, where signals are disrupted by high-rise interference and streets turn without markings. Rather than admit it is lost, it confidently guesses, and the guesses seldom work. This is why people rely instead on the blue kiosk, the koko seller, or the light pole. These social landmarks are hard to abandon in an environment where everyone sees them. Relational space will continue to dominate terrain the GPS barely recognizes.
The bigger problem is that this mismatch keeps scaling. It is no longer just ride-hailing drivers missing pickup points; these rigid, Western-trained systems are being integrated into the very infrastructure of everyday African life, now determining drop-off points, who qualifies for a loan, and even a person's legal identity.
The African economy is largely powered by micro-entrepreneurs and small businesses, which contribute roughly 80% of employment and have a growing demand for finance. While these businesses generate large cash flows daily, they are denied credit by algorithms that rely on traditional requirements: bank statements, tax records, and digital inputs that rarely exist in a cash-dominant world. These business owners not only miss opportunities to grow; they pay for the systems' blindness to the informal reality of their business.
A credit-scoring algorithm trained on Western "absolute" data will, like the GPS, fail to see the Makola trader. To a model built on a Silicon Valley dataset, a market trader without a bank statement is invisible. In her own economy, however, this woman is an economic pillar. Rejected by an AI that trusts only paper, she resorts to loan sharks. These predatory lenders understand the informal nature of her business and base their terms on relational data: they value her by the physical stability of her stall and her social standing. A credit-scoring AI that rejects the market trader does not just lose a customer; it ignores her undocumented wealth.
An algorithm's confidence in a PDF bank statement over the long-standing reputation of a Makola trader is a classic design flaw in the Human-Computer Interaction for Development (HCI4D) field. HCI4D emphasizes designing systems sensitive to context and to users' needs and behaviour, yet current models ignore this. A few pages of a PDF bank statement capture only a snippet of formal cash flow. Prioritizing that over her social collateral, built through years of steady business and the collective memory of her market peers, is the system failing to capture her true economic statement: one that is verifiable on the ground, unlike a document that can easily be falsified.
Worse still, systems that assume a one-to-one relationship between devices and humans get confused when they encounter the communal nature of African life. According to the GSMA Access to Mobile Services and Proof of Identity report (2021), about one in four women do not have a SIM card registered in their own name, often sharing devices with members of their household. Because many of these systems require a fixed digital identity to grant access to services like credit, women and children, who form the majority of those without a SIM registered in their name, are left to pay the "rigidity fine" for a system designed away from their reality.

The prevalence of proxy users and shared SIMs reflects the communal lifestyle of African society. The Makola trader with consistent business payments on her mobile money account may find her record absorbed by the gambling activity of the son who shares her phone and SIM. In more extreme cases, a credit-scoring AI may also pick up the "high-risk noise" of a neighbour who borrows the phone to check her outstanding balance on a predatory loan app. Unable to differentiate the contradicting behaviours of all three users, the AI lowers the trader's score. She is left with a distorted digital footprint created by a system never designed to understand shared devices.

These systems are often trained on "toy datasets": standardized Western data that fails to represent the complexity of the African environment. When such rigid systems collide with reality on the ground, the failure is usually hidden from users by human middleware working behind the scenes. Ironically, what is sold as cost-saving automation demands a human workforce to fix its many crashes. Mobile money agents on the streets of Accra and M-Pesa agents across Kenya have become the indispensable middleware for a system designed to automate the movement of money.
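The shared-SIM distortion can be made concrete with a toy sketch. The scorer, the risk weights, and the transaction categories below are all illustrative assumptions, not any real provider's model; the point is only that a model that treats one SIM as one person lets the son's and neighbour's activity drag down the trader's score.

```python
# Hypothetical sketch: a naive scorer that treats one SIM as one person.
# Categories, weights, and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Txn:
    user: str      # true user -- invisible to the model
    category: str  # "trade_sale", "betting", or "loan_app"
    amount: float

RISK_WEIGHT = {"trade_sale": -1.0, "betting": 2.0, "loan_app": 3.0}

def naive_sim_score(txns):
    """Score the SIM as if a single person made every transaction."""
    avg_risk = sum(RISK_WEIGHT[t.category] for t in txns) / len(txns)
    return min(1.0, max(0.0, 1.0 - 0.2 * avg_risk))  # higher = more creditworthy

txns = [
    Txn("trader", "trade_sale", 120.0),
    Txn("trader", "trade_sale", 95.0),
    Txn("trader", "trade_sale", 140.0),
    Txn("son", "betting", 15.0),        # gambling on the shared phone
    Txn("neighbour", "loan_app", 0.0),  # borrowed phone, loan-app check
]

# The trader's own transactions would score well on their own...
trader_only = naive_sim_score([t for t in txns if t.user == "trader"])
# ...but the shared SIM mixes in other users' "risk" and lowers her score.
shared = naive_sim_score(txns)
```

A system aware of shared devices would have to segment sessions by user before scoring, rather than averaging everything attached to one SIM.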
Though they can barely hide in their temporary yellow and red containers, the automated system would collapse without them. These agents keep filling the gaps left by rigid algorithms.

The way forward is not taking the blue kiosk away, nor waiting for every lane to be marked and named. Rather, instead of algorithms tuned to the precise coordinates of a street in London or New York, we must build ones that speak the proximity language of the African neighbourhood and blend in the social landmarks of African streets. Imagine a GPS trained to recognize the picture of a blue kiosk or a light pole in real time, not just coordinates these lanes have never heard of. Likewise, instead of leaving a market trader's loan eligibility to a credit-scoring AI that depends on a PDF bank statement, design a system that counts the peers, suppliers, and community leaders who can vouch for her. A system that understands the social standing of a trader who has operated the same Makola shop for two decades, generating steady cash flows, sees her true creditworthiness. A system that treats her susu or chama contributions as an economic trail of disciplined commitment, not "off-book" activity. This is not asking for a system that creates financial loopholes, but for one that sees the credibility of a trader with traces on the ground.
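What a relational credit signal might look like can also be sketched. The weights, field names, and thresholds below are illustrative assumptions only: tenure at the stall, community vouches, and susu payment discipline stand in for the PDF bank statement.

```python
# Hypothetical sketch of a relational credit signal.
# All weights and field names are illustrative assumptions, not a real model.
from dataclasses import dataclass

@dataclass
class Trader:
    years_at_stall: int
    vouchers: list            # peers, suppliers, leaders who vouch for her
    susu_payments_made: int   # contributions actually paid
    susu_payments_due: int    # contributions that were due

def relational_score(t: Trader) -> float:
    tenure = min(t.years_at_stall / 20, 1.0)   # 20 years = full marks
    social = min(len(t.vouchers) / 5, 1.0)     # 5 vouches = full marks
    susu = t.susu_payments_made / max(t.susu_payments_due, 1)
    # Weighted blend; susu discipline weighted highest as a payment record.
    return round(0.3 * tenure + 0.3 * social + 0.4 * susu, 2)

makola = Trader(
    years_at_stall=20,
    vouchers=["supplier_A", "peer_B", "peer_C", "queen_mother", "supplier_D"],
    susu_payments_made=48,
    susu_payments_due=50,
)
score = relational_score(makola)
```

The design choice is the point, not the numbers: every input here is verifiable on the ground by people who know the trader, where a PDF can be falsified in minutes.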
Key Takeaways:
- AI design must adapt to the vast diversity of African populations and infrastructures rather than applying a universal standard.
- To be reliable and generalizable, systems must account for the unstructured informal data on the streets.
- We need flexible systems that recognize local realities, like a map that identifies a "blue kiosk" or an AI that values a trader’s social reputation.
- The ultimate goal is to build AI that deeply understands and reflects the specific context of African life.