Somali Teriyaki and the Ethics of Automated Decisions
Three lessons from 20 years ago that we still need to learn about automation
If you have traveled to East Africa, you will know two things: the wonderful people and the extraordinary food. The region’s strategic location along major trade routes infused its cuisine with Arab, Bantu, Indian, Italian and Turkish influences.
I haven’t been to Somalia: it’s a difficult country to visit. But in Seattle, we have a lively Somali community, and they have brought the best of their food and, even better, their eclectic delight in new experiences. Visit Salama in Tukwila to see what I mean. Their Somali Chicken Teriyaki is famous, not just as a curiosity. And just to complete the Seattle experience, the espresso is good.
Delightful as Somali food is, what does it have to do with the ethics of automated decisions? As it happens, there’s a lot to learn, but I need to take you back to 2001, a few months after the awful events of September 11.
Good cooking starts with good shopping
As the English food writer Jane Grigson said, "Good cooking starts with good shopping." She also knew that this did not mean expensive shopping. Much of the best food in the world is made with the cheapest ingredients, because people who cannot afford much, but still live in a culture of good food, learn to make the best of them - and economically, too.
The Somali community in Seattle - especially back in 2001 - had many refugees from seemingly endless war. They mostly lived on food stamps, which were adequate but hardly generous. So, Somali families often pooled their resources to buy food in bulk. This was partly due to the cultural practice of sharing meals, especially large communal dishes, a staple in Somali cuisine. And they found bulk buying to be generally more cost-effective, allowing families to get more food for their money.
In many cases, multiple families or members of the extended community would go shopping together, sharing a van to go to the local Somali or East African grocery store. This reduced transportation costs and made bulk buying easier, especially for those with limited access to transportation.
Shopping in groups had other advantages, too: it allowed for shared decision-making about the best products to buy and pooled knowledge of prices and product quality.
For Somali families, efficiently using food stamps (from the Supplemental Nutrition Assistance Program - SNAP) was crucial. By pooling their resources, they could afford larger quantities of staple foods like rice, meat (especially halal meat, which can be more expensive), and spices. They did so by negotiating bulk prices with the storekeeper. One family would buy the rice, another the beans, another the meat and so on.
I feel triggered
The SNAP system worked through terminals in the store, which processed transactions. As you would expect, the system had fraud detection algorithms, and, as usual, these algorithms were trained on existing data. The SNAP system therefore expected transaction patterns that matched the shopping habits of a typical low-income American family: varied transaction amounts, because different items are bought each time depending on need; purchases spread out over days or weeks, reflecting typical household consumption rates; and small to moderate transaction amounts.
Somalis, intentionally trying to get the best value (for the taxpayer, too!) and the best nutrition for their families, did not shop that way. Their patterns included bulk purchases, group shopping, and even-dollar amounts, because transactions in round numbers ($100, $200) make sense in negotiated bulk buying.
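The mismatch can be sketched with a toy anomaly detector. To be clear, this is an invented illustration of the general technique, not the actual EBT fraud system: the amounts, the z-score method and the threshold are all my assumptions. The point is that a model trained only on "typical" shopping will flag a legitimate bulk purchase simply because it is unusual.

```python
from statistics import mean, stdev

# Hypothetical "training data": small, varied weekly shops of a
# typical household (invented amounts for illustration).
typical = [23.47, 41.12, 18.90, 35.05, 27.63, 52.18, 30.41, 44.76]

mu, sigma = mean(typical), stdev(typical)

def anomaly_score(amount: float) -> float:
    """Z-score of a transaction against the training distribution."""
    return abs(amount - mu) / sigma

def is_flagged(amount: float, threshold: float = 3.0) -> bool:
    """Flag anything far outside the patterns the model has seen."""
    return anomaly_score(amount) > threshold

# An ordinary weekly shop passes; a pooled, negotiated, round-number
# bulk purchase looks extreme - even though it is thrifty,
# perfectly legitimate shopping.
print(is_flagged(34.20))   # → False
print(is_flagged(200.00))  # → True
```

The detector is doing exactly what it was built to do: the failure is in reading "unusual" as "fraudulent".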
You can guess what happened, but you probably won’t guess all of it.
The shock and the aftershock
The unusual transactions detected by the automated fraud detection systems in the Electronic Benefit Transfer (EBT) system resulted in sanctions against the Somali shopkeepers.
The primary sanction was the disqualification of the Somali grocery stores from accepting food stamps at all. This meant they could no longer process transactions using the EBT cards, which was a significant part of their business, given that many of their customers relied on SNAP benefits to purchase groceries. The disqualification not only devastated the shopkeepers but also had a profound impact on the Somali community. The community’s access to food that met both their cultural and dietary needs was significantly reduced. And Somali customers, already on tight budgets, could no longer shop efficiently.
And how did the local press report these events? Accurately, but accuracy is not neutral …
FBI raids Somali stores Maka Mini Mart and Halal Meats in Seattle
USDA disqualifies three Somalian markets from accepting federal food stamps
Remember, this happened just a few months after September 11th, 2001. You can imagine what the anti-immigrant, anti-Islamic, anti-benefits crowd (they mostly overlap) made of it.
Thankfully, the decision was reversed after a few months, but not without a great deal of hurt, damage and loss.
Don’t blame the algorithm
I was very much interested in this story at the time. I was myself a new immigrant. I loved East African food. I also worked with the data mining and research teams at Microsoft.
Even now, it is fair to say the algorithm got it right - but as an anomaly detection system, not a fraud detection system. These were indeed unusual behavior patterns, probably quite unlike the data on which the model had been trained. But if you treat an anomaly primarily as an indicator of fraud, you overlook the possibility that it represents a legitimate new use case you just don’t know about yet.
So, the first fault lay not in the algorithm but in the users’ mindset, which led them to the wrong conclusion.
From this mess, I took away three important lessons. Not only are they still relevant today 23 years later, but I believe they are even more important as we increasingly model and automate our businesses and work.
Three lessons
All data is a metaphor for something in the real world, but only a metaphor.
Our decisions must be made with humility, building into the process the simple possibility that we may be wrong. We may do harm and thus need to reverse or amend our decision.
We must be deeply engaged with the subjects of our work.
Data as metaphor
We must always remember this when dealing with data: everything is a metaphor, or at best an incomplete, inexact and crude proxy for something else in the real world.
There are no customers in your customer databases, only records of customers. There are no transactions in your general ledger, only records of transactions.
Something as simple as a price, for example, is not a property of a product. No matter how hard I look, I can’t tell the price of my watch by examining it. The retail price was a result - a property - of the seller’s processes, and the price I paid was a property of my buying process. But we call it a property of the watch as a convenience.
Similarly, the sum of a transaction is not just the amount paid; it is a proxy for the process of selling and buying, which may be simple or complex.
So bear in mind that data represents real-world phenomena but does not embody them entirely: it can only provide a simplified abstraction of reality. As AI systems increasingly make decisions based on data interpretations, it is essential to be aware that data might not capture the full spectrum of human behavior and cultural nuances to avoid misinterpretations and unintended consequences.
Decision-making with humility
We will soon discover that, with pervasive AI, decisions can have far-reaching impacts on people’s lives. Building systems with mechanisms to question their conclusions and correct errors is a vital safety net. It means designing AI systems not just for efficiency but with built-in safeguards and ethical checks that recognize the potential for harm and provide clear pathways for remedy and accountability.
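One way to build that humility in is to treat an anomaly as a question rather than a verdict. The sketch below is a hypothetical design, not any real benefits system: high-scoring cases are routed to a human reviewer instead of triggering an automatic sanction, and every decision is logged and marked reversible so there is a pathway for appeal. The class names, threshold and workflow are all assumptions for illustration.

```python
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    APPROVE = "approve"
    HUMAN_REVIEW = "human_review"  # never an automatic sanction

@dataclass
class Decision:
    transaction_id: str
    score: float
    action: Action
    reversible: bool = True  # no automated decision here is final

# Audit trail: every decision is recorded so it can be
# questioned, appealed and, if necessary, reversed.
audit_log: list[Decision] = []

def decide(transaction_id: str, anomaly_score: float,
           review_threshold: float = 3.0) -> Decision:
    """Route unusual cases to a person; log everything."""
    action = (Action.HUMAN_REVIEW if anomaly_score > review_threshold
              else Action.APPROVE)
    decision = Decision(transaction_id, anomaly_score, action)
    audit_log.append(decision)
    return decision

d = decide("txn-001", anomaly_score=14.7)
print(d.action)       # → Action.HUMAN_REVIEW
print(d.reversible)   # → True
```

The design choice is the safety net: the system's worst case is an extra human review, not a community cut off from its grocery stores.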
Deep engagement with subjects
If automated systems are to benefit individuals or communities, we must understand them deeply, not superficially. So, cultural engagement with the subjects of our work is essential. However, we must also be aware that the interaction between technology and behavior is dynamic and adaptive.
As technology is integrated into daily activities, it invariably influences and alters user behaviors, which can lead to unintended consequences. If traffic is busy, a GPS routing system may take you down an otherwise quiet and little-used suburban street. But because traffic is busy, you are making up for lost time and so drive faster along a street where it is even less appropriate than usual! Recognizing these feedback loops is vital, as it highlights the necessity for technology to be responsive, adapting to changes in behavior that its introduction may precipitate.
I don’t know for sure, but I can imagine different ways of distributing benefits, which would have had different behavioral results for Somali customers. Coupons for products biased toward an Anglo-American consumer might have been traded; store credits might have accumulated over a longer period than SNAP payments, enabling even larger purchases; and cash might have been used within the community not only to buy ingredients but also as a contribution to communal meals or home-made specialties.
Only deep engagement can make this work humanely and efficiently. We need to be as adaptive, imaginative and culturally rich as the chef who invented Somali Chicken Teriyaki.
I’m hungry now, but I’m a vegetarian. So I want some delicious Cagaar and Muufo.
Donald, this post is instructive for all of us, whether working on computer intelligence or on our own old-fashioned organic intelligence. You’ve highlighted a key categorical boundary on the continuum of decision-making: Approach/Avoid response vis-a-vis the Novel.
First-order decision-making (whether silica or carbon) starts with rules that relate an objective to the norm or to the familiar. At best it ignores the unfamiliar, and often, as in your example, it will reject the unfamiliar. It is primed to AVOID anything new, unusual, or unfamiliar.
Higher-order decision-making is based upon learning, which requires curiosity about the stuff beyond the familiar that might be related to the objective. Curiosity-driven learning sees opportunity in the unusual, leading to an APPROACH instinct.
So, to move beyond the primitive level of decision-making (in ourselves and in our synthetic progeny) we need to develop a bias toward the novel - an APPROACH bias - as captured in one deceptively powerful little sentence:
"I wonder what that is about."
This is a wonderful story and a great example of learning and applying lessons, and of iterative decision-making. And, given that metaphor, humility and deep engagement remain, for now at least, beyond the computational statistics that we hype as AI, it shows the abiding need for humans to make the decisions beyond the automated process.