Tue, 14 Oct 2025
Financial services like to believe they run on numbers, not narratives. Yet as a panel I moderated at Sibos in Frankfurt on “Culturally aligned AI” made clear, money moves through culture first.
From how communities think about obligation and reciprocity to whether trust sits in credit scores or social ties, payments and money are never culturally neutral. If banks want AI that works globally, they must design for local meaning: deliberately, respectfully and with the right guardrails.
Philosopher–lawyer Ammar Younas (CEO, Ai Mo Innovation Consultants) set the scene bluntly: without understanding cultural context, you can’t really understand a financial system at all. He pointed to traditions such as Ubuntu’s community-first ethos in parts of Africa and relationship-centred approaches across Latin America and China, arguing these shape how people perceive debt, guarantees, and trust. In some places, a guarantor from your network matters more than a score; in others, social shame, not contract wording, keeps repayment on track. Product misfires in the Middle East and early peer-to-peer lending mishaps in China show what happens when systems ignore local norms. Conversely, small but telling features, like “red packet” transfers that a recipient can’t return, can encode cultural reality into UX.
The takeaway is not "do different marketing". It's to "embed culture at the manufacturing stage": in the data you train on, the features you ship, and the incentives you optimise for. Training on generic, web-scraped data will at best reproduce the centre-of-gravity culture and at worst amplify stereotypes. Assembling contextual, locally sourced datasets is essential to avoid repeating those stereotypes and to keep models anchored in local reality.
Tech ethicist Patricia Shaw (CEO & Founder, Beyond Reach Consulting) framed the institutional challenge: the goal is "precision finance", products tuned to individuals and groups without sliding into a one-size-fits-all pattern that erases context. Her pragmatic approach is a hub-and-spoke model of AI governance: universal, horizontal guardrails at the centre, rooted in human rights and common values, with decentralised spokes that adapt to local regulation, laws, norms and definitions of a "fair and good outcome".
That lets firms benchmark performance both locally and globally, and avoid the “copy an algorithm from one region to another” mindset that echoes past colonialism in digital form.
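To make the hub-and-spoke idea concrete, here is a minimal sketch in Python of how such layered guardrails might be wired up. Everything in it is an assumption for illustration: the Decision fields, the specific checks and the region codes are hypothetical, not anything the panel prescribed.

```python
from dataclasses import dataclass
from typing import Callable

# Illustrative sketch only: a central "hub" of universal guardrails that every
# deployment must satisfy, plus per-region "spokes" that add local checks and
# local definitions of a fair outcome. All names below are hypothetical.

@dataclass
class Decision:
    applicant_region: str     # e.g. "NG", "EU" (assumed region codes)
    approved: bool
    explanation: str          # human-readable reason, required everywhere
    used_features: list[str]  # features the model actually consumed

# Hub: horizontal guardrails rooted in common values, applied in every region.
HUB_GUARDRAILS: list[Callable[[Decision], bool]] = [
    lambda d: bool(d.explanation),                 # right to an explanation
    lambda d: "ethnicity" not in d.used_features,  # no protected attributes
]

# Spokes: locally agreed checks, e.g. honouring guarantor-based trust or
# stricter proportionality rules. A spoke can extend the hub, never weaken it.
SPOKE_GUARDRAILS: dict[str, list[Callable[[Decision], bool]]] = {
    "NG": [lambda d: "guarantor_network" in d.used_features],    # communal trust counts
    "EU": [lambda d: "browsing_history" not in d.used_features],  # proportionality
}

def passes_governance(decision: Decision) -> bool:
    """A decision must satisfy the hub plus its region's spoke."""
    checks = HUB_GUARDRAILS + SPOKE_GUARDRAILS.get(decision.applicant_region, [])
    return all(check(decision) for check in checks)
```

The design choice is the point: spokes can only add checks on top of the hub, never remove them, so local adaptation cannot erode the universal baseline.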
But governance can’t be paperwork alone. Trish emphasised participatory design and co-governance with the communities you serve, especially where finance is more communal than individual. It’s not fragmentation; it’s “intelligent adaptation”.
There’s a warning here, too: over-collection in the name of context becomes privacy overreach. Banks must learn what to leave out as much as what to put in. Standards work can help, and newer IEEE/ISO efforts explicitly address cultural norms and societal impacts, with annexes on how to surface cultural aspects in AI deployments.
Emmy Award-winning artist–filmmaker Lisa Russell (Founder & CEO, ArtsEnvoy.ai) argued that trust in AI will rise or fall on narrative. She asked: "if you hire auditors to check your numbers, why not hire artists to check your narratives?" That's how you ensure people see AI as empowerment, not something being done to them.
She drew a parallel between the humanitarian sector's "poverty-porn" era of well-intentioned but trust-destroying storytelling and the risk that historic, biased data in finance cements intergenerational disadvantage when fed into algorithms.
Lisa also highlighted current frictions in finance experienced by the creative economy: fast-growing, cross-border incomes; irregular cashflows; and systems ill-suited to portfolio work, which put creatives (and the broader “gig economy”) at a disadvantage. Here, culturally aligned AI can include supportive rails: from eligibility signals that reflect creative livelihoods, to payment flows that respect regulatory realities without punishing how artists and the gig economy actually work.
None of this is easy. Five challenges stood out during the discussion:
1) Data scarcity and tacit knowledge. Many cultures' wisdom about debt, dignity and reciprocity lives in tacit, non-digitised form. If your board lacks representatives from those communities, your training data likely does too, and your models will predict the majority, not the reality. The fix is fieldwork: co-creating datasets with communities and preserving indigenous knowledge responsibly, so you're not "harvesting people" but partnering with them.
2) The “one-size-fits-all” trap. Porting an underwriting model from London to Lagos because “it worked here” is a cultural own goal. Copy-paste AI recreates colonial dynamics and increases bias, even when headline metrics improve.
3) Privacy and proportionality. Context does not license surveillance. Participatory approaches must set clear "no-go" zones for personal data; otherwise, the very communities you aim to include will opt out, or be excluded, on privacy grounds (see the data-minimisation sketch after this list).
4) Legal and organisational complexity. Culture shows up in law and vice versa. Hybrid legal regimes (e.g., common-law features inside civil-law zones) can open room for culturally specific products, but they also complicate compliance, especially as ideas like “algorithmic persons” arrive at the regulatory frontier.
5) Cost and mindset. Doing this properly takes investment and cross-disciplinary time: ethnographers with model engineers, policy with product. The payoff is resilience and adoption; the cost of skipping it is misfires and mistrust.
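As flagged in point 3, a "no-go" zone is ultimately a data-minimisation rule, and one that can be enforced in code rather than left to policy documents. Below is a hedged sketch: the field names and the allowlist/no-go split are hypothetical, standing in for lists a community would agree through participatory design.

```python
# Hypothetical data-minimisation filter: only allowlisted fields reach the
# model, and community-agreed "no-go" fields are dropped unconditionally.
# The field names below are illustrative, not prescribed by the panel.

ALLOWED_FIELDS = {"income", "repayment_history", "guarantor_network"}
NO_GO_FIELDS = {"religion", "location_trail", "contact_graph"}  # set with communities

def minimise(record: dict) -> dict:
    """Keep only allowlisted fields; no-go fields are stripped even if allowlisted."""
    return {k: v for k, v in record.items()
            if k in ALLOWED_FIELDS and k not in NO_GO_FIELDS}

# Example: surveillance-grade fields never reach training or scoring.
raw = {"income": 42_000, "religion": "private",
       "location_trail": ["..."], "repayment_history": "good"}
print(minimise(raw))  # {'income': 42000, 'repayment_history': 'good'}
```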
So what can firms do today? Five practical moves emerged:
Design for cultural use cases you can name. Sharia-compliant accounts in the UK and some European banks show how a system can honour religious constraints while staying mainstream. Small UX choices, like non-returnable "red packet" transfers, recognise relational norms and reduce friction.
Make participation a core capability. Move beyond focus groups. Co-design scorecards, appeal pathways and agent behaviours with local communities, and budget for their time. Use the hub-and-spoke model to keep global consistency while adapting definitions of “good” and “fair” locally.
Treat narrative as infrastructure. Build “narrative audits” into your model lifecycle: who benefits, who carries risk, and does the story we tell align with outcomes customers actually experience? Creatives and cultural translators can stress-test communication before launch as rigorously as validators stress-test code.
Re-imagine eligibility signals. Open banking proved that new data can widen access without diluting prudence. Apply the same thinking to local communities and informal networks, using signals that reflect communal trust (e.g., guarantees and mutual-aid activity) where appropriate, while protecting privacy; a code sketch follows below.
Curate local training data ethically. Invest in collecting and stewarding culturally specific data with communities, preserving heritage while enabling better models. Done well, this both improves performance and builds shared value.
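As a sketch of what re-imagined eligibility signals could look like in practice, here is an illustrative scoring function that blends a conventional bureau score with consented communal-trust signals. The weights, field names, caps and the 850-point normalisation are assumptions for the example, not a recommended model.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative only: blend a conventional bureau score with communal-trust
# signals (guarantors, mutual-aid participation) for thin-file applicants.
# Field names, weights and caps are hypothetical, not a production model.

@dataclass
class Applicant:
    bureau_score: Optional[int]         # None for thin-file applicants
    guarantors_in_good_standing: int    # consented, verified guarantors
    months_in_mutual_aid_group: int     # e.g. rotating savings participation
    consented_to_network_signals: bool  # explicit opt-in, per the privacy "no-go" rule

def eligibility_score(a: Applicant) -> float:
    """Return a 0.0-1.0 eligibility signal from mixed conventional/communal inputs."""
    # Normalise a classic score against an assumed 850-point scale.
    base = min(a.bureau_score / 850, 1.0) if a.bureau_score is not None else 0.0
    if not a.consented_to_network_signals:
        return base  # without consent, communal signals stay out entirely
    # Communal contribution is capped so it supplements, never dominates.
    communal = min(a.guarantors_in_good_standing * 0.15
                   + a.months_in_mutual_aid_group * 0.01, 0.6)
    return min(base + communal, 1.0)

# Example: a thin-file artist with two guarantors and two years in a savings
# circle still gets a meaningful, consent-based signal.
artist = Applicant(None, guarantors_in_good_standing=2,
                   months_in_mutual_aid_group=24, consented_to_network_signals=True)
print(eligibility_score(artist))  # ≈ 0.54
```

Note the consent gate: without an explicit opt-in, communal data never enters the score, which keeps the proportionality guardrail from point 3 intact.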
Culturally aligning AI means acknowledging that culture is already in the system, explicitly in law and product design and implicitly in data and defaults, and choosing to handle it with care.
During my preparatory interview calls with each of the panellists, I asked if they thought that culturally aligning AI was both desirable AND feasible within today’s regulatory constraints.
They all agreed that it is desirable, and the good news is that much of this is feasible today. Banks already adapt products for religious compliance; regulators already create space for hybrid legal arrangements; standards bodies are already incorporating cultural annexes; and creative talent stands ready to help rebuild trust through better stories.
If the financial services industry embraces participation, precision and narrative honesty, AI can widen access and deepen relevance rather than automate yesterday’s blind spots. The destination is not a thousand bespoke systems, but a global platform that knows how to listen locally and learns to serve people the way their communities already do.
Wouldn’t you agree that is a future worth building?
Join us at The Banking Scene Art Night on November 17 in Brussels and experience first-hand the power of storytelling as Lisa immerses us in a scenario that forces us to contemplate the choices we make today as we "Reimagine the Art of The Possible".
In this deep dive, Rik and Andrew dig into some of the key points raised during the panel and discuss a couple of examples that are close to home.
We also hear directly from Lisa, who shared her thoughts on how banks could collaborate with the creative economy in order to communicate complex concepts and improve customer experience and financial literacy.
You can view the video below or follow on your favourite podcast platform here - and don't forget to subscribe!