In some ways, machine learning and psychology can seem worlds apart: one is grounded in algorithms, the other in human minds and behaviour. Yet at the core of these two seemingly opposite disciplines is the same need for a healthy dose of emotional intelligence to understand people. Emotional intelligence is pivotal not only for strengthening human relationships, but also for informing a customer-centric AI product.
Central to the product development process is the question: who are we building for? The starting point is people, and prioritising their needs. From user stories to developing personas, it is about recognising where people are coming from and how their identity shapes the way they will interact with the product.
A product shouldn’t be exclusive or confusing. We don’t want to perpetuate a digital divide in the tech space; instead, we aim to make Pomegranate as intuitive and inclusive as possible. It’s too easy to default to the infamous tech monoculture and carry the implicit biases that persist in the real world into an AI model. At DataSine, we believe that the intentions of the people behind the algorithms and the product directly shape practice. We have worked to mitigate bias by actively recognising the value of diversity in building more inclusive systems and AI, and as a team we make a deliberate decision to put empathy, diversity and respect at the forefront of our product design, as well as our machine learning models.
In practice, this translates into intentionally investing organisational capacity into developing a participatory design approach that actively seeks community input. Our product Pomegranate has been guided by user voices. Our users have been pivotal in directing our next steps. From understanding the human needs of the B2B user to anticipating the experience of an email recipient, Pomegranate has been informed by people for people.
During a 1-to-1 conversation with a user today, I could see a coworker’s head poking through the phone booth window to gauge how things were going, and when I returned to the office, everyone was eagerly reaching out to hear about the user’s experience. We aren’t just open to feedback; we rely on it to drive our sprints.
One such exercise was an empathy-mapping activity with representatives from the Tech and Data Science teams.
Each team member was asked to step outside their comfort zone and develop a persona – with a different gender identity, professional background and set of experiences – to better understand how another person might experience our product. By stepping into the shoes of our end users, we aimed to ensure that what we are building works for the end user, whoever that might be. Central to this exercise was recognising that no two users are the same, and that the best product supports a great experience for all.
Researcher Brené Brown has described empathy as human connection and the “ability to tap into our own experiences in order to connect with an experience someone is relating to us.” Our product was born out of empathy. DataSine’s goal has been to improve customer experience, and that means beginning by creating a product with people in mind. In a digital era in which communication has become about efficiency, scale and maximum reach, the human aspect is often forgotten. Our driving mission is grounded in making communication more meaningful through personalisation: enabling digital communication to drive connection rather than disconnection. Fundamentally, we want people to feel respected for their complexity, and that is reflected in our product design.
At CogX, Mary Wallace of IBM said that actively mitigating human bias in machine learning models involves cultivating diverse teams. Our small but growing team of 15 spans 9 nationalities to begin with (British, Indian, Greek, Russian, Ukrainian, Belgian, American, Canadian, Bulgarian, Hong Kongese). From our C-suite asking for input on writing gender-inclusive job descriptions for software engineers, to our Data Science team modelling carefully so that our product does not perpetuate microaggressions, I have seen that for DataSine diversity isn’t just a buzzword or a checkbox. As a woman of colour working in tech, I found this refreshing: professionally, I was affirmed by the product’s iterative cycle of actively listening, reflecting and learning; personally, I know my voice is valued in that process.
Our collaborative culture and horizontal team structure mean that every team member’s ideas matter, and that each of us has been able to directly inform our product’s design through daily conversations with the tech team. I’ve been able to bring insights from user-experience conversations and qualitative stakeholder interviews directly to our Chief Scientist, to ensure that what we are building is human-centric and gives our users helpful insights for creating more thoughtful campaigns. Our collective humanity also drove our decision to consciously design with privacy in mind. Going beyond our own ‘right to be forgotten’ in the digital era, we made certain that our product protects individual privacy, constantly asking ourselves how we would want our own data to be managed. Our CTO Chris, knowing how passionate I am about accessibility and inclusivity issues, asked me to help write Pomegranate’s Code of Conduct alongside other voices in our company traditionally excluded from the tech industry.
It takes a village to raise a child; in the same way, it takes a community – across technology, business, data science and industry – to build a product.