There’s a famous case study circulated in business schools about literally building the proverbial “better mousetrap.” As the story goes, a company in the 1960s set out to do just that, invested heavily in design, and unveiled a high-tech electronic masterpiece for the market. Its technological features were unparalleled. It was guaranteed to rid homes of vermin forever and to save money in the process, since it could be reused. However, the company was shocked when it sold about as well as popsicles in the Arctic. You see, the company had invested millions in market research (on male consumers…), but no one had thought to speak to the actual users: the women who actually had to deal with the dead mice while preparing breakfast for their families in the early mornings. They could have told the company outright that they didn’t like the idea of a reusable trap that forced them to get up close and personal with the rodent to remove it, or they could have worked with the company to tweak the new design to meet their concerns. Instead, the company lost millions in investments and future sales, and consumers were left with the traditional trap: adequate, but with lots of room for improvement to the user experience.


Human-Centered Design

Given the example above, it’s no surprise that human-centered design (HCD), a creative approach to problem solving that starts with people and arrives at new solutions tailored to their lives, has been gaining momentum, even in fields like mobile gaming that are driven by technological advances.

At Kukua, we’ve taken a human and user-centered approach from the outset. For the past 18 months, we’ve embedded ourselves in rural communities and urban slums across The Gambia, South Africa and Kenya, observing dozens of classrooms, spending time with target users, and interviewing scores of parents and teachers to understand what they value and the underlying challenges to effective learning in their communities.

We tested 50+ versions of SEMA, giving 500+ children an opportunity to play, and interviewed 200+ target parents to understand their lifestyles, phone habits and views on education. Our trials in Kibera, Nairobi have produced event logs and EGRA and EGMA test results for hundreds of children over thousands of gameplay hours, giving us rich data about progression and adoption. All of this feedback has informed every single product choice in a systematic way, following the five steps outlined in IDEO’s Open Source Field Guide: Empathize, Define, Ideate, Prototype, and Test.

Empathize: User & Context Research

Despite reading as much as we could about the problem, nothing taught us more about the context than living with and working alongside our target populations. Starting in mid-2015, our co-founder Lucrezia traveled to rural Gambia and stayed for a couple of weeks in a hut with a family living on less than a dollar per day. There, she carried out the initial user and contextual research that informed all of our early product decisions.

Over the next few months, we dove deep into our users’ context to understand the deep-rooted problems that prevented so many children from learning to read, write and count. To get a much more granular understanding, we sat at the back of dozens of primary school classrooms in different villages across The Gambia and Kenya, as well as in slums across Cape Town and Nairobi. Lucrezia led focus groups with children both in and out of school to understand their passions and dreams, and what excited them the most. She interviewed 50+ families to understand their hopes, fears and needs, and the role of education in their lives. Seeing that teachers were a key part of the challenge, we also interviewed teachers to better understand their experience of becoming teachers, the methodologies and pedagogies they use, and the successes and failures they were seeing. Finally, Lucrezia interviewed a random sample of people across slums and villages to understand their context: family dynamics, employment and income, religion and religious influence, societal systems, etc. This research was invaluable in helping us understand where the system failed and what opportunities we should solve for.

In parallel, we started getting a much better sense of how users interact with technology. We tested four existing literacy training apps and games in Mbollet-Bah, The Gambia, with children who had never used technology before, and repeated those experiments later on in urban slums in Cape Town, South Africa, as well as Nairobi, Kenya.

Define

After the research phase, we spent months synthesising our findings and defining the exact problems we wanted to solve. We identified five specific challenges to solve for:

  1. Complex literacy acquisition methodology
  2. Lack of personalized and immediate feedback
  3. Lack of individual practice time
  4. Lack of engagement
  5. Lack of localized content

 

Ideate

Our product slowly began to take shape as learnings from our groundwork informed our thinking. From the get-go, we saw the importance of building:

  • A game, to solve for engagement and provide lots of practice in an environment where practice is fun and learners get regular corrective feedback through game dynamics
  • Adaptive software that could serve appropriate content based on each learner’s level (see the sketch just after this list)
  • A local narrative for the game, inspired by local stories and local content
  • A carefully designed literacy path to run as the backbone of the software
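
To make the adaptivity point concrete, here is a minimal Kotlin sketch of how a game might pick the next exercise based on a learner’s recent accuracy. It is purely illustrative: SEMA’s actual code is not public, and every name, data model and threshold below is an assumption made for the example.

```kotlin
import kotlin.math.abs

// Hypothetical sketch only: SEMA's real adaptivity logic is not published,
// so the data model, names and thresholds here are invented for illustration.
data class Exercise(val id: String, val skill: String, val difficulty: Int)

class AdaptivePicker(private val pool: List<Exercise>) {
    // Rolling accuracy per skill (e.g. "letter-sounds", "counting"),
    // updated after every attempt with an exponential moving average.
    private val accuracy = mutableMapOf<String, Double>()

    fun record(skill: String, correct: Boolean) {
        val prev = accuracy[skill] ?: 0.5
        accuracy[skill] = 0.8 * prev + 0.2 * (if (correct) 1.0 else 0.0)
    }

    // Serve harder items when the learner is succeeding, easier ones when not,
    // so every child gets content close to their current level.
    fun next(skill: String, currentDifficulty: Int): Exercise? {
        val acc = accuracy[skill] ?: 0.5
        val target = when {
            acc > 0.8 -> currentDifficulty + 1
            acc < 0.4 -> maxOf(1, currentDifficulty - 1)
            else -> currentDifficulty
        }
        return pool.filter { it.skill == skill }
            .minByOrNull { abs(it.difficulty - target) }
    }
}
```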


Prototype

Our very first prototype was a series of school exercises and content that we adapted to an app format, with some gamification elements. We tested it in the fall of 2015 in Kibera, Nairobi. As expected with such an early version, the results were revealing. We tested this prototype alongside other literacy training applications and games, and saw that children consistently turned to some of the more engaging games we had on each tablet after only 10 minutes or so on our app. While we were more confident in our academic and pedagogical approach, engagement was clearly a challenge. This informed our major shift away from gamified content to proper game-based learning.

From then on, we decided to build a hyper-engaging experience – and everything we designed for SEMA had to be engineered for engagement as well as literacy and numeracy outcomes. As we started to brainstorm, we took all of our initial sketches and narrative drafts to primary school-aged children in Nairobi and South Africa to see how those ideas resonated with them. Children provided us with invaluable input throughout the build process – and all design and product decisions were made taking their feedback into account.

In total, we prototyped and tested 50+ versions of the game and all of its component parts with 500+ children before assembling the entire app. We then tested the app as a whole and kept iterating on it based on the results. The app is built directly on the feedback from those testing sessions and on what the children themselves experienced.

Test

While our European engineering and design team was hard at work building SEMA, our local Kenyan team was equally hard at work running continuous testing sessions over the past 1.5 years in slums around Nairobi, most notably in Kibera and Mathare. The test design set three objectives:

  1. Gather early usage data and anecdotal evidence to explore the effectiveness and inform the design of the game
  2. Build a continuous relationship with local communities and familiarize children and their parents with SEMA
  3. Build our local capacity, systems and experience for running high quality trials in the future

In addition to testing the game with 500+ children in an informal way on a weekly basis – in places like Red Rose School in Kibera, AMREF’s Dagoretti Rehabilitation Center, Rescue Dada Center, Mother House Centre, Agape Hope Children’s Center and rural communities in Masai Mara – we ran three structured month-long cohorts with 50 children each to prove our processes for evaluating learning gains and, more importantly, build our capacity for larger quantitative field testing. Children were given an opportunity to play for 4 weeks with ~150 tablets that we left in the centers, and we tested each child before and after with EGRA and EGMA.

Overall, we have logged thousands of sessions and millions of data points. The data on learning gathered from our tests will be applied to optimise SEMA’s content. We have been able to observe which areas of the game and aspects of the platform get the most user traction, and to issue updates accordingly.

Selection of design features inspired by user input

The list of design choices and product features that were informed by user feedback and observations is too long to include, but here are ten representative and substantial changes we made to SEMA based on our field tests:

Culturally relevant content. This guiding principle of SEMA – to build literacy training software inspired by local culture and content – came directly from our initial field tests. As we tested existing literacy training apps, we saw that learning “ai for Igloo” made comprehension and memorization challenging for our target learners, as they always had to ask about the meaning of the word Igloo. To solve for this, we decided early on that all of our content would use words and images that felt culturally authentic and resonated with our target users. We listed everything our target users loved as we met and talked to them across different villages: soccer balls, soccer jerseys, trucks, scooters, earphones and phones, the sea, mangos and coconuts, sneakers and boots… All these elements, along with many others, are now present across SEMA’s games. Our characters’ gadgets – from Sema’s smartphone, boots and the magic scooter she builds to Paul’s cool sneakers and his soccer ball – are a particular favorite with children.

Kenyan voice overs. One of our most interesting early findings (while testing a successful English game in Kenya) was that the British accent was hard for our learners to understand and often made them giggle hysterically (which distracted them). We therefore decided that all of SEMA’s voice overs and instructions in English would be in a Kenyan accent. We contracted John Sibi Okumu, a well-known Kenyan actor and voice artist, as our lead male voice. It was a hit with children, who recognized his voice instantly from his other work, notably as the voice of the Lion in Tinga Tinga Tales. We also contracted Phy Mwihaki, a 23-year-old emerging Kenyan pop star and winner of two of the most popular musical competitions in Kenya, to be the voice of SEMA. Though she’s a successful artist in her own right, Phy was thrilled to work with us: “My dream growing up was to give my voice to a cartoon! Being the voice of SEMA and contributing to teaching so many children is more than what I had dreamed!”

Inspiring characters. All of the literacy training games that we have seen or tested have Western characters (there even seems to be a trend of using monsters for literacy training!). We also noticed that there were very few inspiring African characters or superheroes for children to look up to – and none of them were children! After coming up with the early sketches for SEMA’s characters, we showed them to target users in rural areas alongside characters from other games, and they were clearly excited by characters that resembled them. We then designed Sema and Paul – both completely inspired by the children we met.

Photo Rewards. In our early trials of existing educational apps and games, we regularly caught children who had our tablets switching to the camera app to take selfies or record videos of themselves. While some might have seen this as a threat – children spending more time on the camera instead of learning – we thought there might be ways to take advantage of it in SEMA. Now, completing any level unlocks SEMA’s camera, allowing learners to take selfies.

Story Rewards. Story rewards were also inspired by our early field research. In a rural village in The Gambia, a young girl once invited us to her home after school “to show us something.” As soon as we stepped into her hut, she slid a storybook from under her rudimentary bed – a story of roaming animals and their adventures. It was the only book she owned. She’d received it as a reward from her teacher because she was “the first of the class.” She felt so proud. She treasured it like gold and read it over and over, night after night, even though she now knew the story by heart. It was absolutely beautiful to witness, and we realized just how precious storybooks are for our target users. This inspired us to insert stories within SEMA that learners can collect and keep as soon as they learn how to read (after World 5). We worked with the African Storybook Project, an open-source library of authentic African stories, to select 16 stories for which we redesigned the graphics to fit SEMA’s graphic style.

Multiple profiles. Before we started, we hadn’t thought much about building the capability for different profiles. We quickly learned that many families share a smartphone at home (sometimes, they even share one across several families) and kids share devices in school. For every child to have a personalized learning experience on SEMA, we built individual profiles that store the progress of each child, even as many children share the same device.
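
As a rough illustration of what such profiles involve, here is a minimal Kotlin sketch of a per-child profile store on a shared device. It is not SEMA’s implementation; the data model, file format and field names are assumptions made for the example.

```kotlin
import java.io.File
import java.util.UUID

// Illustrative sketch, not SEMA's actual code: one small local record per child
// so several children can share a single phone or tablet without losing progress.
data class Profile(
    val id: String = UUID.randomUUID().toString(),
    val avatarPath: String = "",     // e.g. the child's selfie (see next section)
    var highestLevel: Int = 1,
    var starsEarned: Int = 0
)

class ProfileStore(private val dir: File) {
    fun save(p: Profile) {
        // One file per profile keeps each child's progress independent.
        File(dir, "${p.id}.profile")
            .writeText("${p.avatarPath}|${p.highestLevel}|${p.starsEarned}")
    }

    fun loadAll(): List<Profile> =
        dir.listFiles { f -> f.extension == "profile" }.orEmpty().map { f ->
            val (avatar, level, stars) = f.readText().split("|")
            Profile(f.nameWithoutExtension, avatar, level.toInt(), stars.toInt())
        }
}
```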

Photo login. To enable those multiple profiles, we initially built a regular login with a username and password. We quickly saw that none of our target users – 5-7-year-old learners – managed to log in. They had never logged into a digital application before, and obviously still didn’t know how to read or write! As we brainstormed possible solutions, we decided to leverage their love of selfies a second time. You now log in to SEMA by taking a selfie, and you can select your profile by tapping on your picture in the library. Any child, no matter their literacy level or technological ability, can now easily log into their own session.
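
Building on the profile sketch above, here is one hedged way such a selfie login could be wired up. The takeSelfie hook and the flow are hypothetical placeholders, not SEMA’s actual camera or UI integration; they only illustrate the idea of a picture standing in for a username and password.

```kotlin
// Hypothetical flow reusing the Profile and ProfileStore sketched above.
class SelfieLogin(
    private val store: ProfileStore,
    private val takeSelfie: () -> String      // placeholder: returns the saved photo's path
) {
    // First visit: snap a selfie and create a profile tied to that picture.
    fun register(): Profile {
        val profile = Profile(avatarPath = takeSelfie())
        store.save(profile)
        return profile
    }

    // Returning child: tap your own picture in the gallery to resume progress.
    fun loginByTap(tappedProfileId: String): Profile? =
        store.loadAll().firstOrNull { it.id == tappedProfileId }
}
```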

Navigation gestures. In rural villages in The Gambia, children – who for the most part had never used a smartphone or tablet before – did not intuitively know how to “drag” an object across the screen, a gesture that seemed completely intuitive to us. Instead, they just tapped repeatedly to move any object or character on the screen. In urban slums in Nairobi, however, children were by and large more accustomed to “dragging.” Given the diversity of target users we hope will be using the app, we decided to support both gestures so that no child gets stuck because of the gesture itself.
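
As an illustration of how both gestures can coexist, here is a small Kotlin sketch in which repeated taps nudge a character step by step while a drag moves it directly. The gesture model and names are invented for the example and are not taken from SEMA’s code.

```kotlin
import kotlin.math.sign

// Illustrative only: one way to let either gesture move a character, so children
// unfamiliar with dragging are never blocked. Class and field names are invented.
sealed class Gesture {
    data class Tap(val x: Float, val y: Float) : Gesture()
    data class Drag(val toX: Float, val toY: Float) : Gesture()
}

class MovementController(var characterX: Float, var characterY: Float) {
    private val tapStep = 40f   // each tap nudges the character a fixed distance

    fun handle(gesture: Gesture) {
        when (gesture) {
            // Repeated taps nudge the character toward the tapped point...
            is Gesture.Tap -> {
                characterX += tapStep * sign(gesture.x - characterX)
                characterY += tapStep * sign(gesture.y - characterY)
            }
            // ...while a drag moves it straight to where the finger ends up.
            is Gesture.Drag -> {
                characterX = gesture.toX
                characterY = gesture.toY
            }
        }
    }
}
```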

Navigation on the map. We also completely changed the design of our map. We initially copied the many map-based games, such as Diablo, in which the undiscovered part of the map is completely grayed out and only becomes visible as you explore it. In SEMA’s case, however, children told us it was frustrating not to know how much more progress they had to make before reaching the end of the game. The unknown part of the map made them lose their sense of progress. Now, learners can see the entire path on SEMA’s map, which helps build eagerness and excitement to move from level to level.

Notifications. Notifications of success and failure are an essential part of any learner’s progress and of literacy and numeracy acquisition in SEMA. We initially designed notifications that resembled the ones we see in our own apps: when a player failed to jump over an obstacle in our Running Game, Sema turned red and stumbled. This, however, was far from clear for our target users – who just kept playing as if nothing had happened. We therefore reinforced the notifications to make them much stronger, with sound effects and an omnipresent progress bar at the top that visibly drops when a learner fails.
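
To show what “reinforced notifications” can mean in practice, here is a hedged Kotlin sketch that pairs sound effects with a progress bar that visibly rises and falls. The hooks and asset names are placeholders, not SEMA’s actual engine calls.

```kotlin
// A rough sketch of the reinforced feedback described above; the sound and
// progress-bar hooks are hypothetical stand-ins for the game engine's own calls.
class ProgressFeedback(
    private val playSound: (String) -> Unit,   // placeholder audio hook
    private val showBar: (Float) -> Unit       // placeholder progress-bar hook
) {
    private var progress = 0f                  // 0.0..1.0, always visible on screen

    fun onSuccess() {
        progress = (progress + 0.10f).coerceAtMost(1f)
        playSound("cheer.ogg")                 // celebratory cue
        showBar(progress)                      // bar visibly climbs
    }

    fun onFailure() {
        progress = (progress - 0.15f).coerceAtLeast(0f)
        playSound("stumble.ogg")               // unmistakable "try again" cue
        showBar(progress)                      // bar visibly drops, so the miss
                                               // is impossible to overlook
    }
}
```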

In the next six months, our focus will shift to measuring the SEMA prototype’s impact – looking at kids’ engagement levels and literacy gains – but we will surely continue to make improvements to the game’s content and design based on kids’ ongoing feedback.