Notes from a Mobile, Bottom-Up, Rapid, Multi-Country Perception Survey

Posted by anoushrima on Nov 30, 2010

Several months ago, MobileActive.org partnered with UN Global Pulse to implement a mobile phone survey in five countries — Uganda, India, Mexico, Ukraine, and Iraq — as part of a two-part project on mobile data collection.

UN Global Pulse was interested in gaining a preliminary understanding of how vulnerable populations deal with and describe (in their own words) the ongoing impacts of the global economic crisis.

The survey asked two simple multiple-choice questions and three open-ended questions focusing on economic perceptions:

  1. In the past year, meeting your household needs has been: Easier, Same, More difficult, Very difficult
  2. In the past year, how has the (insert country) economic situation changed?: Better, Same, Worse, Much Worse
  3. What has been the greatest change you had to make to meet your household needs this past year?
  4. How has your quality of life changed over the past year? 
  5. In one word, how do you feel about your future?

The project was, in many ways, an exercise in rapid, bottom-up data collection. We wanted to see if, and how quickly, it is possible to implement a flash mobile poll, to determine existing technical capacities of partner organizations, to determine how people respond via SMS, to learn how people describe economic vulnerability and attitudes, and to discover unexpected challenges and lessons.

Needless to say, the responses from people in the five countries were diverse and fascinating. But while we await the analysis and visualization of the survey results for publication by UN Global Pulse, we wanted to share some practical observations about the process and experience of implementing a multi-question, interactive, phone-based survey across five different countries (with the caveat that the survey was not required to be statistically representative!).

Through MobileActive.org’s global network, we identified partner organizations with the technical capability to implement a large-scale mobile phone-based survey in each of the target countries. We vetted and selected partner organizations that had:

  1. capacity to customize software for SMS (or voice based) data collection,
  2. existing relationships with local telecom providers/networks,
  3. an established record of working in the target country, and 
  4. access to the public via large networks or mobile phone number lists/subscribers to ensure 1,000 valid responses to the 5-question survey.

As this was a different sort of survey, there was a unique set of issues we needed to address in order to survey across countries and through different vendors. Below are some notes about the process, method, and issues we faced.

METHOD OF DELIVERING THE SURVEY: While the survey questions were required to be the same across countries, the mode of delivering the survey over the phone was not pre-defined. Our strategy was to have our partners suggest the best method for conducting the survey, as they work in country and best understand country-specific legal and technical requirements, as well as social norms. This approach also helped us understand how quickly interactive mobile surveys can be deployed over existing networks, as opposed to creating a new top-down system.

For example, our partner in India opted for a phone-based survey (voice calling): thanks to the high volume of available, trained call-center operators, this was the cheapest and most effective method there. Partners in Mexico, Ukraine, and Iraq implemented interactive surveys over SMS (though each used its own custom/available software system).
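To make the "interactive survey over SMS" idea concrete, here is a minimal, hypothetical sketch of how such a system can work — a per-respondent state machine where each inbound text message records an answer and triggers the next outbound question. This is not any partner's actual software; all names and the question wording below are illustrative (question text is condensed from the five survey questions above).

```python
# Hypothetical sketch of an interactive SMS survey engine.
# Each inbound SMS from a phone number records an answer and
# returns the next question to send back to that number.

QUESTIONS = [
    "In the past year, meeting your household needs has been: "
    "1 Easier, 2 Same, 3 More difficult, 4 Very difficult",
    "In the past year, how has the economic situation changed? "
    "1 Better, 2 Same, 3 Worse, 4 Much worse",
    "What has been the greatest change you had to make to meet "
    "your household needs this past year?",
    "How has your quality of life changed over the past year?",
    "In one word, how do you feel about your future?",
]

class SurveyEngine:
    def __init__(self):
        # phone number -> list of answers received so far
        self.sessions = {}

    def handle_inbound(self, phone, text):
        """Record one answer; return the next outbound message."""
        answers = self.sessions.setdefault(phone, [])
        if len(answers) < len(QUESTIONS):
            answers.append(text.strip())
        if len(answers) < len(QUESTIONS):
            return QUESTIONS[len(answers)]
        return "Thank you! Your survey is complete."

    def completed(self, phone):
        """True once this number has answered every question."""
        return len(self.sessions.get(phone, [])) == len(QUESTIONS)
```

A real deployment would sit behind an SMS gateway or shortcode, handle opt-in/opt-out messages, and persist sessions to a database; the point here is only that the survey logic itself is a small amount of state per respondent.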

COST: Costs for implementing surveys can vary greatly depending on local SMS costs and partner capacity. For example, SMS rates in Mexico are relatively more expensive than in most of the other target countries. Partner capacity and labor costs also varied, depending on the size of staff and the level of customization required to set up an interactive survey.

INCENTIVES FOR USERS: Based on their local knowledge, we asked each partner organization to recommend the incentive structure (if any) and promotional language that would yield the highest response rates. In some instances the full cost of answering the SMS survey was reimbursed by topping up participants’ phone credits in exchange for completing the survey (Iraq), while in other instances, users who completed the survey were entered into a drawing to win a given amount of free airtime, with no direct reimbursement for the cost of the SMSs (as in Uganda and Mexico). The language used to describe the incentives also varied by country, due to the rules and regulations that may apply in each country with regard to offering “prizes,” “airtime,” etc.

LANGUAGE: The wording of the introductory message (asking users if they would like to participate in the survey) necessarily varied in an effort to maximize people’s understanding and response rates. For example, our partners in Uganda suggested using the word “quiz” rather than “survey” as a more effective approach, since users there were accustomed to mobile-based quizzes and games. In some instances, using the UN’s name in the introductory message gave the survey more legitimacy, while in others it was more effective to mention the mobile carrier’s name (in places where the carrier is well respected).

TARGET AUDIENCE: Some partners relied on an existing network or list of contacts and phone numbers to whom the survey questions were sent, while in other countries partners sent out messages advertising the SMS survey to random users of a certain mobile provider (with databases provided by the provider, as in Ukraine and Mexico, for example). Again, it is important to note that while we tried to diversify by age, income, gender, and geographic location, this was NOT designed to be a random survey that would yield statistically relevant results. That would entail a much more careful methodology than we pursued for the purposes of this trial.

DEMOGRAPHIC DATA: Some partners were able to capture some demographic data automatically, while others explicitly asked survey participants to self-identify age, gender, and location as part of the survey. In some countries, certain types of demographic data are too sensitive to ask about: in one country where we ultimately did not implement the survey, asking about gender and age was considered inappropriate because of high incidences and fears of human trafficking, while in Mexico questions asking users to identify their location are considered too sensitive for fear of kidnapping and extortion.

FORMAT OF DATA SETS: We asked all partners to submit their final raw survey data as an .XLS spreadsheet but, in the interest of testing our partners’ rapid data-collection capacity, did not specify or require a particular layout for the data (each organization’s data-collection software has its own way of displaying incoming data).
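One practical consequence of not requiring a layout is that someone downstream has to map each partner's column headings onto a common schema before cross-country analysis. Here is a minimal sketch of that normalization step. The country keys, column names, and sample rows are invented for illustration, and we assume the .XLS files have already been read into lists of dicts (e.g., with a spreadsheet library such as pandas):

```python
# Hypothetical sketch: mapping each partner's export columns onto one
# canonical schema. All heading variants below are made up; real files
# would first be loaded from .XLS (e.g., via pandas.read_excel).

# partner/country key -> {their column name: our canonical field name}
HEADER_MAPS = {
    "uganda": {"Phone": "phone", "Q1": "q1", "Sex": "gender"},
    "mexico": {"numero": "phone", "pregunta_1": "q1", "genero": "gender"},
}

def normalize(rows, country):
    """Rename one partner's columns to the canonical schema,
    dropping any columns we do not recognize."""
    mapping = HEADER_MAPS[country]
    return [
        {canonical: row[original]
         for original, canonical in mapping.items()
         if original in row}
        for row in rows
    ]

# Rows from two different partners end up in one comparable data set.
merged = (
    normalize([{"Phone": "0700123456", "Q1": "3", "Sex": "F"}], "uganda")
    + normalize([{"numero": "5512345678", "pregunta_1": "2", "genero": "M"}], "mexico")
)
```

The mapping tables are the part that requires up-front coordination with each partner — exactly the "learn their default format early" lesson noted in the takeaways.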

Key Takeaways & Issues to Consider

  • The benefit of working with partners and allowing them to design and implement the survey was that they already have the best understanding of in-country mobile marketing regulations (such as the legal implications of giving away “prizes,” or the process for topping up credits to reimburse users’ SMS costs) and of which format/method would be most effective given the society’s phone-culture norms and habits.
  • Technical capacity varies by country. Some countries do not have vendors with experience in implementing SMS or mobile-based surveys, or mobile adoption in a given society may simply not yet be mature enough. Our contacts in Georgia, a country under consideration for this project, for example, informed us that no such interactive SMS survey had been conducted in that country.
  • It is critical to choose countries carefully and to consider the potential political complications and risks of asking the public to answer survey questions. Seek legal counsel where needed to mitigate risk and avoid putting anyone in harm’s way. For instance, we aborted a survey in one country where only 'official sources' are allowed to survey on economic issues.
  • As highlighted above, some information is easier to obtain in some countries but more sensitive in others; this pertains especially to demographic data. The privacy and security of survey respondents are paramount and must be considered and built into any data collection effort.
  • Depending on the survey design/structure, not all respondents answered ALL of the questions. It would be important to spend more time working with partners to design methods that yield consistent results (for example, by specifying that SMS cost reimbursements or incentive airtime will only be awarded for completed surveys).
  • Working with existing capacity and partners already established in country meant that the .XLS files we received were not in a uniform format, making it more challenging to compare and analyze the data across countries quickly. If you want uniform data sets, it would be advisable to learn each partner organization’s technical capabilities and default formats up front, so that the data in their final reports can be normalized.

Photo courtesy Wayan Vota
