LUMS Team Wins Meta/Facebook Award 2022-23!
We are proud and excited to share that this year, for the third time, team LUMS has won the prestigious Meta/Facebook Foundational Integrity Research Award 2022-23! Dr. Ayesha Ali from the Department of Economics at the Mushtaq Ahmad Gurmani School of Humanities and Social Sciences, and Drs. Ihsan Ayyub Qazi and Agha Ali Raza from the Department of Computer Science at the Syed Babar Ali School of Science and Engineering won the award for their proposal, ‘Gamifying Media Literacy Interventions for Low Digital Literacy Populations’.
This year, Meta received 503 proposals from 349 universities across the globe, and only 11 were selected for funding. The proposals were evaluated by a selection committee comprising members of Meta research and policy teams. The award provides funding of USD 50,000 for improving the uptake of digital media literacy interventions among low digital literacy populations.
We spoke to the team to learn more about the project and why tackling misinformation is so important.
Can you tell us more about your winning proposal and what it entails?
Our proposal tackles the problem of misinformation, which can negatively affect people's well-being and reduce trust in information sources. We propose to design and test digital media literacy interventions that can help people with low digital literacy skills to critically evaluate online information and distinguish facts from fiction. The need for such interventions is urgent, as social media use has surged in developing countries, where many users lack the necessary skills and knowledge to navigate digital spaces safely and effectively.
Research has shown that low digital literacy is correlated with greater susceptibility to misinformation. However, there is little evidence on what kinds of digital media literacy interventions work best for populations with low digital literacy. Moreover, it is difficult to reach and recruit low digital literacy users for online surveys or experiments.
Can you tell us how technology countering misinformation has evolved in the past few years and how your proposal aims to increase digital literacy?
Misinformation has become more prevalent and sophisticated with the advancement of technology. For example, deepfake misinformation can manipulate images and videos to create false impressions, which can be hard for low digital literacy users to detect. Technology companies are trying to combat fake news with machine learning models, but these models are not perfect and can struggle with the dynamic and diverse nature of misinformation. To address this challenge, social media platforms employ content moderators; however, human-driven content moderation is difficult to scale. Therefore, we believe that a comprehensive strategy for countering misinformation should also include enhancing digital media literacy among citizens, especially those with low digital literacy skills.
However, there is a lack of evidence on what kinds of digital media literacy interventions are effective for such populations and how they compare with each other. Direct comparison between existing interventions is complicated by differences in study samples and intervention modalities, which makes it difficult for policymakers and social media platforms to decide which interventions to adopt. Moreover, it is difficult to reach and recruit low digital literacy users for online surveys or experiments and to evaluate the long-term impact of different interventions in the field. Our proposal addresses this challenge by creating a game that delivers entertaining and engaging digital media literacy interventions. We will also conduct a randomised controlled trial with a sample of low digital literacy users recruited from the field in Pakistan.
Why is the issue of misinformation and fake news still important, particularly in the context of Pakistan, given that it was a common element in all three of your proposals that won Meta/Facebook awards?
By several accounts, the spread of misinformation is growing and becoming increasingly challenging to detect. With newer ways of spreading misinformation (e.g. deepfakes), fake news can influence political and social behaviour in unprecedented ways, which in turn can lead to political polarisation, election interference, and even violence. Therefore, cutting-edge research is needed to develop technologies, educational programmes and policies to counter the spread of misinformation.
We are committed to advancing our research on how to counter misinformation among low digital literacy populations. Our research includes exploring the factors that affect people’s susceptibility to misinformation, such as their digital literacy skills and their ability to think critically. We have also developed reliable tools for measuring digital literacy and testing the effectiveness of different educational interventions to improve people’s resilience against misinformation.
How did your diverse backgrounds in the School of Science and Engineering and the School of Humanities and Social Sciences result in this collaboration?
We believe that countering fake news requires a multidisciplinary approach that combines technology, social science, and policy. We were driven by our shared interest and motivation to work on this critical issue. Our collaboration was not hindered by our diversity, but rather enriched by it. Working together, we could leverage the tools and insights from each discipline to address the different aspects and challenges of the problem.
You’ve won the Facebook grant thrice. What would be your advice to other researchers who are hoping to do impactful research?
Our advice is to work on important problems that matter to people and society. These problems should also align with the researcher’s interests. When writing proposals, it is essential to identify a clear challenge, propose a credible solution, and articulate why the researchers are qualified to execute it.
Furthermore, one must be persistent and learn from failures. We did not win an award the first time we applied. We wrote several proposals before we got our first award. If you are passionate about your research and the problem you are trying to solve, you must keep trying. There will be setbacks and rejections, but they are part of the process, and they can help you improve your work and achieve your desired impact.
In what ways has the research environment at LUMS facilitated your progression in your research?
LUMS has been an incredible source of support for our research. A unique feature of LUMS is that it fosters cross-disciplinary collaboration among faculty and students from different disciplines. This has been a key ingredient in generating innovative and impactful projects.
On the resources front, LUMS has also been generous with its Faculty Initiative Fund (FIF) awards, which we used for conducting some preliminary research before winning the Meta/Facebook grants. This helped us lay the groundwork for our current research. Additionally, the culture of flexibility and freedom that LUMS provides is hard to find in this region.