Michael Bossetta is a researcher and associate senior lecturer in media and communication studies at Lund University. His research explores how politicians and citizens use social media during elections, and how platform design shapes the use of social media in politics. One of his recent studies examined the emotions expressed by presidential candidates in images used in ads on Facebook. For that project, Michael Bossetta and his colleague Rasmus Schmøkel, a PhD student at the University of Southern Denmark, developed two open science tools.
Karolina: Hello Michael! Could you tell us a little about the open software tools that you and your colleague have developed?
Michael: The FBAdLibrarian basically helps researchers download images from Facebook’s Ad Library, which is a searchable database of political ads on Facebook and Instagram. While verified researchers can collect textual data from the Ad Library as a spreadsheet, our tool allows researchers to extract the ads’ images. Images can carry important information that is non-textual, like facial expressions or political symbols.
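To make the idea of "extracting the ads' images" concrete, here is a minimal sketch, assuming a very simple file layout (the function name and `.png` convention are illustrative, not FBAdLibrarian's actual code): each image is saved under a filename derived from its ad ID, so it can later be matched back to the textual data in the Ad Library spreadsheet.

```python
# Sketch: store each ad's image bytes under a filename derived from its
# ad ID, so images can be joined back to the Ad Library's textual data.
# Function name and file layout are illustrative assumptions.
import pathlib


def save_ad_image(ad_id: str, image_bytes: bytes, out_dir: str = "ad_images") -> pathlib.Path:
    """Write image bytes to <out_dir>/<ad_id>.png and return the path."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    path = out / f"{ad_id}.png"
    path.write_bytes(image_bytes)
    return path


saved = save_ad_image("123456789", b"\x89PNG fake bytes for demo")
print(saved)
```

Keying files by ad ID is one simple way to keep the image collection and the spreadsheet of textual metadata linked for later analysis.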
The other tool is called Pykognition. It gives researchers an easier way to access the Amazon Rekognition API, which performs emotion and object detection: it predicts what emotions people are expressing in images. I think it is important to know that this Amazon tool is used mostly by companies; it was not actually built for research. What we did with Pykognition was make it easier to connect to that product and to provide the data in a form suitable for researchers, not software developers.
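As a rough illustration of the kind of output such a face-analysis API returns, here is a minimal sketch, assuming a response shaped like Amazon Rekognition's `detect_faces` output (the sample data is invented, not real API output, and this is not Pykognition's actual code):

```python
# Sketch: extract the highest-confidence emotion for each detected face
# from a response shaped like Rekognition's detect_faces output.
# The sample_response below is illustrative, not real API output.

def top_emotion(face_detail):
    """Return (emotion_type, confidence) for the strongest emotion, or None."""
    emotions = face_detail.get("Emotions", [])
    if not emotions:
        return None
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]


sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "HAPPY", "Confidence": 92.1},
                {"Type": "CALM", "Confidence": 5.3},
                {"Type": "ANGRY", "Confidence": 1.2},
            ]
        }
    ]
}

for face in sample_response["FaceDetails"]:
    print(top_emotion(face))  # → ('HAPPY', 92.1)
```

Reducing a per-face list of emotion scores to a single top label is the kind of researcher-friendly simplification a wrapper like Pykognition can provide over the raw developer-oriented API response.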
Karolina: Could you say something about the background, which were your reasons for developing these particular tools?
Michael: Facebook launched the Ad Library in 2018, so 2020 was the first US presidential election in which we could study political ads on Facebook. In recent years, researchers have made huge strides in developing computational methods software for R, an open source programming language. We wanted to create something in that open source spirit, but most of the existing R software is built for analysing text. When it comes to political advertisements on social media, however, images are probably more powerful than text in shaping what people think about candidates. So our idea was to develop tools that could collect and analyse images. The paper FBAdLibrarian and Pykognition: open science tools for the collection and emotion detection of images in Facebook political ads with computer vision details how to use the tools, and it’s really a stepping-stone to applying FBAdLibrarian and Pykognition in a bigger research project.
Karolina: Both tools have the GNU GPL license that permits freedom to use, change and share the software. How would you like other people to use or develop these tools?
Michael: I hope people will first try them out and think of other ways to use them. We know facial recognition software has limitations, so how do others deal with those? The way we actually used Pykognition was to automatically classify happy images, which is where the tool performs best. So we could use Pykognition as a filtering mechanism to remove happy images where we didn’t need human coders. This was not what we originally built it for, but we found it very useful for reducing the amount of data we needed to code manually. There are probably other, unexpected ways that these tools can help research, and maybe not in the ways we originally intended. I also hope that people use these tools to research images on social media. A lot of social media research focuses on text, but images seem to be more powerful. There are many things we do not know about political images, in terms of how candidates present themselves and the impact their images have on voters.
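The filtering workflow Michael describes could be sketched roughly as follows. This is a hypothetical illustration, not the study's actual pipeline: the confidence threshold, data format, and function name are all assumptions.

```python
# Sketch of the filtering idea: automatically set aside images the model
# labels "happy" with high confidence, and route everything else to
# human coders. Threshold and data format are illustrative assumptions.

HAPPY_THRESHOLD = 90.0  # hypothetical confidence cut-off


def route_images(predictions):
    """Split {image_id: (emotion, confidence)} into auto-coded vs manual lists."""
    auto_happy, needs_manual = [], []
    for image_id, (emotion, confidence) in predictions.items():
        if emotion == "HAPPY" and confidence >= HAPPY_THRESHOLD:
            auto_happy.append(image_id)
        else:
            needs_manual.append(image_id)
    return auto_happy, needs_manual


predictions = {
    "ad_001.png": ("HAPPY", 97.4),
    "ad_002.png": ("HAPPY", 55.0),  # low confidence: send to a human coder
    "ad_003.png": ("ANGRY", 88.2),
}
auto, manual = route_images(predictions)
print(auto)    # → ['ad_001.png']
print(manual)  # → ['ad_002.png', 'ad_003.png']
```

The design choice here mirrors the point in the interview: the classifier is trusted only where it performs best (high-confidence happy faces), and everything ambiguous stays in the manual coding pile, so automation reduces workload without replacing human judgment.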
More information about Michael’s research is available here.