Misinformation generates misperceptions, which have affected policies in many domains, including the economy, health, the environment, and foreign policy. Co-Inform is about empowering citizens, journalists, and policymakers with co-created socio-technical solutions that increase resilience to misinformation and generate more informed behaviours and policies. The aim of Co-Inform is to co-create these solutions with citizens, journalists, and policymakers, for:
- (a) detecting and combating a variety of misinforming posts and articles on social media,
- (b) supporting, persuading, and nourishing misinformation-resilient behaviour,
- (c) bridging between the public on social media, external fact-checking journalists, and policymakers,
- (d) understanding and predicting which misinforming news and content are likely to spread across which parts of the network and which demographic sectors,
- (e) infiltrating echo chambers on social media, to expose confirmation-biased networks to different perceptions and corrective information, and
- (f) providing policymakers with advanced misinformation analysis to support their policy-making process and its validation.

To achieve these goals, Co-Inform will bring together a multidisciplinary team of scientists and practitioners to foster co-creational methodologies and practices for engaging stakeholders in combating misinformation posts and news articles, combined with advanced intelligent methods for misinformation detection, misinformation flow prediction, and real-time processing and measurement of crowds' acceptance or rejection of misinformation. The Co-Inform tools and platform will be made freely available and open-sourced to maximise benefit and reuse. Three main stakeholder groups will be directly engaged throughout this process: citizens, journalists, and policymakers.

Operational time: April 1, 2018 – March 31, 2021
Source of funding: EU Horizon 2020 – Research and Innovation Framework Programme
Partners:
- International Institute for Applied Systems Analysis
- Cyprus University of Technology
- UK Open University
- International Hellenic University
- Stockholm University
- Expert Systems Iberia and Scytl
Project home page
I studied computer science and computational linguistics at the Universität Erlangen-Nürnberg and at the University of Pennsylvania. I worked in the former computational linguistics research group at the Universität Freiburg and received my Ph.D. in computer science from the Faculty of Technology in 1998. Afterwards I joined Universität Stuttgart, Institute IAT & Fraunhofer IAO, before moving on to the Universität Karlsruhe (now: KIT), where I progressed from project lead to lecturer and senior lecturer, and completed my habilitation in 2002. In 2004 I became professor for databases and information systems at Universität Koblenz-Landau, where I founded the Institute for Web Science and Technologies (WeST) in 2009. In parallel, I have held a Chair for Web and Computer Science at the University of Southampton since March 2015.
Data represent the world on our computers. While the world is very intriguing, data can be quite boring if one does not know what they mean. I am interested in making data more meaningful, in order to find interesting insights into the world outside.
How does meaning arise?
- One can model data and information. Conceptual models and ontologies are the foundations of knowledge networks that enable computers to treat data in a meaningful way.
- Text and data mining, as well as information extraction, find meaningful patterns in data (e.g. using ontology learning or text clustering) as well as connections between data and their use in context (e.g. using smartphones). Hence, knowledge networks can be found in data.
- Humans communicate information. In order to understand what data and information mean, one has to understand social interactions. In the context of social networks, knowledge networks become meaningful for human consumption.
- Ultimately, meaning does not exist in a void. Data and information must be communicated to people who can make use of the insights they offer. Interaction between humans and computers must happen in a way that matches the meaning of the data and information.
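The second point above mentions finding meaningful patterns in data, for example via text clustering. As a minimal, illustrative sketch of that idea (the toy corpus, the use of scikit-learn, and the TF-IDF/k-means choices are my assumptions, not part of the text above):

```python
# Illustrative sketch only: grouping a toy corpus into clusters so that
# similar documents end up together. Corpus and method choices are assumed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "ontologies model knowledge for computers",
    "knowledge networks connect concepts and data",
    "smartphones record data about their use in context",
    "sensor data from smartphones reveals usage patterns",
]

# Represent each document as a TF-IDF vector ...
vectors = TfidfVectorizer().fit_transform(documents)

# ... and partition the documents into two groups of similar texts.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print(labels)  # prints the cluster label assigned to each document
```

On a real corpus, inspecting the terms with the highest weight in each cluster is one simple way to turn such groupings into the kind of "meaningful patterns" described above.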
The World Wide Web is the largest information construct made by humankind to convey meaningful data. Web Science is the discipline that considers how networks of people and knowledge arise in the Web, how humans deal with them, and what consequences this has for all of us. The Web is a meaning machine that I want to understand through my research.
Where else might you find me?
For thesis supervision, you are very welcome to speak to me about your aims that relate to politics and social science. If you are unsure where to start, check out the list of open master and bachelor thesis topics.
I'm the (as opposed to "a") political scientist at WeST. What am I doing here? In short: I'm part of a collective endeavor to cross academic disciplines and methods, a stance calibrated for the digital era. Phenomena that are particular to online media spheres have been racing past us. I believe that the structures of most departments are too rigid to properly prepare us for the mix of amplified exposure and perceptions that emerges from the largest medium in the world: the Web.
- Text analysis, political polarization, digital politics, misinformation
- Since coming to WeST, I'm basically facing the dragon of how to really do interdisciplinary research, first-hand. Luckily, that has always been one of my driving interests.
2016 PhD Political Science, Freie Universität Berlin
2012 MIS International Studies/European Area Studies, Seoul National University
2009 BA Fine Art, Central Saint Martins
Long vita: download CV
- WS1819: For Introduction to Web Science, I taught the Jan 28th session on "Social Science of the Web"
- WS1819: Computation for Social Science
- SS19: Research Lab "Building AI Systems for Detecting Political Polarization"
At WeST, my task is to offer elements of social science as a theoretical framework for a computer science audience. In practice, this means that I try to help shape your research questions and, as a result, your methods. For example, a behavioral pattern in tweets may have political causes, which can be explained by a fitting political science theory.
[Trivia:] How fact-checking online misinformation works: never as well as here, unfortunately.
My research interests lie in learning tasks with limited labels (e.g. rumour detection, stance detection in the political domain) and in learning representations of low-resource languages (e.g. Turkish, Korean) using deep neural networks. I am looking for highly motivated students who are interested in these topics and in working with me on their BSc/MSc thesis.