Misinformation generates misperceptions, which have affected policies in many domains, including the economy, health, the environment, and foreign policy. Co-Inform is about empowering citizens, journalists, and policymakers with co-created socio-technical solutions, in order to increase resilience to misinformation and to generate more informed behaviours and policies.

The aim of Co-Inform is to co-create these solutions with citizens, journalists, and policymakers, for (a) detecting and combating a variety of misinforming posts and articles on social media; (b) supporting, persuading, and nourishing misinformation-resilient behaviour; (c) bridging between the public on social media, external fact-checking journalists, and policymakers; (d) understanding and predicting which misinforming news and content are likely to spread across which parts of the network and which demographic sectors; (e) infiltrating echo chambers on social media, to expose confirmation-biased networks to different perceptions and corrective information; and (f) providing policymakers with advanced misinformation analysis to support their policy-making process and its validation.

To achieve these goals, Co-Inform will bring together a multidisciplinary team of scientists and practitioners to foster co-creational methodologies and practices for engaging stakeholders in combating misinformation posts and news articles, combined with advanced intelligent methods for misinformation detection, misinformation flow prediction, and real-time processing and measurement of crowds' acceptance or refusal of misinformation. Co-Inform tools and the platform will be made freely available and open source to maximise benefit and reuse. Three main stakeholder groups will be directly engaged throughout this process: citizens, journalists, and policymakers.
Operational time: April 1, 2018 - March 31, 2021
Source of funding: EU Horizon 2020 - Research and Innovation Framework Programme
Partners:
- International Institute for Applied Systems Analysis
- Cyprus University of Technology
- The Open University (UK)
- International Hellenic University
- Stockholm University
- Expert Systems Iberia
- Scytl
Project home page
I studied computer science and computational linguistics at the Universität Erlangen-Nürnberg and at the University of Pennsylvania. I then worked in the former computational linguistics research group at the Universität Freiburg and received my Ph.D. in computer science from the Faculty of Technology in 1998. Afterwards I joined the Universität Stuttgart, Institute IAT & Fraunhofer IAO, before moving on to the Universität Karlsruhe (now: KIT), where I progressed from project lead to lecturer and senior lecturer, and completed my habilitation in 2002. In 2004 I became professor for databases and information systems at the Universität Koblenz-Landau, where I founded the Institute for Web Science and Technologies (WeST) in 2009. In parallel, I have held a Chair for Web and Computer Science at the University of Southampton since March 2015.
Data represent the world on our computers. While the world is intriguing, data may be quite boring if one does not know what they mean. I am interested in making data more meaningful in order to find interesting insights about the world outside.
How does meaning arise?
- One can model data and information. Conceptual models and ontologies are the foundations for knowledge networks that enable the computer to treat data in a meaningful way.
- Text and data mining, as well as information extraction, find meaningful patterns in data (e.g. using ontology learning or text clustering), as well as connections between data and their use in context (e.g. using smartphones). Hence, knowledge networks may be found in data.
- Humans communicate information. In order to understand what data and information mean, one has to understand social interactions. In the context of social networks, knowledge networks become meaningful for human consumption.
- Ultimately, meaning is not something that exists in a void. Data and information must be communicated to people who can use the insights they provide. Interaction between humans and computers must happen in a way that matches the meaning of the data and information.
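The idea of finding meaningful patterns in text data, mentioned above, can be illustrated with a minimal sketch in plain Python. The tokenisation, the example sentences, and the helper names (`bow`, `cosine`) are my own illustrative choices, not part of any Co-Inform or WeST tool: documents that share vocabulary end up close together in a bag-of-words vector space, which is the starting point for text clustering.

```python
from collections import Counter
import math

def bow(text):
    # Bag-of-words vector: token -> count (lower-cased, punctuation stripped)
    return Counter(w.strip(".,!?").lower() for w in text.split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

docs = [
    "ontologies model knowledge as networks of concepts",
    "knowledge networks connect concepts in ontologies",
    "smartphones record the context in which data is used",
]

# Pairwise similarities: the first two documents share vocabulary,
# so they score high; the third is about a different topic.
for i in range(len(docs)):
    for j in range(i + 1, len(docs)):
        print(i, j, round(cosine(bow(docs[i]), bow(docs[j])), 2))
```

A clustering algorithm such as k-means would then group documents by exactly this kind of similarity; real systems would use richer representations (e.g. TF-IDF weighting or learned embeddings) instead of raw counts.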
The World Wide Web is the largest information construct made by humankind to convey meaningful data. Web Science is the discipline that considers how networks of people and knowledge arise in the Web, how people deal with them, and what consequences this has for all of us. The Web is a meaning machine that I want to understand through my research.
Where else might you find me?
I'm the (as opposed to "a") political scientist at WeST. For thesis supervision, speak to me about aspects that relate to politics and social science.
Text analysis, political polarization, media bias
2016 PhD Political Science, Freie Universität Berlin
2012 MIS International Studies/European Area Studies, Seoul National University
2009 BA Fine Art, Central Saint Martins
- Co-Inform (I wrote a blog post on my personal focus regarding "misinformation")
- SS19: Research Lab "Building AI Systems for Detecting Political Polarization"
- WS1819: For Introduction to Web Science, I taught the Jan 28th session on "Social Science of the Web"
- WS1819: Computation for Social Science