>> Maybe I can come back to something you said a moment ago about overselling your research results. I think that's an interesting link to a question I ask of everyone being interviewed. One important topic in this course is the focus on questionable research practices and research integrity, and the fraud cases that have come up recently. So "overselling your research results" activates that in my mind. How is this whole controversy experienced in the field of political science? How has your field been affected? I know that social psychology has been impacted very greatly.
>> Yeah.
>> Other fields somewhat less so. How is that in political science?
>> Obviously it's had an impact on social science, and even on science in general. Within the discipline of political science, I don't know of fraud cases as big and as widespread as the ones you've had in social psychology, but it is a potential problem, obviously. Wherever people collect their own data and analyze their own data, there is a risk of something like that. It would be silly to think that political scientists are inherently different from other scientists; that's simply not true. So wherever data is collected by individuals, there will be a risk of fraud, whether the research is qualitative or quantitative, by the way.
>> Yeah.
>> So elite interviews with policymakers, especially if they're anonymous sources...
>> Exactly.
>> ...are just as prone to fraud as experimental approaches. One thing about my own research, in the survey tradition, is that fraud is fairly difficult, because I don't collect my own data. For surveys of that scale, it's practically impossible to collect your own data.
>> So what we usually do is use data that have been collected by professional agencies.
>> Right.
>> We might have a say in what questions are being asked...
>> Mm-hm.
>> ...and in the question wording and the answer categories and all that sort of stuff, but not in the actual collection of the data or the interview process. So basically, I and many others in my field simply get the data set in SPSS or Stata or Excel or whatever, and then we start working.
>> Oh, okay.
>> So if I were to publish something I made up, some surprising finding, because it's always the surprising findings that get scrutinized...
>> Yeah.
>> ...that would be incredibly stupid, because a lot of people have that same data set.
>> Right.
>> They can simply check. Take, for instance, the correlation between education and political participation: higher-educated people participate more in politics than lower-educated people.
>> Right.
>> Let's say I claim that I've conducted research on the Nationaal Kiezersonderzoek, the Dutch Parliamentary Election Study 2012, and that I find no evidence of that whatsoever: education plays no role at all in predicting political participation. Big surprise. Everyone's talking about...
>> Yeah.
>> ...the Netherlands as a diploma democracy, where the higher educated are so powerful and influential, and here's my research showing there's no correlation whatsoever. Obviously, anyone can just open the data set themselves...
>> And check.
>> ...and check, and they will find that there is an actual correlation. So in my particular field, lying with data that everyone has access to is just...
>> Stupid.
>> It's stupid.
>> Yeah.