Andrew Leigh and Joshua Gans have a new paper out on media ‘slant’ (which they prefer to ‘bias’, given that reporting can be negative or positive for reasons unrelated to prior partisan feelings).
One of their methodologies for assessing ‘slant’ – getting five people to code article and editorial content – seems sensible, though it would be good to extend the analysis beyond the 2004 election campaign: leadership issues in that campaign may well have made some papers appear more anti-Labor than they are on ideological grounds alone.
But another methodology using public intellectuals, as Sinclair Davidson has argued at Catallaxy, just isn’t going to work.
They’ve rated the partisan nature of various public intellectuals according to whether they are most mentioned by Coalition or Labor politicians in a positive or neutral way. As Sinc points out, this immediately starts to get some very counter-intuitive results:
Does anyone really believe that Philip Adams (26 mentions, 65% Coalition) is a right-winger? Other right-wingers include Eva Cox (9 mentions, 56% Coalition), Germaine Greer (4 mentions, 75% Coalition) and Hugh MacKay (18 mentions, 78% Coalition). Kevin Rudd’s best friend Glyn Davis (18 mentions, 56% Coalition) looks to be a tory too.
The slant of media outlets is then judged by their mentions of public intellectuals, with neutrality defined as an even split between mentions of Labor-favoured and Coalition-favoured intellectuals.
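As a rough sketch of how such an index works in principle – all names, counts and thresholds here are illustrative, not taken from the Leigh and Gans paper:

```python
# Hypothetical sketch of the mention-counting approach; all figures invented.

# Step 1: classify each intellectual by the share of their positive/neutral
# parliamentary mentions that came from Coalition MPs (>50% => Coalition-favoured).
parl_mentions = {
    "Intellectual A": {"coalition": 13, "labor": 7},   # 65% Coalition
    "Intellectual B": {"coalition": 4,  "labor": 14},  # ~22% Coalition
    "Intellectual C": {"coalition": 9,  "labor": 9},   # exactly even
}

def favours_coalition(m):
    total = m["coalition"] + m["labor"]
    return m["coalition"] / total > 0.5

# Step 2: score an outlet by its mentions of the classified intellectuals;
# 0.5 would mean an even split between Coalition- and Labor-favoured names.
outlet_mentions = {"Intellectual A": 10, "Intellectual B": 10, "Intellectual C": 5}

def outlet_slant(outlet_mentions, parl_mentions):
    coalition = sum(n for name, n in outlet_mentions.items()
                    if favours_coalition(parl_mentions[name]))
    return coalition / sum(outlet_mentions.values())

print(round(outlet_slant(outlet_mentions, parl_mentions), 2))  # → 0.4
```

The fragility is visible even in the toy version: everything downstream depends on step 1, so if the initial classification of intellectuals is wrong, every outlet score built on it is wrong too.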
Doubtful original classifications aside, I think the deeper problem here is that most public intellectuals are not party partisan in their public intellectual work – either it just isn’t relevant to the things they are writing about, or their loyalties are to a discipline, ideology, theme or issue above whatever loyalties they might have when voting in elections.
Just because politicians might sometimes find an intellectual’s work convenient for their own purposes doesn’t mean that the intellectual has a partisan bias, or that the newspaper using them reflects any bias or ‘slant’.
As a minor example, one of my Labor parliamentary mentions is somewhat positive – but Kim Carr did not use me because he agrees with any of my views, but because he wanted to show that even right-wingers were unhappy with the 2003 Nelson higher education reforms. I suspect that at least some positive Phillip Adams mentions were like this – Adams used as a left-wing stick with which to beat the ALP.
I’d really like to see which public intellectuals get mentioned the most by which media outlets. But I don’t think this says much if anything about partisan media slant.
Update: Joshua Gans responds, with my reply in his comments thread.
14 thoughts on “Can public intellectuals be used to assess partisan media slant?”
What would be very interesting is for Simon Jackman to create statistical ideal point estimates for public intellectuals based on their support/opposition to issues raised in the public debate.
He currently does this for US Senators based on their Senate votes (PDF updated daily here).
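A crude stand-in for the idea might look like the following – real ideal point estimates such as Jackman's are fit with Bayesian item-response models over roll-call votes, whereas this just averages coded stances, and all names and codings are invented:

```python
# Toy proxy for an ideal point estimate: average each person's coded stance
# (+1 = one side of the issue, -1 = the other, 0 = no stated view) across a
# set of public-debate issues. A real ideal point model (e.g. Jackman's) fits
# a Bayesian item-response model instead; all data here is invented.

stances = {
    "Intellectual A": [+1, +1, 0, -1, +1],
    "Intellectual B": [-1, -1, -1, 0, -1],
    "Intellectual C": [0, +1, -1, 0, 0],
}

def crude_ideal_point(codes):
    stated = [c for c in codes if c != 0]  # ignore issues with no stated view
    return sum(stated) / len(stated) if stated else 0.0

for name, codes in stances.items():
    print(name, round(crude_ideal_point(codes), 2))
# Intellectual A 0.5
# Intellectual B -1.0
# Intellectual C 0.0
```

The advantage over mention-counting is that the input is the intellectual's own stated positions, not which politicians happen to find them convenient to quote.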
Garbage in, garbage out.
Completely and utterly unconvincing. Gans has been a bit suspect at times, but I would expect better of Andrew Leigh.
Johno – That’s a bit tough. I think it is an unsuccessful experiment for the purposes of measuring partisan slant of the media, but the idea of seeing which public intellectuals are used most by politicians wasn’t silly – though I think the coding needs to eliminate mentions unrelated to their public intellectual work (eg their activities in other roles) and probably incorporate negative mentions (for example, Coalition criticisms of ‘black armband’ historians would have made those people look much more ‘left’).
I really don’t understand how they published such a paper when it was clear the findings were so wrong they would cause people to ROTFL.
The idea that the ABC is right-wing biased may sound plausible to the very far left… so far left you’d fall off the side of the earth.
Dead right jc. It should have been blindingly obvious from the preliminary results that there was something seriously awry with the project’s methodology. Instead, these two blindly trusted their models, and have ended up making horses’ asses of themselves.
Gans is a political operator without the slightest scruples if it helps the left wing.
He is aware of what he’s doing, which is to give his side an academic reference to cite if anyone in the future claims there is left-bias in the media. He doesn’t give a toss about its veracity.
It has already started with LP peddling the nonsense that this so-called study ought to be taken seriously.
Your alternate model seems to be to throw away a study if the results don’t fit your preconceived idea of what they should be.
It is pretty clear that that approach isn’t going to create anything of value.
“Your alternate model seems to be to throw away a study if the results don’t fit your preconceived idea of what they should be. It is pretty clear that that approach isn’t going to create anything of value.”
Results that are strongly counter-intuitive do suggest that a methodology should be looked at again, and this methodology does have obvious problems – an MP mentioning a public intellectual in a positive or negative way does not necessarily imply any general endorsement of that intellectual’s opinions, and there is no reason to suppose that media mentions of intellectuals reveal partisan allegiances. Given such large holes in the theory underlying the methodology, there is no reason to regard the results as overturning intuitive views of where various media outlets line up.
I read a little more about this “study”. They say they coded it with reference to the number of times academics or some such were referred to, and other indicators. With that they came up with the preposterous suggestion that the ABC slants towards the Libs.
I would not for a moment think they acted dishonestly in this purported “study”.
However, let’s face facts: the programs people are concerned with, the ones that carry an obvious leftist stance, are programs such as the news, the 7.30 Report, Lateline etc. Giving, say, equal weight to Landline and the 7.30 Report is plainly silly.
Sure, I agree that the methodology seems problematic – the first thing I thought after reading it was that a lot of positive mentions by politicians are likely to be “Look, even so-and-so agrees with us on this!”, referring to an intellectual or commentator associated with “the other side”.
I even suggested a better approach (research each public intellectual’s stated opinion on a large number of topics of public debate, either for, against or neither, then create an ideal point estimate).
I was just trying to point out, more to some of the other commenters here, that “the results aren’t what they should be” isn’t a valid criticism on its own – if you’re just going to keep redesigning things until you get the result you expect, then you’re clearly not going to get a result with any more than propaganda value.