This sociologist is excited that the Australian Bureau of Statistics had a stand at the Multicultural Festival! Census night is 9 August 2016. Their pamphlet visually represents different ethnic, gender and age groups.
You may have read in late September that the proportion of women receiving Royal Society funding has “plummeted from one in three in 2010 to one in 20 this year.” While the Society also awards the Dorothy Hodgkin Fellowships to early career women researchers, this award exists to boost women’s participation in science, not to augment or mask the issues in the Society’s mainstream Fellowship program.
The Royal Society was silent for a couple of days after its list of fellows was made public, despite a large outcry by the scientific community on social media and in opinion columns in the media. The Society President, Sir Paul Nurse, finally announced an investigation a couple of days after the fact. The question is: why did the Society wait until the list was made public to assess its program?
I want to stress that while I’m using the Royal Society’s Fellowship outcomes as a case study, the issue I am illustrating is the reactive treatment of gender bias in all fields of Science, Technology, Engineering and Mathematics (STEM). The point here is to tease out institutional patterns and to make the case that institutional approaches are needed to address gender inequality. While this point may seem obvious, the fact is that inequality in science, as with other spheres of social life, is still treated as a surprise. This is because, on the whole, organisations (and society in general) remain reactive in addressing gender inequality. Diversity is an afterthought, when it should be a proactive and ongoing project at the organisational and societal levels.
This is the first in a series of articles I’m writing on why the scientific community, inclusive of various disciplines, needs to re-examine its position on the problem of inequality in STEM. The picture I am building up is one of methodological rigour and interdisciplinary collaboration in order to better work towards gender inclusion.
This graphic has been circulating for a few weeks, yet surprisingly with little analysis. A Backstage Sociologist first published it in late April, writing only:
Teaching and learning are not market transactions: They are sacred encounters of soulcraft. This graphic leaves one who teaches social science and the humanities with a heavy heart and despairing about the eventual extinction of well-educated citizens.
I suspect there is more to this chart, and part of the soul searching should happen within sociology itself. I see the steep rise in business graduates, and perhaps to a lesser extent in the life sciences and communications, as partly a result of developments in technology and the realities of the job market.
One way that sociology might address this is through a stronger focus on applied sociology. Without question, developing the sociological imagination has many personal and professional benefits, as critical thinking can help to improve civic participation and empower us to understand our lives in a broader context.
Then again, if you are a poor or otherwise disadvantaged young person thinking about the debt and other commitments you need to balance, pursuing a degree in sociology can be daunting. We are largely positioned as an academic discipline. There are few academic jobs for our graduates. Market forces may be driving graduates away from social science, but our discipline could be doing much more to demonstrate the applicability of our theories and methods to specific jobs and industries.
You can read more from my website Sociology at Work, with links to resources that can help provide tangible examples of how sociology students might find work in different industries, and how they might specifically use their degrees.
When most people think of labs, they imagine scientists in white coats staring into microscopes, carrying around beakers of bubbling chemicals, and holding test tubes over Bunsen burners. In social science, the reality is much more mundane. It’s usually just a room full of computers with software that may or may not be useful and may or may not be up to date. Even less compelling are the labs associated with statistical methods classes. For the last couple of years, my own classes have been the worst-case scenario: I just get up and lecture about how my students should use some particular piece of software to apply the methods we’ve been learning in the “lecture” part of the class. It doesn’t have to be this way.
Over the next few months I will have the opportunity to teach two new methods classes and completely re-invent how I incorporate labs. I had lunch with Mayur Desai the other day; he does a great job with labs in his classes, and he’s inspired most (but not all) of the ideas here. This is what I’m thinking:
No lectures. None. Students enter the lab and get their assignment and spend the rest of the class trying to complete it.
Each assignment starts with a data set (preferably real) and a blank screen; that is, I don’t give them any code. Their job is to answer a substantive question by applying methods we’ve covered recently to the data.
Students work in pairs and take turns driving. I think this keeps students focused and they can teach each other. It also means only half the class has to have laptops if I want to implement a lab in a regular classroom.
I’m around to answer questions. In this way, it’s very different from a problem set, where getting stuck on something dumb for hours at a time is a common occurrence. Struggling with a problem is good for learning, but banging your head against a wall isn’t an efficient use of time.
The end product should be similar to results they might find in a published paper. Sometimes I’ll provide an empty table they must fill in and other times they will produce their own tables of results from scratch.
There should be opportunities for quicker/more advanced students to do more. One size does not fit all.
While it’s possible to use any statistical analysis tool in a lab successfully, I do think some packages are better than others. Most students already know Microsoft Excel and doing basic analyses (even regression) using it is easy, but you really hit a wall when you want to do anything even a little sophisticated. SAS is powerful, but there is a steep learning curve. My plan is to use Stata. You can browse your data in a spreadsheet style interface. You can play with commands through the menus and when you choose one, it shows you the command-line equivalent. You can work interactively at the command-line or build programs (using those same commands) in an editor. And the documentation is excellent and available online.
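To make this concrete, here is a rough sketch of how a lab do-file might open in Stata, using the bundled auto dataset as a stand-in for real data (the variables and the substantive question here are illustrative, not taken from any actual assignment):

```stata
* Illustrative lab opener: load data, inspect it, then answer a
* substantive question with a method covered in class (here, OLS).
sysuse auto, clear          // Stata's bundled 1978 automobile dataset
describe                    // what variables do we have?
summarize price mpg weight  // basic descriptives before modelling
regress price mpg weight    // does fuel economy predict price, net of weight?
```

Students would start from a blank do-file and build toward a table of results like this themselves, with the menus available to remind them of command syntax along the way.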
#VisualSociology of Avondale Heights, #Melbourne. The 2011 #Census by the Australian Bureau of #Statistics shows that Avondale Heights has a slightly older population (median age of 43 years versus 37 years for all of #Australia). These residents are slightly more likely to be married (55%) and working full time in paid employment (62%), with a smaller proportion studying (24%). Just over 53% of residents were born in Australia. Other residents were born in #Italy (11.5%), #Vietnam (5%), Greece (3%), Croatia (2%) & India (2%). This is what interests me about this area: its diversity. Earlier waves of migrants settled here, mostly from Southern Europe. They have been upwardly mobile, with their children moving into the professional class. Over 66% of residents have two parents born overseas & a further 9.4% have at least one parent who is a #migrant. Together this means that 75% of residents are either first or second generation migrants. More than half of residents are #Catholic (51%). This is twice the national rate (25%). #sociology #migration #culture #Suburbs #society
I’m seeing a lot of activity about Deborah Blum’s interesting article on the hazardous metals used in lipsticks. The article has been shared widely in my circles and certainly got me engaged. Having just read Johnathan Chung’s statistical critique of the studies below, there is more to think about.
After first reading Blum’s article, I went back to the research published in Environmental Health Perspectives. I was initially struck by the methodology, where the researchers measured the impact of metals in lipsticks using comparisons to metals in water consumption. They did this because the materials used in lipsticks are not regulated in the USA. Blum notes that manufacturers have shown that they can manipulate the levels of these metals when they want to. I commented on Gaythia Weis’ post that these findings represent a powerful argument against self-regulation by industry. I think this comment stands; however, Chung argues that the sample and statistics overstate the case. It doesn’t change the fact that the metals used are dangerous, but Chung notes that the ingredients need to be put into statistical perspective and in the context of how they are used. We don’t drink lipstick, and our bodies don’t absorb these metals to the extent being reported.
The broader issue that I see arising from Blum and Chung’s analyses is that ordinary consumers are often unaware of the properties used in everyday products. While many of us have read other articles that warn about the dangerous compounds in make-up, new studies lead to a new wave of concern. The public is hungry for scientific guidance on how to respond to conflicting research. For these reasons, I love seeing scientists engaging with research in public forums, as both Blum and Chung have done.
This is a nice illustration of a basic mathematical principle that the general public does not always understand when they are presented with statistics. The media in particular do a poor job of conveying the simple fact that correlation does not equal causation. (See a larger image here.)
The Pew Research Centre’s Global Attitudes Project finds that humanitarian aid has a limited effect on improving the USA’s international image around the world. For example, in 2011, 85% of the 700 Japanese people who were surveyed reported a favourable view of America versus 66% of the Japanese participants in 2010. While the Pew Centre acknowledges that various reasons might contribute to an increased positive view of the USA, it seemed that America’s humanitarian commitment had a big impact in Japan. Then again, while the Pew Centre finds that America’s overseas aid improves its image in some countries, the link between humanitarianism and public goodwill is limited.
In Indonesia, the USA’s image improved in 2005, a couple of months after it delivered aid in the Banda Aceh region after a devastating tsunami. This positive view was not as strong as it was prior to the Second Gulf War.
In Pakistan, the USA’s public image improved modestly after it delivered aid to Northern Pakistan after a major earthquake in 2005, but this public image slipped again just one year later. By 2010, public goodwill towards the USA had slipped even further, despite America pledging humanitarian assistance following the floods.
Richard Wike, Associate Director for the Pew Global Attitudes Project writes:
The lesson for disaster relief efforts is that they are more likely to have a significant effect on public attitudes in countries where there is at least a reservoir of goodwill toward the U.S. In nations such as Pakistan, where countervailing issues and deeply held suspicions drive intense anti-Americanism, enhancing America’s image through humanitarian aid may prove considerably more difficult.
Having read hundreds of textbooks in my day… I absolutely love this footnote from a textbook, which I saw on College Humour today:
This chapter might have been called “Introduction,” but nobody reads the introduction, and we wanted you to read this. We feel safe admitting this here, in the footnote, because nobody reads footnotes either in this book.
The footnote is from a book called Stats: Modeling the World, 2nd edition (2007) by David E. Bock, Paul F. Velleman, and Richard D. De Veaux. Sounds like the type of book I wish I’d read when I was suffering through statistics in first year psych and third year sociology!