Chapter 3: Methods of Research

Section 3: Ethical Considerations

In the last couple of chapters, we’ve learned about the many methods of research in sociology. We learned about quantitative methods, which use statistical and numerical data, usually through a deductive process, to identify major trends in society. We also learned about qualitative research, which focuses on understanding the stories and lived experiences of individuals in research settings, usually through inductive methods, to build a more detailed understanding of society.

The bottom line is that research is absolutely central to sociology. The sociologist’s goal is to elaborate social truths, to formulate theories that are valid and reliable by looking at factual evidence. Without a strong background in gathering, examining, and analyzing data, you really can’t do sociology.

And the conclusions that we draw can have a significant influence on organizational and public policy, or inform how we interact with one another and build a more sympathetic understanding of the challenges we all face.

Just recently, sociologist Arlie Hochschild published the results of her qualitative research in Louisiana.1 She wanted to understand why people were attracted to conservative policies even though these policies seemed contrary to their material and personal interests. Hochschild was able to uncover what she referred to as a “deep story” that helps shape the goals and values of these communities. What appears to be irrational behavior makes sense in terms of this deep story. Such research can help us develop sympathetic relationships with people during what are otherwise polarizing political times. Understanding her methods could also help us analyze our own deep stories.

Without trained researchers like Hochschild going out and doing the field work, or crunching the data, we don’t have sociology as a scientific discipline. Whether the work is a humble but increasingly popular blog, a professionally published analysis in an academic journal, media commentary, or a book-length exposition, without valid and reliable data we don’t have sociology. Period.

That being the case, the sociologist must be careful about how she conducts her research. The focus of our research is often real, living human beings who are trying to get through their everyday lives. These lives can be seriously disrupted, even harmed, by the probing of sociologists. Also, sociological research is often used to help shape policy, policy that has a real impact on human lives. It’s important that this research is done well, and that the researcher is honest about the perspectives and limitations of her conclusions.

Consequently, sociologists must address the ethical ramifications of our work, not just the quality of the research. How do we study human subjects in such a way that we get valid and reliable outcomes while, at the same time, protecting their safety and human dignity? And how do we make sure that the fruits of this work are used to benefit society and not used as another tool of coercion? It’s not as easy as it sounds. 

Putting this in perspective: the Tuskegee study predates the notorious medical experiments conducted by the Nazis on Jews and homosexuals.

The quintessential example of unethical research on human beings was the appalling Tuskegee syphilis study. This infamous study, properly called the “Tuskegee Study of Untreated Syphilis in the Negro Male,” began in 1932. To conduct the study, researchers told the men involved that they were being treated for “bad blood.” That was a lie. They did not have “bad blood”; they had syphilis. Not that it mattered, because they were not being treated. Instead, researchers were watching the progression of syphilis, which of course involves painful blistering and neurological trauma. Left untreated, syphilis leads to blindness, madness, and death. This study continued into the 1970s despite the fact that penicillin proved an effective treatment for syphilis and was publicly available by 1947.

The Tuskegee study is a clear violation, a disgusting abuse of professional ethics. Most studies, however, are not so clear-cut. Ethical questions arise in almost all sociological research. In the United States, the American Sociological Association guides professional ethics in sociology with a published Code of Ethics. This code emphasizes five principles guiding sociological research: Professional Competence; Integrity; Professional and Scientific Responsibility; Respect for People’s Rights, Dignity, and Diversity; and Social Responsibility. Among the responsibilities of American sociologists, and I’m sure every other country has similar guidelines, are to allow no harm to come to human subjects, to obtain informed consent from the people being studied, and to protect the confidentiality of research subjects.

Now the ASA Code of Ethics does consider the nuances of research settings and tries to allow for the creativity and innovation necessary for doing sociological research. For instance, if doing research in a shopping mall, it’s difficult to get informed consent from everyone you encounter. At the same time, a shopping mall is a public space where public actions take place in the context of being seen. So, privacy is not necessarily paramount in public places. But what about research into more personal, private, or even stigmatizing behaviors where the researcher may not be able to get reliable data if their status as an observer were known?

In 1970, Laud Humphreys published research2 on what were known as “Tearooms.” But you are not going to get tea in these particular rooms. In this context, Tearooms were places, usually public restrooms, where men met to participate in homoerotic encounters. Humphreys believed that, because of the stigmatizing nature of these interactions, he couldn’t be forthcoming with the people he was studying and still get valid data. Instead, Humphreys offered to serve as a lookout or “watchqueen,” responsible for warning the men inside of possible intrusion. While serving as watchqueen he took down the license plate numbers of the participating men and traced their names and addresses. He then visited the men at their homes, disguised as a medical researcher, and conducted structured interviews.

Now Humphreys learned some interesting things. For instance, he learned that many of the men who participated in the Tearooms led otherwise conventional, heteronormative lives. They identified as being heterosexual. They had families and were often respected members of the community. But was Humphreys breaching ethics by lying to his research subjects and misrepresenting himself? Though Humphreys protected his subjects’ confidentiality, his research subjects were certainly not able to give informed consent3 to being studied. 

On the other hand, could Humphreys have conducted the research in any other way? Would respected members of the community, fearful of discovery, avoid the research setting and thereby skew the results, creating an invalid outcome? If nobody was hurt, was it unethical? On the other hand, if the point is to prevent people from being hurt, can we condone a standard that can only be measured after the fact?  

How about research conducted by William Zellner in the late 1970s?4 Zellner wondered how many fatal car crashes were actually suicides. You can’t just approach someone who has lost a loved one in a car accident and ask them if they thought their departed committed suicide. There are many stigmas and social norms that would distort any resulting data. So Zellner intentionally misrepresented his research as an attempt to reduce future car accidents. Okay, maybe that could, conceivably, be an outcome of Zellner’s research, but that was not his focus. He lied to get the information that he wanted. On the other hand, he was able to determine that about 12% of fatal car crashes were likely to be suicides. This is valuable information, but because he lied, his research subjects were not able to give informed consent.

Undoubtedly, Humphreys and Zellner both made significant contributions to our understanding of society. However, did the dishonest means by which they gathered their data undermine the value of their research? Or is their research validated by the fact that nobody, as far as we can tell, was hurt by it? How many people may have been helped by the Humphreys and Zellner research? These are complicated questions with no clear answers.

Are prisoners in a position to give consent in any realistic sense?

Whereas getting consent from research participants certainly mitigates many ethical quandaries, consent itself becomes controversial under certain circumstances. If a sociologist is doing research on prisoners, or patients in a mental health facility, or minors, or elderly people suffering from conditions like dementia or Alzheimer’s, the nature of consent is debatable. Might the requirement for consent inhibit important research into marginalized communities? How do sociologists balance the quest for truth with respect for the rights of the individual? The answers to these questions are not always straightforward.

The most important way that sociologists can protect their research subjects is through strict adherence to confidentiality.5 Every effort is made to keep research subjects anonymous. This doesn’t always work out. For instance, in 1929, Robert and Helen Lynd published a famous study on the shallowness and triviality of American culture in a city they called Middletown.6 It turns out that Middletown was really Muncie, Indiana. When this became public knowledge, the good people of Muncie didn’t take kindly to being called shallow and trivial.

Where shallowness and triviality thrive!

It is also important to remember that this confidentiality is not the same as the legal privilege that protects communications with lawyers and doctors. As a researcher, your notes can be subpoenaed, and you can be asked to give testimony. This can put you in quite a bind, as you are ethically bound by your profession to protect your subjects’ anonymity. So what comes first? The law or your research subjects? Think that’s an easy answer? Remember, if you surrender your subjects’ confidentiality, that not only impacts your subjects, but any future research that might be done in that setting. You sacrifice the integrity of every member of your profession.

Ethnographer John Van Maanen discovered this when he was doing fieldwork on urban police officers. During the research he witnessed a police officer beating a homeless man who had done nothing wrong. The victim pressed charges and Van Maanen was subpoenaed to testify. Doing so would have compromised the confidential relationship he had with the police and would have compromised any future effort to conduct such important research. On the other hand, withholding information could lead to his being imprisoned for contempt of court. And the victim really was hurt. Did Van Maanen have a greater responsibility to the victim and to justice, or to academic freedom and the pursuit of the truth? It’s a hard question. In Van Maanen’s case, he sided with his profession and preserved the confidentiality of the police officers.

In one of my favorite moral quandaries, just for the pure visual value of it, in 1991 animal rights activists broke into the Washington State University labs and liberated animals used for scientific experiments. Sociologist Rik Scarce, who was then conducting graduate research on animal rights activists and certainly knew at least one of the participants in the raid/rescue, was questioned by police. Scarce was willing to answer questions that were not related to his research, but refused to answer questions that violated his subjects’ confidentiality. Scarce spent more than five months in jail on contempt of court charges.

Cry FREEDOM!

Any time you enter into a research setting involving people’s real lives there is the potential for ethical conflict. It is the responsibility of the researcher to protect her subjects from any kind of harm that might result from the research. Sociologists often study criminals, deviants and people at the moral boundaries of society, but these groups are no less protected by professional sociological ethics. 

Another ethical component of research is being open and honest about research outcomes, even and especially when such outcomes are not what the researcher wanted to see. Now, under normal circumstances, this isn’t really an issue. Every researcher faces the prospect that their hypotheses might be disproven, or that theories they’ve worked on for years are contradicted or fail to explain outcomes. Sometimes there are glaring holes in the research that need to be filled and elaborated before a project can be brought into the marketplace of ideas. Sometimes the researcher simply needs to be honest that their work has certain weaknesses. But that’s a hard admission to make sometimes.

This study was criticized for its inconsistencies and omissions. How does protecting subject anonymity impact the validity of the research?

A couple of years ago, sociologist Alice Goffman came under intense scrutiny for her much-lauded ethnography of an inner-city community, On the Run: Fugitive Life in an American City.7 Goffman spent six years detailing daily life in a poor, marginalized community. This is a significant investment of time and academic energy. When her final product was published, however, critics pointed out significant inconsistencies in her narrative, as well as one event in which Goffman seems to confess to participating in a criminal conspiracy to commit murder. Now, some of these inconsistencies were honest mistakes and typos. Others were more complicated. Some were instances in which Goffman seemed to accept hearsay stories without confirming their validity, which she then failed to qualify in her work. The rest, Goffman explains, is the result of her attempt to protect the anonymity of her subjects. This latter explanation becomes problematic because the very nature of science requires confirmation for reliability. So how does the researcher balance her ethical requirement to protect her subjects with the scientific requirement to validate the research? In many instances, Goffman was not able to prove that her claims were valid.

For the sake of research integrity, academic journals usually institute a peer review process. In other words, they have experts in the relevant field review research submissions before publication. The idea is to screen out problematic research. This usually works, but…sometimes it doesn’t. In 1996, physicist Alan Sokal submitted a paper called “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity” to Social Text, a cultural studies journal. A close reading of the title should have been enough to set off alarms for any reviewer. A hermeneutics of quantum gravity? What does that even mean? Turns out, it doesn’t mean anything. And nothing in the submission made any sense. It was an intentional work of pure gibberish…that passed the journal’s review and was published in a major academic journal.

And this wasn’t a one-time affair. More recently, philosopher Peter Boghossian and mathematician James A. Lindsay published a piece called, if you’ll pardon me, “The Conceptual Penis as a Social Construct” in a peer reviewed academic journal, in which they argue that the penis is best understood not as an anatomical feature, but as a problematic concept. To quote from the abstract, the penis is “a social construct isomorphic to performative toxic masculinity.”

Among the problems exacerbated by this conceptual penis is nothing less than global climate change. According to Boghossian and Lindsay, “Climate change is driven by nothing more than it is by certain damaging themes in hypermasculinity that can be best understood via the dominant rapacious approach to climate ecology identifiable with the conceptual penis.”

Wait! … What?

Okay, enough. Clearly this was balderdash, but it was balderdash that escaped the peer review process. To be clear, Boghossian and Lindsay intentionally submitted what they knew to be nonsense in order to test the peer review process. The process, clearly, was lacking.

And in case you think this is only characteristic of the so-called soft sciences, Nature News reports that, according to computer scientist Cyril Labbé, since 2005 many respected scientific publications have published articles that were generated by a computer program designed by MIT students as a joke. The program is called SCIgen. Using this program, Labbé made his pseudonymous alter ego, Ike Antkare, the 21st most cited scientist on Google Scholar.

As funny as this stuff is (and don’t get me wrong…it’s freakin’ hilarious!), these cases raise some serious issues about ethics in research. Look, there are serious theorists who do important work in hermeneutics, and these scholars are discredited when blather can pass for scientific research.

Now, if you are a researcher for a major research institution, especially a university, then any research you do is governed by an Institutional Review Board, or IRB. The IRB is a panel of academics who have the power to approve, oversee and, if necessary, terminate research if it violates professional and ethical standards. Again, in most cases, this structure works pretty well, but there are some shortcomings.

Often, there are no sociologists on the board to advise on appropriate sociological research. Also, IRBs tend to be predisposed toward traditional research methods. What if you want to think outside of the box and use more innovative methods? Well, you might have a difficult time getting your proposal past the IRB. IRBs also tend to be very cautious. Valid research that even hints at having ethical issues will probably be turned down. IRBs are of special concern to sociologists because here we have a bureaucratic entity that is tasked with deciding what constitutes knowledge. Yet critiquing how knowledge is constructed is a foundational element of sociology.

Also, IRBs oversee only those conducting research within the institution. What about independent researchers? Today, in our web-based system, people like me can take their own research and post it without access to an IRB or a peer review process. This does not necessarily mean that the work done by independent researchers is no good, but it does mean that readers of this research must assess its reliability and validity on their own. Theoretically, independent sociologists are still bound by the ASA Code of Ethics, but who provides the oversight? How is the independent researcher held accountable? We’re really not sure yet.

On top of this, we can add another ethical dilemma: the intrinsic ethical questions that arise out of good ol’ fashioned profit motive. What happens when we add money to the mix? For instance, Associate Professor Christoph Bartneck was invited to submit a paper to a conference on nuclear physics. Problem: Professor Bartneck is not a nuclear physicist. No problem. He simply seeded an iOS autocomplete function with words like “atomic” and “nuclear” and submitted the nonsensical results. The paper was accepted, and Professor Bartneck was offered the chance to register as a speaker…at a cost of over $1,000. Sometimes it’s difficult to separate the quest for knowledge from the search for profit opportunities.

Well! Isn’t that nice!

This is especially true when private interests pay for the research, and further problematic when profits hinge on the outcomes of scientific research. For instance, researchers working for Exxon discovered in the 1970s that burning fossil fuels would cause global warming and climate change resulting from the greenhouse effect. These conclusions were reached over a decade before the public global warming debate began. Instead of releasing these research findings, the company hid the results and prepared to sow doubt about climate science. After all, there were billions of dollars in profits on the line.

We see the influence of money on research in pharmaceuticals, pesticides, medicines, and supplements. Anywhere there is a conflict between profit and truth, truth often loses. Again, this may seem like a problem confined to the hard sciences, but what about social policy think tanks funded by private interests like the Koch brothers, the Ford Foundation, or the Gates Foundation? How has the desire to satisfy the interests of major funders distorted the research?

How might the funders of private “think tanks” influence the research?

We don’t really know. And that may be the biggest problem of them all. Even in cases where the science is not distorted for the sake of the funders, the very existence of funders with agendas, profit motives, and other incentives for privileging certain outcomes calls into question the objectivity of the research and the legitimacy of the researchers. When people lose faith in the legitimacy of scientific institutions, bad things happen.

We have people who don’t believe that the earth is heating up because of human actions…it is!

We have people who believe vaccines cause autism…they don’t.

We have people who believe that eating genetically modified foods will hurt them…it won’t.

Yet such people impact our environmental policies. Refusing to vaccinate your child raises public health concerns. Irrational fears about GMOs make it difficult to address real issues associated with this science. When researchers are delegitimized because of profit motive and greed, the implications can be dire for a society that is driven by science and technology.

Producing objective research, especially objective social science research, rests on the presumption that the researcher is value neutral. But to what extent is it even possible for the researcher to be value neutral? The researcher is a human being, socialized with particular values, studying other human beings, also socialized with particular values, in research settings bound by cultural values. Values imbue the research from top to bottom. 

Choices have to be made about studying something as complex as the criminal justice system. Those choices will be made based on the researcher’s values.

David Newman, in his text Sociology: Exploring the Architecture of Everyday Life8 points out that “We must remember that sociologists are people too, with their own biases, preconceptions, and expectations. Sociologists’ values determine the kinds of information they gather about a particular social phenomenon. If you were conducting research on whether the criminal justice system is fair, would you study criminals, politicians, law enforcement, judges, or victims?…The most accurate picture of reality is likely to be based on the views of all subgroups involved.” 

Exactly. And it’s even more complicated than that. Newman points out that the researcher’s value system will influence the very questions that they ask. Fortunately, by grounding our research in an explicit theoretical perspective, sociologists can at least identify these biases, if not fully control for them. If I conduct my research using Marxist theory, for instance, it’s understood that I take a critical approach to capitalism.

Value neutrality and objectivity are further complicated by the fact that research in the field involves human interaction, often in troubling social contexts. It is impossible to completely control one’s human responses to these interactions. One of the best examinations of this comes from a graduate thesis written by Carol Rambo Ronai. Ronai did participant observation research as an exotic dancer, but her emotional response to the setting made it impossible for her to remain objective. The anger and humiliation she felt as a dancer conflicted with her understanding of herself as a sociologist and researcher. She says, “Regardless of my cognitive desire to be the intrepid sociologist, braving new frontiers, going where most women dare not tread, my dread of the coming night imposes its presence on my reality in the form of a tightness in my chest that constricts my breathing. The tightness is paralysis. My heart rebels.” She turned this experience into a potent examination of the overlap between being a researcher and an emotional human being. So how does one separate the two? Ronai expresses great animosity, even hatred and a desire for violence, toward the men at the club. What impact does that have on the researcher and the research itself?

Another problem arises when considering the interaction between the researcher and the subjects of his research. For instance, while conducting my graduate research, I found myself confronting the contradictions of performing multiple roles. I was conducting research at my job site. So, on one hand, I was responsible for doing a job as a counselor at a residential wilderness program for delinquent boys. I was also responsible for teaching sex education at the camp. On the other hand, I was a researcher, conducting participant observation research on how the program governed sexuality and sexual development. I was also an emotional human being who cared about the people I worked with and the clients I served. Because of my role as a researcher, and my role as a supervisor at the camp, I became the sex guy…which isn’t as glamorous as it sounds in the middle of the wilderness with troubled teens. Any issues related to sex and sexuality, I was called in to deal with. 

But this creates a conflict between the researcher and the counselor. The knowledge that I gained from my research could be used to help the boys, as well as the other counselors, deal with sexually related issues in the camp. But wait. If I use this knowledge as a counselor, am I not altering my research setting? On the other hand, if I have a knowledge set, regardless of how I acquired that knowledge, that could help someone, be it a client or just a young man who needs help, am I not morally obligated to do so? Wouldn’t keeping that knowledge to myself for the sake of protecting the integrity of my research also be a breach of my sociological responsibility to protect my subjects? In my case, I decided to use my knowledge to supplement the camp therapeutic process, then explained this conundrum in my research. I’ll let the reader decide how it impacted the validity of the research. 

This speaks to an ongoing debate in sociology with regard to the role of the sociologist in society. On one end of the debate are those who suggest that the researcher should remain emotionally aloof from what is going on in society in order to maintain objectivity. The role of the researcher, in this case, is to do nothing more than record information and analyze data. If others can use that information to advance society, great, but that’s not the role of the sociologist.

On the other hand is a concept called praxis. Praxis is the application of theory to a practical end. From this perspective, the sociologist, having a nuanced understanding of society, social structures, and social interaction, should use this knowledge to actively pursue social betterment. The activist sociologist is the ideal. But then how does one make the distinction between the activist and the sociologist? How can we be sure that the research is intent on furthering our understanding of the social world as opposed to pursuing a social agenda? And if this is the case, how can we vouch for the objectivity of our research?

How can publicly active sociologists like Michael Eric Dyson balance their public-facing activism with their responsibilities as scholars?

This debate is especially important as sociologists bring their unique perspectives into the popular marketplace of ideas. Such notables include William Julius Wilson, Michael Eric Dyson, and Judith Butler. It’s incumbent upon all of us who think it is important to develop a public presence for sociological thinking and analysis to do so in a way that respects sociology as an academic discipline. We have to make sure that our public positions are informed by valid and reliable sociology. Under no circumstances can we allow our public positions to inform the sociological work that we do. And that’s a pretty complicated balancing act. The only way to do this is through open introspection and self-evaluation. 

Economist Paul Krugman regularly writes a post on his New York Times blog in which he evaluates his performance over the past year. Most importantly, he makes it a point to highlight the areas where his analysis was off or just flat out wrong. This is never fun for any academic, but he does it. This is human science, and unlike protons and neutrons, human beings are prone to error. If we are going to do this science well, then we had better be forthcoming about the errors that we make.

Wow, this is a lot of stuff to think about. It might just be easier to study protons and neutrons than it is to study human beings.

Yeah, you could say that. That’s why last year the noted astrophysicist Neil deGrasse Tyson tweeted, “In science, when human behavior enters the equation, things go non-linear. That’s why Physics is easy and Sociology is hard.”

Well, let’s just say that sociology and physics both have their unique challenges. For sociologists, however, on top of all the challenges posed by trying to keep the non-linear straight, there is the added burden of maintaining an ethical grounding.


Unlinked Sources and Notes

  1. Hochschild, Arlie Russell. 2016. Strangers in Their Own Land: Anger and Mourning on the American Right. New York: The New Press. ↩︎
  2. Humphreys, Laud. 1975. Tearoom Trade: Impersonal Sex in Public Places. New York: Routledge. ↩︎
  3. It’s not enough to give consent. Giving consent to a study under false pretenses may mean that the research subjects did not have the necessary information by which to decide if they wanted to participate. Informed consent means they have all of the necessary information. ↩︎
  4. Zellner, William. 1978. Autocide: Suicide by Automobile. ↩︎
  5. The identity of research subjects should not be discernable either directly or indirectly. It’s not enough to just change the names of the subjects in the study. A reader should not be able to tell, even through secondary clues, who the individual is. If you are doing research on a guy named Joe Biden, it’s not enough to change his name to Moe Diden. It’s also a violation of confidentiality to refer to Sam, a guy who lives in a big white house on Pennsylvania Avenue. ↩︎
  6. Lynd, Robert S. and Helen Merrell Lynd. 1929. Middletown: A Study in Contemporary American Culture. New York: Harcourt, Brace. ↩︎
  7. Goffman, Alice. 2014. On the Run: Fugitive Life in an American City. Chicago: University of Chicago Press. ↩︎
  8. Newman, David M. 1995. Sociology: Exploring the Architecture of Everyday Life. Thousand Oaks, CA: Pine Forge Press. ↩︎
  9. https://www.autism360.com/vaccines-and-autism-the-never-ending-controversy-131/
  10. https://www.pewresearch.org/short-reads/2020/03/18/about-half-of-u-s-adults-are-wary-of-health-effects-of-genetically-modified-foods-but-many-also-see-advantages/
  11. https://www.pewresearch.org/short-reads/2023/08/09/what-the-data-says-about-americans-views-of-climate-change/
