One Researcher’s Comments on the Unity Oversight Committee Survey and Findings

Editor’s Note: We asked researchers for their analysis of the Unity Oversight Committee Survey process and results. Here is one response.

As the conflict has continued around the ordination of women pastors and issues of compliance with voted actions of General Conference Sessions and of the General Conference Executive Committee, the Unity Oversight Committee requested that the Office of Archives, Statistics and Research (ASTR) conduct a global survey. The results of the survey, entitled “Questionnaire on Compliance,” have been published in the Adventist News Network post dated March 23, 2018, and on Spectrum. The General Conference said that the findings represent the profile of global Seventh-day Adventist opinion on the issue of unity and compliance in the SDA Church. Because it is in the interest of us all to understand the profile of opinion among us on this and other issues, I offer these comments on the methodology and findings of this undertaking in the hope that they will help to clarify the relationship between the stated purpose of this study and its findings.

Study Purpose

Because this study is so important in the development of Church policy on a looming, divisive issue, it is essential to consider the authenticity of its findings. The key to this is the way the findings were generated by the data said to support them. And the very foundation of data generation is the methodology by which they were produced.

However, these issues cannot be engaged without first considering the study’s purpose. What was it after? What did it seek to discover or elucidate? The prelude to the survey’s questions states this:

“The General Conference Unity Oversight Committee would like to explore the opinion of the world field, represented by division and union presidents, on the issue of compliance with voted actions of General Conference Sessions and of the General Conference Executive Committee.

“We request that you, as a division/union president, record what you believe is the view of the majority of members in your territory (as opposed to your personal view) on the following questions.” [Underlining added by author.]1

Another indication of the survey’s purpose can be surmised from a statement about how the data will be used by the Committee.

“…the survey provides quantitative data, allowing the committee ‘to more accurately judge where the world Church leaders and members stand on these issues,’ according to Mike Ryan, chair of the committee. ‘This information will serve as a guide to the Unity Oversight Committee in defining consequences for unions who have not complied with votes of the GC Session and of the GC Executive Committee,’ he added.”2

This statement presupposes that there is a shared understanding among the leadership and general membership on the meaning of “compliance” and that division and union presidents can accurately know and represent the view of the majority of members in their territory.

Two Parts of the Study

The methodology of the study—the way its data were generated—has two main parts: a survey of 150 top Seventh-day Adventist leaders and a “qualitative” part involving conversations with a number of these leaders. There are five aspects of the study addressed here. Two are aspects of the survey, its sample and instrument. Two are aspects of the qualitative component, the extent to which it was systematic and its documentation. Finally, the findings of the two study components are addressed as they are related to the study’s stated purpose.

The Survey Sample and Instrument

A basic issue in any sample survey is the extent to which the sample represents the population from which it is drawn. The gold standard is a strict probability sample, in which every element of the population has a known, nonzero probability of being selected (and, in a simple random sample, an equal one). This is rarely achieved, because response rates in even a strict probability sample are seldom 100 percent. The question then becomes the extent to which the almost inevitable compromise with this standard corrodes the representativeness of the sample.
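For perspective, even an ideal probability sample of 150 carries nontrivial sampling error. A minimal sketch of the standard margin-of-error calculation follows; the sample size, confidence level, and 50/50 split are illustrative assumptions, not figures from the study:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate margin of error for a proportion estimated from a
    simple random sample of size n, using z * sqrt(p(1-p)/n)."""
    return z * math.sqrt(p * (1 - p) / n)

# Even a flawless simple random sample of 150 respondents carries
# roughly +/- 8 percentage points of error at 95% confidence.
moe = margin_of_error(150)
print(f"+/- {moe:.1%}")  # prints "+/- 8.0%"
```

A convenience sample of leaders, by contrast, has no calculable margin of error at all, which is part of the author's point.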

The sample in this study is not at all a probability sample, but one apparently based on the convenience of the investigators, as it was easy for them to poll 150 of the most senior Church leaders, who were supposed to be able to accurately know and report the opinions of congregants in their massive units. It is a problematic leap to get from leadership beliefs about the opinions of members of their groups to the opinions of the members themselves. It is misleading to assert that any leader can accurately know and report the range of opinion of hundreds of thousands of others in the group, particularly when no attempts to systematically gather information have been done within these large groups. Claiming to know the opinions of those in one’s union or division does not make it so, and it is a gross misrepresentation of the data to claim that it does. It is like saying that all the Cardinals and Bishops of the Catholic Church can accurately know and report the range of Catholic opinion on things like contraception or abortion.

The questionnaire, attached as Exhibit 1, is also problematic.3 The construction of instruments, often called questionnaires or interview schedules, is an extremely important step in the sample survey process. The most credible organizations engaged in this kind of work are generally the more well-known and seasoned university survey research shops. They often work for months and sometimes years to create reliable and well-validated items, that is, questions, for their surveys. This means simply that well-validated items measure what we think they measure.

The questions in this survey are derived from various actions proposed in the document entitled “Procedures for Reconciliation and Adherence in Church Governance Phase II” discussed at last year’s Annual Council and referred back to the Committee.4 Likely the committee wanted the wording of the questions to be consistent with the language in the compliance document. Yet the wording is important to the scientific nature of the survey process, findings, and conclusions. The six items in the Unity Oversight Committee survey are too long and too vague to meet this standard, though some seem to be more valid than others. (See Exhibit 1.)

In question 1, the meanings of some of the major terms are not clear and subject to manifold interpretations: “listen sensitively,” “counsel,” “not in compliance.” In question 2, the concept of “organizational consequences” is unclear. Questions 3, 4, 5, and 6 are clearer, but they could certainly be sharpened and made more valid with a substantial period of application and honing. But if this could not be done because of the urgency of launching the survey, researchers would have been well-advised to consult existing well-validated survey items and to base their new items on these.5 Even assuming that the items are reliable, meaning that they would consistently generate the same results when measuring the same opinions, the validity of the six items of the survey is questionable. We cannot know with real confidence that they measure what we think they measure. And without the assurance that the convenience sample of 150 Adventist leaders represents the range of opinion of 20 million of us and that the survey items measure what we think they measure, we cannot be at all sure of the apparent survey results.

In addition, the use of a five-point response scale for each question, such as strongly favor, favor, no opinion, oppose, strongly oppose, instead of the binary ‘yes’ or ‘no’ responses, would have captured a more varied range of positions on the compliance issues studied. Similarly, the addition of demographic data, such as age, ethnic background, length of service, and education, would have allowed for more nuanced findings on the opinion items.
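To illustrate the point about response scales, here is a small sketch, using made-up responses rather than the actual survey data, of how collapsing a five-point scale into a binary yes/no discards both intensity of opinion and the undecided middle:

```python
from collections import Counter

# Hypothetical five-point responses (illustrative only).
responses = ["strongly favor", "favor", "favor", "no opinion",
             "oppose", "favor", "strongly oppose", "oppose"]

five_point = Counter(responses)

def collapse(r: str) -> str:
    """Reduce a five-point response to the binary format the survey used;
    intensity ('strongly') and the middle category are lost."""
    if r == "no opinion":
        return "no answer"
    return "yes" if "favor" in r else "no"

binary = Counter(collapse(r) for r in responses)
print(five_point)  # distinguishes strong from mild positions
print(binary)      # only yes / no / no answer survive
```

The binary tally cannot distinguish a reluctant "favor" from an emphatic "strongly favor," which is exactly the nuance the author argues was lost.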

Qualitative Data Collection

There are a number of well-accepted qualitative data collection methods in social analysis. One of them is nominally-scaled items in sample surveys, and some would argue that the six items in this survey are of this type. Other accepted qualitative methods include focus groups, in-depth interviews, simulations, and anthropological field studies and their cousin, participant observation studies. What all of these methods have in common is that they must be systematic, and their procedures and results must be documented. Since there is no readily available documentation of the “personal visits and dialogues” with church leaders said to comprise the qualitative component of the unity project, it is impossible to know whether these conversations were appropriately systematic and documented. Therefore it is difficult to be confident in the data generated and to draw conclusions about the consistency of the information from the “listening sessions” and the findings from the questionnaire. This is especially true given the lack of anonymity in both the “quantitative” and “qualitative” responses.

The Findings

The findings of systematic social research are typically reported in such a way that there is a clear and logical link between the research operations and the conclusions drawn. As in all such studies, the very foundation of data generation is the methodology by which the data were generated. In this study the sample is not representative of the global body of the Seventh-day Adventist Church, as no person in the general membership was in the sample, only a small number of its higher leaders. And the validity of the survey instrument is questionable. In the qualitative component of the study, we have no assurance that the data collection was systematic or well-documented. For these reasons, we can have little confidence in the study findings as a whole.

The Presentation of the Findings

The appropriateness of the presentation of the study findings is open to challenge by those who uphold the standards for the conduct of systematic social research.6 In the first place, the reportage of the findings indicates the proportion of the global Seventh-day Adventist population represented by Church leaders responding “Yes” or “No” to survey questions, strongly suggesting that the responses represent the indicated proportion of the entire population under study. This is potentially misleading. Second, the identity of study respondents was apparently available to some of the researchers in such a way that individuals’ responses could be known. Any sample survey with such sensitive questions, ones that could lead to punishment of those who answer in ways that do not support leadership, should be absolutely anonymous, in the sense that the responses of individuals could not be known to researchers or anyone else. Otherwise, the survey can only be construed as an open plebiscite of followers by their leaders. How could that be presented as an adequate measure of opinion on sensitive issues?

It is this researcher’s hope that these observations will enhance our purpose in promoting the work of our Church in advancing the gospel.

Exhibit 1. Survey Instrument

Notes & References:

1. The instructions for the “Questionnaire on Compliance,” the questionnaire for this survey, are included below as Exhibit 1.

5. Among the many sources of well-validated survey items are the Survey Research Center at the University of Michigan and the National Opinion Research Center at the University of Chicago.

6. The findings have been reported in the Adventist News Network post of March 23, 2018 and in the Spectrum post of the same date. Adventist News Network, “Survey Results Presented to Unity Oversight Committee: Qualitative Research Continues” [accessed April 11, 2018]; Spectrum Magazine, “Unity Oversight Committee Survey Results” [accessed April 11, 2018].

William W. Ellis, Ph.D., is Professor of Political Studies at Washington Adventist University. Earlier in his career he held tenured faculty positions in political science at Northwestern University, the University of Michigan, and Howard University, as well as senior research and management positions in industry and the federal government. At Northwestern University, then a leader in quantitative political research, he taught some of the basic graduate courses in research methodology, including an advanced graduate course in multivariate analysis, which he developed. In addition to his doctoral studies at New York University, he was trained in survey research and other research methods at the Survey Research Center of the University of Michigan, a leader in its field.

Image courtesy of Adventist News Network.

Further Reading: Unity Oversight Committee Survey Results General Conference Re-asks the Questions of 2017 Unity Oversight Committee Releases Statement Regarding Way Forward Unity Oversight Committee Continues to Gather Data



If they can accurately know what their members think, then why should there be a “survey”? Or in other words, if division and union presidents know the views of the members of their territories, why wouldn’t the president of the GC and his staff also know? And yes, they knew.
I simply mean that the “survey” was conducted only for the purpose of making someone think that there was wide opinion gathering and that that opinion was again in favour of a predefined assumption.
I won’t exaggerate if I state that this “survey” was done at an elementary-school level.


Elementary School Level –
Perhaps THAT IS the Level of Mental Processing by the Division, Union, Conference
Leaders we have to look up to!


Isn’t the best available sampling of members opinions regarding WO the results of union constituency meetings? It seems to me that this survey is Ted Wilson saying that he will not believe or accept the representative vote of the Pacific Union constituency, for example. Thus I agree that this survey may well have been conducted only to make some people think there was wide opinion gathering. However, to claim that seems like “false witness”. It seems duplicitous.


Just curious… would you and others feel the same way had the percentages been reversed?

…that it was a sloppy, non-rigorous, piece of research…


Absolutely I would accept the findings regardless of what they were IF the survey was credible in its preparation and application. The question is not the “findings”. It is the fact that nothing reliable was found because the survey instrument was flawed. (BTW, I studied doctoral level statistics and survey methods. Thus my comments.)


There are three kinds of lies: lies, damned lies, and statistics.

Having shared this famous quote, I should add that I’m a big believer in statistics when a carefully designed study is properly analyzed, and the author of this critique is spot on regarding shortcomings of the survey.


Not fair. I taught my 3rd and 6th grade students better than that.


back in the stone age (when i was in college) i took a course entitled ‘ethics in business’. IF i had done this as a research project for that class, i doubt i would have passed the course!


Still waiting for the professional researcher who would be willing to ‘put their hand up’ and endorse the G.C.-sponsored survey as valid and credible.


Impossible. It’s not about “being willing” but about “being possible,” and by the standards of scientific method for how a survey is to be conducted, it’s impossible.


I find no Biblical basis for voting on “beliefs.” The New Testament church was to be flexible with one another. What precedent is there for one part of the church to turn punitive on another part? They were to bear with one another in love and let each be fully persuaded. The job is more difficult now since the church has a global population. Now we deal with the social construct called “organization” or “corporation.” Cultures are more diverse than in the Acts story.


Behind this and the overall controversy is an assumption that the entire world field had to be ready for WO and approve it in a General Conference Session. It was a faulty assumption by GC leadership from the start because it would inevitably lead to the opposite of the intended result: Church unity. Many of us were willing to wait even 30 years as the church wrestled with a resolution to this problem. But not 50. Nor were we willing to wait for a faulty process to lead to several votes by the GC in Session including the most recent one. In my view, we are drinking the same legalistic approach to very thorny social and ethical issues that defined Judaism in the first century. The church cannot make momentous decisions by a vote that alienates so many. If each area were allowed some gospel freedom, as the Gentiles were in the early church, there might continue to be disagreement but there would not be organizational and structural splits which threaten unity. Sad, sad, sad.

  1. Does implicit bias have any meaning here?
  2. Employ the services of an independent agency or university to conduct the survey, e.g., the Barna Group, an agency with no skin in the game. What are we afraid of?
  3. When the GC President rejects the result of the TOSC study he recommended and commissioned, what can be the expected result of this survey?
  4. Perhaps before surveying, the definition and application of compliance must be defined, mutually understood, and accepted as it relates to policy, as a starting point.
    Conclusion: As of this date I don’t believe this has been done.

I haven’t achieved such a level of education, and yet will state that anyone that took Intro to Statistics as a college first-year should be able to tell at first blush how invalid this survey is. If they were paying attention in class.

Even if the respondents could accurately know what their members think, if the questions were precise and clear, and if the answers were anonymous, the preamble to the questions still instructs respondents to answer only with the majority view, and that then a) creates a tyranny-of-the-majority scenario, and b) may well hide a near 50/50 split in the world church.
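The commenter's second point can be sketched directly. All numbers below are hypothetical, chosen only to show how majority-only reporting can turn a close split into apparent unanimity:

```python
# Ten hypothetical units, each split 55/45 in favor (made-up figures).
units = [{"members": 100_000, "pct_in_favor": 0.55} for _ in range(10)]

# Actual member-level opinion, weighted by unit size.
member_level = (sum(u["members"] * u["pct_in_favor"] for u in units)
                / sum(u["members"] for u in units))

# Each president reports only the majority view of their unit.
majority_reports = ["yes" if u["pct_in_favor"] > 0.5 else "no" for u in units]
report_level = majority_reports.count("yes") / len(majority_reports)

print(f"actual members in favor: {member_level:.0%}")     # prints "55%"
print(f"presidents reporting 'yes': {report_level:.0%}")  # prints "100%"
```

A 55/45 membership split surfaces as 100% agreement among reporting leaders, erasing the 45% minority entirely.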

In any case, for example, what difference does it really make if 51% of the church thinks that ~“presidents of unions out of any compliance with the GC be allowed to vote”? Does that have any bearing on anything? Since when are such things matters of belief or opinion of the masses?


Well, here’s a link to the staff of the GC’s “Office of Archives, Statistics, and Research”:
It is led by David Trim, PhD (History)

I read all the Bios there and could not find one person with a degree in statistics or math of any kind. One person’s job title is “Statistical Analyst”, but no education is listed.


Although I’m sure historians may do some statistical, quantitative research, historiography is the methodology used by historians.

I’m not surprised, then, that this research document, which claims to be statistically significant, lacks the rigor of a quantitative statistical study.

It was stunning to see this particular survey coming out of this office of statistics at the GC.


Well, it should be stunning. I’ve rather started to expect it.

Even their mundane quarterly reports, summaries of membership and attendance and giving, lack clarity and in some ways are unhelpful or even misleading. They’re not really reports with, for example, implications explained, but instead just tables (at least the public versions). For example, when only ~15% of members are typically in attendance on the two weeks reported each quarter, what does that likely mean? Is there a trend? Is there an issue with who we consider a member for reporting? Does the other ~85% really exist?


Oh the irony…the majority of the respondents want consequences for those out of compliance, but more than half don’t want to sign a form certifying compliance and conformity!
