National Needs Assessment Survey Results

The Open Education Association Development Project recently held a webinar reporting back on our first phase of work, which focused on mapping the needs, challenges, and priorities of the open education field. A major component of that work was the Open Education National Needs Assessment Survey, which was conducted in spring 2025 with input from 1,327 community members across all 50 states. While the survey was tailored to the goals of this project and not intended for formal publication, we are pleased to share more detail about the results. This blog post offers an overview of the methodology and findings, as well as downloads of the data tables and survey instrument (openly licensed).

Accessibility note: The images of graphs and data tables shown in this blog post are taken from the webinar slide deck.

Introduction

The Open Education Association Development Project is a national initiative to advance the open education field through strategic coordination and collaborative action. With a focus on U.S. higher education, the project starts from the premise that open education has made significant progress to benefit learners, yet is not as widely recognized as it should be. In today’s evolving landscape of policy changes, new technologies, and marketplace shifts, the open education field needs to evolve.

Funded by a two-year grant from the William and Flora Hewlett Foundation, the Open Education Association Development Project set out to identify shared needs, build connections across existing efforts, and expand support for open education’s collective impact. The goal of the Open Education National Needs Assessment Survey was to gather insights into the current state of the field and inform priorities for the next phases of work.  

Methodology

The survey was administered by the research firm Bay View Analytics, using funding from the William and Flora Hewlett Foundation’s grant to SPARC. The project steering committee provided input and oversight throughout the survey’s development, dissemination, and analysis.

The survey design was adapted from a market research approach known as Outcome Driven Innovation (ODI), which focuses on the type of activities a target audience is seeking to accomplish (also known as “jobs to be done”) and the extent to which this audience is satisfied with the current solutions to support that activity. For the purposes of this survey, the target audience was defined as individuals engaged or interested in U.S. open education work, with a focus on those in higher education (faculty, librarians, administrators, nonprofit staff, etc.). 

The full survey instrument took about 10 minutes for participants to complete and comprised several parts: 

  • Background: Participants were asked a series of questions to understand their professional role and relationship with open education. This section contained the only required questions in the survey.

  • Activities Rating: Participants were asked to rate a set of nine activities or “jobs to be done” related to open education on a five-point Likert scale according to the activity’s importance to their work. Those who rated an activity as at least moderately important were then asked to rate their satisfaction with the support available for that activity. Participants also completed a follow-up question for each activity, as well as one open-ended question at the end.

  • Perceptions: Participants were asked several overarching questions about the state of the field. This section was followed by an optional set of demographic questions.

The set of nine activities was identified from data collected through national strategy webinars held in 2024, as well as a series of need discovery interviews conducted by project staff in January and February 2025. The nine activities were:

  • Find OER: efficiently find OER

  • Tools & Resources: access tools and resources to support your open education work

  • Stay Up to Date: stay up to date about open education

  • Respond to Change: respond to political or technological changes affecting open education

  • Professional Development: access open education-related professional development 

  • Funding: secure funding for your open education work

  • Adapt/Publish OER: adapt and/or publish OER

  • Networking: network and collaborate with open education peers

  • Recognition: receive recognition for your open education work

Data were collected anonymously via Qualtrics. The survey was disseminated organically in March–April 2025 through an email invitation sent to individuals and groups within the open education community, with encouragement to share the invitation further. This approach resulted in 1,024 valid responses, which we will call the “primary sample.” Bay View Analytics also disseminated a shortened version of the survey to a random sample of U.S. faculty in order to compare overall trends. This drew another 303 valid responses, which we will call the “national faculty panel.”

Results were reported in aggregate to project staff. Consistent with the ODI approach, we used the activities rating results to calculate an opportunity score, which represents the gap between the perceived importance of an activity and the perceived satisfaction with its current solutions. The score is based on the proportion of respondents who chose the top two boxes (four or five) on the five-point Likert scale (“Top2Box”). We calculated the opportunity score as:

Opportunity Score = 10 × (Top2Box Importance + MAX(0, Top2Box Importance − Top2Box Satisfaction))

High opportunity scores indicate activities where the population is underserved, meaning that the activity is very important but people are less satisfied with the solutions available. Low opportunity scores indicate activities where the population is overserved, meaning that the activity is less important or people are highly satisfied with the solutions available.
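The calculation above can be sketched in a few lines of code. The satisfaction figure for “Find OER” in the example below is an assumption (chosen from within the 51%–61% band reported later in this post); the recognition figures are the ones reported in the results.

```python
def opportunity_score(top2box_importance, top2box_satisfaction):
    """ODI-style opportunity score on a roughly 10-point scale.

    Both inputs are proportions (0.0-1.0) of respondents who chose the
    top two boxes (four or five) on the five-point Likert scale.
    """
    return 10 * (top2box_importance
                 + max(0.0, top2box_importance - top2box_satisfaction))

# "Find OER": 80% importance; assuming 51% satisfaction (hypothetical,
# within the reported 51-61% band), the score matches the reported 10.9.
print(round(opportunity_score(0.80, 0.51), 1))  # → 10.9

# "Recognition": 21% importance, 36% satisfaction. Satisfaction exceeds
# importance, so the MAX term is zero and the score is driven entirely
# by the low importance, matching the reported 2.1.
print(round(opportunity_score(0.21, 0.36), 1))  # → 2.1
```

Because the gap term is floored at zero, an activity where satisfaction outstrips importance is never penalized below its importance level; its score simply reflects how important it is.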

Results & Discussion

For the purposes of this blog post, we will report only on the results of the primary sample (n=1,024) and not the national faculty panel (n=303). While the two samples did show some differences, the overall patterns were sufficiently consistent across the two groups to rely on the primary sample.

Of the 1,024 valid responses, 90% were employed at postsecondary institutions, most commonly as faculty or librarians. The other 10% were primarily employed by government agencies and non-profit organizations, with a handful working in K-12 and other sectors. Years of engagement with open education varied, with just over half reporting five years or more. Participants were geographically distributed across the U.S.

The activities rating section revealed a clear hierarchy in the importance of activities. An activity was considered important if the participant rated it four or five on the five-point Likert scale. Most activities were rated as important by 50% or more of participants, with finding OER (80%) and accessing tools and resources (71%) being highest. The only outlier was recognition for open education work, which only 21% rated as important.

Satisfaction ratings indicated that most activities have a moderate level of satisfaction. A participant was considered satisfied with the solutions available if they rated it four or five on the five-point Likert scale. Six of the nine activities were grouped between 51% and 61% satisfaction. No activities ranked higher than 61%, and three ranked significantly lower. Unsurprisingly, access to funding (32%) and the ability to respond to political and technological change (20%) ranked lowest. Interestingly, recognition (36%) was also low, suggesting that while fewer people view this as important, those who do are highly dissatisfied with current solutions.

The opportunity score calculation suggests that the clearest areas of opportunity are support for finding OER (10.9), responding to change (9.5), accessing tools and resources (8.4), and securing funding (7.9). These four areas are not only ranked highly overall, they are consistently at or near the top when results are disaggregated by profession and other demographics. The next four activities—staying up to date, professional development, adapting or publishing OER, and networking—fall in a more moderate opportunity score range between 5.2 and 6.9. This suggests that while there is still some unmet need in these areas, there is not as wide a gap in support. Recognition had the lowest score at 2.1, driven primarily by its low importance.

Plotting the activities by their importance and satisfaction scores illustrates the extent to which they are well served by the field. As the graph below shows, none of the activities fall in the severely overserved or underserved areas. Most activities are grouped in the middle range. This type of pattern is consistent with a field that is generally aligned with the needs of its audience, but falls short in certain areas due to persistent technical challenges, current events, or a lack of resourcing.

The final section of the survey assessed people’s perceptions of the open education field. The first question concerned participants’ level of confidence in knowing where to turn for support. About two-thirds (65%) of respondents rated their level of confidence as a 4 or 5 out of 5, suggesting that most people have a high level of confidence. While the overall amount of confidence varied slightly by profession, the distribution remained consistent.

The pattern of confidence also holds true across different types of open education networks, whether people are involved at a national, state, or institutional level. The only group that showed a much lower level of confidence is participants who indicated they were not part of any open education network. This suggests that there are portions of the audience not reached by existing efforts—likely many more than this survey reached, given that it was disseminated primarily through existing networks.

While the results were positive for the open education field at an individual level, the outlook was less positive at a national level. Participants were asked to rate the state of the open education field nationally on a five-point Likert scale. When asked whether the field is well coordinated and well recognized nationally, only 14% and 15% of respondents, respectively, rated it a 4 or 5. This suggests that while existing networks are successful in serving the needs of many individuals, there is a gap in representation at the national level.

Conclusion

Overall, the results of the National Needs Assessment Survey show a field that is broad, diverse, and provides a significant amount of support for the most important needs. Most respondents reported knowing where to turn for support, and most felt satisfied with the support available for the majority of activities. This suggests the greatest need is less about any single solution and more about amplifying and uplifting the existing solutions so that they can be more effectively utilized by the open education field. This is a clear role for a national association.

That said, the survey did indicate several areas of need where the association can prioritize its programming. The four priorities were:

  • Finding OER, particularly making it easier to find repositories

  • Responding to change, particularly providing strategic guidance on how to navigate funding cuts, political restrictions, and artificial intelligence

  • Finding funding for open education work, particularly at the institutional, system, and state level

  • Accessing tools and resources to support open education work, especially curated lists and how-to guides

Finally, perhaps the clearest directive of the survey is the perception of the national state of the field. With only a small portion of respondents viewing open education as well coordinated and well recognized nationally, there is a clear gap and need for a national voice that could be filled by an association. Furthermore, the survey suggested that many individuals who are not part of existing networks do not know where to turn for support. A national association can play that connective role—preserving the field’s hard-won gains, advocating on its behalf, and opening doors for individuals who are not yet linked to existing networks.

For more information about how the Open Education Association Development Project plans to put the results of the survey into action, please watch the recording of the Report Out Webinar held on July 28th. Additional downloads and links are provided below.

Survey Downloads

The survey was designed specifically to inform the project and is not intended for formal publication. In the spirit of openness, we are sharing the instrument (openly licensed) and frequencies for those who may be interested.

Feel free to reach out to us at contact@opened.org if you have questions about any of these documents or would like to request more information about our data or analysis.
