First Monday

Does the rising tide of OER lift all boats? by Candice Vander Weerdt

While open education resources (OER) have grown in popularity over the last few years, few researchers have considered the benefits of OER adoption beyond direct student benefits. Stakeholder theory argues there is merit in identifying value for several interconnected groups, in this context the institution and instructors. Using a quasi-experimental research design, I evaluated the effect of OER adoption on instructor benefits and institutional rewards, as well as multiple student outcomes. I found student satisfaction, perceptions of quality, and academic integrity were significantly related to OER adoption, while relationships to the other outcomes, notably direct instructor benefits, were not supported. These findings reveal the benefits of OER adoption are not realized for all stakeholders.






Course material costs in higher education have risen sharply in the last few decades, straining already thin student budgets. From 1977 to 2015, college textbook prices increased more than 1,040 percent (Popken, 2015), outpacing the increase in tuition and even healthcare costs over the same period (Weissmann, 2013). The College Board expects students to spend between US$1,240 and US$1,460 on course materials and supplies each year (Ma, et al., 2020).

These costs are meaningful for college students, who already take on significant debt. Researchers have found that only 18 percent of college students were unaffected by textbook prices, with the majority of students needing to take out loans or work full-time to support their studies (Jhangiani and Jhangiani, 2017). Additionally, low-income students, racial and ethnic minorities, and first-generation students were disproportionately affected by textbook costs (Jenkins, et al., 2020).

Besides the cost of the resources, the materials themselves have changed dramatically in the last two decades. The traditional usage model of textbooks included purchasing a physical book (generally from a college or university bookstore), using the textbook throughout the semester, then selling the textbook back to the bookstore. In this model, the student has access to the course assignments regardless of whether they purchase the textbook; they have the option of sharing textbooks or borrowing from a library; and they may also recover some of their costs by reselling the textbook at the end of the semester (U.S. Government Accountability Office, 2005).

For many students, these options are no longer available. In response to increasing institutional use of adjunct instructors, textbook publishers have begun providing further instructional materials to instructors, including test banks, lecture PowerPoint slides, learning management systems, and graded assignments (Del Valle, 2019). A physical textbook is not necessarily available and is instead replaced with a costly access code. This access code hides course assignments and materials behind a paywall, essentially requiring the purchase of the code to complete the course work. The student cannot share the code, nor borrow it from the library or a friend. They cannot sell the purchased access code back to the bookstore, nor use the materials after the class ends.

In response to these increasing costs and change in purchase behavior, some administrators, faculty, and non-profit organizations have turned to open education resources (OER) to supplement or replace high-cost course materials. OER is defined as “teaching, learning, and research materials that are either in the public domain or licensed in a manner that provides anyone with no-cost and perpetual permission to engage in the 5R activities [retaining, remixing, revising, reusing and redistributing the resources]” (Creative Commons, 2020). Many of these resources are available online for viewing and download. Some publishers also offer physical copies at distinctly low prices. For example, OpenStax of Rice University partnered with XanEdu to offer printed copies of OER textbooks for purchase on Amazon for only the cost of printing.

A primary concern of this move to OER is whether the reduced cost to students results in positive outcomes. Three independent reviews of OER research found inconsistent effects on student and instructor outcomes. Winitzky-Stevens and Pickavance (2017) found that OER adoption minimally impacted new students’ outcomes (course grade, likelihood of passing, and likelihood of failing) and had no significant effect on continuing students. Hilton (2016) reviewed 10 high-quality studies of OER efficacy and found only half obtained any significant relationships between measures of student performance and OER adoption. Even the studies supporting these relationships posed serious limitations to the findings. An extension by Hilton (2020) reviewed 16 additional efficacy studies published between 2015 and 2018 and found just over half the studies reported results favoring OER, while the rest of the reviewed studies reported no significance, mixed results, or results favoring traditional textbooks. The most recent review by Wells, et al. (2020) argues the cost savings is the most meaningful benefit for students and surmises the increased access to online course materials is likely to improve learning.

Furthermore, researchers have focused mainly on the effect of OER use on student outcomes (Bliss, et al., 2013). A focus on student outcomes fails to recognize the important benefits to other stakeholders of higher education — namely instructors, institutions, and the academic community (Hilton, 2016; Winitzky-Stevens and Pickavance, 2017). Wells and colleagues (2020) argue institutional support and faculty culture are critical factors in OER adoption and use. The present study fills the gap in current OER research by considering the impact of OER adoption beyond the direct benefits to students. Using a quasi-experimental method, I considered the instructor’s direct benefits of adopting OER in terms of student perceptions of the instructor and student evaluations of instruction.

Another contribution of this research is the investigation of institutional benefits. Institutions are rewarded for improving access to education and performance of low-income students, both through state funding (Ortagus, et al., 2020) and accreditation standards (Association to Advance Collegiate Schools of Business [AACSB], 2020). State funding is a significant portion of public institutions’ revenue, accounting for 21 percent of institutional budgets (Pew Research Center, 2019). In a systematic review of state funding of higher education, Ortagus and colleagues (2020) explain that prior to the 2008 recession, most state funding was based on prior appropriations and student enrollment. As of the 2020 fiscal year, 41 states have adopted performance-based measures that link funding to institutional performance and student outcomes. Furthermore, two-thirds of these states include equity metrics, such as the number of graduates from low-income backgrounds, when allocating state funds (Ortagus, et al., 2020). For this reason, I also examined the direct benefits on institutions and the academic community by considering the impact of OER adoption on academic integrity and performance of low-income students in higher education. Finally, I investigated the effects of OER adoption on benefits to the students, in terms of student performance, student satisfaction, and perceptions of quality.



Impact on stakeholders

Stakeholder theory argues the management of a firm must consider the perspectives of important stakeholders instead of making decisions with only the direct interests of shareholders in mind (Freeman and Reed, 1983). The theory is rather general and has been used in a variety of contexts. For this study, I adopt a managerial sense of the theory in that the theory is instrumental in guiding decision-making (Donaldson and Preston, 1995). While not all stakeholders are equally involved in decisions, organizational structures and general policies are affected by legitimate interests of all appropriate stakeholders.

Stakeholder theory is founded in the idea of the ‘common good’ through institutional actions (Baden and Higgs, 2015). Argandoña (1998) argues the common good is composed of a collection of individual self-interests, each from a different stakeholder, flowing toward a societal outcome. As entities of civil society that defend society and meet the general need for education and culture, higher education institutions have an obligation to the common good (Argandoña, 1998). This is reflective of Sustainable Development Goal 4 (SDG 4) of the United Nations Educational, Scientific, and Cultural Organization (UNESCO), which aims to ensure inclusive and equitable quality education (UNESCO, 2016). The obligation to the common good is further highlighted by the addition of a specific standard, “societal impact”, by the Association to Advance Collegiate Schools of Business (AACSB) in the 2020 revised standards for accreditation (AACSB, 2020).

Therefore, stakeholder theory is a logical guide for academe. Key stakeholders in an academic institution include the educators, graduates, and employers (Borg, et al., 2019). If an institution were to focus only on the outcomes of one group, for example the students, then the institution may fail to see the perspective of the other groups. In this way it is important to consider how OER adoption affects multiple stakeholders for successful implementation and benefit to the common good.

Student outcomes

Student outcomes are the primary focus of many OER research studies, and with good reason. The costs of course materials are commonly absorbed by the student directly. However, the effects of OER on student outcomes are mixed. Some OER studies fail to find a significant effect on student course grades or continuance (Lovett, et al., 2008; Hilton and Laman, 2012; Winitzky-Stevens and Pickavance, 2017; Beile, et al., 2020; Dempsey, 2021). This is not necessarily negative; large changes in grades may indicate the resources are not as rigorous as for-profit resources. In fact, the researchers of these studies noted the students performed as well as or better when using an OER compared to a traditional textbook (Lovett, et al., 2008). Given this evidence, it is unlikely course grades will be significantly related to OER adoption:

H1: OER adoption is not significantly related to student grades.

Student satisfaction and perceived quality of OER, however, have been positively and significantly linked to OER adoption in the literature. Cooney (2017) conducted interviews and short surveys of students using OER course materials. Students reported the OER was easier to access than traditional textbooks and 89 percent of respondents agreed or strongly agreed that the OER “increased my satisfaction with the learning experience” [1]. Weller and colleagues (2015) report consistent findings, with 62 percent of educators and 60 percent of students agreeing that OER increased student satisfaction. In further support of this finding, Feldstein, et al. (2012), found nearly two-thirds of the students preferred the online OER to a traditional textbook. In fact, across a total of 29 studies measuring the perceptions of 13,302 students, satisfaction with OER textbooks was largely positive (Hilton, 2020). Therefore, a positive relationship between OER adoption and student satisfaction is logical.

H2: OER adoption is positively related to student satisfaction.

Another important student outcome is perceived quality of the textbook. A common concern is whether OER textbooks are as reputable and reliable as traditional textbooks. In a survey of students using an OER, 96 percent of the students perceived the quality of the OER to be equal to other textbooks (Jhangiani and Jhangiani, 2017). However, some researchers have found inconsistent results. Howard and Whitmore (2020) asked students to rank two-page samples of four textbooks, two OER and two commercial, in order of perceived quality. Students indicated a significant preference for one commercial textbook, though indicated no distinction between the other commercial textbook and the two OER textbooks. When prices were revealed, perceptions of quality increased slightly for OER textbooks and decreased slightly for commercial textbooks. Still, the meta-analysis of Hilton (2020) maintains that the vast majority of students and instructors who have used both OER and traditional textbooks believe OER are of equal or superior quality. Given these findings, testing this relationship is warranted, and a positive relationship is expected.

H3: OER adoption is positively related to student perceptions of textbook quality.

Instructor outcomes

An important stakeholder to consider in the adoption of OER is the instructor of the course. In most cases, it is the instructor of the course that selects the required resources (Harley, et al., 2010; Del Valle, 2019). The instructor is also tasked with designing the course assessments and lecture materials to meet objectives. For-profit textbook publishers generally provide these resources to instructors, which can be particularly helpful for new instructors.

At the time of this study, very little research had been published regarding the perceptions of instructors who use OER. Vojtech and Grissett (2017) used scenario vignettes to measure students’ perceptions of instructors that adopt OER course materials and found students rated OER faculty higher than other faculty in terms of kindness, encouragement, and creativity.

A second study, by Nusbaum and Cuttler (2020), investigated the effect of OER adoption on the students’ perceptions of the instructor. Instructors were pseudo-randomly assigned to use either an OER textbook or a traditional textbook in a section of an introductory psychology course. The researchers found significant results in terms of student perceptions. Specifically, instructors using OER were perceived as more passionate, likeable, encouraging, and kind. The random matching of instructor and course strengthens the argument for a relationship between OER and instructor perceptions. Given these findings, it is reasonable to expect a positive relationship between OER adoption and perceptions of instructor in an experimental type study.

H4: OER adoption is positively related to student perceptions of the instructor.

As described earlier, the instructor often has input in the selection of course materials. Therefore, it is important to consider what direct benefits the instructor may glean from adopting an OER. In many higher education institutions, promotion and tenure decisions are based partially on the instructor’s student evaluation of instruction (SEI) scores (Seldin, 1993). These scores are designed to provide a measure of an instructor’s teaching effectiveness (Baldwin and Blattner, 2003). Yet, researchers have found these measures may be contaminated by factors independent of the instructor’s behavior, such as student GPA (Langbein, 1994) and instructor gender (Sidanius and Crane, 1989; Bernard, et al., 1981). Relevant literature has indicated these scores have been used by students to punish instructors that demanded quality work (Pounder, 2007; Crumbley, et al., 2001). Furthermore, the importance of these scores has increased significantly since the 1970s, when the measures were considered formative, to the present where they are considered summative evaluations (Alkathiri, 2021). Seldin (1993) reported these SEI scores are used more frequently in evaluating teaching performance than evaluations by the department chair or dean.

Crumbley, et al. (2001) found the factors likely to lower SEI scores revolved mainly around the difficulty of the course. Specifically, instructors were likely to be rated lower (punished) if they were perceived as “Grades hard”, “Significant homework”, and “Pop quizzes”. At the same time, more than half of students (55.6 percent) would reward an instructor they perceived as “nice” with positive marks. Considering this evidence, it is logical to conceive students may also reward instructors who choose OER course materials, thereby saving students money. Rewards may include providing favorable SEI scores or positive perceptions of the instructor.

H5: OER adoption is positively related to instructor SEI scores.

Institutional outcomes

Another stakeholder of OER adoption is the institution or academic community. While student performance and satisfaction are important to an institution, another important academic outcome is academic integrity. Academic integrity is a serious problem for higher education. McCabe and Trevino (1993) report that anywhere from 13 to 95 percent of college students have reported engaging in at least some form of academic dishonesty.

Digital learning environments may diminish academic integrity more than earlier learning environments did. For example, online note- and exam-sharing sites, such as Coursehero, allow students to share test questions and answers and can even post “instructor-only” test banks when obtained (Savage and Simkin, 2010). These sites may encourage students to adopt a transactional view of learning: students see course assessments as commodities instead of opportunities for learning (Harper, et al., 2019; Awdry and Newton, 2019). It is important to note that academic dishonesty not only compromises a student’s integrity (McCabe, et al., 2002), it has also been shown to replace and reduce learning in students who have cheated (Isserman, 2003).

Institutional structures may also impact academic integrity. Many faculty members do not report cheating because they believe it is difficult to prove (Harper, et al., 2019). Other faculty members have pointed to heavy teaching loads and performance reviews as factors that discourage pursuing cheating cases (Harper, et al., 2019). Finally, faculty members do not use all the tools available for detecting cheating when those tools are difficult to use (Sattler, et al., 2017).

Given the precariousness of academic integrity, the ease of using online tools, and the obstacles in identifying and penalizing academic dishonesty, the management of academic integrity is a timely and important institutional outcome. Some researchers have found cheating occurs when students feel the instructor does not care about them (MacGregor and Stuebs, 2012; Simkin and McLeod, 2010; Beasley, 2014) and some have reported the student’s relationship with the teacher influenced their cheating behavior (Maeda, 2021). As discussed earlier, perceptions of an instructor may be influenced by the adoption of free course materials (Vojtech and Grissett, 2017; Nusbaum and Cuttler, 2020). If academic integrity is strengthened by the student-instructor relationship, and adopting an OER may improve that relationship, then it is reasonable to deduce academic integrity will be positively related to OER adoption.

H6: OER adoption is positively related to academic integrity.

Finally, institutional priorities commonly focus on the performance of high-risk student populations. The disparity in student performance by student income is known as the income achievement gap. Notably, this gap has grown in school-age children and continues through higher education (Reardon, 2013). Several state governments (Wixom, 2015) and universities (Gooblar, 2020) have made closing this gap an institutional priority. Some have argued this gap will grow even larger as the COVID-19 pandemic affects higher education instruction (Mintz, 2020).

Furthermore, institutional budgets are directly impacted by low-income student performance. One of the largest sources of revenue for public institutions is government appropriations. At a typical four-year institution, government funding accounts for 18 percent of total revenue, comparable to the 20 percent received from tuition and fees (Ward, et al., 2020). Traditionally, state funding was portioned through an analysis of student enrollment. Shortly after the 2008 recession, many states adopted a performance-based determination, where institutions were funded based on student outcomes, usually through the measure of student completion (Ortagus, et al., 2020). Most of these state governments also considered equity measures in funding decisions, such as the number of graduates from low-income backgrounds or underrepresented minorities (Ortagus, et al., 2020). Federal funding is also likely to reward institutions for improving the income achievement gap. Section 420cc of the College Affordability Act includes a bonus program that explicitly rewards higher education institutions that graduate a significant number of low-income students on time (U.S. Congress, 2019).

While significant improvement in student performance has not been consistent, there is reason to believe student performance for certain student populations is impacted by OER adoption. Colvard and colleagues (2018) found D, F, and Withdraw (DFW) grades were reduced by almost twice as much for Pell students (4.43 percent reduction) versus non-Pell students (2.05 percent reduction) by adopting an OER. These findings may be contaminated though, as the Pell status of students was given to the researchers in aggregate and could not be stratified by course and instructor. However, this evidence suggests the impact of OER adoption on student performance may be moderated by Pell status.

H7: The relationship between OER adoption and student performance is moderated by student Pell Grant status such that a positive relationship is amplified.




Method

The study took place at a large, urban, public university. The focal course was an introductory business course, entitled “Principles of Management”. The class is typically taken in the third year for students with a business or related major. The course is primarily taught by the same instructor in an online and on-campus setting. The course requires weekly chapter quizzes, 30 self-assessment assignments, a self-assessment term paper, and three multi-chapter exams. The on-campus sections also required group projects and case presentations, while the online sections were assigned discussion board presentations and participation. This format was the same for both conditions.

Design and procedure

A two condition quasi-experimental design was deployed to test these hypotheses. Both conditions were evaluated in the same course, with the same instructor, with the same course design. The course in the first condition required the McGraw-Hill textbook, eighth edition published in 2018, with a unique Connect code. The Connect access code was required to complete course assignments. A physical book, with Connect code, could be purchased from the university bookstore for US$277.75 new and US$208.50 used. An electronic textbook with Connect code could be purchased from the bookstore for US$173.37. The electronic textbook and code could also be purchased directly from the publisher for US$125. This was the lowest cost option for students. This format was used for the previous three years by the instructor. Four sections of the course were included in condition 1.

The course in the second condition required an OER textbook from OpenStax published in 2019. The textbook was available online at no cost and could be downloaded in a PDF format. No access code was required. A physical copy was also available at the university bookstore for US$54 new and US$40.50 used but was not required. This was the first semester using this textbook for the institution and the instructor. Two sections of the course were included in condition 2.

Condition assignment was determined by the course the participant was enrolled in. In total, 200 students were recruited and 158 consented to participate by completing a survey, resulting in a response rate of 79 percent. Of these responses, three were eliminated for incomplete surveys and one was eliminated for completing the survey in under two minutes. Furthermore, following Meade and Craig (2012), the survey contained several bogus questions to identify careless responding. Eight cases were removed for failing to answer at least 70 percent of the bogus questions accurately. This led to a final sample size of 145 cases, 87 in condition 1 (traditional textbook) and 58 in condition 2 (OER textbook).
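The screening rules above are straightforward to express in code. Below is a minimal pandas sketch; the column names (`completed`, `duration_seconds`, `bogus_*`) and the 1 = accurate coding of the bogus items are hypothetical, not taken from the study’s instrument:

```python
import pandas as pd

def screen_responses(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the exclusion rules: incomplete surveys, surveys under two
    minutes, and careless responding per Meade and Craig (2012)."""
    # Drop incomplete surveys
    df = df[df["completed"]]
    # Drop surveys completed in under two minutes
    df = df[df["duration_seconds"] >= 120]
    # Drop cases answering fewer than 70 percent of bogus items accurately
    bogus_cols = [c for c in df.columns if c.startswith("bogus_")]
    accuracy = df[bogus_cols].mean(axis=1)  # items coded 1 = accurate, 0 = not
    return df[accuracy >= 0.70]
```

Each rule filters independently, so the order of the three checks does not affect which cases survive.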

Full sample characteristics may be found in Table 1. Demographic and course delivery characteristics did not differ significantly across groups. Most cases were juniors or seniors under the age of 30 and reported being White. There were slightly more females than males. Most cases reported a GPA above 3.0. Data indicated the sample contained a diverse group of majors, with a third of the students reporting a non-business major. Nearly half of the respondents received Pell grant support.


Table 1: Sample characteristics.
Note: n=145.

Characteristic | Cond. 1 — Textbook | Cond. 2 — OER | Total
Course delivery | | |
  On-campus | 36 (25%) | 30 (21%) | 66 (46%)
  Online | 51 (35%) | 28 (19%) | 79 (54%)
Class standing | | |
  Freshman | 0 (0%) | 0 (0%) | 0 (0%)
  Sophomore | 3 (2%) | 1 (1%) | 4 (3%)
  Junior | 31 (21%) | 24 (17%) | 55 (38%)
  Senior | 49 (34%) | 31 (21%) | 80 (55%)
  Other | 4 (3%) | 2 (1%) | 6 (4%)
Gender | | |
  Male | 33 (23%) | 29 (20%) | 62 (43%)
  Female | 54 (37%) | 29 (20%) | 83 (57%)
Age | | |
  18–22 | 38 (26%) | 28 (19%) | 66 (46%)
  22–30 | 36 (25%) | 23 (16%) | 59 (41%)
  30+ | 11 (8%) | 5 (3%) | 16 (11%)
  Did not answer | 2 (1%) | 2 (1%) | 4 (3%)
Race/ethnicity | | |
  White | 61 (42%) | 46 (32%) | 107 (74%)
  Black or African American | 10 (7%) | 5 (3%) | 15 (10%)
  American Indian or Alaska Native | | |
  Asian | 2 (1%) | 1 (1%) | 3 (2%)
  Native Hawaiian or Pacific Islander | 1 (1%) | 0 (0%) | 1 (1%)
  Multiracial | 6 (4%) | 0 (0%) | 6 (4%)
  Other | 7 (5%) | 5 (3%) | 12 (8%)
  Did not answer | 0 (0%) | 1 (1%) | 1 (1%)
GPA | | |
  3.0–4.0 | 61 (42%) | 43 (30%) | 104 (72%)
  2.0–2.99 | 23 (16%) | 15 (10%) | 38 (26%)
  <2.0 | 0 (0%) | 0 (0%) | 0 (0%)
  Did not answer | 3 (2%) | 0 (0%) | 3 (2%)
Pell Grant | | |
  Yes | 40 (28%) | 27 (19%) | 67 (46%)
  No | 47 (32%) | 30 (21%) | 77 (53%)
  No answer | 0 (0%) | 1 (1%) | 1 (1%)
Major | | |
  Business administration | 11 (8%) | 12 (8%) | 23 (16%)
  Health sciences | 18 (12%) | 5 (3%) | 23 (16%)
  HR/Management | 11 (8%) | 16 (11%) | 27 (19%)
  Other business | 12 (8%) | 10 (7%) | 22 (15%)
  Other (non-business) | 34 (23%) | 14 (10%) | 48 (33%)
  No answer/Undecided | 1 (1%) | 1 (1%) | 2 (1%)


Missing data were examined and determined to be missing completely at random, as no single item was missing more than three percent of scores and no case was missing more than six percent of the survey items. The mean was imputed for continuous variables and the median was imputed for survey items.
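A minimal sketch of this imputation rule, assuming the data sit in a pandas DataFrame and that the analyst supplies the lists of continuous and survey columns (names hypothetical):

```python
import pandas as pd

def impute(df: pd.DataFrame, continuous: list, survey: list) -> pd.DataFrame:
    """Mean-impute continuous variables; median-impute Likert survey items."""
    df = df.copy()
    for col in continuous:
        df[col] = df[col].fillna(df[col].mean())
    for col in survey:
        # The median keeps imputed Likert scores on the observed scale points
        df[col] = df[col].fillna(df[col].median())
    return df
```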

Measures

Variables were measured through multiple methods, including survey items and archival data. Using data from more than one method reduced the risk of common method bias (Podsakoff, et al., 2003). Three sources of information were accessed. Student perceptions, characteristics, and behaviors were assessed through survey items. Student performance measures were collected through obtaining course grades and test scores. Student perceptions of the instructor were collected from the data from the institution’s SEI reports, as well as in survey items.

To measure the impact of OER adoption on the student outcomes, the student’s final grades and average exam scores were collected after the course ended. Student final grades are noisy measures that include attendance and group participation scores; therefore, collecting exam grades is appropriate (Hilton, et al., 2016). The exams in both conditions were identical in terms of number of questions, types of questions, and time allowed. Student satisfaction was assessed through one global item (“I am satisfied with the current textbook.”) as well as through items assessing how easy it was to access the textbook, course materials, and assignments (Michael Schwartz Library, 2018). Student perceptions of textbook quality were measured with items derived from the “Faculty guide for evaluating open education resources” (BCOER, 2015).

The impact on instructors was measured in two ways. First, students were asked to evaluate their instructor on a variety of characteristics including knowledgeable, engaging, and organized, used by Jhangiani and colleagues (2018) and available in the OER research toolkit. The survey was deployed in the last month of the course and confidentiality was ensured. Additionally, the SEI scores for the six courses of the study were obtained and analyzed. SEI surveys are completely anonymous and administered online by the university. Questions on the survey assess the perceptions of instructor and course quality.

Institutional impacts were solely assessed through the survey data. Five academic behavior questions were adapted from the earlier ease of access questions. These questions did not directly ask students if they had cheated on any of the course assignments, but instead asked how easy it was to find answers to the quizzes and exams online. This wording was important to soften the negative connotations of finding answers online in order to reduce the likelihood of social desirability. Furthermore, the wording depersonalizes the behavior in an attempt to remove the notion of a threat (Alreck and Settle, 1985). Finally, the impact of OER adoption on low-income students was identified by adding Pell Grant status into the analysis. Pell Grants are awarded to students who demonstrate exceptional financial need (U.S. Department of Education, n.d.).
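A moderation hypothesis such as H7 can be operationalized as a two-way factorial ANOVA with an OER × Pell interaction term. Below is a sketch using statsmodels; the column names (`exam_score`, `oer`, `pell`) are hypothetical, and this is one way to run the test rather than the study’s exact SPSS procedure:

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def moderation_test(df: pd.DataFrame) -> pd.DataFrame:
    """Two-way ANOVA: does Pell status moderate the OER-performance link?
    Expects columns exam_score (continuous), oer (0/1), pell (0/1)."""
    model = smf.ols("exam_score ~ C(oer) * C(pell)", data=df).fit()
    # The C(oer):C(pell) row of the ANOVA table tests the interaction
    return anova_lm(model, typ=2)
```

A significant interaction row would indicate the OER effect on performance differs by Pell status.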

Data analysis

Analysis of variance (ANOVA) procedures were conducted in IBM SPSS 25 to test differences between groups. Differences were assessed per item and in aggregate; aggregate scores are the average of all items in the scale. The corresponding F-values, effect sizes, and significance are presented in Table 2 (Student outcomes), Table 3 (Instructor outcomes), and Table 4 (Institution outcomes). A factor analysis was conducted for all survey measures. Factor scores were also tested as aggregate measures, and results were unchanged.
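Outside SPSS, the same per-item comparison amounts to a one-way ANOVA with an eta-squared effect size. A minimal sketch using scipy, where the two arguments are lists of item scores for each condition:

```python
from scipy import stats

def oneway_with_eta(group1: list, group2: list):
    """One-way ANOVA across two conditions plus eta-squared effect size."""
    f, p = stats.f_oneway(group1, group2)
    n1, n2 = len(group1), len(group2)
    grand = sum(group1 + group2) / (n1 + n2)
    # Between-groups sum of squares over total sum of squares = eta squared
    ss_between = n1 * (sum(group1) / n1 - grand) ** 2 \
               + n2 * (sum(group2) / n2 - grand) ** 2
    ss_total = sum((x - grand) ** 2 for x in group1 + group2)
    return f, p, ss_between / ss_total
```

With two groups, this F test is equivalent to an independent-samples t-test (F = t²), so the table entries can be cross-checked either way.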


Table 2: Student outcomes based upon course materials (Traditional textbook vs. OER).
Note: *Observed items coded: “Strongly Agree” — 1 ... “Strongly Disagree” — 5.

Measure | Cond. 1 — Textbook M (SD) | Cond. 2 — OER M (SD) | F | η² | p
Final grade percentage | 0.950 (0.083) | 0.943 (0.057) | 0.242 | 0.002 | 0.623
Average exam score | 0.908 (0.130) | 0.875 (0.106) | 2.582 | 0.018 | 0.110
Satisfaction (average) | 1.784 (0.677) | 1.414 (0.364) | 14.560 | 0.092 | 0.000
I am satisfied with the current textbook. | 2.046 (1.099) | 1.293 (0.562) | 23.152 | 0.139 | 0.000
It was easy for me to access and read the textbook material for the course. | 1.552 (0.859) | 1.241 (0.506) | 6.134 | 0.041 | 0.014
It was easy to access the textbook material from an e-reader device, such as iPad, Kindle, or Nook. | 2.241 (1.067) | 2.017 (1.084) | 1.516 | 0.010 | 0.220
It was easy for me to access and complete the self-assessments assignments for the course. | 1.299 (0.593) | 1.103 (0.307) | 5.336 | 0.036 | 0.022
Perceptions of quality (average) | 1.839 (0.634) | 1.572 (0.462) | 7.602 | 0.050 | 0.007
The textbook covers all areas and ideas of the subject. | 1.839 (0.913) | 1.672 (0.685) | 1.403 | 0.010 | 0.238
The content in the textbook is error-free. | 2.287 (0.926) | 1.983 (0.688) | 4.581 | 0.031 | 0.034
Content in the textbook is up to date. | 1.943 (0.826) | 1.638 (0.667) | 5.489 | 0.037 | 0.021
The text is clearly written. | 1.713 (0.761) | 1.552 (0.597) | 1.837 | 0.013 | 0.177
The topics in the text are presented in a logical, clear fashion. | 1.839 (0.938) | 1.603 (0.724) | 2.616 | 0.018 | 0.108
The textbook is easy to navigate online. | 1.989 (0.994) | 1.534 (0.821) | 8.312 | 0.055 | 0.005
The textbook is not culturally insensitive or offensive. | 1.713 (0.834) | 1.431 (0.728) | 4.383 | 0.030 | 0.038



Table 3: Instructor outcomes based upon course materials (Traditional textbook vs. OER).
Note: *Perceptions of instructor coded: “Strongly Agree” — 1 ... “Strongly Disagree” — 5; SEI items coded in reverse.

Measure | Cond. 1 — Textbook M (SD) | Cond. 2 — OER M (SD) | F | η² | p
Perceptions of instructor (average) | 1.376 (0.422) | 1.303 (0.352) | 1.184 | 0.008 | 0.278
Uses a good mix of teaching approaches/styles | 1.517 (0.776) | 1.397 (0.620) | 0.985 | 0.007 | 0.323
Effectively uses visual aids | 1.460 (0.744) | 1.431 (0.678) | 0.056 | 0.000 | 0.814
Uses relevant examples | 1.299 (0.552) | 1.207 (0.450) | 1.115 | 0.008 | 0.293
Communicates well | 1.264 (0.619) | 1.155 (0.365) | 1.464 | 0.010 | 0.228
Is a fair grader | 1.253 (0.595) | 1.241 (0.601) | 0.013 | 0.000 | 0.910
Easy to understand | 1.195 (0.427) | 1.224 (0.460) | 0.148 | 0.001 | 0.701
SEI — Instructor (average) | 3.768 (0.091) | 3.721 (0.107) | 0.327 | 0.076 | 0.598
The instructor is knowledgeable in the subject matter. | 3.890 (0.086) | 3.730 (0.141) | 3.235 | 0.447 | 0.146
The instructor is well prepared. | 3.825 (0.104) | 3.665 (0.233) | 1.570 | 0.282 | 0.278
The instructor presents the course material clearly. | 3.788 (0.102) | 3.750 (0.000) | 0.238 | 0.056 | 0.651
The instructor makes course expectations clear. | 3.773 (0.078) | 3.730 (0.141) | 0.252 | 0.059 | 0.642
The instructor answers questions effectively. | 3.753 (0.188) | 3.750 (0.000) | 0.000 | 0.000 | 0.987
The instructor fosters student participation. | 3.760 (0.149) | 3.690 (0.085) | 0.355 | 0.082 | 0.583
The instructor fosters an effective learning environment. | 3.775 (0.086) | 3.625 (0.177) | 2.249 | 0.360 | 0.208
The instructor treats students with courtesy and respect. | 3.815 (0.070) | 3.750 (0.000) | 1.533 | 0.277 | 0.283
The instructor is available for student assistance and consultation. | 3.638 (0.242) | 3.730 (0.141) | 0.233 | 0.055 | 0.654
The instructor provides timely feedback on student assignments and examinations. | 3.678 (0.219) | 3.730 (0.141) | 0.089 | 0.022 | 0.780
The instructor provides useful feedback on student assignments and examinations. | 3.760 (0.149) | 3.730 (0.141) | 0.056 | 0.014 | 0.825
The instructor provides useful feedback on my performance in the course. | 3.763 (0.096) | 3.770 (0.085) | 0.009 | 0.002 | 0.931
SEI — Course (average) | 3.719 (0.141) | 3.596 (0.099) | 1.164 | 0.225 | 0.341
The course meets its stated objectives. | 3.790 (0.140) | 3.625 (0.177) | 1.620 | 0.288 | 0.272
The required course materials are useful. | 3.583 (0.177) | 3.525 (0.205) | 0.129 | 0.031 | 0.737
The assignments help me to learn the course material. | 3.688 (0.182) | 3.540 (0.057) | 1.126 | 0.220 | 0.349
The course is well-organized. | 3.763 (0.152) | 3.540 (0.057) | 3.623 | 0.475 | 0.130
The course meets its stated objectives. | 3.773 (0.149) | 3.750 (0.000) | 0.040 | 0.010 | 0.850



Table 4: Institutional outcomes based upon course materials (Traditional textbook vs. OER).
Note: *Academic dishonesty items coded: “Strongly Agree” — 1 ... “Strongly Disagree” — 5.

Item | Cond. 1 — Textbook, M (SD) | Cond. 2 — OER, M (SD) | F | η² | p
Academic integrity (average) | 2.910 (0.851) | 3.303 (1.008) | 6.404 | 0.043 | 0.012
It was easy to find quiz questions for the textbook online. | 2.333 (0.984) | 2.810 (1.263) | 6.498 | 0.043 | 0.012
It was easy to find the answers to quiz questions for the textbook online. | 2.701 (1.132) | 2.948 (1.220) | 1.558 | 0.011 | 0.214
It was easy to find answers to the assignments online. | 2.713 (1.210) | 3.276 (1.295) | 7.130 | 0.047 | 0.008
It was easy to find exam questions for the textbook online. | 3.092 (1.041) | 3.414 (1.200) | 2.940 | 0.020 | 0.089
It was easy to look up exam questions online while taking the exam. | 3.713 (1.229) | 4.069 (1.090) | 3.198 | 0.220 | 0.076
Pell Grant
Have you received any PELL GRANTS or FEE WAIVERS to fund your education? | 1.540 (0.501) | 1.530 (0.504) | 0.026 | 0.007 | 0.871
Pell | Final grade | 0.960 (0.079) | 0.940 (0.065) | 1.523 | 0.017 | 0.219
Pell students: Final grade | 0.937 (0.012) | 0.949 (0.014) | 0.363 | 0.006 | 0.549
Non-Pell students: Final grade | 0.960 (0.011) | 0.940 (0.014) | 1.354 | 0.018 | 0.248
Pell | Average exam grade | 0.912 (0.132) | 0.879 (0.116) | 0.027 | 0.015 | 0.869
Pell students: Avg. exam grade | 0.904 (0.018) | 0.878 (0.022) | 0.801 | 0.012 | 0.374
Non-Pell students: Avg. exam grade | 0.912 (0.018) | 0.879 (0.023) | 1.243 | 0.016 | 0.268





Hypothesis testing

Hypotheses 1–3 evaluated the relationships between OER adoption and student outcomes in terms of final grades, satisfaction, and perceptions of quality. Consistent with previous research findings, I was unable to establish a significant effect of OER adoption on student final grades or average exam scores. However, satisfaction was significantly and positively affected by OER adoption, with a medium-to-large effect size: students were more likely to be satisfied with the OER textbook than with the traditional textbook. Furthermore, student perceptions of quality were significantly and positively related to OER adoption. Specifically, students were more likely to agree that the OER textbook, compared with the traditional textbook, was up to date, error-free, not culturally insensitive or offensive, and easy to navigate online. Figure 1 graphically displays the comparison of Likert scale measures of student satisfaction between the two conditions.


Figure 1: Likert scale student responses regarding student satisfaction.
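The two-condition comparisons reported here can be reproduced with a one-way ANOVA plus an eta-squared effect size. The sketch below is a minimal illustration of that procedure; the group sizes, means, and standard deviations are invented placeholders, not the study's data.

```python
# Minimal sketch: one-way ANOVA comparing mean satisfaction between two
# conditions, with eta squared as the effect size. All numbers below are
# illustrative assumptions, not the study's data.
# (Likert coding: 1 = "Strongly Agree" ... 5 = "Strongly Disagree".)
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
textbook = rng.normal(1.78, 0.68, size=87).clip(1, 5)  # Cond. 1 (simulated)
oer = rng.normal(1.41, 0.36, size=58).clip(1, 5)       # Cond. 2 (simulated)

f_stat, p_value = stats.f_oneway(textbook, oer)

# Eta squared = between-group sum of squares / total sum of squares
pooled = np.concatenate([textbook, oer])
grand_mean = pooled.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in (textbook, oer))
ss_total = ((pooled - grand_mean) ** 2).sum()
eta_sq = ss_between / ss_total

print(f"F = {f_stat:.3f}, p = {p_value:.3f}, eta^2 = {eta_sq:.3f}")
```

With a lower (more favorable) simulated OER mean, the test reports the direction and magnitude of the group difference, mirroring the layout of Table 2.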


Hypotheses 4 and 5 proposed that direct instructor benefits, in terms of favorable perceptions and SEI scores, were related to OER adoption. While the mean values of almost every item measuring instructor qualities were more favorable when the instructor adopted the OER, the differences were not significant. This result is echoed in the SEI scores: neither the SEI scores for the instructor nor those for the course were significantly affected by the adoption of the OER.

In terms of institutional outcomes, Hypothesis 6 considered the effect of OER adoption on academic integrity. Students using the OER reported more difficulty finding quiz and exam answers online than students using the traditional textbook. While the difference between groups was significant, the effect size was rather small. Figure 2 graphically displays the comparison of student responses regarding academic integrity.


Figure 2: Likert scale student responses regarding academic integrity issues.


Hypothesis 7 considered student outcomes for specific student populations, particularly low-income students receiving Pell Grant support. A two-factor ANOVA tested whether the interaction effect between OER adoption and Pell student status was significant; it was not for either performance measure. Likewise, when analyzing only the Pell student group, neither measure of student performance (final grade or average exam score) differed significantly based on OER adoption. A summary of the hypothesis testing can be found in Table 5.


Table 5: Hypothesis testing.

Hypothesis | Measure | p | Direction | Supported
Impact on students
H1: OER adoption is not significantly related to student grades. | Final grade | 0.623 | Neg. | Yes
 | Avg. exam score | 0.110 | Neg. | Yes
H2: OER adoption is positively related to student satisfaction. | Single item sat. | 0.000 | Pos. | Yes
 | Avg. satisfaction | 0.000 | Pos. | Yes
H3: OER adoption is positively related to student perceptions of textbook quality. | Avg. perception of quality | 0.007 | Pos. | Yes
Impact on instructors
H4: OER adoption is positively related to student perceptions of the instructor. | Avg. perceptions of instructor | 0.278 | Pos. | No
H5: OER adoption is positively related to instructor SEI scores. | SEI — Instructor avg. | 0.598 | Neg. | No
 | SEI — Course avg. | 0.341 | Neg. | No
Impact on institution
H6: OER adoption is positively related to academic integrity. | Avg. academic integrity | 0.012 | Pos. | Yes
H7: The relationship between OER adoption and student performance is moderated by student Pell Grant status such that a positive relationship is amplified. | Pell × final grade | 0.219 | Neg. | No
 | Pell × average exam score | 0.869 | Neg. | No





The primary goal of this study was to investigate the direct benefits of OER adoption for different stakeholders in higher education. If these direct benefits can be identified and communicated to stakeholders, they may be leveraged to motivate further OER adoption and add to the common good.

The findings in this study related to instructor benefits are troubling. The data indicate the direct beneficiaries of OER adoption are students and the institution, yet these groups are not the ones making most textbook adoption decisions, nor are they the groups that must invest additional time and effort into adoption (Seaman and Seaman, 2017). As noted earlier, a significant benefit of adopting a traditional textbook is acquiring the publisher’s ancillaries (test bank, lecture materials, etc.) (Del Valle, 2019). Adopting an OER thus imposes the greatest cost on the instructor, without improving the most heavily weighted criteria in many promotion and tenure decisions.

However, this research provides a meaningful basis, and motivation, for improving these direct benefits. From the institution’s perspective, this research indicates OER adoption improves student satisfaction without sacrificing academic integrity. As some researchers have noted, when rigorous work is demanded, some students voice their dissatisfaction through low course evaluations (Crumbley, et al., 2001). These findings identify ways of improving student satisfaction other than making classes easier: students can find satisfaction when they have access to the materials they need to succeed.

Furthermore, these findings indicate OER adoption may make a meaningful impact on academic integrity. As some researchers have shown, dishonest behavior in higher education is likely to lead to further dishonest behavior in the workplace (McCabe, et al., 2002). Others have pointed out the existence of a “cheating culture” (McCabe and Trevino, 1993), one in which cheating is normalized and spreads throughout the student body. If an institution can improve academic integrity in one class through OER adoption, then the effects of the adoption may spread beyond the focal class.

A possible explanation for the lack of support for H4 and H5 is the instructor’s experience with the textbook. The traditional textbook had been used for several years: the instructor had given multiple lectures using the slides, test questions had been vetted by hundreds of students, and assessments had been refined over the years. According to Crumbley and colleagues (2001), 40 percent of students reported lowering SEI scores if the instructor appears to be inexperienced. If an instructor is inexperienced with new materials, this inexperience is likely to be reflected in student perceptions and SEI scores. Future research may consider testing a lagged effect of OER adoption.

It would be cynical to assume that instructors are only concerned with student perceptions and SEI scores. In practice, instructors have cited significant incentives to adopt OER, including cost benefits for students to reduce income inequality and the flexibility in pedagogy and course design (Belikov and Bodily, 2016). Additionally, the documented benefits from this study — student satisfaction, perceptions of quality, and academic integrity — would certainly benefit the instructor in terms of adding joy to their work with happy students and reducing one of the most irritating and challenging functions of the profession, controlling dishonesty (Harper, et al., 2019).

Yet, it is important to recognize the role of motivation in individual behavior. Adopting an OER takes time, and that time is drawn from other duties of the instructor. Faculty generally pursue research endeavors and provide service to the university and profession, in addition to teaching. These activities are directly related to promotion and tenure decisions, whereas student satisfaction is usually considered only in the narrow form of SEI (Crumbley, et al., 2001). If, instead, institutions recognize the value of OER adoption and consider this behavior in promotion and tenure decisions, then faculty may be motivated to adopt OER. While only a small subset of schools include OER language in promotion and tenure decisions (Todorinova and Wilkinson, 2020), an important example of this practice is the University of British Columbia: Yano (2017) described how faculty may include contributions to open education repositories/resources as evidence of educational leadership, a requirement for tenure/promotion. Other institutions may use less formal forms of recognition. Cleveland State University initiated a gratitude campaign, in which letters of appreciation from the provost are sent to OER faculty to include in their promotion and tenure dossiers (Goodsett, 2020).

Limitations and future research

The second condition — the semester when the OER was first adopted — was the spring of 2020. During this semester, the entire university transitioned to remote/online instruction halfway through the semester in response to COVID-19. It is possible this change reduced, or dampened, students’ perceptions of the instructor, as they were no longer attending class in the mode for which they enrolled. To assess this possibility, I performed an ANOVA with only subjects enrolled in the online sections of the course. Arguably, the format of the online class did not change once the university moved to remote instruction, so any effect would not be contaminated. The results were consistent with the findings in Table 3: the average perceptions of the instructor and SEI scores were not significantly different after the adoption.

Another important limitation is the use of self-reported data. For some constructs, such as student perceptions and satisfaction, self-reported data are accurate and appropriate. However, measures of academic integrity can be affected by social desirability bias. Precautions were taken to mitigate this effect through confidentiality and the timing of the survey. Furthermore, this study focuses on differences between two groups, and there is no reason to believe social desirability would affect one group more than the other.




In a full academic year, adopting an OER in this course saved its 224 students US$28,000 collectively. Before the OER adoption, the course textbook fee was a significant burden for students at this institution: the minimum US$125 cost was nearly 10 percent of the tuition for the class itself. Furthermore, students have few means of controlling these costs. Changing course materials imposes significant switching costs on the instructor, and the textbook market is heavily consolidated, with 80 percent of the market controlled by five companies (Berman, 2019). These two factors limit the control buyers have in the market.
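The collective savings above follow directly from the enrollment and price figures in the text:

```python
# Savings arithmetic using the figures reported in the text
students_per_year = 224   # enrollment across the full academic year
textbook_cost = 125       # minimum US$ cost of the traditional textbook
savings = students_per_year * textbook_cost
print(f"US${savings:,}")  # US$28,000
```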

It is important to note the OER textbook was not considered inferior in any way to the traditional textbook. While not all outcomes were significantly improved, no outcome significantly deteriorated when the OER was adopted. Given the considerable cost savings for students, even a neutral outcome is positive. The empirical findings of this study identify further direct benefits not only for students, but for the university and the academic community in general.


About the author

Candice Vander Weerdt is Assistant College Lecturer in the Monte Ahuja College of Business at Cleveland State University.
E-mail: c [dot] vanderweerdt [at] csuohio [dot] edu



1. Cooney, 2017, p. 174.



M.S. Alkathiri, 2021. Decision-making by heads of academic department using student evaluation of instruction (SEI), International Journal of Learning, Teaching and Educational Research, volume 20, number 2, pp. 235–250.
doi:, accessed 19 February 2022.

P.L. Alreck and R.B. Settle, 1985. The survey research handbook. Homewood, Ill.: R.D. Irwin.

A. Argandoña, 1998. “The stakeholder theory and the common good,” Journal of Business Ethics, volume 17, number 9, pp. 1,093–1,102.
doi:, accessed 19 February 2022.

Association to Advance Collegiate Schools of Business (AACSB), 2020. “2020 guiding principles and standards for business accreditation,” at, accessed 19 February 2022.

R. Awdry and P.M. Newton, 2019. “Staff views on commercial contract cheating in higher education: A survey study in Australia and the UK,” Higher Education, volume 78, number 4, pp. 593–610.
doi:, accessed 19 February 2022.

D. Baden and M. Higgs, 2015. “Challenging the perceived wisdom of management theories and practice,” Academy of Management Learning & Education, volume 14, number 4, pp. 539–555.
doi:, accessed 19 February 2022.

T. Baldwin and N. Blattner, 2003. “Guarding against potential bias in student evaluations: What every faculty member needs to know,” College Teaching, volume 51, number 1, pp. 27–32.
doi:, accessed 19 February 2022.

BCOER, 2015. “Faculty guide for evaluating open education resources,” at, accessed 19 February 2022.

E.M. Beasley, 2014. “Students reported for cheating explain what they think would have stopped them,” Ethics & Behavior, volume 24, number 3, pp. 229–252.
doi:, accessed 19 February 2022.

P. Beile, A. deNoyelles, and J. Raible, 2020. “Analysis of an open textbook adoption in an American history course: Impact on student academic outcomes and behaviors,” College & Research Libraries, volume 81, number 4, pp. 721–736.
doi:, accessed 19 February 2022.

O.M. Belikov and R. Bodily, 2016. “Incentives and barriers to OER adoption: A qualitative analysis of faculty perceptions,” Open Praxis, volume 8, number 3, pp. 235–246.

J. Berman, 2019. “What the McGraw-Hill, Cengage merger means for textbook prices,” MarketWatch (9 May), at, accessed 19 February 2022.

M.E. Bernard, L.W. Keefauver, G. Elsworth, and F.D. Naylor, 1981. “Sex-role behavior and gender in teacher-student evaluations,” Journal of Educational Psychology, volume 73, number 5, pp. 681–696.
doi:, accessed 19 February 2022.

T.J. Bliss, J. Hilton, D. Wiley, and K. Thanos, 2013. “The cost and quality of open textbooks: Perceptions of community college faculty and students,” First Monday, volume 18, number 1, at, accessed 19 February 2022.
doi:, accessed 19 February 2022.

J. Borg, C.M. Scott-Young, and M. Turner, 2019. “Smarter education: Leveraging stakeholder inputs to develop work ready curricula,” In: V. Uskov, R. Howlett, and L. Jain (editors). Smart education and e-learning 2019. Singapore: Springer, pp. 51–61.
doi:, accessed 19 February 2022.

N.B. Colvard, C.E. Watson, and H. Park, 2018. “The impact of open educational resources on various student success metrics,” International Journal of Teaching and Learning in Higher Education, volume 30, number 2, pp. 262–276, and at, accessed 19 February 2022.

C. Cooney, 2017. “What impacts do OER have on students? Students share their experiences with a health psychology OER at New York City College of Technology,” International Review of Research in Open and Distributed Learning, volume 18, number 4, pp. 155–178.
doi:, accessed 19 February 2022.

Creative Commons, 2020. “Open education” (12 August), at, accessed 14 October 2020.

L. Crumbley, B.K. Henry, and S.H. Kratchman, 2001. “Students’ perceptions of the evaluation of college teaching,” Quality Assurance in Education, volume 9, number 4, pp. 197–207.
doi:, accessed 19 February 2022.

G. Del Valle, 2019. “The high cost of college textbooks, explained,” Vox (6 March), at, accessed 19 February 2022.

M. Dempsey, 2021. “The impact of free and open educational resource adoption on community college student achievement,” International Journal of Open Educational Resources, at, accessed 19 February 2022.

T. Donaldson and L.E. Preston, 1995. “The stakeholder theory of the corporation: Concepts, evidence, and implications,” Academy of Management Review, volume 20, number 1, pp. 65–91.
doi:, accessed 19 February 2022.

A.P. Feldstein, M. Martin, A. Hudson, K. Warren, J. Hilton III, and D. Wiley, 2012. “Open textbooks and increased student access and outcomes,” European Journal of Open, Distance and E-Learning, at, accessed 19 February 2022.

R.E. Freeman and D.L. Reed, 1983. “Stockholders and stakeholders: A new perspective on corporate governance,” California Management Review, volume 25, number 3, pp. 88–106.
doi:, accessed 19 February 2022.

D. Gooblar, 2020. “We know what works to close the completion gap,” Chronicle of Higher Education (20 February), at, accessed 19 February 2022.

M. Goodsett, 2020. “Supporting faculty through an open education and affordability gratitude campaign,” Reference Services Review, volume 48, number 3, pp. 353–371.
doi:, accessed 19 February 2022.

D. Harley, S. Lawrence, S.K. Acord, and J. Dixson, 2010. “Affordable and open textbooks: An exploratory study of faculty attitudes,” UC Berkeley, Research and Occasional Papers Series (1 January), at, accessed 19 February 2022.

R. Harper, T. Bretag, C. Ellis, P. Newton, P. Rozenberg, S. Saddiqui, and K. van Haeringen, 2019. “Contract cheating: A survey of Australian university staff,” Studies in Higher Education, volume 44, number 11, pp. 1,857–1,873.
doi:, accessed 19 February 2022.

J. Hilton, 2020. “Open educational resources, student efficacy, and user perceptions: A synthesis of research published between 2015 and 2018,” Educational Technology Research and Development, volume 68, number 3, pp. 853–876.
doi:, accessed 19 February 2022.

J. Hilton, 2016. “Open educational resources and college textbook choices: A review of research on efficacy and perceptions,” Educational Technology Research and Development, volume 64, number 4, pp. 573–590.
doi:, accessed 19 February 2022.

J. Hilton and C. Laman, 2012. “One college’s use of an open psychology textbook,” Open Learning, volume 27, number 3, pp. 265–272.
doi:, accessed 19 February 2022.

J. Hilton, D. Wiley, L. Fischer, and R. Nyland, 2016. “Guidebook to research on open educational resources adoption,” Open Textbook Network, at, accessed 19 February 2022.

V.J. Howard and C.B. Whitmore, 2020. “Evaluating student perceptions of open and commercial psychology textbooks,” Frontiers in Education (5 August).
doi:, accessed 19 February 2022.

M. Isserman, 2003. “Plagiarism: A lie of the mind,” Chronicle of Higher Education (2 May), at, accessed 19 February 2022.

J.J. Jenkins, L.A. Sánchez, M.A.K. Schraedley, J. Hannans, N. Navick, and J. Young, 2020. “Textbook broke: Textbook affordability as a social justice issue,” Journal of Interactive Media in Education, volume 3, number 1.
doi:, accessed 19 February 2022.

R.S. Jhangiani and S. Jhangiani, 2017. “Investigating the perceptions, use, and impact of open textbooks: A survey of post-secondary students in British Columbia,” International Review of Research in Open and Distributed Learning, volume 18, number 4, at, accessed 19 February 2022.

R.S. Jhangiani, F.N. Dastur, R. Le Grand, and K. Penner, 2018. “As good or better than commercial textbooks: Students’ perceptions and outcomes from using open digital and open print textbooks,” Canadian Journal for the Scholarship of Teaching and Learning, volume 9, number 1 (16 April).
doi:, accessed 19 February 2022.

L.I. Langbein, 1994. “The validity of student evaluations of teaching,” PS: Political Science & Politics, volume 27, number 3, pp. 545–553.
doi:, accessed 19 February 2022.

M. Lovett, O. Meyer, and C. Thille, 2008. “The open learning initiative: Measuring the effectiveness of the OLI statistics course in accelerating student learning,” Journal of Interactive Media in Education, volume 2008, number 1, article number 13.
doi:, accessed 19 February 2022.

J. Ma, M. Pender, and C.J. Libassi, 2020. “Trends in college pricing and student aid 2020,” College Board, at, accessed 19 February 2022.

J. MacGregor and M. Stuebs, 2012. “To cheat or not to cheat: Rationalizing academic impropriety,” Accounting Education, volume 21, number 3, pp. 265–287.
doi:, accessed 19 February 2022.

M. Maeda, 2021. “Exam cheating among Cambodian students: when, how, and why it happens,” Compare, volume 51, number 3, pp. 337–355.
doi:, accessed 19 February 2022.

D. McCabe and L.K. Trevino, 1993. “Academic dishonesty: Honor codes and other contextual influences,” Journal of Higher Education, volume 64, number 5, pp. 522–538.
doi:, accessed 19 February 2022.

D.L. McCabe, L.K. Treviño, and K.D. Butterfield, 2002. “Honor codes and other contextual influences on academic integrity: A replication and extension to modified honor code settings,” Research in Higher Education, volume 43, number 3, pp. 357–378.
doi:, accessed 19 February 2022.

A.W. Meade and S.B. Craig, 2012. “Identifying careless responses in survey data,” Psychological Methods, volume 17, number 3, pp. 437–455.
doi:, accessed 19 February 2022.

Michael Schwartz Library, 2018. “Textbook affordability grant assessment tool,” at, accessed 19 February 2022.

S. Mintz, 2020. “Advancing equity post-pandemic,” Inside Higher Ed (15 September), at, accessed 19 February 2022.

A.T. Nusbaum and C. Cuttler, 2020. “Hidden impacts of OER: Effects of OER on instructor ratings and course selection,” Frontiers in Education (9 June).
doi:, accessed 19 February 2022.

J.C. Ortagus, R. Kelchen, K. Rosinger, and N. Voorhees, 2020. “Performance-based funding in American higher education: A systematic synthesis of the intended and unintended consequences,” Educational Evaluation and Policy Analysis, volume 42, number 4, pp. 520–550.
doi:, accessed 19 February 2022.

Pew Research Center, 2019. “Two decades of change in federal and state higher education funding” (15 October), at, accessed 19 February 2022.

P.M. Podsakoff, S.B. MacKenzie, J.-Y. Lee, and N.P. Podsakoff, 2003. “Common method biases in behavioral research: A critical review of the literature and recommended remedies,” Journal of Applied Psychology, volume 88, number 5, pp. 879–903.
doi:, accessed 19 February 2022.

B. Popken, 2015. “College textbook prices have risen 1,041 percent since 1977,” NBC News (2 August), at, accessed 19 February 2022.

J.S. Pounder, 2007. “Is student evaluation of teaching worthwhile? An analytical framework for answering the question,” Quality Assurance in Education, volume 15, number 2, pp. 178–191.
doi:, accessed 19 February 2022.

S.F. Reardon, 2013. “The widening income achievement gap,” Educational Leadership, volume 70, number 8, pp. 10–16, and at, accessed 19 February 2022.

S. Sattler, C. Wiegel, and F. Van Veen, 2017. “The use frequency of 10 different methods for preventing and detecting academic dishonesty and the factors influencing their use,” Studies in Higher Education, volume 42, number 6, pp. 1,126–1,144.
doi:, accessed 19 February 2022.

A. Savage and M.G. Simkin, 2010. “Ethical concerns about the online sale of instructor-only textbook resources,” Research on Professional Responsibility and Ethics in Accounting, volume 14, pp. 213–231.
doi:, accessed 19 February 2022.

J.E. Seaman and J. Seaman, 2017. “Opening the textbook: Educational resources in U.S. Higher Education, 2017,” Babson Survey Research Group, at, accessed 19 February 2022.

P. Seldin, 1993. “How colleges evaluate professors, 1983 v. 1993,” POD Network Conference Materials, at, accessed 19 February 2022.

J. Sidanius and M. Crane, 1989. “Job evaluation and gender: The case of university faculty,” Journal of Applied Social Psychology, volume 19, number 2, pp. 174–197.
doi:, accessed 19 February 2022.

M.G. Simkin and A. McLeod, 2010. “Why do college students cheat?” Journal of Business Ethics, volume 94, number 3, pp. 441–453.
doi:, accessed 19 February 2022.

L. Todorinova and Z.T. Wilkinson, 2020. “Incentivizing faculty for open educational resources (OER) adoption and open textbook authoring,” Journal of Academic Librarianship, volume 46, number 6, 102220.
doi:, accessed 19 February 2022.

United Nations Educational, Scientific and Cultural Organization (UNESCO), 2016. “Education 2030: Incheon Declaration and Framework for Action for the implementation of Sustainable Development Goal 4: Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all,” at, accessed 22 September 2021.

U.S. Congress, 2019. “H.R.4674 — College Affordability Act,” at, accessed 19 February 2022.

U.S. Department of Education, n.d. “Federal Pell Grants are usually awarded only to undergraduate students,” at, accessed 27 August 2021.

U.S. Government Accountability Office, 2005. “College textbooks: Enhanced offerings appear to drive recent price increases,” GAO-05-806 (July), at, accessed 19 February 2022.

G. Vojtech and J. Grissett, 2017. “Student perceptions of college faculty who use OER,” International Review of Research in Open and Distributed Learning, volume 18, number 4, pp. 155–171.
doi:, accessed 19 February 2022.

J.D. Ward, E.D. Pisacreta, B. Weintraut, and M. Kurzweil, 2020. “An overview of state higher education funding approaches: Lessons and recommendations,” Ithaka S+R (10 December).
doi:, accessed 19 February 2022.

J. Weissmann, 2013. “Why are college textbooks so absurdly expensive?” Atlantic (3 January), at, accessed 19 February 2022.

M. Weller, B. de los Arcos, R. Farrow, B. Pitt, and P. McAndrew, 2015. “The impact of OER on teaching and learning practice,” Open Praxis, volume 7, number 4, pp. 351–361, and at, accessed 19 February 2022.

M. Wells, R. Jesiolowski, J. Verwayne, and J. Pablo, 2020. “Meta-syntheses of OER transition in online higher education,” International Journal of Open Educational Resources, volume 3, number 2, pp. 243–256.
doi:, accessed 19 February 2022.

J.R. Winitzky-Stephens and J. Pickavance, 2017. “Open educational resources and student course outcomes: A multilevel analysis,” International Review of Research in Open and Distributed Learning, volume 18, number 4, pp. 35–49.
doi:, accessed 19 February 2022.

B. Yano, 2017. “Recognizing ‘open’ in tenure and promotion at UBC” (17 April), at, accessed 22 September 2021.


Editorial history

Received 2 September 2021; revised 22 September 2021; accepted 15 February 2022.

Copyright © 2022, Candice Vander Weerdt. All Rights Reserved.

Does the rising tide of OER lift all boats?
by Candice Vander Weerdt.
First Monday, Volume 27, Number 3 - 7 March 2022