
ENGL 810 – Paper #2: Intriguing Questions – What Do We Do With Placement Tests in Developmental/ESL English?


According to this article from Slate.com, between 2003 and 2013, community colleges received only an extra $1 per full-time student. Community colleges have had to cut back, which leads to shortcuts such as placement testing instead of more proven multi-assessment measures for placing students. Without increased funding, are community colleges doomed to continue misplacing and discriminating against English Language Learners?

The study of ESL/developmental English has wrestled with one important question for close to twenty-five years: do placement tests work for placing students into developmental English, and if they do not, what can replace them so that students who need developmental English classes are placed there accurately, rather than everyone or no one?

The history of this question begins with the placement tests themselves. As I have mentioned in other posts, Harvard was the first school to pioneer the placement exam. These tests were meant to contain students who did not have “correct” English in courses separate from those of students who did, suggesting that “language differences [could] be effectively removed from mainstream composition courses.” With an influx of immigrant students coming to study in the U.S. during the second half of the nineteenth century, most other schools began requiring language exams as well (Kei Matsuda 641-42). This practice continued without question, according to Deborah Crusan, until approximately thirty-five years ago. Crusan notes that indirect measures of assessment, such as the Compass or ACCUPLACER, were used by schools because they produced less of the inter-rater unreliability traditionally associated with the scoring of placement essays by independent raters. By the 1970s, however, academia began criticizing such tests on the grounds that writing could not be assessed by a computer, and by the 1990s, the idea that writing can be tested indirectly had “not only been defeated but also chased from the battlefield” (18).

Although the accuracy of these exams seems to have been debunked, according to Hassel and Baird Giordano, over 92 percent of community colleges still use some form of placement exam: 62 percent use the ACCUPLACER and 42 percent use Compass, while other schools use some combination of the two (30). These tests are still very much alive even though research by the TYCA Council shows that they have “‘severe error rates,’ misplacing approximately 3 out of every 10 students” (233). Worse yet, when students are placed into courses based on these standardized placement scores, their results are “weakly correlated with success in college-level courses,” leaving students in courses for which they are “underprepared or over prepared” (Hodara, Jaggars, and Karp). At Hassel and Baird Giordano’s community college, the retake rate for the classes into which students are placed ends up being 20-30 percent for first-semester composition, showing the extreme proportion of students testing into a class for which they are unprepared (34).

A massive amount of research has been done on approaches to fix this problem. Typically these approaches involve using multiple methods for assessing student writing or reconsidering how we use the placement exam. For example, Hassel and Baird Giordano found greater placement accuracy and student success using a multi-pronged approach. This approach includes looking at ACT or SAT scores (37); asking students to complete a writing sample, with an assessment corresponding to the writing program’s learning outcomes (39); examining high school curriculum and grades (41); and student self-assessment of college English readiness (41-42). On the other hand, Hodara, Jaggars, and Karp suggest revamping the way we use college placement exams in order to improve placement accuracy. Their methods include aligning standards of college readiness with expectations of college-level coursework; using multiple measures of college readiness as part of the placement process; and preparing students for placement exams (6). They also recommend standardizing assessment and placement policies across an entire state’s community colleges, as Virginia has done with the Virginia Placement Test (VPT) (19).


Unfortunately, even statewide redesigns of placement tests are not always a good solution. These notes from a NOVA Administrative Council meeting last April show that despite efforts to create a statewide standardized test, students were less successful in English than ever. Does this add to the data showing that standardized tests just don’t work?

These outcomes lead us to a final question that still remains: if we know how to “fix” the problem, why are colleges unable to implement the solution? It comes down to money. One administrator said that multiple measures sound “wonderful, but I cannot think about what measures could be implemented that would be practical, that you would have the personnel to implement” (Hodara, Jaggars, and Karp 23). Until we solve the problem of funding and staffing, the placement test will remain.

If we acknowledge that the money for such a revamp at most big schools, such as NOVA (which has over 70,000 students), is not going to appear now (or likely ever), what other potential solutions remain? In thinking about such solutions, I began to consider the reading on the “Pedagogy of Multiliteracies” by the New London Group that we read for class. In this manifesto, the writers note that current literacy pedagogy is “restricted to formalized, monolingual, monocultural, and rule-governed forms of language” (61). The demands of the “new world,” however, require that teachers prepare students who can navigate and “negotiate regional, ethnic, or class-based dialects” as a way to reduce the “catastrophic conflicts about identities and spaces that now seem ever ready to flare up” (68). This means that colleges must focus on increasing diversity and connectedness between students and their many ways of speaking. To me, this acceptance of diversity of person and language seems inherently part of the solution. In recognizing that all students have a right to their own language, we start to break down the separation, and therefore the tests that misplace and malign students.

If we subscribe to the “Pedagogy of Multiliteracies” but find that we cannot afford to include multiple measures in our assessment of students, Peter Adams might come closest to a good solution: one that brings together help for ESL/developmental students, does away with the placement exam as a site of discrimination, and makes mainstream students more aware and respectful of multiliteracies. Adams suggests that schools should still have students take placement exams, but students who test into developmental English have the option to take mainstream English instead. The idea is that the weaker writers can be pulled up by the stronger writers and see good role models. In addition, developmental writers take an extra three-hour companion course after the regular course; with only about eight students, it provides an intimate space for students to ask questions and learn (56-57). This model proved very successful at Adams’ Community College of Baltimore County: students held each other accountable and were motivated by being part of the “real” college (60). They also avoided a common problem at many community colleges, passing their developmental course but never registering for English 101 (64).

I truly believe that the method Adams suggests would be a great fit for NOVA. I teach developmental/ESL English courses in which all of the students are developmental, with no “regular” students. In the “regular” sections of first-semester composition, I have students who are well prepared as well as students who just passed the newly deployed VPT exam, which has resulted in more students than ever placing into regular first-semester composition. From my experience, these weaker students in regular composition tend to be more resilient: they are perhaps pulled up by the stronger students, or maybe they know that if they can pass this class they are done with half of their English requirement. I compare this to my developmental students, of whom I lose approximately 30-40 percent each semester to failure, attendance issues, disappearance, or language struggles. With the approach Adams suggests, I believe that NOVA could pull our developmental students up to a higher level of achievement and empower them to continue their studies. This would cost much less than the multi-measures approach proposed by those seeking to do away with placement exams entirely. I believe this solution would be the best way to “meet in the middle” and address both the financial and the discriminatory problems frequently associated with placement exams.

Works Cited

Adams, Peter, et al. “The Accelerated Learning Program: Throwing Open the Gates.” Journal of Basic Writing 28.2 (2009): 50-69. Print.

Crusan, Deborah. “An Assessment of ESL Writing Placement Assessment.” Assessing Writing 8 (2002): 17-30. Print.

Hassel, Holly, and Joanne Baird Giordano. “First-Year Composition Placement at Open-Admission, Two-Year Campuses: Changing Campus Culture, Institutional Practice, and Student Success.” Open Words: Access and English Studies 5.2 (2011): 29–59. Web.

Hodara, Michelle, Shanna Smith Jaggars, and Melinda Mechur Karp. “Improving Developmental Education Assessment and Placement: Lessons From Community Colleges across the Country.” Community College Research Center. Teachers College, Columbia U CCRC Working Paper no. 51. Nov. 2012. Web. Accessed 23 Sept. 2015.

Kei Matsuda, Paul. “The Myth of Linguistic Homogeneity in U.S. College Composition.” College English 68.6 (2006): 637-651. Print.

The New London Group. “A Pedagogy of Multiliteracies: Designing Social Futures.” Harvard Educational Review 66.1 (1996): 60-92. Print.

Two Year College English Association. “TYCA White Paper on Developmental Education Reforms.” TYCA Council. 2013-2014. Accessed 18 Sept. 2015.

ENGL 810 – PAB Entry 4

Two Year College English Association. “TYCA White Paper on Developmental Education Reforms.” TYCA Council. 2013-2014. Accessed 18 Sept. 2015.


In 2011, President Obama visited NOVA. He spoke about the importance of education for all of our citizens and the great challenge facing America to become number one in higher education again. He asked community colleges to face these challenges by working harder and helping get students through school. Yet with ever-decreasing budgets and increasing emphasis on placement tests and other “quantifiable” measures of success imposed by state legislatures, are these realistic goals?

This white paper discusses Obama’s push to make the U.S. number one in college graduates and the impact that the resulting legislative efforts are having on students, developmental students in particular (229). It also suggests new methods for redesigning developmental writing programs to make them more effective.

The authors first note the failure of placement exams such as the Compass and ACCUPLACER to put students into the correct developmental classes; these tests have “‘severe error rates,’ misplacing approximately 3 out of every 10 students” (233). They also note that many states have their own versions of these placement exams, often implemented by state legislatures in an effort to “reform” developmental education. For example, here in Virginia, the Virginia Placement Test (VPT) has been implemented statewide alongside Compass and ACCUPLACER. As a result of this exam, “the success rate in first-year writing has dropped significantly,” and part-time and adjunct faculty have been reassigned or laid off, which “deprives developmental students of opportunities for personal contact with expert, caring practitioners … instrumental to their retention and success” (234-35). The writers of the white paper argue that replacing these placement exams with multiple modes of assessment, including a writing sample, is crucial (238).

In addition to the legislative push for tests such as the VPT, many states are seeing legislatures insert themselves into the community college with little or no input from faculty (235). These efforts have frequently limited students by putting them into classes for which they are unprepared, have required them to co-enroll in community colleges for developmental education and four-year colleges for other courses, and have even removed developmental education from some four-year schools, forcing community colleges to turn away an influx of underprepared students (232-33). Because these legislators have little or no experience with higher education, the negative ramifications for both faculty and students are stark.

Despite the bleak outlook, the authors do offer some suggestions for developmental education reform. These include: mainstreaming, which puts developmental students in class with nondevelopmental students and adds a lab session; studio courses, which allow all students in need to meet weekly with a writing instructor who supports the work of the course; compression, which uses eight-week classes so that students finish two semesters of work in one; integration or contextualization, which offers developmental content within other general education courses; stretch courses, which stretch a one-semester class into a full year with the same teacher; and modules, which divide specific skills into units so that students focus only on their areas of weakness (236-37). The authors note that no matter what a school chooses to do, two-year college English educators must be trained to take part in these conversations and insert themselves into legislative discussions about how best to support underprepared students while preserving the mission of open access (238).

Obama’s push for the U.S. to be number one in higher education is certainly a noble and important goal, but as it increases legislative interference in a field of which lawmakers know nothing, it becomes more problematic. The issues that the authors of this paper raise remind me of Ralph Cintron’s comments in “Octalog III.” He notes: “Where I teach, state funding has plummeted over the decades until today it is about sixteen percent. The university is developing plans for consolidating departments and units” (126). This is happening across the country, particularly at publicly funded colleges, and it hurts community colleges especially badly. While we are getting an ever-increasing influx of students due to the push for more higher education across society, we are also getting ever-decreasing funding from the government. Yet the government still finds it appropriate to create tests and measures, such as the VPT, that assess our students and us and show how we are failing. It is ultimately a catch-22. This movement toward neoliberalism, as Cintron calls it, means that the government can continue to justify reduced spending while forcing unrealistic expectations upon students and teachers.

One example of the legislature’s modifications is the requirement of exams such as the VPT, which has affected my own college, NOVA, very negatively. While the government sees such exams as a cost- and time-saving measure, students are being placed into the wrong classes more than ever before. Studies show that other methods are more effective, yet with ever-reduced and ever-stretched full-time faculties, funding such initiatives would be impossible. These tests ultimately cause higher failure rates and, as a result, even more government spending, since students receive a small amount of government funding each time they take a class. In an ideal world, the government would work with educators to see the true needs of students, increase funding to schools for proper measures of assessment, and work with us rather than against us.

This leads us, inevitably, to the question: can these issues ever be solved? Can we move away from the testing model toward something better, and can we increase funding to improve these measures? While President Obama has positioned himself as a champion of free community college, I am skeptical. Such a tuition reduction might help students, but I do not see it, ultimately, as an increased revenue stream for the college. If anything, it will increase the number of students, and therefore of underpaid adjunct faculty, which is a whole separate major issue. Until such measures are passed, I must continue to ask: at what point will we stop sacrificing our students’ time and money, and the government’s money, on methods that have been statistically proven to fail? My attitude, while pessimistic, seems realistic.

In 2015, President Obama proposed the idea of free community college for up to two years. While this has not yet come to fruition, it is certainly a talking point up for debate. Will such a proposal, meaningful as it sounds, ultimately help more young people succeed, or will it result in even greater legislative control over colleges?

Works Cited

Cintron, Ralph, et al. “Octalog III: The Politics of Historiography in 2010.” Rhetoric Review 30.2 (2011): 109-134. Print.

Two Year College English Association. “TYCA White Paper on Developmental Education Reforms.” TYCA Council. 2013-2014. Accessed 18 Sept. 2015.

ENGL 810 – PAB Entry 3

Hassel, Holly, and Joanne Baird Giordano. “First-Year Composition Placement at Open-Admission, Two-Year Campuses: Changing Campus Culture, Institutional Practice, and Student Success.” Open Words: Access and English Studies 5.2 (2011): 29–59. Web.


A student takes the ACCUPLACER so he can be placed “effectively” into English and math courses that are right for him.

In this article, Holly Hassel and Joanne Baird Giordano address one of the enduring issues in English studies, particularly in ESL and developmental English classes: placement tests, exams meant to place students into a particular English class based on their skills and abilities as writers and thinkers. According to Hassel and Baird Giordano, over 92 percent of community colleges use some form of placement exam: 62 percent use the ACCUPLACER and 42 percent use Compass, and many other schools use some combination of the two. These tests, then, become the primary method of assessing student readiness for first-year composition (30). Unfortunately, these tests tend to measure outcomes that “do not reflect the complex demands of academic discourse in the first college year” (30) or are disconnected from the learning outcomes the college writing program sets for students (30-31). For example, while standardized tests are good at measuring sentence correction, reading comprehension, and other quantifiable data, they do not test these abilities in action or in problem-solving scenarios. Even more problematic is that these placement tests are very poor at correctly placing students. While only 1-3 percent of students at competitive universities end up re-taking first-year composition, 25-35 percent of community college students do. This outcome leads to only one conclusion: these tests merely measure a student’s “test-taking skills” (38).

Hassel and Baird Giordano note that at their own community college in Wisconsin, the data they researched matches their own experiences nearly identically. Their school places students into English courses based on only one test score; as a result, somewhere between 20 and 35 percent of students end up re-taking their first-year English courses due to misplacement and later poor performance (34). They argue that a better system would take a multipronged approach to assessing student levels and placement (36). The approach they suggest includes looking at ACT or SAT scores, which would identify which students (they say around 50 percent) are underprepared (37); asking students to complete a writing sample, with an assessment corresponding to the writing program’s learning outcomes (39); examining high school curriculum and grades, particularly for students with “borderline placement profiles” (41); and surveys and student self-assessment, which ask students to assess their own readiness for college reading and writing courses and are particularly useful for older returning students. In this approach, students can meet with English faculty, learn about the courses, and make an educated decision (41-42).

Hassel and Baird Giordano note that this endeavor, while work intensive, is much more successful than the test model: “students who remained in good standing at the end of the fall semester significantly increased over the implementation of this approach” (44). While they acknowledge the difficulty and work that such an approach takes, particularly for large institutions, they recommend increasing multiple measures for placement over time (53).

Last week I wrote about the history of ESL students in the composition classroom, and one of the major ways students were being oppressed was through placement tests that put them inside or outside the “right” group. As Kei Matsuda noted, these tests were given to students under the assumption that “language differences can be effectively removed from mainstream composition courses” (642). Clearly, language differences cannot be removed or “fixed,” and perhaps should not be removed from the language class. As Janice Lauer notes in chapter 2 of English Studies: An Introduction to the Discipline, in 1974 CCCC issued the manifesto “Students’ Right to Their Own Language,” in which the organization affirmed that students should be able to use their own patterns, dialects, and varieties of speaking and writing in which they find their own style and identity (120). This was a huge turn for a discipline that had frequently been non-inclusive of speakers of other languages. Placement testing would seem to run directly counter to this recommendation: by asking students to read, think, and write in one particular way, it relegates students who may be excellent thinkers and writers to one part of the college, while those who are good test takers are placed in another for which they may not be prepared. Placement testing, therefore, is not only ineffective, as Hassel and Baird Giordano point out, but it also denies students the right to their own language and misplaces them based on testable signs that say little about actual writing and thinking skill.

What Hassel and Baird Giordano present in this piece is particularly interesting and points to a major question that needs answering: do placement tests work? Their clear answer, based on the historical data, is no. However, they have found that a multipronged approach can be effective. Yet this leaves us with the question: how can we successfully facilitate their method across multiple colleges, big and small, when faculty are already overtaxed and often underpaid? While this approach clearly works well for a school with only 1,400 students like theirs, how can it be applied to a very large school like NOVA, where thousands of students take Composition I and II each semester? I do not think this paper leaves us with an answer that makes it feasible yet. From personal experience, even checking the prerequisite and developmental histories of each of my 100+ students is time-consuming, stressful, and riddled with errors and discrepancies. The thought of having to look at multiple test scores, meet with students, and examine writing samples seems insurmountable. The next question, then, is: if we get rid of the placement test, how do we replace it quickly and more effectively?

Works Cited

Hassel, Holly, and Joanne Baird Giordano. “First-Year Composition Placement at Open-Admission, Two-Year Campuses: Changing Campus Culture, Institutional Practice, and Student Success.” Open Words: Access and English Studies 5.2 (2011): 29–59. Web.

Kei Matsuda, Paul. “The Myth of Linguistic Homogeneity in U.S. College Composition.” College English 68.6 (2006): 637-651. Print.

Lauer, Janice. “Rhetoric and Composition.” English Studies: An Introduction to the Discipline. Ed. Bruce McComiskey. Urbana: NCTE, 2006. 106-152. Print.