Higher education is in a major state of change. New ideas, new models, new technologies, shifting student demographics, and new accountability standards are just some of the forces challenging current assumptions. Although the changes are being driven by different groups of stakeholders (students, employers, the government, and others), what unites many of these efforts is a push to define and develop new forms of measurement and credentials: ones that are meaningful and that everyone understands.
For example, competency-based education is currently posing a major challenge to the traditional credit hour. This isn't just a movement at small institutions on the fringes of education; it's starting to make its way into the mainstream. One of the biggest education stories of last year was the University of Wisconsin announcing its UW Flexible Option: self-paced, competency-based programs in five areas, including biomedical sciences and business and technical communications. More competency-based programs are being developed and announced all the time, with support from the U.S. Department of Education. Similarly, digital badges are being tested everywhere from K-12 classrooms to higher education to corporate training. Again, this isn't happening only at the outskirts: even Purdue University is experimenting with them.
Students are getting into the game as well, demanding more from their schools. There has been a good deal of debate over whether students should be considered consumers, but while academic leaders are discussing it, students are already starting to act like consumers. They are looking at the price of college and at the unemployment rates of recent graduates and asking, "This costs a fortune. What will I be able to do when I graduate?" It's no longer sufficient for schools simply to confer degrees; they need to specify exactly what those degrees mean. If students don't like the answer, they will go somewhere else for a better one. These days, that "somewhere else" is increasingly outside the traditional educational system: MOOCs or coding bootcamps, for example.
Employers are also starting to demand more meaningful credentials. The degree, which has served as the standard credential for so long, is losing its meaning, and even traditional transcripts with course titles and grades are viewed as barely worth the paper they are printed on. As Association of American Colleges and Universities President Carol Geary Schneider told Inside Higher Ed: "Our employer studies show that employers basically find the transcript useless in evaluating job candidates" [emphasis added].
Meaningful credentials that everyone understands shouldn't be so hard to come by. Colleges and universities need to be held accountable for student outcomes, students need to know that the money they spend will be worth it for their future, and employers need to know what the credentials on applicants' resumes actually mean. It's astonishing that we've gotten to a point where none of these things is actually the case.
The 21st century has made many things obsolete. It's time for traditional degrees, transcripts, and other relics of a 20th-century education to make way for ideas, technologies, and credentials that are more meaningful and relevant today.