Now that the end of the current American school year is nearing, at least for those in more traditional schools, colleges, and universities, I got to thinking even more about learning and education. I say “even more,” because I’ve always thought about both, and I can’t escape it, not with a wife who is a university professor, and two offspring who teach law and medicine, and that doesn’t include the three years I taught on the collegiate level.
Study after study has shown, year after year, that while cramming may get a student through the dreaded final exams, the vast majority of students retain little of what they’ve crammed. Knowledge learned and used bit by bit is retained in far greater detail. That’s why good apprenticeship programs work.
But students crammed when I was in college, and they still do. When I was teaching, I gave “pop quizzes” at the beginning of every class. The questions were either fill-in-the-blank or short answer, covering important “secondary” material in the reading assignment, i.e., material that wasn’t covered by the equivalent of CliffsNotes or other cheat sheets — material that was easy enough to recall if a student had actually done the reading but wasn’t available any other way. I made a point of calling on all students by name in the course of class discussion, especially those who didn’t look interested. I also always had a few students drop out in the first week.
Was that mean or sadistic? It wasn’t meant to be. The idea was simple. Even back then a large percentage of students were there for credentials, not an education. The way I taught was designed to make sure they retained and understood at least a portion of what they read.
Today, from what I see and from what I hear from a large range of teachers at various levels, far too many students want to be spoon-fed the answers that will be on tests. They demand to know what will be on the test. And teachers are under incredible pressure to teach to the test and to get everyone through.
Back in the ancient days, we understood that no test could cover everything a student was supposed to have learned, and that the test was used as a sampling device. That was why tests were changed from semester to semester. It was also why enterprising students tried to gather questions from past tests in order to game the system.
These days, even when students know the facts, they have great difficulty in synthesizing and analyzing what those facts mean and how they apply in a particular discipline.
And that’s what you get when the emphasis is on getting everyone through with a credential rather than on learning the material and being able to explain it and apply it in ways that you weren’t ever taught.
The same was true of computer programming in the ’80s. The really good programmers had been trained for twelve or more months by their company before being let loose on customers; IBM had an 18-month programme, as it included six months of specialist training — on banking, for example.
This was after university. The first computer manufacturer I worked for had a policy of not recruiting computer science graduates as they (apparently) felt they knew it all and that corporate and military standards of coding were overblown and unnecessary.
Much later, before I retired ten years ago, we had fast-track development, open source, and Agile — and, as a result, code that was difficult to maintain. Outsourcing also meant that there was little control over quality standards.
Glad I am now out of it.
In my own “mature student” days, I noticed how often others would clog up the corridors doing last-minute cramming while waiting to begin a professional exam. I also noticed their faces when I handed over my completed exam papers and left them all still struggling (and still taking the same exam the next year). I’d completed the five-year course in 2 1/2 years. Cramming doesn’t work!
I keep saying this, so… blah blah blah blah Goodhart’s Law.
https://en.wikipedia.org/wiki/Goodhart%27s_law