Recently, a certain university insisted that tenured and tenure-track faculty turn in their required annual faculty activity reports in electronic format in order to save time. This particular university requires extensive documentation as proof of faculty activities and teaching skills, but it set out a helpful format, theoretically supported by a template, as well as a tutorial on how to comply with the new requirement.
The result was a disaster, at least in the College of Performing and Visual Arts. The template did not work as designed, so faculty couldn’t put the documentation in the proper places; even the two faculty members with past programming experience couldn’t make the system work properly. The supposed tutorial didn’t match the actual system. In addition, much of the documentation required by the administration existed only on paper, which meant hours of scanning, and to top it off, the links set up by the administration arbitrarily rejected some documentation. None of these problems has yet been resolved, the time spent by individual faculty members is already more than double what submitting the reports in hard copy required, and more time will doubtless be needed.
Yet this is considered time-saving. To begin with, the system was poorly designed, most likely because the administration didn’t want to spend the resources to do it properly. Second, to save a few administrators time, a far larger number of faculty members were required to spend extra time on paperwork that has little to do with teaching and much more to do with justifying their continuation as faculty members, despite the fact that even tenured faculty are already reviewed periodically.
Over the years, I’ve seen this in organization after organization, where the upper levels come up with “time-saving” or “efficiency” requirements that are actually counterproductive, because the few minutes they “save” for executives create hours of extra work for everyone else.
This tendency is reinforced by a growing emphasis on data analysis, but data analysis doesn’t work without data. So administrators create systems to quantify work, even work, such as teaching, that is inherently unquantifiable, especially in the short term. When such data-gathering doesn’t produce meaningful benchmarks, instead of recognizing that some work can’t realistically be captured in hard numbers, they press for ever more detailed data, which not only wastes more time but inevitably rewards those who can best manipulate the meaningless data rather than those who are doing the best work.
Output data for a factory producing quantifiable products or components is one thing. Output data for services is almost always counterproductive, because the best it can do is show how many bodies moved where and how fast, not how well or how effectively the services were provided. Quantification works, to a degree, for a fast-food restaurant, but not for education, medicine, law, and a host of other activities. Yet forms and surveys proliferate as the “business model” invades everything, and the result is wasted time and meaningless or misleading “data.”
And yet the pressure for analysis and quantification continues to increase yearly, with administrators and executives failing to realize that their search for data to improve productivity is in so many cases actually reducing that very productivity. Why can’t they grasp when enough is enough?
As they sometimes say in the Navy, “How come there’s always enough time to do it over but never enough time to do it right in the first place?”
I suspect the other reason for this desire to quantify everything is that if work can’t be quantified, the bosses have to either trust the people making the non-quantifiable judgments or learn the job well enough to judge those judgments themselves. The first requires giving up control; the second requires doing actual work. Neither is appealing to the kind of people who like to be administrators.
Our rural English village decided to produce planning policies to form part of our county development framework for the next 20 years. We used an online survey package for people in the village to provide their views (and, of course, a paper version for the majority who had no intention of going online; those responses then had to be entered into the package by hand).
Then came the analysis.
The easy part was the multiple-choice questions, since the survey software provided pretty graphs and quotable statistics.
We also included comment boxes for most questions.
You can guess where the value-add lay.
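To give a rough idea of the imbalance, here is a minimal Python sketch with invented responses (not the actual survey data): tallying the multiple-choice answers behind the pretty graphs takes one line, while the comment boxes still need a human reader.

```python
from collections import Counter

# Invented responses; the real ones came out of the survey package.
multiple_choice = ["Yes", "Yes", "No", "Yes", "No answer", "Yes"]
comments = [
    "Fine in principle, but not if it means building on the flood meadow.",
    "Only if the lane can cope with the extra traffic.",
]

# The easy part: the tally behind the graphs and quotable statistics.
print(Counter(multiple_choice))   # Counter({'Yes': 4, 'No': 1, 'No answer': 1})

# The hard part: free text can't be counted; someone has to read,
# group, and summarize the comments by hand.
for comment in comments:
    print("-", comment)
```

That reading and summarizing was where the real work, and the real value, turned out to be.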
This is true not only of requirements but also of communication, where the problem is not limited to management.
Anytime there’s one-to-many communication _or_ levying of requirements, there needs to be some effort on the part of the one not to waste people’s time. Poorly written emails do that every day, if not individually to the degree that poorly levied requirements do – as do excessive recipients, poorly maintained mailing lists, etc.
People need to realize that even if it takes them longer to do it right, the savings in other people’s time is probably far greater.
Data analysis and ‘data mining’: 90% of all the data that is gathered is crap. Few people actually look at it, and only a small fraction of those who do actually know what it means.
Not enough people know statistics these days. There are differences among kinds of data: categorical data cannot be collected and analyzed the way numerical data can (and even that depends on whether the numbers are continuous or discrete). Ordinal data can, in some cases, be converted from categories to numbers (e.g., the mind-numbing 1–5 scale, where 1 = intensely dislike, 2 = dislike, and so on up to 5 = intensely like), but too many people think it can all be reduced to an “add ’em up and divide” number that has actual meaning.
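To make that concrete, here is a minimal sketch (in Python, with invented responses; nothing here comes from any actual survey) of how the “add ’em up and divide” number erases what matters about ordinal data: two groups with opposite opinions produce exactly the same average.

```python
from statistics import mean
from collections import Counter

# Invented Likert-scale responses: 1 = intensely dislike ... 5 = intensely like.
polarized = [1, 1, 1, 5, 5, 5]   # half loathe it, half love it
lukewarm = [3, 3, 3, 3, 3, 3]    # everyone is indifferent

# The "add 'em up and divide" number is identical for both groups...
print(mean(polarized), mean(lukewarm))   # 3 3

# ...even though the underlying opinions could hardly differ more.
print(Counter(polarized))   # Counter({1: 3, 5: 3})
print(Counter(lukewarm))    # Counter({3: 6})
```

The mean treats the five categories as equally spaced numbers, which they are not; for ordinal responses, the full distribution (or at most the median) is the honest summary.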
I can only imagine what collegiate administrators are like. If they are anything like hospital administrators, I would weep, but I’m all out of tears.