If academic knowledge were simpler to understand and use, more people would understand more, misunderstandings would be less prevalent, the education industry would be cheaper and more efficient, and humanity would make
faster and better progress. I am convinced this is an idea with enormous
potential, but it does not seem to be on anyone's agenda, and there are very
strong vested interests opposing it.
Human
civilization depends on knowledge. Lots of it, ranging from how to use
Pythagoras's theorem to produce a right angle to the science behind mobile
phones and GPS, from the idea that germs cause disease to the science behind
modern medicine, from stories and ideas about how to produce them to the theories
behind voice to text software. There are lots of different types of knowledge,
and the boundaries of what counts as knowledge are a bit fuzzy. I'm talking
here about knowledge in people's heads, not the knowledge in databases and AI algorithms - although the implications of these for what people need to store in their heads are another vital and fascinating story.
Some
knowledge is easy and we pick it up naturally as we grow up. But some of it is
complicated - it's difficult to learn and use: this is the rather fuzzily
defined "academic" knowledge that I'm concerned with here. Two
massive, interlinked, industries have evolved to cope with these difficulties:
education which disseminates the knowledge, and what, in the absence of a
suitable word, I'll call the knowledge production industry or KPI. (In
universities knowledge production is called research, but from my point of view
this term is too restrictive because it seems to imply the search for the
"truth" by modern academics, whereas I need a term which also covers
the work of Pythagoras and people trying to devise ways of making driverless
cars.)
The KPI - scientists, researchers, and innovators, both now and throughout history - makes discoveries, invents theories, and devises better ways of dealing with the world,
and the results of their labours are then passed on to a wider audience by the
education industry - schools, colleges, universities, textbooks, etc. The
education system gets a lot of analysis and criticism: better ways of teaching
and learning are proposed and tested, and inequalities of access and the
ineffectiveness of a lot of the education system are bemoaned ad nauseam, and
so on. But the knowledge itself is seen as given, fixed, handed down from the
experts, and the job of education is to pass it on to students and the wider
public in the most efficient possible way.
My
suggestion here is that there are often good reasons to change the knowledge
itself to make it simpler or more appropriate for the audience. An
important KPI (key performance indicator) for the KPI (knowledge
production industry) is the simplicity of the product.
This
idea stems from a number of sources, some of which I'll come on to in a minute,
but first a little thought experiment. Imagine that a bit of knowledge could be made 50% simpler, so that, for example, the time needed to learn it, or to use it, was halved, or that it led to about 50% fewer errors and
misconceptions in implementation. Imagine this applies to all knowledge taught
in universities and similar institutions. Then students would learn about 50%
more, or they would understand about 50% better, or have about 50% of their
time free to do something else. Leading edge researchers would arrive at the
leading edge in half the time they take at the moment, giving them more
time to advance their subject. If such simplifications could be made across the
whole spectrum of knowledge, this would represent an enormous step forward for
humanity.
You
might think that the innovators and researchers of the KPI would have honed their
wares carefully to make them as simple as possible, so a 50% improvement is
simply impossible. But you'd be wrong. Very wrong. Except at the leading edge
there is absolutely no tradition in the academic world of trying to make things
simpler. Simplicity is for simple people, not for clever academics.
I've had a paper rejected by an academic journal because it was too simple: it
needed to be more complicated to appear more profound. Teaching and learning
methods are tweaked to make them easier for learners, but the knowledge itself
is considered sacrosanct: the experts have decreed how it is, and that's it.
There
are exceptions: areas where simplicity is a prized quality of knowledge. One
interesting example is the leading edge of one of the most complicated areas of
human knowledge: the physics of things like quantum mechanics and cosmology.
I've just been watching an interview with the physicist Roger Penrose, who was
recounting his difficulties with lectures at Cambridge University: they were too
complicated to understand so he had to invent simpler ways of looking at the
issues. Einstein is supposed to have said that everything should be made as
simple as possible, but not simpler. I also came across similar sentiments by
two other Nobel prize winners, Paul Dirac and Murray Gell-Mann, and yet another
Nobel winner, Richard Feynman, invented a type of diagram (subsequently called a Feynman diagram) which gives "a simple visualization of what would otherwise be an arcane and abstract formula" (Wikipedia). Where things are
really difficult, simplicity is essential. But behind the pioneers of the
discipline, the normal practice is to accept what the gurus have produced.
The
history of science and the growth of knowledge in general are punctuated by occasional
revolutions that often lead to far simpler ways of looking at things. The
invention of the alphabet made record keeping far easier and more flexible so
that all sorts of stories could have a wider audience, and the replacement of
Roman numerals by the current system (2019 instead of MMIX) did a similar job
for arithmetic. The ideas introduced by Galileo and Newton provided a way of understanding and predicting how things move which can be summarised in a few simple equations (shown below) and covers everything, both on earth and in the heavens. This
would probably not have been considered simple by contemporaries of Galileo and
Newton, or many present day students, but the equations are staggeringly simple
when you consider what they achieved. Similarly, Charles Darwin's theory of
evolution by natural selection provides a ridiculously simple explanation of
the evolution of life on earth.
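To make the point about Galileo and Newton concrete: the two equations at the heart of the Newtonian synthesis, the second law of motion and the law of universal gravitation, fit on a single line:

$$F = ma, \qquad F = \frac{G\,m_1 m_2}{r^2}$$

Everything from a falling apple to the orbit of the Moon follows from these two.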
But
what about the detailed, mundane stuff that students spend their time learning?
Quadratic equations and statistics, chemistry and the methodology of
qualitative research, medicine and epistemology? Are there opportunities for
simplification here?
My
contention is that there are, and the fact that they are almost never taken is a
massive lost opportunity. There are two important differences between the
situations of the leaders and followers in a discipline. The first is that the
leaders will have a really good understanding of all the stuff leading up to
their innovation - the mathematics, other results in the field, the meaning of
the jargon, and so on. The followers are, inevitably, not going to have
such a thorough understanding of the background (they've got better things to
do with their time). The second is that the motivations are likely to be
different. The followers will want to fit new ideas into the mosaic of other
things they know and the current concerns of their lives with as little effort
as possible; the leaders, on the other hand, are likely to have a burning
desire to progress their discipline in the direction they want to take it.
These two factors mean that the best perspective for the followers may not be
the same as for the leaders.
But is
this possible? Are there alternative, simpler, or more appropriate,
perspectives in many branches of knowledge? Well, yes, there are: difficult
ideas have often spawned popular versions, or, as cynics would say, they have
been dumbed down for the masses. But pop science is not usually serious
science: if you want to use the ideas for real, or make breakthroughs yourself,
the dumbed down, popular version will not do: you need the original ideas
produced by the leaders, the experts themselves.
This is
not what I am talking about here. What I am suggesting is the possibility of producing a simpler, more appropriate version for the followers, but one that is
as useful and powerful as the original expertise produced by the experts. Or,
possibly, more useful and more powerful.
I used to teach in a university, in several colleges, and on short courses for business. As a teacher you try to explain your material as clearly as possible.
But often, perhaps usually, I found myself thinking of alternative ideas which
I thought were more appropriate. And I've been doing this for 40 years,
publishing the occasional article on what I came up with (the first such
article was published in 1978: there is a list of a few more here).
The
area I thought about in most detail was statistics. There are three key
innovations I would like to see promoted here. The first is computer simulation
methods: instead of working out some complicated maths for lots of specific
situations, you just do some simulation experiments on a computer so that you
can, literally, see the answer and where it comes from (e.g. Bootstrap resampling ...). The
second is jargon, which needs changing where it is misleading. The worst
offender is the word "significant". This has a statistical meaning,
and a meaning in everyday language which is completely different. This leads to
massive, entirely predictable, and avoidable problems. The third is to
focus on ideas that are helpful as opposed to ideas which fit statistical
orthodoxy - see for example Simple
methods for estimating confidence levels ... .
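To illustrate the first of these three, here is a minimal sketch of the bootstrap idea (the data are made-up numbers, purely for illustration): instead of deriving a formula for the uncertainty in a mean, you resample the data on a computer and watch how much the mean varies.

```python
import random

# Made-up data: ten observations from some hypothetical study
data = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 5.8, 4.1, 5.2]

def bootstrap_means(sample, n_resamples=10_000):
    """Resample with replacement and record the mean of each resample."""
    means = []
    for _ in range(n_resamples):
        resample = [random.choice(sample) for _ in sample]
        means.append(sum(resample) / len(resample))
    return sorted(means)

means = bootstrap_means(data)
# The middle 95% of the resampled means gives an approximate 95% interval
lower = means[int(0.025 * len(means))]
upper = means[int(0.975 * len(means))]
print(f"Sample mean: {sum(data) / len(data):.2f}")
print(f"Approximate 95% interval: ({lower:.2f}, {upper:.2f})")
```

No probability theory beyond "shuffle and look" is needed to see where the interval comes from.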
Other
areas I pondered include research methods as taught in universities (a lot of
the jargon is best ignored: see Brief notes on research methods and How to make research useful and trustworthy), decision analysis (The Pros and Cons of
Using Pros and Cons for Multi-Criteria Evaluation and Decision Making), statistical
quality control, mathematical notation
in general, Bayes' theorem in statistics (see pages 18-22 of this article), and the maths of
constant rates of growth or decline (traditionally dealt with by exponential
functions, calculus and logarithms but this is quite unnecessary).
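To give a flavour of that last point, here is a sketch, with made-up numbers, of constant-rate growth handled by nothing more sophisticated than repeated multiplication: no logarithms, calculus or exponential functions required.

```python
# Hypothetical example: a quantity (money, a population) growing at 3% a year
amount = 1000.0
rate = 0.03

for year in range(1, 11):
    amount *= 1 + rate  # each year, multiply by the same factor
    print(f"Year {year:2d}: {amount:8.2f}")

# The same answer in one step: after n years the quantity is simply
# start * (1 + rate) ** n - repeated multiplication, not e to the power kt.
print(f"Direct:  {1000.0 * (1 + rate) ** 10:8.2f}")
```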
Did I
act on these ideas and teach the simpler versions that I felt were more
appropriate? Sometimes I did, but usually I didn't. I was paid to teach the
standard story, and didn't feel I could go out on a limb and teach my own
version - which was usually untested and might not work. The standard story was what the students and the organisations I was working for expected. And, besides, the
system has an inertia that makes it difficult to change just one bit. The term
"significant", for example, might be, in my view and the view of many
others, awful jargon describing an awful concept which promotes confusion and
discourages useful analysis, but it is very widely used in the research
literature so people do need to know what it means.
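To see why I regard "significant" as awful jargon, here is a small hypothetical simulation: with a big enough sample, an utterly trivial difference between two groups comes out as "statistically significant".

```python
import math
import random

random.seed(1)

# Two hypothetical groups whose true means differ by a negligible 0.01
n = 1_000_000
group_a = [random.gauss(0.00, 1) for _ in range(n)]
group_b = [random.gauss(0.01, 1) for _ in range(n)]

diff = sum(group_b) / n - sum(group_a) / n
se = math.sqrt(2 / n)  # standard error of the difference (both sds are 1)
z = diff / se

print(f"Difference: {diff:.4f}, z = {z:.1f}")
# z comes out far beyond 1.96, so the difference is "significant" in the
# statistical sense, even though it is of no practical importance to anyone.
```

The everyday reading of "significant" points in exactly the opposite direction to what the numbers are saying.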
There
were exceptions where I did follow my better judgment. Computer simulation
methods in statistics are widely used (but usually only where other approaches
fail) so, on some courses, I did use these. And sometimes, as with research
methods, the problem was that a lot of the standard material was just a waste
of time and was best ignored so that we could focus on things that mattered.
But even here, by not explaining the t-test, or not emphasising the distinction between qualitative and quantitative methods, I was failing to meet the
expectations of many colleagues and students.
But
surely, you're probably still thinking, if there really is such an enormous
untapped potential, people would be tapping into it already? Part of the reason
why they aren't, or are only to a very limited extent, is that the forces which
act against change are very powerful and go very deep. I was so deeply enmeshed
in the assumptions of academic statistics that an obvious alternative to the
concept of significance in statistics (as explained in Simple
methods for estimating confidence levels ...)
did not occur to me for 30 years after publishing an article critical of
the concept, and the statistics journals I submitted my idea to rejected it, often with a comment along the lines of "if this were a good idea the gurus of statistics would have thought of it".
As well
as the inevitable conservatism of any cognitive framework there are three
factors which are peculiar to the knowledge production industry: the peer
review system, the lack of a market or responsive feedback system for
evaluating ideas and theories, and the desire of the education system to
preserve "standards" by keeping knowledge hard. I'll explain the
problems with each of these in the next three paragraphs.
The
peer review system is the way new academic knowledge is vetted and certified as
credible. Articles are submitted to a journal in the appropriate field; the
editor then sends it out to two or three (usually anonymous) peer reviewers - often people who have published in the same journal - who make suggestions for
improving the article and advise the editor on whether it should be published.
The fact that an article has been published in a peer reviewed journal is then
taken as evidence of its credibility. This system has come in for a lot of
criticism recently (e.g. in Nature): mistakes and
inconsistencies are common, but one key issue is that the reviewers are peers:
they are in the same discipline and likely to be subject to the same biases and
preconceptions. Peer reviewers would seem unlikely to be sympathetic to the
idea of fundamentally simplifying a discipline. I think some non-peer review
would be a good idea as advocated in this article.
Mobile
phones and word processors are relatively easy to use. You don't need a degree
or a lot of training to use them. This is because if people couldn't use them,
they wouldn't buy them, so manufacturers make sure their products are
user-friendly. There are lots of efficient mechanisms (purchase decisions,
reviews on the web, etc) for providing manufacturers with feedback to make sure
their products are easy to use. The academic knowledge ecosystem lacks most of
these feedback mechanisms. If some knowledge is difficult to master, you need
to enrol on a course, or try harder, or give up and accept you're too lazy or
not clever enough. What does not happen is the knowledge producer getting a
message along the lines of: "this is too complicated, please simplify or think
again."
This is
reinforced by the education system which has a strong vested interest in
keeping things hard. There is an argument that the purpose of the education system isn't so much to teach things that are useful (everyone knows that a lot of what is learned is immediately forgotten and never used), but to let students "signal" to potential employers that they are intelligent and hard-working (an idea popularised by the book reviewed here). From this perspective
difficult knowledge is likely to be better for differentiating the worthy from
the less worthy students. The fact that the difficult knowledge may not be much
use is beside the point, which is that the less diligent and intelligent should
fail to understand it. And of course difficult knowledge enhances the status of
teachers and means that they are more obviously necessary than they would be if
knowledge were easier. Universities would lose most of their business if
knowledge were easy to master: teaching and assessment would be much less
necessary.
Knowledge,
of course, is not just to make the economy more efficient; it is part of our
cultural heritage and what makes life worth living. Arguably, we have a duty to pass on the work of the masters to future generations. OK, some unnecessarily
complicated theories may have historical or aesthetic value, but, in general,
if there is a simpler, more elegant version, isn't this preferable?
So ...
I would like to propose that simplifying knowledge, or making it more
appropriate for its purpose, is an idea that should be taken seriously.
Otherwise knowledge will evolve by narrowly focused experts adding bits on and
making it more and more complex until nobody really understands what it all
means, and progress will eventually grind to a halt in an endless sea of
technicalities.
This
requires a fundamentally new mindset. First we need some serious creative
effort devising new ways of looking at things, and then empirical research on
what people find useful, but also simple and appealing. Perhaps knowledge
should be viewed as art with aesthetic criteria taken seriously? Whatever we
are trying to do - discover a theory of everything, cure diseases, prevent
suffering or make people happier - simplicity is an important criterion for
evaluating the knowledge that will best assist us.
Then we would make faster progress, more people would understand more, less time would be wasted on unnecessary complexities, and we would make fewer silly mistakes.