
August 2014

'Less' and 'fewer'

10 items or less

This controversy rumbles on.  The basic rule is:

Less of amount; fewer of number.

(We’ll talk about rules in a moment.)

Use fewer when referring to anything that you can count.

    These days, people buy fewer newspapers.

    We have fewer women studying science than we would like.

Use less when you’re referring to something that can’t be counted or doesn’t have a plural (for example: air, time, traffic, music).

    At the end of the week I always seem to have less money.

    Now that I’m singing regularly in a choir, I listen to less music on the radio.

We also use less with numbers on their own.

    His weight fell from 18 stone to less than 12.

When numbers refer to distance, time, ages or sums of money, we use less.  That’s because we’re thinking of the number as measuring an amount of distance, time, age or money; we’re not counting the individual kilometres, minutes, years or pennies.

    Their marriage lasted less than two years.

    Trafalgar Square is less than three miles from the Tower of London.

    The project should take less than four weeks to complete.

    They had been married for less than three years.

    The operation should cost less than £3000. 

In 2008, The Daily Telegraph reported that Tesco was replacing its checkout notices reading ‘Ten items or less’ after a long-standing run-in with grammatical sticklers.  An admirably honest spokesman for the store admitted:  “The debate about what is right has been going on for years now, and I still don't think we know if 'less' or 'fewer' is correct.”

Maybe, in the end, they needn’t have worried.  The Pocket Fowler’s Modern English Usage later commented:

    Supermarket checkouts are correct when the signs they display read 5 items or less (which refers to a total amount), and are misguidedly pedantic when they read 5 items or fewer (which emphasizes individuality, surely not the intention).

 [Thanks to the Oxford Words blog for that citation.]

And in fact, the whole controversy about less and fewer is arguably a case of misguided pedantry. 

The plain fact is that English speakers have used less for countable nouns for the best part of a thousand years.

Both less and few derive from old Germanic languages, and they’re first recorded in Old English texts in the 700s.  Alfred the Great, no less, used less with countable nouns, in around 888.

Swa mid læs worda swa mid ma, swæðer we hit yereccan mayon.

With less words or with more, whether we may prove it.

Some say that’s because he was using a partitive genitive.

Whoa: definition alert. 

The word 'genitive' refers to possession.  The word 'manager's' in the manager's office is in the genitive.  We can also say the office of the manager - though we probably wouldn't.  But we can say most of the managers - and that's a partitive genitive:  a genitive used to indicate a whole divided into parts.  Most of us is another typical example. 

Keeping up? 

Well, læs worda when Alfred wrote it means literally ‘less of words’.  When English lost its genitive plural case – at the end of the Old English period – people simply dropped the of (as you do), and started saying less words (or sheep, or cakes, or whatever’s countable). 

And we’ve continued to use this construction ever since.

Fewer, the comparative of few, appeared in English much later, around 1340.  So it was always rather weaker in the folk memory than less.

Fewer received a bit of a boost in the late 18th century – when so many ‘rules’ of correctness were born.  In 1770, Robert Baker wrote his Reflections on the English Language, which was seemingly the first in a long line of guides to correct English usage (of which Fowler is now probably the most famous).  Baker cautiously commented on the use of less:

    This Word is most commonly used in speaking of a Number; where I should think Fewer would do better. "No Fewer than a Hundred" appears to me, not only more elegant than "No less than a Hundred," but more strictly proper.

And, with those thoughtful and modest words, a rule was born. 

By 1856, when New York publisher Daniel Burgess brought out the anonymous Five Hundred Mistakes of Daily Occurrence, the use of less for fewer had become regarded as a - well - daily mistake.

And we've been worrying about it ever since.

The distinction can be useful.  Burchfield quotes a newspaper magazine:

School leavers.  Over the next few years, you're going to see a lot fewer of them.

Which certainly means something quite different to you'll see a lot less of them.

But on the whole, most of us will probably continue to prefer less to fewer - and misguided pedants will continue to rail at us.

All of which goes to show that, like so many rules of usage, this one is essentially artificial.  It doesn't reflect the 'natural' usage of native speakers.  In the end, as The Cambridge Guide to English Usage reminds us, the choice between fewer and less is - more or less - not a matter of correctness, but of style.

 


Take responsibility, take ownership

In two earlier posts, I discussed blame and resistance.  Both are natural and predictable responses to problems that we place in our Circle of Concern:  the place where we put the problems life throws at us, and which we feel powerless to tackle.

Call them Presented Problems.

We usually express a Presented Problem as a statement of what’s wrong. There’s a perceived gap between what is and what should be.

Inside the Circle of Concern is our Circle of Influence. Into that circle we place the problems we feel we can deal with. Being more effective, according to Stephen Covey, means concentrating on the problems we can control. Being more effective means increasing our Circle of Influence.

But it’s not easy. Our wired world helps us to fill our Circles of Concern very easily. Yet, despite our helpless rage as we endure another night of Channel 4 News, it’s important to remember that we choose to put problems into one circle or the other. “Our behaviour,” says Covey, “is a function of our decisions, not our conditions.”

We take responsibility when we choose to take ownership of a Presented Problem.  Presented Problems happen to us: we’re not responsible for their existence, but we can take responsibility for dealing with them. 

Being responsible always means having an obligation to someone else, or to a group, or to society. After all, it’s other people who hold us responsible for our actions. So to be responsible is to enter into a kind of contract. We might speak of honouring our responsibilities. To be accountable means that someone can hold us to account for our responsibilities.

Most of our work, at work, then, is filled with responsibilities.


Responsibilities are paradoxical. On the one hand, like contracts, they have limits. Once we’ve discharged our responsibilities, we can walk away. It’s what we do at the end of the working day (some of us).

 

On the other hand, taking on a responsibility, like signing a contract, must be a free act. To be responsible for your actions is to know what you're doing – and to be free to choose not to. "A hero," said Bob Dylan, "is someone who understands the responsibility that comes with his freedom."

Responsibilities tend to have certain troubling features. Here are just a few. (Think of your responsibilities at work, and you may see what I mean.)

  • Unclear goals. The person handing responsibility to us may not know precisely what they want us to do, or – even more troublingly – what they don’t want us to do.
  • Lack of control. If we had complete influence over what to do and how to do it, we’d be happier.
  • Lack of immediate feedback. We may have to check with others about aspects of the problem: information, deadlines, or criteria of success. And those people may be unavailable.
  • A mismatch between challenge and skill. We often have to assume responsibility for problems that are trivially easy, boring and tedious (call them chores). We may have to take responsibility for problems that are mind-numbingly difficult (call them headaches).

The best way to honour a responsibility is to know beforehand precisely what you’re taking on.

Why?

What’s the overall objective in tackling this problem? What outcome are you being expected to achieve? Do you have SMART goals (are they specific, measurable, achievable, realistic and timely)?

Who?

To whom are you accountable? And for whom are you accountable?

What?

What precisely is the problem you’re taking on? How well do you understand it? How well defined is it?

When?

Is there a deadline? Are there milestones that you will be expected to hit?

Where?

Where will your solution have an impact? How far does your responsibility stretch?

How?

What authority have you been granted? What constraints or restraints will you be expected to operate under? (Restraints are the things you can’t do; constraints are the things you must do.) What resources are available to you? What support can you count on?

Responsibility may be the price of freedom, as Elbert Hubbard suggested. But it carries a mighty payoff. "Let us not," said JFK, "seek to fix the blame for the past. Let us accept our own responsibility for the future." Blame and resistance look back; responsibility looks forward. Responsibility creates hope.

This post is based on material in my book, How to Solve Almost Any Problem. I run courses on problem-solving and decision-making. Check out an outline here.

How to solve almost any problem


Resistance is futile

Before you read on, please do this.

Count the number of pieces of clothing you put on this morning (pairs count as one), and write the number down.   Now, do a multiplication sum.   For example, for seven pieces of clothing, calculate: 7x6x5x4x3x2x1.

Did you do what I asked?  I’ll bet you didn’t.   My request interfered with your desire to read this article; and you resisted.   

In psychological terms, I tried to wrench you out of procedure.  And probably failed.

Procedural memory and why it’s good for us

Procedural memories underlie the routines that make us effective.   By performing the same task repeatedly, we ‘pattern in’ the relevant networks in our brains until they fire automatically.   Before we know it, we’re driving the car, playing tennis, or getting dressed – without thinking.   

(Back to that calculation.   If you put on seven pieces of clothing this morning, you faced 5040 possible ways of getting dressed.   Hence the need for a routine!)
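
(If you’d rather let a machine do that sum, here’s a minimal Python sketch of the same arithmetic: the number of possible orders for n pieces of clothing is n factorial, that is, n x (n-1) x ... x 1. The range of values shown is purely illustrative.)

```python
import math

# The number of possible orders for n pieces of clothing is n factorial:
# n x (n-1) x ... x 1.
for n in range(1, 8):
    print(f"{n} pieces of clothing: {math.factorial(n)} possible dressing orders")

# For 7 pieces this prints 5040, the figure quoted above.
```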

Getting dressed
Cartoon: Martin Shovel

Procedural memories have two key characteristics.   First: we must repeat them many times before they become embedded.   Second: once embedded, they’re permanent.   Even if you don’t ride a bicycle for years, you’ll remember how to do it after a few moments. 

As a result, procedural memories tend to resist being modified.   The more solidly imprinted the procedural memory, the more we resist changing it.   Remember the resistance that met decimal currency?  (Some of us remember!)

Throwing our toys out of the pram

We accumulate procedural memories over time.   Obviously, older people have more of them: more potential for resistance, perhaps.

But resistance isn’t just a sign of growing older.

Think of the Terrible Twos.   Babies don’t resist.   That bawling is a clear demand for help: a fail-safe survival tool.   Who can ignore a crying baby?   As we grow up, we discover the power of these signals and start to use them deliberately.   And so the tantrum is born.

In adolescence, we develop new needs.   Along with physical needs – food, water, sleep and exercise – we develop emotional needs: security; attention; intimacy; community; privacy; status; a feeling of competence and achievement; a sense of meaning in our lives.   Teenagers have powerful needs for both autonomy and belonging: threaten those needs and you’ll probably encounter resistance.  Ask any parent.

Resistance satisfies the important need to be in control.   But it can also do real harm.  Resistance, after all, is a form of stress.   

So we need to be able to manage it.

What need is not being met?

Start by recognizing the symptoms.   Are you putting off tackling the problem?  Or engaging in avoidance behaviours (‘I’ll just make a coffee first...’)?  Perhaps you’re indulging in malicious compliance: carrying out instructions to the exact letter, knowing that following the rules could inflict damage.   This kind of resistance – like denial – can be positively dangerous. 

Second, identify the need that’s being threatened.   Does the problem make you feel unsafe or exploited?  Does it threaten your sense of competence or status?  If you can find a way of meeting the need and solving the problem, your resistance might evaporate.

All things being equal (which they never are), our resistance to a problem is likely to decrease if we can make the problem more controllable, less serious, more urgent and less surprising.   If we can choose when to tackle the problem, so much the better.

What do you really, really want?

We usually resist because we feel powerless.   But sometimes, what threatens that need for control isn’t outside us; it comes from our deepest core.

Resistance includes desire.   Without the desire, what are you resisting? 

We humans are explorative solution-seekers more than problem-solvers: our natural urge is to look around for something better to do – or be.   (Think, again, of toddlers.)  As we grow, we sometimes resist our natural talent to find our true potential – because, to grow, we have to give up some control.   As Marianne Williamson famously wrote:

“Our deepest fear is not that we are inadequate.   Our deepest fear is that we are powerful beyond measure.  It is our light, not our darkness that most frightens us.  We ask ourselves, ‘Who am I to be brilliant, gorgeous, talented, fabulous?’ Actually, who are you not to be?”

So: if you find yourself resisting, ask two questions.

What do I need right now? And what do I really want to do?

This post is based on material in my book, How to Solve Almost Any Problem.

How to solve almost any problem


What shall we do?

Closing the mind gap

Closing the Mind Gap

Ted Cadsby

BPS Books, 2014

ISBN 978 1 927483 78 7

£18.00

China Miéville sets one of his novels, The City & the City, in two cities occupying the same physical space.  Citizens of each city, partly through choice and partly through political coercion, have trained themselves to ‘unsee’ the other city: to recognize the buildings and inhabitants of the other city without seeing them.  Crossing the cognitive divide, even by accident, is regarded as ‘breaching’ – a terrible crime invoking unspeakable punishments.

Ted Cadsby, in his ambitious and enjoyable new book, similarly invokes two coterminous worlds.  We live in both, but usually recognise only one.  The consequences of ignoring the other can be profoundly damaging.

World #1 is, in his description, ‘straightforward’.  In World #1, we easily differentiate meaningful signals from noise; patterns are consistent across different situations; feedback is direct, timely and clear.  In World #1, learning is easy and prediction is reliable.  World #1 is the world “in which countless generations of our ancestors lived and in which we continue to spend much of our time.” 

World #2 is ‘complex’.  In World #2, signals are buried in noise; patterns vary across situations because each situation is unique; feedback on our actions is indirect, delayed and ambiguous.  World #2 has, Cadsby suggests, “snuck up on us”, principally in the evolutionary blink of an eye that witnessed the Industrial and Information Revolutions. 

The farmers of World #1 could reliably expect their predictions to turn out correctly (except, presumably, when they didn’t); the knowledge workers of World #2, in contrast, “cannot rely on simple cues and timely feedback to make decisions.”

Cadsby argues that our brains have evolved to navigate World #1 and are unprepared for World #2.   In fact, we have, figuratively, two brains: the ‘old’ brain, which operates unconsciously, and the ‘new’ brain, which has evolved over the past 100,000 years and which we think of as conscious.  We think automatically with the ‘old’ brain, and effortfully with the ‘new’ one.  But the partnership is unequal:  the ‘new’ brain has limited access to the ‘old’ one.  As a result of this ‘brain-brain’ gap, the way we think is not always matched to our modern world, and so we face the second challenge of a ‘brain-world’ gap. 

The challenge is to close the gaps.

Cadsby’s book works with an explanatory narrative of human cognition that has developed rapidly over the past decade or two.   The ‘left-brain-right-brain’ narrative of the 70s and 80s has gradually given way to an ‘intuition-and-rationality’ narrative, under the influence of psychology, complexity science, evolutionary anthropology, cognitive science and what’s loosely referred to as neuroscience.  Paul MacLean's model of the triune brain helped get the narrative going; Guy Claxton’s Hare Brain, Tortoise Mind, Stephen Mithen’s The Prehistory of the Mind and Chris Frith's Making Up the Mind have all made interesting contributions.  

Like its predecessor, the ‘intuition/rationality’ narrative relies on a satisfyingly simple dichotomy.  Where the earlier explanation concentrated on a lateral division between left and right brain, the new one emphasizes a vertical division, the ‘new’ brain (represented by the neocortex) sitting on top of the ‘old’, intuitive, emotional brain (represented mostly by the hippocampus and the amygdala). 

This new narrative has considerable explanatory power.  Cadsby argues that “our minds are meaning-making machines”: we predict the nature of reality by intuitively pattern-matching to pre-existing mental models, some inherited (like the ability to recognize a face), some learned (like the ability to ride a bike).  ‘Constructive realism’ is useful in World #1 because in this world the pattern-matches are usually more or less accurate; but in World #2, constructive realism falls prey to “greedy reductionism”: we oversimplify complexity and conclude overconfidently.   

Type 1 thinking, intuitive and automatic, will help us solve straightforward problems, but not complex ones.  It will help us read a novel but not write one; eat a meal but not cook it; watch tennis but not play it.  If we want to understand complexity more effectively, we need to invoke Type 2 thinking.

The catch is that Type 2 thinking requires concentration.  Where Type 1 is quick, Type 2 must be slow; where Type 1 operates in parallel, Type 2 can operate only one task at a time.  Much of the book is devoted to the strategies necessary to develop Type 2 thinking: study the problem landscape more carefully; pursue missing information; analyse causal relationships; and so on.  Cadsby suggests that we need to develop two types of Type 2 thinking:  Type 2.1, which helps us model complexity more accurately; and Type 2.2, thinking about thinking, which “brings us as thinking agents into the process of thinking”.  Cadsby calls Type 2.2 ‘metacognition’ and, with a Buddhist inflection, ‘mindfulness’. 

But we’re not inclined to do either.  We prefer Type 1 thinking.  For one thing, effortful thinking requires – well – effort, and we need to conserve cognitive energy.  Worse still, we’re addicted to certainty: we need to know, we need to be in control, and we’re desperate to enjoy the calm, pleasurable (intuitive) feeling of knowing that we have figured something out.  Ambiguity and doubt create too much discomfort.

Closing the Mind Gap develops this thesis in great detail.  Cadsby synthesises huge quantities of information and explains it elegantly.  This may not be quite a popular science book and it may not be quite a management book; but it's certainly a page-turner.  Cadsby is much influenced by Daniel Kahneman (Thinking, Fast and Slow), although he also cites the work of Robin Hogarth, Nassim Nicholas Taleb and Keith Stanovich, along with a host of experimental evidence to support his argument.  Along the way, he offers excellent accounts of theory of mind, the workings of the emotions, Bayesian probability theory and much more.  For anybody interested in understanding why we so often fail to think as well as we can, this book will be useful (though I wish his endnotes indicated his sources more precisely). 

And yet, and yet.  Something bothers me. 

To begin with, I’m not sure about these two worlds.  How do we distinguish #1 from #2?  Are they not both simply mental constructs?  After all, as Cadsby himself says:  “our earliest forms of conscious awareness enabled language, culture and innovation, and we began to create a new world for ourselves.”  We find ourselves paradoxically limited in our ability to understand the cognitive complexity that we ourselves have generated. 

And then, understanding complexity is never the whole story.  The primary function of a brain is to enable an organism to move.  If “all life is problem solving” – as Karl Popper suggested – then, as Cadsby points out, “the brain interprets its environment so it can motivate actions that are conducive to thriving.”  Or, to quote José Ortega y Gasset:  “Living is a constant process of deciding what we are going to do.”  The truth, however complex, matters less than the solution, which is not an answer but an intervention in the world.

Cadsby touches on decision-making.  He discusses the Taylor-Russell diagram; and he acknowledges, entertainingly, the provisional quality of all decisions.  But his advice on how to decide better is somewhat negative: we should qualify our conclusions with ‘probably not’, ‘could be’ or ‘it appears to me that...’  I’d like more emphasis on how to choose what to do, and how to manage risk. 

Perhaps Cadsby has picked up Kahneman’s pessimism, along with the undoubted insights of behavioural economics.  It seems that the best we can do is overcome – effortfully – our inevitable cognitive shortcomings.  For example, we read a lot about confirmation bias, availability bias and myside bias, but nothing about optimism bias: the tendency to assume that everything will turn out ok, which becomes a useful learning tool when we are surprised by failure or the unexpected.  (I’d like to see more in the book about learning.)  Rather than celebrating our successes in combining Type 1 and Type 2 thinking – in collaborative research, artistic production, business and diplomacy – Cadsby invokes the quietism of Stoicism and Buddhism to help us outmanoeuvre Type 1 thinking and the depressing negativity bias of our emotions.  (“The marginal value of eating and sex declines rapidly once we have had our fill, but the marginal value of avoiding danger never declines.”  Hm.) 

What’s missing?

The clue may be in the ‘cultural big bang’ that Cadsby describes early in the book.  It’s a critical part of the narrative.  This was the moment, perhaps 50,000 years ago, when human consciousness seemed to take a sudden leap forward, “fuelled by the ... ability to communicate complex ideas and generalize learning by applying insight from one task to different ones.”   Something happened to our thinking; something that allowed us to transcend the difference between Type 1 and Type 2 thinking and combine them; something that offered us the opportunity, not merely to generalise, but to create wholly new ideas.  Cadsby acknowledges that this cognitive leap expanded our working memories and enabled us to speculate about the past and the future.  But there’s a more radically significant element in this new ‘cathedral of the mind’, as Stephen Mithen has called it.  And Cadsby, I can’t help feeling, has missed it.

That element is metaphorical thinking.   

“The metaphor,” said José Ortega y Gasset, “is probably the most fertile power possessed by man.”  Metaphorical thinking has generated the massive potential for creativity that continues to drive our cognitive development.  Where, I wonder, might metaphor fit into Ted Cadsby’s splendidly articulated argument?