Every week the media highlight another threat to human health - from BSE to powdered baby milk. But how much do we really know about such risks? Since 1987 the Society for Risk Analysis - Europe has advanced our scientific understanding of a subject that increasingly excites academics. Next week it holds its annual meeting, organised by Ragnar Lofstedt, at the University of Surrey. This supplement contains the issues to be debated.
Next Monday, wander into the University of Surrey, find the big conference and enjoy a round of spot-the-insurance-people. It will be a great venue for the game because of the crowded mix of disciplines attending: scientists, industrialists, food researchers, engineers, psychologists and sociologists, as well as insurance specialists.
For the 1996 meeting of the Society for Risk Analysis - Europe the organisers have managed to get the insurance industry to take part. They hope that if insurers, who have in the past seemed immune to sociological musings, rub shoulders with sociologists, each might learn from the other's concerns.
Risk is a subject with many corners. In one, insurers are bent on a quantitative quest. They want precise descriptions of the risk of particular events. In another, sociologists began a few decades ago with a simple question of the type: why do people fear plane crashes more than car accidents when they are far more likely to die in the latter? Over the years their research agenda has widened into the study of entire cultural and political systems.
Insurers are interested in the likelihood of marine disasters or building collapses. But they also have to deal with more elusive concepts - such as "moral hazard". Once a person has insured a possession he or she may treat it less carefully because the financial incentive to look after it has diminished. Thus the risk that it will be damaged is magnified. Another concept, "adverse selection", arises because people who know they are high-risk are the most likely to take out insurance, skewing the pool of those covered.
Meanwhile, psychologists and sociologists are doing something different. They began by constructing a table of "reasons to be fearful", which added hugely to our understanding of public anxieties. They identified a range of factors that cause people to overestimate risk, a range which, it has been pointed out, may partly explain the public's extreme reaction to BSE. Fears are heightened if the type of death risked is vivid; if exposure is involuntary; if people have no personal experience of the risk; or if the effects of exposure are delayed.
But these findings did not explain why individuals have different perceptions of, and reactions to, the same risk. Researchers have now found that women tend to fear risks more than men do, and that non-whites in America are more fearful than whites. In fact, white males stand out as a group less fearful of most risks than any other.
These and other findings have had several spin-offs. One is the idea that much fear of risk can be explained as a corollary of the trust people place in the relevant bodies. Trust is the popular concept of the moment, says Ragnar Lofstedt, director of next week's conference. The risk to health from an accident may be very small in terms of the likelihood of a technological failure. But our judgement about whether to believe what the government or industry asserts about the risk will rest on whether we trust these bodies.
The public, as Brian Wynne, research director at the centre for the study of environmental change at Lancaster University, says, is also realistic about how the world works. Again, people's fears about BSE can partly be explained by the fact that the public knows government rules on cattle feeding and slaughter are not necessarily being strictly implemented.
Another notion that has had some influence in the field was developed by the British anthropologist Mary Douglas. She widened the academic study of risk perception by arguing that it varies with the kind of society or group in which an individual lives. If a risk threatens institutional arrangements that the group values highly, then the group's fear of that risk will be correspondingly greater.
This "cultural" approach suggests that people's perceptions of risks cannot be entirely accounted for by means of psychological approaches (such as imagery of death) or by objective approaches (such as the statistical probability of fatality).
Reviewing the state of risk research in 1992, a Royal Society committee said that cultural theory had raised issues "crucial for the future of the field". It urged proponents of the psychological and cultural approaches to work together.
One attempt to do just that had already been made by Roger Kasperson, professor of geography at Clark University, Massachusetts, who devised a theory of social amplification of risk. We find out about the world through signals, signs and images, which are amplified or diminished by different groups, says Kasperson. These groups emphasise different aspects of a risk according to their social structure and circumstances.
But the insurers may be more swayed by the research of other players in this diverse field. Engineers, for instance, use complicated equations to ensure that the myriad parts they design for an industrial plant will not fail with catastrophic consequences, such as the release of clouds of toxic gas.
Engineers deal with uncertainties as simple as the absence of data and as unquantifiable as the unpredictability of human operators. Perhaps the most difficult link in the chain comes once a figure has been calculated and a decision has to be made about what level of risk is acceptable.
Acceptability depends partly on what is already accepted, but it is hard to compare types of damage. Is it worse if an accident kills just one person than if it gives 500 people an increased chance of developing cancer in 20 years' time - or gives 1,000 people an allergy?
One area where many of the conflicting theories and practices of risk meet is in the practice of "risk communication". For many this means persuading the public that its beliefs are irrational - a one-way model of communication. "The literature has been dominated by 'how are we going to persuade people to accept risks they would not otherwise accept?'", says John Handmer, reader in hazard and sustainability at the flood hazard research centre at Middlesex University.
In a two-way model, information and opinions about possible risks would be exchanged by all the players. Handmer says both approaches are needed. A top-down approach is used to warn the public about the dangers of cancer from the thinning ozone layer, for example.
But opinion is split over which of several models to use. Among social scientists there is a consensus that public participation in risk communication is the only way forward. This leads to another buzzword: stakeholders. Local people are more likely to trust a decision to site hazardous nuclear waste near their community, for example, if they, as "stakeholders", have been involved in making that decision.
But the place where the objective and subjective can be brought together most spectacularly is the field of risk management: the handling of risk, which incorporates the eminently practical and the theoretical. Tom Horlick-Jones, of the risk research group at the centre for environmental strategy at Surrey, says: "By necessity it is a ragbag of disciplines." It involves regulation and insurance (the pooling of social finances to spread risk) as well as arguments about secrecy of information and whether institutions should have risk management designed into their structures. Yet, says Lofstedt, it is an area where "natural science and social science don't really speak to each other".
The Royal Society report said: "the research map of risk management is a bit like the population map of Australia, with almost everything clustered round the edges and hardly anything in the central conceptual areas".
But one attempt to populate these inner territories has been made by the risk consultant Jerry Ravetz. His idea has spawned another new buzzword: "post-normal science". Ravetz says post-normal science is appropriate when normal science fails. It applies during a crisis such as global warming, "where you have a policy issue where facts are uncertain, values in dispute, stakes are high, decisions are urgent and traditional research and consultancy is inadequate".
To deal with such problems Ravetz says it is necessary to extend the community looking at the science to include all stakeholders and to consider "extended facts" - those collected outside laboratory conditions. Everyone involved must recognise "the diversity of legitimate perspectives". This is an idea that will be further explored at the conference.
Other experts think the most exciting idea to emerge in the sociology of risk is the concept of the "risk society", proposed by Ulrich Beck, professor of sociology at the University of Munich, and Anthony Giddens, professor of sociology at Cambridge. They argue that society's central dynamic was once class and is now risk.
There will be plenty of disagreement at next week's conference. Lofstedt thinks there will be lots to engage the insurers. Their concerns about issues such as environmental risk, arising from problems such as global warming, leave them open to new ideas. A wave of disasters has also left many feeling vulnerable. There are tools available from the study of institutional risk management for assessing why some companies take risks while others avoid them.
Horlick-Jones says: "We wanted to promote a dialogue between people in society who carry out insurance work and people at the more theoretical end."
If theory and practice really get together next week then the conference will have been a success.