
Introduction: Severe
Uncertainty in Science,
Medicine, and Technology

Mattia Andreoletti
Vita-Salute San Raffaele University,
Milan, Italy

Daniele Chiffi
DAStU, Politecnico di Milano,
Milan, Italy

Behnam Taebi
Delft University of Technology,
Delft, Netherlands

1. Introduction
This Special Issue, titled “Severe Uncertainty in Science, Medicine, and
Technology,” aims to shed new light on the understanding of severe uncertainty and
its multifaceted implications. The main idea of the papers in this collection is
that, despite the availability of sophisticated statistical judgments about future risks
in science, medicine, and technology, severe forms of uncertainty still exist.
While ignorance is usually assumed to be a total absence of knowledge,
uncertainty often refers to the incompleteness of knowledge or information.
In its extreme form, this is called “severe uncertainty” but is also known as
“fundamental,” “radical,” “deep,” “great,” or “genuine” uncertainty. A com-
mon characteristic of these notions is that it may be difficult to meaningfully

Mattia Andreoletti’s research is supported by the PRIN project “From Models to Decisions”
(Bando 2017 Prot. 201743F9YE). Daniele Chiffi’s research is supported by the “Fragilità
Territoriali” project of the Department of Architecture and Urban Studies of Politecnico di
Milano as part of the “Departments of Excellence 2018–2022” MIUR program. Behnam
Taebi’s work for this article is part of the research program Ethics of Socially Disruptive
Technologies, which is funded through the Gravitation program of the Dutch Ministry of
Education, Culture, and Science and the Netherlands Organization for Scientific Research
(NWO grant number 024.004.031).

Perspectives on Science 2022, vol. 30, NO. 2
© 2022 by The Massachusetts Institute of Technology

https://doi.org/10.1162/posc_e_00411


conceptualize uncertainties in probabilistic terms (Knight 1921; Ellsberg
1961; Shackle 1961; Keynes 1973; Langlois 1994; Chiffi and Pietarinen
2017; Kay and King 2020). When uncertainties are mainly shaped by norma-
tive facets, these are referred to as normative uncertainties (Taebi et al. 2020).
With severe uncertainty in this special issue, we refer to situations in
which the following issues are unknown, unclear, or undefined:

(i) the adequate models to describe the relations between a system’s
variables;

(ii) the probability distribution to represent uncertainty about relevant
parameters and variables;

(iii) the correct theory of rational choice and the correct theory of
epistemology to handle uncertainty; and/or

(iv) the ethical dimensions that situations of uncertainty give rise to.

This means that severe uncertainty encompasses factual, methodological,
and normative aspects of decision-making.1 A variety of qualitative and
quantitative methods are available to identify and deal with severe uncer-
tainties. Some of these methods are philosophical in nature. As such, phi-
losophy has much to add to our understanding of future risks in science
and technology and, more specifically, the role of uncertainties.

Classically, those forms of uncertainty that can be probabilistically
quantified—as they are in many medical and engineering fields—are
labeled as “risks” (Royal Society 1983). Admittedly, the distinction
between risk and uncertainty is not always so sharp, and the two terms
are often used interchangeably by experts and laypeople. While probabi-
listic risks are fairly well investigated in theories of risk, discussion on
methodological tools and strategies regarding identifying and dealing with
severe uncertainty has received less attention. Even though emerging
research has contributed to reshaping the field, many scientific and tech-
nological decisions about future events occur under conditions of severe
uncertainty rather than probabilistic risk. Thus far, a family of mathematical
and argumentative methodologies has been proposed to provide
rational (though not strictly probabilistic) approaches to decisions under
fundamental uncertainty; the most relevant of these are potential surprise
theory (Shackle 1961), scenario planning (van der Heijden 1996), possibil-
ity theory (Zadeh 1978), the Dempster-Shafer theory of belief functions
(Shafer 1976), and hypothetical retrospection (Hansson 2007). A compre-
hensive introduction to different methodologies that cope with uncertainty
is provided by Hansson (2018).

1. Our definition of severe uncertainty builds on and expands the definitions
presented in Marchau et al. (2019) and Lempert, Popper, and Bankes (2003).


Let us expand on several aspects of severe uncertainty by reviewing an
extract taken from a speech given by former US Secretary of Defense
Donald Rumsfeld during the invasion of Iraq.

Reports that say that something hasn’t happened are always
interesting to me, because as we know, there are known knowns;
there are things we know we know. We also know there are known
unknowns; that is to say we know there are some things we do not
know. But there are also unknown unknowns, the ones we don’t know
we don’t know. (U.S. Department of Defense 2002)

Rumsfeld’s statement organized knowledge, ignorance, and uncertainty
into categories. Known unknowns exemplify those contexts in which uncer-
tainty can be probabilistically measured. In this Special Issue, we will focus
on the class that Rumsfeld calls unknown unknowns. This is the sense of
uncertainty mentioned in the following statement made by Keynes:

By ‘uncertain’ knowledge I do not mean merely to distinguish what
is known for certain from what is only probable. The game of
roulette is not subject, in this sense, to uncertainty […]. The sense in
which I am using the term is that in which the prospect of a
European war is uncertain, or the price of copper and the rate of
interest twenty years hence […] About these matters there is no
scientific basis on which to form any calculable probability whatever.
We simply do not know. (Keynes 1973, pp. 113–114)

Inspired by the Keynesian idea of uncertainty, Shackle (1961) developed a
non-probabilistic approach to decision-making under uncertainty. Unfortunately,
his theory, termed Potential Surprise Theory, has largely been
forgotten, although it has a strong connection to epistemic possibility. Arrow
(1951) described Shackle’s proposal as the only fully formalized
non-probabilistic approach to decision-making at that time. Potential Surprise
Theory aimed to deal with the uncertainty of future scenarios in
which additivity fails.

Related to potential surprise is the family of approaches that exploit
the idea of scenario building, also known as scenario planning. Since
scenarios typically do not forecast, these theories rely on imaginative
narratives that indicate how various scenarios might influence our present
decisions. Once those narratives have been composed, one uses them to aid
in conjecturing about future scenarios (van der Heijden 1996; Martelli
2014). When structures do not repeat and we have little or no experience
of potentially hazardous situations (Hansson 1996), narratives become an
especially powerful tool for our imaginative capacities. Moreover, when
facing low probabilities, the urgency of paying attention to less
likely possibilities with a gravid impact increases. Yet, scenarios can and
should be up and running well before dramatic decisions need to be made.
In hindsight, when weighing the possible consequences of a decision, the
narratives that have been constructed come in handy.

It is worth noting that plausible and narrative scenarios are not unlike
scientific hypotheses: Hypotheses, even when they are, strictly speaking,
false, may remain fruitful in many senses of the term. Importantly, narra-
tive scenarios are not devoid of evidential value, which is useful as the
weighing of evidence becomes all the more important when probabilities
are small, and the events rarely repeat themselves. Yet, imaginative scenar-
ios are not always fictions—in fact, they are quite the contrary. They can
point out what does not work, what decisions are good in the sense of
avoiding future harm, and what alternative actions may help lead towards
achieving the desired goals. This indicates a close connection
between uncertainty and value-based decisions.

As commonly acknowledged in decision-making under risk (Hempel
1965), decision-making under severe uncertainty is a value-laden activity.
Here, the non-epistemic facets (e.g., ethical, social, economic, political) of
values especially come to the fore. Recently, philosophers have begun
providing rational analyses of forms of uncertainty in which normativity
is a key component. Such forms are termed “moral uncertainty” or “normative
uncertainty” (Lockhart 2000; Sepielli 2013, 2014; Bykvist 2017;
MacAskill et al. 2020). The latter is a much broader concept than the
former. Moral uncertainty is “uncertainty about what we all-things-considered
morally ought to do” (MacAskill et al. 2020, p. 2), while “normative
uncertainty also applies to uncertainty about which theory of rational
choice is correct and uncertainty about which theory of epistemology is
correct” (MacAskill et al. 2020, pp. 2–3). Hence, normative uncertainty
concerns the conditions of inexactness, unpredictability, and ignorance
with respect to not only the state of factual knowledge but also the
normative dimensions of such conditions (Taebi et al. 2020). Thus, the
concept of normative uncertainty involves value-based considerations in those
aspects of decision-making related to epistemology, ethics, law, and
planning.

Several argumentative strategies have been developed to cope with eth-
ical judgments under severe uncertainty, such as hypothetical retrospection
(Hansson 2007). According to hypothetical retrospection, the ethical
values of decisions are evaluated assuming that a possible branch of future
development has materialized. This evaluation is based on the values and
information available when the original action took place, from the point
of view of the imagined future point of retrospection. The decision rule for
retrospective judgment then requires choosing an alternative that emerges
as morally acceptable from all hypothetical retrospections in this branching
temporal structure. At any rate, moral argumentation under severe uncer-
tainty remains an emerging but extremely interesting field with opportu-
nity for further developments.

Severe uncertainty affects science at many levels. Rescher (1999), for
instance, emphasized many aspects associated with the uncertainty of
future science. Answers to open questions are unpredictable and often
take us by surprise. The very questions that future scientists will ask
are unpredictable in the first place. Consequently, probabilistic risk
assessment methods may soon lose their applicability. For instance, In
the evaluation of trends, the impact of an unexpected event is often left
out of account (Martelli 2014). While the significance of trends cannot
be ruled out in theories of scientific change, the future is shaped by
them, even if indirectly. Indeed, scientific progress is much more tied
to uncertainty (unknown unknowns) than to (probabilistic) risk (known
unknowns).

In the medical context, severe uncertainty has many clinical aspects. For
instance, prognostic judgments face severe uncertainty, as probability mea-
sures are not often assigned to specific future events (Chiffi and Zanotti
2017; Chiffi 2021). This is because relevant prognostic information is
not available when formulating a prognosis, and some future scenarios
associated with a patient’s disease course might be previously inconceiv-
able from a cognitive and methodological perspective. Despite this
remarkable fact, prognostics remains classically based on the concept of
probabilistic risk, without the recognition that it is frequently connected
with events showing severe uncertainty. More generally, clinical uncer-
tainty (also called “clinical equipoise”) has been considered in research
ethics for the justification of a new trial (Fried 1974), and the structure
of the future is considered to play a key role in the assessment of the findings
of the trial itself (Djulbegovic 2007).

Since it is difficult to deal with severe uncertainty using exact methods,
it is often ignored (Alles 2009), and this is particularly true in technology
development. This type of bias is common in contemporary science and
society, where few disruptive and hazardous technological situations can
be identified and fully evaluated. Nevertheless, ignoring the possible exis-
tence of unknown unknowns can be, at best, irresponsible, and at worst,
existentially disastrous. Many technological disasters result from
extremely unlikely scenarios with extremely severe consequences,
scenarios that were ruled out in the design of the technology. The necessity of
designing technologies sensitive to (i.e., those values of safety and
responsibility linked to) severe forms of uncertainty remains particularly
challenging.


A well-known method of dealing with technological uncertainty is the
precautionary principle; however, this principle does not seem to be the
most suitable methodological tool for dealing with many contexts of
technological innovation. This is because the principle does not seem to be
sensitive to the different types of uncertainty and hazards associated with
technological innovations. As noted by Sandin (1999), in every formulation of
the precautionary principle, the uncertainty in question is scientific
uncertainty; however, in complex contexts, uncertainty is often trans-scientific.
This means that science cannot always answer the questions at stake.
Suggestions to take into account the asymmetry of uncertainty (for the
parties involved in the decision-making), as well as spatial and temporal
limitations and the interference with complex systems in equilibrium, have
been proposed to manage severe uncertainty (Hansson 1996). However,
much remains to be done to deal with severe uncertainty and disruptive
innovation in science, medicine, and technology and their epistemolog-
ical, ethical, and socio-political implications. We hope that the present
Special Issue provides a small step towards a better interdisciplinary
understanding of severe forms of uncertainty.

2. The Content of the Special Issue
The contributions in this Special Issue of Perspectives on Science investigate a
range of topics that pertain to the unfolding of severe uncertainty in sci-
ence, medicine, and technology. We provide an overview of the papers con-
tained in this issue and the contribution each makes to the philosophy of
severe uncertainty.

Sven Ove Hansson investigates whether and how uncertainty can be
either quantified or formalized. By going beyond some of the traditional
definitions of uncertainty, he offers a comprehensive taxonomy by identi-
fying eight major types of uncertainty. Many of these types are amenable to
both quantification and non-quantitative formalization; Tuttavia, it seems
that there is still no one-size-fits-all model that can describe all their prop-
erties. Therefore, there is room to conduct more research in the formal
modeling of uncertainty.

That uncertainty is not a homogeneous concept or problem is also the
entry point of Stefano Moroni and Daniele Chiffi’s paper on decision-
making under uncertainty. They deal with a precise form of decision-
making: public decision-making (in contrast with everyday life and private
decision-making) in the context of urban planning, where special issues
arise particularly with regard to the adoption of new technologies. In this
context, to improve the decision-making process, the authors suggest that
uncertainty can be reduced by adopting rules that are simple and stable.
This is because, on this account, rules have not only the social function of
regulating public–private relations but also epistemic value.

The uncertainty surrounding the adoption of new technologies in soci-
ety is also the topic of Philip J. Nickel, Olya Kudina, and Ibo van de Poel’s
work. In particular, they focus on a precise form of uncertainty—moral
uncertainty—as an analytical tool to explain the impact of new technologies
on our moral values. The standard theories of “technomoral change”
seem unable to fully grasp the “disruptive” character of technological
innovation on individual and social moral norms. The authors argue that this
explanatory gap can begin to be bridged by looking at the epistemic and
deliberative dimensions of technomoral change. In this regard, a more
complete account of “disruption” is developed with the adoption of a
moral perspective.

The moral dimension of severe uncertainty is also at the heart of Viola
Schiaffonati’s paper. She focuses on the topical case of autonomous robotics,
in which severe uncertainty has a significant impact on the prediction of
robots’ behavior in complex environments. To address this issue, we need
both an epistemological and an ethical framework that will allow us to run
better explorative experiments. Traditional accounts of experimentation in
the natural sciences should be reconsidered in the case of exploratory exper-
iments with new technologies, such as autonomous and intelligent
systems.

The idea that uncertainty has an important ethical dimension is also
shared by Malvina Ongaro and Mattia Andreoletti. In their paper, they
highlight the importance of the analysis of non-empirical uncertainty in
establishing the boundaries of scientific advising in policy decision-
making. Recognizing the normative character of severe uncertainty is
useful for questioning the idea that evidence-based policy decision-making
can be truly impartial with regard to the policies on which it advises.

Finally, in their paper, Giovanni Valente and Massimo Tavoni offer a
comprehensive review of the role of uncertainty in climate change model-
ing. Focusing on “Integrated Assessment Models,” they argue that compu-
tational models should consider the severe uncertainty that characterizes
technological and human systems.

References
Alles, M. 2009. “Governance in the Age of Unknown Unknowns.” International
Journal of Disclosure and Governance 6: 85–88. https://doi.org/10.1057/jdg.2009.2

Arrow, K. 1951. “Alternative Approaches to the Theory of Choice in
Risk-Taking Situations.” Econometrica 19: 404–437. https://doi.org/10.2307/1907465

Bykvist, K. 2017. “Moral Uncertainty.” Philosophy Compass 12(3), e12408:
1–8. https://doi.org/10.1111/phc3.12408

Chiffi, D., and Zanotti, R. 2017. “Fear of Knowledge: Clinical Hypotheses
in Diagnostic and Prognostic Reasoning.” Journal of Evaluation in Clinical
Practice 23(5): 928–934. https://doi.org/10.1111/jep.12664,
PubMed: 27882636

Chiffi, D., and Pietarinen, A.-V. 2017. “Fundamental Uncertainty and
Values.” Philosophia 45(3): 1027–1037. https://doi.org/10.1007/s11406-017-9865-5

Chiffi, D. 2021. Clinical Reasoning: Knowledge, Uncertainty, and Values in Health
Care. Cham: Springer. https://doi.org/10.1007/978-3-030-59094-9
Djulbegovic, B. 2007. “Articulating and Responding to Uncertainties in
Clinical Research.” Journal of Medicine and Philosophy 32(2): 79–98.
https://doi.org/10.1080/03605310701255719, PubMed: 17454416
Ellsberg, D. 1961. “Risk, Ambiguity, and the Savage Axioms.” The Quarterly
Journal of Economics 75(4): 643–669. https://doi.org/10.2307/1884324
Fried, C. 1974. Medical Experimentation: Personal Integrity and Social Policy.
New York: American Elsevier Publishing Co., Inc.

Hansson, S. O. 1996. “Decision Making Under Great Uncertainty.”
Philosophy of the Social Sciences 26(3): 369–386. https://doi.org/10.1177/004839319602600304

Hansson, S. O. 2007. “Hypothetical Retrospection.” Ethical Theory and
Moral Practice 10(2): 145–157. https://doi.org/10.1007/s10677-006-9045-3

Hansson, S. O. 2018. “Representing Uncertainty.” Pp. 387–400 in Introduction
to Formal Philosophy. Edited by S. O. Hansson and V. Hendricks.
Cham: Springer. https://doi.org/10.1007/978-3-319-77434-3_19

van der Heijden, K. 1996. Scenarios: The Art of Strategic Conversation.
Chichester: John Wiley & Sons.

Hempel, C. G. 1965. “Science and Human Values.” Pp. 81–96 in Aspects of
Scientific Explanation and Other Essays in the Philosophy of Science. New
York: The Free Press.

Kay, J., and M. King. 2020. Radical Uncertainty: Decision-Making beyond the
Numbers. New York: W. W. Norton & Company.

Keynes, J. M. 1973. The General Theory and After: Defence and Development.
London: Macmillan (the collected writings of John Maynard Keynes,
vol. XIV).

Knight, F. H. 1921. Risk, Uncertainty, and Profit. Boston: Hart, Schaffner &
Marx; Houghton Mifflin Company.

Langlois, R. N. 1994. “Risk and Uncertainty.” Pp. 118–122 in The Elgar
Companion to Austrian Economics. Edited by P. J. Boettke. Cheltenham:
Edward Elgar. https://doi.org/10.4337/9780857934680.00026

Lempert, R. J., S. W. Popper, and S. C. Bankes. 2003. Shaping the Next One
Hundred Years: New Methods for Quantitative, Long-Term Policy Analysis,
MR-1626-RPC. Santa Monica: RAND. https://doi.org/10.7249/MR1626

Lockhart, T. 2000. Moral Uncertainty and its Consequences. Oxford: Oxford
University Press.

MacAskill, W., Bykvist, K., and Ord, T. 2020. Moral Uncertainty. Oxford:
Oxford University Press. https://doi.org/10.1093/oso/9780198722274.001.0001

Marchau, V. A., Walker, W. E., Bloemen, P. J., and Popper, S. W. 2019.
Decision Making Under Deep Uncertainty: From Theory to Practice. Cham:
Springer. https://doi.org/10.1007/978-3-030-05252-2

Martelli, A. 2014. Models of Scenario Building and Planning: Facing Uncertainty
and Complexity. New York: Palgrave Macmillan. https://doi.org/10.1057/9781137293503

Rescher, N. 1999. The Limits of Science. Pittsburgh: University of Pittsburgh
Press. https://doi.org/10.2307/j.ctt9qh79p

Royal Society. 1983. Risk Assessment: Report of a Royal Society Study Group.
London: Royal Society.

Sandin, P. 1999. “Dimensions of the Precautionary Principle.” Human and
Ecological Risk Assessment: An International Journal 5(5): 889–907. https://
doi.org/10.1080/10807039991289185

Sepielli, A. 2013. “Moral Uncertainty and the Principle of Equity among
Moral Theories.” Philosophy and Phenomenological Research 86(3): 580–589.
https://doi.org/10.1111/j.1933-1592.2011.00554.x

Sepielli, A. 2014. “What To Do When You Don’t Know What To Do
When You Don’t Know What to Do ….” Noûs 48(3): 521–544.
https://doi.org/10.1111/nous.12010

Shackle, G. L. S. 1961. Decision, Order, and Time in Human Affairs. Cambridge:
Cambridge University Press.

Shafer, G. 1976. A Mathematical Theory of Evidence. Princeton: Princeton
University Press. https://doi.org/10.1515/9780691214696

Taebi, B., Kwakkel, J. H., and Kermisch, C. 2020. “Governing Climate
Risks in the Face of Normative Uncertainties.” Wiley Interdisciplinary
Reviews: Climate Change 11(5): e666. https://doi.org/10.1002/wcc.666
U.S. Department of Defense. 2002. Defense.gov News Transcript: DoD News
Briefing Secretary Rumsfeld and Gen. Myers, United States Department of
Defense (defense.gov). https://archive.defense.gov/Transcripts/Transcript.aspx?TranscriptID=2636

Zadeh, L. A. 1978. “Fuzzy Sets as a Basis for a Theory of Possibility.”
Fuzzy Sets and Systems 1: 3–28. https://doi.org/10.1016/0165-0114(78)90029-5
