Decision-making under uncertainty

Choosing actions on the basis of imperfect observations with unknown outcomes is called decision-making under uncertainty, and it arises in many important problems. It unites researchers across disciplines in developing tools and methodologies to model and solve real-world decision-making problems under uncertainty.

For more than a decade now, information has been recognised as a form of aid (IFRC, 2005). Uncertainty has largely been related to the lack of predictability of some major events or stakes, or to a lack of data (Argote, 1982). To overcome this uncertainty, traditional decision support paradigms suggest collecting more information. Decision-makers have therefore focused on gathering and analysing ever more data about potentially disaster-affected areas (Comfort, 2007; Wybo and Lonka, 2003).
In parallel, progress in engineering continues to promise connectivity, broader bandwidth and unprecedented computational power to all (Gao et al., 2011; Meier, 2014). The use of social media, which first gained prominence in the 2010 Haiti earthquake, had become 'mainstream' by the response to Typhoon Haiyan in 2013 (Butler, 2013). Technology-driven data sources such as GPS, radio frequency-based identification tracking, remote sensing, satellite imagery or drones enable real-time monitoring (Comes and Van de Walle, 2016). Biometric identification technologies are increasingly used as tools for refugee management (Jacobsen, 2015), and relief provision shifts towards virtual distributions through digital payment systems or 'mobile money' (Sandvik et al., 2014). However, the more decision-making depends on (big) data, the more challenging the data become to manage and analyse:
• In a fragmented and 'post-factual' society, information coming from heterogeneous sources and actors is likely to be contradictory, and recent elections, from Brexit to the United States in 2016, highlight that (mis-)information becomes a commodity which is a source of influence and power.
• Volatility -the pace of change in data and public opinion is unprecedented, drastically reducing the time available for strategic policy decisions (Noveck, 2015).
• Because of the ever-more complex socio-technical interdependencies, the implications of decisions cannot be clearly assessed any more (Comes et al., 2011).
Technology has enabled new forms of data collection and participation, but it has also introduced a new layer of complexity in decision- and policymaking. Technologies are enablers but never the end solution. Besides a lack of information, uncertainty can also stem from a lack of understanding of the actual information (as opposed to rumours) and of the impact of a decision on complex systems; as a result, decision-makers are not even aware of what is uncertain (Taleb, 2007). From this perspective, some authors have strongly advocated a renewed perspective on decision-making strategies (Makridakis and Taleb, 2009).
The standard paradigm of decision-making under uncertainty suggests that uncertainties are due to inherent randomness in an event, such as the toss of a coin. Such uncertainties can best be captured by probabilities. To this end, scientists or citizens collect and evaluate data, which are translated into a model. For instance, the chance of a flood, storm or earthquake affecting a community is typically given by the frequency of the occurrence of such events over a certain period, for example a 100-year flood. Data to predict such a flood include rainfall or changes in temperature upstream. Standard decision support tools assume that a crisis evolves from a chaotic beginning into a steady state that follows patterns which can be identified; it is therefore sufficient to collect comparable data to retrieve the patterns. However, this implies that data are comparable and standardised and were collected following a series of specific methods. Applying expected utility theory (French et al., 2009), i.e. recommending the decision that leads to the highest expected value, also means that the recommendations lead to the best outcome over a series of (repeated, similar) events.
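The frequentist logic of the standard paradigm can be made concrete with a short calculation. All figures below (annual probability, planning horizon, costs) are hypothetical, and the independence of years is our simplifying assumption, not the chapter's claim:

```python
# Illustrative sketch (not from the chapter): the frequentist view of a
# "100-year flood" and an expected-utility comparison of two decisions.

p_annual = 1 / 100  # a "100-year flood": 1% chance in any given year

# Probability of at least one such flood over a 30-year planning horizon,
# assuming years are independent (a strong simplifying assumption).
p_30_years = 1 - (1 - p_annual) ** 30

# Expected utility theory: recommend the action with the highest expected
# value. Hypothetical outcomes (costs in arbitrary units) for two options:
# build a levee (fixed cost) vs. do nothing (large loss only if a flood occurs).
cost_levee = 5.0
cost_flood_damage = 120.0

expected_cost_levee = cost_levee
expected_cost_nothing = p_30_years * cost_flood_damage

best = "levee" if expected_cost_levee < expected_cost_nothing else "nothing"
print(f"P(flood in 30 yrs) = {p_30_years:.2f}")  # about 0.26
print(f"E[cost | levee] = {expected_cost_levee:.1f}, "
      f"E[cost | nothing] = {expected_cost_nothing:.1f}, prefer: {best}")
```

Note how the recommendation depends entirely on the assumed probabilities and costs: this is exactly why 'blind trust' in such a model is dangerous when those inputs are themselves deeply uncertain.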
Disaster risk management deals with highly uncertain situations. Such uncertainties can best be captured with probabilistic approaches.
Decision-making under uncertainty requires the understanding of the underlying uncertainties and assumptions within the probabilistic models or the data.
In addition, the variety of the data collected and analysed today ranges from sensor measurements to social media information or radio conversations (Comes, 2011). Each of these types of data is fraught with different types of uncertainty or error: while sensors can malfunction or fail, human judgement is typically ambiguous, subjective and highly contextualised (Palen et al., 2010). As such, new approaches that help policymakers consolidate the different types of uncertainty inherent to the heterogeneous data need to be developed.
In addition, the potential impact of a flood, for instance in terms of damage to infrastructure, is much harder to predict than the event itself. Behavioural issues need to be considered; for example where will people turn for help and how will they support each other? The use of smart phones in the refugee crisis, allowing refugees to navigate their way across European borders, for instance, has caught many organisations and governments by surprise (Comes and Van der Walle, 2015).
Despite these complexities, under the time pressure of (looming) disasters and crises, simple and straightforward recommendations are often sought for their ease of communication (Renn, 2008). Since disasters are low-probability events, however, such models can be misleading, particularly if there is 'blind trust' in a prediction or model (French and Niculae, 2005) - and no room to reflect upon the underlying uncertainties and assumptions within the model or the data.

Decision-making contexts and new sources of uncertainty
Three major contexts for decision-making in disaster risk reduction have emerged with the push for increasing digitalisation. Creating information no longer requires specific education or background: by relying on open software tools, anyone can create a map, dashboard or analysis, opening opportunities for participation and engagement.
• Participatory and community-based approaches emphasise novel possibilities of engagement and can empower local communities through joint planning and crowdsourcing (Edwards, 2009; Norris et al., 2008). An example is a citizen science approach to flood protection, where communities themselves were involved in research from scratch and were thus better informed in decision-making (Wehn et al., 2015). Uncertainty here is related to the fragmentation of voices, the subjectivity of data and the volatility of public opinions.
• Increasing automation and the dominance of technology-driven approaches refer to the integration of information into decision practices through pervasive information technology (IT). Using satellite imagery, drones and artificial intelligence for damage assessment after an earthquake or a forest fire is just one of many examples. While data-driven approaches sometimes suggest an increase in objectivity, they are often far from complete, and digital shadows persist. For instance, social media analyses that rely exclusively on Twitter neglect the fact that Twitter users are hardly a representative sample of the population. At the same time, commercial proprietary algorithms and software (such as those used by big search machines like Google and Facebook) are certainly not neutral, and uncertainty persists about how data are analysed.
• Virtual collaborations in networks of experts and volunteers include, for instance, 'crisis mappers' that help local communities map out assets such as hospitals or schools.
To deal with these emerging decision-making contexts, policymakers, responders and scientists are expected to abide by given professional standards and norms such as emergency plans, risk management and resilience frameworks and good academic practice. Maybe most prominent are the humanitarian principles, which include humanity, impartiality, neutrality and independence (OCHA, 2010). However, through readily available software, new grassroots initiatives and volunteers that do not subscribe to any standard or code of conduct can produce the same types of information products, maps or analyses - without quality assurance. For instance, the easy use of Ushahidi or Google Maps contributes to the coexistence of similar maps with conflicting information, which can aggravate uncertainty. Moreover, the algorithms that structure data collection and analysis underlying these products are often proprietary and not transparent. Having lost the exclusivity to create information, scientists should therefore ensure that their approach to data collection and modelling is transparent and matches the purpose of the specific situation and context. At the same time, uncertainty related to professional products that are designed to support decisions leaves room for interpretation and 'spinning' of any information in a favourable direction, introducing motivational biases (Montibeller and von Winterfeldt, 2015). One important aspect of such decisions is the power relations between actors and organisations.

Decision-making under uncertainty as a power relation
Uncertainty, information and power are intricately related concepts. As outlined in the previous chapter, decision-makers and scientists need to revise standards and practices that have emerged with increased information access. Likewise, decision-makers need to fully consider power dynamics in their approach to uncertainty and adapt their practices.
In practice, power can be defined as the extent to which an entity can guide or frame another entity's actions. Entities can be individuals, groups, organisations (companies, non-profit organisations, communities, governments, etc.) and groups of organisations (consortia, alliances, partnerships, networks, etc.). Power is thus key to understanding how collective action emerges and evolves (Prus, 1999).
Power feeds on 'an intent or capacity on the part of one person or one group to influence, control, dominate, persuade, manipulate or otherwise affect the behaviour, experience or situations of some target' (Prus, 1995, cited by Hall, 1997). Information and knowledge are essential to power: to influence, control, dominate, persuade and manipulate others, one needs to know more (Crozier and Friedberg, 1977). Thus, one can strive to maintain asymmetrical levels of information access and uncertainty to gain power over the others. Reciprocally, power shifts affect the level of uncertainty that concerns the various actors involved in disaster risk.
[Figure: Power, information access and decision-making with uncertainty. Source: courtesy of authors]
Power is a driver of information creation and sharing, which biases seemingly objective data, adding a layer of uncertainty to decision-making.
Various cases illustrate how disastrous the effect of power on uncertainty can be. In the aftermath of Cyclone Nargis in 2008, the Burmese junta feared losing its power because of the arrival of foreign aid and significantly withheld information by imposing a media ban. In its struggle to control information, the junta prevented relief actors from collecting information, and uncertainty about humanitarian needs increased at the expense of the population (Pan et al., 2012).
In the aftermath of the 2010 Haiti earthquake, criticism was directed at the overwhelming power of the international humanitarian apparatus. The government's infrastructure collapsed, and international non-governmental organisations (NGOs) quickly took over, centralising information and allocating resources without sharing information. The local government remained blinded by uncertainty and compelled to rely extensively on international aid. Such asymmetry led to a vicious circle: priorities shifted to the import of western governance standards, which impeded the country's response to the 2010 outbreak of cholera (Biquet, 2013).
While power is thus an important driver of uncertainty in decisions (Hart, 1993), it is often conflated with surrounding notions (Comfort, 2007). This is, at least in part, because the impact of power is hard to capture: power relations can shift quickly through interactions and in changing circumstances (Hall, 1997). In addition, power is invisible and 'silent' (Brown et al., 2010) and cannot be bound to a single event, fact or process.
To address this issue, decision-makers need to be aware of uncertainty and information asymmetry in disaster risk.

A holistic approach to power highlights bigger challenges related to decision-making and uncertainty
Even though information access can contribute to increasing one's power at the response stage, the side effects should be kept in mind. From an institutional perspective, increased competition for information to gain power can result in opportunistic or fuzzy behaviour with respect to information. This, in turn, can negatively affect relationships between local or other professional actors at the expense of the population that has potentially been affected by a disaster. For instance, during the 9/11 response, a large spectrum of actors (citizens and local non-profit organisations in search of institutional visibility) rushed onto the crisis response stage, providing non-exploitable data and creating confusion, which slowed coordination down (Dawes et al., 2004).
In addition, NGOs can tend to exploit information as an opportunity to gain legitimacy and visibility. Such a tendency is not new: in 1994, Eng and Parker observed how local Mississippi communities shifted their efforts from social interactions to developing legitimacy towards their partners. However, we believe that digitisation can potentially lead to an opportunistic use of information, and we therefore call on scholars and practitioners to treat the ethical and legal implications of technology-based decisions as a burning issue.

The ethical and legal implications of technology-based decisions
The power implications and uncertainties related to technology require a critical review of the ethical, legal and social issues (ELSI). For instance, how to engage with citizens through social media or how to share information between different agencies and information systems in line with data protection laws remains a current issue. Consequently, designing and developing technologies and practices which address such issues becomes essential.

Pandora's Box? Uncertainty related to unintended consequences of informationalisation
We have previously highlighted that behavioural issues, particularly when reinforced by social media platforms, increase complexity and uncertainty in decision-making. Rather than relying on compliance of the population ('keep calm and carry on'), citizen and volunteer groups today emerge and organise, leading to 'unintended consequences'.
Specifically, the case of the 2011 Vancouver riots (Rizza et al., 2014) highlights the risks associated with citizen engagement in crises through social media. The Vancouver Police Department asked Vancouverites to send in their material and to help identify rioters. Feeling empowered by local authorities, citizens started a real manhunt, and some families had to leave the city. This case pointed out: 1) the 'institutional unpreparedness' in dealing with a huge quantity of data, their quality and the new processes of inquiry they require; 2) the 'unintended do-it-yourself justice', i.e. the shift from supporting crisis managers to vigilantism when citizens overruled authorities and enforced justice on their own terms; and 3) the 'unintended do-it-yourself society' supported by the potential of social media for prompting people to act. What happened in Vancouver challenged human rights and values such as fairness, justice, integrity, responsibility and accountability.
For the 2010 Eyjafjallajökull volcano eruptions, Watson and Finn (2014) discussed some of the privacy and ethical implications surrounding the use of social media. Social media allowed persons stranded in Europe to communicate, organise their travel, etc. as well as allowing the aviation industry to get information from its customers. At the same time, social media use led to privacy infringements and inequality. Indeed, over-focusing on social media could lead disaster risk managers to focus on those who produce a lot of data and, consequently, to down-prioritise those unequipped (for example foreign passengers) or unable to use ICTs (for example the elderly). Lastly, 'self-help' between citizens under the umbrella of resilience (i.e. a spontaneous peer-to-peer communication) should not become a way for corporate or public entities to neglect care responsibilities for those who have been impacted by a disaster.

Ethical and legal considerations have become essential in designing and developing technologies and practices which collect, analyse and communicate (uncertain) information and data.
Consequently, designers and practitioners in disaster risk need to consider the uncertainty related to unintended consequences of IT. This implies noticing, anticipating and understanding such consequences.
Rizza, Büscher and Watson (2017, forthcoming) underline that (personal) data and information (sharing) constitute the core of ELSI concerns in the big data era, which makes mass surveillance possible. The collection and processing of data coming from different applications blurs the boundary between decision support and control or surveillance. For instance, the knowledge base created through such a monitoring system could reveal individuals' habits, routines or decisions and, consequently, infringe on citizens' privacy. Big data has even been said to contribute to trapping particularly vulnerable populations in poverty by obstructing their access to loans or good education (Waddell, 2016): the statistical likelihood that someone from a specific neighbourhood may not pay back a loan blocks individual opportunities. The collection and processing of personal data is also problematic because in crises it can erode basic rights such as the freedoms of speech, association and movement.
To balance the need to reduce uncertainty and collect data with ethical responsibility in scientific and technological developments, an ethic of co-responsibility should emerge (Schomberg, 2013). Research around ELSI aspects of IT also reveals opportunities: integrating IT into disaster risk management with an explicit commitment to ELSI considerations will provide useful insights for a proactive approach to innovation (op. cit.).
Initiatives like 'privacy by design' or 'ethics by design' (European Commission, 2010) attempt to deal with current critiques of the lack of concern for ELSI in the development of new technologies (Rizza et al., 2011). Privacy impact assessments can ensure that technology for disaster risk reduction is developed to protect the interests of end users and stakeholders within the organisational and legal frameworks.

Decision-making under uncertainty: better than muddling through?
The context of decision- and policymaking has become complex. The very nature of the different uncertainties we have discussed makes it largely impossible to use probabilities: the socio-technical uncertainties in disaster risk reduction are deep (Comes et al., 2013; Comes et al., 2011; Pruyt and Kwakkel, 2014). As early as the 1950s, Lindblom (1959) described decision-makers confronted with such uncertainty as 'muddling through'. Participatory approaches to model design and scenario analysis have been advocated as a way ahead when the communities affected are clearly known (Comes et al., 2015b; Wright and Goodwin, 2009). Examples range from scenarios for water and flood management (Haasnoot et al., 2011) to urban planning and resource management (Vervoort et al., 2010), approaches that rely on connecting communities and policymakers in the preparedness phase. Scenarios are built in deliberative processes that capture the expert knowledge, preferences and values of stakeholders (Kok et al., 2006; Vervoort et al., 2010). While those scenarios serve to establish plans and evaluate alternatives based on a common understanding, they are time consuming to update and adapt to new circumstances or information. As such, they are most useful in the preparedness phase, not least to help build networks and partnerships of trust (Comes, 2016b).
The opposing trend relies on artificial intelligence and data mining approaches that enable real-time analysis of data streams. Automated algorithms and tools can be used to extract and illustrate large-scale patterns and trends in human behaviour, damage assessments and communication flows (Meier, 2014; Monaghan and Lycett, 2013; Whipkey and Verity, 2015). As such, they promise fast answers, which is particularly relevant in the heat of a response. It is, however, necessary to ask how such analyses influence human sensemaking or possibly introduce biases (Wright and Goodwin, 2009). Particularly if analyses are run remotely and disconnected from the community, there is a series of typical errors that may mislead the analyses or the interpretation of results (Comes, 2016a). In addition, the reliance on software, data and algorithms has been increasingly criticised for the lack of transparency and control that communities have over their own data (McDonald, 2016; Sandvik, 2013).
In between lies a large spectrum of semi-automated data collection efforts and semi-automated analyses and assessments that are run by scientists, by policymakers from the municipal to the international level and by an increasing number of local and digital volunteers. With the global availability of technology, software and data, the creation of information products has been democratised. While in the past the design of a map or a dashboard required dedicated technical skills, today anyone can produce graphs, figures and maps.

Decision-making should reflect the specific context, constraints, needs and stakeholders associated with a decision, including the specific phase of the disaster risk management cycle.
Decisions differ in terms of information required, time scales, geographical scope and actors. The question, for instance, of where to set up a hospital has very different characteristics from general resource-allocation decisions. Both decisions are important but have very different requirements in terms of information granularity, timeliness and updates. Addressing specific decision-makers' needs or problems in their socio-technical context is, however, still not commonplace. We propose a decision-centric paradigm for information collection, processing and visualisation that focuses on specific information needs.
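As an illustration of what such a decision-centric paradigm might capture, the sketch below encodes the two example decisions as profiles of information needs. The chapter proposes a concept, not a data model; every field name and example value here is our assumption:

```python
from dataclasses import dataclass

# Illustrative only: hypothetical profiles contrasting the information needs
# of the chapter's two example decisions (hospital siting vs. resource
# allocation). Field names and values are assumptions, not from the chapter.

@dataclass
class DecisionProfile:
    decision: str
    spatial_granularity: str   # e.g. "building-level" vs. "regional"
    time_scale: str            # how quickly the decision must be taken
    update_frequency: str      # how often inputs must be refreshed
    key_actors: tuple

hospital_siting = DecisionProfile(
    decision="where to set up a field hospital",
    spatial_granularity="building-level",
    time_scale="hours",
    update_frequency="near real-time (road access, damage reports)",
    key_actors=("local responders", "medical NGOs"),
)

resource_allocation = DecisionProfile(
    decision="general resource allocation across regions",
    spatial_granularity="regional",
    time_scale="days to weeks",
    update_frequency="daily situation reports",
    key_actors=("government", "international agencies", "donors"),
)

# The same disaster thus yields different information needs per decision:
for p in (hospital_siting, resource_allocation):
    print(f"{p.decision}: {p.spatial_granularity}, {p.time_scale}")
```

The point of making such profiles explicit is that data collection and visualisation can then be matched to a specific decision, rather than producing generic information products.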

Partnership
Together, scientists, policymakers and communities need to agree on standards that reflect good processes and representations of uncertainties. Citizen science can be a way ahead to providing the necessary training and education. In particular, we propose that cultural, social and professional specificities must be thoroughly taken into account in the setting of standards. Since information is always also a source of power, it is imperative to follow the principle of reciprocity - empowering the people who provide information to use it for their own good and strictly following the principles of responsible data and technology.

Knowledge
No single paradigm predominates in how decision- and policymakers use information; data and uncertainties drive power relations and introduce ethical and legal dilemmas. So far, standard analyses use, at best, probabilistic approaches to represent uncertainties, neglecting the socio-technical dimension of decision-making and the problems of data gaps and consent. The reflections on uncertainties presented in this chapter draw from both practical experience and theory. They are, however, not readily translated into concrete policy measures or decisions, because there is first a need for innovation in science and policy.

Innovation
Researchers need to frame the problem they are studying, including the context and the purpose of a model, simulation or analysis. Assumptions and limitations need to be reflected in the design of decision support systems. When situations are complex and uncertain there is a tendency to simplify the problem and to exert control through limited consultations and conflict avoidance. However, models and recommendations must not oversimplify complex problems, which is a challenge given the call for 'easily understandable' solutions.
In addition, we call for the development of methods and approaches that consider the different types of uncertainty from operational decision-making to strategic policymaking. So far, there is no clear understanding of the processes, models and tools that enable institutions to use operational and real-time information to collaborate with citizens to manage disaster risk.
Besides the uncertainty inherent in the new data environment, uncertainty is also rooted in the role of power in decision-making and in the failure to address the ethical and legal stakes of information use. We therefore advocate further research on the socio-technical dimension of uncertainty in decision-making, putting the technical, social, organisational, ethical and legal dimensions of information into perspective.
Problems in disaster risk reduction are complex. As such, any model will necessarily reflect this complexity by various layers and levels of uncertainty that will need to be considered in the decision-making process. This means that deliberation processes and communication with stakeholders need to be carefully designed to reflect such uncertainties, even if there is a temptation to go with quick fixes or easy solutions. Error bars or margins of error should not be just a footnote, but rather should be openly discussed. In particular, critical tipping points need to be flagged, such as flood levels that cause a breach in a levee or top wind speeds that damage major infrastructures.
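The call to flag tipping points rather than bury margins of error can be sketched numerically. The levee height, forecast mean and standard deviation below are hypothetical, and the normally distributed forecast error is our assumption:

```python
import math

def exceedance_probability(mean, std, threshold):
    """P(level > threshold), assuming a normally distributed forecast."""
    z = (threshold - mean) / std
    # Survival function of the standard normal via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2))

# Hypothetical forecast: expected peak water level 4.2 m (std 0.5 m);
# the levee is assumed to breach at 5.0 m - the tipping point.
p_breach = exceedance_probability(mean=4.2, std=0.5, threshold=5.0)

# Reporting only the mean (4.2 m, below the levee) hides the risk;
# reporting the probability of crossing the tipping point makes it explicit.
print(f"P(levee breach) = {p_breach:.1%}")  # roughly a 5% chance
```

A point forecast of 4.2 m sounds safe; the same forecast with its error margin reveals a non-trivial chance of breach, which is exactly the information stakeholders need to deliberate about.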
New participatory processes such as risk mapping are increasingly important. In the preparedness phase, they make it possible to establish networks and partnerships that people can rely on during the response. If such processes are also to work effectively in disaster response, decisions, processes and organisational structures need to be adapted to enable the uptake of information provided by communities. Such approaches can only work successfully if connections are established prior to disasters.
Participatory processes and new governance structures should empower local communities in guiding disaster risk management and reducing uncertainty. However, this implies collective awareness of how power shapes decision-making. Power is a system-wide dynamic that can impact uncertainty for all.