Greece is Europe’s most seismically active nation, deformed by an active subduction system and one of the world’s fastest-spreading rifts. Onshore active faults pose a seismic hazard that cannot be reliably assessed without a comprehensive map of potential earthquake sources. Here, we use high-resolution Digital Elevation Models (DEMs), in conjunction with hillshades and slope models, to map and characterise faults in Greece at a scale of 1:25,000. The Active Faults Greece (AFG) database records a total of 3815 fault traces assigned to 892 interpreted faults. Of the AFG traces, 53% were mapped here for the first time, with their geometries and slip sense constrained by displacement of landscape features. AFG includes >2000 active and 1632 probably active fault traces, while 30 traces result from historic surface-rupturing earthquakes since 464 BC. About 57% of faults exhibit strong depositional control (DC) on sedimentation patterns, with active faults characterised by approximately equal numbers of sharp (32%), moderate (29%) and rounded (29%) scarps. AFG is the first fault database in Greece generated using nationwide interpretation of geomorphology and has applications in paleoseismology, seismic-hazard assessment, mineral-resources exploration, and resilience planning.

Data Access:
- Download archive version via GFZ Data Services (upper left)
- Web-Map Server: https://experience.arcgis.com/experience/a6c85b1edf9d4d17a3f01a70cef6d2b2
- GIS Users: https://services2.arcgis.com/T7iULq65Kp9Elquk/arcgis/rest/services/Active_Faults_Greece/FeatureServer
- Layerfiles for use in ArcGIS Pro and QGIS: https://noaig.maps.arcgis.com/sharing/rest/content/items/4b93c25b931744dabc4851abf9c8ae38/data
This data collection contains inundation maps for Lima and Callao (Peru) based on tsunami simulations with two numerical wave propagation and run-up models (Tsunami-HySEA and TsunAWI) for a range of Manning values between 0.015 and 0.06, applied as constant values over the whole model domain. The simulations were carried out in the framework of the RIESGOS project (https://www.riesgos.de/en/). The source is based on the historic event of October 1746; the parameters are derived from Jimenez et al. (2013). The moment magnitude is prescribed as Mw 9.0, and the source area is split into five sub-faults with an inhomogeneous slip distribution and static deformation at time zero (i.e. no kinematic source model). The flow depth distribution in Lima/Callao after four hours of simulation time obtained by the two models is interpolated to raster files and provided in GeoTIFF format.
The dataset presented herein originates from the JAGUARS (Japanese German Underground Acoustic Emission Research in South Africa) project, which took place from 2007 to 2009 in Mponeng Gold Mine, South Africa. Project partners included Ritsumeikan University, the Earthquake Research Institute of the University of Tokyo and Tohoku University in Japan, the German Research Centre for Geosciences Potsdam and Gesellschaft für Materialprüfung und Geophysik GMuG mbH in Germany, as well as the Council for Scientific and Industrial Research in Johannesburg, Seismogen CC in Carletonville, AngloGold Ashanti Ltd and the Institute of Mine Seismology in the Republic of South Africa. This publication forms part of the Geo-INQUIRE initiative (HORIZON-INFRA-2021-SERV-01 call, project number 101058518). It is cross-referenced on the EPISODES Platform (https://episodesplatform.eu/?lang=en#episode:JAGUARS (not yet existing)), which is managed by the EPOS TCS AH (European Plate Observing System Thematic Core Service Anthropogenic Hazards). Within the EPISODES Platform, the datasets are consolidated into an “episode” titled “JAGUARS: Mining induced picoseismicity associated with gold mining”. The EPISODES Platform offers open access to the integrated research infrastructures of the EPOS TCS AH, enabling users to download data and utilize a range of basic online visualization tools to graphically represent and process the datasets directly within their personal workspace.
The Climate Change Workflow is part of the Flood Event Explorer (FEE, Eggert et al., 2022), developed at the GFZ German Research Centre for Geosciences in close collaboration with Helmholtz-Zentrum Hereon, Climate Service Center Germany. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). The goal of the Climate Change Workflow is to support the analysis of climate-driven changes in flood-generating climate variables, such as precipitation or soil moisture, using regional climate model simulations from the Earth System Grid Federation (ESGF) data archive. It supports answering the geoscientific question: “How does precipitation change over the course of the 21st century under different climate scenarios, compared to a 30-year reference period, over a certain region?” Extracting locally relevant data over a region of interest (ROI) requires climate expert knowledge and data-processing training to correctly process large ensembles of climate model simulations; the Climate Change Workflow tackles this problem. It supports scientists in defining regions of interest, customising their ensembles from the climate model simulations available on the ESGF, and defining variables of interest and relevant time ranges. The Climate Change Workflow provides: (1) a weighted mask of the ROI; (2) weighted climate data of the ROI; (3) the time series evolution of the climate over the ROI for each ensemble member; (4) ensemble statistics of the projected change; and (5) an interactive visualization of the region’s precipitation change projected by the ensemble of selected climate model simulations for different Representative Concentration Pathways (RCPs).
The visualization includes the temporal evolution of precipitation change over the course of the 21st century and statistical characteristics of the ensembles (e.g. median and various percentiles) for two selected 30-year time periods at the mid and end of the 21st century. The added value of the Climate Change Workflow is threefold. First, it reduces the number of different software programs necessary to extract locally relevant data. Second, the intuitive generation of and access to the weighted mask allows for the further development of locally relevant climate indices. Third, by allowing access to the locally relevant data at different stages of the data processing chain, scientists can work with a vastly reduced data volume, allowing a greater number of climate model ensembles to be studied, which translates into greater scientific robustness. Thus, the Climate Change Workflow provides much easier access to an ensemble of high-resolution simulations of precipitation over a given ROI, presenting the region’s projected precipitation change using standardized approaches and supporting the development of additional locally relevant climate indices.
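The core of workflow products (1) and (2) above is an area-weighted average over the ROI mask: each grid cell contributes in proportion to the fraction of its area inside the region. The sketch below illustrates that step only; the function name and toy values are illustrative assumptions, not the workflow's actual implementation.

```python
# Minimal sketch of ROI-weighted averaging, assuming each cell weight in
# [0, 1] gives the fraction of the cell lying inside the ROI.
# Names and values are illustrative, not the FEE code.

def weighted_roi_mean(values, weights):
    """Area-weighted mean of a 2-D field over an ROI weight mask.

    values  -- 2-D list of the climate variable (e.g. precipitation)
    weights -- 2-D list of cell weights in [0, 1]
    """
    num = 0.0
    den = 0.0
    for row_v, row_w in zip(values, weights):
        for v, w in zip(row_v, row_w):
            num += v * w
            den += w
    if den == 0:
        raise ValueError("ROI mask is empty")
    return num / den

# A cell fully inside the ROI (weight 1.0) counts fully, a boundary
# cell (weight 0.5) counts half, cells outside not at all.
precip = [[2.0, 4.0],
          [6.0, 8.0]]
mask   = [[1.0, 0.5],
          [0.0, 0.0]]
print(weighted_roi_mean(precip, mask))  # (2*1 + 4*0.5) / 1.5 = 2.666...
```

Applying this per time step to each ensemble member yields the regional time series from which the ensemble statistics (product 4) are computed.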
As the negative impacts of hydrological extremes increase in large parts of the world, a better understanding of the drivers of change in risk and impacts is essential for effective flood and drought risk management and climate adaptation. However, there is a lack of comprehensive, empirical data about the processes, interactions and feedbacks in complex human-water systems leading to flood and drought impacts. To fill this gap, we present an IAHS Panta Rhei benchmark dataset containing socio-hydrological data on paired events, i.e. two floods or two droughts that occurred in the same area (Kreibich et al. 2017, 2019). The 45 paired events contained occurred in 42 different study areas (three study areas contribute two paired events each), covering different socioeconomic and hydroclimatic contexts across all continents. The dataset is unique in covering both floods and droughts, in the number of cases assessed and in the amount of qualitative and quantitative socio-hydrological data contained. References to the data sources are provided in 2023-001_Kreibich-et-al_Key_data_table.xlsx where possible. Based on templates, we collected detailed, review-style reports describing the event characteristics and processes in the case study areas, as well as various semi-quantitative data, categorised into management, hazard, exposure, vulnerability and impacts. Sources of the data were classified as follows: scientific study (peer-reviewed paper or PhD thesis); report (by governments, administrations, NGOs, research organisations, or projects); the authors' own analysis based on a database (e.g. official statistics, or monitoring data such as weather and discharge data); newspaper article; and expert judgement. The campaign to collect information and data on paired events started at the EGU General Assembly in April 2019 in Vienna and continued with talks promoting the paired-event data collection at various conferences.
Communication with the Panta Rhei community and other flood and drought experts identified through snowballing techniques was important. Thus, data on paired events were provided by professionals with excellent local knowledge of the events and risk management practices.
Assetmaster and Modelprop are WPS (Web Processing Service) software components written in Python 3. They implement two of the several steps of a multi-hazard, scenario-based, decentralized risk assessment for the RIESGOS project; more details can be found at https://github.com/riesgos. Assetmaster outputs a structural exposure model defined in terms of risk-oriented building classes (for a reference geographical region) in GeoJSON format. The service is based on an underlying exposure model in GeoPackage format (.gpkg). Modelprop outputs, for each defined building class, the corresponding fragility function. The Python code implementing the services can also be run locally on your computer to assess the physical vulnerability of a given building portfolio, computing the direct financial losses associated with hazard and multi-hazard scenarios using the DEUS program, which is available at https://github.com/gfzriesgos/deus/.
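Since Assetmaster's output is a GeoJSON exposure model keyed by building class, a consumer can aggregate the portfolio with the standard library alone. The property names below ("taxonomy", "buildings") and the class labels are illustrative assumptions, not the actual RIESGOS schema.

```python
# Hedged sketch: parsing a GeoJSON exposure model as produced by a service
# like Assetmaster. Property names and values are invented for illustration.
import json

exposure_geojson = """
{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [-77.03, -12.05]},
     "properties": {"taxonomy": "MUR_H1", "buildings": 120}},
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [-77.10, -12.00]},
     "properties": {"taxonomy": "CR_LFM_H1_3", "buildings": 45}}
  ]
}
"""

model = json.loads(exposure_geojson)

# Aggregate building counts per risk-oriented class across the portfolio.
counts = {}
for feature in model["features"]:
    props = feature["properties"]
    counts[props["taxonomy"]] = counts.get(props["taxonomy"], 0) + props["buildings"]

print(counts)  # {'MUR_H1': 120, 'CR_LFM_H1_3': 45}
```

Per-class counts like these are what a fragility lookup (Modelprop's output) would be joined against in a loss computation.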
Ground motion models (GMMs) are employed in several domains, from traditional seismic hazard and risk analysis to more recent shakemaps and rapid loss assessment. In this framework, eGSIM is a Python package and web application intended to help engineers and seismologists understand how different models compare for specific earthquake scenarios and how well they fit observed ground motion data, producing results as visual plots or tabular data in standard, accessible and convenient formats (CSV, HDF, JSON and several image formats). Based on OpenQuake, a popular open-source Python library for seismic hazard and risk analysis, eGSIM incorporates hundreds of published GMMs implemented and tested in OpenQuake and makes them available through two user-friendly interfaces: an online graphical user interface (GUI), accessible at https://egsim.gfz-potsdam.de, ideal for comparisons that can be visualized or downloaded as images; and a web application programming interface (web API), implemented along the lines of popular seismological web services (FDSN), better suited for comparisons that are automatized in scheduled jobs or need to be integrated into custom code and further processed in the user's own workflows. By incorporating databases in the form of so-called flatfiles (ESM) and regionalizations derived from seismic hazard models (SHARE, ESHM20), eGSIM allows users to seamlessly select data and models for comparison based on regions of interest. It also features management scripts to smoothly incorporate new flatfiles or regionalizations from future research projects. Moreover, via the generation of flatfile templates based on a custom selection of GMMs, and the possibility to upload user-defined flatfiles, eGSIM facilitates the non-trivial task of compiling data for model comparison and can be used to analyze ground motions from any dataset recorded anywhere in the world, including rapid analysis of earthquake records following large events.
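An FDSN-style web API is typically driven by GET requests with query parameters. The sketch below only shows how such a request could be assembled; the endpoint path and parameter names are hypothetical assumptions, NOT the documented eGSIM API — consult https://egsim.gfz-potsdam.de for the actual service interface.

```python
# Hedged sketch of building a query URL for an FDSN-style service.
# Endpoint and parameter names below are illustrative assumptions only.
from urllib.parse import urlencode

BASE = "https://egsim.gfz-potsdam.de"

def build_query(endpoint, **params):
    """Return a GET URL for the given endpoint and query parameters."""
    # Lists (e.g. several ground motion models) are joined comma-separated,
    # a common convention in FDSN-style web services.
    flat = {k: ",".join(v) if isinstance(v, (list, tuple)) else str(v)
            for k, v in params.items()}
    return f"{BASE}/{endpoint}?{urlencode(flat)}"

url = build_query("query/residuals",              # hypothetical endpoint
                  model=["BindiEtAl2014Rjb", "CauzziEtAl2014"],
                  flatfile="esm2018",             # hypothetical parameter
                  format="csv")
print(url)
```

Because the interface is plain HTTP, such calls can be scheduled in cron jobs or embedded in any workflow language, which is the automation use case the web API targets.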
The River Plume Workflow is part of the Flood Event Explorer (FEE, Eggert et al., 2022), developed at the GFZ German Research Centre for Geosciences in close collaboration with Helmholtz-Zentrum Hereon. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). The focus of the River Plume Workflow is the impact of riverine flood events on the marine environment. At the end of a flood event chain, an unusual amount of nutrients and pollutants is washed into the North Sea, which can have consequences such as increased algae blooms. The workflow aims to enable users to detect a river plume in the North Sea and to determine its spatio-temporal extent. Identifying river plume candidates can happen either manually in the visual interface or through an automatic anomaly detection algorithm using Gaussian regression. In both cases, a combination of observational data, namely FerryBox transects and satellite data, and model data is used. Once a river plume candidate is found, a statistical analysis supplies additional detail on the anomaly and helps to compare the suspected river plume to the surrounding data. Simulated trajectories of particles starting on the FerryBox transect at the time of the original observation and modelled backwards and forwards in time help to verify the origin of the river plume and allow users to follow the anomaly across the North Sea. An interactive map enables users to load additional observational data into the workflow, such as ocean colour satellite maps, and provides them with an overview of the flood impacts and the river plume’s development on its way through the North Sea. In addition, the workflow offers the functionality to assemble satellite-based chlorophyll observations along model trajectories as a time series.
These time series allow scientists to understand processes inside the river plume and to determine the timescales on which these developments happen. For example, chlorophyll degradation rates in the Elbe river plume are currently being investigated using these time series. The workflow's added value lies in the ease with which users can combine observational FerryBox data with relevant model data and other datasets of their choice. Furthermore, the workflow allows users to visually explore the combined data and contains methods to find and highlight anomalies. The workflow’s functionalities also enable users to map the spatio-temporal extent of the river plume and investigate the changes in productivity that occur in the plume. All in all, the River Plume Workflow simplifies the investigation and monitoring of flood events and their impacts in marine environments.
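The workflow's automatic detection uses Gaussian regression; as a much simplified stand-in for that idea, the sketch below flags samples along a synthetic transect that deviate strongly from the transect statistics. The threshold rule and all data are invented for illustration and are not the FEE algorithm.

```python
# Hedged sketch: flagging anomalous samples along a FerryBox-like transect.
# This is a plain standard-deviation rule, a deliberately simplified
# stand-in for the workflow's Gaussian-regression-based detection.

def flag_anomalies(samples, k=2.0):
    """Return indices of samples more than k standard deviations from the mean."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    std = var ** 0.5
    if std == 0:
        return []  # perfectly flat transect: nothing to flag
    return [i for i, x in enumerate(samples) if abs(x - mean) > k * std]

# Synthetic salinity-like transect: a river plume appears as a sharp dip
# against otherwise stable open-sea values.
transect = [34.9, 35.0, 35.1, 34.8, 30.2, 35.0, 34.9]
print(flag_anomalies(transect))  # [4] -- the low-salinity plume candidate
```

A flagged index would then be the starting point for the statistical comparison and the backward/forward particle trajectories described above.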
The Digital Earth Flood Event Explorer supports geoscientists and experts in analysing flood events along the process cascade of event generation, evolution and impact, across atmospheric, terrestrial, and marine disciplines. It applies the concept of scientific workflows and the component-based Data Analytics Software Framework (DASF, Eggert and Dransch, 2021) to an exemplary showcase. It aims at answering the following geoscientific questions:
- How does precipitation change over the course of the 21st century under different climate scenarios over a certain region?
- What are the main hydro-meteorological controls of a specific flood event?
- What are useful indicators to assess socio-economic flood impacts?
- How do flood events impact the marine environment?
- What are the best monitoring sites for upcoming flood events?
The Flood Event Explorer provides a scientific workflow for each geoscientific question, offering enhanced analysis methods from statistics, machine learning, and visual data exploration that are implemented in different languages and software environments, and that access data from a variety of distributed databases. The collaborating scientists are from different Helmholtz research centres and belong to different scientific fields such as hydrology, climate, marine, and environmental science, and computer and data science. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/).
The Socio-Economic Flood Impacts Workflow is part of the Flood Event Explorer (FEE, Eggert et al., 2022), developed at the GFZ German Research Centre for Geosciences. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). The Socio-Economic Flood Impacts Workflow aims to support the identification of relevant controls and useful indicators for the assessment of flood impacts. It supports answering the question: “What are useful indicators to assess socio-economic flood impacts?” Floods impact individuals and communities and may have significant social, economic and environmental consequences. These impacts result from the interplay of hazard (the meteo-hydrological processes leading to high water levels and inundation of usually dry land), exposure (the elements affected by flooding, such as people, the built environment or infrastructure), and vulnerability (the susceptibility of exposed elements to be harmed by flooding). In view of the complex interactions of hazard and impact processes, a broad range of data from disparate sources needs to be compiled and analysed across the boundaries of the climate and atmosphere, catchment and river network, and socio-economic domains. The workflow approaches this problem by supporting scientists in integrating observations, model outputs and other datasets for further analysis in the region of interest. The workflow provides functionalities to select the region of interest, access hazard-, exposure- and vulnerability-related data from different sources, identify flood periods as relevant time ranges, and calculate defined indices. The integrated input dataset is further filtered for the relevant flood event periods in the region of interest to obtain a new comprehensive flood dataset.
This spatio-temporal dataset is analysed using data-science methods such as clustering, classification or correlation algorithms to explore and identify useful indicators of flood impacts. For instance, the importance of different factors, or the interrelationships among multiple variables shaping flood impacts, can be explored. The added value of the Socio-Economic Flood Impacts Workflow is twofold. First, it integrates scattered data from disparate sources and makes them accessible for further analysis, clearly reducing the effort to compile, harmonize and combine a broad range of spatio-temporal data; the integration of new datasets from additional sources also becomes much more straightforward. Second, it enables a flexible analysis of multivariate data and, by reusing algorithms from other workflows, fosters more efficient scientific work that can focus on data analysis instead of tedious data wrangling.
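One of the correlation analyses named above can be illustrated with the Pearson coefficient between a candidate indicator and observed impacts. The per-event numbers below are invented for illustration and do not come from the workflow's data.

```python
# Hedged sketch: correlating a candidate flood-impact indicator with
# observed impacts. The data values are hypothetical.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-event data: peak water level (m) vs. reported losses.
water_level = [1.2, 2.5, 3.1, 4.0, 5.2]
losses      = [0.4, 1.1, 1.6, 2.3, 3.0]
r = pearson(water_level, losses)   # strongly positive for this toy data
```

A high absolute coefficient would nominate the variable as a useful indicator; the workflow's clustering and classification methods probe the same question for non-linear and multivariate relationships.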
| Organisation | Count |
|---|---|
| Other | 1 |
| Science | 32 |
| Type | Count |
|---|---|
| unknown | 33 |
| License | Count |
|---|---|
| Open | 33 |
| Language | Count |
|---|---|
| English | 33 |
| Resource type | Count |
|---|---|
| None | 33 |
| Topic | Count |
|---|---|
| Soil | 19 |
| Organisms and habitats | 18 |
| Air | 16 |
| Humans and environment | 32 |
| Water | 16 |
| Other | 33 |