OpenQuake Engine 2.4.0

Released by @nastasi-oq on 25 May at 09:14

[Michele Simionato (@micheles)]

  • Now the command oq export loss_curves/rlz-XXX works for both the
    classical_risk and the event_based_risk calculators
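
For example (the calculation ID and export directory below are made up; this
assumes the usual oq export syntax, with -d selecting the export directory):

    # export the loss curves of realization 0 for calculation 42
    oq export loss_curves/rlz-000 42 -d /tmp/outputs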

[Daniele Viganò (@daniviga)]

  • Removed the default 30-day view limit in the WebUI calculation list

[Michele Simionato (@micheles)]

  • Fixed a broken import affecting the command oq upgrade_nrml
  • Made it possible to specify multiple file names
    in the source_model_logic_tree file
  • Reduced the data transfer in the object RlzsAssoc and improved the
    postprocessing of hazard curves when the option --hc is given
  • Changed the ruptures.xml exporter to export unique ruptures
  • Fixed a bug when downloading the outputs from the WebUI on Windows
  • Made oq info --report fast again by removing the rupture fine filtering
  • Improved the readability of the CSV export dmg_total
  • Removed the column eid from the CSV export ruptures; also
    renamed the field serial to rup_id and reordered the fields
  • Changed the event loss table exporter: now it exports an additional
    column with the rup_id
  • Changed the scenario npz export to also export the GMFs outside the
    maximum distance
  • Fixed scenario npz export when there is a single event
  • Replaced the event tags with numeric event IDs
  • The mean hazard curves are now generated by default
  • Improved the help message of the command oq purge
  • Added a @reader decorator to mark tasks reading directly from the
    file system (see the decorator sketch after this list)
  • Removed the .txt exporter for the GMFs, used internally in the tests
  • Fixed a bug with relative costs which affected master for a long time,
    but not release 2.3: the insured losses were wrong in that case
  • Added an .hdf5 exporter for the asset loss table
  • Loss maps and aggregate losses are computed in parallel or sequentially
    depending on whether the calculation is a postprocessing one or not
  • Deprecated the XML risk exporters
  • Removed the .ext5 file
  • Restored the parameter asset_loss_table in the event based calculators
  • Added a full .hdf5 exporter for hcurves-rlzs
  • Removed the individual_curves flag: now by default only the statistical
    hazard outputs are exported
  • Saved a lot of memory in the computation of the hazard curves and stats
  • Renamed the parameter all_losses to asset_loss_table
  • Added an experimental version of the event based risk calculator which
    is able to use GMFs imported from an external file
  • Added a max_curve functionality to compute the upper limit of the
    hazard curves amongst realizations (see the numpy example after this list)
  • Raised an error if the user specifies quantile_loss_curves
    or conditional_loss_poes in a classical_damage calculation
  • Added a CSV exporter for the benefit-cost-ratio calculator
  • The classical_risk calculator now reads the probability maps directly,
    not the hazard curves
  • Turned the loss curves into on-demand outputs
    for the event based risk calculator
  • The loss ratios are now stored in the datastore and not in an
    external .ext5 file
  • The engine outputs are now streamed by the WebUI
  • Used a temporary export directory in the tests, to avoid conflicts
    in multiuser situations
  • Added an .npz exporter for the loss maps
  • Raised an error early when using a complex logic tree in scenario
    calculations
  • Changed the CSV exporter for the loss curves: now it exports all the
    curves for a given site for the classical_risk calculator
  • Fixed the save_ruptures procedure when there are more than 256
    surfaces in the MultiSurface
  • Renamed the csq_ outputs of the scenario_damage to losses_
  • Changed the way scenario_damage outputs are stored internally, to be
    more consistent with the other calculators
  • Removed the GSIM from the exported file name of the risk outputs
  • New CSV exporter for GMFs generated by the event based calculator
  • The event IDs are now unique and a constraint on the maximum
    number of source groups (65,536) has been added (see the note after
    this list)
  • Added an output losses_by_event to the scenario_risk calculator
  • Changed the output ruptures.csv to avoid duplications
  • Added an output losses_by_taxon to the scenario_risk calculator
  • Fixed a performance bug in get_gmfs: now the scenario risk and damage
    calculators are orders of magnitude faster for big arrays
  • Added an export test for the event loss table in the case of multiple TRTs
  • Removed the experimental rup_data output
  • Added an .npz export for the output losses_by_asset
  • Exported the scenario_risk aggregate losses in a nicer format
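
On the @reader decorator: a minimal sketch of what such a marker could look
like; this is a guess at the pattern, not the engine's actual code, and the
attribute name is hypothetical:

    def reader(func):
        """Mark a task as reading directly from the file system."""
        func.reader = True  # flag inspected by the task distributor (assumption)
        return func

    @reader
    def count_sites(fname):
        with open(fname) as f:
            return sum(1 for line in f)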
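
On max_curve: the upper limit is a pointwise maximum across realizations. A
small numpy illustration, assuming the curves are stored as an array of shape
(R, L) of probabilities of exceedance (R realizations, L intensity levels):

    import numpy
    curves = numpy.array([[0.10, 0.05, 0.01],   # realization 0
                          [0.12, 0.04, 0.02]])  # realization 1
    max_curve = curves.max(axis=0)  # pointwise upper envelope
    print(max_curve)  # [0.12 0.05 0.02]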
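
Note on the 65,536 limit: it equals 2**16, which is consistent with reserving
16 bits of each unique event ID for the source group index. The packing below
is only a plausible illustration, not necessarily the engine's actual scheme:

    # hypothetical packing: group index in the high bits, serial in the low ones
    grp_id, serial = 3, 12345
    eid = (grp_id << 32) + serial
    assert eid >> 32 == grp_id and eid & 0xffffffff == serial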

[Daniele Viganò (@daniviga)]

  • The 'oq webui' command now works on a multi-user installation
  • Split the RPM packages into python-oq-engine (single node) and
    python-oq-engine-master/python-oq-engine-worker (multi-node)

[Paolo Tormene (@ptormene)]

  • The 'Continue' button in the Web UI is now available also for risk
    calculations

[Michele Simionato (@micheles)]

  • Fixed a Python 3 bug in the WebUI when continuing a calculation: the
    hazard_calculation_id was passed as a string and not as an integer
  • Changed the rupture storage to use variable-length arrays, with a
    speedup of two orders of magnitude (see the sketch after this list)
  • Avoided storing the rupture events twice
  • Optimized the serialization of ruptures on HDF5 by using a sids output
  • Changed the Web UI button from "Run Risk" to "Continue"
  • The avg field in the loss curves is again computed as the integral of
    the curve, and is no longer extracted from the avg_losses output
  • Made the fullreport exportable
  • Fixed the rup_data export, since the boundary field was broken
  • Restored the output losses_by_taxon in the event_based_risk calculator
  • Fixed the event based UCERF calculator so that average losses can
    be stored
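
Variable-length (ragged) arrays can be written to HDF5 with h5py; a minimal
sketch of the pattern, with made-up dataset names and data:

    import numpy, h5py
    vfloat = h5py.special_dtype(vlen=numpy.float32)
    with h5py.File('/tmp/ruptures.hdf5', 'w') as f:
        dset = f.create_dataset('ruptures', (2,), dtype=vfloat)
        dset[0] = numpy.array([1., 2., 3.], numpy.float32)  # rows of
        dset[1] = numpy.array([4., 5.], numpy.float32)      # different lengths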

[Daniele Viganò (@daniviga)]

  • Added a check to verify that an 'oq' client is talking to the
    right DbServer instance
  • Introduced an optional argument for the 'oq dbserver' command line
    to override its default interface binding behaviour

[Michele Simionato (@micheles)]

  • Optimized the event based calculators by reducing the number of calls
    to the GmfComputer and by using larger arrays
  • Added a check on vulnerability functions missing for some loss type
    and some taxonomy
  • Now we save the GMFs in the .ext5 file, not in the datastore
  • Fixed a bug in event_based_risk: it was impossible to use vulnerability
    functions with "PM" distribution
  • Fixed a bug in event_based_risk: only the ebrisk calculator is accepted
    as precalculator of event_based_risk
  • Fixed bug in scenario_risk: the output all_losses-rlzs was aggregated
    incorrectly
  • Now the ucerf_risk calculators transfer only the events, not the
    ruptures, thus reducing the data transfer by several orders of magnitude
  • Added a view get_available_gsims to the WebUI and fixed the API docs
  • Introduced a configuration parameter max_site_model_distance with a
    default of 5 km (see the job.ini example after this list)
  • Implemented sampling in the UCERF event based hazard calculator
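
The parameter goes in the job.ini file; a sketch under the assumption that it
bounds the distance used when associating sites to site model parameters (the
file name and section header below are illustrative):

    [site_params]
    site_model_file = site_model.xml
    # associate each site with the closest site model parameters
    # within this distance in km (default 5)
    max_site_model_distance = 5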

[Daniele Viganò (@daniviga)]

  • Use threads instead of processes in DbServer because SQLite3
    isn't fork-safe on macOS Sierra
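
A minimal sketch of the pattern (not the engine's actual DbServer code) using
Python's socketserver: a threading server avoids fork(), which is unsafe with
SQLite3 on macOS Sierra:

    import socketserver

    class Handler(socketserver.BaseRequestHandler):
        def handle(self):
            # the database work happens in a thread of the main process
            self.request.sendall(self.request.recv(1024))

    # ThreadingTCPServer spawns a thread per request; a ForkingTCPServer
    # would fork, which SQLite3 does not tolerate on macOS Sierra
    server = socketserver.ThreadingTCPServer(('127.0.0.1', 0), Handler)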

[Michele Simionato (@micheles)]

  • Fixed a TypeError when deleting a calculation from the WebUI
  • Extended the command oq to_hdf5 to manage source model files too
  • Significantly improved the performance of the event based calculator
    when computing the GMFs and not the hazard curves
  • Stored information about the mean ground motion in the datastore
  • Saved the rupture mesh with 32 bit floats instead of 64 bit floats
    (see the numpy example after this list)
  • Raised the limit on the event IDs from 2^16 to 2^32 per task
  • Fixed classical_risk: there was an error when computing the statistics
    in the case of multiple assets of the same taxonomy on the same site
  • Changed the UCERF event based calculators to parallelize by SES
  • Fixed a site model bug: when the sites are extracted from the site model
    there is no need to perform geospatial queries to get the parameters
  • Added a command oq normalize to produce good sites.csv files
  • Introduced a ses_seed parameter to specify the seed used to generate
    the stochastic event sets; random_seed is used for the sampling only
    (see the job.ini example after this list)
  • Changed the build_rcurves procedure to read the loss ratios directly from
    the workers
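
On the 32 bit mesh: halving the float width halves the storage; a quick numpy
check (the array shape is arbitrary):

    import numpy
    mesh64 = numpy.zeros((1000, 3))         # float64 by default
    mesh32 = numpy.float32(mesh64)          # 32 bit version
    print(mesh64.nbytes, mesh32.nbytes)     # 24000 12000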
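
On ses_seed versus random_seed: a job.ini sketch showing the two seeds side by
side; the values and the section header are illustrative:

    [general]
    calculation_mode = event_based
    ses_seed = 42       # drives the stochastic event set generation
    random_seed = 23    # drives the logic tree sampling only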