
Date

Attendees

Goals

  1. How to include results/data from models in the VESPA interface: on which grid, how to average (and on what basis), and whether to provide the data themselves or a link to an interface to the model?

  2. How to link tools such as radiative transfer codes: building the inputs from VESPA content (e.g. temperature profiles, spectra, ...); linking to the tool (evaluation of the resources, e.g. machine time; at IASB, for example, the IT team is not in favour of letting large jobs be run by 'outsiders'); security issues; and how to get the results (simulated spectra, profiles, ...) back into VESPA to be plotted.

Discussion items


VESPA server at LMD

  • The VESPA-LATMOS service on the MCD shows some limitations: the number of granules can be very high, and the database (or server) does not seem able to cope with such a large number of granules. Several options are discussed:
    • Share only profiles corresponding to observations?
    • Can we put them into an observation service (as supplementary granules, as a remote interpolation of current MCD profiles)?
      • EM does not agree: there is a right way of averaging and interpolating the database, and it is done correctly in the MCD.
      • A service could be provided on the MCD proposing validated profiles at places where there are observations.


VESPA server at LATMOS

  • Rename the VESPA server at LATMOS as "LATMOS" and not "SPICAM". See here: http://vo.projet.latmos.ipsl.fr
  • Rebuild the SPICAM service so that it includes the 3 SPICAM tables (currently: spicam.epn_core, ozone.epn_core, locdens.epn_core)
    • Either start from YIN ZI's work (mixin) or from scratch (table + script)

Simulation Services

  • For MCD @ LMD: Use EPN-Ping ? 

Discussion on DOI

  • Attributing DOIs to data: this is a long-standing discussion in many groups and archive centres.
  • The goal is to identify datasets and processing: the main question behind this is the reproducibility of scientific results.
  • Most data centres decided to assign DOIs (or persistent identifiers of some sort) to datasets, not to individual products.
    • Some are studying how to assign DOIs to query results in order to be able to replay a query (e.g. the Query Store project at RDA, CADC in Canada).
    • How to trace or identify processing?
  • DOIs are composed of a prefix and a suffix:
    • The prefix identifies the institute (similar to the Authority ID in IVOA identifiers). Who is the institute responsible for DOI maintenance: ESA or the lab? It should be the lab for reduced data.
  • A team at LATMOS has set up DOIs for their products. The DOI resolves to a web page describing the dataset. This page is mandatory and must be persistent.
  • In a VO context, this is not simple: interoperability does not go through web pages. We need a page per DOI, with a granularity to be decided.
    • Difficult to do at the individual data product level.
    • It makes more sense for a paper or a dataset.
    • As the DOI must point to a persistent and stable resource, DOI versioning is necessary for each resource change (new product in a dataset, new calibration...).
  • In the IVOA framework:
    • An ivoid is a unique identifier for a resource in the VO (e.g.: ivo://padc.obspm/lesia/apis/epn is the EPNcore service of APIS)
    • The same structure could be used for corresponding DOIs: doi://padc.obspm/lesia/apis/epn-v1234 would point to a static page describing APIS (with versioning v1234 for every service change)
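
A minimal sketch of the identifier mapping proposed above, assuming the DOI suffix is simply the ivoid path plus a version tag; the function and the version string are illustrative, not an agreed scheme.

    # Sketch: derive a versioned, DOI-style suffix from an ivoid, as proposed above.
    # The version tag ("v1234") and the mapping itself are assumptions for illustration.
    def ivoid_to_doi_suffix(ivoid: str, version: str) -> str:
        """Strip the ivo:// scheme and append a version tag."""
        path = ivoid[len("ivo://"):] if ivoid.startswith("ivo://") else ivoid
        return f"{path}-{version}"

    # Example with the ivoid quoted in the notes:
    print(ivoid_to_doi_suffix("ivo://padc.obspm/lesia/apis/epn", "v1234"))
    # -> padc.obspm/lesia/apis/epn-v1234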

Run on demand

  • Computing services: there is a system developed by the IVOA, named UWS (Universal Worker Service), which is a job manager.
  • IASB needs asynchronous run capabilities (UWS can do this).
  • Use case: 
    • We want to use the output of a VESPA query (an EPNcore table row) and forward it as input to a computing service (workflow?); see the first sketch after this list.
  • For vertical profile data: 
    • Put the 3 altitude coordinates (from surface, from geoid...) in the output when possible.
    • Is the surface linked to the planetary ellipsoid? DTMs change too often to be reliable in the long run.
    • For Stéphane Erard: Check again the definition of C3
  • For simulation runs, on the server side:
    • We get EPNcore parameters and convert them into local input parameters (units, frames...)
    • Do the computation, and format the output data into a VOTable (see the second sketch after this list)
    • How to handle a series of queries with different parameters?
      • It is possible to set up a loop in STILTS (doable in VESPA?)
    • We need a "service capability" service
    • Large queries must be forbidden, or controlled with a job manager (e.g. on ASIMUT).
  • Existing solutions?
    • OBSPARIS will explore what exists within the IVOA.
    • IASB/LATMOS/LMD will explore what exists within the atmospheric community (including Earth?)
    • Discuss also with Ronan Modolo (LATMOS): IMPEx does similar things for plasma simulations.
  • Distribute the service in DaCHS?
    • Declare a service with parameters in DaCHS? Check with Pierre Le Sidaner and Markus.
    • It is possible to declare the service in the IVOA registry through DaCHS (check how to do this).
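
The following is a minimal sketch of the use case above: take one row from an EPN-TAP query and submit it to an asynchronous UWS job. The TAP URL, the UWS endpoint, the schema name and the job parameter name are all assumptions for illustration, not existing services.

    # Sketch only: forward a VESPA/EPN-TAP query result into a UWS async job.
    # Both service URLs, "schema.epn_core" and "input_url" are hypothetical.
    import time
    import pyvo
    import requests

    TAP_URL = "http://example.org/tap"        # hypothetical EPN-TAP service
    UWS_URL = "http://example.org/uws/jobs"   # hypothetical UWS job list

    # 1. Get one granule from an epn_core table (EPN-TAP query).
    tap = pyvo.dal.TAPService(TAP_URL)
    row = tap.search("SELECT TOP 1 granule_uid, access_url FROM schema.epn_core")[0]

    # 2. Create the job: POST to the job list; the service redirects to the job URL.
    job_url = requests.post(UWS_URL, data={"input_url": row["access_url"]}).url

    # 3. Start the job (UWS phase change), then poll until it leaves the active phases.
    requests.post(job_url + "/phase", data={"PHASE": "RUN"})
    while requests.get(job_url + "/phase").text in ("PENDING", "QUEUED", "EXECUTING"):
        time.sleep(5)

    # 4. Retrieve the result list (format depends on the service).
    print(requests.get(job_url + "/results").text)

A series of queries with different parameters would amount to repeating steps 2 to 4 in a loop on the client side.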
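
On the server side, formatting the computed output as a VOTable can be done with astropy, as in this minimal sketch; the column names, units and values are placeholders, not an agreed output format.

    # Sketch: wrap a simulated vertical profile into a VOTable for the response.
    # Column names, units and values are placeholders.
    from astropy.table import Table
    from astropy.io.votable import from_table, writeto

    profile = Table(
        {"altitude": [10.0, 20.0, 30.0],          # km
         "temperature": [210.0, 195.0, 180.0]},   # K
        meta={"description": "simulated vertical profile"},
    )
    writeto(from_table(profile), "profile_output.vot")  # VOTable returned to the client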

  • Managing input data:
    • If there are input data (not just parameters), the best option is to send the URL of a file containing the input data (this is the way it is done in IMPEx). This might also be done with a POST request.
  • Specification of interface
    • The API must be specified (how to submit a query and how to get back the result)
    • Define typical query content for MCD and ASIMUT
    • Propose corresponding GET queries with EPNcore parameters (see the sketch after this list)
    • Use PDL for capabilities and input parameters?
    • Define the interface: query and response for a single output, and for N outputs.
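
As a starting point for that specification, here is a minimal sketch of what such GET and POST calls could look like; the endpoint and every parameter name are assumptions for illustration, not a defined interface.

    # Sketch: call a hypothetical run-on-demand service with EPNcore-style
    # parameters (GET), or pass input data by URL (POST), as discussed above.
    import requests

    SERVICE_URL = "http://example.org/run"   # hypothetical computing service

    # GET: parameters only (e.g. a single MCD-like profile request).
    params = {"target_name": "Mars", "c1min": 10.0, "c1max": 10.0,
              "c2min": 45.0, "c2max": 45.0, "time_min": "2016-01-01T00:00:00"}
    print(requests.get(SERVICE_URL, params=params).status_code)

    # POST: input data passed as the URL of a file (the IMPEx-like pattern above).
    print(requests.post(SERVICE_URL,
                        data={"input_url": "http://example.org/data/input.vot"}).status_code)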

Action items
