<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
	    <channel>
        <title>COST1207 - Group: Registered users</title>
        <link>http://eubrewnet.aemet.es/cost1207/forum/?group=2</link>
        <description><![CDATA[COST Action ESCOST1207]]></description>
        <generator>Simple:Press Version 6.10.11</generator>
        <atom:link href="http://eubrewnet.aemet.es/cost1207/forum/rss/?group=2" rel="self" type="application/rss+xml"/>
		                <item>
                    <title>Javier Lopez-Solano on AOD product</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg2/aod-product/#p71</link>
                    <category>WG2 - Algorithms</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg2/aod-product/#p71</guid>
					                        <description><![CDATA[<p>Hi all!</p>
<p>As you already know from the Azores and Edinburgh meetings, within the framework of COST Action 1207 and the WMO-CIMO Testbed for Aerosols and Water Vapor Remote Sensing Instruments (Izaña), at the RBCCE we have been working, in collaboration with other participants of EUBREWNET (Thomas Carlund and Henri Diémoz, to cite just two names), on the development of the AOD algorithm for the network.</p>
<p>We plan to start coding the AOD product in the server very soon. To make things easier, it would be better to have the different levels of the product defined in advance. So, to start somewhere, we have come up with the following:</p>
<pre>
<div class='sfcode' id='sfcode4291'>
* Level 0
   Taken directly from the Brewer (IOS) program

* Level 1
   1) Ozone from the L1.5 product, with the standard Brewer Rayleigh correction replaced by the one produced by Bodhaine's coefficients

   2) Corrections to the counts:
      a) Same as in the ozone: individual (not summaries!) raw counts with dark-count and dead-time corrections, plus the ozone L1.5 data filters (these counts do NOT include the standard Brewer corrections for temperature, Rayleigh, and filters)
      b) AOD specific:
         i) Temperature correction with absolute temperature coefficients (not available right now; use the relative ones from the ozone configuration)
         ii) Filter correction, with spectral attenuation coefficients for each filter
         iii) Earth-Sun distance correction

   3) AOD calculation (uses an ETC matrix, with one calibration constant for each wavelength and filter)
      a) Rayleigh correction with the spectral Rayleigh coefficients from Bodhaine's prescription (to start, we will use the climatological pressure as in the ozone, but might change to a reanalysis value at a later date)
      b) Ozone correction with the spectral ozone absorption coefficients

* Level 1.5
   1) To the AOD Level 1 product, add the AOD-specific data filters and corrections:
      a) AOD data filter based on the standard deviation of each group of 5 observations (limit is 0.02, following Gröbner 2004)
      b) Polarization correction (currently Cede et al., but may change to Diémoz and Virgilio in the future)

   2) Still to be developed:
      a) Stray light correction
      b) Standard lamp correction (could it be used to track changes in the AOD configuration?)
      c) Other corrections and filters

* Level 2.0
   1) Ozone from the L2 product
   2) AOD configurations validated against Brewer/PFR/AERONET references
</div>
</pre>
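<p>For reference, the AOD calculation in level 1, step 3) above amounts to inverting the Beer-Lambert law with the ETC matrix entry for the given wavelength and filter. A minimal sketch of that step (the function name and arguments are illustrative, not the EUBREWNET code):</p>

```python
import math

def aod(counts, etc, m_aer, m_ray, m_o3, tau_ray, o3_du, alpha_o3):
    """Invert Beer-Lambert:
    ln(ETC) - ln(F) = m_aer*tau_aer + m_ray*tau_ray + m_o3*tau_o3.

    counts   -- dark-count and dead-time corrected count rate at one
                wavelength and ND filter (the level 1 counts above)
    etc      -- extraterrestrial constant for that wavelength/filter,
                i.e. one entry of the ETC matrix
    m_*      -- air masses for aerosol, Rayleigh and ozone
    tau_ray  -- Rayleigh optical depth (e.g. Bodhaine et al. 1999
                coefficients scaled by station pressure)
    o3_du    -- total ozone column in Dobson units
    alpha_o3 -- ozone absorption coefficient per DU at this wavelength
    """
    tau_o3 = o3_du * alpha_o3
    return (math.log(etc) - math.log(counts)
            - m_ray * tau_ray - m_o3 * tau_o3) / m_aer
```

<p>The Earth-Sun distance, temperature, and filter corrections of step 2) b) would be applied to <code>counts</code> before this step.</p>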
<p>We would like to have your input -- do you agree with the general layout of the levels? Is any correction or filter missing? If you have experience with other AOD products, how does this compare to them?<br />
And, of course, anything else you come up with.</p>
<p>We have already received some suggestions from Thomas Carlund, which are included above, and from Stelios Kazadzis, who points out that:</p>
<pre>
<div class='sfcode' id='sfcode3796'>
I would put all level 1.5 corrections under level 1, since right now the level 1 AOD calculation is unusable and the AOD has to be recalculated after the steps presented in level 1.5.
So I would put:
L1 output:  only corrected signals
L1.5: final AOD with preliminary calibration (including all corrections)
L2: AOD with final calibration
The need for a new calibration may only be identified some months after the actual measurement (see the AERONET example below).
 
Another issue is the cloud flagging mentioned in level 1.5.1 a).
It has to be defined whether the AOD cloud-flagging principles will be the same as the ozone acceptance/rejection principles based on the standard deviation of the group of 5 observations.
Given that you need the ozone to derive AOD, the AOD cloud flagging has to be the same or stricter (e.g. for cirrus cloud cases).
 
Since you mention AERONET data:
Level 1.5 data include all corrections and the cloud-flagged final AOD product, so someone can use them more or less in real time.
Level 2 data are calculated much later, when the instrument is re-calibrated, and they are the final data.
I think in the Brewer case, someone who wants to follow this would have to finalize everything under level 1.5,
and then, x months later, when you re-calibrate the instrument and determine new ETCs, go back and recalculate all AOD data again as level 2.
 
 
Concerning the ETC part:
“the ETC calibration is a matrix for filter and wavelength”
I would work with converting all Brewer counts of all filters to ND#0 by having a conversion function Counts(wl, filter) = f(nd(wl, 0)).
I suppose that’s how you have calculated the ETCs for all filters.
Using ETCs calculated from filter conversions includes an additional step: the convolution of the ETC with the Brewer slit at a specific wavelength and for the different ND filters.
But maybe this is a detail, or you have calculated the ETCs for the different ND filters in some other way.
</div>
</pre>
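<p>The counts-to-ND#0 conversion Stelios describes can be sketched in a few lines (the attenuation values and names below are made up for illustration; real instruments use measured spectral attenuations per filter and wavelength):</p>

```python
# Hypothetical log10 attenuations of each ND filter position at one
# wavelength; a real Brewer uses measured spectral values per filter.
ND_ATT = {0: 0.0, 1: 0.5, 2: 1.0, 3: 1.5, 4: 2.0, 5: 2.5}

def to_nd0(counts, nd_filter, att=ND_ATT):
    """Scale a count rate measured behind an ND filter to the
    equivalent ND#0 (open position) count rate."""
    return counts * 10 ** att[nd_filter]
```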
]]></description>
					                    <pubDate>Mon, 09 Jan 2017 10:15:25 +0000</pubDate>
                </item>
				                <item>
                    <title>lakkala on UV products</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg4/uv-products/#p68</link>
                    <category>WG4 - Users, public outreach and applications</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg4/uv-products/#p68</guid>
					                        <description><![CDATA[<p>Dear all,<br />
Please find attached the summary of the answers to the UV product questionnaire.<br />
Best regards,<br />
Kaisa</p>
<p>We got 12 answers. Summary from Kaisa Lakkala 21.1.2015.</p>
<p>1. Are you interested in<br />
a) UV doses, unit J/m²<br />
2 yes<br />
b) dose rates, unit W/m²<br />
3 yes<br />
CIE erythemally-weighted UV<br />
c) spectra, unit W/m²/nm<br />
8 yes<br />
d) something else?<br />
3 for daily integrals J/m2/nm<br />
  -integrals J/m2/nm divided in two:<br />
  sun rise -&#62;midday<br />
  midday -&#62;sun set</p>
<p>2. What would be the good time resolution<br />
for<br />
a) spectral data<br />
- at least hourly<br />
- 15-30 min<br />
- continuous measurements</p>
<p>b) dose rates<br />
-at least hourly</p>
<p>c) doses<br />
-hourly<br />
-one per day<br />
-sun rise -&#62;midday<br />
-midday -&#62;sun set</p>
<p>d) something else?<br />
-all data available<br />
-The minimum is 15-minute time resolution, but we could specify a more objective criterion from the SZA changes in the morning/afternoon and the corresponding UV changes</p>
<p>-include midday UV measurement for each station into the schedule of the Brewers</p>
<p>3. Which action spectra would you like to use for the dose rates?<br />
6  for -CIE erythemal<br />
3 for -Vitamin D<br />
2 for -new CIE 1998 action spectrum for erythema<br />
  -Generalized Plant (Caldwell 1971)<br />
  -DNA damage<br />
  -UVB<br />
  -UVA</p>
<p>-Here is our list of action spectra that we use: <a href="http://uv.biospherical.com/login/GUV/description-GUV-data-products.html" rel="nofollow" target="_blank">http://uv.biospherical.com/login/GUV/description-GUV-data-products.html</a></p>
<p>-The action spectra that we apply to spectra submitted to NDACC include:<br />
290-315 nm UVB (W m-2)<br />
315-400 nm UVA  (W m-2)<br />
Erythemal UV (W m-2), CIE according to McKinlay and Diffey (1987)<br />
DNA-weighted UV (W m-2), Bernhard and Seckmeyer (1997) formulation of Setlow (1974)<br />
Generalised Plant (W m-2), Green et al. (1974) formulation of Caldwell (1971), normalized to 1 at 300 nm<br />
Vitamin D production (W m-2), Bouillon et al. (2006), truncated at 315 nm<br />
So if these spectra could be also part of the EUBREWNET database, that would be nice</p>
<p>We could maybe provide different action spectra through the database and let the users decide if and which one they would like to use. Then the database could maybe perform the weighting? </p>
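<p>If the database were to perform the weighting, the core operation is small. A sketch using the CIE erythemal action spectrum of McKinlay and Diffey (1987); the function names are illustrative:</p>

```python
def cie_erythemal(wl_nm):
    """CIE erythemal action spectrum (McKinlay and Diffey, 1987)."""
    if wl_nm <= 298.0:
        return 1.0
    if wl_nm <= 328.0:
        return 10.0 ** (0.094 * (298.0 - wl_nm))
    if wl_nm <= 400.0:
        return 10.0 ** (0.015 * (139.0 - wl_nm))
    return 0.0

def weighted_dose_rate(wavelengths_nm, irradiance, weight=cie_erythemal):
    """Trapezoidal integral over wavelength of the action-spectrum-
    weighted spectral irradiance (W/m^2/nm), giving a dose rate in W/m^2."""
    w = [weight(x) * e for x, e in zip(wavelengths_nm, irradiance)]
    return sum(0.5 * (w[i] + w[i + 1]) * (wavelengths_nm[i + 1] - wavelengths_nm[i])
               for i in range(len(wavelengths_nm) - 1))
```

<p>Passing a different <code>weight</code> function (Vitamin D, DNA damage, ...) would cover the other action spectra in the list above.</p>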
<p>4. Do you need spectra that are weighted with an action spectrum (but not integrated over the wavelengths)?<br />
2 No<br />
2 Yes</p>
<p>5. Do you need some other UV quantities?<br />
-Cloud modification factor<br />
-Diffuse component<br />
-Direct component<br />
-summaries<br />
-monthly means of 1a-b<br />
-monthly and yearly sums of 1a-b</p>
<p>6. Something else?<br />
- SZA for each wavelength<br />
- cosine correction factor<br />
- slit function of each instrument<br />
- info of applied corrections, and how they are made<br />
- Time step for each wavelength<br />
- uncertainties in spectral values<br />
- time available for every spectral measurement<br />
- QA spectral state at the highest time available, and time-stamped for each data point</p>
<p>-Have a look at what we provide: <a href="http://uv.biospherical.com/Version2/Dataproducts.asp" rel="nofollow" target="_blank">http://uv.biospherical.com/Version2/Dataproducts.asp</a><br />
Quality control information (see “Flags”), model spectra, direct measurements in addition to global measurements, plus data from ancillary sensors would all be useful.</p>
<p>-One important question is whether the spectral data could be made available for the UV plus VIS range (up to around 800 nm). Of course this goes beyond the "UV scope", but it is well known that the effect of UV radiation on most organisms depends on interactions with UV-A and VIS radiation. So having the UV + VIS spectrum available would make it much more useful; even if the main data corpus is only UV, having at least some locations with data that includes VIS would be extremely useful.</p>
<p>-I am developing R packages for photobiology. We are using them in house and with a few collaborators, but the long-term plan is to advertise them when they have been fully written and tested. If you are interested you can have a look at <a href="http://r4photobiology.wordpress.com/" rel="nofollow" target="_blank">http://r4photobiology.wordpress.com/</a></p>
<p>-I like to download data from ftp sites, and I like ASCII data</p>
<p>-Freely available access to the data for scientific use</p>
]]></description>
					                    <pubDate>Tue, 08 Mar 2016 09:17:08 +0000</pubDate>
                </item>
				                <item>
                    <title>Bentorey Hernández Cruz on Brewer Station Metadata</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg3/brewer-station-metadata/#p67</link>
                    <category>WG3 - Networking and Data Processing</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg3/brewer-station-metadata/#p67</guid>
					                        <description><![CDATA[<p>Hello All,</p>
<p>Thank you Volodya, it helps a lot. What I was thinking of, more or less, was to create an additional table for station information and another to link this information in time with the Brewers.<br />
The design would be like this:<br />
Brewer Table (already implemented): Brewer id, present latitude and longitude (from the B files), status, last time of transmission...<br />
Station Table: latitude, longitude, height, GAW id, PI in charge....<br />
Brewer Station Table: Brewer id, Station id, and the date when the Brewer was deployed at the Station.</p>
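<p>As an illustration only (the table and column names below are a sketch, not the actual EUBREWNET schema), the three tables could look like this:</p>

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE brewer (
    brewer_id   INTEGER PRIMARY KEY,  -- serial number
    model       TEXT,                 -- e.g. 'MKIII'
    status      TEXT,
    last_tx     TEXT                  -- last time of transmission
);
CREATE TABLE station (
    station_id  INTEGER PRIMARY KEY,
    gaw_id      TEXT,
    latitude    REAL,
    longitude   REAL,
    height_m    REAL,
    pi_name     TEXT                  -- PI in charge
);
-- Link table: one row per deployment, so the full history is kept.
CREATE TABLE brewer_station (
    brewer_id   INTEGER REFERENCES brewer(brewer_id),
    station_id  INTEGER REFERENCES station(station_id),
    deployed_on TEXT,                 -- date the Brewer arrived
    PRIMARY KEY (brewer_id, station_id, deployed_on)
);
""")
```

<p>A Brewer's location at any given date is then the deployment row with the latest <code>deployed_on</code> not after that date.</p>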
<p>What do you think about it?</p>
<p>Bentor</p>
]]></description>
					                    <pubDate>Wed, 09 Dec 2015 08:01:36 +0000</pubDate>
                </item>
				                <item>
                    <title>Volodya on Brewer Station Metadata</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg3/brewer-station-metadata/#p66</link>
                    <category>WG3 - Networking and Data Processing</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg3/brewer-station-metadata/#p66</guid>
					                        <description><![CDATA[<p>Hello,</p>
<p>in my database, I keep information about locations/stations and about Brewers in separate tables. The Brewer table should be able to handle multiple versions for the same Brewer serial number. Having multiple versions of stations should also be possible in the implementation. </p>
<p>Everything that is usually needed for the currently used Brewer algorithm is available in the b-files, but analyzing data from several instruments may require additional information that can come from these "meta" tables. </p>
<p>An example of a query that one might do is "which stations within 2000 km radius from where Brewer ### was on Dec. 4, 2015 had Brewer MKIII, are at altitude 500 m or higher and launched ozone sondes?"</p>
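<p>A query like that needs a distance predicate on the station coordinates. A minimal sketch of the geometric part (the function names are illustrative):</p>

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, assuming a spherical Earth (R = 6371 km)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def stations_within(stations, lat, lon, radius_km):
    """Filter (station_id, lat, lon) tuples by distance from (lat, lon)."""
    return [s for s in stations if haversine_km(lat, lon, s[1], s[2]) <= radius_km]
```

<p>The altitude, instrument-model, and sonde criteria would then be ordinary filters on the "meta" tables.</p>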
<p>Hope this helps,</p>
<p>Volodya</p>
]]></description>
					                    <pubDate>Fri, 04 Dec 2015 15:49:45 +0000</pubDate>
                </item>
				                <item>
                    <title>Bentorey Hernández Cruz on Brewer Station Metadata</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg3/brewer-station-metadata/#p65</link>
                    <category>WG3 - Networking and Data Processing</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg3/brewer-station-metadata/#p65</guid>
					                        <description><![CDATA[<p>Hello everybody,<br />
   I've started adding the metadata form to EUBREWNET, taking as a starting point the one suggested by <a href="https://docs.google.com/forms/d/16QJfFi3zrhuNDmCO_hxY5CzlQE9-3o2VhDkKXGQfyWU/viewform" target="_blank">Diamantino</a> (see his post above), but I have some questions about it.<br />
   Some of the information is related to a Brewer (like model and measurements), but other information is related to the place where the Brewers are deployed (latitude, longitude....).<br />
   For implementation purposes, I need to know exactly what the relation between locations and Brewers is, and whether it changes in time.</p>
<p>   I began by defining a Station table, which holds information about the places where Brewers are deployed (from Diamantino's form):<br />
       - Station Id<br />
       - Gaw Id<br />
       - Wmo Id<br />
       - Latitude<br />
       - Longitude<br />
       - Altitude<br />
       - and so on ....</p>
<p>   And I am thinking out a way to link this information to the Brewers over time (for example, together with their available measurements).</p>
<p>   How are we going to use these values? Are they only for display, or are they needed for calculations? Which of them will be generated (such as the available measurements)? Which of them come from other databases?</p>
<p>   Bentor</p>
]]></description>
					                    <pubDate>Mon, 30 Nov 2015 07:53:57 +0000</pubDate>
                </item>
				                <item>
                    <title>Bentorey Hernández Cruz on Eubrewnet Data Policy</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg3/eubrewnet-data-policy/#p64</link>
                    <category>WG3 - Networking and Data Processing</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg3/eubrewnet-data-policy/#p64</guid>
					                        <description><![CDATA[<p>Dear all,<br />
    There's a discussion in the Core Group about the Data Policy in EUBREWNET. You can find a draft (by Veerle and Paul) in the EUBREWNET ownCloud: <a href="https://eubrewnet.aemet.es/cost1207/owncloud/public.php?service=files&#38;t=3725366259a87bc5a32cdb191948d50c" title="Data Policy" target="_blank">Data Policy</a>.</p>
<p>    Here you can see <b>Alberto</b> answering some of <i>Veerle's</i> questions:</p>
<p><b>    Dear all,<br />
</b><b>One big decision for EUBREWNET:<br />
</b><b>data users have to be registered at EUBREWNET to download the product files.<br />
</b><b>If we agree with this, it will be easy to monitor access to the data.<br />
</b><b>There are three groups of “registered user”:<br />
</b><b>data user: can download the products, but cannot modify anything and has no direct access to the database<br />
</b><b>data provider (PI): can download products, has direct access to the database, and can modify the configuration/metadata of their registered Brewers<br />
</b><b>data admin: no limits<br />
</b><b>The metadata with the PI contact information is part of the configuration and will also be displayed in the header of the downloaded files.</b></p>
<p><i>2015-11-19 10:26 GMT+00:00 Veerle De Bock :<br />
</i><i>Dear CG members (and Bentor)</i></p>
<p><i>I have made a first draft of the data policy document for Eubrewnet (in consultation with Paul).<br />
</i><i>Could you have a look at the document and the comments below and let me know what you think?<br />
</i><i>If the document looks fine to all of you, we can send it to the MC members for approval.</i></p>
<p><i>1. comments to all:<br />
</i><i>- On the Aeronet site, you have a special section where you can find the general data use policy. When you want to download data, you get a pop-up window with the same<br />
</i><i>information and with an extra sentence at the end where the user is asked to click the "accept" or the "do not accept" button. I have added this sentence in red at the end of the document.<br />
<b>The PI's information will be on the pop-up and in the header of the files.</b></i></p>
<p><i>- referencing: AERONET has a few papers that can be used for referencing. EUBREWNET does not (yet) have such papers, so at the moment I would suggest asking<br />
</i><i>people to refer to the website.<br />
</i><i>- which website do we refer to? For now I have put in the link to the database website, but we could also reference eubrewnet.aemet.es.<br />
<b>eubrewnet.org is the website for the COST Action; once it is over, the database can take over the domain name of the COST Action, for example eubrewnet.aemet.es/database for the database and eubrewnet.aemet.es/cost1207 for the COST Action web page.</b></i></p>
<p><i>- downloading: should we collect some information (such as email and institution) from the people who are downloading data? This could be entered after the 'accept' button<br />
  and before the download can begin:<br />
<b>See my first comment.</b></i></p>
<p><i>2. comment to Bentor:<br />
</i><i>- could you please enable the station operators to fill in the PI information in the database?<br />
<b>Currently the metadata information is restricted to the information in the O3brewer.ini configuration file. Bentor is working to include the form proposed by WG3 a few months ago; he will contact you about the details.</b></i></p>
<p>Are there any doubts or remarks from those who are not part of the Core Group?</p>
<p>Bentor</p>
]]></description>
					                    <pubDate>Mon, 30 Nov 2015 07:48:47 +0000</pubDate>
                </item>
				                <item>
                    <title>Alberto Redondas on Data ownership &#38; EUBREWNET future</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg4/data-ownership-eubrewnet-future/#p63</link>
                    <category>WG4 - Users, public outreach and applications</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg4/data-ownership-eubrewnet-future/#p63</guid>
<description><![CDATA[<p>The minutes of the Tenerife meeting are now available on the web (<a href="https://eubrewnet.aemet.es/cost1207/2015/01/16/wg-meeting-database-and-processing-tf-january-2015-2/" target="_blank">https://eubrewnet.aemet.es/cost1207/2015/01/16/wg-meeting-database-and-processing-tf-january-2015-2/</a>),<br />
and here is a summary of the decisions that have been made.</p>
<p>The discussion of the real-time product is finished; we agree on:<br />
  -SL correction<br />
  -Stray light correction<br />
  -Filter correction </p>
<p>The products will change according to the information we have at that moment (stray light parameters, filter attenuations, etc.).</p>
<p>Version 2 of the ozone algorithm, which includes the Serdyuchenko cross sections, Bodhaine 1999 for Rayleigh, and effective ozone height according to climatology, is agreed.<br />
AOD will use the Version 2 ozone for consistency, and the same parameters for the cross sections and Rayleigh.<br />
K&#38;Z will support the development of the public library of procedures and algorithms, and the warning system based on the FMI QC.</p>
<p>Web update:<br />
     -Level 0 data (without any processing) is now displayed<br />
     -Database access via a JSON interface<br />
     -The configuration interface is now ready (PIs can enter the configuration)</p>
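<p>To give an idea of what consuming the JSON interface could look like, here is a sketch; the payload, field names, and values are invented for illustration and are not the actual EUBREWNET API:</p>

```python
import json

# Invented example payload, standing in for a real response from the
# database's JSON interface (field names are hypothetical).
sample_response = json.dumps([
    {"date": "2015-01-15", "brewer": "185", "o3": 271.3},
    {"date": "2015-01-15", "brewer": "185", "o3": 270.8},
])

records = json.loads(sample_response)
mean_o3 = sum(r["o3"] for r in records) / len(records)
```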
<p>I will open a debate on the forum because there are several initiatives for the future support of EUBREWNET and for data requests to it as a network.<br />
This links with the current discussion about the data policy.</p>
<p>These initiatives are:</p>
<p> -A presentation by Emilio on behalf of EUBREWNET at the MACC meeting in Reading; John was also there.<br />
- There is a project funded by ESA with the aim of providing real-time validation of OMI at selected EUBREWNET stations.<br />
- A request from Zerefos for SO2 and AOD data from the network for volcanic ash detection; this includes several European Brewer stations and can serve as a test for the database.<br />
- A poster will be presented at the NOG meeting by Henri Diémoz et al., showing the stray-light results for several Brewers in the network.</p>
<p>My proposal is to act like AERONET: the idea is to have a federated network, where every PI is responsible for their own data.</p>
<p>The data products will be open: everybody can access the data, but, as in AERONET, the PI has priority in publications. The raw data files that some of you sent will only be available to the other data submitters on the web; again, the PI has priority for publication.</p>
<p>I think this is an opportunity to start working as a network, and to try to get funds to support it in the near future.</p>
<p>Regards</p>
<p>Alberto Redondas</p>
]]></description>
					                    <pubDate>Wed, 25 Feb 2015 15:34:57 +0000</pubDate>
                </item>
				                <item>
                    <title>Webmaster on Data ownership</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg4/data-ownership/#p62</link>
                    <category>WG4 - Users, public outreach and applications</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg4/data-ownership/#p62</guid>
					                        <description><![CDATA[<p>Dear all,<br />
    What you see is a conversation between John Rimmer, Alberto Redondas, Kaisa Lakkala, Tomy Karppinen, Julian Gröbner, Veerle De Bock and Alkiviadis F. Bais corresponding to a question by Andrew Smedley.</p>
<p>--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------<br />
Dear Kaisa,</p>
<p>As a user (i.e. someone who will be downloading data), I would prefer the data in their final QA spectral state at the highest time available, and time-stamped for each data point. From this I can calculate doses and rates if needs be, and apply any action spectra that are relevant.</p>
<p>However, as someone who would be uploading data, this raises another question. I was under the impression that the EUBREWNET database was intended as a way to apply a second check on the diagnostics of all instruments in a consistent way - I had not realised until now that any UV data uploaded would be available for download by other users. What controls and agreements will be put in place to ensure someone gives proper credit / co-authorship on any work based on the data they might download? Once data is uploaded, who is considered the owner? I am sure these details probably aren't finalised yet, but I would be interested to know the final decision, and perhaps it would be useful to circulate them so that contributors are aware of them when uploading data.</p>
<p>Yours,<br />
Andy Smedley<br />
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------<br />
Dear all,<br />
thank you Andrew for addressing an important question concerning the ownership of the data uploaded into EUBREWNET.<br />
I forward your message to the WG leaders, Alberto Redondas, Tomi Karppinen, Alkis Bais and Veerle de Bock, as well as to John Rimmer, so that they can further consider your questions.</p>
<p>Please look at the second paragraph of the forwarded message,</p>
<p>Best regards,<br />
Kaisa Lakkala<br />
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------<br />
Hello all</p>
<p>In my opinion the data policy of AERONET is suitable for EUBREWNET and we can use the same. The data already there is for demonstration purposes only, and we don't make any use of it without the agreement of the 'PI'.</p>
<p>(From the AERONET site)</p>
<p>Notice to users:<br />
The public domain data you are about to download are contributed by the International AERONET Federation. Each site has a Principal Investigator(s) (PI) , responsible for deployment, maintenance and data collection. The PI has priority use of the data collected at the site. The PI is entitled to be informed of any other use of that site data. PI contact information can be found on data charts and in downloaded data files for each AERONET site.</p>
<p>Recommended guidelines for data use and publication:<br />
Although journal paper authorship and acknowledgement is the domain of the senior author and no policy is universally applicable, the AERONET contributors ask that every practical attempt be made to honor the following general guidelines.</p>
<p>Using AERONET data:<br />
Please consult with the PI(s) of the data to be used. </p>
<p>Referencing:<br />
Always reference the appropriate key AERONET papers for any publications.</p>
<p>Publishing AERONET data from a 'few' sites:<br />
Please consider authorship for the PI(s) and/or the following acknowledgement:</p>
<p>We thank the (Project/PI) for (its/theirs) effort in establishing and maintaining (site name(s)) sites.<br />
Publishing data from 'many' sites:<br />
A general acknowledgement is typically sufficient and may read:</p>
<p>We thank the (PI investigators) and their staff for establishing and maintaining the (#)sites used in this investigation.<br />
However if the AERONET data are a principal component of the paper then co-authorship to PI's should be offered.</p>
<p>Alberto<br />
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------<br />
Hi all.</p>
<p>I think Alberto’s suggestion is a good starting point for EUBREWNET.</p>
<p>Cheers<br />
Alkis<br />
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------<br />
Ultimately it will be a decision for the MC guided by the outcomes of WG4 on access rules. All contributors have the opportunity to be part of these discussions and should do so in order to encompass all view points. There is no value in being passive. Instead of asking what will happen we should encourage everyone to state what they would like to happen!<br />
Having said that, I also agree with Alberto and Alkis. </p>
<p>John<br />
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------<br />
Dear all,<br />
I added Julian as recipient, so he can also follow the discussion.<br />
I am sorry for my mistake, forgetting to include Julian already in the first email.</p>
<p>Best regards,<br />
Kaisa<br />
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------<br />
I agree that Alberto's suggestion of following the AERONET guidelines seems sensible. It is what I would expect both as a data supplier and if downloading data as an end-user. In some ways it is little more than a reminder to give due and proper credit for data monitoring, and the fact that PIs are responsible for data, but I think it is important that it is there.</p>
<p>Yours,<br />
Andy<br />
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------<br />
Dear All</p>
<p>This is an important issue for everyone in that if all network members are expected to submit their data, they need to be reassured over ownership concerns and data access rules.</p>
<p>With the time approaching for the database to become live, I think it would be a good idea to transfer this topic to the forum so everyone has a chance to comment.</p>
<p>It will be better to have it sorted out beforehand than to have it cause problems after implementation because of non-agreement. We can then take a final official vote in May.</p>
<p>Cheers<br />
John<br />
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------</p>
]]></description>
					                    <pubDate>Thu, 22 Jan 2015 12:15:59 +0000</pubDate>
                </item>
				                <item>
                    <title>tomikarp on Standard Lamp -test</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p61</link>
                    <category>WG2 - Algorithms</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p61</guid>
					                        <description><![CDATA[<p>Hi Tomi,</p>
<p>Fair point. I suspect that this comes down to separating the variability of the lamp from that of the instrument. The simplest treatment is to assume that all the variation is due to the instrument and that the reference lamp is stable, as one would hope in the first instance. To go beyond this I guess you need some characterisation of lamp stability via an independent instrument (either in the lab or a co-located instrument). Perhaps this would give some idea of the typical behaviour of the lamp, which you could then use to make objective decisions about which changes are likely instrument response changes and which are due to the lamp.</p>
<p>As to the time interval, I'm still of the opinion that you'd want something short that represents the behaviour of the instrument close to the measurement you're interested in (having separated out the lamp, as above).</p>
<p>Yours,<br />
Andy</p>
]]></description>
					                    <pubDate>Fri, 19 Dec 2014 14:14:50 +0000</pubDate>
                </item>
				                <item>
                    <title>tomikarp on Standard Lamp -test</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p60</link>
                    <category>WG2 - Algorithms</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p60</guid>
					                        <description><![CDATA[<p>Hello Andrew,</p>
<p>The only thing I am afraid of is that the R6 does not really follow the changes in the ETC (meaning the changes in the response), but if you have, for example, five SL tests during a day, their average is quite close to the truth. If the ±10 units is not due to changes in the instrument's response but to lamp spectrum fluctuation, then we should most definitely use something with smoother features.</p>
<p>How to detect steps is then another story, and determining whether it is a change in the response or in the lamp is maybe even harder. I think it at least calls for a human eye to decide when the reference value should be changed. And this should be done after the next calibration, because if the lamp reference has changed but not the ETC, then we should apply new calibration constants from the point of change onwards.</p>
<p>best regards Tomi</p>
]]></description>
					                    <pubDate>Fri, 19 Dec 2014 14:11:19 +0000</pubDate>
                </item>
				                <item>
                    <title>tomikarp on Standard Lamp -test</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p59</link>
                    <category>WG2 - Algorithms</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p59</guid>
					                        <description><![CDATA[<p>Some helpful views from Andrew:</p>
<p>Hi Tomi,</p>
<p>Looking at your five options for managing the R6 values, the one that makes the most sense to me is a daily median. I think it is very hard to manage the steps and jumps that you see in the time series with any option on a longer time scale. For example, how would you decide on a criterion for a step, and when not to apply the time averaging? Besides, don't you want the R6 that is most representative of the instrument's behaviour on the day in question? You only get that by using a daily average of some sort. Also, the median has the benefit of rejecting outliers nicely, whereas the mean is influenced by them.</p>
<p>Yours,<br />
Andy</p>
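The point about outliers can be sketched with a few lines of Python (the R6 values below are hypothetical, not from any Brewer record): a single bad SL test pulls the daily mean by many units, while the daily median stays with the bulk of the tests.

```python
# Hypothetical R6 standard-lamp ratios from five SL tests on one day,
# with one outlier (e.g. a bad lamp scan). Values are illustrative only.
r6_values = [1612, 1615, 1610, 1685, 1613]

def daily_mean(values):
    return sum(values) / len(values)

def daily_median(values):
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

# The single outlier shifts the mean by ~14 units,
# while the median stays with the bulk of the tests.
print(daily_mean(r6_values))    # 1627.0
print(daily_median(r6_values))  # 1613
```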
]]></description>
					                    <pubDate>Fri, 19 Dec 2014 14:10:30 +0000</pubDate>
                </item>
				                <item>
                    <title>tomikarp on Standard Lamp -test</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p58</link>
                    <category>WG2 - Algorithms</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p58</guid>
					                        <description><![CDATA[<p>Hello all,</p>
<p>thank you very much for the input to my survey, and sorry for the late response.</p>
<p>All the replies I got will be posted on the Eubrewnet.org forum with this message (please let me know if you don't want your response in the forum).</p>
<p>Here is a short overview of the replies and my thoughts on them.</p>
<p>We have to decide how to manage two different things: the measurements and the reference value. The first thing to note about the measurements is that they are rather noisy. I looked at the standard lamp history of Brewer 037 and the variability seems to be around 10 to 15 units, so some kind of smoothed/mean value could be a good representative of the measurements.</p>
<p>The used/suggested methods to manage the measured R6 values were:</p>
<p>-daily median<br />
-daily mean<br />
-10 days running mean with weighting (O3Brewer)<br />
-21 days running mean with weighting (found it on some WOUDC related website by Googling :D)<br />
-fitting a polynomial to the measurements</p>
<p>The appropriate time window for any of these methods still needs to be investigated.</p>
<p>There is also the problem of occasional outliers. When I calculated the running mean for the attached figure, I took out all the values that were more than 15 units off the median value for the time window (10 or 21 days).</p>
<p>On the other hand, we should also manage the reference value. If all the changes are changes in the instrument and not in the lamp, the reference value can (and should) be kept the same, as should the ETC. This way the SL-corrected ETC will be fine. On the other hand, if we see in the next calibration that the change in the R6 value is not the same as the change in the ETC, we should figure out what to use as the reference. Of course, if there is a step or a jump, a new reference value should be introduced. But what about slow changes?</p>
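The outlier screening described above can be sketched as follows (all data values are hypothetical; the 15-unit threshold and 10-day window follow the description, while the centered window is an assumption for illustration):

```python
import statistics

def screened_running_mean(r6, window=10, threshold=15):
    """Running mean of daily R6 values, rejecting points more than
    `threshold` units from the window median before averaging.
    A centered window is assumed here for illustration."""
    out = []
    half = window // 2
    for i in range(len(r6)):
        win = r6[max(0, i - half):i + half + 1]
        med = statistics.median(win)
        kept = [v for v in win if abs(v - med) <= threshold]
        out.append(sum(kept) / len(kept))
    return out

# A flat hypothetical series with one 40-unit outlier: the outlier is
# excluded from every window it falls in, so the result stays flat.
series = [1610.0] * 20
series[10] = 1650.0
smoothed = screened_running_mean(series)
```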
<p>Many of the methods in your answers had an interpolation of the reference value. Should this be used for each instrument?</p>
<p>Attached are some figures with the SL time series of Brewer 037. The first figure shows the whole time series. There is some strange behaviour in the beginning, and stability is reached in the early 2000s <img src="https://eubrewnet.aemet.es/cost1207/wp-includes/images/smilies/icon_biggrin.gif" alt=":D" class="spWPSmiley" style="max-height:1em;margin:0"  /> </p>
<p>There is also an example of these different smoothing methods for the year 2012.</p>
<p>Feel free to comment on the figures or my thoughts.</p>
<p>best regards and Merry Christmas<br />
Tomi</p>
<p>figures: <a href="https://www.dropbox.com/sh/at4etrl3efk2crm/AABI9UJUuBXaE1P9Txs3EphVa?dl=0" rel="nofollow" target="_blank">https://www.dropbox.com/sh/at4etrl3efk2crm/AABI9UJUuBXaE1P9Txs3EphVa?dl=0</a></p>
]]></description>
					                    <pubDate>Thu, 18 Dec 2014 12:06:16 +0000</pubDate>
                </item>
				                <item>
                    <title>tomikarp on Standard Lamp -test</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p57</link>
                    <category>WG2 - Algorithms</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p57</guid>
					                        <description><![CDATA[<p>Dear Tomi,</p>
<p>standard lamp correction on the Brewer MKIV 097 in Poprad Ganovce has been performed by the following procedures:</p>
<p>1. SL tests have been run at least 3 times per day.<br />
2. Interdiurnal SL corrections of total ozone have been done using the software of Martin Stanek (O3Brewer v. 5.0 - SL test, O3 correction and recalculation). There is an apparent correlation between the instrument temperature (and also the ambient temperature) and the SL correction (R5, R6 ratios).<br />
3. The SL corrections after the instrument calibration have been done in accordance with the recommendations set in the instrument calibration report. If ozone data recalculation is recommended, the correction is done using linear interpolation in accordance with the SL history.<br />
4. Correction of the ozone has not been applied to data measured before sudden changes in the instrument, such as an SL exchange or other serious technical problems (e.g. exchange of some instrument part).</p>
<p>Anna and Oliver</p>
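The temperature-correlation check mentioned in point 2 can be sketched like this (all values are hypothetical, and `pearson_r` is an illustrative helper, not part of O3Brewer):

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical instrument temperatures (deg C) and R6 ratios that drift
# together, to illustrate checking the correlation mentioned above.
temp = [5.0, 10.0, 15.0, 20.0, 25.0, 30.0]
r6 = [1600.0, 1604.0, 1609.0, 1611.0, 1616.0, 1620.0]
r = pearson_r(temp, r6)   # close to 1 for this made-up example
```

A strong correlation like this would suggest the temperature coefficients in use are not fully compensating the temperature dependence.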
]]></description>
					                    <pubDate>Thu, 18 Dec 2014 12:05:44 +0000</pubDate>
                </item>
				                <item>
                    <title>tomikarp on Standard Lamp -test</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p56</link>
                    <category>WG2 - Algorithms</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p56</guid>
					                        <description><![CDATA[<p>Hello all,</p>
<p>we have already seen several replies with good practices for SL corrections. Here are my two cents.</p>
<p>1. Fitting a quadratic polynomial (vs. time) to a set of SL ratios may help cope with trends and curvature in the changes.<br />
2. Regardless of what type of fit is used, the difference between the fitted and un-fitted data is a good metric for how well the fit describes the changes, especially if those are fast and/or non-linear. This can also reveal steps in the record.<br />
3. Routinely calculating the correlation between R6 and temperature may help prevent running with non-optimal temperature coefficients. This can be done together with the fit against time, to separate the two effects.<br />
4. While not strictly on the theme of this topic, we need to make sure enough SL tests are done every day to be representative of the state of the instrument at the different temperatures throughout the day.<br />
5. Whatever function is used, comparing the data extrapolated to today with the actual data from today is an important measure of the quality of the model.</p>
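Points 1 and 2 above can be sketched like this (a synthetic R6 series, with `numpy.polyfit` used purely for illustration):

```python
import numpy as np

# Synthetic daily R6 series: a slow quadratic drift plus noise,
# purely illustrative of the fitting approach.
rng = np.random.default_rng(0)
t = np.arange(90.0)                      # day number
r6 = 1610.0 + 0.05 * t - 0.0004 * t**2 + rng.normal(0.0, 2.0, t.size)

# 1. Fit a quadratic polynomial vs. time.
coeffs = np.polyfit(t, r6, deg=2)
fitted = np.polyval(coeffs, t)

# 2. The residuals measure how well the fit describes the changes;
#    large or structured residuals can reveal steps in the record.
residuals = r6 - fitted
rms = np.sqrt(np.mean(residuals**2))
```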
<p>Cheers,</p>
<p>Volodya</p>
]]></description>
					                    <pubDate>Thu, 18 Dec 2014 12:05:17 +0000</pubDate>
                </item>
				                <item>
                    <title>tomikarp on Standard Lamp -test</title>
                    <link>http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p55</link>
                    <category>WG2 - Algorithms</category>
                    <guid isPermaLink="true">http://eubrewnet.aemet.es/cost1207/forum/wg2/standard-lamp-test/#p55</guid>
					                        <description><![CDATA[<p>Hi Tomi</p>
<p>I asked Hugo to send me some information on how we apply the standard lamp correction at RMIB.<br />
This is the answer I got:</p>
<p>" RMI has developed the following procedure to take into account the results of the SL tests:<br />
- First a visualisation tool is used to see the evolution of the R6 (R5) readings.<br />
It shows (monthly) means with the standard deviation. If there is a gradual change<br />
then at a certain point in time a monthly mean is used to interpolate the ETC linearly<br />
from the previous point according to the change in SL reading. This is done with an<br />
off line program that reprocesses all the data. The distances in time between these<br />
points are to be chosen in such a way that the linear aproximation is adequate.<br />
If a sudden jump is detected then the corresponding new ETC is also applied as a jump.<br />
- Before applying such changes the ensemble of information has to be checked (i.e. comparison<br />
with co-located instrument, instrument in the vicinity, satellite data) in order to be sure that it is a<br />
real change in the instrument. This has as a consequence that the corrections with respect to the<br />
SL tests can only be done a posteriori."</p>
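A minimal sketch of the linear ETC interpolation described above (all anchor points and constants are hypothetical):

```python
def interpolate_etc(day, day0, etc0, day1, etc1):
    """Linearly interpolate the ETC between two anchor points chosen
    from (monthly-mean) SL readings, following the procedure sketched
    above. All numbers here are illustrative."""
    frac = (day - day0) / (day1 - day0)
    return etc0 + frac * (etc1 - etc0)

# Hypothetical anchors: the ETC shifts from 3250 to 3280 over 60 days,
# in line with a gradual change in the SL reading.
etc_mid = interpolate_etc(30, 0, 3250.0, 60, 3280.0)  # 3265.0
```

A sudden jump would instead be applied as a step: the new ETC is used from the day of the jump onwards, with no interpolation.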
<p>Best regards<br />
Veerle</p>
]]></description>
					                    <pubDate>Thu, 18 Dec 2014 12:04:58 +0000</pubDate>
                </item>
				    </channel>
	</rss>