Account solarsw@msslae
======================

This account is used to create the EIS flare catalogue based on the GOES SXR
flare list.

NOTE: In order to generate the EIS flare catalogue, the AIA 193 FITS files,
GOES SXR light-curve files and GOES event (GEV) files must all be available.
The GOES SXR light-curve files and GEV files are part of the SolarSoft
database and are updated frequently from the master copy at NASA/GSFC. They
are stored under the $SSWDB/goes/xray and $SSWDB/gev directories
respectively.

AIA images are copied by:  /disk/solar5/solarsw/aiaDATA/fov_update.sh
While being processed, files are in /disk/solar5/solarsw/aiaDATA/giftemp
Output is stored in /disk/solar/solarsw/hinode/eis/eis_imagetool/gif/2021
The last image is defined in /disk/solar5/solarsw/aiaDATA/.AIAimgs_update
If something goes wrong, the date in this file needs to be changed.

The start date for the flare cataloguing software is derived from the last
record in the file eisflare_json.txt_master  <<<

The flare catalogue is stored in /disk/solar5/solarsw/flare/eisflarecat
Files are in /disk/solar5/solarsw/flare/eisflarecat/2021

NOTE: When the code adds new flares to the flare catalogue JSON files it
also duplicates the lines for earlier flares in the year. Before copying to
msslxh, use the commands in cmd.txt to remove these duplicates from
2021_json.txt and then rebuild eisflare_json.txt
Duplicate this to form a new eisflare_json.txt_master  <<<

Because the use of certificates is prohibited within MSSL, the files need to
be copied to hinode@msslxh by hand each week. The commands needed to make
the copy are at the end of the cron script; note that the ORDER of the
commands should be as in the script.

[*** This means that these images cannot be seen by the Thumbnail code and a
separate copy is made!!????]


Account solarsw@msslxs
======================

The following copies the Level-0 FITS files to MSSL (rcp from DARTS).

cron SCRIPT name: /home/solarsw/IDL_CRON/darts_rsync_server_eis_grl.sh

You can tell whether the L0 data are in using:
   ls -ldt /disk/d2/hinode/eis_level0/mission/2021/*/* | head -15

Note that the copy of the Level-0 FITS files at RAL is done independently;
you need to log into an account at RAL to see the tasks.
Account at RAL: solarb@solar.ads.rl.ac.uk
(access only from solarsw@msslxs; certificates are used, so no password is
needed)

---

The following updates the AIA images that are used in creating the
thumbnails; it is analogous to the code on msslae for the flare catalogue.

cron SCRIPT name: /disk/d2/hinode/aiaDATA/fov_update.sh
              =>  /disk/d2/hinode/aiaDATA/downaia193_auto.sh

The update is done from the date in /disk/d2/hinode/aiaDATA/.AIAimgs_update
(Note: set this to the day before you want to update from.)

The AIA images are stored under:  /disk/d2/hinode/xrt/gif/2021/*
(Yes - SDO/AIA images stored in a directory named hinode/xrt !!)

Note: the AIA data should run ahead of when the EIS data become available.
The year directories and month subdirectories need to be created by hand;
a sketch is given below. The first 7 days of 2015 are missing (I started
making the subdirectories late).
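A minimal sketch for pre-creating the year and month subdirectories of the
AIA image store, assuming only the year/month levels need to exist in
advance (the year is just an example; mkdir -p is safe to re-run):

   YEAR=2021
   for MONTH in 01 02 03 04 05 06 07 08 09 10 11 12; do
       mkdir -p /disk/d2/hinode/xrt/gif/$YEAR/$MONTH
   done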
---

The L0 FITS files and the AIA images are input to the code that produces the
L2 files and the thumbnails.

cron SCRIPT name: /home/solarsw/IDL_CRON/Tbatch.cocos.rdb-2.sh

It works from the control file: /disk/d2/hinode/pipe_logs/EIStoday.log

L0 FITS files are stored under: /disk/d2/hinode/eis_level0/mission/2021/*/*
For example, to build control files:
   cd /disk/d2/hinode/eis_level0/mission/2021/
   ls -1 02/*/* > /disk/d2/hinode/pipe_logs/EIStoday.log
   ls -1 03/1*/* 03/2*/* 03/3*/* 04/0*/* 04/1*/* > control_file

Thumbnails are stored under: /remote/thumbs/eis_gifs/2021/*/*
L2 FITS files are stored under, e.g., /disk/d2/hinode/eis_level2/2015/11
   ls -lr /disk/d2/hinode/eis_level2/2021/*/*/* | head

---

NOTE: If the pipeline screws up, it may be necessary to delete all the L2
files within the time period being processed before it will run properly.
It does not seem to like having existing files out there...

---

In order to update the database, the following files need to be created and
copied to thumbs@msslxh:
   cd /disk/d2/hinode/eis_level0/mission/2021
   ls -1 04/2*/* > /disk/d2/hinode/pipe_logs/need_eXRT.list.uniq.manual
   ls -1 /disk/d2/hinode/eis_level2/2021/04/2*/* | sed 's/.gz//g' \
       > /disk/d2/hinode/pipe_logs/need_l2.list.uniq.manual

---

The L2 FITS files are copied to the ADS archive at STFC/RAL using cron.

cron SCRIPT name: /home/solarsw/IDL_CRON/fits_archive_update.sh
              =>  /home/solarsw/IDL_CRON/fits_onweb_update.sh
              =>  /home/solarsw/IDL_CRON/rsync_msslxs_ads_level2.sh

The update is done from the date in ./.checked_last under
/disk/d2/hinode/pipe_logs/
A log of the files copied is created in /disk/d2/hinode/pipe_logs, for
example /disk/d2/hinode/pipe_logs/need_l2.list.uniq.20210807

---

What does this do????? [[[[May make the control files for the cocos stuff...]]]]
   ls -lt /disk/d2/hinode/pipe_logs/2021*/eis_process_2021*.log | head -20
   tail -200 /disk/d2/hinode/pipe_logs/20160224/eis_process_20160224T144801.log
   ls -r /disk/d2/hinode/xrt/gif/2015/*/* /disk/d2/hinode/xrt/gif/2016/*/* | head
   ls -1 /disk/d2/hinode/eis_level2/2016/03/1*/* /disk/d2/hinode/eis_level2/2016/03/2*/* \
         /disk/d2/hinode/eis_level2/2016/03/3*/* /disk/d2/hinode/eis_level2/2016/04/0*/* \
         | sed 's/.gz//g' > need_l2.list.manual
   cat /remote/hinode/pipe_logs/need_eXRT.list.uniq.20160507 | grep 201603 > need_eXRT.list.uniq.manual


Account thumbs@msslxh
=====================

This account is used to update the catalogues of Level-0 and Level-2 data.
Tabular presentation of the available studies will only appear on the Web
site (hinode@msslxh) if the tables in the database have been updated.

Cron script (automatic update): /home/thumbs/update_hinodeDB.sh
Needs files:
   cp -p /remote/hinode/pipe_logs/need_eXRT.list.uniq.$today_date .
   cp -p /remote/hinode/pipe_logs/need_l2.list.uniq.$today_date .
These files are copied via cross-mounted disks from solarsw@msslxs.
The files are produced on msslxs by script ?????
   fits_archive_update.sh
   fits_archive_update_grl.sh
Use fits_onweb_update_grl.sh to catch up ???
Needs:
   need_eXRT.list.uniq
   checked.$lastup_date_start
   ./.checked_last
The code expects these files to be for TODAY. It fails if this is not the
case; this could make catching up difficult!!!
Outputs: writes to the Hinode database.

---

The databases can also be updated MANUALLY.

Cron script (manual update): /home/thumbs/update_hinodeDB.manual.sh
Needs files:
   /home/thumbs/need_eXRT.list.uniq.manual
   /home/thumbs/need_l2.list.uniq.manual
Create the files under solarsw@msslxs (see above) and copy them here; the
full manual sequence is sketched below.
   cp -p /remote/hinode/pipe_logs/need_l2.list.uniq.manual .
   cp -p /remote/hinode/pipe_logs/need_eXRT.list.uniq.manual .
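For reference, the whole manual update can be assembled into one sequence
from the steps above. This is only a sketch: the 04/2* globs are an example
period, and invoking update_hinodeDB.manual.sh directly (rather than leaving
it to cron) is an assumption.

   # On solarsw@msslxs: build the manual control lists for the period of interest
   cd /disk/d2/hinode/eis_level0/mission/2021
   ls -1 04/2*/* > /disk/d2/hinode/pipe_logs/need_eXRT.list.uniq.manual
   ls -1 /disk/d2/hinode/eis_level2/2021/04/2*/* | sed 's/.gz//g' \
       > /disk/d2/hinode/pipe_logs/need_l2.list.uniq.manual

   # On thumbs@msslxh: pick up the lists over the cross-mount and update the database
   cd /home/thumbs
   cp -p /remote/hinode/pipe_logs/need_eXRT.list.uniq.manual .
   cp -p /remote/hinode/pipe_logs/need_l2.list.uniq.manual .
   ./update_hinodeDB.manual.sh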
Note that the directories containing the FITS files are visible on msslxh -
for example, see /remote/hinode/eis_level0/mission/2016/02/28
Therefore, the control lists created under solarsw@msslxs could just as
easily be created on this account. This would simplify the creation of the
Level-0 catalogue since all the control would be within the one account.
NEED to be able to dump which files are already listed in the catalogue!


Account hinode@msslxh
=====================

This account manages the Web pages that are visible on the Hinode Web site.
Note that "everything" that is displayed should have been created on other
accounts and/or machines.

Everything that is displayed is stored under the directory:
   /disk/d1/hinode/apach*******/webapps/SolarB/

NOTE: Although the version of Apache has been updated, we have left
everything under apache-tomcat-5.5.20 and put a pointer in the new (latest)
Apache config file. This ensures that all hardwired addressing in the many
scripts still works.

Flare Catalogue:
----------------

The Flare Catalogue pages grab information from JSON files that are stored
under /disk/d1/hinode/apach*******/webapps/SolarB/eisflare/
The files with names like 20**_json.txt cover the flares for a single year;
the file eisflare_json.txt is all the years concatenated together.
The files are created on solarsw@msslae and copied (by hand) onto msslxh;
USE THE COMMANDS to get rid of duplicates and order the records.

NOTE: The order of the commands when copying from solarsw@msslae is
important. If a file like 2017_json.txt gets renamed to a backup file and
not replaced, then the flare catalogue for that year cannot be viewed and
the JSP code gets upset...

Science Nuggets:
----------------

The Science Nuggets are stored under
   /disk/d1/hinode/apach*******/webapps/SolarB/nuggets/
The image that appears at the top of the page is determined by the script
update_nugget_image.sh, which is run as a cron job at 06:00; this guesses
using the latest entry made in the web script eisnuggets.jsp

Wiki:
-----

The pages of the EIS Wiki are under
   /disk/d1/hinode/safe_JSPWiki/e28/eiswiki_pages/
In order to stop hacking/inappropriate additions, the protection on all the
pages has been changed so that only a few people can edit them.

Archive Search:
---------------

The EIS data search pages grab information from the SQL database and create
the displayed tables. The entries in the database are made from account
thumbs@msslxh. Note that if the information about the FITS files is not in
the database then it will appear as if the FITS data are not available even
though they are.

The thumbnail images used to aid selection of the EIS data are produced by
account solarsw@msslxs using script ?????
The thumbnails are stored in directories with names of the form:
   /home/hinode/solarb/DEV/eis_gifs/2016/02/18/eis_l0_20160218_104427.fits
This may look like a FITS file but it is actually a directory; the form of
the name helps relate the thumbnails to the file itself. Note that if the
thumbnails are not present in the directory, the data search page results
will show "Unavailable" against the links (a quick check is sketched below).
DO THINGS NEED TO BE REGENERATED WHEN THE THUMBS HAVE BEEN MADE?
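A quick way to see whether the thumbnails exist for a particular observation
is to list its thumbnail directory (the name below is the example from
above); if the directory is missing or empty, the search page will show
"Unavailable":

   ls /home/hinode/solarb/DEV/eis_gifs/2016/02/18/eis_l0_20160218_104427.fits/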
The Level-0 FITS files that can be accessed through the Web pages are stored
at RAL. Note that this copy is made independently from the one at MSSL - you
need to log into that account at RAL to see the tasks.
Account: solarb@solar.ads.rl.ac.uk (access from solarsw@msslxs)

The Level-2 FITS files that can be accessed through the Web pages are also
stored at RAL. These files are generated at MSSL and copied to RAL by a
script at RAL?????? (but see the fits_archive_update.sh cron on
solarsw@msslxs above, which copies L2 files to the ADS archive at STFC/RAL).