Automating OPUS GPS Post-Processing

Overview

We need to post-process data from the Trimble NetRS GPS units.

OPUS

OPUS (the Online Positioning User Service) is a web-based GPS post-processing service run by NOAA. It is found at http://www.ngs.noaa.gov/OPUS/index.html.

Automating a CGI Form Upload

OPUS is a CGI script that gets its parameters from an HTTP POST. In addition to the parameters, it also expects the file to be processed. A program called curl can be used to do an HTTP POST from the command line using the -F option. There is more in the curl manual under the heading POST (HTTP): http://curl.haxx.se/docs/manual.html
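
As a generic illustration of the -F option (the field names, file name, and URL here are placeholders, not the OPUS ones), each -F pairs a form field with a value, and an '@' prefix attaches a local file:

  curl -F name=value -F upload=@localfile.txt http://example.com/upload.cgi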

curl can be installed on an Ubuntu machine with the following command:

 sudo aptitude install curl

In order to find the correct parameters to use in the POST, the HTML source of the form must be analyzed. There is a script called formfind from the curl website that automates that process; its output supplies the form field names used in the curl command below. I've started an svn project called 'opusdo' for this work, and formfind is available there:

http://seamonster.jun.alaska.edu/svn/seamonster/opusdo
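
The exact invocation depends on the formfind version in the svn project; the sketch below assumes it is the Perl script that reads an HTML page on standard input, and the saved page name is just a placeholder:

  # fetch the OPUS form and list its fields (formfind usage is an assumption; check its header comments)
  curl -s http://www.ngs.noaa.gov/OPUS/index.html > opus_form.html
  perl formfind < opus_form.html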

When formfind is run against the OPUS upload page, it reports the form parameters. From that output we can get all the parameters we need for the curl command:

  curl -F email_address='email_address' -F uploadfile=@'filename(.08o)' -F ant_type=TRM41249.00 -F height=0.0 -F OPUS-RS=OPUS-RS http://www.ngs.noaa.gov/OPUS-cgi/OPUS/Opus-rsup.prl

The result of this command is the HTML page that would come up if you filled out the form in a web browser.
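
To process more than one file, the upload can be wrapped in a small shell loop. This is only a sketch: the e-mail address and the output file names are placeholders, not part of the current setup.

  #!/bin/sh
  # Hypothetical wrapper: submit every RINEX obs file in the current directory
  # to OPUS-RS and save the returned HTML page next to it for inspection.
  EMAIL="someone@example.edu"    # placeholder address
  for f in *.08o; do
      curl -F email_address="$EMAIL" \
           -F uploadfile=@"$f" \
           -F ant_type=TRM41249.00 \
           -F height=0.0 \
           -F OPUS-RS=OPUS-RS \
           http://www.ngs.noaa.gov/OPUS-cgi/OPUS/Opus-rsup.prl \
           > "${f%.08o}_opus_response.html"
  done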

Installing teqc

There is a package from UNAVCO called teqc that runs under Linux. It is a toolkit for solving pre-processing problems: translation, editing, cut/splice, and quality checking. It can convert .dat files from the Trimble NetRS into RINEX files, but since the OPUS website states that there are bugs in this part of teqc, we do the translations with the Trimble rutils and use teqc only to cut and splice our data files into different time windows.

From: http://facility.unavco.org/software/teqc/teqc.html

Our servers are 64-bit Linux machines with either Intel or AMD processors. UNAVCO has a dynamically linked 64-bit version of teqc. Download and unpack it in a working directory with the following commands:

  wget http://facility.unavco.org/software/teqc/development/teqc_Lx86_64d.tar.Z
  tar xzvf teqc_Lx86_64d.tar.Z

This extracts a single executable file. Install it in /usr/local/bin:

  sudo cp teqc /usr/local/bin

You can then run it from the command line by typing 'teqc' in any directory.
To extract the first 2 hours of data from the RINEX obs file 'file_to_cut.08o', use

  teqc +dh 2 file_to_cut.08o > new_file_name.08o

The 'd' stands for delta. Use '-' instead of '+' to take the window from the end of the file instead of the beginning, and use 'm' instead of 'h' to specify minutes instead of hours.
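
Following that pattern, for example (the output file names are arbitrary):

  teqc -dh 2 file_to_cut.08o > last_2_hours.08o       # the last 2 hours of the file
  teqc +dm 30 file_to_cut.08o > first_30_minutes.08o  # the first 30 minutes of the file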

Setting up Procmail

Procmail is both a mail processor and a mail delivery agent (MDA) that can be used to automatically process and deliver incoming mail messages. We use it to further process the OPUS-RS solutions that arrive by mail. This in-depth tutorial was used to set up procmail to send the OPUS solutions to the server that holds the database and to feed them into the script ('netrs_db_insert.pl') that inserts the data into the database.
Two files have to be set up in the home directory: a '.procmailrc' file with the following lines:

   SHELL=/bin/sh
   PMDIR=$HOME/Procmail
   LOGFILE=$PMDIR/pmlog
   MAILDIR=$HOME/Maildir
   :0:
   * ^Subject:.*OPUS-RS solution
   | ssh 137.229.208.19 "cat | ./opusdo/netrs_db_insert.pl"

and a '.forward' file with the following line:

   "| /usr/bin/procmail"

SSH Keychain

We translate the raw NetRS files, send the RINEX files off to OPUS, and receive the solutions on one computer, but the database is located on a different computer. This means we need to send the data from one machine to the other. The '.procmailrc' includes an ssh command that sends the data to the other machine. In order for this to work without a password, we need to set up an ssh keychain. This tutorial was followed to make this transfer work.
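
The details depend on the tutorial that was followed (the 'keychain' tool manages an ssh-agent across logins), but the basic passwordless setup looks roughly like this; the remote address comes from the recipe above, everything else is generic:

  ssh-keygen -t rsa                        # create a key pair; see the tutorial for passphrase/agent handling
  ssh-copy-id 137.229.208.19               # install the public key on the database machine
  ssh 137.229.208.19 "echo connection ok"  # should now succeed without a password prompt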

Data Flow

All the scripts mentioned are stored in subversion in the directory 'opusdo'.

  • The NetRS records .T00 files that need to be put into the 'trimble/temp' directory on seamonsterak.
  • Then 'netrs_conversion.py' can be run (all other scripts are called automatically; a scheduling sketch follows this list):
  ./netrs_conversion.py
  • The netrs_conversion.py script takes all files in this directory, converts them from .T00 to .dat to RINEX, stores the .T00 and RINEX files in directories named after the station, pipes the RINEX files into cut_netrs.pl, and deletes all the files in 'trimble/temp'.
  • cut_netrs.pl takes the received daily files and cuts them into 1-hour files (the length can be adjusted as desired). These 1-hour files are then sent out to OPUS for post-processing.
  • The '.procmailrc' recipe then takes the OPUS-RS solutions and sends them to the computer with the database (right now this is set up on the nsrl1 machine).
  • There the position data and other information are extracted from the solution by netrs_db_insert.pl and inserted into the 'platform_loc' table of the database.
  • This table is enabled for GeoServer so that the data points can be viewed in Google Earth or OpenLayers.
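
If the conversion step should run unattended, one hypothetical way to schedule it is a cron entry on seamonsterak; the path, schedule, and log file name below are placeholders, not part of the current setup:

  # crontab entry (hypothetical): run the conversion pipeline every night at 01:00
  0 1 * * * cd /path/to/opusdo && ./netrs_conversion.py >> netrs_conversion.log 2>&1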