Ideas and Projects


Geoscience Sensor Web Ideas

Our most pragmatic, practical near-term idea is to put a robot into Auke Lake that can swim around measuring the temperature and water quality. But in the course of a project like this you also come up with some real wing-ding (or progressive) ideas, so let's put a few of those here.


Volcanology

Image: Cleveland Volcano erupting, May 2006


The Aleutian island chain hosts--or rather is built from--a tremendous number of volcanoes which threaten at any moment to erupt and bring down a trans-Pacific jetliner by clogging its engines. We'd like to build a monitoring network that sees heat plumes and senses eruptions in progress both seismically and acoustically. Total estimated cost for a robust system is 24 million dollars. The cost of a Boeing 747 is about 200 million dollars; the cost in lives considerably more. In the last 15 years, 80 commercial jetliners have encountered ash clouds, and 15 of those planes experienced engine shutdowns as a result. There have been no crashes yet, but as a Geophysical Institute professor points out, we're living on borrowed time. See this USGS link for more of an overview, including a pointer to the Volcanic Ash Advisory Centers. The Alaska Volcano Observatory is another good resource; its website is found here.


Hydrology

We have a series of valleys coming down off the Juneau Icefield. Supposing we do well with Lemon Creek and arrive at a viable instrumentation strategy, we could expand to these other valleys in like manner, investigating the same science questions and in effect comparing across differing degrees of deglaciation to synthesize a single 'end-to-end' picture of deglaciation.


More concretely, a direct extension of Year 1 would be ecological monitoring: soil temperature, freeze-up, snow depth, and so on.


From a slashdot article: Sensor Grid Predicts Imminent Flooding

By kdawson on lifting-all-boats

An anonymous reader writes, "NewScientistTech has an interesting story about a river sensor network that not only measures water depth and flow, but also forms a wireless computing grid to calculate possible flooding scenarios." From the article: "If the river's behavior starts to change, the network uses the data collected to run models and predict what will happen next. If a flood seems likely - because it is rapidly rising and moving quickly - the network can send a wireless warning containing the details... [A researcher said:] 'One end goal would be that people living in areas that flood can install these themselves. They are simple and robust enough to make that possible.'"


Glaciology

Inter- and Sub-Glacial Probes

Jane Hart's group in the UK has designed and deployed wireless sensors under Briksdalsbreen (a glacier) in southwestern Norway. They drill down to emplace the probes, but Martin Truffer points out that a probe left on the surface will submerge in the natural course of events over a few years (depending on glacier size and other parameters). Personally I don't have the patience to let a probe submerge naturally, and I favor the idea of using an RTG-powered device that works its way to the bed in a matter of a couple of minutes.


However it gets down there, it must be able to withstand enormous pressure and abuse, perhaps scraping along the base of the glacier. Data recovery via long-wavelength radio is perfectly reasonable, as the Hart group has shown, but another method would be a digital acoustic modem.


For more thoughts on exploration of three-dimensional spaces see also the Rubber Ducky Experiment described below.


The Rubber Ducky Experiment

This idea is rather involved -- oceanography is the discipline -- but it is interesting to ponder.


From this website: "On January 10, 1992 a 12.2 m container with 29,000 bathtub toys (including rubber ducks) washed overboard from a container ship at 44.7°N, 178.1°E. Ten months later the toys began washing ashore near Sitka, Alaska. A similar accident on May 27, 1990 released 80,000 Nike-brand shoes at 48°N, 161°W when waves washed containers from the Hansa Carrier (Figure 10.17). The spill and the eventual recovery of the toys proved to be a good test of a numerical model for calculating the trajectories of oil spills developed by Ebbesmeyer and Ingraham (1992, 1994)."


To continue the thought, this is an example of Lagrangian observation where the trajectory of a fluid particle is followed using a marker. Suppose that we have the ability to magically create this device:

Cost $40
GPS
Solar panel + rechargeable battery
Buoyancy control chamber
Microcontroller
Memory chip (32GB)
802.15.4 radio
Small whip antenna (sometimes referred to as a 'rubber ducky' antenna)
Water pressure-proof casing
Temperature sensors top and bottom
Pressure sensor
Salinity sensor
Dissolved oxygen sensor
Lead-free construction
Rubber ducky paint job


Notice two things about this rubber ducky: first, it operates at very low power, and second, it makes no attempt to return its data in real time. Typical drifter buoys call in their data via satellite modem, but satellite modems would be prohibitively expensive for a large number of units.


Now, given this magical device, suppose further that we have a twenty-million-dollar hardware budget and we blow it on one million rubber duckies. (The manufacturer gives us a 50% bulk discount, so the unit price drops to $20.) We will also need a friend with another twenty-million-dollar hardware budget; let's say his name is Alberto Behar and he works at NASA-JPL in Pasadena.


We power up and program our rubber duckies in batches of 10,000, lining them up in 100 x 100 squares. This yields 100 lots, each with slightly different data acquisition rules. The program they will run is as follows (a code sketch appears after the list):


 1. Acquire/store: { ID, position, time, temps, salinity, DO } tuples N1 times over T1 minutes
 2. Go slightly negative in buoyancy for T2 minutes
   - Acquire/store: { ID, time, T, P, S, DO } tuples N2 times
 3. Go slightly positive in buoyancy for T3 minutes
   - Acquire/store: { ID, time, T, P, S, DO } tuples N3 times
 4. Go to Step 1
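
To make the duty cycle concrete, here is a minimal sketch in Python of the loop one ducky might run. This is not real firmware: the driver functions (read_gps, read_sensors, set_buoyancy) are hypothetical stubs, and N1/T1, N2/T2, N3/T3 are the per-lot parameters described above.

import random
import time

# Hypothetical hardware stubs -- real drivers would talk to the GPS,
# the sensor suite, and the buoyancy control chamber.
def read_gps():
    return (random.uniform(-90, 90), random.uniform(-180, 180))

def read_sensors():
    return {"temp": 5.0, "pressure": 1.0, "salinity": 32.0, "do": 8.0}

def set_buoyancy(sign):
    pass  # +1: slightly positive (rise), -1: slightly negative (sink)

def acquire(log, ducky_id, n, t_minutes, with_position):
    # Store n tuples spread evenly over t_minutes; GPS fixes only make sense at the surface.
    for _ in range(n):
        rec = {"id": ducky_id, "time": time.time(), **read_sensors()}
        if with_position:
            rec["position"] = read_gps()
        log.append(rec)
        time.sleep(t_minutes * 60.0 / n)

def run(ducky_id, p, log):
    # Steps 1-4 from the list above, repeated forever.
    while True:
        acquire(log, ducky_id, p["N1"], p["T1"], with_position=True)   # step 1 (at surface)
        set_buoyancy(-1)                                               # step 2: sink slightly
        acquire(log, ducky_id, p["N2"], p["T2"], with_position=False)
        set_buoyancy(+1)                                               # step 3: rise again
        acquire(log, ducky_id, p["N3"], p["T3"], with_position=False)
        # step 4: back to step 1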


In addition we have some data muling rules: should RD-A see RD-B, each may copy the other's personal dataset into its own memory (working backwards in time). Notice that if one sample is acquired per minute, one year's worth of data occupies (tuple size) x (roughly half a million tuples) bytes. At 20 bytes per tuple that is about 10 MB per year, so a single rubber ducky's 32 GB memory can hold roughly 3200 ducky-years of data. We would expect certain rubber duckies to encounter a fair number of other rubber duckies and eventually find themselves muling a fairly large data cargo. (Rubber duckies may also mule other muled datasets as connectivity allows, merging redundant data.)
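
Here is a rough Python sketch of the exchange rule, assuming each record carries an id and a timestamp as above. The function name and the fixed-slot memory model are illustrative, not a real protocol; the point is simply that records are copied newest-first and redundant copies are dropped.

def mule(my_log, their_log, free_slots):
    # Copy the other ducky's records into my_log, newest first, skipping duplicates.
    have = {(r["id"], r["time"]) for r in my_log}
    for rec in sorted(their_log, key=lambda r: r["time"], reverse=True):
        if free_slots <= 0:
            break
        if (rec["id"], rec["time"]) not in have:
            my_log.append(rec)
            have.add((rec["id"], rec["time"]))
            free_slots -= 1
    return my_log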


Note that the rubber duckies are passive drifters, but with longer submersion times they will be subject to deeper ocean currents.


We find 10,000 ocean-going vessels and give each one a crate of 100 rubber duckies with instructions to dump them overboard (the Jonah lift).


Finally, let's have Alberto Behar at JPL build robotic submarines that crawl the world's oceans in search of rubber duckies. We can also put microservers on ships so that they can scoop up rubber ducky information opportunistically along the shipping lanes. The robot, however, will read the data records it collects, figure out where other rubber duckies have been encountered, and adjust its course to seek them out. It must be capable of running underwater to avoid heavy seas, and it must be good at surveying beaches to find rubber duckies that have run aground. It will certainly have a satellite uplink for reporting home. Vehicles that frequent coastal areas can also be equipped with microservers to aid in data collection.
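
To picture the seeking behavior, here is an illustrative Python sketch: the sub scans the records it has collected so far for the most recent surface fix of each ducky it has not yet met, then steers toward the nearest such position. All names here are hypothetical, and the distance measure is deliberately crude.

import math

def next_waypoint(records, met_ids, here):
    # Return the last-known position of the nearest ducky we have not yet met.
    last_fix = {}
    for rec in records:
        if rec["id"] in met_ids or "position" not in rec:
            continue
        prev = last_fix.get(rec["id"])
        if prev is None or rec["time"] > prev["time"]:
            last_fix[rec["id"]] = rec
    if not last_fix:
        return None
    # A flat-earth distance is fine for ranking nearby candidates.
    return min((r["position"] for r in last_fix.values()),
               key=lambda p: math.hypot(p[0] - here[0], p[1] - here[1]))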


A variation on this experiment might make use of turtles.