Tales from the golden age of geospatial.

Just for fun, while we are all breathlessly waiting for the QGIS 3.0 packages to drop, how about those of us more “experienced” GIS folk entertain our younger brethren with tales of how GIS was done back in the olden days?

My favorite story came from my first GIS professor back in 1991. He had been tasked with a spatial analysis project for his PhD at the University of Toronto back in the late ’60s, when commercial GIS was just a glimmer in Jack Dangermond’s eye and everybody had to write their own spatial analysis code from scratch in FORTRAN. Dr. Wilson’s code and data were both stored on punch cards, and it took him an extra 1½ years to finish his doctorate because he had to move his pile of punch cards across campus in a wheelbarrow and could only work on days of low humidity, which are not that common in Toronto.

Fortunately, I never had to deal with punch cards. My first computer, a Commodore PET, let you store data on cassette tape. It took several minutes to load a few hundred lines of BASIC code, but storage was always an issue. My first GIS job was cancelled soon after I started because we calculated that I would need about 5 GB of hard drive space for storage plus another 10-15 GB for temporary workspace, and hard drive space at the time (1993) cost $1,000 per gigabyte. That makes my current phone worth about $128,000 in 1993 terms, just for the memory chip.

Dawn on Yellowstone Lake

My second job was a dream job as the GPS/GIS coordinator for a research group in Yellowstone National Park. Our GPS units cost about $10,000 each and stored data on PCMCIA cards that held 32 kilobytes and had a battery life of about 5 minutes. The “screen” was 4 lines of 40 characters each. Text only. We had research teams spread out all over the Yellowstone backcountry, and I spent the summer literally running miles to a field camp with a backpack full of empty data cards and fresh batteries, exchanging them for a pile of dead batteries and full data cards, and running those back to our headquarters in Cooke City to download the data and charge the batteries, only to repeat it all the next day.

Selective Availability was still on (introducing roughly 100 m of random error into the GPS signal), so in order to use the GPS devices for navigating with better than 100 m accuracy we needed a real-time differential base station. After poring over topo maps, I decided that Mt. Norris, at the junction of Soda Butte Creek and the Lamar River, was the best location to provide line-of-sight coverage for our projects. I mounted our base station inside a Coleman cooler with a motorcycle battery, VHF radio, solar panel, 12 V timer to save battery life at night, and both GPS and VHF antennas, and hauled it, along with a 12-pound “notebook” computer, the 4,000 vertical feet up the mountain. It took 3 trips to get it set up: 1) get all the equipment up and turn the GPS on to start collecting locations; 2) climb the mountain the next day with the computer to download the data; 3) climb the mountain a third day with a post-corrected average location to tell our base station where it was, so the differential correction signal would be correct.
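For anyone who never worked with DGPS: real differential correction is applied to individual satellite pseudoranges, but a position-domain sketch conveys the idea of what that cooler on Mt. Norris was doing once it knew its own surveyed location. The Python snippet below is purely illustrative (the coordinates and function names are made up), not anything we actually ran on the mountain.

    # Position-domain sketch of differential GPS correction. The base station
    # sits at a surveyed location (here, the post-corrected average of its own
    # fixes), compares that with where GPS currently says it is, and broadcasts
    # the offset so rovers can remove the same shared error. Real DGPS corrects
    # per-satellite pseudoranges; this only captures the idea.

    SURVEYED_BASE = (559_300.0, 4_972_100.0)  # made-up UTM easting/northing

    def base_correction(observed_base):
        """Offset between the base's surveyed position and its current GPS fix."""
        return (SURVEYED_BASE[0] - observed_base[0],
                SURVEYED_BASE[1] - observed_base[1])

    def apply_correction(observed_rover, corr):
        """Shift a rover fix by the base station's broadcast correction."""
        return (observed_rover[0] + corr[0], observed_rover[1] + corr[1])

    # With Selective Availability on, base and rover saw roughly the same ~100 m
    # error, so subtracting the base's error at the rover gives a far better fix.
    corr = base_correction((559_385.0, 4_972_040.0))
    print(apply_correction((561_212.0, 4_970_018.0), corr))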

And our solar panel was a bit weak for the purpose, so every week I went back up with a fresh battery, and every time I climbed that mountain I saw fresh grizzly sign and knew there was a family of bears living up there. Fortunately, I never saw them until we were bringing the station down at the end of the summer, when the sow and three two-year-old cubs bluff-charged us to within 12 feet.

Bears weren’t the only thing we had to keep our eyes open for.

All of this occurred during my undergraduate years at Montana State, where I earned a minor in GIS and Spatial Analysis without ever actually using GIS software, other than a few canned IDRISI exercises, because Arc/INFO only ran on UNIX workstations at the time and those were far too expensive to be wasted on undergraduate labs. I graduated with a very strong theoretical background but no hands-on experience: I could tell you how the algorithm to determine whether a point was in a polygon worked, but if you had put me in front of a computer and asked me to do a spatial join of a set of points and a set of polygons, I wouldn’t have known where to begin.
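For the curious, here is a minimal sketch of that classic point-in-polygon test, the even-odd ray-casting version we learned on paper. It is only an illustration in Python; the function name and the toy “spatial join” at the end are mine, not any particular package’s API.

    def point_in_polygon(x, y, polygon):
        """Ray-casting (even-odd) test: cast a horizontal ray from (x, y) and
        count how many polygon edges it crosses. An odd count means inside.
        `polygon` is a list of (x, y) vertex tuples; the last edge wraps around."""
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            # Only edges that straddle the ray's height can cross it.
            if (y1 > y) != (y2 > y):
                # x-coordinate where this edge crosses the ray's height.
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

    # A crude "spatial join": tag each point with the polygons that contain it.
    points = [(1.0, 1.0), (5.0, 5.0)]
    polygons = {"meadow": [(0, 0), (4, 0), (4, 4), (0, 4)]}
    for pt in points:
        print(pt, [name for name, poly in polygons.items()
                   if point_in_polygon(pt[0], pt[1], poly)])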

Fortunately, soon after I graduated, ESRI released ArcView, and I was finally able to get some hands-on experience with actual computer software; my life has been much improved ever since.

At least until I started my PhD in 2000 and was writing lots of Avenue code in ArcView 3.1 to analyze animal movement data. Six months later they came out with 8.0, and everything was Visual Basic and ArcObjects, but not really documented. It took me six more months to learn, so that was a year of my life gone, and I had to buy my own ArcView 8.0 license because the school wasn’t planning to upgrade for another year.

What are some of your stories from the olden days?

4 Replies to “Tales from the golden age of geospatial.”

  1. Wow, what an adventurous experience! Those must have been among the very first handheld GPS units? I had not even heard of GPS at that time.

    I am almost as old as you. My first GIS experience was in an early-’90s GIS class with “Osumap,” which I have never heard of again. I also took a Remote Sensing course with TNTmips and got hooked on image classification, such fun. I had a summer job that involved inputting some data into Pamap (an early Canadian GIS that was commercially successful for a while but no longer exists), only to find out when I actually got to take a course that I had done it all wrong.

    Still, everyone told me GIS was the way to go with my Geography degree, so I enrolled in a GIS Certificate program at the College of Geographic Sciences in Nova Scotia in 1994. Workstation Arc/Info. All that typing of long path names to change directories was the worst! And painstakingly calculating coordinates of where to place map elements in ArcPlot. What a big deal it was to print a map and how primitive they look now! But there are things I miss about workstation Arc/Info. The documentation was good. Algorithms, parameters, all explained with examples. You could figure out how to do anything, and if it did not work, there was an error message that actually told you what was wrong!

  2. What a great story. I am about as old as you. My first experience was as an undergrad in Forestry: I worked at a timber company one summer (1994) and was given a Trimble ProXL to collect the outlines of the cut blocks and plant blocks. SA was still on, so post-processing ruled the day. At the end of the day I had to download the GPS points, post-process the data, and then copy it to a floppy to send to head office. The best part: the system collected only points, and head office also wanted a plot of everything walked that day, so the pen plotter would dutifully plot each point by pounding the pen against the page. I only did this at night, as the hammering would get to everyone.

    During times of GPS unavailability I also had a chainsaw, so I could join the logging crews until we had enough satellites again, at which point I would start walking.

    As for GIS, it was ArcInfo 6 on a Sun workstation, and we had to program the “global coverage overlay” function as part of a Master’s assignment. One of the students figured out how to do it in AML, so when it was my turn, he mandated that we had to do the work in ArcSDL and write the app in C; that was a doozy.

  3. I was an intelligence analyst for the U.S. Army in the ’80s, and our idea of a layer was rolling a sheet of clear acetate over the map and marking features of interest with grease pencils. So if we wanted to intersect our minefields “layer” with our vehicle-accessibility “layer”, we would roll one over the map from the right and the other from the left. Our “ground control points” were little x’s drawn in the corners of the acetates that matched up with an x drawn on the map. Once it was all lined up, we taped it down to do our work.
