Objectives of lecture:
Trying to second-guess (or predict) technological progress
is extremely risky; so here we go!
Like all computer-based industries, GIS is currently going through
an adjustment to deal with the reality of the web. Lots of the
old thinking was thrown out as the network arose as a primary
vehicle for just about everything.
Map servers arose as a key application, and everything looks like
a browser to the user. Perhaps in 25 years the interface will
still seem like a browser-like application, but the apparent unity
of a single platform could splinter in many directions. The really
dumb client is likely to be replaced, but whether its successor
is Java or something else is beyond my crystal ball.
At one time, a famous computer scientist said: "I don't know
what the computer languages of the future will be like, but they
will be called 'FORTRAN'." He was wrong; it is
BASIC, but the point is the same. Visual Basic has very little
to do with the original (line-oriented, global data structures,
etc.) BASIC language from Dartmouth, but the name is part of making
one particular direction seem obvious. It does explain the silly
"DIM" statement (short for DIMENSION, taken from FORTRAN)
and the "SET" statement that you can now use only on
"objects", not variables...
My guess is that the glimmers of the future are already here.
In hindsight, we will point back to an evolution that seems smooth,
with continuous connections to this past we are currently living in.
So what is the obvious stuff we are missing in the current maze?
The "Spatial Database Engine", now ArcSDE, was added to the ESRI
product line in 1994 (short history), but only recently integrated
into version 8 as the "database" format.
No description here; lots of publicly available material
in .pdf...
Issue of database
and platform (see compatibility matrix).
Is a data warehouse approach really the wave of the future (or
simply a wave of the past that finally became possible)?
Focused on use, not database construction. The integrity
has to be verified, yes, but that isn't the top priority in the
user structure... So the lack of topology applies only to the data
as stored; the queries actually provide greater access to topological
analytical results.
Server-side strength: use of an Application Programmer's Interface
(API). Does this mean that users have to write programs, or remain
at the mercy of browser-like clients (= ArcView?)
What passes for strategy
at ESRI... BUT SDE is already here. The next generation has to
be something deeper (or does it?)
The Open GIS Consortium
(OGC to some) is trying to stake out another future. (ESRI of
course is a member, and so is just about everyone else...) The
program focuses on interoperability. This is a big issue, but
will it define the next generation, or simply ratify the current
generation?
Their Guide
lays out the approach (html version through TOC, or download
printable versions). The Abstract Specifications are the stage
they are in now, but a few implementations are being done. They
have another Initiative to deal with Web Mapping.
The critical difference between OGC and just yet another standard will be the concept of a "service".
ESRI is trying to grab a hunk of this future through their Geography Network. Their web services are still pretty underpowered, but a direction for the future?
Writing programs based on a "genetic code" that can be recombined: try a variant, subject it to some test, and allow offspring to survive if they are "more fit"...
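The cycle just described (vary, test, let the "more fit" survive) can be sketched as a toy genetic algorithm. This is purely illustrative; the genome, fitness test, and all names are invented for the sketch, not taken from any of the systems discussed here.

```python
import random

def fitness(genome):
    # The "test": here, fitness is simply the number of 1-bits.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Try a variant: flip each bit with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # Recombine two "genetic codes" at a random cut point.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=30, length=20, generations=50):
    random.seed(1)  # fixed seed so the sketch is repeatable
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Allow the fitter half to survive unchanged...
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]
        # ...and refill the population with mutated recombinations.
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(fitness(best))
```

Real genetic programming evolves program trees rather than bit strings, but the survive-if-fitter loop is the same shape.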
REAL object orientation: objects have a life of their own, with code (methods) and data; multiple agents are let loose in some "environment" and they interact... Some of these are geographic in orientation.
SWARM: Santa Fe Institute, now .org
Cormas: CIRAD, France
SME (Spatial Modeling Environment)
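The agent idea above can be made concrete with a minimal sketch (class names and the random-walk behavior are invented for illustration; SWARM, Cormas, and SME provide far richer environments): each agent carries its own data and method, and the environment merely hosts the interaction.

```python
import random

class Agent:
    """An object with a life of its own: data (position) plus code (step)."""
    def __init__(self, x):
        self.x = x

    def step(self, world):
        # The agent decides its own move; the environment does not.
        self.x = (self.x + random.choice([-1, 1])) % world.size

class World:
    """The shared 'environment' the agents are let loose in."""
    def __init__(self, size, n_agents):
        self.size = size
        self.agents = [Agent(random.randrange(size))
                       for _ in range(n_agents)]

    def tick(self):
        # One round of interaction: every agent acts in turn.
        for agent in self.agents:
            agent.step(self)

world = World(size=10, n_agents=5)
for _ in range(100):
    world.tick()
print([agent.x for agent in world.agents])
```

A geographic version would replace the one-dimensional ring with a raster or vector landscape, and the random walk with behavior driven by that landscape.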
Back to some basics:
Databases as a set of objects, relationships and axioms.
The problem is that we often confuse one with another.
Topology might have been used primarily as an integrity constraint (axiom) for data input stages (where we were in the 1970s) but it is also about relationships, and defining the objects in the first place. So, in practice it is hard to pry them apart.
Do we understand how to do multiple views of shared objects? Does "semantic interoperability" really comprehend the problem?
How do we enforce complex integrity constraints?
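One way to see how axioms, relationships, and constraints intertwine is a toy sketch (hypothetical helpers, not any vendor's API): ring closure is an axiom about a single object, while shared arcs are a relationship between neighboring polygons, and checking both is already an integrity constraint of the kind topology was used for.

```python
def ring_is_closed(ring):
    # Axiom about one object: a polygon boundary ends where it starts.
    return len(ring) >= 4 and ring[0] == ring[-1]

def edges(ring):
    # Relationship material: consecutive coordinate pairs define arcs.
    # Sorting each pair makes direction irrelevant for comparison.
    return {tuple(sorted((ring[i], ring[i + 1])))
            for i in range(len(ring) - 1)}

def share_a_boundary(ring_a, ring_b):
    # Integrity constraint between objects: touching polygons must
    # share identical arcs -- no slivers from re-digitized boundaries.
    return bool(edges(ring_a) & edges(ring_b))

# Two unit squares that share the arc from (1, 0) to (1, 1).
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
neighbor = [(1, 0), (2, 0), (2, 1), (1, 1), (1, 0)]
print(ring_is_closed(square), share_a_boundary(square, neighbor))
```

Even in this toy, the axiom, the relationship, and the constraint are hard to pry apart, which is the point made above.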
In many cases, the technology seems to be about playing the game of "configure the user", a power play to redefine the division of labor (and knowledge). Eric Raymond sets out a contrast between "cathedral" (centralized) and "bazaar" (decentralized) software development (article). His main observation is what he calls "Linus' Law": "Given enough eyeballs, all bugs are shallow". For the Linux OS, the software was freely disseminated, so the user was expected to actually find and remedy the bugs. In the ESRI case, there is a little tinge of distributed bug checking, but a huge residue of the centralist model. This isn't the only insight about software, but it demonstrates that there are multiple models, and that they are inherently social. (See also an article about the limits of the bazaar as a model.)
The "technical" equipment may be beyond our wildest
dreams, but it may be running with the same old tired theories
and assumptions. The human component (you!) will have to make
the difference and hold out for something more...