Web Services


Web Services
• A Web service is software designed for interoperable machine-to-machine interaction over a network.
• It has an interface described in a machine-processable format (specifically WSDL).
• Web services communicate using SOAP messages, typically conveyed over HTTP with an XML serialization, in conjunction with other Web-related standards.
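As a concrete sketch of the pattern these bullets describe, a SOAP message is an XML envelope that is typically POSTed over HTTP. The namespace, operation name and parameters below are invented for illustration, not a real API.

```python
# Build a minimal SOAP 1.1 envelope for a hypothetical "GetData" operation.
# The operation, namespace and parameter names are illustrative only.

def build_soap_envelope(operation: str, params: dict) -> str:
    """Return a SOAP envelope carrying one operation and its parameters."""
    body = "".join(f"<{k}>{v}</{k}>" for k, v in params.items())
    return (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        "<soap:Body>"
        f'<{operation} xmlns="http://example.org/aq">{body}</{operation}>'
        "</soap:Body>"
        "</soap:Envelope>"
    )

envelope = build_soap_envelope("GetData", {"parameter": "PM25", "date": "2004-07-21"})
# The envelope would normally be sent as the body of an HTTP POST, with
# Content-Type: text/xml and a SOAPAction header naming the operation.
```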
AQ Data and Analysis: Challenges and Opportunities
• Shift from primary to secondary pollutants. Ozone and PM2.5 travel 500+ miles across state or international boundaries, and their sources are not well established.
• New regulatory approach. Compliance evaluation based on 'weight of evidence' and tracking the effectiveness of controls.
• Shift from command & control to participatory management. Inclusion of federal, state, local, industry, and international stakeholders.
Challenges
• Broader user community. The information systems need to be extended to reach all the stakeholders (federal, state, local, industry, international).
• A richer set of data and analysis. Establishing causality, 'weight of evidence', and emissions tracking require more data and air quality analysis.
Opportunities
• Rich AQ data availability. Abundant high-grade routine and research monitoring data from EPA, NASA, NOAA and other agencies are now available.
• New information technologies. DBMSs, data exploration tools and web-based communication now allow cooperation (sharing) and coordination among diverse groups.
The Researcher’s Challenge
“The researcher cannot get access to the data;
if he can, he cannot read them;
if he can read them, he does not know how good they are;
and if he finds them good he cannot merge them with other data.”
Information Technology and the Conduct of Research: The User's View
National Academy Press, 1989
These resistances can be overcome through
• A catalog of distributed data resources for easy data ‘discovery’
• Uniform data coding and formatting for easy access, transfer and merging
• Rich and flexible metadata structure to encode the knowledge about data
• Powerful shared tools to access, merge and analyze the data
Recap: Harnessing the Winds
• Secondary pollutants along with more open
environmental management style are placing
increasing demand on data analysis. Meanwhile, rich
AQ data sets and the computer and communications
technologies offer unique opportunities.
• It appears timely to consider the development of a
web-based, open, distributed air quality data
integration, analysis and dissemination system.
• The challenge is to learn how to harness the winds of change, as sailors have learned to use the winds for going from A to B.
Uniform Coding and Formatting of
Distributed Data
• Data are now easily accessible through standard Internet protocols, but the coding and formatting of the data are very heterogeneous.
• On the other hand, data sharing is most effective if the codes/formats/protocols are uniform (e.g. the Web formats and protocols).
• Re-coding and reformatting all the heterogeneous data into a universal form on their respective servers is unrealistic.
• An alternative is to enrich the heterogeneous data with uniform coding along the way from the provider to the user.
• A third-party 'proxy' server can perform the necessary homogenization, with the following benefits:
– The data user interfaces with a simple, universal data query and delivery system (interface, formats...).
– The data provider does not need to change its system, and gets additional security protection since the data are accessed by the proxy.
– Reduced data-flow resistance results in increased overall data flow and data usage.
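The proxy-side homogenization described above can be sketched as follows: two providers deliver the same kind of observation in different codes and formats, and the proxy maps both onto one uniform schema on the way to the user. The provider formats, field names and parameter codes here are invented for illustration.

```python
# Sketch of third-party 'proxy' homogenization: provider-specific parsers
# re-code heterogeneous records into one uniform schema. All formats and
# field names are hypothetical.

def parse_provider_a(line: str) -> dict:
    # Provider A delivers "siteID;YYYYMMDD;pm25_value"
    site, date, value = line.split(";")
    return {"site": site, "date": f"{date[:4]}-{date[4:6]}-{date[6:]}",
            "param": "PM2.5", "value": float(value)}

def parse_provider_b(line: str) -> dict:
    # Provider B delivers "date,paramCode,site,value" with its own codes
    date, param, site, value = line.split(",")
    code_map = {"88502": "PM2.5"}   # provider-specific code -> uniform name
    return {"site": site, "date": date,
            "param": code_map.get(param, param), "value": float(value)}

def homogenize(records):
    """The proxy applies the right parser, yielding one uniform record stream."""
    parsers = {"A": parse_provider_a, "B": parse_provider_b}
    return [parsers[src](line) for src, line in records]

uniform = homogenize([("A", "STL01;20040721;18.4"),
                      ("B", "2004-07-21,88502,CHI02,22.1")])
```

Neither provider changed its system; only the proxy knows both dialects, which is what reduces the data-flow resistance for every downstream user.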
DataFed Services
• Dvoy Services offer a homogeneous, read-only access mechanism
to a dynamically changing collection of heterogeneous, autonomous
and distributed information sources.
• Data access uses a global multidimensional schema consisting of
spatial, temporal and parameter dimensions, suitable for data
browsing and online analytical processing, OLAP. The limited query
capabilities yield slices through the spatio-temporal data cubes.
• The main software components of Dvoy are wrappers, which
encapsulate sources and remove technical heterogeneity, and
mediators, which resolve the logical heterogeneity.
• Wrapper classes are available for geo-spatial (incl. satellite) images,
SQL servers, text files, etc. The mediator classes are implemented
as web services for uniform data access, transformation and
portrayal.
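The wrapper/mediator split above can be sketched in a few lines: wrappers hide *how* each source is read (the technical heterogeneity), while the mediator reconciles *what* the data mean (the logical heterogeneity, here a units mismatch). The class names, sources and unit conversion are stand-ins, not the actual Dvoy implementation.

```python
# Minimal wrapper/mediator sketch. Wrappers present one query interface over
# different backends; the mediator normalizes the results. All names and
# sources are invented for illustration.

class TextFileWrapper:
    """Wraps a delimited text source; removes technical heterogeneity."""
    def __init__(self, rows):
        self.rows = rows                      # stands in for reading a file
    def query(self, param):
        return [{"param": p, "value": v, "units": u}
                for p, v, u in self.rows if p == param]

class SqlWrapper:
    """Wraps an SQL source; same interface, different backend."""
    def __init__(self, table):
        self.table = table                    # stands in for an SQL result set
    def query(self, param):
        return [r for r in self.table if r["param"] == param]

class Mediator:
    """Resolves logical heterogeneity: converts everything to ug/m3."""
    def __init__(self, wrappers):
        self.wrappers = wrappers
    def query(self, param):
        out = []
        for w in self.wrappers:
            for r in w.query(param):
                value = r["value"] * 1000 if r["units"] == "mg/m3" else r["value"]
                out.append({"param": param, "value": value, "units": "ug/m3"})
        return out

med = Mediator([
    TextFileWrapper([("PM2.5", 18.4, "ug/m3")]),
    SqlWrapper([{"param": "PM2.5", "value": 0.022, "units": "mg/m3"}]),
])
results = med.query("PM2.5")   # uniform records from both sources
```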
DVOY Interfaces
Data Input
Data Output – Browser
• The DVOY interface is composed of data viewers and controllers, all displayed on a webpage.
• The web services and the preparation of the webpage interface are implemented through .NET (Microsoft).
• The graphic data display on the webpage uses an SVG plugin (Adobe).
• The DVOY controls are linked to the SVG plugin and to .NET through client-side JavaScript.
Data Output – Web Service
• The DVOY outputs are XML-formatted datasets suitable for chaining with processing or rendering services.
NSF-NOAA-EPA/EMAP (NASA)? Project:
Real-Time Aerosol Watch System
Real-Time Virtual PM Monitoring Dashboard.
A web-page for one-stop access to pre-set views of current PM monitoring data including surface
PM, satellite, weather and model data.
Virtual Workgroup Website.
An interactive website which facilitates the active participation of diverse members in the
interpretation, discussion, summary and assessment of the on-line PM monitoring data.
Air Quality Managers Console.
Helps PM managers make decisions during major aerosol events; delivers a subset of the PM
data relevant to the AQ managers, including summary reports prepared by the Virtual
workgroups.
Dvoy Federated Information System
• Dvoy offers a homogeneous, read-only access
mechanism to a dynamically changing collection
of heterogeneous, autonomous and distributed
information sources.
• Data access uses a global multidimensional
schema consisting of spatial, temporal and
parameter dimensions
• The uniform global schema is suitable for data
browsing and online analytical processing, OLAP
• The limited global query capabilities yield slices through the spatio-temporal data cubes.
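The "slices through spatio-temporal data cubes" idea can be sketched directly: a cube indexed by (time, lat, lon) supports two limited query shapes, a spatial slice (a map at one time) and a temporal slice (a time series at one point). The dimensions and synthetic values below are made up for illustration.

```python
# A toy (time, lat, lon) data cube and the two OLAP-style slice queries.
# Values are synthetic: cube[t][i][j] = 100*t + 10*i + j.

times = ["2004-07-20", "2004-07-21", "2004-07-22"]
n_lat, n_lon = 4, 5
cube = [[[100 * t + 10 * i + j for j in range(n_lon)]
         for i in range(n_lat)] for t in range(len(times))]

def map_view(t):
    """Spatial slice: the full lat x lon grid at a fixed time."""
    return cube[times.index(t)]

def time_series(i_lat, i_lon):
    """Temporal slice: one location across all times."""
    return [cube[t][i_lat][i_lon] for t in range(len(times))]

m = map_view("2004-07-21")      # a 4 x 5 map at one time step
ts = time_series(2, 3)          # the values at lat index 2, lon index 3
```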
Mediator-Based Integration Architecture (Wiederhold, 1992)
• Software agents (mediators) can perform many of the data integration chores.
• Heterogeneous sources are wrapped by software that translates between the local and the global language.
• Mediators (web services) obtain data from wrappers or other mediators and pass it on.
• Wrappers remove the technical heterogeneity, while mediators resolve the logical heterogeneity.
• The job of the mediator is to provide an answer to a user query (Ullman, 1997).
• In the database-theory sense, a mediator is a view of the data found in one or more sources.
[Diagram: User Query → View → Services (mediators) → Wrappers; after Busse et al., 1999]
Value-Added Processing in Service-Oriented Architecture
• Data, services and users are distributed throughout the network.
• Users compose data processing chains from reusable services.
• Intermediate data are also exposed for possible further use.
• Chains can be linked to form compound value-adding processes.
[Diagram: peer-to-peer network representation vs. service-chain representation; Chains 1–3 with data and control flows among services]

User Carries Less Burden
In a service-oriented, peer-to-peer architecture, the user is aided by software 'agents'.
User tasks:
• Find data and services
• Compose service chains
• Expose output
[Diagram: user, catalog and data services supporting these tasks]
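The chain composition described above can be sketched as ordinary function composition, with the intermediate results kept visible so another chain can reuse them. The three services below are trivial stand-ins for real access, processing and portrayal services.

```python
# Sketch of composing a value-adding chain from reusable services, with
# intermediates exposed for further use. Services and values are stand-ins.

def access(site):                      # stand-in data-access service
    return [18.0, 22.0, 36.0, 12.0]

def daily_average(values):             # stand-in processing service
    return sum(values) / len(values)

def render(value):                     # stand-in portrayal service
    return f"avg PM2.5 = {value} ug/m3"

def run_chain(services, arg, trace):
    """Apply services in order; record each intermediate result in `trace`."""
    for s in services:
        arg = s(arg)
        trace.append((s.__name__, arg))
    return arg

trace = []
result = run_chain([access, daily_average, render], "STL01", trace)
# `trace` exposes the intermediates, so a second chain could start from
# daily_average's output without repeating the data access.
```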
Architecture of DATAFED Federated Data
System
After Busse et al., 1999
• The main software components of Dvoy are wrappers, which
encapsulate sources and remove technical heterogeneity, and
mediators, which resolve the logical heterogeneity.
• Wrapper classes are available for geo-spatial (incl. satellite) images, SQL servers, text files, etc. The mediator classes are implemented as web services for uniform data access, transformation and portrayal.
Integration Architecture (Ullman, 1997)
• Heterogeneous sources are wrapped by software that translates between the source's local language, model and concepts and the shared global concepts.
• Mediators obtain information from one or more components (wrappers or other mediators) and pass it on to other mediators or to external users.
• In a sense, a mediator is a view of the data found in one or more sources; it does not hold the data, but it acts as if it did. The job of the mediator is to go to the sources and provide an answer to the query.
Distributed Programming: Interpreted
and Compiled
• Web services allow processing of distributed
data
– Data are distributed and maintained by their custodians,
– Processing nodes (web-services) are also distributed
– ‘Interpreted’ web-programs for data processing can be created ad
hoc by end users
• However, ‘interpreted’ web programs are slow,
fragile and uncertain
– Slow due to large data transfers between nodes
– Fragile due to instability of connections
– Uncertain due to failures of data provider and processing nodes
• One solution is to ‘compile’ the data and
processing services
– Data compilation transforms the data for fast, effective access (e.g.
OLAP)
Interpreted and Compiled Services
Interpreted service:
• Processes are distributed.
• Data flow over the Internet.
Compiled service:
• Processes run in the same place.
• Data flow within the aggregate service.
• Controllers, e.g. zoom, can be shared.
[Diagram: the same chain (Point Access → Point Grid → Grid Render, with Point Render and PtGrid Overlay) shown once as distributed services with data and control flows, and once as a single compiled aggregate service]
Services Program Execution:
Reverse Polish Notation
Writing the WS program:
- Write the program on the command line of a URL call
- Services are written sequentially using RPN
- Replacements
Connector/Adaptor:
- Reads the service name from the command line and loads its WSDL
- Scans the input WSDL
- The schema walker populates the service input fields from:
- the data on the command line
- the data output of the upstream process
- the catalog for the missing data
Service Execution:
For each service:
- Read the command line, one service at a time.
- Pass the service parameters to the above Connector/Adaptor, which prepares the service.
- Execute the service.
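The RPN-style execution loop above can be sketched as follows: service names are listed sequentially on the "command line" of a URL, and each service consumes the output of the upstream service plus any named parameters from the query string. The service names, the `services=` parameter convention and the operations themselves are invented for illustration, not DataFed's actual syntax.

```python
# Sketch of RPN-style service execution driven by a URL command line.
# The parameter convention and service registry here are hypothetical.
from urllib.parse import parse_qsl, urlparse

SERVICES = {
    # each service takes (upstream_output, params) and returns its output
    "access": lambda _, p: [1.0, 2.0, 3.0, 6.0],          # would use p["site"]
    "average": lambda data, p: sum(data) / len(data),
    "render": lambda value, p: f"{p.get('title', 'value')}: {value}",
}

def execute(url: str):
    """Run the services named on the command line, one at a time, in order."""
    params = dict(parse_qsl(urlparse(url).query))
    chain = params.pop("services").split(",")
    result = None
    for name in chain:                      # read one service at a time,
        result = SERVICES[name](result, params)  # feed it the upstream output
    return result

out = execute("http://example.org/exec?services=access,average,render"
              "&site=STL01&title=avg")
```

Missing inputs that are neither on the command line nor produced upstream would, in the scheme described above, be filled in from a catalog.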
Monitoring (sensing) collects multi-sensory data from surface and satellite platforms.
Assessment turns data into knowledge for decision making and actions through analysis (science & engineering).
[Diagram: the air quality management cycle: Set Policy / Set Goals (CAAA, NAAQS) → Monitoring (Sensing) → Assessment (Compare to Goals, Plan Reductions, Track Progress) → Controls (Actions)]
An Application Program: Voyager Data Browser
[Diagram: Controls; Data Sources accessed through Wrappers (WSDL); App State; Data Flow Interpreter Core; I/O Layer with Device Drivers, Ports and Displays]
Web Services
• The web programs consist of a stable core and adaptive input/output layers.
• The core maintains the state and executes the data selection, access and render services.
• The adaptive, abstract I/O layers connect the core to evolving web data, flexible displays and a configurable user interface.
DataFed Topology: Mediated Peer-to-Peer Network, MP2P
Mediated Peer-to-Peer Network
• The mediator (broker) maintains a catalog of accessible resources.
• Peers find data and access instructions in the catalog.
• Peers get resources directly from peer providers.
Source: Federal Standard 1037C

Google Example: Finding Images on Network Topology
• Google catalogs the images related to Network Topology.
• The user selects an image from the cached image catalog.
• The user visits the provider web page where the image resides.
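The broker pattern above can be sketched in a few lines: the mediator holds only a catalog (where each resource lives and how to get it), and peers then fetch directly from the provider. Every catalog entry and provider below is a made-up stand-in.

```python
# Sketch of a mediated peer-to-peer lookup: catalog at the broker,
# data at the peers. All entries are hypothetical.

CATALOG = {   # broker: resource name -> (provider, access instructions)
    "surface_pm25": ("peer-epa", {"protocol": "http-get", "path": "/pm25"}),
    "satellite_aod": ("peer-nasa", {"protocol": "http-get", "path": "/aod"}),
}

PROVIDERS = {  # stand-ins for the peers that actually hold the data
    "peer-epa": {"/pm25": [18.0, 22.0]},
    "peer-nasa": {"/aod": [0.31, 0.42]},
}

def fetch(resource):
    """Find the resource in the broker's catalog, then go direct to the peer."""
    provider, access = CATALOG[resource]        # steps 1-2: consult the catalog
    return PROVIDERS[provider][access["path"]]  # step 3: fetch from the peer

data = fetch("satellite_aod")
```

This mirrors the Google example: the catalog (index) is centralized, but the resource itself is retrieved from the provider that owns it.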
Generic Data Flow and Processing in DATAFED
[Diagram: Physical Data → (wrapper: Abstract Data Access) → Abstract Data → (Data Process) → Processed Data → (Data Portrayal/Render) → Portrayed Data → DataViews 1–3]
• Physical Data: resides in autonomous servers; accessed by view-specific wrappers which yield abstract data 'slices'.
• Abstract Data: abstract data slices are requested by viewers; uniform data are delivered by wrapper services.
• Processed Data: data passed through filtering, aggregation, fusion and other web services.
• View Data: processed data are delivered to the user as multi-layer views by portrayal and overlay web services.
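The four data states above can be sketched as a staged pipeline: physical data behind a wrapper, an abstract uniform slice, a processed value, and a portrayed multi-layer view. All data, formats and service names are stand-ins.

```python
# Sketch of the DATAFED-style data states: physical -> abstract -> processed
# -> portrayed. The provider format and the services are hypothetical.

PHYSICAL = {"STL01": "18.0|22.0|36.0"}       # the provider's own format

def wrapper(site):
    """Wrapper: physical data -> abstract uniform slice."""
    return [float(v) for v in PHYSICAL[site].split("|")]

def aggregate(slice_):
    """Processing service: abstract -> processed (here, a mean)."""
    return sum(slice_) / len(slice_)

def portray(label, value):
    """Portrayal service: processed -> one view layer."""
    return f"[{label}] {value:.1f}"

def overlay(layers):
    """Overlay service: combine layers into a multi-layer view."""
    return " + ".join(layers)

abstract = wrapper("STL01")
processed = aggregate(abstract)
view = overlay([portray("surface PM2.5", processed), portray("model", 20.0)])
```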
Tightly and Loosely Coupled Programs
[Web services] are self-contained, self-describing, modular applications that can be published, located, and invoked across the Web. Web services perform functions, which can be anything from simple requests to complicated business processes...
SOA is the right mechanism, a transmission of sorts, for an IT environment in which data-crunching legacy systems must mesh with agile front-facing applications.
• Coupling is the dependency between interacting systems.
• Dependency can be real (the service one consumes) or artificial (language, platform...).
• One can never eliminate real dependency, though the dependency itself evolves.
• One can never get rid of artificial dependency, but one can reduce artificial dependency or its cost.
• Hence, loose coupling describes the state in which artificial dependency, or the cost of artificial dependency, has been reduced to the minimum.
The pathway to a service-oriented
architecture
Bob Sutor, IBM
• In an SOA world, business tasks are accomplished by executing a series of "services".
– Services have well-defined ways of talking to them and well-defined ways in which they talk back.
– It doesn't really matter how a service is implemented, as long as it properly responds and offers the required quality of service.
– The service must be secure, reliable and fast enough.
– SOA is a suitable technology to use in an IT environment where software and hardware from multiple vendors are deployed.
– IBM has identified four steppingstones on the path to SOA nirvana and its full business benefits.
1. Make applications available as Web services to multiple consumers via a middle-tier Web application server.
– This is an ideal entry point for those wishing to deploy an SOA with existing enterprise applications.
– Target customer retention or operational efficiency projects.
– Work with multiple consumers to correctly define the granularity of your services.
– Pay proper attention to keeping the services and the applications loosely coupled.
2. Choreography of web services.
The pathway to a service-oriented architecture
Opinion by Bob Sutor, IBM Bob Sutor is IBM's director of WebSphere Infrastructure Software.
DECEMBER 03, 2003 (COMPUTERWORLD) - I recently read through a large collection of
analyst reports on service-oriented architecture (SOA) that have been published in the past
year. I was pleasantly surprised at the amount of agreement among these industry
observers and their generally optimistic outlook for the adoption of this technology.
SOA is not really new -- by some accounts, it dates back to the mid-1980s -- but it's starting
to become practical across enterprise boundaries because of the rise of the Internet as a
way of connecting people and companies. Even though the name sounds very technical, it's
the big picture behind the use of Web services, the plumbing that's now being used to tie
together companies with their customers, suppliers and partners.
In an SOA world, business tasks are accomplished by executing a series of "services," jobs that
have well-defined ways of talking to them and well-defined ways in which they talk back. It
doesn't really matter how a particular service is implemented, as long as it responds in the
expected way to your commands and offers the quality of service you require. This means
that the service must be secure, reliable and fast enough to meet your needs. This makes
SOA a nearly ideal technology to use in an IT environment where software and hardware
from multiple vendors is deployed.
At IBM, we've identified four steppingstones on the path to SOA nirvana and its full business benefits. Unlike most real paths, this is one you can jump on at almost any point.
The first step is to start making individual applications available as Web services to multiple
consumers via a middle-tier Web application server. I'm not precluding writing new Web
services here, but this is an ideal entry point for those wishing to deploy an SOA with
existing Java or Cobol enterprise applications, perhaps targeting customer retention or
operational efficiency projects.
You should work with multiple consumers to correctly define the granularity of your services, and pay proper attention to keeping the services and the applications using them loosely coupled.
Don Box, Microsoft
• When we started working on SOAP in 1998, the goal was to get
away from this model of integration by code injection that distributed
object technology had embraced. Instead, we wanted an integration
model that made as few possible assumptions about the other side
of the wire, especially about the technology used to build the other
side of the wire. We've come to call this style of integration protocol-based integration or service-orientation. Service-orientation doesn't
replace object-orientation - I don't see the industry (or Microsoft)
abandoning objects as the primary metaphor for building individual
programs. I do see the industry (and Microsoft) moving away from
objects as the primary metaphor for integrating and coordinating
multiple programs that are developed, deployed and versioned
independently, especially across host boundaries. In the 1990's, we
stretched the object metaphor as far as we could, and through
communal experience, we found out what the limits are. With Indigo,
we're betting on the service metaphor and attempting to make it
as accessible as humanly possible to all developers on our platform.
How far we can "shrink" the service metaphor remains to be seen. Is
it suitable for cross-process work? Absolutely. Will every single CLR
type you write be a service in the future? Probably not - at some ...
Responsibility
Note that in distributed systems, responsibility is distributed. In NVODS, responsibility for:
 the data lies with the data providers;
 the data access protocol lies with OPeNDAP;
 application packages (Matlab, Ferret, Excel...) lies with the developers of those packages (services?);
 data location lies with the GCMD and NVODS.
Data Processing in Service Oriented
Architecture
• Data values
– Immediacy
– Quality
• Loose coupling of data
• Open data processing: let competitive approaches deliver the appropriate products to the right place.
Major Service Categories
As envisioned by Open GiS Consortium (OGC)
Service Category: Description
• Human Interaction: managing user interfaces, graphics, presentation.
• Info. Management: management and storage of metadata, schemas, datasets.
• Workflow: services that support specific tasks or work-related activities.
• Processing: data processing, computations; no data storage or transfer.
• Communication: services that encode and transfer data across networks.
• Sys. Management: managing system components, applications, networks (access).
OGC service taxonomy (code, service class):
0000 OGC web service [ROOT]
1000   Human interaction
1100     Portrayal
1110       Geospatial viewer
1111         Animation
1112         Mosaicing
1113         Perspective
1114         Imagery
1120       Geospatial symbol editor
1130       Feature generalization editor
1200     Service interaction editor
1300     Registry browser
2000   Information Management
2100     Feature access
2200     Coverage access
2210       Real-time sensor
2300     Map access
2400     Gazetteer
2500     Registry
2600     Sensor access
2700     Order handling
3000   Workflow
3100     Chain definition
3200     Enactment
3300     Subscription
4000   Processing
4100     Spatial
4110       Coordinate conversion
4120       Coordinate transformation
4130       Representation conversion
4140       Orthorectification
4150       Subsetting
4160       Sampling
4170       Feature manipulation
4180       Feature generalization
4190       Route determination
41A0       Positioning
4200     Thematic
4210       Geoparameter calculation
4220       Thematic classification
4221         Unsupervised
4222         Supervised
4230       Change detection
4240       Radiometric correction
4250       Geospatial analysis
4260       Image processing
4261         Reduced resolution generation
4262         Image manipulation
4263         Image synthesis
4270       Geoparsing
4280       Geocoding
4300     Temporal
4310       Reference system transformation
4320       Subsetting
4330       Sampling
4340       Proximity analysis
4400     Metadata
4410       Statistical analysis
4420       Annotation
5000   Communication
5100     Encoding
5200     Format conversion
5300     Messaging
6000   System Management
SOAP and WSDL
SOAP:
• An envelope for message description and processing
• A set of encoding rules for expressing data types
• A convention for remote procedure calls and responses
• A binding convention for exchanging messages
WSDL:
• Message format
• Ports
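The envelope and encoding conventions listed above can be sketched with the standard library: an RPC response wrapped in a SOAP envelope, then read back out by namespace-qualified element names. The operation name, payload namespace and value are invented for illustration.

```python
# Parse a minimal SOAP response envelope with the standard library.
# The response operation and payload namespace are hypothetical.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

response = (
    f'<soap:Envelope xmlns:soap="{SOAP_NS}">'
    "<soap:Body>"
    '<GetDataResponse xmlns="http://example.org/aq">'
    "<value>22.0</value>"
    "</GetDataResponse>"
    "</soap:Body>"
    "</soap:Envelope>"
)

root = ET.fromstring(response)
body = root.find(f"{{{SOAP_NS}}}Body")                      # the envelope's Body
value = body.find(".//{http://example.org/aq}value").text   # the RPC result
```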
Services in 500 words
Introduction
Search and Retrieve Web Service (SRW) and Search and Retrieve URL Service (SRU) are Web Services-based protocols for querying databases and returning search results. SRW and SRU requests and results are similar; the difference lies in the ways the queries and results are encapsulated and transmitted between client and server applications.
Basic "operations"
Both protocols define three and only three basic "operations": explain, scan, searchRetrieve.
explain. Explain operations are requests sent by clients as a way of learning about the server's database. At minimum, responses to explain
operations return the location of the database, a description of what the database contains, and what features of the protocol the server supports.
scan. Scan operations enumerate the terms found in the remote database's index. Clients send scan requests and servers return lists of terms. The
process is akin to browsing a back-of-the-book index where a person looks up a term in a book index and "scans" the entries surrounding the term.
searchRetrieve. - SearchRetrieve operations are the heart of the matter. They provide the means to query the remote database and return search
results. Queries must be articulated using the Common Query Language. CQL queries range from simple freetext searches to complex Boolean
operations with nested queries and proximity qualifications. Servers do not have to implement every aspect of CQL, but they have to know how to
return diagnostic messages when something is requested but not supported. The results of searchRetrieve operations can be returned in any
number of formats, as specified via explain operations. Examples might include structured but plain text streams or data marked up in XML
vocabularies such as Dublin Core, MARCXML, MODS, etc.
Differences in operation
The differences between SRW and SRU lie in the way operations are encapsulated and transmitted between client and server, as well as how results are returned. SRW is essentially a SOAP-ful Web service. Operations are encapsulated by clients as SOAP requests and sent to the server. Likewise, responses by servers are encapsulated using SOAP and returned to clients.
On the other hand, SRU is essentially a REST-ful Web service. Parameters are encoded as name/value pairs in the query string of a URL. As such, operations sent by SRU clients can only be transmitted via HTTP GET requests. The results of SRU requests are XML streams, the same streams returned via SRW requests sans the SOAP envelope.
Summary
SRW and SRU are "brother and sister" standardized protocols for accomplishing the task of querying databases and returning search results. If
index providers were to expose their services via SRW and/or SRU, then access to these services would become more ubiquitous.
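An SRU searchRetrieve request, as described above, is just a URL whose query string carries the operation and a CQL query as name/value pairs. The base URL below is hypothetical; the parameter names follow the SRU convention the text describes.

```python
# Build an SRU searchRetrieve URL: operation and CQL query as name/value
# pairs in the query string (HTTP GET). The base URL is hypothetical.
from urllib.parse import urlencode, parse_qsl, urlparse

def sru_url(base, query, max_records=10):
    params = {
        "operation": "searchRetrieve",
        "version": "1.1",
        "query": query,                      # a CQL query
        "maximumRecords": str(max_records),
    }
    return base + "?" + urlencode(params)

url = sru_url("http://example.org/sru", 'dc.title = "aerosol"')
sent = dict(parse_qsl(urlparse(url).query))  # what the server would receive
```

An explain request would use the same URL shape with `operation=explain`, and the response in either case is an XML stream.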
• SOAP Web Services (SWS) and URL Web
Services (UWS) are protocols for querying
and returning results from remote servers.
The difference is in encapsulation of queries
and results transmitted between clients and
servers.
• In SWS, the messages between the client and
server are encapsulated in an XML SOAP
envelope.
• In UWS, the web service parameters are encoded as name/value pairs in the query string of a URL and transmitted via HTTP GET requests. The results are returned as XML or ASCII streams, without the SOAP envelope.
REST Web Services
• REST, unlike SOAP, doesn't require you to install a separate tool kit to send and receive data. Instead, the idea is that everything you need to use Web services is already available if you know where to look. HTTP lets you communicate your intentions through GET, POST, PUT, and DELETE requests. To access resources, you request URIs from Web servers.
• Therefore, REST advocates claim, there's no need to layer the increasingly complicated SOAP specification on top of these fundamental tools. For more on REST, see the RESTwiki and Paul Prescod's pieces on XML.com.
• There may be some community support for this philosophy. While SOAP gets all the press, there are signs REST is the Web service that people actually use. Since Amazon.com has both SOAP and REST APIs, they're a great way to measure usage trends. Sure enough, at OSCon, Jeff Barr, Amazon.com's Web Services Evangelist, revealed that Amazon handles more REST than SOAP requests. I forget the exact percentage, but Tim O'Reilly blogged this number as 85%! The number I do remember from Jeff's talk, however, is 6, as in "querying Amazon using REST is 6 times faster than with SOAP".
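The REST idea above, intentions expressed as HTTP methods applied to resource URIs, can be sketched with a toy dispatcher. The in-memory store stands in for a web server, and the resource paths are invented for illustration.

```python
# Toy REST dispatcher: GET/PUT/DELETE applied to resource URIs, with the
# usual status-code semantics. The store and paths are hypothetical.

STORE = {}

def handle(method, uri, body=None):
    """Dispatch an HTTP-style request against a resource URI."""
    if method == "GET":
        return STORE.get(uri, "404 Not Found")
    if method == "PUT":
        STORE[uri] = body                    # create or replace the resource
        return "201 Created"
    if method == "DELETE":
        return ("204 No Content" if STORE.pop(uri, None) is not None
                else "404 Not Found")
    return "405 Method Not Allowed"

handle("PUT", "/datasets/pm25", "18.0,22.0")
reading = handle("GET", "/datasets/pm25")
```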
The hidden battle between web services: REST versus SOAP
• Is SOAP a washout?
• Are more developers turning their backs on SOAP for Web services? Redmonk's James Governor just posted this provocative thought at his MonkChips blogsite:
"Evidence continues to mount that developers can't be bothered with SOAP and the learning requirements associated with use of the standard for information interchange. It is often described as 'lightweight', but its RPC [remote procedure call] roots keep showing. ...semantically rich platforms like flickr and Amazon are being accessed by RESTful methods, not IBM/MS defined 'XML Web Services' calls."
NetKernel
Every software component on NetKernel is addressed by URI like a Web resource. A component is
executed as a result of issuing Web-like[REST] requests. A software component on NetKernel is
therefore a simple software service which hides the complexity of its internal implementation.
Applications or higher-order services are built by composing simple services. Service composition can be written in a wide choice of either procedural or declarative languages. You can think of this as Unix-like application composition and pipelines in a uniform URI-based application context.
Complexity is managed through the URI address space which may be remodelled and extended
indefinitely. A complex aggregated service may always be abstracted into one or more higher-level
services or URI interfaces.
It generalizes the principles of REST, the basis for the successful operation of the World Wide Web, and applies them down to the finest granularity of service-based software composition. The Web is the most scalable and adaptive information system ever; now these properties can be realized in general software systems too.
The NetKernel abstraction borrows many Unix-like principles including enabling emergent complexity through the pipelining of simple components
and by offering managed localized application contexts.
NetKernel provides many features which create a truly satisfying and productive environment for developing services and service oriented
applications.
Service Oriented Development
Why limit service oriented architectures to distributed system integration? With NetKernel the service oriented concepts are intrinsic in the
development model right down to the finest levels of granularity. With NetKernel, Software Development is the dynamic composition of simple
services
I never heard the phrase "REST microkernel" before, but I had an immediate expectation of what that would mean. An hour's experimentation with
the system met that expectation. Wildly interesting stuff.
Jon Udell, infoworld.com
The philosophy and thinking behind the development of the NetKernel is captured in our whitepaper, NetKernel: From Websites to Internet
Operating Systems . A Hewlett Packard Research Report presents the Dexter Project, the project which seeded NetKernel.
NetKernel – Peter Rodgers
CASE STUDY: Service-Oriented Development on NetKernel. Patterns, processes and product to reduce the complexity of IT systems. 1060 NetKernel case study: apply service-oriented abstraction to any application, component or service.
Web services hold great promise for exposing functionality to the outside world. They allow organizations to
quickly connect disparate systems in a platform neutral manner. The real challenge occurs when Web services
need to address the underlying complexity and inflexibility of the systems they connect together. While Web
services provide an interface to connect systems - there remains the increasing complexity of the applications you
have built, and are currently building, which sit behind those interfaces. 1060 NetKernel applies the underlying
architectural principles of the Web and Web services together with Unix-like scheduling and pipelines to provide
radical flexibility and improved simplicity by providing a platform to apply Service Oriented Architecture throughout
your application environment. Developed through the exploration of some of the most complex Internet commerce
systems, 1060 NetKernel will allow you to apply service oriented abstraction to any application, component or
service.
The result: the ability to realize the promise of adaptive SOA with service-implementations which are dynamically
adaptive and easily change with your business.
Peter Rodgers is the founder and CEO of 1060 Research and architect of the 1060 NetKernel XML Application Server. Prior to starting 1060 he established and led Hewlett-Packard's XML research program and provided strategic consultancy to Hewlett-Packard's software businesses. Peter holds a PhD in solid-state quantum mechanics from the University of Nottingham. (more)
Coordinated Views and Exploratory visualization
Over the past few years, the rise of Information Visualization has produced a diverse range of dynamic and highly interactive visual environments. These exploratory tools enable a user to investigate, try out scenarios, and search the information and visual space to generate better hypotheses and develop a better understanding of the underlying information. Such investigative environments often utilize many different views of the same data, so the user understands the information from different perspectives; the views are also tightly coupled together to allow rapid coordinated investigation and exploration.
This project involves investigating a novel coordination model and developing a related software system. The objectives of the project are to:
- investigate aspects of coupling within an exploratory visualization environment;
- develop a formal coordination model for use with exploratory visualization;
- produce a publicly available visualization software system with the model as an integral part.
Motivation for this research comes from the rapid rise in the number of people using and developing exploratory visualization techniques. Indeed,
coordination is used in many visualization tools to couple navigation or selection of elements. The use of such exploratory tools generates an
explosion of different views, but little research has examined an underlying model and effective techniques for connecting
elements, particularly in the context of the abundant windows generated when exploratory methods provide profuse views with
slightly different content (aggregations, parameterizations or forms).
There are many terms to do with multiples that may be used in this context [6]: from multiple windows, an all-encompassing term for any
multi-window system; through multiple views, meaning many separate presentations of the same information; to multiform, which refers to different
representations (different forms) of the same data. Multiform is a useful technique: as Brittain et al. [21] explain, "it is best to allow the user to have as many
renderings (e.g. cutting planes, isosurface, probes) as desired on the screen at once"; the user sees the information in different ways, which hopefully
provides a better understanding of the underlying information. Additionally, abstract techniques are useful. These present the information in a view
that is related to the original view but has been altered or generalized to simplify the image [1]. The London Underground map is a good example of
an abstract map: by displaying the connectivity of the underground stations while discarding their positional information, it simplifies the whole
schematic.
Coordination. There are two different reasons for using coordination: selection and navigation [32]. Selection allows the user to highlight
one or many items, either as a choice of items for a filtering operation or as an exploration in its own right; this is often done by direct manipulation,
where the user draws or sweeps the mouse over the visualization itself (a brushing operation [39]). Joint navigation provides methods to
quickly view related information in multiple different windows, providing rapid exploration by saving the user from performing the same or similar
operations multiple times. Moreover, these operations need not be applied to the same information but, more interestingly, to collections of different
information. Coordination and abstract views together provide a powerful exploratory visualization tool [1]: for example, in a three-dimensional visualization, a
navigation or selection operation may be inhibited by occlusion, but the operation may be easier in an abstract view; thus, a linked abstract view
may be used to better control and investigate the information in the coupled view.
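The coupling described above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration (the class and method names are not from the project): views register with a coordination object, and a selection brushed in one view is propagated to every coupled view, including an abstract view of the same data.

```python
class View:
    """A stand-in for a visualization window; a real view would redraw."""
    def __init__(self, name):
        self.name = name
        self.highlighted = set()

    def on_select(self, items):
        # Record the shared selection; a real implementation would
        # re-render, highlighting the brushed items.
        self.highlighted = set(items)

class Coordination:
    """Couples views so a selection in one is mirrored in the others."""
    def __init__(self):
        self.views = []

    def couple(self, view):
        self.views.append(view)

    def brush(self, source, items):
        # Propagate the brushed selection to all coupled views except
        # the one where the brushing took place.
        for view in self.views:
            if view is not source:
                view.on_select(items)

# A detailed 3-D view coupled with a simplified abstract view of the data.
scatter = View("scatter-plot")
abstract = View("abstract-map")
coord = Coordination()
coord.couple(scatter)
coord.couple(abstract)

# Brushing stations in the scatter plot highlights them in the abstract map.
coord.brush(scatter, {"station-12", "station-40"})
print(abstract.highlighted)
```

A formal coordination model would of course go further, e.g. coordinating navigation as well as selection and handling views over different but related data collections; this sketch only shows the basic selection coupling.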
REFERENCES
[1] Jonathan C. Roberts. Aspects of Abstraction in Scientific Visualization. Ph.D. thesis, University of Kent at Canterbury, Computing Laboratory, Canterbury, Kent,
England, UK, CT2 7NF, October 1995.
[6] Jonathan C. Roberts. Multiple-View and Multiform Visualization. Visual Data Exploration and Analysis VII, Proceedings of SPIE, Vol. 3960, pages 176--185,
January 2000.
[21] Donald L. Brittain, Josh Aller, Michael Wilson, and SueLing C. Wang. Design of an end-user data visualization system. In Proceedings Visualization '90, pages
323--328. IEEE Computer Society, 1990.
[32] Chris North and Ben Shneiderman. A Taxonomy of Multiple Window Coordinations. University of Maryland Computer Science Dept. Technical Report #CS-TR-3854, 1997.
[39] Matthew O. Ward. XmdvTool: Integrating multiple methods for visualizing multivariate data. In Proceedings Visualization '94, pages 326--333. IEEE Computer
Society, 1994.
REST vs. SOAP
Differences between the Representational State Transfer (REST) and Simple Object Access Protocol (SOAP)-based approaches to Web services development include:
Standards
REST promises to make Web services available using existing Internet standards. The SOAP-based approach involves a range of emerging
standards, not all of which will be adopted.
Tools
Many development tools support REST standards such as HTTP and XML, but commercial REST tools don’t exist.
Application tool vendors are building SOAP-based products aimed at making the development and deployment of Web services as easy as any other
kind of application development.
Developer Support
REST developers are in the minority, and most vendors say enterprise users aren’t demanding REST services yet. But REST is generating a buzz,
and is poised to capitalize on any market sentiment that SOAP-based Web services are overhyped. Major application development tool vendors BEA
Systems, IBM and Microsoft offer SOAP-based kits for Web services development.
Security
REST advocates say their way of doing Web services is more secure because of its reliance on the Internet’s existing security infrastructure. The
SOAP security story still is developing, but it promises to give administrators greater control over who accesses Web services and what rights those
users have.
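The Standards contrast above can be made concrete with a small sketch. In the example below (the endpoint, operation, and parameter names are hypothetical), the same "get quote" request is expressed REST-style, as a plain URL addressing a resource, and SOAP-style, as an XML Envelope/Body that would be POSTed over HTTP to a service endpoint.

```python
from urllib.parse import urlencode
from xml.etree import ElementTree as ET

# REST: the resource is addressed directly using existing Web standards
# (a URL and an HTTP GET); parameters ride in the query string.
rest_url = "http://example.com/quote?" + urlencode({"symbol": "IBM"})

# SOAP: the operation and its parameters travel inside an XML envelope,
# which is carried in the body of an HTTP POST.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
envelope = ET.Element("{%s}Envelope" % SOAP_NS)
body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
op = ET.SubElement(body, "GetQuote")          # hypothetical operation name
ET.SubElement(op, "symbol").text = "IBM"
soap_request = ET.tostring(envelope, encoding="unicode")

print(rest_url)
print(soap_request)
```

Neither request is sent here; the point is the difference in shape. The REST request is self-describing at the URL level, while the SOAP message defers the operation semantics to the envelope contents, which is what makes room for the additional (and, at the time of writing, still emerging) WS-* standards for security and access control mentioned above.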