Archive for OGC

3D Cities to Virtual Worlds

(Image: Berlin Molkenmarkt)

Recently, the members of the Open Geospatial Consortium, Inc. (OGC) adopted version 1.0.0 of the OpenGIS® CityGML Encoding Standard as an official OGC Standard. According to the OGC, CityGML is an open data model framework and XML-based encoding standard for the storage and exchange of virtual 3D urban models. CityGML is an application schema of the OpenGIS Geography Markup Language 3 (GML3) Encoding Standard, an international standard for spatial data exchange and encoding approved by the OGC and ISO.
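
To make the encoding concrete, here is a minimal sketch in Python of what a CityGML document looks like on the wire. The namespace URIs and element names follow my reading of the CityGML 1.0 core and building modules; the building itself (its ID and height) is a made-up illustration, not data from any real model.

```python
# Minimal sketch of a CityGML 1.0 document built with Python's standard library.
# Namespace URIs and element names follow the CityGML 1.0 core and building
# modules; the building ID and height below are purely illustrative.
import xml.etree.ElementTree as ET

NS = {
    "core": "http://www.opengis.net/citygml/1.0",
    "bldg": "http://www.opengis.net/citygml/building/1.0",
    "gml":  "http://www.opengis.net/gml",
}
for prefix, uri in NS.items():
    ET.register_namespace(prefix, uri)

# CityModel is the root feature collection defined by the CityGML core module.
city_model = ET.Element(f"{{{NS['core']}}}CityModel")

# Each city object (here a single building) hangs off a cityObjectMember.
member = ET.SubElement(city_model, f"{{{NS['core']}}}cityObjectMember")
building = ET.SubElement(member, f"{{{NS['bldg']}}}Building",
                         {f"{{{NS['gml']}}}id": "Bldg_0815"})  # illustrative ID

# Thematic attributes come from the building schema; geometry would be
# attached as GML3 solids/surfaces via lod1Solid, lod2Solid, and so on.
height = ET.SubElement(building, f"{{{NS['bldg']}}}measuredHeight", uom="m")
height.text = "11.0"

print(ET.tostring(city_model, encoding="unicode"))
```

Because the encoding is plain GML3-based XML, the same document can be read by any GML-aware tool, which is the whole point of using an application schema rather than a proprietary format.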


According to the CityGMLWiki, “targeted application areas explicitly include urban and landscape planning; architectural design; tourist and leisure activities; 3D cadastres; environmental simulations; mobile telecommunications; disaster management; homeland security; vehicle and pedestrian navigation; training simulators; and mobile robotics.”


CityGML grew out of efforts in Germany to integrate and link building information with the surrounding land. Traditionally, this integration has been weak, creating many challenges for the building industry as well as for planners. And the gaps are not only technological; the building and GIS industries have kept each other at arm's length for decades. The hope is that CityGML can provide the standards necessary to bridge those gaps so that models can more accurately reflect the real-world juxtaposition and interrelationships between buildings and land.


In my opinion, all of this leads to virtual worlds. Today, virtual worlds are primarily the domain of gamers and socializers. But virtual worlds are no passing fad. According to a recent Technology Intelligence Group report, Virtual World Industry Outlook 2008-2009, “Over one billion dollars were spent by the venture community on startups directly within or supporting virtual worlds between August 2007 and August 2008, and according to virtual world vendors and developers …”


What excites me is that with the inevitable merger of real-world models with virtual world technologies, sometimes called the Metaverse, geography and geographic information will be critical. According to the Metaverse Roadmap Overview, the Metaverse is the convergence of 1) virtually-enhanced physical reality and 2) physically persistent virtual space. It is a fusion of both, while allowing users to experience it as either.


I’ve written about The Business Relevance of Virtual Worlds. Others have discussed 3D models in the context of the GeoWeb, which is happening now and will be the precursor to geographically accurate virtual worlds. All of the big players are in this – Autodesk, Bentley, ESRI, Google, and Microsoft, as are some smaller companies such as Galdos Systems and Onuma. The Metaverse requires standards for interoperability, and CityGML is an important standard for now and the future of geographic information online.

Teleconference: Emergency Planning and Disaster Preparedness: The Critical Role of Geospatial Information

Live Presentation: Thursday, July 19, 2007, 2:00–3:00 PM EDT / 11:00 AM–12:00 PM PDT
Seventh in RFG’s Teleconference Series: Emergency Planning and Disaster Preparedness: The Critical Role of Geospatial Information

===

Is your business planning for disasters, but doing so solely within the vacuum of its own domain? Do you or your colleagues interact with the public safety officials responsible for developing plans for various emergency scenarios? Is your enterprise’s IT department using the best available information and tools to prepare for emergencies?

Unfortunately, businesses in many localities are involved only lightly, if at all, in local or regional emergency planning. This lack of interaction creates significant risks to both public and private plans. In addition, geospatial data and tools are valuable but frequently unused resources for effective emergency management. Proper enterprise emergency planning requires a wide array of applications, data, people, processes, and planning. Improved collaboration and expanded use of geospatial data and tools can improve responses to emergencies and disasters, reducing risk and minimizing negative effects for all involved.

On July 19 at 2 p.m. Eastern US time, learn from RFG and the Open Geospatial Consortium (OGC) how enterprises can reach out to those in their localities to integrate emergency plans, and how they can link appropriate applications, data, people, and processes using standards-based systems. Furthermore, hear how maps and other geospatial information can provide significant help in most emergencies. Case studies showing the use of OGC-promulgated standards in emergency operations will also be featured.

The Presenters:
Ron Exler
Vice President and Research Fellow, Robert Frances Group (RFG)
Ron advises clients on software application development, information technology asset management, geographic information systems (GIS), and a variety of other areas. He began his career as a Research Scientist at the Battelle Memorial Institute, a private research institute that provides technology and scientific research services for government and the private sector. While at Battelle, Ron developed innovative statistical and GIS applications for the Environmental Protection Agency and a private navigation technology firm. Ron has been widely quoted and has published articles in periodicals including CIO Magazine, Computerworld, CSO Magazine, DM Review, ebizQ Insider, Fast Company, Intelligent Enterprise Magazine, and Internet World. In June 2007, Ron was named one of the top 50 analyst bloggers by Technobabble 2.0, a popular and widely respected blog focused on IT industry analysts and analyst relations.

Mark E. Reichardt
President, Open Geospatial Consortium (OGC)
Mark has overall responsibility for Consortium operations, overseeing the development and promotion of OpenGIS® standards and working to ensure that OGC programs foster member success. Before joining the OGC in 2000, he was involved in a number of technology modernization and production programs for the U.S. Department of Defense (DoD). In the mid-1990s, he was a member of a DoD Geospatial Information Integrated Product Team (GIIPT) formed to help transition the DoD mapping mission to a more flexible and responsive geo-information-based paradigm. Under Mark’s leadership, the GIIPT Production Team validated the ability of commercial off-the-shelf hardware and software to meet many of the DoD functional requirements for geospatial production operations. Mark also serves on the Board of Directors of the Global Spatial Data Infrastructure Association.

To Register:
Those interested in attending this Webinar should contact Carolyn Crocker of RFG at ccrocker@rfgonline.com or at (US) 203/429-8931. For more information, visit RFG’s Web site at http://www.rfgonline.com.

Open Geospatial Consortium Interoperability Day (Part Two)

The Oct. 4 meeting featured a panel of “experiences” as well as a fascinating keynote from Doug Eberhard, CTO of construction firm Parsons Brinckerhoff (PB). PB manages very large projects including the World Trade Center site rebuilding, the Los Angeles International Airport master plan, and Seattle’s Alaskan Way Viaduct. Mr. Eberhard stressed that the construction world still plans mostly in two dimensions, despite decent modeling tools for three. He asserted that it is people who are holding things back: they possess 3D data but do not want to share it. PB uses a 3D infrastructure modeling approach to convey its plans visually, linking models to schedules (3D plus time equals 4D, he says). It works, with realistic and highly communicative time-lapse animated 3D fly-overs and drive-throughs showing planned construction and final outcomes. Mr. Eberhard stressed that in addition to their marketing value, such models lead to building with as little disruption as possible by identifying conflicts early in the process, perhaps before construction begins. It is fair to say the audience of OGC participants was extremely impressed with what they saw and with the integration of CAD and geospatial information.

The experiences panel was pretty flat, in a 2D sense, but the panelists shared some fascinating efforts. Johnny Tolliver from Oak Ridge National Laboratory showed a Sensor Alert Service (SAS) used at Fort Bragg, NC for quickly notifying mobile resources of various issues. The SAS is built on the XMPP transport standard for short messages. Intergraph manages the alerts and sensor information, which includes live video events. The OpenGIS Location Service (OpenLS) tracking service is also used. Kevin Shaw of the U.S. Naval Research Laboratory described its GIDB Portal System, essentially a middleware broker that takes in 1,500 data sources and provides them to clients of various types. NASA contractor Nadine Alameh showed a similar broker, NASA’s Earth Science Gateway. ESRI’s Jeanne Foust, Global Manager of Spatial Data Infrastructure, discussed the importance of standards and showed ESRI’s compliance as well as lessons learned. One of Jeanne’s main points was that the need for interoperability in GIS is not new – GIS has always required interoperability; it is the nature of the beast. However, she also stressed that standards must be transparent to end users in order to be effective. Brian Lowe wrapped up the experiences panel with a discussion of Canada’s National Forest Information System.

Business takeaways? Most of the presenters were from government agencies or their contractors, but some lessons carry over:

1. Data sharing is a critical issue. The technical interoperability issues are being resolved through the OGC and other groups, but privacy, security, and politics are still at play.

2. Visual displays convey information in ways no other medium can. For some applications, 3D beats 2D, and animated 3D can convey even more.

3. Sensors tied to location are hot. Sensors with communication capabilities tied to their coordinates create a whole new set of potential applications.

4. Standards are great but they must be transparent to end users. Vendors are working to provide that transparency.

Open Geospatial Consortium Interoperability Day (Part One)

The Geo Factor attended the OGC Interoperability Day yesterday in Tysons Corner, VA. The 320-member OGC develops and promotes location-based services (LBS) and geospatial standards through a consensus process. The morning session involved a multi-vendor (and one government agency) demonstration illustrating the tremendous potential of geospatial data sharing. The scenario was planning for an Olympics event in Tampa, Florida. Autodesk, Bentley, eSpatial, ESRI, Intergraph, and NASA participated, each representing a different government agency (water, DOT, etc.). To show how easily geospatial data can be shared via modern OGC and other standards, the group worked through an increased need for water as well as an “unexpected” tanker crash that caused an oil spill and gas plume near the city. And it all looked incredibly easy and productive. Drag, drop, click, select, zoom, and analyze. 2D, 3D, imagery, maps, and networks. As easy as typing in a URL. While such multi-vendor, multi-user sharing used to take months, it can now be done in hours. Why wouldn’t anyone use these standards (WMS, WFS, XML, SOAP, metadata catalogues)?
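
For readers who have not touched these interfaces, a small sketch may show why the sharing looked so effortless. A Web Map Service (WMS) GetMap call is nothing more than an HTTP GET with standardized query parameters; the server URL and layer name below are hypothetical, but the parameter names come from the WMS 1.1.1 specification.

```python
# Rough sketch of a WMS GetMap request: an HTTP GET with standardized
# query parameters. The endpoint and layer name are hypothetical; the
# parameter names follow the WMS 1.1.1 specification.
from urllib.parse import urlencode

WMS_ENDPOINT = "http://example.org/wms"  # hypothetical agency endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "water_network",        # hypothetical layer advertised in GetCapabilities
    "STYLES": "",
    "SRS": "EPSG:4326",               # WGS84 latitude/longitude
    "BBOX": "-82.8,27.6,-82.2,28.1",  # rough box around Tampa, FL (minx,miny,maxx,maxy)
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

url = f"{WMS_ENDPOINT}?{urlencode(params)}"
print(url)  # any WMS-compliant client, or a plain browser, can fetch this map image

# import urllib.request
# png_bytes = urllib.request.urlopen(url).read()  # fetch against a live WMS server
```

Every vendor in the demonstration can answer a request like this in the same way, which is why data from six different "agencies" could be overlaid with a few clicks.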

While there are ongoing discussions about, and upcoming changes to, some of the OGC standards, the industry has clearly solved at least the major technical issues of geospatial data sharing. What remains to be tackled more completely are the people, process, licensing, and data quality challenges. While it is relatively straightforward to share data via standards-compliant transports, many people still closely guard access to their data. The processes and workflows for sharing scenarios can be complex and unclear, so they need to be worked out in advance of specific needs, such as an emergency. Quality is a concern when combining multiple data sources of varying scales and timeframes. Politics, planning, and privacy are some of the significant challenges facing organizations sharing geospatial data. However, the demonstration is cause for optimism about the potential for sharing geospatial data.