Name: Open File Formats Technology Working Group (OFF.TWG)
Identifier: OFF.TWG
Web site: http://MediaGrid.org/groups/technology/OFF.TWG
Discussion forum: http://MediaGrid.org/groups/technology/OFF.TWG/forum
Email list:
Standards process: http://MediaGrid.org/process/Immersive_Education_Initiative_Process_Document.pdf
IP policy: http://MediaGrid.org/policy/Media_Grid_Intellectual_Property_Policy.pdf
Chairs: Nagel, Nicholas H. (Grid Institute and Boston College); Walsh, Aaron E. (Grid Institute and Boston College)
Liaisons: Beaubois, Terry A. (Montana State University and liaison to the National Institute of Building Sciences buildingSMART Alliance)
The Open File Formats Technology Working Group (OFF.TWG) is responsible for defining, evolving and maintaining open, platform-neutral file formats that enable learning objects and experiences to be seamlessly deployed across Immersive Education technology platforms.
This TWG is open to all members of the Immersive Education Initiative having:
In the context of Immersive Education the term platform refers to any virtual world, simulator or 3D game-based environment that may be used for teaching or training purposes. The Immersive Education platform has evolved considerably over the past decade and the 3rd generation ("next generation") is now under development. Whereas the previous two generations of Immersive Education were based on specific client-side platforms tied to proprietary server-side infrastructures, the future of Immersive Education revolves around multiple client-side platforms working in unison through the server-side Education Grid.
Based upon open source technologies and open standards, the Platform Ecosystem and Education Grid will provide educators with a comprehensive end-to-end infrastructure for a new generation of virtual world learning environments, interactive learning games, and simulations. To this end the Immersive Education Initiative has a mandate to design, develop and promote open and platform-neutral file formats that enable interoperable learning objects and experiences to be seamlessly deployed across a wide variety of virtual world and game platforms (e.g., Second Life, Croquet/Cobalt, Wonderland, and other 3D/VR platforms) as Figure 1 illustrates.
Second Life, Wonderland, and Croquet/Cobalt do not currently support the same 3D/VR file formats, however. Today's virtual world and game platforms are essentially walled gardens: users are not able to interact with the same objects or experiences across different platforms, nor are users of one platform able to interact with users residing in another platform. Although the Immersive Education Initiative does not plan to address the latter concern (user interoperability across platforms), through the Open File Formats Technology Working Group we are actively standardizing cross-platform content interchange by adopting and promoting open, platform-neutral 3D/VR file formats that deliver on the promise of "create once, experience everywhere."
Today's Immersive Education technology platforms do not support the same 3D/VR file formats. Each platform currently supports its own native file format(s). Second Life, for example, supports a proprietary format developed by Linden Lab that is based on "prims". Wonderland, in contrast, currently supports the mesh-based Extensible 3D format (X3D, an international ISO/IEC standard) and as of version 0.5 will support COLLADA. Croquet/Cobalt, meanwhile, supports the mesh-based ASE (ASCII Scene Export) format and previously supported the OBJ format (OBJ support is broken in the current version of Croquet/Cobalt).
Consequently, there is no way for these platforms to exchange 3D/VR content or to provide users with the same immersive learning experience. Because each platform supports different native file formats, end users are limited in terms of the learning objects and learning experiences that are available to them from within a given platform. Content authors must either re-create content for each platform or (more commonly) create content for only one platform and simply ignore the others, as Figure 2 illustrates:
Transcoding tools, as Figure 3 illustrates, can eliminate much of the effort associated with making the same object or experience available to multiple platforms. Transcoding is a suboptimal stop-gap measure, however, that frequently requires manual "touch-up" work and re-coding of scripts. In contrast, platforms can directly support open file formats and skip the transcoding process altogether (see Figure 4).
Transcoding is a stop-gap measure that is only necessary when a platform doesn't directly support the target file format. Direct support is preferable to transcoding for three major reasons. First and foremost, transcoding is rarely a perfect process; it often produces suboptimal results that require subsequent "hand tuning" by an experienced content author. In addition, because scripts can't easily be transcoded, they may need to be re-coded for each platform by an experienced programmer. Finally, transcoding is an inconvenient extra step that can be skipped by platforms that directly support the target format.
Although direct support is preferred, transcoding is a reasonable approach when a platform already has a complete, well-defined exchange format that fully represents every feature of the platform. In such cases a transcoder tool can be written to convert from that format to another format. A disadvantage of this approach is that a custom transcoder tool needs to be written for each platform in order for that platform's file format(s) to be converted into the format(s) supported by other platforms. For example, transcoding services to support the current Immersive Education virtual world platforms will require the development of six custom transcoder tools (assuming a total of three platforms, namely Second Life, Wonderland and Croquet/Cobalt). As more formats and platforms are introduced the number of transcoder tools required grows quadratically, which in turn makes maintenance and support difficult if not entirely infeasible [if N is the number of platforms, N*(N-1) one-way transcoders are required].
Transcoding to and from an intermediate format reduces this concern; for each platform only two transcoder tools are required (one each to convert to and from the intermediate format). This approach is much more manageable and cost effective. The intermediate format used must be complete or extensible enough to represent all of the features required by each platform, however, so that no data will be lost during the transcoding process.
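To make the arithmetic concrete, the short sketch below (plain Python, provided here purely as an illustration) compares the number of one-way transcoders required for direct pairwise conversion against conversion through a single intermediate format:

```python
# Illustration only: compare the number of one-way transcoders required
# for direct pairwise conversion versus conversion through a single
# intermediate format.

def pairwise_transcoders(n_platforms: int) -> int:
    """Each platform needs a one-way transcoder to every other platform."""
    return n_platforms * (n_platforms - 1)

def intermediate_transcoders(n_platforms: int) -> int:
    """Each platform needs one exporter to, and one importer from,
    the intermediate format."""
    return 2 * n_platforms

for n in (3, 5, 10):
    print(f"{n} platforms: pairwise={pairwise_transcoders(n)}, "
          f"intermediate={intermediate_transcoders(n)}")

# 3 platforms (e.g., Second Life, Wonderland, Croquet/Cobalt): 6 vs. 6
# 5 platforms:  20 vs. 10
# 10 platforms: 90 vs. 20
```

With three platforms the two approaches happen to break even at six tools each, but the pairwise approach grows quadratically while the intermediate-format approach grows only linearly with the number of platforms.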
Another option is to add a plug-in or other code to a platform to directly import and export an intermediate format. This is the best approach from the perspective of the platform's users, since a user can export a file once and then import it directly into other programs without being required to run a transcoder every time. In most cases, writing this kind of direct import/export plug-in is easier than developing a stand-alone transcoder. If a platform doesn't already have its own interchange format, adopting an existing intermediate format avoids the need to design a new format and the tools that use it. This is also a good solution in cases where a platform vendor wishes to keep their native file formats confidential.
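The sketch below illustrates, in Python, what such a direct import/export plug-in might look like when built around an intermediate format such as COLLADA. The class, its methods, and the scene object are hypothetical placeholders introduced for this example; they do not correspond to any actual platform API.

```python
# Hypothetical sketch of a platform-side plug-in that reads and writes an
# intermediate interchange format (COLLADA in this example) directly, so
# users never have to run a separate transcoder. All names below are
# illustrative placeholders, not a real platform API.

import xml.etree.ElementTree as ET

COLLADA_NS = "http://www.collada.org/2005/11/COLLADASchema"

class ColladaExchangePlugin:
    """Illustrative import/export hook for a hypothetical platform."""

    def export_scene(self, scene, path: str) -> None:
        # Build a minimal (schema-incomplete) COLLADA document from the
        # platform's scene graph; geometry, materials, and required asset
        # metadata are omitted for brevity.
        root = ET.Element("COLLADA", {"xmlns": COLLADA_NS, "version": "1.4.1"})
        asset = ET.SubElement(root, "asset")
        ET.SubElement(asset, "title").text = getattr(scene, "name", "untitled")
        ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

    def import_scene(self, path: str):
        # Parse the intermediate file; a real plug-in would hand the parsed
        # data to the platform's native scene-building API (not shown).
        return ET.parse(path).getroot()
```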
Conformance test suites provide developers with adequate testing materials when implementing transcoding tools or when enhancing a platform to directly support one of the open file formats described in this document. For example, a COLLADA conformance test suite is necessary for developers to build COLLADA transcoding tools or to implement direct support for the COLLADA file format on a given platform.
CONFORMANCE TEST SUITE REQUIREMENTS: Conformance test suites will be developed as collections of open and freely available materials that any individual or organization can utilize.
Conformance test suites will include content (avatars, objects, models and environments) and test programs (code) that measure the quality and validity of the import/export capabilities of a transcoding tool or platform. Exported files, for example, will be passed through an XML validation process that will test the exported content for conformance with the target file format specification.
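For example, a minimal validation step of the kind described above could be implemented with standard XML Schema validation. The sketch below assumes Python and the lxml library, together with a locally available copy of the target format's schema; the file names are placeholders.

```python
# Minimal sketch: validate an exported file against the target format's
# XML Schema. Assumes the lxml library; file names are placeholders.

from lxml import etree

def validate_export(exported_path: str, schema_path: str) -> bool:
    """Return True if the exported document conforms to the schema."""
    schema = etree.XMLSchema(etree.parse(schema_path))
    document = etree.parse(exported_path)
    is_valid = schema.validate(document)
    if not is_valid:
        # Report each schema violation with its line number.
        for error in schema.error_log:
            print(f"line {error.line}: {error.message}")
    return is_valid

# Example (placeholder files): check a COLLADA export against its schema.
# validate_export("lesson_export.dae", "collada_schema_1_4.xsd")
```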
Test suites will comprise avatars, objects, models and environments taken from a variety of existing content repositories, as well as newly constructed materials created as part of ongoing Immersive Education content development efforts.
· Platform and tool certification is outside the scope of this project; conformance test suites will be provided as informational resources only.
The objectives of the Open File Formats Technology Group are:
This Technology Working Group is closely related to:
Develop educational content that can be used across multiple Immersive Education environments.
A team of researchers and content developers at the "Center for the Study of Population and Urban Development" creates a simulation of a city and surrounding farmland that allows end users to manipulate parameters and observe the effects on logistics and supply-line infrastructure. Participating in the simulation engenders a better understanding of micro- and macroeconomics, logistics, demographics and population trends. The creators want to make their content available as part of a creative commons and enable widespread distribution of the simulation.
For their senior project, a group of middle school students develop a model of the Greek Acropolis designed to scale. In developing the model, the students arrive at an enhanced understanding of basic principles of geometry, architecture, measurement and materials. The students want to add their model to a virtual world that currently has a "stub" (i.e., a designated spot lacking a detailed version) in place of the model edifice.
Use Case Diagram: Developing educational content that can be used across multiple Immersive Education environments.
From this use case we can identify a number of roles and actions that must be supported by open file formats defining content for Immersive Education.
Note: A real world individual might take on any or all of these roles at one time or another. As such, the roles suggested here are considered to be defined against the system rather than as embodying real world job titles or responsibilities. Anyone with the prerequisite knowledge and skills can develop content in any given domain. The critical point is that content can become accessible across all participating platforms.
Traverse multiple Immersive Education environments, using multiple Immersive Education platforms, with a singular avatar. In other words, "walk between worlds".
Use Case Diagram: Traversing worlds with a singular avatar.
Tagging content with metadata tags (keyword descriptors) to facilitate text-based search and retrieval.
The team of developers for an economic simulator wish to “tag” the content files they have created with appropriate keywords and descriptions to facilitate text-based retrieval of their content from Education Grid repositories and from the Web using search engines (such as Google, Yahoo, etc.). The developers therefore embed the tag "economics simulation" directly inside the content file(s) that comprise their learning simulation.
In contrast, an educator using the economic simulation decides to apply the tags "logistics" and "supply-line management" to the content. Because the educator is not an author of the content, the metadata she provides is associated with, but not embedded inside, the content files that make up this learning experience.
Students using the economic simulator further tag it with metadata they provide, resulting in three different levels of metadata associated with this content: 1) author tags, 2) educator tags, and 3) student tags.
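As an illustration of the distinction drawn above between author tags embedded in the content and educator/student tags associated with it externally, the Python sketch below writes author keywords into a COLLADA file's asset element (which supports a keywords child) and records the other tag levels in a separate sidecar file. The file names and the sidecar layout are assumptions made for this example.

```python
# Sketch: embed author-level tags inside a COLLADA file's <asset> element
# and keep educator/student tags in an external sidecar file. File names
# and the sidecar layout are illustrative assumptions; schema ordering
# rules for <asset> children are ignored for brevity.

import json
import xml.etree.ElementTree as ET

NS = "http://www.collada.org/2005/11/COLLADASchema"
ET.register_namespace("", NS)

def embed_author_tags(collada_path: str, keywords: str) -> None:
    """Write author-level keywords into the COLLADA <asset> element."""
    tree = ET.parse(collada_path)
    asset = tree.getroot().find(f"{{{NS}}}asset")
    ET.SubElement(asset, f"{{{NS}}}keywords").text = keywords
    tree.write(collada_path, encoding="utf-8", xml_declaration=True)

def add_external_tags(sidecar_path: str, level: str, tags: list) -> None:
    """Associate educator or student tags without touching the content file."""
    try:
        with open(sidecar_path) as handle:
            record = json.load(handle)
    except FileNotFoundError:
        record = {}
    record.setdefault(level, []).extend(tags)
    with open(sidecar_path, "w") as handle:
        json.dump(record, handle, indent=2)

# embed_author_tags("economics_simulation.dae", "economics simulation")
# add_external_tags("economics_simulation.tags.json", "educator",
#                   ["logistics", "supply-line management"])
```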
Automated and computer-based assessment of students using Immersive Education environments.
On Friday a K-12 teacher assigns a quest-based virtual world as homework for her middle school students. Over the weekend students are required to explore virtual Egypt and gather ancient artifacts hidden throughout the tombs and pyramids. To obtain all of the artifacts students must explore the virtual environment and use the skills and historical knowledge that they acquire in order to advance in their quest.
Over the weekend the teacher uses her Web browser to view a text report that tracks student progress. The text-based report is generated automatically by the learning environment, and indicates how long each student has engaged in the quest, the items they have found, how long (and to what extent) they interact with and examine the objects they’ve found, and so forth. All of this data is provided in text format, along with a summary letter grade that is assigned to each student based on their overall progress.
On Monday the teacher meets her students in virtual Egypt and sees, visually, the progress of her students. Students that have successfully completed the quest are fully clothed in royal Egyptian garments and are wearing ornate jewelry and special items from that period (such as scarab necklaces, gold head bands, etc.). Students that have begun but not yet completed the quest have less impressive Egyptian outfits, and lack the fancy jewelry worn by the higher-scoring students. Students who have not completed any part of the quest are wearing plain modern-day street clothing, such as jeans and t-shirts.
All clothing and jewelry is automatically attached to the students during their quest in direct relation to the work they’ve done, which in turn provides an immediate visual indication of their progress in addition to a reasonable degree of virtual peer pressure: students can see how they compare to their fellow classmates merely by the appearance of their own avatar.
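A minimal sketch of how the summary letter grade in the automatically generated text report might be derived from tracked progress data follows; the field names and grading thresholds are assumptions for illustration, not part of any specified format.

```python
# Sketch: derive a summary letter grade from tracked quest progress.
# Field names and thresholds are illustrative assumptions.

def summary_grade(artifacts_found: int, artifacts_total: int,
                  minutes_engaged: float) -> str:
    """Combine quest completion and engagement into a single letter grade."""
    completion = artifacts_found / artifacts_total if artifacts_total else 0.0
    if completion >= 0.9:
        return "A"
    if completion >= 0.75:
        return "B"
    if completion >= 0.5:
        return "C"
    # Distinguish students who engaged but found little from those who
    # never started the quest.
    return "D" if minutes_engaged > 0 else "F"

# Example row from such a report: student, artifacts found, minutes, grade
print("Student A,", summary_grade(artifacts_found=9, artifacts_total=10,
                                  minutes_engaged=140.0))
```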
Note: Not all materials listed under this section are open and freely available. Some are proprietary technologies that build upon or utilize open standards and technologies, or that have proprietary features and capabilities useful for comparison with open alternatives.
COLLADA community: http://collada.org
COLLADA standard: http://www.khronos.org/collada
Developing Web Applications with COLLADA and X3D (white paper): http://www.khronos.org/collada/presentations
ECMA-363 (aka U3D; Universal 3D): http://www.ecma-international.org/publications/standards/Ecma-363.htm
ECMA International: http://www.ecma-international.org
Extensible 3D (X3D): http://www.web3d.org/x3d/specifications
FBX: http://www.alias.com/fbx
jMonkey Engine (jME): http://www.jmonkeyengine.com
Khronos Group: http://www.khronos.org
MPEG-4: http://www.chiariglione.org/mpeg/
Object-Oriented Graphics Rendering Engine (OGRE): http://www.ogre3d.org/
OpenSceneGraph: http://www.openscenegraph.org/
Virtual Reality Modeling Language (VRML): http://www.web3d.org/x3d/specifications/#vrml97
Web3D Consortium: http://web3d.org
Open File Formats Project Overview [Working Draft Planning Document; PDF rendition]. Immersive Education Initiative et al. May 13, 2008. http://MediaGrid.org/groups/technology/OFF.TWG/public/in/OFF_Project_Overview.pdf
OFF.TWG 3D and Virtual Reality (3D/VR) File Format Requirements [Public Working Draft; HTML rendition]. Immersive Education Initiative et al. http://MediaGrid.org/groups/technology/OFF.TWG/public/out/3DVR_Requirements
High resolution avatars, objects, and environments. Immersive Education Initiative. May 9, 2008. http://ImmersiveEducation.org/events/#HIGH_RESOLUTION
Immersive Education content review, ranking, tagging, vetting. Immersive Education Initiative. March 21, 2008. http://ImmersiveEducation.org/events/#CONTENT
Avatar tracking and analysis meeting, demonstrations, discussion. Immersive Education Initiative. March 14, 2008. http://ImmersiveEducation.org/events/#AVATAR_TRACKING
Platform Ecosystem and Education Grid status and review. Immersive Education Initiative. April 18, 2008. http://ImmersiveEducation.org/events/#ECOSYSTEM_AND_GRID_STATUS
The Education Grid: http://TheEducationGrid.org
Education Grid Requirements Specification [Working Draft]. Immersive Education Initiative. http://MediaGrid.org/groups/technology/grid.ied/specification/
Immersive Education Initiative Launches The Education Grid. Immersive Education Initiative. June 20, 2008. http://MediaGrid.org/news/2008-06_Education_Grid.html
Immersive Education Initiative announces Education Grid and Platform Ecosystem at Boston Summit. Immersive Education Initiative. January 22, 2008. http://MediaGrid.org/news/2008-01_Summit_Outcomes.html
Technology Working Group telephone conferences and/or virtual world meetings are held once a month, with additional telephone conferences and virtual world meetings arranged at the discretion of the group.
Face-to-face (f2f) meetings are one- to three-day sessions held approximately twice a year, with additional f2f meetings arranged at the discretion of the group. To maximize working relationships between the Technology Working Group and relevant standards bodies and vendor organizations f2f meetings may be held in conjunction with industry events, standards meetings, or on location at member or collaborator organizations. All f2f meetings are announced through the group's email list and Web page.
Refer to the Immersive Education Initiative Process Document for details.
The proceedings of this Technology Working Group are confidential and restricted to members of this group. As an open standards organization, and in recognition of the need for ongoing accountability to the general public, MediaGrid.org will periodically publish a public summary of all technical decisions (together with the rationales for these decisions) made by this group since the last public summary. Deliverables produced by this group, such as specifications and software implementations, will be provided to invited experts and collaborators for review prior to being furnished to the general public.
Refer to the Media Grid Intellectual Property Policy for details.
Document revised 2008-10-21
Copyright Statement and Legal Notice.