MPEG-V: Bridging the Virtual and Real World

Publication Subtitle: Bridging the Virtual and Real World

Author: Yoon Kyoungro; Kim Sang-Kyun; Han Jae Joon

Publisher: Elsevier Science

Publication year: 2015

E-ISBN: 9780124202030

P-ISBN (Paperback): 9780124201408

P-ISBN (Hardback): 9780124201408

Subject: G237.6 Electronic Publication Editing and Publishing; J619 Music Technology; TP37 Multimedia Technology and Multimedia Computing

Keywords: Radio Electronics and Telecommunications Technology; General Industrial Technology

Language: ENG



Description

This book is the first to cover the recently developed MPEG-V standard, explaining the fundamentals of each part of the technology and exploring potential applications. Written by experts in the field who were instrumental in the development of the standard, the book goes beyond the scope of the official standard documentation, describing how to use the technology in a practical context and how to combine it with other information such as audio, video, images, and text. Each chapter follows an easy-to-understand format, first examining how each part of the standard is composed and then covering the intended uses and applications of each particular effect.

With this book, you will learn how to:

  • Use the MPEG-V standard to develop applications
  • Develop systems for various use cases using MPEG-V
  • Synchronize the virtual world and real world
  • Create and render sensory effects for media
  • Understand and use MPEG-V for the research of new types of media related technology and services

Key features:

  • The first book on the new MPEG-V standard, which enables interoperability between virtual worlds and the real world
  • Provides the technical foundations for understanding and using MPEG-V for various virtual-world, mirrored-world, and mixed-world use cases
  • An accompanying website features schema files for the standard, with example XML files, source code from the reference software, and example applications (an illustrative sketch of such an XML description follows this list)
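
To give a feel for the SEDL descriptions covered in Chapter 2, below is a minimal, hypothetical sketch of a Sensory Effect Metadata (SEM) document that groups a wind effect and a light effect to be rendered alongside a piece of media. The element names (SEM, DescriptionMetadata, GroupOfEffects, Effect) mirror the SEDL structure listed in the chapter outline below, but the namespace URNs, attribute names, and values shown here are illustrative assumptions, not normative excerpts from the standard or from the book's companion files.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sketch only: namespace URNs, attribute names, and values are assumptions. -->
<SEM xmlns="urn:mpeg:mpeg-v:2010:01-SEDL-NS"
     xmlns:sev="urn:mpeg:mpeg-v:2010:01-SEV-NS"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <DescriptionMetadata/>
  <GroupOfEffects>
    <!-- A wind effect and a light effect rendered together with the media. -->
    <Effect xsi:type="sev:WindType" intensity-value="0.5" duration="100"/>
    <Effect xsi:type="sev:LightType" intensity-value="0.8" color="#FF0000"/>
  </GroupOfEffects>
</SEM>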

Chapter

Preface

1 Introduction to MPEG-V Standards
1.1 Introduction to Virtual Worlds
1.2 Advances in Multiple Sensorial Media
1.2.1 Basic Studies on Multiple Sensorial Media
1.2.2 Authoring of MulSeMedia
1.2.3 Quality of Experience of MulSeMedia
1.2.3.1 Test Setups
1.2.3.2 Test Procedures
1.2.3.3 Experimental QoE Results for Sensorial Effects
1.3 History of MPEG-V
1.4 Organizations of MPEG-V
1.5 Conclusion
References

2 Adding Sensorial Effects to Media Content
2.1 Introduction
2.2 Sensory Effect Description Language
2.2.1 SEDL Structure
2.2.2 Base Data Types and Elements of SEDL
2.2.3 Root Element of SEDL
2.2.4 Description Metadata
2.2.5 Declarations
2.2.6 Group of Effects
2.2.7 Effect
2.2.8 Reference Effect
2.2.9 Parameters
2.3 Sensory Effect Vocabulary: Data Formats for Creating SEs
2.4 Creating SEs
2.5 Conclusion
References

3 Standard Interfacing Format for Actuators and Sensors
3.1 Introduction
3.2 Interaction Information Description Language
3.2.1 IIDL Structure
3.2.2 DeviceCommand Element
3.2.3 SensedInfo Element
3.2.4 InteractionInfo Element
3.3 DCV: Data Format for Creating Effects Using Actuators
3.4 SIV: Data Format for Sensing Information Using Sensors
3.5 Creating Commands and Accepting Sensor Inputs
3.6 Conclusion
References

4 Adapting Sensory Effects and Adapted Control of Devices
4.1 Introduction
4.2 Control Information Description Language
4.2.1 CIDL Structure
4.2.2 SensoryDeviceCapability Element
4.2.3 SensorDeviceCapability Element
4.2.4 USPreference Element
4.2.5 SAPreference Element
4.3 Device Capability Description Vocabulary
4.4 Sensor Capability Description Vocabulary
4.5 User’s Sensory Effect Preference Vocabulary
4.6 Sensor Adaptation Preference Vocabulary
4.7 Conclusion
References

5 Interoperable Virtual World
5.1 Introduction
5.2 Virtual-World Object Metadata
5.2.1 Introduction
5.2.2 Sound and Scent Types
5.2.3 Control Type
5.2.4 Event Type
5.2.5 Behavior Model Type
5.2.6 Identification Type
5.3 Avatar Metadata
5.3.1 Introduction
5.3.2 Appearance Type
5.3.3 Animation Type
5.3.4 Communication Skills Type
5.3.5 Personality Type
5.3.6 Motion Control Type
5.3.7 Haptic Property Type
5.4 Virtual Object Metadata
5.4.1 Introduction
5.4.2 Appearance Type
5.4.3 Animation Type
5.4.4 Virtual-Object Components
5.5 Conclusion
References

6 Common Tools for MPEG-V and MPEG-V Reference SW with Conformance
6.1 Introduction
6.2 Common Types and Tools
6.2.1 Mnemonics for Binary Representations
6.2.2 Common Header for Binary Representations
6.2.3 Basic Data and Other Common Types
6.3 Classification Schemes
6.4 Binary Representations
6.5 Reference Software
6.5.1 Reference Software Based on JAXB
6.5.2 Reference Software for Binary Representation
6.6 Conformance Test
6.7 Conclusion
References

7 Applications of MPEG-V Standard
7.1 Introduction
7.2 Information Adaptation from VW to RW
7.2.1 System Architecture
7.2.2 Instantiation A: 4D Broadcasting/Theater
7.2.3 Instantiation B: Haptic Interaction
7.3 Information Adaptation from the RW into a VW
7.3.1 System Architecture
7.3.2 Instantiation C: Full Motion Control and Navigation of Avatar or Object With Multi-Input Sources
7.3.3 Instantiation D: Facial Expressions and Body Gestures
7.3.4 Instantiation E: Seamless Interaction Between RW and VWs
7.4 Information Exchange Between VWs
7.4.1 System Architecture
7.4.2 Instantiation F: Interoperable VW
References

Terms, Definitions, and Abbreviated Terms

Index
