Zink, Sitaraman Win Best Paper Award for Research on 360-Degree Video Streaming

Mike Zink (left) and Ramesh Sitaraman (right)

Associate professor Mike Zink of the electrical and computer engineering department and professor Ramesh Sitaraman of the College of Information and Computer Sciences (CICS) at UMass Amherst, along with Joonsup Park of the University of Texas at Tyler and Mingyuan Wu, Eric Lee, Bo Chen and Klara Nahrstedt of the University of Illinois at Urbana-Champaign (UIUC), recently received a best paper award from the IEEE International Symposium on Multimedia (ISM) for their work, “SEAWARE: Semantic Aware View Prediction System for 360-degree Video Streaming.”

As campus tours, classes, and social gatherings have become available only with remote options, 360-degree videos that provide an interactive and “immersive” viewing experience are more essential than ever. Viewed on a mobile device or a head-mounted display, such videos enable the viewer to experience the content as if they were on the scene. But despite many recent advancements in the technology, delivering high-quality 360-degree videos at scale over the Internet remains a major unresolved challenge. According to the researchers, 360-degree video delivery requires significantly more bandwidth and much lower delays than conventional video. Higher delays in delivering 360-degree videos can cause the video to lag behind the user’s head movements, resulting in cybersickness and a poor-quality experience.

“When a user moves their head to view the video from a new angle, the new scene must be delivered from the video server to the viewing device within tens of milliseconds. Because data transmission over the Internet often incurs higher delays, this is not always possible to achieve,” explains Zink. “Our solution is to predict what the user is likely to watch next, fetch that content ahead of time from the server, and store it on the user’s device, so the delay can be much reduced.”

“For example, suppose that you are watching a 360-degree video of the Super Bowl from the comfort of your home. Our system processes the video to recognize objects of interest that you are likely to watch, such as the football as it moves across the screen. It then uses this information to predict how the user might move their head and what objects they may watch next,” says Sitaraman. “This allows parts of the video containing objects of visual interest, such as the football, to be fetched beforehand from the server and stored on the user’s device.”
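The idea described above can be illustrated with a small sketch. This is not the SEAWARE implementation; the tile grid, viewport size, and the simple linear extrapolation of the tracked object (e.g., the football) are all illustrative assumptions.

```python
# Illustrative sketch (not the SEAWARE implementation): extrapolate a
# tracked object's position to predict the viewer's next viewport, then
# decide which tiles of the 360-degree frame to prefetch.
# Tile layout, viewport size, and motion model are assumptions.

GRID_COLS, GRID_ROWS = 8, 4  # equirectangular frame split into 8x4 tiles

def predict_center(prev, curr):
    """Linearly extrapolate an object's (x, y) position in [0,1)^2.
    x wraps around the 360-degree seam; y is clamped to the frame."""
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    return ((curr[0] + dx) % 1.0, min(max(curr[1] + dy, 0.0), 1.0))

def tiles_for_viewport(center, width=0.25, height=0.25):
    """Return indices of tiles overlapping a viewport centered at `center`."""
    cx, cy = center
    tiles = []
    for row in range(GRID_ROWS):
        for col in range(GRID_COLS):
            tx = (col + 0.5) / GRID_COLS   # tile center, x
            ty = (row + 0.5) / GRID_ROWS   # tile center, y
            # horizontal distance wraps around the 360-degree seam
            dx = min(abs(tx - cx), 1.0 - abs(tx - cx))
            dy = abs(ty - cy)
            if (dx <= width / 2 + 0.5 / GRID_COLS
                    and dy <= height / 2 + 0.5 / GRID_ROWS):
                tiles.append(row * GRID_COLS + col)
    return tiles

# The football moved right between frames; prefetch tiles around its
# extrapolated position before the viewer turns their head.
prev_pos, curr_pos = (0.40, 0.5), (0.45, 0.5)
predicted = predict_center(prev_pos, curr_pos)
prefetch = tiles_for_viewport(predicted)
```

With these toy numbers the object is extrapolated to the frame center, and the tiles around that point would be fetched ahead of time and cached on the device.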

The team’s paper introduces the Semantic-Aware View Prediction System (SEAWARE), a novel system that performs anticipatory semantic analysis for view prediction. SEAWARE can be integrated into the Dynamic Adaptive Streaming over HTTP (DASH) framework that is widely used for video streaming. This work is part of a $1.2 million NSF-funded project, a collaboration between UMass Amherst and UIUC to research novel systems for 360-degree video creation and delivery.
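In a DASH-style tiled stream, each tile is typically offered at several bitrates, and a client picks one per tile. A hedged sketch of how view prediction could drive that choice is below; the bitrates, tile count, and budget policy are made-up illustrations, not SEAWARE's actual algorithm.

```python
# Hedged sketch, assuming a DASH-style tiled stream with two per-tile
# representations: give predicted-view tiles the high-bitrate version
# and everything else the low-bitrate one, within a bandwidth budget.
# All numbers here are illustrative assumptions.

HIGH_KBPS, LOW_KBPS = 800, 100   # assumed per-tile representations

def allocate(tiles_total, predicted_tiles, budget_kbps):
    """Map each tile index to a bitrate; upgrade predicted tiles to
    HIGH_KBPS only while the bandwidth budget still covers it."""
    alloc = {t: LOW_KBPS for t in range(tiles_total)}
    spent = tiles_total * LOW_KBPS   # baseline: every tile at low quality
    for t in predicted_tiles:
        if spent + (HIGH_KBPS - LOW_KBPS) <= budget_kbps:
            alloc[t] = HIGH_KBPS
            spent += HIGH_KBPS - LOW_KBPS
    return alloc, spent

# Usage: 32 tiles, four tiles predicted to be viewed, 6 Mbps budget.
alloc, spent = allocate(32, [10, 11, 12, 13], 6000)
```

The design choice this illustrates is the key trade-off in tiled 360-degree streaming: bandwidth is concentrated where the viewer is predicted to look, while the rest of the sphere stays available at low quality in case the prediction is wrong.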

Zink is a professor of electrical and computer engineering at UMass Amherst and an adjunct faculty in CICS. His research interests are in cyber-physical systems, cloud computing, multimedia systems and future internet architectures. He is a senior member of the IEEE and ACM, and the recipient of the DASH Industry Forum Excellence in DASH Award, as well as the NSF CAREER award.

Sitaraman directs the Laboratory for Internet-Scale Distributed Systems (LIDS) at CICS. He is best known for his role in pioneering content delivery networks that currently deliver much of the world’s web content, streaming videos, applications and online services to billions of people around the globe. He is the recipient of the UMass Amherst Distinguished Teaching Award, ACM SIGCOMM Networking Systems Award, DASH Industry Forum Excellence in DASH Award, CICS Outstanding Teaching Award and the NSF CAREER Award. He is a fellow of the ACM and the IEEE.

Read the paper.