Cutting Edge
Live Music Experiences Bridging the Real and Virtual Worlds - Revolutionizing the Distance between Fans, Artists and Content
The COVID-19 pandemic has accelerated the phenomenon of streaming live music online. At the same time, though, these experiences have made people realize just how different a streamed concert with no audience feels from a real, in-person experience. Sony is determined to address this gap. As part of Project Lindbergh, a group-wide extended reality (XR) entertainment initiative, Sony aims to offer a completely new experience, bridging the real and virtual worlds by connecting artists and fans using VR technology. To find out where this project stands today and where it is headed, we talked with Takashi Imamura, the development leader, and Kazuma Takahashi, who is in charge of UI/UX and interaction development.
Profile
- Takashi Imamura
- Kazuma Takahashi
Sony’s vision for new VR live music
──What do you mean by “live music experiences bridging the real and virtual worlds?”
Takashi Imamura:Sony isn’t the only company looking for ways to use VR technology to deliver live music experiences to fans. There are already a number of services available. In many cases, VR cameras are installed at the venue and content is streamed to viewers via head-mounted displays where they watch the event on their own at home. But the artist’s performance isn’t the only thing going on at the physical venue. There, you can feel the vibe of excited fans all around, enjoy interaction with the artist, and purchase merch and other items at the vendors. Without these elements, the virtual experience is far inferior to the real thing. What we want to do is translate and transplant the value of these real experiences into the virtual world, while also adding a uniquely virtual dimension to the experience to create entirely new value.
Live music experiences bridging the real and virtual worlds — What it looks like
- Venue lobby
Meet your friends in the lobby before the performance, share your excitement about the upcoming show, select a glowstick or other items to bring into the venue and try them out before the concert.
- In front of the stage
Interact with friends even during the live performance and cheer on the artist together. You can even clap and wave your glowstick or other items to show your support. You can see not just your friends, but all the other fans in the venue.
- During the live performance
The live performance is not just streamed music and video; it comes with interactive features: the audience seating moves in sync with the video, and the artist can respond to thrilled fans by sending hearts flying out to them, making the stage performance all the more exciting.
- After the live performance
After the performance is over, you return to the lobby, where you can enjoy the afterglow of the show while talking about the concert with friends. The demo also gives users a chance to pick up photos from the performance (there are even surprises, such as autographs or messages from the artist, hidden on the back of the photo).
──When and how did this project start?
Imamura:We have been working on VR streaming of live music venues for some time, with a notable success on PlayStation® VR in December 2019. Since then, we’ve been refining our VR shooting technology to improve image quality. We’ve also taken on a number of challenges related to delivering venue audio, which is an important aspect of the live music experience. In this area, we handled the Survive Said The Prophet VR EXPERIENCE, which won a Lumiere Award in the U.S. in 2020. When talking to fans of live music, however, it’s clear that delivering audio and video alone is not enough. They say that things like a sense of togetherness and unity are important. This got us wondering how we could bring these experiences to life using technology, and that is what triggered this project.
Kazuma Takahashi:We have had a lot of discussions since then, realizing that in addition to the music performance itself, things like chatting with friends before the show, basking in the afterglow afterward, and soaking in the heightened atmosphere of the venue are all important aspects of a live performance. We agreed that, although elements such as incredibly high image quality, the ability to get up close to the artists, and the freedom to change the camera angle at will are the best parts of the VR live experience, those elements alone are not enough.
Imamura:I think that existing forms of remote live performances have been very meaningful during the current pandemic. But it’s a bit sad that you just watch on a small smartphone screen in your living room, the fun ends the moment the show is over, and there’s no way to enjoy the atmosphere before and after.
Connecting artists with fans, and connecting fans with each other
──Is there any special focus as you recreate the atmosphere of a live music venue in virtual space?
Imamura:The most important thing is not to diminish the sense of immersion. You need to avoid anything that snaps the user back to reality and makes them think, “Wait, what am I doing?” One of the things we did in this regard was to keep the controls minimal, simple, and intuitive. Instead of pressing a button on the controller to wave a glowstick, as in past musical VR content, we developed more physical controls that let users actually clap or flash a thumbs up. This was more effective than we had imagined: moving your body in time with the music gives you the sensation of swaying along with it, which makes the whole experience more fun.
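To make the idea of these physical controls concrete, here is a minimal sketch of detecting a clap from tracked palm positions. The HandPose type, thresholds, and class names are illustrative assumptions, not details of the demo’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class HandPose:
    # Tracked palm position in metres (hypothetical tracking output).
    x: float
    y: float
    z: float

def distance(a: HandPose, b: HandPose) -> float:
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2) ** 0.5

class ClapDetector:
    """Fires once when both palms come together, then re-arms after they separate."""

    def __init__(self, close_thresh: float = 0.05, open_thresh: float = 0.20):
        self.close_thresh = close_thresh  # palms considered touching (m)
        self.open_thresh = open_thresh    # palms considered apart again (m)
        self.hands_apart = True

    def update(self, left: HandPose, right: HandPose) -> bool:
        d = distance(left, right)
        if self.hands_apart and d < self.close_thresh:
            self.hands_apart = False
            return True                   # clap detected on this frame
        if d > self.open_thresh:
            self.hands_apart = True       # ready to detect the next clap
        return False
```

Calling update() once per tracking frame turns raw hand positions into discrete clap events that the venue can broadcast to other participants.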
Takahashi:The demo you just viewed uses pre-recorded concert footage, including prepared virtual reactions from the artists, but even so, you can still enjoy the sensation of interacting with them. You might shout your head off and still end up watching the artist interact with the person next to you instead… but that just makes you want to cheer even louder. That connection between artists and fans, and between fans and other fans, was a key focus for us.
──You’re right. It really felt fresh to be at the venue with other fans and enjoy the show while sensing their movements and hearing their voices.
Imamura:It’s also very important to properly control communication within the audience. Currently, the demo system allows up to 28 users to participate in the same live music event at once, but the number of people you can communicate closely with via voice chat is purposely limited to four. Interaction with participants beyond those four is limited to nonverbal communication; you can’t hear what they say, but you can hear them clapping and see them waving glowsticks.
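As a rough sketch of how this kind of audience-side communication control could be structured, the snippet below routes voice only within a small friend group while broadcasting nonverbal reactions to the whole venue. The AudienceRouter class and its methods are hypothetical illustrations, not the demo’s actual code; only the 28-user and 4-user limits come from the interview.

```python
MAX_VENUE_USERS = 28   # participants in one live event (per the demo)
MAX_VOICE_GROUP = 4    # users who can hear each other's voice chat

class AudienceRouter:
    """Routes voice only within a small friend group; everyone else receives
    nonverbal events (claps, glowstick waves) so the crowd still feels alive."""

    def __init__(self):
        self.users: set[str] = set()
        self.voice_groups: dict[str, set[str]] = {}  # user -> their voice group

    def join(self, user: str, friends: list[str]) -> None:
        if len(self.users) >= MAX_VENUE_USERS:
            raise RuntimeError("venue is full")
        self.users.add(user)
        # A voice group is the user plus at most three friends.
        self.voice_groups[user] = {user, *friends[: MAX_VOICE_GROUP - 1]}

    def route_voice(self, sender: str, packet: bytes) -> list[str]:
        # Voice is delivered only to present members of the sender's own group.
        return [u for u in self.voice_groups.get(sender, set())
                if u != sender and u in self.users]

    def route_gesture(self, sender: str, gesture: str) -> list[str]:
        # Claps and glowstick waves are broadcast to everyone in the venue.
        return [u for u in self.users if u != sender]
```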
Takahashi:In addition to maintaining close communication between friends, we really focused on ensuring that the fans nearby still feel real, as well. The audience surrounding you doesn’t look like CG or robots simply waving their glowsticks, and I think this helps to enhance the sense of immersion in the show.
Imamura:Audience seating is arranged so that every single participant has the best seat in the house for enjoying the concert. This is something you can only do with VR. Of course, fans love this, but so do content creators and artists, because it allows them to show exactly what they want to every audience member at the performance. Still, we take care to maintain each participant’s position relative to their group, so that interactions with friends, such as waving to each other, never look contradictory.
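One way to picture this seating logic: each friend group is centered on the best viewing spot in its own copy of the venue, while members keep their offsets relative to one another so that waving to a friend still looks consistent. The sketch below is a hedged illustration under those assumptions; the coordinates, spacing, and single-row layout are not taken from the project.

```python
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

BEST_SEAT: Vec3 = (0.0, 0.0, 5.0)  # assumed ideal viewing spot in front of the stage
SEAT_SPACING = 0.8                  # metres between friends in a group

def assign_seats(groups: List[List[str]]) -> Dict[str, Vec3]:
    """Centre every friend group on the best seat in its own view of the venue,
    while preserving left-to-right order inside the group so friends appear
    where each member expects them to be."""
    seats: Dict[str, Vec3] = {}
    for members in groups:
        n = len(members)
        for i, user in enumerate(members):
            offset = (i - (n - 1) / 2) * SEAT_SPACING  # centre the group
            seats[user] = (BEST_SEAT[0] + offset, BEST_SEAT[1], BEST_SEAT[2])
    return seats

# Example: two friend groups, each centred on the best seat in its own instance.
print(assign_seats([["aki", "ben"], ["chie", "dai", "emi"]]))
```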
Advancing technology by bringing together diverse Sony assets
──How are your experiences in Project Lindbergh being used to make all this happen?
Imamura:Project Lindbergh is a group-wide effort we’ve been engaged in for the past three years, in which we continually discuss shooting, imaging, and audio technology with experts in these fields. In building the interactive experience this time, we also referenced prototypes from the interactive technology team.
Takahashi:Although we’ve been involved in a number of group-wide projects before, this has been an exceptional opportunity for us to connect so closely with artists and others in the entertainment industry. We had these entertainment experts take a look at the prototypes that those of us in the technology arena created, and worked together to address critiques and incorporate ideas and proposals to come up with a new experience.
Imamura:More specifically, team members with various backgrounds brainstormed to define what a unique VR live music experience should be. Then the Creative Center team, led by Kazuma, visualized the ideas via storyboards. These concrete ideas were then reviewed by team members and checked for any inconsistencies with the concept. At times, the technical team members created simple prototypes to verify the concept. We repeated this process over and over to arrive at the current experience.
Because we tend to come up with ideas from a technical starting point, the input of Sony Music team members was incredibly informative in learning about the viewpoint of artists and others in the entertainment industry. These types of intense discussions went on for about three years, and the essence has been incorporated into the current demo.
The challenge of delivering low-latency artist interaction
──Were there any technical hurdles that were a particular challenge in this process?
Takahashi:In my area of design, it was probably the challenge of figuring out visual expression. Although I have a lot of experience and knowledge in 2D visual design for displays such as TVs, much of it is of no use in a 3D space where the user can move around at will. For example, an effect that is trivial in 2D imagery, such as a shower of shining particles, can look quite different in virtual space than you intended, making it difficult to see. Solving such problems was a refreshing part of the experience.
──What types of technical hurdles do you see in the future?
Takahashi:I want to continue taking on the challenge of how best to bring out the artists’ performances using technology and design. How can we show the artists, who are surrounded by nothing but a green screen, that their fans are cheering them on from the virtual venue? This is exactly where data visualization, our realm of expertise, comes in. Right now I’m working on ways to visualize, for example, fans on the right side of the venue waving and cheering while others use their glowsticks to write messages in the air.
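As a hypothetical illustration of that kind of fan-to-artist visualization, the snippet below aggregates gesture events by venue zone into relative intensities that an artist-facing display could render. The event format, zone names, and function name are assumptions made for the sake of the example.

```python
from collections import Counter
from typing import Dict, Iterable, Tuple

# Each event is (zone, gesture), e.g. ("right", "wave") or ("left", "glowstick").
Event = Tuple[str, str]

def zone_intensity(events: Iterable[Event]) -> Dict[str, float]:
    """Aggregate fan gestures per venue zone so an artist-facing display can show,
    for instance, that the right side of the virtual venue is cheering hardest."""
    counts = Counter(zone for zone, _ in events)
    total = sum(counts.values()) or 1
    return {zone: n / total for zone, n in counts.items()}

print(zone_intensity([("right", "wave"), ("right", "clap"), ("left", "glowstick")]))
```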
Imamura:With regard to the communication from artists to fans, the demo used interaction effects sent out from the center of the scene. However, in an actual live performance, we will need to use sensors to keep track of body movements such as face positions and hand movements, and produce the special effects based on that. Sony’s sensor and camera technologies will be of great service in this. Going forward, I want to continue to pursue an experience that can’t be obtained in an actual live music performance, something that goes beyond real life.
On top of this, we will need to address latency. With current streaming technology, content of this scale would result in a one-way delay of about 20 seconds. A delay that long would kill any kind of real communication between the artists and fans. To solve this problem, we will need not only technology solutions but also more reflection on the user experience. The solution could require some kind of infrastructure at concert venues, and since the venue business is a specialty of the Sony Music Group, I’m sure they can come up with something uniquely “Sony.”
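To see why a roughly 20-second one-way delay is so damaging, here is a back-of-the-envelope calculation using generic segment-based streaming figures. The segment length, buffer depth, and encode time are assumed values for illustration, not measurements from the project.

```python
# Segmented streaming players typically buffer several media segments, so the
# glass-to-glass delay is roughly: encode + (buffered segments x segment length)
# + network/CDN overhead. Assumed values below:
encode_s = 2.0
segment_s = 6.0
buffered_segments = 3
network_s = 1.0

one_way = encode_s + buffered_segments * segment_s + network_s
print(f"one-way delay          = {one_way:.0f} s")  # about 21 s, close to the ~20 s cited
# If the fan-to-artist path were similarly delayed, a call-and-response loop
# (fans cheer, artist reacts, fans see the reaction) would take twice as long.
print(f"call-and-response loop = {2 * one_way:.0f} s")
```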
Prospects for expanding beyond live music performances
──Can you talk about the prospects beyond live music experiences bridging the real and virtual worlds?
Imamura:I think we can use this format in domains other than live music; there are lots of possibilities. It could be used in education, for screening movies in an audience-participation format, which has become popular recently, and in other fields. I’d like to expand it to content and experiences that hinge on communication.
Takahashi:I expect that this technology can be used to bring fans, artists, and content closer together. In the future, I’d like to create something that allows people to share an experience at the same time while feeling truly present in the same space. Sharing the same time is particularly important; I think that once-in-a-lifetime moments shared with one another are the heart and soul of the experience.
Imamura:I think with this initiative we’ve finally succeeded in creating an experience that includes the fundamental elements that allow people to truly enjoy something in a virtual space. I’m convinced that by adding on various technical elements from here on, we can create something amazing and completely new.
──Lastly, please tell us about some things that you think only Sony can do or create.
Imamura:The amazing thing about Sony is that we have a plethora of expertise within the group. If you visit Sony Music in Ichigaya or Roppongi, you can talk to people in the entertainment industry. If you go to Sony City Minatomirai or the Atsugi Technology Center, you can meet camera and image sensor engineers, and if you go to Osaki, you can see audio engineers and the R&D team. I believe the ability to discuss ideas with all these great people, try new things, and create new experiences is something no other company can duplicate. If anyone out there is interested in creating something new by being part of a killer team and connecting with other team members’ expertise, please come see us at Sony.