
PHISH LIVE AT SPHERE

May 6, 2024

A broadcast first at Sphere

For decades, longtime Phish Video Director Trey Kerr has been making sure that fans in the venue, and those at home who paid to stream a specific show for their “couch concert,” have a great visual experience. For Phish Live at Sphere, he’s had another demographic to take care of: members of the crew.

“It’s imperative that our crew can successfully do their jobs in this environment, and that relies on video,” he explains. Among the many unique aspects of Sphere is that there’s no side stage area. So, in addition to the important broadcast duties Kerr typically takes care of, crew members who are used to making eye contact with the band from the side are now behind the stage or under it. “The monitor engineer needs to see something specific, the backline techs need to see specific things” to catch visual cues from band members and make adjustments for them.

Kerr juggled a total of 44 cameras, almost half of them for broadcast purposes, in the first live global broadcast from the Sphere. It was one of the most video-heavy shows he’s ever been a part of, involving more screens and routers than he’s ever deployed before.

Kerr is also CEO of St. Louis-based Gateway Studios & Production Services, the lighting vendor for Phish Live at Sphere. “These four shows are unlike anything else that the band’s done,” he says. “Phish brought four different shows — four unique pieces of art, one for each night. They took all the incredible technological advancements the Sphere offers and used them to their fullest potential. It transported the audience to another place.”

Front and center for Kerr was ensuring that those at home get the best possible experience of this once-in-a-lifetime show series. He worked with the new Panasonic AK-PLV100 4K CINELIVE Studio Cameras, which he says provide a more cinematic feel in addition to looking great. Another piece of gear being used is from Tecnopoint. They offer small robotic PTZ camera dollies and tracks that can be tied into the Panasonic controls. A custom 150’ track was built on the stage’s downstage edge. “For us, the big thing comes from an archival perspective, as we want to capture these four unique shows so that the band — and their fans — can have them in the can for the rest of time.”

In the Phish universe, there is no such thing as a “regular” show. But what they did for four April nights that made up Phish Live at Sphere in Las Vegas is not exactly like anything else they (or anyone else) have done. The sprawling, ambitious event was visually operatic in scope, taking full advantage of the technical and creative possibilities of the Sphere. “We didn’t necessarily have a single technology that’s never been used anywhere, but we used a lot of them to an extent that we put them where they hadn’t been pushed before,” Show Director Abigail Rosen Holmes says. During the four days and over 14 hours of performance, no song was repeated, and all were augmented by dazzlingly improvised visuals. The four shows took advantage of next-generation technologies to showcase immersive multi-media content generated from an innovative fusion of real-time and pre-rendered visuals. 

It was a mind-blowing premise to begin with: you could not find a band more diametrically opposed to U2, who opened the venue with a 40-show residency. Rosen Holmes and Phish’s Trey Anastasio saw one of those shows and left with one conclusion: They would need a radically different approach.

The content spanned from geometric shapes dancing — adding to the atmosphere — to beautifully rendered pieces of a forest, soap bubbles, trees that shot off fireworks, and for the song “Taste,” modern hieroglyphics on a golden tomb wall with a hanging 3D crown in the middle. But there could be quiet moments too, like a single serene still image from nature. It all added up to emotionally moving moments — and the occasional whimsical one (flying cars, anyone?).

What’s your history with Phish?

I first worked with Phish in 2016. It’s always interesting to talk to people about Phish because many don’t give them enough credit for how innovative and experimental they are. Their legendary shows are well-known among the people who follow them not just for the amazing music, but also for the gags, set pieces, and theatrical moments they create for some of their performances.

Except for special shows like the recent New Year’s Eve set, they rarely have video, yes?

Their touring shows do not typically have a video element. When I do get to work with them, it’s wonderful. I love working with their legendary, longtime Lighting Designer, Chris Kuroda. And then I’ve also worked with Trey Anastasio on one of his side projects, called Ghosts of the Forest. That project was a little bit more of a theatrical piece with video and staging.

What was your approach to these performances?

The performances are fluid and happen in real-time. But the band is also much appreciated by their fans for these more theatrical pieces, which usually still allow the band a tremendous amount of freedom in how they play, but with a more structured theatrical visual taking place around them.

So, when I was first speaking with Trey about what he wanted these shows to be, I asked, ‘How are you thinking about these? Do you see them like one of the theatrical pieces or a gag, or is this an opportunity to take this amazing technical innovation that the Sphere is and figure out an amazing way to put a Phish show inside it?’ He said it was the latter, and that has been the basis for how we’ve thought about everything we’ve developed. But then, we all knew it still had to be a Phish show, and every song had to go wherever it takes them musically. The design for staging and visuals would not impede that.

 

So how did creating original content for this show work?

All of the video that we use must be able to be manipulated in real-time. None of our pieces are hooked to a track or playing automatically. [Co-Creative Directors] Moment Factory have been fantastic in creating the pieces, and we have some pre-rendered content and access to a lot of ways to manipulate it through Notch. Then we have a lot of pieces that are made with Unreal Engine.
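To make that idea concrete, here is a minimal, hypothetical sketch in Python of what “nothing is hooked to a track” implies: every visual is treated as a layer with live parameters an operator can push mid-song, whether the source is a pre-rendered clip manipulated through Notch or a real-time Unreal Engine scene. The VisualLayer class and nudge method are invented names for the illustration, not part of Disguise, Notch, or Unreal.

# Hypothetical sketch (not the Disguise/Notch/Unreal API): models the idea that
# every visual is a live-manipulable layer rather than a fixed timeline clip.
from dataclasses import dataclass, field

@dataclass
class VisualLayer:
    """A pre-rendered clip or a real-time scene, plus live parameters."""
    name: str
    kind: str                                   # "pre-rendered" or "real-time"
    params: dict = field(default_factory=dict)  # e.g. speed, hue, intensity

    def nudge(self, **changes):
        """Operator tweaks parameters mid-song instead of following a timecode track."""
        self.params.update(changes)
        return self

# Example: a pre-rendered forest piece whose speed and color the operator
# can push in real time as the jam builds.
forest = VisualLayer("forest", "pre-rendered", {"speed": 1.0, "hue_shift": 0.0})
forest.nudge(speed=1.4, hue_shift=0.2)  # respond to the band, not a timeline
print(forest.params)                    # {'speed': 1.4, 'hue_shift': 0.2}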

Can you tell us about the process of creating the original content?

I’d love to. First a little Phish history. They have done projects in the past where they might have assigned a theme to a show. For example, there was a project at Madison Square Garden called Baker’s Dozen. There was a flavor of donut every night, and through 13 concerts, no song was repeated. Each night the song that kicked it off had a reference to donut flavors. [“Shake Your Coconuts,” a medley of Boston and Cream songs, etc.]

The theme of these shows was “States of Matter,” with each night having its own theme: Solid, Liquid, Gas, and Plasma. We weren’t literal about it, but it gave us a direction for starting conversations about fun ideas for the visuals for each night. Moment Factory and I have been working on content ideas since September, arriving at what I used in these four shows. But it started way earlier than that, with Trey and me exchanging theme ideas.

I bet that list is fun —

Hilarious. One of the interesting things with this creative process was that it was spread out over a lot of time, so our processes came in waves. You get it to a certain place, everybody checks in and looks at it, then you push it another level further down the road.

What did you use for control, and how was programming handled?

Chris Kuroda and his [Associate Designer and] Programmer, Andrew Giffin, have done extraordinary bits of programming on the grandMA to allow him to manipulate the lights the way he does. So, we took a lot of cues from that. I’ve been standing around watching Chris run his show since I became involved with the band in 2016. We have a wonderful Content Programmer, Benjamin Roy, and we went into the grandMA and basically built out a video control layout. There are so many things about video that are not the same as what Chris does with lighting, but we took a lot of knowledge about how he was manipulating things and applied it or changed it to program for the video control surface.
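As a rough illustration of that analogy (a hypothetical Python sketch, not grandMA syntax or the actual show file), a video layer can be thought of like a lighting fixture: a set of named parameters with ranges that live on faders and encoders. The video_layer_profile dictionary and apply_fader function below are invented for the example.

# Hypothetical illustration of the lighting-desk analogy (not grandMA syntax):
# each video layer exposes named parameters with ranges, like fixture attributes.
video_layer_profile = {
    "clip_select": {"range": (0, 255),    "maps_to": "content bank / slot"},
    "intensity":   {"range": (0, 100),    "maps_to": "layer opacity %"},
    "speed":       {"range": (-200, 200), "maps_to": "playback rate %"},
    "effect_mix":  {"range": (0, 100),    "maps_to": "Notch effect blend %"},
}

def apply_fader(parameter: str, value: float) -> str:
    """Clamp an incoming fader value to the parameter's range, desk-style."""
    lo, hi = video_layer_profile[parameter]["range"]
    clamped = max(lo, min(hi, value))
    return f"{parameter} -> {clamped} ({video_layer_profile[parameter]['maps_to']})"

print(apply_fader("speed", 260))  # clamps to 200, like an encoder hitting its limit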

What was it like to be FOH at the console?

It was slightly terrifying and also exciting! [Laughs] I chose and played back the video content on the spot, in real time, as the band performed.

How did the Sphere influence the outcome?

There were technical and creative challenges, which led to tremendous opportunities. One thing we were aware of is that the viewing experience is quite different from different places within the room. So, we made a checklist for ourselves to ensure that everybody who comes in has a great experience.

The opportunity to play this room, for me, meant that what we did with live, manipulated video pushed a lot of technical boundaries. Yes, we took some risks, and in a situation like this you don’t know 100% whether or how it will work. But we really leaned into some risky things, and the result was a completely mind-blowing Phish experience, which was our goal.

The visuals seemed to build during songs, even though the band itself doesn’t know where a song is going. Was it just ‘active listening’?

Active listening is such a great way to describe it. Of course, to do this requires a good starting point — knowing the songs, being familiar with how the band plays. And then as much as possible immersing into the music and feeling where it’s going.

Whether it was the abstract imagery or the more pictorial pieces, we conceived of the visuals as starting points, with the ability to build and change from there. Some — like the amazing Pollock art piece used in “Taste” — didn’t need big changes to be impactful, and it felt important to have times where the movement slowed down and let everyone be in that moment.

Moment Factory: Designing the unpredictable

Moment Factory played a central role in the co-creative direction of the four shows of Phish Live at Sphere, collaborating closely with Show Director/Co-Creative Director Abigail Rosen Holmes. The studio’s involvement extended to set and video design, as well as contributing to lighting concepts in collaboration with the band’s longtime Lighting Designer, Chris Kuroda.

“Phish Live at Sphere is an incredibly audacious project that we are thrilled to have worked on,” says Daniel Jean, Producer for Shows, Moment Factory. “We feel extremely privileged to have legendary bands like Phish entrusting us with creating immersive universes around their music and identity, enabling them to connect with fans in completely new ways. Moment Factory got its start VJ-ing in the late ’90s, and we are particularly proud, nearly 25 years later, to have maintained this initial passion. We are now pushing it to an unprecedented level, creating the longest-ever real-time content show on the world’s most immersive canvas at Sphere.” Phish Live at Sphere marks the Montreal-based studio’s seventh and most ambitious collaboration with the prolific rock band. Their long-term creative partnership includes previous concerts at Madison Square Garden and the MGM Grand.

In October, Moment Factory’s Jean got the call to provide video content for a special Phish show at the Sphere. He was told the budget. “Absolutely we can do this,” he replied. He was reminded that the shows would be the band’s typical 3-1/2 hours long... “Sure...” The run would be four nights, and each night had to be completely different, with all original content for each. “Yeah...” Jean laughs now but recalls doing the math: four nights x 3-1/2 hours added up to 14 hours of original content. “I went back to the group and said we need to reinvent the production pipelines. We need to rethink how we create content, and build assets that can be played live.”

The creative process between Show Director/Co-Creative Director Rosen Holmes and Moment Factory was “an extremely collaborative process,” Jean says. Coming up with separate content for four nights each with their own theme was “extremely inspiring for us because the visual territories were equally open yet pretty defined. We created building blocks for content ideas.”

While they have created content for bands in the past (Billie Eilish, The Killers), a series of concerts for a band with Phish’s DNA is a different animal. “I called it ‘designing the unpredictable,’ because we have to face all types of musical situations, and within those situations, there have to be rich territories for Abigail so she can essentially jam along with the band [visually],” Co-Creative Director Jean-Baptiste Hardoin says. Old and new technologies and tools were used, including AI, to keep maximum flexibility as the ideas kept coming. “The good thing is we have a really great relationship with the band, so we’re able to explore and become a laboratory of ideas.”

The Moment Factory team worked closely with a trusted network of collaborators, including Disguise, whose innovative platform powered Moment Factory’s extensive hours of pre-rendered and real-time visual content on a sprawling 16K x 16K resolution canvas. Fuse Technical Group served as the technical integrator for the video system, managing the incorporation of Moment Factory’s content onto servers and providing supplementary servers for real-time content delivery. Moment Factory also enlisted Fly Studio, Myreze, and Sila Sveta for screen content production; and Picnic Dinner Studios and Totem Studio for 2D animations during the intermission.

Justin Restaino was the Screens Producer, working closely with Hardoin and Jean to produce the content and pull everything together technologically. “The whole process has been an incredible collaboration between Abby, J.B., and the Moment Factory team, especially in terms of understanding how the content reflects the concepts and then pushing it onto the 23 servers,” says Restaino. The system was built to be flexible — so that any clip or real-time scene could be called up and busked by Show Director Rosen Holmes, using a grandMA3 controlling Sockpuppet. A total of 54 Disguise servers came together to produce these unique shows — 23 gx 3 machines running the screen and Notch content, together with 31 rxII render nodes provided by Fuse, hosting Unreal Engine via RenderStream.
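To give a sense of the scale involved, here is a back-of-the-envelope sketch in Python. It assumes “16K x 16K” means 16,384 pixels per side and, purely for illustration, splits the canvas evenly across the 31 rxII Unreal render nodes, which is a simplification of how Disguise actually distributes RenderStream workloads.

# Back-of-the-envelope scale check (illustrative assumptions, not the actual
# Disguise configuration): total canvas pixels, and what an even split across
# the 31 rxII Unreal render nodes would look like.
CANVAS_W = CANVAS_H = 16_384           # "16K" taken as 16,384 px per side
RENDER_NODES = 31                      # rxII machines hosting Unreal via RenderStream

total_pixels = CANVAS_W * CANVAS_H             # ~268 million pixels
pixels_per_node = total_pixels / RENDER_NODES  # simplistic even split

print(f"Total canvas: {total_pixels / 1e6:.0f} MP")
print(f"Per node (even split): {pixels_per_node / 1e6:.1f} MP "
      f"(~{pixels_per_node ** 0.5:.0f} px square)")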

Restaino and Rosen Holmes give credit to Content Programmer Roy, who is “incredibly talented and able to program the grandMA to make it easy for the Show Director to pull up content live that fits the music happening on the stage,” Restaino explains. “What we’re doing here is absolutely wild. We are using the Unreal Engine across 31 [Disguise] rxII render nodes and displayed on a 16K canvas. It’s all seamlessly working together to create the visuals.”

I-Mag is something the band is not particularly fond of, but Rosen Holmes and the band realized that in the Sphere it would be another way to make a connection between the individual band members and the audience. It was used sparingly — and artistically. “We wanted to use it to bring this idea of not just putting the faces of the four members on the wall, but using filters, diffusing and altering the images,” Hardoin says.

They all agree that the first time the band got on stage to rehearse, started to play, and the entire system was fired up, it was a shared awe-inspiring moment. Hardoin concludes, “It was a high-impact experience. After all these months, we were finally able to sit down and watch the screen come to life and the band come to life, and that’s when we knew we had created a special moment.”

This article originally appeared in PLSN