
The blazing opening shot of the music video for "Back to the Rhythm" with rock band H.E.A.T.
Why did you choose to film this music video in a virtual world?
I started my whole journey into real-time graphics and virtual production by trying to make music videos in a cost-effective and fast way with amazing scenery, so with this video I feel I have really come full circle. Some years later I have mastered the craft and can offer brands and bands exactly what they want; nothing is really impossible in virtual production. I also played in rock bands myself years ago and have always followed H.E.A.T's brilliant career, so this was a special shoot for sure.

Richard Frantzén, rocker and virtual production master.
How did you start the process with this video?
As always, the creative process starts with lots of meetings with the band about exactly what we should do. This is always a process, since a band consists of several people with different opinions. And it doesn't make things easier that the possibilities in virtual production are endless, says Richard with a smile.
We tossed a few ideas around, and it was hard to nail down a date since both the band and the studio were very busy. Which song we would make a video for also changed a few times, which meant we changed ideas several times. We talked about having the band playing on the wings of an airplane, flying among the clouds, and about playing on a moving train.
Keep in mind that we need time to prepare the 3D world in advance, since the graphics are filmed in real time with the band. So as the final shooting date got closer, we settled on the broad concept of a "H.E.A.T City": the band has taken over a whole city and made it their own, complete with fans and a Macy's-Parade-style procession. Some of the inspiration comes from the Beatles' rooftop concert and the AC/DC video It's a Long Way to the Top.
“I added a wall of Marshalls, some pyro, some fire and a few giant LED screens, which I took over and routed a pre-made video file that's in sync with the song for added effect. A very easy way of creating uniqueness.”
What was the preparation like?
So, to fit this within budget it's not reasonable to design everything from scratch, but since Unreal has an excellent marketplace with myriads of assets, we were able to find a New York City environment that could work as a foundation.
Of course, I knew this when I pitched the idea. You have to guide the client creatively toward where you know assets are already available. From there it was a matter of set-dressing the city and customizing it to become a true H.E.A.T City.
We found a great spot for the band to perform on a roof. We added a wall of Marshalls, some pyro, some fire and a few other assets. The whole scene is filled with giant LED screens, which I took over and routed a pre-made video file that's in sync with the song for added effect. A very easy way of creating uniqueness.
The other scene was the ”Macy's Parade”, where the band plays on the trailer of a semi with fans standing along the road. What we did here was modify an asset from the marketplace and make it move. Then we attached the camera tracking data to the same movement, and voilà: the band is standing on a moving semi.
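The trick of attaching the camera tracking data to the vehicle's movement boils down to a transform composition: if the same world-space motion is applied to both the semi and the tracked camera, their relative pose never changes, so the band stays framed as shot while the city streams past. A minimal sketch of the idea (plain Python with translations only, not the actual Unreal/Aximmetry setup; all values are made up):

```python
def world_pos(parent_pos, local_pos):
    """Compose a parent translation with a local offset (translation only)."""
    return tuple(p + l for p, l in zip(parent_pos, local_pos))

# The semi drives forward frame by frame; the tracked camera's studio pose is constant.
semi_path = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
camera_local = (-3.0, 1.5, 2.0)  # tracked camera pose relative to the studio origin

# Apply the vehicle's motion to the camera as well: the camera-to-semi offset
# stays constant every frame, which is why the band appears to ride the truck.
offsets = []
for semi in semi_path:
    cam = world_pos(semi, camera_local)
    offsets.append(tuple(c - s for c, s in zip(cam, semi)))
```

In a real engine this is simply parenting the virtual camera to the moving vehicle, so the tracking data is interpreted in the vehicle's local space.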
And for the scenes where the band members are giants in the city, it was just a matter of scaling them up.

Singer Kenny Leckremo as a giant on the streets of H.E.A.T City.
What was the actual shooting like?
Basically, when shooting a music video like this, you play the song several times and capture the performance with one or more cameras.
To make all the takes match, we pre-programmed all the events, like pyro and video cues, beforehand in Aximmetry's sequencer. As an added benefit, we also pre-programmed some physical lighting cues that lit up the band in the physical studio.
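The core of pre-programming cues this way is a list of events keyed to the song's timecode, fired once as the playback clock passes each one, so every take triggers identically. A toy sketch of that pattern (the cue names and timings here are invented; this is not the Aximmetry sequencer itself):

```python
# Each cue is (song time in seconds, event name), fired once per playthrough.
cues = [
    (0.0, "start LED screen video"),
    (12.5, "pyro burst 1"),
    (45.0, "studio light cue: chorus wash"),
]

def due_cues(cues, prev_t, now_t):
    """Return every cue whose timestamp falls in the window (prev_t, now_t]."""
    return [name for t, name in cues if prev_t < t <= now_t]

# Simulated playback: poll the song clock once per second.
fired = []
clock = 0.0
while clock <= 50.0:
    fired += due_cues(cues, clock - 1.0, clock)
    clock += 1.0
```

Because the cues are driven by the song clock rather than by hand, pyro, video and lighting land on the same beat in take one and take five alike.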
Then it was just a matter of grabbing as many takes as the band could possibly do. This shoot was great because we had a handheld camera operator and a crane working simultaneously, which meant we captured two angles on every single take.
We did about five takes per scene: first five takes on the roof, then five on the semi. After that we moved on to single takes for the footage with the band as giants.
We ended the shoot by having the singer perform at various places around the world for added B-roll, and this is where a virtual environment really shines. As the VP tech, I could just move around in the world, place the cameras in an interesting area and let the camera operators find cool shots. A very creative and fast way to work together.
B-roll on the edge of a rooftop.
What was recreating the Terminator 2 teleport scene like?
The idea came up very spontaneously. I'm not sure who brought it up, but we all liked it and set a second date to teleport Kenny into H.E.A.T City, in a similar fashion to Arnold Schwarzenegger in Terminator 2. I prepared the animations, and we filmed it with real-time graphics and animations like the rest of the video. Kenny was there, crouched down the whole time, but wasn't revealed until I triggered the effects and animations, which also removed the mask that hid him.
As a matter of fact, I programmed the lightning effect to be triggered by the push of a button on an Elgato Stream Deck. My son enjoyed triggering the effects over and over between takes, but it really was that simple; all you need is timing with the camera movements.
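To illustrate the one-button idea, here is a hypothetical sketch, not the actual setup: suppose the render machine listens for a small UDP datagram and fires the lightning animation when it arrives, and a Stream Deck key is simply bound to run this script. The host, port and message are invented for illustration; in practice the trigger would go through whatever remote-control interface the VP software exposes.

```python
import socket

TRIGGER_HOST = "127.0.0.1"  # render PC address (assumed)
TRIGGER_PORT = 47123        # arbitrary listening port (assumed)

def fire_lightning():
    """Send the (hypothetical) one-shot trigger datagram to the render machine."""
    msg = b"trigger/lightning"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(msg, (TRIGGER_HOST, TRIGGER_PORT))
    return msg

if __name__ == "__main__":
    fire_lightning()
```

The appeal of this pattern is exactly what Richard describes: the effect becomes a physical button anyone on set (even his son) can press, with timing left to the operator.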

Terminator 2 comparison with the final intro with effects turned on and off.
So you shot with real-time graphics, but what was the post-production like?
Here's the thing about virtual production: you skip the entire VFX stage and go straight into editing. All the takes were laid out in a multi-cam sequence, and the editor just cut it like any normal shoot, sprinkled some B-roll on top and fine-tuned. Done.
After the edit it went straight to grading, which consisted of finding a look and adding some extra bloom to really accentuate those pyros and fire blasts.
Conclusion
I love this way of working because it keeps the creativity free to roam on set; everyone can see what's going on on the big screens and can react and come up with ideas. And since the camera operators see the final pixels live on their monitors, they can really work with the 3D environment.
What do you think?
Software used
Aximmetry for Virtual Production
Unreal Engine for 3D Rendering
Premiere Pro for Editing and Grading
Equipment
Blackmagic Ursa Mini Pro G1
Antilatency Tracking System
Elgato Stream Deck
Contact Virtual Star Studios today to start planning your virtual music video or live event.
See the full 3D music video and end result here.