In-depth interview with creative director Michael Al-Far

Read Michael Al-Far's insights into how Marco Borsato's latest show raised the bar with creative use of disguise's Extended Reality (XR) technology

How would you describe your function within MalfMedia? What occupies most of your time and interest?

I am the founder of MalfMedia. We are a small design studio run by my wife and me, with a trusted pool of collaborators who help us execute projects. My role is to build the relationship with the customer and to act as creative director and lead designer.

What is MalfMedia's main focus?

MalfMedia is a video design creation company utilising leading-edge technology in a creative environment. We work with clients from all areas of business to ensure “the message” is delivered with clarity and accuracy. Our mission: to enhance the audience experience and ensure they do not miss a thing.

What kind of projects / shows is MalfMedia most well-known for?

Our projects span from live events and concerts to broadcast. The projects we are probably best known for are Marco Borsato, Cartoon Network, Adidas and Panasonic. We’ve also been involved, alongside many others, in the development of “smart stage” technology, in close collaboration with disguise and White Light.

Was there an initial specified request for the Marco Borsato Show at De Kuip? Or was there a creative freedom to create a concept from scratch?

Marco specifically requested that our starting point be last year’s successful Sportpaleis concerts, where we first introduced cinematographic content. As an artist and photographer, Marco felt the time had come for him to be involved in every step of this exciting project. So there was a great deal of creative freedom, but he had the final say. Together with Carlo Zaenen, set and lighting designer, we created the content book in which we outlined the look and feel for every song.

This particular project had a lot of challenges: starting in daylight and transitioning into night, a very large canvas (by some measures the largest screen ever used at an arena rock concert) and Imag integrated into the content, which meant there were no separate side screens for the live feed; the feed had to be embedded into the content. And finally, 37 songs which all needed to be of the same standard. We quickly realised the need for an extra creative team member, preferably with a vast knowledge of cinematography. So we teamed up with Studio Regie again. Our cooperation on the Sportpaleis concerts had returned some stunning results, and Olav Verhoeven’s creativity and experience in delivering cinematographic visuals was exactly what was needed.

Can you elaborate on the concept of the stage and experience for visitors? What did you want to achieve?

Marco and the team around him had one simple instruction: they wanted to create more than a show. It needed to be an experience, one that visitors would be talking about for years to come. De Kuip is an iconic venue, reserved only for the greatest artists. This series meant Marco would surpass the legendary Rolling Stones as the artist who has played the venue more times than any other. That monumental achievement needed a fitting context. The 250,000 visitors needed to feel a sense of grandeur from the moment they walked through the doors. Carlo and Marco wanted the set to be as wide and as high as the arena itself; that is no simple feat, but they pulled it off. Then it was up to Olav and me to fill every one of the 30 million pixels with content that befitted this majestic set.

How does content come to existence? What are the steps between a draft on paper and getting the content on these giant screens?

The content book is key. It really is the be-all and end-all of content creation. When you set off on a project like this you must stop thinking about the logistics and let the creative juices flow. Once the production designer and artist are happy with the result, stick to the book. It’s easy to get lost in a massive production like this. There are plenty of distractions and unforeseen events that could lead you in a different direction to the one you’ve agreed, and having the content book to fall back on will prove its value along the way. Taking the time to meticulously prepare it is well worth it. Another key element of successfully creating content for a project of this magnitude is surrounding yourself with the right people. In Studio Regie we found the perfect partner to pull this off. Not only do Olav and his team shoot, edit and post-produce the content, together with his partner Ellen they also produce the shoots, which allowed MalfMedia to focus on the overall supervision and let Studio Regie do what they do best.

The next step is content management and timeline programming. For large-scale canvases like this one, content file sizes rapidly bulk up. I can best illustrate this with an example: for songs edited to timecode (and thus delivered as one 4-minute piece to the media server) the file size amounted to 45 GB. You don’t just copy that onto a USB stick or WeTransfer the file. Lex ter Heune from By Lex design not only provided the media servers but also managed the data transfer. The various content designers could drop their final files straight onto a NAS drive managed by Lex, where they were accessible to the Visual Solutions team to grab the content and start populating the timeline.

Jo Pauly and Sander De Schrijver, who oversaw media server programming and operating, had direct access to this NAS-driven database, which meant they could program without having to wait for the complete content package to come in.

The final stage of content creation happens once the timeline is programmed and before we get to the venue. This is an important step because with a project of this size and scale, there are no easy fixes and no cutting corners. Time on site was limited, and we needed to make sure both Carlo and Marco were happy with the content in its present form. Even though Marco was informed at every step of the process, there is always a risk of the final product not meeting expectations. To prevent this, we met up with the lighting team at Leon Driessen’s WYSIWYG studio in Vlissingen. That’s where the lighting and video were synced to timecode starts and to each other. This was also the moment that show director Marcel de Vré and live feed director Anja Hoff came in and put their mark on both staging and live feed integration.

Once the creative team was happy, we brought Marco Borsato in for a final viewing and his notes. This gave us a few days to address any notes he had, and we were fully prepared for the load-in.

With AR you mix live feed and generated content. How do you decide what to mix? Are there certain types of combinations that work well and others that don't?

Together with Hans Cromheecke and Maarten Francq, MalfMedia has been experimenting with Augmented, Virtual and Mixed Reality for a couple of years. Our experience with Augmented Reality on international productions drove us to use MBDK as the perfect project to introduce a world first: embedding multicam AR as part of the main screen graphics. This meant the audience didn’t need to look away from the stage to see the AR graphics on a side screen, nor was there any need to whip out an extra device like a smartphone. It allowed us to push the boundaries of what was physically possible in a large arena like De Kuip. How else could you have the artist make an entrance by walking on the roof and then flying to the stage for the big reveal? And setting the venue on fire or having Earth spin around in the centre of the pitch would be near impossible without AR. Because of the magnitude of the production we invited two Notch giants to join the team: Scott Millar for the Augmented Reality and Lewis Kyle White for the Imag grading.

These days every high-level production around the world is looking for ways to exceed audience expectations and provide them with an experience the likes of which they have never seen before. Whether it’s a DMX-controlled wristband for every spectator, real-time face tracking or a giant rotating LED video sphere, we are all looking for that unique angle that nobody else has achieved.

With MBDK we managed to do that by means of AR, seamlessly integrated into the graphics. It also meant the ultimate audience participation: by using the venue as a canvas and making the audience part of the AR graphics, they were transported onto the main screen and became part of the content for those few songs. The intro ticked all the boxes. As the band played and everybody expected to see Marco appear on stage, we had him appear on the roof ridge. It took the audience a few seconds to figure out that the 3D figure dancing on the roof was actually AR. At the start of every one of those five shows, the reward for our work came in the form of the audience scanning the stadium roof to see if they could spot Marco. That meant success; what we had set out to do had worked.

Finding the balance between using the technology for its own sake and adding value to the show was the hardest part. Overuse it and the effect is no longer what you want it to be; underuse it and you run the risk of it becoming a distraction in an otherwise smooth-running content story. It’s all about choosing the right moments. The intro was a given; then you find one or two strong moments in the show flow and you add the finale. Just enough to capture the audience’s imagination and to have them question and wonder what they have just seen.

Is your workflow specific to disguise or would the same way of working be an option on other systems / servers as well?

The workflow is not exclusive to disguise. There are lots of systems out there that will allow you to do AR/XR. But what the brilliant team at disguise have perfectly grasped is the need for the AR/XR elements to integrate seamlessly into the workflow. It needs to handle, look and feel like part of the content process. The gx range is one of the most powerful media servers on the market, but what makes it stand out is the UI and the calibration process. And of course, the personal investment of the developers and service teams at disguise.

Technically we could achieve the same results with other media servers, but it wouldn’t be as seamless, it wouldn’t be as easy, and it would cost a lot more time, energy and resources.

MalfMedia’s projects almost exclusively have a disguise media server at the heart of their operation. That is a choice we made based on previous experience, and it has helped us achieve success in every project we have been involved in.

How do you see the future evolve concerning the use of video on stages? How do you think AR / mixed reality will influence that?

That is a complicated question. In my opinion we are on the eve of a huge shift in the way we approach live events and audience experiences. AR has been around for a long time, but only recently has it been accepted outside the broadcast environment as adding value to the creative process.

Used wisely, it most definitely has a place in content creation. It comes with challenges, of course; that is a given. It has an impact on production time and resources. It has an impact on the timeline of a show and how shows are set up. But the rewards are massive. Using modern-day technology to enhance the graphics package of an event has become so much more interesting. Packages like Notch VFX will allow you to manipulate live feed in ways that were simply impossible previously, helping to close the gap between live feed and content. When the Imag looks and feels like part of the overall graphics it adds to the flow of things; it makes it easier for the audience to take it all in.

Marco Borsato has advocated, and will continue to advocate, the use of modern technology to raise the bar of his arena and stage shows. We feel that with MBDK we have set a new benchmark for live performance and shown other aspiring acts that cutting-edge technology can be part of the show package.

I don’t think the use of Imag grading and AR/XR is going to go away soon; quite the contrary. I think we will see more and more use of these techniques to create shows that leave the audience wondering what they have just been part of. With the emphasis on part of...

I can’t wait to see what’s next.

Client: Marco Borsato

Production Company: Music E - Mojo

Production/Lighting Design: Carlo Zaenen

Creative Direction: Michael Al-Far

Video Director: Michael Al-Far / Olav Verhoeven

Lighting Operator: Leon Driessen

Content Design: Olav Verhoeven / Marco DeRuyck / Bart Tauwenbergh / Tim Vandekerckhove / Aitor Biedma / Lieven Vanhove / Sander Heynderickx

Notch Designer: Michael Al-Far, Scott Millar, Lewis Kyle White

Media Server: disguise vx4 and gx2

Media Server operator: Jo Pauly, Hans Cromheecke

Systems Integrator: Maarten Francq

Media Server engineer: Sander De Schrijver

Equipment Vendor: Faber Audiovisuals, byLex, Eurogrip

Photo: Jorrit Lousberg (By order of Faber Audiovisuals)