Over the past few months, Tomorrowland has been working with the latest technologies in 3D design, video production, gaming and special effects to craft a new Tomorrowland location for its two-day digital music festival, pushing the boundaries of music, 3D design, Hollywood’s latest filming techniques and the most modern game engines and hardware. Tomorrowland Around the World is not a traditional live stream event with DJs in a studio or at home. Instead, it is a unique form of visual entertainment – a technological masterpiece featuring a number of technological world premieres and entwining entertainment and technology in a way that has never been done before. More than 60 of the planet’s biggest artists have recorded their performances in 4 large green screen studios around the world, while 2 different platforms are used to assemble all the elements together. The digital universe of Tomorrowland Around the World has 10 times more polygons than a modern computer game, and each stage has a 16 square kilometre surface with 32,000 trees and plants and over 280,000 virtual people, each with their own attributes, such as flags or lights. Festival visitors will be treated to an immersive and unprecedented music festival experience, combining a 3D décor and artist performances with a spectacular show featuring special effects, fireworks, laser shows, a realistic crowd and sound effects.
Those technologies were created by the in-house creative team and 3D artists of Tomorrowland, renowned for creating Tomorrowland’s world-famous stages and artwork, in collaboration with several partners. Bringing live video, lighting, visuals and effects, feature film and gaming technology together resulted in many unforeseen production and technical challenges that required new integrations to be developed. The last months presented a unique opportunity to have all these teams available at the same time and allowed everyone to create a technical set-up that had never been attempted before, giving a huge energy boost to a sector that was badly hurt by the global pandemic. Tomorrowland is now proud to say it has been able to bridge the gap between the real world and the virtual environment to a level beyond everyone’s expectations – Tomorrowland Around the World is the result of a gigantic team effort of hundreds of people working around the clock to create a never-before-seen interactive entertainment experience.
Tomorrowland is currently in the final stages of building its brand-new 3D island environment Pāpiliōnem, where both the responsiveness across devices and the visual details are being optimized. Pāpiliōnem is a magical island shaped like a butterfly, featuring beaches, night skies, mountains, forests and sunsets – with a true game feel. Festival visitors will be able to navigate easily through the island with a PC, laptop, smartphone or tablet – no special VR goggles are needed – and explore the entire festival site in an interactive way together with friends, experiencing all four seasons within a single day trip. On entering the island, festival visitors will be offered a packed schedule with plenty of things to do, see and experience. People will be able to discover 8 different stages designed and built in 3D and see artists performing live sets, just like at a real festival. Once you’ve chosen a particular stage, you’ll get to see the line-up and be able to enter the specific stage environment. There, people will be treated to an immersive music festival experience, combining a 3D décor and artist performances with a spectacular show featuring special effects, fireworks, laser shows, a realistic crowd and sound effects.
Tomorrowland has created this platform from scratch in collaboration with Dogstudio, a multidisciplinary creative studio with offices in Belgium, Chicago and Mexico City.
Henry Daubrez, CEO & Creative Director of Dogstudio: “We are currently working day and night to design and bring the entire virtual island universe to life under a really aggressive timeline. We work hand in hand with Tomorrowland’s in-house 3D, creative and development teams to build a compelling, immersive, user-friendly, but also highly premium experience. We are building a web-based experience and we are all really pushing the boundaries of what can effectively be done in a web browser. Our biggest challenge – besides being an obvious enormous technical challenge – is making sure festival visitors will be able to feel they are being part of something larger than their computer and their internet connection. People won’t only be immersed in Tomorrowland’s new universe, but they will also be able to communicate with other festival visitors. I can proudly say that we are setting new standards for web-based online music experiences, pushing the boundaries of the latest technology that is available, but on the other hand making sure that the platform is even working on a device that is a couple of years old.” Dogstudio has previously worked for the likes of Microsoft, The Museum of Science and Industry of Chicago, The Kennedy Center of Washington, Dragone and many more.
More than 60 artists are playing on 8 different stages at Tomorrowland Around the World. To capture all these performances, Tomorrowland built 4 large green screen studios around the world. Most performances were recorded on the holy grounds of Tomorrowland in Boom (Belgium). Artists based in North and Central America recorded their sets in a studio in Los Angeles (USA), while Latin American artists went to a studio in São Paulo (Brazil) and Australian DJs performed in Sydney (Australia). All studios had to be modified to create an environment that allowed enough room for the creative team to make this a Tomorrowland-worthy broadcast. All cycloramas (or infinity walls) were 6m or higher, at least 8m wide and at least 8m deep. A full-sized DJ booth was built in each studio, and all locations featured the exact same set-up. The video production teams working on Tomorrowland Around the World are known for previous work with the Olympic Games, the Super Bowl, international feature films and the game development world.
Before the cameras were set up, the studio facilities got a technical overhaul to get them ready for augmented reality productions. A large grid of infrared reflectors was installed on the ceiling to allow tracking devices to measure the exact position of each camera head. All other parameters of the camera and lenses were transmitted live to servers that recorded the data and rendered a low-resolution version of the virtual world for the camera operators and directors. On top of the 6 4K ultra HD cameras, a number of virtual cameras were created per stage, allowing the director to choose from up to 38 cameras at the Mainstage. Even for modern 4K outside broadcast trucks, a dual frame rate and multiple resolution configuration is challenging, especially with the number of virtual shots that needed to be triggered. Each stage has a different technical set-up: just as an outside broadcast truck would be moved from one physical stage to another, the truck configuration was changed from one virtual stage to the next. Along with the technical set-up, the lighting on the artist changed according to the location of the stage and the time of day during the broadcast.
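stYpe has not disclosed its wire format, but the per-frame payload involved in camera tracking (pan, tilt, roll, position, zoom and focus) and the way it can drive a virtual camera can be sketched as follows. The types, field names and the zoom-to-field-of-view mapping below are assumptions for illustration, not stYpe’s protocol or the actual render server code.

```typescript
// Illustrative shape of per-frame camera tracking data (pan/tilt/roll, position, lens state).
// This is a sketch of the concept, not stYpe's actual wire format or API.
interface TrackingSample {
  timecode: string;     // e.g. "14:32:07:12" at the broadcast frame rate
  pan: number;          // degrees
  tilt: number;         // degrees
  roll: number;         // degrees
  x: number;            // metres, studio floor coordinates
  y: number;
  z: number;
  zoom: number;         // normalised 0..1 from the lens encoder
  focus: number;        // normalised 0..1 from the lens encoder
}

interface VirtualCameraPose {
  position: [number, number, number];
  rotationDeg: [number, number, number]; // pan, tilt, roll
  fieldOfView: number;                   // degrees
}

// Map a tracked studio camera onto a virtual camera inside the 3D stage.
// The zoom-to-FOV curve is a placeholder; a real system would use a calibrated lens file.
function toVirtualCamera(sample: TrackingSample): VirtualCameraPose {
  const minFov = 8;   // fully zoomed in
  const maxFov = 70;  // fully zoomed out
  return {
    position: [sample.x, sample.y, sample.z],
    rotationDeg: [sample.pan, sample.tilt, sample.roll],
    fieldOfView: maxFov - sample.zoom * (maxFov - minFov),
  };
}
```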
Tomorrowland has been working closely together with stYpe, which provides cutting-edge camera tracking technology for achieving real-time augmented reality and virtual studio effects in live broadcast. Stype Cajic, Founder & CEO of stYpe: “Tomorrowland’s digital festival is different from the live broadcast productions we typically do, and from the typical movie productions we do, in the sense that it combines the requirements of both. In live broadcast projects, like for example the Olympic Games, there are very high standards for speed, virtual scene optimizations and reliable camera tracking. As the shots are done only once, there’s no luxury of repeating them, since it all has to be live. Movie productions, on the other hand, have requirements for high-resolution and photorealistic virtual effects as seen in blockbuster movies. Tomorrowland Around the World has both of these requirements, and this meant we had an interesting technical challenge to solve. The artists were performing their acts once, without repeat takes, and they needed to see in real time the virtual environment they were performing in. This meant that the speed and scene optimizations from live broadcast were required. On the other hand, everything needed to be shot in high resolution and be photorealistic to satisfy the sophisticated taste of the festival visitors who will be watching. While planning for this project, we considered a few solutions for camera tracking, and we finally decided on using the RedSpy system. It proved to be a great choice since it operated without problems on dual frame rate for days in a row, and it surpassed Tomorrowland’s accuracy requirements for 4K shooting. This meant that the camera tracking part of the challenge was solved. The second thing needed was high-resolution, photorealistic real-time renders. For this purpose, we used our StypeLand Unreal plugin, which we modified to get volumetric lights on the talent, giving the whole set a more realistic look.” The stYpe technology has been used worldwide at the MTV VMAs, the Eurovision Song Contest, the Olympic Games, the Super Bowl and many more.
At the end of the pipeline, 2 different platforms are used to assemble all the elements together. The 3D stages designed by Tomorrowland are imported, and layers of show elements, lights, attributes and environments are added. Depence, a platform that is typically used to visualize show elements such as lighting, lasers, visuals and other effects, is used for most indoor stages where the show elements are key to the performance. The software was not originally designed to composite real-life video footage, but with the help of its developers it is now possible to add volumetric light on the talent in the set, giving these environments a more realistic look. All technical elements that make up the show are programmed and controlled in the same way, by the same skilled team that does this for the regular Tomorrowland editions. The result is a realistic image with hints of iconic stage designs of the past. For these stages a crowd was designed with some extra features to generate an even more immersive effect.
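Depence’s show programming interface is not public, so the following is only a minimal sketch of the underlying idea: show elements such as lighting, lasers and fireworks layered onto a stage as cues timed against an artist’s set. All type names, timings and presets are hypothetical.

```typescript
// Illustrative cue list for layering show elements over an artist's set.
// Depence's own programming interface is not public; these types are assumptions.
type ShowElement = 'lighting' | 'laser' | 'fireworks' | 'crowd-fx' | 'video';

interface ShowCue {
  element: ShowElement;
  startSec: number;    // seconds into the artist's set
  durationSec: number;
  preset: string;      // e.g. "fan-sweep" (hypothetical preset name)
}

const exampleStageCues: ShowCue[] = [
  { element: 'lighting',  startSec: 0,    durationSec: 3600, preset: 'base-wash' },
  { element: 'laser',     startSec: 915,  durationSec: 30,   preset: 'fan-sweep' },
  { element: 'fireworks', startSec: 3540, durationSec: 60,   preset: 'closing-salvo' },
];

// Which cues are active at a given moment of the set, e.g. to drive the renderer each frame.
function activeCues(cues: ShowCue[], tSec: number): ShowCue[] {
  return cues.filter(c => tSec >= c.startSec && tSec < c.startSec + c.durationSec);
}
```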
The other stages are rendered in the game engine Unreal Engine, which allows for photorealistic landscaping and scripting of certain elements. Each stage has a 16 square kilometre surface with 32,000 trees and plants. The landscapes for each stage have been custom-made to resemble elements of the festival grounds: the Core stage is situated in a forest environment, while the Mainstage has a natural amphitheatre landscape similar to the holy grounds in Boom. On the festival grounds, over 280,000 virtual people each have their own attributes, such as flags or lights. The digital universe of Tomorrowland Around the World has 10 times more polygons than a modern computer game, pushing the limits of the most modern game engines and hardware.
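The per-person attributes mentioned above (flags, lights and so on) amount to a large table of lightweight agent data that a renderer can instance. A minimal sketch of how such a crowd could be generated is shown below; the attribute names, proportions and helper function are illustrative assumptions, not Tomorrowland’s crowd system.

```typescript
// Illustrative generation of per-agent crowd attributes (flags, lights) for a virtual audience.
// A production crowd system in Unreal would use instanced rendering; this only sketches the data side.
type CrowdProp = 'none' | 'flag' | 'light' | 'totem';

interface CrowdAgent {
  id: number;
  position: [number, number];  // metres across the stage surface
  prop: CrowdProp;
  flagCountry?: string;        // only set when prop === 'flag'
  cheerOffsetSec: number;      // desynchronises cheering so the crowd does not move in lockstep
}

const FLAG_COUNTRIES = ['BE', 'BR', 'US', 'AU', 'DE', 'IN', 'MX', 'JP'];

function buildCrowd(count: number, areaMetres: number): CrowdAgent[] {
  const agents: CrowdAgent[] = [];
  for (let i = 0; i < count; i++) {
    const roll = Math.random();
    // Illustrative proportions: a few percent carry flags, some hold lights or totems.
    const prop: CrowdProp = roll < 0.05 ? 'flag' : roll < 0.2 ? 'light' : roll < 0.22 ? 'totem' : 'none';
    agents.push({
      id: i,
      position: [Math.random() * areaMetres, Math.random() * areaMetres],
      prop,
      flagCountry: prop === 'flag'
        ? FLAG_COUNTRIES[Math.floor(Math.random() * FLAG_COUNTRIES.length)]
        : undefined,
      cheerOffsetSec: Math.random() * 4,
    });
  }
  return agents;
}

// Example: a 4 km x 4 km (16 km²) stage surface with 280,000 attendees.
// const crowd = buildCrowd(280_000, 4000);
```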
Tomorrowland has been collaborating with Epic Games and its Unreal Engine, a state-of-the-art real-time 3D creation platform and game engine that features photorealistic rendering, dynamic physics and effects and lifelike animation, widely used in many games including Fortnite, Minecraft Dungeons and Mortal Kombat. Ben Lumsden, Business Development Manager at Epic Games – Unreal Engine: “How do we bring new experiences from great artists to remote audiences? This is the pressing question that’s being posed, and one that’s now being answered by real-time technology. Unreal Engine has quickly become the standard for beautiful virtual sets in film and television production and is now being used more in live broadcasts. Tomorrowland Around the World skillfully mixes traditional techniques such as live camera cutting with our new DMX lighting control integration, all within the immersion of Unreal Engine, building a new digital journey that is sure to captivate people across the planet. What the team at Tomorrowland has been able to do in a very short period of time is seriously impressive. Combining the live action performances of the artists into gorgeous, high-resolution virtual worlds has been a logistical and technological feat.”
Here as well, custom builds of the software were necessary to process the camera tracking data differently than in standard broadcast solutions, read the DMX data from the lighting consoles, and integrate with nonlinear editing systems to switch between all the virtual cameras. Custom APIs were written to bridge software packages that had never interacted before, and no effort was spared to blend the artists into the magically created environments. A big team of audio mastering engineers is working on creating realistic crowd sounds, cheering, applause and singalongs, all timed according to the artists’ sets.
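The article does not say how the DMX data travels from the lighting consoles into the engine, but consoles commonly transmit DMX over the network as Art-Net. The sketch below decodes a generic ArtDmx packet as one plausible entry point; it illustrates the public protocol only and is not the custom Tomorrowland integration or Unreal’s DMX plugin.

```typescript
// Generic decoder for an Art-Net "ArtDmx" packet (DMX-over-Ethernet), as one common way
// a lighting console's output could reach a render engine. Not the actual Tomorrowland integration.
interface ArtDmxPacket {
  sequence: number;       // 1-255, 0 means sequencing is disabled
  universe: number;       // 15-bit port-address (net + subnet + universe)
  channels: Uint8Array;   // up to 512 DMX channel values, 0-255
}

const ARTNET_ID = 'Art-Net\0';
const OP_DMX = 0x5000;

function parseArtDmx(packet: Uint8Array): ArtDmxPacket | null {
  if (packet.length < 18) return null;
  // Bytes 0-7: the fixed "Art-Net" identifier string.
  const id = String.fromCharCode(...Array.from(packet.subarray(0, 8)));
  if (id !== ARTNET_ID) return null;
  const view = new DataView(packet.buffer, packet.byteOffset, packet.byteLength);
  // Bytes 8-9: opcode, little-endian; 0x5000 identifies an ArtDmx (DMX data) packet.
  if (view.getUint16(8, true) !== OP_DMX) return null;
  const sequence = packet[12];
  // Bytes 14-15: port-address (universe), little-endian.
  const universe = view.getUint16(14, true);
  // Bytes 16-17: DMX data length, big-endian (2-512).
  const length = view.getUint16(16, false);
  return { sequence, universe, channels: packet.subarray(18, 18 + length) };
}

// Example: channel 1 of a universe might drive the intensity of a virtual fixture.
// const dmx = parseArtDmx(udpPayload);
// if (dmx) virtualFixture.intensity = dmx.channels[0] / 255;
```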