Feb. 13, 2024 - The University of Applied Sciences and Arts Northwestern Switzerland (FHNW) comprises nine schools, each housing institutes with a specific study focus. Across schools and institutes, technology plays a key role in the educational process. As Head of Media and Broadcast Infrastructure, Suresh Surenthiran is responsible for ensuring the right AV solutions are in place for faculty and students to deliver dynamic presentations. We recently caught up with him to discuss his latest setup and the new technologies that he and his team of budding media engineers are exploring.
Walk us through the types of productions and broadcasts FHNW does.
We’re an art and design university with multiple disciplines, each with its own particular AV needs. My team is effectively a service provider, so we broadcast whatever they need in the manner they require. Given the school’s multidisciplinary focus, students are creating content akin to a television show or a TED-style talk, not simply presenting a research paper. Our events typically feature colored lighting, DMX systems, and polished visuals. For example, the fashion institute within the university hosts fashion shows, as well as other presentations and graduation ceremonies.
Tell us more about the fashion show event.
Due to space challenges, the event was hosted across three locations spanning nearly 500 meters. We filmed the three-part event using a mix of 22 broadcast and PTZ cameras. Each location had one dedicated camera operator, and the rest of the cameras were operated through remote control systems. We opted to use primarily mounted PTZ cameras to minimize any disturbance to the audience. Mixed feeds from each location were simultaneously broadcast in-venue on various displays, giving attendees insight into what was happening across the event, and sent to the central editing system for external distribution via AJA BRIDGE LIVE. This setup worked so well that we’ve adopted it as our standard for auditorium event productions. We installed 11 fixed, remotely controlled cameras and now capture each of our weekly events with them. This project also prompted us to look at Dante 12G networks.
Why were you attracted to Dante?
Dante has made a lot of progress and released multiple solutions that make it easy for us to work with the technology and route signals. Even when we were running mostly SDI-based production systems, Dante enabled us to distribute to the entire audio network. That’s why we decided to adopt it. Currently, we use AJA’s OG-DANTE-12GAM openGear audio embedder/disembedder for every purpose possible because it enables us to extract all the audio for the in-house broadcast and the audio networks, and to mix it in the video mixer or the audio system. We’re planning to have permanent Dante carts in multiple locations and expect to have a fully integrated Dante 12G network by next summer.
What does a typical workflow look like now?
Our event locations are connected by our own fiber networks, which allows us to ingest multiple sources – desktops, laptops, camera signals, video mixers, etc. – and bring all the data together for editing. Everything is connected yet decentralized. We manage the audio across buildings remotely with OG-DANTE-12GAM audio embedder/disembedders, which allow us to bring microphone audio, any video conference feeds, and the local microphone system into our Dante audio setup and into the main mixer. We extract the audio using OG-DANTE-12GAM, then send it into our Powersoft amplifier, which enables us to manage and distribute the audio across the building.
We have an 8K video mixer and 4K, full HD, and PTZ cameras; we route the video sources through MediaNet to AJA BRIDGE LIVE using its four SDI inputs. This allows us to do parallel streaming in different formats, either for multiple events or to stream multiple feeds of the same event; if we want to stream an event in German, English, French, and Italian, we can do so simply with BRIDGE LIVE. It’s fantastic and fast at transporting all the protocols like SRT, UDP, SLS, RTP, etc. Also, the dashboard is easy to use and configure. We use BRIDGE LIVE to distribute live streams via the public internet, and we also use multiple content delivery network (CDN) vendors. Typically, content is shared privately to avoid copyright concerns, but it depends on the event.
We’ve also been testing SRT streaming, but not many providers are offering it yet. SRT streaming helps us to use the public internet to stream and broadcast our content globally.
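To give a feel for what SRT testing over the public internet can look like, here is a minimal, hypothetical sketch using ffmpeg’s tee muxer to fan one encode out to two destinations in parallel – it illustrates the idea of parallel multi-protocol delivery described above, not FHNW’s actual BRIDGE LIVE configuration. The hostnames, port, and stream key are placeholders.

```shell
#!/bin/sh
# Hypothetical example: one H.264 encode duplicated to an SRT destination
# (MPEG-TS container) and an RTMP destination (FLV container) via ffmpeg's
# tee muxer, so the event is only encoded once. All endpoints are placeholders.
SRT_DEST='srt://cdn-a.example.com:9000?mode=caller&latency=2000'
RTMP_DEST='rtmp://cdn-b.example.com/live/streamkey'

# Assemble the command rather than executing it, since the endpoints are fake.
CMD="ffmpeg -re -i event.mp4 \
  -c:v libx264 -preset veryfast -b:v 6M -c:a aac \
  -f tee -map 0:v -map 0:a \
  '[f=mpegts]$SRT_DEST|[f=flv]$RTMP_DEST'"

printf '%s\n' "$CMD"
```

SRT’s `latency` parameter trades added delay for resilience to packet loss, which is the main tuning knob when a stream has to cross the public internet rather than a managed network.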
Because we are a government-funded university, we do not generate revenue, so we want to reduce costs where we can.
What other trends or technologies are you following?
openGear is a great ecosystem, and I thank AJA for making fantastic products to support it. Standalone converters each require their own power supply, so you don’t necessarily have access to all your devices at the same time. The openGear frame has redundant power supplies, so it’s very stable when rackmounted, and we can manage every device through its operating system. Once we have an openGear frame in any of our racks, we can easily configure and change settings remotely. It also requires less cabling. Today, every organization is cleaning up legacy problems, such as scattered small devices and power supply challenges, using openGear-compatible solutions. It’s much cleaner to install and more manageable.
Looking ahead, I would also like to see all management systems run through cloud-native applications. Broadcast workflows should not be built the same way as traditional IT infrastructure. Artificial intelligence (AI) was a hot topic at NAB and IBC, but what I'm looking for is an AI-driven operating system for the broadcast industry – a way to manage all the converters and devices through one cloud-native application.
How do AJA solutions help you achieve your broadcast goals for each event?
I love the benefits we receive with OG-DANTE-12GAM without being tethered to a separate power supply. We just install it into the frame in the server room and connect it to Ethernet, and we can manage the frame from any location, whether that’s my office or home. When I go to the openGear DashBoard, I see the card; then I can open the Dante controller and configure my settings. I can manage everything virtually, which is great.
What prompted your workflow choices?
A lot of universities’ media workflows are dictated by classic IT infrastructure, but that creates a disconnect and limits what AV teams can accomplish. Media should be part of the infrastructure strategy. I come from the broadcast industry; that’s how I got to know AJA gear and understand which workflow aspects to prioritize. At both the broadcast and university levels, we produce content the same way, even if the context is different. I really advocated for having our own fiber networks, and that connected network is a great advantage.
We also try to automate aspects of our workflow where we can. We’re facing a shortage of media engineering expertise in Switzerland, and colleagues in Germany, France, and Italy have expressed facing a similar challenge. openGear technology helps us address this challenge, as does cloud-based technology like BRIDGE LIVE. We can simplify the technology management process so that a single person can configure multiple converters, and we can also stack frames. We’re able to do more with fewer people.
...
CONTINUED