Wednesday 31 January 2018

Olympic Games innovates with 8K HDR and live 5G production firsts


IBC

PyeongChang is shaping up to be the most innovative Olympics ever, with advances in 8K and 5G production and an avalanche of OTT content. 
The International Olympic Committee is no stranger to using the latest technology to attain pinnacles of sports production but in recent times there’s been another imperative to innovate.
The Games may be second only to soccer as the world’s most widely followed sporting event, but compared with many other sports it tends to appeal to an older fan base. Reaching a younger audience is the principal driver behind fresh attempts to jazz up the Games by innovating production technologies and distributing to new platforms.
The upcoming Winter Games to be held in PyeongChang, South Korea is no exception, featuring a mix of tried and trusted and more experimental techniques.
Notably this includes what is believed to be the largest ever live 8K UHD production.
What’s more, the 8K production features High Dynamic Range – a world first on this scale. Host broadcaster Olympic Broadcasting Services (OBS) is also testing 5G to transmit video live to air – another first – and has a new service dedicated to social media.
“There are always very high expectations about production quality in addition to which we have to satisfy fans of niche sports as well as reach a far wider audience,” explains Sotiris Salamouris, Chief Technology Officer, OBS. 
“In addition, Winter Games tends to feature a lot of action driven sports which particularly appeal to the younger generation. For all those reasons we need to do something exceptional.”
The 8K UHD coverage is being conducted in tandem with Japanese broadcaster NHK and is on a far bigger scale than the single-camera 8K pilot made in Rio.
This time, some 90 hours of 8K content will be captured in HDR (BT.2020) using the HLG format devised by NHK and the BBC.
NHK leads the 10-camera production, run from three OB units, on events including figure skating, ski jumping and snowboarding.
Of course, the 8K pictures won’t be seen by many people outside of a special theatre in the International Broadcast Centre, nor beyond private screens in Japan which plans to start national satellite transmission in the format by 2020.
However, according to OBS, several rights holders (not just NHK) are taking the full 8K feed for test purposes, some with a view to showcasing the Tokyo games in four years’ time live to cinemas or other large audience venues. 
The production serves a practical purpose too, since the 8K UHD HDR feed will be down-converted on site to provide a 4K HDR output which is being taken by broadcasters including NBCU.
There’ll be a separate 4K SDR (BT.709) version too, but the baseline production remains HD 1080i, for which OBS is fielding over 400 cameras plus more than 250 specialist cameras and aerial shots from a helicopter and a drone.
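For a sense of scale, the pixel arithmetic behind those formats is straightforward: an 8K frame carries four times the pixels of a 4K frame and sixteen times an HD frame, which is why a single 8K master can yield clean down-conversions. A quick illustrative calculation (resolutions only; frame rates and bit depth are left out):

```python
# Illustrative pixel-count arithmetic for the PyeongChang production formats.
resolutions = {
    "HD 1080": (1920, 1080),
    "4K UHD":  (3840, 2160),
    "8K UHD":  (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# An 8K master holds 4x the pixels of a 4K frame and 16x an HD frame,
# so each down-conversion discards detail rather than inventing it.
print(pixels["8K UHD"] // pixels["4K UHD"])   # 4
print(pixels["8K UHD"] // pixels["HD 1080"])  # 16
```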
5G live production
There are huge expectations around 5G as the technology moves from experimentation to commercialisation but the International Olympic Committee (IOC) has stolen a march here too with what may well be the first live to air 5G video application.
Small 4K fixed lens cameras have been fitted to the front of Olympic bobsleighs (some with dummy modules of the same weight and profile) to offer an extraordinary bullet’s eye view which will be cut into the live production. Although the cameras are 4K the transmission output will be HD.
“These real-time links are only possible with the low latency (almost zero delay) of 5G,” explains Salamouris. “This is a very important proof of the value of 5G technology not just for distribution to mobile but for contribution links. We are very interested in exploring the use of 5G further with a view to replacing traditional contribution solutions over RF.”
KT (Korea Telecom) has installed the 5G network and base stations around the bobsleigh track and at other Winter Games venues including the Gangneung Olympic Park, home of the ice arena. 
A partnership with Olympic sponsor Intel will showcase other 5G applications.
While not part of the host broadcast and viewable only by spectators in special zones using 5G-ready mobile devices, these trials include time-sliced views of skaters in motion and the ability to track competitors over the Olympic cross-country course, selecting different views in real time.
Social media: Content+
One particular OBS innovation is an indication of an ongoing shift in Olympic programming strategy. Content+ is a source of animations, maps, analytics and a series of short 1-3 minute videos curated by OBS and made available to publishers on a globally accessible web portal.
“The idea is that this will facilitate broadcasters to distribute the Games better through social media,” explains Salamouris.
“It’s not just the traditional coverage that we put there in shorter form. It’s primarily content designed to work well in social media such as more behind the scenes video, individual stories; athletes preparing - things that you don’t usually see in the traditional coverage.”
According to a 2017 poll of 18,000 European internet users conducted by Ampere Analysis, 54% of those who said they enjoy the Olympics were over 45 years old while 55% of those who enjoy soccer are under 44 years old.
Recent decisions by the IOC to add new sports like karate, surfing and skateboarding to the Summer Games have been made with that in mind.
The Paris Games of 2024 are even being lined up for the introduction of eSports as an official Olympic event.
PyeongChang represents an important step on the road to authenticating computer gaming since Intel is organising an eSports competition ahead of the opening ceremony. 
The dual branding of the Intel Extreme Masters PyeongChang with the Olympic rings, and the parallel promotion in Korea of the Ubisoft game ‘Steep: Road to the Olympics’, is a sign that the IOC is eager to tap the huge global fanbase of young eSports enthusiasts – a good third to a half of whom are female.
On the same basis, covering the Olympics through multiple platforms and especially online services has been considered the way forward since before London 2012. The Ampere study shows that of respondents that took Subscription Video on Demand (SVoD) services, almost 45% are under 34 years old. 
“Younger consumers are more engaged with online services and also show higher engagement with social networks making it clear that promoting the Olympics through those routes is important [for the IOC],” suggests analyst Alexios Dimitropoulos.
Each successive Games becomes the most consumed online with PyeongChang being confidently billed as the most connected Games yet by both the IOC and rights holders like Eurosport.
The Discovery Communications-owned broadcaster plans more than 4,000 hours of digital coverage including 860 hours of live action – basically all of the OBS output.
Eurosport, which paid €1.3 billion (£920 million) to cover four Games until 2024, has all but abandoned linear TV saying it will make every minute available online via its OTT platform Eurosport Player.
This will make it the first fully digital Olympics for Europe, it claims. To capitalise on this it has slashed the cost of access to Eurosport Player from €6.99 a month to one Euro for the duration of the Games.
The need to reach Millennials on social media is vital to Eurosport’s effort. It has a self-described ‘Radical Van’ on site, a mobile facility for creation of bespoke content to Facebook Live and other social networks.
It has also hired US-based producer Cycle to create multi-language editorial for publication on Snapchat in Europe as part of a wider agreement Discovery has for producing mobile-first shows for Snapchat’s Discover platform.
In addition, it has an AR studio called The Cube, in which experts can analyse key events surrounded by 3D graphics, such as a ski jumper frozen at the point of take-off.
NBCU expands access
Traditional broadcasters are adapting too. NBCU, owned by US cable giant Comcast, which paid US$7.65 billion to show the Olympics on TV and online through 2032, also has a pact with Snapchat to share clips of NBC’s Olympics content.
In addition, Snapchat will carry coverage of the Games that BuzzFeed will co-produce with NBC for the Snapchat Discover media hub. The Wall Street Journal estimates that advertising revenue shared by Snap and NBCU as a result of the deal around the Winter Games could reach $75 million.
However, NBCU also knows it needs to shake up its linear coverage after facing criticism of its coverage of Sochi.
In 2014 the network live-streamed the Olympics to cable customers but re-aired popular events during primetime leaving many viewers susceptible to spoilers. This time around, NBC will broadcast live in all US time zones on channels including NBC, NBCSN, CNBC and USA Network, on NBCOlympics.com and over the NBC Sports app. 
The broadcaster promises more than 2,400 hours of coverage, the most it has ever aired from a Winter Olympics.
The 4K HDR transmission is being used to promote the capabilities of Comcast’s Xfinity set top box.
Along with Eurosport, it is also offering 50 hours of live virtual reality coverage produced by OBS. Intel VR rigs and picture processing will capture 180-degree video of events including alpine skiing, curling, snowboarding and skeleton, distributed to users with Windows Mixed Reality headsets, Samsung Gear VR and Google Daydream via the NBC Sports VR app.
“We trialled VR in Rio but we had just one camera placement,” says Salamouris.
“This time the production is much more sophisticated. Watching sports for a long period of time in VR works less well than a ‘lounge experience’, which is what we are deploying in PyeongChang. This is where viewers can immerse themselves in VR as if from the venue but also come out of that feed and look around a virtual lounge.”


Monday 29 January 2018

Light Field Lab prototypes incredible new holographic display – time to call in the light sculptors?

RedShark News
Start-ups need funding, and raising a few quid is a routine and usually not a noteworthy piece of business – except when your technology promises to display holograms – commercially and within two years. Light Field Lab, which launched barely a year ago, has raked in a sizeable $7 million to further its “revolutionary” concept which it confidently forecasts “will enable real holographic objects to appear as if they are floating in space” without the aid of glasses or other headgear.
That ambition would appear to trump Magic Leap, which, as has been revealed [https://www.redsharknews.com/vr_and_ar/item/5153-magic-leap-finally-lifts-the-lid-on-its-highly-anticipated-mixed-reality-headset], is working on holographic projections using glasses and a computer backpack.
The Silicon Valley-based outfit calls its technology a “full parallax holographic display” and aims for what new investor Vinod Khosla calls “the holy grail of optical display”: the ability to interact with realistic 3D holograms without all the diverting paraphernalia of having to wear something.
Details are sparse, but it’s also less secret squirrel at this stage of its R&D than the billion dollar-funded Magic Leap. That’s probably because its founders, most of whom came from light field camera developer Lytro, believe they have something tangible that will make it out of the lab and into commercial use very soon.
Some hints of its direction were given last September at IBC, where co-founder Jon Karafin gave a presentation.
“We have a novel illumination system and a very complex set of optics that allow us to create a completely flat-panel holographic display,” he explained. “It will look like a flat-panel display and this is the game-changing aspect of our specific implementation.”
More information has since emerged. Karafin confirmed to Business Insider that the device “looks a lot like a little TV or screen that projects the holograms in front of it”.
The five-employee Light Field Lab is hiring and plans to use the funding to perfect a prototype and demonstrate it can move into larger scale manufacture. For now, this means a very small panel, about four by six inches, according to Business Insider. Other reports suggest the modules are 2 ft x 2 ft.
In any case, the plan, it seems, is to stack the panels together and stitch the images with a resolution of something like 16K x 10K or as the company says “up to hundreds of gigapixels of resolution”. These 100+ foot wide screens could be placed on floors or ceilings and would project holographic images into a 3D space.

Early adoption scenarios

The firm sees early adoption in large-format location-based entertainment venues. This includes staging holographic experiences at live events like concerts, an advance on the Pepper’s Ghost style illusion that has been deployed for a century.
At IBC, Karafin talked about an experience “like watching a play in a theatre” where everyone sees the same narrative but views it from their own place in the audience.
“Our passion is for immersive displays that provide a group social experience, instead of one where the whole audience sees a single identical view but is blocked off from each other by headsets or other devices,” he said before mysteriously adding that the light field can be creatively changed so that everyone sees the same thing “or completely different things.”
He also mentioned augmenting the holographic image with more senses, such as smell.
What Light Field Lab is not doing, it seems, is generating the holographic imagery in the first place. Presumably, given its hot links to Lytro (though one may assume Lytro investors were none too happy that their key technical staff jumped ship), Karafin and co will have first-hand knowledge of the development of light field capture and of products such as the Lytro Cinema Camera. This is currently a machine the size of a small car being trialled in Hollywood on select VFX sequences, but one for which there is a roadmap that sees it whittled down to a handheld.

Content is key

Generating holographic content may be the bigger issue in the mid-term. It was, along with uncomfortable glasses, the Achilles heel of stereoscopic 3DTV and is also hindering adoption of VR, according to Futuresource Consulting’s Michael Boreham.
“A key issue is the lack of killer applications for VR that will drive consumer uptake and this has proved to be a major limiting factor impacting growth,” he noted.
With money being thrown at developing light field, volumetric or holographic technology (terms that are often used interchangeably), perhaps more attention should be paid to how cinematographers and directors – or should we call them light sculptors? – can create using the medium.
If Karafin envisages “narrative” holographic experiences with his system, then he also needs to get storytellers onboard.
For too long the cliché of Princess Leia holographically sculpted out of R2D2 has been the trick by which all futuristic display tech is measured.
Academics at Brigham Young University in Utah claim to have cracked the creation of the first “free-space volumetric display”, capable of reproducing full-colour graphics floating in the air, visible from all angles.
The so-called “optical trap display” works using a bizarre but disarmingly simple technique in which individual particles in the air are tracked and then illuminated at an appropriate time with RGB lasers.
The Guardian reckons the image is “3D-printed” with the glowing tiny particle creating a visual image as it goes.
While the images are currently either tiny or very slow to create, that hasn’t stopped the researchers from posing as Princess Leia and mimicking the Star Wars scene.
Clearly, with a lot of work to be done to get this attempt anywhere near practical, the concept is one step ahead of Light Field Lab in that this really does seem to get rid of hardware. There are no screens here – just lasers beaming onto common-or-garden particles floating all around us. Who’d have thought this particular science-fiction miracle would come down to dead skin and microbes?

MAGIX introduces new VEGAS Pro subscription offer

RedShark News
Software subscription models can often split opinion, but for many users they ensure the latest version of the software, along with spreading the cost. Now MAGIX has brought this payment model to its VEGAS Pro NLE software.
Software developer MAGIX is to offer its flagship editing tool VEGAS Pro on a subscription basis in a bid to attract more users. It’s an interesting decision given that other developers, notably Avid with Media Composer and Blackmagic with Resolve, made their software free of charge to entice new users onboard and then hopefully upgrade them to pay for a more advanced product.
The difference is that MAGIX hasn’t made a cut-down version of its tools. For the oddly precise fee of $16.67 a month (equating to $199.99 for a year’s licence), the new VEGAS Pro 365 subscription activates not only the non-linear editor VEGAS Pro 15 but also the sound suite Sound Forge Audio Studio 12, along with online training courses to learn VEGAS Pro – what the vendor bills, not unrealistically, as a complete audio-visual production suite.
What’s more, the fee includes plug-ins like NewBlueFX Filters Ultimate, HitFilm Movie Essentials, VEGAS DVD architect and Ozone Elements 8 by iZotope.
VEGAS Pro 365 is an addition to the existing three versions of the VEGAS Pro range, including VEGAS Pro 15, which was released last year. Projects remain fully accessible to users even after a subscription expires and can be imported and edited by other versions of the software.
Berlin-based MAGIX bought VEGAS Pro and Sound Forge as part of a bundle of products from Sony Creative Software in 2016. It promptly souped up VEGAS Pro (or “resurrected” it, as the company said at the time) with a greater range of plug-ins, hardware acceleration support on Nvidia and Intel processors, and support for the ACES 1.0 colour space workflow. There’s also a LUT OFX plug-in for working with the expanded colour spaces of cameras like RED.
Its unique and self-proclaimed “innovative hamburger menu system”, found throughout the application, is intended to keep the user interface clutter-free while accommodating different workflows.
MAGIX also has a range of free software including Looparoid for creating short photo stories with a Polaroid (pick your favourite pictures and a border, add a title and send to your friends, etc) and Fastcut, a program that takes GoPro or smartphone video and automates part of the editing process. When you import a video file and pick from one of the 15 pre-defined templates, Fastcut supplies a music track and makes cuts between scenes.
For reference, the latest Sound Forge package costs £299, VEGAS Pro 15 Edit costs $399 (£281) and VEGAS Pro 15 Suite $799 (£563). The NewBlueFX Filters 5 Ultimate alone is valued at $299.
It’s an option worth considering, especially since the subscription version will automatically update with any new software patches. A three-month plan is also available for $60.
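For anyone weighing the options, the break-even arithmetic is simple (list prices as quoted above; the comparison ignores upgrade fees and the value of the bundled plug-ins):

```python
# Rough cost comparison: VEGAS Pro 365 subscription vs a perpetual licence.
annual_sub = 199.99        # one-year VEGAS Pro 365 plan
perpetual_edit = 399.00    # VEGAS Pro 15 Edit, one-off purchase

monthly_equiv = annual_sub / 12
# Years of subscribing before the outlay matches the Edit licence price.
years_to_match = perpetual_edit / annual_sub

print(round(monthly_equiv, 2))   # 16.67
print(round(years_to_match, 1))  # 2.0
```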
Busy on several fronts, MAGIX is on the verge of launching a VR Suite. It has been working with Intel and Microsoft (HoloLens) on VR projects for some time and last year acquired Dresden software firm Simplitec to tap into the virtual reality market.

Thursday 25 January 2018

Sky Moves to All-Streaming Era

Streaming Media

Pay TV operator Sky, soon likely to be owned by Disney, details plans to ditch the dish and continues its drive to go beyond traditional long-term subscriptions with streaming services.

Sky is further expanding its array of streaming options to better compete with Netflix and Amazon, the heart of which are plans to offer the full Sky service to consumers over IP direct to Sky Q boxes.
The pan-European pay TV giant first announced a plan to ditch the dish in 2016, with the intention of launching last year. Instead, the service will launch in Italy later this year before being introduced to Austria and finally the UK, the operator’s largest market, perhaps as late as 2019.
In a statement, Sky group chief executive Jeremy Darroch, said: “We recently launched Sky Q in Italy and will roll out the service to Germany and Austria in the next six months. We will also introduce Sky over fibre in Italy and our first all IP service in Austria, both without the need for a satellite dish.”
Sky called the move a “major development” that would reduce costs. In the long term this means saving billions of dollars on leasing bandwidth from satellite companies Eutelsat and SES.
The move also targets new markets, including a further six million potential customers across Europe. Two million of those have been identified as living in dense urban areas where people can't put up dishes.
Sky principal streaming architect Jeff Webb will keynote next month's Streaming Forum in London, giving a behind-the-scenes look at the technology behind the operator's OTT efforts.
All of this is dependent on fast broadband, and to that end Sky’s latest figures for its broadband service in the UK and Ireland, to the end of December 2017, show that fibre rollout has increased to 33% (up from 21% last year). It is estimated that it has connected just under half of its 12.9 million total retail customers.
Customers can currently use the Sky Go mobile and Now TV broadband service to watch a limited range of Sky channels without a dish, but the new offering will allow access to potentially all of its 270 channels, presumably including those in 4K, depending on broadband speeds.
Now Sky is to launch a low-cost (£14.99/US$21) USB device that, plugged into a smart TV, will provide access to its content including movies and live sport such as Premier League soccer.
Additionally, through the device, it is catering for customers unable to access Now TV content via Wi-Fi when they are away from home with a new download service. A similar feature has proved popular with Amazon and Netflix customers.
This is timed to coincide with the European Union’s relaxing of geoblocking this summer to allow EU residents to access their paid subscriptions for online media while visiting other EU countries.
“If you are away at a conference or work you can plug and play TV from a hotel room,” said Gidon Katz, managing director, Now TV. “And by the summer holidays people will be able to take Now TV away with them and not worry about data charges for films.”
Customers can choose to buy day- or week-passes to watch Sky content without being locked in a monthly contract.
Sky’s ARPU in the UK and Ireland for the six months to the end of December fell £1 to £46 a month, although the churn rate stabilised, decreasing from 11.6% at the end of 2016 to 11.2%. Sky attributed this to more customers taking its premium Sky Q box.
All of this could be of more interest to US media watchers if the takeover of Fox by Disney proceeds. Rupert Murdoch controls 21st Century Fox which owns 39% of Sky.
With subscription-based streaming services becoming increasingly popular alternatives to traditional TV services, it is worth noting Sky rival Virgin Media’s announcement this week that it will upgrade all its TV and broadband customers to its next-gen V6 set-top box.
The V6 offers 4K playback and comes with a 1TB hard disk. Virgin said its existing V6 customers are three times as likely to use Netflix and twice as likely to use BBC iPlayer as other Virgin box users.
In a statement, the company said it saw on-demand TV services as “partners not predators,” and that the increased power of the V6 compared to previous Virgin boxes was designed to better support such services.
Virgin Media parent Liberty Global has been amassing a series of content deals including global rights to motorsport Formula 1 and stakes in the studio Lionsgate and UK commercial broadcaster ITV, and has a long-term pact with Discovery to carry Eurosport. This week it also said it would invest more in original content, including episodic drama, for carriage on operators including Virgin Media.

‘Tesla for sports broadcasting’: how automated live production is gaining ground

SVG Europe

The autonomous production of sporting events, eliminating the need for an on-site production team, director or camera operator, is making serious headway among professional clubs, federations and leagues. The technology is gaining ground as a tactical and performance analysis tool and is increasingly used to fully automate live streamed production.
Spanish soccer league La Liga, for instance, is the first league in the world to introduce an automated production system that makes every game available to any analyst at any of the league’s clubs. Since the beginning of the 2017-18 season, all 42 La Liga clubs have been using Automatic TV, devised by media group and domestic rights holder Mediapro, for analysis not just post-event but to inform team decisions at half-time.
Keemotion, a US company behind a rival automated sports video technology, “has the potential to disrupt the way all live events are produced,” claims one of its investors, former NBA commissioner David Stern. “And that’s not limited to sports. This is a game-changer.”
Speaking to Forbes, Stern also described the ability to automate production as the “Tesla for sports broadcasting”.
Mediapro has told SVG Europe of major announcements planned for NAB in April, one of which will see the current HD camera system upgraded to UHD with the ability to stitch 8K-10K panoramas in real time.
Aside from La Liga, Mediapro also says its system is in use by two other national European leagues, including one first division, although it cannot reveal names. The system is also active in the US for gymnastics, in Australia for the new Olympic sport of 3x3 basketball, and in the UK, Denmark and China.
“We are delivering both tactical camera feeds and live streaming, and we provide that to analysts, sports bodies and coaches of the clubs on a closed network,” explained Joan Bennassar, who initiated the project within Mediapro’s R&D team five years ago.
“The first step was to understand if we were able to make a video production without any human intervention – just a camera and computer and software analysis of what is happening on the pitch. Within a few months we were able to prove that concept.”
A year later, Automatic TV was installed at FC Barcelona’s training facility to record all training sessions. “We waited one more year until launching commercially worldwide. We now have distributors in more than 20 countries and are able to create a full HD multi-camera production from different angles.”
Primary applications
There are, Bennassar explains, three primary applications for ATV. “The first is tactical analysis using mainly a master view of the game. Conventional broadcast coverage concentrates on only a few players. You are not normally in a wide shot so you cannot see everything that is happening. As a coach or a scout you want to see all the players all the time.”
La Liga is making most extensive use of this Tactical camera. Software specifically designed to manage a 3-camera fixed rig creates the video feed, which is imported into the video motion analysis tool Mediacoach, also devised by Mediapro.
The system, which was approved by every club, means each club has access to the same tactical data for all home and away games – though how that is interpreted is down to the skills of club analysts. The feeds are stitched into a 6K x 1K panorama controlled remotely by an operator in a purpose-built central control room in Madrid. Individual club analysts can view the live stream and provide feedback to the manager at half-time. An hour after the final whistle, all clubs are able to access any tactical camera feed from any of the other matches via the Mediacoach platform, while La Liga can deliver the video to all interested broadcasters. Other sports analysis tools, including Sportscode, Nacsport and Fluendo’s Longomatch, are also compatible.
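Mediapro hasn’t detailed how its three fixed feeds are combined, but the principle of a fixed-rig horizontal stitch is easy to sketch. The following is a minimal illustration, assuming pre-rectified, exposure-matched frames and a simple cross-fade across a fixed overlap; the frame width and overlap are invented to land on the quoted 6K x 1K output:

```python
import numpy as np

def stitch_panorama(feeds, overlap=96):
    """Concatenate pre-rectified camera frames side by side,
    cross-fading across a fixed overlap region (illustrative only)."""
    pano = feeds[0].astype(np.float32)
    for frame in feeds[1:]:
        frame = frame.astype(np.float32)
        # Linear blend weights across the overlap, broadcast over rows/channels.
        alpha = np.linspace(0.0, 1.0, overlap)[None, :, None]
        blend = pano[:, -overlap:] * (1 - alpha) + frame[:, :overlap] * alpha
        pano = np.concatenate([pano[:, :-overlap], blend, frame[:, overlap:]],
                              axis=1)
    return pano.astype(np.uint8)

# Three 2112x1024 frames with 96px overlaps -> a 6144x1024 ("6K x 1K") panorama.
feeds = [np.zeros((1024, 2112, 3), dtype=np.uint8) for _ in range(3)]
print(stitch_panorama(feeds).shape)  # (1024, 6144, 3)
```

In the real system the hard work is in rectifying and colour-matching the feeds before a blend like this; the stitch itself is the easy part.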
The second ATV application is for training. “Typically, during a training session you will have different groups practicing different things simultaneously. ATV can record several video files separately, automatically and simultaneously so that a coach is able, later, to follow for example the goalkeeper training and then some tactical practice or a small match.”
Valencia FC adopted this in its training facility for the start of the 2017-18 season.
The third application is “broadcast-style” live streams using up to eight HD cameras ringed around the venue. This is being used by clubs as an inexpensive means of streaming to official websites. Bennassar says this is typically semi-automated, with production by one or two personnel on site – a camera operator filming handheld touchline shots and a vision mixer to switch the feeds.
Automatic TV works with nine sports including hockey, basketball, handball and volleyball. It comes with software to schedule games and links to a manned monitoring site.
It does not yet offer automated editing of highlights packages, although where edits are needed there is the capability for an operator to clip selections into a playlist.
“The system can also be used as a low-cost video assistant referee system,” says Bennassar. “We can go back in time and analyse any action frame by frame and provide replays.”
Mediapro says its product is commercialised in a small but competitive and fast-growing market. Rival technologies Pixellot and Keemotion are neither applicable to as many sports as Automatic TV nor multi-cam capable, it claims.
‘Motion Following Technology’
Keemotion is based on a proprietary ‘Motion Following Technology’ which detects and tracks player movements, and follows the game through additional scoreboard interaction. The technology enables coaches to break down, analyse and share game footage in real time, while fans can simultaneously watch the event and connect with others through social media. Games are available either live or on-demand, and are accessible for web streaming and mobile distribution.
The system – recently backed with $3.6 million of funds from investors including Guggenheim Baseball Management, the Los Angeles Dodgers and David Blitzer, co-owner of the Philadelphia 76ers and New Jersey Devils – is being used within both divisions of the men’s pro basketball league in France (LNB Pro A and Pro B) and by all 12 teams in the LFB, the women’s pro basketball league in France. Across all three leagues, Keemotion has taken the production of French basketball from four games per week to over 20.
Keemotion works with basketball, ice hockey, volleyball, handball and futsal, and the new funds will be used to address more sports.
The Brooklyn-based outfit currently produces basketball games and other sporting events for professional leagues, colleges and universities and high schools in nine countries – the U.S, France, Italy, Germany, Austria, Finland, Belgium, England, and The Netherlands. Its European customers include Bayern Munich, the Austrian Basketball League and the Finnish Basketball Association. It says it also works with content distribution partners including Sky Austria and Dailymotion.
Meanwhile, Keemotion is in early-stage conversations with a number of technology companies to figure out how its technology might potentially be used more broadly in film production, sports and consumer technology, or how technology such as deep learning and artificial intelligence might help to improve its product and accelerate the development of its automated technologies.

How 5G, VR and eSports are carving the Olympic Games future

RedShark News
The Winter Olympics next month not only promises to be the most viewed and most interactive online sports event yet; technology trials around the edges of the Games also point the way toward the future of sports coverage, and of the Olympics itself.
Chipmaker Intel has emerged as a significant player in PyeongChang and will be hoping that its innovations around 5G, Virtual Reality and eSports will turn attention away from more fundamental security problems in its silicon.
It is using its multi-million dollar sponsorship of the Olympics from now until 2024 to showcase its technologies and also to advance its business in Asia.
VR is the most advanced of these technologies. Working with the IOC’s official production partner Olympic Broadcast Services, Intel plans to record and live stream a number of events in VR using its own Intel True VR system.
Its rigs are equipped with six pairs of lenses (12 cameras total) to capture a stereoscopic 180-degree view. Intel’s goal is to show off the power of its processors, which in this case can ship data at a staggering 1TB per hour using fibre optic cables and high compute servers.
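As a sanity check on that figure, 1TB per hour corresponds to a sustained bit rate of roughly 2.2 Gbps. The conversion below is illustrative only (assuming decimal terabytes), not an Intel specification:

```python
# Back-of-envelope: convert a quoted 1 TB/hour video feed into a
# sustained bit rate (decimal units: 1 TB = 1e12 bytes).
def tb_per_hour_to_gbps(tb_per_hour: float) -> float:
    bits = tb_per_hour * 1e12 * 8      # total bits transferred in one hour
    seconds = 3600                     # seconds in one hour
    return bits / seconds / 1e9        # gigabits per second

print(round(tb_per_hour_to_gbps(1.0), 2))  # 2.22 (Gbps sustained)
```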
Up to 50 hours of action will be made available in VR and US broadcaster NBCU is one major rights holder taking the service. Intel has form here. Later this year, it partners with Turner Sports to deliver live NBA games in virtual reality. It will also provide 360-degree volumetric video with freeD, a technology it acquired from Replay Technologies in 2016. The system can freeze action from any angle, then rotate all around it, much like the bullet-time effect used in The Matrix. For the NBA, it creates a 3D render of the court using 28 UHD cameras connected to Intel's servers.
Intel says that, with VR and freeD, it is creating a new category for sports entertainment that it calls “immersive sports”. Immersive sports requires high-performance computing and it’s also data-driven – fuelling the continued build-out of the cloud. For athletes, coaches, broadcasters and fans, the ability to capture, analyse and share data adds compelling new dimensions to the game.

5G for Omni-view action

Intel has built a gigabit-speed wireless broadband network for low-latency video and live-streamed content at Gangneung Olympic Park, in Gwanghwamun in Seoul, and at other Korean Olympic venues.
It is doing this in partnership with Korean telco KT, which is providing 35,000 wired lines, a wireless network that can host 250,000 devices simultaneously, 5,000 wireless access points, and a data centre.
The 5G capabilities will be showcased in a number of ways. Among them is action from the Gangneung Ice Arena, which is ringed with UHD cameras. Servers will process time-sliced views of the skaters in motion, coupled with AR data, which will be sent to spectators in VIP zones over 5G for viewing on mobile devices. Users will be able to select and view multiple angles of the skating competitions.
The Olympic Cross Country course has also been peppered with UHD cameras. The feed will be connected to a 5G network to offer ‘Omni-View’, an experience that tracks every individual skier in real-time and streams the footage live to fans in the Olympic Village.
According to Intel, fans will get an athlete’s view from various angles, plus the athlete’s biometric data.
The service is built on Intel’s 5G Mobile Trial Platform which is powered by Intel's field-programmable gate array circuits and Core i7 processors.
Intel is introducing all this with more than half an eye on 2020, when the technology will be deployed far more widely. At Tokyo 2020, Intel predicts it will be using 5G within connected cars, for autonomous delivery drones and for smart-city applications.
In South Korea, athletes will also have the opportunity to capture a range of data to improve their performance. For example, they can observe and analyse such variables as form and speed from various angles, or analyse the effect of external factors such as temperature, or snow and ice conditions, says Sandra Rivera, senior executive in Intel's Network Platforms Group, in a blog post.
Esports makes Olympic bid
Another intriguing sideshow is the decision to bring one of the most high-profile eSports championships to PyeongChang. The Intel Extreme Masters PyeongChang tournament, produced in partnership with ESL, is an extension of the Intel Extreme Masters brand, which has run for a decade, and will act as a ‘warm-up’ to the Winter Games. Competitors will play Starcraft 2 and compete for $150,000. A second eSports exhibition features Ubisoft’s action-sports title Steep Road to the Olympics, also showcased by Intel, which has installed kiosks throughout the Olympic Park for fans to play the game.
Esports groups have been lobbying for the inclusion of esports as an official Olympic event, and the IOC seems open to the approach, notwithstanding IOC president Thomas Bach’s reservations about violent computer games.
Esports’ future as an Olympic sport was discussed by the IOC in Switzerland last October. Reports suggest that the Olympic movement recognises the strength of esports among younger people, a demographic the IOC is desperate to attract. For the next summer games in Tokyo, the IOC has added Baseball, Skateboarding, Sports Climbing and Surfing in a bid to attract younger fans.
Action from the IEM tournament in PyeongChang will be aired on the Olympic Channel global digital platform.
Couple all this with ongoing South Korean experiments in cameras and sensors on athletes and equipment, and in AI that delivers faster, more accurate analysis of the resulting data for overlay in AR, and you begin to have a picture of where global multi-event sports coverage is headed.
Reuters suggests that previous sponsors like McDonald’s paid $100m for a four-year cycle of two Games. Intel’s deal covers four Olympics, including Tokyo 2020, Beijing 2022 and Paris 2024.

Wednesday 24 January 2018

2018 Oscar nominations: Inside the technical categories

IBC

IBC365 previews the technical categories for the 90th Academy Awards in which digital acquisition dominates.

It’s always a tricky and controversial task to whittle the year’s films down to five – and then one – to win in each of the 24 categories of the Academy Awards. And with no stand-out feature but lots of very strong contenders, this year’s vote is particularly hard to predict.
The technical craft nominations are particularly intriguing and will provide a lot of mileage for camera maker Arri since its Alexa models were employed on four of the five Best Cinematography nominees.
Few would begrudge Roger Deakins ASC BSC, the vastly experienced British cinematographer, finally winning the Oscar for Blade Runner 2049, his 14th nomination. Having used Arri Alexas for director Denis Villeneuve’s previous films Prisoners and Sicario, Deakins retained the formula, pairing the camera with Zeiss Primes and shooting with a large 2.40:1 aspect ratio for the film’s release in IMAX.
Wanting the LA of 2049 to resemble a smoggy version of Beijing, with constant rain and snow, Deakins created this in-camera, rather than artificially in post.

“Some were a bit sceptical of this approach, so, to convince and reassure them, I worked with the wonderful German special effects supervisor, Gerd Nefzer, who rigged-up a system of sprinklers and nozzles that could fill the stage with mist - something like the Everglades on steroids,” he told British Cinematographer.
“As far as I was concerned the results looked really good and we ended-up using this system a lot during production.”
Director Guillermo del Toro’s science fiction and cold war-inspired romance The Shape of Water was originally envisioned in black and white. When the decision was made to film in colour, Danish DP Dan Laustsen ASC DFF devised a colour palette of steel blues and greens, created in camera on the Alexa XT using filters and gels rather than added in post.
“Nothing is coloured by accident in Master Primes because they’re such high quality lenses,” he told Deadline.
Remarkably, Rachel Morrison ASC becomes the first woman to be nominated in the cinematography category.
Although she had planned to shoot on celluloid for director Dee Rees’ southern American period film Mudbound, budget constraints forced a rethink.
Instead, she chose the Arri Alexa Mini in combination with 50-year-old anamorphic and spherical lenses, telling Variety that they tended to flare when exposed to bright sources “like windows, reminiscent of an older age in cinematography and appropriate for a film set in the 1940s.”
Bruno Delbonnel ASC AFC also went digital, Alexa again, for another 1940s-set Oscar runner, Darkest Hour.
To complement a colour palette of blacks and browns, he makes extensive and subtle use of lighting to evoke Churchill’s journey from shadow into power as he overcomes the Nazis.
“I think it’s usually a big mistake, for a period piece, to say, ‘OK, we’re going to do Kodachrome,’” Delbonnel told Deadline.
“When you look at Kodachrome pictures, the colours are totally wrong. I don’t want to emulate this because then people will feel uncomfortable watching it. So [designing this picture] is more about feeling the period than following what the technology would have been.”
In what would make a great double bill with Darkest Hour, Christopher Nolan’s gripping large-scale reconstruction of Dunkirk is the only one of the five nominations to be shot on film. Not just film either, but 65mm negative, where the visual impact of the IMAX format has made it a powerful Best Picture frontrunner (the drama had the widest large-format film print release in 25 years).

Dutch-born Hoyte van Hoytema NSC FSF ASC shot 70% of the film on IMAX (which is 65mm 15-perf). The rest was shot on 65mm 5-perf on Panavision cameras, with dailies produced by US facility FotoKem in its biggest and most complex large-format project to date.
“The sheer negative size of IMAX and the texture and colour depth, clarity of the film emulsion in combination together was the most obvious way to capture this,” van Hoytema is quoted in IndieWire.
“It also requires very minimum tweaks and corrections to make it look great: no computers to suppress information or mechanical interpretation. It’s just a very pure way of capturing, resembling medium format still photography.”
It’s worth noting that a good number of hit films with acclaimed cinematography this year were also shot on film. These include The Post; Wonderstruck (shot on Kodak 35mm B&W and colour film stocks); The Beguiled; Call Me By Your Name; Wonder Woman; Murder on the Orient Express (another 70mm spectacle); The Florida Project and Battle of the Sexes.
Incidentally, among the Best Picture Oscar favourites, Lady Bird (lensed by Sam Levy for director Greta Gerwig) was shot on Alexa Minis, while Three Billboards Outside Ebbing, Missouri, lensed by British DP Ben Davis BSC for director Martin McDonagh, was shot on the Alexa XT.

VFX Oscar contenders
The VFX Oscar race is being pitched as a battle between Blade Runner 2049 and War for the Planet of the Apes.
The former features several stunning sequences each undertaken by different facilities (including Framestore, Moving Picture Company (MPC), Rodeo FX, Territory Studio and Double Negative) but arguably the stand out is the CG animation of the replicant character Rachael.
Head scans of Sean Young, the actor who played Rachael in the original movie, as well as photos of Young in the early 1980s were used by artists at MPC as visual references.
Actress Loren Peta doubled for Young in the scene. The trick to making the scene work was generating a believable performance from the digital replica, something only possible when the team returned to Young’s original performance and found mannerisms which matched with the 2049 script. Then the whole scene was animated by hand.
With UK studios full, principal photography took place in Budapest. The bulk of the film’s 1,200 shots were created by facilities in Montreal and Vancouver to take advantage of local tax credits.
Creating digital characters with emotional intensity and subtlety was vital to telling War for the Planet of the Apes, not least because apes appear in almost every shot of the movie. The film’s VFX included creating the apes themselves from motion capture, and required VFX supervisor Dan Lemmon to allocate 1,450 shots to lead house Weta Digital, assisted by Stereo D and Exceptional Minds.
Shot and post produced in Canada, the final part of the trilogy is lauded for the performance of Andy Serkis as simian chief Caesar.
“Every time we reviewed shots with Caesar, we had Andy Serkis’ performance side-by-side with Caesar to get the emotion right,” Lemmon told TheWrap.
“There’s no [computer] program that does that for you, that’s only the skill and dedication of the sculptors and technical people.”
London facility Jellyfish Pictures was part of the team that delivered the VFX for Star Wars: The Last Jedi. VFX is always heavy in Star Wars films, covering everything from the new Porg creatures to Snoke, as well as epic space battles and enhanced environments. ILM took the lead on this production, dividing work among its offices in London, San Francisco, Singapore and Vancouver, as well as Montreal-based Rodeo FX.
Also nominated is Guardians of the Galaxy Vol. 2, the first feature shot at 8K resolution (on RED cameras) to accommodate the heavy VFX element of the story. Weta Digital, Framestore and Method Studios were among the VFX shops involved. The opening sequence required them to make 66-year-old actor Kurt Russell appear as his 36-year-old self.
ILM gets another nod for bringing another ape to life – this time a re-imagined version of the classic King Kong story in Kong: Skull Island.
The design concept was to make Kong look like a modernised version of the 1933 Kong by sculpting anatomical body details that the original puppet lacked while retaining the familiar physical silhouette of Kong. “Also, like the original, our Kong walks as a biped and was designed with facial features and expressions that are evocative of the original stop motion performances”, senior VFX supervisor Stephen Rosenbaum told FXguide.
Deliberately unobtrusive work was done by Double Negative for Dunkirk. Nolan was at pains to capture as much action as possible in-camera to evoke a gritty and realistic experience, yet models were expertly combined with aerial footage for scenes of sinking ships, crowd scenes on land and, most notably, the Spitfire dogfights.
Sound plays a crucial role in making any movie tick, but none more so than Dunkirk, where the soundscapes of editor Richard King, winner of two of his three previous Oscars for Nolan collaborations (Inception, The Dark Knight), combine with the abstract and experimental score of Hans Zimmer. Dunkirk is nominated for sound mixing and sound editing, as is Baby Driver, Edgar Wright’s heist film choreographed beat-for-beat to pop music by Julian Slater in close collaboration with editors Jonathan Amos and Paul Machliss, who are also nominated in the Best Film Editing category.


Monday 15 January 2018

Cable Congress 2018: Innovation in Action


Knect365 for Cable Congress
In a recent report commissioned by Liberty Global, ADL research posits that, “If the third industrial revolution leveraged the development of electronics, IT and automated production, the ongoing fourth industrial revolution is redefining the interactions between people, machines and the environment, and redefining the way we live and work.”

Coining the term ‘GigaWorld’ to describe this emergent era, the authors argue that the potential of ‘GigaApplications’ challenges network operators to raise their game, as they consider the timing of their investments.

“While some operators are actively investing, a significant number are only partially committed,” it suggests. “We are at an inflection point with a huge value at stake.”

That inflection point is front and centre of the Cable Congress agenda. According to Phil McKinney, President & CEO, CableLabs: “Together we will explore a vision of how future technologies will change the way we connect and interact with each other in the next three to eight years.”

McKinney, who will deliver one of the opening keynotes, adds, “The foundation of this vision is an ever-evolving network that stimulates innovation and speeds progress. We challenge you to re-imagine what’s possible at Cable Congress.”

CableLabs pinpoints embedded IoT devices which are able to monitor us anywhere, a critical outcome being the ability to keep us safe and healthy longer. “Technologies such as remote diagnostics, once thought of as science fiction, rely on the high speed, secure, reliable wireless connectivity and networking protocols enabled by the cable industry,” says McKinney.

This is just one aspect of Smart Cities and the way we use network infrastructure to deploy new technology-enabled services in the best interests of the community. Cable Congress goes further, asking how the social impact of such services should be measured and what the implications are of the General Data Protection Regulation, which comes into force in May.

Critical real-time two-way transmission characterizes GigaApplications like Virtual Telepresence and Automated Living. Yet to ensure sufficient Quality of Experience for consumers, the enabling infrastructures and networks must provide new Quality of Service features in addition to coverage and bandwidth. The industry is rapidly moving in this direction with a series of technology transitions which advance DOCSIS technology to bring Multi-Gbps broadband connectivity to the masses and which embrace SDN, NFV and other virtualization technologies.

In this way networks increasingly become the central enabler as they convey the data and interconnect devices and applications. Join the CEOs of Tele Columbus, Virgin Media Ireland and the COO of Vodafone Germany as they outline their technology roadmap and hear from Huawei about how video-centric network strategies will revolutionise the cable business.

From Facebook to Amazon to Waze, artificial intelligence (AI) is changing the way we use and interact with technology. It’s also transforming customer service. For example, Comcast announced earlier this year that AI and machine learning will be a priority in developing its next-gen customer care programme.

If ever there was an industry primed for collecting data and making it available to customers, it is cable. For years, quantitative and qualitative research has helped the industry understand consumers, so how profoundly could AI and advanced analytics benefit the customer experience?

Cable Congress also considers the impact that AI is already having on our daily lives. In particular, how does AI - in the form of digital assistants like Alexa - influence our content discovery choices? How effective are algorithms at distinguishing objective news stories from fake ones? The choices we make now impact our relationship with news organisations, with political discourse and with public life in general.

AI is one of many areas where smart technology and content are converging on the network. Indeed, the evolving dynamics between operators and content providers have become central to forward-thinking business strategy, one represented at the heart of several conference sessions.

To enable rapid product evolution as well as to achieve economies of scale, cable operators find themselves in the midst of transition from DVB-C to all-IP as consumers look to curate their own bundled services of content across multiple personal devices.

Executives from Viacom, Endemol Shine and AMC Networks International, among others, debate whether it makes sense for pay TV operators to invest in original content and share exclusive insights into how direct-to-consumer OTT services from channels will impact the economics of pay TV. If the triple-play was yesterday’s competitive silver bullet, what is the ‘secret sauce’ for operators to attract – and retain – customers today? Hear Comcast’s view on this.

A highlight is an interview with Sean Bratches, Managing Director, Commercial Operations at Formula 1, who explains the 180-degree reversal in approach to content from the world’s biggest motorsport, which previously kept rights closely guarded. He is leading a reinvention of the fan experience using virtual reality and YouTube among other innovative tactics. The implications for the future relationship of cable operators and sports rights holders are not to be ignored.

If cable is to maximise its role in public life via the IoT, at the same time as its revenue-generating potential, then government and regulators must provide an optimum operating environment. There is perhaps no wiser or more widely known advocate than Michael K Powell, President and CEO, NCTA (The Internet & Television Association). In a must-attend opening address, the former chair of the FCC will extend his argument that cable has a critical role in the Information Revolution, from the US to Europe. In his keynote ‘Hands Across the Water: Our shared interest in cable’s success’, Powell reiterates that broadband networks require a light regulatory touch to grow and evolve rather than the “debilitating impact of utility-style” micro-management.

“The biggest threat to innovation and improving consumer experiences is not net neutrality, it is an internet that stalls and does not get better,” Powell argues. “Tech innovation and network innovation are symbiotic. Each depends on the other to keep up. Netflix could move from DVDs to streaming because the network had become faster. In turn, because of Netflix, consumers demanded and purchased faster speeds, which could justify new network investment. That virtuous cycle is critical.”

However, Powell also identifies cybersecurity as a major challenge still to be tackled. A recent AT&T survey of more than 5,000 enterprises around the world revealed that only 10% feel confident that they could secure IoT devices against cyber attack.

The cable industry has a number of initiatives under way, including work with M3AAWG (The Messaging, Malware and Mobile Anti-Abuse Working Group) to improve how distributed denial of service (DDoS) information is shared. The industry is also working with the Open Connectivity Foundation to develop IoT security standards. Both efforts will help to combat cybersecurity threats to networks and customers alike, but is it enough to defend against increasingly sophisticated criminal cartels demanding ransom in hard-to-trace bitcoin?

According to Cisco, there are 4.9 billion connected devices today, with 12 billion expected by 2020. Cisco sees this rapid growth maturing into a $6 trillion industry. Liberty Global itself forecasts the GigaWorld innovation cycle to unlock a market of €250-660 billion per year by 2025 in Europe alone; at the global level, we are looking at a value of €1.3-3.5 trillion per year.
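Those device numbers imply steep compound growth. A quick back-of-envelope calculation, assuming a three-year horizon between the two figures (the article does not state the exact interval), gives the implied annual growth rate:

```python
# Implied compound annual growth rate (CAGR) for the quoted
# connected-device forecast: 4.9 billion today to 12 billion by 2020.
# The three-year horizon is an assumption for illustration.
def cagr(start: float, end: float, years: float) -> float:
    return (end / start) ** (1 / years) - 1

growth = cagr(4.9e9, 12e9, 3)
print(f"{growth:.1%}")  # roughly 35% growth per year
```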

If even part of that value is realised, then the industry looks to be in rude health, as long as it widens its outlook far beyond that of the dumb pipe.

“Our vibrant and forward-facing programme for 2018 responds to the extraordinary and ever-changing industry we’re all now a part of,” says Manuel Kohnstamm, President of event organizer Cable Europe. “No longer just about content and carriage, Cable Congress sets the agenda for discussion around infrastructure investment, content curation, innovation, consumer trends, big data and analytics, smart homes, competitiveness and IoT. There has never been a more invigorating or dynamic time to be in this industry nor at its heart - Cable Congress.”

Mixed Reality - Television on Steroids


Digital Studio

There is a vision of television, or more broadly of the storytelling possibilities of the entertainment industry, in which viewers will no longer be confined to their living rooms, but can experience the action just like live studio audiences or even gameshow contestants. Space and time will no longer be constraints, with viewers travelling through new universes without setting foot in the outside world.
This vision is being harnessed by content creators as mixed reality, or MR - “the result of blending the physical world with the digital world,” according to Microsoft. Rather than just adding artificial elements to a real scene as with augmented reality (AR), or creating a completely artificial environment as with virtual reality (VR), MR takes reality, digitises it, and then places all or parts of it into an environment that mimics the real world in real time. MR can be wholly immersive, or it can physically blend with a real-world view.
MR will be one of the technologies that will help spur a “fourth industrial revolution,” according to Cindy Rose, CEO of Microsoft UK.
Along with cloud services, AI, and quantum computing “these technology shifts will reshape our lives, our businesses, our organisations, markets, and society, and we believe that these technology shifts will deliver incredible benefits,” she said.
So how can the TV sector use this new technology? According to Stig Olav Kasin, chief content officer at The Future Group (TFG), the biggest challenge for content creators is developing the storytelling structure to be suitable for TV.
“Viewers are used to a linear 2D story. Creating a unified experience where people can move around freely like in a game simply isn’t possible, or at least no one has cracked it yet,” he says. “The danger for content creators is that they fall into the trap of making the same stories we have today, simply with new visual effects.”
Viewers should be able to participate in the story and universe – on equal footing and at the same time as the contestants.
This was TFG’s goal when it created gameshow format Lost in Time in partnership with The X Factor and ‘Idol’ producer Fremantle Media International (FMI). Across six virtual worlds (including the Wild West, Ice Age, Medieval Age and Jurassic Period), contestants compete in different challenges in a studio show. The contestants and props are real, but everything you see on screen is visual effects, akin to what you previously would only get from Hollywood movie productions. Even better, the visual effects are real-time capable in a full multi-cam setup.
The big departure from traditional gameshows, though, isn’t just the graphics. At the same time as the show is aired on TV, viewers can compete on equal footing with the contestants and be part of the story via a mobile or tablet app. The best players even win the same prize as the contestants. This takes MR one step further: the real and virtual worlds become one universe, allowing TV contestants and TV viewers to be part of the same storyline – one world, one story, one experience. This is what TFG calls interactive mixed reality (IMR).
“In the 2000s, viewers got the power to vote for their favourite contestant in talent competitions and reality shows,” says Kasin. “It was also good business for the broadcasters. In the IMR universe, however, viewers can be participants in their own right. At the same time, broadcasters and advertisers get the chance to communicate directly with the viewers and gather data about their interests and behaviour, like social media companies can.”
Lost in Time premiered on Discovery Communications-owned channel TVNorge, Norway’s second largest commercial channel, last spring. According to TFG co-founder and CEO Bård Anders Kasin, the show increased viewing in the channel’s slot by 64% and saw the mobile game played 7 million times during the season, in a country of just 5 million people. “The interactivity was extremely high – much higher than we expected.”
Now the show is coming to the Middle East courtesy of Dubai TV. It is destined to be “the first of its kind in the Arab world”, according to Sarah Al Jarmen, Dubai TV’s director.
The Dubai TV version of the format is set to air across the channel’s pan-regional Middle East and North African reach. It won’t differ radically from the Norwegian version, although it will be in Arabic and there will be a greater emphasis on mobile game interactivity. Two seasons of 13 episodes will be produced out of the Oslo hub, with contestants flown from the UAE to Norway, and will show in 22 territories in the Middle East and North Africa.
In a statement, Anahita Kheder, the senior VP of the Middle East and Africa for FMI, called the show “hugely ambitious and disruptive” and that FMI was very excited “to bring this loud and buzzy format to the Middle East.”
At the moment, VR headsets aren’t distributed widely enough to justify a primetime viewing slot for a live show. That’s why TFG says it developed the games for iOS and Android devices, giving a large global audience the chance to compete and engage with the content. “However, once VR headsets are more widespread, it will open up a new world of possibilities for the TV industry to blend the best elements of gaming and traditional TV,” says Kasin.
Signs of this are already evident. In partnership with the Turner-IMG ELeague, TFG injected CG characters into the broadcast of the Street Fighter V Invitational eSports event last spring. Characters from the game struck fighting poses on the studio set, viewable by studio audiences on adjacent screens and by Turner’s TV audience. It took a couple of months to produce the characters, but the resulting animations were displayed live, without post production, combined with physical sets and presenters.
TFG was at it again in October for the ELeague’s Injustice 2 World Championship, broadcast on TBS, Twitch and YouTube. Among the 3D character animations presented to viewers at home as if interacting with the studio audience was Batman. This promises to be the first of a wider deal to augment more superhero characters from the Warner Bros. stable in mixed reality.
It is interesting to note that Bård Anders Kasin was a technical director at Warner Bros. during the making of The Matrix trilogy, which is when he came up with the initial idea for the mixed reality platform.

A new Frontier
The technology platform underlying TFG’s MR format was developed with Canadian broadcast gear maker Ross Video and is being marketed as a standalone software application by Ross.
Branded Frontier, it’s promoted as an advanced form of virtual set for the creation of photorealistic backgrounds and interactive virtual objects.
At its heart is the Unreal gaming engine, from Epic Games, used as the backdrop renderer of scenery through features such as particle systems, dynamic textures, live reflections and shadows and even collision detection. This works in tandem with Ross’s XPression motion graphics system, which renders all the foreground elements.
Of course, game engines were never designed to work in broadcast. Unreal, or the Unity engine, is superb at rendering polygon counts, textures or specular lighting as fast as possible on a computer. They do not natively fit with broadcast signals, which must correspond to the fixed frame rates of SMPTE timecode. However, when it comes to rendering performance, game engines are a real step ahead of anything in a conventional broadcast virtual set: they can produce a frame in a few milliseconds, comfortably inside the 25 to 50 frames per second that broadcast demands.
What TFG and Ross have done is to re-write the Unreal code so that the frame rates output by the game engine’s virtual cameras match those recorded by robotic studio cameras. They have succeeded in putting photorealistic rendering into the hands of broadcasters. The virtual worlds are created in advance with features like global illumination, real-time reflections and real-time shadows, then rendered live and mixed with live-action photography.
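The underlying idea of locking a free-running renderer to a fixed broadcast cadence can be sketched as a fixed-timestep loop. This is a simplified illustration of the principle only, not the actual Frontier/Unreal implementation; the 50fps rate and function names are hypothetical:

```python
import time

# Simplified sketch of pacing a free-running renderer to a broadcast
# cadence (50 fps here, as in European HD). Illustrative only.
BROADCAST_FPS = 50
FRAME_PERIOD = 1.0 / BROADCAST_FPS  # 20 ms per output frame


def render_frame(frame_number: int) -> str:
    # Stand-in for the engine's render call, which typically
    # completes in a few milliseconds per frame.
    return f"frame {frame_number}"


def run_locked(num_frames: int) -> list:
    frames = []
    next_deadline = time.monotonic()
    for n in range(num_frames):
        frames.append(render_frame(n))
        next_deadline += FRAME_PERIOD
        # Sleep off the remainder of the frame period so output
        # ticks at exactly the broadcast rate, absorbing the
        # renderer's variable (but fast) frame times.
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)
    return frames


print(len(run_locked(5)))  # 5 frames, emitted at 20 ms intervals
```

The real system additionally genlocks to house reference and matches the robotic cameras' tracking data frame-for-frame, but the pacing principle is the same.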
Fremantle is also promoting the format’s advertising potential. Product placement could simply be ‘written into’ backdrop animations designed to mimic the virtual environment (think of a Pepsi logo styled to fit a saloon in the Wild West). Commercials could also be created in Unreal Engine so that viewers need not feel they are leaving the show's virtual universe. Sponsorship will be tailored to the MENA market.
Other format sales of Lost in Time are pending, including in China, while Bård Anders Kasin reveals that TFG is working on a new virtual reality component for the series.

Karim & Noor
Blink Studios, the Abu Dhabi-based content creator, incorporated MR experiences into its animated ‘edutainment’ series Karim & Noor. The ‘holotoon’ tackles key educational messages, touching on social emotional learning, and puts children at the centre of an immersive storytelling experience via MR technology.
According to Nathalie Habib, GM and Executive Producer at Blink Studios, the best way to describe MR is by defining the function of the most elevated mobile kit that offers this experience.
“Microsoft HoloLens is the first self-contained, untethered head-mounted holographic computing device for mixed reality,” she says. “It blends 3D holographic content into your physical world, giving holograms context and scale to interact with digital content and the world around you. HoloLens lets you go beyond the screen with holograms to visualize and work with digital content in relation to your real world, unlocking new insights and capabilities.”
She adds, “MR does not block you from your reality. That is the main impediment it overcomes. It engages you while you are still in touch with your reality which is favoured especially in verticals related to education and healthcare where real communication is equally important to virtual engagement.”
Producers wanting to work in MR will need the skills to build 3D holographic content for Windows Mixed Reality using the HoloToolkit for the Unity game engine, and to drive holograms using gaze, gestures and voice.
 “Science fiction becomes science fact and Unity and Universal Windows Platform app developers are leading this revolution,” says Habib. “Blink is not about just using and promoting the MR devices but actually creating, developing and producing original content for it. We started our education of MR content creation with our own IP, Karim & Noor, by challenging and grappling with storytelling and immersion, and trying to understand the capabilities of the available latest technology to deliver a story engaging experience. We are working on identifying further content that will leap the MENA region into MR experiences that is set to become mainstream in the next two to three years.”