## Barco
### Rec 2020 vs P3 in the XYZ container
- Rec.2020 is a color space covering a wider gamut than Rec.709 and is the current industry standard for UHD.
![[rec 2020.webp]]
- Alongside the introduction of its new HDR projector, Barco compared Rec.2020 with P3. DCPs are carried in an XYZ container; DCI-P3 covers roughly 25% more gamut than Rec.709 but less than Rec.2020, so Rec.2020 material was clipped to P3 when mastering DCPs.
![[rec2020 p3 rec 709.webp]]
- Because Barco's upcoming HDR projector can reproduce the Rec.2020 gamut, it can show real-world colors that fell outside the DCI-P3 gamut of older projectors.

### SDR vs HDR
- HDR and SDR differ in both luminance and color gamut. HDR reproduces a wider luminance range than SDR; HDR cinema projectors that used to sit around 100 nits now reach 300 nits.
- HDR also carries a wider gamut: HDR targets Rec.2020 while SDR targets Rec.709. A typical PowerPoint deck lives in Rec.709, which is why the Barco logo looks orange when projected; for several years Barco laser projectors have offered the option to switch to P3 or Rec.2020.

### EOTF 2.6 vs PQ
- The gamma (transfer function) used to map image and video signal levels to display luminance in a way that matches human perception has evolved differently for SDR and HDR.
- SDR cinema uses a gamma of 2.6 (EOTF 2.6), which cannot represent the up-to-10,000-nit range targeted by HDR. HDR instead uses PQ (Perceptual Quantizer) to cover a much wider range from dark to bright; the HDR projector uses PQ to control luminance over this wider range.

---
## SMPTE Standard Update
### HTML
- SMPTE presented a workflow that authors and publishes its standards documents in HTML.
- The differences from the existing process can be summarized in four points:
![[html standard workflow.jpg]]
1. Single source: previously each TC project group and editor kept its own version of a document; with the HTML workflow, version control is centralized and every output is generated from a single source.
2. Automation: authoring in Markdown/HTML means document styling and formatting are applied automatically, and conversion between HTML and PDF is handled automatically as well.
3. Metadata: document metadata is generated and processed automatically.
4. Web accessibility: HTML works in a browser, so no separate word processor is needed and the latest standards are quickly accessible.
- The first document produced in HTML is the C4029 revision, expected next year; the revision work has already demonstrated the usefulness of the HTML-based workflow.

### AI-related standards document update
- SMPTE ER 1010:2023 "Artificial Intelligence and Media" was introduced; it covers how AI/ML is used in the content industry, potential applications of the technology, and cautions about ethics and bias.
![[Deep learning from smpte standard.jpg]]
- Reference: [[SMPTE ER 1010 2023 Artificail Intelligence and Media.pdf]]

---
## Packed Digital Cinema Distribution Master
- Why pDCDM is needed
    - A 4K feature shot as roughly 10 TB of source grows to about 200 TB across versions and languages, and up to 360 TB once 3D and frame-rate versions are added.
    - As a result, transferring the data to an LDC (Local Data Center) or RDC (Regional Data Center) takes 2 to 4 days; physically carrying drives can take up to about 6.5 days.
    - These transfer delays cost time and money in production. Compressing the data before delivery addresses this, and that is what pDCDM (Packed Digital Cinema Distribution Master) does.
![[data delivery problem.jpg]]
- The approach
    - pDCDM applies to the image portion of the picture/sound/subtitle package: the image source is compressed losslessly with JPEG 2000, cutting the size by about 50%.
    - In the workflow shown, J2C 48 fps image sources are uploaded to AWS S3, and all subsequent versioning and language editing happens in the cloud, which improves the efficiency of data management and distribution and makes large volumes cost-effective to handle.
![[pDCDM workflow.jpg]]
- Reference link: https://github.com/SMPTE/st428-24
- Reference: [[SMPTE ST 428-24 D-Cinema Distribution Master - Packed Image.pdf]]

----
### Opening by David
Before we get started, I would like to acknowledge our Education Vice President Michael Zink and our Director of Education, and thank them for the work they have done for today. Education is working to be a space that helps people throughout the lifespan of their career, from entry through mid-career and into advanced positions, so we hope you will join us for other SMPTE events throughout the year, produced by our sections, by our education department, and online as well. SMPTE is working in many areas as we continue to provide a space for the industry to come together and create solutions that advance technology, its use, and its impact.
This morning, we're going to hear about SMPTE's directions in these areas, and to start us off I am pleased to introduce our president, Mr. Renard Jenkins, who will share some details and efforts in our emerging technologies. Renard.

### Renard Jenkins
Good morning. Good morning. All right, this is the first day; we haven't gotten there yet. So thank you, David, for that. As David said, ==we are looking at what it is that we need to be focused on as an organization; as time and technology continue to change and grow, those are the moments when we need to focus our attention as an organization.== We have been known for years as the group that takes care of that: focused on standards, focused on bringing things together. In doing so, it is necessary for us to look at areas such as AR and VR, gaming, and VFX, areas in which we need to get more engaged. It also includes animation, which for a long time has been sort of on the back burner. So part of what we have been doing over the last few years is really taking a look at where we can engage with the various societies and organizations that are part of all of this, and we began to see whether there is a place for SMPTE. Now, ==recently we released the AI document==. That document ==took about three years to== put together, and contrary to what some of you may think, it was not three years because that's how long it takes to create a standard. ==It was actually three years because things were changing so quickly.== As the document was getting prepared and ready to launch, something would change, and it would not be a small change. If you look at how quickly things are moving in this space, you can see how this is about to join what we do on a daily basis. As we continue to look at where we are going as a society and as an industry, it is becoming very clear to all of us that this is a foundational part of the future of content creation: artificial intelligence, true artificial intelligence, as opposed to the machine learning, automation, and deep learning that were the ways we were looking at things, say, 18 months ago. What we are doing today, what we're actually bringing in, is light years from what happened just 18 months ago. I think some of you remember the Will Smith eating spaghetti video. Now we have Sora, and we have all of these tools coming together to give us a new way of doing some of the more difficult things that we have done in the past. ==There is fear, there is concern, and that's a part of the technology process. But I would implore you to look at these tools, and I want to be very clear, these are tools, these are not entities. Look at these tools, see how they can enhance what it is that you do.== So if you get a chance, we'd like for you to take a look at the AI document. That's your homework. We hope that each and every one of you will enjoy the program that has been put together today. I had a chance to see some of the content; I think this is going to be really exciting as we look at the cameras, the lenses, and this wonderful, beautiful laser projector in the back. As we do that today, enjoy yourselves.
Make sure that you have a chance to take these new tools, these new ideas, these new moments, and move them forward, because this is where it starts. With that, I would like to introduce Michael Zink, who is our Vice President of Education for the Society, and he'll get you started.

### Michael Zink
Good morning. That's much better. I'm really happy to be here today. We have a great program; we put a lot of thought into it, and as usual a program like this doesn't come together with just one person. We had a wonderful team thinking about what we could do to really elevate it and bring something exciting here, and I want to take a second to thank the people who helped make it happen on the program committee: that's Bill Hogan, that's Greg Ciaccio, that's Rich Welsh. And then, of course, nothing, literally nothing, at the show would happen without Maya. So last year was a really big year for cinema. There were a couple of really noticeable items, and we wanted to zero in on them and spend some time in the sessions today to really put a spotlight on them; I'm going to highlight three. The first one is what you'll see in our next session, which is focused on the massive success we've seen with live music events being brought into theatrical environments, creating very captivating and immersive experiences in a theater for people to experience a live event that had happened. The biggest one, obviously, was the Taylor Swift tour, which brought in somewhere around $1 billion in revenue, but there have been many, many others that were very successful as well. We're really excited today to have the creative team behind the Taylor Swift one and a number of these other concerts here with us to help us understand what goes into creating engaging content for cinema without interrupting a live performance that's already happening, because people pay a lot of money to go to the live event as well. ==The second item that was very interesting last year was the success of IMAX==. IMAX had its biggest year in history last year, a lot of it probably due to the success of the Best Picture winner, ==Oppenheimer==, but there were a lot of other IMAX releases as well; there's one out right now with Dune: Part Two. We're really excited to have folks from IMAX here to do a real deep dive into what it takes, from the lens all the way to the actual experience in the auditorium, and into what makes an experience an IMAX experience. I'm very, very excited about that. And in our last session today, ==we will have our friends from the Sphere back.== This is the third year that we're talking about the Sphere. The first time, they unveiled what the Sphere actually is and what it does; last year we talked a lot about the mechanics and what goes into bringing the sound, bringing the video, and managing the infrastructure for that. But all of that so far had been in theory, because it hadn't opened at the time. This year is the first time we can actually talk about it after it opened last year, and we've seen a lot of successes, both on the live music side, with U2 having the residency there, and with original content created specifically for that venue with Aronofsky's Postcard from Earth.
So we're very excited to have them back and get some more insights; that is the fourth session, later this afternoon. Now, one thing that all of these successes have in common is that they all rely on standards, and SMPTE is obviously one of the leading organizations when it comes to standards, certainly in the media supply chain. With this session right now, we're going to focus on that. In a little bit I'm going to bring out SMPTE's very own Standards Vice President, Sally Hattori, and she will provide an overview of what SMPTE has recently been doing, along with a couple of presentations that zero in on some of the other activities as well. Before we get there, we're going to get an update from Barco. A little bit of background: at CinemaCon, earlier this week, ==they finally showed and released their new HDR by Barco projector, which is completely based on something called light steering technology==. ==Light steering== is not a new term; if you've been to some of these conferences before, you might have heard it come up over and over the years. I remember, probably almost a decade ago, seeing a version of this on a projector that was about this big; they put up a small picture, and now the baby has all grown up and it's a real, full projector that can light up an entire large screen. So I'm very excited to hear more about that, and congratulations to Barco and the entire team for making it happen. I know it's a monumental effort getting something from a prototype all the way to a product; it's very, very amazing. Here from Barco to talk about it are two gentlemen: their VP of Studio Relations and Technology Insights, and JZ, Joachim Zell, the head of HDR content workflow at Barco. With that, the stage is theirs.

### Joachim Zell (Barco)
Good morning, everyone. I just got through speaking at CinemaCon, as you can tell, so the voice is still there, still loud. Welcome, everyone. Full house, love it; love the small intimate setting versus some of the others where you have 1,000 seats and 80 people in that big auditorium. This is fantastic. I know a lot of you. I'm a cinema lifer, 46 years and counting, so I've been through it all, the good, the bad and the ugly, from recessions to pandemics to strikes, but also the good things: ==digital, 70 millimeter, IMAX back in the day==. My very first IMAX was at the Denver Museum of Natural History in Colorado; I saw a dinosaur feature in IMAX that still blows my mind even at my old age, that's how exciting an experience it was. So without further ado, I want to recap the takeaways from CinemaCon this week, Monday through Thursday. It was a very positive overall vibe from both the studio and the cinema-owner perspective; extremely positive things are going on right now, not just in the number of features to be released by the major Hollywood studios but in the quality. We're getting back to storytelling, we're getting back to the magic of movies, at least in my humble opinion. But again, I've been through everything.
I've been through a lot of technical achievements and advancements in cinema, innovations such as Dolby Vision, IMAX, and immersive audio in all the flavors associated with those. And now, to Michael's point, ==we have the new HDR by Barco==, ==truly one of the most realistic projection-based high dynamic range solutions for cinemas==. So without further ado, I'd like to energize the crowd by showing you a little reel, the Barco 90th anniversary reel. It's got some fun in it; it's a Belgian-made clip. Take it away, James. [reel plays] All right, that actually looked pretty amazing on the big screen in the Colosseum at Caesars Palace earlier this week. Next up, my esteemed colleague and image scientist, head of HDR workflow at Barco, and my friend, JZ.

#### Rec.2020 vs P3
OK, I just didn't want to say good morning again, so there's another greeting nobody has heard before. When I saw the Belgian clip yesterday during the rehearsal, I saw it for the first time, and now the race is on between Flemish and German up here. I want to ask a question about the Rec.2020 standard. ==We all know about the standard, but how does 2020 feel?== We don't really know; we haven't seen it often enough. This SDR projector here is the latest generation of the Barco laser projector series. The HDR projector is based on the same foundation as the SDR one, but it can do more; both go 100% into Rec.2020. If you put your spectroradiometer on and measure red, green and blue, the x and y measurements are what the standard recommends. We could not install the HDR projector in here, because if the first three rows turned around and looked into the back, the laser would blow out your eyeballs; for safety we were simply not allowed to install it in such a small room. The body of it will be installed during NAB in the South Hall, upper floor; you can touch it, feel the box, but again, we cannot switch it on. ==It will soon be in a theater near you; in about two months we start rolling them out.== Now, about Rec.2020 and P3: when we decided with the DCI group to go Cap XYZ, there was already space in there. P3 inside CIE XYZ, or Rec.2020 inside CIE XYZ, fits very nicely. So when generating this content now, we have to do nothing new. We just tell the tool that converts our RGB from post production into CIE XYZ: we were color timing while looking at P3, now package it into CIE XYZ; or, these days, ==we say we were color timing and everything was mapped to Rec.2020, now go from Rec.2020 into Cap XYZ== (a small conversion sketch appears a little further below). I want to ask the question: do we need a flag in our DCP package for it, yes or no? What will happen on older generations of projectors is that they will clip at P3, because they simply cannot go further out. So I want to ask as well: in the past, when we generated DCP packages, did everybody who made them take care that we mapped to P3? CIE XYZ lets it through, but perhaps we couldn't see it because the projector could not do it. All new projectors coming out will show Rec.2020.

### SDR vs HDR
The next question is SDR versus HDR, and we see it in the whole market as well: HDR is simply the next step in evolution. Why should we not go to HDR?
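To make the Rec.2020-into-CIE-XYZ packaging JZ just described a little more concrete, here is a minimal sketch, not Barco's or any DCP tool's actual code. It assumes display-referred linear Rec.2020 RGB in [0, 1], a 48 cd/m2 reference white, and the usual DCDM encoding with the 52.37 normalization and a 1/2.6 exponent; the function and constant names are mine.

```python
# Minimal sketch: linear Rec.2020 RGB -> CIE XYZ -> 12-bit X'Y'Z' DCDM code values.
import numpy as np

# Linear BT.2020 RGB (D65) -> CIE XYZ
M_2020_TO_XYZ = np.array([
    [0.636958, 0.144617, 0.168881],
    [0.262700, 0.677998, 0.059302],
    [0.000000, 0.028073, 1.060985],
])

PEAK_NITS = 48.0    # SDR cinema reference white luminance
DCDM_NORM = 52.37   # DCDM normalization constant


def rec2020_to_dcdm_xyz12(rgb_linear: np.ndarray) -> np.ndarray:
    """Encode linear Rec.2020 RGB (..., 3) into 12-bit X'Y'Z' DCDM code values."""
    xyz_rel = rgb_linear @ M_2020_TO_XYZ.T      # relative XYZ, Y of white = 1.0
    xyz_abs = xyz_rel * PEAK_NITS               # absolute values in cd/m2
    cv = 4095.0 * np.clip(xyz_abs / DCDM_NORM, 0.0, 1.0) ** (1.0 / 2.6)
    return np.round(cv).astype(np.uint16)


if __name__ == "__main__":
    # Reference white (R = G = B = 1.0): the Y channel lands near code value 3960.
    print(rec2020_to_dcdm_xyz12(np.ones(3)))
```

A P3 master would go through the same XYZ container with its own RGB-to-XYZ matrix; the point JZ makes is that nothing in the container changes, only the source gamut and whether an older projector can actually reproduce it.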
It's also that, when we built the DCI standard and the first DCI-compliant projectors after 100 years of film projection, we wanted to be at least as good as film. What does that mean after 100 years plus 20, 120 years? We are at 14 foot-lamberts, and these days we call it nits because we have to sound more fancy over time, so call it 48 nits. Dolby Vision did the first step of marketing and rolling out HDR projectors into the cinema at 108 nits. Evolution moved on; the new projector can do 300 nits. For those of you located in Hollywood: together with the ASC and SMPTE we already had the StEM2 material, the Standard Evaluation Material version two; we generated the content on the 300-nit projector, then put the same DCP onto the Onyx wall and saw the same picture.

### EOTF 2.6 vs PQ
Now the EOTF. Instead of calling it gamma, we now call it the ==electro-optical transfer function: EOTF 2.6 versus PQ.== Actually, let me talk about the bottom line first. The bottom line is a funny one: are we in head or full, legal versus extended, video range versus data range? I crossed it out right away; I will not talk about this. This is actually the greatest piece of job security for an imaging scientist. Fifty percent? Yes, thank you, there's another imaging scientist. Fifteen percent of our time we chase problems caused by this strange behavior, that some devices work in legal range and some work in extended range. The formula for success is: if the player is in legal range, then the recorder and the monitor have to be in legal range, and vice versa. To some it's the biggest curse in our industry, and to others it's a blessing. For me it's job security, but I took it out; I don't want to talk further about it. Now, let's talk about Rec.709 versus P3. It's actually interesting: you see my Barco logo here. When we run PowerPoint presentations, the projector is switched to Rec.709, because everybody makes their PowerPoint presentation at home and downloads graphics and images from the World Wide Web, and they are all in Rec.709, which is pretty close to sRGB; we know what the difference is in detail. So PowerPoints are usually run in the Rec.709 color space, and this is why our Barco logo looks orange. But in the cinema we have the option of P3, and for a couple of years now, since the lasers came out, we have had the option to run Rec.2020 as well. Now, I will show you the difference between both if we can dim the room light and run this DCP package. There I told the DCP generator: I come from Rec.2020, put Rec.2020 into Cap XYZ, so that we see this color space. Yes, the darkest dark is the best, and I'll talk over it; I recorded it without sound. Here you see the difference between left and right, here the difference between top and bottom. So this shows what's possible. Now, it's really difficult to find Rec.2020 content, because we all know what Pointer's gamut is: when sunlight hits ordinary objects on Earth, the colors stay close to Rec.709. But when you see the stop light of a car, it goes to P3, and if you go not very far beyond that, you get to Rec.2020. Rec.2020 also shows the cyans really well. That was the one thing we lost by going from film to digital: the cyans on film were always better than digital, but we didn't talk about it because we wanted P3 and digital cinema to succeed. But now, with Rec.2020, we have the cyans back.
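Since the talk contrasts the legacy gamma-2.6 curve with PQ, here is a small sketch, not from the talk, that evaluates both. The ST 2084 constants are the standard ones; the 48-nit SDR peak (14 fL x 3.426 is about 48 cd/m2) and the choice of sample signal levels are illustrative assumptions.

```python
# Sketch comparing the SDR cinema gamma-2.6 EOTF with the SMPTE ST 2084 PQ EOTF.

FOOTLAMBERT_TO_NITS = 3.42626   # 14 fL * 3.42626 ~ 48 cd/m2, the "48 nits" above


def eotf_gamma26(v: float, peak_nits: float = 48.0) -> float:
    """SDR cinema EOTF: luminance in cd/m2 for a normalized signal v in [0, 1]."""
    return peak_nits * v ** 2.6


def eotf_pq(v: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: luminance in cd/m2 for v in [0, 1]."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    p = v ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)


if __name__ == "__main__":
    print(f"14 fL = {14 * FOOTLAMBERT_TO_NITS:.1f} nit")
    for v in (0.25, 0.5, 0.75, 1.0):
        print(f"signal {v:4.2f}:  gamma 2.6 -> {eotf_gamma26(v):6.2f} nit"
              f"   PQ -> {eotf_pq(v):8.2f} nit")
```

At half signal the gamma-2.6 curve gives roughly 8 nits while PQ gives roughly 92, which is why the two cannot share a single mastering curve.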
I don't want to ask the question of which side looks better, left or right. ==It was the creative intent of the director to have the left side out there, and the director could never show it in digital cinema.== Now we can: the saturated reds, the cyans. The greens, the blooms, we all know are pretty close between P3 and 2020. Yes, that's a very stylistic movie; not many movies will go out that far. But then there's animation. And here is an element out of StEM2 where we have this very ugly green medication, and we know the uglier it looks, the better it medicates you. So, just a couple of examples of how it can look. And it's the same between our SDR and our HDR projectors, since the foundation is the same between the two. The clock down there is on zero; that doesn't mean my time is up, there's nobody waiting. Was the clock running from the beginning? I don't even know. I'll keep talking until they push me off the stage. So now the question is: when we make DCPs, do we flag this in the DCPs? That's a snapshot of a Resolve, where we can set an HDR flag. In the next one we can set flags as well: was it Dolby Vision, or was it EclairColor? That's an example from a DCP-generating tool. And in the future, do we need a Barco HDR flag? I think we just need one HDR flag, only one, and this needs to be discussed in SMPTE. Our HDR projector can handle SDR DCPs and HDR DCPs, so it can toggle between the two. And then there's the Q&A. Now, is it true that my time is up? No, I still have 10 minutes; OK, I was rushing it. Dolby Vision is based on the PQ curve, and talking about that standard, we are based on the PQ curve as well. We give a soft roll-off at 300 nits, while HDR in the home gives a soft roll-off at 1,000 nits. When we look at HDR home masters, they don't really go to 2,000 nits even though they can; they are around 400 to 500 nits, because home TV sets cannot go higher. Perhaps a very good, expensive one goes to 600 nits. Again, it knocks your eyeballs out. What we need to think about, in SMPTE or somewhere, is how much light the eye can handle without getting hurt or without closing the human iris too far. The human iris is an analog device: too much light hits us, the iris closes. When we then cut to a dark scene, we see nothing for a while, because the analog device, the iris, has to open up again. So could it be that 1,000 nits feels to the brain like 300 nits, because between the brain and the light sits the iris? So the editor and the colorist have to do this carefully, and perhaps we're going to present studies about that next year at SMPTE; I've talked to a couple of universities already to see if we can put it into numbers or formulas. What we also know is that if the iris opens and closes too often in the cinema, fatigue sets in, and perhaps headache as well; nearly the same effect as when we watch bad 3D. Not that anybody does bad 3D on purpose, but I've seen it. So this has to be discussed: what's the acceptable fluctuation? If there is an explosion and the monster jumps into the room, we perhaps need a shocking effect once or twice, but what we like to do is see contrast in the image. And talking about 3D: when you see HDR projected, it has better blacks as well, not only better highlights, so it generates a 3D type of deep image without looking at 3D.

### Sally (The Walt Disney Studios)
I'm happy to introduce the next speaker. She's the Director of AI and ML Engineering at Walt Disney Studios.
And, a lot more importantly, she's also the Standards Vice President for SMPTE, and she will provide an overview of some of the recent developments at SMPTE with regard to standards. Please welcome Sally Hattori. Thank you, Mike, for the introduction. Very nice to meet everyone. I'm Sally Hattori, SMPTE Standards VP, and I'm going to give you some of the highlights of the standards updates from the past several months. OK, the first one, and in my personal, humble opinion one of the biggest and proudest updates and developments we've done in this standards community, is that ==we have the HTML publication workflow==. We've developed our own tools so that we can edit and publish standards in HTML. The thought process behind this is that we wanted to make SMPTE standards easier to edit, publish, find and consume, and of course, naturally, we want to make SMPTE web-first. As you know, one of the benefits of making a standard web-first is that it's always available; the internet is the place we all go to search for and get technical information today. Nowadays people go to ChatGPT, but that's beside the point. And also, ==HTML is the common language of the web==. So we wanted to make everything in HTML, and here are some of the tools and pages we created, to show you how we can make the HTML workflow work for standards publication. Now, how it used to work before this is that we had a lot of manual process: a project group editing a Word document; an SMPTE editor reviewing the Word document after it had been generated; another round of review by the TC group using another revision of the Word document; and then finally, once it was approved, another round of the SMPTE editor reviewing and making editorial changes, then manually generating the PDF and manually generating the metadata needed to publish the document, all of it manual. As you can see, so many versions living in so many different places, and so many process steps all the way to publication, all manual. So we wanted to change that. ==The basic idea is that we have one source of truth: one document, one place to edit, one place to go for any versioning, which is HTML.== And we have an automatic tool SMPTE has developed that can automatically check the styling of the document and also create the HTML, the redline and the PDF automatically; we can also extract the metadata needed for publication automatically. And what is missing from this slide, since last September, is that we also host the documents ourselves on the SMPTE website. So if you have a SMPTE membership, you can actually get all the standards for free, inclusive with the membership; as of today you don't have to go to the external vendor's website to download the document anymore. You can go directly to SMPTE to download it and actually look at the real-time redline version as well. So the new workflow is that the project group, of course, still writes the document, but once we have the document it goes to GitHub issues and the editing lives on the web. An SMPTE editor then reviews directly on GitHub, and if there are changes you open a pull request, and you can all see it in real time as it progresses (a toy sketch of the single-source idea follows).
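As a toy illustration of the single-source idea Sally describes, and emphatically not SMPTE's actual tooling, the sketch below takes one plain-text source, pulls publication metadata out of it automatically, and renders a minimal HTML view; the field names and sample content are made up.

```python
# Toy single-source pipeline: metadata and HTML both come from one text source.
import html

SOURCE = """\
title: Example Packed Image Standard
doc-number: ST 0000-0
status: public CD
---
# Scope
This document specifies an example packed image format.
"""


def split_front_matter(text: str):
    """Return (metadata dict, body) from a 'key: value' header ended by '---'."""
    head, _, body = text.partition("\n---\n")
    meta = dict(line.split(":", 1) for line in head.splitlines() if ":" in line)
    return {k.strip(): v.strip() for k, v in meta.items()}, body


def render_html(meta: dict, body: str) -> str:
    """Tiny renderer: '# ' lines become <h2>, other non-empty lines become <p>."""
    parts = [f"<h1>{html.escape(meta['title'])} ({html.escape(meta['doc-number'])})</h1>"]
    for line in body.strip().splitlines():
        if line.startswith("# "):
            parts.append(f"<h2>{html.escape(line[2:])}</h2>")
        elif line:
            parts.append(f"<p>{html.escape(line)}</p>")
    return "\n".join(parts)


if __name__ == "__main__":
    meta, body = split_front_matter(SOURCE)
    print(meta)                     # metadata extracted once, reusable for publication
    print(render_html(meta, body))  # styled output generated from the same source
```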
Then, once there is a request for a change, the TC reviews and approves it, and it automatically goes back into the HTML so that it's updated. As you can see, fewer people are involved in the process. Some of the key benefits: the document is always up to date and always available; you can follow the progress as it goes forward; the boilerplate formatting is there, so you don't have to worry about styling anymore the way you did when editing a Word document; the metadata is generated automatically, so you don't have to worry about that; and the versioning is all up to date on the web. Those are some of the key benefits of the HTML workflow, and the tools are available already. The first standard that went through this entire workflow is the DCP revision, C4029; next year you can take a look at it and check for yourselves how well it worked. And let's see if it works today: there is a QR code you can scan on your phone to see if you can access the HTML version that is up and running right now. There's a second one; there's a mistake in this slide's title, this one is actually the GitHub repository, and the previous one was the HTML. And this is the redline version, hopefully it is working for you, and this is the PDF rendering. These are all live and available today, and you can reach out to us for the actual links later if you missed the QR codes. Now, on to something that highlights the standards themselves. As mentioned in the introduction, we published the Artificial Intelligence and Media engineering report, which is freely available; you can scan the QR code and get to the PDF. **What it covers is what machine learning is and what AI is, focused especially on our industry, media, production and post production**; it also gives a generic introduction to machine learning; and, lastly, it covers some of the ethics and bias issues we should be mindful of. It's a pretty good read that you can get through over a weekend, and I really encourage you all to go and get the PDF, maybe tonight. The second highlight is still a public CD, but it's openly out for public comment: it's a revision for the D-Cinema Distribution Master called packed image, pDCDM. What it does: the DCDM used to be a humongous set of uncompressed TIFF files, and this packed image allows up to a 50% reduction in file size by introducing lossless J2K. I believe Rich Welsh, who is speaking next, will go into the impact and benefits of the standard. And that's that. Thank you, Sally.

### D-Cinema Distribution Master-Packed Image
I actually don't know anything else to say about it, fair enough; I think he gave 20 minutes of this presentation, but I'm going to try to do it in 7.5 minutes. Don't worry, I'll explain it to you a little bit; it is Saturday morning. So, why do we want a pDCDM? Those of you familiar with the uncompressed DCDM: that's what we currently get from the DI house, and it then goes to a mastering facility to create the versions for the release. Typically the movie is about 10 terabytes; that's a 4K version of the movie. But in reality we don't have one version, we have a lot of versions, so for a tentpole we normally have about 200 terabytes. These are based on real-world numbers from various releases that we've done.
And that will be things like 2D, 3D, different grades, all the visual versions, and things like textless and texted versions; we typically get a completely new version, textless or semi-textless, that allows us to do foreign-language subtitling and foreign-language main titles. If we now include a 300-nit HDR version, so that JZ is happy, we can go up to about 360 terabytes with all the new versions you would add to that. Now, the times shown are based on some calculations looking at a smaller release and a really, really big release, which I'll come to in a moment. The 44 hours is basically a saturated, dedicated connection running to a local data center. What I'm going to talk about, with reference to all of the pDCDM advantages today, is around cloud processing, which is how we do a lot of ours for our mastering streams. With a local data center you basically get almost full bandwidth, and it's probably a little bit longer than the 44 hours I've just quoted if it isn't in the nearest town; once you account for that, you find it's not exactly that figure. The 48 hours would be to go to a regional data center, somewhere that is not geographically close to where you are, and that's just packet loss on a long transmission; it slows down a bit more the further afield you go. And the hand carry is basically long because you've got to get the data onto some sort of physical drive; usually the best you can do there is a 10-gig fiber connection on the back of some sort of local storage device, then you've got to take it somewhere, let's say that's a day, and then you've got to transfer it at the other end as well. So that does take a long time. But we did look at hand carry when we were looking at the release I mentioned, which was, sorry, I've got a bit of a cold, 770 terabytes, which was obviously huge. We did actually reduce that a bit by not shipping any 24-frame on it, which we created at the mastering phase instead, and I'll talk about that in a little bit. But you can see the transfer times get silly: to get all of that data basically halfway around the world, ==it was a pretty long transfer time in total that we were looking at.== And as a result of this, ==we decided that we might try compressing it instead.== At the time, ==what we did was use the IMF-style lossless JPEG 2000 compression, and we did look at putting it into an MXF file and creating one file per reel.== But in the end, just to save time on the post production side rendering out those frames, we literally rendered JPEG 2000 frames and shipped them in virtual folders to the cloud, and that went fine; we didn't actually lose any frames, which is pretty cool. It was surprising to me, to be honest, how consistent the transfer was. We had two 10-gigabit pipes for that, but one was basically a backup route which we didn't use, so we were using a single 10-gig, and we shipped as each reel was finished. So it wasn't 154 hours as one continuous block, but it wasn't going to be much less than that. Basically, what we wanted to do was halve it, which is what we were able to do by going to lossless compression. So this led to the idea that maybe we could do this all the time, even though a huge release like that only comes along every couple of years.
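A back-of-envelope sketch, using round numbers rather than Rich's exact figures, of why the packing matters. The 200 TB tentpole size and the saturated 10 Gb/s link come from the talk; the 25% saving (dropping the four unused bits of the 16-bit TIFF) and the roughly 50% total (adding lossless JPEG 2000) are the figures Rich gives just below. Overheads and packet loss are ignored.

```python
# Rough sizing and transfer-time arithmetic for DCDM delivery.

def transfer_hours(size_tb: float, link_gbps: float) -> float:
    """Hours to move size_tb terabytes over a dedicated link_gbps gigabit/s link."""
    return size_tb * 8000.0 / link_gbps / 3600.0


UNCOMPRESSED_TB = 200.0                   # all versions of a tentpole release
PACKED_12BIT_TB = UNCOMPRESSED_TB * 0.75  # 12 useful bits out of 16 stored
PDCDM_TB = UNCOMPRESSED_TB * 0.50         # 12-bit packing plus lossless JPEG 2000

for label, tb in [("uncompressed DCDM", UNCOMPRESSED_TB),
                  ("12-bit packed only", PACKED_12BIT_TB),
                  ("pDCDM (packed + lossless J2K)", PDCDM_TB)]:
    print(f"{label:30s} {tb:6.1f} TB  ~{transfer_hours(tb, 10.0):5.1f} h at 10 Gb/s")
# The uncompressed case works out to about 44 hours, matching the
# local-data-center figure quoted above; pDCDM roughly halves it.
```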
And obviously there were a huge number of formats in there; there was 48-frame as well as 24, and that had a huge impact on the data size. But nevertheless, this is an advantage in any data transfer: as you can see, when you're moving hundreds of terabytes, reducing it by half is very useful. So, the pDCDM basically uses JPEG 2000, as mentioned, as the coding. It's mathematically lossless, so what you get back out when you decode is exactly what went in, and that gives you roughly a 50% saving. Also, when we ship a DCDM, it's a 16-bit TIFF file that only has 12 bits of data in it; digital cinema projection and imaging is a 12-bit image carried in a 16-bit container. That was inefficient in itself, because we had four least significant bits that weren't doing anything, so the 12-bit packing alone gives us a 25% saving; that's how you get 7.5 terabytes instead of 10. And it's 128K aligned; I'm only calling that out because it actually is a little bit of a challenge, and I'll come back to it in a minute. It's not a problem with the standard, it's fine, but it does mean you have to do something with it when you get to the decode. So the principles behind the packed DCDM are that we want to preserve all image information, so what comes out is what went in: ==we don't change any pixels. We want to preserve the image-processing throughput: while reducing the data size and speeding up transfers when you're moving data around, we also don't want to introduce a large amount of extra time into the post production process, both for rendering the frames out and for decoding on the receiving side.== We also want to preserve the existing workflows, so we don't disrupt those, and we want to minimize the implementation risk, really keeping it as simple as possible so that we're not adding complexity to a process that is always right at the end of the time budget, if you like, for making a movie and getting it on the screen. Anything you do to add complexity in that phase is really quite dangerous for the studio, because it puts the release at risk; so keep it low risk. The workflow that we've tested, ==which is the only one I can talk about because it's the only one I know at the moment, uses the pDCDM coming in, which is then decoded to DCDM==, and I'll show you the actual flow of it. So we get the TIFFs out, and as I mentioned, on a 48-frame show we derive the 24-frame from that, and we then de-cluster the files in storage. If you don't know: when you store in the cloud and your file names have very low entropy, the objects tend to physically cluster, which causes performance problems if you want to access that data quickly from a large compute resource. So we de-cluster the files; that's a proprietary way of doing it, but I can talk to you about it over a beer, or several beers probably. And then ==we do the JPEG 2000 encode from the DCDM. So the workflow looks like this: that's our packed DCDM coming in; we ingest it, and then we register it into the system, which is called Sundog. We do the JPEG 2000 decode to create the DCDM; that's used internally in the system. This is all happening in the cloud, just to be clear, so it's coming into Amazon S3 storage.
And then everything is being fed directly from S3 to the processing. Deriving the 24-frame is super easy, because you're basically just taking the A frames and throwing frames away, and that depends on the 48-frame material having been created in that manner.== So if there was any kind of frame blending going on for a 24-frame derived version, that won't work; you can only use this if you are literally just throwing away frames. You could put a frame-blending process in there, but then you would be doing work, which would slow it down. The DCDM itself, well, two things happen: quite often it goes off to other facilities if they need to do any work on it, and most people at the moment still aren't taking the JPEG 2000 as a matter of course, but that's what we hope to change with the pDCDM standard. What we do is create a high-performance DCDM, which is the de-clustered one, and store it so that we can make it go faster; that allows us to pull potentially hundreds or thousands of compute cores simultaneously from a small number of source DCDMs. We automatically encode into the DCI 2K and 4K J2K, simply because we know we're going to do it sooner or later, so we just do it up front, and that saves a lot of time when you actually start to order your playlists for your different versions, because all of the heavy lift on the encoding has already been done. This bit isn't really relevant to the pDCDM, but just for completeness: we bring in subtitle XML; in the case of Avatar we did three XMLs which we then used to generate more; we do the render and composite up front as well, so as soon as the subtitle files come in they get rendered onto the DCDM and then encoded and wrapped. The audio comes in as 5.1 and 7.1; the timed-text XMLs then serve the timed-text subtitle versions rather than the rendered-subtitle versions, and rendered-subtitle versions do sometimes get a timed-text render service as well. That basically means that when you order a CPL, the elements are already done; at that point it can take a fraction of a second each, which is quite useful when you're desperately trying to make your screening date. The pDCDM format basically allows us to do this in a really efficient manner, and it halves our transfer time from the DI house to the cloud, so it really supports this type of workflow. We are hoping that, with pDCDM becoming mainstream, we will do this for all DCDM workflows, and that the format will be supported in various standard equipment and software from the manufacturers. So, once we get the standard, we hope to see some good adoption; as you saw, it's in a public CD now, so you can go and have a look at it, and hopefully we should see it published soon. And now, one of the greatest adjudicators of all time ever.

### Andy
My friend Andy. Good morning, everybody. I am the only thing standing between you and your coffee. It's nice to be here. Just a quick note: the opinions expressed this morning are my own. So, there's a lot of exciting material today. Later, at 3:15, I'll do a 20-minute presentation; this one is more about setting awareness. How's everybody doing this morning? Good, thanks for your patience. Have you seen the AI report? It's out now. No, that's the wrong slide; I was working on that at the table this morning.
Well, while I'm waiting for this to come up, let me do the introductory remarks. We've got a lot of excitement today; this is perhaps the most wide-ranging, technology-driven disruption we've ever seen, and we've seen a lot of them, right? And I can't keep going, because there's a quote I want to find... here we go. So, lots of change; keeping up with the tech is hard, right? See that foreword? It was my foreword. All right. I want to start with a perspective-setting quote from an opinion piece in the New York Times entitled "The Joy of Standards": our modern existence depends on things we take for granted; our cars run on gas from any gas station, the plugs for electrical devices fit into any socket, and smartphones connect to anything equipped with Bluetooth. All of these conveniences depend on technical standards, the silent and often forgotten foundations of technological societies. So the media and entertainment industry is no different from Bluetooth and these other things; consider the film print, or its digital equivalent the DCDM and DCP, made to play on any film or digital cinema projector around the world.

## International Standard for professional Media
So, on the subject of standards: one of my roles is chair of ISO/TC 36, Cinematography, where the international motion picture technical specifications are standardized. A lot of the SMPTE digital cinema standards feed into ISO/TC 36, but ISO/TC 36 has member countries and is internationally recognized; SMPTE serves as the secretariat, which is an incredibly important management role, and also puts forth chairs for the committee that the rest of the committee votes on. We currently have 31 member countries and about 150 published standards covering the full gamut of motion picture image and sound, and for the digital bits involved we have liaisons with the CIE, the IEC, the ITU and SMPTE. We are currently working on our strategic business plan; every ISO technical committee has to have a publicly available strategic business plan that describes the direction the work is going in, and the last update to the TC 36 business plan was in 2004, and it said something like "digital cinema is coming." So that went right along. Not only do we have this long-standing standards development mechanism, our standards go back to the 1920s; in fact, TC 36 predates ISO. It was a standards group doing our industry's standards before ISO was formed in 1947. The standards development mechanism was built to function for an industry whose technology infrastructure has been evolving, with varying degrees of disruption, since day one. For example, technology-driven disruption shows up in all aspects, from acquisition to exhibition: from film projection to digital, from analog sound to multichannel immersive digital sound. 3D has also undergone an evolution from photochemical to digital; it's much easier to get well-aligned 3D up on the screen now than it was then. In production it's the same thing: from film acquisition to digital, from analog sound recording on vinyl to multitrack digital, and in visual effects, from photochemical and mechanical effects to all sorts of digitally created magic. So as an industry we've done a pretty good job of enabling innovation while standardizing as the needs arose.
So this graphic does a pretty good job of explaining the symbiotic, or do I mean codependent, relationship between innovation, economic growth and technical standards development; they each depend on the others to have a thriving ecosystem. The key takeaway from this slide is that at some point there must be documented agreement on technical interfaces, formats and best practices; it's inevitable. And here's the evidence of the success of that standardization process: 941 motion imaging standards here at the SMPTE website, and 115 of the motion picture and sound standards adopted and ratified internationally are available through ISO. So here we are with the latest technology-driven disruption; I hear about this AI thing. I did a quick search of session topics on artificial intelligence here at NAB: 153 sessions, and I think that beats last year's record. And it's not just this conference; everyone seems to be paying attention, and I'm not sure what to make of the recent decline in attendance. This chart goes through 2023, I think, although it predates the release of ChatGPT. Investment is tracking; growth in investment has slowed as of 2022, but there is still a lot of money being poured into AI. And I love this one: I wonder why Goldman thinks the declining investment trend will reverse, hockey-stick-like. Maybe they saw ChatGPT coming, I don't know. But my point is this: you have innovation, you have economic growth, and standardization is happening now as well. In ISO and IEC there's something called the Joint Technical Committee, which some of you may be familiar with; that's where a lot of the information technology standards happen, and they come from a wide range of disciplines, artificial intelligence being one of them. There is a subcommittee on artificial intelligence, with broad international participation, a number of focused working groups and joint working groups, 27 published documents and 30 more in development. Here are a few documents I want to highlight, and these slides will be available, so don't worry about the firehose pointing at you right now. You can see that a lot of the foundational work has been done on concepts and terminology, risk management, governance, management systems and, also important for us, use cases. SC 42 is partnering with standards committees in specific verticals to help with their particular AI work programs; health informatics is one example. In that spirit, TC 36 received a briefing from the SC 42 chairman at our recent plenary session, and I'll be presenting, in my role as TC 36 chair, at SC 42's next plenary session later this month. Other notable work to be aware of: the US National Institute of Standards and Technology has, as part of its mandate, the role of coordinating US national standards policy, and it recently released an AI risk management framework; good bedtime reading. And if you haven't already read SMPTE's recently released report on artificial intelligence and media, I highly recommend it; it's a good foundational piece with some ideas for potential future SMPTE standards. Along those lines, as an industry there's a rich opportunity to board the AI standards development train, leverage work that's already started, and get our needs clearly defined and addressed, so we're not put in the position of adapting to, or ignoring, standards that don't speak to our needs; that happens all the time.
So I list a few specifics here, and we can talk more about this in the Q&A or after the session. I also want to briefly touch on sustainability. Environmental sustainability has become an imperative in the standards world; there are regulatory requirements in many countries to report on greenhouse gas emissions. It's mandatory in the European Union, and I think in 40 countries in total. Here at NAB there are the Excellence in Sustainability awards, and the ceremony is tomorrow, so that's raising the profile. From the ISO perspective, in December 2021 ISO published its London Declaration, which stated its commitment to environmental sustainability; that means every ISO TC has to at least think about whether there is a sustainability aspect to the work it's doing. There are a number of relevant committees, TC 207 among them, and a JTC 1 subcommittee, SC 39, which covers sustainability, IT and data centers. And the most notable standard, and I actually learned about this from one of the studio people I talked to to find out if they really cared about it; they said, oh yeah, we're ISO 14001 certified, so there you go. That is a management framework for sustainability, and like I said, it's being followed by at least one Hollywood studio and likely more. Looking ahead, as I noted earlier, TC 36 is revising and updating its business plan. We don't want to duplicate work that's already being done, but we also want to know what the industry really needs. So if you have a point of view, express it to your national standards body, and also to SMPTE, because SMPTE is an international membership organization that's quite active in this area. Here are some pointers; take a picture if you want, but go to my website, all these things are up there, and I think you'll get the slides after the session. I'd like to close with another quote from that classic, "The Joy of Standards": in an age of breathless enthusiasm for the new and disruptive, it's worth remembering the mundane agreements embodied in the world around us; it is the very ordinariness and settledness of standards that enables us to survive and to move ahead. So I invite you to be extraordinary and join the SMPTE and ISO standards communities, so we can get to work, hopefully before it gets away from us. Could I ask the speakers to stay, and Sally to come back up as well, in case there are any questions from the audience? We want to have a Q&A after each session with all of the speakers together. If there are any questions, now is your time; there are microphones, go get in line, there's one right over there. If there aren't, there will be coffee outside. It doesn't look like there are; that's wonderful, everything was answered. That is fantastic. Let me thank all the speakers, and sorry to drag you back up here. We will continue with the next session at 11 a.m. sharp, looking at live music in cinemas over the past year. We'll be back at 11 and go from there. See you.